Sample records for processing spreadsheet database

  1. Definition and maintenance of a telemetry database dictionary

    NASA Technical Reports Server (NTRS)

    Knopf, William P. (Inventor)

    2007-01-01

    A telemetry dictionary database includes a component for receiving spreadsheet workbooks of telemetry data over a web-based interface from other computer devices. Another component routes the spreadsheet workbooks to a specified directory on the host processing device. A process then checks the received spreadsheet workbooks for errors, and if no errors are detected, the spreadsheet workbooks are routed to another directory to await initiation of a remote database loading process. The loading process first converts the spreadsheet workbooks to comma-separated value (CSV) files. Next, a network connection is established with the computer system that hosts the telemetry dictionary database, and the CSV files are ported to that system. This is followed by remote initiation of a database loading program. Upon completion of loading, a flatfile generation program is manually initiated to generate a flatfile to be used in a mission operations environment by the core ground system.
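
    The workbook-to-CSV conversion step described above can be sketched in a few lines of Python. This is a minimal illustration only; the directory layout, file naming, and the workbook_to_csv function are assumptions made for the sketch, not the patented implementation.

      # Convert each worksheet of an .xlsx workbook into its own CSV file,
      # mirroring the conversion step that precedes remote database loading.
      import csv
      from pathlib import Path
      from openpyxl import load_workbook  # third-party: pip install openpyxl

      def workbook_to_csv(xlsx_path: Path, out_dir: Path) -> list[Path]:
          """Write one CSV per worksheet and return the paths written."""
          out_dir.mkdir(parents=True, exist_ok=True)
          written = []
          wb = load_workbook(xlsx_path, read_only=True, data_only=True)
          for sheet in wb.worksheets:
              out_path = out_dir / f"{xlsx_path.stem}_{sheet.title}.csv"
              with out_path.open("w", newline="") as fh:
                  writer = csv.writer(fh)
                  for row in sheet.iter_rows(values_only=True):
                      writer.writerow(row)
              written.append(out_path)
          return written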

  2. DataSpread: Unifying Databases and Spreadsheets.

    PubMed

    Bendre, Mangesh; Sun, Bofan; Zhang, Ding; Zhou, Xinyan; Chang, Kevin Chen-Chuan; Parameswaran, Aditya

    2015-08-01

    Spreadsheet software is often the tool of choice for ad-hoc tabular data management, processing, and visualization, especially on tiny data sets. On the other hand, relational database systems offer significant power, expressivity, and efficiency over spreadsheet software for data management, while lacking in the ease of use and ad-hoc analysis capabilities. We demonstrate DataSpread, a data exploration tool that holistically unifies databases and spreadsheets. It continues to offer a Microsoft Excel-based spreadsheet front-end, while in parallel managing all the data in a back-end database, specifically, PostgreSQL. DataSpread retains all the advantages of spreadsheets, including ease of use, ad-hoc analysis and visualization capabilities, and a schema-free nature, while also adding the advantages of traditional relational databases, such as scalability and the ability to use arbitrary SQL to import, filter, or join external or internal tables and have the results appear in the spreadsheet. DataSpread needs to reason about and reconcile differences in the notions of schema, addressing of cells and tuples, and the current "pane" (which exists in spreadsheets but not in traditional databases), and support data modifications at both the front-end and the back-end. Our demonstration will center on our first and early prototype of DataSpread, and will give attendees a sense of the enormous data exploration capabilities offered by unifying spreadsheets and databases.
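
    As a rough illustration of the division of labor DataSpread describes, the sketch below runs arbitrary SQL against a PostgreSQL back end and returns the result shaped as rows of spreadsheet cells. It is a generic sketch under assumed names (query_to_cells, the example tables), not DataSpread's actual API.

      # Run arbitrary SQL in PostgreSQL and shape the result as a cell grid,
      # the kind of call a spreadsheet front-end could make to its back end.
      import psycopg2  # third-party: pip install psycopg2-binary

      def query_to_cells(dsn: str, sql: str) -> list[list[object]]:
          """Return a header row followed by the data rows of a query."""
          with psycopg2.connect(dsn) as conn:
              with conn.cursor() as cur:
                  cur.execute(sql)
                  header = [col[0] for col in cur.description]
                  return [header] + [list(row) for row in cur.fetchall()]

      # Example (tables are hypothetical):
      # cells = query_to_cells("dbname=demo",
      #     "SELECT o.id, c.name FROM orders o JOIN customers c ON c.id = o.customer_id")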

  3. DataSpread: Unifying Databases and Spreadsheets

    PubMed Central

    Bendre, Mangesh; Sun, Bofan; Zhang, Ding; Zhou, Xinyan; Chang, Kevin Chen-Chuan; Parameswaran, Aditya

    2015-01-01

    Spreadsheet software is often the tool of choice for ad-hoc tabular data management, processing, and visualization, especially on tiny data sets. On the other hand, relational database systems offer significant power, expressivity, and efficiency over spreadsheet software for data management, while lacking in the ease of use and ad-hoc analysis capabilities. We demonstrate DataSpread, a data exploration tool that holistically unifies databases and spreadsheets. It continues to offer a Microsoft Excel-based spreadsheet front-end, while in parallel managing all the data in a back-end database, specifically, PostgreSQL. DataSpread retains all the advantages of spreadsheets, including ease of use, ad-hoc analysis and visualization capabilities, and a schema-free nature, while also adding the advantages of traditional relational databases, such as scalability and the ability to use arbitrary SQL to import, filter, or join external or internal tables and have the results appear in the spreadsheet. DataSpread needs to reason about and reconcile differences in the notions of schema, addressing of cells and tuples, and the current “pane” (which exists in spreadsheets but not in traditional databases), and support data modifications at both the front-end and the back-end. Our demonstration will center on our first and early prototype of DataSpread, and will give attendees a sense of the enormous data exploration capabilities offered by unifying spreadsheets and databases. PMID:26900487

  4. What Software Skills Do Employers Want Their Employees to Possess?

    ERIC Educational Resources Information Center

    Perry, William

    1998-01-01

    Computer skills were identified and grouped as follows: operating systems, graphical user interface, word processing, spreadsheets, and databases. Responses from 47 of 420 employers rated proficiency in all of these groups essential. Database skills were particularly highly rated. (SK)

  5. Negative Effects of Learning Spreadsheet Management on Learning Database Management

    ERIC Educational Resources Information Center

    Vágner, Anikó; Zsakó, László

    2015-01-01

    A lot of students learn spreadsheet management before database management. Their similarities can cause a lot of negative effects when learning database management. In this article, we consider these similarities and explain what can cause problems. First, we analyse the basic concepts such as table, database, row, cell, reference, etc. Then, we…

  6. Using Spreadsheets to Teach Statistics in Geography.

    ERIC Educational Resources Information Center

    Lee, M. P.; Soper, J. B.

    1987-01-01

    Maintains that teaching methods of statistical calculation in geography may be enhanced by using a computer spreadsheet. The spreadsheet format of rows and columns allows the data to be inspected and altered to demonstrate various statistical properties. The inclusion of graphics and database facilities further adds to the value of a spreadsheet.…

  7. Software Reviews: Programs Worth a Second Look.

    ERIC Educational Resources Information Center

    Classroom Computer Learning, 1989

    1989-01-01

    Reviews three software programs: (1) "Microsoft Works 2.0": word processing, data processing, and telecommunications, grades 7 and up; (2) "AppleWorks GS": word processor, database, spreadsheet, graphics, and telecommunications, grades 3-12, Apple IIGS; (3) "Choices, Choices: On the Playground, Taking Responsibility":…

  8. CTEPP STANDARD OPERATING PROCEDURE FOR ENTERING OR IMPORTING ELECTRONIC DATA INTO THE CTEPP DATABASE (SOP-4.12)

    EPA Science Inventory

    This SOP describes the method used to automatically parse analytical data generated from gas chromatography/mass spectrometry (GC/MS) analyses into CTEPP summary spreadsheets and to electronically import the summary spreadsheets into the CTEPP study database.

  9. Diary of a Conversion--Lotus 1-2-3 to Symphony 1.1.

    ERIC Educational Resources Information Center

    Dunnewin, Larry

    1986-01-01

    Describes the uses of Lotus 1-2-3 (a spreadsheet-graphics-database program created by Lotus Development Corporation) and Symphony 1.1 (a refinement and expansion of Symphony 1.01 providing memory efficiency, speed, ease of use, greater file compatibility). Spreadsheet and graphics capabilities, the use of windows, database environment, and…

  10. 78 FR 9721 - Privacy Act of 1974; New System of Records, Office of General Counsel E-Discovery Management...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-11

    ... massive emails, word processing documents, PDF files, spreadsheets, presentations, database entries, and....pdf . PURPOSES: OGC-EDMS provides OGC with a method to initiate, track, and manage the collection...

  11. Computer Literacy for Teachers.

    ERIC Educational Resources Information Center

    Sarapin, Marvin I.; Post, Paul E.

    Basic concepts of computer literacy are discussed as they relate to industrial arts/technology education. Computer hardware development is briefly examined, and major software categories are defined, including database management, computer graphics, spreadsheet programs, telecommunications and networking, word processing, and computer assisted and…

  12. Recommended Computer End-User Skills for Business Students by Inc. 500 Executives and Office Systems Educators.

    ERIC Educational Resources Information Center

    Zhao, Jensen J.; Ray, Charles M.; Dye, Lee J.; Davis, Rodney

    1998-01-01

    Executives (n=63) and office-systems educators (n=88) recommended for workers the following categories of computer end-user skills: hardware, operating systems, word processing, spreadsheets, database, desktop publishing, and presentation. (SK)

  13. The UNIX/XENIX Advantage: Applications in Libraries.

    ERIC Educational Resources Information Center

    Gordon, Kelly L.

    1988-01-01

    Discusses the application of the UNIX/XENIX operating system to support administrative office automation functions--word processing, spreadsheets, database management systems, electronic mail, and communications--at the Central Michigan University Libraries. Advantages and disadvantages of the XENIX operating system and system configuration are…

  14. Schools Inc.: An Administrator's Guide to the Business of Education.

    ERIC Educational Resources Information Center

    McCarthy, Bob; And Others

    1989-01-01

    This theme issue describes ways in which educational administrators are successfully automating many of their administrative tasks. Articles focus on student management; office automation, including word processing, databases, and spreadsheets; human resources; support services, including supplies, textbooks, and learning resources; financial…

  15. Pupils, Teachers & Palmtop Computers.

    ERIC Educational Resources Information Center

    Robertson, S. I.; And Others

    1996-01-01

    To examine the effects of introducing portable computers into secondary schools, a study was conducted regarding information technology skills and attitudes of staff and eighth grade students prior to and after receiving individual portable computers. Knowledge and use of word processing, spreadsheets, and database applications increased for both…

  16. The Microcomputer in the Administrative Office.

    ERIC Educational Resources Information Center

    Huntington, Fred

    1983-01-01

    Discusses microcomputer uses for administrative computing in education at site level and central office and recommends that administrators start with a word processing program for time management, an electronic spreadsheet for financial accounting, a database management system for inventories, and self-written programs to alleviate paper…

  17. Applied Educational Computing: Putting Skills to Practice.

    ERIC Educational Resources Information Center

    Thomerson, J. D.

    The College of Education at Valdosta State University (Georgia) developed a followup course to their required entry-level educational computing course. The introductory course covers word processing, spreadsheet, database, presentation, Internet, electronic mail, and operating system software and basic computer concepts. Students expressed a need…

  18. Computerizing Your Program.

    ERIC Educational Resources Information Center

    Curtis, Rick

    This paper summarizes information about using computer hardware and software to aid in making purchase decisions that are based on user needs. The two major options in hardware are IBM-compatible machines and the Apple Macintosh line. The three basic software applications include word processing, database management, and spreadsheet applications.…

  19. Student Computer Use in Selected Undergraduate Agriculture Courses: An Examination of Required Tasks.

    ERIC Educational Resources Information Center

    Johnson, Donald M.; Ferguson, James A.; Vokins, Nancy W.; Lester, Melissa L.

    2000-01-01

    Over 50% of faculty teaching undergraduate agriculture courses (n=58) required use of word processing, Internet, and electronic mail; less than 50% required spreadsheets, databases, graphics, or specialized software. They planned to maintain or increase required computer tasks in their courses. (SK)

  20. Learning about Tasks Computers Can Perform. ERIC Digest.

    ERIC Educational Resources Information Center

    Brosnan, Patricia A.

    Knowing what different kinds of computer equipment can do is the first step in choosing the computer that is right for you. This digest describes a developmental progression of computer capabilities. First the basic three software programs (word processing, spreadsheets, and database programs) are discussed using examples. Next, an explanation of…

  1. Evaluating Technology Integration in the Elementary School: A Site-Based Approach.

    ERIC Educational Resources Information Center

    Mowe, Richard

    This book enables educators at the elementary level to conduct formative evaluations of their technology programs in minimum time. Most of the technology is computer related, including word processing, graphics, desktop publishing, spreadsheets, databases, instructional software, programming, and telecommunications. The design of the book is aimed…

  2. The Computer Bulletin Board. Modified Gran Plots of Very Weak Acids on a Spreadsheet.

    ERIC Educational Resources Information Center

    Chau, F. T.; And Others

    1990-01-01

    Presented are two applications of computer technology to chemistry instruction: the use of a spreadsheet program to analyze acid-base titration curves and the use of database software to catalog stockroom inventories. (CW)

  3. Mission: Define Computer Literacy. The Illinois-Wisconsin ISACS Computer Coordinators' Committee on Computer Literacy Report (May 1985).

    ERIC Educational Resources Information Center

    Computing Teacher, 1985

    1985-01-01

    Defines computer literacy and describes a computer literacy course which stresses ethics, hardware, and disk operating systems throughout. Core units on keyboarding, word processing, graphics, database management, problem solving, algorithmic thinking, and programming are outlined, together with additional units on spreadsheets, simulations,…

  4. Establishing the Content Validity of a Basic Computer Literacy Course.

    ERIC Educational Resources Information Center

    Clements, James; Carifio, James

    1995-01-01

    Content analysis of 13 textbooks and 2 Department of Education documents was conducted to ascertain common word processing, database, and spreadsheet software skills in order to determine which specific skills should be taught in a high school computer literacy course. Aspects of a basic computer course, created from this analysis, are described.…

  5. Development of a Conference Planning Model Using Integrated Database, Word Processing, and Spreadsheet Software.

    ERIC Educational Resources Information Center

    Stevens, William E.

    This report presents a model for conducting a statewide conference for the approximately 900 members of the South Carolina Council of Teachers of Mathematics (SCCTM) using the AppleWorks integrated software as the basis of the implementation plan. The first and second chapters provide background information on the conference and the…

  6. 49 CFR 1104.3 - Copies.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... fully evaluate evidence, all spreadsheets must be fully accessible and manipulable. Electronic databases... Microsoft Open Database Connectivity (ODBC) standard. ODBC is a Windows technology that allows a database software package to import data from a database created using a different software package. We currently...

  7. 49 CFR 1104.3 - Copies.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... fully evaluate evidence, all spreadsheets must be fully accessible and manipulable. Electronic databases... Microsoft Open Database Connectivity (ODBC) standard. ODBC is a Windows technology that allows a database software package to import data from a database created using a different software package. We currently...

  8. 49 CFR 1104.3 - Copies.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... fully evaluate evidence, all spreadsheets must be fully accessible and manipulable. Electronic databases... Microsoft Open Database Connectivity (ODBC) standard. ODBC is a Windows technology that allows a database software package to import data from a database created using a different software package. We currently...

  9. 49 CFR 1104.3 - Copies.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... fully evaluate evidence, all spreadsheets must be fully accessible and manipulable. Electronic databases... Microsoft Open Database Connectivity (ODBC) standard. ODBC is a Windows technology that allows a database software package to import data from a database created using a different software package. We currently...
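
    As background to the ODBC standard cited in the four records above: an ODBC driver lets one software package read a database created by another. The short sketch below uses the pyodbc library; the DSN and table name are placeholders, not values from the rule.

      # Read a table through an ODBC data source so it can be examined in a
      # different software package from the one that created it.
      import pyodbc  # third-party: pip install pyodbc

      def fetch_table(dsn: str, table: str) -> list[tuple]:
          """Return every row of a table reached through an ODBC DSN."""
          conn = pyodbc.connect(f"DSN={dsn}")
          try:
              cur = conn.cursor()
              cur.execute(f"SELECT * FROM {table}")  # assumes a trusted table name
              return [tuple(row) for row in cur.fetchall()]
          finally:
              conn.close()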

  10. Livestock Anaerobic Digester Database

    EPA Pesticide Factsheets

    The Anaerobic Digester Database provides basic information about anaerobic digesters on livestock farms in the United States, organized in Excel spreadsheets. It includes projects that are under construction, operating, or shut down.

  11. 2008 rural national transit database

    DOT National Transportation Integrated Search

    2008-01-01

    This spreadsheet includes the following data from the 2008 Rural National Transit Database: : > Sub-Recipient Information : > Service Data : > Revenue Vehicle Inventory : > Counties Served : Each one of the categories above are in worksheets within t...

  12. Database management systems for process safety.

    PubMed

    Early, William F

    2006-03-17

    Several elements of the process safety management regulation (PSM) require tracking and documentation of actions: process hazard analyses, management of change, process safety information, operating procedures, training, contractor safety programs, pre-startup safety reviews, incident investigations, emergency planning, and compliance audits. These elements can generate hundreds of action items annually that require tracking. This tracking and documentation is commonly a failing identified in compliance audits, and is difficult to manage through action lists, spreadsheets, or other tools that are comfortably manipulated by plant personnel. This paper discusses the recent implementation of a database management system at a chemical plant and chronicles the improvements accomplished through the introduction of a customized system. The system as implemented modeled the normal plant workflows and provided simple, recognizable user interfaces for ease of use.

  13. Second NASA Technical Interchange Meeting (TIM): Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box (TTB)

    NASA Technical Reports Server (NTRS)

    ONeil, D. A.; Mankins, J. C.; Christensen, C. B.; Gresham, E. C.

    2005-01-01

    The Advanced Technology Lifecycle Analysis System (ATLAS), a spreadsheet analysis tool suite, applies parametric equations for sizing and lifecycle cost estimation. Performance, operation, and programmatic data used by the equations come from a Technology Tool Box (TTB) database. In this second TTB Technical Interchange Meeting (TIM), technologists, system model developers, and architecture analysts discussed methods for modeling technology decisions in spreadsheet models, identified specific technology parameters, and defined detailed development requirements. This Conference Publication captures the consensus of the discussions and provides narrative explanations of the tool suite, the database, and applications of ATLAS within NASA's changing environment.

  14. A pier-scour database: 2,427 field and laboratory measurements of pier scour

    USGS Publications Warehouse

    Benedict, Stephen T.; Caldwell, Andral W.

    2014-01-01

    The U.S. Geological Survey conducted a literature review to identify potential sources of published pier-scour data, and selected data were compiled into a digital spreadsheet called the 2014 USGS Pier-Scour Database (PSDb-2014) consisting of 569 laboratory and 1,858 field measurements. These data encompass a wide range of laboratory and field conditions and represent field data from 23 States within the United States and from 6 other countries. The digital spreadsheet is available on the Internet and offers a valuable resource to engineers and researchers seeking to understand pier-scour relations in the laboratory and field.

  15. Emissions & Generation Resource Integrated Database (eGRID), eGRID2002 (with years 1996 - 2000 data)

    EPA Pesticide Factsheets

    The Emissions & Generation Resource Integrated Database (eGRID) is a comprehensive source of data on the environmental characteristics of almost all electric power generated in the United States. These environmental characteristics include air emissions for nitrogen oxides, sulfur dioxide, carbon dioxide, methane, nitrous oxide, and mercury; emissions rates; net generation; resource mix; and many other attributes. eGRID2002 (years 1996 through 2000 data) contains 16 Excel spreadsheets and the Technical Support Document, as well as the eGRID Data Browser, User's Manual, and Readme file. Archived eGRID data can be viewed as spreadsheets or by using the eGRID Data Browser. The eGRID spreadsheets can be manipulated by data users and enable users to view all the data underlying eGRID. The eGRID Data Browser enables users to view key data using powerful search features. Note that the eGRID Data Browser will not run on a Mac-based machine without Windows emulation.

  16. Development of Spreadsheet-Based Integrated Transaction Processing Systems and Financial Reporting Systems

    NASA Astrophysics Data System (ADS)

    Ariana, I. M.; Bagiada, I. M.

    2018-01-01

    Development of spreadsheet-based integrated transaction processing systems and financial reporting systems is intended to optimize the capabilities of spreadsheets in accounting data processing. The purposes of this study are: 1) to describe the spreadsheet-based integrated transaction processing systems and financial reporting systems; and 2) to test their technical and operational feasibility. This is a research-and-development study. Its main steps are: 1) needs analysis (needs assessment); 2) developing the spreadsheet-based integrated transaction processing systems and financial reporting systems; and 3) testing the feasibility of the resulting systems. Technical feasibility covers the ability of the hardware and operating systems to run the accounting application, and the application's simplicity and ease of use. Operational feasibility covers the ability of users to operate the accounting application, the ability of the application to produce information, and the controls within the application. The instrument used to assess technical and operational feasibility is an expert perception questionnaire using a 4-point Likert scale, from 1 (strongly disagree) to 4 (strongly agree). Data were analyzed using percentage analysis, comparing the number of answers for an item with the ideal number of answers for that item. The spreadsheet-based systems integrate sales, purchases, and cash transaction processing to produce financial reports (statement of profit or loss and other comprehensive income, statement of changes in equity, statement of financial position, and statement of cash flows) and other reports. The systems were found feasible in both the technical (87.50%) and operational (84.17%) aspects.

  17. Determining Faculty Staffing Using Lotus 1-2-3.

    ERIC Educational Resources Information Center

    Ebner, Stanley G.

    1987-01-01

    Discusses how to manipulate a database to create a spreadsheet which can be used to help decide which teaching areas are understaffed and by how much. Focuses on the use of the Lotus 1-2-3 database statistical functions. (TW)

  18. Introducing a New Interface for the Online MagIC Database by Integrating Data Uploading, Searching, and Visualization

    NASA Astrophysics Data System (ADS)

    Jarboe, N.; Minnett, R.; Constable, C.; Koppers, A. A.; Tauxe, L.

    2013-12-01

    The Magnetics Information Consortium (MagIC) is dedicated to supporting the paleomagnetic, geomagnetic, and rock magnetic communities through the development and maintenance of an online database (http://earthref.org/MAGIC/), data upload and quality control, searches, data downloads, and visualization tools. While MagIC has completed importing some of the IAGA paleomagnetic databases (TRANS, PINT, PSVRL, GPMDB) and continues to import others (ARCHEO, MAGST and SECVR), individual data uploading from the community contributes a further wealth of easily accessible, rich datasets. Previously, uploading data to the MagIC database required the use of an Excel spreadsheet on either a Mac or PC. The new method of uploading data utilizes an HTML 5 web interface where the only requirement is a modern browser. This web interface highlights all errors discovered in the dataset at once, instead of the iterative error-checking process found in the previous Excel spreadsheet data checker. As a web service, the community will always have easy access to the most up-to-date and bug-free version of the data upload software. The filtering search mechanism of the MagIC database has been changed to a more intuitive system where the data from each contribution are displayed in tables similar to how the data are uploaded (http://earthref.org/MAGIC/search/). Searches themselves can be saved as a permanent URL, if desired; a saved search URL could then be used as a citation in a publication. When appropriate, plots (equal area, Zijderveld, ARAI, demagnetization, etc.) are associated with the data to give the user a quicker understanding of the underlying dataset. The MagIC database will continue to evolve to meet the needs of the paleomagnetic, geomagnetic, and rock magnetic communities.

  19. Data from fitting Gaussian process models to various data sets using eight Gaussian process software packages.

    PubMed

    Erickson, Collin B; Ankenman, Bruce E; Sanchez, Susan M

    2018-06-01

    This data article provides the summary data from tests comparing various Gaussian process software packages. Each spreadsheet represents a single function or type of function using a particular input sample size. In each spreadsheet, a row gives the results for a particular replication using a single package. Within each spreadsheet there are the results from eight Gaussian process model-fitting packages on five replicates of the surface. There is also one spreadsheet comparing the results from two packages performing stochastic kriging. These data enable comparisons between the packages to determine which package will give users the best results.

  20. The Evolution of Spreadsheets.

    ERIC Educational Resources Information Center

    Schuyler, Michael

    1985-01-01

    Discusses basic features and functions of spreadsheet programs and describes additional capabilities (editing, windowing, graphics, and word processing) of two second-generation spreadsheet programs: Lotus 1-2-3 and Symphony. (MBR)

  1. Tree Height Calculator: An Android App for Estimating Tree Height

    NASA Astrophysics Data System (ADS)

    Burca, V. S.; Htet, N. M.; Huang, X.; de Lanerolle, T. R.; Morelli, R.; Gourley, J. R.

    2011-12-01

    Conventionally, measuring tree height requires a collection of different tools: clinometer, transit, pencil, paper, laptop computer. Results are recorded manually and entered into a spreadsheet or database for future calculation and analysis. Tree Height Calculator is a mobile Android app that integrates the various steps in this process, thereby improving the accuracy and dramatically reducing the time required to go from taking measurements to analyzing data. Given the user's height and the distance from the base of the tree (which can be downloaded into the app from a server), the app uses the phone's orientation sensor to calculate the angle of elevation. A simple trigonometric formula is then used to calculate and record the tree's height in the phone's database. When the phone has a WiFi connection, the data are transmitted to a server, from where they can be downloaded directly into a spreadsheet. The application was first tested in an Environmental Science laboratory at Trinity College. On the first trial, 103 data samples were collected, stored, and uploaded to the online database with only a couple of dropped data points. On the second trial, 98 data samples were gathered with no loss of data. The app combined the individual measurements taken by the students in the lab, reducing the time required to produce a graph of the class's results from days to hours.
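
    The "simple trigonometric formula" is not spelled out in the record; a common form, assuming the angle is sighted from the observer's eye height, is height = eye_height + distance × tan(elevation):

      import math

      def tree_height(distance_m: float, elevation_deg: float,
                      eye_height_m: float) -> float:
          """Estimate tree height from the horizontal distance to the trunk,
          the angle of elevation to the treetop, and the observer's eye height."""
          return eye_height_m + distance_m * math.tan(math.radians(elevation_deg))

      # e.g. 20 m from the trunk at a 40-degree elevation, 1.7 m eye height:
      # tree_height(20.0, 40.0, 1.7)  ->  about 18.5 m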

  2. The VirusBanker database uses a Java program to allow flexible searching through Bunyaviridae sequences.

    PubMed

    Fourment, Mathieu; Gibbs, Mark J

    2008-02-05

    Viruses of the Bunyaviridae have segmented negative-stranded RNA genomes and several of them cause significant disease. Many partial sequences have been obtained from the segments so that GenBank searches give complex results. Sequence databases usually use HTML pages to mediate remote sorting, but this approach can be limiting and may discourage a user from exploring a database. The VirusBanker database contains Bunyaviridae sequences and alignments and is presented as two spreadsheets generated by a Java program that interacts with a MySQL database on a server. Sequences are displayed in rows and may be sorted using information that is displayed in columns and includes data relating to the segment, gene, protein, species, strain, sequence length, terminal sequence and date and country of isolation. Bunyaviridae sequences and alignments may be downloaded from the second spreadsheet with titles defined by the user from the columns, or viewed when passed directly to the sequence editor, Jalview. VirusBanker allows large datasets of aligned nucleotide and protein sequences from the Bunyaviridae to be compiled and winnowed rapidly using criteria that are formulated heuristically.

  3. Groups: knowledge spreadsheets for symbolic biocomputing.

    PubMed

    Travers, Michael; Paley, Suzanne M; Shrager, Jeff; Holland, Timothy A; Karp, Peter D

    2013-01-01

    Knowledge spreadsheets (KSs) are a visual tool for interactive data analysis and exploration. They differ from traditional spreadsheets in that rather than being oriented toward numeric data, they work with symbolic knowledge representation structures and provide operations that take into account the semantics of the application domain. 'Groups' is an implementation of KSs within the Pathway Tools system. Groups allows Pathway Tools users to define a group of objects (e.g. groups of genes or metabolites) from a Pathway/Genome Database. Groups can be transformed (e.g. by transforming a metabolite group to the group of pathways in which those metabolites are substrates); combined through set operations; analysed (e.g. through enrichment analysis); and visualized (e.g. by painting onto a metabolic map diagram). Users of the Pathway Tools-based BioCyc.org website have made extensive use of Groups, and an informal survey of Groups users suggests that Groups has achieved the goal of allowing biologists themselves to perform some data manipulations that previously would have required the assistance of a programmer. Database URL: BioCyc.org.
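
    A generic sketch of the group operations the abstract describes (set combination and transformation), using invented toy identifiers rather than real Pathway Tools objects:

      # Groups as plain sets of identifiers; combine them with set operations
      # and "transform" a metabolite group into a related pathway group.
      metabolites_a = {"pyruvate", "citrate"}
      metabolites_b = {"citrate", "malate"}

      shared = metabolites_a & metabolites_b    # intersection of two groups
      combined = metabolites_a | metabolites_b  # union of two groups

      # Toy metabolite-to-pathway mapping standing in for database lookups.
      pathways_of = {"pyruvate": {"glycolysis"},
                     "citrate": {"TCA cycle"},
                     "malate": {"TCA cycle"}}
      pathway_group = set().union(*(pathways_of[m] for m in combined))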

  4. NCBI GEO: mining tens of millions of expression profiles--database and tools update.

    PubMed

    Barrett, Tanya; Troup, Dennis B; Wilhite, Stephen E; Ledoux, Pierre; Rudnev, Dmitry; Evangelista, Carlos; Kim, Irene F; Soboleva, Alexandra; Tomashevsky, Maxim; Edgar, Ron

    2007-01-01

    The Gene Expression Omnibus (GEO) repository at the National Center for Biotechnology Information (NCBI) archives and freely disseminates microarray and other forms of high-throughput data generated by the scientific community. The database has a minimum information about a microarray experiment (MIAME)-compliant infrastructure that captures fully annotated raw and processed data. Several data deposit options and formats are supported, including web forms, spreadsheets, XML and Simple Omnibus Format in Text (SOFT). In addition to data storage, a collection of user-friendly web-based interfaces and applications are available to help users effectively explore, visualize and download the thousands of experiments and tens of millions of gene expression patterns stored in GEO. This paper provides a summary of the GEO database structure and user facilities, and describes recent enhancements to database design, performance, submission format options, data query and retrieval utilities. GEO is accessible at http://www.ncbi.nlm.nih.gov/geo/

  5. NetpathXL - An Excel Interface to the Program NETPATH

    USGS Publications Warehouse

    Parkhurst, David L.; Charlton, Scott R.

    2008-01-01

    NetpathXL is a revised version of NETPATH that runs under Windows operating systems. NETPATH is a computer program that uses inverse geochemical modeling techniques to calculate net geochemical reactions that can account for changes in water composition between initial and final evolutionary waters in hydrologic systems. The inverse models also can account for the isotopic composition of waters and can be used to estimate radiocarbon ages of dissolved carbon in ground water. NETPATH relies on an auxiliary database program, DB, to enter the chemical analyses and to perform speciation calculations that define total concentrations of elements, charge balance, and redox state of aqueous solutions that are then used in inverse modeling. Instead of DB, NetpathXL relies on Microsoft Excel to enter the chemical analyses. The speciation calculation formerly included in DB is implemented within the program NetpathXL. A program DBXL can be used to translate files from the old DB format (.lon files) to NetpathXL spreadsheets, or to create new NetpathXL spreadsheets. Once users have a NetpathXL spreadsheet with the proper format, new spreadsheets can be generated by copying or saving NetpathXL spreadsheets. In addition, DBXL can convert NetpathXL spreadsheets to PHREEQC input files. New capabilities in PHREEQC (version 2.15) allow solution compositions to be written to a .lon file, and inverse models developed in PHREEQC to be written as NetpathXL .pat and model files. NetpathXL can open NetpathXL spreadsheets, NETPATH-format path files (.pat files), and NetpathXL-format path files (.pat files). Once the speciation calculations have been performed on a spreadsheet file or a .pat file has been opened, the NetpathXL calculation engine is identical to the original NETPATH. Development of models and viewing results in NetpathXL rely on keyboard entry as in NETPATH.

  6. The VirusBanker database uses a Java program to allow flexible searching through Bunyaviridae sequences

    PubMed Central

    Fourment, Mathieu; Gibbs, Mark J

    2008-01-01

    Background Viruses of the Bunyaviridae have segmented negative-stranded RNA genomes and several of them cause significant disease. Many partial sequences have been obtained from the segments so that GenBank searches give complex results. Sequence databases usually use HTML pages to mediate remote sorting, but this approach can be limiting and may discourage a user from exploring a database. Results The VirusBanker database contains Bunyaviridae sequences and alignments and is presented as two spreadsheets generated by a Java program that interacts with a MySQL database on a server. Sequences are displayed in rows and may be sorted using information that is displayed in columns and includes data relating to the segment, gene, protein, species, strain, sequence length, terminal sequence and date and country of isolation. Bunyaviridae sequences and alignments may be downloaded from the second spreadsheet with titles defined by the user from the columns, or viewed when passed directly to the sequence editor, Jalview. Conclusion VirusBanker allows large datasets of aligned nucleotide and protein sequences from the Bunyaviridae to be compiled and winnowed rapidly using criteria that are formulated heuristically. PMID:18251994

  7. Spreadsheets for Analyzing and Optimizing Space Missions

    NASA Technical Reports Server (NTRS)

    Some, Raphael R.; Agrawal, Anil K.; Czikmantory, Akos J.; Weisbin, Charles R.; Hua, Hook; Neff, Jon M.; Cowdin, Mark A.; Lewis, Brian S.; Iroz, Juana; Ross, Rick

    2009-01-01

    XCALIBR (XML Capability Analysis LIBRary) is a set of Extensible Markup Language (XML) database and spreadsheet- based analysis software tools designed to assist in technology-return-on-investment analysis and optimization of technology portfolios pertaining to outer-space missions. XCALIBR is also being examined for use in planning, tracking, and documentation of projects. An XCALIBR database contains information on mission requirements and technological capabilities, which are related by use of an XML taxonomy. XCALIBR incorporates a standardized interface for exporting data and analysis templates to an Excel spreadsheet. Unique features of XCALIBR include the following: It is inherently hierarchical by virtue of its XML basis. The XML taxonomy codifies a comprehensive data structure and data dictionary that includes performance metrics for spacecraft, sensors, and spacecraft systems other than sensors. The taxonomy contains >700 nodes representing all levels, from system through subsystem to individual parts. All entries are searchable and machine readable. There is an intuitive Web-based user interface. The software automatically matches technologies to mission requirements. The software automatically generates, and makes the required entries in, an Excel return-on-investment analysis software tool. The results of an analysis are presented in both tabular and graphical displays.

  8. Spreadsheet-Like Image Analysis

    DTIC Science & Technology

    1992-08-01

    1 " DTIC AD-A254 395 S LECTE D, ° AD-E402 350 Technical Report ARPAD-TR-92002 SPREADSHEET-LIKE IMAGE ANALYSIS Paul Willson August 1992 U.S. ARMY...August 1992 4. TITLE AND SUBTITLE 5. FUNDING NUMBERS SPREADSHEET-LIKE IMAGE ANALYSIS 6. AUTHOR(S) Paul Willson 7. PERFORMING ORGANIZATION NAME(S) AND...14. SUBJECT TERMS 15. NUMBER OF PAGES Image analysis , nondestructive inspection, spreadsheet, Macintosh software, 14 neural network, signal processing

  9. The EnzymeTracker: an open-source laboratory information management system for sample tracking.

    PubMed

    Triplet, Thomas; Butler, Gregory

    2012-01-26

    In many laboratories, researchers store experimental data on their own workstations using spreadsheets. However, this approach poses a number of problems, ranging from sharing issues to inefficient data-mining. Standard spreadsheets are also error-prone, as data do not undergo any validation process. To overcome spreadsheets' inherent limitations, a number of proprietary systems have been developed, for which laboratories need to pay expensive license fees. Those costs are usually prohibitive for most laboratories and prevent scientists from benefiting from more sophisticated data management systems. In this paper, we propose the EnzymeTracker, a web-based laboratory information management system for sample tracking, as an open-source and flexible alternative that aims at facilitating entry, mining, and sharing of experimental biological data. The EnzymeTracker features online spreadsheets and tools for monitoring numerous experiments conducted by several collaborators to identify and characterize samples. It also provides libraries of shared data such as protocols, and administration tools for data access control using OpenID and user/team management. Our system relies on a database management system for efficient data indexing and management and a user-friendly AJAX interface that can be accessed over the Internet. The EnzymeTracker facilitates data entry by dynamically suggesting entries and providing smart data-mining tools to effectively retrieve data. Our system features a number of tools to visualize and annotate experimental data, and export highly customizable reports. It also supports QR matrix barcoding to facilitate sample tracking. The EnzymeTracker was designed to be easy to use and offers many benefits over spreadsheets, thus presenting the characteristics required to facilitate acceptance by the scientific community. It has been successfully used for 20 months on a daily basis by over 50 scientists. The EnzymeTracker is freely available online at http://cubique.fungalgenomics.ca/enzymedb/index.html under the GNU GPLv3 license.

  10. The EnzymeTracker: an open-source laboratory information management system for sample tracking

    PubMed Central

    2012-01-01

    Background In many laboratories, researchers store experimental data on their own workstations using spreadsheets. However, this approach poses a number of problems, ranging from sharing issues to inefficient data-mining. Standard spreadsheets are also error-prone, as data do not undergo any validation process. To overcome spreadsheets' inherent limitations, a number of proprietary systems have been developed, for which laboratories need to pay expensive license fees. Those costs are usually prohibitive for most laboratories and prevent scientists from benefiting from more sophisticated data management systems. Results In this paper, we propose the EnzymeTracker, a web-based laboratory information management system for sample tracking, as an open-source and flexible alternative that aims at facilitating entry, mining, and sharing of experimental biological data. The EnzymeTracker features online spreadsheets and tools for monitoring numerous experiments conducted by several collaborators to identify and characterize samples. It also provides libraries of shared data such as protocols, and administration tools for data access control using OpenID and user/team management. Our system relies on a database management system for efficient data indexing and management and a user-friendly AJAX interface that can be accessed over the Internet. The EnzymeTracker facilitates data entry by dynamically suggesting entries and providing smart data-mining tools to effectively retrieve data. Our system features a number of tools to visualize and annotate experimental data, and export highly customizable reports. It also supports QR matrix barcoding to facilitate sample tracking. Conclusions The EnzymeTracker was designed to be easy to use and offers many benefits over spreadsheets, thus presenting the characteristics required to facilitate acceptance by the scientific community. It has been successfully used for 20 months on a daily basis by over 50 scientists. The EnzymeTracker is freely available online at http://cubique.fungalgenomics.ca/enzymedb/index.html under the GNU GPLv3 license. PMID:22280360

  11. Economic Comparison of Processes Using Spreadsheet Programs

    NASA Technical Reports Server (NTRS)

    Ferrall, J. F.; Pappano, A. W.; Jennings, C. N.

    1986-01-01

    Inexpensive approach aids plant-design decisions. Commercially available electronic spreadsheet programs aid economic comparison of different processes for producing particular end products. Facilitates plant-design decisions without requiring large expenditures for powerful mainframe computers.

  12. The Growing Problems with Spreadsheet Budgeting

    ERIC Educational Resources Information Center

    Solomon, Jeff; Johnson, Stella; Wilcox, Leon; Olson, Tom

    2010-01-01

    The ubiquitous spreadsheet in some version has been the sole and unrivaled instrument of financial management for decades. And it has served well. The spreadsheet provides the flexibility to design a unique business process. It allows users to create formulas that execute complex calculations, and it is available in the globally standardized Excel…

  13. A Generic Data Harmonization Process for Cross-linked Research and Network Interaction. Construction and Application for the Lung Cancer Phenotype Database of the German Center for Lung Research.

    PubMed

    Firnkorn, D; Ganzinger, M; Muley, T; Thomas, M; Knaup, P

    2015-01-01

    Joint data analysis is a key requirement in medical research networks. Data are available in heterogeneous formats at each network partner, and their harmonization is often rather complex. The objective of our paper is to provide a generic approach for the harmonization process in research networks. We applied the process when harmonizing data from three sites for the Lung Cancer Phenotype Database within the German Center for Lung Research. We developed a spreadsheet-based solution as a tool to support the harmonization process for lung cancer data, and a data integration procedure based on Talend Open Studio. The harmonization process consists of eight steps describing a systematic approach for defining and reviewing source data elements and standardizing common data elements. The steps for defining common data elements and harmonizing them with local data definitions are repeated until consensus is reached. Application of this process for building the phenotype database led to a common basic data set on lung cancer with 285 structured parameters. The Lung Cancer Phenotype Database was realized as an i2b2 research data warehouse. Data harmonization is a challenging task requiring informatics skills as well as domain knowledge. Our approach facilitates data harmonization by providing guidance through a uniform process that can be applied in a wide range of projects.

  14. Digital Archiving: Where the Past Lives Again

    NASA Astrophysics Data System (ADS)

    Paxson, K. B.

    2012-06-01

    The process of digital archiving for variable star data by manual entry with an Excel spreadsheet is described. Excel-based tools including a Step Magnitude Calculator and a Julian Date Calculator for variable star observations where magnitudes and Julian dates have not been reduced are presented. Variable star data in the literature and the AAVSO International Database prior to 1911 are presented and reviewed, with recent archiving work being highlighted. Digitization using optical character recognition software conversion is also demonstrated, with editing and formatting suggestions for the OCR-converted text.
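
    For context on the Julian Date reduction such a calculator performs, a standard Gregorian-calendar-to-Julian-Date conversion (the Fliegel-Van Flandern integer algorithm) is sketched below; this is a generic formula, not the Excel tool's own code:

      def julian_date(year: int, month: int, day: int, ut_hours: float = 0.0) -> float:
          """Convert a Gregorian calendar date plus Universal Time to a Julian Date."""
          a = (14 - month) // 12
          y = year + 4800 - a
          m = month + 12 * a - 3
          jdn = day + (153 * m + 2) // 5 + 365 * y + y // 4 - y // 100 + y // 400 - 32045
          # The Julian Day begins at noon UT, hence the 12-hour offset.
          return jdn + (ut_hours - 12.0) / 24.0

      # e.g. julian_date(2000, 1, 1, 12.0) -> 2451545.0 (the J2000.0 epoch)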

  15. Office Computer Software: A Comprehensive Review of Software Programs.

    ERIC Educational Resources Information Center

    Secretary, 1992

    1992-01-01

    Describes types of software including system software, application software, spreadsheets, accounting software, graphics packages, desktop publishing software, database, desktop and personal information management software, project and records management software, groupware, and shareware. (JOW)

  16. Gravity Data for West-Central Colorado

    DOE Data Explorer

    Richard Zehner

    2012-04-06

    Modeled Bouguer-corrected gravity data were extracted from the Pan American Center for Earth and Environmental Studies Gravity Database of the U.S. at http://irpsrvgis08.utep.edu/viewers/Flex/GravityMagnetic/GravityMagnetic_CyberShare/ on 2/29/2012. The downloaded text file was opened in an Excel spreadsheet. This spreadsheet data was then converted into an ESRI point shapefile in UTM Zone 13 NAD27 projection, showing location and gravity (in milligals). The data were then converted to a grid and contoured using ESRI Spatial Analyst. Data from the University of Texas Pan American Center for Earth and Environmental Studies.

  17. The Computerized Reference Department: Buying the Future.

    ERIC Educational Resources Information Center

    Kriz, Harry M.; Kok, Victoria T.

    1985-01-01

    Basis for systematic computerization of academic research library's reference, collection development, and collection management functions emphasizes productivity enhancement for librarians and support staff. Use of microcomputer and university's mainframe computer to develop applications of database management systems, electronic spreadsheets,…

  18. Synthesizer: Expediting synthesis studies from context-free data with information retrieval techniques.

    PubMed

    Gandy, Lisa M; Gumm, Jordan; Fertig, Benjamin; Thessen, Anne; Kennish, Michael J; Chavan, Sameer; Marchionni, Luigi; Xia, Xiaoxin; Shankrit, Shambhavi; Fertig, Elana J

    2017-01-01

    Scientists have unprecedented access to a wide variety of high-quality datasets. These datasets, which are often independently curated, commonly use unstructured spreadsheets to store their data. Standardized annotations are essential to perform synthesis studies across investigators, but are often not used in practice. Therefore, accurately combining records in spreadsheets from differing studies requires tedious and error-prone human curation. These efforts result in a significant time and cost barrier to synthesis research. We propose an information-retrieval-inspired algorithm, Synthesize, that merges unstructured data automatically based on both column labels and values. Application of the Synthesize algorithm to cancer and ecological datasets had high accuracy (on the order of 85-100%). We further implement Synthesize in an open-source web application, Synthesizer (https://github.com/lisagandy/synthesizer). The software accepts input as spreadsheets in comma-separated value (CSV) format, visualizes the merged data, and outputs the results as a new spreadsheet. Synthesizer includes an easy-to-use graphical user interface, which enables the user to finish combining data and obtain perfect accuracy. Future work will allow detection of units to automatically merge continuous data, and application of the algorithm to other data formats, including databases.
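
    The published algorithm matches on both column labels and values; the sketch below shows only the label-matching half, with a string-similarity measure and threshold chosen for illustration rather than taken from the paper:

      # Pair up columns across two spreadsheets by header-label similarity,
      # a simplified stand-in for Synthesize's label matching.
      from difflib import SequenceMatcher

      def match_columns(headers_a: list[str], headers_b: list[str],
                        threshold: float = 0.8) -> dict[str, str]:
          """Map each label in headers_a to its most similar label in headers_b."""
          pairs = {}
          for a in headers_a:
              best, score = None, 0.0
              for b in headers_b:
                  s = SequenceMatcher(None, a.lower(), b.lower()).ratio()
                  if s > score:
                      best, score = b, s
              if best is not None and score >= threshold:
                  pairs[a] = best
          return pairs

      # e.g. match_columns(["Sample ID", "Site"], ["sample_id", "site_name"])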

  19. Synthesizer: Expediting synthesis studies from context-free data with information retrieval techniques

    PubMed Central

    Gumm, Jordan; Fertig, Benjamin; Thessen, Anne; Kennish, Michael J.; Chavan, Sameer; Marchionni, Luigi; Xia, Xiaoxin; Shankrit, Shambhavi; Fertig, Elana J.

    2017-01-01

    Scientists have unprecedented access to a wide variety of high-quality datasets. These datasets, which are often independently curated, commonly use unstructured spreadsheets to store their data. Standardized annotations are essential to perform synthesis studies across investigators, but are often not used in practice. Therefore, accurately combining records in spreadsheets from differing studies requires tedious and error-prone human curation. These efforts result in a significant time and cost barrier to synthesis research. We propose an information-retrieval-inspired algorithm, Synthesize, that merges unstructured data automatically based on both column labels and values. Application of the Synthesize algorithm to cancer and ecological datasets had high accuracy (on the order of 85–100%). We further implement Synthesize in an open-source web application, Synthesizer (https://github.com/lisagandy/synthesizer). The software accepts input as spreadsheets in comma-separated value (CSV) format, visualizes the merged data, and outputs the results as a new spreadsheet. Synthesizer includes an easy-to-use graphical user interface, which enables the user to finish combining data and obtain perfect accuracy. Future work will allow detection of units to automatically merge continuous data, and application of the algorithm to other data formats, including databases. PMID:28437440

  20. PRIDE: new developments and new datasets.

    PubMed

    Jones, Philip; Côté, Richard G; Cho, Sang Yun; Klie, Sebastian; Martens, Lennart; Quinn, Antony F; Thorneycroft, David; Hermjakob, Henning

    2008-01-01

    The PRIDE (http://www.ebi.ac.uk/pride) database of protein and peptide identifications was previously described in the NAR Database Special Edition in 2006. Since this publication, the volume of public data in the PRIDE relational database has increased by more than an order of magnitude. Several significant public datasets have been added, including identifications and processed mass spectra generated by the HUPO Brain Proteome Project and the HUPO Liver Proteome Project. The PRIDE software development team has made several significant changes and additions to the user interface and tool set associated with PRIDE. The focus of these changes has been to facilitate the submission process and to improve the mechanisms by which PRIDE can be queried. The PRIDE team has developed a Microsoft Excel workbook that allows the required data to be collated in a series of relatively simple spreadsheets, with automatic generation of PRIDE XML at the end of the process. The ability to query PRIDE has been augmented by the addition of a BioMart interface allowing complex queries to be constructed. Collaboration with groups outside the EBI has been fruitful in extending PRIDE, including an approach to encode iTRAQ quantitative data in PRIDE XML.

  1. Informatics applied to cytology

    PubMed Central

    Hornish, Maryanne; Goulart, Robert A.

    2008-01-01

    Automation and emerging information technologies are being adopted by cytology laboratories to augment Pap test screening and improve diagnostic accuracy. As a result, informatics, the application of computers and information systems to information management, has become essential for the successful operation of the cytopathology laboratory. This review describes how laboratory information management systems can be used to achieve an automated and seamless workflow process. The utilization of software, electronic databases and spreadsheets to perform necessary quality control measures are discussed, as well as a Lean production system and Six Sigma approach, to reduce errors in the cytopathology laboratory. PMID:19495402

  2. 12 CFR 1732.2 - Definitions.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... home computer systems of an employee; or (4) Whether the information is active or inactive. (k) Record... (e.g., e-mail, databases, spreadsheets, PowerPoint presentations, electronic reporting systems... information is stored or located, including network servers, desktop or laptop computers and handheld...

  3. WASP Model FAQ

    EPA Pesticide Factsheets

    Contains frequently asked questions: Is there an e-mail support group for WASP; Do I need admin rights to install; How to run WASP after installation; Can I use my WASP7 file; Attaching to an Excel spreadsheet or Access database; Converting QUAL2K to WASP.

  4. Comprehensive Data Collected from the Petroleum Refining Sector

    EPA Pesticide Factsheets

    On April 1, 2011 EPA sent a comprehensive industry-wide information collection request (ICR) to all facilities in the U.S. petroleum refining industry. EPA has received this ICR data and compiled these data into databases and spreadsheets for the web

  5. 40 CFR 63.11519 - What are my notification, recordkeeping, and reporting requirements?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... this section. (2) Dates. Unless the Administrator has approved or agreed to a different schedule for... records may be maintained as electronic spreadsheets or as a database. (ii) As specified in § 63.10(b)(1...

  6. Data regarding hydraulic fracturing distributions and treatment fluids, additives, proppants, and water volumes applied to wells drilled in the United States from 1947 through 2010

    USGS Publications Warehouse

    Gallegos, Tanya J.; Varela, Brian A.

    2015-01-01

    Comprehensive, published, and publicly available data regarding the extent, location, and character of hydraulic fracturing in the United States are scarce. The objective of this data series is to publish data related to hydraulic fracturing in the public domain. The spreadsheets released with this data series contain derivative datasets aggregated temporally and spatially from the commercial and proprietary IHS database of U.S. oil and gas production and well data (IHS Energy, 2011). These datasets, served in 21 spreadsheets in Microsoft Excel (.xlsx) format, outline the geographical distributions of hydraulic fracturing treatments and associated wells (including well drill-hole directions) as well as water volumes, proppants, treatment fluids, and additives used in hydraulic fracturing treatments in the United States from 1947 through 2010. This report also describes the data: extraction/aggregation processing steps, field names and descriptions, and field types and sources. An associated scientific investigation report (Gallegos and Varela, 2014) provides a detailed analysis of the data presented in this data series and comparisons of the data and trends to the literature.

  7. Ground Magnetic Data for West-Central Colorado

    DOE Data Explorer

    Richard Zehner

    2012-03-08

    Modeled ground magnetic data were extracted from the Pan American Center for Earth and Environmental Studies database at http://irpsrvgis08.utep.edu/viewers/Flex/GravityMagnetic/GravityMagnetic_CyberShare/ on 2/29/2012. The downloaded text file was then imported into an Excel spreadsheet. The spreadsheet data were converted into an ESRI point shapefile in UTM Zone 13 NAD27 projection, showing location and magnetic field strength in nanoteslas. This point shapefile was then interpolated to an ESRI grid by an inverse-distance weighting method in ESRI Spatial Analyst. The grid was used to create a contour map of magnetic field strength.
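
    The gridding step described here is a standard inverse-distance-weighting (IDW) interpolation. A minimal sketch of the idea in Python (NumPy only; the coordinates, field values, and power parameter below are illustrative assumptions, not survey values):

      import numpy as np

      def idw(xy_known, values, xy_query, power=2.0, eps=1e-12):
          """Inverse-distance-weighted estimate at each query point.

          xy_known : (n, 2) sample coordinates (e.g., UTM Zone 13 meters)
          values   : (n,)  measurements (e.g., field strength in nT)
          xy_query : (m, 2) grid-node coordinates
          """
          # Distance from every grid node to every sample point.
          d = np.linalg.norm(xy_query[:, None, :] - xy_known[None, :, :], axis=2)
          w = 1.0 / (d + eps) ** power       # nearer samples get larger weights
          return (w * values).sum(axis=1) / w.sum(axis=1)

      # Illustrative points only -- not data from this survey.
      pts  = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0]])
      nt   = np.array([52000.0, 52150.0, 51900.0])
      grid = np.array([[50.0, 50.0], [10.0, 90.0]])
      print(idw(pts, nt, grid))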

  8. Strontium-90 Error Discovered in Subcontract Laboratory Spreadsheet

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    D. D. Brown A. S. Nagel

    1999-07-31

    West Valley Demonstration Project health physicists and environmental scientists discovered a series of errors in a subcontractor's spreadsheet being used to reduce data as part of their strontium-90 analytical process.

  9. Improvements to the Magnetics Information Consortium (MagIC) Paleo and Rock Magnetic Database

    NASA Astrophysics Data System (ADS)

    Jarboe, N.; Minnett, R.; Tauxe, L.; Koppers, A. A. P.; Constable, C.; Jonestrask, L.

    2015-12-01

    The Magnetics Information Consortium (MagIC) database (http://earthref.org/MagIC/) continues to improve the ease of data uploading and editing, the creation of complex searches, data visualization, and data downloads for the paleomagnetic, geomagnetic, and rock magnetic communities. Online data editing is now available, eliminating the need for proprietary spreadsheet software: the data owner can change values in the database or delete entries through an HTML 5 web interface that resembles typical spreadsheets in behavior and use. Additive uploading now allows additions to existing data sets through a simple drag-and-drop interface. Searching the database has improved with the addition of more sophisticated search parameters and the facility to use them in complex combinations. A comprehensive summary view of a search result has been added to speed data comprehension, while a raw data view is available if one desires to see all data columns as stored in the database. Data visualization plots (ARAI, equal area, demagnetization, Zijderveld, etc.) are presented with the data when appropriate to aid the user in understanding the dataset. MagIC data associated with individual contributions or from online searches may be downloaded in the tab-delimited MagIC text file format for subsequent offline use and analysis. With input from the paleomagnetic, geomagnetic, and rock magnetic communities, the MagIC database will continue to improve as a data warehouse and resource.

  10. Science: Database Programs and the Study of Seashells.

    ERIC Educational Resources Information Center

    McCurry, Niki; McCurry, Alan

    1992-01-01

    Discusses the dynamics and outcomes of an unplanned classroom activity that developed from the integration of the use of spreadsheets with the study of the characteristics of previously collected seashells, specifically their color, size, shape, texture, and any other obvious differences. (JJK)

  11. Web-based X-ray quality control documentation.

    PubMed

    David, George; Burnett, Lou Ann; Schenkel, Robert

    2003-01-01

    The department of radiology at the Medical College of Georgia Hospital and Clinics has developed an equipment quality control web site. Our goal is to provide immediate access to virtually all medical physics survey data. The web site is designed to assist equipment engineers, department management, and technologists. By improving communications and access to equipment documentation, we believe productivity is enhanced. The creation of the quality control web site was accomplished in three distinct steps. First, survey data had to be placed in a computer format. Second, these various computer files had to be converted to a format supported by commercial web browsers. Third, a comprehensive home page had to be designed to provide convenient access to the multitude of surveys done in the various x-ray rooms. Because we had previously spent years fine-tuning the computerization of the medical physics quality control program, most survey documentation was already in spreadsheet or database format. A major technical decision was the method of converting survey spreadsheet and database files into documentation appropriate for the web. After an unsatisfactory experience with a HyperText Markup Language (HTML) converter (packaged with spreadsheet and database software), we tried creating Portable Document Format (PDF) files using Adobe Acrobat software. This process preserves the original formatting of the document and takes no longer than conventional printing; therefore, it has been very successful. Although the PDF file generated by Adobe Acrobat is a proprietary format, it can be displayed through a conventional web browser using the freely distributed Adobe Acrobat Reader program, which is available for virtually all platforms. Once a user installs the software, it is automatically invoked by the web browser whenever the user follows a link to a file with a PDF extension. Although no confidential patient information is available on the web site, our legal department recommended that we secure the site in order to keep out those wishing to make mischief. Our interim solution has been not to password-protect the page, since we feared a password would hinder access for occasional legitimate users, but instead not to provide links to it from other hospital and department pages. Utility and productivity were improved, and time and money were saved, by making radiological equipment quality control documentation instantly available on-line.

  12. NETMARK

    NASA Technical Reports Server (NTRS)

    Maluf, David A.; Koga, Dennis (Technical Monitor)

    2002-01-01

    This presentation discusses NASA's proposed NETMARK knowledge management tool, which aims 'to control and interoperate with every block in a document, email, spreadsheet, power point, database, etc. across the lifecycle'. Topics covered include: system software requirements and hardware requirements, seamless information systems, computer architecture issues, and potential benefits to NETMARK users.

  13. Soda Lake Well Lithology Data and Geologic Cross-Sections

    DOE Data Explorer

    Faulds, James E.

    2013-12-31

    Comprehensive catalogue of drill-hole data in spreadsheet, shapefile, and Geosoft database formats. Includes XYZ locations of well heads, year drilled, type of well, operator, total depths, well path data (deviations), lithology logs, and temperature data. Also includes 13 cross-sections in Adobe Illustrator format.

  14. Biographical Study and Hypothesis Testing. Instructional Technology.

    ERIC Educational Resources Information Center

    Little, Timothy H.

    1995-01-01

    Asserts that the story of Amelia Earhart holds an ongoing fascination for students. Presents an instructional unit using a spreadsheet to create a database about Earhart's final flight. Includes student objectives, step-by-step instructional procedures, and eight graphics of student information or teacher examples. (CFR)

  15. "Utstein style" spreadsheet and database programs based on Microsoft Excel and Microsoft Access software for CPR data management of in-hospital resuscitation.

    PubMed

    Adams, Bruce D; Whitlock, Warren L

    2004-04-01

    In 1997, the American Heart Association, in association with representatives of the International Liaison Committee on Resuscitation (ILCOR), published recommended guidelines for reviewing, reporting and conducting in-hospital cardiopulmonary resuscitation (CPR) outcomes using the "Utstein style". Using these guidelines, we developed two Microsoft Office based database management programs that may be useful to the resuscitation community. We developed a user-friendly spreadsheet based on MS Office Excel. The user enters patient variables such as name, age, and diagnosis. Then, event resuscitation variables such as time of collapse and CPR team arrival are entered from a "code flow sheet". Finally, outcome variables such as patient condition at different time points are recorded. The program then makes automatic calculations of average response times, survival rates and other important outcome measurements. Also using the Utstein style, we developed a database program based on MS Office Access. To promote free public access to these programs, we established a website. These programs will help hospitals track, analyze, and present their CPR outcomes data. Clinical CPR researchers might also find the programs useful because they are easily modified and have statistical functions.
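
    The "automatic calculations" mentioned are simple aggregates over code-flow timestamps. A hedged sketch of the same arithmetic in Python (the field names and example times are hypothetical, not taken from the published workbook):

      from datetime import datetime

      # Hypothetical code-flow records: collapse time, CPR-team arrival, outcome.
      events = [
          {"collapse": "08:00", "arrival": "08:03", "survived": True},
          {"collapse": "14:10", "arrival": "14:16", "survived": False},
      ]

      def minutes(t0, t1, fmt="%H:%M"):
          """Elapsed minutes between two same-day clock times."""
          return (datetime.strptime(t1, fmt) - datetime.strptime(t0, fmt)).seconds / 60

      response = [minutes(e["collapse"], e["arrival"]) for e in events]
      mean_response = sum(response) / len(response)
      survival_rate = sum(e["survived"] for e in events) / len(events)
      print(f"mean response {mean_response:.1f} min, survival {survival_rate:.0%}")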

  16. Learning to Work with Databases in Astronomy: Quantitative Analysis of Science Educators' and Students' Pre-/Post-Tests

    NASA Astrophysics Data System (ADS)

    Schwortz, Andria C.; Burrows, Andrea C.; Myers, Adam D.

    2015-01-01

    Astronomy is increasingly moving towards working with large databases, from the state-of-the-art Sloan Digital Sky Survey Data Release 10 to the historical Digital Access to a Sky Century at Harvard. Fields outside astronomy likewise work with large datasets, whether warehouse inventory, health trends, or the stock market. Very few fields, however, explicitly teach students the skills needed to analyze such data. The authors studied a matched set of 37 participants working with 200-entry databases in astronomy using Google Spreadsheets, with limited information about a random set of quasars drawn from SDSS DR5. Here the authors present the quantitative results from an eight-question pre-/post-test, with questions designed to span Bloom's taxonomy, covering both the skills of using spreadsheets and the content knowledge of quasars. Participants included both Astro 101 summer students and professionals, including in-service K-12 teachers and science communicators. All groups showed statistically significant gains (as per Hake, 1998), with the greatest difference between women's gains of 0.196 and men's of 0.480.
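
    For reference, the normalized gain "as per Hake, 1998" is defined for class averages as \( \langle g \rangle = (\langle\mathrm{post}\rangle - \langle\mathrm{pre}\rangle)\,/\,(100\% - \langle\mathrm{pre}\rangle) \). For example, a group averaging 40% on the pre-test and 70% on the post-test has \( \langle g \rangle = (70 - 40)/(100 - 40) = 0.5 \), on the order of the 0.480 gain reported here for men.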

  17. Engine Icing Data - An Analytics Approach

    NASA Technical Reports Server (NTRS)

    Fitzgerald, Brooke A.; Flegel, Ashlie B.

    2017-01-01

    Engine icing researchers at the NASA Glenn Research Center use the Escort data acquisition system in the Propulsion Systems Laboratory (PSL) to generate and collect a tremendous amount of data every day. Currently these researchers spend countless hours processing and formatting their data, selecting important variables, and plotting relationships between variables, all by hand, generally analyzing data in a spreadsheet-style program (such as Microsoft Excel). Though spreadsheet-style analysis is familiar and intuitive to many, processing data in spreadsheets is often unreproducible, and small mistakes are easily overlooked. Spreadsheet-style analysis is also time-inefficient: the same formatting, processing, and plotting procedure has to be repeated for every dataset, so researchers perform the same tedious data-munging process over and over instead of making discoveries within their data. This paper documents a data analysis tool, written in Python and hosted in a Jupyter notebook, that vastly simplifies the analysis process. Given the file path of any folder containing time-series datasets, the tool batch loads every dataset in the folder, processes the datasets in parallel, and ingests them into a widget where users can search for and interactively plot subsets of columns in a number of ways with the click of a button, easily and intuitively comparing their data and discovering interesting dynamics. Furthermore, comparing variables across datasets and integrating video data, both extremely difficult with spreadsheet-style programs, are greatly simplified in this tool. The tool has also gathered interest outside the engine icing branch and will be used by researchers across the NASA Glenn Research Center. This project exemplifies the enormous benefit of automating data processing, analysis, and visualization, and will help researchers move from raw data to insight in a much smaller time frame.
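
    The batch-loading pattern the paper describes is easy to reproduce outside the NASA tool. A minimal sketch with pandas (the folder layout, file glob, and column names are assumptions; the interactive widget layer is omitted):

      import glob
      import os

      import pandas as pd

      def load_folder(folder, pattern="*.csv", time_col="time"):
          """Batch-load every time-series dataset in a folder into one dict."""
          datasets = {}
          for path in sorted(glob.glob(os.path.join(folder, pattern))):
              df = pd.read_csv(path).sort_values(time_col)  # monotonic time axis
              datasets[os.path.basename(path)] = df
          return datasets

      # Usage with a hypothetical folder of Escort exports: overlay one
      # variable from every run instead of re-plotting each spreadsheet.
      # for name, df in load_folder("psl_runs").items():
      #     df.plot(x="time", y="fan_speed", title=name)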

  18. A web-based relational database for monitoring and analyzing mosquito population dynamics.

    PubMed

    Sucaet, Yves; Van Hemert, John; Tucker, Brad; Bartholomay, Lyric

    2008-07-01

    Mosquito population dynamics have been monitored on an annual basis in the state of Iowa since 1969. The primary goal of this project was to integrate light trap data from these efforts into a centralized back-end database and interactive website that is available through the internet at http://iowa-mosquito.ent.iastate.edu. For comparative purposes, all data were categorized according to the week of the year and normalized according to the number of traps running. Users can readily view current, weekly mosquito abundance compared with data from previous years. Additional interactive capabilities facilitate analyses of the data based on mosquito species, distribution, or a time frame of interest. All data can be viewed in graphical and tabular format and can be downloaded to a comma separated value (CSV) file for import into a spreadsheet or more specialized statistical software package. Having this long-term dataset in a centralized database/website is useful for informing mosquito and mosquito-borne disease control and for exploring the ecology of the species represented therein. In addition to mosquito population dynamics, this database is available as a standardized platform that could be modified and applied to a multitude of projects that involve repeated collection of observational data. The development and implementation of this tool provides capacity for the user to load data from standard spreadsheets into a relational database and then view and query the data in an interactive website.
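
    The normalization described (categorize by week of year, divide by the number of traps running) reduces to a single group-by. A sketch with pandas, using hypothetical column names:

      import pandas as pd

      # Hypothetical light-trap records: one row per trap-night.
      df = pd.DataFrame({
          "date": pd.to_datetime(["2008-06-02", "2008-06-03", "2008-06-10"]),
          "trap_id": ["A", "B", "A"],
          "species": ["Culex pipiens"] * 3,
          "count": [14, 9, 30],
      })

      df["week"] = df["date"].dt.isocalendar().week
      weekly = df.groupby(["week", "species"]).agg(
          total=("count", "sum"),
          traps=("trap_id", "nunique"),
      )
      weekly["per_trap"] = weekly["total"] / weekly["traps"]  # normalized abundance
      weekly.to_csv("weekly_abundance.csv")  # the site's CSV download, in miniature
      print(weekly)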

  19. The opportunities and obstacles in developing a vascular birthmark database for clinical and research use.

    PubMed

    Sharma, Vishal K; Fraulin, Frankie Og; Harrop, A Robertson; McPhalen, Donald F

    2011-01-01

    Databases are useful tools in clinical settings. The authors review the benefits and challenges associated with the development and implementation of an efficient electronic database for the multidisciplinary Vascular Birthmark Clinic at the Alberta Children's Hospital, Calgary, Alberta. The content and structure of the database were designed using the technical expertise of a data analyst from the Calgary Health Region. Relevant clinical and demographic data fields were included with the goal of documenting ongoing care of individual patients, and facilitating future epidemiological studies of this patient population. After completion of this database, 10 challenges encountered during development were retrospectively identified. Practical solutions for these challenges are presented. The challenges identified during the database development process included: identification of relevant data fields; balancing simplicity and user-friendliness with complexity and comprehensive data storage; database expertise versus clinical expertise; software platform selection; linkage of data from the previous spreadsheet to a new data management system; ethics approval for the development of the database and its utilization for research studies; ensuring privacy and limited access to the database; integration of digital photographs into the database; adoption of the database by support staff in the clinic; and maintaining up-to-date entries in the database. There are several challenges involved in the development of a useful and efficient clinical database. Awareness of these potential obstacles, in advance, may simplify the development of clinical databases by others in various surgical settings.

  20. Geologic datasets for weights-of-evidence analysis in northeast Washington: 2. Mineral databases

    USGS Publications Warehouse

    Boleneus, D.E.

    1999-01-01

    Digital mineral databases are necessary to carry out weights-of-evidence modeling of mineral resources for epithermal gold and carbonate-hosted lead-zinc deposits in northeast Washington. This report describes spreadsheet tables consisting of: 1) training sites for epithermal gold, 2) placer gold sites, 3) training sites for carbonate-hosted lead-zinc, and 4) small lead-zinc mines and prospects. A fifth table provides location data about sites in the four tables.

  1. PHASS99: A software program for retrieving and decoding the radiometric ages of igneous rocks from the international database IGBADAT

    NASA Astrophysics Data System (ADS)

    Al-Mishwat, Ali T.

    2016-05-01

    PHASS99 is a FORTRAN program designed to retrieve and decode radiometric and other physical age information of igneous rocks contained in the international database IGBADAT (Igneous Base Data File). In the database, ages are stored in a proprietary format using mnemonic representations. The program can handle up to 99 ages per igneous rock specimen and caters to forty radiometric age systems. The radiometric age alphanumeric strings assigned to each specimen description in the database consist of four components: the numeric age and its exponential modifier, a four-character mnemonic method identification, a two-character mnemonic name of the analysed material, and the reference number in the rock group bibliography vector. For each specimen, the program searches for radiometric age strings, extracts them, parses them, decodes the different age components, and converts them to high-level English equivalents. IGBADAT and similarly structured files are used for input. The output includes three files: a flat raw ASCII text file containing the retrieved radiometric age information, a generic spreadsheet-compatible file for data import to spreadsheets, and an error file. PHASS99 builds on the old decoder program TSTPHA (Test Physical Age) and greatly expands its capabilities. PHASS99 is simple, user-friendly, fast, and efficient, and does not require users to have knowledge of programming.
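
    Because the IGBADAT string layout is proprietary and not reproduced in the abstract, any decoder shown here must be illustrative. A sketch of the mnemonic-lookup idea in Python, under an assumed fixed-width layout (age, exponent, method mnemonic, material mnemonic, reference number):

      # Hypothetical mnemonic tables -- the real IGBADAT codes are not given here.
      METHODS = {"KAR ": "K-Ar", "RBSR": "Rb-Sr", "UPB ": "U-Pb"}
      MATERIALS = {"WR": "whole rock", "BI": "biotite", "ZR": "zircon"}

      def decode_age(s):
          """Decode an assumed layout: 6-char age, 3-char exponent,
          4-char method mnemonic, 2-char material mnemonic, 2-digit reference."""
          return {
              "age_ma": float(s[0:6]) * 10 ** int(s[6:9]),
              "method": METHODS.get(s[9:13], "unknown"),
              "material": MATERIALS.get(s[13:15], "unknown"),
              "reference": int(s[15:17]),
          }

      print(decode_age("1.2345+01KAR WR03"))  # 12.345 Ma, K-Ar, whole rock, ref. 3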

  2. Designing a data portal for synthesis modeling

    NASA Astrophysics Data System (ADS)

    Holmes, M. A.

    2006-12-01

    Processing of field and model data in multi-disciplinary integrated science studies is a vital part of synthesis modeling. Collection and storage techniques for field data vary greatly between the participating scientific disciplines due to the nature of the data being collected, whether in situ, remotely sensed, or recorded by automated data-logging equipment. Spreadsheets, personal databases, text files, and binary files are used in the initial storage and processing of the raw data. In order to be useful to scientists, engineers, and modelers, the data need to be stored in a format that is easily identifiable, accessible, and transparent to a variety of computing environments. The Model Operations and Synthesis (MOAS) database and associated web portal were created to provide such capabilities. The industry-standard relational database comprises spatial and temporal data tables, shape files, and supporting metadata accessible over the network, through a menu-driven web-based portal or spatially through ArcSDE connections from the user's local GIS desktop software. A separate server provides public access to spatial data and model output in the form of attributed shape files through an ArcIMS web-based graphical user interface.

  3. Sharing mutants and experimental information prepublication using FgMutantDB

    USDA-ARS?s Scientific Manuscript database

    There has been no central location for storing generated mutants of Fusarium graminearum or for data associated with these mutants. Instead researchers relied on several independent, non-integrated databases. FgMutantDB was designed as a simple spreadsheet that is accessible globally on the web th...

  4. What Spreadsheet and Database Skills Do Business Students Need?

    ERIC Educational Resources Information Center

    Coleman, Phillip D.; Blankenship, Ray J.

    2017-01-01

    The Principles of Information Systems course taught at a medium-sized Midwest University consists of Information Systems conceptual material plus Microsoft Excel and Access skills that the Information Systems faculty feel are most important to business students from all business disciplines. These skills range from using basic mathematic functions…

  5. LOTUS 1-2-3 Macros for Library Applications.

    ERIC Educational Resources Information Center

    Howden, Norman

    1987-01-01

    Describes LOTUS 1-2-3, an advanced spreadsheet with database and text manipulation functions that can be used with microcomputers by librarians to provide customized calculation and data acquisition tools. Macro commands and the menu system are discussed, and an example is given of an invoice procedure. (Author/LRW)

  6. Out of the Stone Age

    ERIC Educational Resources Information Center

    Kademan, Robyn

    2005-01-01

    One of the most beneficial uses for technology in the science classroom is data manipulation. During labs and other learning experiences, students can quickly put the data they collect into spreadsheets or databases. Then they can make comparisons, create graphs, draw conclusions, sort the data in new ways, and, ultimately, give their data…

  7. Automatic Grading of Spreadsheet and Database Skills

    ERIC Educational Resources Information Center

    Kovacic, Zlatko J.; Green, John Steven

    2012-01-01

    Growing enrollment in distance education has increased student-to-lecturer ratios and, therefore, increased the workload of the lecturer. This growing enrollment has resulted in mounting efforts to develop automatic grading systems in an effort to reduce this workload. While research in the design and development of automatic grading systems has a…

  8. The Devil and Daniel's Spreadsheet

    ERIC Educational Resources Information Center

    Burke, Maurice J.

    2012-01-01

    "When making mathematical models, technology is valuable for varying assumptions, exploring consequences, and comparing predictions with data," notes the Common Core State Standards Initiative (2010, p. 72). This exploration of the recursive process in the Devil and Daniel Webster problem reveals that the symbolic spreadsheet fits this bill.…

  9. Network-Based Visual Analysis of Tabular Data

    ERIC Educational Resources Information Center

    Liu, Zhicheng

    2012-01-01

    Tabular data is pervasive in the form of spreadsheets and relational databases. Although tables often describe multivariate data without explicit network semantics, it may be advantageous to explore the data modeled as a graph or network for analysis. Even when a given table design conveys some static network semantics, analysts may want to look…

  10. Learning the Semantics of Structured Data Sources

    ERIC Educational Resources Information Center

    Taheriyan, Mohsen

    2015-01-01

    Information sources such as relational databases, spreadsheets, XML, JSON, and Web APIs contain a tremendous amount of structured data; however, they rarely provide a semantic model to describe their contents. Semantic models of data sources capture the intended meaning of data sources by mapping them to the concepts and relationships defined by a…

  11. Computer Applications Course Goals, Outlines, and Objectives.

    ERIC Educational Resources Information Center

    Law, Debbie; Morgan, Michele

    This document contains a curriculum model that is designed to provide high school computer teachers with practical ideas for a 1-year computer applications course combining 3 quarters of instruction in keyboarding and 1 quarter of basic instruction in databases and spreadsheets. The document begins with a rationale and a 10-item list of…

  12. 40 CFR 80.1164 - What are the attest engagement requirements under the RFS program?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... volumes, contained in the inventory reconciliation analysis under § 80.133, and verify that the volumes reported to EPA agree with the volumes in the inventory reconciliation analysis. (iv) Compute and report as... reported to EPA. (v) Obtain the database, spreadsheet, or other documentation for all RINs used for...

  13. 40 CFR 80.1164 - What are the attest engagement requirements under the RFS program?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... volumes, contained in the inventory reconciliation analysis under § 80.133, and verify that the volumes reported to EPA agree with the volumes in the inventory reconciliation analysis. (iv) Compute and report as... reported to EPA. (v) Obtain the database, spreadsheet, or other documentation for all RINs used for...

  14. Animated-simulation modeling facilitates clinical-process costing.

    PubMed

    Zelman, W N; Glick, N D; Blackmore, C C

    2001-09-01

    Traditionally, the finance department has assumed responsibility for assessing process costs in healthcare organizations. To enhance process-improvement efforts, however, many healthcare providers need to include clinical staff in process cost analysis. Although clinical staff often use electronic spreadsheets to model the cost of specific processes, PC-based animated-simulation tools offer two major advantages over spreadsheets: they allow clinicians to interact more easily with the costing model so that it more closely represents the process being modeled, and they represent cost output as a cost range rather than as a single cost estimate, thereby providing more useful information for decision making.
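
    The contrast drawn here (a cost range instead of a single estimate) is essentially Monte Carlo simulation, which can be illustrated in a few lines; every cost driver below is invented for illustration:

      import random

      random.seed(1)  # reproducible illustration

      def one_episode():
          """Cost of one hypothetical clinical process, with variable inputs."""
          staff_minutes = random.triangular(20, 60, 35)  # low, high, mode
          supplies = random.uniform(40, 90)
          return staff_minutes * 1.2 + supplies          # assumed $1.20/min labor

      costs = sorted(one_episode() for _ in range(10_000))
      mean = sum(costs) / len(costs)
      low, high = costs[len(costs) // 20], costs[-len(costs) // 20]  # ~5th/95th pct
      print(f"mean ${mean:.0f}, 90% range ${low:.0f}-${high:.0f}")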

  15. Introducing Artificial Neural Networks through a Spreadsheet Model

    ERIC Educational Resources Information Center

    Rienzo, Thomas F.; Athappilly, Kuriakose K.

    2012-01-01

    Business students taking data mining classes are often introduced to artificial neural networks (ANN) through point and click navigation exercises in application software. Even if correct outcomes are obtained, students frequently do not obtain a thorough understanding of ANN processes. This spreadsheet model was created to illuminate the roles of…
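
    The pedagogical point of such a model (every intermediate ANN quantity visible, as in a spreadsheet cell) can be mirrored in a few lines of Python; the numbers are made up, and each variable stands in for one cell:

      import math

      # One neuron, two inputs -- each assignment below would be one spreadsheet cell.
      x1, x2 = 0.8, 0.2            # inputs
      w1, w2, b = 0.5, -0.3, 0.1   # weights and bias
      target, lr = 1.0, 0.5        # training target and learning rate

      z = w1 * x1 + w2 * x2 + b    # weighted sum (a SUMPRODUCT cell)
      y = 1 / (1 + math.exp(-z))   # sigmoid activation
      delta = (target - y) * y * (1 - y)   # error signal for one update

      # One gradient-descent step, the recalculation a spreadsheet makes per row.
      w1, w2, b = w1 + lr * delta * x1, w2 + lr * delta * x2, b + lr * delta
      print(f"output {y:.3f}, updated weights {w1:.3f}, {w2:.3f}")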

  16. Development of standardized air-blown coal gasifier/gas turbine concepts for future electric power systems, Volume 4

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1991-02-01

    This appendix is a compilation of work done to predict overall cycle performance from gasifier to generator terminals. A spreadsheet has been generated for each case to show flows within a cycle. The spreadsheet shows the gaseous or solid composition, temperature, quantity, and heat content of each flow. Steam and gas turbine performance was predicted with the computer program GTPro. Outputs of all runs for each combined cycle reviewed have been added to this appendix, as has a process schematic displaying all flows predicted through GTPro and the spreadsheet. The numbered bubbles on the schematic correspond to columns in the top headings of the spreadsheet.

  17. A World Wide Web (WWW) server database engine for an organelle database, MitoDat.

    PubMed

    Lemkin, P F; Chipperfield, M; Merril, C; Zullo, S

    1996-03-01

    We describe a simple database search engine, "dbEngine", which may be used to quickly create a searchable database on a World Wide Web (WWW) server. Data may be prepared from spreadsheet programs (such as Excel, etc.) or from tables exported from relational database systems. This Common Gateway Interface (CGI-BIN) program is used with a WWW server such as those available commercially, or from the National Center for Supercomputing Applications (NCSA) or CERN. Its capabilities include: (i) searching records by combinations of terms connected with ANDs or ORs; (ii) returning search results as hypertext links to other WWW database servers; (iii) mapping lists of literature reference identifiers to the full references; (iv) creating bidirectional hypertext links between pictures and the database. DbEngine has been used to support the MitoDat database (Mendelian and non-Mendelian inheritance associated with the Mitochondrion) on the WWW.
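
    Capability (i), term search with ANDs or ORs, is easy to picture. A minimal sketch over in-memory records (dbEngine itself ran as a CGI program, which this does not reproduce):

      records = [  # stand-ins for database rows
          {"id": 1, "text": "mitochondrial protein, human liver"},
          {"id": 2, "text": "nuclear protein, mouse brain"},
      ]

      def matches(record, terms, mode="AND"):
          """True if the record's text contains all (AND) or any (OR) terms."""
          hits = [t.lower() in record["text"].lower() for t in terms]
          return all(hits) if mode == "AND" else any(hits)

      print([r["id"] for r in records if matches(r, ["protein", "liver"])])      # [1]
      print([r["id"] for r in records if matches(r, ["liver", "brain"], "OR")])  # [1, 2]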

  18. Mathematical Modeling with MyMaps and Spreadsheets

    ERIC Educational Resources Information Center

    Weber, Victoria; Fortune, Nicholas; Williams, Derek; Whitehead, Ashley

    2016-01-01

    Software programs such as Tinkerplots ® or Geometer's Sketchpad ® can help students solve problems in mathematics classes, but may not be available to them after high school. In contrast, many students who become familiar with Internet tools and programs in office packages (word processing, spreadsheets, etc.) may use them daily to enhance their…

  19. ALOG user's manual: A Guide to using the spreadsheet-based artificial log generator

    Treesearch

    Matthew F. Winn; Philip A. Araman; Randolph H. Wynne

    2012-01-01

    Computer programs that simulate log sawing can be valuable training tools for sawyers, as well as a means of testing different sawing patterns. Most available simulation programs rely on diagrammed-log databases, which can be very costly and time consuming to develop. Artificial Log Generator (ALOG) is a user-friendly Microsoft® Excel®...

  20. Critical Discourse Analysis in Education: A Review of the Literature, 2004 to 2012

    ERIC Educational Resources Information Center

    Rogers, Rebecca; Schaenen, Inda; Schott, Christopher; O'Brien, Kathryn; Trigos-Carrillo, Lina; Starkey, Kim; Chasteen, Cynthia Carter

    2016-01-01

    This article reviews critical discourse analysis scholarship in education research from 2004 to 2012. Our methodology was carried out in three stages. First, we searched educational databases. Second, we completed an analytic review template for each article and encoded these data into a digital spreadsheet to assess macro-trends in the field.…

  1. When Spreadsheets Become Software - Quality Control Challenges and Approaches - 13360

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fountain, Stefanie A.; Chen, Emmie G.; Beech, John F.

    2013-07-01

    As part of a preliminary waste acceptance criteria (PWAC) development, several commercial models were employed, including the Hydrologic Evaluation of Landfill Performance model (HELP) [1], the Disposal Unit Source Term - Multiple Species model (DUSTMS) [2], and the Analytical Transient One, Two, and Three-Dimensional model (AT123D) [3]. The results of these models were post-processed in MS Excel spreadsheets to convert the model results to alternate units, compare the groundwater concentrations to the groundwater concentration thresholds, and then adjust the waste contaminant masses (based on average concentration over the waste volume) as needed, in an attempt to achieve groundwater concentrations at the limiting point of assessment that would meet the compliance concentrations while maximizing the potential use of the landfill (i.e., maximizing the volume of projected waste being generated that could be placed in the landfill). During the course of the PWAC calculation development, one of the Microsoft (MS) Excel spreadsheets used to post-process the results of the commercial model packages grew to include more than 575,000 formulas across 18 worksheets. This spreadsheet was used to assess six base scenarios as well as nine uncertainty/sensitivity scenarios. The complexity of the spreadsheet resulted in the need for a rigorous quality control (QC) procedure to verify data entry and confirm the accuracy of formulas. (authors)

  2. Modeling Constellation Virtual Missions Using the Vdot(Trademark) Process Management Tool

    NASA Technical Reports Server (NTRS)

    Hardy, Roger; ONeil, Daniel; Sturken, Ian; Nix, Michael; Yanez, Damian

    2011-01-01

    The authors have identified a software tool suite that will support NASA's Virtual Mission (VM) effort. This is accomplished by transforming a spreadsheet database of mission events, task inputs and outputs, timelines, and organizations into process visualization tools and a Vdot process management model that includes embedded analysis software as well as requirements and information related to data manipulation and transfer. This paper describes the progress to date, the application of the Virtual Mission not only to Constellation but to other architectures, and the pertinence to other aerospace applications. Vdot's intuitive visual interface brings VMs to life by turning static, paper-based processes into active, electronic processes that can be deployed, executed, managed, verified, and continuously improved. A VM can be executed using a computer-based, human-in-the-loop, real-time format, under the direction and control of the NASA VM Manager. Engineers in the various disciplines will not have to be Vdot-proficient but rather can fill out on-line, Excel-type databases with the mission information discussed above. The authors' tool suite converts this database into several process visualization tools for review and into Microsoft Project, which can be imported directly into Vdot. Many tools can be embedded directly into Vdot, and when the necessary data/information is received from a preceding task, the analysis can be initiated automatically. Other NASA analysis tools are too complex for this process, but Vdot automatically notifies the tool user that the data has been received and analysis can begin. The VM can be simulated from end to end using the authors' tool suite. The planned approach for the Vdot-based process simulation is to generate the process model from a database; other advantages of this semi-automated approach are that the participants can be geographically remote and that, after refining the process models via the human-in-the-loop simulation, the system can evolve into a process management server for the actual process.

  3. Whole-rock and glass major-element geochemistry of Kilauea Volcano, Hawaii, near-vent eruptive products: September 1994 through September 2001

    USGS Publications Warehouse

    Thornber, Carl R.; Sherrod, David R.; Siems, David F.; Heliker, Christina C.; Meeker, Gregory P.; Oscarson, Robert L.; Kauahikaua, James P.

    2002-01-01

    This report presents major-element geochemical data for glasses and whole-rock aliquots from 523 lava samples collected near the vent on Kilauea's east rift zone between September 1994 and October 2001. Information on sample collection, analysis techniques, and analytical standard reproducibility is presented as a PDF file, which also includes a detailed explanation of the categories of sample information presented in the database spreadsheet. The sample database is downloadable as a separate Microsoft Excel file.

  4. ExcelAutomat: a tool for systematic processing of files as applied to quantum chemical calculations

    NASA Astrophysics Data System (ADS)

    Laloo, Jalal Z. A.; Laloo, Nassirah; Rhyman, Lydia; Ramasami, Ponnadurai

    2017-07-01

    The processing of the input and output files of quantum chemical calculations often necessitates a spreadsheet as a key component of the workflow. Spreadsheet packages with a built-in programming language editor can automate the steps involved and thus provide a direct link between processing files and the spreadsheet. This helps to reduce user intervention as well as the need to switch between different programs to carry out each step. The ExcelAutomat tool is the implementation of this method in Microsoft Excel (MS Excel) using the default Visual Basic for Applications (VBA) programming language. The code in ExcelAutomat was adapted to work with the platform-independent open-source LibreOffice Calc, which also supports VBA. ExcelAutomat provides an interface through the spreadsheet to automate repetitive tasks such as merging input files, splitting, parsing and compiling data from output files, and generating unique filenames. Selected extracted parameters can be retrieved as variables which can be included in custom codes for a tailored approach. ExcelAutomat works with Gaussian files and is adapted for use with other computational packages, including the non-commercial GAMESS. ExcelAutomat is available as a downloadable MS Excel workbook or as a LibreOffice workbook.
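
    The parse-and-compile step is the heart of such a tool. A language-shifted sketch in Python (ExcelAutomat itself is VBA), where the "FINAL ENERGY" marker is a placeholder pattern rather than the literal text any particular quantum chemistry package emits:

      import glob
      import re

      def compile_results(folder, marker=r"FINAL ENERGY\s*=\s*(-?\d+\.\d+)"):
          """Scan every .out file in a folder and tabulate one value per file."""
          rows = []
          for path in sorted(glob.glob(f"{folder}/*.out")):
              with open(path) as fh:
                  m = re.search(marker, fh.read())
              rows.append((path, float(m.group(1)) if m else None))
          return rows

      # Write a spreadsheet-importable summary, one output file per line.
      # with open("summary.csv", "w") as out:
      #     for path, value in compile_results("jobs"):
      #         out.write(f"{path},{value}\n")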

  5. ExcelAutomat: a tool for systematic processing of files as applied to quantum chemical calculations.

    PubMed

    Laloo, Jalal Z A; Laloo, Nassirah; Rhyman, Lydia; Ramasami, Ponnadurai

    2017-07-01

    The processing of the input and output files of quantum chemical calculations often necessitates a spreadsheet as a key component of the workflow. Spreadsheet packages with a built-in programming language editor can automate the steps involved and thus provide a direct link between processing files and the spreadsheet. This helps to reduce user intervention as well as the need to switch between different programs to carry out each step. The ExcelAutomat tool is the implementation of this method in Microsoft Excel (MS Excel) using the default Visual Basic for Applications (VBA) programming language. The code in ExcelAutomat was adapted to work with the platform-independent open-source LibreOffice Calc, which also supports VBA. ExcelAutomat provides an interface through the spreadsheet to automate repetitive tasks such as merging input files, splitting, parsing and compiling data from output files, and generating unique filenames. Selected extracted parameters can be retrieved as variables which can be included in custom codes for a tailored approach. ExcelAutomat works with Gaussian files and is adapted for use with other computational packages, including the non-commercial GAMESS. ExcelAutomat is available as a downloadable MS Excel workbook or as a LibreOffice workbook.

  6. Solving L-L Extraction Problems with Excel Spreadsheet

    ERIC Educational Resources Information Center

    Teppaitoon, Wittaya

    2016-01-01

    This work aims to demonstrate the use of Excel spreadsheets for solving L-L extraction problems. The key to solving the problems successfully is to be able to determine a tie line on the ternary diagram where the calculation must be carried out. This enables the reader to analyze the extraction process starting with a simple operation, the…

  7. Ideas Tried, Lessons Learned, and Improvements to Make: A Journey in Moving a Spreadsheet-Intensive Course Online

    ERIC Educational Resources Information Center

    Berardi, Victor L.

    2012-01-01

    Using information systems to solve business problems is increasingly required of everyone in an organization, not just technical specialists. In the operations management class, spreadsheet usage has intensified with the focus on building decision models to solve operations management concerns such as forecasting, process capability, and inventory…

  8. Teaching Simulation and Computer-Aided Separation Optimization in Liquid Chromatography by Means of Illustrative Microsoft Excel Spreadsheets

    ERIC Educational Resources Information Center

    Fasoula, S.; Nikitas, P.; Pappa-Louisi, A.

    2017-01-01

    A series of Microsoft Excel spreadsheets were developed to simulate the process of separation optimization under isocratic and simple gradient conditions. The optimization procedure is performed in a stepwise fashion using simple macros for an automatic application of this approach. The proposed optimization approach involves modeling of the peak…

  9. The eNanoMapper database for nanomaterial safety information

    PubMed Central

    Chomenidis, Charalampos; Doganis, Philip; Fadeel, Bengt; Grafström, Roland; Hardy, Barry; Hastings, Janna; Hegi, Markus; Jeliazkov, Vedrin; Kochev, Nikolay; Kohonen, Pekka; Munteanu, Cristian R; Sarimveis, Haralambos; Smeets, Bart; Sopasakis, Pantelis; Tsiliki, Georgia; Vorgrimmler, David; Willighagen, Egon

    2015-01-01

    Background: The NanoSafety Cluster, a cluster of projects funded by the European Commission, identified the need for a computational infrastructure for toxicological data management of engineered nanomaterials (ENMs). Ontologies, open standards, and interoperable designs were envisioned to empower a harmonized approach to European research in nanotechnology. This setting provides a number of opportunities and challenges in the representation of nanomaterials data and the integration of ENM information originating from diverse systems. Within this cluster, eNanoMapper works towards supporting the collaborative safety assessment for ENMs by creating a modular and extensible infrastructure for data sharing, data analysis, and building computational toxicology models for ENMs. Results: The eNanoMapper database solution builds on the previous experience of the consortium partners in supporting diverse data through flexible data storage, open source components and web services. We have recently described the design of the eNanoMapper prototype database along with a summary of challenges in the representation of ENM data and an extensive review of existing nano-related data models, databases, and nanomaterials-related entries in chemical and toxicogenomic databases. This paper continues with a focus on the database functionality exposed through its application programming interface (API), and its use in visualisation and modelling. Considering the preferred community practice of using spreadsheet templates, we developed a configurable spreadsheet parser facilitating user-friendly data preparation and data upload. We further present a web application able to retrieve the experimental data via the API and analyze it with multiple data preprocessing and machine learning algorithms. Conclusion: We demonstrate how the eNanoMapper database is used to import and publish online ENM and assay data from several data sources, how the "representational state transfer" (REST) API enables building user-friendly interfaces and graphical summaries of the data, and how these resources facilitate the modelling of reproducible quantitative structure–activity relationships for nanomaterials (NanoQSAR). PMID:26425413

  10. MetaBar - a tool for consistent contextual data acquisition and standards compliant submission.

    PubMed

    Hankeln, Wolfgang; Buttigieg, Pier Luigi; Fink, Dennis; Kottmann, Renzo; Yilmaz, Pelin; Glöckner, Frank Oliver

    2010-06-30

    Environmental sequence datasets are increasing at an exponential rate; however, the vast majority of them lack appropriate descriptors like sampling location, time, and depth/altitude, generally referred to as metadata or contextual data. The consistent capture and structured submission of these data is crucial for integrated data analysis and ecosystems modeling. The application MetaBar has been developed to support consistent contextual data acquisition. MetaBar is a spreadsheet and web-based software tool designed to assist users in the consistent acquisition, electronic storage, and submission of contextual data associated with their samples. A preconfigured Microsoft Excel spreadsheet is used to initiate structured contextual data storage in the field or laboratory. Each sample is given a unique identifier, and at any stage the sheets can be uploaded to the MetaBar database server. To label samples, identifiers can be printed as barcodes. An intuitive web interface provides quick access to the contextual data in the MetaBar database as well as user and project management capabilities. Export functions facilitate contextual and sequence data submission to the International Nucleotide Sequence Database Collaboration (INSDC), comprising the DNA Data Bank of Japan (DDBJ), the European Molecular Biology Laboratory database (EMBL), and GenBank. MetaBar requests and stores contextual data in compliance with the Genomic Standards Consortium specifications. The MetaBar open source code base for local installation is available under the GNU General Public License version 3 (GNU GPL3). The MetaBar software supports the typical workflow from data acquisition and field-sampling to contextual data enriched sequence submission to an INSDC database. The integration with the megx.net marine Ecological Genomics database and portal facilitates georeferenced data integration and metadata-based comparisons of sampling sites as well as interactive data visualization. The ample export functionalities and the INSDC submission support enable exchange of data across disciplines and safeguarding contextual data.

  11. The eNanoMapper database for nanomaterial safety information.

    PubMed

    Jeliazkova, Nina; Chomenidis, Charalampos; Doganis, Philip; Fadeel, Bengt; Grafström, Roland; Hardy, Barry; Hastings, Janna; Hegi, Markus; Jeliazkov, Vedrin; Kochev, Nikolay; Kohonen, Pekka; Munteanu, Cristian R; Sarimveis, Haralambos; Smeets, Bart; Sopasakis, Pantelis; Tsiliki, Georgia; Vorgrimmler, David; Willighagen, Egon

    2015-01-01

    The NanoSafety Cluster, a cluster of projects funded by the European Commission, identified the need for a computational infrastructure for toxicological data management of engineered nanomaterials (ENMs). Ontologies, open standards, and interoperable designs were envisioned to empower a harmonized approach to European research in nanotechnology. This setting provides a number of opportunities and challenges in the representation of nanomaterials data and the integration of ENM information originating from diverse systems. Within this cluster, eNanoMapper works towards supporting the collaborative safety assessment for ENMs by creating a modular and extensible infrastructure for data sharing, data analysis, and building computational toxicology models for ENMs. The eNanoMapper database solution builds on the previous experience of the consortium partners in supporting diverse data through flexible data storage, open source components and web services. We have recently described the design of the eNanoMapper prototype database along with a summary of challenges in the representation of ENM data and an extensive review of existing nano-related data models, databases, and nanomaterials-related entries in chemical and toxicogenomic databases. This paper continues with a focus on the database functionality exposed through its application programming interface (API), and its use in visualisation and modelling. Considering the preferred community practice of using spreadsheet templates, we developed a configurable spreadsheet parser facilitating user-friendly data preparation and data upload. We further present a web application able to retrieve the experimental data via the API and analyze it with multiple data preprocessing and machine learning algorithms. We demonstrate how the eNanoMapper database is used to import and publish online ENM and assay data from several data sources, how the "representational state transfer" (REST) API enables building user-friendly interfaces and graphical summaries of the data, and how these resources facilitate the modelling of reproducible quantitative structure-activity relationships for nanomaterials (NanoQSAR).

  12. MetaBar - a tool for consistent contextual data acquisition and standards compliant submission

    PubMed Central

    2010-01-01

    Background Environmental sequence datasets are increasing at an exponential rate; however, the vast majority of them lack appropriate descriptors like sampling location, time, and depth/altitude, generally referred to as metadata or contextual data. The consistent capture and structured submission of these data is crucial for integrated data analysis and ecosystems modeling. The application MetaBar has been developed to support consistent contextual data acquisition. Results MetaBar is a spreadsheet and web-based software tool designed to assist users in the consistent acquisition, electronic storage, and submission of contextual data associated with their samples. A preconfigured Microsoft® Excel® spreadsheet is used to initiate structured contextual data storage in the field or laboratory. Each sample is given a unique identifier, and at any stage the sheets can be uploaded to the MetaBar database server. To label samples, identifiers can be printed as barcodes. An intuitive web interface provides quick access to the contextual data in the MetaBar database as well as user and project management capabilities. Export functions facilitate contextual and sequence data submission to the International Nucleotide Sequence Database Collaboration (INSDC), comprising the DNA Data Bank of Japan (DDBJ), the European Molecular Biology Laboratory database (EMBL), and GenBank. MetaBar requests and stores contextual data in compliance with the Genomic Standards Consortium specifications. The MetaBar open source code base for local installation is available under the GNU General Public License version 3 (GNU GPL3). Conclusion The MetaBar software supports the typical workflow from data acquisition and field-sampling to contextual data enriched sequence submission to an INSDC database. The integration with the megx.net marine Ecological Genomics database and portal facilitates georeferenced data integration and metadata-based comparisons of sampling sites as well as interactive data visualization. The ample export functionalities and the INSDC submission support enable exchange of data across disciplines and safeguarding contextual data. PMID:20591175

  13. Development of standardized air-blown coal gasifier/gas turbine concepts for future electric power systems, Volume 4. Appendix C: Design and performance of standardized fixed bed air-blown gasifier IGCC systems for future electric power generation: Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1991-02-01

    This appendix is a compilation of work done to predict overall cycle performance from gasifier to generator terminals. A spreadsheet has been generated for each case to show flows within a cycle. The spreadsheet shows the gaseous or solid composition, temperature, quantity, and heat content of each flow. Steam and gas turbine performance was predicted with the computer program GTPro. Outputs of all runs for each combined cycle reviewed have been added to this appendix, as has a process schematic displaying all flows predicted through GTPro and the spreadsheet. The numbered bubbles on the schematic correspond to columns in the top headings of the spreadsheet.

  14. ALOG: A spreadsheet-based program for generating artificial logs

    Treesearch

    Matthew F. Winn; Randolph H. Wynne; Philip A. Araman

    2004-01-01

    Log sawing simulation computer programs can be valuable tools for training sawyers as well as for testing different sawing patterns. Most available simulation programs rely on databases from which to draw logs and can be very costly and time-consuming to develop. ALOG (Artificial LOg Generator) is a Microsoft Excel®-based computer program that was developed to...

  15. Spreadsheet Applications using VisiCalc and Lotus 1-2-3 Programs.

    ERIC Educational Resources Information Center

    Cortland-Madison Board of Cooperative Educational Services, Cortland, NY.

    The VisiCalc program provides visual calculation on a computer, making use of an electronic worksheet that is beneficial to the business user for numerous accounting and clerical procedures. The Lotus 1-2-3 program begins with VisiCalc and improves upon it by adding graphics and a database as well as more efficient ways to manipulate and…

  16. How To Use the Spreadsheet as a Tool in the Secondary School Mathematics Classroom. Second Edition (for Windows and Macintosh Operating Systems).

    ERIC Educational Resources Information Center

    Masalski, William J.

    This book seeks to develop, enhance, and expand students' understanding of mathematics by using technology. Topics covered include the advantages of spreadsheets along with the opportunity to explore the 'what if?' type of questions encountered in the problem-solving process, enhancing the user's insight into the development and use of algorithms,…

  17. OntoMaton: a bioportal powered ontology widget for Google Spreadsheets.

    PubMed

    Maguire, Eamonn; González-Beltrán, Alejandra; Whetzel, Patricia L; Sansone, Susanna-Assunta; Rocca-Serra, Philippe

    2013-02-15

    Data collection in spreadsheets is ubiquitous, but current solutions lack support for collaborative semantic annotation that would promote shared and interdisciplinary annotation practices, supporting geographically distributed players. OntoMaton is an open source solution that brings ontology lookup and tagging capabilities into a cloud-based collaborative editing environment, harnessing Google Spreadsheets and the NCBO Web services. It is a general purpose, format-agnostic tool that may serve as a component of the ISA software suite. OntoMaton can also be used to assist the ontology development process. OntoMaton is freely available from Google widgets under the CPAL open source license; documentation and examples at: https://github.com/ISA-tools/OntoMaton.

  18. A document-centric approach for developing the tolAPC ontology.

    PubMed

    Blfgeh, Aisha; Warrender, Jennifer; Hilkens, Catharien M U; Lord, Phillip

    2017-11-28

    There are many challenges associated with ontology building, as the process often touches on many different subject areas; it needs knowledge of the problem domain, an understanding of the ontology formalism and the software in use, and, sometimes, an understanding of the philosophical background. In practice, it is very rare that an ontology can be completed by a single person, as one person is unlikely to combine all of these skills, so people with these skills must collaborate. One solution is to use face-to-face meetings, but these can be expensive and time-consuming for teams that are not co-located. Remote collaboration is possible, of course, but one difficulty is that domain specialists use a wide variety of different "formalisms" to represent and share their data; by far the most common is the "office file", either in the form of a word-processor document or a spreadsheet. Here we describe the development of an ontology of immunological cell types, initially developed by domain specialists using an Excel spreadsheet for collaboration. We have transformed this spreadsheet into an ontology using highly programmatic and pattern-driven ontology development. Critically, the spreadsheet remains part of the source for the ontology; the domain specialists are free to update it, and changes will percolate to the end ontology. We have developed a new ontology describing immunological cell lines built by instantiating ontology design patterns written programmatically, using values from a spreadsheet catalogue. This method employs a spreadsheet that was developed by domain experts. The spreadsheet is unconstrained in its usage and can be freely updated, resulting in a new ontology. This provides a general methodology for ontology development using data generated by domain specialists.
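
    A minimal sketch of the row-to-axiom idea in Python, with the spreadsheet columns and the Manchester-style serialization both invented for illustration (the tolAPC work uses formal, programmatic ontology design patterns rather than string templates):

      import csv
      import io

      # Stand-in for the domain specialists' spreadsheet catalogue.
      sheet = io.StringIO(
          "label,parent,marker\n"
          "Tolerogenic dendritic cell,Dendritic cell,IL-10\n"
      )

      PATTERN = (
          "Class: {label}\n"
          "  SubClassOf: {parent}\n"
          "  SubClassOf: produces some {marker}\n"
      )

      for row in csv.DictReader(sheet):
          # Each row instantiates the same pattern, so editing the spreadsheet
          # and re-running the build regenerates the ontology automatically.
          print(PATTERN.format(**row))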

  19. Converting analog interpretive data to digital formats for use in database and GIS applications

    USGS Publications Warehouse

    Flocks, James G.

    2004-01-01

    There is a growing need by researchers and managers for comprehensive and unified nationwide datasets of scientific data. These datasets must be in a digital format that is easily accessible using database and GIS applications, providing the user with access to a wide variety of current and historical information. Although most data currently being collected by scientists are already in a digital format, there is still a large repository of information in the literature and paper archive. Converting this information into a format accessible by computer applications is typically very difficult and can result in loss of data. However, since scientific data are commonly collected in a repetitious, concise manner (i.e., forms, tables, graphs, etc.), these data can be recovered digitally by using a conversion process that relates the position of an attribute in two-dimensional space to the information that the attribute signifies. For example, if a table contains a certain piece of information in a specific row and column, then the space that the row and column occupies becomes an index of that information. An index key is used to identify the relation between the physical location of the attribute and the information the attribute contains. The conversion process can be achieved rapidly, easily, and inexpensively using widely available digitizing and spreadsheet software and simple programming code. In the geological sciences, sedimentary character is commonly interpreted from geophysical profiles and descriptions of sediment cores. In the field and laboratory, these interpretations were typically transcribed to paper. The information from these paper archives is still relevant and increasingly important to scientists, engineers, and managers seeking to understand geologic processes affecting our environment. Direct scanning of this information produces a raster facsimile of the data, which allows it to be linked to the electronic world, but true integration of the content with database and GIS software as point, vector, or text information is commonly lost. Sediment core descriptions and interpretations of geophysical profiles are usually portrayed as lines, curves, symbols, and text information. They have vertical and horizontal dimensions associated with depth, category, time, or geographic position. These dimensions are displayed in consistent positions, which can be digitized and converted to a digital format, such as a spreadsheet. Once these data are in a digital, tabulated form, they can easily be made available to a wide variety of imaging and data manipulation software for compilation and world-wide dissemination.
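
    The position-to-attribute mapping described here amounts to binning digitized (x, y) coordinates through an index key. A small sketch with invented units and codes:

      # Hypothetical index key: which form column each x bin occupies, and what
      # each transcription code means.
      COLUMNS = {0: "lithology", 1: "grain_size"}  # x bins, 100 digitizer units wide
      CODES = {"S": "sand", "M": "mud"}

      def decode_point(x, y, code, units_per_ft=10):
          """Convert one digitized point on a scanned core log into a record."""
          return {
              "depth_ft": y / units_per_ft,        # vertical position -> depth
              "field": COLUMNS[int(x // 100)],     # horizontal position -> column
              "value": CODES.get(code, code),
          }

      print(decode_point(40, 125, "S"))  # depth 12.5 ft, lithology = sand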

  20. Process Improvement in a Radically Changing Organization

    NASA Technical Reports Server (NTRS)

    Varga, Denise M.; Wilson, Barbara M.

    2007-01-01

    This presentation describes how the NASA Glenn Research Center planned and implemented a process improvement effort in response to a radically changing environment. As a result of a presidential decision to redefine the Agency's mission, many ongoing projects were canceled and future workload would be awarded based on relevance to the Exploration Initiative. NASA imposed a new Procedural Requirements standard on all future software development, and the Center needed to redesign its processes from CMM Level 2 objectives to meet the new standard and position itself for CMMI. The intended audience for this presentation is systems/software developers and managers in a large, research-oriented organization that may need to respond to imposed standards while also pursuing CMMI Maturity Level goals. A set of internally developed tools will be presented, including an overall Process Improvement Action Item database, a formal inspection/peer review tool, metrics collection spreadsheet, and other related technologies. The Center also found a need to charter Technical Working Groups (TWGs) to address particular Process Areas. In addition, a Marketing TWG was needed to communicate the process changes to the development community, including an innovative web site portal.

  1. Refinery spreadsheet highlights microcomputer process applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tucker, M.A.

    1984-01-23

    Microcomputer applications in the process areas at Chevron U.S.A. refineries and at the Chevron Research Co. illustrate how the microcomputer has changed the way we do our jobs. This article will describe major uses of the microcomputer as a personal work tool in Chevron process areas. It will also describe how and why many of Chevron's microcomputer applications were developed and their characteristics. One of our earliest microcomputer applications, developed in late 1981, was an electronic spreadsheet program using a small desktop microcomputer. It was designed to help a refinery planner prepare monthly plans for a small portion of one of our major refineries. This particular microcomputer had a tiny 4-in. screen, and the reports were several strips of print-out from the microcomputer's 3-in.-wide internal printer taped together. In spite of these archaic computing conditions, it was a successful application. It automated what had been very tedious and time-consuming calculations with a pencil, a calculator, and a great deal of erasing. It eliminated filling out large "horseblanket" reports. The electronic spreadsheet was also flexible; the planner could easily change the worksheet to match new operating constraints, new process conditions, and new feeds and products. Fortunately, within just a few months, this application graduated to a similar electronic spreadsheet program on a new, more powerful microcomputer. It had a bigger display screen and a letter-size printer. The same application is still in use today, although it has been greatly enhanced and altered to match extensive plant modifications. And there are plans to expand it again onto yet another, more powerful microcomputer.

  2. CSAM Metrology Software Tool

    NASA Technical Reports Server (NTRS)

    Vu, Duc; Sandor, Michael; Agarwal, Shri

    2005-01-01

    CSAM Metrology Software Tool (CMeST) is a computer program for analysis of false-color CSAM images of plastic-encapsulated microcircuits. (CSAM signifies C-mode scanning acoustic microscopy.) The colors in the images indicate areas of delamination within the plastic packages. Heretofore, the images have been interpreted by human examiners, so interpretations have not been entirely consistent and objective. CMeST processes the color information in image-data files to detect areas of delamination without incurring the inconsistencies of subjective judgement. CMeST can be used to create a database of baseline images of packages acquired at given times for comparison with images of the same packages acquired at later times. Any area within an image can be selected for analysis, which can include examination of different delamination types by location. CMeST can also be used to perform statistical analyses of image data. Results of analyses are available in a spreadsheet format for further processing. The results can be exported to any database-processing software.

  3. The personal computer and GP-B management. [Gravity Probe experiment

    NASA Technical Reports Server (NTRS)

    Neighbors, A. K.

    1986-01-01

    The Gravity Probe-B (GP-B) experiment is one of the most sophisticated and challenging developments to be undertaken by NASA. Its objective is to measure the relativistic drift of gyroscopes in orbit about the earth. In this paper, the experiment is described, and the strategy of phased procurements for accomplishing the engineering development of the hardware is discussed. The microcomputer is a very convenient and powerful tool in the management of GP-B. It is used in creating and monitoring such project data as schedules, budgets, hardware procurements, and technical and interface requirements. Commercially available software for word processing, database management, communications, spreadsheets, graphics, and program management is used. Examples are described that illustrate the efficacy of the management team's application of the computer.

  4. Using Quasi-Horizontal Alignment in the absence of the actual alignment.

    PubMed

    Banihashemi, Mohamadreza

    2016-10-01

    Horizontal alignment is a major roadway characteristic used in safety and operational evaluations of many facility types. The Highway Safety Manual (HSM) uses this characteristic in crash prediction models for rural two-lane highways, freeway segments, and freeway ramps/C-D roads. Traffic simulation models use this characteristic in their processes on almost all types of facilities. However, a good portion of roadway databases do not include horizontal alignment data; instead, many contain point coordinate data along the roadways. The SHRP 2 Roadway Information Database (RID) is a good example of this type of data. Only about 5% of this geodatabase contains alignment information; for the rest, point data can easily be produced. Although the point data can be used to extract the actual horizontal alignment, doing so is a cumbersome and costly process, especially for a database covering many miles of highways. This research introduces a so-called "Quasi-Horizontal Alignment" that can be produced easily and automatically from point coordinate data and can be used in the safety and operational evaluations of highways. SHRP 2 RID for rural two-lane highways in Washington State is used in this study. This paper presents a process through which Quasi-Horizontal Alignments are produced from point coordinates along highways by using spreadsheet software such as MS EXCEL. It is shown that the safety and operational evaluations of the highways with Quasi-Horizontal Alignments are almost identical to those with the actual alignments. In the absence of the actual alignment, the Quasi-Horizontal Alignment can easily be produced from any type of database that contains highway coordinates, such as geodatabases and digital maps. Copyright © 2016 Elsevier Ltd. All rights reserved.
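
    The paper's exact procedure is not reproduced here, but the general idea of deriving curvature information from point coordinates can be sketched in a few lines; the deflection threshold and the sample points below are arbitrary assumptions, not values from the study.

        import math

        # Flag curved segments from point coordinates: compute the deflection
        # angle between successive chords and compare it against a threshold.
        def bearings(points):
            return [math.atan2(y2 - y1, x2 - x1)
                    for (x1, y1), (x2, y2) in zip(points, points[1:])]

        def flag_curves(points, threshold_rad=0.01):
            b = bearings(points)
            flags = []
            for b1, b2 in zip(b, b[1:]):
                d = (b2 - b1 + math.pi) % (2 * math.pi) - math.pi  # wrap to [-pi, pi)
                flags.append(abs(d) > threshold_rad)
            return flags

        pts = [(0, 0), (100, 0), (200, 1), (300, 5), (400, 12)]
        print(flag_curves(pts))  # True where the chord direction bends sharply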

  5. Emissions & Generation Resource Integrated Database (eGRID), eGRID2010

    EPA Pesticide Factsheets

    The Emissions & Generation Resource Integrated Database (eGRID) is a comprehensive source of data on the environmental characteristics of almost all electric power generated in the United States. These environmental characteristics include air emissions for nitrogen oxides, sulfur dioxide, carbon dioxide, methane, and nitrous oxide; emissions rates; net generation; resource mix; and many other attributes. eGRID2010 contains the complete release of year 2007 data, as well as years 2005 and 2004 data. Excel spreadsheets, full documentation, summary data, eGRID subregion and NERC region representational maps, and GHG emission factors are included in this data set. The archived data in eGRID2002 contain years 1996 through 2000 data. For year 2007 data, the first Microsoft Excel workbook, Plant, contains boiler, generator, and plant spreadsheets. The second Microsoft Excel workbook, Aggregation, contains aggregated data by state, electric generating company, parent company, power control area, eGRID subregion, NERC region, and U.S. total levels. The third Microsoft Excel workbook, ImportExport, contains state import-export data, as well as U.S. generation and consumption data for years 2007, 2005, and 2004. For eGRID data for years 2005 and 2004, a user-friendly web application, eGRIDweb, is available to select, view, print, and export specified data.

  6. Database improvements for motor vehicle/bicycle crash analysis

    PubMed Central

    Lusk, Anne C; Asgarzadeh, Morteza; Farvid, Maryam S

    2015-01-01

    Background Bicycling is healthy but needs to be safer for more to bike. Police crash templates are designed for reporting crashes between motor vehicles, but not between vehicles/bicycles. If written/drawn bicycle-crash-scene details exist, these are not entered into spreadsheets. Objective To assess which bicycle-crash-scene data might be added to spreadsheets for analysis. Methods Police crash templates from 50 states were analysed. Reports for 3350 motor vehicle/bicycle crashes (2011) were obtained for the New York City area and 300 cases selected (with drawings and on roads with sharrows, bike lanes, cycle tracks and no bike provisions). Crashes were redrawn and new bicycle-crash-scene details were coded and entered into the existing spreadsheet. The association between severity of injuries and bicycle-crash-scene codes was evaluated using multiple logistic regression. Results Police templates only consistently include pedal-cyclist and helmet. Bicycle-crash-scene coded variables for templates could include: 4 bicycle environments, 18 vehicle impact-points (opened-doors and mirrors), 4 bicycle impact-points, motor vehicle/bicycle crash patterns, in/out of the bicycle environment and bike/relevant motor vehicle categories. A test of including these variables suggested that, with bicyclists who had minor injuries as the control group, bicyclists on roads with bike lanes riding outside the lane had lower likelihood of severe injuries (OR, 0.40, 95% CI 0.16 to 0.98) compared with bicyclists riding on roads without bicycle facilities. Conclusions Police templates should include additional bicycle-crash-scene codes for entry into spreadsheets. Crash analysis, including with big data, could then be conducted on bicycle environments, motor vehicle potential impact points/doors/mirrors, bicycle potential impact points, motor vehicle characteristics, location and injury. PMID:25835304

  7. Spreadsheets Answer "What If...?

    ERIC Educational Resources Information Center

    Pogge, Alfred F.; Lunetta, Vincent N.

    1987-01-01

    Demonstrates how a spreadsheet program can do calculations, freeing students to question, analyze data and learn science. Notes several popular spreadsheet programs. Gives an example using Lotus 1-2-3 spreadsheets for a sampling experiment in Biology. Shows other examples of spreadsheet use in laboratory activities. (CW)

  8. Simulation of 2D Waves in Circular Membrane Using Excel Spreadsheet with Visual Basic for Teaching Activity

    NASA Astrophysics Data System (ADS)

    Eso, R.; Safiuddin, L. O.; Agusu, L.; Arfa, L. M. R. F.

    2018-04-01

    We propose a teaching instrument demonstrating circular membrane waves using interactive Excel spreadsheets with Visual Basic for Applications (VBA) programming. It is based on the analytic solution for circular membrane waves involving Bessel functions. The vibration modes and frequencies are determined using the Bessel approximation and the initial conditions. A 3D perspective based on spreadsheet functions and facilities is used to show 3D objects in translational or rotational motion. This instrument is very useful both in teaching and in the learning of wave physics. The visualization, which shows the m and n vibration modes of the wave at a given frequency very clearly, has been compared with and matched to experimental results obtained using the resonance method. The peak deflection varies in time when an initial condition is applied, and shows the same pattern as a MATLAB simulation with zero initial velocity.
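
    The analytic solution behind the visualization can be stated compactly: the (m, n) mode is u_mn(r, theta, t) = J_m(alpha_mn * r / a) * cos(m * theta) * cos(c * alpha_mn * t / a), where alpha_mn is the n-th positive zero of the Bessel function J_m, a is the membrane radius, and c is the wave speed. A small Python/SciPy sketch (radius and wave speed are arbitrary example values) evaluates one mode on a polar grid:

        import numpy as np
        from scipy.special import jv, jn_zeros

        # Evaluate the (m, n) normal mode of a circular membrane of radius a.
        a, c = 1.0, 1.0                  # arbitrary example radius and wave speed
        m, n = 2, 1                      # angular and radial mode numbers

        k = jn_zeros(m, n)[-1] / a       # wavenumber: n-th zero of J_m over a

        def u(r, theta, t):
            return jv(m, k * r) * np.cos(m * theta) * np.cos(c * k * t)

        r, theta = np.meshgrid(np.linspace(0, a, 50), np.linspace(0, 2*np.pi, 90))
        print(u(r, theta, t=0.0).shape)  # (90, 50) grid of deflections at t = 0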

  9. Organizing, exploring, and analyzing antibody sequence data: the case for relational-database managers.

    PubMed

    Owens, John

    2009-01-01

    Technological advances in the acquisition of DNA and protein sequence information and the resulting onrush of data can quickly overwhelm the scientist unprepared for the volume of information that must be evaluated and carefully dissected to discover its significance. Few laboratories have the luxury of dedicated personnel to organize, analyze, or consistently record a mix of arriving sequence data. A methodology based on a modern relational-database manager is presented that is both a natural storage vessel for antibody sequence information and a conduit for organizing and exploring sequence data and accompanying annotation text. The expertise necessary to implement such a plan is equal to that required by electronic word processors or spreadsheet applications. Antibody sequence projects maintained as independent databases are selectively unified by the relational-database manager into larger database families that contribute to local analyses, reports, interactive HTML pages, or exported to facilities dedicated to sophisticated sequence analysis techniques. Database files are transposable among current versions of Microsoft Windows, Macintosh, and UNIX operating systems.

  10. CMO: Cruise Metadata Organizer for JAMSTEC Research Cruises

    NASA Astrophysics Data System (ADS)

    Fukuda, K.; Saito, H.; Hanafusa, Y.; Vanroosebeke, A.; Kitayama, T.

    2011-12-01

    JAMSTEC's Data Research Center for Marine-Earth Sciences manages and distributes a wide variety of observational data and samples obtained from JAMSTEC research vessels and deep sea submersibles. Generally, metadata are essential to identify the data and samples that were obtained. In JAMSTEC, cruise metadata include cruise information, such as cruise ID, name of vessel and research theme, and diving information, such as dive number, name of submersible and position of diving point. They are submitted by the chief scientists of research cruises in Microsoft Excel® spreadsheet format, and registered into a data management database to confirm receipt of observational data files, cruise summaries, and cruise reports. The cruise metadata are also published via "JAMSTEC Data Site for Research Cruises" within two months after the end of a cruise. Furthermore, these metadata are distributed with observational data, images and samples via several data and sample distribution websites after a publication moratorium period. However, there are two operational issues in the metadata publishing process. One is duplicated effort and asynchronous metadata across multiple distribution websites, caused by administrators entering metadata into each website manually. The other is that each website uses different data types or representations for the metadata. To solve these problems, we have developed a cruise metadata organizer (CMO) which allows cruise metadata to be connected from the data management database to several distribution websites. CMO is comprised of three components: an Extensible Markup Language (XML) database, Enterprise Application Integration (EAI) software, and a web-based interface. The XML database is used because of its flexibility for any change of metadata. Daily differential uptake of metadata from the data management database to the XML database is processed automatically via the EAI software. Some metadata are entered into the XML database using the web-based interface by a metadata editor in CMO as needed. Then daily differential uptake of metadata from the XML database to the databases of several distribution websites is processed automatically using a converter defined by the EAI software. Currently, CMO is available for three distribution websites: "Deep Sea Floor Rock Sample Database GANSEKI", "Marine Biological Sample Database", and "JAMSTEC E-library of Deep-sea Images". CMO is planned to provide "JAMSTEC Data Site for Research Cruises" with metadata in the future.

  11. Automation of PCXMC and ImPACT for NASA Astronaut Medical Imaging Dose and Risk Tracking

    NASA Technical Reports Server (NTRS)

    Bahadori, Amir; Picco, Charles; Flores-McLaughlin, John; Shavers, Mark; Semones, Edward

    2011-01-01

    To automate astronaut organ and effective dose calculations from occupational X-ray and computed tomography (CT) examinations incorporating PCXMC and ImPACT tools and to estimate the associated lifetime cancer risk per the National Council on Radiation Protection & Measurements (NCRP) using MATLAB(R). Methods: NASA follows guidance from the NCRP on its operational radiation safety program for astronauts. NCRP Report 142 recommends that astronauts be informed of the cancer risks from reported exposures to ionizing radiation from medical imaging. MATLAB(R) code was written to retrieve exam parameters for medical imaging procedures from a NASA database, calculate associated dose and risk, and return results to the database, using the Microsoft .NET Framework. This code interfaces with the PCXMC executable and emulates the ImPACT Excel spreadsheet to calculate organ doses from X-rays and CTs, respectively, eliminating the need to utilize the PCXMC graphical user interface (except for a few special cases) and the ImPACT spreadsheet. Results: Using MATLAB(R) code to interface with PCXMC and replicate ImPACT dose calculation allowed for rapid evaluation of multiple medical imaging exams. The user inputs the exam parameter data into the database and runs the code. Based on the imaging modality and input parameters, the organ doses are calculated. Output files are created for record, and organ doses, effective dose, and cancer risks associated with each exam are written to the database. Annual and post-flight exposure reports, which are used by the flight surgeon to brief the astronaut, are generated from the database. Conclusions: Automating PCXMC and ImPACT for evaluation of NASA astronaut medical imaging radiation procedures allowed for a traceable and rapid method for tracking projected cancer risks associated with over 12,000 exposures. This code will be used to evaluate future medical radiation exposures, and can easily be modified to accommodate changes to the risk calculation procedure.
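
    The paper's automation is MATLAB-specific, and PCXMC's batch interface is not described in the abstract. The Python sketch below shows only the generic pattern of driving an external dose-calculation executable and parsing its output; the executable name, input-file format, and output format are hypothetical placeholders, not PCXMC's or ImPACT's real interfaces.

        import subprocess
        from pathlib import Path

        # Generic pattern: write an input file, run an external calculator,
        # parse its output. Everything file- and executable-specific here is
        # an invented placeholder.
        def run_dose_calc(exam_params, workdir="runs"):
            Path(workdir).mkdir(exist_ok=True)
            infile = Path(workdir, f"{exam_params['exam_id']}.in")
            infile.write_text("\n".join(f"{k}={v}" for k, v in exam_params.items()))
            subprocess.run(["dosecalc.exe", str(infile)], check=True)  # hypothetical
            outfile = infile.with_suffix(".out")
            return dict(line.split("=", 1) for line in
                        outfile.read_text().splitlines() if "=" in line)

        params = {"exam_id": "CT_0001", "kVp": "120", "mAs": "200", "region": "chest"}
        # results = run_dose_calc(params)  # then write organ doses back to the database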

  12. How I do it: a practical database management system to assist clinical research teams with data collection, organization, and reporting.

    PubMed

    Lee, Howard; Chapiro, Julius; Schernthaner, Rüdiger; Duran, Rafael; Wang, Zhijun; Gorodetski, Boris; Geschwind, Jean-François; Lin, MingDe

    2015-04-01

    The objective of this study was to demonstrate that an intra-arterial liver therapy clinical research database system is a more workflow-efficient and robust tool for clinical research than a spreadsheet storage system. The database system could be used to generate clinical research study populations easily with custom search and retrieval criteria. A questionnaire was designed and distributed to 21 board-certified radiologists to assess current data storage problems and clinician reception to a database management system. Based on the questionnaire findings, a customized database and user interface system were created to perform automatic calculations of clinical scores, including staging systems such as the Child-Pugh and Barcelona Clinic Liver Cancer, and to facilitate data input and output. Questionnaire participants were favorable to a database system. The interface retrieved study-relevant data accurately and effectively. The database effectively produced easy-to-read study-specific patient populations with custom-defined inclusion/exclusion criteria. The database management system is workflow-efficient and robust in retrieving, storing, and analyzing data. Copyright © 2015 AUR. Published by Elsevier Inc. All rights reserved.

  13. Basic statistics with Microsoft Excel: a review.

    PubMed

    Divisi, Duilio; Di Leonardo, Gabriella; Zaccagna, Gino; Crisci, Roberto

    2017-06-01

    The scientific world is enriched daily with new knowledge, due to new technologies and continuous discoveries. Mathematical functions explain the statistical concepts, particularly those of mean, median and mode, along with those of frequency and frequency distribution, associated with histograms and graphical representations, determining elaborative processes on the basis of spreadsheet operations. The aim of this study is to highlight the mathematical basis of the statistical models that govern the operation of spreadsheets in Microsoft Excel.

  14. Basic statistics with Microsoft Excel: a review

    PubMed Central

    Di Leonardo, Gabriella; Zaccagna, Gino; Crisci, Roberto

    2017-01-01

    The scientific world is enriched daily with new knowledge, due to new technologies and continuous discoveries. Mathematical functions explain the statistical concepts, particularly those of mean, median and mode, along with those of frequency and frequency distribution, associated with histograms and graphical representations, determining elaborative processes on the basis of spreadsheet operations. The aim of this study is to highlight the mathematical basis of the statistical models that govern the operation of spreadsheets in Microsoft Excel. PMID:28740690

  15. Using Spreadsheets in the Management, Analysis, and Reporting of Evaluation Data.

    ERIC Educational Resources Information Center

    Glowacki, Margaret L.; Rice, Richard L., Jr.

    Currently available spreadsheet programs for microcomputers provide many features that can be very useful for evaluators and researchers. Some of the basic concepts involved in spreadsheet use are introduced, and information is provided on the use of spreadsheets in maintaining and analyzing evaluation data. The spreadsheet used in the discussion…

  16. The Managed Readiness Simulator: A Force Readiness Model

    DTIC Science & Technology

    2011-12-01

    specific military occupations. Tyche Fleet Mix Model (Eisler, Bourque, and Reive 2009) provides the most effective mix of maritime fleet assets to...through the GUI input screens or imported from an external source such as a corporate database or spreadsheet. As the Simulation Manager, the GUI...Memorandum DRDC CORA TM 2011-xx. (in press) Skraba, A., M. Kljajic, A. Knaflic, D. Kofjac, and I. Podbregar. 2007. “Development of a Human Re

  17. Beyond the Mechanics of Spreadsheets: Using Design Instruction to Address Spreadsheet Errors

    ERIC Educational Resources Information Center

    Schneider, Kent N.; Becker, Lana L.; Berg, Gary G.

    2017-01-01

    Given that the usage and complexity of spreadsheets in the accounting profession are expected to increase, it is more important than ever to ensure that accounting graduates are aware of the dangers of spreadsheet errors and are equipped with design skills to minimize those errors. Although spreadsheet mechanics are prevalent in accounting…

  18. Water-level database update for the Death Valley regional groundwater flow system, Nevada and California, 1907-2007

    USGS Publications Warehouse

    Pavelko, Michael T.

    2010-01-01

    The water-level database for the Death Valley regional groundwater flow system in Nevada and California was updated. The database includes more than 54,000 water levels collected from 1907 to 2007, from more than 1,800 wells. Water levels were assigned a primary flag and multiple secondary flags that describe hydrologic conditions and trends at the time of the measurement and identify pertinent information about the well or water-level measurement. The flags provide a subjective measure of the relative accuracy of the measurements and are used to identify which water levels are appropriate for calculating head observations in a regional transient groundwater flow model. Included in the report appendix are all water-level data and their flags, selected well data, and an interactive spreadsheet for viewing hydrographs and well locations.

  19. Problem Solving with Spreadsheets.

    ERIC Educational Resources Information Center

    Catterall, P.; Lewis, R.

    1985-01-01

    Documents the educational use of spreadsheets through a description of exploratory work which utilizes spreadsheets to achieve the objectives of Conway's Game of Life, a scientific method game for the development of problem-solving techniques. The implementation and classroom use of the spreadsheet programs are discussed. (MBR)

  20. Errors in patient specimen collection: application of statistical process control.

    PubMed

    Dzik, Walter Sunny; Beckman, Neil; Selleng, Kathleen; Heddle, Nancy; Szczepiorkowski, Zbigniew; Wendel, Silvano; Murphy, Michael

    2008-10-01

    Errors in the collection and labeling of blood samples for pretransfusion testing increase the risk of transfusion-associated patient morbidity and mortality. Statistical process control (SPC) is a recognized method to monitor the performance of a critical process. An easy-to-use SPC method was tested to determine its feasibility as a tool for monitoring quality in transfusion medicine. SPC control charts were adapted to a spreadsheet presentation. Data tabulating the frequency of mislabeled and miscollected blood samples from 10 hospitals in five countries from 2004 to 2006 were used to demonstrate the method. Control charts were produced to monitor process stability. The participating hospitals found the SPC spreadsheet well suited to monitoring the performance of sample labeling and collection, and applied SPC charts to suit their specific needs. One hospital monitored subcategories of sample error in detail. A large hospital monitored the number of wrong-blood-in-tube (WBIT) events. Four smaller-sized facilities, each following the same policy for sample collection, combined their data on WBIT samples into a single control chart. One hospital used the control chart to monitor the effect of an educational intervention. A simple SPC method is described that can monitor the process of sample collection and labeling in any hospital. SPC could be applied to other critical steps in the transfusion process as a tool for biovigilance and could be used to develop regional or national performance standards for pretransfusion sample collection. A link is provided to download the spreadsheet for free.
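
    The control-chart arithmetic adapts directly to a spreadsheet or a few lines of code: with p_bar the pooled error proportion, the 3-sigma limits for a month with n samples are p_bar +/- 3*sqrt(p_bar*(1-p_bar)/n). A minimal p-chart sketch with invented monthly counts:

        import math

        # p-chart for the proportion of mislabeled samples per month.
        # The monthly counts below are invented for illustration.
        errors  = [12, 9, 15, 11, 8, 14, 22, 10]       # mislabeled samples
        samples = [4000, 3800, 4200, 4100, 3900, 4000, 4050, 3950]

        p_bar = sum(errors) / sum(samples)             # pooled proportion
        for e, n in zip(errors, samples):
            sigma = math.sqrt(p_bar * (1 - p_bar) / n)
            ucl, lcl = p_bar + 3 * sigma, max(0.0, p_bar - 3 * sigma)
            flag = "ok" if lcl <= e / n <= ucl else "OUT"
            print(f"p={e/n:.4f}  LCL={lcl:.4f}  UCL={ucl:.4f}  {flag}")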

  1. Determination of Needed Spreadsheet Competencies for Business Personnel in the Mid-South States.

    ERIC Educational Resources Information Center

    Rogers, Betty S.; Arn, Joseph V.

    1993-01-01

    A survey of 209 Mid-South businesses determined spreadsheet usage, what competencies are needed for entry-level and continued employment, and sources of spreadsheet training. Recommended that, because of their widespread use, spreadsheets should be taught to all business students. (Author/JOW)

  2. Compilation of Disruptions to Airports by Volcanic Activity (Version 1.0, 1944-2006)

    USGS Publications Warehouse

    Guffanti, Marianne; Mayberry, Gari C.; Casadevall, Thomas J.; Wunderman, Richard

    2008-01-01

    Volcanic activity has caused significant hazards to numerous airports worldwide, with local to far-ranging effects on travelers and commerce. To more fully characterize the nature and scope of volcanic hazards to airports, we collected data on incidents of airports throughout the world that have been affected by volcanic activity, beginning in 1944 with the first documented instance of damage to modern aircraft and facilities in Naples, Italy, and extending through 2006. Information was gleaned from various sources, including news outlets, volcanological reports (particularly the Smithsonian Institution's Bulletin of the Global Volcanism Network), and previous publications on the topic. This report presents the full compilation of the data collected. For each incident, information about the affected airport and the volcanic source has been compiled as a record in a Microsoft Access database. The database is incomplete insofar as some incidents may not have been reported or documented, but it does present a good sample from diverse parts of the world. Not included are en-route diversions to avoid airborne ash clouds at cruise altitudes. The database has been converted to a Microsoft Excel spreadsheet. To make the PDF version of table 1 in this open-file report resemble the spreadsheet, order the PDF pages as 12, 17, 22; 13, 18, 23; 14, 19, 24; 15, 20, 25; and 16, 21, 26. Analysis of the database reveals that, at a minimum, 101 airports in 28 countries were impacted on 171 occasions from 1944 through 2006 by eruptions at 46 volcanoes. The number of affected airports (101) probably is better constrained than the number of incidents (171) because recurring disruptions at a given airport may have been lumped together or not reported by news agencies, whereas the initial disruption likely is noticed and reported and thus the airport correctly counted.

  3. GERIREX - growing a second generation medical expert system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kocur, J. Jr.; Suh, S.C.

    This article describes GERIREX, a medical expert system serving as the core module of an integrated system for total management of a medical practice. GERIREX is currently a first-generation consultant in the domain of prescribing for the geriatric patient with multiple ailments. Employing rule and objective probabilistic knowledge representations, the system performs at the near-expert level, correctly ranking single and multiple drug therapy for hypertension and/or congestive heart failure in the presence of between two and seven of 18 common accompanying or underlying conditions. GERIREX creates permanent consultation records and can access patient information from existing databases. System requirements are met by very modest PCs, yet power, speed, flexibility, and ease of use rival or exceed those of many other systems. GERIREX interfaces with a variety of configurations and applications, including text, spreadsheets, databases, and executables, to fit in with current plans to upgrade to a second-generation system, providing a degree of self-maintenance through intelligent parsing of a drug data source such as the Physicians' Desk Reference (PDR, CD-ROM version). Another option under consideration is developing neural networks both to replace the current knowledge base and to embody the rationale employed by the medical expert in evaluating drug data for treatment selection. In this version, the current drug database would be used as training data for the network tasked with adding new drugs to the drug database, imitating the process whereby a physician determines their personal arsenal from among the wide range of available options.

  4. Solving Optimization Problems with Spreadsheets

    ERIC Educational Resources Information Center

    Beigie, Darin

    2017-01-01

    Spreadsheets provide a rich setting for first-year algebra students to solve problems. Individual spreadsheet cells play the role of variables, and creating algebraic expressions for a spreadsheet to perform a task allows students to achieve a glimpse of how mathematics is used to program a computer and solve problems. Classic optimization…

  5. Using a spreadsheet/table template for economic value added analysis.

    PubMed

    Cassey, Margaret

    2008-01-01

    Translating clinical research into practical applications that are cost-effective has received significant attention as staff nurses attempt to integrate new knowledge into an already complex daily workflow. A spreadsheet/table template created in a word-processing format can assist with setting up and carrying out the analysis of costs for comparing different approaches to routine activities. By encouraging nurses to take the initiative to examine parts of everyday nursing practice with an eye to cost analysis, significant contributions can be made to maximizing the bottom line.

  6. Lifelong learning skills: how experienced are students when they enter medical school?

    PubMed

    Whittle, Sue R; Murdoch-Eaton, Deborah G

    2004-09-01

    Widening participation initiatives together with changes in school curricula in England may broaden the range of lifelong learning skills experience of new undergraduates. This project examines the experience levels of current students, as a comparative baseline. First-year medical students completed a questionnaire on arrival, investigating their practice of 31 skills during the previous two years. Responses show that most students have regularly practised transferable skills. However, significant numbers report little experience, particularly in IT skills such as email, using the Internet, spreadsheets and databases. Some remain unfamiliar with word processing. Library research, essay writing and oral presentation are also rarely practised by substantial numbers. One-third of students lack experience of evaluating their own strengths and weaknesses. Current students already show diversity of experience in skills on arrival at medical school. Changes in the near future may increase this range of experience further, and necessitate changes to undergraduate courses.

  7. Simulation Software's Effect on College Students Spreadsheet Project Scores

    ERIC Educational Resources Information Center

    Atkinson, J. Kirk; Thrasher, Evelyn H.; Coleman, Phillip D.

    2011-01-01

    The purpose of this study is to explore the potential impact of support materials on student spreadsheet skill acquisition. Specifically, this study examines the use of an online spreadsheet simulation tool versus a printed book across two independent student groups. This study hypothesizes that the online spreadsheet simulation tool will have a…

  8. MIRAGE: The data acquisition, analysis, and display system

    NASA Technical Reports Server (NTRS)

    Rosser, Robert S.; Rahman, Hasan H.

    1993-01-01

    Developed for the NASA Johnson Space Center and Life Sciences Directorate by GE Government Services, the Microcomputer Integrated Real-time Acquisition Ground Equipment (MIRAGE) system is a portable ground support system for Spacelab life sciences experiments. The MIRAGE system can acquire digital or analog data. Digital data may be NRZ-formatted telemetry packets or packets from a network interface. Analog signals are digitized and stored in experiment packet format. Data packets from any acquisition source are archived to disk as they are received. Meta-parameters are generated from the data packet parameters by applying mathematical and logical operators. Parameters are displayed in text and graphical form or output to analog devices. Experiment data packets may be retransmitted through the network interface. Data stream definitions, experiment parameter formats, parameter displays, and other variables are configured using a spreadsheet database. A database can be developed to support virtually any data packet format. The user interface provides menu- and icon-driven program control. The MIRAGE system can be integrated with other workstations to perform a variety of functions. Its generic capabilities, adaptability, and ease of use make MIRAGE a cost-effective solution to many experimental data processing requirements.
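
    A tiny sketch of the meta-parameter idea: derived values computed from packet parameters with mathematical and logical operators. The parameter names and expressions below are invented for illustration, not MIRAGE's actual configuration.

        import math

        # Derive "meta-parameters" from raw packet parameters.
        def meta_parameters(pkt):
            return {
                "heart_rate_bpm": 60.0 / pkt["rr_interval_s"],   # math operator
                "temp_alarm": pkt["temp_c"] > 38.0,              # logical operator
                "accel_mag": math.sqrt(sum(pkt[a] ** 2 for a in ("ax", "ay", "az"))),
            }

        pkt = {"rr_interval_s": 0.8, "temp_c": 37.2, "ax": 0.1, "ay": 0.0, "az": 9.8}
        print(meta_parameters(pkt))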

  9. Idea and implementation studies of populating TOPO250 component with the data from TOPO10 - generalization of geographic information in the BDG database. (Polish Title: Koncepcja i studium implementacji procesu zasilania komponentu TOPO250 danymi TOPO10 - generalizacja informacji geograficznej w bazie danych BDG )

    NASA Astrophysics Data System (ADS)

    Olszewski, R.; Pillich-Kolipińska, A.; Fiedukowicz, A.

    2013-12-01

    Implementation of the INSPIRE Directive in Poland requires not only legal transposition but also the development of a number of technological solutions. One such task, associated with the creation of the Spatial Information Infrastructure in Poland, is developing a complex model of a georeference database. Significant funding for the GBDOT project enables development of the national basic topographic database as a multiresolution database (MRDB). Effective implementation of this type of database requires developing procedures for the generalization of geographic information (generalization of the digital landscape model - DLM) which, treating the TOPO10 component as the sole source for the creation of the TOPO250 component, will keep these database elements conceptually and taxonomically consistent. Carrying out this task requires implementing the system concept prepared previously for the Head Office of Geodesy and Cartography. The system executes the generalization process using constraint-based modelling and preserves topological relationships between objects as well as between object classes. Full implementation of the designed generalization system requires comprehensive tests to calibrate it and to parameterize the generalization procedures according to the character of the generalized area. Parameterization of this process will determine the criteria for selecting specific objects, the simplification algorithms, and the order of operations. Testing with generalization parameters differentiated according to the character of the area has therefore become the priority issue. Parameters are delivered to the system as XML files which, with the help of a dedicated tool, are generated from spreadsheet (XLS) files filled in by the user. Using XLS files makes entering and modifying the parameters easier. Elements defined by the external parameter files include the criteria for object selection, the metric parameters of generalization algorithms (e.g., simplification or aggregation), and the sequence of operations. Testing on trial areas of diverse character will allow the rules of the generalization process, and its parameterization with the proposed tool, to be developed within the multiresolution reference database. The authors have attempted to develop a parameterization of the generalization process for a number of different trial areas. Generalization of the results will contribute to the development of a holistic system of generalized reference data stored in the national geodetic and cartographic resources.

  10. Cold Climate Foundation Retrofit Experimental Hygrothermal Performance. Cloquet Residential Research Facility Laboratory Results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldberg, Louise F.; Harmon, Anna C.

    2015-04-09

    This project was funded jointly by the National Renewable Energy Laboratory (NREL) and Oak Ridge National Laboratory (ORNL). ORNL focused on developing a full basement wall system experimental database to enable others to validate hygrothermal simulation codes. NREL focused on testing the moisture durability of practical basement wall interior insulation retrofit solutions for cold climates. The project has produced a physically credible and reliable long-term hygrothermal performance database for retrofit foundation wall insulation systems in zone 6 and 7 climates that are fully compliant with the performance criteria in the 2009 Minnesota Energy Code. These data currently span the period from November 10, 2012 through May 31, 2014 and are anticipated to be extended through November 2014. The experimental data were configured into a standard format that can be published online and that is compatible with standard commercially available spreadsheet and database software.

  11. From ClinicalTrials.gov trial registry to an analysis-ready database of clinical trial results.

    PubMed

    Cepeda, M Soledad; Lobanov, Victor; Berlin, Jesse A

    2013-04-01

    The ClinicalTrials.gov web site provides a convenient interface to look up study results, but it does not allow downloading data in a format that can be readily used for quantitative analyses. To develop a system that automatically downloads study results from ClinicalTrials.gov and provides an interface to retrieve study results in a spreadsheet format ready for analysis. Sherlock(®) identifies studies by intervention, population, or outcome of interest and in seconds creates an analytic database of study results ready for analyses. The outcome classification algorithms used in Sherlock were validated against a classification by an expert. Having a database ready for analysis that can be updated automatically dramatically extends the utility of the ClinicalTrials.gov trial registry. It increases the speed of comparative research, reduces the need for manual extraction of data, and permits answering a vast array of questions.
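
    Sherlock itself is proprietary, but the underlying idea - pull study records programmatically and flatten them into an analysis-ready table - can be approximated against the public ClinicalTrials.gov REST API. The v2 endpoint and field names below reflect the API as publicly documented today (well after this 2013 paper) and should be treated as assumptions.

        import csv
        import requests

        # Fetch studies matching a search term and flatten them to CSV.
        URL = "https://clinicaltrials.gov/api/v2/studies"   # assumed v2 endpoint
        resp = requests.get(URL, params={"query.term": "tramadol", "pageSize": 20},
                            timeout=30)
        resp.raise_for_status()

        with open("studies.csv", "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(["nct_id", "title", "status"])
            for study in resp.json().get("studies", []):
                proto = study["protocolSection"]            # assumed field names
                ident = proto["identificationModule"]
                writer.writerow([ident["nctId"], ident.get("briefTitle", ""),
                                 proto["statusModule"]["overallStatus"]])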

  12. Finding P-Values for F Tests of Hypothesis on a Spreadsheet.

    ERIC Educational Resources Information Center

    Rochowicz, John A., Jr.

    The calculation of the F statistic for a one-factor analysis of variance (ANOVA) and the construction of an ANOVA tables are easily implemented on a spreadsheet. This paper describes how to compute the p-value (observed significance level) for a particular F statistic on a spreadsheet. Decision making on a spreadsheet and applications to the…

  13. Cognitive Skills, Domain Knowledge, and Self-Efficacy: Effects on Spreadsheet Quality

    ERIC Educational Resources Information Center

    Adkins, Joni K.

    2011-01-01

    Numerous studies have shown that spreadsheets used in companies often have errors which may affect the quality of the decisions made with these tools. Many businesses are unaware or choose to ignore the risks associated with spreadsheet use. The intent of this study was to learn more about the characteristics of spreadsheet end user developers,…

  14. Spreadsheets and Bulgarian goats

    NASA Astrophysics Data System (ADS)

    Sugden, Steve

    2012-10-01

    We consider a problem appearing in an Australian Mathematics Challenge in 2003. This article considers whether a spreadsheet might be used to model this problem, thus allowing students to explore its structure within the spreadsheet environment. It then goes on to reflect on some general principles of problem decomposition when the final goal is a successful and lucid spreadsheet implementation.

  15. Teaching physics using Microsoft Excel

    NASA Astrophysics Data System (ADS)

    Uddin, Zaheer; Ahsanuddin, Muhammad; Khan, Danish Ahmed

    2017-09-01

    Excel is both ubiquitous and easily understandable. Most people from every walk of life know how to use MS Office and Excel spreadsheets. Students are also familiar with spreadsheets, and most know how to use them for data analysis. Beyond basic use of Excel, some important aspects of spreadsheets are highlighted in this article. MS Excel can be used to visualize the effects of various parameters in a physical system. It can also be used as a simulation tool; in this study, simulation of wind data has been carried out through spreadsheets. Examples of Lissajous figures and a damped harmonic oscillator are presented.

  16. Using Spreadsheets and Internally Consistent Databases to Explore Thermodynamics

    NASA Astrophysics Data System (ADS)

    Dasgupta, S.; Chakraborty, S.

    2003-12-01

    Much common wisdom has been handed down to generations of petrology students in words - a non-exhaustive list may include (a) do not mix data from two different thermodynamic databases, (b) use of different heat capacity functions or extrapolation beyond the P-T range of fit can have disastrous results, (c) consideration of errors in thermodynamic calculations is crucial, (d) consideration of non-ideality, interaction parameters etc. is important in some cases, but not in others. Actual calculations to demonstrate these effects were either too laborious, tedious, and time-consuming, or involved elaborate computer programming beyond the reaches of the average undergraduate. We have produced "Live" thermodynamic tables in the form of Excel™ spreadsheets based on standard internally consistent thermodynamic databases (e.g. Berman, Holland and Powell) that allow quick, easy and, most importantly, transparent manipulation of thermodynamic data to calculate mineral stabilities and to explore the role of different parameters. We have intentionally avoided the use of advanced tools such as macros, and have set up columns of data that are easy to relate to thermodynamic relationships to enhance transparency. The approach consists of the following basic steps: (i) use a simple supporting spreadsheet to enter mineral compositions (in formula units) to obtain a balanced reaction by matrix inversion; (ii) enter the stoichiometry of this reaction in a designated space, along with a P and T, to get the delta G of the reaction; (iii) vary P and/or T to locate equilibrium through a change of sign of delta G. These results can be collected to explore practically any problem of chemical equilibrium and mineral stability. Some of our favorites include (a) hierarchical addition of complexity to equilibrium calculations - start with a simple end-member reaction ignoring heat capacity and volume derivatives, add the effects of these, followed by addition of compositional effects in the form of ideal solutions, add non-ideality next and, finally, explore the role of varying parameters in simple models of non-ideality; (b) arbitrarily change (i.e. simulate error) or mix data from different sources to see the consequences directly. More traditional exercises such as exploration of slopes of reaction in P-T space are trivial, and other thermodynamic tidbits such as "bigger the mineral formula, greater its thermodynamic weight" become apparent to undergraduates early on through such direct handling of data. The overall outcome is a far more quantitative appreciation of mineral stabilities and thermodynamic variables without actually doing any Math!
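
    Step (iii) above amounts to a one-dimensional root search on the sign of delta G. A minimal sketch, ignoring heat-capacity and compressibility terms exactly as the simplest classroom exercise does; the thermodynamic constants are invented round numbers, not database values.

        # Locate equilibrium T at fixed P from the sign change of delta-G,
        # with delta-G(T, P) = dH - T*dS + dV*(P - P0) and constant dH, dS, dV.
        dH = 5000.0      # J/mol, standard enthalpy of reaction (invented)
        dS = 10.0        # J/(mol K), standard entropy of reaction (invented)
        dV = 0.5e-5      # m^3/mol, volume change of reaction (invented)

        def delta_g(T, P):                     # T in K, P in Pa
            return dH - T * dS + dV * (P - 1e5)

        def equilibrium_T(P, lo=300.0, hi=2000.0, tol=0.01):
            while hi - lo > tol:               # bisection on the sign of delta-G
                mid = 0.5 * (lo + hi)
                if delta_g(lo, P) * delta_g(mid, P) <= 0:
                    hi = mid
                else:
                    lo = mid
            return 0.5 * (lo + hi)

        print(equilibrium_T(P=1e9))            # equilibrium T at 1 GPa (~1000 K)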

  17. Implementation of an interactive database interface utilizing HTML, PHP, JavaScript, and MySQL in support of water quality assessments in the Northeastern North Carolina Pasquotank Watershed

    NASA Astrophysics Data System (ADS)

    Guion, A., Jr.; Hodgkins, H.

    2015-12-01

    The Center of Excellence in Remote Sensing Education and Research (CERSER) has implemented three research projects during the summer Research Experience for Undergraduates (REU) program gathering water quality data for local waterways. The data had been compiled manually with pen and paper and then entered into a spreadsheet. With the spread of electronic devices capable of interacting with databases, this project pursued the development of an electronic method of entering and manipulating the water quality data. The project focused on the development of an interactive database to gather, display, and analyze data collected from local waterways. The database and entry form were built in MySQL on a PHP server, allowing participants to enter data from anywhere Internet access is available. The project then researched applying these data to the Google Maps site to provide labeling and information to users. The NIA server at http://nia.ecsu.edu is used to host the application for download and for storage of the databases. Water Quality Database Team members included the authors plus Derek Morris Jr., Kathryne Burton and Mr. Jeff Wood as mentor.

  18. Data acquisition and real-time control using spreadsheets: interfacing Excel with external hardware.

    PubMed

    Aliane, Nourdine

    2010-07-01

    Spreadsheets have become a popular computational tool and a powerful platform for performing engineering calculations. Moreover, spreadsheets include a macro language, which permits the inclusion of standard computer code in worksheets and thereby enables developers to greatly extend spreadsheets' capabilities by designing specific add-ins. This paper describes how to use Excel spreadsheets in conjunction with the Visual Basic for Applications programming language to perform data acquisition and real-time control. The paper then presents two Excel applications with interactive user interfaces developed for laboratory demonstrations and experiments in an introductory course in control. 2010 ISA. Published by Elsevier Ltd. All rights reserved.

  19. A simple node and conductor data generator for SINDA

    NASA Technical Reports Server (NTRS)

    Gottula, Ronald R.

    1992-01-01

    This paper presents a simple, automated method to generate NODE and CONDUCTOR DATA for thermal math models. The method uses personal computer spreadsheets to create SINDA inputs. It was developed to make SINDA modeling less time-consuming and serves as an alternative to graphical methods. Anyone with some experience using a personal computer can easily implement this process. The user develops spreadsheets to automatically calculate capacitances and conductances based on material properties and dimensional data. The necessary node and conductor information is then taken from the spreadsheets and automatically arranged into the proper format, ready for insertion directly into the SINDA model. This technique provides a number of benefits to the SINDA user, such as a reduction in the number of hand calculations and the ability to very quickly generate a parametric set of NODE and CONDUCTOR DATA blocks. It also provides advantages over graphical thermal modeling systems by retaining the analyst's complete visibility into the thermal network, and by permitting user comments anywhere within the DATA blocks.
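
    The spreadsheet's calculations reduce to two standard formulas - nodal capacitance C = rho * cp * V and linear conductance G = k * A / L - followed by formatting. A minimal sketch with an invented material table; the card layout printed below is schematic, not an exact SINDA input format.

        # Compute capacitances and conductances, then format DATA-block lines.
        materials = {"al6061": {"rho": 2700.0, "cp": 896.0, "k": 167.0}}

        nodes = [  # (node id, material, volume m^3, initial T degC)
            (10, "al6061", 1.0e-5, 20.0),
            (20, "al6061", 2.0e-5, 20.0),
        ]
        conductors = [  # (id, node i, node j, material, area m^2, length m)
            (1, 10, 20, "al6061", 1.0e-4, 0.05),
        ]

        print("NODE DATA")
        for nid, mat, vol, t0 in nodes:
            c = materials[mat]["rho"] * materials[mat]["cp"] * vol   # C = rho*cp*V
            print(f"  {nid}, {t0:.1f}, {c:.4f}")

        print("CONDUCTOR DATA")
        for gid, ni, nj, mat, area, length in conductors:
            g = materials[mat]["k"] * area / length                  # G = k*A/L
            print(f"  {gid}, {ni}, {nj}, {g:.4f}")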

  20. AQUATOX Features and Tools

    EPA Pesticide Factsheets

    Numerous features have been included to facilitate the modeling process, from model setup and data input, presentation and analysis of results, to easy export of results to spreadsheet programs for additional analysis.

  1. 76 FR 70517 - Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-14

    ... requested. These systems generally also provide analytics, spreadsheets, and other tools designed to enable funds to analyze the data presented, as well as communication tools to process fund instructions...

  2. Geologic Map and Cross Sections of the McGinness Hills Geothermal Area - GIS Data

    DOE Data Explorer

    Faulds, James E.

    2013-12-31

    Geologic map data in shapefile format that includes faults, unit contacts, unit polygons, attitudes of strata and faults, and surficial geothermal features. 5 cross‐sections in Adobe Illustrator format. Comprehensive catalogue of drill‐hole data in spreadsheet, shapefile, and Geosoft database formats. Includes XYZ locations of well heads, year drilled, type of well, operator, total depths, well path data (deviations), lithology logs, and temperature data. 3D model constructed with EarthVision using geologic map data, cross‐sections, drill‐hole data, and geophysics.

  3. Spreadsheet-based engine data analysis tool - user's guide.

    DOT National Transportation Integrated Search

    2016-07-01

    This record refers to both the spreadsheet tool - Fleet Equipment Performance Measurement Preventive Maintenance Model: Spreadsheet-Based Engine Data Analysis Tool, http://ntl.bts.gov/lib/60000/60000/60007/0-6626-P1_Final.xlsm - and its accompanying ...

  4. Modeling Steady-State Groundwater Flow Using Microcomputer Spreadsheets.

    ERIC Educational Resources Information Center

    Ousey, John Russell, Jr.

    1986-01-01

    Describes how microcomputer spreadsheets are easily adapted for use in groundwater modeling. Presents spreadsheet set-ups and the results of five groundwater models. Suggests that this approach can provide a basis for demonstrations, laboratory exercises, and student projects. (ML)
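
    The spreadsheet approach works because iterative recalculation relaxes circular references: for steady-state flow in a homogeneous aquifer, each interior cell's head converges to the average of its four neighbours, which is exactly a finite-difference Laplace solver. A minimal sketch, with grid size and fixed-head boundary values chosen arbitrarily:

        # Finite-difference steady-state groundwater flow, spreadsheet-style.
        ROWS, COLS = 10, 10
        h = [[0.0] * COLS for _ in range(ROWS)]
        for r in range(ROWS):
            h[r][0], h[r][-1] = 100.0, 50.0        # fixed heads, left/right edges
        for c in range(COLS):                      # linear heads, top/bottom edges
            h[0][c] = h[-1][c] = 100.0 - 50.0 * c / (COLS - 1)

        for _ in range(500):                       # repeated sweeps, like a
            for r in range(1, ROWS - 1):           # spreadsheet's iterative mode
                for c in range(1, COLS - 1):
                    h[r][c] = 0.25 * (h[r-1][c] + h[r+1][c] + h[r][c-1] + h[r][c+1])

        print([round(v, 1) for v in h[5]])         # ~linear head decline across row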

  5. Numerical Stimulation of Multicomponent Chromatography Using Spreadsheets.

    ERIC Educational Resources Information Center

    Frey, Douglas D.

    1990-01-01

    Illustrated is the use of spreadsheet programs for implementing finite difference numerical simulations of chromatography as an instructional tool in a separations course. Discussed are differential equations, discretization and integration, spreadsheet development, computer requirements, and typical simulation results. (CW)

  6. How Spreadsheets Boost Productivity.

    ERIC Educational Resources Information Center

    Ross, James

    1988-01-01

    Explains the use of computerized bookkeeping systems called spreadsheets to perform mathematical and accounting functions such as totaling expenditures, averaging test grades, and transferring funds. Advises about adapting spreadsheet programs and discusses several essential features, including linkage, macro functions, and sharing capabilities.…

  7. EVALUATING THE ECONOMICS AND ENVIRONMENTAL FRIENDLINESS OF NEWLY DESIGNED OR RETROFITTED CHEMICAL PROCESSES

    EPA Science Inventory

    This work describes a method for using spreadsheet analyses of process designs and retrofits to provide simple and quick economic and environmental evaluations simultaneously. The method focuses attention onto those streams and components that have the largest monetary values and...

  8. Modeling the Milky Way: Spreadsheet Science.

    ERIC Educational Resources Information Center

    Whitmer, John C.

    1990-01-01

    Described is the generation of a scale model of the solar system and the Milky Way galaxy using a computer spreadsheet program. A sample spreadsheet including cell formulas is provided. Suggestions for using this activity as a teaching technique are included. (CW)

  9. Using Spreadsheets to Produce Acid-Base Titration Curves.

    ERIC Educational Resources Information Center

    Cawley, Martin James; Parkinson, John

    1995-01-01

    Describes two spreadsheets for producing acid-base titration curves, one uses relatively simple cell formulae that can be written into the spreadsheet by inexperienced students and the second uses more complex formulae that are best written by the teacher. (JRH)
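
    For the strong-acid/strong-base case, the "relatively simple cell formulae" reduce to bookkeeping of excess millimoles; the sketch below uses arbitrary concentrations and neglects water autoionization near the equivalence point.

        import math

        # pH along the titration of a strong acid with a strong base.
        C_ACID, V_ACID = 0.10, 25.0     # mol/L, mL of HCl in the flask
        C_BASE = 0.10                   # mol/L NaOH titrant

        def ph(v_base_ml):
            n_h = C_ACID * V_ACID - C_BASE * v_base_ml   # mmol excess acid
            v_tot = V_ACID + v_base_ml                   # total volume, mL
            if n_h > 0:
                return -math.log10(n_h / v_tot)          # excess H+
            if n_h < 0:
                return 14.0 + math.log10(-n_h / v_tot)   # excess OH-
            return 7.0                                   # equivalence point

        for v in (0.0, 12.5, 24.9, 25.0, 25.1, 30.0):
            print(f"{v:5.1f} mL -> pH {ph(v):.2f}")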

  10. Information Spreadsheet for Engines and Vehicles Compliance Information System (EV-CIS) User Registration

    EPA Pesticide Factsheets

    In this spreadsheet, user(s) provide their company’s manufacturer code, user contact information for EV-CIS, and user roles. This spreadsheet is used for the Company Authorizing Official (CAO), CROMERR Signer, and EV-CIS Submitters.

  11. Compatibility Assessment Tool

    NASA Technical Reports Server (NTRS)

    Egbert, James Allen

    2016-01-01

    In support of ground system development for the Space Launch System (SLS), engineers are tasked with building immense engineering models of extreme complexity. The various systems require rigorous analysis of pneumatic, hydraulic, cryogenic, and hypergolic systems. Each of these systems must meet certain standards, documented in the form of pressure vessel system (PVS) certification reports. These reports can be hundreds of pages long and require many hours to compile. Traditionally, each component is analyzed individually, often using hand calculations in the design process. The objective of this opportunity is to perform these analyses in an integrated fashion within the parametric CAD/CAE environment. This allows systems to be analyzed at the assembly level in a semi-automated fashion, which greatly improves accuracy and efficiency. To accomplish this, component-specific parameters based on spec control drawings were stored in the Windchill database and attached to individual Creo Parametric models. These parameters were then accessed using the MathCAD Prime analysis integration within Creo Parametric. MathCAD Prime spreadsheets were created that automatically extracted these parameters, performed calculations, and generated reports. The reports described component compatibility based on local conditions such as pressure, temperature, density, and flow rates. The reports also determined component pairing compatibility, such as properly sizing relief valves with regulators. To increase the traceability of component selection, the reports stored the input conditions that were used to determine compatibility. The desired workflow for using this tool would begin with a Creo Schematics diagram of a PVS. This schematic would store local conditions and the locations of components. The schematic would then populate an assembly within Creo Parametric, using Windchill database parts. These parts would have their attributes already assigned, and the MathCAD spreadsheets could run through database parts to determine which components would be suited to specific locations within the assembly. This removes a significant amount of time from the design process and makes initial analysis assessments more accurate. Each component checked against a location within the assembly would generate a report showing whether the component was compatible. These reports could be used to generate the PVS report without performing the same analysis multiple times. This process also has the potential to be expanded upon to further automate PVS reports. The integration of software code or macros could automatically check through hundreds of parts for each location on the schematic. If the software could recognize which type of component is necessary for each location, simply starting the macro could completely choose all the components needed for the schematic, and in turn the system. This would save many hours of work initially selecting components, which could in turn save money. Overall, this process helps to automate initial component selection for PVS systems to fit local design specifications. These selections automatically generate reports showing how the design criteria are met by the specific component that was chosen. These reports will contribute to easier compilation of the PVS certification reports, which currently take a great amount of time and effort to produce.
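
    The pairing logic can be illustrated with a deliberately simplified check (not NASA's actual criteria): a relief valve protecting a regulator's outlet should be set above the regulator's outlet pressure, at or below the downstream maximum allowable working pressure (MAWP), and able to pass the regulator's full flow. All attribute names and numbers below are invented.

        # Simplified regulator / relief-valve pairing compatibility check.
        def compatible(regulator, relief_valve, mawp_psi):
            return (relief_valve["set_pressure_psi"] <= mawp_psi and
                    relief_valve["set_pressure_psi"] >= regulator["outlet_pressure_psi"] and
                    relief_valve["capacity_scfm"] >= regulator["max_flow_scfm"])

        reg = {"outlet_pressure_psi": 450.0, "max_flow_scfm": 300.0}
        rv  = {"set_pressure_psi": 500.0, "capacity_scfm": 350.0}
        print(compatible(reg, rv, mawp_psi=520.0))   # -> True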

  12. Knowledge, skills and attitudes of hospital pharmacists in the use of information technology and electronic tools to support clinical practice: A Brazilian survey

    PubMed Central

    Vasconcelos, Hemerson Bruno da Silva; Woods, David John

    2017-01-01

    This study aimed to identify the knowledge, skills and attitudes of Brazilian hospital pharmacists in the use of information technology and electronic tools to support clinical practice. Methods: A questionnaire was sent by email to clinical pharmacists working in public and private hospitals in Brazil. The instrument was validated using the method of Polit and Beck to determine the content validity index. Data (n = 348) were analyzed using descriptive statistics, Pearson's Chi-square test and Gamma correlation tests. Results: Pharmacists had 1–4 electronic devices for personal use, mainly smartphones (84.8%; n = 295) and laptops (81.6%; n = 284). At work, pharmacists had access to a computer (89.4%; n = 311), mostly connected to the internet (83.9%; n = 292). They felt competent (very capable/capable) searching for a web page/web site on a specific subject (100%; n = 348), downloading files (99.7%; n = 347), using spreadsheets (90.2%; n = 314), searching using MeSH terms in PubMed (97.4%; n = 339) and general searching for articles in bibliographic databases (such as Medline/PubMed: 93.4%; n = 325). Pharmacists did not feel competent in using statistical analysis software (somewhat capable/incapable: 78.4%; n = 273). Most pharmacists reported that they had not received formal education to perform most of these actions, except searching using MeSH terms. Access to bibliographic databases was available in Brazilian hospitals; however, most pharmacists (78.7%; n = 274) reported daily use of a non-specific search engine such as Google. This result may reflect the lack of formal knowledge and training in the use of bibliographic databases and difficulty with the English language. The need to expand knowledge about information search tools was recognized by most pharmacists in clinical practice in Brazil, especially those with less time dedicated exclusively to clinical activity (Chi-square, p = 0.006). Conclusion: These results will assist in defining minimal competencies for the training of pharmacists in the field of information technology to support clinical practice. Knowledge and skill gaps are evident in the use of bibliographic databases, spreadsheets and statistical tools. PMID:29272292

  13. Knowledge, skills and attitudes of hospital pharmacists in the use of information technology and electronic tools to support clinical practice: A Brazilian survey.

    PubMed

    Néri, Eugenie Desirèe Rabelo; Meira, Assuero Silva; Vasconcelos, Hemerson Bruno da Silva; Woods, David John; Fonteles, Marta Maria de França

    2017-01-01

    This study aimed to identify the knowledge, skills and attitudes of Brazilian hospital pharmacists in the use of information technology and electronic tools to support clinical practice. A questionnaire was sent by email to clinical pharmacists working in public and private hospitals in Brazil. The instrument was validated using the method of Polit and Beck to determine the content validity index. Data (n = 348) were analyzed using descriptive statistics, Pearson's Chi-square test and Gamma correlation tests. Pharmacists had 1-4 electronic devices for personal use, mainly smartphones (84.8%; n = 295) and laptops (81.6%; n = 284). At work, pharmacists had access to a computer (89.4%; n = 311), mostly connected to the internet (83.9%; n = 292). They felt competent (very capable/capable) searching for a web page/web site on a specific subject (100%; n = 348), downloading files (99.7%; n = 347), using spreadsheets (90.2%; n = 314), searching using MeSH terms in PubMed (97.4%; n = 339) and general searching for articles in bibliographic databases (such as Medline/PubMed: 93.4%; n = 325). Pharmacists did not feel competent in using statistical analysis software (somewhat capable/incapable: 78.4%; n = 273). Most pharmacists reported that they had not received formal education to perform most of these actions, except searching using MeSH terms. Access to bibliographic databases was available in Brazilian hospitals; however, most pharmacists (78.7%; n = 274) reported daily use of a non-specific search engine such as Google. This result may reflect the lack of formal knowledge and training in the use of bibliographic databases and difficulty with the English language. The need to expand knowledge about information search tools was recognized by most pharmacists in clinical practice in Brazil, especially those with less time dedicated exclusively to clinical activity (Chi-square, p = 0.006). These results will assist in defining minimal competencies for the training of pharmacists in the field of information technology to support clinical practice. Knowledge and skill gaps are evident in the use of bibliographic databases, spreadsheets and statistical tools.

  14. Preparation of School District Budgets with Microcomputer Electronic Spreadsheets.

    ERIC Educational Resources Information Center

    Hinitz, Herman J.

    1996-01-01

    Preparing a microcomputer electronic spreadsheet containing all relevant school district budgetary information is possible with currently available hardware and software (such as Lotus 1-2-3), despite random-access-memory limitations. Spreadsheets can provide financial summaries, inventory-control listings, scheduling alternatives,…

  15. A Spreadsheet in the Mathematics Classroom.

    ERIC Educational Resources Information Center

    Watkins, Will; Taylor, Monty

    1989-01-01

    Demonstrates how spreadsheets can be used to implement linear system solving algorithms in college mathematics classes. Lotus 1-2-3 is described, a linear system of equations is illustrated using spreadsheets, and the interplay between applications, computations, and theory is discussed. (four references) (LRW)

  16. The Iodine-Clock Reaction--A Spreadsheet Simulation To Test.

    ERIC Educational Resources Information Center

    Swain, P. A.

    1997-01-01

    Describes a spreadsheet activity for the iodine-clock reaction which follows the concentrations of all reactants and products for 200 seconds and gives the induction period. Explains that, although there are limitations to the spreadsheet, it is nevertheless illuminating. (Author/ASK)

  17. Station Program Note Pull Automation

    NASA Technical Reports Server (NTRS)

    Delgado, Ivan

    2016-01-01

    Upon commencement of my internship, I was in charge of maintaining the CoFR (Certificate of Flight Readiness) Tool. The tool acquires data from existing Excel workbooks on NASA's and Boeing's databases to create a new spreadsheet listing all the potential safety concerns for upcoming flights and software transitions. Since the application was written in Visual Basic, I had to learn a new programming language and prepare to handle any malfunctions within the program. Shortly afterwards, I was given the assignment to automate the Station Program Note (SPN) Pull process. I developed an application, in Python, that generates a GUI (Graphical User Interface) to be used by the International Space Station Safety & Mission Assurance team here at Johnson Space Center. The application allows its users to download online files with the click of a button, import SPNs based on three different pulls, instantly manipulate and filter spreadsheets, and compare the three sources to determine which active SPNs (Station Program Notes) must be reviewed for any upcoming flights, missions, and/or software transitions. Initially, to perform the NASA SPN pull (one of three), I had designed the program to let the user log in to a secure webpage that stores data, input specific parameters, and retrieve the desired SPNs based on those inputs. However, to avoid any conflicts with sustainment, I altered it so that the user logs in and downloads the NASA file independently. After the user has downloaded the file with the click of a button, the program checks for any outdated or pre-existing files and for successful downloads, acquires the spreadsheet, and converts it from a text file to a comma-separated file and finally into an Excel spreadsheet to be filtered and later scrutinized for specific SPN numbers. Once this file has been automatically manipulated to provide only the desired SPN numbers, they are stored in a global variable, shown on the GUI, and transferred to a new Excel worksheet for comparison. I managed to get my application to acquire the CSWG (Computer Safety Working Group) and SPNWG (Space Station Working Group) SPNs with just two mouse clicks for each pull, as opposed to several from the original process. When all three pulls are performed, an Excel sheet containing all three sets of results is generated for the user to compare and determine which SPNs will be presented or reviewed the following month. The experience from this internship has been spectacular. As a high school senior who will begin attending college in the fall, this internship has been both educationally and occupationally beneficial. It has allowed me the opportunity to learn new programming languages, effectively network with NASA personnel from a variety of departments at JSC, and develop new professional skills and etiquette. My internship at NASA's Johnson Space Center has further motivated me to pursue a Master's degree in Software Engineering and strive for a prosperous career with NASA as a civil servant.
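
    For illustration, a rough Python sketch of the text-to-CSV-to-filtered-spreadsheet step described above, assuming hypothetical file names, a tab delimiter, and an "SPN" column; the actual tool's code is not shown in the abstract:

        # Illustrative sketch of the text -> CSV -> filtered-spreadsheet step;
        # file names, delimiter, and the SPN column name are assumptions.
        import csv
        import pandas as pd

        def text_to_csv(txt_path, csv_path, delimiter="\t"):
            """Convert a delimited text export into a CSV file."""
            with open(txt_path, newline="") as src, \
                 open(csv_path, "w", newline="") as dst:
                writer = csv.writer(dst)
                for row in csv.reader(src, delimiter=delimiter):
                    writer.writerow(row)

        def filter_active_spns(csv_path, xlsx_path, spn_column="SPN"):
            """Keep only rows with an SPN number; save as an Excel sheet."""
            df = pd.read_csv(csv_path)
            active = df[df[spn_column].notna()]
            active.to_excel(xlsx_path, index=False)  # requires openpyxl
            return active[spn_column].tolist()

        # text_to_csv("nasa_pull.txt", "nasa_pull.csv")
        # spns = filter_active_spns("nasa_pull.csv", "nasa_pull_filtered.xlsx")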

  18. Seabed photographs, sediment texture analyses, and sun-illuminated sea floor topography in the Stellwagen Bank National Marine Sanctuary region off Boston, Massachusetts

    USGS Publications Warehouse

    Valentine, Page C.; Gallea, Leslie B.; Blackwood, Dann S.; Twomey, Erin R.

    2010-01-01

    The U.S. Geological Survey, in collaboration with National Oceanic and Atmospheric Administration's National Marine Sanctuary Program, conducted seabed mapping and related research in the Stellwagen Bank National Marine Sanctuary region from 1993 to 2004. The mapped area is approximately 3,700 km² (1,100 nmi²) in size and was subdivided into 18 quadrangles. An extensive series of sea-floor maps of the region based on multibeam sonar surveys has been published as paper maps and online in digital format (PDF, EPS, PS). In addition, 2,628 seabed-sediment samples were collected and analyzed and are in the usSEABED: Atlantic Coast Offshore Surficial Sediment Data Release. This report presents for viewing and downloading the more than 10,600 still seabed photographs that were acquired during the project. The digital images are provided in thumbnail, medium (1536 x 1024 pixels), and high (3071 x 2048 pixels) resolution. The images can be viewed by quadrangle on the U.S. Geological Survey Woods Hole Coastal and Marine Science Center's photograph database. Photograph metadata are embedded in each image in Exchangeable Image File Format and also provided in spreadsheet format. Published digital topographic maps and descriptive text for seabed features are included here for downloading and serve as context for the photographs. An interactive topographic map for each quadrangle shows locations of photograph stations, and each location is linked to the photograph database. This map also shows stations where seabed sediment was collected for texture analysis; the results of grain-size analysis and associated metadata are presented in spreadsheet format.

  19. XLWrap - Querying and Integrating Arbitrary Spreadsheets with SPARQL

    NASA Astrophysics Data System (ADS)

    Langegger, Andreas; Wöß, Wolfram

    In this paper a novel approach is presented for generating RDF graphs of arbitrary complexity from various spreadsheet layouts. Currently, none of the available spreadsheet-to-RDF wrappers supports cross tables and tables where data is not aligned in rows. Similar to RDF123, XLWrap is based on template graphs where fragments of triples can be mapped to specific cells of a spreadsheet. Additionally, it features a full expression algebra based on the syntax of OpenOffice Calc and various shift operations, which can be used to repeat similar mappings in order to wrap cross tables including multiple sheets and spreadsheet files. The set of available expression functions includes most of the native functions of OpenOffice Calc and can be easily extended by users of XLWrap.

  20. Integrating Critical Spreadsheet Competencies into the Accounting Curriculum

    ERIC Educational Resources Information Center

    Walters, L. Melissa; Pergola, Teresa M.

    2012-01-01

    The American Institute of Certified Public Accountants (AICPA) and the International Accounting Education Standards Board (IAESB) identify spreadsheet technology as a key information technology (IT) competency for accounting professionals. However requisite spreadsheet competencies are not specifically defined by the AICPA or IAESB nor are they…

  1. Exploring Difference Equations with Spreadsheets.

    ERIC Educational Resources Information Center

    Walsh, Thomas P.

    1996-01-01

    When using spreadsheets to explore real-world problems involving periodic change, students can observe what happens at each period, generate a graph, and learn how changing the starting quantity or constants affects results. Spreadsheet lessons for high school students are presented that explore mathematical modeling, linear programming, and…

  2. Visual Basic programs for spreadsheet analysis.

    PubMed

    Hunt, Bruce

    2005-01-01

    A collection of Visual Basic programs, entitled Function.xls, has been written for ground water spreadsheet calculations. This collection includes programs for calculating mathematical functions and for evaluating analytical solutions in ground water hydraulics and contaminant transport. Several spreadsheet examples are given to illustrate their use.

  3. Decision Analysis Using Spreadsheets.

    ERIC Educational Resources Information Center

    Sounderpandian, Jayavel

    1989-01-01

    Discussion of decision analysis and its importance in a business curriculum focuses on the use of spreadsheets instead of commercial software packages for computer assisted instruction. A hypothetical example is given of a company drilling for oil, and suggestions are provided for classroom exercises using spreadsheets. (seven references) (LRW)

  4. The Computer Bulletin Board.

    ERIC Educational Resources Information Center

    Batt, Russell H., Ed.

    1988-01-01

    Notes two uses of computer spreadsheets in the chemistry classroom. Discusses the general use of the spreadsheet to easily provide changing parameters of equations and then replotting the results on the screen. Presents a molecular orbital spreadsheet calculation of the LCAO-MO approach. Supplies representative printouts and graphs. (MVL)

  5. GENPLOT: A formula-based Pascal program for data manipulation and plotting

    NASA Astrophysics Data System (ADS)

    Kramer, Matthew J.

    Geochemical processes involving alteration, differentiation, fractionation, or migration of elements may be elucidated by a number of discrimination or variation diagrams (e.g., AFM, Harker, Pearce, and many others). The construction of these diagrams involves arithmetic combination of selective elements (involving major, minor, or trace elements). GENPLOT utilizes a formula-based algorithm (an expression parser) which enables the program to manipulate multiparameter databases and plot XY, ternary, tetrahedron, and REE-type plots without needing to change the source code or rearrange databases. Formulae may be any quadratic expression whose variables are the column headings of the data matrix. A full-screen editor with limited equation and arithmetic functions (a spreadsheet) has been incorporated into the program to aid data entry and editing. Data are stored as ASCII files to facilitate interchange of data between other programs and computers. GENPLOT was developed in Turbo Pascal for the IBM and compatible computers but is also available in Apple Pascal for the Apple IIe and III. Because the source code is too extensive to list here (about 5200 lines of Pascal code), the expression parsing routine, which is central to GENPLOT's flexibility, is incorporated into a smaller demonstration program named SOLVE. The following paper includes a discussion of how the expression parser works and a detailed description of GENPLOT's capabilities.
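
    A small Python sketch of the core idea, illustrative only: formulas whose variables are column headings, evaluated against a column-keyed data table. This mimics the concept of GENPLOT's Pascal expression parser, not its implementation:

        # Sketch of formula evaluation in which variables are the column
        # headings of a data matrix (the GENPLOT concept, not its code).
        import ast
        import operator as op

        OPS = {ast.Add: op.add, ast.Sub: op.sub, ast.Mult: op.mul,
               ast.Div: op.truediv, ast.Pow: op.pow, ast.USub: op.neg}

        def eval_formula(expr, columns, row):
            """Evaluate e.g. 'Na2O + K2O' for one row of a data table."""
            def ev(node):
                if isinstance(node, ast.Expression):
                    return ev(node.body)
                if isinstance(node, ast.BinOp):
                    return OPS[type(node.op)](ev(node.left), ev(node.right))
                if isinstance(node, ast.UnaryOp):
                    return OPS[type(node.op)](ev(node.operand))
                if isinstance(node, ast.Name):
                    return columns[node.id][row]   # column heading as variable
                if isinstance(node, ast.Constant):
                    return node.value
                raise ValueError("unsupported expression element")
            return ev(ast.parse(expr, mode="eval"))

        data = {"Na2O": [3.1, 2.8], "K2O": [1.2, 1.5], "SiO2": [52.0, 49.3]}
        print(eval_formula("Na2O + K2O", data, 0))   # alkali sum, e.g. for AFM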

  6. Low-altitude photographic transects of the Arctic Network of National Park Units and Selawik National Wildlife Refuge, Alaska, July 2013

    USGS Publications Warehouse

    Marcot, Bruce G.; Jorgenson, M. Torre; DeGange, Anthony R.

    2014-01-01

    5. A Canon® Rebel 3Ti with a Sigma zoom lens (18–200 mm focal length). The Drift® HD-170 and GoPro® Hero3 cameras were secured to the struts and underwing for nadir (direct downward) imaging. The Panasonic® and Canon® cameras were each hand-held for oblique-angle landscape images, shooting through the airplanes' windows, targeting both general landscape conditions and landscape features of special interest, such as tundra fire scars and landslips. The Drift® and GoPro® cameras were each set for time-lapse photography at 5-second intervals for overlapping coverage. Photographs from all cameras (100 percent .jpg format) were date- and time-synchronized to global positioning system waypoints taken during the flights, also at 5-second intervals, providing precise geotagging (latitude-longitude) of all files. All photographs were adjusted for color saturation and gamma, and nadir photographs were corrected for the Drift® and GoPro® cameras' 170° wide-angle lens distortion. EXIF (exchangeable image file format) data on camera settings and geotagging were extracted into spreadsheet databases. An additional 1 hour, 20 minutes, and 43 seconds of high-resolution video was recorded at 60 frames per second with the GoPro® camera along selected transect segments, and was also image-adjusted and corrected for lens distortion. Geotagged locations of 12,395 nadir photographs from the Drift® and GoPro® cameras were overlaid in a geographic information system (ArcMap 10.0) onto a map of 44 ecotypes (land- and water-cover types) of the Arctic Network study area. The presence and area of each ecotype occurring within a geographic information system window centered on the location of each photograph were recorded and included in the spreadsheet databases. All original and adjusted photographs, videos, global positioning system flight tracks, and photograph databases are available by contacting ascweb@usgs.gov.
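
    As a hedged illustration of the EXIF-to-spreadsheet step, a short Python sketch using Pillow; the chosen fields and file names are assumptions, not the project's actual processing code:

        # Sketch of extracting EXIF camera metadata into spreadsheet rows;
        # the fields kept here (DateTime, Model) are illustrative.
        import csv
        from PIL import Image
        from PIL.ExifTags import TAGS

        def exif_row(path):
            exif = Image.open(path).getexif()
            named = {TAGS.get(tag_id, tag_id): value
                     for tag_id, value in exif.items()}
            return {"file": path,
                    "datetime": named.get("DateTime"),
                    "model": named.get("Model")}

        def photos_to_csv(paths, csv_path):
            with open(csv_path, "w", newline="") as f:
                writer = csv.DictWriter(f,
                                        fieldnames=["file", "datetime", "model"])
                writer.writeheader()
                for p in paths:
                    writer.writerow(exif_row(p))

        # photos_to_csv(["IMG_0001.jpg"], "photo_metadata.csv")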

  7. Sharing mutants and experimental information prepublication using FgMutantDb (https://scabusa.org/FgMutantDb).

    PubMed

    Baldwin, Thomas T; Basenko, Evelina; Harb, Omar; Brown, Neil A; Urban, Martin; Hammond-Kosack, Kim E; Bregitzer, Phil P

    2018-06-01

    There is no comprehensive storage for generated mutants of Fusarium graminearum or data associated with these mutants. Instead, researchers have relied on several independent and non-integrated databases. FgMutantDb was designed as a simple spreadsheet that is accessible globally on the web and functions as a centralized source of information on F. graminearum mutants. FgMutantDb aids in the maintenance and sharing of mutants within a research community. It will also serve as a platform for disseminating prepublication results as well as negative results that often go unreported. Additionally, the highly curated information on mutants in FgMutantDb will be shared with other databases (FungiDB, Ensembl, PhytoPath, and PHI-base) through updating reports. Here we describe the creation and potential usefulness of FgMutantDb to the F. graminearum research community, and provide a tutorial on its use. This type of database could be easily emulated for other fungal species. Published by Elsevier Inc.

  8. A Database of Woody Vegetation Responses to Elevated Atmospheric CO2 (NDP-072)

    DOE Data Explorer

    Curtis, Peter S [The Ohio State Univ., Columbus, OH (United States); Cushman, Robert M [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Brenkert, Antoinette L [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    1999-01-01

    To perform a statistically rigorous meta-analysis of research results on the response by woody vegetation to increased atmospheric CO2 levels, a multiparameter database of responses was compiled. Eighty-four independent CO2-enrichment studies, covering 65 species and 35 response parameters, met the necessary criteria for inclusion in the database: reporting mean response, sample size, and variance of the response (either as standard deviation or standard error). Data were retrieved from the published literature and unpublished reports. This numeric data package contains a 29-field data set of CO2-exposure experiment responses by woody plants (as both a flat ASCII file and a spreadsheet file), files listing the references to the CO2-exposure experiments and specific comments relevant to the data in the data set, and this documentation file (which includes SAS and Fortran codes to read the ASCII data file; SAS is a registered trademark of the SAS Institute, Inc., Cary, North Carolina 27511).

  9. PubSearch and PubFetch: a simple management system for semiautomated retrieval and annotation of biological information from the literature.

    PubMed

    Yoo, Danny; Xu, Iris; Berardini, Tanya Z; Rhee, Seung Yon; Narayanasamy, Vijay; Twigger, Simon

    2006-03-01

    For most systems in biology, a large body of literature exists that describes the complexity of the system based on experimental results. Manual review of this literature to extract targeted information into biological databases is difficult and time consuming. To address this problem, we developed PubSearch and PubFetch, which store literature, keyword, and gene information in a relational database, index the literature with keywords and gene names, and provide a Web user interface for annotating the genes from experimental data found in the associated literature. A set of protocols is provided in this unit for installing, populating, running, and using PubSearch and PubFetch. In addition, we provide support protocols for performing controlled vocabulary annotations. Intended users of PubSearch and PubFetch are database curators and biology researchers interested in tracking the literature and capturing information about genes of interest in a more effective way than with conventional spreadsheets and lab notebooks.

  10. Building a Database for a Quantitative Model

    NASA Technical Reports Server (NTRS)

    Kahn, C. Joseph; Kleinhammer, Roger

    2014-01-01

    A database can greatly benefit a quantitative analysis. The defining characteristic of a quantitative risk, or reliability, model is the use of failure estimate data. Models can easily contain a thousand Basic Events, relying on hundreds of individual data sources. Obviously, entering so much data by hand will eventually lead to errors. Less obviously, entering data this way does not link the Basic Events to their data sources. The best way to organize large amounts of data on a computer is with a database. But a model does not require a large, enterprise-level database with dedicated developers and administrators; a database built in Excel can be quite sufficient. A simple spreadsheet database can link every Basic Event to the individual data source selected for it. This database can also contain the manipulations appropriate for how the data are used in the model. These manipulations include stressing factors based on use and maintenance cycles, dormancy, unique failure modes, the modeling of multiple items as a single "Super component" Basic Event, and Bayesian updating based on flight and testing experience. A simple, unique metadata field in both the model and the database provides a link from any Basic Event in the model to its data source and all relevant calculations. The credibility of the entire model often rests on the credibility and traceability of the data.
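
    A minimal sketch, assuming hypothetical field names, of how a shared metadata key can link a Basic Event to its data source and carry a stressing manipulation:

        # Minimal sketch of linking Basic Events to failure-data sources via a
        # shared metadata key; field names and the factor are illustrative,
        # not the actual database layout described above.

        data_sources = {
            "DS-042": {"failure_rate_per_hr": 1.0e-6,
                       "source": "vendor test report"},
        }

        basic_events = [
            {"id": "BE-PUMP-FAIL", "data_key": "DS-042", "stress_factor": 2.0},
        ]

        for be in basic_events:
            src = data_sources[be["data_key"]]   # traceable link to the source
            rate = src["failure_rate_per_hr"] * be["stress_factor"]
            print(be["id"], "->", be["data_key"], "adjusted rate:", rate)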

  11. User's Manual for the National Water-Quality Assessment Program Invertebrate Data Analysis System (IDAS) Software: Version 3

    USGS Publications Warehouse

    Cuffney, Thomas F.

    2003-01-01

    The Invertebrate Data Analysis System (IDAS) software provides an accurate, consistent, and efficient mechanism for analyzing invertebrate data collected as part of the National Water-Quality Assessment Program and stored in the Biological Transactional Database (Bio-TDB). The IDAS software is a stand-alone program for personal computers that run Microsoft (MS) Windows®. It allows users to read data downloaded from Bio-TDB and stored either as MS Excel® or MS Access® files. The program consists of five modules. The Edit Data module allows the user to subset, combine, delete, and summarize community data. The Data Preparation module allows the user to select the type(s) of sample(s) to process, calculate densities, delete taxa based on laboratory processing notes, combine life stages or keep them separate, select a lowest taxonomic level for analysis, delete rare taxa, and resolve taxonomic ambiguities. The Calculate Community Metrics module allows the user to calculate over 130 community metrics, including metrics based on organism tolerances and functional feeding groups. The Calculate Diversities and Similarities module allows the user to calculate nine diversity and eight similarity indices. The Data Export module allows the user to export data to other software packages and produce tables of community data that can be imported into spreadsheet and word-processing programs. Though the IDAS program was developed to process invertebrate data downloaded from USGS databases, it will work with other data sets that are converted to the USGS (Bio-TDB) format. Consequently, the data manipulation, analysis, and export procedures provided by the IDAS program can be used by anyone involved in using benthic macroinvertebrates in applied or basic research.

  12. How to Create Automatically Graded Spreadsheets for Statistics Courses

    ERIC Educational Resources Information Center

    LoSchiavo, Frank M.

    2016-01-01

    Instructors often use spreadsheet software (e.g., Microsoft Excel) in their statistics courses so that students can gain experience conducting computerized analyses. Unfortunately, students tend to make several predictable errors when programming spreadsheets. Without immediate feedback, programming errors are likely to go undetected, and as a…

  13. Teaching with Spreadsheets: An Example from Heat Transfer.

    ERIC Educational Resources Information Center

    Drago, Peter

    1993-01-01

    Provides an activity which measures the heat transfer through an insulated cylindrical tank, allowing the student to gain a better knowledge of both the physics involved and the working of spreadsheets. Provides both a spreadsheet solution and a maximum-minimum method of solution for the problem. (MVL)

  14. Spreadsheet-Based Program for Simulating Atomic Emission Spectra

    ERIC Educational Resources Information Center

    Flannigan, David J.

    2014-01-01

    A simple Excel spreadsheet-based program for simulating atomic emission spectra from the properties of neutral atoms (e.g., energies and statistical weights of the electronic states, electronic partition functions, transition probabilities, etc.) is described. The contents of the spreadsheet (i.e., input parameters, formulas for calculating…

  15. Teaching Quality Control with Chocolate Chip Cookies

    ERIC Educational Resources Information Center

    Baker, Ardith

    2014-01-01

    Chocolate chip cookies are used to illustrate the importance and effectiveness of control charts in Statistical Process Control. By counting the number of chocolate chips, creating the spreadsheet, calculating the control limits and graphing the control charts, the student becomes actively engaged in the learning process. In addition, examining…

  16. Teaching Raster GIS Operations with Spreadsheets.

    ERIC Educational Resources Information Center

    Raubal, Martin; Gaupmann, Bernhard; Kuhn, Werner

    1997-01-01

    Defines raster technology in its relationship to geographic information systems and notes that it is typically used with the application of remote sensing techniques and scanning devices. Discusses the role of spreadsheets in a raster model, and describes a general approach based on spreadsheets. Includes six computer-generated illustrations. (MJP)

  17. Spreadsheet Design: An Optimal Checklist for Accountants

    ERIC Educational Resources Information Center

    Barnes, Jeffrey N.; Tufte, David; Christensen, David

    2009-01-01

    Just as good grammar, punctuation, style, and content organization are important to well-written documents, basic fundamentals of spreadsheet design are essential to clear communication. In fact, the very principles of good writing should be integrated into spreadsheet workpaper design and organization. The unique contributions of this paper are…

  18. Computer Applications: Using Electronic Spreadsheets.

    ERIC Educational Resources Information Center

    Riley, Connee; And Others

    This instructional unit is intended to assist teachers in helping students learn to use electronic spreadsheets. The 11 learning activities included, all of which are designed for use in conjunction with Multiplan Spreadsheet Software, are arranged in order of increasing difficulty. An effort has been made to include problems applicable to each of…

  19. Manipulative and Numerical Spreadsheet Templates for the Study of Discrete Structures.

    ERIC Educational Resources Information Center

    Abramovich, Sergei

    1998-01-01

    Argues that basic components of discrete mathematics can be introduced to students through gradual elaboration of experiences with iconic spreadsheet-based simulations of concrete materials. Suggests that the study of homogeneous and heterogeneous patterns of manipulative spreadsheet templates allows for appreciation of the development of…

  20. Excel Spreadsheets for Algebra: Improving Mental Modeling for Problem Solving

    ERIC Educational Resources Information Center

    Engerman, Jason; Rusek, Matthew; Clariana, Roy

    2014-01-01

    This experiment investigates the effectiveness of Excel spreadsheets in a high school algebra class. Students in the experiment group convincingly outperformed the control group on a post-lesson assessment. The student responses and teacher observations involving the Excel spreadsheet revealed that it operated as a mindtool, which formed the users'…

  1. Why Excel?

    ERIC Educational Resources Information Center

    Barreto, Humberto

    2015-01-01

    This article is not the usual Excel pedagogy fare in that it does not provide an application or example taught via a spreadsheet. Instead, it briefly reviews the history of spreadsheets in the economics classroom and explores the current environment, with an emphasis on modern learning theory. The conclusion is not surprising: spreadsheets improve…

  2. Levels of Student Responses in a Spreadsheet-Based Environment

    ERIC Educational Resources Information Center

    Tabach, Michal; Friedlander, Alex

    2004-01-01

    The purpose of this report is to investigate the range of student responses in three domains--hypothesizing, organizing data, and algebraic generalization of patterns--during their work on a spreadsheet-based activity. In a wider context, we attempted to investigate students' utilization schemes of spreadsheets in their learning of introductory…

  3. User's guide: RPGrow$: a red pine growth and analysis spreadsheet for the Lake States.

    Treesearch

    Carol A. Hyldahl; Gerald H. Grossman

    1993-01-01

    Describes RPGrow$, a stand-level, interactive spreadsheet for projecting growth and yield and estimating financial returns of red pine plantations in the Lake States. This spreadsheet is based on published growth models for red pine. Financial analyses are based on discounted cash flow methods.

  4. Spreadsheets and Bulgarian Goats

    ERIC Educational Resources Information Center

    Sugden, Steve

    2012-01-01

    We consider a problem appearing in an Australian Mathematics Challenge in 2003. This article considers whether a spreadsheet might be used to model this problem, thus allowing students to explore its structure within the spreadsheet environment. It then goes on to reflect on some general principles of problem decomposition when the final goal is a…

  5. CEASAW: A User-Friendly Computer Environment Analysis for the Sawmill Owner

    Treesearch

    Guillermo Mendoza; William Sprouse; Philip A. Araman; William G. Luppold

    1991-01-01

    Improved spreadsheet software capabilities have brought optimization to users with little or no background in mathematical programming. Better interface capabilities of spreadsheet models now make it possible to combine optimization models with a spreadsheet system. Sawmill production and inventory systems possess many features that make them suitable application...

  6. Lens ray diagrams with a spreadsheet

    NASA Astrophysics Data System (ADS)

    González, Manuel I.

    2018-05-01

    Physicists create spreadsheets customarily to carry out numerical calculations and to display their results in a meaningful, nice-looking way. Spreadsheets can also be used to display a vivid geometrical model of a physical system. This statement is illustrated with an example taken from geometrical optics: images formed by a thin lens. A careful mixture of standard Excel functions makes it possible to display a realistic automated ray diagram. The suggested spreadsheet is intended as an auxiliary didactic tool for instructors who wish to teach their students to create their own ray diagrams.
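
    The underlying calculation such a spreadsheet automates is the thin-lens equation, 1/f = 1/d_o + 1/d_i. A short worked example in Python, with illustrative values:

        # Worked thin-lens calculation of the kind a ray-diagram spreadsheet
        # automates: 1/f = 1/d_o + 1/d_i (real-is-positive convention).
        def image_distance(f, d_o):
            """Image distance for focal length f and object distance d_o."""
            return 1.0 / (1.0 / f - 1.0 / d_o)

        f, d_o = 10.0, 15.0             # cm; illustrative values
        d_i = image_distance(f, d_o)
        m = -d_i / d_o                  # lateral magnification
        print(f"image at {d_i:.1f} cm, magnification {m:.2f}")  # 30.0, -2.00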

  7. Baseliner: an open source, interactive tool for processing sap flux data from thermal dissipation probes.

    Treesearch

    Andrew C. Oishi; David Hawthorne; Ram Oren

    2016-01-01

    Estimating transpiration from woody plants using thermal dissipation sap flux sensors requires careful data processing. Currently, researchers accomplish this using spreadsheets, or by personally writing scripts for statistical software programs (e.g., R, SAS). We developed the Baseliner software to help establish a standardized protocol for processing sap...

  8. Using a Readily Available Commercial Spreadsheet to Teach a Graduate Course on Chemical Process Simulation

    ERIC Educational Resources Information Center

    Clarke, Matthew A.; Giraldo, Carlos

    2009-01-01

    Chemical process simulation is one of the most fundamental skills that is expected from chemical engineers, yet relatively few graduates have the opportunity to learn, in depth, how a process simulator works, from programming the unit operations to the sequencing. The University of Calgary offers a "hands-on" postgraduate course in…

  9. Waste Characterization Process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lambert, Patrick E.

    2014-11-01

    The purpose is to provide guidance to the Radiological Characterization Reviewer to complete the radiological characterization of waste items. This information is used for Department of Transportation (DOT) shipping and disposal, typically at the Nevada National Security Site (NNSS). Complete characterization ensures compliance with DOT shipping laws and the NNSS Waste Acceptance Criteria (WAC). The fines for noncompliance can be extreme, to say nothing of possible bad press and endangerment of the public, employees, and the environment. A Radiological Characterization Reviewer has an important role in the organization. The scope is to outline the characterization process, not to cover every possible situation. The Radiological Characterization Reviewer position requires a strong background in health physics; therefore, these concepts are minimally addressed. The characterization process includes many Excel spreadsheets that were developed by Michael Enghauser, known as the WCT software suite. New Excel spreadsheets developed as part of this project include the Ra-226 Decider and the Density Calculator by Jesse Bland, and the MicroShield Density Calculator and Molecular Weight Calculator by Pat Lambert.

  10. A quality-based cost model for new electronic systems and products

    NASA Astrophysics Data System (ADS)

    Shina, Sammy G.; Saigal, Anil

    1998-04-01

    This article outlines a method for developing a quality-based cost model for the design of new electronic systems and products. The model incorporates a methodology for determining a cost-effective design margin allocation for electronic products and systems and its impact on manufacturing quality and cost. A spreadsheet-based cost estimating tool was developed to help implement this methodology in order for the system design engineers to quickly estimate the effect of design decisions and tradeoffs on the quality and cost of new products. The tool was developed with automatic spreadsheet connectivity to current process capability and with provisions to consider the impact of capital equipment and tooling purchases to reduce the product cost.

  11. A novel real-time data acquisition using an Excel spreadsheet in pendulum experiment tool with light-based timer

    NASA Astrophysics Data System (ADS)

    Adhitama, Egy; Fauzi, Ahmad

    2018-05-01

    In this study, a pendulum experiment tool with a light-based timer has been developed to measure the period of a simple pendulum, with the data recorded automatically in an Excel spreadsheet. The intensity of monochromatic light sensed by a 3DU5C phototransistor changes dynamically as the pendulum swings. The changing intensity varies the sensor's resistance, and this signal is processed by an ATMega328 microcontroller to obtain the period from brightness as a function of time as the pendulum crosses the light. Using the calculated average periods, the gravitational acceleration was determined accurately and precisely.
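
    The reduction step implied here is g = 4π²L/T² for a simple pendulum of length L and measured period T. A worked example in Python with hypothetical timer readings:

        # From measured average period T of a simple pendulum of length L,
        # g = 4*pi^2*L / T^2. The readings below are hypothetical.
        import math

        def g_from_period(length_m, period_s):
            return 4.0 * math.pi ** 2 * length_m / period_s ** 2

        periods = [2.007, 2.005, 2.006]          # hypothetical readings, s
        T = sum(periods) / len(periods)
        print(f"g = {g_from_period(1.0, T):.2f} m/s^2")  # ~9.81 for L = 1.0 m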

  12. Lens Ray Diagrams with a Spreadsheet

    ERIC Educational Resources Information Center

    González, Manuel I.

    2018-01-01

    Physicists create spreadsheets customarily to carry out numerical calculations and to display their results in a meaningful, nice-looking way. Spreadsheets can also be used to display a vivid geometrical model of a physical system. This statement is illustrated with an example taken from geometrical optics: images formed by a thin lens. A careful…

  13. Spreadsheet Modeling of Electron Distributions in Solids

    ERIC Educational Resources Information Center

    Glassy, Wingfield V.

    2006-01-01

    A series of spreadsheet modeling exercises constructed as part of a new upper-level elective course on solid state materials and surface chemistry is described. The spreadsheet exercises are developed to provide students with the opportunity to interact with the conceptual framework where the role of the density of states and the Fermi-Dirac…

  14. Designing Spreadsheet-Based Tasks for Purposeful Algebra

    ERIC Educational Resources Information Center

    Ainley, Janet; Bills, Liz; Wilson, Kirsty

    2005-01-01

    We describe the design of a sequence of spreadsheet-based pedagogic tasks for the introduction of algebra in the early years of secondary schooling within the Purposeful Algebraic Activity project. This design combines two relatively novel features to bring a different perspective to research in the use of spreadsheets for the learning and…

  15. 39 CFR 3050.22 - Documentation supporting attributable cost estimates in the Postal Service's section 3652 report.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., Demand Side Variability, and Network Variability studies, including input data, processing programs, and... should include the product or product groups carried under each listed contract; (k) Spreadsheets and...

  16. A Spreadsheet for a 2 x 3 x 2 Log-Linear Analysis. AIR 1991 Annual Forum Paper.

    ERIC Educational Resources Information Center

    Saupe, Joe L.

    This paper describes a personal computer spreadsheet set up to carry out hierarchical log-linear analyses, a type of analysis useful for institutional research into multidimensional frequency tables formed from categorical variables such as faculty rank, student class level, gender, or retention status. The spreadsheet provides a concrete vehicle…

  17. Using Spreadsheets to Help Students Think Recursively

    ERIC Educational Resources Information Center

    Webber, Robert P.

    2012-01-01

    Spreadsheets lend themselves naturally to recursive computations, since a formula can be defined as a function of one or more preceding cells. A hypothesized closed form for the "n"th term of a recursive sequence can be tested easily by using a spreadsheet to compute a large number of the terms. Similarly, a conjecture about the limit of a series…
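
    The same idea in code, with an illustrative recurrence: tabulate the recursive terms and test a conjectured closed form against them:

        # The spreadsheet idea in code: each term is a function of the
        # previous one; a conjectured closed form is checked term by term.
        def recursive_terms(n, a0=1):
            terms = [a0]
            for _ in range(n - 1):
                terms.append(2 * terms[-1] + 1)      # a_{k+1} = 2 a_k + 1
            return terms

        closed_form = lambda k: 2 ** (k + 1) - 1     # conjecture: 2^(k+1) - 1
        terms = recursive_terms(10)
        print(all(terms[k] == closed_form(k) for k in range(10)))   # True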

  18. A Spreadsheet Tool for Learning the Multiple Regression F-Test, T-Tests, and Multicollinearity

    ERIC Educational Resources Information Center

    Martin, David

    2008-01-01

    This note presents a spreadsheet tool that allows teachers the opportunity to guide students towards answering on their own questions related to the multiple regression F-test, the t-tests, and multicollinearity. The note demonstrates approaches for using the spreadsheet that might be appropriate for three different levels of statistics classes,…

  19. Integrated Spreadsheets as a Paradigm of Type II Technology Applications in Mathematics Teacher Education

    ERIC Educational Resources Information Center

    Abramovich, Sergei

    2016-01-01

    The paper presents the use of spreadsheets integrated with digital tools capable of symbolic computations and graphic constructions in a master's level capstone course for secondary mathematics teachers. Such use of spreadsheets is congruent with the Type II technology applications framework aimed at the development of conceptual knowledge in the…

  20. AXAOTHER XL -- A spreadsheet for determining doses for incidents caused by tornadoes or high-velocity straight winds

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simpkins, A.A.

    1996-09-01

    AXAOTHER XL is an Excel spreadsheet used to determine the dose to the maximally exposed offsite individual during high-velocity straight winds or tornado conditions. Both individual and population doses may be considered. Potential exposure pathways are inhalation and plume shine. For high-velocity straight winds, the spreadsheet can determine the downwind relative air concentration; for tornado conditions, the user must enter the relative air concentration. Theoretical models are discussed and hand calculations are performed to ensure proper application of the methodologies. A section containing user instructions for the spreadsheet has also been included.

  1. LabPatch, an acquisition and analysis program for patch-clamp electrophysiology.

    PubMed

    Robinson, T; Thomsen, L; Huizinga, J D

    2000-05-01

    An acquisition and analysis program, "LabPatch," has been developed for use in patch-clamp research. LabPatch controls any patch-clamp amplifier, acquires and records data, runs voltage protocols, plots and analyzes data, and connects to spreadsheet and database programs. Controls within LabPatch are grouped by function on one screen, much like an oscilloscope front panel. The software is mouse driven, so that the user need only point and click. Finally, the ability to copy data to other programs running in Windows 95/98, and the ability to keep track of experiments using a database, make LabPatch extremely versatile. The system requirements include Windows 95/98, at least a 100-MHz processor and 16 MB RAM, a data acquisition card, digital-to-analog converter, and a patch-clamp amplifier. LabPatch is available free of charge at http://www.fhs.mcmaster.ca/huizinga/.

  2. SABIO-RK: an updated resource for manually curated biochemical reaction kinetics

    PubMed Central

    Rey, Maja; Weidemann, Andreas; Kania, Renate; Müller, Wolfgang

    2018-01-01

    SABIO-RK (http://sabiork.h-its.org/) is a manually curated database containing data about biochemical reactions and their reaction kinetics. The data are primarily extracted from scientific literature and stored in a relational database. The content comprises both naturally occurring and alternatively measured biochemical reactions and is not restricted to any organism class. The data are made available to the public by a web-based search interface and by web services for programmatic access. In this update we describe major improvements and extensions of SABIO-RK since our last publication in the database issue of Nucleic Acids Research (2012). (i) The website has been completely revised and (ii) now also allows free-text search for kinetics data. (iii) Additional interlinkages with other databases in our field have been established, enabling users to gain comprehensive knowledge about the properties of enzymes and kinetics directly, beyond SABIO-RK. (iv) Vice versa, direct access to SABIO-RK data has been implemented in several systems biology tools and workflows. (v) At the request of our experimental users, the data can now additionally be exported in spreadsheet formats. (vi) The newly established SABIO-RK Curation Service makes it possible to respond to specific data requirements. PMID:29092055

  3. Breaking free from chemical spreadsheets.

    PubMed

    Segall, Matthew; Champness, Ed; Leeding, Chris; Chisholm, James; Hunt, Peter; Elliott, Alex; Garcia-Martinez, Hector; Foster, Nick; Dowling, Samuel

    2015-09-01

    Drug discovery scientists often consider compounds and data in terms of groups, such as chemical series, and relationships, representing similarity or structural transformations, to aid compound optimisation. This is often supported by chemoinformatics algorithms, for example clustering and matched molecular pair analysis. However, chemistry software packages commonly present these data as spreadsheets or form views that make it hard to find relevant patterns or compare related compounds conveniently. Here, we review common data visualisation and analysis methods used to extract information from chemistry data. We introduce a new framework that enables scientists to work flexibly with drug discovery data to reflect their thought processes and interact with the output of algorithms to identify key structure-activity relationships and guide further optimisation intuitively. Copyright © 2015 Elsevier Ltd. All rights reserved.

  4. Spreadsheet log analysis in subsurface geology

    USGS Publications Warehouse

    Doveton, J.H.

    2000-01-01

    Most of the direct knowledge of the geology of the subsurface is gained from the examination of core and drill-cuttings recovered from boreholes drilled by the petroleum and water industries. Wireline logs run in these same boreholes generally have been restricted to tasks of lithostratigraphic correlation and the location of hydrocarbon pay zones. However, the range of petrophysical measurements has expanded markedly in recent years, so that log traces now can be transformed to estimates of rock composition. Increasingly, logs are available in a digital format that can be read easily by a desktop computer and processed by simple spreadsheet software methods. Taken together, these developments offer accessible tools for new insights into subsurface geology that complement the traditional, but limited, sources of core and cutting observations.

  5. Reflecting on the challenges of building a rich interconnected metadata database to describe the experiments of phase six of the coupled climate model intercomparison project (CMIP6) for the Earth System Documentation Project (ES-DOC) and anticipating the opportunities that tooling and services based on rich metadata can provide.

    NASA Astrophysics Data System (ADS)

    Pascoe, C. L.

    2017-12-01

    The Coupled Model Intercomparison Project (CMIP) has coordinated climate model experiments involving multiple international modelling teams since 1995. This has led to a better understanding of past, present, and future climate. The 2017 sixth phase of the CMIP process (CMIP6) consists of a suite of common experiments, and 21 separate CMIP-Endorsed Model Intercomparison Projects (MIPs), making a total of 244 separate experiments. Precise descriptions of the suite of CMIP6 experiments have been captured in a Common Information Model (CIM) database by the Earth System Documentation Project (ES-DOC). The database contains descriptions of forcings, model configuration requirements, ensemble information and citation links, as well as text descriptions and information about the rationale for each experiment. The database was built from statements about the experiments found in the academic literature, the MIP submissions to the World Climate Research Programme (WCRP), WCRP summary tables, and correspondence with the principal investigators for each MIP. The database was collated using spreadsheets, which are archived in the ES-DOC GitHub repository and then rendered on the ES-DOC website. A diagrammatic view of the workflow of building the database of experiment metadata for CMIP6 is shown in the attached figure. The CIM provides the formalism to collect detailed information from diverse sources in a standard way across all the CMIP6 MIPs. The ES-DOC documentation acts as a unified reference for CMIP6 information to be used both by data producers and consumers. This is especially important given the federated nature of the CMIP6 project. Because the CIM allows forcing constraints and other experiment attributes to be referred to by more than one experiment, we can streamline the process of collecting information from modelling groups about how they set up their models for each experiment. End users of the climate model archive will be able to ask questions enabled by the interconnectedness of the metadata, such as "Which MIPs make use of experiment A?" and "Which experiments use forcing constraint B?".

  6. An Introduction to Simulated Annealing

    ERIC Educational Resources Information Center

    Albright, Brian

    2007-01-01

    An attempt to model the physical process of annealing led to the development of a type of combinatorial optimization algorithm that addresses the problem of getting trapped in a local minimum. The author presents a Microsoft Excel spreadsheet that illustrates how this works.
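
    A compact Python sketch of the algorithm such a spreadsheet illustrates: accept a worse move with probability exp(-Δ/T) so the search can escape local minima; the test function and cooling schedule are illustrative:

        # Compact simulated-annealing sketch: occasionally accept uphill
        # moves, with the acceptance probability shrinking as T cools.
        import math
        import random

        def anneal(f, x0, T=1.0, cooling=0.995, steps=5000, step_size=0.5):
            x, best = x0, x0
            for _ in range(steps):
                candidate = x + random.uniform(-step_size, step_size)
                delta = f(candidate) - f(x)
                if delta < 0 or random.random() < math.exp(-delta / T):
                    x = candidate
                    if f(x) < f(best):
                        best = x
                T *= cooling                     # cooling schedule
            return best

        bumpy = lambda x: x * x + 3 * math.sin(5 * x)   # many local minima
        print(anneal(bumpy, x0=4.0))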

  7. Multimodal system planning technique : an analytical approach to peak period operation

    DOT National Transportation Integrated Search

    1995-11-01

    The multimodal system planning technique described in this report is an improvement of the methodology used in the Dallas System Planning Study. The technique includes a spreadsheet-based process to identify the costs of congestion, construction, and...

  8. Computerized Budget Monitoring.

    ERIC Educational Resources Information Center

    Stein, Julian U.; Rowe, Joe N.

    1989-01-01

    This article discusses the importance of budget monitoring in fiscal management; describes ways in which computerized budget monitoring increases accuracy, efficiency, and flexibility; outlines steps in the budget process; and presents sample reports, generated using the Lotus 1-2-3 spreadsheet and graphics program. (IAH)

  9. Life Cycle Inventory (LCI) Data-Treatment Chemicals, Construction Materials, Transportation, On-site Equipment, and other Processes for Use in Spreadsheets for Environmental Footprint Analysis (SEFA): Revised Addition

    EPA Science Inventory

    This report estimates environmental emission factors (EmF) for key chemicals, construction and treatment materials, transportation/on-site equipment, and other processes used at remediation sites. The basis for chemical, construction, and treatment material EmFs is life cycle inv...

  10. Life Cycle Inventory (LCI) Data-Treatment Chemicals, Construction Materials, Transportation, On-site Equipment, and Other Processes for Use in Spreadsheets for Environmental Footprint Analysis (SEFA)

    EPA Science Inventory

    This report estimates environmental emission factors (EmF) for key chemicals, construction and treatment materials, transportation/on-site equipment, and other processes used at remediation sites. The basis for chemical, construction, and treatment material EmFs is life cycle inv...

  11. DARPA Initiative in Concurrent Engineering (DICE). Phase 2

    DTIC Science & Technology

    1990-07-31

    XS spreadsheet tool; Q-Calc spreadsheet tool; TAE+ outer wrapper for XS; Framemaker-based formal EDN (Electronic Design Notebook); data ... shared global object space and object persistence. Technical Results, Module Development, XS Integration Environment: a prototype of the wrapper concepts was developed for a spreadsheet integration environment, using an X-Windows-based extensible Lotus 1-2-3 emulation called XS, and was (initially) targeted for ...

  12. On the Use of Spreadsheet Algebra Programs in the Professional Development of Teachers from Selected Township High Schools

    ERIC Educational Resources Information Center

    Gierdien, M. Faaiz

    2014-01-01

    This paper reports on the initial stages of a small-scale project involving the use of "spreadsheet algebra programs" in the professional development of eight teachers from three township high schools. In terms of the education context, the paper draws on social practice theory. It then details what is meant by spreadsheet algebra. An…

  13. Psychology Experiment Authoring Kit (PEAK): formal usability testing of an easy-to-use method for creating computerized experiments.

    PubMed

    Schneider, Walter; Bolger, D J; Eschman, Amy; Neff, Christopher; Zuccolotto, Anthony P

    2005-05-01

    In academic courses in which one task for the students is to understand empirical methodology and the nature of scientific inquiry, the ability of students to create and implement their own experiments allows them to take intellectual ownership of, and greatly facilitates, the learning process. The Psychology Experiment Authoring Kit (PEAK) is a novel spreadsheet-based interface allowing students and researchers with rudimentary spreadsheet skills to create cognitive and cognitive neuroscience experiments in minutes. Students fill in a spreadsheet listing of independent variables and stimuli, insert columns that represent experimental objects such as slides (presenting text, pictures, and sounds) and feedback displays to create complete experiments, all within a single spreadsheet. The application then executes experiments with centisecond precision. Formal usability testing was done in two stages: (1) detailed coding of 10 individual subjects in one-on-one experimenter/subject videotaped sessions and (2) classroom testing of 64 undergraduates. In both individual and classroom testing, the students learned to effectively use PEAK within 2 h, and were able to create a lexical decision experiment in under 10 min. Findings from the individual testing in Stage 1 resulted in significant changes to documentation and training materials and identification of bugs to be corrected. Stage 2 testing identified additional bugs to be corrected and new features to be considered to facilitate student understanding of the experiment model. Such testing will improve the approach with each semester. The students were typically able to create their own projects in 2 h.

  14. An Application Programming Interface for Synthetic Snowflake Particle Structure and Scattering Data

    NASA Technical Reports Server (NTRS)

    Lammers, Matthew; Kuo, Kwo-Sen

    2017-01-01

    The work by Kuo and colleagues on growing synthetic snowflakes and calculating their single-scattering properties has demonstrated great potential to improve the retrievals of snowfall. To grant colleagues flexible and targeted access to their large collection of sizes and shapes at fifteen (15) microwave frequencies, we have developed a web-based Application Programming Interface (API) integrated with NASA Goddard's Precipitation Processing System (PPS) Group. It is our hope that the API will enable convenient programmatic utilization of the database. To help users better understand the API's capabilities, we have developed an interactive web interface called the OpenSSP API Query Builder, which implements an intuitive system of mechanisms for selecting shapes, sizes, and frequencies to generate queries, with which the API can then extract and return data from the database. The Query Builder also allows for the specification of normalized particle size distributions by setting pertinent parameters, with which the API can also return mean geometric and scattering properties for each size bin. Additionally, the Query Builder interface enables downloading of raw scattering and particle structure data packages. This presentation will describe some of the challenges and successes associated with developing such an API. Examples of its usage will be shown both through downloading output and pulling it into a spreadsheet, as well as querying the API programmatically and working with the output in code.
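
    A generic sketch of programmatic use of such an API; the URL, endpoint, and parameter names below are hypothetical placeholders, not the actual OpenSSP interface:

        # Generic sketch of querying a REST API and loading the result for
        # spreadsheet-style analysis; URL and parameters are placeholders.
        import io
        import requests
        import pandas as pd

        BASE_URL = "https://example.nasa.gov/openssp/api"   # placeholder only

        def fetch_scattering(shape, frequency_ghz):
            resp = requests.get(f"{BASE_URL}/scattering",
                                params={"shape": shape, "freq": frequency_ghz},
                                timeout=30)
            resp.raise_for_status()
            return pd.read_csv(io.StringIO(resp.text))   # assume CSV response

        # df = fetch_scattering("aggregate", 35.6)
        # print(df.groupby("size_bin")["backscatter"].mean())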

  15. Eggen Card Project: Progress and Plans (Abstract)

    NASA Astrophysics Data System (ADS)

    Silvis, G.

    2016-12-01

    (Abstract only) The Eggen Card Project has been running since 2009 and has involved 30+ AAVSO staff and volunteers. Let me offer a short review of the project, our progress this year, and our plans for the future. Phase 1 of the project has been to index the 108,000 card images, identifying the stars they belong to. We've passed the 75% point on this phase. The next phase is how to use this data. Jack Crast has identified the photometric schemes used by Olin and developed a spreadsheet tool to prepare this data for inclusion into the AAVSO International Database (AID). Anyone want good photometry from 1970? We got it!

  16. Plugging into Marketing Education.

    ERIC Educational Resources Information Center

    Lunkenheimer, Gary; Swift, Teri

    This text contains activities that allow marketing education instructors to integrate their curriculum with word-processing, spreadsheet, and presentation software. Their students can gain experience with technology, fulfill marketing education learner outcomes, and meet the demands of a marketing job. The instructor provides an outline for…

  17. Abdominal surgery process modeling framework for simulation using spreadsheets.

    PubMed

    Boshkoska, Biljana Mileva; Damij, Talib; Jelenc, Franc; Damij, Nadja

    2015-08-01

    We provide a continuation of the existing Activity Table Modeling methodology with a modular spreadsheet simulation. The simulation model developed comprises 28 modeling elements for the abdominal surgery cycle process. The simulation of a two-week patient flow in an abdominal clinic with 75 beds demonstrates the applicability of the methodology. The simulation does not include macros, so programming experience is not essential for replicating or upgrading the model. Unlike the existing methods, the proposed solution employs a modular approach to modeling the activities that ensures better readability, the possibility of easily upgrading the model with other activities, and easy extension and connectivity with other similar models. We propose a first-in-first-served approach for simulating the servicing of multiple patients. The uncertain time duration of the activities is modeled using the function "rand()". Patients' movements from one activity to the next are tracked with nested "if()" functions, allowing easy re-creation of the process without the need for complex programming. Copyright © 2015 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
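
    The rand()/if() mechanics described above can be paraphrased outside the spreadsheet. The following minimal Python sketch (activity names, durations, and inter-arrival times are invented placeholders, not the published model) serves patients first-in-first-served through a chain of activities with uniformly random durations:

        import random

        # Invented placeholder activities: (name, min_duration, max_duration) in minutes.
        ACTIVITIES = [("admission", 10, 30), ("surgery", 60, 180), ("recovery", 120, 360)]

        def simulate(n_patients, seed=1):
            random.seed(seed)
            free_at = [0.0] * len(ACTIVITIES)  # when each activity next becomes free
            arrival = 0.0
            for patient in range(n_patients):
                t = arrival
                for i, (_name, lo, hi) in enumerate(ACTIVITIES):
                    start = max(t, free_at[i])          # wait if busy (first-in-first-served)
                    t = start + random.uniform(lo, hi)  # analogue of the spreadsheet's rand()
                    free_at[i] = t
                print(f"patient {patient} discharged at t = {t:.0f} min")
                arrival += 45.0  # invented inter-arrival time

        simulate(5)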

  18. Implementing parallel spreadsheet models for health policy decisions: The impact of unintentional errors on model projections

    PubMed Central

    Bailey, Stephanie L.; Bono, Rose S.; Nash, Denis; Kimmel, April D.

    2018-01-01

    Background: Spreadsheet software is increasingly used to implement systems science models informing health policy decisions, both in academia and in practice where technical capacity may be limited. However, spreadsheet models are prone to unintentional errors that may not always be identified using standard error-checking techniques. Our objective was to illustrate, through a methodologic case study analysis, the impact of unintentional errors on model projections by implementing parallel model versions. Methods: We leveraged a real-world need to revise an existing spreadsheet model designed to inform HIV policy. We developed three parallel versions of a previously validated spreadsheet-based model; versions differed by the spreadsheet cell-referencing approach (named single cells; column/row references; named matrices). For each version, we implemented three model revisions (re-entry into care; guideline-concordant treatment initiation; immediate treatment initiation). After standard error-checking, we identified unintentional errors by comparing model output across the three versions. Concordant model output across all versions was considered error-free. We calculated the impact of unintentional errors as the percentage difference in model projections between model versions with and without unintentional errors, using +/-5% difference to define a material error. Results: We identified 58 original and 4,331 propagated unintentional errors across all model versions and revisions. Over 40% (24/58) of original unintentional errors occurred in the column/row reference model version; most (23/24) were due to incorrect cell references. Overall, >20% of model spreadsheet cells had material unintentional errors. When examining error impact along the HIV care continuum, the percentage difference between versions with and without unintentional errors ranged from +3% to +16% (named single cells), +26% to +76% (column/row reference), and 0% (named matrices). Conclusions: Standard error-checking techniques may not identify all errors in spreadsheet-based models. Comparing parallel model versions can aid in identifying unintentional errors and promoting reliable model projections, particularly when resources are limited. PMID:29570737

  19. Implementing parallel spreadsheet models for health policy decisions: The impact of unintentional errors on model projections.

    PubMed

    Bailey, Stephanie L; Bono, Rose S; Nash, Denis; Kimmel, April D

    2018-01-01

    Spreadsheet software is increasingly used to implement systems science models informing health policy decisions, both in academia and in practice where technical capacity may be limited. However, spreadsheet models are prone to unintentional errors that may not always be identified using standard error-checking techniques. Our objective was to illustrate, through a methodologic case study analysis, the impact of unintentional errors on model projections by implementing parallel model versions. We leveraged a real-world need to revise an existing spreadsheet model designed to inform HIV policy. We developed three parallel versions of a previously validated spreadsheet-based model; versions differed by the spreadsheet cell-referencing approach (named single cells; column/row references; named matrices). For each version, we implemented three model revisions (re-entry into care; guideline-concordant treatment initiation; immediate treatment initiation). After standard error-checking, we identified unintentional errors by comparing model output across the three versions. Concordant model output across all versions was considered error-free. We calculated the impact of unintentional errors as the percentage difference in model projections between model versions with and without unintentional errors, using +/-5% difference to define a material error. We identified 58 original and 4,331 propagated unintentional errors across all model versions and revisions. Over 40% (24/58) of original unintentional errors occurred in the column/row reference model version; most (23/24) were due to incorrect cell references. Overall, >20% of model spreadsheet cells had material unintentional errors. When examining error impact along the HIV care continuum, the percentage difference between versions with and without unintentional errors ranged from +3% to +16% (named single cells), +26% to +76% (column/row reference), and 0% (named matrices). Standard error-checking techniques may not identify all errors in spreadsheet-based models. Comparing parallel model versions can aid in identifying unintentional errors and promoting reliable model projections, particularly when resources are limited.
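
    The error-detection idea of the study, implementing the same model several times and flagging cells whose outputs diverge, can be sketched generically. The +/-5% material-error threshold follows the abstract; the outputs are placeholder numbers, and for simplicity one version is treated as the reference, whereas the study treated concordance across all versions as error-free.

        # Output of the same model cells implemented in three parallel versions
        # (placeholder values; in practice these come from the three spreadsheets).
        versions = {
            "named_single_cells": [0.52, 1.30, 4.10],
            "column_row_refs":    [0.52, 1.71, 4.10],
            "named_matrices":     [0.52, 1.30, 4.10],
        }

        MATERIAL = 0.05  # +/-5% difference defines a material error, per the study

        reference = versions["named_matrices"]
        for name, outputs in versions.items():
            for i, (value, ref) in enumerate(zip(outputs, reference)):
                diff = (value - ref) / ref
                if abs(diff) > MATERIAL:
                    print(f"{name}, cell {i}: {diff:+.0%} vs reference -> material error")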

  20. The "chessboard" classification scheme of mineral deposits: Mineralogy and geology from aluminum to zirconium

    NASA Astrophysics Data System (ADS)

    Dill, Harald G.

    2010-06-01

    Economic geology is a mixtum compositum of all geoscientific disciplines focused on one goal, finding new mineral deposits and enhancing their exploitation. The keystones of this mixtum compositum are geology and mineralogy, whose studies are centered around the emplacement of the ore body and the development of its minerals and rocks. In the present study, mineralogy and geology act as x- and y-coordinates of a classification chart of mineral resources called the "chessboard" (or "spreadsheet") classification scheme. Magmatic and sedimentary lithologies together with tectonic structures (1-D/pipes, 2-D/veins) are plotted along the x-axis in the header of the spreadsheet diagram, representing the columns in this chart diagram. 63 commodity groups, encompassing minerals and elements, are plotted along the y-axis, forming the lines of the spreadsheet. These commodities are subjected to a tripartite subdivision into ore minerals, industrial minerals/rocks and gemstones/ornamental stones. Further information on the various types of mineral deposits, as to the major ore and gangue minerals, the current models and the mode of formation or when and in which geodynamic setting these deposits mainly formed throughout the geological past, may be obtained from the text by simply using the code of each deposit in the chart. This code can be created by combining the commodity (lines), shown by numbers plus lower-case letters, with the host rocks or structure (columns), given by capital letters. Each commodity has a small preface on the mineralogy and chemistry and ends up with an outlook into its final use and the supply situation of the raw material on a global basis, which may be updated by the user through a direct link to databases available on the internet. In this case the study has been linked to the commodity database of the US Geological Survey. The internal subdivision of each commodity section corresponds to the common host rock lithologies (magmatic, sedimentary, and metamorphic) and structures. Cross sections and images illustrate the common ore types of each commodity. Ore takes priority over the mineral. The minerals and host rocks are listed by their chemical and mineralogical compositions, respectively, separated from the text but supplemented with cross-references to the columns and lines where they prevalently occur. A metallogenetic-geodynamic overview is given at the bottom of each column in the spreadsheet. It may be taken as the "sum" or the "mean" of a number of geodynamic models and ideas put forward by the various researchers for all the deposits pertaining to a certain clan of lithology or structure. This classical or conservative view of metallotects related to the common plate tectonic settings is supplemented by an approach taken for the first time for such a number of deposits, using the concepts of sequence stratigraphy. This paper, so to speak, is a "launch pad" for a new mindset in metallogenesis rather than the final result. The relationship supergene-hypogene and syngenetic-epigenetic has been the topic of many studies for ages, but to keep them as separate entities is often unworkable in practice, especially in the so-called epithermal or near-surface/shallow deposits. Vein-type and stratiform ore bodies are also generally handled very differently.
To get these different structural elements (space) and various mineralizing processes (time) together and to allow for forward modeling in mineral exploration, architectural elements of sequence stratigraphy are adapted to mineral resources. Deposits are geological bodies which need accommodation space created by the environment of formation and the tectonic/geodynamic setting through time. They are controlled by horizontal to subhorizontal reference planes and/or vertical structures. Prerequisites for the deposits to evolve are thermal and/or mechanical gradients. Thermal energy is, for most of the settings under consideration, deeply rooted in the mantle. A perspective on how this concept might work is given in the text by a pilot project on mineral deposits in Central Europe and in the spreadsheet classification scheme by providing a color-coded categorization into: 1. mineralization mainly related to planar architectural elements (e.g., subaerial sequence boundaries and unconformities); 2. mineralization mainly related to planar architectural elements (e.g., submarine sequence boundaries, transgressive surfaces, and maximum flooding zones/surfaces); 3. mineralization mainly controlled by system tracts (lowstand system tracts, transgressive system tracts, highstand system tracts); 4. mineralization of subvolcanic or intermediate level, to be correlated with the architectural elements of basin evolution; 5. mineralization of deep level, to be correlated with the deep-seated structural elements. There are several squares on the chessboard left blank, mainly for lack of information on the sequence stratigraphy of mineral deposits. This method has not yet found many users in mineral exploration. This review is designed as an "interactive paper" open for amendments in the electronic spreadsheet version and adjustable to the needs and wants of application, research, and training in geosciences. Metamorphic host rock lithologies and commodities are addressed by different color codes in the chessboard classification scheme.
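
    As a toy illustration of the deposit-code construction just described, combining a commodity line (number plus lower-case letters) with a host-rock or structure column (capital letter) yields the key used to look up a deposit type in the text. The entries below are invented, not taken from the actual chart.

        # Invented toy fragment of the "chessboard": commodity lines vs. host-rock columns.
        commodities = {"1al": "aluminum ores", "63zr": "zirconium minerals"}
        columns = {"A": "granitic intrusions", "Q": "veins (2-D structures)"}

        def deposit_code(commodity, column):
            """Combine a commodity line with a host-rock/structure column."""
            return f"{commodity}{column}: {commodities[commodity]} hosted in {columns[column]}"

        print(deposit_code("1al", "A"))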

  1. Individualized Human CAD Models: Anthropometric Morphing and Body Tissue Layering

    DTIC Science & Technology

    2014-07-31

    Part Flow Chart of the Interaction among VBA Macros, Excel® Spreadsheet, and SolidWorks Front View of the Male and Female Soldier CAD Model...yellow highlighting. The spreadsheet is linked to the CAD model by macros created with the Visual Basic for Applications (VBA) editor in Microsoft Excel...basically three working parts to the anthropometric morphing that are all interconnected (VBA macros, Excel spreadsheet, and SolidWorks). The flow

  2. A computer assisted tutorial for applications of computer spreadsheets in nursing financial management.

    PubMed

    Edwardson, S R; Pejsa, J

    1993-01-01

    A computer-based tutorial for teaching nursing financial management concepts was developed using the macro function of a commercially available spreadsheet program. The goals of the tutorial were to provide students with an experience with spreadsheets as a computer tool and to teach selected financial management concepts. Preliminary results show the tutorial was well received by students. Suggestions are made for overcoming the general lack of computer sophistication among students.

  3. Using a Spreadsheet to Solve the Schrödinger Equations for the Energies of the Ground Electronic State and the Two Lowest Excited States of H[subscript 2]

    ERIC Educational Resources Information Center

    Ge, Yingbin; Rittenhouse, Robert C.; Buchanan, Jacob C.; Livingston, Benjamin

    2014-01-01

    We have designed an exercise suitable for a lab or project in an undergraduate physical chemistry course that creates a Microsoft Excel spreadsheet to calculate the energy of the S[subscript 0] ground electronic state and the S[subscript 1] and T[subscript 1] excited states of H[subscript 2]. The spreadsheet calculations circumvent the…

  4. Strategies of Successful Technology Integrators. Part I: Streamlining Classroom Management.

    ERIC Educational Resources Information Center

    McNally, Lynn; Etchison, Cindy

    2000-01-01

    Discussion of how to develop curriculum that successfully integrates technology into elementary and secondary school classrooms focuses on solutions for school and classroom management tasks. Highlights include Web-based solutions; student activities; word processing; desktop publishing; draw and paint programs; spreadsheets; and database…

  5. Calculating ellipse area by the Monte Carlo method and analysing dice poker with Excel at high school

    NASA Astrophysics Data System (ADS)

    Benacka, Jan

    2016-08-01

    This paper reports on lessons in which 18-19-year-old high school students modelled random processes with Excel. In the first lesson, 26 students formulated a hypothesis on the area of an ellipse by using the analogy between the areas of the circle, square and rectangle. They verified the hypothesis by the Monte Carlo method with a spreadsheet model developed in the lesson. In the second lesson, 27 students analysed the dice poker game. First, they calculated the probability of the hands by combinatorial formulae. Then, they verified the result with a spreadsheet model developed in the lesson. The students were given a questionnaire to find out whether they found the lesson interesting and felt it contributed to their mathematical and technological knowledge.
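
    The first lesson's experiment can be paraphrased outside Excel. This minimal sketch (not the students' spreadsheet model) samples points uniformly in the bounding rectangle of an ellipse and estimates the area from the hit fraction, for comparison with the hypothesised value pi*a*b:

        import math
        import random

        def ellipse_area_mc(a, b, n=100_000, seed=1):
            """Monte Carlo estimate of the area of an ellipse with semi-axes a and b."""
            random.seed(seed)
            hits = 0
            for _ in range(n):
                x = random.uniform(-a, a)
                y = random.uniform(-b, b)
                if (x / a) ** 2 + (y / b) ** 2 <= 1.0:
                    hits += 1
            return 4 * a * b * hits / n  # hit fraction times bounding-box area

        print(ellipse_area_mc(3, 2), "vs exact", math.pi * 3 * 2)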

  6. Software for Storage and Management of Microclimatic Data for Preventive Conservation of Cultural Heritage

    PubMed Central

    Fernández-Navajas, Ángel; Merello, Paloma; Beltrán, Pedro; García-Diego, Fernando-Juan

    2013-01-01

    Cultural Heritage preventive conservation requires the monitoring of the parameters involved in the process of deterioration of artworks. Thus, both long-term monitoring of the environmental parameters as well as further analysis of the recorded data are necessary. The long-term monitoring at frequencies higher than 1 data point/day generates large volumes of data that are difficult to store, manage and analyze. This paper presents software which uses a free open source database engine to manage and interact with huge amounts of data from environmental monitoring of cultural heritage sites. It is simple to operate and offers multiple capabilities, such as detection of anomalous data, inquiries, graph plotting and mean trajectories. It is also possible to export the data to a spreadsheet for analyses with more advanced statistical methods (principal component analysis, ANOVA, linear regression, etc.). This paper also deals with a practical application developed for the Renaissance frescoes of the Cathedral of Valencia. The results suggest infiltration of rainwater in the vault and weekly relative humidity changes related to the religious service schedules. PMID:23447005

  7. Organization's Orderly Interest Exploration: Inception, Development and Insights of AIAA's Topics Database

    NASA Technical Reports Server (NTRS)

    Marshall, Joseph R.; Morris, Allan T.

    2007-01-01

    Since 2003, AIAA's Computer Systems and Software Systems Technical Committees (TCs) have developed a database that helps technical committee management map technical topics to their members. This Topics/Interest (T/I) database grew out of a collection of charts and spreadsheets maintained by the TCs. Since its inception, the tool has evolved into a multi-dimensional database whose dimensions include the importance, interest and expertise of TC members and whether or not a member and/or a TC is actively involved with the topic. In 2005, the database was expanded to include the TCs in AIAA's Information Systems Group and then expanded further to include all AIAA TCs. It was field tested at an AIAA Technical Activities Committee (TAC) Workshop in early 2006 through live access by over 80 users. Through the use of the topics database, TC and program committee (PC) members can accomplish relevant tasks such as identifying topic experts (for Aerospace America articles or external contacts), determining the interest of members, identifying overlapping topics between diverse TCs and PCs, guiding new member drives, and revealing emerging topics. This paper will describe the origins, inception, initial development, field test and current version of the tool, as well as elucidate the benefits and insights gained by using the database to aid the management of various TC functions. Suggestions will be provided to guide future development of the database for the purpose of providing dynamics and system level benefits to AIAA that currently do not exist in any technical organization.

  8. Iteration with Spreadsheets.

    ERIC Educational Resources Information Center

    Smith, Michael

    1990-01-01

    Presents several examples of the iteration method using computer spreadsheets. Examples included are simple iterative sequences and the solution of equations using the Newton-Raphson formula, linear interpolation, and interval bisection. (YP)
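
    Of the iteration schemes listed, the Newton-Raphson formula translates most directly from spreadsheet rows to code: each row computes x <- x - f(x)/f'(x) from the row above. A minimal sketch, solving x^2 - 2 = 0:

        def newton_raphson(f, df, x, tol=1e-12, max_iter=50):
            """Iterate x <- x - f(x)/f'(x), as one would fill down a spreadsheet column."""
            for _ in range(max_iter):
                step = f(x) / df(x)
                x -= step
                if abs(step) < tol:
                    break
            return x

        # Compute sqrt(2) as the positive root of f(x) = x^2 - 2.
        print(newton_raphson(lambda x: x * x - 2, lambda x: 2 * x, x=1.0))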

  9. CalTOX (registered trademark), A multimedia total exposure model spreadsheet user's guide. Version 4.0(Beta)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McKone, T.E.; Enoch, K.G.

    2002-08-01

    CalTOX has been developed as a set of spreadsheet models and spreadsheet data sets to assist in assessing human exposures from continuous releases to multiple environmental media, i.e. air, soil, and water. It has also been used for waste classification and for setting soil clean-up levels at uncontrolled hazardous wastes sites. The modeling components of CalTOX include a multimedia transport and transformation model, multi-pathway exposure scenario models, and add-ins to quantify and evaluate uncertainty and variability. All parameter values used as inputs to CalTOX are distributions, described in terms of mean values and a coefficient of variation, rather than as point estimates or plausible upper values such as most other models employ. This probabilistic approach allows both sensitivity and uncertainty analyses to be directly incorporated into the model operation. This manual provides CalTOX users with a brief overview of the CalTOX spreadsheet model and provides instructions for using the spreadsheet to make deterministic and probabilistic calculations of source-dose-risk relationships.

  10. COMPUTER SIMULATOR (BEST) FOR DESIGNING SULFATE-REDUCING BACTERIA FIELD BIOREACTORS

    EPA Science Inventory

    BEST (bioreactor economics, size and time of operation) is a spreadsheet-based model that is used in conjunction with public domain software, PhreeqcI. BEST is used in the design process of sulfate-reducing bacteria (SRB) field bioreactors to passively treat acid mine drainage (A...

  11. Cloud Computing Based E-Learning System

    ERIC Educational Resources Information Center

    Al-Zoube, Mohammed; El-Seoud, Samir Abou; Wyne, Mudasser F.

    2010-01-01

    Cloud computing technologies although in their early stages, have managed to change the way applications are going to be developed and accessed. These technologies are aimed at running applications as services over the internet on a flexible infrastructure. Microsoft office applications, such as word processing, excel spreadsheet, access database…

  12. DESIGNING SULFATE-REDUCING BACTERIA FIELD BIOREACTORS USING THE BEST MODEL

    EPA Science Inventory

    BEST (bioreactor economics, size and time of operation) is a spreadsheet-based model that is used in conjunction with a public domain computer software package, PHREEQCI. BEST is intended to be used in the design process of sulfate-reducing bacteria (SRB)field bioreactors to pas...

  13. Development, Use, and Impact of a Global Laboratory Database During the 2014 Ebola Outbreak in West Africa.

    PubMed

    Durski, Kara N; Singaravelu, Shalini; Teo, Junxiong; Naidoo, Dhamari; Bawo, Luke; Jambai, Amara; Keita, Sakoba; Yahaya, Ali Ahmed; Muraguri, Beatrice; Ahounou, Brice; Katawera, Victoria; Kuti-George, Fredson; Nebie, Yacouba; Kohar, T Henry; Hardy, Patrick Jowlehpah; Djingarey, Mamoudou Harouna; Kargbo, David; Mahmoud, Nuha; Assefa, Yewondwossen; Condell, Orla; N'Faly, Magassouba; Van Gurp, Leon; Lamanu, Margaret; Ryan, Julia; Diallo, Boubacar; Daffae, Foday; Jackson, Dikena; Malik, Fayyaz Ahmed; Raftery, Philomena; Formenty, Pierre

    2017-06-15

    The international impact, rapid widespread transmission, and reporting delays during the 2014 Ebola outbreak in West Africa highlighted the need for a global, centralized database to inform outbreak response. The World Health Organization and Emerging and Dangerous Pathogens Laboratory Network addressed this need by supporting the development of a global laboratory database. Specimens were collected in the affected countries from patients and dead bodies meeting the case definitions for Ebola virus disease. Test results were entered in nationally standardized spreadsheets and consolidated onto a central server. From March 2014 through August 2016, 256,343 specimens tested for Ebola virus disease were captured in the database. Thirty-one specimen types were collected, and a variety of diagnostic tests were performed. Regular analysis of data described the functionality of laboratory and response systems, positivity rates, and the geographic distribution of specimens. With data standardization and end user buy-in, the collection and analysis of large amounts of data with multiple stakeholders and collaborators across various user-access levels was made possible and contributed to outbreak response needs. The usefulness and value of a multifunctional global laboratory database is far reaching, with uses including virtual biobanking, disease forecasting, and adaptation to other disease outbreaks. © The Author 2017. Published by Oxford University Press for the Infectious Diseases Society of America. All rights reserved. For permissions, e-mail: journals.permissions@oup.com.

  14. Comparison of simple additive weighting (SAW) and composite performance index (CPI) methods in employee remuneration determination

    NASA Astrophysics Data System (ADS)

    Karlitasari, L.; Suhartini, D.; Benny

    2017-01-01

    The process of determining employee remuneration at PT Sepatu Mas Idaman currently relies on a Microsoft Excel-based spreadsheet in which the criterion values for every employee must be calculated. This can introduce doubt during the assessment process and makes the process take much longer. Employee remuneration is determined by an assessment team on the basis of predetermined criteria, namely the ability to work, human relations, job responsibility, discipline, creativity, work, achievement of targets, and absence. To make the determination of employee remuneration more efficient and effective, the Simple Additive Weighting (SAW) method is used. The SAW method supports decision making for a given case: the alternative whose calculation generates the greatest value is chosen as the best. Besides SAW, the Composite Performance Index (CPI) method, a decision-making calculation based on a performance index, was also applied; the SAW method was 89-93% faster than the CPI method. It is therefore expected that this application can serve as evaluation material for the training and development needed to make employee performance more optimal.
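
    The SAW calculation itself is short: normalise each criterion score, multiply by the criterion weight, and sum, then rank alternatives by the total. The weights and scores below are invented placeholders, not the company's actual assessment data.

        # Invented example: criterion weights (summing to 1) and raw scores per employee.
        weights = {"ability": 0.3, "discipline": 0.2, "targets": 0.5}
        scores = {
            "employee_A": {"ability": 80, "discipline": 90, "targets": 70},
            "employee_B": {"ability": 75, "discipline": 85, "targets": 95},
        }

        # Benefit-type normalisation: divide by the maximum score per criterion.
        max_per_criterion = {c: max(s[c] for s in scores.values()) for c in weights}

        def saw_score(employee):
            return sum(weights[c] * scores[employee][c] / max_per_criterion[c]
                       for c in weights)

        # The alternative with the greatest value is chosen as the best.
        for name in sorted(scores, key=saw_score, reverse=True):
            print(name, round(saw_score(name), 3))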

  15. CALCULATIONAL TOOL FOR SKIN CONTAMINATION DOSE ESTIMATE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    HILL, R.L.

    2005-03-31

    A spreadsheet calculational tool was developed to automate the calculations performed for estimating dose from skin contamination. This document reports on the design and testing of the spreadsheet calculational tool.

  16. Use of a Spreadsheet to Calculate the Net Charge of Peptides and Proteins as a Function of pH: An Alternative to Using "Canned" Programs to Estimate the Isoelectric Point of These Important Biomolecules

    ERIC Educational Resources Information Center

    Sims, Paul A.

    2010-01-01

    An approach is presented that utilizes a spreadsheet to allow students to explore different means of calculating and visualizing how the charge on peptides and proteins varies as a function of pH. In particular, the concept of isoelectric point is developed to allow students to compare the results of their spreadsheet calculations with those of…
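
    The underlying calculation is a Henderson-Hasselbalch fractional-charge sum: each basic group contributes +1/(1 + 10^(pH - pKa)) and each acidic group contributes -1/(1 + 10^(pKa - pH)). A minimal sketch, assuming textbook pKa values (which vary by source) rather than the article's own spreadsheet:

        # Textbook pKa values (these vary by source) for ionisable groups.
        ACIDIC = {"C-term": 3.1, "Asp": 3.9, "Glu": 4.1, "Cys": 8.3, "Tyr": 10.1}
        BASIC = {"N-term": 8.0, "His": 6.0, "Lys": 10.5, "Arg": 12.5}

        def net_charge(counts, pH):
            """Net charge of a peptide given counts of each ionisable group."""
            charge = 0.0
            for group, n in counts.items():
                if group in BASIC:
                    charge += n / (1 + 10 ** (pH - BASIC[group]))
                elif group in ACIDIC:
                    charge -= n / (1 + 10 ** (ACIDIC[group] - pH))
            return charge

        # Example: free termini, one Lys and two Glu residues; the isoelectric
        # point is the pH at which the net charge crosses zero.
        counts = {"N-term": 1, "C-term": 1, "Lys": 1, "Glu": 2}
        for pH in (3.0, 5.0, 7.0, 9.0):
            print(pH, round(net_charge(counts, pH), 2))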

  17. Enabling Process Improvement and Control in Higher Education Management

    ERIC Educational Resources Information Center

    Bell, Gary; Warwick, Jon; Kennedy, Mike

    2009-01-01

    The emergence of "managerialism" in the governance and direction of UK higher education (HE) institutions has been led by government demands for greater accountability in the quality and cost of universities. There is emerging anecdotal evidence indicating that the estimation performance of HE spreadsheets and regression models is poor.…

  18. Teaching Accounting with Computers.

    ERIC Educational Resources Information Center

    Shaoul, Jean

    This paper addresses the numerous ways that computers may be used to enhance the teaching of accounting and business topics. It focuses on the pedagogical use of spreadsheet software to improve the conceptual coverage of accounting principles and practice, increase student understanding by involvement in the solution process, and reduce the amount…

  19. The Use of Microcomputers in Distance Teaching Systems. ZIFF Papiere 70.

    ERIC Educational Resources Information Center

    Rumble, Greville

    Microcomputers have revolutionized distance education in virtually every area. Used alone, personal computers provide students with a wide range of utilities, including word processing, graphics packages, and spreadsheets. When linked to a mainframe computer or connected to other personal computers in local area networks, microcomputers can…

  20. Nursing Faculty's Evaluations of Technology Integration into the Instructional Setting

    ERIC Educational Resources Information Center

    Yu, Weichieh Wayne; Wang, Jenny; Lin, Chunfu Charlie

    2013-01-01

    A descriptive and correlational study was conducted to assess teachers' perceived expertise in using word processing, spreadsheet, and presentation software applications to facilitate instruction in various nursing subjects. The participants were 313 full- and part-time teachers who taught primarily undergraduate classes and possessed necessary…

  1. Simple Numerical Analysis of Longboard Speedometer Data

    ERIC Educational Resources Information Center

    Hare, Jonathan

    2013-01-01

    Simple numerical data analysis is described, using a standard spreadsheet program, to determine distance, velocity (speed) and acceleration from voltage data generated by a skateboard/longboard speedometer (Hare 2012 "Phys. Educ." 47 409-17). This simple analysis is an introduction to data processing including scaling data as well as…
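
    The processing pipeline described, scaling voltage samples to speed and then integrating for distance and differencing for acceleration, is only a few lines in any language. The calibration constant and sample data below are invented placeholders, not values from the article:

        # Invented data: voltage samples at a fixed interval from the speedometer.
        DT = 0.1           # sample interval in seconds (assumed)
        VOLTS_TO_MS = 2.5  # invented calibration factor: volts -> metres per second
        voltages = [0.0, 0.4, 0.9, 1.3, 1.5, 1.6, 1.6]

        speeds = [v * VOLTS_TO_MS for v in voltages]

        # Distance: cumulative trapezoidal integration of speed over time.
        distance = 0.0
        for v0, v1 in zip(speeds, speeds[1:]):
            distance += 0.5 * (v0 + v1) * DT

        # Acceleration: central finite differences of speed.
        accel = [(speeds[i + 1] - speeds[i - 1]) / (2 * DT)
                 for i in range(1, len(speeds) - 1)]

        print(f"distance = {distance:.2f} m", [round(a, 2) for a in accel])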

  2. Use of Computer-Based Case Studies in a Problem-Solving Curriculum.

    ERIC Educational Resources Information Center

    Haworth, Ian S.; And Others

    1997-01-01

    Describes the use of three case studies, on computer, to enhance problem solving and critical thinking among doctoral pharmacy students in a physical chemistry course. Students are expected to use specific computer programs, spreadsheets, electronic mail, molecular graphics, word processing, online literature searching, and other computer-based…

  3. Handheld Computers in Education. Research Brief

    ERIC Educational Resources Information Center

    Education Partnerships, Inc., 2003

    2003-01-01

    Over the last 20 years, educators have been trying to find the best practice in using technology for student learning. Some of the most widely used applications with computers have been student learning of programming, word processing, Web research, spreadsheets, games, and Web design. The difficulty with integrating many of these activities…

  4. Fitting Planetary Orbits with a Spreadsheet.

    ERIC Educational Resources Information Center

    Bridges, Richard

    1995-01-01

    Describes how to fit binocular observations of the planets to a theoretical model of circular orbits using a modern computer spreadsheet, from which fundamental data about the solar system may be deduced. (AIM)

  5. A literature review of quantitative indicators to measure the quality of labor and delivery care.

    PubMed

    Tripathi, Vandana

    2016-02-01

    Strengthening measurement of the quality of labor and delivery (L&D) care in low-resource countries requires an understanding of existing approaches. The objective was to identify quantitative indicators of L&D care quality and to assess gaps in indicators. PubMed, CINAHL Plus, and Embase databases were searched for research published in English between January 1, 1990, and October 31, 2013, using structured terms. Studies describing indicators for L&D care quality assessment were included. Those whose abstracts met the inclusion criteria underwent full-text review. Study characteristics, including indicator selection and data sources, were extracted via a standard spreadsheet. The structured search identified 1224 studies. After abstract and full-text review, 477 were included in the analysis. Most studies selected indicators by using literature review, clinical guidelines, or expert panels. Few indicators were empirically validated; most studies relied on medical record review to measure indicators. Many quantitative indicators have been used to measure L&D care quality, but few have been validated beyond expert opinion. There has been limited use of clinical observation in quality assessment of care processes. The findings suggest the need for validated, efficient consensus indicators of the quality of L&D care processes, particularly in low-resource countries. Copyright © 2015 International Federation of Gynecology and Obstetrics. Published by Elsevier Ireland Ltd. All rights reserved.

  6. An X Window system for statlab results reporting.

    PubMed Central

    Barrows, R. C.; Allen, B.; Fink, D. J.

    1993-01-01

    We have developed a system that receives "stat" results encoded in Health Level Seven from the Laboratory Information System, prints a report in destination Intensive Care Units (ICUs), and captures the data for review in a custom spreadsheet format at color X-terminals located in ICUs. Available services include a reference nomogram plot of arterial blood gas data, printed summaries, automated access to the Clinical Information System and a Medline database, electronic mail, a simulated electronic calculator, and general news and information. Security mechanisms include an audit trail of user activities on the system. Noteworthy technical aspects and non-technical factors impacting success are discussed. PMID:8130490

  7. Documentation of spreadsheets for the analysis of aquifer-test and slug-test data

    USGS Publications Warehouse

    Halford, Keith J.; Kuniansky, Eve L.

    2002-01-01

    Several spreadsheets have been developed for the analysis of aquifer-test and slug-test data. Each spreadsheet incorporates analytical solution(s) of the partial differential equation for ground-water flow to a well for a specific type of condition or aquifer. The derivations of the analytical solutions were previously published. Thus, this report abbreviates the theoretical discussion, but includes practical information about each method and the important assumptions for the applications of each method. These spreadsheets were written in Microsoft Excel 9.0 (use of trade names does not constitute endorsement by the USGS). Storage properties should not be estimated with many of the spreadsheets because most are for analyzing single-well tests. Estimation of storage properties from single-well tests is generally discouraged because single-well tests are affected by wellbore storage and by well construction. These non-ideal effects frequently cause estimates of storage to be erroneous by orders of magnitude. Additionally, single-well tests are not sensitive to aquifer-storage properties. Single-well tests include all slug tests (Bouwer and Rice method; Cooper, Bredehoeft, and Papadopulos method; and van der Kamp method), the Cooper-Jacob straight-line method, Theis recovery-data analysis, the Jacob-Lohman method for flowing wells in a confined aquifer, and the step-drawdown test. Multi-well test spreadsheets included in this report are the Hantush-Jacob leaky aquifer method and distance-drawdown methods. The distance-drawdown method is an equilibrium or steady-state method, thus storage cannot be estimated.
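
    As one example of the analytical solutions these spreadsheets implement, the Cooper-Jacob straight-line method estimates transmissivity from the drawdown change per log cycle of time on a semilog plot, T = 2.3Q / (4*pi*ds). A minimal sketch with invented test values:

        import math

        def cooper_jacob_transmissivity(Q, ds_per_log_cycle):
            """Transmissivity T = 2.3 Q / (4 pi ds), with Q the pumping rate
            [m^3/d] and ds the drawdown change per log10 cycle of time [m]."""
            return 2.3 * Q / (4 * math.pi * ds_per_log_cycle)

        # Invented test: pumping at 500 m^3/d, drawdown steepening by 0.8 m
        # per tenfold increase in time on the semilog plot.
        print(f"T = {cooper_jacob_transmissivity(500.0, 0.8):.1f} m^2/d")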

  8. The meaning of diagnostic test results: a spreadsheet for swift data analysis.

    PubMed

    Maceneaney, P M; Malone, D E

    2000-03-01

    To design a spreadsheet program to: (a) rapidly analyse diagnostic test result data produced in local research or reported in the literature; (b) correct reported predictive values for disease prevalence in any population; (c) estimate the post-test probability of disease in individual patients. Microsoft Excel(TM) was used. Section A: a contingency (2 x 2) table was incorporated into the spreadsheet. Formulae for standard calculations [sample size, disease prevalence, sensitivity and specificity with 95% confidence intervals, predictive values and likelihood ratios (LRs)] were linked to this table. The results change automatically when the data in the true or false negative and positive cells are changed. Section B: this estimates predictive values in any population, compensating for altered disease prevalence. Sections C-F: Bayes' theorem was incorporated to generate individual post-test probabilities. The spreadsheet generates 95% confidence intervals, LRs and a table and graph of conditional probabilities once the sensitivity and specificity of the test are entered. The latter shows the expected post-test probability of disease for any pre-test probability when a test of known sensitivity and specificity is positive or negative. This spreadsheet can be used on desktop and palmtop computers. The MS Excel(TM) version can be downloaded via the Internet from the URL ftp://radiography.com/pub/Rad-data99.xls. A spreadsheet is useful for contingency table data analysis and assessment of the clinical meaning of diagnostic test results. Copyright 2000 The Royal College of Radiologists.
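
    The quantities in Section A and the Bayesian update in Sections C-F follow directly from the 2 x 2 table: sensitivity = TP/(TP+FN), specificity = TN/(TN+FP), LR+ = sensitivity/(1 - specificity), and post-test odds = pre-test odds x LR. A minimal sketch with invented counts (not the Excel workbook itself):

        def test_stats(tp, fp, fn, tn):
            sens = tp / (tp + fn)
            spec = tn / (tn + fp)
            return {"sensitivity": sens, "specificity": spec,
                    "LR+": sens / (1 - spec), "LR-": (1 - sens) / spec}

        def post_test_probability(pre_test_p, lr):
            """Bayes' theorem in odds form: post-odds = pre-odds * LR."""
            odds = pre_test_p / (1 - pre_test_p) * lr
            return odds / (1 + odds)

        stats = test_stats(tp=90, fp=15, fn=10, tn=85)  # invented 2 x 2 counts
        print(stats)
        print("post-test P(disease | +):",
              round(post_test_probability(0.30, stats["LR+"]), 3))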

  9. Comparing records with related chronologies

    NASA Astrophysics Data System (ADS)

    Bronk Ramsey, Christopher; Albert, Paul; Kearney, Rebecca; Staff, Richard A.

    2016-04-01

    In order to integrate ice, terrestrial and marine records, it is necessary to deal with records on different timescales. These timescales can be grouped into those that use a common fundamental chronometer (such as Uranium-Thorium dating or Radiocarbon) and can also be related to one another where we have chronological tie points such as tephra horizons. More generally we can, through a number of different methodologies, derive relationships between different timescales. A good example of this is the use of cosmogenic isotope production, specifically 10Be and 14C to relate the calibrated radiocarbon timescale to that of the Greenland ice cores. The relationships between different timescales can be mathematically expressed in terms of time-transfer functions. This formalism allows any related record to be considered against any linked timescale with an appropriate associated uncertainty. The prototype INTIMATE chronological database allows records to be viewed and compared in this way and this is now being further developed, both to include a wider range of records and also to provide better connectivity to other databases and chronological tools. These developments will also include new ways to use tephra tie-points to constrain the relationship between timescales directly, without needing to remodel each associated timescale. The database as it stands allows data for particular timeframes to be recalled and plotted against any timescale, or exported in spreadsheet format. New functionality will be added to allow users to work with their own data in a private space and then to publish it when it has been through the peer-review publication process. In order to make the data easier to use for other further analysis and plotting, and with data from other sources, the database will also act as a server to deliver data in a JSON format. The aim of this work is to make the comparison of integrated data much easier for researchers and to ensure that good practice in qualifying chronological uncertainty in record comparison is much more widespread.

  10. Alaska Geochemical Database - Mineral Exploration Tool for the 21st Century - PDF of presentation

    USGS Publications Warehouse

    Granitto, Matthew; Schmidt, Jeanine M.; Labay, Keith A.; Shew, Nora B.; Gamble, Bruce M.

    2012-01-01

    The U.S. Geological Survey has created a geochemical database of geologic material samples collected in Alaska. This database is readily accessible to anyone with access to the Internet. Designed as a tool for mineral or environmental assessment, land management, or mineral exploration, the initial version of the Alaska Geochemical Database - U.S. Geological Survey Data Series 637 - contains geochemical, geologic, and geospatial data for 264,158 samples collected from 1962-2009: 108,909 rock samples; 92,701 sediment samples; 48,209 heavy-mineral-concentrate samples; 6,869 soil samples; and 7,470 mineral samples. In addition, the Alaska Geochemical Database contains mineralogic data for 18,138 nonmagnetic-fraction heavy mineral concentrates, making it the first U.S. Geological Survey database of this scope that contains both geochemical and mineralogic data. Examples from the Alaska Range will illustrate potential uses of the Alaska Geochemical Database in mineral exploration. Data from the Alaska Geochemical Database have been extensively checked for accuracy of sample media description, sample site location, and analytical method using U.S. Geological Survey sample-submittal archives and U.S. Geological Survey publications (plus field notebooks and sample site compilation base maps from the Alaska Technical Data Unit in Anchorage, Alaska). The database is also the repository for nearly all previously released U.S. Geological Survey Alaska geochemical datasets. Although the Alaska Geochemical Database is a fully relational database in Microsoft® Access 2003 and 2010 formats, these same data are also provided as a series of spreadsheet files in Microsoft® Excel 2003 and 2010 formats, and as ASCII text files. A DVD version of the Alaska Geochemical Database was released in October 2011, as U.S. Geological Survey Data Series 637, and data downloads are available at http://pubs.usgs.gov/ds/637/. Also, all Alaska Geochemical Database data have been incorporated into the interactive U.S. Geological Survey Mineral Resource Data web portal, available at http://mrdata.usgs.gov/.

  11. Integrating Variances into an Analytical Database

    NASA Technical Reports Server (NTRS)

    Sanchez, Carlos

    2010-01-01

    For this project, I enrolled in numerous SATERN courses that taught the basics of database programming. These include: Basic Access 2007 Forms, Introduction to Database Systems, Overview of Database Design, and others. My main job was to create an analytical database that can handle many stored forms and make it easy to interpret and organize. Additionally, I helped improve an existing database and populate it with information. These databases were designed to be used with data from Safety Variances and DCR forms. The research consisted of analyzing the database and comparing the data to find out which entries were repeated the most. If an entry happened to be repeated several times in the database, that would mean that the rule or requirement targeted by that variance has been bypassed many times already and so the requirement may not really be needed, but rather should be changed to allow the variance's conditions permanently. This project did not only restrict itself to the design and development of the database system, but also worked on exporting the data from the database to a different format (e.g. Excel or Word) so it could be analyzed in a simpler fashion. Thanks to the change in format, the data was organized in a spreadsheet that made it possible to sort the data by categories or types and helped speed up searches. Once my work with the database was done, the records of variances could be arranged so that they were displayed in numerical order, or one could search for a specific document targeted by the variances and restrict the search to only include variances that modified a specific requirement. A great part that contributed to my learning was SATERN, NASA's resource for education. Thanks to the SATERN online courses I took over the summer, I was able to learn many new things about computers and databases and also go more in depth into topics I already knew about.

  12. Spreadsheets in Science Teaching.

    ERIC Educational Resources Information Center

    Elliot, Chris

    1988-01-01

    Described is the use of a spreadsheet to model dynamic phenomena using numerical iterative methods. Uses the discharge of a capacitor, simple and damped harmonic motion, and the flow of heat along a bar as examples. (Author/CW)
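
    The capacitor-discharge example translates to a one-line iteration: each spreadsheet row computes V <- V - (V/RC)*dt, a forward-Euler step of dV/dt = -V/RC. A minimal sketch with invented component values, printed against the exact exponential for comparison:

        import math

        R, C = 10e3, 100e-6  # invented values: 10 kOhm and 100 uF, so RC = 1 s
        DT = 0.05            # time step in seconds
        V = 10.0             # initial capacitor voltage

        t = 0.0
        for step in range(41):
            if step % 10 == 0:
                exact = 10.0 * math.exp(-t / (R * C))
                print(f"t = {t:.2f} s  Euler V = {V:.3f}  exact V = {exact:.3f}")
            V -= (V / (R * C)) * DT  # the spreadsheet's row-to-row update
            t += DT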

  13. Spreadsheet Works: Graphing Functions on a Spreadsheet.

    ERIC Educational Resources Information Center

    Ramamurthi, V. S.

    1989-01-01

    Explains graphing functions when using LOTUS 1-2-3. Provides examples and explains keystroke entries needed to make the graphs. Notes up to six functions can be displayed on the same set of axes. (MVL)

  14. Fitting Orbits to Jupiter's Moons with a Spreadsheet.

    ERIC Educational Resources Information Center

    Bridges, Richard

    1995-01-01

    Describes how a spreadsheet is used to fit a circular orbit model to observations of Jupiter's moons made with a small telescope. Kepler's Third Law and the inverse square law of gravity are observed. (AIM)

  15. Cold Climate Foundation Retrofit Experimental Hygrothermal Performance: Cloquet Residential Research Facility Laboratory Results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldberg, Louise F.; Harmon, Anna C.

    2015-04-01

    Thermal and moisture problems in existing basements create a unique challenge because the exterior face of the wall is not easily or inexpensively accessible. This approach addresses thermal and moisture management from the interior face of the wall without disturbing the exterior soil and landscaping, and has the potential for improving durability, comfort, and indoor air quality. This project was funded jointly by the National Renewable Energy Laboratory (NREL) and Oak Ridge National Laboratory (ORNL). ORNL focused on developing a full basement wall system experimental database to enable others to validate hygrothermal simulation codes. NREL focused on testing the moisture durability of practical basement wall interior insulation retrofit solutions for cold climates. The project has produced a physically credible and reliable long-term hygrothermal performance database for retrofit foundation wall insulation systems in zone 6 and 7 climates that are fully compliant with the performance criteria in the 2009 Minnesota Energy Code. The experimental data were configured into a standard format that can be published online and that is compatible with standard commercially available spreadsheet and database software.

  16. Evolution of a Structure-Searchable Database into a Prototype for a High-Fidelity SmartPhone App for 62 Common Pesticides Used in Delaware.

    PubMed

    D'Souza, Malcolm J; Barile, Benjamin; Givens, Aaron F

    2015-05-01

    Synthetic pesticides are widely used in the modern world for human benefit. They are usually classified according to their intended pest target. In Delaware (DE), approximately 42 percent of the arable land is used for agriculture. In order to manage insectivorous and herbaceous pests (such as insects, weeds, nematodes, and rodents), pesticides are used profusely to biologically control the normal pest's life stage. In this undergraduate project, we first created a usable relational database containing 62 agricultural pesticides that are common in Delaware. Chemically pertinent quantitative and qualitative information was first stored in Bio-Rad's KnowItAll® Informatics System. Next, we extracted the data out of the KnowItAll® system and created additional sections on a Microsoft® Excel spreadsheet detailing pesticide use(s) and safety and handling information. Finally, in an effort to promote good agricultural practices, to increase efficiency in business decisions, and to make pesticide data globally accessible, we developed a mobile application for smartphones that displayed the pesticide database using Appery.io™; a cloud-based HyperText Markup Language (HTML5), jQuery Mobile and Hybrid Mobile app builder.

  17. Progress in 1988-1990 with computer applications in the "hard-rock" arena: Geochemistry, mineralogy, petrology, and volcanology

    NASA Astrophysics Data System (ADS)

    Rock, Nicholas M. S.

    This review covers rock, mineral and isotope geochemistry, mineralogy, igneous and metamorphic petrology, and volcanology. Crystallography, exploration geochemistry, and mineral exploration are excluded. Fairly extended comments on software availability, and on computerization of the publication process and of specimen collection indexes, may interest a wider audience. A proliferation of both published and commercial software in the past 3 years indicates increasing interest in what traditionally has been a rather reluctant sphere of geoscience computer activity. However, much of this software duplicates the same old functions (Harker and triangular plots, mineral recalculations, etc.). It usually is more efficient nowadays to use someone else's program, or to employ the command language in one of many general-purpose spreadsheet or statistical packages available, than to program a specialist operation from scratch in, say, FORTRAN. Greatest activity has been in mineralogy, where several journals specifically encourage publication of computer-related activities, and IMA and MSA Working Groups on microcomputers have been convened. In petrology and geochemistry, large national databases of rock and mineral analyses continue to multiply, whereas the international database IGBA grows slowly; some form of integration is necessary to make these disparate systems of lasting value to the global "hard-rock" community. Total merging or separate addressing via an intelligent "front-end" are both possibilities. In volcanology, the BBC's videodisk Volcanoes and the Smithsonian Institution's Global Volcanism Project use the most up-to-date computer technology in an exciting and innovative way, to promote public education.

  18. Software validation applied to spreadsheets used in laboratories working under ISO/IEC 17025

    NASA Astrophysics Data System (ADS)

    Banegas, J. M.; Orué, M. W.

    2016-07-01

    Several documents deal with software validation. Nevertheless, most are too complex to be applied to validating spreadsheets - surely the most used software in laboratories working under ISO/IEC 17025. The method proposed in this work is intended to be applied directly to validate spreadsheets. It includes a systematic way to document requirements, operational aspects of validation, and a simple method to keep records of validation results and modification history. This method is currently being used in an accredited calibration laboratory, where it has proved practical and efficient.

  19. LICSS - a chemical spreadsheet in microsoft excel

    PubMed Central

    2012-01-01

    Background: Representations of chemical datasets in spreadsheet format are important for ready data assimilation and manipulation. In addition to the normal spreadsheet facilities, chemical spreadsheets need to have visualisable chemical structures and data searchable by chemical as well as textual queries. Many such chemical spreadsheet tools are available, some operating in the familiar Microsoft Excel environment. However, within this group, the performance of Excel is often compromised, particularly in terms of the number of compounds which can usefully be stored on a sheet. Summary: LICSS is a lightweight chemical spreadsheet within Microsoft Excel for Windows. LICSS stores structures solely as Smiles strings. Chemical operations are carried out by calling Java code modules which use the CDK, JChemPaint and OPSIN libraries to provide cheminformatics functionality. Compounds in sheets or charts may be visualised (individually or en masse), and sheets may be searched by substructure or similarity. All the molecular descriptors available in CDK may be calculated for compounds (in batch or on-the-fly), and various cheminformatic operations such as fingerprint calculation, Sammon mapping, clustering and R group table creation may be carried out. We detail here the features of LICSS and how they are implemented. We also explain the design criteria, particularly in terms of potential corporate use, which led to this particular implementation. Conclusions: LICSS is an Excel-based chemical spreadsheet with a difference:
    • It can usefully be used on sheets containing hundreds of thousands of compounds; it doesn't compromise the normal performance of Microsoft Excel
    • It is designed to be installed and run in environments in which users do not have admin privileges; installation involves merely file copying, and sharing of LICSS sheets invokes automatic installation
    • It is free and extensible
    LICSS is open source software and we hope sufficient detail is provided here to enable developers to add their own features and share with the community. PMID:22301088

  20. LICSS - a chemical spreadsheet in microsoft excel.

    PubMed

    Lawson, Kevin R; Lawson, Jonty

    2012-02-02

    Representations of chemical datasets in spreadsheet format are important for ready data assimilation and manipulation. In addition to the normal spreadsheet facilities, chemical spreadsheets need to have visualisable chemical structures and data searchable by chemical as well as textual queries. Many such chemical spreadsheet tools are available, some operating in the familiar Microsoft Excel environment. However, within this group, the performance of Excel is often compromised, particularly in terms of the number of compounds which can usefully be stored on a sheet. LICSS is a lightweight chemical spreadsheet within Microsoft Excel for Windows. LICSS stores structures solely as Smiles strings. Chemical operations are carried out by calling Java code modules which use the CDK, JChemPaint and OPSIN libraries to provide cheminformatics functionality. Compounds in sheets or charts may be visualised (individually or en masse), and sheets may be searched by substructure or similarity. All the molecular descriptors available in CDK may be calculated for compounds (in batch or on-the-fly), and various cheminformatic operations such as fingerprint calculation, Sammon mapping, clustering and R group table creation may be carried out. We detail here the features of LICSS and how they are implemented. We also explain the design criteria, particularly in terms of potential corporate use, which led to this particular implementation. LICSS is an Excel-based chemical spreadsheet with a difference:
    • It can usefully be used on sheets containing hundreds of thousands of compounds; it doesn't compromise the normal performance of Microsoft Excel
    • It is designed to be installed and run in environments in which users do not have admin privileges; installation involves merely file copying, and sharing of LICSS sheets invokes automatic installation
    • It is free and extensible
    LICSS is open source software and we hope sufficient detail is provided here to enable developers to add their own features and share with the community.
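
    LICSS delegates its chemistry to Java modules built on CDK. Purely as an analogous illustration in Python, using RDKit (a different toolkit, not what LICSS uses), the same idea of substructure search over rows of SMILES strings looks like this:

        from rdkit import Chem  # third-party cheminformatics toolkit

        # Invented mini "sheet": each row holds a compound as a SMILES string.
        rows = {"row1": "CCO", "row2": "c1ccccc1O", "row3": "CC(=O)Oc1ccccc1C(=O)O"}

        query = Chem.MolFromSmarts("c1ccccc1")  # substructure query: benzene ring

        for row_id, smiles in rows.items():
            mol = Chem.MolFromSmiles(smiles)
            if mol is not None and mol.HasSubstructMatch(query):
                print(row_id, smiles, "matches")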

  1. Historical rock falls in Yosemite National Park, California (1857-2011)

    USGS Publications Warehouse

    Stock, Greg M.; Collins, Brian D.; Santaniello, David J.; Zimmer, Valerie L.; Wieczorek, Gerald F.; Snyder, James B.

    2013-01-01

    Inventories of rock falls and other types of landslides are valuable tools for improving understanding of these events. For example, detailed information on rock falls is critical for identifying mechanisms that trigger rock falls, for quantifying the susceptibility of different cliffs to rock falls, and for developing magnitude-frequency relations. Further, inventories can assist in quantifying the relative hazard and risk posed by these events over both short and long time scales. This report describes and presents the accompanying rock fall inventory database for Yosemite National Park, California. The inventory database documents 925 events spanning the period 1857–2011. Rock falls, rock slides, and other forms of slope movement represent a serious natural hazard in Yosemite National Park. Rock-fall hazard and risk are particularly relevant in Yosemite Valley, where glacially steepened granitic cliffs approach 1 km in height and where the majority of the approximately 4 million yearly visitors to the park congregate. In addition to damaging roads, trails, and other facilities, rock falls and other slope movement events have killed 15 people and injured at least 85 people in the park since the first documented rock fall in 1857. The accompanying report describes each of the organizational categories in the database, including event location, type of slope movement, date, volume, relative size, probable trigger, impact to humans, narrative description, references, and environmental conditions. The inventory database itself is contained in a Microsoft Excel spreadsheet (Yosemite_rock_fall_database_1857-2011.xlsx). Narrative descriptions of events are contained in the database, but are also provided in a more readable Adobe portable document format (pdf) file (Yosemite_rock_fall_database_narratives_1857-2011.pdf) available for download separate from the database.

  2. Integrating Computer Spreadsheet Modeling into a Microeconomics Curriculum: Principles to Managerial.

    ERIC Educational Resources Information Center

    Clark, Joy L.; Hegji, Charles E.

    1997-01-01

    Notes that using spreadsheets to teach microeconomics principles enables learning by doing in the exploration of basic concepts. Introduction of increasingly complex topics leads to exploration of theory and managerial decision making. (SK)

  3. Building Your Own Regression Model

    ERIC Educational Resources Information Center

    Horton, Robert M.; Phillips, Vicki; Kenelly, John

    2004-01-01

    Spreadsheets to explore regression with an algebra 2 class in a medium-sized rural high school are presented. The use of spreadsheets can help students develop sophisticated understanding of mathematical models and use them to describe real-world phenomena.

  4. Petrogenetic Modeling with a Spreadsheet Program.

    ERIC Educational Resources Information Center

    Holm, Paul Eric

    1988-01-01

    Describes how interactive programs for scientific modeling may be created by using spreadsheet software such as LOTUS 1-2-3. Lists the advantages of using this method. Discusses fractional distillation, batch partial melting, and combination models as examples. (CW)

  5. [Development of an Excel spreadsheet for meta-analysis of indirect and mixed treatment comparisons].

    PubMed

    Tobías, Aurelio; Catalá-López, Ferrán; Roqué, Marta

    2014-01-01

    Meta-analyses in clinical research usually aim to evaluate treatment efficacy and safety in direct comparison with a unique comparator. Indirect comparisons, using Bucher's method, can summarize primary data when information from direct comparisons is limited or nonexistent. Mixed comparisons allow combining estimates from direct and indirect comparisons, increasing statistical power. There is a need for simple applications for meta-analysis of indirect and mixed comparisons; these can easily be conducted using a Microsoft Office Excel spreadsheet. We developed a user-friendly spreadsheet for indirect and mixed comparisons, intended for clinical researchers who are interested in systematic reviews but not familiar with more advanced statistical packages. The proposed Excel spreadsheet for indirect and mixed comparisons can be of great use in clinical epidemiology to extend the knowledge provided by traditional meta-analysis when evidence from direct comparisons is limited or nonexistent.
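
    Bucher's method itself is simple arithmetic on the log scale: the indirect estimate of A versus B through a common comparator C is the difference of the two direct estimates, and its variance is the sum of their variances. A minimal Python sketch of the calculation such a spreadsheet automates (the input numbers below are illustrative, not data from the paper):

        import math

        def bucher_indirect(d_ac, se_ac, d_bc, se_bc):
            """Bucher indirect comparison of A vs B via common comparator C.
            Inputs are direct effects on the log scale (e.g. log odds
            ratios) with standard errors; returns estimate, SE, 95% CI."""
            d_ab = d_ac - d_bc
            se_ab = math.sqrt(se_ac**2 + se_bc**2)
            return d_ab, se_ab, (d_ab - 1.96 * se_ab, d_ab + 1.96 * se_ab)

        # Illustrative log odds ratios for A vs C and B vs C.
        d, se, (lo, hi) = bucher_indirect(-0.40, 0.15, -0.10, 0.20)
        print(f"log OR = {d:.2f}, SE = {se:.3f}, "
              f"95% CI = ({lo:.2f}, {hi:.2f}), OR = {math.exp(d):.2f}")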

  6. Measuring Assurance of Learning Goals: Effectiveness of Computer Training and Assessment Tools

    ERIC Educational Resources Information Center

    Murphy, Marianne C.; Sharma, Aditya; Rosso, Mark

    2012-01-01

    Teaching office applications such as word processing, spreadsheet and presentation skills has been widely debated regarding its necessity, extent and delivery method. Training and Assessment applications such as MyITLab, SAM, etc. are popular tools for training students and are particularly useful in measuring Assurance of Learning (AOL)…

  7. A Comparison of Student Perceptions of Their Computer Skills to Their Actual Abilities

    ERIC Educational Resources Information Center

    Grant, Donna M.; Malloy, Alisha D.; Murphy, Marianne C.

    2009-01-01

    In this technology intensive society, most students are required to be proficient in computer skills to compete in today's global job market. These computer skills usually consist of basic to advanced knowledge in word processing, presentation, and spreadsheet applications. In many U.S. states, students are required to demonstrate computer…

  8. BIOREACTOR ECONOMICS, SIZE AND TIME OF OPERATION (BEST) COMPUTER SIMULATOR FOR DESIGNING SULFATE-REDUCING BACTERIA FIELD BIOREACTORS

    EPA Science Inventory

    BEST (bioreactor economics, size and time of operation) is an Excel™ spreadsheet-based model that is used in conjunction with the public domain geochemical modeling software, PHREEQCI. The BEST model is used in the design process of sulfate-reducing bacteria (SRB) field bioreacto...

  9. Moving Fingers under a Stick: A Laboratory Activity

    ERIC Educational Resources Information Center

    Massalha, Taha; Lanir, Yuval; Gluck, Paul

    2011-01-01

    We consider a demonstration in which pupils alternately slide and stop their fingers under a long horizontal rod which they support. The changeover is described in terms of the relevant kinetic and static friction. We present a model calculation, performed on a spreadsheet, which clarifies the process and describes graphically the stepwise…

  10. Applying 'evidence-based medicine' theory to interventional radiology. Part 2: a spreadsheet for swift assessment of procedural benefit and harm.

    PubMed

    Maceneaney, P M; Malone, D E

    2000-12-01

    To design a spreadsheet program to rapidly analyse interventional radiology (IR) data, produced in local research or reported in the literature, using 'evidence-based medicine' (EBM) parameters of treatment benefit and harm. Microsoft Excel™ was used. The spreadsheet consists of three worksheets. The first shows the 'Levels of Evidence and Grades of Recommendations' that can be assigned to therapeutic studies as defined by the Oxford Centre for EBM. The second and third worksheets facilitate the EBM assessment of therapeutic benefit and harm. Validity criteria are described. These include the assessment of the adequacy of sample size in the detection of possible procedural complications. A contingency (2 x 2) table for raw data on comparative outcomes in treated patients and controls has been incorporated. Formulae for EBM calculations are related to these numerators and denominators in the spreadsheet. The parameters calculated are, for benefit: relative risk reduction, absolute risk reduction, and number needed to treat (NNT); for harm: relative risk, relative odds, and number needed to harm (NNH). Ninety-five per cent confidence intervals are calculated for all these indices. The results change automatically when the data in the therapeutic outcome cells are changed. A final section allows the user to correct the NNT or NNH in their application to individual patients. This spreadsheet can be used on desktop and palmtop computers. The MS Excel™ version can be downloaded via the Internet from the URL ftp://radiography.com/pub/TxHarm00.xls. A spreadsheet is useful for the rapid analysis of the clinical benefit and harm from IR procedures.
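
    The benefit indices named above all derive from the event rates in the 2 x 2 table, so the spreadsheet's core arithmetic is easy to mirror. A hedged Python sketch using the standard EBM formulas with a normal-approximation confidence interval for the absolute risk reduction (the counts are invented, not taken from the paper):

        import math

        def ebm_indices(a, b, c, d):
            """Benefit indices from a 2 x 2 table:
               a/b = treated with/without event, c/d = controls with/without."""
            eer = a / (a + b)        # experimental event rate
            cer = c / (c + d)        # control event rate
            arr = cer - eer          # absolute risk reduction
            se = math.sqrt(eer * (1 - eer) / (a + b) + cer * (1 - cer) / (c + d))
            return {"EER": eer, "CER": cer, "ARR": arr,
                    "RRR": arr / cer,            # relative risk reduction
                    "NNT": 1 / arr,              # number needed to treat
                    "RR": eer / cer,             # relative risk
                    "ARR 95% CI": (arr - 1.96 * se, arr + 1.96 * se)}

        # 10/100 events on treatment vs 20/100 in controls (illustrative).
        for name, value in ebm_indices(10, 90, 20, 80).items():
            print(name, value)

    The harm-side indices (relative risk, relative odds, NNH) follow the same pattern with the roles of the rates reversed, and the NNT confidence limits are the reciprocals of the ARR limits.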

  11. Do Vampires Exist? Using Spreadsheets To Investigate a Common Folktale.

    ERIC Educational Resources Information Center

    Drier, Hollylynne Stohl

    1999-01-01

    Describes the use of spreadsheets in a third grade class to teach basic mathematical concepts by investigating the existence of vampires. Incorporates addition and multiplication skills, patterning, variables, formulas, exponential growth, and proof by contradiction. (LRW)

  12. Standard Evaluation Procedures (SEPs) and Data Entry Spreadsheet Templates (DESTs) for Endocrine Disruptor Screening Program (EDSP) Tier 1 Assays

    EPA Pesticide Factsheets

    This page provides information and access to Standard Evaluation Procedures (SEPs) and Data Entry Spreadsheet Templates (DESTs) developed by EPA's Office of Chemical Safety and Pollution Prevention (OCSPP).

  13. Computer Corner: Spreadsheets, Power Series, Generating Functions, and Integers.

    ERIC Educational Resources Information Center

    Snow, Donald R.

    1989-01-01

    Implements a table algorithm on a spreadsheet program and obtains functions for several number sequences such as the Fibonacci and Catalan numbers. Considers other applications of the table algorithm to integers represented in various number bases. (YP)

  14. A new database sub-system for grain-size analysis

    NASA Astrophysics Data System (ADS)

    Suckow, Axel

    2013-04-01

    Detailed grain-size analyses of large depth profiles for palaeoclimate studies create large amounts of data. For instance, Novothny et al. (2011) presented a depth profile of grain-size analyses with 2 cm resolution and a total depth of more than 15 m, where each sample was measured with 5 repetitions on a Beckman Coulter LS13320 with 116 channels. This adds up to a total of more than four million numbers. Such amounts of data are not easily post-processed by spreadsheets or standard software; MS Access databases would also face serious performance problems. The poster describes a database sub-system dedicated to grain-size analyses. It expands the LabData database and laboratory management system published by Suckow and Dumke (2001). Compatibility with this very flexible database system makes it easy to import the grain-size data, provides the overall infrastructure for storing geographic context, and gives the ability to organize content, such as combining several samples into one set or project. It also allows easy export and direct plot generation of final data in MS Excel. The sub-system allows automated import of raw data from the Beckman Coulter LS13320 Laser Diffraction Particle Size Analyzer. During post-processing MS Excel is used as a data display, but no number crunching is implemented in Excel. Raw grain-size spectra can be exported and checked as number, surface, and volume fractions, while single spectra can be locked for further post-processing. From the spectra the usual statistical values (i.e., mean, median) can be computed, as well as fractions larger than a grain size, fractions smaller than a grain size, fractions between any two grain sizes, or any ratio of such values. These deduced values can be easily exported into Excel for one or more depth profiles. Such reprocessing of large amounts of data also allows new display possibilities: normally, depth profiles of grain-size data are displayed only with summarized parameters such as the clay content, sand content, etc., which show only part of the available information at each depth; alternatively, full spectra were displayed at a single depth. The new software allows display of the whole grain-size spectrum at each depth in a three-dimensional view. LabData and the grain-size subsystem are based on MS Access as front-end and MS SQL Server as back-end database systems. The SQL code for the data model, SQL Server procedures and triggers, and the MS Access Basic code for the front end are public domain code, published under the GNU GPL license agreement and available free of charge. References: Novothny, Á., Frechen, M., Horváth, E., Wacha, L., Rolf, C., 2011. Investigating the penultimate and last glacial cycles of the Süttő loess section (Hungary) with luminescence dating, high-resolution grain size, and magnetic susceptibility data. Quaternary International 234, 75-85. Suckow, A., Dumke, I., 2001. A database system for geochemical, isotope hydrological and geochronological laboratories. Radiocarbon 43, 325-337.
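
    The "deduced values" described in this record (fractions finer than a given size, percentiles such as the median) reduce to interpolation over one channelled spectrum. A small Python sketch under that reading; the channel edges and volume fractions are made-up illustrative numbers, not LS13320 output, and the interpolation is linear in grain size:

        def fraction_finer_than(size, edges, fractions):
            """Volume fraction finer than `size` (µm); `edges` holds the
            len(fractions) + 1 ascending channel boundaries."""
            total = 0.0
            for lo, hi, f in zip(edges, edges[1:], fractions):
                if size >= hi:
                    total += f
                elif size > lo:
                    total += f * (size - lo) / (hi - lo)
            return total

        def percentile(p, edges, fractions):
            """Grain size at cumulative volume fraction p (0.5 = median)."""
            cum = 0.0
            for lo, hi, f in zip(edges, edges[1:], fractions):
                if cum + f >= p:
                    return lo + (hi - lo) * (p - cum) / f
                cum += f
            return edges[-1]

        edges = [0.04, 2.0, 20.0, 63.0, 200.0]   # channel boundaries, µm
        fracs = [0.15, 0.45, 0.30, 0.10]         # volume fraction per channel
        print("clay (<2 µm):", fraction_finer_than(2.0, edges, fracs))
        print("median grain size (µm):", percentile(0.5, edges, fracs))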

  15. A Brief User's Guide to the Excel ® -Based DF Calculator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jubin, Robert T.

    2016-06-01

    To understand the importance of capturing penetrating forms of iodine as well as the other volatile radionuclides, a calculation tool was developed in the form of an Excel® spreadsheet to estimate the overall plant decontamination factor (DF). The tool requires the user to estimate splits of the volatile radionuclides within the major portions of the reprocessing plant, speciation of iodine, and individual DFs for each off-gas stream within the Used Nuclear Fuel reprocessing plant. The impact on the overall plant DF for each volatile radionuclide is then calculated by the tool based on the specific user choices. The Excel® spreadsheet tracks both elemental and penetrating forms of iodine separately and allows changes in the speciation of iodine at each processing step. It also tracks 3H, 14C and 85Kr. This document provides a basic user's guide to the manipulation of this tool.

  16. Relativity on a Spreadsheet.

    ERIC Educational Resources Information Center

    Carson, S. R.

    1998-01-01

    Presents a method for using spreadsheets to model special relativistic phenomena based on the connection between electric and magnetic fields in special relativity. Uses the time dilation equation to carry out transformations between reference frames that show the connection between the fields quantitatively. (DDR)

  17. Carbon footprint estimator, phase II : volume I - GASCAP model & volume II - technical appendices [technical brief].

    DOT National Transportation Integrated Search

    2014-03-01

    This study resulted in the development of the GASCAP model (the Greenhouse Gas Assessment : Spreadsheet for Transportation Capital Projects). This spreadsheet model provides a user-friendly interface for determining the greenhouse gas (GHG) emissions...

  18. Improving Students' Understanding of the Importance of Economic Consequences in Standard Setting: A Computerized Spreadsheet Tool.

    ERIC Educational Resources Information Center

    Ivancevich, Daniel M.; And Others

    1996-01-01

    Points out that political and economic pressures have sometimes caused the Financial Accounting Standards Board to alter standards. Presents a spreadsheet tool that demonstrates the economic consequences of adopting accounting standards. (SK)

  19. 76 FR 34124 - Civil Supersonic Aircraft Panel Discussion

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-10

    ... and continuing to the second line in the second column, the Web site address should read as follows: https://spreadsheets.google.com/spreadsheet/viewform?formkey=dEFEdlRnYzBiaHZtTUozTHVtbkF4d0E6MQ . [FR...

  20. Assessment of ODOT culvert load rating spreadsheets for use in Michigan.

    DOT National Transportation Integrated Search

    2013-01-01

    The project Assessment of ODOT Culvert Load Rating Spreadsheets for use in Michigan was : a short time-frame project funded by the Michigan Department of Transportation (MDOT) : through the Center for Structural Durability (CSD) at Michigan Tec...

  1. A TOOL FOR PLANNING AERIAL PHOTOGRAPHY

    EPA Science Inventory

    The U.S. EPA's Pacific Coastal Ecology Branch has developed a tool in the form of an Excel spreadsheet that facilitates planning aerial photography missions. The spreadsheet accepts various input parameters such as desired photo-scale and boundary coordinates of the stud...

  2. 2017 Annual Technology Baseline (ATB): Cost and Performance Data for Electricity Generation Technologies

    DOE Data Explorer

    Hand, Maureen; Augustine, Chad; Feldman, David; Kurup, Parthiv; Beiter, Philipp; O'Connor, Patrick

    2017-08-21

    Each year since 2015, NREL has presented the Annual Technology Baseline (ATB) in a spreadsheet that contains detailed cost and performance data (both current and projected) for renewable and conventional technologies. The spreadsheet includes a workbook for each technology. This spreadsheet provides data for the 2017 ATB. In this edition of the ATB, offshore wind power has been updated to include 15 technical resource groups. In addition, two options are now provided for representing market conditions for project financing: current market conditions and long-term historical conditions. For more information, see https://atb.nrel.gov/.

  3. Resource Economics

    NASA Astrophysics Data System (ADS)

    Conrad, Jon M.

    1999-10-01

    Resource Economics is a text for students with a background in calculus, intermediate microeconomics, and a familiarity with the spreadsheet software Excel. The book covers basic concepts, shows how to set up spreadsheets to solve dynamic allocation problems, and presents economic models for fisheries, forestry, nonrenewable resources, stock pollutants, option value, and sustainable development. Within the text, numerical examples are posed and solved using Excel's Solver. Through these examples and additional exercises at the end of each chapter, students can make dynamic models operational, develop their economic intuition, and learn how to set up spreadsheets for the simulation and optimization of resource and environmental systems.

  4. The Euler’s Graphical User Interface Spreadsheet Calculator for Solving Ordinary Differential Equations by Visual Basic for Application Programming

    NASA Astrophysics Data System (ADS)

    Gaik Tay, Kim; Cheong, Tau Han; Foong Lee, Ming; Kek, Sie Long; Abdul-Kahar, Rosmila

    2017-08-01

    In previous work on an Euler's spreadsheet calculator for solving an ordinary differential equation, Visual Basic for Applications (VBA) programming was used; however, a graphical user interface was not developed to capture users' input. This weakness may confuse users, since the input and output are displayed in the same worksheet. Besides, the existing Euler's spreadsheet calculator is not interactive, as there is no prompt message if there is a mistake in inputting the parameters. On top of that, there are no user instructions to guide users in inputting the derivative function. Hence, in this paper, we address these limitations by developing a user-friendly and interactive graphical user interface. This improvement aims to capture users' input with instructions and interactive error prompts, implemented in VBA. This Euler's graphical user interface spreadsheet calculator does not act as a black box, as users can click on any cell in the worksheet to see the formula used to implement the numerical scheme. In this way, it can enhance self-learning and life-long learning in implementing the numerical scheme in a spreadsheet and later in any programming language.
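
    The numerical scheme behind the calculator is the classical Euler method, y_{n+1} = y_n + h·f(x_n, y_n). A plain-Python rendering of the same stepping (the ODE y' = x + y with y(0) = 1 is an illustrative choice, not an example from the paper):

        def euler(f, x0, y0, h, n):
            """Advance y' = f(x, y) from (x0, y0) over n steps of size h."""
            xs, ys = [x0], [y0]
            for _ in range(n):
                y0 += h * f(x0, y0)   # Euler update
                x0 += h
                xs.append(x0)
                ys.append(y0)
            return xs, ys

        xs, ys = euler(lambda x, y: x + y, 0.0, 1.0, 0.1, 10)
        for x, y in zip(xs, ys):
            print(f"x = {x:.1f}   y ≈ {y:.4f}")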

  5. Improving Information Management at Mare Island Naval Shipyard.

    DTIC Science & Technology

    1987-03-01

    The PRIME ring is a token-type computer network linking five PRIME computers electronically. Uses of the PRIME net include news (a bulletin board), electronic mail, word processing, and data filing. The communications application is a group of general-purpose programs that includes word processing, electronic mail, and spreadsheet applications.

  6. Using a Spreadsheet To Explore Melting, Dissolving and Phase Diagrams.

    ERIC Educational Resources Information Center

    Goodwin, Alan

    2002-01-01

    Compares phase diagrams relating to the solubilities and melting points of various substances in textbooks with those generated by a spreadsheet using data from the literature. Argues that differences between the diagrams give rise to new chemical insights. (Author/MM)

  7. The Use of Lotus 1-2-3 Macros in Engineering Calculations.

    ERIC Educational Resources Information Center

    Rosen, Edward M.

    1990-01-01

    Described are the use of spreadsheet programs in chemical engineering calculations using Lotus 1-2-3 macros. Discusses the macro commands, subroutine operations, and solution of partial differential equation. Provides examples of the subroutine programs and spreadsheet solution. (YP)

  8. Academic Testing and Grading with Spreadsheet Software.

    ERIC Educational Resources Information Center

    Ho, James K.

    1987-01-01

    Explains how spreadsheet software can be used in the design and grading of academic tests and in assigning grades. Macro programs and menu-driven software are highlighted and an example using IBM PCs and Lotus 1-2-3 software is given. (Author/LRW)

  9. Digitizing Olin Eggen's Card Database

    NASA Astrophysics Data System (ADS)

    Crast, J.; Silvis, G.

    2017-06-01

    The goal of the Eggen Card Database Project is to recover as many of the photometric observations from Olin Eggen's Card Database as possible and preserve these observations, in digital forms that are accessible by anyone. Any observations of interest to the AAVSO will be added to the AAVSO International Database (AID). Given to the AAVSO on long-term loan by the Cerro Tololo Inter-American Observatory, the database is a collection of over 78,000 index cards holding all Eggen's observations made between 1960 and 1990. The cards were electronically scanned and the resulting 108,000 card images have been published as a series of 2,216 PDF files, which are available from the AAVSO web site. The same images are also stored in an AAVSO online database where they are indexed by star name and card content. These images can be viewed using the eggen card portal online tool. Eggen made observations using filter bands from five different photometric systems. He documented these observations using 15 different data recording formats. Each format represents a combination of filter magnitudes and color indexes. These observations are being transcribed onto spreadsheets, from which observations of value to the AAVSO are added to the AID. A total of 506 U, B, V, R, and I observations were added to the AID for the variable stars S Car and l Car. We would like the reader to search through the card database using the eggen card portal for stars of particular interest. If such stars are found and retrieval of the observations is desired, e-mail the authors, and we will be happy to help retrieve those data for the reader.

  10. Spreadsheet Toolkit for Ulysses Hi-Scale Measurements of Interplanetary Ions and Electrons

    NASA Astrophysics Data System (ADS)

    Reza, J. Z.; Lanzerotti, L. J.; Denker, C.; Patterson, D.; Armstrong, T. P.

    2004-05-01

    Throughout the entire Ulysses out-of-the-ecliptic solar polar mission, the Heliosphere Instrument for Spectra, Composition, and Anisotropy at Low Energies (HI-SCALE) has collected measurements of interplanetary ions and electrons. Time-series of electron and ion fluxes obtained since 1990 have been carefully calibrated and will be stored in a data management system, which will be publicly accessible via the WWW. The goal of the Virtual Solar Observatory (VSO) is to provide data uniformly and efficiently to a diverse user community. However, data dissemination can only be a first step, which has to be followed by a suite of data analysis tools that are tailored towards a diverse user community in science, technology, and education. The widespread use and familiarity of spreadsheets, which are available at low cost or open source for many operating systems, make them an interesting tool to investigate for the analysis of HI-SCALE data. The data are written in comma-separated values (CSV) format, which is commonly used in spreadsheet programs. CSV files can simply be linked as external data to spreadsheet templates, which in turn can be used to generate tables and figures of basic statistical properties and frequency distributions, temporal evolution of electron and ion spectra, comparisons of various energy channels, automatic detection of solar events, solar cycle variations, and space weather. Exploring spreadsheet-assisted data analysis in the context of information technology research, database information search and retrieval, and data visualization potentially impacts other VSO components, where diverse user communities are targeted. Finally, this presentation is the result of an undergraduate research project, which will allow us to evaluate the performance of user-based spreadsheet analysis "benchmarked" at the undergraduate skill level.
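
    Because the HI-SCALE products are plain CSV, the "basic statistical properties and frequency distributions" mentioned above need nothing more than a CSV reader. A self-contained Python sketch; the column names and values are hypothetical stand-ins for the real file layout:

        import csv, io, math
        from collections import Counter
        from statistics import mean, median

        raw = io.StringIO(          # inline stand-in for a HI-SCALE CSV file
            "time,ion_flux\n"
            "1990-11-01T00:00,120.0\n"
            "1990-11-01T01:00,95.5\n"
            "1990-11-01T02:00,4300.0\n"
            "1990-11-01T03:00,88.2\n"
        )

        fluxes = [float(row["ion_flux"]) for row in csv.DictReader(raw)]
        print("n =", len(fluxes), " mean =", mean(fluxes),
              " median =", median(fluxes))

        # Crude frequency distribution: sample counts per decade of flux.
        decades = Counter(math.floor(math.log10(f)) for f in fluxes if f > 0)
        for dec in sorted(decades):
            print(f"10^{dec}..10^{dec + 1}: {decades[dec]} samples")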

  11. Design and Evaluation of a Personal Diffusion Battery.

    PubMed

    Vosburgh, Donna J H; Klein, Timothy; Sheehan, Maura; Anthony, T Renee; Peters, Thomas M

    A four-stage personal diffusion battery (pDB) was designed and constructed to measure submicron particle size distributions. The pDB consisted of a screen-type diffusion battery, solenoid valve system, and electronic controller. A data inversion spreadsheet was created to solve for the number median diameter (NMD), geometric standard deviation (GSD), and particle number concentration of unimodal aerosols using stage number concentrations from the pDB combined with a handheld condensation particle counter (pDB+CPC). The inversion spreadsheet included particle entry losses, theoretical penetrations across screens, the detection efficiency of the CPC, and constraints so the spreadsheet solved to values within the pDB range. Size distribution parameters (NMD, GSD, and number concentration) measured with the pDB+CPC with inversion spreadsheet were within 25% of those measured with a scanning mobility particle sizer (SMPS) for 5 of 12 polydisperse combustion aerosols. For three tests conducted with propylene torch exhaust, the pDB+CPC with inversion spreadsheet successfully identified that the NMD was smaller than the constraint value of 16 nm. The ratio of the nanoparticle portion of the aerosol compared to the reference (R nano) was calculated to determine the ability of pDB+CPC with inversion spreadsheet to measure the nanoparticle portion of the aerosols. The R nano ranged from 0.87 to 1.01 when the inversion solved and from 0.06 to 2.01 when the inversion solved to a constraint. The pDB combined with CPC has limited use as a personal monitor but combining the pDB with a different detector would allow for the pDB to be used as a personal monitor.
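
    The data inversion described here amounts to a constrained least-squares fit of a unimodal lognormal distribution to the stage readings. The Python sketch below shows only that shape of calculation: the stage penetration curves are crude placeholders, not the theoretical screen penetrations or CPC detection efficiency the actual spreadsheet uses, and the 16 nm lower bound mirrors the constraint mentioned in the abstract:

        import math
        import numpy as np
        from scipy.optimize import minimize

        DIAMS = np.logspace(0.5, 3, 200)          # diameter grid, ~3-1000 nm

        def penetration(stage, d):
            """Placeholder curve: deeper stages pass fewer small particles."""
            return np.exp(-stage * (10.0 / d))

        def lognormal_pdf(d, nmd, gsd):
            s = math.log(gsd)
            return (np.exp(-np.log(d / nmd)**2 / (2 * s * s))
                    / (d * s * math.sqrt(2 * math.pi)))

        def model_counts(params, stages):
            """Predicted count at each stage for a lognormal aerosol."""
            nmd, gsd, n_tot = params
            pdf = lognormal_pdf(DIAMS, nmd, gsd)
            return [n_tot * np.trapz(penetration(s, DIAMS) * pdf, DIAMS)
                    for s in stages]

        def invert(measured, stages):
            """Fit (NMD, GSD, N) to stage counts by least squares."""
            loss = lambda p: sum((m - c)**2 for m, c
                                 in zip(measured, model_counts(p, stages)))
            return minimize(loss, x0=[50.0, 1.8, 1e4],
                            bounds=[(16, 500), (1.1, 3.0), (1.0, 1e7)]).x

        stages = [0, 1, 2, 3]
        synthetic = model_counts([80.0, 1.6, 5e3], stages)
        print(invert(synthetic, stages))          # ≈ [80, 1.6, 5000]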

  12. Design and Evaluation of a Personal Diffusion Battery

    PubMed Central

    Vosburgh, Donna J. H.; Klein, Timothy; Sheehan, Maura; Anthony, T. Renee; Peters, Thomas M.

    2016-01-01

    A four-stage personal diffusion battery (pDB) was designed and constructed to measure submicron particle size distributions. The pDB consisted of a screen-type diffusion battery, solenoid valve system, and electronic controller. A data inversion spreadsheet was created to solve for the number median diameter (NMD), geometric standard deviation (GSD), and particle number concentration of unimodal aerosols using stage number concentrations from the pDB combined with a handheld condensation particle counter (pDB+CPC). The inversion spreadsheet included particle entry losses, theoretical penetrations across screens, the detection efficiency of the CPC, and constraints so the spreadsheet solved to values within the pDB range. Size distribution parameters (NMD, GSD, and number concentration) measured with the pDB+CPC with inversion spreadsheet were within 25% of those measured with a scanning mobility particle sizer (SMPS) for 5 of 12 polydisperse combustion aerosols. For three tests conducted with propylene torch exhaust, the pDB+CPC with inversion spreadsheet successfully identified that the NMD was smaller than the constraint value of 16 nm. The ratio of the nanoparticle portion of the aerosol compared to the reference (R nano) was calculated to determine the ability of pDB+CPC with inversion spreadsheet to measure the nanoparticle portion of the aerosols. The R nano ranged from 0.87 to 1.01 when the inversion solved and from 0.06 to 2.01 when the inversion solved to a constraint. The pDB combined with CPC has limited use as a personal monitor but combining the pDB with a different detector would allow for the pDB to be used as a personal monitor. PMID:26900207

  13. Simplified risk assessment of noise induced hearing loss by means of 2 spreadsheet models.

    PubMed

    Lie, Arve; Engdahl, Bo; Tambs, Kristian

    2016-11-18

    The objective of this study has been to test 2 spreadsheet models to compare the observed with the expected hearing loss for a Norwegian reference population. The prevalence rates of the Norwegian and the National Institute for Occupational Safety and Health (NIOSH) definitions of hearing outcomes were calculated in terms of sex and age, 20-64 years old, for a screened (with no occupational noise exposure) (N = 18 858) and unscreened (N = 38 333) Norwegian reference population from the Nord-Trøndelag Hearing Loss Study (NTHLS). Based on the prevalence rates, 2 different spreadsheet models were constructed in order to compare the prevalence rates of various groups of workers with the expected rates. The spreadsheets were then tested on 10 different occupational groups with varying degrees of hearing loss as compared to a reference population. Hearing of office workers, train drivers, conductors and teachers differed little from the screened reference values based on the Norwegian and the NIOSH criterion. The construction workers, miners, farmers and military had an impaired hearing and railway maintenance workers and bus drivers had a mildly impaired hearing. The spreadsheet models give a valid assessment of the hearing loss. The use of spreadsheet models to compare hearing in occupational groups with that of a reference population is a simple and quick method. The results are in line with comparable hearing thresholds, and allow for significance testing. The method is believed to be useful for occupational health services in the assessment of risk of noise induced hearing loss (NIHL) and the preventive potential in groups of noise-exposed workers. Int J Occup Med Environ Health 2016;29(6):991-999. This work is available in Open Access model and licensed under a CC BY-NC 3.0 PL license.
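
    Comparing a group's observed hearing loss with the expectation from reference prevalence rates is, at bottom, indirect standardization: apply the reference rate of each sex/age stratum to the group's head-counts and sum. A sketch of that logic with invented numbers (not the NTHLS rates):

        REF_PREVALENCE = {            # reference hearing-loss rate per stratum
            ("M", "20-39"): 0.03, ("M", "40-64"): 0.15,
            ("F", "20-39"): 0.02, ("F", "40-64"): 0.10,
        }
        workers = {                   # head-counts of the noise-exposed group
            ("M", "20-39"): 120, ("M", "40-64"): 200,
            ("F", "20-39"): 30,  ("F", "40-64"): 50,
        }

        expected = sum(n * REF_PREVALENCE[s] for s, n in workers.items())
        observed = 52
        print(f"expected = {expected:.1f}, observed = {observed}, "
              f"ratio = {observed / expected:.2f}")   # ratio > 1: excess loss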

  14. Modeling the Monthly Water Balance of a First Order Coastal Forested Watershed

    Treesearch

    S. V. Harder; Devendra M. Amatya; T. J. Callahan; Carl C. Trettin

    2006-01-01

    A study has been conducted to evaluate a spreadsheet-based conceptual Thornthwaite monthly water balance model and the process-based DRAINMOD model for their reliability in predicting monthly water budgets of a poorly drained, first order forested watershed at the Santee Experimental Forest located along the Lower Coastal Plain of South Carolina. Measured precipitation...

  15. A New Spin on Miscue Analysis: Using Spider Charts to Web Reading Processes

    ERIC Educational Resources Information Center

    Wohlwend, Karen E.

    2012-01-01

    This article introduces a way of seeing miscue analysis data through a "spider chart", a readily available digital graphing tool that provides an effective way to visually represent readers' complex coordination of interrelated cueing systems. A spider chart is a standard feature in recent spreadsheet software that puts a new spin on miscue…

  16. A Spreadsheet-based GIS tool for planning aerial photography

    EPA Science Inventory

    The U.S.EPA's Pacific Coastal Ecology Branch has developed a tool which facilitates planning aerial photography missions. This tool is an Excel spreadsheet which accepts various input parameters such as desired photo-scale and boundary coordinates of the study area and compiles ...

  17. Implementing multiresolution models and families of models: from entity-level simulation to desktop stochastic models and "repro" models

    NASA Astrophysics Data System (ADS)

    McEver, Jimmie; Davis, Paul K.; Bigelow, James H.

    2000-06-01

    We have developed and used families of multiresolution and multiple-perspective models (MRM and MRMPM), both in our substantive analytic work for the Department of Defense and to learn more about how such models can be designed and implemented. This paper is a brief case history of our experience with a particular family of models addressing the use of precision fires in interdicting and halting an invading army. Our models were implemented as closed-form analytic solutions, in spreadsheets, and in the more sophisticated Analytica™ environment. We also drew on an entity-level simulation for data. The paper reviews the importance of certain key attributes of development environments (visual modeling, interactive languages, friendly use of array mathematics, facilities for experimental design and configuration control, statistical analysis tools, graphical visualization tools, interactive post-processing, and relational database tools). These can go a long way towards facilitating MRMPM work, but many of these attributes are not yet widely available (or available at all) in commercial model-development tools--especially for use with personal computers. We conclude with some lessons learned from our experience.

  18. Resource Economics

    NASA Astrophysics Data System (ADS)

    Conrad, Jon M.

    2000-01-01

    Resource Economics is a text for students with a background in calculus, intermediate microeconomics, and a familiarity with the spreadsheet software Excel. The book covers basic concepts, shows how to set up spreadsheets to solve dynamic allocation problems, and presents economic models for fisheries, forestry, nonrenewable resources, stock pollutants, option value, and sustainable development. Within the text, numerical examples are posed and solved using Excel's Solver. These problems help make concepts operational, develop economic intuition, and serve as a bridge to the study of real-world problems of resource management. Through these examples and additional exercises at the end of Chapters 1 to 8, students can make dynamic models operational, develop their economic intuition, and learn how to set up spreadsheets for the simulation and optimization of resource and environmental systems.

    • The book is unique in its use of spreadsheet software (Excel) to solve dynamic allocation problems
    • Conrad is co-author of a previous book for the Press on the subject for graduate students
    • The approach is extremely student-friendly, giving students the tools to apply research results to actual environmental issues

  19. Analytical resource assessment method for continuous (unconventional) oil and gas accumulations - The "ACCESS" Method

    USGS Publications Warehouse

    Crovelli, Robert A.; revised by Charpentier, Ronald R.

    2012-01-01

    The U.S. Geological Survey (USGS) periodically assesses petroleum resources of areas within the United States and the world. The purpose of this report is to explain the development of an analytic probabilistic method and spreadsheet software system called Analytic Cell-Based Continuous Energy Spreadsheet System (ACCESS). The ACCESS method is based upon mathematical equations derived from probability theory. The ACCESS spreadsheet can be used to calculate estimates of the undeveloped oil, gas, and NGL (natural gas liquids) resources in a continuous-type assessment unit. An assessment unit is a mappable volume of rock in a total petroleum system. In this report, the geologic assessment model is defined first, the analytic probabilistic method is described second, and the spreadsheet ACCESS is described third. In this revised version of Open-File Report 00-044 , the text has been updated to reflect modifications that were made to the ACCESS program. Two versions of the program are added as appendixes.

  20. Integrated Space Asset Management Database and Modeling

    NASA Technical Reports Server (NTRS)

    MacLeod, Todd; Gagliano, Larry; Percy, Thomas; Mason, Shane

    2015-01-01

    Effective Space Asset Management is one key to addressing the ever-growing issue of space congestion. It is imperative that agencies around the world have access to data regarding the numerous active assets and pieces of space junk currently tracked in orbit around the Earth. At the center of this issue is the effective management of data of many types related to orbiting objects. As the population of tracked objects grows, so too should the data management structure used to catalog technical specifications, orbital information, and metadata related to those populations. Marshall Space Flight Center's Space Asset Management Database (SAM-D) was implemented in order to effectively catalog a broad set of data related to known objects in space by ingesting information from a variety of databases and processing that data into useful technical information. Using the universal NORAD number as a unique identifier, the SAM-D processes two-line element data into orbital characteristics and cross-references this technical data with metadata related to functional status, country of ownership, and application category. The SAM-D began as an Excel spreadsheet and was later upgraded to an Access database. While SAM-D performs its task very well, it is limited by its current platform and is not available outside of the local user base. Further, while modeling and simulation can be powerful tools to exploit the information contained in SAM-D, the current system does not allow proper integration options for combining the data with both legacy and new M&S tools. This paper provides a summary of SAM-D development efforts to date and outlines a proposed data management infrastructure that extends SAM-D to support the larger data sets to be generated. A service-oriented architecture model using an information sharing platform named SIMON will allow it to easily expand to incorporate new capabilities, including advanced analytics, M&S tools, fusion techniques, and a user interface for visualizations. In addition, tight control of information sharing policy will increase confidence in the system, which would encourage industry partners to provide commercial data. Combined with the integration of new and legacy M&S tools, a SIMON-based architecture will provide a robust environment that can be extended and expanded indefinitely.

  1. Knowledge-based control of an adaptive interface

    NASA Technical Reports Server (NTRS)

    Lachman, Roy

    1989-01-01

    The analysis, development strategy, and preliminary design for an intelligent, adaptive interface is reported. The design philosophy couples knowledge-based system technology with standard human factors approaches to interface development for computer workstations. An expert system has been designed to drive the interface for application software. The intelligent interface will be linked to application packages, one at a time, that are planned for multiple-application workstations aboard Space Station Freedom. Current requirements call for most Space Station activities to be conducted at the workstation consoles. One set of activities will consist of standard data management services (DMS). DMS software includes text processing, spreadsheets, data base management, etc. Text processing was selected for the first intelligent interface prototype because text-processing software can be developed initially as fully functional but limited with a small set of commands. The program's complexity then can be increased incrementally. Models of the operator's behavior and three types of instructions to the underlying application software are included in the rule base. A conventional expert-system inference engine searches the data base for antecedents to rules and sends the consequents of fired rules as commands to the underlying software. Plans for putting the expert system on top of a second application, a database management system, will be carried out following behavioral research on the first application. The intelligent interface design is suitable for use with ground-based workstations now common in government, industrial, and educational organizations.

  2. (abstract) Generic Modeling of a Life Support System for Process Technology Comparisons

    NASA Technical Reports Server (NTRS)

    Ferrall, J. F.; Seshan, P. K.; Rohatgi, N. K.; Ganapathi, G. B.

    1993-01-01

    This paper describes a simulation model called the Life Support Systems Analysis Simulation Tool (LiSSA-ST), the spreadsheet program called the Life Support Systems Analysis Trade Tool (LiSSA-TT), and the Generic Modular Flow Schematic (GMFS) modeling technique. Results of using the LiSSA-ST and the LiSSA-TT will be presented for comparing life support systems and process technology options for a Lunar Base and a Mars Exploration Mission.

  3. Buffer$--An Economic Analysis Tool

    Treesearch

    Gary Bentrup

    2007-01-01

    Buffer$ is an economic spreadsheet tool for analyzing the cost-benefits of conservation buffers by resource professionals. Conservation buffers are linear strips of vegetation managed for multiple landowner and societal objectives. The Microsoft Excel based spreadsheet can calculate potential income derived from a buffer, including income from cost-share/incentive...

  4. A spreadsheet that calculates meteor orbits

    NASA Astrophysics Data System (ADS)

    Langbroek, M.

    2004-08-01

    The author has written an MS Excel spreadsheet application called Metorb08.xls which calculates a meteor's orbital elements from its apparent radiant position and initial speed. It can be downloaded from URL http://home.wanadoo.nl/marco.langbroek along with a suite of other meteor-related Excel applications.

  5. Teaching Science and Mathematics Subjects Using the Excel Spreadsheet Package

    ERIC Educational Resources Information Center

    Ibrahim, Dogan

    2009-01-01

    The teaching of scientific subjects usually require laboratories where students can put the theory they have learned into practice. Traditionally, electronic programmable calculators, dedicated software, or expensive software simulation packages, such as MATLAB have been used to simulate scientific experiments. Recently, spreadsheet programs have…

  6. Automated Formative Feedback and Summative Assessment Using Individualised Spreadsheet Assignments

    ERIC Educational Resources Information Center

    Blayney, Paul; Freeman, Mark

    2004-01-01

    This paper reports on the effects of automating formative feedback at the student's discretion and automating summative assessment with individualised spreadsheet assignments. Quality learning outcomes are achieved when students adopt deep approaches to learning (Ramsden, 2003). Learning environments designed to align assessment to learning…

  7. Introduction to Classroom Sprego

    ERIC Educational Resources Information Center

    Csernoch, Mária; Biró, Piroska

    2016-01-01

    Sprego is programming with spreadsheet functions. The present paper provides introductory Sprego examples which have so far only been available in Hungarian. Spreadsheet environments offer both a programming tool which best serves beginner and end-user programmers' interest, and an approach which lightens the burden of coding and language details.…

  8. Hydrogen Financial Analysis Scenario Tool (H2FAST) Documentation

    Science.gov Websites

    Documentation is provided for the web and spreadsheet versions of H2FAST: the H2FAST Web Tool User's Manual and the H2FAST Spreadsheet Tool User's Manual (draft). Send questions or feedback about H2FAST to H2FAST@nrel.gov.

  9. EasyDelta: A spreadsheet for kinetic modeling of the stable carbon isotope composition of natural gases

    NASA Astrophysics Data System (ADS)

    Zou, Yan-Rong; Wang, Lianyuan; Shuai, Yanhua; Peng, Ping'an

    2005-08-01

    A new kinetic model and an Excel© spreadsheet program for modeling the stable carbon isotope composition of natural gases are provided in this paper. The model and spreadsheet can be used to describe and predict the variation in the stable carbon isotope composition of natural gases under both experimental and geological conditions, as a function of heating temperature or geological time. It is a user-friendly, convenient tool for modeling isotope variation with time under experimental and geological conditions. The spreadsheet, based on experimental data, requires the input of the kinetic parameters of gaseous hydrocarbon generation. Some assumptions are made in this model: the conventional (non-isotope species) kinetic parameters represent the light isotope species; the initial isotopic value is the same for all parallel chemical reactions of gaseous hydrocarbon generation, for simplicity; the pre-exponential factor ratio, 13A/12A, is constant; and both heavy and light isotope species have similar activation energy distributions. These assumptions are common in modeling of isotope ratios. The spreadsheet is used to search for the best kinetic parameters of the heavy isotope species that minimize the error against experimental data, and then to extrapolate isotopic changes to the thermal history of sedimentary basins. A short calculation example on the variation in δ13C values of methane is provided in this paper to show application to geological conditions.
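
    Under the stated assumptions (shared activation-energy distribution, a constant pre-exponential factor ratio 13A/12A, one initial isotope value), the model is a sum of parallel first-order Arrhenius reactions evaluated twice, once per isotope species. A hedged Python sketch for the simple constant-temperature case; every parameter value below is illustrative, not taken from the paper:

        import math

        R_GAS = 8.314e-3        # kJ/(mol K)
        R_STD = 0.0112372       # 13C/12C of the PDB standard

        def converted(A, Ea, T, t):
            """Fraction reacted for one first-order channel at constant T."""
            return 1.0 - math.exp(-A * math.exp(-Ea / (R_GAS * T)) * t)

        def delta13C_gas(channels, A12, ratio_13A_12A, delta0, T, t):
            """Cumulative δ13C (‰) of gas generated up to time t.
            `channels` is a list of (precursor fraction, Ea in kJ/mol)."""
            r0 = R_STD * (1 + delta0 / 1000.0)       # initial 13C/12C
            light = sum(f * converted(A12, Ea, T, t) for f, Ea in channels)
            heavy = sum(f * converted(A12 * ratio_13A_12A, Ea, T, t)
                        for f, Ea in channels)
            return (r0 * heavy / light / R_STD - 1) * 1000.0

        channels = [(0.3, 210.0), (0.5, 225.0), (0.2, 240.0)]
        for hours in (1, 10, 100):
            d = delta13C_gas(channels, A12=1e14, ratio_13A_12A=0.9998,
                             delta0=-30.0, T=650.0, t=hours * 3600.0)
            print(f"t = {hours:>3} h   δ13C ≈ {d:.2f} ‰")

    Because the heavy species reacts slightly more slowly, early-generated gas comes out isotopically lighter than the precursor and δ13C climbs back toward the initial value as conversion completes.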

  10. Introducing Simulation via the Theory of Records

    ERIC Educational Resources Information Center

    Johnson, Arvid C.

    2011-01-01

    While spreadsheet simulation can be a useful method by which to help students to understand some of the more advanced concepts in an introductory statistics course, introducing the simulation methodology at the same time as these concepts can result in student cognitive overload. This article describes a spreadsheet model that has been…

  11. The Spreadsheet in an Educational Setting. Microcomputing Working Paper Series F 84-4.

    ERIC Educational Resources Information Center

    Wozny, Lucy

    This overview of a specific spreadsheet, Microsoft's Multiplan for the Apple Macintosh microcomputer, emphasizes specific features that are important to the academic community, including the mathematical functions of algebra, trigonometry, and statistical analysis. Additional features are summarized, including data formats for both numerical and…

  12. Forming Conjectures within a Spreadsheet Environment

    ERIC Educational Resources Information Center

    Calder, Nigel; Brown, Tony; Hanley, Una; Darby, Susan

    2006-01-01

    This paper is concerned with the use of spreadsheets within mathematical investigational tasks. Considering the learning of both children and pre-service teaching students, it examines how mathematical phenomena can be seen as a function of the pedagogical media through which they are encountered. In particular, it shows how pedagogical apparatus…

  13. Domestic Disasters and Geospatial Technology for the Defense Logistics Agency

    DTIC Science & Technology

    2014-12-01

    This report used the Vehicle Routing Problem (VRP) Spreadsheet Solver, developed by Erdogan (2013), to minimize total distance traveled and satisfy all fuel demands. Erdogan, G. (2013). VRP spreadsheet solver. Retrieved from VeRoLog: EURO Working Group on Vehicle Routing and Logistics

  14. Interactive Spreadsheets in JCE Webware

    ERIC Educational Resources Information Center

    Coleman, William F.; Fedosky, Edward W.

    2005-01-01

    A description of the Microsoft Excel spreadsheet simulation, Anharmonicity.xls that can be used to smoothly and continuously switch a plotted function and its quadratic approximation is presented. It can be used in a classroom demonstration or incorporated into a student-centered computer-laboratory exercise to examine the qualitative behavior of…

  15. Spreadsheet Applications: Prototyping an Innovative Blended Course

    ERIC Educational Resources Information Center

    Baker, J. Howard

    2004-01-01

    After teaching the advanced spreadsheet course at a major university in Louisiana as a traditional classroom course for a number of years, it was decided to create a prototype-blended course, with a considerable portion offered via distance education. This research, which uses a prototyping methodology, is exploratory in nature. Prototyping can…

  16. LOTUS 1-2-3 and Decision Support: Allocating the Monograph Budget.

    ERIC Educational Resources Information Center

    Perry-Holmes, Claudia

    1985-01-01

    Describes the use of electronic spreadsheet software for library decision support systems using personal computers. Discussion covers templates, formulas for allocating the materials budget, LOTUS 1-2-3 and budget allocations, choosing a formula, the spreadsheet itself, graphing capabilities, and advantages and disadvantages of templates. Six…

  17. Triangular Plots and Spreadsheet Software.

    ERIC Educational Resources Information Center

    Holm, Paul Eric

    1988-01-01

    Describes how the limitations of the built-in graphics capabilities of spreadsheet software can be overcome by making full use of the flexibility of the graphics options. Uses triangular plots with labeled field boundaries produced using Lotus 1-2-3 to demonstrate these techniques and their use in teaching geology. (CW)

  18. Calculating the Variables of Finance on a Spreadsheet.

    ERIC Educational Resources Information Center

    Rochowicz, John A., Jr.

    The different approaches for solving problems and learning mathematics with technology are invaluable. This paper describes how to determine the variables of the ordinary annuity equation with a spreadsheet. Examples of future value of annuity, sinking fund annuity, the number of periods necessary for periodic payments plus interest to accumulate…
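
    The ordinary-annuity relations involved are FV = PMT·((1 + r)^n − 1)/r and, solved for the number of periods, n = ln(1 + FV·r/PMT)/ln(1 + r). A short Python sketch of both directions (the dollar figures are an invented example):

        import math

        def annuity_fv(pmt, r, n):
            """Future value of an ordinary annuity (payments at period end)."""
            return pmt * ((1 + r)**n - 1) / r

        def periods_to_reach(fv, pmt, r):
            """Periods needed for payments plus interest to accumulate to fv."""
            return math.log(1 + fv * r / pmt) / math.log(1 + r)

        # $200/month at 6%/year compounded monthly for 10 years.
        fv = annuity_fv(200, 0.06 / 12, 120)
        print(round(fv, 2))                             # ≈ 32775.87
        print(round(periods_to_reach(fv, 200, 0.005)))  # 120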

  19. Hydroshear Simulation Lab Test 2

    DOE Data Explorer

    Bauer, Steve

    2014-08-01

    This data file is for test 2. In this test, a sample of granite with a pre-cut (man-made) fracture is confined, heated, and subjected to differential stress. The maximum temperature in this system development test is 95°C. Test details are on the spreadsheets; note that there are two spreadsheets.

  20. Buyers Guide: Communications Software--Overview; Ratings Digest; Reviews; Benchmarks.

    ERIC Educational Resources Information Center

    Lockwood, Russ; And Others

    1988-01-01

    Contains articles which review communications software. Includes "Crosstalk Mark 4," "ProComm," "Freeway Advanced," "Windows InTalk," "Relay Silver," and "Smartcom III." Compares in terms of Text Proprietary, MCI Upload, Text ASCII, Spreadsheet Proprietary, Text XMODEM, Spreadsheet XMODEM, MCI Download, documentation, support and service, ease of use,…

  1. Spreadsheet Analysis of Harvesting Systems

    Treesearch

    R.B. Rummer; B.L. Lanford

    1987-01-01

    Harvesting systems can be modeled and analyzed on microcomputers using commercially available "spreadsheet" software. The effect of system or external variables on the production rate or system cost can be evaluated and alternative systems can be easily examined. The tedious calculations associated with such analyses are performed by the computer. For users...

  2. Constructing Meanings and Utilities within Algebraic Tasks

    ERIC Educational Resources Information Center

    Ainley, Janet; Bills, Liz; Wilson, Kirsty

    2004-01-01

    The Purposeful Algebraic Activity project aims to explore the potential of spreadsheets in the introduction to algebra and algebraic thinking. We discuss two sub-themes within the project: tracing the development of pupils' construction of meaning for variable from arithmetic-based activity, through use of spreadsheets, and into formal algebra,…

  3. The Protein Identifier Cross-Referencing (PICR) service: reconciling protein identifiers across multiple source databases.

    PubMed

    Côté, Richard G; Jones, Philip; Martens, Lennart; Kerrien, Samuel; Reisinger, Florian; Lin, Quan; Leinonen, Rasko; Apweiler, Rolf; Hermjakob, Henning

    2007-10-18

    Each major protein database uses its own conventions when assigning protein identifiers. Resolving the various, potentially unstable, identifiers that refer to identical proteins is a major challenge. This is a common problem when attempting to unify datasets that have been annotated with proteins from multiple data sources or querying data providers with one flavour of protein identifiers when the source database uses another. Partial solutions for protein identifier mapping exist but they are limited to specific species or techniques and to a very small number of databases. As a result, we have not found a solution that is generic enough and broad enough in mapping scope to suit our needs. We have created the Protein Identifier Cross-Reference (PICR) service, a web application that provides interactive and programmatic (SOAP and REST) access to a mapping algorithm that uses the UniProt Archive (UniParc) as a data warehouse to offer protein cross-references based on 100% sequence identity to proteins from over 70 distinct source databases loaded into UniParc. Mappings can be limited by source database, taxonomic ID and activity status in the source database. Users can copy/paste or upload files containing protein identifiers or sequences in FASTA format to obtain mappings using the interactive interface. Search results can be viewed in simple or detailed HTML tables or downloaded as comma-separated values (CSV) or Microsoft Excel (XLS) files suitable for use in a local database or a spreadsheet. Alternatively, a SOAP interface is available to integrate PICR functionality in other applications, as is a lightweight REST interface. We offer a publicly available service that can interactively map protein identifiers and protein sequences to the majority of commonly used protein databases. Programmatic access is available through a standards-compliant SOAP interface or a lightweight REST interface. The PICR interface, documentation and code examples are available at http://www.ebi.ac.uk/Tools/picr.

  4. The Protein Identifier Cross-Referencing (PICR) service: reconciling protein identifiers across multiple source databases

    PubMed Central

    Côté, Richard G; Jones, Philip; Martens, Lennart; Kerrien, Samuel; Reisinger, Florian; Lin, Quan; Leinonen, Rasko; Apweiler, Rolf; Hermjakob, Henning

    2007-01-01

    Background Each major protein database uses its own conventions when assigning protein identifiers. Resolving the various, potentially unstable, identifiers that refer to identical proteins is a major challenge. This is a common problem when attempting to unify datasets that have been annotated with proteins from multiple data sources or querying data providers with one flavour of protein identifiers when the source database uses another. Partial solutions for protein identifier mapping exist but they are limited to specific species or techniques and to a very small number of databases. As a result, we have not found a solution that is generic enough and broad enough in mapping scope to suit our needs. Results We have created the Protein Identifier Cross-Reference (PICR) service, a web application that provides interactive and programmatic (SOAP and REST) access to a mapping algorithm that uses the UniProt Archive (UniParc) as a data warehouse to offer protein cross-references based on 100% sequence identity to proteins from over 70 distinct source databases loaded into UniParc. Mappings can be limited by source database, taxonomic ID and activity status in the source database. Users can copy/paste or upload files containing protein identifiers or sequences in FASTA format to obtain mappings using the interactive interface. Search results can be viewed in simple or detailed HTML tables or downloaded as comma-separated values (CSV) or Microsoft Excel (XLS) files suitable for use in a local database or a spreadsheet. Alternatively, a SOAP interface is available to integrate PICR functionality in other applications, as is a lightweight REST interface. Conclusion We offer a publicly available service that can interactively map protein identifiers and protein sequences to the majority of commonly used protein databases. Programmatic access is available through a standards-compliant SOAP interface or a lightweight REST interface. The PICR interface, documentation and code examples are available at http://www.ebi.ac.uk/Tools/picr. PMID:17945017

  5. [Food and nutrition security policy in Brazil: an analysis of resource allocation].

    PubMed

    Custódio, Marta Battaglia; Yuba, Tânia Yuka; Cyrillo, Denise Cavallini

    2013-02-01

    To describe the progression and distribution of federal funds for programs and activities that fall within the scope of the guidelines of the Brazilian National Policy on Food and Nutrition Security (PNSAN) in the period from 2004 to 2010. This descriptive study used data from the Transparency Website maintained by the Brazilian Public Sector Internal Control Office. Search results were exported to Excel spreadsheets. To determine the resources allocated to food security initiatives, a database was set up containing all actions developed by the federal government between 2004 and 2010. This database was reviewed and the actions that were not related to PNSAN were discarded. The annual amounts obtained were corrected by the Consumer Price Index and updated for the year 2010. Since actions are part of specific programs, the sum of the resources allocated for all the actions of a program amounted to the resources invested in the program as a whole. The programs were then prioritized according to the amount of resources received in 2010. Of the 5 014 actions receiving federal funds in the study period, 814 were related to PNSAN (229 programs). There was growth in resources allocated for PNSAN programs, reaching US$ 15 billion in 2010 (an 82% increase over the previous year). The largest amount was invested in Bolsa Família, a cash transfer program. Ten programs received 90% of the funds, of which five were linked to food production processes. The amount of resources invested in the PNSAN and in actions and programs that promote food and nutrition security is increasing in Brazil.

  6. Wood fueled boiler financial feasibility user's manual

    Treesearch

    Robert Govett; Scott Bowe; Terry Mace; Steve Hubbard; John (Rusty) Dramm; Richard Bergman

    2005-01-01

    “Wood Fueled Boiler Financial Feasibility” is a spreadsheet program designed for easy use on a personal computer. This program provides a starting point for interested parties to perform financial feasibility analysis of a steam boiler system for space heating or process heat. By allowing users to input the conditions applicable to their current or proposed fuel...

  7. Users guide for noble fir bough cruiser.

    Treesearch

    Roger D. Fight; Keith A. Blatner; Roger C. Chapman; William E. Schlosser

    2005-01-01

    The bough cruiser spreadsheet was developed to provide a method for cruising noble fir (Abies procera Rehd.) stands to estimate the weight of boughs that might be harvested. No boughs are cut as part of the cruise process. The approach is based on a two-stage sample. The first stage consists of fixed-radius plots that are used to estimate the...

  8. Exploring the Role of Digital Data in Contemporary Schools and Schooling--"200,000 Lines in an Excel Spreadsheet"

    ERIC Educational Resources Information Center

    Selwyn, Neil; Henderson, Michael; Chao, Shu-Hua

    2015-01-01

    The generation, processing and circulation of data in digital form is now an integral aspect of contemporary schooling. Based upon empirical study of two secondary school settings in Australia, this paper considers the different forms of digitally-based "data work" engaged in by school leaders, managers, administrators and teachers. In…

  9. Cash Flow Statement Spreadsheet Modeling Case Using a Prototype System Development Process

    ERIC Educational Resources Information Center

    Davis, Jefferson T.

    2015-01-01

    U.S. GAAP and IFRS standards both require a cash flow statement that presents operating, investing and financing net cash flows (FASB, FAS 95; 1987; IASB, IAS 7, 1992). Although students are exposed to the cash flow statement in beginning accounting courses and then study the cash flow statement in more depth in intermediate accounting classes,…

  10. A Switching-Mode Power Supply Design Tool to Improve Learning in a Power Electronics Course

    ERIC Educational Resources Information Center

    Miaja, P. F.; Lamar, D. G.; de Azpeitia, M.; Rodriguez, A.; Rodriguez, M.; Hernando, M. M.

    2011-01-01

    The static design of ac/dc and dc/dc switching-mode power supplies (SMPS) relies on a simple but repetitive process. Although specific spreadsheets, available in various computer-aided design (CAD) programs, are widely used, they are difficult to use in educational applications. In this paper, a graphic tool programmed in MATLAB is presented,…

  11. Technology Focus: Using Technology to Promote Equity in Financial Decision Making

    ERIC Educational Resources Information Center

    Garofalo, Joe; Kitchell, Barbara Ann

    2010-01-01

    The process of borrowing money can be intimidating to some people. Many feel at the mercy of a loan officer and just accept terms and amounts at face value. A graphing calculator, or spreadsheet, with appropriate knowledge of how to use it, can be an empowering tool to help create a more equitable situation or circumstance. Given the proper…
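
    The empowerment argument is concrete: the payment a loan officer quotes can be checked with one formula. Below is a brief Python sketch of the standard fixed-rate amortization calculation, the same arithmetic behind a spreadsheet's PMT function (the loan figures are invented):

    # M = P * r * (1 + r)**n / ((1 + r)**n - 1), where P is the principal,
    # r the monthly rate, and n the number of monthly payments.
    def monthly_payment(principal, annual_rate, years):
        r = annual_rate / 12
        n = years * 12
        if r == 0:
            return principal / n
        return principal * r * (1 + r) ** n / ((1 + r) ** n - 1)

    # Example: comparing two terms offered for a $20,000 loan at 7% APR.
    print(round(monthly_payment(20000, 0.07, 5), 2))   # 5-year term
    print(round(monthly_payment(20000, 0.07, 6), 2))   # 6-year term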

  12. Active Learning and Student Engagement in the Business Curriculum: Excel Can Be the Answer

    ERIC Educational Resources Information Center

    McCloskey, Donna W.; Bussom, Lisa

    2013-01-01

    Business educators are struggling with how better to engage their students in the learning process. At the same time, stakeholders are reporting that business students are ill prepared in problem solving techniques and the effective use of spreadsheets. The systemic use of Excel as a teaching tool in the business curriculum may be the answer to…

  13. A web-based 3D geological information visualization system

    NASA Astrophysics Data System (ADS)

    Song, Renbo; Jiang, Nan

    2013-03-01

    The construction of 3D geological visualization systems has attracted growing attention in the GIS, computer modeling, simulation and visualization fields. Such systems not only support geological interpretation and analysis work but also help raise the standard of professional geoscience education. This paper introduces an applet-based method for developing a web-based 3D geological information visualization system, with the aim of exploring a rapid, low-cost development approach. First, borehole data stored in Excel spreadsheets were extracted and loaded into a SQL Server database on a web server. Second, a JDBC data access component provided the capability to access the database. Third, the user interface was implemented as an applet embedded in a JSP page, and the 3D viewing and querying functions were implemented with the PickCanvas of Java3D. Finally, borehole data acquired from a geological survey were used to test the system, and the results show that the methods described here have practical application value.
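
    The first two steps, extracting spreadsheet data and loading it into a server-side database, are easy to prototype outside the Java stack the paper uses. A minimal Python sketch (SQLite standing in for SQL Server; the file and column names are assumptions for illustration):

    import sqlite3
    import pandas as pd

    # Read borehole records from an Excel spreadsheet and load them into a
    # database table that the web tier can query.
    boreholes = pd.read_excel("boreholes.xlsx",
                              usecols=["hole_id", "depth_m", "lithology"])

    with sqlite3.connect("geology.db") as conn:
        boreholes.to_sql("borehole", conn, if_exists="replace", index=False)
        # The paper's JDBC/JSP layer would then issue queries like this one:
        rows = conn.execute(
            "SELECT depth_m, lithology FROM borehole "
            "WHERE hole_id = ? ORDER BY depth_m", ("BH-01",)).fetchall()
    print(rows)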

  14. A Database of Herbaceous Vegetation Responses to Elevated Atmospheric CO2 (NDP-073)

    DOE Data Explorer

    Jones, Michael H [The Ohio State Univ., Columbus, OH (United States); Curtis, Peter S [The Ohio State Univ., Columbus, OH (United States); Cushman, Robert M [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Brenkert, Antoinette L [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    1999-01-01

    To perform a statistically rigorous meta-analysis of research results on the response by herbaceous vegetation to increased atmospheric CO2 levels, a multiparameter database of responses was compiled from the published literature. Seventy-eight independent CO2-enrichment studies, covering 53 species and 26 response parameters, reported mean response, sample size, and variance of the response (either as standard deviation or standard error). An additional 43 studies, covering 25 species and 6 response parameters, did not report variances. This numeric data package accompanies the Carbon Dioxide Information Analysis Center's (CDIAC's) NDP-072, which provides similar information for woody vegetation. This numeric data package contains a 30-field data set of CO2-exposure experiment responses by herbaceous plants (as both a flat ASCII file and a spreadsheet file), files listing the references to the CO2-exposure experiments and specific comments relevant to the data in the data sets, and this documentation file (which includes SAS and Fortran codes to read the ASCII data file; SAS is a registered trademark of the SAS Institute, Inc., Cary, North Carolina 27511).

  15. Supplemental knowledge acquisition through external product interface for CLIPS

    NASA Technical Reports Server (NTRS)

    Saito, Tim; Ebaud, Stephen; Loftin, Bowen R.

    1990-01-01

    Traditionally, the acquisition of knowledge for expert systems consisted of the interview process with the domain or subject matter expert (SME), observation of the domain environment, and information gathering and research, which together constituted a direct form of knowledge acquisition (KA). The knowledge engineer would be responsible for accumulating pertinent information and/or knowledge from the SME(s) for input into the appropriate expert system development tool. The direct KA process may (or may not) have included forms of data or documentation drawn from the SME's surroundings. What differentiates direct KA from supplemental (indirect) KA is the use made of data. In acquiring supplemental knowledge, the knowledge engineer accesses other types of evidence (manuals, documents, data files, spreadsheets, etc.) that support the reasoning or premises of the SME. When an expert makes a decision in a particular task, one tool used to justify a recommendation might be a spreadsheet total or column figure. Locating specific decision points from that data within the SME's framework constitutes supplemental KA. Data used for a specific purpose in one system or environment can thus serve as supplemental knowledge for another, in this case a CLIPS project.

  16. Forming conjectures within a spreadsheet environment

    NASA Astrophysics Data System (ADS)

    Calder, Nigel; Brown, Tony; Hanley, Una; Darby, Susan

    2006-12-01

    This paper is concerned with the use of spreadsheets within mathematical investigational tasks. Considering the learning of both children and pre-service teaching students, it examines how mathematical phenomena can be seen as a function of the pedagogical media through which they are encountered. In particular, it shows how pedagogical apparatus influence patterns of social interaction, and how this interaction shapes the mathematical ideas that are engaged with. Notions of conjecture, along with the particular faculty of the spreadsheet setting, are considered with regard to the facilitation of mathematical thinking. Employing an interpretive perspective, a key focus is on how alternative pedagogical media and associated discursive networks influence the way that students form and test informal conjectures.

  17. BiKEGG: a COBRA toolbox extension for bridging the BiGG and KEGG databases.

    PubMed

    Jamialahmadi, Oveis; Motamedian, Ehsan; Hashemi-Najafabadi, Sameereh

    2016-10-18

    Development of an interface tool between the Biochemical, Genetic and Genomic (BiGG) and KEGG databases is necessary for simultaneous access to the features of both databases. For this purpose, we present the BiKEGG toolbox, an open source COBRA toolbox extension providing a set of functions to infer the reaction correspondences between the KEGG reaction identifiers and those in the BiGG knowledgebase using a combination of manual verification and computational methods. Inferred reaction correspondences using this approach are supported by evidence from the literature, which provides a higher number of reconciled reactions between these two databases compared to the MetaNetX and MetRxn databases. This set of equivalent reactions is then used to automatically superimpose the predicted fluxes using COBRA methods on classical KEGG pathway maps or to create a customized metabolic map based on the KEGG global metabolic pathway, and to find the corresponding reactions in BiGG based on the genome annotation of an organism in the KEGG database. Customized metabolic maps can be created for a set of pathways of interest, for the whole KEGG global map or exclusively for all pathways for which there exists at least one flux carrying reaction. This flexibility in visualization enables BiKEGG to indicate reaction directionality as well as to visualize the reaction fluxes for different static or dynamic conditions in an animated manner. BiKEGG allows the user to export (1) the output visualized metabolic maps to various standard image formats or save them as a video or animated GIF file, and (2) the equivalent reactions for an organism as an Excel spreadsheet.

  18. Developing a Logistics Data Process for Support Equipment for NASA Ground Operations

    NASA Technical Reports Server (NTRS)

    Chakrabarti, Suman

    2010-01-01

    The United States NASA Space Shuttle has long been considered an extremely capable yet relatively expensive rocket. A great part of the roughly US $500 million per launch expense was the support footprint: refurbishment and maintenance of the space shuttle system, together with the long list of resources required to support it, including personnel, tools, facilities, transport and support equipment. NASA determined to give its next rocket system a smaller logistics footprint, making it more cost-effective and quicker to turn around. The logical solution was to adopt a standard Logistics Support Analysis (LSA) process based on GEIA-STD-0007 (http://www.logisticsengineers.org/may09pres/GEIASTD0007DEXShortIntro.pdf), the successor of MIL-STD-1388-2B widely used by U.S., NATO, and other world military services and industries. This approach is unprecedented at NASA: it is the first time a major program of programs, Project Constellation, is factoring logistics and supportability into design at many levels. This paper focuses on one of those levels, NASA ground support equipment for the next generation of NASA rockets, and on building a Logistics Support Analysis Record (LSAR) for developing and documenting a support solution and inventory of resources. This LSAR is in fact a standards-based database, containing analyses of the time and tools, personnel, facilities and support equipment required to assemble and integrate the stages and umbilicals of a rocket. The paper covers building this database from scratch: creating and importing a hierarchical bill of materials (BOM) from legacy data; identifying line-replaceable units (LRUs) of a given piece of equipment; analyzing the reliability and maintainability of those LRUs; and thereby feeding an assessment back to design on whether the support solution for a piece of equipment is too resource-intensive. If an LRU must be replaced or inspected too often, perhaps a modification of the design of the equipment can make such operational effort unnecessary. Finally, this paper addresses processes for tying resources to a timeline of tasks performed in ground operations, which enables various overarching analyses, e.g., a summarization of all resources used for a given piece of equipment. Quality control of data is also discussed, including importing and exporting data from product teams via spreadsheet-to-database transfers or data exchange between databases.

  19. Lessons Learned from Deploying an Analytical Task Management Database

    NASA Technical Reports Server (NTRS)

    O'Neil, Daniel A.; Welch, Clara; Arceneaux, Joshua; Bulgatz, Dennis; Hunt, Mitch; Young, Stephen

    2007-01-01

    Defining requirements, missions, technologies, and concepts for space exploration involves multiple levels of organizations, teams of people with complementary skills, and analytical models and simulations. Analytical activities range from filling a To-Be-Determined (TBD) in a requirement to creating animations and simulations of exploration missions. In a program as large as returning to the Moon, there are hundreds of simultaneous analysis activities. A way to manage and integrate efforts of this magnitude is to deploy a centralized database that provides the capability to define tasks, identify resources, describe products, schedule deliveries, and generate a variety of reports. This paper describes a web-accessible task management system and explains the lessons learned during the development and deployment of the database. Through the database, managers and team leaders can define tasks, establish review schedules, assign teams, link tasks to specific requirements, identify products, and link the task data records to external repositories that contain the products. Data filters and spreadsheet export utilities provide a powerful capability to create custom reports. Import utilities provide a means to populate the database from previously filled form files. Within a four month period, a small team analyzed requirements, developed a prototype, conducted multiple system demonstrations, and deployed a working system supporting hundreds of users across the aerospace community. Open-source technologies and agile software development techniques, applied by a skilled team, enabled this impressive achievement. Topics in the paper cover the web application technologies, agile software development, an overview of the system's functions and features, dealing with increasing scope, and deploying new versions of the system.

  20. Simulation modeling for the health care manager.

    PubMed

    Kennedy, Michael H

    2009-01-01

    This article addresses the use of simulation software to solve administrative problems faced by health care managers. Spreadsheet add-ins, process simulation software, and discrete event simulation software are available at a range of costs and complexity. All use the Monte Carlo method to realistically integrate probability distributions into models of the health care environment. Problems typically addressed by health care simulation modeling are facility planning, resource allocation, staffing, patient flow and wait time, routing and transportation, supply chain management, and process improvement.
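
    The Monte Carlo core of such tools is small enough to sketch directly. The following Python fragment estimates mean patient waiting time at a single service point with random arrivals and service times (the rates are invented; a real model would be calibrated to clinic data):

    import random

    random.seed(1)

    def mean_wait(arrival_rate, service_rate, n_patients=100_000):
        wait = total = 0.0
        for _ in range(n_patients):
            gap = random.expovariate(arrival_rate)       # time since last arrival
            service = random.expovariate(service_rate)   # service duration
            # Lindley recurrence: next wait = max(0, wait + service - gap).
            wait = max(0.0, wait + service - gap)
            total += wait
        return total / n_patients

    # 5 arrivals/hour served at 6/hour (83% utilization): queues build up.
    print(f"mean wait: {mean_wait(5, 6):.2f} hours")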

  1. A Spreadsheet-Based Visualized Mindtool for Improving Students' Learning Performance in Identifying Relationships between Numerical Variables

    ERIC Educational Resources Information Center

    Lai, Chiu-Lin; Hwang, Gwo-Jen

    2015-01-01

    In this study, a spreadsheet-based visualized Mindtool was developed for improving students' learning performance when finding relationships between numerical variables by engaging them in reasoning and decision-making activities. To evaluate the effectiveness of the proposed approach, an experiment was conducted on the "phenomena of climate…

  2. A Computer Spreadsheet for Locating Assistive Devices.

    ERIC Educational Resources Information Center

    Palmer, Catherine V.; Garstecki, Dean C.

    1988-01-01

    The article presents a directory of assistive devices for persons with hearing impairments in a grid format by distributor and type of device (alerting devices, telephone, TV/radio/stereo, personal communication, group communication, and other). The product locator is also available in spreadsheet form for either the Macintosh or IBM-PC computers.…

  3. Working Together: Google Apps Goes to School

    ERIC Educational Resources Information Center

    Oishi, Lindsay

    2007-01-01

    Online collaboration and project-management tools allow people to work together without being in the same place at the same time. That is not all, however: Google Docs & Spreadsheets, for example, allows the creation of documents and spreadsheets just as in Microsoft Word and Excel, but with more collaborative capacity. Google Calendar lets…

  4. Simulating Satellite and Space Probe Motion at High School with Spreadsheets

    ERIC Educational Resources Information Center

    Benacka, Jan

    2017-01-01

    This paper gives an account of an experiment in which thirty-three high school students of ages 17-19 developed spreadsheet numerical models of satellite and space probe motion. The models are free to download. A survey was carried out to find out the students' opinion of the lessons.

  5. Using Spreadsheets to Teach Aspects of Biology Involving Mathematical Models

    ERIC Educational Resources Information Center

    Carlton, Kevin; Nicholls, Mike; Ponsonby, David

    2004-01-01

    Some aspects of biology, for example the Hardy-Weinberg simulation of population genetics or modelling heat flow in lizards, have an undeniable mathematical basis. Students can find the level of mathematical skill required to deal with such concepts to be an insurmountable hurdle to understanding. If not used effectively, spreadsheet models…
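
    The Hardy-Weinberg example translates directly into a few lines of code, which indicates what the spreadsheet versions compute. A minimal Python sketch of the genotype-frequency calculation:

    # Hardy-Weinberg genotype frequencies p^2, 2pq, q^2 for allele
    # frequency p, assuming random mating with no drift or selection.
    def genotype_frequencies(p):
        q = 1.0 - p
        return p * p, 2 * p * q, q * q   # AA, Aa, aa

    for p in (0.9, 0.5, 0.1):
        AA, Aa, aa = genotype_frequencies(p)
        print(f"p={p:.1f}: AA={AA:.2f} Aa={Aa:.2f} aa={aa:.2f}")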

  6. Using Spreadsheet Modeling Techniques for Capital Project Review. AIR 1985 Annual Forum Paper.

    ERIC Educational Resources Information Center

    Kaynor, Robert K.

    The value of microcomputer modeling tools and spreadsheets to help college institutional researchers analyze proposed capital projects is discussed, along with strengths and weaknesses of different software packages. Capital budgeting is the analysis that supports decisions about the allocation and commitment of funds to long-term capital…

  7. Spreadsheets as a Transparent Resource for Learning the Mathematics of Annuities

    ERIC Educational Resources Information Center

    Pournara, Craig

    2009-01-01

    The ability of mathematics teachers to decompress mathematics and to move between representations are two key features of mathematical knowledge that is usable for teaching. This article reports on four pre-service secondary mathematics teachers learning the mathematics of annuities. In working with spreadsheets students began to make sense of…

  8. A Simple Spreadsheet Strikes a Nerve among Adjuncts

    ERIC Educational Resources Information Center

    Stratford, Michael

    2012-01-01

    Energized by his fellow adjunct professors who had gathered for a national meeting last month in Washington, District of Columbia, Joshua A. Boldt flew home to Athens, Georgia, opened his laptop, and created a Google document. On his personal blog, the writing instructor implored colleagues to contribute to the publicly editable spreadsheet,…

  9. Studying Faculty Flows Using an Interactive Spreadsheet Model. AIR 1997 Annual Forum Paper.

    ERIC Educational Resources Information Center

    Kelly, Wayne

    This paper describes a spreadsheet-based faculty flow model developed and implemented at the University of Calgary (Canada) to analyze faculty retirement, turnover, and salary issues. The study examined whether, given expected faculty turnover, the current salary increment system was sustainable in a stable or declining funding environment, and…

  10. Transition Matrices: A Tool to Assess Student Learning and Improve Instruction

    ERIC Educational Resources Information Center

    Morris, Gary A.; Walter, Paul; Skees, Spencer; Schwartz, Samantha

    2017-01-01

    This paper introduces a new spreadsheet tool for adoption by high school or college-level physics teachers who use common assessments in a pre-instruction/post-instruction mode to diagnose student learning and teaching effectiveness. The spreadsheet creates a simple matrix that identifies the percentage of students who select each possible…
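
    The matrix itself is a simple cross-tabulation. A Python sketch of the idea for one multiple-choice item (the answer data are invented):

    from collections import Counter

    # Cross-tabulate each student's pre-test answer against their post-test
    # answer and report each cell as a percentage of the class.
    pre  = ["A", "B", "C", "A", "D", "B", "A", "C"]
    post = ["A", "A", "A", "A", "D", "A", "B", "C"]

    counts = Counter(zip(pre, post))
    n = len(pre)
    choices = ["A", "B", "C", "D"]

    print("pre/post   " + "    ".join(choices))
    for row in choices:
        cells = (100 * counts[(row, col)] / n for col in choices)
        print(f"{row:>8}  " + "  ".join(f"{c:3.0f}" for c in cells))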

  11. Pre-Service Mathematics Teachers' Learning and Teaching of Activity-Based Lessons Supported with Spreadsheets

    ERIC Educational Resources Information Center

    Agyei, Douglas D.; Voogt, Joke M.

    2016-01-01

    In this study, 12 pre-service mathematics teachers worked in teams to develop their knowledge and skills in using teacher-led spreadsheet demonstrations to help students explore mathematics concepts, stimulate discussions and perform authentic tasks through activity-based lessons. Pre-service teachers' lesson plans, their instruction of the…

  12. Examining Errors in Simple Spreadsheet Modeling from Different Research Perspectives

    ERIC Educational Resources Information Center

    Kadijevich, Djordje M.

    2012-01-01

    By using a sample of 1st-year undergraduate business students, this study dealt with the development of simple (deterministic and non-optimization) spreadsheet models of income statements within an introductory course on business informatics. The study examined students' errors in doing this for business situations of their choice and found three…

  13. Using Spreadsheets to Discover Meaning for Parameters in Nonlinear Models

    ERIC Educational Resources Information Center

    Green, Kris H.

    2008-01-01

    This paper explores the use of spreadsheets to develop an exploratory environment where mathematics students can develop their own understanding of the parameters of commonly encountered families of functions: linear, logarithmic, exponential and power. The key to this understanding involves opening up the definition of rate of change from the…

  14. Using a Spreadsheet Scroll Bar to Solve Equilibrium Concentrations

    ERIC Educational Resources Information Center

    Raviolo, Andres

    2012-01-01

    A simple, conceptual method is described for using the spreadsheet scroll bar to find the composition of a system at chemical equilibrium. Simulation of any kind of chemical equilibrium can be carried out using this method, and the effects of different disturbances can be predicted. This simulation, which can be used in general chemistry…
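
    A scroll bar is, in effect, a hand-driven parameter sweep, and the same search is easy to express in code. A Python sketch for the illustrative reaction A + B <=> C with equilibrium constant K (all values invented):

    # Sweep the extent of reaction x until the reaction quotient Q matches K.
    K, a0, b0 = 50.0, 0.10, 0.10      # K and initial concentrations (M)

    best_x, best_err = 0.0, float("inf")
    steps = 10_000
    for i in range(steps):
        x = min(a0, b0) * i / steps            # candidate extent of reaction
        Q = x / ((a0 - x) * (b0 - x))          # quotient for A + B <=> C
        if abs(Q - K) < best_err:
            best_x, best_err = x, abs(Q - K)

    print(f"[A] = [B] = {a0 - best_x:.4f} M, [C] = {best_x:.4f} M")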

  15. Evolving Polygons and Spreadsheets: Connecting Mathematics across Grade Levels in Teacher Education

    ERIC Educational Resources Information Center

    Abramovich, Sergei; Brouwer, Peter

    2009-01-01

    This paper was prepared in response to the Conference Board of Mathematical Sciences recommendations for the preparation of secondary teachers. It shows how using trigonometry as a conceptual tool in spreadsheet-based applications enables one to develop mathematical understanding in the context of constructing geometric representations of unit…

  16. Excel Yourself with Personalised Email Messages

    ERIC Educational Resources Information Center

    McClean, Stephen

    2008-01-01

    Combining the Excel spreadsheet with an email program provides a very powerful tool for sending students personalised emails. Most email clients now support a Mail Merge facility whereby a generic template is created and information unique to each student record in the spreadsheet is filled into that template, generating tens if not hundreds of…
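
    The mechanics of such a merge are compact enough to sketch. The following Python fragment fills a generic template from each row of a class spreadsheet exported as CSV (the file name and column headings are assumptions; actual sending would go through an email library or the mail client's merge facility):

    import csv
    from string import Template

    template = Template(
        "Dear $name,\n\nYour mark for practical $practical was $mark/100.\n")

    # Expects columns: name, email, practical, mark.
    with open("marks.csv", newline="") as f:
        for row in csv.DictReader(f):
            message = template.substitute(row)
            print(f"To: {row['email']}\n{message}")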

  17. Perovskite classification: An Excel spreadsheet to determine and depict end-member proportions for the perovskite- and vapnikite-subgroups of the perovskite supergroup

    NASA Astrophysics Data System (ADS)

    Locock, Andrew J.; Mitchell, Roger H.

    2018-04-01

    Perovskite mineral oxides commonly exhibit extensive solid-solution, and are therefore classified on the basis of the proportions of their ideal end-members. A uniform sequence of calculation of the end-members is required if comparisons are to be made between different sets of analytical data. A Microsoft Excel spreadsheet has been programmed to assist with the classification and depiction of the minerals of the perovskite- and vapnikite-subgroups following the 2017 nomenclature of the perovskite supergroup recommended by the International Mineralogical Association (IMA). Compositional data for up to 36 elements are input into the spreadsheet as oxides in weight percent. For each analysis, the output includes the formula, the normalized proportions of 15 end-members, and the percentage of cations which cannot be assigned to those end-members. The data are automatically plotted onto the ternary and quaternary diagrams recommended by the IMA for depiction of perovskite compositions. Up to 200 analyses can be entered into the spreadsheet, which is accompanied by data calculated for 140 perovskite compositions compiled from the literature.
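
    The first step of any such classification sheet, converting oxide weight percent to molar proportions, is shown below as a Python sketch (the molar masses are standard values; the two-oxide analysis is invented, and the real spreadsheet goes on to assign the proportions to the 15 IMA end-members):

    # Convert an oxide weight-percent analysis to normalized cation proportions.
    MOLAR_MASS = {"CaO": 56.08, "TiO2": 79.87}      # g/mol
    analysis_wt = {"CaO": 41.2, "TiO2": 58.8}       # hypothetical wt%

    moles = {ox: wt / MOLAR_MASS[ox] for ox, wt in analysis_wt.items()}
    total = sum(moles.values())
    proportions = {ox: m / total for ox, m in moles.items()}
    print(proportions)   # near-equal Ca and Ti: ideal perovskite, CaTiO3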

  18. Management of plastic wastes at Brazilian ports and diagnosis of their generation.

    PubMed

    Neffa Gobbi, Clarice; Lourenço Sanches, Vânia Maria; Acordi Vasques Pacheco, Elen Beatriz; de Oliveira Cavalcanti Guimarães, Maria José; Vasconcelos de Freitas, Marcos Aurélio

    2017-11-15

    This study evaluated the management of plastic wastes at 20 Brazilian maritime ports, from three sources: vessels, leased and non-leased areas. The data were obtained from documents on port wastes organized in a relational database with defined protocols (closed form). Analysis of the spreadsheets prepared and field visits revealed that the main bottleneck in managing plastic wastes at ports is their segregation. In general, more material is segregated and sent for recycling from leased areas than non-leased ones (administered by the government). This relatively better performance in managing the wastes generated in leased areas is probably due to the need for private operators to comply with the international standards such as the Code of Environmental Practice to satisfy the international market. Copyright © 2017 Elsevier Ltd. All rights reserved.

  19. Favorable Geochemistry from Springs and Wells in Colorado

    DOE Data Explorer

    Richard E. Zehner

    2012-02-01

    This layer contains favorable geochemistry for high-temperature geothermal systems, as interpreted by Richard "Rick" Zehner. The data were compiled from data obtained from the USGS. The original data set combines 15,622 samples collected in the State of Colorado from several sources including 1) the original Geotherm geochemical database, 2) USGS NWIS (National Water Information System), 3) Colorado Geological Survey geothermal sample data, and 4) original samples collected by R. Zehner at various sites during the 2011 field season. These samples are also available in a separate shapefile FlintWaterSamples.shp. Data from all samples were reportedly collected using standard water sampling protocols (filtering through a 0.45 micron filter, etc.). Sample information was standardized to ppm (milligrams/liter) in spreadsheet columns. Commonly used cation and silica geothermometer temperature estimates are included.

  20. Geodynamics for Everyone: Robust Finite-Difference Heat Transfer Models using MS Excel 2007 Spreadsheets

    NASA Astrophysics Data System (ADS)

    Grose, C. J.

    2008-05-01

    Numerical geodynamics models of heat transfer are typically thought of as specialized research topics requiring knowledge of dedicated modelling software, Linux platforms, and state-of-the-art finite-element codes. I have implemented analytical and numerical finite-difference techniques in Microsoft Excel 2007 spreadsheets to solve complex solid-earth heat transfer problems, for use by students, teachers, and practicing scientists without specialty in geodynamics modelling techniques and applications. While implementing equations for use in Excel spreadsheets is occasionally cumbersome, once the case boundary structure and node equations are developed, spreadsheet manipulation becomes routine. Model experimentation by modifying parameter values, geometry, and grid resolution makes Excel a useful tool in the classroom at the undergraduate or graduate level and for more engaging student projects. Furthermore, the ability to incorporate complex geometries and heat-transfer characteristics makes it ideal for first-order, and occasionally higher-order, geodynamics simulations to better understand and constrain the results of professional field research in a setting that does not impose the constraints of state-of-the-art modelling codes. The straightforward expression and manipulation of model equations in Excel can also serve as a medium for better understanding the confusing notation of advanced mathematical problems. To illustrate the power and robustness of computation and visualization in spreadsheet models, I focus primarily on one-dimensional analytical and two-dimensional numerical solutions to two case problems: (i) the cooling of oceanic lithosphere and (ii) temperatures within subducting slabs. Excel source documents will be made available.
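
    The node equations referred to above are the explicit finite-difference form of the heat equation, which is compact in any medium. A Python sketch of one-dimensional conductive cooling (all parameters illustrative; the spreadsheet version simply lays the same update out across rows and columns):

    # FTCS update T_new[i] = T[i] + s*(T[i-1] - 2*T[i] + T[i+1]),
    # stable when s = kappa*dt/dx**2 <= 0.5.
    kappa = 1e-6                    # thermal diffusivity, m^2/s
    dx, dt = 1000.0, 1e10           # 1 km grid, ~317-year time step
    s = kappa * dt / dx**2          # = 0.01, comfortably stable

    T = [1300.0] * 101              # initial temperature, deg C
    T[0] = 0.0                      # cold surface boundary

    for _ in range(50_000):         # march ~16 Myr forward in time
        T = ([T[0]] +
             [T[i] + s * (T[i-1] - 2*T[i] + T[i+1]) for i in range(1, 100)] +
             [T[100]])

    print([round(t) for t in T[:10]])   # the geotherm near the surface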

  1. Road Traffic Accident Analysis of Ajmer City Using Remote Sensing and GIS Technology

    NASA Astrophysics Data System (ADS)

    Bhalla, P.; Tripathi, S.; Palria, S.

    2014-12-01

    With advancement in technology, new and sophisticated vehicle models are available and their numbers are increasing day by day. A traffic accident has multi-faceted characteristics associated with it. In India, 93% of crashes occur due to human-induced factors (wholly or partly). For proper traffic accident analysis, the use of GIS technology has become an inevitable tool. The traditional accident database is a summary spreadsheet format using codes and mileposts to denote the location, type and severity of accidents. A geo-referenced accident database is location-referenced: it incorporates a GIS graphical interface with the accident information to allow query searches on various accident attributes. Ajmer city, headquarters of Ajmer district, Rajasthan, has been selected as the study area. According to police records, 1531 accidents occurred during 2009-2013. The most accidents occurred in 2009 and the most deaths in 2013. Cars, jeeps, autos, pickups and tempos are mostly responsible for accidents, and the occurrence of accidents is mostly concentrated between 4 PM and 10 PM. GIS has proved to be a good tool for analyzing the multifaceted nature of accidents. While road safety is a critical issue, it is handled in an ad hoc manner. This study demonstrates the application of GIS for developing an efficient database on road accidents, taking Ajmer city as a case study. If this type of database is developed for other cities, a proper analysis of accidents can be undertaken and suitable management strategies for traffic regulation can be successfully proposed.

  2. Full value documentation in the Czech Food Composition Database.

    PubMed

    Machackova, M; Holasova, M; Maskova, E

    2010-11-01

    The aim of this project was to launch a new Food Composition Database (FCDB) Programme in the Czech Republic; to implement a methodology for food description and value documentation according to the standards designed by the European Food Information Resource (EuroFIR) Network of Excellence; and to start the compilation of a pilot FCDB. Foods for the initial data set were selected from the list of foods included in the Czech Food Consumption Basket. Selection of 24 priority components was based on the range of components used in former Czech tables. The priority list was extended with components for which original Czech analytical data or calculated data were available. Values that were input into the compiled database were documented according to the EuroFIR standards within the entities FOOD, COMPONENT, VALUE and REFERENCE using Excel sheets. Foods were described using the LanguaL Thesaurus. A template for documentation of data according to the EuroFIR standards was designed. The initial data set comprised documented data for 162 foods. Values were based on original Czech analytical data (available for traditional and fast foods, milk and milk products, wheat flour types), data derived from literature (for example, fruits, vegetables, nuts, legumes, eggs) and calculated data. The Czech FCDB programme has been successfully relaunched. Inclusion of the Czech data set into the EuroFIR eSearch facility confirmed compliance of the database format with the EuroFIR standards. Excel spreadsheets are applicable for full value documentation in the FCDB.

  3. Uses of the Drupal CMS Collaborative Framework in the Woods Hole Scientific Community (Invited)

    NASA Astrophysics Data System (ADS)

    Maffei, A. R.; Chandler, C. L.; Work, T. T.; Shorthouse, D.; Furfey, J.; Miller, H.

    2010-12-01

    Organizations that comprise the Woods Hole scientific community (Woods Hole Oceanographic Institution, Marine Biological Laboratory, USGS Woods Hole Coastal and Marine Science Center, Woods Hole Research Center, NOAA NMFS Northeast Fisheries Science Center, SEA Education Association) have a long history of collaborative activity regarding computing, computer network and information technologies that support common, inter-disciplinary science needs. Over the past several years there has been growing interest in the use of the Drupal Content Management System (CMS) playing a variety of roles in support of research projects resident at several of these organizations. Many of these projects are part of science programs that are national and international in scope. Here we survey the current uses of Drupal within the Woods Hole scientific community and examine reasons it has been adopted. The promise of emerging semantic features in the Drupal framework is examined and projections of how pre-existing Drupal-based websites might benefit are made. Closer examination of Drupal software design exposes it as more than simply a content management system. The flexibility of its architecture; the power of its taxonomy module; the care taken in nurturing the open-source developer community that surrounds it (including organized and often well-attended code sprints); the ability to bind emerging software technologies as Drupal modules; the careful selection process used in adopting core functionality; multi-site hosting and cross-site deployment of updates and a recent trend towards development of use-case inspired Drupal distributions casts Drupal as a general-purpose application deployment framework. Recent work in the semantic arena casts Drupal as an emerging RDF framework as well. Examples of roles played by Drupal-based websites within the Woods Hole scientific community that will be discussed include: science data metadata database, organization main website, biological taxonomy development, bibliographic database, physical media data archive inventory manager, disaster-response website development framework, science project task management, science conference planning, and spreadsheet-to-database converter.

  4. Collaborative data model and data base development for paleoenvironmental and archaeological domain using Semantic MediaWiki

    NASA Astrophysics Data System (ADS)

    Willmes, C.

    2017-12-01

    Within the frame of the Collaborative Research Centre 806 (CRC 806), an interdisciplinary research project that needs to manage data, information and knowledge from heterogeneous domains such as archaeology, the cultural sciences and the geosciences, a collaborative internal knowledge base system was developed. The system is based on the open source MediaWiki software, well known as the software behind Wikipedia, for its facilitation of a web-based collaborative knowledge and information management platform. This software is enhanced with the Semantic MediaWiki (SMW) extension, which allows structured data to be stored and managed within the Wiki platform and provides complex query and API interfaces to the structured data stored in the SMW database. Using an additional open source tool called mobo, it is possible to improve the data model development process, as well as to automate data imports, from small spreadsheets to large relational databases. Mobo is a command line tool that helps build and deploy SMW structure in an agile, schema-driven development way, and allows the data model formalizations, expressed in JSON-Schema format, to be managed and collaboratively developed using version control systems like git. The combination of a well-equipped collaborative web platform facilitated by MediaWiki, the possibility to store and query structured data in this collaborative database provided by SMW, and the automated data import and data model development enabled by mobo results in a powerful but flexible system for building and developing a collaborative knowledge base. Furthermore, SMW allows the application of Semantic Web technology: the structured data can be exported to RDF, so a triple store with a SPARQL endpoint can be set up on top of the database. The JSON-Schema based data models can be extended to JSON-LD to facilitate and profit from the possibilities of Linked Data technology.

  5. Implementing a Community-Driven Cyberinfrastructure Platform for the Paleo- and Rock Magnetic Scientific Fields that Generalizes to Other Geoscience Disciplines

    NASA Astrophysics Data System (ADS)

    Minnett, R.; Jarboe, N.; Koppers, A. A.; Tauxe, L.; Constable, C.

    2013-12-01

    EarthRef.org is a geoscience umbrella website for several databases and data and model repository portals. These portals, unified in the mandate to preserve their respective data and promote scientific collaboration in their fields, are also disparate in their schemata. The Magnetics Information Consortium (http://earthref.org/MagIC/) is a grass-roots cyberinfrastructure effort envisioned by the paleo- and rock magnetic scientific community to archive their wealth of peer-reviewed raw data and interpretations from studies on natural and synthetic samples and relies on a partially strict subsumptive hierarchical data model. The Geochemical Earth Reference Model (http://earthref.org/GERM/) portal focuses on the chemical characterization of the Earth and relies on two data schemata: a repository of peer-reviewed reservoir geochemistry, and a database of partition coefficients for rocks, minerals, and elements. The Seamount Biogeosciences Network (http://earthref.org/SBN/) encourages the collaboration between the diverse disciplines involved in seamount research and includes the Seamount Catalog (http://earthref.org/SC/) of bathymetry and morphology. All of these portals also depend on the EarthRef Reference Database (http://earthref.org/ERR/) for publication reference metadata and the EarthRef Digital Archive (http://earthref.org/ERDA/), a generic repository of data objects and their metadata. The development of the new MagIC Search Interface (http://earthref.org/MagIC/search/) centers on a reusable platform designed to be flexible enough for largely heterogeneous datasets and to scale up to datasets with tens of millions of records. The HTML5 web application and Oracle 11g database residing at the San Diego Supercomputer Center (SDSC) support the online contribution and editing of complex datasets in a spreadsheet environment and the browsing and filtering of these contributions in the context of thousands of other datasets. EarthRef.org is in the process of implementing this platform across all of its data portals in spite of the wide variety of data schemata and is dedicated to serving the geoscience community with as little effort from the end-users as possible.

  6. Assessment of the lumber drying industry and current potential for value-added processing in Alaska.

    Treesearch

    David L. Nicholls; Kenneth A. Kilborn

    2001-01-01

    An assessment was done of the lumber drying industry in Alaska. Part 1 of the assessment included an evaluation of kiln capacity, kiln type, and species dried, by geographic region of the state. Part 2 of the assessment considered the value-added potential associated with lumber drying. Various costs related to lumber drying were evaluated in an Excel spreadsheet....

  7. Electronic spreadsheet vs. manual payroll.

    PubMed

    Kiley, M M

    1991-01-01

    Medical groups with direct employees must employ someone or contract with a company to compute payroll, writes Michael Kiley, Ph.D., M.P.H. However, many medical groups, including small ones, own a personal or minicomputer to handle accounts receivable. Kiley explains, in detail, how this same computer and a spreadsheet program also can be used to perform payroll functions.

  8. How Helpful Are Error Management and Counterfactual Thinking Instructions to Inexperienced Spreadsheet Users' Training Task Performance?

    ERIC Educational Resources Information Center

    Caputi, Peter; Chan, Amy; Jayasuriya, Rohan

    2011-01-01

    This paper examined the impact of training strategies on the types of errors that novice users make when learning a commonly used spreadsheet application. Fifty participants were assigned to a counterfactual thinking training (CFT) strategy, an error management training strategy, or a combination of both strategies, and completed an easy task…

  9. Data Analysis and Graphing in an Introductory Physics Laboratory: Spreadsheet versus Statistics Suite

    ERIC Educational Resources Information Center

    Peterlin, Primoz

    2010-01-01

    Two methods of data analysis are compared: spreadsheet software and a statistics software suite. Their use is compared analysing data collected in three selected experiments taken from an introductory physics laboratory, which include a linear dependence, a nonlinear dependence and a histogram. The merits of each method are compared. (Contains 7…

  10. Flipping the Classroom and Instructional Technology Integration in a College-Level Information Systems Spreadsheet Course

    ERIC Educational Resources Information Center

    Davies, Randall S.; Dean, Douglas L.; Ball, Nick

    2013-01-01

    The purpose of this research was to explore how technology can be used to teach technological skills and to determine what benefit "flipping" the classroom might have for students taking an introductory-level college course on spreadsheets in terms of student achievement and satisfaction with the class. A pretest posttest…

  11. Pre-Service Teachers' TPACK Competencies for Spreadsheet Integration: Insights from a Mathematics-Specific Instructional Technology Course

    ERIC Educational Resources Information Center

    Agyei, Douglas D.; Voogt, Joke M.

    2015-01-01

    This article explored the impact of strategies applied in a mathematics instructional technology course for developing technology integration competencies, in particular in the use of spreadsheets, in pre-service teachers. In this respect, 104 pre-service mathematics teachers from a teacher training programme in Ghana enrolled in the mathematics…

  12. A Computer Simulation Using Spreadsheets for Learning Concept of Steady-State Equilibrium

    ERIC Educational Resources Information Center

    Sharda, Vandana; Sastri, O. S. K. S.; Bhardwaj, Jyoti; Jha, Arbind K.

    2016-01-01

    In this paper, we present a simple spreadsheet based simulation activity that can be performed by students at the undergraduate level. This simulation is implemented in free open source software (FOSS) LibreOffice Calc, which is available for both Windows and Linux platform. This activity aims at building the probability distribution for the…

  13. Handling Math Expressions in Economics: Recoding Spreadsheet Teaching Tool of Growth Models

    ERIC Educational Resources Information Center

    Moro-Egido, Ana I.; Pedauga, Luis E.

    2017-01-01

    In the present paper, we develop a teaching methodology for economic theory. The main contribution of this paper relies on combining the interactive characteristics of spreadsheet programs such as Excel and Unicode plain-text linear format for mathematical expressions. The advantage of Unicode standard rests on its ease for writing and reading…

  14. A Simple Spreadsheet Program for the Calculation of Lattice-Site Distributions

    ERIC Educational Resources Information Center

    McCaffrey, John G.

    2009-01-01

    A simple spreadsheet program is presented that can be used by undergraduate students to calculate the lattice-site distributions in solids. A major strength of the method is the natural way in which the correct number of ions or atoms are present, or absent, at specific lattice distances. The expanding-cube method utilized is straightforward to…
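
    The expanding-cube method is easy to restate in code, which may clarify why the correct number of sites falls out naturally. A Python sketch for a simple cubic lattice with lattice constant a:

    from collections import Counter

    # Enumerate every site in a cube of half-width N and tally sites by
    # squared distance from the origin; each tally is one coordination shell.
    N = 3
    shells = Counter()
    for i in range(-N, N + 1):
        for j in range(-N, N + 1):
            for k in range(-N, N + 1):
                if (i, j, k) != (0, 0, 0):
                    shells[i*i + j*j + k*k] += 1

    for d2 in sorted(shells)[:6]:
        print(f"r = {d2 ** 0.5:.3f} a : {shells[d2]} sites")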

  15. Mathematical Modeling in Science: Using Spreadsheets to Create Mathematical Models and Address Scientific Inquiry

    ERIC Educational Resources Information Center

    Horton, Robert M.; Leonard, William H.

    2005-01-01

    In science, inquiry is used as students explore important and interesting questions concerning the world around them. In mathematics, one contemporary inquiry approach is to create models that describe real phenomena. Creating mathematical models using spreadsheets can help students learn at deep levels in both science and mathematics, and give…

  16. Myocardial and liver iron overload, assessed using T2* magnetic resonance imaging with an excel spreadsheet for post processing in Tunisian thalassemia major patients.

    PubMed

    Ouederni, Monia; Ben Khaled, Monia; Mellouli, Fethi; Ben Fraj, Elhem; Dhouib, Nawel; Yakoub, Ismehen Ben; Abbes, Selem; Mnif, Nejla; Bejaoui, Mohamed

    2017-01-01

    Thalassemia is a common genetic disorder in Tunisia. Early assessment of iron concentration is a crucial and challenging issue. Most annual deaths due to iron overload occur in underdeveloped regions of the world, where limited access to liver and heart MRI monitoring might partially explain these poor prognostic results. Standard software programs are not available in Tunisia. This study is the first to evaluate iron overload in the heart and liver using MRI T2* with an Excel spreadsheet for post-processing. The association of the results of this MRI tool with serum ferritin level and echocardiography was also investigated. One hundred transfused Tunisian thalassemia patients older than 10 years (16.1 ± 5.2) were enrolled in the study. The mean myocardial iron concentration (MIC) was 1.26 ± 1.65 mg/g dw (0.06-8.32). Cardiac T2* (CT2*) was under 20 ms in 30% of patients and under 10 ms in 21% of patients. Left ventricular ejection fraction was significantly lower in patients with CT2* <10 ms. Abnormal liver iron concentration (LIC >3 mg/g dw) was found in 95% of patients. LIC was over 15 mg/g dw in 25% of patients. MIC was more strongly correlated than CT2* with LIC and serum ferritin. Among patients with SF <1000 μg/l, 13% had CT2* <20 ms. Our data showed that 30% of the Tunisian thalassemia major patients enrolled in this cohort had myocardial iron overload despite being treated with iron chelators. SF could not reliably predict iron overload in all thalassemia patients. MRI T2* using an Excel spreadsheet for routine follow-up of iron overload might improve the prognosis of thalassemia major patients in developing countries, such as Tunisia, where standard MRI tools are not available or are expensive.
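
    The post-processing at the heart of such a spreadsheet is a monoexponential fit, S(TE) = S0 * exp(-TE / T2*), to the signal measured at several echo times; a log-linear regression recovers T2*. A Python sketch with invented echo times and signals (ms and arbitrary units):

    import numpy as np

    TE = np.array([2.0, 4.0, 6.0, 9.0, 12.0, 15.0])     # echo times, ms
    S  = np.array([980., 800., 655., 485., 360., 265.])

    # Fit log(S) = log(S0) - TE/T2* by linear least squares.
    slope, intercept = np.polyfit(TE, np.log(S), 1)
    t2_star = -1.0 / slope
    print(f"T2* = {t2_star:.1f} ms")   # < 20 ms suggests cardiac iron loading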

  17. WQEP - a computer spreadsheet program to evaluate water quality data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liddle, R.G.

    1996-12-31

    A flexible spreadsheet Water Quality Evaluation Program (WQEP) has been developed for mining companies, consultants, and regulators to interpret the results of water quality sampling. To properly evaluate hydrologic data, unit conversions and chemical calculations must be done, quality control checks are needed, and a complete and up-to-date listing of water quality standards is necessary. This process is time consuming and tends not to be done for every sample. This program speeds the process by allowing the input of up to 115 chemical parameters from one sample. WQEP compares concentrations with EPA primary and secondary drinking water MCLs or MCLGs, EPA warmwater and coldwater acute and chronic aquatic life criteria, irrigation criteria, livestock criteria, EPA human health criteria, and several other categories of criteria. The spreadsheet allows the input of state or local water standards of interest. Water quality checks include: anions/cations, TDSm/TDSc (where m = measured and c = calculated), ECm/ECc, ECm/ion sums, TDSc/EC ratio, TDSm/EC, EC vs. alkalinity, two hardness values, and EC vs. Σ cations. WQEP computes the dissolved transport index of 23 parameters, computes ratios of 26 species for trend analysis, calculates non-carbonate alkalinity to adjust the bicarbonate concentration, and calculates 35 interpretive formulas (pE, SAR, S.I., un-ionized ammonia, ionized sulfide HS-, pKx values, etc.). Fingerprinting is conducted by automatic generation of Stiff diagrams and ion histograms. Mass loading calculations, mass balance calculations, concentration conversions, ionic strength, and the activity coefficient and chemical activity of 33 parameters are calculated. This program allows a speedy and thorough evaluation of water quality data from metal mines, coal mining, and natural surface water systems and has been tested against hand calculations.
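
    One of WQEP's quality-control checks, the anion/cation balance, illustrates the kind of arithmetic involved. A Python sketch (the equivalent weights are standard values; the sample is invented):

    # Convert mg/L to milliequivalents/L and compute the charge balance error.
    EQ_WEIGHT = {"Ca": 20.04, "Mg": 12.15, "Na": 22.99,    # mg per meq
                 "HCO3": 61.02, "SO4": 48.03, "Cl": 35.45}
    CATIONS, ANIONS = ("Ca", "Mg", "Na"), ("HCO3", "SO4", "Cl")

    sample_mg_l = {"Ca": 80, "Mg": 24, "Na": 46,
                   "HCO3": 244, "SO4": 96, "Cl": 71}

    meq = {ion: mg / EQ_WEIGHT[ion] for ion, mg in sample_mg_l.items()}
    cat = sum(meq[i] for i in CATIONS)
    an = sum(meq[i] for i in ANIONS)
    cbe = 100 * (cat - an) / (cat + an)
    print(f"cations {cat:.2f} meq/L, anions {an:.2f} meq/L, CBE {cbe:+.1f}%")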

  18. Tutorial: simulating chromatography with Microsoft Excel Macros.

    PubMed

    Kadjo, Akinde; Dasgupta, Purnendu K

    2013-04-22

    Chromatography is one of the cornerstones of modern analytical chemistry; developing an instinctive feeling for how chromatography works will be invaluable to future generations of chromatographers. Specialized software programs exist that handle and manipulate chromatographic data; there are also some that simulate chromatograms. However, the algorithm details of such software are not transparent to a beginner. In contrast, how spreadsheet tools like Microsoft Excel™ work is well understood and the software is nearly universally available. We show that the simple repetition of an equilibration process at each plate (a spreadsheet row) followed by discrete movement of the mobile phase down by a row, easily automated by a subroutine (a "Macro" in Excel), readily simulates chromatography. The process is readily understood by a novice. Not only does this permit simulation of isocratic and simple single step gradient elution, linear or multistep gradients are also easily simulated. The versatility of a transparent and easily understandable computational platform further enables the simulation of complex but commonly encountered chromatographic scenarios such as the effects of nonlinear isotherms, active sites, column overloading, on-column analyte degradation, etc. These are not as easily simulated by available software. Views of the separation as it develops on the column and as it is seen by an end-column detector are both available in real time. Excel 2010™ also permits a 16-level (4-bit) color gradation of numerical values in a column/row; this permits visualization of a band migrating down the column, much as Tswett may have originally observed, but in a numerical domain. All parameters of relevance (partition constants, elution conditions, etc.) are readily changed so their effects can be examined. Illustrative Excel spreadsheets are given in the Supporting Information; these are easily modified by the user or the user can write his/her own routine. Copyright © 2012 Elsevier B.V. All rights reserved.
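
    The equilibrate-then-shift loop the tutorial automates with Excel macros is short enough to sketch outside Excel. A Python version of the plate model (plate count, retention factor and step count are illustrative):

    # At each step every plate equilibrates (a fraction 1/(1+k) of the
    # analyte enters the mobile phase), then the mobile phase moves down
    # one plate; whatever leaves the last plate reaches the detector.
    N_PLATES, N_STEPS = 100, 450
    k = 2.0
    mobile_frac = 1.0 / (1.0 + k)

    column = [0.0] * N_PLATES
    column[0] = 1.0                   # inject all analyte onto plate 1
    detector = []

    for _ in range(N_STEPS):
        mobile = [amt * mobile_frac for amt in column]
        stationary = [amt - m for amt, m in zip(column, mobile)]
        detector.append(mobile[-1])   # eluted fraction this step
        column = [stationary[0]] + [st + m for st, m in
                                    zip(stationary[1:], mobile[:-1])]

    peak = max(range(N_STEPS), key=detector.__getitem__)
    print(f"peak elutes at step {peak}")   # ~ N_PLATES * (1 + k) = 300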

  19. A Simple Spreadsheet Program to Simulate and Analyze the Far-UV Circular Dichroism Spectra of Proteins

    ERIC Educational Resources Information Center

    Abriata, Luciano A.

    2011-01-01

    A simple algorithm was implemented in a spreadsheet program to simulate the circular dichroism spectra of proteins from their secondary structure content and to fit [alpha]-helix, [beta]-sheet, and random coil contents from experimental far-UV circular dichroism spectra. The physical basis of the method is briefly reviewed within the context of…
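
    The fitting half of such a program reduces to linear least squares: the observed spectrum is modelled as a weighted sum of helix, sheet and coil basis spectra. A Python sketch with synthetic placeholder spectra (real basis spectra come from reference proteins):

    import numpy as np

    rng = np.random.default_rng(0)
    wavelengths = np.arange(190, 251, 5)                # nm
    basis = rng.normal(size=(len(wavelengths), 3))      # helix, sheet, coil
    true_fractions = np.array([0.55, 0.25, 0.20])
    observed = basis @ true_fractions + rng.normal(scale=0.01,
                                                   size=len(wavelengths))

    # Solve basis @ fractions ~= observed in the least-squares sense.
    fractions, *_ = np.linalg.lstsq(basis, observed, rcond=None)
    print(dict(zip(["helix", "sheet", "coil"], fractions.round(2))))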

  20. Fuels planning: science synthesis and integration; environmental consequences fact sheet 11: Smoke Impact Spreadsheet (SIS) model

    Treesearch

    Trent Wickman; Ann Acheson

    2005-01-01

    The Smoke Impact Spreadsheet (SIS) is a simple-to-use planning model for calculating particulate matter (PM) emissions and concentrations downwind of wildland fires. This fact sheet identifies the intended users and uses, required inputs, what the model does and does not do, and tells the user how to obtain the model.

  1. Teaching Graphical Simulations of Fourier Series Expansion of Some Periodic Waves Using Spreadsheets

    ERIC Educational Resources Information Center

    Singh, Iqbal; Kaur, Bikramjeet

    2018-01-01

    The present article demonstrates a way of programming using an Excel spreadsheet to teach Fourier series expansion in school/colleges without the knowledge of any typical programming language. By using this, a student learns to approximate partial sum of the n terms of Fourier series for some periodic signals such as square wave, saw tooth wave,…
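
    The column formulas such a worksheet repeats are partial sums of the series; a Python sketch for the square wave (only odd harmonics contribute):

    import math

    # f(x) ~ (4/pi) * sum over odd m of sin(m*x)/m for a unit square wave.
    def square_wave_partial_sum(x, n_terms):
        return (4 / math.pi) * sum(
            math.sin((2*k + 1) * x) / (2*k + 1) for k in range(n_terms))

    for n in (1, 5, 50):
        y = square_wave_partial_sum(math.pi / 2, n)   # exact value is 1
        print(f"{n:3d} terms: f(pi/2) = {y:.4f}")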

  2. Integrating Math & Computer Skills in the Biology Classroom: An Example Using Spreadsheet Simulations to Teach Fundamental Sampling Concepts

    ERIC Educational Resources Information Center

    Ray, Darrell L.

    2013-01-01

    Students often enter biology programs deficient in the math and computational skills that would enhance their attainment of a deeper understanding of the discipline. To address some of these concerns, I developed a series of spreadsheet simulation exercises that focus on some of the mathematical foundations of scientific inquiry and the benefits…
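
    One such exercise, watching the spread of sample means shrink as sample size grows, is sketched below in Python (the population parameters are invented):

    import random
    import statistics

    random.seed(42)
    population = [random.gauss(50, 10) for _ in range(100_000)]

    # Standard error of the mean: the SD of sample means falls as 1/sqrt(n).
    for n in (5, 25, 100):
        means = [statistics.mean(random.sample(population, n))
                 for _ in range(1000)]
        print(f"n={n:3d}: SD of sample means = {statistics.stdev(means):.2f}"
              f" (theory ~ {10 / n ** 0.5:.2f})")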

  3. Effects of Screencasting on the Turkish Undergraduate Students' Achievement and Knowledge Acquisitions in Spreadsheet Applications

    ERIC Educational Resources Information Center

    Tekinarslan, Erkan

    2013-01-01

    The purpose of this study is to investigate the effects of screencasts on the Turkish undergraduate students' achievement and knowledge acquisitions in spreadsheet applications. The methodology of the study is based on a pretest-posttest experimental design with a control group. A total of 66 undergraduate students in two groups (n = 33 in…

  4. CVal: a spreadsheet tool to evaluate the direct benefits and costs of carbon sequestration contracts for managed forests

    Treesearch

    E.M. Bilek; Peter Becker; Tim. McAbee

    2009-01-01

    This documentation is meant to accompany CVal, a downloadable spreadsheet tool. CVal was constructed for foresters, other land management advisors, landowners, and carbon credit aggregators to evaluate the direct benefits and costs of entering into contracts for carbon sequestered in managed forests and forest plantations. CVal was designed to evaluate Exchange...

  5. An Integrated Management Support and Production Control System for Hardwood Forest Products

    Treesearch

    Guillermo A. Mendoza; Roger J. Meimban; William Sprouse; William G. Luppold; Philip A. Araman

    1991-01-01

    Spreadsheet and simulation models are tools which enable users to analyze a large number of variables affecting hardwood material utilization and profit in a systematic fashion. This paper describes two spreadsheet models, SEASaw and SEAIn, and a hardwood sawmill simulator. SEASaw is designed to estimate the amount of conversion from timber to lumber, while SEAIn is a...

  6. Exploring the Nature of the H[subscript 2] Bond. 1. Using Spreadsheet Calculations to Examine the Valence Bond and Molecular Orbital Methods

    ERIC Educational Resources Information Center

    Halpern, Arthur M.; Glendening, Eric D.

    2013-01-01

    A three-part project for students in physical chemistry, computational chemistry, or independent study is described in which they explore applications of valence bond (VB) and molecular orbital-configuration interaction (MO-CI) treatments of H[subscript 2]. Using a scientific spreadsheet, students construct potential-energy (PE) curves for several…

  7. Probabilistic assessment methodology for continuous-type petroleum accumulations

    USGS Publications Warehouse

    Crovelli, R.A.

    2003-01-01

    The analytic resource assessment method, called ACCESS (Analytic Cell-based Continuous Energy Spreadsheet System), was developed to calculate estimates of petroleum resources for the geologic assessment model, called FORSPAN, in continuous-type petroleum accumulations. The ACCESS method is based upon mathematical equations derived from probability theory in the form of a computer spreadsheet system. © 2003 Elsevier B.V. All rights reserved.

  8. Numerical Modelling with Spreadsheets as a Means to Promote STEM to High School Students

    ERIC Educational Resources Information Center

    Benacka, Jan

    2016-01-01

    The article gives an account of an experiment in which sixty-eight high school students aged 16-19 developed spreadsheet applications that simulated fall and projectile motion in the air. The students applied the Euler method to solve the governing differential equations. The aim was to promote STEM to the students and motivate them to study…
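
    The Euler scheme the students applied amounts to a few formula columns per time step. A minimal Python sketch under assumed conditions (quadratic air drag with an arbitrary drag constant; launch speed and angle are illustrative):

```python
import math

# Forward-Euler integration of projectile motion with quadratic air drag.
# The drag constant per unit mass is an assumed, illustrative value.
g = 9.81          # gravitational acceleration, m/s^2
b = 0.02          # drag constant per unit mass, 1/m (assumed)
dt = 0.01         # time step, s

x, y = 0.0, 0.0
speed0, angle = 30.0, math.radians(45.0)
vx, vy = speed0 * math.cos(angle), speed0 * math.sin(angle)

while y >= 0.0:
    speed = math.hypot(vx, vy)
    ax = -b * speed * vx          # drag opposes the velocity
    ay = -g - b * speed * vy
    x, y = x + vx * dt, y + vy * dt
    vx, vy = vx + ax * dt, vy + ay * dt

print(f"range with drag: about {x:.1f} m")
```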

  9. Exploring Customization in Higher Education: An Experiment in Leveraging Computer Spreadsheet Technology to Deliver Highly Individualized Online Instruction to Undergraduate Business Students

    ERIC Educational Resources Information Center

    Kunzler, Jayson S.

    2012-01-01

    This dissertation describes a research study designed to explore whether customization of online instruction results in improved learning in a college business statistics course. The study involved utilizing computer spreadsheet technology to develop an intelligent tutoring system (ITS) designed to: a) collect and monitor individual real-time…

  10. Designing Optical Spreadsheets-Technological Pedagogical Content Knowledge Simulation (S-TPACK): A Case Study of Pre-Service Teachers Course

    ERIC Educational Resources Information Center

    Thohir, M. Anas

    2018-01-01

    In the 21st century, the competence of instructional technological design is important for pre-service physics teachers. This case study described the pre-service physics teachers' design of optical spreadsheet simulation and evaluated teaching and learning the task in the classroom. The case study chose three of thirty pre-service teacher's…

  11. Calculating Optimum sowing factor: A tool to evaluate sowing strategies and minimize seedling production cost

    Treesearch

    Eric van Steenis

    2013-01-01

    This paper illustrates how to use an Excel spreadsheet as a decision-making tool to determine the optimum sowing factor to minimize seedling production cost. Factors incorporated into the spreadsheet calculations include germination percentage, seeder accuracy, cost per seed, cavities per block, costs of handling, thinning, and transplanting labor, and more. In addition to...

  12. Development of a spreadsheet for SNPs typing using Microsoft EXCEL.

    PubMed

    Hashiyada, Masaki; Itakura, Yukio; Takahashi, Shirushi; Sakai, Jun; Funayama, Masato

    2009-04-01

    Single-nucleotide polymorphisms (SNPs) have some characteristics that make them very appropriate for forensic studies and applications. In our institute, SNP typing was performed with TaqMan SNP Genotyping Assays using the ABI PRISM 7500 FAST Real-Time PCR System (Applied Biosystems) and Sequence Detection Software ver. 1.4 (Applied Biosystems). The TaqMan method required two positive controls (Allele 1 and Allele 2) and one negative control to analyze each SNP locus; consequently, at most 24 loci per person could be analyzed on a 96-well plate at one time. If SNP analysis is to be applied to biometric authentication, 48 or more loci are required to identify a person. In this study, we designed a spreadsheet package using Microsoft EXCEL, with population data taken from our 120-SNP population studies. On the spreadsheet, we defined SNP types using 'template files' instead of positive and negative controls. The template files consisted of the results of 94 unknown samples and two negative controls for each of the 120 SNP loci we had previously studied. By the use of these files, the spreadsheet could analyze 96 SNPs on a 96-well plate simultaneously.

  13. SynapticDB, effective web-based management and sharing of data from serial section electron microscopy.

    PubMed

    Shi, Bitao; Bourne, Jennifer; Harris, Kristen M

    2011-03-01

    Serial section electron microscopy (ssEM) is rapidly expanding as a primary tool to investigate synaptic circuitry and plasticity. The ultrastructural images collected through ssEM are content rich and their comprehensive analysis is beyond the capacity of an individual laboratory. Hence, sharing ultrastructural data is becoming crucial to visualize, analyze, and discover the structural basis of synaptic circuitry and function in the brain. We devised a web-based management system called SynapticDB (http://synapses.clm.utexas.edu/synapticdb/) that catalogues, extracts, analyzes, and shares experimental data from ssEM. The management strategy involves a library with check-in, checkout and experimental tracking mechanisms. We developed a series of spreadsheet templates (MS Excel, Open Office spreadsheet, etc.) that guide users in methods of data collection, structural identification, and quantitative analysis through ssEM. SynapticDB provides flexible access to complete templates, or to individual columns with instructional headers that can be selected to create user-defined templates. New templates can also be generated and uploaded. Research progress is tracked via experimental note management and dynamic PDF forms that allow new investigators to follow standard protocols and experienced researchers to expand the range of data collected and shared. The combined use of templates and tracking notes ensures that the supporting experimental information is populated into the database and associated with the appropriate ssEM images and analyses. We anticipate that SynapticDB will serve future meta-analyses towards new discoveries about the composition and circuitry of neurons and glia, and new understanding about structural plasticity during development, behavior, learning, memory, and neuropathology.

  14. GeoRePORT Input Spreadsheet

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    St. Onge, Melinda

    The Geothermal Resource Portfolio Optimization and Reporting Tool (GeoRePORT) was developed as a way to distill large amounts of geothermal project data into an objective, reportable data set that can be used to communicate with experts and non-experts. GeoRePORT summarizes (1) resource grade and certainty and (2) project readiness. This Excel file allows users to easily navigate through the resource grade attributes, using drop-down menus to pick grades and project readiness, and then easily print and share the summary with others. This spreadsheet is the first draft, for which we are soliciting expert feedback. The spreadsheet will be updated based on this feedback to increase usability of the tool. If you have any comments, please feel free to contact us.

  15. Intelligence Dissemination to the Warfighter

    DTIC Science & Technology

    2007-12-01

    that prevent other JWICS users from exchanging data. The CIA conducts most of their business on the CIAnet, which can pull data from JWICS but...data. Spreadsheets and word processors, in order to retain a high level of user-friendliness, handle several complex background processes that...the “complex adaptive systems”, where the onus is placed equally on the analyst and on the tools to be receptive and adaptable. It is the

  16. Costing nursing education programs. It's as easy as 1-2-3.

    PubMed

    Fisher, M L; Hume, R; Emerick, R

    1998-01-01

    Staff development departments are pressured to reveal the costs of their educational programs and to compete with outside vendors for programming. The process of implementing a spreadsheet template for costing out staff development programs is described. The template is easy to use and supports "what if" analysis. This model allows educators to evaluate cost implications of curricular decisions and to better negotiate with internal and external customers.

  17. An Excel Macro to Plot the HFE-Diagram to Identify Sea Water Intrusion Phases.

    PubMed

    Giménez-Forcada, Elena; Sánchez San Román, F Javier

    2015-01-01

    A hydrochemical facies evolution diagram (HFE-D) is a multirectangular diagram, which is a useful tool in the interpretation of sea water intrusion processes. This method note describes a simple method for generating an HFE-D plot using the spreadsheet software package, Microsoft Excel. The code was applied to groundwater from the alluvial coastal plain of Grosseto (Tuscany, Italy), which is characterized by a complex salinization process in which sea water mixes with sulfate or bicarbonate recharge water. © 2014, National GroundWater Association.

  18. [Determine and Implement Updates to Be Made to MODEAR (Mission Operations Data Enterprise Architecture Repository)

    NASA Technical Reports Server (NTRS)

    Fanourakis, Sofia

    2015-01-01

    My main project was to determine and implement updates to be made to MODEAR (Mission Operations Data Enterprise Architecture Repository) process definitions to be used for CST-100 (Crew Space Transportation-100) related missions. Emphasis was placed on the scheduling aspect of the processes. In addition, I was to complete other tasks as given. Some of the additional tasks were: to create pass-through command look-up tables for the flight controllers, finish one of the MDT (Mission Operations Directorate Display Tool) displays, gather data on what is included in the CST-100 public data, develop a VBA (Visual Basic for Applications) script to create a csv (Comma-Separated Values) file with specific information from spreadsheets containing command data, create a command script for the November MCC-ASIL (Mission Control Center-Avionics System Integration Laboratory) testing, and take notes for one of the TCVB (Terminal Configured Vehicle B-737) meetings. In order to make progress on my main project, I scheduled meetings with the appropriate subject matter experts, prepared material for the meetings, and assisted in the discussions in order to understand the process or processes at hand. After such discussions I made updates to various MODEAR processes and process graphics. These meetings have resulted in significant updates to the processes that were discussed. In addition, the discussions have helped the departments responsible for these processes better understand the work ahead and provided material to help document how their products are created. I completed my other tasks utilizing resources available to me and, when necessary, consulting with the subject matter experts. Outputs resulting from my other tasks were: two completed and one partially completed pass-through command look-up tables for the flight controllers, significant updates to one of the MDT displays, a spreadsheet containing data on what is included in the CST-100 public data, a tool to create a csv file with specific information from spreadsheets containing command data, a command script for the November MCC-ASIL testing which resulted in a successful test day identifying several potential issues, and notes from one of the TCVB meetings that were used to keep the teams up to date on what was discussed and decided. I have learned a great deal working at NASA these last four months. I was able to meet and work with amazing individuals, further develop my technical knowledge, expand my knowledge base regarding human spaceflight, and contribute to the CST-100 missions. My work at NASA has strengthened my desire to continue my education in order to make further contributions to the field, and has given me the opportunity to see the advantages of a career at NASA.

  19. Perceptions of the software skills of graduates by employers in the financial services industry

    NASA Astrophysics Data System (ADS)

    Kyng, Tim; Tickle, Leonie; Wood, Leigh N.

    2013-12-01

    Software, particularly spreadsheet software, is ubiquitous in the financial services workplace. Yet little is known about the extent to which universities should, and do, prepare graduates for this aspect of the modern workplace. We have investigated this issue through a survey of financial services employers of graduates, the results of which are reported in this paper, as well as surveys of university graduates and academics, reported previously. Financial services employers rate software skills as important, would like their employees to be more highly skilled in the use of such software, and tend to prefer 'on-the-job' training rather than university training for statistical, database and specialized actuarial/financial software. There is a perception among graduates that employers do not provide adequate formal workplace training in the use of technical software.

  20. Mediation of a Teacher's Development of Spreadsheets as an Instrument to Support Pupils' Inquiry in Mathematics

    ERIC Educational Resources Information Center

    Fuglestad, Anne Berit

    2013-01-01

    This paper presents a case of collaboration with three teachers and a didactician on task development within a developmental research project based on ideas of inquiry and learning community. The teachers' goal was to utilise a spreadsheet to orchestrate the pupils' investigations and build a library of tasks for the classroom. The focus is on one…

  1. A Novel Real-Time Data Acquisition Using an Excel Spreadsheet in Pendulum Experiment Tool with Light-Based Timer

    ERIC Educational Resources Information Center

    Adhitama, Egy; Fauzi, Ahmad

    2018-01-01

    In this study, a pendulum experimental tool with a light-based timer has been developed to measure the period of a simple pendulum. The obtained data was automatically recorded in an Excel spreadsheet. The intensity of monochromatic light, sensed by a 3DU5C phototransistor, dynamically changes as the pendulum swings. The changed intensity varies…

  2. Developing Students' Understanding of Co-Opetition and Multilevel Inventory Management Strategies in Supply Chains: An In-Class Spreadsheet Simulation Exercise

    ERIC Educational Resources Information Center

    Fetter, Gary; Shockley, Jeff

    2014-01-01

    Instructors look for ways to explain to students how supply chains can be constructed so that competing suppliers can work together to improve inventory management performance (i.e., a phenomenon known as co-opetition). An Excel spreadsheet-driven simulation is presented that models a complete multilevel supply chain system--customer, retailer,…

  3. The Impacts of Mathematical Representations Developed through Webquest and Spreadsheet Activities on the Motivation of Pre-Service Elementary School Teachers

    ERIC Educational Resources Information Center

    Halat, Erdogan; Peker, Murat

    2011-01-01

    The purpose of this study was to compare the influence of instruction using WebQuest activities with the influence of an instruction using spreadsheet activities on the motivation of pre-service elementary school teachers in mathematics teaching course. There were a total of 70 pre-service elementary school teachers involved in this study. Thirty…

  4. Great Basin NV Play Fairway Analysis - Carson Sink

    DOE Data Explorer

    Jim Faulds

    2015-10-28

    All datasets and products specific to the Carson Sink Basin. Includes a packed ArcMap (.mpk), individually zipped shapefiles, and a file geodatabase for the Carson Sink area; a GeoSoft Oasis montaj project containing GM-SYS 2D gravity profiles along the trace of our seismic reflection lines; a 3D model in EarthVision; spreadsheet of links to published maps; and spreadsheets of well data.

  5. Students' Meaning Making in Science: Solving Energy Resource Problems in Virtual Worlds Combined with Spreadsheets to Develop Graphs

    ERIC Educational Resources Information Center

    Krange, Ingeborg; Arnseth, Hans Christian

    2012-01-01

    The aim of this study is to scrutinize the characteristics of conceptual meaning making when students engage with virtual worlds in combination with a spreadsheet with the aim to develop graphs. We study how these tools and the representations they contain or enable students to construct serve to influence their understanding of energy resource…

  6. NV PFA - Steptoe Valley

    DOE Data Explorer

    Jim Faulds

    2015-10-29

    All datasets and products specific to the Steptoe Valley model area. Includes a packed ArcMap project (.mpk), individually zipped shapefiles, and a file geodatabase for the northern Steptoe Valley area; a GeoSoft Oasis montaj project containing GM-SYS 2D gravity profiles along the trace of our seismic reflection lines; a 3D model in EarthVision; spreadsheet of links to published maps; and spreadsheets of well data.

  7. ThinTool: a spreadsheet model to evaluate fuel reduction thinning cost, net energy output, and nutrient impacts

    Treesearch

    Sang-Kyun Han; Han-Sup Han; William J. Elliot; Edward M. Bilek

    2017-01-01

    We developed a spreadsheet-based model, named ThinTool, to evaluate the cost of mechanical fuel reduction thinning including biomass removal, to predict net energy output, and to assess nutrient impacts from thinning treatments in northern California and southern Oregon. A combination of literature reviews, field-based studies, and contractor surveys was used to...

  8. A user-friendly tool for incremental haemodialysis prescription.

    PubMed

    Casino, Francesco Gaetano; Basile, Carlo

    2018-01-05

    There is a recently heightened interest in incremental haemodialysis (IHD), the main advantage of which is likely a better preservation of the residual kidney function of the patients. The implementation of IHD, however, is hindered by many factors, among them the mathematical complexity of its prescription. The aim of our study was to design a user-friendly tool for IHD prescription, consisting of only a few rows of a common spreadsheet. The keystone of our spreadsheet was the following fundamental concept: the dialysis dose to be prescribed in IHD depends only on the normalized urea clearance provided by the native kidneys (KRUn) of the patient for each frequency of treatment, according to the variable target model recently proposed by Casino and Basile (The variable target model: a paradigm shift in the incremental haemodialysis prescription. Nephrol Dial Transplant 2017; 32: 182-190). The first step was to put in sequence a series of equations in order to calculate, first, KRUn and, then, the key parameters to be prescribed for an adequate IHD; the second step was to compare KRUn values obtained with our spreadsheet with KRUn values obtainable with the gold-standard Solute-solver (Daugirdas JT et al., Solute-solver: a web-based tool for modeling urea kinetics for a broad range of hemodialysis schedules in multiple patients. Am J Kidney Dis 2009; 54: 798-809) in a sample of 40 incident haemodialysis patients. Our spreadsheet provided excellent results. The differences with Solute-solver were clinically negligible. This was confirmed by the Bland-Altman plot built to analyse the agreement between KRUn values obtained with the two methods: the difference was 0.07 ± 0.05 mL/min/35 L. Our spreadsheet is a user-friendly tool able to provide clinically acceptable results in IHD prescription. Two immediate consequences could follow: (i) a larger dissemination of IHD might occur; and (ii) our spreadsheet could represent a useful tool for the urgently needed full-fledged clinical trial comparing IHD with standard thrice-weekly HD. © The Author(s) 2018. Published by Oxford University Press on behalf of ERA-EDTA. All rights reserved.
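
    The Bland-Altman analysis used above to establish agreement is itself a spreadsheet-sized computation: the bias is the mean of the paired differences and the limits of agreement are the bias plus or minus 1.96 standard deviations. A Python sketch with made-up KRUn values in mL/min/35 L (the actual comparison used 40 patients):

```python
import statistics

# Bland-Altman agreement between two methods; the paired KRUn values
# below are invented for illustration, not data from the study.
spreadsheet = [2.10, 1.85, 2.40, 1.60, 2.05]
solute_solver = [2.02, 1.80, 2.35, 1.55, 1.98]

diffs = [a - b for a, b in zip(spreadsheet, solute_solver)]
bias = statistics.mean(diffs)
sd = statistics.stdev(diffs)
print(f"bias = {bias:.3f} mL/min/35L, limits of agreement = "
      f"[{bias - 1.96 * sd:.3f}, {bias + 1.96 * sd:.3f}]")
```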

  9. SU-F-BRB-16: A Spreadsheet Based Automatic Trajectory GEnerator (SAGE): An Open Source Tool for Automatic Creation of TrueBeam Developer Mode Robotic Trajectories

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Etmektzoglou, A; Mishra, P; Svatos, M

    Purpose: To automate creation and delivery of robotic linac trajectories with TrueBeam Developer Mode, an open source spreadsheet-based trajectory generation tool has been developed, tested and made freely available. The computing power inherent in a spreadsheet environment plus additional functions programmed into the tool insulate users from the underlying schema tedium and allow easy calculation, parameterization, graphical visualization, validation and finally automatic generation of Developer Mode XML scripts which are directly loadable on a TrueBeam linac. Methods: The robotic control system platform that allows total coordination of potentially all linac moving axes with beam (continuous, step-and-shoot, or combination thereof) becomes available in TrueBeam Developer Mode. Many complex trajectories are either geometric or can be described in analytical form, making the computational power, graphing and programmability available in a spreadsheet environment an easy and ideal vehicle for automatic trajectory generation. The spreadsheet environment allows also for parameterization of trajectories thus enabling the creation of entire families of trajectories using only a few variables. Standard spreadsheet functionality has been extended for powerful movie-like dynamic graphic visualization of the gantry, table, MLC, room, lasers, 3D observer placement and beam centerline all as a function of MU or time, for analysis of the motions before requiring actual linac time. Results: We used the tool to generate and deliver extended SAD “virtual isocenter” trajectories of various shapes such as parameterized circles and ellipses. We also demonstrated use of the tool in generating linac couch motions that simulate respiratory motion using analytical parameterized functions. Conclusion: The SAGE tool is a valuable resource to experiment with families of complex geometric trajectories for a TrueBeam linac. It makes Developer Mode more accessible as a vehicle to quickly translate research ideas into machine readable scripts without programming knowledge. As an open source initiative, it also enables researcher collaboration on future developments. I am a full time employee at Varian Medical Systems, Palo Alto, California.

  10. Comparison between In-house developed and Diamond commercial software for patient specific independent monitor unit calculation and verification with heterogeneity corrections.

    PubMed

    Kuppusamy, Vijayalakshmi; Nagarajan, Vivekanandan; Jeevanandam, Prakash; Murugan, Lavanya

    2016-02-01

    The study aimed to compare two different monitor unit (MU) or dose verification software packages in volumetric modulated arc therapy (VMAT), using a modified Clarkson's integration technique for 6 MV photon beams. An in-house Excel spreadsheet-based monitor unit verification calculation (MUVC) program and PTW's DIAMOND secondary check software (SCS), version 6, were used as a secondary check to verify the monitor units (MU) or dose calculated by the treatment planning system (TPS). In this study 180 patients were grouped into 61 head and neck, 39 thorax and 80 pelvic sites. Verification plans were created using the PTW OCTAVIUS-4D phantom and measured using the 729-chamber detector array, with the isocentre as the point of measurement for each field. In the analysis of 154 clinically approved VMAT plans with isocentre at a region above -350 HU, using heterogeneity corrections, the in-house spreadsheet-based MUVC program and Diamond SCS showed good agreement with the TPS. The overall average percentage deviations for all sites were (-0.93% ± 1.59%) and (1.37% ± 2.72%) for the in-house Excel spreadsheet-based MUVC program and Diamond SCS, respectively. For 26 clinically approved VMAT plans with isocentre at a region below -350 HU, both the in-house spreadsheet-based MUVC program and Diamond SCS showed higher variations. It can be concluded that for patient-specific quality assurance (QA), the in-house Excel spreadsheet-based MUVC program and Diamond SCS can be used as a simple and fast accompaniment to measurement-based verification for plans with isocentre at a region above -350 HU. Copyright © 2016 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
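
    Clarkson-style integration, the technique both programs build on, reduces an irregular field to an average of radius-dependent scatter contributions taken over equal angular sectors around the calculation point. A Python sketch of that averaging step follows; the scatter-versus-radius table and the field outline are hypothetical, included only to make the example run:

```python
import math

# Sketch of Clarkson-style sector integration: average a radius-dependent
# scatter quantity over equal angular sectors of an irregular field.
# The scatter table is hypothetical, not clinical beam data.
SCATTER_TABLE = {1.0: 0.05, 3.0: 0.12, 5.0: 0.17, 8.0: 0.21, 12.0: 0.24}

def scatter(r):
    """Linearly interpolate the (hypothetical) scatter-vs-radius table."""
    radii = sorted(SCATTER_TABLE)
    if r <= radii[0]:
        return SCATTER_TABLE[radii[0]]
    for lo, hi in zip(radii, radii[1:]):
        if r <= hi:
            f = (r - lo) / (hi - lo)
            return SCATTER_TABLE[lo] + f * (SCATTER_TABLE[hi] - SCATTER_TABLE[lo])
    return SCATTER_TABLE[radii[-1]]

# Radii from the calculation point to the field edge, one per 10-degree sector.
sector_radii = [4.0 + 1.5 * math.sin(math.radians(10 * i)) for i in range(36)]
average = sum(scatter(r) for r in sector_radii) / len(sector_radii)
print(f"sector-averaged scatter contribution: {average:.3f}")
```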

  11. Spreadsheet WATERSHED modeling for nonpoint-source pollution management in a Wisconsin basin

    USGS Publications Warehouse

    Walker, J.F.; Pickard, S.A.; Sonzogni, W.C.

    1989-01-01

    Although several sophisticated nonpoint pollution models exist, few are available that are easy to use, cover a variety of conditions, and integrate a wide range of information to allow managers and planners to assess different control strategies. Here, a straightforward pollutant input accounting approach is presented in the form of an existing model (WATERSHED) that has been adapted to run on modern electronic spreadsheets. As an application, WATERSHED is used to assess options to improve the quality of highly eutrophic Delavan Lake in Wisconsin. WATERSHED is flexible in that several techniques, such as the Universal Soil Loss Equation or unit-area loadings, can be used to estimate nonpoint-source inputs. Once the model parameters are determined (and calibrated, if possible), the spreadsheet features can be used to conduct a sensitivity analysis of management options. In the case of Delavan Lake, it was concluded that, although some nonpoint controls were cost-effective, the overall reduction in phosphorus would be insufficient to measurably improve water quality.
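
    The Universal Soil Loss Equation mentioned above is a simple product of empirical factors, which is exactly what makes it convenient to host in a spreadsheet cell:

```latex
A = R \cdot K \cdot LS \cdot C \cdot P
```

    Here A is the average annual soil loss per unit area, R the rainfall erosivity factor, K the soil erodibility factor, LS the slope length-steepness factor, C the cover-management factor, and P the support-practice factor.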

  12. Intuitive Tools for the Design and Analysis of Communication Payloads for Satellites

    NASA Technical Reports Server (NTRS)

    Culver, Michael R.; Soong, Christine; Warner, Joseph D.

    2014-01-01

    In an effort to make future communications satellite payload design more efficient and accessible, two tools were created with intuitive graphical user interfaces (GUIs). The first tool allows payload designers to graphically design their payload by using simple drag and drop of payload components onto a design area within the program. Information about each picked component is pulled from a database of common space-qualified communication components sold by commercial companies. Once a design is completed, various reports can be generated, such as the Master Equipment List. The second tool is a link budget calculator designed specifically for ease of use. Other features of this tool include being able to access a database of NASA ground-based apertures for near Earth and Deep Space communication, the Tracking and Data Relay Satellite System (TDRSS) base apertures, and information about the solar system relevant to link budget calculations. The link budget tool allows for over 50 different combinations of user inputs, eliminating the need for multiple spreadsheets and the user errors associated with using them. Both of the aforementioned tools increase the productivity of space communication systems designers, and have the colloquial latitude to allow non-communication experts to design preliminary communication payloads.
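
    In its simplest free-space form, the kind of link budget the second tool computes is a sum of gains and losses in decibels; the tool's many input combinations presumably add further terms (pointing, atmospheric, and polarization losses, margins) to the same basic identity:

```latex
P_{rx} = P_{tx} + G_{tx} + G_{rx} - 20\log_{10}\!\left(\frac{4\pi d}{\lambda}\right)
```

    with transmit and receive powers in dBm (or dBW), antenna gains in dBi, and the last term the free-space path loss in dB for path length d and wavelength λ.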

  13. Citation Analysis of Hepatitis Monthly by Journal Citation Report (ISI), Google Scholar, and Scopus.

    PubMed

    Miri, Seyyed Mohammad; Raoofi, Azam; Heidari, Zahra

    2012-09-01

    Citation analysis, as one of the most widely used methods of bibliometrics, can be used for computing various impact measures for scholars based on data from citation databases. Journal Citation Reports (JCR) from Thomson Reuters provides an annual report in the form of an impact factor (IF) for each journal. We aimed to evaluate the citation parameters of Hepatitis Monthly in JCR for 2010 and compare them with Google Scholar (GS) and Scopus (Sc). All articles of Hepat Mon published in 2008 and 2009 that had been cited in 2010 in three databases, including WoS, Sc and GS, were gathered in a spreadsheet. The IFs were manually calculated. Among the 104 total published articles, the accuracy rates of GS and Sc in recording the total number of articles were 96% and 87.5%, respectively. There was a difference between the IFs among the three databases (0.793 in ISI [Institute for Scientific Information], 0.945 in Sc and 0.85 in GS). The missing rate of citations in ISI was 4% overall. Original articles were the main cited types, whereas guidelines and clinical challenges were the least cited. None of the three databases succeeded in recording all articles published in the journal. Despite the high sensitivity of GS compared with Sc, it cannot be a reliable source for indexing, since GS lacks screening in its data collection and has low specificity. Using an average of the three IFs is suggested to find the correct IF. Editors should be more aware of the role of original articles in increasing the IF and the potential efficacy of review articles on the long-term impact factor.
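
    The manually calculated IFs follow the standard JCR definition: citations received in the census year to items published in the two preceding years, divided by the number of citable items published in those years. For the 2010 figures discussed above:

```latex
\mathrm{IF}_{2010} = \frac{C_{2010}(\text{items published in 2008--2009})}{N_{2008} + N_{2009}}
```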

  14. A Spreadsheet for the Mixing of Rows of Jets with Confined Crossflow in a Rectangular Duct. Supplement

    NASA Technical Reports Server (NTRS)

    Holderman, James D.; Clisset, James R.; Moder, Jeffrey P.

    2010-01-01

    This is a printout of the supplemental spreadsheet that is a supplement to the document found in NASA/TM-2010-216100. The calculations for cases of opposed rows of jets with the orifices on one side shifted show that staggering can improve the mixing, particularly for cases where jets would overpenetrate slightly if the orifices were in an aligned configuration.

  15. Teaching graphical simulations of Fourier series expansion of some periodic waves using spreadsheets

    NASA Astrophysics Data System (ADS)

    Singh, Iqbal; Kaur, Bikramjeet

    2018-05-01

    The present article demonstrates a way of programming using an Excel spreadsheet to teach Fourier series expansion in school/colleges without the knowledge of any typical programming language. By using this, a student learns to approximate partial sum of the n terms of Fourier series for some periodic signals such as square wave, saw tooth wave, half wave rectifier and full wave rectifier signals.
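
    For the square wave, the partial sum the spreadsheet builds term by term runs over the odd harmonics only. A minimal Python equivalent (unit amplitude assumed):

```python
import math

# Partial sum of the Fourier series of a unit-amplitude square wave,
# the same quantity the spreadsheet accumulates term by term.
def square_wave_partial_sum(x, n_terms):
    """(4/pi) * sum of sin(kx)/k over the first n_terms odd harmonics k."""
    return (4.0 / math.pi) * sum(
        math.sin((2 * k - 1) * x) / (2 * k - 1) for k in range(1, n_terms + 1)
    )

# Convergence toward the true value f(pi/2) = 1 as terms are added.
for n in (1, 5, 50):
    print(n, round(square_wave_partial_sum(math.pi / 2, n), 4))
```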

  16. Spreadsheet Application Showing the Proper Elevation Angle, Points of Shot and Impact of a Projectile

    ERIC Educational Resources Information Center

    Benacka, Jan

    2015-01-01

    This paper provides the formula for the elevation angle at which a projectile has to be fired in a vacuum from a general position to hit a target at a given distance. A spreadsheet application that models the trajectory is presented, and the problem of finding the points of shot and impact of a projectile moving in a vacuum if three points of the…
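
    The paper derives the angle for a general firing position; in the special case where the point of shot and the target lie at the same height, any such formula must reduce to the familiar vacuum-range result:

```latex
R = \frac{v_0^2 \sin 2\theta}{g}
\quad\Longrightarrow\quad
\theta = \tfrac{1}{2}\arcsin\!\left(\frac{g R}{v_0^2}\right)
```

    with the complementary solution 90° − θ giving the high-arc trajectory to the same target.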

  17. A Spreadsheet for the Mixing of a Row of Jets with a Confined Crossflow

    NASA Technical Reports Server (NTRS)

    Holderman, J. D.; Smith, T. D.; Clisset, J. R.; Lear, W. E.

    2005-01-01

    An interactive computer code, written with a readily available software program, Microsoft Excel (Microsoft Corporation, Redmond, WA), is presented that displays 3-D oblique plots of a conserved scalar distribution downstream of jets mixing with a confined crossflow, for a single row, double rows, or opposed rows of jets, with or without flow area convergence and/or a non-uniform crossflow scalar distribution. This project used a previously developed empirical model of jets mixing in a confined crossflow to create a Microsoft Excel spreadsheet that can output the profiles of a conserved scalar for jets injected into a confined crossflow, given several input variables. The program uses multiple spreadsheets in a single Microsoft Excel notebook to carry out the modeling. The first sheet contains the main program, controls for the type of problem to be solved, and convergence criteria. The first sheet also provides for input of the specific geometry and flow conditions. The second sheet presents the results calculated with this routine to show the effects on the mixing of varying flow and geometric parameters. Comparisons are also made between results from the version of the empirical correlations implemented in the spreadsheet and the versions originally written in Applesoft BASIC (Apple Computer, Cupertino, CA) in the 1980s.

  18. A Spreadsheet for the Mixing of a Row of Jets with a Confined Crossflow. Supplement

    NASA Technical Reports Server (NTRS)

    Holderman, J. D.; Smith, T. D.; Clisset, J. R.; Lear, W. E.

    2005-01-01

    An interactive computer code, written with a readily available software program, Microsoft Excel (Microsoft Corporation, Redmond, WA), is presented that displays 3-D oblique plots of a conserved scalar distribution downstream of jets mixing with a confined crossflow, for a single row, double rows, or opposed rows of jets, with or without flow area convergence and/or a non-uniform crossflow scalar distribution. This project used a previously developed empirical model of jets mixing in a confined crossflow to create a Microsoft Excel spreadsheet that can output the profiles of a conserved scalar for jets injected into a confined crossflow, given several input variables. The program uses multiple spreadsheets in a single Microsoft Excel notebook to carry out the modeling. The first sheet contains the main program, controls for the type of problem to be solved, and convergence criteria. The first sheet also provides for input of the specific geometry and flow conditions. The second sheet presents the results calculated with this routine to show the effects on the mixing of varying flow and geometric parameters. Comparisons are also made between results from the version of the empirical correlations implemented in the spreadsheet and the versions originally written in Applesoft BASIC (Apple Computer, Cupertino, CA) in the 1980s.

  19. Intermediate-sized natural gas fueled carbonate fuel cell power plants

    NASA Astrophysics Data System (ADS)

    Sudhoff, Frederick A.; Fleming, Donald K.

    1994-04-01

    This executive summary describes the accomplishments of the joint US Department of Energy (DOE) Morgantown Energy Technology Center (METC) and M-C POWER Corporation Cooperative Research and Development Agreement (CRADA) No. 93-013. This study addresses intermediate power plant sizes between 2 megawatts (MW) and 200 MW. A 25 MW natural-gas-fueled carbonate fuel cell power plant was chosen for this purpose. In keeping with recent designs, the fuel cell will operate under approximately three atmospheres of pressure. An expander/alternator is utilized to expand exhaust gas to atmospheric conditions and generate additional power. A steam-bottoming cycle is not included in this study because it is not believed to be cost effective for this system size. This study also compares the simplicity and accuracy of a spreadsheet-based simulation with those of a full Advanced System for Process Engineering (ASPEN) simulation. The simple spreadsheet model can be run entirely on a personal computer, can be made available to all users, and is particularly advantageous to the small business user.

  20. Real-time PCR array as a universal platform for the detection of genetically modified crops and its application in identifying unapproved genetically modified crops in Japan.

    PubMed

    Mano, Junichi; Shigemitsu, Natsuki; Futo, Satoshi; Akiyama, Hiroshi; Teshima, Reiko; Hino, Akihiro; Furui, Satoshi; Kitta, Kazumi

    2009-01-14

    We developed a novel type of real-time polymerase chain reaction (PCR) array with TaqMan chemistry as a platform for the comprehensive and semiquantitative detection of genetically modified (GM) crops. Thirty primer-probe sets for the specific detection of GM lines, recombinant DNA (r-DNA) segments, endogenous reference genes, and donor organisms were synthesized, and a 96-well PCR plate was prepared with a different primer-probe set in each well as the real-time PCR array. The specificity and sensitivity of the array were evaluated. A comparative analysis with these data and publicly available information on GM crops approved in Japan allowed us to evaluate the possibility of unapproved GM crop contamination. Furthermore, we designed a Microsoft Excel spreadsheet application, Unapproved GMO Checker version 2.01, which helps process all the data of real-time PCR arrays for easy evaluation of possible unapproved GM crop contamination. The spreadsheet is available free of charge at http://cse.naro.affrc.go.jp/jmano/index.html.

  1. Mass-balance measurements in Alaska and suggestions for simplified observation programs

    USGS Publications Warehouse

    Trabant, D.C.; March, R.S.

    1999-01-01

    US Geological Survey glacier fieldwork in Alaska includes repetitious measurements, corrections for leaning or bending stakes, an ability to reliably measure seasonal snow as deep as 10 m, absolute identification of summer surfaces in the accumulation area, and annual evaluation of internal accumulation, internal ablation, and glacier-thickness changes. Prescribed field measurement and note-taking techniques help eliminate field errors and expedite the interpretative process. In the office, field notes are transferred to computerized spreadsheets for analysis, release on the World Wide Web, and archival storage. The spreadsheets have error traps to help eliminate note-taking and transcription errors. Rigorous error analysis ends when mass-balance measurements are extrapolated and integrated with area to determine glacier and basin mass balances. Unassessable errors in the glacier and basin mass-balance data reduce the value of the data set for correlations with climate change indices. The minimum glacier mass-balance program has at least three measurement sites on a glacier, and the measurements must include the seasonal components of mass balance as well as the annual balance.

  2. Matlab-Excel Interface for OpenDSS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    The software allows users of the OpenDSS grid modeling software to access their load flow models through a GUI developed in MATLAB. The circuit definitions are entered into a Microsoft Excel spreadsheet, which makes circuit creation and editing a much simpler process than the basic text-based editors used in the native OpenDSS interface. Plot tools have been developed which can be accessed through a MATLAB GUI once the desired parameters have been simulated.

  3. Strategy And The Spreadsheet: Optimizing The Total Army To Satisfy Both

    DTIC Science & Technology

    2016-02-11

    historically reduces military end strength at the conclusion of major conflicts. The Budget Control Act of 2011 imposed sequestration spending limits on...the military that began the process of drawing down the military through fiscal year 2021. While the 2016 defense budget delays sequestration cuts... budget by a wide margin, has started repeating a historical cycle of budget-driven defense cuts. The Army’s large force represents an attractive

  4. Simulation of axonal excitability using a Spreadsheet template created in Microsoft Excel.

    PubMed

    Brown, A M

    2000-08-01

    The objective of the present study was to implement an established simulation protocol (A.M. Brown, A methodology for simulating biological systems using Microsoft Excel, Comp. Methods Prog. Biomed. 58 (1999) 181-90) to model axonal excitability. The simulation protocol involves the use of in-cell formulas directly typed into a spreadsheet and does not require any programming skills or use of the macro language. Once the initial spreadsheet template has been set up, the simulations described in this paper can be executed with a few simple keystrokes. The model axon contained voltage-gated ion channels that were modeled using Hodgkin-Huxley-style kinetics. The basic properties of axonal excitability modeled were: (1) threshold of action potential firing, demonstrating that not only are the stimulus amplitude and duration critical in the generation of an action potential, but also the resting membrane potential; (2) refractoriness, the phenomenon of reduced excitability immediately following an action potential. The difference between the absolute refractory period, when no amount of stimulus will elicit an action potential, and relative refractory period, when an action potential may be generated by applying increased stimulus, was demonstrated with regard to the underlying state of the Na(+) and K(+) channels; (3) temporal summation, a process by which two sub-threshold stimuli can unite to elicit an action potential, was shown to be due to conductance changes outlasting the first stimulus and summing with the second stimulus-induced conductance changes to drive the membrane potential past threshold; (4) anode break excitation, where membrane hyperpolarization was shown to produce an action potential by removing Na(+) channel inactivation that is present at resting membrane potential. The simulations described in this paper provide insights into mechanisms of axonal excitation that can be carried out by following an easily understood protocol.
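
    The in-cell-formula approach described above amounts to forward-Euler integration of the Hodgkin-Huxley equations, one spreadsheet row per time step. A compact Python sketch with the standard squid-axon parameters (the stimulus amplitude and duration are arbitrary choices):

```python
import math

# Hodgkin-Huxley membrane patch integrated with forward Euler, mirroring
# the row-per-time-step spreadsheet scheme. Standard squid-axon constants;
# the stimulus pulse is an arbitrary, illustrative choice.
C, G_NA, G_K, G_L = 1.0, 120.0, 36.0, 0.3   # uF/cm^2 and mS/cm^2
E_NA, E_K, E_L = 50.0, -77.0, -54.387       # reversal potentials, mV

def a_m(v): return 0.1 * (v + 40) / (1 - math.exp(-(v + 40) / 10))
def b_m(v): return 4.0 * math.exp(-(v + 65) / 18)
def a_h(v): return 0.07 * math.exp(-(v + 65) / 20)
def b_h(v): return 1.0 / (1 + math.exp(-(v + 35) / 10))
def a_n(v): return 0.01 * (v + 55) / (1 - math.exp(-(v + 55) / 10))
def b_n(v): return 0.125 * math.exp(-(v + 65) / 80)

dt, v = 0.01, -65.0                 # time step (ms), resting potential (mV)
m, h, n = 0.053, 0.596, 0.317       # steady-state gate values at rest
v_max = v
for step in range(int(20 / dt)):    # simulate 20 ms
    i_stim = 15.0 if step * dt < 1.0 else 0.0   # 1 ms, 15 uA/cm^2 pulse
    i_ion = (G_NA * m**3 * h * (v - E_NA)
             + G_K * n**4 * (v - E_K)
             + G_L * (v - E_L))
    v += dt * (i_stim - i_ion) / C
    m += dt * (a_m(v) * (1 - m) - b_m(v) * m)
    h += dt * (a_h(v) * (1 - h) - b_h(v) * h)
    n += dt * (a_n(v) * (1 - n) - b_n(v) * n)
    v_max = max(v_max, v)

print(f"peak membrane potential: {v_max:.1f} mV")  # well above 0 mV => spike
```

    Adding a second stimulus pulse at a varying delay is enough to explore the refractoriness and temporal-summation behaviours described in the abstract.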

  5. Organizational Linkages: Understanding the Productivity Paradox,

    DTIC Science & Technology

    1994-01-01

    students were asked to make a decision regarding production scheduling. Some used a Lotus spreadsheet’s what-if capacity, which enabled them to...the degree to which managers and MBA students believed that they make better decisions using what-if spreadsheet models, despite the fact that their...for this system is Naylor et al.’s (1980) view of behavior in organizations. When Pritchard and his students (Pritchard et al., 1988) applied this

  6. Maxine: A spreadsheet for estimating dose from chronic atmospheric radioactive releases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jannik, Tim; Bell, Evaleigh; Dixon, Kenneth

    MAXINE is an EXCEL© spreadsheet used to estimate dose to individuals for routine and accidental atmospheric releases of radioactive materials. MAXINE does not contain an atmospheric dispersion model; rather, doses are estimated using air and ground concentrations as input. Minimal input is required to run the program, and site-specific parameters are used when possible. A complete code description, model verification, and a user's manual are included.

  7. Integrating numerical computation into the undergraduate education physics curriculum using spreadsheet excel

    NASA Astrophysics Data System (ADS)

    Fauzi, Ahmad

    2017-11-01

    Numerical computation has many pedagogical advantages: it develops analytical and problem-solving skills, helps students learn through visualization, and enhances physics education. Unfortunately, numerical computation is not taught to undergraduate physics education students in Indonesia. Incorporating numerical computation into the undergraduate physics education curriculum presents many challenges, chief among them a dense curriculum that makes it difficult to add a new numerical computation course and the fact that most students have no programming experience. In this research, we used a case study to review how to integrate numerical computation into the undergraduate physics education curriculum. The participants were 54 students in the fourth semester of the physics education department. We concluded that numerical computation could be integrated into the undergraduate physics education curriculum using Excel spreadsheets combined with another course. The results of this research complement studies on how to integrate numerical computation into physics learning using Excel spreadsheets.

  8. Microsoft excel spreadsheets for calculation of P-V-T relations and thermodynamic properties from equations of state of MgO, diamond and nine metals as pressure markers in high-pressure and high-temperature experiments

    NASA Astrophysics Data System (ADS)

    Sokolova, Tatiana S.; Dorogokupets, Peter I.; Dymshits, Anna M.; Danilov, Boris S.; Litasov, Konstantin D.

    2016-09-01

    We present Microsoft Excel spreadsheets for calculation of thermodynamic functions and P-V-T properties of MgO, diamond and 9 metals (Al, Cu, Ag, Au, Pt, Nb, Ta, Mo, and W) as functions of temperature and volume or temperature and pressure. The spreadsheets include the most common pressure markers used in in situ experiments with diamond anvil cell and multianvil techniques. The calculations are based on the equation of state formalism via the Helmholtz free energy. The program was developed using Visual Basic for Applications in Microsoft Excel and is a time-efficient tool to evaluate volume, pressure and other thermodynamic functions using only T-P or T-V data as input parameters. This application aims to solve practical issues of high-pressure experiments in geosciences and mineral physics.
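
    The spreadsheets work through the Helmholtz free energy, but the flavor of a pressure-marker calculation can be seen from the third-order Birch-Murnaghan isotherm commonly fitted to such markers. A Python sketch with round, MgO-like parameters (illustrative values, not those of the paper):

```python
def birch_murnaghan_pressure(v, v0, k0, k0p):
    """Third-order Birch-Murnaghan isotherm; returns P in the units of k0."""
    x = (v0 / v) ** (1.0 / 3.0)     # linear compression ratio
    return (1.5 * k0 * (x**7 - x**5)
            * (1.0 + 0.75 * (k0p - 4.0) * (x**2 - 1.0)))

# Round MgO-like values, purely illustrative: V0 in cm^3/mol, K0 in GPa.
v0, k0, k0p = 11.25, 160.0, 4.0
p = birch_murnaghan_pressure(0.9 * v0, v0, k0, k0p)
print(f"P at 10% compression: {p:.1f} GPa")
```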

  9. CAG - computer-aid-georeferencing, or rapid sharing, restructuring and presentation of environmental data using remote-server georeferencing for the GE clients. Educational and scientific implications.

    NASA Astrophysics Data System (ADS)

    Hronusov, V. V.

    2006-12-01

    We suggest a method of using external public servers for rearranging, restructuring, and rapidly sharing environmental data for quick presentation in numerous GE clients. The method adds a new philosophy for the presentation (publication) of mostly static data stored in the public domain (e.g., Blue Marble, Visible Earth, etc.): freely accessible spreadsheets are published that contain sufficient information and links to the data. Because most large depositories of environmental monitoring data have rather simple net address systems and simple hierarchies, mostly based on the date and type of the data, it is possible to construct the http-based link to the file that contains the data. Publication of new data on the server is recorded simply by entering a new address into a cell in the spreadsheet. At the moment we use the EditGrid (www.editgrid.com) system as the spreadsheet platform. The generation of KML code is achieved on the basis of XML data and XSLT procedures. Since the EditGrid environment supports "fetch" and similar commands, it is possible to create "smart-adaptive" KML generation on the fly based on data streams from RSS and XML sources. Previous GIS-based methods could combine high-definition data from various sources, but large-scale comparisons of dynamic processes have usually been out of reach of the technology. The suggested method allows an unlimited number of GE clients to view, review, and compare dynamic and static processes from previously un-combinable sources, and on unprecedented scales. The ease of automated or computer-assisted georeferencing has already led to the translation of about 3000 raster public-domain imagery, point, and linear data sources into the GE language. In addition, the suggested method allows a user to create rapid animations to demonstrate dynamic processes; products in high demand in education, meteorology, volcanology, and potentially in a number of industries. In general, the new approach, which we have tested on numerous projects, saves time and energy in creating huge amounts of georeferenced data of various kinds, and thus provides an excellent tool for education and science.
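
    The KML-generation step, turning rows of a shared spreadsheet into placemarks that any GE client can load, is a small text transformation. A hedged Python sketch (the row fields and URLs are invented; the actual system applies XSLT to the spreadsheet's XML, as described above):

```python
# Turn spreadsheet-like rows into a minimal KML document for Google Earth
# clients; the row fields (name, lat, lon, url) are illustrative only.
rows = [
    ("Station A", 61.2, -149.9, "http://example.org/a.png"),
    ("Station B", 64.8, -147.7, "http://example.org/b.png"),
]

placemarks = "\n".join(
    f"  <Placemark><name>{name}</name>"
    f"<description><![CDATA[<img src='{url}'/>]]></description>"
    f"<Point><coordinates>{lon},{lat},0</coordinates></Point></Placemark>"
    for name, lat, lon, url in rows
)
kml = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<kml xmlns="http://www.opengis.net/kml/2.2">\n<Document>\n'
    f"{placemarks}\n</Document>\n</kml>"
)
print(kml)
```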

  10. User's manual for the National Water-Quality Assessment Program Invertebrate Data Analysis System (IDAS) software, version 5

    USGS Publications Warehouse

    Cuffney, Thomas F.; Brightbill, Robin A.

    2011-01-01

    The Invertebrate Data Analysis System (IDAS) software was developed to provide an accurate, consistent, and efficient mechanism for analyzing invertebrate data collected as part of the U.S. Geological Survey National Water-Quality Assessment (NAWQA) Program. The IDAS software is a stand-alone program for personal computers that run Microsoft Windows®. It allows users to read data downloaded from the NAWQA Program Biological Transactional Database (Bio-TDB) or to import data from other sources either as Microsoft Excel® or Microsoft Access® files. The program consists of five modules: Edit Data, Data Preparation, Calculate Community Metrics, Calculate Diversities and Similarities, and Data Export. The Edit Data module allows the user to subset data on the basis of taxonomy or sample type, extract a random subsample of data, combine or delete data, summarize distributions, resolve ambiguous taxa (see glossary) and conditional/provisional taxa, import non-NAWQA data, and maintain and create files of invertebrate attributes that are used in the calculation of invertebrate metrics. The Data Preparation module allows the user to select the type(s) of sample(s) to process, calculate densities, delete taxa on the basis of laboratory processing notes, delete pupae or terrestrial adults, combine lifestages or keep them separate, select a lowest taxonomic level for analysis, delete rare taxa on the basis of the number of sites where a taxon occurs and (or) the abundance of a taxon in a sample, and resolve taxonomic ambiguities by one of four methods. The Calculate Community Metrics module allows the user to calculate 184 community metrics, including metrics based on organism tolerances, functional feeding groups, and behavior. The Calculate Diversities and Similarities module allows the user to calculate nine diversity and eight similarity indices. The Data Export module allows the user to export data to other software packages (CANOCO, Primer, PC-ORD, MVSP) and produce tables of community data that can be imported into spreadsheet, database, graphics, statistics, and word-processing programs. The IDAS program facilitates the documentation of analyses by keeping a log of the data that are processed, the files that are generated, and the program settings used to process the data. Though the IDAS program was developed to process NAWQA Program invertebrate data downloaded from Bio-TDB, the Edit Data module includes tools that can be used to convert non-NAWQA data into Bio-TDB format. Consequently, the data manipulation, analysis, and export procedures provided by the IDAS program can be used to process data generated outside of the NAWQA Program.

  11. Statically determined slip-line field solution for the axial forming force estimation in the radial-axial ring rolling process

    NASA Astrophysics Data System (ADS)

    Quagliato, Luca; Berti, Guido A.

    2017-10-01

    In this paper, a statically determined slip-line solution algorithm is proposed for the calculation of the axial forming force in the radial-axial ring rolling process of flat rings. The developed solution is implemented in an Excel spreadsheet for the construction of the slip-line field and the calculation of the pressure factor to be used in the force model. The comparison between the analytical solution and the authors' FE simulation shows that the developed model improves on previous literature models and demonstrates the reliability of the proposed approach.

  12. Generic Modeling of a Life Support System for Process Technology Comparison

    NASA Technical Reports Server (NTRS)

    Ferrall, J. F.; Seshan, P. K.; Rohatgi, N. K.; Ganapathi, G. B.

    1993-01-01

    This paper describes a simulation model called the Life Support Systems Analysis Simulation Tool (LiSSA-ST), the spreadsheet program called the Life Support Systems Analysis Trade Tool (LiSSA-TT), and the Generic Modular Flow Schematic (GMFS) modeling technique. Results of using the LiSSA-ST and the LiSSA-TT will be presented for comparing life support system and process technology options for a Lunar Base with a crew size of 4 and mission lengths of 90 and 600 days. System configurations to minimize the life support system weight and power are explored.

  13. Forecasting in foodservice: model development, testing, and evaluation.

    PubMed

    Miller, J L; Thompson, P A; Orabella, M M

    1991-05-01

    This study was designed to develop, test, and evaluate mathematical models appropriate for forecasting menu-item production demand in foodservice. Data were collected from residence and dining hall foodservices at Ohio State University. Objectives of the study were to collect, code, and analyze the data; develop and test models using actual operation data; and compare forecasting results with current methods in use. Customer count was forecast using deseasonalized simple exponential smoothing. Menu-item demand was forecast by multiplying the count forecast by a predicted preference statistic. Forecasting models were evaluated using mean squared error, mean absolute deviation, and mean absolute percentage error techniques. All models were more accurate than current methods. A broad spectrum of forecasting techniques could be used by foodservice managers with access to a personal computer and spreadsheet and database-management software. The findings indicate that mathematical forecasting techniques may be effective in foodservice operations to control costs, increase productivity, and maximize profits.
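
    Deseasonalized simple exponential smoothing, the technique used above for the customer-count forecast, needs only one recurrence: the new level is a weighted average of the latest deseasonalized observation and the previous level. A minimal Python sketch (the smoothing constant, seasonal indices, counts, and preference statistic are all illustrative):

```python
# Deseasonalized simple exponential smoothing of a customer count, then
# menu-item demand = count forecast x predicted preference fraction.
# All numbers below are illustrative, not data from the study.
alpha = 0.3                               # smoothing constant
season_index = [1.10, 0.95, 0.95, 1.00]   # repeating seasonal indices
counts = [520, 480, 470, 510, 545, 462, 455, 498]

level = counts[0] / season_index[0]
for t, y in enumerate(counts[1:], start=1):
    deseasonalized = y / season_index[t % len(season_index)]
    level = alpha * deseasonalized + (1 - alpha) * level

next_count = level * season_index[len(counts) % len(season_index)]
preference = 0.22        # predicted fraction of customers choosing the item
print(f"forecast demand: {next_count * preference:.0f} portions")
```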

  14. Approaches for Defining the Hsp90-dependent Proteome

    PubMed Central

    Hartson, Steven D.; Matts, Robert L.

    2011-01-01

    Hsp90 is the target of ongoing drug discovery studies seeking new compounds to treat cancer, neurodegenerative diseases, and protein folding disorders. To better understand Hsp90’s roles in cellular pathologies and in normal cells, numerous studies have utilized proteomics assays and related high-throughput tools to characterize its physical and functional protein partnerships. This review surveys these studies and summarizes the strengths and limitations of the individual approaches. We also include downloadable spreadsheets compiling all of the Hsp90-interacting proteins identified in more than 23 studies. These tools include cross-references among gene aliases, human homologues of yeast Hsp90-interacting proteins, hyperlinks to database entries, summaries of canonical pathways that are enriched in the Hsp90 interactome, and additional bioinformatic annotations. In addition to summarizing Hsp90 proteomics studies performed to date and the insights they have provided, we identify gaps in our current understanding of Hsp90-mediated proteostasis. PMID:21906632

  15. Using an analytical geometry method to improve tiltmeter data presentation

    USGS Publications Warehouse

    Su, W.-J.

    2000-01-01

    The tiltmeter is a useful tool for geologic and geotechnical applications. To obtain full benefit from the tiltmeter, easy and accurate data presentations should be used. Unfortunately, the method now most commonly used for tilt data reduction may yield inaccurate, low-resolution results. This article describes a simple, accurate, and high-resolution approach developed at the Illinois State Geological Survey for data reduction and presentation. The orientation of each tiltplate is determined first by using a trigonometric relationship, followed by a matrix transformation, to obtain the true rotation change of the tiltplate at any given time. The mathematical derivations used for the determination and transformation are then coded into an integrated PC application by adapting the capabilities of commercial spreadsheet, database, and graphics software. Examples of data presentation from tiltmeter applications in studies of landfill covers, characterizations of mine subsidence, and investigations of slope stability are also discussed.
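
    The matrix transformation step can be illustrated generically. The 2-D rotation and the angle convention in the sketch below are assumptions made for illustration; they do not reproduce the Survey's actual derivation.

```python
import math

# Generic sketch: rotate the tiltmeter's measured (x, y) tilt components
# into the tiltplate's own coordinate frame once the plate azimuth has been
# determined trigonometrically. Angle convention is an assumption.

def to_plate_frame(tilt_x, tilt_y, plate_azimuth_deg):
    """Apply a 2-D rotation matrix to express tilts in plate coordinates."""
    a = math.radians(plate_azimuth_deg)
    along = tilt_x * math.cos(a) + tilt_y * math.sin(a)
    across = -tilt_x * math.sin(a) + tilt_y * math.cos(a)
    return along, across

# Example: raw tilts of 120 and -40 microradians, plate oriented at 35 degrees
print(to_plate_frame(120.0, -40.0, 35.0))
```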

  16. A MATLAB toolbox and Excel workbook for calculating the densities, seismic wave speeds, and major element composition of minerals and rocks at pressure and temperature

    NASA Astrophysics Data System (ADS)

    Abers, Geoffrey A.; Hacker, Bradley R.

    2016-02-01

    To interpret seismic images, rock seismic velocities need to be calculated at elevated pressure and temperature for arbitrary compositions. This technical report describes an algorithm, software, and data to make such calculations from the physical properties of minerals. It updates a previous compilation and Excel® spreadsheet and includes new MATLAB® tools for the calculations. The database of 60 mineral end-members includes all parameters needed to estimate density and elastic moduli for many crustal and mantle rocks at conditions relevant to the upper few hundreds of kilometers of Earth. The behavior of α and β quartz is treated as a special case, owing to its unusual Poisson's ratio and thermal expansion that vary rapidly near the α-β transition. The MATLAB tools allow integration of these calculations into a variety of modeling and data analysis projects.
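
    The core physics is the standard isotropic relation between elastic moduli, density, and wave speeds. A minimal sketch follows, using illustrative (roughly olivine-like) values rather than entries from the authors' 60-mineral database.

```python
import math

# Standard isotropic relations: Vp = sqrt((K + 4G/3) / rho), Vs = sqrt(G / rho).

def velocities(K_GPa, G_GPa, rho_kg_m3):
    K, G = K_GPa * 1e9, G_GPa * 1e9            # convert GPa to Pa
    vp = math.sqrt((K + 4.0 * G / 3.0) / rho_kg_m3)
    vs = math.sqrt(G / rho_kg_m3)
    return vp, vs

# Roughly olivine-like moduli at room conditions (illustrative numbers only)
vp, vs = velocities(129.0, 81.0, 3355.0)
print(f"Vp = {vp:.0f} m/s, Vs = {vs:.0f} m/s")
```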

  17. A review of the use of handheld computers in medical nutrition.

    PubMed

    Holubar, Stefan; Harvey-Banchik, Lillian

    2007-08-01

    Handheld computers, or personal digital assistants (PDAs), have been used to assist clinicians in medical nutrition since the early 1980s. The term PDA was originally applied to programmable calculators; over time, the capabilities of these devices were expanded to allow for the use of more complicated programs such as databases, spreadsheets, and electronic books. Slowly, the device evolved into what is more commonly thought of as a PDA, that is, a device such as a PalmOS (PalmSource, Inc, Tokyo, Japan) or PocketPC (Microsoft, Redmond, WA) unit. We present a review of the literature about the use of PDAs in medical nutrition, followed by a discussion of the different types of PDAs and mobile technologies that are commercially available. This is followed by a discussion of software applications that are currently available for use by nutrition clinicians, focusing on freeware applications. Finally, future technologies and applications are discussed.

  18. Rdesign: A data dictionary with relational database design capabilities in Ada

    NASA Technical Reports Server (NTRS)

    Lekkos, Anthony A.; Kwok, Teresa Ting-Yin

    1986-01-01

    A data dictionary is defined as the set of all data attributes, which describe data objects in terms of their intrinsic attributes, such as name, type, size, format, and definition. It is recognized as the database for Information Resource Management: it facilitates understanding and communication about the relationship between systems applications and their data usage, and helps achieve data independence by permitting systems applications to access data without knowledge of the location or storage characteristics of the data in the system. A research and development effort using Ada has produced a data dictionary with database design capabilities. The project supports data specification and analysis and offers a choice of the relational, network, or hierarchical model for logical database design. It provides a highly integrated set of analysis and design transformation tools, ranging from templates for data element definition and a spreadsheet for defining functional dependencies to normalization tools and a logical design generator.
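
    The normalization support described rests on reasoning about functional dependencies. A minimal sketch of attribute closure, the computation underlying such tools, follows; the dictionary itself is written in Ada, so this Python version is purely illustrative.

```python
# Attribute closure under functional dependencies: the basis of
# normalization and candidate-key checks. Illustrative, not Rdesign's code.

def closure(attrs, fds):
    """fds: list of (lhs, rhs) pairs of attribute frozensets."""
    result = set(attrs)
    changed = True
    while changed:
        changed = False
        for lhs, rhs in fds:
            if lhs <= result and not rhs <= result:
                result |= rhs
                changed = True
    return result

fds = [(frozenset("A"), frozenset("B")),
       (frozenset("B"), frozenset("C"))]
print(closure({"A"}, fds))   # {'A', 'B', 'C'}: A is a candidate key of ABC
```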

  19. Geena 2, improved automated analysis of MALDI/TOF mass spectra.

    PubMed

    Romano, Paolo; Profumo, Aldo; Rocco, Mattia; Mangerini, Rosa; Ferri, Fabio; Facchiano, Angelo

    2016-03-02

    Mass spectrometry (MS) is producing high volumes of data supporting oncological sciences, especially for translational research. Most of the related processing can be carried out by combining existing tools at different levels, but little is currently available for automating the fundamental steps. For the analysis of MALDI/TOF spectra, a number of pre-processing steps are required, including joining of isotopic abundances for a given molecular species, normalization of signals against an internal standard, background noise removal, averaging multiple spectra from the same sample, and aligning spectra from different samples. In this paper, we present Geena 2, a public software tool for the automated execution of these pre-processing steps for MALDI/TOF spectra. Geena 2 has been developed in a Linux-Apache-MySQL-PHP web development environment, with scripts in PHP and Perl. Input and output are managed as simple formats that can be consumed by any database system and spreadsheet software. Input data may also be stored in a MySQL database. Processing methods are based on original heuristic algorithms, which are introduced in the paper. Three simple and intuitive web interfaces are available: the Standard Search Interface, which allows complete control over all parameters; the Bright Search Interface, which lets the user tune the parameters for alignment of spectra; and the Quick Search Interface, which limits the number of parameters to a minimum by using default values for the majority of parameters. Geena 2 has been utilized, in conjunction with a statistical analysis tool, in three published experimental works: a proteomic study on the effects of long-term cryopreservation on the low molecular weight fraction of the serum proteome, and two retrospective serum proteomic studies, one on the risk of developing breast cancer in patients affected by gross cystic disease of the breast (GCDB) and the other on the identification of a predictor of breast cancer mortality following breast cancer surgery, whose results were validated by ELISA, a completely independent method. Geena 2 is a public tool for the automated pre-processing of MS data originating from MALDI/TOF instruments, with a simple and intuitive web interface. It is under active development for the inclusion of further filtering options and the adoption of standard formats for MS spectra.
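
    Two of the listed pre-processing steps can be sketched schematically. The functions below are generic placeholders and do not reproduce Geena 2's actual heuristic algorithms, which are described in the paper itself.

```python
# Schematic sketch of two MALDI/TOF pre-processing steps: normalization
# against an internal standard and averaging of replicate spectra.
# Generic placeholder logic, not Geena 2's heuristics.

def normalize(spectrum, standard_mz, tol=0.5):
    """spectrum: list of (mz, intensity). Scale so the internal standard = 1."""
    ref = max((i for mz, i in spectrum if abs(mz - standard_mz) <= tol),
              default=None)
    if ref is None:
        raise ValueError("internal standard peak not found")
    return [(mz, i / ref) for mz, i in spectrum]

def average_replicates(spectra):
    """Average intensities of replicate spectra sharing the same m/z grid."""
    n = len(spectra)
    return [(pts[0][0], sum(i for _, i in pts) / n)
            for pts in zip(*spectra)]
```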

  20. Analysis of the Requirements Generation Process for the Logistics Analysis and Wargame Support Tool

    DTIC Science & Technology

    2017-06-01

    For instance, the requirements for a pen seem straightforward; however, they may vary depending on the context in which the pen will be used...the interactions between the operational elements, specify which tasks are dependent on others and the order of executing tasks, and estimate how...configuration file to call that spreadsheet. This requirement can be met depending on the situation. If the nodes and arcs are pre-defined and readily

  1. An automated graphics tool for comparative genomics: the Coulson plot generator

    PubMed Central

    2013-01-01

    Background Comparative analysis is an essential component of biology. When applied to genomics, for example, analysis may require comparisons between the predicted presence and absence of genes in a group of genomes under consideration. Frequently, genes can be grouped into small categories based on functional criteria, for example membership of a multimeric complex, participation in a metabolic or signaling pathway, or shared sequence features and/or paralogy. These patterns of retention and loss are highly informative for the prediction of function, and hence possible biological context, and can provide great insights into the evolutionary history of cellular functions. However, representation of such information in a standard spreadsheet is a poor visual means from which to extract patterns within a dataset. Results We devised the Coulson plot, a new graphical representation that exploits a matrix of pie charts to display comparative genomics data. Each pie describes a complex or process from a separate taxon and is divided into sectors corresponding to the number of proteins (subunits) in the complex/process. The predicted presence or absence of each protein is indicated by the occupancy of its sector; this format is visually highly accessible and makes pattern recognition rapid and reliable. A key to the identity of each subunit, plus hierarchical naming of taxa and coloring, are included. A Java-based application, the Coulson plot generator (CPG), automates graphic production, taking a tab- or comma-delimited text file as input and generating an editable portable document format (PDF) or SVG file. Conclusions CPG software may be used to rapidly convert spreadsheet data to a graphical matrix pie chart format. The representation retains essentially all of the information from the spreadsheet but presents it in a graphically rich format, making comparisons and identification of patterns significantly clearer. While the Coulson plot format is highly useful in comparative genomics (its original purpose), the software can be used to visualize any dataset where entity occupancy is compared between different classes. Availability CPG software is available at sourceforge http://sourceforge.net/projects/coulson and http://dl.dropbox.com/u/6701906/Web/Sites/Labsite/CPG.html PMID:23621955
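
    The matrix-of-pies idea can be sketched with standard plotting tools. CPG itself is a Java application, so the Python rendering below, with invented taxa, complexes, and presence calls, only illustrates the format.

```python
import matplotlib.pyplot as plt

# Rough sketch of a Coulson-plot-style matrix: one pie per taxon/complex
# cell, equal sectors per subunit, filled when predicted present.
# All data below are invented for illustration.

complexes = {"Complex A": 4, "Complex B": 3}           # subunit counts
taxa = ["Taxon 1", "Taxon 2"]
present = {                                            # presence calls (invented)
    ("Taxon 1", "Complex A"): [1, 1, 0, 1],
    ("Taxon 1", "Complex B"): [1, 0, 1],
    ("Taxon 2", "Complex A"): [0, 1, 1, 1],
    ("Taxon 2", "Complex B"): [1, 1, 1],
}

fig, axes = plt.subplots(len(taxa), len(complexes), figsize=(4, 4))
for r, taxon in enumerate(taxa):
    for c, (name, n) in enumerate(complexes.items()):
        ax = axes[r][c]
        colors = ["tab:blue" if p else "white" for p in present[(taxon, name)]]
        ax.pie([1] * n, colors=colors, wedgeprops={"edgecolor": "black"})
        if r == 0:
            ax.set_title(name, fontsize=8)
plt.show()
```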

  2. U.S. Army Concept of Operations and Standard Operating Procedure for Acquisition Program Managers Using Item Unique Identification

    DTIC Science & Technology

    2017-09-01

    Fragmented excerpt with table-of-contents and figure residue. Recoverable content: figure captions "Click on run" (Figure 58) and "Top view of XML spreadsheet" (Figure 59); item marking applies to engines, helicopter rotors, turbine blades, and so forth; the macro instructions describe clicking the menu bar, finding "View," clicking "Macros," and then clicking run on a spreadsheet with data.

  3. Small-Caliber Projectile Target Impact Angle Determined From Close Proximity Radiographs

    DTIC Science & Technology

    2006-10-01

    discrete motion data that can be numerically modeled using linear aerodynamic theory or 6-degrees-of-freedom equations of motion. The values of Fφ...Prediction Excel® Spreadsheet shown in figure 9. The Gamma at Impact Spreadsheet uses the linear aerodynamics model, equations 5 and 6, to calculate αT...trajectory angle error via consideration of the RMS fit errors of the actual firings. However, the linear aerodynamics model does not include this effect

  4. A retention index calculator simplifies identification of plant volatile organic compounds.

    PubMed

    Lucero, Mary; Estell, Rick; Tellez, María; Fredrickson, Ed

    2009-01-01

    Plant volatiles (PVOCs) are important targets for studies in natural products, chemotaxonomy, and biochemical ecology. The complexity of PVOC profiles often limits research to studies targeting only easily identified compounds. With the availability of mass spectral libraries and the recent growth of retention index (RI) libraries, PVOC identification can be achieved using only gas chromatography coupled to mass spectrometry (GCMS). However, RI library searching is not typically automated, and until recently, RI libraries were both limited in scope and costly to obtain. The objective was to automate the RI calculation and lookup functions commonly utilised in PVOC analysis. Formulae required for calculating retention indices from retention time data were placed in a spreadsheet along with lookup functions and a retention index library. Retention times obtained from GCMS analysis of alkane standards and Koeberlinia spinosa essential oil were entered into the spreadsheet to determine retention indices. Indices were used in combination with mass spectral analysis to identify compounds contained in Koeberlinia spinosa essential oil. Eighteen compounds were positively identified. Total oil yield was low, with only 5 ppm in purple berries. The most abundant compounds were octen-3-ol and methyl salicylate. The spreadsheet accurately calculated RIs of the detected compounds. The downloadable spreadsheet tool developed for this study provides a calculator and RI library that work in conjunction with GCMS or other analytical techniques to identify PVOCs in plant extracts.
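
    The RI arithmetic such a spreadsheet performs is well defined. A minimal sketch using the linear (van den Dool and Kratz) formulation follows, with an invented alkane ladder; the spreadsheet's exact formulas are not given in the abstract.

```python
import bisect

# Linear retention index: interpolate the analyte's retention time between
# the bracketing n-alkane standards. Ladder values below are invented.

def retention_index(rt, alkanes):
    """alkanes: sorted list of (carbon_number, retention_time_min)."""
    times = [t for _, t in alkanes]
    j = bisect.bisect_right(times, rt)
    if j == 0 or j == len(times):
        raise ValueError("retention time outside alkane ladder")
    (n_lo, t_lo), (n_hi, t_hi) = alkanes[j - 1], alkanes[j]
    return 100 * (n_lo + (n_hi - n_lo) * (rt - t_lo) / (t_hi - t_lo))

ladder = [(9, 5.40), (10, 7.25), (11, 9.10), (12, 10.85)]
print(round(retention_index(8.20, ladder)))   # ~1051: elutes between C10 and C11
```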

  5. Alkahest NuclearBLAST : a user-friendly BLAST management and analysis system

    PubMed Central

    Diener, Stephen E; Houfek, Thomas D; Kalat, Sam E; Windham, DE; Burke, Mark; Opperman, Charles; Dean, Ralph A

    2005-01-01

    Background - Sequencing of EST and BAC end datasets is no longer limited to large research groups. Drops in per-base pricing have made high-throughput sequencing accessible to individual investigators. However, there are few options available that provide a free and user-friendly solution to the BLAST result storage and data mining needs of biologists. Results - Here we describe NuclearBLAST, a batch BLAST analysis, storage, and management system designed for the biologist. It is a wrapper for NCBI BLAST that provides a user-friendly web interface, including a request wizard and the ability to view and mine the results. All BLAST results are stored in a MySQL database, which allows more advanced data mining through supplied command-line utilities or direct database access. NuclearBLAST can be installed on a single machine or clustered amongst a number of machines to improve analysis throughput. NuclearBLAST provides a platform which eases data mining of multiple BLAST results. With the supplied scripts, the program can export data into a spreadsheet-friendly format, automatically assign Gene Ontology terms to sequences, and provide bi-directional best hits between two datasets. Users with SQL experience can use the database to ask even more complex questions and extract any subset of data they require. Conclusion - This tool provides a user-friendly interface for requesting, viewing, and mining of BLAST results, which makes the management and data mining of large sets of BLAST analyses tractable to biologists. PMID:15958161
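
    The bi-directional best-hit logic mentioned among the supplied scripts can be sketched briefly. The input structures, gene names, and scores below are simplified assumptions; NuclearBLAST's own Perl/SQL implementation is not reproduced here.

```python
# Bi-directional best hits (BBH): two sequences are mutual best hits if
# each is the other's top-scoring BLAST match. Inputs are simplified.

def best_hits(hits):
    """hits: list of (query, subject, bit_score) -> {query: best subject}."""
    best = {}
    for q, s, score in hits:
        if q not in best or score > best[q][1]:
            best[q] = (s, score)
    return {q: s for q, (s, _) in best.items()}

def bidirectional_best_hits(a_vs_b, b_vs_a):
    fwd, rev = best_hits(a_vs_b), best_hits(b_vs_a)
    return [(a, b) for a, b in fwd.items() if rev.get(b) == a]

a_vs_b = [("geneA1", "geneB7", 220.0), ("geneA1", "geneB2", 90.0)]
b_vs_a = [("geneB7", "geneA1", 215.0)]
print(bidirectional_best_hits(a_vs_b, b_vs_a))   # [('geneA1', 'geneB7')]
```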

  6. Construction of a database for published phase II/III drug intervention clinical trials for the period 2009-2014 comprising 2,326 records, 90 disease categories, and 939 drug entities.

    PubMed

    Jeong, Sohyun; Han, Nayoung; Choi, Boyoon; Sohn, Minji; Song, Yun-Kyoung; Chung, Myeon-Woo; Na, Han-Sung; Ji, Eunhee; Kim, Hyunah; Rhew, Ki Yon; Kim, Therasa; Kim, In-Wha; Oh, Jung Mi

    2016-06-01

    To construct a database of published clinical drug trials suitable for use 1) as a research tool in accessing clinical trial information and 2) in evidence-based decision-making by regulatory professionals, clinical research investigators, and medical practitioners. Comprehensive information was obtained from a search of design elements and results of clinical trials in peer-reviewed journals using PubMed (http://www.ncbi.nlm.nih.gov/pubmed). The methodology to develop a structured database was devised by a panel composed of experts in medicine, pharmacy, and information technology, together with members of the Ministry of Food and Drug Safety (MFDS), using a step-by-step approach. A double-sided system consisting of user mode and manager mode served as the framework for the database; elements of interest from each trial were entered via the secure manager mode, enabling the input information to be accessed in a user-friendly manner (user mode). Information regarding the methodology used and the results of drug treatment was extracted as detail elements of each data set and then input into the web-based database system. Comprehensive information comprising 2,326 clinical trial records, 90 disease states, and 939 drug entities, covering study objectives, background, methods used, results, and conclusions, could be extracted from published information on phase II/III drug intervention clinical trials appearing in SCI journals within the last 10 years. The extracted data were successfully assembled into a clinical drug trial database with easy access suitable for use as a research tool. The clinically most important therapeutic categories, i.e., cancer, cardiovascular, respiratory, neurological, metabolic, urogenital, gastrointestinal, psychological, and infectious diseases, were covered by the database. Names of test and control drugs, details on primary and secondary outcomes, and indexed keywords could also be retrieved and built into the database. The construction used in the database enables the user to sort and download targeted information as a Microsoft Excel spreadsheet. Because of the comprehensive and standardized nature of the clinical drug trial database and its ease of access, it should serve as a valuable information repository and research tool for accessing clinical trial information and making evidence-based decisions by regulatory professionals, clinical research investigators, and medical practitioners.

  7. Computer-based Astronomy Labs for Non-science Majors

    NASA Astrophysics Data System (ADS)

    Smith, A. B. E.; Murray, S. D.; Ward, R. A.

    1998-12-01

    We describe and demonstrate two laboratory exercises, Kepler's Third Law and Stellar Structure, which are being developed for use in an astronomy laboratory class aimed at non-science majors. The labs run with Microsoft's Excel 98 (Macintosh) or Excel 97 (Windows). They can be run in a classroom setting or in an independent learning environment. The intent of the labs is twofold: first and foremost, students learn the subject matter through a series of informational frames. Next, students enhance their understanding by applying their knowledge in lab procedures, while also gaining familiarity with the use and power of a widely used software package and scientific tool. No mathematical knowledge beyond basic algebra is required to complete the labs or to understand the computations in the spreadsheets, although the students are exposed to the concepts of numerical integration. The labs are contained in Excel workbook files. Each workbook contains multiple spreadsheets, each holding a frame with information on how to run the lab, material on the subject, or one or more procedures. Excel's VBA macro language is used to automate the labs. The macros are accessed through button interfaces positioned on the spreadsheets. This is done intentionally so that students can focus on learning the subject matter and the basic spreadsheet features without having to learn advanced Excel features all at once. Students open the file and progress through the informational frames to the procedures. After each procedure, student comments and data are automatically recorded in a preformatted Lab Report spreadsheet. Once all procedures have been completed, the student is prompted for a filename in which to save their Lab Report. The lab reports can then be printed or emailed to the instructor. The files will have full worksheet and workbook protection, and will have a "redo" feature at the end of the lab for students who want to repeat a procedure.

  8. A New Global Open Source Marine Hydrocarbon Emission Site Database

    NASA Astrophysics Data System (ADS)

    Onyia, E., Jr.; Wood, W. T.; Barnard, A.; Dada, T.; Qazzaz, M.; Lee, T. R.; Herrera, E.; Sager, W.

    2017-12-01

    Hydrocarbon emission sites (e.g. seeps) discharge large volumes of fluids and gases into the oceans that are not only important for biogeochemical budgets, but also support abundant chemosynthetic communities. Documenting the locations of modern emissions is a first step towards understanding and monitoring how they affect the global state of the seafloor and oceans. Currently, no global open source (i.e. non-proprietary) detailed maps of emission sites are available. As a solution, we have created a database that is housed within an Excel spreadsheet and use the latest versions of Earthpoint and Google Earth for position coordinate conversions and data mapping, respectively. To date, approximately 1,000 data points have been collected from referenceable sources across the globe, and we are continually expanding the dataset. Due to the variety of spatial extents encountered, we used two different methods to identify each site: 1) point (x, y, z) locations for individual sites; and 2) delineation of areas where sites are clustered. Certain well-known areas, such as the Gulf of Mexico and the Mediterranean Sea, have a greater abundance of information, whereas significantly less information is available in other regions due to the absence of emission sites, lack of data, or because the existing data are proprietary. Although the geographical extent of the data is currently restricted to regions where the most data is publicly available, as the database matures, we expect to have more complete coverage of the world's oceans. This database is an information resource that consolidates and organizes the existing literature on hydrocarbons released into the marine environment, thereby providing a comprehensive reference for future work. We expect that the availability of seafloor hydrocarbon emission maps will benefit scientific understanding of hydrocarbon-rich areas as well as potentially aiding hydrocarbon exploration and environmental impact assessments.

  9. Monitoring of small laboratory animal experiments by a designated web-based database.

    PubMed

    Frenzel, T; Grohmann, C; Schumacher, U; Krüll, A

    2015-10-01

    Multi-parametric small animal experiments require, by their very nature, a number of animals that may need to be large to obtain statistically significant results.(1) For this reason, database-related systems are required to collect the experimental data as well as to support the later (re-)analysis of the information gained during the experiments. In particular, the monitoring of animal welfare is simplified by the inclusion of warning signals (for instance, loss in body weight >20%). Digital patient charts have been developed for human patients but are usually not able to fulfill the specific needs of animal experimentation. To address this problem, a unique web-based monitoring system using standard MySQL, PHP, and nginx has been created. PHP was used to create the HTML-based user interface and outputs in a variety of proprietary file formats, namely portable document format (PDF) or spreadsheet files. This article demonstrates its fundamental features and the easy and secure access it offers to the data from any place using a web browser. This information will help other researchers create their own individual databases in a similar way. The use of QR codes plays an important role in stress-free use of the database. We demonstrate a way to easily identify all animals, samples, and data collected during the experiments. Specific ways to record animal irradiations and chemotherapy applications are shown. This new analysis tool allows the effective and detailed analysis of the huge amounts of data collected through small animal experiments. It supports proper statistical evaluation of the data and provides excellent retrievable data storage. © The Author(s) 2015.
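
    The welfare warning signal mentioned above (body-weight loss >20%) reduces to simple arithmetic. The sketch below uses hypothetical identifiers and data, since the article's actual MySQL schema is not given.

```python
# Minimal sketch of a body-weight welfare warning: flag any measurement
# more than 20% below the animal's baseline. Identifiers are hypothetical.

THRESHOLD = 0.20

def weight_warnings(weights):
    """weights: {animal_id: [(date, grams), ...]} in chronological order."""
    alerts = []
    for animal, series in weights.items():
        baseline = series[0][1]
        for date, w in series[1:]:
            if (baseline - w) / baseline > THRESHOLD:
                alerts.append((animal, date, w))
    return alerts

data = {"M-017": [("2015-03-01", 24.8), ("2015-03-08", 23.5),
                  ("2015-03-15", 19.2)]}
print(weight_warnings(data))   # flags 2015-03-15: >20% below baseline
```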

  10. Introducing the GRACEnet/REAP Data Contribution, Discovery, and Retrieval System.

    PubMed

    Del Grosso, S J; White, J W; Wilson, G; Vandenberg, B; Karlen, D L; Follett, R F; Johnson, J M F; Franzluebbers, A J; Archer, D W; Gollany, H T; Liebig, M A; Ascough, J; Reyes-Fox, M; Pellack, L; Starr, J; Barbour, N; Polumsky, R W; Gutwein, M; James, D

    2013-07-01

    Difficulties in accessing high-quality data on trace gas fluxes and performance of bioenergy/bioproduct feedstocks limit the ability of researchers and others to address environmental impacts of agriculture and the potential to produce feedstocks. To address those needs, the GRACEnet (Greenhouse gas Reduction through Agricultural Carbon Enhancement network) and REAP (Renewable Energy Assessment Project) research programs were initiated by the USDA Agricultural Research Service (ARS). A major product of these programs is the creation of a database with greenhouse gas fluxes, soil carbon stocks, biomass yield, nutrient, and energy characteristics, and input data for modeling cropped and grazed systems. The data include site descriptors (e.g., weather, soil class, spatial attributes), experimental design (e.g., factors manipulated, measurements performed, plot layouts), management information (e.g., planting and harvesting schedules, fertilizer types and amounts, biomass harvested, grazing intensity), and measurements (e.g., soil C and N stocks, plant biomass amount and chemical composition). To promote standardization of data and ensure that experiments were fully described, sampling protocols and a spreadsheet-based data-entry template were developed. Data were first uploaded to a temporary database for checking and then transferred to the central database. A Web-accessible application allows registered users to query and download data, including measurement protocols. Separate portals have been provided for each project (GRACEnet and REAP) at nrrc.ars.usda.gov/slgracenet/#/Home and nrrc.ars.usda.gov/slreap/#/Home. The database architecture and data entry template have proven flexible and robust for describing a wide range of field experiments and thus appear suitable for other natural resource research projects. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.
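
    The staged check-then-load workflow can be sketched abstractly. The column names and validation rules below are hypothetical illustrations, not the actual GRACEnet/REAP template.

```python
# Schematic sketch of staged loading: validate rows parsed from a
# spreadsheet template before promoting them to the central database.
# REQUIRED columns and rules are hypothetical.

REQUIRED = ["site_id", "date", "treatment", "n2o_flux_g_ha_d"]

def validate_rows(rows):
    """rows: list of dicts parsed from the template; returns (good, errors)."""
    good, errors = [], []
    for i, row in enumerate(rows, start=2):        # row 1 = header
        missing = [c for c in REQUIRED if not row.get(c)]
        if missing:
            errors.append(f"row {i}: missing {missing}")
        elif float(row["n2o_flux_g_ha_d"]) < 0:
            errors.append(f"row {i}: negative flux")
        else:
            good.append(row)
    return good, errors   # promote `good` to the central DB only if errors == []
```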

  11. The PEPR GeneChip data warehouse, and implementation of a dynamic time series query tool (SGQT) with graphical interface.

    PubMed

    Chen, Josephine; Zhao, Po; Massaro, Donald; Clerch, Linda B; Almon, Richard R; DuBois, Debra C; Jusko, William J; Hoffman, Eric P

    2004-01-01

    Publicly accessible DNA databases (genome browsers) are rapidly accelerating post-genomic research (see http://www.genome.ucsc.edu/), with integrated genomic DNA, gene structure, EST/splicing, and cross-species ortholog data. DNA databases have relatively low dimensionality; the genome is a linear code that anchors all associated data. In contrast, RNA expression and protein databases need to be able to handle very high dimensional data, with time, tissue, cell type, and genes as interrelated variables. The high dimensionality of microarray expression profile data and the lack of a standard experimental platform have complicated the development of web-accessible databases and analytical tools. We have designed and implemented a public resource of expression profile data containing 1024 human, mouse, and rat Affymetrix GeneChip expression profiles, generated in the same laboratory and subject to the same quality and procedural controls (Public Expression Profiling Resource; PEPR). Our Oracle-based PEPR data warehouse includes a novel time series query analysis tool (SGQT), enabling dynamic generation of graphs and spreadsheets showing the action of any transcript of interest over time. In this report, we demonstrate the utility of this tool using a 27-time-point, in vivo muscle regeneration series. This data warehouse and its associated analysis tools provide access to multidimensional microarray data through web-based interfaces, both for download of all types of raw data for independent analysis and for straightforward gene-based queries. Planned implementations of PEPR will include web-based remote entry of projects adhering to quality control and standard operating procedure (QC/SOP) criteria, and automated output of alternative probe set algorithms for each project (see http://microarray.cnmcresearch.org/pgadatatable.asp).

  12. Citation Analysis of Hepatitis Monthly by Journal Citation Report (ISI), Google Scholar, and Scopus

    PubMed Central

    Miri, Seyyed Mohammad; Raoofi, Azam; Heidari, Zahra

    2012-01-01

    Background Citation analysis, one of the most widely used methods of bibliometrics, can be used to compute various impact measures for scholars based on data from citation databases. Journal Citation Reports (JCR) from Thomson Reuters provides an annual report in the form of an impact factor (IF) for each journal. Objectives We aimed to evaluate the citation parameters of Hepatitis Monthly by JCR in 2010 and compare them with GS and Sc. Materials and Methods All articles of Hepat Mon published in 2008 and 2009 that had been cited in 2010 in three databases, including WoS, Sc, and GS, were gathered in a spreadsheet. The IFs were manually calculated. Results Among the 104 total published articles, the accuracy rates of GS and Sc in recording the total number of articles were 96% and 87.5%, respectively. The IFs differed among the three databases (0.793 in ISI [Institute for Scientific Information], 0.945 in Sc, and 0.85 in GS). The overall missing rate of citations in ISI was 4%. Original articles were the most cited types, whereas guidelines and clinical challenges were the least cited. Conclusions None of the three databases succeeded in recording all articles published in the journal. Despite the high sensitivity of GS compared to Sc, it cannot be a reliable source for indexing, since GS lacks screening in data collection and has low specificity. Using an average of the three IFs is suggested to find the correct IF. Editors should be more aware of the role of original articles in increasing IF and of the potential efficacy of review articles on long-term impact factor. PMID:23087765
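
    The comparison rests on the standard JCR-style impact factor: citations received in 2010 to items published in 2008-2009, divided by the number of citable items from those years. A worked sketch follows; the citation count used is invented.

```python
# JCR-style impact factor: IF_2010 = citations in 2010 to 2008-2009 items
# divided by the number of citable 2008-2009 items. Citation count invented.

def impact_factor(citations_to_prior_two_years, items_prior_two_years):
    return citations_to_prior_two_years / items_prior_two_years

# 104 articles is taken from the abstract; per-database citation counts
# differ, which is why the three computed IFs (0.793, 0.945, 0.85) disagree.
print(round(impact_factor(86, 104), 3))   # 0.827 with an invented count of 86
```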

  13. The PEPR GeneChip data warehouse, and implementation of a dynamic time series query tool (SGQT) with graphical interface

    PubMed Central

    Chen, Josephine; Zhao, Po; Massaro, Donald; Clerch, Linda B.; Almon, Richard R.; DuBois, Debra C.; Jusko, William J.; Hoffman, Eric P.

    2004-01-01

    Publicly accessible DNA databases (genome browsers) are rapidly accelerating post-genomic research (see http://www.genome.ucsc.edu/), with integrated genomic DNA, gene structure, EST/splicing, and cross-species ortholog data. DNA databases have relatively low dimensionality; the genome is a linear code that anchors all associated data. In contrast, RNA expression and protein databases need to be able to handle very high dimensional data, with time, tissue, cell type, and genes as interrelated variables. The high dimensionality of microarray expression profile data and the lack of a standard experimental platform have complicated the development of web-accessible databases and analytical tools. We have designed and implemented a public resource of expression profile data containing 1024 human, mouse, and rat Affymetrix GeneChip expression profiles, generated in the same laboratory and subject to the same quality and procedural controls (Public Expression Profiling Resource; PEPR). Our Oracle-based PEPR data warehouse includes a novel time series query analysis tool (SGQT), enabling dynamic generation of graphs and spreadsheets showing the action of any transcript of interest over time. In this report, we demonstrate the utility of this tool using a 27-time-point, in vivo muscle regeneration series. This data warehouse and its associated analysis tools provide access to multidimensional microarray data through web-based interfaces, both for download of all types of raw data for independent analysis and for straightforward gene-based queries. Planned implementations of PEPR will include web-based remote entry of projects adhering to quality control and standard operating procedure (QC/SOP) criteria, and automated output of alternative probe set algorithms for each project (see http://microarray.cnmcresearch.org/pgadatatable.asp). PMID:14681485

  14. Supplement to the Carcinogenic Potency Database (CPDB): Results ofanimal bioassays published in the general literature through 1997 and bythe National Toxicology Program in 1997-1998

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gold, Lois Swirsky; Manley, Neela B.; Slone, Thomas H.

    2005-04-08

    The Carcinogenic Potency Database (CPDB) is a systematic and unifying resource that standardizes the results of chronic, long-term animal cancer tests which have been conducted since the 1950s. The analyses include sufficient information on each experiment to permit research into many areas of carcinogenesis. Both qualitative and quantitative information is reported on positive and negative experiments that meet a set of inclusion criteria. A measure of carcinogenic potency, TD50 (daily dose rate in mg/kg body weight/day to induce tumors in half of test animals that would have remained tumor-free at zero dose), is estimated for each tissue-tumor combination reported. This article is the ninth publication of a chronological plot of the CPDB; it presents results on 560 experiments of 188 chemicals in mice, rats, and hamsters from 185 publications in the general literature updated through 1997, and from 15 Reports of the National Toxicology Program in 1997-1998. The test agents cover a wide variety of uses and chemical classes. The CPDB Web Site (http://potency.berkeley.edu/) presents the combined database of all published plots in a variety of formats as well as summary tables by chemical and by target organ, supplemental materials on dosing and survival, a detailed guide to using the plot formats, and documentation of methods and publications. The overall CPDB, including the results in this article, presents easily accessible results of 6153 experiments on 1485 chemicals from 1426 papers and 429 NCI/NTP (National Cancer Institute/National Toxicology Program) Technical Reports. A tab-separated format of the full CPDB for reading the data into spreadsheets or database applications is available on the Web Site.

  15. Development of a clinical prediction model to calculate patient life expectancy: the measure of actuarial life expectancy (MALE).

    PubMed

    Clarke, M G; Kennedy, K P; MacDonagh, R P

    2009-01-01

    To develop a clinical prediction model enabling the calculation of an individual patient's life expectancy (LE) and survival probability based on age, sex, and comorbidity for use in the joint decision-making process regarding medical treatment. A computer software program was developed with a team of 3 clinicians, 2 professional actuaries, and 2 professional computer programmers. This incorporated statistical spreadsheet and database access design methods. Data sources included life insurance industry actuarial rating factor tables (public and private domain), Government Actuary Department UK life tables, professional actuarial sources, and evidence-based medical literature. The main outcome measures were numerical and graphical display of comorbidity-adjusted LE; 5-, 10-, and 15-year survival probability; in addition to generic UK population LE. Nineteen medical conditions, which impacted significantly on LE in actuarial terms and were commonly encountered in clinical practice, were incorporated in the final model. Numerical and graphical representations of statistical predictions of LE and survival probability were successfully generated for patients with either no comorbidity or a combination of the 19 medical conditions included. Validation and testing, including actuarial peer review, confirmed consistency with the data sources utilized. The evidence-based actuarial data utilized in this computer program design represent a valuable resource for use in the clinical decision-making process, where an accurate objective assessment of patient LE can so often make the difference between patients being offered or denied medical and surgical treatment. Ongoing development to incorporate additional comorbidities and enable Web-based access will enhance its use further.
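
    The actuarial core of such a model is simple. The sketch below computes an n-year survival probability from one-year mortality rates, with a multiplicative comorbidity rating assumed purely for illustration; the program's actual rating-factor tables are proprietary and not reproduced.

```python
# n-year survival probability as a product of one-year survival rates,
# with comorbidity applied as a mortality multiplier. Rates and the
# multiplicative rating approach are illustrative assumptions.

def survival_probability(qx, years, rating=1.0):
    """qx: one-year death probabilities from the current age onward."""
    p = 1.0
    for q in qx[:years]:
        p *= 1.0 - min(q * rating, 1.0)
    return p

qx_65 = [0.012, 0.013, 0.015, 0.016, 0.018]   # invented rates from age 65
print(f"5-year survival: {survival_probability(qx_65, 5, rating=1.5):.2%}")
```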

  16. Space-Plane Spreadsheet Program

    NASA Technical Reports Server (NTRS)

    Mackall, Dale

    1993-01-01

    The Basic Hypersonic Data and Equations (HYPERDATA) spreadsheet program provides data gained from three analyses of space-plane performance. The equations used to perform the analyses are derived from Newton's second law; the derivation is included. The first analysis is a parametric study of some basic factors affecting the ability of a space plane to reach orbit. The second includes calculation of the thickness of a spherical fuel tank. The third produces the ratio between fuel volume and total mass for each of various aircraft. HYPERDATA is intended for use on Macintosh(R) series computers running Microsoft Excel 3.0.
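
    The second analysis invites a small worked example. Assuming the usual thin-walled-sphere stress relation t = p*r/(2*sigma), which may differ from HYPERDATA's exact equations, the tank-wall calculation looks like this:

```python
# Thin-walled spherical pressure vessel: wall thickness t = p*r / (2*sigma).
# Parameter values are illustrative, not HYPERDATA's data.

def sphere_wall_thickness(pressure_pa, radius_m, allow_stress_pa):
    return pressure_pa * radius_m / (2.0 * allow_stress_pa)

# Illustrative numbers: 300 kPa tank, 2 m radius, 100 MPa allowable stress
t = sphere_wall_thickness(300e3, 2.0, 100e6)
print(f"wall thickness = {t * 1000:.1f} mm")   # 3.0 mm
```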

  17. Social Security: A Present Value Analysis of Old Age Survivors Insurance (OASI) Taxes and Benefits.

    DTIC Science & Technology

    1995-12-01

    private sector plans and provides a spreadsheet model for making this comparison of plans using different assumptions. The investigation was done by collecting data from various books, Government publications, and various Government agencies to conduct a spreadsheet analysis of three different wage-earning groups, assuming various real interest rates potentially earned in the private sector . A comparison of Social Security with alternative private sector plans is important to the DoD/DoN because less constrained budgets could

  18. Collaborative writing: Tools and tips.

    PubMed

    Eapen, Bell Raj

    2007-01-01

    The majority of technical writing is done by groups of experts, and various web-based applications have made this collaboration easy. Email exchange of word processor documents with tracked changes used to be the standard technique for collaborative writing. However, web-based tools like Google Docs and Spreadsheets have made the process fast and efficient. Various versioning tools and synchronous editors are available for those who need additional functionality. Having a group leader who decides the scheduling, communication, and conflict-resolution protocols is important for successful collaboration.

  19. Use of Excel ion exchange equilibrium solver with WinGEMS to model and predict NPE distribution in the Mead/Westvaco Evandale, TX, hardwood bleach plant

    Treesearch

    Christopher Litvay; Alan Rudie; Peter Hart

    2003-01-01

    An Excel spreadsheet developed to solve the ion-exchange equilibrium in wood pulps has been linked by dynamic data exchange to WinGEMS and used to model the non-process elements in the hardwood bleach plant of the Mead/Westvaco Evandale mill. Pulp and filtrate samples were collected from the diffusion washers and final wash press of the bleach plant. A WinGEMS model of...

  20. Fischer Assays of Oil-Shale Drill Cores and Rotary Cuttings from the Greater Green River Basin, Southwestern Wyoming

    USGS Publications Warehouse

    ,

    2008-01-01

    Chapter 1 of this CD-ROM is a database of digitized Fischer (shale-oil) assays of cores and cuttings from boreholes drilled in the Eocene Green River oil shale deposits in southwestern Wyoming. Assays of samples from some surface sections are also included. Most of the Fischer assay analyses were made by the former U.S. Bureau of Mines (USBM) at its laboratory in Laramie, Wyoming. Other assays, made by institutional or private laboratories, were donated to the U.S. Geological Survey (USGS) and are included in this database, as are Adobe PDF scanned images of some of the original laboratory assay reports and lithologic logs prepared by USBM geologists. The database is 75.2 megabytes in size and includes information on 971 core holes and rotary-drilled boreholes and numerous surface sections. Most of these data were released previously by the USBM and the USGS through the National Technical Information Service but are no longer available from that agency. Fischer assays for boreholes in northeastern Utah and northwestern Colorado have been published by the USGS. Additional data include geophysical logs, groundwater data, chemical and X-ray diffraction analyses, and other data. These materials are available for inspection in the office of the USGS Central Energy Resources Team in Lakewood, Colorado. The digitized assays were checked against the original laboratory reports, but some errors likely remain. Other information, such as locations and elevations of core holes and oil and gas tests, was not thoroughly checked. However, owing to the current interest in oil-shale development, it was considered in the public interest to make this preliminary database available at this time. Chapter 2 of this CD-ROM presents oil-yield histograms of samples of cores and cuttings from exploration drill holes in the Eocene Green River Formation in the Great Divide, Green River, and Washakie Basins of southwestern Wyoming. A database was compiled that includes about 47,000 Fischer assays from 186 core holes and 240 rotary drill holes. Most of the oil yield data are from analyses performed by the former U.S. Bureau of Mines oil shale laboratory in Laramie, Wyoming, with some analyses made by private laboratories. Location data for 971 Wyoming oil-shale drill holes are listed in a spreadsheet that is included in the CD-ROM. These Wyoming Fischer assays and histograms are part of a much larger collection of oil-shale information, including geophysical and lithologic logs, water data, and chemical and X-ray diffraction analyses of the Green River oil-shale deposits in Colorado, Utah, and Wyoming, held by the U.S. Geological Survey. Because of an increased interest in oil shale, this CD-ROM containing Fischer assay data and oil-yield histograms for the Green River oil-shale deposits in southwestern Wyoming is being released to the public. Microsoft Excel spreadsheets included with Chapter 2 contain the Fischer assay data from the 426 drill holes, along with company name, drill-hole name, and location. Histograms of the oil yields obtained from the Fischer assays are presented in both Grapher and PDF format. Fischer assay text data files are also included in the CD-ROM.

  1. An Excel Spreadsheet Model for States and Districts to Assess the Cost-Benefit of School Nursing Services.

    PubMed

    Wang, Li Yan; O'Brien, Mary Jane; Maughan, Erin D

    2016-11-01

    This paper describes a user-friendly, Excel spreadsheet model and two data collection instruments constructed by the authors to help states and districts perform cost-benefit analyses of school nursing services delivered by full-time school nurses. Prior to applying the model, states or districts need to collect data using two forms: "Daily Nurse Data Collection Form" and the "Teacher Survey." The former is used to record daily nursing activities, including number of student health encounters, number of medications administered, number of student early dismissals, and number of medical procedures performed. The latter is used to obtain estimates for the time teachers spend addressing student health issues. Once inputs are entered in the model, outputs are automatically calculated, including program costs, total benefits, net benefits, and benefit-cost ratio. The spreadsheet model, data collection tools, and instructions are available at the NASN website ( http://www.nasn.org/The/CostBenefitAnalysis ).
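
    The model's bottom-line arithmetic can be sketched directly. The benefit categories and dollar figures below are placeholders, not the model's actual fields.

```python
# Minimal sketch of the cost-benefit outputs described: total benefits,
# net benefits, and benefit-cost ratio. Items and figures are hypothetical.

def cost_benefit(program_cost, benefit_items):
    total_benefits = sum(benefit_items.values())
    return {
        "program_cost": program_cost,
        "total_benefits": total_benefits,
        "net_benefits": total_benefits - program_cost,
        "benefit_cost_ratio": total_benefits / program_cost,
    }

benefits = {"teacher_time_saved": 112_000,
            "parent_productivity": 48_000,
            "procedures_shifted_from_clinics": 35_000}
print(cost_benefit(134_000, benefits))   # ratio ~1.46: hypothetical numbers
```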

  2. Calculation tool for transported geothermal energy using two-step absorption process

    DOE Data Explorer

    Kyle Gluesenkamp

    2016-02-01

    This spreadsheet allows the user to calculate parameters relevant to techno-economic performance of a two-step absorption process to transport low temperature geothermal heat some distance (1-20 miles) for use in building air conditioning. The parameters included are (1) energy density of aqueous LiBr and LiCl solutions, (2) transportation cost of trucking solution, and (3) equipment cost for the required chillers and cooling towers in the two-step absorption approach. More information is available in the included public report: "A Technical and Economic Analysis of an Innovative Two-Step Absorption System for Utilizing Low-Temperature Geothermal Resources to Condition Commercial Buildings"

  3. Dairy Analytics and Nutrient Analysis (DANA) Prototype System User Manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sam Alessi; Dennis Keiser

    2012-10-01

    This document is a user manual for the Dairy Analytics and Nutrient Analysis (DANA) model. DANA provides an analysis of dairy anaerobic digestion technology and allows users to calculate biogas production, co-product valuation, capital costs, expenses, revenue, and financial metrics for user-customizable scenarios and dairy and digester types. The model provides results for three anaerobic digester types (Covered Lagoon, Modified Plug Flow, and Complete Mix) and three main energy production technologies: electricity generation, renewable natural gas generation, and compressed natural gas generation. Additional options include different dairy types, bedding types, and backend treatment types, as well as numerous production and economic parameters. DANA's goal is to extend the National Market Value of Anaerobic Digester Products analysis (informa economics, 2012; Innovation Center, 2011) to include a greater and more flexible set of regional digester scenarios and to provide a modular framework for creation of a tool to support farmer and investor needs. Users can set up scenarios from combinations of existing parameters or add new parameters, run the model, and view a variety of reports, charts, and tables that are automatically produced and delivered over the web interface. DANA is based on the INL's analysis architecture entitled Generalized Environment for Modeling Systems (GEMS), which offers extensive collaboration, analysis, and integration opportunities and greatly speeds the construction of highly scalable, web-delivered, user-oriented decision tools. DANA's approach uses server-based data processing and web-based user interfaces, rather than a client-based spreadsheet approach. This offers a number of benefits over the client-based approach. Server processing and storage can scale up to handle a very large number of scenarios, so that analysis at the county, or even field, level across the whole U.S. can be performed. Server-based databases allow dairy and digester parameters to be held and managed in a single managed data repository, while allowing users to customize standard values and perform individual analyses. Server-based calculations can be easily extended, versions and upgrades can be managed, and any changes are immediately available to all users. This user manual describes how to use and/or modify input database tables, run DANA, and view and modify reports.

  4. LimsPortal and BonsaiLIMS: development of a lab information management system for translational medicine

    PubMed Central

    2011-01-01

    Background Laboratory Information Management Systems (LIMS) are an increasingly important part of modern laboratory infrastructure. As typically very sophisticated software products, LIMS often require considerable resources to select, deploy, and maintain. Larger organisations may have access to specialist IT support to assist with requirements elicitation and software customisation; however, smaller groups will often have limited IT support to perform the kind of iterative development that can resolve the difficulties that biologists often have when specifying requirements. Translational medicine aims to accelerate the process of treatment discovery by bringing together multiple disciplines to discover new approaches to treating disease, or novel applications of existing treatments. The diverse set of disciplines and the complexity of the processing procedures involved, especially with the use of high-throughput technologies, bring difficulties in customizing a generic LIMS to provide a single system for managing sample-related data within a translational medicine research setting, especially where limited IT support is available. Results We have designed and developed a LIMS, BonsaiLIMS, around a very simple data model that can be easily implemented using a variety of technologies and can be easily extended as specific requirements dictate. A reference implementation using an Oracle 11g database and the Python web framework Django is presented. Conclusions By focusing on a minimal feature set and a modular design we have been able to deploy the BonsaiLIMS system very quickly. The benefits to our institute have been the avoidance of the prolonged implementation timescales, budget overruns, scope creep, off-specifications, and user fatigue issues that typify many enterprise software implementations. The transition away from using local, uncontrolled records in spreadsheet and paper formats to a centrally held, secured, and backed-up database brings the immediate benefits of improved data visibility, audit, and overall data quality. The open-source availability of this software allows others to rapidly implement a LIMS which in itself might sufficiently address user requirements. In situations where this software does not meet requirements, it can serve to elicit more accurate specifications from end-users for a more heavyweight LIMS by acting as a demonstrable prototype. PMID:21569484

  5. LimsPortal and BonsaiLIMS: development of a lab information management system for translational medicine.

    PubMed

    Bath, Timothy G; Bozdag, Selcuk; Afzal, Vackar; Crowther, Daniel

    2011-05-13

    Laboratory Information Management Systems (LIMS) are an increasingly important part of modern laboratory infrastructure. As typically very sophisticated software products, LIMS often require considerable resources to select, deploy, and maintain. Larger organisations may have access to specialist IT support to assist with requirements elicitation and software customisation; however, smaller groups will often have limited IT support to perform the kind of iterative development that can resolve the difficulties that biologists often have when specifying requirements. Translational medicine aims to accelerate the process of treatment discovery by bringing together multiple disciplines to discover new approaches to treating disease, or novel applications of existing treatments. The diverse set of disciplines and the complexity of the processing procedures involved, especially with the use of high-throughput technologies, bring difficulties in customizing a generic LIMS to provide a single system for managing sample-related data within a translational medicine research setting, especially where limited IT support is available. We have designed and developed a LIMS, BonsaiLIMS, around a very simple data model that can be easily implemented using a variety of technologies and can be easily extended as specific requirements dictate. A reference implementation using an Oracle 11g database and the Python web framework Django is presented. By focusing on a minimal feature set and a modular design we have been able to deploy the BonsaiLIMS system very quickly. The benefits to our institute have been the avoidance of the prolonged implementation timescales, budget overruns, scope creep, off-specifications, and user fatigue issues that typify many enterprise software implementations. The transition away from using local, uncontrolled records in spreadsheet and paper formats to a centrally held, secured, and backed-up database brings the immediate benefits of improved data visibility, audit, and overall data quality. The open-source availability of this software allows others to rapidly implement a LIMS which in itself might sufficiently address user requirements. In situations where this software does not meet requirements, it can serve to elicit more accurate specifications from end-users for a more heavyweight LIMS by acting as a demonstrable prototype.

  6. Creating User-Friendly Tools for Data Analysis and Visualization in K-12 Classrooms: A Fortran Dinosaur Meets Generation Y

    NASA Technical Reports Server (NTRS)

    Chambers, L. H.; Chaudhury, S.; Page, M. T.; Lankey, A. J.; Doughty, J.; Kern, Steven; Rogerson, Tina M.

    2008-01-01

    During the summer of 2007, as part of the second year of a NASA-funded project in partnership with Christopher Newport University called SPHERE (Students as Professionals Helping Educators Research the Earth), a group of undergraduate students spent 8 weeks in a research internship at or near NASA Langley Research Center. Three students from this group formed the Clouds group along with a NASA mentor (Chambers), and the brief addition of a local high school student fulfilling a mentorship requirement. The Clouds group was given the task of exploring and analyzing ground-based cloud observations obtained by K-12 students as part of the Students' Cloud Observations On-Line (S'COOL) Project, and the corresponding satellite data. This project began in 1997. The primary analysis tools developed for it were in FORTRAN, a computer language none of the students were familiar with. While they persevered through computer challenges and picky syntax, it eventually became obvious that this was not the most fruitful approach for a project aimed at motivating K-12 students to do their own data analysis. Thus, about halfway through the summer the group shifted its focus to more modern data analysis and visualization tools, namely spreadsheets and Google(tm) Earth. The result of their efforts, so far, is two different Excel spreadsheets and a Google(tm) Earth file. The spreadsheets are set up to allow participating classrooms to paste in a particular dataset of interest, using the standard S'COOL format, and easily perform a variety of analyses and comparisons of the ground cloud observation reports and their correspondence with the satellite data. This includes summarizing cloud occurrence and cloud cover statistics, and comparing cloud cover measurements from the two points of view. A visual classification tool is also provided to compare the cloud levels reported from the two viewpoints. This provides a statistical counterpart to the existing S'COOL data visualization tool, which is used for individual ground-to-satellite correspondences. The Google(tm) Earth file contains a set of placemarks and ground overlays to show participating students the area around their school that the satellite is measuring. This approach will be automated and made interactive by the S'COOL database expert and will also be used to help refine the latitude/longitude location of the participating schools. Once complete, these new data analysis tools will be posted on the S'COOL website for use by the project participants in schools around the US and the world.

  7. Graphical modeling and query language for hospitals.

    PubMed

    Barzdins, Janis; Barzdins, Juris; Rencis, Edgars; Sostaks, Agris

    2013-01-01

    So far there has been little evidence that implementation of health information technologies (HIT) is leading to health care cost savings. One of the reasons for this lack of impact likely lies in the complexity of business process ownership in hospitals. The goal of our research is to develop a business model-based method for hospital use that would allow doctors to retrieve ad-hoc information directly from various hospital databases. We have developed a special domain-specific process modelling language called MedMod. Formally, we define the MedMod language as a profile on UML class diagrams, but we also demonstrate it on examples, where we explain the semantics of all its elements informally. Moreover, we have developed the Process Query Language (PQL), which is based on the MedMod process definition language. The purpose of PQL is to allow a doctor to query (filter) runtime data of the hospital's processes described using MedMod. The MedMod language tries to overcome deficiencies in existing process modeling languages by allowing users to specify the loosely defined sequence of steps to be performed in a clinical process. The main advantages of PQL lie in two areas, usability and efficiency: 1) the view on data through the "glasses" of a familiar process; 2) simple and easy-to-perceive means of setting filtering conditions that require no more expertise than using spreadsheet applications; 3) a dynamic response to each step in the construction of the complete query, which shortens the learning curve greatly and reduces the error rate; and 4) the selected means of filtering and data retrieval, which allow queries to be executed in O(n) time with respect to the size of the dataset. We plan to continue developing this project in three further steps. First, we are planning to develop user-friendly graphical editors for the MedMod process modeling and query languages. The second step is to evaluate the usability of the proposed language and tool with physicians from several hospitals in Latvia, working with real data from these hospitals. Our third step is to develop an efficient implementation of the query language.

  8. Automated and comprehensive link engineering supporting branched, ring, and mesh network topologies

    NASA Astrophysics Data System (ADS)

    Farina, J.; Khomchenko, D.; Yevseyenko, D.; Meester, J.; Richter, A.

    2016-02-01

    Link design, while relatively easy in the past, can become quite cumbersome with complex channel plans and equipment configurations. The task of designing optical transport systems and selecting equipment is often performed by an applications or sales engineer using simple tools, such as custom Excel spreadsheets. Eventually, every individual has their own version of the spreadsheet as well as their own methodology for building the network. This approach becomes unmanageable very quickly and leads to mistakes, bending of the engineering rules, and installations that do not perform as expected. We demonstrate a comprehensive planning environment, which offers an efficient approach to unify, control and expedite the design process by controlling libraries of equipment and engineering methodologies, automating the process and providing the analysis tools necessary to predict system performance throughout the system and for all channels. In addition to the placement of EDFAs and DCEs, performance analysis metrics are provided at every step of the way. Metrics that can be tracked include power, chromatic dispersion (CD), OSNR, and the nonlinear and scattering impairments SPM, XPM, FWM and SBS. Automated routine steps assist in design aspects such as equalization, padding and gain setting for EDFAs, the placement of ROADMs and transceivers, and creating regeneration points. DWDM networks consisting of a large number of nodes and repeater huts, interconnected in linear, branched, mesh and ring network topologies, can be designed much faster than with conventional design methods. Using flexible templates for all major optical components, our technology-agnostic planning approach supports the constant advances in optical communications.
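
    As an example of one metric such a planner tracks, the widely used 0.1 nm reference-bandwidth approximation gives the OSNR after N identical EDFA-amplified spans as OSNR(dB) ≈ 58 + P_ch − L_span − NF − 10·log10(N). The sketch below evaluates this for example parameter values; it is a back-of-envelope illustration, not the demonstrated planning environment.

    ```python
    # Back-of-envelope OSNR accumulation over N identical EDFA-amplified spans,
    # using the common 0.1 nm reference-bandwidth approximation. The parameter
    # values are examples, not taken from the paper.
    import math

    def osnr_db(p_ch_dbm, span_loss_db, nf_db, n_spans):
        """OSNR after n identical spans, each EDFA exactly compensating span loss."""
        return 58.0 + p_ch_dbm - span_loss_db - nf_db - 10 * math.log10(n_spans)

    for n in (1, 5, 10, 20):
        print(n, round(osnr_db(p_ch_dbm=0.0, span_loss_db=22.0, nf_db=5.5, n_spans=n), 1))
    ```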

  9. Value of Information spreadsheet

    DOE Data Explorer

    Trainor-Guitton, Whitney

    2014-05-12

    This spreadsheet represents the information posteriors derived from synthetic magnetotelluric (MT) data. These were used to calculate the value of information of MT for geothermal exploration. Information posteriors describe how well MT was able to locate the "throat" of clay caps, which are indicative of hidden geothermal resources. These data are fully explained in the peer-reviewed publication: Trainor-Guitton, W., Hoversten, G. M., Ramirez, A., Roberts, J., Júlíusson, E., Key, K., Mellors, R. (Sept-Oct. 2014) The value of spatial information for determining well placement: a geothermal example, Geophysics.
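
    Value of information of this kind is computed as the lift in expected decision value when the choice (for example, where to drill) is made using the information posteriors rather than the prior alone. The Python sketch below shows the structure of that calculation; all probabilities and payoffs are made-up illustrative numbers, none of them from the publication.

    ```python
    # Hypothetical numbers: value of information as the gain in expected
    # drilling value when decisions use MT-derived posteriors over the prior.
    priors = {"resource": 0.3, "dry": 0.7}
    payoff = {("drill", "resource"): 50e6, ("drill", "dry"): -10e6,
              ("walk", "resource"): 0.0,   ("walk", "dry"): 0.0}

    def best(prob):
        """Expected value of the best action under belief `prob`."""
        return max(sum(prob[s] * payoff[(a, s)] for s in prob) for a in ("drill", "walk"))

    v_prior = best(priors)
    # Posteriors after two possible MT interpretations, weighted by how often
    # each interpretation occurs (illustrative; marginals match the prior).
    interps = [(0.4, {"resource": 0.6, "dry": 0.4}),
               (0.6, {"resource": 0.1, "dry": 0.9})]
    v_informed = sum(w * best(post) for w, post in interps)
    print(f"VOI = {v_informed - v_prior:,.0f} USD")   # -> VOI = 2,400,000 USD
    ```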

  10. Angular Speed of a Compact Disc

    NASA Astrophysics Data System (ADS)

    Sawicki, Mikolaj "Mik"

    2006-09-01

    The spinning motion of a compact disc in a CD player offers an interesting and challenging problem in rotational kinematics with a nonconstant angular acceleration that can be incorporated into a typical introductory physics class for engineers and scientists. It can be used either as an example presented during the lecture, emphasizing the application of calculus, or as a homework assignment that can be handled easily with the help of a spreadsheet, thus eliminating the calculus aspect altogether. I tried both approaches, and my students favored the spreadsheet study.
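
    The standard model behind the problem is a disc read at constant linear velocity v along a spiral of track pitch p, so the read radius grows as r(t) = sqrt(r0^2 + p*v*t/pi) and the angular speed falls as omega = v/r. A minimal numerical version, done the way a student spreadsheet would do it, is sketched below using nominal CD values (not figures from the article).

    ```python
    # Constant-linear-velocity CD model evaluated numerically, spreadsheet-style.
    # v, track pitch, and radii are nominal CD values, not the article's data.
    import math

    v = 1.25               # m/s, constant linear (track) speed
    p = 1.6e-6             # m, spiral track pitch
    r0, r1 = 0.025, 0.058  # m, inner/outer program-area radii

    def radius(t):
        # Spiral geometry: area swept out = pitch * track length = p * v * t
        return math.sqrt(r0**2 + p * v * t / math.pi)

    t_total = math.pi * (r1**2 - r0**2) / (p * v)   # total playing time (~72 min)
    for frac in (0.0, 0.5, 1.0):
        t = frac * t_total
        w = v / radius(t)                            # omega = v / r
        print(f"t = {t/60:5.1f} min   omega = {w:5.1f} rad/s  ({w*60/(2*math.pi):4.0f} rpm)")
    ```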

  11. Reducing perceived barriers to nursing homes data entry in the advancing excellence campaign: the role of LANEs (Local Area Networks for Excellence).

    PubMed

    Bakerjian, Debra; Bonner, Alice; Benner, Carol; Caswell, Cheryl; Weintraub, Alissa; Koren, Mary Jane

    2011-09-01

    Advancing Excellence (AE) is a coalition-based campaign concerned with how society cares for its elderly and disabled citizens. The purpose of this project was to work with a small group of volunteer nursing homes and with local quality improvement networks called LANEs (Local Area Networks for Excellence) in 6 states in a learning collaborative. The collaborative sought to determine effective ways for LANEs to address and mitigate perceived barriers to nursing home data entry in the national Advancing Excellence campaign, and to test methods by which local quality improvement networks could support nursing homes as they enter data on the AE Web site. A semistructured telephone survey of nursing homes was conducted in 6 states. Participants included LANEs from California, Georgia, Massachusetts, Michigan, Oklahoma, and Washington. Facility characteristics were obtained from a series of questions during the telephone interview. Three states (GA, MA, OK) piloted a new spreadsheet and process for entering data on staff turnover, and 3 states (CA, MI, WA) piloted a new spreadsheet and process for entering data on consistent assignment. Many of the nursing homes we contacted had not entered data for organizational goals on the national Web site, but all were able to do so with telephone assistance from the LANE. Eighty-five percent of nursing homes said they would be able to collect information on advance directives if tools (eg, spreadsheets) were provided. Over 40% of nursing homes, including for-profit homes, were willing to have staff and residents/families enter satisfaction data directly on an independent Web site. Nursing homes were able to convey concerns and questions about the process of goal entry, and to offer suggestions to the LANEs during the semistructured telephone interviews. The 6 LANEs discussed nursing home responses on their regularly scheduled calls, and useful strategies were shared across states. Nursing homes reported that they are using Advancing Excellence target setting and goal entry to improve care, and that they would use new tools such as those for measuring satisfaction, consistent assignment, and advance directives. Having LANE members contact nursing homes directly by telephone engaged the nursing homes in providing valuable feedback on new Advancing Excellence goals and data entry. It also provided an opportunity to clarify issues related to the campaign and ongoing quality improvement efforts, including culture change.

  12. A Systems Model for Power Technology Assessment

    NASA Technical Reports Server (NTRS)

    Hoffman, David J.

    2002-01-01

    A computer model is under continuing development at NASA Glenn Research Center that enables first-order assessments of space power technology. The model, an evolution of NASA Glenn's Array Design Assessment Model (ADAM), is an Excel workbook that consists of numerous spreadsheets containing power technology performance data and sizing algorithms. Underlying the model are a number of databases that contain default values for various power generation, energy storage, and power management and distribution component parameters. These databases are actively maintained by a team of systems analysts so that they contain state-of-the-art data as well as the most recent technology performance projections. Sizing of the power subsystems can be accomplished either by using an assumed specific power (W/kg) or specific energy (Wh/kg), or by a bottom-up calculation that accounts for individual component performance and masses. The power generation, energy storage, and power management and distribution subsystems are sized for given mission requirements for a baseline case and up to three alternatives. This allows four different power systems to be sized and compared using consistent assumptions and sizing algorithms. The component sizing models contained in the workbook are modular so that they can be easily maintained and updated. All significant input values have default values loaded from the databases that can be overwritten by the user. The default data and sizing algorithms for each of the power subsystems are described in some detail. The user interface and workbook navigational features are also discussed. Finally, an example study case that illustrates the model's capability is presented.
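
    The two sizing paths described, an assumed subsystem-level specific power versus a bottom-up component stack, can be contrasted with a toy calculation. All numbers in the sketch below are placeholders, not values from the ADAM databases.

    ```python
    # Toy version of the two sizing paths the abstract describes; specific
    # power and efficiency figures are placeholders, not ADAM database values.
    P_load = 5000.0          # W, required bus power

    # Path 1: single assumed specific power for the whole power subsystem
    specific_power = 40.0    # W/kg (assumed)
    mass_top_down = P_load / specific_power

    # Path 2: bottom-up, component by component
    array_w_per_kg = 80.0    # solar array specific power (assumed)
    pmad_eff = 0.92          # power management & distribution efficiency (assumed)
    pmad_kg_per_kw = 12.0    # PMAD mass per kW processed (assumed)
    battery_wh_per_kg = 120.0
    eclipse_energy_wh = 3000.0
    dod = 0.4                # allowed battery depth of discharge

    P_gen = P_load / pmad_eff
    mass_bottom_up = (P_gen / array_w_per_kg
                      + pmad_kg_per_kw * P_gen / 1000.0
                      + eclipse_energy_wh / (battery_wh_per_kg * dod))
    print(f"top-down: {mass_top_down:.0f} kg   bottom-up: {mass_bottom_up:.0f} kg")
    ```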

  13. Nutrient modeling for a semi-intensive IMC pond: an MS-Excel approach.

    PubMed

    Ray, Lala I P; Mal, B C; Moulick, S

    2017-11-01

    Semi-intensive Indian Major Carp (IMC) culture was practised in polythene-lined dugout ponds at the Aquacultural Farm of the Indian Institute of Technology, Kharagpur, West Bengal for 3 consecutive years at three different stocking densities (S.D.), viz., 20,000, 35,000 and 50,000 fingerlings per hectare of water spread area. Fingerlings of Catla, Rohu and Mrigal were raised at a stocking ratio of 4:3:3. Total ammonia nitrogen (TAN), along with other fishpond water quality parameters, was monitored at 1-day intervals to ensure a good water ecosystem for better fish growth. Water exchange was carried out before the TAN reached its critical limit. Field data on TAN obtained from the cultured fishponds stocked at the three different stocking densities were used to study the dynamics of TAN. A model originally developed to study nutrient dynamics in shrimp ponds was validated against the data observed in the IMC pond ecosystem. Two years of observed TAN data were used to calibrate the spreadsheet model, and the same model was validated using the third year's observed data. Manual calibration, based on a trial-and-error process of parameter adjustment, was used, and several simulations were performed by changing the model parameters. After the adjustment of each parameter, the simulated and measured values of the water quality parameters were compared to judge the improvement in the model prediction. A forward finite-difference discretization method was implemented in an MS-Excel spreadsheet to calibrate and validate the model for obtaining the TAN levels during the culture period. Observed data from the cultured fishponds at the three different S.D. were used to standardize 13 model parameters. The efficiency of the developed spreadsheet model was found to be more than 90% for TAN estimation in the IMC cultured fishponds.
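
    The forward finite-difference approach amounts to stepping a mass balance such as dC/dt = sources − k·C forward one day at a time, exactly as a spreadsheet computes row by row. The Python sketch below shows that structure with a single lumped loss rate; the coefficients are illustrative and are not the paper's 13 calibrated parameters.

    ```python
    # Generic forward-finite-difference TAN balance in the spirit of the
    # spreadsheet model; the single-decay structure and all coefficients are
    # illustrative, not the paper's calibrated parameters.
    feed_rate = 50.0      # kg feed/ha/day (example)
    tan_per_feed = 0.03   # kg TAN excreted per kg feed (assumed)
    k_loss = 0.15         # 1/day, lumped nitrification + volatilization (assumed)
    volume_l = 1.0e7      # pond water volume per ha, litres (assumed)

    tan_mg_l = 0.1        # initial TAN concentration, mg/L
    dt = 1.0              # day, matching the 1-day sampling interval
    for day in range(1, 31):
        loading_mg_l = feed_rate * tan_per_feed * 1e6 / volume_l   # mg/L/day
        # Forward Euler step: C(t+dt) = C(t) + dt * (sources - k*C)
        tan_mg_l += dt * (loading_mg_l - k_loss * tan_mg_l)
        if day % 10 == 0:
            print(f"day {day:2d}: TAN = {tan_mg_l:.2f} mg/L")
    ```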

  14. Illuminating the Depths of the MagIC (Magnetics Information Consortium) Database

    NASA Astrophysics Data System (ADS)

    Koppers, A. A. P.; Minnett, R.; Jarboe, N.; Jonestrask, L.; Tauxe, L.; Constable, C.

    2015-12-01

    The Magnetics Information Consortium (http://earthref.org/MagIC/) is a grass-roots cyberinfrastructure effort envisioned by the paleo-, geo-, and rock magnetic scientific community. Its mission is to archive their wealth of peer-reviewed raw data and interpretations from magnetics studies on natural and synthetic samples. Many of these valuable data are legacy datasets that were never published in their entirety, some resided in other databases that are no longer maintained, and others were never digitized from the field notebooks and lab work. Due to the volume of data collected, most studies, modern and legacy, only publish the interpreted results and, occasionally, a subset of the raw data. MagIC is making an extraordinary effort to archive these data in a single data model, including the raw instrument measurements if possible. This facilitates the reproducibility of the interpretations, the re-interpretation of the raw data as the community introduces new techniques, and the compilation of heterogeneous datasets that are otherwise distributed across multiple formats and physical locations. MagIC has developed tools to assist the scientific community in many stages of their workflow. Contributors easily share studies (in a private mode if so desired) in the MagIC Database with colleagues and reviewers prior to publication, publish the data online after the study is peer reviewed, and visualize their data in the context of the rest of the contributions to the MagIC Database. From organizing their data in the MagIC Data Model with an online editable spreadsheet, to validating the integrity of the dataset with automated plots and statistics, MagIC is continually lowering the barriers to transforming dark data into transparent and reproducible datasets. Additionally, this web application generalizes to other databases in MagIC's umbrella website (EarthRef.org) so that the Geochemical Earth Reference Model (http://earthref.org/GERM/) portal, Seamount Biogeosciences Network (http://earthref.org/SBN/), EarthRef Digital Archive (http://earthref.org/ERDA/) and EarthRef Reference Database (http://earthref.org/ERR/) benefit from its development.

  15. Predicting Ga and Cu Profiles in Co-Evaporated Cu(In,Ga)Se2 Using Modified Diffusion Equations and a Spreadsheet

    DOE PAGES

    Repins, Ingrid L.; Harvey, Steve; Bowers, Karen; ...

    2017-05-15

    Cu(In,Ga)Se2 (CIGS) photovoltaic absorbers frequently develop Ga gradients during growth. These gradients vary as a function of growth recipe, and are important to device performance. Prediction of Ga profiles using classic diffusion equations is not possible because In and Ga atoms occupy the same lattice sites and thus diffuse interdependently, and there is not yet a detailed experimental knowledge of the chemical potential as a function of composition that describes this interaction. Here, we show how diffusion equations can be modified to account for site sharing between In and Ga atoms. The analysis has been implemented in an Excel spreadsheet, and outputs predicted Cu, In, and Ga profiles for entered deposition recipes. A single set of diffusion coefficients and activation energies is chosen, such that simulated elemental profiles track with published data and those from this study. Extent and limits of agreement between elemental profiles predicted from the growth recipes and the spreadsheet tool are demonstrated.
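
    The site-sharing constraint means the In and Ga site fractions sum to one, so only the Ga fraction x needs to be evolved (x_In = 1 − x) by a diffusion-like equation. A minimal explicit finite-difference sketch of that idea follows; the grid, diffusion coefficient, and initial bilayer profile are illustrative, not the paper's fitted values.

    ```python
    # Minimal 1-D sketch of the site-sharing idea: with In and Ga competing
    # for the same lattice sites, only the Ga site fraction x is evolved.
    # Grid, D, and the initial bilayer profile are illustrative values only.
    nz, dz = 100, 2.0e-8           # 100 cells of 20 nm (~2 um absorber)
    D = 1.0e-15                    # m^2/s, effective Ga-In interdiffusion coeff.
    dt = 0.4 * dz * dz / (2 * D)   # stable explicit time step (safety factor 0.4)

    x = [1.0 if i < nz // 2 else 0.0 for i in range(nz)]  # Ga-rich over In-rich
    t, t_end = 0.0, 600.0          # simulate 10 minutes of interdiffusion
    while t < t_end:
        xn = x[:]
        for i in range(1, nz - 1):   # explicit FTCS update of the interior
            xn[i] = x[i] + D * dt / dz**2 * (x[i+1] - 2*x[i] + x[i-1])
        xn[0], xn[-1] = xn[1], xn[-2]   # zero-flux boundaries
        x, t = xn, t + dt

    print("Ga fraction at front/middle/back:",
          round(x[0], 2), round(x[nz // 2], 2), round(x[-1], 2))
    ```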

  16. Meta-analyses and Forest plots using a microsoft excel spreadsheet: step-by-step guide focusing on descriptive data analysis.

    PubMed

    Neyeloff, Jeruza L; Fuchs, Sandra C; Moreira, Leila B

    2012-01-20

    Meta-analyses are necessary to synthesize data obtained from primary research, and in many situations reviews of observational studies are the only available alternative. General-purpose statistical packages can meta-analyze data, but usually require external macros or coding. Commercial specialist software is available, but may be expensive and focused on a particular type of primary data. Most available software packages have limitations in dealing with descriptive data, and the graphical display of summary statistics such as incidence and prevalence is unsatisfactory. Analyses can be conducted using Microsoft Excel, but no guide was previously available. We constructed a step-by-step guide to performing a meta-analysis in a Microsoft Excel spreadsheet, using either fixed-effect or random-effects models. We have also developed a second spreadsheet capable of producing customized forest plots. It is possible to conduct a meta-analysis using only Microsoft Excel. More importantly, to our knowledge this is the first description of a method for producing a statistically adequate but graphically appealing forest plot summarizing descriptive data, using widely available software.
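
    The core of such a spreadsheet is the inverse-variance machinery. A bare-bones fixed-effect version is sketched below in Python for readers who want to check their cell formulas; the study effects and standard errors are made-up numbers, not data from the guide.

    ```python
    # Bare-bones inverse-variance fixed-effect pooling with a 95% CI, the
    # calculation the guide's spreadsheet formulas perform. Example data only.
    import math

    studies = [(0.12, 0.05), (0.20, 0.08), (0.15, 0.04)]  # (effect, SE) per study

    weights = [1 / se**2 for _, se in studies]             # inverse-variance weights
    pooled = sum(w * e for (e, _), w in zip(studies, weights)) / sum(weights)
    se_pooled = math.sqrt(1 / sum(weights))
    lo, hi = pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled
    print(f"pooled effect = {pooled:.3f}  (95% CI {lo:.3f} to {hi:.3f})")
    ```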

  17. Meta-analyses and Forest plots using a microsoft excel spreadsheet: step-by-step guide focusing on descriptive data analysis

    PubMed Central

    2012-01-01

    Background Meta-analyses are necessary to synthesize data obtained from primary research, and in many situations reviews of observational studies are the only available alternative. General-purpose statistical packages can meta-analyze data, but usually require external macros or coding. Commercial specialist software is available, but may be expensive and focused on a particular type of primary data. Most available software packages have limitations in dealing with descriptive data, and the graphical display of summary statistics such as incidence and prevalence is unsatisfactory. Analyses can be conducted using Microsoft Excel, but no guide was previously available. Findings We constructed a step-by-step guide to performing a meta-analysis in a Microsoft Excel spreadsheet, using either fixed-effect or random-effects models. We have also developed a second spreadsheet capable of producing customized forest plots. Conclusions It is possible to conduct a meta-analysis using only Microsoft Excel. More importantly, to our knowledge this is the first description of a method for producing a statistically adequate but graphically appealing forest plot summarizing descriptive data, using widely available software. PMID:22264277

  18. Multiphysics Object Oriented Simulation Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    The Multiphysics Object Oriented Simulation Environment (MOOSE) software library developed at Idaho National Laboratory is a tool. MOOSE, like other tools, doesn't actually complete a task. Instead, MOOSE seeks to reduce the effort required to create engineering simulation applications. MOOSE itself is a software library: a blank canvas upon which you write equations, and then MOOSE can help you solve them. MOOSE is comparable to a spreadsheet application. A spreadsheet, by itself, doesn't do anything. Only once equations are entered into it will a spreadsheet application compute anything. The same is true of MOOSE. An engineer or scientist can utilize the equation solvers within MOOSE to solve equations related to their area of study. For instance, a geomechanical scientist can input equations related to water flow in underground reservoirs, and MOOSE can solve those equations to give the scientist an idea of how water could move over time. An engineer might input equations related to the forces in steel beams in order to understand the load-bearing capacity of a bridge. Because MOOSE is a blank canvas, it can be useful in many scientific and engineering pursuits.

  19. Predicting Ga and Cu Profiles in Co-Evaporated Cu(In,Ga)Se2 Using Modified Diffusion Equations and a Spreadsheet

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Repins, Ingrid L.; Harvey, Steve; Bowers, Karen

    Cu(In,Ga)Se2 (CIGS) photovoltaic absorbers frequently develop Ga gradients during growth. These gradients vary as a function of growth recipe, and are important to device performance. Prediction of Ga profiles using classic diffusion equations is not possible because In and Ga atoms occupy the same lattice sites and thus diffuse interdependently, and there is not yet a detailed experimental knowledge of the chemical potential as a function of composition that describes this interaction. Here, we show how diffusion equations can be modified to account for site sharing between In and Ga atoms. The analysis has been implemented in an Excel spreadsheet, and outputs predicted Cu, In, and Ga profiles for entered deposition recipes. A single set of diffusion coefficients and activation energies is chosen, such that simulated elemental profiles track with published data and those from this study. Extent and limits of agreement between elemental profiles predicted from the growth recipes and the spreadsheet tool are demonstrated.

  20. Earth Science Multimedia Theater

    NASA Technical Reports Server (NTRS)

    Hasler, A. F.

    1998-01-01

    The presentation will begin with the latest 1998 NASA Earth Science Vision for the next 25 years. A compilation of the 10 days of Hurricane Georges animations that NASA supplied daily to network television will be shown, along with NASA's visualizations of Hurricane Bonnie, which appeared in the September 7, 1998 issue of TIME magazine. Highlights will be shown from the NASA hurricane visualization resource video tape that has been used repeatedly this season on network TV. Results will be presented from a new paper on automatic wind measurements in Hurricane Luis from 1-min GOES images that will appear in the October BAMS. The visualizations are produced by the Goddard Visualization & Analysis Laboratory and Scientific Visualization Studio, as well as other Goddard and NASA groups, using NASA, NOAA, ESA, and NASDA Earth science datasets. Visualizations will be shown from the "Digital-HyperRes-Panorama" Earth Science ETheater '98 recently presented in Tokyo, Paris and Phoenix. The presentation in Paris used an SGI/CRAY Onyx Infinite Reality Super Graphics Workstation at 2560 x 1024 resolution with dual synchronized Epson 7100 video projectors on a 20-ft-wide screen. Earth Science Electronic Theater 1999 is being prepared for a December 1st showing at NASA HQ in Washington and a January presentation at the AMS meetings in Dallas. The 1999 version of the ETheater will be triple-wide, with a resolution of 3840 x 1024 on a 60-ft-wide screen. Visualizations will also be featured from the new Earth Today Exhibit, which was opened by Vice President Gore on July 2, 1998 at the Smithsonian Air & Space Museum in Washington, as well as those presented for possible use at the American Museum of Natural History (NYC), Disney EPCOT, and other venues. New methods are demonstrated for visualizing, interpreting, comparing, organizing and analyzing immense Hyperimage remote sensing datasets and three-dimensional numerical model results. We call the data from many new Earth-sensing satellites Hyperimage datasets because they have such high resolution in the spectral, temporal, spatial, and dynamic range domains. The traditional numerical spreadsheet paradigm has been extended to develop a scientific visualization approach for processing Hyperimage datasets and 3D model results interactively. The advantages of extending the powerful spreadsheet style of computation to multiple sets of images and organizing image processing were demonstrated using the Distributed Image SpreadSheet (DISS).
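
    The image-spreadsheet idea behind DISS generalizes the cell-and-formula model so that a cell holds a whole image and a formula combines images pixelwise. The toy Python sketch below conveys only the concept; it bears no relation to the actual DISS implementation.

    ```python
    # Toy illustration of the image-spreadsheet concept: cells hold whole
    # images (nested lists standing in for raster data) and formula cells
    # combine them elementwise, like a spreadsheet formula over images.
    cells = {
        "A1": [[10, 20], [30, 40]],   # e.g., an infrared channel
        "A2": [[ 1,  2], [ 3,  4]],   # e.g., a visible channel
    }

    def formula(f, *refs):
        """Apply f pixelwise across the images stored in the referenced cells."""
        imgs = [cells[r] for r in refs]
        return [[f(*px) for px in zip(*rows)] for rows in zip(*imgs)]

    # "B1 = A1 - A2", a channel-difference product, as one might type in a sheet
    cells["B1"] = formula(lambda a, b: a - b, "A1", "A2")
    print(cells["B1"])  # -> [[9, 18], [27, 36]]
    ```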
