NetpathXL - An Excel Interface to the Program NETPATH
Parkhurst, David L.; Charlton, Scott R.
2008-01-01
NetpathXL is a revised version of NETPATH that runs under Windows® operating systems. NETPATH is a computer program that uses inverse geochemical modeling techniques to calculate net geochemical reactions that can account for changes in water composition between initial and final evolutionary waters in hydrologic systems. The inverse models also can account for the isotopic composition of waters and can be used to estimate radiocarbon ages of dissolved carbon in ground water. NETPATH relies on an auxiliary database program, DB, to enter the chemical analyses and to perform speciation calculations that define total concentrations of elements, charge balance, and redox state of aqueous solutions that are then used in inverse modeling. Instead of DB, NetpathXL relies on Microsoft Excel® to enter the chemical analyses. The speciation calculation formerly included in DB is implemented within the program NetpathXL. A program, DBXL, can be used to translate files from the old DB format (.lon files) to NetpathXL spreadsheets, or to create new NetpathXL spreadsheets. Once users have a NetpathXL spreadsheet with the proper format, new spreadsheets can be generated by copying or saving NetpathXL spreadsheets. In addition, DBXL can convert NetpathXL spreadsheets to PHREEQC input files. New capabilities in PHREEQC (version 2.15) allow solution compositions to be written to a .lon file, and inverse models developed in PHREEQC to be written as NetpathXL .pat and model files. NetpathXL can open NetpathXL spreadsheets, NETPATH-format path files (.pat files), and NetpathXL-format path files (.pat files). Once the speciation calculations have been performed on a spreadsheet file or a .pat file has been opened, the NetpathXL calculation engine is identical to the original NETPATH. Development of models and viewing of results in NetpathXL rely on keyboard entry, as in NETPATH.
GREET 1.5 : transportation fuel-cycle model. Vol. 1 : methodology, development, use, and results.
DOT National Transportation Integrated Search
1999-10-01
This report documents the development and use of the most recent version (Version 1.5) of the Greenhouse Gases, Regulated Emissions, and Energy Use in Transportation (GREET) model. The model, developed in a spreadsheet format, estimates the full fuel...
Using Spreadsheets to Teach Statistics in Geography.
ERIC Educational Resources Information Center
Lee, M. P.; Soper, J. B.
1987-01-01
Maintains that teaching methods of statistical calculation in geography may be enhanced by using a computer spreadsheet. The spreadsheet format of rows and columns allows the data to be inspected and altered to demonstrate various statistical properties. The inclusion of graphics and database facilities further adds to the value of a spreadsheet.…
CALCULATIONAL TOOL FOR SKIN CONTAMINATION DOSE ESTIMATE
DOE Office of Scientific and Technical Information (OSTI.GOV)
HILL, R.L.
2005-03-31
A spreadsheet calculational tool was developed to automate the calculations performed for estimating dose from skin contamination. This document reports on the design and testing of the spreadsheet calculational tool.
Automated Formative Feedback and Summative Assessment Using Individualised Spreadsheet Assignments
ERIC Educational Resources Information Center
Blayney, Paul; Freeman, Mark
2004-01-01
This paper reports on the effects of automating formative feedback at the student's discretion and automating summative assessment with individualised spreadsheet assignments. Quality learning outcomes are achieved when students adopt deep approaches to learning (Ramsden, 2003). Learning environments designed to align assessment to learning…
Spreadsheet log analysis in subsurface geology
Doveton, J.H.
2000-01-01
Most of the direct knowledge of the geology of the subsurface is gained from the examination of core and drill-cuttings recovered from boreholes drilled by the petroleum and water industries. Wireline logs run in these same boreholes generally have been restricted to tasks of lithostratigraphic correlation and the location of hydrocarbon pay zones. However, the range of petrophysical measurements has expanded markedly in recent years, so that log traces now can be transformed to estimates of rock composition. Increasingly, logs are available in a digital format that can be read easily by a desktop computer and processed by simple spreadsheet software methods. Taken together, these developments offer accessible tools for new insights into subsurface geology that complement the traditional, but limited, sources of core and cutting observations.
User's guide: RPGrow$: a red pine growth and analysis spreadsheet for the Lake States.
Carol A. Hyldahl; Gerald H. Grossman
1993-01-01
Describes RPGrow$, a stand-level, interactive spreadsheet for projecting growth and yield and estimating financial returns of red pine plantations in the Lake States. This spreadsheet is based on published growth models for red pine. Financial analyses are based on discounted cash flow methods.
Documentation of spreadsheets for the analysis of aquifer-test and slug-test data
Halford, Keith J.; Kuniansky, Eve L.
2002-01-01
Several spreadsheets have been developed for the analysis of aquifer-test and slug-test data. Each spreadsheet incorporates analytical solution(s) of the partial differential equation for ground-water flow to a well for a specific type of condition or aquifer. The derivations of the analytical solutions were previously published. Thus, this report abbreviates the theoretical discussion, but includes practical information about each method and the important assumptions for the applications of each method. These spreadsheets were written in Microsoft Excel 9.0 (use of trade names does not constitute endorsement by the USGS). Storage properties should not be estimated with many of the spreadsheets because most are for analyzing single-well tests. Estimation of storage properties from single-well tests is generally discouraged because single-well tests are affected by wellbore storage and by well construction. These non-ideal effects frequently cause estimates of storage to be erroneous by orders of magnitude. Additionally, single-well tests are not sensitive to aquifer-storage properties. Single-well tests include all slug tests (Bouwer and Rice Method, Cooper-Bredehoeft-Papadopulos Method, and van der Kamp Method), the Cooper-Jacob straight-line Method, Theis recovery-data analysis, the Jacob-Lohman method for flowing wells in a confined aquifer, and the step-drawdown test. Multi-well test spreadsheets included in this report are the Hantush-Jacob Leaky Aquifer Method and Distance-Drawdown Methods. The distance-drawdown method is an equilibrium or steady-state method; thus, storage cannot be estimated.
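For orientation, the Cooper-Jacob straight-line method named above fits late-time drawdown against the logarithm of time; in its standard textbook form (stated here for context, not quoted from this report), transmissivity follows from the drawdown change per log cycle:

\[
s = \frac{2.303\,Q}{4\pi T}\,\log_{10}\!\left(\frac{2.25\,T\,t}{r^{2}S}\right),
\qquad
T = \frac{2.303\,Q}{4\pi\,\Delta s}
\]

where Q is the pumping rate, r the radial distance from the pumped well, S the storativity, and Δs the drawdown per log cycle of time. Relations of this shape map directly onto spreadsheet cells, which is what makes the spreadsheet implementation natural.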
Maxine: A spreadsheet for estimating dose from chronic atmospheric radioactive releases
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jannik, Tim; Bell, Evaleigh; Dixon, Kenneth
MAXINE is an EXCEL© spreadsheet, which is used to estimate dose to individuals for routine and accidental atmospheric releases of radioactive materials. MAXINE does not contain an atmospheric dispersion model; rather, doses are estimated using air and ground concentrations as input. Minimal input is required to run the program, and site-specific parameters are used when possible. A complete code description, verification of the models, and a user’s manual are included.
The Spreadsheet in an Educational Setting. Microcomputing Working Paper Series F 84-4.
ERIC Educational Resources Information Center
Wozny, Lucy
This overview of a specific spreadsheet, Microsoft's Multiplan for the Apple Macintosh microcomputer, emphasizes specific features that are important to the academic community, including the mathematical functions of algebra, trigonometry, and statistical analysis. Additional features are summarized, including data formats for both numerical and…
Analysis of Slug Tests in Formations of High Hydraulic Conductivity
Butler, J.J.; Garnett, E.J.; Healey, J.M.
2003-01-01
A new procedure is presented for the analysis of slug tests performed in partially penetrating wells in formations of high hydraulic conductivity. This approach is a simple, spreadsheet-based implementation of existing models that can be used for analysis of tests from confined or unconfined aquifers. Field examples of tests exhibiting oscillatory and nonoscillatory behavior are used to illustrate the procedure and to compare results with estimates obtained using alternative approaches. The procedure is considerably simpler than recently proposed methods for this hydrogeologic setting. Although the simplifications required by the approach can introduce error into hydraulic-conductivity estimates, this additional error becomes negligible when appropriate measures are taken in the field. These measures are summarized in a set of practical field guidelines for slug tests in highly permeable aquifers.
NETL CO2 Storage prospeCtive Resource Estimation Excel aNalysis (CO2-SCREEN) User's Manual
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sanguinito, Sean M.; Goodman, Angela; Levine, Jonathan
This user’s manual guides the use of the National Energy Technology Laboratory’s (NETL) CO2 Storage prospeCtive Resource Estimation Excel aNalysis (CO2-SCREEN) tool, which was developed to aid users screening saline formations for prospective CO2 storage resources. CO2-SCREEN applies U.S. Department of Energy (DOE) methods and equations for estimating prospective CO2 storage resources for saline formations. CO2-SCREEN was developed to be substantive and user-friendly. It also provides a consistent method for calculating prospective CO2 storage resources that allows for consistent comparison of results between different research efforts, such as the Regional Carbon Sequestration Partnerships (RCSP). CO2-SCREEN consists of an Excel spreadsheet containing geologic inputs and outputs, linked to a GoldSim Player model that calculates prospective CO2 storage resources via Monte Carlo simulation.
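For context, the DOE volumetric method that CO2-SCREEN implements is commonly published in the form below (stated here for orientation; consult the manual itself for the exact equations and terms):

\[
G_{\mathrm{CO_2}} = A\,h\,\phi\,\rho_{\mathrm{CO_2}}\,E
\]

where A is the formation area, h the net thickness, φ the porosity, ρ the CO2 density at reservoir conditions, and E the storage efficiency factor. Sampling inputs such as φ and E by Monte Carlo simulation, as the GoldSim Player model does, yields a distribution of prospective storage resources rather than a single value.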
Keemei: cloud-based validation of tabular bioinformatics file formats in Google Sheets.
Rideout, Jai Ram; Chase, John H; Bolyen, Evan; Ackermann, Gail; González, Antonio; Knight, Rob; Caporaso, J Gregory
2016-06-13
Bioinformatics software often requires human-generated tabular text files as input and has specific requirements for how those data are formatted. Users frequently manage these data in spreadsheet programs, which is convenient for researchers who are compiling the requisite information because the spreadsheet programs can easily be used on different platforms including laptops and tablets, and because they provide a familiar interface. It is increasingly common for many different researchers to be involved in compiling these data, including study coordinators, clinicians, lab technicians and bioinformaticians. As a result, many research groups are shifting toward using cloud-based spreadsheet programs, such as Google Sheets, which support the concurrent editing of a single spreadsheet by different users working on different platforms. Most of the researchers who enter data are not familiar with the formatting requirements of the bioinformatics programs that will be used, so validating and correcting file formats is often a bottleneck prior to beginning bioinformatics analysis. We present Keemei, a Google Sheets Add-on, for validating tabular files used in bioinformatics analyses. Keemei is available free of charge from Google's Chrome Web Store. Keemei can be installed and run on any web browser supported by Google Sheets. Keemei currently supports the validation of two widely used tabular bioinformatics formats, the Quantitative Insights into Microbial Ecology (QIIME) sample metadata mapping file format and the Spatially Referenced Genetic Data (SRGD) format, but is designed to easily support the addition of others. Keemei will save researchers time and frustration by providing a convenient interface for tabular bioinformatics file format validation. By allowing everyone involved with data entry for a project to easily validate their data, it will reduce the validation and formatting bottlenecks that are commonly encountered when human-generated data files are first used with a bioinformatics system. Simplifying the validation of essential tabular data files, such as sample metadata, will reduce common errors and thereby improve the quality and reliability of research outcomes.
A Computer Spreadsheet for Locating Assistive Devices.
ERIC Educational Resources Information Center
Palmer, Catherine V.; Garstecki, Dean C.
1988-01-01
The article presents a directory of assistive devices for persons with hearing impairments in a grid format by distributor and type of device (alerting devices, telephone, TV/radio/stereo, personal communication, group communication, and other). The product locator is also available in spreadsheet form for either the Macintosh or IBM-PC computers.…
DOT National Transportation Integrated Search
2014-03-01
This study resulted in the development of the GASCAP model (the Greenhouse Gas Assessment Spreadsheet for Transportation Capital Projects). This spreadsheet model provides a user-friendly interface for determining the greenhouse gas (GHG) emissions...
Handling Math Expressions in Economics: Recoding Spreadsheet Teaching Tool of Growth Models
ERIC Educational Resources Information Center
Moro-Egido, Ana I.; Pedauga, Luis E.
2017-01-01
In the present paper, we develop a teaching methodology for economic theory. The main contribution of this paper relies on combining the interactive characteristics of spreadsheet programs such as Excel and Unicode plain-text linear format for mathematical expressions. The advantage of Unicode standard rests on its ease for writing and reading…
ERIC Educational Resources Information Center
Sims, Paul A.
2010-01-01
An approach is presented that utilizes a spreadsheet to allow students to explore different means of calculating and visualizing how the charge on peptides and proteins varies as a function of pH. In particular, the concept of isoelectric point is developed to allow students to compare the results of their spreadsheet calculations with those of…
Gazoorian, Christopher L.
2015-01-01
A graphical user interface, with an integrated spreadsheet summary report, has been developed to estimate and display the daily mean streamflows and statistics and to evaluate different water management or water withdrawal scenarios with the estimated monthly data. This package of regression equations, U.S. Geological Survey streamgage data, and spreadsheet application produces an interactive tool to estimate an unaltered daily streamflow hydrograph and streamflow statistics at ungaged sites in New York. Among other uses, the New York Streamflow Estimation Tool can assist water managers with permitting water withdrawals, implementing habitat protection, estimating contaminant loads, or determining the potential effect from chemical spills.
Gandy, Lisa M; Gumm, Jordan; Fertig, Benjamin; Thessen, Anne; Kennish, Michael J; Chavan, Sameer; Marchionni, Luigi; Xia, Xiaoxin; Shankrit, Shambhavi; Fertig, Elana J
2017-01-01
Scientists have unprecedented access to a wide variety of high-quality datasets. These datasets, which are often independently curated, commonly use unstructured spreadsheets to store their data. Standardized annotations are essential to perform synthesis studies across investigators, but are often not used in practice. Therefore, accurately combining records in spreadsheets from differing studies requires tedious and error-prone human curation. These efforts result in a significant time and cost barrier to synthesis research. We propose an information retrieval inspired algorithm, Synthesize, that merges unstructured data automatically based on both column labels and values. Application of the Synthesize algorithm to cancer and ecological datasets had high accuracy (on the order of 85-100%). We further implement Synthesize in an open source web application, Synthesizer (https://github.com/lisagandy/synthesizer). The software accepts input as spreadsheets in comma separated value (CSV) format, visualizes the merged data, and outputs the results as a new spreadsheet. Synthesizer includes an easy to use graphical user interface, which enables the user to finish combining data and obtain perfect accuracy. Future work will allow detection of units to automatically merge continuous data and application of the algorithm to other data formats, including databases.
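As a rough illustration of matching columns by both labels and values (the general idea behind Synthesize as described above; this sketch is hypothetical and is not the authors' algorithm or code):

```python
# Hypothetical sketch of column matching by label and value similarity;
# not the authors' Synthesize implementation.
import difflib

def column_similarity(label_a, values_a, label_b, values_b,
                      w_label=0.5, w_values=0.5):
    """Score two spreadsheet columns for merging (0..1)."""
    # Label similarity from a simple string ratio.
    label_sim = difflib.SequenceMatcher(
        None, label_a.lower(), label_b.lower()).ratio()
    # Value similarity from Jaccard overlap of distinct cell values.
    set_a, set_b = set(values_a), set(values_b)
    value_sim = len(set_a & set_b) / max(len(set_a | set_b), 1)
    return w_label * label_sim + w_values * value_sim

# Example: two columns from different studies that likely belong together.
score = column_similarity("Salinity (ppt)", ["30", "31", "29"],
                          "salinity_ppt", ["29", "30", "32"])
print(round(score, 2))
```

Combining a label score with a value-overlap score is what lets such a method merge columns whose headers differ but whose contents clearly agree.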
The meaning of diagnostic test results: a spreadsheet for swift data analysis.
Maceneaney, P M; Malone, D E
2000-03-01
To design a spreadsheet program to: (a) rapidly analyse diagnostic test result data produced in local research or reported in the literature; (b) correct reported predictive values for disease prevalence in any population; (c) estimate the post-test probability of disease in individual patients. Microsoft Excel™ was used. Section A: a contingency (2 × 2) table was incorporated into the spreadsheet. Formulae for standard calculations [sample size, disease prevalence, sensitivity and specificity with 95% confidence intervals, predictive values and likelihood ratios (LRs)] were linked to this table. The results change automatically when the data in the true or false negative and positive cells are changed. Section B: this estimates predictive values in any population, compensating for altered disease prevalence. Sections C-F: Bayes' theorem was incorporated to generate individual post-test probabilities. The spreadsheet generates 95% confidence intervals, LRs, and a table and graph of conditional probabilities once the sensitivity and specificity of the test are entered. The latter shows the expected post-test probability of disease for any pre-test probability when a test of known sensitivity and specificity is positive or negative. This spreadsheet can be used on desktop and palmtop computers. The MS Excel™ version can be downloaded via the Internet from the URL ftp://radiography.com/pub/Rad-data99.xls. A spreadsheet is useful for contingency table data analysis and assessment of the clinical meaning of diagnostic test results. Copyright 2000 The Royal College of Radiologists.
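For reference, the standard likelihood-ratio form of Bayes' theorem that a spreadsheet of this kind presumably encodes in its Sections C-F is:

\[
LR^{+} = \frac{\text{sensitivity}}{1 - \text{specificity}},
\qquad
\frac{p_{\text{post}}}{1 - p_{\text{post}}}
= \frac{p_{\text{pre}}}{1 - p_{\text{pre}}} \times LR
\]

For example, with sensitivity 0.90 and specificity 0.80, LR+ = 4.5; a pre-test probability of 0.20 (odds 0.25) then gives post-test odds of 1.125 after a positive result, i.e. a post-test probability of about 0.53.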
Reed, Shelby D; Li, Yanhong; Kamble, Shital; Polsky, Daniel; Graham, Felicia L; Bowers, Margaret T; Samsa, Gregory P; Paul, Sara; Schulman, Kevin A; Whellan, David J; Riegel, Barbara J
2012-01-01
Patient-centered health care interventions, such as heart failure disease management programs, are under increasing pressure to demonstrate good value. Variability in costing methods and assumptions in economic evaluations of such interventions limit the comparability of cost estimates across studies. Valid cost estimation is critical to conducting economic evaluations and for program budgeting and reimbursement negotiations. Using sound economic principles, we developed the Tools for Economic Analysis of Patient Management Interventions in Heart Failure (TEAM-HF) Costing Tool, a spreadsheet program that can be used by researchers and health care managers to systematically generate cost estimates for economic evaluations and to inform budgetary decisions. The tool guides users on data collection and cost assignment for associated personnel, facilities, equipment, supplies, patient incentives, miscellaneous items, and start-up activities. The tool generates estimates of total program costs, cost per patient, and cost per week and presents results using both standardized and customized unit costs for side-by-side comparisons. Results from pilot testing indicated that the tool was well-formatted, easy to use, and followed a logical order. Cost estimates of a 12-week exercise training program in patients with heart failure were generated with the costing tool and were found to be consistent with estimates published in a recent study. The TEAM-HF Costing Tool could prove to be a valuable resource for researchers and health care managers to generate comprehensive cost estimates of patient-centered interventions in heart failure or other conditions for conducting high-quality economic evaluations and making well-informed health care management decisions.
OntoMaton: a bioportal powered ontology widget for Google Spreadsheets.
Maguire, Eamonn; González-Beltrán, Alejandra; Whetzel, Patricia L; Sansone, Susanna-Assunta; Rocca-Serra, Philippe
2013-02-15
Data collection in spreadsheets is ubiquitous, but current solutions lack support for collaborative semantic annotation that would promote shared and interdisciplinary annotation practices, supporting geographically distributed players. OntoMaton is an open source solution that brings ontology lookup and tagging capabilities into a cloud-based collaborative editing environment, harnessing Google Spreadsheets and the NCBO Web services. It is a general purpose, format-agnostic tool that may serve as a component of the ISA software suite. OntoMaton can also be used to assist the ontology development process. OntoMaton is freely available from Google widgets under the CPAL open source license; documentation and examples at: https://github.com/ISA-tools/OntoMaton.
An Integrated Management Support and Production Control System for Hardwood Forest Products
Guillermo A. Mendoza; Roger J. Meimban; William Sprouse; William G. Luppold; Philip A. Araman
1991-01-01
Spreadsheet and simulation models are tools which enable users to analyze a large number of variables affecting hardwood material utilization and profit in a systematic fashion. This paper describes two spreadsheet models, SEASaw and SEAIn, and a hardwood sawmill simulator. SEASaw is designed to estimate the amount of conversion from timber to lumber, while SEAIn is a...
Probabilistic assessment methodology for continuous-type petroleum accumulations
Crovelli, R.A.
2003-01-01
The analytic resource assessment method, called ACCESS (Analytic Cell-based Continuous Energy Spreadsheet System), was developed to calculate estimates of petroleum resources for the geologic assessment model, called FORSPAN, in continuous-type petroleum accumulations. The ACCESS method is based upon mathematical equations derived from probability theory in the form of a computer spreadsheet system. © 2003 Elsevier B.V. All rights reserved.
Engine Icing Data - An Analytics Approach
NASA Technical Reports Server (NTRS)
Fitzgerald, Brooke A.; Flegel, Ashlie B.
2017-01-01
Engine icing researchers at the NASA Glenn Research Center use the Escort data acquisition system in the Propulsion Systems Laboratory (PSL) to generate and collect a tremendous amount of data every day. Currently these researchers spend countless hours processing and formatting their data, selecting important variables, and plotting relationships between variables, all by hand, generally analyzing data in a spreadsheet-style program (such as Microsoft Excel). Though spreadsheet-style analysis is familiar and intuitive to many, processing data in spreadsheets is often unreproducible and small mistakes are easily overlooked. Spreadsheet-style analysis is also time inefficient. The same formatting, processing, and plotting procedure has to be repeated for every dataset, which leads to researchers performing the same tedious data munging process over and over instead of making discoveries within their data. This paper documents a data analysis tool written in Python hosted in a Jupyter notebook that vastly simplifies the analysis process. From the file path of any folder containing time series datasets, this tool batch loads every dataset in the folder, processes the datasets in parallel, and ingests them into a widget where users can search for and interactively plot subsets of columns in a number of ways with a click of a button, easily and intuitively comparing their data and discovering interesting dynamics. Furthermore, comparing variables across data sets and integrating video data (while extremely difficult with spreadsheet-style programs) is quite simplified in this tool. This tool has also gathered interest outside the engine icing branch, and will be used by researchers across NASA Glenn Research Center. This project exemplifies the enormous benefit of automating data processing, analysis, and visualization, and will help researchers move from raw data to insight in a much smaller time frame.
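The workflow described above (batch-loading every time-series dataset in a folder, then processing in parallel) can be sketched in a few lines of Python. The pandas-based structure, the folder name, and the CSV layout here are assumptions for illustration, not details of the NASA tool:

```python
# Minimal sketch: batch-load all time-series CSVs in a folder in parallel
# and concatenate them for analysis. Hypothetical layout, not the NASA code.
from concurrent.futures import ProcessPoolExecutor
from pathlib import Path

import pandas as pd

def load_one(path):
    """Read one time-series dataset and tag it with its source file."""
    df = pd.read_csv(path)
    df["source"] = path.name
    return df

def load_folder(folder):
    """Load every CSV in a folder in parallel and concatenate."""
    paths = sorted(Path(folder).glob("*.csv"))
    with ProcessPoolExecutor() as pool:
        frames = list(pool.map(load_one, paths))
    return pd.concat(frames, ignore_index=True)

if __name__ == "__main__":
    data = load_folder("psl_runs")  # hypothetical folder name
    print(data.columns.tolist())
```

Scripted loading of this kind is what makes the analysis reproducible: the same code path runs on every dataset, instead of a hand-repeated spreadsheet procedure.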
ISA-TAB-Nano: a specification for sharing nanomaterial research data in spreadsheet-based format.
Thomas, Dennis G; Gaheen, Sharon; Harper, Stacey L; Fritts, Martin; Klaessig, Fred; Hahn-Dantona, Elizabeth; Paik, David; Pan, Sue; Stafford, Grace A; Freund, Elaine T; Klemm, Juli D; Baker, Nathan A
2013-01-14
The high-throughput genomics communities have been successfully using standardized spreadsheet-based formats to capture and share data within labs and among public repositories. The nanomedicine community has yet to adopt similar standards to share the diverse and multi-dimensional types of data (including metadata) pertaining to the description and characterization of nanomaterials. Owing to the lack of standardization in representing and sharing nanomaterial data, most of the data currently shared via publications and data resources are incomplete, poorly-integrated, and not suitable for meaningful interpretation and re-use of the data. Specifically, in its current state, data cannot be effectively utilized for the development of predictive models that will inform the rational design of nanomaterials. We have developed a specification called ISA-TAB-Nano, which comprises four spreadsheet-based file formats for representing and integrating various types of nanomaterial data. Three file formats (Investigation, Study, and Assay files) have been adapted from the established ISA-TAB specification; while the Material file format was developed de novo to more readily describe the complexity of nanomaterials and associated small molecules. In this paper, we have discussed the main features of each file format and how to use them for sharing nanomaterial descriptions and assay metadata. The ISA-TAB-Nano file formats provide a general and flexible framework to record and integrate nanomaterial descriptions, assay data (metadata and endpoint measurements) and protocol information. Like ISA-TAB, ISA-TAB-Nano supports the use of ontology terms to promote standardized descriptions and to facilitate search and integration of the data. The ISA-TAB-Nano specification has been submitted as an ASTM work item to obtain community feedback and to provide a nanotechnology data-sharing standard for public development and adoption.
DOE Office of Scientific and Technical Information (OSTI.GOV)
McKone, T.E.; Enoch, K.G.
2002-08-01
CalTOX has been developed as a set of spreadsheet models and spreadsheet data sets to assist in assessing human exposures from continuous releases to multiple environmental media, i.e. air, soil, and water. It has also been used for waste classification and for setting soil clean-up levels at uncontrolled hazardous waste sites. The modeling components of CalTOX include a multimedia transport and transformation model, multi-pathway exposure scenario models, and add-ins to quantify and evaluate uncertainty and variability. All parameter values used as inputs to CalTOX are distributions, described in terms of mean values and a coefficient of variation, rather than as point estimates or plausible upper values such as most other models employ. This probabilistic approach allows both sensitivity and uncertainty analyses to be directly incorporated into the model operation. This manual provides CalTOX users with a brief overview of the CalTOX spreadsheet model and provides instructions for using the spreadsheet to make deterministic and probabilistic calculations of source-dose-risk relationships.
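As a sketch of how a (mean, coefficient-of-variation) parameter description supports probabilistic calculation: under an assumed lognormal shape (an assumption made for this example, not necessarily CalTOX's choice for every parameter), moment matching gives the sampling parameters directly:

```python
# Hedged sketch: turn a (mean, CV) input pair into Monte Carlo samples.
# The lognormal shape is an assumption for illustration.
import numpy as np

def sample_from_mean_cv(mean, cv, n, rng):
    """Sample a lognormal with the given arithmetic mean and CV."""
    sigma2 = np.log(1.0 + cv**2)          # matches the target CV
    mu = np.log(mean) - sigma2 / 2.0      # matches the target mean
    return rng.lognormal(mu, np.sqrt(sigma2), size=n)

rng = np.random.default_rng(0)
samples = sample_from_mean_cv(mean=2.0, cv=0.5, n=100_000, rng=rng)
print(samples.mean(), samples.std() / samples.mean())  # ~2.0, ~0.5
```

Propagating such samples through the exposure equations is what turns a single source-dose-risk calculation into a sensitivity and uncertainty analysis.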
Code of Federal Regulations, 2010 CFR
2010-07-01
... Use. (e) Format and delivery. (1) Electronic format only. Reports of use must be maintained and delivered in electronic format only, as prescribed in paragraphs (e)(2) through (8) of this section. A hard... spreadsheet templates. All report of use data files must be delivered in ASCII format. However, to facilitate...
[Development of an Excel spreadsheet for meta-analysis of indirect and mixed treatment comparisons].
Tobías, Aurelio; Catalá-López, Ferrán; Roqué, Marta
2014-01-01
Meta-analyses in clinical research usually aim to evaluate treatment efficacy and safety in direct comparison with a unique comparator. Indirect comparisons, using Bucher's method, can summarize primary data when information from direct comparisons is limited or nonexistent. Mixed comparisons allow combining estimates from direct and indirect comparisons, increasing statistical power. There is a need for simple applications for meta-analysis of indirect and mixed comparisons. These can easily be conducted using a Microsoft Office Excel spreadsheet. We developed an easy-to-use spreadsheet for indirect and mixed comparisons, aimed at clinical researchers who are interested in systematic reviews but not familiar with more advanced statistical packages. The proposed Excel spreadsheet for indirect and mixed comparisons can be of great use in clinical epidemiology to extend the knowledge provided by traditional meta-analysis when evidence from direct comparisons is limited or nonexistent.
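For orientation, Bucher's adjusted indirect comparison of treatments A and B through a common comparator C, and the inverse-variance pooling used for mixed comparisons, take the standard forms:

\[
\hat d_{AB}^{\,\text{ind}} = \hat d_{AC} - \hat d_{BC},
\qquad
\operatorname{Var}\!\big(\hat d_{AB}^{\,\text{ind}}\big)
= \operatorname{Var}\!\big(\hat d_{AC}\big) + \operatorname{Var}\!\big(\hat d_{BC}\big)
\]

\[
\hat d_{AB}^{\,\text{mixed}}
= \frac{w_{\text{dir}}\,\hat d_{AB}^{\,\text{dir}} + w_{\text{ind}}\,\hat d_{AB}^{\,\text{ind}}}
       {w_{\text{dir}} + w_{\text{ind}}},
\qquad w = \frac{1}{\operatorname{Var}(\hat d)}
\]

where the d's are effect estimates on a scale where differencing is valid (for example, log odds ratios). Formulae of this shape are exactly what fit naturally into spreadsheet cells.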
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-23
... a separate document, our preferred file format is Microsoft Word. If you attach multiple comments (such as form letters), our preferred format is a Microsoft Excel spreadsheet. (2) By Hard Copy: Submit...
FHWA infrastructure carbon estimator : final report and user’s guide.
DOT National Transportation Integrated Search
2014-09-01
This study developed a method of estimating energy and GHG emissions from construction and maintenance of transportation systems. Designed as a spreadsheet-based model for practitioners, FHWA's Infrastructure Carbon Estimator is based on data colle...
Code of Federal Regulations, 2012 CFR
2012-04-01
...; electronically stored or transmitted information or data; books; papers; correspondence; accounts; financial... electronic records or paper documents; (2) Electronic information which is in a readable format such as a facsimile paper format or an electronic or hardcopy spreadsheet; (3) In the case of a paper record that is...
Soda Lake Well Lithology Data and Geologic Cross-Sections
Faulds, James E.
2013-12-31
Comprehensive catalogue of drill‐hole data in spreadsheet, shapefile, and Geosoft database formats. Includes XYZ locations of well heads, year drilled, type of well, operator, total depths, well path data (deviations), lithology logs, and temperature data. Plus, 13 cross‐sections in Adobe Illustrator format.
Crovelli, Robert A.; revised by Charpentier, Ronald R.
2012-01-01
The U.S. Geological Survey (USGS) periodically assesses petroleum resources of areas within the United States and the world. The purpose of this report is to explain the development of an analytic probabilistic method and spreadsheet software system called Analytic Cell-Based Continuous Energy Spreadsheet System (ACCESS). The ACCESS method is based upon mathematical equations derived from probability theory. The ACCESS spreadsheet can be used to calculate estimates of the undeveloped oil, gas, and NGL (natural gas liquids) resources in a continuous-type assessment unit. An assessment unit is a mappable volume of rock in a total petroleum system. In this report, the geologic assessment model is defined first, the analytic probabilistic method is described second, and the spreadsheet ACCESS is described third. In this revised version of Open-File Report 00-044, the text has been updated to reflect modifications that were made to the ACCESS program. Two versions of the program are added as appendixes.
Using a spreadsheet/table template for economic value added analysis.
Cassey, Margaret
2008-01-01
Translating clinical research into practical applications that are cost effective has received significant attention as staff nurses attempt to expand new knowledge into an already complex daily workflow. A spreadsheet/table template created in a word-processing format can assist with setting up and carrying out the analysis of costs for comparing different approaches to routine activities. By encouraging nurses to take the initiative to examine parts of everyday nursing practice with an eye to cost analysis, significant contributions can be made to maximizing the bottom line.
Spreadsheet WATERSHED modeling for nonpoint-source pollution management in a Wisconsin basin
Walker, J.F.; Pickard, S.A.; Sonzogni, W.C.
1989-01-01
Although several sophisticated nonpoint pollution models exist, few are available that are easy to use, cover a variety of conditions, and integrate a wide range of information to allow managers and planners to assess different control strategies. Here, a straightforward pollutant input accounting approach is presented in the form of an existing model (WATERSHED) that has been adapted to run on modern electronic spreadsheets. As an application, WATERSHED is used to assess options to improve the quality of highly eutrophic Delavan Lake in Wisconsin. WATERSHED is flexible in that several techniques, such as the Universal Soil Loss Equation or unit-area loadings, can be used to estimate nonpoint-source inputs. Once the model parameters are determined (and calibrated, if possible), the spreadsheet features can be used to conduct a sensitivity analysis of management options. In the case of Delavan Lake, it was concluded that, although some nonpoint controls were cost-effective, the overall reduction in phosphorus would be insufficient to measurably improve water quality.
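For reference, the Universal Soil Loss Equation named above has the standard multiplicative form:

\[
A = R \times K \times LS \times C \times P
\]

where A is the predicted average annual soil loss, R the rainfall erosivity factor, K the soil erodibility factor, LS the slope length-steepness factor, C the cover-management factor, and P the support practice factor. Each factor maps naturally onto a spreadsheet column, which is why the equation suits a spreadsheet-based accounting model.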
Ladtap XL Version 2017: A Spreadsheet For Estimating Dose Resulting From Aqueous Releases
DOE Office of Scientific and Technical Information (OSTI.GOV)
Minter, K.; Jannik, T.
LADTAP XL© is an EXCEL© spreadsheet used to estimate dose to offsite individuals and populations resulting from routine and accidental releases of radioactive materials to the Savannah River. LADTAP XL© contains two worksheets: LADTAP and IRRIDOSE. The LADTAP worksheet estimates dose for environmental pathways including external exposure resulting from recreational activities on the Savannah River and internal exposure resulting from ingestion of water, fish, and invertebrates originating from the Savannah River. IRRIDOSE estimates offsite dose to individuals and populations from irrigation of foodstuffs with contaminated water from the Savannah River. In 2004, a complete description of the LADTAP XL© code and an associated user’s manual was documented in LADTAP XL©: A Spreadsheet for Estimating Dose Resulting from Aqueous Release (WSRC-TR-2004-00059), and revised input parameters, dose coefficients, and radionuclide decay constants were incorporated into LADTAP XL© Version 2013 (SRNL-STI-2011-00238). LADTAP XL© Version 2017 is a slight modification of Version 2013, with minor changes made for more user-friendly parameter inputs and organization, updates to the time conversion factors used within the dose calculations, and a fix for an issue with the expected time build-up parameter referenced within the population shoreline dose calculations. This manual has been produced to update the code description, verify the models, and provide an updated user’s manual. LADTAP XL© Version 2017 has been verified by Minter (2017) and is ready for use at the Savannah River Site (SRS).
Increasing Fleet Readiness Through Improved Distance Support
2013-03-01
[Extraction residue from a process-flow diagram. Recoverable labels: logistics data; process training data; analyze data; obtain and format fleet maintenance and fleet support infrastructure data; F.2.2 Obtain Data from Fleet: data are entered into the system either by using a form or a spreadsheet, and recorded system performance data are downloaded from the ship manually.]
A quality-based cost model for new electronic systems and products
NASA Astrophysics Data System (ADS)
Shina, Sammy G.; Saigal, Anil
1998-04-01
This article outlines a method for developing a quality-based cost model for the design of new electronic systems and products. The model incorporates a methodology for determining a cost-effective design margin allocation for electronic products and systems and its impact on manufacturing quality and cost. A spreadsheet-based cost estimating tool was developed to help implement this methodology so that system design engineers can quickly estimate the effect of design decisions and tradeoffs on the quality and cost of new products. The tool was developed with automatic spreadsheet connectivity to current process capability and with provisions to consider the impact of capital equipment and tooling purchases to reduce the product cost.
Internal trip capture estimator for mixed-use developments.
DOT National Transportation Integrated Search
2007-12-01
This report describes a spreadsheet tool for estimating trip generation for mixed-use developments, : accounting for internal trip capture. Internal trip capture is the portion of trips generated by a mixed-use : development that both begin and end w...
Tools for Requirements Management: A Comparison of Telelogic DOORS and the HiVe
2006-07-01
types DOORS deals with are text files, spreadsheets, FrameMaker, rich text, Microsoft Word and Microsoft Project. 2.5.1 Predefined file formats DOORS...during the export. DOORS exports FrameMaker files in an incomplete format, meaning DOORS exported files will have to be opened in FrameMaker and saved
LICSS - a chemical spreadsheet in Microsoft Excel
Lawson, Kevin R; Lawson, Jonty
2012-01-01
Background: Representations of chemical datasets in spreadsheet format are important for ready data assimilation and manipulation. In addition to the normal spreadsheet facilities, chemical spreadsheets need to have visualisable chemical structures and data searchable by chemical as well as textual queries. Many such chemical spreadsheet tools are available, some operating in the familiar Microsoft Excel environment. However, within this group, the performance of Excel is often compromised, particularly in terms of the number of compounds which can usefully be stored on a sheet.
Summary: LICSS is a lightweight chemical spreadsheet within Microsoft Excel for Windows. LICSS stores structures solely as Smiles strings. Chemical operations are carried out by calling Java code modules which use the CDK, JChemPaint and OPSIN libraries to provide cheminformatics functionality. Compounds in sheets or charts may be visualised (individually or en masse), and sheets may be searched by substructure or similarity. All the molecular descriptors available in CDK may be calculated for compounds (in batch or on-the-fly), and various cheminformatic operations such as fingerprint calculation, Sammon mapping, clustering and R group table creation may be carried out. We detail here the features of LICSS and how they are implemented. We also explain the design criteria, particularly in terms of potential corporate use, which led to this particular implementation.
Conclusions: LICSS is an Excel-based chemical spreadsheet with a difference:
• It can usefully be used on sheets containing hundreds of thousands of compounds; it doesn't compromise the normal performance of Microsoft Excel
• It is designed to be installed and run in environments in which users do not have admin privileges; installation involves merely file copying, and sharing of LICSS sheets invokes automatic installation
• It is free and extensible
LICSS is open source software and we hope sufficient detail is provided here to enable developers to add their own features and share with the community. PMID: 22301088
Rare Earth Geochemistry of Rock Core form WY Reservoirs
Quillinan, Scott; Bagdonnas, Davin; McLaughlin, J. Fred; Nye, Charles
2016-10-01
These data include major, minor, trace and rare earth element concentration of geologic formations in Wyoming oil and gas fields. *Note - Link below contains updated version of spreadsheet (6/14/2017)
Understanding Solubility through Excel Spreadsheets
NASA Astrophysics Data System (ADS)
Brown, Pamela
2001-02-01
This article describes assignments related to the solubility of inorganic salts that can be given in an introductory general chemistry course. Le Châtelier's principle, solubility, unit conversion, and thermodynamics are tied together to calculate heats of solution by two methods: heats of formation and an application of the van't Hoff equation. These assignments address the need for math, graphing, and computer skills in the chemical technology program by developing skill in the use of Microsoft Excel to prepare spreadsheets and graphs and to perform linear and nonlinear curve-fitting. Background information on the value of understanding and predicting solubility is provided.
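The van't Hoff application mentioned above rests on the standard relation (stated here for orientation):

\[
\ln K_{sp} = -\frac{\Delta H^{\circ}_{\text{soln}}}{R}\cdot\frac{1}{T}
           + \frac{\Delta S^{\circ}_{\text{soln}}}{R}
\]

so a linear fit of ln Ksp against 1/T in the spreadsheet yields the heat of solution from the slope and the entropy of solution from the intercept, which is exactly the kind of curve-fitting exercise the assignments describe.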
VeriClick: an efficient tool for table format verification
NASA Astrophysics Data System (ADS)
Nagy, George; Tamhankar, Mangesh
2012-01-01
The essential layout attributes of a visual table can be defined by the location of four critical grid cells. Although these critical cells can often be located by automated analysis, some means of human interaction is necessary for correcting residual errors. VeriClick is a macro-enabled spreadsheet interface that provides ground-truthing, confirmation, correction, and verification functions for CSV tables. All user actions are logged. Experimental results of seven subjects on one hundred tables suggest that VeriClick can provide a ten- to twenty-fold speedup over performing the same functions with standard spreadsheet editing commands.
Spreadsheet Toolkit for Ulysses Hi-Scale Measurements of Interplanetary Ions and Electrons
NASA Astrophysics Data System (ADS)
Reza, J. Z.; Lanzerotti, L. J.; Denker, C.; Patterson, D.; Armstrong, T. P.
2004-05-01
Throughout the entire Ulysses out-of-the-ecliptic solar polar mission, the Heliosphere Instrument for Spectra, Composition, and Anisotropy at Low Energies (HI-SCALE) has collected measurements of interplanetary ions and electrons. Time-series of electron and ion fluxes obtained since 1990 have been carefully calibrated and will be stored in a data management system, which will be publicly accessible via the WWW. The goal of the Virtual Solar Observatory (VSO) is to provide data uniformly and efficiently to a diverse user community. However, data dissemination can only be a first step, which has to be followed by a suite of data analysis tools that are tailored towards a diverse user community in science, technology, and education. The widespread use and familiarity of spreadsheets, which are available at low cost or open source for many operating systems, make them an interesting tool to investigate for the analysis of HI-SCALE data. The data are written in comma-separated values (CSV) format, which is commonly used in spreadsheet programs. CSV files can simply be linked as external data to spreadsheet templates, which in turn can be used to generate tables and figures of basic statistical properties and frequency distributions, temporal evolution of electron and ion spectra, comparisons of various energy channels, automatic detection of solar events, solar cycle variations, and space weather. Exploring spreadsheet-assisted data analysis in the context of information technology research, database information search and retrieval, and data visualization potentially impacts other VSO components, where diverse user communities are targeted. Finally, this presentation is the result of an undergraduate research project, which will allow us to evaluate the performance of user-based spreadsheet analysis "benchmarked" at the undergraduate skill level.
Rare Earth Element Geochemistry for Produced Waters, WY
Quillinan, Scott; Nye, Charles; McLing, Travis; Neupane, Hari
2016-06-30
These data represent major, minor, trace, isotopes, and rare earth element concentrations in geologic formations and water associated with oil and gas production. *Note - Link below contains updated version of spreadsheet (6/14/2017)
Molten Salt Power Tower Cost Model for the System Advisor Model (SAM)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Turchi, C. S.; Heath, G. A.
2013-02-01
This report describes a component-based cost model developed for molten-salt power tower solar power plants. The cost model was developed by the National Renewable Energy Laboratory (NREL), using data from several prior studies, including a contracted analysis from WorleyParsons Group, which is included herein as an Appendix. The WorleyParsons analysis also estimated material composition and mass for the plant to facilitate a life cycle analysis of the molten salt power tower technology. Details of the life cycle assessment have been published elsewhere. The cost model provides a reference plant that interfaces with NREL's System Advisor Model (SAM). The reference plant assumes a nominal 100-MWe (net) power tower running with a nitrate salt heat transfer fluid (HTF). Thermal energy storage is provided by direct storage of the HTF in a two-tank system. The design assumes dry-cooling. The model includes a spreadsheet that interfaces with SAM via the Excel Exchange option in SAM. The spreadsheet allows users to estimate the costs of different-size plants and to take into account changes in commodity prices. This report and the accompanying Excel spreadsheet can be downloaded at https://sam.nrel.gov/cost.
A simple node and conductor data generator for SINDA
NASA Technical Reports Server (NTRS)
Gottula, Ronald R.
1992-01-01
This paper presents a simple, automated method to generate NODE and CONDUCTOR DATA for thermal math models. The method uses personal computer spreadsheets to create SINDA inputs. It was developed in order to make SINDA modeling less time consuming and serves as an alternative to graphical methods. Anyone having some experience using a personal computer can easily implement this process. The user develops spreadsheets to automatically calculate capacitances and conductances based on material properties and dimensional data. The necessary node and conductor information is then taken from the spreadsheets and automatically arranged into the proper format, ready for insertion directly into the SINDA model. This technique provides a number of benefits to the SINDA user such as a reduction in the number of hand calculations, and an ability to very quickly generate a parametric set of NODE and CONDUCTOR DATA blocks. It also provides advantages over graphical thermal modeling systems by retaining the analyst's complete visibility into the thermal network, and by permitting user comments anywhere within the DATA blocks.
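A minimal sketch of the underlying arithmetic (capacitance C = ρ·cp·V per node, conductance G = k·A/L per conductor), recast here in Python for illustration; the card layout and material values are schematic assumptions, not the paper's spreadsheet:

```python
# Illustrative sketch of generating SINDA-style node/conductor lines.
# Layout and material data are placeholders, not from the paper.
ALUM = {"rho": 2700.0, "cp": 900.0, "k": 167.0}  # kg/m3, J/kg-K, W/m-K

def node_card(node_id, volume, mat):
    """Capacitance C = rho * cp * V for a diffusion node."""
    c = mat["rho"] * mat["cp"] * volume
    return f"{node_id}, 20.0, {c:.3f}"          # id, initial T, capacitance

def conductor_card(cond_id, n1, n2, area, length, mat):
    """Linear conductance G = k * A / L between two nodes."""
    g = mat["k"] * area / length
    return f"{cond_id}, {n1}, {n2}, {g:.4f}"     # id, nodeA, nodeB, G

print(node_card(101, volume=1e-5, mat=ALUM))
print(conductor_card(1001, 101, 102, area=1e-4, length=0.02, mat=ALUM))
```

The spreadsheet version of this idea simply holds the material properties and dimensions in columns and concatenates the formatted DATA lines in a final column for pasting into the model.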
Plantation thinning systems in the Southern United States
Bryce J. Stokes; William F. Watson
1996-01-01
This paper reviews southern pine management and thinning practices, describes three harvesting systems for thinning, and presents production and cost estimates, and utilization rates. The costs and product recoveries were developed from published sources using a spreadsheet analysis. Systems included tree-length, flail/chip, and cut-to-length. The estimated total...
Animated-simulation modeling facilitates clinical-process costing.
Zelman, W N; Glick, N D; Blackmore, C C
2001-09-01
Traditionally, the finance department has assumed responsibility for assessing process costs in healthcare organizations. To enhance process-improvement efforts, however, many healthcare providers need to include clinical staff in process cost analysis. Although clinical staff often use electronic spreadsheets to model the cost of specific processes, PC-based animated-simulation tools offer two major advantages over spreadsheets: they allow clinicians to interact more easily with the costing model so that it more closely represents the process being modeled, and they represent cost output as a cost range rather than as a single cost estimate, thereby providing more useful information for decision making.
IDA Cost Research Symposium Held 25 May 1995.
1995-08-01
Excel Spreadsheet Publications: MCR Report TR-9507/01. Category: II.B. Keywords: Government, Estimating, Missiles, Analysis, Production, Data. Originally developed by Martin Marietta as part of the SASET software estimating model; to be implemented as part of the SoftEST Software Estimating Tool.
Development of a Genetic Algorithm to Automate Clustering of a Dependency Structure Matrix
NASA Technical Reports Server (NTRS)
Rogers, James L.; Korte, John J.; Bilardo, Vincent J.
2006-01-01
Much technology assessment and organization design data exists in Microsoft Excel spreadsheets. Tools are needed to put this data into a form that can be used by design managers to make design decisions. One need is to cluster data that is highly coupled. Tools such as the Dependency Structure Matrix (DSM) and a Genetic Algorithm (GA) can be of great benefit. However, no tool currently combines the DSM and a GA to solve the clustering problem. This paper describes a new software tool that interfaces a GA written as an Excel macro with a DSM in spreadsheet format. The results of several test cases are included to demonstrate how well this new tool works.
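The paper's tool is a GA written as an Excel macro; the following Python sketch shows the same idea in miniature, with an invented 8-element DSM: a chromosome assigns each element to a cluster, and fitness rewards couplings kept inside clusters while penalizing those that cross cluster boundaries.

    import random

    # Symmetric example DSM: dsm[i][j] = 1 if elements i and j are coupled.
    N, K = 8, 3  # elements, maximum clusters
    dsm = [[0] * N for _ in range(N)]
    for i, j in [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (6, 7)]:
        dsm[i][j] = dsm[j][i] = 1

    def fitness(chrom):
        # Reward intra-cluster couplings; penalize couplings that cross clusters.
        score = 0
        for i in range(N):
            for j in range(i + 1, N):
                if dsm[i][j]:
                    score += 1 if chrom[i] == chrom[j] else -1
        return score

    def mutate(chrom, rate=0.2):
        return [random.randrange(K) if random.random() < rate else g for g in chrom]

    pop = [[random.randrange(K) for _ in range(N)] for _ in range(30)]
    for _ in range(200):
        pop.sort(key=fitness, reverse=True)
        elite = pop[:10]
        pop = elite + [mutate(random.choice(elite)) for _ in range(20)]

    best = max(pop, key=fitness)
    print("cluster assignment:", best, "fitness:", fitness(best))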
User's Guide for Evaluating Subsurface Vapor Intrusion into Buildings
This revised version of the User's Guide corresponds with the release of Version 3.1 of the Johnson and Ettinger (1991) model (J&E) spreadsheets for estimating subsurface vapor intrusion into buildings.
SEDIMENT DATA - ST. PAUL WATERWAY - TACOMA, WA - 1996 MONITORING DATA
Benthic Infauna Monitoring Data Files are Excel-format spreadsheet files which contain data presented in the St. Paul Waterway Area Remedial Action and Habitat Restoration Project, 1996 Monitoring Report. The files can be viewed directly or readily downloaded and read into most ...
LCoE Analysis of Surge-Mode WEC
Bill Staby
2017-03-07
Spreadsheet which provides estimates of reductions in Levelized Cost of Energy for a surge-mode wave energy converter (WEC). These reductions are achieved through adoption of the advanced control strategies developed during this research effort.
Documentation of a spreadsheet for time-series analysis and drawdown estimation
Halford, Keith J.
2006-01-01
Drawdowns during aquifer tests can be obscured by barometric pressure changes, earth tides, regional pumping, and recharge events in the water-level record. These stresses can create water-level fluctuations that should be removed from observed water levels prior to estimating drawdowns. Simple models have been developed for estimating unpumped water levels during aquifer tests that are referred to as synthetic water levels. These models sum multiple time series such as barometric pressure, tidal potential, and background water levels to simulate non-pumping water levels. The amplitude and phase of each time series are adjusted so that synthetic water levels match measured water levels during periods unaffected by an aquifer test. Differences between synthetic and measured water levels are minimized with a sum-of-squares objective function. Root-mean-square errors during fitting and prediction periods were compared multiple times at four geographically diverse sites. Prediction error equaled fitting error when fitting periods were greater than or equal to four times prediction periods. The proposed drawdown estimation approach has been implemented in a spreadsheet application. Measured time series are independent so that collection frequencies can differ and sampling times can be asynchronous. Time series can be viewed selectively and magnified easily. Fitting and prediction periods can be defined graphically or entered directly. Synthetic water levels for each observation well are created with earth tides, measured time series, moving averages of time series, and differences between measured and moving averages of time series. Selected series and fitting parameters for synthetic water levels are stored and drawdowns are estimated for prediction periods. Drawdowns can be viewed independently and adjusted visually if an anomaly skews initial drawdowns away from 0. The number of observations in a drawdown time series can be reduced by averaging across user-defined periods. Raw or reduced drawdown estimates can be copied from the spreadsheet application or written to tab-delimited ASCII files.
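A minimal numerical sketch of the fitting step described above (the series and amplitudes are invented; the actual spreadsheet also supports phase adjustment, moving averages, and asynchronous series):

    import numpy as np

    # Hypothetical hourly series during a fitting period with no pumping.
    t = np.arange(0, 240.0)                        # hours
    baro = 0.05 * np.sin(2 * np.pi * t / 24.0)     # barometric effect (m)
    tide = 0.02 * np.sin(2 * np.pi * t / 12.42 + 1.0)  # earth-tide proxy (m)
    measured = 10.0 + 0.8 * baro + 1.2 * tide + 0.002 * np.random.randn(t.size)

    # Fit an offset and amplitudes so the synthetic series matches measured
    # water levels over the unpumped period: least squares on [1, baro, tide].
    A = np.column_stack([np.ones_like(t), baro, tide])
    coef, *_ = np.linalg.lstsq(A, measured, rcond=None)
    synthetic = A @ coef

    # During a later pumping period, drawdown is estimated as synthetic minus
    # measured; here, with no pumping, the residual scatters around zero.
    drawdown = synthetic - measured
    print("fitted coefficients:", np.round(coef, 3))
    print("RMS residual (m):", float(np.sqrt(np.mean(drawdown ** 2))))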
Formation and Dimerization of NO2 A General Chemistry Experiment
NASA Astrophysics Data System (ADS)
Hennis, April D.; Highberger, C. Scott; Schreiner, Serge
1997-11-01
We have developed a general chemistry experiment which illustrates Gay-Lussac's law of combining volumes. Students are able to determine the partial pressures and equilibrium constant for the formation and dimerization of NO2. The experiment can be carried out in about 45 minutes with students working in groups of two. The experiment readily provides students with data that can be manipulated with a common spreadsheet.
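For reference, the dimerization the students quantify is 2 NO2 <=> N2O4, so the spreadsheet arithmetic reduces to Kp = P(N2O4)/P(NO2)^2; a worked example with invented partial pressures:

    # 2 NO2 <=> N2O4. Partial pressures in atm (illustrative values only).
    p_no2, p_n2o4 = 0.40, 0.12

    # Equilibrium constant for dimerization, Kp = P(N2O4) / P(NO2)^2.
    kp = p_n2o4 / p_no2 ** 2
    print(f"Kp = {kp:.2f} atm^-1")  # 0.75 atm^-1 for these inputs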
A new, low-cost sun photometer for student use
NASA Astrophysics Data System (ADS)
Espinoza, A.; Pérez-Álvarez, H.; Parra-Vilchis, J. I.; Fauchey-López, E.; Fernando-González, L.; Faus-Landeros, G. E.; Celarier, E. A.; Robinson, D. Q.; Zepeda-Galbez, R.
2011-12-01
We have designed a sun photometer for the measurement of aerosol optical thickness (AOT) at 505 nm and 620 nm, using custom-made glass filters (9.5 nm bandpass, FWHM) and photodiodes. The recommended price-point (US$150-US$200) allowed us to incorporate technologies such as microcontrollers, a sun target, a USB port for data uploading, nonvolatile memory to contain tables of up to 127 geolocation profiles, extensive calibration data, and a log of up to 2,000 measurements. The instrument is designed to be easy to use, and to provide instant display of AOT estimates. A diffuser in the fore-optics limits the sensitivity to pointing error. We have developed postprocessing software to refine the AOT estimates, format a spreadsheet file, and upload the data to the GLOBE website. We are currently finalizing hardware and firmware, and conducting extensive calibration/validation experiments. These instruments will soon be in production and available to the K-12 education community, including and especially the GLOBE program.
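The postprocessing rests on the Beer-Lambert/Langley relation; the sketch below is a simplified single-channel version with invented calibration and reading values, and the instrument's actual algorithm may differ in detail.

    import math

    def aerosol_optical_thickness(v, v0, m, tau_rayleigh, r_au=1.0):
        """Simplified single-channel AOT from a sun-photometer voltage.

        v: signal voltage; v0: extraterrestrial (calibration) voltage;
        m: relative air mass; tau_rayleigh: molecular scattering term;
        r_au: Sun-Earth distance in AU (corrects v0 by 1/r^2).
        """
        total = math.log(v0 / (r_au ** 2) / v) / m  # total optical thickness
        return total - tau_rayleigh                 # remove the Rayleigh part

    # Invented example readings at 505 nm.
    print(aerosol_optical_thickness(v=1.10, v0=2.35, m=1.8, tau_rayleigh=0.14))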
Wang, Li Yan; O'Brien, Mary Jane; Maughan, Erin D
2016-11-01
This paper describes a user-friendly, Excel spreadsheet model and two data collection instruments constructed by the authors to help states and districts perform cost-benefit analyses of school nursing services delivered by full-time school nurses. Prior to applying the model, states or districts need to collect data using two forms: "Daily Nurse Data Collection Form" and the "Teacher Survey." The former is used to record daily nursing activities, including number of student health encounters, number of medications administered, number of student early dismissals, and number of medical procedures performed. The latter is used to obtain estimates for the time teachers spend addressing student health issues. Once inputs are entered in the model, outputs are automatically calculated, including program costs, total benefits, net benefits, and benefit-cost ratio. The spreadsheet model, data collection tools, and instructions are available at the NASN website ( http://www.nasn.org/The/CostBenefitAnalysis ).
NASA Technical Reports Server (NTRS)
ONeil, D. A.; Mankins, J. C.; Christensen, C. B.; Gresham, E. C.
2005-01-01
The Advanced Technology Lifecycle Analysis System (ATLAS), a spreadsheet analysis tool suite, applies parametric equations for sizing and lifecycle cost estimation. Performance, operation, and programmatic data used by the equations come from a Technology Tool Box (TTB) database. In this second TTB Technical Interchange Meeting (TIM), technologists, system model developers, and architecture analysts discussed methods for modeling technology decisions in spreadsheet models, identified specific technology parameters, and defined detailed development requirements. This Conference Publication captures the consensus of the discussions and provides narrative explanations of the tool suite, the database, and applications of ATLAS within NASA's changing environment.
User Instructions for the Policy Analysis Modeling System (PAMS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
McNeil, Michael A.; Letschert, Virginie E.; Van Buskirk, Robert D.
PAMS uses country-specific and product-specific data to calculate estimates of impacts of a Minimum Efficiency Performance Standard (MEPS) program. The analysis tool is self-contained in a Microsoft Excel spreadsheet, and requires no links to external data, or special code additions to run. The analysis can be customized to a particular program without additional user input, through the use of the pull-down menus located on the Summary page. In addition, the spreadsheet contains many areas into which user-generated input data can be entered for increased accuracy of projection. The following is a step-by-step guide for using and customizing the tool.
A new R function, exsic, to assist taxonomists in creating indices
USDA-ARS?s Scientific Manuscript database
Taxonomists manage large amounts of specimen data. This is usually initiated in spreadsheets and then converted for publication into locality lists and indices that associate collectors and collector numbers from herbarium sheets with identifications, a format technically termed an exsiccate list. Th...
Code of Federal Regulations, 2010 CFR
2010-07-01
..., Demand Side Variability, and Network Variability studies, including input data, processing programs, and... should include the product or product groups carried under each listed contract; (k) Spreadsheets and...
A Spreadsheet for Estimating Soil Water Characteristic Curves (SWCC)
2017-05-01
Federal Highway Administration (FHWA), was designed to simulate the behavior of pavement and subgrade materials over several years of operation. The...Guide for mechanistic-empirical design of new and rehabilitated pavement structures. TRB-NCHRP
Microhole Tubing Bending Report
Oglesby, Ken
2012-01-01
A downhole tubing bending study was made and is reported herein. It contains a report and two Excel spreadsheets to calculate tubing bending and to estimate contact points of the tubing to the drilled hole wall (creating a new support point).
A Brief User's Guide to the Excel®-Based DF Calculator
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jubin, Robert T.
2016-06-01
To understand the importance of capturing penetrating forms of iodine as well as the other volatile radionuclides, a calculation tool was developed in the form of an Excel® spreadsheet to estimate the overall plant decontamination factor (DF). The tool requires the user to estimate splits of the volatile radionuclides within the major portions of the reprocessing plant, speciation of iodine, and individual DFs for each off-gas stream within the Used Nuclear Fuel reprocessing plant. The impact on the overall plant DF for each volatile radionuclide is then calculated by the tool based on the specific user choices. The Excel® spreadsheet tracks both elemental and penetrating forms of iodine separately and allows changes in the speciation of iodine at each processing step. It also tracks 3H, 14C, and 85Kr. This document provides a basic user's guide to the manipulation of this tool.
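The tool's exact bookkeeping is described in the guide itself, but the underlying mass balance is compact: if a volatile radionuclide splits among parallel off-gas streams with fractions f_i, each abated by a factor DF_i, the released fraction is the sum of f_i/DF_i and the overall plant DF is its reciprocal. A sketch with invented splits and stream names:

    # (fraction of the iodine routed to each stream, DF applied to that stream);
    # the stream labels and values are illustrative only.
    streams = [
        (0.85, 1000.0),  # dissolver off-gas
        (0.10, 100.0),   # vessel off-gas
        (0.05, 10.0),    # cell ventilation
    ]

    released = sum(f / df for f, df in streams)  # fraction escaping all controls
    overall_df = 1.0 / released
    print(f"released fraction = {released:.4g}, overall plant DF = {overall_df:.1f}")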
NASA Astrophysics Data System (ADS)
Whittington, A. G.; Speck, A.; Witzig, S.
2011-12-01
As part of an NSF-funded project, "CUES: Connecting Undergraduates to the Enterprise of Science," new inquiry-based homework materials were developed for two upper-level classes at the University of Missouri: Geochemistry (required for Geology majors, fulfills the computing requirement by having 50% of the grade come from five spreadsheet-based homework assignments), and Solar System Science (open to seniors and graduate students, co-taught and cross-listed between Geology and Physics & Astronomy). Inquiry involves activities where the learner engages in scientifically oriented questions, gives priority to evidence in responding to questions, formulates explanations from evidence, connects explanations to scientific knowledge, and communicates and justifies explanations. We engage students in inquiry-based learning by presenting homework exercises as "mini-journal" articles that follow the format of a scientific journal article, including a title, authors, abstract, introduction, methods, results, discussion and citations to peer-reviewed literature. The mini-journal provides a scaffold and serves as a springboard for students to develop and carry out their own follow-up investigation. They then present their findings in the form of their own mini-journal. Mini-journals replace traditional homework problem sets with a format that more directly reflects and encourages scientific practice. Students are engaged in inquiry-based homework that encompasses doing, thinking, and communicating, while the mini-journal allows the instructor to contain lines of inquiry within the limits posed by available resources. In the examples we present, research is conducted via spreadsheet modeling, where the students develop their own spreadsheets. Example assignments from Geochemistry include "Trace Element Partitioning During Mantle Melting and MORB Crystallization" and "Isotopic Investigations of Crustal Evolution in the Midcontinent US". The key differences between the old and new formats include (i) active participation of the students in defining the question/problem that they will pursue, within well-defined boundaries, (ii) open-ended nature of the inquiry, so that students need to recognize when they have enough information to answer their question, (iii) extensive spreadsheet manipulation and presentation of results in graphical and tabular formats, and (iv) a written discussion of their findings. Grading is weighted more towards how the problem was addressed, and how findings are presented and interpreted, and less on actual numerical answers. Survey responses from students indicate that they experience discomfort on being presented with an open-ended assignment, but like the freedom to define their own problem. Students also recognize that reading, writing and critical thinking skills employed in the mini-journal format increase their understanding of content. The combination of calculation and writing components make these assignments particularly useful for classes designated as "computer-based", and/or "writing intensive" (or similar designations).
An automated graphics tool for comparative genomics: the Coulson plot generator
2013-01-01
Background Comparative analysis is an essential component to biology. When applied to genomics for example, analysis may require comparisons between the predicted presence and absence of genes in a group of genomes under consideration. Frequently, genes can be grouped into small categories based on functional criteria, for example membership of a multimeric complex, participation in a metabolic or signaling pathway or shared sequence features and/or paralogy. These patterns of retention and loss are highly informative for the prediction of function, and hence possible biological context, and can provide great insights into the evolutionary history of cellular functions. However, representation of such information in a standard spreadsheet is a poor visual means from which to extract patterns within a dataset. Results We devised the Coulson Plot, a new graphical representation that exploits a matrix of pie charts to display comparative genomics data. Each pie is used to describe a complex or process from a separate taxon, and is divided into sectors corresponding to the number of proteins (subunits) in a complex/process. The predicted presence or absence of proteins in each complex is indicated by occupancy of a given sector; this format is visually highly accessible and makes pattern recognition rapid and reliable. A key to the identity of each subunit, plus hierarchical naming of taxa and coloring are included. A Java-based application, the Coulson Plot Generator (CPG), automates graphic production, taking a tab- or comma-delimited text file as input and generating an editable Portable Document Format (PDF) or SVG file. Conclusions CPG software may be used to rapidly convert spreadsheet data to a graphical matrix pie chart format. The representation essentially retains all of the information from the spreadsheet but presents a graphically rich format making comparisons and identification of patterns significantly clearer. While the Coulson plot format is highly useful in comparative genomics, its original purpose, the software can be used to visualize any dataset where entity occupancy is compared between different classes. Availability CPG software is available at sourceforge http://sourceforge.net/projects/coulson and http://dl.dropbox.com/u/6701906/Web/Sites/Labsite/CPG.html PMID:23621955
Crovelli, Robert A.; Coe, Jeffrey A.
2008-01-01
The Probabilistic Landslide Assessment Cost Estimation System (PLACES) presented in this report estimates the number and economic loss (cost) of landslides during a specified future time in individual areas, and then calculates the sum of those estimates. The analytic probabilistic methodology is based upon conditional probability theory and laws of expectation and variance. The probabilistic methodology is expressed in the form of a Microsoft Excel computer spreadsheet program. Using historical records, the PLACES spreadsheet is used to estimate the number of future damaging landslides and total damage, as economic loss, from future landslides caused by rainstorms in 10 counties of the San Francisco Bay region in California. Estimates are made for any future 5-year period of time. The estimated total number of future damaging landslides for the entire 10-county region during any future 5-year period of time is about 330. Santa Cruz County has the highest estimated number of damaging landslides (about 90), whereas Napa, San Francisco, and Solano Counties have the lowest estimated number of damaging landslides (5–6 each). Estimated direct costs from future damaging landslides for the entire 10-county region for any future 5-year period are about US $76 million (year 2000 dollars). San Mateo County has the highest estimated costs ($16.62 million), and Solano County has the lowest estimated costs (about $0.90 million). Estimated direct costs are also subdivided into public and private costs.
Spreadsheets Answer "What If...?"
ERIC Educational Resources Information Center
Pogge, Alfred F.; Lunetta, Vincent N.
1987-01-01
Demonstrates how a spreadsheet program can do calculations, freeing students to question, analyze data and learn science. Notes several popular spreadsheet programs. Gives an example using Lotus 1-2-3 spreadsheets for a sampling experiment in Biology. Shows other examples of spreadsheet use in laboratory activities. (CW)
Evaluating Technology Integration in the Elementary School: A Site-Based Approach.
ERIC Educational Resources Information Center
Mowe, Richard
This book enables educators at the elementary level to conduct formative evaluations of their technology programs in minimum time. Most of the technology is computer related, including word processing, graphics, desktop publishing, spreadsheets, databases, instructional software, programming, and telecommunications. The design of the book is aimed…
2011-01-01
The U.S. Geological Survey (USGS) recently completed a comprehensive assessment of in-place oil in oil shales of the Eocene Green River Formation in the Greater Green River Basin, Wyoming, Colorado, and Utah. This CD-ROM includes reports, data, and an ArcGIS project describing the assessment. A database was compiled that includes about 47,000 Fischer assays from 186 core holes and 240 rotary drill holes. Most of the oil yield data were analyzed by the former U.S. Bureau of Mines oil shale laboratory in Laramie, Wyoming, and some analyses were made by private laboratories. Location data for 971 Wyoming oil-shale drill holes are listed in a spreadsheet and included in the CD-ROM. Total in-place resources for the three assessed units in the Green River Formation are: (1) Tipton Shale Member, 362,816 million barrels of oil (MMBO); (2) Wilkins Peak Member, 704,991 MMBO; and (3) LaClede Bed of the Laney Member, 377,184 MMBO, for a total of 1.44 trillion barrels of oil in place. This compares with estimated in-place resources for the Piceance Basin of Colorado of 1.53 trillion barrels and estimated in-place resources for the Uinta Basin of Utah and Colorado of 1.32 trillion barrels.
Geologic Map and Cross Sections of the McGinness Hills Geothermal Area - GIS Data
Faulds, James E.
2013-12-31
Geologic map data in shapefile format that includes faults, unit contacts, unit polygons, attitudes of strata and faults, and surficial geothermal features. 5 cross‐sections in Adobe Illustrator format. Comprehensive catalogue of drill‐hole data in spreadsheet, shapefile, and Geosoft database formats. Includes XYZ locations of well heads, year drilled, type of well, operator, total depths, well path data (deviations), lithology logs, and temperature data. 3D model constructed with EarthVision using geologic map data, cross‐sections, drill‐hole data, and geophysics.
Using Spreadsheets in the Management, Analysis, and Reporting of Evaluation Data.
ERIC Educational Resources Information Center
Glowacki, Margaret L.; Rice, Richard L., Jr.
Currently available spreadsheet programs for microcomputers provide many features that can be very useful for evaluators and researchers. Some of the basic concepts involved in spreadsheet use are introduced, and information is provided on the use of spreadsheets in maintaining and analyzing evaluation data. The spreadsheet used in the discussion…
Elements of a next generation time-series ASCII data file format for Earth Sciences
NASA Astrophysics Data System (ADS)
Webster, C. J.
2015-12-01
Data in ASCII comma-separated values (CSV) format are recognized as the most simple, straightforward, and readable type of data present in the geosciences. Many scientific workflows developed over the years rely on data using this simple format. However, there is a need for a lightweight ASCII header format standard that is easy to create and easy to work with. Current OGC-grade XML standards are complex and difficult to implement for researchers with few resources. Ideally, such a format should provide the data in CSV for easy consumption by generic applications such as spreadsheets. The format should use an existing time standard. The header should be easily human readable as well as machine parsable. The metadata format should be extendable to allow vocabularies to be adopted as they are created by external standards bodies. The creation of such a format would increase the productivity of software engineers and scientists because fewer translators and checkers would be required.
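A minimal sketch of what such a file might look like (the header keys and values below are invented for illustration and are not a proposed standard):

    # title: Example magnetometer record
    # station: BOU
    # parameter: H component, nT
    # time_standard: ISO 8601, UTC
    # missing_value: 99999
    time,h_nT
    2015-06-01T00:00:00Z,20831.2
    2015-06-01T00:01:00Z,20831.5

A comment-prefixed header like this stays human readable, parses with one pass of any line-oriented tool, and leaves the body as plain CSV that a spreadsheet opens directly.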
ERIC Educational Resources Information Center
Fish, Laurel J.; Halcoussis, Dennis; Phillips, G. Michael
2017-01-01
The Monte Carlo method and related multiple imputation methods are traditionally used in math, physics and science to estimate and analyze data and are now becoming standard tools in analyzing business and financial problems. However, few sources explain the application of the Monte Carlo method for individuals and business professionals who are…
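To give a flavor of the method in a business setting (all figures below are invented): simulate profit under uncertain demand and unit cost, then read the mean and a 90% interval off the simulated distribution.

    import random

    random.seed(1)
    PRICE = 12.0
    trials = []
    for _ in range(100_000):
        demand = random.gauss(mu=1000.0, sigma=150.0)   # uncertain demand
        unit_cost = random.uniform(6.0, 9.0)            # uncertain cost
        trials.append(demand * (PRICE - unit_cost) - 1500.0)  # fixed cost 1500

    trials.sort()
    mean = sum(trials) / len(trials)
    lo, hi = trials[int(0.05 * len(trials))], trials[int(0.95 * len(trials))]
    print(f"mean profit ~ {mean:,.0f}; 90% of runs in [{lo:,.0f}, {hi:,.0f}]")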
Users guide for noble fir bough cruiser.
Roger D. Fight; Keith A. Blatner; Roger C. Chapman; William E. Schlosser
2005-01-01
The bough cruiser spreadsheet was developed to provide a method for cruising noble fir (Abies procera Rehd.) stands to estimate the weight of boughs that might be harvested. No boughs are cut as part of the cruise process. The approach is based on a two-stage sample. The first stage consists of fixed-radius plots that are used to estimate the...
Estimating snag and large tree densities and distributions on a landscape for wildlife management.
Lisa J. Bate; Edward O. Garton; Michael J. Wisdom
1999-01-01
We provide efficient and accurate methods for sampling snags and large trees on a landscape to conduct compliance and effectiveness monitoring for wildlife in relation to the habitat standards and guidelines on National Forests. Included online are the necessary spreadsheets, macros, and instructions to conduct all surveys and analyses pertaining to estimation of snag...
Web-based X-ray quality control documentation.
David, George; Burnett, Lou Ann; Schenkel, Robert
2003-01-01
The department of radiology at the Medical College of Georgia Hospital and Clinics has developed an equipment quality control web site. Our goal is to provide immediate access to virtually all medical physics survey data. The web site is designed to assist equipment engineers, department management and technologists. By improving communications and access to equipment documentation, we believe productivity is enhanced. The creation of the quality control web site was accomplished in three distinct steps. First, survey data had to be placed in a computer format. The second step was to convert these various computer files to a format supported by commercial web browsers. Third, a comprehensive home page had to be designed to provide convenient access to the multitude of surveys done in the various x-ray rooms. Because we had spent years previously fine-tuning the computerization of the medical physics quality control program, most survey documentation was already in spreadsheet or database format. A major technical decision was the method of conversion of survey spreadsheet and database files into documentation appropriate for the web. After an unsatisfactory experience with a HyperText Markup Language (HTML) converter (packaged with spreadsheet and database software), we tried creating Portable Document Format (PDF) files using Adobe Acrobat software. This process preserves the original formatting of the document and takes no longer than conventional printing; therefore, it has been very successful. Although the PDF file generated by Adobe Acrobat is a proprietary format, it can be displayed through a conventional web browser using the freely distributed Adobe Acrobat Reader program that is available for virtually all platforms. Once a user installs the software, it is automatically invoked by the web browser whenever the user follows a link to a file with a PDF extension. Although no confidential patient information is available on the web site, our legal department recommended that we secure the site in order to keep out those wishing to make mischief. Our interim solution has not been to password protect the page, which we feared would hinder access for occasional legitimate users, but also not to provide links to it from other hospital and department pages. Utility and productivity were improved and time and money were saved by making radiological equipment quality control documentation instantly available on-line.
Calibration of work zone impact analysis software for Missouri.
DOT National Transportation Integrated Search
2013-12-01
This project calibrated two software programs used for estimating the traffic impacts of work zones. The WZ Spreadsheet and VISSIM programs were recommended in a previous study by the authors. The two programs were calibrated using field data fro...
Bus Lifecycle Cost Model for Federal Land Management Agencies.
DOT National Transportation Integrated Search
2011-09-30
The Bus Lifecycle Cost Model is a spreadsheet-based planning tool that estimates capital, operating, and maintenance costs for various bus types over the full lifecycle of the vehicle. The model is based on a number of operating characteristics, incl...
Beyond the Mechanics of Spreadsheets: Using Design Instruction to Address Spreadsheet Errors
ERIC Educational Resources Information Center
Schneider, Kent N.; Becker, Lana L.; Berg, Gary G.
2017-01-01
Given that the usage and complexity of spreadsheets in the accounting profession are expected to increase, it is more important than ever to ensure that accounting graduates are aware of the dangers of spreadsheet errors and are equipped with design skills to minimize those errors. Although spreadsheet mechanics are prevalent in accounting…
NASA Astrophysics Data System (ADS)
Dawson, H. E.
2003-12-01
This paper presents a mass balance approach to assessing the cumulative impacts of discharge from Coal Bed Methane (CBM) wells on surface water quality and its suitability for irrigation in the Powder River Basin. Key water quality parameters for predicting potential effects of CBM development on irrigated agriculture are sodicity, expressed as sodium adsorption ratio (SAR) and salinity, expressed as electrical conductivity (EC). The assessment was performed with the aid of a spreadsheet model, which was designed to estimate steady-state SAR and EC at gauged stream locations after mixing with CBM produced water. Model input included ambient stream water quality and flow, CBM produced water quality and discharge rates, conveyance loss (quantity of water loss that may occur between the discharge point and the receiving streams), beneficial uses, regulatory thresholds, and discharge allocation at state-line boundaries. Historical USGS data were used to establish ambient stream water quality and flow conditions. The resultant water quality predicted for each stream station included the cumulative discharge of CBM produced water in all reaches upstream of the station. Model output was presented in both tabular and graphical formats, and indicated the suitability of pre- and post-mixing water quality for irrigation. Advantages and disadvantages of the spreadsheet model are discussed. This approach was used by federal agencies to support the development of the January 2003 Environmental Impact Statements (EIS) for the Wyoming and Montana portions of the Powder River Basin.
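The core of such a model is conservative flow-weighted mixing followed by the standard SAR definition (ion concentrations in meq/L); a simplified sketch with invented stream and discharge values follows.

    import math

    def mix(q1, c1, q2, c2):
        """Flow-weighted mixing of a conservative constituent."""
        return (q1 * c1 + q2 * c2) / (q1 + q2)

    def sar(na, ca, mg):
        """Sodium adsorption ratio; na, ca, mg in meq/L."""
        return na / math.sqrt((ca + mg) / 2.0)

    # Invented values: ambient stream vs. CBM produced water.
    q_stream, q_cbm = 2.0, 0.3               # m^3/s (CBM flow after conveyance loss)
    na = mix(q_stream, 4.0, q_cbm, 28.0)     # meq/L
    ca = mix(q_stream, 3.0, q_cbm, 0.8)
    mg = mix(q_stream, 2.0, q_cbm, 0.5)
    ec = mix(q_stream, 800.0, q_cbm, 2500.0)  # uS/cm, treated as conservative here

    print(f"mixed EC ~ {ec:.0f} uS/cm, mixed SAR ~ {sar(na, ca, mg):.1f}")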
Ferry Lifecycle Cost Model for Federal Land Management Agencies : User's Guide.
DOT National Transportation Integrated Search
2011-09-30
The Ferry Lifecycle Cost Model (model) is a spreadsheet-based sketch planning tool that estimates capital, operating, and total cost for various vessels that could be used to provide ferry service on a particular route given known service parameters....
COAL UTILITY ENVIRONMENTAL COST (CUECOST) WORKBOOK USER'S MANUAL
The document is a user's manual for the Coal Utility Environmental Cost (CUECost) workbook (an interrelated set of spreadsheets) and documents its development and the validity of methods used to estimate installed capital and annualized costs. The CUECost workbook produces rough-or...
Methodology for National Water Savings Model and Spreadsheet Tool—Outdoor Water Use
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, Alison, A; Chen, Yuting; Dunham, Camilla
This report describes the method Lawrence Berkeley National Laboratory (LBNL) developed to estimate national impacts of the U.S. Environmental Protection Agency’s (EPA’s) WaterSense labeling program for weather-based irrigation controllers (WBIC). Estimated impacts include the national water savings attributable to the program and the net present value of the lifetime water savings for consumers of irrigation controllers.
Low-Temperature Hydrothermal Resource Potential Estimate
Katherine Young
2016-06-30
Compilation of data (spreadsheet and shapefiles) for several low-temperature resource types, including isolated springs and wells, delineated area convection systems, sedimentary basins and coastal plains sedimentary systems. For each system, we include estimates of the accessible resource base, mean extractable resource and beneficial heat. Data compiled from USGS and other sources. The paper (submitted to GRC 2016) describing the methodology and analysis is also included.
Bruland, Philipp; Dugas, Martin
2017-01-07
Data capture for clinical registries or pilot studies is often performed in spreadsheet-based applications like Microsoft Excel or IBM SPSS. Usually, data are transferred into statistics software, such as SAS, R, or IBM SPSS Statistics, for analysis afterwards. Spreadsheet-based solutions suffer from several drawbacks: it is generally not possible to ensure sufficient rights and role management, and it is not traced who has changed data, when, and why. Therefore, such systems are not able to comply with regulatory requirements for electronic data capture in clinical trials. In contrast, Electronic Data Capture (EDC) software enables a reliable, secure, and auditable collection of data. In this regard, most EDC vendors support the CDISC ODM standard to define, communicate, and archive clinical trial meta- and patient data. Advantages of EDC systems are support for multi-user and multicenter clinical trials as well as auditable data. Migration from spreadsheet-based data collection to EDC systems is labor-intensive and time-consuming at present. Hence, the objectives of this research work are to develop a mapping model and implement a converter between the IBM SPSS and CDISC ODM standard and to evaluate this approach regarding syntactic and semantic correctness. A mapping model between IBM SPSS and CDISC ODM data structures was developed. SPSS variables and patient values can be mapped and converted into ODM. Statistical and display attributes from SPSS do not correspond to any ODM elements; study-related ODM elements are not available in SPSS. The S2O converting tool was implemented as a command-line tool using the SPSS internal Java plugin. Syntactic and semantic correctness was validated with different ODM tools and reverse transformation from ODM into SPSS format. Clinical data values were also successfully transformed into the ODM structure. Transformation between the spreadsheet format IBM SPSS and the ODM standard for definition and exchange of trial data is feasible. S2O facilitates migration from Excel- or SPSS-based data collections towards reliable EDC systems. Thereby, advantages of EDC systems like reliable software architecture for secure and traceable data collection and particularly compliance with regulatory requirements are achievable.
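The S2O source is not reproduced here, but a sketch of the kind of mapping it performs, turning a variable dictionary such as one might export from SPSS into CDISC ODM ItemDef elements, might look as follows (the variable list is invented; element and attribute names follow the public ODM schema):

    import xml.etree.ElementTree as ET

    # Invented SPSS-style variable dictionary: name -> (label, type).
    variables = {
        "AGE": ("Age at enrollment", "integer"),
        "SEX": ("Sex", "text"),
        "SBP": ("Systolic blood pressure", "float"),
    }

    odm = ET.Element("ODM")
    study = ET.SubElement(odm, "Study", OID="S.1")
    mdv = ET.SubElement(study, "MetaDataVersion", OID="MDV.1", Name="v1")
    for i, (name, (label, dtype)) in enumerate(variables.items(), start=1):
        ET.SubElement(mdv, "ItemDef", OID=f"I.{i}", Name=name,
                      DataType=dtype, Comment=label)

    print(ET.tostring(odm, encoding="unicode"))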
R.D. Fight; J.M. Cahill; T.A. Snellgrove; T.D. Fahey
1987-01-01
PRUNE-SIM is a spreadsheet template (program) that allows users to simulate a financial analysis of pruning coast Douglas-fir (Pseudotsuga menziesii (Mirb.) Franco var. menziesii). The program estimates the increase in product value resulting from pruning the butt 17-foot log. Product recovery information is based on actual...
Problem Solving with Spreadsheets.
ERIC Educational Resources Information Center
Catterall, P.; Lewis, R.
1985-01-01
Documents the educational use of spreadsheets through a description of exploratory work which utilizes spreadsheets to achieve the objectives of Conway's Game of Life, a scientific method game for the development of problem-solving techniques. The implementation and classroom use of the spreadsheet programs are discussed. (MBR)
Pressure Ratio to Thermal Environments
NASA Technical Reports Server (NTRS)
Lopez, Pedro; Wang, Winston
2012-01-01
A pressure ratio to thermal environments (PRatTlE.pl) program is a Perl language code that estimates heating at requested body point locations by scaling the heating at a reference location times a pressure ratio factor. The pressure ratio factor is the ratio of the local pressure at the reference point and the requested point from CFD (computational fluid dynamics) solutions. This innovation provides pressure ratio-based thermal environments in an automated and traceable manner. Previously, the pressure ratio methodology was implemented via a Microsoft Excel spreadsheet and macro scripts. PRatTlE is able to calculate heating environments for 150 body points in less than two minutes. PRatTlE is coded in the Perl programming language, is command-line-driven, and has been successfully executed on both the HP and Linux platforms. It supports multiple concurrent runs. PRatTlE contains error trapping and input file format verification, which allows clear visibility into the input data structure and intermediate calculations.
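The scaling itself is a one-liner; a sketch with invented numbers (the real tool reads reference heating and CFD pressures from input files):

    def scaled_heating(q_ref, p_ref, p_local):
        """Heating at a body point = reference heating x local/reference pressure ratio."""
        return q_ref * (p_local / p_ref)

    # Invented numbers: reference heating in W/cm^2, pressures in kPa from CFD.
    print(scaled_heating(q_ref=25.0, p_ref=12.0, p_local=9.0))  # -> 18.75 W/cm^2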
Determination of Needed Spreadsheet Competencies for Business Personnel in the Mid-South States.
ERIC Educational Resources Information Center
Rogers, Betty S.; Arn, Joseph V.
1993-01-01
A survey of 209 Mid-South businesses determined spreadsheet usage, what competencies are needed for entry-level and continued employment, and sources of spreadsheet training. Recommended that, because of their widespread use, spreadsheets should be taught to all business students. (Author/JOW)
DOT National Transportation Integrated Search
1998-09-16
This paper demonstrates application of the principles of economic analysis to evaluate highway capacity expansion in an urban setting, using a sketch-planning model called Spreadsheet Model for Induced Travel Estimation (SMITE). The application takes...
R.D. Fight; J.M. Cahill; T.D. Fahey
1992-01-01
The DFPRUNE spreadsheet program is designed to estimate the expected financial return from pruning coast Douglas-fir (Pseudotsuga menziesii (Mirb.) Franco var. menziesii). It is a significant revision of the PRUNE-SIM program. The PRUNE-SIM program was based on the average product recovery for unpruned logs from a single stand...
ERIC Educational Resources Information Center
Sims, Paul A.
2012-01-01
A brief history of the development of the empirical equation that is used by prominent, Internet-based programs to estimate (or calculate) the extinction coefficients of proteins is presented. In addition, an overview of a series of related assignments designed to help students understand the origin of the empirical equation is provided. The…
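The equation in question, in the widely used form adopted by web tools (with the commonly cited residue coefficients; see the article for the historical provenance), is epsilon(280 nm) ~ 5500 nTrp + 1490 nTyr + 125 ncystine, in M^-1 cm^-1. A sketch:

    def epsilon_280(seq, cys_as_cystine=True):
        """Estimate a protein's molar extinction coefficient at 280 nm (M^-1 cm^-1)."""
        seq = seq.upper()
        n_trp, n_tyr, n_cys = seq.count("W"), seq.count("Y"), seq.count("C")
        n_cystine = n_cys // 2 if cys_as_cystine else 0  # disulfide-paired cysteines
        return 5500 * n_trp + 1490 * n_tyr + 125 * n_cystine

    print(epsilon_280("MKWVTFISLLLLFSSAYSRGVFRRDTHKSEIAHRFKDLGE"))  # invented sequence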
Solving Optimization Problems with Spreadsheets
ERIC Educational Resources Information Center
Beigie, Darin
2017-01-01
Spreadsheets provide a rich setting for first-year algebra students to solve problems. Individual spreadsheet cells play the role of variables, and creating algebraic expressions for a spreadsheet to perform a task allows students to achieve a glimpse of how mathematics is used to program a computer and solve problems. Classic optimization…
PYROLASER - PYROLASER OPTICAL PYROMETER OPERATING SYSTEM
NASA Technical Reports Server (NTRS)
Roberts, F. E.
1994-01-01
The PYROLASER package is an operating system for the Pyrometer Instrument Company's Pyrolaser. There are 6 individual programs in the PYROLASER package: two main programs, two lower level subprograms, and two programs which, although independent, function predominantly as macros. The package provides a quick and easy way to set up, control, and program a standard Pyrolaser. Temperature and emissivity measurements may be either collected as if the Pyrolaser were in the manual operations mode, or displayed on real-time strip charts and stored in standard spreadsheet format for post-test analysis. A shell is supplied to allow macros, which are test-specific, to be easily added to the system. The Pyrolaser Simple Operation program provides full on-screen remote operation capabilities, thus allowing the user to operate the Pyrolaser from the computer just as it would be operated manually. The Pyrolaser Simple Operation program also allows the use of "quick starts". Quick starts provide an easy way to permit routines to be used as setup macros for specific applications or tests. The specific procedures required for a test may be ordered in a sequence structure and then the sequence structure can be started with a simple button in the cluster structure provided. One quick start macro is provided for continuous Pyrolaser operation. A subprogram, Display Continuous Pyr Data, is used to display and store the resulting data output. Using this macro, the system is set up for continuous operation and the subprogram is called to display the data in real time on strip charts. The data is simultaneously stored in a spreadsheet format. The resulting spreadsheet file can be opened in any one of a number of commercially available spreadsheet programs. The Read Continuous Pyrometer program is provided as a continuously run subprogram for incorporation of the Pyrolaser software into a process control or feedback control scheme in a multi-component system. The program requires the Pyrolaser to be set up using the Pyrometer String Transfer macro. It requires no inputs and provides temperature and emissivity as outputs. The Read Continuous Pyrometer program can be run continuously and the data can be sampled as often or as seldom as updates of temperature and emissivity are required. PYROLASER is written using the LabVIEW software for use on Macintosh series computers running System 6.0.3 or later, Sun Sparc series computers running OpenWindows 3.0 or MIT's X Window System (X11R4 or X11R5), and IBM PC or compatibles running Microsoft Windows 3.1 or later. LabVIEW requires a minimum of 5 MB of RAM on a Macintosh, 24 MB of RAM on a Sun, and 8 MB of RAM on an IBM PC or compatible. The LabVIEW software is a product of National Instruments (Austin, TX; 800-433-3488), and is not included with this program. The standard distribution medium for PYROLASER is a 3.5-inch 800K Macintosh-format diskette. It is also available on a 3.5-inch 720K MS-DOS format diskette, a 3.5-inch diskette in UNIX tar format, and a 0.25-inch streaming magnetic tape cartridge in UNIX tar format. An electronic copy of the documentation in Macintosh WordPerfect version 2.0.4 format is included on the distribution medium. Printed documentation is included in the price of the program. PYROLASER was developed in 1992.
Spreadsheet Assessment Tool v. 2.4
DOE Office of Scientific and Technical Information (OSTI.GOV)
Allen, David J.; Martinez, Ruben
2016-03-03
The Spreadsheet Assessment Tool (SAT) is an easy-to-use blast assessment tool that is intended to estimate the potential risk due to an explosive attack on a blood irradiator. The estimation of risk is based on the methodology, assumptions, and results of a detailed blast effects assessment study that is summarized in Sandia National Laboratories Technical Report SAND2015-6166. Risk as defined in the report and as used in the SAT is: "The potential risk of creating an air blast-induced vent opening at a building's envelope surface". Vent openings can be created at a building's envelope through the failure of an exterior building component—like a wall, window, or door—due to an explosive sabotage of an irradiator within the building. To estimate risk, the tool requires that users obtain and input information pertaining to the building's characteristics and the irradiator location. The tool also suggests several prescriptive mitigation strategies that can be considered to reduce risk. Given the variability in civilian building construction practices, the input parameters used by this tool may not apply to all buildings being assessed. The tool should not be used as a substitute for engineering judgment. The tool is intended for assessment purposes only.
Simulation Software's Effect on College Students Spreadsheet Project Scores
ERIC Educational Resources Information Center
Atkinson, J. Kirk; Thrasher, Evelyn H.; Coleman, Phillip D.
2011-01-01
The purpose of this study is to explore the potential impact of support materials on student spreadsheet skill acquisition. Specifically, this study examines the use of an online spreadsheet simulation tool versus a printed book across two independent student groups. This study hypothesizes that the online spreadsheet simulation tool will have a…
Longevity and Depreciation of Audiovisual Equipment.
ERIC Educational Resources Information Center
Post, Richard
1987-01-01
Describes results of survey of media service directors at public universities in Ohio to determine the expected longevity of audiovisual equipment. Use of the Delphi technique for estimates is explained, results are compared with an earlier survey done in 1977, and use of spreadsheet software to calculate depreciation is discussed. (LRW)
OP-Yield Version 1.00 user's guide
Martin W. Ritchie; Jianwei Zhang
2018-01-01
OP-Yield is a Microsoft Excel™ spreadsheet with 14 specified user inputs to derive custom yield estimates using the original Oliver and Powers (1978) functions as the foundation. It presents yields for ponderosa pine (Pinus ponderosa Lawson & C. Lawson) plantations in northern California. The basic model forms for dominant and...
Enabling Process Improvement and Control in Higher Education Management
ERIC Educational Resources Information Center
Bell, Gary; Warwick, Jon; Kennedy, Mike
2009-01-01
The emergence of "managerialism" in the governance and direction of UK higher education (HE) institutions has been led by government demands for greater accountability in the quality and cost of universities. There is emerging anecdotal evidence indicating that the estimation performance of HE spreadsheets and regression models is poor.…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oldenburg, Curtis M.
2005-09-19
This report describes a screening and ranking framework (SRF) developed to evaluate potential geologic carbon dioxide (CO2) storage sites on the basis of health, safety, and environmental (HSE) risk arising from possible CO2 leakage. The approach is based on the assumption that HSE risk due to CO2 leakage is dependent on three basic characteristics of a geologic CO2 storage site: (1) the potential for primary containment by the target formation; (2) the potential for secondary containment if the primary formation leaks; and (3) the potential for attenuation and dispersion of leaking CO2 if the primary formation leaks and secondary containment fails. The framework is implemented in a spreadsheet in which users enter numerical scores representing expert opinions or general information available from published materials along with estimates of uncertainty to evaluate the three basic characteristics in order to screen and rank candidate sites. Application of the framework to the Rio Vista Gas Field, Ventura Oil Field, and Mammoth Mountain demonstrates the approach. Refinements and extensions are possible through the use of more detailed data or model results in place of property proxies. Revisions and extensions to improve the approach are anticipated in the near future as it is used and tested by colleagues and collaborators.
Finding P-Values for F Tests of Hypothesis on a Spreadsheet.
ERIC Educational Resources Information Center
Rochowicz, John A., Jr.
The calculation of the F statistic for a one-factor analysis of variance (ANOVA) and the construction of an ANOVA table are easily implemented on a spreadsheet. This paper describes how to compute the p-value (observed significance level) for a particular F statistic on a spreadsheet. Decision making on a spreadsheet and applications to the…
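The quantity computed on the spreadsheet (Excel's FDIST, later F.DIST.RT) is the upper-tail probability of the F distribution; the same number can be checked in Python via SciPy's survival function (the ANOVA results below are illustrative):

    from scipy.stats import f

    F_stat, df_between, df_within = 4.26, 2, 27    # illustrative ANOVA results
    p_value = f.sf(F_stat, df_between, df_within)  # upper-tail area, as FDIST returns
    print(f"p = {p_value:.4f}")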
Cognitive Skills, Domain Knowledge, and Self-Efficacy: Effects on Spreadsheet Quality
ERIC Educational Resources Information Center
Adkins, Joni K.
2011-01-01
Numerous studies have shown that spreadsheets used in companies often have errors which may affect the quality of the decisions made with these tools. Many businesses are unaware or choose to ignore the risks associated with spreadsheet use. The intent of this study was to learn more about the characteristics of spreadsheet end user developers,…
Spreadsheets and Bulgarian goats
NASA Astrophysics Data System (ADS)
Sugden, Steve
2012-10-01
We consider a problem appearing in an Australian Mathematics Challenge in 2003. This article considers whether a spreadsheet might be used to model this problem, thus allowing students to explore its structure within the spreadsheet environment. It then goes on to reflect on some general principles of problem decomposition when the final goal is a successful and lucid spreadsheet implementation.
ChargeOut! : discounted cash flow compared with traditional machine-rate analysis
Ted Bilek
2008-01-01
ChargeOut!, a discounted cash-flow methodology in spreadsheet format for analyzing machine costs, is compared with traditional machine-rate methodologies. Four machine-rate models are compared and a common data set representative of logging skidders' costs is used to illustrate the differences between ChargeOut! and the machine-rate methods. The study found that the...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-03
... attach multiple pieces of information, our preferred format is a spreadsheet in Microsoft Excel. By hard... addresses the petition. Evaluation of Petition The 11 species named in the petition include six species... species is endemic to the southern Western Ghats, in southern India. The species is reported from six...
McCafferty, A.E.; Horton, R.J.; Stanton, M.R.; McDougal, R.R.; Fey, D.L.
2011-01-01
Provides measurements to study the geochemical, mineralogical, and geophysical characteristics of rocks having weak to extreme degrees of alteration and to develop an understanding of how these characteristics change with alteration type. Data are provided in two digital formats: an Arc/Info geodatabase and a Microsoft Excel spreadsheet.
Dodd, Jonathan D; MacEneaney, Peter M; Malone, Dermot E
2004-05-01
The aim of this study was to show how evidence-based medicine (EBM) techniques can be applied to the appraisal of diagnostic radiology publications. A clinical scenario is described: a gastroenterologist has questioned the diagnostic performance of magnetic resonance cholangiopancreatography (MRCP) in a patient who may have common bile duct (CBD) stones. His opinion was based on an article on MRCP published in "Gut." The principles of EBM are described and then applied to the critical appraisal of this paper. Another paper on the same subject was obtained from the radiology literature and was also critically appraised using explicit EBM criteria. The principles for assessing the validity and strength of both studies are outlined. All statistical parameters were generated quickly using a spreadsheet in Excel format. The results of EBM assessment of both papers are presented. The calculation and application of confidence intervals (CIs) and likelihood ratios (LRs) for both studies are described. These statistical results are applied to individual patient scenarios using graphs of conditional probability (GCP). Basic EBM principles are described and additional points relevant to radiologists discussed. Online resources for EBR practice are identified. The principles of EBM and their application to radiology are discussed. It is emphasized that sensitivity and specificity are point estimates of the "true" characteristics of a test in clinical practice. A spreadsheet can be used to quickly calculate CIs, LRs and GCPs. These give the radiologist a better understanding of the meaning of diagnostic test results in any patient or population of patients.
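A sketch of the spreadsheet's core calculations (the 2x2 counts below are invented, and a normal-approximation interval stands in for whatever exact method the authors used):

    import math

    def diagnostics(tp, fp, fn, tn, z=1.96):
        sens = tp / (tp + fn)
        spec = tn / (tn + fp)
        lr_pos = sens / (1 - spec)
        lr_neg = (1 - sens) / spec

        def ci(p, n):  # normal-approximation 95% confidence interval
            half = z * math.sqrt(p * (1 - p) / n)
            return max(0.0, p - half), min(1.0, p + half)

        return sens, ci(sens, tp + fn), spec, ci(spec, tn + fp), lr_pos, lr_neg

    # Invented 2x2 table: MRCP result vs. reference standard for CBD stones.
    sens, sens_ci, spec, spec_ci, lrp, lrn = diagnostics(tp=38, fp=6, fn=4, tn=112)
    print(f"sensitivity {sens:.2f} (95% CI {sens_ci[0]:.2f}-{sens_ci[1]:.2f})")
    print(f"specificity {spec:.2f} (95% CI {spec_ci[0]:.2f}-{spec_ci[1]:.2f})")
    print(f"LR+ {lrp:.1f}, LR- {lrn:.2f}")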
DataSpread: Unifying Databases and Spreadsheets.
Bendre, Mangesh; Sun, Bofan; Zhang, Ding; Zhou, Xinyan; Chang, Kevin ChenChuan; Parameswaran, Aditya
2015-08-01
Spreadsheet software is often the tool of choice for ad-hoc tabular data management, processing, and visualization, especially on tiny data sets. On the other hand, relational database systems offer significant power, expressivity, and efficiency over spreadsheet software for data management, while lacking in the ease of use and ad-hoc analysis capabilities. We demonstrate DataSpread, a data exploration tool that holistically unifies databases and spreadsheets. It continues to offer a Microsoft Excel-based spreadsheet front-end, while in parallel managing all the data in a back-end database, specifically, PostgreSQL. DataSpread retains all the advantages of spreadsheets, including ease of use, ad-hoc analysis and visualization capabilities, and a schema-free nature, while also adding the advantages of traditional relational databases, such as scalability and the ability to use arbitrary SQL to import, filter, or join external or internal tables and have the results appear in the spreadsheet. DataSpread needs to reason about and reconcile differences in the notions of schema, addressing of cells and tuples, and the current "pane" (which exists in spreadsheets but not in traditional databases), and support data modifications at both the front-end and the back-end. Our demonstration will center on our first and early prototype of the DataSpread, and will give the attendees a sense for the enormous data exploration capabilities offered by unifying spreadsheets and databases.
The Evolution of Spreadsheets.
ERIC Educational Resources Information Center
Schuyler, Michael
1985-01-01
Discusses basic features and functions of spreadsheet programs and describes additional capabilities (editing, windowing, graphics, and word processing) of two second-generation spreadsheet programs: Lotus 1-2-3 and Symphony. (MBR)
Teaching physics using Microsoft Excel
NASA Astrophysics Data System (ADS)
Uddin, Zaheer; Ahsanuddin, Muhammad; Khan, Danish Ahmed
2017-09-01
Excel is both ubiquitous and easily understood; people from every walk of life know how to use MS Office and Excel spreadsheets, and most students already use spreadsheets for data analysis. Beyond this basic use, this article highlights some further capabilities of spreadsheets. MS Excel can be used to visualize the effects of various parameters in a physical system, and it can serve as a simulation tool; in this study, wind data were simulated with spreadsheets. Examples of Lissajous figures and a damped harmonic oscillator are presented.
Dose estimates for the solid waste performance assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rittman, P.D.
1994-08-30
The Solid Waste Performance Assessment calculations by PNL in 1990 were redone to incorporate changes in methods and parameters since then. The ten scenarios found in their report were reduced to three: the Post-Drilling Resident, the Post-Excavation Resident, and an All Pathways Irrigator. Estimates of population dose to people along the Columbia River are also included. The attached report describes the methods and parameters used in the calculations and derives dose factors for each scenario. In addition, waste concentrations, ground water concentrations, and river water concentrations needed to reach the performance objectives of 100 mrem/yr and 500 person-rem/yr are computed. Internal dose factors from DOE-0071 were applied when computing internal dose. External dose rate factors came from the GENII Version 1.485 software package. Dose calculations were carried out on a spreadsheet. The calculations are described in detail in the report for 63 nuclides, including 5 not presently in the GENII libraries. The spreadsheet calculations were checked by comparison with GENII, as described in Appendix D.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oldenburg, Curtis M.
2006-03-15
This report describes a screening and ranking framework (SRF) developed to evaluate potential geologic carbon dioxide (CO2) storage sites on the basis of health, safety, and environmental (HSE) risk arising from possible CO2 leakage. The approach is based on the assumption that HSE risk due to CO2 leakage is dependent on three basic characteristics of a geologic CO2 storage site: (1) the potential for primary containment by the target formation, (2) the potential for secondary containment if the primary formation leaks, and (3) the potential for attenuation and dispersion of leaking CO2 if the primary formation leaks and secondary containment fails. The framework is implemented in a spreadsheet in which users enter numerical scores representing expert opinions or general information available from published materials, along with estimates of uncertainty, to evaluate the three basic characteristics in order to screen and rank candidate sites. Application of the framework to the Rio Vista Gas Field, Ventura Oil Field, and Mammoth Mountain demonstrates the approach. Refinements and extensions are possible through the use of more detailed data or model results in place of property proxies. Revisions and extensions to improve the approach are anticipated in the near future as it is used and tested by colleagues and collaborators.
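As a loose illustration of how such a spreadsheet might aggregate entries, the Python sketch below combines attribute scores with certainty weights and ranks candidate sites. It is not the published SRF; the scoring scale, weighting rule, and all numbers are assumptions.

```python
# Illustrative sketch (not the published SRF): certainty-weighted scores
# for the three site characteristics described above.

def site_score(assessments):
    """assessments: (score 1-5, certainty 0-1) tuples for the primary-
    containment, secondary-containment, and attenuation proxies."""
    num = sum(score * certainty for score, certainty in assessments)
    den = sum(certainty for _, certainty in assessments)
    return num / den  # an uncertain score counts for less

sites = {  # invented example sites and scores
    "Site A": [(4, 0.9), (3, 0.6), (5, 0.8)],
    "Site B": [(2, 0.7), (4, 0.9), (3, 0.5)],
}
for name, a in sorted(sites.items(), key=lambda kv: -site_score(kv[1])):
    print(f"{name}: {site_score(a):.2f}")
```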
Spreadsheet-Like Image Analysis
1992-08-01
1 " DTIC AD-A254 395 S LECTE D, ° AD-E402 350 Technical Report ARPAD-TR-92002 SPREADSHEET-LIKE IMAGE ANALYSIS Paul Willson August 1992 U.S. ARMY...August 1992 4. TITLE AND SUBTITLE 5. FUNDING NUMBERS SPREADSHEET-LIKE IMAGE ANALYSIS 6. AUTHOR(S) Paul Willson 7. PERFORMING ORGANIZATION NAME(S) AND...14. SUBJECT TERMS 15. NUMBER OF PAGES Image analysis , nondestructive inspection, spreadsheet, Macintosh software, 14 neural network, signal processing
Well 9-1 Logs and Data: Roosevelt Hot Spring Area, Utah (FORGE)
Joe Moore
2016-03-03
This is a compilation of logs and data from Well 9-1 in the Roosevelt Hot Springs area in Utah. This well is also in the Utah FORGE study area. The file is in a compressed .zip format and there is a data inventory table (Excel spreadsheet) in the root folder that is a guide to the data that is accessible in subfolders.
Closed Loop Analysis Meta-Language Program (CLAMP)
2012-05-01
Discusses whether formats of spreadsheets, XML, MCPML, or something else best support the (anthropometry or other) experts' productivity in: 1) crafting data, and 2) applying ...
Valentine, Page C.; Gallea, Leslie B.; Blackwood, Dann S.; Twomey, Erin R.
2010-01-01
The U.S. Geological Survey, in collaboration with National Oceanic and Atmospheric Administration's National Marine Sanctuary Program, conducted seabed mapping and related research in the Stellwagen Bank National Marine Sanctuary region from 1993 to 2004. The mapped area is approximately 3,700 km² (1,100 nmi²) and was subdivided into 18 quadrangles. An extensive series of sea-floor maps of the region based on multibeam sonar surveys has been published as paper maps and online in digital format (PDF, EPS, PS). In addition, 2,628 seabed-sediment samples were collected and analyzed and are in the usSEABED: Atlantic Coast Offshore Surficial Sediment Data Release. This report presents for viewing and downloading the more than 10,600 still seabed photographs that were acquired during the project. The digital images are provided in thumbnail, medium (1536 x 1024 pixels), and high (3071 x 2048 pixels) resolution. The images can be viewed by quadrangle on the U.S. Geological Survey Woods Hole Coastal and Marine Science Center's photograph database. Photograph metadata are embedded in each image in Exchangeable Image File Format and also provided in spreadsheet format. Published digital topographic maps and descriptive text for seabed features are included here for downloading and serve as context for the photographs. An interactive topographic map for each quadrangle shows locations of photograph stations, and each location is linked to the photograph database. This map also shows stations where seabed sediment was collected for texture analysis; the results of grain-size analysis and associated metadata are presented in spreadsheet format.
Data acquisition and real-time control using spreadsheets: interfacing Excel with external hardware.
Aliane, Nourdine
2010-07-01
Spreadsheets have become a popular computational tool and a powerful platform for performing engineering calculations. Moreover, spreadsheets include a macro language, which permits the inclusion of standard computer code in worksheets, thereby enabling developers to greatly extend spreadsheets' capabilities by designing specific add-ins. This paper describes how to use Excel spreadsheets in conjunction with the Visual Basic for Applications programming language to perform data acquisition and real-time control. The paper then presents two Excel applications with interactive user interfaces developed for laboratory demonstrations and experiments in an introductory course in control.
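The acquisition loop the paper implements in VBA can be sketched in Python for illustration; here `read_sensor()` is a hypothetical stand-in for real hardware I/O, and samples are appended to an Excel workbook with openpyxl.

```python
# A minimal Python analogue of a VBA data-acquisition loop; read_sensor()
# is a hypothetical, simulated stand-in for real hardware I/O.
import time, math
from openpyxl import Workbook

def read_sensor(t):
    return 5.0 * math.sin(0.5 * t)   # simulated voltage signal

wb = Workbook()
ws = wb.active
ws.append(["time_s", "voltage_V"])   # header row
start = time.time()
for _ in range(10):                  # ten samples at ~0.1 s intervals
    t = time.time() - start
    ws.append([round(t, 3), read_sensor(t)])
    time.sleep(0.1)
wb.save("daq_log.xlsx")
```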
Automatic estimation of elasticity parameters in breast tissue
NASA Astrophysics Data System (ADS)
Skerl, Katrin; Cochran, Sandy; Evans, Andrew
2014-03-01
Shear wave elastography (SWE), a novel ultrasound imaging technique, can provide unique information about cancerous tissue. To estimate elasticity parameters, a region of interest (ROI) is manually positioned over the stiffest part of the shear wave image (SWI). The aim of this work is to estimate the elasticity parameters (mean elasticity, maximal elasticity, and standard deviation) fully automatically. Ultrasonic SWI of a breast elastography phantom and of breast tissue in vivo were acquired using the Aixplorer system (SuperSonic Imagine, Aix-en-Provence, France). First, the SWI within the ultrasonic B-mode image was detected using MATLAB, and the elasticity values were extracted. The ROI was then automatically positioned over the stiffest part of the SWI and the elasticity parameters were calculated. Finally, all values were saved in a spreadsheet that also contains the patient's study ID and is readily available to physicians and clinical staff for further evaluation. The algorithm simplifies handling, particularly for the conduct and evaluation of clinical trials, and gives physicians easy access to the elasticity parameters of examinations from their own and other institutions. This reduces clinical time and effort, simplifies the evaluation of data in clinical trials, and improves reproducibility.
Bennett, R; Christiansen, K; Clifton-Hadley, R
1999-04-09
Many 'economic' studies of livestock diseases in Great Britain have been carried out over time. Most studies have considered just one or two diseases and used a different methodology and valuation base from other studies, hampering any comparative assessment of the economic impact of diseases. A standardized methodology was applied to the estimation of the direct costs to livestock production of some 30 endemic diseases/conditions of farm animals in Great Britain. This involved identification of the livestock populations at risk, estimation of the annual incidence of each disease in these populations, identification of the range and incidence of physical effects of each disease on production, valuation of the physical effects of each disease and estimation of the financial value of output losses/resource wastage due to a disease and the costs of specific treatment and prevention measures. The wider economic impacts of disease (such as the implications for human health, animal welfare and markets) were not included in the assessments. Using this standardized methodology with common financial values, a simple spreadsheet model was constructed for each disease. Given the paucity of appropriate disease data for economic assessment, 'low' and 'high' values were used to reflect uncertainties surrounding key disease parameters. Preliminary estimates of the value of disease output losses/resource wastage, treatment and prevention costs are presented for each disease. Despite the limitations of the spreadsheet models and of the estimates derived from them, we conclude that the models represent a useful start in developing a system for the comparative economic assessment of livestock diseases in Great Britain.
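The direct-cost structure described (population at risk, annual incidence, valued physical effects, plus treatment and prevention costs, with low/high bounds) is simple enough to sketch in code; the disease parameters below are invented for illustration and do not come from the study.

```python
# Hedged sketch of a low/high direct-cost model of the kind described;
# the single example disease and all figures are invented.

def direct_cost(pop_at_risk, incidence, loss_per_case, tx_prev_cost):
    cases = pop_at_risk * incidence
    return cases * loss_per_case + tx_prev_cost

disease = {  # (low, high) estimates for an illustrative disease
    "incidence": (0.02, 0.05),
    "loss_per_case": (40.0, 90.0),    # value of output loss per case
    "tx_prev_cost": (1.0e6, 2.5e6),   # treatment + prevention costs
}
pop = 10_000_000  # illustrative population at risk
low = direct_cost(pop, disease["incidence"][0],
                  disease["loss_per_case"][0], disease["tx_prev_cost"][0])
high = direct_cost(pop, disease["incidence"][1],
                   disease["loss_per_case"][1], disease["tx_prev_cost"][1])
print(f"direct cost range: {low:,.0f} - {high:,.0f}")
```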
Spreadsheet-based engine data analysis tool - user's guide.
DOT National Transportation Integrated Search
2016-07-01
This record refers to both the spreadsheet tool - Fleet Equipment Performance Measurement Preventive Maintenance Model: Spreadsheet-Based Engine Data Analysis Tool, http://ntl.bts.gov/lib/60000/60000/60007/0-6626-P1_Final.xlsm - and its accompanying ...
Modeling Steady-State Groundwater Flow Using Microcomputer Spreadsheets.
ERIC Educational Resources Information Center
Ousey, John Russell, Jr.
1986-01-01
Describes how microcomputer spreadsheets are easily adapted for use in groundwater modeling. Presents spreadsheet set-ups and the results of five groundwater models. Suggests that this approach can provide a basis for demonstrations, laboratory exercises, and student projects. (ML)
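The core of such spreadsheet groundwater models is a relaxation scheme: each interior cell averages its neighbours until heads converge, just as circular references behave under iterative calculation. A minimal Python sketch, assuming a square grid with fixed-head side boundaries (illustrative values, not Ousey's setup):

```python
# Gauss-Seidel relaxation of the discrete Laplace equation for steady-state
# head: each interior cell is the average of its four neighbours.
nrows, ncols = 10, 10
h = [[0.0] * ncols for _ in range(nrows)]
for i in range(nrows):
    h[i][0], h[i][-1] = 100.0, 50.0   # fixed-head left/right boundaries
# (top and bottom edges stay at 0.0, also treated as fixed heads here)

for _ in range(500):                   # relaxation sweeps
    delta = 0.0
    for i in range(1, nrows - 1):
        for j in range(1, ncols - 1):
            new = 0.25 * (h[i-1][j] + h[i+1][j] + h[i][j-1] + h[i][j+1])
            delta = max(delta, abs(new - h[i][j]))
            h[i][j] = new
    if delta < 1e-6:                   # heads have stopped changing
        break
print(f"head at grid centre: {h[5][5]:.2f}")
```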
Numerical Simulation of Multicomponent Chromatography Using Spreadsheets.
ERIC Educational Resources Information Center
Frey, Douglas D.
1990-01-01
Illustrated is the use of spreadsheet programs for implementing finite difference numerical simulations of chromatography as an instructional tool in a separations course. Discussed are differential equations, discretization and integration, spreadsheet development, computer requirements, and typical simulation results. (CW)
How Spreadsheets Boost Productivity.
ERIC Educational Resources Information Center
Ross, James
1988-01-01
Explains the use of computerized bookkeeping systems called spreadsheets to perform mathematical and accounting functions such as totaling expenditures, averaging test grades, and transferring funds. Advises about adapting spreadsheet programs and discusses several essential features, including linkage, macro functions, and sharing capabilities.…
Solution to Projectile Motion with Quadratic Drag and Graphing the Trajectory in Spreadsheets
ERIC Educational Resources Information Center
Benacka, Jan
2010-01-01
This note gives the analytical solution to projectile motion with quadratic drag by decomposing the velocity vector to "x," "y" coordinate directions. The solution is given by definite integrals. First, the impact angle is estimated from above, then the projectile coordinates are computed, and the trajectory is graphed at various launch angles and…
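The note's solution is analytical; as a cross-check, the same physics (drag proportional to v squared and opposing the velocity) can be stepped numerically. A hedged sketch with simple Euler integration and illustrative constants:

```python
# Numerical projectile trajectory with quadratic drag (not the note's
# analytical method); g, k, v0, and the launch angle are illustrative.
import math

g, k = 9.81, 0.05          # gravity (m/s^2); drag coefficient per unit mass
v0, angle = 30.0, math.radians(45)
x = y = 0.0
vx, vy = v0 * math.cos(angle), v0 * math.sin(angle)
dt = 1e-3
while y >= 0.0:
    v = math.hypot(vx, vy)
    ax, ay = -k * v * vx, -g - k * v * vy   # drag opposes the velocity
    x += vx * dt; y += vy * dt
    vx += ax * dt; vy += ay * dt
print(f"range ~ {x:.1f} m")
```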
Andrew C. Oishi; David Hawthorne; Ram Oren
2016-01-01
Estimating transpiration from woody plants using thermal dissipation sap flux sensors requires careful data processing. Currently, researchers accomplish this using spreadsheets, or by personally writing scripts for statistical software programs (e.g., R, SAS). We developed the Baseliner software to help establish a standardized protocol for processing sap...
This report estimates environmental emission factors (EmF) for key chemicals, construction and treatment materials, transportation/on-site equipment, and other processes used at remediation sites. The basis for chemical, construction, and treatment material EmFs is life cycle inv...
A Method for Measuring Collection Expansion Rates and Shelf Space Capacities.
ERIC Educational Resources Information Center
Sapp, Gregg; Suttle, George
1994-01-01
Describes an effort to quantify annual collection expansion and shelf space capacities with a computer spreadsheet program. Methods used to quantify the space taken at the beginning of the project; to estimate annual rate of collection growth; and to plot stack space and usage, volume equivalents and usage, and growth capacity are covered.…
Users guide for FRCS: fuel reduction cost simulator software.
Roger D. Fight; Bruce R. Hartsough; Peter Noordijk
2006-01-01
The Fuel Reduction Cost Simulator (FRCS) spreadsheet application is public domain software used to estimate costs for fuel reduction treatments involving removal of trees of mixed sizes in the form of whole trees, logs, or chips from a forest. Equipment production rates were developed from existing studies. Equipment operating cost rates are from December 2002 prices...
Modeling the Milky Way: Spreadsheet Science.
ERIC Educational Resources Information Center
Whitmer, John C.
1990-01-01
Described is the generation of a scale model of the solar system and the Milky Way galaxy using a computer spreadsheet program. A sample spreadsheet including cell formulas is provided. Suggestions for using this activity as a teaching technique are included. (CW)
Using Spreadsheets to Produce Acid-Base Titration Curves.
ERIC Educational Resources Information Center
Cawley, Martin James; Parkinson, John
1995-01-01
Describes two spreadsheets for producing acid-base titration curves, one uses relatively simple cell formulae that can be written into the spreadsheet by inexperienced students and the second uses more complex formulae that are best written by the teacher. (JRH)
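For the strong acid-strong base case, the simpler cell formulae reduce to an exact charge balance, [H+] - [OH-] = (Ca*Va - Cb*Vb)/(Va + Vb), solvable as a quadratic in [H+]. A short sketch with illustrative concentrations:

```python
# pH of a strong acid titrated with a strong base from the exact charge
# balance; concentrations and volumes are illustrative.
import math

Kw, Ca, Va, Cb = 1e-14, 0.1, 25.0, 0.1   # mol/L, mL, mol/L
for Vb in range(0, 51, 5):               # mL of titrant added
    d = (Ca * Va - Cb * Vb) / (Va + Vb)  # net strong-acid concentration
    h = (d + math.sqrt(d * d + 4 * Kw)) / 2   # positive root for [H+]
    print(f"{Vb:2d} mL  pH = {-math.log10(h):5.2f}")
```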
In this spreadsheet, user(s) provide their company’s manufacturer code, user contact information for EV-CIS, and user roles. This spreadsheet is used for the Company Authorizing Official (CAO), CROMERR Signer, and EV-CIS Submitters.
Hydrogen Financial Analysis Scenario Tool (H2FAST); NREL (National Renewable Energy Laboratory)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Melaina, Marc
This presentation describes the Hydrogen Financial Analysis Scenario Tool, H2FAST, and provides an overview of each of the three H2FAST formats: the H2FAST web tool, the H2FAST Excel spreadsheet, and the H2FAST Business Case Scenario (BCS) tool. Examples are presented to illustrate the types of questions that H2FAST can help answer.
Well 14-2 Logs and Data: Roosevelt Hot Spring Area, Utah (Utah FORGE)
Joe Moore
2016-03-03
This is a compilation of logs and data from Well 14-2 in the Roosevelt Hot Springs area in Utah. This well is also in the Utah FORGE study area. The file is in a compressed .zip format and there is a data inventory table (Excel spreadsheet) in the root folder that is a guide to the data that is accessible in subfolders.
Well 52-21 Logs and Data: Roosevelt Hot Spring Area, Utah (Utah FORGE)
Joe Moore
2016-03-03
This is a compilation of logs and data from Well 52-21 in the Roosevelt Hot Springs area in Utah. This well is also in the Utah FORGE study area. The file is in a compressed .zip format and there is a data inventory table (Excel spreadsheet) in the root folder that is a guide to the data that is accessible in subfolders.
Well 82-33 Logs and Data: Roosevelt Hot Spring Area, Utah (Utah FORGE)
Joe Moore
2016-03-03
This is a compilation of logs and data from Well 82-33 in the Roosevelt Hot Springs area in Utah. This well is also in the Utah FORGE study area. The file is in a compressed .zip format and there is a data inventory table (Excel spreadsheet) in the root folder that is a guide to the data that is accessible in subfolders.
Well Acord 1-26 Logs and Data: Roosevelt Hot Spring Area, Utah (Utah FORGE)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Joe Moore
This is a compilation of logs and data from Well Acord 1-26 in the Roosevelt Hot Springs area in Utah. This well is also in the Utah FORGE study area. The file is in a compressed .zip format and there is a data inventory table (Excel spreadsheet) in the root folder that is a guide to the data that is accessible in subfolders.
Preparation of School District Budgets with Microcomputer Electronic Spreadsheets.
ERIC Educational Resources Information Center
Hinitz, Herman J.
1996-01-01
Preparing a microcomputer electronic spreadsheet containing all relevant school district budgetary information is possible with currently available hardware and software (such as Lotus 1-2-3), despite random-access-memory limitations. Spreadsheets can provide financial summaries, inventory-control listings, scheduling alternatives,…
A Spreadsheet in the Mathematics Classroom.
ERIC Educational Resources Information Center
Watkins, Will; Taylor, Monty
1989-01-01
Demonstrates how spreadsheets can be used to implement linear system solving algorithms in college mathematics classes. Lotus 1-2-3 is described, a linear system of equations is illustrated using spreadsheets, and the interplay between applications, computations, and theory is discussed. (four references) (LRW)
The Iodine-Clock Reaction--A Spreadsheet Simulation To Test.
ERIC Educational Resources Information Center
Swain, P. A.
1997-01-01
Describes a spreadsheet activity for the iodine-clock reaction which follows the concentrations of all reactants and products for 200 seconds and gives the induction period. Explains that, although there are limitations to the spreadsheet, it is nevertheless illuminating. (Author/ASK)
XLWrap - Querying and Integrating Arbitrary Spreadsheets with SPARQL
NASA Astrophysics Data System (ADS)
Langegger, Andreas; Wöß, Wolfram
In this paper a novel approach is presented for generating RDF graphs of arbitrary complexity from various spreadsheet layouts. Currently, none of the available spreadsheet-to-RDF wrappers supports cross tables and tables where data is not aligned in rows. Similar to RDF123, XLWrap is based on template graphs where fragments of triples can be mapped to specific cells of a spreadsheet. Additionally, it features a full expression algebra based on the syntax of OpenOffice Calc and various shift operations, which can be used to repeat similar mappings in order to wrap cross tables including multiple sheets and spreadsheet files. The set of available expression functions includes most of the native functions of OpenOffice Calc and can be easily extended by users of XLWrap.
Tools for Basic Statistical Analysis
NASA Technical Reports Server (NTRS)
Luz, Paul L.
2005-01-01
Statistical Analysis Toolset is a collection of eight Microsoft Excel spreadsheet programs, each of which performs calculations pertaining to an aspect of statistical analysis. These programs present input and output data in user-friendly, menu-driven formats, with automatic execution. The following types of calculations are performed: Descriptive statistics are computed for a set of data x(i) (i = 1, 2, 3 . . . ) entered by the user. Normal Distribution Estimates will calculate the statistical value that corresponds to cumulative probability values, given a sample mean and standard deviation of the normal distribution. Normal Distribution from Two Data Points will extend and generate a cumulative normal distribution for the user, given two data points and their associated probability values. Two programs perform two-way analysis of variance (ANOVA) with no replication or generalized ANOVA for two factors with four levels and three repetitions. Linear Regression-ANOVA will fit data to the linear equation y = f(x) and will do an ANOVA to check its significance.
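As a hedged illustration of the Linear Regression-ANOVA item (not NASA's spreadsheet code), the sketch below fits y = a + bx by least squares and tests significance with F = MSR/MSE on invented data:

```python
# Least-squares line fit plus an ANOVA F-test of its significance.
import numpy as np
from scipy import stats

x = np.array([1.0, 2, 3, 4, 5, 6])       # invented data
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1, 11.9])

b, a = np.polyfit(x, y, 1)               # slope, intercept
yhat = a + b * x
ssr = np.sum((yhat - y.mean()) ** 2)     # regression sum of squares (1 df)
sse = np.sum((y - yhat) ** 2)            # residual sum of squares (n-2 df)
F = ssr / (sse / (len(x) - 2))
p = stats.f.sf(F, 1, len(x) - 2)         # upper-tail F probability
print(f"slope={b:.3f}  F={F:.1f}  p={p:.4f}")
```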
Integrating Critical Spreadsheet Competencies into the Accounting Curriculum
ERIC Educational Resources Information Center
Walters, L. Melissa; Pergola, Teresa M.
2012-01-01
The American Institute of Certified Public Accountants (AICPA) and the International Accounting Education Standards Board (IAESB) identify spreadsheet technology as a key information technology (IT) competency for accounting professionals. However, requisite spreadsheet competencies are not specifically defined by the AICPA or IAESB, nor are they…
Exploring Difference Equations with Spreadsheets.
ERIC Educational Resources Information Center
Walsh, Thomas P.
1996-01-01
When using spreadsheets to explore real-world problems involving periodic change, students can observe what happens at each period, generate a graph, and learn how changing the starting quantity or constants affects results. Spreadsheet lessons for high school students are presented that explore mathematical modeling, linear programming, and…
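The spreadsheet pattern described, one row per period with each value computed from the previous one, is easy to mirror in code. A small sketch using logistic growth as the difference equation (constants illustrative):

```python
# Iterating a difference equation period by period, as in a spreadsheet
# column: logistic growth x(n+1) = x(n) + r*x(n)*(1 - x(n)/K).
r, K, x = 0.4, 1000.0, 10.0   # illustrative growth rate, capacity, start
for n in range(10):
    print(n, round(x, 1))
    x = x + r * x * (1 - x / K)   # change starting value or r to explore
```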
Visual Basic programs for spreadsheet analysis.
Hunt, Bruce
2005-01-01
A collection of Visual Basic programs, entitled Function.xls, has been written for ground water spreadsheet calculations. This collection includes programs for calculating mathematical functions and for evaluating analytical solutions in ground water hydraulics and contaminant transport. Several spreadsheet examples are given to illustrate their use.
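One plausible example of the kind of analytical solution such a collection evaluates (chosen here for illustration, not necessarily in Function.xls) is the Theis solution, whose well function W(u) equals the exponential integral E1(u):

```python
# Theis drawdown s = Q/(4*pi*T) * W(u), with W(u) = E1(u); parameter
# values are illustrative only.
import math
from scipy.special import exp1

Q, T, S = 500.0, 250.0, 1e-4         # m^3/d, m^2/d, storativity
r, t = 50.0, 1.0                     # m from well, days of pumping
u = r * r * S / (4 * T * t)
s = Q / (4 * math.pi * T) * exp1(u)  # drawdown in metres
print(f"u={u:.2e}  drawdown s={s:.3f} m")
```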
Decision Analysis Using Spreadsheets.
ERIC Educational Resources Information Center
Sounderpandian, Jayavel
1989-01-01
Discussion of decision analysis and its importance in a business curriculum focuses on the use of spreadsheets instead of commercial software packages for computer assisted instruction. A hypothetical example is given of a company drilling for oil, and suggestions are provided for classroom exercises using spreadsheets. (seven references) (LRW)
ERIC Educational Resources Information Center
Batt, Russell H., Ed.
1988-01-01
Notes two uses of computer spreadsheets in the chemistry classroom. Discusses the general use of the spreadsheet to easily change the parameters of equations and then replot the results on the screen. Presents a molecular orbital spreadsheet calculation using the LCAO-MO approach. Supplies representative printouts and graphs. (MVL)
XAFS Data Interchange: A single spectrum XAFS data file format
NASA Astrophysics Data System (ADS)
Ravel, B.; Newville, M.
2016-05-01
We propose a standard data format for the interchange of XAFS data. The XAFS Data Interchange (XDI) standard is meant to encapsulate a single spectrum of XAFS along with relevant metadata. XDI is a text-based format with a simple syntax which clearly delineates metadata from the data table in a way that is easily interpreted both by a computer and by a human. The metadata header is inspired by the format of an electronic mail header, representing metadata names and values as an associative array. The data table is represented as columns of numbers. This format can be imported as is into most existing XAFS data analysis, spreadsheet, or data visualization programs. Along with a specification and a dictionary of metadata types, we provide an application-programming interface written in C and bindings for programming dynamic languages.
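Because the layout is deliberately simple, an email-style header of name/value pairs above whitespace-separated numeric columns, a minimal reader is short. The sketch below is illustrative only and ignores the full XDI grammar (versioning, required fields, comment blocks):

```python
# Hedged reader for a header-plus-columns file of the kind described;
# not a conforming XDI parser.
def read_xdi_like(path):
    meta, rows = {}, []
    with open(path) as f:
        for line in f:
            line = line.strip()
            if line.startswith("#"):
                body = line.lstrip("#").strip()
                if ":" in body:                  # "Name: value" pairs
                    key, _, val = body.partition(":")
                    meta[key.strip()] = val.strip()
            elif line:                           # data table row
                rows.append([float(v) for v in line.split()])
    return meta, rows

# usage: meta, rows = read_xdi_like("spectrum.xdi")
```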
How to Create Automatically Graded Spreadsheets for Statistics Courses
ERIC Educational Resources Information Center
LoSchiavo, Frank M.
2016-01-01
Instructors often use spreadsheet software (e.g., Microsoft Excel) in their statistics courses so that students can gain experience conducting computerized analyses. Unfortunately, students tend to make several predictable errors when programming spreadsheets. Without immediate feedback, programming errors are likely to go undetected, and as a…
Teaching with Spreadsheets: An Example from Heat Transfer.
ERIC Educational Resources Information Center
Drago, Peter
1993-01-01
Provides an activity which measures the heat transfer through an insulated cylindrical tank, allowing the student to gain a better knowledge of both the physics involved and the working of spreadsheets. Provides both a spreadsheet solution and a maximum-minimum method of solution for the problem. (MVL)
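The physics behind such an activity is steady radial conduction through the cylindrical insulation, Q = 2*pi*k*L*(Ti - To)/ln(r2/r1). A sketch with illustrative property and geometry values (not the article's numbers):

```python
# Steady radial heat loss through cylindrical insulation.
import math

k, L = 0.04, 2.0            # W/(m K) insulation conductivity; tank length, m
r1, r2 = 0.50, 0.55         # inner/outer insulation radii, m
Ti, To = 80.0, 20.0         # inner and outer surface temperatures, deg C
Q = 2 * math.pi * k * L * (Ti - To) / math.log(r2 / r1)
print(f"heat loss Q = {Q:.1f} W")
```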
Spreadsheet-Based Program for Simulating Atomic Emission Spectra
ERIC Educational Resources Information Center
Flannigan, David J.
2014-01-01
A simple Excel spreadsheet-based program for simulating atomic emission spectra from the properties of neutral atoms (e.g., energies and statistical weights of the electronic states, electronic partition functions, transition probabilities, etc.) is described. The contents of the spreadsheet (i.e., input parameters, formulas for calculating…
Teaching Raster GIS Operations with Spreadsheets.
ERIC Educational Resources Information Center
Raubal, Martin; Gaupmann, Bernhard; Kuhn, Werner
1997-01-01
Defines raster technology in its relationship to geographic information systems and notes that it is typically used with the application of remote sensing techniques and scanning devices. Discusses the role of spreadsheets in a raster model, and describes a general approach based on spreadsheets. Includes six computer-generated illustrations. (MJP)
Spreadsheet Design: An Optimal Checklist for Accountants
ERIC Educational Resources Information Center
Barnes, Jeffrey N.; Tufte, David; Christensen, David
2009-01-01
Just as good grammar, punctuation, style, and content organization are important to well-written documents, basic fundamentals of spreadsheet design are essential to clear communication. In fact, the very principles of good writing should be integrated into spreadsheet workpaper design and organization. The unique contributions of this paper are…
Computer Applications: Using Electronic Spreadsheets.
ERIC Educational Resources Information Center
Riley, Connee; And Others
This instructional unit is intended to assist teachers in helping students learn to use electronic spreadsheets. The 11 learning activities included, all of which are designed for use in conjunction with Multiplan Spreadsheet Software, are arranged in order of increasing difficulty. An effort has been made to include problems applicable to each of…
Manipulative and Numerical Spreadsheet Templates for the Study of Discrete Structures.
ERIC Educational Resources Information Center
Abramovich, Sergei
1998-01-01
Argues that basic components of discrete mathematics can be introduced to students through gradual elaboration of experiences with iconic spreadsheet-based simulations of concrete materials. Suggests that the study of homogeneous and heterogeneous patterns of manipulative spreadsheet templates allows for appreciation of the development of…
Excel Spreadsheets for Algebra: Improving Mental Modeling for Problem Solving
ERIC Educational Resources Information Center
Engerman, Jason; Rusek, Matthew; Clariana, Roy
2014-01-01
This experiment investigates the effectiveness of Excel spreadsheets in a high school algebra class. Students in the experiment group convincingly outperformed the control group on a post-lesson assessment. Student responses and teacher observations involving the Excel spreadsheets revealed that they operated as a mindtool, which formed the users'…
ERIC Educational Resources Information Center
Barreto, Humberto
2015-01-01
This article is not the usual Excel pedagogy fare in that it does not provide an application or example taught via a spreadsheet. Instead, it briefly reviews the history of spreadsheets in the economics classroom and explores the current environment, with an emphasis on modern learning theory. The conclusion is not surprising: spreadsheets improve…
The Growing Problems with Spreadsheet Budgeting
ERIC Educational Resources Information Center
Solomon, Jeff; Johnson, Stella; Wilcox, Leon; Olson, Tom
2010-01-01
The ubiquitous spreadsheet in some version has been the sole and unrivaled instrument of financial management for decades. And it has served well. The spreadsheet provides the flexibility to design a unique business process. It allows users to create formulas that execute complex calculations, and it is available in the globally standardized Excel…
Levels of Student Responses in a Spreadsheet-Based Environment
ERIC Educational Resources Information Center
Tabach, Michal; Friedlander, Alex
2004-01-01
The purpose of this report is to investigate the range of student responses in three domains--hypothesizing, organizing data, and algebraic generalization of patterns during their work on a spreadsheet-based activity. In a wider context, we attempted to investigate students' utilization schemes of spreadsheets in their learning of introductory…
Spreadsheets and Bulgarian Goats
ERIC Educational Resources Information Center
Sugden, Steve
2012-01-01
We consider a problem appearing in an Australian Mathematics Challenge in 2003. This article considers whether a spreadsheet might be used to model this problem, thus allowing students to explore its structure within the spreadsheet environment. It then goes on to reflect on some general principles of problem decomposition when the final goal is a…
CEASAW: A User-Friendly Computer Environment Analysis for the Sawmill Owner
Guillermo Mendoza; William Sprouse; Philip A. Araman; William G. Luppold
1991-01-01
Improved spreadsheet software capabilities have brought optimization to users with little or no background in mathematical programming. Better interface capabilities of spreadsheet models now make it possible to combine optimization models with a spreadsheet system. Sawmill production and inventory systems possess many features that make them suitable application...
Lens ray diagrams with a spreadsheet
NASA Astrophysics Data System (ADS)
González, Manuel I.
2018-05-01
Physicists create spreadsheets customarily to carry out numerical calculations and to display their results in a meaningful, nice-looking way. Spreadsheets can also be used to display a vivid geometrical model of a physical system. This statement is illustrated with an example taken from geometrical optics: images formed by a thin lens. A careful mixture of standard Excel functions allows to display a realistic automated ray diagram. The suggested spreadsheet is intended as an auxiliary didactic tool for instructors who wish to teach their students to create their own ray diagrams.
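Behind the ray diagram sits ordinary thin-lens arithmetic: 1/do + 1/di = 1/f and m = -di/do. A minimal sketch (illustrative values, not the article's worksheet):

```python
# Thin-lens image distance and magnification for a few object distances.
def thin_lens(f, do):
    di = 1.0 / (1.0 / f - 1.0 / do)   # image distance from 1/do + 1/di = 1/f
    return di, -di / do                # magnification m = -di/do

for do in (30.0, 20.0, 15.0):          # object distances, cm (f = 10 cm)
    di, m = thin_lens(10.0, do)
    print(f"do={do:4.0f}  di={di:6.1f}  m={m:+.2f}")
```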
Low-Temperature Hydrothermal Resource Potential
Katherine Young
2016-06-30
Compilation of data (spreadsheet and shapefiles) for several low-temperature resource types, including isolated springs and wells, delineated area convection systems, sedimentary basins and coastal plains sedimentary systems. For each system, we include estimates of the accessible resource base, mean extractable resource and beneficial heat. Data compiled from USGS and other sources. The paper (submitted to GRC 2016) describing the methodology and analysis is also included.
Christopher P. Hansen; Mark A. Rumble; Joshua J. Millspaugh
2010-01-01
Monitoring ruffed grouse (Bonasa umbellus) in the Black Hills National Forest is a priority for forest managers due to the bird's status as the management indicator species for quaking aspen (Populus tremuloides) and its value to hunters and other recreational groups. We conducted drumming surveys, estimated occupancy, and assessed the influence of sampling and...
[Cost of mother-child care in Morelos State].
Cahuana-Hurtado, Lucero; Sosa-Rubí, Sandra; Bertozzi, Stefano
2004-01-01
To compare the cost of maternal and child health care (current model) to that of the WHO Mother-Baby Package if it were implemented. A pilot cross-sectional case study was conducted in September 2001 in Sanitary District No. III, Morelos State, Mexico. Two rural health centers, an urban health center, and a general hospital, all managed by the Ministry of Health, were selected for the study. The Mother-Baby Package Costing Spreadsheet was used to estimate the total cost and cost per intervention for the current model and for the Mother-Baby Package model. The total cost of the Mother-Baby Package was twice the cost of the current model. Of the 18 interventions evaluated, the highest proportion of total costs corresponded to antenatal care and normal delivery. Personnel costs represented more than half of the total costs. The Mother-Baby Package Costing Spreadsheet is a practical tool to estimate and compare costs and is useful to guide the distribution of financial resources allocated to maternal and child healthcare. However, this model has limited application unless it is adapted to the structure of each healthcare system. The English version of this paper is available at: http://www.insp.mx/salud/index.html.
Implementation of Instrumental Variable Bounds for Data Missing Not at Random.
Marden, Jessica R; Wang, Linbo; Tchetgen, Eric J Tchetgen; Walter, Stefan; Glymour, M Maria; Wirth, Kathleen E
2018-05-01
Instrumental variables are routinely used to recover a consistent estimator of an exposure causal effect in the presence of unmeasured confounding. Instrumental variable approaches to account for nonignorable missing data also exist but are less familiar to epidemiologists. Like instrumental variables for exposure causal effects, instrumental variables for missing data rely on exclusion restriction and instrumental variable relevance assumptions. Yet these two conditions alone are insufficient for point identification. For estimation, researchers have invoked a third assumption, typically involving fairly restrictive parametric constraints. Inferences can be sensitive to these parametric assumptions, which are typically not empirically testable. The purpose of our article is to discuss another approach for leveraging a valid instrumental variable. Although the approach is insufficient for nonparametric identification, it can nonetheless provide informative inferences about the presence, direction, and magnitude of selection bias, without invoking a third untestable parametric assumption. An important contribution of this article is an Excel spreadsheet tool that can be used to obtain empirical evidence of selection bias and calculate bounds and corresponding Bayesian 95% credible intervals for a nonidentifiable population proportion. For illustrative purposes, we used the spreadsheet tool to analyze HIV prevalence data collected by the 2007 Zambia Demographic and Health Survey (DHS).
Pearson, Richard
2011-03-01
To assess the possibility of estimating the refractive index of rigid contact lenses from measurements of their back vertex power (BVP) in air and when immersed in liquid. First, a spreadsheet model was used to quantify the magnitude of errors arising from simulated inaccuracies in the variables required to calculate refractive index. Then, refractive index was calculated from in-air and in-liquid measurements of the BVP of 21 lenses that had been made in three negative BVPs from materials with seven different nominal refractive indices. The power measurements were made by two operators on two occasions. Intraobserver reliability showed a mean difference of 0.0033±0.0061 (t = 0.544, P = 0.59), interobserver reliability showed a mean difference of 0.0043±0.0061 (t = 0.707, P = 0.48), and the mean difference between the nominal and calculated refractive index values was -0.0010±0.0111 (t = -0.093, P = 0.93). The spreadsheet prediction that low-powered lenses might be subject to greater errors in the calculated values of refractive index was substantiated by the experimental results. This method shows good intra- and interobserver reliability and can be used easily in a clinical setting to provide an estimate of the refractive index of rigid contact lenses having a BVP of 3 D or more.
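The paper's calculation is not reproduced here, but the underlying idea can be sketched under a crude thin-lens approximation: lens power scales with (n_lens - n_medium), so the power ratio r = BVP_liquid/BVP_air gives n_lens = (n_liquid - r)/(1 - r). The example powers below are invented:

```python
# Refractive index from in-air and in-liquid powers, thin-lens
# approximation only (the paper's full method accounts for more).
def lens_index(bvp_air, bvp_liquid, n_liquid=1.333):
    r = bvp_liquid / bvp_air           # power ratio
    return (n_liquid - r) / (1.0 - r)

print(round(lens_index(-5.00, -1.46), 3))   # ~1.47 for these invented powers
```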
Spreadsheet Modeling of Electron Distributions in Solids
ERIC Educational Resources Information Center
Glassy, Wingfield V.
2006-01-01
A series of spreadsheet modeling exercises constructed as part of a new upper-level elective course on solid state materials and surface chemistry is described. The spreadsheet exercises are developed to provide students with the opportunity to interact with the conceptual framework where the role of the density of states and the Fermi-Dirac…
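A typical building block for such exercises is the Fermi-Dirac occupancy f(E) = 1/(exp((E - Ef)/kT) + 1); a small sketch with an assumed Fermi level and room temperature:

```python
# Fermi-Dirac occupancy around an assumed Fermi level (energies in eV).
import math

def fermi_dirac(E, Ef, T):
    kT = 8.617e-5 * T                  # Boltzmann constant in eV/K
    return 1.0 / (math.exp((E - Ef) / kT) + 1.0)

for E in (4.8, 4.9, 5.0, 5.1, 5.2):   # Ef = 5.0 eV assumed, T = 300 K
    print(f"E={E:.1f} eV  f={fermi_dirac(E, 5.0, 300.0):.3f}")
```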
Designing Spreadsheet-Based Tasks for Purposeful Algebra
ERIC Educational Resources Information Center
Ainley, Janet; Bills, Liz; Wilson, Kirsty
2005-01-01
We describe the design of a sequence of spreadsheet-based pedagogic tasks for the introduction of algebra in the early years of secondary schooling within the Purposeful Algebraic Activity project. This design combines two relatively novel features to bring a different perspective to research in the use of spreadsheets for the learning and…
NASA Astrophysics Data System (ADS)
Rallo, G.; Provenzano, G.; Manzano-Juárez, J.
2012-04-01
In the Mediterranean environment, where the crop growing season does not coincide with the rainy season, crops undergo periods of water stress that can be amplified by improper irrigation management. Agro-hydrological models can be an economical and simple tool to optimize irrigation water use, especially where water is a limiting factor for crop production. In the last two decades, physically based agro-hydrological models have been developed to simulate mass and energy exchange processes in the soil-plant-atmosphere system (Feddes et al., 1978; Bastiaanssen et al., 2007). Although very reliable, these models often cannot be used in practice because of the large number of required variables and the complexity of the computational analysis. Simplified agro-hydrological models may therefore represent a useful and simple tool for practical irrigation scheduling.

The main objective of this work is to assess, for an olive orchard, the suitability of the FAO-56 spreadsheet agro-hydrological model to estimate a long time series of field transpiration, soil water content, and crop water stress dynamics. A modification of the spreadsheet is suggested to adapt the simulations to a crop tolerant to water stress. In particular, by implementing a new crop water stress function, actual transpiration fluxes and an ecophysiological stress indicator, the relative transpiration, are computed in order to evaluate a plant-based irrigation scheduling parameter. The proposed amendment is validated by means of sap fluxes measured on different plants and up-scaled to plot level. Spatial and temporal variability of soil water content in the plot was measured at several depths using the Diviner 2000 capacitance probe (Sentek Environmental Technologies, 2000) and a TDR-100 (Campbell Scientific, Inc.) system. These detailed measurements revealed the high spatial variability of soil water content produced by the combined effect of localized irrigation and the non-uniform root density distribution. A further validation of the plant-based irrigation-timing indicator will consider another ecophysiological stress variable, the predawn leaf water potential.

Model accuracy was assessed using the mean absolute difference, the root mean square difference, and the Nash-Sutcliffe efficiency index. Three years of field observations allowed a detailed investigation of the dynamics of water fluxes from the soil to the atmosphere, as well as validation of the proposed amendment of the FAO-56 spreadsheet. The modified model simulated the measured average soil water content in the root zone with satisfactory approximation, with an estimation error of about 2.0%; these differences are acceptable for practical applications given the intrinsic variability of the data, especially the point measurements of soil moisture. The error in the estimated daily transpiration was less than 1 mm, and the cumulative transpiration fluxes were also estimated well.
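For orientation, the standard FAO-56 stress machinery such a spreadsheet implements (not the authors' modified stress function) reduces transpiration once root-zone depletion Dr exceeds the readily available water p*TAW, via Ks = (TAW - Dr)/((1 - p)*TAW). A hedged sketch:

```python
# Standard FAO-56 water-stress coefficient (not the paper's modified
# function); TAW, Dr, and p values are illustrative.
def fao56_ks(taw, dr, p=0.5):
    """taw: total available water (mm); dr: root-zone depletion (mm);
    p: depletion fraction before stress begins."""
    if dr <= p * taw:
        return 1.0                                  # no stress yet
    return max((taw - dr) / ((1.0 - p) * taw), 0.0)

for dr in (20, 60, 90, 110):   # depletions in mm, with TAW = 120 mm
    print(dr, round(fao56_ks(120.0, dr), 2))        # actual T = Ks * potential T
```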
Estimation of pharmacokinetic parameters from non-compartmental variables using Microsoft Excel.
Dansirikul, Chantaratsamon; Choi, Malcolm; Duffull, Stephen B
2005-06-01
This study was conducted to develop a method, termed 'back analysis (BA)', for converting non-compartmental variables to compartment model dependent pharmacokinetic parameters for both one- and two-compartment models. A Microsoft Excel spreadsheet was implemented with the use of Solver and visual basic functions. The performance of the BA method in estimating pharmacokinetic parameter values was evaluated by comparing the parameter values obtained to a standard modelling software program, NONMEM, using simulated data. The results show that the BA method was reasonably precise and provided low bias in estimating fixed and random effect parameters for both one- and two-compartment models. The pharmacokinetic parameters estimated from the BA method were similar to those of NONMEM estimation.
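For the simplest case the BA method covers (a one-compartment model after an IV bolus), the conversion is closed-form: CL = Dose/AUC, k = ln 2/t_half, V = CL/k. A sketch with invented values:

```python
# One-compartment IV-bolus conversion from non-compartmental variables
# (AUC, terminal half-life) to CL, k, and V; numbers are illustrative.
import math

dose, auc, t_half = 100.0, 50.0, 4.0   # mg, mg*h/L, h
cl = dose / auc                        # clearance, L/h
k = math.log(2) / t_half               # elimination rate constant, 1/h
v = cl / k                             # volume of distribution, L
print(f"CL={cl:.2f} L/h  k={k:.3f} 1/h  V={v:.1f} L")
```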
A Spreadsheet for a 2 x 3 x 2 Log-Linear Analysis. AIR 1991 Annual Forum Paper.
ERIC Educational Resources Information Center
Saupe, Joe L.
This paper describes a personal computer spreadsheet set up to carry out hierarchical log-linear analyses, a type of analysis useful for institutional research into multidimensional frequency tables formed from categorical variables such as faculty rank, student class level, gender, or retention status. The spreadsheet provides a concrete vehicle…
Using Spreadsheets to Help Students Think Recursively
ERIC Educational Resources Information Center
Webber, Robert P.
2012-01-01
Spreadsheets lend themselves naturally to recursive computations, since a formula can be defined as a function of one of more preceding cells. A hypothesized closed form for the "n"th term of a recursive sequence can be tested easily by using a spreadsheet to compute a large number of the terms. Similarly, a conjecture about the limit of a series…
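The workflow described, compute terms recursively as if dragging a formula down a column and then test a conjectured closed form against them, is a few lines in any language. A sketch for a(k+1) = 2a(k) + 1 with the conjecture a(k) = 2^k - 1:

```python
# Recursive sequence vs. conjectured closed form.
def recursive_terms(n):
    a, terms = 1, []
    for _ in range(n):
        terms.append(a)
        a = 2 * a + 1          # a(k+1) = 2*a(k) + 1, starting from a(1) = 1
    return terms

terms = recursive_terms(10)
closed = [2 ** k - 1 for k in range(1, 11)]   # conjecture: a(k) = 2^k - 1
print(terms == closed)                         # True: conjecture holds here
```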
A Spreadsheet Tool for Learning the Multiple Regression F-Test, T-Tests, and Multicollinearity
ERIC Educational Resources Information Center
Martin, David
2008-01-01
This note presents a spreadsheet tool that allows teachers the opportunity to guide students towards answering on their own questions related to the multiple regression F-test, the t-tests, and multicollinearity. The note demonstrates approaches for using the spreadsheet that might be appropriate for three different levels of statistics classes,…
ERIC Educational Resources Information Center
Abramovich, Sergei
2016-01-01
The paper presents the use of spreadsheets integrated with digital tools capable of symbolic computations and graphic constructions in a master's level capstone course for secondary mathematics teachers. Such use of spreadsheets is congruent with the Type II technology applications framework aimed at the development of conceptual knowledge in the…
NASA Astrophysics Data System (ADS)
Ariana, I. M.; Bagiada, I. M.
2018-01-01
Development of spreadsheet-based integrated transaction processing systems and financial reporting systems is intended to optimize the capabilities of spreadsheets in accounting data processing. The purposes of this study are: 1) to describe the spreadsheet-based integrated transaction processing and financial reporting systems; and 2) to test their technical and operational feasibility. This study is a research and development project. Its main steps are: 1) needs analysis (needs assessment); 2) development of the spreadsheet-based integrated transaction processing and financial reporting systems; and 3) testing of their feasibility. Technical feasibility covers the ability of the hardware and operating system to run the accounting application, and its simplicity and ease of use. Operational feasibility covers the users' ability to operate the application, the application's ability to produce information, and the controls built into the application. The instrument used to assess technical and operational feasibility is an expert perception questionnaire using a 4-point Likert scale, from 1 (strongly disagree) to 4 (strongly agree). Data were analyzed by percentage analysis, comparing the number of answers for each item with the ideal number of answers for that item. The systems integrate sales, purchases, and cash transaction processing to produce financial reports (statement of profit or loss and other comprehensive income, statement of changes in equity, statement of financial position, and statement of cash flows) and other reports. The spreadsheet-based integrated transaction processing and financial reporting systems were found feasible in both technical (87.50%) and operational (84.17%) aspects.
Definition and maintenance of a telemetry database dictionary
NASA Technical Reports Server (NTRS)
Knopf, William P. (Inventor)
2007-01-01
A telemetry dictionary database includes a component for receiving spreadsheet workbooks of telemetry data over a web-based interface from other computer devices. Another component routes the spreadsheet workbooks to a specified directory on the host processing device. A process then checks the received spreadsheet workbooks for errors, and if no errors are detected the spreadsheet workbooks are routed to another directory to await initiation of a remote database loading process. The loading process first converts the spreadsheet workbooks to comma separated value (CSV) files. Next, a network connection with the computer system that hosts the telemetry dictionary database is established and the CSV files are ported to the computer system that hosts the telemetry dictionary database. This is followed by a remote initiation of a database loading program. Upon completion of loading a flatfile generation program is manually initiated to generate a flatfile to be used in a mission operations environment by the core ground system.
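The workbook-to-CSV conversion step described can be sketched with openpyxl; the directory layout and file names below are invented, not the patent's:

```python
# Hedged sketch of converting each sheet of a workbook to a CSV file.
import csv
from pathlib import Path
from openpyxl import load_workbook

def workbook_to_csv(xlsx_path, out_dir):
    wb = load_workbook(xlsx_path, read_only=True, data_only=True)
    for name in wb.sheetnames:
        ws = wb[name]
        out = Path(out_dir) / f"{Path(xlsx_path).stem}_{name}.csv"
        with open(out, "w", newline="") as f:
            writer = csv.writer(f)
            for row in ws.iter_rows(values_only=True):
                writer.writerow(row)

workbook_to_csv("telemetry_defs.xlsx", ".")   # invented file name
```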
DOE Office of Scientific and Technical Information (OSTI.GOV)
Simpkins, A.A.
1996-09-01
AXAOTHER XL is an Excel spreadsheet used to determine the dose to the maximally exposed offsite individual during high-velocity straight winds or tornado conditions. Both individual and population doses may be considered. Potential exposure pathways are inhalation and plume shine. For high-velocity straight winds, the spreadsheet can determine the downwind relative air concentration; for tornado conditions, the user must enter the relative air concentration. Theoretical models are discussed and hand calculations are performed to ensure proper application of methodologies. A section has also been included that contains user instructions for the spreadsheet.
Erickson, Collin B; Ankenman, Bruce E; Sanchez, Susan M
2018-06-01
This data article provides the summary data from tests comparing various Gaussian process software packages. Each spreadsheet represents a single function or type of function using a particular input sample size. In each spreadsheet, a row gives the results for a particular replication using a single package. Within each spreadsheet there are the results from eight Gaussian process model-fitting packages on five replicates of the surface. There is also one spreadsheet comparing the results from two packages performing stochastic kriging. These data enable comparisons between the packages to determine which package will give users the best results.
DARPA Initiative in Concurrent Engineering (DICE). Phase 2
1990-07-31
Module development included: the XS spreadsheet tool; the Q-Calc spreadsheet tool; a TAE+ outer wrapper for XS; a Framemaker-based formal EDN (Electronic Design Notebook); and data ... shared global object space and object persistence. Technical Results, Module Development, XS Integration Environment: a prototype of the wrapper concepts for a spreadsheet integration environment, using an X-Windows-based extensible Lotus 1-2-3 emulation called XS, was (initially) targeted for ...
ERIC Educational Resources Information Center
Gierdien, M. Faaiz
2014-01-01
This paper reports on the initial stages of a small-scale project involving the use of "spreadsheet algebra programs" in the professional development of eight teachers from three township high schools. In terms of the education context, the paper draws on social practice theory. It then details what is meant by spreadsheet algebra. An…
NASA Astrophysics Data System (ADS)
McCartney, Tannis Maureen
Tectonic subsidence curves for over 300 subsurface wells in west-central Alberta indicate that the Western Canada Foreland Basin was initiated at the same time the lower units of the Fernie Formation were being deposited. This evidence is further supported by sedimentological data and fits with the timing of the onset of deformation in the Cordillera and the initiation of the foreland basin in Montana. The number of subsidence curves in this study required an innovative methodology. Subsidence calculations were performed using customized macros in a spreadsheet. The tectonic subsidence variations were displayed in a tectonic subsidence envelope, which showed the total variation in the subsidence curves, and three suites of maps: tectonic subsidence, tectonic subsidence residuals, and tectonic subsidence ratios. Collectively, the maps of the tectonic subsidence in the Fernie Formation show that there was a western influence on subsidence during deposition of the oldest members of the Fernie Formation.
Gallegos, Tanya J.; Varela, Brian A.
2015-01-01
Comprehensive, published, and publicly available data regarding the extent, location, and character of hydraulic fracturing in the United States are scarce. The objective of this data series is to publish data related to hydraulic fracturing in the public domain. The spreadsheets released with this data series contain derivative datasets aggregated temporally and spatially from the commercial and proprietary IHS database of U.S. oil and gas production and well data (IHS Energy, 2011). These datasets, served in 21 spreadsheets in Microsoft Excel (.xlsx) format, outline the geographical distributions of hydraulic fracturing treatments and associated wells (including well drill-hole directions) as well as water volumes, proppants, treatment fluids, and additives used in hydraulic fracturing treatments in the United States from 1947 through 2010. This report also describes the data—extraction/aggregation processing steps, field names and descriptions, field types and sources. An associated scientific investigation report (Gallegos and Varela, 2014) provides a detailed analysis of the data presented in this data series and comparisons of the data and trends to the literature.
Electronics Environmental Benefits Calculator
The Electronics Environmental Benefits Calculator (EEBC) was developed to assist organizations in estimating the environmental benefits of greening their purchase, use, and disposal of electronics. The EEBC estimates the environmental and economic benefits of: purchasing Electronic Product Environmental Assessment Tool (EPEAT)-registered products; enabling power management features on computers and monitors above default percentages; extending the life of equipment beyond baseline values; reusing computers, monitors, and cell phones; and recycling computers, monitors, cell phones, and loads of mixed electronic products. The EEBC may be downloaded as a Microsoft Excel spreadsheet. See https://www.federalelectronicschallenge.net/resources/bencalc.htm for more details.
Conductor requirements for high-temperature superconducting utility power transformers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pleva, E. F.; Mehrotra, V.; Schwenterly, S W
High-temperature superconducting (HTS) coated conductors in utility power transformers must satisfy a set of operating requirements that are driven by two major considerations: HTS transformers must be economically competitive with conventional units, and the conductor must be robust enough to be used in a commercial manufacturing environment. The transformer design and manufacturing process will be described in order to highlight the various requirements that it imposes on the HTS conductor. Spreadsheet estimates of HTS transformer costs allow estimates of the conductor cost required for an HTS transformer to be competitive with a similarly performing conventional unit.
Optimization of replacement and inspection decisions for multiple components on a power system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mauney, D.A.
1994-12-31
The use of optimization on the rescheduling of replacement dates provided a very proactive approach to deciding when components on individual units need to be addressed with a run/repair/replace decision. Including the effects of time value of money and taxes and unit need inside the spreadsheet model allowed the decision maker to concentrate on the effects of engineering input and replacement date decisions on the final net present value (NPV). The personal computer (PC)-based model was applied to a group of 140 forced outage critical fossil plant tube components across a power system. The estimated resulting NPV of the optimization was in the tens of millions of dollars. This PC spreadsheet model allows the interaction of inputs from structural reliability risk assessment models, plant foreman interviews, and actual failure history on a by-component, by-unit basis across a complete power production system. This model includes not only the forced outage performance of these components caused by tube failures but, in addition, the forecasted need of the individual units on the power system and the expected cost of their replacement power if forced off line. The use of cash flow analysis techniques in the spreadsheet model results in the calculation of an NPV for a whole combination of replacement dates. This allows rapid assessments of "what if" scenarios of major maintenance projects on a systemwide basis and not just on a unit-by-unit basis.
Bailey, Stephanie L.; Bono, Rose S.; Nash, Denis; Kimmel, April D.
2018-01-01
Background Spreadsheet software is increasingly used to implement systems science models informing health policy decisions, both in academia and in practice where technical capacity may be limited. However, spreadsheet models are prone to unintentional errors that may not always be identified using standard error-checking techniques. Our objective was to illustrate, through a methodologic case study analysis, the impact of unintentional errors on model projections by implementing parallel model versions. Methods We leveraged a real-world need to revise an existing spreadsheet model designed to inform HIV policy. We developed three parallel versions of a previously validated spreadsheet-based model; versions differed by the spreadsheet cell-referencing approach (named single cells; column/row references; named matrices). For each version, we implemented three model revisions (re-entry into care; guideline-concordant treatment initiation; immediate treatment initiation). After standard error-checking, we identified unintentional errors by comparing model output across the three versions. Concordant model output across all versions was considered error-free. We calculated the impact of unintentional errors as the percentage difference in model projections between model versions with and without unintentional errors, using +/-5% difference to define a material error. Results We identified 58 original and 4,331 propagated unintentional errors across all model versions and revisions. Over 40% (24/58) of original unintentional errors occurred in the column/row reference model version; most (23/24) were due to incorrect cell references. Overall, >20% of model spreadsheet cells had material unintentional errors. When examining error impact along the HIV care continuum, the percentage difference between versions with and without unintentional errors ranged from +3% to +16% (named single cells), +26% to +76% (column/row reference), and 0% (named matrices). Conclusions Standard error-checking techniques may not identify all errors in spreadsheet-based models. Comparing parallel model versions can aid in identifying unintentional errors and promoting reliable model projections, particularly when resources are limited. PMID:29570737
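The version-comparison step the authors describe reduces to flagging outputs where parallel implementations disagree beyond a materiality threshold. A hedged sketch of that check (the output values are hypothetical stand-ins; the 5% cutoff follows the study):

```python
# Sketch of the parallel-versions check: flag outputs where two
# implementations of the same model disagree by more than 5%.
# The dictionaries below are hypothetical stand-ins for model projections.

def material_differences(reference, candidate, threshold=0.05):
    flagged = []
    for key in reference:
        ref, cand = reference[key], candidate[key]
        pct = (cand - ref) / ref
        if abs(pct) > threshold:
            flagged.append((key, round(100 * pct, 1)))
    return flagged

v_named = {"in_care": 1000.0, "on_treatment": 640.0, "suppressed": 410.0}
v_rowcol = {"in_care": 1000.0, "on_treatment": 806.0, "suppressed": 516.0}

print(material_differences(v_named, v_rowcol))
# -> [('on_treatment', 25.9), ('suppressed', 25.9)]
```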
SEEG initiative estimates of Brazilian greenhouse gas emissions from 1970 to 2015.
de Azevedo, Tasso Rezende; Costa Junior, Ciniro; Brandão Junior, Amintas; Cremer, Marcelo Dos Santos; Piatto, Marina; Tsai, David Shiling; Barreto, Paulo; Martins, Heron; Sales, Márcio; Galuchi, Tharic; Rodrigues, Alessandro; Morgado, Renato; Ferreira, André Luis; Barcellos E Silva, Felipe; Viscondi, Gabriel de Freitas; Dos Santos, Karoline Costal; Cunha, Kamyla Borges da; Manetti, Andrea; Coluna, Iris Moura Esteves; Albuquerque, Igor Reis de; Junior, Shigueo Watanabe; Leite, Clauber; Kishinami, Roberto
2018-05-29
This work presents the SEEG platform, a 46-year long dataset of greenhouse gas emissions (GHG) in Brazil (1970-2015) providing more than 2 million data records for the Agriculture, Energy, Industry, Waste and Land Use Change Sectors at national and subnational levels. The SEEG dataset was developed by the Climate Observatory, a Brazilian civil society initiative, based on the IPCC guidelines and Brazilian National Inventories embedded with country specific emission factors and processes, raw data from multiple official and non-official sources, and organized together with social and economic indicators. Once completed, the SEEG dataset was converted into a spreadsheet format and shared via web-platform that, by means of simple queries, allows users to search data by emission sources and country and state activities. Because of its effectiveness in producing and making available data on a consistent and accessible basis, SEEG may significantly increase the capacity of civil society, scientists and stakeholders to understand and anticipate trends related to GHG emissions as well as its implications to public policies in Brazil.
Individualized Human CAD Models: Anthropometric Morphing and Body Tissue Layering
2014-07-31
Report figures include a flow chart of the interaction among VBA macros, the Excel® spreadsheet, and SolidWorks, and front views of the male and female Soldier CAD models. The spreadsheet is linked to the CAD model by macros created with the Visual Basic for Applications (VBA) editor in Microsoft Excel. There are basically three working parts to the anthropometric morphing, all interconnected: the VBA macros, the Excel spreadsheet, and SolidWorks.
Edwardson, S R; Pejsa, J
1993-01-01
A computer-based tutorial for teaching nursing financial management concepts was developed using the macro function of a commercially available spreadsheet program. The goals of the tutorial were to provide students with an experience with spreadsheets as a computer tool and to teach selected financial management concepts. Preliminary results show the tutorial was well received by students. Suggestions are made for overcoming the general lack of computer sophistication among students.
ERIC Educational Resources Information Center
Ge, Yingbin; Rittenhouse, Robert C.; Buchanan, Jacob C.; Livingston, Benjamin
2014-01-01
We have designed an exercise suitable for a lab or project in an undergraduate physical chemistry course that creates a Microsoft Excel spreadsheet to calculate the energy of the S[subscript 0] ground electronic state and the S[subscript 1] and T[subscript 1] excited states of H[subscript 2]. The spreadsheet calculations circumvent the…
Simplifying CEA through Excel, VBA, and Subeq
NASA Technical Reports Server (NTRS)
Foster, Ryan
2004-01-01
Many people use compound equilibrium programs for very different reasons, varying from refrigerators to light bulbs to rockets. A commonly used equilibrium program is CEA. CEA can take various inputs such as pressure, temperature, and volume along with numerous reactants and run them through equilibrium equations to obtain valuable output information, including products formed and their relative amounts. A little over a year ago, Bonnie McBride created the program subeq with the goal of simplifying the calling of CEA. Subeq was also designed to be called by other programs, including Excel, through the use of Visual Basic for Applications (VBA). The largest advantage of using Excel is that it allows the user to input the information in a colorful and user-friendly environment while allowing VBA to run subeq, which is in the form of a FORTRAN DLL (Dynamic Link Library). Calling subeq in this form makes it much faster than if it were converted to VBA. Since subeq requires such large lists of reactant and product names, which cannot all be passed in as an array, subeq had to be changed to accept very long strings of reactants and products. To pass this string and adjust the transfer of input and output parameters, the subeq DLL had to be changed. One program that does this is Compaq Visual FORTRAN, which allows DLLs to be edited, debugged, and compiled. Compaq Visual FORTRAN uses FORTRAN 90/95, which has additional features beyond those of FORTRAN 77. My goals this summer include finishing up the Excel spreadsheet interface to subeq, which I started last summer, and putting it on the Internet so that others can use it without having to download my spreadsheet. To finish up the spreadsheet I will need to work on debugging current options and problems. I will also work on making it as robust as possible, so that all errors that may arise will be clearly communicated to the user. New features will be added and old ones changed as I receive comments from people using the spreadsheet. To implement this onto the Internet, I will need to develop an XML input/output format and learn how to write HTML.
Module Interconnection Frameworks for a Real-Time Spreadsheet
1993-10-19
Biaxial (Tension-Torsion) Testing of an Oxide/Oxide Ceramic Matrix Composite
2013-03-01
Report excerpts cover estimation algorithms and constants, and a biaxial (tension-torsion) load spreadsheet with independent axial load and torsion. The matrix transfers loads between fibers; this fiber-to-fiber load-transfer mechanism plays a major role in the load-bearing properties of the composite.
Jobs and Economic Development Impact (JEDI) Model: Offshore Wind User Reference Guide
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lantz, E.; Goldberg, M.; Keyser, D.
2013-06-01
The Offshore Wind Jobs and Economic Development Impact (JEDI) model, developed by NREL and MRG & Associates, is a spreadsheet-based input-output tool. JEDI is meant to be a user-friendly and transparent tool to estimate potential economic impacts supported by the development and operation of offshore wind projects. This guide describes how to use the model as well as technical information such as methodology, limitations, and data sources.
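Input-output models of the JEDI type rest on the Leontief inverse: total output x solves (I - A)x = d for a technical-coefficients matrix A and a final-demand vector d. A minimal NumPy sketch with a hypothetical two-sector economy (the matrix and demand values are illustrative, not JEDI's data):

```python
import numpy as np

# Hypothetical 2-sector technical-coefficients matrix A:
# A[i, j] = input required from sector i per dollar of sector j output.
A = np.array([[0.20, 0.10],
              [0.15, 0.25]])

# Final-demand shock, e.g. offshore wind project spending by sector ($M).
d = np.array([100.0, 40.0])

# Leontief inverse gives total (direct + indirect) output requirements.
total_output = np.linalg.solve(np.eye(2) - A, d)
print(total_output)                   # total gross output by sector
print(total_output.sum() / d.sum())   # simple output multiplier
```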
A document-centric approach for developing the tolAPC ontology.
Blfgeh, Aisha; Warrender, Jennifer; Hilkens, Catharien M U; Lord, Phillip
2017-11-28
There are many challenges associated with ontology building, as the process often touches on many different subject areas; it needs knowledge of the problem domain, an understanding of the ontology formalism, the software in use and, sometimes, an understanding of the philosophical background. In practice, it is very rare that an ontology can be completed by a single person, as they are unlikely to combine all of these skills. So people with these skills must collaborate. One solution to this is to use face-to-face meetings, but these can be expensive and time-consuming for teams that are not co-located. Remote collaboration is possible, of course, but one difficulty here is that domain specialists use a wide variety of different "formalisms" to represent and share their data; by far the most common is the "office file", either in the form of a word-processor document or a spreadsheet. Here we describe the development of an ontology of immunological cell types; this was initially developed by domain specialists using an Excel spreadsheet for collaboration. We have transformed this spreadsheet into an ontology using highly programmatic and pattern-driven ontology development. Critically, the spreadsheet remains part of the source for the ontology; the domain specialists are free to update it, and changes will percolate to the end ontology. We have developed a new ontology describing immunological cell lines built by instantiating ontology design patterns written programmatically, using values from a spreadsheet catalogue. This method employs a spreadsheet that was developed by domain experts. The spreadsheet is unconstrained in its usage and can be freely updated, resulting in a new ontology. This provides a general methodology for ontology development using data generated by domain specialists.
Kurylyk, Barret L.; Irvine, Dylan J.; Carey, Sean K.; Briggs, Martin A.; Werkema, Dale D.; Bonham, Mariah
2017-01-01
Groundwater flow advects heat, and thus, the deviation of subsurface temperatures from an expected conduction‐dominated regime can be analysed to estimate vertical water fluxes. A number of analytical approaches have been proposed for using heat as a groundwater tracer, and these have typically assumed a homogeneous medium. However, heterogeneous thermal properties are ubiquitous in subsurface environments, both at the scale of geologic strata and at finer scales in streambeds. Herein, we apply the analytical solution of Shan and Bodvarsson (2004), developed for estimating vertical water fluxes in layered systems, in 2 new environments distinct from previous vadose zone applications. The utility of the solution for studying groundwater‐surface water exchange is demonstrated using temperature data collected from an upwelling streambed with sediment layers, and a simple sensitivity analysis using these data indicates the solution is relatively robust. Also, a deeper temperature profile recorded in a borehole in South Australia is analysed to estimate deeper water fluxes. The analytical solution is able to match observed thermal gradients, including the change in slope at sediment interfaces. Results indicate that not accounting for layering can yield errors in the magnitude and even direction of the inferred Darcy fluxes. A simple automated spreadsheet tool (Flux‐LM) is presented to allow users to input temperature and layer data and solve the inverse problem to estimate groundwater flux rates from shallow (e.g., <1 m) or deep (e.g., up to 100 m) profiles. The solution is not transient, and thus, it should be cautiously applied where diel signals propagate or in deeper zones where multi‐decadal surface signals have disturbed subsurface thermal regimes.
ERIC Educational Resources Information Center
Smith, Michael
1990-01-01
Presents several examples of the iteration method using computer spreadsheets. Examples included are simple iterative sequences and the solution of equations using the Newton-Raphson formula, linear interpolation, and interval bisection. (YP)
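Both iterations named in the abstract move directly from spreadsheet rows to code. A short sketch of the Newton-Raphson formula and interval bisection applied to f(x) = x^2 - 2, whose positive root is the square root of 2:

```python
# Newton-Raphson and interval bisection, the two iterations the article
# builds in spreadsheet columns, applied to f(x) = x**2 - 2.

def f(x):
    return x * x - 2.0

def df(x):
    return 2.0 * x

x = 1.0                      # Newton-Raphson: x_{n+1} = x_n - f(x_n)/f'(x_n)
for _ in range(6):
    x = x - f(x) / df(x)
print(x)                     # ~1.414213562

lo, hi = 1.0, 2.0            # bisection: halve the bracketing interval
for _ in range(40):
    mid = 0.5 * (lo + hi)
    if f(lo) * f(mid) <= 0:  # root lies in the lower half
        hi = mid
    else:
        lo = mid
print(0.5 * (lo + hi))       # ~1.414213562
```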
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1991-02-01
This appendix is a compilation of work done to predict overall cycle performance from gasifier to generator terminals. A spreadsheet has been generated for each case to show flows within a cycle. The spreadsheet shows gaseous or solid composition of flow, temperature of flow, quantity of flow, and heat content of flow. Prediction of steam and gas turbine performance was obtained by the computer program GTPro. Outputs of all runs for each combined cycle reviewed have been added to this appendix. A process schematic displaying all flows predicted through GTPro and the spreadsheet is also added to this appendix. The numbered bubbles on the schematic correspond to columns on the top headings of the spreadsheet.
User's manual for the Graphical Constituent Loading Analysis System (GCLAS)
Koltun, G.F.; Eberle, Michael; Gray, J.R.; Glysson, G.D.
2006-01-01
This manual describes the Graphical Constituent Loading Analysis System (GCLAS), an interactive cross-platform program for computing the mass (load) and average concentration of a constituent that is transported in stream water over a period of time. GCLAS computes loads as a function of an equal-interval streamflow time series and an equal- or unequal-interval time series of constituent concentrations. The constituent-concentration time series may be composed of measured concentrations or a combination of measured and estimated concentrations. GCLAS is not intended for use in situations where concentration data (or an appropriate surrogate) are collected infrequently or where an appreciable number of the concentration values are censored. It is assumed that the constituent-concentration time series used by GCLAS adequately represents the true time-varying concentration. Commonly, measured constituent concentrations are collected at a frequency that is less than ideal (from a load-computation standpoint), so estimated concentrations must be inserted in the time series to better approximate the expected chemograph. GCLAS provides tools to facilitate estimation and entry of instantaneous concentrations for that purpose. Water-quality samples collected for load computation frequently are collected in a single vertical or at a single point in a stream cross section. Several factors, some of which may vary as a function of time and (or) streamflow, can affect whether the sample concentrations are representative of the mean concentration in the cross section. GCLAS provides tools to aid the analyst in assessing whether concentrations in samples collected in a single vertical or at a single point in a stream cross section exhibit systematic bias with respect to the mean concentrations. In cases where bias is evident, the analyst can construct coefficient relations in GCLAS to reduce or eliminate the observed bias. GCLAS can export load and concentration data in formats suitable for entry into the U.S. Geological Survey's National Water Information System. GCLAS can also import and export data in formats that are compatible with various commonly used spreadsheet and statistics programs.
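The underlying computation is a time integration of discharge times concentration with a unit conversion. A simplified sketch (hypothetical 15-minute data; concentrations assumed already interpolated onto the streamflow time steps, as GCLAS requires):

```python
# Sketch of a constituent-load computation: load = sum(Q * C * dt * k).
# Hypothetical equal-interval data. 1 (m3/s)*(mg/L) = 1 g/s, so multiplying
# by dt seconds gives grams and k = 1e-3 converts to kilograms.

dt_s = 900.0                     # 15-minute interval, in seconds
k = 1e-3                         # grams -> kilograms

q_m3s = [12.0, 14.5, 18.0, 16.0, 13.5]     # streamflow, m3/s
c_mgL = [30.0, 42.0, 55.0, 47.0, 35.0]     # constituent concentration, mg/L

load_kg = sum(q * c * dt_s * k for q, c in zip(q_m3s, c_mgL))
avg_conc = sum(q * c for q, c in zip(q_m3s, c_mgL)) / sum(q_m3s)  # flow-weighted

print(round(load_kg, 1), "kg;", round(avg_conc, 1), "mg/L flow-weighted")
```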
Fitting Planetary Orbits with a Spreadsheet.
ERIC Educational Resources Information Center
Bridges, Richard
1995-01-01
Describes how to fit binocular observations of the planets to a theoretical model of circular orbits using a modern computer spreadsheet, from which fundamental data about the solar system may be deduced. (AIM)
Strontium-90 Error Discovered in Subcontract Laboratory Spreadsheet
DOE Office of Scientific and Technical Information (OSTI.GOV)
D. D. Brown A. S. Nagel
1999-07-31
West Valley Demonstration Project health physicists and environmental scientists discovered a series of errors in a subcontractor's spreadsheet being used to reduce data as part of its strontium-90 analytical process.
Analysis of the Requirements Generation Process for the Logistics Analysis and Wargame Support Tool
2017-06-01
For instance, the requirements for a pen seem straight forward; however, they may vary depending on the context in which the pen will be used...the interactions between the operational elements, specify which tasks are dependent on others and the order of executing task, and estimate how...configuration file to call that spreadsheet. This requirement can be met depending on the situation. If the nodes and arcs are pre-defined and readily
Economic Comparison of Processes Using Spreadsheet Programs
NASA Technical Reports Server (NTRS)
Ferrall, J. F.; Pappano, A. W.; Jennings, C. N.
1986-01-01
Inexpensive approach aids plant-design decisions. Commercially available electronic spreadsheet programs aid economic comparison of different processes for producing particular end products. Facilitates plant-design decisions without requiring large expenditures for powerful mainframe computers.
Digital Archiving: Where the Past Lives Again
NASA Astrophysics Data System (ADS)
Paxson, K. B.
2012-06-01
The process of digital archiving for variable star data by manual entry with an Excel spreadsheet is described. Excel-based tools including a Step Magnitude Calculator and a Julian Date Calculator for variable star observations where magnitudes and Julian dates have not been reduced are presented. Variable star data in the literature and the AAVSO International Database prior to 1911 are presented and reviewed, with recent archiving work being highlighted. Digitization using optical character recognition software conversion is also demonstrated, with editing and formatting suggestions for the OCR-converted text.
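A Julian Date calculator of the kind described can be built from the standard astronomical conversion formula (Meeus-style, Gregorian calendar assumed); a minimal sketch:

```python
def julian_date(year, month, day_frac):
    """Gregorian calendar date -> Julian Date (standard astronomical formula)."""
    if month <= 2:                 # treat Jan/Feb as months 13/14 of prior year
        year -= 1
        month += 12
    a = year // 100
    b = 2 - a + a // 4             # Gregorian calendar correction
    return (int(365.25 * (year + 4716)) + int(30.6001 * (month + 1))
            + day_frac + b - 1524.5)

# 2000 Jan 1.5 (noon UT) is the standard epoch J2000.0 = JD 2451545.0
print(julian_date(2000, 1, 1.5))   # 2451545.0
```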
DOE Office of Scientific and Technical Information (OSTI.GOV)
North, Michael J.
SchemaOnRead provides tools for implementing schema-on-read including a single function call (e.g., schemaOnRead("filename")) that reads text (TXT), comma separated value (CSV), raster image (BMP, PNG, GIF, TIFF, and JPG), R data (RDS), HDF5, NetCDF, spreadsheet (XLS, XLSX, ODS, and DIF), Weka Attribute-Relation File Format (ARFF), Epi Info (REC), Pajek network (PAJ), R network (NET), Hypertext Markup Language (HTML), SPSS (SAV), Systat (SYS), and Stata (DTA) files. It also recursively reads folders (e.g., schemaOnRead("folder")), returning a nested list of the contained elements.
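SchemaOnRead itself is an R package, but the dispatch idea is language-independent. A hedged Python analogue that selects a reader from the file extension (only a few formats shown; the function names are illustrative, not the package's API):

```python
# Python analogue of the schema-on-read idea: one entry point that
# dispatches on file extension. Readers shown are illustrative only.
import csv
import json
from pathlib import Path

def schema_on_read(path):
    ext = Path(path).suffix.lower()
    if ext == ".csv":
        with open(path, newline="") as f:
            return list(csv.reader(f))
    if ext == ".json":
        with open(path) as f:
            return json.load(f)
    if ext in (".txt", ".log"):
        return Path(path).read_text()
    raise ValueError(f"no reader registered for {ext}")

def read_folder(folder):
    """Recursively read a folder into a nested dict, as the R package does."""
    out = {}
    for p in Path(folder).iterdir():
        try:
            out[p.name] = read_folder(p) if p.is_dir() else schema_on_read(p)
        except ValueError:
            pass  # skip formats without a registered reader
    return out
```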
Spreadsheets in Science Teaching.
ERIC Educational Resources Information Center
Elliot, Chris
1988-01-01
Described is the use of a spreadsheet to model dynamic phenomena using numerical iterative methods. Uses the discharge of a capacitor, simple and damped harmonic motion, and the flow of heat along a bar as examples. (Author/CW)
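The capacitor-discharge example translates to a two-line iteration per time step. A sketch of the numerical scheme such a spreadsheet implements, checked against the exact exponential:

```python
import math

# Iterative (Euler) model of capacitor discharge: dV/dt = -V / (R*C).
R, C = 10e3, 100e-6        # 10 kilohm, 100 microfarad -> time constant RC = 1 s
dt, V = 0.01, 9.0          # time step (s) and initial voltage (V)

t = 0.0
for _ in range(200):       # iterate out to t = 2 s
    V += dt * (-V / (R * C))
    t += dt

print(round(V, 4))                             # iterative estimate (~1.206)
print(round(9.0 * math.exp(-t / (R * C)), 4))  # exact solution (~1.218)
```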
Spreadsheet Works: Graphing Functions on a Spreadsheet.
ERIC Educational Resources Information Center
Ramamurthi, V. S.
1989-01-01
Explains graphing functions when using LOTUS 1-2-3. Provides examples and explains keystroke entries needed to make the graphs. Notes up to six functions can be displayed on the same set of axes. (MVL)
Fitting Orbits to Jupiter's Moons with a Spreadsheet.
ERIC Educational Resources Information Center
Bridges, Richard
1995-01-01
Describes how a spreadsheet is used to fit a circular orbit model to observations of Jupiter's moons made with a small telescope. Kepler's Third Law and the inverse square law of gravity are observed. (AIM)
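The fit amounts to checking Kepler's Third Law, T^2 proportional to a^3. A sketch using approximate published periods and orbital radii for the Galilean moons as stand-ins for the article's telescope observations:

```python
# Kepler's Third Law check: T^2 / a^3 should be constant for Jupiter's moons.
# Approximate published periods (days) and orbital radii (10^3 km) stand in
# for small-telescope observations.

moons = {
    "Io":       (1.769,   421.8),
    "Europa":   (3.551,   671.1),
    "Ganymede": (7.155,  1070.4),
    "Callisto": (16.689, 1882.7),
}

for name, (T, a) in moons.items():
    ratio = T ** 2 / (a / 1e3) ** 3     # a rescaled to 10^6 km
    print(f"{name:9s} T^2/a^3 = {ratio:.2f}")
# The near-constant ratio (~41.7) is the inverse-square-law signature;
# its value yields Jupiter's mass via M = 4*pi^2*a^3 / (G*T^2).
```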
NCBI GEO: mining tens of millions of expression profiles--database and tools update.
Barrett, Tanya; Troup, Dennis B; Wilhite, Stephen E; Ledoux, Pierre; Rudnev, Dmitry; Evangelista, Carlos; Kim, Irene F; Soboleva, Alexandra; Tomashevsky, Maxim; Edgar, Ron
2007-01-01
The Gene Expression Omnibus (GEO) repository at the National Center for Biotechnology Information (NCBI) archives and freely disseminates microarray and other forms of high-throughput data generated by the scientific community. The database has a minimum information about a microarray experiment (MIAME)-compliant infrastructure that captures fully annotated raw and processed data. Several data deposit options and formats are supported, including web forms, spreadsheets, XML and Simple Omnibus Format in Text (SOFT). In addition to data storage, a collection of user-friendly web-based interfaces and applications are available to help users effectively explore, visualize and download the thousands of experiments and tens of millions of gene expression patterns stored in GEO. This paper provides a summary of the GEO database structure and user facilities, and describes recent enhancements to database design, performance, submission format options, data query and retrieval utilities. GEO is accessible at http://www.ncbi.nlm.nih.gov/geo/
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oldenburg, Curtis M.
2006-11-27
A screening and ranking framework (SRF) has been developed to evaluate potential geologic carbon dioxide (CO2) storage sites on the basis of health, safety, and environmental (HSE) risk arising from CO2 leakage. The approach is based on the assumption that CO2 leakage risk is dependent on three basic characteristics of a geologic CO2 storage site: (1) the potential for primary containment by the target formation; (2) the potential for secondary containment if the primary formation leaks; and (3) the potential for attenuation and dispersion of leaking CO2 if the primary formation leaks and secondary containment fails. The framework is implemented in a spreadsheet in which users enter numerical scores representing expert opinions or published information along with estimates of uncertainty. Applications to three sites in California demonstrate the approach. Refinements and extensions are possible through the use of more detailed data or model results in place of property proxies.
Wentzensen, Nicolas; Wacholder, Sholom
2013-02-01
Researchers developing biomarkers for early detection can determine the potential for clinical benefit at early stages of development. We provide the theoretical background showing the quantitative connection between biomarker levels in cases and controls and clinically meaningful risk measures, as well as a spreadsheet for researchers to use in their own research. We provide researchers with tools to decide whether a test is useful, whether it needs technical improvement, whether it may work only in specific populations, or whether any further development is futile. The methods described here apply to any method that aims to estimate risk of disease based on biomarkers, clinical tests, genetics, environment, or behavior. Many efforts go into futile biomarker development and premature clinical testing. In many instances, predictions for translational success or failure can be made early, simply based on critical analysis of case–control data. Our article presents well-established theory in a form that can be appreciated by biomarker researchers. Furthermore, we provide an interactive spreadsheet that links biomarker performance with specific disease characteristics to evaluate the promise of biomarker candidates at an early stage.
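The case-control-to-risk connection can be illustrated under a simple binormal assumption: biomarker levels normal in controls and cases with a shared SD. Fixing specificity yields a threshold and hence sensitivity, and prevalence converts both into positive predictive value. A hedged sketch (all parameter values hypothetical, not the authors' spreadsheet):

```python
from statistics import NormalDist

# Binormal sketch: biomarker ~ N(mu0, sd) in controls, N(mu1, sd) in cases.
# Fixing specificity gives the threshold; sensitivity and prevalence then
# give the positive predictive value (PPV).
mu0, mu1, sd = 0.0, 1.0, 1.0     # one-SD case-control separation
prevalence = 0.01
specificity = 0.95

threshold = NormalDist(mu0, sd).inv_cdf(specificity)
sensitivity = 1 - NormalDist(mu1, sd).cdf(threshold)
fpr = 1 - specificity

ppv = (sensitivity * prevalence
       / (sensitivity * prevalence + fpr * (1 - prevalence)))
print(f"sensitivity={sensitivity:.2f}, PPV={ppv:.3f}")  # ~0.26, ~0.050
```

The low PPV at 1% prevalence, despite a full one-SD case-control separation, is exactly the kind of early futility signal the authors describe.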
Shulman, Stanley A; Smith, Jerome P
2002-01-01
A method is presented for the evaluation of the bias, variability, and accuracy of gas monitors. This method is based on using the parameters for the fitted response curves of the monitors. Thereby, variability between calibrations, between dates within each calibration period, and between different units can be evaluated at several different standard concentrations. By combining variability information with bias information, accuracy can be assessed. An example using carbon monoxide monitor data is provided. Although the most general statistical software required for these tasks is not available on a spreadsheet, when the same number of dates in a calibration period are evaluated for each monitor unit, the calculations can be done on a spreadsheet. An example of such calculations, together with the formulas needed for their implementation, is provided. In addition, the methods can be extended by use of appropriate statistical models and software to evaluate monitor trends within calibration periods, as well as consider the effects of other variables, such as humidity and temperature, on monitor variability and bias.
Software validation applied to spreadsheets used in laboratories working under ISO/IEC 17025
NASA Astrophysics Data System (ADS)
Banegas, J. M.; Orué, M. W.
2016-07-01
Several documents deal with software validation. Nevertheless, most are too complex to be applied to validating spreadsheets - surely the most used software in laboratories working under ISO/IEC 17025. The method proposed in this work is intended to be directly applicable to validating spreadsheets. It includes a systematic way to document requirements, operational aspects of validation, and a simple method for keeping records of validation results and modification history. This method is currently being used in an accredited calibration laboratory, where it has proved practical and efficient.
Public availability of research data in dentistry journals indexed in Journal Citation Reports.
Vidal-Infer, Antonio; Tarazona, Beatriz; Alonso-Arroyo, Adolfo; Aleixandre-Benavent, Rafael
2018-01-01
Dentistry is a medical discipline with an increasing scientific production in recent years. Given the importance of data sharing in science, this study analyzes the availability of raw data in articles from scientific journals indexed in the Dentistry category of the 2014 edition of the Journal Citation Reports. A review of the 88 websites of journals from the Dentistry category was conducted to determine their data-sharing editorial policies. Furthermore, a search of the PubMed Central repository was carried out to collect information about the characteristics of the supplementary material of articles from those journals. The possibility of publishing supplementary material was higher in first-quartile journals. Of the articles registered in PubMed Central, 7.6% contained supplementary material, especially text documents, but the presence of spreadsheets was scarce. There is a relationship between openness policies and the impact of the journals according to their quartile or position ranking by impact factor in the JCR, but the willingness to share raw data in spreadsheet format is still limited. This study reveals sources of raw data that can improve the quality of research and clinical practice.
Software for Estimating Costs of Testing Rocket Engines
NASA Technical Reports Server (NTRS)
Hines, Merlon M.
2004-01-01
A high-level parametric mathematical model for estimating the costs of testing rocket engines and components at Stennis Space Center has been implemented as a Microsoft Excel program that generates multiple spreadsheets. The model and the program are both denoted, simply, the Cost Estimating Model (CEM). The inputs to the CEM are the parameters that describe particular tests, including test types (component or engine test), numbers and duration of tests, thrust levels, and other parameters. The CEM estimates anticipated total project costs for a specific test. Estimates are broken down into testing categories based on a work-breakdown structure and a cost-element structure. A notable historical assumption incorporated into the CEM is that total labor times depend mainly on thrust levels. As a result of a recent modification of the CEM to increase the accuracy of predicted labor times, the dependence of labor time on thrust level is now embodied in third- and fourth-order polynomials.
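The labor-time relationship described is a polynomial evaluation in thrust level. A hedged sketch of that idea (the coefficients below are hypothetical placeholders, not the CEM's actual fit):

```python
# Sketch of the CEM's labor-time idea: labor hours as a polynomial in
# thrust level. Coefficients are hypothetical placeholders.

def labor_hours(thrust_klbf, coeffs):
    """Evaluate c0 + c1*x + c2*x^2 + ... at x = thrust."""
    return sum(c * thrust_klbf ** i for i, c in enumerate(coeffs))

coeffs = [120.0, 3.5, -0.012, 2.0e-5, -1.1e-8]   # 4th-order, hypothetical
for thrust in (50, 250, 650):
    print(thrust, round(labor_hours(thrust, coeffs), 1))
```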
ERIC Educational Resources Information Center
Clark, Joy L.; Hegji, Charles E.
1997-01-01
Notes that using spreadsheets to teach microeconomics principles enables learning by doing in the exploration of basic concepts. Introduction of increasingly complex topics leads to exploration of theory and managerial decision making. (SK)
Building Your Own Regression Model
ERIC Educational Resources Information Center
Horton, Robert, M.; Phillips, Vicki; Kenelly, John
2004-01-01
Spreadsheets to explore regression with an algebra 2 class in a medium-sized rural high school are presented. The use of spreadsheets can help students develop sophisticated understanding of mathematical models and use them to describe real-world phenomena.
Petrogenetic Modeling with a Spreadsheet Program.
ERIC Educational Resources Information Center
Holm, Paul Eric
1988-01-01
Describes how interactive programs for scientific modeling may be created by using spreadsheet software such as LOTUS 1-2-3. Lists the advantages of using this method. Discusses fractional distillation, batch partial melting, and combination models as examples. (CW)
Bradley, D. Nathan
2012-01-01
The slope-area method is a technique for estimating the peak discharge of a flood after the water has receded (Dalrymple and Benson, 1967). This type of discharge estimate is called an "indirect measurement" because it relies on evidence left behind by the flood, such as high-water marks (HWMs) on trees or buildings. These indicators of flood stage are combined with measurements of the cross-sectional geometry of the stream, estimates of channel roughness, and a mathematical model that balances the total energy of the flow between cross sections. This is in contrast to a "direct" measurement of discharge during the flood, where cross-sectional area is measured and a current meter or acoustic equipment is used to measure the water velocity. When a direct discharge measurement cannot be made at a gage during high flows because of logistics or safety reasons, an indirect measurement of a peak discharge is useful for defining the high-flow section of the stage-discharge relation (rating curve) at the stream gage, resulting in more accurate computation of high flows. The Slope-Area Computation program (SAC; Fulford, 1994) is an implementation of the slope-area method that computes a peak-discharge estimate from inputs of water-surface slope (from surveyed HWMs), channel geometry, and estimated channel roughness. SAC is a command-line program written in Fortran that reads input data from a formatted text file and prints results to another formatted text file. Preparing the input file can be time-consuming and prone to errors. This document describes the SAC graphical user interface (GUI), a cross-platform "wrapper" application that prepares the SAC input file, executes the program, and helps the user interpret the output. The SAC GUI is an update and enhancement of the slope-area method (SAM; Hortness, 2004; Berenbrock, 1996), an earlier spreadsheet tool used to aid field personnel in the completion of a slope-area measurement. The SAC GUI reads survey data; develops a plan-view plot, a water-surface profile, and cross-section plots; and builds the SAC input file. It also generates HEC-2 files that can be imported into HEC-RAS.
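The slope-area method rests on Manning's equation, Q = (1/n) A R^(2/3) S^(1/2). A simplified single-cross-section sketch in SI units (SAC balances energy across multiple sections; the survey values here are hypothetical):

```python
import math

# Simplified slope-area sketch: Manning's equation at one cross section.
# All survey values are hypothetical.

A = 42.0        # cross-sectional area below the high-water marks, m^2
P = 26.0        # wetted perimeter, m
S = 0.0021      # water-surface slope from surveyed high-water marks
n = 0.035       # estimated Manning roughness coefficient

R = A / P       # hydraulic radius
Q = (1.0 / n) * A * R ** (2.0 / 3.0) * math.sqrt(S)
print(round(Q, 1), "m^3/s")   # ~75.7
```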
Maceneaney, P M; Malone, D E
2000-12-01
To design a spreadsheet program for rapid analysis of interventional radiology (IR) data produced in local research or reported in the literature, using 'evidence-based medicine' (EBM) parameters of treatment benefit and harm. Microsoft Excel(TM) was used. The spreadsheet consists of three worksheets. The first shows the 'Levels of Evidence and Grades of Recommendations' that can be assigned to therapeutic studies as defined by the Oxford Centre for EBM. The second and third worksheets facilitate the EBM assessment of therapeutic benefit and harm. Validity criteria are described. These include the assessment of the adequacy of sample size in the detection of possible procedural complications. A contingency (2 x 2) table for raw data on comparative outcomes in treated patients and controls has been incorporated. Formulae for EBM calculations are related to these numerators and denominators in the spreadsheet. The parameters calculated are, for benefit, relative risk reduction, absolute risk reduction, and number needed to treat (NNT); for harm, relative risk, relative odds, and number needed to harm (NNH). Ninety-five per cent confidence intervals are calculated for all these indices. The results change automatically when the data in the therapeutic outcome cells are changed. A final section allows the user to correct the NNT or NNH in their application to individual patients. This spreadsheet can be used on desktop and palmtop computers. The MS Excel(TM) version can be downloaded via the Internet from the URL ftp://radiography.com/pub/TxHarm00.xls. A spreadsheet is useful for the rapid analysis of the clinical benefit and harm from IR procedures.
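The benefit indices in such a spreadsheet follow directly from the 2 x 2 table. A sketch of the standard EBM formulas, with a normal-approximation 95% CI on the absolute risk reduction (the counts are hypothetical):

```python
import math

# Standard EBM benefit indices from a 2x2 table of outcomes.
# a/b: events/non-events in treated; c/d: events/non-events in controls.
a, b, c, d = 12, 88, 30, 70          # hypothetical counts

cer = c / (c + d)                    # control event rate
eer = a / (a + b)                    # experimental event rate
arr = cer - eer                      # absolute risk reduction
rrr = arr / cer                      # relative risk reduction
nnt = 1.0 / arr                      # number needed to treat

# 95% CI on the ARR via the normal approximation; inverting it bounds the NNT.
se = math.sqrt(eer * (1 - eer) / (a + b) + cer * (1 - cer) / (c + d))
lo, hi = arr - 1.96 * se, arr + 1.96 * se

print(f"ARR={arr:.3f} ({lo:.3f} to {hi:.3f}), RRR={rrr:.2f}, NNT={nnt:.1f}")
```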
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baker, Ronald J.; Reilly, Timothy J.; Lopez, Anthony
2015-09-15
Highlights: • A spreadsheet-based risk screening tool for groundwater affected by landfills is presented. • Domenico solute transport equations are used to estimate downgradient contaminant concentrations. • Landfills are categorized as presenting high, moderate or low risks. • Analysis of parameter sensitivity and examples of the method's application are given. • The method has value to regulators and those considering redeveloping closed landfills. - Abstract: A screening tool for quantifying levels of concern for contaminants detected in monitoring wells on or near landfills to down-gradient receptors (streams, wetlands and residential lots) was developed and evaluated. The tool uses Quick Domenico Multi-scenario (QDM), a spreadsheet implementation of Domenico-based solute transport, to estimate concentrations of contaminants reaching receptors under steady-state conditions from a constant-strength source. Unlike most other available Domenico-based model applications, QDM calculates the time for down-gradient contaminant concentrations to approach steady state and appropriate dispersivity values, and allows for up to fifty simulations on a single spreadsheet. Sensitivity of QDM solutions to critical model parameters was quantified. The screening tool uses QDM results to categorize landfills as having high, moderate and low levels of concern, based on contaminant concentrations reaching receptors relative to regulatory concentrations. The application of this tool was demonstrated by assessing levels of concern (as defined by the New Jersey Pinelands Commission) for thirty closed, uncapped landfills in the New Jersey Pinelands National Reserve, using historic water-quality data from monitoring wells on and near landfills and hydraulic parameters from regional flow models. Twelve of these landfills are categorized as having high levels of concern, indicating a need for further assessment. This tool is not a replacement for a conventional numerically-based transport model or other available Domenico-based applications, but is suitable for quickly assessing the level of concern posed by a landfill or other contaminant point source before expensive and lengthy monitoring or remediation measures are taken. In addition to quantifying the level of concern using historic groundwater-monitoring data, the tool allows for archiving model scenarios and adding refinements as new data become available.
2008-09-01
used in the analysis. The analytical approach assumes steady-state, summer conditions applied to a continuously stirred tank reactor (CSTR). CSTR...constituent due to the fully mixed CSTR assumption. Thus, there is no spatial dimensionality. The DO CSTR model is solved using a spreadsheet...For this study, the CSTR represents the bottom meter of water along the reach of the channel being assessed. A unit bottom layer thickness of 1 m
Oakes, J M; Feldman, H A
2001-02-01
Nonequivalent controlled pretest-posttest designs are central to evaluation science, yet no practical and unified approach for estimating power in the two most widely used analytic approaches to these designs exists. This article fills the gap by presenting and comparing useful, unified power formulas for ANCOVA and change-score analyses, indicating the implications of each on sample-size requirements. The authors close with practical recommendations for evaluators. Mathematical details and a simple spreadsheet approach are included in appendices.
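The contrast the authors formalize comes down to the error variance each analysis leaves: ANCOVA scales sigma^2 by (1 - rho^2), while a change-score analysis scales it by 2(1 - rho), where rho is the pretest-posttest correlation. A sketch of per-group sample size under standard two-group normal-theory assumptions (not the authors' exact formulas):

```python
import math
from statistics import NormalDist

# Per-group sample size for a two-group pretest-posttest design.
# ANCOVA residual variance:  sigma^2 * (1 - rho^2)
# Change-score variance:     sigma^2 * 2 * (1 - rho)

def n_per_group(delta, sigma, rho, method, alpha=0.05, power=0.80):
    z = NormalDist()
    z_a, z_b = z.inv_cdf(1 - alpha / 2), z.inv_cdf(power)
    if method == "ancova":
        var = sigma ** 2 * (1 - rho ** 2)
    else:  # change-score (posttest minus pretest) analysis
        var = sigma ** 2 * 2 * (1 - rho)
    return math.ceil(2 * var * (z_a + z_b) ** 2 / delta ** 2)

for rho in (0.3, 0.6, 0.9):
    print(rho, n_per_group(5, 10, rho, "ancova"),
          n_per_group(5, 10, rho, "change"))
```

Under these assumptions ANCOVA never needs more subjects than the change-score analysis, since (1 - rho^2) < 2(1 - rho) whenever rho < 1.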
Economic analysis of recycling contaminated concrete
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stephen, A.; Ayers, K.W.; Boren, J.K.
1997-02-01
Decontamination and decommissioning activities in the DOE complex generate large volumes of radioactively contaminated and uncontaminated concrete. Currently, this concrete is usually decontaminated, the contaminated waste is disposed of in an LLW facility, and the decontaminated concrete is placed in C&D landfills. A number of alternatives to this practice are available, including recycling of the concrete. Cost estimates for six alternatives were developed using a spreadsheet model. The results of this analysis show that recycling alternatives are at least as economical as current practice.
Survey Costs and Errors: User’s Manual for the Lotus 1-2-3 Spreadsheet
1991-04-01
select appropriate options such as the use of a business reply envelope or a self -addressed, stamped envelope for returning mailed surveys. Recruit. T... self -explanatory and need not be discussed here. Mode/Systematic Automatically enter ALL time and cost estimates for a survey project. "Time and cost...user can choose between a business reply envelope (BRE) or a self -addressed, stamped envelope (SASE) for returning the surveys. For mail surveys, the
ABC estimation of unit costs for emergency department services.
Holmes, R L; Schroeder, R E
1996-04-01
Rapid evolution of the health care industry forces managers to make cost-effective decisions. Typical hospital cost accounting systems do not provide emergency department managers with the information needed, but emergency department settings are so complex and dynamic as to make the more accurate activity-based costing (ABC) system prohibitively expensive. Through judicious use of the available traditional cost accounting information and simple computer spreadsheets, managers may approximate the decision-guiding information that would result from the much more costly and time-consuming implementation of ABC.
Do Vampires Exist? Using Spreadsheets To Investigate a Common Folktale.
ERIC Educational Resources Information Center
Drier, Hollylynne Stohl
1999-01-01
Describes the use of spreadsheets in a third grade class to teach basic mathematical concepts by investigating the existence of vampires. Incorporates addition and multiplication skills, patterning, variables, formulas, exponential growth, and proof by contradiction. (LRW)
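The proof by contradiction rests on a doubling argument: if each vampire fed monthly and every victim became a vampire, the vampire population would double each month and soon exceed any historical human population. A two-line sketch of the spreadsheet's exponential-growth column:

```python
# Doubling argument behind the folktale investigation: one vampire,
# doubling monthly, exceeds a 1600s-era world population (~500 million)
# in under 30 months -- hence the contradiction.
population = 500_000_000
vampires, months = 1, 0
while vampires < population:
    vampires *= 2
    months += 1
print(months, vampires)   # 29 months
```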
The Computer Bulletin Board. Modified Gran Plots of Very Weak Acids on a Spreadsheet.
ERIC Educational Resources Information Center
Chau, F. T.; And Others
1990-01-01
Presented are two applications of computer technology to chemistry instruction: the use of a spreadsheet program to analyze acid-base titration curves and the use of database software to catalog stockroom inventories. (CW)
This page provides information and access to Standard Evaluation Procedures (SEPs) and Data Entry Spreadsheet Templates (DESTs) developed by EPA's Office of Chemical Safety and Pollution Prevention (OCSPP).
Computer Corner: Spreadsheets, Power Series, Generating Functions, and Integers.
ERIC Educational Resources Information Center
Snow, Donald R.
1989-01-01
Implements a table algorithm on a spreadsheet program and obtains functions for several number sequences such as the Fibonacci and Catalan numbers. Considers other applications of the table algorithm to integers represented in various number bases. (YP)
ERIC Educational Resources Information Center
Carson, S. R.
1998-01-01
Presents a method for using spreadsheets to model special relativistic phenomena based on the connection between electric and magnetic fields in special relativity. Uses the time dilation equation to carry out transformations between reference frames that show the connection between the fields quantitatively. (DDR)
ERIC Educational Resources Information Center
Ivancevich, Daniel M.; And Others
1996-01-01
Points out that political and economic pressures have sometimes caused the Financial Accounting Standards Board to alter standards. Presents a spreadsheet tool that demonstrates the economic consequences of adopting accounting standards. (SK)
76 FR 34124 - Civil Supersonic Aircraft Panel Discussion
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-10
... and continuing to the second line in the second column, the Web site address should read as follows: https://spreadsheets.google.com/spreadsheet/viewform?formkey=dEFEdlRnYzBiaHZtTUozTHVtbkF4d0E6MQ . [FR...
Assessment of ODOT culvert load rating spreadsheets for use in Michigan.
DOT National Transportation Integrated Search
2013-01-01
The project Assessment of ODOT Culvert Load Rating Spreadsheets for Use in Michigan was a short time-frame project funded by the Michigan Department of Transportation (MDOT) through the Center for Structural Durability (CSD) at Michigan Tec...
A TOOL FOR PLANNING AERIAL PHOTOGRAPHY
The U.S. EPA's Pacific Coastal Ecology Branch has developed a tool in the form of an Excel spreadsheet that facilitates planning aerial photography missions. The spreadsheet accepts various input parameters such as desired photo-scale and boundary coordinates of the stud...
Hand, Maureen; Augustine, Chad; Feldman, David; Kurup, Parthiv; Beiter, Philipp; O'Connor, Patrick
2017-08-21
Each year since 2015, NREL has presented the Annual Technology Baseline (ATB) in a spreadsheet that contains detailed cost and performance data (both current and projected) for renewable and conventional technologies. The spreadsheet includes a workbook for each technology. This spreadsheet provides data for the 2017 ATB. In this edition of the ATB, offshore wind power has been updated to include 15 technical resource groups, and two options are now provided for representing market conditions for project financing: current market conditions and long-term historical conditions. For more information, see https://atb.nrel.gov/.
NASA Astrophysics Data System (ADS)
Conrad, Jon M.
1999-10-01
Resource Economics is a text for students with a background in calculus, intermediate microeconomics, and a familiarity with the spreadsheet software Excel. The book covers basic concepts, shows how to set up spreadsheets to solve dynamic allocation problems, and presents economic models for fisheries, forestry, nonrenewable resources, stock pollutants, option value, and sustainable development. Within the text, numerical examples are posed and solved using Excel's Solver. Through these examples and additional exercises at the end of each chapter, students can make dynamic models operational, develop their economic intuition, and learn how to set up spreadsheets for the simulation of optimization of resource and environmental systems.
NASA Astrophysics Data System (ADS)
Gaik Tay, Kim; Cheong, Tau Han; Foong Lee, Ming; Kek, Sie Long; Abdul-Kahar, Rosmila
2017-08-01
In previous work on an Euler's spreadsheet calculator for solving an ordinary differential equation, Visual Basic for Applications (VBA) programming was used; however, no graphical user interface was developed to capture users' input. This weakness may confuse users, since input and output are displayed in the same worksheet. Besides, the existing Euler's spreadsheet calculator is not interactive, as there is no prompt message if there is a mistake in inputting the parameters. On top of that, there are no user instructions to guide users in inputting the derivative function. Hence, in this paper, we improve on these limitations by developing a user-friendly and interactive graphical user interface. This improvement aims to capture users' input with accompanying instructions and interactive error messages implemented in VBA. This Euler's graphical user interface spreadsheet calculator does not act as a black box: users can click on any cell in the worksheet to see the formula used to implement the numerical scheme. In this way, it can enhance self-learning and life-long learning in implementing the numerical scheme in a spreadsheet and later in any programming language.
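Behind the interface is the explicit Euler scheme y_{n+1} = y_n + h * f(x_n, y_n). A minimal sketch of the iteration the worksheet cells implement:

```python
# Explicit Euler scheme for y' = f(x, y): y_{n+1} = y_n + h * f(x_n, y_n).

def euler(f, x0, y0, h, n_steps):
    x, y = x0, y0
    rows = [(x, y)]                 # analogous to the worksheet rows
    for _ in range(n_steps):
        y += h * f(x, y)
        x += h
        rows.append((x, y))
    return rows

# Example: y' = y with y(0) = 1; the exact answer at x = 1 is e ~ 2.71828.
for x, y in euler(lambda x, y: y, 0.0, 1.0, 0.1, 10)[-3:]:
    print(round(x, 1), round(y, 5))   # Euler gives ~2.59374 at x = 1.0
```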
Kelly, Christopher; Pashayan, Nora; Munisamy, Sreetharan; Powles, John W
2009-06-30
Our aim was to estimate the burden of fatal disease attributable to excess adiposity in England and Wales in 2003 and 2015 and to explore the sensitivity of the estimates to the assumptions and methods used. A spreadsheet implementation of the World Health Organization's (WHO) Comparative Risk Assessment (CRA) methodology for continuously distributed exposures was used. For our base case, adiposity-related risks were assumed to be minimal at a mean (SD) BMI of 21 (1) kg m-2. All-cause mortality risks for 2015 were taken from the Government Actuary and alternative compositions by cause derived. Disease-specific relative risks by BMI were taken from the CRA project and varied in sensitivity analyses. Under base-case methods and assumptions for 2003, approximately 41,000 deaths and a loss of 1.05 years of life expectancy were attributed to excess adiposity. Seventy-seven percent of all diabetic deaths, 23% of all ischaemic heart disease deaths and 14% of all cerebrovascular disease deaths were attributed to excess adiposity. Predictions for 2015 were found to be more sensitive to assumptions about the future course of mortality risks for diabetes than to variation in the assumed trend in BMI. On less favourable assumptions the attributable loss of life expectancy in 2015 would rise modestly to 1.28 years. Excess adiposity appears to contribute materially but modestly to mortality risks in England and Wales, and this contribution is likely to increase in the future. Uncertainty centres on future trends of associated diseases, especially diabetes. The robustness of these estimates is limited by the lack of control for correlated risks by stratification and by the empirical uncertainty surrounding the effects of prolonged excess adiposity beginning in adolescence.
Using a Spreadsheet To Explore Melting, Dissolving and Phase Diagrams.
ERIC Educational Resources Information Center
Goodwin, Alan
2002-01-01
Compares phase diagrams relating to the solubilities and melting points of various substances in textbooks with those generated by a spreadsheet using data from the literature. Argues that differences between the diagrams give rise to new chemical insights. (Author/MM)
The Use of Lotus 1-2-3 Macros in Engineering Calculations.
ERIC Educational Resources Information Center
Rosen, Edward M.
1990-01-01
Described are the use of spreadsheet programs in chemical engineering calculations using Lotus 1-2-3 macros. Discusses the macro commands, subroutine operations, and solution of partial differential equation. Provides examples of the subroutine programs and spreadsheet solution. (YP)
Academic Testing and Grading with Spreadsheet Software.
ERIC Educational Resources Information Center
Ho, James K.
1987-01-01
Explains how spreadsheet software can be used in the design and grading of academic tests and in assigning grades. Macro programs and menu-driven software are highlighted and an example using IBM PCs and Lotus 1-2-3 software is given. (Author/LRW)
Design and Evaluation of a Personal Diffusion Battery
Vosburgh, Donna J. H.; Klein, Timothy; Sheehan, Maura; Anthony, T. Renee; Peters, Thomas M.
2016-01-01
A four-stage personal diffusion battery (pDB) was designed and constructed to measure submicron particle size distributions. The pDB consisted of a screen-type diffusion battery, solenoid valve system, and electronic controller. A data inversion spreadsheet was created to solve for the number median diameter (NMD), geometric standard deviation (GSD), and particle number concentration of unimodal aerosols using stage number concentrations from the pDB combined with a handheld condensation particle counter (pDB+CPC). The inversion spreadsheet included particle entry losses, theoretical penetrations across screens, the detection efficiency of the CPC, and constraints so the spreadsheet solved to values within the pDB range. Size distribution parameters (NMD, GSD, and number concentration) measured with the pDB+CPC with inversion spreadsheet were within 25% of those measured with a scanning mobility particle sizer (SMPS) for 5 of 12 polydisperse combustion aerosols. For three tests conducted with propylene torch exhaust, the pDB+CPC with inversion spreadsheet successfully identified that the NMD was smaller than the constraint value of 16 nm. The ratio of the nanoparticle portion of the aerosol compared to the reference (R nano) was calculated to determine the ability of pDB+CPC with inversion spreadsheet to measure the nanoparticle portion of the aerosols. The R nano ranged from 0.87 to 1.01 when the inversion solved and from 0.06 to 2.01 when the inversion solved to a constraint. The pDB combined with CPC has limited use as a personal monitor but combining the pDB with a different detector would allow for the pDB to be used as a personal monitor. PMID:26900207
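A rough sketch of the kind of inversion such a spreadsheet performs follows, with a deliberately simplified stand-in for the screen-penetration theory and invented stage factors and concentrations; the authors' actual penetration equations, entry-loss terms, and CPC efficiency are not reproduced here.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.stats import lognorm

# Hypothetical stage deposition factors and measured concentrations (stand-ins).
stage_factor = np.array([0.5, 1.5, 3.0, 6.0])
measured = np.array([9500.0, 7200.0, 4400.0, 2100.0])   # #/cm3 downstream of each stage

d = np.logspace(0.0, 3.0, 400)   # particle diameter grid, nm

def penetration(diam_nm, factor):
    # Toy diffusion penetration: losses grow as particles shrink; this crude
    # d^(-2/3) scaling stands in for the real screen-penetration theory.
    return np.exp(-factor * (diam_nm / 100.0) ** (-2.0 / 3.0))

def model(params):
    nmd, gsd, ntot = params
    w = lognorm.pdf(d, s=np.log(gsd), scale=nmd)   # lognormal number distribution
    w /= w.sum()
    return np.array([ntot * (w * penetration(d, f)).sum() for f in stage_factor])

# Constrained least squares, with a 16 nm lower bound on NMD as in the paper.
fit = least_squares(lambda p: model(p) - measured, x0=[50.0, 2.0, 20000.0],
                    bounds=([16.0, 1.05, 0.0], [1000.0, 4.0, 1e7]))
nmd, gsd, ntot = fit.x
print(f"NMD = {nmd:.0f} nm, GSD = {gsd:.2f}, N = {ntot:.0f} /cm3")
```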
Simplified risk assessment of noise induced hearing loss by means of 2 spreadsheet models.
Lie, Arve; Engdahl, Bo; Tambs, Kristian
2016-11-18
The objective of this study has been to test 2 spreadsheet models to compare the observed with the expected hearing loss for a Norwegian reference population. The prevalence rates of the Norwegian and the National Institute for Occupational Safety and Health (NIOSH) definitions of hearing outcomes were calculated in terms of sex and age, 20-64 years old, for a screened (with no occupational noise exposure) (N = 18 858) and unscreened (N = 38 333) Norwegian reference population from the Nord-Trøndelag Hearing Loss Study (NTHLS). Based on the prevalence rates, 2 different spreadsheet models were constructed in order to compare the prevalence rates of various groups of workers with the expected rates. The spreadsheets were then tested on 10 different occupational groups with varying degrees of hearing loss as compared to a reference population. Hearing of office workers, train drivers, conductors and teachers differed little from the screened reference values based on the Norwegian and the NIOSH criterion. The construction workers, miners, farmers and military had an impaired hearing and railway maintenance workers and bus drivers had a mildly impaired hearing. The spreadsheet models give a valid assessment of the hearing loss. The use of spreadsheet models to compare hearing in occupational groups with that of a reference population is a simple and quick method. The results are in line with comparable hearing thresholds, and allow for significance testing. The method is believed to be useful for occupational health services in the assessment of risk of noise induced hearing loss (NIHL) and the preventive potential in groups of noise-exposed workers. Int J Occup Med Environ Health 2016;29(6):991-999. This work is available in Open Access model and licensed under a CC BY-NC 3.0 PL license.
NASA Astrophysics Data System (ADS)
McGrath, H.; Stefanakis, E.; Nastev, M.
2016-06-01
Conventional knowledge of the flood hazard alone (extent and frequency) is not sufficient for informed decision-making. The public safety community needs tools and guidance to adequately undertake flood hazard risk assessment in order to estimate respective damages and social and economic losses. While many complex computer models have been developed for flood risk assessment, they require highly trained personnel to prepare the necessary input (hazard, inventory of the built environment, and vulnerabilities) and analyze model outputs. As such, tools which utilize open-source software or are built within popular desktop software programs are appealing alternatives. The recently developed Rapid Risk Evaluation (ER2) application runs scenario-based loss assessment analyses in a Microsoft Excel spreadsheet. User input is limited to a handful of intuitive drop-down menus utilized to describe the building type, age, occupancy and the expected water level. In anticipation of local depth-damage curves and other needed vulnerability parameters, those from the U.S. FEMA's Hazus-Flood software have been imported and temporarily accessed in conjunction with user input to display exposure and estimated economic losses related to the structure and the content of the building. Building types and occupancies representative of those most exposed to flooding in Fredericton (New Brunswick) were introduced and test flood scenarios were run. The algorithm was successfully validated against results from the Hazus-Flood model for the same building types and flood depths.
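The core of such a loss estimate is interpolation on a depth-damage curve. A minimal sketch follows; the curve points are placeholders, not actual Hazus-Flood values.

```python
import numpy as np

# Illustrative depth-damage curve: percent damage to the structure vs.
# flood depth in metres (placeholder points, not Hazus-Flood values).
depths = np.array([0.0, 0.3, 0.6, 1.0, 1.5, 2.0])
damage_pct = np.array([0.0, 12.0, 22.0, 33.0, 44.0, 52.0])

def estimated_loss(water_depth_m, building_value):
    """Interpolate the curve at the scenario water level and scale by value."""
    pct = np.interp(water_depth_m, depths, damage_pct)
    return building_value * pct / 100.0

print(estimated_loss(0.8, 250_000))  # e.g. one building, 0.8 m of water
```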
Light-Duty Automotive Technology, Carbon Dioxide Emissions, and Fuel Economy Trends Data
The Light-Duty Automotive Technology, Carbon Dioxide Emissions, and Fuel Economy Trends report is the authoritative reference for carbon dioxide (CO2) emissions, fuel economy, and powertrain technology trends for new personal vehicles in the United States. The "Trends" report has been published annually since 1975 and covers all passenger cars, sport utility vehicles, minivans, and all but the largest pickup trucks and vans. This report does not provide formal compliance values for EPA CO2 emissions standards and NHTSA CAFE standards. The downloadable data are available in PDF or spreadsheet (XLS) formats.
This SOP describes the method used to automatically parse analytical data generated from gas chromatography/mass spectrometry (GC/MS) analyses into CTEPP summary spreadsheets and electronically import the summary spreadsheets into the CTEPP study database.
A Spreadsheet-based GIS tool for planning aerial photography
The U.S. EPA's Pacific Coastal Ecology Branch has developed a tool which facilitates planning aerial photography missions. This tool is an Excel spreadsheet which accepts various input parameters such as desired photo-scale and boundary coordinates of the study area and compiles ...
NASA Astrophysics Data System (ADS)
Conrad, Jon M.
2000-01-01
Resource Economics is a text for students with a background in calculus, intermediate microeconomics, and a familiarity with the spreadsheet software Excel. The book covers basic concepts, shows how to set up spreadsheets to solve dynamic allocation problems, and presents economic models for fisheries, forestry, nonrenewable resources, stock pollutants, option value, and sustainable development. Within the text, numerical examples are posed and solved using Excel's Solver. These problems help make concepts operational, develop economic intuition, and serve as a bridge to the study of real-world problems of resource management. Through these examples and additional exercises at the end of Chapters 1 to 8, students can make dynamic models operational, develop their economic intuition, and learn how to set up spreadsheets for the simulation and optimization of resource and environmental systems. The book is unique in its use of spreadsheet software (Excel) to solve dynamic allocation problems. Conrad is co-author of a previous book for the Press on the subject for graduate students. The approach is extremely student-friendly, giving students the tools to apply research results to actual environmental issues.
Timme, Ruth E; Rand, Hugh; Shumway, Martin; Trees, Eija K; Simmons, Mustafa; Agarwala, Richa; Davis, Steven; Tillman, Glenn E; Defibaugh-Chavez, Stephanie; Carleton, Heather A; Klimke, William A; Katz, Lee S
2017-01-01
As next generation sequence technology has advanced, there have been parallel advances in genome-scale analysis programs for determining evolutionary relationships as proxies for epidemiological relationship in public health. Most new programs skip traditional steps of ortholog determination and multi-gene alignment, instead identifying variants across a set of genomes, then summarizing results in a matrix of single-nucleotide polymorphisms or alleles for standard phylogenetic analysis. However, public health authorities need to document the performance of these methods with appropriate and comprehensive datasets so they can be validated for specific purposes, e.g., outbreak surveillance. Here we propose a set of benchmark datasets to be used for comparison and validation of phylogenomic pipelines. We identified four well-documented foodborne pathogen events in which the epidemiology was concordant with routine phylogenomic analyses (reference-based SNP and wgMLST approaches). These are ideal benchmark datasets, as the trees, WGS data, and epidemiological data for each are all in agreement. We have placed these sequence data, sample metadata, and "known" phylogenetic trees in publicly-accessible databases and developed a standard descriptive spreadsheet format describing each dataset. To facilitate easy downloading of these benchmarks, we developed an automated script that uses the standard descriptive spreadsheet format. Our "outbreak" benchmark datasets represent the four major foodborne bacterial pathogens (Listeria monocytogenes, Salmonella enterica, Escherichia coli, and Campylobacter jejuni) and one simulated dataset where the "known tree" can be accurately called the "true tree". The downloading script and associated table files are available on GitHub: https://github.com/WGS-standards-and-analysis/datasets. These five benchmark datasets will help standardize comparison of current and future phylogenomic pipelines, and facilitate important cross-institutional collaborations. Our work is part of a global effort to provide collaborative infrastructure for sequence data and analytic tools; we welcome additional benchmark datasets in our recommended format, and, if relevant, we will add these on our GitHub site. Together, these datasets, dataset format, and the underlying GitHub infrastructure present a recommended path for worldwide standardization of phylogenomic pipelines.
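A minimal sketch of a spreadsheet-driven downloader in this spirit follows, assuming a hypothetical two-column tab-separated layout (label, url); the real repository defines its own richer descriptive format.

```python
import csv
import urllib.request
from pathlib import Path

def fetch_benchmark(table_path: str, out_dir: str = "benchmark") -> None:
    """Download every file listed in a tab-separated table with a header
    row of 'label' and 'url' columns (a hypothetical layout)."""
    Path(out_dir).mkdir(exist_ok=True)
    with open(table_path, newline="") as handle:
        for row in csv.DictReader(handle, delimiter="\t"):
            target = Path(out_dir) / row["label"]
            if not target.exists():            # skip files already fetched
                urllib.request.urlretrieve(row["url"], target)

# fetch_benchmark("dataset.tsv")
```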
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gold, Lois Swirsky; Manley, Neela B.; Slone, Thomas H.
2005-04-08
The Carcinogenic Potency Database (CPDB) is a systematic and unifying resource that standardizes the results of chronic, long-term animal cancer tests which have been conducted since the 1950s. The analyses include sufficient information on each experiment to permit research into many areas of carcinogenesis. Both qualitative and quantitative information is reported on positive and negative experiments that meet a set of inclusion criteria. A measure of carcinogenic potency, TD50 (daily dose rate in mg/kg body weight/day to induce tumors in half of test animals that would have remained tumor-free at zero dose), is estimated for each tissue-tumor combination reported. This article is the ninth publication of a chronological plot of the CPDB; it presents results on 560 experiments of 188 chemicals in mice, rats, and hamsters from 185 publications in the general literature updated through 1997, and from 15 Reports of the National Toxicology Program in 1997-1998. The test agents cover a wide variety of uses and chemical classes. The CPDB Web Site (http://potency.berkeley.edu/) presents the combined database of all published plots in a variety of formats as well as summary tables by chemical and by target organ, supplemental materials on dosing and survival, a detailed guide to using the plot formats, and documentation of methods and publications. The overall CPDB, including the results in this article, presents easily accessible results of 6153 experiments on 1485 chemicals from 1426 papers and 429 NCI/NTP (National Cancer Institute/National Toxicology program) Technical Reports. A tab-separated format of the full CPDB for reading the data into spreadsheets or database applications is available on the Web Site.
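As a worked illustration of the TD50 definition above, the following sketch estimates TD50 under a simple one-hit dose-response model, P(d) = 1 - exp(-b*d), for which the half-incidence dose is ln(2)/b; the incidence data are invented, not CPDB values.

```python
import numpy as np

doses = np.array([0.0, 1.0, 3.0, 10.0])      # mg/kg body weight/day
tumor = np.array([0.02, 0.10, 0.26, 0.58])   # lifetime tumor incidence (illustrative)

# Crude least-squares slope of -ln(1 - P) against dose (one-hit model).
b = np.polyfit(doses, -np.log(1.0 - tumor), 1)[0]
print(f"TD50 ~ {np.log(2.0) / b:.1f} mg/kg/day")
```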
Buffer$--An Economic Analysis Tool
Gary Bentrup
2007-01-01
Buffer$ is an economic spreadsheet tool for analyzing the cost-benefits of conservation buffers by resource professionals. Conservation buffers are linear strips of vegetation managed for multiple landowner and societal objectives. The Microsoft Excel based spreadsheet can calculate potential income derived from a buffer, including income from cost-share/incentive...
A spreadsheet that calculates meteor orbits
NASA Astrophysics Data System (ADS)
Langbroek, M.
2004-08-01
The author has written an MS Excel spreadsheet application called Metorb08.xls which calculates a meteor's orbital elements from its apparent radiant position and initial speed. It can be downloaded from URL http://home.wanadoo.nl/marco.langbroek along with a suite of other meteor-related Excel applications.
Teaching Science and Mathematics Subjects Using the Excel Spreadsheet Package
ERIC Educational Resources Information Center
Ibrahim, Dogan
2009-01-01
The teaching of scientific subjects usually require laboratories where students can put the theory they have learned into practice. Traditionally, electronic programmable calculators, dedicated software, or expensive software simulation packages, such as MATLAB have been used to simulate scientific experiments. Recently, spreadsheet programs have…
Introduction to Classroom Sprego
ERIC Educational Resources Information Center
Csernoch, Mária; Biró, Piroska
2016-01-01
Sprego is programming with spreadsheet functions. The present paper provides introductory Sprego examples which have so far only been available in Hungarian. Spreadsheet environments offer both a programming tool which best serves beginner and end-user programmers' interest, and an approach which lightens the burden of coding and language details.…
The Devil and Daniel's Spreadsheet
ERIC Educational Resources Information Center
Burke, Maurice J.
2012-01-01
"When making mathematical models, technology is valuable for varying assumptions, exploring consequences, and comparing predictions with data," notes the Common Core State Standards Initiative (2010, p. 72). This exploration of the recursive process in the Devil and Daniel Webster problem reveals that the symbolic spreadsheet fits this bill.…
Hydrogen Financial Analysis Scenario Tool (H2FAST) Documentation
Documentation is provided for the web and spreadsheet versions of H2FAST: the H2FAST Web Tool User's Manual and the H2FAST Spreadsheet Tool User's Manual (draft). Send questions or feedback about H2FAST to H2FAST@nrel.gov.
Farrance, Ian; Frenkel, Robert
2014-01-01
The Guide to the Expression of Uncertainty in Measurement (usually referred to as the GUM) provides the basic framework for evaluating uncertainty in measurement. The GUM however does not always provide clearly identifiable procedures suitable for medical laboratory applications, particularly when internal quality control (IQC) is used to derive most of the uncertainty estimates. The GUM modelling approach requires advanced mathematical skills for many of its procedures, but Monte Carlo simulation (MCS) can be used as an alternative for many medical laboratory applications. In particular, calculations for determining how uncertainties in the input quantities to a functional relationship propagate through to the output can be accomplished using a readily available spreadsheet such as Microsoft Excel. The MCS procedure uses algorithmically generated pseudo-random numbers which are then forced to follow a prescribed probability distribution. When IQC data provide the uncertainty estimates the normal (Gaussian) distribution is generally considered appropriate, but MCS is by no means restricted to this particular case. With input variations simulated by random numbers, the functional relationship then provides the corresponding variations in the output in a manner which also provides its probability distribution. The MCS procedure thus provides output uncertainty estimates without the need for the differential equations associated with GUM modelling. The aim of this article is to demonstrate the ease with which Microsoft Excel (or a similar spreadsheet) can be used to provide an uncertainty estimate for measurands derived through a functional relationship. In addition, we also consider the relatively common situation where an empirically derived formula includes one or more ‘constants’, each of which has an empirically derived numerical value. Such empirically derived ‘constants’ must also have associated uncertainties which propagate through the functional relationship and contribute to the combined standard uncertainty of the measurand. PMID:24659835
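The same Monte Carlo propagation is easy to sketch outside a spreadsheet. The example below pushes three normally distributed inputs, one of them an empirically derived 'constant' with its own uncertainty, through a functional relationship; all means and standard uncertainties are illustrative placeholders.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000  # number of Monte Carlo trials

# Example measurand y = a * b / c with illustrative inputs.
a = rng.normal(140.0, 2.0, n)
b = rng.normal(0.85, 0.02, n)   # an empirical 'constant' with its own uncertainty
c = rng.normal(70.0, 1.5, n)

y = a * b / c
print(f"y = {y.mean():.2f}, combined standard uncertainty = {y.std(ddof=1):.2f}")
print("95% interval:", np.percentile(y, [2.5, 97.5]).round(2))
```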
NASA Astrophysics Data System (ADS)
Zou, Yan-Rong; Wang, Lianyuan; Shuai, Yanhua; Peng, Ping'an
2005-08-01
A new kinetic model and an Excel spreadsheet program for modeling the stable carbon isotope composition of natural gases is provided in this paper. The model and spreadsheet could be used to describe and predict the variations in the stable carbon isotope composition of natural gases under both experimental and geological conditions with heating temperature or geological time. It is a user-friendly, convenient tool for the modeling of isotope variation with time under experimental and geological conditions. The spreadsheet, based on experimental data, requires the input of the kinetic parameters of gaseous hydrocarbons generation. Some assumptions are made in this model: the conventional (non-isotope species) kinetic parameters represent the light isotope species; the initial isotopic value is the same for all parallel chemical reactions of gaseous hydrocarbon generation, for simplicity; the pre-exponential factor ratio, 13A/12A, is a constant; and both heavy and light isotope species have similar activation energy distributions. These assumptions are common in modeling of isotope ratios. The spreadsheet is used for searching the best kinetic parameters of the heavy isotope species to reach the minimum errors compared with experimental data, and then extrapolating isotopic changes to the thermal history of sedimentary basins. A short calculation example on the variation in δ13C values of methane is provided in this paper to show application to geological conditions.
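A compact sketch of a parallel-reaction isotope model built on those stated assumptions follows; the activation energies, channel weights, 13A/12A ratio, source δ13C, and heating rate are placeholders, not the paper's fitted parameters.

```python
import numpy as np

R = 8.314e-3   # gas constant, kJ/(mol K)

# Illustrative kinetic inputs: parallel first-order channels sharing one
# activation-energy distribution, a single 12C pre-exponential factor,
# a constant 13A/12A ratio, and one source d13C value.
E = np.array([210.0, 230.0, 250.0, 270.0])   # kJ/mol
f = np.array([0.2, 0.4, 0.3, 0.1])           # channel weights (sum to 1)
A12 = 1.0e14                                  # s^-1, light-isotope species
ratio_A = 0.998                               # assumed 13A/12A
d13C0 = -40.0                                 # per mil, source value

beta = 3.0 / 3.15e13                          # heating rate: 3 K/Myr in K/s
T = np.linspace(350.0, 550.0, 2001)           # temperature grid, K
dT = T[1] - T[0]

def extent(A):
    """Fraction reacted per channel: x(T) = 1 - exp(-(1/beta) * int k dT)."""
    k = A * np.exp(-E[None, :] / (R * T[:, None]))
    return 1.0 - np.exp(-np.cumsum(k, axis=0) * dT / beta)

x12 = extent(A12) @ f                 # cumulative light-isotope gas
x13 = extent(A12 * ratio_A) @ f       # cumulative heavy-isotope gas

r0 = (d13C0 / 1000.0 + 1.0) * 0.0112372          # initial 13C/12C (VPDB scale)
d13C = (r0 * x13 / x12 / 0.0112372 - 1.0) * 1000.0
print(np.round(d13C[::500], 2))       # d13C of cumulative gas as T increases
```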
When Spreadsheets Become Software - Quality Control Challenges and Approaches - 13360
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fountain, Stefanie A.; Chen, Emmie G.; Beech, John F.
2013-07-01
As part of a preliminary waste acceptance criteria (PWAC) development, several commercial models were employed, including the Hydrologic Evaluation of Landfill Performance model (HELP) [1], the Disposal Unit Source Term - Multiple Species model (DUSTMS) [2], and the Analytical Transient One, Two, and Three-Dimensional model (AT123D) [3]. The results of these models were post-processed in MS Excel spreadsheets to convert the model results to alternate units, compare the groundwater concentrations to the groundwater concentration thresholds, and then to adjust the waste contaminant masses (based on average concentration over the waste volume) as needed in an attempt to achieve groundwater concentrations at the limiting point of assessment that would meet the compliance concentrations while maximizing the potential use of the landfill (i.e., maximizing the volume of projected waste being generated that could be placed in the landfill). During the course of the PWAC calculation development, one of the Microsoft (MS) Excel spreadsheets used to post-process the results of the commercial model packages grew to include more than 575,000 formulas across 18 worksheets. This spreadsheet was used to assess six base scenarios as well as nine uncertainty/sensitivity scenarios. The complexity of the spreadsheet resulted in the need for a rigorous quality control (QC) procedure to verify data entry and confirm the accuracy of formulas. (authors)
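One way to mechanize part of such a QC sweep is to inventory every formula and flag breaks in fill-down patterns. The sketch below does this with openpyxl for a hypothetical workbook name; it is a generic heuristic, not the project's actual QC procedure.

```python
import re
from openpyxl import load_workbook

def formula_shape(text):
    """Strip row numbers so =A7+B7 and =A8+B8 compare equal."""
    return re.sub(r"(?<=[A-Z])\d+", "#", text)

wb = load_workbook("pwac_postprocess.xlsx")  # hypothetical file name
for ws in wb.worksheets:
    count, flagged = 0, []
    for col in ws.iter_cols():
        prev = None
        for cell in col:
            if isinstance(cell.value, str) and cell.value.startswith("="):
                count += 1
                shape = formula_shape(cell.value)
                if prev is not None and shape != prev:
                    flagged.append(cell.coordinate)  # possible broken fill-down
                prev = shape
            else:
                prev = None
    print(ws.title, count, "formulas;", len(flagged), "pattern breaks")
```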
Introducing Artificial Neural Networks through a Spreadsheet Model
ERIC Educational Resources Information Center
Rienzo, Thomas F.; Athappilly, Kuriakose K.
2012-01-01
Business students taking data mining classes are often introduced to artificial neural networks (ANN) through point and click navigation exercises in application software. Even if correct outcomes are obtained, students frequently do not obtain a thorough understanding of ANN processes. This spreadsheet model was created to illuminate the roles of…
Introducing Simulation via the Theory of Records
ERIC Educational Resources Information Center
Johnson, Arvid C.
2011-01-01
While spreadsheet simulation can be a useful method by which to help students to understand some of the more advanced concepts in an introductory statistics course, introducing the simulation methodology at the same time as these concepts can result in student cognitive overload. This article describes a spreadsheet model that has been…
Forming Conjectures within a Spreadsheet Environment
ERIC Educational Resources Information Center
Calder, Nigel; Brown, Tony; Hanley, Una; Darby, Susan
2006-01-01
This paper is concerned with the use of spreadsheets within mathematical investigational tasks. Considering the learning of both children and pre-service teaching students, it examines how mathematical phenomena can be seen as a function of the pedagogical media through which they are encountered. In particular, it shows how pedagogical apparatus…
Domestic Disasters and Geospatial Technology for the Defense Logistics Agency
2014-12-01
total distance traveled and satisfy all fuel demands. This report used the Vehicle Routing Problem (VRP) Spreadsheet Solver, developed by Erdogan (2013) and distributed through VeRoLog, the EURO Working Group on Vehicle Routing and Logistics.
Interactive Spreadsheets in JCE Webware
ERIC Educational Resources Information Center
Coleman, William F.; Fedosky, Edward W.
2005-01-01
A description of the Microsoft Excel spreadsheet simulation, Anharmonicity.xls that can be used to smoothly and continuously switch a plotted function and its quadratic approximation is presented. It can be used in a classroom demonstration or incorporated into a student-centered computer-laboratory exercise to examine the qualitative behavior of…
Spreadsheet Applications: Prototyping an Innovative Blended Course
ERIC Educational Resources Information Center
Baker, J. Howard
2004-01-01
After teaching the advanced spreadsheet course at a major university in Louisiana as a traditional classroom course for a number of years, it was decided to create a prototype-blended course, with a considerable portion offered via distance education. This research, which uses a prototyping methodology, is exploratory in nature. Prototyping can…
LOTUS 1-2-3 and Decision Support: Allocating the Monograph Budget.
ERIC Educational Resources Information Center
Perry-Holmes, Claudia
1985-01-01
Describes the use of electronic spreadsheet software for library decision support systems using personal computers. Discussion covers templates, formulas for allocating the materials budget, LOTUS 1-2-3 and budget allocations, choosing a formula, the spreadsheet itself, graphing capabilities, and advantages and disadvantages of templates. Six…
Triangular Plots and Spreadsheet Software.
ERIC Educational Resources Information Center
Holm, Paul Eric
1988-01-01
Describes how the limitations of the built-in graphics capabilities of spreadsheet software can be overcome by making full use of the flexibility of the grahics options. Uses triangular plots with labeled field boundaries produced using Lotus 1-2-3 to demonstrate these techniques and their use in teaching geology. (CW)
Calculating the Variables of Finance on a Spreadsheet.
ERIC Educational Resources Information Center
Rochowicz, John A., Jr.
The different approaches for solving problems and learning mathematics with technology are invaluable. This paper describes how to determine the variables of the ordinary annuity equation with a spreadsheet. Examples of future value of annuity, sinking fund annuity, the number of periods necessary for periodic payments plus interest to accumulate…
Hydroshear Simulation Lab Test 2
Bauer, Steve
2014-08-01
This data file is for test 2. In this test, a sample of granite with a pre-cut (man-made) fracture is confined, heated, and subjected to differential stress. The maximum temperature in this system development test is 95 °C. Test details are on the spreadsheets; note that there are 2 spreadsheets.
Buyers Guide: Communications Software--Overview; Ratings Digest; Reviews; Benchmarks.
ERIC Educational Resources Information Center
Lockwood, Russ; And Others
1988-01-01
Contains articles which review communications software. Includes "Crosstalk Mark 4," "ProComm," "Freeway Advanced," "Windows InTalk," "Relay Silver," and "Smartcom III." Compares the packages in terms of Text Proprietary, MCI Upload, Text ASCII, Spreadsheet Proprietary, Text XMODEM, Spreadsheet XMODEM, MCI Download, documentation, support and service, ease of use,…
Spreadsheet Analysis of Harvesting Systems
R.B. Rummer; B.L. Lanford
1987-01-01
Harvesting systems can be modeled and analyzed on microcomputers using commercially available "spreadsheet" software. The effect of system or external variables on the production rate or system cost can be evaluated and alternative systems can be easily examined. The tedious calculations associated with such analyses are performed by the computer. For users...
Constructing Meanings and Utilities within Algebraic Tasks
ERIC Educational Resources Information Center
Ainley, Janet; Bills, Liz; Wilson, Kirsty
2004-01-01
The Purposeful Algebraic Activity project aims to explore the potential of spreadsheets in the introduction to algebra and algebraic thinking. We discuss two sub-themes within the project: tracing the development of pupils' construction of meaning for variable from arithmetic-based activity, through use of spreadsheets, and into formal algebra,…
Mihalopoulos, Catherine; Cadilhac, Dominique A; Moodie, Marjory L; Dewey, Helen M; Thrift, Amanda G; Donnan, Geoffrey A; Carter, Robert C
2005-01-01
To outline the development, structure, data assumptions, and application of an Australian economic model for stroke (Model of Resource Utilization, Costs, and Outcomes for Stroke [MORUCOS]). The model has a linked spreadsheet format with four modules to describe the disease burden and treatment pathways, estimate prevalence-based and incidence-based costs, and derive life expectancy and quality of life consequences. The model uses patient-level, community-based, stroke cohort data and macro-level simulations. An interventions module allows options for change to be consistently evaluated by modifying aspects of the other modules. To date, model validation has included sensitivity testing, face validity, and peer review. Further validation of technical and predictive accuracy is needed. The generic pathway model was assessed by comparison with a stroke subtypes (ischemic, hemorrhagic, or undetermined) approach and used to determine the relative cost-effectiveness of four interventions. The generic pathway model produced lower costs compared with a subtypes version (total average first-year costs/case AUD$ 15,117 versus AUD$ 17,786, respectively). Optimal evidence-based uptake of anticoagulation therapy for primary and secondary stroke prevention and intravenous thrombolytic therapy within 3 hours of stroke were more cost-effective than current practice (base year, 1997). MORUCOS is transparent and flexible in describing Australian stroke care and can effectively be used to systematically evaluate a range of different interventions. Adjusting results to account for stroke subtypes, as they influence cost estimates, could enhance the generic model.
Lambert, Ronald J W; Mytilinaios, Ioannis; Maitland, Luke; Brown, Angus M
2012-08-01
This study describes a method to obtain parameter confidence intervals from the fitting of non-linear functions to experimental data, using the SOLVER and Analysis ToolPaK Add-In of the Microsoft Excel spreadsheet. Previously we have shown that Excel can fit complex multiple functions to biological data, obtaining values equivalent to those returned by more specialized statistical or mathematical software. However, a disadvantage of using the Excel method was the inability to return confidence intervals for the computed parameters or the correlations between them. Using a simple Monte-Carlo procedure within the Excel spreadsheet (without recourse to programming), SOLVER can provide parameter estimates (up to 200 at a time) for multiple 'virtual' data sets, from which the required confidence intervals and correlation coefficients can be obtained. The general utility of the method is exemplified by applying it to the analysis of the growth of Listeria monocytogenes, the growth inhibition of Pseudomonas aeruginosa by chlorhexidine and the further analysis of the electrophysiological data from the compound action potential of the rodent optic nerve. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
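A sketch of the same Monte-Carlo confidence-interval procedure outside Excel follows, using a logistic growth curve as a stand-in model and synthetic data; it mirrors the paper's SOLVER-based workflow (refitting up to 200 'virtual' datasets) rather than reproducing it.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)

def logistic(t, ymax, rate, lag):
    """Stand-in growth model; the paper's examples include microbial growth."""
    return ymax / (1.0 + np.exp(rate * (lag - t)))

t = np.linspace(0.0, 24.0, 25)
y = logistic(t, 9.0, 0.6, 8.0) + rng.normal(0, 0.15, t.size)  # synthetic data

popt, _ = curve_fit(logistic, t, y, p0=[8.0, 0.5, 6.0])
resid_sd = np.std(y - logistic(t, *popt), ddof=3)

# Monte-Carlo 'virtual' datasets, refit one by one as in the spreadsheet.
fits = np.array([curve_fit(logistic, t,
                           logistic(t, *popt) + rng.normal(0, resid_sd, t.size),
                           p0=popt)[0]
                 for _ in range(200)])

lo, hi = np.percentile(fits, [2.5, 97.5], axis=0)
print("95% CIs:", np.column_stack([lo, hi]).round(3))
print("parameter correlations:\n", np.corrcoef(fits.T).round(2))
```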
Incorporating Inquiry into Upper-Level Homework Assignments: The Mini-Journal
NASA Astrophysics Data System (ADS)
Whittington, A. G.; Speck, A. K.; Witzig, S. B.; Abell, S. K.
2009-12-01
The U.S. National Science Education Standards provide guidelines for teaching science through inquiry, where students actively develop their understanding of science by combining scientific knowledge with reasoning and thinking skills. Inquiry activities include reading scientific literature, generating hypotheses, designing and carrying out investigations, interpreting data, and formulating conclusions. Inquiry-based instruction emphasizes questions, evidence, and explanation, the essential features of inquiry. As part of an NSF-funded project, “CUES: Connecting Undergraduates to the Enterprise of Science,” new inquiry-based homework materials were developed for two upper-level classes at the University of Missouri: Geochemistry (required for Geology majors), and Solar System Science (open to seniors and graduate students, co-taught and cross-listed between Geology and Physics & Astronomy). We engage students in inquiry-based learning by presenting homework exercises as “mini-journal” articles that follow the format of a scientific journal article, including a title, authors, abstract, introduction, methods, results, discussion and citations to peer-reviewed literature. The mini-journal provides a scaffold and serves as a springboard for students to develop and carry out their own follow-up investigation. They then present their findings in the form of their own mini-journal. Mini-journals replace traditional homework problem sets with a format that more directly reflects and encourages scientific practice. Students are engaged in inquiry-based homework which encompass doing, thinking, and communicating, while the minijournal allows the instructor to contain lines of inquiry within the limits posed by available resources. In the examples we present, research is conducted via spreadsheet modeling, where the students develop their own spreadsheets. The key differences between the old and new formats include (i) the active participation of the students in defining the problem that they will pursue, (ii) the open-ended nature of the inquiry, such that students need to recognize when they have enough information to answer their question, (iii) presentation of results in graphical and tabular formats, and (iv) a written discussion of their findings. We present both the rationale for and concept of using mini-journal homeworks, and provide specific examples we are currently employing in classes. In addition, we explore the challenges (real and perceived) and successes associated with implementing such a technique, and examine student feedback comparing mini-journal and traditional homework formats from the same classes.
Tools for Management of Chlorinated Solvent - Contaminated Sites
2009-12-03
Equations are coded into the RT3D reaction module, with the spreadsheet representing the system as a series of CSTRs; the model assumes no NAPL is present (project ER-0623). Mechanics: MnO4 transport and consumption based on a series of CSTRs, with NOD kinetics identical to RT3D; includes a cost estimating tool.
Parkhurst, David L.; Appelo, C.A.J.
1999-01-01
PHREEQC version 2 is a computer program written in the C programming language that is designed to perform a wide variety of low-temperature aqueous geochemical calculations. PHREEQC is based on an ion-association aqueous model and has capabilities for (1) speciation and saturation-index calculations; (2) batch-reaction and one-dimensional (1D) transport calculations involving reversible reactions, which include aqueous, mineral, gas, solid-solution, surface-complexation, and ion-exchange equilibria, and irreversible reactions, which include specified mole transfers of reactants, kinetically controlled reactions, mixing of solutions, and temperature changes; and (3) inverse modeling, which finds sets of mineral and gas mole transfers that account for differences in composition between waters, within specified compositional uncertainty limits.New features in PHREEQC version 2 relative to version 1 include capabilities to simulate dispersion (or diffusion) and stagnant zones in 1D-transport calculations, to model kinetic reactions with user-defined rate expressions, to model the formation or dissolution of ideal, multicomponent or nonideal, binary solid solutions, to model fixed-volume gas phases in addition to fixed-pressure gas phases, to allow the number of surface or exchange sites to vary with the dissolution or precipitation of minerals or kinetic reactants, to include isotope mole balances in inverse modeling calculations, to automatically use multiple sets of convergence parameters, to print user-defined quantities to the primary output file and (or) to a file suitable for importation into a spreadsheet, and to define solution compositions in a format more compatible with spreadsheet programs. This report presents the equations that are the basis for chemical equilibrium, kinetic, transport, and inverse-modeling calculations in PHREEQC; describes the input for the program; and presents examples that demonstrate most of the program's capabilities.
Siebert, Janet C; Munsil, Wes; Rosenberg-Hasson, Yael; Davis, Mark M; Maecker, Holden T
2012-03-28
Systems-level approaches are increasingly common in both murine and human translational studies. These approaches employ multiple high information content assays. As a result, there is a need for tools to integrate heterogeneous types of laboratory and clinical/demographic data, and to allow the exploration of that data by aggregating and/or segregating results based on particular variables (e.g., mean cytokine levels by age and gender). Here we describe the application of standard data warehousing tools to create a novel environment for user-driven upload, integration, and exploration of heterogeneous data. The system presented here currently supports flow cytometry and immunoassays performed in the Stanford Human Immune Monitoring Center, but could be applied more generally. Users upload assay results contained in platform-specific spreadsheets of a defined format, and clinical and demographic data in spreadsheets of flexible format. Users then map sample IDs to connect the assay results with the metadata. An OLAP (on-line analytical processing) data exploration interface allows filtering and display of various dimensions (e.g., Luminex analytes in rows, treatment group in columns, filtered on a particular study). Statistics such as mean, median, and N can be displayed. The views can be expanded or contracted to aggregate or segregate data at various levels. Individual-level data is accessible with a single click. The result is a user-driven system that permits data integration and exploration in a variety of settings. We show how the system can be used to find gender-specific differences in serum cytokine levels, and compare them across experiments and assay types. We have used the tools and techniques of data warehousing, including open-source business intelligence software, to support investigator-driven data integration and mining of diverse immunological data.
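A toy analogue of the OLAP views described, using a pandas pivot table with invented immunoassay values; rows and columns can be swapped or extended to aggregate or segregate the data, much as in the interface.

```python
import pandas as pd

# Invented example records: one measured value per analyte/subject.
df = pd.DataFrame({
    "analyte":  ["IL-6", "IL-6", "TNFa", "TNFa", "IL-6", "TNFa"],
    "gender":   ["F", "M", "F", "M", "F", "M"],
    "age_band": ["<50", "<50", "<50", ">=50", ">=50", ">=50"],
    "value":    [3.1, 2.4, 7.8, 9.0, 4.2, 11.5],
})

# Rows = analyte, columns = gender, cells = mean and N; adding 'age_band'
# to either axis expands the view, dropping it contracts the view.
print(pd.pivot_table(df, values="value", index="analyte",
                     columns="gender", aggfunc=["mean", "count"]))
```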
Forming conjectures within a spreadsheet environment
NASA Astrophysics Data System (ADS)
Calder, Nigel; Brown, Tony; Hanley, Una; Darby, Susan
2006-12-01
This paper is concerned with the use of spreadsheets within mathematical investigational tasks. Considering the learning of both children and pre-service teaching students, it examines how mathematical phenomena can be seen as a function of the pedagogical media through which they are encountered. In particular, it shows how pedagogical apparatus influence patterns of social interaction, and how this interaction shapes the mathematical ideas that are engaged with. Notions of conjecture, along with the particular faculty of the spreadsheet setting, are considered with regard to the facilitation of mathematical thinking. Employing an interpretive perspective, a key focus is on how alternative pedagogical media and associated discursive networks influence the way that students form and test informal conjectures.
Rose, Adam; Avetisyan, Misak; Chatterjee, Samrat
2014-08-01
This article presents a framework for economic consequence analysis of terrorism countermeasures. It specifies major categories of direct and indirect costs, benefits, spillover effects, and transfer payments that must be estimated in a comprehensive assessment. It develops a spreadsheet tool for data collection, storage, and refinement, as well as estimation of the various components of the necessary economic accounts. It also illustrates the usefulness of the framework in the first assessment of the tradeoffs between enhanced security and changes in commercial activity in an urban area, with explicit attention to the role of spillover effects. The article also contributes a practical user interface to the model for emergency managers. © 2014 Society for Risk Analysis.
A toy model for the yield of a tamped fission bomb
NASA Astrophysics Data System (ADS)
Reed, B. Cameron
2018-02-01
A simple expression is developed for estimating the yield of a tamped fission bomb, that is, a basic nuclear weapon comprising a fissile core jacketed by a surrounding neutron-reflecting tamper. This expression is based on modeling the nuclear chain reaction as a geometric progression in combination with a previously published expression for the threshold-criticality condition for such a core. The derivation is especially straightforward, as it requires no knowledge of diffusion theory and should be accessible to students of both physics and policy. The calculation can be set up as a single page spreadsheet. Application to the Little Boy and Fat Man bombs of World War II gives results in reasonable accord with published yield estimates for these weapons.
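In the same spirit, a back-of-the-envelope version of a geometric-progression yield estimate can be written in a few lines; every number below is an illustrative placeholder, not an input from the paper, and the disassembly condition is reduced to an assumed burn fraction.

```python
# Chain reaction as a geometric progression with multiplication factor k per
# generation, run until an assumed fraction of the core has fissioned.
AVOGADRO = 6.022e23
E_FISSION_J = 180e6 * 1.602e-19     # ~180 MeV per fission, in joules
KT_J = 4.184e12                     # joules per kiloton of TNT

core_kg = 6.0                       # assumed core mass
molar_mass = 0.239                  # kg/mol (Pu-239)
k = 2.0                             # assumed per-generation multiplication
burn_fraction = 0.15                # assumed fraction fissioned before disassembly

atoms = core_kg / molar_mass * AVOGADRO
target = burn_fraction * atoms

fissions, total, gen = 1.0, 0.0, 0
while total < target:               # geometric progression of fissions
    total += fissions
    fissions *= k
    gen += 1

print(f"{gen} generations, yield ~ {total * E_FISSION_J / KT_J:.1f} kt")
```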
Predictive thermodynamics for ionic solids and liquids.
Glasser, Leslie; Jenkins, H Donald Brooke
2016-08-21
The application of thermodynamics is simple, even if the theory may appear intimidating. We describe tools, developed over recent years, which make it easy to estimate often elusive thermodynamic parameter values, generally (but not exclusively) for ionic materials, both solid and liquid, as well as for their solid hydrates and solvates. The tools are termed volume-based thermodynamics (VBT) and thermodynamic difference rules (TDR), supplemented by the simple salt approximation (SSA) and single-ion values for volume, Vm, heat capacity, Cp, entropy, S°, formation enthalpy, ΔfH°, and Gibbs formation energy, ΔfG°. These tools can be applied to provide values of thermodynamic and thermomechanical properties such as standard enthalpy of formation, ΔfH°, standard entropy, S°, heat capacity, Cp, Gibbs function of formation, ΔfG°, lattice potential energy, UPOT, isothermal expansion coefficient, α, and isothermal compressibility, β, and used to suggest the thermodynamic feasibility of reactions among condensed ionic phases. Because many of these methods yield results largely independent of crystal structure, they have been successfully extended to the important and developing class of ionic liquids as well as to new and hypothesised materials. Finally, these predictive methods are illustrated by application to K2SnCl6, for which known experimental results are available for comparison. A selection of applications of VBT and TDR is presented which have enabled input, usually in the form of thermodynamics, to be brought to bear on a range of topical problems. Perhaps the most significant advantage of VBT and TDR methods is their inherent simplicity in that they do not require a high level of computational expertise nor expensive high-performance computation tools - a spreadsheet will usually suffice - yet the techniques are extremely powerful and accessible to non-experts. The connection between formula unit volume, Vm, and standard thermodynamic parameters represents a major advance exploited by these techniques.
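As a taste of how little computation VBT requires, the sketch below evaluates the volume-based lattice-energy correlation UPOT = 2I(α/Vm^(1/3) + β) for a 1:1 salt; the coefficients are quoted from memory of the VBT literature and should be checked against the original papers before serious use.

```python
def lattice_energy_kj(vm_nm3, ionic_strength=1.0, alpha=117.3, beta=51.9):
    """UPOT = 2*I*(alpha/Vm^(1/3) + beta), kJ/mol, Vm in nm^3 per formula unit.
    alpha and beta here are the values commonly cited for 1:1 (MX) salts."""
    return 2.0 * ionic_strength * (alpha / vm_nm3 ** (1.0 / 3.0) + beta)

# NaCl: ~0.0449 nm^3 per formula unit -> ~765 kJ/mol (experimental ~787)
print(round(lattice_energy_kj(0.0449)))
```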
Mathematical Modeling with MyMaps and Spreadsheets
ERIC Educational Resources Information Center
Weber, Victoria; Fortune, Nicholas; Williams, Derek; Whitehead, Ashley
2016-01-01
Software programs such as Tinkerplots ® or Geometer's Sketchpad ® can help students solve problems in mathematics classes, but may not be available to them after high school. In contrast, many students who become familiar with Internet tools and programs in office packages (word processing, spreadsheets, etc.) may use them daily to enhance their…
ERIC Educational Resources Information Center
Lai, Chiu-Lin; Hwang, Gwo-Jen
2015-01-01
In this study, a spreadsheet-based visualized Mindtool was developed for improving students' learning performance when finding relationships between numerical variables by engaging them in reasoning and decision-making activities. To evaluate the effectiveness of the proposed approach, an experiment was conducted on the "phenomena of climate…
Working Together: Google Apps Goes to School
ERIC Educational Resources Information Center
Oishi, Lindsay
2007-01-01
Online collaboration and project-management tools allow people to work together without being in the same place at the same time. However, that is not all, Google Docs & Spreadsheets, for example, allows the creation of documents and spreadsheets just like in Microsoft Word and Excel, but with more collaborative capacity. Google Calendar lets…
Simulating Satellite and Space Probe Motion at High School with Spreadsheets
ERIC Educational Resources Information Center
Benacka, Jan
2017-01-01
This paper gives an account of an experiment in which thirty-three high school students of ages 17-19 developed spreadsheet numerical models of satellite and space probe motion. The models are free to download. A survey was carried out to find out the students' opinion of the lessons.
Using Spreadsheets to Teach Aspects of Biology Involving Mathematical Models
ERIC Educational Resources Information Center
Carlton, Kevin; Nicholls, Mike; Ponsonby, David
2004-01-01
Some aspects of biology, for example the Hardy-Weinberg simulation of population genetics or modelling heat flow in lizards, have an undeniable mathematical basis. Students can find the level of mathematical skill required to deal with such concepts to be an insurmountable hurdle to understanding. If not used effectively, spreadsheet models…
Diary of a Conversion--Lotus 1-2-3 to Symphony 1.1.
ERIC Educational Resources Information Center
Dunnewin, Larry
1986-01-01
Describes the uses of Lotus 1-2-3 (a spreadsheet-graphics-database program created by Lotus Development Corporation) and Symphony 1.1 (a refinement and expansion of Symphony 1.01 providing memory efficiency, speed, ease of use, greater file compatibility). Spreadsheet and graphics capabilities, the use of windows, database environment, and…
Using Spreadsheet Modeling Techniques for Capital Project Review. AIR 1985 Annual Forum Paper.
ERIC Educational Resources Information Center
Kaynor, Robert K.
The value of microcomputer modeling tools and spreadsheets to help college institutional researchers analyze proposed capital projects is discussed, along with strengths and weaknesses of different software packages. Capital budgeting is the analysis that supports decisions about the allocation and commitment of funds to long-term capital…
Negative Effects of Learning Spreadsheet Management on Learning Database Management
ERIC Educational Resources Information Center
Vágner, Anikó; Zsakó, László
2015-01-01
A lot of students learn spreadsheet management before database management. Their similarities can cause a lot of negative effects when learning database management. In this article, we consider these similarities and explain what can cause problems. First, we analyse the basic concepts such as table, database, row, cell, reference, etc. Then, we…
Spreadsheets as a Transparent Resource for Learning the Mathematics of Annuities
ERIC Educational Resources Information Center
Pournara, Craig
2009-01-01
The ability of mathematics teachers to decompress mathematics and to move between representations are two key features of mathematical knowledge that is usable for teaching. This article reports on four pre-service secondary mathematics teachers learning the mathematics of annuities. In working with spreadsheets students began to make sense of…
A Simple Spreadsheet Strikes a Nerve among Adjuncts
ERIC Educational Resources Information Center
Stratford, Michael
2012-01-01
Energized by his fellow adjunct professors who had gathered for a national meeting last month in Washington, District of Columbia, Joshua A. Boldt flew home to Athens, Georgia, opened his laptop, and created a Google document. On his personal blog, the writing instructor implored colleagues to contribute to the publicly editable spreadsheet,…
Studying Faculty Flows Using an Interactive Spreadsheet Model. AIR 1997 Annual Forum Paper.
ERIC Educational Resources Information Center
Kelly, Wayne
This paper describes a spreadsheet-based faculty flow model developed and implemented at the University of Calgary (Canada) to analyze faculty retirement, turnover, and salary issues. The study examined whether, given expected faculty turnover, the current salary increment system was sustainable in a stable or declining funding environment, and…
Transition Matrices: A Tool to Assess Student Learning and Improve Instruction
ERIC Educational Resources Information Center
Morris, Gary A.; Walter, Paul; Skees, Spencer; Schwartz, Samantha
2017-01-01
This paper introduces a new spreadsheet tool for adoption by high school or college-level physics teachers who use common assessments in a pre-instruction/post-instruction mode to diagnose student learning and teaching effectiveness. The spreadsheet creates a simple matrix that identifies the percentage of students who select each possible…
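The heart of such a tool is a simple cross-tabulation of pre- and post-test answer choices; a compact sketch with invented answer data for a single item follows.

```python
import pandas as pd

# One answer per student on the same item, pre- and post-instruction (invented).
pre  = list("ABBCADBACB")
post = list("AABCBDBBCB")

# Rows = pre-test choice, columns = post-test choice,
# entries = percentage of students making that transition.
matrix = pd.crosstab(pd.Series(pre, name="pre"),
                     pd.Series(post, name="post"),
                     normalize=True) * 100
print(matrix.round(0))
```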
ERIC Educational Resources Information Center
Agyei, Douglas D.; Voogt, Joke M.
2016-01-01
In this study, 12 pre-service mathematics teachers worked in teams to develop their knowledge and skills in using teacher-led spreadsheet demonstrations to help students explore mathematics concepts, stimulate discussions and perform authentic tasks through activity-based lessons. Pre-service teachers' lesson plans, their instruction of the…
Examining Errors in Simple Spreadsheet Modeling from Different Research Perspectives
ERIC Educational Resources Information Center
Kadijevich, Djordje M.
2012-01-01
By using a sample of 1st-year undergraduate business students, this study dealt with the development of simple (deterministic and non-optimization) spreadsheet models of income statements within an introductory course on business informatics. The study examined students' errors in doing this for business situations of their choice and found three…
Using Spreadsheets to Discover Meaning for Parameters in Nonlinear Models
ERIC Educational Resources Information Center
Green, Kris H.
2008-01-01
This paper explores the use of spreadsheets to develop an exploratory environment where mathematics students can develop their own understanding of the parameters of commonly encountered families of functions: linear, logarithmic, exponential and power. The key to this understanding involves opening up the definition of rate of change from the…
Using a Spreadsheet Scroll Bar to Solve Equilibrium Concentrations
ERIC Educational Resources Information Center
Raviolo, Andres
2012-01-01
A simple, conceptual method is described for using the spreadsheet scroll bar to find the composition of a system at chemical equilibrium. Simulation of any kind of chemical equilibrium can be carried out using this method, and the effects of different disturbances can be predicted. This simulation, which can be used in general chemistry…
Evolving Polygons and Spreadsheets: Connecting Mathematics across Grade Levels in Teacher Education
ERIC Educational Resources Information Center
Abramovich, Sergei; Brouwer, Peter
2009-01-01
This paper was prepared in response to the Conference Board of Mathematical Sciences recommendations for the preparation of secondary teachers. It shows how using trigonometry as a conceptual tool in spreadsheet-based applications enables one to develop mathematical understanding in the context of constructing geometric representations of unit…
Excel Yourself with Personalised Email Messages
ERIC Educational Resources Information Center
McClean, Stephen
2008-01-01
Combining the Excel spreadsheet with an email program provides a very powerful tool for sending students personalised emails. Most email clients now support a Mail Merge facility whereby a generic template is created and information unique to each student record in the spreadsheet is filled into that template, generating tens if not hundreds of…
NASA Astrophysics Data System (ADS)
Locock, Andrew J.; Mitchell, Roger H.
2018-04-01
Perovskite mineral oxides commonly exhibit extensive solid-solution, and are therefore classified on the basis of the proportions of their ideal end-members. A uniform sequence of calculation of the end-members is required if comparisons are to be made between different sets of analytical data. A Microsoft Excel spreadsheet has been programmed to assist with the classification and depiction of the minerals of the perovskite- and vapnikite-subgroups following the 2017 nomenclature of the perovskite supergroup recommended by the International Mineralogical Association (IMA). Compositional data for up to 36 elements are input into the spreadsheet as oxides in weight percent. For each analysis, the output includes the formula, the normalized proportions of 15 end-members, and the percentage of cations which cannot be assigned to those end-members. The data are automatically plotted onto the ternary and quaternary diagrams recommended by the IMA for depiction of perovskite compositions. Up to 200 analyses can be entered into the spreadsheet, which is accompanied by data calculated for 140 perovskite compositions compiled from the literature.
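A minimal sketch of the first arithmetic step such a spreadsheet performs, converting oxide weight percent to molar cation proportions before end-member assignment; the oxide subset and the sample values are illustrative assumptions, not data from the paper:

```python
# Sketch (not the published spreadsheet's code) of the first step in
# classifying a perovskite analysis: oxide wt% -> molar cation proportions,
# which are then normalized and assigned to end-members.
OXIDES = {  # oxide: (molar mass g/mol, cations per oxide formula unit)
    "CaO":   (56.08, 1),
    "TiO2":  (79.87, 1),
    "Na2O":  (61.98, 2),
    "Nb2O5": (265.81, 2),
}

def cation_moles(analysis_wt_pct):
    """Return moles of the cation of each oxide per 100 g of sample."""
    moles = {}
    for oxide, wt in analysis_wt_pct.items():
        molar_mass, n_cations = OXIDES[oxide]
        moles[oxide] = wt / molar_mass * n_cations
    return moles

# Example: an idealized lueshite-like analysis (values illustrative only)
sample = {"Na2O": 19.0, "Nb2O5": 81.0}
print(cation_moles(sample))
```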
NASA Astrophysics Data System (ADS)
Grose, C. J.
2008-05-01
Numerical geodynamic models of heat transfer are typically thought of as specialized research topics requiring knowledge of specialized modelling software, Linux platforms, and state-of-the-art finite-element codes. I have implemented analytical and numerical finite-difference techniques in Microsoft Excel 2007 spreadsheets to solve complex solid-earth heat transfer problems for use by students, teachers, and practicing scientists without specialty in geodynamic modelling techniques and applications. While implementation of equations for use in Excel spreadsheets is occasionally cumbersome, once the case boundary structure and node equations are developed, spreadsheet manipulation becomes routine. Model experimentation by modifying parameter values, geometry, and grid resolution makes Excel a useful tool whether in the classroom at the undergraduate or graduate level or for more engaging student projects. Furthermore, the ability to incorporate complex geometries and heat-transfer characteristics makes it ideal for first- and occasionally higher-order geodynamic simulations to better understand and constrain the results of professional field research in a setting that does not require state-of-the-art modelling codes. The straightforward expression and manipulation of model equations in Excel can also serve as a medium for better understanding the confusing notations of advanced mathematical problems. To illustrate the power and robustness of computation and visualization in spreadsheet models, I focus primarily on one-dimensional analytical and two-dimensional numerical solutions to two case problems: (i) the cooling of oceanic lithosphere and (ii) temperatures within subducting slabs. Excel source documents will be made available.
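For readers who want the gist of the finite-difference scheme outside Excel, here is a hedged one-dimensional sketch of the lithosphere-cooling case; parameter values are common textbook numbers, not the paper's, and each loop iteration corresponds to one row of a spreadsheet time-stepping table:

```python
# 1-D explicit finite-difference sketch of conductive cooling of oceanic
# lithosphere. In a spreadsheet, each row is a time step, each column a
# depth node; the update rule below is the same.
import numpy as np

kappa = 1e-6            # thermal diffusivity, m^2/s
dz, dt = 1000.0, 1e10   # node spacing (m) and time step (s)
assert kappa * dt / dz**2 <= 0.5  # explicit-scheme stability criterion

T = np.full(101, 1350.0)  # initial mantle temperature (deg C) at all nodes
T[0] = 0.0                # seafloor held at 0 deg C

for step in range(5000):  # march forward in time (~1.6 Myr total)
    T[1:-1] += kappa * dt / dz**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
    T[0] = 0.0            # reapply surface boundary condition
    T[-1] = 1350.0        # basal boundary: fixed mantle temperature

print(T[:10])  # shallow geotherm after the run
```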
Solving L-L Extraction Problems with Excel Spreadsheet
ERIC Educational Resources Information Center
Teppaitoon, Wittaya
2016-01-01
This work aims to demonstrate the use of Excel spreadsheets for solving L-L extraction problems. The key to solving the problems successfully is to be able to determine a tie line on the ternary diagram where the calculation must be carried out. This enables the reader to analyze the extraction process starting with a simple operation, the…
Electronic spreadsheet vs. manual payroll.
Kiley, M M
1991-01-01
Medical groups with direct employees must employ someone or contract with a company to compute payroll, writes Michael Kiley, Ph.D., M.P.H. However, many medical groups, including small ones, own a personal computer or minicomputer to handle accounts receivable. Kiley explains, in detail, how this same computer and a spreadsheet program can also be used to perform payroll functions.
ERIC Educational Resources Information Center
Caputi, Peter; Chan, Amy; Jayasuriya, Rohan
2011-01-01
This paper examined the impact of training strategies on the types of errors that novice users make when learning a commonly used spreadsheet application. Fifty participants were assigned to a counterfactual thinking training (CFT) strategy, an error management training strategy, or a combination of both strategies, and completed an easy task…
ERIC Educational Resources Information Center
Peterlin, Primoz
2010-01-01
Two methods of data analysis are compared: spreadsheet software and a statistics software suite. Their use is compared analysing data collected in three selected experiments taken from an introductory physics laboratory, which include a linear dependence, a nonlinear dependence and a histogram. The merits of each method are compared. (Contains 7…
ERIC Educational Resources Information Center
Berardi, Victor L.
2012-01-01
Using information systems to solve business problems is increasingly required of everyone in an organization, not just technical specialists. In the operations management class, spreadsheet usage has intensified with the focus on building decision models to solve operations management concerns such as forecasting, process capability, and inventory…
ERIC Educational Resources Information Center
Davies, Randall S.; Dean, Douglas L.; Ball, Nick
2013-01-01
The purpose of this research was to explore how technology can be used to teach technological skills and to determine what benefit "flipping" the classroom might have for students taking an introductory-level college course on spreadsheets in terms of student achievement and satisfaction with the class. A pretest posttest…
ERIC Educational Resources Information Center
Agyei, Douglas D.; Voogt, Joke M.
2015-01-01
This article explored the impact of strategies applied in a mathematics instructional technology course for developing technology integration competencies, in particular in the use of spreadsheets, in pre-service teachers. In this respect, 104 pre-service mathematics teachers from a teacher training programme in Ghana enrolled in the mathematics…
A Computer Simulation Using Spreadsheets for Learning Concept of Steady-State Equilibrium
ERIC Educational Resources Information Center
Sharda, Vandana; Sastri, O. S. K. S.; Bhardwaj, Jyoti; Jha, Arbind K.
2016-01-01
In this paper, we present a simple spreadsheet based simulation activity that can be performed by students at the undergraduate level. This simulation is implemented in free open source software (FOSS) LibreOffice Calc, which is available for both Windows and Linux platform. This activity aims at building the probability distribution for the…
ERIC Educational Resources Information Center
Fasoula, S.; Nikitas, P.; Pappa-Louisi, A.
2017-01-01
A series of Microsoft Excel spreadsheets were developed to simulate the process of separation optimization under isocratic and simple gradient conditions. The optimization procedure is performed in a stepwise fashion using simple macros for an automatic application of this approach. The proposed optimization approach involves modeling of the peak…
A Simple Spreadsheet Program for the Calculation of Lattice-Site Distributions
ERIC Educational Resources Information Center
McCaffrey, John G.
2009-01-01
A simple spreadsheet program is presented that can be used by undergraduate students to calculate the lattice-site distributions in solids. A major strength of the method is the natural way in which the correct number of ions or atoms is present, or absent, at specific lattice distances. The expanding-cube method utilized is straightforward to…
ERIC Educational Resources Information Center
Horton, Robert M.; Leonard, William H.
2005-01-01
In science, inquiry is used as students explore important and interesting questions concerning the world around them. In mathematics, one contemporary inquiry approach is to create models that describe real phenomena. Creating mathematical models using spreadsheets can help students learn at deep levels in both science and mathematics, and give…
Easy Leaf Area: Automated digital image analysis for rapid and accurate measurement of leaf area.
Easlon, Hsien Ming; Bloom, Arnold J
2014-07-01
Measurement of leaf areas from digital photographs has traditionally required significant user input unless backgrounds are carefully masked. Easy Leaf Area was developed to batch process hundreds of Arabidopsis rosette images in minutes, removing background artifacts and saving results to a spreadsheet-ready CSV file. • Easy Leaf Area uses the color ratios of each pixel to distinguish leaves and calibration areas from their background and compares leaf pixel counts to a red calibration area to eliminate the need for camera distance calculations or manual ruler scale measurement that other software methods typically require. Leaf areas estimated by this software from images taken with a camera phone were more accurate than ImageJ estimates from flatbed scanner images. • Easy Leaf Area provides an easy-to-use method for rapid measurement of leaf area and nondestructive estimation of canopy area from digital images.
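The color-ratio idea lends itself to a compact sketch. The following is not Easy Leaf Area's actual code; it assumes a hypothetical image file, the Pillow and NumPy packages, and illustrative thresholds for "green dominates" (leaf) and "red dominates" (calibration square):

```python
# Classify pixels by color ratio, then scale the leaf pixel count by the
# known area of the red calibration square, as described in the abstract.
import numpy as np
from PIL import Image  # requires the Pillow package

img = np.asarray(Image.open("rosette.jpg").convert("RGB"), dtype=float)
r, g, b = img[..., 0], img[..., 1], img[..., 2]

leaf = (g > 1.2 * r) & (g > 1.2 * b)    # greenish pixels (threshold assumed)
calib = (r > 1.5 * g) & (r > 1.5 * b)   # red calibration square

CALIB_AREA_CM2 = 4.0                    # true area of the red square
leaf_area = leaf.sum() / calib.sum() * CALIB_AREA_CM2
print(f"Estimated leaf area: {leaf_area:.2f} cm^2")
```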
From ClinicalTrials.gov trial registry to an analysis-ready database of clinical trial results.
Cepeda, M Soledad; Lobanov, Victor; Berlin, Jesse A
2013-04-01
The ClinicalTrials.gov web site provides a convenient interface to look up study results, but it does not allow downloading data in a format that can be readily used for quantitative analyses. Our objective was to develop a system that automatically downloads study results from ClinicalTrials.gov and provides an interface to retrieve study results in a spreadsheet format ready for analysis. Sherlock® identifies studies by intervention, population, or outcome of interest and in seconds creates an analytic database of study results ready for analyses. The outcome classification algorithms used in Sherlock were validated against a classification by an expert. Having a database ready for analysis that can be updated automatically dramatically extends the utility of the ClinicalTrials.gov trial registry. It increases the speed of comparative research, reduces the need for manual extraction of data, and permits answering a vast array of questions.
Parametric estimation for reinforced concrete relief shelter for Aceh cases
NASA Astrophysics Data System (ADS)
Atthaillah; Saputra, Eri; Iqbal, Muhammad
2018-05-01
This paper is a work in progress (WIP) toward a rapid parametric framework for estimating the materials of post-disaster permanent shelters. The intended shelters are of reinforced concrete construction with brick walls. Inevitably, in post-disaster cases, design variations are needed to suit the victims' conditions, and it is nearly impossible to satisfy every beneficiary with a satisfactory design using conventional methods. This study offers a parametric framework to overcome the problem of slow construction-materials estimation in the face of design variations. The work integrates a parametric tool, Grasshopper, to establish algorithms that simultaneously model, visualize, calculate, and write the calculated data to a spreadsheet in real time. Some customized Grasshopper components were created using GHPython scripting for a more optimized algorithm. The result of this study is a partial framework that successfully performs modeling, visualization, calculation, and writing of the calculated data simultaneously. This means design alterations do not escalate the time needed for modeling, visualization, and material estimation. Future development of the parametric framework will be made open source.
ERIC Educational Resources Information Center
Abriata, Luciano A.
2011-01-01
A simple algorithm was implemented in a spreadsheet program to simulate the circular dichroism spectra of proteins from their secondary structure content and to fit α-helix, β-sheet, and random coil contents from experimental far-UV circular dichroism spectra. The physical basis of the method is briefly reviewed within the context of…
Trent Wickman; Ann Acheson
2005-01-01
The Smoke Impact Spreadsheet (SIS) is a simple-to-use planning model for calculating particulate matter (PM) emissions and concentrations downwind of wildland fires. This fact sheet identifies the intended users and uses, required inputs, what the model does and does not do, and tells the user how to obtain the model.
Teaching Graphical Simulations of Fourier Series Expansion of Some Periodic Waves Using Spreadsheets
ERIC Educational Resources Information Center
Singh, Iqbal; Kaur, Bikramjeet
2018-01-01
The present article demonstrates a way of programming using an Excel spreadsheet to teach Fourier series expansion in school/colleges without the knowledge of any typical programming language. By using this, a student learns to approximate partial sum of the n terms of Fourier series for some periodic signals such as square wave, saw tooth wave,…
ERIC Educational Resources Information Center
Ray, Darrell L.
2013-01-01
Students often enter biology programs deficient in the math and computational skills that would enhance their attainment of a deeper understanding of the discipline. To address some of these concerns, I developed a series of spreadsheet simulation exercises that focus on some of the mathematical foundations of scientific inquiry and the benefits…
ERIC Educational Resources Information Center
Tekinarslan, Erkan
2013-01-01
The purpose of this study is to investigate the effects of screencasts on the Turkish undergraduate students' achievement and knowledge acquisitions in spreadsheet applications. The methodology of the study is based on a pretest-posttest experimental design with a control group. A total of 66 undergraduate students in two groups (n = 33 in…
E.M. Bilek; Peter Becker; Tim. McAbee
2009-01-01
This documentation is meant to accompany CVal, a downloadable spreadsheet tool. CVal was constructed for foresters, other land management advisors, landowners, and carbon credit aggregators to evaluate the direct benefits and costs of entering into contracts for carbon sequestered in managed forests and forest plantations. CVal was designed to evaluate Exchange...
ERIC Educational Resources Information Center
Halpern, Arthur M.; Glendening, Eric D.
2013-01-01
A three-part project for students in physical chemistry, computational chemistry, or independent study is described in which they explore applications of valence bond (VB) and molecular orbital-configuration interaction (MO-CI) treatments of H₂. Using a scientific spreadsheet, students construct potential-energy (PE) curves for several…
Numerical Modelling with Spreadsheets as a Means to Promote STEM to High School Students
ERIC Educational Resources Information Center
Benacka, Jan
2016-01-01
The article gives an account of an experiment in which sixty-eight high school students of age 16 - 19 developed spreadsheet applications that simulated fall and projectile motion in the air. The students applied the Euler method to solve the governing differential equations. The aim was to promote STEM to the students and motivate them to study…
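A minimal sketch of the Euler scheme the students implemented, for projectile motion with quadratic air drag; the drag constant is an illustrative assumption, and in a spreadsheet each loop pass would occupy one row:

```python
# Euler time-stepping for projectile motion with quadratic air drag.
g, k, dt = 9.81, 0.02, 0.01          # gravity, drag/mass (1/m), time step (s)
x, y = 0.0, 0.0
vx, vy = 30.0, 30.0                  # initial velocity components (m/s)

while y >= 0.0:
    v = (vx**2 + vy**2) ** 0.5       # current speed
    ax = -k * v * vx                 # drag opposes motion
    ay = -g - k * v * vy
    x, y = x + vx * dt, y + vy * dt  # Euler position update
    vx, vy = vx + ax * dt, vy + ay * dt
print(f"Range with drag: {x:.1f} m")
```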
ERIC Educational Resources Information Center
Kunzler, Jayson S.
2012-01-01
This dissertation describes a research study designed to explore whether customization of online instruction results in improved learning in a college business statistics course. The study involved utilizing computer spreadsheet technology to develop an intelligent tutoring system (ITS) designed to: a) collect and monitor individual real-time…
ERIC Educational Resources Information Center
Thohir, M. Anas
2018-01-01
In the 21st century, the competence of instructional technological design is important for pre-service physics teachers. This case study described the pre-service physics teachers' design of optical spreadsheet simulation and evaluated teaching and learning the task in the classroom. The case study chose three of thirty pre-service teacher's…
Eric van Steenis
2013-01-01
This paper illustrates how to use an Excel spreadsheet as a decision-making tool to determine the optimum sowing factor that minimizes seedling production cost. Factors incorporated into the spreadsheet calculations include germination percentage, seeder accuracy, cost per seed, cavities per block, costs of handling, thinning, and transplanting labor, and more. In addition to...
Development of a spreadsheet for SNPs typing using Microsoft EXCEL.
Hashiyada, Masaki; Itakura, Yukio; Takahashi, Shirushi; Sakai, Jun; Funayama, Masato
2009-04-01
Single-nucleotide polymorphisms (SNPs) have some characteristics that make them very appropriate for forensic studies and applications. In our institute, SNP typing was performed with TaqMan SNP Genotyping Assays using the ABI PRISM 7500 FAST Real-Time PCR System (Applied Biosystems) and Sequence Detection Software ver. 1.4 (Applied Biosystems). The TaqMan method required two positive controls (Allele 1 and Allele 2) and one negative control to analyze each SNP locus; therefore, up to 24 loci per person could be analyzed on a 96-well plate at the same time. If SNP analysis is to be applied to biometric authentication, 48 or more loci are required to identify a person. In this study, we designed a spreadsheet package using Microsoft EXCEL, and population data were taken from our 120-SNP population studies. On the spreadsheet, we defined SNP types using 'template files' instead of positive and negative controls. 'Template files' consisted of the results of 94 unknown samples and two negative controls for each of the 120 SNP loci we had previously studied. By the use of these files, the spreadsheet could analyze 96 SNPs on a 96-well plate simultaneously.
NASA Astrophysics Data System (ADS)
Quagliato, Luca; Berti, Guido A.
2017-10-01
In this paper, a statically determined slip-line solution algorithm is proposed for the calculation of the axial forming force in the radial-axial ring rolling process for flat rings. The developed solution is implemented in an Excel spreadsheet for the construction of the slip-line field and the calculation of the pressure factor to be used in the force model. The comparison between the analytical solution and the authors' FE simulation shows that the developed model supersedes previous models in the literature and proves the reliability of the proposed approach.
DOE Office of Scientific and Technical Information (OSTI.GOV)
St. Onge, Melinda
The Geothermal Resource Portfolio Optimization and Reporting Tool (GeoRePORT) was developed as a way to distill large amounts of geothermal project data into an objective, reportable data set that can be used to communicate with experts and non-experts. GeoRePORT summarizes (1) resource grade and certainty and (2) project readiness. This Excel file allows users to easily navigate through the resource grade attributes, using drop-down menus to pick grades and project readiness, and then easily print and share the summary with others. This spreadsheet is the first draft, for which we are soliciting expert feedback. The spreadsheet will be updated based on this feedback to increase usability of the tool. If you have any comments, please feel free to contact us.
Nutrient modeling for a semi-intensive IMC pond: an MS-Excel approach.
Ray, Lala I P; Mal, B C; Moulick, S
2017-11-01
Semi-intensive Indian Major Carp (IMC) culture was practised in polythene-lined dugout ponds at the Aquacultural Farm of the Indian Institute of Technology, Kharagpur, West Bengal for 3 consecutive years at three different stocking densities (S.D.), viz., 20,000, 35,000 and 50,000 fingerlings per hectare of water spread area. Fingerlings of Catla, Rohu and Mrigal were raised at a stocking ratio of 4:3:3. Total ammonia nitrogen (TAN), along with other fishpond water quality parameters, was monitored at 1-day intervals to ensure a good water ecosystem for better fish growth. Water exchange was carried out before the TAN reached the critical limit. Field data on TAN obtained from the cultured fishponds stocked at the three different stocking densities were used to study the dynamics of TAN. A model developed earlier to study nutrient dynamics in shrimp ponds was validated against the data observed in the IMC pond ecosystem. Two years of observed TAN data were used to calibrate the spreadsheet model, and the same model was validated using the third year's observed data. Manual calibration, based on trial-and-error adjustment of parameters, was used, and several simulations were performed by changing the model parameters. After adjustment of each parameter, the simulated and measured values of the water quality parameters were compared to judge the improvement in the model prediction. A forward finite-difference discretization method was implemented in an MS-Excel spreadsheet to calibrate and validate the model for obtaining the TAN levels during the culture period. Observed data from the cultured fishponds at the three different S.D. were used to standardize 13 model parameters. The efficiency of the developed spreadsheet model was found to be more than 90% for TAN estimation in the IMC cultured fishponds.
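The forward finite-difference pattern the study describes can be sketched with a one-state TAN balance. The real model carries 13 calibrated parameters; the source/loss terms and values below are hypothetical placeholders that only show the update rule:

```python
# Forward-Euler (forward finite-difference) update of a TAN mass balance:
# C(t+dt) = C(t) + dt * dC/dt, the same rule a spreadsheet row would apply.
dt = 1.0            # time step, days
tan = 0.1           # initial TAN concentration, mg/L
feed_input = 0.05   # TAN added per day from feed and excretion, mg/L/d (assumed)
loss_rate = 0.15    # first-order loss (volatilization, uptake), 1/d (assumed)

series = [tan]
for day in range(120):
    d_tan = feed_input - loss_rate * tan  # net rate of change
    tan = tan + dt * d_tan                # forward-Euler step
    series.append(tan)
print(f"TAN after {len(series) - 1} days: {series[-1]:.3f} mg/L")
```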
Heuristic Modeling for TRMM Lifetime Predictions
NASA Technical Reports Server (NTRS)
Jordan, P. S.; Sharer, P. J.; DeFazio, R. L.
1996-01-01
Analysis time for computing the expected mission lifetimes of proposed frequently maneuvering, tightly altitude-constrained, Earth-orbiting spacecraft has been significantly reduced by means of a heuristic modeling method implemented in a commercial off-the-shelf spreadsheet product (QuattroPro) running on a personal computer (PC). The method uses a look-up table to estimate the maneuver frequency per month as a function of the spacecraft ballistic coefficient and the solar flux index, then computes the associated fuel use with a simple engine model. Maneuver frequency data points are produced by means of a single 1-month run of traditional mission analysis software for each of the 12 to 25 data points required for the table. As the data point computations are required only at mission design start-up and on the occasion of significant mission redesigns, the dependence on time-consuming traditional modeling methods is dramatically reduced. Results to date have agreed with traditional methods to within 1 to 1.5 percent. The spreadsheet approach is applicable to a wide variety of Earth-orbiting spacecraft with tight altitude constraints. It will be particularly useful to such missions as the Tropical Rainfall Measurement Mission scheduled for launch in 1997, whose mission lifetime calculations are heavily dependent on frequently revised solar flux predictions.
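The heuristic structure described, a lookup table for maneuver frequency plus a simple engine model, can be sketched as follows; every number here is an invented placeholder, since the actual table entries came from runs of traditional mission-analysis software:

```python
# Lookup-table-plus-engine-model lifetime estimate, spreadsheet style.
flux_bins = [(70, 0.5), (130, 1.5), (200, 4.0)]  # (F10.7 flux, maneuvers/month), assumed

def maneuvers_per_month(flux):
    """Piecewise-constant lookup, as a spreadsheet LOOKUP() would do."""
    for ceiling, rate in flux_bins:
        if flux <= ceiling:
            return rate
    return flux_bins[-1][1]

FUEL_PER_MANEUVER = 0.8  # kg per maneuver, hypothetical simple engine model
fuel_load = 500.0        # kg at start of mission

months = 0
while fuel_load > 0:
    flux = 120 + 60 * (months % 132) / 132  # toy solar-cycle proxy
    fuel_load -= maneuvers_per_month(flux) * FUEL_PER_MANEUVER
    months += 1
print(f"Estimated lifetime: {months / 12:.1f} years")
```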
NASA Astrophysics Data System (ADS)
Al-Mishwat, Ali T.
2016-05-01
PHASS99 is a FORTRAN program designed to retrieve and decode radiometric and other physical age information of igneous rocks contained in the international database IGBADAT (Igneous Base Data File). In the database, ages are stored in a proprietary format using mnemonic representations. The program can handle up to 99 ages per igneous rock specimen and caters to forty radiometric age systems. The radiometric age alphanumeric strings assigned to each specimen description in the database consist of four components: the numeric age and its exponential modifier, a four-character mnemonic method identification, a two-character mnemonic name of the analysed material, and the reference number in the rock group bibliography vector. For each specimen, the program searches for radiometric age strings, extracts them, parses them, decodes the different age components, and converts them to high-level English equivalents. IGBADAT and similarly structured files are used for input. The output includes three files: a flat raw ASCII text file containing the retrieved radiometric age information, a generic spreadsheet-compatible file for importing data into spreadsheets, and an error file. PHASS99 builds on the old TSTPHA (Test Physical Age) decoder program and greatly expands its capabilities. PHASS99 is simple, user friendly, fast, and efficient, and does not require users to have knowledge of programming.
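Since the IGBADAT string layout is proprietary, the following parser sketch uses a hypothetical encoding that merely preserves the four components named above (numeric age with exponent, method mnemonic, material mnemonic, reference number); the mnemonics themselves are invented:

```python
# Decode a hypothetical four-component age string into high-level fields.
MNEMONIC_METHODS = {"KARM": "K-Ar mineral", "RBSR": "Rb-Sr whole rock"}  # invented
MNEMONIC_MATERIALS = {"WR": "whole rock", "BI": "biotite"}               # invented

def decode_age(record):
    """Decode e.g. '1.23E2 KARM WR 07' into a dict of named fields."""
    age, method, material, ref = record.split()
    return {
        "age_ma": float(age),  # float() handles the exponential modifier
        "method": MNEMONIC_METHODS.get(method, method),
        "material": MNEMONIC_MATERIALS.get(material, material),
        "reference_no": int(ref),  # index into the bibliography vector
    }

print(decode_age("1.23E2 KARM WR 07"))
```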
Mukhebi, A W; Kariuki, D P; Mussukuya, E; Mullins, G; Ngumi, P N; Thorpe, W; Perry, B D
1995-07-01
The cost of immunising cattle against East Coast fever by the infection and treatment method has been calculated for a pilot scheme in Kaloleni Division of the Coast Province of Kenya by using a spreadsheet model. The cost was calculated to be KSh 544 (US$25) per animal (in 1990 values). If a farmer were to bear all this cost, immunisation would be financially profitable in grade cattle, but the benefits of immunisation would not be sufficient to justify the immunisation of zebu cattle. For these animals, the cost of immunisation would have to be in the range of KSh 230 to KSh 415 per animal, or the farm-gate price of milk would have to increase by at least 80 per cent from KSh 7.50 to 13.50/litre, or the government would have to subsidise the cost either partially or fully. The first two possibilities are realistic, because the costs of routine immunisation are likely to be lower than for the pilot scheme, and because the increasing demand for milk is likely to push up prices in the liberalised markets. If both the grade and zebu cattle in Kaloleni Division were targets for immunisation, it is estimated that there would be 14,500 head for immunisation annually, costing an estimated KSh 8 million. The spreadsheet model used to assess the economics of immunisation in the Kaloleni Division could be applied to determine the government or private veterinary service charges for immunisation that would be financially profitable to farmers in a defined cattle production system in any division, district or country. The model could also be used to estimate the annual total number of cattle for immunisation in a target cattle production system and thus help with the financial planning for the exercise.
Kelly, Christopher; Pashayan, Nora; Munisamy, Sreetharan; Powles, John W
2009-01-01
Background: Our aim was to estimate the burden of fatal disease attributable to excess adiposity in England and Wales in 2003 and 2015 and to explore the sensitivity of the estimates to the assumptions and methods used. Methods: A spreadsheet implementation of the World Health Organization's (WHO) Comparative Risk Assessment (CRA) methodology for continuously distributed exposures was used. For our base case, adiposity-related risks were assumed to be minimal at a mean (SD) BMI of 21 (1) kg m⁻². All-cause mortality risks for 2015 were taken from the Government Actuary and alternative compositions by cause derived. Disease-specific relative risks by BMI were taken from the CRA project and varied in sensitivity analyses. Results: Under base-case methods and assumptions for 2003, approximately 41,000 deaths and a loss of 1.05 years of life expectancy were attributed to excess adiposity. Seventy-seven percent of all diabetic deaths, 23% of all ischaemic heart disease deaths and 14% of all cerebrovascular disease deaths were attributed to excess adiposity. Predictions for 2015 were found to be more sensitive to assumptions about the future course of mortality risks for diabetes than to variation in the assumed trend in BMI. On less favourable assumptions, the attributable loss of life expectancy in 2015 would rise modestly to 1.28 years. Conclusion: Excess adiposity appears to contribute materially but modestly to mortality risks in England and Wales, and this contribution is likely to increase in the future. Uncertainty centres on future trends in associated diseases, especially diabetes. The robustness of these estimates is limited by the lack of control for correlated risks by stratification and by the empirical uncertainty surrounding the effects of prolonged excess adiposity beginning in adolescence. PMID:19566928
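The CRA calculation for a continuously distributed exposure can be sketched as a population attributable fraction (PAF) integral comparing the observed BMI distribution with the counterfactual minimal-risk distribution (mean 21, SD 1). The relative-risk function and observed BMI distribution below are hypothetical stand-ins, not the paper's inputs:

```python
# PAF = (int RR(x) P(x) dx - int RR(x) P*(x) dx) / int RR(x) P(x) dx,
# where P is the observed and P* the counterfactual exposure distribution.
import numpy as np
from scipy.stats import norm
from scipy.integrate import quad

def rr(bmi):  # hypothetical log-linear RR: +10% risk per BMI unit above 21
    return np.exp(0.10 * np.maximum(bmi - 21.0, 0.0))

observed = norm(27.0, 4.5)        # illustrative population BMI distribution
counterfactual = norm(21.0, 1.0)  # minimal-risk distribution from the paper

num, _ = quad(lambda x: rr(x) * observed.pdf(x), 10, 60)
den, _ = quad(lambda x: rr(x) * counterfactual.pdf(x), 10, 60)
paf = (num - den) / num
print(f"PAF for this cause: {paf:.2%}")
```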
ERIC Educational Resources Information Center
Fuglestad, Anne Berit
2013-01-01
This paper presents a case of collaboration with three teachers and a didactician on task development within a developmental research project based on ideas of inquiry and learning community. The teachers' goal was to utilise a spreadsheet to orchestrate the pupils' investigations and build a library of tasks for the classroom. The focus is on one…
ERIC Educational Resources Information Center
Adhitama, Egy; Fauzi, Ahmad
2018-01-01
In this study, a pendulum experimental tool with a light-based timer has been developed to measure the period of a simple pendulum. The obtained data was automatically recorded in an Excel spreadsheet. The intensity of monochromatic light, sensed by a 3DU5C phototransistor, dynamically changes as the pendulum swings. The changed intensity varies…
ERIC Educational Resources Information Center
Fetter, Gary; Shockley, Jeff
2014-01-01
Instructors look for ways to explain to students how supply chains can be constructed so that competing suppliers can work together to improve inventory management performance (i.e., a phenomenon known as co-opetition). An Excel spreadsheet-driven simulation is presented that models a complete multilevel supply chain system--customer, retailer,…
ERIC Educational Resources Information Center
Halat, Erdogan; Peker, Murat
2011-01-01
The purpose of this study was to compare the influence of instruction using WebQuest activities with the influence of an instruction using spreadsheet activities on the motivation of pre-service elementary school teachers in mathematics teaching course. There were a total of 70 pre-service elementary school teachers involved in this study. Thirty…
Great Basin NV Play Fairway Analysis - Carson Sink
Jim Faulds
2015-10-28
All datasets and products specific to the Carson Sink Basin. Includes a packed ArcMap project (.mpk), individually zipped shapefiles, and a file geodatabase for the Carson Sink area; a GeoSoft Oasis montaj project containing GM-SYS 2D gravity profiles along the trace of our seismic reflection lines; a 3D model in EarthVision; a spreadsheet of links to published maps; and spreadsheets of well data.
ERIC Educational Resources Information Center
Krange, Ingeborg; Arnseth, Hans Christian
2012-01-01
The aim of this study is to scrutinize the characteristics of conceptual meaning making when students engage with virtual worlds in combination with a spreadsheet with the aim to develop graphs. We study how these tools and the representations they contain or enable students to construct serve to influence their understanding of energy resource…
Jim Faulds
2015-10-29
All datasets and products specific to the Steptoe Valley model area. Includes a packed ArcMap project (.mpk), individually zipped shapefiles, and a file geodatabase for the northern Steptoe Valley area; a GeoSoft Oasis montaj project containing GM-SYS 2D gravity profiles along the trace of our seismic reflection lines; a 3D model in EarthVision; spreadsheet of links to published maps; and spreadsheets of well data.
Sang-Kyun Han; Han-Sup Han; William J. Elliot; Edward M. Bilek
2017-01-01
We developed a spreadsheet-based model, named ThinTool, to evaluate the cost of mechanical fuel reduction thinning including biomass removal, to predict net energy output, and to assess nutrient impacts from thinning treatments in northern California and southern Oregon. A combination of literature reviews, field-based studies, and contractor surveys was used to...
ERIC Educational Resources Information Center
Masalski, William J.
This book seeks to develop, enhance, and expand students' understanding of mathematics by using technology. Topics covered include the advantages of spreadsheets along with the opportunity to explore the 'what if?' type of questions encountered in the problem-solving process, enhancing the user's insight into the development and use of algorithms,…
A user-friendly tool for incremental haemodialysis prescription.
Casino, Francesco Gaetano; Basile, Carlo
2018-01-05
There is a recently heightened interest in incremental haemodialysis (IHD), the main advantage of which is likely a better preservation of the patients' residual kidney function. The implementation of IHD, however, is hindered by many factors, among them the mathematical complexity of its prescription. The aim of our study was to design a user-friendly tool for IHD prescription, consisting of only a few rows of a common spreadsheet. The keystone of our spreadsheet was the following fundamental concept: the dialysis dose to be prescribed in IHD depends only on the normalized urea clearance provided by the native kidneys (KRUn) of the patient for each frequency of treatment, according to the variable target model recently proposed by Casino and Basile (The variable target model: a paradigm shift in the incremental haemodialysis prescription. Nephrol Dial Transplant 2017; 32: 182-190). The first step was to put in sequence a series of equations to calculate, first, KRUn and, then, the key parameters to be prescribed for an adequate IHD; the second step was to compare KRUn values obtained with our spreadsheet with KRUn values obtainable with the gold-standard Solute-solver (Daugirdas JT et al., Solute-solver: a web-based tool for modeling urea kinetics for a broad range of hemodialysis schedules in multiple patients. Am J Kidney Dis 2009; 54: 798-809) in a sample of 40 incident haemodialysis patients. Our spreadsheet provided excellent results: the differences with Solute-solver were clinically negligible. This was confirmed by the Bland-Altman plot built to analyse the agreement between KRUn values obtained with the two methods: the difference was 0.07 ± 0.05 mL/min/35 L. Our spreadsheet is a user-friendly tool able to provide clinically acceptable results in IHD prescription. Two immediate consequences could follow: (i) a wider dissemination of IHD; and (ii) our spreadsheet could represent a useful tool for a much-needed full-fledged clinical trial comparing IHD with standard thrice-weekly HD.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Etmektzoglou, A; Mishra, P; Svatos, M
Purpose: To automate creation and delivery of robotic linac trajectories with TrueBeam Developer Mode, an open-source spreadsheet-based trajectory generation tool has been developed, tested and made freely available. The computing power inherent in a spreadsheet environment, plus additional functions programmed into the tool, insulates users from the underlying schema tedium and allows easy calculation, parameterization, graphical visualization, validation and, finally, automatic generation of Developer Mode XML scripts which are directly loadable on a TrueBeam linac. Methods: The robotic control system platform that allows total coordination of potentially all linac moving axes with beam (continuous, step-and-shoot, or a combination thereof) becomes available in TrueBeam Developer Mode. Many complex trajectories are either geometric or can be described in analytical form, making the computational power, graphing and programmability available in a spreadsheet environment an easy and ideal vehicle for automatic trajectory generation. The spreadsheet environment also allows for parameterization of trajectories, thus enabling the creation of entire families of trajectories using only a few variables. Standard spreadsheet functionality has been extended for powerful movie-like dynamic graphic visualization of the gantry, table, MLC, room, lasers, 3D observer placement and beam centerline, all as a function of MU or time, for analysis of the motions before requiring actual linac time. Results: We used the tool to generate and deliver extended-SAD "virtual isocenter" trajectories of various shapes such as parameterized circles and ellipses. We also demonstrated use of the tool in generating linac couch motions that simulate respiratory motion using analytical parameterized functions. Conclusion: The SAGE tool is a valuable resource for experimenting with families of complex geometric trajectories on a TrueBeam linac. It makes Developer Mode more accessible as a vehicle to quickly translate research ideas into machine-readable scripts without programming knowledge. As an open-source initiative, it also enables researcher collaboration on future developments.
MIRAGE: The data acquisition, analysis, and display system
NASA Technical Reports Server (NTRS)
Rosser, Robert S.; Rahman, Hasan H.
1993-01-01
Developed for the NASA Johnson Space Center and Life Sciences Directorate by GE Government Services, the Microcomputer Integrated Real-time Acquisition Ground Equipment (MIRAGE) system is a portable ground support system for Spacelab life sciences experiments. The MIRAGE system can acquire digital or analog data. Digital data may be NRZ-formatted telemetry packets or packets from a network interface. Analog signals are digitized and stored in experiment packet format. Data packets from any acquisition source are archived to disk as they are received. Meta-parameters are generated from the data packet parameters by applying mathematical and logical operators. Parameters are displayed in text and graphical form or output to analog devices. Experiment data packets may be retransmitted through the network interface. Data stream definitions, experiment parameter formats, parameter displays, and other variables are configured using a spreadsheet database. A database can be developed to support virtually any data packet format. The user interface provides menu- and icon-driven program control. The MIRAGE system can be integrated with other workstations to perform a variety of functions. Its generic capabilities, adaptability and ease of use make the MIRAGE a cost-effective solution to many experiment data processing requirements.
Converting analog interpretive data to digital formats for use in database and GIS applications
Flocks, James G.
2004-01-01
There is a growing need by researchers and managers for comprehensive and unified nationwide datasets of scientific data. These datasets must be in a digital format that is easily accessible using database and GIS applications, providing the user with access to a wide variety of current and historical information. Although most data currently being collected by scientists are already in a digital format, there is still a large repository of information in the literature and paper archive. Converting this information into a format accessible by computer applications is typically very difficult and can result in loss of data. However, since scientific data are commonly collected in a repetitive, concise manner (e.g., forms, tables, graphs), these data can be recovered digitally by using a conversion process that relates the position of an attribute in two-dimensional space to the information that the attribute signifies. For example, if a table contains a certain piece of information in a specific row and column, then the space that the row and column occupy becomes an index of that information. An index key is used to identify the relation between the physical location of the attribute and the information the attribute contains. The conversion process can be achieved rapidly, easily, and inexpensively using widely available digitizing and spreadsheet software and simple programming code. In the geological sciences, sedimentary character is commonly interpreted from geophysical profiles and descriptions of sediment cores. In the field and laboratory, these interpretations were typically transcribed to paper. The information in these paper archives is still relevant and increasingly important for scientists, engineers, and managers seeking to understand geologic processes affecting our environment. Direct scanning of this information produces a raster facsimile of the data, which allows it to be linked to the electronic world, but true integration of the content with database and GIS software as point, vector, or text information is commonly lost. Sediment core descriptions and interpretations of geophysical profiles are usually portrayed as lines, curves, symbols, and text. They have vertical and horizontal dimensions associated with depth, category, time, or geographic position. These dimensions are displayed in consistent positions, which can be digitized and converted to a digital format, such as a spreadsheet. Once these data are in digital, tabulated form, they can easily be made available to a wide variety of imaging and data-manipulation software for compilation and worldwide dissemination.
Kuppusamy, Vijayalakshmi; Nagarajan, Vivekanandan; Jeevanandam, Prakash; Murugan, Lavanya
2016-02-01
The study aimed to compare two different monitor unit (MU) or dose verification software packages in volumetric modulated arc therapy (VMAT) using a modified Clarkson integration technique for 6 MV photon beams. An in-house Excel spreadsheet-based monitor unit verification calculation (MUVC) program and PTW's DIAMOND secondary check software (SCS), version 6, were used as a secondary check to verify the MU or dose calculated by the treatment planning system (TPS). In this study 180 patients were grouped into 61 head and neck, 39 thorax and 80 pelvic sites. Verification plans were created using the PTW OCTAVIUS-4D phantom and measured using the 729 detector chamber array, with the isocentre as the point of measurement for each field. In the analysis of 154 clinically approved VMAT plans with isocentre in a region above -350 HU, using heterogeneity corrections, the in-house spreadsheet-based MUVC program and DIAMOND SCS showed good agreement with the TPS. The overall average percentage deviations for all sites were (-0.93% ± 1.59%) and (1.37% ± 2.72%) for the in-house Excel spreadsheet-based MUVC program and DIAMOND SCS, respectively. For 26 clinically approved VMAT plans with isocentre in a region below -350 HU, both the in-house spreadsheet-based MUVC program and DIAMOND SCS showed higher variations. It can be concluded that, for patient-specific quality assurance (QA), the in-house Excel spreadsheet-based MUVC program and DIAMOND SCS can be used as a simple and fast complement to measurement-based verification for plans with isocentre in a region above -350 HU.
NASA Technical Reports Server (NTRS)
1986-01-01
All manpower numbers (number of heads by skill, serial time, and manhours) have been accumulated and compiled on a per-subtask basis in spreadsheet format for both the ground-based and the space-based data flows. To aid in identifying the facility resources required to process the Ground Based Orbital Transfer Vehicle (GBOTV) and/or the Space Based Orbital Transfer Vehicle (SBOTV) through the ground facilities at Kennedy Space Center (KSC), a software application package was developed using a general-purpose database management system known as Data Flex. The facility requirements are used as the basic input to this software application. The resources of the KSC facilities that could be used by the orbital transfer vehicle program were digitized in the same format used to identify facility requirements. The facility capabilities were digitized in this format for subsequent, automated comparative analyses. Composite facility requirements are compared to each of the baseline facility capabilities, and the system generates a relative score that indicates how each facility weighs against the composite requirements relative to the other facilities in the set.
An X Window system for statlab results reporting.
Barrows, R. C.; Allen, B.; Fink, D. J.
1993-01-01
We have developed a system that receives "stat" results encoded in Health Level Seven from the Laboratory Information System, prints a report in destination Intensive Care Units (ICUs), and captures the data for review in a custom spreadsheet format at color X-terminals located in ICUs. Available services include a reference nomogram plot of arterial blood gas data, printed summaries, automated access to the Clinical Information System and a Medline database, electronic mail, a simulated electronic calculator, and general news and information. Security mechanisms include an audit trail of user activities on the system. Noteworthy technical aspects and non-technical factors impacting success are discussed. PMID:8130490
ListingAnalyst: A program for analyzing the main output file from MODFLOW
Winston, Richard B.; Paulinski, Scott
2014-01-01
ListingAnalyst is a Windows® program for viewing the main output file from MODFLOW-2005, MODFLOW-NWT, or MODFLOW-LGR. It organizes and displays large files quickly without using excessive memory. The sections and subsections of the file are displayed in a tree-view control, which allows the user to navigate quickly to desired locations in the files. ListingAnalyst gathers error and warning messages scattered throughout the main output file and displays them together in error and warning tabs. A grid view displays tables in a readable format and allows the user to copy a table into a spreadsheet. The user can also search the file for terms of interest.
NASA Technical Reports Server (NTRS)
Holderman, James D.; Clisset, James R.; Moder, Jeffrey P.
2010-01-01
This is a printout of the spreadsheet that supplements the document found in NASA/TM-2010-216100. The calculations for cases of opposed rows of jets with the orifices on one side shifted show that staggering can improve the mixing, particularly for cases where jets would slightly overpenetrate if the orifices were in an aligned configuration.
Teaching graphical simulations of Fourier series expansion of some periodic waves using spreadsheets
NASA Astrophysics Data System (ADS)
Singh, Iqbal; Kaur, Bikramjeet
2018-05-01
The present article demonstrates a way of programming using an Excel spreadsheet to teach Fourier series expansion in school/colleges without the knowledge of any typical programming language. By using this, a student learns to approximate partial sum of the n terms of Fourier series for some periodic signals such as square wave, saw tooth wave, half wave rectifier and full wave rectifier signals.
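A Python analogue of the spreadsheet exercise, tabulating the n-term partial sum of a unit square wave; the overshoot near the jump (Gibbs phenomenon) is what students see as they raise n:

```python
# n-term Fourier partial sum of a unit square wave:
# S_n(t) = (4/pi) * sum over odd k of sin(k*omega*t)/k
import numpy as np

def square_wave_partial_sum(t, n_terms, omega=2 * np.pi):
    s = np.zeros_like(t)
    for k in range(1, 2 * n_terms, 2):  # odd harmonics 1, 3, 5, ...
        s += np.sin(k * omega * t) / k
    return 4.0 / np.pi * s

t = np.linspace(0.0, 2.0, 500)          # two periods of the signal
for n in (1, 5, 50):                    # watch the wave sharpen with n
    approx = square_wave_partial_sum(t, n)
    print(n, approx.max().round(3))     # overshoot shows Gibbs ringing
```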
ERIC Educational Resources Information Center
Benacka, Jan
2015-01-01
This paper provides the formula for the elevation angle at which a projectile has to be fired in a vacuum from a general position to hit a target at a given distance. A spreadsheet application that models the trajectory is presented, and the problem of finding the points of shot and impact of a projectile moving in a vacuum if three points of the…
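The closed-form result for this classic problem can be sketched as follows; the formula is the standard vacuum-ballistics solution for a target at horizontal distance d and height h, and the paper's own variable conventions may differ:

```python
# Elevation angles that hit (d, h) when firing at speed v in a vacuum:
# tan(theta) = (v^2 +/- sqrt(v^4 - g*(g*d^2 + 2*h*v^2))) / (g*d)
import math

def launch_angles(v, d, h, g=9.81):
    disc = v**4 - g * (g * d**2 + 2 * h * v**2)
    if disc < 0:
        return []  # target out of reach at this muzzle speed
    return [math.degrees(math.atan((v**2 + s * math.sqrt(disc)) / (g * d)))
            for s in (+1, -1)]  # high and low trajectories

print(launch_angles(v=30.0, d=60.0, h=5.0))
```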
ExcelAutomat: a tool for systematic processing of files as applied to quantum chemical calculations
NASA Astrophysics Data System (ADS)
Laloo, Jalal Z. A.; Laloo, Nassirah; Rhyman, Lydia; Ramasami, Ponnadurai
2017-07-01
The processing of the input and output files of quantum chemical calculations often necessitates a spreadsheet as a key component of the workflow. Spreadsheet packages with a built-in programming language editor can automate the steps involved and thus provide a direct link between processing files and the spreadsheet. This helps to reduce user intervention as well as the need to switch between different programs to carry out each step. The ExcelAutomat tool is the implementation of this method in Microsoft Excel (MS Excel) using the default Visual Basic for Applications (VBA) programming language. The code in ExcelAutomat was adapted to work with the platform-independent open-source LibreOffice Calc, which also supports VBA. ExcelAutomat provides an interface through the spreadsheet to automate repetitive tasks such as merging input files, splitting, parsing and compiling data from output files, and generation of unique filenames. Selected extracted parameters can be retrieved as variables which can be included in custom codes for a tailored approach. ExcelAutomat works with Gaussian files and is adapted for use with other computational packages including the non-commercial GAMESS. ExcelAutomat is available as a downloadable MS Excel workbook or as a LibreOffice workbook.
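The parse-and-compile pattern ExcelAutomat automates in VBA looks roughly like this in Python; the "SCF Done:" line is the usual Gaussian output convention, and the folder layout and filenames are assumptions:

```python
# Scan a folder of Gaussian .log files, pull the final SCF energy from
# each, and write a spreadsheet-ready CSV of results.
import csv, glob, re

PATTERN = re.compile(r"SCF Done:\s+E\(\S+\)\s+=\s+(-?\d+\.\d+)")

rows = []
for path in sorted(glob.glob("*.log")):
    with open(path) as f:
        energies = PATTERN.findall(f.read())
    if energies:
        rows.append([path, float(energies[-1])])  # last SCF energy in file

with open("energies.csv", "w", newline="") as out:
    writer = csv.writer(out)
    writer.writerow(["file", "scf_energy_hartree"])
    writer.writerows(rows)
```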
A Spreadsheet for the Mixing of a Row of Jets with a Confined Crossflow
NASA Technical Reports Server (NTRS)
Holderman, J. D.; Smith, T. D.; Clisset, J. R.; Lear, W. E.
2005-01-01
An interactive computer code, written with a readily available software program, Microsoft Excel (Microsoft Corporation, Redmond, WA), is presented which displays 3-D oblique plots of a conserved scalar distribution downstream of jets mixing with a confined crossflow, for a single row, double rows, or opposed rows of jets with or without flow area convergence and/or a non-uniform crossflow scalar distribution. This project used a previously developed empirical model of jets mixing in a confined crossflow to create a Microsoft Excel spreadsheet that can output the profiles of a conserved scalar for jets injected into a confined crossflow given several input variables. The program uses multiple spreadsheets in a single Microsoft Excel notebook to carry out the modeling. The first sheet contains the main program, controls for the type of problem to be solved, and convergence criteria. The first sheet also provides for input of the specific geometry and flow conditions. The second sheet presents the results calculated with this routine to show the effects on the mixing of varying flow and geometric parameters. Comparisons are also made between results from the version of the empirical correlations implemented in the spreadsheet and the versions originally written in Applesoft BASIC (Apple Computer, Cupertino, CA) in the 1980s.
A Spreadsheet for the Mixing of a Row of Jets with a Confined Crossflow. Supplement
NASA Technical Reports Server (NTRS)
Holderman, J. D.; Smith, T. D.; Clisset, J. R.; Lear, W. E.
2005-01-01
An interactive computer code, written with a readily available software program, Microsoft Excel (Microsoft Corporation, Redmond, WA), is presented which displays 3-D oblique plots of a conserved scalar distribution downstream of jets mixing with a confined crossflow, for a single row, double rows, or opposed rows of jets with or without flow area convergence and/or a non-uniform crossflow scalar distribution. This project used a previously developed empirical model of jets mixing in a confined crossflow to create a Microsoft Excel spreadsheet that can output the profiles of a conserved scalar for jets injected into a confined crossflow given several input variables. The program uses multiple spreadsheets in a single Microsoft Excel notebook to carry out the modeling. The first sheet contains the main program, controls for the type of problem to be solved, and convergence criteria. The first sheet also provides for input of the specific geometry and flow conditions. The second sheet presents the results calculated with this routine to show the effects on the mixing of varying flow and geometric parameters. Comparisons are also made between results from the version of the empirical correlations implemented in the spreadsheet and the versions originally written in Applesoft BASIC (Apple Computer, Cupertino, CA) in the 1980s.
ExcelAutomat: a tool for systematic processing of files as applied to quantum chemical calculations.
Laloo, Jalal Z A; Laloo, Nassirah; Rhyman, Lydia; Ramasami, Ponnadurai
2017-07-01
The processing of the input and output files of quantum chemical calculations often necessitates a spreadsheet as a key component of the workflow. Spreadsheet packages with a built-in programming language editor can automate the steps involved and thus provide a direct link between processing files and the spreadsheet. This helps to reduce user intervention as well as the need to switch between different programs to carry out each step. The ExcelAutomat tool is the implementation of this method in Microsoft Excel (MS Excel) using the default Visual Basic for Applications (VBA) programming language. The code in ExcelAutomat was adapted to work with the platform-independent open-source LibreOffice Calc, which also supports VBA. ExcelAutomat provides an interface through the spreadsheet to automate repetitive tasks such as merging input files, splitting, parsing and compiling data from output files, and generation of unique filenames. Selected extracted parameters can be retrieved as variables which can be included in custom codes for a tailored approach. ExcelAutomat works with Gaussian files and is adapted for use with other computational packages including the non-commercial GAMESS. ExcelAutomat is available as a downloadable MS Excel workbook or as a LibreOffice workbook.
The program LOPT for least-squares optimization of energy levels
NASA Astrophysics Data System (ADS)
Kramida, A. E.
2011-02-01
The article describes a program that solves the least-squares optimization problem for finding the energy levels of a quantum-mechanical system based on a set of measured energy separations or wavelengths of transitions between those energy levels, as well as determining the Ritz wavelengths of transitions and their uncertainties. The energy levels are determined by solving the matrix equation of the problem, and the uncertainties of the Ritz wavenumbers are determined from the covariance matrix of the problem. Program summary: Program title: LOPT. Catalogue identifier: AEHM_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEHM_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 19 254. No. of bytes in distributed program, including test data, etc.: 427 839. Distribution format: tar.gz. Programming language: Perl v.5. Computer: PC, Mac, Unix workstations. Operating system: MS Windows (XP, Vista, 7), Mac OS X, Linux, Unix (AIX). RAM: 3 Mwords or more. Word size: 32 or 64. Classification: 2.2. Nature of problem: the least-squares energy-level optimization problem, i.e., finding a set of energy level values that best fits the given set of transition intervals. Solution method: the solution of the least-squares problem is found by solving the corresponding linear matrix equation, where the matrix is constructed using a new method with variable substitution. Restrictions: a practical limitation on the size of the problem N is imposed by the execution time, which scales as N³ and depends on the computer. Unusual features: properly rounds the resulting data and formats the output in a format suitable for viewing with spreadsheet editing software; estimates numerical errors resulting from the limited machine precision. Running time: 1 s for N=100, or 60 s for N=400 on a typical PC.
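Stripped of LOPT's variable-substitution refinements, the underlying weighted least-squares problem can be sketched in a few lines; the transition list below is invented toy data:

```python
# Given measured wavenumbers sigma for transitions between levels, find the
# level energies E minimizing sum(((E_up - E_low - sigma) / unc)^2), with
# the ground level fixed at E = 0.
import numpy as np

# (upper level, lower level, wavenumber cm^-1, uncertainty) -- toy data
lines = [(1, 0, 100.0, 0.01), (2, 1, 150.02, 0.02), (2, 0, 250.0, 0.01)]
n_levels = 3

A = np.zeros((len(lines), n_levels - 1))  # unknowns are E1..E(n-1)
b = np.zeros(len(lines))
for row, (up, low, sigma, unc) in enumerate(lines):
    if up > 0:
        A[row, up - 1] += 1.0 / unc       # weighted design matrix
    if low > 0:
        A[row, low - 1] -= 1.0 / unc
    b[row] = sigma / unc

E, *_ = np.linalg.lstsq(A, b, rcond=None)
print("Optimized level energies (cm^-1):", np.round(E, 4))
```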
Organizational Linkages: Understanding the Productivity Paradox,
1994-01-01
students were asked to make a decision regarding a production scheduling. Some used a Lotus spreadsheet’s what-if capacity, which enabled them to...the degree to which managers and MBA students believed that they make better decisions using what-if spreadsheet models, despite the fact that their...for this system is Naylor et al.’s (1980) view of behavior in organizations. When Pritchard and his students (Pritchard et al., 1988) applied this
NASA Astrophysics Data System (ADS)
Fauzi, Ahmad
2017-11-01
Numerical computation has many pedagogical advantages: it develops analytical and problem-solving skills, helps students learn through visualization, and enhances physics education. Unfortunately, numerical computation is not taught to undergraduate physics education students in Indonesia. Incorporating numerical computation into the undergraduate physics education curriculum presents many challenges, chief among them a dense curriculum that makes it difficult to add a new numerical computation course and the fact that most students have no programming experience. In this research, we used a case study to examine how to integrate numerical computation into the undergraduate physics education curriculum. The participants were 54 fourth-semester students of the physics education department. We concluded that numerical computation can be integrated into the undergraduate physics education curriculum using Excel spreadsheets combined with another course. The results of this research complement studies on how to integrate numerical computation into physics learning using Excel spreadsheets.
NASA Astrophysics Data System (ADS)
Sokolova, Tatiana S.; Dorogokupets, Peter I.; Dymshits, Anna M.; Danilov, Boris S.; Litasov, Konstantin D.
2016-09-01
We present Microsoft Excel spreadsheets for calculation of thermodynamic functions and P-V-T properties of MgO, diamond and 9 metals, Al, Cu, Ag, Au, Pt, Nb, Ta, Mo, and W, depending on temperature and volume or temperature and pressure. The spreadsheets include the most common pressure markers used in in situ experiments with diamond anvil cell and multianvil techniques. The calculations are based on the equation of state formalism via the Helmholtz free energy. The program was developed using Visual Basic for Applications in Microsoft Excel and is a time-efficient tool to evaluate volume, pressure and other thermodynamic functions using T-P and T-V data only as input parameters. This application is aimed to solve practical issues of high pressure experiments in geosciences and mineral physics.
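A sketch of the kind of equation-of-state arithmetic such spreadsheets automate. The paper works through the Helmholtz free energy; as a simplified stand-in, the third-order Birch-Murnaghan isothermal EOS below is commonly used for pressure markers, and the K0, K0p, and V0 values are rough illustrative numbers for MgO, not the authors' calibration.

```python
# Sketch: third-order Birch-Murnaghan isothermal EOS, a common ingredient of
# pressure-marker calculations (illustrative MgO-like parameters).
def birch_murnaghan_pressure(V, V0=11.25, K0=160.0, K0p=4.0):
    """Pressure in GPa at molar volume V (cm^3/mol, same units as V0)."""
    x = (V0 / V) ** (1.0 / 3.0)
    return 1.5 * K0 * (x**7 - x**5) * (1.0 + 0.75 * (K0p - 4.0) * (x**2 - 1.0))

def volume_at_pressure(P_target, V0=11.25):
    """Invert P(V) by bisection; P decreases monotonically as V grows."""
    a, b = 0.5 * V0, V0                  # bracket: highly compressed .. unstressed
    for _ in range(60):
        m = 0.5 * (a + b)
        if birch_murnaghan_pressure(m) > P_target:
            a = m                        # P too high -> solution lies at larger volume
        else:
            b = m
    return 0.5 * (a + b)

V20 = volume_at_pressure(20.0)           # volume at 20 GPa
print(f"V(20 GPa) = {V20:.3f} cm^3/mol, check P = {birch_murnaghan_pressure(V20):.2f} GPa")
```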
Marcot, Bruce G.; Jorgenson, M. Torre; DeGange, Anthony R.
2014-01-01
5. A Canon® Rebel 3Ti with a Sigma zoom lens (18–200 mm focal length). The Drift® HD-170 and GoPro® Hero3 cameras were secured to the struts and underwing for nadir (direct downward) imaging. The Panasonic® and Canon® cameras were each hand-held for oblique-angle landscape images, shooting through the airplanes' windows, targeting both general landscape conditions as well as landscape features of special interest, such as tundra fire scars and landslips. The Drift® and GoPro® cameras each were set for time-lapse photography at 5-second intervals for overlapping coverage. Photographs from all cameras (100 percent .jpg format) were date- and time-synchronized to global positioning system (GPS) waypoints taken during the flights, also at 5-second intervals, providing precise geotagging (latitude-longitude) of all files. All photographs were adjusted for color saturation and gamma, and nadir photographs were corrected for lens distortion for the Drift® and GoPro® cameras' 170° wide-angle distortion. EXIF (exchangeable image file format) data on camera settings and geotagging were extracted into spreadsheet databases. An additional 1 hour, 20 minutes, and 43 seconds of high-resolution videos were recorded at 60 frames per second with the GoPro® camera along selected transect segments, and also were image-adjusted and corrected for lens distortion. Geotagged locations of 12,395 nadir photographs from the Drift® and GoPro® cameras were overlaid in a geographic information system (ArcMap 10.0) onto a map of 44 ecotypes (land- and water-cover types) of the Arctic Network study area. Presence and area of each ecotype occurring within a geographic information system window centered on the location of each photograph were recorded and included in the spreadsheet databases. All original and adjusted photographs, videos, GPS flight tracks, and photograph databases are available by contacting ascweb@usgs.gov.
TagDigger: user-friendly extraction of read counts from GBS and RAD-seq data.
Clark, Lindsay V; Sacks, Erik J
2016-01-01
In genotyping-by-sequencing (GBS) and restriction site-associated DNA sequencing (RAD-seq), read depth is important for assessing the quality of genotype calls and estimating allele dosage in polyploids. However, existing pipelines for GBS and RAD-seq do not provide read counts in formats that are both accurate and easy to access. Additionally, although existing pipelines allow previously mined SNPs to be genotyped on new samples, they do not allow the user to manually specify a subset of loci to examine. Pipelines that do not use a reference genome assign arbitrary names to SNPs, making meta-analysis across projects difficult. We created the software TagDigger, which includes three programs for analyzing GBS and RAD-seq data. The first script, tagdigger_interactive.py, rapidly extracts read counts and genotypes from FASTQ files using user-supplied sets of barcodes and tags. Input and output are in CSV format so that the files can be opened by spreadsheet software. Tag sequences can also be imported from the Stacks, TASSEL-GBSv2, TASSEL-UNEAK, or pyRAD pipelines, and a separate file can be imported listing the names of markers to retain. A second script, tag_manager.py, consolidates marker names and sequences across multiple projects. A third script, barcode_splitter.py, assists with preparing FASTQ data for deposit in a public archive by splitting FASTQ files by barcode and generating MD5 checksums for the resulting files. TagDigger is open-source and freely available software written in Python 3. It uses a scalable, rapid search algorithm that can process over 100 million FASTQ reads per hour. TagDigger will run on a laptop with any operating system, does not consume hard drive space with intermediate files, and does not require programming skill to use.
An Australian stocks and flows model for asbestos.
Donovan, Sally; Pickin, Joe
2016-10-01
All available data on asbestos consumption in Australia were collated in order to determine the most common asbestos-containing materials remaining in the built environment. The proportion of asbestos contained within each material and the types of products these materials are most commonly found in were also determined. The lifetime of these asbestos-containing products was estimated in order to develop a model that projects stocks and flows of asbestos products in Australia through to the year 2100. The model is based on a Weibull distribution and was built in an Excel spreadsheet to make it user-friendly and accessible. The nature of the products under consideration means both their asbestos content and lifetime parameters are highly variable, and so for each of these a high and low estimate is presented along with the estimate used in the model. The user is able to vary the parameters in the model as better data become available. © The Author(s) 2016.
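A sketch of the Weibull-lifetime stock-and-flow logic the abstract describes, assuming hypothetical inflows and lifetime parameters (the paper's own estimates are not reproduced here): material installed in year y survives to year t with probability exp(-((t-y)/scale)^shape), the stock is the survivorship-weighted sum of past inflows, and the annual outflow is the year-on-year loss.

```python
# Weibull stock-and-flow sketch (toy inflow series and lifetime parameters).
import math

shape, scale = 2.5, 50.0                           # hypothetical Weibull lifetime (years)
inflow = {y: 1000.0 for y in range(1950, 1988)}    # tonnes installed per year (toy data)

def survival(age_years):
    """Probability that material installed age_years ago is still in place."""
    return math.exp(-((age_years / scale) ** shape))

def stock(year):
    """Survivorship-weighted sum of all past inflows."""
    return sum(q * survival(year - y) for y, q in inflow.items() if y <= year)

for year in range(2020, 2101, 20):
    outflow = stock(year - 1) - stock(year)        # waste arising in that year
    print(f"{year}: stock {stock(year):9.0f} t, annual outflow {outflow:7.0f} t")
```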
NASA Astrophysics Data System (ADS)
Brown, L.; Syed, B.; Jarvis, S. C.; Sneath, R. W.; Phillips, V. R.; Goulding, K. W. T.; Li, C.
A mechanistic model of N2O emission from agricultural soil (DeNitrification-DeComposition, DNDC) was modified for application to the UK, and was used as the basis of an inventory of N2O emission from UK agriculture in 1990. UK-specific input data were added to DNDC's database and the ability to simulate daily C and N inputs from grazing animals and applied animal waste was added to the model. The UK version of the model, UK-DNDC, simulated emissions from 18 different crop types on the 3 areally dominant soils in each county. Validation of the model at the field scale showed that predictions matched observations well. Emission factors for the inventory were calculated from estimates of N2O emission from UK-DNDC, in order to maintain direct comparability with the IPCC approach. These, along with activity data, were included in a transparent spreadsheet format. Using UK-DNDC, the estimate of N2O-N emission from UK current agricultural practice in 1990 was 50.9 Gg. This total comprised 31.7 Gg from the soil sector, 5.9 Gg from animals and 13.2 Gg from the indirect sector. The range of this estimate (using the range of soil organic C for each soil used) was 30.5-62.5 Gg N. Estimates of emissions in each sector were compared to those calculated using the IPCC default methodology. Emissions from the soil and indirect sectors were smaller with the UK-DNDC approach than with the IPCC methodology, while emissions from the animal sector were larger. The model runs suggested a relatively large emission from agricultural land that was not attributable to current agricultural practices (33.8 Gg in total, 27.4 Gg from the soil sector). This 'background' component is partly the result of historical agricultural land use. It is not normally included in inventories of emission, but would increase the total emission of N2O-N from agricultural land in 1990 to 78.3 Gg.
Basic statistics with Microsoft Excel: a review.
Divisi, Duilio; Di Leonardo, Gabriella; Zaccagna, Gino; Crisci, Roberto
2017-06-01
The scientific world is enriched daily with new knowledge, due to new technologies and continuous discoveries. Mathematical functions explain statistical concepts, particularly mean, median, and mode, along with frequency and frequency distribution, which are associated with histograms and graphical representations and determine elaborative processes on the basis of spreadsheet operations. The aim of the study is to highlight the mathematical basis of the statistical models that regulate the operation of spreadsheets in Microsoft Excel.
Basic statistics with Microsoft Excel: a review
Divisi, Duilio; Di Leonardo, Gabriella; Zaccagna, Gino; Crisci, Roberto
2017-01-01
The scientific world is enriched daily with new knowledge, due to new technologies and continuous discoveries. Mathematical functions explain statistical concepts, particularly mean, median, and mode, along with frequency and frequency distribution, which are associated with histograms and graphical representations and determine elaborative processes on the basis of spreadsheet operations. The aim of the study is to highlight the mathematical basis of the statistical models that regulate the operation of spreadsheets in Microsoft Excel. PMID:28740690
2017-09-01
Fragmentary figure-list residue from the source report; the recoverable information is: Figure 58, "Click on Menu bar and find 'View,' then click on 'Macros'; click on Run"; Figure 59, "Top view of XML spreadsheet with data." The surrounding fragments concern simple techniques for creating scanner-readable marks on parts such as engines, helicopter rotors, and turbine blades.
Small-Caliber Projectile Target Impact Angle Determined From Close Proximity Radiographs
2006-10-01
discrete motion data that can be numerically modeled using linear aerodynamic theory or 6-degrees-of-freedom equations of motion. The values of Fφ...Prediction Excel® Spreadsheet shown in figure 9. The Gamma at Impact Spreadsheet uses the linear aerodynamics model, equations 5 and 6, to calculate αT...trajectory angle error via consideration of the RMS fit errors of the actual firings. However, the linear aerodynamics model does not include this effect
A retention index calculator simplifies identification of plant volatile organic compounds.
Lucero, Mary; Estell, Rick; Tellez, María; Fredrickson, Ed
2009-01-01
Plant volatiles (PVOCs) are important targets for studies in natural products, chemotaxonomy and biochemical ecology. The complexity of PVOC profiles often limits research to studies targeting only easily identified compounds. With the availability of mass spectral libraries and the recent growth of retention index (RI) libraries, PVOC identification can be achieved using only gas chromatography coupled to mass spectrometry (GCMS). However, RI library searching is not typically automated, and until recently, RI libraries were both limited in scope and costly to obtain. The objective was to automate the RI calculation and lookup functions commonly utilised in PVOC analysis. Formulae required for calculating retention indices from retention time data were placed in a spreadsheet along with lookup functions and a retention index library. Retention times obtained from GCMS analysis of alkane standards and Koeberlinia spinosa essential oil were entered into the spreadsheet to determine retention indices. Indices were used in combination with mass spectral analysis to identify compounds contained in Koeberlinia spinosa essential oil. Eighteen compounds were positively identified. Total oil yield was low, with only 5 ppm in purple berries. The most abundant compounds were octen-3-ol and methyl salicylate. The spreadsheet accurately calculated RIs of the detected compounds. The downloadable spreadsheet tool developed for this study provides a calculator and RI library that works in conjunction with GCMS or other analytical techniques to identify PVOCs in plant extracts.
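The core lookup such a spreadsheet automates can be sketched with the standard linear (van den Dool and Kratz) retention-index formula for temperature-programmed GC; the alkane retention times and the query below are hypothetical, not values from the study.

```python
# Sketch of the linear retention-index calculation (hypothetical alkane ladder).
alkanes = {9: 5.12, 10: 7.34, 11: 9.61, 12: 11.83}   # carbon number -> retention time (min)

def retention_index(t_x):
    """Linear RI: 100 * (n + (t_x - t_n) / (t_{n+1} - t_n)) for the bracketing alkanes."""
    carbons = sorted(alkanes)
    for n, n1 in zip(carbons, carbons[1:]):
        t_n, t_n1 = alkanes[n], alkanes[n1]
        if t_n <= t_x <= t_n1:
            return 100.0 * (n + (t_x - t_n) / (t_n1 - t_n))
    raise ValueError("retention time outside the alkane ladder")

print(f"RI = {retention_index(8.90):.0f}")   # the index is then matched against
                                             # a reference RI library entry
```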
Improvements to the Magnetics Information Consortium (MagIC) Paleo and Rock Magnetic Database
NASA Astrophysics Data System (ADS)
Jarboe, N.; Minnett, R.; Tauxe, L.; Koppers, A. A. P.; Constable, C.; Jonestrask, L.
2015-12-01
The Magnetic Information Consortium (MagIC) database (http://earthref.org/MagIC/) continues to improve the ease of data uploading and editing, the creation of complex searches, data visualization, and data downloads for the paleomagnetic, geomagnetic, and rock magnetic communities. Online data editing is now available, and the need for proprietary spreadsheet software is therefore entirely negated. The data owner can change values in the database or delete entries through an HTML 5 web interface that resembles typical spreadsheets in behavior and use. Additive uploading now allows additions to data sets to be uploaded with a simple drag-and-drop interface. Searching the database has improved with the addition of more sophisticated search parameters and with the facility to use them in complex combinations. A comprehensive summary view of a search result has been added for quick data comprehension, while a raw data view is available if one desires to see all data columns as stored in the database. Data visualization plots (ARAI, equal area, demagnetization, Zijderveld, etc.) are presented with the data when appropriate to aid the user in understanding the dataset. MagIC data associated with individual contributions or from online searches may be downloaded in the tab-delimited MagIC text file format for subsequent offline use and analysis. With input from the paleomagnetic, geomagnetic, and rock magnetic communities, the MagIC database will continue to improve as a data warehouse and resource.
A web-based relational database for monitoring and analyzing mosquito population dynamics.
Sucaet, Yves; Van Hemert, John; Tucker, Brad; Bartholomay, Lyric
2008-07-01
Mosquito population dynamics have been monitored on an annual basis in the state of Iowa since 1969. The primary goal of this project was to integrate light trap data from these efforts into a centralized back-end database and interactive website that is available through the internet at http://iowa-mosquito.ent.iastate.edu. For comparative purposes, all data were categorized according to the week of the year and normalized according to the number of traps running. Users can readily view current, weekly mosquito abundance compared with data from previous years. Additional interactive capabilities facilitate analyses of the data based on mosquito species, distribution, or a time frame of interest. All data can be viewed in graphical and tabular format and can be downloaded to a comma separated value (CSV) file for import into a spreadsheet or more specialized statistical software package. Having this long-term dataset in a centralized database/website is useful for informing mosquito and mosquito-borne disease control and for exploring the ecology of the species represented therein. In addition to mosquito population dynamics, this database is available as a standardized platform that could be modified and applied to a multitude of projects that involve repeated collection of observational data. The development and implementation of this tool provides capacity for the user to mine data from standard spreadsheets into a relational database and then view and query the data in an interactive website.
Tree Height Calculator: An Android App for Estimating Tree Height
NASA Astrophysics Data System (ADS)
Burca, V. S.; Htet, N. M.; Huang, X.; de Lanerolle, T. R.; Morelli, R.; Gourley, J. R.
2011-12-01
Conventionally, measuring tree height requires a collection of different tools: clinometer, transit, pencil, paper, laptop computer. Results are recorded manually and entered into a spreadsheet or database for later calculation and analysis. Tree Height Calculator is a mobile Android app that integrates the various steps in this process, thereby improving the accuracy and dramatically reducing the time required to go from taking measurements to analyzing data. Given the user's height and the distance from the base of the tree (which can be downloaded into the app from a server), the app uses the phone's orientation sensor to calculate the angle of elevation. A simple trigonometric formula is then used to calculate and record the tree's height in the phone's database. When the phone has a WiFi connection, the data are transmitted to a server, from where they can be downloaded directly into a spreadsheet. The application was first tested in an Environmental Science laboratory at Trinity College. On the first trial, 103 data samples were collected, stored, and uploaded to the online database with only a couple of dropped data points. On the second trial, 98 data samples were gathered with no loss of data. The app combined the individual measurements taken by the students in the lab, reducing the time required to produce a graph of the class's results from days to hours.
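The "simple trigonometric formula" is presumably of the following form: tree height equals the observer's eye height plus distance times the tangent of the elevation angle. A sketch with hypothetical values (the app's exact handling of eye height and sensor smoothing is not described in the abstract):

```python
# Sketch of the app's core trigonometry (hypothetical inputs).
import math

def tree_height(distance_m, elevation_deg, eye_height_m=1.7):
    """Height of the tree from distance to its base and elevation angle to its top."""
    return eye_height_m + distance_m * math.tan(math.radians(elevation_deg))

print(f"{tree_height(20.0, 35.0):.1f} m")   # 20 m away, 35 degrees up -> ~15.7 m
```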
NASA Astrophysics Data System (ADS)
Le Roux, Jacobus P.; Demirbilek, Zeki; Brodalka, Marysia; Flemming, Burghard W.
2010-10-01
The generation and growth of waves in deep water is controlled by winds blowing over the sea surface. In fully developed sea states, where winds and waves are in equilibrium, wave parameters may be calculated directly from the wind velocity. We provide an Excel spreadsheet to compute the wave period, length, height and celerity, as well as horizontal and vertical particle velocities for any water depth, bottom slope, and distance below the reference water level. The wave profile and propagation can also be visualized for any water depth, modeling the sea surface change from sinusoidal to trochoidal and finally cnoidal profiles into shallow water. Bedload entrainment is estimated under both the wave crest and the trough, using the horizontal water particle velocity at the top of the boundary layer. The calculations are programmed in an Excel file called WAVECALC, which is available online to authorized users. Although many of the recently published formulas are based on theoretical arguments, the values agree well with several existing theories and limited field and laboratory observations. WAVECALC is a user-friendly program intended for sedimentologists, coastal engineers and oceanographers, as well as marine ecologists and biologists. It provides a rapid means to calculate many wave characteristics required in coastal and shallow marine studies, and can also serve as an educational tool.
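A sketch of the kind of computation such a spreadsheet performs: the linear (Airy) dispersion relation L = (g*T**2 / (2*pi)) * tanh(2*pi*d / L), solved by fixed-point iteration to obtain wavelength and celerity at finite depth. WAVECALC's published formulas may differ; the period and depth below are arbitrary.

```python
# Linear-wave dispersion relation solved by fixed-point iteration
# (a generic Airy-theory sketch, not necessarily WAVECALC's exact formulas).
import math

def wavelength(T, d, g=9.81, tol=1e-10):
    """Wavelength (m) for period T (s) at water depth d (m)."""
    L0 = g * T**2 / (2 * math.pi)            # deep-water wavelength
    L = L0
    for _ in range(200):                      # converges for typical inputs
        L_new = L0 * math.tanh(2 * math.pi * d / L)
        if abs(L_new - L) < tol:
            break
        L = L_new
    return L

T, d = 8.0, 5.0                               # period (s), depth (m)
L = wavelength(T, d)
print(f"L = {L:.2f} m, celerity C = {L/T:.2f} m/s")
```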
2012-01-01
RECS relies on actual records from energy suppliers to produce robust survey estimates of household energy consumption and expenditures. During the RECS Energy Supplier Survey (ESS), energy billing records are collected from the companies that supply electricity, natural gas, fuel oil/kerosene, and propane (LPG) to the interviewed households. As Federal agencies expand the use of administrative records to enhance, replace, or evaluate survey data, EIA has explored more flexible, reliable and efficient techniques to collect energy billing records. The ESS has historically been a mail-administered survey, but EIA introduced web data collection with the 2009 RECS ESS. In that survey, energy suppliers self-selected their reporting mode among several options: standardized paper form, on-line fillable form or spreadsheet, or failing all else, a nonstandard format of their choosing. In this paper, EIA describes where reporting mode appears to influence the data quality. We detail the reporting modes, the embedded and post-hoc quality control and consistency checks that were performed, the extent of detectable errors, and the methods used for correcting data errors. We explore by mode the levels of unit and item nonresponse, number of errors, and corrections made to the data. In summary, we find notable differences in data quality between modes and analyze where the benefits of offering these new modes outweigh the "costs".
A generic model for evaluating payor net cost savings from a disease management program.
McKay, Niccie L
2006-01-01
Private and public payors increasingly are turning to disease management programs as a means of improving the quality of care provided and controlling expenditures for individuals with specific medical conditions. This article presents a generic model that can be adapted to evaluate payor net cost savings from a variety of types of disease management programs, with net cost savings taking into account both changes in expenditures resulting from the program and the costs of setting up and operating the program. The model specifies the required data, describes the data collection process, and shows how to calculate the net cost savings in a spreadsheet format. An accompanying hypothetical example illustrates how to use the model.
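The model's bottom line is simple arithmetic: net savings equal the expenditure reduction attributable to the program minus its setup and operating costs. A sketch with hypothetical figures:

```python
# Bottom-line arithmetic of a payor net-cost-savings evaluation
# (all figures hypothetical).
baseline_expenditure = 4_200_000   # expected spend without the program
program_expenditure  = 3_750_000   # actual spend with the program
setup_cost           =   120_000
operating_cost       =   260_000

net_savings = (baseline_expenditure - program_expenditure) - (setup_cost + operating_cost)
print(f"Payor net cost savings: ${net_savings:,}")   # $70,000
```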
NASA Technical Reports Server (NTRS)
Roberts, Floyd E., III
1994-01-01
Software provides for control and acquisition of data from optical pyrometer. There are six individual programs in PYROLASER package. Provides quick and easy way to set up, control, and program standard Pyrolaser. Temperature and emissivity measurements either collected as if Pyrolaser in manual operating mode or displayed on real-time strip charts and stored in standard spreadsheet format for posttest analysis. Shell supplied to allow test-specific macros to be added to system easily. Written using Labview software for use on Macintosh-series computers running System 6.0.3 or later, Sun Sparc-series computers running OpenWindows 3.0 or MIT's X Window System (X11R4 or X11R5), and IBM PC or compatible computers running Microsoft Windows 3.1 or later.
Computer-based Astronomy Labs for Non-science Majors
NASA Astrophysics Data System (ADS)
Smith, A. B. E.; Murray, S. D.; Ward, R. A.
1998-12-01
We describe and demonstrate two laboratory exercises, Kepler's Third Law and Stellar Structure, which are being developed for use in an astronomy laboratory class aimed at non-science majors. The labs run with Microsoft's Excel 98 (Macintosh) or Excel 97 (Windows). They can be run in a classroom setting or in an independent learning environment. The intent of the labs is twofold; first and foremost, students learn the subject matter through a series of informational frames. Next, students enhance their understanding by applying their knowledge in lab procedures, while also gaining familiarity with the use and power of a widely-used software package and scientific tool. No mathematical knowledge beyond basic algebra is required to complete the labs or to understand the computations in the spreadsheets, although the students are exposed to the concepts of numerical integration. The labs are contained in Excel workbook files. In the files are multiple spreadsheets, which contain either a frame with information on how to run the lab, material on the subject, or one or more procedures. Excel's VBA macro language is used to automate the labs. The macros are accessed through button interfaces positioned on the spreadsheets. This is done intentionally so that students can focus on learning the subject matter and the basic spreadsheet features without having to learn advanced Excel features all at once. Students open the file and progress through the informational frames to the procedures. After each procedure, student comments and data are automatically recorded in a preformatted Lab Report spreadsheet. Once all procedures have been completed, the student is prompted for a filename in which to save their Lab Report. The lab reports can then be printed or emailed to the instructor. The files will have full worksheet and workbook protection, and will have a "redo" feature at the end of the lab for students who want to repeat a procedure.
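For the Kepler's Third Law lab, the underlying relation reduces to P**2 = a**3 in units of years and astronomical units. A minimal version of the check such a spreadsheet walks students through (the semimajor axes are standard planetary values):

```python
# Kepler's third law in solar units: P = a**1.5 with a in AU and P in years.
planets = {"Venus": 0.723, "Earth": 1.000, "Mars": 1.524, "Jupiter": 5.203}

for name, a in planets.items():
    period = a ** 1.5                        # P = a^(3/2)
    print(f"{name}: a = {a} AU -> P = {period:.2f} yr")
```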
The EnzymeTracker: an open-source laboratory information management system for sample tracking.
Triplet, Thomas; Butler, Gregory
2012-01-26
In many laboratories, researchers store experimental data on their own workstations using spreadsheets. However, this approach poses a number of problems, ranging from sharing issues to inefficient data-mining. Standard spreadsheets are also error-prone, as data do not undergo any validation process. To overcome spreadsheets' inherent limitations, a number of proprietary systems have been developed, for which laboratories need to pay expensive license fees. Those costs are usually prohibitive for most laboratories and prevent scientists from benefiting from more sophisticated data management systems. In this paper, we propose the EnzymeTracker, a web-based laboratory information management system for sample tracking, as an open-source and flexible alternative that aims at facilitating entry, mining and sharing of experimental biological data. The EnzymeTracker features online spreadsheets and tools for monitoring numerous experiments conducted by several collaborators to identify and characterize samples. It also provides libraries of shared data such as protocols, and administration tools for data access control using OpenID and user/team management. Our system relies on a database management system for efficient data indexing and management and a user-friendly AJAX interface that can be accessed over the Internet. The EnzymeTracker facilitates data entry by dynamically suggesting entries and providing smart data-mining tools to effectively retrieve data. Our system features a number of tools to visualize and annotate experimental data, and export highly customizable reports. It also supports QR matrix barcoding to facilitate sample tracking. The EnzymeTracker was designed to be easy to use and offers many benefits over spreadsheets, thus presenting the characteristics required to facilitate acceptance by the scientific community. It has been successfully used for 20 months on a daily basis by over 50 scientists. The EnzymeTracker is freely available online at http://cubique.fungalgenomics.ca/enzymedb/index.html under the GNU GPLv3 license.
The EnzymeTracker: an open-source laboratory information management system for sample tracking
2012-01-01
Background: In many laboratories, researchers store experimental data on their own workstations using spreadsheets. However, this approach poses a number of problems, ranging from sharing issues to inefficient data-mining. Standard spreadsheets are also error-prone, as data do not undergo any validation process. To overcome spreadsheets' inherent limitations, a number of proprietary systems have been developed, for which laboratories need to pay expensive license fees. Those costs are usually prohibitive for most laboratories and prevent scientists from benefiting from more sophisticated data management systems. Results: In this paper, we propose the EnzymeTracker, a web-based laboratory information management system for sample tracking, as an open-source and flexible alternative that aims at facilitating entry, mining and sharing of experimental biological data. The EnzymeTracker features online spreadsheets and tools for monitoring numerous experiments conducted by several collaborators to identify and characterize samples. It also provides libraries of shared data such as protocols, and administration tools for data access control using OpenID and user/team management. Our system relies on a database management system for efficient data indexing and management and a user-friendly AJAX interface that can be accessed over the Internet. The EnzymeTracker facilitates data entry by dynamically suggesting entries and providing smart data-mining tools to effectively retrieve data. Our system features a number of tools to visualize and annotate experimental data, and export highly customizable reports. It also supports QR matrix barcoding to facilitate sample tracking. Conclusions: The EnzymeTracker was designed to be easy to use and offers many benefits over spreadsheets, thus presenting the characteristics required to facilitate acceptance by the scientific community. It has been successfully used for 20 months on a daily basis by over 50 scientists. The EnzymeTracker is freely available online at http://cubique.fungalgenomics.ca/enzymedb/index.html under the GNU GPLv3 license. PMID:22280360
Electronic collection system for spacelab mission timeline requirements
NASA Technical Reports Server (NTRS)
Lindberg, James P.; Piner, John R.; Huang, Allen K. H.
1995-01-01
This paper describes the Functional Objective Requirements Collection System (FORCS) software tool that has been developed for use by Principal Investigators (PI's) and Payload Element Developers (PED's) on their own personal computers to develop on-orbit timelining requirements for their payloads. The FORCS tool can be used either in a totally stand-alone mode, storing the information in a local file on the user's personal computer hard disk or in a remote mode where the user's computer is linked to a host computer containing the integrated database of the timeline requirements for all of the payloads on a mission. There are a number of features incorporated in the FORCS software to assist the user. The user may move freely back and forth between the various forms for inputting the data. Several methods are used to input the information, depending on the type of the information. These methods range from filling in text boxes, using check boxes and radio buttons, to inputting information into a spreadsheet format. There are automated features provided to assist in developing the proper format for the data, ranging from limit checking on some of the parameters to automatic conversion of different formats of time data inputs to the one standard format used for the timeline scheduling software.
NASA Technical Reports Server (NTRS)
Cole, Stuart K.; Reeves, John D.; Williams-Byrd, Julie A.; Greenberg, Marc; Comstock, Doug; Olds, John R.; Wallace, Jon; DePasquale, Dominic; Schaffer, Mark
2013-01-01
NASA is investing in new technologies across 14 primary technology roadmap areas and aeronautics. Understanding the cost of researching and developing these technologies, and the time it takes to increase their maturity, is important to the support of ongoing and future NASA missions. Overall, technology estimating may help guide technology investment strategies, improve evaluation of technology affordability, and aid decision support. The research provides a summary of the framework development of a Technology Estimating process in which four technology roadmap areas were selected for study. The framework includes definitions of terms, a discussion of narrowing the focus from 14 NASA Technology Roadmap areas to four, and further refinement to include technologies within a TRL range of 2 to 6. Also included in this paper is a discussion of the evaluation of 20 unique technology parameters that were initially identified and then subsequently reduced for use in characterizing these technologies. A discussion of the data acquisition effort and the criteria established for data quality is provided. The findings obtained during the research include the gaps identified and a description of a spreadsheet-based estimating tool initiated as part of the Technology Estimating process.
Space-Plane Spreadsheet Program
NASA Technical Reports Server (NTRS)
Mackall, Dale
1993-01-01
Basic Hypersonic Data and Equations (HYPERDATA) spreadsheet computer program provides data gained from three analyses of performance of space plane. Equations used to perform analyses derived from Newton's second law of physics, derivation included. First analysis is parametric study of some basic factors affecting ability of space plane to reach orbit. Second includes calculation of thickness of spherical fuel tank. Third produces ratio between volume of fuel and total mass for each of various aircraft. HYPERDATA intended for use on Macintosh(R) series computers running Microsoft Excel 3.0.
Social Security: A Present Value Analysis of Old Age Survivors Insurance (OASI) Taxes and Benefits.
1995-12-01
private sector plans and provides a spreadsheet model for making this comparison of plans using different assumptions. The investigation was done by collecting data from various books, Government publications, and various Government agencies to conduct a spreadsheet analysis of three different wage-earning groups, assuming various real interest rates potentially earned in the private sector. A comparison of Social Security with alternative private sector plans is important to the DoD/DoN because less constrained budgets could
Day, Warren C.; Granitto, Matthew
2014-01-01
The U.S. Geological Survey, in cooperation with the Missouri Department of Natural Resources/Missouri Geological Survey, undertook a study from 1988 to 1994 on the iron-oxide deposits and their host Mesoproterozoic igneous rocks in southeastern Missouri. The project resulted in an improvement of our understanding of the geologic setting, mode of formation, and composition of many of the known deposits and prospects and the associated rocks of the St. Francois terrane in Missouri. The goal of this earlier work was to allow comparison of the Missouri iron-oxide deposits with other iron oxide-copper ± uranium (IOCG) types of mineral deposits observed globally. The raw geochemical analyses were released originally through the USGS National Geochemical Database (NGDB, http://mrdata.usgs.gov). The data presented herein offer all of the field notes, locations, rock descriptions, and geochemical analyses in a coherent package to facilitate new research efforts in IOCG deposit types. The data are provided in both Microsoft Excel (Version Office 2010) spreadsheet format (*.xlsx) and MS-DOS text format (*.txt) for ease of use by numerous computer programs.
NASA Astrophysics Data System (ADS)
Dill, Harald G.
2010-06-01
Economic geology is a mixtum compositum of all geoscientific disciplines focused on one goal: finding new mineral deposits and enhancing their exploitation. The keystones of this mixtum compositum are geology and mineralogy, whose studies are centered around the emplacement of the ore body and the development of its minerals and rocks. In the present study, mineralogy and geology act as the x- and y-coordinates of a classification chart of mineral resources called the "chessboard" (or "spreadsheet") classification scheme. Magmatic and sedimentary lithologies together with tectonic structures (1-D/pipes, 2-D/veins) are plotted along the x-axis in the header of the spreadsheet diagram, representing the columns in this chart. 63 commodity groups, encompassing minerals and elements, are plotted along the y-axis, forming the lines of the spreadsheet. These commodities are subjected to a tripartite subdivision into ore minerals, industrial minerals/rocks, and gemstones/ornamental stones. Further information on the various types of mineral deposits (the major ore and gangue minerals, the current models, and the mode of formation, or when and in which geodynamic setting these deposits mainly formed throughout the geological past) may be obtained from the text by simply using the code of each deposit in the chart. This code can be created by combining the commodity (lines), shown by numbers plus lower-case letters, with the host rocks or structure (columns), given by capital letters. Each commodity has a small preface on its mineralogy and chemistry and ends with an outlook on its final use and the supply situation of the raw material on a global basis, which may be updated by the user through a direct link to databases available on the internet. In this case the study has been linked to the commodity database of the US Geological Survey. The internal subdivision of each commodity section corresponds to the common host rock lithologies (magmatic, sedimentary, and metamorphic) and structures. Cross sections and images illustrate the common ore types of each commodity. Ore takes priority over the mineral. The minerals and host rocks are listed by their chemical and mineralogical compositions, respectively, separated from the text but supplemented with cross-references to the columns and lines where they prevalently occur. A metallogenetic-geodynamic overview is given at the bottom of each column in the spreadsheet. It may be taken as the "sum" or the "mean" of a number of geodynamic models and ideas put forward by the various researchers for all the deposits pertaining to a certain clan of lithology or structure. This classical or conservative view of metallotects related to the common plate tectonic settings is supplemented by an approach taken for the first time for such a number of deposits, using the concepts of sequence stratigraphy. This paper is, so to speak, a "launch pad" for a new mindset in metallogenesis rather than the final result. The relationship supergene-hypogene and syngenetic-epigenetic has been the topic of many studies for ages, but keeping them as separate entities is often unworkable in practice, especially in the so-called epithermal or near-surface/shallow deposits. Vein-type and stratiform ore bodies are also generally handled very differently.
To bring these different structural elements (space) and various mineralizing processes (time) together and to allow for forward modeling in mineral exploration, architectural elements of sequence stratigraphy are adapted to mineral resources. Deposits are geological bodies which need accommodation space created by the environment of formation and the tectonic/geodynamic setting through time. They are controlled by horizontal to subhorizontal reference planes and/or vertical structures. Prerequisites for the deposits to evolve are thermal and/or mechanical gradients. Thermal energy is, for most of the settings under consideration, deeply rooted in the mantle. A perspective on how this concept might work is given in the text by a pilot project on mineral deposits in Central Europe, and in the spreadsheet classification scheme by providing a color-coded categorization into: 1. mineralization mainly related to planar architectural elements, e.g., subaerial sequence boundaries and unconformities; 2. mineralization mainly related to planar architectural elements, e.g., submarine sequence boundaries, transgressive surfaces, and maximum flooding zones/surfaces; 3. mineralization mainly controlled by system tracts (lowstand system tracts, transgressive system tracts, highstand system tracts); 4. mineralization of subvolcanic or intermediate level, to be correlated with the architectural elements of basin evolution; 5. mineralization of deep level, to be correlated with the deep-seated structural elements. Several squares on the chessboard are left blank, mainly for lack of information on the sequence stratigraphy of mineral deposits; this method has not yet found many users in mineral exploration. This review is designed as an "interactive paper," open for amendments in the electronic spreadsheet version and adjustable to the needs and wants of application, research, and training in geosciences. Metamorphic host rock lithologies and commodities are addressed by different color codes in the chessboard classification scheme.
Emissions & Generation Resource Integrated Database (eGRID), eGRID2002 (with years 1996 - 2000 data)
The Emissions & Generation Resource Integrated Database (eGRID) is a comprehensive source of data on the environmental characteristics of almost all electric power generated in the United States. These environmental characteristics include air emissions for nitrogen oxides, sulfur dioxide, carbon dioxide, methane, nitrous oxide, and mercury; emissions rates; net generation; resource mix; and many other attributes. eGRID2002 (years 1996 through 2000 data) contains 16 Excel spreadsheets and the Technical Support Document, as well as the eGRID Data Browser, User's Manual, and Readme file. Archived eGRID data can be viewed as spreadsheets or by using the eGRID Data Browser. The eGRID spreadsheets can be manipulated by data users and enables users to view all the data underlying eGRID. The eGRID Data Browser enables users to view key data using powerful search features. Note that the eGRID Data Browser will not run on a Mac-based machine without Windows emulation.
NASA Astrophysics Data System (ADS)
Eso, R.; Safiuddin, L. O.; Agusu, L.; Arfa, L. M. R. F.
2018-04-01
We propose a teaching instrument demonstrating circular membrane waves using interactive Excel spreadsheets with Visual Basic for Applications (VBA) programming. It is based on the analytic solution for circular membrane waves involving Bessel functions. The vibration modes and frequencies are determined using the Bessel approximation and the initial conditions. The 3D perspective based on spreadsheet functions and facilities has been explored to show 3D objects in translational or rotational motion. This instrument is very useful both in teaching activity and in the learning of wave physics. The visualization of waves in the circular membrane, which clearly shows the m and n vibration modes of the wave at a given frequency, has been compared and matched to experimental results obtained with the resonance method. The peak deflection varies in time when a nonzero initial condition is applied, and it follows the same pattern as a Matlab simulation with zero initial velocity.
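The analytic mode shapes behind such a visualization are standard: z(r, theta, t) = J_m(alpha_mn * r / a) * cos(m*theta) * cos(omega_mn * t), where alpha_mn is the n-th zero of the Bessel function J_m and omega_mn = c * alpha_mn / a. A sketch using SciPy rather than VBA (mode indices, radius, and wave speed are illustrative):

```python
# Circular-membrane normal mode sampled on a polar grid (illustrative parameters).
import numpy as np
from scipy.special import jv, jn_zeros

def membrane_mode(r, theta, t, m=1, n=2, a=1.0, c=1.0):
    """Deflection of the (m, n) mode of a clamped circular membrane of radius a."""
    alpha = jn_zeros(m, n)[-1]            # n-th zero of J_m
    omega = c * alpha / a                 # angular frequency of the mode
    return jv(m, alpha * r / a) * np.cos(m * theta) * np.cos(omega * t)

# Sample the (m=1, n=2) mode at t = 0 for plotting or animation.
r, theta = np.meshgrid(np.linspace(0, 1, 50), np.linspace(0, 2 * np.pi, 90))
z = membrane_mode(r, theta, 0.0)
print(z.shape, z.max())
```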
Advanced Technology Lifecycle Analysis System (ATLAS)
NASA Technical Reports Server (NTRS)
O'Neil, Daniel A.; Mankins, John C.
2004-01-01
Developing credible mass and cost estimates for space exploration and development architectures requires multidisciplinary analysis based on physics calculations and parametric estimates derived from historical systems. Within the National Aeronautics and Space Administration (NASA), concurrent engineering environment (CEE) activities integrate discipline-oriented analysis tools through a computer network and accumulate the results of a multidisciplinary analysis team via a centralized database or spreadsheet. Each minute of a design and analysis study within a concurrent engineering environment is expensive due to the size of the team and supporting equipment. The Advanced Technology Lifecycle Analysis System (ATLAS) reduces the cost of architecture analysis by capturing the knowledge of discipline experts in system-oriented spreadsheet models. A framework with a user interface presents a library of system models to an architecture analyst. The analyst selects models of launchers, in-space transportation systems, and excursion vehicles, as well as space and surface infrastructure such as propellant depots, habitats, and solar power satellites. After assembling the architecture from the selected models, the analyst can create a campaign comprised of missions spanning several years. The ATLAS controller passes analyst-specified parameters to the models and data among the models. An integrator workbook calls a history-based parametric cost model to determine the costs. The integrator also estimates the flight rates, launched masses, and architecture benefits over the years of the campaign. An accumulator workbook presents the analytical results in a series of bar graphs. In no way does ATLAS compete with a CEE; instead, ATLAS complements a CEE by ensuring that the time of the experts is well spent. Using ATLAS, an architecture analyst can perform technology sensitivity analysis, study many scenarios, and see the impact of design decisions. When the analyst is satisfied with the system configurations, technology portfolios, and deployment strategies, he or she can present the concepts to a team, which will conduct a detailed, discipline-oriented analysis within a CEE. An analog to this approach is the music industry, where a songwriter creates the lyrics and music before entering a recording studio.
Tritium permeation model for plasma facing components
NASA Astrophysics Data System (ADS)
Longhurst, G. R.
1992-12-01
This report documents the development of a simplified one-dimensional tritium permeation and retention model. The model makes use of the same physical mechanisms as more sophisticated, time-transient codes, such as implantation, recombination, diffusion, trapping and thermal gradient effects. It takes advantage of a number of simplifications and approximations to solve the steady-state problem and then provides interpolating functions to make estimates of intermediate states based on the steady-state solution. The model is developed for solution using commercial spreadsheet software such as Lotus 1-2-3. Comparison calculations with the verified and validated TMAP4 transient code are provided, with good agreement. Results of calculations for the ITER CDA divertor are also included.
Archfield, Stacey A.; Steeves, Peter A.; Guthrie, John D.; Ries, Kernell G.
2013-01-01
Streamflow information is critical for addressing any number of hydrologic problems. Often, streamflow information is needed at locations that are ungauged and, therefore, have no observations on which to base water management decisions. Furthermore, there has been increasing need for daily streamflow time series to manage rivers for both human and ecological functions. To facilitate negotiation between human and ecological demands for water, this paper presents the first publicly available, map-based, regional software tool to estimate historical, unregulated, daily streamflow time series (streamflow not affected by human alteration such as dams or water withdrawals) at any user-selected ungauged river location. The map interface allows users to locate and click on a river location, which then links to a spreadsheet-based program that computes estimates of daily streamflow for the river location selected. For a demonstration region in the northeast United States, daily streamflow was, in general, shown to be reliably estimated by the software tool. Estimating the highest and lowest streamflows that occurred in the demonstration region over the period from 1960 through 2004 also was accomplished but with more difficulty and limitations. The software tool provides a general framework that can be applied to other regions for which daily streamflow estimates are needed.
NASA Astrophysics Data System (ADS)
Tsuzuki, Yoshiaki; Yoneda, Minoru
2011-04-01
Summary: A Social Experiment Program to decrease municipal wastewater pollutant discharge by "soft interventions" in households and to improve river water quality was conducted in the Yamato-gawa River Basin, Japan. Environmental accounting housekeeping (EAH) books of municipal wastewater were prepared, mainly for dissemination purposes, to be applied during the Social Experiment Program. The EAH books are table-format spreadsheets for estimating pollutant discharges. Pollutant load per capita flowing into the water body (PLC_wb) and pollutant runoff yields from sub-river basins to the river mouth are indispensable parameters for their preparation. In order to estimate the runoff yields of the pollutants BOD, TN and TP, a concept of pollutant runoff yield from an upper monitoring point, MP_n, to a lower monitoring point, MP_(n+1) (Rm_n(n+1)), and that from the corresponding sub-river basin (Rd_(n+1)(n+1)) was introduced in this paper. When the proportion of the pollutant runoff yields, p_n (= Rm_n(n+1)/Rd_(n+1)(n+1)), was equal to 1.0 in all the river sections, which was determined based on the simulation results of Rm and Rd, the pollutant runoff yields from sub-river basin n to the monitoring point nearest to the river mouth, Ry_n7, were estimated to be 0.3-66.8% for BOD, 25.8-75.8% for TN and 18.9-78.5% for TP. The EAH books of municipal wastewater were prepared by adopting the estimated pollutant runoff yields, Ry_n7. The EAH books were expected to be distributed widely; however, they did not seem to be used by many ordinary citizens in the Social Experiment Program in February 2010, judging from the low website visitor count and the limited responses from people. Possible reasons for less usage than expected were considered to be unsuccessful negotiation with the official organizations of the Social Experiment Program on using the EAH books as official tools, and some difficulties ordinary people had in using the EAH books.
Value of Information spreadsheet
Trainor-Guitton, Whitney
2014-05-12
This spreadsheet represents the information posteriors derived from synthetic magnetotelluric (MT) data. These were used to calculate the value of information of MT for geothermal exploration. Information posteriors describe how well MT was able to locate the "throat" of clay caps, which are indicative of hidden geothermal resources. These data are fully explained in the peer-reviewed publication: Trainor-Guitton, W., Hoversten, G. M., Ramirez, A., Roberts, J., Júlíusson, E., Key, K., Mellors, R. (Sept-Oct. 2014), The value of spatial information for determining well placement: a geothermal example, Geophysics.
Angular Speed of a Compact Disc
NASA Astrophysics Data System (ADS)
Sawicki, Mikolaj "Mik"
2006-09-01
A spinning motion of a compact disc in a CD player offers an interesting and challenging problem in rotational kinematics with a nonconstant angular acceleration that can be incorporated into a typical introductory physics class for engineers and scientists. It can be used either as an example presented during the lecture, emphasizing application of calculus, or as a homework assignment that could be handled easily with the help of a spreadsheet, thus eliminating the calculus aspect altogether. I tried both approaches, and the spreadsheet study was favored by my students.
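The kinematics behind the exercise: a CD is read at constant linear speed v along a spiral of pitch p, so the read radius grows as r(t) = sqrt(r0**2 + v*p*t/pi) and the angular speed omega = v/r falls as the disc plays, giving a nonconstant angular acceleration. A spreadsheet-style sketch with typical CD parameters (assumed for illustration, not taken from the article):

```python
# omega(t) for a CD read at constant linear speed (typical assumed CD values).
import math

v, p = 1.25, 1.6e-6          # linear read speed (m/s), track pitch (m)
r0, r_max = 0.023, 0.058     # inner and outer program-area radii (m)

def omega(t):
    r = math.sqrt(r0**2 + v * p * t / math.pi)   # read radius at time t
    return v / r                                 # rad/s

total_play = math.pi * (r_max**2 - r0**2) / (v * p)   # ~74 min of audio
for frac in (0.0, 0.5, 1.0):
    t = frac * total_play
    rpm = omega(t) * 60 / (2 * math.pi)
    print(f"t = {t/60:5.1f} min: omega = {omega(t):5.1f} rad/s ({rpm:4.0f} rpm)")
```

The printed values fall from roughly 500 rpm at the start of play to about 200 rpm at the end, the familiar behavior of constant-linear-velocity audio CDs.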
Zhang, Di; Cagnon, Chris H; Villablanca, J Pablo; McCollough, Cynthia H; Cody, Dianna D; Zankl, Maria; Demarco, John J; McNitt-Gray, Michael F
2013-09-01
CT neuroperfusion examinations are capable of delivering high radiation dose to the skin or lens of the eyes of a patient and can possibly cause deterministic radiation injury. The purpose of this study is to: (a) estimate peak skin dose and eye lens dose from CT neuroperfusion examinations based on several voxelized adult patient models of different head size and (b) investigate how well those doses can be approximated by some commonly used CT dose metrics or tools, such as CTDIvol, American Association of Physicists in Medicine (AAPM) Report No. 111 style peak dose measurements, and the ImPACT organ dose calculator spreadsheet. Monte Carlo simulation methods were used to estimate peak skin and eye lens dose on voxelized patient models, including GSF's Irene, Frank, Donna, and Golem, on four scanners from the major manufacturers at the widest collimation under all available tube potentials. Doses were reported on a per 100 mAs basis. CTDIvol measurements for a 16 cm CTDI phantom, AAPM Report No. 111 style peak dose measurements, and ImPACT calculations were performed for available scanners at all tube potentials. These were then compared with results from Monte Carlo simulations. The dose variations across the different voxelized patient models were small. Dependent on the tube potential and scanner and patient model, CTDIvol values overestimated peak skin dose by 26%-65%, and overestimated eye lens dose by 33%-106%, when compared to Monte Carlo simulations. AAPM Report No. 111 style measurements were much closer to peak skin estimates ranging from a 14% underestimate to a 33% overestimate, and with eye lens dose estimates ranging from a 9% underestimate to a 66% overestimate. The ImPACT spreadsheet overestimated eye lens dose by 2%-82% relative to voxelized model simulations. CTDIvol consistently overestimates dose to eye lens and skin. The ImPACT tool also overestimated dose to eye lenses. As such they are still useful as a conservative predictor of dose for CT neuroperfusion studies. AAPM Report No. 111 style measurements are a better predictor of both peak skin and eye lens dose than CTDIvol and ImPACT for the patient models used in this study. It should be remembered that both the AAPM Report No. 111 peak dose metric and CTDIvol dose metric are dose indices and were not intended to represent actual organ doses.
Zhang, Di; Cagnon, Chris H.; Villablanca, J. Pablo; McCollough, Cynthia H.; Cody, Dianna D.; Zankl, Maria; Demarco, John J.; McNitt-Gray, Michael F.
2013-01-01
Purpose: CT neuroperfusion examinations are capable of delivering high radiation dose to the skin or lens of the eyes of a patient and can possibly cause deterministic radiation injury. The purpose of this study is to: (a) estimate peak skin dose and eye lens dose from CT neuroperfusion examinations based on several voxelized adult patient models of different head size and (b) investigate how well those doses can be approximated by some commonly used CT dose metrics or tools, such as CTDIvol, American Association of Physicists in Medicine (AAPM) Report No. 111 style peak dose measurements, and the ImPACT organ dose calculator spreadsheet. Methods: Monte Carlo simulation methods were used to estimate peak skin and eye lens dose on voxelized patient models, including GSF's Irene, Frank, Donna, and Golem, on four scanners from the major manufacturers at the widest collimation under all available tube potentials. Doses were reported on a per 100 mAs basis. CTDIvol measurements for a 16 cm CTDI phantom, AAPM Report No. 111 style peak dose measurements, and ImPACT calculations were performed for available scanners at all tube potentials. These were then compared with results from Monte Carlo simulations. Results: The dose variations across the different voxelized patient models were small. Dependent on the tube potential and scanner and patient model, CTDIvol values overestimated peak skin dose by 26%–65%, and overestimated eye lens dose by 33%–106%, when compared to Monte Carlo simulations. AAPM Report No. 111 style measurements were much closer to peak skin estimates ranging from a 14% underestimate to a 33% overestimate, and with eye lens dose estimates ranging from a 9% underestimate to a 66% overestimate. The ImPACT spreadsheet overestimated eye lens dose by 2%–82% relative to voxelized model simulations. Conclusions: CTDIvol consistently overestimates dose to eye lens and skin. The ImPACT tool also overestimated dose to eye lenses. As such they are still useful as a conservative predictor of dose for CT neuroperfusion studies. AAPM Report No. 111 style measurements are a better predictor of both peak skin and eye lens dose than CTDIvol and ImPACT for the patient models used in this study. It should be remembered that both the AAPM Report No. 111 peak dose metric and CTDIvol dose metric are dose indices and were not intended to represent actual organ doses. PMID:24007152
Wahi, Monika M; Parks, David V; Skeate, Robert C; Goldin, Steven B
2008-01-01
We conducted a reliability study comparing single data entry (SE) into a Microsoft Excel spreadsheet to entry using the existing forms (EF) feature of the Teleforms software system, in which optical character recognition is used to capture data off of paper forms designed in non-Teleforms software programs. We compared the transcription of data from multiple paper forms from over 100 research participants representing almost 20,000 data entry fields. Error rates for SE were significantly lower than those for EF, so we chose SE for data entry in our study. Data transcription strategies from paper to electronic format should be chosen based on evidence from formal evaluations, and their design should be contemplated during the paper forms development stage.
Wahi, Monika M.; Parks, David V.; Skeate, Robert C.; Goldin, Steven B.
2008-01-01
We conducted a reliability study comparing single data entry (SE) into a Microsoft Excel spreadsheet to entry using the existing forms (EF) feature of the Teleforms software system, in which optical character recognition is used to capture data off of paper forms designed in non-Teleforms software programs. We compared the transcription of data from multiple paper forms from over 100 research participants representing almost 20,000 data entry fields. Error rates for SE were significantly lower than those for EF, so we chose SE for data entry in our study. Data transcription strategies from paper to electronic format should be chosen based on evidence from formal evaluations, and their design should be contemplated during the paper forms development stage. PMID:18308994
"Sweetening" Technical Physics with Hershey's Kisses
NASA Astrophysics Data System (ADS)
Stone, Chuck
2003-04-01
This paper describes an activity in which students measure the mass of each candy in one full bag of Hershey's Kisses and then use a simple spreadsheet program to construct a histogram showing the number of candies as a function of mass. Student measurements indicate that one single bag of 80 Kisses yields enough data to produce a noticeable variation in the candy's mass distribution. The bimodal character of this distribution provides a useful discussion topic. This activity can be performed as a classroom project, a laboratory exercise, or an interactive lecture demonstration. In all these formats, students have the opportunity to collect, organize, process, and analyze real data. In addition to strengthening graphical analysis skills, this activity introduces students to fundamentals of statistics, manufacturing processes in the industrial workplace, and process control techniques.
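A minimal version of the classroom analysis (the candy masses are simulated here with a slight bimodal character, since the students' raw data are not reproduced in the abstract):

```python
# Bin candy masses and print a text histogram (simulated, roughly bimodal data).
import random

random.seed(1)
masses = [random.gauss(4.55, 0.08) for _ in range(50)] + \
         [random.gauss(4.80, 0.08) for _ in range(30)]   # 80 Kisses, grams

lo, width, nbins = 4.3, 0.1, 8
counts = [0] * nbins
for m in masses:
    b = min(int((m - lo) / width), nbins - 1)            # clamp into the bin range
    counts[max(b, 0)] += 1

for i, c in enumerate(counts):
    print(f"{lo + i*width:.1f}-{lo + (i+1)*width:.1f} g: {'#' * c}")
```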
Determination of the Optimal Fourier Number on the Dynamic Thermal Transmission
NASA Astrophysics Data System (ADS)
Bruzgevičius, P.; Burlingis, A.; Norvaišienė, R.
2016-12-01
This article presents the results of experimental research on transient heat transfer in a multilayered (heterogeneous) wall. Our non-steady thermal transmission simulation is based on a finite-difference calculation method. The value of the Fourier number shows the similarity of thermal variation in conditional layers of an enclosure. Most scientists recommend using a Fourier number value of no more than 0.5 when performing calculations on dynamic (transient) heat transfer. The value of the Fourier number is determined in order to acquire reliable calculation results with optimal accuracy. To compare the results of simulation with experimental research, a transient heat transfer calculation spreadsheet was created. Our research has shown that a Fourier number of around 0.5, or even 0.32, is not sufficient (≈17% of oscillation amplitude) for calculations of transient heat transfer in a multilayered wall. The least distorted calculation results were obtained when the multilayered enclosure was divided into conditional layers with almost equal Fourier number values and when the value of the Fourier number was around 1/6, i.e., approximately 0.17. Statistical deviation analysis using the Statistical Analysis System was applied to assess the accuracy of the spreadsheet calculation and was developed on the basis of our established methodology. The mean and median absolute errors, as well as their confidence intervals, were estimated by the two methods with optimal accuracy (F_oMDF = 0.177 and F_oEPS = 0.1633).
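The grid Fourier number itself is a one-line formula, Fo = a·Δt/Δx², so the time step that hits a target value follows by rearrangement. A minimal sketch, with a hypothetical layer diffusivity and thickness:

```python
# Grid Fourier number for a finite-difference conditional layer:
#   Fo = a * dt / dx**2, with a = thermal diffusivity (m^2/s).
def fourier_number(diffusivity, dt, dx):
    return diffusivity * dt / dx**2

def time_step_for(fo_target, diffusivity, dx):
    """Time step that yields a chosen Fourier number for a layer of thickness dx."""
    return fo_target * dx**2 / diffusivity

# Hypothetical layer: mineral wool, a ~ 1.0e-6 m^2/s, 10 mm conditional layer.
a, dx = 1.0e-6, 0.01
for fo in (0.5, 0.32, 1 / 6):
    print(f"Fo = {fo:5.3f} -> dt = {time_step_for(fo, a, dx):7.1f} s")
```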
NASA Astrophysics Data System (ADS)
Bell, Kevin D.; Dafesh, Philip A.; Hsu, L. A.; Tsuda, A. S.
1995-12-01
Current architectural and design trade techniques often carry unaffordable alternatives late into the decision process. Early decisions made during the concept exploration and development (CE&D) phase will drive the cost of a program more than any other phase of development; thus, designers must be able to assess both the performance and cost impacts of their early choices. The Space Based Infrared System (SBIRS) cost engineering model (CEM) described in this paper is an end-to-end process integrating engineering and cost expertise through commonly available spreadsheet software, allowing for concurrent design engineering and cost estimation to identify and balance system drivers to reduce acquisition costs. The automated interconnectivity between subsystem models using spreadsheet software allows for the quick and consistent assessment of the system design impacts and relative cost impacts due to requirement changes. It differs from most CEM efforts attempted in the past in that it incorporates more detailed spacecraft and sensor payload models, and it has been applied to determine the cost drivers for an advanced infrared satellite system acquisition. The CEM comprises integrated detailed engineering and cost estimating relationships describing performance, design, and cost parameters. Detailed models have been developed to evaluate design parameters for the spacecraft bus and sensor; both step-stare and scanner sensor types incorporate models of focal plane array, optics, processing, thermal, communications, and mission performance. The current CEM effort has provided visibility into requirements, design, and cost drivers for system architects and decision makers to determine the configuration of an infrared satellite architecture that meets essential requirements cost effectively. In general, the methodology described in this paper consists of process building blocks that can be tailored to the needs of many applications. Descriptions of the spacecraft and payload subsystem models provide insight into The Aerospace Corporation's expertise and the scope of the SBIRS concept development effort.
System-of-Systems Technology-Portfolio-Analysis Tool
NASA Technical Reports Server (NTRS)
O'Neil, Daniel; Mankins, John; Feingold, Harvey; Johnson, Wayne
2012-01-01
Advanced Technology Life-cycle Analysis System (ATLAS) is a system-of-systems technology-portfolio-analysis software tool. ATLAS affords capabilities to (1) compare estimates of the mass and cost of an engineering system based on competing technological concepts; (2) estimate life-cycle costs of an outer-space-exploration architecture for a specified technology portfolio; (3) collect data on state-of-the-art and forecasted technology performance, and on operations and programs; and (4) calculate an index of the relative programmatic value of a technology portfolio. ATLAS facilitates analysis by providing a library of analytical spreadsheet models for a variety of systems. A single analyst can assemble a representation of a system of systems from the models and build a technology portfolio. Each system model estimates mass, and life-cycle costs are estimated by a common set of cost models. Other components of ATLAS include graphical-user-interface (GUI) software, algorithms for calculating the aforementioned index, a technology database, a report generator, and a form generator for creating the GUI for the system models. At the time of this reporting, ATLAS is a prototype, embodied in Microsoft Excel and several thousand lines of Visual Basic for Applications that run on both Windows and Macintosh computers.
Chu, Khim Hoong
2017-11-09
Surface diffusion coefficients may be estimated by fitting solutions of a diffusion model to batch kinetic data. For non-linear systems, a numerical solution of the diffusion model's governing equations is generally required. We report here the application of the classic Langmuir kinetics model to extract surface diffusion coefficients from batch kinetic data. The use of the Langmuir kinetics model in lieu of the conventional surface diffusion model allows derivation of an analytical expression. The parameter estimation procedure requires determining the Langmuir rate coefficient from which the pertinent surface diffusion coefficient is calculated. Surface diffusion coefficients within the 10⁻⁹ to 10⁻⁶ cm²/s range obtained by fitting the Langmuir kinetics model to experimental kinetic data taken from the literature are found to be consistent with the corresponding values obtained from the traditional surface diffusion model. The virtue of this simplified parameter estimation method is that it reduces the computational complexity, as the analytical expression involves only an algebraic equation in closed form which is easily evaluated by spreadsheet computation.
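A sketch of the back-calculation step under a stated assumption: the paper's own closed-form expression is not reproduced here, so the Glueckauf linear-driving-force relation k ≈ 15·D_s/R² stands in as the mapping from fitted rate coefficient to surface diffusion coefficient.

```python
# Back-calculate a surface diffusion coefficient from a fitted Langmuir
# rate coefficient via the Glueckauf linear-driving-force analogy
# (k ~ 15 Ds / R^2). NOTE: this mapping is an assumption standing in for
# the paper's own closed-form expression.
def surface_diffusivity(k_fitted, particle_radius_cm):
    return k_fitted * particle_radius_cm**2 / 15.0

k = 2.0e-3   # 1/s, hypothetical Langmuir rate coefficient from a batch fit
R = 0.05     # cm, hypothetical adsorbent particle radius
Ds = surface_diffusivity(k, R)
print(f"Ds = {Ds:.2e} cm^2/s")  # falls in the 1e-9 to 1e-6 range cited above
```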
Refinery spreadsheet highlights microcomputer process applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tucker, M.A.
1984-01-23
Microcomputer applications in the process areas at Chevron U.S.A. refineries and at the Chevron Research Co. illustrate how the microcomputer has changed the way we do our jobs. This article will describe major uses of the microcomputer as a personal work tool in Chevron process areas. It will also describe how and why many of Chevron's microcomputer applications were developed and their characteristics. One of our earliest microcomputer applications, developed in late 1981, was an electronic spreadsheet program using a small desktop microcomputer. It was designed to help a refinery planner prepare monthly plans for a small portion of one of our major refineries. This particular microcomputer had a tiny 4-in. screen, and the reports were several strips of print-out from the microcomputer's 3-in.-wide internal printer taped together. In spite of these archaic computing conditions, it was a successful application. It automated what had been very tedious and time-consuming calculations with a pencil, a calculator, and a great deal of erasing. It eliminated filling out large "horseblanket" reports. The electronic spreadsheet was also flexible; the planner could easily change the worksheet to match new operating constraints, new process conditions, and new feeds and products. Fortunately, within just a few months, this application graduated to a similar electronic spreadsheet program on a new, more powerful microcomputer. It had a bigger display screen and a letter-size printer. The same application is still in use today, although it has been greatly enhanced and altered to match extensive plant modifications. And there are plans to expand it again onto yet another, more powerful microcomputer.
Schneider, Walter; Bolger, D J; Eschman, Amy; Neff, Christopher; Zuccolotto, Anthony P
2005-05-01
In academic courses in which one task for the students is to understand empirical methodology and the nature of scientific inquiry, the ability of students to create and implement their own experiments allows them to take intellectual ownership of, and greatly facilitates, the learning process. The Psychology Experiment Authoring Kit (PEAK) is a novel spreadsheet-based interface allowing students and researchers with rudimentary spreadsheet skills to create cognitive and cognitive neuroscience experiments in minutes. Students fill in a spreadsheet listing of independent variables and stimuli, insert columns that represent experimental objects such as slides (presenting text, pictures, and sounds) and feedback displays to create complete experiments, all within a single spreadsheet. The application then executes experiments with centisecond precision. Formal usability testing was done in two stages: (1) detailed coding of 10 individual subjects in one-on-one experimenter/subject videotaped sessions and (2) classroom testing of 64 undergraduates. In both individual and classroom testing, the students learned to effectively use PEAK within 2 h, and were able to create a lexical decision experiment in under 10 min. Findings from the individual testing in Stage 1 resulted in significant changes to documentation and training materials and identification of bugs to be corrected. Stage 2 testing identified additional bugs to be corrected and new features to be considered to facilitate student understanding of the experiment model. Such testing will improve the approach with each semester. The students were typically able to create their own projects in 2 h.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whitehead, Camilla Dunham; McNeil, Michael; Dunham_Whitehead, Camilla
2008-02-28
The U.S. Environmental Protection Agency (EPA) influences the market for plumbing fixtures and fittings by encouraging consumers to purchase products that carry the WaterSense label, which certifies those products as performing at low flow rates compared to unlabeled fixtures and fittings. As consumers decide to purchase water-efficient products, water consumption will decline nationwide. Decreased water consumption should prolong the operating life of water and wastewater treatment facilities. This report describes the method used to calculate national water savings attributable to EPA's WaterSense program. A Microsoft Excel spreadsheet model, the National Water Savings (NWS) analysis model, accompanies this methodology report. Version 1.0 of the NWS model evaluates indoor residential water consumption. Two additional documents, a Users' Guide to the spreadsheet model and an Impacts Report, accompany the NWS model and this methodology document. Altogether, these four documents represent Phase One of this project. The Users' Guide leads policy makers through the spreadsheet options available for projecting the water savings that result from various policy scenarios. The Impacts Report shows national water savings that will result from differing degrees of market saturation of high-efficiency water-using products. This detailed methodology report describes the NWS analysis model, which examines the effects of WaterSense by tracking the shipments of products that WaterSense has designated as water-efficient. The model estimates market penetration of products that carry the WaterSense label. Market penetration is calculated for both existing and new construction. The NWS model estimates savings based on an accounting analysis of water-using products and of building stock. Estimates of future national water savings will help policy makers further direct the focus of WaterSense and calculate stakeholder impacts from the program. Calculating the total gallons of water the WaterSense program saves nationwide involves integrating two components, or modules, of the NWS model. Module 1 calculates the baseline national water consumption of typical fixtures, fittings, and appliances prior to the program (as described in Section 2.0 of this report). Module 2 develops trends in efficiency for water-using products both in the business-as-usual case and as a result of the program (Section 3.0). The NWS model combines the two modules to calculate total gallons saved by the WaterSense program (Section 4.0). Figure 1 illustrates the modules and the process involved in modeling for the NWS model analysis. The output of the NWS model provides the base case for each end use, as well as a prediction of total residential indoor water consumption during the next two decades. Based on the calculations described in Section 4.0, we can project a timeline of water savings attributable to the WaterSense program. The savings increase each year as the program results in the installation of greater numbers of efficient products, which come to compose more and more of the product stock in households throughout the United States.
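The two-module structure reduces to an accounting identity: annual savings are the installed stock of labeled products times the gap between baseline and efficient use. A toy sketch with invented figures, not WaterSense inputs:

```python
# Toy accounting sketch of the two-module structure described above.
# All numbers are illustrative, not WaterSense program data.
years = range(2008, 2013)
households = 110e6                    # assumed U.S. households
base_gpd = 18.0                       # baseline gallons/household/day, one end use
eff_gpd = 12.0                        # gallons/household/day with a labeled product
penetration = {y: 0.02 * (y - 2007) for y in years}   # assumed market penetration

for y in years:
    saved = households * penetration[y] * (base_gpd - eff_gpd) * 365
    print(f"{y}: {saved / 1e9:5.2f} billion gallons saved")
```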
Repins, Ingrid L.; Harvey, Steve; Bowers, Karen; ...
2017-05-15
Cu(In,Ga)Se₂ (CIGS) photovoltaic absorbers frequently develop Ga gradients during growth. These gradients vary as a function of growth recipe, and are important to device performance. Prediction of Ga profiles using classic diffusion equations is not possible because In and Ga atoms occupy the same lattice sites and thus diffuse interdependently, and there is not yet a detailed experimental knowledge of the chemical potential as a function of composition that describes this interaction. Here, we show how diffusion equations can be modified to account for site sharing between In and Ga atoms. The analysis has been implemented in an Excel spreadsheet, and outputs predicted Cu, In, and Ga profiles for entered deposition recipes. A single set of diffusion coefficients and activation energies are chosen, such that simulated elemental profiles track with published data and those from this study. Extent and limits of agreement between elemental profiles predicted from the growth recipes and the spreadsheet tool are demonstrated.
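For orientation only, a plain Fickian finite-difference step for the Ga fraction on the group-III sublattice; the paper's site-sharing modifications are not reproduced here:

```python
import numpy as np

# Generic explicit diffusion step for the Ga fraction x = [Ga]/([Ga]+[In])
# on a 1-D depth grid. Plain Fick's law only -- the spreadsheet tool described
# above additionally corrects for In/Ga site sharing.
def diffuse(x, D, dz, dt, steps):
    x = x.copy()
    for _ in range(steps):
        lap = np.empty_like(x)
        lap[1:-1] = (x[2:] - 2 * x[1:-1] + x[:-2]) / dz**2
        lap[0] = 2 * (x[1] - x[0]) / dz**2      # zero-flux film surface
        lap[-1] = 2 * (x[-2] - x[-1]) / dz**2   # zero-flux back contact
        x = x + D * dt * lap
    return x

z = np.linspace(0, 2e-4, 101)            # 2-um film, cm units
x0 = np.where(z < 1e-4, 0.5, 0.2)        # hypothetical as-grown Ga step profile
x1 = diffuse(x0, D=1e-12, dz=z[1] - z[0], dt=1.0, steps=600)
print(x1[::20].round(3))                 # smoothed Ga gradient after annealing
```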
Neyeloff, Jeruza L; Fuchs, Sandra C; Moreira, Leila B
2012-01-20
Background: Meta-analyses are necessary to synthesize data obtained from primary research, and in many situations reviews of observational studies are the only available alternative. General purpose statistical packages can meta-analyze data, but usually require external macros or coding. Commercial specialist software is available, but may be expensive and focused on a particular type of primary data. Most available software packages have limitations in dealing with descriptive data, and the graphical display of summary statistics such as incidence and prevalence is unsatisfactory. Analyses can be conducted using Microsoft Excel, but there was no previous guide available. Findings: We constructed a step-by-step guide to perform a meta-analysis in a Microsoft Excel spreadsheet, using either fixed-effect or random-effects models. We have also developed a second spreadsheet capable of producing customized forest plots. Conclusions: It is possible to conduct a meta-analysis using only Microsoft Excel. More important, to our knowledge this is the first description of a method for producing a statistically adequate but graphically appealing forest plot summarizing descriptive data, using widely available software. PMID:22264277
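The fixed-effect and random-effects computations behind such a spreadsheet are standard inverse-variance formulas; a sketch of those formulas (not necessarily the authors' exact cell layout):

```python
import math

# Standard inverse-variance meta-analysis, the computation an Excel sheet
# like the one described above performs. Inputs: effect estimates and
# their standard errors (illustrative numbers).
effects = [0.12, 0.25, 0.08, 0.30]
ses     = [0.05, 0.10, 0.04, 0.12]

w  = [1 / s**2 for s in ses]                                # fixed-effect weights
fe = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
q  = sum(wi * (yi - fe)**2 for wi, yi in zip(w, effects))   # Cochran's Q
df = len(effects) - 1
c  = sum(w) - sum(wi**2 for wi in w) / sum(w)
tau2 = max(0.0, (q - df) / c)                               # DerSimonian-Laird
wr = [1 / (s**2 + tau2) for s in ses]                       # random-effects weights
re = sum(wi * yi for wi, yi in zip(wr, effects)) / sum(wr)

print(f"fixed effect  : {fe:.3f} +/- {1.96 / math.sqrt(sum(w)):.3f}")
print(f"random effects: {re:.3f} +/- {1.96 / math.sqrt(sum(wr)):.3f}")
```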
Groups: knowledge spreadsheets for symbolic biocomputing.
Travers, Michael; Paley, Suzanne M; Shrager, Jeff; Holland, Timothy A; Karp, Peter D
2013-01-01
Knowledge spreadsheets (KSs) are a visual tool for interactive data analysis and exploration. They differ from traditional spreadsheets in that rather than being oriented toward numeric data, they work with symbolic knowledge representation structures and provide operations that take into account the semantics of the application domain. 'Groups' is an implementation of KSs within the Pathway Tools system. Groups allows Pathway Tools users to define a group of objects (e.g. groups of genes or metabolites) from a Pathway/Genome Database. Groups can be transformed (e.g. by transforming a metabolite group to the group of pathways in which those metabolites are substrates); combined through set operations; analysed (e.g. through enrichment analysis); and visualized (e.g. by painting onto a metabolic map diagram). Users of the Pathway Tools-based BioCyc.org website have made extensive use of Groups, and an informal survey of Groups users suggests that Groups has achieved the goal of allowing biologists themselves to perform some data manipulations that previously would have required the assistance of a programmer. Database URL: BioCyc.org.
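The group transformations and set operations amount to mappings plus set algebra. A toy illustration with invented identifiers, not the Pathway Tools API:

```python
# Toy illustration of the group operations described above (invented IDs,
# not the Pathway Tools API).
metabolites = {"pyruvate", "citrate", "malate"}

# "Transform" a metabolite group into the pathways using those metabolites.
pathways_of = {"pyruvate": {"glycolysis", "TCA"},
               "citrate": {"TCA"},
               "malate": {"TCA", "glyoxylate"}}
pathway_group = set().union(*(pathways_of[m] for m in metabolites))

# Combine groups through set operations.
other = {"glycolysis", "PPP"}
print(pathway_group & other)   # intersection -> {'glycolysis'}
print(pathway_group | other)   # union
```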
Multiphysics Object Oriented Simulation Environment
DOE Office of Scientific and Technical Information (OSTI.GOV)
The Multiphysics Object Oriented Simulation Environment (MOOSE) software library developed at Idaho National Laboratory is a tool. MOOSE, like other tools, doesn't actually complete a task. Instead, MOOSE seeks to reduce the effort required to create engineering simulation applications. MOOSE itself is a software library: a blank canvas upon which you write equations, and then MOOSE can help you solve them. MOOSE is comparable to a spreadsheet application. A spreadsheet, by itself, doesn't do anything. Only once equations are entered into it will a spreadsheet application compute anything. Such is the same for MOOSE. An engineer or scientist can utilize the equation solvers within MOOSE to solve equations related to their area of study. For instance, a geomechanical scientist can input equations related to water flow in underground reservoirs and MOOSE can solve those equations to give the scientist an idea of how water could move over time. An engineer might input equations related to the forces in steel beams in order to understand the load bearing capacity of a bridge. Because MOOSE is a blank canvas, it can be useful in many scientific and engineering pursuits.
Significant Published Articles for Pharmacy Nutrition Support Practice in 2014 and 2015.
Dickerson, Roland N; Kumpf, Vanessa J; Blackmer, Allison B; Bingham, Angela L; Tucker, Anne M; Ybarra, Joseph V; Kraft, Michael D; Canada, Todd W
2016-07-01
To assist the pharmacy clinician engaged in nutrition support in staying current with the most pertinent literature. Several experienced board-certified clinical pharmacists engaged in nutrition support therapy compiled a list of articles published in 2014 and 2015 that they considered to be important to their practice. Only those articles available in print format were considered for potential inclusion. Articles available only in preprint electronic format were not evaluated. The citation list was compiled into a single spreadsheet where the author participants were asked to ascertain whether they considered the paper important to nutrition support pharmacy practice. A culled list of publications was then identified whereby the majority of author participants (at least 5 out of 8) considered the paper to be important. A total of 108 articles were identified; 36 of which were considered to be of high importance. An important guideline article published in early 2016, but not ranked, was also included. The top-ranked articles from the primary literature were reviewed. It is recommended that the informed pharmacist, who is engaged in nutrition support therapy, be familiar with the majority of these articles.
Mackay, Donald; Hughes, Lauren; Powell, David E; Kim, Jaeshin
2014-09-01
The QWASI fugacity mass balance model has been widely used since 1983 for both scientific and regulatory purposes to estimate the concentrations of organic chemicals in water and sediment, given an assumed rate of chemical emission, advective inflow in water, or deposition from the atmosphere. It has become apparent that an updated version is required, especially to incorporate improved methods of obtaining input parameters such as partition coefficients. Accordingly, the model has been revised and it is now available in spreadsheet format. Changes to the model are described and the new version is applied to two chemicals, D5 (decamethylcyclopentasiloxane) and PCB-180, in two lakes, Lake Pepin (MN, USA) and Lake Ontario, showing the model's capability of illustrating both the chemical-to-chemical differences and the lake-to-lake differences. Since there are now increased regulatory demands for rigorous sensitivity and uncertainty analyses, these aspects are discussed and two approaches are illustrated. It is concluded that the new QWASI water quality model can be of value for both evaluative and simulation purposes, thus providing a tool for obtaining an improved understanding of chemical mass balances in lakes, as a contribution to the assessment of fate and exposure and as a step towards the assessment of risk. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.
ROE Carbon Storage - Forest Biomass
This polygon dataset depicts the density of forest biomass in counties across the United States, in terms of metric tons of carbon per square mile of land area. These data were provided in spreadsheet form by the U.S. Department of Agriculture (USDA) Forest Service. To produce the Web mapping application, EPA joined the spreadsheet with a shapefile of U.S. county (and county equivalent) boundaries downloaded from the U.S. Census Bureau. EPA calculated biomass density based on the area of each county polygon. These data sets were converted into a single polygon feature class inside a file geodatabase.
A Microsoft Excel® 2010 Based Tool for Calculating Interobserver Agreement
Reed, Derek D; Azulay, Richard L
2011-01-01
This technical report provides detailed information on the rationale for using a common computer spreadsheet program (Microsoft Excel®) to calculate various forms of interobserver agreement for both continuous and discontinuous data sets. In addition, we provide a brief tutorial on how to use an Excel spreadsheet to automatically compute traditional total count, partial agreement-within-intervals, exact agreement, trial-by-trial, interval-by-interval, scored-interval, unscored-interval, total duration, and mean duration-per-interval interobserver agreement algorithms. We conclude with a discussion of how practitioners may integrate this tool into their clinical work. PMID:22649578
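Two of the algorithms listed in this report, in their standard textbook forms rather than the authors' spreadsheet cells:

```python
# Two of the agreement algorithms listed above, in their standard forms.
def total_count_ioa(count_a, count_b):
    """Smaller total divided by larger total, as a percentage."""
    return 100.0 * min(count_a, count_b) / max(count_a, count_b)

def interval_by_interval_ioa(obs_a, obs_b):
    """Percentage of intervals in which both observers scored identically."""
    agree = sum(a == b for a, b in zip(obs_a, obs_b))
    return 100.0 * agree / len(obs_a)

print(total_count_ioa(47, 50))                                  # 94.0
print(interval_by_interval_ioa([1, 0, 1, 1, 0, 0],
                               [1, 0, 0, 1, 0, 1]))             # ~66.7
```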
Gravity Data for West-Central Colorado
Richard Zehner
2012-04-06
Modeled Bouguer-corrected gravity data were extracted from the Pan American Center for Earth and Environmental Studies Gravity Database of the U.S. at http://irpsrvgis08.utep.edu/viewers/Flex/GravityMagnetic/GravityMagnetic_CyberShare/ on 2/29/2012. The downloaded text file was opened in an Excel spreadsheet. This spreadsheet data was then converted into an ESRI point shapefile in UTM Zone 13 NAD27 projection, showing location and gravity (in milligals). This data was then converted to a grid and then contoured using ESRI Spatial Analyst. Data are from the University of Texas Pan American Center for Earth and Environmental Studies.
NASA Technical Reports Server (NTRS)
Kalcic, Maria; Turowski, Mark; Hall, Callie
2010-01-01
Presentation topics include: importance of salinity of coastal waters, habitat switching algorithm, habitat switching module, salinity estimates from Landsat for Sabine Calcasieu Basin, percent of time inundated in 2006, salinity data, prototyping the system, system as packaged for field tests, salinity probe and casing, opening for water flow, cellular antenna used to transmit data, preparing to launch, system launched in the Pearl River at Stennis Space Center, data transmitted to Twitter by cell phone modem every 15 minutes, Google spreadsheet used to import the data from the Twitter feed and to compute salinity (from conductivity) and display charts of salinity and temperature, results uploaded to NASA's Applied Science and Technology Project Office Webpage.
Adamusiak, Tomasz; Parkinson, Helen; Muilu, Juha; Roos, Erik; van der Velde, Kasper Joeri; Thorisson, Gudmundur A; Byrne, Myles; Pang, Chao; Gollapudi, Sirisha; Ferretti, Vincent; Hillege, Hans; Brookes, Anthony J; Swertz, Morris A
2012-05-01
Genetic and epidemiological research increasingly employs large collections of phenotypic and molecular observation data from high quality human and model organism samples. Standardization efforts have produced a few simple formats for exchange of these various data, but a lightweight and convenient data representation scheme for all data modalities does not exist, hindering successful data integration, such as assignment of mouse models to orphan diseases and phenotypic clustering for pathways. We report a unified system to integrate and compare observation data across experimental projects, disease databases, and clinical biobanks. The core object model (Observ-OM) comprises only four basic concepts to represent any kind of observation: Targets, Features, Protocols (and their Applications), and Values. An easy-to-use file format (Observ-TAB) employs Excel to represent individual and aggregate data in straightforward spreadsheets. The systems have been tested successfully on human biobank, genome-wide association studies, quantitative trait loci, model organism, and patient registry data, using the MOLGENIS platform to quickly set up custom data portals. Our system will dramatically lower the barrier for future data sharing and facilitate integrated search across panels and species. All models, formats, documentation, and software are available for free and open source (LGPLv3) at http://www.observ-om.org. © 2012 Wiley Periodicals, Inc.
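The four-concept core model maps naturally onto four small record types. A sketch with assumed field names, not the published schema:

```python
from dataclasses import dataclass

# Minimal rendering of the four Observ-OM concepts (field names are
# assumptions for illustration, not the published schema).
@dataclass
class Target:                # the thing observed, e.g. an individual or sample
    identifier: str

@dataclass
class Feature:               # what is observed, e.g. "body weight"
    name: str
    unit: str

@dataclass
class ProtocolApplication:   # how/when the protocol was applied
    protocol_name: str

@dataclass
class Value:                 # one observation tying the concepts together
    target: Target
    feature: Feature
    applied: ProtocolApplication
    value: str

obs = Value(Target("patient-007"), Feature("body weight", "kg"),
            ProtocolApplication("baseline visit"), "72.5")
print(obs)
```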
Intervals for posttest probabilities: a comparison of 5 methods.
Mossman, D; Berger, J O
2001-01-01
Several medical articles discuss methods of constructing confidence intervals for single proportions and the likelihood ratio, but scant attention has been given to the systematic study of intervals for the posterior odds, or the positive predictive value, of a test. The authors describe 5 methods of constructing confidence intervals for posttest probabilities when estimates of sensitivity, specificity, and the pretest probability of a disorder are derived from empirical data. They then evaluate each method to determine how well the intervals' coverage properties correspond to their nominal value. When the estimates of pretest probabilities, sensitivity, and specificity are derived from more than 80 subjects and are not close to 0 or 1, all methods generate intervals with appropriate coverage properties. When these conditions are not met, however, the best-performing method is an objective Bayesian approach implemented by a simple simulation using a spreadsheet. Physicians and investigators can generate accurate confidence intervals for posttest probabilities in small-sample situations using the objective Bayesian approach.
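The objective Bayesian simulation can be sketched directly: draw sensitivity, specificity, and prevalence from Beta posteriors (Jeffreys priors assumed here), push each draw through Bayes' theorem, and take percentile limits. Counts below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 2x2 study counts: tp/fn from diseased, tn/fp from healthy,
# d/n for pretest prevalence. Jeffreys Beta(0.5, 0.5) priors assumed.
tp, fn, tn, fp, d, n = 45, 5, 90, 10, 30, 120
draws = 100_000
sens = rng.beta(tp + 0.5, fn + 0.5, draws)
spec = rng.beta(tn + 0.5, fp + 0.5, draws)
prev = rng.beta(d + 0.5, n - d + 0.5, draws)

# Positive predictive value for each simulated draw (Bayes' theorem).
ppv = sens * prev / (sens * prev + (1 - spec) * (1 - prev))
lo, hi = np.percentile(ppv, [2.5, 97.5])
print(f"posttest probability: {ppv.mean():.3f} (95% interval {lo:.3f}-{hi:.3f})")
```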
Lloyd-Williams, Ffion; O'Flaherty, Martin; Mwatsama, Modi; Birt, Christopher; Ireland, Robin; Capewell, Simon
2008-07-01
To estimate the burden of cardiovascular disease within 15 European Union countries (before the 2004 enlargement) as a result of excess dietary saturated fats attributable to the Common Agricultural Policy (CAP). A spreadsheet model was developed to synthesize data on population, diet, cholesterol levels and mortality rates. A conservative estimate of a reduction in saturated fat consumption of just 2.2 g was chosen, representing 1% of daily energy intake. The fall in serum cholesterol concentration was then calculated, assuming that this 1% reduction in saturated fat consumption was replaced with 0.5% monounsaturated and 0.5% polyunsaturated fats. The resulting reduction in cardiovascular and stroke deaths was then estimated, and a sensitivity analysis conducted. Reducing saturated fat consumption by 1% and increasing monounsaturated and polyunsaturated fat by 0.5% each would lower blood cholesterol levels by approximately 0.06 mmol/l, resulting in approximately 9800 fewer coronary heart disease deaths and 3000 fewer stroke deaths each year. The cardiovascular disease burden attributable to CAP appears substantial. Furthermore, these calculations were conservative estimates, and the true mortality burden may be higher. The analysis contributes to the current wider debate concerning the relationship between CAP, health and chronic disease across Europe, together with recent international developments and commitments to reduce chronic diseases. The reported mortality estimates should be considered in relation to the current CAP and any future reforms.
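The spreadsheet chain is simple arithmetic once the dose-response slopes are fixed. In the sketch below every input is an assumption chosen for illustration; with these particular values the outputs land near the figures reported above:

```python
# Back-of-envelope version of the spreadsheet chain described above.
# Every number below is an assumption for illustration, not the paper's input.
chol_drop = 0.06          # mmol/l fall from a 1% energy swap of saturated fat
slope_chd = 0.30          # assumed fractional CHD mortality fall per 1 mmol/l
slope_stroke = 0.15       # assumed fractional stroke mortality fall per 1 mmol/l
baseline_chd_deaths = 550_000     # assumed EU-15 annual CHD deaths
baseline_stroke_deaths = 330_000  # assumed EU-15 annual stroke deaths

print(f"CHD deaths averted/yr:    {baseline_chd_deaths * chol_drop * slope_chd:,.0f}")
print(f"stroke deaths averted/yr: {baseline_stroke_deaths * chol_drop * slope_stroke:,.0f}")
```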
Ernstoff, Alexi S; Fantke, Peter; Huang, Lei; Jolliet, Olivier
2017-11-01
Specialty software and simplified models are often used to estimate migration of potentially toxic chemicals from packaging into food. Current models, however, are not suitable for emerging applications in decision-support tools, e.g. in Life Cycle Assessment and risk-based screening and prioritization, which require rapid computation of accurate estimates for diverse scenarios. To fulfil this need, we develop an accurate and rapid (high-throughput) model that estimates the fraction of organic chemicals migrating from polymeric packaging materials into foods. Several hundred step-wise simulations optimised the model coefficients to cover a range of user-defined scenarios (e.g. temperature). The developed model, operationalised in a spreadsheet for future dissemination, nearly instantaneously estimates chemical migration, and has improved performance over commonly used model simplifications. When using measured diffusion coefficients, the model accurately predicted (R² = 0.9, standard error (S_e) = 0.5) hundreds of empirical data points for various scenarios. Diffusion coefficient modelling, which determines the speed of chemical transfer from package to food, was a major contributor to uncertainty and dramatically decreased model performance (R² = 0.4, S_e = 1). In all, this study provides a rapid migration modelling approach to estimate exposure to chemicals in food packaging for emerging screening and prioritization approaches. Copyright © 2017 Elsevier Ltd. All rights reserved.
Analysis options for estimating status and trends in long-term monitoring
Bart, Jonathan; Beyer, Hawthorne L.
2012-01-01
This chapter describes methods for estimating long-term trends in ecological parameters. Other chapters in this volume discuss more advanced methods for analyzing monitoring data, but these methods may be relatively inaccessible to some readers. Therefore, this chapter provides an introduction to trend analysis for managers and biologists while also discussing general issues relevant to trend assessment in any long-term monitoring program. For simplicity, we focus on temporal trends in population size across years. We refer to the survey results for each year as the “annual means” (e.g. mean per transect, per plot, per time period). The methods apply with little or no modification, however, to formal estimates of population size, other temporal units (e.g. a month), to spatial or other dimensions such as elevation or a north–south gradient, and to other quantities such as chemical or geological parameters. The chapter primarily discusses methods for estimating population-wide parameters rather than studying variation in trend within the population, which can be examined using methods presented in other chapters (e.g. Chapters 7, 12, 20). We begin by reviewing key concepts related to trend analysis. We then describe how to evaluate potential bias in trend estimates. An overview of the statistical models used to quantify trends is then presented. We conclude by showing ways to estimate trends using simple methods that can be implemented with spreadsheets.
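The simplest spreadsheet-friendly trend estimator is ordinary least squares of the annual means on year. A sketch with illustrative counts:

```python
import numpy as np

# Ordinary least-squares trend on annual means -- the kind of calculation
# the chapter shows how to do in a spreadsheet. Counts are illustrative.
years = np.arange(2001, 2011)
annual_means = np.array([52, 50, 49, 51, 47, 46, 44, 45, 42, 41], float)

slope, intercept = np.polyfit(years, annual_means, 1)
pct_per_year = 100 * slope / annual_means.mean()
print(f"trend: {slope:+.2f} birds/transect per year ({pct_per_year:+.1f}%/yr)")
```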
Database improvements for motor vehicle/bicycle crash analysis
Lusk, Anne C; Asgarzadeh, Morteza; Farvid, Maryam S
2015-01-01
Background Bicycling is healthy but needs to be safer for more to bike. Police crash templates are designed for reporting crashes between motor vehicles, but not between vehicles/bicycles. If written/drawn bicycle-crash-scene details exist, these are not entered into spreadsheets. Objective To assess which bicycle-crash-scene data might be added to spreadsheets for analysis. Methods Police crash templates from 50 states were analysed. Reports for 3350 motor vehicle/bicycle crashes (2011) were obtained for the New York City area and 300 cases selected (with drawings and on roads with sharrows, bike lanes, cycle tracks and no bike provisions). Crashes were redrawn and new bicycle-crash-scene details were coded and entered into the existing spreadsheet. The association between severity of injuries and bicycle-crash-scene codes was evaluated using multiple logistic regression. Results Police templates only consistently include pedal-cyclist and helmet. Bicycle-crash-scene coded variables for templates could include: 4 bicycle environments, 18 vehicle impact-points (opened-doors and mirrors), 4 bicycle impact-points, motor vehicle/bicycle crash patterns, in/out of the bicycle environment and bike/relevant motor vehicle categories. A test of including these variables suggested that, with bicyclists who had minor injuries as the control group, bicyclists on roads with bike lanes riding outside the lane had lower likelihood of severe injuries (OR, 0.40, 95% CI 0.16 to 0.98) compared with bicyclists riding on roads without bicycle facilities. Conclusions Police templates should include additional bicycle-crash-scene codes for entry into spreadsheets. Crash analysis, including with big data, could then be conducted on bicycle environments, motor vehicle potential impact points/doors/mirrors, bicycle potential impact points, motor vehicle characteristics, location and injury. PMID:25835304
Wu, Sheng-Nan
2004-03-31
The purpose of this study was to develop a method to simulate the cardiac action potential using a Microsoft Excel spreadsheet. The mathematical model contained voltage-gated ionic currents that were modeled using either Beeler-Reuter (B-R) or Luo-Rudy (L-R) phase 1 kinetics. The simulation protocol involves the use of in-cell formulas directly typed into a spreadsheet. The capability of spreadsheet iteration was used in these simulations. It does not require any prior knowledge of computer programming, although the use of the macro language can speed up the calculation. The normal configuration of the cardiac ventricular action potential can be well simulated in the B-R model that is defined by four individual ionic currents, each representing the diffusion of ions through channels in the membrane. The contribution of Na+ inward current to the rate of depolarization is reproduced in this model. After removal of Na+ current from the model, a constant current stimulus elicits an oscillatory change in membrane potential. In the L-R phase 1 model where six types of ionic currents were defined, the effect of extracellular K+ concentration on changes both in the time course of repolarization and in the time-independent K+ current can be demonstrated, when the solutions are implemented in Excel. Using the simulation protocols described here, the users can readily study and graphically display the underlying properties of ionic currents to see how changes in these properties determine the behavior of the heart cell. The method employed in these simulation protocols may also be extended or modified to other biological simulation programs.
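The row-by-row Euler iteration is the heart of the spreadsheet method. The full Beeler-Reuter or Luo-Rudy current sets are too long for a sketch, so the excitable-cell caricature below uses the FitzHugh-Nagumo equations instead; only the iteration scheme, not the ionic model, matches the paper:

```python
# The row-by-row Euler iteration the spreadsheet performs, shown here with
# the FitzHugh-Nagumo caricature of an excitable cell rather than the full
# Beeler-Reuter or Luo-Rudy currents (which need many more state variables).
dt, steps = 0.05, 4000
v, w = -1.2, -0.6          # membrane variable and recovery variable
a, b, eps = 0.7, 0.8, 0.08

trace = []
for i in range(steps):
    stim = 0.5 if 100 <= i < 200 else 0.0    # brief stimulus current
    dv = v - v**3 / 3 - w + stim
    dw = eps * (v + a - b * w)
    v, w = v + dt * dv, w + dt * dw          # one spreadsheet row per step
    trace.append(v)

print(f"peak 'action potential' amplitude: {max(trace):.2f}")
```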
Data Cubes Integration in Spatial OLAP for Agricultural Commodities
NASA Astrophysics Data System (ADS)
Putri, A. I.; Sitanggang, I. S.
2017-03-01
Ministry of Agriculture Indonesia collects data on agricultural commodities in Indonesia annually. Agricultural commodities data include food crops, horticulture, plantations, and livestock. The data are available in spreadsheet format. This study developed data cubes for food crops, plantations, and livestock using the galaxy schema of data warehouse and integrated the data cubes into the SOLAP Horticulture using SpagoBI. SOLAP is useful for data analysis and data visualization. The application displays agricultural commodities data in the form of crosstabs and charts. This study also developed a location intelligence module that visualizes agricultural commodities data on a map. The system was tested using the black box approach. The results showed that the main functions, including roll up, drill down, slice, dice, and pivot, work properly. This application is expected to enable users to easily obtain data summaries of agricultural commodities.
Van Gosen, Bradley S.
2008-01-01
A study conducted in 2006 by the U.S. Geological Survey collected 57 surface rock samples from nine types of intrusive rock in the Iron Hill carbonatite complex. This intrusive complex, located in Gunnison County of southwestern Colorado, is known for its classic carbonatite-alkaline igneous geology and petrology. The Iron Hill complex is also noteworthy for its diverse mineral resources, including enrichments in titanium, rare earth elements, thorium, niobium (columbium), and vanadium. This study was performed to reexamine the chemistry and metallic content of the major rock units of the Iron Hill complex by using modern analytical techniques, while providing a broader suite of elements than the earlier published studies. The report contains the geochemical analyses of the samples in tabular and digital spreadsheet format, providing the analytical results for 55 major and trace elements.
Engine Power Turbine and Propulsion Pod Arrangement Study
NASA Technical Reports Server (NTRS)
Robuck, Mark; Zhang, Yiyi
2014-01-01
A study has been conducted for NASA Glenn Research Center under contract NNC10BA05B, Task NNC11TA80T, to identify beneficial arrangements of the turboshaft engine, transmissions, and related systems within the propulsion pod nacelle of NASA's Large Civil Tilt-Rotor 2nd iteration (LCTR2) vehicle. Propulsion pod layouts were used to investigate potential advantages, disadvantages, as well as constraints of various arrangements assuming front or aft shafted engines. Results from previous NASA LCTR2 propulsion system studies and tasks performed by Boeing under NASA contracts are used as the basis for this study. This configuration consists of two Fixed Geometry Variable Speed Power Turbine Engines and related drive and rotor systems (per nacelle) arranged in tilting nacelles near the wing tip. Entry-into-service (EIS) 2035 technology is assumed for both the engine and drive systems. The variable speed rotor system changes from 100 percent speed for hover to 54 percent speed for cruise by means of a two-speed gearbox concept developed under previous NASA contracts. Propulsion and drive system configurations that resulted in minimum vehicle gross weight were identified in previous work and used here. Results reported in this study illustrate that a forward shafted engine has a slight weight benefit over an aft shafted engine for the LCTR2 vehicle. Although the aft shafted engines provide a more controlled and centered CG (between hover and cruise), the length of the long rotor shaft and complicated engine exhaust arrangement outweighed the potential benefits. A Multi-Disciplinary Analysis and Optimization (MDAO) approach for transmission sizing was also explored for this study. This tool offers quick analysis of gear loads, bearing lives, efficiencies, etc., through use of commercially available RomaxDESIGNER software. The goal was to create quick methods to explore various concept models. The output results from RomaxDESIGNER have been successfully linked to Boeing spreadsheets that generate gear tooth geometry in the CATIA 3D environment. Another initial goal was to link information from RomaxDESIGNER (such as hp, rpm, and gear ratio) to populate Boeing's parametric weight spreadsheet and create an automated method to estimate drive system weight. This was only partially achieved due to the variety of weight models, number of manual inputs, and qualitative assessments required. A simplified weight spreadsheet was used with data inputs from RomaxDESIGNER, along with manual inputs, to perform rough weight calculations.
Incorporating Inquiry into Upper-Level Undergraduate Homework Assignments: The Mini-Journal
NASA Astrophysics Data System (ADS)
Whittington, Alan; Speck, Angela; Witzig, Stephen; Abell, Sandra
2010-05-01
The US National Science Education Standards (2000) state that science should be taught through inquiry. The five essential features of classroom inquiry are that the learner (i) engages in scientifically oriented questions, (ii) gives priority to evidence in responding to questions, (iii) formulates explanations from evidence, (iv) connects explanations to scientific knowledge, and (v) communicates and justifies explanations. One difficulty in achieving this vision at the university level lies in the common perception that inquiry must be fully open and unstructured, and that its implementation will be impractical due to time and material constraints. In an NSF-funded project, "CUES: Connecting Undergraduates to the Enterprise of Science," faculty developed new inquiry-based laboratory curriculum materials using a "mini-journal" approach, which is designed as an alternative to the cookbook laboratory and represents the way that scientists do science. Here we adapt this approach to a homework assignment in an upper-level Planetary Science class, and show that inquiry is achievable in this setting. Traditional homework in this class consisted of problem sets requiring algebraic manipulation, computation, and in most cases an appraisal of the result. Longer questions are broken down into chunks worth 1 to 4 points. In contrast, the mini-journal is a short article that is modeled on the way that scientists do and report science. It includes a title, abstract, introduction (with a clear statement of the problem to be tackled), a description of the methods, results (presented as both tables and graphs), a discussion (with suggestions for future work), and a list of cited work. Students devise their research questions and hypotheses from the paper based on a logical next step in the investigation. Guiding questions in the discussion can assist the students ("it would be interesting to evaluate the effect of ..."). Students submit their own mini-journal, using the same journal-style format. A detailed grading rubric was supplied with the mini-journal, with credit given for formatting, accuracy of calculation, and quality of interpretation and discussion. In the examples we present, research is conducted via spreadsheet modeling, where the students develop their own spreadsheets. The key differences between the old and new formats include (i) the active participation of the students in defining the problem that they will pursue, (ii) the open-ended nature of the inquiry, such that students need to recognize when they have enough information to answer their question, (iii) presentation of results in graphical and tabular formats, and (iv) a written discussion of their findings. Based on detailed student and instructor feedback, our conclusions are: (i) limited inquiry is achievable in upper-level science homework assignments, and is transferable to other topics and classes; (ii) students experience discomfort on being presented with an open-ended assignment, but like the freedom to define their own homework problem; (iii) students recognize that the reading, writing, and critical thinking skills employed in the mini-journal format increase their understanding; (iv) students suggest a combination of mini-journal and traditional homework formats in this class, or replacing midterm exams with mini-journals; (v) student written comments are far more useful than Likert scale responses in assessing instructional techniques and effectiveness.
Chen, Wenjin; Wong, Chung; Vosburgh, Evan; Levine, Arnold J; Foran, David J; Xu, Eugenia Y
2014-07-08
The increasing number of applications of three-dimensional (3D) tumor spheroids as an in vitro model for drug discovery requires their adaptation to large-scale screening formats in every step of a drug screen, including large-scale image analysis. Currently there is no ready-to-use and free image analysis software to meet this large-scale format. Most existing methods involve manually drawing the length and width of the imaged 3D spheroids, which is a tedious and time-consuming process. This study presents a high-throughput image analysis software application - SpheroidSizer, which measures the major and minor axial length of the imaged 3D tumor spheroids automatically and accurately; calculates the volume of each individual 3D tumor spheroid; then outputs the results in two different forms in spreadsheets for easy manipulations in the subsequent data analysis. The main advantage of this software is its powerful image analysis application that is adapted for large numbers of images. It provides high-throughput computation and quality-control workflow. The estimated time to process 1,000 images is about 15 min on a minimally configured laptop, or around 1 min on a multi-core performance workstation. The graphical user interface (GUI) is also designed for easy quality control, and users can manually override the computer results. The key method used in this software is adapted from the active contour algorithm, also known as Snakes, which is especially suitable for images with uneven illumination and noisy background that often plagues automated imaging processing in high-throughput screens. The complimentary "Manual Initialize" and "Hand Draw" tools provide the flexibility to SpheroidSizer in dealing with various types of spheroids and diverse quality images. This high-throughput image analysis software remarkably reduces labor and speeds up the analysis process. Implementing this software is beneficial for 3D tumor spheroids to become a routine in vitro model for drug screens in industry and academia.
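A rough stand-in for the pipeline, assuming a threshold-and-measure approach instead of the paper's active contour, and the common ellipsoid approximation V ≈ (π/6)·L·W² for volume:

```python
import numpy as np
from skimage import draw, filters, measure

# A simpler stand-in for the Snakes-based pipeline described above:
# threshold, label, measure axial lengths, then estimate volume with the
# ellipsoid approximation V = pi/6 * L * W**2 (an assumption, not
# necessarily SpheroidSizer's formula).
img = np.zeros((200, 200))
rr, cc = draw.ellipse(100, 100, 45, 30)   # synthetic spheroid image
img[rr, cc] = 1.0
img += np.random.default_rng(0).normal(0, 0.05, img.shape)  # camera noise

mask = img > filters.threshold_otsu(img)
props = measure.regionprops(measure.label(mask))
p = max(props, key=lambda r: r.area)      # keep the largest object
L, W = p.major_axis_length, p.minor_axis_length
print(f"axes: {L:.1f} x {W:.1f} px, volume ~ {np.pi / 6 * L * W**2:,.0f} px^3")
```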
C-C1-04: How to Win Friends and Influence People with the SAS Output Delivery System
Tolbert, William
2010-01-01
Background and Aims: Long-time SAS users remember the days when SAS output was embarrassingly ugly. Version 7 saw the introduction of the Output Delivery System (ODS). ODS has matured into a very capable subsystem that gives users powerful reporting options. This presentation will highlight useful features and outline a macro-based system for handling multiple ODS destinations simultaneously. Nowadays there is no excuse for ugly SAS output! When building reports, SAS users should think about the needs of those using the reports. Some people just want to review frequency tables, and are happy to do so on a monitor. Others want to be able to print data for review in a meeting. And there are always those who want to work with the data in a spreadsheet. Consider the ideal formats for each of the users outlined above. For the casual data browser, HTML output is ideal. For printing, PDF is preferred. And for additional analysis, Excel is a popular option. With ODS, we can meet all of these needs. Methods: Because ODS permits opening multiple output destinations simultaneously, a single procedure can be used to generate data in HTML, PDF, and Excel at once. The presentation will demonstrate the following: basic ODS syntax for HTML, PDF, and Excel output; a custom HTML table of contents; using the ExcelXP tagset for multi-tab spreadsheets; a custom macro for managing multiple ODS destinations simultaneously; simple PROC TEMPLATE code for easy customization; and techniques for consistent output from multiple platforms. Results: The techniques outlined here have been well-received in a variety of business reporting environments. Conclusions: The SAS ODS provides a wide array of reporting options. Don't limit yourself to just one type of output.
A pier-scour database: 2,427 field and laboratory measurements of pier scour
Benedict, Stephen T.; Caldwell, Andral W.
2014-01-01
The U.S. Geological Survey conducted a literature review to identify potential sources of published pier-scour data, and selected data were compiled into a digital spreadsheet called the 2014 USGS Pier-Scour Database (PSDb-2014) consisting of 569 laboratory and 1,858 field measurements. These data encompass a wide range of laboratory and field conditions and represent field data from 23 States within the United States and from 6 other countries. The digital spreadsheet is available on the Internet and offers a valuable resource to engineers and researchers seeking to understand pier-scour relations in the laboratory and field.
Ground Magnetic Data for West-Central Colorado
Richard Zehner
2012-03-08
Modeled ground magnetic data was extracted from the Pan American Center for Earth and Environmental Studies database at http://irpsrvgis08.utep.edu/viewers/Flex/GravityMagnetic/GravityMagnetic_CyberShare/ on 2/29/2012. The downloaded text file was then imported into an Excel spreadsheet. This spreadsheet data was converted into an ESRI point shapefile in UTM Zone 13 NAD27 projection, showing location and magnetic field strength in nano-Teslas. This point shapefile was then interpolated to an ESRI grid using an inverse-distance weighting method, using ESRI Spatial Analyst. The grid was used to create a contour map of magnetic field strength.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cottam, Joseph A.; Blaha, Leslie M.
Systems have biases. Their interfaces naturally guide a user toward specific patterns of action. For example, modern word-processors and spreadsheets are both capable of taking word wrapping, checking spelling, storing tables, and calculating formulas. You could write a paper in a spreadsheet or could do simple business modeling in a word-processor. However, their interfaces naturally communicate which function they are designed for. Visual analytic interfaces also have biases. In this paper, we outline why simple Markov models are a plausible tool for investigating that bias and how they might be applied. We also discuss some anticipated difficulties in such modelingmore » and touch briefly on what some Markov model extensions might provide.« less
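One plausible reading of "simple Markov models" here: estimate a transition matrix over interface actions and inspect its stationary distribution, which exposes the long-run bias. A toy sketch with an invented matrix:

```python
import numpy as np

# A toy first-order Markov model of interface actions. The transition
# matrix is invented; in practice it would be estimated from logged
# user-action sequences.
actions = ["pan", "zoom", "filter", "select"]
P = np.array([[0.5, 0.3, 0.1, 0.1],    # rows: current action
              [0.4, 0.4, 0.1, 0.1],    # cols: next action
              [0.2, 0.1, 0.4, 0.3],
              [0.2, 0.1, 0.3, 0.4]])

# Stationary distribution: left eigenvector of P for eigenvalue 1.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmin(np.abs(vals - 1))])
pi /= pi.sum()
for action, p in zip(actions, pi):
    print(f"{action:7s}{p:.3f}")   # long-run share of each action -> interface bias
```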
Spreadsheets for Analyzing and Optimizing Space Missions
NASA Technical Reports Server (NTRS)
Some, Raphael R.; Agrawal, Anil K.; Czikmantory, Akos J.; Weisbin, Charles R.; Hua, Hook; Neff, Jon M.; Cowdin, Mark A.; Lewis, Brian S.; Iroz, Juana; Ross, Rick
2009-01-01
XCALIBR (XML Capability Analysis LIBRary) is a set of Extensible Markup Language (XML) database and spreadsheet- based analysis software tools designed to assist in technology-return-on-investment analysis and optimization of technology portfolios pertaining to outer-space missions. XCALIBR is also being examined for use in planning, tracking, and documentation of projects. An XCALIBR database contains information on mission requirements and technological capabilities, which are related by use of an XML taxonomy. XCALIBR incorporates a standardized interface for exporting data and analysis templates to an Excel spreadsheet. Unique features of XCALIBR include the following: It is inherently hierarchical by virtue of its XML basis. The XML taxonomy codifies a comprehensive data structure and data dictionary that includes performance metrics for spacecraft, sensors, and spacecraft systems other than sensors. The taxonomy contains >700 nodes representing all levels, from system through subsystem to individual parts. All entries are searchable and machine readable. There is an intuitive Web-based user interface. The software automatically matches technologies to mission requirements. The software automatically generates, and makes the required entries in, an Excel return-on-investment analysis software tool. The results of an analysis are presented in both tabular and graphical displays.
Earthquake Magnitude: A Teaching Module for the Spreadsheets Across the Curriculum Initiative
NASA Astrophysics Data System (ADS)
Wetzel, L. R.; Vacher, H. L.
2006-12-01
Spreadsheets Across the Curriculum (SSAC) is a library of computer-based activities designed to reinforce or teach quantitative-literacy or mathematics concepts and skills in context. Each activity (called a "module" in the SSAC project) consists of a PowerPoint presentation with embedded Excel spreadsheets. Each module focuses on one or more problems for students to solve. Each student works through a presentation, thinks about the in-context problem, figures out how to solve it mathematically, and builds the spreadsheets to calculate and examine answers. The emphasis is on mathematical problem solving. The intention is for the in- context problems to span the entire range of subjects where quantitative thinking, number sense, and math non-anxiety are relevant. The self-contained modules aim to teach quantitative concepts and skills in a wide variety of disciplines (e.g., health care, finance, biology, and geology). For example, in the Earthquake Magnitude module students create spreadsheets and graphs to explore earthquake magnitude scales, wave amplitude, and energy release. In particular, students realize that earthquake magnitude scales are logarithmic. Because each step in magnitude represents a 10-fold increase in wave amplitude and approximately a 30-fold increase in energy release, large earthquakes are much more powerful than small earthquakes. The module has been used as laboratory and take-home exercises in small structural geology and solid earth geophysics courses with upper level undergraduates. Anonymous pre- and post-tests assessed students' familiarity with Excel as well as other quantitative skills. The SSAC library consists of 27 modules created by a community of educators who met for one-week "module-making workshops" in Olympia, Washington, in July of 2005 and 2006. The educators designed the modules at the workshops both to use in their own classrooms and to make available for others to adopt and adapt at other locations and in other classes. When fully developed, the module collection will be available at the on-line Science Education Resource Center at Carleton College, searchable by quantitative skill, subject area, and Excel level. The number of modules will continue to grow through individual efforts as well as an additional module-making workshop in July of 2007 facilitated by the Washington Center for Improving the Quality of Undergraduate Education.
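The scaling relations the module's spreadsheets exercise are one-liners: amplitude grows as 10^ΔM and radiated energy as roughly 10^(1.5·ΔM), i.e. about 32-fold per magnitude unit:

```python
# The scaling relations behind the module's spreadsheets:
# amplitude ratio = 10**dM, energy ratio = 10**(1.5*dM).
for dm in (0.5, 1.0, 2.0):
    print(f"dM={dm}: amplitude x{10**dm:>7.1f}, energy x{10**(1.5*dm):>9.1f}")
# dM=1.0 gives ~31.6x energy -- the 'approximately 30-fold' quoted above.
```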
Methodology for Outdoor Water Savings Model and Spreadsheet Tool for U.S. and Selected States
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, Alison A.; Chen, Yuting; Dunham, Camilla
Green lawns and landscaping are archetypical of the populated American landscape, and typically require irrigation, which corresponds to a significant fraction of residential, commercial, and institutional water use. In North American cities, the estimated portion of residential water used for outdoor purposes ranges from 22-38% in cooler climates up to 59-67% in dry and hot environments, while turfgrass coverage within the United States spans 11.1-20.2 million hectares (Milesi et al. 2009). One national estimate uses satellite and aerial photography data to develop a relationship between impervious surface and lawn surface area, yielding a conservative estimate of 16.4 (± 3.6) million hectares of lawn surface area in the United States—an area three times larger than that devoted to any irrigated crop (Milesi et al. 2005). One approach that holds promise for cutting unnecessary outdoor water use is the increased deployment of “smart” irrigation controllers to increase the water efficiency of irrigation systems. This report describes the methodology and inputs employed in a mathematical model that quantifies the effects of the U.S. Environmental Protection Agency’s WaterSense labeling program for one such type of controller, weather-based irrigation controllers (WBIC). This model builds on that described in “Methodology for National Water Savings Model and Spreadsheet Tool–Outdoor Water Use” and uses a two-tiered approach to quantify outdoor water savings attributable to the WaterSense program for WBIC, as well as the net present value (NPV) of those savings. While the first iteration of the model assessed national impacts using averaged national values, this version begins by evaluating impacts in three key large states that make up a sizable portion of the irrigation market: California, Florida, and Texas. These states are considered to be the principal market of “smart” irrigation controllers and may account for the bulk of national savings. Modeled water savings and net present value for these three states should be more accurate and representative than the averaged national values, given state-specific inputs such as lot size, water price, and housing stock. To complete the picture of national impacts, the remaining WBIC shipments not assigned to these three states are assessed using the original methodology based on the averaged national values.
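A minimal sketch of the two-tiered roll-up the report describes: state-specific savings for California, Florida, and Texas plus a national-average tier for the remaining shipments, discounted into an NPV. Every figure below is an invented placeholder, not an input from the report.

```python
# Hypothetical tiers: (WBIC units, kgal saved per unit per year)
states = {"CA": (400_000, 9.0), "FL": (250_000, 7.5), "TX": (200_000, 8.0)}
other_units, other_kgal = 300_000, 6.0          # national-average tier
price_per_kgal, rate, years = 5.0, 0.03, 10     # $/kgal, discount rate, horizon

annual = sum(n * kgal * price_per_kgal for n, kgal in states.values())
annual += other_units * other_kgal * price_per_kgal
npv = sum(annual / (1 + rate) ** yr for yr in range(1, years + 1))
print(f"annual savings ${annual/1e6:.1f}M, NPV ${npv/1e6:.1f}M")
```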
NASA Astrophysics Data System (ADS)
Campbell, M. R.; Salditch, L.; Brooks, E. M.; Stein, S.; Spencer, B. D.
2017-12-01
Much recent media attention focuses on Cascadia's earthquake hazard. A widely cited magazine article starts "An earthquake will destroy a sizable portion of the coastal Northwest. The question is when." Stories include statements like "a massive earthquake is overdue", "in the next 50 years, there is a 1-in-10 chance a "really big one" will erupt," or "the odds of the big Cascadia earthquake happening in the next fifty years are roughly one in three." These lead students to ask where the quoted probabilities come from and what they mean. These probability estimates involve two primary choices: what data are used to describe when past earthquakes happened and what models are used to forecast when future earthquakes will happen. The data come from a 10,000-year record of large paleoearthquakes compiled from subsidence data on land and turbidites, offshore deposits recording submarine slope failure. Earthquakes seem to have happened in clusters of four or five events, separated by gaps. Earthquakes within a cluster occur more frequently and regularly than in the full record. Hence the next earthquake is more likely if we assume that we are in the recent cluster that started about 1700 years ago, than if we assume the cluster is over. Students can explore how changing assumptions drastically changes probability estimates using easy-to-write and display spreadsheets, like those shown below. Insight can also come from baseball analogies. The cluster issue is like deciding whether to assume that a hitter's performance in the next game is better described by his lifetime record, or by the past few games, since he may be hitting unusually well or in a slump. The other big choice is whether to assume that the probability of an earthquake is constant with time, or is small immediately after one occurs and then grows with time. This is like whether to assume that a player's performance is the same from year to year, or changes over their career. Thus saying "the chance of getting a hit is N%" or "the probability of an earthquake is N%" involves specifying the assumptions made. Different plausible assumptions yield a wide range of estimates. In both seismology and sports, how to better predict future performance remains an important question.
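The spreadsheets the abstract mentions reduce to a few cells. A minimal sketch, assuming a time-independent (Poisson) recurrence model; the recurrence intervals below are purely illustrative stand-ins for the "full record" versus "recent cluster" choice, not the actual Cascadia estimates, and time-dependent renewal models would give different numbers.

```python
import math

def poisson_prob(mean_recurrence_yr, window_yr=50):
    """Time-independent (Poisson) chance of >= 1 event in the window."""
    return 1 - math.exp(-window_yr / mean_recurrence_yr)

# Placeholder recurrence intervals for the two assumptions discussed above.
for label, tau in [("full record", 500), ("recent cluster", 330)]:
    print(f"{label}: {poisson_prob(tau):.0%} chance in 50 years")
```

Changing a single cell (the assumed recurrence interval) moves the headline probability substantially, which is exactly the classroom point the abstract makes.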
CALiPER Exploratory Study: Accounting for Uncertainty in Lumen Measurements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bergman, Rolf; Paget, Maria L.; Richman, Eric E.
2011-03-31
With a well-defined and shared understanding of uncertainty in lumen measurements, testing laboratories can better evaluate their processes, contributing to greater consistency and credibility of lighting testing, a key component of the U.S. Department of Energy (DOE) Commercially Available LED Product Evaluation and Reporting (CALiPER) program. Reliable lighting testing is a crucial underlying factor contributing toward the success of many energy-efficient lighting efforts, such as the DOE GATEWAY demonstrations, Lighting Facts Label, ENERGY STAR® energy efficient lighting programs, and many others. Uncertainty in measurements is inherent to all testing methodologies, including photometric and other lighting-related testing. Uncertainty exists for all equipment, processes, and systems of measurement, in individual as well as combined ways. A major issue with testing and the resulting accuracy of the tests is the uncertainty of the complete process. Individual equipment uncertainties are typically identified, but their relative value in practice and their combined value with other equipment and processes in the same test are elusive concepts, particularly for complex types of testing such as photometry. The total combined uncertainty of a measurement result is important for repeatable and comparative measurements for light emitting diode (LED) products in comparison with other technologies as well as competing products. This study provides a detailed and step-by-step method for determining uncertainty in lumen measurements, working closely with related standards efforts and key industry experts. This report uses the structure proposed in the Guide to the Expression of Uncertainty in Measurement (GUM) for evaluating and expressing uncertainty in measurements. The steps of the procedure are described, and a spreadsheet format adapted for integrating-sphere and goniophotometric uncertainty measurements is provided for entering parameters, ordering the information, calculating intermediate values and, finally, obtaining expanded uncertainties. Using this basis and examining each step of the photometric measurement and calibration methods, mathematical uncertainty models are developed. Determination of estimated values of input variables is discussed. Guidance is provided for the evaluation of the standard uncertainties of each input estimate, covariances associated with input estimates, and the calculation of the measurement result. With this basis, the combined uncertainty of the measurement results and, finally, the expanded uncertainty can be determined.
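A minimal sketch of the GUM-style roll-up such a spreadsheet implements: independent standard uncertainty components combined in quadrature, then multiplied by a coverage factor. The component names and values below are invented placeholders; a real lumen budget also carries sensitivity coefficients and correlated terms that this sketch omits.

```python
import math

# Combine independent standard uncertainties in quadrature, then expand
# with a coverage factor (k = 2 corresponds to roughly 95 percent coverage).
components = {            # hypothetical lumen-measurement budget, in percent
    "detector calibration": 0.8,
    "sphere spatial nonuniformity": 0.5,
    "stray light": 0.3,
    "lamp current regulation": 0.2,
}
u_combined = math.sqrt(sum(u**2 for u in components.values()))
U_expanded = 2.0 * u_combined    # k = 2
print(f"combined u = {u_combined:.2f}%, expanded U (k=2) = {U_expanded:.2f}%")
```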
Whitfield, Malcolm D; Gillett, Michael; Holmes, Michael; Ogden, Elaine
2006-12-01
The brief for this study was to produce a practical, evidence based, financial planning tool, which could be used to present an economic argument for funding a public health-based prevention programme in coronary heart disease (CHD) related illness on the same basis as treatment interventions. To explore the possibility of using multivariate risk prediction equations, derived from the Framingham and other studies, to estimate how many people in a population are likely to be admitted to hospital in the next 5-10 years with cardiovascular disease (CVD) related events such as heart attacks, strokes, heart failure and kidney disease. To estimate the potential financial impact of reductions in hospital admissions, on an 'invest to save' basis, if primary care trusts (PCTs) were to invest in public health based interventions to reduce cardiovascular risk at a population level. The populations of five UK PCTs were entered into a spreadsheet based decision tree model, in terms of age and sex (this equated to around 620,000 adults). An estimation was made to determine how many people, in each age group, were likely to be diabetic. Population risk factors such as smoking rates, mean body mass index (BMI), mean total cholesterol and mean systolic blood pressure were entered by age group. The spreadsheet then used a variant of the Framingham equation to calculate how many non-diabetic people in each age group were likely to have a heart attack or stroke in the next 5 years. In addition, heart failure and dialysis admission rates were estimated based upon risk factors for incidence. The United Kingdom Prospective Diabetes Study (UKPDS) risk engines 56 and 60 were used to calculate the risk of CHD and stroke, respectively, in people with type 2 diabetes. The spreadsheet deducted the number of people likely to die before reaching hospital and produced a predicted number of hospital admissions for each category over a 5-year period. The final part of the calculation attached a cost to the hospital activity using the UK Healthcare Resource Group (HRG) tariffs. The predicted number of events in each of the primary care trusts was then compared with the actual number of events the previous year (2004/2005). The study used a decision tree type model, which was populated with data from the research literature. The model applied the risk equations to population data from five primary care trusts to estimate how many people would suffer from an acute CVD related event over the next 5 years. The predicted number of events was then compared with the actual number of acute admissions for heart attacks, strokes, heart failure, acute hypoglycaemic attacks, renal failure and coronary bypass surgery the previous year. The first outcome of the model was to compare the estimated number of people in each PCT likely to suffer from a heart attack, a stroke, heart failure or chronic kidney failure with the actual number the previous year (2004/2005). The predicted number was remarkably accurate in the case of heart attack and stroke. There was some over-prediction of chronic kidney disease (CKD), which could be accounted for by known under-diagnosis in this illness group and the inability of the model to pick up, at this stage, the fact that many CKD patients die of a CHD related event before they reach the stage of requiring renal replacement. The second outcome of the model was to estimate the financial consequence of risk reduction.
Moderate reductions in risk, in the order of around 2-4%, were estimated to lead to savings in acute admission costs of around £5.4 million over 5 years. More ambitious targets of risk reduction, in the order of 5-6%, led to estimated savings of around £8.7 million. This study is not presented as the definitive approach to predicting the economic consequences of investment in public health on the cost of secondary care. It is simply a logical, systematic approach to quantifying these issues in order to present a business case for such investment. The research team do not know if the predicted savings would accrue from such investments; it is theoretical at this stage. The point is, however, that if the predictions are correct then the savings will accrue from over 4000 people, out of an adult population of around 185,000, not having a heart attack or a stroke or an acute exacerbation of heart failure.
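A compressed sketch of the model's accounting, not the study's actual decision tree: expected admissions are population times risk per age band, and savings are avoided events times a tariff. Every number below is an illustrative placeholder.

```python
# Back-of-envelope version of the admission-forecast arithmetic.
age_bands = [
    # (label, population, 5-yr CVD event risk) -- all hypothetical
    ("45-54", 80_000, 0.02),
    ("55-64", 60_000, 0.05),
    ("65-74", 45_000, 0.10),
]
tariff_per_admission = 3_000   # pounds sterling, hypothetical HRG-style cost
risk_reduction = 0.03          # 3% relative reduction from prevention

baseline = sum(pop * risk for _, pop, risk in age_bands)
avoided = baseline * risk_reduction
print(f"baseline events: {baseline:,.0f}, avoided: {avoided:,.0f}, "
      f"saving: £{avoided * tariff_per_admission:,.0f} over 5 years")
```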
Favorable Geochemistry from Springs and Wells in Colorado
Richard E. Zehner
2012-02-01
This layer contains favorable geochemistry for high-temperature geothermal systems, as interpreted by Richard "Rick" Zehner. The data are compiled from data obtained from the USGS. The original data set combines 15,622 samples collected in the State of Colorado from several sources including 1) the original Geotherm geochemical database, 2) USGS NWIS (National Water Information System), 3) Colorado Geological Survey geothermal sample data, and 4) original samples collected by R. Zehner at various sites during the 2011 field season. These samples are also available in a separate shapefile FlintWaterSamples.shp. Data from all samples were reportedly collected using standard water sampling protocols (filtering through a 0.45 micron filter, etc.). Sample information was standardized to ppm (milligrams per liter) in spreadsheet columns. Commonly used cation and silica geothermometer temperature estimates are included.
Computational scheme for the prediction of metal ion binding by a soil fulvic acid
Marinsky, J.A.; Reddy, M.M.; Ephraim, J.H.; Mathuthu, A.S.
1995-01-01
The dissociation and metal ion binding properties of a soil fulvic acid have been characterized. Information thus gained was used to compensate for salt and site heterogeneity effects in metal ion complexation by the fulvic acid. An earlier computational scheme has been modified by incorporating an additional step which improves the accuracy of metal ion speciation estimates. An algorithm is employed for the prediction of metal ion binding by organic acid constituents of natural waters (once the organic acid is characterized in terms of functional group identity and abundance). The approach discussed here, currently used with a spreadsheet program on a personal computer, is conceptually envisaged to be compatible with computer programs available for ion binding by inorganic ligands in natural waters.
Challenges to Standardization: A Case Study Using Coastal and Deep-Ocean Water Level Data
NASA Astrophysics Data System (ADS)
Sweeney, A. D.; Stroker, K. J.; Mungov, G.; McLean, S. J.
2015-12-01
Sea levels recorded at coastal stations and inferred from deep-ocean pressure observations at the seafloor are submitted for archive in multiple data and metadata formats. These formats include two forms of schema-less XML and a custom binary format accompanied by metadata in a spreadsheet. The authors report on efforts to use existing standards to make these data more discoverable and more useful beyond their initial use in detecting tsunamis. An initial review of data formats for sea level data around the globe revealed heterogeneity in presentation and content. In the absence of a widely used domain-specific format, we adopted the general model for structuring data and metadata expressed by the Network Common Data Form (netCDF). netCDF has been endorsed by the Open Geospatial Consortium; it has the advantage of small size compared to an equivalent plain-text representation, and it provides a standard way of embedding metadata in the same file. We followed the orthogonal time-series profile of the Climate and Forecast discrete sampling geometries as the convention for structuring the data and describing metadata relevant for use. We adhered to the Attribute Convention for Data Discovery for capturing metadata to support user search. Beyond making it possible to structure data and metadata in a standard way, netCDF is supported by multiple software tools that provide programmatic cataloging, access, subsetting, and transformation to other formats. We will describe our successes and failures in adhering to existing standards and provide requirements for either augmenting existing conventions or developing new ones. Some of these enhancements are specific to sea level data, while others are applicable to time-series data in general.
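A minimal sketch of writing one such record with the netCDF4 Python library, following the CF orthogonal time-series pattern plus a couple of ACDD discovery attributes. The file name, variable names, standard name, and values are illustrative assumptions, not the archive's actual schema.

```python
from netCDF4 import Dataset   # pip install netCDF4
import numpy as np

ds = Dataset("sea_level.nc", "w")
ds.Conventions = "CF-1.8, ACDD-1.3"
ds.featureType = "timeSeries"
ds.title = "Example coastal water-level record"          # ACDD discovery attrs
ds.summary = "One-minute sea-level heights at a single tide station."

ds.createDimension("time", None)                          # unlimited axis
t = ds.createVariable("time", "f8", ("time",))
t.units = "seconds since 1970-01-01T00:00:00Z"
t.standard_name = "time"

h = ds.createVariable("sea_surface_height", "f4", ("time",))
h.standard_name = "sea_surface_height_above_geoid"        # assumed CF name
h.units = "m"

t[:] = np.arange(0, 600, 60)                              # ten 1-minute steps
h[:] = np.random.normal(0.0, 0.1, size=10)                # synthetic heights
ds.close()
```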
NASA Technical Reports Server (NTRS)
Chambers, L. H.; Chaudhury, S.; Page, M. T.; Lankey, A. J.; Doughty, J.; Kern, Steven; Rogerson, Tina M.
2008-01-01
During the summer of 2007, as part of the second year of a NASA-funded project in partnership with Christopher Newport University called SPHERE (Students as Professionals Helping Educators Research the Earth), a group of undergraduate students spent 8 weeks in a research internship at or near NASA Langley Research Center. Three students from this group formed the Clouds group along with a NASA mentor (Chambers), and the brief addition of a local high school student fulfilling a mentorship requirement. The Clouds group was given the task of exploring and analyzing ground-based cloud observations obtained by K-12 students as part of the Students' Cloud Observations On-Line (S'COOL) Project, and the corresponding satellite data. This project began in 1997. The primary analysis tools developed for it were in FORTRAN, a computer language none of the students were familiar with. While they persevered through computer challenges and picky syntax, it eventually became obvious that this was not the most fruitful approach for a project aimed at motivating K-12 students to do their own data analysis. Thus, about halfway through the summer the group shifted its focus to more modern data analysis and visualization tools, namely spreadsheets and Google(tm) Earth. The result of their efforts, so far, is two different Excel spreadsheets and a Google(tm) Earth file. The spreadsheets are set up to allow participating classrooms to paste in a particular dataset of interest, using the standard S'COOL format, and easily perform a variety of analyses and comparisons of the ground cloud observation reports and their correspondence with the satellite data. This includes summarizing cloud occurrence and cloud cover statistics, and comparing cloud cover measurements from the two points of view. A visual classification tool is also provided to compare the cloud levels reported from the two viewpoints. This provides a statistical counterpart to the existing S'COOL data visualization tool, which is used for individual ground-to-satellite correspondences. The Google(tm) Earth file contains a set of placemarks and ground overlays to show participating students the area around their school that the satellite is measuring. This approach will be automated and made interactive by the S'COOL database expert and will also be used to help refine the latitude/longitude location of the participating schools. Once complete, these new data analysis tools will be posted on the S'COOL website for use by the project participants in schools around the US and the world.
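A toy stand-in for the comparison spreadsheet's core summary, using pandas rather than Excel; the cloud-cover categories and column names are hypothetical, not the S'COOL format.

```python
import pandas as pd

# Ground reports and satellite retrievals side by side, then an agreement
# summary of the kind the students' spreadsheets compute.
df = pd.DataFrame({
    "ground_cover":    ["clear", "scattered", "broken", "overcast", "broken"],
    "satellite_cover": ["clear", "scattered", "overcast", "overcast", "broken"],
})
match_rate = (df.ground_cover == df.satellite_cover).mean()
print(f"exact agreement: {match_rate:.0%}")
print(pd.crosstab(df.ground_cover, df.satellite_cover))
```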
NASA Astrophysics Data System (ADS)
Sakimoto, S. E. H.
2016-12-01
Planetary volcanism has redefined what is considered volcanism. "Magma" may now be anything from the molten rock familiar at terrestrial volcanoes to the cryovolcanic ammonia-water mixes erupted on outer solar system moons. However, even with unfamiliar compositions and source mechanisms, we find familiar landforms such as volcanic channels, lakes, flows, and domes, and thus a multitude of possibilities for modeling. As on Earth, these landforms lend themselves to analysis for estimating storage, eruption and/or flow rates. This has potential pitfalls, as extension of the simplified analytic models we often use for terrestrial features into unfamiliar parameter space might yield misleading results. Our most commonly used tools for estimating flow and cooling have tended to lag significantly behind the state of the art; the easiest methods to use are neither realistic nor accurate, while the more realistic and accurate computational methods are not simple to use. Since the latter computational tools tend to be expensive and to require a significant learning curve, there is a need for a user-friendly approach that still takes advantage of their accuracy. One method is to use the computational package to generate a server-based tool that allows less computationally inclined users to get accurate results over their range of input parameters for a given problem geometry. A second method is to use the computational package to generate a polynomial empirical solution for each class of flow geometry that can be fairly easily solved by anyone with a spreadsheet. In this study, we demonstrate both approaches for several channel flow and lava lake geometries with terrestrial and extraterrestrial examples and compare their results. Specifically, we model cooling rectangular channel flow of a yield strength material, with applications to Mauna Loa, Kilauea, Venus, and Mars. This approach also shows promise for model applications to lava lakes, magma flow through cracks, and volcanic dome formation.
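A minimal sketch of the second approach (a polynomial empirical surrogate fitted to runs of an expensive solver). The "solver" below is a placeholder function, not a lava-flow code; the point is only the fit-once, evaluate-anywhere workflow that a spreadsheet user could then reproduce.

```python
import numpy as np

def expensive_model(effusion_rate):
    """Stand-in for a costly computational run (placeholder physics)."""
    return 12.0 * effusion_rate**0.6 + 3.0

rates = np.linspace(1, 100, 25)                 # training runs of the solver
lengths = np.array([expensive_model(q) for q in rates])

coeffs = np.polyfit(rates, lengths, 3)          # cubic empirical fit
surrogate = np.poly1d(coeffs)                   # spreadsheet-friendly polynomial

print("max fit error:", np.max(np.abs(surrogate(rates) - lengths)))
print("predicted length at Q=42:", surrogate(42.0))
```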
NASA Astrophysics Data System (ADS)
Adhitama, Egy; Fauzi, Ahmad
2018-05-01
In this study, a pendulum experimental tool with a light-based timer has been developed to measure the period of a simple pendulum. The data obtained are automatically recorded in an Excel spreadsheet. The intensity of monochromatic light, sensed by a 3DU5C phototransistor, changes dynamically as the pendulum swings. The changing intensity varies the phototransistor's resistance, and this signal is processed by an ATmega328 microcontroller to obtain the period from the variation of brightness with time as the pendulum crosses the light beam. Through the experiment, using calculated average periods, the gravitational acceleration value has been accurately and precisely determined.
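A minimal sketch of the reduction step, assuming the small-angle relation T = 2π√(L/g) and therefore g = 4π²L/T². The length and timer readings below are illustrative, not the paper's measurements.

```python
import math

L = 0.500                                  # pendulum length in metres
periods = [1.418, 1.421, 1.419, 1.420]     # hypothetical timer readings, s

T = sum(periods) / len(periods)            # average period, as in the paper
g = 4 * math.pi**2 * L / T**2
print(f"mean period {T:.4f} s -> g = {g:.3f} m/s^2")
```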
(abstract) Simple Spreadsheet Thermal Models for Cryogenic Applications
NASA Technical Reports Server (NTRS)
Nash, A. E.
1994-01-01
Self-consistent circuit-analog thermal models, which can be run in commercial spreadsheet programs on personal computers, have been created to calculate the cooldown and steady state performance of cryogen cooled Dewars. The models include temperature dependent conduction and radiation effects. The outputs of the models provide temperature distribution and Dewar performance information. These models have been used to analyze the Cryogenic Telescope Test Facility (CTTF). The facility will be on line in early 1995 for its first user, the Infrared Telescope Technology Testbed (ITTT), for the Space Infrared Telescope Facility (SIRTF) at JPL. The model algorithm as well as a comparison of the model predictions and actual performance of this facility will be presented.
Simple Spreadsheet Thermal Models for Cryogenic Applications
NASA Technical Reports Server (NTRS)
Nash, Alfred
1995-01-01
Self-consistent circuit-analog thermal models that can be run in commercial spreadsheet programs on personal computers have been created to calculate the cooldown and steady state performance of cryogen cooled Dewars. The models include temperature dependent conduction and radiation effects. The outputs of the models provide temperature distribution and Dewar performance information. These models have been used to analyze the SIRTF Telescope Test Facility (STTF). The facility has been brought on line for its first user, the Infrared Telescope Technology Testbed (ITTT), for the Space Infrared Telescope Facility (SIRTF) at JPL. The model algorithm as well as a comparison between the models' predictions and actual performance of this facility will be presented.
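A minimal sketch of the circuit-analog idea behind both abstracts: a single lumped heat capacity coupled to a cryogen bath through a conductance, stepped forward explicitly the way a spreadsheet computes one row per timestep. The real models add temperature-dependent conduction and radiation; all parameter values here are illustrative.

```python
C = 5.0e4        # heat capacity of the cooled mass, J/K (placeholder)
G = 2.0          # thermal conductance to the bath, W/K (placeholder)
T_bath = 77.0    # cryogen temperature, K
T = 295.0        # starting temperature, K
dt = 60.0        # timestep, s

for step in range(1, 12 * 60 + 1):          # simulate 12 hours
    T += dt * G * (T_bath - T) / C          # dT/dt = G*(T_bath - T)/C
    if step % 120 == 0:                     # report every 2 hours
        print(f"t = {step*dt/3600:4.1f} h  T = {T:6.1f} K")
```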
Barker, Charles E.; Dallegge, Todd A.; Clark, Arthur C.
2002-01-01
We have updated a simple polyvinyl chloride plastic canister design by adding internal headspace temperature measurement, and have redesigned it so that it is made mostly with off-the-shelf components for ease of construction. Using self-closing quick connects, this basic canister is mated to a zero-head manometer to make a simple coalbed methane desorption system that is easily transported in small aircraft to remote localities. This equipment is used to gather timed measurements of pressure, volume and temperature data that are corrected to standard pressure and temperature (STP) and graphically analyzed using an Excel(tm)-based spreadsheet. Used together, these elements form an effective, practical canister desorption method.
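A minimal sketch of the STP correction such a desorption spreadsheet applies to each timed reading, using the ideal-gas relation; the STP convention (0 °C, 101.325 kPa) and the sample values are assumptions, since conventions vary between labs.

```python
P_STD_KPA = 101.325
T_STD_K = 273.15          # one common STP convention; labs differ

def volume_at_stp(v_measured_cc, p_kpa, temp_c):
    """Boyle/Charles correction: V_stp = V * (P/P_std) * (T_std/T)."""
    return v_measured_cc * (p_kpa / P_STD_KPA) * (T_STD_K / (temp_c + 273.15))

print(volume_at_stp(125.0, 95.4, 21.0))   # hypothetical canister reading
```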
Transition Matrices: A Tool to Assess Student Learning and Improve Instruction
NASA Astrophysics Data System (ADS)
Morris, Gary A.; Walter, Paul; Skees, Spencer; Schwartz, Samantha
2017-03-01
This paper introduces a new spreadsheet tool for adoption by high school or college-level physics teachers who use common assessments in a pre-instruction/post-instruction mode to diagnose student learning and teaching effectiveness. The spreadsheet creates a simple matrix that identifies the percentage of students who select each possible pre-/post-test answer combination on each question of the diagnostic exam. Leveraging analysis of the quality of the incorrect answer choices, one can order the answer choices from worst to best (i.e., correct), resulting in "transition matrices" that can provide deeper insight into student learning and the success or failure of the pedagogical approach than traditional analyses that employ dichotomous scoring.
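A toy version of building one question's transition matrix with pandas instead of the authors' spreadsheet. The answer data and the worst-to-best ordering below are hypothetical; the idea is the percentage of students landing in each (pre, post) answer pair.

```python
import pandas as pd

pre  = list("ABBCADBCAB")           # invented pre-test answers, one per student
post = list("DDBDADDDCB")           # invented post-test answers
order = ["A", "B", "C", "D"]        # assume choices ordered worst to best (D correct)

matrix = pd.crosstab(pd.Categorical(pre, order), pd.Categorical(post, order),
                     rownames=["pre"], colnames=["post"],
                     normalize="all", dropna=False) * 100
print(matrix.round(1))              # percent of class in each transition cell
```

Mass above the diagonal (movement toward better answers) indicates learning; mass below it flags answer choices the instruction may have made more attractive.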
Modeling the world in a spreadsheet: Environmental simulation on a microcomputer
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cartwright, T.J.
1993-12-31
This article focuses on the following spreadsheet models, in three groups. Modeling Natural Systems: Blowing Smoke (atmospheric dispersion of air pollution); Running Water (the underground transport of pollutants); Preserving the Species (determining minimum viable population); Sustainable Yield (managing the forest for the trees); Here Comes the Sun (solar energy from a flat-plate collector). Modeling Social Systems: Macroeconomic Policy (econometrics and the Klein model); Urban Form (the Lowry model of population distribution); Affordable Housing (the Bertaud/World Bank model); Traffic on the Roads (modeling trip generation and trip distribution); Throwing Things Away (a model for waste management); Apples and Oranges (an environmental impact assessment model). Modeling Artificial Systems: Life in a Spreadsheet.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goldberg, Louise F.; Harmon, Anna C.
2015-04-09
This project was funded jointly by the National Renewable Energy Laboratory (NREL) and Oak Ridge National Laboratory (ORNL). ORNL focused on developing a full basement wall system experimental database to enable others to validate hygrothermal simulation codes. NREL focused on testing the moisture durability of practical basement wall interior insulation retrofit solutions for cold climates. The project has produced a physically credible and reliable long-term hygrothermal performance database for retrofit foundation wall insulation systems in zone 6 and 7 climates that are fully compliant with the performance criteria in the 2009 Minnesota Energy Code. These data currently span the period from November 10, 2012 through May 31, 2014 and are anticipated to be extended through November 2014. The experimental data were configured into a standard format that can be published online and that is compatible with standard commercially available spreadsheet and database software.
Laboratory Animal Management Assistant (LAMA): a LIMS for active research colonies.
Milisavljevic, Marko; Hearty, Taryn; Wong, Tony Y T; Portales-Casamar, Elodie; Simpson, Elizabeth M; Wasserman, Wyeth W
2010-06-01
Laboratory Animal Management Assistant (LAMA) is an internet-based system for tracking large laboratory mouse colonies. It has a user-friendly interface with powerful search capabilities that ease day-to-day tasks such as tracking breeding cages and weaning litters. LAMA was originally developed to manage hundreds of new mouse strains generated by a large functional genomics program, the Pleiades Promoter Project ( http://www.pleiades.org ). The software system has proven to be highly flexible, suitable for diverse management approaches to mouse colonies. It allows custom tagging and grouping of animals, simplifying project-specific handling and access to data. Finally, LAMA was developed in close collaboration with mouse technicians to ease the transition from paper- or Excel-based management systems to computerized tracking, allowing data export in a popular spreadsheet format and automatic printing of cage cards. LAMA is an open-access software tool, freely available to the research community at http://launchpad.net/mousedb .
Indoor air pollutants from household-product sources: Project report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sack, T.M.; Steele, D.H.
1991-09-01
A Gas Chromatography/Mass Spectrometry (GC/MS) database obtained during the analysis of 1,159 household products for six common chlorocarbon solvents has been reanalyzed for the presence and concentration of 25 additional chemicals. Using computerized GC/MS software, 1,043 of the original GC/MS data files were recovered and analyzed for the presence of the additional chemicals. Of the 25 additional chemicals, those found most frequently in the household products include acetone (315 products), 2-butanone (200 products), methylcyclohexane (150 products), toluene (488 products), ethylbenzene (157 products), m-xylene (101 products), and o,p-xylene (93 products). A total of 63.6% of the products analyzed in the study contained one or more of the 25 additional analytes at concentrations greater than or equal to 0.1% by weight. The quantitative information presented in the report is also available on diskette in a spreadsheet format.
Installation Torque Tables for Noncritical Applications
NASA Technical Reports Server (NTRS)
Rivera-Rosario, Hazel T.; Powell, Joseph S.
2017-01-01
The objective of this project is to define torque values for bolts and screws when loading is not a concern. Fasteners require a certain torque to fulfill their function and prevent failure. NASA Glenn Research Center did not have a set of fastener torque tables for non-critical applications without loads, usually referring to hand-tight or wrench-tight torqueing. The project is based on two formulas, torque and pullout load. Torque values are calculated, giving way to preliminary data tables. Testing is done on various bolts and metal plates, torqueing them to the point of failure. Around 640 torque tables were developed for UNC, UNF, and M fasteners. Different lengths of thread engagement were analyzed for the 5 most common materials used at GRC. The tables were put together in an Excel spreadsheet and then formatted into a Word document. The plan is to later convert this to an official technical publication or memorandum.
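The abstract does not reproduce its two formulas; the short-form torque relation T = K·F·d, with K the empirical nut factor, is the standard one for such tables, and the sketch below assumes it. All values are illustrative, not entries from the GRC tables.

```python
def installation_torque(preload_n, diameter_m, nut_factor=0.2):
    """Short-form relation T = K * F * d, in N*m.

    nut_factor ~0.2 is a common assumption for dry steel fasteners;
    lubrication, plating, and finish change it substantially.
    """
    return nut_factor * preload_n * diameter_m

# e.g., a 6 mm fastener at a modest 1 kN preload:
print(f"{installation_torque(1000.0, 0.006):.2f} N*m")
```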
Dusel-Bacon, Cynthia; Slack, John F.; Koenig, Alan E.; Foley, Nora K.; Oscarson, Robert L.; Gans, Kathleen D.
2011-01-01
This Open-File Report presents geochemical data for outcrop and drill-core samples from volcanogenic massive sulfide deposits and associated metaigneous and metasedimentary rocks in the Wood River area of the Bonnifield mining district, northern Alaska Range, east-central Alaska. The data consist of major- and trace-element whole-rock geochemical analyses, and major- and trace-element analyses of sulfide minerals determined by electron microprobe and laser ablation-inductively coupled plasma-mass spectrometry (LA-ICP-MS) techniques. The PDF consists of text, an appendix explaining the analytical methods used for the analyses presented in the data tables, a sample location map, and seven data tables. The seven tables are also available as spreadsheets in several file formats. Descriptions and discussions of the Bonnifield deposits are given in Dusel-Bacon and others (2004, 2005, 2006, 2007, 2010).
Massive problem reports mining and analysis based parallelism for similar search
NASA Astrophysics Data System (ADS)
Zhou, Ya; Hu, Cailin; Xiong, Han; Wei, Xiafei; Li, Ling
2017-05-01
Massive problem reports and solutions, accumulated over time and continuously collected in XML Spreadsheet (XMLSS) format from enterprises and organizations, record comprehensive descriptions of problems that can help technicians trace problems and their solutions. Effectively managing and analyzing these massive semi-structured data is a significant and challenging issue: doing so can provide solutions to similar problems, support decisions on immediate problems, and assist product optimization for users during hardware and software maintenance. For this purpose, we build a data management system to manage, mine, and analyze these data; its search results are categorized and organized into several categories so that users can quickly find where their results of interest are located. Experiment results demonstrate that this system greatly outperforms a traditional centralized management system in both performance and the ability to adapt to heterogeneous data. In addition, by re-extracting topics, it enables each cluster to be described more precisely and reasonably.
Rdesign: A data dictionary with relational database design capabilities in Ada
NASA Technical Reports Server (NTRS)
Lekkos, Anthony A.; Kwok, Teresa Ting-Yin
1986-01-01
A data dictionary is defined to be the set of all data attributes, which describe data objects in terms of their intrinsic attributes, such as name, type, size, format and definition. It is recognized as the database for Information Resource Management, facilitating understanding and communication about the relationship between systems applications and their data usage, and helping to achieve data independence by permitting systems applications to access data without knowledge of the location or storage characteristics of the data in the system. A research and development effort to use Ada has produced a data dictionary with database design capabilities. This project supports data specification and analysis and offers a choice of the relational, network, and hierarchical models for logical database design. It provides a highly integrated set of analysis and design transformation tools which range from templates for data element definition and a spreadsheet for defining functional dependencies, through normalization, to a logical design generator.
O’Flaherty, Martin; Mwatsama, Modi; Birt, Christopher; Ireland, Robin; Capewell, Simon
2008-01-01
Abstract Objective To estimate the burden of cardiovascular disease within 15 European Union countries (before the 2004 enlargement) as a result of excess dietary saturated fats attributable to the Common Agricultural Policy (CAP). Methods A spreadsheet model was developed to synthesize data on population, diet, cholesterol levels and mortality rates. A conservative estimate of a reduction in saturated fat consumption of just 2.2 g was chosen, representing 1% of daily energy intake. The fall in serum cholesterol concentration was then calculated, assuming that this 1% reduction in saturated fat consumption was replaced with 0.5% monounsaturated and 0.5% polyunsaturated fats. The resulting reduction in cardiovascular and stroke deaths was then estimated, and a sensitivity analysis conducted. Findings Reducing saturated fat consumption by 1% and increasing monounsaturated and polyunsaturated fat by 0.5% each would lower blood cholesterol levels by approximately 0.06 mmol/l, resulting in approximately 9800 fewer coronary heart disease deaths and 3000 fewer stroke deaths each year. Conclusion The cardiovascular disease burden attributable to CAP appears substantial. Furthermore, these calculations were conservative estimates, and the true mortality burden may be higher. The analysis contributes to the current wider debate concerning the relationship between CAP, health and chronic disease across Europe, together with recent international developments and commitments to reduce chronic diseases. The reported mortality estimates should be considered in relation to the current CAP and any future reforms. PMID:18670665
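A compressed sketch of the spreadsheet's chain of arithmetic: a dietary substitution becomes a cholesterol shift, which scales baseline mortality by an assumed proportional effect. The effect coefficient and baseline deaths below are placeholders chosen only so the output lands near the abstract's roughly 9800 figure; they are not the study's inputs.

```python
chol_drop_mmol_l = 0.06              # from the 1% saturated-fat substitution
mortality_reduction_per_mmol = 0.5   # hypothetical proportional effect per mmol/l
annual_chd_deaths = 330_000          # hypothetical EU-15 baseline

averted = annual_chd_deaths * chol_drop_mmol_l * mortality_reduction_per_mmol
print(f"~{averted:,.0f} CHD deaths averted per year")
```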
AMPLISAS: a web server for multilocus genotyping using next-generation amplicon sequencing data.
Sebastian, Alvaro; Herdegen, Magdalena; Migalska, Magdalena; Radwan, Jacek
2016-03-01
Next-generation sequencing (NGS) technologies are revolutionizing the fields of biology and medicine as powerful tools for amplicon sequencing (AS). Using combinations of primers and barcodes, it is possible to sequence targeted genomic regions with deep coverage for hundreds, even thousands, of individuals in a single experiment. This is extremely valuable for the genotyping of gene families in which locus-specific primers are often difficult to design, such as the major histocompatibility complex (MHC). The utility of AS is, however, limited by the high intrinsic sequencing error rates of NGS technologies and other sources of error such as polymerase amplification or chimera formation. Correcting these errors requires extensive bioinformatic post-processing of NGS data. Amplicon Sequence Assignment (AMPLISAS) is a tool that performs analysis of AS results in a simple and efficient way, while offering customization options for advanced users. AMPLISAS is designed as a three-step pipeline consisting of (i) read demultiplexing, (ii) unique sequence clustering and (iii) erroneous sequence filtering. Allele sequences and frequencies are retrieved in Excel spreadsheet format, making them easy to interpret. AMPLISAS performance has been successfully benchmarked against previously published genotyped MHC data sets obtained with various NGS technologies. © 2015 John Wiley & Sons Ltd.
Kilaru, Varun; Barfield, Richard T; Schroeder, James W; Smith, Alicia K
2012-01-01
Recent evidence suggests that DNA methylation changes may underlie numerous complex traits and diseases. The advent of commercial, array-based methods to interrogate DNA methylation has led to a profusion of epigenetic studies in the literature. Array-based methods, such as the popular Illumina GoldenGate and Infinium platforms, estimate the proportion of DNA methylated at single-base resolution for thousands of CpG sites across the genome. These arrays generate enormous amounts of data, but few software resources exist for efficient and flexible analysis of these data. We developed a software package called MethLAB (http://genetics.emory.edu/conneely/MethLAB) using R, an open source statistical language that can be edited to suit the needs of the user. MethLAB features a graphical user interface (GUI) with a menu-driven format designed to efficiently read in and manipulate array-based methylation data in a user-friendly manner. MethLAB tests for association between methylation and relevant phenotypes by fitting a separate linear model for each CpG site. These models can incorporate both continuous and categorical phenotypes and covariates, as well as fixed or random batch or chip effects. MethLAB accounts for multiple testing by controlling the false discovery rate (FDR) at a user-specified level. Standard output includes a spreadsheet-ready text file and an array of publication-quality figures. Considering the growing interest in and availability of DNA methylation data, there is a great need for user-friendly open source analytical tools. With MethLAB, we present a timely resource that will allow users with no programming experience to implement flexible and powerful analyses of DNA methylation data. PMID:22430798
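MethLAB itself is an R package; as a language-neutral illustration of its core loop (one linear model per CpG site, then false discovery rate control), here is a Python sketch on simulated data. The sample sizes and the simple one-covariate regression are assumptions for the demonstration.

```python
import numpy as np
from scipy import stats
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(0)
n_samples, n_cpgs = 60, 500
phenotype = rng.normal(size=n_samples)                     # simulated phenotype
methylation = rng.uniform(0, 1, size=(n_cpgs, n_samples))  # simulated beta values

# One linear model per CpG site: regress methylation on the phenotype.
pvals = np.array([stats.linregress(phenotype, site).pvalue
                  for site in methylation])

# Benjamini-Hochberg FDR control across all sites.
reject, qvals, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
print(f"{reject.sum()} CpG sites significant at FDR 0.05")
```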
Integrating and analyzing medical and environmental data using ETL and Business Intelligence tools.
Villar, Alejandro; Zarrabeitia, María T; Fdez-Arroyabe, Pablo; Santurtún, Ana
2018-06-01
Processing data that originates from different sources (such as environmental and medical data) can prove to be a difficult task, due to the heterogeneity of variables, storage systems, and file formats that can be used. Moreover, once the amount of data reaches a certain threshold, conventional mining methods (based on spreadsheets or statistical software) become cumbersome or even impossible to apply. Data Extract, Transform, and Load (ETL) solutions provide a framework to normalize and integrate heterogeneous data into a local data store. Additionally, the application of Online Analytical Processing (OLAP), a set of Business Intelligence (BI) methodologies and practices for multidimensional data analysis, can be an invaluable tool for its examination and mining. In this article, we describe a solution based on an ETL + OLAP tandem used for the on-the-fly analysis of tens of millions of individual medical, meteorological, and air quality observations from 16 provinces in Spain provided by 20 different national and regional entities in a diverse array of file types and formats, with the intention of evaluating the effect of several environmental variables on human health in future studies. Our work shows how a sizable amount of data, spread across a wide range of file formats and structures, and originating from a number of different sources belonging to various business domains, can be integrated in a single system that researchers can use for global data analysis and mining.
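A minimal pandas sketch of the ETL-then-OLAP pattern described: normalize two heterogeneous sources to a shared schema, join them, and slice the result along dimensions. All data and column names are invented for illustration; the authors' actual stack is a dedicated ETL + OLAP system, not pandas.

```python
import pandas as pd

medical = pd.DataFrame({"date": pd.to_datetime(["2016-01-01", "2016-01-02"]),
                        "province": ["Cantabria", "Cantabria"],
                        "admissions": [41, 57]})
weather = pd.DataFrame({"date": pd.to_datetime(["2016-01-01", "2016-01-02"]),
                        "province": ["Cantabria", "Cantabria"],
                        "temp_c": [9.5, 3.1]})

# "Load": join the normalized sources into one store.
merged = medical.merge(weather, on=["date", "province"])

# OLAP-style slice: admissions by province and temperature band.
merged["t_band"] = pd.cut(merged.temp_c, bins=[-10, 0, 5, 10, 20])
print(merged.groupby(["province", "t_band"], observed=True).admissions.sum())
```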
Zou, Jiaqi; Li, Na
2013-09-01
Proper design of nucleic acid sequences is crucial for many applications. We have previously established a thermodynamics-based quantitative model to help design aptamer-based nucleic acid probes by predicting equilibrium concentrations of all interacting species. To facilitate customization of this thermodynamic model for different applications, here we present a generic and easy-to-use platform to implement the algorithm of the model with Microsoft® Excel formulas and VBA (Visual Basic for Applications) macros. Two Excel spreadsheets have been developed: one for applications involving only nucleic acid species, the other for applications involving both nucleic acid and non-nucleic acid species. The spreadsheets take the nucleic acid sequences and the initial concentrations of all species as input, guide the user to retrieve the necessary thermodynamic constants, and finally calculate equilibrium concentrations for all species in various bound and unbound conformations. The validity of both spreadsheets has been verified by comparing the modeling results with the experimental results on nucleic acid sequences reported in the literature. This Excel-based platform will allow biomedical researchers to rationalize the sequence design of nucleic acid probes using thermodynamics-based modeling even without relevant theoretical and computational skills. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
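A minimal instance of the equilibrium bookkeeping such spreadsheets encode, reduced to a single probe-target pair solved with the exact quadratic; the concentrations and association constant below are illustrative, and the actual spreadsheets handle many coupled species and conformations simultaneously.

```python
import math

def bound_complex(p_total, t_total, ka):
    """[PT] at equilibrium for P + T <-> PT with Ka = [PT]/([P][T]).

    Solves [PT]^2 - (P0 + T0 + 1/Ka)[PT] + P0*T0 = 0 (physical root).
    """
    b = p_total + t_total + 1.0 / ka
    return (b - math.sqrt(b * b - 4.0 * p_total * t_total)) / 2.0

pt = bound_complex(p_total=1e-6, t_total=5e-7, ka=1e8)   # molar units
print(f"bound: {pt:.3e} M, fraction of target bound: {pt/5e-7:.1%}")
```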
Sprecher, D J; Ley, W B; Whittier, W D; Bowen, J M; Thatcher, C D; Pelzer, K D; Moore, J M
1989-07-15
A computer spreadsheet was developed to predict the economic impact of a management decision to use B-mode ultrasonographic ovine pregnancy diagnosis. The spreadsheet design and spreadsheet cell formulas are provided. The program used the partial farm budget technique to calculate the net return (NR) or cash flow changes that resulted from the decision to use ultrasonography. Using the program, either simple pregnancy diagnosis or pregnancy diagnosis with the ability to distinguish singleton from multiple pregnancies may be compared with performing no ultrasonographic pregnancy diagnosis in the flock. A wide range of user-selected regional variables is used to calculate the cash flow changes associated with the ultrasonography decisions. A variable may be altered through a range of values to conduct a sensitivity analysis of predicted NR. Example sensitivity analyses are included for flock conception rate, veterinary ultrasound fee, and the price of corn. Variables that influence the number of cull animals and the cost of ultrasonography have the greatest impact on predicted NR. Because the determination of singleton or multiple pregnancies is more time consuming, its economic practicality in comparison with simple pregnancy diagnosis is questionable. The value of feed saved by identifying and separately feeding ewes with singleton pregnancies is not offset by the increased ultrasonography cost.
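A skeleton of the partial farm budget calculation, with every input a placeholder for the user-selected regional variables the abstract mentions; it is a sketch of the technique, not the published cell formulas.

```python
ewes = 500
scan_fee_per_ewe = 2.50           # hypothetical veterinary ultrasound fee, $
open_ewe_rate = 0.08              # nonpregnant ewes identified by scanning
feed_saved_per_open_ewe = 40.00   # winter feed not fed to open ewes, $
cull_value_gain_per_ewe = 4.00    # earlier sale at better condition, $

added_costs = ewes * scan_fee_per_ewe
costs_avoided = ewes * open_ewe_rate * (feed_saved_per_open_ewe
                                        + cull_value_gain_per_ewe)
# Partial-budget net return = (returns added + costs avoided) - costs added
print(f"net return from scanning: ${costs_avoided - added_costs:,.2f}")
```

Sweeping any one input (here, for instance, scan_fee_per_ewe) over a range reproduces the sensitivity analyses the abstract describes.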
Baker, Ronald J.; Reilly, Timothy J.; Lopez, Anthony R.; Romanok, Kristin M.; Wengrowski, Edward W
2015-01-01
A screening tool for quantifying levels of concern for contaminants detected in monitoring wells on or near landfills to down-gradient receptors (streams, wetlands and residential lots) was developed and evaluated. The tool uses Quick Domenico Multi-scenario (QDM), a spreadsheet implementation of Domenico-based solute transport, to estimate concentrations of contaminants reaching receptors under steady-state conditions from a constant-strength source. Unlike most other available Domenico-based model applications, QDM calculates the time for down-gradient contaminant concentrations to approach steady state and appropriate dispersivity values, and allows for up to fifty simulations on a single spreadsheet. Sensitivity of QDM solutions to critical model parameters was quantified. The screening tool uses QDM results to categorize landfills as having high, moderate and low levels of concern, based on contaminant concentrations reaching receptors relative to regulatory concentrations. The application of this tool was demonstrated by assessing levels of concern (as defined by the New Jersey Pinelands Commission) for thirty closed, uncapped landfills in the New Jersey Pinelands National Reserve, using historic water-quality data from monitoring wells on and near landfills and hydraulic parameters from regional flow models. Twelve of these landfills are categorized as having high levels of concern, indicating a need for further assessment. This tool is not a replacement for conventional numerically-based transport model or other available Domenico-based applications, but is suitable for quickly assessing the level of concern posed by a landfill or other contaminant point source before expensive and lengthy monitoring or remediation measures are taken. In addition to quantifying the level of concern using historic groundwater-monitoring data, the tool allows for archiving model scenarios and adding refinements as new data become available.
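QDM's exact formulation is not reproduced in the abstract; the sketch below uses one common steady-state, centerline form of the Domenico solution (as used in BIOSCREEN-style screening tools) to show the flavor of the calculation: first-order decay along the flow path plus lateral and vertical spreading from a planar source. All parameter values are hypothetical.

```python
import math

def domenico_centerline(x, v, ax, ay, az, lam, src_width, src_depth, c0=1.0):
    """Steady-state centerline C/C0 a distance x down-gradient of the source.

    v: seepage velocity; ax, ay, az: dispersivities; lam: first-order decay.
    """
    decay = math.exp((x / (2 * ax)) * (1 - math.sqrt(1 + 4 * lam * ax / v)))
    lateral = math.erf(src_width / (4 * math.sqrt(ay * x)))
    vertical = math.erf(src_depth / (2 * math.sqrt(az * x)))
    return c0 * decay * lateral * vertical

# Hypothetical inputs: 200 m to a stream, v = 30 m/yr, decay 0.1 per yr
print(domenico_centerline(x=200, v=30, ax=20, ay=2, az=0.2,
                          lam=0.1, src_width=30, src_depth=5))
```

Comparing the predicted receptor concentration against a regulatory limit is what drives the tool's high/moderate/low level-of-concern categories.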
NASA Astrophysics Data System (ADS)
Malinconico, L. L., Jr.; Sunderlin, D.; Liew, C. W.
2015-12-01
Over the course of the last three years we have designed, developed and refined two Apps for the iPad. GeoFieldBook and StratLogger allow for the real-time display of spatial (structural) and temporal (stratigraphic) field data as well as very easy in-field navigation. These digital techniques have dramatically advanced and simplified how we collect and analyze data in the field. The Apps are not geologic mapping programs, but rather a way of bypassing the analog field book step to acquire digital data directly that can then be used in various analysis programs (GIS, Google Earth, Stereonet, spreadsheet and drawing programs). We now complete all of our fieldwork digitally. GeoFieldBook can be used to collect structural and other field observations. Each record includes location/date/time information, orientation measurements, formation names, text observations and photos taken with the tablet camera. Records are customizable, so users can add fields of their own choosing. Data are displayed on an image base in real time with oriented structural symbols. The image base is also used for in-field navigation. In StratLogger, the user records bed thickness, lithofacies, biofacies, and contact data in preset and modifiable fields. Each bed/unit record may also be photographed and geo-referenced. As each record is collected, a column diagram of the stratigraphic sequence is built in real time, complete with lithology color, lithology texture, and fossil symbols. The recorded data from any measured stratigraphic sequence can be exported as both the live-drawn column image and as a .csv formatted file for use in spreadsheet or other applications. Common to both Apps is the ability to export the data (via .csv files), photographs and maps or stratigraphic columns (images). Since the data are digital, they are easily imported into various processing programs (for example, for stereoplot analysis). Requiring that all maps, stratigraphic columns and cross-sections be produced digitally continues our integration of digital technologies throughout the curriculum. Initial evaluation suggests that students using the Apps progress more quickly toward synthesis and interpretation of the data as well as a deeper understanding of complex 4D field relationships.
LINKING NUTRIENTS TO ALTERATIONS IN AQUATIC LIFE ...
This report estimates the natural background and ambient concentrations of primary producer abundance indicators in California wadeable streams, identifies thresholds of adverse effects of nutrient-stimulated primary producer abundance on benthic macroinvertebrate and algal community structure in CA wadeable streams, and evaluates existing nutrient-algal response models for CA wadeable streams (Tetra Tech 2006), with recommendations for improvements. This information will be included in an assessment of the science forming the basis of recommendations for stream nutrient criteria for the state of California. The objectives of the project are three-fold: 1. Estimate the natural background and ambient concentrations of nutrients and candidate indicators of primary producer abundance in California wadeable streams; 2. Explore relationships and identify thresholds of adverse effects of nutrient concentrations and primary producer abundance on indicators of aquatic life use in California wadeable streams; and 3. Evaluate the Benthic Biomass Spreadsheet Tool (BBST) for California wadeable streams using existing data sets, and recommend avenues for refinement. The intended outcome of this study is NOT final regulatory endpoints for nutrient and response indicators for California wadeable streams.
Source Lines Counter (SLiC) Version 4.0
NASA Technical Reports Server (NTRS)
Monson, Erik W.; Smith, Kevin A.; Newport, Brian J.; Gostelow, Roli D.; Hihn, Jairus M.; Kandt, Ronald K.
2011-01-01
Source Lines Counter (SLiC) is a software utility designed to measure software source code size using logical source statements and other common measures for 22 of the programming languages commonly used at NASA and in the aerospace industry. Such metrics can be used in a wide variety of applications, from parametric cost estimation to software defect analysis. SLiC has a variety of unique features such as automatic code search, automatic file detection, hierarchical directory totals, and spreadsheet-compatible output. SLiC was written for extensibility; support for a new programming language can be added with minimal effort in a short amount of time. SLiC runs on a variety of platforms including UNIX, Windows, and Mac OS X. Its straightforward command-line interface allows for customization and incorporation into the software build process for tracking development metrics.
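A toy flavor of what a source-line counter does: walk a directory tree, detect files by extension, and tally non-blank, non-comment lines. Real logical-statement counting of the kind SLiC performs is language-aware and considerably more involved; this sketch only counts physical lines.

```python
import pathlib

def count_sloc(root, ext=".py", comment="#"):
    """Physical SLOC: non-blank lines that do not start with a comment."""
    total = 0
    for path in pathlib.Path(root).rglob(f"*{ext}"):
        for line in path.read_text(errors="ignore").splitlines():
            stripped = line.strip()
            if stripped and not stripped.startswith(comment):
                total += 1
    return total

print(count_sloc("."))   # SLOC for Python files under the current directory
```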
[Guidelines for budget impact analysis of health technologies in Brazil].
Ferreira-Da-Silva, Andre Luis; Ribeiro, Rodrigo Antonini; Santos, Vânia Cristina Canuto; Elias, Flávia Tavares Silva; d'Oliveira, Alexandre Lemgruber Portugal; Polanczyk, Carisi Anne
2012-07-01
Budget impact analysis (BIA) provides operational financial forecasts for implementing new technologies in healthcare systems. There were previously no specific recommendations for conducting such analyses in Brazil. This paper reviews BIA methods for health technologies and proposes BIA guidelines for the public and private Brazilian healthcare systems. The following recommendations were made: adopt the budget administrator's perspective; use a timeframe of 1 to 5 years; compare reference and alternative scenarios; consider the technology's rate of incorporation; estimate the target population by either an epidemiological approach or measured demand; consider restrictions on the technologies' indications or factors that increase demand for them; consider direct and averted costs; do not adjust for inflation and do not apply discounting; preferably, integrate the information on a spreadsheet; calculate the incremental budget impact between scenarios; and summarize the information in a budget impact report.
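A skeleton of the recommended core calculation: a reference scenario versus an alternative scenario, a yearly uptake path for the new technology, and no discounting, per the guidelines. All inputs below are illustrative placeholders.

```python
target_pop = 10_000
uptake_by_year = [0.10, 0.25, 0.45, 0.65, 0.80]   # hypothetical 5-yr diffusion
cost_old, cost_new = 1_200.0, 1_900.0             # per patient per year

for year, uptake in enumerate(uptake_by_year, start=1):
    reference = target_pop * cost_old
    alternative = target_pop * ((1 - uptake) * cost_old + uptake * cost_new)
    print(f"year {year}: incremental impact = {alternative - reference:,.0f}")
```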
Implementing a method of screening one-well hydraulic barrier design alternatives.
Rubin, Hillel; Shoemaker, Christine A; Köngeter, Jürgen
2009-01-01
This article provides details of applying the method developed by the authors (Rubin et al. 2008b) for screening one-well hydraulic barrier design alternatives. The present article, with its supporting information (a manual and electronic spreadsheets with a case history example), provides the reader with complete details and examples of solving the set of nonlinear equations developed by Rubin et al. (2008b). It allows proper use of the analytical solutions and also allows the various charts given by Rubin et al. (2008b) to be reproduced. The final outputs of the calculations are the required position and discharge of the pumping well. If the contaminant source is nonaqueous phase liquid (NAPL) entrapped within the aquifer, then the method also provides, as a by-product, an estimate of the aquifer remediation progress due to operating the hydraulic barrier.
A user interface for the Kansas Geological Survey slug test model.
Esling, Steven P; Keller, John E
2009-01-01
The Kansas Geological Survey (KGS) developed a semianalytical solution for slug tests that incorporates the effects of partial penetration, anisotropy, and the presence of variable conductivity well skins. The solution can simulate either confined or unconfined conditions. The original model, written in FORTRAN, has a text-based interface with rigid input requirements and limited output options. We re-created the main routine for the KGS model as a Visual Basic macro that runs in most versions of Microsoft Excel and built a simple-to-use Excel spreadsheet interface that automatically displays the graphical results of the test. A comparison of the output from the original FORTRAN code to that of the new Excel spreadsheet version for three cases produced identical results.
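The KGS semianalytical solution itself is too involved to sketch here; as a simpler illustration of slug-test data reduction, the following fits the classic Hvorslev method, which assumes exponential head recovery and a fully developed well (L_e/r_w > 8). Geometry and data below are hypothetical.

```python
import math
import numpy as np

t = np.array([0, 10, 20, 40, 60, 90, 120.0])   # seconds since slug insertion
h = 0.80 * np.exp(-t / 45.0)                   # synthetic normalized head H/H0

slope, _ = np.polyfit(t, np.log(h), 1)         # straight-line fit of ln(H/H0) vs t
t37 = -1.0 / slope                             # basic time lag (head at 37% of H0)

r_c, r_w, L_e = 0.05, 0.05, 1.5                # casing radius, well radius, screen length (m)
K = r_c**2 * math.log(L_e / r_w) / (2 * L_e * t37)   # Hvorslev conductivity
print(f"K ~ {K:.2e} m/s")
```

Unlike the KGS model, this simpler reduction ignores partial penetration, anisotropy, and well skins, which is precisely what the semianalytical solution was developed to handle.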