The Evolution of Spreadsheets.
ERIC Educational Resources Information Center
Schuyler, Michael
1985-01-01
Discusses basic features and functions of spreadsheet programs and describes additional capabilities (editing, windowing, graphics, and word processing) of two second-generation spreadsheet programs: Lotus 1-2-3 and Symphony. (MBR)
Software Reviews: Programs Worth a Second Look.
ERIC Educational Resources Information Center
Classroom Computer Learning, 1989
1989-01-01
Reviews three software programs: (1) "Microsoft Works 2.0": word processing, data processing, and telecommunications, grades 7 and up; (2) "AppleWorks GS": word processor, database, spreadsheet, graphics, and telecommunications, grades 3-12, Apple IIGS; (3) "Choices, Choices: On the Playground, Taking Responsibility":…
Mathematical Modeling with MyMaps and Spreadsheets
ERIC Educational Resources Information Center
Weber, Victoria; Fortune, Nicholas; Williams, Derek; Whitehead, Ashley
2016-01-01
Software programs such as Tinkerplots ® or Geometer's Sketchpad ® can help students solve problems in mathematics classes, but may not be available to them after high school. In contrast, many students who become familiar with Internet tools and programs in office packages (word processing, spreadsheets, etc.) may use them daily to enhance their…
Improving Information Management at Mare Island Naval Shipyard.
1987-03-01
copy reports [Ref. 8: pp. 1-4]. C. PRIME TOKEN RING The prime ring is a token-type computer network linking five PRIME computers electronically. Each...the PRIME net are for news (a bulletin board), electronic mail, word processing, and data filing...communications application) This is a group of general-purpose programs that includes word processing, electronic mail, and spreadsheet applications. Access is
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-11
... massive emails, word processing documents, PDF files, spreadsheets, presentations, database entries, and....pdf . PURPOSES: OGC-EDMS provides OGC with a method to initiate, track, and manage the collection...
Computer Literacy for Teachers.
ERIC Educational Resources Information Center
Sarapin, Marvin I.; Post, Paul E.
Basic concepts of computer literacy are discussed as they relate to industrial arts/technology education. Computer hardware development is briefly examined, and major software categories are defined, including database management, computer graphics, spreadsheet programs, telecommunications and networking, word processing, and computer assisted and…
What Software Skills Do Employers Want Their Employees to Possess?
ERIC Educational Resources Information Center
Perry, William
1998-01-01
Computer skills were identified and grouped as follows: operating systems, graphical user interface, word processing, spreadsheets, and databases. Responses from 47 of 420 employers rated proficiency in all of these groups essential. Database skills were particularly highly rated. (SK)
ERIC Educational Resources Information Center
Zhao, Jensen J.; Ray, Charles M.; Dye, Lee J.; Davis, Rodney
1998-01-01
Executives (n=63) and office-systems educators (n=88) recommended for workers the following categories of computer end-user skills: hardware, operating systems, word processing, spreadsheets, database, desktop publishing, and presentation. (SK)
Using a spreadsheet/table template for economic value added analysis.
Cassey, Margaret
2008-01-01
Translating clinical research into practical applications that are cost effective has received significant attention as staff nurses attempt to expand new knowledge into an already complex daily workflow. A spreadsheet/table template created in a word-processing format can assist with setting up and carrying out the analysis of costs for comparing different approaches to routine activities. Nurses who take the initiative to examine parts of everyday nursing practice with an eye to cost analysis can make significant contributions to maximizing the bottom line.
The UNIX/XENIX Advantage: Applications in Libraries.
ERIC Educational Resources Information Center
Gordon, Kelly L.
1988-01-01
Discusses the application of the UNIX/XENIX operating system to support administrative office automation functions--word processing, spreadsheets, database management systems, electronic mail, and communications--at the Central Michigan University Libraries. Advantages and disadvantages of the XENIX operating system and system configuration are…
Plugging into Marketing Education.
ERIC Educational Resources Information Center
Lunkenheimer, Gary; Swift, Teri
This text contains activities that allow marketing education instructors to integrate their curriculum with word-processing, spreadsheet, and presentation software. Their students can gain experience with technology, fulfill marketing education learner outcomes, and meet the demands of a marketing job. The instructor provides an outline for…
Schools Inc.: An Administrator's Guide to the Business of Education.
ERIC Educational Resources Information Center
McCarthy, Bob; And Others
1989-01-01
This theme issue describes ways in which educational administrators are successfully automating many of their administrative tasks. Articles focus on student management; office automation, including word processing, databases, and spreadsheets; human resources; support services, including supplies, textbooks, and learning resources; financial…
Pupils, Teachers & Palmtop Computers.
ERIC Educational Resources Information Center
Robertson, S. I.; And Others
1996-01-01
To examine the effects of introducing portable computers into secondary schools, a study was conducted regarding information technology skills and attitudes of staff and eighth grade students prior to and after receiving individual portable computers. Knowledge and use of word processing, spreadsheets, and database applications increased for both…
The Microcomputer in the Administrative Office.
ERIC Educational Resources Information Center
Huntington, Fred
1983-01-01
Discusses microcomputer uses for administrative computing in education at site level and central office and recommends that administrators start with a word processing program for time management, an electronic spreadsheet for financial accounting, a database management system for inventories, and self-written programs to alleviate paper…
Applied Educational Computing: Putting Skills to Practice.
ERIC Educational Resources Information Center
Thomerson, J. D.
The College of Education at Valdosta State University (Georgia) developed a followup course to their required entry-level educational computing course. The introductory course covers word processing, spreadsheet, database, presentation, Internet, electronic mail, and operating system software and basic computer concepts. Students expressed a need…
Strategies of Successful Technology Integrators. Part I: Streamlining Classroom Management.
ERIC Educational Resources Information Center
McNally, Lynn; Etchison, Cindy
2000-01-01
Discussion of how to develop curriculum that successfully integrates technology into elementary and secondary school classrooms focuses on solutions for school and classroom management tasks. Highlights include Web-based solutions; student activities; word processing; desktop publishing; draw and paint programs; spreadsheets; and database…
Cloud Computing Based E-Learning System
ERIC Educational Resources Information Center
Al-Zoube, Mohammed; El-Seoud, Samir Abou; Wyne, Mudasser F.
2010-01-01
Cloud computing technologies, although in their early stages, have managed to change the way applications are going to be developed and accessed. These technologies are aimed at running applications as services over the internet on a flexible infrastructure. Microsoft Office applications, such as word processing, Excel spreadsheets, Access databases…
ERIC Educational Resources Information Center
Curtis, Rick
This paper summarizes information about using computer hardware and software to aid in making purchase decisions that are based on user needs. The two major options in hardware are IBM-compatible machines and the Apple Macintosh line. The three basic software applications include word processing, database management, and spreadsheet applications.…
ERIC Educational Resources Information Center
Johnson, Donald M.; Ferguson, James A.; Vokins, Nancy W.; Lester, Melissa L.
2000-01-01
Over 50% of faculty teaching undergraduate agriculture courses (n=58) required use of word processing, Internet, and electronic mail; less than 50% required spreadsheets, databases, graphics, or specialized software. They planned to maintain or increase required computer tasks in their courses. (SK)
Learning about Tasks Computers Can Perform. ERIC Digest.
ERIC Educational Resources Information Center
Brosnan, Patricia A.
Knowing what different kinds of computer equipment can do is the first step in choosing the computer that is right for you. This digest describes a developmental progression of computer capabilities. First the basic three software programs (word processing, spreadsheets, and database programs) are discussed using examples. Next, an explanation of…
The Use of Microcomputers in Distance Teaching Systems. ZIFF Papiere 70.
ERIC Educational Resources Information Center
Rumble, Greville
Microcomputers have revolutionized distance education in virtually every area. Used alone, personal computers provide students with a wide range of utilities, including word processing, graphics packages, and spreadsheets. When linked to a mainframe computer or connected to other personal computers in local area networks, microcomputers can…
Nursing Faculty's Evaluations of Technology Integration into the Instructional Setting
ERIC Educational Resources Information Center
Yu, Weichieh Wayne; Wang, Jenny; Lin, Chunfu Charlie
2013-01-01
A descriptive and correlational study was conducted to assess teachers' perceived expertise in using word processing, spreadsheet, and presentation software applications to facilitate instruction in various nursing subjects. The participants were 313 full- and part-time teachers who taught primarily undergraduate classes and possessed necessary…
Evaluating Technology Integration in the Elementary School: A Site-Based Approach.
ERIC Educational Resources Information Center
Mowe, Richard
This book enables educators at the elementary level to conduct formative evaluations of their technology programs in minimum time. Most of the technology is computer related, including word processing, graphics, desktop publishing, spreadsheets, databases, instructional software, programming, and telecommunications. The design of the book is aimed…
Use of Computer-Based Case Studies in a Problem-Solving Curriculum.
ERIC Educational Resources Information Center
Haworth, Ian S.; And Others
1997-01-01
Describes the use of three case studies, on computer, to enhance problem solving and critical thinking among doctoral pharmacy students in a physical chemistry course. Students are expected to use specific computer programs, spreadsheets, electronic mail, molecular graphics, word processing, online literature searching, and other computer-based…
Handheld Computers in Education. Research Brief
ERIC Educational Resources Information Center
Education Partnerships, Inc., 2003
2003-01-01
For more than 20 years, educators have been trying to find the best practice in using technology for student learning. Some of the most widely used applications with computers have been student learning of programming, word processing, Web research, spreadsheets, games, and Web design. The difficulty with integrating many of these activities…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cottam, Joseph A.; Blaha, Leslie M.
Systems have biases. Their interfaces naturally guide a user toward specific patterns of action. For example, modern word-processors and spreadsheets are both capable of word wrapping, checking spelling, storing tables, and calculating formulas. You could write a paper in a spreadsheet or could do simple business modeling in a word-processor. However, their interfaces naturally communicate which function they are designed for. Visual analytic interfaces also have biases. In this paper, we outline why simple Markov models are a plausible tool for investigating that bias and how they might be applied. We also discuss some anticipated difficulties in such modeling and touch briefly on what some Markov model extensions might provide.
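As a rough illustration of the modeling idea in this abstract (not code from the paper), the sketch below estimates a first-order Markov model of interface actions from an interaction log; the action names and the log itself are hypothetical.

```python
from collections import Counter, defaultdict

# Hypothetical interaction log: a sequence of interface actions.
log = ["open", "type", "type", "check_spelling", "type", "insert_table",
       "type", "check_spelling", "save", "open", "type", "save"]

# Count first-order transitions (action -> next action).
transitions = defaultdict(Counter)
for current, nxt in zip(log, log[1:]):
    transitions[current][nxt] += 1

# Normalize counts into transition probabilities; skewed rows hint at the
# patterns of action the interface encourages.
markov_model = {
    state: {nxt: n / sum(counts.values()) for nxt, n in counts.items()}
    for state, counts in transitions.items()
}

for state, probs in markov_model.items():
    print(state, probs)
```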
Measuring Assurance of Learning Goals: Effectiveness of Computer Training and Assessment Tools
ERIC Educational Resources Information Center
Murphy, Marianne C.; Sharma, Aditya; Rosso, Mark
2012-01-01
Teaching office applications such as word processing, spreadsheet and presentation skills has been widely debated regarding its necessity, extent and delivery method. Training and Assessment applications such as MyITLab, SAM, etc. are popular tools for training students and are particularly useful in measuring Assurance of Learning (AOL)…
ERIC Educational Resources Information Center
Computing Teacher, 1985
1985-01-01
Defines computer literacy and describes a computer literacy course which stresses ethics, hardware, and disk operating systems throughout. Core units on keyboarding, word processing, graphics, database management, problem solving, algorithmic thinking, and programing are outlined, together with additional units on spreadsheets, simulations,…
Establishing the Content Validity of a Basic Computer Literacy Course.
ERIC Educational Resources Information Center
Clements, James; Carifio, James
1995-01-01
Content analysis of 13 textbooks and 2 Department of Education documents was conducted to ascertain common word processing, database, and spreadsheet software skills in order to determine which specific skills should be taught in a high school computer literacy course. Aspects of a basic computer course, created from this analysis, are described.…
A Comparison of Student Perceptions of Their Computer Skills to Their Actual Abilities
ERIC Educational Resources Information Center
Grant, Donna M.; Malloy, Alisha D.; Murphy, Marianne C.
2009-01-01
In this technology intensive society, most students are required to be proficient in computer skills to compete in today's global job market. These computer skills usually consist of basic to advanced knowledge in word processing, presentation, and spreadsheet applications. In many U.S. states, students are required to demonstrate computer…
ERIC Educational Resources Information Center
Stevens, William E.
This report presents a model for conducting a statewide conference for the approximately 900 members of the South Carolina Council of Teachers of Mathematics (SCCTM) using the AppleWorks integrated software as the basis of the implementation plan. The first and second chapters provide background information on the conference and the…
Working Together: Google Apps Goes to School
ERIC Educational Resources Information Center
Oishi, Lindsay
2007-01-01
Online collaboration and project-management tools allow people to work together without being in the same place at the same time. That is not all, however: Google Docs & Spreadsheets, for example, allows the creation of documents and spreadsheets just as in Microsoft Word and Excel, but with greater collaborative capacity. Google Calendar lets…
A document-centric approach for developing the tolAPC ontology.
Blfgeh, Aisha; Warrender, Jennifer; Hilkens, Catharien M U; Lord, Phillip
2017-11-28
There are many challenges associated with ontology building, as the process often touches on many different subject areas; it needs knowledge of the problem domain, an understanding of the ontology formalism, software in use and, sometimes, an understanding of the philosophical background. In practice, it is very rare that an ontology can be completed by a single person, as they are unlikely to combine all of these skills. So people with these skills must collaborate. One solution to this is to use face-to-face meetings, but these can be expensive and time-consuming for teams that are not co-located. Remote collaboration is possible, of course, but one difficulty here is that domain specialists use a wide variety of different "formalisms" to represent and share their data - by far the most common, however, is the "office file", either in the form of a word-processor document or a spreadsheet. Here we describe the development of an ontology of immunological cell types; this was initially developed by domain specialists using an Excel spreadsheet for collaboration. We have transformed this spreadsheet into an ontology using highly-programmatic and pattern-driven ontology development. Critically, the spreadsheet remains part of the source for the ontology; the domain specialists are free to update it, and changes will percolate to the end ontology. We have developed a new ontology describing immunological cell lines built by instantiating ontology design patterns written programmatically, using values from a spreadsheet catalogue. This method employs a spreadsheet that was developed by domain experts. The spreadsheet is unconstrained in its usage and can be freely updated, resulting in a new ontology. This provides a general methodology for ontology development using data generated by domain specialists.
Intelligence Dissemination to the Warfighter
2007-12-01
that prevent other JWICS users from exchanging data. The CIA conducts most of their business on the CIAnet, which can pull data from JWICS but...data. Spreadsheets and word processors, in order to retain a high level of user-friendliness, handle several complex background processes that...the "complex adaptive systems", where the onus is placed equally on the analyst and on the tools to be receptive and adaptable. It is the
Collaborative writing: Tools and tips.
Eapen, Bell Raj
2007-01-01
The majority of technical writing is done by groups of experts, and various web-based applications have made this collaboration easy. Email exchange of word-processor documents with tracked changes used to be the standard technique for collaborative writing. However, web-based tools like Google Docs and Spreadsheets have made the process fast and efficient. Various versioning tools and synchronous editors are available for those who need additional functionality. Having a group leader who decides the scheduling, communication, and conflict-resolution protocols is important for successful collaboration.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-23
... a separate document, our preferred file format is Microsoft Word. If you attach multiple comments (such as form letters), our preferred format is a Microsoft Excel spreadsheet. (2) By Hard Copy: Submit...
NASA Astrophysics Data System (ADS)
Ariana, I. M.; Bagiada, I. M.
2018-01-01
Development of spreadsheet-based integrated transaction processing and financial reporting systems is intended to optimize the capabilities of spreadsheets in accounting data processing. The purposes of this study are: 1) to describe the spreadsheet-based integrated transaction processing and financial reporting systems; and 2) to test their technical and operational feasibility. The study is of the research-and-development type. The main steps of the study are: 1) needs analysis (needs assessment); 2) developing the spreadsheet-based integrated transaction processing and financial reporting systems; and 3) testing their feasibility. Technical feasibility includes the ability of the hardware and operating systems to support the accounting application, along with its simplicity and ease of use. Operational feasibility includes the ability of users to operate the accounting application, the ability of the application to produce information, and the controls built into the application. The instrument used to assess technical and operational feasibility is an expert perception questionnaire on a 4-point Likert scale, from 1 (strongly disagree) to 4 (strongly agree). Data were analyzed using percentage analysis, comparing the number of answers obtained for an item with the ideal number of answers for that item. The spreadsheet-based systems integrate sales, purchase, and cash transaction processing to produce financial reports (statement of profit or loss and other comprehensive income, statement of changes in equity, statement of financial position, and statement of cash flows) and other reports. The systems are feasible from the technical (87.50%) and operational (84.17%) aspects.
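A minimal sketch of the percentage analysis described above, assuming hypothetical expert ratings for a single feasibility item; this is only an illustration of the ratio being computed, not the study's actual data.

```python
# Hypothetical expert ratings on a 4-point Likert scale (1-4) for one
# feasibility item; the ideal answer for each respondent is the maximum score, 4.
ratings = [4, 3, 4, 3, 4, 4, 3, 4]

obtained = sum(ratings)            # total score actually given
ideal = 4 * len(ratings)           # total score if every answer were "strongly agree"
feasibility_pct = 100 * obtained / ideal

print(f"Feasibility for this item: {feasibility_pct:.2f}%")
```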
Erickson, Collin B; Ankenman, Bruce E; Sanchez, Susan M
2018-06-01
This data article provides the summary data from tests comparing various Gaussian process software packages. Each spreadsheet represents a single function or type of function using a particular input sample size. In each spreadsheet, a row gives the results for a particular replication using a single package. Within each spreadsheet there are the results from eight Gaussian process model-fitting packages on five replicates of the surface. There is also one spreadsheet comparing the results from two packages performing stochastic kriging. These data enable comparisons between the packages to determine which package will give users the best results.
A Discovery-Learning 2,4-Dinitrophenylhydrazone Experiment
ERIC Educational Resources Information Center
Vittimberga, Bruno M.; Ruekberg, Ben
2006-01-01
Selections of liquid aldehydes and ketones are proposed for students to determine what property is the best predictor of the color (yellow to red) of their 2,4-dinitrophenylhydrazone derivative. Students may use a computer (spreadsheet or word processor) to analyze results. (Contains 1 table and 3 notes.)
Definition and maintenance of a telemetry database dictionary
NASA Technical Reports Server (NTRS)
Knopf, William P. (Inventor)
2007-01-01
A telemetry dictionary database includes a component for receiving spreadsheet workbooks of telemetry data over a web-based interface from other computer devices. Another component routes the spreadsheet workbooks to a specified directory on the host processing device. A process then checks the received spreadsheet workbooks for errors, and if no errors are detected the spreadsheet workbooks are routed to another directory to await initiation of a remote database loading process. The loading process first converts the spreadsheet workbooks to comma separated value (CSV) files. Next, a network connection with the computer system that hosts the telemetry dictionary database is established and the CSV files are ported to the computer system that hosts the telemetry dictionary database. This is followed by a remote initiation of a database loading program. Upon completion of loading a flatfile generation program is manually initiated to generate a flatfile to be used in a mission operations environment by the core ground system.
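The patent abstract does not specify an implementation, but one step of the described loading process (converting received spreadsheet workbooks to CSV files) can be sketched with the third-party openpyxl library; the directory names and sheet layout here are assumptions.

```python
import csv
from pathlib import Path

from openpyxl import load_workbook  # third-party: pip install openpyxl

def workbook_to_csv(xlsx_path: Path, out_dir: Path) -> None:
    """Convert every worksheet in an .xlsx workbook to its own CSV file."""
    wb = load_workbook(xlsx_path, data_only=True)
    out_dir.mkdir(parents=True, exist_ok=True)
    for sheet_name in wb.sheetnames:
        ws = wb[sheet_name]
        csv_path = out_dir / f"{xlsx_path.stem}_{sheet_name}.csv"
        with csv_path.open("w", newline="") as fh:
            writer = csv.writer(fh)
            for row in ws.iter_rows(values_only=True):
                writer.writerow(row)

# Hypothetical staging directories for received telemetry workbooks.
for workbook in Path("incoming_workbooks").glob("*.xlsx"):
    workbook_to_csv(workbook, Path("csv_staging"))
```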
Tools for Requirements Management: A Comparison of Telelogic DOORS and the HiVe
2006-07-01
types DOORS deals with are text files, spreadsheets, FrameMaker, rich text, Microsoft Word and Microsoft Project. 2.5.1 Predefined file formats DOORS...during the export. DOORS exports FrameMaker files in an incomplete format, meaning DOORS exported files will have to be opened in FrameMaker and saved
Spreadsheet-Like Image Analysis
1992-08-01
1 " DTIC AD-A254 395 S LECTE D, ° AD-E402 350 Technical Report ARPAD-TR-92002 SPREADSHEET-LIKE IMAGE ANALYSIS Paul Willson August 1992 U.S. ARMY...August 1992 4. TITLE AND SUBTITLE 5. FUNDING NUMBERS SPREADSHEET-LIKE IMAGE ANALYSIS 6. AUTHOR(S) Paul Willson 7. PERFORMING ORGANIZATION NAME(S) AND...14. SUBJECT TERMS 15. NUMBER OF PAGES Image analysis , nondestructive inspection, spreadsheet, Macintosh software, 14 neural network, signal processing
Economic Comparison of Processes Using Spreadsheet Programs
NASA Technical Reports Server (NTRS)
Ferrall, J. F.; Pappano, A. W.; Jennings, C. N.
1986-01-01
Inexpensive approach aids plant-design decisions. Commercially available electronic spreadsheet programs aid economic comparison of different processes for producing particular end products. Facilitates plant-design decisions without requiring large expenditures for powerful mainframe computers.
The Growing Problems with Spreadsheet Budgeting
ERIC Educational Resources Information Center
Solomon, Jeff; Johnson, Stella; Wilcox, Leon; Olson, Tom
2010-01-01
The ubiquitous spreadsheet in some version has been the sole and unrivaled instrument of financial management for decades. And it has served well. The spreadsheet provides the flexibility to design a unique business process. It allows users to create formulas that execute complex calculations, and it is available in the globally standardized Excel…
Strontium-90 Error Discovered in Subcontract Laboratory Spreadsheet
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, D. D.; Nagel, A. S.
1999-07-31
West Valley Demonstration Project health physicists and environment scientists discovered a series of errors in a subcontractor's spreadsheet being used to reduce data as part of their strontium-90 analytical process.
DataSpread: Unifying Databases and Spreadsheets.
Bendre, Mangesh; Sun, Bofan; Zhang, Ding; Zhou, Xinyan; Chang, Kevin ChenChuan; Parameswaran, Aditya
2015-08-01
Spreadsheet software is often the tool of choice for ad-hoc tabular data management, processing, and visualization, especially on tiny data sets. On the other hand, relational database systems offer significant power, expressivity, and efficiency over spreadsheet software for data management, while lacking in the ease of use and ad-hoc analysis capabilities. We demonstrate DataSpread, a data exploration tool that holistically unifies databases and spreadsheets. It continues to offer a Microsoft Excel-based spreadsheet front-end, while in parallel managing all the data in a back-end database, specifically, PostgreSQL. DataSpread retains all the advantages of spreadsheets, including ease of use, ad-hoc analysis and visualization capabilities, and a schema-free nature, while also adding the advantages of traditional relational databases, such as scalability and the ability to use arbitrary SQL to import, filter, or join external or internal tables and have the results appear in the spreadsheet. DataSpread needs to reason about and reconcile differences in the notions of schema, addressing of cells and tuples, and the current "pane" (which exists in spreadsheets but not in traditional databases), and support data modifications at both the front-end and the back-end. Our demonstration will center on our first and early prototype of the DataSpread, and will give the attendees a sense for the enormous data exploration capabilities offered by unifying spreadsheets and databases.
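The following is not DataSpread's own code, only a rough analogue of the idea it describes, sketched with sqlite3 and pandas (DataSpread itself pairs an Excel front-end with PostgreSQL): arbitrary SQL runs against a back-end database and the result surfaces as a tabular, spreadsheet-like frame; the table and column names are made up.

```python
import sqlite3
import pandas as pd

# Stand-in for a back-end database (DataSpread itself uses PostgreSQL).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, amount REAL);
    INSERT INTO sales VALUES ('north', 120.0), ('south', 95.5), ('north', 40.0);
""")

# Arbitrary SQL whose result is presented as a spreadsheet-like table.
frame = pd.read_sql_query(
    "SELECT region, SUM(amount) AS total FROM sales GROUP BY region", conn
)
print(frame)

# Edits made on the front-end frame could be written back to the database.
frame.to_sql("sales_summary", conn, if_exists="replace", index=False)
```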
Engine Icing Data - An Analytics Approach
NASA Technical Reports Server (NTRS)
Fitzgerald, Brooke A.; Flegel, Ashlie B.
2017-01-01
Engine icing researchers at the NASA Glenn Research Center use the Escort data acquisition system in the Propulsion Systems Laboratory (PSL) to generate and collect a tremendous amount of data every day. Currently these researchers spend countless hours processing and formatting their data, selecting important variables, and plotting relationships between variables, all by hand, generally analyzing data in a spreadsheet-style program (such as Microsoft Excel). Though spreadsheet-style analysis is familiar and intuitive to many, processing data in spreadsheets is often unreproducible and small mistakes are easily overlooked. Spreadsheet-style analysis is also time inefficient. The same formatting, processing, and plotting procedure has to be repeated for every dataset, which leads to researchers performing the same tedious data munging process over and over instead of making discoveries within their data. This paper documents a data analysis tool written in Python hosted in a Jupyter notebook that vastly simplifies the analysis process. From the file path of any folder containing time series datasets, this tool batch loads every dataset in the folder, processes the datasets in parallel, and ingests them into a widget where users can search for and interactively plot subsets of columns in a number of ways with a click of a button, easily and intuitively comparing their data and discovering interesting dynamics. Furthermore, comparing variables across data sets and integrating video data (while extremely difficult with spreadsheet-style programs) is quite simplified in this tool. This tool has also gathered interest outside the engine icing branch, and will be used by researchers across NASA Glenn Research Center. This project exemplifies the enormous benefit of automating data processing, analysis, and visualization, and will help researchers move from raw data to insight in a much smaller time frame.
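The NASA tool itself lives in a Jupyter notebook with interactive widgets; the sketch below only illustrates the batch-loading and parallel-processing pattern the abstract describes, using pandas and concurrent.futures. The folder name and the cleaning step are assumptions.

```python
from concurrent.futures import ProcessPoolExecutor
from pathlib import Path

import pandas as pd

def load_and_process(csv_path: Path) -> pd.DataFrame:
    """Load one time-series dataset and apply a simple cleaning step."""
    df = pd.read_csv(csv_path)
    df.columns = [c.strip().lower() for c in df.columns]  # normalize headers
    return df.dropna()

def batch_load(folder: str) -> dict:
    """Batch load every CSV in a folder, processing the files in parallel."""
    paths = sorted(Path(folder).glob("*.csv"))
    with ProcessPoolExecutor() as pool:
        frames = pool.map(load_and_process, paths)
    return {p.stem: frame for p, frame in zip(paths, frames)}

if __name__ == "__main__":
    datasets = batch_load("psl_test_runs")   # hypothetical folder of exported runs
    for name, frame in datasets.items():
        print(name, frame.shape)
```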
The personal computer and GP-B management. [Gravity Probe experiment
NASA Technical Reports Server (NTRS)
Neighbors, A. K.
1986-01-01
The Gravity Probe-B (GP-B) experiment is one of the most sophisticated and challenging developments to be undertaken by NASA. Its objective is to measure the relativistic drift of gyroscopes in orbit about the earth. In this paper, the experiment is described, and the strategy of phased procurements for accomplishing the engineering development of the hardware is discussed. The microcomputer is a very convenient and powerful tool in the management of GP-B. It is used in creating and monitoring such project data as schedules, budgets, hardware procurements, and technical and interface requirements. Commercially available software for word processing, database management, communications, spreadsheets, graphics, and program management is used. Examples are described of the efficacy of the application of the computer by the management team.
The Devil and Daniel's Spreadsheet
ERIC Educational Resources Information Center
Burke, Maurice J.
2012-01-01
"When making mathematical models, technology is valuable for varying assumptions, exploring consequences, and comparing predictions with data," notes the Common Core State Standards Initiative (2010, p. 72). This exploration of the recursive process in the Devil and Daniel Webster problem reveals that the symbolic spreadsheet fits this bill.…
Animated-simulation modeling facilitates clinical-process costing.
Zelman, W N; Glick, N D; Blackmore, C C
2001-09-01
Traditionally, the finance department has assumed responsibility for assessing process costs in healthcare organizations. To enhance process-improvement efforts, however, many healthcare providers need to include clinical staff in process cost analysis. Although clinical staff often use electronic spreadsheets to model the cost of specific processes, PC-based animated-simulation tools offer two major advantages over spreadsheets: they allow clinicians to interact more easily with the costing model so that it more closely represents the process being modeled, and they represent cost output as a cost range rather than as a single cost estimate, thereby providing more useful information for decision making.
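To make the cost-range idea concrete, here is a simple Monte Carlo sketch (plainly a substitute for the PC-based animated-simulation packages the article discusses); the cost distributions and hourly rate are hypothetical.

```python
import random

def simulate_visit_cost() -> float:
    """One simulated clinical visit: staff time and supplies both vary."""
    staff_minutes = random.triangular(20, 60, 35)   # low, high, mode
    staff_cost = staff_minutes * (55 / 60)          # hypothetical $55/hour rate
    supply_cost = random.uniform(8, 20)
    return staff_cost + supply_cost

costs = sorted(simulate_visit_cost() for _ in range(10_000))

# Report a cost range (5th-95th percentile) instead of a single point estimate.
low, high = costs[int(0.05 * len(costs))], costs[int(0.95 * len(costs))]
print(f"Estimated cost per visit: ${low:.2f} to ${high:.2f}")
```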
Introducing Artificial Neural Networks through a Spreadsheet Model
ERIC Educational Resources Information Center
Rienzo, Thomas F.; Athappilly, Kuriakose K.
2012-01-01
Business students taking data mining classes are often introduced to artificial neural networks (ANN) through point and click navigation exercises in application software. Even if correct outcomes are obtained, students frequently do not obtain a thorough understanding of ANN processes. This spreadsheet model was created to illuminate the roles of…
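A minimal sketch of the kind of calculation such a spreadsheet model makes visible cell by cell: the forward pass of a tiny feedforward network. The weights and inputs are arbitrary and the sketch is not the article's workbook.

```python
import math

def sigmoid(x: float) -> float:
    return 1.0 / (1.0 + math.exp(-x))

# One input row, one hidden layer of two neurons, one output neuron.
inputs = [0.6, 0.1]                         # like two input cells on a worksheet
hidden_weights = [[0.4, -0.7], [0.9, 0.3]]  # one weight list per hidden neuron
output_weights = [1.2, -0.8]

# Each hidden "cell" is a weighted sum of the input cells passed through sigmoid.
hidden = [sigmoid(sum(w * x for w, x in zip(ws, inputs))) for ws in hidden_weights]

# The output "cell" repeats the same pattern over the hidden cells.
output = sigmoid(sum(w * h for w, h in zip(output_weights, hidden)))
print(f"hidden activations: {hidden}, network output: {output:.4f}")
```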
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1991-02-01
This appendix is a compilation of work done to predict overall cycle performance from gasifier to generator terminals. A spreadsheet has been generated for each case to show flows within a cycle. The spreadsheet shows gaseous or solid composition of flow, temperature of flow, quantity of flow, and heat content of flow. Prediction of steam and gas turbine performance was obtained by the computer program GTPro. Outputs of all runs for each combined cycle reviewed have been added to this appendix. A process schematic displaying all flows predicted through GTPro and the spreadsheet is also added to this appendix. The numbered bubbles on the schematic correspond to columns on the top headings of the spreadsheet.
When Spreadsheets Become Software - Quality Control Challenges and Approaches - 13360
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fountain, Stefanie A.; Chen, Emmie G.; Beech, John F.
2013-07-01
As part of a preliminary waste acceptance criteria (PWAC) development, several commercial models were employed, including the Hydrologic Evaluation of Landfill Performance model (HELP) [1], the Disposal Unit Source Term - Multiple Species model (DUSTMS) [2], and the Analytical Transient One, Two, and Three-Dimensional model (AT123D) [3]. The results of these models were post-processed in MS Excel spreadsheets to convert the model results to alternate units, compare the groundwater concentrations to the groundwater concentration thresholds, and then to adjust the waste contaminant masses (based on average concentration over the waste volume) as needed in an attempt to achieve groundwater concentrations at the limiting point of assessment that would meet the compliance concentrations while maximizing the potential use of the landfill (i.e., maximizing the volume of projected waste being generated that could be placed in the landfill). During the course of the PWAC calculation development, one of the Microsoft (MS) Excel spreadsheets used to post-process the results of the commercial model packages grew to include more than 575,000 formulas across 18 worksheets. This spreadsheet was used to assess six base scenarios as well as nine uncertainty/sensitivity scenarios. The complexity of the spreadsheet resulted in the need for a rigorous quality control (QC) procedure to verify data entry and confirm the accuracy of formulas. (authors)
ExcelAutomat: a tool for systematic processing of files as applied to quantum chemical calculations
NASA Astrophysics Data System (ADS)
Laloo, Jalal Z. A.; Laloo, Nassirah; Rhyman, Lydia; Ramasami, Ponnadurai
2017-07-01
The processing of the input and output files of quantum chemical calculations often necessitates a spreadsheet as a key component of the workflow. Spreadsheet packages with a built-in programming language editor can automate the steps involved and thus provide a direct link between processing files and the spreadsheet. This helps to reduce user-interventions as well as the need to switch between different programs to carry out each step. The ExcelAutomat tool is the implementation of this method in Microsoft Excel (MS Excel) using the default Visual Basic for Application (VBA) programming language. The code in ExcelAutomat was adapted to work with the platform-independent open-source LibreOffice Calc, which also supports VBA. ExcelAutomat provides an interface through the spreadsheet to automate repetitive tasks such as merging input files, splitting, parsing and compiling data from output files, and generation of unique filenames. Selected extracted parameters can be retrieved as variables which can be included in custom codes for a tailored approach. ExcelAutomat works with Gaussian files and is adapted for use with other computational packages including the non-commercial GAMESS. ExcelAutomat is available as a downloadable MS Excel workbook or as a LibreOffice workbook.
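ExcelAutomat itself is written in VBA; the sketch below is only a Python analogue of the same kind of repetitive task, scanning quantum-chemistry output files and compiling one value per file into a spreadsheet-friendly CSV. The file pattern and the energy-line regular expression are assumptions and would need to be adjusted to the actual package's output.

```python
import csv
import re
from pathlib import Path

# Assumed marker for a converged energy line in the output files; adjust the
# pattern to match the quantum-chemistry package actually being used.
ENERGY_RE = re.compile(r"SCF Done:\s+E\([^)]+\)\s*=\s*(-?\d+\.\d+)")

rows = []
for out_file in sorted(Path("calculations").glob("*.log")):
    energy = None
    for line in out_file.read_text(errors="ignore").splitlines():
        match = ENERGY_RE.search(line)
        if match:
            energy = float(match.group(1))   # keep the last occurrence
    rows.append((out_file.name, energy))

# Compile the parsed values into a CSV file that opens directly in a spreadsheet.
with open("energies.csv", "w", newline="") as fh:
    writer = csv.writer(fh)
    writer.writerow(["file", "scf_energy_hartree"])
    writer.writerows(rows)
```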
Solving L-L Extraction Problems with Excel Spreadsheet
ERIC Educational Resources Information Center
Teppaitoon, Wittaya
2016-01-01
This work aims to demonstrate the use of Excel spreadsheets for solving L-L extraction problems. The key to solving the problems successfully is to be able to determine a tie line on the ternary diagram where the calculation must be carried out. This enables the reader to analyze the extraction process starting with a simple operation, the…
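A minimal sketch of the inverse-lever-rule step that underlies such spreadsheet solutions for a single extraction stage; the feed, solvent, and tie-line compositions below are made up, and in practice the extract and raffinate compositions would be read from the ternary diagram as the article describes.

```python
# Single-stage liquid-liquid extraction via the inverse lever rule.
# Compositions are solute mass fractions read off a (hypothetical) tie line.
F, xF = 100.0, 0.30   # feed mass (kg) and its solute fraction
S, xS = 50.0, 0.00    # pure solvent

# Mixing point M lies on the line joining the feed and solvent points.
M = F + S
xM = (F * xF + S * xS) / M

# The tie line through M intersects the binodal at the extract and raffinate
# compositions (values that would normally be read from the diagram).
xE, xR = 0.26, 0.12

# Inverse lever rule splits M into extract E and raffinate R.
E = M * (xM - xR) / (xE - xR)
R = M - E
print(f"mixing point xM = {xM:.3f}, extract = {E:.1f} kg, raffinate = {R:.1f} kg")
```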
ERIC Educational Resources Information Center
Berardi, Victor L.
2012-01-01
Using information systems to solve business problems is increasingly required of everyone in an organization, not just technical specialists. In the operations management class, spreadsheet usage has intensified with the focus on building decision models to solve operations management concerns such as forecasting, process capability, and inventory…
ERIC Educational Resources Information Center
Fasoula, S.; Nikitas, P.; Pappa-Louisi, A.
2017-01-01
A series of Microsoft Excel spreadsheets were developed to simulate the process of separation optimization under isocratic and simple gradient conditions. The optimization procedure is performed in a stepwise fashion using simple macros for an automatic application of this approach. The proposed optimization approach involves modeling of the peak…
Lifelong learning skills: how experienced are students when they enter medical school?
Whittle, Sue R; Murdoch-Eaton, Deborah G
2004-09-01
Widening participation initiatives together with changes in school curricula in England may broaden the range of lifelong learning skills experience of new undergraduates. This project examines the experience levels of current students, as a comparative baseline. First-year medical students completed a questionnaire on arrival, investigating their practice of 31 skills during the previous two years. Responses show that most students have regularly practised transferable skills. However, significant numbers report little experience, particularly in IT skills such as email, using the Internet, spreadsheets and databases. Some remain unfamiliar with word processing. Library research, essay writing and oral presentation are also rarely practised by substantial numbers. One-third of students lack experience of evaluating their own strengths and weaknesses. Current students already show diversity of experience in skills on arrival at medical school. Changes in the near future may increase this range of experience further, and necessitate changes to undergraduate courses.
ERIC Educational Resources Information Center
Masalski, William J.
This book seeks to develop, enhance, and expand students' understanding of mathematics by using technology. Topics covered include the advantages of spreadsheets along with the opportunity to explore the 'what if?' type of questions encountered in the problem-solving process, enhancing the user's insight into the development and use of algorithms,…
OntoMaton: a bioportal powered ontology widget for Google Spreadsheets.
Maguire, Eamonn; González-Beltrán, Alejandra; Whetzel, Patricia L; Sansone, Susanna-Assunta; Rocca-Serra, Philippe
2013-02-15
Data collection in spreadsheets is ubiquitous, but current solutions lack support for collaborative semantic annotation that would promote shared and interdisciplinary annotation practices, supporting geographically distributed players. OntoMaton is an open source solution that brings ontology lookup and tagging capabilities into a cloud-based collaborative editing environment, harnessing Google Spreadsheets and the NCBO Web services. It is a general purpose, format-agnostic tool that may serve as a component of the ISA software suite. OntoMaton can also be used to assist the ontology development process. OntoMaton is freely available from Google widgets under the CPAL open source license; documentation and examples at: https://github.com/ISA-tools/OntoMaton.
NASA Technical Reports Server (NTRS)
Messaro, Semma; Harrison, Phillip
2010-01-01
Ares I zonal random vibration environments due to acoustic impingement and combustion processes are developed for liftoff, ascent, and reentry. Random vibration test criteria for Ares I Upper Stage pyrotechnic components are developed by enveloping the applicable zonal environments where each component is located. Random vibration tests will be conducted to assure that these components will survive and function appropriately after exposure to the expected vibration environments. Methodology: Random vibration test criteria for Ares I Upper Stage pyrotechnic components were desired that would envelope all the applicable environments where each component was located. Applicable Ares I vehicle drawings and design information needed to be assessed to determine the location(s) for each component on the Ares I Upper Stage. Design and test criteria needed to be developed by plotting and enveloping the applicable environments using Microsoft Excel spreadsheet software and documenting them in a report using Microsoft Word processing software. Conclusion: Random vibration liftoff, ascent, and green run design and test criteria for the Upper Stage pyrotechnic components were developed by using Microsoft Excel to envelope zonal environments applicable to each component. Results were transferred from Excel into a report using Microsoft Word. After the report is reviewed and edited by my mentor it will be submitted for publication as an attachment to a memorandum. Pyrotechnic component designers will extract criteria from my report for incorporation into the design and test specifications for components. Eventually the hardware will be tested to the environments I developed to assure that the components will survive and function appropriately after exposure to the expected vibration environments.
Refinery spreadsheet highlights microcomputer process applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tucker, M.A.
1984-01-23
Microcomputer applications in the process areas at Chevron U.S.A. refineries and at the Chevron Research Co. illustrate how the microcomputer has changed the way we do our jobs. This article will describe major uses of the microcomputer as a personal work tool in Chevron process areas. It will also describe how and why many of Chevron's microcomputer applications were developed and their characteristics. One of our earliest microcomputer applications, developed in late 1981, was an electronic spreadsheet program using a small desktop microcomputer. It was designed to help a refinery planner prepare monthly plans for a small portion of one of our major refineries. This particular microcomputer had a tiny 4-in. screen, and the reports were several strips of print-out from the microcomputer's 3-in.-wide internal printer taped together. In spite of these archaic computing conditions, it was a successful application. It automated what had been very tedious and time-consuming calculations with a pencil, a calculator, and a great deal of erasing. It eliminated filling out large ''horseblanket'' reports. The electronic spreadsheet was also flexible; the planner could easily change the worksheet to match new operating constraints, new process conditions, and new feeds and products. Fortunately, within just a few months, this application graduated to a similar electronic spreadsheet program on a new, more powerful microcomputer. It had a bigger display screen and a letter-size printer. The same application is still in use today, although it has been greatly enhanced and altered to match extensive plant modifications. And there are plans to expand it again onto yet another, more powerful microcomputer.
Spreadsheets Answer "What If...?
ERIC Educational Resources Information Center
Pogge, Alfred F.; Lunetta, Vincent N.
1987-01-01
Demonstrates how a spreadsheet program can do calculations, freeing students to question, analyze data and learn science. Notes several popular spreadsheet programs. Gives an example using Lotus 1-2-3 spreadsheets for a sampling experiment in Biology. Shows other examples of spreadsheet use in laboratory activities. (CW)
NASA Astrophysics Data System (ADS)
Eso, R.; Safiuddin, L. O.; Agusu, L.; Arfa, L. M. R. F.
2018-04-01
We propose a teaching instrument demonstrating circular membrane waves using interactive Excel spreadsheets with Visual Basic for Applications (VBA) programming. It is based on the analytic solution for circular membrane waves involving Bessel functions. The vibration modes and frequencies are determined using the Bessel approximation and the initial conditions. The 3D perspective based on spreadsheet functions and facilities has been explored to show 3D moving objects in translational or rotational processes. This instrument is very useful both in teaching activity and in the learning process of wave physics. The visualization of waves in the circular membrane, which shows the m and n vibration modes at a given frequency very clearly, has been compared and matched to experimental results obtained using the resonance method. The peak deflection varies in time when an initial condition is applied and shows the same pattern as a MATLAB simulation with zero initial velocity.
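The analytic solution referred to above can be evaluated directly; the sketch below (a Python/scipy stand-in for the article's Excel/VBA worksheet) computes one (m, n) normal mode of an ideal circular membrane, u(r, θ, t) = J_m(k r) cos(mθ) cos(c k t) with k = α_mn / a, where α_mn is the n-th positive zero of J_m. The radius and wave speed are arbitrary.

```python
import numpy as np
from scipy.special import jv, jn_zeros

a, c = 1.0, 1.0          # membrane radius and wave speed (arbitrary units)
m, n = 2, 1              # angular and radial mode numbers

alpha_mn = jn_zeros(m, n)[-1]     # n-th positive zero of J_m
k = alpha_mn / a                  # radial wavenumber
freq = c * k / (2 * np.pi)        # mode frequency

def mode(r, theta, t):
    """Displacement of the (m, n) normal mode at radius r, angle theta, time t."""
    return jv(m, k * r) * np.cos(m * theta) * np.cos(c * k * t)

r = np.linspace(0, a, 5)
print(f"mode ({m},{n}): frequency = {freq:.3f}")
print("radial profile at theta=0, t=0:", np.round(mode(r, 0.0, 0.0), 3))
```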
Basic statistics with Microsoft Excel: a review.
Divisi, Duilio; Di Leonardo, Gabriella; Zaccagna, Gino; Crisci, Roberto
2017-06-01
The scientific world is enriched daily with new knowledge, due to new technologies and continuous discoveries. The mathematical functions explain the statistical concepts, particularly those of mean, median, and mode, along with those of frequency and frequency distribution associated with histograms and graphical representations, determining elaborative processes on the basis of spreadsheet operations. The aim of the study is to highlight the mathematical basis of the statistical models that regulate the operation of spreadsheets in Microsoft Excel.
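For reference, the same descriptive statistics the review discusses can be computed outside the spreadsheet; this small Python sketch (with made-up data) mirrors the corresponding Excel worksheet functions noted in the comments.

```python
from collections import Counter
from statistics import mean, median, mode

data = [4, 7, 7, 2, 9, 7, 4, 5, 2, 7]

print("mean:", mean(data))       # Excel: =AVERAGE(range)
print("median:", median(data))   # Excel: =MEDIAN(range)
print("mode:", mode(data))       # Excel: =MODE(range)

# Frequency distribution, the basis for a histogram.
for value, count in sorted(Counter(data).items()):
    print(f"value {value}: frequency {count}")
```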
Using Spreadsheets in the Management, Analysis, and Reporting of Evaluation Data.
ERIC Educational Resources Information Center
Glowacki, Margaret L.; Rice, Richard L., Jr.
Currently available spreadsheet programs for microcomputers provide many features that can be very useful for evaluators and researchers. Some of the basic concepts involved in spreadsheet use are introduced, and information is provided on the use of spreadsheets in maintaining and analyzing evaluation data. The spreadsheet used in the discussion…
Beyond the Mechanics of Spreadsheets: Using Design Instruction to Address Spreadsheet Errors
ERIC Educational Resources Information Center
Schneider, Kent N.; Becker, Lana L.; Berg, Gary G.
2017-01-01
Given that the usage and complexity of spreadsheets in the accounting profession are expected to increase, it is more important than ever to ensure that accounting graduates are aware of the dangers of spreadsheet errors and are equipped with design skills to minimize those errors. Although spreadsheet mechanics are prevalent in accounting…
Problem Solving with Spreadsheets.
ERIC Educational Resources Information Center
Catterall, P.; Lewis, R.
1985-01-01
Documents the educational use of spreadsheets through a description of exploratory work which utilizes spreadsheets to achieve the objectives of Conway's Game of Life, a scientific method game for the development of problem-solving techniques. The implementation and classroom use of the spreadsheet programs are discussed. (MBR)
Errors in patient specimen collection: application of statistical process control.
Dzik, Walter Sunny; Beckman, Neil; Selleng, Kathleen; Heddle, Nancy; Szczepiorkowski, Zbigniew; Wendel, Silvano; Murphy, Michael
2008-10-01
Errors in the collection and labeling of blood samples for pretransfusion testing increase the risk of transfusion-associated patient morbidity and mortality. Statistical process control (SPC) is a recognized method to monitor the performance of a critical process. An easy-to-use SPC method was tested to determine its feasibility as a tool for monitoring quality in transfusion medicine. SPC control charts were adapted to a spreadsheet presentation. Data tabulating the frequency of mislabeled and miscollected blood samples from 10 hospitals in five countries from 2004 to 2006 were used to demonstrate the method. Control charts were produced to monitor process stability. The participating hospitals found the SPC spreadsheet very suitable to monitor the performance of the sample labeling and collection and applied SPC charts to suit their specific needs. One hospital monitored subcategories of sample error in detail. A large hospital monitored the number of wrong-blood-in-tube (WBIT) events. Four smaller-sized facilities, each following the same policy for sample collection, combined their data on WBIT samples into a single control chart. One hospital used the control chart to monitor the effect of an educational intervention. A simple SPC method is described that can monitor the process of sample collection and labeling in any hospital. SPC could be applied to other critical steps in the transfusion processes as a tool for biovigilance and could be used to develop regional or national performance standards for pretransfusion sample collection. A link is provided to download the spreadsheet for free.
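The paper adapts control charts to a spreadsheet; as a general illustration (not the authors' spreadsheet), a standard p chart for proportion data computes a centre line and three-sigma limits as below. The monthly sample and error counts are hypothetical.

```python
import math

# Hypothetical monthly data: samples collected and mislabeled samples found.
samples = [820, 790, 845, 810, 800, 835]
errors  = [  6,   4,   9,   5,   7,   5]

p_bar = sum(errors) / sum(samples)   # overall error proportion (centre line)

for month, (n, x) in enumerate(zip(samples, errors), start=1):
    p = x / n
    sigma = math.sqrt(p_bar * (1 - p_bar) / n)
    ucl = p_bar + 3 * sigma
    lcl = max(0.0, p_bar - 3 * sigma)
    flag = "OUT OF CONTROL" if not (lcl <= p <= ucl) else "in control"
    print(f"month {month}: p={p:.4f}  limits=({lcl:.4f}, {ucl:.4f})  {flag}")
```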
Determination of Needed Spreadsheet Competencies for Business Personnel in the Mid-South States.
ERIC Educational Resources Information Center
Rogers, Betty S.; Arn, Joseph V.
1993-01-01
A survey of 209 Mid-South businesses determined spreadsheet usage, what competencies are needed for entry-level and continued employment, and sources of spreadsheet training. Recommended that, because of their widespread use, spreadsheets should be taught to all business students. (Author/JOW)
FORGE Milford Triaxial Test Data and Summary from EGI labs
Joe Moore
2016-03-01
Six samples were evaluated in unconfined and triaxial compression; their data are included in separate Excel spreadsheets and summarized in the Word document. Three samples were plugged along the axis of the core (presumed to be nominally vertical) and three samples were plugged perpendicular to the axis of the core. A designation of "V" indicates vertical, meaning the long axis of the plugged sample is aligned with the axis of the core. Similarly, "H" indicates a sample that is nominally horizontal and cut orthogonal to the axis of the core. Stress-strain curves were made before and after the testing and are included in the Word document. The confining pressure for this test was 2800 psi. A series of tests is being carried out to define a failure envelope, to provide representative hydraulic fracture design parameters, and for future geomechanical assessments. The samples are from well 52-21, which reaches a maximum depth of 3581 ft +/- 2 ft into a gneiss complex.
Solving Optimization Problems with Spreadsheets
ERIC Educational Resources Information Center
Beigie, Darin
2017-01-01
Spreadsheets provide a rich setting for first-year algebra students to solve problems. Individual spreadsheet cells play the role of variables, and creating algebraic expressions for a spreadsheet to perform a task allows students to achieve a glimpse of how mathematics is used to program a computer and solve problems. Classic optimization…
Using Spreadsheets to Teach Statistics in Geography.
ERIC Educational Resources Information Center
Lee, M. P.; Soper, J. B.
1987-01-01
Maintains that teaching methods of statistical calculation in geography may be enhanced by using a computer spreadsheet. The spreadsheet format of rows and columns allows the data to be inspected and altered to demonstrate various statistical properties. The inclusion of graphics and database facilities further adds to the value of a spreadsheet.…
Simulation Software's Effect on College Students Spreadsheet Project Scores
ERIC Educational Resources Information Center
Atkinson, J. Kirk; Thrasher, Evelyn H.; Coleman, Phillip D.
2011-01-01
The purpose of this study is to explore the potential impact of support materials on student spreadsheet skill acquisition. Specifically, this study examines the use of an online spreadsheet simulation tool versus a printed book across two independent student groups. This study hypothesizes that the online spreadsheet simulation tool will have a…
Finding P-Values for F Tests of Hypothesis on a Spreadsheet.
ERIC Educational Resources Information Center
Rochowicz, John A., Jr.
The calculation of the F statistic for a one-factor analysis of variance (ANOVA) and the construction of an ANOVA table are easily implemented on a spreadsheet. This paper describes how to compute the p-value (observed significance level) for a particular F statistic on a spreadsheet. Decision making on a spreadsheet and applications to the…
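The upper-tail area the paper computes on a spreadsheet (roughly what Excel's FDIST/F.DIST.RT functions provide) can be sketched as follows; the F statistic and degrees of freedom are example values, not taken from the paper.

```python
# Right-tail p-value for an F statistic, analogous to a spreadsheet F.DIST.RT() call.
# F_stat, df_between, df_within below are illustrative values only.
from scipy.stats import f

F_stat, df_between, df_within = 4.07, 2, 27   # example one-factor ANOVA
p_value = f.sf(F_stat, df_between, df_within) # survival function = upper-tail area
print(f"p-value = {p_value:.4f}")
```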
Cognitive Skills, Domain Knowledge, and Self-Efficacy: Effects on Spreadsheet Quality
ERIC Educational Resources Information Center
Adkins, Joni K.
2011-01-01
Numerous studies have shown that spreadsheets used in companies often have errors which may affect the quality of the decisions made with these tools. Many businesses are unaware or choose to ignore the risks associated with spreadsheet use. The intent of this study was to learn more about the characteristics of spreadsheet end user developers,…
Spreadsheets and Bulgarian goats
NASA Astrophysics Data System (ADS)
Sugden, Steve
2012-10-01
We consider a problem appearing in an Australian Mathematics Challenge in 2003. This article considers whether a spreadsheet might be used to model this problem, thus allowing students to explore its structure within the spreadsheet environment. It then goes on to reflect on some general principles of problem decomposition when the final goal is a successful and lucid spreadsheet implementation.
NASA Technical Reports Server (NTRS)
Orr, John L.
1997-01-01
In many ways, the typical approach to the handling of bibliographic material for generating review articles and similar manuscripts has changed little since the use of xerographic reproduction has become widespread. The basic approach is to collect reprints of the relevant material and place it in folders or stacks based on its dominant content. As the amount of information available increases with the passage of time, the viability of this mechanical approach to bibliographic management decreases. The personal computer revolution has changed the way we deal with many familiar tasks. For example, word processing on personal computers has supplanted the typewriter for many applications. Similarly, spreadsheets have not only replaced many routine uses of calculators but have also made possible new applications because the cost of calculation is extremely low. Objective: The objective of this research was to use personal computer bibliographic software technology to support the determination of spacecraft maximum acceptable concentration (SMAC) values. Specific Aims: The specific aims were to produce draft SMAC documents for hydrogen sulfide and tetrachloroethylene taking maximum advantage of the bibliographic software.
Teaching physics using Microsoft Excel
NASA Astrophysics Data System (ADS)
Uddin, Zaheer; Ahsanuddin, Muhammad; Khan, Danish Ahmed
2017-09-01
Excel is both ubiquitous and easily understandable. Most people from every walk of life know how to use MS Office and Excel spreadsheets. Students are also familiar with spreadsheets. Most students know how to use spreadsheets for data analysis. Besides basic use of Excel, some important aspects of spreadsheets are highlighted in this article. MS Excel can be used to visualize the effects of various parameters in a physical system. It can be used as a simulation tool; simulation of wind data has been done through spreadsheets in this study. Examples of Lissajous figures and a damped harmonic oscillator are presented in this article.
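A minimal sketch of the damped-oscillator table such a spreadsheet would tabulate; the amplitude, damping constant, and frequency are arbitrary example values, not the article's.

```python
# Sketch of the kind of table a spreadsheet would generate for a damped oscillator:
# x(t) = A * exp(-b t) * cos(omega t).  Parameter values are arbitrary examples.
import math

A, b, omega = 1.0, 0.2, 2.0 * math.pi          # amplitude, damping constant, angular frequency
dt, n = 0.1, 50                                 # time step and number of rows

for i in range(n + 1):
    t = i * dt
    x = A * math.exp(-b * t) * math.cos(omega * t)
    print(f"{t:6.2f}\t{x:8.4f}")                # two columns: time, displacement
```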
Data acquisition and real-time control using spreadsheets: interfacing Excel with external hardware.
Aliane, Nourdine
2010-07-01
Spreadsheets have become a popular computational tool and a powerful platform for performing engineering calculations. Moreover, spreadsheets include a macro language, which permits the inclusion of standard computer code in worksheets and thereby enables developers to greatly extend spreadsheets' capabilities by designing specific add-ins. This paper describes how to use Excel spreadsheets in conjunction with the Visual Basic for Applications programming language to perform data acquisition and real-time control. Afterwards, the paper presents two Excel applications with interactive user interfaces developed for laboratory demonstrations and experiments in an introductory course in control. 2010 ISA. Published by Elsevier Ltd. All rights reserved.
A simple node and conductor data generator for SINDA
NASA Technical Reports Server (NTRS)
Gottula, Ronald R.
1992-01-01
This paper presents a simple, automated method to generate NODE and CONDUCTOR DATA for thermal math models. The method uses personal computer spreadsheets to create SINDA inputs. It was developed in order to make SINDA modeling less time consuming and serves as an alternative to graphical methods. Anyone having some experience using a personal computer can easily implement this process. The user develops spreadsheets to automatically calculate capacitances and conductances based on material properties and dimensional data. The necessary node and conductor information is then taken from the spreadsheets and automatically arranged into the proper format, ready for insertion directly into the SINDA model. This technique provides a number of benefits to the SINDA user, such as a reduction in the number of hand calculations and an ability to very quickly generate a parametric set of NODE and CONDUCTOR DATA blocks. It also provides advantages over graphical thermal modeling systems by retaining the analyst's complete visibility into the thermal network and by permitting user comments anywhere within the DATA blocks.
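A minimal sketch of the idea, assuming illustrative material properties and a simplified card layout; the actual spreadsheet columns and SINDA formatting in the paper may differ.

```python
# Minimal sketch of generating NODE and CONDUCTOR entries from material data,
# in the spirit of the spreadsheet approach described above.  The property
# values and the exact card layout are illustrative, not the paper's format.
nodes = [  # (node id, initial temp [F], density [lb/in^3], specific heat [Btu/lb-F], volume [in^3])
    (1, 70.0, 0.098, 0.23, 2.0),
    (2, 70.0, 0.098, 0.23, 3.5),
]
conductors = [  # (conductor id, node A, node B, conductivity [Btu/hr-in-F], area [in^2], length [in])
    (1, 1, 2, 9.0, 1.5, 0.25),
]

print("NODE DATA")
for nid, temp, rho, cp, vol in nodes:
    capacitance = rho * cp * vol                      # C = rho * cp * V
    print(f"    {nid}, {temp}, {capacitance:.5f}")

print("CONDUCTOR DATA")
for gid, na, nb, k, area, length in conductors:
    conductance = k * area / length                   # G = k * A / L
    print(f"    {gid}, {na}, {nb}, {conductance:.5f}")
```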
Numerous features have been included to facilitate the modeling process, from model setup and data input, through presentation and analysis of results, to easy export of results to spreadsheet programs for additional analysis.
76 FR 70517 - Proposed Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-14
... requested. These systems generally also provide analytics, spreadsheets, and other tools designed to enable funds to analyze the data presented, as well as communication tools to process fund instructions...
SE Requirements Development Tool User Guide
DOE Office of Scientific and Technical Information (OSTI.GOV)
Benson, Faith Ann
2016-05-13
The LANL Systems Engineering Requirements Development Tool (SERDT) is a data collection tool created in InfoPath for use with the Los Alamos National Laboratory’s (LANL) SharePoint sites. Projects can fail if a clear definition of the final product requirements is not performed. For projects to be successful, requirements must be defined early in the project and those requirements must be tracked during execution of the project to ensure the goals of the project are met. Therefore, the focus of this tool is requirements definition. The content of this form is based on International Council on Systems Engineering (INCOSE) and Department of Defense (DoD) process standards and allows for single or collaborative input. The “Scoping” section is where project information is entered by the project team prior to requirements development, and includes definitions and examples to assist the user in completing the forms. The data entered will be used to define the requirements and, once the form is filled out, a “Requirements List” is automatically generated and a Word document is created and saved to a SharePoint document library. SharePoint also includes the ability to download the requirements data defined in the InfoPath form into an Excel spreadsheet. This User Guide will assist you in navigating through the data entry process.
Spreadsheet-based engine data analysis tool - user's guide.
DOT National Transportation Integrated Search
2016-07-01
This record refers to both the spreadsheet tool - Fleet Equipment Performance Measurement Preventive Maintenance Model: Spreadsheet-Based Engine Data Analysis Tool, http://ntl.bts.gov/lib/60000/60000/60007/0-6626-P1_Final.xlsm - and its accompanying ...
Modeling Steady-State Groundwater Flow Using Microcomputer Spreadsheets.
ERIC Educational Resources Information Center
Ousey, John Russell, Jr.
1986-01-01
Describes how microcomputer spreadsheets are easily adapted for use in groundwater modeling. Presents spreadsheet set-ups and the results of five groundwater models. Suggests that this approach can provide a basis for demonstrations, laboratory exercises, and student projects. (ML)
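One common way spreadsheets are adapted for this task is iterative recalculation of a cell-averaging formula for steady-state heads (Laplace's equation). The sketch below reproduces that relaxation scheme outside the spreadsheet; the grid size and fixed-head boundary values are arbitrary examples, not the article's model.

```python
# Sketch of the relaxation scheme a spreadsheet can perform with iterative
# recalculation: steady-state heads satisfy Laplace's equation, so each interior
# cell is repeatedly replaced by the average of its four neighbours.
ROWS, COLS = 10, 20
h = [[0.0] * COLS for _ in range(ROWS)]

for r in range(ROWS):                      # fixed heads: 100 m on the left, 50 m on the right
    h[r][0], h[r][-1] = 100.0, 50.0
for c in range(COLS):                      # top and bottom rows fixed to a linear gradient
    h[0][c] = h[-1][c] = 100.0 + (50.0 - 100.0) * c / (COLS - 1)

for _ in range(500):                       # repeated sweeps (Gauss-Seidel style)
    for r in range(1, ROWS - 1):
        for c in range(1, COLS - 1):
            h[r][c] = 0.25 * (h[r - 1][c] + h[r + 1][c] + h[r][c - 1] + h[r][c + 1])

print(" ".join(f"{v:6.1f}" for v in h[ROWS // 2]))   # heads along the middle row
```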
Numerical Simulation of Multicomponent Chromatography Using Spreadsheets.
ERIC Educational Resources Information Center
Frey, Douglas D.
1990-01-01
Illustrated is the use of spreadsheet programs for implementing finite difference numerical simulations of chromatography as an instructional tool in a separations course. Discussed are differential equations, discretization and integration, spreadsheet development, computer requirements, and typical simulation results. (CW)
How Spreadsheets Boost Productivity.
ERIC Educational Resources Information Center
Ross, James
1988-01-01
Explains the use of computerized bookkeeping systems called spreadsheets to perform mathematical and accounting functions such as totaling expenditures, averaging test grades, and transferring funds. Advises about adapting spreadsheet programs and discusses several essential features, including linkage, macro functions, and sharing capabilities.…
This work describes a method for using spreadsheet analyses of process designs and retrofits to provide simple and quick economic and environmental evaluations simultaneously. The method focuses attention on those streams and components that have the largest monetary values and...
Modeling the Milky Way: Spreadsheet Science.
ERIC Educational Resources Information Center
Whitmer, John C.
1990-01-01
Described is the generation of a scale model of the solar system and the Milky Way galaxy using a computer spreadsheet program. A sample spreadsheet including cell formulas is provided. Suggestions for using this activity as a teaching technique are included. (CW)
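The underlying arithmetic is a single scale factor applied to real distances; a small sketch of the kind of conversion such a spreadsheet tabulates, using rounded astronomical values and an arbitrary 1:10^9 scale (not necessarily the article's choice).

```python
# Scale-model arithmetic: pick a scale, then convert real distances to model
# distances.  The distances are rounded reference values; the scale is arbitrary.
AU_KM = 1.496e8                     # one astronomical unit in km
scale = 1.0 / 1e9                   # 1 : 1,000,000,000 model

objects = [("Earth orbit radius", 1.0 * AU_KM),
           ("Neptune orbit radius", 30.1 * AU_KM),
           ("Distance to Proxima Centauri", 4.0e13),
           ("Milky Way diameter", 9.5e17)]

for name, km in objects:
    model_km = km * scale
    print(f"{name:30s} {model_km:12.3e} km in the model")
```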
Using Spreadsheets to Produce Acid-Base Titration Curves.
ERIC Educational Resources Information Center
Cawley, Martin James; Parkinson, John
1995-01-01
Describes two spreadsheets for producing acid-base titration curves, one uses relatively simple cell formulae that can be written into the spreadsheet by inexperienced students and the second uses more complex formulae that are best written by the teacher. (JRH)
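A sketch of the simpler strong acid/strong base case such cell formulae compute; the concentrations and volumes are example values, and the article's spreadsheets may also treat other systems.

```python
# Illustrative strong acid / strong base titration curve of the kind a
# spreadsheet would tabulate row by row (example concentrations and volumes).
import math

c_acid, v_acid = 0.10, 25.0      # mol/L HCl, mL
c_base = 0.10                    # mol/L NaOH

for v_base in range(0, 51):      # mL of titrant added, 1 mL steps
    n_acid = c_acid * v_acid / 1000.0
    n_base = c_base * v_base / 1000.0
    v_tot = (v_acid + v_base) / 1000.0
    diff = (n_acid - n_base) / v_tot          # excess acid (+) or base (-)
    if abs(diff) < 1e-12:
        ph = 7.0                               # equivalence point
    elif diff > 0:
        ph = -math.log10(diff)                 # excess H+
    else:
        ph = 14.0 + math.log10(-diff)          # excess OH-
    print(f"{v_base:4d} mL  pH = {ph:5.2f}")
```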
In this spreadsheet, user(s) provide their company’s manufacturer code, user contact information for EV-CIS, and user roles. This spreadsheet is used for the Company Authorizing Official (CAO), CROMERR Signer, and EV-CIS Submitters.
Preparation of School District Budgets with Microcomputer Electronic Spreadsheets.
ERIC Educational Resources Information Center
Hinitz, Herman J.
1996-01-01
Preparing a microcomputer electronic spreadsheet containing all relevant school district budgetary information is possible with currently available hardware and software (such as Lotus 1-2-3), despite random-access-memory limitations. Spreadsheets can provide financial summaries, inventory-control listings, scheduling alternatives,…
A Spreadsheet in the Mathematics Classroom.
ERIC Educational Resources Information Center
Watkins, Will; Taylor, Monty
1989-01-01
Demonstrates how spreadsheets can be used to implement linear system solving algorithms in college mathematics classes. Lotus 1-2-3 is described, a linear system of equations is illustrated using spreadsheets, and the interplay between applications, computations, and theory is discussed. (four references) (LRW)
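The row-reduction algorithm that maps naturally onto spreadsheet cells can be sketched as follows; the 3x3 system is an arbitrary example, not one from the article.

```python
# Gaussian elimination with partial pivoting and back substitution, the kind of
# row-by-row computation a spreadsheet lays out cell by cell.
def solve(A, b):
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]      # augmented matrix
    for k in range(n):
        p = max(range(k, n), key=lambda r: abs(M[r][k]))  # partial pivot
        M[k], M[p] = M[p], M[k]
        for r in range(k + 1, n):
            f = M[r][k] / M[k][k]
            for c in range(k, n + 1):
                M[r][c] -= f * M[k][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):                        # back substitution
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

print(solve([[2.0, 1.0, -1.0], [-3.0, -1.0, 2.0], [-2.0, 1.0, 2.0]], [8.0, -11.0, -3.0]))
# expected solution: [2.0, 3.0, -1.0]
```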
The Iodine-Clock Reaction--A Spreadsheet Simulation To Test.
ERIC Educational Resources Information Center
Swain, P. A.
1997-01-01
Describes a spreadsheet activity for the iodine-clock reaction which follows the concentrations of all reactants and products for 200 seconds and gives the induction period. Explains that, although there are limitations to the spreadsheet, it is nevertheless illuminating. (Author/ASK)
XLWrap - Querying and Integrating Arbitrary Spreadsheets with SPARQL
NASA Astrophysics Data System (ADS)
Langegger, Andreas; Wöß, Wolfram
In this paper a novel approach is presented for generating RDF graphs of arbitrary complexity from various spreadsheet layouts. Currently, none of the available spreadsheet-to-RDF wrappers supports cross tables and tables where data is not aligned in rows. Similar to RDF123, XLWrap is based on template graphs where fragments of triples can be mapped to specific cells of a spreadsheet. Additionally, it features a full expression algebra based on the syntax of OpenOffice Calc and various shift operations, which can be used to repeat similar mappings in order to wrap cross tables including multiple sheets and spreadsheet files. The set of available expression functions includes most of the native functions of OpenOffice Calc and can be easily extended by users of XLWrap.
Integrating Critical Spreadsheet Competencies into the Accounting Curriculum
ERIC Educational Resources Information Center
Walters, L. Melissa; Pergola, Teresa M.
2012-01-01
The American Institute of Certified Public Accountants (AICPA) and the International Accounting Education Standards Board (IAESB) identify spreadsheet technology as a key information technology (IT) competency for accounting professionals. However requisite spreadsheet competencies are not specifically defined by the AICPA or IAESB nor are they…
Exploring Difference Equations with Spreadsheets.
ERIC Educational Resources Information Center
Walsh, Thomas P.
1996-01-01
When using spreadsheets to explore real-world problems involving periodic change, students can observe what happens at each period, generate a graph, and learn how changing the starting quantity or constants affects results. Spreadsheet lessons for high school students are presented that explore mathematical modeling, linear programming, and…
Visual Basic programs for spreadsheet analysis.
Hunt, Bruce
2005-01-01
A collection of Visual Basic programs, entitled Function.xls, has been written for ground water spreadsheet calculations. This collection includes programs for calculating mathematical functions and for evaluating analytical solutions in ground water hydraulics and contaminant transport. Several spreadsheet examples are given to illustrate their use.
Decision Analysis Using Spreadsheets.
ERIC Educational Resources Information Center
Sounderpandian, Jayavel
1989-01-01
Discussion of decision analysis and its importance in a business curriculum focuses on the use of spreadsheets instead of commercial software packages for computer assisted instruction. A hypothetical example is given of a company drilling for oil, and suggestions are provided for classroom exercises using spreadsheets. (seven references) (LRW)
ERIC Educational Resources Information Center
Batt, Russell H., Ed.
1988-01-01
Notes two uses of computer spreadsheets in the chemistry classroom. Discusses the general use of the spreadsheet to easily provide changing parameters of equations and then replotting the results on the screen. Presents a molecular orbital spreadsheet calculation of the LCAO-MO approach. Supplies representative printouts and graphs. (MVL)
How to Create Automatically Graded Spreadsheets for Statistics Courses
ERIC Educational Resources Information Center
LoSchiavo, Frank M.
2016-01-01
Instructors often use spreadsheet software (e.g., Microsoft Excel) in their statistics courses so that students can gain experience conducting computerized analyses. Unfortunately, students tend to make several predictable errors when programming spreadsheets. Without immediate feedback, programming errors are likely to go undetected, and as a…
Teaching with Spreadsheets: An Example from Heat Transfer.
ERIC Educational Resources Information Center
Drago, Peter
1993-01-01
Provides an activity which measures the heat transfer through an insulated cylindrical tank, allowing the student to gain a better knowledge of both the physics involved and the working of spreadsheets. Provides both a spreadsheet solution and a maximum-minimum method of solution for the problem. (MVL)
Spreadsheet-Based Program for Simulating Atomic Emission Spectra
ERIC Educational Resources Information Center
Flannigan, David J.
2014-01-01
A simple Excel spreadsheet-based program for simulating atomic emission spectra from the properties of neutral atoms (e.g., energies and statistical weights of the electronic states, electronic partition functions, transition probabilities, etc.) is described. The contents of the spreadsheet (i.e., input parameters, formulas for calculating…
Teaching Quality Control with Chocolate Chip Cookies
ERIC Educational Resources Information Center
Baker, Ardith
2014-01-01
Chocolate chip cookies are used to illustrate the importance and effectiveness of control charts in Statistical Process Control. By counting the number of chocolate chips, creating the spreadsheet, calculating the control limits and graphing the control charts, the student becomes actively engaged in the learning process. In addition, examining…
Teaching Raster GIS Operations with Spreadsheets.
ERIC Educational Resources Information Center
Raubal, Martin; Gaupmann, Bernhard; Kuhn, Werner
1997-01-01
Defines raster technology in its relationship to geographic information systems and notes that it is typically used with the application of remote sensing techniques and scanning devices. Discusses the role of spreadsheets in a raster model, and describes a general approach based on spreadsheets. Includes six computer-generated illustrations. (MJP)
Spreadsheet Design: An Optimal Checklist for Accountants
ERIC Educational Resources Information Center
Barnes, Jeffrey N.; Tufte, David; Christensen, David
2009-01-01
Just as good grammar, punctuation, style, and content organization are important to well-written documents, basic fundamentals of spreadsheet design are essential to clear communication. In fact, the very principles of good writing should be integrated into spreadsheet workpaper design and organization. The unique contributions of this paper are…
Computer Applications: Using Electronic Spreadsheets.
ERIC Educational Resources Information Center
Riley, Connee; And Others
This instructional unit is intended to assist teachers in helping students learn to use electronic spreadsheets. The 11 learning activities included, all of which are designed for use in conjunction with Multiplan Spreadsheet Software, are arranged in order of increasing difficulty. An effort has been made to include problems applicable to each of…
Manipulative and Numerical Spreadsheet Templates for the Study of Discrete Structures.
ERIC Educational Resources Information Center
Abramovich, Sergei
1998-01-01
Argues that basic components of discrete mathematics can be introduced to students through gradual elaboration of experiences with iconic spreadsheet-based simulations of concrete materials. Suggests that the study of homogeneous and heterogeneous patterns of manipulative spreadsheet templates allows for appreciation of the development of…
Excel Spreadsheets for Algebra: Improving Mental Modeling for Problem Solving
ERIC Educational Resources Information Center
Engerman, Jason; Rusek, Matthew; Clariana, Roy
2014-01-01
This experiment investigates the effectiveness of Excel spreadsheets in a high school algebra class. Students in the experiment group convincingly outperformed the control group on a post-lesson assessment. Student responses and teacher observations involving the Excel spreadsheets revealed that they operated as a mindtool, which formed the users'…
ERIC Educational Resources Information Center
Barreto, Humberto
2015-01-01
This article is not the usual Excel pedagogy fare in that it does not provide an application or example taught via a spreadsheet. Instead, it briefly reviews the history of spreadsheets in the economics classroom and explores the current environment, with an emphasis on modern learning theory. The conclusion is not surprising: spreadsheets improve…
Levels of Student Responses in a Spreadsheet-Based Environment
ERIC Educational Resources Information Center
Tabach, Michal; Friedlander, Alex
2004-01-01
The purpose of this report is to investigate the range of student responses in three domains--hypothesizing, organizing data, and algebraic generalization of patterns during their work on a spreadsheet-based activity. In a wider context, we attempted to investigate students' utilization schemes of spreadsheets in their learning of introductory…
User's guide: RPGrow$: a red pine growth and analysis spreadsheet for the Lake States.
Carol A. Hyldahl; Gerald H. Grossman
1993-01-01
Describes RPGrow$, a stand-level, interactive spreadsheet for projecting growth and yield and estimating financial returns of red pine plantations in the Lake States. This spreadsheet is based on published growth models for red pine. Financial analyses are based on discounted cash flow methods.
Spreadsheets and Bulgarian Goats
ERIC Educational Resources Information Center
Sugden, Steve
2012-01-01
We consider a problem appearing in an Australian Mathematics Challenge in 2003. This article considers whether a spreadsheet might be used to model this problem, thus allowing students to explore its structure within the spreadsheet environment. It then goes on to reflect on some general principles of problem decomposition when the final goal is a…
CEASAW: A User-Friendly Computer Environment Analysis for the Sawmill Owner
Guillermo Mendoza; William Sprouse; Philip A. Araman; William G. Luppold
1991-01-01
Improved spreadsheet software capabilities have brought optimization to users with little or no background in mathematical programming. Better interface capabilities of spreadsheet models now make it possible to combine optimization models with a spreadsheet system. Sawmill production and inventory systems possess many features that make them suitable application...
Lens ray diagrams with a spreadsheet
NASA Astrophysics Data System (ADS)
González, Manuel I.
2018-05-01
Physicists create spreadsheets customarily to carry out numerical calculations and to display their results in a meaningful, nice-looking way. Spreadsheets can also be used to display a vivid geometrical model of a physical system. This statement is illustrated with an example taken from geometrical optics: images formed by a thin lens. A careful mixture of standard Excel functions makes it possible to display a realistic automated ray diagram. The suggested spreadsheet is intended as an auxiliary didactic tool for instructors who wish to teach their students to create their own ray diagrams.
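The thin-lens arithmetic that drives such a ray diagram is a one-line formula per image; a small sketch with an assumed focal length and example object distances.

```python
# Thin-lens calculation behind a ray diagram: 1/f = 1/d_o + 1/d_i.
# Focal length and object distances are arbitrary example values.
f = 10.0                                    # focal length (cm)
for d_o in (30.0, 20.0, 15.0, 12.0):        # object distances (cm)
    d_i = 1.0 / (1.0 / f - 1.0 / d_o)       # image distance
    m = -d_i / d_o                          # transverse magnification
    kind = "real, inverted" if d_i > 0 else "virtual, upright"
    print(f"d_o={d_o:5.1f}  d_i={d_i:6.1f}  m={m:6.2f}  ({kind})")
```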
Andrew C. Oishi; David Hawthorne; Ram Oren
2016-01-01
Estimating transpiration from woody plants using thermal dissipation sap flux sensors requires careful data processing. Currently, researchers accomplish this using spreadsheets, or by personally writing scripts for statistical software programs (e.g., R, SAS). We developed the Baseliner software to help establish a standardized protocol for processing sap...
ERIC Educational Resources Information Center
Clarke, Matthew A.; Giraldo, Carlos
2009-01-01
Chemical process simulation is one of the most fundamental skills that is expected from chemical engineers, yet relatively few graduates have the opportunity to learn, in depth, how a process simulator works, from programming the unit operations to the sequencing. The University of Calgary offers a "hands-on" postgraduate course in…
Waste Characterization Process
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lambert, Patrick E.
2014-11-01
The purpose is to provide guidance to the Radiological Characterization Reviewer to complete the radiological characterization of waste items. This information is used for Department of Transportation (DOT) shipping and disposal, typically at the Nevada National Security Site (NNSS). Complete characterization ensures compliance with DOT shipping laws and NNSS Waste Acceptance Criteria (WAC). The fines for noncompliance can be extreme, not to mention possible bad press and endangerment to the public, employees, and the environment. A Radiological Characterization Reviewer has an important role in the organization. The scope is to outline the characterization process, not to cover every possible situation. The Radiological Characterization Reviewer position requires a strong background in Health Physics; therefore, these concepts are minimally addressed. The characterization process includes many Excel spreadsheets, developed by Michael Enghauser, known as the WCT software suite. New Excel spreadsheets developed as part of this project include the Ra-226 Decider and the Density Calculator by Jesse Bland, and the MicroShield Density Calculator and Molecular Weight Calculator by Pat Lambert.
A quality-based cost model for new electronic systems and products
NASA Astrophysics Data System (ADS)
Shina, Sammy G.; Saigal, Anil
1998-04-01
This article outlines a method for developing a quality-based cost model for the design of new electronic systems and products. The model incorporates a methodology for determining a cost-effective design margin allocation for electronic products and systems and its impact on manufacturing quality and cost. A spreadsheet-based cost estimating tool was developed to help implement this methodology in order for the system design engineers to quickly estimate the effect of design decisions and tradeoffs on the quality and cost of new products. The tool was developed with automatic spreadsheet connectivity to current process capability and with provisions to consider the impact of capital equipment and tooling purchases to reduce the product cost.
NASA Astrophysics Data System (ADS)
Adhitama, Egy; Fauzi, Ahmad
2018-05-01
In this study, a pendulum experimental tool with a light-based timer has been developed to measure the period of a simple pendulum. The obtained data were automatically recorded in an Excel spreadsheet. The intensity of monochromatic light, sensed by a 3DU5C phototransistor, changes dynamically as the pendulum swings. The changing intensity varies the resistance value, which is processed by the ATMega328 microcontroller to obtain a signal period as a function of time and brightness when the pendulum crosses the light. Through the experiment, using calculated average periods, the gravitational acceleration value has been accurately and precisely determined.
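The reduction from measured periods to g follows T = 2π√(L/g); a small sketch using invented example readings, not the authors' data.

```python
# Estimating g from measured pendulum periods: T = 2*pi*sqrt(L/g)  =>  g = 4*pi^2*L / T^2.
# The lengths and periods below are invented example readings.
import math

readings = [(0.50, 1.42), (0.75, 1.74), (1.00, 2.01)]   # (length m, period s)
estimates = [4 * math.pi ** 2 * L / T ** 2 for L, T in readings]
for (L, T), g in zip(readings, estimates):
    print(f"L={L:.2f} m  T={T:.2f} s  g={g:.2f} m/s^2")
print(f"mean g = {sum(estimates) / len(estimates):.2f} m/s^2")
```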
Lens Ray Diagrams with a Spreadsheet
ERIC Educational Resources Information Center
González, Manuel I.
2018-01-01
Physicists create spreadsheets customarily to carry out numerical calculations and to display their results in a meaningful, nice-looking way. Spreadsheets can also be used to display a vivid geometrical model of a physical system. This statement is illustrated with an example taken from geometrical optics: images formed by a thin lens. A careful…
Spreadsheet Modeling of Electron Distributions in Solids
ERIC Educational Resources Information Center
Glassy, Wingfield V.
2006-01-01
A series of spreadsheet modeling exercises constructed as part of a new upper-level elective course on solid state materials and surface chemistry is described. The spreadsheet exercises are developed to provide students with the opportunity to interact with the conceptual framework where the role of the density of states and the Fermi-Dirac…
Designing Spreadsheet-Based Tasks for Purposeful Algebra
ERIC Educational Resources Information Center
Ainley, Janet; Bills, Liz; Wilson, Kirsty
2005-01-01
We describe the design of a sequence of spreadsheet-based pedagogic tasks for the introduction of algebra in the early years of secondary schooling within the Purposeful Algebraic Activity project. This design combines two relatively novel features to bring a different perspective to research in the use of spreadsheets for the learning and…
Code of Federal Regulations, 2010 CFR
2010-07-01
..., Demand Side Variability, and Network Variability studies, including input data, processing programs, and... should include the product or product groups carried under each listed contract; (k) Spreadsheets and...
A Spreadsheet for a 2 x 3 x 2 Log-Linear Analysis. AIR 1991 Annual Forum Paper.
ERIC Educational Resources Information Center
Saupe, Joe L.
This paper describes a personal computer spreadsheet set up to carry out hierarchical log-linear analyses, a type of analysis useful for institutional research into multidimensional frequency tables formed from categorical variables such as faculty rank, student class level, gender, or retention status. The spreadsheet provides a concrete vehicle…
Using Spreadsheets to Help Students Think Recursively
ERIC Educational Resources Information Center
Webber, Robert P.
2012-01-01
Spreadsheets lend themselves naturally to recursive computations, since a formula can be defined as a function of one or more preceding cells. A hypothesized closed form for the "n"th term of a recursive sequence can be tested easily by using a spreadsheet to compute a large number of the terms. Similarly, a conjecture about the limit of a series…
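A sketch of the two-column check described above, using an example recurrence and its conjectured closed form (not one from the article).

```python
# Testing a hypothesized closed form against a recursive definition, as one
# would by filling two spreadsheet columns.  Example recurrence:
# a(0) = 1, a(n) = 2*a(n-1) + 1, with conjectured closed form a(n) = 2**(n+1) - 1.
a = 1
for n in range(0, 21):
    closed = 2 ** (n + 1) - 1
    print(f"n={n:2d}  recursive={a:8d}  closed={closed:8d}  match={a == closed}")
    a = 2 * a + 1          # next term of the recurrence
```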
A Spreadsheet Tool for Learning the Multiple Regression F-Test, T-Tests, and Multicollinearity
ERIC Educational Resources Information Center
Martin, David
2008-01-01
This note presents a spreadsheet tool that allows teachers the opportunity to guide students towards answering on their own questions related to the multiple regression F-test, the t-tests, and multicollinearity. The note demonstrates approaches for using the spreadsheet that might be appropriate for three different levels of statistics classes,…
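The overall regression F-test the tool covers rests on a standard identity relating F to R²; a sketch with example values for R², sample size, and number of predictors (not figures from the note).

```python
# Overall F statistic for a multiple regression from R^2:
# F = (R^2 / k) / ((1 - R^2) / (n - k - 1)).  Values below are examples.
from scipy.stats import f

r2, n, k = 0.65, 40, 3                       # fit, sample size, number of predictors
F = (r2 / k) / ((1.0 - r2) / (n - k - 1))
p = f.sf(F, k, n - k - 1)                    # upper-tail p-value
print(f"F({k}, {n - k - 1}) = {F:.2f}, p = {p:.4g}")
```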
ERIC Educational Resources Information Center
Abramovich, Sergei
2016-01-01
The paper presents the use of spreadsheets integrated with digital tools capable of symbolic computations and graphic constructions in a master's level capstone course for secondary mathematics teachers. Such use of spreadsheets is congruent with the Type II technology applications framework aimed at the development of conceptual knowledge in the…
The EnzymeTracker: an open-source laboratory information management system for sample tracking.
Triplet, Thomas; Butler, Gregory
2012-01-26
In many laboratories, researchers store experimental data on their own workstation using spreadsheets. However, this approach poses a number of problems, ranging from sharing issues to inefficient data-mining. Standard spreadsheets are also error-prone, as data do not undergo any validation process. To overcome spreadsheets' inherent limitations, a number of proprietary systems have been developed, which laboratories need to pay expensive license fees for. Those costs are usually prohibitive for most laboratories and prevent scientists from benefiting from more sophisticated data management systems. In this paper, we propose the EnzymeTracker, a web-based laboratory information management system for sample tracking, as an open-source and flexible alternative that aims at facilitating entry, mining and sharing of experimental biological data. The EnzymeTracker features online spreadsheets and tools for monitoring numerous experiments conducted by several collaborators to identify and characterize samples. It also provides libraries of shared data such as protocols, and administration tools for data access control using OpenID and user/team management. Our system relies on a database management system for efficient data indexing and management and a user-friendly AJAX interface that can be accessed over the Internet. The EnzymeTracker facilitates data entry by dynamically suggesting entries and providing smart data-mining tools to effectively retrieve data. Our system features a number of tools to visualize and annotate experimental data, and export highly customizable reports. It also supports QR matrix barcoding to facilitate sample tracking. The EnzymeTracker was designed to be easy to use and offers many benefits over spreadsheets, thus presenting the characteristics required to facilitate acceptance by the scientific community. It has been successfully used for 20 months on a daily basis by over 50 scientists. The EnzymeTracker is freely available online at http://cubique.fungalgenomics.ca/enzymedb/index.html under the GNU GPLv3 license.
The EnzymeTracker: an open-source laboratory information management system for sample tracking
2012-01-01
Background: In many laboratories, researchers store experimental data on their own workstation using spreadsheets. However, this approach poses a number of problems, ranging from sharing issues to inefficient data-mining. Standard spreadsheets are also error-prone, as data do not undergo any validation process. To overcome spreadsheets' inherent limitations, a number of proprietary systems have been developed, which laboratories need to pay expensive license fees for. Those costs are usually prohibitive for most laboratories and prevent scientists from benefiting from more sophisticated data management systems. Results: In this paper, we propose the EnzymeTracker, a web-based laboratory information management system for sample tracking, as an open-source and flexible alternative that aims at facilitating entry, mining and sharing of experimental biological data. The EnzymeTracker features online spreadsheets and tools for monitoring numerous experiments conducted by several collaborators to identify and characterize samples. It also provides libraries of shared data such as protocols, and administration tools for data access control using OpenID and user/team management. Our system relies on a database management system for efficient data indexing and management and a user-friendly AJAX interface that can be accessed over the Internet. The EnzymeTracker facilitates data entry by dynamically suggesting entries and providing smart data-mining tools to effectively retrieve data. Our system features a number of tools to visualize and annotate experimental data, and export highly customizable reports. It also supports QR matrix barcoding to facilitate sample tracking. Conclusions: The EnzymeTracker was designed to be easy to use and offers many benefits over spreadsheets, thus presenting the characteristics required to facilitate acceptance by the scientific community. It has been successfully used for 20 months on a daily basis by over 50 scientists. The EnzymeTracker is freely available online at http://cubique.fungalgenomics.ca/enzymedb/index.html under the GNU GPLv3 license. PMID:22280360
NetpathXL - An Excel Interface to the Program NETPATH
Parkhurst, David L.; Charlton, Scott R.
2008-01-01
NetpathXL is a revised version of NETPATH that runs under Windows® operating systems. NETPATH is a computer program that uses inverse geochemical modeling techniques to calculate net geochemical reactions that can account for changes in water composition between initial and final evolutionary waters in hydrologic systems. The inverse models also can account for the isotopic composition of waters and can be used to estimate radiocarbon ages of dissolved carbon in ground water. NETPATH relies on an auxiliary database program, DB, to enter the chemical analyses and to perform speciation calculations that define total concentrations of elements, charge balance, and redox state of aqueous solutions that are then used in inverse modeling. Instead of DB, NetpathXL relies on Microsoft Excel® to enter the chemical analyses. The speciation calculation formerly included in DB is implemented within the program NetpathXL. A program DBXL can be used to translate files from the old DB format (.lon files) to NetpathXL spreadsheets, or to create new NetpathXL spreadsheets. Once users have a NetpathXL spreadsheet with the proper format, new spreadsheets can be generated by copying or saving NetpathXL spreadsheets. In addition, DBXL can convert NetpathXL spreadsheets to PHREEQC input files. New capabilities in PHREEQC (version 2.15) allow solution compositions to be written to a .lon file, and inverse models developed in PHREEQC to be written as NetpathXL .pat and model files. NetpathXL can open NetpathXL spreadsheets, NETPATH-format path files (.pat files), and NetpathXL-format path files (.pat files). Once the speciation calculations have been performed on a spreadsheet file or a .pat file has been opened, the NetpathXL calculation engine is identical to the original NETPATH. Development of models and viewing results in NetpathXL rely on keyboard entry as in NETPATH.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Simpkins, A.A.
1996-09-01
AXAOTHER XL is an Excel spreadsheet used to determine the dose to the maximally exposed offsite individual during high-velocity straight winds or tornado conditions. Both individual and population doses may be considered. Potential exposure pathways are inhalation and plume shine. For high-velocity straight winds the spreadsheet can determine the downwind relative air concentration; however, for tornado conditions the user must enter the relative air concentration. Theoretical models are discussed and hand calculations are performed to ensure proper application of the methodologies. A section has also been included that contains user instructions for the spreadsheet.
Contemporary issues in HIM. The application layer--III.
Wear, L L; Pinkert, J R
1993-07-01
We have seen document preparation systems evolve from basic line editors through powerful, sophisticated desktop publishing programs. This component of the application layer is probably one of the most used, and most readily identifiable. Ask grade school children nowadays, and many will tell you that they have written a paper on a computer. Next month will be a "fun" tour through a number of other application programs we find useful. They will range from a simple notebook reminder to a sophisticated photograph processor. Application layer: Software targeted for the end user, focusing on a specific application area, and typically residing in the computer system as distinct components on top of the OS. Desktop publishing: A document preparation program that begins with the text features of a word processor, then adds the ability for a user to incorporate outputs from a variety of graphic programs, spreadsheets, and other applications. Line editor: A document preparation program that manipulates text in a file on the basis of numbered lines. Word processor: A document preparation program that can, among other things, reformat sections of documents, move and replace blocks of text, use multiple character fonts, automatically create a table of contents and index, create complex tables, and combine text and graphics.
Breaking free from chemical spreadsheets.
Segall, Matthew; Champness, Ed; Leeding, Chris; Chisholm, James; Hunt, Peter; Elliott, Alex; Garcia-Martinez, Hector; Foster, Nick; Dowling, Samuel
2015-09-01
Drug discovery scientists often consider compounds and data in terms of groups, such as chemical series, and relationships, representing similarity or structural transformations, to aid compound optimisation. This is often supported by chemoinformatics algorithms, for example clustering and matched molecular pair analysis. However, chemistry software packages commonly present these data as spreadsheets or form views that make it hard to find relevant patterns or compare related compounds conveniently. Here, we review common data visualisation and analysis methods used to extract information from chemistry data. We introduce a new framework that enables scientists to work flexibly with drug discovery data to reflect their thought processes and interact with the output of algorithms to identify key structure-activity relationships and guide further optimisation intuitively. Copyright © 2015 Elsevier Ltd. All rights reserved.
Spreadsheet log analysis in subsurface geology
Doveton, J.H.
2000-01-01
Most of the direct knowledge of the geology of the subsurface is gained from the examination of core and drill-cuttings recovered from boreholes drilled by the petroleum and water industries. Wireline logs run in these same boreholes generally have been restricted to tasks of lithostratigraphic correlation and the location of hydrocarbon pay zones. However, the range of petrophysical measurements has expanded markedly in recent years, so that log traces now can be transformed to estimates of rock composition. Increasingly, logs are available in a digital format that can be read easily by a desktop computer and processed by simple spreadsheet software methods. Taken together, these developments offer accessible tools for new insights into subsurface geology that complement the traditional, but limited, sources of core and cutting observations.
An Introduction to Simulated Annealing
ERIC Educational Resources Information Center
Albright, Brian
2007-01-01
An attempt to model the physical process of annealing led to the development of a type of combinatorial optimization algorithm that takes on the problem of getting trapped in a local minimum. The author presents a Microsoft Excel spreadsheet that illustrates how this works.
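A bare-bones annealing loop in the spirit of the spreadsheet illustration; the objective function, cooling schedule, and constants are arbitrary example choices, not the author's.

```python
# Simulated annealing on a one-dimensional function with several local minima.
# Downhill moves are always accepted; uphill moves are accepted with probability
# exp(-delta/T), which shrinks as the temperature T cools.
import math, random

def f(x):
    return x * x + 10.0 * math.sin(x)      # global minimum near x ~ -1.3

random.seed(1)
x, T = 5.0, 10.0                           # starting point and temperature
best = x
while T > 1e-3:
    candidate = x + random.uniform(-1.0, 1.0)
    delta = f(candidate) - f(x)
    if delta < 0 or random.random() < math.exp(-delta / T):
        x = candidate                      # accept the move
    if f(x) < f(best):
        best = x
    T *= 0.99                              # geometric cooling
print(f"best x = {best:.3f}, f(best) = {f(best):.3f}")
```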
Multimodal system planning technique : an analytical approach to peak period operation
DOT National Transportation Integrated Search
1995-11-01
The multimodal system planning technique described in this report is an improvement of the methodology used in the Dallas System Planning Study. The technique includes a spreadsheet-based process to identify the costs of congestion, construction, and...
Computerized Budget Monitoring.
ERIC Educational Resources Information Center
Stein, Julian U.; Rowe, Joe N.
1989-01-01
This article discusses the importance of budget monitoring in fiscal management; describes ways in which computerized budget monitoring increases accuracy, efficiency, and flexibility; outlines steps in the budget process; and presents sample reports, generated using the Lotus 1-2-3 spreadsheet and graphics program. (IAH)
This report estimates environmental emission factors (EmF) for key chemicals, construction and treatment materials, transportation/on-site equipment, and other processes used at remediation sites. The basis for chemical, construction, and treatment material EmFs is life cycle inv...
DARPA Initiative in Concurrent Engineering (DICE). Phase 2
1990-07-31
XS spreadsheet tool; Q-Calc spreadsheet tool; TAE+ outer wrapper for XS; Framemaker-based formal EDN (Electronic Design Notebook); Data... shared global object space and object persistence. Technical Results, Module Development, XS Integration Environment: A prototype of the wrapper concepts... for a spreadsheet integration environment, using an X-Windows-based extensible Lotus 1-2-3 emulation called XS, and was (initially) targeted for...
ERIC Educational Resources Information Center
Gierdien, M. Faaiz
2014-01-01
This paper reports on the initial stages of a small-scale project involving the use of "spreadsheet algebra programs" in the professional development of eight teachers from three township high schools. In terms of the education context, the paper draws on social practice theory. It then details what is meant by spreadsheet algebra. An…
Schneider, Walter; Bolger, D J; Eschman, Amy; Neff, Christopher; Zuccolotto, Anthony P
2005-05-01
In academic courses in which one task for the students is to understand empirical methodology and the nature of scientific inquiry, the ability of students to create and implement their own experiments allows them to take intellectual ownership of, and greatly facilitates, the learning process. The Psychology Experiment Authoring Kit (PEAK) is a novel spreadsheet-based interface allowing students and researchers with rudimentary spreadsheet skills to create cognitive and cognitive neuroscience experiments in minutes. Students fill in a spreadsheet listing of independent variables and stimuli, insert columns that represent experimental objects such as slides (presenting text, pictures, and sounds) and feedback displays to create complete experiments, all within a single spreadsheet. The application then executes experiments with centisecond precision. Formal usability testing was done in two stages: (1) detailed coding of 10 individual subjects in one-on-one experimenter/subject videotaped sessions and (2) classroom testing of 64 undergraduates. In both individual and classroom testing, the students learned to effectively use PEAK within 2 h, and were able to create a lexical decision experiment in under 10 min. Findings from the individual testing in Stage 1 resulted in significant changes to documentation and training materials and identification of bugs to be corrected. Stage 2 testing identified additional bugs to be corrected and new features to be considered to facilitate student understanding of the experiment model. Such testing will improve the approach with each semester. The students were typically able to create their own projects in 2 h.
Abdominal surgery process modeling framework for simulation using spreadsheets.
Boshkoska, Biljana Mileva; Damij, Talib; Jelenc, Franc; Damij, Nadja
2015-08-01
We provide a continuation of the existing Activity Table Modeling methodology with a modular spreadsheet simulation. The simulation model developed comprises 28 modeling elements for the abdominal surgery cycle process. The simulation of a two-week patient flow in an abdominal clinic with 75 beds demonstrates the applicability of the methodology. The simulation does not include macros, so programming experience is not essential for replicating or upgrading the model. Unlike the existing methods, the proposed solution employs a modular approach for modeling the activities that ensures better readability, the possibility of easily upgrading the model with other activities, and its easy extension and connectivity with other similar models. We propose a first-in-first-served approach for simulation of servicing multiple patients. The uncertain time duration of the activities is modeled using the function "rand()". Patients' movements from one activity to the next are tracked with nested "if()" functions, thus allowing easy re-creation of the process without the need for complex programming. Copyright © 2015 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
Bailey, Stephanie L.; Bono, Rose S.; Nash, Denis; Kimmel, April D.
2018-01-01
Background: Spreadsheet software is increasingly used to implement systems science models informing health policy decisions, both in academia and in practice where technical capacity may be limited. However, spreadsheet models are prone to unintentional errors that may not always be identified using standard error-checking techniques. Our objective was to illustrate, through a methodologic case study analysis, the impact of unintentional errors on model projections by implementing parallel model versions. Methods: We leveraged a real-world need to revise an existing spreadsheet model designed to inform HIV policy. We developed three parallel versions of a previously validated spreadsheet-based model; versions differed by the spreadsheet cell-referencing approach (named single cells; column/row references; named matrices). For each version, we implemented three model revisions (re-entry into care; guideline-concordant treatment initiation; immediate treatment initiation). After standard error-checking, we identified unintentional errors by comparing model output across the three versions. Concordant model output across all versions was considered error-free. We calculated the impact of unintentional errors as the percentage difference in model projections between model versions with and without unintentional errors, using +/-5% difference to define a material error. Results: We identified 58 original and 4,331 propagated unintentional errors across all model versions and revisions. Over 40% (24/58) of original unintentional errors occurred in the column/row reference model version; most (23/24) were due to incorrect cell references. Overall, >20% of model spreadsheet cells had material unintentional errors. When examining error impact along the HIV care continuum, the percentage difference between versions with and without unintentional errors ranged from +3% to +16% (named single cells), +26% to +76% (column/row reference), and 0% (named matrices). Conclusions: Standard error-checking techniques may not identify all errors in spreadsheet-based models. Comparing parallel model versions can aid in identifying unintentional errors and promoting reliable model projections, particularly when resources are limited. PMID:29570737
Bailey, Stephanie L; Bono, Rose S; Nash, Denis; Kimmel, April D
2018-01-01
Spreadsheet software is increasingly used to implement systems science models informing health policy decisions, both in academia and in practice where technical capacity may be limited. However, spreadsheet models are prone to unintentional errors that may not always be identified using standard error-checking techniques. Our objective was to illustrate, through a methodologic case study analysis, the impact of unintentional errors on model projections by implementing parallel model versions. We leveraged a real-world need to revise an existing spreadsheet model designed to inform HIV policy. We developed three parallel versions of a previously validated spreadsheet-based model; versions differed by the spreadsheet cell-referencing approach (named single cells; column/row references; named matrices). For each version, we implemented three model revisions (re-entry into care; guideline-concordant treatment initiation; immediate treatment initiation). After standard error-checking, we identified unintentional errors by comparing model output across the three versions. Concordant model output across all versions was considered error-free. We calculated the impact of unintentional errors as the percentage difference in model projections between model versions with and without unintentional errors, using +/-5% difference to define a material error. We identified 58 original and 4,331 propagated unintentional errors across all model versions and revisions. Over 40% (24/58) of original unintentional errors occurred in the column/row reference model version; most (23/24) were due to incorrect cell references. Overall, >20% of model spreadsheet cells had material unintentional errors. When examining error impact along the HIV care continuum, the percentage difference between versions with and without unintentional errors ranged from +3% to +16% (named single cells), +26% to +76% (column/row reference), and 0% (named matrices). Standard error-checking techniques may not identify all errors in spreadsheet-based models. Comparing parallel model versions can aid in identifying unintentional errors and promoting reliable model projections, particularly when resources are limited.
Individualized Human CAD Models: Anthropmetric Morphing and Body Tissue Layering
2014-07-31
[Figure titles: Flow Chart of the Interaction among VBA Macros, Excel® Spreadsheet, and SolidWorks; Front View of the Male and Female Soldier CAD Model] ... yellow highlighting. The spreadsheet is linked to the CAD model by macros created with the Visual Basic for Applications (VBA) editor in Microsoft Excel... basically three working parts to the anthropometric morphing that are all interconnected (VBA macros, Excel spreadsheet, and SolidWorks). The flow...
Edwardson, S R; Pejsa, J
1993-01-01
A computer-based tutorial for teaching nursing financial management concepts was developed using the macro function of a commercially available spreadsheet program. The goals of the tutorial were to provide students with an experience with spreadsheets as a computer tool and to teach selected financial management concepts. Preliminary results show the tutorial was well received by students. Suggestions are made for overcoming the general lack of computer sophistication among students.
ERIC Educational Resources Information Center
Ge, Yingbin; Rittenhouse, Robert C.; Buchanan, Jacob C.; Livingston, Benjamin
2014-01-01
We have designed an exercise suitable for a lab or project in an undergraduate physical chemistry course that creates a Microsoft Excel spreadsheet to calculate the energy of the S[subscript 0] ground electronic state and the S[subscript 1] and T[subscript 1] excited states of H[subscript 2]. The spreadsheet calculations circumvent the…
PYROLASER - PYROLASER OPTICAL PYROMETER OPERATING SYSTEM
NASA Technical Reports Server (NTRS)
Roberts, F. E.
1994-01-01
The PYROLASER package is an operating system for the Pyrometer Instrument Company's Pyrolaser. There are 6 individual programs in the PYROLASER package: two main programs, two lower level subprograms, and two programs which, although independent, function predominantly as macros. The package provides a quick and easy way to setup, control, and program a standard Pyrolaser. Temperature and emissivity measurements may be either collected as if the Pyrolaser were in the manual operations mode, or displayed on real time strip charts and stored in standard spreadsheet format for post-test analysis. A shell is supplied to allow macros, which are test-specific, to be easily added to the system. The Pyrolaser Simple Operation program provides full on-screen remote operation capabilities, thus allowing the user to operate the Pyrolaser from the computer just as it would be operated manually. The Pyrolaser Simple Operation program also allows the use of "quick starts". Quick starts provide an easy way to permit routines to be used as setup macros for specific applications or tests. The specific procedures required for a test may be ordered in a sequence structure and then the sequence structure can be started with a simple button in the cluster structure provided. One quick start macro is provided for continuous Pyrolaser operation. A subprogram, Display Continuous Pyr Data, is used to display and store the resulting data output. Using this macro, the system is set up for continuous operation and the subprogram is called to display the data in real time on strip charts. The data is simultaneously stored in a spreadsheet format. The resulting spreadsheet file can be opened in any one of a number of commercially available spreadsheet programs. The Read Continuous Pyrometer program is provided as a continuously run subprogram for incorporation of the Pyrolaser software into a process control or feedback control scheme in a multi-component system. The program requires the Pyrolaser to be set up using the Pyrometer String Transfer macro. It requires no inputs and provides temperature and emissivity as outputs. The Read Continuous Pyrometer program can be run continuously and the data can be sampled as often or as seldom as updates of temperature and emissivity are required. PYROLASER is written using the Labview software for use on Macintosh series computers running System 6.0.3 or later, Sun Sparc series computers running OpenWindows 3.0 or MIT's X Window System (X11R4 or X11R5), and IBM PC or compatibles running Microsoft Windows 3.1 or later. Labview requires a minimum of 5Mb of RAM on a Macintosh, 24Mb of RAM on a Sun, and 8Mb of RAM on an IBM PC or compatible. The Labview software is a product of National Instruments (Austin,TX; 800-433-3488), and is not included with this program. The standard distribution medium for PYROLASER is a 3.5 inch 800K Macintosh format diskette. It is also available on a 3.5 inch 720K MS-DOS format diskette, a 3.5 inch diskette in UNIX tar format, and a .25 inch streaming magnetic tape cartridge in UNIX tar format. An electronic copy of the documentation in Macintosh WordPerfect version 2.0.4 format is included on the distribution medium. Printed documentation is included in the price of the program. PYROLASER was developed in 1992.
NASA Astrophysics Data System (ADS)
Benacka, Jan
2016-08-01
This paper reports on lessons in which 18-19 year old high school students modelled random processes with Excel. In the first lesson, 26 students formulated a hypothesis on the area of an ellipse by using the analogy between the areas of circle, square and rectangle. They verified the hypothesis by the Monte Carlo method with a spreadsheet model developed in the lesson. In the second lesson, 27 students analysed the dice poker game. First, they calculated the probability of the hands by combinatorial formulae. Then, they verified the result with a spreadsheet model developed in the lesson. The students were given a questionnaire to find out if they found the lesson interesting and contributing to their mathematical and technological knowledge.
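The first lesson's Monte Carlo check can be sketched in a few lines; the semi-axes and number of trials are example values, not those used in the classroom.

```python
# Monte Carlo estimate of an ellipse's area: scatter random points in the
# bounding rectangle and count the fraction satisfying (x/a)^2 + (y/b)^2 <= 1.
import math, random

a, b, trials = 3.0, 2.0, 100_000
hits = sum(
    1
    for _ in range(trials)
    if (random.uniform(-a, a) / a) ** 2 + (random.uniform(-b, b) / b) ** 2 <= 1.0
)
estimate = (2 * a) * (2 * b) * hits / trials      # rectangle area * hit fraction
print(f"estimated area = {estimate:.3f}, exact pi*a*b = {math.pi * a * b:.3f}")
```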
ERIC Educational Resources Information Center
Smith, Michael
1990-01-01
Presents several examples of the iteration method using computer spreadsheets. Examples included are simple iterative sequences and the solution of equations using the Newton-Raphson formula, linear interpolation, and interval bisection. (YP)
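The Newton-Raphson part of the abstract above translates naturally from spreadsheet columns to code. The following sketch is a generic Python illustration rather than the article's worksheet; the equation x^2 - 2 = 0 and the starting value are chosen only for demonstration.

```python
def newton_raphson(f, df, x0, tol=1e-10, max_iter=50):
    """Iterate x_{n+1} = x_n - f(x_n)/df(x_n) until successive values agree to tol."""
    x = x0
    for _ in range(max_iter):
        x_new = x - f(x) / df(x)
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Illustrative use: solve x**2 - 2 = 0 starting from x0 = 1 (root is sqrt(2)).
root = newton_raphson(lambda x: x * x - 2, lambda x: 2 * x, x0=1.0)
print(root)  # 1.41421356...
```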
DOE Office of Scientific and Technical Information (OSTI.GOV)
McKone, T.E.; Enoch, K.G.
2002-08-01
CalTOX has been developed as a set of spreadsheet models and spreadsheet data sets to assist in assessing human exposures from continuous releases to multiple environmental media, i.e. air, soil, and water. It has also been used for waste classification and for setting soil clean-up levels at uncontrolled hazardous waste sites. The modeling components of CalTOX include a multimedia transport and transformation model, multi-pathway exposure scenario models, and add-ins to quantify and evaluate uncertainty and variability. All parameter values used as inputs to CalTOX are distributions, described in terms of mean values and a coefficient of variation, rather than as point estimates or plausible upper values such as most other models employ. This probabilistic approach allows both sensitivity and uncertainty analyses to be directly incorporated into the model operation. This manual provides CalTOX users with a brief overview of the CalTOX spreadsheet model and provides instructions for using the spreadsheet to make deterministic and probabilistic calculations of source-dose-risk relationships.
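CalTOX's inputs are distributions summarised by a mean and a coefficient of variation. One common way to turn that pair into Monte Carlo samples is to assume a lognormal shape; the sketch below illustrates only that generic step (the lognormal assumption, the toy dose expression, and all numbers are illustrative choices, not part of CalTOX).

```python
import math
import random

def lognormal_from_mean_cv(mean: float, cv: float) -> float:
    """Draw one sample from a lognormal with the given arithmetic mean and CV."""
    sigma = math.sqrt(math.log(1.0 + cv * cv))      # sigma of ln(X)
    mu = math.log(mean) - 0.5 * sigma * sigma       # mu of ln(X)
    return random.lognormvariate(mu, sigma)

# Toy propagation: dose = concentration * intake / body_weight (illustrative only).
random.seed(1)
doses = []
for _ in range(10_000):
    conc = lognormal_from_mean_cv(mean=2.0, cv=0.5)     # mg/L (assumed)
    intake = lognormal_from_mean_cv(mean=1.5, cv=0.3)   # L/day (assumed)
    bw = lognormal_from_mean_cv(mean=70.0, cv=0.2)      # kg (assumed)
    doses.append(conc * intake / bw)

doses.sort()
print("median dose:", doses[len(doses) // 2])
print("95th percentile:", doses[int(0.95 * len(doses))])
```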
COMPUTER SIMULATOR (BEST) FOR DESIGNING SULFATE-REDUCING BACTERIA FIELD BIOREACTORS
BEST (bioreactor economics, size and time of operation) is a spreadsheet-based model that is used in conjunction with public domain software, PhreeqcI. BEST is used in the design process of sulfate-reducing bacteria (SRB) field bioreactors to passively treat acid mine drainage (A...
DESIGNING SULFATE-REDUCING BACTERIA FIELD BIOREACTORS USING THE BEST MODEL
BEST (bioreactor economics, size and time of operation) is a spreadsheet-based model that is used in conjunction with a public domain computer software package, PHREEQCI. BEST is intended to be used in the design process of sulfate-reducing bacteria (SRB)field bioreactors to pas...
Zhang, Yong; Huo, Meirong; Zhou, Jianping; Xie, Shaofei
2010-09-01
This study presents PKSolver, a freely available menu-driven add-in program for Microsoft Excel written in Visual Basic for Applications (VBA), for solving basic problems in pharmacokinetic (PK) and pharmacodynamic (PD) data analysis. The program provides a range of modules for PK and PD analysis including noncompartmental analysis (NCA), compartmental analysis (CA), and pharmacodynamic modeling. Two special built-in modules, multiple absorption sites (MAS) and enterohepatic circulation (EHC), were developed for fitting the double-peak concentration-time profile based on the classical one-compartment model. In addition, twenty frequently used pharmacokinetic functions were encoded as a macro and can be directly accessed in an Excel spreadsheet. To evaluate the program, a detailed comparison of modeling PK data using PKSolver and professional PK/PD software package WinNonlin and Scientist was performed. The results showed that the parameters estimated with PKSolver were satisfactory. In conclusion, the PKSolver simplified the PK and PD data analysis process and its output could be generated in Microsoft Word in the form of an integrated report. The program provides pharmacokinetic researchers with a fast and easy-to-use tool for routine and basic PK and PD data analysis with a more user-friendly interface. Copyright 2010 Elsevier Ireland Ltd. All rights reserved.
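The noncompartmental calculations that PKSolver automates are straightforward to reproduce. The sketch below is plain Python with invented concentration-time data (not PKSolver's VBA code): it computes the AUC by the linear trapezoidal rule and the terminal half-life from a log-linear fit of the last sampling points.

```python
import math

def auc_trapezoid(t, c):
    """Linear trapezoidal AUC from time points t and concentrations c."""
    return sum((t[i + 1] - t[i]) * (c[i] + c[i + 1]) / 2 for i in range(len(t) - 1))

def terminal_half_life(t, c, n_last=3):
    """Half-life from a least-squares fit of ln(C) vs t over the last n_last points."""
    tt, lc = t[-n_last:], [math.log(x) for x in c[-n_last:]]
    n = len(tt)
    mean_t, mean_lc = sum(tt) / n, sum(lc) / n
    slope = sum((a - mean_t) * (b - mean_lc) for a, b in zip(tt, lc)) / \
            sum((a - mean_t) ** 2 for a in tt)
    lambda_z = -slope                      # terminal elimination rate constant
    return math.log(2) / lambda_z

# Illustrative oral-dose data (h, mg/L) -- not from the paper.
t = [0.5, 1, 2, 4, 8, 12, 24]
c = [1.2, 2.1, 2.8, 2.2, 1.1, 0.55, 0.07]
print("AUC(0-24h):", auc_trapezoid(t, c))
print("t1/2 (h):", terminal_half_life(t, c))
```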
Combating adverse selection in secondary PC markets.
Hickey, Stewart W; Fitzpatrick, Colin
2008-04-15
Adverse selection is a significant contributor to market failure in secondary personal computer (PC) markets. Signaling can act as a potential solution to adverse selection and facilitate superior remarketing of second-hand PCs. Signaling is a means whereby usage information can be utilized to enhance consumer perception of both value and utility of used PCs and, therefore, promote lifetime extension for these systems. This can help mitigate a large portion of the environmental impact associated with PC system manufacture. In this paper, the computer buying and selling behavior of consumers is characterized via a survey of 270 Irish residential users. Results confirm the existence of adverse selection in the Irish market with 76% of potential buyers being unwilling to purchase and 45% of potential vendors being unwilling to sell a used PC. The so-called "closet effect" is also apparent with 78% of users storing their PC after use has ceased. Results also indicate that consumers place a higher emphasis on specifications when considering a second-hand purchase. This contradicts their application needs, which are predominantly Internet and word-processing/spreadsheet/presentation applications, 88% and 60% respectively. Finally, a market solution utilizing self monitoring and reporting technology (SMART) sensors for the purpose of real time usage monitoring is proposed, that can change consumer attitudes with regard to second-hand computer equipment.
NASA Astrophysics Data System (ADS)
Karlitasari, L.; Suhartini, D.; Benny
2017-01-01
The process of determining employee remuneration at PT Sepatu Mas Idaman currently relies on a Microsoft Excel-based spreadsheet containing the criterion values that must be calculated for every employee. This can introduce doubt during the assessment process and causes the process to take much longer. The determination of employee remuneration is conducted by the assessment team based on a number of predetermined criteria. The criteria used in the assessment are the ability to work, human relations, job responsibility, discipline, creativity, work, achievement of targets, and absence. To make the determination of employee remuneration more efficient and effective, the Simple Additive Weighting (SAW) method is used. The SAW method supports decision making for a given case, and the alternative whose calculation generates the greatest value is chosen as the best. In addition to SAW, the CPI method, a decision-making calculation based on a performance index, was also applied; the SAW method was 89-93% faster than the CPI method. It is therefore expected that this application can serve as evaluation material for identifying training and development needs so that employee performance becomes more optimal.
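The SAW step described above is a weighted sum of normalised criterion scores. The sketch below is a generic Python illustration with invented employees, weights, and scores (not the company's data): each benefit criterion is normalised by its column maximum, weighted, and summed, and the alternatives are then ranked.

```python
def saw_rank(scores, weights):
    """Simple Additive Weighting for benefit criteria.

    scores : dict alternative -> list of criterion values (higher is better)
    weights: list of criterion weights summing to 1
    Each column is normalised by its maximum, then weighted and summed.
    """
    n_crit = len(weights)
    col_max = [max(vals[j] for vals in scores.values()) for j in range(n_crit)]
    totals = {
        alt: sum(w * v / m for w, v, m in zip(weights, vals, col_max))
        for alt, vals in scores.items()
    }
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

# Illustrative data: three employees scored on four criteria (values assumed).
scores = {
    "employee A": [80, 70, 90, 60],
    "employee B": [75, 85, 70, 80],
    "employee C": [90, 60, 80, 70],
}
weights = [0.4, 0.2, 0.2, 0.2]
for alt, total in saw_rank(scores, weights):
    print(alt, round(total, 3))
```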
CALCULATIONAL TOOL FOR SKIN CONTAMINATION DOSE ESTIMATE
DOE Office of Scientific and Technical Information (OSTI.GOV)
HILL, R.L.
2005-03-31
A spreadsheet calculational tool was developed to automate the calculations performed for estimating dose from skin contamination. This document reports on the design and testing of the spreadsheet calculational tool.
ERIC Educational Resources Information Center
Sims, Paul A.
2010-01-01
An approach is presented that utilizes a spreadsheet to allow students to explore different means of calculating and visualizing how the charge on peptides and proteins varies as a function of pH. In particular, the concept of isoelectric point is developed to allow students to compare the results of their spreadsheet calculations with those of…
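The charge-versus-pH calculation behind this spreadsheet activity can be expressed compactly. The sketch below is a generic Python illustration, using one common textbook set of pKa values (tables differ, so treat the numbers as assumptions): it sums Henderson-Hasselbalch fractional charges and bisects for the isoelectric point.

```python
def net_charge(pH, counts, pka_basic=None, pka_acidic=None):
    """Net charge of a peptide at a given pH from counts of ionisable groups.

    counts maps group names to how many of each the peptide contains.
    Basic groups contribute +1/(1 + 10**(pH - pKa)); acidic groups
    contribute -1/(1 + 10**(pKa - pH)).  The pKa values below are one
    common textbook set; other tables differ slightly.
    """
    pka_basic = pka_basic or {"Nterm": 9.0, "K": 10.5, "R": 12.5, "H": 6.0}
    pka_acidic = pka_acidic or {"Cterm": 3.1, "D": 3.9, "E": 4.1, "C": 8.3, "Y": 10.1}
    q = 0.0
    for g, n in counts.items():
        if g in pka_basic:
            q += n / (1 + 10 ** (pH - pka_basic[g]))
        elif g in pka_acidic:
            q -= n / (1 + 10 ** (pka_acidic[g] - pH))
    return q

def isoelectric_point(counts, lo=0.0, hi=14.0, tol=1e-4):
    """Bisect for the pH at which the net charge is zero."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if net_charge(mid, counts) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# Illustrative peptide: one N-terminus, one C-terminus, 2 Lys, 1 Asp, 1 Glu.
counts = {"Nterm": 1, "Cterm": 1, "K": 2, "D": 1, "E": 1}
for pH in (2, 4, 6, 8, 10, 12):
    print(pH, round(net_charge(pH, counts), 2))
print("pI ~", round(isoelectric_point(counts), 2))
```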
Enabling Process Improvement and Control in Higher Education Management
ERIC Educational Resources Information Center
Bell, Gary; Warwick, Jon; Kennedy, Mike
2009-01-01
The emergence of "managerialism" in the governance and direction of UK higher education (HE) institutions has been led by government demands for greater accountability in the quality and cost of universities. There is emerging anecdotal evidence indicating that the estimation performance of HE spreadsheets and regression models is poor.…
Teaching Accounting with Computers.
ERIC Educational Resources Information Center
Shaoul, Jean
This paper addresses the numerous ways that computers may be used to enhance the teaching of accounting and business topics. It focuses on the pedagogical use of spreadsheet software to improve the conceptual coverage of accounting principles and practice, increase student understanding by involvement in the solution process, and reduce the amount…
Simple Numerical Analysis of Longboard Speedometer Data
ERIC Educational Resources Information Center
Hare, Jonathan
2013-01-01
Simple numerical data analysis is described, using a standard spreadsheet program, to determine distance, velocity (speed) and acceleration from voltage data generated by a skateboard/longboard speedometer (Hare 2012 "Phys. Educ." 47 409-17). This simple analysis is an introduction to data processing including scaling data as well as…
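The finite-difference steps the article describes are easy to mirror outside a spreadsheet. In the sketch below (plain Python; the calibration constant, sampling interval, and voltage samples are invented for illustration), speed comes from scaled voltage, distance from cumulative trapezoids, and acceleration from differences.

```python
# Sketch of the spreadsheet analysis: scale voltage to speed, then integrate
# for distance and difference for acceleration.  The calibration constant and
# sample data are invented for illustration, not taken from the article.
K = 2.0                    # assumed m/s per volt calibration factor
dt = 0.5                   # assumed sampling interval in seconds
voltages = [0.0, 0.8, 1.5, 2.1, 2.4, 2.5, 2.3, 1.9]

speeds = [K * v for v in voltages]                       # m/s

# Distance: cumulative trapezoidal integration of speed.
distance = [0.0]
for i in range(1, len(speeds)):
    distance.append(distance[-1] + dt * (speeds[i] + speeds[i - 1]) / 2)

# Acceleration: central differences inside, one-sided at the ends.
accel = []
for i in range(len(speeds)):
    if i == 0:
        accel.append((speeds[1] - speeds[0]) / dt)
    elif i == len(speeds) - 1:
        accel.append((speeds[-1] - speeds[-2]) / dt)
    else:
        accel.append((speeds[i + 1] - speeds[i - 1]) / (2 * dt))

for i, (v, s, d, a) in enumerate(zip(voltages, speeds, distance, accel)):
    print(f"t={i*dt:4.1f}s  V={v:.1f}V  v={s:.1f} m/s  x={d:.2f} m  a={a:+.2f} m/s^2")
```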
Fitting Planetary Orbits with a Spreadsheet.
ERIC Educational Resources Information Center
Bridges, Richard
1995-01-01
Describes how to fit binocular observations of the planets to a theoretical model of circular orbits using a modern computer spreadsheet, from which fundamental data about the solar system may be deduced. (AIM)
Documentation of spreadsheets for the analysis of aquifer-test and slug-test data
Halford, Keith J.; Kuniansky, Eve L.
2002-01-01
Several spreadsheets have been developed for the analysis of aquifer-test and slug-test data. Each spreadsheet incorporates analytical solution(s) of the partial differential equation for ground-water flow to a well for a specific type of condition or aquifer. The derivations of the analytical solutions were previously published. Thus, this report abbreviates the theoretical discussion, but includes practical information about each method and the important assumptions for the applications of each method. These spreadsheets were written in Microsoft Excel 9.0 (use of trade names does not constitute endorsement by the USGS). Storage properties should not be estimated with many of the spreadsheets because most are for analyzing single-well tests. Estimation of storage properties from single-well tests is generally discouraged because single-well tests are affected by wellbore storage and by well construction. These non-ideal effects frequently cause estimates of storage to be erroneous by orders of magnitude. Additionally, single-well tests are not sensitive to aquifer-storage properties. Single-well tests include all slug tests (Bouwer and Rice Method, Cooper-Bredehoeft-Papadopulos Method, and van der Kamp Method), the Cooper-Jacob straight-line Method, Theis recovery-data analysis, the Jacob-Lohman method for flowing wells in a confined aquifer, and the step-drawdown test. Multi-well test spreadsheets included in this report are the Hantush-Jacob Leaky Aquifer Method and Distance-Drawdown Methods. The distance-drawdown method is an equilibrium or steady-state method; thus, storage cannot be estimated.
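Of the single-well methods listed, the Cooper-Jacob straight-line analysis is the easiest to reproduce: transmissivity follows from the pumping rate and the drawdown change per log cycle of time, T = 2.3Q/(4*pi*ds). The sketch below is a generic Python illustration with invented late-time data, not the USGS spreadsheet itself.

```python
import math

def cooper_jacob_transmissivity(Q, delta_s_per_log_cycle):
    """Transmissivity from the Cooper-Jacob straight-line method.

    Q                      pumping rate (m^3/day)
    delta_s_per_log_cycle  drawdown change per log10 cycle of time (m)
    Returns T in m^2/day:  T = 2.3 * Q / (4 * pi * delta_s).
    """
    return 2.3 * Q / (4 * math.pi * delta_s_per_log_cycle)

def slope_per_log_cycle(times, drawdowns):
    """Least-squares slope of drawdown vs log10(time), i.e. delta_s per log cycle."""
    x = [math.log10(t) for t in times]
    n = len(x)
    mx, my = sum(x) / n, sum(drawdowns) / n
    return sum((a - mx) * (b - my) for a, b in zip(x, drawdowns)) / \
           sum((a - mx) ** 2 for a in x)

# Illustrative late-time data (minutes, metres of drawdown) -- invented values.
times = [10, 20, 40, 80, 160, 320]
drawdowns = [0.52, 0.61, 0.70, 0.79, 0.88, 0.97]
Q = 500.0                                  # m^3/day, assumed
ds = slope_per_log_cycle(times, drawdowns)
print("drawdown per log cycle:", round(ds, 3), "m")
print("T =", round(cooper_jacob_transmissivity(Q, ds), 1), "m^2/day")
```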
The meaning of diagnostic test results: a spreadsheet for swift data analysis.
Maceneaney, P M; Malone, D E
2000-03-01
To design a spreadsheet program to: (a) rapidly analyse diagnostic test result data produced in local research or reported in the literature; (b) correct reported predictive values for disease prevalence in any population; (c) estimate the post-test probability of disease in individual patients. Microsoft Excel(TM) was used. Section A: a contingency (2 x 2) table was incorporated into the spreadsheet. Formulae for standard calculations [sample size, disease prevalence, sensitivity and specificity with 95% confidence intervals, predictive values and likelihood ratios (LRs)] were linked to this table. The results change automatically when the data in the true or false negative and positive cells are changed. Section B: this estimates predictive values in any population, compensating for altered disease prevalence. Sections C-F: Bayes' theorem was incorporated to generate individual post-test probabilities. The spreadsheet generates 95% confidence intervals, LRs and a table and graph of conditional probabilities once the sensitivity and specificity of the test are entered. The latter shows the expected post-test probability of disease for any pre-test probability when a test of known sensitivity and specificity is positive or negative. This spreadsheet can be used on desktop and palmtop computers. The MS Excel(TM) version can be downloaded via the Internet from the URL ftp://radiography.com/pub/Rad-data99.xls. A spreadsheet is useful for contingency table data analysis and assessment of the clinical meaning of diagnostic test results. Copyright 2000 The Royal College of Radiologists.
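The quantities this spreadsheet links to its 2 x 2 table are standard. The sketch below (generic Python with an invented contingency table, giving point estimates only, without the confidence intervals the spreadsheet also reports) reproduces sensitivity, specificity, predictive values, likelihood ratios, and the odds-based conversion of a pre-test probability into a post-test probability.

```python
def test_summary(tp, fp, fn, tn):
    """Standard 2x2 diagnostic-test statistics (point estimates only)."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    return {
        "prevalence": (tp + fn) / (tp + fp + fn + tn),
        "sensitivity": sens,
        "specificity": spec,
        "PPV": tp / (tp + fp),
        "NPV": tn / (tn + fn),
        "LR+": sens / (1 - spec),
        "LR-": (1 - sens) / spec,
    }

def post_test_probability(pre_test_prob, likelihood_ratio):
    """Bayes via odds: post-odds = pre-odds * LR."""
    pre_odds = pre_test_prob / (1 - pre_test_prob)
    post_odds = pre_odds * likelihood_ratio
    return post_odds / (1 + post_odds)

# Illustrative 2x2 table (true positives, false positives, false negatives, true negatives).
stats = test_summary(tp=45, fp=10, fn=5, tn=140)
for k, v in stats.items():
    print(f"{k:11s} {v:.3f}")
# A patient with a 30% pre-test probability and a positive result:
print("post-test:", round(post_test_probability(0.30, stats["LR+"]), 3))
```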
Cuffney, Thomas F.
2003-01-01
The Invertebrate Data Analysis System (IDAS) software provides an accurate, consistent, and efficient mechanism for analyzing invertebrate data collected as part of the National Water-Quality Assessment Program and stored in the Biological Transactional Database (Bio-TDB). The IDAS software is a stand-alone program for personal computers that run Microsoft (MS) Windows. It allows users to read data downloaded from Bio-TDB and stored either as MS Excel or MS Access files. The program consists of five modules. The Edit Data module allows the user to subset, combine, delete, and summarize community data. The Data Preparation module allows the user to select the type(s) of sample(s) to process, calculate densities, delete taxa based on laboratory processing notes, combine lifestages or keep them separate, select a lowest taxonomic level for analysis, delete rare taxa, and resolve taxonomic ambiguities. The Calculate Community Metrics module allows the user to calculate over 130 community metrics, including metrics based on organism tolerances and functional feeding groups. The Calculate Diversities and Similarities module allows the user to calculate nine diversity and eight similarity indices. The Data Export module allows the user to export data to other software packages and produce tables of community data that can be imported into spreadsheet and word-processing programs. Though the IDAS program was developed to process invertebrate data downloaded from USGS databases, it will work with other data sets that are converted to the USGS (Bio-TDB) format. Consequently, the data manipulation, analysis, and export procedures provided by the IDAS program can be used by anyone involved in using benthic macroinvertebrates in applied or basic research.
Installation Torque Tables for Noncritical Applications
NASA Technical Reports Server (NTRS)
Rivera-Rosario, Hazel T.; Powell, Joseph S.
2017-01-01
The objective of this project is to define torque values for bolts and screws when loading is not a concern. Fasteners require a certain torque to fulfill their function and prevent failure. NASA Glenn Research Center did not have a set of fastener torque tables for non-critical applications without loads, usually referring to hand-tight or wrench-tight torqueing. The project is based on two formulas, torque and pullout load. Torque values are calculated, giving way to preliminary data tables. Testing is done on various bolts and metal plates, torqueing them to the point of failure. Around 640 torque tables were developed for UNC, UNF, and M fasteners. Different lengths of thread engagement were analyzed for the 5 most common materials used at GRC. The tables were put together in an Excel spreadsheet and then formatted into a Word document. The plan is to later convert this to an official technical publication or memorandum.
Spreadsheets in Science Teaching.
ERIC Educational Resources Information Center
Elliot, Chris
1988-01-01
Described is the use of a spreadsheet to model dynamic phenomena using numerical iterative methods. Uses the discharge of a capacitor, simple and damped harmonic motion, and the flow of heat along a bar as examples. (Author/CW)
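The iterative column-filling technique described above maps directly onto a loop. The sketch below (Python rather than a spreadsheet; R, C, the initial voltage, and the time step are arbitrary illustrative values) repeats the capacitor-discharge update V_{n+1} = V_n - (V_n/RC)*dt and compares it with the exact exponential.

```python
import math

# Illustrative parameters (not from the article).
R, C = 10e3, 100e-6        # 10 kilohm, 100 microfarad  ->  time constant RC = 1 s
V0, dt, steps = 5.0, 0.05, 60

V = V0
print(" t (s)   iterated   exact")
for n in range(steps + 1):
    t = n * dt
    exact = V0 * math.exp(-t / (R * C))
    if n % 10 == 0:
        print(f"{t:5.2f}   {V:8.4f}   {exact:6.4f}")
    V -= (V / (R * C)) * dt          # the same update a spreadsheet row would apply
```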
Spreadsheet Works: Graphing Functions on a Spreadsheet.
ERIC Educational Resources Information Center
Ramamurthi, V. S.
1989-01-01
Explains graphing functions when using LOTUS 1-2-3. Provides examples and explains keystroke entries needed to make the graphs. Notes up to six functions can be displayed on the same set of axes. (MVL)
Fitting Orbits to Jupiter's Moons with a Spreadsheet.
ERIC Educational Resources Information Center
Bridges, Richard
1995-01-01
Describes how a spreadsheet is used to fit a circular orbit model to observations of Jupiter's moons made with a small telescope. Kepler's Third Law and the inverse square law of gravity are observed. (AIM)
Software validation applied to spreadsheets used in laboratories working under ISO/IEC 17025
NASA Astrophysics Data System (ADS)
Banegas, J. M.; Orué, M. W.
2016-07-01
Several documents deal with software validation. Nevertheless, most are too complex to apply to the validation of spreadsheets - surely the most widely used software in laboratories working under ISO/IEC 17025. The method proposed in this work is intended to be applied directly to validate spreadsheets. It includes a systematic way to document requirements and operational aspects of validation, and a simple method to keep records of validation results and modification history. The method is currently in use in an accredited calibration laboratory, where it has proved to be practical and efficient.
LICSS - a chemical spreadsheet in microsoft excel
2012-01-01
Background: Representations of chemical datasets in spreadsheet format are important for ready data assimilation and manipulation. In addition to the normal spreadsheet facilities, chemical spreadsheets need to have visualisable chemical structures and data searchable by chemical as well as textual queries. Many such chemical spreadsheet tools are available, some operating in the familiar Microsoft Excel environment. However, within this group, the performance of Excel is often compromised, particularly in terms of the number of compounds which can usefully be stored on a sheet. Summary: LICSS is a lightweight chemical spreadsheet within Microsoft Excel for Windows. LICSS stores structures solely as Smiles strings. Chemical operations are carried out by calling Java code modules which use the CDK, JChemPaint and OPSIN libraries to provide cheminformatics functionality. Compounds in sheets or charts may be visualised (individually or en masse), and sheets may be searched by substructure or similarity. All the molecular descriptors available in CDK may be calculated for compounds (in batch or on-the-fly), and various cheminformatic operations such as fingerprint calculation, Sammon mapping, clustering and R group table creation may be carried out. We detail here the features of LICSS and how they are implemented. We also explain the design criteria, particularly in terms of potential corporate use, which led to this particular implementation. Conclusions: LICSS is an Excel-based chemical spreadsheet with a difference: • It can usefully be used on sheets containing hundreds of thousands of compounds; it doesn't compromise the normal performance of Microsoft Excel • It is designed to be installed and run in environments in which users do not have admin privileges; installation involves merely file copying, and sharing of LICSS sheets invokes automatic installation • It is free and extensible. LICSS is open source software and we hope sufficient detail is provided here to enable developers to add their own features and share with the community. PMID:22301088
LICSS - a chemical spreadsheet in microsoft excel.
Lawson, Kevin R; Lawson, Jonty
2012-02-02
Representations of chemical datasets in spreadsheet format are important for ready data assimilation and manipulation. In addition to the normal spreadsheet facilities, chemical spreadsheets need to have visualisable chemical structures and data searchable by chemical as well as textual queries. Many such chemical spreadsheet tools are available, some operating in the familiar Microsoft Excel environment. However, within this group, the performance of Excel is often compromised, particularly in terms of the number of compounds which can usefully be stored on a sheet. LICSS is a lightweight chemical spreadsheet within Microsoft Excel for Windows. LICSS stores structures solely as Smiles strings. Chemical operations are carried out by calling Java code modules which use the CDK, JChemPaint and OPSIN libraries to provide cheminformatics functionality. Compounds in sheets or charts may be visualised (individually or en masse), and sheets may be searched by substructure or similarity. All the molecular descriptors available in CDK may be calculated for compounds (in batch or on-the-fly), and various cheminformatic operations such as fingerprint calculation, Sammon mapping, clustering and R group table creation may be carried out. We detail here the features of LICSS and how they are implemented. We also explain the design criteria, particularly in terms of potential corporate use, which led to this particular implementation. LICSS is an Excel-based chemical spreadsheet with a difference: • It can usefully be used on sheets containing hundreds of thousands of compounds; it doesn't compromise the normal performance of Microsoft Excel • It is designed to be installed and run in environments in which users do not have admin privileges; installation involves merely file copying, and sharing of LICSS sheets invokes automatic installation • It is free and extensible. LICSS is open source software and we hope sufficient detail is provided here to enable developers to add their own features and share with the community.
ERIC Educational Resources Information Center
Clark, Joy L.; Hegji, Charles E.
1997-01-01
Notes that using spreadsheets to teach microeconomics principles enables learning by doing in the exploration of basic concepts. Introduction of increasingly complex topics leads to exploration of theory and managerial decision making. (SK)
Building Your Own Regression Model
ERIC Educational Resources Information Center
Horton, Robert, M.; Phillips, Vicki; Kenelly, John
2004-01-01
Spreadsheets to explore regression with an algebra 2 class in a medium-sized rural high school are presented. The use of spreadsheets can help students develop sophisticated understanding of mathematical models and use them to describe real-world phenomena.
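The regression that students build in such a spreadsheet reduces to the ordinary least-squares formulas behind the SLOPE, INTERCEPT, and RSQ worksheet functions. The sketch below is a generic Python illustration with invented data, not the classroom activity itself.

```python
def linear_fit(x, y):
    """Ordinary least squares for y = a + b*x; returns (intercept, slope, r_squared)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    slope = sxy / sxx
    intercept = my - slope * mx
    r_squared = sxy * sxy / (sxx * syy)
    return intercept, slope, r_squared

# Illustrative data (invented): hours studied vs test score.
x = [1, 2, 3, 4, 5, 6]
y = [52, 57, 63, 66, 73, 78]
a, b, r2 = linear_fit(x, y)
print(f"score = {a:.1f} + {b:.2f} * hours   (R^2 = {r2:.3f})")
```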
Petrogenetic Modeling with a Spreadsheet Program.
ERIC Educational Resources Information Center
Holm, Paul Eric
1988-01-01
Describes how interactive programs for scientific modeling may be created by using spreadsheet software such as LOTUS 1-2-3. Lists the advantages of using this method. Discusses fractional distillation, batch partial melting, and combination models as examples. (CW)
[Development of an Excel spreadsheet for meta-analysis of indirect and mixed treatment comparisons].
Tobías, Aurelio; Catalá-López, Ferrán; Roqué, Marta
2014-01-01
Meta-analyses in clinical research usually aim to evaluate treatment efficacy and safety in direct comparison with a unique comparator. Indirect comparisons, using Bucher's method, can summarize primary data when information from direct comparisons is limited or nonexistent. Mixed comparisons allow combining estimates from direct and indirect comparisons, increasing statistical power. There is a need for simple applications for meta-analysis of indirect and mixed comparisons. These can easily be conducted using a Microsoft Office Excel spreadsheet. We developed an easy-to-use spreadsheet for indirect and mixed comparisons aimed at clinical researchers interested in systematic reviews but unfamiliar with more advanced statistical packages. The proposed Excel spreadsheet for indirect and mixed comparisons can be of great use in clinical epidemiology to extend the knowledge provided by traditional meta-analysis when evidence from direct comparisons is limited or nonexistent.
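Bucher's adjusted indirect comparison, and the inverse-variance mixing of direct and indirect evidence mentioned above, take only a few lines. The sketch below is a generic Python illustration (the odds ratios and standard errors are invented), not the authors' Excel workbook.

```python
import math

def bucher_indirect(log_or_ab, se_ab, log_or_cb, se_cb):
    """Bucher adjusted indirect comparison of A vs C through common comparator B.

    log OR(A vs C) = log OR(A vs B) - log OR(C vs B); variances add.
    Returns the OR estimate, its 95% CI, and the SE of the log OR.
    """
    log_or_ac = log_or_ab - log_or_cb
    se_ac = math.sqrt(se_ab ** 2 + se_cb ** 2)
    lo, hi = log_or_ac - 1.96 * se_ac, log_or_ac + 1.96 * se_ac
    return math.exp(log_or_ac), (math.exp(lo), math.exp(hi)), se_ac

def mixed_estimate(log_or_direct, se_direct, log_or_indirect, se_indirect):
    """Inverse-variance combination of a direct and an indirect estimate."""
    w_d, w_i = 1 / se_direct ** 2, 1 / se_indirect ** 2
    log_or = (w_d * log_or_direct + w_i * log_or_indirect) / (w_d + w_i)
    return math.exp(log_or), math.sqrt(1 / (w_d + w_i))

# Illustrative direct estimates (invented): OR(A vs B) = 0.70, OR(C vs B) = 0.90.
or_ac, ci, se = bucher_indirect(math.log(0.70), 0.15, math.log(0.90), 0.20)
print(f"indirect OR(A vs C) = {or_ac:.2f}, 95% CI {ci[0]:.2f}-{ci[1]:.2f}")

# Combining with an (invented) direct estimate OR(A vs C) = 0.75, SE(log OR) = 0.25:
or_mixed, se_mixed = mixed_estimate(math.log(0.75), 0.25, math.log(or_ac), se)
print(f"mixed OR(A vs C) = {or_mixed:.2f}")
```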
BEST (bioreactor economics, size and time of operation) is an Excel™ spreadsheet-based model that is used in conjunction with the public domain geochemical modeling software, PHREEQCI. The BEST model is used in the design process of sulfate-reducing bacteria (SRB) field bioreacto...
Moving Fingers under a Stick: A Laboratory Activity
ERIC Educational Resources Information Center
Massalha, Taha; Lanir, Yuval; Gluck, Paul
2011-01-01
We consider a demonstration in which pupils alternately slide and stop their fingers under a long horizontal rod which they support. The changeover is described in terms of the relevant kinetic and static friction. We present a model calculation, performed on a spreadsheet, which clarifies the process and describes graphically the stepwise…
Maceneaney, P M; Malone, D E
2000-12-01
To design a spreadsheet program to rapidly analyse interventional radiology (IR) data produced in local research or reported in the literature, using 'evidence-based medicine' (EBM) parameters of treatment benefit and harm. Microsoft Excel(TM) was used. The spreadsheet consists of three worksheets. The first shows the 'Levels of Evidence and Grades of Recommendations' that can be assigned to therapeutic studies as defined by the Oxford Centre for EBM. The second and third worksheets facilitate the EBM assessment of therapeutic benefit and harm. Validity criteria are described. These include the assessment of the adequacy of sample size in the detection of possible procedural complications. A contingency (2 x 2) table for raw data on comparative outcomes in treated patients and controls has been incorporated. Formulae for EBM calculations are related to these numerators and denominators in the spreadsheet. The parameters calculated are, for benefit: relative risk reduction, absolute risk reduction, and number needed to treat (NNT); for harm: relative risk, relative odds, and number needed to harm (NNH). Ninety-five per cent confidence intervals are calculated for all these indices. The results change automatically when the data in the therapeutic outcome cells are changed. A final section allows the user to correct the NNT or NNH in their application to individual patients. This spreadsheet can be used on desktop and palmtop computers. The MS Excel(TM) version can be downloaded via the Internet from the URL ftp://radiography.com/pub/TxHarm00.xls. A spreadsheet is useful for the rapid analysis of the clinical benefit and harm from IR procedures.
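The benefit and harm indices this spreadsheet calculates follow directly from the 2 x 2 outcome table. The sketch below is a generic Python illustration with invented outcome counts and simple Wald-type confidence intervals; it mirrors the kind of arithmetic the worksheet performs but is not the authors' spreadsheet.

```python
import math

def benefit_indices(events_treated, n_treated, events_control, n_control):
    """ARR, RRR, NNT and relative risk with 95% Wald confidence intervals."""
    p_t = events_treated / n_treated
    p_c = events_control / n_control
    arr = p_c - p_t                                   # absolute risk reduction
    se_arr = math.sqrt(p_t * (1 - p_t) / n_treated + p_c * (1 - p_c) / n_control)
    arr_ci = (arr - 1.96 * se_arr, arr + 1.96 * se_arr)
    rr = p_t / p_c                                    # relative risk
    se_log_rr = math.sqrt((1 - p_t) / events_treated + (1 - p_c) / events_control)
    rr_ci = (math.exp(math.log(rr) - 1.96 * se_log_rr),
             math.exp(math.log(rr) + 1.96 * se_log_rr))
    return {
        "ARR": (arr, arr_ci),
        "RRR": 1 - rr,
        "NNT": 1 / arr if arr != 0 else float("inf"),
        "RR": (rr, rr_ci),
    }

# Illustrative outcome data (invented): 12/100 events with treatment, 24/100 with control.
for name, value in benefit_indices(12, 100, 24, 100).items():
    print(name, value)
```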
Do Vampires Exist? Using Spreadsheets To Investigate a Common Folktale.
ERIC Educational Resources Information Center
Drier, Hollylynne Stohl
1999-01-01
Describes the use of spreadsheets in a third grade class to teach basic mathematical concepts by investigating the existence of vampires. Incorporates addition and multiplication skills, patterning, variables, formulas, exponential growth, and proof by contradiction. (LRW)
The Computer Bulletin Board. Modified Gran Plots of Very Weak Acids on a Spreadsheet.
ERIC Educational Resources Information Center
Chau, F. T.; And Others
1990-01-01
Presented are two applications of computer technology to chemistry instruction: the use of a spreadsheet program to analyze acid-base titration curves and the use of database software to catalog stockroom inventories. (CW)
This page provides information and access to Standard Evaluation Procedures (SEPs) and Data Entry Spreadsheet Templates (DESTs) developed by EPA's Office of Chemical Safety and Pollution Prevention (OCSPP).
Computer Corner: Spreadsheets, Power Series, Generating Functions, and Integers.
ERIC Educational Resources Information Center
Snow, Donald R.
1989-01-01
Implements a table algorithm on a spreadsheet program and obtains functions for several number sequences such as the Fibonacci and Catalan numbers. Considers other applications of the table algorithm to integers represented in various number bases. (YP)
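The table algorithm described above fills a sequence from its recurrence, which is exactly what a spreadsheet column does. The sketch below is a plain-Python illustration (not the article's worksheet) generating Fibonacci and Catalan numbers from their usual recurrences.

```python
def fibonacci(n):
    """First n Fibonacci numbers via the recurrence F(k) = F(k-1) + F(k-2)."""
    seq = [0, 1]
    while len(seq) < n:
        seq.append(seq[-1] + seq[-2])
    return seq[:n]

def catalan(n):
    """First n Catalan numbers via C(k) = sum_{i<k} C(i) * C(k-1-i)."""
    seq = [1]
    for k in range(1, n):
        seq.append(sum(seq[i] * seq[k - 1 - i] for i in range(k)))
    return seq

print(fibonacci(10))   # [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]
print(catalan(8))      # [1, 1, 2, 5, 14, 42, 132, 429]
```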
The processing of blend words in naming and sentence reading.
Johnson, Rebecca L; Slate, Sarah Rose; Teevan, Allison R; Juhasz, Barbara J
2018-04-01
Research exploring the processing of morphologically complex words, such as compound words, has found that they are decomposed into their constituent parts during processing. Although much is known about the processing of compound words, very little is known about the processing of lexicalised blend words, which are created from parts of two words, often with phoneme overlap (e.g., brunch). In the current study, blends were matched with non-blend words on a variety of lexical characteristics, and blend processing was examined using two tasks: a naming task and an eye-tracking task that recorded eye movements during reading. Results showed that blend words were processed more slowly than non-blend control words in both tasks. Blend words led to longer reaction times in naming and longer processing times on several eye movement measures compared to non-blend words. This was especially true for blends that were long, rated low in word familiarity, but were easily recognisable as blends.
A Brief User's Guide to the Excel®-Based DF Calculator
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jubin, Robert T.
2016-06-01
To understand the importance of capturing penetrating forms of iodine as well as the other volatile radionuclides, a calculation tool was developed in the form of an Excel® spreadsheet to estimate the overall plant decontamination factor (DF). The tool requires the user to estimate splits of the volatile radionuclides within the major portions of the reprocessing plant, speciation of iodine, and individual DFs for each off-gas stream within the Used Nuclear Fuel reprocessing plant. The impact on the overall plant DF for each volatile radionuclide is then calculated by the tool based on the specific user choices. The Excel® spreadsheet tracks both elemental and penetrating forms of iodine separately and allows changes in the speciation of iodine at each processing step. It also tracks 3H, 14C and 85Kr. This document provides a basic user's guide to the manipulation of this tool.
ERIC Educational Resources Information Center
Carson, S. R.
1998-01-01
Presents a method for using spreadsheets to model special relativistic phenomena based on the connection between electric and magnetic fields in special relativity. Uses the time dilation equation to carry out transformations between reference frames that show the connection between the fields quantitatively. (DDR)
DOT National Transportation Integrated Search
2014-03-01
This study resulted in the development of the GASCAP model (the Greenhouse Gas Assessment Spreadsheet for Transportation Capital Projects). This spreadsheet model provides a user-friendly interface for determining the greenhouse gas (GHG) emissions...
ERIC Educational Resources Information Center
Ivancevich, Daniel M.; And Others
1996-01-01
Points out that political and economic pressures have sometimes caused the Financial Accounting Standards Board to alter standards. Presents a spreadsheet tool that demonstrates the economic consequences of adopting accounting standards. (SK)
76 FR 34124 - Civil Supersonic Aircraft Panel Discussion
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-10
... and continuing to the second line in the second column, the Web site address should read as follows: https://spreadsheets.google.com/spreadsheet/viewform?formkey=dEFEdlRnYzBiaHZtTUozTHVtbkF4d0E6MQ . [FR...
Assessment of ODOT culvert load rating spreadsheets for use in Michigan.
DOT National Transportation Integrated Search
2013-01-01
The project Assessment of ODOT Culvert Load Rating Spreadsheets for use in Michigan was a short time-frame project funded by the Michigan Department of Transportation (MDOT) through the Center for Structural Durability (CSD) at Michigan Tec...
A TOOL FOR PLANNING AERIAL PHOTOGRAPHY
The U.S. EPA's Pacific Coastal Ecology Branch has developed a tool in the form of an Excel spreadsheet that facilitates planning aerial photography missions. The spreadsheet accepts various input parameters such as desired photo-scale and boundary coordinates of the stud...
Hand, Maureen; Augustine, Chad; Feldman, David; Kurup, Parthiv; Beiter, Philipp; O'Connor, Patrick
2017-08-21
Each year since 2015, NREL has presented the Annual Technology Baseline (ATB) in a spreadsheet that contains detailed cost and performance data (both current and projected) for renewable and conventional technologies. The spreadsheet includes a workbook for each technology. This spreadsheet provides data for the 2017 ATB. In this edition of the ATB, offshore wind power has been updated to include 15 technical resource groups, and two options are now provided for representing market conditions for project financing: current market conditions and long-term historical conditions. For more information, see https://atb.nrel.gov/.
NASA Astrophysics Data System (ADS)
Conrad, Jon M.
1999-10-01
Resource Economics is a text for students with a background in calculus, intermediate microeconomics, and a familiarity with the spreadsheet software Excel. The book covers basic concepts, shows how to set up spreadsheets to solve dynamic allocation problems, and presents economic models for fisheries, forestry, nonrenewable resources, stock pollutants, option value, and sustainable development. Within the text, numerical examples are posed and solved using Excel's Solver. Through these examples and additional exercises at the end of each chapter, students can make dynamic models operational, develop their economic intuition, and learn how to set up spreadsheets for the simulation of optimization of resource and environmental systems.
NASA Astrophysics Data System (ADS)
Gaik Tay, Kim; Cheong, Tau Han; Foong Lee, Ming; Kek, Sie Long; Abdul-Kahar, Rosmila
2017-08-01
In previous work on an Euler spreadsheet calculator for solving an ordinary differential equation, Visual Basic for Applications (VBA) programming was used; however, a graphical user interface was not developed to capture user input. This weakness may confuse users, since the input and output are displayed in the same worksheet. Besides, the existing Euler spreadsheet calculator is not interactive, as there is no prompt message if there is a mistake in entering the parameters. On top of that, there are no user instructions to guide users in entering the derivative function. Hence, in this paper, we address these limitations by developing a user-friendly and interactive graphical user interface. This improvement is aimed at capturing user input with instructions and interactive error prompts implemented using VBA programming. This Euler graphical user interface spreadsheet calculator does not act as a black box, as users can click on any cell in the worksheet to see the formula used to implement the numerical scheme. In this way, it can enhance self-learning and life-long learning in implementing the numerical scheme in a spreadsheet and later in any programming language.
Recalling taboo and nontaboo words.
Jay, Timothy; Caldwell-Harris, Catherine; King, Krista
2008-01-01
People remember emotional and taboo words better than neutral words. It is well known that words that are processed at a deep (i.e., semantic) level are recalled better than words processed at a shallow (i.e., purely visual) level. To determine how depth of processing influences recall of emotional and taboo words, a levels of processing paradigm was used. Whether this effect holds for emotional and taboo words has not been previously investigated. Two experiments demonstrated that taboo and emotional words benefit less from deep processing than do neutral words. This is consistent with the proposal that memories for taboo and emotional words are a function of the arousal level they evoke, even under shallow encoding conditions. Recall was higher for taboo words, even when taboo words were cued to be recalled after neutral and emotional words. The superiority of taboo word recall is consistent with cognitive neuroscience and brain imaging research.
The impact of developmental dyslexia and dysgraphia on movement production during word writing.
Kandel, Sonia; Lassus-Sangosse, Delphine; Grosjacques, Géraldine; Perret, Cyril
This study investigated how deficits in orthographic processing affect movement production during word writing. Children with dyslexia and dysgraphia wrote words and pseudo-words on a digitizer. The words were orthographically regular and irregular of varying frequency. The group analysis revealed that writing irregular words and pseudo-words increased movement duration and dysfluency. This indicates that the spelling processes were active while the children were writing the words. The impact of these spelling processes was stronger for the children with dyslexia and dysgraphia. The analysis of individual performance revealed that most dyslexic/dysgraphic children presented similar writing patterns. However, selective lexical processing deficits affected irregular word writing but not pseudo-word writing. Selective poor sublexical processing affected pseudo-word writing more than irregular word writing. This study suggests that the interaction between orthographic and motor processing constitutes an important cognitive load that may disrupt the graphic outcome of the children with dyslexia/dysgraphia.
Using a Spreadsheet To Explore Melting, Dissolving and Phase Diagrams.
ERIC Educational Resources Information Center
Goodwin, Alan
2002-01-01
Compares phase diagrams relating to the solubilities and melting points of various substances in textbooks with those generated by a spreadsheet using data from the literature. Argues that differences between the diagrams give rise to new chemical insights. (Author/MM)
The Use of Lotus 1-2-3 Macros in Engineering Calculations.
ERIC Educational Resources Information Center
Rosen, Edward M.
1990-01-01
Described are the use of spreadsheet programs in chemical engineering calculations using Lotus 1-2-3 macros. Discusses the macro commands, subroutine operations, and solution of partial differential equation. Provides examples of the subroutine programs and spreadsheet solution. (YP)
Academic Testing and Grading with Spreadsheet Software.
ERIC Educational Resources Information Center
Ho, James K.
1987-01-01
Explains how spreadsheet software can be used in the design and grading of academic tests and in assigning grades. Macro programs and menu-driven software are highlighted and an example using IBM PCs and Lotus 1-2-3 software is given. (Author/LRW)
Spreadsheet Toolkit for Ulysses Hi-Scale Measurements of Interplanetary Ions and Electrons
NASA Astrophysics Data System (ADS)
Reza, J. Z.; Lanzerotti, L. J.; Denker, C.; Patterson, D.; Amstrong, T. P.
2004-05-01
Throughout the entire Ulysses out-of-the-ecliptic solar polar mission, the Heliosphere Instrument for Spectra, Composition, and Anisotropy at Low Energies (HI-SCALE) has collected measurements of interplanetary ions and electrons. Time-series of electron and ion fluxes obtained since 1990 have been carefully calibrated and will be stored in a data management system, which will be publicly accessible via the WWW. The goal of the Virtual Solar Observatory (VSO) is to provide data uniformly and efficiently to a diverse user community. However, data dissemination can only be a first step, which has to be followed by a suite of data analysis tools that are tailored towards a diverse user community in science, technology, and education. The widespread use and familiarity of spreadsheets, which are available at low cost or open source for many operating systems, make them an interesting tool to investigate for the analysis of HI-SCALE data. The data are written in comma separated variable (CSV) format, which is commonly used in spreadsheet programs. CSV files can simply be linked as external data to spreadsheet templates, which in turn can be used to generate tables and figures of basic statistical properties and frequency distributions, temporal evolution of electron and ion spectra, comparisons of various energy channels, automatic detection of solar events, solar cycle variations, and space weather. Exploring spreadsheet-assisted data analysis in the context of information technology research, data base information search and retrieval, and data visualization potentially impacts other VSO components, where diverse user communities are targeted. Finally, this presentation is the result of an undergraduate research project, which will allow us to evaluate the performance of user-based spreadsheet analysis "benchmarked" at the undergraduate skill level.
Design and Evaluation of a Personal Diffusion Battery.
Vosburgh, Donna J H; Klein, Timothy; Sheehan, Maura; Anthony, T Renee; Peters, Thomas M
A four-stage personal diffusion battery (pDB) was designed and constructed to measure submicron particle size distributions. The pDB consisted of a screen-type diffusion battery, solenoid valve system, and electronic controller. A data inversion spreadsheet was created to solve for the number median diameter (NMD), geometric standard deviation (GSD), and particle number concentration of unimodal aerosols using stage number concentrations from the pDB combined with a handheld condensation particle counter (pDB+CPC). The inversion spreadsheet included particle entry losses, theoretical penetrations across screens, the detection efficiency of the CPC, and constraints so the spreadsheet solved to values within the pDB range. Size distribution parameters (NMD, GSD, and number concentration) measured with the pDB+CPC with inversion spreadsheet were within 25% of those measured with a scanning mobility particle sizer (SMPS) for 5 of 12 polydisperse combustion aerosols. For three tests conducted with propylene torch exhaust, the pDB+CPC with inversion spreadsheet successfully identified that the NMD was smaller than the constraint value of 16 nm. The ratio of the nanoparticle portion of the aerosol compared to the reference ( R nano ) was calculated to determine the ability of pDB+CPC with inversion spreadsheet to measure the nanoparticle portion of the aerosols. The R nano ranged from 0.87 to 1.01 when the inversion solved and from 0.06 to 2.01 when the inversion solved to a constraint. The pDB combined with CPC has limited use as a personal monitor but combining the pDB with a different detector would allow for the pDB to be used as a personal monitor.
Design and Evaluation of a Personal Diffusion Battery
Vosburgh, Donna J. H.; Klein, Timothy; Sheehan, Maura; Anthony, T. Renee; Peters, Thomas M.
2016-01-01
A four-stage personal diffusion battery (pDB) was designed and constructed to measure submicron particle size distributions. The pDB consisted of a screen-type diffusion battery, solenoid valve system, and electronic controller. A data inversion spreadsheet was created to solve for the number median diameter (NMD), geometric standard deviation (GSD), and particle number concentration of unimodal aerosols using stage number concentrations from the pDB combined with a handheld condensation particle counter (pDB+CPC). The inversion spreadsheet included particle entry losses, theoretical penetrations across screens, the detection efficiency of the CPC, and constraints so the spreadsheet solved to values within the pDB range. Size distribution parameters (NMD, GSD, and number concentration) measured with the pDB+CPC with inversion spreadsheet were within 25% of those measured with a scanning mobility particle sizer (SMPS) for 5 of 12 polydisperse combustion aerosols. For three tests conducted with propylene torch exhaust, the pDB+CPC with inversion spreadsheet successfully identified that the NMD was smaller than the constraint value of 16 nm. The ratio of the nanoparticle portion of the aerosol compared to the reference (R nano) was calculated to determine the ability of pDB+CPC with inversion spreadsheet to measure the nanoparticle portion of the aerosols. The R nano ranged from 0.87 to 1.01 when the inversion solved and from 0.06 to 2.01 when the inversion solved to a constraint. The pDB combined with CPC has limited use as a personal monitor but combining the pDB with a different detector would allow for the pDB to be used as a personal monitor. PMID:26900207
Simplified risk assessment of noise induced hearing loss by means of 2 spreadsheet models.
Lie, Arve; Engdahl, Bo; Tambs, Kristian
2016-11-18
The objective of this study has been to test 2 spreadsheet models to compare the observed with the expected hearing loss for a Norwegian reference population. The prevalence rates of the Norwegian and the National Institute for Occupational Safety and Health (NIOSH) definitions of hearing outcomes were calculated in terms of sex and age, 20-64 years old, for a screened (with no occupational noise exposure) (N = 18 858) and unscreened (N = 38 333) Norwegian reference population from the Nord-Trøndelag Hearing Loss Study (NTHLS). Based on the prevalence rates, 2 different spreadsheet models were constructed in order to compare the prevalence rates of various groups of workers with the expected rates. The spreadsheets were then tested on 10 different occupational groups with varying degrees of hearing loss as compared to a reference population. Hearing of office workers, train drivers, conductors and teachers differed little from the screened reference values based on the Norwegian and the NIOSH criterion. The construction workers, miners, farmers and military had an impaired hearing and railway maintenance workers and bus drivers had a mildly impaired hearing. The spreadsheet models give a valid assessment of the hearing loss. The use of spreadsheet models to compare hearing in occupational groups with that of a reference population is a simple and quick method. The results are in line with comparable hearing thresholds, and allow for significance testing. The method is believed to be useful for occupational health services in the assessment of risk of noise induced hearing loss (NIHL) and the preventive potential in groups of noise-exposed workers. Int J Occup Med Environ Health 2016;29(6):991-999. This work is available in Open Access model and licensed under a CC BY-NC 3.0 PL license.
The Processing of Novel and Lexicalised Prefixed Words in Reading
ERIC Educational Resources Information Center
Pollatsek, Alexander; Slattery, Timothy J.; Juhasz, Barbara
2008-01-01
Two experiments compared how relatively long novel prefixed words (e.g., "overfarm") and existing prefixed words were processed in reading. The use of novel prefixed words allows one to examine the roles of whole-word access and decompositional processing in the processing of non-novel prefixed words. The two experiments found that,…
Modeling the Monthly Water Balance of a First Order Coastal Forested Watershed
S. V. Harder; Devendra M. Amatya; T. J. Callahan; Carl C. Trettin
2006-01-01
A study has been conducted to evaluate a spreadsheet-based conceptual Thornthwaite monthly water balance model and the process-based DRAINMOD model for their reliability in predicting monthly water budgets of a poorly drained, first order forested watershed at the Santee Experimental Forest located along the Lower Coastal Plain of South Carolina. Measured precipitation...
A New Spin on Miscue Analysis: Using Spider Charts to Web Reading Processes
ERIC Educational Resources Information Center
Wohlwend, Karen E.
2012-01-01
This article introduces a way of seeing miscue analysis data through a "spider chart", a readily available digital graphing tool that provides an effective way to visually represent readers' complex coordination of interrelated cueing systems. A spider chart is a standard feature in recent spreadsheet software that puts a new spin on miscue…
Using Serial and Discrete Digit Naming to Unravel Word Reading Processes
Altani, Angeliki; Protopapas, Athanassios; Georgiou, George K.
2018-01-01
During reading acquisition, word recognition is assumed to undergo a developmental shift from slow serial/sublexical processing of letter strings to fast parallel processing of whole word forms. This shift has been proposed to be detected by examining the size of the relationship between serial- and discrete-trial versions of word reading and rapid naming tasks. Specifically, a strong association between serial naming of symbols and single word reading suggests that words are processed serially, whereas a strong association between discrete naming of symbols and single word reading suggests that words are processed in parallel as wholes. In this study, 429 Grade 1, 3, and 5 English-speaking Canadian children were tested on serial and discrete digit naming and word reading. Across grades, single word reading was more strongly associated with discrete naming than with serial naming of digits, indicating that short high-frequency words are processed as whole units early in the development of reading ability in English. In contrast, serial naming was not a unique predictor of single word reading across grades, suggesting that within-word sequential processing was not required for the successful recognition for this set of words. Factor mixture analysis revealed that our participants could be clustered into two classes, namely beginning and more advanced readers. Serial naming uniquely predicted single word reading only among the first class of readers, indicating that novice readers rely on a serial strategy to decode words. Yet, a considerable proportion of Grade 1 students were assigned to the second class, evidently being able to process short high-frequency words as unitized symbols. We consider these findings together with those from previous studies to challenge the hypothesis of a binary distinction between serial/sublexical and parallel/lexical processing in word reading. We argue instead that sequential processing in word reading operates on a continuum, depending on the level of reading proficiency, the degree of orthographic transparency, and word-specific characteristics. PMID:29706918
Using Serial and Discrete Digit Naming to Unravel Word Reading Processes.
Altani, Angeliki; Protopapas, Athanassios; Georgiou, George K
2018-01-01
During reading acquisition, word recognition is assumed to undergo a developmental shift from slow serial/sublexical processing of letter strings to fast parallel processing of whole word forms. This shift has been proposed to be detected by examining the size of the relationship between serial- and discrete-trial versions of word reading and rapid naming tasks. Specifically, a strong association between serial naming of symbols and single word reading suggests that words are processed serially, whereas a strong association between discrete naming of symbols and single word reading suggests that words are processed in parallel as wholes. In this study, 429 Grade 1, 3, and 5 English-speaking Canadian children were tested on serial and discrete digit naming and word reading. Across grades, single word reading was more strongly associated with discrete naming than with serial naming of digits, indicating that short high-frequency words are processed as whole units early in the development of reading ability in English. In contrast, serial naming was not a unique predictor of single word reading across grades, suggesting that within-word sequential processing was not required for the successful recognition for this set of words. Factor mixture analysis revealed that our participants could be clustered into two classes, namely beginning and more advanced readers. Serial naming uniquely predicted single word reading only among the first class of readers, indicating that novice readers rely on a serial strategy to decode words. Yet, a considerable proportion of Grade 1 students were assigned to the second class, evidently being able to process short high-frequency words as unitized symbols. We consider these findings together with those from previous studies to challenge the hypothesis of a binary distinction between serial/sublexical and parallel/lexical processing in word reading. We argue instead that sequential processing in word reading operates on a continuum, depending on the level of reading proficiency, the degree of orthographic transparency, and word-specific characteristics.
This SOP describes the method used to automatically parse analytical data generated from gas chromatography/mass spectrometry (GC/MS) analyses into CTEPP summary spreadsheets and electronically import the summary spreadsheets into the CTEPP study database.
A Spreadsheet-based GIS tool for planning aerial photography
The U.S. EPA's Pacific Coastal Ecology Branch has developed a tool which facilitates planning aerial photography missions. This tool is an Excel spreadsheet which accepts various input parameters such as desired photo-scale and boundary coordinates of the study area and compiles ...
NASA Astrophysics Data System (ADS)
Conrad, Jon M.
2000-01-01
Resource Economics is a text for students with a background in calculus, intermediate microeconomics, and a familiarity with the spreadsheet software Excel. The book covers basic concepts, shows how to set up spreadsheets to solve dynamic allocation problems, and presents economic models for fisheries, forestry, nonrenewable resources, stock pollutants, option value, and sustainable development. Within the text, numerical examples are posed and solved using Excel's Solver. These problems help make concepts operational, develop economic intuition, and serve as a bridge to the study of real-world problems of resource management. Through these examples and additional exercises at the end of Chapters 1 to 8, students can make dynamic models operational, develop their economic intuition, and learn how to set up spreadsheets for the simulation of optimization of resource and environmental systems. The book is unique in its use of spreadsheet software (Excel) to solve dynamic allocation problems. Conrad is co-author of a previous book for the Press on the subject for graduate students. The approach is extremely student-friendly, giving students the tools to apply research results to actual environmental issues.
Crovelli, Robert A.; revised by Charpentier, Ronald R.
2012-01-01
The U.S. Geological Survey (USGS) periodically assesses petroleum resources of areas within the United States and the world. The purpose of this report is to explain the development of an analytic probabilistic method and spreadsheet software system called Analytic Cell-Based Continuous Energy Spreadsheet System (ACCESS). The ACCESS method is based upon mathematical equations derived from probability theory. The ACCESS spreadsheet can be used to calculate estimates of the undeveloped oil, gas, and NGL (natural gas liquids) resources in a continuous-type assessment unit. An assessment unit is a mappable volume of rock in a total petroleum system. In this report, the geologic assessment model is defined first, the analytic probabilistic method is described second, and the spreadsheet ACCESS is described third. In this revised version of Open-File Report 00-044, the text has been updated to reflect modifications that were made to the ACCESS program. Two versions of the program are added as appendixes.
Word Processing. A Handbook for Business Teachers.
ERIC Educational Resources Information Center
Stewart, Jeffrey R., Jr., Ed.
This handbook is designed to provide information to help teachers keep abreast of changes in word processing and to develop necessary teaching skills. The handbook is divided into two main parts: understanding word processing and teaching word processing skills. In the introduction the part word processing plays in the business scheme of a company…
Individual differences in emotion word processing: A diffusion model analysis.
Mueller, Christina J; Kuchinke, Lars
2016-06-01
The exploratory study investigated individual differences in implicit processing of emotional words in a lexical decision task. A processing advantage for positive words was observed, and differences between happy and fear-related words in response times were predicted by individual differences in specific variables of emotion processing: Whereas more pronounced goal-directed behavior was related to a specific slowdown in processing of fear-related words, the rate of spontaneous eye blinks (indexing brain dopamine levels) was associated with a processing advantage of happy words. Estimating diffusion model parameters revealed that the drift rate (rate of information accumulation) captures unique variance of processing differences between happy and fear-related words, with highest drift rates observed for happy words. Overall emotion recognition ability predicted individual differences in drift rates between happy and fear-related words. The findings emphasize that a significant amount of variance in emotion processing is explained by individual differences in behavioral data.
(abstract) Generic Modeling of a Life Support System for Process Technology Comparisons
NASA Technical Reports Server (NTRS)
Ferrall, J. F.; Seshan, P. K.; Rohatgi, N. K.; Ganapathi, G. B.
1993-01-01
This paper describes a simulation model called the Life Support Systems Analysis Simulation Tool (LiSSA-ST), the spreadsheet program called the Life Support Systems Analysis Trade Tool (LiSSA-TT), and the Generic Modular Flow Schematic (GMFS) modeling technique. Results of using the LiSSA-ST and the LiSSA-TT will be presented for comparing life support systems and process technology options for a Lunar Base and a Mars Exploration Mission.
Buffer$--An Economic Analysis Tool
Gary Bentrup
2007-01-01
Buffer$ is an economic spreadsheet tool for analyzing the cost-benefits of conservation buffers by resource professionals. Conservation buffers are linear strips of vegetation managed for multiple landowner and societal objectives. The Microsoft Excel based spreadsheet can calculate potential income derived from a buffer, including income from cost-share/incentive...
A spreadsheet that calculates meteor orbits
NASA Astrophysics Data System (ADS)
Langbroek, M.
2004-08-01
The author has written an MS Excel spreadsheet application called Metorb08.xls which calculates a meteor's orbital elements from its apparent radiant position and initial speed. It can be downloaded from URL http://home.wanadoo.nl/marco.langbroek along with a suite of other meteor-related Excel applications.
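A full reduction from apparent radiant and initial speed to orbital elements involves several corrections (zenith attraction, subtraction of Earth's velocity vector, and so on) that go beyond a short example. The sketch below shows only one step such a spreadsheet might perform, applying the vis-viva relation to turn an assumed heliocentric speed at 1 AU into a semi-major axis; it is illustrative and is not the Metorb08.xls algorithm.

# A minimal sketch of one step of a meteor-orbit reduction: given a meteoroid's
# heliocentric speed at 1 AU, the vis-viva equation gives the semi-major axis.
GM_SUN = 1.32712440018e20        # m^3 s^-2
AU     = 1.495978707e11          # m

def semi_major_axis_au(v_helio_km_s, r_au=1.0):
    """Vis-viva: 1/a = 2/r - v^2/GM.  Returns a in AU (negative => hyperbolic)."""
    v = v_helio_km_s * 1e3
    r = r_au * AU
    inv_a = 2.0 / r - v * v / GM_SUN
    return (1.0 / inv_a) / AU

print(semi_major_axis_au(38.0))   # ~2.7 AU for a 38 km/s heliocentric speed at 1 AU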
Teaching Science and Mathematics Subjects Using the Excel Spreadsheet Package
ERIC Educational Resources Information Center
Ibrahim, Dogan
2009-01-01
The teaching of scientific subjects usually require laboratories where students can put the theory they have learned into practice. Traditionally, electronic programmable calculators, dedicated software, or expensive software simulation packages, such as MATLAB have been used to simulate scientific experiments. Recently, spreadsheet programs have…
Automated Formative Feedback and Summative Assessment Using Individualised Spreadsheet Assignments
ERIC Educational Resources Information Center
Blayney, Paul; Freeman, Mark
2004-01-01
This paper reports on the effects of automating formative feedback at the student's discretion and automating summative assessment with individualised spreadsheet assignments. Quality learning outcomes are achieved when students adopt deep approaches to learning (Ramsden, 2003). Learning environments designed to align assessment to learning…
Introduction to Classroom Sprego
ERIC Educational Resources Information Center
Csernoch, Mária; Biró, Piroska
2016-01-01
Sprego is programming with spreadsheet functions. The present paper provides introductory Sprego examples which have so far only been available in Hungarian. Spreadsheet environments offer both a programming tool which best serves beginner and end-user programmers' interest, and an approach which lightens the burden of coding and language details.…
Hydrogen Financial Analysis Scenario Tool (H2FAST) Documentation
Documentation is provided for the web and spreadsheet versions of H2FAST: the H2FAST Web Tool User's Manual and the H2FAST Spreadsheet Tool User's Manual (DRAFT). For technical support, send questions or feedback about H2FAST to H2FAST@nrel.gov.
NASA Astrophysics Data System (ADS)
Zou, Yan-Rong; Wang, Lianyuan; Shuai, Yanhua; Peng, Ping'an
2005-08-01
A new kinetic model and an Excel© spreadsheet program for modeling the stable carbon isotope composition of natural gases are provided in this paper. The model and spreadsheet can be used to describe and predict variations in the stable carbon isotopes of natural gases under both experimental and geological conditions as a function of heating temperature or geological time. It is a user-friendly, convenient tool for modeling isotope variation with time under experimental and geological conditions. The spreadsheet, based on experimental data, requires the input of the kinetic parameters of gaseous hydrocarbon generation. Some assumptions are made in this model: the conventional (non-isotope species) kinetic parameters represent the light isotope species; for simplicity, the initial isotopic value is the same for all parallel chemical reactions of gaseous hydrocarbon generation; the pre-exponential factor ratio, 13A/12A, is a constant; and both heavy and light isotope species have similar activation energy distributions. These assumptions are common in the modeling of isotope ratios. The spreadsheet is used to search for the kinetic parameters of the heavy isotope species that minimize the error relative to the experimental data, and then to extrapolate isotopic changes to the thermal history of sedimentary basins. A short calculation example on the variation in δ13C values of methane is provided in this paper to show application to geological conditions.
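The kind of calculation such a spreadsheet performs can be sketched under the assumptions listed in the abstract (parallel first-order reactions, a shared activation-energy distribution, constant 13A/12A). The small activation-energy offset dE assigned to the heavy species and every numerical value below are hypothetical placeholders, not the authors' fitted parameters.

# A minimal sketch, not the authors' program: parallel first-order reactions
# generate light (12C) and heavy (13C) methane during slow linear heating, and
# the delta-13C of the cumulative gas is computed from the two conversions.
import numpy as np

Rgas  = 8.314                     # J mol^-1 K^-1
A12   = 1e14                      # s^-1, light-species frequency factor (hypothetical)
ratioA = 1.0                      # 13A/12A, held constant per the stated assumption
E  = np.array([200e3, 210e3, 220e3])   # J/mol, discrete E distribution (hypothetical)
f  = np.array([0.3, 0.5, 0.2])         # fraction of generable gas per reaction
dE = 150.0                        # J/mol, extra barrier for the heavy species (hypothetical)
R_init, R_PDB = 0.0112, 0.0112372 # initial 13C/12C ratio and the PDB reference ratio

def conversion(A, Eact, heating_rate=3.0 / 3.15e13, T0=300.0, T1=380.0, n=20000):
    """Fraction converted per reaction after heating T0 -> T1 (K) at ~3 K/Myr."""
    T = np.linspace(T0, T1, n)
    dt = (T[1] - T[0]) / heating_rate          # seconds spent at each temperature step
    k = A * np.exp(-Eact[:, None] / (Rgas * T[None, :]))
    return 1.0 - np.exp(-np.sum(k, axis=1) * dt)   # rectangle-rule integral of k dt

x12 = conversion(A12, E)
x13 = conversion(ratioA * A12, E + dE)
R_cum = R_init * np.sum(f * x13) / np.sum(f * x12)
print("delta13C of cumulative gas: %.1f permil" % ((R_cum / R_PDB - 1) * 1000))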
Introducing Simulation via the Theory of Records
ERIC Educational Resources Information Center
Johnson, Arvid C.
2011-01-01
While spreadsheet simulation can be a useful method by which to help students to understand some of the more advanced concepts in an introductory statistics course, introducing the simulation methodology at the same time as these concepts can result in student cognitive overload. This article describes a spreadsheet model that has been…
The Spreadsheet in an Educational Setting. Microcomputing Working Paper Series F 84-4.
ERIC Educational Resources Information Center
Wozny, Lucy
This overview of a specific spreadsheet, Microsoft's Multiplan for the Apple Macintosh microcomputer, emphasizes specific features that are important to the academic community, including the mathematical functions of algebra, trigonometry, and statistical analysis. Additional features are summarized, including data formats for both numerical and…
Forming Conjectures within a Spreadsheet Environment
ERIC Educational Resources Information Center
Calder, Nigel; Brown, Tony; Hanley, Una; Darby, Susan
2006-01-01
This paper is concerned with the use of spreadsheets within mathematical investigational tasks. Considering the learning of both children and pre-service teaching students, it examines how mathematical phenomena can be seen as a function of the pedagogical media through which they are encountered. In particular, it shows how pedagogical apparatus…
Domestic Disasters and Geospatial Technology for the Defense Logistics Agency
2014-12-01
total distance traveled and satisfy all fuel demands. This report used the Vehicle Routing Problem (VRP) Spreadsheet Solver developed by Erdogan (2013).
Erdogan, G. (2013). VRP spreadsheet solver. Retrieved from VeRoLog: EURO Working Group on Vehicle Routing and Logistics
Interactive Spreadsheets in JCE Webware
ERIC Educational Resources Information Center
Coleman, William F.; Fedosky, Edward W.
2005-01-01
A description of the Microsoft Excel spreadsheet simulation, Anharmonicity.xls that can be used to smoothly and continuously switch a plotted function and its quadratic approximation is presented. It can be used in a classroom demonstration or incorporated into a student-centered computer-laboratory exercise to examine the qualitative behavior of…
Spreadsheet Applications: Prototyping an Innovative Blended Course
ERIC Educational Resources Information Center
Baker, J. Howard
2004-01-01
After teaching the advanced spreadsheet course at a major university in Louisiana as a traditional classroom course for a number of years, it was decided to create a prototype-blended course, with a considerable portion offered via distance education. This research, which uses a prototyping methodology, is exploratory in nature. Prototyping can…
LOTUS 1-2-3 and Decision Support: Allocating the Monograph Budget.
ERIC Educational Resources Information Center
Perry-Holmes, Claudia
1985-01-01
Describes the use of electronic spreadsheet software for library decision support systems using personal computers. Discussion covers templates, formulas for allocating the materials budget, LOTUS 1-2-3 and budget allocations, choosing a formula, the spreadsheet itself, graphing capabilities, and advantages and disadvantages of templates. Six…
Triangular Plots and Spreadsheet Software.
ERIC Educational Resources Information Center
Holm, Paul Eric
1988-01-01
Describes how the limitations of the built-in graphics capabilities of spreadsheet software can be overcome by making full use of the flexibility of the graphics options. Uses triangular plots with labeled field boundaries produced using Lotus 1-2-3 to demonstrate these techniques and their use in teaching geology. (CW)
Calculating the Variables of Finance on a Spreadsheet.
ERIC Educational Resources Information Center
Rochowicz, John A., Jr.
The different approaches for solving problems and learning mathematics with technology are invaluable. This paper describes how to determine the variables of the ordinary annuity equation with a spreadsheet. Examples of future value of annuity, sinking fund annuity, the number of periods necessary for periodic payments plus interest to accumulate…
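Two of the calculations named here, the future value of an ordinary annuity and the number of periods needed to reach a target value, follow directly from the standard annuity formula. A minimal sketch with illustrative numbers:

# Future value of an ordinary annuity and the number of periods needed to reach
# a target value, using the standard closed-form annuity formulas.
from math import log

def annuity_fv(pmt, rate, n):
    """Future value of an ordinary annuity: PMT * ((1+i)^n - 1) / i."""
    return pmt * ((1 + rate) ** n - 1) / rate

def periods_to_reach(fv_target, pmt, rate):
    """Solve FV = PMT * ((1+i)^n - 1) / i for n."""
    return log(1 + fv_target * rate / pmt) / log(1 + rate)

print(annuity_fv(100, 0.005, 120))          # $100/month at 0.5%/month for 10 years
print(periods_to_reach(20000, 100, 0.005))  # months needed to accumulate $20,000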
Hydroshear Simulation Lab Test 2
Bauer, Steve
2014-08-01
This data file is for test 2. In this test, a sample of granite with a pre-cut (man-made) fracture is confined and heated, and differential stress is applied. The maximum temperature in this system development test is 95 °C. Test details are given on the spreadsheets; note that there are two spreadsheets.
Buyers Guide: Communications Software--Overview; Ratings Digest; Reviews; Benchmarks.
ERIC Educational Resources Information Center
Lockwood, Russ; And Others
1988-01-01
Contains articles which review communications software. Includes "Crosstalk Mark 4,""ProComm,""Freeway Advanced,""Windows InTalk,""Relay Silver," and "Smartcom III." Compares in terms of text proprietary, MCI upload, Test ASCII, Spreadsheet Proprietary, Text XMODEM, Spreadsheet XMODEM, MCI Download, Documentation, Support and Service, ease of use,…
Spreadsheet Analysis of Harvesting Systems
R.B. Rummer; B.L. Lanford
1987-01-01
Harvesting systems can be modeled and analyzed on microcomputers using commercially available "spreadsheet" software. The effect of system or external variables on the production rate or system cost can be evaluated and alternative systems can be easily examined. The tedious calculations associated with such analyses are performed by the computer. For users...
Constructing Meanings and Utilities within Algebraic Tasks
ERIC Educational Resources Information Center
Ainley, Janet; Bills, Liz; Wilson, Kirsty
2004-01-01
The Purposeful Algebraic Activity project aims to explore the potential of spreadsheets in the introduction to algebra and algebraic thinking. We discuss two sub-themes within the project: tracing the development of pupils' construction of meaning for variable from arithmetic-based activity, through use of spreadsheets, and into formal algebra,…
Emotional words facilitate lexical but not early visual processing.
Trauer, Sophie M; Kotz, Sonja A; Müller, Matthias M
2015-12-12
Emotional scenes and faces have been shown to capture and bind visual resources at early sensory processing stages, i.e. in early visual cortex. However, emotional words have led to mixed results. In the current study ERPs were assessed simultaneously with steady-state visual evoked potentials (SSVEPs) to measure attention effects on early visual activity in emotional word processing. Neutral and negative words were flickered at 12.14 Hz whilst participants performed a Lexical Decision Task. Emotional word content did not modulate the 12.14 Hz SSVEP amplitude, and neither did word lexicality. However, emotional words affected the ERP. Negative compared to neutral words, as well as words compared to pseudowords, led to enhanced deflections in the P2 time range, indicative of lexico-semantic access. The N400 was reduced for negative compared to neutral words and enhanced for pseudowords compared to words, indicating facilitated semantic processing of emotional words. LPC amplitudes reflected word lexicality and thus the task-relevant response. In line with previous ERP and imaging evidence, the present results indicate that written emotional words are facilitated in processing only subsequent to visual analysis.
Wood fueled boiler financial feasibility user's manual
Robert Govett; Scott Bowe; Terry Mace; Steve Hubbard; John (Rusty) Dramm; Richard Bergman
2005-01-01
"Wood Fueled Boiler Financial Feasibility" is a spreadsheet program designed for easy use on a personal computer. This program provides a starting point for interested parties to perform financial feasibility analysis of a steam boiler system for space heating or process heat. By allowing users to input the conditions applicable to their current or proposed fuel...
Users guide for noble fir bough cruiser.
Roger D. Fight; Keith A. Blatner; Roger C. Chapman; William E. Schlosser
2005-01-01
The bough cruiser spreadsheet was developed to provide a method for cruising noble fir (Abies procera Rehd.) stands to estimate the weight of boughs that might be harvested. No boughs are cut as part of the cruise process. The approach is based on a two-stage sample. The first stage consists of fixed-radius plots that are used to estimate the...
ERIC Educational Resources Information Center
Selwyn, Neil; Henderson, Michael; Chao, Shu-Hua
2015-01-01
The generation, processing and circulation of data in digital form is now an integral aspect of contemporary schooling. Based upon empirical study of two secondary school settings in Australia, this paper considers the different forms of digitally-based "data work" engaged in by school leaders, managers, administrators and teachers. In…
Cash Flow Statement Spreadsheet Modeling Case Using a Prototype System Development Process
ERIC Educational Resources Information Center
Davis, Jefferson T.
2015-01-01
U.S. GAAP and IFRS standards both require a cash flow statement that presents operating, investing and financing net cash flows (FASB, FAS 95, 1987; IASB, IAS 7, 1992). Although students are exposed to the cash flow statement in beginning accounting courses and then study the cash flow statement in more depth in intermediate accounting classes,…
A Switching-Mode Power Supply Design Tool to Improve Learning in a Power Electronics Course
ERIC Educational Resources Information Center
Miaja, P. F.; Lamar, D. G.; de Azpeitia, M.; Rodriguez, A.; Rodriguez, M.; Hernando, M. M.
2011-01-01
The static design of ac/dc and dc/dc switching-mode power supplies (SMPS) relies on a simple but repetitive process. Although specific spreadsheets, available in various computer-aided design (CAD) programs, are widely used, they are difficult to use in educational applications. In this paper, a graphic tool programmed in MATLAB is presented,…
Technology Focus: Using Technology to Promote Equity in Financial Decision Making
ERIC Educational Resources Information Center
Garofalo, Joe; Kitchell, Barbara Ann
2010-01-01
The process of borrowing money can be intimidating to some people. Many feel at the mercy of a loan officer and just accept terms and amounts at face value. A graphing calculator, or spreadsheet, with appropriate knowledge of how to use it, can be an empowering tool to help create a more equitable situation or circumstance. Given the proper…
Active Learning and Student Engagement in the Business Curriculum: Excel Can Be the Answer
ERIC Educational Resources Information Center
McCloskey, Donna W.; Bussom, Lisa
2013-01-01
Business educators are struggling with how better to engage their students in the learning process. At the same time, stakeholders are reporting that business students are ill prepared in problem solving techniques and the effective use of spreadsheets. The systemic use of Excel as a teaching tool in the business curriculum may be the answer to…
Supplemental knowledge acquisition through external product interface for CLIPS
NASA Technical Reports Server (NTRS)
Saito, Tim; Ebaud, Stephen; Loftin, Bowen R.
1990-01-01
Traditionally, the acquisition of knowledge for expert systems has consisted of interviews with the domain or subject matter expert (SME), observation of the domain environment, and information gathering and research, which together constitute a direct form of knowledge acquisition (KA). The knowledge engineer is responsible for accumulating pertinent information and knowledge from the SME(s) for input into the appropriate expert system development tool. The direct KA process may or may not include data or documentation drawn from the SME's surroundings. What distinguishes direct KA from supplemental (indirect) KA is the way data are used. In acquiring supplemental knowledge, the knowledge engineer accesses other types of evidence (manuals, documents, data files, spreadsheets, etc.) that support the reasoning or premises of the SME. When an expert makes a decision in a particular task, one tool used to justify a recommendation might be a spreadsheet total or column figure; locating the specific decision points within that data in the SME's framework constitutes supplemental KA. Data used for a specific purpose in one system or environment can thus serve as supplemental knowledge for another, specifically a CLIPS project.
Automatic Processing of Emotional Words in the Absence of Awareness: The Critical Role of P2
Lei, Yi; Dou, Haoran; Liu, Qingming; Zhang, Wenhai; Zhang, Zhonglu; Li, Hong
2017-01-01
It has been long debated to what extent emotional words can be processed in the absence of awareness. Behavioral studies have shown that the meaning of emotional words can be accessed even without any awareness. However, functional magnetic resonance imaging studies have revealed that emotional words that are unconsciously presented do not activate the brain regions involved in semantic or emotional processing. To clarify this point, we used continuous flash suppression (CFS) and event-related potential (ERP) techniques to distinguish between semantic and emotional processing. In CFS, we successively flashed some Mondrian-style images into one participant's eye steadily, which suppressed the images projected to the other eye. Negative, neutral, and scrambled words were presented to 16 healthy participants for 500 ms. Whenever the participants saw the stimuli—in both visible and invisible conditions—they pressed specific keyboard buttons. Behavioral data revealed that there was no difference in reaction time to negative words and to neutral words in the invisible condition, although negative words were processed faster than neutral words in the visible condition. The ERP results showed that negative words elicited a larger P2 amplitude in the invisible condition than in the visible condition. The P2 component was enhanced for the neutral words compared with the scrambled words in the visible condition; however, the scrambled words elicited larger P2 amplitudes than the neutral words in the invisible condition. These results suggest that the emotional processing of words is more sensitive than semantic processing in the conscious condition. Semantic processing was found to be attenuated in the absence of awareness. Our findings indicate that P2 plays an important role in the unconscious processing of emotional words, which highlights the fact that emotional processing may be automatic and prioritized compared with semantic processing in the absence of awareness. PMID:28473785
Skipped words and fixated words are processed differently during reading.
Eskenazi, Michael A; Folk, Jocelyn R
2015-04-01
The purpose of this study was to investigate whether words are processed differently when they are fixated during silent reading than when they are skipped. According to a serial processing model of eye movement control (e.g., EZ Reader) skipped words are fully processed (Reichle, Rayner, Pollatsek, Behavioral and Brain Sciences, 26(04):445-476, 2003), whereas in a parallel processing model (e.g., SWIFT) skipped words do not need to be fully processed (Engbert, Nuthmann, Richter, Kliegl, Psychological Review, 112(4):777-813, 2005). Participants read 34 sentences with target words embedded in them while their eye movements were recorded. All target words were three-letter, low-frequency, and unpredictable nouns. After the reading session, participants completed a repetition priming lexical decision task with the target words from the reading session included as the repetition prime targets, with presentation of those same words during the reading task acting as the prime. When participants skipped a word during the reading session, their reaction times on the lexical decision task were significantly longer (M = 656.42 ms) than when they fixated the word (M = 614.43 ms). This result provides evidence that skipped words are sometimes not processed to the same degree as fixated words during reading.
Do Chinese Readers Follow the National Standard Rules for Word Segmentation during Reading?
Liu, Ping-Ping; Li, Wei-Jun; Lin, Nan; Li, Xing-Shan
2013-01-01
We conducted a preliminary study to examine whether Chinese readers’ spontaneous word segmentation processing is consistent with the national standard rules of word segmentation based on the Contemporary Chinese language word segmentation specification for information processing (CCLWSSIP). Participants were asked to segment Chinese sentences into individual words according to their prior knowledge of words. The results showed that Chinese readers did not follow the segmentation rules of the CCLWSSIP, and their word segmentation processing was influenced by the syntactic categories of consecutive words. In many cases, the participants did not consider the auxiliary words, adverbs, adjectives, nouns, verbs, numerals and quantifiers as single word units. Generally, Chinese readers tended to combine function words with content words to form single word units, indicating they were inclined to chunk single words into large information units during word segmentation. Additionally, the “overextension of monosyllable words” hypothesis was tested and it might need to be corrected to some degree, implying that word length has an implicit influence on Chinese readers’ segmentation processing. Implications of these results for models of word recognition and eye movement control are discussed. PMID:23408981
Forming conjectures within a spreadsheet environment
NASA Astrophysics Data System (ADS)
Calder, Nigel; Brown, Tony; Hanley, Una; Darby, Susan
2006-12-01
This paper is concerned with the use of spreadsheets within mathematical investigational tasks. Considering the learning of both children and pre-service teaching students, it examines how mathematical phenomena can be seen as a function of the pedagogical media through which they are encountered. In particular, it shows how pedagogical apparatus influence patterns of social interaction, and how this interaction shapes the mathematical ideas that are engaged with. Notions of conjecture, along with the particular faculty of the spreadsheet setting, are considered with regard to the facilitation of mathematical thinking. Employing an interpretive perspective, a key focus is on how alternative pedagogical media and associated discursive networks influence the way that students form and test informal conjectures.
Simulation modeling for the health care manager.
Kennedy, Michael H
2009-01-01
This article addresses the use of simulation software to solve administrative problems faced by health care managers. Spreadsheet add-ins, process simulation software, and discrete event simulation software are available at a range of costs and complexity. All use the Monte Carlo method to realistically integrate probability distributions into models of the health care environment. Problems typically addressed by health care simulation modeling are facility planning, resource allocation, staffing, patient flow and wait time, routing and transportation, supply chain management, and process improvement.
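The Monte Carlo idea described here can be sketched with a single-server waiting-line model: random interarrival and service times are drawn from probability distributions and each patient's wait is propagated with the Lindley recursion. The exponential distributions and rates below are illustrative, not taken from the article.

# A minimal sketch of Monte Carlo simulation for a health care queue: patient
# waiting time at a single check-in desk, using the Lindley recursion
# W_{n+1} = max(0, W_n + S_n - A_{n+1}).
import random

random.seed(42)

def mean_wait(n_patients=100_000, arrival_rate=5.0, service_rate=6.0):
    wait, total = 0.0, 0.0
    for _ in range(n_patients):
        total += wait
        service = random.expovariate(service_rate)   # hours of service
        next_gap = random.expovariate(arrival_rate)  # hours until next arrival
        wait = max(0.0, wait + service - next_gap)
    return total / n_patients

print(f"average wait: {mean_wait() * 60:.1f} minutes")   # M/M/1 theory: ~50 minutes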
ERIC Educational Resources Information Center
Lai, Chiu-Lin; Hwang, Gwo-Jen
2015-01-01
In this study, a spreadsheet-based visualized Mindtool was developed for improving students' learning performance when finding relationships between numerical variables by engaging them in reasoning and decision-making activities. To evaluate the effectiveness of the proposed approach, an experiment was conducted on the "phenomena of climate…
A Computer Spreadsheet for Locating Assistive Devices.
ERIC Educational Resources Information Center
Palmer, Catherine V.; Garstecki, Dean C.
1988-01-01
The article presents a directory of assistive devices for persons with hearing impairments in a grid format by distributor and type of device (alerting devices, telephone, TV/radio/stereo, personal communication, group communication, and other). The product locator is also available in spreadsheet form for either the Macintosh or IBM-PC computers.…
Simulating Satellite and Space Probe Motion at High School with Spreadsheets
ERIC Educational Resources Information Center
Benacka, Jan
2017-01-01
This paper gives an account of an experiment in which thirty-three high school students of ages 17-19 developed spreadsheet numerical models of satellite and space probe motion. The models are free to download. A survey was carried out to find out the students' opinion of the lessons.
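Spreadsheet models of this kind typically step Newton's law of gravitation forward one row at a time. The sketch below reproduces that idea in code with an Euler-Cromer update for a roughly circular low Earth orbit; the time step and initial conditions are illustrative and are not taken from the students' models.

# A minimal sketch of a row-by-row satellite model: Euler-Cromer time-stepping
# of a satellite under Newtonian gravity, starting from a circular low Earth orbit.
import math

GM = 3.986004418e14               # m^3 s^-2, Earth's gravitational parameter
x, y = 6.771e6, 0.0               # ~400 km altitude
vx, vy = 0.0, math.sqrt(GM / x)   # circular-orbit speed
dt = 1.0                          # seconds per step (one spreadsheet row)

for step in range(5600):          # slightly more than one orbital period (~5540 s)
    r = math.hypot(x, y)
    ax, ay = -GM * x / r**3, -GM * y / r**3
    vx += ax * dt                 # update velocity first (Euler-Cromer)
    vy += ay * dt
    x += vx * dt
    y += vy * dt

print(f"radius after one orbit: {math.hypot(x, y) / 1e3:.1f} km")   # ~6771 km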
Using Spreadsheets to Teach Aspects of Biology Involving Mathematical Models
ERIC Educational Resources Information Center
Carlton, Kevin; Nicholls, Mike; Ponsonby, David
2004-01-01
Some aspects of biology, for example the Hardy-Weinberg simulation of population genetics or modelling heat flow in lizards, have an undeniable mathematical basis. Students can find the level of mathematical skill required to deal with such concepts to be an insurmountable hurdle to understanding. If not used effectively, spreadsheet models…
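The Hardy-Weinberg case mentioned here reduces to a short calculation that a spreadsheet (or the sketch below) can tabulate: genotype frequencies p², 2pq and q² from an allele frequency p, plus a check that random mating leaves p unchanged in the next generation.

# A minimal Hardy-Weinberg sketch: genotype frequencies from an allele frequency
# p, and the allele frequency recovered from the offspring generation.
def hardy_weinberg(p):
    q = 1.0 - p
    AA, Aa, aa = p * p, 2 * p * q, q * q
    p_next = AA + 0.5 * Aa        # allele frequency among offspring
    return AA, Aa, aa, p_next

for p in (0.1, 0.5, 0.8):
    AA, Aa, aa, p_next = hardy_weinberg(p)
    print(f"p={p:.1f}: AA={AA:.2f} Aa={Aa:.2f} aa={aa:.2f} p'={p_next:.2f}")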
Diary of a Conversion--Lotus 1-2-3 to Symphony 1.1.
ERIC Educational Resources Information Center
Dunnewin, Larry
1986-01-01
Describes the uses of Lotus 1-2-3 (a spreadsheet-graphics-database program created by Lotus Development Corporation) and Symphony 1.1 (a refinement and expansion of Symphony 1.01 providing memory efficiency, speed, ease of use, greater file compatibility). Spreadsheet and graphics capabilities, the use of windows, database environment, and…
Using Spreadsheet Modeling Techniques for Capital Project Review. AIR 1985 Annual Forum Paper.
ERIC Educational Resources Information Center
Kaynor, Robert K.
The value of microcomputer modeling tools and spreadsheets to help college institutional researchers analyze proposed capital projects is discussed, along with strengths and weaknesses of different software packages. Capital budgeting is the analysis that supports decisions about the allocation and commitment of funds to long-term capital…
Negative Effects of Learning Spreadsheet Management on Learning Database Management
ERIC Educational Resources Information Center
Vágner, Anikó; Zsakó, László
2015-01-01
A lot of students learn spreadsheet management before database management. Their similarities can cause a lot of negative effects when learning database management. In this article, we consider these similarities and explain what can cause problems. First, we analyse the basic concepts such as table, database, row, cell, reference, etc. Then, we…
Spreadsheets as a Transparent Resource for Learning the Mathematics of Annuities
ERIC Educational Resources Information Center
Pournara, Craig
2009-01-01
The ability of mathematics teachers to decompress mathematics and to move between representations are two key features of mathematical knowledge that is usable for teaching. This article reports on four pre-service secondary mathematics teachers learning the mathematics of annuities. In working with spreadsheets students began to make sense of…
A Simple Spreadsheet Strikes a Nerve among Adjuncts
ERIC Educational Resources Information Center
Stratford, Michael
2012-01-01
Energized by his fellow adjunct professors who had gathered for a national meeting last month in Washington, District of Columbia, Joshua A. Boldt flew home to Athens, Georgia, opened his laptop, and created a Google document. On his personal blog, the writing instructor implored colleagues to contribute to the publicly editable spreadsheet,…
Studying Faculty Flows Using an Interactive Spreadsheet Model. AIR 1997 Annual Forum Paper.
ERIC Educational Resources Information Center
Kelly, Wayne
This paper describes a spreadsheet-based faculty flow model developed and implemented at the University of Calgary (Canada) to analyze faculty retirement, turnover, and salary issues. The study examined whether, given expected faculty turnover, the current salary increment system was sustainable in a stable or declining funding environment, and…
Transition Matrices: A Tool to Assess Student Learning and Improve Instruction
ERIC Educational Resources Information Center
Morris, Gary A.; Walter, Paul; Skees, Spencer; Schwartz, Samantha
2017-01-01
This paper introduces a new spreadsheet tool for adoption by high school or college-level physics teachers who use common assessments in a pre-instruction/post-instruction mode to diagnose student learning and teaching effectiveness. The spreadsheet creates a simple matrix that identifies the percentage of students who select each possible…
ERIC Educational Resources Information Center
Agyei, Douglas D.; Voogt, Joke M.
2016-01-01
In this study, 12 pre-service mathematics teachers worked in teams to develop their knowledge and skills in using teacher-led spreadsheet demonstrations to help students explore mathematics concepts, stimulate discussions and perform authentic tasks through activity-based lessons. Pre-service teachers' lesson plans, their instruction of the…
Examining Errors in Simple Spreadsheet Modeling from Different Research Perspectives
ERIC Educational Resources Information Center
Kadijevich, Djordje M.
2012-01-01
By using a sample of 1st-year undergraduate business students, this study dealt with the development of simple (deterministic and non-optimization) spreadsheet models of income statements within an introductory course on business informatics. The study examined students' errors in doing this for business situations of their choice and found three…
Using Spreadsheets to Discover Meaning for Parameters in Nonlinear Models
ERIC Educational Resources Information Center
Green, Kris H.
2008-01-01
This paper explores the use of spreadsheets to develop an exploratory environment where mathematics students can develop their own understanding of the parameters of commonly encountered families of functions: linear, logarithmic, exponential and power. The key to this understanding involves opening up the definition of rate of change from the…
Using a Spreadsheet Scroll Bar to Solve Equilibrium Concentrations
ERIC Educational Resources Information Center
Raviolo, Andres
2012-01-01
A simple, conceptual method is described for using the spreadsheet scroll bar to find the composition of a system at chemical equilibrium. Simulation of any kind of chemical equilibrium can be carried out using this method, and the effects of different disturbances can be predicted. This simulation, which can be used in general chemistry…
Evolving Polygons and Spreadsheets: Connecting Mathematics across Grade Levels in Teacher Education
ERIC Educational Resources Information Center
Abramovich, Sergei; Brouwer, Peter
2009-01-01
This paper was prepared in response to the Conference Board of Mathematical Sciences recommendations for the preparation of secondary teachers. It shows how using trigonometry as a conceptual tool in spreadsheet-based applications enables one to develop mathematical understanding in the context of constructing geometric representations of unit…
Excel Yourself with Personalised Email Messages
ERIC Educational Resources Information Center
McClean, Stephen
2008-01-01
Combining the Excel spreadsheet with an email program provides a very powerful tool for sending students personalised emails. Most email clients now support a Mail Merge facility whereby a generic template is created and information unique to each student record in the spreadsheet is filled into that template, generating tens if not hundreds of…
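The mail-merge idea described can be sketched outside Excel as well: read each student record from a CSV exported from the marks spreadsheet and fill a generic message template per row. The column names below are hypothetical, and the sketch only prints the messages rather than sending them.

# A minimal mail-merge sketch: fill a message template from rows of a CSV
# exported from a spreadsheet. Column names are hypothetical; nothing is sent.
import csv
import io

TEMPLATE = "Dear {name},\n\nYour mark for {module} is {mark}%.\n"

# Stand-in for a CSV file exported from the marks spreadsheet.
csv_data = io.StringIO(
    "name,email,module,mark\n"
    "Aisha,aisha@example.edu,CHEM101,72\n"
    "Ben,ben@example.edu,CHEM101,58\n"
)

for row in csv.DictReader(csv_data):
    body = TEMPLATE.format(**row)
    print(f"To: {row['email']}\n{body}")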
NASA Astrophysics Data System (ADS)
Locock, Andrew J.; Mitchell, Roger H.
2018-04-01
Perovskite mineral oxides commonly exhibit extensive solid-solution, and are therefore classified on the basis of the proportions of their ideal end-members. A uniform sequence of calculation of the end-members is required if comparisons are to be made between different sets of analytical data. A Microsoft Excel spreadsheet has been programmed to assist with the classification and depiction of the minerals of the perovskite- and vapnikite-subgroups following the 2017 nomenclature of the perovskite supergroup recommended by the International Mineralogical Association (IMA). Compositional data for up to 36 elements are input into the spreadsheet as oxides in weight percent. For each analysis, the output includes the formula, the normalized proportions of 15 end-members, and the percentage of cations which cannot be assigned to those end-members. The data are automatically plotted onto the ternary and quaternary diagrams recommended by the IMA for depiction of perovskite compositions. Up to 200 analyses can be entered into the spreadsheet, which is accompanied by data calculated for 140 perovskite compositions compiled from the literature.
Chen, Peiyao; Lin, Jie; Chen, Bingle; Lu, Chunming; Guo, Taomei
2015-10-01
Emotional words in a bilingual's second language (L2) seem to have less emotional impact compared to emotional words in the first language (L1). The present study examined the neural mechanisms of emotional word processing in Chinese-English bilinguals' two languages by using both event-related potentials (ERPs) and functional magnetic resonance imaging (fMRI). Behavioral results show a robust positive word processing advantage in L1 such that responses to positive words were faster and more accurate compared to responses to neutral words and negative words. In L2, emotional words only received higher accuracies than neutral words. In ERPs, positive words elicited a larger early posterior negativity and a smaller late positive component than neutral words in L1, while a trend of reduced N400 component was found for positive words compared to neutral words in L2. In fMRI, reduced activation was found for L1 emotional words in both the left middle occipital gyrus and the left cerebellum whereas increased activation in the left cerebellum was found for L2 emotional words. Altogether, these results suggest that emotional word processing advantage in L1 relies on rapid and automatic attention capture while facilitated semantic retrieval might help processing emotional words in L2. Copyright © 2015 Elsevier Ltd. All rights reserved.
The effect of sign language structure on complex word reading in Chinese deaf adolescents.
Lu, Aitao; Yu, Yanping; Niu, Jiaxin; Zhang, John X
2015-01-01
The present study was carried out to investigate whether sign language structure plays a role in the processing of complex words (i.e., derivational and compound words), in particular, the delay of complex word reading in deaf adolescents. Chinese deaf adolescents were found to respond faster to derivational words than to compound words for one-sign-structure words, but showed comparable performance for two-sign-structure words. For both derivational and compound words, response latencies to one-sign-structure words were shorter than to two-sign-structure words. These results provide strong evidence that the structure of sign language affects written word processing in Chinese. Additionally, differences between derivational and compound words in the one-sign-structure condition indicate that Chinese deaf adolescents acquire print morphological awareness. The results also showed that delayed word reading was found in derivational words with two signs (DW-2), compound words with one sign (CW-1), and compound words with two signs (CW-2), but not in derivational words with one sign (DW-1), with the delay being maximum in DW-2, medium in CW-2, and minimum in CW-1, suggesting that the structure of sign language has an impact on the delayed processing of Chinese written words in deaf adolescents. These results provide insight into the mechanisms about how sign language structure affects written word processing and its delayed processing relative to their hearing peers of the same age.
NASA Astrophysics Data System (ADS)
Grose, C. J.
2008-05-01
Numerical geodynamics models of heat transfer are typically thought of as specialized topics of research requiring knowledge of specialized modelling software, Linux platforms, and state-of-the-art finite-element codes. I have implemented analytical and numerical finite-difference techniques with Microsoft Excel 2007 spreadsheets to solve complex solid-earth heat transfer problems for use by students, teachers, and practicing scientists without specialty in geodynamics modelling techniques and applications. While implementation of equations for use in Excel spreadsheets is occasionally cumbersome, once the case boundary structure and node equations are developed, spreadsheet manipulation becomes routine. Model experimentation by modifying parameter values, geometry, and grid resolution makes Excel a useful tool whether in the classroom at the undergraduate or graduate level or for more engaging student projects. Furthermore, the ability to incorporate complex geometries and heat-transfer characteristics makes it ideal for first- and occasionally higher-order geodynamics simulations to better understand and constrain the results of professional field research in a setting that does not require the constraints of state-of-the-art modelling codes. The straightforward expression and manipulation of model equations in Excel can also serve as a medium to better understand the confusing notation of advanced mathematical problems. To illustrate the power and robustness of computation and visualization in spreadsheet models, I focus primarily on one-dimensional analytical and two-dimensional numerical solutions to two case problems: (i) the cooling of oceanic lithosphere and (ii) temperatures within subducting slabs. Excel source documents will be made available.
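As an illustration of the node-equation approach described (not the author's spreadsheet, which also includes analytical and two-dimensional solutions), the sketch below applies the explicit finite-difference update to case (i), treating cooling oceanic lithosphere as a one-dimensional half-space. All parameter values are illustrative, and stability requires kappa*dt/dz^2 <= 0.5.

# A minimal 1-D explicit finite-difference sketch of cooling oceanic lithosphere:
# T_i(new) = T_i + (kappa*dt/dz^2) * (T_{i+1} - 2*T_i + T_{i-1}).
import numpy as np

kappa = 1e-6                  # m^2/s, thermal diffusivity
dz = 2_000.0                  # m, node spacing (one spreadsheet row per node)
dt = 0.4 * dz**2 / kappa      # s, keeps kappa*dt/dz^2 below the 0.5 stability limit
n_nodes = 100                 # a 200 km deep column
T_surf, T_mantle = 0.0, 1300.0

T = np.full(n_nodes, T_mantle)
T[0] = T_surf                                   # fixed sea-floor temperature
r = kappa * dt / dz**2
n_steps = int(50.0 * 3.15e13 / dt)              # ~50 Myr of cooling
for _ in range(n_steps):
    T[1:-1] += r * (T[2:] - 2 * T[1:-1] + T[:-2])

print("temperature at 50 km depth after ~50 Myr: %.0f deg C" % T[25])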
Owens, John
2009-01-01
Technological advances in the acquisition of DNA and protein sequence information and the resulting onrush of data can quickly overwhelm the scientist unprepared for the volume of information that must be evaluated and carefully dissected to discover its significance. Few laboratories have the luxury of dedicated personnel to organize, analyze, or consistently record a mix of arriving sequence data. A methodology based on a modern relational-database manager is presented that is both a natural storage vessel for antibody sequence information and a conduit for organizing and exploring sequence data and accompanying annotation text. The expertise necessary to implement such a plan is equal to that required by electronic word processors or spreadsheet applications. Antibody sequence projects maintained as independent databases are selectively unified by the relational-database manager into larger database families that contribute to local analyses, reports, interactive HTML pages, or exported to facilities dedicated to sophisticated sequence analysis techniques. Database files are transposable among current versions of Microsoft, Macintosh, and UNIX operating systems.
Word Reading Aloud Skills: Their Positive Redefinition through Ageing
ERIC Educational Resources Information Center
Chapleau, Marianne; Wilson, Maximiliano A.; Potvin, Karel; Harvey-Langton, Alexandra; Montembeault, Maxime; Brambati, Simona M.
2017-01-01
Background: Successful reading can be achieved by means of two different procedures: sub-word processes for the pronunciation of words without semantics or pseudowords (PW) and whole-word processes that recruit word-specific information regarding the pronunciation of words with atypical orthography-to-phonology mappings (exception words, EW).…
The time course of morphological processing during spoken word recognition in Chinese.
Shen, Wei; Qu, Qingqing; Ni, Aiping; Zhou, Junyi; Li, Xingshan
2017-12-01
We investigated the time course of morphological processing during spoken word recognition using the printed-word paradigm. Chinese participants were asked to listen to a spoken disyllabic compound word while simultaneously viewing a printed-word display. Each visual display consisted of three printed words: a semantic associate of the first constituent of the compound word (morphemic competitor), a semantic associate of the whole compound word (whole-word competitor), and an unrelated word (distractor). Participants were directed to detect whether the spoken target word was on the visual display. Results indicated that both the morphemic and whole-word competitors attracted more fixations than the distractor. More importantly, the morphemic competitor began to diverge from the distractor immediately at the acoustic offset of the first constituent, which was earlier than the whole-word competitor. These results suggest that lexical access to the auditory word is incremental and morphological processing (i.e., semantic access to the first constituent) that occurs at an early processing stage before access to the representation of the whole word in Chinese.
Emotional words can be embodied or disembodied: the role of superficial vs. deep types of processing
Abbassi, Ensie; Blanchette, Isabelle; Ansaldo, Ana I.; Ghassemzadeh, Habib; Joanette, Yves
2015-01-01
Emotional words are processed rapidly and automatically in the left hemisphere (LH) and slowly, with the involvement of attention, in the right hemisphere (RH). This review aims to find the reason for this difference and suggests that emotional words can be processed superficially or deeply due to the involvement of the linguistic and imagery systems, respectively. During superficial processing, emotional words likely make connections only with semantically associated words in the LH. This part of the process is automatic and may be sufficient for the purpose of language processing. Deep processing, in contrast, seems to involve conceptual information and imagery of a word’s perceptual and emotional properties using autobiographical memory contents. Imagery and the involvement of autobiographical memory likely differentiate between emotional and neutral word processing and explain the salient role of the RH in emotional word processing. It is concluded that the level of emotional word processing in the RH should be deeper than in the LH and, thus, it is conceivable that the slow mode of processing adds certain qualities to the output. PMID:26217288
Cao, Hong-Wen; Yang, Ke-Yu; Yan, Hong-Mei
2017-01-01
Character order information is encoded at the initial stage of Chinese word processing, however, its time course remains underspecified. In this study, we assess the exact time course of the character decomposition and transposition processes of two-character Chinese compound words (canonical, transposed, or reversible words) compared with pseudowords using dual-target rapid serial visual presentation (RSVP) of stimuli appearing at 30 ms per character with no inter-stimulus interval. The results indicate that Chinese readers can identify words with character transpositions in rapid succession; however, a transposition cost is involved in identifying transposed words compared to canonical words. In RSVP reading, character order of words is more likely to be reversed during the period from 30 to 180 ms for canonical and reversible words, but the period from 30 to 240 ms for transposed words. Taken together, the findings demonstrate that the holistic representation of the base word is activated, however, the order of the two constituent characters is not strictly processed during the very early stage of visual word processing.
Involuntary awareness and implicit priming: role of retrieval context.
Zhou, Renlai; Hu, Senqi; Sun, Xuefei; Huang, Junhong
2006-10-01
This study examined the role of retrieval context in implicit priming by manipulating percentage of word-stem index as shallow and deep processing while performing a word-stem completion task. 80 subjects were randomly divided into four groups each of 20 subjects: shallow processing or deep processing with few retrieval indices, and shallow processing or deep processing with many retrieval indices. Analysis indicated that proportion of word-stem completion was significantly higher for studied words than for nonstudied words in all four groups and that the subjects in the groups with many retrieval indices had a significantly increased proportion of word-stem completion between studied and nonstudied words than those in the groups with few retrieval indices. Postquestionnaire analysis indicated that more previously studied items were retrieved if many studied items were available during implicit word-stem completion and that only a small proportion of word-stem completion was finished with studied words by the subjects who were aware of the prior studied and test word relations in all four groups. It was concluded that having more studied words retrievable contributed to more being retrieved and that involuntary awareness had very limited influence on the priming in the implicit word-stem completion.
Robinson, Amanda K; Plaut, David C; Behrmann, Marlene
2017-07-01
Words and faces have vastly different visual properties, but increasing evidence suggests that word and face processing engage overlapping distributed networks. For instance, fMRI studies have shown overlapping activity for face and word processing in the fusiform gyrus despite well-characterized lateralization of these objects to the left and right hemispheres, respectively. To investigate whether face and word perception influences perception of the other stimulus class and elucidate the mechanisms underlying such interactions, we presented images using rapid serial visual presentations. Across 3 experiments, participants discriminated 2 face, word, and glasses targets (T1 and T2) embedded in a stream of images. As expected, T2 discrimination was impaired when it followed T1 by 200 to 300 ms relative to longer intertarget lags, the so-called attentional blink. Interestingly, T2 discrimination accuracy was significantly reduced at short intertarget lags when a face was followed by a word (face-word) compared with glasses-word and word-word combinations, indicating that face processing interfered with word perception. The reverse effect was not observed; that is, word-face performance was no different than the other object combinations. EEG results indicated the left N170 to T1 was correlated with the word decrement for face-word trials, but not for other object combinations. Taken together, the results suggest face processing interferes with word processing, providing evidence for overlapping neural mechanisms of these 2 object types. Furthermore, asymmetrical face-word interference points to greater overlap of face and word representations in the left than the right hemisphere. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Assessment of the lumber drying industry and current potential for value-added processing in Alaska.
David L. Nicholls; Kenneth A. Kilborn
2001-01-01
An assessment was done of the lumber drying industry in Alaska. Part 1 of the assessment included an evaluation of kiln capacity, kiln type, and species dried, by geographic region of the state. Part 2 of the assessment considered the value-added potential associated with lumber drying. Various costs related to lumber drying were evaluated in an Excel spreadsheet....
Electronic spreadsheet vs. manual payroll.
Kiley, M M
1991-01-01
Medical groups with direct employees must employ someone or contract with a company to compute payroll, writes Michael Kiley, Ph.D., M.P.H. However, many medical groups, including small ones, own a personal or minicomputer to handle accounts receivable. Kiley explains, in detail, how this same computer and a spreadsheet program also can be used to perform payroll functions.
ERIC Educational Resources Information Center
Caputi, Peter; Chan, Amy; Jayasuriya, Rohan
2011-01-01
This paper examined the impact of training strategies on the types of errors that novice users make when learning a commonly used spreadsheet application. Fifty participants were assigned to a counterfactual thinking training (CFT) strategy, an error management training strategy, or a combination of both strategies, and completed an easy task…
ERIC Educational Resources Information Center
Peterlin, Primoz
2010-01-01
Two methods of data analysis are compared: spreadsheet software and a statistics software suite. Their use is compared analysing data collected in three selected experiments taken from an introductory physics laboratory, which include a linear dependence, a nonlinear dependence and a histogram. The merits of each method are compared. (Contains 7…
ERIC Educational Resources Information Center
Davies, Randall S.; Dean, Douglas L.; Ball, Nick
2013-01-01
The purpose of this research was to explore how technology can be used to teach technological skills and to determine what benefit "flipping" the classroom might have for students taking an introductory-level college course on spreadsheets in terms of student achievement and satisfaction with the class. A pretest posttest…
ERIC Educational Resources Information Center
Agyei, Douglas D.; Voogt, Joke M.
2015-01-01
This article explored the impact of strategies applied in a mathematics instructional technology course for developing technology integration competencies, in particular in the use of spreadsheets, in pre-service teachers. In this respect, 104 pre-service mathematics teachers from a teacher training programme in Ghana enrolled in the mathematics…
A Computer Simulation Using Spreadsheets for Learning Concept of Steady-State Equilibrium
ERIC Educational Resources Information Center
Sharda, Vandana; Sastri, O. S. K. S.; Bhardwaj, Jyoti; Jha, Arbind K.
2016-01-01
In this paper, we present a simple spreadsheet based simulation activity that can be performed by students at the undergraduate level. This simulation is implemented in free open source software (FOSS) LibreOffice Calc, which is available for both Windows and Linux platform. This activity aims at building the probability distribution for the…
Handling Math Expressions in Economics: Recoding Spreadsheet Teaching Tool of Growth Models
ERIC Educational Resources Information Center
Moro-Egido, Ana I.; Pedauga, Luis E.
2017-01-01
In the present paper, we develop a teaching methodology for economic theory. The main contribution of this paper relies on combining the interactive characteristics of spreadsheet programs such as Excel and Unicode plain-text linear format for mathematical expressions. The advantage of Unicode standard rests on its ease for writing and reading…
A Simple Spreadsheet Program for the Calculation of Lattice-Site Distributions
ERIC Educational Resources Information Center
McCaffrey, John G.
2009-01-01
A simple spreadsheet program is presented that can be used by undergraduate students to calculate the lattice-site distributions in solids. A major strength of the method is the natural way in which the correct number of ions or atoms are present, or absent, at specific lattice distances. The expanding-cube method utilized is straightforward to…
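The expanding-cube idea can be sketched for the simplest case, a simple cubic lattice: enumerate every site in a cube centred on the origin and tally the number of sites at each distance, so the shell populations emerge without being entered by hand. This sketch is illustrative only and is not the published spreadsheet.

# A minimal expanding-cube sketch for a simple cubic lattice (lattice constant 1):
# count the sites at each squared distance; the shell populations (6, 12, 8, 6, 24, ...)
# fall out automatically.
from collections import Counter
from itertools import product
from math import sqrt

def shell_counts(n=3):
    counts = Counter()
    for i, j, k in product(range(-n, n + 1), repeat=3):
        if (i, j, k) != (0, 0, 0):
            counts[i * i + j * j + k * k] += 1
    return counts

for d2, count in sorted(shell_counts().items())[:6]:
    print(f"distance {sqrt(d2):.3f} a : {count} sites")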
ERIC Educational Resources Information Center
Horton, Robert M.; Leonard, William H.
2005-01-01
In science, inquiry is used as students explore important and interesting questions concerning the world around them. In mathematics, one contemporary inquiry approach is to create models that describe real phenomena. Creating mathematical models using spreadsheets can help students learn at deep levels in both science and mathematics, and give…
Li, Sara Tze Kwan; Hsiao, Janet Hui-Wen
2018-07-01
Music notation and English word reading both involve mapping horizontally arranged visual components to components in sound, in contrast to reading in logographic languages such as Chinese. Accordingly, music-reading expertise may influence English word processing more than Chinese character processing. Here we showed that musicians named English words significantly faster than non-musicians when words were presented in the left visual field/right hemisphere (RH) or the center position, suggesting an advantage of RH processing due to music reading experience. This effect was not observed in Chinese character naming. A follow-up ERP study showed that in a sequential matching task, musicians had reduced RH N170 responses to English non-words under the processing of musical segments as compared with non-musicians, suggesting a shared visual processing mechanism in the RH between music notation and English non-word reading. This shared mechanism may be related to the letter-by-letter, serial visual processing that characterizes RH English word recognition (e.g., Lavidor & Ellis, 2001), which may consequently facilitate English word processing in the RH in musicians. Thus, music reading experience may have differential influences on the processing of different languages, depending on their similarities in the cognitive processes involved. Copyright © 2018 Elsevier B.V. All rights reserved.
The effects of gender and self-insight on early semantic processing.
Xu, Xu; Kang, Chunyan; Guo, Taomei
2014-01-01
This event-related potential (ERP) study explored individual differences associated with gender and level of self-insight in early semantic processing. Forty-eight Chinese native speakers completed a semantic judgment task with three different categories of words: abstract neutral words (e.g., logic, effect), concrete neutral words (e.g., teapot, table), and emotion words (e.g., despair, guilt). They then assessed their levels of self-insight. Results showed that women engaged in greater processing than did men. Gender differences also manifested in the relationship between level of self-insight and word processing. For women, level of self-insight was associated with level of semantic activation for emotion words and abstract neutral words, but not for concrete neutral words. For men, level of self-insight was related to processing speed, particularly in response to abstract and concrete neutral words. These findings provide electrophysiological evidence for the effects of gender and self-insight on semantic processing and highlight the need to take into consideration subject variables in related research.
Juhasz, Barbara J; Johnson, Rebecca L; Brewer, Jennifer
2017-04-01
New words enter the language through several word formation processes [see Simonini (Engl J 55:752-757, 1966)]. One such process, blending, occurs when two source words are combined to represent a new concept (e.g., SMOG, BRUNCH, BLOG, and INFOMERCIAL). While there have been examinations of the structure of blends [see Gries (Linguistics 42:639-667, 2004) and Lehrer (Am Speech 73:3-28, 1998)], relatively little attention has been given to how lexicalized blends are recognized and whether this process differs from that for other types of words. In the present study, blend words were matched to non-blend control words on length, familiarity, and frequency. Two tasks were used to examine blend processing: lexical decision and sentence reading. The results demonstrated that blend words were processed differently than non-blend control words. However, the nature of the effect varied as a function of task demands. Blends were recognized more slowly than control words in the lexical decision task but received shorter fixation durations when embedded in sentences.
Ouederni, Monia; Ben Khaled, Monia; Mellouli, Fethi; Ben Fraj, Elhem; Dhouib, Nawel; Yakoub, Ismehen Ben; Abbes, Selem; Mnif, Nejla; Bejaoui, Mohamed
2017-01-01
Thalassemia is a common genetic disorder in Tunisia. Early iron concentration assessment is a crucial and challenging issue. Most annual deaths due to iron overload occur in underdeveloped regions of the world. Limited access to liver and heart MRI monitoring might partially explain these poor prognostic results. Standard software programs are not available in Tunisia. This study is the first to evaluate iron overload in the heart and liver using MRI T2* with an Excel spreadsheet for post-processing. The association of these MRI results with serum ferritin (SF) level and echocardiography was also investigated. One hundred Tunisian transfused thalassemia patients older than 10 years (mean age 16.1 ± 5.2 years) were enrolled in the study. The mean myocardial iron concentration (MIC) was 1.26 ± 1.65 mg/g dw (0.06-8.32). Cardiac T2* (CT2*) was under 20 ms in 30 % of patients and under 10 ms in 21 % of patients. Left ventricular ejection fraction was significantly lower in patients with CT2* <10 ms. Abnormal liver iron concentration (LIC >3 mg/g dw) was found in 95 % of patients. LIC was over 15 mg/g dw in 25 % of patients. MIC was more strongly correlated than CT2* with LIC and serum ferritin. Among patients with SF <1000 μg/l, 13 % had CT2* <20 ms. Our data showed that 30 % of the Tunisian thalassemia major patients enrolled in this cohort had myocardial iron overload despite being treated with iron chelators. SF could not reliably predict iron overload in all thalassemia patients. MRI T2* using an Excel spreadsheet for routine follow-up of iron overload might improve the prognosis of thalassemia major patients in developing countries, such as Tunisia, where standard MRI tools are not available or are expensive.
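The paper's spreadsheet is not reproduced in the abstract; the fragment below is only an illustrative sketch of the core post-processing step such a sheet performs, namely a mono-exponential fit S(TE) = S0·exp(-TE/T2*) to region-of-interest signal measured at several echo times. The echo times and signal values are invented, and only the 20 ms flag echoes the abstract's threshold.

import numpy as np

def fit_t2star(te_ms, signal):
    """Log-linear least-squares fit of S(TE) = S0 * exp(-TE / T2*); returns T2* in ms."""
    te = np.asarray(te_ms, dtype=float)
    s = np.asarray(signal, dtype=float)
    slope, _intercept = np.polyfit(te, np.log(s), 1)
    return -1.0 / slope

te = [2, 4, 6, 8, 10, 12]              # echo times (ms), hypothetical
sig = [520, 400, 310, 240, 185, 143]   # ROI mean signal, hypothetical
t2star = fit_t2star(te, sig)
print(f"T2* = {t2star:.1f} ms")        # a cardiac T2* under 20 ms would flag iron loading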
WQEP - a computer spreadsheet program to evaluate water quality data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liddle, R.G.
1996-12-31
A flexible spreadsheet Water Quality Evaluation Program (WQEP) has been developed for mining companies, consultants, and regulators to interpret the results of water quality sampling. To evaluate hydrologic data properly, unit conversions and chemical calculations must be done, quality control checks are needed, and a complete and up-to-date listing of water quality standards is necessary. This process is time-consuming and tends not to be done for every sample. This program speeds the process by allowing the input of up to 115 chemical parameters from one sample. WQEP compares concentrations with EPA primary and secondary drinking water MCLs or MCLG, EPA warmwater and coldwater acute and chronic aquatic life criteria, irrigation criteria, livestock criteria, EPA human health criteria, and several other categories of criteria. The spreadsheet also allows the input of State or local water standards of interest. Water quality checks include: anion/cation balance, TDS_m/TDS_c (where m = measured and c = calculated), EC_m/EC_c, EC_m/ion sums, TDS_c/EC ratio, TDS_m/EC, EC vs. alkalinity, two hardness values, and EC vs. the sum of cations. WQEP computes the dissolved transport index of 23 parameters, computes ratios of 26 species for trend analysis, calculates non-carbonate alkalinity to adjust the bicarbonate concentration, and calculates 35 interpretive formulas (pE, SAR, S.I., un-ionized ammonia, ionized sulfide HS-, pK_x values, etc.). Fingerprinting is conducted by automatic generation of Stiff diagrams and ion histograms. Mass loading calculations, mass balance calculations, conversions of concentrations, ionic strength, and the activity coefficient and chemical activity of 33 parameters are also calculated. This program allows a speedy and thorough evaluation of water quality data from metal mines, coal mining, and natural surface water systems and has been tested against hand calculations.
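As a concrete illustration of one of the listed quality-control checks, the sketch below computes an anion/cation charge balance in Python; the equivalent weights are standard values, while the exact parameter list and acceptance tolerance used by WQEP are assumptions here.

EQ_WEIGHT = {            # mg per milliequivalent
    "Ca": 20.04, "Mg": 12.15, "Na": 22.99, "K": 39.10,
    "Cl": 35.45, "SO4": 48.03, "HCO3": 61.02,
}
CATIONS = ("Ca", "Mg", "Na", "K")
ANIONS = ("Cl", "SO4", "HCO3")

def charge_balance_error(sample_mg_per_l):
    """Convert mg/L to meq/L and return the charge balance error in percent."""
    meq = {ion: sample_mg_per_l[ion] / EQ_WEIGHT[ion] for ion in sample_mg_per_l}
    cations = sum(meq[i] for i in CATIONS)
    anions = sum(meq[i] for i in ANIONS)
    return 100.0 * (cations - anions) / (cations + anions)

sample = {"Ca": 80, "Mg": 24, "Na": 46, "K": 4, "Cl": 71, "SO4": 96, "HCO3": 244}
err = charge_balance_error(sample)
print(f"charge balance error = {err:+.1f}%  ({'OK' if abs(err) <= 5 else 'check data'})")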
Tutorial: simulating chromatography with Microsoft Excel Macros.
Kadjo, Akinde; Dasgupta, Purnendu K
2013-04-22
Chromatography is one of the cornerstones of modern analytical chemistry; developing an instinctive feeling for how chromatography works will be invaluable to future generations of chromatographers. Specialized software programs exist that handle and manipulate chromatographic data; there are also some that simulate chromatograms. However, the algorithmic details of such software are not transparent to a beginner. In contrast, how spreadsheet tools like Microsoft Excel™ work is well understood and the software is nearly universally available. We show that the simple repetition of an equilibration process at each plate (a spreadsheet row) followed by discrete movement of the mobile phase down by a row, easily automated by a subroutine (a "Macro" in Excel), readily simulates chromatography. The process is readily understood by a novice. Not only does this permit simulation of isocratic and simple single-step gradient elution, linear or multistep gradients are also easily simulated. The versatility of a transparent and easily understandable computational platform further enables the simulation of complex but commonly encountered chromatographic scenarios such as the effects of nonlinear isotherms, active sites, column overloading, on-column analyte degradation, etc. These are not as easily simulated by available software. Views of the separation as it develops on the column and as it is seen by an end-column detector are both available in real time. Excel 2010™ also permits a 16-level (4-bit) color gradation of numerical values in a column/row; this permits visualization of a band migrating down the column, much as Tswett may have originally observed, but in a numerical domain. All parameters of relevance (partition constants, elution conditions, etc.) are readily changed so their effects can be examined. Illustrative Excel spreadsheets are given in the Supporting Information; these are easily modified by the user or the user can write his/her own routine. Copyright © 2012 Elsevier B.V. All rights reserved.
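The authors' macros are distributed as Supporting Information and are not reproduced here; the following Python sketch only mirrors the plate-by-plate scheme the tutorial describes (equilibrate each plate, shift the mobile phase down one plate, repeat), with an illustrative plate count and retention factor.

import numpy as np

def simulate_isocratic(n_plates=100, n_steps=450, k=2.0, injected=1.0):
    """Craig-style plate model; k is the per-plate retention factor (Cs/Cm).
    Returns the trace seen by a detector at the column outlet."""
    mobile = np.zeros(n_plates)
    stationary = np.zeros(n_plates)
    mobile[0] = injected                 # inject the sample into the first plate
    fm = 1.0 / (1.0 + k)                 # fraction of analyte in the mobile phase
    trace = []
    for _ in range(n_steps):
        total = mobile + stationary      # equilibrate each plate (spreadsheet row)
        mobile = fm * total
        stationary = (1.0 - fm) * total
        trace.append(mobile[-1])         # mobile phase of the last plate elutes
        mobile = np.roll(mobile, 1)      # mobile phase moves down by one plate
        mobile[0] = 0.0                  # fresh eluent enters at the column head
    return np.array(trace)

trace = simulate_isocratic()
print("peak at step", int(trace.argmax()), "height", round(float(trace.max()), 4))

The peak elutes after roughly n_plates * (1 + k) transfer steps, which is the behaviour the spreadsheet makes visible row by row.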
Using Spreadsheets and Internally Consistent Databases to Explore Thermodynamics
NASA Astrophysics Data System (ADS)
Dasgupta, S.; Chakraborty, S.
2003-12-01
Much common wisdom has been handed down to generations of petrology students in words - a non-exhaustive list may include (a) do not mix data from two different thermodynamic databases, (b) use of different heat capacity functions or extrapolation beyond the P-T range of fit can have disastrous results, (c) consideration of errors in thermodynamic calculations is crucial, (d) consideration of non-ideality, interaction parameters etc. is important in some cases, but not in others. Actual calculations to demonstrate these effects were either too laborious, tedious and time-consuming, or involved elaborate computer programming beyond the reach of the average undergraduate. We have produced "Live" thermodynamic tables in the form of Excel™ spreadsheets based on standard internally consistent thermodynamic databases (e.g. Berman, Holland and Powell) that allow quick, easy and, most importantly, transparent manipulation of thermodynamic data to calculate mineral stabilities and to explore the role of different parameters. We have intentionally avoided the use of advanced tools such as macros, and have set up columns of data that are easy to relate to thermodynamic relationships to enhance transparency. The approach consists of the following basic steps: (i) use a simple supporting spreadsheet to enter mineral compositions (in formula units) to obtain a balanced reaction by matrix inversion; (ii) enter the stoichiometry of this reaction in a designated space, together with a P and T, to obtain the delta G of the reaction; (iii) vary P and/or T to locate equilibrium through a change of sign of delta G. These results can be collected to explore practically any problem of chemical equilibrium and mineral stability. Some of our favorites include (a) hierarchical addition of complexity to equilibrium calculations - start with a simple end member reaction ignoring heat capacity and volume derivatives, add the effects of these, followed by addition of compositional effects in the form of ideal solutions, add non-ideality next and, finally, explore the role of varying parameters in simple models of non-ideality; (b) arbitrarily change (i.e. simulate error in) or mix data from different sources to see the consequences directly. More traditional exercises such as exploration of the slopes of reactions in P-T space are trivial, and other thermodynamic tidbits such as "the bigger the mineral formula, the greater its thermodynamic weight" become apparent to undergraduates early on through such direct handling of data. The overall outcome is a far more quantitative appreciation of mineral stabilities and thermodynamic variables without actually doing any math!
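The "Live" spreadsheets themselves are not shown in the abstract; the short sketch below only illustrates step (iii) at the simplest level of the authors' hierarchy, treating delta-H, delta-S and delta-V of reaction as constants and scanning temperature for the sign change of delta-G. All numerical values are placeholders rather than entries from any internally consistent database.

def delta_g(T, P, dH, dS, dV, P0=1.0e5):
    """Delta-G of reaction (J/mol); dH in J/mol, dS in J/(mol K),
    dV in m^3/mol, P and P0 in Pa. Cp and compressibility terms are ignored."""
    return dH - T * dS + dV * (P - P0)

def equilibrium_temperature(P, dH, dS, dV, t_lo=300.0, t_hi=2000.0, step=1.0):
    """Scan T upward and return the first temperature (K) where delta-G changes sign."""
    t, g_prev = t_lo, delta_g(t_lo, P, dH, dS, dV)
    while t < t_hi:
        t += step
        g = delta_g(t, P, dH, dS, dV)
        if g == 0.0 or (g < 0.0) != (g_prev < 0.0):
            return t
        g_prev = g
    return None       # no equilibrium found in the scanned range

# Hypothetical reaction: dH = 80 kJ/mol, dS = 100 J/(mol K), dV = 2e-5 m^3/mol, P = 5 kbar
print(equilibrium_temperature(P=5.0e8, dH=8.0e4, dS=100.0, dV=2.0e-5), "K")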
[The effect of taboo word on language processing].
Huszár, Tamás; Makra, Emese; Hallgató, Emese; Janacsek, Karolina; Németh, Dezsö
2010-01-01
Knowledge about how we process taboo words brings us closer to the underlying cognitive and emotional processes, and broadens the interpretative framework in psychiatry and psychotherapy. In this study the lexical decision paradigm was used. Subjects were presented with neutral words, taboo words and pseudowords in a random order, and they had to indicate whether the presented word was meaningful (neutral and taboo words) or meaningless (pseudowords). Each target word was preceded by a prime word (either taboo or neutral). The SOA differed in the two experimental conditions (it was 250 msec in the experimental group, and 500 msec in the control group). In the experimental group, response latencies increased for target words that were preceded by taboo prime words, as compared to those that were preceded by neutral prime words. In the control group the prime had no such differential effect on response latencies. The results indicate that emotional processing of taboo words occurs very early and that the negative effect of taboo words on the following lexical decision fades away within 500 msec. Our experiment and other empirical data are presented in this paper.
Parafoveal load of word N+1 modulates preprocessing effectiveness of word N+2 in Chinese reading.
Yan, Ming; Kliegl, Reinhold; Shu, Hua; Pan, Jinger; Zhou, Xiaolin
2010-12-01
Preview benefits (PBs) from two words to the right of the fixated one (i.e., word N + 2) and associated parafoveal-on-foveal effects are critical for proposals of distributed lexical processing during reading. This experiment examined parafoveal processing during reading of Chinese sentences, using a boundary manipulation of N + 2-word preview with low- and high-frequency words N + 1. The main findings were (a) an identity PB for word N + 2 that was (b) primarily observed when word N + 1 was of high frequency (i.e., an interaction between frequency of word N + 1 and PB for word N + 2), and (c) a parafoveal-on-foveal frequency effect of word N + 1 for fixation durations on word N. We discuss implications for theories of serial attention shifts and parallel distributed processing of words during reading.
ERIC Educational Resources Information Center
García-Orza, Javier; Comesaña, Montserrat; Piñeiro, Ana; Soares, Ana Paula; Perea, Manuel
2016-01-01
Recent research has shown that leet words (i.e., words in which some of the letters are replaced by visually similar digits; e.g., VIRTU4L) can be processed as their base words without much cost. However, it remains unclear whether the digits inserted in leet words are simply processed as letters or whether they are simultaneously processed as…
Nonconscious semantic processing of emotional words modulates conscious access
Gaillard, Raphaël; Del Cul, Antoine; Naccache, Lionel; Vinckier, Fabien; Cohen, Laurent; Dehaene, Stanislas
2006-01-01
Whether masked words can be processed at a semantic level remains a controversial issue in cognitive psychology. Although recent behavioral studies have demonstrated masked semantic priming for number words, attempts to generalize this finding to other categories of words have failed. Here, as an alternative to subliminal priming, we introduce a sensitive behavioral method to detect nonconscious semantic processing of words. The logic of this method consists of presenting words close to the threshold for conscious perception and examining whether their semantic content modulates performance in objective and subjective tasks. Our results disclose two independent sources of modulation of the threshold for access to consciousness. First, prior conscious perception of words increases the detection rate of the same words when they are subsequently presented with stronger masking. Second, the threshold for conscious access is lower for emotional words than for neutral ones, even for words that have not been previously consciously perceived, thus implying that written words can receive nonconscious semantic processing. PMID:16648261
Intrusive effects of implicitly processed information on explicit memory.
Sentz, Dustin F; Kirkhart, Matthew W; LoPresto, Charles; Sobelman, Steven
2002-02-01
This study described the interference of implicitly processed information on the memory for explicitly processed information. Participants studied a list of words either auditorily or visually under instructions to remember the words (explicit study). They were then visually presented another word list under instructions which facilitate implicit but not explicit processing. Following a distractor task, memory for the explicit study list was tested with either a visual or auditory recognition task that included new words, words from the explicit study list, and words implicitly processed. Analysis indicated participants both failed to recognize words from the explicit study list and falsely recognized words that were implicitly processed as originating from the explicit study list. However, this effect only occurred when the testing modality was visual, thereby matching the modality for the implicitly processed information, regardless of the modality of the explicit study list. This "modality effect" for explicit memory was interpreted as poor source memory for implicitly processed information in light of the procedures used, as well as illustrating an example of "remembering causing forgetting."
The relation between resource limitations and optional conceptual processing by children and adults.
Ackerman, B P; Spiker, K; Bailey, K
1989-10-01
In some situations children fail to perform optional conceptual processing that they are able to perform. The purpose of the 4 experiments was to determine if the difficulty of word identification affects optional conceptual processing by second/third graders, fifth graders, and college students in a cued recall task. Conceptual processing was manipulated by presenting Hard (e.g., hawk eagle canary) or Easy (river lake canary) word triplets that varied in the contrastive processing necessary to identify the "odd" target word (canary). The orienting activity also varied: for the Oddity Choice activity, contrastive processing was obligatory because the subject had to identify the target; for the Read activity, contrastive processing was optional because the experimenter identified the target. A recall advantage for the Hard over the Easy triplets was the measure of contrastive processing. Finally, the difficulty of word identification varied in that the subjects read the stimuli or the experimenter read the stimuli, and all the words were degraded, only the nontarget words were degraded, or all the words were intact. The results established that contrastive processing facilitates recall, and that word identification difficulty may limit the extent of optional contrastive processing.
ERIC Educational Resources Information Center
Abriata, Luciano A.
2011-01-01
A simple algorithm was implemented in a spreadsheet program to simulate the circular dichroism spectra of proteins from their secondary structure content and to fit α-helix, β-sheet, and random coil contents from experimental far-UV circular dichroism spectra. The physical basis of the method is briefly reviewed within the context of…
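The spreadsheet implementation is not given in the abstract; the fragment below is a hedged sketch of the underlying idea, modelling a far-UV CD spectrum as a linear combination of α-helix, β-sheet and random-coil basis spectra and recovering the fractions by least squares. The basis spectra here are random placeholders, not real reference sets.

import numpy as np

rng = np.random.default_rng(0)
wavelengths = np.arange(190, 251)                 # nm
basis = rng.normal(size=(wavelengths.size, 3))    # columns: helix, sheet, coil (placeholder)

true_fractions = np.array([0.5, 0.3, 0.2])
measured = basis @ true_fractions + rng.normal(scale=0.01, size=wavelengths.size)

fit, *_ = np.linalg.lstsq(basis, measured, rcond=None)
fit = np.clip(fit, 0, None)
fit /= fit.sum()                                  # normalise the fitted fractions to 1
print("fitted helix/sheet/coil fractions:", np.round(fit, 3))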
Trent Wickman; Ann Acheson
2005-01-01
The Smoke Impact Spreadsheet (SIS) is a simple-to-use planning model for calculating particulate matter (PM) emissions and concentrations downwind of wildland fires. This fact sheet identifies the intended users and uses, required inputs, what the model does and does not do, and tells the user how to obtain the model.
Teaching Graphical Simulations of Fourier Series Expansion of Some Periodic Waves Using Spreadsheets
ERIC Educational Resources Information Center
Singh, Iqbal; Kaur, Bikramjeet
2018-01-01
The present article demonstrates a way of programming using an Excel spreadsheet to teach Fourier series expansion in schools/colleges without the knowledge of any typical programming language. By using this, a student learns to approximate the partial sum of the first n terms of the Fourier series for some periodic signals such as the square wave, sawtooth wave,…
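As a worked illustration of the partial sums the article has students tabulate, the sketch below evaluates the first n odd harmonics of a unit square wave, f(x) ≈ (4/π) Σ sin(kx)/k over odd k; the choice of evaluation point is arbitrary.

import math

def square_wave_partial_sum(x, n_terms):
    """Partial Fourier sum of a unit square wave: (4/pi) * sum over odd k of sin(k x)/k."""
    k_values = range(1, 2 * n_terms, 2)            # 1, 3, 5, ..., 2*n_terms - 1
    return (4.0 / math.pi) * sum(math.sin(k * x) / k for k in k_values)

x = math.pi / 2          # the square wave equals +1 here
for n in (1, 3, 10, 50):
    print(f"n = {n:2d}  partial sum = {square_wave_partial_sum(x, n):+.4f}")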
ERIC Educational Resources Information Center
Ray, Darrell L.
2013-01-01
Students often enter biology programs deficient in the math and computational skills that would enhance their attainment of a deeper understanding of the discipline. To address some of these concerns, I developed a series of spreadsheet simulation exercises that focus on some of the mathematical foundations of scientific inquiry and the benefits…
ERIC Educational Resources Information Center
Tekinarslan, Erkan
2013-01-01
The purpose of this study is to investigate the effects of screencasts on the Turkish undergraduate students' achievement and knowledge acquisitions in spreadsheet applications. The methodology of the study is based on a pretest-posttest experimental design with a control group. A total of 66 undergraduate students in two groups (n = 33 in…
E.M. Bilek; Peter Becker; Tim. McAbee
2009-01-01
This documentation is meant to accompany CVal, a downloadable spreadsheet tool. CVal was constructed for foresters, other land management advisors, landowners, and carbon credit aggregators to evaluate the direct benefits and costs of entering into contracts for carbon sequestered in managed forests and forest plantations. CVal was designed to evaluate Exchange...
An Integrated Management Support and Production Control System for Hardwood Forest Products
Guillermo A. Mendoza; Roger J. Meimban; William Sprouse; William G. Luppold; Philip A. Araman
1991-01-01
Spreadsheet and simulation models are tools which enable users to analyze a large number of variables affecting hardwood material utilization and profit in a systematic fashion. This paper describes two spreadsheet models, SEASaw and SEAIn, and a hardwood sawmill simulator. SEASaw is designed to estimate the amount of conversion from timber to lumber, while SEAIn is a...
ERIC Educational Resources Information Center
Halpern, Arthur M.; Glendening, Eric D.
2013-01-01
A three-part project for students in physical chemistry, computational chemistry, or independent study is described in which they explore applications of valence bond (VB) and molecular orbital-configuration interaction (MO-CI) treatments of H₂. Using a scientific spreadsheet, students construct potential-energy (PE) curves for several…
Probabilistic assessment methodology for continuous-type petroleum accumulations
Crovelli, R.A.
2003-01-01
The analytic resource assessment method, called ACCESS (Analytic Cell-based Continuous Energy Spreadsheet System), was developed to calculate estimates of petroleum resources for the geologic assessment model, called FORSPAN, in continuous-type petroleum accumulations. The ACCESS method is based upon mathematical equations derived from probability theory in the form of a computer spreadsheet system. © 2003 Elsevier B.V. All rights reserved.
Numerical Modelling with Spreadsheets as a Means to Promote STEM to High School Students
ERIC Educational Resources Information Center
Benacka, Jan
2016-01-01
The article gives an account of an experiment in which sixty-eight high school students of age 16 - 19 developed spreadsheet applications that simulated fall and projectile motion in the air. The students applied the Euler method to solve the governing differential equations. The aim was to promote STEM to the students and motivate them to study…
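The students' spreadsheets are not reproduced in the abstract; the sketch below only illustrates the explicit Euler scheme they applied, here for projectile motion with quadratic air drag. The drag constant is an arbitrary illustrative value.

import math

def euler_projectile(v0=30.0, angle_deg=45.0, k=0.02, g=9.81, dt=0.001):
    """Explicit Euler integration; k is the drag constant per unit mass (1/m).
    Returns the horizontal range in metres."""
    vx = v0 * math.cos(math.radians(angle_deg))
    vy = v0 * math.sin(math.radians(angle_deg))
    x = y = 0.0
    while True:
        speed = math.hypot(vx, vy)
        ax = -k * speed * vx               # quadratic drag opposes the velocity
        ay = -g - k * speed * vy
        x, y = x + vx * dt, y + vy * dt    # Euler position update
        vx, vy = vx + ax * dt, vy + ay * dt
        if y < 0.0:                        # projectile has returned to launch height
            return x

print(f"range with drag: {euler_projectile():.1f} m "
      f"(vacuum: {30.0**2 * math.sin(math.radians(90)) / 9.81:.1f} m)")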
ERIC Educational Resources Information Center
Kunzler, Jayson S.
2012-01-01
This dissertation describes a research study designed to explore whether customization of online instruction results in improved learning in a college business statistics course. The study involved utilizing computer spreadsheet technology to develop an intelligent tutoring system (ITS) designed to: a) collect and monitor individual real-time…
ERIC Educational Resources Information Center
Thohir, M. Anas
2018-01-01
In the 21st century, the competence of instructional technological design is important for pre-service physics teachers. This case study described the pre-service physics teachers' design of optical spreadsheet simulations and evaluated teaching and learning the task in the classroom. The case study chose three of thirty pre-service teachers'…
Eric van Steenis
2013-01-01
This paper illustrates how to use an Excel spreadsheet as a decision-making tool to determine the optimum sowing factor to minimize seedling production cost. Factors incorporated into the spreadsheet calculations include germination percentage, seeder accuracy, cost per seed, cavities per block, costs of handling, thinning, and transplanting labor, and more. In addition to...
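The actual spreadsheet is not reproduced in the abstract; the fragment below is a heavily simplified, hypothetical sketch of the kind of calculation it encodes, estimating the cost per usable seedling as a function of seeds sown per cavity. Every input figure is invented.

def cost_per_seedling(seeds_per_cavity, germination=0.90, seeder_accuracy=0.95,
                      cost_per_seed=0.03, thinning_cost=0.02, fill_cost=0.10):
    """Toy model: a seed produces a germinant with probability germination * seeder_accuracy."""
    p_seed_grows = germination * seeder_accuracy
    p_empty = (1.0 - p_seed_grows) ** seeds_per_cavity
    expected_extra = seeds_per_cavity * p_seed_grows - (1.0 - p_empty)   # surplus germinants to thin
    cavity_cost = (seeds_per_cavity * cost_per_seed
                   + expected_extra * thinning_cost
                   + p_empty * fill_cost)          # transplanting labour to fill blank cavities
    return cavity_cost / (1.0 - p_empty)

for s in (1, 2, 3):
    print(f"{s} seed(s)/cavity -> ${cost_per_seedling(s):.3f} per usable seedling")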
Development of a spreadsheet for SNPs typing using Microsoft EXCEL.
Hashiyada, Masaki; Itakura, Yukio; Takahashi, Shirushi; Sakai, Jun; Funayama, Masato
2009-04-01
Single-nucleotide polymorphisms (SNPs) have some characteristics that make them very appropriate for forensic studies and applications. In our institute, SNP typing was performed with the TaqMan SNP Genotyping Assays using the ABI PRISM 7500 FAST Real-Time PCR System (Applied Biosystems) and Sequence Detection Software ver. 1.4 (Applied Biosystems). The TaqMan method required two positive controls (Allele 1 and Allele 2) and one negative control to analyze each SNP locus. Therefore, up to 24 loci per person could be analyzed on a 96-well plate at the same time. If SNP analysis is to be applied to biometric authentication, 48 or more loci are required to identify a person. In this study, we designed a spreadsheet package using Microsoft EXCEL, and population data from our 120-SNP population studies were used. On the spreadsheet, we defined SNP types using 'template files' instead of positive and negative controls. The 'template files' consisted of the results of 94 unknown samples and two negative controls for each of the 120 SNP loci we had previously studied. By the use of these files, the spreadsheet could analyze 96 SNPs on a 96-well plate simultaneously.
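The template-file logic itself is not detailed in the abstract, so the sketch below only illustrates the general genotype-calling step a spreadsheet like this can implement, classifying a well from its normalised allele signals; the ratio thresholds and minimum-signal cut-off are hypothetical.

def call_genotype(allele1_signal, allele2_signal, min_signal=0.2):
    """Classify a well from normalised allele-1 / allele-2 fluorescence (hypothetical thresholds)."""
    total = allele1_signal + allele2_signal
    if total < min_signal:
        return "no call"                     # too little signal, like a failed or negative well
    fraction_allele1 = allele1_signal / total
    if fraction_allele1 >= 0.8:
        return "homozygous allele 1"
    if fraction_allele1 <= 0.2:
        return "homozygous allele 2"
    return "heterozygous"

for signals in [(1.9, 0.1), (1.0, 1.1), (0.1, 1.7), (0.05, 0.05)]:
    print(signals, "->", call_genotype(*signals))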
Gandy, Lisa M; Gumm, Jordan; Fertig, Benjamin; Thessen, Anne; Kennish, Michael J; Chavan, Sameer; Marchionni, Luigi; Xia, Xiaoxin; Shankrit, Shambhavi; Fertig, Elana J
2017-01-01
Scientists have unprecedented access to a wide variety of high-quality datasets. These datasets, which are often independently curated, commonly use unstructured spreadsheets to store their data. Standardized annotations are essential to perform synthesis studies across investigators, but are often not used in practice. Therefore, accurately combining records in spreadsheets from differing studies requires tedious and error-prone human curation. These efforts result in a significant time and cost barrier to synthesis research. We propose an information-retrieval-inspired algorithm, Synthesize, that merges unstructured data automatically based on both column labels and values. Application of the Synthesize algorithm to cancer and ecological datasets had high accuracy (on the order of 85-100%). We further implement Synthesize in an open source web application, Synthesizer (https://github.com/lisagandy/synthesizer). The software accepts input as spreadsheets in comma separated value (CSV) format, visualizes the merged data, and outputs the results as a new spreadsheet. Synthesizer includes an easy-to-use graphical user interface, which enables the user to finish combining data and obtain perfect accuracy. Future work will allow detection of units to automatically merge continuous data and application of the algorithm to other data formats, including databases.
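Synthesize's scoring function is not specified in the abstract; the fragment below is a simplified stand-in for the general idea of matching columns by both labels and values, not the published algorithm. The token-based similarity, the 50/50 weighting and the 0.25 threshold are all assumptions.

def jaccard(a, b):
    """Jaccard similarity of two collections treated as sets."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def column_match_score(label1, values1, label2, values2):
    label_score = jaccard(label1.lower().split(), label2.lower().split())
    value_score = jaccard(map(str, values1), map(str, values2))
    return 0.5 * label_score + 0.5 * value_score   # equal weighting is an assumption

sheet_a = {"site name": ["barnegat", "chesapeake"], "salinity (psu)": [28, 15]}
sheet_b = {"name of site": ["barnegat", "delaware"], "salinity": [29, 22]}

for la, va in sheet_a.items():
    best = max(sheet_b.items(), key=lambda kv: column_match_score(la, va, *kv))
    score = column_match_score(la, va, *best)
    if score >= 0.25:                              # merge only confident column pairs
        print(f"merge '{la}' with '{best[0]}'  (score {score:.2f})")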
Yao, Zhao; Yu, Deshui; Wang, Lili; Zhu, Xiangru; Guo, Jingjing; Wang, Zhenhong
2016-12-01
We investigated whether the effects of valence and arousal on emotional word processing are modulated by concreteness using event-related potentials (ERPs). The stimuli included concrete words (Experiment 1) and abstract words (Experiment 2) that were organized in an orthogonal design, with valence (positive and negative) and arousal (low and high) as factors in a lexical decision task. In Experiment 1, the impact of emotion on the effects of concrete words mainly resulted from the contribution of valence. Positive concrete words were processed more quickly than negative words and elicited a reduction of the N400 (300-410 ms) and enhancement of the late positive complex (LPC; 450-750 ms), whereas no differences in response times or ERPs were found between high and low levels of arousal. In Experiment 2, the interaction between valence and arousal influenced the impact of emotion on the effects of abstract words. Low-arousal positive words were associated with shorter response times and a reduction of LPC amplitudes compared with high-arousal positive words. Low-arousal negative words were processed more slowly and elicited a reduction of the N170 (140-200 ms) compared with high-arousal negative words. The present study indicates that word concreteness modulates the contributions of valence and arousal to the effects of emotion, and this modulation occurs during the early perceptual processing stage (N170) and late elaborate processing stage (LPC) for emotional words and at the end of all cognitive processes (i.e., reflected by response times). These findings support an embodied theory of semantic representation and help clarify prior inconsistent findings regarding the ways in which valence and arousal influence different stages of word processing, at least in a lexical decision task. Copyright © 2016 Elsevier B.V. All rights reserved.
Structural and functional correlates for language efficiency in auditory word processing.
Jung, JeYoung; Kim, Sunmi; Cho, Hyesuk; Nam, Kichun
2017-01-01
This study aims to provide a convergent understanding of the neural basis of auditory word processing efficiency using multimodal imaging. We investigated the structural and functional correlates of word processing efficiency in healthy individuals. We acquired two types of structural imaging (T1-weighted imaging and diffusion tensor imaging) and functional magnetic resonance imaging (fMRI) during auditory word processing (phonological and semantic tasks). Our results showed that better phonological performance was predicted by greater thalamus activity. In contrast, better semantic performance was associated with less activation in the left posterior middle temporal gyrus (pMTG), supporting the neural efficiency hypothesis that better task performance requires less brain activation. Furthermore, our network analysis revealed that the semantic network including the left anterior temporal lobe (ATL), dorsolateral prefrontal cortex (DLPFC) and pMTG was correlated with semantic efficiency. In particular, this network operated in a neurally efficient manner during auditory word processing. Structurally, the DLPFC and cingulum contributed to word processing efficiency. The parietal cortex also showed a significant association with word processing efficiency. Our results demonstrate that the two features of word processing efficiency, phonology and semantics, are supported by different brain regions and, importantly, that the way each region serves efficiency differs according to the feature of word processing. Our findings suggest that word processing efficiency is achieved through the collaboration, both structural and functional, of multiple brain regions involved in language and general cognitive function.
Word Processing Curriculum: Attitudes/Skills Business Educators Should Update.
ERIC Educational Resources Information Center
Robertson, Jane R.; West, Judy F.
1984-01-01
Discusses a study to gain data enabling curricula planners and business educators to plan an effective word processing curriculum, to determine basic skills and attitudes needed by word processing operators, and to make recommendations to help word processor operators increase productivity. (JOW)
Wabnitz, Pascal; Martens, Ulla; Neuner, Frank
2016-01-01
Social anxiety disorder (SAD) is associated with heightened sensitivity to threat cues, typically represented by emotional facial expressions. To examine whether this bias reflects a general hypersensitivity or is specific to disorder-relevant cues, we investigated electrophysiological correlates of emotional word processing (alpha activity and event-related potentials) in 20 healthy participants and 20 participants with SAD. The experimental task was a silent reading of neutral, positive, physically threatening and socially threatening words (the latter were abusive swear words) while responding to a randomly presented dot. Subsequently, all participants were asked to recall as many words as possible during an unexpected recall test. Participants with SAD showed blunted sensory processing followed by a rapid processing of emotional words during early stages (early posterior negativity - EPN). At later stages, all participants showed enhanced processing of negative (physically and socially threatening) compared to neutral and positive words (N400). Moreover, at later processing stages alpha activity was increased specifically for negative words in participants with SAD but not in healthy controls. Recall of emotional words for all subjects was best for socially threatening words, followed by negative and positive words irrespective of social anxiety. The present findings indicate that SAD is associated with abnormalities in emotional word processing characterised by early hypervigilance to emotional cues followed by cognitive avoidance at later processing stages. Most importantly, the specificity of these attentional biases seems to change as a function of time with a general emotional bias at early and a more specific bias at later processing stages.
Semantic word category processing in semantic dementia and posterior cortical atrophy.
Shebani, Zubaida; Patterson, Karalyn; Nestor, Peter J; Diaz-de-Grenu, Lara Z; Dawson, Kate; Pulvermüller, Friedemann
2017-08-01
There is general agreement that perisylvian language cortex plays a major role in lexical and semantic processing; but the contribution of additional, more widespread, brain areas in the processing of different semantic word categories remains controversial. We investigated word processing in two groups of patients whose neurodegenerative diseases preferentially affect specific parts of the brain, to determine whether their performance would vary as a function of semantic categories proposed to recruit those brain regions. Cohorts with (i) Semantic Dementia (SD), who have anterior temporal-lobe atrophy, and (ii) Posterior Cortical Atrophy (PCA), who have predominantly parieto-occipital atrophy, performed a lexical decision test on words from five different lexico-semantic categories: colour (e.g., yellow), form (oval), number (seven), spatial prepositions (under) and function words (also). Sets of pseudo-word foils matched the target words in length and bi-/tri-gram frequency. Word-frequency was matched between the two visual word categories (colour and form) and across the three other categories (number, prepositions, and function words). Age-matched healthy individuals served as controls. Although broad word processing deficits were apparent in both patient groups, the deficit was strongest for colour words in SD and for spatial prepositions in PCA. The patterns of performance on the lexical decision task demonstrate (a) general lexicosemantic processing deficits in both groups, though more prominent in SD than in PCA, and (b) differential involvement of anterior-temporal and posterior-parietal cortex in the processing of specific semantic categories of words. Copyright © 2017 The Authors. Published by Elsevier Ltd.. All rights reserved.
Effects of context and individual differences on the processing of taboo words.
Christianson, Kiel; Zhou, Peiyun; Palmer, Cassie; Raizen, Adina
2017-07-01
Previous studies suggest that taboo words are special in regards to language processing. Findings from the studies have led to the formation of two theories, global resource theory and binding theory, of taboo word processing. The current study investigates how readers process taboo words embedded in sentences during silent reading. In two experiments, measures collected include eye movement data, accuracy and reaction time measures for recalling probe words within the sentences, and individual differences in likelihood of being offended by taboo words. Although certain aspects of the results support both theories, as the likelihood of a person being offended by a taboo word influenced some measures, neither theory sufficiently predicts or describes the effects observed. The results are interpreted as evidence that processing effects ascribed to taboo words are largely, but not completely, attributable to the context in which they are used and the individual attitudes of the people who hear/read them. The results also demonstrate the importance of investigating taboo words in naturalistic language processing paradigms. A revised theory of taboo word processing is proposed that incorporates both global resource theory and binding theory along with the sociolinguistic factors and individual differences that largely drive the effects observed here. Copyright © 2017 Elsevier B.V. All rights reserved.
Neural dichotomy of word concreteness: a view from functional neuroimaging.
Kumar, Uttam
2016-02-01
Our perception of the representation and processing of concrete and abstract concepts is based on the fact that concrete words are more readily imagined and are remembered faster than abstract words. In order to explain the processing differences between abstract and concrete concepts, various theories have been proposed, yet there is no unanimous consensus about their neural implementation. The present study investigated the processing of concrete and abstract words during an orthography judgment task (implicit semantic processing) using functional magnetic resonance imaging to validate the involvement of the neural regions. Relative to non-words, both abstract and concrete words showed activation in regions of both hemispheres previously associated with semantic processing. The common areas (conjunction analyses) observed for abstract and concrete words were the bilateral inferior frontal gyrus (BA 44/45), left superior parietal cortex (BA 7), left fusiform gyrus and bilateral middle occipital cortex. Additional areas for abstract words were observed in the bilateral superior temporal and bilateral middle temporal regions, whereas no distinct region was observed for concrete words. This suggests that words with abstract concepts recruit additional language regions in the brain.
DOE Office of Scientific and Technical Information (OSTI.GOV)
St. Onge, Melinda
The Geothermal Resource Portfolio Optimization and Reporting Tool (GeoRePORT) was developed as a way to distill large amounts of geothermal project data into an objective, reportable data set that can be used to communicate with experts and non-experts. GeoRePORT summarizes (1) resource grade and certainty and (2) project readiness. This Excel file allows users to easily navigate through the resource grade attributes, using drop-down menus to pick grades and project readiness, and then easily print and share the summary with others. This spreadsheet is the first draft, for which we are soliciting expert feedback. The spreadsheet will be updated based on this feedback to increase usability of the tool. If you have any comments, please feel free to contact us.
Words and Melody Are Intertwined in Perception of Sung Words: EEG and Behavioral Evidence
Gordon, Reyna L.; Schön, Daniele; Magne, Cyrille; Astésano, Corine; Besson, Mireille
2010-01-01
Language and music, two of the most unique human cognitive abilities, are combined in song, rendering it an ecological model for comparing speech and music cognition. The present study was designed to determine whether words and melodies in song are processed interactively or independently, and to examine the influence of attention on the processing of words and melodies in song. Event-Related brain Potentials (ERPs) and behavioral data were recorded while non-musicians listened to pairs of sung words (prime and target) presented in four experimental conditions: same word, same melody; same word, different melody; different word, same melody; different word, different melody. Participants were asked to attend to either the words or the melody, and to perform a same/different task. In both attentional tasks, different word targets elicited an N400 component, as predicted based on previous results. Most interestingly, different melodies (sung with the same word) elicited an N400 component followed by a late positive component. Finally, ERP and behavioral data converged in showing interactions between the linguistic and melodic dimensions of sung words. The finding that the N400 effect, a well-established marker of semantic processing, was modulated by musical melody in song suggests that variations in musical features affect word processing in sung language. Implications of the interactions between words and melody are discussed in light of evidence for shared neural processing resources between the phonological/semantic aspects of language and the melodic/harmonic aspects of music. PMID:20360991
The Impact of Word Processing on Office Administration in the Medical and Allied Health Professions.
ERIC Educational Resources Information Center
Platt, Naomi Dornfeld
The effect of word processing equipment on the future medical secretarial science curriculum was studied. A literature search focused on word processing and the medical and allied health professions, word processing and business education, and futuring of and changes in the secretarial science curriculum. Questionnaires to identify various aspects…
Ray, N J; Hannigan, A
1999-05-01
As dental practice management becomes more computer-based, the efficient functioning of the dentist will become dependent on adequate computer literacy. A survey has been carried out into the computer literacy of a cohort of 140 undergraduate dental students at a University Dental School in Ireland (years 1-5), in the academic year 1997-98. Aspects investigated by anonymous questionnaire were: (1) keyboard skills; (2) computer skills; (3) access to computer facilities; (4) software competencies; and (5) use of medical library computer facilities. The students are relatively unfamiliar with basic computer hardware and software: 51.1% considered their expertise with computers as "poor"; 34.3% had taken a formal typewriting or computer keyboarding course; 7.9% had taken a formal computer course at university level and 67.2% were without access to computer facilities at their term-time residences. A majority of students had never used word-processing, spreadsheet, or graphics programs. Programs relating to "informatics" were more popular, such as literature searching, accessing the Internet and the use of e-mail, which represent the major use of the computers in the medical library. The lack of experience with computers may be addressed by including suitable computing courses at the secondary level (age 13-18 years) and/or tertiary level (FE/HE) education programmes. Such training may promote greater use of generic software, particularly in the library, with a more electronic-based approach to data handling.
Brysbaert, Marc; Keuleers, Emmanuel; New, Boris
2011-01-01
In this Perspective Article we assess the usefulness of Google's new word frequencies for word recognition research (lexical decision and word naming). We find that, despite the massive corpus on which the Google estimates are based (131 billion words from books published in the United States alone), the Google American English frequencies explain 11% less of the variance in the lexical decision times from the English Lexicon Project (Balota et al., 2007) than the SUBTLEX-US word frequencies, based on a corpus of 51 million words from film and television subtitles. Further analyses indicate that word frequencies derived from recent books (published after 2000) are better predictors of word processing times than frequencies based on the full corpus, and that word frequencies based on fiction books predict word processing times better than word frequencies based on the full corpus. The most predictive word frequencies from Google still do not explain more of the variance in word recognition times of undergraduate students and old adults than the subtitle-based word frequencies. PMID:21713191
Costing nursing education programs. It's as easy as 1-2-3.
Fisher, M L; Hume, R; Emerick, R
1998-01-01
Staff development departments are pressured to reveal the costs of their educational programs and to compete with outside vendors for programming. The process of implementing a spreadsheet template for costing out staff development programs is described. The template is easy to use and supports "what if" analysis. This model allows educators to evaluate cost implications of curricular decisions and to better negotiate with internal and external customers.
An Excel Macro to Plot the HFE-Diagram to Identify Sea Water Intrusion Phases.
Giménez-Forcada, Elena; Sánchez San Román, F Javier
2015-01-01
A hydrochemical facies evolution diagram (HFE-D) is a multirectangular diagram, which is a useful tool in the interpretation of sea water intrusion processes. This method note describes a simple method for generating an HFE-D plot using the spreadsheet software package, Microsoft Excel. The code was applied to groundwater from the alluvial coastal plain of Grosseto (Tuscany, Italy), which is characterized by a complex salinization process in which sea water mixes with sulfate or bicarbonate recharge water. © 2014, National GroundWater Association.
Dreyer, Felix R.; Frey, Dietmar; Arana, Sophie; von Saldern, Sarah; Picht, Thomas; Vajkoczy, Peter; Pulvermüller, Friedemann
2015-01-01
Neuroimaging and neuropsychological experiments suggest that modality-preferential cortices, including motor- and somatosensory areas, contribute to the semantic processing of action related concrete words. Still, a possible role of sensorimotor areas in processing abstract meaning remains under debate. Recent fMRI studies indicate an involvement of the left sensorimotor cortex in the processing of abstract-emotional words (e.g., “love”) which resembles activation patterns seen for action words. But are the activated areas indeed necessary for processing action-related and abstract words? The current study now investigates word processing in two patients suffering from focal brain lesion in the left frontocentral motor system. A speeded Lexical Decision Task on meticulously matched word groups showed that the recognition of nouns from different semantic categories – related to food, animals, tools, and abstract-emotional concepts – was differentially affected. Whereas patient HS with a lesion in dorsolateral central sensorimotor systems next to the hand area showed a category-specific deficit in recognizing tool words, patient CA suffering from lesion centered in the left supplementary motor area was primarily impaired in abstract-emotional word processing. These results point to a causal role of the motor cortex in the semantic processing of both action-related object concepts and abstract-emotional concepts and therefore suggest that the motor areas previously found active in action-related and abstract word processing can serve a meaning-specific necessary role in word recognition. The category-specific nature of the observed dissociations is difficult to reconcile with the idea that sensorimotor systems are somehow peripheral or ‘epiphenomenal’ to meaning and concept processing. Rather, our results are consistent with the claim that cognition is grounded in action and perception and based on distributed action perception circuits reaching into modality-preferential cortex. PMID:26617535
McBride, Dawn M; Anne Dosher, Barbara
2002-09-01
Four experiments were conducted to evaluate explanations of picture superiority effects previously found for several tasks. In a process dissociation procedure (Jacoby, 1991) with word stem completion, picture fragment completion, and category production tasks, conscious and automatic memory processes were compared for studied pictures and words with an independent retrieval model and a generate-source model. The predictions of a transfer appropriate processing account of picture superiority were tested and validated in "process pure" latent measures of conscious and unconscious, or automatic and source, memory processes. Results from both model fits verified that pictures had a conceptual (conscious/source) processing advantage over words for all tasks. The effects of perceptual (automatic/word generation) compatibility depended on task type, with pictorial tasks favoring pictures and linguistic tasks favoring words. Results show support for an explanation of the picture superiority effect that involves an interaction of encoding and retrieval processes.
NASA Technical Reports Server (NTRS)
Fanourakis, Sofia
2015-01-01
My main project was to determine and implement updates to be made to MODEAR (Mission Operations Data Enterprise Architecture Repository) process definitions to be used for CST-100 (Crew Space Transportation-100) related missions. Emphasis was placed on the scheduling aspect of the processes. In addition, I was to complete other tasks as given. Some of the additional tasks were: to create pass-through command look-up tables for the flight controllers, finish one of the MDT (Mission Operations Directorate Display Tool) displays, gather data on what is included in the CST-100 public data, develop a VBA (Visual Basic for Applications) script to create a csv (Comma-Separated Values) file with specific information from spreadsheets containing command data, create a command script for the November MCC-ASIL (Mission Control Center-Avionics System Integration Laboratory) testing, and take notes for one of the TCVB (Terminal Configured Vehicle B-737) meetings. In order to make progress in my main project I scheduled meetings with the appropriate subject matter experts, prepared material for the meetings, and assisted in the discussions in order to understand the process or processes at hand. After such discussions I made updates to various MODEAR processes and process graphics. These meetings have resulted in significant updates to the processes that were discussed. In addition, the discussions have helped the departments responsible for these processes better understand the work ahead and provided material to help document how their products are created. I completed my other tasks utilizing resources available to me and, when necessary, consulting with the subject matter experts. Outputs resulting from my other tasks were: two completed and one partially completed pass through command look-up tables for the fight controllers, significant updates to one of the MDT displays, a spreadsheet containing data on what is included in the CST-100 public data, a tool to create a csv file with specific information from spreadsheets containing command data, a command script for the November MCC-ASIL testing which resulted in a successful test day identifying several potential issues, and notes from one of the TCVB meetings that was used to keep the teams up to date on what was discussed and decided. I have learned a great deal working at NASA these last four months. I was able to meet and work with amazing individuals, further develop my technical knowledge, expand my knowledge base regarding human spaceflight, and contribute to the CST-100 missions. My work at NASA has strengthened my desire to continue my education in order to make further contributions to the field, and has given me the opportunity to see the advantages of a career at NASA.
Holistic processing of words modulated by reading experience.
Wong, Alan C-N; Bukach, Cindy M; Yuen, Crystal; Yang, Lizhuang; Leung, Shirley; Greenspon, Emma
2011-01-01
Perceptual expertise has been studied intensively with faces and object categories involving detailed individuation. A common finding is that experience in fulfilling the task demand of fine, subordinate-level discrimination between highly similar instances is associated with the development of holistic processing. This study examines whether holistic processing is also engaged by expert word recognition, which is thought to involve coarser, basic-level processing that is more part-based. We adopted a paradigm widely used for faces--the composite task, and found clear evidence of holistic processing for English words. A second experiment further showed that holistic processing for words was sensitive to the amount of experience with the language concerned (native vs. second-language readers) and with the specific stimuli (words vs. pseudowords). The adoption of a paradigm from the face perception literature to the study of expert word perception is important for further comparison between perceptual expertise with words and face-like expertise.
What do foreign neighbors say about the mental lexicon?*
VITEVITCH, MICHAEL S.
2012-01-01
A corpus analysis of phonological word-forms shows that English words have few phonological neighbors that are Spanish words. Concomitantly, Spanish words have few phonological neighbors that are English words. These observations appear to undermine certain accounts of bilingual language processing, and have significant implications for the processing and representation of word-forms in bilinguals. PMID:23930081
The low-frequency encoding disadvantage: Word frequency affects processing demands.
Diana, Rachel A; Reder, Lynne M
2006-07-01
Low-frequency words produce more hits and fewer false alarms than high-frequency words in a recognition task. The low-frequency hit rate advantage has sometimes been attributed to processes that operate during the recognition test (e.g., L. M. Reder et al., 2000). When tasks other than recognition, such as recall, cued recall, or associative recognition, are used, the effects seem to contradict a low-frequency advantage in memory. Four experiments are presented to support the claim that in addition to the advantage of low-frequency words at retrieval, there is a low-frequency disadvantage during encoding. That is, low-frequency words require more processing resources to be encoded episodically than high-frequency words. Under encoding conditions in which processing resources are limited, low-frequency words show a larger decrement in recognition than high-frequency words. Also, studying items (pictures and words of varying frequencies) along with low-frequency words reduces performance for those stimuli. Copyright 2006 APA, all rights reserved.
Limitations of the dual-process-theory regarding the writing of words and non-words to dictation.
Tucha, Oliver; Trumpp, Christian; Lange, Klaus W
2004-12-01
It is generally assumed that the lexical and phonological systems are involved in writing to dictation. In an experiment concerned with the writing of words and non-words to dictation, the handwriting of female students was registered using a digitising tablet. The data contradict the assumption that the phonological system represents an alexical process. Both words and non-words which were acoustically presented to the subjects were lexically parsed. The analysis of kinematic data revealed significant differences between the subjects' writing of words and non-words. The findings reveal gross disturbances of handwriting fluency during the writing of non-words. The findings of the experiment cannot be explained by the dual-process-theory.
How does the interaction between spelling and motor processes build up during writing acquisition?
Kandel, Sonia; Perret, Cyril
2015-03-01
How do we recall a word's spelling? How do we produce the movements to form the letters of a word? Writing involves several processing levels. Surprisingly, researchers have focused either on spelling or on motor production. However, these processes interact and cannot be studied separately. Spelling processes cascade into movement production. For example, in French, producing the letters PAR in the orthographically irregular word PARFUM (perfume) delays motor production with respect to the same letters in the regular word PARDON (pardon). Orthographic regularity refers to the possibility of spelling a word correctly by applying the most frequent sound-letter conversion rules. The present study examined how the interaction between spelling and motor processing builds up during writing acquisition. French 8- to 10-year-old children participated in the experiment. This is the age at which handwriting skills start to become automatic. The children wrote regular and irregular words that could be frequent or infrequent. They wrote on a digitizer so we could collect data on latency, movement duration and fluency. The results revealed that the interaction between spelling and motor processing was already present at age 8. It became more adult-like at ages 9 and 10. Before starting to write, processing irregular words took longer than processing regular words. This processing load spread into movement production. It increased writing duration and rendered the movements more dysfluent. Word frequency affected latencies and cascaded into production. It modulated writing duration but not movement fluency. Writing infrequent words took longer than writing frequent words. The data suggest that orthographic regularity has a stronger impact on writing than word frequency. They do not cascade to the same extent. Copyright © 2014 Elsevier B.V. All rights reserved.
ERIC Educational Resources Information Center
Fuglestad, Anne Berit
2013-01-01
This paper presents a case of collaboration with three teachers and a didactician on task development within a developmental research project based on ideas of inquiry and learning community. The teachers' goal was to utilise a spreadsheet to orchestrate the pupils' investigations and build a library of tasks for the classroom. The focus is on one…
ERIC Educational Resources Information Center
Adhitama, Egy; Fauzi, Ahmad
2018-01-01
In this study, a pendulum experimental tool with a light-based timer has been developed to measure the period of a simple pendulum. The obtained data was automatically recorded in an Excel spreadsheet. The intensity of monochromatic light, sensed by a 3DU5C phototransistor, dynamically changes as the pendulum swings. The changed intensity varies…
ERIC Educational Resources Information Center
Fetter, Gary; Shockley, Jeff
2014-01-01
Instructors look for ways to explain to students how supply chains can be constructed so that competing suppliers can work together to improve inventory management performance (i.e., a phenomenon known as co-opetition). An Excel spreadsheet-driven simulation is presented that models a complete multilevel supply chain system--customer, retailer,…
ERIC Educational Resources Information Center
Halat, Erdogan; Peker, Murat
2011-01-01
The purpose of this study was to compare the influence of instruction using WebQuest activities with the influence of an instruction using spreadsheet activities on the motivation of pre-service elementary school teachers in mathematics teaching course. There were a total of 70 pre-service elementary school teachers involved in this study. Thirty…
Great Basin NV Play Fairway Analysis - Carson Sink
Jim Faulds
2015-10-28
All datasets and products specific to the Carson Sink Basin. Includes a packed ArcMap (.mpk), individually zipped shapefiles, and a file geodatabase for the Carson Sink area; a GeoSoft Oasis montaj project containing GM-SYS 2D gravity profiles along the trace of our seismic reflection lines; a 3D model in EarthVision; spreadsheet of links to published maps; and spreadsheets of well data.
ERIC Educational Resources Information Center
Krange, Ingeborg; Arnseth, Hans Christian
2012-01-01
The aim of this study is to scrutinize the characteristics of conceptual meaning making when students engage with virtual worlds in combination with a spreadsheet, with the aim of developing graphs. We study how these tools and the representations they contain or enable students to construct serve to influence their understanding of energy resource…
Jim Faulds
2015-10-29
All datasets and products specific to the Steptoe Valley model area. Includes a packed ArcMap project (.mpk), individually zipped shapefiles, and a file geodatabase for the northern Steptoe Valley area; a GeoSoft Oasis montaj project containing GM-SYS 2D gravity profiles along the trace of our seismic reflection lines; a 3D model in EarthVision; spreadsheet of links to published maps; and spreadsheets of well data.
Sang-Kyun Han; Han-Sup Han; William J. Elliot; Edward M. Bilek
2017-01-01
We developed a spreadsheet-based model, named ThinTool, to evaluate the cost of mechanical fuel reduction thinning including biomass removal, to predict net energy output, and to assess nutrient impacts from thinning treatments in northern California and southern Oregon. A combination of literature reviews, field-based studies, and contractor surveys was used to...
A user-friendly tool for incremental haemodialysis prescription.
Casino, Francesco Gaetano; Basile, Carlo
2018-01-05
There is recently heightened interest in incremental haemodialysis (IHD), the main advantage of which is likely a better preservation of the residual kidney function of the patients. The implementation of IHD, however, is hindered by many factors, among them the mathematical complexity of its prescription. The aim of our study was to design a user-friendly tool for IHD prescription, consisting of only a few rows of a common spreadsheet. The keystone of our spreadsheet was the following fundamental concept: the dialysis dose to be prescribed in IHD depends only on the normalized urea clearance provided by the native kidneys (KRUn) of the patient for each frequency of treatment, according to the variable target model recently proposed by Casino and Basile (The variable target model: a paradigm shift in the incremental haemodialysis prescription. Nephrol Dial Transplant 2017; 32: 182-190). The first step was to put in sequence a series of equations in order to calculate, firstly, KRUn and, then, the key parameters to be prescribed for an adequate IHD; the second step was to compare KRUn values obtained with our spreadsheet with KRUn values obtainable with the gold standard Solute-solver (Daugirdas JT et al., Solute-solver: a web-based tool for modeling urea kinetics for a broad range of hemodialysis schedules in multiple patients. Am J Kidney Dis 2009; 54: 798-809) in a sample of 40 incident haemodialysis patients. Our spreadsheet provided excellent results. The differences with Solute-solver were clinically negligible. This was confirmed by the Bland-Altman plot built to analyse the agreement between KRUn values obtained with the two methods: the difference was 0.07 ± 0.05 mL/min/35 L. Our spreadsheet is a user-friendly tool able to provide clinically acceptable results in IHD prescription. Two immediate consequences could follow: (i) a larger dissemination of IHD might occur; and (ii) our spreadsheet could represent a useful tool for an urgently needed full-fledged clinical trial comparing IHD with standard thrice-weekly HD. © The Author(s) 2018. Published by Oxford University Press on behalf of ERA-EDTA. All rights reserved.
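The abstract reports a Bland-Altman analysis of the agreement between the spreadsheet and Solute-solver. A minimal sketch of that kind of agreement summary, assuming the paired KRUn values from the two methods are already available as Python lists (the numbers below are made up for illustration), is:

    # Minimal Bland-Altman summary for paired measurements (illustrative only).
    import statistics

    def bland_altman(method_a, method_b):
        diffs = [a - b for a, b in zip(method_a, method_b)]
        bias = statistics.mean(diffs)                       # mean difference
        sd = statistics.stdev(diffs)                        # SD of the differences
        return bias, (bias - 1.96 * sd, bias + 1.96 * sd)   # 95% limits of agreement

    spreadsheet   = [2.10, 1.80, 2.50, 1.20]   # hypothetical KRUn values, mL/min/35 L
    solute_solver = [2.00, 1.75, 2.45, 1.15]
    bias, limits = bland_altman(spreadsheet, solute_solver)
    print(f"bias = {bias:.3f}, 95% limits of agreement = {limits}")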
DOE Office of Scientific and Technical Information (OSTI.GOV)
Etmektzoglou, A; Mishra, P; Svatos, M
Purpose: To automate creation and delivery of robotic linac trajectories with TrueBeam Developer Mode, an open source spreadsheet-based trajectory generation tool has been developed, tested and made freely available. The computing power inherent in a spreadsheet environment plus additional functions programmed into the tool insulate users from the underlying schema tedium and allow easy calculation, parameterization, graphical visualization, validation and finally automatic generation of Developer Mode XML scripts which are directly loadable on a TrueBeam linac. Methods: The robotic control system platform that allows total coordination of potentially all linac moving axes with beam (continuous, step-and-shoot, or combination thereof) becomesmore » available in TrueBeam Developer Mode. Many complex trajectories are either geometric or can be described in analytical form, making the computational power, graphing and programmability available in a spreadsheet environment an easy and ideal vehicle for automatic trajectory generation. The spreadsheet environment allows also for parameterization of trajectories thus enabling the creation of entire families of trajectories using only a few variables. Standard spreadsheet functionality has been extended for powerful movie-like dynamic graphic visualization of the gantry, table, MLC, room, lasers, 3D observer placement and beam centerline all as a function of MU or time, for analysis of the motions before requiring actual linac time. Results: We used the tool to generate and deliver extended SAD “virtual isocenter” trajectories of various shapes such as parameterized circles and ellipses. We also demonstrated use of the tool in generating linac couch motions that simulate respiratory motion using analytical parameterized functions. Conclusion: The SAGE tool is a valuable resource to experiment with families of complex geometric trajectories for a TrueBeam Linac. It makes Developer Mode more accessible as a vehicle to quickly translate research ideas into machine readable scripts without programming knowledge. As an open source initiative, it also enables researcher collaboration on future developments. I am a full time employee at Varian Medical Systems, Palo Alto, California.« less
ERIC Educational Resources Information Center
Scriven, Jolene D.; And Others
A study was conducted (1) to determine current practices in word processing installations in selected organizations throughout the United States, and (2) to ascertain anticipated future developments in word processing as well as to provide recommendations for educational institutions that prepare workers for business offices. Seven interview…
[Representation of letter position in visual word recognition process].
Makioka, S
1994-08-01
Two experiments investigated the representation of letter position in the visual word recognition process. In Experiment 1, subjects (12 undergraduates and graduates) were asked to detect a target word in a briefly-presented probe. Probes consisted of two kanji words. The letters which formed targets (critical letters) were always contained in probes. (e.g. target: [symbol: see text] probe: [symbol: see text]) A high false alarm rate was observed when critical letters occupied the same within-word relative position (left or right within the word) in the probe words as in the target word. In Experiment 2 (subjects were ten undergraduates and graduates), spaces adjacent to probe words were replaced by randomly chosen hiragana letters (e.g. [symbol: see text]), because spaces are not used to separate words in regular Japanese sentences. In addition to the effect of within-word relative position as in Experiment 1, the effect of between-word relative position (left or right across the probe words) was observed. These results suggest that information about the within-word relative position of a letter is used in the word recognition process. The effect of within-word relative position was explained by a connectionist model of word recognition.
Is the masked priming same-different task a pure measure of prelexical processing?
Kelly, Andrew N; van Heuven, Walter J B; Pitchford, Nicola J; Ledgeway, Timothy
2013-01-01
To study prelexical processes involved in visual word recognition a task is needed that only operates at the level of abstract letter identities. The masked priming same-different task has been purported to do this, as the same pattern of priming is shown for words and nonwords. However, studies using this task have consistently found a processing advantage for words over nonwords, indicating a lexicality effect. We investigated the locus of this word advantage. Experiment 1 used conventional visually-presented reference stimuli to test previous accounts of the lexicality effect. Results rule out the use of different strategies, or strength of representations, for words and nonwords. No interaction was shown between prime type and word type, but a consistent word advantage was found. Experiment 2 used novel auditorally-presented reference stimuli to restrict nonword matching to the sublexical level. This abolished scrambled priming for nonwords, but not words. Overall this suggests the processing advantage for words over nonwords results from activation of whole-word, lexical representations. Furthermore, the number of shared open-bigrams between primes and targets could account for scrambled priming effects. These results have important implications for models of orthographic processing and studies that have used this task to investigate prelexical processes.
ERP Indicators of L2 Proficiency in Word-to-text Integration Processes.
Yang, Chin Lung; Perfetti, Charles A; Tan, Li-Hai; Jiang, Ying
2018-06-04
Studies of bilingual proficiency have largely focused on word and sentence processing, whereas the text level has received relatively little attention. We examined on-line second language (L2) text comprehension in relation to L2 proficiency with ERPs recorded on critical words separated across a sentence boundary from their co-referential antecedents. The integration processes on the critical words were designed to reflect different levels of text representation: word-form, word-meaning, and situational levels (Kintsch, 1998). Across proficiency levels, bilinguals showed biphasic N400/late positive component (LPC) effects related to word meaning integration (N400) and mental model updating (LPC) processes. More proficient bilinguals, compared with less proficient bilinguals, showed reduced amplitudes in both N400 and LPC when the integration depended on semantic and conceptual meanings. When the integration was based on word repetitions and inferences, both groups showed reduced N400 negativity and elevated LPC positivity. These effects reflect how memory mechanisms (processes and resources) support the tight coupling among word meaning, readers' memory of the text meaning and the referentially-specified meaning of the text. They further demonstrate the importance of L2 semantic and conceptual processing in modulating the L2 proficiency effect on L2 text integration processes. These results align with the assumption that word meaning processes are causal components in variations of comprehension ability for both monolinguals and bilinguals. Copyright © 2018. Published by Elsevier Ltd.
Kuppusamy, Vijayalakshmi; Nagarajan, Vivekanandan; Jeevanandam, Prakash; Murugan, Lavanya
2016-02-01
The study aimed to compare two different monitor unit (MU) or dose verification software packages in volumetric modulated arc therapy (VMAT) using a modified Clarkson's integration technique for 6 MV photon beams. An in-house Excel Spreadsheet-based monitor unit verification calculation (MUVC) program and PTW's DIAMOND secondary check software (SCS), version 6, were used as a secondary check to verify the monitor units (MU) or dose calculated by the treatment planning system (TPS). In this study, 180 patients were grouped into 61 head and neck, 39 thorax and 80 pelvic sites. Verification plans were created using the PTW OCTAVIUS-4D phantom and measured using the 729-chamber detector array, with the isocentre as the point of measurement for each field. In the analysis of 154 clinically approved VMAT plans with the isocentre in a region above -350 HU, using heterogeneity corrections, the in-house Spreadsheet-based MUVC program and Diamond SCS showed good agreement with the TPS. The overall percentage average deviations for all sites were (-0.93% ± 1.59%) and (1.37% ± 2.72%) for the in-house Excel Spreadsheet-based MUVC program and Diamond SCS, respectively. The 26 clinically approved VMAT plans with the isocentre in a region below -350 HU showed higher variations for both the in-house Spreadsheet-based MUVC program and Diamond SCS. It can be concluded that for patient-specific quality assurance (QA), the in-house Excel Spreadsheet-based MUVC program and Diamond SCS can be used as a simple and fast complement to measurement-based verification for plans with the isocentre in a region above -350 HU. Copyright © 2016 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
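The percentage deviations quoted above are, in essence, differences between the independently calculated value and the TPS value, normalized to the TPS value. A small sketch of that comparison, assuming paired TPS and secondary-check point doses are available (the numbers shown are invented for illustration), is:

    # Percentage deviation of an independent dose/MU check from the TPS value.
    import statistics

    def percent_deviation(check, tps):
        return 100.0 * (check - tps) / tps

    tps_values   = [200.0, 180.0, 210.0]    # hypothetical TPS point doses (cGy)
    check_values = [198.5, 181.2, 208.9]    # hypothetical secondary-check doses

    devs = [percent_deviation(c, t) for c, t in zip(check_values, tps_values)]
    print(f"mean deviation = {statistics.mean(devs):.2f}% "
          f"+/- {statistics.stdev(devs):.2f}%")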
Spreadsheet WATERSHED modeling for nonpoint-source pollution management in a Wisconsin basin
Walker, J.F.; Pickard, S.A.; Sonzogni, W.C.
1989-01-01
Although several sophisticated nonpoint pollution models exist, few are available that are easy to use, cover a variety of conditions, and integrate a wide range of information to allow managers and planners to assess different control strategies. Here, a straightforward pollutant input accounting approach is presented in the form of an existing model (WATERSHED) that has been adapted to run on modern electronic spreadsheets. As an application, WATERSHED is used to assess options to improve the quality of highly eutrophic Delavan Lake in Wisconsin. WATERSHED is flexible in that several techniques, such as the Universal Soil Loss Equation or unit-area loadings, can be used to estimate nonpoint-source inputs. Once the model parameters are determined (and calibrated, if possible), the spreadsheet features can be used to conduct a sensitivity analysis of management options. In the case of Delavan Lake, it was concluded that, although some nonpoint controls were cost-effective, the overall reduction in phosphorus would be insufficient to measurably improve water quality.
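The unit-area loading technique mentioned above is a simple accounting step: the annual load contributed by each land use is its area multiplied by an export coefficient, and the basin load is the sum over land uses. A hedged sketch of that calculation follows; the land-use categories, areas, and coefficients are placeholders, not values from the Delavan Lake application.

    # Unit-area loading estimate of annual phosphorus export (illustrative only).
    # Export coefficients (kg P per hectare per year) and areas are placeholders.
    EXPORT_COEFF = {"row crop": 1.0, "pasture": 0.5, "urban": 1.2, "forest": 0.1}
    AREA_HA = {"row crop": 4000, "pasture": 2500, "urban": 800, "forest": 1500}

    loads = {lu: AREA_HA[lu] * EXPORT_COEFF[lu] for lu in AREA_HA}
    total = sum(loads.values())
    for lu, load in loads.items():
        print(f"{lu:10s} {load:8.1f} kg P/yr ({100 * load / total:4.1f}%)")
    print(f"{'total':10s} {total:8.1f} kg P/yr")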
The Time Course of Incremental Word Processing during Chinese Reading
ERIC Educational Resources Information Center
Zhou, Junyi; Ma, Guojie; Li, Xingshan; Taft, Marcus
2018-01-01
In the current study, we report two eye movement experiments investigating how Chinese readers process incremental words during reading. These are words where some of the component characters constitute another word (an embedded word). In two experiments, eye movements were monitored while the participants read sentences with incremental words…
Processing of Color Words Activates Color Representations
ERIC Educational Resources Information Center
Richter, Tobias; Zwaan, Rolf A.
2009-01-01
Two experiments were conducted to investigate whether color representations are routinely activated when color words are processed. Congruency effects of colors and color words were observed in both directions. Lexical decisions on color words were faster when preceding colors matched the color named by the word. Color-discrimination responses…
Sambai, Ami; Coltheart, Max; Uno, Akira
2018-04-01
In English, the size of the regularity effect on word reading-aloud latency decreases across position of irregularity. This has been explained by a sublexical serially operating reading mechanism. It is unclear whether sublexical serial processing occurs in reading two-character kanji words aloud. To investigate this issue, we studied how the position of atypical character-to-sound correspondences influenced reading performance. When participants read inconsistent-atypical words aloud mixed randomly with nonwords, reading latencies of words with an inconsistent-atypical correspondence in the initial position were significantly longer than words with an inconsistent-atypical correspondence in the second position. The significant difference of reading latencies for inconsistent-atypical words disappeared when inconsistent-atypical words were presented without nonwords. Moreover, reading latencies for words with an inconsistent-atypical correspondence in the first position were shorter than for words with a typical correspondence in the first position. This typicality effect was absent when the atypicality was in the second position. These position-of-atypicality effects suggest that sublexical processing of kanji occurs serially and that the phonology of two-character kanji words is generated from both a lexical parallel process and a sublexical serial process.
Word Processors and Invention in Technical Writing.
ERIC Educational Resources Information Center
Barker, Thomas T.
1989-01-01
Explores how word processing affects thinking and writing. Examines two myths surrounding word processors and invention in technical writing. Describes how word processing can enhance invention through collaborative writing, templates, and on-screen outlining. (MM)
Neuromagnetic correlates of audiovisual word processing in the developing brain.
Dinga, Samantha; Wu, Di; Huang, Shuyang; Wu, Caiyun; Wang, Xiaoshan; Shi, Jingping; Hu, Yue; Liang, Chun; Zhang, Fawen; Lu, Meng; Leiken, Kimberly; Xiang, Jing
2018-06-01
The brain undergoes enormous changes during childhood. Little is known about how the brain develops to serve word processing. The objective of the present study was to investigate the maturational changes of word processing in children and adolescents using magnetoencephalography (MEG). Responses to a word processing task were investigated in sixty healthy participants. Each participant was presented with simultaneous visual and auditory word pairs in "match" and "mismatch" conditions. The patterns of neuromagnetic activation from MEG recordings were analyzed at both sensor and source levels. Topography and source imaging revealed that word processing transitioned from bilateral connections to unilateral connections as age increased from 6 to 17 years old. Correlation analyses of language networks revealed that the path length of word processing networks negatively correlated with age (r = -0.833, p < 0.0001), while the connection strength (r = 0.541, p < 0.01) and the clustering coefficient (r = 0.705, p < 0.001) of word processing networks were positively correlated with age. In addition, males had more visual connections, whereas females had more auditory connections. The correlations between gender and path length, gender and connection strength, and gender and clustering coefficient demonstrated a developmental trend without reaching statistical significance. The results indicate that the developmental trajectory of word processing is gender specific. Since the neuromagnetic signatures of these gender-specific paths to adult word processing were determined using non-invasive, objective, and quantitative methods, the results may play a key role in understanding language impairments in pediatric patients in the future. Copyright © 2018 Elsevier B.V. All rights reserved.
Cognate and Word Class Ambiguity Effects in Noun and Verb Processing
ERIC Educational Resources Information Center
Bultena, Sybrine; Dijkstra, Ton; van Hell, Janet G.
2013-01-01
This study examined how noun and verb processing in bilingual visual word recognition are affected by within and between-language overlap. We investigated how word class ambiguous noun and verb cognates are processed by bilinguals, to see if co-activation of overlapping word forms between languages benefits from additional overlap within a…
Midbrain-Driven Emotion and Reward Processing in Alcoholism
Müller-Oehring, E M; Jung, Y-C; Sullivan, E V; Hawkes, W C; Pfefferbaum, A; Schulte, T
2013-01-01
Alcohol dependence is associated with impaired control over emotionally motivated actions, possibly associated with abnormalities in the frontoparietal executive control network and midbrain nodes of the reward network associated with automatic attention. To identify differences in the neural response to alcohol-related word stimuli, 26 chronic alcoholics (ALC) and 26 healthy controls (CTL) performed an alcohol-emotion Stroop Match-to-Sample task during functional MR imaging. Stroop contrasts were modeled for color-word incongruency (eg, word RED printed in green) and for alcohol (eg, BEER), positive (eg, HAPPY) and negative (eg, MAD) emotional word content relative to congruent word conditions (eg, word RED printed in red). During color-Stroop processing, ALC and CTL showed similar left dorsolateral prefrontal activation, and CTL, but not ALC, deactivated posterior cingulate cortex/cuneus. An interaction revealed a dissociation between alcohol-word and color-word Stroop processing: ALC activated midbrain and parahippocampal regions more than CTL when processing alcohol-word relative to color-word conditions. In ALC, the midbrain region was also invoked by negative emotional Stroop words thereby showing significant overlap of this midbrain activation for alcohol-related and negative emotional processing. Enhanced midbrain activation to alcohol-related words suggests neuroadaptation of dopaminergic midbrain systems. We speculate that such tuning is normally associated with behavioral conditioning to optimize responses but here contributed to automatic bias to alcohol-related stimuli. PMID:23615665
ERIC Educational Resources Information Center
Mogey, Nora; Hartley, James
2013-01-01
There is much debate about whether students these days should be able to word-process essay-type examinations rather than handwrite them, particularly when they are asked to word-process everything else. This study used word-processing software to examine the stylistic features of 13 examination essays written by hand and 24 by…
ERIC Educational Resources Information Center
Scriven, Jolene D.; And Others
A study sought to determine current practices in word processing installations located in selected organizations throughout the United States. A related problem was to ascertain anticipated future developments in word processing to provide information for educational institutions preparing workers for the business office. Six interview instruments…
Nakagawa, A; Sukigara, M
2000-09-01
The purpose of this study was to examine the relationship between familiarity and laterality in reading Japanese Kana words. In two divided-visual-field experiments, three- or four-character Hiragana or Katakana words were presented in both familiar and unfamiliar scripts, on which subjects performed lexical decisions. Experiment 1, using three stimulus durations (40, 100, 160 ms), suggested that only in the unfamiliar script condition did increasing the stimulus presentation time affect the two visual fields differently. To examine this lateral difference during the processing of unfamiliar scripts as related to attentional laterality, a concurrent auditory shadowing task was added in Experiment 2. The results suggested that processing words in an unfamiliar script requires attention, which could be left-hemisphere lateralized, while orthographically familiar kana words can be processed automatically on the basis of their word-level orthographic representations or visual word form. Copyright 2000 Academic Press.
Wegrzyn, Martin; Herbert, Cornelia; Ethofer, Thomas; Flaisch, Tobias; Kissler, Johanna
2017-11-01
Visually presented emotional words are processed preferentially and effects of emotional content are similar to those of explicit attention deployment in that both amplify visual processing. However, auditory processing of emotional words is less well characterized and interactions between emotional content and task-induced attention have not been fully understood. Here, we investigate auditory processing of emotional words, focussing on how auditory attention to positive and negative words impacts their cerebral processing. A Functional magnetic resonance imaging (fMRI) study manipulating word valence and attention allocation was performed. Participants heard negative, positive and neutral words to which they either listened passively or attended by counting negative or positive words, respectively. Regardless of valence, active processing compared to passive listening increased activity in primary auditory cortex, left intraparietal sulcus, and right superior frontal gyrus (SFG). The attended valence elicited stronger activity in left inferior frontal gyrus (IFG) and left SFG, in line with these regions' role in semantic retrieval and evaluative processing. No evidence for valence-specific attentional modulation in auditory regions or distinct valence-specific regional activations (i.e., negative > positive or positive > negative) was obtained. Thus, allocation of auditory attention to positive and negative words can substantially increase their processing in higher-order language and evaluative brain areas without modulating early stages of auditory processing. Inferior and superior frontal brain structures mediate interactions between emotional content, attention, and working memory when prosodically neutral speech is processed. Copyright © 2017 Elsevier Ltd. All rights reserved.
Processing concrete words: fMRI evidence against a specific right-hemisphere involvement.
Fiebach, Christian J; Friederici, Angela D
2004-01-01
Behavioral, patient, and electrophysiological studies have been taken as support for the assumption that processing of abstract words is confined to the left hemisphere, whereas concrete words are processed also by right-hemispheric brain areas. These are thought to provide additional information from an imaginal representational system, as postulated in the dual-coding theory of memory and cognition. Here we report new event-related fMRI data on the processing of concrete and abstract words in a lexical decision task. While abstract words activated a subregion of the left inferior frontal gyrus (BA 45) more strongly than concrete words, specific activity for concrete words was observed in the left basal temporal cortex. These data as well as data from other neuroimaging studies reviewed here are not compatible with the assumption of a specific right-hemispheric involvement for concrete words. The combined findings rather suggest a revised view of the neuroanatomical bases of the imaginal representational system assumed in the dual-coding theory, at least with respect to word recognition.
Oscillatory brain dynamics associated with the automatic processing of emotion in words.
Wang, Lin; Bastiaansen, Marcel
2014-10-01
This study examines the automaticity of processing the emotional aspects of words, and characterizes the oscillatory brain dynamics that accompany this automatic processing. Participants read emotionally negative, neutral and positive nouns while performing a color detection task in which only perceptual-level analysis was required. Event-related potentials and time frequency representations were computed from the concurrently measured EEG. Negative words elicited a larger P2 and a larger late positivity than positive and neutral words, indicating deeper semantic/evaluative processing of negative words. In addition, sustained alpha power suppressions were found for the emotional compared to neutral words, in the time range from 500 to 1000ms post-stimulus. These results suggest that sustained attention was allocated to the emotional words, whereas the attention allocated to the neutral words was released after an initial analysis. This seems to hold even when the emotional content of the words is task-irrelevant. Copyright © 2014 Elsevier Inc. All rights reserved.
Planning and production of grammatical and lexical verbs in multi-word messages.
Michel Lange, Violaine; Messerschmidt, Maria; Harder, Peter; Siebner, Hartwig Roman; Boye, Kasper
2017-01-01
Grammatical words represent the part of grammar that can be most directly contrasted with the lexicon. Aphasiological studies, linguistic theories and psycholinguistic studies suggest that their processing is operated at different stages in speech production. Models of sentence production propose that at the formulation stage, lexical words are processed at the functional level while grammatical words are processed at a later positional level. In this study we consider proposals made by linguistic theories and psycholinguistic models to derive two predictions for the processing of grammatical words compared to lexical words. First, based on the assumption that grammatical words are less crucial for communication and therefore paid less attention to, it is predicted that they show shorter articulation times and/or higher error rates than lexical words. Second, based on the assumption that grammatical words differ from lexical words in being dependent on a lexical host, it is hypothesized that the retrieval of a grammatical word has to be put on hold until its lexical host is available, and it is predicted that this is reflected in longer reaction times (RTs) for grammatical compared to lexical words. We investigated these predictions by comparing fully homonymous sentences with only a difference in verb status (grammatical vs. lexical) elicited by a specific context. We measured RTs, duration and accuracy rate. No difference in duration was observed. Longer RTs and a lower accuracy rate for grammatical words were reported, successfully reflecting grammatical word properties as defined by linguistic theories and psycholinguistic models. Importantly, this study provides insight into the span of encoding and grammatical encoding processes in speech production.
NASA Technical Reports Server (NTRS)
Holderman, James D.; Clisset, James R.; Moder, Jeffrey P.
2010-01-01
This is a printout of the supplemental spreadsheet accompanying the document found in NASA/TM-2010-216100. The calculations for cases of opposed rows of jets with the orifices on one side shifted show that staggering can improve the mixing, particularly for cases where jets would overpenetrate slightly if the orifices were in an aligned configuration.
Teaching graphical simulations of Fourier series expansion of some periodic waves using spreadsheets
NASA Astrophysics Data System (ADS)
Singh, Iqbal; Kaur, Bikramjeet
2018-05-01
The present article demonstrates a way of programming using an Excel spreadsheet to teach Fourier series expansion in schools/colleges without knowledge of any typical programming language. By using this, a student learns to approximate the partial sum of the first n terms of the Fourier series for some periodic signals such as the square wave, sawtooth wave, half-wave rectifier and full-wave rectifier signals.
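The article implements the partial sums directly in spreadsheet cells; for readers who do want a script form, an equivalent sketch in Python is shown below. It evaluates the first n odd-harmonic terms of the Fourier series of a unit square wave, f(t) = (4/pi) * sum over odd k of sin(k*omega*t)/k, which is the kind of partial sum the spreadsheet plots. The sampling choices are arbitrary.

    # Partial sum of the Fourier series of a unit square wave (illustrative).
    import math

    def square_wave_partial_sum(t, n_terms, omega=2.0 * math.pi):
        """Sum the first n_terms odd harmonics: (4/pi) * sum sin(k*omega*t)/k."""
        total = 0.0
        for i in range(n_terms):
            k = 2 * i + 1
            total += math.sin(k * omega * t) / k
        return 4.0 / math.pi * total

    samples = [square_wave_partial_sum(t / 100.0, n_terms=10) for t in range(100)]
    print(min(samples), max(samples))   # tends toward -1/+1, with Gibbs overshoot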
ERIC Educational Resources Information Center
Benacka, Jan
2015-01-01
This paper provides the formula for the elevation angle at which a projectile has to be fired in a vacuum from a general position to hit a target at a given distance. A spreadsheet application that models the trajectory is presented, and the problem of finding the points of shot and impact of a projectile moving in a vacuum if three points of the…
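The paper's own formula is not quoted in the abstract, but the standard vacuum result it corresponds to is well known: to hit a target at horizontal distance x and height difference y with launch speed v under gravity g, the elevation angle theta satisfies tan(theta) = (v^2 +/- sqrt(v^4 - g*(g*x^2 + 2*y*v^2))) / (g*x), giving a low and a high trajectory when the target is in range. A short sketch evaluating both solutions (with arbitrary example numbers) is:

    # Elevation angles (low and high solutions) to hit a target in a vacuum.
    import math

    def elevation_angles(v, x, y, g=9.81):
        """Return the two launch angles (radians) that hit (x, y), or None if unreachable."""
        disc = v**4 - g * (g * x**2 + 2 * y * v**2)
        if x <= 0 or disc < 0:
            return None   # target behind the muzzle or out of range at this speed
        root = math.sqrt(disc)
        return (math.atan2(v**2 - root, g * x), math.atan2(v**2 + root, g * x))

    angles = elevation_angles(v=30.0, x=60.0, y=5.0)
    if angles:
        print([round(math.degrees(a), 2) for a in angles])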
A Spreadsheet for the Mixing of a Row of Jets with a Confined Crossflow
NASA Technical Reports Server (NTRS)
Holderman, J. D.; Smith, T. D.; Clisset, J. R.; Lear, W. E.
2005-01-01
An interactive computer code, written with a readily available software program, Microsoft Excel (Microsoft Corporation, Redmond, WA), is presented which displays 3-D oblique plots of a conserved scalar distribution downstream of jets mixing with a confined crossflow, for a single row, double rows, or opposed rows of jets with or without flow area convergence and/or a non-uniform crossflow scalar distribution. This project used a previously developed empirical model of jets mixing in a confined crossflow to create a Microsoft Excel spreadsheet that can output the profiles of a conserved scalar for jets injected into a confined crossflow given several input variables. The program uses multiple spreadsheets in a single Microsoft Excel notebook to carry out the modeling. The first sheet contains the main program, controls for the type of problem to be solved, and convergence criteria. The first sheet also provides for input of the specific geometry and flow conditions. The second sheet presents the results calculated with this routine to show the effects on the mixing of varying flow and geometric parameters. Comparisons are also made between results from the version of the empirical correlations implemented in the spreadsheet and the versions originally written in Applesoft BASIC (Apple Computer, Cupertino, CA) in the 1980s.
A Spreadsheet for the Mixing of a Row of Jets with a Confined Crossflow. Supplement
NASA Technical Reports Server (NTRS)
Holderman, J. D.; Smith, T. D.; Clisset, J. R.; Lear, W. E.
2005-01-01
An interactive computer code, written with a readily available software program, Microsoft Excel (Microsoft Corporation, Redmond, WA), is presented which displays 3-D oblique plots of a conserved scalar distribution downstream of jets mixing with a confined crossflow, for a single row, double rows, or opposed rows of jets with or without flow area convergence and/or a non-uniform crossflow scalar distribution. This project used a previously developed empirical model of jets mixing in a confined crossflow to create a Microsoft Excel spreadsheet that can output the profiles of a conserved scalar for jets injected into a confined crossflow given several input variables. The program uses multiple spreadsheets in a single Microsoft Excel notebook to carry out the modeling. The first sheet contains the main program, controls for the type of problem to be solved, and convergence criteria. The first sheet also provides for input of the specific geometry and flow conditions. The second sheet presents the results calculated with this routine to show the effects on the mixing of varying flow and geometric parameters. Comparisons are also made between results from the version of the empirical correlations implemented in the spreadsheet and the versions originally written in Applesoft BASIC (Apple Computer, Cupertino, CA) in the 1980s.
Li, Su; Lee, Kang; Zhao, Jing; Yang, Zhi; He, Sheng; Weng, Xuchu
2013-01-01
Little is known about the impact of learning to read on early neural development for word processing and its collateral effects on neural development in non-word domains. Here, we examined the effect of early exposure to reading on neural responses to both word and face processing in preschool children with the use of the Event Related Potential (ERP) methodology. We specifically linked children’s reading experience (indexed by their sight vocabulary) to two major neural markers: the amplitude differences between the left and right N170 on the bilateral posterior scalp sites and the hemispheric spectrum power differences in the γ band on the same scalp sites. The results showed that the left-lateralization of both the word N170 and the spectrum power in the γ band were significantly positively related to vocabulary. In contrast, vocabulary and the word left-lateralization both had a strong negative direct effect on the face right-lateralization. Also, vocabulary negatively correlated with the right-lateralized face spectrum power in the γ band even after the effects of age and the word spectrum power were partialled out. The present study provides direct evidence regarding the role of reading experience in the neural specialization of word and face processing above and beyond the effect of maturation. The present findings taken together suggest that the neural development of visual word processing competes with that of face processing before the process of neural specialization has been consolidated. PMID:23462239
Taken out of Context: Differential Processing in Contextual and Isolated Word Reading
ERIC Educational Resources Information Center
Martin-Chang, Sandra; Levesque, Kyle
2013-01-01
Three experiments are reported that investigate the cognitive processes underlying contextual and isolated word reading. In Phase 1, undergraduate participants were exposed to 75 target words under three conditions. The participants generated 25 words from definitions, read 25 words in context and read 25 in isolation. In Phase 2, volunteers…
Intermediate-sized natural gas fueled carbonate fuel cell power plants
NASA Astrophysics Data System (ADS)
Sudhoff, Frederick A.; Fleming, Donald K.
1994-04-01
This executive summary of the report describes the accomplishments of the joint US Department of Energy's (DOE) Morgantown Energy Technology Center (METC) and M-C POWER Corporation's Cooperative Research and Development Agreement (CRADA) No. 93-013. This study addresses the intermediate power plant size between 2 megawatts (MW) and 200 MW. A 25 MW natural-gas-fueled carbonate fuel cell power plant was chosen for this purpose. In keeping with recent designs, the fuel cell will operate under approximately three atmospheres of pressure. An expander/alternator is utilized to expand exhaust gas to atmospheric conditions and generate additional power. A steam-bottoming cycle is not included in this study because it is not believed to be cost effective for this system size. This study also compares the simplicity and accuracy of a spreadsheet-based simulation with those of a full Advanced System for Process Engineering (ASPEN) simulation. The simple spreadsheet model can be run entirely on a personal computer. This model can be made available to all users and is particularly advantageous to the small business user.
Gallegos, Tanya J.; Varela, Brian A.
2015-01-01
Comprehensive, published, and publicly available data regarding the extent, location, and character of hydraulic fracturing in the United States are scarce. The objective of this data series is to publish data related to hydraulic fracturing in the public domain. The spreadsheets released with this data series contain derivative datasets aggregated temporally and spatially from the commercial and proprietary IHS database of U.S. oil and gas production and well data (IHS Energy, 2011). These datasets, served in 21 spreadsheets in Microsoft Excel (.xlsx) format, outline the geographical distributions of hydraulic fracturing treatments and associated wells (including well drill-hole directions) as well as water volumes, proppants, treatment fluids, and additives used in hydraulic fracturing treatments in the United States from 1947 through 2010. This report also describes the data extraction/aggregation processing steps, field names and descriptions, and field types and sources. An associated scientific investigation report (Gallegos and Varela, 2014) provides a detailed analysis of the data presented in this data series and comparisons of the data and trends to the literature.
Mano, Junichi; Shigemitsu, Natsuki; Futo, Satoshi; Akiyama, Hiroshi; Teshima, Reiko; Hino, Akihiro; Furui, Satoshi; Kitta, Kazumi
2009-01-14
We developed a novel type of real-time polymerase chain reaction (PCR) array with TaqMan chemistry as a platform for the comprehensive and semiquantitative detection of genetically modified (GM) crops. Thirty primer-probe sets for the specific detection of GM lines, recombinant DNA (r-DNA) segments, endogenous reference genes, and donor organisms were synthesized, and a 96-well PCR plate was prepared with a different primer-probe set in each well as the real-time PCR array. The specificity and sensitivity of the array were evaluated. A comparative analysis of the data with publicly available information on GM crops approved in Japan allowed us to assess the possibility of unapproved GM crop contamination. Furthermore, we designed a Microsoft Excel spreadsheet application, Unapproved GMO Checker version 2.01, which helps process all the data of real-time PCR arrays for easy assessment of unapproved GM crop contamination. The spreadsheet is available free of charge at http://cse.naro.affrc.go.jp/jmano/index.html .
Mass-balance measurements in Alaska and suggestions for simplified observation programs
Trabant, D.C.; March, R.S.
1999-01-01
US Geological Survey glacier fieldwork in Alaska includes repeated measurements, corrections for leaning or bending stakes, an ability to reliably measure seasonal snow as deep as 10 m, absolute identification of summer surfaces in the accumulation area, and annual evaluation of internal accumulation, internal ablation, and glacier-thickness changes. Prescribed field measurement and note-taking techniques help eliminate field errors and expedite the interpretative process. In the office, field notes are transferred to computerized spreadsheets for analysis, release on the World Wide Web, and archival storage. The spreadsheets have error traps to help eliminate note-taking and transcription errors. Rigorous error analysis ends when mass-balance measurements are extrapolated and integrated with area to determine glacier and basin mass balances. Unassessable errors in the glacier and basin mass-balance data reduce the value of the data set for correlations with climate change indices. The minimum glacier mass-balance program has at least three measurement sites on a glacier, and the measurements must include the seasonal components of mass balance as well as the annual balance.
The influence of contextual diversity on eye movements in reading.
Plummer, Patrick; Perea, Manuel; Rayner, Keith
2014-01-01
Recent research has shown contextual diversity (i.e., the number of passages in which a given word appears) to be a reliable predictor of word processing difficulty. It has also been demonstrated that word-frequency has little or no effect on word recognition speed when accounting for contextual diversity in isolated word processing tasks. An eye-movement experiment was conducted wherein the effects of word-frequency and contextual diversity were directly contrasted in a normal sentence reading scenario. Subjects read sentences with embedded target words that varied in word-frequency and contextual diversity. All 1st-pass and later reading times were significantly longer for words with lower contextual diversity compared to words with higher contextual diversity when controlling for word-frequency and other important lexical properties. Furthermore, there was no difference in reading times for higher frequency and lower frequency words when controlling for contextual diversity. The results confirm prior findings regarding contextual diversity and word-frequency effects and demonstrate that contextual diversity is a more accurate predictor of word processing speed than word-frequency within a normal reading task. (PsycINFO Database Record (c) 2013 APA, all rights reserved).
Matlab-Excel Interface for OpenDSS
DOE Office of Scientific and Technical Information (OSTI.GOV)
The software allows users of the OpenDSS grid modeling software to access their load flow models through a GUI developed in MATLAB. The circuit definitions are entered into a Microsoft Excel spreadsheet, which makes circuit creation and editing a much simpler process than with the basic text-based editors used in the native OpenDSS interface. Plot tools have been developed which can be accessed through a MATLAB GUI once the desired parameters have been simulated.
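The record does not describe the spreadsheet layout or the conversion code, so the following Python sketch only illustrates the general pattern: read circuit element definitions from an Excel sheet (here with the pandas library) and emit one text definition per row for OpenDSS to consume. The workbook name, column names, and the exact element syntax written out are assumptions for illustration, not the tool's actual format.

    # Illustrative conversion of spreadsheet rows into OpenDSS-style text lines.
    # Workbook layout, column names, and element syntax are assumed, not the tool's.
    import pandas as pd

    lines = pd.read_excel("circuit.xlsx", sheet_name="Lines")   # hypothetical workbook

    with open("circuit_lines.dss", "w") as out:
        for _, row in lines.iterrows():
            out.write(
                f"New Line.{row['Name']} Bus1={row['Bus1']} Bus2={row['Bus2']} "
                f"Length={row['Length_km']} units=km\n"
            )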
Strategy And The Spreadsheet: Optimizing The Total Army To Satisfy Both
2016-02-11
historically reduces military end strength at the conclusion of major conflicts. The Budget Control Act of 2011 imposed sequestration spending limits on...the military that began the process of drawing down the military through fiscal year 2021. While the 2016 defense budget delays sequestration cuts... budget by a wide margin, has started repeating a historical cycle of budget driven defense cuts. The Army’s large force represents an attractive
Moffat, Michael; Siakaluk, Paul D; Sidhu, David M; Pexman, Penny M
2015-04-01
It has been proposed that much of conceptual knowledge is acquired through situated conceptualization, such that both external (e.g., agents, objects, events) and internal (e.g., emotions, introspections) environments are considered important (Barsalou, 2003). To evaluate this proposal, we characterized two dimensions by which situated conceptualization may be measured and which should have different relevance for abstract and concrete concepts; namely, emotional experience (i.e., the ease with which words evoke emotional experience; Newcombe, Campbell, Siakaluk, & Pexman, 2012) and context availability (i.e., the ease with which words evoke contexts in which their referents may appear; Schwanenflugel & Shoben, 1983). We examined the effects of these two dimensions on abstract and concrete word processing in verbal semantic categorization (VSCT) and naming tasks. In the VSCT, emotional experience facilitated processing of abstract words but inhibited processing of concrete words, whereas context availability facilitated processing of both types of words. In the naming task in which abstract words and concrete words were not blocked by emotional experience, context availability facilitated responding to only the abstract words. In the naming task in which abstract words and concrete words were blocked by emotional experience, emotional experience facilitated responding to only the abstract words, whereas context availability facilitated responding to only the concrete words. These results were observed even with several lexical (e.g., frequency, age of acquisition) and semantic (e.g., concreteness, arousal, valence) variables included in the analyses. As such, the present research suggests that emotional experience and context availability tap into different aspects of situated conceptualization and make unique contributions to the representation and processing of abstract and concrete concepts.
Flaisch, Tobias; Imhof, Martin; Schmälzle, Ralf; Wentz, Klaus-Ulrich; Ibach, Bernd; Schupp, Harald T
2015-01-01
The present study utilized functional magnetic resonance imaging (fMRI) to examine the neural processing of concurrently presented emotional stimuli under varying explicit and implicit attention demands. Specifically, in separate trials, participants indicated the category of either pictures or words. The words were placed over the center of the pictures and the picture-word compound-stimuli were presented for 1500 ms in a rapid event-related design. The results reveal pronounced main effects of task and emotion: the picture categorization task prompted strong activations in visual, parietal, temporal, frontal, and subcortical regions; the word categorization task evoked increased activation only in left extrastriate cortex. Furthermore, beyond replicating key findings regarding emotional picture and word processing, the results point to a dissociation of semantic-affective and sensory-perceptual processes for words: while emotional words engaged semantic-affective networks of the left hemisphere regardless of task, the increased activity in left extrastriate cortex associated with explicitly attending to words was diminished when the word was overlaid over an erotic image. Finally, we observed a significant interaction between Picture Category and Task within dorsal visual-associative regions, inferior parietal, and dorsolateral, and medial prefrontal cortices: during the word categorization task, activation was increased in these regions when the words were overlaid over erotic as compared to romantic pictures. During the picture categorization task, activity in these areas was relatively decreased when categorizing erotic as compared to romantic pictures. Thus, the emotional intensity of the pictures strongly affected brain regions devoted to the control of task-related word or picture processing. These findings are discussed with respect to the interplay of obligatory stimulus processing with task-related attentional control mechanisms.
Flaisch, Tobias; Imhof, Martin; Schmälzle, Ralf; Wentz, Klaus-Ulrich; Ibach, Bernd; Schupp, Harald T.
2015-01-01
The present study utilized functional magnetic resonance imaging (fMRI) to examine the neural processing of concurrently presented emotional stimuli under varying explicit and implicit attention demands. Specifically, in separate trials, participants indicated the category of either pictures or words. The words were placed over the center of the pictures and the picture-word compound-stimuli were presented for 1500 ms in a rapid event-related design. The results reveal pronounced main effects of task and emotion: the picture categorization task prompted strong activations in visual, parietal, temporal, frontal, and subcortical regions; the word categorization task evoked increased activation only in left extrastriate cortex. Furthermore, beyond replicating key findings regarding emotional picture and word processing, the results point to a dissociation of semantic-affective and sensory-perceptual processes for words: while emotional words engaged semantic-affective networks of the left hemisphere regardless of task, the increased activity in left extrastriate cortex associated with explicitly attending to words was diminished when the word was overlaid over an erotic image. Finally, we observed a significant interaction between Picture Category and Task within dorsal visual-associative regions, inferior parietal, and dorsolateral, and medial prefrontal cortices: during the word categorization task, activation was increased in these regions when the words were overlaid over erotic as compared to romantic pictures. During the picture categorization task, activity in these areas was relatively decreased when categorizing erotic as compared to romantic pictures. Thus, the emotional intensity of the pictures strongly affected brain regions devoted to the control of task-related word or picture processing. These findings are discussed with respect to the interplay of obligatory stimulus processing with task-related attentional control mechanisms. PMID:26733895
Cross-language parafoveal semantic processing: Evidence from Korean-Chinese bilinguals.
Wang, Aiping; Yeon, Junmo; Zhou, Wei; Shu, Hua; Yan, Ming
2016-02-01
In the present study, we aimed at testing cross-language cognate and semantic preview effects. We tested how native Korean readers who learned Chinese as a second language make use of parafoveal information during the reading of Chinese sentences. There were 3 types of Korean preview words: cognate translations of the Chinese target words, semantically related noncognate words, and unrelated words. Together with a highly significant cognate preview effect, more critically, we also observed reliable facilitation in processing of the target word from the semantically related previews in all fixation measures. Results from the present study provide the first evidence for semantic processing of parafoveally presented Korean words and for cross-language parafoveal semantic processing.
ERIC Educational Resources Information Center
Aboud, Katherine S.; Bailey, Stephen K.; Petrill, Stephen A.; Cutting, Laurie E.
2016-01-01
Skilled reading depends on recognizing words efficiently in isolation ("word-level processing"; "WL") and extracting meaning from text ("discourse-level processing"; "DL"); deficiencies in either result in poor reading. FMRI has revealed consistent overlapping networks in word and passage reading, as well as…
Morphological Processing during Visual Word Recognition in Hebrew as a First and a Second Language
ERIC Educational Resources Information Center
Norman, Tal; Degani, Tamar; Peleg, Orna
2017-01-01
The present study examined whether sublexical morphological processing takes place during visual word-recognition in Hebrew, and whether morphological decomposition of written words depends on lexical activation of the complete word. Furthermore, it examined whether morphological processing is similar when reading Hebrew as a first language (L1)…
Information content versus word length in random typing
NASA Astrophysics Data System (ADS)
Ferrer-i-Cancho, Ramon; Moscoso del Prado Martín, Fermín
2011-12-01
Recently, it has been claimed that a linear relationship between a measure of information content and word length is expected from word length optimization and it has been shown that this linearity is supported by a strong correlation between information content and word length in many languages (Piantadosi et al 2011 Proc. Nat. Acad. Sci. 108 3825). Here, we study in detail some connections between this measure and standard information theory. The relationship between the measure and word length is studied for the popular random typing process where a text is constructed by pressing keys at random from a keyboard containing letters and a space behaving as a word delimiter. Although this random process does not optimize word lengths according to information content, it exhibits a linear relationship between information content and word length. The exact slope and intercept are presented for three major variants of the random typing process. A strong correlation between information content and word length can simply arise from the units making a word (e.g., letters) and not necessarily from the interplay between a word and its context as proposed by Piantadosi and co-workers. In itself, the linear relation does not entail the results of any optimization process.
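As a rough Monte Carlo sketch of the random typing process analyzed above (not the authors' analytical derivation; the 4-letter keyboard and all parameters below are illustrative), the following Python snippet generates a long "monkey text" and shows that a word's empirical information content grows roughly linearly with its length:

import math
import random
from collections import Counter

random.seed(0)
ALPHABET = "abcd"            # hypothetical 4-letter keyboard
KEYS = ALPHABET + " "        # the space bar acts as the word delimiter

# "Monkey text": every keystroke is drawn uniformly at random from the 5 keys.
text = "".join(random.choice(KEYS) for _ in range(2_000_000))
words = [w for w in text.split(" ") if w]        # drop empties from repeated spaces

counts = Counter(words)
total = sum(counts.values())

# Information content of a word type = -log2 of its empirical probability.
# Average it within each word length and check the (approximately) linear trend.
by_length = {}
for w, c in counts.items():
    by_length.setdefault(len(w), []).append(-math.log2(c / total))

for length in sorted(by_length):
    if length <= 6:
        mean_ic = sum(by_length[length]) / len(by_length[length])
        print(length, round(mean_ic, 2))
# Expected output: roughly 4.3, 6.6, 9.0, 11.3, 13.6, 15.9 bits, i.e. an increase
# of about log2(5) ~= 2.32 bits per extra letter, although nothing in the process
# optimizes word length for information content.

With 4 letters plus a space, each extra letter multiplies a word's probability by 1/5, so the slope is log2(5) bits per letter; this informally mirrors the exact slope and intercept results derived in the paper for the random typing variants.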
Velan, Hadas; Frost, Ram
2010-01-01
Recent studies suggest that basic effects which are markers of visual word recognition in Indo-European languages cannot be obtained in Hebrew or in Arabic. Although Hebrew has an alphabetic writing system, just like English, French, or Spanish, a series of studies consistently suggested that neither simple form-orthographic priming nor letter-transposition priming is found in Hebrew. In four experiments, we tested the hypothesis that this is due to the fact that Semitic words have an underlying structure that constrains the possible alignment of phonemes and their respective letters. The experiments contrasted typical Semitic words, which are root-derived, with Hebrew words of non-Semitic origin, which are morphologically simple and resemble base words in European languages. Using RSVP, TL priming, and form-priming manipulations, we show that Hebrew readers process morphologically simple Hebrew words similarly to the way they process English words. These words indeed reveal the typical form-priming and TL priming effects reported in European languages. In contrast, words with internal structure are processed differently, and require a different code for lexical access. We discuss the implications of these findings for current models of visual word recognition. PMID:21163472
Intrinsically organized network for word processing during the resting state.
Zhao, Jizheng; Liu, Jiangang; Li, Jun; Liang, Jimin; Feng, Lu; Ai, Lin; Lee, Kang; Tian, Jie
2011-01-03
Neural mechanisms underlying word processing have been extensively studied. It has been revealed that when individuals are engaged in active word processing, a complex network of cortical regions is activated. However, it is entirely unknown whether the word-processing regions are intrinsically organized without any explicit processing tasks during the resting state. The present study investigated the intrinsic functional connectivity between word-processing regions during the resting state with the use of fMRI methodology. The low-frequency fluctuations were observed between the left middle fusiform gyrus and a number of cortical regions. They included the left angular gyrus, left supramarginal gyrus, bilateral pars opercularis, and left pars triangularis of the inferior frontal gyrus, which have been implicated in phonological and semantic processing. Additionally, the activations were also observed in the bilateral superior parietal lobule and dorsal lateral prefrontal cortex, which have been suggested to provide top-down monitoring on the visual-spatial processing of words. The findings of our study indicate an intrinsically organized network during the resting state that likely prepares the visual system to anticipate the highly probable word input for ready and effective processing. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
ERIC Educational Resources Information Center
Reinke, Karen; Fernandes, Myra; Schwindt, Graeme; O'Craven, Kathleen; Grady, Cheryl L.
2008-01-01
The functional specificity of the brain region known as the Visual Word Form Area (VWFA) was examined using fMRI. We explored whether this area serves a general role in processing symbolic stimuli, rather than being selective for the processing of words. Brain activity was measured during a visual 1-back task to English words, meaningful symbols…
Resting state neural networks for visual Chinese word processing in Chinese adults and children.
Li, Ling; Liu, Jiangang; Chen, Feiyan; Feng, Lu; Li, Hong; Tian, Jie; Lee, Kang
2013-07-01
This study examined the resting state neural networks for visual Chinese word processing in Chinese children and adults. Both the functional connectivity (FC) and amplitude of low frequency fluctuation (ALFF) approaches were used to analyze the fMRI data collected when Chinese participants were not engaged in any specific explicit tasks. We correlated time series extracted from the visual word form area (VWFA) with those in other regions in the brain. We also performed ALFF analysis in the resting state FC networks. The FC results revealed that, regarding the functionally connected brain regions, there exist similar intrinsically organized resting state networks for visual Chinese word processing in adults and children, suggesting that such networks may already be functional after 3-4 years of informal exposure to reading plus 3-4 years formal schooling. The ALFF results revealed that children appear to recruit more neural resources than adults in generally reading-irrelevant brain regions. Differences between child and adult ALFF results suggest that children's intrinsic word processing network during the resting state, though similar in functional connectivity, is still undergoing development. Further exposure to visual words and experience with reading are needed for children to develop a mature intrinsic network for word processing. The developmental course of the intrinsically organized word processing network may parallel that of the explicit word processing network. Copyright © 2013 Elsevier Ltd. All rights reserved.
Simulation of axonal excitability using a Spreadsheet template created in Microsoft Excel.
Brown, A M
2000-08-01
The objective of the present study was to implement an established simulation protocol (A.M. Brown, A methodology for simulating biological systems using Microsoft Excel, Comp. Methods Prog. Biomed. 58 (1999) 181-90) to model axonal excitability. The simulation protocol involves the use of in-cell formulas directly typed into a spreadsheet and does not require any programming skills or use of the macro language. Once the initial spreadsheet template has been set up, the simulations described in this paper can be executed with a few simple keystrokes. The model axon contained voltage-gated ion channels that were modeled using Hodgkin-Huxley-style kinetics. The basic properties of axonal excitability modeled were: (1) threshold of action potential firing, demonstrating that not only are the stimulus amplitude and duration critical in the generation of an action potential, but also the resting membrane potential; (2) refractoriness, the phenomenon of reduced excitability immediately following an action potential. The difference between the absolute refractory period, when no amount of stimulus will elicit an action potential, and relative refractory period, when an action potential may be generated by applying increased stimulus, was demonstrated with regard to the underlying state of the Na(+) and K(+) channels; (3) temporal summation, a process by which two sub-threshold stimuli can unite to elicit an action potential, was shown to be due to conductance changes outlasting the first stimulus and summing with the second stimulus-induced conductance changes to drive the membrane potential past threshold; (4) anode break excitation, where membrane hyperpolarization was shown to produce an action potential by removing Na(+) channel inactivation that is present at resting membrane potential. The simulations described in this paper provide insights into mechanisms of axonal excitation that can be carried out by following an easily understood protocol.
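The Excel template itself is not reproduced in the abstract. As a generic sketch of the same idea in Python (standard textbook Hodgkin-Huxley squid-axon parameters and simple forward-Euler stepping, where each loop iteration plays the role of one spreadsheet row; this is not the author's template), threshold behavior can be explored as follows:

import math

# Generic textbook Hodgkin-Huxley squid-axon parameters (not necessarily the
# values used in the Excel template described above).
C_M = 1.0                                # membrane capacitance, uF/cm^2
G_NA, G_K, G_L = 120.0, 36.0, 0.3        # maximal conductances, mS/cm^2
E_NA, E_K, E_L = 50.0, -77.0, -54.387    # reversal potentials, mV

def rates(v):
    """Voltage-dependent opening/closing rates for the m, h and n gates (1/ms)."""
    a_m = 0.1 * (v + 40.0) / (1.0 - math.exp(-(v + 40.0) / 10.0))
    b_m = 4.0 * math.exp(-(v + 65.0) / 18.0)
    a_h = 0.07 * math.exp(-(v + 65.0) / 20.0)
    b_h = 1.0 / (1.0 + math.exp(-(v + 35.0) / 10.0))
    a_n = 0.01 * (v + 55.0) / (1.0 - math.exp(-(v + 55.0) / 10.0))
    b_n = 0.125 * math.exp(-(v + 65.0) / 80.0)
    return a_m, b_m, a_h, b_h, a_n, b_n

def simulate(i_stim=10.0, t_on=5.0, t_off=10.0, t_end=30.0, dt=0.01):
    """Forward-Euler integration; each iteration plays the role of one
    spreadsheet row whose cells reference the row directly above."""
    v, m, h, n = -65.0, 0.053, 0.596, 0.317     # approximate resting steady state
    trace = []
    t = 0.0
    while t < t_end:
        a_m, b_m, a_h, b_h, a_n, b_n = rates(v)
        i_na = G_NA * m ** 3 * h * (v - E_NA)
        i_k = G_K * n ** 4 * (v - E_K)
        i_l = G_L * (v - E_L)
        i_ext = i_stim if t_on <= t < t_off else 0.0    # square current pulse
        v += dt * (i_ext - i_na - i_k - i_l) / C_M
        m += dt * (a_m * (1.0 - m) - b_m * m)
        h += dt * (a_h * (1.0 - h) - b_h * h)
        n += dt * (a_n * (1.0 - n) - b_n * n)
        trace.append((t, v))
        t += dt
    return trace

if __name__ == "__main__":
    peak = max(v for _, v in simulate())
    print("peak membrane potential (mV):", round(peak, 1))  # well above 0 mV: an action potential fired

Lowering i_stim below threshold, shortening the pulse, or delivering a second pulse shortly after the first reproduces, qualitatively, the threshold, refractoriness, and summation behaviors listed in the abstract.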
Processing Academic Language through Four Corners Vocabulary Chart Applications
ERIC Educational Resources Information Center
Smith, Sarah; Sanchez, Claudia; Betty, Sharon; Davis, Shiloh
2016-01-01
4 Corners Vocabulary Charts (FCVCs) are explored as a multipurpose vehicle for processing academic language in a 5th-grade classroom. FCVCs typically display a vocabulary word, an illustration of the word, synonyms associated with the word, a sentence using a given vocabulary word, and a definition of the term in students' words. The use of…
Using the Word Processor in Writing Groups.
ERIC Educational Resources Information Center
Melia, Josie
Writing groups can use word processors or microcomputers in many different types of writing activities. Four hour-long sessions at a word processor with the help of a skilled word processing tutor have been found to be sufficient to provide a working knowledge of word processing. When two or three students enrolled in a writing class are assigned…
The Influence of Contextual Diversity on Eye Movements in Reading
ERIC Educational Resources Information Center
Plummer, Patrick; Perea, Manuel; Rayner, Keith
2014-01-01
Recent research has shown contextual diversity (i.e., the number of passages in which a given word appears) to be a reliable predictor of word processing difficulty. It has also been demonstrated that word-frequency has little or no effect on word recognition speed when accounting for contextual diversity in isolated word processing tasks. An…
Rotation Reveals the Importance of Configural Cues in Handwritten Word Perception
Barnhart, Anthony S.; Goldinger, Stephen D.
2013-01-01
A dramatic perceptual asymmetry occurs when handwritten words are rotated 90° in either direction. Those rotated in a direction consistent with their natural tilt (typically clockwise) become much more difficult to recognize, relative to those rotated in the opposite direction. In Experiment 1, we compared computer-printed and handwritten words, all equated for degrees of leftward and rightward tilt, and verified the phenomenon: The effect of rotation was far larger for cursive words, especially when rotated in a tilt-consistent direction. In Experiment 2, we replicated this pattern with all items presented in visual noise. In both experiments, word frequency effects were larger for computer-printed words and did not interact with rotation. The results suggest that handwritten word perception requires greater configural processing, relative to computer print, because handwritten letters are variable and ambiguous. When words are rotated, configural processing suffers, particularly when rotation exaggerates natural tilt. Our account is similar to theories of the “Thatcher Illusion,” wherein face inversion disrupts holistic processing. Together, the findings suggest that configural, word-level processing automatically increases when people read handwriting, as letter-level processing becomes less reliable. PMID:23589201
Interpreting Chicken-Scratch: Lexical Access for Handwritten Words
Barnhart, Anthony S.; Goldinger, Stephen D.
2014-01-01
Handwritten word recognition is a field of study that has largely been neglected in the psychological literature, despite its prevalence in society. Whereas studies of spoken word recognition almost exclusively employ natural, human voices as stimuli, studies of visual word recognition use synthetic typefaces, thus simplifying the process of word recognition. The current study examined the effects of handwriting on a series of lexical variables thought to influence bottom-up and top-down processing, including word frequency, regularity, bidirectional consistency, and imageability. The results suggest that the natural physical ambiguity of handwritten stimuli forces a greater reliance on top-down processes, because almost all effects were magnified, relative to conditions with computer print. These findings suggest that processes of word perception naturally adapt to handwriting, compensating for physical ambiguity by increasing top-down feedback. PMID:20695708
ERIC Educational Resources Information Center
Morocco, Catherine Cobb; And Others
The 2-year study investigated the use of word processing technology with 36 learning disabled (LD) intermediate grade children and 9 remedial teachers in five Massachusetts school districts. During the first year study staff documented how word processing was being used. In the second year, word processing activities hypothesized to be the most…
A New Perspective on Visual Word Processing Efficiency
Houpt, Joseph W.; Townsend, James T.; Donkin, Christopher
2013-01-01
As a fundamental part of our daily lives, visual word processing has received much attention in the psychological literature. Despite the well established advantage of perceiving letters in a word or in a pseudoword over letters alone or in random sequences using accuracy, a comparable effect using response times has been elusive. Some researchers continue to question whether the advantage due to word context is perceptual. We use the capacity coefficient, a well established, response time based measure of efficiency to provide evidence of word processing as a particularly efficient perceptual process to complement those results from the accuracy domain. PMID:24334151
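For readers unfamiliar with the measure named above, the standard (OR) capacity coefficient of Townsend and Nozawa compares the integrated hazard for processing a whole stimulus with the sum of the integrated hazards for its parts processed alone; the exact variant applied to words and letters in this study may differ, so the following is background notation only:

\[
H_X(t) = -\ln S_X(t), \qquad
C_{\mathrm{OR}}(t) = \frac{H_{AB}(t)}{H_{A}(t) + H_{B}(t)}
\]

Here S_X(t) is the survivor function of response times for condition X, H_AB(t) is the integrated hazard for the whole configuration (e.g., the word), and H_A(t), H_B(t) are those for the parts (e.g., individual letters). C_OR(t) > 1 indicates super capacity, C_OR(t) = 1 unlimited capacity relative to an independent parallel baseline, and C_OR(t) < 1 limited capacity.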
Is Word Shape Still in Poor Shape for the Race to the Lexicon?
ERIC Educational Resources Information Center
Hill, Jessica C.
2010-01-01
Current models of normal reading behavior emphasize not only the recognition and processing of the word being fixated (n) but also processing of the upcoming parafoveal word (n + 1). Gaze contingent displays employing the boundary paradigm often mask words in order to understand how much and what type of processing is completed on the parafoveal…
Gwilliams, L; Marantz, A
2015-08-01
Although the significance of morphological structure is established in visual word processing, its role in auditory processing remains unclear. Using magnetoencephalography we probe the significance of the root morpheme for spoken Arabic words with two experimental manipulations. First we compare a model of auditory processing that calculates probable lexical outcomes based on whole-word competitors, versus a model that only considers the root as relevant to lexical identification. Second, we assess violations to the root-specific Obligatory Contour Principle (OCP), which disallows root-initial consonant gemination. Our results show root prediction to significantly correlate with neural activity in superior temporal regions, independent of predictions based on whole-word competitors. Furthermore, words that violated the OCP constraint were significantly easier to dismiss as valid words than probability-matched counterparts. The findings suggest that lexical auditory processing is dependent upon morphological structure, and that the root forms a principal unit through which spoken words are recognised. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
Bridgers, Franca Ferrari; Kacinik, Natalie
2017-02-01
The majority of words in most languages consist of derived poly-morphemic words but a cross-linguistic review of the literature (Amenta and Crepaldi in Front Psychol 3:232-243, 2012) shows a contradictory picture with respect to how such words are represented and processed. The current study examined the effects of linearity and structural complexity on the processing of Italian derived words. Participants performed a lexical decision task on three types of prefixed and suffixed words and nonwords differing in the complexity of their internal structure. The processing of these words was indeed found to vary according to the nature of the affixes, the order in which they appear, and the type of information the affix encodes. The results thus indicate that derived words are not a uniform class and the best account of these findings appears to be a constraint-based or probabilistic multi-route processing model (e.g., Kuperman et al. in Lang Cogn Process 23:1089-1132, 2008; J Exp Psychol Hum Percept Perform 35:876-895, 2009; J Mem Lang 62:83-97, 2010).
How Sound Symbolism Is Processed in the Brain: A Study on Japanese Mimetic Words
Okuda, Jiro; Okada, Hiroyuki; Matsuda, Tetsuya
2014-01-01
Sound symbolism is the systematic and non-arbitrary link between word and meaning. Although a number of behavioral studies demonstrate that both children and adults are universally sensitive to sound symbolism in mimetic words, the neural mechanisms underlying this phenomenon have not yet been extensively investigated. The present study used functional magnetic resonance imaging to investigate how Japanese mimetic words are processed in the brain. In Experiment 1, we compared processing for motion mimetic words with that for non-sound symbolic motion verbs and adverbs. Mimetic words uniquely activated the right posterior superior temporal sulcus (STS). In Experiment 2, we further examined the generalizability of the findings from Experiment 1 by testing another domain: shape mimetics. Our results show that the right posterior STS was active when subjects processed both motion and shape mimetic words, thus suggesting that this area may be the primary structure for processing sound symbolism. Increased activity in the right posterior STS may also reflect how sound symbolic words function as both linguistic and non-linguistic iconic symbols. PMID:24840874
Fussell, Nicola J; Rowe, Angela C; Mohr, Christine
2012-01-01
The reliance in experimental psychology on testing undergraduate populations with relatively little life experience, and/or ambiguously valenced stimuli with varying degrees of self-relevance, may have contributed to inconsistent findings in the literature on the valence hypothesis. To control for these potential limitations, the current study assessed lateralised lexical decisions for positive and negative attachment words in 40 middle-aged male and female participants. Self-relevance was manipulated in two ways: by testing currently married compared with previously married individuals and by assessing self-relevance ratings individually for each word. Results replicated a left hemisphere advantage for lexical decisions and a processing advantage of emotional over neutral words but did not support the valence hypothesis. Positive attachment words yielded a processing advantage over neutral words in the right hemisphere, while emotional words (irrespective of valence) yielded a processing advantage over neutral words in the left hemisphere. Both self-relevance manipulations were unrelated to lateralised performance. The roles of participant sex and age in emotion processing are discussed as potential modulators of the present findings.
The effects of sad prosody on hemispheric specialization for words processing.
Leshem, Rotem; Arzouan, Yossi; Armony-Sivan, Rinat
2015-06-01
This study examined the effect of sad prosody on hemispheric specialization for word processing using behavioral and electrophysiological measures. A dichotic listening task combining focused attention and signal-detection methods was conducted to evaluate the detection of a word spoken in neutral or sad prosody. An overall right ear advantage together with leftward lateralization in early (150-170 ms) and late (240-260 ms) processing stages was found for word detection, regardless of prosody. Furthermore, the early stage was most pronounced for words spoken in neutral prosody, showing greater negative activation over the left than the right hemisphere. In contrast, the later stage was most pronounced for words spoken with sad prosody, showing greater positive activation over the left than the right hemisphere. The findings suggest that sad prosody alone was not sufficient to modulate hemispheric asymmetry in word-level processing. We posit that lateralized effects of sad prosody on word processing are largely dependent on the psychoacoustic features of the stimuli as well as on task demands. Copyright © 2015 Elsevier Inc. All rights reserved.
Organizational Linkages: Understanding the Productivity Paradox,
1994-01-01
students were asked to make a decision regarding a production scheduling. Some used a Lotus spreadsheet’s what-if capacity, which enabled them to...the degree to which managers and MBA students believed that they make better decisions using what-if spreadsheet models, despite the fact that their...for this system is Naylor et al.’s (1980) view of behavior in organizations. When Pritchard and his students (Pritchard et al., 1988) applied this
Maxine: A spreadsheet for estimating dose from chronic atmospheric radioactive releases
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jannik, Tim; Bell, Evaleigh; Dixon, Kenneth
MAXINE is an Excel® spreadsheet used to estimate dose to individuals from routine and accidental atmospheric releases of radioactive materials. MAXINE does not contain an atmospheric dispersion model; rather, doses are estimated using air and ground concentrations as input. Minimal input is required to run the program, and site-specific parameters are used when possible. A complete code description, verification of the models, and a user's manual are included.
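MAXINE's actual pathway models and parameter values are not given in the abstract; purely as an illustration of how a concentration-driven spreadsheet dose estimate is structured (every coefficient and pathway below is a hypothetical placeholder, not a value taken from MAXINE), a Python analogue might look like:

# Generic concentration-driven dose bookkeeping; all numbers are placeholders.
AIR_CONC_BQ_M3 = 2.0e-3        # time-averaged air concentration, Bq/m^3
GROUND_CONC_BQ_M2 = 5.0e1      # ground deposition, Bq/m^2

BREATHING_RATE_M3_YR = 8.0e3   # annual inhalation volume, m^3/yr
DC_INHALATION_SV_PER_BQ = 1.0e-8          # inhalation dose coefficient, Sv/Bq (placeholder)
DC_GROUNDSHINE_SV_YR_PER_BQ_M2 = 3.0e-9   # external dose-rate factor, (Sv/yr)/(Bq/m^2) (placeholder)
OCCUPANCY_FRACTION = 0.7       # fraction of the year spent on the contaminated ground

inhalation_dose_sv = AIR_CONC_BQ_M3 * BREATHING_RATE_M3_YR * DC_INHALATION_SV_PER_BQ
groundshine_dose_sv = GROUND_CONC_BQ_M2 * DC_GROUNDSHINE_SV_YR_PER_BQ_M2 * OCCUPANCY_FRACTION

total_sv = inhalation_dose_sv + groundshine_dose_sv
print(f"annual dose estimate: {total_sv * 1e6:.3f} microsievert")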
NASA Astrophysics Data System (ADS)
Fauzi, Ahmad
2017-11-01
Numerical computation has many pedagogical advantages: it develops analytical and problem-solving skills, supports learning through visualization, and enhances physics education. Unfortunately, numerical computation is not taught to undergraduate physics education students in Indonesia. Incorporating numerical computation into the undergraduate physics education curriculum presents many challenges. The main challenges are the dense curriculum, which makes it difficult to add a new numerical computation course, and the fact that most students have no programming experience. In this research, we used a case study to examine how to integrate numerical computation into the undergraduate physics education curriculum. The participants were 54 fourth-semester students of the physics education department. We conclude that numerical computation can be integrated into the undergraduate physics education curriculum by using Excel spreadsheets in combination with another course. The results of this research complement the existing literature on how to integrate numerical computation into physics learning using Excel spreadsheets.
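The abstract does not say which physics problems were implemented; as one hedged example of the row-by-row, fill-down style of spreadsheet computation such a course could use (free fall with linear drag integrated by Euler's method; all parameters are illustrative), the equivalent Python is:

# Each loop iteration mirrors one spreadsheet row: the new row is computed from
# the row directly above it, exactly as with fill-down in-cell formulas.
G = 9.81     # gravitational acceleration, m/s^2
K = 0.25     # linear drag coefficient per unit mass, 1/s (illustrative)
DT = 0.1     # time step, s

t, v, y = 0.0, 0.0, 100.0        # released from rest, 100 m above the ground
rows = [(t, y, v)]
while y > 0.0:
    a = G - K * v                # downward acceleration with linear drag
    v += a * DT
    y -= v * DT
    t += DT
    rows.append((round(t, 1), round(y, 2), round(v, 2)))

for t, y, v in rows[:5]:         # the first few "spreadsheet rows"
    print(t, y, v)
print("speed near impact:", rows[-1][2], "m/s; terminal limit G/K =", G / K, "m/s")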
NASA Astrophysics Data System (ADS)
Sokolova, Tatiana S.; Dorogokupets, Peter I.; Dymshits, Anna M.; Danilov, Boris S.; Litasov, Konstantin D.
2016-09-01
We present Microsoft Excel spreadsheets for calculation of thermodynamic functions and P-V-T properties of MgO, diamond and 9 metals, Al, Cu, Ag, Au, Pt, Nb, Ta, Mo, and W, depending on temperature and volume or temperature and pressure. The spreadsheets include the most common pressure markers used in in situ experiments with diamond anvil cell and multianvil techniques. The calculations are based on the equation of state formalism via the Helmholtz free energy. The program was developed using Visual Basic for Applications in Microsoft Excel and is a time-efficient tool to evaluate volume, pressure and other thermodynamic functions using T-P and T-V data only as input parameters. This application is aimed to solve practical issues of high pressure experiments in geosciences and mineral physics.
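The "equation of state formalism via the Helmholtz free energy" mentioned above reduces to the standard thermodynamic identities below; the material-specific parameterizations of F(V,T) used in the spreadsheets are not reproduced here:

\[
P = -\left(\frac{\partial F}{\partial V}\right)_T, \qquad
S = -\left(\frac{\partial F}{\partial T}\right)_V, \qquad
U = F + TS, \qquad
C_V = -T\left(\frac{\partial^2 F}{\partial T^2}\right)_V
\]

Once F(V,T) is parameterized for a material, pressure and the remaining thermodynamic functions follow directly from T-V input, and from T-P input after numerically inverting P(V,T) for the volume, which is what makes a spreadsheet implementation practical.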
Li, Su; Lee, Kang; Zhao, Jing; Yang, Zhi; He, Sheng; Weng, Xuchu
2013-04-01
Little is known about the impact of learning to read on early neural development for word processing and its collateral effects on neural development in non-word domains. Here, we examined the effect of early exposure to reading on neural responses to both word and face processing in preschool children with the use of the Event Related Potential (ERP) methodology. We specifically linked children's reading experience (indexed by their sight vocabulary) to two major neural markers: the amplitude differences between the left and right N170 on the bilateral posterior scalp sites and the hemispheric spectrum power differences in the γ band on the same scalp sites. The results showed that the left-lateralization of both the word N170 and the spectrum power in the γ band were significantly positively related to vocabulary. In contrast, vocabulary and the word left-lateralization both had a strong negative direct effect on the face right-lateralization. Also, vocabulary negatively correlated with the right-lateralized face spectrum power in the γ band even after the effects of age and the word spectrum power were partialled out. The present study provides direct evidence regarding the role of reading experience in the neural specialization of word and face processing above and beyond the effect of maturation. The present findings taken together suggest that the neural development of visual word processing competes with that of face processing before the process of neural specialization has been consolidated. Copyright © 2013 Elsevier Ltd. All rights reserved.
Hemispheric asymmetry in holistic processing of words.
Ventura, Paulo; Delgado, João; Ferreira, Miguel; Farinha-Fernandes, António; Guerreiro, José C; Faustino, Bruno; Leite, Isabel; Wong, Alan C-N
2018-05-13
Holistic processing has been regarded as a hallmark of face perception, indicating the automatic and obligatory tendency of the visual system to process all face parts as a perceptual unit rather than in isolation. Studies involving lateralized stimulus presentation suggest that the right hemisphere dominates holistic face processing. Holistic processing can also be shown with other categories such as words and thus it is not specific to faces or face-like expertise. Here, we used divided visual field presentation to investigate the possibly different contributions of the two hemispheres to holistic word processing. Observers performed same/different judgments on the cued parts of two sequentially presented words in the complete composite paradigm. Our data indicate a right hemisphere specialization for holistic word processing. Thus, these markers of expert object recognition are domain general.
The Effects of Test Trial and Processing Level on Immediate and Delayed Retention
Chang, Sau Hou
2017-01-01
The purpose of the present study was to investigate the effects of test trial and processing level on immediate and delayed retention. A 2 × 2 × 2 mixed ANOVA was used with two between-subject factors of test trial (single test, repeated test) and processing level (shallow, deep), and one within-subject factor of final recall (immediate, delayed). Seventy-six college students were randomly assigned first to the single test trial (studied the stimulus words three times and took one free-recall test) or the repeated test trial (studied the stimulus words once and took three consecutive free-recall tests), and then to the shallow processing level (asked whether each stimulus word was presented in capital or small letters) or the deep processing level (asked whether each stimulus word belonged to a particular category) to study forty stimulus words. The immediate test was administered five minutes after the trials, whereas the delayed test was administered one week later. Results showed that the single test trial group recalled more words than the repeated test trial group in the immediate final free-recall test, and that participants in deep processing performed better than those in shallow processing in both immediate and delayed retention. However, the dominance of the single test trial and deep processing was not observed in delayed retention. Additional study trials did not further enhance the delayed retention of words encoded under deep processing, but did enhance the delayed retention of words encoded under shallow processing. PMID:28344679
A dual-task investigation of automaticity in visual word processing
NASA Technical Reports Server (NTRS)
McCann, R. S.; Remington, R. W.; Van Selst, M.
2000-01-01
An analysis of activation models of visual word processing suggests that frequency-sensitive forms of lexical processing should proceed normally while unattended. This hypothesis was tested by having participants perform a speeded pitch discrimination task followed by lexical decisions or word naming. As the stimulus onset asynchrony between the tasks was reduced, lexical-decision and naming latencies increased dramatically. Word-frequency effects were additive with the increase, indicating that frequency-sensitive processing was subject to postponement while attention was devoted to the other task. Either (a) the same neural hardware shares responsibility for lexical processing and central stages of choice reaction time task processing and cannot perform both computations simultaneously, or (b) lexical processing is blocked in order to optimize performance on the pitch discrimination task. Either way, word processing is not as automatic as activation models suggest.
Encoding the world around us: motor-related processing influences verbal memory.
Madan, Christopher R; Singhal, Anthony
2012-09-01
It is known that properties of words such as their imageability can influence our ability to remember those words. However, it is not known if other object-related properties can also influence our memory. In this study we asked whether a word representing a concrete object that can be functionally interacted with (i.e., high-manipulability word) would enhance the memory representations for that item compared to a word representing a less manipulable object (i.e., low-manipulability word). Here participants incidentally encoded high-manipulability (e.g., CAMERA) and low-manipulability words (e.g., TABLE) while making word judgments. Using a between-subjects design, we varied the depth-of-processing involved in the word judgment task: participants judged the words based on personal experience (deep/elaborative processing), word length (shallow), or functionality (intermediate). Participants were able to remember high-manipulability words better than low-manipulability words in both the personal experience and word length groups; thus presenting the first evidence that manipulability can influence memory. However, we observed better memory for low- than high-manipulability words in the functionality group. We explain this surprising interaction between manipulability and memory as being mediated by automatic vs. controlled motor-related cognition. Copyright © 2012 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Hronusov, V. V.
2006-12-01
We suggest a method of using external public servers for rearranging, restructuring, and rapidly sharing environmental data for the purpose of quick presentations in numerous GE clients. The method adds a new philosophy for the presentation (publication) of data (mostly static) stored in the public domain (e.g., Blue Marble, Visible Earth, etc.). The new approach works by publishing freely accessible spreadsheets that contain enough information and links to the data. Because most large depositories of environmental monitoring data have a rather simple net address system as well as a simple hierarchy, mostly based on the date and type of the data, it is possible to construct the http-based link to the file that contains the data. Publication of new data on the server is recorded by simply entering a new address into a cell in the spreadsheet. At the moment we use the EditGrid (www.editgrid.com) system as a spreadsheet platform. The generation of KML code is achieved on the basis of XML data and XSLT procedures. Since the EditGrid environment supports "fetch" and similar commands, it is possible to create "smart-adaptive" KML generation on the fly based on data streams from RSS and XML sources. Previous GIS-based methods could combine high-definition data from various sources, but large-scale comparisons of dynamic processes have usually been out of reach of the technology. The suggested method allows an unlimited number of GE clients to view, review, and compare dynamic and static processes from previously un-combinable sources, and on unprecedented scales. The ease of automated or computer-assisted georeferencing has already led to the translation of about 3000 raster public domain images, as well as point and linear data sources, into GE language. In addition, the suggested method allows a user to create rapid animations to demonstrate dynamic processes; products in high demand in education, meteorology, volcanology, and potentially in a number of industries. In general, it is possible to state that the new approach, which we have tested on numerous projects, saves time and energy in creating huge amounts of georeferenced data of various kinds, and thus provides excellent tools for education and science.
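As a minimal sketch of the spreadsheet-to-KML idea described above (a hypothetical row layout of name, latitude, longitude, and data URL; the actual system uses EditGrid with XSLT rather than this Python stand-in):

from xml.sax.saxutils import escape

# Hypothetical spreadsheet rows: (name, latitude, longitude, data URL).
rows = [
    ("Station A", 61.50, -149.10, "http://example.org/data/a.png"),
    ("Station B", 58.30, -134.40, "http://example.org/data/b.png"),
]

placemarks = []
for name, lat, lon, url in rows:
    placemarks.append(
        "  <Placemark>\n"
        f"    <name>{escape(name)}</name>\n"
        f"    <description><![CDATA[<a href=\"{url}\">data</a>]]></description>\n"
        f"    <Point><coordinates>{lon},{lat},0</coordinates></Point>\n"
        "  </Placemark>"
    )

kml = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<kml xmlns="http://www.opengis.net/kml/2.2">\n<Document>\n'
    + "\n".join(placemarks)
    + "\n</Document>\n</kml>\n"
)

with open("layer.kml", "w", encoding="utf-8") as f:
    f.write(kml)

The resulting layer.kml file can be opened directly in a GE client; regenerating it whenever a new row (i.e., a new data address) is added mimics the publication step described in the abstract.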
Memory for pictures and words as a function of level of processing: Depth or dual coding?
D'Agostino, P R; O'Neill, B J; Paivio, A
1977-03-01
The experiment was designed to test differential predictions derived from dual-coding and depth-of-processing hypotheses. Subjects under incidental memory instructions free recalled a list of 36 test events, each presented twice. Within the list, an equal number of events were assigned to structural, phonemic, and semantic processing conditions. Separate groups of subjects were tested with a list of pictures, concrete words, or abstract words. Results indicated that retention of concrete words increased as a direct function of the processing-task variable (structural < phonemic < semantic).
Eye-fixation behavior, lexical storage, and visual word recognition in a split processing model.
Shillcock, R; Ellison, T M; Monaghan, P
2000-10-01
Some of the implications of a model of visual word recognition in which processing is conditioned by the anatomical splitting of the visual field between the two hemispheres of the brain are explored. The authors investigate the optimal processing of visually presented words within such an architecture, and, for a realistically sized lexicon of English, characterize a computationally optimal fixation point in reading. They demonstrate that this approach motivates a range of behavior observed in reading isolated words and text, including the optimal viewing position and its relationship with the preferred viewing location, the failure to fixate smaller words, asymmetries in hemisphere-specific processing, and the priority given to the exterior letters of words. The authors also show that split architectures facilitate the uptake of all the letter-position information necessary for efficient word recognition and that this information may be less specific than is normally assumed. A split model of word recognition captures a range of behavior in reading that is greater than that covered by existing models of visual word recognition.
Dudschig, Carolin; de la Vega, Irmgard; Kaup, Barbara
2014-05-01
Converging evidence suggests that understanding our first-language (L1) results in reactivation of experiential sensorimotor traces in the brain. Surprisingly, little is known regarding the involvement of these processes during second-language (L2) processing. Participants saw L1 or L2 words referring to entities with a typical location (e.g., star, mole) (Experiment 1 & 2) or to an emotion (e.g., happy, sad) (Experiment 3). Participants responded to the words' ink color with an upward or downward arm movement. Despite word meaning being fully task-irrelevant, L2 automatically activated motor responses similar to L1 even when L2 was acquired rather late in life (age >11). Specifically, words such as star facilitated upward, and words such as root facilitated downward responses. Additionally, words referring to positive emotions facilitated upward, and words referring to negative emotions facilitated downward responses. In summary our study suggests that reactivation of experiential traces is not limited to L1 processing. Copyright © 2014 Elsevier Inc. All rights reserved.
Parafoveal Load of Word N+1 Modulates Preprocessing Effectiveness of Word N+2 in Chinese Reading
ERIC Educational Resources Information Center
Yan, Ming; Kliegl, Reinhold; Shu, Hua; Pan, Jinger; Zhou, Xiaolin
2010-01-01
Preview benefits (PBs) from two words to the right of the fixated one (i.e., word N + 2) and associated parafoveal-on-foveal effects are critical for proposals of distributed lexical processing during reading. This experiment examined parafoveal processing during reading of Chinese sentences, using a boundary manipulation of N + 2-word preview…
ERIC Educational Resources Information Center
Eckerth, Johannes; Tavakoli, Parveneh
2012-01-01
Research on incidental second language (L2) vocabulary acquisition through reading has claimed that repeated encounters with unfamiliar words and the relative elaboration of processing these words facilitate word learning. However, so far both variables have been investigated in isolation. To help close this research gap, the current study…
Rau, Anne K; Moll, Kristina; Snowling, Margaret J; Landerl, Karin
2015-02-01
The current study investigated the time course of cross-linguistic differences in word recognition. We recorded eye movements of German and English children and adults while reading closely matched sentences, each including a target word manipulated for length and frequency. Results showed differential word recognition processes for both developing and skilled readers. Children of the two orthographies did not differ in terms of total word processing time, but this equal outcome was achieved quite differently. Whereas German children relied on small-unit processing early in word recognition, English children applied small-unit decoding only upon rereading, possibly when experiencing difficulties in integrating an unfamiliar word into the sentence context. Rather unexpectedly, cross-linguistic differences were also found in adults in that English adults showed longer processing times than German adults for nonwords. Thus, although orthographic consistency does play a major role in reading development, cross-linguistic differences are detectable even in skilled adult readers. Copyright © 2014 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Quagliato, Luca; Berti, Guido A.
2017-10-01
In this paper, a statically determined slip-line solution algorithm is proposed for the calculation of the axial forming force in the radial-axial ring rolling process of flat rings. The developed solution is implemented in an Excel spreadsheet for the construction of the slip-line field and the calculation of the pressure factor to be used in the force model. Comparison between the analytical solution and the authors' FE simulations shows that the developed model improves on previous models from the literature and demonstrates the reliability of the proposed approach.
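The force model itself is not given in the abstract; in generic slab/slip-line treatments of bulk forming, the axial force on the conical rolls is typically estimated as the product of a pressure (constraint) factor from the slip-line field, the mean flow stress, and the tool-ring contact area. A hedged, illustrative form (the symbols below are not necessarily the paper's notation):

\[
F_a \approx Q_p \, \bar{\sigma} \, A_c
\]

where Q_p is the dimensionless pressure factor obtained from the slip-line field, \bar{\sigma} is the mean flow stress of the ring material at the relevant temperature, strain, and strain rate, and A_c is the axial contact area between the conical rolls and the ring.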
Generic Modeling of a Life Support System for Process Technology Comparison
NASA Technical Reports Server (NTRS)
Ferrall, J. F.; Seshan, P. K.; Rohatgi, N. K.; Ganapathi, G. B.
1993-01-01
This paper describes a simulation model called the Life Support Systems Analysis Simulation Tool (LiSSA-ST), the spreadsheet program called the Life Support Systems Analysis Trade Tool (LiSSA-TT), and the Generic Modular Flow Schematic (GMFS) modeling technique. Results of using the LiSSA-ST and the LiSSA-TT will be presented for comparing life support system and process technology options for a Lunar Base with a crew size of 4 and mission lengths of 90 and 600 days. System configurations to minimize the life support system weight and power are explored.
Word Processing: The Air Force Administrators’ Handbook
1979-05-01
finest magazine on the market, Word Processing World. If you can't get the bucks for the Report, order Word Processing World by itself for $14/year...following publication. "The Seybold Report on Word Processing" is published monthly by Seybold Publications, Inc., Box 644, Media, Pennsylvania 19063...Avenue, New York, NY 10022. It's a lot like the Cecil book--aimed at the community college and vocational-technical school market. Well, that wraps up
When does word frequency influence written production?
Baus, Cristina; Strijkers, Kristof; Costa, Albert
2013-01-01
The aim of the present study was to explore the central (e.g., lexical processing) and peripheral processes (motor preparation and execution) underlying word production during typewriting. To do so, we tested non-professional typists in a picture typing task while continuously recording EEG. Participants were instructed to write (by means of a standard keyboard) the corresponding name for a given picture. The lexical frequency of the words was manipulated: half of the picture names were of high frequency while the remaining were of low frequency. Different measures were obtained: (1) first keystroke latency and (2) keystroke latency of the subsequent letters and duration of the word. Moreover, ERPs locked to the onset of the picture presentation were analyzed to explore the temporal course of word frequency in typewriting. The results showed an effect of word frequency for the first keystroke latency but not for the duration of the word or the speed at which letters were typed (interstroke intervals). The electrophysiological results showed the expected ERP frequency effect at posterior sites: amplitudes for low-frequency words were more positive than those for high-frequency words. However, relative to previous evidence in the spoken modality, the frequency effect appeared in a later time-window. These results demonstrate two marked differences in the processing dynamics underpinning typing compared to speaking: first, central processing dynamics between speaking and typing differ already in the manner in which words are accessed; second, central processing differences in typing, unlike speaking, do not cascade to peripheral processes involved in response execution. PMID:24399980
Don't words come easy? A psychophysical exploration of word superiority
Starrfelt, Randi; Petersen, Anders; Vangkilde, Signe
2013-01-01
Words are made of letters, and yet sometimes it is easier to identify a word than a single letter. This word superiority effect (WSE) has been observed when written stimuli are presented very briefly or degraded by visual noise. We compare performance with letters and words in three experiments, to explore the extents and limits of the WSE. Using a carefully controlled list of three letter words, we show that a WSE can be revealed in vocal reaction times even to undegraded stimuli. With a novel combination of psychophysics and mathematical modeling, we further show that the typical WSE is specifically reflected in perceptual processing speed: single words are simply processed faster than single letters. Intriguingly, when multiple stimuli are presented simultaneously, letters are perceived more easily than words, and this is reflected both in perceptual processing speed and visual short term memory (VSTM) capacity. So, even if single words come easy, there is a limit to the WSE. PMID:24027510
Wang, Jie; Wong, Andus Wing-Kuen; Chen, Hsuan-Chih
2017-06-05
The time course of phonological encoding in Mandarin monosyllabic word production was investigated by using the picture-word interference paradigm. Participants were asked to name pictures in Mandarin while visual distractor words were presented before, at, or after picture onset (i.e., stimulus-onset asynchrony/SOA = -100, 0, or +100 ms, respectively). Compared with the unrelated control, the distractors sharing atonal syllables with the picture names significantly facilitated the naming responses at -100- and 0-ms SOAs. In addition, the facilitation effect of sharing word-initial segments only appeared at 0-ms SOA, and null effects were found for sharing word-final segments. These results indicate that both syllables and subsyllabic units play important roles in Mandarin spoken word production and more critically that syllabic processing precedes subsyllabic processing. The current results lend strong support to the proximate units principle (O'Seaghdha, Chen, & Chen, 2010), which holds that the phonological structure of spoken word production is language-specific and that atonal syllables are the proximate phonological units in Mandarin Chinese. On the other hand, the significance of word-initial segments over word-final segments suggests that serial processing of segmental information seems to be universal across Germanic languages and Chinese, which remains to be verified in future studies.
Juhasz, Barbara J
2016-11-14
Recording eye movements provides information on the time-course of word recognition during reading. Juhasz and Rayner [Juhasz, B. J., & Rayner, K. (2003). Investigating the effects of a set of intercorrelated variables on eye fixation durations in reading. Journal of Experimental Psychology: Learning, Memory and Cognition, 29, 1312-1318] examined the impact of five word recognition variables, including familiarity and age-of-acquisition (AoA), on fixation durations. All variables impacted fixation durations, but the time-course differed. However, the study focused on relatively short, morphologically simple words. Eye movements are also informative for examining the processing of morphologically complex words such as compound words. The present study further examined the time-course of lexical and semantic variables during morphological processing. A total of 120 English compound words that varied in familiarity, AoA, semantic transparency, lexeme meaning dominance, sensory experience rating (SER), and imageability were selected. The impact of these variables on fixation durations was examined when length, word frequency, and lexeme frequencies were controlled in a regression model. The most robust effects were found for familiarity and AoA, indicating that a reader's experience with compound words significantly impacts compound recognition. These results provide insight into semantic processing of morphologically complex words during reading.
Computational Models of the Representation of Bangla Compound Words in the Mental Lexicon.
Dasgupta, Tirthankar; Sinha, Manjira; Basu, Anupam
2016-08-01
In this paper we aim to model the organization and processing of Bangla compound words in the mental lexicon. Our objective is to determine whether the mental lexicon accesses a Bangla compound word as a whole or decomposes the whole word into its constituent morphemes and then recognizes them accordingly. To address this issue, we adopted two different strategies. First, we conducted a cross-modal priming experiment with a number of native speakers. Analysis of reaction time (RT) and error rates indicates that, in general, Bangla compound words are accessed via a partial decomposition process. That is, some words follow a full-listing mode of representation and some follow the decomposition route of representation. Next, based on the collected RT data, we developed a computational model that can explain the processing phenomena underlying the access and representation of Bangla compound words. In order to achieve this, we first explored the individual roles of head word position, morphological complexity, orthographic transparency and semantic compositionality between the constituents and the whole compound word. Accordingly, we have developed a complexity-based model by combining these features together. To a large extent, we have successfully explained the possible processing phenomena of most of the Bangla compound words. Our proposed model shows an accuracy of around 83%.
ERIC Educational Resources Information Center
Juhasz, Barbara J.; Johnson, Rebecca L.; Brewer, Jennifer
2017-01-01
New words enter the language through several word formation processes [see Simonini ("Engl J" 55:752-757, 1966)]. One such process, blending, occurs when two source words are combined to represent a new concept (e.g., SMOG, BRUNCH, BLOG, and INFOMERCIAL). While there have been examinations of the structure of blends [see Gries…
[Electrophysiological bases of semantic processing of objects].
Kahlaoui, Karima; Baccino, Thierry; Joanette, Yves; Magnié, Marie-Noële
2007-02-01
How pictures and words are stored and processed in the human brain constitutes a long-standing question in cognitive psychology. Behavioral studies have yielded a large amount of data addressing this issue. Generally speaking, these data show that there are some interactions between the semantic processing of pictures and words. However, behavioral methods can provide only limited insight into certain findings. Fortunately, the Event-Related Potential (ERP) technique provides on-line cues about the temporal nature of cognitive processes and contributes to the exploration of their neural substrates. ERPs have been used in order to better understand the semantic processing of words and pictures. The main objective of this article is to offer an overview of the electrophysiological bases of the semantic processing of words and pictures. Studies presented in this article showed that the processing of words is associated with an N400 component, whereas pictures elicited both N300 and N400 components. Topographical analysis of the N400 distribution over the scalp is compatible with the idea that both image-mediated concrete words and pictures access an amodal semantic system. However, given the distinctive N300 patterns, observed only during picture processing, it appears that picture and word processing rely upon distinct neuronal networks, even if they end up activating more or less similar semantic representations.
Distance-dependent processing of pictures and words.
Amit, Elinor; Algom, Daniel; Trope, Yaacov
2009-08-01
A series of 8 experiments investigated the association between pictorial and verbal representations and the psychological distance of the referent objects from the observer. The results showed that people better process pictures that represent proximal objects and words that represent distal objects than pictures that represent distal objects and words that represent proximal objects. These results were obtained with various psychological distance dimensions (spatial, temporal, and social), different tasks (classification and categorization), and different measures (speed of processing and selective attention). The authors argue that differences in the processing of pictures and words emanate from the physical similarity of pictures, but not words, to the referents. Consequently, perceptual analysis is commonly applied to pictures but not to words. Pictures thus impart a sense of closeness to the referent objects and are preferably used to represent such objects, whereas words do not convey proximity and are preferably used to represent distal objects in space, time, and social perspective.
The Integration of Word Processing with Data Processing in an Educational Environment. Final Report.
ERIC Educational Resources Information Center
Patterson, Lorna; Schlender, Jim
A project examined the Office of the Future and determined trends regarding an integration of word processing and data processing. It then sought to translate those trends into an educational package to develop the potential information specialist. A survey instrument completed by 33 office managers and word processing and data processing…
Word Processors: A Look at Four Popular Programs.
ERIC Educational Resources Information Center
Press, Larry
1980-01-01
Described are types of programs used for processing text (editors, print formatters, and word processors), followed by a comparison of four word-processing packages: Auto Scribe, Electric Pencil, Magic Wand, and Word Star. With the exception of Auto Scribe, all programs reviewed are CP/M versions. (KC)
Bakos, Sarolta; Landerl, Karin; Bartling, Jürgen; Schulte-Körne, Gerd; Moll, Kristina
2018-03-01
In consistent orthographies, isolated reading disorders (iRD) and isolated spelling disorders (iSD) are nearly as common as combined reading-spelling disorders (cRSD). However, the exact nature of the underlying word processing deficits in isolated versus combined literacy deficits is not yet well understood. We applied a phonological lexical decision task (including words, pseudohomophones, legal and illegal pseudowords) during ERP recording to investigate the neurophysiological correlates of lexical and sublexical word-processing in children with iRD, iSD and cRSD compared to typically developing (TD) 9-year-olds. TD children showed enhanced early sensitivity (N170) for word material and for the violation of orthographic rules compared to the other groups. Lexical orthographic effects (higher LPC amplitude for words than for pseudohomophones) were the same in the TD and iRD groups, although processing took longer in children with iRD. In the iSD and cRSD groups, lexical orthographic effects were evident and stable over time only for correctly spelled words. Orthographic representations were intact in iRD children, but word processing took longer compared to TD. Children with spelling disorders had partly missing orthographic representations. Our study is the first to specify the underlying neurophysiology of word processing deficits associated with isolated literacy deficits. Copyright © 2017 International Federation of Clinical Neurophysiology. Published by Elsevier B.V. All rights reserved.
Quam, Carolyn; Creel, Sarah C
2017-01-01
Previous research has mainly considered the impact of tone-language experience on the ability to discriminate linguistic pitch, but proficient bilingual listening requires differential processing of sound variation in each language context. Here, we ask whether Mandarin-English bilinguals, for whom pitch indicates word distinctions in one language but not the other, can process pitch differently in a Mandarin context vs. an English context. Across three eye-tracked word-learning experiments, results indicated that these bilinguals process tone in accordance with the language context. In Experiment 1, 51 Mandarin-English bilinguals and 26 English speakers without tone experience were taught Mandarin-compatible novel words with tones. Mandarin-English bilinguals outperformed English speakers, and, for bilinguals, overall accuracy was correlated with Mandarin dominance. Experiment 2 taught 24 Mandarin-English bilinguals and 25 English speakers novel words with Mandarin-like tones, but English-like phonemes and phonotactics. The Mandarin-dominance advantages observed in Experiment 1 disappeared when words were English-like. Experiment 3 contrasted Mandarin-like vs. English-like words in a within-subjects design, providing even stronger evidence that bilinguals can process tone language-specifically. Bilinguals (N = 58), regardless of language dominance, attended more to tone than English speakers without Mandarin experience (N = 28), but only when words were Mandarin-like, not when they were English-like. Mandarin-English bilinguals thus tailor tone processing to the within-word language context.
Analysis of the Requirements Generation Process for the Logistics Analysis and Wargame Support Tool
2017-06-01
For instance, the requirements for a pen seem straightforward; however, they may vary depending on the context in which the pen will be used... the interactions between the operational elements, specify which tasks are dependent on others and the order of executing tasks, and estimate how... configuration file to call that spreadsheet. This requirement can be met depending on the situation. If the nodes and arcs are pre-defined and readily
Ojima, Shiro; Matsuba-Kurita, Hiroko; Nakamura, Naoko; Hagiwara, Hiroko
2011-04-01
Healthy adults can identify spoken words at a remarkable speed, by incrementally analyzing word-onset information. It is currently unknown how this adult-level speed of spoken-word processing emerges during children's native-language acquisition. In a picture-word mismatch paradigm, we manipulated the semantic congruency between picture contexts and spoken words, and recorded event-related potential (ERP) responses to the words. Previous similar studies focused on the N400 response, but we focused instead on the onsets of semantic congruency effects (N200 or Phonological Mismatch Negativity), which contain critical information for incremental spoken-word processing. We analyzed ERPs obtained longitudinally from two age cohorts of 40 primary-school children (total n=80) in a 3-year period. Children first tested at 7 years of age showed earlier onsets of congruency effects (by approximately 70ms) when tested 2 years later (i.e., at age 9). Children first tested at 9 years of age did not show such shortening of onset latencies 2 years later (i.e., at age 11). Overall, children's onset latencies at age 9 appeared similar to those of adults. These data challenge the previous hypothesis that word processing is well established at age 7. Instead they support the view that the acceleration of spoken-word processing continues beyond age 7. Copyright © 2011 Elsevier Ltd. All rights reserved.
Computational Modeling of Morphological Effects in Bangla Visual Word Recognition.
Dasgupta, Tirthankar; Sinha, Manjira; Basu, Anupam
2015-10-01
In this paper we aim to model the organization and processing of Bangla polymorphemic words in the mental lexicon. Our objective is to determine whether the mental lexicon accesses a polymorphemic word as a whole or decomposes the word into its constituent morphemes and then recognizes them accordingly. To address this issue, we adopted two different strategies. First, we conducted a masked priming experiment with native speakers. Analysis of reaction time (RT) and error rates indicates that, in general, morphologically derived words are accessed via a decomposition process. Next, based on the collected RT data we developed a computational model that can explain the processing phenomena of the access and representation of Bangla derivationally suffixed words. In order to do so, we first explored the individual roles of different linguistic features of a Bangla morphologically complex word and observed that processing of Bangla morphologically complex words depends upon several factors, such as the base and surface word frequency, suffix type/token ratio, suffix family size and suffix productivity. Accordingly, we proposed different feature models. Finally, we combined these feature models into a new model that takes advantage of the individual feature models and successfully explains the processing phenomena of most of the Bangla morphologically derived words. Our proposed model shows an accuracy of around 80%, which outperforms the other related frequency models.
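As a hedged sketch only (the abstract does not specify how the combined model or its accuracy were computed), the code below fits a linear model over the named frequency and suffix features and scores held-out items by a tolerance criterion; the data file, column names, and 50 ms tolerance are assumptions, not the authors' method.

```python
# Hypothetical sketch of a combined feature model for RT data like that
# described above. The file, column names, and the tolerance-based "accuracy"
# are invented for illustration; this is not the authors' implementation.
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

items = pd.read_csv("bangla_derived_words.csv")   # hypothetical per-item data
features = ["log_base_freq", "log_surface_freq",
            "suffix_type_token_ratio", "suffix_family_size", "suffix_productivity"]

X_train, X_test, y_train, y_test = train_test_split(
    items[features], items["mean_rt_ms"], test_size=0.2, random_state=0)

model = LinearRegression().fit(X_train, y_train)
predicted = model.predict(X_test)

within_tolerance = (abs(predicted - y_test) <= 50).mean()   # 50 ms is arbitrary
print(f"held-out items predicted within 50 ms: {within_tolerance:.0%}")
```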
Interplay Between the Object and Its Symbol: The Size-Congruency Effect
Shen, Manqiong; Xie, Jiushu; Liu, Wenjuan; Lin, Wenjie; Chen, Zhuoming; Marmolejo-Ramos, Fernando; Wang, Ruiming
2016-01-01
Grounded cognition suggests that conceptual processing shares cognitive resources with perceptual processing. Hence, conceptual processing should be affected by perceptual processing, and vice versa. The current study explored the relationship between conceptual and perceptual processing of size. Within a pair of words, we manipulated the font size of each word, which was either congruent or incongruent with the actual size of the referred object. In Experiment 1a, participants compared object sizes that were referred to by word pairs. Higher accuracy was observed in the congruent condition (e.g., word pairs referring to larger objects in larger font sizes) than in the incongruent condition. This is known as the size-congruency effect. In Experiments 1b and 2, participants compared the font sizes of these word pairs. The size-congruency effect was not observed. In Experiments 3a and 3b, participants compared object and font sizes of word pairs depending on a task cue. Results showed that perceptual processing affected conceptual processing, and vice versa. This suggested that the association between conceptual and perceptual processes may be bidirectional but further modulated by semantic processing. Specifically, conceptual processing might only affect perceptual processing when semantic information is activated. PMID:27512529
The effect of word concreteness on recognition memory.
Fliessbach, K; Weis, S; Klaver, P; Elger, C E; Weber, B
2006-09-01
Concrete words that are readily imagined are better remembered than abstract words. Theoretical explanations for this effect either claim a dual coding of concrete words in the form of both a verbal and a sensory code (dual-coding theory), or a more accessible semantic network for concrete words than for abstract words (context-availability theory). However, the neural mechanisms of improved memory for concrete versus abstract words are poorly understood. Here, we investigated the processing of concrete and abstract words during encoding and retrieval in a recognition memory task using event-related functional magnetic resonance imaging (fMRI). As predicted, memory performance was significantly better for concrete words than for abstract words. Abstract words elicited stronger activations of the left inferior frontal cortex both during encoding and recognition than did concrete words. Stronger activation of this area was also associated with successful encoding for both abstract and concrete words. Concrete words elicited stronger activations bilaterally in the posterior inferior parietal lobe during recognition. The left parietal activation was associated with correct identification of old stimuli. The anterior precuneus, left cerebellar hemisphere and the posterior and anterior cingulate cortex showed activations both for successful recognition of concrete words and for online processing of concrete words during encoding. Additionally, we observed a correlation across subjects between brain activity in the left anterior fusiform gyrus and hippocampus during recognition of learned words and the strength of the concreteness effect. These findings support the idea of specific brain processes for concrete words, which are reactivated during successful recognition.
An automated graphics tool for comparative genomics: the Coulson plot generator
2013-01-01
Background Comparative analysis is an essential component of biology. When applied to genomics, for example, analysis may require comparisons between the predicted presence and absence of genes in a group of genomes under consideration. Frequently, genes can be grouped into small categories based on functional criteria, for example membership of a multimeric complex, participation in a metabolic or signaling pathway, or shared sequence features and/or paralogy. These patterns of retention and loss are highly informative for the prediction of function, and hence possible biological context, and can provide great insights into the evolutionary history of cellular functions. However, representation of such information in a standard spreadsheet is a poor visual means from which to extract patterns within a dataset. Results We devised the Coulson plot, a new graphical representation that exploits a matrix of pie charts to display comparative genomics data. Each pie is used to describe a complex or process from a separate taxon, and is divided into sectors corresponding to the number of proteins (subunits) in a complex/process. The predicted presence or absence of proteins in each complex is delineated by occupancy of a given sector; this format is visually highly accessible and makes pattern recognition rapid and reliable. A key to the identity of each subunit, plus hierarchical naming of taxa and coloring, are included. A Java-based application, the Coulson plot generator (CPG), automates graphic production, taking a tab- or comma-delimited text file as input and generating an editable portable document format (PDF) or SVG file. Conclusions CPG software may be used to rapidly convert spreadsheet data to a graphical matrix pie chart format. The representation essentially retains all of the information from the spreadsheet but presents a graphically rich format, making comparisons and identification of patterns significantly clearer. While the Coulson plot format is highly useful in comparative genomics (its original purpose), the software can be used to visualize any dataset where entity occupancy is compared between different classes. Availability CPG software is available at sourceforge http://sourceforge.net/projects/coulson and http://dl.dropbox.com/u/6701906/Web/Sites/Labsite/CPG.html PMID:23621955
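The sketch below is not the CPG code itself, only an illustration of the plot format it automates: a matrix of pie charts drawn with matplotlib from a small invented presence/absence table, one pie per taxon and complex, one equal sector per subunit, filled when the subunit is predicted present.

```python
# Illustrative sketch of a Coulson-style matrix of pie charts (not the CPG tool).
# The complexes, taxa, and presence/absence values below are invented.
import matplotlib.pyplot as plt

complexes = {"Complex A": ["a1", "a2", "a3"], "Complex B": ["b1", "b2", "b3", "b4"]}
taxa = ["Taxon 1", "Taxon 2"]
# presence[taxon][complex] -> one 1 (present) or 0 (absent) per subunit
presence = {
    "Taxon 1": {"Complex A": [1, 1, 0], "Complex B": [1, 1, 1, 1]},
    "Taxon 2": {"Complex A": [1, 0, 0], "Complex B": [1, 0, 1, 0]},
}

fig, axes = plt.subplots(len(taxa), len(complexes), figsize=(6, 6))
for i, taxon in enumerate(taxa):
    for j, (cplx, subunits) in enumerate(complexes.items()):
        ax = axes[i][j]
        # equal sectors; fill a sector when the subunit is predicted present
        colors = ["tab:blue" if p else "white" for p in presence[taxon][cplx]]
        ax.pie([1] * len(subunits), colors=colors, wedgeprops={"edgecolor": "black"})
        ax.set_title(f"{taxon}\n{cplx}", fontsize=8)

fig.savefig("coulson_like_plot.svg")   # editable vector output, as CPG produces
```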
The Temporal Structure of Spoken Language Understanding.
ERIC Educational Resources Information Center
Marslen-Wilson, William; Tyler, Lorraine Komisarjevsky
1980-01-01
An investigation of word-by-word time-course of spoken language understanding focused on word recognition and structural and interpretative processes. Results supported an online interactive language processing theory, in which lexical, structural, and interpretative knowledge sources communicate and interact during processing efficiently and…
Orthographic processing in pigeons (Columba livia)
Scarf, Damian; Boy, Karoline; Uber Reinert, Anelisie; Devine, Jack; Güntürkün, Onur; Colombo, Michael
2016-01-01
Learning to read involves the acquisition of letter–sound relationships (i.e., decoding skills) and the ability to visually recognize words (i.e., orthographic knowledge). Although decoding skills are clearly human-unique, given they are seated in language, recent research and theory suggest that orthographic processing may derive from the exaptation or recycling of visual circuits that evolved to recognize everyday objects and shapes in our natural environment. An open question is whether orthographic processing is limited to visual circuits that are similar to our own or a product of plasticity common to many vertebrate visual systems. Here we show that pigeons, organisms that separated from humans more than 300 million y ago, process words orthographically. Specifically, we demonstrate that pigeons trained to discriminate words from nonwords picked up on the orthographic properties that define words and used this knowledge to identify words they had never seen before. In addition, the pigeons were sensitive to the bigram frequencies of words (i.e., the common co-occurrence of certain letter pairs), the edit distance between nonwords and words, and the internal structure of words. Our findings demonstrate that visual systems organizationally distinct from the primate visual system can also be exapted or recycled to process the visual word form. PMID:27638211
Fast Mapping Across Time: Memory Processes Support Children's Retention of Learned Words.
Vlach, Haley A; Sandhofer, Catherine M
2012-01-01
Children's remarkable ability to map linguistic labels to referents in the world is commonly called fast mapping. The current study examined children's (N = 216) and adults' (N = 54) retention of fast-mapped words over time (immediately, after a 1-week delay, and after a 1-month delay). The fast mapping literature often characterizes children's retention of words as consistently high across timescales. However, the current study demonstrates that learners forget word mappings at a rapid rate. Moreover, these patterns of forgetting parallel forgetting functions of domain-general memory processes. Memory processes are critical to children's word learning, and the role of one such process, forgetting, is discussed in detail: forgetting supports extended mapping by promoting the memory and generalization of words and categories.
Adaptive memory: the comparative value of survival processing.
Nairne, James S; Pandeirada, Josefa N S; Thompson, Sarah R
2008-02-01
We recently proposed that human memory systems are "tuned" to remember information that is processed for survival, perhaps as a result of fitness advantages accrued in the ancestral past. This proposal was supported by experiments in which participants showed superior memory when words were rated for survival relevance, at least relative to when words received other forms of deep processing. The current experiments tested the mettle of survival memory by pitting survival processing against conditions that are universally accepted as producing excellent retention, including conditions in which participants rated words for imagery, pleasantness, and self-reference; participants also generated words, studied words with the intention of learning them, or rated words for relevance to a contextually rich (but non-survival-related) scenario. Survival processing yielded the best retention, which suggests that it may be one of the best encoding procedures yet discovered in the memory field.
2017-09-01
[Figure-list excerpt: Figure 58, "Click on Menu bar and find 'View,' then click on 'Macros,' then click on run"; Figure 59, "Top view of xml spreadsheet." Other snippets mention engines, helicopter rotors, and turbine blades; techniques for creating marks readable with a scanner; and a spreadsheet with data.]
Small-Caliber Projectile Target Impact Angle Determined From Close Proximity Radiographs
2006-10-01
discrete motion data that can be numerically modeled using linear aerodynamic theory or 6-degrees-of-freedom equations of motion. The values of Fφ... Prediction Excel® Spreadsheet shown in figure 9. The Gamma at Impact Spreadsheet uses the linear aerodynamics model, equations 5 and 6, to calculate αT... trajectory angle error via consideration of the RMS fit errors of the actual firings. However, the linear aerodynamics model does not include this effect
Toward a Model for Picture and Word Processing.
ERIC Educational Resources Information Center
Snodgrass, Joan Gay
A model was developed to account for similarities and differences between picture and word processing in a variety of semantic and episodic memory tasks. The model contains three levels of processing: low-level processing of the physical characteristics of externally presented pictures and words; an intermediate level where the low-level processor…
A retention index calculator simplifies identification of plant volatile organic compounds.
Lucero, Mary; Estell, Rick; Tellez, María; Fredrickson, Ed
2009-01-01
Plant volatiles (PVOCs) are important targets for studies in natural products, chemotaxonomy and biochemical ecology. The complexity of PVOC profiles often limits research to studies targeting only easily identified compounds. With the availability of mass spectral libraries and recent growth of retention index (RI) libraries, PVOC identification can be achieved using only gas chromatography coupled to mass spectrometry (GCMS). However, RI library searching is not typically automated, and until recently, RI libraries were both limited in scope and costly to obtain. The aim was to automate the RI calculation and lookup functions commonly utilised in PVOC analysis. Formulae required for calculating retention indices from retention time data were placed in a spreadsheet along with lookup functions and a retention index library. Retention times obtained from GCMS analysis of alkane standards and Koeberlinia spinosa essential oil were entered into the spreadsheet to determine retention indices. Indices were used in combination with mass spectral analysis to identify compounds contained in Koeberlinia spinosa essential oil. Eighteen compounds were positively identified. Total oil yield was low, with only 5 ppm in purple berries. The most abundant compounds were octen-3-ol and methyl salicylate. The spreadsheet accurately calculated RIs of the detected compounds. The downloadable spreadsheet tool developed for this study provides a calculator and RI library that works in conjunction with GCMS or other analytical techniques to identify PVOCs in plant extracts.
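The spreadsheet itself is not reproduced in the record, but the arithmetic it automates is standard. The sketch below applies the linear (van den Dool and Kratz) retention-index formula, RI = 100*(n + (t_x - t_n)/(t_{n+1} - t_n)), and a simple tolerance-based library lookup; the alkane times, the two library entries, and the tolerance are illustrative values, not the study's data.

```python
# A minimal sketch of the retention-index arithmetic such a spreadsheet
# automates. Alkane retention times, library RIs, and the tolerance are
# invented/approximate values for illustration only.
import bisect

# carbon number -> retention time (min) of the n-alkane standards
alkanes = {9: 5.10, 10: 7.25, 11: 9.60, 12: 12.05}

def retention_index(rt):
    """Linear RI: 100 * (n + (rt - t_n) / (t_{n+1} - t_n)).
    Assumes rt falls between the first and last alkane standards."""
    carbons = sorted(alkanes)
    times = [alkanes[c] for c in carbons]
    i = bisect.bisect_right(times, rt) - 1          # bracketing alkane below rt
    n, t_n, t_next = carbons[i], times[i], times[i + 1]
    return 100 * (n + (rt - t_n) / (t_next - t_n))

# tiny stand-in for an RI library: (approximate library RI, compound name)
library = [(980, "octen-3-ol"), (1191, "methyl salicylate")]

def lookup(ri, tolerance=10):
    """Return library compounds whose RI falls within +/- tolerance."""
    return [name for lib_ri, name in library if abs(lib_ri - ri) <= tolerance]

rt_unknown = 6.82                                   # peak retention time (min)
ri = retention_index(rt_unknown)
print(round(ri), lookup(ri))                        # -> 980 ['octen-3-ol']
```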
The impact of inverted text on visual word processing: An fMRI study.
Sussman, Bethany L; Reddigari, Samir; Newman, Sharlene D
2018-06-01
Visual word recognition has been studied for decades. One question that has received limited attention is how different text presentation orientations disrupt word recognition. By examining how word recognition processes may be disrupted by different text orientations, it is hoped that new insights can be gained concerning the process. Here, we examined the impact of rotating and inverting text on the neural network responsible for visual word recognition, focusing primarily on a region of the occipito-temporal cortex referred to as the visual word form area (VWFA). A lexical decision task was employed in which words and pseudowords were presented in one of three orientations (upright, rotated or inverted). The results demonstrate that inversion caused the greatest disruption of visual word recognition processes. Both rotated and inverted text elicited increased activation in spatial attention regions within the right parietal cortex. However, inverted text recruited phonological and articulatory processing regions within the left inferior frontal and left inferior parietal cortices. Finally, the VWFA was found to not behave similarly to the fusiform face area, in that unusual text orientations resulted in increased activation and not decreased activation. It is hypothesized here that the VWFA activation is modulated by feedback from linguistic processes. Copyright © 2018 Elsevier Inc. All rights reserved.
Selective attention in perceptual adjustments to voice.
Mullennix, J W; Howe, J N
1999-10-01
The effects of perceptual adjustments to voice information on the perception of isolated spoken words were examined. In two experiments, spoken target words were preceded or followed within a trial by a neutral word spoken in the same voice as the target or in a different voice. Overall, words were reproduced more accurately on trials on which the voice of the neutral word matched the voice of the spoken target word, suggesting that perceptual adjustments to voice interfere with word processing. This result, however, was mediated by selective attention to voice. The results provide further evidence of a close processing relationship between perceptual adjustments to voice and spoken word recognition.
Effects of visual familiarity for words on interhemispheric cooperation for lexical processing.
Yoshizaki, K
2001-12-01
The purpose of this study was to examine the effects of visual familiarity of words on interhemispheric lexical processing. Words and pseudowords were tachistoscopically presented in the left, the right, or both visual fields. Two types of words, Katakana-familiar and Hiragana-familiar, were used as the word stimuli. The former refers to words that are more frequently written in Katakana script, and the latter to words that are written predominantly in Hiragana script. Two conditions were set up in terms of visual familiarity: in the visually familiar condition, words were presented in their familiar script form, and in the visually unfamiliar condition, words were presented in the less familiar script form. Thirty-two right-handed Japanese students were asked to make a lexical decision. Results showed that a bilateral gain, indicating that performance in the bilateral visual field condition was superior to that in the unilateral visual field conditions, was obtained only in the visually familiar condition, not in the visually unfamiliar condition. These results suggest that the visual familiarity of a word influences interhemispheric lexical processing.
Evidence for simultaneous syntactic processing of multiple words during reading.
Snell, Joshua; Meeter, Martijn; Grainger, Jonathan
2017-01-01
A hotly debated issue in reading research concerns the extent to which readers process parafoveal words, and how parafoveal information might influence foveal word recognition. We investigated syntactic word processing both in sentence reading and in reading isolated foveal words when these were flanked by parafoveal words. In Experiment 1 we found a syntactic parafoveal preview benefit in sentence reading, meaning that fixation durations on target words were decreased when there was a syntactically congruent preview word at the target location (n) during the fixation on the pre-target (n-1). In Experiment 2 we used a flanker paradigm in which participants had to classify foveal target words as either noun or verb, when those targets were flanked by syntactically congruent or incongruent words (stimulus on-time 170 ms). Lower response times and error rates in the congruent condition suggested that higher-order (syntactic) information can be integrated across foveal and parafoveal words. Although higher-order parafoveal-on-foveal effects have been elusive in sentence reading, results from our flanker paradigm show that the reading system can extract higher-order information from multiple words in a single glance. We propose a model of reading to account for the present findings.
Representation of visual symbols in the visual word processing network.
Muayqil, Taim; Davies-Thompson, Jodie; Barton, Jason J S
2015-03-01
Previous studies have shown that word processing involves a predominantly left-sided occipitotemporal network. Words are a form of symbolic representation, in that they are arbitrary perceptual stimuli that represent other objects, actions or concepts. Lesions of parts of the visual word processing network can cause alexia, which can be associated with difficulty processing other types of symbols such as musical notation or road signs. We investigated whether components of the visual word processing network were also activated by other types of symbols. In 16 music-literate subjects, we defined the visual word network using fMRI and examined responses to four symbolic categories: visual words, musical notation, instructive symbols (e.g. traffic signs), and flags and logos. For each category we compared responses not only to scrambled stimuli, but also to similar stimuli that lacked symbolic meaning. The left visual word form area and a homologous right fusiform region responded similarly to all four categories, but equally to both symbolic and non-symbolic equivalents. Greater response to symbolic than non-symbolic stimuli occurred only in the left inferior frontal and middle temporal gyri, but only for words, and in the case of the left inferior frontal gyri, also for musical notation. A whole-brain analysis comparing symbolic versus non-symbolic stimuli revealed a distributed network of inferior temporooccipital and parietal regions that differed for different symbols. The fusiform gyri are involved in processing the form of many symbolic stimuli, but not specifically for stimuli with symbolic content. Selectivity for stimuli with symbolic content only emerges in the visual word network at the level of the middle temporal and inferior frontal gyri, but is specific for words and musical notation. Copyright © 2015 Elsevier Ltd. All rights reserved.
10 years of BAWLing into affective and aesthetic processes in reading: what are the echoes?
Jacobs, Arthur M.; Võ, Melissa L.-H.; Briesemeister, Benny B.; Conrad, Markus; Hofmann, Markus J.; Kuchinke, Lars; Lüdtke, Jana; Braun, Mario
2015-01-01
Reading is not only “cold” information processing, but involves affective and aesthetic processes that go far beyond what current models of word recognition, sentence processing, or text comprehension can explain. To investigate such “hot” reading processes, standardized instruments that quantify both psycholinguistic and emotional variables at the sublexical, lexical, inter-, and supralexical levels (e.g., phonological iconicity, word valence, arousal-span, or passage suspense) are necessary. One such instrument, the Berlin Affective Word List (BAWL) has been used in over 50 published studies demonstrating effects of lexical emotional variables on all relevant processing levels (experiential, behavioral, neuronal). In this paper, we first present new data from several BAWL studies. Together, these studies examine various views on affective effects in reading arising from dimensional (e.g., valence) and discrete emotion features (e.g., happiness), or embodied cognition features like smelling. Second, we extend our investigation of the complex issue of affective word processing to words characterized by a mixture of affects. These words entail positive and negative valence, and/or features making them beautiful or ugly. Finally, we discuss tentative neurocognitive models of affective word processing in the light of the present results, raising new issues for future studies. PMID:26089808
Koban, Leonie; Ninck, Markus; Li, Jun; Gisler, Thomas; Kissler, Johanna
2010-07-27
Emotional stimuli are preferentially processed compared to neutral ones. Measuring the magnetic resonance blood-oxygen level dependent (BOLD) response or EEG event-related potentials, this has also been demonstrated for emotional versus neutral words. However, it is currently unclear whether emotion effects in word processing can also be detected with other measures such as EEG steady-state visual evoked potentials (SSVEPs) or optical brain imaging techniques. In the present study, we simultaneously performed SSVEP measurements and near-infrared diffusing-wave spectroscopy (DWS), a new optical technique for the non-invasive measurement of brain function, to measure brain responses to neutral, pleasant, and unpleasant nouns flickering at a frequency of 7.5 Hz. The power of the SSVEP signal was significantly modulated by the words' emotional content at occipital electrodes, showing reduced SSVEP power during stimulation with pleasant compared to neutral nouns. By contrast, the DWS signal measured over the visual cortex showed significant differences between stimulation with flickering words and baseline periods, but no modulation in response to the words' emotional significance. This study is the first investigation of brain responses to emotional words using simultaneous measurements of SSVEPs and DWS. Emotional modulation of word processing was detected with EEG SSVEPs, but not by DWS. SSVEP power for emotional, specifically pleasant, compared to neutral words was reduced, which contrasts with previous results obtained when presenting emotional pictures. This appears to reflect processing differences between symbolic and pictorial emotional stimuli. While pictures prompt sustained perceptual processing, decoding the significance of emotional words requires more internal associative processing. Reasons for an absence of emotion effects in the DWS signal are discussed.
Grimm, Robert; Cassani, Giovanni; Gillis, Steven; Daelemans, Walter
2017-01-01
Previous studies have suggested that children and adults form cognitive representations of co-occurring word sequences. We propose (1) that the formation of such multi-word unit (MWU) representations precedes and facilitates the formation of single-word representations in children and thus benefits word learning, and (2) that MWU representations facilitate adult word recognition and thus benefit lexical processing. Using a modified version of an existing computational model (McCauley and Christiansen, 2014), we extract MWUs from a corpus of child-directed speech (CDS) and a corpus of conversations among adults. We then correlate the number of MWUs within which each word appears with (1) age of first production and (2) adult reaction times on a word recognition task. In doing so, we take care to control for the effect of word frequency, as frequent words will naturally tend to occur in many MWUs. We also compare results to a baseline model which randomly groups words into sequences, and find that MWUs have a unique facilitatory effect on both response variables, suggesting that they benefit word learning in children and word recognition in adults. The effect is strongest on age of first production, implying that MWUs are comparatively more important for word learning than for adult lexical processing. We discuss possible underlying mechanisms and formulate testable predictions.
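A hedged sketch of the frequency-controlled correlation described above (not the authors' code): both the per-word MWU count and the outcome variable are residualized on log frequency before correlating, which is one standard way to control for word frequency. The data file and column names are hypothetical.

```python
# Hypothetical sketch: partial correlation of MWU count with age of first
# production, controlling for word frequency via residualization.
import numpy as np
import pandas as pd
from scipy import stats

words = pd.read_csv("word_measures.csv")  # hypothetical: word, mwu_count, aoa, freq

def residualize(y, x):
    """Residuals of y after removing a linear effect of x."""
    slope, intercept, *_ = stats.linregress(x, y)
    return y - (intercept + slope * x)

log_freq = np.log(words["freq"])
mwu_resid = residualize(np.log1p(words["mwu_count"]), log_freq)
aoa_resid = residualize(words["aoa"], log_freq)

r, p = stats.pearsonr(mwu_resid, aoa_resid)
print(f"partial r = {r:.2f}, p = {p:.3f}")   # negative r would favor the MWU account
```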
Electrophysiological indices of brain activity to content and function words in discourse.
Neumann, Yael; Epstein, Baila; Shafer, Valerie L
2016-09-01
An increase in positivity of event-related potentials (ERPs) at the lateral anterior sites has been hypothesized to be an index of semantic and discourse processing, with the right lateral anterior positivity (LAP) showing particular sensitivity to discourse factors. However, the research investigating the LAP is limited; it is unclear whether the effect is driven by word class (function word versus content word) or by a more general process of structure building triggered by elements of a determiner phrase (DP). The aims were to examine the neurophysiological indices of semantic/discourse integration using two different word categories (function versus content word) in discourse contexts and to contrast processing of these word categories in meaningful versus nonsense contexts. Planned comparisons of ERPs time-locked to a function word stimulus 'the' and a content word stimulus 'cats' in sentence-initial position were conducted in both discourse and nonsense contexts to examine the time course of processing following these word forms. A repeated-measures analysis of variance (ANOVA) for the Discourse context revealed a significant interaction of condition and site due to greater positivity for 'the' relative to 'cats' at anterior and superior sites. In the Nonsense context, there was a significant interaction of condition, time and site due to greater positivity for 'the' relative to 'cats' at anterior sites from 150 to 350 ms post-stimulus offset and at superior sites from 150 to 200 ms post-stimulus offset. Overall, greater positivity for both 'the' and 'cats' was observed in the discourse relative to the nonsense context beginning approximately 150 ms post-stimulus offset. Additionally, topographical analyses were highly correlated for the two word categories when processing meaningful discourse. This topographical pattern could be characterized as a prominent right LAP. The LAP was attenuated when the target stimulus word initiated a nonsense context. The results of this study support the view that the right LAP is an index of general discourse processing rather than an index of word class. These findings demonstrate that the LAP can be used to study discourse processing in populations with compromised metalinguistic skills, such as adults with aphasia or traumatic brain injury. © 2016 Royal College of Speech and Language Therapists.
Word form Encoding in Chinese Word Naming and Word Typing
ERIC Educational Resources Information Center
Chen, Jenn-Yeu; Li, Cheng-Yi
2011-01-01
The process of word form encoding was investigated in primed word naming and word typing with Chinese monosyllabic words. The target words shared or did not share the onset consonants with the prime words. The stimulus onset asynchrony (SOA) was 100 ms or 300 ms. Typing required the participants to enter the phonetic letters of the target word,…
Processing negative valence of word pairs that include a positive word.
Itkes, Oksana; Mashal, Nira
2016-09-01
Previous research has suggested that cognitive performance is interrupted by negative relative to neutral or positive stimuli. We examined whether negative valence affects performance at the word or phrase level. Participants performed a semantic decision task on word pairs that included either a negative or a positive target word. In Experiment 1, the valence of the target word was congruent with the overall valence conveyed by the word pair (e.g., fat kid). As expected, response times were slower in the negative condition relative to the positive condition. Experiment 2 included target words that were incongruent with the overall valence of the word pair (e.g., fat salary). Response times were longer for word pairs whose overall valence was negative relative to positive, even though these word pairs included a positive word. Our findings support the Cognitive Primacy Hypothesis, according to which emotional valence is extracted after conceptual processing is complete.
Emotionally enhanced memory for negatively arousing words: storage or retrieval advantage?
Nadarevic, Lena
2017-12-01
People typically remember emotionally negative words better than neutral words. Two experiments are reported that investigate whether emotionally enhanced memory (EEM) for negatively arousing words is based on a storage or retrieval advantage. Participants studied non-word-word pairs that either involved negatively arousing or neutral target words. Memory for these target words was tested by means of a recognition test and a cued-recall test. Data were analysed with a multinomial model that allows the disentanglement of storage and retrieval processes in the present recognition-then-cued-recall paradigm. In both experiments the multinomial analyses revealed no storage differences between negatively arousing and neutral words but a clear retrieval advantage for negatively arousing words in the cued-recall test. These findings suggest that EEM for negatively arousing words is driven by associative processes.
The role of selective attention in perceptual and affective priming
NASA Technical Reports Server (NTRS)
Stone, M.; Ladd, S. L.; Gabrieli, J. D.
2000-01-01
Two kinds of perceptual priming (word identification and word fragment completion), as well as preference priming (that may rely on special affective mechanisms) were examined after participants either read or named the colors of words and nonwords at study. Participants named the colors of words more slowly than the colors of nonwords, indicating that lexical processing of the words occurred at study. Nonetheless, priming on all three tests was lower after color naming than after reading, despite evidence of lexical processing during color naming shown by slower responses to words than to nonwords. These results indicate that selective attention to (rather than the mere processing of) letter string identity at study is important for subsequent repetition priming.
Gullick, Margaret M; Mitra, Priya; Coch, Donna
2013-05-01
Previous event-related potential studies have indicated that both a widespread N400 and an anterior N700 index differential processing of concrete and abstract words, but the nature of these components in relation to concreteness and imagery has been unclear. Here, we separated the effects of word concreteness and task demands on the N400 and N700 in a single word processing paradigm with a within-subjects, between-tasks design and carefully controlled word stimuli. The N400 was larger to concrete words than to abstract words, and larger in the visualization task condition than in the surface task condition, with no interaction. A marked anterior N700 was elicited only by concrete words in the visualization task condition, suggesting that this component indexes imagery. These findings are consistent with a revised or extended dual coding theory according to which concrete words benefit from greater activation in both verbal and imagistic systems. Copyright © 2013 Society for Psychophysiological Research.
Tamura, Niina; Castles, Anne; Nation, Kate
2017-06-01
Children learn new words via their everyday reading experience but little is known about how this learning happens. We addressed this by focusing on the conditions needed for new words to become familiar to children, drawing a distinction between lexical configuration (the acquisition of word knowledge) and lexical engagement (the emergence of interactive processes between newly learned words and existing words). In Experiment 1, 9-11-year-olds saw unfamiliar words in one of two storybook conditions, differing in degree of focus on the new words but matched for frequency of exposure. Children showed good learning of the novel words in terms of both configuration (form and meaning) and engagement (lexical competition). A frequency manipulation under incidental learning conditions in Experiment 2 revealed different time-courses of learning: a fast lexical configuration process, indexed by explicit knowledge, and a slower lexicalization process, indexed by lexical competition. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
Computer-based Astronomy Labs for Non-science Majors
NASA Astrophysics Data System (ADS)
Smith, A. B. E.; Murray, S. D.; Ward, R. A.
1998-12-01
We describe and demonstrate two laboratory exercises, Kepler's Third Law and Stellar Structure, which are being developed for use in an astronomy laboratory class aimed at non-science majors. The labs run with Microsoft's Excel 98 (Macintosh) or Excel 97 (Windows). They can be run in a classroom setting or in an independent learning environment. The intent of the labs is twofold; first and foremost, students learn the subject matter through a series of informational frames. Next, students enhance their understanding by applying their knowledge in lab procedures, while also gaining familiarity with the use and power of a widely-used software package and scientific tool. No mathematical knowledge beyond basic algebra is required to complete the labs or to understand the computations in the spreadsheets, although the students are exposed to the concepts of numerical integration. The labs are contained in Excel workbook files. In the files are multiple spreadsheets, which contain either a frame with information on how to run the lab, material on the subject, or one or more procedures. Excel's VBA macro language is used to automate the labs. The macros are accessed through button interfaces positioned on the spreadsheets. This is done intentionally so that students can focus on learning the subject matter and the basic spreadsheet features without having to learn advanced Excel features all at once. Students open the file and progress through the informational frames to the procedures. After each procedure, student comments and data are automatically recorded in a preformatted Lab Report spreadsheet. Once all procedures have been completed, the student is prompted for a filename in which to save their Lab Report. The lab reports can then be printed or emailed to the instructor. The files will have full worksheet and workbook protection, and will have a "redo" feature at the end of the lab for students who want to repeat a procedure.
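The Excel workbooks themselves are not included in this record; as a hedged illustration of the kind of calculation a Kepler's Third Law worksheet automates, the sketch below applies P^2 = a^3 (P in years, a in astronomical units, valid for bodies orbiting the Sun) to a few planets, mirroring a spreadsheet column formula rather than the lab's actual VBA.

```python
# Illustrative only: Kepler's Third Law for bodies orbiting the Sun,
# P**2 = a**3 with P in years and a in AU. Semi-major axes are the
# familiar textbook values.
planets = {"Mercury": 0.387, "Earth": 1.000, "Mars": 1.524, "Jupiter": 5.203}

for name, a_au in planets.items():
    period_yr = a_au ** 1.5          # P = a^(3/2)
    print(f"{name:8s} a = {a_au:5.3f} AU  ->  P = {period_yr:6.2f} yr")
```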
The Relationships among Cognitive Correlates and Irregular Word, Non-Word, and Word Reading
ERIC Educational Resources Information Center
Abu-Hamour, Bashir; Urso, Annmarie; Mather, Nancy
2012-01-01
This study explored four hypotheses: (a) the relationships of rapid automatized naming (RAN) and processing speed (PS) to irregular word, non-word, and word reading; (b) the predictive power of various RAN and PS measures; (c) the cognitive correlates that best predicted irregular word, non-word, and word reading; and (d) reading performance of…
Word Processing and the Writing Process: Enhancement or Distraction?
ERIC Educational Resources Information Center
Dalton, David W.; Watson, James F.
This study examined the effects of a year-long word processing program on learners' holistic writing skills. Based on results of a writing pretest, 80 seventh grade students were designated as relatively high or low in prior writing achievement and assigned to one of two groups: a word processing treatment and a conventional writing process…
Learning new meanings for known words: Biphasic effects of prior knowledge.
Fang, Xiaoping; Perfetti, Charles; Stafura, Joseph
2017-01-01
In acquiring word meanings, learners are often confronted by a single word form that is mapped to two or more meanings. For example, long after learning how to roller-skate, one may learn that "skate" is also a kind of fish. Such learning of new meanings for familiar words involves two potentially contrasting processes, relative to new form-new meaning learning: 1) form-based familiarity may facilitate learning a new meaning, and 2) meaning-based interference may inhibit learning a new meaning. We examined these two processes by having native English speakers learn new, unrelated meanings for familiar (high frequency) and less familiar (low frequency) English words, as well as for unfamiliar (novel or pseudo-) words. Tracking learning with cued-recall tasks at several points during learning revealed a biphasic pattern: higher learning rates and greater learning efficiency for familiar words relative to novel words early in learning, and a reversal of this pattern later in learning. Following learning, interference from original meanings for familiar words was detected in a semantic relatedness judgment task. Additionally, lexical access to familiar words with new meanings became faster compared to their exposure controls, but no such effect occurred for less familiar words. Overall, the results suggest a biphasic pattern of facilitating and interfering processes: familiar word forms facilitate learning earlier, while interference from original meanings becomes more influential later. This biphasic pattern reflects the co-activation of new and old meanings during learning, a process that may play a role in lexicalization of new meanings.
Word Frequency Effects in Dual-Task Studies Using Lexical Decision and Naming as Task 2
NASA Technical Reports Server (NTRS)
Remington, Roger W.; McCann, Robert S.; VanSelst, Mark; Shafto, Michael G. (Technical Monitor)
1997-01-01
Word frequency effects in dual-task lexical decision are variously reported to be additive or underadditive across SOA. We replicate and extend earlier lexical decision studies and find word frequency to be additive across SOA. To more directly capture lexical processing, we examine dual-task naming. Once again, we find word frequency to be additive across SOA. Lexical processing appears to be constrained by central processing limitations.
Effects of Speed of Word Processing on Semantic Access: The Case of Bilingualism
ERIC Educational Resources Information Center
Martin, Clara D.; Costa, Albert; Dering, Benjamin; Hoshino, Noriko; Wu, Yan Jing; Thierry, Guillaume
2012-01-01
Bilingual speakers generally manifest slower word recognition than monolinguals. We investigated the consequences of the word processing speed on semantic access in bilinguals. The paradigm involved a stream of English words and pseudowords presented in succession at a constant rate. English-Welsh bilinguals and English monolinguals were asked to…
Moseley, Rachel L.; Shtyrov, Yury; Mohr, Bettina; Lombardo, Michael V.; Baron-Cohen, Simon; Pulvermüller, Friedemann
2015-01-01
Autism spectrum conditions (ASC) are characterised by deficits in understanding and expressing emotions and are frequently accompanied by alexithymia, a difficulty in understanding and expressing emotion words. Words are differentially represented in the brain according to their semantic category and these difficulties in ASC predict reduced activation to emotion-related words in limbic structures crucial for affective processing. Semantic theories view ‘emotion actions’ as critical for learning the semantic relationship between a word and the emotion it describes, such that emotion words typically activate the cortical motor systems involved in expressing emotion actions such as facial expressions. As ASC are also characterised by motor deficits and atypical brain structure and function in these regions, motor structures would also be expected to show reduced activation during emotion-semantic processing. Here we used event-related fMRI to compare passive processing of emotion words in comparison to abstract verbs and animal names in typically-developing controls and individuals with ASC. Relatively reduced brain activation in ASC for emotion words, but not matched control words, was found in motor areas and cingulate cortex specifically. The degree of activation evoked by emotion words in the motor system was also associated with the extent of autistic traits as revealed by the Autism Spectrum Quotient. We suggest that hypoactivation of motor and limbic regions for emotion word processing may underlie difficulties in processing emotional language in ASC. The role that sensorimotor systems and their connections might play in the affective and social-communication difficulties in ASC is discussed. PMID:25278250
The new meaning of quality in the information age.
Prahalad, C K; Krishnan, M S
1999-01-01
Software applications are now a mission-critical source of competitive advantage for most companies. They are also a source of great risk, as the Y2K bug has made clear. Yet many line managers still haven't confronted software issues--partly because they aren't sure how best to define the quality of the applications in their IT infrastructures. Some companies such as Wal-Mart and the Gap have successfully integrated the software in their networks, but most have accumulated an unwieldy number of incompatible applications--all designed to perform the same tasks. The authors provide a framework for measuring the performance of software in a company's IT portfolio. Quality traditionally has been measured according to a product's ability to meet certain specifications; other views of quality have emerged that measure a product's adaptability to customers' needs and a product's ability to encourage innovation. To judge software quality properly, argue the authors, managers must measure applications against all three approaches. Understanding the domain of a software application is an important part of that process. The domain is the body of knowledge about a user's needs and expectations for a product. Software domains change frequently based on how a consumer chooses to use, for example, Microsoft Word or a spreadsheet application. The domain can also be influenced by general changes in technology, such as the development of a new software platform. Thus, applications can't be judged only according to whether they conform to specifications. The authors discuss how to identify domain characteristics and software risks and suggest ways to reduce the variability of software domains.
Vowelling and semantic priming effects in Arabic.
Mountaj, Nadia; El Yagoubi, Radouane; Himmi, Majid; Lakhdar Ghazal, Faouzi; Besson, Mireille; Boudelaa, Sami
2015-01-01
In the present experiment we used a semantic judgment task with Arabic words to determine whether semantic priming effects are found in the Arabic language. Moreover, we took advantage of the specificity of the Arabic orthographic system, which is characterized by a shallow (i.e., vowelled words) and a deep orthography (i.e., unvowelled words), to examine the relationship between orthographic and semantic processing. Results showed faster Reaction Times (RTs) for semantically related than unrelated words with no difference between vowelled and unvowelled words. By contrast, Event Related Potentials (ERPs) revealed larger N1 and N2 components to vowelled words than unvowelled words suggesting that visual-orthographic complexity taxes the early word processing stages. Moreover, semantically unrelated Arabic words elicited larger N400 components than related words thereby demonstrating N400 effects in Arabic. Finally, the Arabic N400 effect was not influenced by orthographic depth. The implications of these results for understanding the processing of orthographic, semantic, and morphological structures in Modern Standard Arabic are discussed. Copyright © 2014 Elsevier B.V. All rights reserved.
Learning during processing: Word learning doesn't wait for word recognition to finish.
Apfelbaum, Keith S.; McMurray, Bob
2017-01-01
Previous research on associative learning has uncovered detailed aspects of the process, including what types of things are learned, how they are learned, and where in the brain such learning occurs. However, perceptual processes, such as stimulus recognition and identification, take time to unfold. Previous studies of learning have not addressed when, during the course of these dynamic recognition processes, learned representations are formed and updated. If learned representations are formed and updated while recognition is ongoing, the result of learning may incorporate spurious, partial information. For example, during word recognition, words take time to be identified, and competing words are often active in parallel. If learning proceeds before this competition resolves, representations may be influenced by the preliminary activations present at the time of learning. In three experiments using word learning as a model domain, we provide evidence that learning reflects the ongoing dynamics of auditory and visual processing during a learning event. These results show that learning can occur before stimulus recognition processes are complete; learning does not wait for ongoing perceptual processing to complete. PMID:27471082
Acoustic and semantic interference effects in words and pictures.
Dhawan, M; Pellegrino, J W
1977-05-01
Interference effects for pictures and words were investigated using a probe-recall task. Word stimuli showed acoustic interference effects for items at the end of the list and semantic interference effects for items at the beginning of the list, similar to results of Kintsch and Buschke (1969). Picture stimuli showed large semantic interference effects at all list positions with smaller acoustic interference effects. The results were related to latency data on picture-word processing and interpreted in terms of the differential order, probability, and/or speed of access to acoustic and semantic levels of processing. A levels of processing explanation of picture-word retention differences was related to dual coding theory. Both theoretical positions converge on an explanation of picture-word retention differences as a function of the relative capacity for semantic or associative processing.
Quam, Carolyn; Creel, Sarah C.
2017-01-01
Previous research has mainly considered the impact of tone-language experience on ability to discriminate linguistic pitch, but proficient bilingual listening requires differential processing of sound variation in each language context. Here, we ask whether Mandarin-English bilinguals, for whom pitch indicates word distinctions in one language but not the other, can process pitch differently in a Mandarin context vs. an English context. Across three eye-tracked word-learning experiments, results indicated that tone-intonation bilinguals process tone in accordance with the language context. In Experiment 1, 51 Mandarin-English bilinguals and 26 English speakers without tone experience were taught Mandarin-compatible novel words with tones. Mandarin-English bilinguals out-performed English speakers, and, for bilinguals, overall accuracy was correlated with Mandarin dominance. Experiment 2 taught 24 Mandarin-English bilinguals and 25 English speakers novel words with Mandarin-like tones, but English-like phonemes and phonotactics. The Mandarin-dominance advantages observed in Experiment 1 disappeared when words were English-like. Experiment 3 contrasted Mandarin-like vs. English-like words in a within-subjects design, providing even stronger evidence that bilinguals can process tone language-specifically. Bilinguals (N = 58), regardless of language dominance, attended more to tone than English speakers without Mandarin experience (N = 28), but only when words were Mandarin-like—not when they were English-like. Mandarin-English bilinguals thus tailor tone processing to the within-word language context. PMID:28076400
Tracing the time course of picture--word processing.
Smith, M C; Magee, L E
1980-12-01
A number of independent lines of research have suggested that semantic and articulatory information become available differentially from pictures and words. The first of the experiments reported here sought to clarify the time course by which information about pictures and words becomes available by considering the pattern of interference generated when incongruent pictures and words are presented simultaneously in a Stroop-like situation. Previous investigators report that picture naming is easily disrupted by the presence of a distracting word but that word naming is relatively immune to interference from an incongruent picture. Under the assumption that information available from a completed process may disrupt an ongoing process, these results suggest that words access articulatory information more rapidly than do pictures. Experiment 1 extended this paradigm by requiring subjects to verify the category of the target stimulus. In accordance with the hypothesis that pictures access the semantic code more rapidly than words, there was a reversal in the interference pattern: Word categorization suffered considerable disruption, whereas picture categorization was minimally affected by the presence of an incongruent word. Experiment 2 sought to further test the hypothesis that access to semantic and articulatory codes is different for pictures and words by examining memory for those items following naming or categorization. Categorized words were better recognized than named words, whereas the reverse was true for pictures, a result which suggests that picture naming involves more extensive processing than picture categorization. Experiment 3 replicated this result under conditions in which viewing time was held constant. The last experiment extended the investigation of memory differences to a situation in which subjects were required to generate the superordinate category name. Here, memory for categorized pictures was as good as memory for named pictures. Category generation also influenced memory for words, memory performance being superior to that following a yes--no verification of category membership. These experiments suggest a model of information access whereby pictures access semantic information more readily than name information, with the reverse being true for words. Memory for both pictures and words was a function of the amount of processing required to access a particular type of information as well as the extent of response differentiation necessitated by the task.
Space-Plane Spreadsheet Program
NASA Technical Reports Server (NTRS)
Mackall, Dale
1993-01-01
Basic Hypersonic Data and Equations (HYPERDATA) spreadsheet computer program provides data gained from three analyses of performance of space plane. Equations used to perform analyses derived from Newton's second law of physics, derivation included. First analysis is parametric study of some basic factors affecting ability of space plane to reach orbit. Second includes calculation of thickness of spherical fuel tank. Third produces ratio between volume of fuel and total mass for each of various aircraft. HYPERDATA intended for use on Macintosh(R) series computers running Microsoft Excel 3.0.
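The abstract does not reproduce HYPERDATA's equations, so the sketch below only illustrates the kind of spreadsheet calculation it describes: a thin-walled spherical-tank wall thickness and an ideal delta-v figure from the Tsiolkovsky rocket equation, which follows from Newton's second law. The formulas are standard textbook relations assumed here for illustration, and all input values are hypothetical.

    # Illustrative only: HYPERDATA's own formulas are not given in the abstract,
    # so standard textbook relations are assumed here.
    import math

    def spherical_tank_thickness(pressure_pa, radius_m, allowable_stress_pa):
        """Thin-walled spherical tank: wall thickness t = p * r / (2 * sigma)."""
        return pressure_pa * radius_m / (2.0 * allowable_stress_pa)

    def ideal_delta_v(isp_s, total_mass_kg, fuel_mass_kg, g0=9.80665):
        """Ideal velocity change from the Tsiolkovsky rocket equation."""
        dry_mass_kg = total_mass_kg - fuel_mass_kg
        return isp_s * g0 * math.log(total_mass_kg / dry_mass_kg)

    # Hypothetical inputs, chosen only to exercise the functions.
    t = spherical_tank_thickness(pressure_pa=300e3, radius_m=2.0, allowable_stress_pa=250e6)
    dv = ideal_delta_v(isp_s=450.0, total_mass_kg=100e3, fuel_mass_kg=85e3)
    print(f"tank wall thickness = {t * 1000:.2f} mm, ideal delta-v = {dv:.0f} m/s")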
Social Security: A Present Value Analysis of Old Age Survivors Insurance (OASI) Taxes and Benefits.
1995-12-01
private sector plans and provides a spreadsheet model for making this comparison of plans using different assumptions. The investigation was done by collecting data from various books, Government publications, and various Government agencies to conduct a spreadsheet analysis of three different wage-earning groups, assuming various real interest rates potentially earned in the private sector . A comparison of Social Security with alternative private sector plans is important to the DoD/DoN because less constrained budgets could
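The abstract summarizes a spreadsheet model rather than printing it, so the sketch below only shows the shape of such a present-value comparison: discount a stream of payroll taxes paid over a working life and a stream of benefits received in retirement at several assumed real interest rates. The tax, benefit, and horizon figures are hypothetical placeholders, not the thesis's data.

    # A minimal sketch of a present-value comparison of taxes paid vs. benefits received.
    # All dollar amounts, horizons, and rates below are hypothetical placeholders.

    def present_value(cashflows, real_rate):
        """Discount (year, amount) cash flows to year 0 at a constant real interest rate."""
        return sum(amount / (1.0 + real_rate) ** year for year, amount in cashflows)

    def oasi_comparison(annual_tax, work_years, annual_benefit, retire_years, real_rate):
        """Present value of OASI taxes during work years vs. benefits during retirement."""
        taxes = [(y, annual_tax) for y in range(work_years)]
        benefits = [(work_years + y, annual_benefit) for y in range(retire_years)]
        return present_value(taxes, real_rate), present_value(benefits, real_rate)

    for rate in (0.02, 0.04, 0.06):  # assumed real rates attainable in the private sector
        pv_tax, pv_ben = oasi_comparison(annual_tax=5000, work_years=40,
                                         annual_benefit=18000, retire_years=20, real_rate=rate)
        print(f"real rate {rate:.0%}: PV taxes = {pv_tax:,.0f}, PV benefits = {pv_ben:,.0f}")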
ERIC Educational Resources Information Center
Gatlin, Rebecca; And Others
Research indicates that people tend to use only five percent of the capabilities available in word processing software. The major objective of this study was to determine to what extent word processing was used by businesses, what competencies were required by those businesses, and how those competencies were being learned in Mid-South states. A…
Word Processing for Technical Writers and Teachers.
ERIC Educational Resources Information Center
Mullins, Carolyn J.; West, Thomas W.
This discussion of the computing network and word processing facilities available to professionals on the Indiana University campuses identifies the word and text processing needs of technical writers and faculty, describes the current computing network, and outlines both long- and short-range objectives, policies, and plans for meeting these…