Science.gov

Sample records for ACM computing surveys

  1. ACM TOMS replicated computational results initiative

    SciTech Connect

    Heroux, Michael Allen

    2015-06-03

    The scientific community relies on the peer review process to assure the quality of published material, the goal of which is to build a body of work we can trust. Computational journals such as the ACM Transactions on Mathematical Software (TOMS) use this process to rigorously promote the clarity and completeness of content and the citation of prior work. At the same time, it is unusual to independently confirm computational results.

  2. ACM TOMS replicated computational results initiative

    DOE PAGESBeta

    Heroux, Michael Allen

    2015-06-03

    The scientific community relies on the peer review process to assure the quality of published material, the goal of which is to build a body of work we can trust. Computational journals such as the ACM Transactions on Mathematical Software (TOMS) use this process to rigorously promote the clarity and completeness of content and the citation of prior work. At the same time, it is unusual to independently confirm computational results.

  3. Categorization of Computing Education Resources into the ACM Computing Classification System

    SciTech Connect

    Chen, Yinlin; Bogen, Paul Logasa; Fox, Dr. Edward A.; Hsieh, Dr. Haowei; Cassel, Dr. Lillian N.

    2012-01-01

    The Ensemble Portal harvests resources from multiple heterogeneous federated collections. Managing these dynamically growing collections requires an automatic mechanism to categorize records into corresponding topics. We propose an approach that uses existing ACM DL metadata to build classifiers for harvested resources in the Ensemble project. We also present our experience using the Amazon Mechanical Turk platform to build ground-truth training data sets from Ensemble collections.

  4. ACME-III and ACME-IV Final Campaign Reports

    SciTech Connect

    Biraud, S. C.

    2016-01-01

    The goals of the Atmospheric Radiation Measurement (ARM) Climate Research Facility’s third and fourth Airborne Carbon Measurements (ACME) field campaigns, ACME-III and ACME-IV, are: 1) to measure and model the exchange of CO2, water vapor, and other greenhouse gases by the natural, agricultural, and industrial ecosystems of the Southern Great Plains (SGP) region; 2) to develop quantitative approaches to relate these local fluxes to the concentrations of greenhouse gases measured at the Central Facility tower and in the atmospheric column above the ARM SGP Central Facility; 3) to develop and test bottom-up measurement and modeling approaches to estimate regional-scale carbon balances; and 4) to develop and test inverse modeling approaches to estimate regional-scale carbon balance and anthropogenic sources over continental regions. Regular soundings of the atmosphere from near the surface into the mid-troposphere are essential for this research.

  5. Cloud Computing Security Issue: Survey

    NASA Astrophysics Data System (ADS)

    Kamal, Shailza; Kaur, Rajpreet

    2011-12-01

    Cloud computing has been a growing field in the IT industry since it was proposed by IBM in 2007. Other companies, such as Google, Amazon, and Microsoft, provide further cloud computing products. Cloud computing is Internet-based computing that shares resources and information on demand. It provides services such as SaaS, IaaS, and PaaS. Services and resources are shared through virtualization, which runs multiple applications on the cloud. This paper surveys the security challenges that arise in cloud computing and describes some standards and protocols that show how security can be managed.

  6. Quark ACM with topologically generated gluon mass

    NASA Astrophysics Data System (ADS)

    Choudhury, Ishita Dutta; Lahiri, Amitabha

    2016-03-01

    We investigate the effect of a small, gauge-invariant mass of the gluon on the anomalous chromomagnetic moment (ACM) of quarks by perturbative calculations at the one-loop level. The gluon mass is taken to have been generated via a topological mass generation mechanism, in which the gluon acquires a mass through its interaction with an antisymmetric tensor field B_μν. For a small gluon mass (< 10 MeV), we calculate the ACM at momentum transfer q^2 = -M_Z^2. We compare those results with the ACM calculated for a gluon mass arising from a Proca mass term. We find that the ACMs of the up, down, strange, and charm quarks vary significantly with the gluon mass, while the ACMs of the top and bottom quarks show negligible gluon-mass dependence. The mechanism of gluon mass generation is most important for the strange quark's ACM, but not so much for the other quarks. We also show the results at q^2 = -m_t^2. We find that the dependence on the gluon mass at q^2 = -m_t^2 is much less than at q^2 = -M_Z^2 for all quarks.
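
    For reference, a schematic of the topological mass-generation mechanism the abstract invokes: a B ∧ F coupling between the gauge field and the antisymmetric tensor field gives the gauge boson a gauge-invariant mass. The LaTeX sketch below uses our own normalization conventions (and, in the non-abelian case, H requires additional compensating terms not shown); it is an illustration, not a formula quoted from the paper.

      \mathcal{L} = -\tfrac{1}{4} F_{\mu\nu} F^{\mu\nu}
                    + \tfrac{1}{12} H_{\mu\nu\lambda} H^{\mu\nu\lambda}
                    + \tfrac{m}{4}\, \epsilon^{\mu\nu\rho\lambda} B_{\mu\nu} F_{\rho\lambda}

    Here H_{\mu\nu\lambda} is the field strength of B_{\mu\nu}; the \epsilon-term mixes B with F and moves the gauge-boson propagator pole to q^2 = m^2 without breaking gauge invariance.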

  7. Additive Construction with Mobile Emplacement (ACME)

    NASA Technical Reports Server (NTRS)

    Vickers, John

    2015-01-01

    The Additive Construction with Mobile Emplacement (ACME) project is developing technology to build structures on planetary surfaces using in-situ resources. The project focuses on the construction of both 2D (landing pads, roads, and structure foundations) and 3D (habitats, garages, radiation shelters, and other structures) infrastructure needs for planetary surface missions. The ACME project seeks to raise the Technology Readiness Level (TRL) of two components needed for planetary surface habitation and exploration: 3D additive construction (e.g., contour crafting), and excavation and handling technologies (to effectively and continuously produce in-situ feedstock). Additionally, the ACME project supports the research and development of new materials for planetary surface construction, with the goal of reducing the amount of material to be launched from Earth.

  8. Experiments in Computing: A Survey

    PubMed Central

    Moisseinen, Nella

    2014-01-01

    Experiments play a central role in science. The role of experiments in computing is, however, unclear. Questions about the relevance of experiments in computing attracted little attention until the 1980s. As the discipline then saw a push towards experimental computer science, a variety of technically, theoretically, and empirically oriented views on experiments emerged. As a consequence of those debates, today's computing fields use experiments and experiment terminology in a variety of ways. This paper analyzes experimentation debates in computing. It presents five ways in which debaters have conceptualized experiments in computing: feasibility experiment, trial experiment, field experiment, comparison experiment, and controlled experiment. This paper has three aims: to clarify experiment terminology in computing; to contribute to disciplinary self-understanding of computing; and, due to computing's centrality in other fields, to promote understanding of experiments in modern science in general. PMID:24688404

  9. Survey of Computer Usage in Louisiana Schools.

    ERIC Educational Resources Information Center

    Kirby, Peggy C.; And Others

    A survey of computer usage in 179 randomly selected public elementary and secondary schools in Louisiana was conducted in the spring of 1988. School principals responded to questions about school size, the socioeconomic status of the student population, the number of teachers certified in computer literacy and computer science, and the number of…

  10. Computer Augmented Learning; A Survey.

    ERIC Educational Resources Information Center

    Kindred, J.

    The report contains a description and summary of computer-augmented learning devices and systems. The devices are of two general types: programmed instruction systems based on the teaching machines pioneered by Pressey and developed by Skinner, and the so-called "docile" systems that permit greater user direction with the computer under student…

  11. Computer Graphics Evolution: A Survey.

    ERIC Educational Resources Information Center

    Gartel, Laurence M.

    1985-01-01

    The history of the field of computer graphics is discussed. In 1976 there were no institutions that offered any kind of study of computer graphics. Today electronic image-making is seen as a viable, legitimate art form, and courses are offered by many universities and colleges. (RM)

  12. Towards an Autonomic Cluster Management System (ACMS) with Reflex Autonomicity

    NASA Technical Reports Server (NTRS)

    Truszkowski, Walt; Hinchey, Mike; Sterritt, Roy

    2005-01-01

    Cluster computing, whereby a large number of simple processors or nodes are combined together to apparently function as a single powerful computer, has emerged as a research area in its own right. The approach offers a relatively inexpensive means of providing a fault-tolerant environment and achieving significant computational capabilities for high-performance computing applications. However, the task of manually managing and configuring a cluster quickly becomes daunting as the cluster grows in size. Autonomic computing, with its vision to provide self-management, can potentially solve many of the problems inherent in cluster management. We describe the development of a prototype Autonomic Cluster Management System (ACMS) that exploits autonomic properties in automating cluster management and its evolution to include reflex reactions via pulse monitoring.

  13. The Survey; An Interdisciplinary Computer Application.

    ERIC Educational Resources Information Center

    Carolan, Kevin

    APL (A Programming Language), a computer language used thus far largely for mathematical and scientific applications, can be used to tabulate a survey. Since this computer application can be appreciated by social scientists as well as mathematicians, it serves as an invaluable pedagogical tool for presenting APL to nonscientific users. An…

  14. Survey: Computer Usage in Design Courses.

    ERIC Educational Resources Information Center

    Henley, Ernest J.

    1983-01-01

    Presents results of a survey of chemical engineering departments regarding computer usage in senior design courses. Results are categorized according to: computer usage (use of process simulators, student-written programs, and faculty-written or "canned" programs); costs (hard and soft money); and available software. Programs offered are listed in a…

  15. How to recycle asbestos containing materials (ACM)

    SciTech Connect

    Jantzen, C.M.

    2000-04-11

    The current disposal of asbestos containing materials (ACM) in the private sector consists of sealing asbestos wetted with water in plastic for safe transportation and burial in regulated landfills. This disposal methodology requires large disposal volumes, especially for asbestos-covered pipe and asbestos/fiberglass adhering to metal framework, e.g., filters. This wrap-and-bury technology precludes recycling of the asbestos, the pipe, and/or the metal frameworks. Safe disposal of ACM at U.S. Department of Energy (DOE) sites likewise requires large disposal volumes in landfills for non-radioactive ACM and large disposal volumes in radioactive burial grounds for radioactive and suspect contaminated ACM. The availability of regulated disposal sites is rapidly diminishing, making recycling a more attractive option. Asbestos adhering to metal (e.g., pipes) can be recycled by safely removing the asbestos from the metal in a patented hot caustic bath, which prevents airborne contamination/inhalation of asbestos fibers. The dissolution residue (caustic and asbestos) can be wet-slurry fed to a melter and vitrified into a glass or glass-ceramic. Palex glasses, which are commercially manufactured, are shown to be preferred over conventional borosilicate glasses. The Palex glasses are alkali magnesium silicate glasses derived by substituting MgO for B{sub 2}O{sub 3} in borosilicate-type glasses. Palex glasses are very tolerant of the high MgO and high CaO content of the fillers used in forming asbestos coverings for pipes and found in boiler lagging, e.g., hydromagnesite (3MgCO{sub 3}·Mg(OH){sub 2}·3H{sub 2}O) and plaster of paris, gypsum (CaSO{sub 4}). The high temperature of the vitrification process destroys the asbestos fibers and renders the asbestos non-hazardous, e.g., a glass or glass-ceramic. In this manner the glass or glass-ceramic produced can be recycled, e.g., as glassphalt or glasscrete, as can the clean metal pipe or metal framework.

  16. A Survey of Techniques for Approximate Computing

    DOE PAGESBeta

    Mittal, Sparsh

    2016-03-18

    Approximate computing trades off computation quality against the effort expended, and as rising performance demands confront plateauing resource budgets, approximate computing has become not merely attractive but imperative. Here, we present a survey of techniques for approximate computing (AC). We discuss strategies for finding approximable program portions and monitoring output quality, techniques for using AC in different processing units (e.g., CPU, GPU, and FPGA), processor components, memory technologies, etc., and programming frameworks for AC. Moreover, we classify these techniques based on several key characteristics to emphasize their similarities and differences. Finally, the aim of this paper is to provide insights to researchers into the working of AC techniques and to inspire more efforts in this area to make AC the mainstream computing approach in future systems.
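
    As a concrete illustration of one technique class the survey covers, here is a minimal Python sketch of loop perforation, which trades output quality for reduced effort by processing only every k-th element; the workload and step factor are our own toy assumptions, not taken from the paper.

      # Loop perforation: approximate a reduction by sampling every k-th element,
      # cutting work by roughly a factor of k at some cost in accuracy.
      def perforated_mean(values, k=4):
          """Approximate mean from every k-th element (k=1 gives the exact mean)."""
          sampled = values[::k]
          return sum(sampled) / len(sampled)

      data = [float(i % 100) for i in range(1_000_000)]
      print(perforated_mean(data, k=1))   # exact
      print(perforated_mean(data, k=4))   # ~4x fewer iterations, approximate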

  17. In-situ Data Analysis Framework for ACME Land Simulations

    NASA Astrophysics Data System (ADS)

    Wang, D.; Yao, C.; Jia, Y.; Steed, C.; Atchley, S.

    2015-12-01

    The realistic representation of key biogeophysical and biogeochemical functions is fundamental to process-based ecosystem models. Investigating the behavior of those ecosystem functions within a real-time model simulation can be very challenging due to the complexity of both the model and the software structure of an environmental model such as the Accelerated Climate Model for Energy (ACME) Land Model (ALM). In this research, we describe the urgent needs and challenges of in-situ data analysis for ALM simulations and lay out our methods and strategies to meet these challenges. Specifically, an in-situ data analysis framework is designed to allow users to interactively observe biogeophysical and biogeochemical processes during an ALM simulation. There are three key components in this framework: automatically instrumented ecosystem simulation, in-situ data communication, and a large-scale data exploration toolkit. This effort is developed by leveraging several active projects, including a scientific unit testing platform, a common communication interface, and an extreme-scale data exploration toolkit. We believe that, based on advanced computing technologies such as compiler-based software system analysis, automatic code instrumentation, and in-memory data transport, this software system provides not only the much-needed capability for real-time observation and in-situ data analytics for environmental model simulation, but also the potential for in-situ model behavior adjustment via simulation steering.
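
    Stripped to its core, the in-situ pattern means observers receive simulation state from inside the time-step loop instead of from post-hoc files. A hypothetical Python sketch of that pattern (the class names and toy model are ours; the actual framework relies on compiler-based instrumentation and in-memory transport):

      # In-situ analysis pattern: observers are invoked inside the time-step loop,
      # so variables can be inspected while the simulation is still running.
      class InSituBus:
          def __init__(self):
              self.observers = []

          def subscribe(self, fn):
              self.observers.append(fn)

          def publish(self, step, state):
              for fn in self.observers:
                  fn(step, state)

      def run_toy_land_model(bus, steps=5):
          state = {"soil_carbon": 100.0}
          for step in range(steps):
              state["soil_carbon"] *= 0.99   # stand-in for biogeochemistry
              bus.publish(step, state)       # in-situ hand-off, no file I/O

      bus = InSituBus()
      bus.subscribe(lambda s, st: print(f"step {s}: soil_carbon={st['soil_carbon']:.2f}"))
      run_toy_land_model(bus)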

  18. Faculty Computer Expertise and Use of Instructional Technology. Technology Survey.

    ERIC Educational Resources Information Center

    Gabriner, Robert; Mery, Pamela

    This report shows the findings of a 1997 technology survey used to assess degrees of faculty computer expertise and the use of instructional technology. Part 1 reviews general findings of the fall 1997 technology survey: (1) the level of computer expertise among faculty, staff and administrators appears to be increasing; (2) in comparison with the…

  19. Equivalency of Paper versus Tablet Computer Survey Data

    ERIC Educational Resources Information Center

    Ravert, Russell D.; Gomez-Scott, Jessica; Donnellan, M. Brent

    2015-01-01

    Survey responses collected via paper surveys and computer tablets were compared to test for differences between those methods of obtaining self-report data. College students (N = 258) were recruited in public campus locations and invited to complete identical surveys on either paper or iPad tablet. Only minor homogeneity differences were found…

  20. Computer Availability and Principals' Perceptions of Online Surveys

    ERIC Educational Resources Information Center

    Eaton, Danice K.; Brener, Nancy D.; Kann, Laura; Roberts, Alice M.; Kyle, Tonja M.; Flint, Katherine H.; Ross, Alexander L. R.

    2011-01-01

    Background: School-based risk behavior surveys traditionally have been administered via paper-and-pencil. This study assessed the feasibility of conducting in-class online surveys in US high schools. Methods: A paper-and-pencil questionnaire assessing computer availability and perceptions of online surveys was mailed to a nationally representative…

  1. Computer-Administered Surveys in Extension.

    ERIC Educational Resources Information Center

    Kawasaki, Jodee L.; Raven, Matt R.

    1995-01-01

    A survey was sent via electronic mail (email) to 116 Montana extension staff. Forty agents and 13 specialists returned it via email; 26 agents and 17 specialists via regular mail. One-third were not comfortable with completing an electronic survey, although it is an easier and less costly method. (SK)

  2. Model Diagnostics for the Department of Energy's Accelerated Climate Modeling for Energy (ACME) Project

    NASA Astrophysics Data System (ADS)

    Smith, B.

    2015-12-01

    In 2014, eight Department of Energy (DOE) national laboratories, four academic institutions, one company, and the National Center for Atmospheric Research combined forces in a project called Accelerated Climate Modeling for Energy (ACME) with the goal of speeding Earth system model development for climate and energy. Over the planned 10-year span, the project will conduct simulations and modeling on DOE's most powerful high-performance computing systems at the Oak Ridge, Argonne, and Lawrence Berkeley Leadership Computing Facilities. A key component of the ACME project is the development of an interactive test bed for the advanced Earth system model. Its execution infrastructure will accelerate model development and testing cycles. The ACME Workflow Group is leading the efforts to automate labor-intensive tasks, provide intelligent support for complex tasks, and reduce duplication of effort through collaboration support. As part of this new workflow environment, we have created a diagnostic, metric, and intercomparison Python framework, called UVCMetrics, to aid in the testing-to-production execution of the ACME model. The framework exploits similarities among different diagnostics to compactly support diagnosis of new models. It presently focuses on atmosphere and land but is designed to support ocean and sea ice model components as well. This framework is built on top of the existing open-source software framework known as the Ultrascale Visualization Climate Data Analysis Tools (UV-CDAT). Because of its flexible framework design, scientists and modelers can now generate thousands of possible diagnostic outputs. These diagnostics can compare model runs, compare model vs. observation, or simply verify that a model is physically realistic. Additional diagnostics are easily integrated into the framework, and our users have already added several. Diagnostics can be generated, viewed, and manipulated from the UV-CDAT graphical user interface, Python command-line scripts, and programs.
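
    In essence, a model-vs-observation diagnostic of the kind such a framework generates reduces gridded fields to summary metrics. A generic numpy sketch (this is not the UVCMetrics API, whose interfaces the abstract does not show; the field shapes and values are toy assumptions):

      import numpy as np

      # Generic model-vs-observation diagnostic: mean bias and RMSE of a 2-D field.
      def bias_and_rmse(model, obs):
          diff = model - obs
          return diff.mean(), np.sqrt((diff ** 2).mean())

      model = np.random.default_rng(0).normal(288.0, 5.0, size=(96, 144))  # toy field
      obs = model + np.random.default_rng(1).normal(0.5, 1.0, size=model.shape)
      b, r = bias_and_rmse(model, obs)
      print(f"bias={b:.3f} K  rmse={r:.3f} K")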

  3. A survey of computer science capstone course literature

    NASA Astrophysics Data System (ADS)

    Dugan, Robert F., Jr.

    2011-09-01

    In this article, we surveyed literature related to undergraduate computer science capstone courses. The survey was organized around course and project issues. Course issues included: course models, learning theories, course goals, course topics, student evaluation, and course evaluation. Project issues included: software process models, software process phases, project type, documentation, tools, groups, and instructor administration. We reflected on these issues and the computer science capstone course we have taught for seven years. The survey summarized, organized, and synthesized the literature to provide a referenced resource for computer science instructors and researchers interested in computer science capstone courses.

  4. State of Washington Computer Use Survey.

    ERIC Educational Resources Information Center

    Beal, Jack L.; And Others

    This report presents the results of a spring 1982 survey of a random sample of Washington public schools which separated findings according to school level (elementary, middle, junior high, or high school) and district size (either less than or greater than 2,000 enrollment). A brief review of previous studies and a description of the survey…

  5. A Survey of Computer Science Capstone Course Literature

    ERIC Educational Resources Information Center

    Dugan, Robert F., Jr.

    2011-01-01

    In this article, we surveyed literature related to undergraduate computer science capstone courses. The survey was organized around course and project issues. Course issues included: course models, learning theories, course goals, course topics, student evaluation, and course evaluation. Project issues included: software process models, software…

  6. Canadian Community College Computer Usage Survey, May 1983.

    ERIC Educational Resources Information Center

    Gee, Michael Dennis

    This survey was conducted to provide information on the level of computer usage in Canadian community colleges. A 19-question form was mailed to the deans of instruction in 175 Canadian public community colleges identified as such by Statistics Canada. Of these, 111 colleges returned their surveys (a 63% response rate), and the results were…

  7. Survey of Computer Facilities in Minnesota and North Dakota.

    ERIC Educational Resources Information Center

    MacGregor, Donald

    In order to attain a better understanding of the data processing manpower needs of business and industry, a survey instrument was designed and mailed to 570 known and possible computer installations in the Minnesota/North Dakota area. The survey was conducted during the spring of 1975, and concentrated on the kinds of equipment and computer…

  8. Autonomic Cluster Management System (ACMS): A Demonstration of Autonomic Principles at Work

    NASA Technical Reports Server (NTRS)

    Baldassari, James D.; Kopec, Christopher L.; Leshay, Eric S.; Truszkowski, Walt; Finkel, David

    2005-01-01

    Cluster computing, whereby a large number of simple processors or nodes are combined together to apparently function as a single powerful computer, has emerged as a research area in its own right. The approach offers a relatively inexpensive means of achieving significant computational capabilities for high-performance computing applications, while simultaneously affording the ability to increase that capability simply by adding more (inexpensive) processors. However, the task of manually managing and configuring a cluster quickly becomes impossible as the cluster grows in size. Autonomic computing is a relatively new approach to managing complex systems that can potentially solve many of the problems inherent in cluster management. We describe the development of a prototype Autonomic Cluster Management System (ACMS) that exploits autonomic properties in automating cluster management.

  9. Sealing Force Increasing of ACM Gasket through Electron Beam Radiation

    NASA Astrophysics Data System (ADS)

    dos Santos, D. J.; Batalha, G. F.

    2011-01-01

    Rubber is an engineering material largely used for sealing parts in the form of O-rings, solid gaskets, and liquid gaskets (materials applied in a liquid state with subsequent vulcanization and sealing). Stress relaxation is a characteristic of rubber that negatively impacts such industrial applications (rings and solid gaskets). The purpose of this work is to investigate the use of electron beam (EB) radiation as a technology able to decrease stress relaxation in acrylic rubber (ACM), consequently increasing the sealing capability of this material. ACM samples were irradiated with doses of 100 kGy and 250 kGy, and their behavior was comparatively investigated using dynamic mechanical analysis (DMA) and compression stress relaxation (CSR) experiments. The results obtained by DMA showed an increase in Tg and changes in dynamic mechanical behavior.

  10. Survey of Educational Computing in the Pacific Rim.

    ERIC Educational Resources Information Center

    Ah Mai, Karen L.

    A survey of selected educational institutions within the Pacific Rim was taken to determine the degree of interest and level of expertise available for an international experiment in computer networking. From the data collected from the Pacific Rim institutions, it appears that the conditions which promulgate computer network development are…

  11. Survey of Intelligent Computer-Aided Training

    NASA Technical Reports Server (NTRS)

    Loftin, R. B.; Savely, Robert T.

    1992-01-01

    Intelligent Computer-Aided Training (ICAT) systems integrate artificial intelligence and simulation technologies to deliver training for complex, procedural tasks in a distributed, workstation-based environment. Such systems embody both the knowledge of how to perform a task and how to train someone to perform that task. This paper briefly reviews the antecedents of ICAT systems and describes the approach to their creation developed at the NASA Lyndon B. Johnson Space Center. In addition to the general ICAT architecture, specific ICAT applications that have been or are currently under development are discussed. ICAT systems can offer effective solutions to a number of training problems of interest to the aerospace community.

  12. Computers for real time flight simulation: A market survey

    NASA Technical Reports Server (NTRS)

    Bekey, G. A.; Karplus, W. J.

    1977-01-01

    An extensive computer market survey was made to determine those available systems suitable for current and future flight simulation studies at Ames Research Center. The primary requirement is for the computation of relatively high frequency content (5 Hz) math models representing powered lift flight vehicles. The Rotor Systems Research Aircraft (RSRA) was used as a benchmark vehicle for computation comparison studies. The general nature of helicopter simulations and a description of the benchmark model are presented, and some of the sources of simulation difficulties are examined. A description of various applicable computer architectures is presented, along with detailed discussions of leading candidate systems and comparisons between them.

  13. Computer generated K indices adopted by the British Geological Survey

    NASA Astrophysics Data System (ADS)

    Clark, T. D. G.

    1992-04-01

    On 1 January 1991 the British Geological Survey adopted a computer method for generating K indices from its three geomagnetic observatories. This replaced the traditional handscaling method, resulting in a saving of staff time. Other advantages are the ability to distribute K indices to users in real time and the fact that there will not be any change in the bias of the K index caused by a change of handscaler in the future. The computer algorithm is described. The results of a comparison between the computed and handscaled K indices are presented, which show the computer method to be compatible with handscaling.
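
    The heart of any computed K index is the mapping from the 3-hour range of the disturbed horizontal field, after removal of the quiet-day (Sr) variation, to the quasi-logarithmic 0-9 scale. A hedged Python sketch: the lower limits shown are the standard K9 = 500 nT (Niemegk) scale, each observatory rescales them by its own K9 limit, and the Sr removal is reduced here to subtracting a supplied baseline.

      # Map the 3-hour range of one horizontal component (nT) to a K index (0-9).
      NIEMEGK_LOWER_LIMITS = [0, 5, 10, 20, 40, 70, 120, 200, 330, 500]

      def k_index(h_values, sr_baseline, k9_limit=500.0):
          """K index from samples of one horizontal component over 3 hours."""
          disturbance = [h - sr for h, sr in zip(h_values, sr_baseline)]
          rng = max(disturbance) - min(disturbance)
          limits = [l * k9_limit / 500.0 for l in NIEMEGK_LOWER_LIMITS]
          k = 0
          for i, lower in enumerate(limits):
              if rng >= lower:
                  k = i
          return k

      # A 60 nT range at a K9 = 500 nT station falls in the 40-70 nT band -> K = 4.
      print(k_index([0.0, 25.0, 60.0], [0.0, 0.0, 0.0]))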

  14. 2005 DOE Computer Graphics Forum Site Survey

    SciTech Connect

    Rebecca, S; Eric, B

    2005-04-15

    The Information Management and Graphics Group supports and develops tools that enhance our ability to access, display, and understand large, complex data sets. Activities include developing visualization software for terascale data exploration; running two video production labs; supporting graphics libraries and tools for end users; maintaining four PowerWalls and assorted other advanced displays; and providing integrated tools for searching, organizing, and browsing scientific data. The Data Group supports the Defense and Nuclear Technologies (D&NT) Directorate. The group's visualization team has developed and maintains two visualization tools, MeshTV and VisIt: interactive graphical analysis tools for visualizing and analyzing data on two- and three-dimensional meshes. The team also provides movie production support. Researchers in the Center for Applied Scientific Computing (CASC) work on various projects, including the development of visualization and data mining techniques for terascale data exploration, that are funded by ASC. The researchers also have LDRD projects and collaborations with other lab researchers, academia, and industry.

  15. Campus Computing 1990: The EDUCOM/USC Survey of Desktop Computing in Higher Education.

    ERIC Educational Resources Information Center

    Green, Kenneth C.; Eastman, Skip

    The National Survey of Desktop Computer Use in Higher Education was conducted in the spring and summer of 1990 by the Center for Scholarly Technology at the University of Southern California, in cooperation with EDUCOM and with support from 15 corporate sponsors. The survey was designed to collect information about campus planning, policies, and…

  16. Empirical Validation and Application of the Computing Attitudes Survey

    ERIC Educational Resources Information Center

    Dorn, Brian; Elliott Tew, Allison

    2015-01-01

    Student attitudes play an important role in shaping learning experiences. However, few validated instruments exist for measuring student attitude development in a discipline-specific way. In this paper, we present the design, development, and validation of the computing attitudes survey (CAS). The CAS is an extension of the Colorado Learning…

  17. The ACLS Survey of Scholars: Views on Publications, Computers, Libraries.

    ERIC Educational Resources Information Center

    Morton, Herbert C.; Price, Anne Jamieson

    1986-01-01

    Reviews results of a survey by the American Council of Learned Societies (ACLS) of 3,835 scholars in the humanities and social sciences who are working both in colleges and universities and outside the academic community. Areas highlighted include professional reading, authorship patterns, computer use, and library use. (LRW)

  18. A Survey of Current Computer Information Science (CIS) Students.

    ERIC Educational Resources Information Center

    Los Rios Community Coll. District, Sacramento, CA. Office of Institutional Research.

    This document is a survey designed to be completed by current students of Computer Information Science (CIS) in the Los Rios Community College District (LRCCD), which consists of three community colleges: American River College, Cosumnes River College, and Sacramento City College. The students are asked about their educational goals and how…

  19. Computer Infrastructure for the Variable Young Stellar Objects Survey

    NASA Astrophysics Data System (ADS)

    Walawender, Josh; Reipurth, Bo; Paegert, Martin

    2011-03-01

    An increasing number of remote or robotically controlled telescopes use commercial "off the shelf" hardware and software. We describe a system, implemented in the Variable Young Stellar Objects Survey (VYSOS) project, which uses simple, commercially available software and hardware to enable the quick restoration of observatory operations in the event of a computer failure.

  20. Business School Computer Usage, Fourth Annual UCLA Survey.

    ERIC Educational Resources Information Center

    Frand, Jason L.; And Others

    The changing nature of the business school computing environment is monitored in a report whose purpose is to provide deans and other policy-makers with information to use in making allocation decisions and program plans. This survey focuses on resource allocations of 249 accredited U.S. business schools and 15 Canadian schools. A total of 128…

  1. Empirical validation and application of the computing attitudes survey

    NASA Astrophysics Data System (ADS)

    Dorn, Brian; Tew, Allison Elliott

    2015-01-01

    Student attitudes play an important role in shaping learning experiences. However, few validated instruments exist for measuring student attitude development in a discipline-specific way. In this paper, we present the design, development, and validation of the computing attitudes survey (CAS). The CAS is an extension of the Colorado Learning Attitudes about Science Survey and measures novice to expert attitude shifts about the nature of knowledge and problem solving in computer science. Factor analysis with a large, multi-institutional data-set identified and confirmed five subscales on the CAS related to different facets of attitudes measured on the survey. We then used the CAS in a pre-post format to demonstrate its usefulness in studying attitude shifts during CS1 courses and its responsiveness to varying instructional conditions. The most recent version of the CAS is provided in its entirety along with a discussion of the conditions under which its validity has been demonstrated.

  2. Metamodels for Computer-Based Engineering Design: Survey and Recommendations

    NASA Technical Reports Server (NTRS)

    Simpson, Timothy W.; Peplinski, Jesse; Koch, Patrick N.; Allen, Janet K.

    1997-01-01

    The use of statistical techniques to build approximations of expensive computer analysis codes pervades much of today's engineering design. These statistical approximations, or metamodels, are used to replace the actual expensive computer analyses, facilitating multidisciplinary, multiobjective optimization and concept exploration. In this paper we review several of these techniques, including design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning, and kriging. We survey their existing applications in engineering design and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of statistical approximation techniques in given situations and how common pitfalls can be avoided.
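
    The workflow the paper surveys can be compressed into a few lines: sample the expensive analysis at a small set of design points, fit a cheap metamodel, and query the metamodel thereafter. A minimal Python sketch with a quadratic response surface in one variable (the "expensive" analysis is a stand-in function of our own):

      import numpy as np

      def expensive_analysis(x):
          # Stand-in for a costly computer analysis code.
          return np.sin(3 * x) + 0.5 * x ** 2

      # Design of experiments: a handful of sample points.
      x_doe = np.linspace(-1.0, 1.0, 7)
      y_doe = expensive_analysis(x_doe)

      # Response surface: quadratic polynomial fitted by least squares.
      surrogate = np.poly1d(np.polyfit(x_doe, y_doe, deg=2))

      # The metamodel now stands in for the analysis code in exploration loops.
      print(surrogate(0.3), expensive_analysis(0.3))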

  3. Acme jumper pipe system for coke-oven charging

    SciTech Connect

    Medved, P.D.; Thomas, H.

    1996-08-01

    Acme Steel has operated larry cars with an attached jumper pipe since 1977 and has been able to meet the State Implementation Plan (SIP). With the advent of the Clean Air Act (CAA), Acme determined that it could not meet the new standards without modifications to the jumper pipe system. Several drop sleeve modifications and boot seal materials and configurations were tested, with limited success in improving boot seal life. These modifications showed that the Clean Air Act standards could be met, but it would be cost prohibitive to continue to operate in this manner. The company decided to install an off-car jumper pipe system which uses a traveling U-tube for connection to the assist oven through an additional hole in the roof of each oven. Temperature-related failures of drop sleeve seals were eliminated. The off-car jumper pipe is a more efficient gas connection to the assist oven and enables the company to meet the Clean Air Act charging requirements in a cost-effective manner.

  4. Fuel gas main replacement at Acme Steel's coke plant

    SciTech Connect

    Trevino, O. . Chicago Coke Plant)

    1994-09-01

    ACME Steel's Chicago coke plant consists of two 4-meter, 50-oven Wilputte underjet coke-oven batteries. These batteries were constructed in 1956-1957. The use of blast furnace gas was discontinued in the late 1960s. In 1977-1978, the oven walls in both batteries were reconstructed. Reconstruction of the underfire system was limited to rebuilding the coke-oven gas reversing cocks and metering orifices. By the early 1980s, the 24-in. diameter underfire fuel gas mains of both batteries developed leaks at the Dresser expansion joints. These leaks were a result of pipe loss due to corrosion. Leaks also developed along the bottoms and sides of both mains. A method is described that permitted pushing temperatures to be maintained during replacement of the underfire fuel gas mains. Each of Acme's two 50-oven, 4-meter Wilputte coke-oven, gas-fired batteries was heated by converting 10-in. diameter decarbonizing air mains into temporary fuel gas mains. Replacement was made one battery at a time, with the temporary 10-in. mains in service for five to eight weeks.

  5. A survey of software adaptation in mobile and ubiquitous computing

    NASA Astrophysics Data System (ADS)

    Kakousis, Konstantinos; Paspallis, Nearchos; Angelos Papadopoulos, George

    2010-11-01

    Driven by the vast proliferation of mobile devices and ubiquitous computing, dynamic software adaptation is becoming one of the most common terms in Software Engineering and Computer Science in general. After the evolution in autonomic and ubiquitous computing, we will soon expect devices to understand our changing needs and react to them as transparently as possible. Software adaptation is not a new term though; it has been extensively researched in several domains and in numerous forms. This has resulted in several interpretations of adaptation. This survey aims to provide a disambiguation of the term, as it is understood in ubiquitous computing, and a critical evaluation of existing software adaptation approaches. In particular, we focus on existing solutions that enable dynamic software modifications that happen on resource constrained devices, deployed in mobile and ubiquitous computing environments.

  6. A survey of GPU-based medical image computing techniques.

    PubMed

    Shi, Lin; Liu, Wen; Zhang, Heye; Xie, Yongming; Wang, Defeng

    2012-09-01

    Medical imaging currently plays a crucial role throughout the entire range of clinical applications, from medical scientific research to diagnostics and treatment planning. However, medical imaging procedures are often computationally demanding due to the large three-dimensional (3D) medical datasets to be processed in practical clinical applications. With the rapidly improving performance of graphics processors, improved programming support, and an excellent price-to-performance ratio, the graphics processing unit (GPU) has emerged as a competitive parallel computing platform for computationally expensive and demanding tasks in a wide range of medical imaging applications. The major purpose of this survey is to provide a comprehensive reference source for starters or researchers involved in GPU-based medical image processing. Within this survey, the continuous advancement of GPU computing is reviewed and the existing traditional applications in three areas of medical image processing, namely segmentation, registration, and visualization, are surveyed. The potential advantages and associated challenges of current GPU-based medical imaging are also discussed to inspire future applications in medicine. PMID:23256080

  7. A survey of GPU-based medical image computing techniques

    PubMed Central

    Shi, Lin; Liu, Wen; Zhang, Heye; Xie, Yongming

    2012-01-01

    Medical imaging currently plays a crucial role throughout the entire range of clinical applications, from medical scientific research to diagnostics and treatment planning. However, medical imaging procedures are often computationally demanding due to the large three-dimensional (3D) medical datasets to be processed in practical clinical applications. With the rapidly improving performance of graphics processors, improved programming support, and an excellent price-to-performance ratio, the graphics processing unit (GPU) has emerged as a competitive parallel computing platform for computationally expensive and demanding tasks in a wide range of medical imaging applications. The major purpose of this survey is to provide a comprehensive reference source for starters or researchers involved in GPU-based medical image processing. Within this survey, the continuous advancement of GPU computing is reviewed and the existing traditional applications in three areas of medical image processing, namely segmentation, registration, and visualization, are surveyed. The potential advantages and associated challenges of current GPU-based medical imaging are also discussed to inspire future applications in medicine. PMID:23256080

  8. Air Traffic Complexity Measurement Environment (ACME): Software User's Guide

    NASA Technical Reports Server (NTRS)

    1996-01-01

    A user's guide for the Air Traffic Complexity Measurement Environment (ACME) software is presented. The ACME consists of two major components: a complexity analysis tool and a user interface. The Complexity Analysis Tool (CAT) analyzes complexity off-line, producing data files which may be examined interactively via the Complexity Data Analysis Tool (CDAT). The Complexity Analysis Tool is composed of three independently executing processes that communicate via PVM (Parallel Virtual Machine) and Unix sockets. The Runtime Data Management and Control process (RUNDMC) extracts flight plan and track information from a SAR input file and sends the information to GARP (Generate Aircraft Routes Process) and CAT (Complexity Analysis Task). GARP in turn generates aircraft trajectories, which are utilized by CAT to calculate sector complexity. CAT writes flight plan, track, and complexity data to an output file, which can be examined interactively. The Complexity Data Analysis Tool (CDAT) provides an interactive graphic environment for examining the complexity data produced by the Complexity Analysis Tool (CAT). CDAT can also play back track data extracted from System Analysis Recording (SAR) tapes. The CDAT user interface consists of a primary window, a controls window, and miscellaneous pop-ups. Aircraft track and position data are displayed in the main viewing area of the primary window. The controls window contains miscellaneous control and display items. Complexity data are displayed in pop-up windows. CDAT plays back sector complexity and aircraft track and position data as a function of time. Controls are provided to start and stop playback, adjust the playback rate, and reposition the display to a specified time.

  9. Icing simulation: A survey of computer models and experimental facilities

    NASA Technical Reports Server (NTRS)

    Potapczuk, M. G.; Reinmann, J. J.

    1991-01-01

    A survey of the current methods for simulation of the response of an aircraft or aircraft subsystem to an icing encounter is presented. The topics discussed include a computer code modeling of aircraft icing and performance degradation, an evaluation of experimental facility simulation capabilities, and ice protection system evaluation tests in simulated icing conditions. Current research focussed on upgrading simulation fidelity of both experimental and computational methods is discussed. The need for increased understanding of the physical processes governing ice accretion, ice shedding, and iced airfoil aerodynamics is examined.

  10. Survey of computer programs for heat transfer analysis

    NASA Technical Reports Server (NTRS)

    Noor, A. K.

    1982-01-01

    An overview is presented of the current capabilities of thirty-eight computer programs that can be used for the solution of heat transfer problems. These programs range from large, general-purpose codes with a broad spectrum of capabilities, a large user community, and comprehensive user support (e.g., ANSYS, MARC, MITAS II, MSC/NASTRAN, SESAM-69/NV-615) to small, special-purpose codes with limited user communities, such as ANDES, NNTB, SAHARA, SSPTA, TACO, TEPSA, and TRUMP. The capabilities of the programs surveyed are listed in tabular form, followed by a summary of the major features of each program. As with any survey of computer programs, the present one has the following limitations: (1) it is useful only in the initial selection of the programs which are most suitable for a particular application; the final selection of the program to be used should, however, be based on a detailed examination of the documentation and the literature about the program; (2) since computer software continually changes, often at a rapid rate, some means must be found for updating this survey and maintaining some degree of currency.

  11. Survey of computer programs for heat transfer analysis

    NASA Astrophysics Data System (ADS)

    Noor, A. K.

    An overview is presented of the current capabilities of thirty-eight computer programs that can be used for the solution of heat transfer problems. These programs range from large, general-purpose codes with a broad spectrum of capabilities, a large user community, and comprehensive user support (e.g., ANSYS, MARC, MITAS II, MSC/NASTRAN, SESAM-69/NV-615) to small, special-purpose codes with limited user communities, such as ANDES, NNTB, SAHARA, SSPTA, TACO, TEPSA, and TRUMP. The capabilities of the programs surveyed are listed in tabular form, followed by a summary of the major features of each program. As with any survey of computer programs, the present one has the following limitations: (1) it is useful only in the initial selection of the programs which are most suitable for a particular application; the final selection of the program to be used should, however, be based on a detailed examination of the documentation and the literature about the program; (2) since computer software continually changes, often at a rapid rate, some means must be found for updating this survey and maintaining some degree of currency.

  12. Sparse Polynomial Chaos Surrogate for ACME Land Model via Iterative Bayesian Compressive Sensing

    NASA Astrophysics Data System (ADS)

    Sargsyan, K.; Ricciuto, D. M.; Safta, C.; Debusschere, B.; Najm, H. N.; Thornton, P. E.

    2015-12-01

    For computationally expensive climate models, Monte Carlo approaches to exploring the input parameter space are often prohibitive due to slow convergence with respect to ensemble size. To alleviate this, we build inexpensive surrogates using uncertainty quantification (UQ) methods employing Polynomial Chaos (PC) expansions that approximate the input-output relationships using as few model evaluations as possible. However, when many uncertain input parameters are present, such UQ studies suffer from the curse of dimensionality. In particular, for 50-100 input parameters, non-adaptive PC representations have infeasible numbers of basis terms. To this end, we develop and employ Weighted Iterative Bayesian Compressive Sensing to learn the most important input parameter relationships for efficient, sparse PC surrogate construction, with posterior uncertainty quantified due to insufficient data. Besides drastic dimensionality reduction, the uncertain surrogate can efficiently replace the model in computationally intensive studies such as forward uncertainty propagation and variance-based sensitivity analysis, as well as design optimization and parameter estimation using observational data. We applied the surrogate construction and variance-based uncertainty decomposition to the Accelerated Climate Model for Energy (ACME) Land Model for several output QoIs at nearly 100 FLUXNET sites covering multiple plant functional types and climates, varying 65 input parameters over broad ranges of possible values. This work is supported by the U.S. Department of Energy, Office of Science, Biological and Environmental Research, Accelerated Climate Modeling for Energy (ACME) project. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
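
    At its simplest, a PC surrogate expands the model output in orthogonal polynomials of the rescaled inputs, and a sparsity method keeps only the basis terms the data support. The Python sketch below is a deliberately crude stand-in: ordinary least squares plus hard thresholding in place of the paper's Weighted Iterative Bayesian Compressive Sensing, with a toy one-input model and a truncation order of our own choosing.

      import numpy as np
      from numpy.polynomial import legendre

      rng = np.random.default_rng(0)

      def model(x):
          # Stand-in for an expensive land-model output; x rescaled to [-1, 1].
          return 1.0 + 0.8 * x + 0.3 * (1.5 * x ** 2 - 0.5)  # sparse in Legendre basis

      # Few model evaluations, as in the expensive-simulation setting.
      x = rng.uniform(-1.0, 1.0, size=30)
      y = model(x) + rng.normal(0.0, 0.01, size=x.size)

      # Legendre (PC) design matrix up to order 6, fitted by least squares.
      A = legendre.legvander(x, 6)
      c, *_ = np.linalg.lstsq(A, y, rcond=None)

      # Hard thresholding as a crude sparsification step.
      c[np.abs(c) < 0.05] = 0.0
      print("retained PC coefficients:", np.round(c, 3))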

  13. Surveying co-located space geodesy techniques for ITRF computation

    NASA Astrophysics Data System (ADS)

    Sarti, P.; Sillard, P.; Vittuari, L.

    2003-04-01

    We present a comprehensive operational methodology, based on classical geodetic triangulation and trilateration, that allows the determination of the reference points of the five space geodesy techniques used in ITRF computation (i.e., DORIS, GPS, LLR, SLR, VLBI). Most of the time, for a single technique, the reference point is not accessible and measurable directly. Likewise, no mechanically determined ex-center with respect to an external, measurable point is usually given. In these cases, it is not possible to directly measure the sought reference points, and it is even less straightforward to obtain the statistical information relating these points for different techniques. We outline the most general practical surveying methodology that permits recovery of the reference points of the different techniques regardless of their physical materialization. We also give a detailed analytical approach for less straightforward cases (e.g., non-geodetic VLBI antennae and SLR/LLR systems). We stress the importance of surveying instrumentation and procedure in achieving the best possible results and outline the impact of the information retrieved with our method on ITRF computation. In particular, we give numerical examples of the computation of the reference point of VLBI antennae (Ny Aalesund and Medicina) and of the ex-center vector computation linking the co-located VLBI and GPS techniques in Medicina (Italy). Special attention was paid to the rigorous derivation of statistical elements; these will be presented in another presentation.
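
    In schematic form, such an indirect method recovers an inaccessible reference point from targets mounted on the rotating structure: as the instrument rotates about an axis, each target traces a circle whose center lies on that axis, so the axis, and hence the reference point, can be estimated from least-squares fits. The LaTeX sketch below uses our own notation, not the authors':

      \hat{\mathbf{c}}, \hat{r} \;=\; \arg\min_{\mathbf{c},\, r}
          \sum_{i=1}^{n} \left( \lVert \mathbf{p}_i - \mathbf{c} \rVert - r \right)^{2}

    where \mathbf{p}_i are surveyed target positions at successive rotation angles; the reference point then follows as the intersection (or point of closest approach) of the fitted rotation axes.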

  14. A Survey of Architectural Techniques for Near-Threshold Computing

    DOE PAGESBeta

    Mittal, Sparsh

    2015-12-28

    Energy efficiency has now become the primary obstacle to scaling the performance of all classes of computing systems. Low-voltage computing, and specifically near-threshold voltage computing (NTC), which involves operating the transistor very close to, and yet above, its threshold voltage, holds the promise of providing a many-fold improvement in energy efficiency. However, the use of NTC also presents several challenges, such as increased parametric variation, higher failure rates, and performance loss. Our paper surveys several recent techniques that aim to offset these challenges and fully leverage the potential of NTC. By classifying these techniques along several dimensions, we also highlight their similarities and differences. Ultimately, we hope that this paper will provide insights into state-of-the-art NTC techniques to researchers and system designers and inspire further research in this field.

  15. A Survey of Architectural Techniques for Near-Threshold Computing

    SciTech Connect

    Mittal, Sparsh

    2015-12-28

    Energy efficiency has now become the primary obstacle to scaling the performance of all classes of computing systems. Low-voltage computing, and specifically near-threshold voltage computing (NTC), which involves operating the transistor very close to, and yet above, its threshold voltage, holds the promise of providing a many-fold improvement in energy efficiency. However, the use of NTC also presents several challenges, such as increased parametric variation, higher failure rates, and performance loss. Our paper surveys several recent techniques that aim to offset these challenges and fully leverage the potential of NTC. By classifying these techniques along several dimensions, we also highlight their similarities and differences. Ultimately, we hope that this paper will provide insights into state-of-the-art NTC techniques to researchers and system designers and inspire further research in this field.

  16. A survey of CPU-GPU heterogeneous computing techniques

    DOE PAGESBeta

    Mittal, Sparsh; Vetter, Jeffrey S.

    2015-07-04

    As both the CPU and the GPU become employed in a wide range of applications, it has been acknowledged that both of these processing units (PUs) have their own unique features and strengths, and hence CPU-GPU collaboration is inevitable to achieve high-performance computing. This has motivated a significant amount of research on heterogeneous computing techniques, along with the design of CPU-GPU fused chips and petascale heterogeneous supercomputers. In this paper, we survey heterogeneous computing techniques (HCTs), such as workload partitioning, which enable utilizing both the CPU and the GPU to improve performance and/or energy efficiency. We review heterogeneous computing approaches at the runtime, algorithm, programming, compiler, and application levels. Further, we review both discrete and fused CPU-GPU systems and discuss benchmark suites designed for evaluating heterogeneous computing systems (HCSs). We believe that this paper will provide insights into the working and scope of applications of HCTs to researchers and motivate them to further harness the computational power of CPUs and GPUs to achieve the goal of exascale performance.

  17. A survey of CPU-GPU heterogeneous computing techniques

    SciTech Connect

    Mittal, Sparsh; Vetter, Jeffrey S.

    2015-07-04

    As both the CPU and the GPU become employed in a wide range of applications, it has been acknowledged that both of these processing units (PUs) have their own unique features and strengths, and hence CPU-GPU collaboration is inevitable to achieve high-performance computing. This has motivated a significant amount of research on heterogeneous computing techniques, along with the design of CPU-GPU fused chips and petascale heterogeneous supercomputers. In this paper, we survey heterogeneous computing techniques (HCTs), such as workload partitioning, which enable utilizing both the CPU and the GPU to improve performance and/or energy efficiency. We review heterogeneous computing approaches at the runtime, algorithm, programming, compiler, and application levels. Further, we review both discrete and fused CPU-GPU systems and discuss benchmark suites designed for evaluating heterogeneous computing systems (HCSs). We believe that this paper will provide insights into the working and scope of applications of HCTs to researchers and motivate them to further harness the computational power of CPUs and GPUs to achieve the goal of exascale performance.
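
    Workload partitioning, the first HCT the survey names, can be illustrated by splitting a data-parallel job between two workers in proportion to their assumed relative throughput. A hypothetical Python sketch (threads stand in for the CPU and the GPU, and the 70/30 split is an arbitrary assumption):

      import threading

      # Toy workload partitioning: split an array between two processing units
      # in proportion to their assumed relative throughput.
      def work(chunk, out, idx):
          out[idx] = sum(v * v for v in chunk)  # stand-in compute kernel

      data = list(range(100_000))
      gpu_share = 0.7                 # assumed: the "GPU" is ~2.3x the "CPU"
      split = int(len(data) * gpu_share)
      out = [0, 0]

      t_gpu = threading.Thread(target=work, args=(data[:split], out, 0))
      t_cpu = threading.Thread(target=work, args=(data[split:], out, 1))
      t_gpu.start(); t_cpu.start()
      t_gpu.join(); t_cpu.join()
      print("total =", out[0] + out[1])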

  18. Campus Computing, 2001: The 12th National Survey of Computing and Information Technology in American Higher Education.

    ERIC Educational Resources Information Center

    Green, Kenneth C.

    The 2001 Campus Computing Survey, the 12th such survey, is the largest continuing study of the role of computing and information technology in U.S. higher education today. The survey results in this report summarize data from 590 two- and four-year, public and private colleges across the United States, representing a 38.4% response rate. The focus…

  19. Campus Computing, 1995: The Sixth National Survey of Desktop Computing in Higher Education.

    ERIC Educational Resources Information Center

    Green, Kenneth C.

    This monograph reports findings of a Fall, 1995 survey of computing officials at approximately 650 two- and four-year colleges and universities across the United States concerning increasing use of technology on college campuses. Major findings include: the percentage of college courses using e-mail and multimedia resources more than doubled; the…

  20. Campus Computing 1991. The EDUCOM-USC Survey of Desktop Computing in Higher Education.

    ERIC Educational Resources Information Center

    Green, Kenneth C.; Eastman, Skip

    A national survey of desktop computing in higher education was conducted in 1991 of 2500 institutions. Data were responses from public and private research universities, public and private four-year colleges, and community colleges. Respondents (N=1099) were individuals specifically responsible for the operation and future direction of academic…

  1. Campus Computing 1993. The USC National Survey of Desktop Computing in Higher Education.

    ERIC Educational Resources Information Center

    Green, Kenneth C.; Eastman, Skip

    A national survey of desktop computing in higher education was conducted in spring and summer 1993 at over 2500 institutions. Data were responses from public and private research universities, public and private four-year colleges and community colleges. Respondents (N=1011) were individuals specifically responsible for the operation and future…

  2. Campus Computing 1992. The EDUCOM-USC Survey of Desktop Computing in Higher Education.

    ERIC Educational Resources Information Center

    Green, Kenneth C.; Eastman, Skip

    A national survey of desktop computing in higher education was conducted in 1992 of 2500 institutions. Data were responses from public and private research universities, public and private four-year colleges, and community colleges. Respondents (N=970) were individuals specifically responsible for the operation and future direction of academic…

  3. Campus Computing, 1998. The Ninth National Survey of Desktop Computing and Information Technology in American Higher Education.

    ERIC Educational Resources Information Center

    Green, Kenneth C.

    This report presents findings of a June 1998 survey of computing officials at 1,623 two- and four-year U.S. colleges and universities concerning the use of computer technology. The survey found that computing and information technology (IT) are now core components of the campus environment and classroom experience. However, key aspects of IT…

  4. Autolysis of Lactococcus lactis caused by induced overproduction of its major autolysin, AcmA.

    PubMed Central

    Buist, G; Karsens, H; Nauta, A; van Sinderen, D; Venema, G; Kok, J

    1997-01-01

    The optical density of a culture of Lactococcus lactis MG1363 was reduced by more than 60% during prolonged stationary phase. Reduction in optical density (autolysis) was almost absent in a culture of an isogenic mutant containing a deletion in the major autolysin gene, acmA. An acmA mutant carrying multiple copies of a plasmid encoding AcmA lysed to a greater extent than the wild-type strain did. Intercellular action of AcmA was shown by mixing end-exponential-phase cultures of an acmA deletion mutant and a tripeptidase (pepT) deletion mutant. PepT, produced by the acmA mutant, was detected in the supernatant of the mixed culture, but no PepT was present in the culture supernatant of the acmA mutant alone. A plasmid was constructed in which acmA, lacking its own promoter, was placed downstream of the inducible promoter/operator region of the temperate lactococcal bacteriophage r1t. After mitomycin induction of an exponential-phase culture of L. lactis LL302 carrying this plasmid, the cells became subject to autolysis, resulting in the release of intracellular proteins. PMID:9212419

  5. The Presence of Computers in American Schools. Teaching, Learning, and Computing: 1998 National Survey. Report No.2.

    ERIC Educational Resources Information Center

    Anderson, Ronald E.; Ronnkvist, Amy

    In order to assess the current presence of computing technology in American schools, a national survey was conducted of elementary and secondary principals and technology coordinators in 655 public and private schools. Results are discussed in terms of: computer density; computer capacity; computer renewal; peripherals; computer location;…

  6. ARM Airborne Carbon Measurements VI (ACME VI) Science Plan

    SciTech Connect

    Biraud, S

    2015-12-01

    From October 1 through September 30, 2016, the Atmospheric Radiation Measurement (ARM) Aerial Facility will deploy the Cessna 206 aircraft over the Southern Great Plains (SGP) site, collecting observations of trace-gas mixing ratios over the ARM’s SGP facility. The aircraft payload includes two Atmospheric Observing Systems, Inc., analyzers for continuous measurements of CO2 and a 12-flask sampler for analysis of carbon cycle gases (CO2, CO, CH4, N2O, 13CO2, 14CO2, carbonyl sulfide, and trace hydrocarbon species, including ethane). The aircraft payload also includes instrumentation for solar/infrared radiation measurements. This research is supported by the U.S. Department of Energy’s ARM Climate Research Facility and Terrestrial Ecosystem Science Program and builds upon previous ARM Airborne Carbon Measurements (ARM-ACME) missions. The goal of these measurements is to improve understanding of 1) the carbon exchange at the SGP site, 2) how CO2 and associated water and energy fluxes influence radiative forcing, convective processes and CO2 concentrations over the SGP site, and 3) how greenhouse gases are transported on continental scales.

  7. Pomegranate MR images analysis using ACM and FCM algorithms

    NASA Astrophysics Data System (ADS)

    Morad, Ghobad; Shamsi, Mousa; Sedaaghi, M. H.; Alsharif, M. R.

    2011-10-01

    Segmentation of an image plays an important role in image processing applications. In this paper, segmentation of pomegranate magnetic resonance (MR) images has been explored. Pomegranate has valuable nutritional and medicinal properties, and its maturity indices and the quality of its internal tissues play an important role in the sorting process; these features cannot easily be determined by a human operator. Seeds and soft tissues are the main internal components of pomegranate. For research purposes, such as non-destructive investigation to determine the ripening index and the percentage of seeds during the growth period, segmentation of the internal structures should be performed as exactly as possible. In this paper, we present an automatic algorithm to segment the internal structure of pomegranate. Since the intensity of the stem and calyx is close to that of the internal tissues, segmentation algorithms usually mislabel stem and calyx pixels as internal tissue. To solve this problem, first, the fruit shape is extracted from the background using an active contour model (ACM). Then the stem and calyx are removed using morphological filters. Finally, the image is segmented by fuzzy c-means (FCM) clustering. The experimental results show an accuracy of 95.91% in the presence of the stem and calyx, while the accuracy of segmentation increases to 97.53% when the stem and calyx are first removed by morphological filters.
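
    To make the pipeline described in this abstract concrete, the following is a minimal Python sketch of the ACM → morphology → FCM sequence. It assumes scikit-image and SciPy are installed, hand-rolls the FCM step for self-containedness, and reads a hypothetical file name; it illustrates the general technique, not the authors' implementation.

    ```python
    # Sketch of the three-stage pipeline: active-contour foreground extraction,
    # morphological removal of stem/calyx, then fuzzy c-means (FCM) clustering.
    # 'pomegranate_mr.png' is a hypothetical input file.
    import numpy as np
    from scipy import ndimage
    from skimage import io, img_as_float
    from skimage.segmentation import morphological_chan_vese

    def fcm(values, n_clusters=2, m=2.0, n_iter=100, seed=0):
        """Minimal fuzzy c-means on a 1-D array of pixel intensities."""
        rng = np.random.default_rng(seed)
        u = rng.random((n_clusters, values.size))
        u /= u.sum(axis=0)                      # memberships sum to 1 per pixel
        for _ in range(n_iter):
            um = u ** m
            centers = (um @ values) / um.sum(axis=1)    # fuzzy-weighted means
            d = np.abs(values[None, :] - centers[:, None]) + 1e-12
            u = 1.0 / d ** (2 / (m - 1))
            u /= u.sum(axis=0)                  # renormalize memberships
        return u.argmax(axis=0), centers

    img = img_as_float(io.imread("pomegranate_mr.png", as_gray=True))

    # 1) Active contour (morphological Chan-Vese) separates fruit from background.
    fruit_mask = morphological_chan_vese(img, 100).astype(bool)

    # 2) Morphological opening removes thin protrusions such as stem and calyx.
    fruit_mask = ndimage.binary_opening(fruit_mask, structure=np.ones((9, 9)))

    # 3) FCM splits the interior into two intensity classes (seeds vs soft tissue).
    labels, _ = fcm(img[fruit_mask])
    seg = np.zeros(img.shape, dtype=int)
    seg[fruit_mask] = labels + 1                # 0 = background
    ```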

  8. Design and implementation of GaAs HBT circuits with ACME

    NASA Technical Reports Server (NTRS)

    Hutchings, Brad L.; Carter, Tony M.

    1993-01-01

    GaAs HBT circuits offer high performance (5-20 GHz) and radiation hardness (500 Mrad) that is attractive for space applications. ACME is a CAD tool specifically developed for HBT circuits. ACME implements a novel physical schematic-capture design technique whereby designers simultaneously view the structure and physical organization of a circuit. ACME's design interface is similar to schematic capture; however, unlike conventional schematic capture, designers can directly control the physical placement of both function and interconnect at the schematic level. In addition, ACME provides design-time parasitic extraction, complex wire models, and extensions to multi-chip modules (MCMs). A GaAs HBT gate-array and semi-custom circuits have been developed with ACME; several circuits have been fabricated and found to be fully functional.

  9. A survey of computational aerodynamics in the United States

    NASA Technical Reports Server (NTRS)

    Gessow, A.; Morris, D. J.

    1977-01-01

    Programs in theoretical and computational aerodynamics in the United States are described. Those aspects of programs that relate to aeronautics are detailed. The role of analysis at various levels of sophistication is discussed as well as the inverse solution techniques that are of primary importance in design methodology. The research is divided into the broad categories of application for boundary layer flow, Navier-Stokes turbulence modeling, internal flows, two-dimensional configurations, subsonic and supersonic aircraft, transonic aircraft, and the space shuttle. A survey of representative work in each area is presented.

  10. Importance of Computer Competencies for Entering JCCC Students: A Survey of Faculty and Staff.

    ERIC Educational Resources Information Center

    Weglarz, Shirley

    Johnson County Community College (JCCC) conducted a survey in response to faculty comments regarding entering students' lack of rudimentary computer skills. Faculty were spending time in non-computer related classes teaching students basic computer skills. The aim of the survey was to determine what the basic computer competencies for entering…

  11. Campus Computing, 1996. The Seventh National Survey of Desktop Computing and Information Technology in American Higher Education.

    ERIC Educational Resources Information Center

    Green, Kenneth C.

    This report presents the findings of a June, 1996, survey of computing officials at 660 two- and four-year colleges and universities across the United States concerning the use of computer technology on college campuses. The survey found that instructional integration and user support emerged as the two most important information technology (IT)…

  12. Modern cosmology: Interactive computer simulations that use recent observational surveys

    NASA Astrophysics Data System (ADS)

    Moldenhauer, Jacob; Engelhardt, Larry; Stone, Keenan M.; Shuler, Ezekiel

    2013-06-01

    We present a collection of new, open-source computational tools for numerically modeling recent large-scale observational data sets using modern cosmology theory. These tools allow both students and researchers to constrain the parameter values in competitive cosmological models, thereby discovering both the accelerated expansion of the universe and its composition (e.g., dark matter and dark energy). These programs have several features to help the non-cosmologist build an understanding of cosmological models and their relation to observational data, including a built-in collection of several real observational data sets. The current list of built-in observations includes several recent Type Ia supernova surveys, baryon acoustic oscillations, the cosmic microwave background radiation, gamma-ray bursts, and measurements of the Hubble parameter. In this article, we discuss specific results for testing cosmological models using these observational data.
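
    As a concrete illustration of the kind of parameter constraint these tools perform, the sketch below grid-searches the matter density Omega_m of a flat ΛCDM model against synthetic Type Ia supernova distance moduli; a real analysis would substitute one of the built-in survey data sets. All numbers here are assumptions for illustration.

    ```python
    # Chi-squared grid search for Omega_m in flat LambdaCDM, using synthetic
    # supernova distance moduli (real surveys would supply z, mu, sigma_mu).
    import numpy as np
    from scipy.integrate import quad

    C_KM_S, H0 = 299792.458, 70.0      # speed of light (km/s), Hubble constant

    def distance_modulus(z, omega_m):
        """mu(z) = 5 log10(d_L / 10 pc) for a flat LambdaCDM model."""
        E = lambda zp: np.sqrt(omega_m * (1 + zp) ** 3 + (1 - omega_m))
        comoving = quad(lambda zp: 1.0 / E(zp), 0.0, z)[0] * C_KM_S / H0   # Mpc
        return 5 * np.log10((1 + z) * comoving) + 25    # d_L in Mpc -> mu

    # Synthetic "survey": data generated at Omega_m = 0.3 with 0.15 mag scatter.
    rng = np.random.default_rng(1)
    z_obs = np.linspace(0.05, 1.2, 40)
    sigma = np.full(z_obs.size, 0.15)
    mu_obs = np.array([distance_modulus(z, 0.3) for z in z_obs])
    mu_obs += rng.normal(0.0, 0.15, z_obs.size)

    # Evaluate chi-squared on a grid of trial Omega_m values.
    grid = np.linspace(0.05, 0.95, 91)
    chi2 = [np.sum(((mu_obs - np.array([distance_modulus(z, om) for z in z_obs]))
                    / sigma) ** 2) for om in grid]
    print("best-fit Omega_m ~", grid[np.argmin(chi2)])
    ```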

  13. Computer-Aided Diagnostic System For Mass Survey Chest Images

    NASA Astrophysics Data System (ADS)

    Yasuda, Yoshizumi; Kinoshita, Yasuhiro; Emori, Yasufumi; Yoshimura, Hitoshi

    1988-06-01

    In order to support screening of chest radiographs on mass survey, a computer-aided diagnostic system that automatically detects abnormality of candidate images using a digital image analysis technique has been developed. Extracting boundary lines of lung fields and examining their shapes allowed various kind of abnormalities to be detected. Correction and expansion were facilitated by describing the system control, image analysis control and judgement of abnormality in the rule type programing language. In the experiments using typical samples of student's radiograms, good results were obtained for the detection of abnormal shape of lung field, cardiac hypertrophy and scoliosis. As for the detection of diaphragmatic abnormality, relatively good results were obtained but further improvements will be necessary.

  14. Computational needs survey of NASA automation and robotics missions. Volume 1: Survey and results

    NASA Technical Reports Server (NTRS)

    Davis, Gloria J.

    1991-01-01

    NASA's operational use of advanced processor technology in space systems lags behind its commercial development by more than eight years. One of the factors contributing to this is that mission computing requirements are frequently unknown, unstated, misrepresented, or simply not available in a timely manner. NASA must provide clear common requirements to make better use of available technology, to cut development lead time on deployable architectures, and to increase the utilization of new technology. A preliminary set of advanced mission computational processing requirements of automation and robotics (A&R) systems is provided for use by NASA, industry, and academic communities. These results were obtained in an assessment of the computational needs of current projects throughout NASA. A high percentage of responses indicated a general need for enhanced computational capabilities beyond the currently available 80386 and 68020 processor technology. Because of the need for faster processors and more memory, 90 percent of the polled automation projects have reduced or will reduce the scope of their implementation capabilities. The requirements are presented with respect to their targeted environment, identifying the applications required, the system performance levels necessary to support them, and the degree to which they are met within typical programmatic constraints. Volume one includes the survey and results. Volume two contains the appendixes.

  15. Comparative study of numerical schemes of TVD3, UNO3-ACM and optimized compact scheme

    NASA Technical Reports Server (NTRS)

    Lee, Duck-Joo; Hwang, Chang-Jeon; Ko, Duck-Kon; Kim, Jae-Wook

    1995-01-01

    Three different schemes are employed to solve the benchmark problems. The first is a conventional TVD-MUSCL (Monotone Upwind Schemes for Conservation Laws) scheme. The second is a UNO3-ACM (Uniformly Non-Oscillatory Artificial Compression Method) scheme. The third is an optimized compact finite difference scheme modified by us: 4th-order Runge-Kutta time stepping combined with a 4th-order pentadiagonal compact spatial discretization with maximum resolution characteristics. The problems of category 1 are solved using the second (UNO3-ACM) and third (optimized compact) schemes. The problems of category 2 are solved using the first (TVD3) and second (UNO3-ACM) schemes. The problem of category 5 is solved using the first (TVD3) scheme. It can be concluded from the present calculations that the optimized compact scheme and the UNO3-ACM show good resolution for categories 1 and 2, respectively.
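
    Of the components named in this abstract, the fourth-order Runge-Kutta time stepping is simple enough to sketch. The following minimal Python implementation of the classical RK4 step is illustrative only and does not include the authors' pentadiagonal compact spatial discretization.

    ```python
    # Classical 4th-order Runge-Kutta step for du/dt = f(t, u).
    import numpy as np

    def rk4_step(f, t, u, dt):
        k1 = f(t, u)
        k2 = f(t + dt / 2, u + dt / 2 * k1)
        k3 = f(t + dt / 2, u + dt / 2 * k2)
        k4 = f(t + dt, u + dt * k3)
        return u + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

    # Check against u' = -u, whose exact solution is exp(-t).
    u, t, dt = np.array([1.0]), 0.0, 0.01
    for _ in range(100):
        u = rk4_step(lambda t, u: -u, t, u, dt)
        t += dt
    print(u[0], np.exp(-1.0))    # agree to ~1e-10
    ```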

  16. Survey of computer programs for heat transfer analysis

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.

    1986-01-01

    An overview is given of the current capabilities of thirty-three computer programs that are used to solve heat transfer problems. The programs considered range from large general-purpose codes with broad spectrum of capabilities, large user community, and comprehensive user support (e.g., ABAQUS, ANSYS, EAL, MARC, MITAS II, MSC/NASTRAN, and SAMCEF) to the small, special-purpose codes with limited user community such as ANDES, NTEMP, TAC2D, TAC3D, TEPSA and TRUMP. The majority of the programs use either finite elements or finite differences for the spatial discretization. The capabilities of the programs are listed in tabular form followed by a summary of the major features of each program. The information presented herein is based on a questionnaire sent to the developers of each program. This information is preceded by a brief background material needed for effective evaluation and use of computer programs for heat transfer analysis. The present survey is useful in the initial selection of the programs which are most suitable for a particular application. The final selection of the program to be used should, however, be based on a detailed examination of the documentation and the literature about the program.

  17. Improving radiation survey data using CADD/CAE (computer-aided design and drafting computer-aided engineering)

    SciTech Connect

    Palau, G.L.; Tarpinian, J.E.

    1987-01-01

    A new application of computer-aided design and drafting (CADD) and computer-aided engineering (CAE) at the Three Mile Island Unit 2 (TMI-2) cleanup is improving the quality of radiation survey data taken in the plant. The use of CADD/CAE-generated survey maps has increased both the accuracy of survey data and the capability to perform analyses with these data. In addition, health physics technician manhours and radiation exposure can be reduced in situations where the CADD/CAE-generated drawings are used for survey mapping.

  18. X-ray diffraction computed tomography: a survey and description

    NASA Astrophysics Data System (ADS)

    Kleuker, Ulf

    1997-10-01

    Coherently scattered x-rays are mainly confined to a forward-peaked cone which, due to their coherence, carries structural information about the atomic arrangement in the sample. Coherent scattering in amorphous materials, which have random short-range order, therefore results in broad diffraction ring patterns, whereas crystalline substances show more confined diffraction rings or even Bragg spots. X-ray diffraction computed tomography (XRDCT) reconstructs the intensities diffracted from extended objects on a square image grid and thus retrieves the local structure. A short survey is presented of what information can be extracted from diffraction experiments. A new method is proposed to use Rietveld refinement for quantitative XRDCT, and the possible use of XRDCT to reconstruct the spatial distribution of preferred orientation axes is suggested. An imaging system for XRDCT, consisting of a medical image intensifier tube and a CCD readout system, is presented, which includes a modified beam stop for recording the intensity of the transmitted beam. Depending on the application, this imaging system can work in first-generation or second-generation tomography mode. Furthermore, a new approach for the reconstruction of the differential coherent cross-section is proposed, which includes an absorption correction based on weighted sinograms. The reconstruction strategy is elucidated by experimental results from a simple phantom. The measured data also validate the simulation program written to study more complex phantoms under different experimental conditions. Finally, possible applications in medical and materials science are discussed, and a design for a mammography setup using x-ray diffraction is presented.

  19. Survey of Commercially Available Computer-Readable Bibliographic Data Bases.

    ERIC Educational Resources Information Center

    Schneider, John H., Ed.; And Others

    This document contains the results of a survey of 94 U. S. organizations, and 36 organizations in other countries that were thought to prepare machine-readable data bases. Of those surveyed, 55 organizations (40 in U. S., 15 in other countries) provided completed camera-ready forms describing 81 commercially available, machine-readable data bases…

  20. Computation of optimized arrays for 3-D electrical imaging surveys

    NASA Astrophysics Data System (ADS)

    Loke, M. H.; Wilkinson, P. B.; Uhlemann, S. S.; Chambers, J. E.; Oxby, L. S.

    2014-12-01

    3-D electrical resistivity surveys and inversion models are required to accurately resolve structures in areas with very complex geology where 2-D models might suffer from artefacts. Many 3-D surveys use a grid where the number of electrodes along one direction (x) is much greater than in the perpendicular direction (y). Frequently, due to limitations in the number of independent electrodes in the multi-electrode system, the surveys use a roll-along system with a small number of parallel survey lines aligned along the x-direction. The `Compare R' array optimization method previously used for 2-D surveys is adapted for such 3-D surveys. Offset versions of the inline arrays used in 2-D surveys are included in the number of possible arrays (the comprehensive data set) to improve the sensitivity to structures in between the lines. The array geometric factor and its relative error are used to filter out potentially unstable arrays in the construction of the comprehensive data set. Comparisons of the conventional (consisting of dipole-dipole and Wenner-Schlumberger arrays) and optimized arrays are made using a synthetic model and experimental measurements in a tank. The tests show that structures located between the lines are better resolved with the optimized arrays. The optimized arrays also have significantly better depth resolution compared to the conventional arrays.
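
    The geometric factor mentioned above has a standard closed form for electrodes on the surface of a half-space: K = 2*pi / (1/AM - 1/AN - 1/BM + 1/BN). The sketch below computes K for all four-electrode permutations on a small two-line grid and filters out geometrically unstable arrays; the simple |K| threshold is a stand-in for the relative-error criterion described in the paper, and the grid dimensions are made up.

    ```python
    # Build a comprehensive set of four-electrode arrays on a two-line surface
    # grid, dropping arrays whose geometric factor K is too large (tiny measured
    # voltages make such arrays unstable).
    import itertools, math

    def geometric_factor(a, b, m, n):
        """K for current electrodes a, b and potential electrodes m, n."""
        r = lambda p, q: math.dist(p, q)
        denom = 1 / r(a, m) - 1 / r(a, n) - 1 / r(b, m) + 1 / r(b, n)
        return math.inf if abs(denom) < 1e-12 else 2 * math.pi / denom

    # Two parallel survey lines of 8 electrodes (1 m spacing, lines 2 m apart).
    electrodes = [(x, y, 0.0) for y in (0.0, 2.0) for x in range(8)]

    K_MAX = 2000.0     # illustrative stability threshold
    candidates = []
    for a, b, m, n in itertools.permutations(range(len(electrodes)), 4):
        k = geometric_factor(*(electrodes[i] for i in (a, b, m, n)))
        if abs(k) < K_MAX:
            candidates.append((a, b, m, n, k))
    print(len(candidates), "usable arrays kept in the comprehensive set")
    ```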

  1. An Overview of Two Recent Surveys of Administrative Computer Operations in Higher Education.

    ERIC Educational Resources Information Center

    Mann, Richard L.; And Others

    This document summarizes the results of two surveys about the current administrative uses of computers in higher education. Included in the document is: (1) a brief history of the development of computer operational and management information systems in higher education; (2) information on how computers are currently being used to support…

  2. Is a Web Survey as Effective as a Mail Survey? A Field Experiment Among Computer Users

    ERIC Educational Resources Information Center

    Kiernan, Nancy; Kiernan, Michaela; Oyler, Mary; Gilles, Carolyn

    2005-01-01

    With the exponential increase in Web access, program evaluators need to understand the methodological benefits and barriers of using the Web to collect survey data from program participants. In this experimental study, the authors examined whether a Web survey can be as effective as the more established mail survey on three measures of survey…

  3. A comparison of computational methods and algorithms for the complex gamma function

    NASA Technical Reports Server (NTRS)

    Ng, E. W.

    1974-01-01

    A survey and comparison of some computational methods and algorithms for gamma and log-gamma functions of complex arguments are presented. The methods and algorithms reported include Chebyshev approximations, Padé expansion, and Stirling's asymptotic series. The comparison leads to the conclusion that Algorithm 421, published in the Communications of the ACM by H. Kuki, is the best program either for individual application or for inclusion in subroutine libraries.
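
    Of the methods compared, Stirling's asymptotic series is compact enough to sketch. The Python function below combines the series with the recurrence ln Gamma(z) = ln Gamma(z+1) - ln z and checks the result against math.lgamma; it illustrates the method only and is not Kuki's Algorithm 421.

    ```python
    # Complex log-gamma via Stirling's asymptotic series, valid for Re(z) > 0.
    import cmath, math

    # Coefficients B_{2n} / (2n (2n - 1)) of the Stirling series.
    _STIRLING = (1 / 12, -1 / 360, 1 / 1260, -1 / 1680)

    def log_gamma(z):
        shift = 0.0
        while abs(z) < 10.0:         # raise |z| using lnG(z) = lnG(z+1) - ln z
            shift -= cmath.log(z)
            z += 1
        series = sum(c / z ** (2 * n + 1) for n, c in enumerate(_STIRLING))
        return ((z - 0.5) * cmath.log(z) - z
                + 0.5 * math.log(2 * math.pi) + series + shift)

    print(log_gamma(5 + 0j).real, math.lgamma(5))   # both ~3.1780538
    ```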

  4. CNTF-ACM promotes mitochondrial respiration and oxidative stress in cortical neurons through upregulating L-type calcium channel activity.

    PubMed

    Sun, Meiqun; Liu, Hongli; Xu, Huanbai; Wang, Hongtao; Wang, Xiaojing

    2016-09-01

    A specialized culture medium termed ciliary neurotrophic factor-treated astrocyte-conditioned medium (CNTF-ACM) allows investigators to assess the peripheral effects of CNTF-induced activated astrocytes upon cultured neurons. CNTF-ACM has been shown to upregulate neuronal L-type calcium channel current activity, which has been previously linked to changes in mitochondrial respiration and oxidative stress. Therefore, the aim of this study was to evaluate CNTF-ACM's effects upon mitochondrial respiration and oxidative stress in rat cortical neurons. Cortical neurons, CNTF-ACM, and untreated control astrocyte-conditioned medium (UC-ACM) were prepared from neonatal Sprague-Dawley rat cortical tissue. Neurons were cultured in either CNTF-ACM or UC-ACM for a 48-h period. Changes in the following parameters before and after treatment with the L-type calcium channel blocker isradipine were assessed: (i) intracellular calcium levels, (ii) mitochondrial membrane potential (ΔΨm), (iii) oxygen consumption rate (OCR) and adenosine triphosphate (ATP) formation, (iv) intracellular nitric oxide (NO) levels, (v) mitochondrial reactive oxygen species (ROS) production, and (vi) susceptibility to the mitochondrial complex I toxin rotenone. CNTF-ACM neurons displayed the following significant changes relative to UC-ACM neurons: (i) increased intracellular calcium levels (p < 0.05), (ii) elevation in ΔΨm (p < 0.05), (iii) increased OCR and ATP formation (p < 0.05), (iv) increased intracellular NO levels (p < 0.05), (v) increased mitochondrial ROS production (p < 0.05), and (vi) increased susceptibility to rotenone (p < 0.05). Treatment with isradipine was able to partially rescue these negative effects of CNTF-ACM (p < 0.05). CNTF-ACM promotes mitochondrial respiration and oxidative stress in cortical neurons through elevating L-type calcium channel activity. PMID:27514537

  5. ACME algorithms for contact in a multiphysics environment API version 2.2.

    SciTech Connect

    Heinstein, Martin Wilhelm; Glass, Micheal W.; Gullerud, Arne S.; Brown, Kevin H.; Voth, Thomas Eugene; Jones, Reese E.

    2004-07-01

    An effort is underway at Sandia National Laboratories to develop a library of algorithms to search for potential interactions between surfaces represented by analytic and discretized topological entities. This effort is also developing algorithms to determine forces due to these interactions for transient dynamics applications. This document describes the Application Programming Interface (API) for the ACME (Algorithms for Contact in a Multiphysics Environment) library.

  6. SUPERFUND TREATABILITY CLEARINGHOUSE: FINAL REPORT, PHASE I - IMMEDIATE ASSESSMENT, ACME SOLVENTS SITE

    EPA Science Inventory

    This is a site assessment and feasibility study of incineration alternatives at the ACME Solvents Site at Rockford, Illinois. The document contains laboratory results that are reported to simulate incineration conditions but no details on test methods were provided. The d...

  7. An audience-channel-message-evaluation (ACME) framework for health communication campaigns.

    PubMed

    Noar, Seth M

    2012-07-01

    Recent reviews of the literature have indicated that a number of health communication campaigns continue to fail to adhere to principles of effective campaign design. The lack of an integrated, organizing framework for the design, implementation, and evaluation of health communication campaigns may contribute to this state of affairs. The current article introduces an audience-channel-message-evaluation (ACME) framework that organizes the major principles of health campaign design, implementation, and evaluation. ACME also explicates the relationships and linkages between the varying principles. Insights from ACME include the following: The choice of audience segment(s) to focus on in a campaign affects all other campaign design choices, including message strategy and channel/component options. Although channel selection influences options for message design, choice of message design also influences channel options. Evaluation should not be thought of as a separate activity, but rather should be infused and integrated throughout the campaign design and implementation process, including formative, process, and outcome evaluation activities. Overall, health communication campaigns that adhere to this integrated set of principles of effective campaign design will have a greater chance of success than those using principles idiosyncratically. These design, implementation, and evaluation principles are embodied in the ACME framework. PMID:21441207

  8. Survey of Unsteady Computational Aerodynamics for Horizontal Axis Wind Turbines

    NASA Astrophysics Data System (ADS)

    Frunzulicǎ, F.; Dumitrescu, H.; Cardoş, V.

    2010-09-01

    We present a short review of aerodynamic computational models for horizontal axis wind turbines (HAWT). Models presented have a various level of complexity to calculate aerodynamic loads on rotor of HAWT, starting with the simplest blade element momentum (BEM) and ending with the complex model of Navier-Stokes equations. Also, we present some computational aspects of these models.

  9. Campus Computing, 2000: The 11th National Survey of Computing and Information Technology in American Higher Education.

    ERIC Educational Resources Information Center

    Green, Kenneth C.

    The 2000 Campus Computing Survey, the 11th such survey, was sent to the chief academic officer at 1,176 two-year and four-year colleges and universities across the United States. By October 2000, 506 responses had been received, a response rate of 43%. New data reveal that the growing demand for technology talent across all sectors of the U.S.…

  10. Survey of ANL organization plans for word processors, personal computers, workstations, and associated software. Revision 3

    SciTech Connect

    Fenske, K.R.

    1991-11-01

    The Computing and Telecommunications Division (CTD) has compiled this Survey of ANL Organization Plans for Word Processors, Personal Computers, Workstations, and Associated Software to provide DOE and Argonne with a record of recent growth in the acquisition and use of personal computers, microcomputers, and word processors at ANL. Laboratory planners, service providers, and people involved in office automation may find the Survey useful. It is for internal use only, and any unauthorized use is prohibited. Readers of the Survey should use it as a reference that documents the plans of each organization for office automation, identifies appropriate planners and other contact people in those organizations, and encourages the sharing of this information among those people making plans for organizations and decisions about office automation. The Survey supplements information in both the ANL Statement of Site Strategy for Computing Workstations and the ANL Site Response for the DOE Information Technology Resources Long-Range Plan.

  11. Survey of ANL organization plans for word processors, personal computers, workstations, and associated software

    SciTech Connect

    Fenske, K.R.

    1991-11-01

    The Computing and Telecommunications Division (CTD) has compiled this Survey of ANL Organization Plans for Word Processors, Personal Computers, Workstations, and Associated Software to provide DOE and Argonne with a record of recent growth in the acquisition and use of personal computers, microcomputers, and word processors at ANL. Laboratory planners, service providers, and people involved in office automation may find the Survey useful. It is for internal use only, and any unauthorized use is prohibited. Readers of the Survey should use it as a reference that documents the plans of each organization for office automation, identifies appropriate planners and other contact people in those organizations, and encourages the sharing of this information among those people making plans for organizations and decisions about office automation. The Survey supplements information in both the ANL Statement of Site Strategy for Computing Workstations and the ANL Site Response for the DOE Information Technology Resources Long-Range Plan.

  12. A Synthesis and Survey of Critical Success Factors for Computer Technology Projects

    ERIC Educational Resources Information Center

    Baker, Ross A.

    2012-01-01

    The author investigated the existence of critical success factors for computer technology projects. Current research literature and a survey of experienced project managers indicate that there are 23 critical success factors (CSFs) that correlate with project success. The survey gathered an assessment of project success and the degree to which…

  13. A Survey of Computer Usage in Adult Education Programs in Florida Report.

    ERIC Educational Resources Information Center

    Florida State Dept. of Education, Tallahassee. Div. of Vocational, Adult, and Community Education.

    A study was conducted to identify the types and uses of computer hardware and software in adult and community education programs in Florida. Information was gathered through a survey instrument developed for the study and mailed to 100 adult and community education directors and adult literacy center coordinators (92 surveys were returned). The…

  14. Survey of computer codes applicable to waste facility performance evaluations

    SciTech Connect

    Alsharif, M.; Pung, D.L.; Rivera, A.L.; Dole, L.R.

    1988-01-01

    This study is an effort to review existing information that is useful for developing an integrated model for predicting the performance of a radioactive waste facility. A summary description of 162 computer codes is given. The identified computer programs address the performance of waste packages, waste transport and equilibrium geochemistry, hydrological processes in unsaturated and saturated zones, and general waste facility performance assessment. Some programs also deal with thermal analysis, structural analysis, and special purposes. A number of these computer programs are being used by the US Department of Energy, the US Nuclear Regulatory Commission, and their contractors to analyze various aspects of waste package performance. Fifty-five of these codes were identified as being potentially useful in the analysis of low-level radioactive waste facilities located above the water table. The code summaries include authors, identification data, model types, and pertinent references. 14 refs., 5 tabs.

  15. A Survey of Computer Use by Undergraduate Psychology Departments in Virginia.

    ERIC Educational Resources Information Center

    Stoloff, Michael L.; Couch, James V.

    1987-01-01

    Reports a survey of computer use in psychology departments in Virginia's four-year colleges. Results showed that faculty, students, and clerical staff used word processing, statistical analysis, and database management most frequently. The three most numerous computer brands were the Apple II family, IBM PCs, and the Apple Macintosh. (Author/JDH)

  16. Computer Usage Survey for NUCEA Region IV. Summary and Observations.

    ERIC Educational Resources Information Center

    Jeska, Elizabeth E.; White, Cynthia

    The 57 institutional members of Region IV of the National University Continuing Education Association (NUCEA) were asked to provide information on computerization for teaching and conference use. Forty institutions (70 percent) responded. Sixty percent of the respondents indicated having a computer teaching facility. Of the 16 schools without a…

  17. National Survey of Computer Aided Manufacturing in Industrial Technology Programs.

    ERIC Educational Resources Information Center

    Heidari, Farzin

    The current status of computer-aided manufacturing in the 4-year industrial technology programs in the United States was studied. All industrial technology department chairs were mailed a questionnaire divided into program information, equipment information, and general comments sections. The questionnaire was designed to determine the subjects…

  18. A Survey of Navigational Computer Program in the Museum Setting.

    ERIC Educational Resources Information Center

    Sliman, Paula

    Patron service is a high priority in the library setting and alleviating a large percentage of the directional questions will provide librarians with more time to help patrons more thoroughly than they are able to currently. Furthermore, in view of the current economic trend of downsizing, a navigational computer system program has the potential…

  19. A survey of synchronization methods for parallel computers

    SciTech Connect

    Dinning, A. )

    1989-07-01

    This article examines how traditional synchronization methods influence the design of MIMD multiprocessors. This particular class of architectures is one in which high-level synchronization plays an important role. Although vector processors, dataflow machines, and single instruction, multiple-data (SIMD) computers are highly synchronized, their synchronization is generally an explicit part of the control flow and is executed as part of every instruction. In MIMD multiprocessors, synchronization must occur on demand, so more sophisticated schemes are needed.
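
    The contrast drawn here between per-instruction synchronization and synchronization on demand can be illustrated with a barrier: threads compute independently and synchronize only at phase boundaries. The Python sketch below uses threading.Barrier as a toy stand-in for the hardware schemes the article surveys.

    ```python
    # Workers compute a phase independently, then wait at a barrier on demand.
    import threading

    N_WORKERS = 4
    barrier = threading.Barrier(N_WORKERS)
    results = [0] * N_WORKERS

    def worker(i):
        results[i] = i * i          # phase 1: fully independent computation
        barrier.wait()              # synchronize on demand, not per instruction
        if i == 0:                  # phase 2 may now safely read all of phase 1
            print("sum of squares:", sum(results))

    threads = [threading.Thread(target=worker, args=(i,)) for i in range(N_WORKERS)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    ```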

  20. A Survey of Computational Intelligence Techniques in Protein Function Prediction

    PubMed Central

    Tiwari, Arvind Kumar; Srivastava, Rajeev

    2014-01-01

    With the advancement of high-throughput microarray technologies, there has been massive growth in the number of proteins whose functions are unknown. Protein function prediction is among the most challenging problems in bioinformatics. In the past, homology-based approaches were used to predict protein function, but they fail when a new protein differs from previously characterized ones. Therefore, to alleviate the problems associated with traditional homology-based approaches, numerous computational intelligence techniques have been proposed in the recent past. This paper presents a state-of-the-art comprehensive review of various computational intelligence techniques for protein function prediction using sequence, structure, protein-protein interaction network, and gene expression data, applied in areas such as prediction of DNA and RNA binding sites, subcellular localization, enzyme functions, signal peptides, catalytic residues, nuclear/G-protein coupled receptors, membrane proteins, and pathway analysis from gene expression datasets. This paper also summarizes the results obtained by many researchers who have solved these problems using computational intelligence techniques with appropriate datasets to improve prediction performance. The summary shows that ensemble classifiers and integration of multiple heterogeneous data are useful for protein function prediction. PMID:25574395
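
    As a concrete instance of the ensemble classifiers this summary recommends, the sketch below builds a soft-voting ensemble in scikit-learn; the synthetic features are an assumed stand-in for real sequence, structure, or expression data.

    ```python
    # Soft-voting ensemble: average the class probabilities of three base models.
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier, VotingClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC

    X, y = make_classification(n_samples=500, n_features=30, n_informative=10,
                               random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    ensemble = VotingClassifier(
        estimators=[("rf", RandomForestClassifier(random_state=0)),
                    ("lr", LogisticRegression(max_iter=1000)),
                    ("svm", SVC(probability=True, random_state=0))],
        voting="soft")
    ensemble.fit(X_tr, y_tr)
    print("held-out accuracy:", ensemble.score(X_te, y_te))
    ```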

  1. Children's Experiences of Completing a Computer-Based Violence Survey: Finnish Child Victim Survey Revisited.

    PubMed

    Fagerlund, Monica; Ellonen, Noora

    2016-07-01

    The involvement of children as research subjects requires special considerations with regard to research practices and ethics. This is especially true concerning sensitive research topics such as sexual victimization. Prior research suggests that reflecting these experiences in a survey can cause negative feelings in child participants, although posing only a minimal to moderate risk. Analyzing only predefined, often negative feelings related to answering a sexual victimization survey has dominated the existing literature. In this article children's free-text comments about answering a victimization survey and experiences of sexual victimization are analyzed together to evaluate the effects of research participation in relation to this sensitive issue. Altogether 11,364 children, aged 11-12 and 15-16, participated in the Finnish Child Victim Survey in 2013. Of these, 69% (7,852) reflected on their feelings about answering the survey. Results indicate that both clearly negative and positive feelings are more prevalent among victimized children compared to their nonvictimized peers. Characteristics unique to sexual victimization as well as differences related to gender and age are also discussed. The study contributes to the important yet contradictory field of studying the effects of research participation on children. PMID:27472509

  2. Computational materials science and engineering education: A survey of trends and needs

    NASA Astrophysics Data System (ADS)

    Thornton, K.; Nola, Samanthule; Edwin Garcia, R.; Asta, Mark; Olson, G. B.

    2009-10-01

    Results from a recent reassessment of the state of computational materials science and engineering (CMSE) education are reported. Surveys were distributed to the chairs and heads of materials programs, faculty members engaged in computational research, and employers of materials scientists and engineers, mainly in the United States. The data was compiled to assess current course offerings related to CMSE, the general climate for introducing computational methods in MSE curricula, and the requirements from the employers’ viewpoint. Furthermore, the available educational resources and their utilization by the community are examined. The surveys show a general support for integrating computational content into MSE education. However, they also reflect remaining issues with implementation, as well as a gap between the tools being taught in courses and those that are used by employers. Overall, the results suggest the necessity for a comprehensively developed vision and plans to further the integration of computational methods into MSE curricula.

  3. INFORM: European survey of computers in intensive care units.

    PubMed

    Ambroso, C; Bowes, C; Chambrin, M C; Gilhooly, K; Green, C; Kari, A; Logie, R; Marraro, G; Mereu, M; Rembold, P

    1992-01-01

    The aims of this study were (a) to survey and evaluate the impact of information technology applications in High Dependency Environments (HDEs) on organizational, psychological and cost-effectiveness factors, (b) to contribute information and design requirements to the other workpackages in the INFORM Project, and (c) to develop useful evaluation methodologies. The evaluation methodologies used were: questionnaires, case studies, objective findings (keystroke) and literature search and review. Six questionnaires were devised covering organizational impact, cost-benefit impact and perceived advantages and disadvantages of computerized systems in HDE (psychological impact). The general conclusion was that while existing systems have been generally well received, they are not yet designed in such a developed and integrated way as to yield their full potential. Greater user involvement in design and implementation and more emphasis on training emerged as strong requirements. Lack of reliability leading to parallel charting was a major problem with the existing systems. It proved difficult to assess cost effectiveness due to a lack of detailed accounting costs; however, it appeared that in the short term, computerisation in HDEs tended to increase costs. It is felt that through a better stock control and better decision making, costs may be reduced in the longer run and effectiveness increased; more detailed longitudinal studies appear to be needed on this subject. PMID:1402304

  4. Basic plasma and fusion theory and computer simulations survey

    SciTech Connect

    Kawakami, I.; Nishikawa, K.

    1983-12-01

    The College of Science and Technology at Nihon University and the Institute for Fusion Theory at Hiroshima University discuss the history of the role of theory and simulation in fusion-oriented research. Recent activities include a one-dimensional tokamak transport code at Nagoya University and three-dimensional resistive MHD simulation studies of spheromaks. Other recent activities discussed include the tokamak computer code system TRITON, transport flux in currentless ECH-produced plasma in Heliotron-E, and thermal electron transport in the presence of a steep temperature gradient. The present activities of the Japan-U.S. Joint Institute for Fusion Theory are discussed, including subject areas in three-dimensional simulation studies, nonequilibrium statistical physics, anomalous transport and drift wave turbulence, and hot-electron physics.

  5. Survey of computer vision-based natural disaster warning systems

    NASA Astrophysics Data System (ADS)

    Ko, ByoungChul; Kwak, Sooyeong

    2012-07-01

    With the rapid development of information technology, natural disaster prevention is growing as a new research field dealing with surveillance systems. To forecast and prevent the damage caused by natural disasters, the development of systems to analyze natural disasters using remote sensing, geographic information systems (GIS), and vision sensors has been receiving widespread interest over the last decade. This paper provides an up-to-date review of five different types of natural disasters and their corresponding warning systems using computer vision and pattern recognition techniques, such as wildfire smoke and flame detection, water level detection for flood prevention, coastal zone monitoring, and landslide detection. Finally, we conclude with some thoughts about future research directions.

  6. On the modeling of a single-stage, entrained-flow gasifier using Aspen Custom Modeler (ACM)

    SciTech Connect

    Kasule, J.; Turton, R.; Bhattacharyya, D.; Zitney, S.

    2010-01-01

    Coal-fired gasifiers are the centerpiece of integrated gasification combined cycle (IGCC) power plants. The gasifier produces synthesis gas that is subsequently converted into electricity through combustion in a gas turbine. Several mathematical models have been developed to study the physical and chemical processes taking place inside the gasifier. Such models range from simple one-dimensional (1D) steady-state models to sophisticated dynamic 3D computational fluid dynamics (CFD) models that incorporate turbulence effects in the reactor. The practical operation of the gasifier is dynamic in nature, but most 1D and some higher-dimensional models are often steady state. On the other hand, many higher-order CFD-based models are dynamic in nature but are too computationally expensive to be used directly in operability and controllability dynamic studies. They are also difficult to incorporate in the framework of process simulation software such as Aspen Plus Dynamics. Thus lower-dimensional dynamic models are still useful in these types of studies. In the current study, a 1D dynamic model for a single-stage, downward-firing, entrained-flow GE-type gasifier is developed using Aspen Custom Modeler® (ACM), a commercial equation-based simulator for creating, editing, and re-using models of process units. The gasifier model is based on mass, momentum, and energy balances for the solid and gas phases. The physical and chemical reactions considered in the model are drying, devolatilization/pyrolysis, gasification, combustion, and the homogeneous gas-phase reactions. The dynamic gasifier model is being developed for use in a plant-wide dynamic model of an IGCC power plant. For dynamic simulation, the resulting highly nonlinear system of partial differential algebraic equations (PDAEs) is solved in ACM using the well-known Method of Lines (MoL) approach. The MoL discretizes the space domain and leaves the time domain continuous, thereby converting the PDAEs into a system of differential algebraic equations that can be integrated in time.
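
    The Method of Lines step is easy to illustrate on a far simpler equation than the gasifier PDAEs. The sketch below discretizes a 1-D advection-diffusion equation in space and hands the resulting ODE system to a stiff integrator; it is a generic MoL illustration, not the ACM gasifier model.

    ```python
    # Method of Lines: discretize u_t = -v u_x + D u_xx in space, leave time
    # continuous, and integrate the resulting ODE system with a stiff solver.
    import numpy as np
    from scipy.integrate import solve_ivp

    NX, L, v, D = 100, 1.0, 1.0, 0.01
    dx = L / (NX - 1)
    x = np.linspace(0.0, L, NX)

    def rhs(t, u):
        dudt = np.zeros_like(u)
        # interior points: upwind advection plus central diffusion
        dudt[1:-1] = (-v * (u[1:-1] - u[:-2]) / dx
                      + D * (u[2:] - 2 * u[1:-1] + u[:-2]) / dx ** 2)
        dudt[0] = 0.0                # fixed inlet value
        dudt[-1] = dudt[-2]          # crude zero-gradient outlet
        return dudt

    u0 = np.where(x < 0.1, 1.0, 0.0)          # step profile near the inlet
    sol = solve_ivp(rhs, (0.0, 0.5), u0, method="BDF", t_eval=[0.0, 0.25, 0.5])
    print(sol.y[:, -1].round(2))              # front has advected downstream
    ```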

  7. USL NASA/RECON project presentations at the 1985 ACM Computer Science Conference: Abstracts and visuals

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor); Chum, Frank Y.; Gallagher, Suzy; Granier, Martin; Hall, Philip P.; Moreau, Dennis R.; Triantafyllopoulos, Spiros

    1985-01-01

    This Working Paper Series entry represents the abstracts and visuals associated with presentations delivered by six USL NASA/RECON research team members at the above named conference. The presentations highlight various aspects of NASA contract activities pursued by the participants as they relate to individual research projects. The titles of the six presentations are as follows: (1) The Specification and Design of a Distributed Workstation; (2) An Innovative, Multidisciplinary Educational Program in Interactive Information Storage and Retrieval; (3) Critical Comparative Analysis of the Major Commercial IS and R Systems; (4) Design Criteria for a PC-Based Common User Interface to Remote Information Systems; (5) The Design of an Object-Oriented Graphics Interface; and (6) Knowledge-Based Information Retrieval: Techniques and Applications.

  8. A survey on computer aided diagnosis for ocular diseases

    PubMed Central

    2014-01-01

    Background: Computer Aided Diagnosis (CAD), which can automate the detection process for ocular diseases, has attracted extensive attention from clinicians and researchers alike. It not only alleviates the burden on clinicians by providing objective opinion with valuable insights, but also offers early detection and easy access for patients. Method: We review ocular CAD methodologies for various data types. For each data type, we investigate the databases and the algorithms used to detect different ocular diseases. Their advantages and shortcomings are analyzed and discussed. Result: We have studied three types of data (i.e., clinical, genetic, and imaging) that have been commonly used in existing methods for CAD. The recent developments in methods used in CAD of ocular diseases (such as diabetic retinopathy, glaucoma, age-related macular degeneration, and pathological myopia) are investigated and summarized comprehensively. Conclusion: While CAD for ocular diseases has shown considerable progress over the past years, fully automatic CAD systems that can embed clinical knowledge and integrate heterogeneous data sources remain clinically important and show great potential for future breakthroughs. PMID:25175552

  9. Cloud computing for energy management in smart grid - an application survey

    NASA Astrophysics Data System (ADS)

    Naveen, P.; Kiing Ing, Wong; Kobina Danquah, Michael; Sidhu, Amandeep S.; Abu-Siada, Ahmed

    2016-03-01

    The smart grid is an emerging energy system in which information technology tools and techniques are applied to make the grid run more efficiently. It possesses demand response capacity to help balance electrical consumption with supply. The challenges and opportunities of emerging and future smart grids can be addressed by cloud computing. To address these requirements, we provide an in-depth survey of different cloud computing applications for energy management in the smart grid architecture. In this survey, we present an outline of the current state of research on smart grid development. We also propose a model of cloud-based economic power dispatch for the smart grid.
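
    Independent of the cloud deployment proposed in the paper, the economic power dispatch problem itself has a classic solution by lambda iteration: each unit runs where its incremental cost equals a common lambda, and lambda is adjusted until total output meets demand. The sketch below uses made-up quadratic cost data.

    ```python
    # Economic dispatch by lambda iteration for units with cost a + b P + c P^2:
    # at the optimum, each unit's incremental cost b + 2 c P equals a common
    # lambda, clipped to its limits; bisection on lambda balances supply.
    import numpy as np

    b    = np.array([7.0, 7.8, 8.5])          # $/MWh (made-up data)
    c    = np.array([0.008, 0.009, 0.007])    # $/MW^2 h
    pmin = np.array([100.0, 50.0, 80.0])      # MW
    pmax = np.array([400.0, 300.0, 250.0])    # MW
    demand = 650.0                            # MW

    lo, hi = 0.0, 50.0                        # bracketing lambda values
    for _ in range(60):                       # bisection to high precision
        lam = 0.5 * (lo + hi)
        p = np.clip((lam - b) / (2 * c), pmin, pmax)
        if p.sum() < demand:
            lo = lam                          # need higher incremental cost
        else:
            hi = lam
    print("dispatch (MW):", p.round(1), " total:", round(p.sum(), 1))
    ```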

  10. Biological Control of Pathogens Causing Root Rot Complex in Field Pea Using Clonostachys rosea Strain ACM941.

    PubMed

    Xue, Allen G

    2003-03-01

    ABSTRACT Pea root rot complex (PRRC), caused by Alternaria alternata, Aphanomyces euteiches, Fusarium oxysporum f. sp. pisi, F. solani f. sp. pisi, Mycosphaerella pinodes, Pythium spp., Rhizoctonia solani, and Sclerotinia sclerotiorum, is a major yield-limiting factor for field pea production in Canada. A strain of Clonostachys rosea (syn. Gliocladium roseum), ACM941 (ATCC 74447), was identified as a mycoparasite against these pathogens. When grown near the pathogen, ACM941 often was stimulated to produce lateral branches that grew directly toward the pathogen mycelium, typically entwining around the pathogen mycelium. When applied to the seed, ACM941 propagated in the rhizosphere and colonized the seed coat, hypocotyl, and roots as the plant developed and grew. ACM941 significantly reduced the recovery of all fungal pathogens from infected seed, increased in vitro seed germination by 44% and seedling emergence by 22%, and reduced root rot severity by 76%. The effects were similar to those of thiram fungicide, which increased germination and emergence by 33 and 29%, respectively, and reduced root rot severity by 65%. When soil was inoculated with selected PRRC pathogens in a controlled environment, seed treatment with ACM941 significantly increased emergence by 26, 38, 28, 13, and 21% for F. oxysporum f. sp. pisi, F. solani f. sp. pisi, M. pinodes, R. solani, and S. sclerotiorum, respectively. Under field conditions from 1995 to 1997, ACM941 increased emergence by 17, 23, 22, 13, and 18% and yield by 15, 6, 28, 6, and 19% for the five respective pathogens. The seed treatment effects of ACM941 on these PRRC pathogens were greater or statistically equivalent to those achieved with thiram. Results of this study suggest that ACM941 is an effective bioagent in controlling PRRC and is an alternative to existing chemical products. PMID:18944343

  11. Computers in Education: A Survey of Computer Technology in the Westchester/Putnam Schools.

    ERIC Educational Resources Information Center

    Flank, Sandra G.; Livesey, Lynne

    The Westchester Education Coalition, Inc., a coalition of business, education, and the community, surveyed the state of education in the schools of Westchester and Putnam counties (New York) to establish baseline data in the region and to suggest some future directions. In January 1992, a questionnaire was sent to all of the school districts of…

  12. Survey of ANL organization plans for word processors, personal computers, workstations, and associated software

    SciTech Connect

    Fenske, K.R.; Rockwell, V.S.

    1992-08-01

    The Computing and Telecommunications Division (CTD) has compiled this Survey of ANL Organization plans for Word Processors, Personal Computers, Workstations, and Associated Software (ANL/TM, Revision 4) to provide DOE and Argonne with a record of recent growth in the acquisition and use of personal computers, microcomputers, and word processors at ANL. Laboratory planners, service providers, and people involved in office automation may find the Survey useful. It is for internal use only, and any unauthorized use is prohibited. Readers of the Survey should use it as a reference document that (1) documents the plans of each organization for office automation, (2) identifies appropriate planners and other contact people in those organizations and (3) encourages the sharing of this information among those people making plans for organizations and decisions about office automation. The Survey supplements information in both the ANL Statement of Site Strategy for Computing Workstations (ANL/TM 458) and the ANL Site Response for the DOE Information Technology Resources Long-Range Plan (ANL/TM 466).

  13. Survey of ANL organization plans for word processors, personal computers, workstations, and associated software. Revision 4

    SciTech Connect

    Fenske, K.R.; Rockwell, V.S.

    1992-08-01

    The Computing and Telecommunications Division (CTD) has compiled this Survey of ANL Organization plans for Word Processors, Personal Computers, Workstations, and Associated Software (ANL/TM, Revision 4) to provide DOE and Argonne with a record of recent growth in the acquisition and use of personal computers, microcomputers, and word processors at ANL. Laboratory planners, service providers, and people involved in office automation may find the Survey useful. It is for internal use only, and any unauthorized use is prohibited. Readers of the Survey should use it as a reference document that (1) documents the plans of each organization for office automation, (2) identifies appropriate planners and other contact people in those organizations and (3) encourages the sharing of this information among those people making plans for organizations and decisions about office automation. The Survey supplements information in both the ANL Statement of Site Strategy for Computing Workstations (ANL/TM 458) and the ANL Site Response for the DOE Information Technology Resources Long-Range Plan (ANL/TM 466).

  14. On the Integration of Computer Algebra Systems (CAS) by Canadian Mathematicians: Results of a National Survey

    ERIC Educational Resources Information Center

    Buteau, Chantal; Jarvis, Daniel H.; Lavicza, Zsolt

    2014-01-01

    In this article, we outline the findings of a Canadian survey study (N = 302) that focused on the extent of computer algebra systems (CAS)-based technology use in postsecondary mathematics instruction. Results suggest that a considerable number of Canadian mathematicians use CAS in research and teaching. CAS use in research was found to be the…

  15. Developing a Computer Information Systems Curriculum Based on an Industry Needs Survey.

    ERIC Educational Resources Information Center

    Ghafarian, Ahmad; Sisk, Kathy A.

    This paper details experiences in developing an undergraduate Computer Information Systems (CIS) curriculum at a small liberal arts school. The development of the program was based on the study of needs assessment. Findings were based on the analysis of four sources of data: the results of an industry needs survey, data from a needs assessment…

  16. National Survey of Internet Usage: Teachers, Computer Coordinators, and School Librarians, Grades 3-12.

    ERIC Educational Resources Information Center

    Market Data Retrieval, Inc., Shelton, CT.

    A study was conducted to assess the number and type of schools and educators who use the Internet/World Wide Web. The national survey was conducted in November and December of 1996, and included 6,000 teachers, computer coordinators, and school librarians currently working in grades 3-5, 6-8, and 9-12. At the elementary level, classroom teachers…

  17. Technology survey of computer software as applicable to the MIUS project

    NASA Technical Reports Server (NTRS)

    Fulbright, B. E.

    1975-01-01

    Existing computer software, available from either governmental or private sources, applicable to modular integrated utility system program simulation is surveyed. Several programs and subprograms are described to provide a consolidated reference, and a bibliography is included. The report covers the two broad areas of design simulation and system simulation.

  18. Children's Experiences of Completing a Computer-Based Violence Survey: Ethical Implications

    ERIC Educational Resources Information Center

    Ellonen, Noora; Poso, Tarja

    2011-01-01

    This article aims to contribute to the discussion about the ethics of research on children when studying sensitive issues such as violence. The empirical analysis is based on the accounts given by children (11 377) who completed a computer-based questionnaire about their experiences of violence ("The Finnish Child Victim Survey 2008") and their…

  19. A Survey of Techniques for Modeling and Improving Reliability of Computing Systems

    DOE PAGESBeta

    Mittal, Sparsh; Vetter, Jeffrey S.

    2015-04-24

    Recent trends of aggressive technology scaling have greatly exacerbated the occurrences and impact of faults in computing systems. This has made `reliability' a first-order design constraint. To address the challenges of reliability, several techniques have been proposed. In this study, we provide a survey of architectural techniques for improving resilience of computing systems. We especially focus on techniques proposed for microarchitectural components, such as processor registers, functional units, cache and main memory etc. In addition, we discuss techniques proposed for non-volatile memory, GPUs and 3D-stacked processors. To underscore the similarities and differences of the techniques, we classify them based on their key characteristics. We also review the metrics proposed to quantify vulnerability of processor structures. Finally, we believe that this survey will help researchers, system-architects and processor designers in gaining insights into the techniques for improving reliability of computing systems.

  20. A Survey of Techniques for Modeling and Improving Reliability of Computing Systems

    SciTech Connect

    Mittal, Sparsh; Vetter, Jeffrey S.

    2015-04-24

    Recent trends of aggressive technology scaling have greatly exacerbated the occurrences and impact of faults in computing systems. This has made `reliability' a first-order design constraint. To address the challenges of reliability, several techniques have been proposed. In this study, we provide a survey of architectural techniques for improving resilience of computing systems. We especially focus on techniques proposed for microarchitectural components, such as processor registers, functional units, cache and main memory etc. In addition, we discuss techniques proposed for non-volatile memory, GPUs and 3D-stacked processors. To underscore the similarities and differences of the techniques, we classify them based on their key characteristics. We also review the metrics proposed to quantify vulnerability of processor structures. Finally, we believe that this survey will help researchers, system-architects and processor designers in gaining insights into the techniques for improving reliability of computing systems.
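
    As a concrete example of the vulnerability metrics such surveys review, the Architectural Vulnerability Factor (AVF) of a hardware structure is the probability that a raw fault in that structure produces a user-visible error. The abstract does not name which metrics are covered, so the standard definition below (counting Architecturally Correct Execution, or ACE, bits over N cycles in a structure holding B bits) is offered as background rather than as this paper's content:

        \mathrm{AVF}_{\mathrm{structure}} = \frac{\sum_{n=1}^{N} \big(\text{ACE bits resident in the structure during cycle } n\big)}{B \times N}

    A structure whose contents rarely affect final program output (e.g., a branch predictor) has a low AVF; an architectural register file has a high one.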

  1. Superfund Record of Decision (EPA Region 5): Acme Solvents, Morristown, Illinois, September 1985. Final report

    SciTech Connect

    Not Available

    1985-09-27

    The Acme Solvents Reclaiming, Inc. facility is located approximately five miles south of Rockford, Illinois. From 1960 until 1973, the facility served as a disposal site for paints, oils and still bottoms from the solvent reclamation plant located in Rockford. In addition, empty drums were stored onsite. Wastes were dumped into depressions created from either previous quarrying activities or by scraping overburden from the near-surface bedrock to form berms. In September 1972, the Illinois Pollution Control Board (IPCB) ordered Acme to remove all drums and wastes from the facility and to backfill the lagoons. Follow-up inspections revealed that wastes and crushed drums were being left onsite and merely covered with soil. Sampling of the site revealed high concentrations of chlorinated organics in the drinking water. The major sources of hazardous substances at the facility are the waste disposal mounds. These mounds contain volatile and semi-volatile organic compounds and concentrations of PCBs up to several hundred mg/kg. The selected remedial action is included.

  2. AIHA position statement on the removal of asbestos-containing materials (ACM) from buildings

    SciTech Connect

    Not Available

    1991-06-01

    The health risks associated with asbestos exposure for building occupants have been demonstrated to be very low. The decision to remove asbestos-containing materials (ACM) in undamaged, intact condition that are not readily accessible to occupants should be made only after assessing all other options. Both technical and financial issues should be fully explored by a team of trained specialists, including industrial hygienists, architects, and engineers. The optimal solution will vary from building to building, based on factors unique to each situation. One important consideration is the use of a well-designed air-monitoring program to identify changes in airborne levels of asbestos. Special training and maintenance programs are needed to ensure the safety and health of building and contract workers who may encounter asbestos or who may disturb it during routine or nonroutine activities. Each building owner who has ACM in a building should identify an in-house asbestos manager, and it is also necessary to provide appropriate resources, including professional consultants, to develop and manage a responsible and effective in-place management program throughout the life of a building containing asbestos.

  3. Surveying co-located space-geodetic instruments for ITRF computation

    NASA Astrophysics Data System (ADS)

    Sarti, P.; Sillard, P.; Vittuari, L.

    2004-10-01

    A new and comprehensive method is presented that can be used for estimating eccentricity vectors between global positioning system (GPS) antennas, doppler orbitography and radiopositioning integrated by satellites (DORIS) antennas, azimuth-elevation (AZ-EL) very long baseline interferometry (VLBI) telescopes, and satellite laser ranging (SLR) and lunar laser ranging (LLR) telescopes. The problem of reference point (RP) definition for these space-geodetic instruments is addressed, and the RPs are computed using terrestrial triangulation and electronic distance measurement (EDM) trilateration. The practical ground operations, the surveying approach and the terrestrial data processing are briefly illustrated, and the post-processing procedure is discussed. It is a geometrically based analytical approach that allows computation of RPs along with a rigorous statistical treatment of measurements. The tight connection between the geometrical model and the surveying procedure is emphasized. Particular attention is given to the computation of the eccentricity vector and the associated variance-covariance matrix between an AZ-EL VLBI telescope (with or without intersecting axes) and a GPS choke ring antenna, since these are fundamental for computing the International Terrestrial Reference Frame (ITRF). An extension to RP computation and eccentricity vectors involving the DORIS, SLR and LLR techniques is also presented. Numerical examples of the quality that can be reached using the authors' approach are given. Working data sets were acquired in the years 2001 and 2002 at the radioastronomical observatory of Medicina (Italy), and have been used to estimate two VLBI-GPS eccentricity vectors and the corresponding SINEX files.

  4. Parallel computation of optimized arrays for 2-D electrical imaging surveys

    NASA Astrophysics Data System (ADS)

    Loke, M. H.; Wilkinson, P. B.; Chambers, J. E.

    2010-12-01

    Modern automatic multi-electrode survey instruments have made it possible to use non-traditional arrays to maximize the subsurface resolution from electrical imaging surveys. Previous studies have shown that one of the best methods for generating optimized arrays is to select the set of array configurations that maximizes the model resolution for a homogeneous earth model. The Sherman-Morrison Rank-1 update is used to calculate the change in the model resolution when a new array is added to a selected set of array configurations. This method had the disadvantage that it required several hours of computer time even for short 2-D survey lines. The algorithm was modified to calculate the change in the model resolution rather than the entire resolution matrix. This reduces the computer time and memory required as well as the computational round-off errors. The matrix-vector multiplications for a single add-on array were replaced with matrix-matrix multiplications for 28 add-on arrays to further reduce the computer time. The temporary variables were stored in the double-precision Single Instruction Multiple Data (SIMD) registers within the CPU to minimize computer memory access. A further reduction in the computer time is achieved by using the computer graphics card's Graphics Processing Unit (GPU) as a highly parallel mathematical coprocessor. This makes it possible to carry out the calculations for 512 add-on arrays in parallel using the GPU. The changes reduce the computer time by more than two orders of magnitude. The algorithm used to generate an optimized data set adds a specified number of new array configurations after each iteration to the existing set. The resolution of the optimized data set can be increased by adding a smaller number of new array configurations after each iteration. Although this increases the computer time required to generate an optimized data set with the same number of data points, the new fast numerical routines have made this practical on
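
    The following minimal numpy sketch illustrates the rank-1 idea described in the abstract: appending one candidate array (one sensitivity row g) to the selected set updates the regularized inverse via the Sherman-Morrison identity, so the change in the model-resolution diagonal can be scored without refactorizing. Variable names, the damping term, and the resolution definition are illustrative assumptions, not the authors' routines.

      import numpy as np

      def resolution_gain(A_inv, G, g):
          """Change in diag(R), R = (G^T G + lam*I)^-1 G^T G, when row g is added."""
          u = A_inv @ g                                        # A^-1 g: O(m^2), no refactorization
          A_inv_new = A_inv - np.outer(u, u) / (1.0 + g @ u)   # Sherman-Morrison update
          GtG_new = G.T @ G + np.outer(g, g)
          return np.diag(A_inv_new @ GtG_new) - np.diag(A_inv @ (G.T @ G))

      # toy problem: 5 model cells, 3 arrays already selected, 1 candidate
      rng = np.random.default_rng(0)
      G = rng.normal(size=(3, 5))
      A_inv = np.linalg.inv(G.T @ G + 0.1 * np.eye(5))
      print(resolution_gain(A_inv, G, rng.normal(size=5)))  # per-cell resolution change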

  5. Patient grouping for dose surveys and establishment of diagnostic reference levels in paediatric computed tomography.

    PubMed

    Vassileva, J; Rehani, M

    2015-07-01

    There has been confusion in the literature on whether paediatric patients should be grouped according to age, weight or other parameters when dealing with dose surveys. The present work aims to suggest a pragmatic approach to achieve reasonable accuracy for performing patient dose surveys in countries with limited resources. The analysis is based on a subset of data collected within the IAEA survey of paediatric computed tomography (CT) doses, involving 82 CT facilities from 32 countries in Asia, Europe, Africa and Latin America. Data for 6115 patients were collected, for 34.5 % of whom weight data were available. The present study suggests that using four age groups, <1, >1-5, >5-10 and >10-15 y, is realistic and pragmatic for dose surveys in less-resourced countries and for the establishment of DRLs. To ensure adequate accuracy of results, data for >30 patients in a particular age group should be collected if patient weight is not known. If a smaller sample is used, patient weight should be recorded and the median weight in the sample should be within 5-10 % of the median weight of the sample for which the DRLs were established. Comparison of results from different surveys should always be performed with caution, taking into consideration the way paediatric patients were grouped. Dose results can be corrected for differences in patient weight/age group. PMID:25836695
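
    A minimal sketch of the pragmatic grouping rule suggested above; the field names and tolerance handling are illustrative assumptions, not the IAEA dataset's layout.

      from statistics import median

      def age_group(age_years):
          # the four suggested paediatric bands: <1, >1-5, >5-10, >10-15 y
          if age_years < 1:
              return "<1 y"
          if age_years <= 5:
              return ">1-5 y"
          if age_years <= 10:
              return ">5-10 y"
          return ">10-15 y"

      def sample_usable(n_patients, weights=None, ref_median_weight=None, tol=0.10):
          """>30 patients suffice without weights; a smaller sample needs recorded
          weights whose median lies within ~5-10 % (tol) of the median weight of
          the sample on which the DRLs were established."""
          if n_patients > 30:
              return True
          if not weights or ref_median_weight is None:
              return False
          return abs(median(weights) - ref_median_weight) / ref_median_weight <= tol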

  6. AcmB Is an S-Layer-Associated β-N-Acetylglucosaminidase and Functional Autolysin in Lactobacillus acidophilus NCFM

    PubMed Central

    Johnson, Brant R.

    2016-01-01

    ABSTRACT Autolysins, also known as peptidoglycan hydrolases, are enzymes that hydrolyze specific bonds within bacterial cell wall peptidoglycan during cell division and daughter cell separation. Within the genome of Lactobacillus acidophilus NCFM, there are 11 genes encoding proteins with peptidoglycan hydrolase catalytic domains, 9 of which are predicted to be functional. Notably, 5 of the 9 putative autolysins in L. acidophilus NCFM are S-layer-associated proteins (SLAPs) noncovalently colocalized along with the surface (S)-layer at the cell surface. One of these SLAPs, AcmB, a β-N-acetylglucosaminidase encoded by the gene lba0176 (acmB), was selected for functional analysis. In silico analysis revealed that acmB orthologs are found exclusively in S-layer-forming species of Lactobacillus. Chromosomal deletion of acmB resulted in aberrant cell division, autolysis, and autoaggregation. Complementation of acmB in the ΔacmB mutant restored the wild-type phenotype, confirming the role of this SLAP in cell division. The absence of AcmB within the exoproteome had a pleiotropic effect on the extracellular proteins covalently and noncovalently bound to the peptidoglycan, which likely led to the observed decrease in the binding capacity of the ΔacmB strain for mucin and extracellular matrices fibronectin, laminin, and collagen in vitro. These data suggest a functional association between the S-layer and the multiple autolysins noncovalently colocalized at the cell surface of L. acidophilus NCFM and other S-layer-producing Lactobacillus species. IMPORTANCE Lactobacillus acidophilus is one of the most widely used probiotic microbes incorporated in many dairy foods and dietary supplements. This organism produces a surface (S)-layer, which is a self-assembling crystalline array found as the outermost layer of the cell wall. The S-layer, along with colocalized associated proteins, is an important mediator of probiotic activity through intestinal adhesion and modulation of

  7. Surveying co-located space geodesy techniques for ITRF computation: statistical aspects

    NASA Astrophysics Data System (ADS)

    Sillard, P.; Sarti, P.; Vittuari, L.

    2003-04-01

    For two years, CNR (Italy) has been involved in a complete renovation of the way co-located space geodesy instruments are surveyed. Local ties are one of the most problematic parts of International Terrestrial Reference Frame (ITRF) computation, since the accuracy of space geodesy techniques has now reached the level of a few millimeters; it is therefore widely agreed that local ties are among the most critical aspects of the ITRF computation. CNR has thus started a comprehensive reflection on the way local ties should be surveyed between space geodesy instruments. This reflection concerns the practical ground operations, the physical definition of a space geodesy instrument reference point (especially for VLBI), and the consequent adjustment of the results, as well as their publication. The first two aspects are covered in another presentation; the present one focuses on the last two points (statistics and publication). As space geodesy has reached the millimeter level, local ties must be used in ITRF computation with a full variance-covariance matrix available for each site. The talk will present the way this variance can be derived, even when the reference point is implicitly defined, as for VLBI. Numerical examples will be given of the quality that can be reached through a rigorous statistical treatment of the new approach developed by CNR. Evidence of the significant improvement this brings to ITRF-type computations will also be given.

  8. Opportunities and Needs for Mobile-Computing Technology to Support U.S. Geological Survey Fieldwork

    USGS Publications Warehouse

    Wood, Nathan J.; Halsing, David L.

    2006-01-01

    To assess the opportunities and needs for mobile-computing technology at the U.S. Geological Survey (USGS), we conducted an internal, Internet-based survey of bureau scientists whose research includes fieldwork. In summer 2005, 144 survey participants answered 65 questions about fieldwork activities and conditions, technology to support field research, and postfieldwork data processing and analysis. Results suggest that some types of mobile-computing technology are already commonplace, such as digital cameras and Global Positioning System (GPS) receivers, whereas others are not, such as personal digital assistants (PDAs) and tablet-based personal computers (tablet PCs). The potential for PDA use in the USGS is high: 97 percent of respondents record field observations (primarily environmental conditions and water-quality data), and 87 percent take field samples (primarily water-quality data, water samples, and sediment/soil samples). The potential for tablet PC use in the USGS is also high: 59 percent of respondents map environmental features in the field, primarily by sketching in field notebooks, on aerial photographs, or on topographic-map sheets. Results also suggest that efficient mobile-computing-technology solutions could benefit many USGS scientists because most respondents spend at least 1 week per year in the field, conduct field sessions that are at least 1 week in duration, have field crews of one to three people, and typically travel on foot about 1 mi from their field vehicles. By allowing researchers to enter data directly into digital databases while in the field, mobile-computing technology could also minimize postfieldwork data processing: 93 percent of respondents enter collected field data into their office computers, and more than 50 percent spend at least 1 week per year on postfieldwork data processing. Reducing postfieldwork data processing could free up additional time for researchers and result in cost savings for the bureau. Generally

  9. A survey of parametrized variational principles and applications to computational mechanics

    NASA Technical Reports Server (NTRS)

    Felippa, Carlos A.

    1993-01-01

    This survey paper describes recent developments in the area of parametrized variational principles (PVP's) and selected applications to finite-element computational mechanics. A PVP is a variational principle containing free parameters that have no effect on the Euler-Lagrange equations. The theory of single-field PVP's based on gauge functions (also known as null Lagrangians) is a subset of the inverse problem of variational calculus that has limited value. On the other hand, multifield PVP's are more interesting from theoretical and practical standpoints. Following a tutorial introduction, the paper describes the recent construction of multifield PVP's in several areas of elasticity and electromagnetics. It then discusses three applications to finite-element computational mechanics: the derivation of high-performance finite elements, the development of element-level error indicators, and the constructions of finite element templates. The paper concludes with an overview of open research areas.

  10. Survey results of Internet and computer usage in veterans with epilepsy.

    PubMed

    Pramuka, Michael; Hendrickson, Rick; Van Cott, Anne C

    2010-03-01

    After our study of a self-management intervention for epilepsy, we gathered data on Internet use and computer availability to assess the feasibility of computer-based interventions in a veteran population. Veterans were asked to complete an anonymous questionnaire that gathered information regarding seizures/epilepsy in addition to demographic data, Internet use, computer availability, and interest in distance education regarding epilepsy. Three hundred twenty-four VA neurology clinic patients completed the survey. One hundred twenty-six self-reported a medical diagnosis of epilepsy and constituted the epilepsy/seizure group. For this group of veterans, the need for remote/distance-based interventions was validated given the majority of veterans traveled long distances (>2 hours). Only 51% of the epilepsy/seizure group had access to the Internet, and less than half (42%) expressed an interest in getting information on epilepsy self-management on their computer, suggesting that Web-based interventions may not be an optimal method for a self-management intervention in this population. PMID:20116339

  11. U.S. Geological Survey national computer technology meeting; program and abstracts, New Orleans, Louisiana, April 10-15, 1994

    USGS Publications Warehouse

    Balthrop, B. H., (compiler); Baker, E.G.

    1994-01-01

    This report contains some of the abstracts of papers that were presented at the National Computer Technology Meeting that was held in April 1994. This meeting was sponsored by the Water Resources Division of the U.S. Geological Survey, and was attended by more than 200 technical and managerial personnel representing all the Divisions of the U.S. Geological Survey. Computer-related information from all Divisions of the U.S. Geological Survey is discussed in this compilation of abstracts. Some of the topics addressed are data transfer, data-base management, hydrologic applications, national water information systems, and geographic information systems applications and techniques.

  12. A survey on resource allocation in high performance distributed computing systems

    SciTech Connect

    Hussain, Hameed; Malik, Saif Ur Rehman; Hameed, Abdul; Khan, Samee Ullah; Bickler, Gage; Min-Allah, Nasro; Qureshi, Muhammad Bilal; Zhang, Limin; Yongji, Wang; Ghani, Nasir; Kolodziej, Joanna; Zomaya, Albert Y.; Xu, Cheng-Zhong; Balaji, Pavan; Vishnu, Abhinav; Pinel, Fredric; Pecero, Johnatan E.; Kliazovich, Dzmitry; Bouvry, Pascal; Li, Hongxiang; Wang, Lizhe; Chen, Dan; Rayes, Ammar

    2013-11-01

    An efficient resource allocation is a fundamental requirement in high performance computing (HPC) systems. Many projects dedicated to large-scale distributed computing systems have designed and developed resource allocation mechanisms with a variety of architectures and services. In this study, we report a comprehensive survey describing resource allocation in various HPC systems. The aim of the work is to aggregate under a joint framework the existing solutions for HPC, and to provide a thorough analysis and characterization of the resource management and allocation strategies. Resource allocation mechanisms and strategies play a vital role in improving the performance of all HPC classifications. Therefore, a comprehensive discussion of the widely used resource allocation strategies deployed in HPC environments is required, which is one of the motivations of this survey. Moreover, we have classified the HPC systems into three broad categories, namely: (a) cluster, (b) grid, and (c) cloud systems, and define the characteristics of each class by extracting sets of common attributes. All of the aforementioned systems are cataloged into pure software and hybrid/hardware solutions. The system classification is used to identify the approaches followed by the implementation of existing resource allocation strategies that are widely presented in the literature.

  13. A State-Wide Survey of South Australian Secondary Schools to Determine the Current Emphasis on Ergonomics and Computer Use

    ERIC Educational Resources Information Center

    Sawyer, Janet; Penman, Joy

    2012-01-01

    This study investigated the pattern of teaching of healthy computing skills to high school students in South Australia. A survey approach was used to collect data, specifically to determine the emphasis placed by schools on ergonomics that relate to computer use. Participating schools were recruited through the Department for Education and Child…

  14. Survey on computer aided decision support for diagnosis of celiac disease

    PubMed Central

    Hegenbart, Sebastian; Uhl, Andreas; Vécsei, Andreas

    2015-01-01

    Celiac disease (CD) is a complex autoimmune disorder in genetically predisposed individuals of all age groups, triggered by the ingestion of food containing gluten. A reliable diagnosis is of high interest in view of embarking on a strict gluten-free diet, which is the CD treatment modality of first choice. The gold standard for diagnosis of CD is currently based on a histological confirmation of serology, using biopsies performed during upper endoscopy. Computer-aided decision support is an emerging option in medicine and in endoscopy in particular. Such systems could potentially save costs and manpower while simultaneously increasing the safety of the procedure. Research focused on computer-assisted systems in the context of automated diagnosis of CD started in 2008. Since then, over 40 publications on the topic have appeared. In this context, data from classical flexible endoscopy as well as wireless capsule endoscopy (WCE) and confocal laser endomicroscopy (CLE) have been used. In this survey paper, we try to give a comprehensive overview of the research focused on computer-assisted diagnosis of CD. PMID:25770906

  15. Wireless data collection of self-administered surveys using tablet computers.

    PubMed

    Singleton, Kyle W; Lan, Mars; Arnold, Corey; Vahidi, Mani; Arangua, Lisa; Gelberg, Lillian; Bui, Alex A T

    2011-01-01

    The accurate and expeditious collection of survey data by coordinators in the field is critical in the support of research studies. Early methods that used paper documentation have slowly evolved into electronic capture systems. Indeed, tools such as REDCap and others illustrate this transition. However, many current systems are tailored to web browsers running on desktop/laptop computers, requiring keyboard and mouse input. We present a system that utilizes a touch-screen interface running on a tablet PC, with consideration for portability, limited screen space, wireless connectivity, and potentially inexperienced and low-literacy users. The system was developed using C#, ASP.net, and SQL Server by multiple programmers over the course of a year. The system was developed in coordination with UCLA Family Medicine and is currently deployed for the collection of data in a group of Los Angeles area clinics of community health centers for a study on drug addiction and intervention. PMID:22195187

  16. Survey of engineering computational methods and experimental programs for estimating supersonic missile aerodynamic characteristics

    NASA Technical Reports Server (NTRS)

    Sawyer, W. C.; Allen, J. M.; Hernandez, G.; Dillenius, M. F. E.; Hemsch, M. J.

    1982-01-01

    This paper presents a survey of engineering computational methods and experimental programs used for estimating the aerodynamic characteristics of missile configurations. Emphasis is placed on those methods which are suitable for preliminary design of conventional and advanced concepts. An analysis of the technical approaches of the various methods is made in order to assess their suitability to estimate longitudinal and/or lateral-directional characteristics for different classes of missile configurations. Some comparisons between the predicted characteristics and experimental data are presented. These comparisons are made for a large variation in flow conditions and model attitude parameters. The paper also presents known experimental research programs developed for the specific purpose of validating analytical methods and extending the capability of data-base programs.

  17. Segmentation of solid subregion of high grade gliomas in MRI images based on active contour model (ACM)

    NASA Astrophysics Data System (ADS)

    Seow, P.; Win, M. T.; Wong, J. H. D.; Abdullah, N. A.; Ramli, N.

    2016-03-01

    Gliomas are tumours arising from the interstitial tissue of the brain which are heterogeneous, infiltrative and possess ill-defined borders. Tumour subregions (e.g. solid enhancing part, edema and necrosis) are often used for tumour characterisation. Tumour demarcation into substructures facilitates glioma staging and provides essential information. Manual segmentation has several drawbacks: it is laborious, time consuming, subject to intra- and inter-rater variability, and hindered by the diversity in the appearance of tumour tissues. In this work, an active contour model (ACM) was used to segment the solid enhancing subregion of the tumour. 2D brain image acquisition data using a 3T MRI fast spoiled gradient echo sequence post gadolinium were obtained for four histologically proven high-grade glioma patients. Preprocessing of the images, which included subtraction and skull stripping, was performed, followed by ACM segmentation. The results of the automatic segmentation method were compared against the manual delineation of the tumour by a trainee radiologist. Both results were further validated by an experienced neuroradiologist, and brief quantitative evaluations (pixel area and difference ratio) were performed. Preliminary results on the clinical data showed the potential of the ACM for fast and large-scale tumour segmentation in medical imaging.
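
    A minimal sketch of ACM-style segmentation of the enhancing subregion, here using scikit-image's morphological Chan-Vese active contour as a stand-in for the paper's unspecified ACM implementation; the file names and evaluation details are illustrative assumptions.

      import numpy as np
      from skimage import io, segmentation

      # post-gadolinium slice after subtraction and skull stripping (assumed file)
      img = io.imread("post_gad_subtracted.png", as_gray=True)

      # evolve a region-based active contour for 200 iterations
      mask = segmentation.morphological_chan_vese(img, 200, smoothing=2) > 0

      # compare with a manual delineation using the two measures mentioned:
      # pixel area and a simple difference ratio
      manual = io.imread("manual_mask.png", as_gray=True) > 0
      diff_ratio = np.logical_xor(mask, manual).sum() / max(manual.sum(), 1)
      print(mask.sum(), manual.sum(), diff_ratio)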

  18. Computer analysis of digital sky surveys using citizen science and manual classification

    NASA Astrophysics Data System (ADS)

    Kuminski, Evan; Shamir, Lior

    2015-01-01

    As current and future digital sky surveys such as SDSS, LSST, DES, Pan-STARRS and Gaia create increasingly massive databases containing millions of galaxies, there is a growing need to be able to efficiently analyze these data. An effective way to do this is through manual analysis; however, this may be insufficient considering the extremely vast pipelines of astronomical images generated by the present and future surveys. Some efforts have been made to use citizen science to classify galaxies by their morphology on a larger scale than individual or small groups of scientists can. While citizen science efforts such as Zooniverse have helped obtain reasonably accurate morphological information about large numbers of galaxies, they cannot scale to provide complete analysis of the billions of galaxy images that will be collected by future ventures such as LSST. Since current forms of manual classification cannot scale to the masses of data collected by digital sky surveys, it is clear that in order to keep up with the growing databases some form of automation of the data analysis will be required, working either independently or in combination with human analysis such as citizen science. Here we describe a computer vision method that can automatically analyze galaxy images and deduce galaxy morphology. Experiments using Galaxy Zoo 2 data show that the performance of the method increases as the degree of agreement between the citizen scientists gets higher, providing a cleaner dataset. For several morphological features, such as the spirality of the galaxy, the algorithm agreed with the citizen scientists on around 95% of the samples. However, the method failed to analyze some of the morphological features, such as the number of spiral arms, and provided an accuracy of just ~36%.
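
    A sketch of the evaluation idea above: keep only galaxies on which the citizen scientists agree strongly, then measure how often the automatic classifier matches the majority label. The vote data here are synthetic stand-ins, not Galaxy Zoo 2.

      import numpy as np

      rng = np.random.default_rng(1)
      n = 10_000
      vote_frac = rng.uniform(0.5, 1.0, n)   # majority-vote fraction per galaxy
      human = rng.integers(0, 2, n)          # majority label, e.g. spiral yes/no
      # simulate a classifier that does better on cleaner (high-agreement) samples
      p_correct = 0.55 + 0.4 * (vote_frac - 0.5) / 0.5
      machine = np.where(rng.uniform(size=n) < p_correct, human, 1 - human)

      for thresh in (0.6, 0.8, 0.95):
          sel = vote_frac >= thresh
          acc = (machine[sel] == human[sel]).mean()
          print(f"agreement >= {thresh}: accuracy {acc:.1%} on {sel.sum()} galaxies")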

  19. Sci—Thur PM: Imaging — 06: Canada's National Computed Tomography (CT) Survey

    SciTech Connect

    Wardlaw, GM; Martel, N; Blackler, W; Asselin, J-F

    2014-08-15

    The value of computed tomography (CT) in medical imaging is reflected in its increased use and availability since the early 1990s; however, given CT's relatively larger exposures (vs. planar x-ray), greater care must be taken to ensure that CT procedures are optimised in terms of providing the smallest dose possible while maintaining sufficient diagnostic image quality. The development of CT Diagnostic Reference Levels (DRLs) supports this process. DRLs have been suggested/supported by international/national bodies since the early 1990s and widely adopted elsewhere, but not on a national basis in Canada. Essentially, CT DRLs provide guidance on what is considered good practice for common CT exams, but require a representative sample of CT examination data to make any recommendations. Canada's National CT Survey project, in collaboration with provincial/territorial authorities, has collected a large national sample of CT practice data for 7 common examinations (with associated clinical indications) of both adult and pediatric patients. Following completion of data entry into a common database, a survey summary report will be prepared and recommendations on CT DRLs will be made from these data. It is hoped that these can then be used by local regions to promote CT practice optimisation and support any dose-reduction initiatives.

  20. Development and first application of an Aerosol Collection Module (ACM) for quasi online compound specific aerosol measurements

    NASA Astrophysics Data System (ADS)

    Hohaus, Thorsten; Kiendler-Scharr, Astrid; Trimborn, Dagmar; Jayne, John; Wahner, Andreas; Worsnop, Doug

    2010-05-01

    Atmospheric aerosols influence climate and human health on regional and global scales (IPCC, 2007). In many environments organics are a major fraction of the aerosol, influencing its properties. Due to the huge variety of organic compounds present in atmospheric aerosol, current measurement techniques are far from providing a full speciation of organic aerosol (Hallquist et al., 2009). The development of new techniques for compound-specific measurements with high time resolution is a timely issue in organic aerosol research. Here we present first laboratory characterisations of an aerosol collection module (ACM) which was developed to allow for the sampling and transfer of atmospheric PM1 aerosol. The system consists of an aerodynamic lens system focussing particles into a beam. This beam is directed onto a surface 3.4 mm in diameter which is cooled to -30 °C with liquid nitrogen. After collection the aerosol sample can be evaporated from the surface by heating it to up to 270 °C. The sample is transferred through a 60 cm long line with a carrier gas. In order to test the ACM for linearity and sensitivity we combined it with a GC-MS system. The tests were performed with octadecane aerosol. The octadecane mass as measured with the ACM-GC-MS was compared against the mass as calculated from the SMPS-derived total volume. The data correlate well (R² = 0.99, slope of linear fit 1.1), indicating 100 % collection efficiency. From 150 °C to 270 °C no effect of desorption temperature on transfer efficiency could be observed. The ACM-GC-MS system was proven to be linear over the mass range 2-100 ng and has a detection limit of ~ 2 ng. First experiments applying the ACM-GC-MS system were conducted at the Jülich Aerosol Chamber. Secondary organic aerosol (SOA) was formed from ozonolysis of 600 ppbv of β-pinene. The major oxidation product nopinone was detected in the aerosol and could be shown to decrease from 2 % of the total aerosol to 0.5 % of the aerosol over the 48 hours of
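
    A sketch of the linearity check described above: regressing the ACM-GC-MS octadecane mass against the reference mass derived from SMPS total volume. The numbers are invented for illustration; the paper reports R² = 0.99 and a slope of 1.1 over the 2-100 ng range.

      import numpy as np

      ref_ng = np.array([2.0, 5.0, 10.0, 20.0, 50.0, 100.0])  # SMPS-derived mass
      acm_ng = np.array([2.3, 5.4, 11.2, 21.8, 54.9, 109.7])  # ACM-GC-MS response

      slope, intercept = np.polyfit(ref_ng, acm_ng, 1)
      pred = slope * ref_ng + intercept
      r2 = 1 - ((acm_ng - pred) ** 2).sum() / ((acm_ng - acm_ng.mean()) ** 2).sum()
      print(f"slope {slope:.2f}, R^2 {r2:.3f}")  # near 1.1 and 1.0 for a linear response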

  1. U.S. Geological Survey National Computer Technology Meeting; Program and abstracts, May 7-11, 1990

    USGS Publications Warehouse

    Balthrop, B. H., (compiler); Baker, E.G.

    1990-01-01

    Computer-related information from all Divisions of the U.S. Geological Survey is discussed in this compilation of abstracts. Some of the topics addressed are system administration; distributed information systems and data bases, both current (1990) and proposed; hydrologic applications; national water information systems; and geographic information systems applications and techniques. The report contains some of the abstracts that were presented at the National Computer Technology Meeting that was held in May 1990. The meeting was sponsored by the Water Resources Division and was attended by more than 200 technical and managerial personnel representing all the Divisions of the U.S. Geological Survey. (USGS)

  2. Do Mathematicians Integrate Computer Algebra Systems in University Teaching? Comparing a Literature Review to an International Survey Study

    ERIC Educational Resources Information Center

    Marshall, Neil; Buteau, Chantal; Jarvis, Daniel H.; Lavicza, Zsolt

    2012-01-01

    We present a comparative study of a literature review of 326 selected contributions (Buteau, Marshall, Jarvis & Lavicza, 2010) to an international (US, UK, Hungary) survey of mathematicians (Lavicza, 2008) regarding the use of Computer Algebra Systems (CAS) in post-secondary mathematics education. The comparison results are organized with respect…

  3. Automated School Food Service System. [A Directory Based on a Survey of Computer Applications in School Food Service.

    ERIC Educational Resources Information Center

    Food and Nutrition Service (USDA), Washington, DC.

    This directory consists of a compilation of information from a survey of 101 school food service administrators to ascertain specific information on computer hardware, software, and applications currently used in their school food service operations. It is designed to assist school food service administrators in developing or enhancing systems…

  4. Technology Support: Its Depth, Breadth and Impact in America's Schools. Teaching, Learning, and Computing: 1998 National Survey Report #5.

    ERIC Educational Resources Information Center

    Ronnkvist, Amy M.; Dexter, Sara L.; Anderson, Ronald E.

    This report, the fifth in a series from the spring 1998 national survey, "Teaching, Learning, and Computing," provides a framework for defining the various dimensions of technology support. Research has shown that teachers lack adequate support for the use of information and communication technologies (ICT). In this report, the term "support" is…

  5. Promoting CLT within a Computer Assisted Learning Environment: A Survey of the Communicative English Course of FLTC

    ERIC Educational Resources Information Center

    Haider, Md. Zulfeqar; Chowdhury, Takad Ahmed

    2012-01-01

    This study is based on a survey of the Communicative English Language Certificate (CELC) course run by the Foreign Language Training Center (FLTC), a Project under the Ministry of Education, Bangladesh. FLTC is working to promote the teaching and learning of English through its eleven computer-based and state-of-the-art language laboratories. As…

  6. A Survey of Exemplar Teachers' Perceptions, Use, and Access of Computer-Based Games and Technology for Classroom Instruction

    ERIC Educational Resources Information Center

    Proctor, Michael D.; Marks, Yaela

    2013-01-01

    This research reports and analyzes for archival purposes surveyed perceptions, use, and access by 259 United States based exemplar Primary and Secondary educators of computer-based games and technology for classroom instruction. Participating respondents were considered exemplary as they each won the Milken Educator Award during the 1996-2009…

  7. Range, Doppler and astrometric observables computed from Time Transfer Functions: a survey

    NASA Astrophysics Data System (ADS)

    Hees, A.; Bertone, S.; Le Poncin-Lafitte, C.; Teyssandier, P.

    2015-08-01

    Determining range, Doppler and astrometric observables is of crucial interest for modelling and analyzing space observations. We recall how these observables can be computed when the travel time of a light ray is known as a function of the positions of the emitter and the receiver for a given instant of reception (or emission). For a long time, such a function, called a reception (or emission) time transfer function, has been almost exclusively calculated by integrating the null geodesic equations describing the light rays. However, other methods avoiding such an integration have been considerably developed in the last twelve years. We give a survey of the analytical results obtained with these new methods up to the third order in the gravitational constant G for a mass monopole. We briefly discuss the case of quasi-conjunctions, where higher-order enhanced terms must be taken into account for correctly calculating the effects. We summarize the results obtained at the first order in G when the multipole structure and the motion of an axisymmetric body are taken into account. We present some applications to on-going or future missions like Gaia and Juno. We give a short review of the recent works devoted to the numerical estimates of the time transfer functions and their derivatives.
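
    As standard background (not specific to this paper), the best-known instance of such a function is the reception time transfer function of a static mass monopole M at first order in G, i.e. the Shapiro formula

        \mathcal{T}_r(\boldsymbol{x}_A, t_B, \boldsymbol{x}_B) = \frac{R_{AB}}{c} + \frac{2GM}{c^{3}} \ln\!\left( \frac{r_A + r_B + R_{AB}}{r_A + r_B - R_{AB}} \right),

    with r_A = |x_A|, r_B = |x_B| and R_{AB} = |x_B - x_A|. Range, Doppler and astrometric observables then follow from this function and its partial derivatives with respect to the endpoints; the higher-order terms in G and the multipole corrections surveyed in the paper refine this expression.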

  8. ARECIBO PALFA SURVEY AND EINSTEIN-HOME: BINARY PULSAR DISCOVERY BY VOLUNTEER COMPUTING

    SciTech Connect

    Knispel, B.; Allen, B.; Aulbert, C.; Bock, O.; Fehrmann, H.; Lazarus, P.; Bogdanov, S.; Anderson, D.; Bhat, N. D. R.; Brazier, A.; Chatterjee, S.; Cordes, J. M.; Camilo, F.; Crawford, F.; Deneva, J. S.; Desvignes, G.; Freire, P. C. C.; Hammer, D.; Hessels, J. W. T.; Jenet, F. A.

    2011-05-01

    We report the discovery of the 20.7 ms binary pulsar J1952+2630, made using the distributed computing project Einstein-Home in Pulsar ALFA survey observations with the Arecibo telescope. Follow-up observations with the Arecibo telescope confirm the binary nature of the system. We obtain a circular orbital solution with an orbital period of 9.4 hr, a projected orbital radius of 2.8 lt-s, and a mass function of f = 0.15 M{sub sun} by analysis of spin period measurements. No evidence of orbital eccentricity is apparent; we set a 2{sigma} upper limit e {approx}< 1.7 x 10{sup -3}. The orbital parameters suggest a massive white dwarf companion with a minimum mass of 0.95 M{sub sun}, assuming a pulsar mass of 1.4 M{sub sun}. Most likely, this pulsar belongs to the rare class of intermediate-mass binary pulsars. Future timing observations will aim to determine the parameters of this system further, measure relativistic effects, and elucidate the nature of the companion star.
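
    The quoted companion mass follows from the standard Keplerian mass function; below is a short numerical check using the abstract's numbers (scipy's root finder is used for convenience; constants are SI).

      import numpy as np
      from scipy.optimize import brentq

      G, c, M_sun = 6.674e-11, 2.998e8, 1.989e30   # SI units
      P = 9.4 * 3600.0                             # orbital period (s)
      x = 2.8 * c                                  # projected radius a*sin(i) (m)

      f = 4 * np.pi**2 * x**3 / (G * P**2)         # mass function (kg)
      print(f / M_sun)                             # ~0.15 M_sun, as reported

      # minimum companion mass: edge-on orbit (sin i = 1), pulsar mass 1.4 M_sun
      mp = 1.4 * M_sun
      mc = brentq(lambda m: m**3 / (mp + m)**2 - f, 1e28, 1e32)
      print(mc / M_sun)                            # ~0.95 M_sun, as reported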

  9. Superfund Record of Decision (EPA Region 5): Acme Solvent Reclaiming, Winnebago County, IL. (Second remedial action), December 1990

    SciTech Connect

    Not Available

    1990-12-31

    The 20-acre Acme Solvent Reclaiming site is a former industrial disposal site in Winnebago County, Illinois. Land use in the area is mixed agricultural and residential. From 1960 to 1973, Acme Solvent Reclaiming disposed of paints, oils, and still bottoms onsite from its solvent reclamation plant. Wastes were dumped into depressions created from previous quarrying and landscaping operations, and empty drums also were stored onsite. State investigations in 1981 identified elevated levels of chlorinated organic compounds in ground water. A 1985 Record of Decision (ROD) provided for excavation and onsite incineration of 26,000 cubic yards of contaminated soil and sludge, supplying home carbon treatment units to affected residences, and further study of ground water and bedrock. During illegal removal actions taken by PRPs in 1986, 40,000 tons of soil and sludge were removed from the site. The selected remedial action for the site includes excavating and treating 6,000 tons of soil and sludge from two waste areas, using low-temperature thermal stripping; treating residuals using solidification, if necessary, followed by onsite or offsite disposal; treating the remaining contaminated soil and possibly bedrock using soil/bedrock vapor extraction; consolidating the remaining contaminated soil onsite with any treatment residuals, followed by capping; incinerating offsite 8,000 gallons of liquids and sludge from two remaining tanks, and disposing of the tanks offsite; providing an alternate water supply to residents with contaminated wells; pumping and onsite treatment of VOC-contaminated ground water.

  10. Survey of clinical doses from computed tomography examinations in the Canadian province of Manitoba.

    PubMed

    Elbakri, Idris A; Kirkpatrick, Iain D C

    2013-12-01

    The purpose of this study was to document CT doses for common CT examinations performed throughout the province of Manitoba. Survey forms were sent out to all provincial CT sites. Thirteen out of sixteen (81 %) sites participated. The authors assessed scans of the brain, routine abdomen-pelvis, routine chest, sinuses, lumbar spine, low-dose lung nodule studies, CT pulmonary angiograms, CT KUBs, CT colonographies and combination chest-abdomen-pelvis exams. Sites recorded scanner model, protocol techniques and patient and dose data for 100 consecutive patients who were scanned with any of the aforementioned examinations. Mean effective doses and standard deviations for the province and for individual scanners were computed. The Kruskal-Wallis test was used to compare the variability of effective doses amongst scanners. The t test was used to compare doses and their provincial ranges between newer and older scanners and scanners that used dose saving tools and those that did not. Abdomen-pelvis, chest and brain scans accounted for over 70 % of scans. Their mean effective doses were 18.0 ± 6.7, 13.2 ± 6.4 and 3.0 ± 1.0 mSv, respectively. Variations in doses amongst scanners were statistically significant. Most examinations were performed at 120 kVp, and no lower kVp was used. Dose variations due to scanner age and use of dose saving tools were not statistically significant. Clinical CT doses in Manitoba are broadly similar to but higher than those reported in other Canadian provinces. Results suggest that further dose reduction can be achieved by modifying scanning techniques, such as using lower kVp. Wide variation in doses amongst different scanners suggests that standardisation of scanning protocols can reduce patient dose. New technological advances, such as dose-reduction software algorithms, can be adopted to reduce patient dose. PMID:23803227

  11. Radiation Dose Survey for Common Computed Tomography Exams: 2013 British Columbia Results.

    PubMed

    Thakur, Yogesh; Bjarnason, Thorarin A; Baxter, Patricia; Griffith, Mitch; Eaton, Kirk

    2016-02-01

    In 2013 Health Canada conducted a national survey of computed tomography (CT) radiation usage. We analysed contributions from all 7 public health authorities in the province of British Columbia, covering scanner age, number of slices, and common adult protocols (≥ 19 years: 70 ± 20 kg; head, chest, abdomen/pelvis, and trunk). Patient doses were recorded for common protocols. Diagnostic reference levels (DRLs) were calculated using scanner data with >10 patient doses recorded for each protocol. Data were analysed based on image reconstruction (filtered backprojection vs iterative reconstruction [IR] vs IR available but not in use). The provincial response rate was 92%, with data from 59 of 64 CT scanners used for analysis. The average scanner age was 5.5 years, with 39% of scanners installed between 2008-2013; 78.5% of scanners were multislice (>64 slices), and 44% of scanners had IR available. Overall British Columbia DRLs were: head = 1305, chest = 529, abdomen/pelvis = 819, and trunk = 1225. DRLs were consistent with Health Canada recommendations and other Canadian published values, but above international standards. For sites with IR available, less than 50% used this technology routinely for head, chest and trunk exams. Overall, use of IR reduced radiation usage by 11%-32% compared to filtered backprojection, while sites using IR vs merely having IR available used 30%/43% less radiation for head/chest exams (P < .05). No significant difference was observed for abdomen/pelvis exams (P = .385). With the fast pace of CT technical advancement, DRLs should reflect the technology used, instead of being applied globally to anatomical regions. Federal guidelines should be updated at a higher frequency to reflect new technology. In addition, new technologies must be utilised to optimize image quality vs radiation usage. PMID:26608253
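
    As a hedged illustration of how such DRLs are commonly derived (the ICRP convention is the 75th percentile of the distribution of per-scanner median doses; the survey's exact procedure beyond the ">10 recorded doses" filter is not stated in the abstract):

      import numpy as np

      # {scanner: recorded patient doses for one protocol} -- synthetic values
      doses = {
          "A": [620, 540, 580, 610, 505, 590, 560, 600, 570, 615, 588],
          "B": [480, 510, 470, 495, 520, 505, 465, 500, 515, 490, 478],
          "C": [700, 680],                     # <= 10 records: excluded
      }

      medians = [np.median(v) for v in doses.values() if len(v) > 10]
      print("DRL:", np.percentile(medians, 75))  # 75th percentile of scanner medians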

  12. Survey of Computer-Based Message Systems; COM/PortaCOM Conference System: Design Goals and Principles; Computer Conferencing Is More Than Electronic Mail; Effects of the COM Computer Conference System.

    ERIC Educational Resources Information Center

    Palme, Jacob

    The four papers contained in this document provide: (1) a survey of computer based mail and conference systems; (2) an evaluation of systems for both individually addressed mail and group addressing through conferences and distribution lists; (3) a discussion of various methods of structuring the text data in existing systems; and (4) a…

  13. Successful use of tablet personal computers and wireless technologies for the 2011 Nepal Demographic and Health Survey

    PubMed Central

    Paudel, Deepak; Ahmed, Marie; Pradhan, Anjushree; Lal Dangol, Rajendra

    2013-01-01

    ABSTRACT Computer-Assisted Personal Interviewing (CAPI), coupled with the use of mobile and wireless technology, is growing as a data collection methodology. Nepal, a geographically diverse and resource-scarce country, implemented the 2011 Nepal Demographic and Health Survey, a nationwide survey of major health indicators, using tablet personal computers (tablet PCs) and wireless technology for the first time in the country. This paper synthesizes responses on the benefits and challenges of using new technology in such a challenging environment from the 89 interviewers who administered the survey. Overall, feedback from the interviewers indicate that the use of tablet PCs and wireless technology to administer the survey demonstrated potential to improve data quality and reduce data collection time—benefits that outweigh manageable challenges, such as storage and transport of the tablet PCs during fieldwork, limited options for confidential interview space due to screen readability issues under direct sunlight, and inconsistent electricity supply at times. The introduction of this technology holds great promise for improving data availability and quality, even in a context with limited infrastructure and extremely difficult terrain. PMID:25276539

  14. Pre-Service Teachers' Attitudes towards Computer Use: A Singapore Survey

    ERIC Educational Resources Information Center

    Teo, Timothy

    2008-01-01

    The aim of this study is to examine the attitudes towards use of computers among pre-service teachers. A sample of 139 pre-service teachers was assessed for their computer attitudes using a Likert type questionnaire with four factors: affect (liking), perceived usefulness, perceived control, and behavioural intention to use the computer. The…

  15. Computer Programs for Library Operations; Results of a Survey Conducted Between Fall 1971 and Spring 1972.

    ERIC Educational Resources Information Center

    Liberman, Eva; And Others

    Many library operations involving large data banks lend themselves readily to computer operation. In setting up, changing, or expanding library computer programs, programming costs and time delays could be substantially reduced if the programmers had access to library computer programs being used by other libraries, providing similar…

  16. Survey of the Computer Users of the Upper Arlington Public Library.

    ERIC Educational Resources Information Center

    Tsardoulias, L. Sevim

    The Computer Services Department of the Upper Arlington Public Library in Franklin County, Ohio, provides microcomputers for public use, including IBM compatible and Macintosh computers, a laser printer, and dot-matrix printers. Circulation statistics provide data regarding the frequency and amount of computer use, but these statistics indicate…

  17. Pre-Service ELT Teachers' Attitudes towards Computer Use: A Turkish Survey

    ERIC Educational Resources Information Center

    Sariçoban, Arif

    2013-01-01

    Problem Statement: Computer technology plays a crucial role in foreign/second language (L2) instruction, and as such, L2 teachers display different attitudes towards the use of computers in their teaching activities. It is important to know what attitudes these teachers hold towards the use of computers and whether they have these varying…

  18. Man-Computer Symbiosis Through Interactive Graphics: A Survey and Identification of Critical Research Areas.

    ERIC Educational Resources Information Center

    Knoop, Patricia A.

    The purpose of this report was to determine the research areas that appear most critical to achieving man-computer symbiosis. An operational definition of man-computer symbiosis was developed by: (1) reviewing and summarizing what others have said about it, and (2) attempting to distinguish it from other types of man-computer relationships. From…

  19. Perceived problems with computer gaming and internet use among adolescents: measurement tool for non-clinical survey studies

    PubMed Central

    2014-01-01

    Background Existing instruments for measuring problematic computer and console gaming and internet use are often lengthy and often based on a pathological perspective. The objective was to develop and present a new and short non-clinical measurement tool for perceived problems related to computer use and gaming among adolescents, and to study the association between screen time and perceived problems. Methods Cross-sectional school survey of 11-, 13-, and 15-year-old students in thirteen schools in the City of Aarhus, Denmark; participation rate 89%, n = 2100. The main exposure was time spent on weekdays on computer and console gaming and internet use for communication and surfing. The outcome measures were three indexes on perceived problems related to computer and console gaming and internet use. Results The three new indexes showed high face validity and acceptable internal consistency. Most schoolchildren with high screen time did not experience problems related to computer use. Still, there was a strong and graded association between time use and perceived problems related to computer gaming, console gaming (boys only) and internet use, with odds ratios ranging from 6.90 to 10.23. Conclusion The three new measures of perceived problems related to computer and console gaming and internet use among adolescents are appropriate, reliable and valid for use in non-clinical surveys about young people's everyday life and behaviour. These new measures do not assess Internet Gaming Disorder as it is listed in the DSM and therefore have no parity with DSM criteria. We found an increasing risk of perceived problems with increasing time spent on gaming and internet use. Nevertheless, most schoolchildren who spent much time with gaming and internet use did not experience problems. PMID:24731270
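
    The abstract reports "acceptable internal consistency" for the three indexes; the usual statistic for that is Cronbach's alpha (whether these authors used it is an assumption here). A minimal numpy version:

      import numpy as np

      def cronbach_alpha(items):
          """items: (n_respondents, k_items) array of item scores."""
          items = np.asarray(items, dtype=float)
          k = items.shape[1]
          item_var = items.var(axis=0, ddof=1).sum()   # sum of item variances
          total_var = items.sum(axis=1).var(ddof=1)    # variance of index totals
          return k / (k - 1) * (1 - item_var / total_var)

      # toy index: three items scored 0-3 by five respondents
      print(round(cronbach_alpha([[0, 1, 0], [2, 2, 3], [1, 1, 1],
                                  [3, 2, 3], [0, 0, 1]]), 2))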

  20. Computer ethics: A capstone course

    SciTech Connect

    Fisher, T.G.; Abunawass, A.M.

    1994-12-31

    This paper presents a capstone course on computer ethics required for all computer science majors in our program. The course was designed to encourage students to evaluate their own personal value systems in terms of the established values in computer science as represented by the ACM Code of Ethics. The structure, activities, and topics of the course as well as assessment of the students are presented. Observations on various course components and student evaluations of the course are also presented.

  1. A 100-kV, 100-A/cm2 Electron Optical System for the EB-X3 X-Ray Mask Writer

    NASA Astrophysics Data System (ADS)

    Saito, Kenichi; Kato, Junichi; Matsuda, Tadahito; Nakayama, Yoshinori

    2000-12-01

    In order to increase the throughput of the EB-X3 variably shaped electron beam writing system, a method of increasing the current density with a zoom lens was introduced into the electron optical system. The electron optical characteristics were measured at current densities of 50 and 100 A/cm2 under various zoom-lens conditions, and the results show that this method can increase the current density to 100 A/cm2 without any change in the major electron optical characteristics. At this current density, the patterning resolution was estimated to be 55 nm, and no melting of the first shaping aperture and no microdischarges in the 100-kV electron gun were observed. This confirms that the current density of the EB-X3 can in fact be extended to 100 A/cm2 for the fabrication of X-ray masks with a minimum feature size of 100 nm and below.

  2. Twelve-fold increase in the number of usable ThO molecules for the ACME electron electric dipole measurement through STIRAP

    NASA Astrophysics Data System (ADS)

    Panda, C. D.; O'Leary, B. R.; Lasner, Z.; Petrik, E. S.; West, A. D.; DeMille, D.; Doyle, J. M.; Gabrielse, G.

    2016-05-01

    The ACME Collaboration recently reported an order-of-magnitude improved limit on the electric dipole moment of the electron (eEDM), setting more stringent constraints on many time-reversal (T) violating extensions to the Standard Model. The experiment was performed using spin precession measurements in a molecular beam of thorium oxide. We report here on a new method of preparing the coherent spin superposition state that serves as the initial state of the spin precession measurement, using STImulated Raman Adiabatic Passage (STIRAP). We demonstrate a transfer efficiency of 75 %, giving a twelve-fold increase in signal. We discuss the particularities of implementing STIRAP in the ACME measurement and the methods we have used to overcome various challenges. This work was performed as part of the ACME Collaboration, to whom we are grateful for its contributions, and was supported by the NSF.

  3. CLIC-ACM: generic modular rad-hard data acquisition system based on CERN GBT versatile link

    NASA Astrophysics Data System (ADS)

    Bielawski, B.; Locci, F.; Magnoni, S.

    2015-01-01

    CLIC is a world-wide collaboration to study the next "terascale" lepton collider, relying upon a very innovative concept of two-beam acceleration. This accelerator, currently under study, will be composed of a sequence of 21,000 two-beam modules. Each module requires more than 300 analogue and digital signals which need to be acquired and controlled in a synchronous way. CLIC-ACM (Acquisition and Control Module) is the generic control and acquisition module developed to accommodate the controls of all these signals for the various sub-systems and their related specifications in terms of data bandwidth, triggering and timing synchronization. This paper describes the system architecture with respect to its radiation tolerance, power consumption and scalability.

  4. Computing the Deflection of the Vertical for Improving Aerial Surveys: A Comparison between EGM2008 and ITALGEO05 Estimates.

    PubMed

    Barzaghi, Riccardo; Carrion, Daniela; Pepe, Massimiliano; Prezioso, Giuseppina

    2016-01-01

    Recent studies on the influence of the anomalous gravity field in GNSS/INS applications have shown that neglecting the impact of the deflection of the vertical in aerial surveys induces horizontal and vertical errors in the measurement of an object that is part of the observed scene; these errors can vary from a few tens of centimetres to over one meter. The works reported in the literature refer to vertical deflection values based on global geopotential model estimates. In this paper we compared this approach with the one based on local gravity data and collocation methods. In particular, with ξ and η denoting the two mutually perpendicular components of the deflection of the vertical vector (in the north and east directions, respectively), their values were computed by collocation in the framework of the Remove-Compute-Restore technique, applied to the gravity database used for estimating the ITALGEO05 geoid. Following this approach, these values have been computed at different altitudes that are relevant in aerial surveys. The (ξ, η) values were then also estimated using the high-degree EGM2008 global geopotential model and compared with those obtained in the previous computation. The analysis of the differences between the two estimates has shown that the (ξ, η) global geopotential model estimate can be reliably used in aerial navigation applications that require the use of sensors connected to a GNSS/INS system only above a given height (e.g., 3000 m in this paper) that must be defined by simulations. PMID:27472333
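
    For reference, the two components are conventionally obtained as slopes of the geoid undulation N; this standard physical-geodesy relation is stated here for orientation rather than taken from the paper itself:

        $$ \xi = -\frac{1}{R}\,\frac{\partial N}{\partial \varphi}, \qquad \eta = -\frac{1}{R\cos\varphi}\,\frac{\partial N}{\partial \lambda}, $$

    where R is a mean Earth radius and (φ, λ) are geodetic latitude and longitude.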

  5. Annual evaporite deposition at the acme of the Messinian salinity crisis: evidence for solar-lunar climate forcing

    NASA Astrophysics Data System (ADS)

    Manzi, Vinicio; Gennari, Rocco; Lugli, Stefano; Roveri, Marco; Scafetta, Nicola; Schreiber, B. Charlotte

    2013-04-01

    We studied two evaporite successions (one halite and the other gypsum) consisting of annual varves in order to reconstruct the paleoclimatic and paleoenvironmental conditions existing during the acme of the Messinian salinity crisis (MSC; ≈5.5 Ma), when huge volumes of evaporites accumulated on the floor of the Mediterranean basin. The spectral analyses of these varved evaporitic successions reveal significant peaks in periodicity at around 3-5, 9, 11-13, 20-27 and 50-100 yr. The deposition of varved sedimentary deposits is usually controlled by climate conditions. A comparison with modern precipitation data in the western Mediterranean shows that during the acme of the MSC the climate was not in a permanent evaporitic stage, but in a dynamic state where evaporite deposition was controlled by quasi-periodic climate oscillations similar to modern analogs including Quasi-Biennial Oscillation, El Niño Southern Oscillation, and decadal to secular lunar- and solar-induced cycles. In particular, we found a significant quasi-decadal oscillation with a prominent 9-year peak that is also common in modern temperature records and is present in both the contemporary Atlantic Multidecadal Oscillation (AMO) index and Pacific Decadal Oscillation (PDO) index. These cyclical patterns are common to both ancient and modern climate records because they can be associated with solar and solar-lunar tidal cycles. During the Messinian, the Mediterranean basin, as well as the global ocean, was characterized by somewhat different continent distribution, ocean size, geography, hydrological connections, and ice-sheet volume with respect to the modern configuration. The recognition of modern-style climate oscillations during the Messinian, however, suggests that, although local geographic factors acted as pre-conditioning factors turning the Mediterranean Sea into a giant brine pool, external climate forcing, regulated by solar-lunar cycles and largely independent of those local geographic
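
    As an illustration of the kind of spectral analysis described here, a raw periodogram of an annual varve-thickness series takes a few lines of NumPy; the series itself is hypothetical, and the authors' actual treatment (windowing, significance testing) is more elaborate:

        import numpy as np

        def varve_periodogram(thickness):
            """Raw periodogram of an annual varve series (1 sample = 1 yr)."""
            x = np.asarray(thickness, dtype=float)
            x = x - x.mean()                       # demean before the FFT
            power = np.abs(np.fft.rfft(x)) ** 2    # spectral power
            freq = np.fft.rfftfreq(x.size, d=1.0)  # cycles per year
            return freq[1:], power[1:]             # drop the zero-frequency bin

    Peaks at periods 1/freq near 3-5, 9, 11-13, 20-27 and 50-100 yr would correspond to the quasi-periodic oscillations reported above.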

  6. Extremely high current density over 1000 A/cm² operation in m-plane GaN LEDs on bulk GaN substrates with low efficiency droop

    NASA Astrophysics Data System (ADS)

    Yokogawa, Toshiya; Inoue, Akira

    2014-02-01

    Operation at a high current density of over 1000 A/cm² has been successfully demonstrated in a small-chip m-plane GaN LED. The LED, with a chip size of 450 × 450 μm², emitted 1353 mW of light output power with 39.2% external quantum efficiency (EQE) at 1000 A/cm² (1134 mA). The m-plane GaN LED showed asymmetric radiation characteristics. The radiation patterns are controlled by the surface of the LED package, the height of the LED chip, and the striped texture on the top m-plane surface.
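
    The quoted external quantum efficiency can be cross-checked from the output power and drive current; a minimal sketch, assuming an emission wavelength near 407 nm (the wavelength is our assumption; it is not stated in the abstract):

        # EQE = (photons emitted per second) / (electrons injected per second)
        E_CHARGE = 1.602e-19                        # elementary charge (C)
        e_photon_J = (1239.84 / 407.0) * E_CHARGE   # photon energy at ~407 nm
        p_out_W, i_A = 1.353, 1.134                 # reported power and current

        eqe = (p_out_W / e_photon_J) / (i_A / E_CHARGE)
        print(f"EQE = {eqe:.1%}")                   # ~39.2%, matching the report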

  7. Functional requirements of computer systems for the U.S. Geological Survey, Water Resources Division, 1988-97

    USGS Publications Warehouse

    Hathaway, R.M.; McNellis, J.M.

    1989-01-01

    Investigating the occurrence, quantity, quality, distribution, and movement of the Nation's water resources is the principal mission of the U.S. Geological Survey's Water Resources Division. Reports of these investigations are published and available to the public. To accomplish this mission, the Division requires substantial computer technology to process, store, and analyze data from more than 57,000 hydrologic sites. The Division's computer resources are organized through the Distributed Information System Program Office that manages the nationwide network of computers. The contract that provides the major computer components for the Water Resources Division's Distributed Information System expires in 1991. Five work groups were organized to collect the information needed to procure a new generation of computer systems for the U.S. Geological Survey, Water Resources Division. Each group was assigned a major Division activity and asked to describe its functional requirements of computer systems for the next decade. The work groups and major activities are: (1) hydrologic information; (2) hydrologic applications; (3) geographic information systems; (4) reports and electronic publishing; and (5) administrative. The work groups identified 42 functions and described their functional requirements for 1988, 1992, and 1997. A few new functions, such as Decision Support Systems and Executive Information Systems, were identified, but most are the same as performed today. Although the number of functions will remain about the same, steady growth in the size, complexity, and frequency of many functions is predicted for the next decade. No compensating increase in the Division's staff is anticipated during this period. To handle the increased workload and perform these functions, new approaches will be developed that use advanced computer technology. The advanced technology is required in a unified, tightly coupled system that will support all functions simultaneously.

  8. A survey of students' ethical attitudes using computer-related scenarios

    SciTech Connect

    Hanchey, C.M.; Kingsbury, J.

    1994-12-31

    Many studies exist that examine the ethical beliefs and attitudes of university students attending medium or large institutions. There are also many studies which examine the ethical attitudes and beliefs of computer science and computer information systems majors. None, however, examines the ethical attitudes of university students (regardless of undergraduate major) at a small, Christian, liberal arts institution regarding computer-related situations. This paper will present data accumulated by an ongoing study in which students are presented seven scenarios--all of which involve some aspect of computing technology. These students were randomly selected from a small, Christian, liberal-arts university.

  9. A Survey of High-Quality Computational Libraries and their Impact in Science and Engineering Applications

    SciTech Connect

    Drummond, L.A.; Hernandez, V.; Marques, O.; Roman, J.E.; Vidal, V.

    2004-09-20

    Recently, a number of important scientific and engineering problems have been successfully studied and solved by means of computational modeling and simulation. Many of these computational models and simulations benefited from the use of available software tools and libraries to achieve high performance and portability. In this article, we present a reference matrix of the performance of robust, reliable and widely used tools mapped to scientific and engineering applications that use them. We aim at regularly maintaining and disseminating this matrix to the computational science community. This matrix will contain information on state-of-the-art computational tools, their applications and their use.

  10. Does Computer Survey Technology Improve Reports on Alcohol and Illicit Drug Use in the General Population? A Comparison Between Two Surveys with Different Data Collection Modes In France

    PubMed Central

    Beck, François; Guignard, Romain; Legleye, Stéphane

    2014-01-01

    Background Previous studies have shown that survey methodology can greatly influence prevalence estimates for alcohol and illicit drug use. The aim of this article is to assess the effect of data collection modes on alcohol misuse and drug use reports by comparing national estimates from computer-assisted telephone interviews (CATI) and audio-computer-assisted self interviews (A-CASI). Methods Design: Two national representative surveys conducted in 2005 in France by CATI (n = 24,674) and A-CASI (n = 8,111). Participants: French-speaking individuals aged 18–64 years old. Measurements: Alcohol misuse according to the CAGE test, cannabis use (lifetime, last year, 10+ in last month) and experimentation with cocaine, LSD, heroin, amphetamines, ecstasy, were measured with the same questions and wordings in the two surveys. Multivariate logistic regressions controlling for sociodemographic characteristics (age, educational level, marital status and professional status) were performed. Analyses were conducted on the whole sample and stratified by age (18–29 and 30–44 years old) and gender. 45–64 years old data were not analysed due to limited numbers. Results Overall national estimates were similar for 9 out of the 10 examined measures. However, after adjustment, A-CASI provided higher use for most types of illicit drugs among the youngest men (adjusted odds ratio, or OR, of 1.64 [1.08–2.49] for cocaine, 1.62 [1.10–2.38] for ecstasy, 1.99 [1.17–3.37] for LSD, 2.17 [1.07–4.43] for heroin, and 2.48 [1.41–4.35] for amphetamines), whereas use amongst women was similar in CATI and A-CASI, except for LSD in the 30–44 age group (OR = 3.60 [1.64–7.89]). Reported alcohol misuse was higher with A-CASI, for all ages and genders. Conclusions Although differences in the results over the whole population were relatively small between the surveys, the effect of data collection mode seemed to vary according to age and gender. PMID:24465720
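
    The adjusted odds ratios here come from multivariate logistic regressions; a minimal sketch of that computation with statsmodels (the data file and column names are hypothetical):

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        # One row per respondent; 'cocaine' is 0/1 lifetime use and 'acasi'
        # is 1 for A-CASI, 0 for CATI; the rest is the adjustment set.
        df = pd.read_csv("survey_2005.csv")
        fit = smf.logit(
            "cocaine ~ acasi + age + education + marital + employment",
            data=df,
        ).fit()

        odds_ratios = np.exp(fit.params)   # e.g., OR for A-CASI vs CATI
        ci_95 = np.exp(fit.conf_int())     # 95% CIs on the odds-ratio scale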

  11. A Survey and Evaluation of Simulators Suitable for Teaching Courses in Computer Architecture and Organization

    ERIC Educational Resources Information Center

    Nikolic, B.; Radivojevic, Z.; Djordjevic, J.; Milutinovic, V.

    2009-01-01

    Courses in Computer Architecture and Organization are regularly included in Computer Engineering curricula. These courses are usually organized in such a way that students obtain not only a purely theoretical experience, but also a practical understanding of the topics lectured. This practical work is usually done in a laboratory using simulators…

  12. Computer Game Theories for Designing Motivating Educational Software: A Survey Study

    ERIC Educational Resources Information Center

    Ang, Chee Siang; Rao, G. S. V. Radha Krishna

    2008-01-01

    The purpose of this study is to evaluate computer game theories for educational software. We propose a framework for designing engaging educational games based on contemporary game studies which includes ludology and narratology. Ludology focuses on the study of computer games as play and game activities, while narratology revolves around the…

  13. Survey of Turbulence Models for the Computation of Turbulent Jet Flow and Noise

    NASA Technical Reports Server (NTRS)

    Nallasamy, N.

    1999-01-01

    The report presents an overview of jet noise computation utilizing the computational fluid dynamic solution of the turbulent jet flow field. The jet flow solution obtained with an appropriate turbulence model provides the turbulence characteristics needed for the computation of jet mixing noise. A brief account of turbulence models that are relevant for jet noise computation is presented. The jet flow solutions that have been directly used to calculate jet noise are first reviewed. Then, the turbulent jet flow studies that compute the turbulence characteristics that may be used for noise calculations are summarized. In particular, flow solutions obtained with the k-ε model, algebraic Reynolds stress model, and Reynolds stress transport equation model are reviewed. Since the small-scale jet mixing noise predictions can be improved by utilizing anisotropic turbulence characteristics, turbulence models that can provide the Reynolds stress components must now be considered for jet flow computations. In this regard, algebraic stress models and Reynolds stress transport models are good candidates. Reynolds stress transport models involve more modeling and computational effort and time compared to algebraic stress models. Hence, it is recommended that an algebraic Reynolds stress model (ASM) be implemented in flow solvers to compute the Reynolds stress components.

  14. TOPICAL REVIEW: A survey of signal processing algorithms in brain computer interfaces based on electrical brain signals

    NASA Astrophysics Data System (ADS)

    Bashashati, Ali; Fatourechi, Mehrdad; Ward, Rabab K.; Birch, Gary E.

    2007-06-01

    Brain computer interfaces (BCIs) aim at providing a non-muscular channel for sending commands to the external world using the electroencephalographic activity or other electrophysiological measures of the brain function. An essential factor in the successful operation of BCI systems is the methods used to process the brain signals. In the BCI literature, however, there is no comprehensive review of the signal processing techniques used. This work presents the first such comprehensive survey of all BCI designs using electrical signal recordings published prior to January 2006. Detailed results from this survey are presented and discussed. The following key research questions are addressed: (1) what are the key signal processing components of a BCI, (2) what signal processing algorithms have been used in BCIs and (3) which signal processing techniques have received more attention?

  15. How We Surveyed Doctors to Learn What They Want from Computers and Technology

    ERIC Educational Resources Information Center

    Bardyn, Tania; Young, Caroline; Lombardi, Lin C.

    2008-01-01

    Librarians at New York City's Bellevue Hospital Center needed to write a 3-year strategic plan that included technology data. In this article, they describe how they surveyed doctors and residents about their technology and internet use to determine what the Bellevue Medical Library needed to do in order to support those who deliver medical care.…

  16. Computer Based Instruction in Saudi Education: A Survey of Commercially Produced Software.

    ERIC Educational Resources Information Center

    Al-Saleh, Bader A.; Al-Debassi, Saleh M.

    This study addressed the status quo of instructional software produced by national Saudi Arabian software companies as well as the utilization of commercially produced software at selected 1-12 private schools in Riyadh, Saudi Arabia. Descriptive data from a survey of general managers of four major software producers are reported, as well as from…

  17. Proceeding of the ACM/IEEE-CS Joint Conference on Digital Libraries (1st, Roanoke, Virginia, June 24-28, 2001).

    ERIC Educational Resources Information Center

    Association for Computing Machinery, New York, NY.

    Papers in this Proceedings of the ACM/IEEE-CS Joint Conference on Digital Libraries (Roanoke, Virginia, June 24-28, 2001) discuss: automatic genre analysis; text categorization; automated name authority control; automatic event generation; linked active content; designing e-books for legal research; metadata harvesting; mapping the…

  18. Improving Inpatient Surveys: Web-Based Computer Adaptive Testing Accessed via Mobile Phone QR Codes

    PubMed Central

    2016-01-01

    Background The National Health Service (NHS) 70-item inpatient questionnaire surveys inpatients on their perceptions of their hospitalization experience. However, it imposes more burden on the patient than other similar surveys. The literature shows that computerized adaptive testing (CAT) based on item response theory can help shorten the item length of a questionnaire without compromising its precision. Objective Our aim was to investigate whether CAT can be (1) efficient with item reduction and (2) used with quick response (QR) codes scanned by mobile phones. Methods After downloading the 2008 inpatient survey data from the Picker Institute Europe website and analyzing the difficulties of this 70-item questionnaire, we used an author-made Excel program using the Rasch partial credit model to simulate 1000 patients’ true scores following a standard normal distribution. The CAT was compared to two other scenarios of answering all items (AAI) and the randomized selection method (RSM), as we investigated item length (efficiency) and measurement accuracy. The author-made Web-based CAT program for gathering patient feedback was effectively accessed from mobile phones by scanning the QR code. Results We found that the CAT can be more efficient for patients answering questions (ie, fewer items to respond to) than either AAI or RSM without compromising its measurement accuracy. A Web-based CAT inpatient survey accessed by scanning a QR code on a mobile phone was viable for gathering inpatient satisfaction responses. Conclusions With advances in technology, patients can now be offered alternatives for providing feedback about hospitalization satisfaction. This Web-based CAT is a possible option in health care settings for reducing the number of survey items, as well as offering an innovative QR code access. PMID:26935793
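
    The heart of a CAT loop is picking, at each step, the unadministered item that is most informative at the current ability estimate. A minimal sketch for dichotomous Rasch items (the study uses the partial credit model, which generalizes this; the difficulty values are hypothetical):

        import numpy as np

        def rasch_p(theta, b):
            """Response probability under the dichotomous Rasch model."""
            return 1.0 / (1.0 + np.exp(-(theta - b)))

        def next_item(theta, difficulties, administered):
            """Index of the unused item with maximum information at theta."""
            p = rasch_p(theta, difficulties)
            info = p * (1.0 - p)                # Fisher information, Rasch case
            info[list(administered)] = -np.inf  # exclude items already given
            return int(np.argmax(info))

        bank = np.array([-2.0, -1.0, -0.3, 0.0, 0.4, 1.1, 2.2])  # logits
        print(next_item(theta=0.2, difficulties=bank, administered={3}))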

  19. Using Computers to Survey the Epidemiological, Environmental and Genetic Factors Involved in the Process of Bacteria Resistance Acquisition

    PubMed Central

    Baccala, Luiz Antonio; Nicolelis, Miguel A.L.

    1989-01-01

    The sensitivity behavior over time of several species (S. aureus, E. coli, K. pneumoniae and P. mirabilis, in a total of 16334 positive cultures collected at our hospital from July 1981 to December 1986) to amikacin and gentamicin is shown to be periodic. The implications of this finding, and the parameters, both epidemiological and genetic, that might be of relevance to its understanding, are discussed as necessary characteristics of a nosocomial survey-and-control computer system in which time-series analysis techniques are of central importance.

  20. Dynamic MRI-based computer aided diagnostic systems for early detection of kidney transplant rejection: A survey

    NASA Astrophysics Data System (ADS)

    Mostapha, Mahmoud; Khalifa, Fahmi; Alansary, Amir; Soliman, Ahmed; Gimel'farb, Georgy; El-Baz, Ayman

    2013-10-01

    Early detection of renal transplant rejection is important to implement appropriate medical and immune therapy in patients with transplanted kidneys. In the literature, a large number of computer-aided diagnostic (CAD) systems using different image modalities, such as ultrasound (US), magnetic resonance imaging (MRI), computed tomography (CT), and radionuclide imaging, have been proposed for early detection of kidney diseases. A typical CAD system for kidney diagnosis consists of a set of processing steps including: motion correction, segmentation of the kidney and/or its internal structures (e.g., cortex, medulla), construction of agent kinetic curves, functional parameter estimation, diagnosis, and assessment of the kidney status. In this paper, we survey the current state-of-the-art CAD systems that have been developed for kidney disease diagnosis using dynamic MRI. In addition, the paper addresses several challenges that researchers face in developing efficient, fast and reliable CAD systems for the early detection of kidney diseases.

  1. Photo-Modeling and Cloud Computing. Applications in the Survey of Late Gothic Architectural Elements

    NASA Astrophysics Data System (ADS)

    Casu, P.; Pisu, C.

    2013-02-01

    This work proposes the application of the latest photo-modeling methods to the study of Gothic architecture in Sardinia. The aim is to assess the versatility and ease of use of such documentation tools for studying architecture and its ornamental details. The paper illustrates a procedure of integrated survey and restitution, with the purpose of obtaining an accurate 3D model of some Gothic portals. We combined the contact survey and the photographic survey oriented to photo-modelling. The software used is 123D Catch by Autodesk, a freely available Image Based Modelling (IBM) system. It is a web-based application that requires a few simple steps to produce a mesh from a set of unoriented photos. We tested the application on four portals, working at different scales of detail: first the whole portal and then the different architectural elements that compose it. We were able to model all the elements and to quickly extrapolate simple sections, in order to make a comparison between the moldings, highlighting similarities and differences. Working at different sites and scales of detail allowed us to test the procedure under different conditions of exposure, sunshine, accessibility, surface degradation and type of material, and with different equipment and operators, showing whether the final result could be affected by these factors. We tested a procedure, articulated in a few repeatable steps, that can be applied, with the right corrections and adaptations, to similar cases and/or larger or smaller elements.

  2. A Survey on Applied Research of Soft Computing for Hydrology in China

    NASA Astrophysics Data System (ADS)

    Ai, Ping

    2010-05-01

    Applied research on soft computing for hydrology is a relatively new topic. In view of the indeterminate and non-linear characteristics of hydrological systems, modeling technologies represented by artificial neural network algorithms and artificial intelligence optimization technologies represented by genetic algorithms have become hot issues; moreover, the application of chaos theory to the study of complexity in hydrological systems is a breakthrough in hydrologic research. However, applied research on soft computing in the hydrology field is still at the exploration stage in China. In this paper, we review applied research on artificial neural networks, chaos theory and genetic algorithms in hydrology in China, and briefly discuss the basic principles of these algorithms and the prospects for their application. Keywords: soft computing; hydrology; applied research; review

  3. Survey of Storage and Fault Tolerance Strategies Used in Cloud Computing

    NASA Astrophysics Data System (ADS)

    Ericson, Kathleen; Pallickara, Shrideep

    Cloud computing has gained significant traction in recent years. Companies such as Google, Amazon and Microsoft have been building massive data centers over the past few years. Spanning geographic and administrative domains, these data centers tend to be built out of commodity desktops, with the total number of computers managed by these companies being on the order of millions. Additionally, the use of virtualization allows a physical node to be presented as a set of virtual nodes, resulting in a seemingly inexhaustible set of computational resources. By leveraging economies of scale, these data centers can provision CPU, networking, and storage at substantially reduced prices, which in turn underpins the move by many institutions to host their services in the cloud.

  4. Unmanned aircraft systems image collection and computer vision image processing for surveying and mapping that meets professional needs

    NASA Astrophysics Data System (ADS)

    Peterson, James Preston, II

    Unmanned Aerial Systems (UAS) are rapidly blurring the lines between traditional and close range photogrammetry, and between surveying and photogrammetry. UAS are providing an economic platform for performing aerial surveying on small projects. The focus of this research was to describe traditional photogrammetric imagery and Light Detection and Ranging (LiDAR) geospatial products, describe close range photogrammetry (CRP), introduce UAS and computer vision (CV), and investigate whether industry mapping standards for accuracy can be met using UAS collection and CV processing. A 120-acre site was selected and 97 aerial targets were surveyed for evaluation purposes. Four UAS flights of varying heights above ground level (AGL) were executed, and three different target patterns of varying distances between targets were analyzed for compliance with American Society for Photogrammetry and Remote Sensing (ASPRS) and National Standard for Spatial Data Accuracy (NSSDA) mapping standards. This analysis resulted in twelve datasets. Error patterns were evaluated and reasons for these errors were determined. The relationship between the AGL, ground sample distance, target spacing and the root mean square error of the targets is exploited by this research to develop guidelines that use the ASPRS and NSSDA map standard as the template. These guidelines allow the user to select the desired mapping accuracy and determine what target spacing and AGL is required to produce the desired accuracy. These guidelines also address how UAS/CV phenomena affect map accuracy. General guidelines and recommendations are presented that give the user helpful information for planning a UAS flight using CV technology.
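
    For reference, the NSSDA converts checkpoint RMSEs into 95%-confidence accuracy values with fixed multipliers (standard FGDC formulas, quoted here for orientation; the horizontal factor assumes RMSE_x ≈ RMSE_y):

        $$ \mathrm{RMSE}_r = \sqrt{\mathrm{RMSE}_x^2 + \mathrm{RMSE}_y^2}, \qquad \mathrm{Accuracy}_r = 1.7308\,\mathrm{RMSE}_r, \qquad \mathrm{Accuracy}_z = 1.9600\,\mathrm{RMSE}_z. $$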

  5. The Results of an Independent Study Program Survey of Current and Former Students on the Role of Computer-Assisted Instruction in Correspondence Courses.

    ERIC Educational Resources Information Center

    Hartig, Gordon

    Although computers are used for administrative purposes and for grading in correspondence course programs throughout the United States, there has been little application to date of computer-assisted instruction (CAI) in these programs. A survey was sent to 899 former students in Indiana University's high school independent study program to…

  6. Using Computers in Distance Study: Results of a Survey amongst Disabled Distance Students.

    ERIC Educational Resources Information Center

    Ommerborn, Rainer; Schuemer, Rudolf

    A study at Germany's FernUniversitat sent a questionnaire to 300 enrolled distance education students (mostly adult, mostly part-time) who labeled themselves as severely disabled or chronically ill (about 2 percent of students), asking them about the types of their disabilities and their attitudes toward computer-assisted learning and online…

  7. Effects of Gender on Computer-Mediated Communication: A Survey of University Faculty

    ERIC Educational Resources Information Center

    Valenziano, Laura

    2007-01-01

    The influence of gender on computer-mediated communication is a research area with tremendous growth. This study sought to determine what gender effects exist in email communication between professors and students. The study also explored the amount of lying and misinterpretation that occurs through online communication. The study results indicate…

  8. A Survey of Computer Use in Associate Degree Programs in Engineering Technology.

    ERIC Educational Resources Information Center

    Cunningham, Pearley

    As part of its annual program review process, the Department of Engineering Technology at the Community College of Allegheny County, in Pennsylvania, conducted a study of computer usage in community college engineering technology programs across the nation. Specifically, the study sought to determine the types of software, Internet access, average…

  9. A Survey of Students Participating in a Computer-Assisted Education Programme

    ERIC Educational Resources Information Center

    Yel, Elif Binboga; Korhan, Orhan

    2015-01-01

    This paper mainly examines anthropometric data, data regarding the habits, experiences, and attitudes of the students about their tablet/laptop/desktop computer use, in addition to self-reported musculoskeletal discomfort levels and frequencies of students participating in a tablet-assisted interactive education programme. A two-part questionnaire…

  10. Computational needs survey of NASA automation and robotics missions. Volume 2: Appendixes

    NASA Technical Reports Server (NTRS)

    Davis, Gloria J.

    1991-01-01

    NASA's operational use of advanced processor technology in space systems lags behind its commercial development by more than eight years. One of the factors contributing to this is the fact that mission computing requirements are frequently unknown, unstated, misrepresented, or simply not available in a timely manner. NASA must provide clear common requirements to make better use of available technology, to cut development lead time on deployable architectures, and to increase the utilization of new technology. Here, the NASA, industry and academic communities are provided with a preliminary set of advanced mission computational processing requirements of automation and robotics (A and R) systems. The results were obtained in an assessment of the computational needs of current projects throughout NASA. The high percentage of responses indicated a general need for enhanced computational capabilities beyond the currently available 80386 and 68020 processor technology. Because of the need for faster processors and more memory, 90 percent of the polled automation projects have reduced or will reduce the scope of their implemented capabilities. The requirements are presented with respect to their targeted environment, identifying the applications required, system performance levels necessary to support them, and the degree to which they are met with typical programmatic constraints. Here, appendixes are provided.

  11. A Survey of Knowledge Management Skills Acquisition in an Online Team-Based Distributed Computing Course

    ERIC Educational Resources Information Center

    Thomas, Jennifer D. E.

    2007-01-01

    This paper investigates students' perceptions of their acquisition of knowledge management skills, namely thinking and team-building skills, resulting from the integration of various resources and technologies into an entirely team-based, online upper level distributed computing (DC) information systems (IS) course. Results seem to indicate that…

  12. Integrating Technology into Preservice Literacy Instruction: A Survey of Elementary Education Students' Attitudes toward Computers.

    ERIC Educational Resources Information Center

    Abbott, Judy A.; Faris, Saundra E.

    2000-01-01

    Examined attitudes toward the use of computers by preservice teachers before and after a literacy course that required the use of technology. Results suggest that increases in positive attitudes may have resulted from instructional approaches, meaningful assignments, and supportive faculty. Includes recommendations for instrument use in evaluating…

  13. Using Computers in Distance Study: Results of a Survey amongst Disabled Distance Students.

    ERIC Educational Resources Information Center

    Ommerborn, Rainer; Schuemer, Rudolf

    2002-01-01

    In the euphoria about new technologies in distance education there exists the danger of not sufficiently considering how ever increasing "virtualization" may exclude some student groups. An explorative study was conducted that asked disabled students about their experiences with using computers and the Internet. Overall, those questioned mentioned…

  14. Hydrologic effects of phreatophyte control, Acme-Artesia reach of the Pecos River, New Mexico, 1967-82

    USGS Publications Warehouse

    Welder, G.E.

    1988-01-01

    The U.S. Bureau of Reclamation began a phreatophyte clearing and control program in the bottom land of the Acme-Artesia reach of the Pecos River in March 1967. The initial cutting of 19,000 acres of saltcedar trees, the dominant phreatophyte in the area, was completed in May 1969. Saltcedar regrowth continued each year until July 1975, when root plowing eradicated most of the regrowth. The major objective of the clearing and control program was to salvage water that could be put to beneficial use. Measurements of changes in the water table in the bottom land and changes in the base flow of the Pecos River were made in order to determine the hydrologic effects of the program. Some salvage of water was indicated, but it is not readily recognized as an increase in base flow. The quantity of salvage probably is less than the average annual base-flow gain of 19,110 acre-ft in the reach during 1967-82. (Author's abstract)

  15. Detection of structural and numerical chromosomal abnormalities by ACM-FISH analysis in sperm of oligozoospermic infertility patients

    SciTech Connect

    Schmid, T E; Brinkworth, M H; Hill, F; Sloter, E; Kamischke, A; Marchetti, F; Nieschlag, E; Wyrobek, A J

    2003-11-10

    Modern reproductive technologies are enabling the treatment of infertile men with severe disturbances of spermatogenesis. The possibility of elevated frequencies of genetically and chromosomally defective sperm has become an issue of concern with the increased usage of intracytoplasmic sperm injection (ICSI), which can enable men with severely impaired sperm production to father children. Several papers have been published about aneuploidy in oligozoospermic patients, but relatively little is known about chromosome structural aberrations in the sperm of these patients. We examined sperm from infertile, oligozoospermic individuals for structural and numerical chromosomal abnormalities using a multicolor ACM FISH assay that utilizes DNA probes specific for three regions of chromosome 1 to detect human sperm that carry numerical chromosomal abnormalities plus two categories of structural aberrations: duplications and deletions of 1pter and 1cen, and chromosomal breaks within the 1cen-1q12 region. There was a significant increase in the average frequencies of sperm with duplications and deletions in the infertility patients compared with the healthy concurrent controls. There was also a significantly elevated level of breaks within the 1cen-1q12 region. There was no evidence for an increase in chromosome-1 disomy, or in diploidy. Our data reveal that oligozoospermia is associated with chromosomal structural abnormalities, suggesting that oligozoospermic men carry a higher burden of transmissible chromosome damage. The findings raise the possibility of elevated levels of transmissible chromosomal defects following ICSI treatment.

  16. Assessment of Two Planetary Boundary Layer Schemes (ACM2 and YSU) within the Weather Research and Forecasting (WRF) Model

    NASA Astrophysics Data System (ADS)

    Wolff, J.; Harrold, M.; Xu, M.

    2014-12-01

    The Weather Research and Forecasting (WRF) model is a highly configurable numerical weather prediction system used in both research and operational forecasting applications. Rigorously testing select configurations and evaluating the performance for specific applications is necessary due to the flexibility offered by the system. The Developmental Testbed Center (DTC) performed extensive testing and evaluation with the Advanced Research WRF (ARW) dynamic core for two physics suite configurations with a goal of assessing the impact that the planetary boundary layer (PBL) scheme had on the final forecast performance. The baseline configuration was run with the Air Force Weather Agency's physics suite, which includes the Yonsei University PBL scheme, while the second configuration was substituted with the Asymmetric Convective Model (ACM2) PBL scheme. This presentation will focus on assessing the forecast performance of the two configurations; both configurations were run over the same set of cases, allowing for a direct comparison of performance. The evaluation was performed over a 15 km CONUS domain for a testing period from September 2013 through August 2014. Simulations were initialized every 36 hours and run out to 48 hours; a 6-hour "warm start" spin-up, including data assimilation using the Gridpoint Statistical Interpolation system, preceded each simulation. The extensive testing period allows for robust results as well as the ability to investigate seasonal and regional differences between the two configurations. Results will focus on the evaluation of traditional verification metrics for surface and upper air variables, along with an assessment of statistical and practical significance.
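
    In practice the two configurations differ in a single physics switch in the WRF namelist; a hypothetical fragment, assuming the standard ARW option numbering (1 = YSU, 7 = ACM2) and leaving the rest of the suite unchanged:

        &physics
          bl_pbl_physics = 7,   ! 7 = ACM2; the baseline suite used 1 = YSU
        /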

  17. A survey of computational methods and error rate estimation procedures for peptide and protein identification in shotgun proteomics

    PubMed Central

    Nesvizhskii, Alexey I.

    2010-01-01

    This manuscript provides a comprehensive review of the peptide and protein identification process using tandem mass spectrometry (MS/MS) data generated in shotgun proteomic experiments. The commonly used methods for assigning peptide sequences to MS/MS spectra are critically discussed and compared, from basic strategies to advanced multi-stage approaches. Particular attention is paid to the problem of false-positive identifications. Existing statistical approaches for assessing the significance of peptide to spectrum matches are surveyed, ranging from single-spectrum approaches such as expectation values to global error rate estimation procedures such as false discovery rates and posterior probabilities. The importance of using auxiliary discriminant information (mass accuracy, peptide separation coordinates, digestion properties, etc.) is discussed, and advanced computational approaches for joint modeling of multiple sources of information are presented. This review also includes a detailed analysis of the issues affecting the interpretation of data at the protein level, including the amplification of error rates when going from the peptide to the protein level, and the ambiguities in inferring the identities of sample proteins in the presence of shared peptides. Commonly used methods for computing protein-level confidence scores are discussed in detail. The review concludes with a discussion of several outstanding computational issues. PMID:20816881
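
    Among the error-rate procedures surveyed, the target-decoy estimate of the false discovery rate is the simplest to state; a minimal sketch (the search convention and the example scores are illustrative):

        import numpy as np

        def target_decoy_fdr(target_scores, decoy_scores, threshold):
            """FDR estimate at a score threshold from a target-decoy search."""
            t = np.sum(np.asarray(target_scores) >= threshold)
            d = np.sum(np.asarray(decoy_scores) >= threshold)
            return d / t if t else 0.0  # decoy passes estimate false targets

        # Hypothetical peptide-spectrum match scores:
        fdr = target_decoy_fdr([7.2, 5.1, 3.3, 2.9], [2.8, 1.9], threshold=2.5)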

  18. Creating a New Model Curriculum: A Rationale for "Computing Curricula 1990".

    ERIC Educational Resources Information Center

    Bruce, Kim B.

    1991-01-01

    Describes a model for the design of undergraduate curricula in the discipline of computing that was developed by the ACM/IEEE (Association for Computing Machinery/Institute of Electrical and Electronics Engineers) Computer Society Joint Curriculum Task Force. Institutional settings and structures in which computing degrees are awarded are…

  19. GTE: a new FFT based software to compute terrain correction on airborne gravity surveys in spherical approximation.

    NASA Astrophysics Data System (ADS)

    Capponi, Martina; Sampietro, Daniele; Sansò, Fernando

    2016-04-01

    The computation of the vertical attraction due to the topographic masses (Terrain Correction) is still a matter of study in both geodetic and geophysical applications. In fact, it is required in high-precision geoid estimation by the remove-restore technique, and it is used to isolate the gravitational effect of anomalous masses in geophysical exploration. This topographical effect can be evaluated from the knowledge of a Digital Terrain Model in different ways: e.g., by means of numerical integration, by prisms, tesseroids, polyhedra or Fast Fourier Transform (FFT) techniques. The increasing resolution of recently developed digital terrain models, the increasing number of observation points due to extensive use of airborne gravimetry, and the increasing accuracy of gravity data nowadays represent major issues for the terrain correction computation. Classical methods such as prism or point-mass approximations are indeed too slow, while Fourier-based techniques are usually too approximate for the required accuracy. In this work a new software package, called Gravity Terrain Effects (GTE), developed in order to guarantee high accuracy and fast computation of terrain corrections, is presented. GTE has been designed expressly for geophysical applications, allowing the computation not only of the effect of topographic and bathymetric masses but also of those due to sedimentary layers or to the Earth crust-mantle discontinuity (the so-called Moho). In the present contribution we summarize the basic theory of the software and its practical implementation. Basically, the GTE software is based on a new algorithm which, by exploiting the properties of the Fast Fourier Transform, allows the terrain correction to be computed quickly, in spherical approximation, at ground or airborne level. Some tests to prove its performance are also described, showing GTE's capability to compute highly accurate terrain corrections in a very short time. Results obtained for a real airborne survey with GTE
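
    For orientation, the quantity being computed is the Newtonian attraction of the topographic masses; in planar approximation the vertical component at a point P is the classical integral (a textbook form; sign conventions vary by author, and GTE itself works in spherical approximation):

        $$ \delta g_{\mathrm{TC}}(P) = G\rho \iint \int_{z_P}^{z(x,y)} \frac{z'-z_P}{\big[(x-x_P)^2+(y-y_P)^2+(z'-z_P)^2\big]^{3/2}} \,\mathrm{d}z'\,\mathrm{d}x\,\mathrm{d}y, $$

    whose convolution structure in (x, y) is what FFT-based methods such as GTE exploit.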

  20. Computational analysis in epilepsy neuroimaging: A survey of features and methods

    PubMed Central

    Kini, Lohith G.; Gee, James C.; Litt, Brian

    2016-01-01

    Epilepsy affects 65 million people worldwide, a third of whom have seizures that are resistant to anti-epileptic medications. Some of these patients may be amenable to surgical therapy or treatment with implantable devices, but this usually requires delineation of discrete structural or functional lesion(s), which is challenging in a large percentage of these patients. Advances in neuroimaging and machine learning allow semi-automated detection of malformations of cortical development (MCDs), a common cause of drug resistant epilepsy. A frequently asked question in the field is what techniques currently exist to assist radiologists in identifying these lesions, especially subtle forms of MCDs such as focal cortical dysplasia (FCD) Type I and low grade glial tumors. Below we introduce some of the common lesions encountered in patients with epilepsy and the common imaging findings that radiologists look for in these patients. We then review and discuss the computational techniques introduced over the past 10 years for quantifying and automatically detecting these imaging findings. Due to large variations in the accuracy and implementation of these studies, specific techniques are traditionally used at individual centers, often guided by local expertise, as well as selection bias introduced by the varying prevalence of specific patient populations in different epilepsy centers. We discuss the need for a multi-institutional study that combines features from different imaging modalities as well as computational techniques to definitively assess the utility of specific automated approaches to epilepsy imaging. We conclude that sharing and comparing these different computational techniques through a common data platform provides an opportunity to rigorously test and compare the accuracy of these tools across different patient populations and geographical locations. We propose that these kinds of tools, quantitative imaging analysis methods and open data platforms for

  1. Computational analysis in epilepsy neuroimaging: A survey of features and methods.

    PubMed

    Kini, Lohith G; Gee, James C; Litt, Brian

    2016-01-01

    Epilepsy affects 65 million people worldwide, a third of whom have seizures that are resistant to anti-epileptic medications. Some of these patients may be amenable to surgical therapy or treatment with implantable devices, but this usually requires delineation of discrete structural or functional lesion(s), which is challenging in a large percentage of these patients. Advances in neuroimaging and machine learning allow semi-automated detection of malformations of cortical development (MCDs), a common cause of drug resistant epilepsy. A frequently asked question in the field is what techniques currently exist to assist radiologists in identifying these lesions, especially subtle forms of MCDs such as focal cortical dysplasia (FCD) Type I and low grade glial tumors. Below we introduce some of the common lesions encountered in patients with epilepsy and the common imaging findings that radiologists look for in these patients. We then review and discuss the computational techniques introduced over the past 10 years for quantifying and automatically detecting these imaging findings. Due to large variations in the accuracy and implementation of these studies, specific techniques are traditionally used at individual centers, often guided by local expertise, as well as selection bias introduced by the varying prevalence of specific patient populations in different epilepsy centers. We discuss the need for a multi-institutional study that combines features from different imaging modalities as well as computational techniques to definitively assess the utility of specific automated approaches to epilepsy imaging. We conclude that sharing and comparing these different computational techniques through a common data platform provides an opportunity to rigorously test and compare the accuracy of these tools across different patient populations and geographical locations. We propose that these kinds of tools, quantitative imaging analysis methods and open data platforms for

  2. Biomedical Informatics for Computer-Aided Decision Support Systems: A Survey

    PubMed Central

    Belle, Ashwin; Kon, Mark A.; Najarian, Kayvan

    2013-01-01

    The volumes of current patient data as well as their complexity make clinical decision making more challenging than ever for physicians and other care givers. This situation calls for the use of biomedical informatics methods to process data and form recommendations and/or predictions to assist such decision makers. The design, implementation, and use of biomedical informatics systems in the form of computer-aided decision support have become essential and widely used over the last two decades. This paper provides a brief review of such systems, their application protocols and methodologies, and the future challenges and directions they suggest. PMID:23431259

  3. Prescriptions for ACME's Future.

    ERIC Educational Resources Information Center

    Felch, William Campbell

    1991-01-01

    Five prescriptions for the future agenda of the Alliance for Continuing Medical Education are (1) a core curriculum; (2) informatics; (3) remedial continuing medical education (CME); (4) focus on the individual learner; and (5) practice-oriented CME. (SK)

  4. SAM 2.1—A computer program for plotting and formatting surveying data for estimating peak discharges by the slope-area method

    USGS Publications Warehouse

    Hortness, J.E.

    2004-01-01

    The U.S. Geological Survey (USGS) measures discharge in streams using several methods. However, measurement of peak discharges is often impossible or impractical due to difficult access, the inherent danger of making measurements during flood events, and the timing often associated with flood events. Thus, many peak discharge values are often calculated after the fact by use of indirect methods. The most common indirect method for estimating peak discharges in streams is the slope-area method. This, like other indirect methods, requires measuring the flood profile through detailed surveys. Processing the survey data for efficient entry into computer streamflow models can be time-consuming; SAM 2.1 is a program designed to expedite that process. The SAM 2.1 computer program is designed to be run in the field on a portable computer. The program processes digital surveying data obtained from an electronic surveying instrument during slope-area measurements. After all measurements have been completed, the program generates files to be input into the SAC (Slope-Area Computation program; Fulford, 1994) or HEC-RAS (Hydrologic Engineering Center-River Analysis System; Brunner, 2001) computer streamflow models so that an estimate of the peak discharge can be calculated.
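
    The slope-area computation that SAC and HEC-RAS implement rests on Manning's equation applied between surveyed cross sections (standard form in U.S. customary units, stated here for reference):

        $$ Q = \frac{1.486}{n}\,A\,R^{2/3}\,S^{1/2}, $$

    where n is Manning's roughness coefficient, A the cross-sectional area, R the hydraulic radius, and S the friction slope obtained from the surveyed flood profile.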

  5. A Self-Report Computer-Based Survey of Technology Use by People with Intellectual and Developmental Disabilities

    PubMed Central

    Tanis, Emily Shea; Palmer, Susan B.; Wehmeyer, Michael L.; Davies, Danial; Stock, Steven; Lobb, Kathy; Bishop, Barbara

    2014-01-01

    Advancements of technologies in the areas of mobility, hearing and vision, communication, and daily living for people with intellectual and developmental disabilities (IDD) have the potential to greatly enhance independence and self-determination. Previous research, however, suggests that there is a “technological divide” with regard to the use of such technologies by people with IDD when compared with the general public. The present study sought to provide current information on technology use by people with IDD by examining the technology needs, use, and barriers to such use experienced by 180 adults with IDD through QuestNet, a self-directed computer survey program. The study findings suggest that although there has been progress in technology acquisition and use by people with IDD, technology remains underutilized across the population. PMID:22316226

  6. A Review of Brain-Computer Interface Games and an Opinion Survey from Researchers, Developers and Users

    PubMed Central

    Ahn, Minkyu; Lee, Mijin; Choi, Jinyoung; Jun, Sung Chan

    2014-01-01

    In recent years, research on Brain-Computer Interface (BCI) technology for healthy users has attracted considerable interest, and BCI games are especially popular. This study reviews the current status of, and describes future directions, in the field of BCI games. To this end, we conducted a literature search and found that BCI control paradigms using electroencephalographic signals (motor imagery, P300, steady state visual evoked potential and passive approach reading mental state) have been the primary focus of research. We also conducted a survey of nearly three hundred participants that included researchers, game developers and users around the world. From this survey, we found that all three groups (researchers, developers and users) agreed on the significant influence and applicability of BCI and BCI games, and they all selected prostheses, rehabilitation and games as the most promising BCI applications. User and developer groups tended to give low priority to passive BCI and the whole head sensor array. Developers gave higher priorities to “the easiness of playing” and the “development platform” as important elements for BCI games and the market. Based on our assessment, we discuss the critical point at which BCI games will be able to progress from their current stage to widespread marketing to consumers. In conclusion, we propose three critical elements important for expansion of the BCI game market: standards, gameplay and appropriate integration. PMID:25116904

  7. A review of brain-computer interface games and an opinion survey from researchers, developers and users.

    PubMed

    Ahn, Minkyu; Lee, Mijin; Choi, Jinyoung; Jun, Sung Chan

    2014-01-01

    In recent years, research on Brain-Computer Interface (BCI) technology for healthy users has attracted considerable interest, and BCI games are especially popular. This study reviews the current status of, and describes future directions, in the field of BCI games. To this end, we conducted a literature search and found that BCI control paradigms using electroencephalographic signals (motor imagery, P300, steady state visual evoked potential and passive approach reading mental state) have been the primary focus of research. We also conducted a survey of nearly three hundred participants that included researchers, game developers and users around the world. From this survey, we found that all three groups (researchers, developers and users) agreed on the significant influence and applicability of BCI and BCI games, and they all selected prostheses, rehabilitation and games as the most promising BCI applications. User and developer groups tended to give low priority to passive BCI and the whole head sensor array. Developers gave higher priorities to "the easiness of playing" and the "development platform" as important elements for BCI games and the market. Based on our assessment, we discuss the critical point at which BCI games will be able to progress from their current stage to widespread marketing to consumers. In conclusion, we propose three critical elements important for expansion of the BCI game market: standards, gameplay and appropriate integration. PMID:25116904

  8. WTP Calculation Sheet: Determining the LAW Glass Former Constituents and Amounts for G2 and Acm Models. 24590-LAW-M4C-LFP-00002, Rev. B

    SciTech Connect

    Gimpel, Rodney F.; Kruger, Albert A.

    2013-12-16

    The purpose of this calculation is to determine the LAW glass former recipe and additives with their respective amounts. The methodology and equations contained herein are to be used in the G2 and ACM models until better information is supplied by R&T efforts. This revision includes calculations that determine the mass and volume of the bulk chemicals/minerals needed per batch. In addition, it contains calculations (for the G2 model) to help prevent overflow in the LAW Feed Preparation Vessel.

  9. Computing parameters characterizing possibility of planets falling in star camera field of view during space survey

    NASA Astrophysics Data System (ADS)

    Zhurkin, I. G.; Kuzminykh, V. A.

    1985-03-01

    Selection of the optimum conditions and formulation of a program for a survey of planets from a space vehicle require knowledge of the probability of entry of planets into the star camera field of view (SCFV) and the time of presence of a planet in the SCFV. It is assumed that the optical axis of the star camera has a random orientation at a fixed moment in time and that the point of intersection of the optical axis of the star camera and the celestial sphere at a given moment in time t is uniformly distributed in the region L of possible values of the angular coordinates α, δ (right ascension, declination). By integration of a system of equations of motion for the large planets it is possible to determine the geocentric radius vector corresponding to the moment in time t and to ascertain the probability of at least one planet falling in the SCFV. The P (probability) values are given in five tables. The data make it easy to select the optimum regimes for star camera operation for the registry of at least one planet. A solution of the second problem is presented. It is assumed that: (1) during the considered time interval the planetary motion is Keplerian; (2) the SCFV is a right circular cone whose apex coincides with the center of the Earth's mass; (3) the rotation of the optical axis of the star camera occurs with a period equal to the period of revolution of the satellite carrying the star camera. An expression is derived for the presence of a planet within or at the boundary of the cone at a stipulated time.
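
    Under the stated uniformity assumption, the single-planet case has a simple closed form: the probability that a uniformly oriented axis places a given object inside a circular field of view of half-angle θ is the fractional solid angle (a standard result, noted here for orientation):

        $$ P = \frac{\Omega}{4\pi} = \frac{1-\cos\theta}{2}. $$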

  10. Accurate treatments of electrostatics for computer simulations of biological systems: A brief survey of developments and existing problems

    NASA Astrophysics Data System (ADS)

    Yi, Sha-Sha; Pan, Cong; Hu, Zhong-Han

    2015-12-01

    Modern computer simulations of biological systems often involve an explicit treatment of the complex interactions among a large number of molecules. While it is straightforward to compute the short-ranged Van der Waals interaction in classical molecular dynamics simulations, it has been a long-standing issue to develop accurate methods for the long-ranged Coulomb interaction. In this short review, we discuss three types of methodologies for the accurate treatment of electrostatics in simulations of explicit molecules: truncation-type methods, Ewald-type methods, and mean-field-type methods. Throughout the discussion, we briefly outline the formulations and development of these methods, emphasize the intrinsic connections among the three types of methods, and focus on the existing problems, which are often associated with the boundary conditions of electrostatics. This brief survey concludes with a short perspective on future trends in method development and applications in the field of biological simulations. Project supported by the National Natural Science Foundation of China (Grant Nos. 91127015 and 21522304), the Open Project from the State Key Laboratory of Theoretical Physics, and the Innovation Project from the State Key Laboratory of Supramolecular Structure and Materials.
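
    The Ewald-type methods discussed here all start from the same splitting of the Coulomb kernel into a short-ranged real-space part and a smooth long-ranged part (standard identity):

        $$ \frac{1}{r} = \frac{\operatorname{erfc}(\alpha r)}{r} + \frac{\operatorname{erf}(\alpha r)}{r}, $$

    where the first term is summed directly in real space, the second in reciprocal space, and the parameter α tunes the balance between the two sums.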

  11. A survey of radiation dose to patients and operators during radiofrequency ablation using computed tomography

    PubMed Central

    Saidatul, A; Azlan, CA; Megat Amin, MSA; Abdullah, BJJ; Ng, KH

    2010-01-01

    Computed tomography (CT) fluoroscopy gives real-time images to a physician undertaking minimally invasive procedures such as biopsies, percutaneous drainage, and radiofrequency ablation (RFA). Both the operators executing the procedure and the patients are thus at risk of radiation exposure during CT fluoroscopy. This study focused on the radiation exposure present during a series of RFA procedures, and used Gafchromic film (Type XR-QA; International Specialty Products, USA) and thermoluminescent dosimeters (TLD-100H; Bicron, USA) to measure the radiation received by patients undergoing treatment, as well as by operators subject to scatter radiation. The voltage was held constant at 120 kVp and the current at 70 mA, with a 5-mm slice thickness. The duration of irradiation was between 150 and 638 seconds. Ultimately, from a sample of 30 liver RFA procedures, the study revealed that the operator received the highest dose at the hands, followed by the eyes and thyroid, while the secondary staff dose was moderately uniform across all measured parts of the body. PMID:21611060

  12. A Review of Models for Teacher Preparation Programs for Precollege Computer Science Education.

    ERIC Educational Resources Information Center

    Deek, Fadi P.; Kimmel, Howard

    2002-01-01

    Discusses the need for adequate precollege computer science education and focuses on the issues of teacher preparation programs and requirements needed to teach high school computer science. Presents models of teacher preparation programs and compares state requirements with Association for Computing Machinery (ACM) recommendations. (Author/LRW)

  13. Assessment of Universal Healthcare Coverage in a District of North India: A Rapid Cross-Sectional Survey Using Tablet Computers

    PubMed Central

    Singh, Tarundeep; Roy, Pritam; Jamir, Limalemla; Gupta, Saurav; Kaur, Navpreet; Jain, D. K.; Kumar, Rajesh

    2016-01-01

    Objective A rapid survey was carried out in Shaheed Bhagat Singh Nagar District of Punjab state in India to ascertain health-seeking behavior and out-of-pocket health expenditures. Methods Using a multistage cluster sampling design, 1,008 households (28 clusters × 36 households per cluster) were selected proportionately from urban and rural areas. Households were selected through a house-to-house survey during April and May 2014 whose members (a) had experienced illness in the past 30 days, (b) had illness lasting longer than 30 days, (c) had been hospitalized in the past 365 days, or (d) included women who were currently pregnant or had experienced childbirth in the past two years. In these selected households, trained investigators, using a tablet computer-based structured questionnaire, enquired about socio-demographics, nature of illness, source of healthcare, and healthcare and household expenditure. The data were transmitted daily to a central server over a wireless communication network. Mean healthcare expenditures were computed for various health conditions. Catastrophic healthcare expenditure was defined as spending more than 10% of the total annual household expenditure on healthcare. The chi-square test for trend was used to compare catastrophic expenditures on hospitalization between households classified into expenditure quartiles. Results The mean monthly household expenditure was 15,029 Indian Rupees (USD 188.2). Nearly 14.2% of the household expenditure was on healthcare. Fever, respiratory tract diseases, and gastrointestinal diseases were the common acute illnesses, while heart disease, diabetes mellitus, and respiratory diseases were the more common chronic diseases. Hospitalizations were mainly due to cardiovascular diseases, gastrointestinal problems, and accidents. Only 17%, 18%, 20%, and 31% of the healthcare for acute illnesses, chronic illnesses, hospitalizations, and childbirth, respectively, was sought in government health facilities. Average expenditure in government health
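
    The catastrophic-expenditure definition and the quartile comparison translate into a short computation. A sketch in Python/pandas with hypothetical column names and made-up figures (not the survey's data):

        import pandas as pd

        # Annual household expenditure and annual health spending, in Rupees.
        df = pd.DataFrame({
            "annual_expenditure": [180000, 240000, 96000, 350000],
            "health_expenditure": [9000, 40000, 15000, 20000],
        })

        # Catastrophic: health spending exceeds 10% of total annual expenditure.
        df["catastrophic"] = df["health_expenditure"] > 0.10 * df["annual_expenditure"]

        # Expenditure quartiles, as used in the chi-square test for trend.
        df["quartile"] = pd.qcut(df["annual_expenditure"], 4, labels=[1, 2, 3, 4])
        print(df.groupby("quartile", observed=True)["catastrophic"].mean())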

  14. Experimental determination of the partitioning coefficient and volatility of important BVOC oxidation products using the Aerosol Collection Module (ACM) coupled to a PTR-ToF-MS

    NASA Astrophysics Data System (ADS)

    Gkatzelis, G.; Hohaus, T.; Tillmann, R.; Schmitt, S. H.; Yu, Z.; Schlag, P.; Wegener, R.; Kaminski, M.; Kiendler-Scharr, A.

    2015-12-01

    Atmospheric aerosol can alter the Earth's radiative budget and global climate and can also affect human health. A dominant contributor to submicrometer particulate matter (PM) is organic aerosol (OA). OA can be either directly emitted through, e.g., combustion processes (primary OA) or formed through the oxidation of organic gases (secondary organic aerosol, SOA). A detailed understanding of SOA formation is important, as SOA constitutes a major contribution to total OA. The partitioning between the gas and particle phase, as well as the volatility of individual components of SOA, is as yet poorly understood, adding uncertainty and thus complicating climate modelling. In this work, a new experimental methodology was used for compound-specific analysis of organic aerosol. The Aerosol Collection Module (ACM) is a newly developed instrument that deploys an aerodynamic lens to separate the gas and particle phases of an aerosol. The particle phase is directed to a cooled sampling surface. After collection, particles are thermally desorbed and transferred to a detector for further analysis. In the present work, the ACM was coupled to a Proton Transfer Reaction-Time of Flight-Mass Spectrometer (PTR-ToF-MS) to detect and quantify organic compounds partitioning between the gas and particle phase. This experimental approach was used in a set of experiments at the atmosphere simulation chamber SAPHIR to investigate SOA formation. Ozone oxidation with subsequent photochemical aging of β-pinene, limonene, and real plant emissions from Pinus sylvestris (Scots pine) was studied. Simultaneous measurement of the gas and particle phase using the ACM-PTR-ToF-MS makes it possible to report partitioning coefficients of important BVOC oxidation products. Additionally, volatility trends and changes of the SOA with photochemical aging are investigated and compared for all systems studied.
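
    The quantity such simultaneous gas- and particle-phase measurements yield is conventionally defined, in the absorptive-partitioning framework, as

        K_{p,i} = \frac{C_i^{\mathrm{p}} / C_{\mathrm{OA}}}{C_i^{\mathrm{g}}} \qquad [\mathrm{m^3\,\mu g^{-1}}]

    where C_i^p and C_i^g are the particle- and gas-phase mass concentrations of compound i (here measured by the ACM-PTR-ToF-MS) and C_OA is the total organic-aerosol mass concentration. This is one common convention; the authors' exact formulation may differ.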

  15. Teaching Perspectives among Introductory Computer Programming Faculty in Higher Education

    ERIC Educational Resources Information Center

    Mainier, Michael J.

    2011-01-01

    This study identified the teaching beliefs, intentions, and actions of 80 introductory computer programming (CS1) faculty members from institutions of higher education in the United States using the Teacher Perspectives Inventory. The instruction method used inside the classroom, categorized by ACM CS1 curriculum guidelines, was also captured along…

  16. A Placement Test for Computer Science: Design, Implementation, and Analysis

    ERIC Educational Resources Information Center

    Nugent, Gwen; Soh, Leen-Kiat; Samal, Ashok; Lang, Jeff

    2006-01-01

    An introductory CS1 course presents problems for educators and students due to students' diverse backgrounds in programming knowledge and exposure. Students who enroll in CS1 also have different expectations and motivations. Prompted by the curricular guidelines for undergraduate programmes in computer science released in 2001 by the ACM/IEEE, and…

  17. Macro- and microstructural diversity of sea urchin teeth revealed by large-scale micro-computed tomography survey

    NASA Astrophysics Data System (ADS)

    Ziegler, Alexander; Stock, Stuart R.; Menze, Björn H.; Smith, Andrew B.

    2012-10-01

    Sea urchins (Echinodermata: Echinoidea) generally possess an intricate jaw apparatus that incorporates five teeth. Although echinoid teeth consist of calcite, their complex internal design results in biomechanical properties far superior to those of inorganic forms of the constituent material. While the individual elements (or microstructure) of echinoid teeth provide general insight into processes of biomineralization, the cross-sectional shape (or macrostructure) of echinoid teeth is useful for phylogenetic and biomechanical inferences. However, studies of sea urchin tooth macro- and microstructure have traditionally been limited to a few readily available species, effectively disregarding a potentially high degree of structural diversity that could be informative in a number of ways. Having scanned numerous sea urchin species using micro-computed tomography (µCT) and synchrotron µCT, we report a large variation in macro- and microstructure of sea urchin teeth. In addition, we describe aberrant tooth shapes and apply 3D visualization protocols that permit accelerated visual access to the complex microstructure of sea urchin teeth. Our broad survey identifies key taxa for further in-depth study and integrates previously assembled data on fossil species into a more comprehensive systematic analysis of sea urchin teeth. In order to circumvent the imprecise, word-based description of tooth shape, we introduce shape analysis algorithms that will permit a numerical and therefore more objective description of tooth macrostructure. Finally, we discuss how synchrotron µCT datasets permit virtual models of tooth microstructure to be generated as well as the simulation of tooth mechanics based on finite element modeling.

  18. Radiation Dose from Whole-Body F-18 Fluorodeoxyglucose Positron Emission Tomography/Computed Tomography: Nationwide Survey in Korea.

    PubMed

    Kwon, Hyun Woo; Kim, Jong Phil; Lee, Hong Jae; Paeng, Jin Chul; Lee, Jae Sung; Cheon, Gi Jeong; Lee, Dong Soo; Chung, June-Key; Kang, Keon Wook

    2016-02-01

    The purpose of this study was to estimate average radiation exposure from (18)F-fluorodeoxyglucose (FDG) positron emission tomography/computed tomography (PET/CT) examinations and to analyze possible factors affecting the radiation dose. A nation-wide questionnaire survey was conducted involving all institutions that operate PET/CT scanners in Korea. From the response, radiation doses from injected FDG and CT examination were calculated. A total of 105 PET/CT scanners in 73 institutions were included in the analysis (response rate of 62.4%). The average FDG injected activity was 310 ± 77 MBq and 5.11 ± 1.19 MBq/kg. The average effective dose from FDG was estimated to be 5.89 ± 1.46 mSv. The average CT dose index and dose-length product were 4.60 ± 2.47 mGy and 429.2 ± 227.6 mGy∙cm, which corresponded to 6.26 ± 3.06 mSv. The radiation doses from FDG and CT were significantly lower in case of newer scanners than older ones (P < 0.001). Advanced PET technologies such as time-of-flight acquisition and point-spread function recovery were also related to low radiation dose (P < 0.001). In conclusion, the average radiation dose from FDG PET/CT is estimated to be 12.2 mSv. The radiation dose from FDG PET/CT is reduced with more recent scanners equipped with image-enhancing algorithms. PMID:26908992
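
    The reported figures are internally consistent and easy to check: the PET contribution is the injected activity times a per-activity dose coefficient, and the CT contribution is the dose-length product times a conversion factor. A back-of-envelope sketch in Python (the coefficients below are inferred from the abstract's own numbers; the FDG value matches the commonly used adult coefficient of about 0.019 mSv/MBq):

        # Survey averages from the abstract.
        fdg_activity_mbq = 310      # injected 18F-FDG activity
        dlp_mgy_cm = 429.2          # CT dose-length product

        dose_pet = fdg_activity_mbq * 0.019   # ~5.9 mSv, matches 5.89 +/- 1.46
        dose_ct = dlp_mgy_cm * 0.0146         # ~6.3 mSv, matches 6.26 +/- 3.06
        print(round(dose_pet, 1), round(dose_ct, 1), round(dose_pet + dose_ct, 1))
        # -> 5.9 6.3 12.2 -- the survey's headline total of 12.2 mSv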

  19. Radiation Dose from Whole-Body F-18 Fluorodeoxyglucose Positron Emission Tomography/Computed Tomography: Nationwide Survey in Korea

    PubMed Central

    2016-01-01

    The purpose of this study was to estimate average radiation exposure from 18F-fluorodeoxyglucose (FDG) positron emission tomography/computed tomography (PET/CT) examinations and to analyze possible factors affecting the radiation dose. A nation-wide questionnaire survey was conducted involving all institutions that operate PET/CT scanners in Korea. From the response, radiation doses from injected FDG and CT examination were calculated. A total of 105 PET/CT scanners in 73 institutions were included in the analysis (response rate of 62.4%). The average FDG injected activity was 310 ± 77 MBq and 5.11 ± 1.19 MBq/kg. The average effective dose from FDG was estimated to be 5.89 ± 1.46 mSv. The average CT dose index and dose-length product were 4.60 ± 2.47 mGy and 429.2 ± 227.6 mGy∙cm, which corresponded to 6.26 ± 3.06 mSv. The radiation doses from FDG and CT were significantly lower in case of newer scanners than older ones (P < 0.001). Advanced PET technologies such as time-of-flight acquisition and point-spread function recovery were also related to low radiation dose (P < 0.001). In conclusion, the average radiation dose from FDG PET/CT is estimated to be 12.2 mSv. The radiation dose from FDG PET/CT is reduced with more recent scanners equipped with image-enhancing algorithms. PMID:26908992

  20. A survey of the satisfaction of patients who have undergone implant surgery with and without employing a computer-guided implant surgical template

    PubMed Central

    Youk, Shin-Young; Lee, Jee-Ho; Heo, Seong-Joo; Roh, Hyun-Ki; Park, Eun-Jin; Shin, Im Hee

    2014-01-01

    PURPOSE This study investigated the degree of subjective pain and the satisfaction of patients who had undergone implant treatment using a computer-guided template. MATERIALS AND METHODS A survey was conducted of 135 patients who had undergone implant surgery with and without the use of the computer-guided template during 2012 and 2013 in university hospitals, dental hospitals, and dental clinics that practiced implant surgery using the computer-guided template. Likert-scale and VAS scores were used in the survey questions, and the independent t-test and one-way ANOVA were performed (α=.05). RESULTS The subjects were most often introduced to computer-guided implant surgery using a surgical template through their dentists' advice, and the most common reason for choosing such surgery was that it was accurate and safe. Most answered that they were willing to recommend it to others. The patients who had undergone computer-guided implant surgery felt less pain during the operation and reported higher satisfaction than those who had undergone conventional implant surgery. Among the patients who had undergone computer-guided implant surgery, those with prior experience of surgery without a computer-guided template expressed higher satisfaction with the former (P<.05). CONCLUSION This study found that patients who had undergone computer-guided implant surgery employing a surgical template felt less pain and had higher satisfaction than those who had conventional surgery, and that the dentist's explanation could provide confidence in the safety of the surgery. PMID:25352962

  1. Computer use and needs of internists: a survey of members of the American College of Physicians-American Society of Internal Medicine.

    PubMed Central

    Lacher, D.; Nelson, E.; Bylsma, W.; Spena, R.

    2000-01-01

    The American College of Physicians-American Society of Internal Medicine conducted a membership survey in late 1998 to assess members' activities, needs, and attitudes. A total of 9,466 members (20.9% response rate) reported on 198 items related to computer use and the needs of internists. Eighty-two percent of the respondents reported that they use computers for personal or professional reasons. Physicians younger than 50 years old who had full- or part-time academic affiliation reported using computers more frequently for medical applications. About two-thirds of respondents who had access to computers connected to the Internet at least weekly, with most using the Internet from home for e-mail and nonmedical uses. Physicians expressed concerns about Internet security, confidentiality, and accuracy, and the lack of time to browse the Internet. In practice settings, internists used computers for administrative and financial functions. Less than 19% of respondents had partial or complete electronic clinical functions in their offices. Less than 7% of respondents exchanged e-mail with their patients on a weekly or daily basis. Also, less than 15% of respondents used computers for continuing medical education (CME). Respondents reported they wanted to increase their general computer skills and enhance their knowledge of computer-based information sources for patient care, electronic medical record systems, computer-based CME, and telemedicine. While most respondents used computers and connected to the Internet, few physicians utilized computers for clinical management. Medical organizations face the challenge of increasing physician use of clinical systems and electronic CME. PMID:11079924

  2. Enterococcus faecium biofilm formation: identification of major autolysin AtlAEfm, associated Acm surface localization, and AtlAEfm-independent extracellular DNA Release.

    PubMed

    Paganelli, Fernanda L; Willems, Rob J L; Jansen, Pamela; Hendrickx, Antoni; Zhang, Xinglin; Bonten, Marc J M; Leavis, Helen L

    2013-01-01

    Enterococcus faecium is an important multidrug-resistant nosocomial pathogen causing biofilm-mediated infections in patients with medical devices. Insight into E. faecium biofilm pathogenesis is pivotal for the development of new strategies to prevent and treat these infections. In several bacteria, a major autolysin is essential for extracellular DNA (eDNA) release in the biofilm matrix, contributing to biofilm attachment and stability. In this study, we identified and functionally characterized the major autolysin of E. faecium E1162 by a bioinformatic genome screen followed by insertional gene disruption of six putative autolysin genes. Insertional inactivation of locus tag EfmE1162_2692 resulted in resistance to lysis, reduced eDNA release, deficient cell attachment, decreased biofilm, decreased cell wall hydrolysis, and significant chaining compared to that of the wild type. Therefore, locus tag EfmE1162_2692 was considered the major autolysin in E. faecium and renamed atlAEfm. In addition, AtlAEfm was implicated in cell surface exposure of Acm, a virulence factor in E. faecium, and thereby facilitates binding to collagen types I and IV. This is a novel feature of enterococcal autolysins not described previously. Furthermore, we identified (and localized) autolysin-independent DNA release in E. faecium that contributes to cell-cell interactions in the atlAEfm mutant and is important for cell separation. In conclusion, AtlAEfm is the major autolysin in E. faecium and contributes to biofilm stability and Acm localization, making AtlAEfm a promising target for treatment of E. faecium biofilm-mediated infections. IMPORTANCE Nosocomial infections caused by Enterococcus faecium have rapidly increased, and treatment options have become more limited. This is due not only to increasing resistance to antibiotics but also to biofilm-associated infections. DNA is released in biofilm matrix via cell lysis, caused by autolysin, and acts as a matrix stabilizer. In this study

  3. Peak data for U.S. Geological Survey gaging stations, Texas network and computer program to estimate peak-streamflow frequency

    USGS Publications Warehouse

    Slade, R.M.; Asquith, W.H.

    1996-01-01

    About 23,000 annual peak streamflows and about 400 historical peak streamflows exist for about 950 stations in the surface-water data-collection network of Texas. These data are presented on a computer diskette along with the corresponding dates, gage heights, and information concerning the basin and the nature or cause of the flood. Also on the computer diskette is a U.S. Geological Survey computer program that estimates peak-streamflow frequency based on annual and historical peak streamflows. The program estimates peak streamflow for 2-, 5-, 10-, 25-, 50-, and 100-year recurrence intervals and is based on guidelines established by the Interagency Advisory Committee on Water Data. Explanations are presented for installing the program, and an example is presented with a discussion of its options.
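
    The Interagency Advisory Committee on Water Data guidelines referenced here (Bulletin 17B) fit a log-Pearson Type III distribution to the logarithms of the annual peaks, so the T-year peak streamflow takes the closed form

        \log Q_T = \bar{X} + K_T(G)\, S

    where \bar{X} and S are the mean and standard deviation of the logarithms of the annual peak flows, G is their skew coefficient, and K_T is the tabulated frequency factor for recurrence interval T. This is a sketch for orientation; the program's exact handling of historical peaks and outliers follows the guidelines.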

  4. Finding Hidden Geothermal Resources in the Basin and Range Using Electrical Survey Techniques: A Computational Feasibility Study

    SciTech Connect

    J. W. Pritchett

    2004-12-01

    For many years, there has been speculation about "hidden" or "blind" geothermal systems—reservoirs that lack an obvious overlying surface fluid outlet. At present, it is simply not known whether "hidden" geothermal reservoirs are rare or common. An approach to identifying promising drilling targets using methods that are cheaper than drilling is needed. These methods should be regarded as reconnaissance tools, whose primary purpose is to locate high-probability targets for subsequent deep confirmation drilling. The purpose of this study was to appraise the feasibility of finding "hidden" geothermal reservoirs in the Basin and Range using electrical survey techniques, and of adequately locating promising targets for deep exploratory drilling based on the survey results. The approach was purely theoretical. A geothermal reservoir simulator was used to carry out a lengthy calculation of the evolution of a synthetic but generic Great Basin-type geothermal reservoir to a quasi-steady "natural state". Postprocessors were used to estimate what a suite of geophysical surveys of the prospect would see. Based on these results, the different survey techniques were compared and evaluated in terms of their ability to identify suitable drilling targets. This process was completed for eight different reservoir models. Of the eight cases considered, four were "hidden" systems, so that the survey techniques could be appraised in terms of their ability to detect and characterize such resources and to distinguish them from more conventionally situated geothermal reservoirs. It is concluded that the best way to find "hidden" Basin and Range geothermal resources of this general type is to carry out simultaneous SP and low-frequency MT surveys, and then to combine the results of both surveys with other pertinent information using mathematical "inversion" techniques to characterize the subsurface quantitatively. Many such surveys and accompanying analyses can be carried out.

  5. Python: a language for computational physics

    NASA Astrophysics Data System (ADS)

    Borcherds, P. H.

    2007-07-01

    Python is a relatively new computing language, created by Guido van Rossum [A.S. Tanenbaum, R. van Renesse, H. van Staveren, G.J. Sharp, S.J. Mullender, A.J. Jansen, G. van Rossum, Experiences with the Amoeba distributed operating system, Communications of the ACM 33 (1990) 46-63; also on-line at http://www.cs.vu.nl/pub/amoeba/].

  6. Enterococcus faecium Biofilm Formation: Identification of Major Autolysin AtlAEfm, Associated Acm Surface Localization, and AtlAEfm-Independent Extracellular DNA Release

    PubMed Central

    Paganelli, Fernanda L.; Willems, Rob J. L.; Jansen, Pamela; Hendrickx, Antoni; Zhang, Xinglin; Bonten, Marc J. M.; Leavis, Helen L.

    2013-01-01

    ABSTRACT Enterococcus faecium is an important multidrug-resistant nosocomial pathogen causing biofilm-mediated infections in patients with medical devices. Insight into E. faecium biofilm pathogenesis is pivotal for the development of new strategies to prevent and treat these infections. In several bacteria, a major autolysin is essential for extracellular DNA (eDNA) release in the biofilm matrix, contributing to biofilm attachment and stability. In this study, we identified and functionally characterized the major autolysin of E. faecium E1162 by a bioinformatic genome screen followed by insertional gene disruption of six putative autolysin genes. Insertional inactivation of locus tag EfmE1162_2692 resulted in resistance to lysis, reduced eDNA release, deficient cell attachment, decreased biofilm, decreased cell wall hydrolysis, and significant chaining compared to that of the wild type. Therefore, locus tag EfmE1162_2692 was considered the major autolysin in E. faecium and renamed atlAEfm. In addition, AtlAEfm was implicated in cell surface exposure of Acm, a virulence factor in E. faecium, and thereby facilitates binding to collagen types I and IV. This is a novel feature of enterococcal autolysins not described previously. Furthermore, we identified (and localized) autolysin-independent DNA release in E. faecium that contributes to cell-cell interactions in the atlAEfm mutant and is important for cell separation. In conclusion, AtlAEfm is the major autolysin in E. faecium and contributes to biofilm stability and Acm localization, making AtlAEfm a promising target for treatment of E. faecium biofilm-mediated infections. PMID:23592262

  7. Papers Presented at the ACM SIGCSE Technical Symposium on Academic Education in Computer Science [held in Houston, Texas, November 16, 1970].

    ERIC Educational Resources Information Center

    Aiken, Robert M., Ed.

    1970-01-01

    The papers given at this symposium were selected for their description of how specific problems were tackled, and with what success, as opposed to proposals unsupported by experience. The goal was to permit the audience to profit from the trials (and errors) of others. The eighteen papers presented are: "Business and the University Computer…

  8. A system of computer programs (WAT_MOVE) for transferring data among data bases in the US Geological Survey National Water Information System

    SciTech Connect

    Rogers, G.D.; Kerans, B.K.

    1991-11-01

    This report describes WAT_MOVE, a system of computer programs that was developed for moving National Water Information System data between US Geological Survey distributed computer databases. WAT_MOVE has three major sub-systems: one for retrieval, one for loading, and one for purging. The retrieval sub-system creates transaction files of retrieved data for transfer and invokes a file transfer to send the transaction files to the receiving site. The loading sub-system reads the control and transaction files retrieved from the source database and loads the data in the appropriate files. The purging sub-system deletes data from a database. Although WAT_MOVE was developed for use by the Geological Survey's Hydrologic Investigations Program of the Yucca Mountain Project Branch, the software can be beneficial to any office maintaining data in the Site File, ADAPS (Automated Data Processing System), GWSI (Ground-Water Site Inventory), and QW (Quality of Water) sub-systems of the National Water Information System. The software also can be used to move data between databases on a single network node or to modify data within a database.

  9. Where we stand, where we are moving: Surveying computational techniques for identifying miRNA genes and uncovering their regulatory role.

    PubMed

    Kleftogiannis, Dimitrios; Korfiati, Aigli; Theofilatos, Konstantinos; Likothanassis, Spiros; Tsakalidis, Athanasios; Mavroudi, Seferina

    2013-06-01

    Traditional biology was forced to restate some of its principles when microRNA (miRNA) genes and their regulatory role were first discovered. Typically, miRNAs are small non-coding RNA molecules that bind to the 3' untranslated region (UTR) of their mRNA target genes for cleavage or translational repression. Existing experimental techniques for their identification and for the prediction of target genes share some important limitations, such as low coverage, time-consuming experiments, and high-cost reagents. Hence, many computational methods have been proposed for these tasks to overcome these limitations. Recently, many researchers have emphasized the development of computational approaches to predict the participation of miRNA genes in regulatory networks and to analyze their transcription mechanisms. All these approaches have certain advantages and disadvantages, which are described in the present survey. Our work is differentiated from existing review papers by updating the list of methodologies and by emphasizing the computational issues that arise from miRNA data analysis. Furthermore, in the present survey, the various miRNA data analysis steps are treated as an integrated procedure whose aim is to uncover the regulatory role and mechanisms of miRNA genes. This integrated view of the miRNA data analysis steps may be extremely useful for all researchers, even those who work on just a single step. PMID:23501016

  10. Survey of new vector computers: The CRAY 1S from CRAY research; the CYBER 205 from CDC and the parallel computer from ICL - architecture and programming

    NASA Technical Reports Server (NTRS)

    Gentzsch, W.

    1982-01-01

    Problems which can arise with vector and parallel computers are discussed in a user-oriented context. Emphasis is placed on the algorithms used and the programming techniques adopted. Three recently developed supercomputers are examined and typical application examples are given in CRAY FORTRAN, CYBER 205 FORTRAN, and DAP (distributed array processor) FORTRAN. The systems' performance is compared. The addition of parts of two N x N arrays is considered. The influence of the architecture on the algorithms and programming language is demonstrated. Numerical analysis of magnetohydrodynamic differential equations by an explicit difference method is illustrated, showing very good results for all three systems. The prognosis for supercomputer development is assessed.
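
    The array-addition kernel used for comparison maps directly onto today's vectorized idiom. A sketch in Python/NumPy of the same kernel (a modern stand-in, not the article's FORTRAN), contrasting the single vector statement with the scalar loop that vector machines were built to avoid:

        import numpy as np

        n = 256
        a = np.random.rand(n, n)
        b = np.random.rand(n, n)

        # Vectorized elementwise addition of two N x N arrays: one statement,
        # dispatched to an optimized kernel, much as on the CRAY or CYBER pipelines.
        c = a + b

        # The equivalent explicit scalar loop.
        c_loop = np.empty_like(a)
        for i in range(n):
            for j in range(n):
                c_loop[i, j] = a[i, j] + b[i, j]

        assert np.allclose(c, c_loop)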

  11. Ethics in the computer age. Conference proceedings

    SciTech Connect

    Kizza, J.M.

    1994-12-31

    These proceedings contain the papers presented at the Ethics in the Computer Age conference held in Gatlinburg, Tennessee, November 11-13, 1994. The conference was sponsored by ACM SIGCAS (Computers and Society), to which I am very grateful. The Ethics in the Computer Age conference sequence started in 1991 with the first conference at the campus of the University of Tennessee at Chattanooga. The second was held at the same location a year later. These two conferences were limited to only invited speakers, but their success was overwhelming. This is the third in the sequence and the first truly international one. Plans are already under way for the fourth in 1996.

  12. Computer-science guest-lecture series at Langston University sponsored by the U.S. Geological Survey; abstracts, 1992-93

    USGS Publications Warehouse

    Steele, K. S., (compiler)

    1994-01-01

    Langston University, a Historically Black University located at Langston, Oklahoma, has a computing and information science program within the Langston University Division of Business. Since 1984, Langston University has participated in the Historically Black College and University program of the U.S. Department of the Interior, which provided education, training, and funding through a combined earth-science and computer-technology cooperative program with the U.S. Geological Survey (USGS). USGS personnel have presented guest lectures at Langston University since 1984. Students have been enthusiastic about the lectures, and as a result of this program, 13 Langston University students have been hired by the USGS on a part-time basis while they continued their education at the University. The USGS expanded the offering of guest lectures in 1992 by increasing the number of visits to Langston University and by inviting the participation of speakers from throughout the country. The objectives of the guest-lecture series are to assist Langston University in offering state-of-the-art education in the computer sciences, to provide students with an opportunity to learn from and interact with skilled computer-science professionals, and to develop a pool of potential future employees for part-time and full-time employment. This report includes abstracts of guest-lecture presentations during the 1992-93 school year.

  13. Computational Analysis and Simulation of Empathic Behaviors: a Survey of Empathy Modeling with Behavioral Signal Processing Framework.

    PubMed

    Xiao, Bo; Imel, Zac E; Georgiou, Panayiotis; Atkins, David C; Narayanan, Shrikanth S

    2016-05-01

    Empathy is an important psychological process that facilitates human communication and interaction. Enhancement of empathy has profound significance in a range of applications. In this paper, we review emerging directions of research on the computational analysis of empathy expression and perception, as well as empathic interactions, including their simulation. We summarize the work on empathic expression analysis by the targeted signal modalities (e.g., text, audio, and facial expressions). We categorize empathy simulation studies into theory-based emotion space modeling or application-driven user and context modeling. We summarize challenges in the computational study of empathy, including conceptual framing and understanding of empathy, data availability, appropriate use and validation of machine learning techniques, and behavioral signal processing. Finally, we propose a unified view of empathy computation and offer a series of open problems for future research. PMID:27017830

  14. Documentation of computer programs to compute and display pathlines using results from the U.S. Geological Survey modular three-dimensional finite-difference ground-water flow model

    USGS Publications Warehouse

    Pollock, David W.

    1989-01-01

    A particle-tracking post-processing package was developed to compute three-dimensional pathlines based on output from steady-state simulations obtained with the U.S. Geological Survey modular three-dimensional finite-difference ground-water flow model. The package consists of two FORTRAN 77 computer programs: (1) MODPATH, which calculates pathlines, and (2) MODPATH-PLOT, which presents results graphically. MODPATH uses a semi-analytical particle-tracking scheme. The method is based on the assumption that each directional velocity component varies linearly within a grid cell in its own coordinate direction. This assumption allows an analytical expression to be obtained describing the flow path within a grid cell. Given the initial position of a particle anywhere in a cell, the coordinates of any other point along its pathline within the cell, and the time of travel between them, can be computed directly. Data are input to MODPATH and MODPATH-PLOT through a combination of files and interactive dialogue. Examples of how to use MODPATH and MODPATH-PLOT are provided for a sample problem. Listings of the computer codes and detailed descriptions of input data format and program options are also presented. (Author's abstract)
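
    The semi-analytical scheme rests on the linear-velocity assumption: within a cell, dx/dt = v1 + A(x - x1) with A = (v2 - v1)/(x2 - x1), which integrates to an exponential and gives the travel time to the exit face in closed form. A one-dimensional sketch in Python (hypothetical function name; MODPATH applies this per coordinate direction and takes the minimum exit time over the three directions):

        import math

        def exit_time_1d(x1, x2, v1, v2, xp):
            # Travel time from xp to the cell face the particle exits through,
            # for velocity varying linearly between the face values v1 and v2.
            A = (v2 - v1) / (x2 - x1)
            vp = v1 + A * (xp - x1)
            if vp == 0.0:
                return math.inf                 # stagnation point: no exit
            x_exit, v_exit = (x2, v2) if vp > 0 else (x1, v1)
            if A == 0.0:
                return (x_exit - xp) / vp       # uniform velocity
            if v_exit == 0.0 or v_exit * vp < 0:
                return math.inf                 # flow reverses before the face
            return math.log(v_exit / vp) / A    # closed-form solution of the ODE

        # Particle entering the left face of a 10-m cell with accelerating flow.
        print(exit_time_1d(0.0, 10.0, 1.0, 2.0, 0.0))  # ~6.93 time units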

  15. Nuclear fuel cycle risk assessment: survey and computer compilation of risk-related literature. [Once-through Cycle and Plutonium Recycle

    SciTech Connect

    Yates, K.R.; Schreiber, A.M.; Rudolph, A.W.

    1982-10-01

    The US Nuclear Regulatory Commission has initiated the Fuel Cycle Risk Assessment Program to provide risk assessment methods for assistance in the regulatory process for nuclear fuel cycle facilities other than reactors. Both the once-through cycle and plutonium recycle are being considered. A previous report generated by this program defines and describes fuel cycle facilities, or elements, considered in the program. This report, the second from the program, describes the survey and computer compilation of fuel cycle risk-related literature. Sources of available information on the design, safety, and risk associated with the defined set of fuel cycle elements were searched and documents obtained were catalogued and characterized with respect to fuel cycle elements and specific risk/safety information. Both US and foreign surveys were conducted. Battelle's computer-based BASIS information management system was used to facilitate the establishment of the literature compilation. A complete listing of the literature compilation and several useful indexes are included. Future updates of the literature compilation will be published periodically. 760 annotated citations are included.

  16. The John von Neumann Institute for Computing (NIC): A survey of its supercomputer facilities and its Europe-wide computational science activities

    NASA Astrophysics Data System (ADS)

    Attig, N.

    2006-03-01

    The John von Neumann Institute for Computing (NIC) at the Research Centre Jülich, Germany, is one of the leading supercomputing centres in Europe. Founded as a national centre in the mid-eighties, it now provides more and more resources to European scientists. This happens within EU-funded projects (I3HP, DEISA) and Europe-wide scientific collaborations. Beyond these activities, NIC started an initiative towards the new EU member states in summer 2004. Outstanding research groups are offered the use of the supercomputers at NIC to accelerate their investigations on leading-edge technology. The article gives an overview of the organisational structure of NIC, its current supercomputer systems, and its user support. Transnational Access (TA) within I3HP is described, as well as access under the initiative for new EU member states. The scope of these offers and the procedure for applying for supercomputer resources are presented in detail.

  17. How to Implement Rigorous Computer Science Education in K-12 Schools? Some Answers and Many Questions

    ERIC Educational Resources Information Center

    Hubwieser, Peter; Armoni, Michal; Giannakos, Michail N.

    2015-01-01

    Aiming to collect various concepts, approaches, and strategies for improving computer science education in K-12 schools, we edited this second special issue of the "ACM TOCE" journal. Our intention was to collect a set of case studies from different countries that would describe all relevant aspects of specific implementations of…

  18. Internet Use for Health-Related Information via Personal Computers and Cell Phones in Japan: A Cross-Sectional Population-Based Survey

    PubMed Central

    Takahashi, Yoshimitsu; Ohura, Tomoko; Ishizaki, Tatsuro; Okamoto, Shigeru; Miki, Kenji; Naito, Mariko; Akamatsu, Rie; Sugimori, Hiroki; Yoshiike, Nobuo; Miyaki, Koichi; Shimbo, Takuro

    2011-01-01

    Background The Internet is known to be used for health purposes by the general public all over the world. However, little is known about the use of, attitudes toward, and activities regarding eHealth among the Japanese population. Objectives This study aimed to measure the prevalence of Internet use for health-related information compared with other sources, and to examine the effects on user knowledge, attitudes, and activities with regard to Internet use for health-related information in Japan. We examined the extent of use via personal computers and cell phones. Methods We conducted a cross-sectional survey of a quasi-representative sample (N = 1200) of the Japanese general population aged 15–79 years in September 2007. The main outcome measures were (1) self-reported rates of Internet use in the past year to acquire health-related information and to contact health professionals, family, friends, and peers specifically for health-related purposes, and (2) perceived effects of Internet use on health care. Results The prevalence of Internet use via personal computer for acquiring health-related information was 23.8% (286/1200) among those surveyed, whereas the prevalence via cell phone was 6% (77). Internet use via both personal computer and cell phone for communicating with health professionals, family, friends, or peers was not common. The Internet was used via personal computer for acquiring health-related information primarily by younger people, people with higher education levels, and people with higher household incomes. The majority of those who used the Internet for health care purposes responded that the Internet improved their knowledge or affected their lifestyle attitude, and that they felt confident in the health-related information they obtained from the Internet. However, less than one-quarter thought it improved their ability to manage their health or affected their health-related activities. Conclusions Japanese moderately used the Internet via

  19. The use of computer-assisted orthopedic surgery for total knee replacement in daily practice: a survey among ESSKA/SGO-SSO members.

    PubMed

    Friederich, N; Verdonk, R

    2008-06-01

    Computer-assisted orthopedic surgery (CAOS) for total knee arthroplasty is an emerging surgical tool, yet little is known about how it is being used in everyday orthopedic centers. We sought to better understand physicians' current practices and beliefs on this topic by performing a Web-based survey. Between December 2006 and January 2007, a 24-question survey was emailed to 3,330 members of the European Society of Sports Traumatology Knee Surgery and Arthroscopy (ESSKA) and the Swiss Orthopedic Society (SGO-SSO), with 389 (11.7%) agreeing to participate. Of this group, 202 (51.9%) reported that their center was equipped with a navigation system, which was an image-free system for most (83.2%) and was primarily used for total knee arthroplasty (61.4%). In terms of the proportion of use, 50.5% of respondents used their navigation system in less than 25% of cases, 16.3% in 25-50% of cases, 7.4% in 51-75% of cases, and 25.7% in more than 75% of cases. The potential for improving the alignment of the prosthesis was the most strongly cited reason for using a navigation system, while the potential for increasing operation times and the risk of infections were the most strongly cited reasons for not using one. Approximately half of the respondents believed navigation systems were a real innovation contributing to the improvement of total knee implantation. However, heavy usage of computer-assisted navigation (≥51% of cases) was observed in only 33.1% of respondents, with only a quarter using it at rates that could be considered frequent (>75% of cases). Forty-eight percent of respondents said they would use a navigation system in more cases, and 39.1% said that their usage would stay the same. These findings indicate that CAOS is being used only moderately in current practice, though respondents generally had a positive opinion of its potential benefits. Physicians may be awaiting more data before adopting the use of these systems, though survey

  20. Computational Genomics Using Graph Theory

    NASA Astrophysics Data System (ADS)

    Schlick, Tamar

    2005-03-01

    … Laserson, H. H. Gan, and T. Schlick, "Searching for 2D RNA Geometries in Bacterial Genomes," Proceedings of the ACM Symposium on Computational Geometry, June 9-11, New York, pp. 373-377 (2004) (http://socg.poly.edu/home.htm). N. Kim, N. Shiffeldrim, H. H. Gan, and T. Schlick, "Novel Candidates of RNA Topologies," J. Mol. Biol. 341: 1129-1144 (2004). Schlick, "RAG: RNA-As-Graphs Web Resource," BMC Bioinformatics 5: 88-97 (2004) (http://www.biomedcentral.com/1471-2105/5/88). S. Pasquali, H. H. Gan, and T. Schlick, "Modular RNA Architecture Revealed by Computational Analysis of Existing Pseudoknots and Ribosomal RNAs," Nucl. Acids Res., Submitted (2004). T. Schlick, Molecular Modeling: An Interdisciplinary Guide, Springer-Verlag, New York, 2002.

  1. A Survey of Advancements in Nucleic Acid-based Logic Gates and Computing for Applications in Biotechnology and Biomedicine

    PubMed Central

    Wu, Cuichen; Wan, Shuo; Hou, Weijia; Zhang, Liqin; Xu, Jiehua; Cui, Cheng; Wang, Yanyue; Hu, Jun

    2015-01-01

    Nucleic acid-based logic devices were first introduced in 1994. Since then, science has seen the emergence of new logic systems for mimicking mathematical functions, diagnosing disease, and even imitating biological systems. The unique features of nucleic acids, such as facile and high-throughput synthesis, Watson-Crick complementary base pairing, and predictable structures, together with the aid of programmable design, have led to the widespread application of nucleic acids (NAs) for logic gating and computing in biotechnology and biomedicine. In this feature article, the development of in vitro NA logic systems is discussed, as well as the expansion of such systems using various input molecules for potential cellular, or even in vivo, applications. PMID:25597946

  2. Numerical analysis of boosting scheme for scalable NMR quantum computation

    SciTech Connect

    SaiToh, Akira; Kitagawa, Masahiro

    2005-02-01

    Among initialization schemes for ensemble quantum computation beginning at thermal equilibrium, the scheme proposed by Schulman and Vazirani [in Proceedings of the 31st ACM Symposium on Theory of Computing (STOC'99) (ACM Press, New York, 1999), pp. 322-329] is known for its simple quantum circuit for redistributing the biases (polarizations) of qubits and for its small time complexity. However, our numerical simulation shows that the number of qubits initialized by the scheme is considerably smaller than expected from the von Neumann entropy, because of an increase in the sum of the binary entropies of the individual qubits, which indicates a growth in the total classical correlation. This result--namely, that there is such a significant growth in the total binary entropy--disagrees with that of their analysis.
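
    The entropy accounting behind this comparison is the standard counting argument: for n qubits each with polarization (bias) epsilon, any reversible initialization procedure can bring at most

        m \le n \left[ 1 - H\!\left( \tfrac{1+\varepsilon}{2} \right) \right],
        \qquad H(p) = -p \log_2 p - (1-p) \log_2 (1-p)

    qubits to a pure state, so any growth in the sum of the individual binary entropies directly reduces the attainable m. This is a sketch for orientation, not the paper's simulation.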

  3. HIV-related risk behaviors among the general population: a survey using Audio Computer-Assisted Self-Interview in 3 cities in Vietnam.

    PubMed

    Vu, Lan T H; Nadol, Patrick; Le, Linh Cu

    2015-03-01

    This study used a confidential survey method, Audio Computer-Assisted Self-Interview (ACASI), to gather data about HIV-related risk knowledge and behaviors among the general population in Vietnam. The study sample included 1371 people aged 15 to 49 years in 3 cities: Hanoi, Da Nang, and Can Tho. Results indicated that 7% of participants had ever had nonconsensual sex, and 3.6% had ever had a one-night stand. The percentage of male participants who reported ever having sex with sex workers was 9.6%, and ever injecting drugs, 4.3%. The proportion of respondents who had ever been tested for HIV was 17.6%. The risk factors and attitudes reported in the survey indicate the importance of analyzing risk behaviors related to HIV infection among the general population. Young people, especially men in more urbanized settings, are engaging in risky behaviors and may act as a "bridge" for the transmission of HIV from high-risk groups to the general population in Vietnam. PMID:22743864

  4. Surveying System

    NASA Technical Reports Server (NTRS)

    1988-01-01

    Sunrise Geodetic Surveys are setting up their equipment for a town survey. Their equipment differs from conventional surveying systems that employ transit, rod, and chain to measure angles and distances. They are using ISTAC Inc.'s Model 2002 positioning system, which offers fast, accurate surveying with exceptional signals from orbiting satellites. The special utility of the ISTAC Model 2002 is that it can provide positioning of the highest accuracy from Navstar PPS signals because it requires no knowledge of secret codes. It operates by comparing the frequency and time phase of a Navstar signal arriving at one ISTAC receiver with the reception of the same set of signals by another receiver. Data are computer-processed and translated into three-dimensional position data: latitude, longitude, and elevation.

  5. Computer Game Use and Television Viewing Increased Risk for Overweight among Low Activity Girls: Fourth Thai National Health Examination Survey 2008-2009

    PubMed Central

    Nontarak, Jiraluck; Satheannoppakao, Warapone

    2014-01-01

    Studies of the relationship between sedentary behaviors and overweight among children and adolescents show mixed results. The fourth Thai National Health Examination Survey data collected between 2008 and 2009 were used to explore this association in 5,999 children aged 6 to 14 years. The prevalence of overweight, defined by the age- and gender-specific body mass index cut-points of the International Obesity Task Force, was 16%. Using multiple logistic regression, computer game use for more than 1 hour a day was found to be associated with an increased risk of overweight (adjusted odds ratio (AOR) = 1.4; 95% confidence interval: 1.02–1.93). The effect of computer game use and TV viewing on the risk for overweight was significantly pronounced among girls who spent ≤3 days/week in 60 minutes of moderate-intensity physical activity (AOR = 1.99 and 1.72, respectively). In contrast, these sedentary behaviors did not confer significant risk of overweight among boys. The moderating effect on risk of overweight by physical inactivity and media use should be taken into consideration in designing interventions for overweight control in children and adolescents. Tracking societal changes is essential for identification of potential areas for targeted interventions. PMID:24995018
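
    A sketch of how such adjusted odds ratios are obtained by multiple logistic regression, in Python/statsmodels with synthetic data and hypothetical variable names (a real analysis would also account for the complex survey design):

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(0)
        n = 2000
        df = pd.DataFrame({
            "games_1h": rng.integers(0, 2, n),   # computer game use > 1 h/day
            "female": rng.integers(0, 2, n),
            "age": rng.integers(6, 15, n),
        })
        # Simulated outcome with a built-in effect of game use.
        logit = -1.7 + 0.35 * df["games_1h"] + 0.02 * df["age"]
        df["overweight"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

        model = smf.logit("overweight ~ games_1h + female + age", data=df).fit(disp=0)
        aor = np.exp(model.params)       # adjusted odds ratios (ignore Intercept row)
        ci = np.exp(model.conf_int())    # 95% confidence intervals
        print(pd.concat([aor.rename("AOR"), ci], axis=1))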

  6. Potential and limitations of X-Ray micro-computed tomography in arthropod neuroanatomy: a methodological and comparative survey.

    PubMed

    Sombke, Andy; Lipke, Elisabeth; Michalik, Peter; Uhl, Gabriele; Harzsch, Steffen

    2015-06-01

    Classical histology or immunohistochemistry combined with fluorescence or confocal laser scanning microscopy are common techniques in arthropod neuroanatomy, and these methods often require time-consuming and difficult dissections and sample preparations. Moreover, these methods are prone to artifacts due to compression and distortion of tissues, which often result in information loss and especially affect the spatial relationships of the examined parts of the nervous system in their natural anatomical context. Noninvasive approaches such as X-ray micro-computed tomography (micro-CT) can overcome such limitations and have been shown to be a valuable tool for understanding and visualizing internal anatomy and structural complexity. Nevertheless, knowledge about the potential of this method for analyzing the anatomy and organization of nervous systems, especially of taxa with smaller body size (e.g., many arthropods), is limited. This study set out to analyze the brains of selected arthropods with micro-CT, and to compare these results with available histological and immunohistochemical data. Specifically, we explored the influence of different sample preparation procedures. Our study shows that micro-CT is highly suitable for analyzing arthropod neuroarchitecture in situ and allows specific neuropils to be distinguished within the brain to extract quantitative data such as neuropil volumes. Moreover, data acquisition is considerably faster compared with many classical histological techniques. Thus, we conclude that micro-CT is highly suitable for targeting neuroanatomy, as it reduces the risk of artifacts and is faster than classical techniques. PMID:25728683

  7. Computer Programs for Obtaining and Analyzing Daily Mean Steamflow Data from the U.S. Geological Survey National Water Information System Web Site

    USGS Publications Warehouse

    Granato, Gregory E.

    2009-01-01

    Research Council, 2004). The USGS maintains the National Water Information System (NWIS), a distributed network of computers and file servers used to store and retrieve hydrologic data (Mathey, 1998; U.S. Geological Survey, 2008). NWISWeb is an online version of this database that includes water data from more than 24,000 streamflow-gaging stations throughout the United States (U.S. Geological Survey, 2002, 2008). Information from NWISWeb is commonly used to characterize streamflows at gaged sites and to help predict streamflows at ungaged sites. Five computer programs were developed for obtaining and analyzing streamflow from the National Water Information System (NWISWeb). The programs were developed as part of a study by the U.S. Geological Survey, in cooperation with the Federal Highway Administration, to develop a stochastic empirical loading and dilution model. The programs were developed because reliable, efficient, and repeatable methods are needed to access and process streamflow information and data. The first program is designed to facilitate the downloading and reformatting of NWISWeb streamflow data. The second program is designed to facilitate graphical analysis of streamflow data. The third program is designed to facilitate streamflow-record extension and augmentation to help develop long-term statistical estimates for sites with limited data. The fourth program is designed to facilitate statistical analysis of streamflow data. The fifth program is a preprocessor to create batch input files for the U.S. Environmental Protection Agency DFLOW3 program for calculating low-flow statistics. These computer programs were developed to facilitate the analysis of daily mean streamflow data for planning-level water-quality analyses but also are useful for many other applications pertaining to streamflow data and statistics. These programs and the associated documentation are included on the CD-ROM accompanying this report. This report and the appendixes on the
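
    The download step the first program automates can be illustrated against the present-day NWISWeb REST service. A sketch in Python (a modern stand-in, not the report's program; the site number is an example, and the JSON field layout is that of the current service and may change):

        import requests

        # Daily mean streamflow (parameter 00060, discharge in cubic feet per
        # second) for one gaging station and one calendar year.
        url = "https://waterservices.usgs.gov/nwis/dv/"
        params = {
            "format": "json",
            "sites": "01646500",        # example station number
            "parameterCd": "00060",
            "startDT": "2020-01-01",
            "endDT": "2020-12-31",
        }
        data = requests.get(url, params=params, timeout=30).json()

        series = data["value"]["timeSeries"][0]["values"][0]["value"]
        flows = [float(pt["value"]) for pt in series]
        print(len(flows), "daily values; mean =", round(sum(flows) / len(flows), 1))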

  8. Computer survey for likely genes in the one megabase contiguous genomic sequence data of Synechocystis sp. strain PCC6803.

    PubMed

    Hirosawa, M; Kaneko, T; Tabata, S; McIninch, J D; Hayes, W S; Borodovsky, M; Isono, K

    1995-12-31

    Using the computer program GeneMark, the open reading frames (ORFs) previously assigned within the one-megabase sequence data of the genome of the cyanobacterium Synechocystis sp. strain PCC6803 (Kaneko et al., DNA Res. 2: 153-166, 1995) were re-examined. Matrices required by GeneMark for its statistical calculation were generated and modified by running a script termed GeneMark-Genesis, which performed recursive application of GeneMark against the Synechocystis data and evaluated the probability scores for optimization. Based on the matrices thus generated, 752 of the 818 previously assigned ORFs (92%) were supported by GeneMark as likely coding sequences, of which 26 were predicted to start at more internal positions than previously assigned. In addition, 50 ORFs were newly identified as likely coding sequences, most of them shorter than 300 bp. Thus, the procedure proved very powerful for locating likely coding regions within the genomic sequence data of Synechocystis without prior information concerning their similarity to the genes of other organisms. However, GeneMark did not predict 66 previously assigned ORFs as likely genes: 14 of them showed significant degrees of similarity to known genes and 10 others were found within IS-like elements. It seems that these genes, many of which appear to be of exogenous origin, escaped detection by GeneMark, as in the case of the "class 3 (horizontally transferred) genes" of E. coli, which in turn suggests that genes of different phylogenetic origins might also be detected as such by modifying the matrices. PMID:8867797
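
    GeneMark scores coding potential with inhomogeneous Markov chain models rather than by length alone, but the candidate regions it evaluates are ordinary open reading frames. A minimal single-strand ORF scan in Python for illustration (the reverse complement would be scanned the same way; the 300-bp cutoff echoes the size below which most newly identified ORFs fell):

        STOPS = {"TAA", "TAG", "TGA"}

        def find_orfs(seq, min_len=300):
            # Return (start, end) of ATG-to-stop ORFs of at least min_len
            # nucleotides, scanning the three forward reading frames.
            seq = seq.upper()
            orfs = []
            for frame in range(3):
                start = None
                for i in range(frame, len(seq) - 2, 3):
                    codon = seq[i:i + 3]
                    if start is None and codon == "ATG":
                        start = i
                    elif start is not None and codon in STOPS:
                        if i + 3 - start >= min_len:
                            orfs.append((start, i + 3))
                        start = None
            return orfs

        demo = "ATG" + "GCT" * 120 + "TAA"   # a single 366-nt ORF
        print(find_orfs(demo))               # [(0, 366)]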

  9. Potential and limitations of X-Ray micro-computed tomography in arthropod neuroanatomy: A methodological and comparative survey

    PubMed Central

    Sombke, Andy; Lipke, Elisabeth; Michalik, Peter; Uhl, Gabriele; Harzsch, Steffen

    2015-01-01

    Classical histology or immunohistochemistry combined with fluorescence or confocal laser scanning microscopy are common techniques in arthropod neuroanatomy, and these methods often require time-consuming and difficult dissections and sample preparations. Moreover, these methods are prone to artifacts due to compression and distortion of tissues, which often result in information loss and especially affect the spatial relationships of the examined parts of the nervous system in their natural anatomical context. Noninvasive approaches such as X-ray micro-computed tomography (micro-CT) can overcome such limitations and have been shown to be a valuable tool for understanding and visualizing internal anatomy and structural complexity. Nevertheless, knowledge about the potential of this method for analyzing the anatomy and organization of nervous systems, especially of taxa with smaller body size (e.g., many arthropods), is limited. This study set out to analyze the brains of selected arthropods with micro-CT, and to compare these results with available histological and immunohistochemical data. Specifically, we explored the influence of different sample preparation procedures. Our study shows that micro-CT is highly suitable for analyzing arthropod neuroarchitecture in situ and allows specific neuropils to be distinguished within the brain to extract quantitative data such as neuropil volumes. Moreover, data acquisition is considerably faster compared with many classical histological techniques. Thus, we conclude that micro-CT is highly suitable for targeting neuroanatomy, as it reduces the risk of artifacts and is faster than classical techniques. J. Comp. Neurol. 523:1281–1295, 2015. © 2015 Wiley Periodicals, Inc. PMID:25728683

  10. Enhanced delegated computing using coherence

    NASA Astrophysics Data System (ADS)

    Barz, Stefanie; Dunjko, Vedran; Schlederer, Florian; Moore, Merritt; Kashefi, Elham; Walmsley, Ian A.

    2016-03-01

    A longstanding question is whether it is possible to delegate computational tasks securely—such that neither the computation nor the data is revealed to the server. Recently, both a classical and a quantum solution to this problem were found [C. Gentry, in Proceedings of the 41st Annual ACM Symposium on the Theory of Computing (Association for Computing Machinery, New York, 2009), pp. 167-178; A. Broadbent, J. Fitzsimons, and E. Kashefi, in Proceedings of the 50th Annual Symposium on Foundations of Computer Science (IEEE Computer Society, Los Alamitos, CA, 2009), pp. 517-526]. Here, we study the first step towards the interplay between classical and quantum approaches and show how coherence can be used as a tool for secure delegated classical computation. We show that a client with limited computational capacity—restricted to an XOR gate—can perform universal classical computation by manipulating information carriers that may occupy superpositions of two states. Using single photonic qubits or coherent light, we experimentally implement secure delegated classical computations between an independent client and a server, which are installed in two different laboratories and separated by 50 m. The server has access to the light sources and measurement devices, whereas the client may use only a restricted set of passive optical devices to manipulate the information-carrying light beams. Thus, our work highlights how minimal quantum and classical resources can be combined and exploited for classical computing.
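
    The classical core of such masking schemes can be illustrated with a one-time-pad toy example, not the photonic protocol itself: for any XOR-linear function f, f(x ^ r) = f(x) ^ f(r), so a client able only to XOR can hide its input behind a random pad, have the server evaluate f on the masked input and on the pad, and unmask with one final XOR. The function f below is an arbitrary illustrative choice.

      import secrets

      def f(x: int) -> int:
          # XOR-linear example: parity of the bits selected by a fixed mask
          return bin(x & 0b1011).count("1") % 2

      def delegate(x: int, nbits: int = 4) -> int:
          r = secrets.randbits(nbits)   # client's secret pad (XOR capability only)
          y1 = f(x ^ r)                 # server sees only the masked input
          y2 = f(r)                     # server also evaluates the pad alone
          return y1 ^ y2                # client unmasks: equals f(x)

      x = 0b0110
      assert delegate(x) == f(x)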

  11. Academic Research Equipment in the Physical and Computer Sciences and Engineering. An Analysis of Findings from Phase I of the National Science Foundation's National Survey of Academic Research Instruments and Instrumentation Needs.

    ERIC Educational Resources Information Center

    Burgdorf, Kenneth; White, Kristine

    This report presents information from phase I of a survey designed to develop quantitative indicators of the current national stock, cost/investment, condition, obsolescence, utilization, and need for major research instruments in academic settings. Data for phase I (which focused on the physical and computer sciences and engineering) were…

  12. A survey of surveys

    SciTech Connect

    Kent, S.M.

    1994-11-01

    A new era for the field of Galactic structure is about to be opened with the advent of wide-area digital sky surveys. In this article, the author reviews the status and prospects for research for 3 new ground-based surveys: the Sloan Digital Sky Survey (SDSS), the Deep Near-Infrared Survey of the Southern Sky (DENIS) and the Two Micron All Sky Survey (2MASS). These surveys will permit studies of Galactic structure and stellar populations in the Galaxy with unprecedented detail. Extracting the information, however, will be challenging.

  13. Prior to the oral therapy, what do we know about HCV-4 in Egypt: a randomized survey of prevalence and risks using data mining computed analysis.

    PubMed

    Abd Elrazek, Abd Elrazek; Bilasy, Shymaa E; Elbanna, Abduh E M; Elsherif, Abd Elhalim A

    2014-12-01

    Hepatitis C virus (HCV) affects over 180 million people worldwide and it is the leading cause of chronic liver diseases and hepatocellular carcinoma. HCV is classified into seven major genotypes and a series of subtypes. In general, HCV genotype 4 (HCV-4) is common in the Middle East and Africa, where it is responsible for more than 80% of HCV infections. Although HCV-4 is the cause of approximately 20% of the 180 million cases of chronic hepatitis C worldwide, it has not been a major subject of research yet. The aim of the current study is to survey the morbidities and disease complications among the Egyptian population infected with HCV-4, mainly using advanced data mining methods complemented by other statistical analyses. Six thousand six hundred sixty subjects, aged between 17 and 58 years old, from different Egyptian Governorates were screened for HCV infection by ELISA and qualitative PCR. HCV-positive patients were further investigated for the incidence of liver cirrhosis and esophageal varices. The data obtained were analyzed by a data mining approach. Among the 6660 subjects enrolled in this survey, 1018 patients (15.28%) were HCV-positive. The proportion of infected males was significantly higher than that of females: 61.6% versus 38.4% (P=0.0052). Around two-thirds of infected patients (635/1018; 62.4%) presented with liver cirrhosis. Additionally, approximately half of the cirrhotic patients (301/635; 47.4%) showed degrees of large esophageal varices (LEVs), with higher variceal grades observed in males. Age at esophageal variceal development was 47±1 years. Data mining analysis yielded esophageal wall thickness (>6.5 mm), determined by conventional U/S, as the only independent predictor for esophageal varices. This study emphasizes the high prevalence of HCV infection among the Egyptian population, in particular among males. Egyptians with HCV-4 infection are at a higher risk of developing liver cirrhosis and esophageal varices. Data mining, a new analytic technique in
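
    As a quick arithmetic check, the headline proportions reported above follow directly from the published counts:

      positives, screened = 1018, 6660
      print(f"HCV prevalence: {positives / screened:.2%}")           # 15.29% (reported as 15.28%)
      cirrhotic = 635
      print(f"cirrhosis among HCV+: {cirrhotic / positives:.1%}")    # 62.4%
      levs = 301
      print(f"LEVs among cirrhotic: {levs / cirrhotic:.1%}")         # 47.4%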

  14. Surveying Future Surveys

    NASA Astrophysics Data System (ADS)

    Carlstrom, John E.

    2016-06-01

    The now standard model of cosmology has been tested and refined by the analysis of increasingly sensitive, large astronomical surveys, especially with statistically significant millimeter-wave surveys of the cosmic microwave background and optical surveys of the distribution of galaxies. This talk will offer a glimpse of the future, which promises an acceleration of this trend with cosmological information coming from new surveys across the electromagnetic spectrum as well as particles and even gravitational waves.

  15. Do Home Computers Improve Educational Outcomes? Evidence from Matched Current Population Surveys and the National Longitudinal Survey of Youth 1997. National Poverty Center Working Paper Series #06-01

    ERIC Educational Resources Information Center

    Beltran, Daniel O.; Das, Kuntal K.; Fairlie, Robert W.

    2006-01-01

    Nearly twenty million children in the United States do not have computers in their homes. The role of "home" computers in the educational process, however, has drawn very little attention in the previous literature. We use panel data from the two main U.S. datasets that include recent information on computer ownership among children--the 2000-2003…

  16. Computational fluid dynamics

    NASA Technical Reports Server (NTRS)

    1989-01-01

    An overview of computational fluid dynamics (CFD) activities at the Langley Research Center is given. The role of supercomputers in CFD research, algorithm development, multigrid approaches to computational fluid flows, aerodynamics computer programs, computational grid generation, turbulence research, and studies of rarefied gas flows are among the topics that are briefly surveyed.

  17. Evaluating audio computer assisted self-interviews in urban south African communities: evidence for good suitability and reduced social desirability bias of a cross-sectional survey on sexual behaviour

    PubMed Central

    2013-01-01

    Background Efficient HIV prevention requires accurate identification of individuals with risky sexual behaviour. However, self-reported data from sexual behaviour surveys are prone to social desirability bias (SDB). Audio Computer-Assisted Self-Interviewing (ACASI) has been suggested as an alternative to face-to-face interviewing (FTFI), because it may promote interview privacy and reduce SDB. However, little is known about the suitability and accuracy of ACASI in urban communities with high HIV prevalence in South Africa. To test this, we conducted a sexual behaviour survey in Cape Town, South Africa, using ACASI methods. Methods Participants (n = 878) answered questions about their sexual relationships on a touch screen computer in a private mobile office. We included questions at the end of the ACASI survey that were used to assess participants’ perceived ease of use, privacy, and truthfulness. Univariate logistic regression models, supported by multivariate models, were applied to identify groups of people who had adverse interviewing experiences. Further, we constructed male–female ratios of self-reported sexual behaviours as indicators of SDB. We used these indicators to compare SDB in our survey and in recent FTFI-based Demographic and Health Surveys (DHSs) from Lesotho, Swaziland, and Zimbabwe. Results Most participants found our methods easy to use (85.9%), perceived privacy (96.3%) and preferred ACASI to other modes of inquiry (82.5%) when reporting on sexual behaviours. Unemployed participants and those in the 40–70 year old age group were the least likely to find our methods easy to use (OR 0.69; 95% CI: 0.47–1.01 and OR 0.37; 95% CI: 0.23–0.58, respectively). In our survey, the male–female ratio for reporting >2 sexual partners in the past year, a concurrent relationship in the past year, and >2 sexual partners in a lifetime was 3.4, 2.6, and 1.2, respectively, far lower than the ratios observed in the Demographic and Health Surveys.

  18. Digital video delivery for a digital library in computer science

    NASA Astrophysics Data System (ADS)

    Fox, Edward A.; Abdulla, Ghaleb

    1994-04-01

    With support from four NSF awards we aim to develop a prototype digital library in computer science and apply it to improve undergraduate education. First, Project Envision, `A User-Centered Database from the Computer Science Literature,' 1991-94, deals with translation, coding standards including SGML, retrieval/previewing/presentation/browsing/linking, human-computer interaction, and construction of a partial archive using text and multimedia materials provided by ACM. Second, `Interactive Learning with a Digital Library in Computer Science,' 1993-96, supported by NSF and ACM with additional assistance from other publishers, focuses on improving learning through delivery of materials from the archive. Third, `Networked Multimedia File System with HyTime,' funded by NSF through the SUCCEED coalition, considers networking support for distributed multimedia applications and the use of HyTime for description of such applications. Fourth, equipment support comes from the Information Access Laboratory allotment of the `Interactive Accessibility: Breaking Barriers to the Power of Computing' grant funded by NSF for 1993-98. In this paper we report on plans and work with digital video relating to these projects. In particular we focus on our analysis of the requirements for a multimedia digital library in computer science and our experience with MPEG as it applies to that library.

  19. Early science from the Pan-STARRS1 Optical Galaxy Survey (POGS): Maps of stellar mass and star formation rate surface density obtained from distributed-computing pixel-SED fitting

    NASA Astrophysics Data System (ADS)

    Thilker, David A.; Vinsen, K.; Galaxy Properties Key Project, PS1

    2014-01-01

    To measure resolved galactic physical properties unbiased by the mask of recent star formation and dust features, we are conducting a citizen-scientist enabled nearby galaxy survey based on the unprecedented optical (g,r,i,z,y) imaging from Pan-STARRS1 (PS1). The PS1 Optical Galaxy Survey (POGS) covers 3π steradians (75% of the sky), about twice the footprint of SDSS. Whenever possible we also incorporate ancillary multi-wavelength image data from the ultraviolet (GALEX) and infrared (WISE, Spitzer) spectral regimes. For each cataloged nearby galaxy with a reliable redshift estimate of z < 0.05 - 0.1 (dependent on donated CPU power), publicly-distributed computing is being harnessed to enable pixel-by-pixel spectral energy distribution (SED) fitting, which in turn provides maps of key physical parameters such as the local stellar mass surface density, crude star formation history, and dust attenuation. With pixel SED fitting output we will then constrain parametric models of galaxy structure in a more meaningful way than ordinarily achieved. In particular, we will fit multi-component (e.g. bulge, bar, disk) galaxy models directly to the distribution of stellar mass rather than surface brightness in a single band, which is often locally biased. We will also compute non-parametric measures of morphology such as concentration and asymmetry using the POGS stellar mass and SFR surface density images. We anticipate studying how galactic substructures evolve by comparing our results with simulations and against more distant imaging surveys, some of which will also be processed in the POGS pipeline. The reliance of our survey on citizen-scientist volunteers provides a world-wide opportunity for education. We developed an interactive interface which highlights the science being produced by each volunteer’s own CPU cycles. The POGS project has already proven popular amongst the public, attracting about 5000 volunteers with nearly 12,000 participating computers, and is
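
    The pixel-by-pixel SED fitting idea can be reduced to a toy form: for each pixel's fluxes in the five PS1 bands, pick the template that minimizes the least-squares residual, with the fitted amplitude tracking stellar mass surface density. The two templates below are invented placeholders, not POGS models:

      import numpy as np

      TEMPLATES = {"old": np.array([0.4, 0.7, 1.0, 1.2, 1.3]),     # g,r,i,z,y shape
                   "young": np.array([1.3, 1.1, 1.0, 0.9, 0.8])}

      def fit_pixel(flux):
          best = None
          for name, m in TEMPLATES.items():
              scale = float(flux @ m / (m @ m))        # least-squares amplitude
              chi2 = float(np.sum((flux - scale * m) ** 2))
              if best is None or chi2 < best[2]:
                  best = (name, scale, chi2)
          return best                                   # (template, amplitude, chi2)

      print(fit_pixel(2.0 * TEMPLATES["old"]))          # ('old', 2.0, 0.0)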

  20. Self-administered food frequency questionnaire used in the 5-year follow-up survey of the JPHC Study: questionnaire structure, computation algorithms, and area-based mean intake.

    PubMed

    Sasaki, Satoshi; Kobayashi, Minatsu; Ishihara, Junko; Tsugane, Shoichiro

    2003-01-01

    In this section we described the structure of the self-administered semiquantitative food frequency questionnaire used in the 5-year follow-up survey of the JPHC study, the computation algorithms, and the area-based mean intakes of nutrients and food groups in the subjects of the validation study. The FFQ consists of five sections: 1) semiquantitative frequency questions for rice and miso (fermented soybean paste)-soup, 2) those for alcoholic beverages, 3) those for vitamin supplements, 4) those for foods and beverages, and 5) questions on dietary and cooking behaviors. From the questions, intakes of nutrients and foods by food groups were computed. Although most of them were computed from the frequency and relative portion size indicated in the replies, together with the fixed portion size, a seasonal coefficient was added in the computation of vegetables and fruits. Only frequency of intake and fixed portion size were used for computation of beverages. Sugar and cream added in coffee and tea were computed from the frequency of coffee and tea intake. The intakes of cooking oil, cooking salt (sodium), and salt in noodle-soup were estimated from the questions of relative preference of oil, salt, and noodle-soup. PMID:12701629
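
    The computation algorithms summarized above amount to multiplying a reported frequency by a relative portion size, a fixed standard portion, and (for vegetables and fruits) a seasonal coefficient. A schematic sketch, with all names and values illustrative rather than taken from the JPHC FFQ:

      def daily_intake_g(freq_per_day, relative_portion, fixed_portion_g,
                         seasonal_coefficient=1.0):
          # intake = frequency x relative portion x standard portion x season factor
          return freq_per_day * relative_portion * fixed_portion_g * seasonal_coefficient

      # a vegetable eaten ~3 times/week at a medium portion (1.0) of 80 g,
      # down-weighted by a hypothetical seasonal coefficient of 0.8:
      spinach = daily_intake_g(3 / 7, 1.0, 80, seasonal_coefficient=0.8)

      # beverages use frequency and fixed portion only (relative portion = 1):
      coffee = daily_intake_g(2, 1.0, 150)

      print(round(spinach, 1), coffee)   # 27.4 300.0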

  1. A regional land use survey based on remote sensing and other data: A report on a LANDSAT and computer mapping project, volume 2

    NASA Technical Reports Server (NTRS)

    Nez, G. (Principal Investigator); Mutter, D.

    1977-01-01

    The author has identified the following significant results. The project mapped land use/cover classifications from LANDSAT computer compatible tape data and combined those results with other multisource data via computer mapping/compositing techniques to analyze various land use planning/natural resource management problems. Data were analyzed on 1:24,000 scale maps at 1.1 acre resolution. LANDSAT analysis software and linkages with other computer mapping software were developed. Significant results were also achieved in training, communication, and identification of needs for developing the LANDSAT/computer mapping technologies into operational tools for use by decision makers.

  2. d-Alanyl Ester Depletion of Teichoic Acids in Lactobacillus plantarum Results in a Major Modification of Lipoteichoic Acid Composition and Cell Wall Perforations at the Septum Mediated by the Acm2 Autolysin

    PubMed Central

    Palumbo, Emmanuelle; Deghorain, Marie; Cocconcelli, Pier Sandro; Kleerebezem, Michiel; Geyer, Armin; Hartung, Thomas; Morath, Siegfried; Hols, Pascal

    2006-01-01

    The insertional inactivation of the dlt operon from Lactobacillus plantarum NCIMB8826 had a strong impact on lipoteichoic acid (LTA) composition, resulting in a major reduction in d-alanyl ester content. Unexpectedly, mutant LTA showed high levels of glucosylation and were threefold longer than wild-type LTA. The dlt mutation resulted in a reduced growth rate and increased cell lysis during the exponential and stationary growth phases. Microscopy analysis revealed increased cell length, damaged dividing cells, and perforations of the envelope in the septal region. The observed defects in the separation process, cell envelope perforation, and autolysis of the dlt mutant could be partially attributed to the L. plantarum Acm2 peptidoglycan hydrolase. PMID:16672624

  3. Computational vision

    NASA Technical Reports Server (NTRS)

    Barrow, H. G.; Tenenbaum, J. M.

    1981-01-01

    The range of fundamental computational principles underlying human vision that equally apply to artificial and natural systems is surveyed. There emerges from research a view of the structuring of vision systems as a sequence of levels of representation, with the initial levels being primarily iconic (edges, regions, gradients) and the highest symbolic (surfaces, objects, scenes). Intermediate levels are constrained by information made available by preceding levels and information required by subsequent levels. In particular, it appears that physical and three-dimensional surface characteristics provide a critical transition from iconic to symbolic representations. A plausible vision system design incorporating these principles is outlined, and its key computational processes are elaborated.

  4. QADATA user's manual; an interactive computer program for the retrieval and analysis of the results from the external blind sample quality- assurance project of the U.S. Geological Survey

    USGS Publications Warehouse

    Lucey, K.J.

    1990-01-01

    The U.S. Geological Survey conducts an external blind sample quality assurance project for its National Water Quality Laboratory in Denver, Colorado, based on the analysis of reference water samples. Reference samples containing selected inorganic and nutrient constituents are disguised as environmental samples at the Survey's office in Ocala, Florida, and are sent periodically through other Survey offices to the laboratory. The results of this blind sample project indicate the quality of analytical data produced by the laboratory. This report provides instructions on the use of QADATA, an interactive, menu-driven program that allows users to retrieve the results of the blind sample quality-assurance project. The QADATA program, which is available on the U.S. Geological Survey's national computer network, accesses a blind sample data base that contains more than 50,000 determinations from the last five water years for approximately 40 constituents at various concentrations. The data can be retrieved from the database for any user-defined time period and for any or all available constituents. After the user defines the retrieval, the program prepares statistical tables, control charts, and precision plots and generates a report which can be transferred to the user's office through the computer network. A discussion of the interpretation of the program output is also included. This quality assurance information will permit users to document the quality of the analytical results received from the laboratory. The blind sample data is entered into the database within weeks after being produced by the laboratory and can be retrieved to meet the needs of specific projects or programs. (USGS)

  5. 'Towers in the Tempest' Computer Animation Submission

    NASA Technical Reports Server (NTRS)

    Shirah, Greg

    2008-01-01

    The following describes a computer animation that has been submitted to the ACM/SIGGRAPH 2008 computer graphics conference: 'Towers in the Tempest' clearly communicates recent scientific research into how hurricanes intensify. This intensification can be caused by a phenomenon called a 'hot tower.' For the first time, research meteorologists have run complex atmospheric simulations at a very fine temporal resolution of 3 minutes. Combining this simulation data with satellite observations enables detailed study of 'hot towers.' The science of 'hot towers' is described using satellite observation data, conceptual illustrations, and volumetric atmospheric simulation data. The movie starts by showing a 'hot tower' observed in NASA's Tropical Rainfall Measuring Mission (TRMM) spacecraft's three-dimensional precipitation radar data of Hurricane Bonnie. Next, the dynamics of a hurricane and the formation of 'hot towers' are briefly explained using conceptual illustrations. Finally, volumetric cloud, wind, and vorticity data from a supercomputer simulation of Hurricane Bonnie are shown using volume rendering techniques such as ray marching.
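
    Ray marching, the volume technique named above, steps each view ray through the density field while accumulating emission and attenuation front to back. A compact sketch with an analytic blob standing in for the hurricane simulation data:

      import numpy as np

      def density(p):
          # stand-in for sampled cloud/vorticity data: a smooth unit blob
          return max(0.0, 1.0 - float(np.dot(p, p)))

      def ray_march(origin, direction, n_steps=64, step=0.05, absorption=2.0):
          p = np.asarray(origin, dtype=float)
          d = np.asarray(direction, dtype=float)
          d /= np.linalg.norm(d)
          color, transmittance = 0.0, 1.0
          for _ in range(n_steps):
              rho = density(p)
              alpha = 1.0 - np.exp(-absorption * rho * step)   # slab opacity
              color += transmittance * alpha * rho             # emission ~ density
              transmittance *= 1.0 - alpha
              if transmittance < 1e-3:                         # early ray termination
                  break
              p = p + d * step
          return color

      print(ray_march([0.0, 0.0, -2.0], [0.0, 0.0, 1.0]))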

  6. Quantum computing with trapped ions

    SciTech Connect

    Hughes, R.J.

    1998-01-01

    The significance of quantum computation for cryptography is discussed. Following a brief survey of the requirements for quantum computational hardware, an overview of the ion trap quantum computation project at Los Alamos is presented. The physical limitations to quantum computation with trapped ions are analyzed and an assessment of the computational potential of the technology is made.

  7. Publishing Trends in Educational Computing.

    ERIC Educational Resources Information Center

    O'Hair, Marilyn; Johnson, D. LaMont

    1989-01-01

    Describes results of a survey of secondary school and college teachers that was conducted to determine subject matter that should be included in educational computing journals. Areas of interest included computer applications; artificial intelligence; computer-aided instruction; computer literacy; computer-managed instruction; databases; distance…

  8. Techniques for computer-aided analysis of ERTS-1 data, useful in geologic, forest and water resource surveys. [Colorado Rocky Mountains

    NASA Technical Reports Server (NTRS)

    Hoffer, R. M.

    1974-01-01

    Forestry, geology, and water resource applications were the focus of this study, which involved the use of computer-implemented pattern-recognition techniques to analyze ERTS-1 data. The results have proven the value of computer-aided analysis techniques, even in areas of mountainous terrain. Several analysis capabilities have been developed during these ERTS-1 investigations. A procedure to rotate, deskew, and geometrically scale the MSS data results in 1:24,000 scale printouts that can be directly overlaid on 7 1/2-minute U.S.G.S. topographic maps. Several scales of computer-enhanced "false color-infrared" composites of MSS data can be obtained from a digital display unit, and emphasize the tremendous detail present in the ERTS-1 data. A grid can also be superimposed on the displayed data to aid in specifying areas of interest.
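
    The rotate, deskew, and scale operations mentioned above compose into a single affine transform from raw MSS pixel indices to map coordinates. A sketch with made-up parameter values (the real ones come from the ground control of the study area):

      import numpy as np

      theta = np.radians(12.0)                    # rotation to map orientation
      shear = 0.08                                # deskew for scan-line geometry
      R = np.array([[np.cos(theta), -np.sin(theta)],
                    [np.sin(theta),  np.cos(theta)]])
      K = np.array([[1.0, shear], [0.0, 1.0]])    # shear (deskew) matrix
      S = np.diag([57.0, 79.0])                   # nominal MSS pixel size in meters

      A = R @ K @ S                               # full pixel -> map transform
      print(A @ np.array([10, 20]))               # map position of pixel (10, 20)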

  9. Freedom from the Tyranny of the Campus Main-Frame: Handling the Statistical Analysis of a 10-year Survey Research Study with a Personal Computer.

    ERIC Educational Resources Information Center

    Hickman, Linda J.

    Technological advances in microcomputer hardware and software, including memory size and increasingly sophisticated statistical application packages, create a new era in educational research. The alternative to costly main-frame computer data processing and statistical analysis is explored in this paper. In the first section, typical…

  10. PEP surveying procedures and equipment

    SciTech Connect

    Linker, F.

    1982-06-01

    The PEP Survey and Alignment System, which employs both laser-based and optical survey methods, is described. The laser is operated in conjunction with the Tektronix 4051 computer and surveying instruments such as ARM and SAM, a system designed to automate data input, reduction, and production of alignment instructions. The laser system is used when surveying ring quadrupoles, main bend magnets, and sextupoles, and is optional when surveying RF cavities and insertion quadrupoles. Optical methods usually require that data be manually entered into the computer for alignment, but in some cases, an element can be aligned using nominal values of fiducial locations without use of the computer. Optical surveying is used in the alignment of NIT and SIT, low field bend magnets, wigglers, RF cavities, and insertion quadrupoles.

  11. Survey of digital filtering

    NASA Technical Reports Server (NTRS)

    Nagle, H. T., Jr.

    1972-01-01

    A three-part survey is made of the state of the art in digital filtering. Part one presents background material including sampled data transformations and the discrete Fourier transform. Part two, digital filter theory, gives an in-depth coverage of filter categories, transfer function synthesis, quantization and other nonlinear errors, filter structures and computer aided design. Part three presents hardware mechanization techniques. Implementations by general purpose, mini-, and special-purpose computers are presented.
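
    Two of the surveyed topics, the discrete Fourier transform and a simple recursive filter structure, fit in a few lines of Python:

      import cmath

      def dft(x):
          # direct O(n^2) discrete Fourier transform
          n = len(x)
          return [sum(x[k] * cmath.exp(-2j * cmath.pi * j * k / n) for k in range(n))
                  for j in range(n)]

      def one_pole_lowpass(x, a=0.9):
          # y[n] = (1 - a) * x[n] + a * y[n-1]
          y, prev = [], 0.0
          for sample in x:
              prev = (1 - a) * sample + a * prev
              y.append(prev)
          return y

      print([round(abs(v), 3) for v in dft([1, 0, 0, 0])])    # impulse -> flat spectrum
      print(one_pole_lowpass([1.0, 0.0, 0.0, 0.0])[:3])       # smoothed impulse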

  12. Computer vision

    NASA Technical Reports Server (NTRS)

    Gennery, D.; Cunningham, R.; Saund, E.; High, J.; Ruoff, C.

    1981-01-01

    The field of computer vision is surveyed and assessed, key research issues are identified, and possibilities for a future vision system are discussed. The problems of descriptions of two and three dimensional worlds are discussed. The representation of such features as texture, edges, curves, and corners are detailed. Recognition methods are described in which cross correlation coefficients are maximized or numerical values for a set of features are measured. Object tracking is discussed in terms of the robust matching algorithms that must be devised. Stereo vision, camera control and calibration, and the hardware and systems architecture are discussed.
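
    The correlation-based recognition mentioned above can be sketched in one dimension: slide a template across a signal and report the offset that maximizes the normalized cross-correlation coefficient.

      import numpy as np

      def best_match(signal, template):
          t = (template - template.mean()) / (template.std() + 1e-12)
          scores = []
          for i in range(len(signal) - len(template) + 1):
              w = signal[i:i + len(template)]
              w = (w - w.mean()) / (w.std() + 1e-12)
              scores.append(float(np.dot(w, t)) / len(t))
          return int(np.argmax(scores)), max(scores)

      sig = np.array([0., 1., 2., 3., 2., 1., 0., 0.])
      tmpl = np.array([2., 3., 2.])
      print(best_match(sig, tmpl))   # offset 2, correlation ~1.0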

  13. Can Compute, Won't Compute: Women's Participation in the Culture of Computing.

    ERIC Educational Resources Information Center

    Wilson, Fiona

    2003-01-01

    Surveys of 130 psychology students and 52 computer science students (20 of the latter were interviewed) indicated that more males read computer magazines and were confident in computer use. Many did not perceive an equity problem. Men seemed to feel the equity situation is improving. Some felt that women do not enjoy computing as much as men and…

  14. Willingness of Patients with Breast Cancer in the Adjuvant and Metastatic Setting to Use Electronic Surveys (ePRO) Depends on Sociodemographic Factors, Health-related Quality of Life, Disease Status and Computer Skills

    PubMed Central

    Graf, J.; Simoes, E.; Wißlicen, K.; Rava, L.; Walter, C. B.; Hartkopf, A.; Keilmann, L.; Taran, A.; Wallwiener, S.; Fasching, P.; Brucker, S. Y.; Wallwiener, M.

    2016-01-01

    Introduction: Because of the often unfavorable prognosis, particularly for patients with metastases, health-related quality of life is extremely important for breast cancer patients. In recent years, data on patient-relevant endpoints is being increasingly collected electronically; however, knowledge on the acceptance and practicability of, and barriers to, this form of data collection remains limited. Material and Methods: A questionnaire was completed by 96 patients to determine to what extent existing computer skills, disease status, health-related quality of life and sociodemographic factors affect patientsʼ potential willingness to use electronic methods of data collection (ePRO). Results: 52 of 96 (55 %) patients reported a priori that they could envisage using ePRO. Patients who a priori preferred a paper-based survey (pPRO) tended to be older (ePRO 53 years vs. pPRO 62 years; p = 0.0014) and typically had lower levels of education (p = 0.0002), were in poorer health (p = 0.0327) and had fewer computer skills (p = 0.0003). Conclusion: Barriers to the prospective use of ePRO were identified in older patients and patients with a lower quality of life. Given the appropriate conditions with regard to age, education and current health status, opportunities to participate should be provided to encourage patientsʼ willingness to take part and ensure the validity of survey results. Focusing on ease of use of ePRO applications and making applications more patient-oriented and straightforward appears to be the way forward. PMID:27239062

  15. Computed Tomography Imaging Spectrometer (CTIS) with 2D Reflective Grating for Ultraviolet to Long-Wave Infrared Detection Especially Useful for Surveying Transient Events

    NASA Technical Reports Server (NTRS)

    Wilson, Daniel W. (Inventor); Maker, Paul D. (Inventor); Muller, Richard E. (Inventor); Mouroulis, Pantazis Z. (Inventor)

    2003-01-01

    The optical system of this invention is an unique type of imaging spectrometer, i.e. an instrument that can determine the spectra of all points in a two-dimensional scene. The general type of imaging spectrometer under which this invention falls has been termed a computed-tomography imaging spectrometer (CTIS). CTIS's have the ability to perform spectral imaging of scenes containing rapidly moving objects or evolving features, hereafter referred to as transient scenes. This invention, a reflective CTIS with an unique two-dimensional reflective grating, can operate in any wavelength band from the ultraviolet through long-wave infrared. Although this spectrometer is especially useful for rapidly occurring events it is also useful for investigation of some slow moving phenomena as in the life sciences.

  16. Computed tomography imaging spectrometer (CTIS) with 2D reflective grating for ultraviolet to long-wave infrared detection especially useful for surveying transient events

    NASA Technical Reports Server (NTRS)

    Wilson, Daniel W. (Inventor); Maker, Paul D. (Inventor); Muller, Richard E. (Inventor); Mouroulis, Pantazis Z. (Inventor)

    2003-01-01

    The optical system of this invention is an unique type of imaging spectrometer, i.e. an instrument that can determine the spectra of all points in a two-dimensional scene. The general type of imaging spectrometer under which this invention falls has been termed a computed-tomography imaging spectrometer (CTIS). CTIS's have the ability to perform spectral imaging of scenes containing rapidly moving objects or evolving features, hereafter referred to as transient scenes. This invention, a reflective CTIS with an unique two-dimensional reflective grating, can operate in any wavelength band from the ultraviolet through long-wave infrared. Although this spectrometer is especially useful for rapidly occurring events it is also useful for investigation of some slow moving phenomena as in the life sciences.

  17. Audio computer-assisted survey instrument versus face-to-face interviews: optimal method for detecting high-risk behaviour in pregnant women and their sexual partners in the south of Brazil

    PubMed Central

    Yeganeh, N; Dillavou, C; Simon, M; Gorbach, P; Santos, B; Fonseca, R; Saraiva, J; Melo, M; Nielsen-Saines, K

    2016-01-01

    Summary Audio computer-assisted survey instrument (ACASI) has been shown to decrease under-reporting of socially undesirable behaviours, but has not been evaluated in pregnant women at risk of HIV acquisition in Brazil. We assigned HIV-negative pregnant women receiving routine antenatal care in Porto Alegre, Brazil, and their partners to receive a survey regarding high-risk sexual behaviours and drug use via ACASI (n = 372) or face-to-face (FTF) (n = 283) interviews. Logistic regression showed that compared with FTF, pregnant women interviewed via ACASI were significantly more likely to self-report themselves as single (14% versus 6%), having >5 sexual partners (35% versus 29%), having oral sex (42% versus 35%), using intravenous drugs (5% versus 0), smoking cigarettes (23% versus 16%), drinking alcohol (13% versus 8%) and using condoms during pregnancy (32% versus 17%). Therefore, ACASI may be a useful method in assessing risk behaviours in pregnant women, especially in relation to drug and alcohol use. PMID:23970659

  18. Survey of Gait Recognition

    NASA Astrophysics Data System (ADS)

    Liu, Ling-Feng; Jia, Wei; Zhu, Yi-Hai

    Gait recognition, the process of identifying an individual by his/her walking style, is a relatively new research area. It has been receiving wide attention in the computer vision community. In this paper, a comprehensive survey of video-based gait recognition approaches is presented. The research challenges and future directions of gait recognition are also discussed.

  19. "Suntelligence" Survey

    MedlinePlus

    ... to the American Academy of Dermatology's "Suntelligence" sun-smart survey. Please answer the following questions to measure ... be able to view a ranking of major cities suntelligence based on residents' responses to this survey. ...

  20. Survey Says

    ERIC Educational Resources Information Center

    McCarthy, Susan K.

    2005-01-01

    Survey Says is a lesson plan designed to teach college students how to access Internet resources for valid data related to the sexual health of young people. Discussion questions based on the most recent available data from two national surveys, the Youth Risk Behavior Surveillance-United States, 2003 (CDC, 2004) and the National Survey of…

  1. Potential of known and short prokaryotic protein motifs as a basis for novel peptide-based antibacterial therapeutics: a computational survey

    PubMed Central

    Ruhanen, Heini; Hurley, Daniel; Ghosh, Ambarnil; O'Brien, Kevin T.; Johnston, Catrióna R.; Shields, Denis C.

    2014-01-01

    Short linear motifs (SLiMs) are functional stretches of protein sequence that are of crucial importance for numerous biological processes by mediating protein–protein interactions. These motifs often comprise peptides of less than 10 amino acids that modulate protein–protein interactions. While well-characterized in eukaryotic intracellular signaling, their role in prokaryotic signaling is less well-understood. We surveyed the distribution of known motifs in prokaryotic extracellular and virulence proteins across a range of bacterial species and conducted searches for novel motifs in virulence proteins. Many known motifs in virulence effector proteins mimic eukaryotic motifs and enable the pathogen to control the intracellular processes of their hosts. Novel motifs were detected by finding those that had evolved independently in three or more unrelated virulence proteins. The search returned several significantly over-represented linear motifs of which some were known motifs and others are novel candidates with potential roles in bacterial pathogenesis. A putative C-terminal G[AG].$ motif found in type IV secretion system proteins was among the most significant detected. A KK$ motif, previously identified in a plasminogen-binding protein, was demonstrated to be enriched across a number of adhesion proteins and lipoproteins. While there is some potential to develop peptide drugs against bacterial infection based on bacterial peptides that mimic host components, this could have unwanted effects on host signaling. Thus, novel SLiMs in virulence factors that do not mimic host components but are crucial for bacterial pathogenesis, such as the type IV secretion system, may be more useful to develop as leads for anti-microbial peptides or drugs. PMID:24478765
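
    The two C-terminal motifs named above translate directly into regular-expression scans. A minimal sketch with toy sequences (invented for illustration):

      import re

      MOTIFS = {"G[AG].$ (type IV secretion-like)": r"G[AG].$",
                "KK$ (plasminogen binding-like)": r"KK$"}

      def scan(sequences):
          # report which motif patterns match the C-terminus of each sequence
          return {name: [label for label, pat in MOTIFS.items() if re.search(pat, seq)]
                  for name, seq in sequences.items()}

      print(scan({"toy-effector": "MKTAYIAKQRGAL", "toy-adhesin": "MSTNPKPQRKK"}))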

  2. Efficient Computational Research Protocol to Survey Free Energy Surface for Solution Chemical Reaction in the QM/MM Framework: The FEG-ER Methodology and Its Application to Isomerization Reaction of Glycine in Aqueous Solution.

    PubMed

    Takenaka, Norio; Kitamura, Yukichi; Nagaoka, Masataka

    2016-03-01

    For chemical reactions in solution, we often need to consider a multidimensional free energy (FE) surface (FES) which is analogous to a Born-Oppenheimer potential energy surface. To survey the FES, an efficient computational research protocol is proposed within the QM/MM framework: (i) we first obtain the stable states (or transition states) involved, by optimizing their structures on the FES in a stepwise fashion, finally using the free energy gradient (FEG) method, and then (ii) we directly obtain the FE differences among any arbitrary states on the FES, efficiently by employing the QM/MM method with energy representation (ER), i.e., the QM/MM-ER method. To validate the calculation accuracy and efficiency, we applied the above FEG-ER methodology to a typical isomerization reaction of glycine in aqueous solution, and satisfactorily reproduced the experimental value of the reaction FE. Further, it was found that the structural relaxation of the solute in the QM/MM force field is not negligible for a correct estimate of the FES. We believe that the present research protocol should become a prevailing computational strategy and will play promising and important roles in solution chemistry toward solution reaction ergodography. PMID:26794718
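
    The FEG step in (i) is, schematically, a gradient-following optimization on the free energy surface. The sketch below substitutes a toy analytic double-well FES for the real QM/MM ensemble averages, so it illustrates only the optimization logic, not the glycine system:

      import numpy as np

      def fes(q):
          # toy 2D free energy surface with two minima near (+/-1, 0)
          x, y = q
          return (x**2 - 1) ** 2 + 0.5 * y**2

      def grad(q, h=1e-5):
          # central-difference gradient of the FES
          g = np.zeros_like(q)
          for i in range(len(q)):
              e = np.zeros_like(q)
              e[i] = h
              g[i] = (fes(q + e) - fes(q - e)) / (2 * h)
          return g

      q = np.array([0.2, 0.8])
      for _ in range(200):            # steepest-descent FEG-style optimization
          q = q - 0.05 * grad(q)
      print(q, fes(q))                # converges to a minimum near (1, 0)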

  3. Computer program for simulation of variable recharge with the U. S. Geological Survey modular finite-difference ground-water flow model (MODFLOW)

    USGS Publications Warehouse

    Kontis, A.L.

    2001-01-01

    The Variable-Recharge Package is a computerized method designed for use with the U.S. Geological Survey three-dimensional finite-difference ground-water flow model (MODFLOW-88) to simulate areal recharge to an aquifer. It is suitable for simulations of aquifers in which the relation between ground-water levels and land surface can affect the amount and distribution of recharge. The method is based on the premise that recharge to an aquifer cannot occur where the water level is at or above land surface. Consequently, recharge will vary spatially in simulations in which the Variable-Recharge Package is applied, if the water levels are sufficiently high. The input data required by the program for each model cell that can potentially receive recharge includes the average land-surface elevation and a quantity termed "water available for recharge," which is equal to precipitation minus evapotranspiration. The Variable-Recharge Package also can be used to simulate recharge to a valley-fill aquifer in which the valley fill and the adjoining uplands are explicitly simulated. Valley-fill aquifers, which are the most common type of aquifer in the glaciated northeastern United States, receive much of their recharge from upland sources as channeled and(or) unchanneled surface runoff and as lateral ground-water flow. Surface runoff in the uplands is generated in the model when the applied water available for recharge is rejected because simulated water levels are at or above land surface. The surface runoff can be distributed to other parts of the model by (1) applying the amount of the surface runoff that flows to upland streams (channeled runoff) to explicitly simulated streams that flow onto the valley floor, and(or) (2) applying the amount that flows downslope toward the valley-fill aquifer (unchanneled runoff) to specified model cells, typically those near the valley wall. An example model of an idealized valley-fill aquifer is presented to demonstrate application of the
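
    The core rule of the package reduces to a per-cell test of simulated head against land surface, with rejected water pooled as surface runoff for redistribution. A schematic sketch (field names and units are illustrative, not the MODFLOW input format):

      def apply_variable_recharge(cells):
          runoff = 0.0
          for cell in cells:
              if cell["head"] < cell["land_surface"]:
                  cell["recharge"] = cell["water_available"]   # recharge accepted
              else:
                  cell["recharge"] = 0.0                       # water table at/above land surface
                  runoff += cell["water_available"]            # rejected water becomes runoff
          return runoff   # route to simulated streams and/or valley-wall cells

      cells = [{"head": 95.0, "land_surface": 100.0, "water_available": 0.02},
               {"head": 101.0, "land_surface": 100.0, "water_available": 0.02}]
      print(apply_variable_recharge(cells), [c["recharge"] for c in cells])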

  4. Seismic, side-scan survey, diving, and coring data analyzed by a Macintosh II{sup TM} computer and inexpensive software provide answers to a possible offshore extension of landslides at Palos Verdes Peninsula, California

    SciTech Connect

    Dill, R.F. ); Slosson, J.E. ); McEachen, D.B. )

    1990-05-01

    A Macintosh II{sup TM} computer and commercially available software were used to analyze and depict the topography, construct an isopach sediment thickness map, plot core positions, and locate the geology of an offshore area facing an active landslide on the southern side of Palos Verdes Peninsula California. Profile data from side scan sonar, 3.5 kHz, and Boomer subbottom, high-resolution seismic, diving, echo sounder traverses, and cores - all controlled with a mini Ranger II navigation system - were placed in MacGridzo{sup TM} and WingZ{sup TM} software programs. The computer-plotted data from seven sources were used to construct maps with overlays for evaluating the possibility of a shoreside landslide extending offshore. The poster session describes the offshore survey system and demonstrates the development of the computer data base, its placement into the MacGridzo{sup TM} gridding program, and transfer of gridded navigational locations to the WingZ{sup TM} data base and graphics program. Data will be manipulated to show how sea-floor features are enhanced and how isopach data were used to interpret the possibility of landslide displacement and Holocene sea level rise. The software permits rapid assessment of data using computerized overlays and a simple, inexpensive means of constructing and evaluating information in map form and the preparation of final written reports. This system could be useful in many other areas where seismic profiles, precision navigational locations, soundings, diver observations, and cores provide a great volume of information that must be compared on regional plots to develop field maps for geological evaluation and reports.

  5. Theory Survey or Survey Theory?

    ERIC Educational Resources Information Center

    Dean, Jodi

    2010-01-01

    Matthew Moore's survey of political theorists in U.S. American colleges and universities is an impressive contribution to political science (Moore 2010). It is the first such survey of political theory as a subfield, the response rate is very high, and the answers to the survey questions provide new information about how political theorists look…

  6. Robotic Surveying

    SciTech Connect

    Suzy Cantor-McKinney; Michael Kruzic

    2007-03-01

    -actuated functions to be controlled by an onboard computer. The computer-controlled Speedrower was developed at Carnegie Mellon University to automate agricultural harvesting. Harvesting tasks require the vehicle to cover a field using minimally overlapping rows at slow speeds in a similar manner to geophysical data acquisition. The Speedrower had demonstrated its ability to perform as it had already logged hundreds of acres of autonomous harvesting. This project is the first use of autonomous robotic technology on a large scale for geophysical surveying.

  7. University Students' Perceptions of Computer Technology Experiences

    ERIC Educational Resources Information Center

    Inoue, Yukiko

    2007-01-01

    On the basis of a survey as a research method (involving everything from designing surveys to reporting on them), the author examined students' perceptions of computers and information technology. In fall 2005, a survey questionnaire was administered to students enrolled in education courses at a university in the western Pacific. Attention was given to…

  8. Columbia Gorge Community College Business Survey.

    ERIC Educational Resources Information Center

    McKee, Jonathon V.

    This is a report on a business survey conducted by Columbia Gorge Community College (CGCC) (Oregon) to review the success and quality of the college's degree and certificate programs in business administration, computer application systems, and computer information systems. The community college surveyed 104 local businesses to verify the…

  9. Computer Education for Engineers, Part III.

    ERIC Educational Resources Information Center

    McCullough, Earl S.; Lofy, Frank J.

    1989-01-01

    Reports the results of the third survey of computer use in engineering education, conducted in the fall of 1987, and compares them with the 1981 and 1984 results. Summarizes survey data on computer course credits, languages, equipment use, CAD/CAM instruction, faculty access, and computer graphics. (YP)

  10. Survey of Records Management Practices in Wisconsin.

    ERIC Educational Resources Information Center

    Blue, Richard I.; Schramm, Robert M.

    A 1990 survey of 68 records management operations in Wisconsin indicates current practices regarding records retention schedules, retention audits, use of computers, record destruction practices, and the effects of federal legislation on records management operations. The 1990 survey results are compared to similar surveys in the 1970s and 1980s.…

  11. Computer Literacy of Entering Freshmen.

    ERIC Educational Resources Information Center

    Tellep, Andrew

    In an effort to improve college program planning using data on the computer skills of entering freshmen, a survey was conducted to obtain information about computer science programs in Pennsylvania's public schools. The study investigated the material being taught, the background of computer science teachers, program plans, tendencies in the…

  12. Computer Use by Rural Principals.

    ERIC Educational Resources Information Center

    Witten, D. W.; And Others

    Very little research is available nationwide that measures the administrative use of computers in rural schools. A state survey of 154 rural Kentucky secondary school principals (representing a 51% response rate) focused on their knowledge about computers and use of computers for school administrative purposes. Only 14% of respondents had a…

  13. Computer input and output files associated with ground-water-flow simulations of the Albuquerque Basin, central New Mexico, 1901-95, with projections to 2020; (supplement three to U.S. Geological Survey Water-resources investigations report 94-4251)

    USGS Publications Warehouse

    Kernodle, J.M.

    1996-01-01

    This report presents the computer input files required to run the three-dimensional ground-water-flow model of the Albuquerque Basin, central New Mexico, documented in Kernodle and others (Kernodle, J.M., McAda, D.P., and Thorn, C.R., 1995, Simulation of ground-water flow in the Albuquerque Basin, central New Mexico, 1901-1994, with projections to 2020: U.S. Geological Survey Water-Resources Investigations Report 94-4251, 114 p.) and revised by Kernodle (Kernodle, J.M., 1998, Simulation of ground-water flow in the Albuquerque Basin, 1901-95, with projections to 2020 (supplement two to U.S. Geological Survey Water-Resources Investigations Report 94-4251): U.S. Geological Survey Open-File Report 96-209, 54 p.). Output files resulting from the computer simulations are included for reference.

  14. Cryptography, quantum computation and trapped ions

    SciTech Connect

    Hughes, Richard J.

    1998-03-01

    The significance of quantum computation for cryptography is discussed. Following a brief survey of the requirements for quantum computational hardware, an overview of the ion trap quantum computation project at Los Alamos is presented. The physical limitations to quantum computation with trapped ions are analyzed and an assessment of the computational potential of the technology is made.

  15. Computers in the World of College English.

    ERIC Educational Resources Information Center

    Tannheimer, Charlotte

    This sabbatical report surveys some computer software presently being developed, already in use, and/or available, and describes computer use in several Massachusetts colleges. A general introduction to computers, word processors, artificial intelligence, and computer assisted instruction is provided, as well as a discussion of what computers can…

  16. Redshift surveys

    NASA Technical Reports Server (NTRS)

    Geller, Margaret J.; Huchra, J. P.

    1991-01-01

    Present-day understanding of the large-scale galaxy distribution is reviewed. The statistics of the CfA redshift survey are briefly discussed. The need for deeper surveys to clarify the issues raised by recent studies of large-scale galactic distribution is addressed.

  17. SURVEY INSTRUMENT

    DOEpatents

    Borkowski, C J

    1954-01-19

    This pulse-type survey instrument is suitable for readily detecting {alpha} particles in the presence of high {beta} and {gamma} backgrounds. The instruments may also be used to survey for neutrons, {beta} particles and {gamma} rays by employing suitably designed interchangeable probes and selecting an operating potential to correspond to the particular probe.

  18. Heterotic computing: exploiting hybrid computational devices.

    PubMed

    Kendon, Viv; Sebald, Angelika; Stepney, Susan

    2015-07-28

    Current computational theory deals almost exclusively with single models: classical, neural, analogue, quantum, etc. In practice, researchers use ad hoc combinations, realizing only recently that they can be fundamentally more powerful than the individual parts. A Theo Murphy meeting brought together theorists and practitioners of various types of computing, to engage in combining the individual strengths to produce powerful new heterotic devices. 'Heterotic computing' is defined as a combination of two or more computational systems such that they provide an advantage over either substrate used separately. This post-meeting collection of articles provides a wide-ranging survey of the state of the art in diverse computational paradigms, together with reflections on their future combination into powerful and practical applications. PMID:26078351

  19. ARM User Survey Report

    SciTech Connect

    Roeder, LR

    2010-06-22

    The objective of this survey was to obtain user feedback to, among other things, determine how to organize the exponentially growing data within the Atmospheric Radiation Measurement (ARM) Climate Research Facility, and identify users’ preferred data analysis system. The survey findings appear to have met this objective, having received approximately 300 responses that give insight into the type of work users perform, usage of the data, percentage of data analysis users might perform on an ARM-hosted computing resource, downloading volume level where users begin having reservations, opinion about usage if given more powerful computing resources (including ability to manipulate data), types of tools that would be most beneficial to them, preferred programming language and data analysis system, level of importance for certain types of capabilities, and finally, level of interest in participating in a code-sharing community.

  20. Computers and Computer Resources.

    ERIC Educational Resources Information Center

    Bitter, Gary

    1980-01-01

    This resource directory provides brief evaluative descriptions of six popular home computers and lists selected sources of educational software, computer books, and magazines. For a related article on microcomputers in the schools, see p53-58 of this journal issue. (SJL)

  1. Quantum chromodynamics with advanced computing

    SciTech Connect

    Kronfeld, Andreas S.; /Fermilab

    2008-07-01

    We survey results in lattice quantum chromodynamics from groups in the USQCD Collaboration. The main focus is on physics, but many aspects of the discussion are aimed at an audience of computational physicists.

  2. Computer representation of molecular surfaces

    SciTech Connect

    Max, N.L.

    1981-07-06

    This review article surveys recent work on computer representation of molecular surfaces. Several different algorithms are discussed for producing vector or raster drawings of space-filling models formed as the union of spheres. Other smoother surfaces are also considered.
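
    The space-filling models discussed above share one geometric test: a point on sphere i belongs to the visible molecular surface only if it is not strictly inside any other sphere. A small sketch of that test:

      import math

      def on_union_surface(p, spheres, i, eps=1e-9):
          # spheres: list of (cx, cy, cz, r); p: candidate point on sphere i
          cx, cy, cz, r = spheres[i]
          if abs(math.dist(p, (cx, cy, cz)) - r) > eps:
              return False                        # not on sphere i at all
          return all(math.dist(p, (x, y, z)) >= rr - eps
                     for j, (x, y, z, rr) in enumerate(spheres) if j != i)

      spheres = [(0.0, 0.0, 0.0, 1.0), (1.2, 0.0, 0.0, 1.0)]
      print(on_union_surface((-1.0, 0.0, 0.0), spheres, 0))   # True: exposed
      print(on_union_surface((1.0, 0.0, 0.0), spheres, 0))    # False: buried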

  3. Parent Survey: Technology in the Laboratory Schools.

    ERIC Educational Resources Information Center

    Dean, Robert; Klass, Trisha

    1997-01-01

    Surveyed parents of laboratory school students regarding technology in the school and funding options for new technology. Most families had computers at home. Of the computer-owners, 38 percent had access to the World Wide Web. Half of the parents would upgrade or buy new computers if their children could electronically contact teachers or access…

  4. TQM in a Computer Lab.

    ERIC Educational Resources Information Center

    Swanson, Dewey A.; Phillips, Julie A.

    At the Purdue University School of Technology (PST) at Columbus, Indiana, the Total Quality Management (TQM) philosophy was used in the computer laboratories to better meet student needs. A customer satisfaction survey was conducted to gather data on lab facilities, lab assistants, and hardware/software; other sections of the survey included…

  5. Computers and Chinese Linguistics.

    ERIC Educational Resources Information Center

    Kierman, Frank A.; Barber, Elizabeth

    This survey of the field of Chinese language computational linguistics was prepared as a background study for the Chinese Linguistics Project at Princeton. Since the authors' main purpose was "critical reconnaissance," quantitative emphasis is on systems with which they are most familiar. The complexity of the Chinese writing system has presented…

  6. Computers in Small Business.

    ERIC Educational Resources Information Center

    Rumberger, Russell W.; Levin, Henry M.

    A survey was administered to a sample of about 10,000 members of the National Federation of Independent Business in 1985 to ascertain a variety of information about the use of computers in the nation's small businesses, including the extent of their use, training needs of users, and impacts and benefits. Major findings summarized from the 2,813…

  7. Computer input and output files associated with ground-water-flow simulations of the Albuquerque Basin, central New Mexico, 1901-94, with projections to 2020; (supplement one to U.S. Geological Survey Water-resources investigations report 94-4251)

    USGS Publications Warehouse

    Kernodle, J.M.

    1996-01-01

    This report presents the computer input files required to run the three-dimensional ground-water-flow model of the Albuquerque Basin, central New Mexico, documented in Kernodle and others (Kernodle, J.M., McAda, D.P., and Thorn, C.R., 1995, Simulation of ground-water flow in the Albuquerque Basin, central New Mexico, 1901-1994, with projections to 2020: U.S. Geological Survey Water-Resources Investigations Report 94-4251, 114 p.). Output files resulting from the computer simulations are included for reference.

  8. Drug Survey.

    ERIC Educational Resources Information Center

    Gill, Wanda E.; And Others

    Results of a survey of student perceptions of drugs and drug use that was conducted at Bowie State College are presented. Studies that have been conducted on college students' use of alcohol, marijuana, and cocaine in the last five years are reviewed, along with additional studies relating to the general population and the following drugs:…

  9. Complexity Survey.

    ERIC Educational Resources Information Center

    Gordon, Sandra L.; Anderson, Beth C.

    To determine whether consensus existed among teachers about the complexity of common classroom materials, a survey was administered to 66 pre-service and in-service kindergarten and prekindergarten teachers. Participants were asked to rate 14 common classroom materials as simple, complex, or super-complex. Simple materials have one obvious part,…

  10. Computer Needs and Computer Problems in Developing Countries.

    ERIC Educational Resources Information Center

    Huskey, Harry D.

    A survey of the computer environment in a developing country is provided. Levels of development are considered and the educational requirements of countries at various levels are discussed. Computer activities in India, Burma, Pakistan, Brazil and a United Nations sponsored educational center in Hungary are all described. (SK/Author)

  11. Next-generation computers

    SciTech Connect

    Torrero, E.A.

    1985-01-01

    Developments related to tomorrow's computers are discussed, taking into account advances toward the fifth generation in Japan, the challenge to U.S. supercomputers, plans concerning the creation of supersmart computers for the U.S. military, a U.S. industry response to the Japanese challenge, a survey of U.S. and European research, Great Britain, the European Common Market, codifying human knowledge for machine reading, software engineering, the next-generation software, plans for obtaining the million-transistor chip, and fabrication issues for next-generation circuits. Other topics explored are related to a status report regarding artificial intelligence, an assessment of the technical challenges, aspects of sociotechnology, and defense advanced research projects. Attention is also given to expert systems, speech recognition, computer vision, function-level programming and automated programming, computing at the speed limit, VLSI, and superpower computers.

  12. Measurement of Computer Communication Networks.

    ERIC Educational Resources Information Center

    Abrams, Marshall D.; And Others

    Measures, tools, and techniques applicable to the performance measurement of computer communication networks are described for technicians who procure computer services from a remote access network. Cost considerations are discussed as a major component of evaluation, and measurement and evaluation methodologies are surveyed. External measurement…

  13. Audiovisual Media for Computer Education.

    ERIC Educational Resources Information Center

    Van Der Aa, H. J., Ed.

    The result of an international survey, this catalog lists over 450 films dealing with computing methods and automation and is intended for those who wish to use audiovisual displays as a means of instruction of computer education. The catalog gives the film's title, running time, and producer and tells whether the film is color or black-and-white,…

  14. 100-DR-1 radiological surveys

    SciTech Connect

    Naiknimbalkar, N.M.

    1994-01-28

    This report summarizes and documents the results of the radiological surveys conducted over the surface of the 100-DR-1 Operable Unit, Hanford Site, Richland, Washington. In addition, this report explains the survey methodology using the Ultrasonic Ranging and Data System (USRADS). The 100-DR-1 radiological survey field task consisted of two activities: characterization of the operable unit-specific background conditions and the radiological survey of the operable unit surface area. The survey methodology was based on utilization of USRADS for automated recording of the gross gamma radiation levels at or near 6 in. and at 3 ft from the surface soil. The purpose of the survey is to identify the location of unidentified subsurface radioactive material areas and any surface contamination associated with these areas. The radiological surveys were conducted using both a digital count rate meter with a NaI detector reporting in counts per minute (CPM) and a dose rate meter reporting micro-Roentgen per hour (µR/h), connected to a CHEMRAD Tennessee Corp. Series 2000 USRADS. The count rate meter was set for gross counting, i.e., window "out". The window setting allows detection of low, intermediate, and high energy photons. The USRADS equipment is used to record the detector readings versus the location of the readings, generate a map of the survey area, and save the data on computer storage media.
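
    The mapping step described above lends itself to a compact illustration: bin positioned count-rate readings into grid cells and average them. The sketch below is illustrative only, not the USRADS software; the function name, cell size, and hot-spot threshold are assumptions.

        import numpy as np

        def grid_survey(x, y, cpm, cell=1.0):
            # Average count-rate readings into square cells of side `cell` (m).
            x, y, cpm = map(np.asarray, (x, y, cpm))
            ix = np.floor((x - x.min()) / cell).astype(int)
            iy = np.floor((y - y.min()) / cell).astype(int)
            sums = np.zeros((iy.max() + 1, ix.max() + 1))
            counts = np.zeros_like(sums)
            np.add.at(sums, (iy, ix), cpm)
            np.add.at(counts, (iy, ix), 1)
            grid = np.full_like(sums, np.nan)   # NaN marks unsurveyed cells
            hit = counts > 0
            grid[hit] = sums[hit] / counts[hit]
            return grid

        # Illustrative use: flag cells above a hypothetical background threshold.
        rng = np.random.default_rng(0)
        pts = rng.uniform(0.0, 50.0, (200, 3))
        gamma_map = grid_survey(pts[:, 0], pts[:, 1], 3000.0 + 40.0 * pts[:, 2])
        hot_cells = np.argwhere(gamma_map > 4800.0)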

  15. Cameron Station remedial investigation: Final asbestos survey report. Final report

    SciTech Connect

    1992-02-01

    Woodward-Clyde Federal Services (WCFS) conducted a comprehensive asbestos survey of the facilities at Cameron Station as part of its contract with the US Army Toxic and Hazardous Materials Agency (USATHAMA) to perform a remedial investigation and feasibility study (RI/FS) at the base. The purpose of the survey, which was initiated August 23, 1990, in response to the Base Realignment And Closure Environmental Restoration Strategy (BRAC), was to identify friable and non-friable asbestos-containing material (ACM), provide options for abatement of asbestos, provide cost estimates for both abatement and operations and maintenance, and identify actions requiring immediate attention in Cameron Station's 24 buildings. BRAC states that only friable asbestos which presents a threat to health and safety shall be removed; non-friable asbestos, or friable asbestos which is encapsulated or in good repair, shall be left in place and identified to the buyer per GSA agreement. The investigation followed protocols that met or exceeded the requirements of 40 CFR 763, the EPA regulations promulgated under the Asbestos Hazard Emergency Response Act (AHERA).

  16. Geophex Airborne Unmanned Survey System

    SciTech Connect

    Won, I.L.; Keiswetter, D.

    1995-12-31

    Ground-based surveys place personnel at risk due to the proximity of buried unexploded ordnance (UXO) items or by exposure to radioactive materials and hazardous chemicals. The purpose of this effort is to design, construct, and evaluate a portable, remotely-piloted, airborne, geophysical survey system. This non-intrusive system will provide stand-off capability to conduct surveys and detect buried objects, structures, and conditions of interest at hazardous locations. During a survey, the operators remain remote from, but within visual distance of, the site. The sensor system never contacts the Earth, but can be positioned near the ground so that weak geophysical anomalies can be detected. The Geophex Airborne Unmanned Survey System (GAUSS) is designed to detect and locate small-scale anomalies at hazardous sites using magnetic and electromagnetic survey techniques. The system consists of a remotely-piloted, radio-controlled, model helicopter (RCH) with flight computer, light-weight geophysical sensors, an electronic positioning system, a data telemetry system, and a computer base-station. The report describes GAUSS and its test results.

  17. Using electronic surveys in nursing research.

    PubMed

    Cope, Diane G

    2014-11-01

    Computer and Internet use in businesses and homes in the United States has dramatically increased since the early 1980s. In 2011, 76% of households reported having a computer, compared with only 8% in 1984 (File, 2013). A similar increase in Internet use has also been seen, with 72% of households reporting access of the Internet in 2011 compared with 18% in 1997 (File, 2013). This emerging trend in technology has prompted use of electronic surveys in the research community as an alternative to previous telephone and postal surveys. Electronic surveys can offer an efficient, cost-effective method for data collection; however, challenges exist. An awareness of the issues and strategies to optimize data collection using web-based surveys is critical when designing research studies. This column will discuss the different types and advantages and disadvantages of using electronic surveys in nursing research, as well as methods to optimize the quality and quantity of survey responses. PMID:25355023

  18. Computer animation challenges for computational fluid dynamics

    NASA Astrophysics Data System (ADS)

    Vines, Mauricio; Lee, Won-Sook; Mavriplis, Catherine

    2012-07-01

    Computer animation requirements differ from those of traditional computational fluid dynamics (CFD) investigations in that visual plausibility and rapid frame update rates trump physical accuracy. We present an overview of the main techniques for fluid simulation in computer animation, starting with Eulerian grid approaches, the Lattice Boltzmann method, Fourier transform techniques and Lagrangian particle introduction. Adaptive grid methods, precomputation of results for model reduction, parallelisation and computation on graphical processing units (GPUs) are reviewed in the context of accelerating simulation computations for animation. A survey of current specific approaches for the application of these techniques to the simulation of smoke, fire, water, bubbles, mixing, phase change and solid-fluid coupling is also included. Adding plausibility to results through particle introduction, turbulence detail and concentration on regions of interest by level set techniques has elevated the degree of accuracy and realism of recent animations. Basic approaches are described here. Techniques to control the simulation to produce a desired visual effect are also discussed. Finally, some references to rendering techniques and haptic applications are mentioned to provide the reader with a complete picture of the challenges of simulating fluids in computer animation.
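
    The survey's emphasis on visual plausibility and stable large time steps is typified by semi-Lagrangian advection, the transport step popularized by Stam's "stable fluids" and common in Eulerian animation solvers. The sketch below is a generic Python illustration of that technique, not code from any surveyed work; the grid setup and names are assumptions.

        import numpy as np

        def advect(q, u, v, dt):
            # Backtrace each cell centre through (u, v), then bilinearly sample q.
            ny, nx = q.shape
            j, i = np.meshgrid(np.arange(ny), np.arange(nx), indexing="ij")
            x = np.clip(i - dt * u, 0.0, nx - 1.0)   # departure points, clamped
            y = np.clip(j - dt * v, 0.0, ny - 1.0)
            x0, y0 = np.floor(x).astype(int), np.floor(y).astype(int)
            x1, y1 = np.minimum(x0 + 1, nx - 1), np.minimum(y0 + 1, ny - 1)
            fx, fy = x - x0, y - y0
            top = (1.0 - fx) * q[y0, x0] + fx * q[y0, x1]
            bot = (1.0 - fx) * q[y1, x0] + fx * q[y1, x1]
            return (1.0 - fy) * top + fy * bot

        # A smoke-density blob drifting right; stable even at a large time step.
        n = 64
        q = np.zeros((n, n)); q[24:40, 24:40] = 1.0
        u, v = np.full((n, n), 0.8), np.zeros((n, n))
        for _ in range(30):
            q = advect(q, u, v, dt=1.0)

    Because the interpolation only averages existing values, the step cannot blow up at large time steps; the price is extra numerical diffusion, an accuracy-for-frame-rate trade that animation tolerates and conventional CFD generally cannot.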

  19. Computer Music

    NASA Astrophysics Data System (ADS)

    Cook, Perry R.

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and the study of perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.).

  20. Computer Music

    NASA Astrophysics Data System (ADS)

    Cook, Perry

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.). Although most people would think that analog synthesizers and electronic music substantially predate the use of computers in music, many experiments and complete computer music systems were being constructed and used as early as the 1950s.

  1. Very large radio surveys of the sky.

    PubMed

    Condon, J J

    1999-04-27

    Recent advances in electronics and computing have made possible a new generation of large radio surveys of the sky that yield an order-of-magnitude higher sensitivity and positional accuracy. Combined with the unique properties of the radio universe, these quantitative improvements open up qualitatively different and exciting new scientific applications of radio surveys. PMID:10220365

  2. Very large radio surveys of the sky

    PubMed Central

    Condon, J. J.

    1999-01-01

    Recent advances in electronics and computing have made possible a new generation of large radio surveys of the sky that yield an order-of-magnitude higher sensitivity and positional accuracy. Combined with the unique properties of the radio universe, these quantitative improvements open up qualitatively different and exciting new scientific applications of radio surveys. PMID:10220365

  3. Multiple Surveys of Students and Survey Fatigue

    ERIC Educational Resources Information Center

    Porter, Stephen R.; Whitcomb, Michael E.; Weitzer, William H.

    2004-01-01

    This chapter reviews the literature on survey fatigue and summarizes a research project that indicates that administering multiple surveys in one academic year can significantly suppress response rates in later surveys. (Contains 4 tables.)

  4. Pygmalion's Computer.

    ERIC Educational Resources Information Center

    Peelle, Howard A.

    Computers have undoubtedly entered the educational arena, mainly in the areas of computer-assisted instruction (CAI) and artificial intelligence, but whether educators should embrace computers and exactly how they should use them are matters of great debate. The use of computers in support of educational administration is widely accepted.…

  5. Laser Surveying

    NASA Technical Reports Server (NTRS)

    1978-01-01

    NASA technology has produced a laser-aided system for surveying land boundaries in difficult terrain. It does the job more accurately than conventional methods, takes only one-third the time normally required, and is considerably less expensive. In surveying to mark property boundaries, the objective is to establish an accurate heading between two "corner" points. This is conventionally accomplished by erecting a "range pole" at one point and sighting it from the other point through an instrument called a theodolite. But how do you take a heading between two points which are not visible to each other, for instance, when tall trees, hills or other obstacles obstruct the line of sight? That was the problem confronting the U.S. Department of Agriculture's Forest Service. The Forest Service manages 187 million acres of land in 44 states and Puerto Rico. Unfortunately, National Forest System lands are not contiguous but intermingled in complex patterns with privately-owned land. In recent years much of the private land has been undergoing development for purposes ranging from timber harvesting to vacation resorts. There is a need for precise boundary definition so that both private owners and the Forest Service can manage their properties with confidence that they are not trespassing on the other's land.

  6. Farmland Survey

    NASA Technical Reports Server (NTRS)

    1988-01-01

    A 1981 U.S. Department of Agriculture (USDA) study estimated that the nation is converting farmland to non-agricultural uses at the rate of 3 million acres a year. Seeking information on farmland loss in Florida, the state legislature, in 1984, directed establishment of a program for development of accurate data to enable intelligent legislation of state growth management. Thus was born Florida's massive Mapping and Monitoring of Agricultural Lands Project (MMALP). It employs data from the NASA-developed Landsat Earth resources survey satellite system as a quicker, less expensive alternative to ground surveying. The three-year project involved an inventory of Florida's 36 million acres, classifying land cover such as cropland, pastureland, citrus, woodlands, wetland, water, and populated areas. Direction was assigned to the Florida Department of Community Affairs (DCA) with assistance from the DOT. With the cooperation of the USDA Soil Conservation Service, DCA decided that combining soil data with the Landsat land cover data would make available to land use planners a more comprehensive view of a county's land potential.

  7. A regional land use survey based on remote sensing and other data: A report on a LANDSAT and computer mapping project, volume 1. [Arizona, Colorado, Montana, New Mexico, Utah, and Wyoming

    NASA Technical Reports Server (NTRS)

    Nez, G. (Principal Investigator); Mutter, D.

    1977-01-01

    The author has identified the following significant results. New LANDSAT analysis software and linkages with other computer mapping software were developed. Significant results were also achieved in training, communication, and identification of needs for developing the LANDSAT/computer mapping technologies into operational tools for use by decision makers.

  8. Making maps with computers

    USGS Publications Warehouse

    Guptill, S.C.; Starr, L.E.

    1988-01-01

    Soon after their introduction in the 1950s, digital computers were used for various phases of the mapping process, especially for trigonometric calculations of survey data and for orientation of aerial photographs on map manuscripts. In addition, computer-controlled plotters were used to draw simple outline maps. The process of collecting data for the plotters was slow, and the resulting maps were not as precise as those produced by the best manual cartography. Only during the 1980s has it become technologically feasible and cost-effective to assemble and use the data required to automate the mapping process. -from Authors

  9. Faculty of Education Students' Computer Self-Efficacy Beliefs and Their Attitudes towards Computers and Implementing Computer Supported Education

    ERIC Educational Resources Information Center

    Berkant, Hasan Güner

    2016-01-01

    This study investigates faculty of education students' computer self-efficacy beliefs and their attitudes towards computers and implementing computer supported education. This study is descriptive and based on a correlational survey model. The final sample consisted of 414 students studying in the faculty of education of a Turkish university. The…

  10. Recent high precision surveys at PEP

    SciTech Connect

    Sah, R.C.

    1980-12-01

    The task of surveying and aligning the components of PEP has provided an opportunity to develop new instruments and techniques for high precision surveys. The new instruments are quick and easy to use, and they automatically encode survey data and read them into the memory of an on-line computer. When measurements of several beam elements have been taken, the on-line computer analyzes the measured data, compares them with the desired parameters, and calculates the required adjustments to the beam element support stands.
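
    As a rough illustration of that comparison step, the sketch below fits an ideal straight reference line through measured element offsets and reports the move that brings each element onto it. It is hypothetical, not the PEP on-line software; the units and sign convention are assumptions.

        import numpy as np

        def stand_adjustments(s, y):
            # s: longitudinal positions (m); y: measured transverse offsets (mm).
            # Fit the best straight reference line, then return each element's
            # required move onto that line.
            A = np.vstack([s, np.ones_like(s)]).T
            slope, intercept = np.linalg.lstsq(A, y, rcond=None)[0]
            return (slope * s + intercept) - y

        s = np.array([0.0, 2.5, 5.0, 7.5, 10.0])      # element stations, m
        y = np.array([0.10, 0.32, 0.18, 0.45, 0.30])  # measured offsets, mm
        print(stand_adjustments(s, y))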

  11. Computational methods for unsteady transonic flows

    NASA Technical Reports Server (NTRS)

    Edwards, John W.; Thomas, James L.

    1987-01-01

    Computational methods for unsteady transonic flows are surveyed with emphasis upon applications to aeroelastic analysis and flutter prediction. Computational difficulty is discussed with respect to the type of unsteady flow: attached, mixed (attached/separated), and separated. Significant early computations of shock motions, aileron buzz and periodic oscillations are discussed. The maturation of computational methods towards the capability of treating complete vehicles with reasonable computational resources is noted, and a survey of recent comparisons with experimental results is compiled. The importance of mixed attached and separated flow modeling for aeroelastic analysis is discussed and recent calculations of periodic aerodynamic oscillations for an 18 percent thick circular arc airfoil are given.

  12. Computational methods for unsteady transonic flows

    NASA Technical Reports Server (NTRS)

    Edwards, John W.; Thomas, J. L.

    1987-01-01

    Computational methods for unsteady transonic flows are surveyed with emphasis on prediction. Computational difficulty is discussed with respect to the type of unsteady flow: attached, mixed (attached/separated), and separated. Significant early computations of shock motions, aileron buzz and periodic oscillations are discussed. The maturation of computational methods towards the capability of treating complete vehicles with reasonable computational resources is noted, and a survey of recent comparisons with experimental results is compiled. The importance of mixed attached and separated flow modeling for aeroelastic analysis is discussed, and recent calculations of periodic aerodynamic oscillations for an 18 percent thick circular arc airfoil are given.

  13. Computational principles of memory.

    PubMed

    Chaudhuri, Rishidev; Fiete, Ila

    2016-03-01

    The ability to store and later use information is essential for a variety of adaptive behaviors, including integration, learning, generalization, prediction and inference. In this Review, we survey theoretical principles that can allow the brain to construct persistent states for memory. We identify requirements that a memory system must satisfy and analyze existing models and hypothesized biological substrates in light of these requirements. We also highlight open questions, theoretical puzzles and problems shared with computer science and information theory. PMID:26906506

  14. Infrastructure Survey 2011

    ERIC Educational Resources Information Center

    Group of Eight (NJ1), 2012

    2012-01-01

    In 2011, the Group of Eight (Go8) conducted a survey on the state of its buildings and infrastructure. The survey is the third Go8 Infrastructure survey, with previous surveys being conducted in 2007 and 2009. The current survey updated some of the information collected in the previous surveys. It also collated data related to aspects of the…

  15. The ASCI Network for SC '98: Dense Wave Division Multiplexing for Distributed and Distance Computing

    SciTech Connect

    Adams, R.L.; Butman, W.; Martinez, L.G.; Pratt, T.J.; Vahle, M.O.

    1999-06-01

    This document highlights the DISCOM Distance Computing and Communication team's activities at the 1998 Supercomputing conference in Orlando, Florida. This conference is sponsored by the IEEE and ACM. Sandia National Laboratories, Lawrence Livermore National Laboratory, and Los Alamos National Laboratory have participated in this conference for ten years. For the last three years, the three laboratories have shared a joint booth at the conference under the DOE's Accelerated Strategic Computing Initiative (ASCI). The DISCOM communication team uses the forum to demonstrate communications and networking developments. At SC '98, DISCOM demonstrated the capabilities of Dense Wave Division Multiplexing and exhibited an OC48 ATM encryptor. We also coordinated the other networking activities within the booth. This paper documents those accomplishments, discusses the details of their implementation, and describes how these demonstrations support overall strategies in ATM networking.

  16. Using Personal Computers to Promote Economic Development.

    ERIC Educational Resources Information Center

    ECO Northwest, Ltd., Helena, MT.

    A study was conducted to determine the feasibility of increasing economic development within Montana through the use of personal computers in small businesses. A statewide mail survey of 1,650 businesses (employing between 4 and 25 employees) was conducted to determine the current status of computer use and the potential for expanding computer use…

  17. Social Attitudes and the Computer Revolution

    ERIC Educational Resources Information Center

    Lee, Robert S.

    1970-01-01

    Presents the results of a nationwide survey of attitudes toward computers in two categories: (1) the computer as a purposeful instrument, helpful in science, industry, and space exploration, and (2) the computer as a relatively autonomous machine that can perform the functions of human thinking. (MB)

  18. Online Hand Holding in Fixing Computer Glitches

    ERIC Educational Resources Information Center

    Goldsborough, Reid

    2005-01-01

    According to most surveys, computer manufacturers such as HP put out reliable products, and computers in general are less troublesome than in the past. But personal computers are still prone to bugs, conflicts, viruses, spyware infestations, hacker and phishing attacks, and--most of all--user error. Unfortunately, technical support from computer…

  19. Computing Newsletter for Schools of Business.

    ERIC Educational Resources Information Center

    Couger, J. Daniel, Ed.

    1973-01-01

    The first of the two issues included here reports on various developments concerning the use of computers for schools of business. One-page articles cover these topics: widespread use of simulation games, survey of computer use in higher education, ten new computer cases which teach techniques for management analysis, advantages of the use of…

  20. Computers in Public Broadcasting: Who, What, Where.

    ERIC Educational Resources Information Center

    Yousuf, M. Osman

    This handbook offers guidance to public broadcasting managers on computer acquisition and development activities. Based on a 1981 survey of planned and current computer uses conducted by the Corporation for Public Broadcasting (CPB) Information Clearinghouse, computer systems in public radio and television broadcasting stations are listed by…

  1. Computer Organizational Techniques Used by Office Personnel.

    ERIC Educational Resources Information Center

    Alexander, Melody

    1995-01-01

    According to survey responses from 404 of 532 office personnel, 81.7% enjoy working with computers; the majority save files on their hard drives, use disk labels and storage files, do not use subdirectories or compress data, and do not make backups of floppy disks. Those with higher degrees, more computer experience, and more daily computer use…

  2. The Effects of Home Computers on School Enrollment

    ERIC Educational Resources Information Center

    Fairlie, R.W.

    2005-01-01

    Approximately 9 out of 10 high school students who have access to a home computer use that computer to complete school assignments. Do these home computers, however, improve educational outcomes? Using the Computer and Internet Use Supplement to the 2001 Current Population Survey, I explore whether access to home computers increases the likelihood…

  3. A Survey of Collectives

    NASA Technical Reports Server (NTRS)

    Tumer, Kagan; Wolpert, David

    2004-01-01

    Due to the increasing sophistication and miniaturization of computational components, complex, distributed systems of interacting agents are becoming ubiquitous. Such systems, where each agent aims to optimize its own performance, but where there is a well-defined set of system-level performance criteria, are called collectives. The fundamental problem in analyzing/designing such systems is in determining how the combined actions of self-interested agents lead to 'coordinated' behavior on a large scale. Examples of artificial systems which exhibit such behavior include packet routing across a data network, control of an array of communication satellites, coordination of multiple deployables, and dynamic job scheduling across a distributed computer grid. Examples of natural systems include ecosystems, economies, and the organelles within a living cell. No current scientific discipline provides a thorough understanding of the relation between the structure of collectives and how well they meet their overall performance criteria. Although still very young, research on collectives has resulted in successes both in understanding and designing such systems. It is expected that as it matures and draws upon other disciplines related to collectives, this field will greatly expand the range of computationally addressable tasks. Moreover, in addition to drawing on them, such a fully developed field of collective intelligence may provide insight into already established scientific fields, such as mechanism design, economics, game theory, and population biology. This chapter provides a survey to the emerging science of collectives.

  4. Computational dosimetry

    SciTech Connect

    Siebert, B.R.L.; Thomas, R.H.

    1996-01-01

    The paper presents a definition of the term "Computational Dosimetry" that is interpreted as the sub-discipline of computational physics which is devoted to radiation metrology. It is shown that computational dosimetry is more than a mere collection of computational methods. Computational simulations directed at basic understanding and modelling are important tools provided by computational dosimetry, while another very important application is the support that it can give to the design, optimization and analysis of experiments. However, the primary task of computational dosimetry is to reduce the variance in the determination of absorbed dose (and its related quantities), for example in the disciplines of radiological protection and radiation therapy. In this paper emphasis is given to the discussion of potential pitfalls in the applications of computational dosimetry and recommendations are given for their avoidance. The need for comparison of calculated and experimental data whenever possible is strongly stressed.

  5. COMPUTATIONAL TOXICOLOGY

    EPA Science Inventory

    Over the last several years, there has been increased pressure to utilize novel technologies derived from computational chemistry, molecular biology and systems biology in toxicological risk assessment. This new area has been referred to as "Computational Toxicology". Our resear...

  6. Cloud Computing

    SciTech Connect

    Pete Beckman and Ian Foster

    2009-12-04

    Chicago Matters: Beyond Burnham (WTTW). Chicago has become a world center of "cloud computing." Argonne experts Pete Beckman and Ian Foster explain what "cloud computing" is and how you probably already use it on a daily basis.

  7. Computer Recreations.

    ERIC Educational Resources Information Center

    Dewdney, A. K.

    1989-01-01

    Reviews the performance of computer programs for writing poetry and prose, including MARK V. SHANEY, MELL, POETRY GENERATOR, THUNDER THOUGHT, and ORPHEUS. Discusses the writing principles of the programs. Provides additional information on computer magnification techniques. (YP)

  8. Computational Toxicology

    EPA Science Inventory

    'Computational toxicology' is a broad term that encompasses all manner of computer-facilitated informatics, data-mining, and modeling endeavors in relation to toxicology, including exposure modeling, physiologically based pharmacokinetic (PBPK) modeling, dose-response modeling, ...

  9. DNA computing.

    PubMed

    Gibbons, A; Amos, M; Hodgson, D

    1997-02-01

    DNA computation is a novel and exciting recent development at the interface of computer science and molecular biology. We describe the current activity in this field following the seminal work of Adleman, who recently showed how techniques of molecular biology may be applied to the solution of a computationally intractable problem. PMID:9013647

  10. Computer Starters!

    ERIC Educational Resources Information Center

    Instructor, 1983

    1983-01-01

    Instructor's Computer-Using Teachers Board members give practical tips on how to get a classroom ready for a new computer, introduce students to the machine, and help them learn about programing and computer literacy. Safety, scheduling, and supervision requirements are noted. (PP)

  11. Computer Corner.

    ERIC Educational Resources Information Center

    Hampel, Paul J.

    1984-01-01

    Presents: (1) a computer program which will change a fraction into a decimal; (2) a program in which students supply missing lines to create the output given; and (3) suggestions for adding computer awareness to classrooms, including use of used punch cards and old computer-generated printouts. (JN)

  12. Computer Literacy.

    ERIC Educational Resources Information Center

    San Marcos Unified School District, CA.

    THE FOLLOWING IS THE FULL TEXT OF THIS DOCUMENT: After viewing many computer-literacy programs, we believe San Marcos Junior High School has developed a unique program which will truly develop computer literacy. Our hope is to give all students a comprehensive look at computers as they go through their two years here. They will not only learn the…

  13. Computer Literacy.

    ERIC Educational Resources Information Center

    Hunter, Beverly

    The concept of computer literacy is examined as it applies to two-year colleges. The paper begins with definitions of the term, emphasizing the skills, knowledge, and attitudes toward computers that are considered criteria for computer literacy. The paper continues by describing a conference at which educators attempted to visualize the technology…

  14. Computer Manual.

    ERIC Educational Resources Information Center

    Illinois State Office of Education, Springfield.

    This manual, designed to provide the teacher with methods of understanding the computer and its potential in the classroom, includes four units with exercises and an answer sheet. Unit 1 covers computer fundamentals, the minicomputer, programming languages, an introduction to BASIC, and control instructions. Variable names and constants described…

  15. Computers improve sonar seabed maps

    SciTech Connect

    Not Available

    1984-05-01

    A software package for computer aided mapping of sonar (CAMOS) has been developed in Norway. It has automatic mosaic presentation, which produces fully scale-rectified side scan sonograms automatically plotted on geographical and UTM map grids. The program is the first of its kind in the world. The maps produced by this method are more accurate and detailed than those produced by conventional methods. The main applications of CAMOS are: seafloor mapping; pipeline route surveys; pipeline inspection surveys; platform site surveys; geological mapping and geotechnical investigations. With the aerial-photograph quality of the CAMOS maps, a more accurate and visual representation of the seabed is achieved.
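
    The scale rectification behind such maps rests on slant-range correction: each sample's slant range is converted to horizontal distance using the towfish altitude. The sketch below illustrates the geometry under a flat-seabed assumption; it is not the CAMOS code, and the names are illustrative.

        import numpy as np

        def rectify_ping(samples, max_range, altitude):
            # Resample one side-scan ping from slant range to ground range.
            samples = np.asarray(samples, dtype=float)
            slant = np.linspace(0.0, max_range, len(samples))
            valid = slant > altitude              # drop the water-column samples
            ground = np.sqrt(slant[valid] ** 2 - altitude ** 2)
            uniform = np.linspace(0.0, ground[-1], len(samples))
            return np.interp(uniform, ground, samples[valid])

        ping = np.random.default_rng(0).uniform(size=1024)   # fake intensities
        rectified = rectify_ping(ping, max_range=150.0, altitude=12.0)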

  16. Computers for symbolic processing

    NASA Technical Reports Server (NTRS)

    Wah, Benjamin W.; Lowrie, Matthew B.; Li, Guo-Jie

    1989-01-01

    A detailed survey on the motivations, design, applications, current status, and limitations of computers designed for symbolic processing is provided. Symbolic processing computations are performed at the word, relation, or meaning levels, and the knowledge used in symbolic applications may be fuzzy, uncertain, indeterminate, and ill represented. Various techniques for knowledge representation and processing are discussed from both the designers' and users' points of view. The design and choice of a suitable language for symbolic processing and the mapping of applications into a software architecture are then considered. The process of refining the application requirements into hardware and software architectures is treated, and state-of-the-art sequential and parallel computers designed for symbolic processing are discussed.

  17. Quantum Computing

    NASA Astrophysics Data System (ADS)

    Steffen, Matthias

    2013-03-01

    Quantum mechanics plays a crucial role in many day-to-day products, and has been successfully used to explain a wide variety of observations in Physics. While some quantum effects such as tunneling limit the degree to which modern CMOS devices can be scaled to ever reducing dimensions, others may potentially be exploited to build an entirely new computing architecture: The quantum computer. In this talk I will review several basic concepts of a quantum computer. Why quantum computing and how do we do it? What is the status of several (but not all) approaches towards building a quantum computer, including IBM's approach using superconducting qubits? And what will it take to build a functional machine? The promise is that a quantum computer could solve certain interesting computational problems such as factoring using exponentially fewer computational steps than classical systems. Although the most sophisticated modern quantum computing experiments to date do not outperform simple classical computations, it is increasingly becoming clear that small scale demonstrations with as many as 100 qubits are beginning to be within reach over the next several years. Such a demonstration would undoubtedly be a thrilling feat, and usher in a new era of controllably testing quantum mechanics or quantum computing aspects. At the minimum, future demonstrations will shed much light on what lies ahead.

  18. Computer sciences

    NASA Technical Reports Server (NTRS)

    Smith, Paul H.

    1988-01-01

    The Computer Science Program provides advanced concepts, techniques, system architectures, algorithms, and software for both space and aeronautics information sciences and computer systems. The overall goal is to provide the technical foundation within NASA for the advancement of computing technology in aerospace applications. The research program is improving the state of knowledge of fundamental aerospace computing principles and advancing computing technology in space applications such as software engineering and information extraction from data collected by scientific instruments in space. The program includes the development of special algorithms and techniques to exploit the computing power provided by high performance parallel processors and special purpose architectures. Research is being conducted in the fundamentals of data base logic and improvement techniques for producing reliable computing systems.

  19. Computer software.

    PubMed

    Rosenthal, L E

    1986-10-01

    Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of types, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips. PMID:3536223

  20. Computer Literacy: Teaching Computer Ethics.

    ERIC Educational Resources Information Center

    Troutner, Joanne

    1986-01-01

    Suggests learning activities for teaching computer ethics in three areas: (1) equal access; (2) computer crime; and (3) privacy. Topics include computer time, advertising, class enrollments, copyright law, sabotage ("worms"), the Privacy Act of 1974 and the Freedom of Information Act of 1966. (JM)

  1. And the Survey Says…

    NASA Astrophysics Data System (ADS)

    White, Susan C.

    2016-02-01

    Every four years we survey a nationally representative sample of high school physics teachers. We define anyone who teaches at least one physics class to be a "physics teacher." About 40% of these teachers teach a majority of their classes in subjects other than physics. We also ask teachers to rate how well prepared they felt in various aspects of teaching. The response choices are "not adequately prepared," "adequately prepared," and "very well prepared." The accompanying figure shows the proportion of teachers who reported feeling adequately or very well prepared in the following aspects of teaching: • Basic physics knowledge, • Other science knowledge, • Application of physics to everyday experience, • Use of demonstrations, • Instructional laboratory design, • Use of computers in physics instruction and labs, and • Recent developments in physics.

  2. Teaching perspectives among introductory computer programming faculty in higher education

    NASA Astrophysics Data System (ADS)

    Mainier, Michael J.

    This study identified the teaching beliefs, intentions, and actions of 80 introductory computer programming (CS1) faculty members from institutions of higher education in the United States using the Teacher Perspectives Inventory. The instruction method used inside the classroom, categorized according to ACM CS1 curriculum guidelines, was also captured, along with information to develop a demographic profile of respondents. Faculty members' combined beliefs, intentions, and actions scores displayed a dominant trend within the apprenticeship perspective, while indicating a general preference for the imperative-first instruction method. This result suggests a possible misalignment between these teachers' underlying aim of simulating the experience of computer programming and their instructional approach of lecture and textbook learning. The factors of teaching experience and first language were found to have significant influence on faculty, particularly within the social reform perspective: established faculty members possess the intent to change society for the better, while instructors born outside the U.S. are more likely to actually teach through this perspective.

  3. Modeling computer interest in older adults: the role of age, education, computer knowledge, and computer anxiety.

    PubMed

    Ellis, D; Allaire, J C

    1999-09-01

    We proposed a mediation model to examine the effects of age, education, computer knowledge, and computer anxiety on computer interest in older adults. We hypothesized that computer knowledge and computer anxiety would fully mediate the effects of age and education on computer interest. A sample of 330 older adults from local senior-citizen apartment buildings completed a survey that included an assessment of the constructs included in the model. Using structural equation modeling, we found that the results supported the hypothesized mediation model. In particular, the effect of computer knowledge operated on computer interest through computer anxiety. The effect of age was not fully mitigated by the other model variables, indicating the need for future research that identifies and models other correlates of age and computer interest. The most immediate application of this research is the finding that a simple 3-item instrument can be used to assess computer interest in older populations. This will help professionals plan and implement computer services in public-access settings for older adults. An additional application of this research is the information it provides for training program designers. PMID:10665203
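
    The regression logic behind such a mediation model can be illustrated compactly. The sketch below is a generic ordinary-least-squares version, not the authors' structural equation analysis; the data are synthetic and the variable names hypothetical.

        import numpy as np

        def ols(columns, y):
            # Ordinary least squares; returns [intercept, coefficients...].
            X = np.column_stack([np.ones(len(y))] + list(columns))
            return np.linalg.lstsq(X, y, rcond=None)[0]

        rng = np.random.default_rng(1)
        knowledge = rng.normal(size=330)                    # n matches the study
        anxiety = -0.6 * knowledge + rng.normal(size=330)   # synthetic "a" path
        interest = -0.5 * anxiety + 0.1 * knowledge + rng.normal(size=330)

        a = ols([knowledge], anxiety)[1]                    # knowledge -> anxiety
        b, c_prime = ols([anxiety, knowledge], interest)[1:]
        print("indirect effect a*b:", a * b, "direct effect c':", c_prime)

    Full mediation corresponds to a large indirect product a*b with the direct effect c' shrinking toward zero, which mirrors the finding that computer knowledge operates on interest through computer anxiety.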

  4. Multiresolution Wavelet Based Adaptive Numerical Dissipation Control for Shock-Turbulence Computations

    NASA Technical Reports Server (NTRS)

    Sjoegreen, B.; Yee, H. C.

    2001-01-01

    The recently developed essentially fourth-order or higher low-dissipative shock-capturing scheme of Yee, Sandham and Djomehri (1999) aims at minimizing numerical dissipation for high-speed compressible viscous flows containing shocks, shears and turbulence. To detect nonsmooth behavior and control the amount of numerical dissipation to be added, Yee et al. employed an artificial compression method (ACM) of Harten (1978) but utilized it in an entirely different context than Harten originally intended. The ACM sensor consists of two tuning parameters and is highly dependent on the physical problem. To minimize the tuning of parameters and problem dependence, new sensors with improved detection properties are proposed. The new sensors are derived from appropriate non-orthogonal wavelet basis functions and can be used to completely switch off the extra numerical dissipation outside shock layers. The non-dissipative spatial base scheme of arbitrarily high order of accuracy can be maintained without compromising its stability in all parts of the domain where the solution is smooth. Two types of redundant non-orthogonal wavelet basis functions are considered. One is the B-spline wavelet (Mallat & Zhong 1992) used by Gerritsen and Olsson (1996) in an adaptive mesh refinement method to determine regions where refinement should be done. The other is a modification of the multiresolution method of Harten (1995), converted to a new, redundant, non-orthogonal wavelet. The wavelet sensor is then obtained by computing the estimated Lipschitz exponent of a chosen physical quantity (or vector) to be sensed on a chosen wavelet basis function. Both wavelet sensors can be viewed as dual-purpose adaptive methods leading to dynamic numerical dissipation control and improved grid adaptation indicators. Consequently, they are useful not only for shock-turbulence computations but also for computational aeroacoustics and numerical combustion. In addition, these…
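
    The sensing idea admits a small illustration: estimate a pointwise regularity (Lipschitz) exponent from how detail magnitudes decay across dyadic smoothing scales, and add dissipation only where the exponent is small. The sketch below uses a simple à trous-style redundant low-pass filter and is a loose illustration, not the authors' B-spline or Harten multiresolution scheme; the switching threshold is an assumption.

        import numpy as np

        def lipschitz_exponent(q, levels=3):
            # Estimate a pointwise regularity exponent for a 1-D field q.
            details = []
            smooth = np.asarray(q, dtype=float)
            for lev in range(levels):
                step = 2 ** lev                   # dyadic dilation (a trous)
                padded = np.pad(smooth, step, mode="edge")
                coarser = (padded[:-2 * step] + 2.0 * padded[step:-step]
                           + padded[2 * step:]) / 4.0
                details.append(np.abs(smooth - coarser) + 1e-12)  # avoid log(0)
                smooth = coarser
            # |detail_j| ~ 2**(j*alpha): fit the slope of log2|detail| against j.
            logd = np.log2(np.array(details))
            return np.polyfit(np.arange(levels), logd, 1)[0]

        x = np.linspace(-1.0, 1.0, 256)
        q = np.tanh(40.0 * x)                  # smooth profile with a sharp layer
        alpha = lipschitz_exponent(q)
        add_dissipation = alpha < 0.5          # hypothetical switching threshold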

  5. Alumni Perspectives Survey, 2010. Survey Report

    ERIC Educational Resources Information Center

    Sheikh, Sabeen

    2010-01-01

    During the months of April and September of 2009, the Graduate Management Admission Council[R] (GMAC[R]) conducted the Alumni Perspectives Survey, a longitudinal study of prior respondents to the Global Management Education Graduate Survey of management students nearing graduation. A total of 3,708 alumni responded to the April 2009 survey,…

  6. 2012 Alumni Perspectives Survey. Survey Report

    ERIC Educational Resources Information Center

    Leach, Laura

    2012-01-01

    Conducted in September 2011, this Alumni Perspectives Survey by the Graduate Management Admission Council (GMAC) is a longitudinal study of respondents to the Global Management Education Graduate Survey, the annual GMAC[R] exit survey of graduate management students in their final year of business school. This 12th annual report includes responses…

  7. Computational psychiatry

    PubMed Central

    Montague, P. Read; Dolan, Raymond J.; Friston, Karl J.; Dayan, Peter

    2013-01-01

    Computational ideas pervade many areas of science and have an integrative explanatory role in neuroscience and cognitive science. However, computational depictions of cognitive function have had surprisingly little impact on the way we assess mental illness because diseases of the mind have not been systematically conceptualized in computational terms. Here, we outline goals and nascent efforts in the new field of computational psychiatry, which seeks to characterize mental dysfunction in terms of aberrant computations over multiple scales. We highlight early efforts in this area that employ reinforcement learning and game theoretic frameworks to elucidate decision-making in health and disease. Looking forwards, we emphasize a need for theory development and large-scale computational phenotyping in human subjects. PMID:22177032

  8. Sources of Information on Computer Assisted Instruction

    ERIC Educational Resources Information Center

    Dick, Walter; And Others

    1970-01-01

    A directory of projects dealing with computer-assisted instruction, primarily at the college level, based on a survey intended to uncover fugitive sources of information in this field (e.g., unpublished project progress reports). (LS)

  9. Computational aerothermodynamics

    NASA Technical Reports Server (NTRS)

    Deiwert, George S.; Green, Michael J.

    1987-01-01

    Computational aerothermodynamics (CAT) has in the past contributed to the understanding of real-gas flows encountered by hypervelocity reentry vehicles. With advances in computational fluid dynamics, in the modeling of high temperature phenomena, and in computer capability, CAT is an enabling technology for the design of many future space vehicles. An overview of the current capabilities of CAT is provided by describing available methods and their applications. Technical challenges that need to be met are discussed.

  10. Post Graduate Students' Computing Confidence, Computer and Internet Usage at Kuvempu University--An Indian Study

    ERIC Educational Resources Information Center

    Dange, Jagannath K.

    2010-01-01

    There is a common belief that students entering Post Graduation have appropriate computing skills for study purposes and there is no longer a felt need for computer training programmes in tertiary education. First year students of Post Graduation were surveyed in 2009, they were asked about their Education and Computing backgrounds. Further, the…

  11. An Examination of the Validity of Computer and Non-Computer Person Stereotypes.

    ERIC Educational Resources Information Center

    Dobbs, Linda Kay

    A series of three studies examined the validity of certain features of computer person and non-computer person stereotypes, including gender, academic achievement, communication apprehension, and receiver apprehension. First a pilot study developed a computer attitude estimate (CAE) scale and survey method. Subjects were 47 high school students…

  12. Computer Network Interconnection: Problems and Prospects. Computer Science & Technology Series.

    ERIC Educational Resources Information Center

    Cotton, Ira W.

    This report examines the current situation regarding the interconnection of computer networks, especially packet switched networks (PSNs). The emphasis is on identifying the barriers to interconnection and on surveying approaches to a solution, rather than recommending any single course of action. Sufficient organizational and technical background…

  13. Computer Software.

    ERIC Educational Resources Information Center

    Kay, Alan

    1984-01-01

    Discusses the nature and development of computer software. Programing, programing languages, types of software (including dynamic spreadsheets), and software of the future are among the topics considered. (JN)

  14. Computed Tomography

    NASA Astrophysics Data System (ADS)

    Castellano, Isabel; Geleijns, Jacob

    After its clinical introduction in 1973, computed tomography developed from an x-ray modality for axial imaging in neuroradiology into a versatile three-dimensional imaging modality for a wide range of applications in, for example, oncology, vascular radiology, cardiology, traumatology, and even interventional radiology. Computed tomography is applied for diagnosis, follow-up studies, and screening of healthy subpopulations with specific risk factors. This chapter provides a general introduction to computed tomography, covering a short history of the technique, technology, image quality, dosimetry, room shielding, quality control, and quality criteria.

  15. Astronomical surveys and big data

    NASA Astrophysics Data System (ADS)

    Mickaelian, Areg M.

    Recent all-sky and large-area astronomical surveys and their catalogued data over the whole range of electromagnetic spectrum, from γ-rays to radio waves, are reviewed, including such as Fermi-GLAST and INTEGRAL in γ-ray, ROSAT, XMM and Chandra in X-ray, GALEX in UV, SDSS and several POSS I and POSS II-based catalogues (APM, MAPS, USNO, GSC) in the optical range, 2MASS in NIR, WISE and AKARI IRC in MIR, IRAS and AKARI FIS in FIR, NVSS and FIRST in radio range, and many others, as well as the most important surveys giving optical images (DSS I and II, SDSS, etc.), proper motions (Tycho, USNO, Gaia), variability (GCVS, NSVS, ASAS, Catalina, Pan-STARRS), and spectroscopic data (FBS, SBS, Case, HQS, HES, SDSS, CALIFA, GAMA). An overall understanding of the coverage along the whole wavelength range and comparisons between various surveys are given: galaxy redshift surveys, QSO/AGN, radio, Galactic structure, and Dark Energy surveys. Astronomy has entered the Big Data era, with Astrophysical Virtual Observatories and Computational Astrophysics playing an important role in using and analyzing big data for new discoveries.

  16. Computer Interview Problem Assessment of Psychiatric Patients

    PubMed Central

    Angle, Hugh V.; Ellinwood, Everett H.; Carroll, Judith

    1978-01-01

    Behavioral Assessment information, a more general form of Problem-Oriented Record data, appears to have many useful clinical qualities and was selected to be the information content for a computer interview system. This interview system was designed to assess problematic behaviors of psychiatric patients. The computer interview covered 29 life problem areas and took patients from four to eight hours to complete. In two reliability studies, the computer interview was compared to human interviews. A greater number of general and specific patient problems were identified in the computer interview than in the human interviews. The attitudes of computer patients and clinicians receiving the computer reports were surveyed.

  17. Computing Life

    ERIC Educational Resources Information Center

    National Institute of General Medical Sciences (NIGMS), 2009

    2009-01-01

    Computer advances now let researchers quickly search through DNA sequences to find gene variations that could lead to disease, simulate how flu might spread through one's school, and design three-dimensional animations of molecules that rival any video game. By teaming computers and biology, scientists can answer new and old questions that could…

  18. Computational Pathology

    PubMed Central

    Louis, David N.; Feldman, Michael; Carter, Alexis B.; Dighe, Anand S.; Pfeifer, John D.; Bry, Lynn; Almeida, Jonas S.; Saltz, Joel; Braun, Jonathan; Tomaszewski, John E.; Gilbertson, John R.; Sinard, John H.; Gerber, Georg K.; Galli, Stephen J.; Golden, Jeffrey A.; Becich, Michael J.

    2016-01-01

    Context We define the scope and needs within the new discipline of computational pathology, a discipline critical to the future of both the practice of pathology and, more broadly, medical practice in general. Objective To define the scope and needs of computational pathology. Data Sources A meeting was convened in Boston, Massachusetts, in July 2014 prior to the annual Association of Pathology Chairs meeting, and it was attended by a variety of pathologists, including individuals highly invested in pathology informatics as well as chairs of pathology departments. Conclusions The meeting made recommendations to promote computational pathology, including clearly defining the field and articulating its value propositions; asserting that the value propositions for health care systems must include means to incorporate robust computational approaches to implement data-driven methods that aid in guiding individual and population health care; leveraging computational pathology as a center for data interpretation in modern health care systems; stating that realizing the value proposition will require working with institutional administrations, other departments, and pathology colleagues; declaring that a robust pipeline should be fostered that trains and develops future computational pathologists, for those with both pathology and non-pathology backgrounds; and deciding that computational pathology should serve as a hub for data-related research in health care systems. The dissemination of these recommendations to pathology and bioinformatics departments should help facilitate the development of computational pathology. PMID:26098131

  19. Computer Code

    NASA Technical Reports Server (NTRS)

    1985-01-01

    COSMIC MINIVER, a computer code developed by NASA for analyzing aerodynamic heating and heat transfer on the Space Shuttle, has been used by Marquardt Company to analyze heat transfer on Navy/Air Force missile bodies. The code analyzes heat transfer by four different methods which can be compared for accuracy. MINIVER saved Marquardt three months in computer time and $15,000.

  20. Computational astrophysics

    NASA Technical Reports Server (NTRS)

    Miller, Richard H.

    1987-01-01

    Astronomy is an area of applied physics in which unusually beautiful objects challenge the imagination to explain observed phenomena in terms of known laws of physics. It is a field that has stimulated the development of physical laws and of mathematical and computational methods. Current computational applications are discussed in terms of stellar and galactic evolution, galactic dynamics, and particle motions.

  1. Recreational Computing.

    ERIC Educational Resources Information Center

    Strot, Melody

    1999-01-01

    Urges teachers of gifted students to allow students unstructured recreational computer time in the classroom to encourage student exploration and discovery, to promote creativity, to develop problem-solving skills, and to allow time to revisit programs and complete their own tasks. Different types of educational computer programs are referenced.…

  2. Computer Insecurity.

    ERIC Educational Resources Information Center

    Wilson, David L.

    1994-01-01

    College administrators recently appealed to students and faculty to change their computer passwords after security experts announced that tens of thousands had been stolen by computer hackers. Federal officials are investigating. Such attacks are not uncommon, but the most effective solutions are either inconvenient or cumbersome. (MSE)

  3. I, Computer

    ERIC Educational Resources Information Center

    Barack, Lauren

    2005-01-01

    What child hasn't chatted with friends through a computer? But chatting with a computer? Some Danish scientists have literally put a face on their latest software program, bringing to virtual life storyteller Hans Christian Andersen, who engages users in actual conversations. The digitized Andersen resides at the Hans Christian Andersen Museum in…

  4. Computer Recreations.

    ERIC Educational Resources Information Center

    Dewdney, A. K.

    1989-01-01

    Discussed are three examples of computer graphics including biomorphs, Truchet tilings, and fractal popcorn. The graphics are shown and the basic algorithm using multiple iteration of a particular function or mathematical operation is described. An illustration of a snail shell created by computer graphics is presented. (YP)

  5. Computer Graphics.

    ERIC Educational Resources Information Center

    Halpern, Jeanne W.

    1970-01-01

    Computer graphics have been called the most exciting development in computer technology. At the University of Michigan, three kinds of graphics output equipment are now being used: symbolic printers, line plotters or drafting devices, and cathode-ray tubes (CRT). Six examples are given that demonstrate the range of graphics use at the University.…

  6. Computer News

    ERIC Educational Resources Information Center

    Science Activities: Classroom Projects and Curriculum Ideas, 2007

    2007-01-01

    This article presents several news stories about computers and technology. (1) Applied Science Associates of Narragansett, Rhode Island is providing computer modeling technology to help locate the remains to the USS Bonhomme Richard, which sank in 1779 after claiming a Revolutionary War victory. (2) Whyville, the leading edu-tainment virtual world…

  7. Grid Computing

    NASA Astrophysics Data System (ADS)

    Foster, Ian

    2001-08-01

    The term "Grid Computing" refers to the use, for computational purposes, of emerging distributed Grid infrastructures: that is, network and middleware services designed to provide on-demand and high-performance access to all important computational resources within an organization or community. Grid computing promises to enable both evolutionary and revolutionary changes in the practice of computational science and engineering based on new application modalities such as high-speed distributed analysis of large datasets, collaborative engineering and visualization, desktop access to computation via "science portals," rapid parameter studies and Monte Carlo simulations that use all available resources within an organization, and online analysis of data from scientific instruments. In this article, I examine the status of Grid computing circa 2000, briefly reviewing some relevant history, outlining major current Grid research and development activities, and pointing out likely directions for future work. I also present a number of case studies, selected to illustrate the potential of Grid computing in various areas of science.

  8. Applied technology center business plan and market survey

    NASA Technical Reports Server (NTRS)

    Hodgin, Robert F.; Marchesini, Roberto

    1990-01-01

    The business plan and market survey for the Applied Technology Center (ATC), a non-profit computer technology transfer and development corporation, are presented. The mission of the ATC is to stimulate innovation in state-of-the-art and leading-edge computer-based technology. The ATC encourages the practical utilization of late-breaking computer technologies by firms of all varieties.

  9. Learning To Use Computers for Future Communication Professions.

    ERIC Educational Resources Information Center

    Hurme, Pertti

    A study examined how to teach computer skills to future professionals in communications. The context of the study was the communications department in a mid-sized Finnish university. Data was collected on computer use and attitudes to computers and computer-mediated communication by means of surveys and learning journals during the Communications…

  10. Computers in Schools of Southeast Texas in 1994.

    ERIC Educational Resources Information Center

    Henderson, David L.; Renfrow, Raylene

    This paper reviews literature on the use of computers at work and home, computer skills needed by new teachers, and suggestions for administrators to support computer usage in schools. A survey of 52 school districts serving the Houston area of southeast Texas is reported, indicating that 22,664 computers were in use, with a mean of 436 computers…

  11. Expanding the View of Preservice Teachers' Computer Literacy: Implications from Written and Verbal Data and Metaphors as Freehand Drawings.

    ERIC Educational Resources Information Center

    Sherry, Annette C.

    2000-01-01

    Examines changes in attitudes towards computers and basic computer skills of preschool teachers participating in a two-year, school-based teacher training program. Written responses to a pre/post administration of the Computer Attitude Survey and computer skills survey, freehand drawings of metaphors expressing computer use, and verbal responses…

  12. Community Perception Survey, 2001.

    ERIC Educational Resources Information Center

    Rasmussen, Patricia; Silverman, Barbara

    This document is a report on the 2001 Community Perception Survey administered by Mt. San Antonio College (SAC) (California). The survey gathered public perception data of SAC services and programs. The survey was mailed to 773 service area community leaders; 160 (21%) responded. Survey results showed that: (1) 70% had knowledge of SAC programs…

  13. ACSI Survey 2014

    Atmospheric Science Data Center

    2014-08-26

    Upcoming EOSDIS Survey   Dear Colleagues,   In the next few days, you will ... on behalf of NASA. This message will ask you to complete a survey for users of NASA Earth science data and services, which includes the ... System (EOSDIS) science data centers evaluated by this survey. The purpose of this survey is to help NASA and the DAACs assess ...

  14. Universal computer test stand (recommended computer test requirements). [for space shuttle computer evaluation

    NASA Technical Reports Server (NTRS)

    1973-01-01

    Techniques are considered which would be used to characterize aerospace computers, with the space shuttle application as the end usage. The system-level digital problems which have been encountered and documented are surveyed. From the large cross section of tests, an optimum set is recommended that has a high probability of discovering documented system-level digital problems within laboratory environments. A baseline hardware/software system, required as a laboratory tool to test aerospace computers, is defined. Hardware and software baselines and the additions necessary to interface the UTE to aerospace computers for test purposes are outlined.

  15. Coal-seismic, desktop computer programs in BASIC; Part 7, Display and compute shear-pair seismograms

    USGS Publications Warehouse

    Hasbrouck, W.P.

    1983-01-01

    Processing of geophysical data taken with the U.S. Geological Survey's coal-seismic system is done with a desk-top, stand-alone computer. Programs for this computer are written in the extended BASIC language utilized by the Tektronix 4051 Graphic System. This report discusses and presents five computer programs used to display and compute shear-pair seismograms.

  16. The AAS Workforce Survey

    NASA Astrophysics Data System (ADS)

    Postman, Marc; Norman, D. J.; Evans, N. R.; Ivie, R.

    2014-01-01

    The AAS Demographics Committee, on behalf of the AAS, was tasked with initiating a biennial survey to improve the Society's ability to serve its members and to inform the community about changes in the community's demographics. A survey, based in part on similar surveys for other scientific societies, was developed in the summer of 2012 and was publicly launched in January 2013. The survey randomly targeted 2500 astronomers who are members of the AAS. The survey was closed 4 months later (April 2013). The response rate was excellent - 63% (1583 people) completed the survey. I will summarize the results from this survey, highlighting key results and plans for their broad dissemination.

  17. Optical computing.

    NASA Technical Reports Server (NTRS)

    Stroke, G. W.

    1972-01-01

    Applications of the optical computer include an approach for increasing the sharpness of images obtained from the most powerful electron microscopes and fingerprint/credit card identification. The information-handling capability of the various optical computing processes is very great. Modern synthetic-aperture radars scan upward of 100,000 resolvable elements per second. Fields which have assumed major importance on the basis of optical computing principles are optical image deblurring, coherent side-looking synthetic-aperture radar, and correlative pattern recognition. Some examples of the most dramatic image deblurring results are shown.
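
    The optical image deblurring highlighted above has a standard digital analogue: inverse filtering in the Fourier domain. The hedged sketch below applies Wiener deconvolution assuming the blur's point-spread function is known; the processors in the article did this with coherent optics, and all data here are invented.

    # Wiener deconvolution: divide out the blur in the Fourier domain,
    # regularized by an assumed signal-to-noise ratio.
    import numpy as np

    def wiener_deblur(blurred, psf, snr=100.0):
        H = np.fft.fft2(psf, s=blurred.shape)            # blur transfer function
        G = np.fft.fft2(blurred)
        W = np.conj(H) / (np.abs(H) ** 2 + 1.0 / snr)    # Wiener filter
        return np.real(np.fft.ifft2(W * G))

    # Tiny demo: blur an impulse with a 3x3 box kernel, then recover it.
    img = np.zeros((32, 32)); img[16, 16] = 1.0
    psf = np.ones((3, 3)) / 9.0
    blurred = np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(psf, s=img.shape)))
    print("restored peak:", round(float(wiener_deblur(blurred, psf).max()), 3))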

  18. Credit Risk or Credit Worthy? College Students and Credit Cards. A National Survey.

    ERIC Educational Resources Information Center

    Education Resources Inst., Boston, MA.

    This report presents findings of a national survey on the use of credit cards by college students. A computer-assisted telephone survey of a stratified random sample resulted in a total of 750 completed surveys. Following a chart summarizing selected earlier studies, the report presents the major findings of this survey: (1) credit card use is a…

  19. Improving radiation survey data using CADD/CAE

    SciTech Connect

    Palau, G.L.; Tarpinian, J.E.

    1987-01-01

    A new application of computer-aided design and drafting (CADD) and computer-aided engineering (CAE) at the Three Mile Island Unit 2 (TMI-2) cleanup is improving the quality of radiation survey data taken in the plant. The use of CADD/CAE-generated survey maps has increased both the accuracy of survey data and the capability to perform analyses with these data. In addition, health physics technician man hours and radiation exposure can be reduced in situations where the CADD/CAE-generated drawings are used for survey mapping.

  20. Computer modelling of minerals

    NASA Astrophysics Data System (ADS)

    Catlow, C. R. A.; Parker, S. C.

    We review briefly the methodology and achievements of computer simulation techniques in modelling structural and defect properties of inorganic solids. Special attention is paid to the role of interatomic potentials in such studies. We discuss the extension of the techniques to the modelling of minerals, and describe recent results on the study of structural properties of silicates. In a paper of this length, it is not possible to give a comprehensive survey of this field. We shall concentrate on the recent work of our own group. The reader should consult Tossell (1977), Gibbs (1982), and Busing (1970) for examples of other computational studies of inorganic solids. The techniques we discuss are all based on the principle of energy minimization. Simpler, "bridge-building" procedures based on known bond-lengths, of which distance least squares (DLS) techniques are the best known, are discussed, for example, in Dempsey and Strens (1974).
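
    A minimal sketch of the energy-minimization principle the review describes, assuming a toy Buckingham-plus-Coulomb pair potential: find the ion-pair separation of lowest energy. The parameters below are illustrative inventions, not a fitted mineral potential.

    import math

    A, RHO, C = 1822.0, 0.32, 0.0   # hypothetical Buckingham parameters (eV, Angstrom)
    Q1, Q2 = +2.0, -2.0             # formal charges
    KE = 14.399645                  # e^2/(4*pi*eps0) in eV*Angstrom

    def pair_energy(r):
        return A * math.exp(-r / RHO) - C / r ** 6 + KE * Q1 * Q2 / r

    # Golden-section search for the minimum of the (unimodal) energy curve.
    lo, hi = 1.0, 4.0
    g = (math.sqrt(5) - 1) / 2
    while hi - lo > 1e-6:
        r1, r2 = hi - g * (hi - lo), lo + g * (hi - lo)
        if pair_energy(r1) < pair_energy(r2):
            hi = r2
        else:
            lo = r1
    print(f"minimum-energy separation ~ {0.5 * (lo + hi):.3f} Angstrom")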

  1. A search for stratiform massive-sulfide exploration targets in Appalachian Devonian rocks; a case study using computer-assisted attribute-coincidence mapping

    USGS Publications Warehouse

    Wedow, Helmuth

    1983-01-01

    The empirical model for sediment-associated, stratiform, exhalative, massive-sulfide deposits presented by D. Large in 1979 and 1980 has been redesigned to permit its use in a computer-assisted search for exploration-target areas in Devonian rocks of the Appalachian region using attribute-coincidence mapping (ACM). Some 36 gridded-data maps, and selected maps derived therefrom, were developed to show, using the 7-1/2 minute quadrangle as the information cell, the patterns of geologic data relevant to the empirical model. From these map and data files, six attribute-coincidence maps were prepared to illustrate both variation in the application of ACM techniques and the extent of possible significant exploration-target areas. As a result of this preliminary work in ACM, four major (and some lesser) exploration-target areas needing further study and analysis have been defined as follows: 1) in western and central New York in the outcrop area of lowermost Upper Devonian rocks straddling the Clarendon-Linden fault; 2) in western Virginia and eastern West Virginia in an area largely coincident with the well-known 'Oriskany' Mn-Fe ores; 3) an area in West Virginia, Maryland, and Virginia along and nearby the trend of the Alabama-New York lineament of King and Zietz, approximately between 38 and 40 degrees N. latitude; and 4) an area in northeastern Ohio overlying an area coincident with a significant thickness of Silurian salt and high modern seismic activity. Some lesser, smaller areas suggested by relatively high coincidence may also be worthy of further study.
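
    A hedged sketch of the coincidence scoring at the heart of ACM as the abstract describes it: each quadrangle is a cell, each gridded-data map contributes one attribute, and a cell's score is the number of model-relevant attributes it exhibits. The attribute names and cells below are invented.

    # Count, per map cell, how many attribute maps flag that cell.
    attribute_maps = {
        "black_shale_present":   {(0, 0), (0, 1), (1, 1)},
        "syndepositional_fault": {(0, 1), (1, 1), (2, 2)},
        "anomalous_mn_fe":       {(0, 1), (2, 2)},
    }

    def coincidence(maps):
        score = {}
        for present in maps.values():
            for cell in present:
                score[cell] = score.get(cell, 0) + 1
        return score

    # Cells with the highest coincidence are the candidate exploration targets.
    for cell, s in sorted(coincidence(attribute_maps).items(), key=lambda kv: -kv[1]):
        print(cell, s)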

  2. Computer Science Research at Langley

    NASA Technical Reports Server (NTRS)

    Voigt, S. J. (Editor)

    1982-01-01

    A workshop was held at Langley Research Center, November 2-5, 1981, to highlight ongoing computer science research at Langley and to identify additional areas of research based upon the computer user requirements. A panel discussion was held in each of nine application areas, and these are summarized in the proceedings. Slides presented by the invited speakers are also included. A survey of scientific, business, data reduction, and microprocessor computer users helped identify areas of focus for the workshop. Several areas of computer science which are of most concern to the Langley computer users were identified during the workshop discussions. These include graphics, distributed processing, programmer support systems and tools, database management, and numerical methods.

  3. Evolutionary Computing

    SciTech Connect

    Patton, Robert M; Cui, Xiaohui; Jiao, Yu; Potok, Thomas E

    2008-01-01

    The rate at which information overwhelms humans is significantly greater than the rate at which humans have learned to process, analyze, and leverage it. To overcome this challenge, new methods of computing must be formulated, and scientists and engineers have looked to nature for inspiration in developing these new methods. Consequently, evolutionary computing has emerged as a new paradigm for computing, and has rapidly demonstrated its ability to solve real-world problems where traditional techniques have failed. This field of work has now become quite broad and encompasses areas ranging from artificial life to neural networks. This chapter focuses specifically on two sub-areas of nature-inspired computing: Evolutionary Algorithms and Swarm Intelligence.
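
    A minimal, hedged instance of the evolutionary-algorithm paradigm the chapter covers: a (1+1) evolution strategy that improves a candidate solution by mutation and survival of the fitter. The objective function and parameters are invented for illustration.

    import random

    def fitness(x):                               # toy objective: peak at x = 3
        return -(x - 3.0) ** 2

    parent = random.uniform(-10, 10)
    for _ in range(200):
        child = parent + random.gauss(0.0, 0.5)  # mutation
        if fitness(child) >= fitness(parent):    # selection
            parent = child
    print(f"best solution ~ {parent:.3f} (optimum 3.0)")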

  4. Computer Stimulation

    ERIC Educational Resources Information Center

    Moore, John W.; Moore, Elizabeth

    1977-01-01

    Discusses computer simulation approach of Limits to Growth, in which interactions of five variables (population, pollution, resources, food per capita, and industrial output per capita) indicate status of the world. Reviews other books that predict future of the world. (CS)

  5. 23 CFR 1340.9 - Computation of estimates.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... OBSERVATIONAL SURVEYS OF SEAT BELT USE Survey Design Requirements § 1340.9 Computation of estimates. (a) Data... used, without exclusion, in the computation of the Statewide seat belt use rate, standard error, and nonresponse rate. (b) Data editing. Known values of data contributing to the Statewide seat belt use...

  6. 23 CFR 1340.9 - Computation of estimates.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... OBSERVATIONAL SURVEYS OF SEAT BELT USE Survey Design Requirements § 1340.9 Computation of estimates. (a) Data... used, without exclusion, in the computation of the Statewide seat belt use rate, standard error, and nonresponse rate. (b) Data editing. Known values of data contributing to the Statewide seat belt use...

  7. 23 CFR 1340.9 - Computation of estimates.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... OBSERVATIONAL SURVEYS OF SEAT BELT USE Survey Design Requirements § 1340.9 Computation of estimates. (a) Data... used, without exclusion, in the computation of the Statewide seat belt use rate, standard error, and nonresponse rate. (b) Data editing. Known values of data contributing to the Statewide seat belt use...
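
    The three CFR excerpts above all require computing a statewide seat belt use rate, its standard error, and a nonresponse rate from observational site data. A hedged illustration of the weighted use-rate part, with invented site weights and counts (an actual survey would use its documented design weights and variance estimator):

    import math

    # (design weight, belted occupants, total occupants) per observation site
    sites = [(1.2, 180, 200), (0.8, 150, 170), (1.5, 90, 120)]

    wsum = sum(w * n for w, _, n in sites)
    rate = sum(w * b for w, b, _ in sites) / wsum          # weighted use rate
    # Crude linearized variance across sites -- illustrative only.
    resid = [(w * n / wsum) * (b / n - rate) for w, b, n in sites]
    se = math.sqrt(len(sites) / (len(sites) - 1) * sum(r * r for r in resid))
    print(f"statewide use rate {rate:.3f}, SE ~ {se:.3f}")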

  8. Computers and Media Centers: Services, Satisfaction, and Cost Effectiveness.

    ERIC Educational Resources Information Center

    Givens, Patsy B.

    A survey was conducted of school media centers throughout the United States to determine: (1) how computers are being utilized by these centers, (2) the levels of satisfaction with present services, and (3) whether or not the services being provided by the computer are cost effective. Responses to survey forms returned by 20 school districts and…

  9. LHC Computing

    SciTech Connect

    Lincoln, Don

    2015-07-28

    The LHC is the world’s highest energy particle accelerator and scientists use it to record an unprecedented amount of data. This data is recorded in electronic format and it requires an enormous computational infrastructure to convert the raw data into conclusions about the fundamental rules that govern matter. In this video, Fermilab’s Dr. Don Lincoln gives us a sense of just how much data is involved and the incredible computer resources that makes it all possible.

  10. Advanced computing

    NASA Technical Reports Server (NTRS)

    1991-01-01

    Advanced concepts in hardware, software and algorithms are being pursued for application in next generation space computers and for ground based analysis of space data. The research program focuses on massively parallel computation and neural networks, as well as optical processing and optical networking which are discussed under photonics. Also included are theoretical programs in neural and nonlinear science, and device development for magnetic and ferroelectric memories.

  11. Computing in College Courses: The Dartmouth Experience.

    ERIC Educational Resources Information Center

    Cohen, Peter A.

    In order to assess faculty and student use of, and attitudes toward, instructional computing at Dartmouth College, faculty members and students were surveyed during the spring of 1981 to determine how computing was being used in Dartmouth courses. Each of the 450 faculty members who taught an on-campus course from summer 1980 through spring 1981…

  12. Chromatin Computation

    PubMed Central

    Bryant, Barbara

    2012-01-01

    In living cells, DNA is packaged along with protein and RNA into chromatin. Chemical modifications to nucleotides and histone proteins are added, removed and recognized by multi-functional molecular complexes. Here I define a new computational model, in which chromatin modifications are information units that can be written onto a one-dimensional string of nucleosomes, analogous to the symbols written onto cells of a Turing machine tape, and chromatin-modifying complexes are modeled as read-write rules that operate on a finite set of adjacent nucleosomes. I illustrate the use of this “chromatin computer” to solve an instance of the Hamiltonian path problem. I prove that chromatin computers are computationally universal – and therefore more powerful than the logic circuits often used to model transcription factor control of gene expression. Features of biological chromatin provide a rich instruction set for efficient computation of nontrivial algorithms in biological time scales. Modeling chromatin as a computer shifts how we think about chromatin function, suggests new approaches to medical intervention, and lays the groundwork for the engineering of a new class of biological computing machines. PMID:22567109
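
    A hedged toy rendering of the model just described: a string of nucleosomes carries marks, and a "chromatin-modifying complex" is a rule that reads a window of adjacent nucleosomes and rewrites it. The single rule below spreads a methyl mark rightward, a cartoon of write-and-propagate rather than the paper's Hamiltonian-path construction.

    def apply_rules(tape, rules, passes=1):
        tape = list(tape)
        for _ in range(passes):
            for i in range(len(tape) - 1):
                for pattern, rewrite in rules:
                    if tape[i:i + len(pattern)] == list(pattern):
                        tape[i:i + len(pattern)] = list(rewrite)
        return "".join(tape)

    # 'M' = methylated nucleosome, 'u' = unmodified. The rule converts an
    # unmodified right neighbour; because the scan re-matches as it moves
    # right, one pass spreads the mark across the whole fiber.
    print(apply_rules("Muuuuu", [("Mu", "MM")]))   # -> MMMMMM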

  13. Computational chemistry

    NASA Technical Reports Server (NTRS)

    Arnold, J. O.

    1987-01-01

    With the advent of supercomputers, modern computational chemistry algorithms and codes, a powerful tool was created to help fill NASA's continuing need for information on the properties of matter in hostile or unusual environments. Computational resources provided under the National Aerodynamics Simulator (NAS) program were a cornerstone for recent advancements in this field. Properties of gases, materials, and their interactions can be determined from solutions of the governing equations. In the case of gases, for example, radiative transition probabilites per particle, bond-dissociation energies, and rates of simple chemical reactions can be determined computationally as reliably as from experiment. The data are proving to be quite valuable in providing inputs to real-gas flow simulation codes used to compute aerothermodynamic loads on NASA's aeroassist orbital transfer vehicles and a host of problems related to the National Aerospace Plane Program. Although more approximate, similar solutions can be obtained for ensembles of atoms simulating small particles of materials with and without the presence of gases. Computational chemistry has application in studying catalysis, properties of polymers, all of interest to various NASA missions, including those previously mentioned. In addition to discussing these applications of computational chemistry within NASA, the governing equations and the need for supercomputers for their solution is outlined.

  14. 75 FR 52508 - Proposed Information Collection; Comment Request; Information and Communication Technology Survey

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-26

    ... Census Bureau Proposed Information Collection; Comment Request; Information and Communication Technology... 2012 Information and Communication Technology Survey (ICTS). The annual survey collects data on two... of information and communication technology equipment and software (computers and...

  15. Computational structures for robotic computations

    NASA Technical Reports Server (NTRS)

    Lee, C. S. G.; Chang, P. R.

    1987-01-01

    The computational problem of the inverse kinematics and inverse dynamics of robot manipulators, approached by taking advantage of parallelism and pipelining architectures, is discussed. For the computation of the inverse kinematic position solution, a maximum pipelined CORDIC architecture has been designed based on a functional decomposition of the closed-form joint equations. For the inverse dynamics computation, an efficient p-fold parallel algorithm that overcomes the recurrence problem of the Newton-Euler equations of motion and achieves the time lower bound of O(log_2 n) has also been developed.
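
    The O(log_2 n) bound rests on a standard idea worth making concrete: a first-order linear recurrence x[i] = a[i]*x[i-1] + b[i] can be evaluated by a prefix scan over compositions of affine maps, which is associative and therefore parallelizable in ceil(log_2 n) rounds. A hedged sketch (serial reference scan; a parallel machine would combine pairs tree-fashion):

    def compose(f, g):
        # f, g are affine maps t -> a*t + b, stored as (a, b); returns f o g.
        (a1, b1), (a2, b2) = f, g
        return (a1 * a2, a1 * b2 + b1)

    def scan(maps):
        out, acc = [], (1.0, 0.0)        # start from the identity map
        for m in maps:
            acc = compose(m, acc)        # associative, hence tree-reducible
            out.append(acc)
        return out

    a, b, x0 = [0.5, 2.0, 1.5, 0.25], [1.0, -1.0, 0.5, 2.0], 3.0
    xs = [A * x0 + B for A, B in scan(list(zip(a, b)))]
    print(xs)   # matches unrolling the recurrence term by term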

  16. Simulator for Microlens Planet Surveys

    NASA Astrophysics Data System (ADS)

    Ipatov, Sergei I.; Horne, Keith; Alsubai, Khalid A.; Bramich, Daniel M.; Dominik, Martin; Hundertmark, Markus P. G.; Liebig, Christine; Snodgrass, Colin D. B.; Street, Rachel A.; Tsapras, Yiannis

    2014-04-01

    We summarize the status of a computer simulator for microlens planet surveys. The simulator generates synthetic light curves of microlensing events observed with specified networks of telescopes over specified periods of time. Particular attention is paid to models for sky brightness and seeing, calibrated by fitting to data from the OGLE survey and RoboNet observations in 2011. Time intervals during which events are observable are identified by accounting for positions of the Sun and the Moon, and other restrictions on telescope pointing. Simulated observations are then generated for an algorithm that adjusts target priorities in real time with the aim of maximizing planet detection zone area summed over all the available events. The exoplanet detection capability of observations was compared for several telescopes.
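
    A hedged sketch of the kind of real-time prioritization described above: greedily give the next observing slot to whichever event currently offers the largest marginal gain in detection-zone area. The diminishing-returns gain model and event values are invented, not the simulator's actual algorithm.

    events = {"ev1": 40.0, "ev2": 25.0, "ev3": 60.0}   # base zone-area gains
    visits = {name: 0 for name in events}

    def marginal_gain(name):
        return events[name] / (1 + visits[name])       # diminishing returns

    schedule = []
    for _ in range(6):                                 # six observing slots
        best = max(events, key=marginal_gain)
        schedule.append(best)
        visits[best] += 1
    print(schedule)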

  17. PPIC Statewide Survey - Special Survey on Education

    ERIC Educational Resources Information Center

    Baldassare, Mark

    2005-01-01

    The PPIC Statewide Survey series provides policymakers, the media, and the general public with objective, advocacy-free information on the perceptions, opinions, and public policy preferences of California residents. Inaugurated in April 1998, the survey series has generated a database that includes the responses of more than 114,000 Californians.…

  18. High resolution survey for topographic surveying

    NASA Astrophysics Data System (ADS)

    Luh, L. C.; Setan, H.; Majid, Z.; Chong, A. K.; Tan, Z.

    2014-02-01

    In this decade, the terrestrial laser scanner (TLS) has become popular in many fields, such as reconstruction, monitoring, surveying, as-built surveys of facilities, archaeology, and topographic surveying. This is due to its high speed of data collection, about 50,000 to 1,000,000 three-dimensional (3D) points per second at high accuracy. The main advantage of the 3D representation of the data is that it is a closer approximation of the real world. Therefore, the aim of this paper is to show the use of High-Definition Surveying (HDS), also known as 3D laser scanning, for topographic survey. This research investigates the effectiveness of using a terrestrial laser scanning system for topographic survey by carrying out a field test at Universiti Teknologi Malaysia (UTM), Skudai, Johor. The 3D laser scanner used in this study is a Leica ScanStation C10. Data acquisition was carried out by applying the traversing method. In this study, the result of the topographic survey meets first-class survey standards. At the completion of this study, a standard procedure was proposed for topographic data acquisition using laser scanning systems. This proposed procedure serves as a guideline for users who wish to utilize laser scanning systems in topographic surveys.

  19. 2012 Corporate Recruiters Survey. Survey Report

    ERIC Educational Resources Information Center

    Estrada, Rebecca

    2012-01-01

    This paper presents the results from the 2012 Corporate Recruiters Survey conducted by the Graduate Management Admission Council[R] (GMAC[R]). Conducted annually since 2001, this survey examines the job outlook for recent graduate business students as well as employer needs and expectations. The objectives of this study are to obtain a picture of…

  20. Corporate Recruiters Survey, 2011. Survey Report

    ERIC Educational Resources Information Center

    Edgington, Rachel

    2011-01-01

    In this report, the Graduate Management Admission Council[R] (GMAC[R]) presents the results from the 2011 Corporate Recruiters Survey. Conducted annually since 2001, this survey examines the job outlook for recent graduate business students as well as employer needs and expectations. The objectives of this study are to obtain a picture of the…

  1. Aerial radiation surveys

    SciTech Connect

    Jobst, J.

    1980-01-01

    A recent aerial radiation survey of the surroundings of the Vitro mill in Salt Lake City shows that uranium mill tailings have been removed to many locations outside their original boundary. To date, 52 remote sites have been discovered within a 100 square kilometer aerial survey perimeter surrounding the mill; 9 of these were discovered with the recent aerial survey map. Five additional sites, also discovered by aerial survey, contained uranium ore, milling equipment, or radioactive slag. Because of the success of this survey, plans are being made to extend the aerial survey program to other parts of the Salt Lake valley where diversions of Vitro tailings are also known to exist.

  2. [DNA computing].

    PubMed

    Błasiak, Janusz; Krasiński, Tadeusz; Popławski, Tomasz; Sakowski, Sebastian

    2011-01-01

    Biocomputers can be an alternative to traditional "silicon-based" computers, whose continued development may be limited by further miniaturization (constrained by the Heisenberg uncertainty principle) and by the growing volume of information transferred between the central processing unit and the main memory (the von Neumann bottleneck). The idea of DNA computing came true for the first time in 1994, when Adleman solved the Hamiltonian path problem using short DNA oligomers and DNA ligase. In the early 2000s, a series of biocomputer models was presented, with seminal work by Shapiro and his colleagues, who presented a molecular two-state finite automaton in which the restriction enzyme FokI constituted the hardware and short DNA oligomers served as the software as well as the input/output signals. DNA molecules also provided energy for this machine. DNA computing can be exploited in many applications, from studies of gene expression patterns to the diagnosis and therapy of cancer. Research on DNA computing continues both in vitro and in vivo, and its promising results give hope for a breakthrough in computer science. PMID:21735816
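
    A hedged silicon caricature of Adleman's 1994 experiment mentioned above: where the DNA ligation step generates candidate paths massively in parallel, the sketch below simply enumerates them, then filters for paths that visit every vertex exactly once. The directed graph is invented.

    import itertools

    edges = {(0, 1), (0, 2), (1, 2), (1, 3), (2, 3), (3, 4), (2, 4)}
    n = 5   # vertices 0..4; seek a Hamiltonian path from vertex 0 to vertex 4

    def is_hamiltonian(path):
        return (path[0] == 0 and path[-1] == n - 1 and
                all((u, v) in edges for u, v in zip(path, path[1:])))

    print([p for p in itertools.permutations(range(n)) if is_hamiltonian(p)])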

  3. Computational mechanics

    SciTech Connect

    Goudreau, G.L.

    1993-03-01

    The Computational Mechanics thrust area sponsors research into the underlying solid, structural and fluid mechanics and heat transfer necessary for the development of state-of-the-art general purpose computational software. The scale of computational capability spans office workstations, departmental computer servers, and Cray-class supercomputers. The DYNA, NIKE, and TOPAZ codes have achieved world fame through our broad collaborators program, in addition to their strong support of on-going Lawrence Livermore National Laboratory (LLNL) programs. Several technology transfer initiatives have been based on these established codes, teaming LLNL analysts and researchers with counterparts in industry, extending code capability to specific industrial interests of casting, metalforming, and automobile crash dynamics. The next-generation solid/structural mechanics code, ParaDyn, is targeted toward massively parallel computers, which will extend performance from gigaflop to teraflop power. Our work for FY-92 is described in the following eight articles: (1) Solution Strategies: New Approaches for Strongly Nonlinear Quasistatic Problems Using DYNA3D; (2) Enhanced Enforcement of Mechanical Contact: The Method of Augmented Lagrangians; (3) ParaDyn: New Generation Solid/Structural Mechanics Codes for Massively Parallel Processors; (4) Composite Damage Modeling; (5) HYDRA: A Parallel/Vector Flow Solver for Three-Dimensional, Transient, Incompressible Viscous Flow; (6) Development and Testing of the TRIM3D Radiation Heat Transfer Code; (7) A Methodology for Calculating the Seismic Response of Critical Structures; and (8) Reinforced Concrete Damage Modeling.

  4. College Student Notions of Computer Science

    NASA Astrophysics Data System (ADS)

    Ruslanov, Anatole D.; Yolevich, Andrew P.

    2011-08-01

    Two surveys of college students were conducted to study the students' perceptions and knowledge of computer science as a profession and as a career. Ignorance of the field was consistently observed in both samples. Students with an aptitude for computing tend to blame their high schools, media, and society for their lack of knowledge. These findings suggest that high school students need to be provided with a more balanced perspective on computing.

  5. Computational psychiatry.

    PubMed

    Wang, Xiao-Jing; Krystal, John H

    2014-11-01

    Psychiatric disorders such as autism and schizophrenia arise from abnormalities in brain systems that underlie cognitive, emotional, and social functions. The brain is enormously complex and its abundant feedback loops on multiple scales preclude intuitive explication of circuit functions. In close interplay with experiments, theory and computational modeling are essential for understanding how, precisely, neural circuits generate flexible behaviors and how their impairments give rise to psychiatric symptoms. This Perspective highlights recent progress in applying computational neuroscience to the study of mental disorders. We outline basic approaches, including identification of core deficits that cut across disease categories, biologically realistic modeling bridging cellular and synaptic mechanisms with behavior, and model-aided diagnosis. The need for new research strategies in psychiatry is urgent. Computational psychiatry potentially provides powerful tools for elucidating pathophysiology that may inform both diagnosis and treatment. To achieve this promise will require investment in cross-disciplinary training and research in this nascent field. PMID:25442941

  6. Computational Psychiatry

    PubMed Central

    Wang, Xiao-Jing; Krystal, John H.

    2014-01-01

    Psychiatric disorders such as autism and schizophrenia arise from abnormalities in brain systems that underlie cognitive, emotional and social functions. The brain is enormously complex and its abundant feedback loops on multiple scales preclude intuitive explication of circuit functions. In close interplay with experiments, theory and computational modeling are essential for understanding how, precisely, neural circuits generate flexible behaviors and how their impairments give rise to psychiatric symptoms. This Perspective highlights recent progress in applying computational neuroscience to the study of mental disorders. We outline basic approaches, including identification of core deficits that cut across disease categories, biologically-realistic modeling bridging cellular and synaptic mechanisms with behavior, and model-aided diagnosis. The need for new research strategies in psychiatry is urgent. Computational psychiatry potentially provides powerful tools for elucidating pathophysiology that may inform both diagnosis and treatment. To achieve this promise will require investment in cross-disciplinary training and research in this nascent field. PMID:25442941

  7. Computational mechanics

    SciTech Connect

    Raboin, P J

    1998-01-01

    The Computational Mechanics thrust area is a vital and growing facet of the Mechanical Engineering Department at Lawrence Livermore National Laboratory (LLNL). This work supports the development of computational analysis tools in the areas of structural mechanics and heat transfer. Over 75 analysts depend on thrust area-supported software running on a variety of computing platforms to meet the demands of LLNL programs. Interactions with the Department of Defense (DOD) High Performance Computing and Modernization Program and the Defense Special Weapons Agency are of special importance as they support our ParaDyn project in its development of new parallel capabilities for DYNA3D. Working with DOD customers has been invaluable to driving this technology in directions mutually beneficial to the Department of Energy. Other projects associated with the Computational Mechanics thrust area include work with the Partnership for a New Generation Vehicle (PNGV) for "Springback Predictability" and with the Federal Aviation Administration (FAA) for the "Development of Methodologies for Evaluating Containment and Mitigation of Uncontained Engine Debris." In this report for FY-97, there are five articles detailing three code development activities and two projects that synthesized new code capabilities with new analytic research in damage/failure and biomechanics. The articles this year are: (1) Energy- and Momentum-Conserving Rigid-Body Contact for NIKE3D and DYNA3D; (2) Computational Modeling of Prosthetics: A New Approach to Implant Design; (3) Characterization of Laser-Induced Mechanical Failure Damage of Optical Components; (4) Parallel Algorithm Research for Solid Mechanics Applications Using Finite Element Analysis; and (5) An Accurate One-Step Elasto-Plasticity Algorithm for Shell Elements in DYNA3D.

  8. Computer viruses

    NASA Technical Reports Server (NTRS)

    Denning, Peter J.

    1988-01-01

    The worm, Trojan horse, bacterium, and virus are destructive programs that attack information stored in a computer's memory. Virus programs, which propagate by incorporating copies of themselves into other programs, are a growing menace in the late-1980s world of unprotected, networked workstations and personal computers. Limited immunity is offered by memory protection hardware, digitally authenticated object programs, and antibody programs that kill specific viruses. Additional immunity can be gained from the practice of digital hygiene, primarily the refusal to use software from untrusted sources. Full immunity requires attention in a social dimension, the accountability of programmers.

  9. Computer systems

    NASA Technical Reports Server (NTRS)

    Olsen, Lola

    1992-01-01

    In addition to the discussions, Ocean Climate Data Workshop hosts gave participants an opportunity to hear about, see, and test for themselves some of the latest computer tools now available for those studying climate change and the oceans. Six speakers described computer systems and their functions. The introductory talks were followed by demonstrations to small groups of participants and some opportunities for participants to get hands-on experience. After this familiarization period, attendees were invited to return during the course of the Workshop and have one-on-one discussions and further hands-on experience with these systems. Brief summaries or abstracts of introductory presentations are addressed.

  10. Quantum Computation

    NASA Astrophysics Data System (ADS)

    Ekert, Artur

    1994-08-01

    As computers become faster they must become smaller because of the finiteness of the speed of light. The history of computer technology has involved a sequence of changes from one type of physical realisation to another - from gears to relays to valves to transistors to integrated circuits and so on. Quantum mechanics is already important in the design of microelectronic components. Soon it will be necessary to harness quantum mechanics rather than simply take it into account, and at that point it will be possible to give data processing devices new functionality.

  11. Survey of Rural Information Infrastructure Technologies.

    ERIC Educational Resources Information Center

    Allen, Kenneth C.; And Others

    Communication and information technologies can reduce the barriers of distance and space that disadvantage rural areas. This report defines a set of distinct voice, computer, and video telecommunication services; describes several rural information applications that make use of these services; and surveys various wireline and wireless systems and…

  12. A survey of big data research

    PubMed Central

    Fang, Hua; Zhang, Zhaoyang; Wang, Chanpaul Jin; Daneshmand, Mahmoud; Wang, Chonggang; Wang, Honggang

    2015-01-01

    Big data create values for business and research, but pose significant challenges in terms of networking, storage, management, analytics and ethics. Multidisciplinary collaborations from engineers, computer scientists, statisticians and social scientists are needed to tackle, discover and understand big data. This survey presents an overview of big data initiatives, technologies and research in industries and academia, and discusses challenges and potential solutions. PMID:26504265

  13. Third Annual Survey of the Profession.

    ERIC Educational Resources Information Center

    Dugger, William E., Jr.; And Others

    1987-01-01

    Reports results of "School Shop's" annual survey of teachers of technology and vocational education. Questions centered on (1) information about respondents, (2) data on schools and programs, and (3) opinions about strengths, weaknesses, and problems. Results indicate that robotics and computers are among the fastest growing programs. (CH)

  14. Topics in Research Methods: Survey Sampling.

    ERIC Educational Resources Information Center

    Cook, Tony; Rushton, Brian S.

    1984-01-01

    Reviews a computer-assisted learning package (available from CONDUIT) which introduces survey and sampling techniques by pretending that the user is a pollster asking one of six questions of a more or less political nature. Documentation and performance are rated fair while ease of use is considered excellent. (JN)

  15. Use of multispectral data in design of forest sample surveys

    NASA Technical Reports Server (NTRS)

    Titus, S. J.; Wensel, L. C.

    1977-01-01

    The use of multispectral data in design of forest sample surveys using a computer software package is described. The system allows evaluation of a number of alternative sampling systems and, with appropriate cost data, estimates the implementation cost for each.
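
    One computation such a survey-design package might perform (a hedged sketch, not the package's documented method) is cost-weighted optimum allocation of plots across multispectral strata, with n_h proportional to N_h * S_h / sqrt(c_h). Stratum sizes, standard deviations, and unit costs below are invented.

    import math

    strata = {               # name: (N_h, S_h, cost per plot c_h)
        "conifer":  (4000, 12.0, 1.0),
        "hardwood": (2500,  8.0, 1.5),
        "mixed":    (1500, 15.0, 2.0),
    }
    total_n = 300            # plots the budget allows overall

    weights = {h: N * S / math.sqrt(c) for h, (N, S, c) in strata.items()}
    wsum = sum(weights.values())
    for h, w in weights.items():
        print(f"{h:>9}: n_h = {round(total_n * w / wsum)}")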

  16. Water Use: A Survey

    ERIC Educational Resources Information Center

    Fleming, Rose Glee; Warden, Jessie

    1976-01-01

    A survey of Florida State University students showed that their current laundry practices result in over-consumption of energy and water. The survey also produced some concrete suggestions to the students that would improve their conservation practices. (Author/BP)

  17. National Health Care Survey

    Cancer.gov

    This survey encompasses a family of health care provider surveys, including information about the facilities that supply health care, the services rendered, and the characteristics of the patients served.

  18. Distributed GPU Computing in GIScience

    NASA Astrophysics Data System (ADS)

    Jiang, Y.; Yang, C.; Huang, Q.; Li, J.; Sun, M.

    2013-12-01

    Geoscientists strive to discover potential principles and patterns hidden inside ever-growing Big Data for scientific discoveries. To better achieve this objective, more capable computing resources are required to process, analyze and visualize Big Data (Ferreira et al., 2003; Li et al., 2013). Current CPU-based computing techniques cannot promptly meet the computing challenges posed by the increasing volume of datasets from different domains, such as social media, earth observation, and environmental sensing (Li et al., 2013). Meanwhile, CPU-based computing resources structured as clusters or supercomputers are costly. In the past several years, as GPU-based technology has matured in both capability and performance, GPU-based computing has emerged as a new computing paradigm. Compared to the traditional microprocessor, the modern GPU, as a compelling alternative, offers outstanding parallel processing capability with cost-effectiveness and efficiency (Owens et al., 2008), although it was initially designed for graphics rendering in the visualization pipeline. This presentation reports a distributed GPU computing framework for integrating GPU-based computing within a distributed environment. Within this framework, 1) for each single computer, both GPU-based and CPU-based computing resources can be fully utilized to improve the performance of visualizing and processing Big Data; 2) within a network environment, a variety of computers can be used to build up a virtual supercomputer to support CPU-based and GPU-based computing in a distributed computing environment; and 3) GPUs, as graphics-targeted devices, are used to greatly improve the rendering efficiency in distributed geo-visualization, especially for 3D/4D visualization. Key words: Geovisualization, GIScience, Spatiotemporal Studies. Reference: Ferreira de Oliveira, M. C., & Levkowitz, H. (2003). From visual data exploration to visual data mining: A survey. IEEE Transactions on Visualization and Computer Graphics.
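
    A hedged sketch of the premise that data-parallel geoscience workloads map naturally onto GPUs: an element-wise raster computation written once against the NumPy API. On a GPU-equipped node the same lines can run under CuPy, which mirrors much of NumPy's interface; distributing the work across machines would layer scheduling on top, as the abstract proposes.

    import numpy as np   # on a GPU node, one might instead import cupy as np

    def ndvi(red, nir):
        # Classic remote-sensing index; purely element-wise, so it parallelizes.
        return (nir - red) / (nir + red + 1e-9)

    red = np.random.rand(2048, 2048).astype(np.float32)
    nir = np.random.rand(2048, 2048).astype(np.float32)
    print(float(ndvi(red, nir).mean()))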

  19. Kiso Supernova Survey (KISS): Survey strategy

    NASA Astrophysics Data System (ADS)

    Morokuma, Tomoki; Tominaga, Nozomu; Tanaka, Masaomi; Mori, Kensho; Matsumoto, Emiko; Kikuchi, Yuki; Shibata, Takumi; Sako, Shigeyuki; Aoki, Tsutomu; Doi, Mamoru; Kobayashi, Naoto; Maehara, Hiroyuki; Matsunaga, Noriyuki; Mito, Hiroyuki; Miyata, Takashi; Nakada, Yoshikazu; Soyano, Takao; Tarusawa, Ken'ichi; Miyazaki, Satoshi; Nakata, Fumiaki; Okada, Norio; Sarugaku, Yuki; Richmond, Michael W.; Akitaya, Hiroshi; Aldering, Greg; Arimatsu, Ko; Contreras, Carlos; Horiuchi, Takashi; Hsiao, Eric Y.; Itoh, Ryosuke; Iwata, Ikuru; Kawabata, Koji S.; Kawai, Nobuyuki; Kitagawa, Yutaro; Kokubo, Mitsuru; Kuroda, Daisuke; Mazzali, Paolo; Misawa, Toru; Moritani, Yuki; Morrell, Nidia; Okamoto, Rina; Pavlyuk, Nikolay; Phillips, Mark M.; Pian, Elena; Sahu, Devendra; Saito, Yoshihiko; Sano, Kei; Stritzinger, Maximilian D.; Tachibana, Yutaro; Taddia, Francesco; Takaki, Katsutoshi; Tateuchi, Ken; Tomita, Akihiko; Tsvetkov, Dmitry; Ui, Takahiro; Ukita, Nobuharu; Urata, Yuji; Walker, Emma S.; Yoshii, Taketoshi

    2014-12-01

    The Kiso Supernova Survey (KISS) is a high-cadence optical wide-field supernova (SN) survey. The primary goal of the survey is to catch the very early light of a SN, during the shock breakout phase. Detection of SN shock breakouts combined with multi-band photometry obtained with other facilities would provide detailed physical information on the progenitor stars of SNe. The survey is performed using a 2.2° × 2.2° field-of-view instrument on the 1.05-m Kiso Schmidt telescope, the Kiso Wide Field Camera (KWFC). We take a 3-min exposure in g-band once every hour in our survey, reaching magnitude g ~ 20-21. About 100 nights of telescope time per year have been spent on the survey since 2012 April. The number of shock breakout detections is estimated to be of the order of 1 during our three-year project. This paper summarizes the KISS project including the KWFC observing setup, the survey strategy, the data reduction system, and CBET-reported SNe discovered so far by KISS.

  20. Optical & NIR Transient Surveys

    NASA Astrophysics Data System (ADS)

    Cross, Nicholas J. G.; Djorgovski, S. G.

    2012-04-01

    A workshop on Optical & Near Infrared Transients took place during the first afternoon of the Symposium. It ran for two sessions. The first was given over to talks about various current optical and near-infrared transient surveys, focussing on the Vista surveys, the Catalina Real-Time Transient Survey, Pan-STARRS, Gaia, TAOS and TAOS2. The second session was a panel-led discussion about coordinating multi-wavelength surveys and associated follow-ups.

  1. A survey of aftbody flow prediction methods

    NASA Technical Reports Server (NTRS)

    Putnam, L. E.; Mace, J.

    1981-01-01

    A survey of computational methods used in the calculation of nozzle aftbody flows is presented. One class of methods reviewed comprises those which patch together solutions for the inviscid, boundary-layer, and plume flow regions. The second class comprises those which computationally solve the Navier-Stokes equations over nozzle aftbodies with jet exhaust flow. Computed results from the methods are compared with experiment. Advantages and disadvantages of the various methods are discussed, along with opportunities for further development of these methods.

  2. Computer Guerrillas.

    ERIC Educational Resources Information Center

    Immel, A. Richard

    1983-01-01

    Describes several cases in which microcomputers were used to prevent large organizations (e.g., utility companies, U.S. Government Forestry Commission) from carrying out actions considered not to be in the public's best interests. The use of the computers by social activists in their efforts to halt environmental destruction is discussed. (EAO)

  3. Computer Corner.

    ERIC Educational Resources Information Center

    Smith, David A.; And Others

    1986-01-01

    APL was invented specifically as a mathematical teaching tool, and is an excellent vehicle for teaching mathematical concepts using computers. This article illustrates the use of APL in teaching many different topics in mathematics, including logic, set theory, functions, statistics, linear algebra, and matrices. (MNS)

  4. Computer Corner.

    ERIC Educational Resources Information Center

    Mason, Margie

    1985-01-01

    Provides tips to help primary-aged students with computer keyboarding skills (suggesting the use of color codes and listing currently available software). Also describes (and lists) a program which helps test students' understanding of IF-THEN statements and illustrates some hazards of "spaghetti programming" (debugging). (JN)

  5. Computational Musicology.

    ERIC Educational Resources Information Center

    Bel, Bernard; Vecchione, Bernard

    1993-01-01

    Asserts that a revolution has been occurring in musicology since the 1970s. Contends that music has changed from being only a source of emotion to appearing more open to science and techniques based on computer technology. Describes recent research and other writings about the topic and provides an extensive bibliography. (CFR)

  6. Computational trigonometry

    SciTech Connect

    Gustafson, K.

    1994-12-31

    By means of the author's earlier theory of antieigenvalues and antieigenvectors, a new computational approach to iterative methods is presented. This enables an explicit trigonometric understanding of iterative convergence and provides new insights into the sharpness of error bounds. Direct applications to gradient descent, conjugate gradient, GCR(k), Orthomin, CGN, GMRES, CGS, and other matrix iterative schemes will be given.
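
    A hedged numerical illustration of the antieigenvalue idea: for a symmetric positive definite matrix, the first antieigenvalue mu_1 = 2*sqrt(lmin*lmax)/(lmin+lmax) is the cosine of the largest angle through which the matrix can turn a vector, which is the trigonometric reading of iterative convergence referred to above. The matrix below is invented.

    import numpy as np

    A = np.diag([1.0, 4.0, 9.0])                # SPD example; lmin=1, lmax=9
    lmin, lmax = np.linalg.eigvalsh(A)[[0, -1]]
    print("mu_1 =", 2 * np.sqrt(lmin * lmax) / (lmin + lmax))   # 0.6

    # Check against the definition: minimize <Ax,x>/(|Ax||x|) over x != 0.
    xs = np.random.default_rng(0).standard_normal((200000, 3))
    vals = np.einsum("ij,ij->i", xs @ A, xs) / (
        np.linalg.norm(xs @ A, axis=1) * np.linalg.norm(xs, axis=1))
    print("empirical minimum ~", vals.min())    # approaches 0.6 from above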

  7. Computer proposals

    NASA Astrophysics Data System (ADS)

    Richman, Barbara T.

    To expand the research community's access to supercomputers, the National Science Foundation (NSF) has begun a program to match researchers who require the capabilities of a supercomputer with those facilities that have such computer resources available. Recent studies on computer needs in scientific and engineering research underscore the need for greater access to supercomputers (Eos, July 6, 1982, p. 562), especially those categorized as “Class VI” machines. Complex computer models for research on astronomy, the oceans, and the atmosphere often require such capabilities. In addition, similar needs are emerging in the earth sciences: A Union session at the AGU Fall Meeting in San Francisco this week will focus on the research computing needs of the geosciences. A Class VI supercomputer has a memory capacity of at least 1 megaword, a speed of upwards of 100 MFLOPS (million floating point operations per second), and both scalar and vector registers in the CPU (central processing unit). Examples of Class VI machines are the CRAY-1 and the CYBER 205. The high costs of these machines, the most powerful ones available, preclude most research facilities from owning one.

  8. Evaluation of Advanced Computing Techniques and Technologies: Reconfigurable Computing

    NASA Technical Reports Server (NTRS)

    Wells, B. Earl

    2003-01-01

    The focus of this project was threefold: to survey the technology of reconfigurable computing and determine its level of maturity and suitability for NASA applications; to better understand and assess the effectiveness of the reconfigurable design paradigm utilized within the HAL-15 reconfigurable computer system, which was made available to NASA MSFC for this purpose by Star Bridge Systems, Inc.; and to implement at least one application that would benefit from the performance levels that are possible with reconfigurable hardware. It was originally proposed that experiments in fault tolerance and dynamic reconfigurability would be performed, but time constraints mandated that these be pursued as future research.

  9. DEMOGRAPHIC AND HEALTH SURVEYS

    EPA Science Inventory

    Demographic and Health Surveys are nationally representative household surveys with large sample sizes of between 5,000 and 30,000 households, typically. DHS surveys provide data for a wide range of monitoring and impact evaluation indicators in the areas of population, health, a...

  10. Sensitive Questions in Surveys

    ERIC Educational Resources Information Center

    Tourangeau, Roger; Yan, Ting

    2007-01-01

    Psychologists have worried about the distortions introduced into standardized personality measures by social desirability bias. Survey researchers have had similar concerns about the accuracy of survey reports about such topics as illicit drug use, abortion, and sexual behavior. The article reviews the research done by survey methodologists on…

  11. Campus Climate Survey.

    ERIC Educational Resources Information Center

    Mattice, Nancy J.

    A survey was conducted at College of the Canyons (COC) to assess the current status of the campus climate. The survey instrument focused on students' experiences, attitudes about diversity issues, and suggestions for improving the climate for diversity. The survey was mailed to all disabled and under-represented racial/ethnic group students plus a…

  12. NATIONAL SURVEY OF MEN

    EPA Science Inventory

    The 1991 National Survey of Men was conducted to examine issues related to sexual behavior and condom use among U.S. men aged 20 to 39. Data collection and processing took place between March 1991 and January 1992. This survey was intended to serve as a baseline survey for a long...

  13. The Introductory Sociology Survey

    ERIC Educational Resources Information Center

    Best, Joel

    1977-01-01

    The Introductory Sociology Survey (ISS) is designed to teach introductory students basic skills in developing causal arguments and in using a computerized statistical package to analyze survey data. Students are given codebooks for survey data and asked to write a brief paper predicting the relationship between at least two variables. (Author)

  14. Telephone Survey Designs.

    ERIC Educational Resources Information Center

    Casady, Robert J.

    The concepts, definitions, and notation that have evolved with the development of telephone survey design methodology are discussed and presented as a unified structure. This structure is then applied to some of the more well-known telephone survey designs and alternative designs are developed. The relative merits of the different survey designs…

  15. MALAYSIAN FAMILY LIFE SURVEY

    EPA Science Inventory

    The Malaysian Family Life Surveys (MFLS) comprise a pair of surveys with partially overlapping samples, designed by RAND and administered in Peninsular Malaysia in 1976-77 (MFLS-1) and 1988-89 (MFLS-2). Each survey collected detailed current and retrospective information on famil...

  16. Florida Employer Opinion Survey. Annual Report, June 1992.

    ERIC Educational Resources Information Center

    Florida State Dept. of Education, Tallahassee.

    Each year the Florida Education and Training Placement Information Program (FETPIP) conducts surveys to determine the opinions of employers about the preparation of graduates of vocational programs. The 1992 survey focused on eight specific occupational training areas (i.e., child care services, computer programming and analysis, dental assisting,…

  17. Social Media and Archives: A Survey of Archive Users

    ERIC Educational Resources Information Center

    Washburn, Bruce; Eckert, Ellen; Proffitt, Merrilee

    2013-01-01

    In April and May of 2012, the Online Computer Library Center (OCLC) Research conducted a survey of users of archives to learn more about their habits and preferences. In particular, they focused on the roles that social media, recommendations, reviews, and other forms of user-contributed annotation play in archival research. OCLC surveyed faculty,…

  18. Telecommunications and K-12 Educators: Findings from a National Survey.

    ERIC Educational Resources Information Center

    Honey, Margaret; Henriquez, Andres

    A survey was conducted to obtain a systematic profile of activities currently being undertaken by kindergarten through grade 12 educators in telecommunications technology. Based on the responses of 550 educators from 48 states, selected because of their involvement with computer technology, this survey represents the first large-scale description…

  19. Workforce Improvement Network 2000 Survey of Virginia Employers.

    ERIC Educational Resources Information Center

    Foucar-Szocki, Diane; Bolt, Les

    A stratified random sample of Virginia's 4,000 employers with over 100 employees was surveyed about workplace-based foundational basic skills (oral and written communication, reading, math, thinking skills, teamwork, English proficiency, and basic computer literacy). A total of 446 surveys were sent with a usable response rate of 18 percent.…

  20. Effect of Mailing Address Style on Survey Response Rate.

    ERIC Educational Resources Information Center

    Cookingham, Frank G.

    This study determined the effect of using mailing labels prepared by a letter-quality computer printer on survey response rate. D. A. Dillman's personalization approach to conducting mail surveys suggests that envelopes with addresses typed directly on them may produce a higher response rate than envelopes with addresses typed on self-adhesive…

  1. The Survey of English Usage: Past, Present--and Future.

    ERIC Educational Resources Information Center

    Ilson, Robert

    1982-01-01

    The 1959 Survey of English Usage has provided researchers and teachers with a corpus of spoken, manuscript, and printed Standard British English. New uses have been found for the survey's resources in recent years, and the spoken part is more widely available in book and computer tape form. (MSE)

  2. Autonomous mobile robot for radiologic surveys

    DOEpatents

    Dudar, Aed M.; Wagner, David G.; Teese, Gregory D.

    1994-01-01

    An apparatus for conducting radiologic surveys. The apparatus comprises in the main a robot capable of following a preprogrammed path through an area, a radiation monitor adapted to receive input from a radiation detector assembly, ultrasonic transducers for navigation and collision avoidance, and an on-board computer system including an integrator for interfacing the radiation monitor and the robot. Front and rear bumpers are attached to the robot by bumper mounts. The robot may be equipped with memory boards for the collection and storage of radiation survey information. The on-board computer system is connected to a remote host computer via a UHF radio link. The apparatus is powered by a rechargeable 24-volt DC battery, and is stored at a docking station when not in use and/or for recharging. A remote host computer contains a stored database defining paths between points in the area where the robot is to operate, including but not limited to the locations of walls, doors, stationary furniture and equipment, and sonic markers if used. When a program consisting of a series of paths is downloaded to the on-board computer system, the robot conducts a floor survey autonomously at any preselected rate. When the radiation monitor detects contamination, the robot resurveys the area at reduced speed and resumes its preprogrammed path if the contamination is not confirmed. If the contamination is confirmed, the robot stops and sounds an alarm.
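
    A hedged state-machine reading of the patrol logic in the abstract: follow the programmed path; on a radiation hit, re-survey the spot at reduced speed; stop and alarm only if the hit is confirmed. Readings and the threshold below are invented.

    from enum import Enum, auto

    class Mode(Enum):
        SURVEY = auto()      # normal-speed patrol along the preprogrammed path
        RESURVEY = auto()    # reduced-speed second pass over the suspect spot
        ALARM = auto()       # stopped, sounding the alarm

    def step(mode, reading, threshold=5.0):
        if mode is Mode.SURVEY:
            return Mode.RESURVEY if reading > threshold else Mode.SURVEY
        if mode is Mode.RESURVEY:
            return Mode.ALARM if reading > threshold else Mode.SURVEY
        return Mode.ALARM

    mode = Mode.SURVEY
    for r in [1.2, 6.3, 6.8, 0.9]:   # the second high reading confirms it
        mode = step(mode, r)
        print(r, "->", mode.name)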

  3. Autonomous mobile robot for radiologic surveys

    DOEpatents

    Dudar, A.M.; Wagner, D.G.; Teese, G.D.

    1994-06-28

    An apparatus is described for conducting radiologic surveys. The apparatus comprises in the main a robot capable of following a preprogrammed path through an area, a radiation monitor adapted to receive input from a radiation detector assembly, ultrasonic transducers for navigation and collision avoidance, and an on-board computer system including an integrator for interfacing the radiation monitor and the robot. Front and rear bumpers are attached to the robot by bumper mounts. The robot may be equipped with memory boards for the collection and storage of radiation survey information. The on-board computer system is connected to a remote host computer via a UHF radio link. The apparatus is powered by a rechargeable 24-volt DC battery, and is stored at a docking station when not in use and/or for recharging. A remote host computer contains a stored database defining paths between points in the area where the robot is to operate, including but not limited to the locations of walls, doors, stationary furniture and equipment, and sonic markers if used. When a program consisting of a series of paths is downloaded to the on-board computer system, the robot conducts a floor survey autonomously at any preselected rate. When the radiation monitor detects contamination, the robot resurveys the area at reduced speed and resumes its preprogrammed path if the contamination is not confirmed. If the contamination is confirmed, the robot stops and sounds an alarm. 5 figures.

  4. Amorphous Computing

    NASA Astrophysics Data System (ADS)

    Sussman, Gerald

    2002-03-01

    Digital computers have always been constructed to behave as precise arrangements of reliable parts, and our techniques for organizing computations depend upon this precision and reliability. Two emerging technologies, however, are beginning to undercut these assumptions about constructing and programming computers. These technologies -- microfabrication and bioengineering -- will make it possible to assemble systems composed of myriad information-processing units at almost no cost, provided: 1) that not all the units need to work correctly; and 2) that there is no need to manufacture precise geometrical arrangements or interconnection patterns among them. Microelectronic mechanical components are becoming so inexpensive to manufacture that we can anticipate combining logic circuits, microsensors, actuators, and communications devices integrated on the same chip to produce particles that could be mixed with bulk materials, such as paints, gels, and concrete. Imagine coating bridges or buildings with smart paint that can sense and report on traffic and wind loads and monitor the structural integrity of the bridge. A smart paint coating on a wall could sense vibrations, monitor the premises for intruders, or cancel noise. Even more striking, there has been such astounding progress in understanding the biochemical mechanisms in individual cells that it appears we will be able to harness these mechanisms to construct digital-logic circuits. Imagine a discipline of cellular engineering that could tailor-make biological cells that function as sensors and actuators, as programmable delivery vehicles for pharmaceuticals, and as chemical factories for the assembly of nanoscale structures. Fabricating such systems seems to be within our reach, even if it is not yet within our grasp. Fabrication, however, is only part of the story. We can envision producing vast quantities of individual computing elements, whether microfabricated particles, engineered cells, or macromolecular computing

  5. Environmental Survey preliminary report

    SciTech Connect

    Not Available

    1988-04-01

    This report presents the preliminary findings from the first phase of the Environmental Survey of the United States Department of Energy (DOE) Sandia National Laboratories conducted August 17 through September 4, 1987. The objective of the Survey is to identify environmental problems and areas of environmental risk associated with Sandia National Laboratories-Albuquerque (SNLA). The Survey covers all environmental media and all areas of environmental regulation. It is being performed in accordance with the DOE Environmental Survey Manual. This phase of the Survey involves the review of existing site environmental data, observations of the operations carried on at SNLA, and interviews with site personnel. 85 refs., 49 figs., 48 tabs.

  6. An Innovative, Effective and Cost Effective Survey Method Using a Survey-Check Response Format

    PubMed Central

    Feil, Edward G.; Severson, Herbert; Taylor, Ted; Boles, Shawn; Albert, David A.; Blair, Jason

    2007-01-01

    Maximizing the response rate to surveys involves thoughtful choices about survey design, sampling, and collection methods. This paper describes an innovative survey method designed to provide immediate reinforcement for responding and to minimize the response cost. This method involves using a questionnaire printed as checks on security (anti-fraud) paper, with questions and responses separated by a perforated tear-off section. Once a participant completes the survey, the response area is detached from the questions, thus protecting the confidentiality of the subject, and the check is returned via the banking system. This report describes the survey-check methodology, the survey flow process, and the results from four research studies which have used this method. These studies include (1) a technology accessibility survey of parents with children enrolled in a low-income preschool program; (2) a parent report of their child’s behavior used as screening criteria for inclusion in a computer-mediated parent education project; (3) a follow-up questionnaire as part of a longitudinal study of child behavior, covering home and classroom interventions, and service utilization; and (4) a survey of dentists in support of efforts to recruit them to participate in a randomized controlled trial of tobacco cessation in dental offices. The results of using this method show great improvement in response rates over traditionally administered surveys for three of the four reported studies. Results are discussed in terms of future applications of this method, limitations, and potential cost savings. PMID:17180473

  7. An innovative, effective and cost effective survey method using a survey-check response format.

    PubMed

    Feil, Edward G; Severson, Herbert; Taylor, Ted K; Boles, Shawn; Albert, David A; Blair, Jason

    2007-06-01

    Maximizing the response rate to surveys involves thoughtful choices about survey design, sampling, and collection methods. This paper describes an innovative survey method designed to provide immediate reinforcement for responding and to minimize the response cost. This method involves using a questionnaire printed as checks on security (anti-fraud) paper, with questions and responses separated by a perforated tear-off section. Once a participant completes the survey, the response area is detached from the questions, thus protecting the confidentiality of the subject, and the check is returned via the banking system. This report describes the survey-check methodology, the survey flow process, and the results from four research studies which have used this method. These studies include (1) a technology accessibility survey of parents with children enrolled in a low-income preschool program; (2) a parent report of their child's behavior used as screening criteria for inclusion in a computer-mediated parent education project; (3) a follow-up questionnaire as part of a longitudinal study of child behavior, covering home and classroom interventions, and service utilization; and (4) a survey of dentists in support of efforts to recruit them to participate in a randomized controlled trial of tobacco cessation in dental offices. The results of using this method show great improvement in response rates over traditionally administered surveys for three of the four reported studies. Results are discussed in terms of future applications of this method, limitations, and potential cost savings. PMID:17180473

  8. Bacteria as computers making computers

    PubMed Central

    Danchin, Antoine

    2009-01-01

    Various efforts to integrate biological knowledge into networks of interactions have produced a lively microbial systems biology. Putting molecular biology and computer sciences in perspective, we review another trend in systems biology, in which recursivity and information replace the usual concepts of differential equations, feedback and feedforward loops and the like. Noting that the processes of gene expression separate the genome from the cell machinery, we analyse the role of the separation between machine and program in computers. However, computers do not make computers. For cells to make cells requires a specific organization of the genetic program, which we investigate using available knowledge. Microbial genomes are organized into a paleome (the name emphasizes the role of the corresponding functions from the time of the origin of life), comprising a constructor and a replicator, and a cenome (emphasizing community-relevant genes), made up of genes that permit life in a particular context. The cell duplication process supposes rejuvenation of the machine and replication of the program. The paleome also possesses genes that enable information to accumulate in a ratchet-like process down the generations. Systems biology must include the dynamics of information creation in its future developments. PMID:19016882

  9. The Effect of Survey Mode on High School Risk Behavior Data: A Comparison between Web and Paper-Based Surveys

    ERIC Educational Resources Information Center

    Raghupathy, Shobana; Hahn-Smith, Stephen

    2013-01-01

    There has been increasing interest in the use of web-based surveys--rather than paper-based surveys--for collecting data on alcohol and other drug use in middle and high schools in the US. However, prior research has indicated that respondent confidentiality is an underlying concern with online data collection, especially when computer-assisted…

  10. Nova Survey participation requested

    NASA Astrophysics Data System (ADS)

    Waagen, Elizabeth O.

    2013-03-01

    The AAVSO solicits participation in an online nova survey from our member and observer communities. The survey is being conducted in advance of an upcoming long-term observing campaign that will be launched in mid-April 2013. We are seeking participation in this survey from as broad a sample of the AAVSO community as possible, and your responses will help us gauge the effectiveness of the campaign and serve the observer community better. The survey may be completed anonymously, but you will have the option of providing us with your name and AAVSO observer code if you choose. Please visit the following website to complete the survey: https://www.surveymonkey.com/s/ZQHDYWB. The survey should take no more than five minutes to complete. We ask that you complete the survey by Monday, April 15, 2013.

  11. Developing the online survey.

    PubMed

    Gordon, Jeffry S; McNew, Ryan

    2008-12-01

    Institutions of higher education are now using Internet-based technology tools to conduct surveys for data collection. Research shows that the type and quality of responses one receives with online surveys are comparable with what one receives in paper-based surveys. Data collection can take place on Web-based surveys, e-mail-based surveys, and personal digital assistants/Smartphone devices. Web surveys can be subscription templates, software packages installed on one's own server, or created from scratch using Web programming development tools. All of these approaches have their advantages and disadvantages. The survey owner must make informed decisions as to the right technology to implement. The correct choice can save hours of work in sorting, organizing, and analyzing data. PMID:18940417

  12. CAMSS: A spectroscopic survey of meteoroid elemental abundances

    NASA Astrophysics Data System (ADS)

    Jenniskens, P.; Gural, P.; Berdeu, A.

    2014-07-01

    The main element abundances (Mg, Fe, Na, ...) of some Near Earth Objects can be measured by meteor spectroscopy. The Cameras for All-sky Meteor Surveillance (CAMS) Spectrograph project aims to scale up meteor spectroscopy in the same way as CAMS scaled up the measurement of precise meteoroid trajectories from multi-station video observations. Spectra are recorded with sixteen low-light video cameras, each equipped with a 1379 lines/mm objective transmission grating. The cameras are operated in survey mode and have recorded spectra in the San Francisco Bay Area every clear night since March 12, 2013. An interactive software tool is being developed to calibrate the wavelength alignments projected on the focal plane and extract the meteor spectra. Because the meteoroid trajectory and pre-atmospheric orbit are also independently measured, the absolute abundances of elements in the meteoroid plasma can be calculated as a function of altitude, while the orbital information can tie the meteoroid back to its parent object.
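
    The wavelength calibration mentioned above is conventionally done by fitting a low-order polynomial from pixel position to wavelength through identified emission lines. The NumPy sketch below illustrates that generic approach, not the CAMS tool itself: the line wavelengths are familiar meteoric features (approximate, in nm), and the pixel centroids are invented.

    ```python
    import numpy as np

    # Fit a quadratic dispersion model through lines identified on the chip.
    known_lines_nm = np.array([393.4, 517.3, 589.3, 777.4])  # Ca II, Mg I, Na I, O I
    pixel_pos = np.array([212.7, 498.1, 663.5, 1096.2])      # measured centroids (invented)

    coeffs = np.polyfit(pixel_pos, known_lines_nm, deg=2)
    pix_to_nm = np.poly1d(coeffs)

    # Apply the solution to every column of the extracted spectrum.
    wavelengths = pix_to_nm(np.arange(1200))

    # Residuals at the calibration lines indicate the quality of the fit.
    residuals = known_lines_nm - pix_to_nm(pixel_pos)
    print("calibration residuals (nm):", np.round(residuals, 3))
    ```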

  13. Multilingual Information Discovery and AccesS (MIDAS): A Joint ACM DL'99/ ACM SIGIR'99 Workshop.

    ERIC Educational Resources Information Center

    Oard, Douglas; Peters, Carol; Ruiz, Miguel; Frederking, Robert; Klavans, Judith; Sheridan, Paraic

    1999-01-01

    Discusses a multidisciplinary workshop that addressed issues concerning internationally distributed information networks. Highlights include multilingual information access in media other than character-coded text; cross-language information retrieval and multilingual metadata; and evaluation of multilingual systems. (LRW)

  14. ARM Airborne Carbon Measurements (ARM-ACME) and ARM-ACME 2.5 Final Campaign Reports

    SciTech Connect

    Biraud, S. C.; Torn, M. S.; Sweeney, C.

    2016-01-01

    We report on a 5-year multi-institution and multi-agency airborne study of atmospheric composition and carbon cycling at the Atmospheric Radiation Measurement (ARM) Climate Research Facility’s Southern Great Plains (SGP) site, with scientific objectives that are central to the carbon-cycle and radiative-forcing goals of the U.S. Global Change Research Program and the North American Carbon Program (NACP). The goal of these measurements is to improve understanding of 1) the carbon exchange of the ARM SGP region; 2) how CO2 and associated water and energy fluxes influence radiative forcing, convective processes, and CO2 concentrations over the ARM SGP region; and 3) how greenhouse gases are transported on continental scales.

  15. RATIO COMPUTER

    DOEpatents

    Post, R.F.

    1958-11-11

    An electronic computer circuit is described for producing an output voltage proportional to the product or quotient of the voltages of a pair of input signals. In essence, the disclosed invention provides a computer having two channels adapted to receive separate input signals, each having amplifiers with like fixed amplification factors and like negative feedback amplifiers. One of the channels receives a constant signal for comparison purposes, whereby a difference signal is produced to control the amplification factors of the variable feedback amplifiers. The output of the other channel is thereby proportional to the product or quotient of the input signals, depending upon the relation of input to fixed signals in the first-mentioned channel.
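
    The disclosed feedback principle can be imitated numerically: a servo loop adjusts a shared gain until the reference channel's output matches the constant comparison signal, at which point the gain equals the reference-to-input ratio and the other channel outputs the quotient. The sketch below is only a toy illustration of that idea; the loop rate, step count, and names are assumptions.

    ```python
    # Servo loop: drive g so that g * v1 -> v_ref, hence g -> v_ref / v1,
    # and the second channel's output g * v2 -> v_ref * v2 / v1.
    def ratio_computer(v1, v2, v_ref=1.0, rate=0.05, steps=2000):
        g = 0.0
        for _ in range(steps):
            error = v_ref - g * v1  # difference signal from the reference channel
            g += rate * error       # servo action on the shared gain
        return g * v2               # output channel

    print(ratio_computer(2.0, 6.0))  # ~3.0, i.e. v2 / v1 when v_ref = 1
    ```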

  16. Computational Combustion

    SciTech Connect

    Westbrook, C K; Mizobuchi, Y; Poinsot, T J; Smith, P J; Warnatz, J

    2004-08-26

    Progress in the field of computational combustion over the past 50 years is reviewed. Particular attention is given to those classes of models that are common to most system modeling efforts, including fluid dynamics, chemical kinetics, liquid sprays, and turbulent flame models. The developments in combustion modeling are placed into the time-dependent context of the accompanying exponential growth in computer capabilities and Moore's Law. Superimposed on this steady growth, the occasional sudden advances in modeling capabilities are identified and their impacts are discussed. Integration of submodels into system models for spark ignition, diesel, and homogeneous charge compression ignition engines, surface and catalytic combustion, pulse combustion, and detonations is described. Finally, the current state of combustion modeling is illustrated by descriptions of a very large lifted 3D turbulent hydrogen jet flame computed with direct numerical simulation and 3D large eddy simulations of practical gas burner combustion devices.

  17. Singularity computations

    NASA Technical Reports Server (NTRS)

    Swedlow, J. L.

    1976-01-01

    An approach is described for singularity computations based on a numerical method for elastoplastic flow to delineate radial and angular distribution of field quantities and measure the intensity of the singularity. The method is applicable to problems in solid mechanics and lends itself to certain types of heat flow and fluid motion studies. Its use is not limited to linear, elastic, small strain, or two-dimensional situations.

  18. Role of Computer Assisted Instruction (CAI) in an Introductory Computer Concepts Course.

    ERIC Educational Resources Information Center

    Skudrna, Vincent J.

    1997-01-01

    Discusses the role of computer assisted instruction (CAI) in undergraduate education via a survey of related literature and specific applications. Describes an undergraduate computer concepts course and includes appendices of instructions, flowcharts, programs, sample student work in accounting, COBOL instructional model, decision logic in a…

  19. Perceived Social Supports, Computer Self-Efficacy, and Computer Use among High School Students

    ERIC Educational Resources Information Center

    Hsiao, Hsi-Chi; Tu, Ya-Ling; Chung, Hsin-Nan

    2012-01-01

    This study investigated the function of social supports and computer self-efficacy in predicting high school students' perceived effect of computer use. The study used a survey method to collect data. The questionnaires were distributed to high school students in Taiwan; 620 questionnaires were distributed and 525 were gathered…

  20. The Effects of Applying Authentic Learning Strategies to Develop Computational Thinking Skills in Computer Literacy Students

    ERIC Educational Resources Information Center

    Mingo, Wendye Dianne

    2013-01-01

    This study attempts to determine if authentic learning strategies can be used to acquire knowledge of and increase motivation for computational thinking. Over 600 students enrolled in a computer literacy course participated in this study which involved completing a pretest, posttest and motivation survey. The students were divided into an…

  1. The First Step in Utilizing Computers in Education: Preparing Computer Literate Teachers.

    ERIC Educational Resources Information Center

    Wells, Malcolm; Bitter, Gary

    As a result of a survey concerning computer assisted instruction in Arizona schools, Arizona State University developed a program to assist districts in computer instructional program development. In the initial planning phase of the program, a list was drawn up of preparatory functions essential for districts making a transition to computer…

  2. Summary of Computer Usage and Inventory of Computer Utilization in Curriculum. FY 1987-88.

    ERIC Educational Resources Information Center

    Tennessee Univ., Chattanooga. Center of Excellence for Computer Applications.

    This report presents the results of a computer usage survey/inventory, the ninth in a series conducted at the University of Tennessee at Chattanooga to obtain information on the changing status of computer usage in the curricula. Data analyses are reported in 11 tables, which include comparisons between annual inventories and demonstrate growth…

  3. Computational Biology Support: RECOMB Conference Series (Conference Support)

    SciTech Connect

    Michael Waterman

    2006-06-15

    This funding provided support for student and postdoctoral attendance at the annual RECOMB Conference from 2001 to 2005. The RECOMB Conference series was founded in 1997 to provide a scientific forum for theoretical advances in computational biology and their applications in molecular biology and medicine. The conference series aims at attracting research contributions in all areas of computational molecular biology. Typical, but not exclusive, topics of interest are: genomics, molecular sequence analysis, recognition of genes and regulatory elements, molecular evolution, protein structure, structural genomics, gene expression, gene networks, drug design, combinatorial libraries, computational proteomics, and structural and functional genomics. The origins of the conference came from the mathematical and computational side of the field, and there remains a certain focus on computational advances. However, the effective application of computational techniques to biological innovation is also an important aspect of the conference. The conference has had a growing number of attendees, topping 300 in recent years and at times exceeding 500. The conference program includes between 30 and 40 contributed papers that are selected by an international program committee of around 30 experts during a rigorous review process rivaling the editorial procedures of top-rate scientific journals. In previous years, paper selection has been made from 130-200 submissions from well over a dozen countries. Ten-page extended abstracts of the contributed papers are collected in a volume published by ACM Press and Springer and are available at the conference. Full versions of a selection of the papers are published annually in a special issue of the Journal of Computational Biology devoted to the RECOMB Conference. A further point in the program is a lively poster session; 120-300 posters have been presented each year since RECOMB 2000. One of the highlights of each RECOMB conference is a

  4. Survey Data for Geomagnetic Field Modelling

    NASA Technical Reports Server (NTRS)

    Barraclough, D. R.; Macmillan, S.

    1992-01-01

    The survey data discussed here are based on observations made relatively recently at points on land. A special subset of land survey data consists of those made at specially designated sites known as repeat stations. This class of data will be discussed in another part of this document (Barton, 1991b), so only the briefest of references will be made to repeat stations here. This discussion of 'ordinary' land survey data begins with a description of the spatial and temporal distributions of available survey data based on observations made since 1900. (The reason for this rather arbitrary choice of cut-off date is that this was the value used in the production of the computer file of magnetic survey data (land, sea, air, satellite, rocket) that is the primary source of data for geomagnetic main-field modeling). This is followed by a description of the various types of error to which these survey data are, or may be, subject and a discussion of the likely effects of such errors on field models produced from the data. Finally, there is a short section on the availability of geomagnetic survey data, which also describes how the data files are maintained.

  5. Wellbore inertial directional surveying system

    DOEpatents

    Andreas, R.D.; Heck, G.M.; Kohler, S.M.; Watts, A.C.

    1982-09-08

    A wellbore inertial directional surveying system for providing a complete directional survey of an oil or gas well borehole to determine the displacement in all three directions of the borehole path relative to the well head at the surface. The information generated by the present invention is especially useful when numerous wells are drilled to different geographical targets from a single offshore platform. Accurate knowledge of the path of the borehole allows proper well spacing and provides assurance that target formations are reached. The tool is lowered down into a borehole on an electrical cable. A computer positioned on the surface communicates with the tool via the cable. The tool contains a sensor block which is supported on a single gimbal, the rotation axis of which is aligned with the cylinder axis of the tool and, correspondingly, the borehole. The gyroscope measurement of the sensor block rotation is used in a null-seeking servo loop which essentially prevents rotation of the sensor block about the gimbal axis. Angular rates of the sensor block about axes which are perpendicular to the gimbal axis are measured by gyroscopes in a manner similar to a strapped-down arrangement. Three accelerometers provide acceleration information as the tool is lowered within the borehole. The uphole computer derives position information based upon acceleration information and angular rate information. Kalman estimation techniques are used to compensate for system errors. 25 figures.
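
    At its core, deriving position from the accelerometer outputs is a double integration of sensed acceleration. The toy one-dimensional sketch below shows that step and why error compensation matters; the sample rate, noise level, and motion profile are invented, and the gyroscope fusion and Kalman filtering of the actual system are omitted.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    dt = 0.01                                  # 100 Hz sampling (assumed)
    t = np.arange(0.0, 10.0, dt)
    true_accel = np.where(t < 5.0, 0.2, -0.2)  # speed up, then slow down (m/s^2)
    measured = true_accel + rng.normal(0.0, 0.005, t.size)  # accelerometer noise

    velocity = np.cumsum(measured) * dt        # first integration
    position = np.cumsum(velocity) * dt        # second integration

    print(f"estimated displacement: {position[-1]:.2f} m")  # ~5 m for this profile
    # Sensor bias grows quadratically in position after double integration,
    # which is why the real system bounds the error with Kalman estimation.
    ```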

  6. Wellbore inertial directional surveying system

    DOEpatents

    Andreas, Ronald D.; Heck, G. Michael; Kohler, Stewart M.; Watts, Alfred C.

    1991-01-01

    A wellbore inertial directional surveying system for providing a complete directional survey of an oil or gas well borehole to determine the displacement in all three directions of the borehole path relative to the well head at the surface. The information generated by the present invention is especially useful when numerous wells are drilled to different geographical targets from a single off-shore platform. Accurate knowledge of the path of the borehole allows proper well spacing and provides assurance that target formations are reached. The tool is lowered down into a borehole on an electrical cable. A computer positioned on the surface communicates with the tool via the cable. The tool contains a sensor block which is supported on a single gimbal, the rotation axis of which is aligned with the cylinder axis of the tool and, correspondingly, the borehole. The gyroscope measurement of the sensor block rotation is used in a null-seeking servo loop which essentially prevents rotation of the sensor block about the gimbal axis. Angular rates of the sensor block about axes which are perpendicular to the gimbal axis are measured by gyroscopes in a manner similar to a strapped-down arrangement. Three accelerometers provide acceleration information as the tool is lowered within the borehole. The uphole computer derives position information based upon acceleration information and angular rate information. Kalman estimation techniques are used to compensate for system errors.

  7. Wellbore inertial directional surveying system

    SciTech Connect

    Andreas, R.D.; Heck, G.M.; Kohler, S.M.; Watts, A.C.

    1991-01-29

    This patent describes a wellbore inertial directional surveying system for providing a complete directional survey of an oil or gas well borehole to determine the displacement in all three directions of the borehole path relative to the well head at the surface. The information generated by the present invention is especially useful when numerous wells are drilled to different geographical targets from a single off-shore platform. Accurate knowledge of the path of the borehole allows proper well spacing and provides assurance that target formations are reached. The tool is lowered down into a borehole on an electrical cable. A computer positioned on the surface communicates with the tool via the cable. The tool contains a sensor block which is supported on a single gimbal, the rotation axis of which is aligned with the cylinder axis of the tool and, correspondingly, the borehole. The gyroscope measurement of the sensor block rotation is used in a null-seeking servo loop which essentially prevents rotation of the sensor block about the gimbal axis. Angular rates of the sensor block about axes which are perpendicular to the gimbal axis are measured by gyroscopes in a manner similar to a strapped-down arrangement. Three accelerometers provide acceleration information as the tool is lowered within the borehole. The uphole computer derives position information based upon acceleration information and angular rate information. Kalman estimation techniques are used to compensate for system errors.

  8. ESO imaging survey: infrared deep public survey

    NASA Astrophysics Data System (ADS)

    Olsen, L. F.; Miralles, J.-M.; da Costa, L.; Madejsky, R.; Jørgensen, H. E.; Mignano, A.; Arnouts, S.; Benoist, C.; Dietrich, J. P.; Slijkhuis, R.; Zaggia, S.

    2006-09-01

    This paper is part of the series presenting the final results obtained by the ESO Imaging Survey (EIS) project. It presents new J and Ks data obtained from observations conducted at the ESO 3.5 m New Technology Telescope (NTT) using the SOFI camera. These data were taken as part of the Deep Public Survey (DPS) carried out by the ESO Imaging Survey program, significantly extending the earlier optical/infrared EIS-DEEP survey presented in a previous paper of this series. The DPS-IR survey comprises two observing strategies: shallow Ks observations providing nearly full coverage of pointings with complementary multi-band (in general UBVRI) optical data obtained using ESO's wide-field imager (WFI), and deeper J and Ks observations of the central parts of these fields. Currently, the DPS-IR survey provides a coverage of roughly 2.1 square degrees (~300 SOFI pointings) in Ks, with 0.63 square degrees observed to fainter magnitudes and also covered in J, over three independent regions of the sky. The goal of the present paper is to briefly describe the observations and the data reduction procedures, and to present the final survey products, which include fully calibrated pixel-maps and catalogs extracted from them. The astrometric solution, with an estimated accuracy of ≲0.15 arcsec, is based on the USNO catalog and limited only by the accuracy of the reference catalog. The final stacked images presented here number 89 and 272 in J and Ks, respectively, the latter reflecting the larger surveyed area. The J and Ks images were taken with a median seeing of 0.77 arcsec and 0.8 arcsec. The images reach a median 5σ limiting magnitude of J_AB ~ 23.06 as measured within an aperture of 2 arcsec, while the corresponding limiting magnitude in Ks_AB is 21.41 and 22.16 mag for the shallow and deep strategies. Although some spatial variation due to varying observing conditions is observed, overall the observed limiting magnitudes are consistent with those originally proposed. The quality of the data
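
    For context, a 5σ limiting magnitude inside a fixed aperture, as quoted above, follows directly from the sky noise integrated over the aperture area. The arithmetic is sketched below with invented numbers (the zero point, pixel scale, and per-pixel sky noise are not SOFI parameters).

    ```python
    import math

    zero_point = 26.0    # magnitude of a source giving 1 count (assumed)
    sigma_pix = 12.0     # sky noise per pixel, counts (assumed)
    pixel_scale = 0.29   # arcsec/pixel (assumed)
    aperture_diam = 2.0  # arcsec, as in the record

    npix = math.pi * (aperture_diam / 2.0 / pixel_scale) ** 2  # pixels in aperture
    flux_5sigma = 5.0 * sigma_pix * math.sqrt(npix)            # faintest detectable flux
    m_lim = zero_point - 2.5 * math.log10(flux_5sigma)
    print(f"5-sigma limiting magnitude ~ {m_lim:.2f}")
    ```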

  9. Preparing ground states of quantum many-body systems on a quantum computer

    NASA Astrophysics Data System (ADS)

    Poulin, David

    2009-03-01

    The simulation of quantum many-body systems is a notoriously hard problem in condensed matter physics, but it could easily be handled by a quantum computer [4,1]. There is however one catch: while a quantum computer can naturally implement the dynamics of a quantum system --- i.e. solve Schrödinger's equation --- there was until now no general method to initialize the computer in a low-energy state of the simulated system. We present a quantum algorithm [5] that can prepare the ground state and thermal states of a quantum many-body system in a time proportional to the square-root of its Hilbert space dimension. This is the same scaling as required by the best known algorithm to prepare the ground state of a classical many-body system on a quantum computer [3,2]. This provides strong evidence that for a quantum computer, preparing the ground state of a quantum system is in the worst case no more difficult than preparing the ground state of a classical system. [1] D. Aharonov and A. Ta-Shma, Adiabatic quantum state generation and statistical zero knowledge, Proc. 35th Annual ACM Symp. on Theory of Computing (2003), p. 20. [2] F. Barahona, On the computational complexity of Ising spin glass models, J. Phys. A: Math. Gen., 15 (1982), p. 3241. [3] C. H. Bennett, E. Bernstein, G. Brassard, and U. Vazirani, Strengths and weaknesses of quantum computing, SIAM J. Comput., 26 (1997), pp. 1510-1523, quant-ph/9701001. [4] S. Lloyd, Universal quantum simulators, Science, 273 (1996), pp. 1073-1078. [5] D. Poulin and P. Wocjan, Preparing ground states of quantum many-body systems on a quantum computer, 2008, arXiv:0809.2705.
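
    The square-root scaling quoted above is the signature of amplitude amplification: a state with overlap ~1/√d on the target needs on the order of √d reflections to rotate onto it. The NumPy toy below illustrates only that scaling, with exact diagonalization standing in for the phase-estimation step of the actual algorithm [5]; it is a numerical cartoon, not a quantum circuit.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def iterations_to_ground_state(d):
        h = rng.normal(size=(d, d))
        H = (h + h.T) / 2                   # random symmetric "Hamiltonian"
        g = np.linalg.eigh(H)[1][:, 0]      # ground state (oracle stand-in)
        s = np.full(d, 1.0 / np.sqrt(d))    # uniform start state
        psi, steps = s.copy(), 0
        while abs(g @ psi) ** 2 < 0.9:      # amplify until 90% overlap
            psi = psi - 2.0 * (g @ psi) * g  # reflect about the target
            psi = 2.0 * (s @ psi) * s - psi  # reflect about the start state
            steps += 1
        return steps

    for d in (64, 256, 1024):
        print(d, iterations_to_ground_state(d))  # grows roughly like sqrt(d)
    ```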

  10. Computers, Sex, and Society. The Microcomputing Program at Drexel University.

    ERIC Educational Resources Information Center

    McCord, Joan

    Results of 1983 and 1984 surveys of faculty and students at Drexel University are reviewed to evaluate the effects of the requirement that incoming students must buy a microcomputer. Slightly more than half the faculty were computer-competent (i.e., had considerable experience with at least one class of computer, with using computers in teaching,…

  11. Sex Differences in Attitudes, Achievement and Use of Computers.

    ERIC Educational Resources Information Center

    Hattie, John; Fitzgerald, Donald

    1987-01-01

    Two Australian studies of male and female achievement, attitudes toward computers, and computer use are reported and discussed. One study investigated differences between male and female parents, teachers, and students in 32 schools with extensive computer experience; the other is a survey of 1,000 schools throughout the country. (MSE)

  12. Selective Guide to Literature on Computer Engineering. Engineering Literature Guides, Number 1.

    ERIC Educational Resources Information Center

    Bean, Margaret H., Ed.

    This bibliography covers computer engineering and computer architecture for all sizes of computers. Not covered are areas of mathematical computer science or publications dealing specifically with microcomputers. This document is a survey of information sources in computer engineering and is intended to identify those core resources which can help…

  13. Computers: from ethos and ethics to mythos and religion. Notes on the new frontier between computers and philosophy

    SciTech Connect

    Mitcham, C.

    1986-01-01

    This essay surveys recent studies concerning the social, cultural, ethical and religious dimensions of computers. The argument is that computers have certain cultural influences which call for ethical analysis. Further suggestions are that American culture is itself reflected in new ways in the high-technology computer milieu, and that ethical issues entail religious ones which are being largely ignored. 28 references.

  14. The current status of super computers

    NASA Technical Reports Server (NTRS)

    Knight, J. C.

    1978-01-01

    In this paper, commercially available super computers are surveyed. Computer performance in general is limited by circuit speeds and physical size. Assuming the use of the fastest technology, super computers typically use parallelism in the form of either vector processing or array processing to obtain performance. The Burroughs Scientific Processor is an array computer with 16 separate processors, the Cray-1 and CDC STAR-100 are vector processors, the Goodyear Aerospace STARAN is an array processor with up to 8192 single bit processors, and the Systems Development Corporation PEPE is a collection of up to 288 separate processors.

  15. Quality indexing with computer-aided lexicography

    NASA Technical Reports Server (NTRS)

    Buchan, Ronald L.

    1992-01-01

    Indexing with computers is a far cry from indexing with the first indexing tool, the manual card sorter. With the aid of computer-aided lexicography, both indexing and indexing tools can provide standardization, consistency, and accuracy, resulting in greater quality control than ever before. A brief survey of computer activity in indexing is presented with detailed illustrations from NASA activity. Applications from techniques mentioned, such as Retrospective Indexing (RI), can be made to many indexing systems. In addition to improving the quality of indexing with computers, the improved efficiency with which certain tasks can be done is demonstrated.

  16. The CSS and SSS NEO surveys

    NASA Astrophysics Data System (ADS)

    Larson, S.; Beshore, E.; Hill, R.; Christensen, E.; McLean, D.; Kolar, S.; McNaught, R.; Garradd, G.

    2003-05-01

    After extensive refurbishment, the Catalina Schmidt is back on line and the Catalina Sky Survey for NEOs has resumed. Compared to the old system, the new coverage rate has doubled to the same R ~ 19.3 limit, with a red filter that reduces scattered moonlight and twilight. The data reduction and detection software has been improved with features to automate "housekeeping" and to be more fault tolerant. Together, these help free the observer to concentrate on visual validation of moving-object candidates. The Siding Spring Survey is a southern hemisphere component based upon the modified 0.5-m Uppsala Schmidt, using the same type of camera, support computers, and software. Currently in its commissioning phase, it will cover areas of the sky unreachable by the northern hemisphere surveys. New to the CSS is the Mt. Lemmon 1.5-m, which has been upgraded with computer control and a prime focus camera. Our objective with this instrument is to survey smaller areas to a fainter limit (R ~ 22) for two weeks per month. The cell for the field-correcting optics is nearing completion and commissioning is planned for this fall. We describe the details of these facilities, the synergy afforded by the diverse locations and apertures, and the efficiencies that can result from using similar software and processes in conducting all three surveys. This work is supported by NASA NEOO grants NAGW5-10853 and NAGW5-132

  17. The Large Synoptic Survey Telescope

    NASA Astrophysics Data System (ADS)

    Axelrod, T. S.

    2006-07-01

    The Large Synoptic Survey Telescope (LSST) is an 8.4 meter telescope with a 10 square degree field and a 3 Gigapixel imager, planned to be on-sky in 2012. It is a dedicated all-sky survey instrument with several complementary science missions. These include understanding dark energy through weak lensing and supernovae; exploring transients and variable objects; creating and maintaining a solar system map, with particular emphasis on potentially hazardous objects; and increasing the precision with which we understand the structure of the Milky Way. The instrument operates continuously at a rapid cadence, repetitively scanning the visible sky every few nights. The data flow rates from LSST are larger than those from current surveys by roughly a factor of 1000: a few GB/night are typical today, while LSST will deliver a few TB/night. From a computing hardware perspective, this factor of 1000 can be dealt with easily in 2012. The major issues in designing the LSST data management system arise from the fact that the number of people available to critically examine the data will not grow from current levels. This has a number of implications. For example, every large imaging survey today is resigned to the fact that its image reduction pipelines fail at some significant rate. Many of these failures are dealt with by rerunning the reduction pipeline under human supervision, with carefully "tweaked" parameters to deal with the original problem. For LSST, this will no longer be feasible. The problem is compounded by the fact that the processing must of necessity occur on clusters with large numbers of CPUs and disk drives, with some components connected by long-haul networks. This inevitably results in a significant rate of hardware component failures, which can easily lead to further software failures. Both hardware and software failures must be seen as a routine fact of life rather than rare exceptions to normality.
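
    The quoted factor-of-1000 jump in data volume can be sanity-checked from the camera size alone; the back-of-envelope sketch below uses an assumed cadence and pixel depth, not LSST specifications.

    ```python
    gigapixels = 3.0e9          # 3 Gigapixel imager (from the record)
    bytes_per_pixel = 2         # 16-bit raw pixels (assumption)
    exposures_per_night = 1000  # ~10 h night, tens of seconds per visit (assumption)

    nightly_bytes = gigapixels * bytes_per_pixel * exposures_per_night
    print(f"~{nightly_bytes / 1e12:.0f} TB of raw pixels per night")  # ~6 TB
    ```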

  18. Digitised optical sky surveys.

    NASA Astrophysics Data System (ADS)

    MacGillivray, H. T.

    1990-12-01

    Contents: 1. The Second Palomar Observatory Sky Survey. 2. The status of the UKST surveys. 3. A proposal for the construction of a 150/220-cm Schmidt Telescope and processing facilities in China. 4. The measuring machines - a world roundup. 5. Reports from the individual machine groups. 6. A progress report on the APS catalog of POSS I. 7. The ROE/NRL collaborative effort on the COSMOS/UKST survey material. 8. Automated optical identification of IRAS Faint Source Survey Objects. 9. A catalogue of the North Galactic Pole. 10. The need for standard data sets. 11. Programmes on plate calibration. 12. Automated image measuring system. 13. Astronomical image data compression. 14. Opportunities for image compression in astronomy. 15. The Loiano 152 cm telescope CCD images archive. 16. PPM: a reference star catalogue for sky surveys. 17. Announcement: Second Meeting on Digitised Optical Sky Surveys.

  19. New computer architectures

    SciTech Connect

    Tiberghien, J.

    1984-01-01

    This book presents papers on supercomputers. Topics considered include decentralized computer architecture, new programming languages, data flow computers, reduction computers, parallel prefix calculations, structural and behavioral descriptions of digital systems, instruction sets, software generation, personal computing, and computer architecture education.

  20. Atmospheric prediction model survey

    NASA Technical Reports Server (NTRS)

    Wellck, R. E.

    1976-01-01

    As part of the SEASAT Satellite program of NASA, a survey of representative primitive-equation atmospheric prediction models that exist in the world today was written for the Jet Propulsion Laboratory. Seventeen models developed by eleven different operational and research centers throughout the world are included in the survey. The surveys are tutorial in nature, describing the features of the various models in a systematic manner.