Science.gov

Sample records for acm computing surveys

  1. ACM TOMS replicated computational results initiative

    SciTech Connect

    Heroux, Michael Allen

    2015-06-03

The scientific community relies on the peer review process to assure the quality of published material, the goal of which is to build a body of work we can trust. Computational journals such as the ACM Transactions on Mathematical Software (TOMS) use this process to rigorously promote the clarity and completeness of content and the citation of prior work. At the same time, it is unusual for computational results to be independently confirmed.

  2. ACM TOMS replicated computational results initiative

    DOE PAGES

    Heroux, Michael Allen

    2015-06-03

The scientific community relies on the peer review process to assure the quality of published material, the goal of which is to build a body of work we can trust. Computational journals such as the ACM Transactions on Mathematical Software (TOMS) use this process to rigorously promote the clarity and completeness of content and the citation of prior work. At the same time, it is unusual for computational results to be independently confirmed.

  3. Categorization of Computing Education Resources into the ACM Computing Classification System

    SciTech Connect

    Chen, Yinlin; Bogen, Paul Logasa; Fox, Dr. Edward A.; Hsieh, Dr. Haowei; Cassel, Dr. Lillian N.

    2012-01-01

The Ensemble Portal harvests resources from multiple heterogeneous federated collections. Managing these dynamically growing collections requires an automatic mechanism to categorize records into the corresponding topics. We propose an approach that uses existing ACM DL metadata to build classifiers for resources harvested in the Ensemble project. We also present our experience using the Amazon Mechanical Turk platform to build ground-truth training data sets from Ensemble collections.
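The record above describes training classifiers from ACM DL metadata to sort harvested resources into ACM Computing Classification System topics. As a rough illustration of the idea only (not the Ensemble project's actual pipeline — the training records, topic labels, and function names below are invented), a minimal multinomial Naive Bayes classifier over title tokens might look like:

```python
import math
from collections import Counter, defaultdict

# Toy training metadata: (title words, CCS-style topic label).
# Both the records and the labels are invented for illustration.
TRAIN = [
    ("sorting algorithms complexity analysis", "Theory of computation"),
    ("graph algorithms shortest paths", "Theory of computation"),
    ("relational database query optimization", "Information systems"),
    ("sql indexing transaction processing", "Information systems"),
    ("neural networks image classification", "Computing methodologies"),
    ("reinforcement learning agents", "Computing methodologies"),
]

def train_nb(records):
    """Multinomial Naive Bayes with add-one smoothing over title tokens."""
    word_counts = defaultdict(Counter)   # topic -> word -> count
    topic_counts = Counter()             # topic -> number of records
    vocab = set()
    for text, topic in records:
        tokens = text.split()
        topic_counts[topic] += 1
        word_counts[topic].update(tokens)
        vocab.update(tokens)
    return word_counts, topic_counts, vocab

def classify(text, word_counts, topic_counts, vocab):
    """Return the topic with the highest smoothed log-probability."""
    total = sum(topic_counts.values())
    best_topic, best_score = None, float("-inf")
    for topic in topic_counts:
        score = math.log(topic_counts[topic] / total)
        denom = sum(word_counts[topic].values()) + len(vocab)
        for tok in text.split():
            score += math.log((word_counts[topic][tok] + 1) / denom)
        if score > best_score:
            best_topic, best_score = topic, score
    return best_topic

model = train_nb(TRAIN)
print(classify("query optimization in databases", *model))
```

A production version would of course use the real ACM CCS taxonomy, full metadata fields, and the crowd-sourced ground truth the abstract mentions; the sketch only shows the shape of the metadata-to-classifier step.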

  4. Preliminary survey report: control technology for the ceramic industry at Acme Brick Company, Malvern, Arkansas

    SciTech Connect

    Godbey, F.W.

    1983-06-01

Health-hazard control methods, work processes, and existing control technologies used in the manufacture of brick were surveyed at Acme Brick Company, Malvern, Arkansas, in June 1983. The company employed about 32 workers to produce structural brick from alluvial clay, free clay, shale, and aggregate. A potential hazard existed from silica exposure, since the clays contained about 20% quartz. Raw materials were transported in a cab-enclosed front-end loader to feeders that delivered the materials to a crusher. Blended coarsely crushed material was moved by conveyor to a hammer mill for fine crushing. Production-size product was transported by overhead conveyor to storage silos in the production building. The entire material particle-size reduction process was completely automated. The clay-preparation building and raw-material storage area were isolated from the production building, and only two workers performed the crushing and grinding operations. Material transfer points had removable covers, and a water-mist spray was used on one conveyor of each line. The operation was monitored from a totally enclosed air-conditioned control room. Head and eye protection were required. The author does not recommend an in-depth study of control technologies at the company.

  5. Proceedings of the ACM-SIGSAM 1989 international symposium on symbolic and algebraic computation

    SciTech Connect

    Not Available

    1989-01-01

This book contains papers on the following topics: Gencray, a portable code generator for Cray Fortran; massively parallel symbolic computation; reduction of group constructions to point stabilizers; and constrained equational reasoning.

  6. Human factors in computing systems: focus on patient-centered health communication at the ACM SIGCHI conference.

    PubMed

    Wilcox, Lauren; Patel, Rupa; Chen, Yunan; Shachak, Aviv

    2013-12-01

Health Information Technologies, such as electronic health records (EHR) and secure messaging, have already transformed interactions among patients and clinicians. In addition, technologies supporting asynchronous communication outside of clinical encounters, such as email, SMS, and patient portals, are increasingly used for follow-up, education, and data reporting. Meanwhile, patients are increasingly adopting personal tools to track various aspects of health status and therapeutic progress, and wish to review these data with clinicians during consultations. These issues have drawn increasing interest from the human-computer interaction (HCI) community, with special focus on critical challenges in patient-centered interactions and design opportunities that can address these challenges. We saw this community presenting and interacting at the ACM SIGCHI 2013 Conference on Human Factors in Computing Systems (also known as CHI), held April 27-May 2, 2013, at the Palais des Congrès de Paris in France. CHI 2013 featured many formal avenues for pursuing patient-centered health communication: a well-attended workshop, tracks of original research, and a lively panel discussion. In this report, we highlight these events and the main themes we identified. We hope that it will help bring the health care communication and HCI communities closer together.

  7. ACME-III and ACME-IV Final Campaign Reports

    SciTech Connect

    Biraud, S. C.

    2016-01-01

The goals of the Atmospheric Radiation Measurement (ARM) Climate Research Facility's third and fourth Airborne Carbon Measurements (ACME) field campaigns, ACME-III and ACME-IV, are: 1) to measure and model the exchange of CO2, water vapor, and other greenhouse gases by the natural, agricultural, and industrial ecosystems of the Southern Great Plains (SGP) region; 2) to develop quantitative approaches relating these local fluxes to the concentrations of greenhouse gases measured at the Central Facility tower and in the atmospheric column above the ARM SGP Central Facility; 3) to develop and test bottom-up measurement and modeling approaches for estimating regional-scale carbon balances; and 4) to develop and test inverse modeling approaches for estimating regional-scale carbon balance and anthropogenic sources over continental regions. Regular soundings of the atmosphere from near the surface into the mid-troposphere are essential for this research.

  8. Quark ACM with topologically generated gluon mass

    NASA Astrophysics Data System (ADS)

    Choudhury, Ishita Dutta; Lahiri, Amitabha

    2016-03-01

We investigate the effect of a small, gauge-invariant mass of the gluon on the anomalous chromomagnetic moment (ACM) of quarks through perturbative calculations at the one-loop level. The gluon mass is taken to be generated via a topological mass generation mechanism, in which the gluon acquires a mass through its interaction with an antisymmetric tensor field Bμν. For a small gluon mass (< 10 MeV), we calculate the ACM at momentum transfer q^2 = -M_Z^2 and compare it with the ACM calculated for a gluon mass arising from a Proca mass term. We find that the ACMs of the up, down, strange, and charm quarks vary significantly with the gluon mass, while the ACMs of the top and bottom quarks show negligible gluon-mass dependence. The mechanism of gluon mass generation is most important for the strange quark's ACM, and much less so for the other quarks. We also show the results at q^2 = -m_t^2, finding that the dependence on the gluon mass there is much weaker than at q^2 = -M_Z^2 for all quarks.

  9. Experiments in Computing: A Survey

    PubMed Central

    Moisseinen, Nella

    2014-01-01

    Experiments play a central role in science. The role of experiments in computing is, however, unclear. Questions about the relevance of experiments in computing attracted little attention until the 1980s. As the discipline then saw a push towards experimental computer science, a variety of technically, theoretically, and empirically oriented views on experiments emerged. As a consequence of those debates, today's computing fields use experiments and experiment terminology in a variety of ways. This paper analyzes experimentation debates in computing. It presents five ways in which debaters have conceptualized experiments in computing: feasibility experiment, trial experiment, field experiment, comparison experiment, and controlled experiment. This paper has three aims: to clarify experiment terminology in computing; to contribute to disciplinary self-understanding of computing; and, due to computing's centrality in other fields, to promote understanding of experiments in modern science in general. PMID:24688404

  10. Additive Construction with Mobile Emplacement (ACME)

    NASA Technical Reports Server (NTRS)

    Vickers, John

    2015-01-01

    The Additive Construction with Mobile Emplacement (ACME) project is developing technology to build structures on planetary surfaces using in-situ resources. The project focuses on the construction of both 2D (landing pads, roads, and structure foundations) and 3D (habitats, garages, radiation shelters, and other structures) infrastructure needs for planetary surface missions. The ACME project seeks to raise the Technology Readiness Level (TRL) of two components needed for planetary surface habitation and exploration: 3D additive construction (e.g., contour crafting), and excavation and handling technologies (to effectively and continuously produce in-situ feedstock). Additionally, the ACME project supports the research and development of new materials for planetary surface construction, with the goal of reducing the amount of material to be launched from Earth.

  11. Survey of Computer Usage in Louisiana Schools.

    ERIC Educational Resources Information Center

    Kirby, Peggy C.; And Others

    A survey of computer usage in 179 randomly selected public elementary and secondary schools in Louisiana was conducted in the spring of 1988. School principals responded to questions about school size, the socioeconomic status of the student population, the number of teachers certified in computer literacy and computer science, and the number of…

  12. 1987-88 Statewide Computer Survey Report.

    ERIC Educational Resources Information Center

    South Carolina Educational Television Network Columbia.

This fifth annual survey of computers and their use in South Carolina schools covers the 1987-88 school year. A questionnaire inventoried computer equipment and software, and dealt with such issues as instructional and administrative uses of computers and the availability of funding. The forms were distributed to all South Carolina public school…

  13. Towards an Autonomic Cluster Management System (ACMS) with Reflex Autonomicity

    NASA Technical Reports Server (NTRS)

    Truszkowski, Walt; Hinchey, Mike; Sterritt, Roy

    2005-01-01

    Cluster computing, whereby a large number of simple processors or nodes are combined together to apparently function as a single powerful computer, has emerged as a research area in its own right. The approach offers a relatively inexpensive means of providing a fault-tolerant environment and achieving significant computational capabilities for high-performance computing applications. However, the task of manually managing and configuring a cluster quickly becomes daunting as the cluster grows in size. Autonomic computing, with its vision to provide self-management, can potentially solve many of the problems inherent in cluster management. We describe the development of a prototype Autonomic Cluster Management System (ACMS) that exploits autonomic properties in automating cluster management and its evolution to include reflex reactions via pulse monitoring.
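The "reflex reactions via pulse monitoring" mentioned above can be pictured as a heartbeat check with an immediate corrective action. The class and method names below are illustrative inventions, not taken from the ACMS prototype:

```python
import time

class PulseMonitor:
    """Reflex-style health monitoring sketch (hypothetical names, not
    the ACMS API): nodes emit heartbeats; when a node's pulse goes
    stale, a reflex action fires without waiting for an operator."""

    def __init__(self, timeout_s=5.0):
        self.timeout_s = timeout_s
        self.last_pulse = {}   # node id -> last heartbeat timestamp
        self.restarted = []    # nodes the reflex acted on

    def heartbeat(self, node, now=None):
        """Record a pulse from a node (now overridable for testing)."""
        self.last_pulse[node] = time.time() if now is None else now

    def check(self, now=None):
        """Return stale nodes and apply the reflex (here: mark a restart
        and treat the restart as resetting the node's pulse)."""
        now = time.time() if now is None else now
        stale = [n for n, t in self.last_pulse.items()
                 if now - t > self.timeout_s]
        for node in stale:
            self.restarted.append(node)
            self.last_pulse[node] = now
        return stale

mon = PulseMonitor(timeout_s=5.0)
mon.heartbeat("node-a", now=0.0)
mon.heartbeat("node-b", now=8.0)
print(mon.check(now=10.0))   # only node-a has a stale pulse
```

The point of the reflex pattern is that detection and response live in the same tight loop, mirroring how the ACMS evolution adds pulse monitoring on top of the autonomic manager.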

  14. A Survey of Techniques for Approximate Computing

    DOE PAGES

    Mittal, Sparsh

    2016-03-18

Approximate computing trades off computation quality against the effort expended, and as rising performance demands confront plateauing resource budgets, approximate computing (AC) has become not merely attractive but imperative. Here, we present a survey of techniques for AC. We discuss strategies for finding approximable program portions and monitoring output quality; techniques for using AC in different processing units (e.g., CPU, GPU, and FPGA), processor components, memory technologies, etc.; and programming frameworks for AC. Moreover, we classify these techniques based on several key characteristics to emphasize their similarities and differences. Finally, the aim of this paper is to provide researchers with insights into the working of AC techniques and to inspire more efforts in this area to make AC the mainstream computing approach in future systems.
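One widely cited family of AC techniques for "approximable program portions" is loop perforation: skip a fraction of loop iterations and accept a bounded output error in exchange for less work. A minimal sketch (toy data, not from the survey):

```python
def mean_exact(xs):
    """Baseline: visit every element."""
    return sum(xs) / len(xs)

def mean_perforated(xs, stride):
    """Loop perforation: visit only every `stride`-th element,
    trading accuracy for roughly a 1/stride reduction in work."""
    sample = xs[::stride]
    return sum(sample) / len(sample)

data = list(range(1, 10001))          # exact mean is 5000.5
approx = mean_perforated(data, 10)    # inspects 1000 of 10000 elements
rel_err = abs(approx - mean_exact(data)) / mean_exact(data)
print(approx, round(rel_err, 4))      # ~10x less work, <0.1% error
```

The quality-monitoring strategies the survey discusses are what make such a transformation safe: the perforated kernel is only kept if the measured output error stays within the application's tolerance.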

  15. How to recycle asbestos containing materials (ACM)

    SciTech Connect

    Jantzen, C.M.

    2000-04-11

The current disposal of asbestos-containing materials (ACM) in the private sector consists of sealing asbestos wetted with water in plastic for safe transportation and burial in regulated landfills. This disposal methodology requires large disposal volumes, especially for asbestos-covered pipe and asbestos/fiberglass adhering to metal framework, e.g., filters. This wrap-and-bury technology precludes recycling of the asbestos, the pipe, and/or the metal frameworks. Safe disposal of ACM at U.S. Department of Energy (DOE) sites likewise requires large disposal volumes in landfills for non-radioactive ACM and large disposal volumes in radioactive burial grounds for radioactive and suspect-contaminated ACM. The availability of regulated disposal sites is rapidly diminishing, making recycling a more attractive option. Asbestos adhering to metal (e.g., pipes) can be recycled by safely removing the asbestos from the metal in a patented hot caustic bath, which prevents airborne contamination/inhalation of asbestos fibers. The dissolution residue (caustic and asbestos) can be wet-slurry fed to a melter and vitrified into a glass or glass-ceramic. Palex glasses, which are commercially manufactured, are shown to be preferred over conventional borosilicate glasses. Palex glasses are alkali magnesium silicate glasses derived by substituting MgO for B₂O₃ in borosilicate-type glasses. They are very tolerant of the high MgO and high CaO content of the fillers used in forming asbestos coverings for pipes and found in boiler lagging, e.g., hydromagnesite (3MgCO₃·Mg(OH)₂·3H₂O) and plaster of paris, gypsum (CaSO₄). The high temperature of the vitrification process destroys the asbestos fibers and renders the asbestos non-hazardous, i.e., a glass or glass-ceramic. In this manner the glass or glass-ceramic produced can be recycled, e.g., as glassphalt or glasscrete, as can the clean metal pipe or metal framework.

  16. Multivariate Lipschitz optimization: Survey and computational comparison

    SciTech Connect

    Hansen, P.; Gourdin, E.; Jaumard, B.

    1994-12-31

Many methods have been proposed to minimize a multivariate Lipschitz function on a box. They pertain to three approaches: (i) reduction to the univariate case, by projection (Pijavskii) or by using a space-filling curve (Strongin); (ii) construction and refinement of a single upper bounding function (Pijavskii; Mladineo; Mayne and Polak; Jaumard, Hermann and Ribault; Wood; ...); (iii) branch and bound with local upper bounding functions (Galperin; Pintér; Meewella and Mayne; the present authors). A survey is made, stressing similarities among the algorithms, expressed where possible within a unified framework. Moreover, an extensive computational comparison is reported.
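The bounding-function idea in approach (ii) is easiest to see in one dimension. A minimal sketch of Pijavskii's saw-tooth method, assuming a known Lipschitz constant L (the function, constant, and iteration budget below are illustrative):

```python
def piyavskii_min(f, a, b, L, iters=60):
    """Minimize a Lipschitz function on [a, b] by repeatedly evaluating
    f where the piecewise-linear (saw-tooth) lower bound, built from
    the Lipschitz constant L, dips lowest."""
    pts = sorted([(a, f(a)), (b, f(b))])
    for _ in range(iters):
        best = None
        # Between adjacent evaluated points the saw-tooth bound attains
        # its minimum at x*, with value `bound`.
        for (x1, f1), (x2, f2) in zip(pts, pts[1:]):
            x_star = 0.5 * (x1 + x2) + (f1 - f2) / (2.0 * L)
            bound = 0.5 * (f1 + f2) - 0.5 * L * (x2 - x1)
            if best is None or bound < best[0]:
                best = (bound, x_star)
        pts.append((best[1], f(best[1])))   # evaluate at the deepest tooth
        pts.sort()
    return min(pts, key=lambda p: p[1])     # best point found

# f(x) = (x - 1)^2 on [-2, 3]; |f'(x)| <= 6 there, so L = 8 is valid.
x, fx = piyavskii_min(lambda x: (x - 1.0) ** 2, -2.0, 3.0, L=8.0)
print(x, fx)   # converges toward the global minimum at x = 1
```

The multivariate methods the survey compares replace this univariate saw-tooth with cones or simplex-based bounds, but the refine-where-the-bound-is-lowest loop is the same.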

  17. AcmD, a Homolog of the Major Autolysin AcmA of Lactococcus lactis, Binds to the Cell Wall and Contributes to Cell Separation and Autolysis

    PubMed Central

    Visweswaran, Ganesh Ram R.; Steen, Anton; Leenhouts, Kees; Szeliga, Monika; Ruban, Beata; Hesseling-Meinders, Anne; Dijkstra, Bauke W.; Kuipers, Oscar P.; Kok, Jan; Buist, Girbe

    2013-01-01

Lactococcus lactis expresses the homologous glucosaminidases AcmB, AcmC, AcmA and AcmD. The latter two have three C-terminal LysM repeats for peptidoglycan binding. AcmD has much shorter intervening sequences separating the LysM repeats and a lower isoelectric point (4.3) than AcmA (10.3). Under standard laboratory conditions AcmD was mainly secreted into the culture supernatant. An L. lactis acmAacmD double mutant formed longer chains than the acmA single mutant, indicating that AcmD contributes to cell separation. This phenotype could be complemented by plasmid-encoded expression of AcmD in the double mutant. No clear difference in cellular lysis and protein secretion was observed between the two mutants. Nevertheless, overexpression of AcmD resulted in increased autolysis when AcmA was present (as in the wild-type strain) or when AcmA was added to the culture medium of an AcmA-minus strain. Possibly, AcmD is mainly active within the cell wall, at places where proper conditions exist for its binding and catalytic activity. Various fusion proteins carrying either the three LysM repeats of AcmA or those of AcmD were used to study and compare their cell wall binding characteristics. Whereas binding of the LysM domain of AcmA took place at pH values ranging from 4 to 8, the LysM domain of AcmD appears to bind most strongly at pH 4. PMID:23951292

  18. In-situ Data Analysis Framework for ACME Land Simulations

    NASA Astrophysics Data System (ADS)

    Wang, D.; Yao, C.; Jia, Y.; Steed, C.; Atchley, S.

    2015-12-01

The realistic representation of key biogeophysical and biogeochemical functions is fundamental to process-based ecosystem models. Investigating the behavior of those ecosystem functions during a live model simulation can be very challenging due to the complexity of both the model and the software structure of an environmental model such as the Accelerated Climate Model for Energy (ACME) Land Model (ALM). In this research, we describe the urgent needs and challenges of in-situ data analysis for ALM simulations, and lay out our methods and strategies to meet those challenges. Specifically, an in-situ data analysis framework is designed to allow users to interactively observe biogeophysical and biogeochemical processes during an ALM simulation. There are three key components in this framework: automatically instrumented ecosystem simulation, in-situ data communication, and a large-scale data exploration toolkit. This effort is developed by leveraging several active projects, including a scientific unit-testing platform, a common communication interface, and an extreme-scale data exploration toolkit. We believe that, based on advanced computing technologies such as compiler-based software system analysis, automatic code instrumentation, and in-memory data transport, this software system provides not only much-needed capability for real-time observation and in-situ data analytics for environmental model simulation, but also the potential for in-situ model behavior adjustment via simulation steering.

  19. Faculty Computer Expertise and Use of Instructional Technology. Technology Survey.

    ERIC Educational Resources Information Center

    Gabriner, Robert; Mery, Pamela

    This report shows the findings of a 1997 technology survey used to assess degrees of faculty computer expertise and the use of instructional technology. Part 1 reviews general findings of the fall 1997 technology survey: (1) the level of computer expertise among faculty, staff and administrators appears to be increasing; (2) in comparison with the…

  20. Equivalency of Paper versus Tablet Computer Survey Data

    ERIC Educational Resources Information Center

    Ravert, Russell D.; Gomez-Scott, Jessica; Donnellan, M. Brent

    2015-01-01

    Survey responses collected via paper surveys and computer tablets were compared to test for differences between those methods of obtaining self-report data. College students (N = 258) were recruited in public campus locations and invited to complete identical surveys on either paper or iPad tablet. Only minor homogeneity differences were found…

  1. A Sample Survey of Attitudes to Computer Studies.

    ERIC Educational Resources Information Center

    Gardner, J. R.; And Others

    1986-01-01

    Discusses survey of 1,441 lower sixth-form Northern Ireland students which explored attitudes toward computing and computers, aspirations towards computer-related careers, and attitudes toward programming and game playing activities. Statistical tests were applied to results to identify overall trends and assess significance of boy-girl agreement…

  2. Computer Education - A Survey of Seventh and Eighth Grade Teachers.

    ERIC Educational Resources Information Center

    Bassler, Otto; And Others

    Tennessee is in the process of implementing a computer literacy plan for grades 7 and 8. Determining the views of teachers in those grades about computers, what they think students should be taught about computers, and the extent to which they agree with aspects of the plan was the goal of this survey. Data were analyzed from 122 teachers and…

  3. A survey of computer science capstone course literature

    NASA Astrophysics Data System (ADS)

    Dugan, Robert F., Jr.

    2011-09-01

In this article, we surveyed literature related to undergraduate computer science capstone courses. The survey was organized around course and project issues. Course issues included: course models, learning theories, course goals, course topics, student evaluation, and course evaluation. Project issues included: software process models, software process phases, project type, documentation, tools, groups, and instructor administration. We reflected on these issues and the computer science capstone course we have taught for seven years. The survey summarized, organized, and synthesized the literature to provide a referenced resource for computer science instructors and researchers interested in computer science capstone courses.

  4. State of Washington Computer Use Survey.

    ERIC Educational Resources Information Center

    Beal, Jack L.; And Others

    This report presents the results of a spring 1982 survey of a random sample of Washington public schools which separated findings according to school level (elementary, middle, junior high, or high school) and district size (either less than or greater than 2,000 enrollment). A brief review of previous studies and a description of the survey…

  5. Model Diagnostics for the Department of Energy's Accelerated Climate Modeling for Energy (ACME) Project

    NASA Astrophysics Data System (ADS)

    Smith, B.

    2015-12-01

In 2014, eight Department of Energy (DOE) national laboratories, four academic institutions, one company, and the National Center for Atmospheric Research combined forces in a project called Accelerated Climate Modeling for Energy (ACME) with the goal of speeding Earth system model development for climate and energy. Over the planned 10-year span, the project will conduct simulations and modeling on DOE's most powerful high-performance computing systems at the Oak Ridge, Argonne, and Lawrence Berkeley Leadership Computing Facilities. A key component of the ACME project is the development of an interactive test bed for the advanced Earth system model, whose execution infrastructure will accelerate model development and testing cycles. The ACME Workflow Group is leading the efforts to automate labor-intensive tasks, provide intelligent support for complex tasks, and reduce duplication of effort through collaboration support. As part of this new workflow environment, we have created a diagnostic, metric, and intercomparison Python framework, called UVCMetrics, to aid in the testing-to-production execution of the ACME model. The framework exploits similarities among different diagnostics to compactly support diagnosis of new models. It presently focuses on atmosphere and land but is designed to support ocean and sea ice model components as well. This framework is built on top of the existing open-source software framework known as the Ultrascale Visualization Climate Data Analysis Tools (UV-CDAT). Because of its flexible framework design, scientists and modelers can now generate thousands of possible diagnostic outputs. These diagnostics can compare model runs, compare model vs. observation, or simply verify that a model is physically realistic. Additional diagnostics are easily integrated into the framework, and our users have already added several. Diagnostics can be generated, viewed, and manipulated from the UV-CDAT graphical user interface, or from Python command-line scripts and programs.
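At their core, the model-vs-observation diagnostics such a framework produces reduce to summary statistics over paired fields. A toy sketch of that core (illustrative only; UVCMetrics wraps far richer regridding, climatology, and plotting machinery around comparisons like this, and the data below are invented):

```python
import math

def bias_rmse(model, obs):
    """Mean bias and root-mean-square error between a model series
    and co-located observations - the simplest model-vs-obs metric."""
    diffs = [m - o for m, o in zip(model, obs)]
    bias = sum(diffs) / len(diffs)
    rmse = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    return bias, rmse

# Hypothetical surface temperatures in kelvin at four stations.
model_t = [288.1, 289.4, 290.0, 287.6]
obs_t   = [287.9, 289.9, 289.5, 287.8]
b, r = bias_rmse(model_t, obs_t)
print(round(b, 3), round(r, 3))
```

A "physically realistic" check in the same spirit would assert that such statistics fall inside known bounds, which is why a framework benefits from making every diagnostic scriptable from the command line.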

  6. Canadian Community College Computer Usage Survey, May 1983.

    ERIC Educational Resources Information Center

    Gee, Michael Dennis

    This survey was conducted to provide information on the level of computer usage in Canadian community colleges. A 19-question form was mailed to the deans of instruction in 175 Canadian public community colleges identified as such by Statistics Canada. Of these, 111 colleges returned their surveys (a 63% response rate), and the results were…

  7. A Survey of Computer Science Capstone Course Literature

    ERIC Educational Resources Information Center

    Dugan, Robert F., Jr.

    2011-01-01

    In this article, we surveyed literature related to undergraduate computer science capstone courses. The survey was organized around course and project issues. Course issues included: course models, learning theories, course goals, course topics, student evaluation, and course evaluation. Project issues included: software process models, software…

  8. Contributions to Art Therapy Literature: A Computer Database Survey.

    ERIC Educational Resources Information Center

    Zeigler, Heather Anne; Hays, Ronald

    1996-01-01

    Surveyed art therapy literature available on computer databases to assess the literary activity of registered art therapists along with other mental health professionals between the years 1983 and 1993. Computer databases were selected as the source of gathering information because they offer a wide and varied audience accessibility to a vast…

  9. Survey of nurse perceptions regarding the utilization of bedside computers.

    PubMed Central

    Willson, D.

    1994-01-01

    In December 1993, Intermountain Health Care (IHC) placed a moratorium on the installation of bedside computers in the acute care setting unless information could be obtained to justify resumption of these installations. A survey was developed and administered to nurses at two IHC hospitals. The survey results indicate that acute care nurses value bedside computers and believe that IHC should install them at other facilities. In addition, the acute care nurses estimate that they are using the bedside computers over 75% of the time during the day shift to document vital signs/measurements and intake/output quantities. Based on the results of this survey, IHC has decided to continue installing bedside computers in the acute care setting. PMID:7949989

  10. Autonomic Cluster Management System (ACMS): A Demonstration of Autonomic Principles at Work

    NASA Technical Reports Server (NTRS)

    Baldassari, James D.; Kopec, Christopher L.; Leshay, Eric S.; Truszkowski, Walt; Finkel, David

    2005-01-01

Cluster computing, whereby a large number of simple processors or nodes are combined together to apparently function as a single powerful computer, has emerged as a research area in its own right. The approach offers a relatively inexpensive means of achieving significant computational capabilities for high-performance computing applications, while simultaneously affording the ability to increase that capability simply by adding more (inexpensive) processors. However, the task of manually managing and configuring a cluster quickly becomes impossible as the cluster grows in size. Autonomic computing is a relatively new approach to managing complex systems that can potentially solve many of the problems inherent in cluster management. We describe the development of a prototype Autonomic Cluster Management System (ACMS) that exploits autonomic properties in automating cluster management.

  11. Survey of Intelligent Computer-Aided Training

    NASA Technical Reports Server (NTRS)

    Loftin, R. B.; Savely, Robert T.

    1992-01-01

    Intelligent Computer-Aided Training (ICAT) systems integrate artificial intelligence and simulation technologies to deliver training for complex, procedural tasks in a distributed, workstation-based environment. Such systems embody both the knowledge of how to perform a task and how to train someone to perform that task. This paper briefly reviews the antecedents of ICAT systems and describes the approach to their creation developed at the NASA Lyndon B. Johnson Space Center. In addition to the general ICAT architecture, specific ICAT applications that have been or are currently under development are discussed. ICAT systems can offer effective solutions to a number of training problems of interest to the aerospace community.

  12. Prevalence and genetic diversity of arginine catabolic mobile element (ACME) in clinical isolates of coagulase-negative staphylococci: identification of ACME type I variants in Staphylococcus epidermidis.

    PubMed

    Onishi, Mayumi; Urushibara, Noriko; Kawaguchiya, Mitsuyo; Ghosh, Souvik; Shinagawa, Masaaki; Watanabe, Naoki; Kobayashi, Nobumichi

    2013-12-01

Arginine catabolic mobile element (ACME), a genomic island consisting of the arc and/or opp3 gene clusters found in staphylococcal species, is related to increased bacterial adaptability to hosts. Staphylococcus epidermidis is considered a major ACME reservoir; however, the prevalence and genetic diversity of ACME in coagulase-negative staphylococci (CNS) have not yet been well characterized for clinical isolates in Japan. A total of 271 clinical isolates of CNS in a Japanese hospital were investigated for the presence and genotype of ACME and SCCmec. The prevalence of ACME-arcA was significantly higher (p<0.001) in S. epidermidis (45.8%) than in other CNS species (3.7%). ACME elements in S. epidermidis isolates (n=87) were differentiated into type I (n=33), variant forms of type I (ΔI, n=26) newly identified in this study, type II (n=6), and type ΔII (n=19). ACME-type ΔI, which was further classified into three subtypes, lacked some genetic components between the arc and opp3 clusters of archetypal type I, whereas the arc and opp3 clusters themselves were intact. The arc cluster exhibited high sequence identity (95.8-100%) to that of type I ACME; in contrast, the opp3 cluster was highly diverse and showed relatively lower identities (94.8-98.7%) to the identical regions in type I ACME. Twenty-one isolates of ΔI ACME-carrying S. epidermidis possessed SCCmec IVa and belonged to ST5 (clonal complex 2). Phylogenetic analysis revealed that isolates harboring ACME ΔI in this study clustered with previously reported S. epidermidis strains of other lineages, suggesting that S. epidermidis originally had some genetic variations in the opp3 cluster. In summary, ACME type ΔI, a truncated variant of ACME-I, was first identified in S. epidermidis and revealed to be prevalent in ST5 MRSE clinical isolates with SCCmec IVa.

  13. Evaluating Tablet Computers as a Survey Tool in Rural Communities

    PubMed Central

    Newell, Steve M.; Logan, Henrietta L.; Guo, Yi; Marks, John G.; Shepperd, James A.

    2015-01-01

    Purpose Although tablet computers offer advantages in data collection over traditional paper-and-pencil methods, little research has examined whether the 2 formats yield similar responses, especially with underserved populations. We compared the 2 survey formats and tested whether participants’ responses to common health questionnaires or perceptions of usability differed by survey format. We also tested whether we could replicate established paper-and-pencil findings via tablet computer. Methods We recruited a sample of low-income community members living in the rural southern United States. Participants were 170 residents (black = 49%; white = 36%; other races and missing data = 15%) drawn from 2 counties meeting Florida’s state statutory definition of rural with 100 persons or fewer per square mile. We randomly assigned participants to complete scales (Center for Epidemiologic Studies Depression Inventory and Regulatory Focus Questionnaire) along with survey format usability ratings via paper-and-pencil or tablet computer. All participants rated a series of previously validated posters using a tablet computer. Finally, participants completed comparisons of the survey formats and reported survey format preferences. Findings Participants preferred using the tablet computer and showed no significant differences between formats in mean responses, scale reliabilities, or in participants’ usability ratings. Conclusions Overall, participants reported similar scales responses and usability ratings between formats. However, participants reported both preferring and enjoying responding via tablet computer more. Collectively, these findings are among the first data to show that tablet computers represent a suitable substitute among an underrepresented rural sample for paper-and-pencil methodology in survey research. PMID:25243953
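The format-equivalence comparison above rests on statistics like the "scale reliabilities" of each questionnaire version. The standard reliability statistic is Cronbach's alpha; a self-contained sketch with invented item scores (not the study's data) shows the computation:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a scale given one list of scores per item
    (respondents aligned by index). Toy implementation using
    population variances."""
    k = len(items)
    n = len(items[0])

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    # Each respondent's total score across the k items.
    totals = [sum(item[i] for item in items) for i in range(n)]
    item_var_sum = sum(var(item) for item in items)
    return (k / (k - 1)) * (1 - item_var_sum / var(totals))

# Three items, five respondents, invented 1-5 Likert scores.
items = [
    [4, 5, 3, 4, 2],
    [4, 4, 3, 5, 2],
    [5, 4, 2, 4, 3],
]
alpha = cronbach_alpha(items)
print(round(alpha, 3))
```

Comparing alpha (and item means) computed separately for the paper and tablet groups, as the study did, is what licenses the conclusion that the two formats yield interchangeable data.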

  14. Sealing Force Increasing of ACM Gasket through Electron Beam Radiation

    NASA Astrophysics Data System (ADS)

    dos Santos, D. J.; Batalha, G. F.

    2011-01-01

    Rubber is an engineering material widely used in sealing parts in the form of O-rings, solid gaskets, and liquid gaskets (materials applied in the liquid state with subsequent vulcanization and sealing). Stress relaxation is a rubber characteristic that negatively impacts such industrial applications (rings and solid gaskets). The purpose of this work is to investigate the use of electron beam (EB) radiation as a technology able to decrease stress relaxation in acrylic rubber (ACM), consequently increasing the sealing capability of this material. ACM samples were irradiated with doses of 100 kGy and 250 kGy, and their behavior was comparatively investigated using dynamic mechanical analysis (DMA) and compression stress relaxation (CSR) experiments. The results obtained by DMA showed an increase of Tg and changes in dynamic mechanical behavior.
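
    The sealing-force decay that CSR experiments measure is often idealized as a single-mode exponential relaxation, F(t)/F0 = exp(-t/tau), where a larger characteristic time tau means slower force loss and better sealing. A minimal sketch under that idealization (real ACM compounds need a spectrum of modes; the data and values below are illustrative, not from the paper):

```python
import numpy as np

def relaxation_time(t, force_ratio):
    """Estimate the characteristic relaxation time tau from CSR data by a
    log-linear fit, assuming F(t)/F0 = exp(-t/tau)."""
    slope = np.polyfit(t, np.log(force_ratio), 1)[0]  # log F = -t/tau
    return -1.0 / slope

t = np.array([0.0, 10.0, 20.0, 40.0, 80.0])           # hours
unirradiated = np.exp(-t / 50.0)                       # faster force decay
irradiated = np.exp(-t / 200.0)                        # crosslinking slows decay
tau_u = relaxation_time(t, unirradiated)
tau_i = relaxation_time(t, irradiated)
```

    In this framing, the paper's claim amounts to EB irradiation raising tau, i.e. flattening the measured force-decay curve.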

  15. Computers for real time flight simulation: A market survey

    NASA Technical Reports Server (NTRS)

    Bekey, G. A.; Karplus, W. J.

    1977-01-01

    An extensive computer market survey was made to determine those available systems suitable for current and future flight simulation studies at Ames Research Center. The primary requirement is for the computation of relatively high frequency content (5 Hz) math models representing powered lift flight vehicles. The Rotor Systems Research Aircraft (RSRA) was used as a benchmark vehicle for computation comparison studies. The general nature of helicopter simulations and a description of the benchmark model are presented, and some of the sources of simulation difficulties are examined. A description of various applicable computer architectures is presented, along with detailed discussions of leading candidate systems and comparisons between them.

  16. Autonomous collaborative mission systems (ACMS) for multi-UAV missions

    NASA Astrophysics Data System (ADS)

    Chen, Y.-L.; Peot, M.; Lee, J.; Sundareswaran, V.; Altshuler, T.

    2005-05-01

    UAVs are a key element of the Army's vision for Force Transformation, and are expected to be employed in large numbers per FCS Unit of Action (UoA). This necessitates a multi-UAV level of autonomous collaborative behavior capability that meets the RSTA and other mission needs of FCS UoAs. Autonomous Collaborative Mission Systems (ACMS) is a scalable architecture and behavior-planning / collaborative approach to achieve this level of capability. The architecture is modular, and the modules may be run in different locations/platforms to accommodate the constraints of available hardware, processing resources, and mission needs. The Mission Management Module determines the roles of member autonomous entities by employing collaboration mechanisms (e.g., market-based), the individual Entity Management Modules work with the Mission Manager in determining the role and task of each entity, the individual Entity Execution Modules monitor task execution, platform navigation, and sensor control, and the World Model Module hosts local and global versions of the environment and the Common Operating Picture (COP). The modules and uniform interfaces provide a consistent and platform-independent baseline mission collaboration mechanism and signaling protocol across different platforms. Further, the modular design allows flexible and convenient addition of new autonomous collaborative behaviors to the ACMS through: adding new behavioral templates in the Mission Planner component, adding new components in appropriate ACMS modules to provide new mission-specific functionality, adding or modifying constraints or parameters in the existing components, or any combination of these. We describe the ACMS architecture, its main features, current development status, and future plans for simulations in this report.

  17. 2005 DOE Computer Graphics Forum Site Survey

    SciTech Connect

    Rebecca, S; Eric, B

    2005-04-15

    The Information Management and Graphics Group supports and develops tools that enhance our ability to access, display, and understand large, complex data sets. Activities include developing visualization software for terascale data exploration; running two video production labs; supporting graphics libraries and tools for end users; maintaining four PowerWalls and assorted other advanced displays; and providing integrated tools for searching, organizing, and browsing scientific data. The Data group supports the Defense and Nuclear Technologies (D&NT) Directorate. The group's visualization team has developed and maintains two visualization tools: MeshTV and VisIt. These are interactive graphical analysis tools for visualizing and analyzing data on two- and three-dimensional meshes. The team also provides movie production support. Researchers in the Center for Applied Scientific Computing (CASC) work on various projects funded by ASC, including the development of visualization and data mining techniques for terascale data exploration. The researchers also have LDRD projects and collaborations with other lab researchers, academia, and industry.

  18. Campus Computing 1990: The EDUCOM/USC Survey of Desktop Computing in Higher Education.

    ERIC Educational Resources Information Center

    Green, Kenneth C.; Eastman, Skip

    The National Survey of Desktop Computer Use in Higher Education was conducted in the spring and summer of 1990 by the Center for Scholarly Technology at the University of Southern California, in cooperation with EDUCOM and with support from 15 corporate sponsors. The survey was designed to collect information about campus planning, policies, and…

  19. The ACLS Survey of Scholars: Views on Publications, Computers, Libraries.

    ERIC Educational Resources Information Center

    Morton, Herbert C.; Price, Anne Jamieson

    1986-01-01

    Reviews results of a survey by the American Council of Learned Societies (ACLS) of 3,835 scholars in the humanities and social sciences who are working both in colleges and universities and outside the academic community. Areas highlighted include professional reading, authorship patterns, computer use, and library use. (LRW)

  20. Business School Computer Usage, Fourth Annual UCLA Survey.

    ERIC Educational Resources Information Center

    Frand, Jason L.; And Others

    The changing nature of the business school computing environment is monitored in a report whose purpose is to provide deans and other policy-makers with information to use in making allocation decisions and program plans. This survey focuses on resource allocations of 249 accredited U.S. business schools and 15 Canadian schools. A total of 128…

  1. Empirical Validation and Application of the Computing Attitudes Survey

    ERIC Educational Resources Information Center

    Dorn, Brian; Elliott Tew, Allison

    2015-01-01

    Student attitudes play an important role in shaping learning experiences. However, few validated instruments exist for measuring student attitude development in a discipline-specific way. In this paper, we present the design, development, and validation of the computing attitudes survey (CAS). The CAS is an extension of the Colorado Learning…

  2. Metamodels for Computer-Based Engineering Design: Survey and Recommendations

    NASA Technical Reports Server (NTRS)

    Simpson, Timothy W.; Peplinski, Jesse; Koch, Patrick N.; Allen, Janet K.

    1997-01-01

    The use of statistical techniques to build approximations of expensive computer analysis codes pervades much of today's engineering design. These statistical approximations, or metamodels, are used to replace the actual expensive computer analyses, facilitating multidisciplinary, multiobjective optimization and concept exploration. In this paper we review several of these techniques, including design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning, and kriging. We survey their existing application in engineering design and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of statistical approximation techniques in given situations and for how common pitfalls can be avoided.
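
    As a concrete illustration of the metamodel idea, the sketch below fits a full quadratic response surface (one of the surveyed techniques) to samples of a hypothetical expensive analysis code by least squares; all names and data are illustrative, not from the paper:

```python
import numpy as np

def fit_quadratic_rsm(X, y):
    """Fit a full quadratic response surface by least squares.

    X: (n, d) sample inputs; y: (n,) outputs of the expensive code.
    Returns the coefficient vector and a cheap predictor closure."""
    def basis(X):
        n, d = X.shape
        cols = [np.ones(n)]
        for i in range(d):
            cols.append(X[:, i])                 # linear terms
        for i in range(d):
            for j in range(i, d):
                cols.append(X[:, i] * X[:, j])   # quadratic/interaction terms
        return np.column_stack(cols)

    A = basis(X)
    coef = np.linalg.lstsq(A, y, rcond=None)[0]
    return coef, lambda Xq: basis(np.atleast_2d(Xq)) @ coef

# Stand-in for a designed experiment on a hypothetical analysis code
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(50, 2))
y = 1.0 + 2.0 * X[:, 0] - X[:, 1] + 0.5 * X[:, 0] * X[:, 1]  # "true" response
coef, predict = fit_quadratic_rsm(X, y)
```

    Once fitted, `predict` replaces the expensive code inside an optimization or concept-exploration loop, which is exactly the role metamodels play in the survey.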

  3. Computer program to analyze multipass pressure-temperature-spinner surveys

    SciTech Connect

    Spielman, Paul

    1994-01-20

    A computer program has been developed to analyze multipass pressure-temperature-spinner surveys and summarize the data in graphical form on two plots: (1) an overlay of spinner passes along with a fluid velocity profile calculated from the spinner and (2) an overlay of pressure, pressure gradient, and temperature profiles from each pass. The program has been written using SmartWare II Software. Fluid velocity is calculated for each data point using a cross-plot of tool speed and spinner counts to account for changing flow conditions in the wellbore. The program has been used successfully to analyze spinner surveys run in geothermal wells with two-phase flashing flow.
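
    The cross-plot analysis described above can be sketched as follows: at one depth station, spinner counts are regressed against tool speed, and the fitted line's zero-crossing gives the speed at which the tool moves with the fluid, i.e. the fluid velocity. A minimal, idealized sketch (not the SmartWare II program itself; the data are synthetic):

```python
import numpy as np

def fluid_velocity(tool_speed, counts):
    """Estimate fluid velocity at one depth station from a tool-speed vs
    spinner-counts cross-plot: fit counts = m*speed + b and return the
    speed at which the line crosses zero counts (tool moving with the
    fluid, so the spinner stalls)."""
    m, b = np.polyfit(tool_speed, counts, 1)
    return -b / m

# Synthetic multipass data: fluid moving up at 1.8 m/s, gain 12 counts per m/s
speed = np.array([-1.0, -0.5, 0.5, 1.0, 1.5])   # tool speed, + = up
cps = 12.0 * (speed - 1.8)                      # idealized spinner response
v_fluid = fluid_velocity(speed, cps)
```

    Repeating this fit per data point along the well yields the fluid velocity profile plotted on the program's first overlay.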

  4. Icing simulation: A survey of computer models and experimental facilities

    NASA Technical Reports Server (NTRS)

    Potapczuk, M. G.; Reinmann, J. J.

    1991-01-01

    A survey of the current methods for simulation of the response of an aircraft or aircraft subsystem to an icing encounter is presented. The topics discussed include computer code modeling of aircraft icing and performance degradation, an evaluation of experimental facility simulation capabilities, and ice protection system evaluation tests in simulated icing conditions. Current research focused on upgrading the simulation fidelity of both experimental and computational methods is discussed. The need for increased understanding of the physical processes governing ice accretion, ice shedding, and iced airfoil aerodynamics is examined.

  5. Survey of computer programs for heat transfer analysis

    NASA Technical Reports Server (NTRS)

    Noor, A. K.

    1982-01-01

    An overview is presented of the current capabilities of thirty-eight computer programs that can be used for the solution of heat transfer problems. These programs range from large, general-purpose codes with a broad spectrum of capabilities, a large user community, and comprehensive user support (e.g., ANSYS, MARC, MITAS II, MSC/NASTRAN, SESAM-69/NV-615) to small, special-purpose codes with limited user communities, such as ANDES, NNTB, SAHARA, SSPTA, TACO, TEPSA, and TRUMP. The capabilities of the programs surveyed are listed in tabular form, followed by a summary of the major features of each program. As with any survey of computer programs, the present one has the following limitations: (1) it is useful only in the initial selection of the programs which are most suitable for a particular application; the final selection of the program to be used should, however, be based on a detailed examination of the documentation and the literature about the program; (2) since computer software continually changes, often at a rapid rate, some means must be found for updating this survey and maintaining some degree of currency.

  6. Survey of computer programs for heat transfer analysis

    NASA Astrophysics Data System (ADS)

    Noor, A. K.

    An overview is presented of the current capabilities of thirty-eight computer programs that can be used for the solution of heat transfer problems. These programs range from large, general-purpose codes with a broad spectrum of capabilities, a large user community, and comprehensive user support (e.g., ANSYS, MARC, MITAS II, MSC/NASTRAN, SESAM-69/NV-615) to small, special-purpose codes with limited user communities, such as ANDES, NNTB, SAHARA, SSPTA, TACO, TEPSA, and TRUMP. The capabilities of the programs surveyed are listed in tabular form, followed by a summary of the major features of each program. As with any survey of computer programs, the present one has the following limitations: (1) it is useful only in the initial selection of the programs which are most suitable for a particular application; the final selection of the program to be used should, however, be based on a detailed examination of the documentation and the literature about the program; (2) since computer software continually changes, often at a rapid rate, some means must be found for updating this survey and maintaining some degree of currency.

  7. Surveying co-located space geodesy techniques for ITRF computation

    NASA Astrophysics Data System (ADS)

    Sarti, P.; Sillard, P.; Vittuari, L.

    2003-04-01

    We present a comprehensive operational methodology, based on classical geodesy triangulation and trilateration, that allows the determination of the reference points of the five space geodesy techniques used in ITRF computation (i.e., DORIS, GPS, LLR, SLR, VLBI). Most of the time, for a single technique, the reference point is not accessible and measurable directly. Likewise, no mechanically determined ex-center with respect to an external and measurable point is usually given. In these cases, it is not possible to directly measure the sought reference points, and it is even less straightforward to obtain the statistical information relating these points for different techniques. We outline the most general practical surveying methodology that permits recovery of the reference points of the different techniques regardless of their physical materialization. We also give a detailed analytical approach for less straightforward cases (e.g., non-geodetic VLBI antennae and SLR/LLR systems). We stress the importance of surveying instrumentation and procedure in achieving the best possible results and outline the impact of the information retrieved with our method on ITRF computation. In particular, we give numerical examples of the computation of the reference point of VLBI antennae (Ny Aalesund and Medicina) and the computation of the ex-center vector linking the co-located VLBI and GPS techniques in Medicina (Italy). Special attention was paid to the rigorous derivation of statistical elements; these will be presented in another presentation.

  8. A Survey of Architectural Techniques for Near-Threshold Computing

    DOE PAGES

    Mittal, Sparsh

    2015-12-28

    Energy efficiency has now become the primary obstacle in scaling the performance of all classes of computing systems. Low-voltage computing, and specifically near-threshold voltage computing (NTC), which involves operating the transistor very close to, and yet above, its threshold voltage, holds the promise of providing many-fold improvements in energy efficiency. However, the use of NTC also presents several challenges, such as increased parametric variation, higher failure rates, and performance loss. Our paper surveys several recent techniques which aim to offset these challenges and fully leverage the potential of NTC. By classifying these techniques along several dimensions, we also highlight their similarities and differences. Ultimately, we hope that this paper will provide insights into state-of-the-art NTC techniques for researchers and system designers and inspire further research in this field.

  9. A Survey of Architectural Techniques for Near-Threshold Computing

    SciTech Connect

    Mittal, Sparsh

    2015-12-28

    Energy efficiency has now become the primary obstacle in scaling the performance of all classes of computing systems. Low-voltage computing, and specifically near-threshold voltage computing (NTC), which involves operating the transistor very close to, and yet above, its threshold voltage, holds the promise of providing many-fold improvements in energy efficiency. However, the use of NTC also presents several challenges, such as increased parametric variation, higher failure rates, and performance loss. Our paper surveys several recent techniques which aim to offset these challenges and fully leverage the potential of NTC. By classifying these techniques along several dimensions, we also highlight their similarities and differences. Ultimately, we hope that this paper will provide insights into state-of-the-art NTC techniques for researchers and system designers and inspire further research in this field.

  10. A survey of CPU-GPU heterogeneous computing techniques

    SciTech Connect

    Mittal, Sparsh; Vetter, Jeffrey S.

    2015-07-04

    As both CPUs and GPUs become employed in a wide range of applications, it has been acknowledged that both of these processing units (PUs) have unique features and strengths, and hence CPU-GPU collaboration is inevitable for achieving high-performance computing. This has motivated a significant amount of research on heterogeneous computing techniques, along with the design of CPU-GPU fused chips and petascale heterogeneous supercomputers. In this paper, we survey heterogeneous computing techniques (HCTs), such as workload partitioning, which enable utilizing both the CPU and GPU to improve performance and/or energy efficiency. We review heterogeneous computing approaches at the runtime, algorithm, programming, compiler, and application levels. Further, we review both discrete and fused CPU-GPU systems and discuss benchmark suites designed for evaluating heterogeneous computing systems (HCSs). We believe that this paper will provide insights into the working and scope of applications of HCTs and will motivate researchers to further harness the computational power of CPUs and GPUs to achieve the goal of exascale performance.
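
    A baseline workload-partitioning HCT can be sketched as a static split of work items in proportion to each processing unit's measured throughput. This is a generic illustration of the idea, not code from the paper:

```python
def partition_by_throughput(n_items, throughputs):
    """Static workload partitioning: split n_items among processing units
    in proportion to measured throughput (items/s), a common baseline
    heterogeneous computing technique.

    Returns per-PU (start, end) index ranges covering [0, n_items)."""
    total = sum(throughputs)
    bounds, acc = [0], 0.0
    for t in throughputs:
        acc += t
        bounds.append(round(n_items * acc / total))
    bounds[-1] = n_items  # guard against rounding drift
    return list(zip(bounds[:-1], bounds[1:]))

# e.g. a GPU measured 3x faster than the CPU on this kernel
ranges = partition_by_throughput(1000, [250.0, 750.0])
```

    Dynamic schemes refine this by re-measuring throughput at runtime and re-balancing chunks, which is one of the runtime-level approaches the survey covers.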

  11. A survey of CPU-GPU heterogeneous computing techniques

    DOE PAGES

    Mittal, Sparsh; Vetter, Jeffrey S.

    2015-07-04

    As both CPUs and GPUs become employed in a wide range of applications, it has been acknowledged that both of these processing units (PUs) have unique features and strengths, and hence CPU-GPU collaboration is inevitable for achieving high-performance computing. This has motivated a significant amount of research on heterogeneous computing techniques, along with the design of CPU-GPU fused chips and petascale heterogeneous supercomputers. In this paper, we survey heterogeneous computing techniques (HCTs), such as workload partitioning, which enable utilizing both the CPU and GPU to improve performance and/or energy efficiency. We review heterogeneous computing approaches at the runtime, algorithm, programming, compiler, and application levels. Further, we review both discrete and fused CPU-GPU systems and discuss benchmark suites designed for evaluating heterogeneous computing systems (HCSs). We believe that this paper will provide insights into the working and scope of applications of HCTs and will motivate researchers to further harness the computational power of CPUs and GPUs to achieve the goal of exascale performance.

  12. Campus Computing, 1995: The Sixth National Survey of Desktop Computing in Higher Education.

    ERIC Educational Resources Information Center

    Green, Kenneth C.

    This monograph reports findings of a Fall, 1995 survey of computing officials at approximately 650 two- and four-year colleges and universities across the United States concerning increasing use of technology on college campuses. Major findings include: the percentage of college courses using e-mail and multimedia resources more than doubled; the…

  13. Campus Computing 1991. The EDUCOM-USC Survey of Desktop Computing in Higher Education.

    ERIC Educational Resources Information Center

    Green, Kenneth C.; Eastman, Skip

    A national survey of desktop computing in higher education was conducted in 1991, covering 2500 institutions. Data were responses from public and private research universities, public and private four-year colleges, and community colleges. Respondents (N=1099) were individuals specifically responsible for the operation and future direction of academic…

  14. Air Traffic Complexity Measurement Environment (ACME): Software User's Guide

    NASA Technical Reports Server (NTRS)

    1996-01-01

    A user's guide for the Air Traffic Complexity Measurement Environment (ACME) software is presented. The ACME consists of two major components: a complexity analysis tool and a user interface. The Complexity Analysis Tool (CAT) analyzes complexity off-line, producing data files which may be examined interactively via the Complexity Data Analysis Tool (CDAT). The Complexity Analysis Tool is composed of three independently executing processes that communicate via PVM (Parallel Virtual Machine) and Unix sockets. The Runtime Data Management and Control process (RUNDMC) extracts flight plan and track information from a SAR input file and sends the information to GARP (Generate Aircraft Routes Process) and CAT (Complexity Analysis Task). GARP in turn generates aircraft trajectories, which are utilized by CAT to calculate sector complexity. CAT writes flight plan, track, and complexity data to an output file, which can be examined interactively. The Complexity Data Analysis Tool (CDAT) provides an interactive graphic environment for examining the complexity data produced by the Complexity Analysis Tool (CAT). CDAT can also play back track data extracted from System Analysis Recording (SAR) tapes. The CDAT user interface consists of a primary window, a controls window, and miscellaneous pop-ups. Aircraft track and position data are displayed in the main viewing area of the primary window. The controls window contains miscellaneous control and display items. Complexity data are displayed in pop-up windows. CDAT plays back sector complexity and aircraft track and position data as a function of time. Controls are provided to start and stop playback, adjust the playback rate, and reposition the display to a specified time.

  15. Campus Computing, 1998. The Ninth National Survey of Desktop Computing and Information Technology in American Higher Education.

    ERIC Educational Resources Information Center

    Green, Kenneth C.

    This report presents findings of a June 1998 survey of computing officials at 1,623 two- and four-year U.S. colleges and universities concerning the use of computer technology. The survey found that computing and information technology (IT) are now core components of the campus environment and classroom experience. However, key aspects of IT…

  16. The Presence of Computers in American Schools. Teaching, Learning, and Computing: 1998 National Survey. Report No.2.

    ERIC Educational Resources Information Center

    Anderson, Ronald E.; Ronnkvist, Amy

    In order to assess the current presence of computing technology in American schools, a national survey was conducted of elementary and secondary principals and technology coordinators in 655 public and private schools. Results are discussed in terms of: computer density; computer capacity; computer renewal; peripherals; computer location;…

  17. Sparse Polynomial Chaos Surrogate for ACME Land Model via Iterative Bayesian Compressive Sensing

    NASA Astrophysics Data System (ADS)

    Sargsyan, K.; Ricciuto, D. M.; Safta, C.; Debusschere, B.; Najm, H. N.; Thornton, P. E.

    2015-12-01

    For computationally expensive climate models, Monte Carlo approaches to exploring the input parameter space are often prohibitive due to slow convergence with respect to ensemble size. To alleviate this, we build inexpensive surrogates using uncertainty quantification (UQ) methods employing Polynomial Chaos (PC) expansions that approximate the input-output relationships using as few model evaluations as possible. However, when many uncertain input parameters are present, such UQ studies suffer from the curse of dimensionality. In particular, for 50-100 input parameters, non-adaptive PC representations have infeasible numbers of basis terms. To this end, we develop and employ Weighted Iterative Bayesian Compressive Sensing to learn the most important input parameter relationships for efficient, sparse PC surrogate construction, with posterior uncertainty quantified due to insufficient data. Besides drastic dimensionality reduction, the uncertain surrogate can efficiently replace the model in computationally intensive studies such as forward uncertainty propagation and variance-based sensitivity analysis, as well as design optimization and parameter estimation using observational data. We applied the surrogate construction and variance-based uncertainty decomposition to the Accelerated Climate Model for Energy (ACME) Land Model for several output QoIs at nearly 100 FLUXNET sites covering multiple plant functional types and climates, varying 65 input parameters over broad ranges of possible values. This work is supported by the U.S. Department of Energy, Office of Science, Biological and Environmental Research, Accelerated Climate Modeling for Energy (ACME) project. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
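
    The sparse-PC idea can be sketched in miniature: build a tensorized Legendre basis, fit coefficients, and keep only the significant terms. The sketch below uses plain least squares with hard thresholding as a crude stand-in for the paper's weighted iterative Bayesian compressive sensing (which additionally quantifies coefficient uncertainty), on 2 rather than 65 illustrative parameters:

```python
import numpy as np
from numpy.polynomial import legendre

def legendre_basis(X, orders):
    """Tensorized Legendre basis at X (n, d) for multi-indices `orders`
    (list of d-tuples); inputs assumed scaled to [-1, 1]."""
    cols = []
    for multi in orders:
        col = np.ones(X.shape[0])
        for dim, deg in enumerate(multi):
            coef = np.zeros(deg + 1)
            coef[deg] = 1.0                      # select P_deg
            col = col * legendre.legval(X[:, dim], coef)
        cols.append(col)
    return np.column_stack(cols)

def sparse_pc_fit(X, y, orders, thresh=1e-8):
    """Least-squares PC fit, hard-threshold near-zero coefficients, then
    refit on the retained support -- a crude sparsifier standing in for
    iterative Bayesian compressive sensing."""
    A = legendre_basis(X, orders)
    c = np.linalg.lstsq(A, y, rcond=None)[0]
    keep = np.abs(c) > thresh
    c_sparse = np.zeros_like(c)
    c_sparse[keep] = np.linalg.lstsq(A[:, keep], y, rcond=None)[0]
    return c_sparse, keep

# Total degree <= 3 in 2 dims (10 basis terms); the true model is sparse
rng = np.random.default_rng(1)
orders = [(i, j) for i in range(4) for j in range(4) if i + j <= 3]
X = rng.uniform(-1.0, 1.0, size=(40, 2))
y = 0.5 + 2.0 * X[:, 0] - 1.5 * 0.5 * (3.0 * X[:, 1] ** 2 - 1.0)  # P0, P1, P2
c_sparse, keep = sparse_pc_fit(X, y, orders)
```

    At 65 parameters the full basis explodes combinatorially, which is exactly why the paper learns a sparse support instead of fitting every term.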

  18. Survey of patient dose in computed tomography in Syria 2009.

    PubMed

    Kharita, M H; Khazzam, S

    2010-09-01

    The radiation doses to patients in computed tomography (CT) in Syria have been investigated and compared with similar studies in different countries. This work surveyed 30 CT scanners from six different manufacturers distributed all over Syria. Some of the results in this paper were part of a project launched by the International Atomic Energy Agency in different regions of the world covering Asia, Africa, and Eastern Europe. The dose quantities covered are the CT dose index (CTDI(w)), dose-length product (DLP), effective dose (E), and collective dose. It was found that most CTDI(w) and DLP values were similar to the European reference levels and in line with the results of similar surveys in the world. The results were in good agreement with the UNSCEAR Report 2007. This study concluded with a recommendation for national diagnostic reference levels for the most common CT protocols in Syria. The results can be used as a base for future optimisation studies in the country.
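
    In such surveys the effective dose is commonly estimated from the dose-length product as E ≈ k × DLP, with a body-region-specific conversion coefficient k. A minimal sketch with representative adult coefficients (illustrative values, not the coefficients used in this particular survey):

```python
# Representative adult conversion coefficients k (mSv per mGy*cm); these
# are illustrative -- a real study should use the coefficients it cites.
K_FACTORS = {"head": 0.0021, "chest": 0.014, "abdomen": 0.015}

def effective_dose_mSv(dlp_mGy_cm, region):
    """Broad estimate of effective dose, E = k * DLP, for a CT examination."""
    return K_FACTORS[region] * dlp_mGy_cm

e = effective_dose_mSv(400.0, "chest")  # a 400 mGy*cm chest scan
```

    Aggregating such per-examination estimates over examination frequencies is how survey-level collective dose figures are built up.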

  19. A survey of computational aerodynamics in the United States

    NASA Technical Reports Server (NTRS)

    Gessow, A.; Morris, D. J.

    1977-01-01

    Programs in theoretical and computational aerodynamics in the United States are described. Those aspects of programs that relate to aeronautics are detailed. The role of analysis at various levels of sophistication is discussed as well as the inverse solution techniques that are of primary importance in design methodology. The research is divided into the broad categories of application for boundary layer flow, Navier-Stokes turbulence modeling, internal flows, two-dimensional configurations, subsonic and supersonic aircraft, transonic aircraft, and the space shuttle. A survey of representative work in each area is presented.

  20. Importance of Computer Competencies for Entering JCCC Students: A Survey of Faculty and Staff.

    ERIC Educational Resources Information Center

    Weglarz, Shirley

    Johnson County Community College (JCCC) conducted a survey in response to faculty comments regarding entering students' lack of rudimentary computer skills. Faculty were spending time in non-computer related classes teaching students basic computer skills. The aim of the survey was to determine what the basic computer competencies for entering…

  1. Design and implementation of GaAs HBT circuits with ACME

    NASA Technical Reports Server (NTRS)

    Hutchings, Brad L.; Carter, Tony M.

    1993-01-01

    GaAs HBT circuits offer high performance (5-20 GHz) and radiation hardness (500 Mrad) that is attractive for space applications. ACME is a CAD tool specifically developed for HBT circuits. ACME implements a novel physical schematic-capture design technique where designers simultaneously view the structure and physical organization of a circuit. ACME's design interface is similar to schematic capture; however, unlike conventional schematic capture, designers can directly control the physical placement of both function and interconnect at the schematic level. In addition, ACME provides design-time parasitic extraction, complex wire models, and extensions to Multi-Chip Modules (MCMs). A GaAs HBT gate-array and semi-custom circuits have been developed with ACME; several circuits have been fabricated and found to be fully functional.

  2. Pomegranate MR images analysis using ACM and FCM algorithms

    NASA Astrophysics Data System (ADS)

    Morad, Ghobad; Shamsi, Mousa; Sedaaghi, M. H.; Alsharif, M. R.

    2011-10-01

    Segmentation of an image plays an important role in image processing applications. In this paper, segmentation of pomegranate magnetic resonance (MR) images is explored. Pomegranate has healthy nutritional and medicinal properties, and its maturity indices and the quality of its internal tissues play an important role in the sorting process; an admissible determination of these features cannot easily be achieved by a human operator. Seeds and soft tissues are the main internal components of pomegranate. For research purposes, such as non-destructive investigation, in order to determine the ripening index and the percentage of seeds during the growth period, segmentation of the internal structures should be performed as exactly as possible. In this paper, we present an automatic algorithm to segment the internal structure of pomegranate. Since the intensity of the stem and calyx is close to that of the internal tissues, stem and calyx pixels are usually mislabeled as internal tissue by the segmentation algorithm. To solve this problem, first, the fruit shape is extracted from its background using an active contour model (ACM). Then the stem and calyx are removed using morphological filters. Finally, the image is segmented by fuzzy c-means (FCM). The experimental results show an accuracy of 95.91% in the presence of stem and calyx, while the accuracy of segmentation increases to 97.53% when the stem and calyx are first removed by morphological filters.
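
    The final FCM stage can be sketched as a minimal fuzzy c-means on a 1-D intensity vector. This is a generic textbook FCM implementation, not the authors' code, and it omits the ACM and morphological-filter steps that precede clustering in the paper:

```python
import numpy as np

def fcm(values, n_clusters=2, m=2.0, iters=100, seed=0):
    """Minimal fuzzy c-means: alternate membership and centroid updates.

    values: 1-D intensity vector; m: fuzzifier (>1). Returns cluster
    centers and the (n_clusters, n_points) membership matrix."""
    rng = np.random.default_rng(seed)
    u = rng.random((n_clusters, values.size))
    u /= u.sum(axis=0)                               # memberships sum to 1
    for _ in range(iters):
        um = u ** m
        centers = (um @ values) / um.sum(axis=1)     # fuzzy-weighted centroids
        d = np.abs(values[None, :] - centers[:, None]) + 1e-12
        inv = d ** (-2.0 / (m - 1.0))
        u = inv / inv.sum(axis=0)                    # standard FCM update
    return centers, u

# Two well-separated intensity populations (e.g. seeds vs soft tissue)
vals = np.concatenate([np.full(50, 0.2), np.full(50, 0.8)])
centers, u = fcm(vals, n_clusters=2)
labels = u.argmax(axis=0)                            # hard labels per pixel
```

    On a real MR slice the same update runs on the flattened pixel intensities inside the ACM-extracted fruit mask.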

  3. ARM Airborne Carbon Measurements VI (ACME VI) Science Plan

    SciTech Connect

    Biraud, S

    2015-12-01

    From October 1 through September 30, 2016, the Atmospheric Radiation Measurement (ARM) Aerial Facility will deploy the Cessna 206 aircraft over the Southern Great Plains (SGP) site, collecting observations of trace-gas mixing ratios over the ARM’s SGP facility. The aircraft payload includes two Atmospheric Observing Systems, Inc., analyzers for continuous measurements of CO2 and a 12-flask sampler for analysis of carbon cycle gases (CO2, CO, CH4, N2O, 13CO2, 14CO2, carbonyl sulfide, and trace hydrocarbon species, including ethane). The aircraft payload also includes instrumentation for solar/infrared radiation measurements. This research is supported by the U.S. Department of Energy’s ARM Climate Research Facility and Terrestrial Ecosystem Science Program and builds upon previous ARM Airborne Carbon Measurements (ARM-ACME) missions. The goal of these measurements is to improve understanding of 1) the carbon exchange at the SGP site, 2) how CO2 and associated water and energy fluxes influence radiative forcing, convective processes and CO2 concentrations over the SGP site, and 3) how greenhouse gases are transported on continental scales.

  4. Computer-Aided Diagnostic System For Mass Survey Chest Images

    NASA Astrophysics Data System (ADS)

    Yasuda, Yoshizumi; Kinoshita, Yasuhiro; Emori, Yasufumi; Yoshimura, Hitoshi

    1988-06-01

    In order to support the screening of chest radiographs in a mass survey, a computer-aided diagnostic system has been developed that automatically detects abnormality in candidate images using digital image analysis techniques. Extracting the boundary lines of the lung fields and examining their shapes allowed various kinds of abnormalities to be detected. Correction and expansion were facilitated by describing the system control, image analysis control, and judgement of abnormality in a rule-type programming language. In experiments using typical samples of students' radiograms, good results were obtained for the detection of abnormal lung field shape, cardiac hypertrophy, and scoliosis. As for the detection of diaphragmatic abnormality, relatively good results were obtained, but further improvements will be necessary.

  5. Computational needs survey of NASA automation and robotics missions. Volume 1: Survey and results

    NASA Technical Reports Server (NTRS)

    Davis, Gloria J.

    1991-01-01

    NASA's operational use of advanced processor technology in space systems lags behind its commercial development by more than eight years. One of the factors contributing to this is that mission computing requirements are frequently unknown, unstated, misrepresented, or simply not available in a timely manner. NASA must provide clear common requirements to make better use of available technology, to cut development lead time on deployable architectures, and to increase the utilization of new technology. A preliminary set of advanced mission computational processing requirements for automation and robotics (A&R) systems is provided for use by NASA, industry, and academic communities. These results were obtained in an assessment of the computational needs of current projects throughout NASA. A high percentage of responses indicated a general need for enhanced computational capabilities beyond the currently available 80386 and 68020 processor technology. Because of the need for faster processors and more memory, 90 percent of the polled automation projects have reduced or will reduce the scope of their implementation capabilities. The requirements are presented with respect to their targeted environment, identifying the applications required, the system performance levels necessary to support them, and the degree to which they are met within typical programmatic constraints. Volume one includes the survey and results. Volume two contains the appendixes.

  6. Survey of computer programs for heat transfer analysis

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.

    1986-01-01

    An overview is given of the current capabilities of thirty-three computer programs that are used to solve heat transfer problems. The programs considered range from large general-purpose codes with a broad spectrum of capabilities, large user communities, and comprehensive user support (e.g., ABAQUS, ANSYS, EAL, MARC, MITAS II, MSC/NASTRAN, and SAMCEF) to small, special-purpose codes with limited user communities, such as ANDES, NTEMP, TAC2D, TAC3D, TEPSA, and TRUMP. The majority of the programs use either finite elements or finite differences for the spatial discretization. The capabilities of the programs are listed in tabular form, followed by a summary of the major features of each program. The information presented herein is based on a questionnaire sent to the developers of each program and is preceded by brief background material needed for effective evaluation and use of computer programs for heat transfer analysis. The present survey is useful in the initial selection of the programs most suitable for a particular application. The final selection of the program to be used should, however, be based on a detailed examination of the documentation and the literature about the program.
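    To make the finite-difference discretization mentioned above concrete, here is a minimal explicit (FTCS) scheme for the 1D heat equation of the kind such codes implement internally; the function name and grid values are illustrative only, not drawn from any of the surveyed programs:

```python
def heat_ftcs(u0, alpha, dx, dt, steps):
    """Explicit (forward-time, centred-space) finite-difference solution
    of u_t = alpha * u_xx with Dirichlet boundary values taken from u0."""
    r = alpha * dt / dx ** 2
    assert r <= 0.5, "stability limit of the explicit scheme"
    u = list(u0)
    for _ in range(steps):
        u = [u[0]] + [
            u[i] + r * (u[i + 1] - 2 * u[i] + u[i - 1])
            for i in range(1, len(u) - 1)
        ] + [u[-1]]
    return u

# A bar initially hot in the middle, with its ends held at zero:
u = heat_ftcs([0, 0, 1, 0, 0], alpha=1.0, dx=0.1, dt=0.004, steps=50)
```

    The large codes differ in geometry handling, material models, and solvers, but this update rule is the core of an explicit conduction step.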

  7. Improving radiation survey data using CADD/CAE (computer-aided design and drafting computer-aided engineering)

    SciTech Connect

    Palau, G.L.; Tarpinian, J.E.

    1987-01-01

    A new application of computer-aided design and drafting (CADD) and computer-aided engineering (CAE) at the Three Mile Island Unit 2 (TMI-2) cleanup is improving the quality of radiation survey data taken in the plant. The use of CADD/CAE-generated survey maps has increased both the accuracy of survey data and the capability to perform analyses with these data. In addition, health physics technician manhours and radiation exposure can be reduced in situations where the CADD/CAE-generated drawings are used for survey mapping.

  8. Comparative study of numerical schemes of TVD3, UNO3-ACM and optimized compact scheme

    NASA Technical Reports Server (NTRS)

    Lee, Duck-Joo; Hwang, Chang-Jeon; Ko, Duck-Kon; Kim, Jae-Wook

    1995-01-01

    Three different schemes are employed to solve the benchmark problems. The first is a conventional TVD-MUSCL (Monotone Upwind Schemes for Conservation Laws) scheme. The second is a UNO3-ACM (Uniformly Non-Oscillatory Artificial Compression Method) scheme. The third is an optimized compact finite-difference scheme modified by us: 4th-order Runge-Kutta time stepping with a 4th-order pentadiagonal compact spatial discretization having maximum resolution characteristics. The problems of category 1 are solved using the second (UNO3-ACM) and third (optimized compact) schemes. The problems of category 2 are solved using the first (TVD3) and second (UNO3-ACM) schemes. The problem of category 5 is solved using the first (TVD3) scheme. It can be concluded from the present calculations that the optimized compact scheme and the UNO3-ACM show good resolution for categories 1 and 2, respectively.
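    As a simplified stand-in for the third scheme's time integrator, the fragment below combines classical 4th-order Runge-Kutta stepping with an explicit 4th-order central difference for linear advection on a periodic grid; the paper's pentadiagonal compact scheme would instead obtain the derivative from a banded solve, so this is only a sketch of the idea, not the authors' code:

```python
import math

def rhs(u, c, dx):
    """Fourth-order central difference of -c * u_x on a periodic grid."""
    n = len(u)
    return [
        -c * (-u[(i + 2) % n] + 8 * u[(i + 1) % n]
              - 8 * u[(i - 1) % n] + u[(i - 2) % n]) / (12 * dx)
        for i in range(n)
    ]

def rk4_step(u, c, dx, dt):
    """One classical fourth-order Runge-Kutta step."""
    k1 = rhs(u, c, dx)
    k2 = rhs([ui + 0.5 * dt * ki for ui, ki in zip(u, k1)], c, dx)
    k3 = rhs([ui + 0.5 * dt * ki for ui, ki in zip(u, k2)], c, dx)
    k4 = rhs([ui + dt * ki for ui, ki in zip(u, k3)], c, dx)
    return [ui + dt / 6 * (a + 2 * b + 2 * c_ + d)
            for ui, a, b, c_, d in zip(u, k1, k2, k3, k4)]

n, c, dx = 64, 1.0, 1.0 / 64
u = [math.sin(2 * math.pi * i * dx) for i in range(n)]
dt = 0.5 * dx
for _ in range(int(1.0 / dt)):   # advect exactly one full period
    u = rk4_step(u, c, dx, dt)
```

    After one full period the sine wave should return almost exactly to its initial shape; the residual error measures the combined dispersion and dissipation of the scheme.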

  9. The acellular matrix (ACM) for bladder tissue engineering: A quantitative magnetic resonance imaging study.

    PubMed

    Cheng, Hai-Ling Margaret; Loai, Yasir; Beaumont, Marine; Farhat, Walid A

    2010-08-01

    Bladder acellular matrices (ACMs) derived from natural tissue are gaining increasing attention for their role in tissue engineering and regeneration. Unlike conventional scaffolds based on biodegradable polymers or gels, ACMs possess native biomechanical and many acquired biologic properties. Efforts to optimize ACM-based scaffolds are ongoing and would be greatly assisted by a noninvasive means to characterize scaffold properties and monitor interaction with cells. MRI is well suited to this role, but research with MRI for scaffold characterization has been limited. This study presents initial results from quantitative MRI measurements for bladder ACM characterization and investigates the effects of incorporating hyaluronic acid, a natural biomaterial useful in tissue engineering and regeneration. Measured MR relaxation times (T(1), T(2)) and the diffusion coefficient were consistent with the increased water uptake and glycosaminoglycan content observed on biochemistry in hyaluronic acid ACMs. Multicomponent MRI provided greater specificity, with diffusion data showing an acellular environment and T(2) components distinguishing the separate effects of increased glycosaminoglycans and hydration. These results suggest that quantitative MRI may provide useful information on matrix composition and structure, which is valuable in guiding further development using bladder ACMs for organ regeneration and in strategies involving the use of hyaluronic acid.
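    A minimal illustration of how a relaxation constant such as T(2) is extracted from signal measurements, assuming a simple mono-exponential decay (the study's multicomponent analysis is more elaborate, and the echo times and amplitudes below are invented for illustration):

```python
import math

def fit_t2(echo_times, signals):
    """Log-linear least-squares fit of S(TE) = S0 * exp(-TE / T2).
    Returns (S0, T2)."""
    n = len(echo_times)
    xs, ys = echo_times, [math.log(s) for s in signals]
    xbar, ybar = sum(xs) / n, sum(ys) / n
    slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
             / sum((x - xbar) ** 2 for x in xs))
    intercept = ybar - slope * xbar
    return math.exp(intercept), -1.0 / slope

# Synthetic decay with S0 = 100 and T2 = 80 ms, sampled at four echoes:
te = [20.0, 40.0, 60.0, 80.0]
s = [100.0 * math.exp(-t / 80.0) for t in te]
s0, t2 = fit_t2(te, s)
```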

  10. Computer Databases: A Survey; Part 1: General and News Databases.

    ERIC Educational Resources Information Center

    O'Leary, Mick

    1986-01-01

    Descriptions and evaluations of 13 databases devoted to computer information are presented by type under four headings: bibliographic databases; daily news services; online computer magazines; and specialized computer industry databases. Information on database producers, starting date of file, update frequency, vendors, and prices is summarized…

  11. Survey of Commercially Available Computer-Readable Bibliographic Data Bases.

    ERIC Educational Resources Information Center

    Schneider, John H., Ed.; And Others

    This document contains the results of a survey of 94 U. S. organizations, and 36 organizations in other countries that were thought to prepare machine-readable data bases. Of those surveyed, 55 organizations (40 in U. S., 15 in other countries) provided completed camera-ready forms describing 81 commercially available, machine-readable data bases…

  12. A Survey of Computer Use in Two-Year College Reading Programs.

    ERIC Educational Resources Information Center

    Swartz, Donna

    1985-01-01

    A mail survey of two-year colleges was conducted to identify (1) two-year colleges using computer technology to teach reading, (2) the types of hardware and software used, (3) the courses in which computer technology is used, and (4) the ways in which computer technology is used in two-year college reading programs. Responses from 181 two-year…

  13. Computation of optimized arrays for 3-D electrical imaging surveys

    NASA Astrophysics Data System (ADS)

    Loke, M. H.; Wilkinson, P. B.; Uhlemann, S. S.; Chambers, J. E.; Oxby, L. S.

    2014-12-01

    3-D electrical resistivity surveys and inversion models are required to accurately resolve structures in areas with very complex geology where 2-D models might suffer from artefacts. Many 3-D surveys use a grid where the number of electrodes along one direction (x) is much greater than in the perpendicular direction (y). Frequently, due to limitations in the number of independent electrodes in the multi-electrode system, the surveys use a roll-along system with a small number of parallel survey lines aligned along the x-direction. The `Compare R' array optimization method previously used for 2-D surveys is adapted for such 3-D surveys. Offset versions of the inline arrays used in 2-D surveys are included in the number of possible arrays (the comprehensive data set) to improve the sensitivity to structures in between the lines. The array geometric factor and its relative error are used to filter out potentially unstable arrays in the construction of the comprehensive data set. Comparisons of the conventional (consisting of dipole-dipole and Wenner-Schlumberger arrays) and optimized arrays are made using a synthetic model and experimental measurements in a tank. The tests show that structures located between the lines are better resolved with the optimized arrays. The optimized arrays also have significantly better depth resolution compared to the conventional arrays.
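    The geometric-factor filter mentioned in the abstract can be sketched as follows; the electrode positions and the stability threshold are illustrative values, not those used in the paper, but the half-space formula itself is standard:

```python
import math

def geometric_factor(a_pos, b_pos, m_pos, n_pos):
    """Geometric factor k of a surface four-electrode array on a
    homogeneous half-space: k = 2*pi / (1/AM - 1/AN - 1/BM + 1/BN),
    where AM etc. are inter-electrode distances (A, B inject current;
    M, N measure potential)."""
    d = lambda p, q: abs(p - q)
    return 2 * math.pi / (1 / d(a_pos, m_pos) - 1 / d(a_pos, n_pos)
                          - 1 / d(b_pos, m_pos) + 1 / d(b_pos, n_pos))

# Dipole-dipole array B-A ... M-N with spacing a = 1 m, separation n = 2:
a, n = 1.0, 2
k = geometric_factor(a, 0.0, (n + 1) * a, (n + 2) * a)
assert abs(k - math.pi * n * (n + 1) * (n + 2) * a) < 1e-9  # textbook value

# Arrays with a very large |k| amplify measurement noise; such arrays
# would be filtered out of the comprehensive data set:
stable = abs(k) < 5000.0
```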

  14. A comparison of computational methods and algorithms for the complex gamma function

    NASA Technical Reports Server (NTRS)

    Ng, E. W.

    1974-01-01

    A survey and comparison of some computational methods and algorithms for gamma and log-gamma functions of complex arguments are presented. The methods and algorithms reported include Chebyshev approximations, Padé expansions, and Stirling's asymptotic series. The comparison leads to the conclusion that Algorithm 421, published in the Communications of the ACM by H. Kuki, is the best program either for individual applications or for inclusion in subroutine libraries.
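    Stirling's asymptotic series, one of the surveyed methods, can be sketched for real arguments as follows; Kuki's Algorithm 421 handles complex arguments and is far more carefully engineered, so this fragment only illustrates the series-plus-recurrence idea:

```python
import math

# Stirling-series coefficients B_2n / (2n * (2n - 1)) for n = 1..4
_COEF = [1 / 12, -1 / 360, 1 / 1260, -1 / 1680]

def lgamma_stirling(x, shift=10.0):
    """log Gamma(x) for real x > 0: shift the argument upward with the
    recurrence Gamma(x) = Gamma(x + 1) / x until Stirling's asymptotic
    series is accurate, then evaluate the series."""
    log_shift = 0.0
    while x < shift:
        log_shift -= math.log(x)
        x += 1.0
    s = (x - 0.5) * math.log(x) - x + 0.5 * math.log(2 * math.pi)
    s += sum(c / x ** (2 * k + 1) for k, c in enumerate(_COEF))
    return s + log_shift

# Agrees closely with the library routine:
assert abs(lgamma_stirling(3.7) - math.lgamma(3.7)) < 1e-9
```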

  15. Survey of Unsteady Computational Aerodynamics for Horizontal Axis Wind Turbines

    NASA Astrophysics Data System (ADS)

    Frunzulicǎ, F.; Dumitrescu, H.; Cardoş, V.

    2010-09-01

    We present a short review of aerodynamic computational models for horizontal axis wind turbines (HAWT). The models presented span a range of complexity for calculating the aerodynamic loads on a HAWT rotor, starting with the simplest blade element momentum (BEM) method and ending with the full Navier-Stokes equations. We also discuss some computational aspects of these models.
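    The simplest model in this hierarchy, blade element momentum, can be sketched as a fixed-point iteration for the induction factors of one blade element; the thin-airfoil lift curve, constant drag value, and operating point below are illustrative assumptions standing in for real airfoil polars:

```python
import math

def bem_induction(tsr_local, solidity, pitch, n_iter=200, relax=0.5):
    """Fixed-point iteration for the axial (a) and tangential (ap)
    induction factors of one blade element.  A thin-airfoil lift curve
    Cl = 2*pi*alpha and a constant Cd = 0.01 stand in for real polars."""
    a = ap = 0.0
    for _ in range(n_iter):
        phi = math.atan2(1 - a, (1 + ap) * tsr_local)  # inflow angle
        alpha = phi - pitch                            # angle of attack
        cl, cd = 2 * math.pi * alpha, 0.01
        cn = cl * math.cos(phi) + cd * math.sin(phi)   # normal coefficient
        ct = cl * math.sin(phi) - cd * math.cos(phi)   # tangential coefficient
        a_new = 1 / (4 * math.sin(phi) ** 2 / (solidity * cn) + 1)
        ap_new = 1 / (4 * math.sin(phi) * math.cos(phi) / (solidity * ct) - 1)
        a += relax * (a_new - a)     # under-relaxation aids convergence
        ap += relax * (ap_new - ap)
    return a, ap

# Hypothetical mid-span element: local tip-speed ratio 6, solidity 0.05:
a, ap = bem_induction(tsr_local=6.0, solidity=0.05, pitch=math.radians(2))
```

    The more complex models in the review replace these momentum-balance closures with vortex-wake or full CFD solutions of the flow field.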

  16. Campus Computing, 2000: The 11th National Survey of Computing and Information Technology in American Higher Education.

    ERIC Educational Resources Information Center

    Green, Kenneth C.

    The 2000 Campus Computing Survey, the 11th such survey, was sent to the chief academic officer at 1,176 two-year and four-year colleges and universities across the United States. By October 2000, 506 responses had been received, a response rate of 43%. New data reveal that the growing demand for technology talent across all sectors of the U.S.…

  17. Survey of ANL organization plans for word processors, personal computers, workstations, and associated software

    SciTech Connect

    Fenske, K.R.

    1991-11-01

    The Computing and Telecommunications Division (CTD) has compiled this Survey of ANL Organization Plans for Word Processors, Personal Computers, Workstations, and Associated Software to provide DOE and Argonne with a record of recent growth in the acquisition and use of personal computers, microcomputers, and word processors at ANL. Laboratory planners, service providers, and people involved in office automation may find the Survey useful. It is for internal use only, and any unauthorized use is prohibited. Readers of the Survey should use it as a reference that documents the plans of each organization for office automation, identifies appropriate planners and other contact people in those organizations, and encourages the sharing of this information among those people making plans for organizations and decisions about office automation. The Survey supplements information in both the ANL Statement of Site Strategy for Computing Workstations and the ANL Site Response for the DOE Information Technology Resources Long-Range Plan.

  18. Survey of ANL organization plans for word processors, personal computers, workstations, and associated software. Revision 3

    SciTech Connect

    Fenske, K.R.

    1991-11-01

    The Computing and Telecommunications Division (CTD) has compiled this Survey of ANL Organization Plans for Word Processors, Personal Computers, Workstations, and Associated Software to provide DOE and Argonne with a record of recent growth in the acquisition and use of personal computers, microcomputers, and word processors at ANL. Laboratory planners, service providers, and people involved in office automation may find the Survey useful. It is for internal use only, and any unauthorized use is prohibited. Readers of the Survey should use it as a reference that documents the plans of each organization for office automation, identifies appropriate planners and other contact people in those organizations, and encourages the sharing of this information among those people making plans for organizations and decisions about office automation. The Survey supplements information in both the ANL Statement of Site Strategy for Computing Workstations and the ANL Site Response for the DOE Information Technology Resources Long-Range Plan.

  19. The Successive Contributions of Computers to Education: A Survey.

    ERIC Educational Resources Information Center

    Lelouche, Ruddy

    1998-01-01

    Shows how education has successively benefited from traditional information processing through programmed instruction and computer-assisted instruction (CAI), artificial intelligence, intelligent CAI, intelligent tutoring systems, and hypermedia techniques. Contains 29 references. (DDR)

  20. A Synthesis and Survey of Critical Success Factors for Computer Technology Projects

    ERIC Educational Resources Information Center

    Baker, Ross A.

    2012-01-01

    The author investigated the existence of critical success factors for computer technology projects. Current research literature and a survey of experienced project managers indicate that there are 23 critical success factors (CSFs) that correlate with project success. The survey gathered an assessment of project success and the degree to which…

  1. A Survey of Computer Usage in Adult Education Programs in Florida Report.

    ERIC Educational Resources Information Center

    Florida State Dept. of Education, Tallahassee. Div. of Vocational, Adult, and Community Education.

    A study was conducted to identify the types and uses of computer hardware and software in adult and community education programs in Florida. Information was gathered through a survey instrument developed for the study and mailed to 100 adult and community education directors and adult literacy center coordinators (92 surveys were returned). The…

  2. CNTF-ACM promotes mitochondrial respiration and oxidative stress in cortical neurons through upregulating L-type calcium channel activity.

    PubMed

    Sun, Meiqun; Liu, Hongli; Xu, Huanbai; Wang, Hongtao; Wang, Xiaojing

    2016-09-01

    A specialized culture medium termed ciliary neurotrophic factor-treated astrocyte-conditioned medium (CNTF-ACM) allows investigators to assess the peripheral effects of CNTF-induced activated astrocytes upon cultured neurons. CNTF-ACM has been shown to upregulate neuronal L-type calcium channel current activity, which has been previously linked to changes in mitochondrial respiration and oxidative stress. Therefore, the aim of this study was to evaluate CNTF-ACM's effects upon mitochondrial respiration and oxidative stress in rat cortical neurons. Cortical neurons, CNTF-ACM, and untreated control astrocyte-conditioned medium (UC-ACM) were prepared from neonatal Sprague-Dawley rat cortical tissue. Neurons were cultured in either CNTF-ACM or UC-ACM for a 48-h period. Changes in the following parameters before and after treatment with the L-type calcium channel blocker isradipine were assessed: (i) intracellular calcium levels, (ii) mitochondrial membrane potential (ΔΨm), (iii) oxygen consumption rate (OCR) and adenosine triphosphate (ATP) formation, (iv) intracellular nitric oxide (NO) levels, (v) mitochondrial reactive oxygen species (ROS) production, and (vi) susceptibility to the mitochondrial complex I toxin rotenone. CNTF-ACM neurons displayed the following significant changes relative to UC-ACM neurons: (i) increased intracellular calcium levels (p < 0.05), (ii) elevation in ΔΨm (p < 0.05), (iii) increased OCR and ATP formation (p < 0.05), (iv) increased intracellular NO levels (p < 0.05), (v) increased mitochondrial ROS production (p < 0.05), and (vi) increased susceptibility to rotenone (p < 0.05). Treatment with isradipine was able to partially rescue these negative effects of CNTF-ACM (p < 0.05). CNTF-ACM promotes mitochondrial respiration and oxidative stress in cortical neurons through elevating L-type calcium channel activity. PMID:27514537

  3. An audience-channel-message-evaluation (ACME) framework for health communication campaigns.

    PubMed

    Noar, Seth M

    2012-07-01

    Recent reviews of the literature have indicated that a number of health communication campaigns continue to fail to adhere to principles of effective campaign design. The lack of an integrated, organizing framework for the design, implementation, and evaluation of health communication campaigns may contribute to this state of affairs. The current article introduces an audience-channel-message-evaluation (ACME) framework that organizes the major principles of health campaign design, implementation, and evaluation. ACME also explicates the relationships and linkages between the varying principles. Insights from ACME include the following: The choice of audience segment(s) to focus on in a campaign affects all other campaign design choices, including message strategy and channel/component options. Although channel selection influences options for message design, choice of message design also influences channel options. Evaluation should not be thought of as a separate activity, but rather should be infused and integrated throughout the campaign design and implementation process, including formative, process, and outcome evaluation activities. Overall, health communication campaigns that adhere to this integrated set of principles of effective campaign design will have a greater chance of success than those using principles idiosyncratically. These design, implementation, and evaluation principles are embodied in the ACME framework.

  5. 76 FR 64943 - Proposed Cercla Administrative Cost Recovery Settlement; ACM Smelter and Refinery Site, Located...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-19

    ... administrative settlement for recovery of past and projected future response costs concerning the ACM Smelter and... past response costs, as well as future response costs under the settlement. The settlement includes a... considerations which indicate that the settlement is inappropriate, improper, or inadequate. The...

  6. SUPERFUND TREATABILITY CLEARINGHOUSE: FINAL REPORT, PHASE I - IMMEDIATE ASSESSMENT, ACME SOLVENTS SITE

    EPA Science Inventory

    This is a site assessment and feasibility study of incineration alternatives at the ACME Solvents Site at Rockford, Illinois. The document contains laboratory results that are reported to simulate incineration conditions but no details on test methods were provided. The d...

  7. A Survey of Computer Use by Undergraduate Psychology Departments in Virginia.

    ERIC Educational Resources Information Center

    Stoloff, Michael L.; Couch, James V.

    1987-01-01

    Reports a survey of computer use in psychology departments in Virginia's four year colleges. Results showed that faculty, students, and clerical staff used word processing, statistical analysis, and database management most frequently. The three most numerous computers brands were the Apple II family, IBM PCs, and the Apple Macintosh. (Author/JDH)

  8. Computer Databases: A Survey. Part 3: Product Databases.

    ERIC Educational Resources Information Center

    O'Leary, Mick

    1987-01-01

    Describes five online databases that focus on computer products, primarily software and microcomputing hardware, and compares the databases in terms of record content, product coverage, vertical market coverage, currency, availability, and price. Sample records and searches are provided, as well as a directory of product databases. (CLB)

  9. Computer Usage Survey for NUCEA Region IV. Summary and Observations.

    ERIC Educational Resources Information Center

    Jeska, Elizabeth E.; White, Cynthia

    The 57 institutional members of Region IV of the National University Continuing Education Association (NUCEA) were asked to provide information on computerization for teaching and conference use. Forty institutions (70 percent) responded. Sixty percent of the respondents indicated having a computer teaching facility. Of the 16 schools without a…

  10. Software survey: VOSviewer, a computer program for bibliometric mapping

    PubMed Central

    Waltman, Ludo

    2009-01-01

    We present VOSviewer, a freely available computer program that we have developed for constructing and viewing bibliometric maps. Unlike most computer programs that are used for bibliometric mapping, VOSviewer pays special attention to the graphical representation of bibliometric maps. The functionality of VOSviewer is especially useful for displaying large bibliometric maps in an easy-to-interpret way. The paper consists of three parts. In the first part, an overview of VOSviewer’s functionality for displaying bibliometric maps is provided. In the second part, the technical implementation of specific parts of the program is discussed. Finally, in the third part, VOSviewer’s ability to handle large maps is demonstrated by using the program to construct and display a co-citation map of 5,000 major scientific journals. PMID:20585380
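    The co-citation counting that underlies a journal map like the one described can be sketched in a few lines; the toy documents and journal names below are invented for illustration, and VOSviewer's own layout and clustering algorithms are not shown:

```python
from itertools import combinations
from collections import Counter

def cocitation_counts(citing_refs):
    """Co-citation frequency: two items are co-cited once for every
    document whose reference list contains both of them."""
    counts = Counter()
    for refs in citing_refs.values():
        for pair in combinations(sorted(refs), 2):
            counts[pair] += 1
    return counts

# Three citing documents and their reference lists (toy data):
docs = {
    "doc1": {"JourA", "JourB", "JourC"},
    "doc2": {"JourA", "JourB"},
    "doc3": {"JourB", "JourC"},
}
links = cocitation_counts(docs)
```

    A bibliometric mapping program then places strongly co-cited items close together on the map.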

  11. General-purpose computation with neural networks: a survey of complexity theoretic results.

    PubMed

    Síma, Jirí; Orponen, Pekka

    2003-12-01

    We survey and summarize the literature on the computational aspects of neural network models by presenting a detailed taxonomy of the various models according to their complexity theoretic characteristics. The criteria of classification include the architecture of the network (feedforward versus recurrent), time model (discrete versus continuous), state type (binary versus analog), weight constraints (symmetric versus asymmetric), network size (finite nets versus infinite families), and computation type (deterministic versus probabilistic), among others. The underlying results concerning the computational power and complexity issues of perceptron, radial basis function, winner-take-all, and spiking neural networks are briefly surveyed, with pointers to the relevant literature. In our survey, we focus mainly on digital computation, whose inputs and outputs are binary in nature, although their values are quite often encoded as analog neuron states. We omit the important learning issues.

  12. Computers in Medicine: A survey of decision aids for clinicians

    PubMed Central

    Young, D W

    1982-01-01

    Inconsistency in applying medical knowledge is a major reason for varying standards of medical care. Five types of aid have been introduced into medicine to help decision-making: questionnaires, algorithms, database systems, diagnostic systems, and, finally, computer-based decision-support systems. Of these, the most effective act as reminder or prompt systems to assist doctors without threatening their clinical freedom. PMID:6812701

  13. A Survey of Computational Intelligence Techniques in Protein Function Prediction

    PubMed Central

    Tiwari, Arvind Kumar; Srivastava, Rajeev

    2014-01-01

    In the recent past, there was massive growth in knowledge of proteins of unknown function with the advancement of high-throughput microarray technologies. Protein function prediction is among the most challenging problems in bioinformatics. Previously, homology-based approaches were used to predict protein function, but they fail when a new protein is dissimilar to previously characterized ones. Therefore, to alleviate the problems associated with traditional homology-based approaches, numerous computational intelligence techniques have been proposed. This paper presents a state-of-the-art comprehensive review of various computational intelligence techniques for protein function prediction using sequence, structure, protein-protein interaction network, and gene expression data, applied in areas such as prediction of DNA and RNA binding sites, subcellular localization, enzyme functions, signal peptides, catalytic residues, nuclear/G-protein coupled receptors, membrane proteins, and pathway analysis from gene expression datasets. This paper also summarizes the results obtained by many researchers who have addressed these problems using computational intelligence techniques with appropriate datasets to improve prediction performance. The summary shows that ensemble classifiers and the integration of multiple heterogeneous data are useful for protein function prediction. PMID:25574395
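    The ensemble idea highlighted in the summary can be illustrated with a minimal majority-vote combiner; the three toy "classifiers" and the feature names below are invented for illustration and do not correspond to any method in the review:

```python
from collections import Counter

def majority_vote(classifiers, x):
    """Combine independent predictors by simple majority vote."""
    votes = Counter(clf(x) for clf in classifiers)
    return votes.most_common(1)[0][0]

# Toy predictors based on sequence, structure, and interaction-network
# features, each mapping a feature dict to a protein class:
clf_seq = lambda x: "enzyme" if x["motif"] else "non-enzyme"
clf_struct = lambda x: "enzyme" if x["pocket"] else "non-enzyme"
clf_ppi = lambda x: "enzyme" if x["degree"] > 10 else "non-enzyme"

protein = {"motif": True, "pocket": False, "degree": 25}
label = majority_vote([clf_seq, clf_struct, clf_ppi], protein)
```

    Real ensembles combine trained models (e.g., boosted trees or neural networks) over heterogeneous data sources, but the voting principle is the same.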

  14. Children's Experiences of Completing a Computer-Based Violence Survey: Finnish Child Victim Survey Revisited.

    PubMed

    Fagerlund, Monica; Ellonen, Noora

    2016-07-01

    The involvement of children as research subjects requires special considerations with regard to research practices and ethics. This is especially true concerning sensitive research topics such as sexual victimization. Prior research suggests that reflecting these experiences in a survey can cause negative feelings in child participants, although posing only a minimal to moderate risk. Analyzing only predefined, often negative feelings related to answering a sexual victimization survey has dominated the existing literature. In this article children's free-text comments about answering a victimization survey and experiences of sexual victimization are analyzed together to evaluate the effects of research participation in relation to this sensitive issue. Altogether 11,364 children, aged 11-12 and 15-16, participated in the Finnish Child Victim Survey in 2013. Of these, 69% (7,852) reflected on their feelings about answering the survey. Results indicate that both clearly negative and positive feelings are more prevalent among victimized children compared to their nonvictimized peers. Characteristics unique to sexual victimization as well as differences related to gender and age are also discussed. The study contributes to the important yet contradictory field of studying the effects of research participation on children. PMID:27472509

  15. A survey on computer aided diagnosis for ocular diseases

    PubMed Central

    2014-01-01

    Background Computer Aided Diagnosis (CAD), which can automate the detection process for ocular diseases, has attracted extensive attention from clinicians and researchers alike. It not only alleviates the burden on the clinicians by providing objective opinion with valuable insights, but also offers early detection and easy access for patients. Method We review ocular CAD methodologies for various data types. For each data type, we investigate the databases and the algorithms to detect different ocular diseases. Their advantages and shortcomings are analyzed and discussed. Result We have studied three types of data (i.e., clinical, genetic and imaging) that have been commonly used in existing methods for CAD. The recent developments in methods used in CAD of ocular diseases (such as Diabetic Retinopathy, Glaucoma, Age-related Macular Degeneration and Pathological Myopia) are investigated and summarized comprehensively. Conclusion While CAD for ocular diseases has shown considerable progress over the past years, the clinical importance of fully automatic CAD systems which are able to embed clinical knowledge and integrate heterogeneous data sources still show great potential for future breakthrough. PMID:25175552

  16. Optimizing the Advanced Ceramic Material (ACM) for Diesel Particulate Filter Applications

    SciTech Connect

    Dillon, Heather E.; Stewart, Mark L.; Maupin, Gary D.; Gallant, Thomas R.; Li, Cheng; Mao, Frank H.; Pyzik, Aleksander J.; Ramanathan, Ravi

    2006-10-02

    This paper describes the application of pore-scale filtration simulations to the ‘Advanced Ceramic Material’ (ACM) developed by Dow Automotive for use in advanced diesel particulate filters. The application required the generation of a three dimensional substrate geometry to provide the boundary conditions for the flow model. An innovative stochastic modeling technique was applied matching chord length distribution and the porosity profile of the material. Additional experimental validation was provided by the single channel experimental apparatus. Results show that the stochastic reconstruction techniques provide flexibility and appropriate accuracy for the modeling efforts. Early optimization efforts imply that needle length may provide a mechanism for adjusting performance of the ACM for DPF applications. New techniques have been developed to visualize soot deposition in both traditional and new DPF substrate materials. Loading experiments have been conducted on a variety of single channel DPF substrates to develop a deeper understanding of soot penetration, soot deposition characteristics, and to confirm modeling results.
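    The stochastic reconstruction idea can be sketched in its simplest form, where each voxel is drawn independently so that the expected porosity matches a prescribed profile; the real technique also matches the chord length distribution, which this toy version omits, and all dimensions below are invented:

```python
import random

def reconstruct(nx, ny, nz, porosity_profile, seed=0):
    """Random binary voxel volume (True = pore, False = solid) whose
    expected porosity in layer z equals porosity_profile[z]."""
    rng = random.Random(seed)
    return [[[rng.random() < porosity_profile[z] for x in range(nx)]
             for y in range(ny)]
            for z in range(nz)]

profile = [0.3, 0.5, 0.7]                  # target porosity per layer
vol = reconstruct(64, 64, 3, profile)
for z, target in enumerate(profile):
    measured = sum(sum(row) for row in vol[z]) / (64 * 64)
    assert abs(measured - target) < 0.05    # realized vs. target porosity
```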

  17. Use of a new microporous insulation in a sub car at Acme Steel

    SciTech Connect

    Harvey, H.; Gamble, F.C.; MacKenzie, I.B.

    1996-12-31

Acme Steel Co. is a small integrated steel company headquartered in Riverdale, IL, with its blast furnace and coke plant operations located in the city of Chicago. Rail transportation between the two plants is by Conrail, with two crews assigned exclusively to Acme. The torpedo cars used for this service are specially reinforced, with 36 in. wheels and additional braking capability for safety on public rail tracks. Over a seven-month period, microporous insulating panels 0.28 in. thick in the No. 49 sub ladle saved an average of 24 degrees in iron temperature on arrival at the BOF, compared to the average for the rest of the fleet. The microporous insulation replaced 0.25 in. of compressed fiber panel.

  18. Cloud computing for energy management in smart grid - an application survey

    NASA Astrophysics Data System (ADS)

    Naveen, P.; Kiing Ing, Wong; Kobina Danquah, Michael; Sidhu, Amandeep S.; Abu-Siada, Ahmed

    2016-03-01

The smart grid is an emerging energy system in which information technology tools and techniques are applied to make the grid run more efficiently. It possesses demand response capability to help balance electrical consumption with supply. The challenges and opportunities of emerging and future smart grids can be addressed by cloud computing. To address these requirements, we provide an in-depth survey of cloud computing applications for energy management in the smart grid architecture. In this survey, we present an outline of the current state of research on smart grid development. We also propose a model of cloud-based economic power dispatch for smart grid.
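The abstract does not detail the proposed cloud-based dispatch model; as a hedged illustration, the classical equal-incremental-cost economic dispatch that such models typically build on can be sketched as follows, assuming quadratic generator costs C_i(P) = a_i*P^2 + b_i*P and ignoring generator limits (all names and numbers are illustrative):

```python
import numpy as np

def economic_dispatch(a, b, demand):
    """Unconstrained equal-incremental-cost dispatch for quadratic costs
    C_i(P) = a_i*P^2 + b_i*P.  Returns (unit outputs, marginal cost lambda)."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    inv = 1.0 / (2.0 * a)
    # At the optimum every unit runs at the same marginal cost lambda,
    # and outputs P_i = (lambda - b_i) / (2*a_i) must sum to the demand.
    lam = (demand + np.sum(b * inv)) / np.sum(inv)
    p = (lam - b) * inv
    return p, lam

p, lam = economic_dispatch(a=[0.01, 0.02], b=[2.0, 1.0], demand=150.0)
print(p, lam)  # the two outputs sum to the 150 MW demand
```

A cloud deployment would run this kind of optimization centrally over telemetered generator data; the dispatch mathematics itself is unchanged.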

  19. On the modeling of a single-stage, entrained-flow gasifier using Aspen Custom Modeler (ACM)

    SciTech Connect

    Kasule, J.; Turton, R.; Bhattacharyya, D.; Zitney, S.

    2010-01-01

Coal-fired gasifiers are the centerpiece of integrated gasification combined cycle (IGCC) power plants. The gasifier produces synthesis gas that is subsequently converted into electricity through combustion in a gas turbine. Several mathematical models have been developed to study the physical and chemical processes taking place inside the gasifier. Such models range from simple one-dimensional (1D) steady-state models to sophisticated dynamic 3D computational fluid dynamics (CFD) models that incorporate turbulence effects in the reactor. The practical operation of the gasifier is dynamic in nature, but most 1D and some higher-dimensional models are often steady state. On the other hand, many higher-order CFD-based models are dynamic in nature, but are too computationally expensive to be used directly in operability and controllability dynamic studies. They are also difficult to incorporate in the framework of process simulation software such as Aspen Plus Dynamics. Thus lower-dimensional dynamic models are still useful in these types of studies. In the current study, a 1D dynamic model for a single-stage, downward-firing, entrained-flow GE-type gasifier is developed using Aspen Custom Modeler® (ACM), which is a commercial equation-based simulator for creating, editing, and re-using models of process units. The gasifier model is based on mass, momentum, and energy balances for the solid and gas phases. The physical and chemical reactions considered in the model are drying, devolatilization/pyrolysis, gasification, combustion, and the homogeneous gas phase reactions. The dynamic gasifier model is being developed for use in a plant-wide dynamic model of an IGCC power plant. For dynamic simulation, the resulting highly nonlinear system of partial differential algebraic equations (PDAE) is solved in ACM using the well-known Method of Lines (MoL) approach. The MoL discretizes the space domain and leaves the time domain continuous, thereby converting the PDAE to a system of differential algebraic equations in time.
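As a hedged illustration of the Method of Lines (using a simple 1-D heat equation rather than the gasifier PDAE), space is discretized with central differences while time is left continuous, and the resulting ODE system is then stepped forward by any standard time integrator, here a plain forward-Euler loop:

```python
import numpy as np

# Method of Lines for the 1-D heat equation u_t = alpha * u_xx with
# u(0) = u(1) = 0: discretize space (second-order central differences),
# leave time continuous, then integrate the resulting ODE system.
alpha, n = 0.01, 51
x = np.linspace(0.0, 1.0, n)
dx = x[1] - x[0]
dt = 0.01                      # stable: dt <= dx**2 / (2*alpha) = 0.02

u = np.sin(np.pi * x)          # initial temperature profile
for _ in range(500):           # integrate to t = 5
    lap = (u[2:] - 2.0 * u[1:-1] + u[:-2]) / dx**2
    u[1:-1] += dt * alpha * lap    # boundary nodes stay fixed at zero

print(u.max())  # analytic solution decays by exp(-alpha*pi**2*t) ~ 0.61
```

In an equation-based tool like ACM, the same space-discretized balances would be handed to an implicit DAE integrator rather than hand-written Euler stepping.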

  20. USL NASA/RECON project presentations at the 1985 ACM Computer Science Conference: Abstracts and visuals

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor); Chum, Frank Y.; Gallagher, Suzy; Granier, Martin; Hall, Philip P.; Moreau, Dennis R.; Triantafyllopoulos, Spiros

    1985-01-01

    This Working Paper Series entry represents the abstracts and visuals associated with presentations delivered by six USL NASA/RECON research team members at the above named conference. The presentations highlight various aspects of NASA contract activities pursued by the participants as they relate to individual research projects. The titles of the six presentations are as follows: (1) The Specification and Design of a Distributed Workstation; (2) An Innovative, Multidisciplinary Educational Program in Interactive Information Storage and Retrieval; (3) Critical Comparative Analysis of the Major Commercial IS and R Systems; (4) Design Criteria for a PC-Based Common User Interface to Remote Information Systems; (5) The Design of an Object-Oriented Graphics Interface; and (6) Knowledge-Based Information Retrieval: Techniques and Applications.

  1. National Survey of Internet Usage: Teachers, Computer Coordinators, and School Librarians, Grades 3-12.

    ERIC Educational Resources Information Center

    Market Data Retrieval, Inc., Shelton, CT.

    A study was conducted to assess the number and type of schools and educators who use the Internet/World Wide Web. The national survey was conducted in November and December of 1996, and included 6,000 teachers, computer coordinators, and school librarians currently working in grades 3-5, 6-8, and 9-12. At the elementary level, classroom teachers…

  2. Children's Experiences of Completing a Computer-Based Violence Survey: Ethical Implications

    ERIC Educational Resources Information Center

    Ellonen, Noora; Poso, Tarja

    2011-01-01

    This article aims to contribute to the discussion about the ethics of research on children when studying sensitive issues such as violence. The empirical analysis is based on the accounts given by children (11 377) who completed a computer-based questionnaire about their experiences of violence ("The Finnish Child Victim Survey 2008") and their…

  3. Developing a Computer Information Systems Curriculum Based on an Industry Needs Survey.

    ERIC Educational Resources Information Center

    Ghafarian, Ahmad; Sisk, Kathy A.

    This paper details experiences in developing an undergraduate Computer Information Systems (CIS) curriculum at a small liberal arts school. The development of the program was based on the study of needs assessment. Findings were based on the analysis of four sources of data: the results of an industry needs survey, data from a needs assessment…

  4. Technology survey of computer software as applicable to the MIUS project

    NASA Technical Reports Server (NTRS)

    Fulbright, B. E.

    1975-01-01

    Existing computer software, available from either governmental or private sources, applicable to modular integrated utility system program simulation is surveyed. Several programs and subprograms are described to provide a consolidated reference, and a bibliography is included. The report covers the two broad areas of design simulation and system simulation.

  5. Survey of ANL organization plans for word processors, personal computers, workstations, and associated software. Revision 4

    SciTech Connect

    Fenske, K.R.; Rockwell, V.S.

    1992-08-01

    The Computing and Telecommunications Division (CTD) has compiled this Survey of ANL Organization plans for Word Processors, Personal Computers, Workstations, and Associated Software (ANL/TM, Revision 4) to provide DOE and Argonne with a record of recent growth in the acquisition and use of personal computers, microcomputers, and word processors at ANL. Laboratory planners, service providers, and people involved in office automation may find the Survey useful. It is for internal use only, and any unauthorized use is prohibited. Readers of the Survey should use it as a reference document that (1) documents the plans of each organization for office automation, (2) identifies appropriate planners and other contact people in those organizations and (3) encourages the sharing of this information among those people making plans for organizations and decisions about office automation. The Survey supplements information in both the ANL Statement of Site Strategy for Computing Workstations (ANL/TM 458) and the ANL Site Response for the DOE Information Technology Resources Long-Range Plan (ANL/TM 466).

  6. Survey of ANL organization plans for word processors, personal computers, workstations, and associated software

    SciTech Connect

    Fenske, K.R.; Rockwell, V.S.

    1992-08-01

    The Computing and Telecommunications Division (CTD) has compiled this Survey of ANL Organization plans for Word Processors, Personal Computers, Workstations, and Associated Software (ANL/TM, Revision 4) to provide DOE and Argonne with a record of recent growth in the acquisition and use of personal computers, microcomputers, and word processors at ANL. Laboratory planners, service providers, and people involved in office automation may find the Survey useful. It is for internal use only, and any unauthorized use is prohibited. Readers of the Survey should use it as a reference document that (1) documents the plans of each organization for office automation, (2) identifies appropriate planners and other contact people in those organizations and (3) encourages the sharing of this information among those people making plans for organizations and decisions about office automation. The Survey supplements information in both the ANL Statement of Site Strategy for Computing Workstations (ANL/TM 458) and the ANL Site Response for the DOE Information Technology Resources Long-Range Plan (ANL/TM 466).

  7. On the Integration of Computer Algebra Systems (CAS) by Canadian Mathematicians: Results of a National Survey

    ERIC Educational Resources Information Center

    Buteau, Chantal; Jarvis, Daniel H.; Lavicza, Zsolt

    2014-01-01

    In this article, we outline the findings of a Canadian survey study (N = 302) that focused on the extent of computer algebra systems (CAS)-based technology use in postsecondary mathematics instruction. Results suggest that a considerable number of Canadian mathematicians use CAS in research and teaching. CAS use in research was found to be the…

  8. A Survey of Techniques for Modeling and Improving Reliability of Computing Systems

    DOE PAGES

    Mittal, Sparsh; Vetter, Jeffrey S.

    2015-04-24

Recent trends of aggressive technology scaling have greatly exacerbated the occurrences and impact of faults in computing systems. This has made `reliability' a first-order design constraint. To address the challenges of reliability, several techniques have been proposed. In this study, we provide a survey of architectural techniques for improving resilience of computing systems. We especially focus on techniques proposed for microarchitectural components, such as processor registers, functional units, cache and main memory etc. In addition, we discuss techniques proposed for non-volatile memory, GPUs and 3D-stacked processors. To underscore the similarities and differences of the techniques, we classify them based on their key characteristics. We also review the metrics proposed to quantify vulnerability of processor structures. Finally, we believe that this survey will help researchers, system-architects and processor designers in gaining insights into the techniques for improving reliability of computing systems.

  9. A Survey of Techniques for Modeling and Improving Reliability of Computing Systems

    SciTech Connect

    Mittal, Sparsh; Vetter, Jeffrey S.

    2015-04-24

    Recent trends of aggressive technology scaling have greatly exacerbated the occurrences and impact of faults in computing systems. This has made `reliability' a first-order design constraint. To address the challenges of reliability, several techniques have been proposed. In this study, we provide a survey of architectural techniques for improving resilience of computing systems. We especially focus on techniques proposed for microarchitectural components, such as processor registers, functional units, cache and main memory etc. In addition, we discuss techniques proposed for non-volatile memory, GPUs and 3D-stacked processors. To underscore the similarities and differences of the techniques, we classify them based on their key characteristics. We also review the metrics proposed to quantify vulnerability of processor structures. Finally, we believe that this survey will help researchers, system-architects and processor designers in gaining insights into the techniques for improving reliability of computing systems.

  10. Issues in nursing: strategies for an Internet-based, computer-assisted telephone survey.

    PubMed

    Piamjariyakul, Ubolrat; Bott, Marjorie J; Taunton, Roma Lee

    2006-08-01

The study describes the design and implementation of an Internet-based, computer-assisted telephone survey about the care-planning process in 107 long-term care facilities in the Midwest. Two structured telephone surveys were developed to interview the care planning coordinators and their team members. Questionmark Perception Software Version 3 was used to develop the surveys in a wide range of formats. The responses were drawn into a database that was exported to a spreadsheet format and converted to a statistical format by the Information Technology team. Security of the database was protected. Training sessions were provided to project staff. The interviews were tape-recorded for quality checks. Inter-rater agreement was between 95% and 100%. Investigators should consider using Internet-based survey tools, especially for multisite studies that allow access to larger samples at less cost. Exploring multiple software systems for the best fit to the study requirements is essential.

  11. Similarity computation strategies in the microRNA-disease network: a survey.

    PubMed

    Zou, Quan; Li, Jinjin; Song, Li; Zeng, Xiangxiang; Wang, Guohua

    2016-01-01

Various microRNAs have been demonstrated to play roles in a number of human diseases. Several microRNA-disease network reconstruction methods have been used to describe the associations from a systems biology perspective. The key problem for the network is the similarity computation model. In this article, we review the main similarity computation methods and discuss them along with future work. This survey may prompt and guide systems biology and bioinformatics researchers to build more complete microRNA-disease association networks, and may help make the network relationships clear for medical researchers.
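One similarity computation model that is widely used in this literature is the Gaussian interaction profile (GIP) kernel, which scores two microRNAs by comparing their rows in a binary association matrix. A minimal sketch, assuming an illustrative microRNA x disease matrix (this is not the survey's own code, just one common choice of model):

```python
import numpy as np

def gip_similarity(assoc):
    """Gaussian interaction profile (GIP) kernel similarity between the
    rows of a binary association matrix (e.g. microRNA x disease)."""
    assoc = np.asarray(assoc, float)
    # Bandwidth normalized by the mean squared profile norm.
    gamma = assoc.shape[0] / np.sum(assoc * assoc)
    sq_norms = np.sum(assoc**2, axis=1)
    # ||x_i - x_j||^2 = |x_i|^2 + |x_j|^2 - 2 x_i . x_j
    d2 = sq_norms[:, None] + sq_norms[None, :] - 2.0 * assoc @ assoc.T
    return np.exp(-gamma * d2)

# Toy example: 3 microRNAs x 4 diseases.
A = np.array([[1, 0, 1, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 0]])
S = gip_similarity(A)
print(S)  # identical profiles (rows 0 and 1) score exactly 1
```

Other families of methods covered by such surveys (e.g. functional similarity from disease ontologies) replace this kernel, but the downstream network construction consumes the same kind of similarity matrix.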

  12. TWO NOVEL ACM (ACTIVE CONTOUR MODEL) METHODS FOR INTRAVASCULAR ULTRASOUND IMAGE SEGMENTATION

    SciTech Connect

    Chen, Chi Hau; Potdat, Labhesh; Chittineni, Rakesh

    2010-02-22

    One of the attractive image segmentation methods is the Active Contour Model (ACM) which has been widely used in medical imaging as it always produces sub-regions with continuous boundaries. Intravascular ultrasound (IVUS) is a catheter based medical imaging technique which is used for quantitative assessment of atherosclerotic disease. Two methods of ACM realizations are presented in this paper. The gradient descent flow based on minimizing energy functional can be used for segmentation of IVUS images. However this local operation alone may not be adequate to work with the complex IVUS images. The first method presented consists of basically combining the local geodesic active contours and global region-based active contours. The advantage of combining the local and global operations is to allow curves deforming under the energy to find only significant local minima and delineate object borders despite noise, poor edge information and heterogeneous intensity profiles. Results for this algorithm are compared to standard techniques to demonstrate the method's robustness and accuracy. In the second method, the energy function is appropriately modified and minimized using a Hopfield neural network. Proper modifications in the definition of the bias of the neurons have been introduced to incorporate image characteristics. The method overcomes distortions in the expected image pattern, such as due to the presence of calcium, and employs a specialized structure of the neural network and boundary correction schemes which are based on a priori knowledge about the vessel geometry. The presented method is very fast and has been evaluated using sequences of IVUS frames.
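The gradient-descent flow minimizing a contour's energy functional, mentioned above, can be sketched in minimal form. This uses only the internal (smoothness) terms of a classic closed snake with a semi-implicit update; the parameters and setup are illustrative, not from the paper:

```python
import numpy as np

def snake_step_matrix(n, alpha=0.1, beta=0.05, dt=1.0):
    """Semi-implicit update matrix for a closed active contour whose
    internal energy is alpha*|x'|^2 + beta*|x''|^2 (circulant stencils)."""
    I = np.eye(n)
    D2 = np.roll(I, -1, 0) - 2 * I + np.roll(I, 1, 0)  # second difference
    D4 = D2 @ D2                                        # fourth difference
    A = -alpha * D2 + beta * D4          # positive semidefinite stiffness
    # Gradient descent dx/dt = -A x, stepped implicitly for stability:
    # x_{t+1} = (I + dt*A)^{-1} x_t
    return np.linalg.inv(I + dt * A)

# With no image force, the internal energy alone shrinks a circle
# smoothly toward its centroid.
n = 40
theta = np.linspace(0, 2 * np.pi, n, endpoint=False)
xy = np.stack([np.cos(theta), np.sin(theta)], axis=1)
M = snake_step_matrix(n)
for _ in range(100):
    xy = M @ xy
print(np.linalg.norm(xy, axis=1).mean())  # mean radius has shrunk below 1
```

In an IVUS segmentation, an external image-force term (edge or region based, as in the two methods above) would be added to the update so the contour stops on vessel borders instead of collapsing.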

  13. Superfund Record of Decision (EPA Region 5): Acme Solvents, Morristown, Illinois, September 1985. Final report

    SciTech Connect

    Not Available

    1985-09-27

    The Acme Solvents Reclaiming, Inc. facility is located approximately five miles south of Rockford, Illinois. From 1960 until 1973, the facility served as a disposal site for paints, oils and still bottoms from the solvent reclamation plant located in Rockford. In addition, empty drums were stored onsite. Wastes were dumped into depressions created from either previous quarrying activities or by scraping overburden from the near surface bedrock to form berms. In September 1972, the Illinois Pollution Control Board (IPCB) ordered Acme to remove all drums and wastes from the facility and to backfill the lagoons. Follow-up inspections revealed that wastes and crushed drums were being left onsite and merely covered with soil. Sampling of the site revealed high concentrations of chlorinated organics in the drinking water. The major source of hazardous substances at the facility are the waste disposal mounds. These mounds contain volatile and semi-volatile organic compounds and concentrations of PCBs up to several hundred mg/kg. The selected remedial action is included.

  14. AIHA position statement on the removal of asbestos-containing materials (ACM) from buildings

    SciTech Connect

    Not Available

    1991-06-01

The health risks associated with asbestos exposure for building occupants have been demonstrated to be very low. The decision to remove asbestos-containing materials (ACM) in undamaged, intact condition that are not readily accessible to occupants should be made only after assessing all other options. Both technical and financial issues should be fully explored by a team of trained specialists, including industrial hygienists, architects, and engineers. The optimal solution will vary from building to building, based on factors unique to each situation. One important consideration is the use of a well-designed air-monitoring program to identify changes in airborne levels of asbestos. Special training and maintenance programs are needed to ensure the safety and health of building and contract workers who may encounter asbestos or who may disturb it during routine or nonroutine activities. Each building owner who has ACM in a building should identify an in-house asbestos manager, and it is also necessary to provide appropriate resources, including professional consultants, to develop and manage a responsible and effective in-place management program throughout the life of a building containing asbestos.

  15. Patient grouping for dose surveys and establishment of diagnostic reference levels in paediatric computed tomography.

    PubMed

    Vassileva, J; Rehani, M

    2015-07-01

    There has been confusion in literature on whether paediatric patients should be grouped according to age, weight or other parameters when dealing with dose surveys. The present work aims to suggest a pragmatic approach to achieve reasonable accuracy for performing patient dose surveys in countries with limited resources. The analysis is based on a subset of data collected within the IAEA survey of paediatric computed tomography (CT) doses, involving 82 CT facilities from 32 countries in Asia, Europe, Africa and Latin America. Data for 6115 patients were collected, in 34.5 % of which data for weight were available. The present study suggests that using four age groups, <1, >1-5, >5-10 and >10-15 y, is realistic and pragmatic for dose surveys in less resourced countries and for the establishment of DRLs. To ensure relevant accuracy of results, data for >30 patients in a particular age group should be collected if patient weight is not known. If a smaller sample is used, patient weight should be recorded and the median weight in the sample should be within 5-10 % from the median weight of the sample for which the DRLs were established. Comparison of results from different surveys should always be performed with caution, taking into consideration the way of grouping of paediatric patients. Dose results can be corrected for differences in patient weight/age group.
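The grouping and sampling rules described above can be sketched as a small helper (the age bands and thresholds come from the abstract; the 10% weight tolerance is the upper end of the quoted 5-10% range, and the function names are illustrative, not from the paper):

```python
def age_group(age_years):
    """Assign a paediatric CT patient to one of the four survey age bands."""
    if age_years < 1:
        return "<1"
    if age_years <= 5:
        return "1-5"
    if age_years <= 10:
        return "5-10"
    return "10-15"

def sample_ok(n_patients, median_weight=None, reference_median=None):
    """A sample supports a DRL comparison if it has more than 30 patients
    in the age group, or, for a smaller sample, if its median weight is
    within 10% of the reference sample's median weight."""
    if n_patients > 30:
        return True
    if median_weight is None or reference_median is None:
        return False
    return abs(median_weight - reference_median) / reference_median <= 0.10

print(age_group(0.5), age_group(7))                              # <1 5-10
print(sample_ok(12, median_weight=21.0, reference_median=20.0))  # True
```

As the abstract stresses, comparisons between surveys that group patients differently still need care even when both samples pass such checks.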

  16. Patient grouping for dose surveys and establishment of diagnostic reference levels in paediatric computed tomography.

    PubMed

    Vassileva, J; Rehani, M

    2015-07-01

    There has been confusion in literature on whether paediatric patients should be grouped according to age, weight or other parameters when dealing with dose surveys. The present work aims to suggest a pragmatic approach to achieve reasonable accuracy for performing patient dose surveys in countries with limited resources. The analysis is based on a subset of data collected within the IAEA survey of paediatric computed tomography (CT) doses, involving 82 CT facilities from 32 countries in Asia, Europe, Africa and Latin America. Data for 6115 patients were collected, in 34.5 % of which data for weight were available. The present study suggests that using four age groups, <1, >1-5, >5-10 and >10-15 y, is realistic and pragmatic for dose surveys in less resourced countries and for the establishment of DRLs. To ensure relevant accuracy of results, data for >30 patients in a particular age group should be collected if patient weight is not known. If a smaller sample is used, patient weight should be recorded and the median weight in the sample should be within 5-10 % from the median weight of the sample for which the DRLs were established. Comparison of results from different surveys should always be performed with caution, taking into consideration the way of grouping of paediatric patients. Dose results can be corrected for differences in patient weight/age group. PMID:25836695

  17. Paleoenvironmental conditions for the development of calcareous nannofossil acme during the late Miocene in the eastern equatorial Pacific

    NASA Astrophysics Data System (ADS)

    Beltran, Catherine; Rousselle, Gabrielle; Backman, Jan; Wade, Bridget S.; Sicre, Marie Alexandrine

    2014-03-01

Repeated monospecific coccolithophore dominance intervals (acmes) of specimens belonging to the Noelaerhabdaceae family—including the genus Reticulofenestra and modern descendants Emiliania and Gephyrocapsa—occurred during the Neogene. Such an acme was recognized during the late Miocene (~ 8.6 Ma), at a time of a major reorganization of nannofossil assemblages resulting in a worldwide temporary disappearance of larger forms of the genus Reticulofenestra (R. pseudoumbilicus) and the gradual recovery and dominance of its smaller forms (< 5 µm). In this study, we present a multiproxy investigation of late Miocene sediments from the east equatorial Pacific Integrated Ocean Drilling Program Site U1338, where small reticulofenestrid-type placoliths with a closed central area—known as small Dictyococcites spp. (< 3 µm)—formed an acme. We report on oxygen and carbon stable isotope records of multispecies planktic calcite and alkenone-derived sea surface temperature. Our data indicate that, during this 100 kyr long acme, the east equatorial Pacific thermocline remained deep and stable. Local surface stratification state fails to explain this acme and thus contradicts the model-based hypothesis of a Southern Ocean high-latitude nutrient control of the surface waters in the east equatorial Pacific. Instead, our findings suggest that external forcing such as an extended period of low eccentricity may have created favorable conditions for the small Dictyococcites spp. growth.

  18. Opportunities and Needs for Mobile-Computing Technology to Support U.S. Geological Survey Fieldwork

    USGS Publications Warehouse

    Wood, Nathan J.; Halsing, David L.

    2006-01-01

    To assess the opportunities and needs for mobile-computing technology at the U.S. Geological Survey (USGS), we conducted an internal, Internet-based survey of bureau scientists whose research includes fieldwork. In summer 2005, 144 survey participants answered 65 questions about fieldwork activities and conditions, technology to support field research, and postfieldwork data processing and analysis. Results suggest that some types of mobile-computing technology are already commonplace, such as digital cameras and Global Positioning System (GPS) receivers, whereas others are not, such as personal digital assistants (PDAs) and tablet-based personal computers (tablet PCs). The potential for PDA use in the USGS is high: 97 percent of respondents record field observations (primarily environmental conditions and water-quality data), and 87 percent take field samples (primarily water-quality data, water samples, and sediment/soil samples). The potential for tablet PC use in the USGS is also high: 59 percent of respondents map environmental features in the field, primarily by sketching in field notebooks, on aerial photographs, or on topographic-map sheets. Results also suggest that efficient mobile-computing-technology solutions could benefit many USGS scientists because most respondents spend at least 1 week per year in the field, conduct field sessions that are least 1 week in duration, have field crews of one to three people, and typically travel on foot about 1 mi from their field vehicles. By allowing researchers to enter data directly into digital databases while in the field, mobile-computing technology could also minimize postfieldwork data processing: 93 percent of respondents enter collected field data into their office computers, and more than 50 percent spend at least 1 week per year on postfieldwork data processing. Reducing postfieldwork data processing could free up additional time for researchers and result in cost savings for the bureau. Generally

  19. Public health assessment for ACME Solvent Reclaiming Incorporated, Winnebago, Winnebago County, Illinois, region 6. Cerclis No. ILD053219259. Final report

    SciTech Connect

    1995-08-11

Acme Solvent Reclaiming, Inc. (ACME), covers approximately 20 acres, 5 miles south of Rockford on Lindenwood Road in Winnebago County. The wastes disposed on-site included paints, oils, still-bottoms, sludges, and non-recoverable solvents. Disposal practices resulted in soils contaminated with numerous inorganic and organic compounds including metals, volatiles, semi-volatiles, and polychlorinated biphenyls (PCBs). In addition to the soil contamination, a contaminant plume migrating south-southwest has been identified in groundwater beneath and around the ACME site. Based on available information, this site is considered to be a public health hazard because of the risk to human health resulting from past, present, and potential future exposure to groundwater contaminated with various inorganic and organic compounds, including metals, volatiles, semi-volatiles, and polychlorinated biphenyls (PCBs), at concentrations that may result in an increased risk of adverse health effects.

  20. Health assessment for Acme Solvents Reclamation, Inc. , Winnebago County, Illinois, Region 5. CERCLIS No. ILD053219259. Final report

    SciTech Connect

    Not Available

    1988-08-01

    The Acme Solvents Reclamation, Inc. (Acme) National Priorities List Site is located in Winnebago County, Illinois. There are volatile organic compounds, base neutral extractable compounds, polychlorinated biphenyls, and several metals present in the soil, sediment, ground water, air, and/or leachate at or around the site. The Record of Decision signed September 1985, mandated several remedial actions which included the provision of interim alternate water, excavation, and incineration of waste and contaminated soil, landfilling of non-incinerable waste in an off-site Resource Conservation and Recovery Act landfill, and continued investigation of the connection between the ground water flow and the bedrock.

  1. AcmB Is an S-Layer-Associated β-N-Acetylglucosaminidase and Functional Autolysin in Lactobacillus acidophilus NCFM

    PubMed Central

    Johnson, Brant R.

    2016-01-01

ABSTRACT Autolysins, also known as peptidoglycan hydrolases, are enzymes that hydrolyze specific bonds within bacterial cell wall peptidoglycan during cell division and daughter cell separation. Within the genome of Lactobacillus acidophilus NCFM, there are 11 genes encoding proteins with peptidoglycan hydrolase catalytic domains, 9 of which are predicted to be functional. Notably, 5 of the 9 putative autolysins in L. acidophilus NCFM are S-layer-associated proteins (SLAPs) noncovalently colocalized along with the surface (S)-layer at the cell surface. One of these SLAPs, AcmB, a β-N-acetylglucosaminidase encoded by the gene lba0176 (acmB), was selected for functional analysis. In silico analysis revealed that acmB orthologs are found exclusively in S-layer-forming species of Lactobacillus. Chromosomal deletion of acmB resulted in aberrant cell division, autolysis, and autoaggregation. Complementation of acmB in the ΔacmB mutant restored the wild-type phenotype, confirming the role of this SLAP in cell division. The absence of AcmB within the exoproteome had a pleiotropic effect on the extracellular proteins covalently and noncovalently bound to the peptidoglycan, which likely led to the observed decrease in the binding capacity of the ΔacmB strain for mucin and extracellular matrices fibronectin, laminin, and collagen in vitro. These data suggest a functional association between the S-layer and the multiple autolysins noncovalently colocalized at the cell surface of L. acidophilus NCFM and other S-layer-producing Lactobacillus species. IMPORTANCE Lactobacillus acidophilus is one of the most widely used probiotic microbes incorporated in many dairy foods and dietary supplements. This organism produces a surface (S)-layer, which is a self-assembling crystalline array found as the outermost layer of the cell wall. The S-layer, along with colocalized associated proteins, is an important mediator of probiotic activity through intestinal adhesion and modulation of

  2. A survey of parametrized variational principles and applications to computational mechanics

    NASA Technical Reports Server (NTRS)

    Felippa, Carlos A.

    1993-01-01

This survey paper describes recent developments in the area of parametrized variational principles (PVP's) and selected applications to finite-element computational mechanics. A PVP is a variational principle containing free parameters that have no effect on the Euler-Lagrange equations. The theory of single-field PVP's based on gauge functions (also known as null Lagrangians) is a subset of the inverse problem of variational calculus that has limited value. On the other hand, multifield PVP's are more interesting from theoretical and practical standpoints. Following a tutorial introduction, the paper describes the recent construction of multifield PVP's in several areas of elasticity and electromagnetics. It then discusses three applications to finite-element computational mechanics: the derivation of high-performance finite elements, the development of element-level error indicators, and the construction of finite element templates. The paper concludes with an overview of open research areas.

  3. Survey results of Internet and computer usage in veterans with epilepsy.

    PubMed

    Pramuka, Michael; Hendrickson, Rick; Van Cott, Anne C

    2010-03-01

    After our study of a self-management intervention for epilepsy, we gathered data on Internet use and computer availability to assess the feasibility of computer-based interventions in a veteran population. Veterans were asked to complete an anonymous questionnaire that gathered information regarding seizures/epilepsy in addition to demographic data, Internet use, computer availability, and interest in distance education regarding epilepsy. Three hundred twenty-four VA neurology clinic patients completed the survey. One hundred twenty-six self-reported a medical diagnosis of epilepsy and constituted the epilepsy/seizure group. For this group of veterans, the need for remote/distance-based interventions was validated given the majority of veterans traveled long distances (>2 hours). Only 51% of the epilepsy/seizure group had access to the Internet, and less than half (42%) expressed an interest in getting information on epilepsy self-management on their computer, suggesting that Web-based interventions may not be an optimal method for a self-management intervention in this population. PMID:20116339

  4. A survey on resource allocation in high performance distributed computing systems

    SciTech Connect

    Hussain, Hameed; Malik, Saif Ur Rehman; Hameed, Abdul; Khan, Samee Ullah; Bickler, Gage; Min-Allah, Nasro; Qureshi, Muhammad Bilal; Zhang, Limin; Yongji, Wang; Ghani, Nasir; Kolodziej, Joanna; Zomaya, Albert Y.; Xu, Cheng-Zhong; Balaji, Pavan; Vishnu, Abhinav; Pinel, Fredric; Pecero, Johnatan E.; Kliazovich, Dzmitry; Bouvry, Pascal; Li, Hongxiang; Wang, Lizhe; Chen, Dan; Rayes, Ammar

    2013-11-01

    Efficient resource allocation is a fundamental requirement in high performance computing (HPC) systems. Many projects dedicated to large-scale distributed computing systems have designed and developed resource allocation mechanisms with a variety of architectures and services. This study reports a comprehensive survey describing resource allocation in various HPC systems. The aim of the work is to aggregate, under a joint framework, the existing solutions for HPC, and to provide a thorough analysis and characterization of the resource management and allocation strategies. Resource allocation mechanisms and strategies play a vital role in the performance improvement of all HPC classifications; a comprehensive discussion of widely used resource allocation strategies deployed in HPC environments is therefore required, which is one of the motivations of this survey. Moreover, we classify HPC systems into three broad categories, namely (a) cluster, (b) grid, and (c) cloud systems, and define the characteristics of each class by extracting sets of common attributes. All of the aforementioned systems are cataloged into pure software and hybrid/hardware solutions. The system classification is used to identify the approaches followed by the implementation of existing resource allocation strategies that are widely presented in the literature.

  5. A State-Wide Survey of South Australian Secondary Schools to Determine the Current Emphasis on Ergonomics and Computer Use

    ERIC Educational Resources Information Center

    Sawyer, Janet; Penman, Joy

    2012-01-01

    This study investigated the pattern of teaching of healthy computing skills to high school students in South Australia. A survey approach was used to collect data, specifically to determine the emphasis placed by schools on ergonomics that relate to computer use. Participating schools were recruited through the Department for Education and Child…

  6. Survey and alignment analysis for the ALS storage ring using computer spreadsheets

    SciTech Connect

    Keller, R.

    1993-07-01

    The general survey and alignment concept for the ALS is based on a network of fixed, three-dimensional monuments installed in the building floor, to which all accelerator component positions are referred. The survey of these monuments is performed separately for horizontal and vertical coordinates, following the scheme imposed by the code PC-GEONET that is used for monument data analysis. For most of the accelerator objects the tasks of data acquisition, bundling, and transformations from observation-station into object coordinate-systems are being handled by the commercial software package ECDS rather than by PC-GEONET. This choice had to be made because no instrument stands are presently available at LBL that can be placed exactly over monuments and are high enough to permit observing the fiducials of installed magnets from above. Theodolites only are used with ECDS as observation instruments, and absolute scaling has to be provided by observing some object of precisely known length. To create ideal data and compute alignment values for all accelerator components, spreadsheets were developed by the author using the application EXCEL for Macintosh computers. Choice of a spreadsheet method rather than conventional programming techniques proved very convenient when in the course of this work the sheets had to be created and progressively modified under severe time pressure to include new effects and help redefine the observation procedures. With spreadsheets, varying input data formats coming from the survey crew could be easily accommodated, and adding numerous consistency checks as well as generating additional ideal data for special alignment tasks was possible with comparatively little effort. Dedicated spreadsheets were created for each of the 12 curved sectors of the storage ring. In this paper, the main features of the spreadsheets are presented, and the alignment results for lattice and corrector magnets are listed and discussed.

  7. Survey on computer aided decision support for diagnosis of celiac disease

    PubMed Central

    Hegenbart, Sebastian; Uhl, Andreas; Vécsei, Andreas

    2015-01-01

    Celiac disease (CD) is a complex autoimmune disorder in genetically predisposed individuals of all age groups, triggered by the ingestion of food containing gluten. A reliable diagnosis is of high interest in view of embarking on a strict gluten-free diet, the CD treatment modality of first choice. The gold standard for diagnosis of CD is currently based on a histological confirmation of serology, using biopsies performed during upper endoscopy. Computer aided decision support is an emerging option in medicine, and in endoscopy in particular. Such systems could potentially save costs and manpower while simultaneously increasing the safety of the procedure. Research on computer-assisted systems for automated diagnosis of CD started in 2008; since then, over 40 publications on the topic have appeared. In this context, data from classical flexible endoscopy as well as wireless capsule endoscopy (WCE) and confocal laser endomicroscopy (CLE) have been used. In this survey paper, we try to give a comprehensive overview of the research focused on computer-assisted diagnosis of CD. PMID:25770906

  8. Improving the Limit on the Electron EDM: Data Acquisition and Systematics Studies in the ACME Experiment

    NASA Astrophysics Data System (ADS)

    Hess, Paul William

    The ACME collaboration has completed a measurement setting a new upper limit on the size of the electron's permanent electric dipole moment (EDM). The existence of the EDM is well motivated by theories extending the standard model of particle physics, with predicted sizes very close to the current experimental limit. The new limit was set by measuring spin precession within the metastable H state of the polar molecule thorium monoxide (ThO). A particular focus here is on the automated data acquisition system developed to search for a precession phase odd under internal and external reversal of the electric field. Automated switching of many different experimental controls allowed a rapid diagnosis of major systematics, including the dominant systematic caused by non-reversing electric fields and laser polarization gradients. Polarimetry measurements made it possible to quantify and minimize the polarization gradients in our state preparation and probe lasers. Three separate measurements were used to determine the electric field that did not reverse when we tried to switch the field direction. The new bound of |d_e| < 8.7 x 10^-29 e·cm is over an order of magnitude smaller than previous limits, and strongly limits T-violating physics at TeV energy scales.

  9. Wireless data collection of self-administered surveys using tablet computers.

    PubMed

    Singleton, Kyle W; Lan, Mars; Arnold, Corey; Vahidi, Mani; Arangua, Lisa; Gelberg, Lillian; Bui, Alex A T

    2011-01-01

    The accurate and expeditious collection of survey data by coordinators in the field is critical in the support of research studies. Early methods that used paper documentation have slowly evolved into electronic capture systems. Indeed, tools such as REDCap and others illustrate this transition. However, many current systems are tailored to web browsers running on desktop/laptop computers and require keyboard and mouse input. We present a system that utilizes a touch screen interface running on a tablet PC with consideration for portability, limited screen space, wireless connectivity, and potentially inexperienced and low literacy users. The system was developed using C#, ASP.net, and SQL Server by multiple programmers over the course of a year. The system was developed in coordination with UCLA Family Medicine and is currently deployed for the collection of data in a group of Los Angeles area clinics of community health centers for a study on drug addiction and intervention. PMID:22195187


  12. Sci—Thur PM: Imaging — 06: Canada's National Computed Tomography (CT) Survey

    SciTech Connect

    Wardlaw, GM; Martel, N; Blackler, W; Asselin, J-F

    2014-08-15

    The value of computed tomography (CT) in medical imaging is reflected in its increased use and availability since the early 1990s; however, given CT's relatively larger exposures (vs. planar x-ray), greater care must be taken to ensure that CT procedures are optimised in terms of providing the smallest dose possible while maintaining sufficient diagnostic image quality. The development of CT Diagnostic Reference Levels (DRLs) supports this process. DRLs have been suggested and supported by international and national bodies since the early 1990s and widely adopted elsewhere, but not on a national basis in Canada. Essentially, CT DRLs provide guidance on what is considered good practice for common CT exams, but a representative sample of CT examination data is required to make any recommendations. Canada's National CT Survey project, in collaboration with provincial/territorial authorities, has collected a large national sample of CT practice data for 7 common examinations (with associated clinical indications) of both adult and pediatric patients. Following completion of data entry into a common database, a survey summary report will be prepared and recommendations on CT DRLs will be made from these data. It is hoped that these can then be used by local regions to promote CT practice optimisation and support dose reduction initiatives.
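
    DRLs are conventionally derived as a percentile of the surveyed dose distribution, most often the 75th percentile; a minimal sketch of that derivation follows. Both the percentile convention and the dose values here are illustrative assumptions, not data from this survey.

    ```python
    def percentile(values, q):
        """Percentile with linear interpolation between closest ranks
        (the same convention as numpy's default)."""
        xs = sorted(values)
        k = (len(xs) - 1) * q / 100.0   # fractional rank of the q-th percentile
        f = int(k)                      # lower neighbouring rank
        c = min(f + 1, len(xs) - 1)     # upper neighbouring rank
        return xs[f] + (xs[c] - xs[f]) * (k - f)

    # hypothetical CTDIvol readings (mGy) for one exam type across scanners
    ctdi_vol = [38, 42, 45, 47, 50, 52, 55, 58, 61, 70]
    drl = percentile(ctdi_vol, 75)      # candidate DRL for this exam
    ```

    Scanners whose typical dose exceeds the DRL are then flagged for protocol review, which is how a survey like this one feeds optimisation.
    
    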

  13. Computer analysis of digital sky surveys using citizen science and manual classification

    NASA Astrophysics Data System (ADS)

    Kuminski, Evan; Shamir, Lior

    2015-01-01

    As current and future digital sky surveys such as SDSS, LSST, DES, Pan-STARRS and Gaia create increasingly massive databases containing millions of galaxies, there is a growing need to be able to efficiently analyze these data. One way to do this is through manual analysis; however, this may be insufficient considering the extremely vast pipelines of astronomical images generated by present and future surveys. Some efforts have been made to use citizen science to classify galaxies by their morphology on a larger scale than individual or small groups of scientists can. While citizen science efforts such as Zooniverse have helped obtain reasonably accurate morphological information about large numbers of galaxies, they cannot scale to provide complete analysis of the billions of galaxy images that will be collected by future ventures such as LSST. Since current forms of manual classification cannot scale to the masses of data collected by digital sky surveys, some form of automated data analysis will be required to keep up with the growing databases, working either independently or in combination with human analysis such as citizen science. Here we describe a computer vision method that can automatically analyze galaxy images and deduce galaxy morphology. Experiments using Galaxy Zoo 2 data show that the performance of the method increases as the degree of agreement between the citizen scientists gets higher, providing a cleaner dataset. For several morphological features, such as the spirality of the galaxy, the algorithm agreed with the citizen scientists on around 95% of the samples. However, the method failed to analyze some morphological features, such as the number of spiral arms, providing an accuracy of just ~36%.

  14. Geologic, geotechnical, and geophysical properties of core from the Acme Fire-Pit-1 drill hole, Sheridan County, Wyoming

    USGS Publications Warehouse

    Collins, Donley S.

    1983-01-01

    A preliminary core study from the Acme Fire-Pit-1 drill hole, Sheridan County, Wyoming, revealed that the upper portion of the core had been baked by a fire confined to the underlying Monarch coal bed. The baked (clinkered) sediment above the Monarch coal bed was determined to have higher point-load strength values (greater than 2 MPa) than the sediment under the burned coal

  15. Segmentation of solid subregion of high grade gliomas in MRI images based on active contour model (ACM)

    NASA Astrophysics Data System (ADS)

    Seow, P.; Win, M. T.; Wong, J. H. D.; Abdullah, N. A.; Ramli, N.

    2016-03-01

    Gliomas are tumours arising from the interstitial tissue of the brain; they are heterogeneous, infiltrative and possess ill-defined borders. Tumour subregions (e.g. the solid enhancing part, edema and necrosis) are often used for tumour characterisation. Tumour demarcation into substructures facilitates glioma staging and provides essential information. Manual segmentation has several drawbacks: it is laborious, time consuming, subject to intra- and inter-rater variability, and hindered by the diversity in the appearance of tumour tissues. In this work, an active contour model (ACM) was used to segment the solid enhancing subregion of the tumour. 2D brain image acquisition data, using a 3T MRI fast spoiled gradient echo sequence post gadolinium, were obtained from four histologically proven high-grade glioma patients. Preprocessing of the images, which includes subtraction and skull stripping, was performed, followed by ACM segmentation. The results of the automatic segmentation method were compared against the manual delineation of the tumour by a trainee radiologist. Both results were further validated by an experienced neuroradiologist, and brief quantitative evaluations (pixel area and difference ratio) were performed. Preliminary results on the clinical data showed the potential of the ACM in the application of fast and large scale tumour segmentation in medical imaging.
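
    The region-competition idea behind such ACM segmentation can be sketched as a piecewise-constant (Chan-Vese-type) iteration; this is a deliberately stripped-down sketch that keeps only the data term and omits the curvature/length regularization of the full model, with a synthetic bright disk standing in for the enhancing lesion:

    ```python
    import numpy as np

    def two_phase_acm(img, n_iter=50):
        """Two-phase piecewise-constant segmentation (Chan-Vese data term
        only; the contour-smoothness term of the full ACM is omitted)."""
        mask = img > img.mean()          # crude initial contour
        for _ in range(n_iter):
            c1 = img[mask].mean()        # mean intensity inside the contour
            c2 = img[~mask].mean()       # mean intensity outside
            # each pixel joins the region whose mean it is closer to
            new = (img - c1) ** 2 < (img - c2) ** 2
            if np.array_equal(new, mask):
                break                    # contour has converged
            mask = new
        return mask

    # synthetic "enhancing lesion": bright disk on a noisy dark background
    yy, xx = np.mgrid[:64, :64]
    truth = (yy - 32) ** 2 + (xx - 32) ** 2 < 10 ** 2
    rng = np.random.default_rng(0)
    img = truth * 1.0 + rng.normal(0, 0.05, truth.shape)

    seg = two_phase_acm(img)
    dice = 2 * (seg & truth).sum() / (seg.sum() + truth.sum())
    ```

    The Dice overlap computed at the end mirrors the kind of quantitative evaluation (area agreement against a reference delineation) described in the abstract.
    
    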

  16. A Survey of Exemplar Teachers' Perceptions, Use, and Access of Computer-Based Games and Technology for Classroom Instruction

    ERIC Educational Resources Information Center

    Proctor, Michael D.; Marks, Yaela

    2013-01-01

    This research reports and analyzes for archival purposes surveyed perceptions, use, and access by 259 United States based exemplar Primary and Secondary educators of computer-based games and technology for classroom instruction. Participating respondents were considered exemplary as they each won the Milken Educator Award during the 1996-2009…

  17. A Comprehensive Survey on the Status of Social and Professional Issues in United States Undergraduate Computer Science Programs and Recommendations

    ERIC Educational Resources Information Center

    Spradling, Carol; Soh, Leen-Kiat; Ansorge, Charles J.

    2009-01-01

    A national web-based survey was administered to 700 undergraduate computer science (CS) programs in the United States as part of a stratified random sample of 797 undergraduate CS programs. The 251 program responses (36% response rate) regarding social and professional issues are presented. This article describes the demographics of the…

  18. Promoting CLT within a Computer Assisted Learning Environment: A Survey of the Communicative English Course of FLTC

    ERIC Educational Resources Information Center

    Haider, Md. Zulfeqar; Chowdhury, Takad Ahmed

    2012-01-01

    This study is based on a survey of the Communicative English Language Certificate (CELC) course run by the Foreign Language Training Center (FLTC), a Project under the Ministry of Education, Bangladesh. FLTC is working to promote the teaching and learning of English through its eleven computer-based and state of the art language laboratories. As…

  19. Do Mathematicians Integrate Computer Algebra Systems in University Teaching? Comparing a Literature Review to an International Survey Study

    ERIC Educational Resources Information Center

    Marshall, Neil; Buteau, Chantal; Jarvis, Daniel H.; Lavicza, Zsolt

    2012-01-01

    We present a comparative study of a literature review of 326 selected contributions (Buteau, Marshall, Jarvis & Lavicza, 2010) to an international (US, UK, Hungary) survey of mathematicians (Lavicza, 2008) regarding the use of Computer Algebra Systems (CAS) in post-secondary mathematics education. The comparison results are organized with respect…

  20. Development and first application of an Aerosol Collection Module (ACM) for quasi online compound specific aerosol measurements

    NASA Astrophysics Data System (ADS)

    Hohaus, Thorsten; Kiendler-Scharr, Astrid; Trimborn, Dagmar; Jayne, John; Wahner, Andreas; Worsnop, Doug

    2010-05-01

    Atmospheric aerosols influence climate and human health on regional and global scales (IPCC, 2007). In many environments organics are a major fraction of the aerosol, influencing its properties. Due to the huge variety of organic compounds present in atmospheric aerosol, current measurement techniques are far from providing a full speciation of organic aerosol (Hallquist et al., 2009). The development of new techniques for compound specific measurements with high time resolution is a timely issue in organic aerosol research. Here we present first laboratory characterisations of an aerosol collection module (ACM) which was developed to allow for the sampling and transfer of atmospheric PM1 aerosol. The system consists of an aerodynamic lens system focussing particles into a beam. This beam is directed onto a surface 3.4 mm in diameter which is cooled to -30 °C with liquid nitrogen. After collection the aerosol sample can be evaporated from the surface by heating it to up to 270 °C. The sample is transferred through a 60 cm long line with a carrier gas. In order to test the ACM for linearity and sensitivity we combined it with a GC-MS system. The tests were performed with octadecane aerosol. The octadecane mass as measured with the ACM-GC-MS was compared against the mass as calculated from the SMPS-derived total volume. The data correlate well (R² = 0.99, slope of linear fit 1.1), indicating 100 % collection efficiency. From 150 °C to 270 °C no effect of desorption temperature on transfer efficiency could be observed. The ACM-GC-MS system was proven to be linear over the mass range 2-100 ng and has a detection limit of ~2 ng. First experiments applying the ACM-GC-MS system were conducted at the Jülich Aerosol Chamber. Secondary organic aerosol (SOA) was formed from ozonolysis of 600 ppbv of β-pinene. The major oxidation product nopinone was detected in the aerosol and could be shown to decrease from 2 % of the total aerosol to 0.5 % of the aerosol over the 48 hours of

  1. ARECIBO PALFA SURVEY AND EINSTEIN@HOME: BINARY PULSAR DISCOVERY BY VOLUNTEER COMPUTING

    SciTech Connect

    Knispel, B.; Allen, B.; Aulbert, C.; Bock, O.; Fehrmann, H.; Lazarus, P.; Bogdanov, S.; Anderson, D.; Bhat, N. D. R.; Brazier, A.; Chatterjee, S.; Cordes, J. M.; Camilo, F.; Crawford, F.; Deneva, J. S.; Desvignes, G.; Freire, P. C. C.; Hammer, D.; Hessels, J. W. T.; Jenet, F. A.

    2011-05-01

    We report the discovery of the 20.7 ms binary pulsar J1952+2630, made using the distributed computing project Einstein@Home in Pulsar ALFA survey observations with the Arecibo telescope. Follow-up observations with the Arecibo telescope confirm the binary nature of the system. We obtain a circular orbital solution with an orbital period of 9.4 hr, a projected orbital radius of 2.8 lt-s, and a mass function of f = 0.15 M_sun by analysis of spin period measurements. No evidence of orbital eccentricity is apparent; we set a 2σ upper limit e ≲ 1.7 x 10^-3. The orbital parameters suggest a massive white dwarf companion with a minimum mass of 0.95 M_sun, assuming a pulsar mass of 1.4 M_sun. Most likely, this pulsar belongs to the rare class of intermediate-mass binary pulsars. Future timing observations will aim to determine the parameters of this system further, measure relativistic effects, and elucidate the nature of the companion star.
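
    The quoted minimum companion mass follows directly from the mass function, f = (m_c sin i)^3 / (m_p + m_c)^2, evaluated at the edge-on limit sin i = 1; a short sketch recovers it by bisection. The abstract's f is rounded to 0.15, so the result lands close to (slightly below) the quoted 0.95 M_sun:

    ```python
    def min_companion_mass(f, m_pulsar, tol=1e-10):
        """Solve f = m_c**3 / (m_pulsar + m_c)**2 for the companion mass m_c
        at sin i = 1, which gives the *minimum* companion mass.
        All masses in solar masses; the left-hand side is monotonically
        increasing in m_c, so simple bisection suffices."""
        lo, hi = 0.0, 100.0
        while hi - lo > tol:
            mid = 0.5 * (lo + hi)
            if mid ** 3 / (m_pulsar + mid) ** 2 < f:
                lo = mid
            else:
                hi = mid
        return 0.5 * (lo + hi)

    # values quoted in the abstract
    m_c = min_companion_mass(f=0.15, m_pulsar=1.4)
    ```
    
    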

  2. Experimental evidence validating the computational inference of functional associations from gene fusion events: a critical survey.

    PubMed

    Promponas, Vasilis J; Ouzounis, Christos A; Iliopoulos, Ioannis

    2014-05-01

    More than a decade ago, a number of methods were proposed for the inference of protein interactions, using whole-genome information from gene clusters, gene fusions and phylogenetic profiles. This structural and evolutionary view of entire genomes has provided a valuable approach for the functional characterization of proteins, especially those without sequence similarity to proteins of known function. Furthermore, this view has raised the real possibility to detect functional associations of genes and their corresponding proteins for any entire genome sequence. Yet, despite these exciting developments, there have been relatively few cases of real use of these methods outside the computational biology field, as reflected from citation analysis. These methods have the potential to be used in high-throughput experimental settings in functional genomics and proteomics to validate results with very high accuracy and good coverage. In this critical survey, we provide a comprehensive overview of 30 most prominent examples of single pairwise protein interaction cases in small-scale studies, where protein interactions have either been detected by gene fusion or yielded additional, corroborating evidence from biochemical observations. Our conclusion is that with the derivation of a validated gold-standard corpus and better data integration with big experiments, gene fusion detection can truly become a valuable tool for large-scale experimental biology.

  3. Experimental evidence validating the computational inference of functional associations from gene fusion events: a critical survey

    PubMed Central

    Promponas, Vasilis J.; Ouzounis, Christos A.; Iliopoulos, Ioannis

    2014-01-01

    More than a decade ago, a number of methods were proposed for the inference of protein interactions, using whole-genome information from gene clusters, gene fusions and phylogenetic profiles. This structural and evolutionary view of entire genomes has provided a valuable approach for the functional characterization of proteins, especially those without sequence similarity to proteins of known function. Furthermore, this view has raised the real possibility to detect functional associations of genes and their corresponding proteins for any entire genome sequence. Yet, despite these exciting developments, there have been relatively few cases of real use of these methods outside the computational biology field, as reflected from citation analysis. These methods have the potential to be used in high-throughput experimental settings in functional genomics and proteomics to validate results with very high accuracy and good coverage. In this critical survey, we provide a comprehensive overview of 30 most prominent examples of single pairwise protein interaction cases in small-scale studies, where protein interactions have either been detected by gene fusion or yielded additional, corroborating evidence from biochemical observations. Our conclusion is that with the derivation of a validated gold-standard corpus and better data integration with big experiments, gene fusion detection can truly become a valuable tool for large-scale experimental biology. PMID:23220349

  4. A Prediction of the Damping Properties of Hindered Phenol AO-60/polyacrylate Rubber (AO-60/ACM) Composites through Molecular Dynamics Simulation

    NASA Astrophysics Data System (ADS)

    Yang, Da-Wei; Zhao, Xiu-Ying; Zhang, Geng; Li, Qiang-Guo; Wu, Si-Zhu

    2016-05-01

    Molecular dynamics (MD) simulation, a molecular-level method, was applied to predict the damping properties of hindered phenol AO-60/polyacrylate rubber (AO-60/ACM) composites before experimental measurements were performed. MD simulation results revealed that two types of hydrogen bond were formed: type A, (AO-60) -OH···O=C- (ACM), and type B, (AO-60) -OH···O=C- (AO-60). The AO-60/ACM composites were then fabricated and tested to verify the accuracy of the MD simulation through dynamic mechanical thermal analysis (DMTA). DMTA results showed that the introduction of AO-60 could remarkably improve the damping properties of the composites, including an increase of the glass transition temperature (Tg) alongside the loss factor (tan δ), and indicated that the AO-60/ACM (98/100) composite had the best damping performance, as verified experimentally.
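
    Hydrogen bonds of the kind reported here are typically identified in MD trajectories with a geometric criterion; a minimal sketch follows. The 3.5 Å donor-acceptor cutoff and 120° D-H···A angle cutoff are common defaults assumed for illustration, not values taken from the paper:

    ```python
    import math

    def is_hbond(donor, hydrogen, acceptor, d_max=3.5, angle_min=120.0):
        """Geometric hydrogen-bond test: donor-acceptor distance below
        d_max (angstrom) AND D-H...A angle (at the hydrogen) above
        angle_min (degrees). Coordinates are (x, y, z) tuples."""
        def angle_deg(a, b, c):
            # angle at vertex b between rays b->a and b->c
            v1 = [a[i] - b[i] for i in range(3)]
            v2 = [c[i] - b[i] for i in range(3)]
            dot = sum(x * y for x, y in zip(v1, v2))
            n1 = math.sqrt(sum(x * x for x in v1))
            n2 = math.sqrt(sum(x * x for x in v2))
            return math.degrees(math.acos(dot / (n1 * n2)))
        return (math.dist(donor, acceptor) <= d_max
                and angle_deg(donor, hydrogen, acceptor) >= angle_min)
    ```

    Counting such contacts between AO-60 hydroxyls and carbonyl oxygens over the trajectory frames is what distinguishes the type A (AO-60 to ACM) from the type B (AO-60 to AO-60) populations.
    
    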

  5. Survey of clinical doses from computed tomography examinations in the Canadian province of Manitoba.

    PubMed

    A Elbakri, Idris; D C Kirkpatrick, Iain

    2013-12-01

    The purpose of this study was to document CT doses for common CT examinations performed throughout the province of Manitoba. Survey forms were sent out to all provincial CT sites. Thirteen out of sixteen (81 %) sites participated. The authors assessed scans of the brain, routine abdomen-pelvis, routine chest, sinuses, lumbar spine, low-dose lung nodule studies, CT pulmonary angiograms, CT KUBs, CT colonographies and combination chest-abdomen-pelvis exams. Sites recorded scanner model, protocol techniques and patient and dose data for 100 consecutive patients who were scanned with any of the aforementioned examinations. Mean effective doses and standard deviations for the province and for individual scanners were computed. The Kruskal-Wallis test was used to compare the variability of effective doses amongst scanners. The t test was used to compare doses and their provincial ranges between newer and older scanners and scanners that used dose saving tools and those that did not. Abdomen-pelvis, chest and brain scans accounted for over 70 % of scans. Their mean effective doses were 18.0 ± 6.7, 13.2 ± 6.4 and 3.0 ± 1.0 mSv, respectively. Variations in doses amongst scanners were statistically significant. Most examinations were performed at 120 kVp, and no lower kVp was used. Dose variations due to scanner age and use of dose saving tools were not statistically significant. Clinical CT doses in Manitoba are broadly similar to but higher than those reported in other Canadian provinces. Results suggest that further dose reduction can be achieved by modifying scanning techniques, such as using lower kVp. Wide variation in doses amongst different scanners suggests that standardisation of scanning protocols can reduce patient dose. New technological advances, such as dose-reduction software algorithms, can be adopted to reduce patient dose.
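
    The Kruskal-Wallis comparison mentioned above can be sketched as follows; the dose values are hypothetical, and this bare-bones implementation averages ranks for ties but omits the tie-correction factor applied by full statistical packages:

    ```python
    from itertools import chain

    def kruskal_wallis_h(*groups):
        """H statistic of the Kruskal-Wallis rank test (no tie correction)."""
        data = list(chain.from_iterable(groups))
        n = len(data)
        # assign 1-based ranks, averaging over runs of tied values
        order = sorted(range(n), key=lambda i: data[i])
        ranks = [0.0] * n
        i = 0
        while i < n:
            j = i
            while j + 1 < n and data[order[j + 1]] == data[order[i]]:
                j += 1
            avg = (i + j) / 2 + 1          # mean of ranks i+1 .. j+1
            for k in range(i, j + 1):
                ranks[order[k]] = avg
            i = j + 1
        # H = 12/(N(N+1)) * sum(R_i^2 / n_i) - 3(N+1)
        h, start = 0.0, 0
        for g in groups:
            r = sum(ranks[start:start + len(g)])
            h += r * r / len(g)
            start += len(g)
        return 12.0 / (n * (n + 1)) * h - 3 * (n + 1)

    # hypothetical effective doses (mSv) for one exam on three scanners
    h = kruskal_wallis_h([15.2, 17.9, 16.4], [21.0, 19.5, 22.3], [12.1, 13.4, 11.8])
    ```

    The H statistic is then compared against a chi-squared distribution with (number of groups - 1) degrees of freedom to decide whether inter-scanner dose variability is significant.
    
    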

  6. Computer ethics: A capstone course

    SciTech Connect

    Fisher, T.G.; Abunawass, A.M.

    1994-12-31

    This paper presents a capstone course on computer ethics required for all computer science majors in our program. The course was designed to encourage students to evaluate their own personal value systems in terms of the established values in computer science as represented by the ACM Code of Ethics. The structure, activities, and topics of the course as well as assessment of the students are presented. Observations on various course components and student evaluations of the course are also presented.

  7. Survey of the Computer Users of the Upper Arlington Public Library.

    ERIC Educational Resources Information Center

    Tsardoulias, L. Sevim

    The Computer Services Department of the Upper Arlington Public Library in Franklin County, Ohio, provides microcomputers for public use, including IBM compatible and Macintosh computers, a laser printer, and dot-matrix printers. Circulation statistics provide data regarding the frequency and amount of computer use, but these statistics indicate…

  8. Pre-Service ELT Teachers' Attitudes towards Computer Use: A Turkish Survey

    ERIC Educational Resources Information Center

    Sariçoban, Arif

    2013-01-01

    Problem Statement: Computer technology plays a crucial role in foreign/second language (L2) instruction, and as such, L2 teachers display different attitudes towards the use of computers in their teaching activities. It is important to know what attitudes these teachers hold towards the use of computers and whether they have these varying…

  9. Man-Computer Symbiosis Through Interactive Graphics: A Survey and Identification of Critical Research Areas.

    ERIC Educational Resources Information Center

    Knoop, Patricia A.

    The purpose of this report was to determine the research areas that appear most critical to achieving man-computer symbiosis. An operational definition of man-computer symbiosis was developed by: (1) reviewing and summarizing what others have said about it, and (2) attempting to distinguish it from other types of man-computer relationships. From…

  10. Successful use of tablet personal computers and wireless technologies for the 2011 Nepal Demographic and Health Survey

    PubMed Central

    Paudel, Deepak; Ahmed, Marie; Pradhan, Anjushree; Lal Dangol, Rajendra

    2013-01-01

    Computer-Assisted Personal Interviewing (CAPI), coupled with the use of mobile and wireless technology, is growing as a data collection methodology. Nepal, a geographically diverse and resource-scarce country, implemented the 2011 Nepal Demographic and Health Survey, a nationwide survey of major health indicators, using tablet personal computers (tablet PCs) and wireless technology for the first time in the country. This paper synthesizes responses on the benefits and challenges of using new technology in such a challenging environment from the 89 interviewers who administered the survey. Overall, feedback from the interviewers indicates that the use of tablet PCs and wireless technology to administer the survey demonstrated potential to improve data quality and reduce data collection time—benefits that outweigh manageable challenges, such as storage and transport of the tablet PCs during fieldwork, limited options for confidential interview space due to screen readability issues under direct sunlight, and inconsistent electricity supply at times. The introduction of this technology holds great promise for improving data availability and quality, even in a context with limited infrastructure and extremely difficult terrain. PMID:25276539

  11. Perceived problems with computer gaming and internet use among adolescents: measurement tool for non-clinical survey studies

    PubMed Central

    2014-01-01

    Background Existing instruments for measuring problematic computer and console gaming and internet use are often lengthy and often based on a pathological perspective. The objective was to develop and present a new, short, non-clinical measurement tool for perceived problems related to computer use and gaming among adolescents and to study the association between screen time and perceived problems. Methods Cross-sectional school survey of 11-, 13-, and 15-year-old students in thirteen schools in the City of Aarhus, Denmark; participation rate 89%, n = 2100. The main exposure was time spent on weekdays on computer and console gaming and on internet use for communication and surfing. The outcome measures were three indexes of perceived problems related to computer and console gaming and internet use. Results The three new indexes showed high face validity and acceptable internal consistency. Most schoolchildren with high screen time did not experience problems related to computer use. Still, there was a strong and graded association between time use and perceived problems related to computer gaming, console gaming (boys only) and internet use, with odds ratios ranging from 6.90 to 10.23. Conclusion The three new measures of perceived problems related to computer and console gaming and internet use among adolescents are appropriate, reliable and valid for use in non-clinical surveys about young people's everyday life and behaviour. These new measures do not assess Internet Gaming Disorder as it is listed in the DSM and therefore have no parity with DSM criteria. We found an increasing risk of perceived problems with increasing time spent on gaming and internet use. Nevertheless, most schoolchildren who spent much time on gaming and internet use did not experience problems. PMID:24731270
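The association above is expressed as odds ratios (6.90 to 10.23). As a minimal sketch, an odds ratio for a standard 2×2 exposure table can be computed as follows; the counts are hypothetical and chosen only to illustrate the calculation, not taken from the study.

```python
# Hypothetical sketch: odds ratio relating high gaming time to perceived
# problems, the effect measure reported in the study. Counts are invented.
def odds_ratio(exposed_cases, exposed_controls, unexposed_cases, unexposed_controls):
    """OR = (a/b) / (c/d) for a 2x2 table: exposure rows, outcome columns."""
    return (exposed_cases / exposed_controls) / (unexposed_cases / unexposed_controls)

# e.g. 80 of 280 high-screen-time students report problems,
# versus 30 of 720 low-screen-time students (invented numbers)
or_value = odds_ratio(80, 200, 30, 690)
print(f"odds ratio = {or_value:.2f}")  # of the same order as the reported ORs
```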

  12. Insights into Degron Recognition by APC/C Coactivators from the Structure of an Acm1-Cdh1 Complex

    PubMed Central

    He, Jun; Chao, William C.H.; Zhang, Ziguo; Yang, Jing; Cronin, Nora; Barford, David

    2013-01-01

    Summary The anaphase-promoting complex/cyclosome (APC/C) regulates sister chromatid segregation and the exit from mitosis. Selection of most APC/C substrates is controlled by coactivator subunits (either Cdc20 or Cdh1) that interact with substrate destruction motifs—predominantly the destruction (D) box and KEN box degrons. How coactivators recognize D box degrons and how this is inhibited by APC/C regulatory proteins is not defined at the atomic level. Here, from the crystal structure of S. cerevisiae Cdh1 in complex with its specific inhibitor Acm1, which incorporates D and KEN box pseudosubstrate motifs, we describe the molecular basis for D box recognition. Additional interactions between Acm1 and Cdh1 identify a third protein-binding site on Cdh1 that is likely to confer coactivator-specific protein functions including substrate association. We provide a structural rationalization for D box and KEN box recognition by coactivators and demonstrate that many noncanonical APC/C degrons bind APC/C coactivators at the D box coreceptor. PMID:23707760

  13. Superfund Record of Decision (EPA Region 5): Acme Solvent Reclaiming, Winnebago County, IL. (Second remedial action), December 1990

    SciTech Connect

    Not Available

    1990-12-31

    The 20-acre Acme Solvent Reclaiming site is a former industrial disposal site in Winnebago County, Illinois. Land use in the area is mixed agricultural and residential. From 1960 to 1973, Acme Solvent Reclaiming disposed of paints, oils, and still bottoms onsite from its solvent reclamation plant. Wastes were dumped into depressions created from previous quarrying and landscaping operations, and empty drums also were stored onsite. State investigations in 1981 identified elevated levels of chlorinated organic compounds in ground water. A 1985 Record of Decision (ROD) provided for excavation and onsite incineration of 26,000 cubic yards of contaminated soil and sludge, supplying home carbon treatment units to affected residences, and further study of ground water and bedrock. During illegal removal actions taken by PRPs in 1986, 40,000 tons of soil and sludge were removed from the site. The selected remedial action for the site includes excavating and treating 6,000 tons of soil and sludge from two waste areas, using low-temperature thermal stripping; treating residuals using solidification, if necessary, followed by onsite or offsite disposal; treating the remaining contaminated soil and possibly bedrock using soil/bedrock vapor extraction; consolidating the remaining contaminated soil onsite with any treatment residuals, followed by capping; incinerating offsite 8,000 gallons of liquids and sludge from two remaining tanks, and disposing of the tanks offsite; providing an alternate water supply to residents with contaminated wells; pumping and onsite treatment of VOC-contaminated ground water.

  14. Computing the Deflection of the Vertical for Improving Aerial Surveys: A Comparison between EGM2008 and ITALGEO05 Estimates

    PubMed Central

    Barzaghi, Riccardo; Carrion, Daniela; Pepe, Massimiliano; Prezioso, Giuseppina

    2016-01-01

    Recent studies on the influence of the anomalous gravity field in GNSS/INS applications have shown that neglecting the impact of the deflection of vertical in aerial surveys induces horizontal and vertical errors in the measurement of an object that is part of the observed scene; these errors can vary from a few tens of centimetres to over one meter. The works reported in the literature refer to vertical deflection values based on global geopotential model estimates. In this paper we compared this approach with the one based on local gravity data and collocation methods. In particular, denoted by ξ and η, the two mutually-perpendicular components of the deflection of the vertical vector (in the north and east directions, respectively), their values were computed by collocation in the framework of the Remove-Compute-Restore technique, applied to the gravity database used for estimating the ITALGEO05 geoid. Following this approach, these values have been computed at different altitudes that are relevant in aerial surveys. The (ξ, η) values were then also estimated using the high degree EGM2008 global geopotential model and compared with those obtained in the previous computation. The analysis of the differences between the two estimates has shown that the (ξ, η) global geopotential model estimate can be reliably used in aerial navigation applications that require the use of sensors connected to a GNSS/INS system only above a given height (e.g., 3000 m in this paper) that must be defined by simulations. PMID:27472333
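The size of the positioning error caused by neglecting the deflection of the vertical can be sketched with a back-of-envelope model: at flying height H, a deflection δ tilts the local vertical and displaces the ground point by roughly H·tan(δ). The heights and deflection values below are illustrative assumptions, not figures from the paper.

```python
# Back-of-envelope sketch (illustrative values only): horizontal error
# induced by an unmodelled deflection of the vertical at flying height H.
import math

ARCSEC_TO_RAD = math.pi / (180 * 3600)

def horizontal_error_m(height_m, deflection_arcsec):
    """Approximate ground displacement ~ H * tan(deflection)."""
    return height_m * math.tan(deflection_arcsec * ARCSEC_TO_RAD)

for xi in (5, 20, 60):  # arcseconds; larger values occur over rough topography
    print(f'{xi:2d}" at 3000 m -> {horizontal_error_m(3000, xi):.2f} m')
```

The outputs span from a few centimetres to nearly a metre, consistent with the "few tens of centimetres to over one meter" range quoted in the abstract.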

  16. Functional requirements of computer systems for the U.S. Geological Survey, Water Resources Division, 1988-97

    USGS Publications Warehouse

    Hathaway, R.M.; McNellis, J.M.

    1989-01-01

    Investigating the occurrence, quantity, quality, distribution, and movement of the Nation's water resources is the principal mission of the U.S. Geological Survey's Water Resources Division. Reports of these investigations are published and available to the public. To accomplish this mission, the Division requires substantial computer technology to process, store, and analyze data from more than 57,000 hydrologic sites. The Division's computer resources are organized through the Distributed Information System Program Office that manages the nationwide network of computers. The contract that provides the major computer components for the Water Resources Division's Distributed Information System expires in 1991. Five work groups were organized to collect the information needed to procure a new generation of computer systems for the U.S. Geological Survey, Water Resources Division. Each group was assigned a major Division activity and asked to describe its functional requirements of computer systems for the next decade. The work groups and major activities are: (1) hydrologic information; (2) hydrologic applications; (3) geographic information systems; (4) reports and electronic publishing; and (5) administrative. The work groups identified 42 functions and described their functional requirements for 1988, 1992, and 1997. A few new functions, such as Decision Support Systems and Executive Information Systems, were identified, but most are the same as performed today. Although the number of functions will remain about the same, steady growth in the size, complexity, and frequency of many functions is predicted for the next decade. No compensating increase in the Division's staff is anticipated during this period. To handle the increased workload and perform these functions, new approaches will be developed that use advanced computer technology. The advanced technology is required in a unified, tightly coupled system that will support all functions simultaneously.

  17. A Survey of High-Quality Computational Libraries and their Impactin Science and Engineering Applications

    SciTech Connect

    Drummond, L.A.; Hernandez, V.; Marques, O.; Roman, J.E.; Vidal, V.

    2004-09-20

    Recently, a number of important scientific and engineering problems have been successfully studied and solved by means of computational modeling and simulation. Many of these computational models and simulations benefited from the use of available software tools and libraries to achieve high performance and portability. In this article, we present a reference matrix of the performance of robust, reliable and widely used tools mapped to scientific and engineering applications that use them. We aim at regularly maintaining and disseminating this matrix to the computational science community. This matrix will contain information on state-of-the-art computational tools, their applications and their use.

  18. A survey of students' ethical attitudes using computer-related scenarios

    SciTech Connect

    Hanchey, C.M.; Kingsbury, J.

    1994-12-31

    Many studies exist that examine ethical beliefs and attitudes of university students attending medium or large institutions. There are also many studies which examine ethical attitudes and beliefs of computer science and computer information systems majors. None, however, examines ethical attitudes of university students (regardless of undergraduate major) at a small, Christian, liberal arts institution regarding computer-related situations. This paper will present data accumulated by an on-going study in which students are presented seven scenarios--all of which involve some aspect of computing technology. These students were randomly selected from a small, Christian, liberal arts university.

  19. A Survey and Evaluation of Simulators Suitable for Teaching Courses in Computer Architecture and Organization

    ERIC Educational Resources Information Center

    Nikolic, B.; Radivojevic, Z.; Djordjevic, J.; Milutinovic, V.

    2009-01-01

    Courses in Computer Architecture and Organization are regularly included in Computer Engineering curricula. These courses are usually organized in such a way that students obtain not only a purely theoretical experience, but also a practical understanding of the topics lectured. This practical work is usually done in a laboratory using simulators…

  20. Survey of Turbulence Models for the Computation of Turbulent Jet Flow and Noise

    NASA Technical Reports Server (NTRS)

    Nallasamy, N.

    1999-01-01

    The report presents an overview of jet noise computation utilizing the computational fluid dynamics solution of the turbulent jet flow field. The jet flow solution obtained with an appropriate turbulence model provides the turbulence characteristics needed for the computation of jet mixing noise. A brief account of turbulence models that are relevant for jet noise computation is presented. The jet flow solutions that have been directly used to calculate jet noise are first reviewed. Then, the turbulent jet flow studies that compute the turbulence characteristics that may be used for noise calculations are summarized. In particular, flow solutions obtained with the k-ε model, the algebraic Reynolds stress model, and the Reynolds stress transport equation model are reviewed. Since small-scale jet mixing noise predictions can be improved by utilizing anisotropic turbulence characteristics, turbulence models that can provide the Reynolds stress components must now be considered for jet flow computations. In this regard, algebraic stress models and Reynolds stress transport models are good candidates. Reynolds stress transport models involve more modeling and computational effort and time compared to algebraic stress models. Hence, it is recommended that an algebraic Reynolds stress model (ASM) be implemented in flow solvers to compute the Reynolds stress components.

  1. Computer Game Theories for Designing Motivating Educational Software: A Survey Study

    ERIC Educational Resources Information Center

    Ang, Chee Siang; Rao, G. S. V. Radha Krishna

    2008-01-01

    The purpose of this study is to evaluate computer game theories for educational software. We propose a framework for designing engaging educational games based on contemporary game studies which includes ludology and narratology. Ludology focuses on the study of computer games as play and game activities, while narratology revolves around the…

  2. CLIC-ACM: generic modular rad-hard data acquisition system based on CERN GBT versatile link

    NASA Astrophysics Data System (ADS)

    Bielawski, B.; Locci, F.; Magnoni, S.

    2015-01-01

    CLIC is a world-wide collaboration to study the next "terascale" lepton collider, relying upon a very innovative concept of two-beam acceleration. This accelerator, currently under study, will be composed of a sequence of 21,000 two-beam modules. Each module requires more than 300 analogue and digital signals which need to be acquired and controlled in a synchronous way. CLIC-ACM (Acquisition and Control Module) is the 'generic' control and acquisition module developed to accommodate the controls of all these signals for the various sub-systems and their related specifications in terms of data bandwidth, triggering and timing synchronization. This paper describes the system architecture with respect to its radiation tolerance, power consumption and scalability.

  3. Holism, ambiguity and approximation in the logics of quantum computation: a survey

    NASA Astrophysics Data System (ADS)

    Dalla Chiara, Maria Luisa; Giuntini, Roberto; Leporini, Roberto

    2011-01-01

    Quantum computation has suggested some new forms of quantum logic (called quantum computational logics), where meanings of sentences are identified with quantum information quantities. This provides a mathematical formalism for an abstract theory of meanings that can be applied to investigate different kinds of semantic phenomena (in social sciences, in medicine, in natural languages and in the languages of art), where both ambiguity and holism play an essential role.

  4. How We Surveyed Doctors to Learn What They Want from Computers and Technology

    ERIC Educational Resources Information Center

    Bardyn, Tania; Young, Caroline; Lombardi, Lin C.

    2008-01-01

    Librarians at New York City's Bellevue Hospital Center needed to write a 3-year strategic plan that included technology data. In this article, they describe how they surveyed doctors and residents about their technology and internet use to determine what the Bellevue Medical Library needed to do in order to support those who deliver medical care.…

  5. Improving Inpatient Surveys: Web-Based Computer Adaptive Testing Accessed via Mobile Phone QR Codes

    PubMed Central

    2016-01-01

    Background The National Health Service (NHS) 70-item inpatient questionnaire surveys inpatients on their perceptions of their hospitalization experience. However, it imposes more burden on the patient than other similar surveys. The literature shows that computerized adaptive testing (CAT) based on item response theory can help shorten the item length of a questionnaire without compromising its precision. Objective Our aim was to investigate whether CAT can be (1) efficient with item reduction and (2) used with quick response (QR) codes scanned by mobile phones. Methods After downloading the 2008 inpatient survey data from the Picker Institute Europe website and analyzing the difficulties of this 70-item questionnaire, we used an author-made Excel program using the Rasch partial credit model to simulate 1000 patients' true scores following a standard normal distribution. The CAT was compared to two other scenarios, answering all items (AAI) and the randomized selection method (RSM), as we investigated item length (efficiency) and measurement accuracy. The author-made Web-based CAT program for gathering patient feedback was effectively accessed from mobile phones by scanning the QR code. Results We found that the CAT can be more efficient for patients answering questions (ie, fewer items to respond to) than either AAI or RSM without compromising its measurement accuracy. A Web-based CAT inpatient survey accessed by scanning a QR code on a mobile phone was viable for gathering inpatient satisfaction responses. Conclusions With advances in technology, patients can now be offered alternatives for providing feedback about hospitalization satisfaction. This Web-based CAT is a possible option in health care settings for reducing the number of survey items, as well as offering an innovative QR code access. PMID:26935793
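The CAT procedure described above can be sketched schematically: under a Rasch model, repeatedly administer the item whose difficulty is closest to the current ability estimate (where item information peaks), then update the estimate after each response. Everything below (the item difficulties, the deterministic response rule, the gradient-style update) is a simplified stand-in, not the authors' Excel implementation.

```python
# Schematic CAT sketch under a simple Rasch model. All parameters invented.
import math

def p_agree(theta, b):
    """Rasch probability of endorsing an item of difficulty b at ability theta."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def run_cat(true_theta, difficulties, n_items=5, lr=0.5):
    theta, remaining = 0.0, list(difficulties)
    for _ in range(n_items):
        b = min(remaining, key=lambda d: abs(d - theta))  # most informative item
        remaining.remove(b)
        response = 1 if p_agree(true_theta, b) > 0.5 else 0  # deterministic stand-in
        theta += lr * (response - p_agree(theta, b))  # nudge estimate toward response
    return theta

est = run_cat(true_theta=1.2, difficulties=[-2, -1, -0.5, 0, 0.5, 1, 1.5, 2])
print(f"estimated ability after 5 items: {est:.2f}")
```

Even this toy version shows the key property the paper exploits: a handful of well-chosen items moves the estimate toward the true score, so fewer than 70 items are needed.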

  6. Dynamic MRI-based computer aided diagnostic systems for early detection of kidney transplant rejection: A survey

    NASA Astrophysics Data System (ADS)

    Mostapha, Mahmoud; Khalifa, Fahmi; Alansary, Amir; Soliman, Ahmed; Gimel'farb, Georgy; El-Baz, Ayman

    2013-10-01

    Early detection of renal transplant rejection is important to implement appropriate medical and immune therapy in patients with transplanted kidneys. In the literature, a large number of computer-aided diagnostic (CAD) systems using different image modalities, such as ultrasound (US), magnetic resonance imaging (MRI), computed tomography (CT), and radionuclide imaging, have been proposed for early detection of kidney diseases. A typical CAD system for kidney diagnosis consists of a set of processing steps including: motion correction, segmentation of the kidney and/or its internal structures (e.g., cortex, medulla), construction of agent kinetic curves, functional parameter estimation, diagnosis, and assessment of the kidney status. In this paper, we survey the current state-of-the-art CAD systems that have been developed for kidney disease diagnosis using dynamic MRI. In addition, the paper addresses several challenges that researchers face in developing efficient, fast and reliable CAD systems for the early detection of kidney diseases.

  7. Characterization of PVL/ACME-positive methicillin-resistant Staphylococcus aureus (genotypes ST8-MRSA-IV and ST5-MRSA-II) isolated from a university hospital in Japan.

    PubMed

    Kawaguchiya, Mitsuyo; Urushibara, Noriko; Yamamoto, Dai; Yamashita, Toshiharu; Shinagawa, Masaaki; Watanabe, Naoki; Kobayashi, Nobumichi

    2013-02-01

    The ST8 methicillin-resistant Staphylococcus aureus (MRSA) with Staphylococcal cassette chromosome mec (SCCmec) type IVa, known as USA300, is a prevalent community-acquired MRSA (CA-MRSA) clone in the United States and has been spreading worldwide. The USA300 clone characteristically harbors Panton-Valentine Leukocidin (PVL) genes and the arginine catabolic mobile element (ACME, type I). The prevalence and molecular characteristics of PVL(+) and/or ACME(+) S. aureus were investigated in a university hospital located in northern Japan, for 1,366 S. aureus isolates, including 601 MRSA strains derived from clinical specimens collected from 2008 to 2010. The PVL gene was identified in three MRSA strains with SCCmec IV, which belonged to ST8, spa type t008, coagulase type III, and agr type I. Two PVL-positive MRSA strains also had type I ACME and were isolated from skin abscesses of outpatients who had not travelled abroad recently. One of these PVL(+)/ACME(+) strains carried tet(K), msrA, and aph(3')-IIIa, showing resistance to kanamycin, tetracycline, erythromycin, and ciprofloxacin, suggesting acquisition of more resistance than the ST8 CA-MRSA previously reported in Japan. In contrast, another PVL(+)/ACME(+) strain and a PVL(+)/ACME(-) strain were susceptible to more antimicrobials and had fewer virulence factors than PVL(-)/ACME(+) MRSA strains. Besides the two PVL(+) MRSA strains, ACME (type-ΔII) was identified in seven MRSA strains with SCCmec II belonging to ST5, one of three spa types (t002, t067, and t071), coagulase type II, and agr type II. These PVL(-)/ACME(+) MRSA strains showed multiple drug resistance and harbored various toxin genes, as observed for ST5 PVL(-)/ACME(-) MRSA-II. The present study suggests the spread of ST8-MRSA-IV in northern Japan and a potential significance of ACME-positive ST5-MRSA-II as an emerging MRSA clone in a hospital setting.

  8. Using Computers in Distance Study: Results of a Survey amongst Disabled Distance Students.

    ERIC Educational Resources Information Center

    Ommerborn, Rainer; Schuemer, Rudolf

    2002-01-01

    In the euphoria about new technologies in distance education there exists the danger of not sufficiently considering how ever increasing "virtualization" may exclude some student groups. An explorative study was conducted that asked disabled students about their experiences with using computers and the Internet. Overall, those questioned mentioned…

  9. Using Computers in Distance Study: Results of a Survey amongst Disabled Distance Students.

    ERIC Educational Resources Information Center

    Ommerborn, Rainer; Schuemer, Rudolf

    A study at Germany's FernUniversitat sent a questionnaire to 300 enrolled distance education students (mostly adult, mostly part-time) who labeled themselves as severely disabled or chronically ill (about 2 percent of students), asking them about the types of their disabilities and their attitudes toward computer-assisted learning and online…

  10. A Survey of Knowledge Management Skills Acquisition in an Online Team-Based Distributed Computing Course

    ERIC Educational Resources Information Center

    Thomas, Jennifer D. E.

    2007-01-01

    This paper investigates students' perceptions of their acquisition of knowledge management skills, namely thinking and team-building skills, resulting from the integration of various resources and technologies into an entirely team-based, online upper level distributed computing (DC) information systems (IS) course. Results seem to indicate that…

  11. A Survey of Students Participating in a Computer-Assisted Education Programme

    ERIC Educational Resources Information Center

    Yel, Elif Binboga; Korhan, Orhan

    2015-01-01

    This paper mainly examines anthropometric data, data regarding the habits, experiences, and attitudes of the students about their tablet/laptop/desktop computer use, in addition to self-reported musculoskeletal discomfort levels and frequencies of students participating in a tablet-assisted interactive education programme. A two-part questionnaire…

  12. Computational needs survey of NASA automation and robotics missions. Volume 2: Appendixes

    NASA Technical Reports Server (NTRS)

    Davis, Gloria J.

    1991-01-01

    NASA's operational use of advanced processor technology in space systems lags behind its commercial development by more than eight years. One of the factors contributing to this is the fact that mission computing requirements are frequently unknown, unstated, misrepresented, or simply not available in a timely manner. NASA must provide clear common requirements to make better use of available technology, to cut development lead time on deployable architectures, and to increase the utilization of new technology. Here, NASA, industry and academic communities are provided with a preliminary set of advanced mission computational processing requirements of automation and robotics (A and R) systems. The results were obtained in an assessment of the computational needs of current projects throughout NASA. The high percentage of responses indicated a general need for enhanced computational capabilities beyond the currently available 80386 and 68020 processor technology. Because of the need for faster processors and more memory, 90 percent of the polled automation projects have reduced or will reduce the scope of their implemented capabilities. The requirements are presented with respect to their targeted environment, identifying the applications required, system performance levels necessary to support them, and the degree to which they are met with typical programmatic constraints. Here, appendixes are provided.

  13. Unmanned aircraft systems image collection and computer vision image processing for surveying and mapping that meets professional needs

    NASA Astrophysics Data System (ADS)

    Peterson, James Preston, II

    Unmanned Aerial Systems (UAS) are rapidly blurring the lines between traditional and close range photogrammetry, and between surveying and photogrammetry. UAS are providing an economic platform for performing aerial surveying on small projects. The focus of this research was to describe traditional photogrammetric imagery and Light Detection and Ranging (LiDAR) geospatial products, describe close range photogrammetry (CRP), introduce UAS and computer vision (CV), and investigate whether industry mapping standards for accuracy can be met using UAS collection and CV processing. A 120-acre site was selected and 97 aerial targets were surveyed for evaluation purposes. Four UAS flights of varying heights above ground level (AGL) were executed, and three different target patterns of varying distances between targets were analyzed for compliance with American Society for Photogrammetry and Remote Sensing (ASPRS) and National Standard for Spatial Data Accuracy (NSSDA) mapping standards. This analysis resulted in twelve datasets. Error patterns were evaluated and reasons for these errors were determined. The relationship between the AGL, ground sample distance, target spacing and the root mean square error of the targets is exploited by this research to develop guidelines that use the ASPRS and NSSDA map standard as the template. These guidelines allow the user to select the desired mapping accuracy and determine what target spacing and AGL is required to produce the desired accuracy. These guidelines also address how UAS/CV phenomena affect map accuracy. General guidelines and recommendations are presented that give the user helpful information for planning a UAS flight using CV technology.
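The NSSDA standard referenced above reports map accuracy at the 95% confidence level as a fixed multiple of the check-point RMSE: 1.7308 × RMSE for horizontal (radial) error and 1.9600 × RMSE for vertical error. A minimal sketch of that computation, with invented residuals at surveyed targets:

```python
# Illustrative sketch (invented residuals): NSSDA accuracy at 95% confidence
# from check-point RMSE, using the standard factors 1.7308 (horizontal,
# radial, assuming RMSE_x ~ RMSE_y) and 1.9600 (vertical).
import math

def rmse(errors):
    return math.sqrt(sum(e * e for e in errors) / len(errors))

# residuals (m) at surveyed ground targets, invented for illustration
radial_err = [0.03, 0.05, 0.02, 0.06, 0.04]
vert_err   = [0.04, 0.07, 0.05, 0.03, 0.06]

horiz_95 = 1.7308 * rmse(radial_err)
vert_95  = 1.9600 * rmse(vert_err)
print(f"NSSDA horizontal accuracy: {horiz_95:.3f} m at 95% confidence")
print(f"NSSDA vertical accuracy:   {vert_95:.3f} m at 95% confidence")
```

Comparing such values against the chosen map standard at each AGL/target-spacing combination is, in essence, how the twelve datasets above would be judged compliant or not.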

  14. Proceeding of the ACM/IEEE-CS Joint Conference on Digital Libraries (1st, Roanoke, Virginia, June 24-28, 2001).

    ERIC Educational Resources Information Center

    Association for Computing Machinery, New York, NY.

    Papers in this Proceedings of the ACM/IEEE-CS Joint Conference on Digital Libraries (Roanoke, Virginia, June 24-28, 2001) discuss: automatic genre analysis; text categorization; automated name authority control; automatic event generation; linked active content; designing e-books for legal research; metadata harvesting; mapping the…

  15. Creating a New Model Curriculum: A Rationale for "Computing Curricula 1990".

    ERIC Educational Resources Information Center

    Bruce, Kim B.

    1991-01-01

    Describes a model for the design of undergraduate curricula in the discipline of computing that was developed by the ACM/IEEE (Association for Computing Machinery/Institute of Electrical and Electronics Engineers) Computer Society Joint Curriculum Task Force. Institutional settings and structures in which computing degrees are awarded are…

  16. GTE: a new FFT based software to compute terrain correction on airborne gravity surveys in spherical approximation.

    NASA Astrophysics Data System (ADS)

    Capponi, Martina; Sampietro, Daniele; Sansò, Fernando

    2016-04-01

    The computation of the vertical attraction due to the topographic masses (Terrain Correction) is still a matter of study in both geodetic and geophysical applications. In fact, it is required in high-precision geoid estimation by the remove-restore technique, and it is used to isolate the gravitational effect of anomalous masses in geophysical exploration. This topographical effect can be evaluated from the knowledge of a Digital Terrain Model in different ways: e.g. by means of numerical integration, by prisms, tesseroids, polyhedra, or Fast Fourier Transform (FFT) techniques. The increasing resolution of recently developed digital terrain models, the increasing number of observation points due to extensive use of airborne gravimetry, and the increasing accuracy of gravity data are now major challenges for the terrain correction computation. Classical methods such as prism or point-mass approximations are too slow, while Fourier-based techniques are usually too approximate for the required accuracy. In this work a new software package, called Gravity Terrain Effects (GTE), developed to guarantee high accuracy and fast computation of terrain corrections, is presented. GTE was designed expressly for geophysical applications, allowing the computation not only of the effect of topographic and bathymetric masses but also of those due to sedimentary layers or to the Earth's crust-mantle discontinuity (the so-called Moho). In the present contribution we summarize the basic theory of the software and its practical implementation. The GTE software is based on a new algorithm which, by exploiting the properties of the Fast Fourier Transform, quickly computes the terrain correction, in spherical approximation, at ground or airborne level. Tests demonstrating its performance are also described, showing GTE's capability to compute highly accurate terrain corrections in a very short time. Results obtained for a real airborne survey with GTE…
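
    GTE's actual algorithm works in spherical approximation and is not reproduced in this abstract. Purely to illustrate why FFT-based evaluation is fast, the sketch below computes a first-order planar terrain attraction as a convolution of residual heights with a point-mass kernel, evaluated with zero-padded FFTs. The flat-Earth kernel and all names here are illustrative assumptions, not GTE's formulation:

```python
import numpy as np

def fft_convolve_same(a, k):
    """Linear (zero-padded) 2-D convolution via FFT, cropped to a.shape."""
    ny, nx = a.shape
    ky, kx = k.shape
    fy, fx = ny + ky - 1, nx + kx - 1
    full = np.fft.irfft2(
        np.fft.rfft2(a, s=(fy, fx)) * np.fft.rfft2(k, s=(fy, fx)), s=(fy, fx)
    )
    oy, ox = ky // 2, kx // 2
    return full[oy:oy + ny, ox:ox + nx]

def terrain_effect_planar(dh, dx, dy, z0, G=6.674e-11, rho=2670.0):
    """First-order planar terrain attraction [m/s^2] at height z0 above the grid.

    dh is a 2-D array of residual terrain heights [m]; each node is treated as
    a point mass rho*dh*dx*dy attracting a computation point z0 metres above it.
    """
    ny, nx = dh.shape
    x = (np.arange(nx) - nx // 2) * dx
    y = (np.arange(ny) - ny // 2) * dy
    X, Y = np.meshgrid(x, y)
    # Vertical component of the point-mass attraction kernel
    kernel = z0 / (X**2 + Y**2 + z0**2) ** 1.5
    return G * rho * dx * dy * fft_convolve_same(dh, kernel)
```

    A direct double loop over N grid nodes costs O(N^2) kernel evaluations; the FFT route costs O(N log N), which is what makes airborne-resolution grids tractable.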

  17. Computational analysis in epilepsy neuroimaging: A survey of features and methods

    PubMed Central

    Kini, Lohith G.; Gee, James C.; Litt, Brian

    2016-01-01

    Epilepsy affects 65 million people worldwide, a third of whom have seizures that are resistant to anti-epileptic medications. Some of these patients may be amenable to surgical therapy or treatment with implantable devices, but this usually requires delineation of discrete structural or functional lesion(s), which is challenging in a large percentage of these patients. Advances in neuroimaging and machine learning allow semi-automated detection of malformations of cortical development (MCDs), a common cause of drug resistant epilepsy. A frequently asked question in the field is what techniques currently exist to assist radiologists in identifying these lesions, especially subtle forms of MCDs such as focal cortical dysplasia (FCD) Type I and low grade glial tumors. Below we introduce some of the common lesions encountered in patients with epilepsy and the common imaging findings that radiologists look for in these patients. We then review and discuss the computational techniques introduced over the past 10 years for quantifying and automatically detecting these imaging findings. Due to large variations in the accuracy and implementation of these studies, specific techniques are traditionally used at individual centers, often guided by local expertise, as well as selection bias introduced by the varying prevalence of specific patient populations in different epilepsy centers. We discuss the need for a multi-institutional study that combines features from different imaging modalities as well as computational techniques to definitively assess the utility of specific automated approaches to epilepsy imaging. We conclude that sharing and comparing these different computational techniques through a common data platform provides an opportunity to rigorously test and compare the accuracy of these tools across different patient populations and geographical locations. We propose that these kinds of tools, quantitative imaging analysis methods and open data platforms for

  18. Computational analysis in epilepsy neuroimaging: A survey of features and methods.

    PubMed

    Kini, Lohith G; Gee, James C; Litt, Brian

    2016-01-01

    Epilepsy affects 65 million people worldwide, a third of whom have seizures that are resistant to anti-epileptic medications. Some of these patients may be amenable to surgical therapy or treatment with implantable devices, but this usually requires delineation of discrete structural or functional lesion(s), which is challenging in a large percentage of these patients. Advances in neuroimaging and machine learning allow semi-automated detection of malformations of cortical development (MCDs), a common cause of drug resistant epilepsy. A frequently asked question in the field is what techniques currently exist to assist radiologists in identifying these lesions, especially subtle forms of MCDs such as focal cortical dysplasia (FCD) Type I and low grade glial tumors. Below we introduce some of the common lesions encountered in patients with epilepsy and the common imaging findings that radiologists look for in these patients. We then review and discuss the computational techniques introduced over the past 10 years for quantifying and automatically detecting these imaging findings. Due to large variations in the accuracy and implementation of these studies, specific techniques are traditionally used at individual centers, often guided by local expertise, as well as selection bias introduced by the varying prevalence of specific patient populations in different epilepsy centers. We discuss the need for a multi-institutional study that combines features from different imaging modalities as well as computational techniques to definitively assess the utility of specific automated approaches to epilepsy imaging. We conclude that sharing and comparing these different computational techniques through a common data platform provides an opportunity to rigorously test and compare the accuracy of these tools across different patient populations and geographical locations. We propose that these kinds of tools, quantitative imaging analysis methods and open data platforms for

  19. SAM 2.1—A computer program for plotting and formatting surveying data for estimating peak discharges by the slope-area method

    USGS Publications Warehouse

    Hortness, J.E.

    2004-01-01

    The U.S. Geological Survey (USGS) measures discharge in streams using several methods. However, measurement of peak discharges is often impossible or impractical due to difficult access, the inherent danger of making measurements during flood events, and the timing often associated with flood events. Thus, many peak discharge values are calculated after the fact by use of indirect methods. The most common indirect method for estimating peak discharges in streams is the slope-area method. This, like other indirect methods, requires measuring the flood profile through detailed surveys. Processing the survey data for efficient entry into computer streamflow models can be time-consuming; SAM 2.1 is a program designed to expedite that process. The SAM 2.1 computer program is designed to be run in the field on a portable computer. The program processes digital surveying data obtained from an electronic surveying instrument during slope-area measurements. After all measurements have been completed, the program generates files to be input into the SAC (Slope-Area Computation program; Fulford, 1994) or HEC-RAS (Hydrologic Engineering Center-River Analysis System; Brunner, 2001) computer streamflow models so that an estimate of the peak discharge can be calculated.
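
    At its core, the slope-area method applies Manning's equation over the surveyed reach: discharge is channel conveyance times the square root of the friction slope. SAC and HEC-RAS handle multi-section reaches, expansion/contraction losses, and compound channels; the single-section sketch below, with illustrative names, shows only the basic relation:

```python
def conveyance(n, area, hydraulic_radius, k=1.486):
    """Channel conveyance K = (k/n) * A * R^(2/3).

    n is Manning's roughness coefficient; k = 1.486 for US customary
    units (ft, s), or k = 1.0 for SI units (m, s).
    """
    return (k / n) * area * hydraulic_radius ** (2.0 / 3.0)

def slope_area_discharge(n, area, hydraulic_radius, friction_slope, k=1.486):
    """Peak discharge Q = K * sqrt(S_f) from Manning's equation."""
    return conveyance(n, area, hydraulic_radius, k) * friction_slope ** 0.5
```

    The surveyed flood profile supplies the friction slope, while the surveyed cross sections supply the area and hydraulic radius.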

  20. Hydrologic effects of phreatophyte control, Acme-Artesia reach of the Pecos River, New Mexico, 1967-82

    USGS Publications Warehouse

    Welder, G.E.

    1988-01-01

    The U.S. Bureau of Reclamation began a phreatophyte clearing and control program in the bottom land of the Acme-Artesia reach of the Pecos River in March 1967. The initial cutting of 19,000 acres of saltcedar trees, the dominant phreatophyte in the area, was completed in May 1969. Saltcedar regrowth continued each year until July 1975, when root plowing eradicated most of the regrowth. The major objective of the clearing and control program was to salvage water that could be put to beneficial use. Measurements of changes in the water table in the bottom land and changes in the base flow of the Pecos River were made in order to determine the hydrologic effects of the program. Some salvage of water was indicated, but it is not readily recognized as an increase in base flow. The quantity of salvage probably is less than the average annual base-flow gain of 19,110 acre-ft in the reach during 1967-82. (Author's abstract)

  1. Detection of structural and numerical chromosomal abnormalities by ACM-FISH analysis in sperm of oligozoospermic infertility patients

    SciTech Connect

    Schmid, T E; Brinkworth, M H; Hill, F; Sloter, E; Kamischke, A; Marchetti, F; Nieschlag, E; Wyrobek, A J

    2003-11-10

    Modern reproductive technologies are enabling the treatment of infertile men with severe disturbances of spermatogenesis. The possibility of elevated frequencies of genetically and chromosomally defective sperm has become an issue of concern with the increased usage of intracytoplasmic sperm injection (ICSI), which can enable men with severely impaired sperm production to father children. Several papers have been published about aneuploidy in oligozoospermic patients, but relatively little is known about chromosome structural aberrations in the sperm of these patients. We examined sperm from infertile, oligozoospermic individuals for structural and numerical chromosomal abnormalities using a multicolor ACM-FISH assay that utilizes DNA probes specific for three regions of chromosome 1 to detect human sperm that carry numerical chromosomal abnormalities plus two categories of structural aberrations: duplications and deletions of 1pter and 1cen, and chromosomal breaks within the 1cen-1q12 region. There was a significant increase in the average frequencies of sperm with duplications and deletions in the infertility patients compared with the healthy concurrent controls. There was also a significantly elevated level of breaks within the 1cen-1q12 region. There was no evidence for an increase in chromosome-1 disomy, or in diploidy. Our data reveal that oligozoospermia is associated with chromosomal structural abnormalities, suggesting that oligozoospermic men carry a higher burden of transmissible chromosome damage. The findings raise the possibility of elevated levels of transmissible chromosomal defects following ICSI treatment.

  2. Assessment of Two Planetary Boundary Layer Schemes (ACM2 and YSU) within the Weather Research and Forecasting (WRF) Model

    NASA Astrophysics Data System (ADS)

    Wolff, J.; Harrold, M.; Xu, M.

    2014-12-01

    The Weather Research and Forecasting (WRF) model is a highly configurable numerical weather prediction system used in both research and operational forecasting applications. Rigorously testing select configurations and evaluating the performance for specific applications is necessary due to the flexibility offered by the system. The Developmental Testbed Center (DTC) performed extensive testing and evaluation with the Advanced Research WRF (ARW) dynamic core for two physics suite configurations with a goal of assessing the impact that the planetary boundary layer (PBL) scheme had on the final forecast performance. The baseline configuration was run with the Air Force Weather Agency's physics suite, which includes the Yonsei University PBL scheme, while the second configuration was substituted with the Asymmetric Convective Model (ACM2) PBL scheme. This presentation will focus on assessing the forecast performance of the two configurations; both configurations were run over the same set of cases, allowing for a direct comparison of performance. The evaluation was performed over a 15 km CONUS domain for a testing period from September 2013 through August 2014. Simulations were initialized every 36 hours and run out to 48 hours; a 6-hour "warm start" spin-up, including data assimilation using the Gridpoint Statistical Interpolation system preceded each simulation. The extensive testing period allows for robust results as well as the ability to investigate seasonal and regional differences between the two configurations. Results will focus on the evaluation of traditional verification metrics for surface and upper air variables, along with an assessment of statistical and practical significance.

  3. A Self-Report Computer-Based Survey of Technology Use by People with Intellectual and Developmental Disabilities

    PubMed Central

    Tanis, Emily Shea; Palmer, Susan B.; Wehmeyer, Michael L.; Davies, Danial; Stock, Steven; Lobb, Kathy; Bishop, Barbara

    2014-01-01

    Advancements of technologies in the areas of mobility, hearing and vision, communication, and daily living for people with intellectual and developmental disabilities (IDD) have the potential to greatly enhance independence and self-determination. Previous research, however, suggests that there is a “technological divide” with regard to the use of such technologies by people with IDD when compared with the general public. The present study sought to provide current information on technology use by people with IDD by examining the technology needs, use, and barriers to such use experienced by 180 adults with IDD through QuestNet, a self-directed computer survey program. The study findings suggest that although there has been progress in technology acquisition and use by people with IDD, there remains an underutilization of technologies across the population. PMID:22316226

  4. Survey of illuminance distribution of vignetted image at autocollimation systems by computer simulation

    NASA Astrophysics Data System (ADS)

    Konyakhin, Igor A.; Smekhov, Andrey

    2013-01-01

    During the installation and maintenance of large-scale industrial structures, angular measuring devices such as autocollimation systems are needed to control certain characteristics of the objects. Autocollimation systems are also often used in scientific experiments and technical modeling. Parallelism, coaxiality and other alignments, as well as yaw, roll, pitch, and deformation angles, are the most widespread parameters that must be determined. However, autocollimation systems have some fundamental issues; error caused by vignetting of the incoming radiation is one of them. Although the resulting image at the detector's sensitive area is reduced by the entrance aperture, the errors of such vignetting, being systematic, can be eliminated. The purposes of this survey are to implement a software model that traces rays through the simulated autocollimation system, predicts the illuminance distribution, and calculates vignetting errors. The precalculated vignetting error for each position of the reflecting element fixed on the controlled object is saved in a database. A simple recovery algorithm then makes it possible to reconstruct the real object position and eliminate the vignetting error. Furthermore, the ability to model different types of apertures, reflecting elements, and emitter radiation patterns incorporated into the software makes it applicable to much more complicated systems and decreases the time and expense of the design process.

  5. A review of brain-computer interface games and an opinion survey from researchers, developers and users.

    PubMed

    Ahn, Minkyu; Lee, Mijin; Choi, Jinyoung; Jun, Sung Chan

    2014-08-11

    In recent years, research on Brain-Computer Interface (BCI) technology for healthy users has attracted considerable interest, and BCI games are especially popular. This study reviews the current status of, and describes future directions, in the field of BCI games. To this end, we conducted a literature search and found that BCI control paradigms using electroencephalographic signals (motor imagery, P300, steady state visual evoked potential and passive approach reading mental state) have been the primary focus of research. We also conducted a survey of nearly three hundred participants that included researchers, game developers and users around the world. From this survey, we found that all three groups (researchers, developers and users) agreed on the significant influence and applicability of BCI and BCI games, and they all selected prostheses, rehabilitation and games as the most promising BCI applications. User and developer groups tended to give low priority to passive BCI and the whole head sensor array. Developers gave higher priorities to "the easiness of playing" and the "development platform" as important elements for BCI games and the market. Based on our assessment, we discuss the critical point at which BCI games will be able to progress from their current stage to widespread marketing to consumers. In conclusion, we propose three critical elements important for expansion of the BCI game market: standards, gameplay and appropriate integration.

  6. A Review of Brain-Computer Interface Games and an Opinion Survey from Researchers, Developers and Users

    PubMed Central

    Ahn, Minkyu; Lee, Mijin; Choi, Jinyoung; Jun, Sung Chan

    2014-01-01

    In recent years, research on Brain-Computer Interface (BCI) technology for healthy users has attracted considerable interest, and BCI games are especially popular. This study reviews the current status of, and describes future directions, in the field of BCI games. To this end, we conducted a literature search and found that BCI control paradigms using electroencephalographic signals (motor imagery, P300, steady state visual evoked potential and passive approach reading mental state) have been the primary focus of research. We also conducted a survey of nearly three hundred participants that included researchers, game developers and users around the world. From this survey, we found that all three groups (researchers, developers and users) agreed on the significant influence and applicability of BCI and BCI games, and they all selected prostheses, rehabilitation and games as the most promising BCI applications. User and developer groups tended to give low priority to passive BCI and the whole head sensor array. Developers gave higher priorities to “the easiness of playing” and the “development platform” as important elements for BCI games and the market. Based on our assessment, we discuss the critical point at which BCI games will be able to progress from their current stage to widespread marketing to consumers. In conclusion, we propose three critical elements important for expansion of the BCI game market: standards, gameplay and appropriate integration. PMID:25116904

  7. Accurate treatments of electrostatics for computer simulations of biological systems: A brief survey of developments and existing problems

    NASA Astrophysics Data System (ADS)

    Yi, Sha-Sha; Pan, Cong; Hu, Zhong-Han

    2015-12-01

    Modern computer simulations of biological systems often involve an explicit treatment of the complex interactions among a large number of molecules. While it is straightforward to compute the short-ranged Van der Waals interaction in classical molecular dynamics simulations, it has been a long-standing challenge to develop accurate methods for the long-ranged Coulomb interaction. In this short review, we discuss three types of methodologies for the accurate treatment of electrostatics in simulations of explicit molecules: truncation-type methods, Ewald-type methods, and mean-field-type methods. Throughout the discussion, we briefly review the formulations and development of these methods, emphasize the intrinsic connections among the three types of methods, and focus on the existing problems, which are often associated with the boundary conditions of electrostatics. This brief survey is summarized with a short perspective on future trends in method development and applications in the field of biological simulations. Project supported by the National Natural Science Foundation of China (Grant Nos. 91127015 and 21522304), the Open Project from the State Key Laboratory of Theoretical Physics, and the Innovation Project from the State Key Laboratory of Supramolecular Structure and Materials.
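
    Of the three method families the review names, truncation-type schemes are the simplest: every Coulomb pair beyond a spherical cutoff is simply ignored. The sketch below (illustrative, not from the review; the constant is the Coulomb prefactor in common MD units of kJ·mol⁻¹·nm·e⁻²) shows the bare-truncation energy that Ewald-type methods are designed to improve on:

```python
import math

COULOMB_K = 138.935458  # kJ*mol^-1*nm*e^-2, Coulomb prefactor in MD units

def truncated_coulomb_energy(positions, charges, cutoff):
    """Pairwise Coulomb energy with a plain spherical cutoff (truncation-type).

    positions: list of (x, y, z) tuples in nm; charges in units of e.
    Abrupt truncation introduces artifacts at the cutoff that Ewald-type
    and mean-field-type methods are designed to avoid.
    """
    energy = 0.0
    n = len(positions)
    for i in range(n):
        for j in range(i + 1, n):
            r = math.dist(positions[i], positions[j])
            if r < cutoff:
                energy += COULOMB_K * charges[i] * charges[j] / r
    return energy
```

    The O(n²) loop and the discontinuity at the cutoff are exactly the costs and errors that motivate the more elaborate electrostatics treatments the review surveys.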

  8. Quality of human-computer interaction - results of a national usability survey of hospital-IT in Germany

    PubMed Central

    2011-01-01

    Background Due to the increasing functionality of medical information systems, it is hard to imagine day-to-day work in hospitals without IT support. Therefore, the design of dialogues between humans and information systems is one of the most important issues to be addressed in health care. This survey presents an analysis of the current quality level of human-computer interaction of healthcare-IT in German hospitals, focused on the users' point of view. Methods To evaluate the usability of clinical-IT according to the design principles of EN ISO 9241-10, the IsoMetrics Inventory, an assessment tool, was used. The focus of this paper has been put on suitability for task, training effort, and conformity with user expectations, differentiated by information systems. Effectiveness has been evaluated with the focus on interoperability and functionality of different IT systems. Results 4521 persons from 371 hospitals visited the start page of the study, while 1003 persons from 158 hospitals completed the questionnaire. The results show relevant variations between different information systems. Conclusions Specialised information systems with defined functionality received better assessments than clinical information systems in general. This could be attributed to the improved customisation of these specialised systems for specific working environments. The results can be used as reference data for evaluation and benchmarking of human-computer engineering in the clinical health IT context in future studies. PMID:22070880

  9. A survey of radiation dose to patients and operators during radiofrequency ablation using computed tomography

    PubMed Central

    Saidatul, A; Azlan, CA; Megat Amin, MSA; Abdullah, BJJ; Ng, KH

    2010-01-01

    Computed tomography (CT) fluoroscopy is able to give real-time images to a physician undertaking minimally invasive procedures such as biopsies, percutaneous drainage, and radiofrequency ablation (RFA). Both the operator executing the procedure and the patient are thus at risk of radiation exposure during CT fluoroscopy. This study focused on the radiation exposure present during a series of radiofrequency ablation (RFA) procedures, and used Gafchromic film (Type XR-QA; International Specialty Products, USA) and thermoluminescent dosimeters (TLD-100H; Bicron, USA) to measure the radiation received by patients undergoing treatment, and also by operators subject to scatter radiation. The voltage was held constant at 120 kVp and the current at 70 mA, with a 5 mm slice thickness. The duration of irradiation was between 150 and 638 seconds. Ultimately, from a sample of 30 liver RFA procedures, the study revealed that the operator received the highest dose at the hands, followed by the eyes and thyroid, while secondary staff dosage was moderately uniform across all parts of the body that were measured. PMID:21611060

  10. Computational survey of representative energetic materials as propellants for microthruster applications

    NASA Astrophysics Data System (ADS)

    Fuchs, Brian; Stec, Daniel, III

    2007-04-01

    Microthrusters are critical for the development of terrestrial micromissiles and nano air vehicles for reconnaissance, surveillance, and sensor emplacement. With the maturation of MEMS manufacturing technology, the physical components of the thrusters can be readily fabricated. The most straightforward thruster type uses chemical combustion of a propellant that is ignited by a heating element, giving a single-shot thrust. Arrays of MEMS-manufactured thrusters can be ganged to give multiple firings. The basic model for such a system is a solid rocket motor. The desired elements for the propellant of a chemical thruster are high specific impulse (Isp), high temperature and pressure, and low-molecular-weight combustion gases. Since the combustion chamber of a microthruster is extremely small, the propellant material must be able to ignite, sustain, and complete its burn inside the chamber. The propellant can be either a solid or a liquid. There are a large number of energetic materials available as candidates for a microthruster propellant. There has been no systematic evaluation of the available energetic materials as propellant candidates for microthrusters. This report summarizes computations done on a series of energetic materials to address their suitability as microthruster propellants.
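
    The screening criteria the abstract lists (high specific impulse, high chamber temperature and pressure, low-molecular-weight exhaust) are tied together by the standard ideal-rocket estimate of exhaust velocity. The sketch below is not from the report; the parameter values are illustrative assumptions, used only to show the dependence:

```python
import math

R_UNIVERSAL = 8.314  # J/(mol*K), universal gas constant
G0 = 9.80665         # m/s^2, standard gravity

def ideal_isp(gamma, chamber_temp_k, molar_mass_kg, pressure_ratio):
    """Ideal specific impulse [s] from the isentropic nozzle equation.

    pressure_ratio = exit pressure / chamber pressure (pe/pc < 1).
    Lower exhaust molar mass and higher chamber temperature both raise
    Isp, which is why light combustion gases are desirable.
    """
    term = 1.0 - pressure_ratio ** ((gamma - 1.0) / gamma)
    ve = math.sqrt(
        2.0 * gamma / (gamma - 1.0)
        * R_UNIVERSAL * chamber_temp_k / molar_mass_kg
        * term
    )
    return ve / G0
```

    Real microthruster performance is further limited by incomplete combustion in the tiny chamber, which is exactly the burn-completion constraint the abstract raises.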

  11. A Review of Models for Teacher Preparation Programs for Precollege Computer Science Education.

    ERIC Educational Resources Information Center

    Deek, Fadi P.; Kimmel, Howard

    2002-01-01

    Discusses the need for adequate precollege computer science education and focuses on the issues of teacher preparation programs and requirements needed to teach high school computer science. Presents models of teacher preparation programs and compares state requirements with Association for Computing Machinery (ACM) recommendations. (Author/LRW)

  12. Prescriptions for ACME's Future.

    ERIC Educational Resources Information Center

    Felch, William Campbell

    1991-01-01

    Five prescriptions for the future agenda of the Alliance for Continuing Medical Education are (1) a core curriculum; (2) informatics; (3) remedial continuing medical education (CME); (4) focus on the individual learner; and (5) practice-oriented CME. (SK)

  13. WTP Calculation Sheet: Determining the LAW Glass Former Constituents and Amounts for G2 and Acm Models. 24590-LAW-M4C-LFP-00002, Rev. B

    SciTech Connect

    Gimpel, Rodney F.; Kruger, Albert A.

    2013-12-16

    The purpose of this calculation is to determine the LAW glass former recipe and additives with their respective amounts. The methodology and equations contained herein are to be used in the G2 and ACM models until better information is supplied by R&T efforts. This revision includes calculations that determine the mass and volume of the bulk chemicals/minerals needed per batch. It also contains calculations (for the G2 model) to help prevent overflow in the LAW Feed Preparation Vessel.

  14. A 90-day subchronic feeding study of genetically modified maize expressing Cry1Ac-M protein in Sprague-Dawley rats.

    PubMed

    Liu, Pengfei; He, Xiaoyun; Chen, Delong; Luo, Yunbo; Cao, Sishuo; Song, Huan; Liu, Ting; Huang, Kunlun; Xu, Wentao

    2012-09-01

    The cry1Ac-M gene, coding for one of the Bacillus thuringiensis (Bt) crystal proteins, was introduced into the maize H99 × Hi IIB genome to produce the insect-resistant GM maize BT-38. The food safety assessment of the BT-38 maize was conducted in Sprague-Dawley rats in a 90-day feeding study. We incorporated maize grains from BT-38 and H99 × Hi IIB into rodent diets at three concentrations (12.5%, 25%, 50%) and administered them to Sprague-Dawley rats (n=10/sex/group) for 90 days. A commercialized rodent diet was fed to an additional group as a control. Body weight, feed consumption, and toxicological response variables were measured, and gross as well as microscopic pathology were examined. Moreover, detection of residual Cry1Ac-M protein in the serum of rats fed with GM maize was conducted. No death or adverse effects were observed in the current feeding study. No adverse differences in the values of the response variables were observed between rats that consumed diets containing GM maize BT-38 and non-GM maize H99 × Hi IIB. No detectable Cry1Ac-M protein was found in the serum of rats after feeding diets containing GM maize for 3 months. The results demonstrated that BT-38 maize is as safe as conventional non-GM maize.

  15. Assessment of Universal Healthcare Coverage in a District of North India: A Rapid Cross-Sectional Survey Using Tablet Computers

    PubMed Central

    Singh, Tarundeep; Roy, Pritam; Jamir, Limalemla; Gupta, Saurav; Kaur, Navpreet; Jain, D. K.; Kumar, Rajesh

    2016-01-01

    Objective A rapid survey was carried out in Shaheed Bhagat Singh Nagar District of Punjab state in India to ascertain health seeking behavior and out-of-pocket health expenditures. Methods Using a multistage cluster sampling design, 1,008 households (28 clusters x 36 households in each cluster) were selected proportionately from urban and rural areas. Households were selected through a house-to-house survey during April and May 2014 whose members had (a) experienced illness in the past 30 days, (b) had illness lasting longer than 30 days, (c) were hospitalized in the past 365 days, or (d) had women who were currently pregnant or experienced childbirth in the past two years. In these selected households, trained investigators, using a tablet computer-based structured questionnaire, enquired about the socio-demographics, nature of illness, source of healthcare, and healthcare and household expenditure. The data were transmitted daily to a central server using a wireless communication network. Mean healthcare expenditures were computed for various health conditions. Catastrophic healthcare expenditure was defined as healthcare spending exceeding 10% of the total annual household expenditure. A chi-square test for trend was used to compare catastrophic expenditures on hospitalization between households classified into expenditure quartiles. Results The mean monthly household expenditure was 15,029 Indian Rupees (USD 188.2). Nearly 14.2% of the household expenditure was on healthcare. Fever, respiratory tract diseases, and gastrointestinal diseases were the common acute illnesses, while heart disease, diabetes mellitus, and respiratory diseases were the more common chronic diseases. Hospitalizations were mainly due to cardiovascular diseases, gastrointestinal problems, and accidents. Only 17%, 18%, 20% and 31% of the healthcare for acute illnesses, chronic illnesses, hospitalizations and childbirth was sought in the government health facilities. Average expenditure in government health…
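
    The catastrophic-expenditure criterion used in this survey is a simple threshold test: healthcare spending exceeding 10% of total annual household expenditure. A sketch, using the survey's mean monthly household expenditure of 15,029 INR only as an example figure:

```python
def is_catastrophic(annual_health_spend, annual_household_spend, threshold=0.10):
    """True if health spending exceeds `threshold` of total annual household expenditure."""
    return annual_health_spend > threshold * annual_household_spend

# Example: a household at the survey's mean monthly expenditure of 15,029 INR
annual_total = 15029 * 12  # 180,348 INR per year; the 10% threshold is 18,034.8 INR
```

    In practice each household's own reported annual expenditure, not the sample mean, would be used as the denominator.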

  16. A 50-State Survey of Initiatives in Science, Mathematics and Computer Education. ECS Working Papers. Task Force on Education for Economic Growth.

    ERIC Educational Resources Information Center

    Education Commission of the States, Denver, CO. Task Force on Education for Economic Growth.

    A 50-state survey of mathematics, science, and computer education initiatives was undertaken to identify the number and diversity of responses states have made in response to the national crisis in precollege mathematics and science education. Descriptions of these initiatives (obtained from questionnaires, telephone interviews with state…

  17. Development of a Computer-Based Survey Instrument for Organophosphate and N-Methyl-Carbamate Exposure Assessment among Agricultural Pesticide Handlers

    PubMed Central

    Hofmann, Jonathan N.; Checkoway, Harvey; Borges, Ofelio; Servin, Flor; Fenske, Richard A.; Keifer, Matthew C.

    2010-01-01

    Background: Assessment of occupational pesticide exposures based on self-reported information can be challenging, particularly with immigrant farm worker populations for whom specialized methods are needed to address language and cultural barriers and account for limited literacy. An audio computer-assisted self-interview (A-CASI) survey instrument was developed to collect information about organophosphate (OP) and N-methyl-carbamate (CB) exposures and other personal characteristics among male agricultural pesticide handlers for an ongoing cholinesterase biomonitoring study in Washington State. Objectives: To assess the feasibility of collecting data using the A-CASI instrument and evaluate reliability for a subset of survey items. Methods: The survey consisted of 64 items administered in Spanish or English on a touch-screen tablet computer. Participants listened to digitally recorded questions on headphones and selected responses on the screen, most of which were displayed as images or icons to facilitate participation of low literacy respondents. From 2006–2008, a total of 195 participants completed the survey during the OP/CB application seasons on at least one occasion. Percent agreement and kappa coefficients were calculated to evaluate test–retest reliability for selected characteristics among 45 participants who completed the survey on two separate occasions within the same year. Results: Almost all participants self-identified as Hispanic or Latino (98%), and 97% completed the survey in Spanish. Most participants completed the survey in a half-hour or less, with minimal assistance from on-site research staff. Analyses of test–retest reliability showed substantial agreement for most demographic, work history, and health characteristics and at least moderate agreement for most variables related to personal protective equipment use during pesticide applications. Conclusions: This A-CASI survey instrument is a novel method that has been used successfully
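Cohen's kappa, used above for the test–retest comparisons, corrects raw percent agreement for the agreement expected by chance. A minimal sketch (not the authors' code) for a single categorical item answered on two occasions; the yes/no responses are hypothetical:

```python
def cohens_kappa(first, second):
    """Cohen's kappa for two paired sets of categorical responses."""
    assert len(first) == len(second)
    n = len(first)
    observed = sum(a == b for a, b in zip(first, second)) / n  # percent agreement
    labels = set(first) | set(second)
    expected = sum((first.count(l) / n) * (second.count(l) / n) for l in labels)
    return (observed - expected) / (1 - expected)

# Ten hypothetical respondents answering the same item twice
first = ["yes"] * 8 + ["no"] * 2
second = ["yes"] * 7 + ["no"] * 3
kappa = cohens_kappa(first, second)  # 0.9 observed vs 0.62 expected agreement
```

On the commonly used Landis–Koch scale, values above 0.6 are read as "substantial" and 0.4–0.6 as "moderate" agreement, matching the terminology in the abstract.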

  18. A Placement Test for Computer Science: Design, Implementation, and Analysis

    ERIC Educational Resources Information Center

    Nugent, Gwen; Soh, Leen-Kiat; Samal, Ashok; Lang, Jeff

    2006-01-01

    An introductory CS1 course presents problems for educators and students due to students' diverse background in programming knowledge and exposure. Students who enroll in CS1 also have different expectations and motivations. Prompted by the curricular guidelines for undergraduate programmes in computer science released in 2001 by the ACM/IEEE, and…

  19. Teaching Perspectives among Introductory Computer Programming Faculty in Higher Education

    ERIC Educational Resources Information Center

    Mainier, Michael J.

    2011-01-01

    This study identified the teaching beliefs, intentions, and actions of 80 introductory computer programming (CS1) faculty members from institutions of higher education in the United States using the Teacher Perspectives Inventory. Instruction method used inside the classroom, categorized by ACM CS1 curriculum guidelines, was also captured along…

  20. Macro- and microstructural diversity of sea urchin teeth revealed by large-scale micro-computed tomography survey

    NASA Astrophysics Data System (ADS)

    Ziegler, Alexander; Stock, Stuart R.; Menze, Björn H.; Smith, Andrew B.

    2012-10-01

    Sea urchins (Echinodermata: Echinoidea) generally possess an intricate jaw apparatus that incorporates five teeth. Although echinoid teeth consist of calcite, their complex internal design results in biomechanical properties far superior to those of inorganic forms of the constituent material. While the individual elements (or microstructure) of echinoid teeth provide general insight into processes of biomineralization, the cross-sectional shape (or macrostructure) of echinoid teeth is useful for phylogenetic and biomechanical inferences. However, studies of sea urchin tooth macro- and microstructure have traditionally been limited to a few readily available species, effectively disregarding a potentially high degree of structural diversity that could be informative in a number of ways. Having scanned numerous sea urchin species using micro-computed tomography (µCT) and synchrotron µCT, we report a large variation in macro- and microstructure of sea urchin teeth. In addition, we describe aberrant tooth shapes and apply 3D visualization protocols that permit accelerated visual access to the complex microstructure of sea urchin teeth. Our broad survey identifies key taxa for further in-depth study and integrates previously assembled data on fossil species into a more comprehensive systematic analysis of sea urchin teeth. In order to circumvent the imprecise, word-based description of tooth shape, we introduce shape analysis algorithms that will permit the numerical and therefore more objective description of tooth macrostructure. Finally, we discuss how synchrotron µCT datasets permit virtual models of tooth microstructure to be generated as well as the simulation of tooth mechanics based on finite element modeling.

  1. Radiation Dose from Whole-Body F-18 Fluorodeoxyglucose Positron Emission Tomography/Computed Tomography: Nationwide Survey in Korea

    PubMed Central

    2016-01-01

    The purpose of this study was to estimate average radiation exposure from 18F-fluorodeoxyglucose (FDG) positron emission tomography/computed tomography (PET/CT) examinations and to analyze possible factors affecting the radiation dose. A nation-wide questionnaire survey was conducted involving all institutions that operate PET/CT scanners in Korea. From the response, radiation doses from injected FDG and CT examination were calculated. A total of 105 PET/CT scanners in 73 institutions were included in the analysis (response rate of 62.4%). The average FDG injected activity was 310 ± 77 MBq and 5.11 ± 1.19 MBq/kg. The average effective dose from FDG was estimated to be 5.89 ± 1.46 mSv. The average CT dose index and dose-length product were 4.60 ± 2.47 mGy and 429.2 ± 227.6 mGy∙cm, which corresponded to 6.26 ± 3.06 mSv. The radiation doses from FDG and CT were significantly lower in case of newer scanners than older ones (P < 0.001). Advanced PET technologies such as time-of-flight acquisition and point-spread function recovery were also related to low radiation dose (P < 0.001). In conclusion, the average radiation dose from FDG PET/CT is estimated to be 12.2 mSv. The radiation dose from FDG PET/CT is reduced with more recent scanners equipped with image-enhancing algorithms. PMID:26908992
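The dose arithmetic in the abstract can be reproduced with published conversion coefficients. A sketch, assuming the ICRP adult coefficient of 0.019 mSv/MBq for 18F-FDG and a trunk CT conversion factor of about 0.0146 mSv per mGy·cm; both coefficients are assumptions supplied here, not values stated in the survey:

```python
FDG_COEFF = 0.019    # mSv per MBq, assumed ICRP adult value for 18F-FDG
CT_K_TRUNK = 0.0146  # mSv per mGy*cm, assumed trunk conversion factor

def pet_ct_effective_dose(injected_mbq, dlp_mgy_cm):
    """Total effective dose: radiopharmaceutical component plus CT component."""
    return injected_mbq * FDG_COEFF + dlp_mgy_cm * CT_K_TRUNK

# Survey averages: 310 MBq injected, DLP of 429.2 mGy*cm
dose = pet_ct_effective_dose(310, 429.2)  # close to the reported 12.2 mSv total
```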

  2. Experimental determination of the partitioning coefficient and volatility of important BVOC oxidation products using the Aerosol Collection Module (ACM) coupled to a PTR-ToF-MS

    NASA Astrophysics Data System (ADS)

    Gkatzelis, G.; Hohaus, T.; Tillmann, R.; Schmitt, S. H.; Yu, Z.; Schlag, P.; Wegener, R.; Kaminski, M.; Kiendler-Scharr, A.

    2015-12-01

    Atmospheric aerosol can alter the Earth's radiative budget and global climate but can also affect human health. A dominant contributor to the submicrometer particulate matter (PM) is organic aerosol (OA). OA can be either directly emitted through e.g. combustion processes (primary OA) or formed through the oxidation of organic gases (secondary organic aerosol, SOA). A detailed understanding of SOA formation is of importance as it constitutes a major contribution to the total OA. The partitioning between the gas and particle phase as well as the volatility of individual components of SOA is still poorly understood, adding uncertainties and thus complicating climate modelling. In this work, a new experimental methodology was used for compound-specific analysis of organic aerosol. The Aerosol Collection Module (ACM) is a newly developed instrument that deploys an aerodynamic lens to separate the gas and particle phase of an aerosol. The particle phase is directed to a cooled sampling surface. After collection, particles are thermally desorbed and transferred to a detector for further analysis. In the present work, the ACM was coupled to a Proton Transfer Reaction-Time of Flight-Mass Spectrometer (PTR-ToF-MS) to detect and quantify organic compounds partitioning between the gas and particle phase. This experimental approach was used in a set of experiments at the atmosphere simulation chamber SAPHIR to investigate SOA formation. Ozone oxidation with subsequent photochemical aging of β-pinene, limonene and real plant emissions from Pinus sylvestris (Scots pine) were studied. Simultaneous measurement of the gas and particle phase using the ACM-PTR-ToF-MS allows reporting of partitioning coefficients of important BVOC oxidation products. Additionally, volatility trends and changes of the SOA with photochemical aging are investigated and compared for all systems studied.
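Gas/particle partitioning of the kind measured by the ACM-PTR-ToF-MS setup is commonly summarized as the particle-phase fraction of a compound's total concentration. A schematic sketch using the standard textbook definition, not the authors' exact formulation; the concentrations are invented:

```python
def particle_phase_fraction(c_particle, c_gas):
    """F_p = C_p / (C_p + C_g): fraction of a compound found in the particle phase."""
    return c_particle / (c_particle + c_gas)

# Hypothetical concentrations (ug/m^3) of one BVOC oxidation product
fp = particle_phase_fraction(c_particle=0.6, c_gas=1.4)  # -> 0.3
```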

  3. Computer use and needs of internists: a survey of members of the American College of Physicians-American Society of Internal Medicine.

    PubMed

    Lacher, D; Nelson, E; Bylsma, W; Spena, R

    2000-01-01

    The American College of Physicians-American Society of Internal Medicine conducted a membership survey in late 1998 to assess their activities, needs, and attitudes. A total of 9,466 members (20.9% response rate) reported on 198 items related to computer use and needs of internists. Eighty-two percent of the respondents reported that they use computers for personal or professional reasons. Physicians younger than 50 years old who had full- or part-time academic affiliation reported using computers more frequently for medical applications. About two thirds of respondents who had access to computers connected to the Internet at least weekly, with most using the Internet from home for e-mail and nonmedical uses. Physicians expressed concerns about Internet security, confidentiality, and accuracy, and the lack of time to browse the Internet. In practice settings, internists used computers for administrative and financial functions. Less than 19% of respondents had partial or complete electronic clinical functions in their offices. Less than 7% of respondents exchanged e-mail with their patients on a weekly or daily basis. Also, less than 15% of respondents used computers for continuing medical education (CME). Respondents reported they wanted to increase their general computer skills and enhance their knowledge of computer-based information sources for patient care, electronic medical record systems, computer-based CME, and telemedicine. While most respondents used computers and connected to the Internet, few physicians utilized computers for clinical management. Medical organizations face the challenge of increasing physician use of clinical systems and electronic CME.

  4. Python: a language for computational physics

    NASA Astrophysics Data System (ADS)

    Borcherds, P. H.

    2007-07-01

    Python is a relatively new computing language, created by Guido van Rossum [A.S. Tanenbaum, R. van Renesse, H. van Staveren, G.J. Sharp, S.J. Mullender, A.J. Jansen, G. van Rossum, Experiences with the Amoeba distributed operating system, Communications of the ACM 33 (1990) 46-63; also on-line at http://www.cs.vu.nl/pub/amoeba/]…

  5. Peak data for U.S. Geological Survey gaging stations, Texas network and computer program to estimate peak-streamflow frequency

    USGS Publications Warehouse

    Slade, R.M.; Asquith, W.H.

    1996-01-01

    About 23,000 annual peak streamflows and about 400 historical peak streamflows exist for about 950 stations in the surface-water data-collection network of Texas. These data are presented on a computer diskette along with the corresponding dates, gage heights, and information concerning the basin, and nature or cause for the flood. Also on the computer diskette is a U.S. Geological Survey computer program that estimates peak-streamflow frequency based on annual and historical peak streamflow. The program estimates peak streamflow for 2-, 5-, 10-, 25-, 50-, and 100-year recurrence intervals and is based on guidelines established by the Interagency Advisory Committee on Water Data. Explanations are presented for installing the program, and an example is presented with discussion of its options.
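The Interagency Advisory Committee on Water Data guidelines referenced above (Bulletin 17B) fit a log-Pearson Type III distribution to the annual peak record. A simplified sketch that assumes zero log-space skew, in which case the frequency factor reduces to the standard normal quantile; a real implementation would use the weighted station skew and the Pearson III frequency-factor table. The peak record below is hypothetical:

```python
import math
from statistics import NormalDist, mean, stdev

def peak_flow_quantile(annual_peaks_cfs, recurrence_years):
    """T-year peak streamflow from a log-Pearson III fit, zero-skew simplification."""
    logs = [math.log10(q) for q in annual_peaks_cfs]
    k = NormalDist().inv_cdf(1 - 1 / recurrence_years)  # frequency factor, skew = 0
    return 10 ** (mean(logs) + k * stdev(logs))

# Hypothetical annual peaks (cfs); the 2-year flood equals the log-space median
q2 = peak_flow_quantile([100, 200, 400, 800, 1600], 2)    # -> 400.0
q100 = peak_flow_quantile([100, 200, 400, 800, 1600], 100)
```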

  6. Finding Hidden Geothermal Resources in the Basin and Range Using Electrical Survey Techniques: A Computational Feasibility Study

    SciTech Connect

    J. W. Pritchett

    2004-12-01

    For many years, there has been speculation about "hidden" or "blind" geothermal systems—reservoirs that lack an obvious overlying surface fluid outlet. At present, it is simply not known whether "hidden" geothermal reservoirs are rare or common. An approach to identifying promising drilling targets using methods that are cheaper than drilling is needed. These methods should be regarded as reconnaissance tools, whose primary purpose is to locate high-probability targets for subsequent deep confirmation drilling. The purpose of this study was to appraise the feasibility of finding "hidden" geothermal reservoirs in the Basin and Range using electrical survey techniques, and of adequately locating promising targets for deep exploratory drilling based on the survey results. The approach was purely theoretical. A geothermal reservoir simulator was used to carry out a lengthy calculation of the evolution of a synthetic but generic Great Basin-type geothermal reservoir to a quasi-steady "natural state". Postprocessors were used to estimate what a suite of geophysical surveys of the prospect would see. Based on these results, the different survey techniques were compared and evaluated in terms of their ability to identify suitable drilling targets. This process was completed for eight different "reservoir models". Of the eight cases considered, four were "hidden" systems, so that the survey techniques could be appraised in terms of their ability to detect and characterize such resources and to distinguish them from more conventionally situated geothermal reservoirs. It is concluded that the best way to find "hidden" Basin and Range geothermal resources of this general type is to carry out simultaneous SP and low-frequency MT surveys, and then to combine the results of both surveys with other pertinent information using mathematical "inversion" techniques to characterize the subsurface quantitatively. Many such surveys and accompanying analyses can be carried out…

  7. A system of computer programs (WAT_MOVE) for transferring data among data bases in the US Geological Survey National Water Information System

    SciTech Connect

    Rogers, G.D.; Kerans, B.K.

    1991-11-01

    This report describes WAT_MOVE, a system of computer programs that was developed for moving National Water Information System data between US Geological Survey distributed computer databases. WAT_MOVE has three major sub-systems: one for retrieval, one for loading, and one for purging. The retrieval sub-system creates transaction files of retrieved data for transfer and invokes a file transfer to send the transaction files to the receiving site. The loading sub-system reads the control and transaction files retrieved from the source database and loads the data in the appropriate files. The purging sub-system deletes data from a database. Although WAT_MOVE was developed for use by the Geological Survey's Hydrologic Investigations Program of the Yucca Mountain Project Branch, the software can be beneficial to any office maintaining data in the Site File, ADAPS (Automated Data Processing System), GWSI (Ground-Water Site Inventory), and QW (Quality of Water) sub-systems of the National Water Information System. The software also can be used to move data between databases on a single network node or to modify data within a database.

  8. Ethics in the computer age. Conference proceedings

    SciTech Connect

    Kizza, J.M.

    1994-12-31

    These proceedings contain the papers presented at the Ethics in the Computer Age conference held in Gatlinburg, Tennessee, November 11-13, 1994. The conference was sponsored by ACM SIGCAS (Computers and Society), to which I am very grateful. The Ethics in the Computer Age conference sequence started in 1991 with the first conference at the campus of the University of Tennessee at Chattanooga. The second was held at the same location a year later. These two conferences were limited to only invited speakers, but their success was overwhelming. This is the third in the sequence and the first truly international one. Plans are already under way for the fourth in 1996.

  9. Papers Presented at the ACM SIGCSE Technical Symposium on Academic Education in Computer Science [held in Houston, Texas, November 16, 1970].

    ERIC Educational Resources Information Center

    Aiken, Robert M., Ed.

    1970-01-01

    The papers given at this symposium were selected for their description of how specific problems were tackled, and with what success, as opposed to proposals unsupported by experience. The goal was to permit the audience to profit from the trials (and errors) of others. The eighteen papers presented are: "Business and the University Computer…

  10. Social presence reinforcement and computer-mediated communication: the effect of the solicitor's photography on compliance to a survey request made by e-mail.

    PubMed

    Guéguen, Nicolas; Jacob, Céline

    2002-04-01

    Personal information is scarce in computer-mediated communication. So when information about the sender is attached with an e-mail, this could induce a positive feeling toward the sender. An experiment was carried out where a male and a female student-solicitor, by way of an e-mail, requested a student-subject to participate in a survey. In half of the cases, a digital photograph of the solicitor appeared at the end of the e-mail. Results show that subjects agreed more readily to the request in the experimental condition than in the control condition where no digital photograph was sent with the e-mail. The importance of social information on computer-mediated communication is used to explain such results.

  11. Survey of new vector computers: The CRAY 1S from CRAY research; the CYBER 205 from CDC and the parallel computer from ICL - architecture and programming

    NASA Technical Reports Server (NTRS)

    Gentzsch, W.

    1982-01-01

    Problems which can arise with vector and parallel computers are discussed in a user-oriented context. Emphasis is placed on the algorithms used and the programming techniques adopted. Three recently developed supercomputers are examined and typical application examples are given in CRAY FORTRAN, CYBER 205 FORTRAN and DAP (distributed array processor) FORTRAN. The systems' performance is compared. The addition of parts of two N x N arrays is considered. The influence of the architecture on the algorithms and programming language is demonstrated. Numerical analysis of magnetohydrodynamic differential equations by an explicit difference method is illustrated, showing very good results for all three systems. The prognosis for supercomputer development is assessed.

  12. Computer-science guest-lecture series at Langston University sponsored by the U.S. Geological Survey; abstracts, 1992-93

    USGS Publications Warehouse

    Steele, K. S.

    1994-01-01

    Langston University, a Historically Black University located at Langston, Oklahoma, has a computing and information science program within the Langston University Division of Business. Since 1984, Langston University has participated in the Historically Black College and University program of the U.S. Department of Interior, which provided education, training, and funding through a combined earth-science and computer-technology cooperative program with the U.S. Geological Survey (USGS). USGS personnel have presented guest lectures at Langston University since 1984. Students have been enthusiastic about the lectures, and as a result of this program, 13 Langston University students have been hired by the USGS on a part-time basis while they continued their education at the University. The USGS expanded the offering of guest lectures in 1992 by increasing the number of visits to Langston University, and by inviting participation of speakers from throughout the country. The objectives of the guest-lecture series are to assist Langston University in offering state-of-the-art education in the computer sciences, to provide students with an opportunity to learn from and interact with skilled computer-science professionals, and to develop a pool of potential future employees for part-time and full-time employment. This report includes abstracts for guest-lecture presentations during 1992-93 school year.

  13. Computational Analysis and Simulation of Empathic Behaviors: a Survey of Empathy Modeling with Behavioral Signal Processing Framework.

    PubMed

    Xiao, Bo; Imel, Zac E; Georgiou, Panayiotis; Atkins, David C; Narayanan, Shrikanth S

    2016-05-01

    Empathy is an important psychological process that facilitates human communication and interaction. Enhancement of empathy has profound significance in a range of applications. In this paper, we review emerging directions of research on computational analysis of empathy expression and perception as well as empathic interactions, including their simulation. We summarize the work on empathic expression analysis by the targeted signal modalities (e.g., text, audio, and facial expressions). We categorize empathy simulation studies into theory-based emotion space modeling or application-driven user and context modeling. We summarize challenges in computational study of empathy including conceptual framing and understanding of empathy, data availability, appropriate use and validation of machine learning techniques, and behavior signal processing. Finally, we propose a unified view of empathy computation and offer a series of open problems for future research. PMID:27017830

  15. Documentation of computer programs to compute and display pathlines using results from the U.S. Geological Survey modular three-dimensional finite-difference ground-water flow model

    USGS Publications Warehouse

    Pollock, David W.

    1989-01-01

    A particle tracking post-processing package was developed to compute three-dimensional path lines based on output from steady-state simulations obtained with the U.S. Geological Survey modular 3-dimensional finite difference groundwater flow model. The package consists of two FORTRAN 77 computer programs: (1) MODPATH, which calculates pathlines, and (2) MODPATH-PLOT, which presents results graphically. MODPATH uses a semi-analytical particle tracking scheme. The method is based on the assumption that each directional velocity component varies linearly within a grid cell in its own coordinate direction. This assumption allows an analytical expression to be obtained describing the flow path within a grid cell. Given the initial position of a particle anywhere in a cell, the coordinates of any other point along its path line within the cell and the time of travel between them can be computed directly. Data is input to MODPATH and MODPATH-PLOT through a combination of files and interactive dialogue. Examples of how to use MODPATH and MODPATH-PLOT are provided for a sample problem. Listings of the computer codes and detailed descriptions of input data format and program options are also presented. (Author's abstract)
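The semi-analytical scheme described above has a closed form in each coordinate direction: with velocity varying linearly across the cell, v(x) = v1 + A·(x - x1) where A = (v2 - v1)/(x2 - x1), the travel time from a particle at xp to the downstream face is t = ln(v(x2)/v(xp))/A. A one-dimensional illustrative sketch, not the MODPATH source:

```python
import math

def face_travel_time(x1, x2, v1, v2, xp):
    """Time for a particle at xp to reach the x2 face, linear velocity in the cell."""
    A = (v2 - v1) / (x2 - x1)      # velocity gradient within the cell
    vp = v1 + A * (xp - x1)        # velocity at the particle's position
    if abs(A) < 1e-15:             # uniform velocity: plain distance over speed
        return (x2 - xp) / vp
    return math.log(v2 / vp) / A   # analytical solution of dx/dt = v(x)

# Particle entering a 10-unit cell whose velocity doubles across it
t = face_travel_time(x1=0.0, x2=10.0, v1=1.0, v2=2.0, xp=0.0)  # -> ln(2)/0.1
```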

  16. How to Implement Rigorous Computer Science Education in K-12 Schools? Some Answers and Many Questions

    ERIC Educational Resources Information Center

    Hubwieser, Peter; Armoni, Michal; Giannakos, Michail N.

    2015-01-01

    Aiming to collect various concepts, approaches, and strategies for improving computer science education in K-12 schools, we edited this second special issue of the "ACM TOCE" journal. Our intention was to collect a set of case studies from different countries that would describe all relevant aspects of specific implementations of…

  17. Nuclear fuel cycle risk assessment: survey and computer compilation of risk-related literature. [Once-through Cycle and Plutonium Recycle

    SciTech Connect

    Yates, K.R.; Schreiber, A.M.; Rudolph, A.W.

    1982-10-01

    The US Nuclear Regulatory Commission has initiated the Fuel Cycle Risk Assessment Program to provide risk assessment methods for assistance in the regulatory process for nuclear fuel cycle facilities other than reactors. Both the once-through cycle and plutonium recycle are being considered. A previous report generated by this program defines and describes fuel cycle facilities, or elements, considered in the program. This report, the second from the program, describes the survey and computer compilation of fuel cycle risk-related literature. Sources of available information on the design, safety, and risk associated with the defined set of fuel cycle elements were searched and documents obtained were catalogued and characterized with respect to fuel cycle elements and specific risk/safety information. Both US and foreign surveys were conducted. Battelle's computer-based BASIS information management system was used to facilitate the establishment of the literature compilation. A complete listing of the literature compilation and several useful indexes are included. Future updates of the literature compilation will be published periodically. 760 annotated citations are included.

  18. Survey of Current Practice in Computer and Information Technology in the Youth Training Scheme. Publication No. 2.

    ERIC Educational Resources Information Center

    Brown, Alan; Mills, Julian

    A study examined the computer and information technology (CIT) training provided in 61 training schemes in 10 regions throughout the United Kingdom under the auspices of the Youth Training Scheme. Of the 52 programs for which data on the time spent on CIT were available, 12 offered 5 days or less of off-the-job training with little other…

  19. Effect of survey instrument on participation in a follow-up study: a randomization study of a mailed questionnaire versus a computer-assisted telephone interview

    PubMed Central

    2012-01-01

    Background Many epidemiological and public health surveys report increasing difficulty obtaining high participation rates. We conducted a pilot follow-up study to determine whether a mailed or telephone survey would better facilitate data collection in a subset of respondents to an earlier telephone survey conducted as part of the National Birth Defects Prevention Study. Methods We randomly assigned 392 eligible mothers to receive a self-administered, mailed questionnaire (MQ) or a computer-assisted telephone interview (CATI) using similar recruitment protocols. If mothers gave permission to contact the fathers, fathers were recruited to complete the same instrument (MQ or CATI) as mothers. Results Mothers contacted for the MQ, within all demographic strata examined, were more likely to participate than those contacted for the CATI (86.6% vs. 70.6%). The median response time for mothers completing the MQ was 17 days, compared to 29 days for mothers completing the CATI. Mothers completing the MQ also required fewer reminder calls or letters to finish participation versus those assigned to the CATI (median 3 versus 6), though they were less likely to give permission to contact the father (75.0% vs. 85.8%). Fathers contacted for the MQ, however, had higher participation compared to fathers contacted for the CATI (85.2% vs. 54.5%). Fathers recruited to the MQ also had a shorter response time (median 17 days) and required fewer reminder calls and letters (median 3 reminders) than those completing the CATI (medians 28 days and 6 reminders). Conclusions We concluded that offering a MQ substantially improved participation rates and reduced recruitment effort compared to a CATI in this study. While a CATI has the advantage of being able to clarify answers to complex questions or eligibility requirements, our experience suggests that a MQ might be a good survey option for some studies. PMID:22849754
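The participation-rate contrast reported for mothers (86.6% vs. 70.6%) can be checked with a standard two-proportion z-test. A sketch that assumes roughly equal arms of the 392 randomized mothers; the per-arm size of 196 is an assumption, since the exact arm sizes are not given above:

```python
from statistics import NormalDist

def two_proportion_z(p1, n1, p2, n2):
    """Pooled two-proportion z-test; returns (z, two-sided p-value)."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = (pooled * (1 - pooled) * (1 / n1 + 1 / n2)) ** 0.5
    z = (p1 - p2) / se
    return z, 2 * (1 - NormalDist().cdf(abs(z)))

# MQ vs. CATI participation among mothers, assuming 196 per arm
z, p = two_proportion_z(0.866, 196, 0.706, 196)
```

Under these assumed arm sizes the difference is well beyond conventional significance, consistent with the study's conclusion that the mailed questionnaire substantially improved participation.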

  20. Internet Use for Health-Related Information via Personal Computers and Cell Phones in Japan: A Cross-Sectional Population-Based Survey

    PubMed Central

    Takahashi, Yoshimitsu; Ohura, Tomoko; Ishizaki, Tatsuro; Okamoto, Shigeru; Miki, Kenji; Naito, Mariko; Akamatsu, Rie; Sugimori, Hiroki; Yoshiike, Nobuo; Miyaki, Koichi; Shimbo, Takuro

    2011-01-01

    Background The Internet is known to be used for health purposes by the general public all over the world. However, little is known about the use of, attitudes toward, and activities regarding eHealth among the Japanese population. Objectives This study aimed to measure the prevalence of Internet use for health-related information compared with other sources, and to examine the effects on user knowledge, attitudes, and activities with regard to Internet use for health-related information in Japan. We examined the extent of use via personal computers and cell phones. Methods We conducted a cross-sectional survey of a quasi-representative sample (N = 1200) of the Japanese general population aged 15–79 years in September 2007. The main outcome measures were (1) self-reported rates of Internet use in the past year to acquire health-related information and to contact health professionals, family, friends, and peers specifically for health-related purposes, and (2) perceived effects of Internet use on health care. Results The prevalence of Internet use via personal computer for acquiring health-related information was 23.8% (286/1200) among those surveyed, whereas the prevalence via cell phone was 6% (77). Internet use via both personal computer and cell phone for communicating with health professionals, family, friends, or peers was not common. The Internet was used via personal computer for acquiring health-related information primarily by younger people, people with higher education levels, and people with higher household incomes. The majority of those who used the Internet for health care purposes responded that the Internet improved their knowledge or affected their lifestyle attitude, and that they felt confident in the health-related information they obtained from the Internet. However, less than one-quarter thought it improved their ability to manage their health or affected their health-related activities. Conclusions Japanese moderately used the Internet via

  1. Computational Genomics Using Graph Theory

    NASA Astrophysics Data System (ADS)

    Schlick, Tamar

    2005-03-01

    Laserson, H. H. Gan, and T. Schlick, ``Searching for 2D RNA Geometries in Bacterial Genomes,'' Proceedings of the ACM Symposium on Computational Geometry, June 9--11, New York, pp. 373--377 (2004) (http://socg.poly.edu/home.htm).
    N. Kim, N. Shiffeldrim, H. H. Gan, and T. Schlick, ``Novel Candidates of RNA Topologies,'' J. Mol. Biol. 341: 1129--1144 (2004).
    Schlick, ``RAG: RNA-As-Graphs Web Resource,'' BMC Bioinformatics 5: 88--97 (2004) (http://www.biomedcentral.com/1471-2105/5/88).
    S. Pasquali, H. H. Gan, and T. Schlick, ``Modular RNA Architecture Revealed by Computational Analysis of Existing Pseudoknots and Ribosomal RNAs,'' Nucl. Acids Res., submitted (2004).
    T. Schlick, Molecular Modeling: An Interdisciplinary Guide, Springer-Verlag, New York, 2002.

  2. Discovering MicroRNA-Regulatory Modules in Multi-Dimensional Cancer Genomic Data: A Survey of Computational Methods

    PubMed Central

    Walsh, Christopher J.; Hu, Pingzhao; Batt, Jane; dos Santos, Claudia C.

    2016-01-01

    MicroRNAs (miRs) are small single-stranded noncoding RNA that function in RNA silencing and post-transcriptional regulation of gene expression. An increasing number of studies have shown that miRs play an important role in tumorigenesis, and understanding the regulatory mechanism of miRs in this gene regulatory network will help elucidate the complex biological processes at play during malignancy. Despite advances, determination of miR–target interactions (MTIs) and identification of functional modules composed of miRs and their specific targets remain a challenge. A large amount of data generated by high-throughput methods from various sources is available to investigate MTIs. The development of data-driven tools to harness these multi-dimensional data has resulted in significant progress over the past decade. In parallel, large-scale cancer genomic projects are allowing new insights into the commonalities and disparities of miR–target regulation across cancers. In the first half of this review, we explore methods for identification of pairwise MTIs, and in the second half, we explore computational tools for discovery of miR-regulatory modules in a cancer-specific and pan-cancer context. We highlight strengths and limitations of each of these tools as a practical guide for computational biologists. PMID:27721651

  3. A survey of advancements in nucleic acid-based logic gates and computing for applications in biotechnology and biomedicine.

    PubMed

    Wu, Cuichen; Wan, Shuo; Hou, Weijia; Zhang, Liqin; Xu, Jiehua; Cui, Cheng; Wang, Yanyue; Hu, Jun; Tan, Weihong

    2015-03-01

    Nucleic acid-based logic devices were first introduced in 1994. Since then, science has seen the emergence of new logic systems for mimicking mathematical functions, diagnosing disease and even imitating biological systems. The unique features of nucleic acids, such as facile and high-throughput synthesis, Watson-Crick complementary base pairing, and predictable structures, together with the aid of programming design, have led to the widespread applications of nucleic acids (NA) for logic gates and computing in biotechnology and biomedicine. In this feature article, the development of in vitro NA logic systems will be discussed, as well as the expansion of such systems using various input molecules for potential cellular, or even in vivo, applications.
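As an illustration of the logic-gate abstraction surveyed here, the sketch below models strand-displacement-style gates at the level of relative strand concentrations, loosely in the spirit of thresholded (seesaw-type) DNA circuits. The 0.5 threshold and all names are assumptions made for this example, not a design from the article.

```python
# Illustrative sketch, not the article's design: strand-displacement-style
# logic modeled at the level of relative strand concentrations, loosely
# following the thresholding idea of seesaw-type DNA gates. The 0.5
# threshold and all names are assumptions made for this example.

ON_THRESHOLD = 0.5  # fraction of full concentration read as logic "1"

def digital(level):
    """Map an input-strand concentration in [0.0, 1.0] to a Boolean."""
    return level >= ON_THRESHOLD

def and_gate(a, b):
    # Both input strands must be present to displace the output strand.
    return digital(min(a, b))

def or_gate(a, b):
    # Either input strand alone can release the output strand.
    return digital(max(a, b))

def not_gate(a):
    # An inverter: the output strand stays released only if the input is absent.
    return not digital(a)

if __name__ == "__main__":
    for a, b in [(0.0, 0.0), (0.0, 0.9), (0.9, 0.0), (0.9, 0.9)]:
        print(f"a={a} b={b}  AND={and_gate(a, b)}  OR={or_gate(a, b)}")
```

Because concentrations are analog, the threshold step is what restores clean digital behavior between cascaded gates.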

  4. A Survey of Advancements in Nucleic Acid-based Logic Gates and Computing for Applications in Biotechnology and Biomedicine

    PubMed Central

    Wu, Cuichen; Wan, Shuo; Hou, Weijia; Zhang, Liqin; Xu, Jiehua; Cui, Cheng; Wang, Yanyue; Hu, Jun

    2015-01-01

    Nucleic acid-based logic devices were first introduced in 1994. Since then, science has seen the emergence of new logic systems for mimicking mathematical functions, diagnosing disease and even imitating biological systems. The unique features of nucleic acids, such as facile and high-throughput synthesis, Watson-Crick complementary base pairing, and predictable structures, together with the aid of programming design, have led to the widespread applications of nucleic acids (NA) for logic gating and computing in biotechnology and biomedicine. In this feature article, the development of in vitro NA logic systems will be discussed, as well as the expansion of such systems using various input molecules for potential cellular, or even in vivo, applications. PMID:25597946

  5. PR Educators Stress Computers.

    ERIC Educational Resources Information Center

    Fleming, Charles A.

    1988-01-01

    Surveys the varied roles computers play in public relations education. Asserts that, because computers are used extensively in the public relations field, students should become acquainted with the varied capabilities of computers and their role in public relations practice. (MM)

  6. HIV-related risk behaviors among the general population: a survey using Audio Computer-Assisted Self-Interview in 3 cities in Vietnam.

    PubMed

    Vu, Lan T H; Nadol, Patrick; Le, Linh Cu

    2015-03-01

    This study used a confidential survey method, Audio Computer-Assisted Self-Interview (ACASI), to gather data about HIV-related knowledge and risk behaviors among the general population in Vietnam. The study sample included 1371 people aged 15 to 49 years in 3 cities: Hanoi, Da Nang, and Can Tho. Results indicated that 7% of participants had ever had nonconsensual sex, and 3.6% had ever had a one-night stand. Among male participants, 9.6% reported ever having sex with sex workers and 4.3% reported ever injecting drugs. The proportion of respondents who had ever been tested for HIV was 17.6%. The risk factors and attitudes reported in the survey underscore the importance of analyzing HIV-related risk behaviors in the general population. Young people, especially men in more urbanized settings, are engaging in risky behaviors and may act as a "bridge" for the transmission of HIV from high-risk groups to the general population in Vietnam.

  7. Survey of computed tomography doses and establishment of national diagnostic reference levels in the Republic of Belarus.

    PubMed

    Kharuzhyk, S A; Matskevich, S A; Filjustin, A E; Bogushevich, E V; Ugolkova, S A

    2010-01-01

    Computed tomography dose index (CTDI) was measured on eight CT scanners at seven public hospitals in the Republic of Belarus. The effective dose was calculated using normalised values of effective dose per dose-length product (DLP) for various body regions. Considerable variation in dose values was observed. Mean effective doses were 1.4 +/- 0.4 mSv for brain, 2.6 +/- 1.0 mSv for neck, 6.9 +/- 2.2 mSv for thorax, 7.0 +/- 2.3 mSv for abdomen and 8.8 +/- 3.2 mSv for pelvis. Diagnostic reference levels (DRLs) were proposed by calculating the third quartiles of the dose value distributions (volume CTDI in mGy / DLP in mGy cm): brain 60/730, neck 55/640, thorax 20/500, abdomen 25/600 and pelvis 25/490. It is evident that the protocols on some of the CT scanners need to be optimised, given that these are the first DRLs formulated for the Republic of Belarus.
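The two computations behind these figures can be sketched directly: effective dose as DLP times a region-specific conversion coefficient, and a DRL set at the third quartile of the surveyed dose distribution. A minimal sketch, assuming illustrative survey values and a hypothetical conversion coefficient rather than the paper's data:

```python
# Sketch of the two computations described above (values illustrative, not
# the paper's data): effective dose = DLP x region-specific conversion
# coefficient, and a DRL proposed as the third quartile of surveyed doses.

def effective_dose(dlp_mgy_cm, k_msv_per_mgy_cm):
    """Effective dose (mSv) from dose-length product via a normalised
    conversion coefficient k for the scanned body region."""
    return dlp_mgy_cm * k_msv_per_mgy_cm

def third_quartile(values):
    """Q3 by linear interpolation, as used for DRL proposals."""
    s = sorted(values)
    pos = 0.75 * (len(s) - 1)
    lo = int(pos)
    frac = pos - lo
    if lo + 1 < len(s):
        return s[lo] * (1 - frac) + s[lo + 1] * frac
    return s[lo]

if __name__ == "__main__":
    # Hypothetical thorax DLP survey results from several scanners (mGy cm):
    thorax_dlp = [320, 410, 380, 500, 450, 290, 530, 470]
    drl = third_quartile(thorax_dlp)
    print("proposed thorax DLP DRL:", drl, "mGy cm")
    # Illustrative (assumed) conversion coefficient for the chest region:
    print("effective dose at the DRL:", effective_dose(drl, 0.014), "mSv")
```

Scanners whose typical doses exceed the third-quartile DRL are the ones flagged for protocol optimisation.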

  8. Potential and limitations of X-Ray micro-computed tomography in arthropod neuroanatomy: a methodological and comparative survey.

    PubMed

    Sombke, Andy; Lipke, Elisabeth; Michalik, Peter; Uhl, Gabriele; Harzsch, Steffen

    2015-06-01

    Classical histology or immunohistochemistry combined with fluorescence or confocal laser scanning microscopy are common techniques in arthropod neuroanatomy, and these methods often require time-consuming and difficult dissections and sample preparations. Moreover, these methods are prone to artifacts due to compression and distortion of tissues, which often result in information loss and especially affect the spatial relationships of the examined parts of the nervous system in their natural anatomical context. Noninvasive approaches such as X-ray micro-computed tomography (micro-CT) can overcome such limitations and have been shown to be a valuable tool for understanding and visualizing internal anatomy and structural complexity. Nevertheless, knowledge about the potential of this method for analyzing the anatomy and organization of nervous systems, especially of taxa with smaller body size (e.g., many arthropods), is limited. This study set out to analyze the brains of selected arthropods with micro-CT, and to compare these results with available histological and immunohistochemical data. Specifically, we explored the influence of different sample preparation procedures. Our study shows that micro-CT is highly suitable for analyzing arthropod neuroarchitecture in situ and allows specific neuropils to be distinguished within the brain to extract quantitative data such as neuropil volumes. Moreover, data acquisition is considerably faster compared with many classical histological techniques. Thus, we conclude that micro-CT is highly suitable for targeting neuroanatomy, as it reduces the risk of artifacts and is faster than classical techniques. PMID:25728683

  9. Enhanced delegated computing using coherence

    NASA Astrophysics Data System (ADS)

    Barz, Stefanie; Dunjko, Vedran; Schlederer, Florian; Moore, Merritt; Kashefi, Elham; Walmsley, Ian A.

    2016-03-01

    A longstanding question is whether it is possible to delegate computational tasks securely—such that neither the computation nor the data is revealed to the server. Recently, both a classical and a quantum solution to this problem were found [C. Gentry, in Proceedings of the 41st Annual ACM Symposium on the Theory of Computing (Association for Computing Machinery, New York, 2009), pp. 167-178; A. Broadbent, J. Fitzsimons, and E. Kashefi, in Proceedings of the 50th Annual Symposium on Foundations of Computer Science (IEEE Computer Society, Los Alamitos, CA, 2009), pp. 517-526]. Here, we study the first step towards the interplay between classical and quantum approaches and show how coherence can be used as a tool for secure delegated classical computation. We show that a client with limited computational capacity—restricted to an XOR gate—can perform universal classical computation by manipulating information carriers that may occupy superpositions of two states. Using single photonic qubits or coherent light, we experimentally implement secure delegated classical computations between an independent client and a server, which are installed in two different laboratories and separated by 50 m. The server has access to the light sources and measurement devices, whereas the client may use only a restricted set of passive optical devices to manipulate the information-carrying light beams. Thus, our work highlights how minimal quantum and classical resources can be combined and exploited for classical computing.

  10. Surveying System

    NASA Technical Reports Server (NTRS)

    1988-01-01

    Sunrise Geodetic Surveys are setting up their equipment for a town survey. Their equipment differs from conventional surveying systems that employ transit, rod, and chain to measure angles and distances. They are using ISTAC Inc.'s Model 2002 positioning system, which offers fast, accurate surveying using signals from orbiting satellites. The special utility of the ISTAC Model 2002 is that it can provide positioning of the highest accuracy from Navstar PPS signals because it requires no knowledge of secret codes. It operates by comparing the frequency and time phase of a Navstar signal arriving at one ISTAC receiver with the reception of the same set of signals by another receiver. Data are computer processed and translated into three-dimensional position data: latitude, longitude, and elevation.
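The final processing step described above, turning satellite range measurements into a position fix, can be sketched as least-squares trilateration. This is a generic illustration, not ISTAC's algorithm, reduced to 2D with exact ranges for clarity:

```python
import math

# Generic illustration, not ISTAC's algorithm: turning ranges to
# transmitters of known position into a receiver fix, reduced here to
# 2D trilateration solved by Gauss-Newton least squares.

def trilaterate(sats, ranges, guess=(0.0, 0.0), iters=10):
    """Estimate (x, y) from measured ranges to known satellite positions."""
    x, y = guess
    for _ in range(iters):
        # Accumulate the normal equations J^T J * delta = J^T r,
        # where each Jacobian row is the unit vector from satellite to receiver.
        a11 = a12 = a22 = b1 = b2 = 0.0
        for (sx, sy), r in zip(sats, ranges):
            d = math.hypot(x - sx, y - sy)
            ux, uy = (x - sx) / d, (y - sy) / d
            res = r - d  # measured minus predicted range
            a11 += ux * ux
            a12 += ux * uy
            a22 += uy * uy
            b1 += ux * res
            b2 += uy * res
        det = a11 * a22 - a12 * a12
        x += (a22 * b1 - a12 * b2) / det
        y += (a11 * b2 - a12 * b1) / det
    return x, y

if __name__ == "__main__":
    sats = [(0.0, 100.0), (100.0, 100.0), (50.0, 200.0)]
    truth = (40.0, 30.0)
    ranges = [math.hypot(truth[0] - sx, truth[1] - sy) for sx, sy in sats]
    print(trilaterate(sats, ranges, guess=(10.0, 10.0)))
```

Real receivers solve the 3D case with a fourth unknown for the receiver clock offset, but the linearize-and-iterate structure is the same.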

  11. Computer Programs for Obtaining and Analyzing Daily Mean Streamflow Data from the U.S. Geological Survey National Water Information System Web Site

    USGS Publications Warehouse

    Granato, Gregory E.

    2009-01-01

    Research Council, 2004). The USGS maintains the National Water Information System (NWIS), a distributed network of computers and file servers used to store and retrieve hydrologic data (Mathey, 1998; U.S. Geological Survey, 2008). NWISWeb is an online version of this database that includes water data from more than 24,000 streamflow-gaging stations throughout the United States (U.S. Geological Survey, 2002, 2008). Information from NWISWeb is commonly used to characterize streamflows at gaged sites and to help predict streamflows at ungaged sites. Five computer programs were developed for obtaining and analyzing streamflow from the National Water Information System (NWISWeb). The programs were developed as part of a study by the U.S. Geological Survey, in cooperation with the Federal Highway Administration, to develop a stochastic empirical loading and dilution model. The programs were developed because reliable, efficient, and repeatable methods are needed to access and process streamflow information and data. The first program is designed to facilitate the downloading and reformatting of NWISWeb streamflow data. The second program is designed to facilitate graphical analysis of streamflow data. The third program is designed to facilitate streamflow-record extension and augmentation to help develop long-term statistical estimates for sites with limited data. The fourth program is designed to facilitate statistical analysis of streamflow data. The fifth program is a preprocessor to create batch input files for the U.S. Environmental Protection Agency DFLOW3 program for calculating low-flow statistics. These computer programs were developed to facilitate the analysis of daily mean streamflow data for planning-level water-quality analyses but also are useful for many other applications pertaining to streamflow data and statistics. These programs and the associated documentation are included on the CD-ROM accompanying this report. 
This report and the appendixes on the
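Retrievals like those the first program downloads and reformats typically arrive in the USGS tab-delimited RDB format: '#' comment lines, a header row, a column-format row, then data rows. A minimal parsing sketch with an inline sample instead of a live NWISWeb download; the column names here are illustrative assumptions, and real retrievals carry more columns and qualifier codes:

```python
import io
import statistics

# Sketch of an assumed layout, not the programs' actual code: NWISWeb
# daily-value retrievals arrive in USGS RDB format -- '#' comment lines,
# a tab-delimited header row, a column-format row, then data rows.

SAMPLE_RDB = (
    "# U.S. Geological Survey daily mean streamflow (illustrative sample)\n"
    "agency_cd\tsite_no\tdatetime\tflow_cfs\n"
    "5s\t15s\t20d\t14n\n"
    "USGS\t01646500\t2009-01-01\t3410\n"
    "USGS\t01646500\t2009-01-02\t3150\n"
    "USGS\t01646500\t2009-01-03\t2980\n"
)

def parse_rdb(text):
    """Return a list of dicts, one per data row, keyed by header names."""
    rows, header, skip_format_row = [], None, False
    for line in io.StringIO(text):
        line = line.rstrip("\n")
        if not line or line.startswith("#"):
            continue  # skip comments and blank lines
        fields = line.split("\t")
        if header is None:
            header, skip_format_row = fields, True
        elif skip_format_row:
            skip_format_row = False  # drop the "5s 15s 20d 14n" format row
        else:
            rows.append(dict(zip(header, fields)))
    return rows

if __name__ == "__main__":
    records = parse_rdb(SAMPLE_RDB)
    flows = [float(r["flow_cfs"]) for r in records]
    print(len(records), "days; mean flow", statistics.mean(flows), "cfs")
```

Once parsed into records, the graphical and statistical analyses the report describes reduce to ordinary operations over the daily-value series.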

  12. Potential and limitations of X-Ray micro-computed tomography in arthropod neuroanatomy: A methodological and comparative survey

    PubMed Central

    Sombke, Andy; Lipke, Elisabeth; Michalik, Peter; Uhl, Gabriele; Harzsch, Steffen

    2015-01-01

    Classical histology or immunohistochemistry combined with fluorescence or confocal laser scanning microscopy are common techniques in arthropod neuroanatomy, and these methods often require time-consuming and difficult dissections and sample preparations. Moreover, these methods are prone to artifacts due to compression and distortion of tissues, which often result in information loss and especially affect the spatial relationships of the examined parts of the nervous system in their natural anatomical context. Noninvasive approaches such as X-ray micro-computed tomography (micro-CT) can overcome such limitations and have been shown to be a valuable tool for understanding and visualizing internal anatomy and structural complexity. Nevertheless, knowledge about the potential of this method for analyzing the anatomy and organization of nervous systems, especially of taxa with smaller body size (e.g., many arthropods), is limited. This study set out to analyze the brains of selected arthropods with micro-CT, and to compare these results with available histological and immunohistochemical data. Specifically, we explored the influence of different sample preparation procedures. Our study shows that micro-CT is highly suitable for analyzing arthropod neuroarchitecture in situ and allows specific neuropils to be distinguished within the brain to extract quantitative data such as neuropil volumes. Moreover, data acquisition is considerably faster compared with many classical histological techniques. Thus, we conclude that micro-CT is highly suitable for targeting neuroanatomy, as it reduces the risk of artifacts and is faster than classical techniques. J. Comp. Neurol. 523:1281–1295, 2015. © 2015 Wiley Periodicals, Inc. PMID:25728683

  13. Prior to the oral therapy, what do we know about HCV-4 in Egypt: a randomized survey of prevalence and risks using data mining computed analysis.

    PubMed

    Abd Elrazek, Abd Elrazek; Bilasy, Shymaa E; Elbanna, Abduh E M; Elsherif, Abd Elhalim A

    2014-12-01

    Hepatitis C virus (HCV) affects over 180 million people worldwide and is the leading cause of chronic liver disease and hepatocellular carcinoma. HCV is classified into seven major genotypes and a series of subtypes. In general, HCV genotype 4 (HCV-4) is common in the Middle East and Africa, where it is responsible for more than 80% of HCV infections. Although HCV-4 is the cause of approximately 20% of the 180 million cases of chronic hepatitis C worldwide, it has not yet been a major subject of research. The aim of the current study was to survey the morbidities and disease complications among the Egyptian population infected with HCV-4, mainly using data mining computational methods complemented by statistical analysis. Six thousand six hundred sixty subjects, aged between 17 and 58 years, from different Egyptian Governorates were screened for HCV infection by ELISA and qualitative PCR. HCV-positive patients were further investigated for the incidence of liver cirrhosis and esophageal varices. The obtained data were analyzed by a data mining approach. Among the 6660 subjects enrolled in this survey, 1018 patients (15.28%) were HCV-positive. The proportion of infected males was significantly higher than that of females: 61.6% versus 38.4% (P=0.0052). Around two-thirds of infected patients (635/1018; 62.4%) presented with liver cirrhosis. Additionally, approximately half of the cirrhotic patients (301/635; 47.4%) showed degrees of large esophageal varices (LEVs), with higher variceal grades observed in males. The age at esophageal variceal development was 47±1 years. Data mining analysis yielded esophageal wall thickness (>6.5 mm), determined by conventional ultrasound, as the only independent predictor of esophageal varices. This study emphasizes the high prevalence of HCV infection among the Egyptian population, in particular among males. Egyptians with HCV-4 infection are at higher risk of developing liver cirrhosis and esophageal varices. Data mining, a new analytic technique in

  14. A survey of surveys

    SciTech Connect

    Kent, S.M.

    1994-11-01

    A new era for the field of Galactic structure is about to be opened with the advent of wide-area digital sky surveys. In this article, the author reviews the status of and prospects for research in three new ground-based surveys: the Sloan Digital Sky Survey (SDSS), the Deep Near-Infrared Survey of the Southern Sky (DENIS) and the Two Micron All Sky Survey (2MASS). These surveys will permit studies of Galactic structure and stellar populations in the Galaxy with unprecedented detail. Extracting the information, however, will be challenging.

  15. Computers in Schools.

    ERIC Educational Resources Information Center

    Moore, John W.; Moore, Elizabeth A.

    1988-01-01

    Surveys the types of computers being used in high school and university chemistry courses. Identifies types of hardware found on Apple and MS-DOS computers. Makes recommendations for the upgrading of current equipment. (ML)

  16. Do Home Computers Improve Educational Outcomes? Evidence from Matched Current Population Surveys and the National Longitudinal Survey of Youth 1997. National Poverty Center Working Paper Series #06-01

    ERIC Educational Resources Information Center

    Beltran, Daniel O.; Das, Kuntal K.; Fairlie, Robert W.

    2006-01-01

    Nearly twenty million children in the United States do not have computers in their homes. The role of "home" computers in the educational process, however, has drawn very little attention in the previous literature. We use panel data from the two main U.S. datasets that include recent information on computer ownership among children--the 2000-2003…

  17. Computational fluid dynamics

    NASA Technical Reports Server (NTRS)

    1989-01-01

    An overview of computational fluid dynamics (CFD) activities at the Langley Research Center is given. The role of supercomputers in CFD research, algorithm development, multigrid approaches to computational fluid flows, aerodynamics computer programs, computational grid generation, turbulence research, and studies of rarefied gas flows are among the topics that are briefly surveyed.

  18. Nursing Education Update: Computer Technology.

    ERIC Educational Resources Information Center

    Gothler, Ann M.

    1985-01-01

    A survey of nursing faculty showed that 91 percent of nursing education programs had faculty members who had attended or participated in a conference on computers during 1983 and 1984. Other survey responses concerned computer applications integrated into nursing courses, required courses in computer technology, and computer-assisted instruction.…

  19. Early science from the Pan-STARRS1 Optical Galaxy Survey (POGS): Maps of stellar mass and star formation rate surface density obtained from distributed-computing pixel-SED fitting

    NASA Astrophysics Data System (ADS)

    Thilker, David A.; Vinsen, K.; Galaxy Properties Key Project, PS1

    2014-01-01

    To measure resolved galactic physical properties unbiased by the mask of recent star formation and dust features, we are conducting a citizen-scientist enabled nearby galaxy survey based on the unprecedented optical (g,r,i,z,y) imaging from Pan-STARRS1 (PS1). The PS1 Optical Galaxy Survey (POGS) covers 3π steradians (75% of the sky), about twice the footprint of SDSS. Whenever possible we also incorporate ancillary multi-wavelength image data from the ultraviolet (GALEX) and infrared (WISE, Spitzer) spectral regimes. For each cataloged nearby galaxy with a reliable redshift estimate of z < 0.05 - 0.1 (dependent on donated CPU power), publicly-distributed computing is being harnessed to enable pixel-by-pixel spectral energy distribution (SED) fitting, which in turn provides maps of key physical parameters such as the local stellar mass surface density, crude star formation history, and dust attenuation. With pixel SED fitting output we will then constrain parametric models of galaxy structure in a more meaningful way than ordinarily achieved. In particular, we will fit multi-component (e.g. bulge, bar, disk) galaxy models directly to the distribution of stellar mass rather than surface brightness in a single band, which is often locally biased. We will also compute non-parametric measures of morphology such as concentration and asymmetry using the POGS stellar mass and SFR surface density images. We anticipate studying how galactic substructures evolve by comparing our results with simulations and against more distant imaging surveys, some of which will also be processed in the POGS pipeline. The reliance of our survey on citizen-scientist volunteers provides a world-wide opportunity for education. We developed an interactive interface which highlights the science being produced by each volunteer's own CPU cycles. The POGS project has already proven popular amongst the public, attracting about 5000 volunteers with nearly 12,000 participating computers, and is

  20. [Experience with video-computer method of assessing the mental state with the help of "Vidicor" in a psychological survey of entrants in Suvorov Military School].

    PubMed

    Tsymbal, A N; Platonova, I A; Tsymbal, A A

    2011-08-01

    The authors analyzed the use of two different survey methods (PPO and "Vidicor") with entrants of the St. Petersburg Suvorov Military School. It is shown that the final results of the two methods do not contradict each other. The discrepancy between the results is 4.4%, but the "Vidicor" method has several advantages: lower labor costs, less psychological strain on those surveyed, greater speed and efficiency, and the possibility of creating an electronic database. This allows us to recommend it for use. PMID:22164987

  1. A regional land use survey based on remote sensing and other data: A report on a LANDSAT and computer mapping project, volume 2

    NASA Technical Reports Server (NTRS)

    Nez, G. (Principal Investigator); Mutter, D.

    1977-01-01

    The author has identified the following significant results. The project mapped land use/cover classifications from LANDSAT computer compatible tape data and combined those results with other multisource data via computer mapping/compositing techniques to analyze various land use planning/natural resource management problems. Data were analyzed on 1:24,000 scale maps at 1.1 acre resolution. LANDSAT analysis software and linkages with other computer mapping software were developed. Significant results were also achieved in training, communication, and identification of needs for developing the LANDSAT/computer mapping technologies into operational tools for use by decision makers.

  2. Computational vision

    NASA Technical Reports Server (NTRS)

    Barrow, H. G.; Tenenbaum, J. M.

    1981-01-01

    The range of fundamental computational principles underlying human vision that equally apply to artificial and natural systems is surveyed. There emerges from research a view of the structuring of vision systems as a sequence of levels of representation, with the initial levels being primarily iconic (edges, regions, gradients) and the highest symbolic (surfaces, objects, scenes). Intermediate levels are constrained by information made available by preceding levels and information required by subsequent levels. In particular, it appears that physical and three-dimensional surface characteristics provide a critical transition from iconic to symbolic representations. A plausible vision system design incorporating these principles is outlined, and its key computational processes are elaborated.

  3. Reliability automation tool (RAT) for fault tolerance computation

    NASA Astrophysics Data System (ADS)

    Singh, N. S. S.; Hamid, N. H.; Asirvadam, V. S.

    2012-09-01

    As CMOS transistors shrink, circuits built from these nano-scale transistors naturally become less reliable. This reduction in reliability, a key measure of circuit performance, has raised many challenges in designing modern logic integrated circuits, and reliability modeling is therefore an increasingly important consideration in their design. This drives a need to compute reliability measures for nano-scale circuits. This paper looks into the development of a reliability automation tool (RAT) for computing circuit reliability. The tool is developed in the Matlab programming language and is based on the reliability evaluation model called the Probabilistic Transfer Matrix (PTM). RAT allows users to significantly speed up the reliability assessment of nano-scale circuits. Users provide a circuit netlist as the input to RAT for reliability computation. The netlist describes the circuit in terms of a Gate Profile Matrix (GPM), an Adjacency Computation Matrix (ACM) and a Grid Layout Matrix (GLM), which specify, respectively, the types of logic gates, the interconnections between these logic gates, and their layout in a given circuit design. Here, the reliability assessment by RAT is carried out on a Full Adder circuit as the benchmark test circuit.
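The PTM model named above can be sketched in a few lines: each gate becomes a matrix of output probabilities indexed by input pattern, serial stages compose by matrix multiplication, parallel gates by the Kronecker product, and reliability is the probability that the faulty circuit agrees with the ideal one. A minimal pure-Python sketch, not RAT's Matlab code; the NAND example and the 0.95 error-free probability are illustrative:

```python
# Illustrative pure-Python sketch of the PTM model (not RAT's Matlab code).

def gate_ptm(truth_table, p_correct):
    """PTM of a gate: rows = input patterns, columns = output value 0/1,
    with the ideal output produced with probability p_correct."""
    return [[p_correct if out == truth_table[i] else 1.0 - p_correct
             for out in (0, 1)]
            for i in range(len(truth_table))]

def matmul(a, b):
    """Serial composition of PTM stages."""
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))]
            for i in range(len(a))]

def kron(a, b):
    """Parallel composition: Kronecker product of two PTMs."""
    return [[a[i][j] * b[k][l]
             for j in range(len(a[0])) for l in range(len(b[0]))]
            for i in range(len(a)) for k in range(len(b))]

def reliability(ptm, ideal_outputs):
    """Probability of the correct output, averaged over uniform inputs."""
    return sum(row[ideal_outputs[i]] for i, row in enumerate(ptm)) / len(ptm)

# Wire fan-out: one output bit feeds both inputs of the next gate
# (0 -> input pattern 00, 1 -> input pattern 11).
FANOUT = [[1.0, 0.0, 0.0, 0.0],
          [0.0, 0.0, 0.0, 1.0]]

if __name__ == "__main__":
    NAND = [1, 1, 1, 0]  # truth table over inputs 00, 01, 10, 11
    faulty = gate_ptm(NAND, p_correct=0.95)
    print("single NAND:", reliability(faulty, NAND))
    # A NAND followed by a NAND-as-inverter implements AND:
    circuit = matmul(matmul(faulty, FANOUT), faulty)
    print("NAND + inverter vs ideal AND:", reliability(circuit, [0, 0, 0, 1]))
```

For the two-gate circuit the output is correct when both gates are correct or both flip, so the reliability drops below either gate's individual value, which is exactly the effect the PTM composition captures.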

  4. 'Towers in the Tempest' Computer Animation Submission

    NASA Technical Reports Server (NTRS)

    Shirah, Greg

    2008-01-01

    The following describes a computer animation that has been submitted to the ACM/SIGGRAPH 2008 computer graphics conference. 'Towers in the Tempest' clearly communicates recent scientific research into how hurricanes intensify. This intensification can be caused by a phenomenon called a 'hot tower.' For the first time, research meteorologists have run complex atmospheric simulations at a very fine temporal resolution of 3 minutes. Combining this simulation data with satellite observations enables detailed study of 'hot towers.' The science of 'hot towers' is described using satellite observation data, conceptual illustrations, and volumetric atmospheric simulation data. The movie starts by showing a 'hot tower' observed in three-dimensional precipitation radar data of Hurricane Bonnie from NASA's Tropical Rainfall Measuring Mission (TRMM) spacecraft. Next, the dynamics of a hurricane and the formation of 'hot towers' are briefly explained using conceptual illustrations. Finally, volumetric cloud, wind, and vorticity data from a supercomputer simulation of Hurricane Bonnie are shown using volume-rendering techniques such as ray marching.

  5. The Challenge of Computers.

    ERIC Educational Resources Information Center

    Leger, Guy

    Computers may change teachers' lifestyles, teaching styles, and perhaps even their personal values. A brief survey of the history of computers demonstrates the incredible pace at which computer technology is moving ahead. The cost and size of microchips will continue to decline dramatically over the next 20 years, while the capability and variety…

  6. Publishing Trends in Educational Computing.

    ERIC Educational Resources Information Center

    O'Hair, Marilyn; Johnson, D. LaMont

    1989-01-01

    Describes results of a survey of secondary school and college teachers that was conducted to determine subject matter that should be included in educational computing journals. Areas of interest included computer applications; artificial intelligence; computer-aided instruction; computer literacy; computer-managed instruction; databases; distance…

  7. Quantum computing with trapped ions

    SciTech Connect

    Hughes, R.J.

    1998-01-01

    The significance of quantum computation for cryptography is discussed. Following a brief survey of the requirements for quantum computational hardware, an overview of the ion trap quantum computation project at Los Alamos is presented. The physical limitations to quantum computation with trapped ions are analyzed and an assessment of the computational potential of the technology is made.

  8. QADATA user's manual; an interactive computer program for the retrieval and analysis of the results from the external blind sample quality- assurance project of the U.S. Geological Survey

    USGS Publications Warehouse

    Lucey, K.J.

    1990-01-01

    The U.S. Geological Survey conducts an external blind sample quality assurance project for its National Water Quality Laboratory in Denver, Colorado, based on the analysis of reference water samples. Reference samples containing selected inorganic and nutrient constituents are disguised as environmental samples at the Survey's office in Ocala, Florida, and are sent periodically through other Survey offices to the laboratory. The results of this blind sample project indicate the quality of analytical data produced by the laboratory. This report provides instructions on the use of QADATA, an interactive, menu-driven program that allows users to retrieve the results of the blind sample quality-assurance project. The QADATA program, which is available on the U.S. Geological Survey's national computer network, accesses a blind sample database that contains more than 50,000 determinations from the last five water years for approximately 40 constituents at various concentrations. The data can be retrieved from the database for any user-defined time period and for any or all available constituents. After the user defines the retrieval, the program prepares statistical tables, control charts, and precision plots and generates a report which can be transferred to the user's office through the computer network. A discussion of the interpretation of the program output is also included. This quality assurance information will permit users to document the quality of the analytical results received from the laboratory. The blind sample data is entered into the database within weeks after being produced by the laboratory and can be retrieved to meet the needs of specific projects or programs. (USGS)
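The control charts QADATA prepares can be illustrated with a standard Shewhart chart: a center line at the mean of the blind-sample determinations and control limits at plus or minus three standard deviations. A minimal sketch with hypothetical determinations, not actual blind-sample data:

```python
import statistics

# Illustrative sketch (hypothetical data, not QADATA's output): a Shewhart
# control chart built from repeated determinations of one reference sample.

def control_limits(determinations):
    """Return (lower limit, center line, upper limit) at mean +/- 3 sd."""
    mean = statistics.mean(determinations)
    sd = statistics.stdev(determinations)
    return mean - 3 * sd, mean, mean + 3 * sd

if __name__ == "__main__":
    # Hypothetical chloride determinations (mg/L) for one reference sample:
    values = [20.1, 19.8, 20.4, 20.0, 19.9, 20.3, 20.2, 19.7]
    lcl, center, ucl = control_limits(values)
    flagged = [v for v in values if not lcl <= v <= ucl]
    print(round(lcl, 2), round(center, 2), round(ucl, 2), flagged)
```

Determinations falling outside the limits would signal an out-of-control analytical process at the laboratory for that constituent.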

  9. Techniques for computer-aided analysis of ERTS-1 data, useful in geologic, forest and water resource surveys. [Colorado Rocky Mountains]

    NASA Technical Reports Server (NTRS)

    Hoffer, R. M.

    1974-01-01

    Forestry, geology, and water resource applications were the focus of this study, which involved the use of computer-implemented pattern-recognition techniques to analyze ERTS-1 data. The results have proven the value of computer-aided analysis techniques, even in areas of mountainous terrain. Several analysis capabilities have been developed during these ERTS-1 investigations. A procedure to rotate, deskew, and geometrically scale the MSS data results in 1:24,000 scale printouts that can be directly overlaid on 7 1/2-minute U.S.G.S. topographic maps. Several scales of computer-enhanced "false color-infrared" composites of MSS data can be obtained from a digital display unit, and emphasize the tremendous detail present in the ERTS-1 data. A grid can also be superimposed on the displayed data to aid in specifying areas of interest.
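
    The rotate/deskew/scale step can be sketched as a simple coordinate transform; the angle, shear, and scale parameters below are hypothetical, not the procedure's actual values.

```python
# Sketch of the rotate/deskew/scale idea behind gridding MSS pixels
# onto a map base. Parameters are hypothetical illustrations.
import math

def rotate_deskew_scale(col, row, theta_deg, skew, sx, sy):
    """Map a (col, row) image coordinate to map coordinates by applying
    a shear (deskew), a rotation, and per-axis scaling."""
    t = math.radians(theta_deg)
    # shear to remove along-track skew
    x = col + skew * row
    y = row
    # rotate into map orientation
    xr = x * math.cos(t) - y * math.sin(t)
    yr = x * math.sin(t) + y * math.cos(t)
    # scale to map units
    return sx * xr, sy * yr
```

Applying the same transform to every pixel center yields printouts that register with a topographic map of the chosen scale.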

  10. PEP surveying procedures and equipment

    SciTech Connect

    Linker, F.

    1982-06-01

    The PEP Survey and Alignment System, which employs both laser-based and optical survey methods, is described. The laser is operated in conjunction with the Tektronix 4051 computer and surveying instruments such as ARM and SAM, a system designed to automate data input, reduction, and production of alignment instructions. The laser system is used when surveying ring quadrupoles, main bend magnets, and sextupoles, and is optional when surveying RF cavities and insertion quadrupoles. Optical methods usually require that data be manually entered into the computer for alignment, but in some cases an element can be aligned using nominal values of fiducial locations without use of the computer. Optical surveying is used in the alignment of NIT and SIT, low field bend magnets, wigglers, RF cavities, and insertion quadrupoles.

  11. What Children Think about Computers.

    ERIC Educational Resources Information Center

    Future of Children, 2000

    2000-01-01

    Surveyed Internet-using children about their experiences with and perceptions of computer technology. Respondents valued the role of computers in their lives for entertainment, accomplishing goals, and becoming competent and empowered. They believed computers and the Internet improved their lives. Most children had computers at school, but nearly…

  12. Computer vision

    NASA Technical Reports Server (NTRS)

    Gennery, D.; Cunningham, R.; Saund, E.; High, J.; Ruoff, C.

    1981-01-01

    The field of computer vision is surveyed and assessed, key research issues are identified, and possibilities for a future vision system are discussed. The problems of describing two- and three-dimensional worlds are discussed. The representation of such features as texture, edges, curves, and corners is detailed. Recognition methods are described in which cross correlation coefficients are maximized or numerical values for a set of features are measured. Object tracking is discussed in terms of the robust matching algorithms that must be devised. Stereo vision, camera control and calibration, and the hardware and systems architecture are discussed.
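
    Recognition by maximizing a cross-correlation coefficient can be sketched in one dimension; the toy signal and template below are illustrative only, not from the surveyed work.

```python
# Sketch: template recognition by maximizing the normalized
# cross-correlation coefficient over all offsets (1-D toy example).
from statistics import mean

def ncc(a, b):
    """Normalized cross-correlation coefficient of equal-length windows."""
    ma, mb = mean(a), mean(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    da = sum((x - ma) ** 2 for x in a) ** 0.5
    db = sum((y - mb) ** 2 for y in b) ** 0.5
    return num / (da * db) if da and db else 0.0

def best_match(signal, template):
    """Slide the template over the signal; return the offset of the peak."""
    n = len(template)
    scores = [ncc(signal[i:i + n], template)
              for i in range(len(signal) - n + 1)]
    return max(range(len(scores)), key=scores.__getitem__)

signal = [0, 0, 1, 3, 1, 0, 0, 0]
template = [1, 3, 1]
offset = best_match(signal, template)
```

The same maximize-the-coefficient idea extends to 2-D image patches, where it underlies classical template matching.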

  13. Survey of digital filtering

    NASA Technical Reports Server (NTRS)

    Nagle, H. T., Jr.

    1972-01-01

    A three part survey is made of the state-of-the-art in digital filtering. Part one presents background material including sampled data transformations and the discrete Fourier transform. Part two, digital filter theory, gives an in-depth coverage of filter categories, transfer function synthesis, quantization and other nonlinear errors, filter structures and computer aided design. Part three presents hardware mechanization techniques. Implementations by general purpose, mini-, and special-purpose computers are presented.
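
    The discrete Fourier transform covered in part one can be sketched directly from its definition; this O(N^2) form is illustrative only, not a practical implementation.

```python
# Sketch: the discrete Fourier transform from its definition,
# X[k] = sum_n x[n] * exp(-2*pi*i*k*n/N).
import cmath

def dft(x):
    """Naive O(N^2) DFT of a sequence of (real or complex) samples."""
    n = len(x)
    return [sum(x[m] * cmath.exp(-2j * cmath.pi * k * m / n)
                for m in range(n))
            for k in range(n)]
```

A unit impulse transforms to a flat spectrum, which is a convenient sanity check before moving to FFT-based implementations.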

  14. Computed tomography imaging spectrometer (CTIS) with 2D reflective grating for ultraviolet to long-wave infrared detection especially useful for surveying transient events

    NASA Technical Reports Server (NTRS)

    Wilson, Daniel W. (Inventor); Maker, Paul D. (Inventor); Muller, Richard E. (Inventor); Mouroulis, Pantazis Z. (Inventor)

    2003-01-01

    The optical system of this invention is a unique type of imaging spectrometer, i.e., an instrument that can determine the spectra of all points in a two-dimensional scene. The general type of imaging spectrometer under which this invention falls has been termed a computed-tomography imaging spectrometer (CTIS). CTISs have the ability to perform spectral imaging of scenes containing rapidly moving objects or evolving features, hereafter referred to as transient scenes. This invention, a reflective CTIS with a unique two-dimensional reflective grating, can operate in any wavelength band from the ultraviolet through long-wave infrared. Although this spectrometer is especially useful for rapidly occurring events, it is also useful for investigation of some slow-moving phenomena, as in the life sciences.

  16. Willingness of Patients with Breast Cancer in the Adjuvant and Metastatic Setting to Use Electronic Surveys (ePRO) Depends on Sociodemographic Factors, Health-related Quality of Life, Disease Status and Computer Skills

    PubMed Central

    Graf, J.; Simoes, E.; Wißlicen, K.; Rava, L.; Walter, C. B.; Hartkopf, A.; Keilmann, L.; Taran, A.; Wallwiener, S.; Fasching, P.; Brucker, S. Y.; Wallwiener, M.

    2016-01-01

    Introduction: Because of the often unfavorable prognosis, particularly for patients with metastases, health-related quality of life is extremely important for breast cancer patients. In recent years, data on patient-relevant endpoints are increasingly being collected electronically; however, knowledge on the acceptance and practicability of, and barriers to, this form of data collection remains limited. Material and Methods: A questionnaire was completed by 96 patients to determine to what extent existing computer skills, disease status, health-related quality of life and sociodemographic factors affect patientsʼ potential willingness to use electronic methods of data collection (ePRO). Results: 52 of 96 (55 %) patients reported a priori that they could envisage using ePRO. Patients who a priori preferred a paper-based survey (pPRO) tended to be older (ePRO 53 years vs. pPRO 62 years; p = 0.0014) and typically had lower levels of education (p = 0.0002), were in poorer health (p = 0.0327) and had fewer computer skills (p = 0.0003). Conclusion: Barriers to the prospective use of ePRO were identified in older patients and patients with a lower quality of life. Given the appropriate conditions with regard to age, education and current health status, opportunities to participate should be provided to encourage patientsʼ willingness to take part and ensure the validity of survey results. Focusing on ease of use of ePRO applications and making applications more patient-oriented and straightforward appears to be the way forward. PMID:27239062

  17. Audio computer-assisted survey instrument versus face-to-face interviews: optimal method for detecting high-risk behaviour in pregnant women and their sexual partners in the south of Brazil

    PubMed Central

    Yeganeh, N; Dillavou, C; Simon, M; Gorbach, P; Santos, B; Fonseca, R; Saraiva, J; Melo, M; Nielsen-Saines, K

    2016-01-01

    Summary Audio computer-assisted survey instrument (ACASI) has been shown to decrease under-reporting of socially undesirable behaviours, but has not been evaluated in pregnant women at risk of HIV acquisition in Brazil. We assigned HIV-negative pregnant women receiving routine antenatal care in Porto Alegre, Brazil and their partners to receive a survey regarding high-risk sexual behaviours and drug use via ACASI (n = 372) or face-to-face (FTF) (n = 283) interviews. Logistic regression showed that compared with FTF, pregnant women interviewed via ACASI were significantly more likely to self-report themselves as single (14% versus 6%), having >5 sexual partners (35% versus 29%), having oral sex (42% versus 35%), using intravenous drugs (5% versus 0), smoking cigarettes (23% versus 16%), drinking alcohol (13% versus 8%) and using condoms during pregnancy (32% versus 17%). Therefore, ACASI may be a useful method in assessing risk behaviours in pregnant women, especially in relation to drug and alcohol use. PMID:23970659
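
    The mode comparison reduces to contrasting reporting odds between the two interview groups; a minimal sketch, with hypothetical counts rather than the study's data:

```python
# Sketch: the odds-ratio comparison behind interview-mode analyses.
# The 2x2 counts below are hypothetical, not the study's data.
def odds_ratio(a, b, c, d):
    """OR for a 2x2 table:
    a = group 1 reporting, b = group 1 not reporting,
    c = group 2 reporting, d = group 2 not reporting."""
    return (a / b) / (c / d)

# hypothetical: 60/312 ACASI vs. 23/260 FTF reporting a sensitive behaviour
or_acasi = odds_ratio(60, 312, 23, 260)
```

An OR above 1 indicates more frequent disclosure under ACASI; the published analysis additionally adjusts for covariates via logistic regression.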

  18. Copyright Survey Results.

    ERIC Educational Resources Information Center

    Botterbusch, Hope R.

    1992-01-01

    Reports results of a survey of copyright concerns that was conducted by the Association for Educational Communications and Technology. Areas addressed include video and television; copyright legislation; printed materials; music; audiovisual materials; and computer software. A checklist of proper copyright procedures is included. (six references)…

  19. Potential of known and short prokaryotic protein motifs as a basis for novel peptide-based antibacterial therapeutics: a computational survey

    PubMed Central

    Ruhanen, Heini; Hurley, Daniel; Ghosh, Ambarnil; O'Brien, Kevin T.; Johnston, Catrióna R.; Shields, Denis C.

    2014-01-01

    Short linear motifs (SLiMs) are functional stretches of protein sequence that are of crucial importance for numerous biological processes by mediating protein–protein interactions. These motifs often comprise peptides of less than 10 amino acids that modulate protein–protein interactions. While well-characterized in eukaryotic intracellular signaling, their role in prokaryotic signaling is less well-understood. We surveyed the distribution of known motifs in prokaryotic extracellular and virulence proteins across a range of bacterial species and conducted searches for novel motifs in virulence proteins. Many known motifs in virulence effector proteins mimic eukaryotic motifs and enable the pathogen to control the intracellular processes of their hosts. Novel motifs were detected by finding those that had evolved independently in three or more unrelated virulence proteins. The search returned several significantly over-represented linear motifs of which some were known motifs and others are novel candidates with potential roles in bacterial pathogenesis. A putative C-terminal G[AG].$ motif found in type IV secretion system proteins was among the most significant detected. A KK$ motif, previously identified in a plasminogen-binding protein, was demonstrated to be enriched across a number of adhesion and lipoproteins. While there is some potential to develop peptide drugs against bacterial infection based on bacterial peptides that mimic host components, this could have unwanted effects on host signaling. Thus, novel SLiMs in virulence factors that do not mimic host components but are crucial for bacterial pathogenesis, such as the type IV secretion system, may be more useful to develop as leads for anti-microbial peptides or drugs. PMID:24478765

  20. Efficient Computational Research Protocol to Survey Free Energy Surface for Solution Chemical Reaction in the QM/MM Framework: The FEG-ER Methodology and Its Application to Isomerization Reaction of Glycine in Aqueous Solution.

    PubMed

    Takenaka, Norio; Kitamura, Yukichi; Nagaoka, Masataka

    2016-03-01

    In solution chemical reaction, we often need to consider a multidimensional free energy (FE) surface (FES) which is analogous to a Born-Oppenheimer potential energy surface. To survey the FES, an efficient computational research protocol is proposed within the QM/MM framework; (i) we first obtain some stable states (or transition states) involved by optimizing their structures on the FES, in a stepwise fashion, finally using the free energy gradient (FEG) method, and then (ii) we directly obtain the FE differences among any arbitrary states on the FES, efficiently by employing the QM/MM method with energy representation (ER), i.e., the QM/MM-ER method. To validate the calculation accuracy and efficiency, we applied the above FEG-ER methodology to a typical isomerization reaction of glycine in aqueous solution, and reproduced quite satisfactorily the experimental value of the reaction FE. Further, it was found that the structural relaxation of the solute in the QM/MM force field is not negligible to estimate correctly the FES. We believe that the present research protocol should become prevailing as one computational strategy and will play promising and important roles in solution chemistry toward solution reaction ergodography. PMID:26794718
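
    Step (i), structure optimization on the FES along the free energy gradient, can be sketched as steepest descent on a model surface; the quadratic surface and step size below are hypothetical stand-ins for QM/MM ensemble averages, not the FEG-ER machinery itself.

```python
# Sketch: steepest descent along the (negative) free-energy gradient,
# in the spirit of the FEG optimization step. Model surface only.
def feg_minimize(grad, x0, step=0.1, iters=200):
    """Iteratively step downhill along the supplied gradient function."""
    x = list(x0)
    for _ in range(iters):
        g = grad(x)
        x = [xi - step * gi for xi, gi in zip(x, g)]
    return x

# Model FES: F(x, y) = (x - 1)^2 + 2*(y + 0.5)^2, minimum at (1, -0.5)
grad = lambda p: [2 * (p[0] - 1), 4 * (p[1] + 0.5)]
xmin = feg_minimize(grad, [0.0, 0.0])
```

In the actual method each gradient evaluation is an ensemble average over MM configurations, which is what makes an efficient protocol such as FEG-ER valuable.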

  2. Computer program for simulation of variable recharge with the U. S. Geological Survey modular finite-difference ground-water flow model (MODFLOW)

    USGS Publications Warehouse

    Kontis, A.L.

    2001-01-01

    The Variable-Recharge Package is a computerized method designed for use with the U.S. Geological Survey three-dimensional finite-difference ground-water flow model (MODFLOW-88) to simulate areal recharge to an aquifer. It is suitable for simulations of aquifers in which the relation between ground-water levels and land surface can affect the amount and distribution of recharge. The method is based on the premise that recharge to an aquifer cannot occur where the water level is at or above land surface. Consequently, recharge will vary spatially in simulations in which the Variable-Recharge Package is applied, if the water levels are sufficiently high. The input data required by the program for each model cell that can potentially receive recharge include the average land-surface elevation and a quantity termed "water available for recharge," which is equal to precipitation minus evapotranspiration. The Variable-Recharge Package also can be used to simulate recharge to a valley-fill aquifer in which the valley fill and the adjoining uplands are explicitly simulated. Valley-fill aquifers, which are the most common type of aquifer in the glaciated northeastern United States, receive much of their recharge from upland sources as channeled and/or unchanneled surface runoff and as lateral ground-water flow. Surface runoff in the uplands is generated in the model when the applied water available for recharge is rejected because simulated water levels are at or above land surface. The surface runoff can be distributed to other parts of the model by (1) applying the amount of the surface runoff that flows to upland streams (channeled runoff) to explicitly simulated streams that flow onto the valley floor, and/or (2) applying the amount that flows downslope toward the valley-fill aquifer (unchanneled runoff) to specified model cells, typically those near the valley wall. An example model of an idealized valley-fill aquifer is presented to demonstrate application of the
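
    The package's premise can be sketched as a per-cell rule; the array names and values below are hypothetical illustrations, not MODFLOW's actual input format.

```python
# Sketch of the Variable-Recharge premise: recharge is rejected wherever
# the simulated head is at or above land surface, and the rejected
# amount becomes surface runoff. Values are hypothetical.
def apply_variable_recharge(heads, land_surface, available):
    """Return (recharge, runoff): cells with head >= land surface
    reject their 'water available for recharge' as runoff."""
    recharge, runoff = [], 0.0
    for h, ls, w in zip(heads, land_surface, available):
        if h >= ls:
            recharge.append(0.0)
            runoff += w  # rejected; rerouted to streams or downslope cells
        else:
            recharge.append(w)
    return recharge, runoff

heads        = [95.0, 101.0, 98.0]    # simulated water levels
land_surface = [100.0, 100.0, 100.0]  # average land-surface elevations
available    = [0.01, 0.01, 0.02]     # precipitation minus ET, per cell
rech, runoff = apply_variable_recharge(heads, land_surface, available)
```

The accumulated runoff is then redistributed, channeled to simulated streams or applied to cells near the valley wall, as the abstract describes.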

  3. Seismic, side-scan survey, diving, and coring data analyzed by a Macintosh II™ computer and inexpensive software provide answers to a possible offshore extension of landslides at Palos Verdes Peninsula, California

    SciTech Connect

    Dill, R.F. ); Slosson, J.E. ); McEachen, D.B. )

    1990-05-01

    A Macintosh II™ computer and commercially available software were used to analyze and depict the topography, construct an isopach sediment thickness map, plot core positions, and locate the geology of an offshore area facing an active landslide on the southern side of Palos Verdes Peninsula, California. Profile data from side scan sonar, 3.5 kHz, and Boomer subbottom, high-resolution seismic, diving, echo sounder traverses, and cores - all controlled with a mini Ranger II navigation system - were placed in MacGridzo™ and WingZ™ software programs. The computer-plotted data from seven sources were used to construct maps with overlays for evaluating the possibility of a shoreside landslide extending offshore. The poster session describes the offshore survey system and demonstrates the development of the computer database, its placement into the MacGridzo™ gridding program, and transfer of gridded navigational locations to the WingZ™ database and graphics program. Data will be manipulated to show how sea-floor features are enhanced and how isopach data were used to interpret the possibility of landslide displacement and Holocene sea level rise. The software permits rapid assessment of data using computerized overlays and a simple, inexpensive means of constructing and evaluating information in map form and the preparation of final written reports. This system could be useful in many other areas where seismic profiles, precision navigational locations, soundings, diver observations, and cores provide a great volume of information that must be compared on regional plots to develop field maps for geological evaluation and reports.

  4. Survey Says

    ERIC Educational Resources Information Center

    McCarthy, Susan K.

    2005-01-01

    Survey Says is a lesson plan designed to teach college students how to access Internet resources for valid data related to the sexual health of young people. Discussion questions based on the most recent available data from two national surveys, the Youth Risk Behavior Surveillance-United States, 2003 (CDC, 2004) and the National Survey of…

  5. Survey of Anatomy and Root Canal Morphology of Maxillary First Molars Regarding Age and Gender in an Iranian Population Using Cone-Beam Computed Tomography

    PubMed Central

    Naseri, Mandana; Safi, Yaser; Akbarzadeh Baghban, Alireza; Khayat, Akbar; Eftekhar, Leila

    2016-01-01

    Introduction: The purpose of this study was to investigate the root and canal morphology of maxillary first molars with regard to patients’ age and gender with cone-beam computed tomography (CBCT). Methods and Materials: A total of 149 CBCT scans from 92 (67.1%) female and 57 (31.3%) male patients with mean age of 40.5 years were evaluated. Tooth length, presence of root fusion, number of the roots and canals, canal types based on Vertucci’s classification, deviation of root and apical foramen in coronal and sagittal planes and the correlation of all items with gender and age were recorded. The Mann-Whitney U, Kruskal-Wallis and Fisher’s exact tests were used to analyze these items. Results: The rate of root fusion was 1.3%. Multiple canals were present in the following frequencies: four canals 78.5%, five canals 11.4% and three canals 10.1%. An additional canal was detected in 86.6% of mesiobuccal roots, in which Vertucci’s type VI configuration was the most prevalent, followed by types II and I. Type I was the most common one in distobuccal and palatal roots. There was no statistically significant difference in the canal configurations in relation to gender and age, nor in the incidence of root or canal numbers (P>0.05). The mean tooth length was 19.3 and 20.3 mm in female and male patients, respectively, which was statistically significant (P<0.05). Evaluation of root deviation showed that, most commonly, a general pattern of straight-distal occurred in the mesiobuccal and straight-straight in the distobuccal and palatal roots. In mesiobuccal roots, straight and distal deviations were more dominant in males and females, respectively (P<0.05). The prevalence of apical foramen deviation in mesiobuccal and palatal roots statistically differed with gender. Conclusion: The root and canal configuration of the Iranian population showed different features from those of other populations. PMID:27790259

  6. The Influence of Computer Training Platform on Subsequent Computer Preferences.

    ERIC Educational Resources Information Center

    Pardamean, Bens; Slovaceks, Simeon

    1995-01-01

    Reports a study that examined the impact of an introductory college computer course on users' subsequent preferences in their choice of computer (IBM versus Macintosh). Surveys found a strong positive relationship between the type of computer students used in the course and their later use and purchasing preferences. (SM)

  7. Computer Experience of Nurses.

    PubMed

    Schleder Gonçalves, Luciana; Cândida Castro, Talita; Fialek, Soraya

    2015-01-01

    This study aimed to identify the computing experience of nurses in southern Brazil through exploratory survey research. The results, obtained from the application of The Staggers Nursing Computer Experience Questionnaire®, were analyzed by statistical tests. The survey was conducted with nurses working both in hospitals and in public health, in a capital city in southern Brazil. Novice nurses predominate in the application of computer tools in their practices, but most report using computers in both their professional and personal activities. We conclude that computers and health information systems are part of the working reality of the participants and are considered indispensable resources for their activities, while noting limitations on the potential use of these tools. This study reflects on how the subject has been addressed in schools and on the challenges of including Nursing Informatics in nursing curricula in Brazil. PMID:26262313

  8. Theory Survey or Survey Theory?

    ERIC Educational Resources Information Center

    Dean, Jodi

    2010-01-01

    Matthew Moore's survey of political theorists in U.S. American colleges and universities is an impressive contribution to political science (Moore 2010). It is the first such survey of political theory as a subfield, the response rate is very high, and the answers to the survey questions provide new information about how political theorists look…

  9. Beyond Computer Planning: Managing Educational Computer Innovations.

    ERIC Educational Resources Information Center

    Washington, Wenifort

    The vast underutilization of technology in educational environments suggests the need for more research to develop models to successfully adopt and diffuse computer systems in schools. Of 980 surveys mailed to various Ohio public schools, 529 were completed and returned to help determine current attitudes and perceptions of teachers and…

  10. Robotic Surveying

    SciTech Connect

    Suzy Cantor-McKinney; Michael Kruzic

    2007-03-01

    -actuated functions to be controlled by an onboard computer. The computer-controlled Speedrower was developed at Carnegie Mellon University to automate agricultural harvesting. Harvesting tasks require the vehicle to cover a field using minimally overlapping rows at slow speeds in a similar manner to geophysical data acquisition. The Speedrower had demonstrated its ability to perform as it had already logged hundreds of acres of autonomous harvesting. This project is the first use of autonomous robotic technology on a large-scale for geophysical surveying.

  11. Survey research and societal change.

    PubMed

    Tourangeau, Roger

    2004-01-01

    Surveys reflect societal change in a way that few other research tools do. Over the past two decades, three developments have transformed surveys. First, survey organizations have adopted new methods for selecting telephone samples; these new methods were made possible by the creation of large databases that include all listed telephone numbers in the United States. A second development has been the widespread decline in response rates for all types of surveys. In the face of this problem, survey researchers have developed new theories of nonresponse that build on the persuasion literature in social psychology. Finally, surveys have adopted many new methods of data collection; the new modes reflect technological developments in computing and the emergence of the Internet. Research has spawned several theories that examine how characteristics of the data collection method shape the answers obtained. Rapid change in survey methods is likely to continue in the coming years.

  12. College Students' Attitudes toward Computers.

    ERIC Educational Resources Information Center

    Leite, Pedro T.

    This paper reports a survey conducted at a private midwestern university to investigate 143 undergraduate students' attitudes toward computers. The study used a 10-item questionnaire called General Attitudes toward Computers. Results indicated that students had positive attitudes toward computers. There were no significant differences in attitudes…

  13. Computer Education for Engineers, Part III.

    ERIC Educational Resources Information Center

    McCullough, Earl S.; Lofy, Frank J.

    1989-01-01

    Reports the results of the third survey of computer use in engineering education, conducted in the fall of 1987, comparing them with the 1981 and 1984 results. Summarizes survey data on computer course credits, languages, equipment use, CAD/CAM instruction, faculty access, and computer graphics. (YP)

  14. Cryptography, quantum computation and trapped ions

    SciTech Connect

    Hughes, Richard J.

    1998-03-01

    The significance of quantum computation for cryptography is discussed. Following a brief survey of the requirements for quantum computational hardware, an overview of the ion trap quantum computation project at Los Alamos is presented. The physical limitations to quantum computation with trapped ions are analyzed and an assessment of the computational potential of the technology is made.

  15. Computers in the World of College English.

    ERIC Educational Resources Information Center

    Tannheimer, Charlotte

    This sabbatical report surveys some computer software presently being developed, already in use, and/or available, and describes computer use in several Massachusetts colleges. A general introduction to computers, word processors, artificial intelligence, and computer assisted instruction is provided, as well as a discussion of what computers can…

  16. Heterotic computing: exploiting hybrid computational devices.

    PubMed

    Kendon, Viv; Sebald, Angelika; Stepney, Susan

    2015-07-28

    Current computational theory deals almost exclusively with single models: classical, neural, analogue, quantum, etc. In practice, researchers use ad hoc combinations, realizing only recently that they can be fundamentally more powerful than the individual parts. A Theo Murphy meeting brought together theorists and practitioners of various types of computing, to engage in combining the individual strengths to produce powerful new heterotic devices. 'Heterotic computing' is defined as a combination of two or more computational systems such that they provide an advantage over either substrate used separately. This post-meeting collection of articles provides a wide-ranging survey of the state of the art in diverse computational paradigms, together with reflections on their future combination into powerful and practical applications.

  17. Computers and Computer Resources.

    ERIC Educational Resources Information Center

    Bitter, Gary

    1980-01-01

    This resource directory provides brief evaluative descriptions of six popular home computers and lists selected sources of educational software, computer books, and magazines. For a related article on microcomputers in the schools, see p53-58 of this journal issue. (SJL)

  18. Multicultural Survey.

    ERIC Educational Resources Information Center

    Renyi, Judith, Comp.

    In May of 1992, the Alliance for Curriculum Reform (ACR) surveyed member organizations and others who had participated in ACR activities concerning their printed policies on issues relating to multicultural education. The areas of interest for the survey were: printed policy(ies) on multicultural content/curriculum; printed policy(ies) on student…

  19. SURVEY INSTRUMENT

    DOEpatents

    Borkowski, C J

    1954-01-19

    This pulse-type survey instrument is suitable for readily detecting α particles in the presence of high β and γ backgrounds. The instrument may also be used to survey for neutrons, β particles, and γ rays by employing suitably designed interchangeable probes and selecting an operating potential to correspond to the particular probe.

  20. ARM User Survey Report

    SciTech Connect

    Roeder, LR

    2010-06-22

    The objective of this survey was to obtain user feedback to, among other things, determine how to organize the exponentially growing data within the Atmospheric Radiation Measurement (ARM) Climate Research Facility, and identify users’ preferred data analysis system. The survey findings appear to have met this objective, having received approximately 300 responses that give insight into the type of work users perform, usage of the data, percentage of data analysis users might perform on an ARM-hosted computing resource, downloading volume level where users begin having reservations, opinion about usage if given more powerful computing resources (including ability to manipulate data), types of tools that would be most beneficial to them, preferred programming language and data analysis system, level of importance for certain types of capabilities, and finally, level of interest in participating in a code-sharing community.

  1. Quantum chromodynamics with advanced computing

    SciTech Connect

    Kronfeld, Andreas S.; /Fermilab

    2008-07-01

    We survey results in lattice quantum chromodynamics from groups in the USQCD Collaboration. The main focus is on physics, but many aspects of the discussion are aimed at an audience of computational physicists.

  2. Computer representation of molecular surfaces

    SciTech Connect

    Max, N.L.

    1981-07-06

    This review article surveys recent work on computer representation of molecular surfaces. Several different algorithms are discussed for producing vector or raster drawings of space-filling models formed as the union of spheres. Other smoother surfaces are also considered.

  3. Geosat survey

    NASA Astrophysics Data System (ADS)

    The Geosat Committee, a nonprofit, educational organization dedicated to improving satellite remote sensing for geological applications, is surveying the international geological community to determine the most important areas of the world for the exploration of nonrenewable resources. The results of this survey, whose sources will be kept confidential, will be given as recommendations for early satellite-scene selection to the U.S. government (via the National Oceanic and Atmospheric Administration) and to other countries with satellites or ground receiving stations.

  4. Computers and Chinese Linguistics.

    ERIC Educational Resources Information Center

    Kierman, Frank A.; Barber, Elizabeth

    This survey of the field of Chinese language computational linguistics was prepared as a background study for the Chinese Linguistics Project at Princeton. Since the authors' main purpose was "critical reconnaissance," quantitative emphasis is on systems with which they are most familiar. The complexity of the Chinese writing system has presented…

  5. Computers and Young Children

    ERIC Educational Resources Information Center

    Lacina, Jan

    2007-01-01

    Technology is a way of life for most Americans. A recent study published by the National Writing Project (2007) found that Americans believe that computers have a positive effect on writing skills. The importance of learning to use technology ranked just below learning to read and write, and 74 percent of the survey respondents noted that children…

  6. Computing Activities in Secondary Education. Final Report.

    ERIC Educational Resources Information Center

    Bukoski, William J.; Korotkin, Arthur L.

    As a followup to a 1969 study, called Project CASE, a survey was initiated to determine to what extent computers are used in secondary public schools, and to discern to what extent computers affect the quality of education. Some 5,580 randomly selected schools were questioned about their use of computers; commercial computer manufacturers were…

  7. Computers and Computer Cultures.

    ERIC Educational Resources Information Center

    Papert, Seymour

    1981-01-01

    Instruction using computers is viewed as different from most other approaches to education, by allowing more than right or wrong answers, by providing models for systematic procedures, by shifting the boundary between formal and concrete processes, and by influencing the development of thinking in many new ways. (MP)

  8. Next-generation computers

    SciTech Connect

    Torrero, E.A.

    1985-01-01

    Developments related to tomorrow's computers are discussed, taking into account advances toward the fifth generation in Japan, the challenge to U.S. supercomputers, plans concerning the creation of supersmart computers for the U.S. military, a U.S. industry response to the Japanese challenge, a survey of U.S. and European research, Great Britain, the European Common Market, codifying human knowledge for machine reading, software engineering, the next-generation software, plans for obtaining the million-transistor chip, and fabrication issues for next-generation circuits. Other topics explored are related to a status report regarding artificial intelligence, an assessment of the technical challenges, aspects of sociotechnology, and defense advanced research projects. Attention is also given to expert systems, speech recognition, computer vision, function-level programming and automated programming, computing at the speed limit, VLSI, and superpower computers.

  9. Computer Education for Dental Students.

    ERIC Educational Resources Information Center

    Jeffcoat, M. K.; And Others

    1986-01-01

    A required computer course for preparing dental students to write computer programs, use software packages, and evaluate software for purchase is described. A post-course questionnaire survey of students revealed that the majority found the course helpful. A course outline is included. (Author/MLW)

  10. Computer Music

    NASA Astrophysics Data System (ADS)

    Cook, Perry R.

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and the study of perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.).
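    As a toy illustration of the digital-signal-processing side of the field (not code from the chapter; the function name and parameters are illustrative), a single sine partial, the basic building block of additive synthesis, can be rendered in a few lines:

```python
import math

def sine_samples(freq_hz, dur_s, sample_rate=8000, amp=0.5):
    """Render one sine partial as a list of amplitude samples."""
    n = int(dur_s * sample_rate)
    return [amp * math.sin(2 * math.pi * freq_hz * i / sample_rate)
            for i in range(n)]

# 10 ms of A440 at an 8 kHz sample rate: 80 samples in [-0.5, 0.5]
tone = sine_samples(440.0, 0.01)
```

    Summing several such partials at harmonic frequencies yields richer timbres; real synthesis systems add envelopes, filtering, and psychoacoustically informed tuning on top of this kernel.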

  11. Computer animation challenges for computational fluid dynamics

    NASA Astrophysics Data System (ADS)

    Vines, Mauricio; Lee, Won-Sook; Mavriplis, Catherine

    2012-07-01

    Computer animation requirements differ from those of traditional computational fluid dynamics (CFD) investigations in that visual plausibility and rapid frame update rates trump physical accuracy. We present an overview of the main techniques for fluid simulation in computer animation, starting with Eulerian grid approaches, the Lattice Boltzmann method, Fourier transform techniques and Lagrangian particle introduction. Adaptive grid methods, precomputation of results for model reduction, parallelisation and computation on graphical processing units (GPUs) are reviewed in the context of accelerating simulation computations for animation. A survey of current specific approaches for the application of these techniques to the simulation of smoke, fire, water, bubbles, mixing, phase change and solid-fluid coupling is also included. Adding plausibility to results through particle introduction, turbulence detail and concentration on regions of interest by level set techniques has elevated the degree of accuracy and realism of recent animations. Basic approaches are described here. Techniques to control the simulation to produce a desired visual effect are also discussed. Finally, some references to rendering techniques and haptic applications are mentioned to provide the reader with a complete picture of the challenges of simulating fluids in computer animation.
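    The trade-off the survey describes — plausibility and stability over physical accuracy — is embodied in the semi-Lagrangian advection step used by many Eulerian animation solvers. A minimal 1D sketch (illustrative only, not drawn from the article):

```python
def advect(field, velocity, dt, dx):
    """Semi-Lagrangian advection of a 1D scalar field: trace each grid
    sample back along the flow and linearly interpolate the upstream
    value. The backtrace makes the step unconditionally stable, which
    is why animation favors it over accuracy-oriented CFD schemes."""
    n = len(field)
    out = [0.0] * n
    for i in range(n):
        x = i - velocity * dt / dx          # backtraced position
        x = max(0.0, min(n - 1.0, x))       # clamp to the grid
        j = int(x)
        k = min(j + 1, n - 1)
        t = x - j
        out[i] = (1 - t) * field[j] + t * field[k]
    return out

# a unit "puff" of smoke density drifts one cell to the right per step
print(advect([0, 0, 1, 0, 0], velocity=1.0, dt=1.0, dx=1.0))
```

    The interpolation smears sharp features, which is exactly the loss of detail that the turbulence-injection and particle techniques mentioned above are designed to compensate for.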

  12. Complexity Survey.

    ERIC Educational Resources Information Center

    Gordon, Sandra L.; Anderson, Beth C.

    To determine whether consensus existed among teachers about the complexity of common classroom materials, a survey was administered to 66 pre-service and in-service kindergarten and prekindergarten teachers. Participants were asked to rate 14 common classroom materials as simple, complex, or super-complex. Simple materials have one obvious part,…

  13. Computer Music

    NASA Astrophysics Data System (ADS)

    Cook, Perry

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.). Although most people would think that analog synthesizers and electronic music substantially predate the use of computers in music, many experiments and complete computer music systems were being constructed and used as early as the 1950s.

  14. Cameron Station remedial investigation: Final asbestos survey report. Final report

    SciTech Connect

    1992-02-01

    Woodward-Clyde Federal Services (WCFS) conducted a comprehensive asbestos survey of the facilities at Cameron Station as part of its contract with the US Army Toxic and Hazardous Materials Agency (USATHAMA) to perform a remedial investigation and feasibility study (RI/FS) at the base. The purpose of the survey, which was initiated August 23, 1990, in response to the Base Realignment And Closure Environmental Restoration Strategy (BRAC), was to identify friable and non-friable asbestos-containing material (ACM), provide options for abatement of asbestos, provide cost estimates for both abatement and operations and maintenance, and identify items requiring immediate action in Cameron Station's 24 buildings. BRAC states that only friable asbestos which presents a threat to health and safety shall be removed; non-friable asbestos or friable asbestos which is encapsulated or in good repair shall be left in place and identified to the buyer per GSA agreement. The investigation followed protocols that met or exceeded the requirements of 40 CFR 763, the EPA regulations promulgated under the Asbestos Hazard Emergency Response Act (AHERA).

  15. Cooling Computers.

    ERIC Educational Resources Information Center

    Birken, Marvin N.

    1967-01-01

    Numerous decisions must be made in the design of computer air conditioning, each determined by a combination of economics, physical, and esthetic characteristics, and computer requirements. Several computer air conditioning systems are analyzed--(1) underfloor supply and overhead return, (2) underfloor plenum and overhead supply with computer unit…

  16. Pygmalion's Computer.

    ERIC Educational Resources Information Center

    Peelle, Howard A.

    Computers have undoubtedly entered the educational arena, mainly in the areas of computer-assisted instruction (CAI) and artificial intelligence, but whether educators should embrace computers and exactly how they should use them are matters of great debate. The use of computers in support of educational administration is widely accepted.…

  17. World survey of CAM

    NASA Astrophysics Data System (ADS)

    Hatvany, J.; Merchant, M. E.; Rathmill, K.; Yoshikawa, H.

    The worldwide state of the art and development trends in CAM are surveyed, emphasizing flexible manufacturing systems (FMS), robotics, computer-aided process planning, and computer-aided scheduling. The use of FMS, NC machine tools, DNC systems, and unmanned and nearly unmanned factories, are discussed as the state of the art in the USA, Japan, Western Europe and Eastern Europe. For the same areas, trends are projected, including the use of graphics and languages in CAM, and metamorphic machine tools. A Delphi-type forecast and its conclusions are presented. A CAM system for manufacture is projected for 1985, the use of robots equalling humans in assembly capability for 1990, and the fifty percent replacement of direct labor in automobile final assembly by programmable automation by 1995. An attempt is made to outline a methodical approach to forecasting the development of CAM over the next 10-15 years. Key issues in CAM proliferation, including financial and social aspects, are addressed.

  18. Geophex Airborne Unmanned Survey System

    SciTech Connect

    Won, I.L.; Keiswetter, D.

    1995-12-31

    Ground-based surveys place personnel at risk due to the proximity of buried unexploded ordnance (UXO) items or by exposure to radioactive materials and hazardous chemicals. The purpose of this effort is to design, construct, and evaluate a portable, remotely-piloted, airborne, geophysical survey system. This non-intrusive system will provide stand-off capability to conduct surveys and detect buried objects, structures, and conditions of interest at hazardous locations. During a survey, the operators remain remote from, but within visual distance of, the site. The sensor system never contacts the Earth, but can be positioned near the ground so that weak geophysical anomalies can be detected. The Geophex Airborne Unmanned Survey System (GAUSS) is designed to detect and locate small-scale anomalies at hazardous sites using magnetic and electromagnetic survey techniques. The system consists of a remotely-piloted, radio-controlled, model helicopter (RCH) with flight computer, light-weight geophysical sensors, an electronic positioning system, a data telemetry system, and a computer base-station. The report describes GAUSS and its test results.

  19. Using electronic surveys in nursing research.

    PubMed

    Cope, Diane G

    2014-11-01

    Computer and Internet use in businesses and homes in the United States has dramatically increased since the early 1980s. In 2011, 76% of households reported having a computer, compared with only 8% in 1984 (File, 2013). A similar increase in Internet use has also been seen, with 72% of households reporting access of the Internet in 2011 compared with 18% in 1997 (File, 2013). This emerging trend in technology has prompted use of electronic surveys in the research community as an alternative to previous telephone and postal surveys. Electronic surveys can offer an efficient, cost-effective method for data collection; however, challenges exist. An awareness of the issues and strategies to optimize data collection using web-based surveys is critical when designing research studies. This column will discuss the different types and advantages and disadvantages of using electronic surveys in nursing research, as well as methods to optimize the quality and quantity of survey responses. PMID:25355023

  1. Very large radio surveys of the sky.

    PubMed

    Condon, J J

    1999-04-27

    Recent advances in electronics and computing have made possible a new generation of large radio surveys of the sky that yield an order-of-magnitude higher sensitivity and positional accuracy. Combined with the unique properties of the radio universe, these quantitative improvements open up qualitatively different and exciting new scientific applications of radio surveys.

  2. Very large radio surveys of the sky

    PubMed Central

    Condon, J. J.

    1999-01-01

    Recent advances in electronics and computing have made possible a new generation of large radio surveys of the sky that yield an order-of-magnitude higher sensitivity and positional accuracy. Combined with the unique properties of the radio universe, these quantitative improvements open up qualitatively different and exciting new scientific applications of radio surveys. PMID:10220365

  3. Hardware survey for the avionics test bed

    NASA Technical Reports Server (NTRS)

    Cobb, J. M.

    1981-01-01

    A survey of major hardware items that could possibly be used in the development of an avionics test bed for space shuttle attached or autonomous large space structures was conducted in NASA Johnson Space Center building 16. The results of the survey are organized to show the hardware by laboratory usage. Computer systems in each laboratory are described in some detail.

  4. Faculty of Education Students' Computer Self-Efficacy Beliefs and Their Attitudes towards Computers and Implementing Computer Supported Education

    ERIC Educational Resources Information Center

    Berkant, Hasan Güner

    2016-01-01

    This study investigates faculty of education students' computer self-efficacy beliefs and their attitudes towards computers and implementing computer supported education. This study is descriptive and based on a correlational survey model. The final sample consisted of 414 students studying in the faculty of education of a Turkish university. The…

  5. A regional land use survey based on remote sensing and other data: A report on a LANDSAT and computer mapping project, volume 1. [Arizona, Colorado, Montana, New Mexico, Utah, and Wyoming

    NASA Technical Reports Server (NTRS)

    Nez, G. (Principal Investigator); Mutter, D.

    1977-01-01

    The author has identified the following significant results. New LANDSAT analysis software and linkages with other computer mapping software were developed. Significant results were also achieved in training, communication, and identification of needs for developing the LANDSAT/computer mapping technologies into operational tools for use by decision makers.

  6. Computational principles of memory.

    PubMed

    Chaudhuri, Rishidev; Fiete, Ila

    2016-03-01

    The ability to store and later use information is essential for a variety of adaptive behaviors, including integration, learning, generalization, prediction and inference. In this Review, we survey theoretical principles that can allow the brain to construct persistent states for memory. We identify requirements that a memory system must satisfy and analyze existing models and hypothesized biological substrates in light of these requirements. We also highlight open questions, theoretical puzzles and problems shared with computer science and information theory. PMID:26906506
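    One classic example of a model that constructs persistent states is the Hopfield attractor network. The sketch below (an illustration of the idea, not code from the Review; the function names are ad hoc) stores one ±1 pattern with Hebbian weights and recovers it from a corrupted cue:

```python
def train(patterns):
    """Hebbian weight matrix: W[i][j] accumulates p[i]*p[j] over the
    stored +/-1 patterns, with a zero diagonal."""
    n = len(patterns[0])
    W = [[0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    W[i][j] += p[i] * p[j]
    return W

def recall(W, state, steps=5):
    """Synchronous sign updates: the state falls into the nearest
    attractor, i.e. a persistent pattern the network 'remembers'."""
    for _ in range(steps):
        state = [1 if sum(w * s for w, s in zip(row, state)) >= 0 else -1
                 for row in W]
    return state

pattern = [1, -1, 1, -1, 1, -1]
W = train([pattern])
noisy = [-pattern[0]] + pattern[1:]   # flip one bit of the cue
restored = recall(W, noisy)           # settles back onto the stored pattern
```

    The fixed points of these dynamics are exactly the persistent states the abstract refers to; the Review's concern is which biological substrates could realize and stabilize such attractors.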

  7. On teaching computer ethics within a computer science department.

    PubMed

    Quinn, Michael J

    2006-04-01

    The author has surveyed a quarter of the accredited undergraduate computer science programs in the United States. More than half of these programs offer a 'social and ethical implications of computing' course taught by a computer science faculty member, and there appears to be a trend toward teaching ethics classes within computer science departments. Although the decision to create an 'in house' computer ethics course may sometimes be a pragmatic response to pressure from the accreditation agency, this paper argues that teaching ethics within a computer science department can provide students and faculty members with numerous benefits. The paper lists topics that can be covered in a computer ethics course and offers some practical suggestions for making the course successful.

  8. The ASCI Network for SC '98: Dense Wave Division Multiplexing for Distributed and Distance Computing

    SciTech Connect

    Adams, R.L.; Butman, W.; Martinez, L.G.; Pratt, T.J.; Vahle, M.O.

    1999-06-01

    This document highlights the DISCOM Distance Computing and Communication team's activities at the 1998 Supercomputing conference in Orlando, Florida. This conference is sponsored by the IEEE and ACM. Sandia National Laboratories, Lawrence Livermore National Laboratory, and Los Alamos National Laboratory have participated in this conference for ten years. For the last three years, the three laboratories have shared a joint booth at the conference under the DOE's Accelerated Strategic Computing Initiative (ASCI). The DISCOM communication team uses the forum to demonstrate and focus communications and networking developments. At SC '98, DISCOM demonstrated the capabilities of Dense Wave Division Multiplexing and exhibited an OC48 ATM encryptor. We also coordinated the other networking activities within the booth. This paper documents those accomplishments, discusses the details of their implementation, and describes how these demonstrations support overall strategies in ATM networking.

  9. Children and Computers: Greek Parents' Expectations.

    ERIC Educational Resources Information Center

    Vryzas, Konstantinos; Tsitouridou, Melpomene

    2002-01-01

    This survey investigated the expectations of Greek parents with regard to the potential impact of children's computer use on the fields of education, interpersonal relationships, and professional and social life. Considers socio-cultural environment; sex and age; and whether the parents had knowledge of computers, used computers at work, or had a…

  10. Computers in Public Broadcasting: Who, What, Where.

    ERIC Educational Resources Information Center

    Yousuf, M. Osman

    This handbook offers guidance to public broadcasting managers on computer acquisition and development activities. Based on a 1981 survey of planned and current computer uses conducted by the Corporation for Public Broadcasting (CPB) Information Clearinghouse, computer systems in public radio and television broadcasting stations are listed by…

  11. Online Hand Holding in Fixing Computer Glitches

    ERIC Educational Resources Information Center

    Goldsborough, Reid

    2005-01-01

    According to most surveys, computer manufacturers such as HP put out reliable products, and computers in general are less troublesome than in the past. But personal computers are still prone to bugs, conflicts, viruses, spyware infestations, hacker and phishing attacks, and--most of all--user error. Unfortunately, technical support from computer…

  12. Computer Organizational Techniques Used by Office Personnel.

    ERIC Educational Resources Information Center

    Alexander, Melody

    1995-01-01

    According to survey responses from 404 of 532 office personnel, 81.7% enjoy working with computers; the majority save files on their hard drives, use disk labels and storage files, do not use subdirectories or compress data, and do not make backups of floppy disks. Those with higher degrees, more computer experience, and more daily computer use…

  13. The Effects of Home Computers on School Enrollment

    ERIC Educational Resources Information Center

    Fairlie, R.W.

    2005-01-01

    Approximately 9 out of 10 high school students who have access to a home computer use that computer to complete school assignments. Do these home computers, however, improve educational outcomes? Using the Computer and Internet Use Supplement to the 2001 Current Population Survey, I explore whether access to home computers increases the likelihood…

  14. Computational dosimetry

    SciTech Connect

    Siebert, B.R.L.; Thomas, R.H.

    1996-01-01

    The paper presents a definition of the term "Computational Dosimetry" that is interpreted as the sub-discipline of computational physics which is devoted to radiation metrology. It is shown that computational dosimetry is more than a mere collection of computational methods. Computational simulations directed at basic understanding and modelling are important tools provided by computational dosimetry, while another very important application is the support that it can give to the design, optimization and analysis of experiments. However, the primary task of computational dosimetry is to reduce the variance in the determination of absorbed dose (and its related quantities), for example in the disciplines of radiological protection and radiation therapy. In this paper emphasis is given to the discussion of potential pitfalls in the applications of computational dosimetry and recommendations are given for their avoidance. The need for comparison of calculated and experimental data whenever possible is strongly stressed.

  15. Computer Recreations.

    ERIC Educational Resources Information Center

    Dewdney, A. K.

    1989-01-01

    Reviews the performance of computer programs for writing poetry and prose, including MARK V. SHANEY, MELL, POETRY GENERATOR, THUNDER THOUGHT, and ORPHEUS. Discusses the writing principles of the programs. Provides additional information on computer magnification techniques. (YP)

  16. Computational Toxicology

    EPA Science Inventory

    ‘Computational toxicology’ is a broad term that encompasses all manner of computer-facilitated informatics, data-mining, and modeling endeavors in relation to toxicology, including exposure modeling, physiologically based pharmacokinetic (PBPK) modeling, dose-response modeling, ...

  17. Female Computer

    NASA Technical Reports Server (NTRS)

    1964-01-01

    Melba Roy heads the group of NASA mathematicians, known as 'computers,' who track the Echo satellites. Roy's computations help produce the orbital element timetables by which millions can view the satellite from Earth as it passes overhead.

  18. Cloud Computing

    SciTech Connect

    Pete Beckman and Ian Foster

    2009-12-04

    Chicago Matters: Beyond Burnham (WTTW). Chicago has become a world center of "cloud computing." Argonne experts Pete Beckman and Ian Foster explain what "cloud computing" is and how you probably already use it on a daily basis.

  19. Quantum computing

    PubMed Central

    Li, Shu-Shen; Long, Gui-Lu; Bai, Feng-Shan; Feng, Song-Lin; Zheng, Hou-Zhi

    2001-01-01

    Quantum computing is a quickly growing research field. This article introduces the basic concepts of quantum computing, recent developments in quantum searching, and decoherence in a possible quantum dot realization. PMID:11562459

  20. Computer Algebra.

    ERIC Educational Resources Information Center

    Pavelle, Richard; And Others

    1981-01-01

    Describes the nature and use of computer algebra and its applications to various physical sciences. Includes diagrams illustrating, among others, a computer algebra system and flow chart of operation of the Euclidean algorithm. (SK)
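    The flow chart the abstract mentions describes the Euclidean algorithm, which is compact enough to state directly. A minimal sketch (the algorithm is standard; this code is illustrative, not taken from the article):

```python
def gcd(a: int, b: int) -> int:
    """Euclid's algorithm: repeatedly replace (a, b) by (b, a mod b);
    the last nonzero remainder is the greatest common divisor."""
    while b:
        a, b = b, a % b
    return a

# 252 = 1*198 + 54, 198 = 3*54 + 36, 54 = 1*36 + 18, 36 = 2*18
print(gcd(252, 198))  # 18
```

    Computer algebra systems run the same recurrence on polynomials, where the mod step becomes polynomial division with remainder.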

  1. Competence, continuing education, and computers.

    PubMed

    Hegge, Margaret; Powers, Penny; Hendrickx, Lori; Vinson, Judith

    2002-01-01

    A survey of RNs in South Dakota was performed to determine their perceived level of competence, the extent to which their continuing nursing education (CNE) needs are being met, and their use of computers for CNE. Nationally certified nurses rated themselves significantly more competent than nurses who are not nationally certified. Fewer than half of the RNs reported their CNE needs were being met despite geographic access to CNE and programs available in their specialty. Three-fourths of nurses had computers at home while 76% had computers at work, yet fewer than 20% of nurses used these computers for CNE.

  2. Laser Surveying

    NASA Technical Reports Server (NTRS)

    1978-01-01

    NASA technology has produced a laser-aided system for surveying land boundaries in difficult terrain. It does the job more accurately than conventional methods, takes only one-third the time normally required, and is considerably less expensive. In surveying to mark property boundaries, the objective is to establish an accurate heading between two "corner" points. This is conventionally accomplished by erecting a "range pole" at one point and sighting it from the other point through an instrument called a theodolite. But how do you take a heading between two points which are not visible to each other, for instance, when tall trees, hills or other obstacles obstruct the line of sight? That was the problem confronting the U.S. Department of Agriculture's Forest Service. The Forest Service manages 187 million acres of land in 44 states and Puerto Rico. Unfortunately, National Forest System lands are not contiguous but intermingled in complex patterns with privately-owned land. In recent years much of the private land has been undergoing development for purposes ranging from timber harvesting to vacation resorts. There is a need for precise boundary definition so that both private owners and the Forest Service can manage their properties with confidence that they are not trespassing on the other's land.

  3. Farmland Survey

    NASA Technical Reports Server (NTRS)

    1988-01-01

    A 1981 U.S. Department of Agriculture (USDA) study estimated that the nation is converting farmland to non-agricultural uses at the rate of 3 million acres a year. Seeking information on farmland loss in Florida, the state legislature, in 1984, directed establishment of a program for development of accurate data to enable intelligent legislation of state growth management. Thus was born Florida's massive Mapping and Monitoring of Agricultural Lands Project (MMALP). It employs data from the NASA-developed Landsat Earth resources survey satellite system as a quicker, less expensive alternative to ground surveying. The three-year project involved an inventory of Florida's 36 million acres, classifying land as cropland, pastureland, citrus, woodlands, wetland, water, or populated areas. Direction was assigned to the Florida Department of Community Affairs (DCA) with assistance from the DOT. With the cooperation of the USDA, Soil Conservation Service, DCA decided that combining soil data with the Landsat land cover data would make available to land use planners a more comprehensive view of a county's land potential.

  4. Computer Ease.

    ERIC Educational Resources Information Center

    Drenning, Susan; Getz, Lou

    1992-01-01

    Computer Ease is an intergenerational program designed to put an Ohio elementary school's computer lab, software library, staff, and students at the disposal of older adults desiring to become computer literate. Three 90-minute instructional sessions allow seniors to experience 1-to-1 high-tech instruction by enthusiastic, nonthreatening…

  5. Parallel computers

    SciTech Connect

    Treleaven, P.

    1989-01-01

    This book presents an introduction to object-oriented, functional, and logic parallel computing on which the fifth generation of computer systems will be based. Coverage includes concepts for parallel computing languages, a parallel object-oriented system (DOOM) and its language (POOL), an object-oriented multilevel VLSI simulator using POOL, and implementation of lazy functional languages on parallel architectures.

  6. Computer Manual.

    ERIC Educational Resources Information Center

    Illinois State Office of Education, Springfield.

    This manual, designed to provide the teacher with methods of understanding the computer and its potential in the classroom, includes four units with exercises and an answer sheet. Unit 1 covers computer fundamentals, the mini computer, programming languages, an introduction to BASIC, and control instructions. Variable names and constants described…

  7. Computer Literacy.

    ERIC Educational Resources Information Center

    San Marcos Unified School District, CA.

    THE FOLLOWING IS THE FULL TEXT OF THIS DOCUMENT: After viewing many computer-literacy programs, we believe San Marcos Junior High School has developed a unique program which will truly develop computer literacy. Our hope is to give all students a comprehensive look at computers as they go through their two years here. They will not only learn the…

  8. Infrastructure Survey 2011

    ERIC Educational Resources Information Center

    Group of Eight (NJ1), 2012

    2012-01-01

    In 2011, the Group of Eight (Go8) conducted a survey on the state of its buildings and infrastructure. The survey is the third Go8 Infrastructure survey, with previous surveys being conducted in 2007 and 2009. The current survey updated some of the information collected in the previous surveys. It also collated data related to aspects of the…

  9. Computer software.

    PubMed

    Rosenthal, L E

    1986-10-01

    Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of types, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips.

  10. Computer sciences

    NASA Technical Reports Server (NTRS)

    Smith, Paul H.

    1988-01-01

    The Computer Science Program provides advanced concepts, techniques, system architectures, algorithms, and software for both space and aeronautics information sciences and computer systems. The overall goal is to provide the technical foundation within NASA for the advancement of computing technology in aerospace applications. The research program is improving the state of knowledge of fundamental aerospace computing principles and advancing computing technology in space applications such as software engineering and information extraction from data collected by scientific instruments in space. The program includes the development of special algorithms and techniques to exploit the computing power provided by high performance parallel processors and special purpose architectures. Research is being conducted in the fundamentals of data base logic and improvement techniques for producing reliable computing systems.

  11. A Survey of Collectives

    NASA Technical Reports Server (NTRS)

    Tumer, Kagan; Wolpert, David

    2004-01-01

    Due to the increasing sophistication and miniaturization of computational components, complex, distributed systems of interacting agents are becoming ubiquitous. Such systems, where each agent aims to optimize its own performance, but where there is a well-defined set of system-level performance criteria, are called collectives. The fundamental problem in analyzing/designing such systems is in determining how the combined actions of self-interested agents lead to 'coordinated' behavior on a large scale. Examples of artificial systems which exhibit such behavior include packet routing across a data network, control of an array of communication satellites, coordination of multiple deployables, and dynamic job scheduling across a distributed computer grid. Examples of natural systems include ecosystems, economies, and the organelles within a living cell. No current scientific discipline provides a thorough understanding of the relation between the structure of collectives and how well they meet their overall performance criteria. Although still very young, research on collectives has resulted in successes both in understanding and designing such systems. It is expected that as it matures and draws upon other disciplines related to collectives, this field will greatly expand the range of computationally addressable tasks. Moreover, in addition to drawing on them, such a fully developed field of collective intelligence may provide insight into already established scientific fields, such as mechanism design, economics, game theory, and population biology. This chapter provides a survey of the emerging science of collectives.

  12. Computer Literacy: Teaching Computer Ethics.

    ERIC Educational Resources Information Center

    Troutner, Joanne

    1986-01-01

    Suggests learning activities for teaching computer ethics in three areas: (1) equal access; (2) computer crime; and (3) privacy. Topics include computer time, advertising, class enrollments, copyright law, sabotage ("worms"), the Privacy Act of 1974 and the Freedom of Information Act of 1966. (JM)

  13. Computers improve sonar seabed maps

    SciTech Connect

    Not Available

    1984-05-01

    A software package for computer aided mapping of sonar (CAMOS) has been developed in Norway. It has automatic mosaic presentation, which produces fully scale-rectified side scan sonograms automatically plotted on geographical and UTM map grids. The program is the first of its kind in the world. The maps produced by this method are more accurate and detailed than those produced by conventional methods. The main applications of CAMOS are: seafloor mapping; pipeline route surveys; pipeline inspection surveys; platform site surveys; geological mapping and geotechnical investigations. With the aerial-photograph quality of the CAMOS maps, a more accurate and visual representation of the seabed is achieved.

  14. Computing with neuron nets (review)

    SciTech Connect

    Achasova, S.M.

    1992-01-01

    This survey treats neural networks as a mathematical model for parallel computation. It describes Hopfield networks and Boltzmann machines. Methods are presented for programming computations with these models. The computational-energy function is introduced for the Hopfield model; the model is used for programming networks (the examples used are pattern recognition and the travelling salesman problem). Consensus functions are defined for the Boltzmann model, and it is shown how this model can be used for the travelling salesman problem. 73 refs.
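
To make the Hopfield energy-minimization idea concrete, here is a minimal pure-Python sketch; the network size, stored pattern, and Hebbian weights are illustrative assumptions, not details from the survey:

```python
def energy(W, s):
    # Computational-energy function: E = -1/2 * sum_ij W[i][j] * s[i] * s[j]
    n = len(s)
    return -0.5 * sum(W[i][j] * s[i] * s[j] for i in range(n) for j in range(n))

def recall(W, s, sweeps=5):
    # Asynchronous updates: each unit aligns with its local field,
    # which never increases the energy.
    s = list(s)
    n = len(s)
    for _ in range(sweeps):
        for i in range(n):
            h = sum(W[i][j] * s[j] for j in range(n))
            s[i] = 1 if h >= 0 else -1
    return s

# Store one pattern with the Hebb rule (zero diagonal), then recall it
# from a corrupted cue.
pattern = [1, -1, 1, -1, 1, -1]
n = len(pattern)
W = [[0 if i == j else pattern[i] * pattern[j] for j in range(n)] for i in range(n)]

cue = list(pattern)
cue[0] = -cue[0]          # flip one bit
restored = recall(W, cue)
```

Recall drives the state downhill on the energy surface until it settles in the stored pattern (or its mirror image), which is what makes the model programmable for tasks such as pattern recognition.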

  15. Computational psychiatry.

    PubMed

    Montague, P Read; Dolan, Raymond J; Friston, Karl J; Dayan, Peter

    2012-01-01

    Computational ideas pervade many areas of science and have an integrative explanatory role in neuroscience and cognitive science. However, computational depictions of cognitive function have had surprisingly little impact on the way we assess mental illness because diseases of the mind have not been systematically conceptualized in computational terms. Here, we outline goals and nascent efforts in the new field of computational psychiatry, which seeks to characterize mental dysfunction in terms of aberrant computations over multiple scales. We highlight early efforts in this area that employ reinforcement learning and game theoretic frameworks to elucidate decision-making in health and disease. Looking forwards, we emphasize a need for theory development and large-scale computational phenotyping in human subjects.
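
As a toy illustration of the reinforcement-learning framing the authors mention, a Rescorla-Wagner style prediction-error update can be sketched as follows; the learning rate and reward schedule are arbitrary choices for illustration:

```python
def rw_update(value, reward, alpha=0.1):
    # The prediction error (delta) drives learning: the expected value
    # moves a fraction alpha of the way toward the received reward.
    delta = reward - value
    return value + alpha * delta

# An agent repeatedly rewarded with 1.0 comes to expect it. Aberrant
# learning rates or distorted prediction errors are the kind of
# per-subject quantity that computational phenotyping would estimate.
v = 0.0
for _ in range(50):
    v = rw_update(v, reward=1.0)
```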

  16. Multiresolution Wavelet Based Adaptive Numerical Dissipation Control for Shock-Turbulence Computations

    NASA Technical Reports Server (NTRS)

    Sjoegreen, B.; Yee, H. C.

    2001-01-01

    The recently developed essentially fourth-order or higher low-dissipative shock-capturing scheme of Yee, Sandham and Djomehri (1999) aims at minimizing numerical dissipation for high-speed compressible viscous flows containing shocks, shears and turbulence. To detect nonsmooth behavior and control the amount of numerical dissipation to be added, Yee et al. employed an artificial compression method (ACM) of Harten (1978) but utilized it in an entirely different context than Harten originally intended. The ACM sensor consists of two tuning parameters and is highly physical-problem dependent. To minimize the tuning of parameters and physical-problem dependence, new sensors with improved detection properties are proposed. The new sensors are derived by utilizing appropriate non-orthogonal wavelet basis functions, and they can be used to completely switch off the extra numerical dissipation outside shock layers. The non-dissipative spatial base scheme of arbitrarily high order of accuracy can be maintained without compromising its stability in all parts of the domain where the solution is smooth. Two types of redundant non-orthogonal wavelet basis functions are considered. One is the B-spline wavelet (Mallat & Zhong 1992) used by Gerritsen and Olsson (1996) in an adaptive mesh refinement method to determine regions where refinement should be done. The other is a modification of the multiresolution method of Harten (1995), converting it to a new, redundant, non-orthogonal wavelet. The wavelet sensor is then obtained by computing the estimated Lipschitz exponent of a chosen physical quantity (or vector) to be sensed on a chosen wavelet basis function. Both wavelet sensors can be viewed as dual-purpose adaptive methods leading to dynamic numerical dissipation control and improved grid-adaptation indicators. Consequently, they are useful not only for shock-turbulence computations but also for computational aeroacoustics and numerical combustion. In addition, these…
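
The flavor of such a shock sensor can be conveyed with a minimal Harten-type gradient switch; this is an illustrative form only, not the exact sensor or tuning used by Yee et al.:

```python
def acm_switch(a, p=1.0, eps=1e-12):
    # Harten-type switch: compares consecutive gradients of the sensed
    # quantity. It is near 1 at sharp gradient changes (shock layers)
    # and near 0 where the solution is smooth. p is a tuning exponent;
    # eps guards against 0/0 in constant regions.
    theta = [0.0] * len(a)
    for j in range(1, len(a) - 1):
        dl = abs(a[j] - a[j - 1]) ** p
        dr = abs(a[j + 1] - a[j]) ** p
        theta[j] = abs(dr - dl) / (dr + dl + eps)
    return theta

# Smooth ramp with one jump: the switch flags the discontinuity and
# stays near zero elsewhere, where extra dissipation can be shut off.
a = [0.0, 0.1, 0.2, 0.3, 5.0, 5.1, 5.2]
theta = acm_switch(a)
```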

  17. Computed Tomography

    NASA Astrophysics Data System (ADS)

    Castellano, Isabel; Geleijns, Jacob

    After its clinical introduction in 1973, computed tomography developed from an x-ray modality for axial imaging in neuroradiology into a versatile three-dimensional imaging modality for a wide range of applications in, for example, oncology, vascular radiology, cardiology, traumatology and even interventional radiology. Computed tomography is applied for diagnosis, follow-up studies and screening of healthy subpopulations with specific risk factors. This chapter provides a general introduction to computed tomography, covering a short history of computed tomography, technology, image quality, dosimetry, room shielding, quality control and quality criteria.

  18. Computer Software.

    ERIC Educational Resources Information Center

    Kay, Alan

    1984-01-01

    Discusses the nature and development of computer software. Programing, programing languages, types of software (including dynamic spreadsheets), and software of the future are among the topics considered. (JN)

  19. Knowledge of computer among healthcare professionals of India: a key toward e-health.

    PubMed

    Gour, Neeraj; Srivastava, Dhiraj

    2010-11-01

    Information technology has radically changed the way that many people work and think. Over the years, technology has reached a new acme, and it is no longer confined to developed countries. Developing countries such as India have kept pace with the world in modern technology. Healthcare professionals can no longer ignore the application of information technology to healthcare because they are key to e-health. This study was conducted to examine the perspectives and implications of computer use among healthcare professionals, with the objective of assessing their knowledge, use, and need of computers. A cross-sectional study of 240 healthcare professionals, including doctors, nurses, lab technicians, and pharmacists, was conducted. Each participant was interviewed using a pretested, semistructured format. Of 240 healthcare professionals, 57.91% were knowledgeable about computers. Of them, 22.08% had extensive knowledge and 35.83% had partial knowledge. Computer knowledge was greater among the age group 20-25 years (high knowledge-43.33% and partial knowledge-46.66%). Of 99 males, 21.21% were found to have good knowledge and 42.42% had partial knowledge. A majority of doctors and nurses used computers for study purposes. The remaining healthcare professionals used them mainly for entertainment, the Internet, and e-mail. A majority of all healthcare professionals (95.41%) requested computer training, which they believed would brighten their professional futures and enhance their knowledge of computers.

  20. Computational models and resource allocation for supercomputers

    NASA Technical Reports Server (NTRS)

    Mauney, Jon; Agrawal, Dharma P.; Harcourt, Edwin A.; Choe, Young K.; Kim, Sukil

    1989-01-01

    There are several different architectures used in supercomputers, with differing computational models. These different models present a variety of resource allocation problems that must be solved. The computational needs of a program must be cast in terms of the computational model supported by the supercomputer, and this must be done in a way that makes effective use of the machine's resources. This is the resource allocation problem. The computational models of available supercomputers and the associated resource allocation techniques are surveyed. It is shown that many problems and solutions appear repeatedly in very different computing environments. Some case studies are presented, showing concrete computational models and the allocation strategies used.

  1. Computer News

    ERIC Educational Resources Information Center

    Science Activities: Classroom Projects and Curriculum Ideas, 2007

    2007-01-01

    This article presents several news stories about computers and technology. (1) Applied Science Associates of Narragansett, Rhode Island is providing computer modeling technology to help locate the remains to the USS Bonhomme Richard, which sank in 1779 after claiming a Revolutionary War victory. (2) Whyville, the leading edu-tainment virtual world…

  2. Cafeteria Computers.

    ERIC Educational Resources Information Center

    Dervarics, Charles

    1992-01-01

    By relying on new computer hardware and software, school food service departments can keep better records of daily food consumption, free and reduced-price meals, inventory, production, and other essentials. The most commonly used systems fall into two basic categories: point-of-sale computers and behind-the-counter systems. State funding efforts…

  3. Computer Recreations.

    ERIC Educational Resources Information Center

    Dewdney, A. K.

    1989-01-01

    Discussed are three examples of computer graphics including biomorphs, Truchet tilings, and fractal popcorn. The graphics are shown and the basic algorithm using multiple iteration of a particular function or mathematical operation is described. An illustration of a snail shell created by computer graphics is presented. (YP)

  4. Computer Insecurity.

    ERIC Educational Resources Information Center

    Wilson, David L.

    1994-01-01

    College administrators recently appealed to students and faculty to change their computer passwords after security experts announced that tens of thousands had been stolen by computer hackers. Federal officials are investigating. Such attacks are not uncommon, but the most effective solutions are either inconvenient or cumbersome. (MSE)

  5. Computer Graphics.

    ERIC Educational Resources Information Center

    Halpern, Jeanne W.

    1970-01-01

    Computer graphics have been called the most exciting development in computer technology. At the University of Michigan, three kinds of graphics output equipment are now being used: symbolic printers, line plotters or drafting devices, and cathode-ray tubes (CRT). Six examples are given that demonstrate the range of graphics use at the University.…

  6. Computing Life

    ERIC Educational Resources Information Center

    National Institute of General Medical Sciences (NIGMS), 2009

    2009-01-01

    Computer advances now let researchers quickly search through DNA sequences to find gene variations that could lead to disease, simulate how flu might spread through one's school, and design three-dimensional animations of molecules that rival any video game. By teaming computers and biology, scientists can answer new and old questions that could…

  7. Grid Computing

    NASA Astrophysics Data System (ADS)

    Foster, Ian

    2001-08-01

    The term "Grid Computing" refers to the use, for computational purposes, of emerging distributed Grid infrastructures: that is, network and middleware services designed to provide on-demand and high-performance access to all important computational resources within an organization or community. Grid computing promises to enable both evolutionary and revolutionary changes in the practice of computational science and engineering based on new application modalities such as high-speed distributed analysis of large datasets, collaborative engineering and visualization, desktop access to computation via "science portals," rapid parameter studies and Monte Carlo simulations that use all available resources within an organization, and online analysis of data from scientific instruments. In this article, I examine the status of Grid computing circa 2000, briefly reviewing some relevant history, outlining major current Grid research and development activities, and pointing out likely directions for future work. I also present a number of case studies, selected to illustrate the potential of Grid computing in various areas of science.

  8. Computational astrophysics

    NASA Technical Reports Server (NTRS)

    Miller, Richard H.

    1987-01-01

    Astronomy is an area of applied physics in which unusually beautiful objects challenge the imagination to explain observed phenomena in terms of known laws of physics. It is a field that has stimulated the development of physical laws and of mathematical and computational methods. Current computational applications are discussed in terms of stellar and galactic evolution, galactic dynamics, and particle motions.

  9. Computational Pathology

    PubMed Central

    Louis, David N.; Feldman, Michael; Carter, Alexis B.; Dighe, Anand S.; Pfeifer, John D.; Bry, Lynn; Almeida, Jonas S.; Saltz, Joel; Braun, Jonathan; Tomaszewski, John E.; Gilbertson, John R.; Sinard, John H.; Gerber, Georg K.; Galli, Stephen J.; Golden, Jeffrey A.; Becich, Michael J.

    2016-01-01

    Context: We define the scope and needs within the new discipline of computational pathology, a discipline critical to the future of both the practice of pathology and, more broadly, medical practice in general. Objective: To define the scope and needs of computational pathology. Data Sources: A meeting was convened in Boston, Massachusetts, in July 2014 prior to the annual Association of Pathology Chairs meeting, and it was attended by a variety of pathologists, including individuals highly invested in pathology informatics as well as chairs of pathology departments. Conclusions: The meeting made recommendations to promote computational pathology, including clearly defining the field and articulating its value propositions; asserting that the value propositions for health care systems must include means to incorporate robust computational approaches to implement data-driven methods that aid in guiding individual and population health care; leveraging computational pathology as a center for data interpretation in modern health care systems; stating that realizing the value proposition will require working with institutional administrations, other departments, and pathology colleagues; declaring that a robust pipeline should be fostered that trains and develops future computational pathologists, for those with both pathology and non-pathology backgrounds; and deciding that computational pathology should serve as a hub for data-related research in health care systems. The dissemination of these recommendations to pathology and bioinformatics departments should help facilitate the development of computational pathology. PMID:26098131

  10. I, Computer

    ERIC Educational Resources Information Center

    Barack, Lauren

    2005-01-01

    What child hasn't chatted with friends through a computer? But chatting with a computer? Some Danish scientists have literally put a face on their latest software program, bringing to virtual life storyteller Hans Christian Andersen, who engages users in actual conversations. The digitized Andersen resides at the Hans Christian Andersen Museum in…

  11. Trust models in ubiquitous computing.

    PubMed

    Krukow, Karl; Nielsen, Mogens; Sassone, Vladimiro

    2008-10-28

    We recapture some of the arguments for trust-based technologies in ubiquitous computing, followed by a brief survey of some of the models of trust that have been introduced in this respect. Based on this, we argue for the need of more formal and foundational trust models.

  12. The Use of Computers in Japanese Schools.

    ERIC Educational Resources Information Center

    Watanabe, Ryo; Sawada, Toshio

    Results of surveys conducted to determine the present situation and trends in the use of computers in Japanese elementary and lower and upper secondary schools are summarized. Much of the data quoted comes from surveys by the International Association for the Evaluation of Educational Achievement and the Ministry of Education, Science, and Culture…

  13. Computers in Schools of Southeast Texas in 1994.

    ERIC Educational Resources Information Center

    Henderson, David L.; Renfrow, Raylene

    This paper reviews literature on the use of computers at work and home, computer skills needed by new teachers, and suggestions for administrators to support computer usage in schools. A survey of 52 school districts serving the Houston area of southeast Texas is reported, indicating that 22,664 computers were in use, with a mean of 436 computers…

  14. Learning To Use Computers for Future Communication Professions.

    ERIC Educational Resources Information Center

    Hurme, Pertti

    A study examined how to teach computer skills to future professionals in communications. The context of the study was the communications department in a mid-sized Finnish university. Data was collected on computer use and attitudes to computers and computer-mediated communication by means of surveys and learning journals during the Communications…

  15. A Web of Resources for Introductory Computer Science.

    ERIC Educational Resources Information Center

    Rebelsky, Samuel A.

    As the field of Computer Science has grown, the syllabus of the introductory Computer Science course has changed significantly. No longer is it a simple introduction to programming or a tutorial on computer concepts and applications. Rather, it has become a survey of the field of Computer Science, touching on a wide variety of topics from digital…

  16. Dental Student Experience and Perceptions of Computer Technology.

    ERIC Educational Resources Information Center

    Feldman, Cecile A.

    1992-01-01

    A survey of 180 dental students in 3 classes assessed student knowledge of computer-related topics and perceptions of the usefulness of computers in different areas of practice management. Computer ownership and use, computer-related courses taken, software types used, and student characteristics (age, sex, academic achievement, undergraduate…

  17. Alumni Perspectives Survey, 2010. Survey Report

    ERIC Educational Resources Information Center

    Sheikh, Sabeen

    2010-01-01

    During the months of April and September of 2009, the Graduate Management Admission Council[R] (GMAC[R]) conducted the Alumni Perspectives Survey, a longitudinal study of prior respondents to the Global Management Education Graduate Survey of management students nearing graduation. A total of 3,708 alumni responded to the April 2009 survey,…

  18. 2012 Alumni Perspectives Survey. Survey Report

    ERIC Educational Resources Information Center

    Leach, Laura

    2012-01-01

    Conducted in September 2011, this Alumni Perspectives Survey by the Graduate Management Admission Council (GMAC) is a longitudinal study of respondents to the Global Management Education Graduate Survey, the annual GMAC[R] exit survey of graduate management students in their final year of business school. This 12th annual report includes responses…

  19. Astronomical surveys and big data

    NASA Astrophysics Data System (ADS)

    Mickaelian, Areg M.

    Recent all-sky and large-area astronomical surveys and their catalogued data over the whole range of the electromagnetic spectrum, from γ-rays to radio waves, are reviewed, including Fermi-GLAST and INTEGRAL in the γ-ray, ROSAT, XMM and Chandra in X-ray, GALEX in UV, SDSS and several POSS I and POSS II-based catalogues (APM, MAPS, USNO, GSC) in the optical range, 2MASS in NIR, WISE and AKARI IRC in MIR, IRAS and AKARI FIS in FIR, NVSS and FIRST in the radio range, and many others, as well as the most important surveys giving optical images (DSS I and II, SDSS, etc.), proper motions (Tycho, USNO, Gaia), variability (GCVS, NSVS, ASAS, Catalina, Pan-STARRS), and spectroscopic data (FBS, SBS, Case, HQS, HES, SDSS, CALIFA, GAMA). An overall understanding of the coverage along the whole wavelength range and comparisons between various surveys are given: galaxy redshift surveys, QSO/AGN, radio, Galactic structure, and Dark Energy surveys. Astronomy has entered the Big Data era, with Astrophysical Virtual Observatories and Computational Astrophysics playing an important role in using and analyzing big data for new discoveries.

  20. Universal computer test stand (recommended computer test requirements). [for space shuttle computer evaluation

    NASA Technical Reports Server (NTRS)

    1973-01-01

    Techniques are considered which would be used to characterize aerospace computers, with the space shuttle application as end usage. The system-level digital problems which have been encountered and documented are surveyed. From the large cross section of tests, an optimum set is recommended that has a high probability of discovering documented system-level digital problems within laboratory environments. A baseline hardware/software system is defined that is required as a laboratory tool to test aerospace computers. Hardware and software baselines and the additions necessary to interface the UTE to aerospace computers for test purposes are outlined.

  1. Expanding the View of Preservice Teachers' Computer Literacy: Implications from Written and Verbal Data and Metaphors as Freehand Drawings.

    ERIC Educational Resources Information Center

    Sherry, Annette C.

    2000-01-01

    Examines changes in attitudes towards computers and basic computer skills of preservice teachers participating in a two-year, school-based teacher training program. Written responses to a pre/post administration of the Computer Attitude Survey and computer skills survey, freehand drawings of metaphors expressing computer use, and verbal responses…

  2. Coal-seismic, desktop computer programs in BASIC; Part 7, Display and compute shear-pair seismograms

    USGS Publications Warehouse

    Hasbrouck, W.P.

    1983-01-01

    Processing of geophysical data taken with the U.S. Geological Survey's coal-seismic system is done with a desk-top, stand-alone computer. Programs for this computer are written in the extended BASIC language utilized by the Tektronix 4051 Graphic System. This report discusses and presents five computer programs used to display and compute shear-pair seismograms.

  3. Computer Science Research at Langley

    NASA Technical Reports Server (NTRS)

    Voigt, S. J. (Editor)

    1982-01-01

    A workshop was held at Langley Research Center, November 2-5, 1981, to highlight ongoing computer science research at Langley and to identify additional areas of research based upon the computer user requirements. A panel discussion was held in each of nine application areas, and these are summarized in the proceedings. Slides presented by the invited speakers are also included. A survey of scientific, business, data reduction, and microprocessor computer users helped identify areas of focus for the workshop. Several areas of computer science which are of most concern to the Langley computer users were identified during the workshop discussions. These include graphics, distributed processing, programmer support systems and tools, database management, and numerical methods.

  4. Applied technology center business plan and market survey

    NASA Technical Reports Server (NTRS)

    Hodgin, Robert F.; Marchesini, Roberto

    1990-01-01

    The business plan and market survey for the Applied Technology Center (ATC), a nonprofit corporation for computer technology transfer and development, are presented. The mission of the ATC is to stimulate innovation in state-of-the-art and leading-edge computer-based technology. The ATC encourages the practical utilization of late-breaking computer technologies by firms of all varieties.

  5. Personal Computers.

    ERIC Educational Resources Information Center

    Toong, Hoo-min D.; Gupta, Amar

    1982-01-01

    Describes the hardware, software, applications, and current proliferation of personal computers (microcomputers). Includes discussions of microprocessors, memory, output (including printers), application programs, the microcomputer industry, and major microcomputer manufacturers (Apple, Radio Shack, Commodore, and IBM). (JN)

  6. Sort computation

    NASA Technical Reports Server (NTRS)

    Dorband, John E.

    1988-01-01

    Sorting has long been used to organize data in preparation for further computation, but sort computation allows some types of computation to be performed during the sort. Sort aggregation and sort distribution are the two basic forms of sort computation. Sort aggregation generates an accumulative or aggregate result for each group of records and places this result in one of the records. An aggregate operation can be any operation that is both associative and commutative, i.e., any operation whose result does not depend on the order of the operands or the order in which the operations are performed. Sort distribution copies the value from a field of a specific record in a group into that field in every record of that group.
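
The sort-aggregation idea can be sketched in a few lines of Python; the records and the sum operation here are illustrative, and any associative, commutative operation would serve equally well:

```python
from itertools import groupby
from operator import itemgetter

def sort_aggregate(records, key, field, op):
    # Sort by the group key, then fold an associative, commutative
    # operation over each group's field, producing one result record
    # per group that carries the aggregate.
    out = []
    ordered = sorted(records, key=itemgetter(key))
    for k, group in groupby(ordered, key=itemgetter(key)):
        rows = list(group)
        agg = rows[0][field]
        for r in rows[1:]:
            agg = op(agg, r[field])
        out.append({key: k, field: agg})
    return out

sales = [
    {"region": "east", "amount": 10},
    {"region": "west", "amount": 5},
    {"region": "east", "amount": 7},
]
totals = sort_aggregate(sales, "region", "amount", lambda x, y: x + y)
```

Sort distribution, the complementary operation, would instead broadcast a chosen record's field value back onto every record of its group.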

  7. LHC Computing

    SciTech Connect

    Lincoln, Don

    2015-07-28

    The LHC is the world’s highest energy particle accelerator and scientists use it to record an unprecedented amount of data. This data is recorded in electronic format and it requires an enormous computational infrastructure to convert the raw data into conclusions about the fundamental rules that govern matter. In this video, Fermilab’s Dr. Don Lincoln gives us a sense of just how much data is involved and the incredible computer resources that makes it all possible.

  8. Advanced computing

    NASA Technical Reports Server (NTRS)

    1991-01-01

    Advanced concepts in hardware, software and algorithms are being pursued for application in next generation space computers and for ground based analysis of space data. The research program focuses on massively parallel computation and neural networks, as well as optical processing and optical networking which are discussed under photonics. Also included are theoretical programs in neural and nonlinear science, and device development for magnetic and ferroelectric memories.

  9. A search for stratiform massive-sulfide exploration targets in Appalachian Devonian rocks; a case study using computer-assisted attribute-coincidence mapping

    USGS Publications Warehouse

    Wedow, Helmuth

    1983-01-01

    The empirical model for sediment-associated, stratiform, exhalative, massive-sulfide deposits presented by D. Large in 1979 and 1980 has been redesigned to permit its use in a computer-assisted search for exploration-target areas in Devonian rocks of the Appalachian region using attribute-coincidence mapping (ACM). Some 36 gridded-data maps and selected maps derived therefrom were developed to show the orthogonal patterns, using the 7-1/2 minute quadrangle as an information cell, of geologic data patterns relevant to the empirical model. From these map and data files, six attribute-coincidence maps were prepared to illustrate both variation in the application of ACM techniques and the extent of possible significant exploration-target areas. As a result of this preliminary work in ACM, four major (and some lesser) exploration-target areas needing further study and analysis have been defined as follows: 1) in western and central New York in the outcrop area of lowermost Upper Devonian rocks straddling the Clarendon-Linden fault; 2) in western Virginia and eastern West Virginia in an area largely coincident with the well-known 'Oriskany' Mn-Fe ores; 3) an area in West Virginia, Maryland, and Virginia along and nearby the trend of the Alabama-New York lineament of King and Zietz approximately between 38- and 40-degrees N. latitude; and 4) an area in northeastern Ohio overlying an area coincident with a significant thickness of Silurian salt and high modern seismic activity. Some lesser, smaller areas suggested by relatively high coincidence may also be worthy of further study.
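
The core counting step of attribute-coincidence mapping can be sketched as follows; the grids are hypothetical binary attribute layers, with one cell per quadrangle, not data from the study:

```python
def coincidence_map(attribute_grids):
    # Each binary grid marks the cells (quadrangles) where one attribute
    # of the deposit model is present; the coincidence map counts, per
    # cell, how many attributes co-occur there.
    rows, cols = len(attribute_grids[0]), len(attribute_grids[0][0])
    return [[sum(grid[r][c] for grid in attribute_grids) for c in range(cols)]
            for r in range(rows)]

# Three hypothetical attribute layers over a 2x3 block of quadrangles.
layers = [
    [[1, 0, 1], [0, 1, 0]],
    [[1, 1, 0], [0, 1, 0]],
    [[1, 0, 0], [1, 1, 0]],
]
cm = coincidence_map(layers)  # cells with high counts are candidate targets
```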

  10. Computational chemistry

    NASA Technical Reports Server (NTRS)

    Arnold, J. O.

    1987-01-01

    With the advent of supercomputers and modern computational chemistry algorithms and codes, a powerful tool was created to help fill NASA's continuing need for information on the properties of matter in hostile or unusual environments. Computational resources provided under the National Aerodynamics Simulator (NAS) program were a cornerstone for recent advancements in this field. Properties of gases, materials, and their interactions can be determined from solutions of the governing equations. In the case of gases, for example, radiative transition probabilities per particle, bond-dissociation energies, and rates of simple chemical reactions can be determined computationally as reliably as from experiment. The data are proving to be quite valuable in providing inputs to real-gas flow simulation codes used to compute aerothermodynamic loads on NASA's aeroassist orbital transfer vehicles and a host of problems related to the National Aerospace Plane Program. Although more approximate, similar solutions can be obtained for ensembles of atoms simulating small particles of materials with and without the presence of gases. Computational chemistry also has applications in studying catalysis and the properties of polymers, both of interest to various NASA missions, including those previously mentioned. In addition to discussing these applications of computational chemistry within NASA, the governing equations and the need for supercomputers for their solution are outlined.

  11. Chromatin Computation

    PubMed Central

    Bryant, Barbara

    2012-01-01

    In living cells, DNA is packaged along with protein and RNA into chromatin. Chemical modifications to nucleotides and histone proteins are added, removed and recognized by multi-functional molecular complexes. Here I define a new computational model, in which chromatin modifications are information units that can be written onto a one-dimensional string of nucleosomes, analogous to the symbols written onto cells of a Turing machine tape, and chromatin-modifying complexes are modeled as read-write rules that operate on a finite set of adjacent nucleosomes. I illustrate the use of this “chromatin computer” to solve an instance of the Hamiltonian path problem. I prove that chromatin computers are computationally universal – and therefore more powerful than the logic circuits often used to model transcription factor control of gene expression. Features of biological chromatin provide a rich instruction set for efficient computation of nontrivial algorithms in biological time scales. Modeling chromatin as a computer shifts how we think about chromatin function, suggests new approaches to medical intervention, and lays the groundwork for the engineering of a new class of biological computing machines. PMID:22567109
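
The model described above lends itself to direct simulation: a tape of nucleosomes, each carrying a set of marks, rewritten by rules that match a window of adjacent nucleosomes. A toy sketch of that read-write cycle (the rule, mark names, and window width are invented for illustration, not taken from the paper):

```python
def step(tape, rules, width=2):
    """One synchronous update: every window is matched against the old
    tape, and the writes of the first matching rule go to a fresh copy."""
    old = [frozenset(n) for n in tape]
    new = [set(n) for n in tape]
    for i in range(len(old) - width + 1):
        window = old[i:i + width]
        for pattern, writes in rules:
            # pattern: marks required on each nucleosome in the window
            if all(req <= nuc for req, nuc in zip(pattern, window)):
                for j, (add, rem) in enumerate(writes):
                    new[i + j] |= add   # write marks
                    new[i + j] -= rem   # erase marks
                break
    return new

# Toy rule: a "me3"-marked nucleosome writes "me3" onto its right
# neighbor -- a caricature of processive mark spreading.
rules = [(({"me3"}, set()), ((set(), set()), ({"me3"}, set())))]

tape = [{"me3"}, set(), set(), set()]
for _ in range(3):
    tape = step(tape, rules)
print(tape)  # the mark has spread across the whole tape
```

The paper's universality claim rests on richer rule sets than this one; the sketch only shows the tape-and-rule machinery the abstract describes.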

  12. The influence of computer literacy and computer anxiety on computer self-efficacy: the moderating effect of gender.

    PubMed

    Lee, Chun-Lin; Huang, Ming-Kuei

    2014-03-01

    Although researchers have published many studies on computer literacy and anxiety related to computer self-efficacy, there are two gaps in relevant literature. First, the effects of computer literacy and computer anxiety on computer self-efficacy are considered separately, yet their interaction effect is neglected. Second, the role of individual gender characteristics in the relationships between computer literacy and anxiety on computer self-efficacy is far from clear. To address these two concerns, this study empirically investigates the interaction effect between computer literacy and computer anxiety, and the moderating role of gender. This study tests hypotheses using survey data from people who have experience using computers in Taiwan, and uses hierarchical regression to analyze the models. Results indicate that computer literacy can help form positive computer self-efficacy more effectively for males than for females, and computer anxiety can lead to more negative computer self-efficacy for females than for males. A three-way interaction also exists among computer literacy, computer anxiety, and gender. The results, research contributions, and limitations are discussed, and implications for future studies are suggested.

  13. Computational structures for robotic computations

    NASA Technical Reports Server (NTRS)

    Lee, C. S. G.; Chang, P. R.

    1987-01-01

    The computational problems of inverse kinematics and inverse dynamics of robot manipulators are discussed, taking advantage of parallelism and pipelining architectures. For the computation of the inverse kinematic position solution, a maximally pipelined CORDIC architecture has been designed based on a functional decomposition of the closed-form joint equations. For the inverse dynamics computation, an efficient p-fold parallel algorithm that overcomes the recurrence problem of the Newton-Euler equations of motion to achieve the time lower bound of O(log2 n) has also been developed.
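
The recurrence problem mentioned here is the serial chain x[i] = a[i]*x[i-1] + b[i] hidden in the Newton-Euler propagation from link to link. The standard way to reach O(log2 n) depth is a parallel prefix scan over the associative composition of affine maps; the scalar sketch below shows the recursive-doubling pattern (a generic illustration, not the paper's actual manipulator equations):

```python
def scan_linear_recurrence(a, b, x0=0.0):
    """Evaluate x[i] = a[i]*x[i-1] + b[i] by composing the affine maps
    (a, b) with the associative rule (a2,b2)o(a1,b1) = (a2*a1, a2*b1 + b2).
    The while loop runs O(log2 n) rounds; each round's updates are
    independent and could execute in parallel."""
    n = len(a)
    comp = list(zip(a, b))      # comp[i] maps x0 to x[i] once the scan is done
    shift = 1
    while shift < n:
        nxt = comp[:]
        for i in range(shift, n):          # independent updates -> parallel
            a1, b1 = comp[i - shift]
            a2, b2 = comp[i]
            nxt[i] = (a2 * a1, a2 * b1 + b2)
        comp = nxt
        shift *= 2
    return [ai * x0 + bi for ai, bi in comp]

a = [0.5, 2.0, 1.0, 3.0]
b = [1.0, 0.0, 2.0, 1.0]
print(scan_linear_recurrence(a, b))   # matches the serial recurrence
```

The serial evaluation takes n dependent steps; the scan trades extra work for logarithmic depth, which is exactly the trade the abstract's lower bound formalizes.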

  14. Computing Strategies in Small Universities and Colleges.

    ERIC Educational Resources Information Center

    Coughlin, Patrick J.

    A survey was conducted to identify the patterns of academic and administrative computer services in use--or planned for the near future--in small colleges and universities as they relate to such strategic policy areas as: (1) management/governance structure; (2) personnel-staff; (3) personnel-faculty; (4) academic computing; (5) library services;…

  15. Computer Viruses: An Assessment of Student Perceptions.

    ERIC Educational Resources Information Center

    Jones, Mary C.; Arnett, Kirk P.

    1992-01-01

    A majority of 213 college business students surveyed had knowledge of computer viruses; one-fourth had been exposed to them. Many believed that computer professionals are responsible for prevention and cure. Educators should make students aware of multiple sources of infection, the breadth and extent of possible damage, and viral detection and…

  16. Improving radiation survey data using CADD/CAE

    SciTech Connect

    Palau, G.L.; Tarpinian, J.E.

    1987-01-01

    A new application of computer-aided design and drafting (CADD) and computer-aided engineering (CAE) at the Three Mile Island Unit 2 (TMI-2) cleanup is improving the quality of radiation survey data taken in the plant. The use of CADD/CAE-generated survey maps has increased both the accuracy of survey data and the capability to perform analyses with these data. In addition, health physics technician man hours and radiation exposure can be reduced in situations where the CADD/CAE-generated drawings are used for survey mapping.

  17. Community Perception Survey, 2001.

    ERIC Educational Resources Information Center

    Rasmussen, Patricia; Silverman, Barbara

    This document is a report on the 2001 Community Perception Survey administered by Mt. San Antonio College (SAC) (California). The survey gathered public perception data of SAC services and programs. The survey was mailed to 773 service area community leaders; 160 (21%) responded. Survey results showed that: (1) 70% had knowledge of SAC programs…

  18. ACSI Survey 2014

    Atmospheric Science Data Center

    2014-08-26

    Upcoming EOSDIS Survey   Dear Colleagues,   In the next few days, you will ... on behalf of NASA. This message will ask you to complete a survey for users of NASA Earth science data and services, which includes the ... System (EOSDIS) science data centers evaluated by this survey. The purpose of this survey is to help NASA and the DAACs assess ...

  19. [DNA computing].

    PubMed

    Błasiak, Janusz; Krasiński, Tadeusz; Popławski, Tomasz; Sakowski, Sebastian

    2011-01-01

    Biocomputers can be an alternative to traditional "silicon-based" computers, whose continued development may be limited by further miniaturization (constrained by the Heisenberg uncertainty principle) and by the growing volume of information transferred between the central processing unit and the main memory (the von Neumann bottleneck). The idea of DNA computing came true for the first time in 1994, when Adleman solved the Hamiltonian Path Problem using short DNA oligomers and DNA ligase. In the early 2000s a series of biocomputer models was presented, beginning with the seminal work of Shapiro and his colleagues, who presented a molecular two-state finite automaton in which the restriction enzyme FokI constituted the hardware and short DNA oligomers served as both software and input/output signals. DNA molecules also provided the energy for this machine. DNA computing can be exploited in many applications, from studies of gene expression patterns to the diagnosis and therapy of cancer. Research on DNA computing is still in progress, both in vitro and in vivo, and its promising results give hope for a breakthrough in computer science. PMID:21735816
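
Adleman's experiment worked by generate-and-filter: the ligation step formed random vertex paths massively in parallel, and successive lab steps discarded strands with the wrong endpoints, the wrong length, or a missing vertex. The same filter cascade in software, on a toy graph (the vertex numbering and edge set are illustrative):

```python
from itertools import product

def hamiltonian_paths(edges, n, start, end):
    # "Ligation": enumerate all vertex sequences of length n (the DNA
    # soup forms these in parallel), then apply Adleman's filters.
    candidates = product(range(n), repeat=n)
    keep = (p for p in candidates
            if p[0] == start and p[-1] == end                   # endpoints
            and all((u, v) in edges for u, v in zip(p, p[1:]))  # edges exist
            and len(set(p)) == n)                               # each vertex once
    return list(keep)

edges = {(0, 1), (1, 2), (2, 3), (0, 2), (3, 1)}
print(hamiltonian_paths(edges, 4, 0, 3))  # [(0, 1, 2, 3)]
```

In silicon this enumeration is exponential; the point of Adleman's demonstration was that the chemistry performs the generate step in massive parallelism.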

  20. Computational mechanics

    SciTech Connect

    Goudreau, G.L.

    1993-03-01

    The Computational Mechanics thrust area sponsors research into the underlying solid, structural and fluid mechanics and heat transfer necessary for the development of state-of-the-art general purpose computational software. The scale of computational capability spans office workstations, departmental computer servers, and Cray-class supercomputers. The DYNA, NIKE, and TOPAZ codes have achieved world fame through our broad collaborators program, in addition to their strong support of on-going Lawrence Livermore National Laboratory (LLNL) programs. Several technology transfer initiatives have been based on these established codes, teaming LLNL analysts and researchers with counterparts in industry, extending code capability to specific industrial interests of casting, metalforming, and automobile crash dynamics. The next-generation solid/structural mechanics code, ParaDyn, is targeted toward massively parallel computers, which will extend performance from gigaflop to teraflop power. Our work for FY-92 is described in the following eight articles: (1) Solution Strategies: New Approaches for Strongly Nonlinear Quasistatic Problems Using DYNA3D; (2) Enhanced Enforcement of Mechanical Contact: The Method of Augmented Lagrangians; (3) ParaDyn: New Generation Solid/Structural Mechanics Codes for Massively Parallel Processors; (4) Composite Damage Modeling; (5) HYDRA: A Parallel/Vector Flow Solver for Three-Dimensional, Transient, Incompressible Viscous Flow; (6) Development and Testing of the TRIM3D Radiation Heat Transfer Code; (7) A Methodology for Calculating the Seismic Response of Critical Structures; and (8) Reinforced Concrete Damage Modeling.

  1. The AAS Workforce Survey

    NASA Astrophysics Data System (ADS)

    Postman, Marc; Norman, D. J.; Evans, N. R.; Ivie, R.

    2014-01-01

    The AAS Demographics Committee, on behalf of the AAS, was tasked with initiating a biennial survey to improve the Society's ability to serve its members and to inform the community about changes in the community's demographics. A survey, based in part on similar surveys for other scientific societies, was developed in the summer of 2012 and was publicly launched in January 2013. The survey randomly targeted 2500 astronomers who are members of the AAS. The survey was closed 4 months later (April 2013). The response rate was excellent - 63% (1583 people) completed the survey. I will summarize the results from this survey, highlighting key results and plans for their broad dissemination.

  2. Computational Psychiatry

    PubMed Central

    Wang, Xiao-Jing; Krystal, John H.

    2014-01-01

    Psychiatric disorders such as autism and schizophrenia arise from abnormalities in brain systems that underlie cognitive, emotional and social functions. The brain is enormously complex and its abundant feedback loops on multiple scales preclude intuitive explication of circuit functions. In close interplay with experiments, theory and computational modeling are essential for understanding how, precisely, neural circuits generate flexible behaviors and how their impairments give rise to psychiatric symptoms. This Perspective highlights recent progress in applying computational neuroscience to the study of mental disorders. We outline basic approaches, including identification of core deficits that cut across disease categories, biologically-realistic modeling bridging cellular and synaptic mechanisms with behavior, and model-aided diagnosis. The need for new research strategies in psychiatry is urgent. Computational psychiatry potentially provides powerful tools for elucidating pathophysiology that may inform both diagnosis and treatment. To achieve this promise will require investment in cross-disciplinary training and research in this nascent field. PMID:25442941

  3. Computational mechanics

    SciTech Connect

    Raboin, P J

    1998-01-01

    The Computational Mechanics thrust area is a vital and growing facet of the Mechanical Engineering Department at Lawrence Livermore National Laboratory (LLNL). This work supports the development of computational analysis tools in the areas of structural mechanics and heat transfer. Over 75 analysts depend on thrust area-supported software running on a variety of computing platforms to meet the demands of LLNL programs. Interactions with the Department of Defense (DOD) High Performance Computing and Modernization Program and the Defense Special Weapons Agency are of special importance as they support our ParaDyn project in its development of new parallel capabilities for DYNA3D. Working with DOD customers has been invaluable to driving this technology in directions mutually beneficial to the Department of Energy. Other projects associated with the Computational Mechanics thrust area include work with the Partnership for a New Generation Vehicle (PNGV) for ''Springback Predictability'' and with the Federal Aviation Administration (FAA) for the ''Development of Methodologies for Evaluating Containment and Mitigation of Uncontained Engine Debris.'' In this report for FY-97, there are five articles detailing three code development activities and two projects that synthesized new code capabilities with new analytic research in damage/failure and biomechanics. The articles this year are: (1) Energy- and Momentum-Conserving Rigid-Body Contact for NIKE3D and DYNA3D; (2) Computational Modeling of Prosthetics: A New Approach to Implant Design; (3) Characterization of Laser-Induced Mechanical Failure Damage of Optical Components; (4) Parallel Algorithm Research for Solid Mechanics Applications Using Finite Element Analysis; and (5) An Accurate One-Step Elasto-Plasticity Algorithm for Shell Elements in DYNA3D.

  4. Computer viruses

    NASA Technical Reports Server (NTRS)

    Denning, Peter J.

    1988-01-01

    The worm, Trojan horse, bacterium, and virus are destructive programs that attack information stored in a computer's memory. Virus programs, which propagate by incorporating copies of themselves into other programs, are a growing menace in the late-1980s world of unprotected, networked workstations and personal computers. Limited immunity is offered by memory protection hardware, digitally authenticated object programs, and antibody programs that kill specific viruses. Additional immunity can be gained from the practice of digital hygiene, primarily the refusal to use software from untrusted sources. Full immunity requires attention in a social dimension, the accountability of programmers.

  5. LHC Computing

    ScienceCinema

    Lincoln, Don

    2016-07-12

    The LHC is the world’s highest energy particle accelerator and scientists use it to record an unprecedented amount of data. This data is recorded in electronic format and it requires an enormous computational infrastructure to convert the raw data into conclusions about the fundamental rules that govern matter. In this video, Fermilab’s Dr. Don Lincoln gives us a sense of just how much data is involved and the incredible computer resources that makes it all possible.

  6. Computer systems

    NASA Technical Reports Server (NTRS)

    Olsen, Lola

    1992-01-01

    In addition to the discussions, Ocean Climate Data Workshop hosts gave participants an opportunity to hear about, see, and test for themselves some of the latest computer tools now available for those studying climate change and the oceans. Six speakers described computer systems and their functions. The introductory talks were followed by demonstrations to small groups of participants and some opportunities for participants to get hands-on experience. After this familiarization period, attendees were invited to return during the course of the Workshop and have one-on-one discussions and further hands-on experience with these systems. Brief summaries or abstracts of introductory presentations are addressed.

  7. Distributed GPU Computing in GIScience

    NASA Astrophysics Data System (ADS)

    Jiang, Y.; Yang, C.; Huang, Q.; Li, J.; Sun, M.

    2013-12-01

    Geoscientists strive to discover potential principles and patterns hidden inside ever-growing Big Data for scientific discoveries. To better achieve this objective, more capable computing resources are required to process, analyze and visualize Big Data (Ferreira et al., 2003; Li et al., 2013). Current CPU-based computing techniques cannot promptly meet the computing challenges caused by the increasing amount of datasets from different domains, such as social media, earth observation, and environmental sensing (Li et al., 2013). Meanwhile, CPU-based computing resources structured as clusters or supercomputers are costly. In the past several years, as GPU-based technology has matured in both capability and performance, GPU-based computing has emerged as a new computing paradigm. Compared to the traditional microprocessor, the modern GPU is a compelling alternative with outstanding parallel processing capability, cost-effectiveness, and efficiency (Owens et al., 2008), although it was initially designed for graphical rendering in the visualization pipeline. This presentation reports a distributed GPU computing framework for integrating GPU-based computing within a distributed environment. Within this framework, 1) on each single computer, both GPU-based and CPU-based computing resources can be fully utilized to improve the performance of visualizing and processing Big Data; 2) within a network environment, a variety of computers can be used to build a virtual supercomputer to support CPU-based and GPU-based computing in a distributed computing environment; 3) GPUs, as specialized graphics devices, are used to greatly improve the rendering efficiency in distributed geo-visualization, especially for 3D/4D visualization. Key words: Geovisualization, GIScience, Spatiotemporal Studies. Reference: 1. Ferreira de Oliveira, M. C., & Levkowitz, H. (2003). From visual data exploration to visual data mining: A survey. IEEE Transactions on Visualization and Computer Graphics.

  8. Computer Routing.

    ERIC Educational Resources Information Center

    Malone, Roger

    1991-01-01

    Computerized bus-routing systems plot the most efficient routes, cut the time it takes to draw routes, and generate reports quickly and accurately. However, school districts often underestimate the amount of work necessary to get information into the computer database. (MLF)

  9. Computer Corner.

    ERIC Educational Resources Information Center

    Mason, Margie

    1985-01-01

    This article: describes how to prevent pins on game paddles from breaking; suggests using needlepoint books for ideas to design computer graphics; lists a BASIC program to create a Christmas tree, with extension activities; suggests a LOGO Christmas activity; and describes a book on the development of microcomputers. (JN)

  10. Business Computers.

    ERIC Educational Resources Information Center

    Canipe, Stephen L.

    A brief definition of some fundamentals of microcomputers and of the ways they may be used in small businesses can help potential buyers make informed purchases. Hardware (the mechanical devices from which computers are made) described here are the video display, keyboard, central processing unit, "random access" and "read only" memories, cassette…

  11. Computational trigonometry

    SciTech Connect

    Gustafson, K.

    1994-12-31

    By means of the author's earlier theory of antieigenvalues and antieigenvectors, a new computational approach to iterative methods is presented. This enables an explicit trigonometric understanding of iterative convergence and provides new insights into the sharpness of error bounds. Direct applications to Gradient descent, Conjugate gradient, GCR(k), Orthomin, CGN, GMRES, CGS, and other matrix iterative schemes will be given.
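
For a symmetric positive-definite matrix, Gustafson's first antieigenvalue is cos φ(A) = 2√(λmin λmax)/(λmin + λmax), the cosine of the largest angle by which A can turn a vector, attained on a balanced mix of the two extreme eigenvectors. A numerical check of that identity (a sketch of the underlying trigonometric quantity only; the abstract's iterative-method bounds build on it):

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))
A = B @ B.T + 4 * np.eye(4)               # symmetric positive definite

w, V = np.linalg.eigh(A)
lmin, lmax = w[0], w[-1]
closed_form = 2 * np.sqrt(lmin * lmax) / (lmin + lmax)

# An antieigenvector: the extreme eigenvectors mixed with weights
# sqrt(lmax) and sqrt(lmin); the cosine it achieves equals closed_form.
x = np.sqrt(lmax) * V[:, 0] + np.sqrt(lmin) * V[:, -1]
turning_cosine = (x @ A @ x) / (np.linalg.norm(A @ x) * np.linalg.norm(x))
print(closed_form, turning_cosine)        # the two values agree
```

The corresponding sine, sin φ(A) = (λmax - λmin)/(λmax + λmin), is the familiar contraction factor in the classical gradient-descent error bound, which is the "explicit trigonometric understanding" the abstract refers to.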

  12. Computational Physics.

    ERIC Educational Resources Information Center

    Borcherds, P. H.

    1986-01-01

    Describes an optional course in "computational physics" offered at the University of Birmingham. Includes an introduction to numerical methods and presents exercises involving fast-Fourier transforms, non-linear least-squares, Monte Carlo methods, and the three-body problem. Recommends adding laboratory work into the course in the future. (TW)

  13. Computer Recreations.

    ERIC Educational Resources Information Center

    Dewdney, A. K.

    1988-01-01

    Describes the creation of the computer program "BOUNCE," designed to simulate a weighted piston coming into equilibrium with a cloud of bouncing balls. The model follows the ideal gas law. Utilizes the critical event technique to create the model. Discusses another program, "BOOM," which simulates a chain reaction. (CW)

  14. Networking computers.

    PubMed

    McBride, D C

    1997-03-01

    This decade the role of the personal computer has shifted dramatically from a desktop device designed to increase individual productivity and efficiency to an instrument of communication linking people and machines in different places with one another. A computer in one city can communicate with another that may be thousands of miles away. Networking is how this is accomplished. Just like the voice network used by the telephone, computer networks transmit data and other information via modems over these same telephone lines. A network can be created over both short and long distances. Networks can be established within a hospital or medical building or over many hospitals or buildings covering many geographic areas. Those confined to one location are called LANs, local area networks. Those that link computers in one building to those at other locations are known as WANs, or wide area networks. The ultimate wide area network is the one we've all been hearing so much about these days--the Internet, and its World Wide Web. Setting up a network is a process that requires careful planning and commitment. To avoid potential pitfalls and to make certain the network you establish meets your needs today and several years down the road, several steps need to be followed. This article reviews the initial steps involved in getting ready to network.
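
The exchange the article describes, one computer sending data to another over a network, can be sketched with Python's standard socket library. This sketch uses the loopback interface, but the same calls work across a LAN, a WAN, or the Internet given a reachable address:

```python
import queue
import socket
import threading

port_q = queue.Queue()

def serve_once():
    """Accept a single connection and send one message."""
    with socket.socket() as srv:
        srv.bind(("127.0.0.1", 0))            # OS picks a free port
        srv.listen(1)
        port_q.put(srv.getsockname()[1])      # tell the client where we are
        conn, _ = srv.accept()
        with conn:
            conn.sendall(b"hello from the network")

t = threading.Thread(target=serve_once)
t.start()
with socket.socket() as cli:
    cli.connect(("127.0.0.1", port_q.get())) # the "dial-up" step
    msg = cli.recv(1024).decode()
t.join()
print(msg)  # hello from the network
```

Swapping the loopback address for another machine's hostname is all that distinguishes a LAN exchange from a WAN one at this level; the planning steps the article reviews concern everything this sketch takes for granted (addressing, cabling, security, capacity).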

  15. Computational Estimation

    ERIC Educational Resources Information Center

    Fung, Maria G.; Latulippe, Christine L.

    2010-01-01

    Elementary school teachers are responsible for constructing the foundation of number sense in youngsters, and so it is recommended that teacher-training programs include an emphasis on number sense to ensure the development of dynamic, productive computation and estimation skills in students. To better prepare preservice elementary school teachers…

  16. Computational Musicology.

    ERIC Educational Resources Information Center

    Bel, Bernard; Vecchione, Bernard

    1993-01-01

    Asserts that a revolution has been occurring in musicology since the 1970s. Contends that music has changed from being only a source of emotion to appearing more open to science and techniques based on computer technology. Describes recent research and other writings about the topic and provides an extensive bibliography. (CFR)

  17. Evaluation of Advanced Computing Techniques and Technologies: Reconfigurable Computing

    NASA Technical Reports Server (NTRS)

    Wells, B. Earl

    2003-01-01

    The focus of this project was to survey the technology of reconfigurable computing and determine its level of maturity and suitability for NASA applications; to better understand and assess the effectiveness of the reconfigurable design paradigm utilized within the HAL-15 reconfigurable computer system, which was made available to NASA MSFC for this purpose by Star Bridge Systems, Inc.; and to implement at least one application that would benefit from the performance levels possible with reconfigurable hardware. It was originally proposed that experiments in fault tolerance and dynamic reconfigurability would be performed, but time constraints mandated that these be pursued as future research.

  18. Corporate Recruiters Survey, 2011. Survey Report

    ERIC Educational Resources Information Center

    Edgington, Rachel

    2011-01-01

    In this report, the Graduate Management Admission Council[R] (GMAC[R]) presents the results from the 2011 Corporate Recruiters Survey. Conducted annually since 2001, this survey examines the job outlook for recent graduate business students as well as employer needs and expectations. The objectives of this study are to obtain a picture of the…

  19. 2012 Corporate Recruiters Survey. Survey Report

    ERIC Educational Resources Information Center

    Estrada, Rebecca

    2012-01-01

    This paper presents the results from the 2012 Corporate Recruiters Survey conducted by the Graduate Management Admission Council[R] (GMAC[R]). Conducted annually since 2001, this survey examines the job outlook for recent graduate business students as well as employer needs and expectations. The objectives of this study are to obtain a picture of…

  20. High resolution survey for topographic surveying

    NASA Astrophysics Data System (ADS)

    Luh, L. C.; Setan, H.; Majid, Z.; Chong, A. K.; Tan, Z.

    2014-02-01

    In this decade, the terrestrial laser scanner (TLS) is getting popular in many fields such as reconstruction, monitoring, surveying, as-built documentation of facilities, archaeology, and topographic surveying. This is due to the high speed of data collection, about 50,000 to 1,000,000 three-dimensional (3D) points per second at high accuracy. The main advantage of a 3D representation of the data is that it more closely approximates the real world. Therefore, the aim of this paper is to show the use of High-Definition Surveying (HDS), also known as 3D laser scanning, for topographic survey. This research investigates the effectiveness of using a terrestrial laser scanning system for topographic survey by carrying out a field test in Universiti Teknologi Malaysia (UTM), Skudai, Johor. The 3D laser scanner used in this study is a Leica ScanStation C10. Data acquisition was carried out by applying the traversing method. In this study, the result of the topographic survey falls under the 1st class survey standard. At the completion of this study, a standard procedure was proposed for topographic data acquisition using laser scanning systems. This proposed procedure serves as a guideline for users who wish to utilize a laser scanning system in topographic survey.

  1. Aerial radiation surveys

    SciTech Connect

    Jobst, J.

    1980-01-01

    A recent aerial radiation survey of the surroundings of the Vitro mill in Salt Lake City shows that uranium mill tailings have been removed to many locations outside their original boundary. To date, 52 remote sites have been discovered within a 100 square kilometer aerial survey perimeter surrounding the mill; 9 of these were discovered with the recent aerial survey map. Five additional sites, also discovered by aerial survey, contained uranium ore, milling equipment, or radioactive slag. Because of the success of this survey, plans are being made to extend the aerial survey program to other parts of the Salt Lake valley where diversions of Vitro tailings are also known to exist.

  2. Amorphous Computing

    NASA Astrophysics Data System (ADS)

    Sussman, Gerald

    2002-03-01

    Digital computers have always been constructed to behave as precise arrangements of reliable parts, and our techniques for organizing computations depend upon this precision and reliability. Two emerging technologies, however, are beginning to undercut these assumptions about constructing and programming computers. These technologies -- microfabrication and bioengineering -- will make it possible to assemble systems composed of myriad information-processing units at almost no cost, provided: 1) that not all the units need to work correctly; and 2) that there is no need to manufacture precise geometrical arrangements or interconnection patterns among them. Microelectronic mechanical components are becoming so inexpensive to manufacture that we can anticipate combining logic circuits, microsensors, actuators, and communications devices integrated on the same chip to produce particles that could be mixed with bulk materials, such as paints, gels, and concrete. Imagine coating bridges or buildings with smart paint that can sense and report on traffic and wind loads and monitor structural integrity of the bridge. A smart paint coating on a wall could sense vibrations, monitor the premises for intruders, or cancel noise. Even more striking, there has been such astounding progress in understanding the biochemical mechanisms in individual cells that it appears we'll be able to harness these mechanisms to construct digital-logic circuits. Imagine a discipline of cellular engineering that could tailor-make biological cells that function as sensors and actuators, as programmable delivery vehicles for pharmaceuticals, as chemical factories for the assembly of nanoscale structures. Fabricating such systems seems to be within our reach, even if it is not yet within our grasp. Fabrication, however, is only part of the story.
We can envision producing vast quantities of individual computing elements, whether microfabricated particles, engineered cells, or macromolecular computing

  3. Bacteria as computers making computers

    PubMed Central

    Danchin, Antoine

    2009-01-01

    Various efforts to integrate biological knowledge into networks of interactions have produced a lively microbial systems biology. Putting molecular biology and computer sciences in perspective, we review another trend in systems biology, in which recursivity and information replace the usual concepts of differential equations, feedback and feedforward loops and the like. Noting that the processes of gene expression separate the genome from the cell machinery, we analyse the role of the separation between machine and program in computers. However, computers do not make computers. For cells to make cells requires a specific organization of the genetic program, which we investigate using available knowledge. Microbial genomes are organized into a paleome (the name emphasizes the role of the corresponding functions from the time of the origin of life), comprising a constructor and a replicator, and a cenome (emphasizing community-relevant genes), made up of genes that permit life in a particular context. The cell duplication process supposes rejuvenation of the machine and replication of the program. The paleome also possesses genes that enable information to accumulate in a ratchet-like process down the generations. Systems biology must include the dynamics of information creation in its future developments. PMID:19016882

  4. Advanced Algorithms and Statistics for MOS Surveys

    NASA Astrophysics Data System (ADS)

    Bolton, A. S.

    2016-10-01

    This paper presents an individual view on the current state of computational data processing and statistics for inference and discovery in multi-object spectroscopic surveys, supplemented by a historical perspective and a few present-day applications. It is more op-ed than review, and hopefully more readable as a result.

  5. A survey of big data research

    PubMed Central

    Fang, Hua; Zhang, Zhaoyang; Wang, Chanpaul Jin; Daneshmand, Mahmoud; Wang, Chonggang; Wang, Honggang

    2015-01-01

    Big data create values for business and research, but pose significant challenges in terms of networking, storage, management, analytics and ethics. Multidisciplinary collaborations from engineers, computer scientists, statisticians and social scientists are needed to tackle, discover and understand big data. This survey presents an overview of big data initiatives, technologies and research in industries and academia, and discusses challenges and potential solutions. PMID:26504265

  6. Third Annual Survey of the Profession.

    ERIC Educational Resources Information Center

    Dugger, William E., Jr.; And Others

    1987-01-01

    Reports results of "School Shop's" annual survey of teachers of technology and vocational education. Questions centered on (1) information about respondents, (2) data on schools and programs, and (3) opinions about strengths, weaknesses, and problems. Results indicate that robotics and computers are among the fastest growing programs. (CH)

  7. RATIO COMPUTER

    DOEpatents

    Post, R.F.

    1958-11-11

    An electronic computer circuit is described for producing an output voltage proportional to the product or quotient of the voltages of a pair of input signals. In essence, the disclosed invention provides a computer having two channels adapted to receive separate input signals, each having amplifiers with like fixed amplification factors and like negative feedback amplifiers. One of the channels receives a constant signal for comparison purposes, whereby a difference signal is produced to control the amplification factors of the variable feedback amplifiers. The output of the other channel is thereby proportional to the product or quotient of the input signals, depending upon the relation of input to fixed signals in the first mentioned channel.
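
    The two-channel servo principle in the abstract can be illustrated numerically. The sketch below is a hypothetical digital analogue (function names, the fixed signal value, and the simple iterative servo are all assumptions, not the patented circuit): a shared gain g is driven until the reference channel's output matches the fixed comparison signal, at which point the second channel outputs g * v_in = (v_fixed / v_ref) * v_in, a scaled quotient.

```python
def ratio_computer(v_in, v_ref, v_fixed=1.0, steps=1000, rate=0.1):
    """Numeric sketch of a two-channel ratio computer (illustrative only)."""
    g = 1.0  # shared amplification factor of both channels
    for _ in range(steps):
        error = v_fixed - g * v_ref   # difference signal from the reference channel
        g += rate * error             # servo drives the difference signal to zero
    return g * v_in                   # second channel's output: (v_fixed/v_ref)*v_in

print(ratio_computer(6.0, 2.0))  # close to 3.0, the quotient 6/2
```

A product rather than a quotient follows from feeding the variable signal into the reference channel instead, mirroring the "relation of input to fixed signals" noted in the abstract.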

  8. Computational Combustion

    SciTech Connect

    Westbrook, C K; Mizobuchi, Y; Poinsot, T J; Smith, P J; Warnatz, J

    2004-08-26

    Progress in the field of computational combustion over the past 50 years is reviewed. Particular attention is given to those classes of models that are common to most system modeling efforts, including fluid dynamics, chemical kinetics, liquid sprays, and turbulent flame models. The developments in combustion modeling are placed into the time-dependent context of the accompanying exponential growth in computer capabilities and Moore's Law. Superimposed on this steady growth, the occasional sudden advances in modeling capabilities are identified and their impacts are discussed. Integration of submodels into system models for spark ignition, diesel and homogeneous charge, compression ignition engines, surface and catalytic combustion, pulse combustion, and detonations are described. Finally, the current state of combustion modeling is illustrated by descriptions of a very large jet lifted 3D turbulent hydrogen flame with direct numerical simulation and 3D large eddy simulations of practical gas burner combustion devices.

  9. Computer Game

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Using NASA studies of advanced lunar exploration and colonization, KDT Industries, Inc. and Wesson International have developed MOONBASE, a computer game. The player, or team commander, must build and operate a lunar base using NASA technology. He has 10 years to explore the surface, select a site and assemble structures brought from Earth into an efficient base. The game was introduced in 1991 by Texas Space Grant Consortium.

  10. Computer centers

    NASA Astrophysics Data System (ADS)

    The National Science Foundation has renewed grants to four of its five supercomputer centers. Average annual funding will rise from $10 million to $14 million so facilities can be upgraded and training and education expanded. As cooperative projects, the centers also receive money from states, universities, computer vendors and industry. The centers support research in fluid dynamics, atmospheric modeling, engineering geophysics and many other scientific disciplines.

  11. Singularity computations

    NASA Technical Reports Server (NTRS)

    Swedlow, J. L.

    1976-01-01

    An approach is described for singularity computations based on a numerical method for elastoplastic flow to delineate radial and angular distribution of field quantities and measure the intensity of the singularity. The method is applicable to problems in solid mechanics and lends itself to certain types of heat flow and fluid motion studies. Its use is not limited to linear, elastic, small strain, or two-dimensional situations.

  12. Computational Biology Support: RECOMB Conference Series (Conference Support)

    SciTech Connect

    Michael Waterman

    2006-06-15

    This funding provided support for student and postdoctoral attendance at the annual RECOMB Conference from 2001 to 2005. The RECOMB Conference series was founded in 1997 to provide a scientific forum for theoretical advances in computational biology and their applications in molecular biology and medicine. The conference series aims at attracting research contributions in all areas of computational molecular biology. Typical, but not exclusive, topics of interest are: genomics, molecular sequence analysis, recognition of genes and regulatory elements, molecular evolution, protein structure, structural genomics, gene expression, gene networks, drug design, combinatorial libraries, computational proteomics, and structural and functional genomics. The origins of the conference came from the mathematical and computational side of the field, and there remains a certain focus on computational advances. However, the effective application of computational techniques to biological innovation is also an important aspect of the conference. The conference has had a growing number of attendees, topping 300 in recent years and often exceeding 500. The conference program includes between 30 and 40 contributed papers, selected by an international program committee of around 30 experts during a rigorous review process rivaling the editorial procedure of top-rate scientific journals. In previous years, papers have been selected from up to 130--200 submissions from well over a dozen countries. Ten-page extended abstracts of the contributed papers are collected in a volume published by ACM Press and Springer and are available at the conference. Full versions of a selection of the papers are published annually in a special issue of the Journal of Computational Biology devoted to the RECOMB Conference. A further point in the program is a lively poster session; between 120 and 300 posters have been presented each year since RECOMB 2000.
One of the highlights of each RECOMB conference is a

  13. Water Use: A Survey

    ERIC Educational Resources Information Center

    Fleming, Rose Glee; Warden, Jessie

    1976-01-01

    A survey of Florida State University students showed that their current laundry practices generate energy and water over-consumption. The survey also resulted in some concrete suggestions to the students that would improve their conservation practices. (Author/BP)

  14. National Health Care Survey

    Cancer.gov

    This survey encompasses a family of health care provider surveys, including information about the facilities that supply health care, the services rendered, and the characteristics of the patients served.

  15. Role of Computer Assisted Instruction (CAI) in an Introductory Computer Concepts Course.

    ERIC Educational Resources Information Center

    Skudrna, Vincent J.

    1997-01-01

    Discusses the role of computer assisted instruction (CAI) in undergraduate education via a survey of related literature and specific applications. Describes an undergraduate computer concepts course and includes appendices of instructions, flowcharts, programs, sample student work in accounting, COBOL instructional model, decision logic in a…

  16. Summary of Computer Usage and Inventory of Computer Utilization in Curriculum. FY 1987-88.

    ERIC Educational Resources Information Center

    Tennessee Univ., Chattanooga. Center of Excellence for Computer Applications.

    This report presents the results of a computer usage survey/inventory, the ninth in a series conducted at the University of Tennessee at Chattanooga to obtain information on the changing status of computer usage in the curricula. Data analyses are reported in 11 tables, which include comparisons between annual inventories and demonstrate growth…

  17. Perceived Social Supports, Computer Self-Efficacy, and Computer Use among High School Students

    ERIC Educational Resources Information Center

    Hsiao, Hsi-Chi; Tu, Ya-Ling; Chung, Hsin-Nan

    2012-01-01

    This study investigated the function of social supports and computer self-efficacy in predicting high school students' perceived effect of computer use. The study was survey method to collect data. The questionnaires were distributed to the high school students in Taiwan. 620 questionnaires were distributed and 525 questionnaires were gathered…

  18. The Effects of Applying Authentic Learning Strategies to Develop Computational Thinking Skills in Computer Literacy Students

    ERIC Educational Resources Information Center

    Mingo, Wendye Dianne

    2013-01-01

    This study attempts to determine if authentic learning strategies can be used to acquire knowledge of and increase motivation for computational thinking. Over 600 students enrolled in a computer literacy course participated in this study which involved completing a pretest, posttest and motivation survey. The students were divided into an…

  19. The First Step in Utilizing Computers in Education: Preparing Computer Literate Teachers.

    ERIC Educational Resources Information Center

    Wells, Malcolm; Bitter, Gary

    As a result of a survey concerning computer assisted instruction in Arizona schools, Arizona State University developed a program to assist districts in computer instructional program development. In the initial planning phase of the program, a list was drawn up of preparatory functions essential for districts making a transition to computer…

  20. Preparing ground states of quantum many-body systems on a quantum computer

    NASA Astrophysics Data System (ADS)

    Poulin, David

    2009-03-01

    The simulation of quantum many-body systems is a notoriously hard problem in condensed matter physics, but it could easily be handled by a quantum computer [4,1]. There is however one catch: while a quantum computer can naturally implement the dynamics of a quantum system --- i.e. solve Schrödinger's equation --- there was until now no general method to initialize the computer in a low-energy state of the simulated system. We present a quantum algorithm [5] that can prepare the ground state and thermal states of a quantum many-body system in a time proportional to the square root of its Hilbert space dimension. This is the same scaling as required by the best known algorithm to prepare the ground state of a classical many-body system on a quantum computer [3,2]. This provides strong evidence that for a quantum computer, preparing the ground state of a quantum system is in the worst case no more difficult than preparing the ground state of a classical system. 1. D. Aharonov and A. Ta-Shma, Adiabatic quantum state generation and statistical zero knowledge, Proc. 35th Annual ACM Symp. on Theo. Comp., (2003), p. 20. 2. F. Barahona, On the computational complexity of Ising spin glass models, J. Phys. A. Math. Gen., 15 (1982), p. 3241. 3. C. H. Bennett, E. Bernstein, G. Brassard, and U. Vazirani, Strengths and weaknesses of quantum computing, SIAM J. Comput., 26 (1997), pp. 1510--1523, quant-ph/9701001. 4. S. Lloyd, Universal quantum simulators, Science, 273 (1996), pp. 1073--1078. 5. D. Poulin and P. Wocjan, Preparing ground states of quantum many-body systems on a quantum computer, 2008, arXiv:0809.2705.

  1. Autonomous mobile robot for radiologic surveys

    DOEpatents

    Dudar, A.M.; Wagner, D.G.; Teese, G.D.

    1994-06-28

    An apparatus is described for conducting radiologic surveys. The apparatus comprises in the main a robot capable of following a preprogrammed path through an area, a radiation monitor adapted to receive input from a radiation detector assembly, ultrasonic transducers for navigation and collision avoidance, and an on-board computer system including an integrator for interfacing the radiation monitor and the robot. Front and rear bumpers are attached to the robot by bumper mounts. The robot may be equipped with memory boards for the collection and storage of radiation survey information. The on-board computer system is connected to a remote host computer via a UHF radio link. The apparatus is powered by a rechargeable 24-volt DC battery, and is stored at a docking station when not in use and/or for recharging. A remote host computer contains a stored database defining paths between points in the area where the robot is to operate, including but not limited to the locations of walls, doors, stationary furniture and equipment, and sonic markers if used. When a program consisting of a series of paths is downloaded to the on-board computer system, the robot conducts a floor survey autonomously at any preselected rate. When the radiation monitor detects contamination, the robot resurveys the area at reduced speed and resumes its preprogrammed path if the contamination is not confirmed. If the contamination is confirmed, the robot stops and sounds an alarm. 5 figures.
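
    The resurvey-and-confirm behavior described in this patent abstract can be sketched as a small state machine. The code below is an illustrative assumption (function names, the threshold, and the single-reading confirmation step are invented for the sketch, not taken from the patent): follow the preprogrammed path, resurvey on a contamination reading, resume if it is not confirmed, and stop with an alarm if it is.

```python
def survey(path, read_radiation, threshold=100.0):
    """Sketch of the patent's survey logic: detect, resurvey to confirm, alarm or resume."""
    events = []
    for point in path:
        if read_radiation(point) > threshold:              # contamination detected
            confirmed = read_radiation(point) > threshold  # resurvey at reduced speed
            if confirmed:
                events.append(("alarm", point))            # robot stops and sounds an alarm
                break
            events.append(("false_alarm", point))          # resume the preprogrammed path
        else:
            events.append(("clear", point))
    return events
```

For example, with a sensor that reads high only once at a point, the sketch logs a false alarm and continues; with a persistently high reading it stops at that point.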

  2. Autonomous mobile robot for radiologic surveys

    DOEpatents

    Dudar, Aed M.; Wagner, David G.; Teese, Gregory D.

    1994-01-01

    An apparatus for conducting radiologic surveys. The apparatus comprises in the main a robot capable of following a preprogrammed path through an area, a radiation monitor adapted to receive input from a radiation detector assembly, ultrasonic transducers for navigation and collision avoidance, and an on-board computer system including an integrator for interfacing the radiation monitor and the robot. Front and rear bumpers are attached to the robot by bumper mounts. The robot may be equipped with memory boards for the collection and storage of radiation survey information. The on-board computer system is connected to a remote host computer via a UHF radio link. The apparatus is powered by a rechargeable 24-volt DC battery, and is stored at a docking station when not in use and/or for recharging. A remote host computer contains a stored database defining paths between points in the area where the robot is to operate, including but not limited to the locations of walls, doors, stationary furniture and equipment, and sonic markers if used. When a program consisting of a series of paths is downloaded to the on-board computer system, the robot conducts a floor survey autonomously at any preselected rate. When the radiation monitor detects contamination, the robot resurveys the area at reduced speed and resumes its preprogrammed path if the contamination is not confirmed. If the contamination is confirmed, the robot stops and sounds an alarm.

  3. Preparation for Computer Usage in Social Work: Student Consumer Variables.

    ERIC Educational Resources Information Center

    Nurius, Paula S.; And Others

    1988-01-01

    A survey of students in a large master's program in social work investigated student training in and experience with computers and attitudes about computer applications for human service activities. The value of the findings in curriculum planning, practica development, computer resources management, and faculty and agency involvement are…

  4. Computer Repair: When the Bits Hit the Fan

    ERIC Educational Resources Information Center

    Goldsborough, Reid

    2004-01-01

    Many small business and home users rely on local computer repair shops. But who can be trusted? The nonprofit consumer organization Center for the Study of Services has just published the results of a survey it did of local stores in seven cities that do computer repair, along with providing tips about computer repair in general. This article…

  5. Social Media and Archives: A Survey of Archive Users

    ERIC Educational Resources Information Center

    Washburn, Bruce; Eckert, Ellen; Proffitt, Merrilee

    2013-01-01

    In April and May of 2012, the Online Computer Library Center (OCLC) Research conducted a survey of users of archives to learn more about their habits and preferences. In particular, they focused on the roles that social media, recommendations, reviews, and other forms of user-contributed annotation play in archival research. OCLC surveyed faculty,…

  6. Florida Employer Opinion Survey. Annual Report, June 1992.

    ERIC Educational Resources Information Center

    Florida State Dept. of Education, Tallahassee.

    Each year the Florida Education and Training Placement Information Program (FETPIP) conducts surveys to determine the opinions of employers about the preparation of graduates of vocational programs. The 1992 survey focused on eight specific occupational training areas (i.e., child care services, computer programming and analysis, dental assisting,…

  7. Effect of Mailing Address Style on Survey Response Rate.

    ERIC Educational Resources Information Center

    Cookingham, Frank G.

    This study determined the effect of using mailing labels prepared by a letter-quality computer printer on survey response rate. D. A. Dillman's personalization approach to conducting mail surveys suggests that envelopes with addresses typed directly on them may produce a higher response rate than envelopes with addresses typed on self-adhesive…

  8. Telephone Survey Designs.

    ERIC Educational Resources Information Center

    Casady, Robert J.

    The concepts, definitions, and notation that have evolved with the development of telephone survey design methodology are discussed and presented as a unified structure. This structure is then applied to some of the more well-known telephone survey designs and alternative designs are developed. The relative merits of the different survey designs…

  9. AECT Needs Survey, 2000.

    ERIC Educational Resources Information Center

    Frick, Theodore W.; Richter, Kurt; Kim, Minhee; Yang, Jessica Chao-I; Duvenci, Abdullah

    The purpose of this study was to determine the needs of AECT (Association for Educational Communications and Technology) members. A total of 590 individuals completed a Web-based 16-question survey after receiving an e-mail invitation from AECT. This survey was active between October 30 and November 10, 2000. The survey was categorized into three…

  10. Sensitive Questions in Surveys

    ERIC Educational Resources Information Center

    Tourangeau, Roger; Yan, Ting

    2007-01-01

    Psychologists have worried about the distortions introduced into standardized personality measures by social desirability bias. Survey researchers have had similar concerns about the accuracy of survey reports about such topics as illicit drug use, abortion, and sexual behavior. The article reviews the research done by survey methodologists on…

  11. New computer architectures

    SciTech Connect

    Tiberghien, J.

    1984-01-01

    This book presents papers on supercomputers. Topics considered include decentralized computer architecture, new programming languages, data flow computers, reduction computers, parallel prefix calculations, structural and behavioral descriptions of digital systems, instruction sets, software generation, personal computing, and computer architecture education.

  12. Computers: from ethos and ethics to mythos and religion. Notes on the new frontier between computers and philosophy

    SciTech Connect

    Mitcham, C.

    1986-01-01

    This essay surveys recent studies concerning the social, cultural, ethical and religious dimensions of computers. The argument is that computers have certain cultural influences which call for ethical analysis. Further suggestions are that American culture is itself reflected in new ways in the high-technology computer milieu, and that ethical issues entail religious ones which are being largely ignored. 28 references.

  13. CAMSS: A spectroscopic survey of meteoroid elemental abundances

    NASA Astrophysics Data System (ADS)

    Jenniskens, P.; Gural, P.; Berdeu, A.

    2014-07-01

    The main element abundances (Mg, Fe, Na, ...) of some Near Earth Objects can be measured by meteor spectroscopy. The Cameras for All-sky Meteor Surveillance (CAMS) Spectrograph project aims to scale up meteor spectroscopy in the same way as CAMS scaled up the measurement of precise meteoroid trajectories from multi-station video observations. Spectra are recorded with sixteen low-light video cameras, each equipped with a 1379 lines/mm objective transmission grating. The cameras are operated in survey mode and have recorded spectra in the San Francisco Bay Area every clear night since March 12, 2013. An interactive software tool is being developed to calibrate the wavelength alignments projected on the focal plane and extract the meteor spectra. Because the meteoroid trajectory and pre-atmospheric orbit are also independently measured, the absolute abundances of elements in the meteoroid plasma can be calculated as a function of altitude, while the orbital information can tie the meteoroid back to its parent object.

  14. The current status of super computers

    NASA Technical Reports Server (NTRS)

    Knight, J. C.

    1978-01-01

    In this paper, commercially available super computers are surveyed. Computer performance in general is limited by circuit speeds and physical size. Assuming the use of the fastest technology, super computers typically use parallelism in the form of either vector processing or array processing to obtain performance. The Burroughs Scientific Processor is an array computer with 16 separate processors, the Cray-1 and CDC STAR-100 are vector processors, the Goodyear Aerospace STARAN is an array processor with up to 8192 single bit processors, and the Systems Development Corporation PEPE is a collection of up to 288 separate processors.

  15. Quality indexing with computer-aided lexicography

    NASA Technical Reports Server (NTRS)

    Buchan, Ronald L.

    1992-01-01

    Indexing with computers is a far cry from indexing with the first indexing tool, the manual card sorter. With the aid of computer-aided lexicography, both indexing and indexing tools can provide standardization, consistency, and accuracy, resulting in greater quality control than ever before. A brief survey of computer activity in indexing is presented with detailed illustrations from NASA activity. Applications from techniques mentioned, such as Retrospective Indexing (RI), can be made to many indexing systems. In addition to improving the quality of indexing with computers, the improved efficiency with which certain tasks can be done is demonstrated.

  16. The Effect of Survey Mode on High School Risk Behavior Data: A Comparison between Web and Paper-Based Surveys

    ERIC Educational Resources Information Center

    Raghupathy, Shobana; Hahn-Smith, Stephen

    2013-01-01

    There has been increasing interest in using of web-based surveys--rather than paper based surveys--for collecting data on alcohol and other drug use in middle and high schools in the US. However, prior research has indicated that respondent confidentiality is an underlying concern with online data collection especially when computer-assisted…

  17. A Survey of Automated Activities in the Libraries of Mexico, Central America and South America; Volume 4, World Survey Series.

    ERIC Educational Resources Information Center

    Patrinostro, Frank S., Comp.; Sanders, Nancy P., Ed.

    The intent of this fourth volume of the "Survey of Automated Activities in the Libraries of the World" is to identify and describe computer-based library projects in the Latin American countries. Information was drawn from survey questionnaires sent to individual libraries. However, few of the South American libraries responded, and as a result…

  18. An Innovative, Effective and Cost Effective Survey Method Using a Survey-Check Response Format

    PubMed Central

    Feil, Edward G.; Severson, Herbert; Taylor, Ted; Boles, Shawn; Albert, David A.; Blair, Jason

    2007-01-01

    Maximizing the response rate to surveys involves thoughtful choices about survey design, sampling and collection methods. This paper describes an innovative survey method designed to provide immediate reinforcement for responding and to minimize the response cost. The method uses a questionnaire printed as checks on security (anti-fraud) paper, with questions and responses separated by a perforated tear-off section. Once a participant completes the survey, the response area is detached from the questions, thus protecting the confidentiality of the subject, and the check is returned via the banking system. This report describes the survey-check methodology, the survey flow process, and the results from four research studies which have used this method. These studies include: (1) a technology accessibility survey of parents with children enrolled in a low-income preschool program; (2) a parent report of their child's behavior used as screening criteria for inclusion in a computer-mediated parent education project; (3) a follow-up questionnaire as part of a longitudinal study of child behavior, covering home and classroom interventions and service utilization; and (4) a survey of dentists in support of efforts to recruit them to participate in a randomized control trial of tobacco cessation in dental offices. The results of using this method show great improvement in response rates over traditionally administered surveys for three of the four reported studies. Results are discussed in terms of future applications of this method, limitations, and potential cost savings. PMID:17180473

  19. Environmental Survey preliminary report

    SciTech Connect

    Not Available

    1988-04-01

    This report presents the preliminary findings from the first phase of the Environmental Survey of the United States Department of Energy (DOE) Sandia National Laboratories conducted August 17 through September 4, 1987. The objective of the Survey is to identify environmental problems and areas of environmental risk associated with Sandia National Laboratories-Albuquerque (SNLA). The Survey covers all environmental media and all areas of environmental regulation. It is being performed in accordance with the DOE Environmental Survey Manual. This phase of the Survey involves the review of existing site environmental data, observations of the operations carried on at SNLA, and interviews with site personnel. 85 refs., 49 figs., 48 tabs.

  20. Sources of computer self-efficacy: The relationship to outcome expectations, computer anxiety, and intention to use computers

    NASA Astrophysics Data System (ADS)

    Antoine, Marilyn V.

    2011-12-01

    The purpose of this research was to extend earlier research on sources of self-efficacy (Lent, Lopez, & Biechke, 1991; Usher & Pajares, 2009) to the information technology domain. The principal investigator examined how Bandura's (1977) sources of self-efficacy information---mastery experience, vicarious experience, verbal persuasion, and physiological states---shape computer self-efficacy beliefs and influence the decision to use or not use computers. The study took place at a mid-sized Historically Black College or University in the South. A convenience sample of 105 undergraduates was drawn from students enrolled in multiple sections of two introductory computer courses. There were 67 females and 38 males. This research was a correlational study of the following variables: sources of computer self-efficacy, general computer self-efficacy, outcome expectations, computer anxiety, and intention to use computers. The principal investigator administered a survey questionnaire containing 52 Likert items to measure the major study variables. Additionally, the survey instrument collected demographic variables such as gender, age, race, intended major, classification, technology use, technology adoption category, and whether the student owns a computer. The results reveal the following: (1) Mastery experience and verbal persuasion had statistically significant relationships to general computer self-efficacy, while vicarious experience and physiological states had non-significant relationships. Mastery experience had the strongest correlation to general computer self-efficacy. (2) All of the sources of computer self-efficacy had statistically significant relationships to personal outcome expectations. Vicarious experience had the strongest correlation to personal outcome expectations. (3) All of the sources of self-efficacy had statistically significant relationships to performance outcome expectations. Vicarious experience had the strongest correlation to performance

  1. Wellbore inertial directional surveying system

    DOEpatents

    Andreas, Ronald D.; Heck, G. Michael; Kohler, Stewart M.; Watts, Alfred C.

    1991-01-01

    A wellbore inertial directional surveying system for providing a complete directional survey of an oil or gas well borehole to determine the displacement in all three directions of the borehole path relative to the well head at the surface. The information generated by the present invention is especially useful when numerous wells are drilled to different geographical targets from a single off-shore platform. Accurate knowledge of the path of the borehole allows proper well spacing and provides assurance that target formations are reached. The tool is lowered down into a borehole on the electrical cable. A computer positioned on the surface communicates with the tool via the cable. The tool contains a sensor block which is supported on a single gimbal, the rotation axis of which is aligned with the cylinder axis of the tool and, correspondingly, the borehole. The gyroscope measurement of the sensor block rotation is used in a null-seeking servo loop which essentially prevents rotation of the sensor block about the gimbal axis. Angular rates of the sensor block about axes which are perpendicular to the gimbal axis are measured by gyroscopes in a manner similar to a strapped-down arrangement. Three accelerometers provide acceleration information as the tool is lowered within the borehole. The uphole computer derives position information based upon acceleration information and angular rate information. Kalman estimation techniques are used to compensate for system errors.
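
    The core of "deriving position information from acceleration information" is double integration over time. The sketch below is a deliberately simplified, single-axis assumption (Euler integration, invented function name), not the patented method, which additionally fuses gyroscope angular rates and applies Kalman estimation to compensate for sensor errors:

```python
def integrate_position(accels, dt):
    """Derive a position track from acceleration samples by double integration."""
    velocity, position = 0.0, 0.0
    track = []
    for a in accels:
        velocity += a * dt         # v(t) = v(t - dt) + a * dt
        position += velocity * dt  # x(t) = x(t - dt) + v * dt
        track.append(position)
    return track

print(integrate_position([1.0] * 4, dt=1.0))  # [1.0, 3.0, 6.0, 10.0]
```

In practice raw double integration drifts quadratically with sensor bias, which is why the system described here relies on Kalman estimation rather than this naive scheme.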

  2. Wellbore inertial directional surveying system

    DOEpatents

    Andreas, R.D.; Heck, G.M.; Kohler, S.M.; Watts, A.C.

    1982-09-08

    A wellbore inertial directional surveying system for providing a complete directional survey of an oil or gas well borehole to determine the displacement in all three directions of the borehole path relative to the well head at the surface. The information generated by the present invention is especially useful when numerous wells are drilled to different geographical targets from a single offshore platform. Accurate knowledge of the path of the borehole allows proper well spacing and provides assurance that target formations are reached. The tool is lowered down into a borehole on an electrical cable. A computer positioned on the surface communicates with the tool via the cable. The tool contains a sensor block which is supported on a single gimbal, the rotation axis of which is aligned with the cylinder axis of the tool and, correspondingly, the borehole. The gyroscope measurement of the sensor block rotation is used in a null-seeking servo loop which essentially prevents rotation of the sensor block about the gimbal axis. Angular rates of the sensor block about axes which are perpendicular to the gimbal axis are measured by gyroscopes in a manner similar to a strapped-down arrangement. Three accelerometers provide acceleration information as the tool is lowered within the borehole. The uphole computer derives position information based upon acceleration information and angular rate information. Kalman estimation techniques are used to compensate for system errors. 25 figures.
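    The position-derivation step described in both patent abstracts, double-integrating accelerometer output to recover displacement, can be sketched in miniature. The function name and sample data below are illustrative, and the Kalman error compensation the patents describe is omitted for brevity:

    ```python
    import numpy as np

    def integrate_position(accels, dt):
        """Toy dead-reckoning: double-integrate sensed acceleration
        (assumed already rotated into the navigation frame and
        gravity-compensated) to get velocity and displacement. Real
        wellbore tools fuse this with gyro-derived attitude and a
        Kalman filter to keep the errors bounded."""
        vel = np.cumsum(accels * dt, axis=0)  # v(t): running integral of a
        pos = np.cumsum(vel * dt, axis=0)     # x(t): running integral of v
        return vel, pos

    # constant 1 m/s^2 along one axis for 10 s, sampled at 100 Hz
    dt = 0.01
    a = np.zeros((1000, 3))
    a[:, 0] = 1.0
    v, x = integrate_position(a, dt)  # final velocity ~10 m/s, displacement ~50 m
    ```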

  3. Computational crystallization.

    PubMed

    Altan, Irem; Charbonneau, Patrick; Snell, Edward H

    2016-07-15

    Crystallization is a key step in macromolecular structure determination by crystallography. While a robust theoretical treatment of the process is available, due to the complexity of the system, the experimental process is still largely one of trial and error. In this article, efforts in the field are discussed together with a theoretical underpinning using a solubility phase diagram. Prior knowledge has been used to develop tools that computationally predict the crystallization outcome and define mutational approaches that enhance the likelihood of crystallization. For the most part these tools are based on binary outcomes (crystal or no crystal), and the full information contained in an assembly of crystallization screening experiments is lost. The potential of this additional information is illustrated by examples where new biological knowledge can be obtained and where a target can be sub-categorized to predict which class of reagents provides the crystallization driving force. Computational analysis of crystallization requires complete and correctly formatted data. While massive crystallization screening efforts are under way, the data available from many of these studies are sparse. The potential for this data and the steps needed to realize this potential are discussed.

  4. Computational Astrophysics

    NASA Astrophysics Data System (ADS)

    Mickaelian, A. M.; Astsatryan, H. V.

    2015-07-01

    Present astronomical archives contain billions of objects, both Galactic and extragalactic, and the vast amount of data on them allows new studies and discoveries. Astrophysical Virtual Observatories (VO) use available databases and current observing material as a collection of interoperating data archives and software tools to form a research environment in which complex research programs can be conducted. At present, most modern databases provide VO access to the stored information, which also makes fast analysis and management of these data possible. Cross-correlations result in revealing new objects and new samples. Tens of thousands of sources often hide a few very interesting ones that need to be discovered by comparison of various physical characteristics. The VO is a prototype of Grid technologies that allows distributed data computation, analysis, and imaging. Particularly important are data reduction and analysis systems: spectral analysis, SED building and fitting, modelling, variability studies, cross-correlations, etc. Computational astrophysics has become an inseparable part of astronomy, and most modern research is done by means of it.

  5. Developing the online survey.

    PubMed

    Gordon, Jeffry S; McNew, Ryan

    2008-12-01

    Institutions of higher education are now using Internet-based technology tools to conduct surveys for data collection. Research shows that the type and quality of responses one receives with online surveys are comparable with what one receives in paper-based surveys. Data collection can take place through Web-based surveys, e-mail-based surveys, and personal digital assistant/Smartphone devices. Web surveys can be built from subscription templates, installed as software packages on one's own server, or created from scratch using Web programming development tools. All of these approaches have their advantages and disadvantages. The survey owner must make informed decisions as to the right technology to implement. The correct choice can save hours of work in sorting, organizing, and analyzing data.

  6. ESO imaging survey: infrared deep public survey

    NASA Astrophysics Data System (ADS)

    Olsen, L. F.; Miralles, J.-M.; da Costa, L.; Madejsky, R.; Jørgensen, H. E.; Mignano, A.; Arnouts, S.; Benoist, C.; Dietrich, J. P.; Slijkhuis, R.; Zaggia, S.

    2006-09-01

    This paper is part of the series presenting the final results obtained by the ESO Imaging Survey (EIS) project. It presents new J and Ks data obtained from observations conducted at the ESO 3.5 m New Technology Telescope (NTT) using the SOFI camera. These data were taken as part of the Deep Public Survey (DPS) carried out by the ESO Imaging Survey program, significantly extending the earlier optical/infrared EIS-DEEP survey presented in a previous paper of this series. The DPS-IR survey comprises two observing strategies: shallow Ks observations providing nearly full coverage of pointings with complementary multi-band (in general {UBVRI}) optical data obtained using ESO's wide-field imager (WFI), and deeper J and Ks observations of the central parts of these fields. Currently, the DPS-IR survey provides a coverage of roughly 2.1 square degrees (~300 SOFI pointings) in Ks, with 0.63 square degrees to fainter magnitudes and also covered in J, over three independent regions of the sky. The goal of the present paper is to briefly describe the observations and the data reduction procedures, and to present the final survey products, which include fully calibrated pixel-maps and catalogs extracted from them. The astrometric solution, with an estimated accuracy of ≲0.15 arcsec, is based on the USNO catalog and limited only by the accuracy of the reference catalog. The final stacked images presented here number 89 and 272 in J and Ks, respectively, the latter reflecting the larger surveyed area. The J and Ks images were taken with a median seeing of 0.77 arcsec and 0.8 arcsec, respectively. The images reach a median 5σ limiting magnitude of J_AB ≈ 23.06 as measured within an aperture of 2 arcsec, while the corresponding limiting magnitudes in Ks_AB are 21.41 and 22.16 mag for the shallow and deep strategies. Although some spatial variation due to varying observing conditions is observed, overall the observed limiting magnitudes are consistent with those originally proposed. The quality of the data

  7. Computational introspection

    SciTech Connect

    Batali, J.

    1983-02-01

    Introspection is the process of thinking about one's own thoughts and feelings. In this paper, the author discusses recent attempts to make computational systems that exhibit introspective behavior. Each presents a system capable of manipulating representations of its own program and current context. He argues that introspective ability is crucial for intelligent systems: without it, an agent cannot represent certain problems that it must be able to solve. A theory of intelligent action would describe how and why certain actions intelligently achieve an agent's goals. The agent would both embody and represent this theory: it would be implemented as the program for the agent, and the importance of introspection suggests that the agent should represent its theory of action to itself.

  8. Computer vision

    SciTech Connect

    Not Available

    1982-01-01

    This paper discusses material from areas such as artificial intelligence, psychology, computer graphics, and image processing. The intent is to assemble a selection of this material in a form that will serve both as a senior/graduate-level academic text and as a useful reference to those building vision systems. This book has a strong artificial intelligence flavour, emphasising the belief that both the intrinsic image information and the internal model of the world are important in successful vision systems. The book is organised into four parts, based on descriptions of objects at four different levels of abstraction. These are: generalised images (images and image-like entities); segmented images (images organised into subimages that are likely to correspond to interesting objects); geometric structures (quantitative models of image and world structures); and relational structures (complex symbolic descriptions of image and world structures). The book contains author and subject indexes.

  9. The Development and Application of Expert Systems: A National Survey.

    ERIC Educational Resources Information Center

    Bossinger, June; Milheim, William D.

    1993-01-01

    Discussion of expert systems focuses on a national survey that gathered information concerning the attention and investment given to expert systems by managers and computer professionals in a variety of fields. Highlights include uses of expert systems, types of computers and software used, and expert systems shells and development costs. (18…

  10. DICOM image viewers: a survey

    NASA Astrophysics Data System (ADS)

    Horii, Steven C.

    2003-05-01

    Purpose: The purpose of this survey was to identify and characterize available programs for viewing DICOM images on personal computers. Methods: To determine the most commonly used software packages for viewing DICOM images and to seek out less well-known ones, recommendations from colleagues, Internet searches, and searches of databases were carried out. Available software was downloaded and run on the hardware recommended by the developer. Features, such as DICOM information object types, image processing capabilities included, and the ability to export images in other formats were tested and compared. Results: A number of "freeware" and "shareware" programs for viewing DICOM images are available. They range from comprehensive offerings that are very similar to commercial workstation software to simple-to-use programs to open DICOM image files and display the images on personal computers. Surprisingly, some of the more capable software is also "freeware". Breakthrough work: While no scientific breakthroughs resulted from this work, at least one of the DICOM image software packages was not well known among the author's colleagues who were familiar with other systems, and this particular software was among the most flexible in terms of exporting images in other forms. Conclusion: DICOM viewing software is readily available at no, or low, cost. These programs vary in ease of use, features, and output capability. The results of this survey, while necessarily a "snapshot" in the fast-moving world of software development, should be useful to those who desire to open DICOM images on personal computers or export images from PACS for use in typical office or educational applications.
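    One core operation such viewers perform when displaying or exporting images is the linear window/level (VOI) mapping from raw 12- or 16-bit pixel values to 8-bit display values. A minimal sketch, assuming NumPy and hypothetical window settings (the function name and sample values are illustrative):

    ```python
    import numpy as np

    def apply_window(pixels, center, width):
        """Linear window/level (VOI) mapping: values below the window
        clip to black, values above clip to white, and values inside
        the window scale linearly onto 0-255."""
        lo = center - width / 2.0
        scaled = (pixels.astype(float) - lo) / width * 255.0
        return np.clip(scaled, 0, 255).astype(np.uint8)

    # hypothetical CT-style soft-tissue window: center 40, width 400
    raw = np.array([-500, -160, 40, 240, 1000])
    display = apply_window(raw, center=40, width=400)  # 8-bit values ready for export
    ```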

  11. Multilingual Information Discovery and AccesS (MIDAS): A Joint ACM DL'99/ ACM SIGIR'99 Workshop.

    ERIC Educational Resources Information Center

    Oard, Douglas; Peters, Carol; Ruiz, Miguel; Frederking, Robert; Klavans, Judith; Sheridan, Paraic

    1999-01-01

    Discusses a multidisciplinary workshop that addressed issues concerning internationally distributed information networks. Highlights include multilingual information access in media other than character-coded text; cross-language information retrieval and multilingual metadata; and evaluation of multilingual systems. (LRW)

  12. ARM Airborne Carbon Measurements (ARM-ACME) and ARM-ACME 2.5 Final Campaign Reports

    SciTech Connect

    Biraud, S. C.; Torn, M. S.; Sweeney, C.

    2016-01-01

    We report on a 5-year multi-institution and multi-agency airborne study of atmospheric composition and carbon cycling at the Atmospheric Radiation Measurement (ARM) Climate Research Facility’s Southern Great Plains (SGP) site, with scientific objectives that are central to the carbon-cycle and radiative-forcing goals of the U.S. Global Change Research Program and the North American Carbon Program (NACP). The goal of these measurements is to improve understanding of 1) the carbon exchange of the Atmospheric Radiation Measurement (ARM) SGP region; 2) how CO2 and associated water and energy fluxes influence radiative-forcing, convective processes, and CO2 concentrations over the ARM SGP region, and 3) how greenhouse gases are transported on continental scales.

  13. Computational mechanics needs study

    NASA Technical Reports Server (NTRS)

    Griffin, O. Hayden, Jr.

    1993-01-01

    In order to assess the needs in computational mechanics over the next decade, we formulated a questionnaire and contacted computational mechanics researchers and users in industry, government, and academia. As expected, we found a wide variety of computational mechanics usage and research. This report outlines the activity discussed with those contacts, as well as that in our own organizations. It should be noted that most of the contacts were made before the recent decline of the defense industry. Therefore, areas which are strongly defense-oriented may decrease in relative importance. In order to facilitate updating of this study, names of a few key researchers in each area are included as starting points for future literature surveys. These lists of names are not intended to represent those persons doing the best research in that area, nor are they intended to be comprehensive. They are, as previously stated, offered as starting points for future literature searches. Overall, there is currently broad activity in computational mechanics in this country, with the breadth and depth increasing as more sophisticated software and faster computers become more widely available. The needs and desires of the workers in this field are as diverse as their backgrounds and organizational products. There seems to be some degree of software development in any organization that has a research component in its mission, although the level of activity is highly variable from one organization to another. It seems, however, that there is considerable use of commercial software in almost all organizations. In most industrial research organizations, it appears that very little actual software development is contracted out; most is done in-house, using a mixture of funding sources. Government agencies vary widely in the ratio of in-house to contracted-out development. There is a considerable amount of experimental verification in most, but not all, organizations. Generally, the amount of

  14. Computer controlled techniques for high emission density mapping of thermionic cathodes

    NASA Astrophysics Data System (ADS)

    Gibson, J. W.; Thomas, R. E.

    1985-12-01

    Some of the techniques commonly used (e.g., SLEEP and the thermionic emission microscope) for measuring emission or work function uniformity of thermionic cathode surfaces require the use of very low or near-zero current densities; thus the cathode is characterized at current densities and temperatures much lower than those of a normally operating cathode. The system reported on here uses a high-voltage pulse technique and is capable of measuring emission densities in the range 1 to 80 A/cm² at normal cathode operating temperatures. The cathode surface is scanned with an anode having a 0.025 mm aperture whose position is controlled by computer-operated stepping motors. The current through the aperture to a collector electrode is measured using a sample-and-hold amplifier. Pulsing and sampling are computer-synchronized with the scanning, and data for each pulse are accumulated and can be processed and displayed in several ways using the computer, including a detailed "three-dimensional" map of either the electron emission density or work function variations. The entire surface of the cathode, or any portion of it, can be mapped in steps as small as 0.001 mm (1 μm), but typically steps of 5-100 μm were used. Measurements are presented illustrating the uniformity or nonuniformity of the electron emission densities and work functions for type-B and type-M cathodes.
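    Work-function maps of the kind described above are commonly derived by inverting the Richardson-Dushman emission law J = A·T²·exp(-φ/kT) at each scanned point. A hedged sketch of that per-point inversion (the constants and sample values are illustrative, not taken from the paper):

    ```python
    import math

    A_R = 120.0        # ideal Richardson constant, A / (cm^2 K^2)
    K_B_EV = 8.617e-5  # Boltzmann constant, eV/K

    def work_function(j_acm2, temp_k):
        """Invert the Richardson-Dushman equation J = A*T^2*exp(-phi/kT)
        to recover an effective work function phi (in eV) from a measured
        emission density, as a per-pixel map builder might do."""
        return K_B_EV * temp_k * math.log(A_R * temp_k ** 2 / j_acm2)

    # e.g. 10 A/cm^2 measured at 1300 K -> effective work function ~1.9 eV
    phi = work_function(10.0, 1300.0)
    ```

    Mapping each aperture position's measured current density through this inversion yields the "three-dimensional" work-function map the abstract mentions.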

  15. Computer simulation of a pilot in V/STOL aircraft control loops

    NASA Technical Reports Server (NTRS)

    Vogt, William G.; Mickle, Marlin H.; Zipf, Mark E.; Kucuk, Senol

    1989-01-01

    The objective was to develop a computerized adaptive pilot model for the computer model of the research aircraft, the Harrier II AV-8B V/STOL, with special emphasis on propulsion control. Two versions of the adaptive pilot are given. The first, called the Adaptive Control Model (ACM) of a pilot, includes a parameter estimation algorithm for the parameters of the aircraft and an adaptation scheme based on the root locus of the poles of the pilot-controlled aircraft. The second, called the Optimal Control Model of the pilot (OCM), includes an adaptation algorithm and an optimal control algorithm. These computer simulations were developed as a part of the ongoing research program in pilot model simulation supported by NASA Lewis from April 1, 1985 to August 30, 1986 under NASA Grant NAG 3-606, and from September 1, 1986 through November 30, 1988 under NASA Grant NAG 3-729. Once installed, these pilot models permitted the computer simulation of the pilot model to close all of the control loops normally closed by a pilot actually manipulating the control variables. The current version has permitted a baseline comparison of various qualitative and quantitative performance indices for propulsion control, the control loops, and the workload on the pilot. Actual data for an aircraft flown by a human pilot, furnished by NASA, were compared to the outputs furnished by the computerized pilot and found to be favorable.

  16. Inertial navigation system for directional surveying

    SciTech Connect

    Kohler, S.M.

    1982-09-01

    A Wellbore Inertial Navigation System (WINS) was developed and tested. Developed for directional surveying of geothermal, oil, and gas wells, the system uses gyros and accelerometers to obtain survey errors of less than 10 ft (approx. 3 m) in a 10,000-ft (approx. 3,000-m) well. The tool, which communicates with a computer at the surface, is 4 in. (approx. 10 cm) in diameter and 20 ft (approx. 6.1 m) long. The concept and hardware are based on a system developed by Sandia for flight vehicles.

  17. Atmospheric prediction model survey

    NASA Technical Reports Server (NTRS)

    Wellck, R. E.

    1976-01-01

    As part of the SEASAT Satellite program of NASA, a survey of representative primitive equation atmospheric prediction models that exist in the world today was written for the Jet Propulsion Laboratory. Seventeen models developed by eleven different operational and research centers throughout the world are included in the survey. The surveys are tutorial in nature, describing the features of the various models in a systematic manner.

  18. FFT-local gravimetric geoid computation

    NASA Technical Reports Server (NTRS)

    Nagy, Dezso; Fury, Rudolf J.

    1989-01-01

    Model computations show that changes of sampling interval introduce only 0.3 cm changes, whereas zero padding provides an improvement of more than 5 cm in the fast Fourier transform (FFT) generated geoid. For the Global Positioning System (GPS) survey of Franklin County, Ohio, the parameters selected as a result of the model computations allow a large reduction in local data requirements while still retaining cm accuracy when tapering and padding are applied. The results are shown in tables.
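    The benefit of zero padding in FFT-based computation comes from suppressing the circular wraparound that the discrete Fourier transform imposes at the data edges. A small one-dimensional illustration (the signal and kernel are invented, not the geoid data):

    ```python
    import numpy as np

    def fft_convolve(signal, kernel, pad=True):
        """FFT-based convolution. Zero padding to the full linear length
        avoids the circular wraparound that biases values near the edges
        of the data window."""
        n = len(signal) + len(kernel) - 1 if pad else len(signal)
        spec = np.fft.fft(signal, n) * np.fft.fft(kernel, n)
        return np.fft.ifft(spec).real[:len(signal)]

    sig = np.array([1.0, 2.0, 3.0, 4.0])
    ker = np.array([0.5, 0.5])
    padded = fft_convolve(sig, ker, pad=True)    # matches linear convolution
    wrapped = fft_convolve(sig, ker, pad=False)  # first sample contaminated by wraparound
    ```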

  19. 5 CFR 532.241 - Analysis of usable wage survey data.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 5 Administrative Personnel 1 2010-01-01 2010-01-01 false Analysis of usable wage survey data. 532... PREVAILING RATE SYSTEMS Prevailing Rate Determinations § 532.241 Analysis of usable wage survey data. (a)(1.... The weighted average rates shall be computed using the survey job data collected in accordance...
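    The weighted-average computation the regulation refers to can be illustrated in miniature; the exact CFR procedure differs in detail, and the rates and worker counts below are invented:

    ```python
    def weighted_average_rate(rates_and_counts):
        """Employee-weighted mean wage rate from survey job data: each
        surveyed establishment's rate is weighted by the number of
        workers it covers. Illustrative only; 5 CFR 532.241 prescribes
        the actual analysis procedure."""
        total = sum(rate * n for rate, n in rates_and_counts)
        workers = sum(n for _, n in rates_and_counts)
        return total / workers

    # two hypothetical establishments: $18.50/hr for 40 workers, $20.00/hr for 10
    avg = weighted_average_rate([(18.50, 40), (20.00, 10)])  # -> 18.80
    ```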

  20. Advanced Computing for Medicine.

    ERIC Educational Resources Information Center

    Rennels, Glenn D.; Shortliffe, Edward H.

    1987-01-01

    Discusses contributions that computers and computer networks are making to the field of medicine. Emphasizes the computer's speed in storing and retrieving data. Suggests that doctors may soon be able to use computers to advise on diagnosis and treatment. (TW)

  1. Ubiquitous Computing for Remote Cardiac Patient Monitoring: A Survey

    PubMed Central

    Kumar, Sunil; Kambhatla, Kashyap; Hu, Fei; Lifson, Mark; Xiao, Yang

    2008-01-01

    New wireless technologies, such as wireless LANs and sensor networks, open new possibilities for telecardiology: vital parameters can be monitored with wearable biomedical sensors, giving patients the freedom to be mobile while remaining under continuous monitoring, thereby improving the quality of patient care. This paper details the architecture and quality-of-service (QoS) characteristics of integrated wireless telecardiology platforms. It also discusses current promising hardware/software platforms for wireless cardiac monitoring. The design methodology and challenges are provided for realistic implementation. PMID:18604301

  2. Application Trends Survey. 2014 Survey Report

    ERIC Educational Resources Information Center

    Worthington, Rebecca; Bruggeman, Paula

    2014-01-01

    Now in its 15th year, the Graduate Management Admission Council's assessment of application volume trends for graduate management programs offers timely insights into demographic shifts and other factors defining the candidate pools for the 2014 application cycle. Responses collected in the 2014 survey represent a record-breaking total of 748 MBA,…

  3. Corporate Recruiters Survey: 2014 Survey Report

    ERIC Educational Resources Information Center

    Estrada Worthington, Rebecca

    2014-01-01

    The 2014 Corporate Recruiters Survey Report examines the current hiring outlook for graduate business students and analyzes demand by industry and world region, salaries, job functions, and mobility in regional job placement. It also explores recruiter behavior, including recruitment practices and school and candidate selection criteria, and…

  4. Application Trends Survey, 2011. Survey Report

    ERIC Educational Resources Information Center

    Estrada, Rebecca

    2011-01-01

    The 2011 Application Trends Survey conducted by the Graduate Management Admission Council (GMAC) is the industry source for comprehensive statistics and timely and reliable insights into the demand for graduate management education around the world. A total of 649 programs from 331 business schools and faculties worldwide representing 45 countries…

  5. Alumni Perspectives Survey, 2011. Survey Report

    ERIC Educational Resources Information Center

    Sheikh, Sabeen

    2011-01-01

    Since the Graduate Management Admission Council[R] (GMAC[R]) first began conducting its Alumni Perspectives Surveys 11 years ago, several "truths" about graduate business school alumni have consistently stood the test of time: They are and remain eminently employable. They constantly rate the value of the degree highly. This year's results are…

  6. Alumni Perspectives Survey. 2014 Survey Report

    ERIC Educational Resources Information Center

    Schoenfeld, Gregg

    2014-01-01

    Alumni are a powerful force in building a business school's brand. They recommend programs to prospective students, they connect current students to job opportunities, and they contribute significantly to building a school's legacy. The findings in the 2014 Alumni Perspectives Survey Report provide a current snapshot of nearly 21,000 business…

  7. Computer security in DOE distributed computing systems

    SciTech Connect

    Hunteman, W.J.

    1990-01-01

    The modernization of DOE facilities amid limited funding is creating pressure on DOE facilities to find innovative approaches to their daily activities. Distributed computing systems are becoming cost-effective solutions to improved productivity. This paper defines and describes typical distributed computing systems in the DOE. The special computer security problems present in distributed computing systems are identified and compared with traditional computer systems. The existing DOE computer security policy supports only basic networks and traditional computer systems and does not address distributed computing systems. A review of the existing policy requirements is followed by an analysis of the policy as it applies to distributed computing systems. Suggested changes in the DOE computer security policy are identified and discussed. The long lead time in updating DOE policy will require guidelines for applying the existing policy to distributed systems. Some possible interim approaches are identified and discussed. 2 refs.

  8. Computers and occupational therapy.

    PubMed

    English, C B

    1975-01-01

    The benefits and applications of computer science for occupational therapy are explored and a basic, functional description of the computer and computer programming is presented. Potential problems and advantages of computer utilization are compared and examples of existing computer systems in health fields are cited. Methods for successfully introducing computers are discussed.

  9. The Case for Ceres: Report to the Planetary Science Decadal Survey Committee

    NASA Astrophysics Data System (ADS)

    Rivkin, Andrew S.; Castillo-Rogez, J. C.; Cohen, B. A.; Conrad, P. G.; Li, J.; Lim, L. F.; Lovell, A. J.; McCord, T. M.; McFadden, L. A.; McKinnon, W. B.; Milliken, R. E.; Russell, C. T.; Schmidt, B. E.; Sykes, M. V.; Thomas, P. C.

    2009-09-01

    Ceres is the largest object in the asteroid belt, accounting for one-third of the mass found between Mars and Jupiter. Since the last decadal survey was undertaken, our knowledge of Ceres has blossomed, with observations, modeling, and theory converging on a paradigm of a severely aqueously altered body with an icy mantle covering a rocky core, transitional in nature between the rocky bodies of the inner solar system and the icy satellites found at the jovian planets [1-6]. However, this paradigm is still in its infancy, and recent work has proposed alternatives including an undifferentiated object [7], and even an origin in the outer solar system beyond Neptune [8]. While Dawn will begin the spacecraft reconnaissance of Ceres and provide a wealth of data, geophysical, geochemical, and astrobiological considerations show Ceres to be uniquely compelling as a target for continued ground-based and space-based attention in the coming decade. We will summarize the current state of knowledge about Ceres, present the outstanding science questions posed by Ceres, and offer recommendations for priorities for the upcoming decade of Ceres research. References: [1] McCord and Sotin (2005) JGR; [2] Thomas et al. (2005) Nature; [3] Li et al. (2006) Icarus; [4] Rivkin et al. (2006) Icarus; [5] Milliken and Rivkin (2009) Nature Geosci.; [6] Carry et al. (2008) Astron. Astrophys.; [7] Castillo-Rogez and McCord (2009) Icarus, Zolotov (2009) Icarus; [8] McKinnon (2008) ACM 2008.

  10. Infrastructure Survey 2009

    ERIC Educational Resources Information Center

    Group of Eight (NJ1), 2010

    2010-01-01

    In 2008 the Group of Eight (Go8) released a first report on the state of its buildings and infrastructure, based on a survey undertaken in 2007. A further survey was undertaken in 2009, updating some information about the assessed quality, value and condition of buildings and use of space. It also collated data related to aspects of the estate not…

  11. Seven Survey Sins

    ERIC Educational Resources Information Center

    Gehlbach, Hunter

    2015-01-01

    As pressure builds to assess students, teachers, and schools, educational practitioners and policy makers are increasingly looking toward student perception surveys as a promising means to collect high-quality, useful data. For instance, the widely cited Measures of Effective Teaching study lists student perception surveys as one of the three key…

  12. Basic Surveying Technology.

    ERIC Educational Resources Information Center

    Olson, David A.; Kellum, Mary, Ed.

    This document is intended to help teachers prepare students to perform the duties of any member of a surveying party, including those of party chief, in the field and in the office. It contains instructional units on introduction to surveying, safety, horizontal measurements, vertical measurements, angles and directions, angular measurements,…

  13. Leaver Survey Report, 1996.

    ERIC Educational Resources Information Center

    Cunningham, Stephen

    To determine factors influencing attrition and retention at Pennsylvania College of Technology, a survey was conducted of the 688 students who were enrolled in spring 1996 but neither graduated nor enrolled in fall 1996. Responses were received from 437 former students and were compared to findings from a similar survey of 482 leavers in 1994.…

  14. Submarine cable route survey

    SciTech Connect

    Herrouin, G.; Scuiller, T.

    1995-12-31

    The growth of the telecommunication market is very significant. Since the beginning of the nineties, optical fiber submarine cables have increasingly been favored over satellites. These submarine telecommunication highways require accurate surveys in order to select the optimum route and determine the cable characteristics. The advanced technology tools used for these surveys are presented along with their implementation.

  15. Freshman Survey Report, 1997.

    ERIC Educational Resources Information Center

    Cunningham, Steve; Hiris, Eric

    The Cooperative Institutional Research Program (CIRP) sponsors a national annual survey that gathers data on incoming freshman classes at two- and four-year institutions. The data allow the colleges to compare their students with previous classes and with the "average" American freshman. This report presents findings from the 1997 CIRP survey at…

  16. And the Survey Says...

    ERIC Educational Resources Information Center

    White, Susan C.

    2016-01-01

    Last month we highlighted our Quadrennial Survey of High School Physics Teachers. Using data from the survey, we have looked at the availability of high school physics. We report that about 95% of high school seniors attend a high school where physics is offered regularly--either every year or every other year. A U.S. Department of Education…

  17. Maryland Adolescent Survey, 1994.

    ERIC Educational Resources Information Center

    Maryland State Dept. of Education, Baltimore. Div. of Compensatory Education and Support Services.

    This report details the latest findings from the biennial Maryland Adolescent Survey of the extent and trends in alcohol, tobacco, and drug use among students. To permit comparisons with national findings and trends, the form and content of survey items parallel those of the annual national study "Monitoring the Future," conducted by the…

  18. University Community Survey.

    ERIC Educational Resources Information Center

    Francis, John Bruce; Lewis, Steven

    This report is of an omnibus survey of campus attitudes conducted by the Survey Research Center (SRC) of the State University of New York at Buffalo. Its primary purpose was to provide accurate information as a basis for effective decisions by institutional policy makers. A random sample of 326 students, 98 faculty, and 95 staff participated in…

  19. Attitude Surveys Document Sampler.

    ERIC Educational Resources Information Center

    Walker, Albert, Comp.

    This packet presents results of a series of attitude surveys representing a variety of purposes, methods and defined publics. They range from a simple questionnaire prepared and mailed to a small group of key individuals by a public relations staff to scientifically derived surveys purchased from Louis Harris and Associates and other research…

  20. Managing Online Survey Data

    ERIC Educational Resources Information Center

    Ritter, Lois A., Ed.; Sue, Valerie M., Ed.

    2007-01-01

    Managing data collected from online surveys may be a straightforward process involving no more than downloading a spreadsheet from a Web survey host and presenting descriptive statistics associated with each questionnaire item. On the other hand, if the evaluation objectives require more complex analysis and presentation of the data, it will be…

  1. Readership Surveys Build Confidence.

    ERIC Educational Resources Information Center

    Bohle, Bob

    1980-01-01

    Reports results of a survey of students' opinions of their school newspaper. Lists four changes that were based on the survey results: (1) added emphasis on meeting students' personal interest needs, (2) increase in short feature and humorous stories, (3) more persuasive editorial and opinion pieces, and (4) increase in advertising benefits for…

  2. Technology & Distance Learning Survey.

    ERIC Educational Resources Information Center

    Florida Human Resources Development, Inc., Gainesville.

    A survey was conducted to assess the current state of technology and distance learning awareness and usage in Florida's adult education and community-based programs. Data were gathered through a survey of 350 adult practitioners, literacy providers, community-based organizations and libraries throughout the state (125 responses [36 percent return…

  3. Surveying the Community.

    ERIC Educational Resources Information Center

    Brown, Marleen

    The booklet serves as a step-by-step guide to assist career education teachers and administrators in setting up a program of utilizing the resources in the community. It provides specific procedures, forms, and suggestions to help the school in surveying the community. Nine steps involved in surveying the community are discussed in detail: (1)…

  4. 2007 Maryland Adolescent Survey

    ERIC Educational Resources Information Center

    Maryland State Department of Education, 2008

    2008-01-01

    Periodically, Maryland's sixth, eighth, tenth, and twelfth graders are surveyed to determine the nature, extent, and trend of alcohol, tobacco, and other drug (ATOD) use among adolescents. The "2007 Maryland Adolescent Survey (MAS)" presents the latest findings regarding ATOD use by Maryland's adolescents and compares State and local findings with…

  5. Annual HR Salary Survey.

    ERIC Educational Resources Information Center

    Schaeffer, Patricia

    2000-01-01

    A trainers' salary survey collected data on 1,091 companies, 31,615 employees, and 97 human resource jobs. Results show pay for human resource professionals is continuing to rise. The survey contains information on base salaries, annual bonuses and incentives, and long-term eligibility incentives. (JOW)

  6. Investigation of the computer experiences and attitudes of pre-service mathematics teachers: new evidence from Turkey.

    PubMed

    Birgin, Osman; Catlioğlu, Hakan; Gürbüz, Ramazan; Aydin, Serhat

    2010-10-01

    This study aimed to investigate the experiences of pre-service mathematics (PSM) teachers with computers and their attitudes toward them. The Computer Attitude Scale, Computer Competency Survey, and Computer Use Information Form were administered to 180 Turkish PSM teachers. Results revealed that most PSM teachers used computers at home and at Internet cafes, and that their competency was generally intermediate and upper level. The study concludes that PSM teachers' attitudes about computers differ according to their years of study, computer ownership, level of computer competency, frequency of computer use, computer experience, and whether they had attended a computer-aided instruction course. However, computer attitudes were not affected by gender.

  7. Physics Survey Overview

    SciTech Connect

    2002-12-30

    An overview of a series of assessments of the branches of physics carried out by the Board on Physics and Astronomy of the National Research Council. It identifies frontier areas of physics and makes recommendations on research priorities. The Board on Physics and Astronomy (BPA) has conducted a new decadal survey of physics entitled ''Physics in a New Era''. The survey includes assessments of the main branches of physics as well as certain selected emerging areas. The various elements of the survey were prepared by separately-appointed National Research Council (NRC) committees. The BPA formed the Physics Survey Overview Committee (PSOVC) to complete the survey by preparing an overview of the field of physics, to summarize and synthesize the results of the various assessments, and to address cross-cutting issues that concern physics as a whole.

  8. Building Technologies Residential Survey

    SciTech Connect

    Secrest, Thomas J.

    2005-11-07

    Introduction: A telephone survey of 1,025 residential occupants was administered in late October for the Building Technologies Program (BT) to gather information on residential occupant attitudes, behaviors, knowledge, and perceptions. The next section, Survey Results, provides an overview of the responses, with major implications and caveats. Additional information is provided in three appendices as follows:
    - Appendix A -- Summary Response: Provides summary tabular data for the 13 questions that, with subparts, comprise a total of 25 questions.
    - Appendix B -- Benchmark Data: Provides a benchmark by six categories (ownership, heating fuel, geographic location, race, household size, and income) to the 2001 Residential Energy Consumption Survey administered by EIA.
    - Appendix C -- Background on Survey Method: Provides the reader with an understanding of the survey process and interpretation of the results.

  9. The Methanol Multibeam Survey

    NASA Astrophysics Data System (ADS)

    Green, James A.; Cohen, R. J.; Caswell, J. L.; Fuller, G. A.; Brooks, K.; Burton, M. G.; Chrysostomou, A.; Diamond, P. J.; Ellingsen, S. P.; Gray, M. D.; Hoare, M. G.; Masheder, M. R. W.; McClure-Griffiths, N.; Pestalozzi, M.; Phillips, C.; Quinn, L.; Thompson, M. A.; Voronkov, M.; Walsh, A.; Ward-Thompson, D.; Wong-McSweeney, D.; Yates, J. A.; Cox, J.

    2007-03-01

    A new 7-beam methanol multibeam receiver is being used to survey the Galaxy for newly forming massive stars, which are pinpointed by strong methanol maser emission at 6.668 GHz. The receiver, jointly constructed by Jodrell Bank Observatory (JBO) and the Australia Telescope National Facility (ATNF), was successfully commissioned at Parkes in January 2006. The Parkes-Jodrell survey of the Milky Way for methanol masers is two orders of magnitude faster than previous systematic surveys using 30-m class dishes, and is the first systematic survey of the entire Galactic plane. The first 53 days of observations with the Parkes telescope have yielded 518 methanol sources, of which 218 are new discoveries. We present the survey methodology as well as preliminary results and analysis.

  10. Web-Based Surveys: Not Your Basic Survey Anymore

    ERIC Educational Resources Information Center

    Bertot, John Carlo

    2009-01-01

    Web-based surveys are not new to the library environment. Although such surveys began as extensions of print surveys, the Web-based environment offers a number of approaches to conducting a survey that the print environment cannot duplicate easily. Since 1994, the author and others have conducted national surveys of public library Internet…

  11. Mobile-Computing Trends: Lighter, Faster, Smarter

    ERIC Educational Resources Information Center

    Godwin-Jones, Robert

    2008-01-01

    The new era of mobile computing promises greater variety in applications, highly improved usability, and speedier networking. The 3G iPhone from Apple is the poster child for this trend, but there are plenty of other developments that point in this direction. Previous surveys, in LLT, and by researchers at the UK's Open University, have…

  12. Design and Implementation of Instructional Computer Systems.

    ERIC Educational Resources Information Center

    Graczyk, Sandra L.

    1989-01-01

    Presents an input-process-output (IPO) model that can facilitate the design and implementation of instructional micro and minicomputer systems in school districts. A national survey of school districts with outstanding computer systems is described, a systems approach to develop the model is explained, and evaluation of the system is discussed.…

  13. Can Computer Translation Replace Human Translation?

    ERIC Educational Resources Information Center

    Schairer, Karen

    1996-01-01

    Evaluates three commercial computer-based language translation programs' translation of a university social sciences telephone survey from English to Spanish. The three programs, "Spanish Scholar," "Spanish Assistant," and "Spanish Amigo," were rated as unacceptable in their quality of translations by native and near-native Spanish speakers. (nine…

  14. Computer Applications in Balancing Chemical Equations.

    ERIC Educational Resources Information Center

    Kumar, David D.

    2001-01-01

    Discusses computer-based approaches to balancing chemical equations. Surveys 13 methods: 6 based on matrix methods, 2 interactive programs, 1 stand-alone system, 1 implemented as an algorithm in BASIC, 1 based on design engineering, 1 written in HyperCard, and 1 prepared for the World Wide Web. (Contains 17 references.) (Author/YDS)

  15. Computer-Assisted Study Skills Improvement Program.

    ERIC Educational Resources Information Center

    Brown, William F.; Forristall, Dorothy Z.

    The Computer-Assisted Study Skills Improvement Program (CASSIP) is designed to help students develop effective study skills and academic attitudes, thus increasing their potential for scholastic success. The program contains four integrated items: Study Skills Surveys; Study Skills Modules; Study Skills Notebook; and Study Skills Test. The surveys…

  16. Problems and Prospects in Foreign Language Computing.

    ERIC Educational Resources Information Center

    Pusack, James P.

    The problems and prospects of the field of foreign language computing are profiled through a survey of typical implementation, development, and research projects that language teachers may undertake. Basic concepts in instructional design, hardware, and software are first clarified. Implementation projects involving courseware evaluation, textbook…

  17. Rock-Strata Names Go on Computer

    ERIC Educational Resources Information Center

    Cohee, George V.

    1973-01-01

    Reports that the United States Geological Survey has recently prepared computer print-outs of the rock-stratigraphic names in good usage in published references in the United States, using the standard stratigraphic code of the American Association of Petroleum Geologists. A sample of the print-out is provided with explanatory notes. (JR)

  18. Survey of nuclear fuel-cycle codes

    SciTech Connect

    Thomas, C.R.; de Saussure, G.; Marable, J.H.

    1981-04-01

    A two-month survey of nuclear fuel-cycle models was undertaken. This report presents the information forthcoming from the survey. Of the nearly thirty codes reviewed in the survey, fifteen have been identified as potentially useful in fulfilling the tasks of the Nuclear Energy Analysis Division (NEAD) as defined in its FY 1981-1982 Program Plan. Six of the fifteen codes are given individual reviews. The individual reviews address such items as the funding agency, the author and organization, the date of completion of the code, adequacy of documentation, computer requirements, history of use, variables that are input and forecast, type of reactors considered, part of the fuel cycle modeled, and scope of the code (international or domestic, long-term or short-term, regional or national). The report recommends that the Model Evaluation Team perform an evaluation of the EUREKA uranium mining and milling code.

  19. Improving Transfer of Learning in a Computer Based Classroom.

    ERIC Educational Resources Information Center

    Davis, Jay Bee

    This report describes a program for improving the transfer of the learning of different techniques used in computer applications. The targeted population consisted of sophomores and juniors in a suburban high school in a middle class community. The problem was documented through teacher surveys, student surveys, anecdotal records and behavioral…

  20. Planning for Instructional Use of Radio and Computers by Satellite.

    ERIC Educational Resources Information Center

    Suppes, Patrick

    This paper surveys approaches that are deemed practical for instructional use of radios and computers by satellite transmission. For each of the two instructional technologies a brief history is provided, a survey of the evaluation studies of effectiveness is given, and a concluding section on planning for application is provided. Because the…