Science.gov

Sample records for acm computing surveys

  1. ACM TOMS replicated computational results initiative

    DOE PAGES

    Heroux, Michael Allen

    2015-06-03

    The scientific community relies on the peer review process to assure the quality of published material, the goal being to build a body of work we can trust. Computational journals such as the ACM Transactions on Mathematical Software (TOMS) use this process to rigorously promote the clarity and completeness of content and the citation of prior work. At the same time, it is unusual to independently confirm computational results.

  2. ACM TOMS replicated computational results initiative

    SciTech Connect

    Heroux, Michael Allen

    2015-06-03

    The scientific community relies on the peer review process to assure the quality of published material, the goal being to build a body of work we can trust. Computational journals such as the ACM Transactions on Mathematical Software (TOMS) use this process to rigorously promote the clarity and completeness of content and the citation of prior work. At the same time, it is unusual to independently confirm computational results.

  3. Categorization of Computing Education Resources into the ACM Computing Classification System

    SciTech Connect

    Chen, Yinlin; Bogen, Paul Logasa; Fox, Dr. Edward A.; Hsieh, Dr. Haowei; Cassel, Dr. Lillian N.

    2012-01-01

    The Ensemble Portal harvests resources from multiple heterogeneous federated collections. Managing these dynamically growing collections requires an automatic mechanism to categorize records into corresponding topics. We propose an approach that uses existing ACM DL metadata to build classifiers for harvested resources in the Ensemble project. We also present our experience using the Amazon Mechanical Turk platform to build ground-truth training data sets from Ensemble collections.

  4. Preliminary survey report: control technology for the ceramic industry at Acme Brick Company, Malvern, Arkansas

    SciTech Connect

    Godbey, F.W.

    1983-06-01

    Health-hazard control methods, work processes, and existing control technologies used in the manufacture of brick were surveyed at Acme Brick Company, Malvern, Arkansas, in June 1983. The company employed about 32 workers to produce structural brick from alluvial clay, free clay, shale, and aggregate. A potential hazard existed from silica exposure, since the clays contained about 20% quartz. Raw materials were transported in a cab-enclosed front-end loader to feeders that delivered the materials to a crusher. Blended, coarsely crushed material was moved by conveyor to a hammer mill for fine crushing. Production-size product was transported by overhead conveyor to storage silos in the production building. The entire particle-size reduction process was completely automated. The clay-preparation building and raw-material storage area were isolated from the production building, and only two workers performed the crushing and grinding operations. Material transfer points had removable covers, and a water-mist spray was used on one conveyor of each line. The operation was monitored from a totally enclosed, air-conditioned control room. Head and eye protection were required. The author does not recommend an in-depth study of the company's control technologies.

  5. Proceedings of the ACM-SIGSAM 1989 international symposium on symbolic and algebraic computation

    SciTech Connect

    Not Available

    1989-01-01

    This book contains papers on the following topics: Gencray, a portable code generator for Cray Fortran; massively parallel symbolic computation; reduction of group constructions to point stabilizers; and constrained equational reasoning.

  6. Human factors in computing systems: focus on patient-centered health communication at the ACM SIGCHI conference.

    PubMed

    Wilcox, Lauren; Patel, Rupa; Chen, Yunan; Shachak, Aviv

    2013-12-01

    Health information technologies, such as electronic health records (EHR) and secure messaging, have already transformed interactions among patients and clinicians. In addition, technologies supporting asynchronous communication outside of clinical encounters, such as email, SMS, and patient portals, are increasingly used for follow-up, education, and data reporting. Meanwhile, patients are increasingly adopting personal tools to track various aspects of health status and therapeutic progress, and wish to review these data with clinicians during consultations. These issues have drawn increasing interest from the human-computer interaction (HCI) community, with special focus on critical challenges in patient-centered interactions and on design opportunities that can address these challenges. We saw this community presenting and interacting at ACM SIGCHI 2013, the Conference on Human Factors in Computing Systems (also known as CHI), held April 27-May 2, 2013, at the Palais des Congrès de Paris in France. CHI 2013 featured many formal avenues to pursue patient-centered health communication: a well-attended workshop, tracks of original research, and a lively panel discussion. In this report, we highlight these events and the main themes we identified. We hope that it will help bring the health care communication and HCI communities closer together.

  7. Cloud Computing Security Issue: Survey

    NASA Astrophysics Data System (ADS)

    Kamal, Shailza; Kaur, Rajpreet

    2011-12-01

    Cloud computing has been a growing field in the IT industry since 2007, when it was proposed by IBM. Other companies, such as Google, Amazon, and Microsoft, provide further cloud computing products. Cloud computing is Internet-based computing that shares resources and information on demand. It provides services such as SaaS, IaaS, and PaaS. The services and resources are shared through virtualization, which runs multiple applications on the cloud. This discussion surveys the security challenges that arise in cloud computing and describes some standards and protocols for how security can be managed.

  8. ACME-III and ACME-IV Final Campaign Reports

    SciTech Connect

    Biraud, S. C.

    2016-01-01

    The goals of the Atmospheric Radiation Measurement (ARM) Climate Research Facility’s third and fourth Airborne Carbon Measurements (ACME) field campaigns, ACME-III and ACME-IV, are: 1) to measure and model the exchange of CO2, water vapor, and other greenhouse gases by the natural, agricultural, and industrial ecosystems of the Southern Great Plains (SGP) region; 2) to develop quantitative approaches to relate these local fluxes to the concentration of greenhouse gases measured at the Central Facility tower and in the atmospheric column above the ARM SGP Central Facility; 3) to develop and test bottom-up measurement and modeling approaches to estimate regional-scale carbon balances; and 4) to develop and test inverse modeling approaches to estimate regional-scale carbon balance and anthropogenic sources over continental regions. Regular soundings of the atmosphere from near the surface into the mid-troposphere are essential for this research.

  9. Assessment of Competencies for Computer Information Systems Curricula.

    ERIC Educational Resources Information Center

    Womble, Myra N.

    1993-01-01

    In a survey of 80 managerial and 130 entry-level computer professionals, most entry workers believed they possessed competencies identified in Association for Computing Machinery (ACM) curricula; most managers did not agree. Most managers rated 28% of ACM competencies moderately to not important; 63% were so rated by entry workers. (SK)

  10. Survey of Fault Tolerant Computer Security and Computer Safety.

    DTIC Science & Technology

    An introduction to the report as a whole. Contents: Fundamental Concepts of Fault-Tolerant Computing; Survey of Device and System Testing; Computer Security in Defense Systems; and State of the Art of Safety for Computer Controlled Systems.

  11. The Papers of the ACM SIGCSE-SIGCUE Technical Symposium, Computer Science and Education (Anaheim, California, February 12 and 13, 1976).

    ERIC Educational Resources Information Center

    Colman, Ron, Ed.; Lorton, Paul, Jr., Ed.

    1976-01-01

    Over 65 papers presented at a joint symposium sponsored by the Association for Computing Machinery's Special Interest Groups on Computer Uses in Education and on Computer Science Education are gathered here. The papers cover a wide range of topics, including structured programming, computer literacy, computer science education, computerized test…

  12. Experiments in Computing: A Survey

    PubMed Central

    Moisseinen, Nella

    2014-01-01

    Experiments play a central role in science. The role of experiments in computing is, however, unclear. Questions about the relevance of experiments in computing attracted little attention until the 1980s. As the discipline then saw a push towards experimental computer science, a variety of technically, theoretically, and empirically oriented views on experiments emerged. As a consequence of those debates, today's computing fields use experiments and experiment terminology in a variety of ways. This paper analyzes experimentation debates in computing. It presents five ways in which debaters have conceptualized experiments in computing: feasibility experiment, trial experiment, field experiment, comparison experiment, and controlled experiment. This paper has three aims: to clarify experiment terminology in computing; to contribute to disciplinary self-understanding of computing; and, due to computing's centrality in other fields, to promote understanding of experiments in modern science in general. PMID:24688404

  13. Experiments in computing: a survey.

    PubMed

    Tedre, Matti; Moisseinen, Nella

    2014-01-01

    Experiments play a central role in science. The role of experiments in computing is, however, unclear. Questions about the relevance of experiments in computing attracted little attention until the 1980s. As the discipline then saw a push towards experimental computer science, a variety of technically, theoretically, and empirically oriented views on experiments emerged. As a consequence of those debates, today's computing fields use experiments and experiment terminology in a variety of ways. This paper analyzes experimentation debates in computing. It presents five ways in which debaters have conceptualized experiments in computing: feasibility experiment, trial experiment, field experiment, comparison experiment, and controlled experiment. This paper has three aims: to clarify experiment terminology in computing; to contribute to disciplinary self-understanding of computing; and, due to computing's centrality in other fields, to promote understanding of experiments in modern science in general.

  14. Survey of Computer Usage in Louisiana Schools.

    ERIC Educational Resources Information Center

    Kirby, Peggy C.; And Others

    A survey of computer usage in 179 randomly selected public elementary and secondary schools in Louisiana was conducted in the spring of 1988. School principals responded to questions about school size, the socioeconomic status of the student population, the number of teachers certified in computer literacy and computer science, and the number of…

  15. Proceedings of the Annual ACM Symposium (11th) on Principles of Distributed Computing Held in Vancouver, British Columbia, Canada on 10-12 Aug 1992

    DTIC Science & Technology

    1992-08-10

  16. A Short Survey on Quantum Computers

    SciTech Connect

    Kanamori, Yoshito; Yoo, Seong-Moo; Pan, W. D.; Sheldon, Frederick T

    2006-01-01

    Quantum computing is an emerging technology. The clock frequency of current computer processor systems may reach about 40 GHz within the next 10 years. By then, one atom may represent one bit, and electrons under such conditions are no longer described by classical physics, so a new model of the computer may be necessary. The quantum computer is one proposal that may have merit in dealing with certain important, computationally intense problems that current (classical) computers cannot solve because they require too much processing time. For example, Shor's algorithm factors a large integer in polynomial time, while classical factoring algorithms require exponential time. In this paper we briefly survey the current status of quantum computers, quantum computer systems, and quantum simulators. Keywords: classical computers, quantum computers, quantum computer systems, quantum simulators, Shor's algorithm.
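
    The speedup the abstract attributes to Shor's algorithm comes from a quantum subroutine that finds the order r of a base a modulo N; turning that order into a factor is purely classical number theory. A minimal sketch of that classical step (the function name is our own illustration, not from the paper):

```python
from math import gcd

def factor_from_order(N, a, r):
    # Shor's classical post-processing: given the order r of a mod N
    # (the smallest r > 0 with a**r % N == 1), derive a factor of N.
    if r % 2 != 0:
        return None          # odd order: retry with another base a
    y = pow(a, r // 2, N)
    if y == N - 1:
        return None          # trivial square root of 1: retry
    for d in (gcd(y - 1, N), gcd(y + 1, N)):
        if 1 < d < N:
            return d
    return None

# 7 has order 4 modulo 15, so the reduction yields a factor of 15.
```

For example, `factor_from_order(15, 7, 4)` recovers the factor 3, since 7^2 = 49 ≡ 4 (mod 15) and gcd(4 - 1, 15) = 3.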

  17. Computer Graphics Evolution: A Survey.

    ERIC Educational Resources Information Center

    Gartel, Laurence M.

    1985-01-01

    The history of the field of computer graphics is discussed. In 1976 there were no institutions that offered any kind of study of computer graphics. Today electronic image-making is seen as a viable, legitimate art form, and courses are offered by many universities and colleges. (RM)

  18. The Survey; An Interdisciplinary Computer Application.

    ERIC Educational Resources Information Center

    Carolan, Kevin

    APL (A Programming Language), a computer language used thus far largely for mathematical and scientific applications, can be used to tabulate a survey. Since this computer application can be appreciated by social scientists as well as mathematicians, it serves as an invaluable pedagogical tool for presenting APL to nonscientific users. An…

  19. Survey: Computer Usage in Design Courses.

    ERIC Educational Resources Information Center

    Henley, Ernest J.

    1983-01-01

    Presents results of a survey of chemical engineering departments regarding computer usage in senior design courses. Results are categorized according to: computer usage (use of process simulators, student-written programs, faculty-written or "canned" programs; costs (hard and soft money); and available software. Programs offered are…

  20. Asbestos-Containing Materials (ACM) and Demolition

    EPA Pesticide Factsheets

    There are specific federal regulatory requirements that require the identification of asbestos-containing materials (ACM) in many of the residential buildings that are being demolished or renovated by a municipality.

  1. Quark ACM with topologically generated gluon mass

    NASA Astrophysics Data System (ADS)

    Choudhury, Ishita Dutta; Lahiri, Amitabha

    2016-03-01

    We investigate the effect of a small, gauge-invariant mass of the gluon on the anomalous chromomagnetic moment (ACM) of quarks by perturbative calculations at one-loop level. The mass of the gluon is taken to have been generated via a topological mass generation mechanism, in which the gluon acquires a mass through its interaction with an antisymmetric tensor field B_μν. For a small gluon mass (< 10 MeV), we calculate the ACM at momentum transfer q^2 = -M_Z^2. We compare these with the ACM calculated for a gluon mass arising from a Proca mass term. We find that the ACM of the up, down, strange and charm quarks varies significantly with the gluon mass, while the ACM of the top and bottom quarks shows negligible gluon-mass dependence. The mechanism of gluon mass generation is most important for the strange quark's ACM, but not so much for the other quarks. We also show the results at q^2 = -m_t^2. We find that the dependence on gluon mass at q^2 = -m_t^2 is much weaker than at q^2 = -M_Z^2 for all quarks.

  2. Survey of Federal Computer Security Policies,

    DTIC Science & Technology

    1980-11-01

    Federal agencies have promulgated computer security policies; however, these varied in approach, scope and applicability. Survey results reflected the historical... life cycle coverage (ADP systems and/or data systems), with subdisciplines including personnel security, physical security, and communications security... Further, although inference may be made concerning the overall relative quality of documents in terms of the indicators specified, the subcommittee did...

  3. A Survey of Techniques for Approximate Computing

    DOE PAGES

    Mittal, Sparsh

    2016-03-18

    Approximate computing trades off computation quality with the effort expended; as rising performance demands confront plateauing resource budgets, approximate computing has become not merely attractive but even imperative. Here, we present a survey of techniques for approximate computing (AC). We discuss strategies for finding approximable program portions and monitoring output quality, techniques for using AC in different processing units (e.g., CPU, GPU and FPGA), processor components, memory technologies, etc., and programming frameworks for AC. Moreover, we classify these techniques based on several key characteristics to emphasize their similarities and differences. Finally, the aim of this paper is to provide insights to researchers into the working of AC techniques and to inspire more efforts in this area to make AC the mainstream computing approach in future systems.

  4. A Survey of Techniques for Approximate Computing

    SciTech Connect

    Mittal, Sparsh

    2016-03-18

    Approximate computing trades off computation quality with the effort expended; as rising performance demands confront plateauing resource budgets, approximate computing has become not merely attractive but even imperative. Here, we present a survey of techniques for approximate computing (AC). We discuss strategies for finding approximable program portions and monitoring output quality, techniques for using AC in different processing units (e.g., CPU, GPU and FPGA), processor components, memory technologies, etc., and programming frameworks for AC. Moreover, we classify these techniques based on several key characteristics to emphasize their similarities and differences. Finally, the aim of this paper is to provide insights to researchers into the working of AC techniques and to inspire more efforts in this area to make AC the mainstream computing approach in future systems.
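
    A representative AC technique from the program-level category such surveys cover is loop perforation: execute only a fraction of a loop's iterations and accept a proportionally less accurate result. A minimal sketch (function name and skip factor are our own illustration, not from the paper):

```python
def perforated_mean(xs, skip=4):
    # Loop perforation: visit only every `skip`-th element, cutting
    # the work to ~1/skip of the original at some cost in accuracy.
    sampled = xs[::skip]
    return sum(sampled) / len(sampled)

# The exact mean of 0..999 is 499.5; perforation with skip=4 visits
# 250 of the 1000 elements and stays close.
approx = perforated_mean(list(range(1000)))
```

Monitoring output quality, which the survey pairs with such techniques, would here amount to checking the gap between the perforated and exact means against an error budget.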

  5. Additive Construction with Mobile Emplacement (ACME)

    NASA Technical Reports Server (NTRS)

    Vickers, John

    2015-01-01

    The Additive Construction with Mobile Emplacement (ACME) project is developing technology to build structures on planetary surfaces using in-situ resources. The project focuses on the construction of both 2D (landing pads, roads, and structure foundations) and 3D (habitats, garages, radiation shelters, and other structures) infrastructure needs for planetary surface missions. The ACME project seeks to raise the Technology Readiness Level (TRL) of two components needed for planetary surface habitation and exploration: 3D additive construction (e.g., contour crafting), and excavation and handling technologies (to effectively and continuously produce in-situ feedstock). Additionally, the ACME project supports the research and development of new materials for planetary surface construction, with the goal of reducing the amount of material to be launched from Earth.

  6. Towards an Autonomic Cluster Management System (ACMS) with Reflex Autonomicity

    NASA Technical Reports Server (NTRS)

    Truszkowski, Walt; Hinchey, Mike; Sterritt, Roy

    2005-01-01

    Cluster computing, whereby a large number of simple processors or nodes are combined together to apparently function as a single powerful computer, has emerged as a research area in its own right. The approach offers a relatively inexpensive means of providing a fault-tolerant environment and achieving significant computational capabilities for high-performance computing applications. However, the task of manually managing and configuring a cluster quickly becomes daunting as the cluster grows in size. Autonomic computing, with its vision to provide self-management, can potentially solve many of the problems inherent in cluster management. We describe the development of a prototype Autonomic Cluster Management System (ACMS) that exploits autonomic properties in automating cluster management and its evolution to include reflex reactions via pulse monitoring.

  7. Multivariate Lipschitz optimization: Survey and computational comparison

    SciTech Connect

    Hansen, P.; Gourdin, E.; Jaumard, B.

    1994-12-31

    Many methods have been proposed to minimize a multivariate Lipschitz function on a box. They pertain to three approaches: (i) reduction to the univariate case by projection (Pijavskii) or by using a space-filling curve (Strongin); (ii) construction and refinement of a single upper bounding function (Pijavskii, Mladineo, Mayne and Polak, Jaumard, Hermann and Ribault, Wood, ...); (iii) branch and bound with local upper bounding functions (Galperin, Pintér, Meewella and Mayne, the present authors). A survey is made, stressing similarities of algorithms, expressed when possible within a unified framework. Moreover, an extensive computational comparison is reported.
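
    The bounding-function idea in approach (ii) is easiest to see in one dimension. For minimization, a Lipschitz constant L yields the sawtooth lower bound max_i (f(x_i) - L|x - x_i|), and a Pijavskii-type scheme always evaluates next where that bound is lowest. A sketch under those assumptions (names and tolerances are our own, not any cited author's code):

```python
import math

def piyavskii_minimize(f, a, b, L, tol=1e-3, max_iter=200):
    # Sorted sample points (x, f(x)) defining the sawtooth lower bound.
    pts = [(a, f(a)), (b, f(b))]
    best_x, best_f = min(pts, key=lambda p: p[1])
    for _ in range(max_iter):
        # Between neighbours (x0,f0) and (x1,f1) the bound's minimum
        # sits at xc with value lb; pick the interval with lowest lb.
        best_lb, best_xc, idx = math.inf, None, None
        for i in range(len(pts) - 1):
            (x0, f0), (x1, f1) = pts[i], pts[i + 1]
            xc = 0.5 * (x0 + x1) + (f0 - f1) / (2.0 * L)
            lb = 0.5 * (f0 + f1) - 0.5 * L * (x1 - x0)
            if lb < best_lb:
                best_lb, best_xc, idx = lb, xc, i
        if best_f - best_lb <= tol:
            break  # certified: true minimum is within tol of best_f
        fx = f(best_xc)
        if fx < best_f:
            best_x, best_f = best_xc, fx
        pts.insert(idx + 1, (best_xc, fx))  # keep points sorted
    return best_x, best_f
```

The multivariate branch-and-bound methods in approach (iii) replace the single sawtooth with local bounds over sub-boxes but keep the same certify-then-refine loop.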

  8. Porcine bladder acellular matrix (ACM): protein expression, mechanical properties.

    PubMed

    Farhat, Walid A; Chen, Jun; Haig, Jennifer; Antoon, Roula; Litman, Jessica; Sherman, Christopher; Derwin, Kathleen; Yeger, Herman

    2008-06-01

    Experimentally, porcine bladder acellular matrix (ACM), which mimics extracellular matrix, has excellent potential as a bladder substitute. Herein we investigated the spatial localization and expression of different key cellular and extracellular proteins in the ACM; furthermore, we evaluated the inherent mechanical properties of the resultant ACM prior to implantation. Using a proprietary decellularization method, the DNA contents of both ACM and normal bladder were measured; in addition, we used immunohistochemistry and western blots to quantify and localize the different cellular and extracellular components, and finally mechanical testing was performed using a uniaxial mechanical testing machine. The mean DNA content was significantly lower in the ACM than in the bladder. Furthermore, the immunohistochemical and western blot analyses showed that collagens I and IV were preserved in the ACM, while collagen III was possibly denatured. Elastin, laminin and fibronectin were mildly reduced in the ACM. Although the ACM did not exhibit nucleated cells, residual cellular components (actin, myosin, vimentin and others) were still present. There was, on the other hand, no significant difference in mean stiffness between the ACM and the bladder. Although our decellularization method is effective in removing nuclear material from the bladder while maintaining its inherent mechanical properties, further work is needed to determine whether the residual DNA and cellular remnants would lead to any immune reaction, and whether the mechanical properties of the ACM are preserved upon implantation and cellularization.

  9. Porosity of porcine bladder acellular matrix: impact of ACM thickness.

    PubMed

    Farhat, Walid; Chen, Jun; Erdeljan, Petar; Shemtov, Oren; Courtman, David; Khoury, Antoine; Yeger, Herman

    2003-12-01

    The objectives of this study were to examine the porosity of bladder acellular matrix (ACM), using deionized (DI) water as the model fluid and dextran as the indicator macromolecule, and to correlate porosity with ACM thickness. Porcine urinary bladders from pigs weighing 20-50 kg were sequentially extracted in detergent-containing solutions; to modify the ACM thickness, stretched bladders were acellularized in the same manner. Luminal and abluminal ACM specimens were subjected to a fixed static DI water pressure (10 cm), and the water passing through the specimens was collected at specific time intervals. For macromolecule porosity testing, the diffusion rate and direction of 10,000 MW fluorescein-labeled dextrans across ACM specimens mounted in Ussing's chambers were measured. Both experiments were repeated on the thin, stretched ACM. In both ACM types, the fluid porosity in both directions did not decrease with increased test duration (3 h); in addition, the abluminal surface was more porous to fluid than the luminal surface. On the other hand, when comparing thin to thick ACM, the porosity in either direction was higher in the thick ACM. Macromolecule porosity, as measured by absorbance, was higher for the abluminal side of the thick ACM than for the luminal side, but this characteristic was reversed in the thin ACM. Comparing thin to thick ACM, the luminal side of the thin ACM was more porous to dextran than that of the thick ACM, but this characteristic was reversed for the abluminal side. The porcine bladder ACM possesses directional porosity, and acellularizing stretched urinary bladders may increase structural density and alter fluid and macromolecule porosity.

  10. How to recycle asbestos containing materials (ACM)

    SciTech Connect

    Jantzen, C.M.

    2000-04-11

    The current disposal of asbestos-containing materials (ACM) in the private sector consists of sealing asbestos wetted with water in plastic for safe transportation and burial in regulated landfills. This disposal methodology requires large disposal volumes, especially for asbestos-covered pipe and asbestos/fiberglass adhering to metal framework, e.g. filters. This wrap-and-bury technology precludes recycle of the asbestos, the pipe and/or the metal frameworks. Safe disposal of ACM at U.S. Department of Energy (DOE) sites likewise requires large disposal volumes in landfills for non-radioactive ACM and large disposal volumes in radioactive burial grounds for radioactive and suspect contaminated ACM. The availability of regulated disposal sites is rapidly diminishing, making recycle a more attractive option. Asbestos adhering to metal (e.g., pipes) can be recycled by safely removing the asbestos from the metal in a patented hot caustic bath, which prevents airborne contamination/inhalation of asbestos fibers. The dissolution residue (caustic and asbestos) can be wet-slurry fed to a melter and vitrified into a glass or glass-ceramic. Palex glasses, which are commercially manufactured, are shown to be preferred over conventional borosilicate glasses. The Palex glasses are alkali magnesium silicate glasses derived by substituting MgO for B2O3 in borosilicate-type glasses. Palex glasses are very tolerant of the high MgO and high CaO content of the fillers used in forming asbestos coverings for pipes and found in boiler lagging, e.g., hydromagnesite (3MgCO3·Mg(OH)2·3H2O) and plaster of paris, gypsum (CaSO4). The high temperature of the vitrification process destroys the asbestos fibers and renders the asbestos non-hazardous, e.g., a glass or glass-ceramic. In this manner the glass or glass-ceramic produced can be recycled, e.g., as glassphalt or glasscrete, as can the clean metal pipe or metal framework.

  11. Reliability and Generalizability of the Collis Attitudes toward Computers Survey.

    ERIC Educational Resources Information Center

    Temple, Linda; Lips, Hilary M.

    1989-01-01

    The Collis Attitudes toward Computers Survey, developed by B. Collis (1984) for a secondary school population, was tested with 305 college students attending the University of Winnipeg (Canada). Results support the reliability and generalizability of the survey and identified one dominant factor--personal interest and enjoyment of computers. (SLD)

  12. Equivalency of Paper versus Tablet Computer Survey Data

    ERIC Educational Resources Information Center

    Ravert, Russell D.; Gomez-Scott, Jessica; Donnellan, M. Brent

    2015-01-01

    Survey responses collected via paper surveys and computer tablets were compared to test for differences between those methods of obtaining self-report data. College students (N = 258) were recruited in public campus locations and invited to complete identical surveys on either paper or iPad tablet. Only minor homogeneity differences were found…

  13. On the characteristics-based ACM for incompressible flows

    NASA Astrophysics Data System (ADS)

    Su, Xiaohui; Zhao, Yong; Huang, Xiaoyang

    2007-11-01

    In this paper, the revised characteristics-based (CB) method for incompressible flows recently derived by Neofytou [P. Neofytou, Revision of the characteristic-based scheme for incompressible flows, J. Comput. Phys. 222 (2007) 475-484] has been further investigated. We have derived all the formulas for pressure and velocities from this revised CB method, which is based on the artificial compressibility method (ACM) [A.J. Chorin, A numerical solution for solving incompressible viscous flow problems, J. Comput. Phys. 2 (1967) 12]. Then we analyze the formulations of the original CB method [D. Drikakis, P.A. Govatsos, D.E. Papatonis, A characteristic based method for incompressible flows, Int. J. Numer. Meth. Fluids 19 (1994) 667-685; E. Shapiro, D. Drikakis, Non-conservative and conservative formulations of characteristics numerical reconstructions for incompressible flows, Int. J. Numer. Meth. Eng. 66 (2006) 1466-1482; D. Drikakis, P.K. Smolarkiewicz, On spurious vortical structures, J. Comput. Phys. 172 (2001) 309-325; F. Mallinger, D. Drikakis, Instability in three-dimensional, unsteady stenotic flows, Int. J. Heat Fluid Flow 23 (2002) 657-663; E. Shapiro, D. Drikakis, Artificial compressibility, characteristics-based schemes for variable density, incompressible, multi-species flows. Parts I. Derivation of different formulations and constant density limit, J. Comput. Phys. 210 (2005) 584-607; Y. Zhao, B. Zhang, A high-order characteristics upwind FV method for incompressible flow and heat transfer simulation on unstructured grids, Comput. Meth. Appl. Mech. Eng. 190 (5-7) (2000) 733-756] to investigate their consistency with the governing flow equations after convergence has been achieved. Furthermore we have implemented both formulations in an unstructured-grid finite volume solver [Y. Zhao, B. Zhang, A high-order characteristics upwind FV method for incompressible flow and heat transfer simulation on unstructured grids, Comput. Meth. Appl. Mech. Eng. 190 (5
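
    The artificial compressibility method (ACM) underlying all of the characteristics-based schemes cited above replaces the incompressible continuity equation with a pseudo-time evolution for pressure. In Chorin's formulation (β the artificial compressibility parameter, τ the pseudo-time, ν the kinematic viscosity):

```latex
\frac{1}{\beta}\,\frac{\partial p}{\partial \tau} + \nabla \cdot \mathbf{u} = 0,
\qquad
\frac{\partial \mathbf{u}}{\partial \tau}
  + (\mathbf{u} \cdot \nabla)\,\mathbf{u}
  = -\nabla p + \nu \nabla^{2} \mathbf{u}.
```

At pseudo-steady state the pressure derivative vanishes, recovering the divergence-free constraint ∇·u = 0; the characteristics-based reconstructions discussed in the abstract are built on the hyperbolic system this modification creates.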

  14. Computed survey spectra of 2-5 micron atmospheric absorption

    NASA Astrophysics Data System (ADS)

    Leslie, D. H.; Lebow, P. S.

    1983-08-01

    Computed high resolution survey spectra of atmospheric absorption coefficient vs wavenumber are presented covering the wavelength region 2-5 micrometers. The 1980 AFGL atmospheric absorption parameter compilation was employed with a mid-latitude, sea-level atmospheric model.

  15. Survey of Computer Facilities in Minnesota and North Dakota.

    ERIC Educational Resources Information Center

    MacGregor, Donald

    In order to attain a better understanding of the data processing manpower needs of business and industry, a survey instrument was designed and mailed to 570 known and possible computer installations in the Minnesota/North Dakota area. The survey was conducted during the spring of 1975, and concentrated on the kinds of equipment and computer…

  16. A Survey of Computer Science Capstone Course Literature

    ERIC Educational Resources Information Center

    Dugan, Robert F., Jr.

    2011-01-01

    In this article, we surveyed literature related to undergraduate computer science capstone courses. The survey was organized around course and project issues. Course issues included: course models, learning theories, course goals, course topics, student evaluation, and course evaluation. Project issues included: software process models, software…

  17. A Survey of Civilian Dental Computer Systems.

    DTIC Science & Technology

    1988-01-01

    marketplace, the orthodontic community continued to pioneer clinical automation through diagnosis, treat- (1) patient registration, identification...profession." New York State Dental Journal 34:76, 1968. 17. Ehrlich, A., The Role of Computers in Dental Practice Management. Champaign, IL: Colwell...Council on Dental military dental clinic. Medical Bulletin of the US Army Practice. Report: Dental Computer Vendors. 1984 Europe 39:14-16, 1982. 19

  18. Computer Applications in Archives: A Survey.

    ERIC Educational Resources Information Center

    Bartle, Rachel; Cook, Michael

    A survey was conducted by the Liverpool University Archives in 1982 to identify existing archives services outside the Public Record Office where automation has taken place or is about to take place, and to undertake a preliminary evaluation of the systems used or proposed. The objective of the study was to identify operational systems and those…

  19. Survey of Intelligent Computer-Aided Training

    NASA Technical Reports Server (NTRS)

    Loftin, R. B.; Savely, Robert T.

    1992-01-01

    Intelligent Computer-Aided Training (ICAT) systems integrate artificial intelligence and simulation technologies to deliver training for complex, procedural tasks in a distributed, workstation-based environment. Such systems embody both the knowledge of how to perform a task and how to train someone to perform that task. This paper briefly reviews the antecedents of ICAT systems and describes the approach to their creation developed at the NASA Lyndon B. Johnson Space Center. In addition to the general ICAT architecture, specific ICAT applications that have been or are currently under development are discussed. ICAT systems can offer effective solutions to a number of training problems of interest to the aerospace community.

  20. Acme Foundry, Inc. - Clean Water Act Public Notice

    EPA Pesticide Factsheets

    The EPA is providing notice of a proposed Administrative Penalty Assessment against Acme Foundry, Inc., for alleged violations at its facility located at 1502 South Spruce Street, Coffeyville, Kansas 67337

  1. Computers for real time flight simulation: A market survey

    NASA Technical Reports Server (NTRS)

    Bekey, G. A.; Karplus, W. J.

    1977-01-01

    An extensive computer market survey was made to determine those available systems suitable for current and future flight simulation studies at Ames Research Center. The primary requirement is for the computation of relatively high frequency content (5 Hz) math models representing powered lift flight vehicles. The Rotor Systems Research Aircraft (RSRA) was used as a benchmark vehicle for computation comparison studies. The general nature of helicopter simulations and a description of the benchmark model are presented, and some of the sources of simulation difficulties are examined. A description of various applicable computer architectures is presented, along with detailed discussions of leading candidate systems and comparisons between them.

  2. In-situ Data Analysis Framework for ACME Land Simulations

    NASA Astrophysics Data System (ADS)

    Wang, D.; Yao, C.; Jia, Y.; Steed, C.; Atchley, S.

    2015-12-01

    The realistic representation of key biogeophysical and biogeochemical functions is fundamental to process-based ecosystem models. Investigating the behavior of those ecosystem functions within a real-time model simulation can be very challenging due to the complexity of both the model and the software structure of an environmental model such as the Accelerated Climate Model for Energy (ACME) Land Model (ALM). In this research, the authors describe the urgent needs and challenges of in-situ data analysis for ALM simulations, and lay out their methods and strategies to meet these challenges. Specifically, an in-situ data analysis framework is designed to allow users to interactively observe biogeophysical and biogeochemical processes during an ALM simulation. There are three key components in this framework: automatically instrumented ecosystem simulation, in-situ data communication, and a large-scale data exploratory toolkit. This effort is developed by leveraging several active projects, including a scientific unit testing platform, a common communication interface, and an extreme-scale data exploratory toolkit. The authors believe that, based on advanced computing technologies such as compiler-based software system analysis, automatic code instrumentation, and in-memory data transport, this software system provides not only much-needed capability for real-time observation and in-situ data analytics for environmental model simulation, but also the potential for in-situ model behavior adjustment via simulation steering.

  3. 2005 DOE Computer Graphics Forum Site Survey

    SciTech Connect

    Rebecca, S; Eric, B

    2005-04-15

    The Information Management and Graphics Group supports and develops tools that enhance our ability to access, display, and understand large, complex data sets. Activities include developing visualization software for terascale data exploration; running two video production labs; supporting graphics libraries and tools for end users; maintaining four PowerWalls and assorted other advanced displays; and providing integrated tools for searching, organizing, and browsing scientific data. The Data group supports the Defense and Nuclear Technologies (D&NT) Directorate. The group's visualization team has developed and maintains two visualization tools: MeshTV and VisIt. These are interactive graphical analysis tools for visualizing and analyzing data on two- and three-dimensional meshes. They also provide movie production support. Researchers in the Center for Applied Scientific Computing (CASC) work on various projects including the development of visualization and data mining techniques for terascale data exploration that are funded by ASC. The researchers also have LDRD projects and collaborations with other lab researchers, academia, and industry.

  4. Empirical Validation and Application of the Computing Attitudes Survey

    ERIC Educational Resources Information Center

    Dorn, Brian; Elliott Tew, Allison

    2015-01-01

    Student attitudes play an important role in shaping learning experiences. However, few validated instruments exist for measuring student attitude development in a discipline-specific way. In this paper, we present the design, development, and validation of the computing attitudes survey (CAS). The CAS is an extension of the Colorado Learning…

  5. The ACLS Survey of Scholars: Views on Publications, Computers, Libraries.

    ERIC Educational Resources Information Center

    Morton, Herbert C.; Price, Anne Jamieson

    1986-01-01

    Reviews results of a survey by the American Council of Learned Societies (ACLS) of 3,835 scholars in the humanities and social sciences who are working both in colleges and universities and outside the academic community. Areas highlighted include professional reading, authorship patterns, computer use, and library use. (LRW)

  6. Legal Protection of Computer Software: An Industrial Survey.

    ERIC Educational Resources Information Center

    Miller, Richard I.

    A survey was commissioned to establish a baseline on the present modes of legal protection employed by the computer software industry. Questionnaires were distributed to 308 member companies of the Association of Data Processing Service Organizations (ADAPSO), and 116 responses were returned. Results indicated that (1) business executives…

  7. Metamodels for Computer-Based Engineering Design: Survey and Recommendations

    NASA Technical Reports Server (NTRS)

    Simpson, Timothy W.; Peplinski, Jesse; Koch, Patrick N.; Allen, Janet K.

    1997-01-01

    The use of statistical techniques to build approximations of expensive computer analysis codes pervades much of today's engineering design. These statistical approximations, or metamodels, are used to replace the actual expensive computer analyses, facilitating multidisciplinary, multiobjective optimization and concept exploration. In this paper we review several of these techniques including design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning, and kriging. We survey their existing application in engineering design and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of statistical approximation techniques in given situations and how common pitfalls can be avoided.
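As a toy illustration of the metamodel idea surveyed here (the sample function and all names are invented for this sketch, not taken from the paper), a polynomial response surface can be fit to a handful of runs of an "expensive" analysis and then queried in its place:

```python
import numpy as np

# Stand-in for an expensive analysis code (cheap analytic function here).
def expensive_analysis(x):
    return np.sin(3 * x) + 0.5 * x**2

# Small design of experiments: sample the code at a few points.
x_doe = np.linspace(-1.0, 1.0, 9)
y_doe = expensive_analysis(x_doe)

# Fit a quartic polynomial response surface (the metamodel).
coeffs = np.polyfit(x_doe, y_doe, deg=4)
metamodel = np.poly1d(coeffs)

# The cheap metamodel now replaces the expensive code during exploration.
x_new = 0.37
print(metamodel(x_new), expensive_analysis(x_new))
```

The approximation error at unsampled points is small relative to the function's range, which is exactly the trade accepted when a metamodel replaces the true analysis inside an optimization loop.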

  8. A survey of GPU-based medical image computing techniques.

    PubMed

    Shi, Lin; Liu, Wen; Zhang, Heye; Xie, Yongming; Wang, Defeng

    2012-09-01

    Medical imaging currently plays a crucial role throughout the entire clinical applications from medical scientific research to diagnostics and treatment planning. However, medical imaging procedures are often computationally demanding due to the large three-dimensional (3D) medical datasets to process in practical clinical applications. With the rapidly enhancing performances of graphics processors, improved programming support, and excellent price-to-performance ratio, the graphics processing unit (GPU) has emerged as a competitive parallel computing platform for computationally expensive and demanding tasks in a wide range of medical image applications. The major purpose of this survey is to provide a comprehensive reference source for the starters or researchers involved in GPU-based medical image processing. Within this survey, the continuous advancement of GPU computing is reviewed and the existing traditional applications in three areas of medical image processing, namely, segmentation, registration and visualization, are surveyed. The potential advantages and associated challenges of current GPU-based medical imaging are also discussed to inspire future applications in medicine.

  9. Model Diagnostics for the Department of Energy's Accelerated Climate Modeling for Energy (ACME) Project

    NASA Astrophysics Data System (ADS)

    Smith, B.

    2015-12-01

    In 2014, eight Department of Energy (DOE) national laboratories, four academic institutions, one company, and the National Centre for Atmospheric Research combined forces in a project called Accelerated Climate Modeling for Energy (ACME) with the goal to speed Earth system model development for climate and energy. Over the planned 10-year span, the project will conduct simulations and modeling on DOE's most powerful high-performance computing systems at Oak Ridge, Argonne, and Lawrence Berkeley Leadership Compute Facilities. A key component of the ACME project is the development of an interactive test bed for the advanced Earth system model. Its execution infrastructure will accelerate model development and testing cycles. The ACME Workflow Group is leading the efforts to automate labor-intensive tasks, provide intelligent support for complex tasks and reduce duplication of effort through collaboration support. As part of this new workflow environment, we have created a diagnostic, metric, and intercomparison Python framework, called UVCMetrics, to aid in the testing-to-production execution of the ACME model. The framework exploits similarities among different diagnostics to compactly support diagnosis of new models. It presently focuses on atmosphere and land but is designed to support ocean and sea ice model components as well. This framework is built on top of the existing open-source software framework known as the Ultrascale Visualization Climate Data Analysis Tools (UV-CDAT). Because of its flexible framework design, scientists and modelers now can generate thousands of possible diagnostic outputs. These diagnostics can compare model runs, compare model vs. observation, or simply verify a model is physically realistic. Additional diagnostics are easily integrated into the framework, and our users have already added several. Diagnostics can be generated, viewed, and manipulated from the UV-CDAT graphical user interface, Python command-line scripts, and programs.

  10. Icing simulation: A survey of computer models and experimental facilities

    NASA Technical Reports Server (NTRS)

    Potapczuk, M. G.; Reinmann, J. J.

    1991-01-01

    A survey of the current methods for simulation of the response of an aircraft or aircraft subsystem to an icing encounter is presented. The topics discussed include a computer code modeling of aircraft icing and performance degradation, an evaluation of experimental facility simulation capabilities, and ice protection system evaluation tests in simulated icing conditions. Current research focussed on upgrading simulation fidelity of both experimental and computational methods is discussed. The need for increased understanding of the physical processes governing ice accretion, ice shedding, and iced airfoil aerodynamics is examined.

  11. Survey of computer programs for heat transfer analysis

    NASA Astrophysics Data System (ADS)

    Noor, A. K.

    An overview is presented of the current capabilities of thirty-eight computer programs that can be used for solution of heat transfer problems. These programs range from the large, general-purpose codes with a broad spectrum of capabilities, large user community and comprehensive user support (e.g., ANSYS, MARC, MITAS II, MSC/NASTRAN, SESAM-69/NV-615) to the small, special purpose codes with limited user community such as ANDES, NNTB, SAHARA, SSPTA, TACO, TEPSA, and TRUMP. The capabilities of the programs surveyed are listed in tabular form followed by a summary of the major features of each program. As with any survey of computer programs, the present one has the following limitations: (1) It is useful only in the initial selection of the programs which are most suitable for a particular application. The final selection of the program to be used should, however, be based on a detailed examination of the documentation and the literature about the program; (2) Since computer software continually changes, often at a rapid rate, some means must be found for updating this survey and maintaining some degree of currency.

  12. Survey of computer programs for heat transfer analysis

    NASA Technical Reports Server (NTRS)

    Noor, A. K.

    1982-01-01

    An overview is presented of the current capabilities of thirty-eight computer programs that can be used for solution of heat transfer problems. These programs range from the large, general-purpose codes with a broad spectrum of capabilities, large user community and comprehensive user support (e.g., ANSYS, MARC, MITAS II, MSC/NASTRAN, SESAM-69/NV-615) to the small, special purpose codes with limited user community such as ANDES, NNTB, SAHARA, SSPTA, TACO, TEPSA, and TRUMP. The capabilities of the programs surveyed are listed in tabular form followed by a summary of the major features of each program. As with any survey of computer programs, the present one has the following limitations: (1) It is useful only in the initial selection of the programs which are most suitable for a particular application. The final selection of the program to be used should, however, be based on a detailed examination of the documentation and the literature about the program; (2) Since computer software continually changes, often at a rapid rate, some means must be found for updating this survey and maintaining some degree of currency.

  13. Autonomic Cluster Management System (ACMS): A Demonstration of Autonomic Principles at Work

    NASA Technical Reports Server (NTRS)

    Baldassari, James D.; Kopec, Christopher L.; Leshay, Eric S.; Truszkowski, Walt; Finkel, David

    2005-01-01

    Cluster computing, whereby a large number of simple processors or nodes are combined together to apparently function as a single powerful computer, has emerged as a research area in its own right. The approach offers a relatively inexpensive means of achieving significant computational capabilities for high-performance computing applications, while simultaneously affording the ability to increase that capability simply by adding more (inexpensive) processors. However, the task of manually managing and configuring a cluster quickly becomes impossible as the cluster grows in size. Autonomic computing is a relatively new approach to managing complex systems that can potentially solve many of the problems inherent in cluster management. We describe the development of a prototype Autonomic Cluster Management System (ACMS) that exploits autonomic properties in automating cluster management.

  14. Surveying co-located space geodesy techniques for ITRF computation

    NASA Astrophysics Data System (ADS)

    Sarti, P.; Sillard, P.; Vittuari, L.

    2003-04-01

    We present a comprehensive operational methodology, based on classical geodesy triangulation and trilateration, that allows the determination of the reference points of the five space geodesy techniques used in ITRF computation (i.e., DORIS, GPS, LLR, SLR, VLBI). Most of the time, for a single technique, the reference point is not accessible and cannot be measured directly. Likewise, no mechanically determined ex-center with respect to an external, measurable point is usually given. In these cases, it is not possible to directly measure the sought reference points, and it is even less straightforward to obtain the statistical information relating these points for different techniques. We outline the most general practical surveying methodology that permits recovery of the reference points of the different techniques regardless of their physical materialization. We also give a detailed analytical approach for less straightforward cases (e.g., non-geodetic VLBI antennae and SLR/LLR systems). We stress the importance of surveying instrumentation and procedure in achieving the best possible results and outline the impact of the information retrieved with our method on ITRF computation. In particular, we give numerical examples of the computation of the reference point of VLBI antennae (Ny Aalesund and Medicina) and the ex-center vector computation linking the co-located VLBI and GPS techniques in Medicina (Italy). Special attention was paid to the rigorous derivation of statistical elements; these will be presented in another presentation.
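An inaccessible reference point like a telescope's rotation axis is commonly recovered by surveying a target on the rotating structure and fitting the circle it traces. A minimal sketch of that idea (synthetic 2D data and a Kasa linear least-squares circle fit; not the authors' actual procedure) is:

```python
import numpy as np

# Simulated survey targets: a marker on the antenna traces a circle in the
# horizontal plane as the telescope rotates about its (inaccessible) axis.
true_center = np.array([10.0, -4.0])
radius = 2.5
angles = np.deg2rad([0, 30, 75, 120, 200, 290])
pts = true_center + radius * np.column_stack([np.cos(angles), np.sin(angles)])

# Kasa linear least-squares circle fit:
# x^2 + y^2 = 2*a*x + 2*b*y + c, with center (a, b), r^2 = a^2 + b^2 + c.
A = np.column_stack([2 * pts[:, 0], 2 * pts[:, 1], np.ones(len(pts))])
rhs = (pts**2).sum(axis=1)
(a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
center = np.array([a, b])
fitted_radius = np.sqrt(c + a**2 + b**2)
print(center, fitted_radius)  # recovers the axis position and the radius
```

With noise-free observations the fit reproduces the simulated axis exactly; with real survey data the residuals of this fit also supply the statistical information the abstract emphasizes.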

  15. A Survey of Architectural Techniques for Near-Threshold Computing

    DOE PAGES

    Mittal, Sparsh

    2015-12-28

    Energy efficiency has now become the primary obstacle in scaling the performance of all classes of computing systems. Low-voltage computing, and specifically near-threshold voltage computing (NTC), which involves operating the transistor very close to and yet above its threshold voltage, holds the promise of providing many-fold improvements in energy efficiency. However, use of NTC also presents several challenges, such as increased parametric variation, failure rate, and performance loss. Our paper surveys several recent techniques which aim to offset these challenges to fully leverage the potential of NTC. By classifying these techniques along several dimensions, we also highlight their similarities and differences. Ultimately, we hope that this paper will provide insights into state-of-the-art NTC techniques to researchers and system designers and inspire further research in this field.
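The energy-performance tension behind NTC can be illustrated with a textbook first-order model (all constants below are illustrative assumptions, not values from the paper): dynamic energy per operation falls roughly as V², while gate delay grows sharply as the supply voltage approaches the threshold:

```python
# Toy first-order model of near-threshold operation.
VTH = 0.3     # threshold voltage (V), assumed
ALPHA = 1.3   # velocity-saturation exponent (alpha-power law), assumed

def energy_per_op(v, c=1.0):
    return c * v**2                  # dynamic energy: E ~ C * V^2

def delay(v):
    return v / (v - VTH)**ALPHA      # gate delay: t ~ V / (V - Vth)^alpha

for v in (1.0, 0.6, 0.4):
    print(f"V={v:.1f}: energy={energy_per_op(v):.2f}, delay={delay(v):.2f}")
```

Dropping from nominal to near-threshold voltage cuts energy per operation several-fold but inflates delay even faster, which is why the surveyed techniques focus on recovering performance and reliability at low voltage.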

  16. A Survey of Architectural Techniques for Near-Threshold Computing

    SciTech Connect

    Mittal, Sparsh

    2015-12-28

    Energy efficiency has now become the primary obstacle in scaling the performance of all classes of computing systems. Low-voltage computing, and specifically near-threshold voltage computing (NTC), which involves operating the transistor very close to and yet above its threshold voltage, holds the promise of providing many-fold improvements in energy efficiency. However, use of NTC also presents several challenges, such as increased parametric variation, failure rate, and performance loss. Our paper surveys several recent techniques which aim to offset these challenges to fully leverage the potential of NTC. By classifying these techniques along several dimensions, we also highlight their similarities and differences. Ultimately, we hope that this paper will provide insights into state-of-the-art NTC techniques to researchers and system designers and inspire further research in this field.

  17. A survey of CPU-GPU heterogeneous computing techniques

    SciTech Connect

    Mittal, Sparsh; Vetter, Jeffrey S.

    2015-07-04

    As both CPUs and GPUs become employed in a wide range of applications, it has been acknowledged that both of these processing units (PUs) have their unique features and strengths, and hence CPU-GPU collaboration is inevitable to achieve high-performance computing. This has motivated a significant amount of research on heterogeneous computing techniques, along with the design of CPU-GPU fused chips and petascale heterogeneous supercomputers. In this paper, we survey heterogeneous computing techniques (HCTs), such as workload partitioning, which enable utilizing both the CPU and GPU to improve performance and/or energy efficiency. We review heterogeneous computing approaches at the runtime, algorithm, programming, compiler, and application levels. Further, we review both discrete and fused CPU-GPU systems, and discuss benchmark suites designed for evaluating heterogeneous computing systems (HCSs). Furthermore, we believe that this paper will provide insights into the working and scope of applications of HCTs to researchers and motivate them to further harness the computational power of CPUs and GPUs to achieve the goal of exascale performance.
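The workload-partitioning idea mentioned above can be sketched in a few lines, assuming a static, throughput-proportional split (function and parameter names are invented for this illustration):

```python
# Hypothetical throughput-proportional workload partitioning: split N work
# items between two processing units according to their measured throughput,
# a common static partitioning heuristic in CPU-GPU heterogeneous computing.
def partition(n_items, cpu_rate, gpu_rate):
    gpu_share = gpu_rate / (cpu_rate + gpu_rate)
    n_gpu = round(n_items * gpu_share)
    return n_items - n_gpu, n_gpu  # (cpu chunk, gpu chunk)

# If the GPU processes items 4x faster than the CPU, it gets 4/5 of the work,
# so both units finish at roughly the same time.
cpu_n, gpu_n = partition(1000, cpu_rate=2.0, gpu_rate=8.0)
print(cpu_n, gpu_n)  # 200 800
```

Dynamic schemes surveyed in such papers refine this by re-measuring throughput at runtime and adjusting chunk sizes, but the proportional split is the core idea.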

  18. A survey of CPU-GPU heterogeneous computing techniques

    DOE PAGES

    Mittal, Sparsh; Vetter, Jeffrey S.

    2015-07-04

    As both CPUs and GPUs become employed in a wide range of applications, it has been acknowledged that both of these processing units (PUs) have their unique features and strengths, and hence CPU-GPU collaboration is inevitable to achieve high-performance computing. This has motivated a significant amount of research on heterogeneous computing techniques, along with the design of CPU-GPU fused chips and petascale heterogeneous supercomputers. In this paper, we survey heterogeneous computing techniques (HCTs), such as workload partitioning, which enable utilizing both the CPU and GPU to improve performance and/or energy efficiency. We review heterogeneous computing approaches at the runtime, algorithm, programming, compiler, and application levels. Further, we review both discrete and fused CPU-GPU systems, and discuss benchmark suites designed for evaluating heterogeneous computing systems (HCSs). Furthermore, we believe that this paper will provide insights into the working and scope of applications of HCTs to researchers and motivate them to further harness the computational power of CPUs and GPUs to achieve the goal of exascale performance.

  19. Improving Throughput of the ACME Climate Model by Parallel Splitting Atmospheric Physics and Dynamics

    NASA Astrophysics Data System (ADS)

    Caldwell, P.; Taylor, M.

    2015-12-01

    If fluid dynamics and atmospheric physics parameterizations were computed in parallel, they could be calculated simultaneously on separate cores of a supercomputer. This would greatly increase model throughput for high-resolution simulations. Additionally, because atmospheric physics is embarrassingly parallel, more sophisticated physics parameterizations could be used without slowing simulations down by simply increasing the number of cores used. The downside to this approach is that it increases time-truncation error. In this presentation, we demonstrate that parallel splitting the ACME model and using a smaller timestep for physics results in faster, more accurate solutions.
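The difference between sequential (time) splitting and parallel (process) splitting can be sketched on a toy linear ODE, with D standing in for dynamics tendencies and P for physics tendencies (a simplified illustration, not the ACME implementation):

```python
import math

# Toy model dy/dt = (D + P) * y with constant "dynamics" and "physics" rates.
D, P = -1.0, -2.0
DT, STEPS = 0.01, 100
exact = math.exp((D + P) * DT * STEPS)

# Sequential splitting: physics sees the state already updated by dynamics,
# so the two substeps must run one after the other.
y_seq = 1.0
for _ in range(STEPS):
    y_seq += DT * D * y_seq      # dynamics substep
    y_seq += DT * P * y_seq      # physics substep, after dynamics

# Parallel splitting: both tendencies are computed from the same state and
# summed, so they could be evaluated concurrently on separate cores.
y_par = 1.0
for _ in range(STEPS):
    y_par += DT * (D * y_par + P * y_par)

print(exact, y_seq, y_par)
```

Both variants are first-order accurate here; the parallel form trades a slightly different truncation error for the ability to overlap the two computations, which is the throughput argument of the abstract.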

  20. Campus Computing 1993. The USC National Survey of Desktop Computing in Higher Education.

    ERIC Educational Resources Information Center

    Green, Kenneth C.; Eastman, Skip

    A national survey of desktop computing in higher education was conducted in spring and summer 1993 at over 2500 institutions. Data were responses from public and private research universities, public and private four-year colleges and community colleges. Respondents (N=1011) were individuals specifically responsible for the operation and future…

  1. Prevalence and genetic diversity of arginine catabolic mobile element (ACME) in clinical isolates of coagulase-negative staphylococci: identification of ACME type I variants in Staphylococcus epidermidis.

    PubMed

    Onishi, Mayumi; Urushibara, Noriko; Kawaguchiya, Mitsuyo; Ghosh, Souvik; Shinagawa, Masaaki; Watanabe, Naoki; Kobayashi, Nobumichi

    2013-12-01

    Arginine catabolic mobile element (ACME), a genomic island consisting of the arc and/or opp3 gene clusters found in staphylococcal species, is related to increased bacterial adaptability to hosts. Staphylococcus epidermidis is considered a major ACME reservoir; however, the prevalence and genetic diversity of ACME in coagulase-negative staphylococci (CNS) have not yet been well characterized for clinical isolates in Japan. A total of 271 clinical isolates of CNS in a Japanese hospital were investigated for the presence and genotype of ACME and SCCmec. The prevalence of ACME-arcA was significantly higher (p<0.001) in S. epidermidis (45.8%) than in other CNS species (3.7%). ACME in S. epidermidis isolates (n=87) were differentiated into type I (n=33), variant forms of type I (ΔI, n=26) newly identified in this study, type II (n=6), and type ΔII (n=19). ACME type ΔI, which was further classified into three subtypes, lacked some genetic components between the arc and opp3 clusters present in archetypal type I, whereas the arc and opp3 clusters themselves were intact. The arc cluster exhibited high sequence identity (95.8-100%) to that of type I ACME; in contrast, the opp3 cluster was highly diverse and showed relatively lower identities (94.8-98.7%) to the identical regions in type I ACME. Twenty-one isolates of ΔI ACME-carrying S. epidermidis possessed SCCmec IVa and belonged to ST5 (clonal complex 2). Phylogenetic analysis revealed that isolates harboring ACME ΔI in this study clustered with previously reported S. epidermidis strains of other lineages, suggesting that S. epidermidis originally had some genetic variations in the opp3 cluster. In summary, ACME type ΔI, a truncated variant of ACME-I, was first identified in S. epidermidis and shown to be prevalent in ST5 MRSE clinical isolates with SCCmec IVa.

  2. Survey of patient dose in computed tomography in Syria 2009.

    PubMed

    Kharita, M H; Khazzam, S

    2010-09-01

    The radiation doses to patients in computed tomography (CT) in Syria have been investigated and compared with similar studies in different countries. This work surveyed 30 CT scanners from six different manufacturers distributed all over Syria. Some of the results in this paper were part of a project launched by the International Atomic Energy Agency in different regions of the world, covering Asia, Africa, and Eastern Europe. The dose quantities covered are the CT dose index (CTDI(w)), dose-length product (DLP), effective dose (E), and collective dose. It was found that most CTDI(w) and DLP values were similar to the European reference levels and in line with the results of similar surveys in the world. The results were in good agreement with the UNSCEAR Report 2007. This study concluded with a recommendation for national diagnostic reference levels for the most common CT protocols in Syria. The results can be used as a basis for future optimisation studies in the country.
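The dose quantities in such surveys are linked by region-specific conversion coefficients: effective dose is commonly estimated as E ≈ k × DLP. A minimal sketch using the widely cited adult k factors (mSv per mGy·cm; illustrative, and not specific to this study's scanners):

```python
# Commonly cited adult conversion coefficients, mSv per (mGy*cm).
K_FACTORS = {"head": 0.0021, "chest": 0.014, "abdomen_pelvis": 0.015}

def effective_dose(dlp_mgy_cm, region):
    """Estimate effective dose E (mSv) from dose-length product: E ~ k * DLP."""
    return K_FACTORS[region] * dlp_mgy_cm

print(effective_dose(400.0, "chest"))  # 5.6 mSv for a 400 mGy*cm chest scan
```

Comparing such per-protocol estimates against national or European reference levels is exactly how diagnostic reference levels like those recommended in the study are set.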

  3. A survey of computational aerodynamics in the United States

    NASA Technical Reports Server (NTRS)

    Gessow, A.; Morris, D. J.

    1977-01-01

    Programs in theoretical and computational aerodynamics in the United States are described. Those aspects of programs that relate to aeronautics are detailed. The role of analysis at various levels of sophistication is discussed as well as the inverse solution techniques that are of primary importance in design methodology. The research is divided into the broad categories of application for boundary layer flow, Navier-Stokes turbulence modeling, internal flows, two-dimensional configurations, subsonic and supersonic aircraft, transonic aircraft, and the space shuttle. A survey of representative work in each area is presented.

  4. Computer vision in roadway transportation systems: a survey

    NASA Astrophysics Data System (ADS)

    Loce, Robert P.; Bernal, Edgar A.; Wu, Wencheng; Bala, Raja

    2013-10-01

    There is a worldwide effort to apply 21st century intelligence to evolving our transportation networks. The goals of smart transportation networks are quite noble and manifold, including safety, efficiency, law enforcement, energy conservation, and emission reduction. Computer vision is playing a key role in this transportation evolution. Video imaging scientists are providing intelligent sensing and processing technologies for a wide variety of applications and services. There are many interesting technical challenges including imaging under a variety of environmental and illumination conditions, data overload, recognition and tracking of objects at high speed, distributed network sensing and processing, energy sources, as well as legal concerns. This paper presents a survey of computer vision techniques related to three key problems in the transportation domain: safety, efficiency, and security and law enforcement. A broad review of the literature is complemented by detailed treatment of a few selected algorithms and systems that the authors believe represent the state-of-the-art.

  5. Importance of Computer Competencies for Entering JCCC Students: A Survey of Faculty and Staff.

    ERIC Educational Resources Information Center

    Weglarz, Shirley

    Johnson County Community College (JCCC) conducted a survey in response to faculty comments regarding entering students' lack of rudimentary computer skills. Faculty were spending time in non-computer related classes teaching students basic computer skills. The aim of the survey was to determine what the basic computer competencies for entering…

  6. Sealing Force Increasing of ACM Gasket through Electron Beam Radiation

    NASA Astrophysics Data System (ADS)

    dos Santos, D. J.; Batalha, G. F.

    2011-01-01

    Rubber is an engineering material largely used in sealing parts, in the form of O-rings, solid gaskets, and liquid gaskets (materials applied in a liquid state with posterior vulcanization and sealing). Stress relaxation is a rubber characteristic which impacts negatively on such industrial applications (rings and solid gaskets). This work investigates the use of electron beam (EB) radiation as a technology able to decrease stress relaxation in acrylic rubber (ACM), consequently increasing the sealing capability of this material. ACM samples were irradiated with doses of 100 kGy and 250 kGy, and their behavior was comparatively investigated using dynamic mechanical analysis (DMA) and compression stress relaxation (CSR) experiments. The results obtained by DMA showed an increase of Tg and changes in dynamic mechanical behavior.

  7. Survey of computer programs for heat transfer analysis

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.

    1986-01-01

    An overview is given of the current capabilities of thirty-three computer programs that are used to solve heat transfer problems. The programs considered range from large general-purpose codes with broad spectrum of capabilities, large user community, and comprehensive user support (e.g., ABAQUS, ANSYS, EAL, MARC, MITAS II, MSC/NASTRAN, and SAMCEF) to the small, special-purpose codes with limited user community such as ANDES, NTEMP, TAC2D, TAC3D, TEPSA and TRUMP. The majority of the programs use either finite elements or finite differences for the spatial discretization. The capabilities of the programs are listed in tabular form followed by a summary of the major features of each program. The information presented herein is based on a questionnaire sent to the developers of each program. This information is preceded by a brief background material needed for effective evaluation and use of computer programs for heat transfer analysis. The present survey is useful in the initial selection of the programs which are most suitable for a particular application. The final selection of the program to be used should, however, be based on a detailed examination of the documentation and the literature about the program.

  8. ACME, a GIS tool for Automated Cirque Metric Extraction

    NASA Astrophysics Data System (ADS)

    Spagnolo, Matteo; Pellitero, Ramon; Barr, Iestyn D.; Ely, Jeremy C.; Pellicer, Xavier M.; Rea, Brice R.

    2017-02-01

    Regional-scale studies of glacial cirque metrics provide key insights into the (palaeo)environment related to the formation of these erosional landforms. The growing availability of high-resolution terrain models means that more glacial cirques can be identified and mapped in the future. However, the extraction of their metrics still largely relies on time-consuming manual techniques or on combinations of more or less obsolete GIS tools. In this paper, a newly coded toolbox is provided for the automated, and comparatively quick, extraction of 16 key glacial cirque metrics, including length, width, circularity, planar and 3D area, elevation, slope, aspect, plan closure and hypsometry. The set of tools, named ACME (Automated Cirque Metric Extraction), is coded in Python, runs in one of the most commonly used GIS packages (ArcGIS) and has a user-friendly interface. A polygon layer of mapped cirques is required for all metrics, while a Digital Terrain Model and a point layer of cirque threshold midpoints are needed to run some of the tools. Results from ACME are comparable to those from other techniques and can be obtained rapidly, allowing large cirque datasets to be analysed and potentially important regional trends highlighted.
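
    The abstract does not define ACME's individual metrics, and the ArcGIS toolbox itself is not reproduced here. As an illustration of the kind of computation involved, one standard circularity (compactness) measure for a mapped cirque outline is 4*pi*A / P**2, which the sketch below evaluates directly from polygon vertex coordinates; whether ACME uses this exact definition is an assumption.

```python
import math

def polygon_area_perimeter(pts):
    """Shoelace area and perimeter of a closed polygon given as (x, y) vertices."""
    area = 0.0
    perim = 0.0
    n = len(pts)
    for i in range(n):
        x1, y1 = pts[i]
        x2, y2 = pts[(i + 1) % n]
        area += x1 * y2 - x2 * y1
        perim += math.hypot(x2 - x1, y2 - y1)
    return abs(area) / 2.0, perim

def circularity(pts):
    """4*pi*A / P**2: equals 1.0 for a circle, smaller for elongated outlines."""
    area, perim = polygon_area_perimeter(pts)
    return 4.0 * math.pi * area / perim ** 2

square = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
```

    For the unit square this gives pi/4, roughly 0.785, as expected for a shape less compact than a circle.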

  9. Survey of Computational Algorithms for MicroRNA Target Prediction

    PubMed Central

    Yue, Dong; Liu, Hui; Huang, Yufei

    2009-01-01

    MicroRNAs (miRNAs) are 19- to 25-nucleotide non-coding RNAs known to possess important post-transcriptional regulatory functions. Identifying the target genes that miRNAs regulate is important for understanding their specific biological functions. Usually, miRNAs down-regulate target genes by binding to complementary sites in the 3' untranslated region (UTR) of the targets. In part due to the large number of miRNAs and potential targets, a purely experimental prediction design would be extremely laborious and economically unfavorable. Moreover, since animal miRNAs do not bind the complementary sites of their targets with a perfect one-to-one match, it is difficult to predict their targets simply by assessing alignment to the 3' UTRs of potential targets. Consequently, sophisticated computational approaches for miRNA target prediction are considered essential in miRNA research. In this paper we survey most of the current computational miRNA target prediction algorithms. In particular, we provide a mathematical definition and formulate the problem of target prediction under the framework of statistical classification. Moreover, we summarize the features of miRNA-target pairs used in target prediction approaches and discuss these approaches in two categories: rule-based and data-driven. A rule-based approach derives its classifier mainly from biological prior knowledge and important observations from biological experiments, whereas a data-driven approach builds statistical models from training data and makes predictions based on those models. Finally, we tested a few different algorithms on a set of experimentally validated true miRNA-target pairs [1] and a set of false miRNA-target pairs derived from a miRNA overexpression experiment [2]. Receiver Operating Characteristic (ROC) curves were drawn to show the performance of these algorithms. PMID:20436875
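
    As a minimal example of the rule-based category, the sketch below implements the canonical seed-match rule: perfect Watson-Crick complementarity between the miRNA seed (nucleotides 2-8) and a site in the 3' UTR. Real predictors layer conservation, binding free energy, and site-context features on top of this; the sequences used are only a toy illustration.

```python
COMPLEMENT = {"A": "U", "U": "A", "G": "C", "C": "G"}

def seed_sites(mirna, utr, seed_start=1, seed_end=8):
    """Return 0-based positions in the UTR (given 5'->3' as RNA) where the
    reverse complement of the miRNA seed (nucleotides 2-8) matches exactly."""
    seed = mirna[seed_start:seed_end]                        # nt 2-8, 0-based slice
    target = "".join(COMPLEMENT[b] for b in reversed(seed))  # site read 5'->3'
    return [i for i in range(len(utr) - len(target) + 1)
            if utr[i:i + len(target)] == target]

# let-7a against a toy UTR carrying one perfect seed site
sites = seed_sites("UGAGGUAGUAGGUUGUAUAGUU", "AAACUACCUCAAA")
```

    A data-driven predictor would instead treat such matches as one feature among many and learn their weight from labeled miRNA-target pairs.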

  10. Micro-Stirling Active Cooling Module (MS/ACM) for DoD Electronics Systems

    DTIC Science & Technology

    2012-03-01

    Micro-Stirling Active Cooling Module (MS/ACM) for DoD Electronics Systems Douglas S. Beck Beck Engineering, Inc. 1490 Lumsden Road, Port Orchard...refrigerator. We are developing for DARPA a cm-scale Micro-Stirling Active Cooling Module (MS/ACM) micro-refrigerator to benefit DoD systems. Under...a DARPA contract, we are designing, building, and demonstrating a breadboard MS/ACM. Keywords: Stirling; cooler; active cooling module; micro

  11. Sparse Polynomial Chaos Surrogate for ACME Land Model via Iterative Bayesian Compressive Sensing

    NASA Astrophysics Data System (ADS)

    Sargsyan, K.; Ricciuto, D. M.; Safta, C.; Debusschere, B.; Najm, H. N.; Thornton, P. E.

    2015-12-01

    For computationally expensive climate models, Monte Carlo approaches to exploring the input parameter space are often prohibitive due to slow convergence with respect to ensemble size. To alleviate this, we build inexpensive surrogates using uncertainty quantification (UQ) methods employing Polynomial Chaos (PC) expansions that approximate the input-output relationships using as few model evaluations as possible. However, when many uncertain input parameters are present, such UQ studies suffer from the curse of dimensionality. In particular, for 50-100 input parameters, non-adaptive PC representations have infeasible numbers of basis terms. To this end, we develop and employ Weighted Iterative Bayesian Compressive Sensing to learn the most important input parameter relationships for efficient, sparse PC surrogate construction, with posterior uncertainty quantifying the effect of limited training data. Besides drastic dimensionality reduction, the uncertain surrogate can efficiently replace the model in computationally intensive studies such as forward uncertainty propagation and variance-based sensitivity analysis, as well as design optimization and parameter estimation using observational data. We applied the surrogate construction and variance-based uncertainty decomposition to the Accelerated Climate Model for Energy (ACME) Land Model for several output QoIs at nearly 100 FLUXNET sites covering multiple plant functional types and climates, varying 65 input parameters over broad ranges of possible values. This work is supported by the U.S. Department of Energy, Office of Science, Biological and Environmental Research, Accelerated Climate Modeling for Energy (ACME) project. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
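
    The paper's Weighted Iterative Bayesian Compressive Sensing method is not reproduced here. As a simplified stand-in for the same idea (a sparsity-promoting fit over a polynomial basis), the sketch below recovers a sparse surrogate with an L1-penalized ISTA solver over a purely additive Legendre basis; the toy model, basis layout, and penalty weight are all illustrative assumptions, and the Bayesian posterior machinery is omitted entirely.

```python
import numpy as np

def legendre_features(X, order):
    """Additive PC basis: a constant plus univariate Legendre polynomials
    P_1..P_order of each input (no cross terms; a deliberate simplification)."""
    n, d = X.shape
    cols = [np.ones(n)]
    for j in range(d):
        for p in range(1, order + 1):
            c = np.zeros(p + 1)
            c[p] = 1.0
            cols.append(np.polynomial.legendre.legval(X[:, j], c))
    return np.column_stack(cols)

def ista(A, y, lam=0.05, steps=3000):
    """Iterative soft-thresholding for min 0.5*||Aw - y||^2 + lam*||w||_1,
    a simple sparsity-promoting stand-in for compressive sensing."""
    L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of the gradient
    w = np.zeros(A.shape[1])
    for _ in range(steps):
        w = w - A.T @ (A @ w - y) / L                           # gradient step
        w = np.sign(w) * np.maximum(np.abs(w) - lam / L, 0.0)   # shrink
    return w

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(200, 10))              # 10 inputs scaled to [-1, 1]
y = 2.0 * X[:, 3] + 0.25 * (3.0 * X[:, 7] ** 2 - 1.0)   # only inputs 3 and 7 matter
w = ista(legendre_features(X, order=3), y)
```

    With 31 candidate basis terms, the fit leaves all but the two truly active coefficients near zero, which is the dimension-reduction effect the abstract describes.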

  12. Air Traffic Complexity Measurement Environment (ACME): Software User's Guide

    NASA Technical Reports Server (NTRS)

    1996-01-01

    A user's guide for the Air Traffic Complexity Measurement Environment (ACME) software is presented. The ACME consists of two major components: a complexity analysis tool and a user interface. The Complexity Analysis Tool (CAT) analyzes complexity off-line, producing data files which may be examined interactively via the Complexity Data Analysis Tool (CDAT). The Complexity Analysis Tool is composed of three independently executing processes that communicate via PVM (Parallel Virtual Machine) and Unix sockets. The Runtime Data Management and Control process (RUNDMC) extracts flight plan and track information from a SAR input file, and sends the information to GARP (Generate Aircraft Routes Process) and CAT (Complexity Analysis Task). GARP in turn generates aircraft trajectories, which are utilized by CAT to calculate sector complexity. CAT writes flight plan, track and complexity data to an output file, which can be examined interactively. The Complexity Data Analysis Tool (CDAT) provides an interactive graphic environment for examining the complexity data produced by the Complexity Analysis Tool (CAT). CDAT can also play back track data extracted from System Analysis Recording (SAR) tapes. The CDAT user interface consists of a primary window, a controls window, and miscellaneous pop-ups. Aircraft track and position data are displayed in the main viewing area of the primary window. The controls window contains miscellaneous control and display items. Complexity data are displayed in pop-up windows. CDAT plays back sector complexity and aircraft track and position data as a function of time. Controls are provided to start and stop playback, adjust the playback rate, and reposition the display to a specified time.

  13. Teacher and Teacher-Directed Student Use of Computers and Software. Teaching, Learning, and Computing: 1998 National Survey. Report #3.

    ERIC Educational Resources Information Center

    Becker, Henry J.; Ravitz, Jason L.; Wong, YanTien

    This report, the third in a series from the spring 1998 national survey, "Teaching, Learning, and Computing," focuses on how teachers have incorporated computers into their instructional practices. The study comprises completed questionnaire responses from teachers, principals, and school technology coordinators from 1,616 schools. Part 1…

  14. Contribution of the collagen adhesin Acm to pathogenesis of Enterococcus faecium in experimental endocarditis.

    PubMed

    Nallapareddy, Sreedhar R; Singh, Kavindra V; Murray, Barbara E

    2008-09-01

    Enterococcus faecium is a multidrug-resistant opportunist causing difficult-to-treat nosocomial infections, including endocarditis, but there are no reports experimentally demonstrating E. faecium virulence determinants. Our previous studies showed that some clinical E. faecium isolates produce a cell wall-anchored collagen adhesin, Acm, and that an isogenic acm deletion mutant of the endocarditis-derived strain TX0082 lost collagen adherence. In this study, we show with a rat endocarditis model that TX0082 Δacm::cat is highly attenuated versus wild-type TX0082, both in established (72 h) vegetations (P < 0.0001) and for valve colonization 1 and 3 hours after infection, making Acm the first factor shown to be important for E. faecium pathogenesis. In contrast, no mortality differences were observed in a mouse peritonitis model. While 5 of 17 endocarditis isolates were Acm nonproducers and failed to adhere to collagen in vitro, all had an intact, highly conserved acm locus. Highly reduced acm mRNA levels (≥50-fold reduction relative to an Acm producer) were found in three of these five nonadherent isolates, including the sequenced strain TX0016, by quantitative reverse transcription-PCR, indicating that acm transcription is downregulated in vitro in these isolates. However, examination of TX0016 cells obtained directly from infected rat vegetations by flow cytometry showed that Acm was present on 40% of cells grown during infection. Finally, we demonstrated a significant reduction in E. faecium collagen adherence by affinity-purified anti-Acm antibodies from the sera of E. faecium endocarditis patients, suggesting that Acm may be a potential immunotarget for strategies to control this emerging pathogen.

  15. Survey of Commercially Available Computer-Readable Bibliographic Data Bases.

    ERIC Educational Resources Information Center

    Schneider, John H., Ed.; And Others

    This document contains the results of a survey of 94 U.S. organizations, and 36 organizations in other countries, that were thought to prepare machine-readable data bases. Of those surveyed, 55 organizations (40 in the U.S., 15 in other countries) provided completed camera-ready forms describing 81 commercially available, machine-readable data bases…

  16. An Overview of Two Recent Surveys of Administrative Computer Operations in Higher Education.

    ERIC Educational Resources Information Center

    Mann, Richard L.; And Others

    This document summarizes the results of two surveys about the current administrative uses of computers in higher education. Included in the document is: (1) a brief history of the development of computer operational and management information systems in higher education; (2) information on how computers are currently being used to support…

  17. A Survey of Computer Use in Two-Year College Reading Programs.

    ERIC Educational Resources Information Center

    Swartz, Donna

    1985-01-01

    A mail survey of two-year colleges was conducted to identify (1) two-year colleges using computer technology to teach reading, (2) the types of hardware and software used, (3) the courses in which computer technology is used, and (4) the ways in which computer technology is used in two-year college reading programs. Responses from 181 two-year…

  18. Validation of the Adolescent Concerns Measure (ACM): evidence from exploratory and confirmatory factor analysis.

    PubMed

    Ang, Rebecca P; Chong, Wan Har; Huan, Vivien S; Yeo, Lay See

    2007-01-01

    This article reports the development and initial validation of scores obtained from the Adolescent Concerns Measure (ACM), a scale which assesses concerns of Asian adolescent students. In Study 1, findings from exploratory factor analysis using 619 adolescents suggested a 24-item scale with four correlated factors--Family Concerns (9 items), Peer Concerns (5 items), Personal Concerns (6 items), and School Concerns (4 items). Initial estimates of convergent validity for ACM scores were also reported. The four-factor structure of ACM scores derived from Study 1 was confirmed via confirmatory factor analysis in Study 2 using a two-fold cross-validation procedure with a separate sample of 811 adolescents. Support was found for both the multidimensional and hierarchical models of adolescent concerns using the ACM. Internal consistency and test-retest reliability estimates were adequate for research purposes. ACM scores show promise as a reliable and potentially valid measure of Asian adolescents' concerns.
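
    The internal-consistency estimates the abstract reports are conventionally Cronbach's alpha; the article's own item data are not available here, so the sketch below shows the standard computation on a made-up score matrix.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_var / total_var)
```

    Perfectly parallel items give alpha = 1, while unrelated items push alpha toward 0; values around 0.7 or higher are typically read as adequate for research purposes, as the abstract's wording suggests.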

  19. ARM Airborne Carbon Measurements VI (ACME VI) Science Plan

    SciTech Connect

    Biraud, S

    2015-12-01

    From October 1, 2015, through September 30, 2016, the Atmospheric Radiation Measurement (ARM) Aerial Facility will deploy the Cessna 206 aircraft over the Southern Great Plains (SGP) site, collecting observations of trace-gas mixing ratios over ARM's SGP facility. The aircraft payload includes two Atmospheric Observing Systems, Inc., analyzers for continuous measurements of CO2 and a 12-flask sampler for analysis of carbon cycle gases (CO2, CO, CH4, N2O, 13CO2, 14CO2, carbonyl sulfide, and trace hydrocarbon species, including ethane). The aircraft payload also includes instrumentation for solar/infrared radiation measurements. This research is supported by the U.S. Department of Energy's ARM Climate Research Facility and Terrestrial Ecosystem Science Program and builds upon previous ARM Airborne Carbon Measurements (ARM-ACME) missions. The goal of these measurements is to improve understanding of 1) the carbon exchange at the SGP site, 2) how CO2 and associated water and energy fluxes influence radiative forcing, convective processes and CO2 concentrations over the SGP site, and 3) how greenhouse gases are transported on continental scales.

  20. Survey of ANL organization plans for word processors, personal computers, workstations, and associated software

    SciTech Connect

    Fenske, K.R.

    1991-11-01

    The Computing and Telecommunications Division (CTD) has compiled this Survey of ANL Organization Plans for Word Processors, Personal Computers, Workstations, and Associated Software to provide DOE and Argonne with a record of recent growth in the acquisition and use of personal computers, microcomputers, and word processors at ANL. Laboratory planners, service providers, and people involved in office automation may find the Survey useful. It is for internal use only, and any unauthorized use is prohibited. Readers of the Survey should use it as a reference that documents the plans of each organization for office automation, identifies appropriate planners and other contact people in those organizations, and encourages the sharing of this information among those people making plans for organizations and decisions about office automation. The Survey supplements information in both the ANL Statement of Site Strategy for Computing Workstations and the ANL Site Response for the DOE Information Technology Resources Long-Range Plan.

  1. Survey of ANL organization plans for word processors, personal computers, workstations, and associated software. Revision 3

    SciTech Connect

    Fenske, K.R.

    1991-11-01

    The Computing and Telecommunications Division (CTD) has compiled this Survey of ANL Organization Plans for Word Processors, Personal Computers, Workstations, and Associated Software to provide DOE and Argonne with a record of recent growth in the acquisition and use of personal computers, microcomputers, and word processors at ANL. Laboratory planners, service providers, and people involved in office automation may find the Survey useful. It is for internal use only, and any unauthorized use is prohibited. Readers of the Survey should use it as a reference that documents the plans of each organization for office automation, identifies appropriate planners and other contact people in those organizations, and encourages the sharing of this information among those people making plans for organizations and decisions about office automation. The Survey supplements information in both the ANL Statement of Site Strategy for Computing Workstations and the ANL Site Response for the DOE Information Technology Resources Long-Range Plan.

  2. A survey of current trends in computational drug repositioning

    PubMed Central

    Zheng, Si; Chen, Bin; Butte, Atul J.; Swamidass, S. Joshua; Lu, Zhiyong

    2016-01-01

    Computational drug repositioning or repurposing is a promising and efficient tool for discovering new uses of existing drugs and holds great potential for precision medicine in the age of big data. The explosive growth of large-scale genomic and phenotypic data, as well as data on small molecular compounds with granted regulatory approval, is enabling new developments for computational repositioning. To achieve the shortest path toward new drug indications, advanced data processing and analysis strategies are critical for making sense of these heterogeneous molecular measurements. In this review, we present recent advancements in the critical areas of computational drug repositioning from multiple aspects. First, we summarize available data sources and the corresponding computational repositioning strategies. Second, we characterize the commonly used computational techniques. Third, we discuss validation strategies for repositioning studies, including both computational and experimental methods. Finally, we highlight potential opportunities and use-cases, including a few target areas such as cancers. We conclude with a brief discussion of the remaining challenges in computational drug repositioning. PMID:25832646

  3. The Successive Contributions of Computers to Education: A Survey.

    ERIC Educational Resources Information Center

    Lelouche, Ruddy

    1998-01-01

    Shows how education has successively benefited from traditional information processing through programmed instruction and computer-assisted instruction (CAI), artificial intelligence, intelligent CAI, intelligent tutoring systems, and hypermedia techniques. Contains 29 references. (DDR)

  4. Infrared Testing of the Wide-field Infrared Survey Telescope Grism Using Computer Generated Holograms

    NASA Technical Reports Server (NTRS)

    Dominguez, Margaret Z.; Content, David A.; Gong, Qian; Griesmann, Ulf; Hagopian, John G.; Marx, Catherine T; Whipple, Arthur L.

    2017-01-01

    Infrared Computer Generated Holograms (CGHs) were designed, manufactured and used to measure the performance of the grism (grating prism) prototype which includes testing Diffractive Optical Elements (DOE). The grism in the Wide Field Infrared Survey Telescope (WFIRST) will allow the surveying of a large section of the sky to find bright galaxies.

  5. A Synthesis and Survey of Critical Success Factors for Computer Technology Projects

    ERIC Educational Resources Information Center

    Baker, Ross A.

    2012-01-01

    The author investigated the existence of critical success factors for computer technology projects. Current research literature and a survey of experienced project managers indicate that there are 23 critical success factors (CSFs) that correlate with project success. The survey gathered an assessment of project success and the degree to which…

  6. A comparison of computational methods and algorithms for the complex gamma function

    NASA Technical Reports Server (NTRS)

    Ng, E. W.

    1974-01-01

    A survey and comparison of some computational methods and algorithms for gamma and log-gamma functions of complex arguments are presented. The methods and algorithms reported include Chebyshev approximations, Padé expansion, and Stirling's asymptotic series. The comparison leads to the conclusion that Algorithm 421, published in the Communications of the ACM by H. Kuki, is the best program either for individual use or for inclusion in subroutine libraries.
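
    Kuki's Algorithm 421 itself is not reproduced here. As an illustration of a comparable and widely used method, the sketch below implements the Lanczos approximation to log-gamma for complex arguments with Re(z) > 0, using the g = 5, n = 6 coefficients tabulated in Numerical Recipes.

```python
import cmath

# Lanczos coefficients (g = 5, n = 6), as tabulated in Numerical Recipes.
_COEFFS = [76.18009172947146, -86.50532032941677, 24.01409824083091,
           -1.231739572450155, 0.1208650973866179e-2, -0.5395239384953e-5]

def lgamma_complex(z):
    """Log-gamma for complex z with Re(z) > 0 via the Lanczos approximation."""
    tmp = z + 5.5
    tmp -= (z + 0.5) * cmath.log(tmp)
    ser = 1.000000000190015
    for j, c in enumerate(_COEFFS):
        ser += c / (z + 1 + j)
    return -tmp + cmath.log(2.5066282746310005 * ser / z)  # sqrt(2*pi) * ser / z
```

    Arguments in the left half-plane can be handled by combining this with the reflection formula Gamma(z) * Gamma(1 - z) = pi / sin(pi * z), which is omitted here for brevity.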

  7. A Guide to Computer Literature; An Introductory Survey of the Sources of Information.

    ERIC Educational Resources Information Center

    Pritchard, Alan

    Sources of information in the computer field (especially in the United Kingdom and the United States) and instruction in the bibliographical control of the literature are provided in this survey which was designed to be of particular use to companies operating or concerned with computer systems. Sources are divided into three categories: primary…

  8. Multigrid Methods on Parallel Computers: A Survey on Recent Developments

    DTIC Science & Technology

    1990-12-01

    [Only fragments of this abstract survive in the record: it describes multi-color (red-black, four-color, etc.) orderings of the grid points, under which computation of defects, interpolation, and restriction can also be parallelized; the remainder of the record is an illegible timing table.]
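
    The red-black ordering mentioned in this record decouples the grid into two colors so that all points of one color can be updated simultaneously, which is what makes the smoother parallelizable. A minimal (serial) sketch for a 2-D Poisson smoother follows; the grid size and right-hand side in the test are illustrative choices, not from the paper.

```python
import numpy as np

def red_black_gauss_seidel(u, f, h, sweeps=1):
    """Red-black Gauss-Seidel sweeps for the 2-D Poisson problem
    -laplace(u) = f on a uniform grid with Dirichlet boundary held in u.
    All points of one color depend only on the other color, so each
    half-sweep could run fully in parallel."""
    for _ in range(sweeps):
        for color in (0, 1):                      # 0 = red, 1 = black
            for i in range(1, u.shape[0] - 1):
                for j in range(1, u.shape[1] - 1):
                    if (i + j) % 2 == color:
                        u[i, j] = 0.25 * (u[i - 1, j] + u[i + 1, j] +
                                          u[i, j - 1] + u[i, j + 1] +
                                          h * h * f[i, j])
    return u
```

    In a multigrid cycle this smoother is applied a few times per level, with defects restricted to coarser grids and corrections interpolated back.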

  9. National Survey of Computer Aided Manufacturing in Industrial Technology Programs.

    ERIC Educational Resources Information Center

    Heidari, Farzin

    The current status of computer-aided manufacturing in the 4-year industrial technology programs in the United States was studied. All industrial technology department chairs were mailed a questionnaire divided into program information, equipment information, and general comments sections. The questionnaire was designed to determine the subjects…

  10. Inhibition of Enterococcus faecium adherence to collagen by antibodies against high-affinity binding subdomains of Acm.

    PubMed

    Nallapareddy, Sreedhar R; Sillanpää, Jouko; Ganesh, Vannakambadi K; Höök, Magnus; Murray, Barbara E

    2007-06-01

    Strains of Enterococcus faecium express a cell wall-anchored protein, Acm, which mediates adherence to collagen. Here, we (i) identify the minimal and high-affinity binding subsegments of Acm and (ii) show that anti-Acm immunoglobulin Gs (IgGs) purified against these subsegments reduced E. faecium TX2535 strain collagen adherence up to 73 and 50%, respectively, significantly more than the total IgGs against the full-length Acm A domain (28%) (P < 0.0001). Blocking Acm adherence with functional subsegment-specific antibodies raises the possibility of their use as therapeutic or prophylactic agents.

  11. Comparative study of numerical schemes of TVD3, UNO3-ACM and optimized compact scheme

    NASA Technical Reports Server (NTRS)

    Lee, Duck-Joo; Hwang, Chang-Jeon; Ko, Duck-Kon; Kim, Jae-Wook

    1995-01-01

    Three different schemes are employed to solve the benchmark problems. The first is a conventional TVD-MUSCL (Monotone Upwind Schemes for Conservation Laws) scheme. The second is a UNO3-ACM (Uniformly Non-Oscillatory Artificial Compression Method) scheme. The third is an optimized compact finite difference scheme modified by us: 4th-order Runge-Kutta time stepping with a 4th-order pentadiagonal compact spatial discretization having maximum resolution characteristics. The problems of category 1 are solved using the second (UNO3-ACM) and third (optimized compact) schemes. The problems of category 2 are solved using the first (TVD3) and second (UNO3-ACM) schemes. The problem of category 5 is solved using the first (TVD3) scheme. It can be concluded from the present calculations that the optimized compact scheme and the UNO3-ACM show good resolution for category 1 and category 2, respectively.
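
    The schemes compared in this record are not reproduced here. As a small illustration of the central TVD-MUSCL ingredient, the sketch below builds limited second-order interface states with the minmod limiter, which keeps the reconstruction free of new extrema at discontinuities; the cell-average data are made up.

```python
def minmod(a, b):
    """Minmod slope limiter: zero at extrema, otherwise the smaller slope."""
    if a * b <= 0.0:
        return 0.0
    return min(abs(a), abs(b)) * (1.0 if a > 0 else -1.0)

def muscl_interface_states(u):
    """Second-order MUSCL reconstruction: left/right states at each interior
    cell interface i+1/2 from limited one-sided slopes of cell averages u."""
    left, right = [], []
    for i in range(1, len(u) - 2):
        s_i = minmod(u[i] - u[i - 1], u[i + 1] - u[i])
        s_next = minmod(u[i + 1] - u[i], u[i + 2] - u[i + 1])
        left.append(u[i] + 0.5 * s_i)          # extrapolated from the left cell
        right.append(u[i + 1] - 0.5 * s_next)  # extrapolated from the right cell
    return left, right
```

    On smooth data both states agree to second order; across a jump the limiter returns zero slope, so the states stay bounded by the neighboring cell averages.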

  12. Software survey: VOSviewer, a computer program for bibliometric mapping.

    PubMed

    van Eck, Nees Jan; Waltman, Ludo

    2010-08-01

    We present VOSviewer, a freely available computer program that we have developed for constructing and viewing bibliometric maps. Unlike most computer programs that are used for bibliometric mapping, VOSviewer pays special attention to the graphical representation of bibliometric maps. The functionality of VOSviewer is especially useful for displaying large bibliometric maps in an easy-to-interpret way. The paper consists of three parts. In the first part, an overview of VOSviewer's functionality for displaying bibliometric maps is provided. In the second part, the technical implementation of specific parts of the program is discussed. Finally, in the third part, VOSviewer's ability to handle large maps is demonstrated by using the program to construct and display a co-citation map of 5,000 major scientific journals.

  13. Software survey: VOSviewer, a computer program for bibliometric mapping

    PubMed Central

    Waltman, Ludo

    2009-01-01

    We present VOSviewer, a freely available computer program that we have developed for constructing and viewing bibliometric maps. Unlike most computer programs that are used for bibliometric mapping, VOSviewer pays special attention to the graphical representation of bibliometric maps. The functionality of VOSviewer is especially useful for displaying large bibliometric maps in an easy-to-interpret way. The paper consists of three parts. In the first part, an overview of VOSviewer’s functionality for displaying bibliometric maps is provided. In the second part, the technical implementation of specific parts of the program is discussed. Finally, in the third part, VOSviewer’s ability to handle large maps is demonstrated by using the program to construct and display a co-citation map of 5,000 major scientific journals. PMID:20585380

  14. A survey of computational intelligence techniques in protein function prediction.

    PubMed

    Tiwari, Arvind Kumar; Srivastava, Rajeev

    2014-01-01

    In the recent past, there has been massive growth in knowledge of previously uncharacterized proteins with the advancement of high-throughput microarray technologies. Protein function prediction is among the most challenging problems in bioinformatics. Homology-based approaches were traditionally used to predict protein function, but they fail when a new protein is dissimilar to previously characterized ones. Therefore, to alleviate the problems associated with traditional homology-based approaches, numerous computational intelligence techniques have been proposed in the recent past. This paper presents a state-of-the-art comprehensive review of various computational intelligence techniques for protein function prediction using sequence, structure, protein-protein interaction network, and gene expression data, applied in areas such as prediction of DNA and RNA binding sites, subcellular localization, enzyme functions, signal peptides, catalytic residues, nuclear/G-protein coupled receptors, membrane proteins, and pathway analysis from gene expression datasets. This paper also summarizes the results obtained by many researchers who have addressed these problems using computational intelligence techniques with appropriate datasets to improve prediction performance. The summary shows that ensemble classifiers and the integration of multiple heterogeneous data sources are useful for protein function prediction.

  15. The acellular matrix (ACM) for bladder tissue engineering: A quantitative magnetic resonance imaging study.

    PubMed

    Cheng, Hai-Ling Margaret; Loai, Yasir; Beaumont, Marine; Farhat, Walid A

    2010-08-01

    Bladder acellular matrices (ACMs) derived from natural tissue are gaining increasing attention for their role in tissue engineering and regeneration. Unlike conventional scaffolds based on biodegradable polymers or gels, ACMs possess native biomechanical and many acquired biologic properties. Efforts to optimize ACM-based scaffolds are ongoing and would be greatly assisted by a noninvasive means to characterize scaffold properties and monitor interaction with cells. MRI is well suited to this role, but research with MRI for scaffold characterization has been limited. This study presents initial results from quantitative MRI measurements for bladder ACM characterization and investigates the effects of incorporating hyaluronic acid, a natural biomaterial useful in tissue-engineering and regeneration. Measured MR relaxation times (T(1), T(2)) and diffusion coefficient were consistent with increased water uptake and glycosaminoglycan content observed on biochemistry in hyaluronic acid ACMs. Multicomponent MRI provided greater specificity, with diffusion data showing an acellular environment and T(2) components distinguishing the separate effects of increased glycosaminoglycans and hydration. These results suggest that quantitative MRI may provide useful information on matrix composition and structure, which is valuable in guiding further development using bladder ACMs for organ regeneration and in strategies involving the use of hyaluronic acid.

  16. Children's Experiences of Completing a Computer-Based Violence Survey: Finnish Child Victim Survey Revisited.

    PubMed

    Fagerlund, Monica; Ellonen, Noora

    2016-07-01

    The involvement of children as research subjects requires special considerations with regard to research practices and ethics. This is especially true concerning sensitive research topics such as sexual victimization. Prior research suggests that reflecting on these experiences in a survey can cause negative feelings in child participants, though it poses only minimal to moderate risk. Analyzing only predefined, often negative, feelings related to answering a sexual victimization survey has dominated the existing literature. In this article, children's free-text comments about answering a victimization survey and experiences of sexual victimization are analyzed together to evaluate the effects of research participation in relation to this sensitive issue. Altogether 11,364 children, aged 11-12 and 15-16, participated in the Finnish Child Victim Survey in 2013. Of these, 69% (7,852) reflected on their feelings about answering the survey. Results indicate that both clearly negative and positive feelings are more prevalent among victimized children compared to their nonvictimized peers. Characteristics unique to sexual victimization as well as differences related to gender and age are also discussed. The study contributes to the important yet contradictory field of studying the effects of research participation on children.

  17. Survey of computer vision in roadway transportation systems

    NASA Astrophysics Data System (ADS)

    Manikoth, Natesh; Loce, Robert; Bernal, Edgar; Wu, Wencheng

    2012-01-01

    There is a world-wide effort to apply 21st century intelligence to evolving our transportation networks. The goals of smart transportation networks are quite noble and manifold, including safety, efficiency, law enforcement, energy conservation, and emission reduction. Computer vision is playing a key role in this transportation evolution. Video imaging scientists are providing intelligent sensing and processing technologies for a wide variety of applications and services. There are many interesting technical challenges, including imaging under a variety of environmental and illumination conditions, data overload, recognition and tracking of objects at high speed, distributed network sensing and processing, energy sources, as well as legal concerns. This conference presentation and publication is a brief introduction to the field, and will be followed by an in-depth journal paper that provides more details on the imaging systems and algorithms.

  18. Gender stereotypes, aggression, and computer games: an online survey of women.

    PubMed

    Norris, Kamala O

    2004-12-01

    Computer games were conceptualized as a potential mode of entry into computer-related employment for women. Computer games contain increasing levels of realism and violence, as well as biased gender portrayals. It has been suggested that aggressive personality characteristics attract people to aggressive video games, and that more women do not play computer games because they are socialized to be non-aggressive. To explore gender identity and aggressive personality in the context of computers, an online survey was conducted of women who played computer games and women who used the computer but did not play computer games. Women who played computer games perceived their online environments as less friendly but experienced less sexual harassment online, were more aggressive themselves, and did not differ in gender identity, degree of sex role stereotyping, or acceptance of sexual violence when compared to women who used the computer but did not play computer games. Finally, computer gaming was associated with decreased participation in computer-related employment; however, women with high masculine gender identities were more likely to use computers at work.

  19. Cloud computing for energy management in smart grid - an application survey

    NASA Astrophysics Data System (ADS)

    Naveen, P.; Kiing Ing, Wong; Kobina Danquah, Michael; Sidhu, Amandeep S.; Abu-Siada, Ahmed

    2016-03-01

    The smart grid is an emerging energy system in which information technology, tools, and techniques are applied to make the grid run more efficiently. It possesses demand-response capacity to help balance electrical consumption with supply. The challenges and opportunities of emerging and future smart grids can be addressed by cloud computing. To address these requirements, we provide an in-depth survey of different cloud computing applications for energy management in the smart grid architecture. In this survey, we present an outline of the current state of research on smart grid development. We also propose a model of cloud-based economic power dispatch for the smart grid.
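    The economic power dispatch problem mentioned in this abstract can be illustrated with a minimal merit-order sketch: fill demand from the cheapest generating units first. This is an illustrative toy, not the cloud-based model the paper proposes; the generator names, capacities, and costs below are hypothetical.

    ```python
    def economic_dispatch(generators, demand_mw):
        """Greedy merit-order dispatch: satisfy demand from cheapest units first.

        generators: list of (name, capacity_mw, cost_per_mwh) tuples (hypothetical data).
        Returns a dict mapping each unit name to its dispatched megawatts.
        """
        schedule = {}
        remaining = demand_mw
        # Sort units by marginal cost, cheapest first.
        for name, capacity, _cost in sorted(generators, key=lambda g: g[2]):
            dispatched = min(capacity, remaining)
            schedule[name] = dispatched
            remaining -= dispatched
        if remaining > 0:
            raise ValueError("demand exceeds total generating capacity")
        return schedule

    # Hypothetical fleet: cheap baseload, mid-cost gas, expensive peaker.
    gens = [("coal", 300, 25.0), ("gas", 200, 40.0), ("peaker", 100, 90.0)]
    print(economic_dispatch(gens, 450))  # → {'coal': 300, 'gas': 150, 'peaker': 0}
    ```

    Real dispatch models add ramp limits, transmission constraints, and losses; the greedy merit order above is only the core cost-minimization intuition.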

  20. A survey on computer aided diagnosis for ocular diseases

    PubMed Central

    2014-01-01

    Background Computer Aided Diagnosis (CAD), which can automate the detection process for ocular diseases, has attracted extensive attention from clinicians and researchers alike. It not only alleviates the burden on clinicians by providing objective opinion with valuable insights, but also offers early detection and easy access for patients. Method We review ocular CAD methodologies for various data types. For each data type, we investigate the databases and the algorithms used to detect different ocular diseases. Their advantages and shortcomings are analyzed and discussed. Result We have studied three types of data (i.e., clinical, genetic and imaging) that have been commonly used in existing methods for CAD. The recent developments in methods used in CAD of ocular diseases (such as Diabetic Retinopathy, Glaucoma, Age-related Macular Degeneration and Pathological Myopia) are investigated and summarized comprehensively. Conclusion While CAD for ocular diseases has shown considerable progress over the past years, fully automatic CAD systems that can embed clinical knowledge and integrate heterogeneous data sources still hold great potential for future breakthroughs. PMID:25175552

  1. Creation and Use of a Survey Instrument for Comparing Mobile Computing Devices

    PubMed Central

    Macri, Jennifer M.; Lee, Paul P.; Silvey, Garry M.; Lobach, David F.

    2005-01-01

    Both personal digital assistants (PDAs) and tablet computers have emerged to facilitate data collection at the point of care. However, little research has been reported comparing these mobile computing devices in specific care settings. In this study we present an approach for comparing functionally identical applications on a Palm operating system-based PDA and a Windows-based tablet computer for point-of-care documentation of clinical observations by eye care professionals when caring for patients with diabetes. Eye-care professionals compared the devices through focus group sessions and through validated usability surveys. This poster describes the development and use of the survey instrument used for comparing mobile computing devices. PMID:16779327

  2. Computers in Education: A Survey of Computer Technology in the Westchester/Putnam Schools.

    ERIC Educational Resources Information Center

    Flank, Sandra G.; Livesey, Lynne

    The Westchester Education Coalition, Inc., a coalition of business, education, and the community, surveyed the state of education in the schools of Westchester and Putnam counties (New York) to establish baseline data in the region and to suggest some future directions. In January 1992, a questionnaire was sent to all of the school districts of…

  3. National Survey of Internet Usage: Teachers, Computer Coordinators, and School Librarians, Grades 3-12.

    ERIC Educational Resources Information Center

    Market Data Retrieval, Inc., Shelton, CT.

    A study was conducted to assess the number and type of schools and educators who use the Internet/World Wide Web. The national survey was conducted in November and December of 1996, and included 6,000 teachers, computer coordinators, and school librarians currently working in grades 3-5, 6-8, and 9-12. At the elementary level, classroom teachers…

  4. Survey of ANL organization plans for word processors, personal computers, workstations, and associated software. Revision 4

    SciTech Connect

    Fenske, K.R.; Rockwell, V.S.

    1992-08-01

    The Computing and Telecommunications Division (CTD) has compiled this Survey of ANL Organization plans for Word Processors, Personal Computers, Workstations, and Associated Software (ANL/TM, Revision 4) to provide DOE and Argonne with a record of recent growth in the acquisition and use of personal computers, microcomputers, and word processors at ANL. Laboratory planners, service providers, and people involved in office automation may find the Survey useful. It is for internal use only, and any unauthorized use is prohibited. Readers of the Survey should use it as a reference document that (1) documents the plans of each organization for office automation, (2) identifies appropriate planners and other contact people in those organizations and (3) encourages the sharing of this information among those people making plans for organizations and decisions about office automation. The Survey supplements information in both the ANL Statement of Site Strategy for Computing Workstations (ANL/TM 458) and the ANL Site Response for the DOE Information Technology Resources Long-Range Plan (ANL/TM 466).

  5. Survey of ANL organization plans for word processors, personal computers, workstations, and associated software

    SciTech Connect

    Fenske, K.R.; Rockwell, V.S.

    1992-08-01

    The Computing and Telecommunications Division (CTD) has compiled this Survey of ANL Organization plans for Word Processors, Personal Computers, Workstations, and Associated Software (ANL/TM, Revision 4) to provide DOE and Argonne with a record of recent growth in the acquisition and use of personal computers, microcomputers, and word processors at ANL. Laboratory planners, service providers, and people involved in office automation may find the Survey useful. It is for internal use only, and any unauthorized use is prohibited. Readers of the Survey should use it as a reference document that (1) documents the plans of each organization for office automation, (2) identifies appropriate planners and other contact people in those organizations and (3) encourages the sharing of this information among those people making plans for organizations and decisions about office automation. The Survey supplements information in both the ANL Statement of Site Strategy for Computing Workstations (ANL/TM 458) and the ANL Site Response for the DOE Information Technology Resources Long-Range Plan (ANL/TM 466).

  6. Computer-Assisted Instruction: A Survey of the Literature, Third Edition.

    ERIC Educational Resources Information Center

    Hickey, Albert E., Ed.

    References to literature published before July 1968 on computer-assisted instruction (CAI) are presented in this survey. Nine subject area chapters, providing the framework for the references, deal with general statements on CAI (including benefits, state of the art, problems, roles in society, financial support, and trends); applications of CAI…

  7. Survey of New Jersey Public School Districts Using Computers and Data Entry Equipment.

    ERIC Educational Resources Information Center

    Gaydos, Irvin A.

    The twelve tables in this study represent the results of a Fall 1975 survey of the 589 operating school districts in the State of New Jersey to determine the status of computer and data entry equipment utilization. Results show that the number of users of this equipment increased noticeably, primarily in administrative processing areas such as…

  8. Technology survey of computer software as applicable to the MIUS project

    NASA Technical Reports Server (NTRS)

    Fulbright, B. E.

    1975-01-01

    Existing computer software, available from either governmental or private sources, applicable to modular integrated utility system program simulation is surveyed. Several programs and subprograms are described to provide a consolidated reference, and a bibliography is included. The report covers the two broad areas of design simulation and system simulation.

  9. A Survey of Techniques for Modeling and Improving Reliability of Computing Systems

    DOE PAGES

    Mittal, Sparsh; Vetter, Jeffrey S.

    2015-04-24

    Recent trends of aggressive technology scaling have greatly exacerbated the occurrences and impact of faults in computing systems. This has made `reliability' a first-order design constraint. To address the challenges of reliability, several techniques have been proposed. In this study, we provide a survey of architectural techniques for improving resilience of computing systems. We especially focus on techniques proposed for microarchitectural components, such as processor registers, functional units, cache and main memory etc. In addition, we discuss techniques proposed for non-volatile memory, GPUs and 3D-stacked processors. To underscore the similarities and differences of the techniques, we classify them based on their key characteristics. We also review the metrics proposed to quantify vulnerability of processor structures. Finally, we believe that this survey will help researchers, system-architects and processor designers in gaining insights into the techniques for improving reliability of computing systems.

  10. A Survey of Techniques for Modeling and Improving Reliability of Computing Systems

    SciTech Connect

    Mittal, Sparsh; Vetter, Jeffrey S.

    2015-04-24

    Recent trends of aggressive technology scaling have greatly exacerbated the occurrences and impact of faults in computing systems. This has made `reliability' a first-order design constraint. To address the challenges of reliability, several techniques have been proposed. In this study, we provide a survey of architectural techniques for improving resilience of computing systems. We especially focus on techniques proposed for microarchitectural components, such as processor registers, functional units, cache and main memory etc. In addition, we discuss techniques proposed for non-volatile memory, GPUs and 3D-stacked processors. To underscore the similarities and differences of the techniques, we classify them based on their key characteristics. We also review the metrics proposed to quantify vulnerability of processor structures. Finally, we believe that this survey will help researchers, system-architects and processor designers in gaining insights into the techniques for improving reliability of computing systems.
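    A classical example from this family of fault-tolerance techniques is triple modular redundancy (TMR), where three replicas compute the same result and a voter masks a single faulty output. The sketch below is a generic illustration of majority voting, not code from the survey.

    ```python
    from collections import Counter

    def tmr_vote(replica_outputs):
        """Majority vote over three redundant module outputs.

        A single faulty replica is masked; if no value reaches a majority
        (two or more faults), the error is detected but not correctable.
        """
        value, count = Counter(replica_outputs).most_common(1)[0]
        if count < 2:
            raise RuntimeError("no majority: multiple faults detected")
        return value

    # One replica (the second) returns a corrupted result; the vote masks it.
    print(tmr_vote([42, 41, 42]))  # → 42
    ```

    The same voting idea generalizes to N-modular redundancy; the trade-off surveyed in such work is the area and energy cost of replication versus the fault coverage gained.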

  11. Issues in nursing: strategies for an Internet-based, computer-assisted telephone survey.

    PubMed

    Piamjariyakul, Ubolrat; Bott, Marjorie J; Taunton, Roma Lee

    2006-08-01

    The study describes the design and implementation of an Internet-based, computer-assisted telephone survey about the care-planning process in 107 long-term care facilities in the Midwest. Two structured telephone surveys were developed to interview the care-planning coordinators and their team members. Questionmark Perception Software Version 3 was used to develop the surveys in a wide range of formats. The responses were drawn into a database that was exported to a spreadsheet format and converted to a statistical format by the Information Technology team. Security of the database was protected. Training sessions were provided to project staff. The interviews were tape-recorded for quality checks. Inter-rater reliabilities ranged from 95% to 100% agreement. Investigators should consider using Internet-based survey tools, especially for multisite studies, because they allow access to larger samples at lower cost. Exploring multiple software systems for the best fit to the study requirements is essential.

  12. Objective measures, sensors and computational techniques for stress recognition and classification: a survey.

    PubMed

    Sharma, Nandita; Gedeon, Tom

    2012-12-01

    Stress is a major and growing concern in our day and age, adversely impacting both individuals and society. Stress research has a wide range of benefits, from improving personal operations and learning to increasing work productivity, making it an interesting and socially beneficial area of research. This survey reviews sensors that have been used to measure stress and investigates techniques for modelling stress. It discusses non-invasive and unobtrusive sensors for measuring computed stress, a term we coin in the paper. The discussion focuses on sensors that do not impede everyday activities and that could be used by those who would like to monitor stress levels on a regular basis (e.g. vehicle drivers, patients with illnesses linked to stress). Computational techniques have the capacity to determine optimal sensor fusion and automate data analysis for stress recognition and classification. Several computational techniques have been developed to model stress based on methods such as Bayesian networks, artificial neural networks, and support vector machines, which this survey investigates. The survey concludes with a summary and possible directions for further computational stress research.
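    The classification task this survey covers can be illustrated with a toy sketch: map fused sensor features to a stress label. For brevity this uses a nearest-centroid classifier rather than the Bayesian networks, neural networks, or SVMs the survey discusses, and the heart-rate/skin-conductance readings are entirely hypothetical.

    ```python
    import math

    def train_centroids(samples):
        """samples: list of (feature_vector, label). Returns label -> mean vector."""
        sums, counts = {}, {}
        for feats, label in samples:
            acc = sums.setdefault(label, [0.0] * len(feats))
            for i, v in enumerate(feats):
                acc[i] += v
            counts[label] = counts.get(label, 0) + 1
        return {lab: [v / counts[lab] for v in vec] for lab, vec in sums.items()}

    def classify(centroids, feats):
        """Assign the label of the nearest centroid (Euclidean distance)."""
        return min(centroids, key=lambda lab: math.dist(feats, centroids[lab]))

    # Hypothetical (heart_rate_bpm, skin_conductance_uS) training readings.
    data = [((62, 2.1), "calm"), ((65, 2.4), "calm"),
            ((95, 7.8), "stressed"), ((102, 8.5), "stressed")]
    model = train_centroids(data)
    print(classify(model, (90, 7.0)))  # → stressed
    ```

    Real stress-recognition pipelines add feature normalization and per-subject calibration, since baseline physiology varies widely between individuals.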

  13. CNTF-ACM promotes mitochondrial respiration and oxidative stress in cortical neurons through upregulating L-type calcium channel activity.

    PubMed

    Sun, Meiqun; Liu, Hongli; Xu, Huanbai; Wang, Hongtao; Wang, Xiaojing

    2016-09-01

    A specialized culture medium termed ciliary neurotrophic factor-treated astrocyte-conditioned medium (CNTF-ACM) allows investigators to assess the peripheral effects of CNTF-induced activated astrocytes upon cultured neurons. CNTF-ACM has been shown to upregulate neuronal L-type calcium channel current activity, which has been previously linked to changes in mitochondrial respiration and oxidative stress. Therefore, the aim of this study was to evaluate CNTF-ACM's effects upon mitochondrial respiration and oxidative stress in rat cortical neurons. Cortical neurons, CNTF-ACM, and untreated control astrocyte-conditioned medium (UC-ACM) were prepared from neonatal Sprague-Dawley rat cortical tissue. Neurons were cultured in either CNTF-ACM or UC-ACM for a 48-h period. Changes in the following parameters before and after treatment with the L-type calcium channel blocker isradipine were assessed: (i) intracellular calcium levels, (ii) mitochondrial membrane potential (ΔΨm), (iii) oxygen consumption rate (OCR) and adenosine triphosphate (ATP) formation, (iv) intracellular nitric oxide (NO) levels, (v) mitochondrial reactive oxygen species (ROS) production, and (vi) susceptibility to the mitochondrial complex I toxin rotenone. CNTF-ACM neurons displayed the following significant changes relative to UC-ACM neurons: (i) increased intracellular calcium levels (p < 0.05), (ii) elevation in ΔΨm (p < 0.05), (iii) increased OCR and ATP formation (p < 0.05), (iv) increased intracellular NO levels (p < 0.05), (v) increased mitochondrial ROS production (p < 0.05), and (vi) increased susceptibility to rotenone (p < 0.05). Treatment with isradipine was able to partially rescue these negative effects of CNTF-ACM (p < 0.05). CNTF-ACM promotes mitochondrial respiration and oxidative stress in cortical neurons through elevating L-type calcium channel activity.

  14. 76 FR 64943 - Proposed Cercla Administrative Cost Recovery Settlement; ACM Smelter and Refinery Site, Located...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-19

    ... AGENCY Proposed Cercla Administrative Cost Recovery Settlement; ACM Smelter and Refinery Site, Located in..., Compensation, and Liability Act, as amended (CERCLA), 42 U.S.C. 9622(i), notice is hereby given of a proposed... portions of Operable Unit 1 of the Site, and to pay $1,050,000.00 to the Hazardous Substance Superfund...

  15. SUPERFUND TREATABILITY CLEARINGHOUSE: FINAL REPORT, PHASE I - IMMEDIATE ASSESSMENT, ACME SOLVENTS SITE

    EPA Science Inventory

    This is a site assessment and feasibility study of incineration alternatives at the ACME Solvents Site at Rockford, Illinois. The document contains laboratory results that are reported to simulate incineration conditions but no details on test methods were provided. The d...

  16. Validation of the Adolescent Concerns Measure (ACM): Evidence from Exploratory and Confirmatory Factor Analysis

    ERIC Educational Resources Information Center

    Ang, Rebecca P.; Chong, Wan Har; Huan, Vivien S.; Yeo, Lay See

    2007-01-01

    This article reports the development and initial validation of scores obtained from the Adolescent Concerns Measure (ACM), a scale which assesses concerns of Asian adolescent students. In Study 1, findings from exploratory factor analysis using 619 adolescents suggested a 24-item scale with four correlated factors--Family Concerns (9 items), Peer…

  17. An audience-channel-message-evaluation (ACME) framework for health communication campaigns.

    PubMed

    Noar, Seth M

    2012-07-01

    Recent reviews of the literature have indicated that a number of health communication campaigns continue to fail to adhere to principles of effective campaign design. The lack of an integrated, organizing framework for the design, implementation, and evaluation of health communication campaigns may contribute to this state of affairs. The current article introduces an audience-channel-message-evaluation (ACME) framework that organizes the major principles of health campaign design, implementation, and evaluation. ACME also explicates the relationships and linkages between the varying principles. Insights from ACME include the following: The choice of audience segment(s) to focus on in a campaign affects all other campaign design choices, including message strategy and channel/component options. Although channel selection influences options for message design, choice of message design also influences channel options. Evaluation should not be thought of as a separate activity, but rather should be infused and integrated throughout the campaign design and implementation process, including formative, process, and outcome evaluation activities. Overall, health communication campaigns that adhere to this integrated set of principles of effective campaign design will have a greater chance of success than those using principles idiosyncratically. These design, implementation, and evaluation principles are embodied in the ACME framework.

  18. Clinical Computer Systems Survey (CLICS): learning about health information technology (HIT) in its context of use.

    PubMed

    Lichtner, Valentina; Cornford, Tony; Klecun, Ela

    2013-01-01

    Successful health information technology (HIT) implementations need to be informed on the context of use and on users' attitudes. To this end, we developed the CLinical Computer Systems Survey (CLICS) instrument. CLICS reflects a socio-technical view of HIT adoption, and is designed to encompass all members of the clinical team. We used the survey in a large English hospital as part of its internal evaluation of the implementation of an electronic patient record system (EPR). The survey revealed extent and type of use of the EPR; how it related to and integrated with other existing systems; and people's views on its use, usability and emergent safety issues. Significantly, participants really appreciated 'being asked'. They also reminded us of the wider range of administrative roles engaged with EPR. This observation reveals pertinent questions as to our understanding of the boundaries between administrative tasks and clinical medicine - what we propose as the field of 'administrative medicine'.

  19. Survey and future directions of fault-tolerant distributed computing on board spacecraft

    NASA Astrophysics Data System (ADS)

    Fayyaz, Muhammad; Vladimirova, Tanya

    2016-12-01

    Current and future space missions demand highly reliable on-board computing systems capable of carrying out high-performance data processing. At present, no single computing scheme satisfies both the high-reliability requirement and the high-performance computing requirement. The aim of this paper is to review existing systems and offer a new approach to addressing the problem. In the first part of the paper, a detailed survey of fault-tolerant distributed computing systems for space applications is presented. Fault types and assessment criteria for fault-tolerant systems are introduced. Redundancy schemes for distributed systems are analyzed. A review of the state of the art in fault-tolerant distributed systems is presented, and limitations of current approaches are discussed. In the second part of the paper, a new fault-tolerant distributed computing platform with wireless links among the computing nodes is proposed. Novel algorithms enabling important aspects of the architecture, such as time-slot-priority adaptive fault-tolerant channel access and fault-tolerant distributed computing using task migration, are introduced.
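    The task-migration idea mentioned in the abstract can be sketched in miniature: when a node fails, its tasks move to the least-loaded healthy node. This is a generic illustration under assumed data structures, not the paper's algorithm.

    ```python
    def migrate_tasks(nodes, tasks):
        """Reassign tasks away from failed nodes (fault-tolerant task migration sketch).

        nodes: {node_name: is_healthy}; tasks: {task_name: assigned_node}.
        Tasks on healthy nodes stay put; tasks on failed nodes move to the
        healthy node with the lightest current load. Returns a new assignment.
        """
        healthy = [n for n, ok in nodes.items() if ok]
        if not healthy:
            raise RuntimeError("no healthy nodes available")
        # Count the current load on each healthy node.
        load = {n: 0 for n in healthy}
        for node in tasks.values():
            if node in load:
                load[node] += 1
        new_assignment = {}
        for task, node in tasks.items():
            if nodes.get(node, False):
                new_assignment[task] = node          # node is healthy: keep
            else:
                target = min(load, key=load.get)     # migrate to lightest node
                new_assignment[task] = target
                load[target] += 1
        return new_assignment

    nodes = {"A": True, "B": False, "C": True}
    tasks = {"t1": "A", "t2": "B", "t3": "B", "t4": "C"}
    print(migrate_tasks(nodes, tasks))  # → {'t1': 'A', 't2': 'A', 't3': 'C', 't4': 'C'}
    ```

    A real on-board scheme must also checkpoint task state and detect failures (e.g. via heartbeats) before any migration can happen; the sketch assumes failure detection has already occurred.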

  20. On-resin conversion of Cys(Acm)-containing peptides to their corresponding Cys(Scm) congeners.

    PubMed

    Mullen, Daniel G; Weigel, Benjamin; Barany, George; Distefano, Mark D

    2010-05-01

    The Acm protecting group for the thiol functionality of cysteine is removed under conditions (Hg(2+)) that are orthogonal to the acidic milieu used for global deprotection in Fmoc-based solid-phase peptide synthesis. This use of a toxic heavy metal for deprotection has limited the usefulness of Acm in peptide synthesis. The Acm group may be converted to the Scm derivative that can then be used as a reactive intermediate for unsymmetrical disulfide formation. It may also be removed by mild reductive conditions to generate unprotected cysteine. Conversion of Cys(Acm)-containing peptides to their corresponding Cys(Scm) derivatives in solution is often problematic because the sulfenyl chloride reagent used for this conversion may react with the sensitive amino acids tyrosine and tryptophan. In this protocol, we report a method for on-resin Acm to Scm conversion that allows the preparation of Cys(Scm)-containing peptides under conditions that do not modify other amino acids.

  1. Opportunities and Needs for Mobile-Computing Technology to Support U.S. Geological Survey Fieldwork

    USGS Publications Warehouse

    Wood, Nathan J.; Halsing, David L.

    2006-01-01

    To assess the opportunities and needs for mobile-computing technology at the U.S. Geological Survey (USGS), we conducted an internal, Internet-based survey of bureau scientists whose research includes fieldwork. In summer 2005, 144 survey participants answered 65 questions about fieldwork activities and conditions, technology to support field research, and postfieldwork data processing and analysis. Results suggest that some types of mobile-computing technology are already commonplace, such as digital cameras and Global Positioning System (GPS) receivers, whereas others are not, such as personal digital assistants (PDAs) and tablet-based personal computers (tablet PCs). The potential for PDA use in the USGS is high: 97 percent of respondents record field observations (primarily environmental conditions and water-quality data), and 87 percent take field samples (primarily water-quality data, water samples, and sediment/soil samples). The potential for tablet PC use in the USGS is also high: 59 percent of respondents map environmental features in the field, primarily by sketching in field notebooks, on aerial photographs, or on topographic-map sheets. Results also suggest that efficient mobile-computing-technology solutions could benefit many USGS scientists because most respondents spend at least 1 week per year in the field, conduct field sessions that are least 1 week in duration, have field crews of one to three people, and typically travel on foot about 1 mi from their field vehicles. By allowing researchers to enter data directly into digital databases while in the field, mobile-computing technology could also minimize postfieldwork data processing: 93 percent of respondents enter collected field data into their office computers, and more than 50 percent spend at least 1 week per year on postfieldwork data processing. Reducing postfieldwork data processing could free up additional time for researchers and result in cost savings for the bureau. Generally

  2. Continuing educational needs in computers and informatics. McGill survey of family physicians.

    PubMed Central

    McClaran, J.; Snell, L.; Duarte-Franco, E.

    2000-01-01

    OBJECTIVE: To describe family physicians' perceived educational needs in computers and informatics. DESIGN: Mailed survey. SETTING: General or family practices in Canada. PARTICIPANTS: Physicians (489 responded to a mailing sent to 2,500 physicians) who might attend sessions at the McGill Centre for CME. Two duplicate questionnaires were excluded from the analysis. METHOD: Four domains were addressed: practice profile, clinical CME needs, professional CME needs, and preferred learning formats. Data were entered on dBASE IV; analyses were performed on SPSS. MAIN FINDINGS: In the 487 questionnaires retained for analysis, "informatics and computers" was mentioned more than any other clinical diagnostic area, any other professional area, and all but three patient groups and service areas as a topic where improvement in knowledge and skills was needed in the coming year. Most physicians had no access to computer support for practice (62.6%); physicians caring for neonates, toddlers, or hospital inpatients were more likely to report some type of computer support. CONCLUSIONS: Family physicians selected knowledge and skills for computers and informatics as an area for improvement in the coming year more frequently than they selected most traditional clinical CME topics. This educational need is particularly great in small towns and in settings where some computerized hospital data are already available. PMID:10790816

  3. Survey of Need for Computer Operators, Dietetic Technicians, Cable TV Technicians and for a Crime Laboratory for Northwest Police Academy.

    ERIC Educational Resources Information Center

    Lucas, John A.; And Others

    Surveys were conducted of community needs for three types of technicians--computer operators, diet technicians, cable TV technicians--and for a regional crime lab to serve the Chicago northwestern suburban police. The purpose of the surveys was to determine whether community needs would justify inclusion of training programs for such technicians…

  4. A Comparison of Paper vs. Computer-Assisted Self Interview for School, Alcohol, Tobacco, and Other Drug Surveys.

    ERIC Educational Resources Information Center

    Hallfors, Denise; Khatapoush, Shereen; Kadushin, Charles; Watson, Kim; Saxe, Leonard

    2000-01-01

    Examined whether computer assisted self-interview (CASI) alcohol, tobacco, and drug use surveys are feasible with 2,296 7th, 9th, and 11th graders in 2 communities. CASI surveys did not increase reported rates of substance abuse, but did improve the speed of data processing and decrease missing data. (SLD)

  5. Effects of activated ACM on expression of signal transducers in cerebral cortical neurons of rats.

    PubMed

    Wang, Xiaojing; Li, Zhengli; Zhu, Changgeng; Li, Zhongyu

    2007-06-01

    To explore the roles of astrocytes in the epileptogenesis, astrocytes and neurons were isolated, purified and cultured in vitro from cerebral cortex of rats. The astrocytes were activated by ciliary neurotrophic factor (CNTF) and astrocytic conditioned medium (ACM) was collected to treat neurons for 4, 8 and 12 h. By using Western blot, the expression of calmodulin dependent protein kinase II (CaMK II), inducible nitric oxide synthase (iNOS) and adenylate cyclase (AC) was detected in neurons. The results showed that the expression of CaMK II, iNOS and AC was increased significantly in the neurons treated with ACM from 4 h to 12 h (P<0.05), and that of iNOS and AC peaked at 8 h and 12 h respectively. It was suggested that there might be some epileptogenic factors in the ACM and such signal pathways as NOS-NO-cGMP, Ca2+/CaM-CaMK II and AC-cAMP-PKA might take part in the signal transduction of epileptogenesis.

  6. Clouds and Precipitation Simulated by the US DOE Accelerated Climate Modeling for Energy (ACME)

    NASA Astrophysics Data System (ADS)

    Xie, S.; Lin, W.; Yoon, J. H.; Ma, P. L.; Rasch, P. J.; Ghan, S.; Zhang, K.; Zhang, Y.; Zhang, C.; Bogenschutz, P.; Gettelman, A.; Larson, V. E.; Neale, R. B.; Park, S.; Zhang, G. J.

    2015-12-01

    A new US Department of Energy (DOE) climate modeling effort is to develop the Accelerated Climate Model for Energy (ACME) to accelerate the development and application of fully coupled, state-of-the-art Earth system models for scientific and energy applications. ACME is a high-resolution climate model with 0.25-degree horizontal resolution and more than 60 vertical levels. It starts from the Community Earth System Model (CESM) with notable changes to its physical parameterizations and other components. This presentation provides an overview of the ACME model's capability in simulating clouds and precipitation and its sensitivity to convection schemes. Results using several state-of-the-art cumulus convection schemes, including the unified parameterizations being developed in the climate community, will be presented. These convection schemes are evaluated in a multi-scale framework including both short-range hindcasts and free-running climate simulations, against both satellite data and ground-based measurements. Running a climate model in short-range hindcasts has proven to be an efficient way to understand model deficiencies. The analysis focuses on those systematic errors in cloud and precipitation simulations that are shared by many climate models. The goal is to understand which model deficiencies might be primarily responsible for these systematic errors.

  7. Survey on Security Issues in File Management in Cloud Computing Environment

    NASA Astrophysics Data System (ADS)

    Gupta, Udit

    2015-06-01

    Cloud computing has pervaded every aspect of information technology in the past decade. With the advent of cloud networks, it has become easier to process the plethora of data generated by various devices in real time. The privacy of users' data is maintained by data centers around the world, and hence it has become feasible to operate on that data from lightweight portable devices. But with ease of processing comes the security aspect of the data. One such security aspect is secure file transfer, either internally within a cloud or externally from one cloud network to another. File management is central to cloud computing, and it is paramount to address the security concerns that arise from it. This survey paper aims to elucidate the various protocols that can be used for secure file transfer and to analyze the ramifications of using each protocol.
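    Whatever transfer protocol is chosen, one basic building block of secure file transfer is end-to-end integrity verification: the receiver recomputes a cryptographic digest and compares it with the sender's. A minimal sketch (protocol-agnostic, not from the survey) using SHA-256:

    ```python
    import hashlib
    import os
    import tempfile

    def sha256_of(path, chunk_size=65536):
        """Stream a file through SHA-256 so large files never load fully into memory."""
        digest = hashlib.sha256()
        with open(path, "rb") as f:
            while chunk := f.read(chunk_size):
                digest.update(chunk)
        return digest.hexdigest()

    def verify_transfer(src_path, dst_path):
        """Return True when source and destination digests match (transfer intact)."""
        return sha256_of(src_path) == sha256_of(dst_path)

    # Demo: a "transferred" copy with identical bytes verifies successfully.
    with tempfile.TemporaryDirectory() as d:
        src, dst = os.path.join(d, "src.bin"), os.path.join(d, "dst.bin")
        with open(src, "wb") as f:
            f.write(b"payload" * 1000)
        with open(dst, "wb") as f:
            f.write(b"payload" * 1000)
        print(verify_transfer(src, dst))  # → True
    ```

    Integrity checking alone does not provide confidentiality or authentication; protocols such as SFTP or TLS-based transfers layer encryption and key exchange on top of this kind of digest comparison.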

  8. A survey of parametrized variational principles and applications to computational mechanics

    NASA Technical Reports Server (NTRS)

    Felippa, Carlos A.

    1993-01-01

    This survey paper describes recent developments in the area of parametrized variational principles (PVPs) and selected applications to finite-element computational mechanics. A PVP is a variational principle containing free parameters that have no effect on the Euler-Lagrange equations. The theory of single-field PVPs based on gauge functions (also known as null Lagrangians) is a subset of the inverse problem of variational calculus and has limited value. Multifield PVPs, on the other hand, are more interesting from both theoretical and practical standpoints. Following a tutorial introduction, the paper describes the recent construction of multifield PVPs in several areas of elasticity and electromagnetics. It then discusses three applications to finite-element computational mechanics: the derivation of high-performance finite elements, the development of element-level error indicators, and the construction of finite element templates. The paper concludes with an overview of open research areas.

  9. USL NASA/RECON project presentations at the 1985 ACM Computer Science Conference: Abstracts and visuals

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor); Chum, Frank Y.; Gallagher, Suzy; Granier, Martin; Hall, Philip P.; Moreau, Dennis R.; Triantafyllopoulos, Spiros

    1985-01-01

    This Working Paper Series entry represents the abstracts and visuals associated with presentations delivered by six USL NASA/RECON research team members at the above named conference. The presentations highlight various aspects of NASA contract activities pursued by the participants as they relate to individual research projects. The titles of the six presentations are as follows: (1) The Specification and Design of a Distributed Workstation; (2) An Innovative, Multidisciplinary Educational Program in Interactive Information Storage and Retrieval; (3) Critical Comparative Analysis of the Major Commercial IS and R Systems; (4) Design Criteria for a PC-Based Common User Interface to Remote Information Systems; (5) The Design of an Object-Oriented Graphics Interface; and (6) Knowledge-Based Information Retrieval: Techniques and Applications.

  10. An ergonomic questionnaire survey on the use of computers in schools.

    PubMed

    Sotoyama, Midori; Bergqvist, Ulf; Jonai, Hiroshi; Saito, Susumu

    2002-04-01

    A questionnaire was sent out to elementary, junior high and high schools in Yokohama and Kawasaki Cities from January to March 1998 regarding the use of personal computers by pupils and students. The survey asked how often and in what environment computers are used, whether any instruction is given on their use, and about children's working posture and effects on health. The results show that most schools have been slow to develop instructional programs from an environmental or ergonomic point of view. So far, not many children complain of serious symptoms such as pain in the neck, head or shoulders, but a future increase in the number of classes that involve computing, together with the widespread popularity of home computers, will surely raise legitimate concern about the health of pupils and students, since they will spend more and more time operating these devices. An effective way to anticipate the problem is to give young students adequate knowledge of ergonomically sound computer use and environmental design, and there is now an urgent need for specific guidelines to protect them.

  11. Survey results of Internet and computer usage in veterans with epilepsy.

    PubMed

    Pramuka, Michael; Hendrickson, Rick; Van Cott, Anne C

    2010-03-01

    After our study of a self-management intervention for epilepsy, we gathered data on Internet use and computer availability to assess the feasibility of computer-based interventions in a veteran population. Veterans were asked to complete an anonymous questionnaire that gathered information regarding seizures/epilepsy in addition to demographic data, Internet use, computer availability, and interest in distance education regarding epilepsy. Three hundred twenty-four VA neurology clinic patients completed the survey. One hundred twenty-six self-reported a medical diagnosis of epilepsy and constituted the epilepsy/seizure group. For this group of veterans, the need for remote/distance-based interventions was validated given the majority of veterans traveled long distances (>2 hours). Only 51% of the epilepsy/seizure group had access to the Internet, and less than half (42%) expressed an interest in getting information on epilepsy self-management on their computer, suggesting that Web-based interventions may not be an optimal method for a self-management intervention in this population.

  12. U.S. Geological Survey national computer technology meeting; program and abstracts, New Orleans, Louisiana, April 10-15, 1994

    USGS Publications Warehouse

    Balthrop, B. H.; Baker, E.G.

    1994-01-01

    This report contains some of the abstracts of papers that were presented at the National Computer Technology Meeting held in April 1994. The meeting was sponsored by the Water Resources Division of the U.S. Geological Survey and was attended by more than 200 technical and managerial personnel representing all the Divisions of the U.S. Geological Survey. Computer-related information from all Divisions of the U.S. Geological Survey is discussed in this compilation of abstracts. Some of the topics addressed are data transfer, data-base management, hydrologic applications, national water information systems, and geographic information systems applications and techniques.

  13. A State-Wide Survey of South Australian Secondary Schools to Determine the Current Emphasis on Ergonomics and Computer Use

    ERIC Educational Resources Information Center

    Sawyer, Janet; Penman, Joy

    2012-01-01

    This study investigated the pattern of teaching of healthy computing skills to high school students in South Australia. A survey approach was used to collect data, specifically to determine the emphasis schools place on ergonomics related to computer use. Participating schools were recruited through the Department for Education and Child…

  14. Survey of computed tomography scanners in Taiwan: Dose descriptors, dose guidance levels, and effective doses

    SciTech Connect

    Tsai, H. Y.; Tung, C. J.; Yu, C. C.; Tyan, Y. S.

    2007-04-15

    The IAEA and the ICRP recommended dose guidance levels for the most frequent computed tomography (CT) examinations to promote strategies for the optimization of radiation dose to CT patients. A national survey, including on-site measurements and questionnaires, was conducted in Taiwan in order to establish dose guidance levels and evaluate effective doses for CT. The beam quality and output and the phantom doses were measured for nine representative CT scanners. Questionnaire forms were completed by respondents from facilities of 146 CT scanners out of 285 total scanners. Information on patient, procedure, scanner, and technique for the head and body examinations was provided. The weighted computed tomography dose index (CTDIw), the dose-length product (DLP), organ doses, and effective dose were calculated using measured data, questionnaire information, and Monte Carlo simulation results. A cost-effectiveness analysis was applied to derive the dose guidance levels on CTDIw and DLP for several CT examinations. The mean effective dose ± standard deviation ranges from 1.6 ± 0.9 mSv for the routine head examination to 13 ± 11 mSv for the examination of the liver, spleen, and pancreas. The surveyed results and the dose guidance levels were provided to the national authorities to develop quality control standards and protocols for CT examinations.
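    The dose descriptors named above chain together by simple arithmetic: DLP is the dose index integrated over scan length, and effective dose is commonly estimated as E = k × DLP with a body-region conversion coefficient. A minimal sketch, using an illustrative head-region coefficient of the kind published in the European CT quality criteria rather than the survey's own Monte Carlo-derived values:

```python
# Hedged sketch of standard CT dose bookkeeping behind CTDIw/DLP
# surveys. The coefficient k below is an illustrative head-region
# value; the survey derived its own coefficients by Monte Carlo
# simulation, and those may differ.

def dlp(ctdi_vol_mgy, scan_length_cm):
    """Dose-length product (mGy*cm): dose index times scan length."""
    return ctdi_vol_mgy * scan_length_cm

def effective_dose_msv(dlp_mgy_cm, k_msv_per_mgy_cm):
    """Effective dose estimate E = k * DLP, in mSv."""
    return dlp_mgy_cm * k_msv_per_mgy_cm

# Illustrative routine head scan: CTDIvol 60 mGy over a 14 cm range,
# with k ~ 0.0021 mSv/(mGy*cm) for the head region.
head_dlp = dlp(60.0, 14.0)                     # 840.0 mGy*cm
head_e = effective_dose_msv(head_dlp, 0.0021)  # ~1.8 mSv
```

    With these illustrative numbers the estimate lands near the 1.6 mSv mean the survey reports for routine head examinations, which is the kind of cross-check such surveys rely on.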

  15. A survey on resource allocation in high performance distributed computing systems

    SciTech Connect

    Hussain, Hameed; Malik, Saif Ur Rehman; Hameed, Abdul; Khan, Samee Ullah; Bickler, Gage; Min-Allah, Nasro; Qureshi, Muhammad Bilal; Zhang, Limin; Yongji, Wang; Ghani, Nasir; Kolodziej, Joanna; Zomaya, Albert Y.; Xu, Cheng-Zhong; Balaji, Pavan; Vishnu, Abhinav; Pinel, Fredric; Pecero, Johnatan E.; Kliazovich, Dzmitry; Bouvry, Pascal; Li, Hongxiang; Wang, Lizhe; Chen, Dan; Rayes, Ammar

    2013-11-01

    Efficient resource allocation is a fundamental requirement in high performance computing (HPC) systems. Many projects dedicated to large-scale distributed computing systems have designed and developed resource allocation mechanisms with a variety of architectures and services. In this study we report a comprehensive survey describing resource allocation in various HPC systems. The aim of the work is to aggregate the existing HPC solutions under a joint framework and to provide a thorough analysis and characterization of their resource management and allocation strategies. Resource allocation mechanisms and strategies play a vital role in the performance of every class of HPC system, so a comprehensive discussion of the resource allocation strategies widely deployed in HPC environments is required, which is one of the motivations of this survey. Moreover, we classify HPC systems into three broad categories, namely (a) cluster, (b) grid, and (c) cloud systems, and define the characteristics of each class by extracting sets of common attributes. All of the aforementioned systems are catalogued into pure software and hybrid/hardware solutions. The system classification is used to identify the approaches followed in implementing the existing resource allocation strategies that are widely presented in the literature.

  16. The VIMOS VLT deep survey. Computing the two point correlation statistics and associated uncertainties

    NASA Astrophysics Data System (ADS)

    Pollo, A.; Meneux, B.; Guzzo, L.; Le Fèvre, O.; Blaizot, J.; Cappi, A.; Iovino, A.; Marinoni, C.; McCracken, H. J.; Bottini, D.; Garilli, B.; Le Brun, V.; Maccagni, D.; Picat, J. P.; Scaramella, R.; Scodeggio, M.; Tresse, L.; Vettolani, G.; Zanichelli, A.; Adami, C.; Arnaboldi, M.; Arnouts, S.; Bardelli, S.; Bolzonella, M.; Charlot, S.; Ciliegi, P.; Contini, T.; Foucaud, S.; Franzetti, P.; Gavignaud, I.; Ilbert, O.; Marano, B.; Mathez, G.; Mazure, A.; Merighi, R.; Paltani, S.; Pellò, R.; Pozzetti, L.; Radovich, M.; Zamorani, G.; Zucca, E.; Bondi, M.; Bongiorno, A.; Busarello, G.; Gregorini, L.; Lamareille, F.; Mellier, Y.; Merluzzi, P.; Ripepi, V.; Rizzo, D.

    2005-09-01

    We present a detailed description of the methods used to compute the three-dimensional two-point galaxy correlation function in the VIMOS-VLT deep survey (VVDS). We investigate how instrumental selection effects and observational biases affect the measurements and identify the methods to correct for them. We quantify the accuracy of our corrections using an ensemble of 50 mock galaxy surveys generated with the GalICS semi-analytic model of galaxy formation which incorporate the selection biases and tiling strategy of the real data. We demonstrate that we are able to recover the redshift-space two-point correlation function ξ(s) and the projected correlation function w_p(r_p) to an accuracy better than 10% on scales larger than 1 h-1 Mpc with the sampling strategy used for the first epoch VVDS data. The large number of simulated surveys allows us to provide a reliable estimate of the cosmic variance on the measurements of the correlation length r0 at z ˜ 1, of about 15-20% for the first epoch VVDS observations, while any residual systematic effect in the measurements of r0 is always below 5%. The error estimation and measurement techniques outlined in this paper are being used in several parallel studies which investigate in detail the clustering properties of galaxies in the VVDS.
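    The machinery behind such measurements is a pair-count estimator; a common choice is Landy-Szalay, ξ = (DD − 2DR + RR)/RR with normalized data-data, data-random, and random-random counts. The abstract does not name the exact estimator VVDS used, so the sketch below is illustrative only (pure Python, brute-force O(N²) pair counting):

```python
# Illustrative Landy-Szalay estimator; brute-force pair counting,
# not the paper's actual pipeline.
import itertools
import math

def pair_counts(points_a, points_b, bins, auto=False):
    """Histogram pair separations into bins. With auto=True, count each
    unordered pair within one catalog once (DD or RR); otherwise count
    all cross pairs (DR)."""
    counts = [0] * (len(bins) - 1)
    pairs = (itertools.combinations(points_a, 2) if auto
             else itertools.product(points_a, points_b))
    for p, q in pairs:
        r = math.dist(p, q)
        for i in range(len(bins) - 1):
            if bins[i] <= r < bins[i + 1]:
                counts[i] += 1
                break
    return counts

def landy_szalay(data, randoms, bins):
    """xi_i = (DD_i/nDD - 2*DR_i/nDR + RR_i/nRR) / (RR_i/nRR)."""
    nd, nr = len(data), len(randoms)
    dd = pair_counts(data, data, bins, auto=True)
    rr = pair_counts(randoms, randoms, bins, auto=True)
    dr = pair_counts(data, randoms, bins)
    n_dd = nd * (nd - 1) / 2   # distinct data pairs
    n_rr = nr * (nr - 1) / 2   # distinct random pairs
    n_dr = nd * nr             # data-random pairs
    xi = []
    for i in range(len(bins) - 1):
        if rr[i] == 0:
            xi.append(float("nan"))
            continue
        xi.append((dd[i] / n_dd - 2 * dr[i] / n_dr + rr[i] / n_rr)
                  / (rr[i] / n_rr))
    return xi

# Degenerate sanity check: with data == randoms the estimator reduces
# to 2/n in every populated bin (here n = 4, so approximately 0.5).
pts = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
xi = landy_szalay(pts, pts, [0.5, 1.2, 1.6])
```

    A production pipeline would replace the brute-force loop with a tree-based counter and, as the paper stresses, apply the same selection function and tiling mask to the random catalog as to the data.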

  17. Survey on computer aided decision support for diagnosis of celiac disease

    PubMed Central

    Hegenbart, Sebastian; Uhl, Andreas; Vécsei, Andreas

    2015-01-01

    Celiac disease (CD) is a complex autoimmune disorder in genetically predisposed individuals of all age groups triggered by the ingestion of food containing gluten. A reliable diagnosis is of high interest in view of embarking on a strict gluten-free diet, which is the CD treatment modality of first choice. The gold standard for diagnosis of CD is currently based on a histological confirmation of serology, using biopsies performed during upper endoscopy. Computer aided decision support is an emerging option in medicine and endoscopy in particular. Such systems could potentially save costs and manpower while simultaneously increasing the safety of the procedure. Research focused on computer-assisted systems in the context of automated diagnosis of CD started in 2008. Since then, over 40 publications on the topic have appeared. In this context, data from classical flexible endoscopy as well as wireless capsule endoscopy (WCE) and confocal laser endomicroscopy (CLE) have been used. In this survey paper, we try to give a comprehensive overview of the research focused on computer-assisted diagnosis of CD. PMID:25770906

  18. Mechanisms of stochastic focusing and defocusing in biological reaction networks: insight from accurate chemical master equation (ACME) solutions.

    PubMed

    Gursoy, Gamze; Terebus, Anna; Youfang Cao; Jie Liang

    2016-08-01

    Stochasticity plays important roles in the regulation of biochemical reaction networks when the copy numbers of molecular species are small. Studies based on the Stochastic Simulation Algorithm (SSA) have shown that a basic reaction system can display stochastic focusing (SF), an increase in the sensitivity of the network caused by signal noise. Although the SSA has been widely used to study stochastic networks, it is ineffective at examining rare events, and this becomes a significant issue when the tails of probability distributions are relevant, as is the case for SF. Here we use the ACME method to compute the exact solution of the discrete chemical master equation and to study a network where SF was reported. We show that the level of SF depends on the degree of fluctuation of the signal molecule. We find that, under certain conditions, signaling noise in the same reaction network can decrease the system's sensitivity, so that the network experiences stochastic defocusing. These results highlight the fundamental role of stochasticity in biological reaction networks and the need for exact computation of the probability landscape of the molecules in the system.
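    The contrast with SSA sampling can be seen even on a toy system: for a one-species birth-death process, the chemical master equation's stationary distribution follows exactly from detailed balance, tails included, with no sampling noise. A minimal sketch (a far simpler truncation scheme than ACME's finite-buffer method, and a different network than the SF system studied in the paper):

```python
# Toy illustration of an exact CME solution: a single-species
# birth-death process with constant birth rate k and per-molecule
# death rate gamma, truncated at n_max copies.

def birth_death_steady_state(k, gamma, n_max):
    """Stationary P(n) on the truncated state space {0, ..., n_max}.
    Detailed balance: gamma*(n+1)*P(n+1) = k*P(n); then normalize."""
    p = [1.0]
    for n in range(n_max):
        p.append(p[-1] * k / (gamma * (n + 1)))
    z = sum(p)
    return [x / z for x in p]

# The exact answer is a (truncated) Poisson with mean k/gamma, so the
# distribution's tails -- where SSA sampling struggles -- are exact here.
dist = birth_death_steady_state(k=5.0, gamma=1.0, n_max=60)
mean_copy_number = sum(n * pn for n, pn in enumerate(dist))  # ~= 5.0
```

    For multi-species networks the state space grows combinatorially, which is exactly the bookkeeping problem methods like ACME are built to manage.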

  19. Measuring Computer Usage by Air Force Contracting Personnel as it Relates to Computer Training

    DTIC Science & Technology

    1996-09-01

    Computers in Human Behavior 12.1 … Computers in Human Behavior 3 (1987): 49-59. Howard, G. S. and R. D. Smith. "Computer Anxiety in Management: Myth or Reality." Communications of the ACM 29 … Computers in Human Behavior 9 (1993): 27-50. Szajna, Bernadette. "An Investigation of the Predictive Validity of Computer Anxiety and …

  20. The Asilomar Survey: Stakeholders' Opinions on Ethical Issues Related to Brain-Computer Interfacing.

    PubMed

    Nijboer, Femke; Clausen, Jens; Allison, Brendan Z; Haselager, Pim

    2013-01-01

    Brain-Computer Interface (BCI) research and (future) applications raise important ethical issues that need to be addressed to promote societal acceptance and adequate policies. Here we report on a survey we conducted among 145 BCI researchers at the 4th International BCI Conference, which took place in May-June 2010 in Asilomar, California. We assessed respondents' opinions about a number of topics. First, we investigated preferences for terminology and definitions relating to BCIs. Second, we assessed respondents' expectations on the marketability of different BCI applications (BCIs for healthy people, BCIs for assistive technology, BCI-controlled neuroprostheses, and BCIs as therapy tools). Third, we investigated opinions about ethical issues related to BCI research for the development of assistive technology: the informed consent process with locked-in patients, risk-benefit analyses, team responsibility, consequences of BCIs for patients' and families' lives, liability, personal identity, and interaction with the media. Finally, we asked respondents which issues are urgent in BCI research.

  1. Sci—Thur PM: Imaging — 06: Canada's National Computed Tomography (CT) Survey

    SciTech Connect

    Wardlaw, GM; Martel, N; Blackler, W; Asselin, J-F

    2014-08-15

    The value of computed tomography (CT) in medical imaging is reflected in its increased use and availability since the early 1990s; however, given CT's relatively larger exposures (vs. planar x-ray), greater care must be taken to ensure that CT procedures are optimised in terms of providing the smallest dose possible while maintaining sufficient diagnostic image quality. The development of CT Diagnostic Reference Levels (DRLs) supports this process. DRLs have been suggested and supported by international and national bodies since the early 1990s and widely adopted elsewhere, but not on a national basis in Canada. Essentially, CT DRLs provide guidance on what is considered good practice for common CT exams, but they require a representative sample of CT examination data before any recommendations can be made. Canada's National CT Survey project, in collaboration with provincial/territorial authorities, has collected a large national sample of CT practice data for 7 common examinations (with associated clinical indications) of both adult and pediatric patients. Following completion of data entry into a common database, a survey summary report will be prepared and recommendations on CT DRLs will be made from these data. It is hoped that these can then be used by local regions to promote CT practice optimisation and support dose reduction initiatives.

  2. Computer analysis of digital sky surveys using citizen science and manual classification

    NASA Astrophysics Data System (ADS)

    Kuminski, Evan; Shamir, Lior

    2015-01-01

    As current and future digital sky surveys such as SDSS, LSST, DES, Pan-STARRS and Gaia create increasingly massive databases containing millions of galaxies, there is a growing need to analyze these data efficiently. Manual analysis is one effective approach; however, it may be insufficient considering the extremely vast pipelines of astronomical images generated by present and future surveys. Some efforts have been made to use citizen science to classify galaxies by their morphology on a larger scale than individual scientists or small groups can. While citizen science efforts such as Zooniverse have helped obtain reasonably accurate morphological information about large numbers of galaxies, they cannot scale to provide complete analysis of the billions of galaxy images that will be collected by future ventures such as LSST. Since current forms of manual classification cannot keep up with the masses of data collected by digital sky surveys, some form of automated data analysis will clearly be required, working either independently or in combination with human analysis such as citizen science. Here we describe a computer vision method that can automatically analyze galaxy images and deduce galaxy morphology. Experiments using Galaxy Zoo 2 data show that the performance of the method increases as the degree of agreement between the citizen scientists gets higher, providing a cleaner dataset. For several morphological features, such as the spirality of the galaxy, the algorithm agreed with the citizen scientists on around 95% of the samples. However, the method failed to analyze some morphological features, such as the number of spiral arms, and provided an accuracy of just ~36%.

  3. A historical survey of algorithms and hardware architectures for neural-inspired and neuromorphic computing applications

    DOE PAGES

    James, Conrad D.; Aimone, James B.; Miner, Nadine E.; ...

    2017-01-04

    Biological neural networks continue to inspire new developments in algorithms and microelectronic hardware to solve challenging data processing and classification problems. Here we survey the history of neural-inspired and neuromorphic computing in order to examine the complex and intertwined trajectories of the mathematical theory and hardware developed in this field. Early research focused on adapting existing hardware to emulate the pattern recognition capabilities of living organisms. Contributions from psychologists, mathematicians, engineers, neuroscientists, and other professions were crucial to maturing the field from narrowly tailored demonstrations to more generalizable systems capable of addressing difficult problem classes such as object detection and speech recognition. Algorithms that leverage fundamental principles found in neuroscience, such as hierarchical structure, temporal integration, and robustness to error, have been developed, and some of these approaches are achieving world-leading performance on particular data classification tasks. Additionally, novel microelectronic hardware is being developed to perform logic and to serve as memory in neuromorphic computing systems with optimized system integration and improved energy efficiency. Key to such advancements was the incorporation of new discoveries in neuroscience research, the transition away from strict structural replication and towards the functional replication of neural systems, and the use of mathematical theory frameworks to guide algorithm and hardware developments.

  4. A historical survey of algorithms and hardware architectures for neural-inspired and neuromorphic computing applications

    SciTech Connect

    James, Conrad D.; Aimone, James B.; Miner, Nadine E.; Vineyard, Craig M.; Rothganger, Fredrick H.; Carlson, Kristofor D.; Mulder, Samuel A.; Draelos, Timothy J.; Faust, Aleksandra; Marinella, Matthew J.; Naegle, John H.; Plimpton, Steven J.

    2017-01-01

    Biological neural networks continue to inspire new developments in algorithms and microelectronic hardware to solve challenging data processing and classification problems. Here we survey the history of neural-inspired and neuromorphic computing in order to examine the complex and intertwined trajectories of the mathematical theory and hardware developed in this field. Early research focused on adapting existing hardware to emulate the pattern recognition capabilities of living organisms. Contributions from psychologists, mathematicians, engineers, neuroscientists, and other professions were crucial to maturing the field from narrowly tailored demonstrations to more generalizable systems capable of addressing difficult problem classes such as object detection and speech recognition. Algorithms that leverage fundamental principles found in neuroscience, such as hierarchical structure, temporal integration, and robustness to error, have been developed, and some of these approaches are achieving world-leading performance on particular data classification tasks. Additionally, novel microelectronic hardware is being developed to perform logic and to serve as memory in neuromorphic computing systems with optimized system integration and improved energy efficiency. Key to such advancements was the incorporation of new discoveries in neuroscience research, the transition away from strict structural replication and towards the functional replication of neural systems, and the use of mathematical theory frameworks to guide algorithm and hardware developments.

  5. U.S. Geological Survey National Computer Technology Meeting; Program and abstracts, May 7-11, 1990

    USGS Publications Warehouse

    Balthrop, B. H.; Baker, E.G.

    1990-01-01

    Computer-related information from all Divisions of the U.S. Geological Survey is discussed in this compilation of abstracts. Some of the topics addressed are system administration; distributed information systems and data bases, both current (1990) and proposed; hydrologic applications; national water information systems; and geographic information systems applications and techniques. The report contains some of the abstracts that were presented at the National Computer Technology Meeting held in May 1990. The meeting was sponsored by the Water Resources Division and was attended by more than 200 technical and managerial personnel representing all the Divisions of the U.S. Geological Survey. (USGS)

  6. Superfund Record of Decision (EPA Region 5): Acme Solvents, Morristown, Illinois, September 1985. Final report

    SciTech Connect

    Not Available

    1985-09-27

    The Acme Solvents Reclaiming, Inc. facility is located approximately five miles south of Rockford, Illinois. From 1960 until 1973, the facility served as a disposal site for paints, oils, and still bottoms from the solvent reclamation plant located in Rockford. In addition, empty drums were stored onsite. Wastes were dumped into depressions created either by previous quarrying activities or by scraping overburden from the near-surface bedrock to form berms. In September 1972, the Illinois Pollution Control Board (IPCB) ordered Acme to remove all drums and wastes from the facility and to backfill the lagoons. Follow-up inspections revealed that wastes and crushed drums were being left onsite and merely covered with soil. Sampling of the site revealed high concentrations of chlorinated organics in the drinking water. The major sources of hazardous substances at the facility are the waste disposal mounds. These mounds contain volatile and semi-volatile organic compounds and PCB concentrations of up to several hundred mg/kg. The selected remedial action is described.

  7. Interaction and Critical Inquiry in Asynchronous Computer-Mediated Conferencing: A Research Agenda

    ERIC Educational Resources Information Center

    Hopkins, Joseph; Gibson, Will; Ros i. Sole, Cristina; Savvides, Nicola; Starkey, Hugh

    2008-01-01

    This paper reviews research on learner and tutor interaction in asynchronous computer-mediated (ACM) conferences used in distance learning. The authors note claims made for the potential of ACM conferences to promote higher-order critical inquiry and the social construction of knowledge, and argue that there is a general lack of evidence regarding…

  8. Clinical isolates of Enterococcus faecium exhibit strain-specific collagen binding mediated by Acm, a new member of the MSCRAMM family.

    PubMed

    Nallapareddy, Sreedhar R; Weinstock, George M; Murray, Barbara E

    2003-03-01

    A collagen-binding adhesin of Enterococcus faecium, Acm, was identified. Acm shows 62% similarity to the Staphylococcus aureus collagen adhesin Cna over the entire protein and is more similar to Cna (60% and 75% similarity with Cna A and B domains respectively) than to the Enterococcus faecalis collagen-binding adhesin, Ace, which shares homology with Acm only in the A domain. Despite the detection of acm in 32 out of 32 E. faecium isolates, only 11 of these (all clinical isolates, including four vancomycin-resistant endocarditis isolates and seven other isolates) exhibited binding to collagen type I (CI). Although acm from three CI-binding vancomycin-resistant E. faecium clinical isolates showed 100% identity, analysis of acm genes and their promoter regions from six non-CI-binding strains identified deletions or mutations that introduced stop codons and/or IS elements within the gene or the promoter region in five out of six strains, suggesting that the presence of an intact functional acm gene is necessary for binding of E. faecium strains to CI. Recombinant Acm A domain showed specific and concentration-dependent binding to collagen, and this protein competed with E. faecium binding to immobilized CI. Consistent with the adherence phenotype and sequence data, probing with Acm-specific IgGs purified from anti-recombinant Acm A polyclonal rabbit serum confirmed the surface expression of Acm in three out of three collagen-binding clinical isolates of E. faecium tested, but in none of the strains with a non-functional pseudo acm gene. Introduction of a functional acm gene into two non-CI-binding natural acm mutant strains conferred a CI-binding phenotype, further confirming that native Acm is sufficient for the binding of E. faecium to CI. These results demonstrate that acm, which encodes a potential virulence factor, is functional only in certain infection-derived clinical isolates of E. faecium, and suggest that Acm is the primary adhesin responsible for the

  9. Do Mathematicians Integrate Computer Algebra Systems in University Teaching? Comparing a Literature Review to an International Survey Study

    ERIC Educational Resources Information Center

    Marshall, Neil; Buteau, Chantal; Jarvis, Daniel H.; Lavicza, Zsolt

    2012-01-01

    We present a comparative study between a literature review of 326 selected contributions (Buteau, Marshall, Jarvis & Lavicza, 2010) and an international (US, UK, Hungary) survey of mathematicians (Lavicza, 2008) regarding the use of Computer Algebra Systems (CAS) in post-secondary mathematics education. The comparison results are organized with respect…

  10. Technology Support: Its Depth, Breadth and Impact in America's Schools. Teaching, Learning, and Computing: 1998 National Survey Report #5.

    ERIC Educational Resources Information Center

    Ronnkvist, Amy M.; Dexter, Sara L.; Anderson, Ronald E.

    This report, the fifth in a series from the spring 1998 national survey, "Teaching, Learning, and Computing," provides a framework for defining the various dimensions of technology support. Research has shown that teachers lack adequate support for the use of information and communication technologies (ICT). In this report, the term…

  11. A Survey of Exemplar Teachers' Perceptions, Use, and Access of Computer-Based Games and Technology for Classroom Instruction

    ERIC Educational Resources Information Center

    Proctor, Michael D.; Marks, Yaela

    2013-01-01

    This research reports and analyzes, for archival purposes, the surveyed perceptions, use, and access of computer-based games and technology for classroom instruction among 259 United States-based exemplar primary and secondary educators. Participating respondents were considered exemplary because each won the Milken Educator Award during the 1996-2009…

  12. 1986 Newspaper Help Wanted Ad Survey. Number of Ads Increases 11.5%. Computer Use Up; Shorthand Down.

    ERIC Educational Resources Information Center

    Fusselman, Kay

    1986-01-01

    Key findings of the 1986 Newspaper Help Wanted Advertisements Survey concerning secretarial positions are reported. It was found that ads indicating that word processors or computers would be used on the job increased to 38.9 percent; ads requiring shorthand or fast notetaking were down for the second year; and averages of salaries offered were…

  13. Promoting CLT within a Computer Assisted Learning Environment: A Survey of the Communicative English Course of FLTC

    ERIC Educational Resources Information Center

    Haider, Md. Zulfeqar; Chowdhury, Takad Ahmed

    2012-01-01

    This study is based on a survey of the Communicative English Language Certificate (CELC) course run by the Foreign Language Training Center (FLTC), a Project under the Ministry of Education, Bangladesh. FLTC is working to promote the teaching and learning of English through its eleven computer-based and state of the art language laboratories. As…

  14. AcmB Is an S-Layer-Associated β-N-Acetylglucosaminidase and Functional Autolysin in Lactobacillus acidophilus NCFM

    PubMed Central

    Johnson, Brant R.

    2016-01-01

    ABSTRACT Autolysins, also known as peptidoglycan hydrolases, are enzymes that hydrolyze specific bonds within bacterial cell wall peptidoglycan during cell division and daughter cell separation. Within the genome of Lactobacillus acidophilus NCFM, there are 11 genes encoding proteins with peptidoglycan hydrolase catalytic domains, 9 of which are predicted to be functional. Notably, 5 of the 9 putative autolysins in L. acidophilus NCFM are S-layer-associated proteins (SLAPs) noncovalently colocalized along with the surface (S)-layer at the cell surface. One of these SLAPs, AcmB, a β-N-acetylglucosaminidase encoded by the gene lba0176 (acmB), was selected for functional analysis. In silico analysis revealed that acmB orthologs are found exclusively in S-layer-forming species of Lactobacillus. Chromosomal deletion of acmB resulted in aberrant cell division, autolysis, and autoaggregation. Complementation of acmB in the ΔacmB mutant restored the wild-type phenotype, confirming the role of this SLAP in cell division. The absence of AcmB within the exoproteome had a pleiotropic effect on the extracellular proteins covalently and noncovalently bound to the peptidoglycan, which likely led to the observed decrease in the binding capacity of the ΔacmB strain for mucin and extracellular matrices fibronectin, laminin, and collagen in vitro. These data suggest a functional association between the S-layer and the multiple autolysins noncovalently colocalized at the cell surface of L. acidophilus NCFM and other S-layer-producing Lactobacillus species. IMPORTANCE Lactobacillus acidophilus is one of the most widely used probiotic microbes incorporated in many dairy foods and dietary supplements. This organism produces a surface (S)-layer, which is a self-assembling crystalline array found as the outermost layer of the cell wall. The S-layer, along with colocalized associated proteins, is an important mediator of probiotic activity through intestinal adhesion and modulation of

  15. Machine Learning in Computer-aided Diagnosis of the Thorax and Colon in CT: A Survey

    PubMed Central

    SUZUKI, Kenji

    2013-01-01

    SUMMARY Computer-aided detection (CADe) and diagnosis (CAD) have been a rapidly growing, active area of research in medical imaging. Machine learning (ML) plays an essential role in CAD, because objects such as lesions and organs may not be represented accurately by a simple equation; thus, medical pattern recognition essentially requires “learning from examples.” One of the most popular uses of ML is the classification of objects such as lesion candidates into certain classes (e.g., abnormal or normal, and lesions or non-lesions) based on input features (e.g., contrast and area) obtained from segmented lesion candidates. The task of ML is to determine “optimal” boundaries for separating classes in the multidimensional feature space which is formed by the input features. ML algorithms for classification include linear discriminant analysis (LDA), quadratic discriminant analysis (QDA), multilayer perceptrons, and support vector machines (SVM). Recently, pixel/voxel-based ML (PML) emerged in medical image processing/analysis, which uses pixel/voxel values in images directly, instead of features calculated from segmented lesions, as input information; thus, feature calculation or segmentation is not required. In this paper, ML techniques used in CAD schemes for detection and diagnosis of lung nodules in thoracic CT and for detection of polyps in CT colonography (CTC) are surveyed and reviewed. PMID:24174708
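    The classification setup this abstract describes (separating lesion candidates from non-lesions by a boundary in feature space) can be illustrated with a minimal two-class linear discriminant analysis sketch. The feature values and class geometry below are invented for illustration and are not taken from any real CAD system.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for lesion-candidate features such as contrast and area
lesions     = rng.normal(loc=[2.0, 2.0], scale=0.5, size=(100, 2))
non_lesions = rng.normal(loc=[0.0, 0.0], scale=0.5, size=(100, 2))
X = np.vstack([lesions, non_lesions])
y = np.array([1] * 100 + [0] * 100)

# LDA: project onto w = pooled_cov^{-1} (mu1 - mu0), threshold at the midpoint
mu1, mu0 = X[y == 1].mean(axis=0), X[y == 0].mean(axis=0)
pooled_cov = np.cov(X[y == 1].T) + np.cov(X[y == 0].T)
w = np.linalg.solve(pooled_cov, mu1 - mu0)
threshold = w @ (mu1 + mu0) / 2.0

pred = (X @ w > threshold).astype(int)
accuracy = (pred == y).mean()
```

    The learned hyperplane is the "optimal" linear boundary the abstract refers to; QDA, multilayer perceptrons, and SVMs replace this fixed projection with more flexible decision surfaces.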

  16. ARECIBO PALFA SURVEY AND EINSTEIN@HOME: BINARY PULSAR DISCOVERY BY VOLUNTEER COMPUTING

    SciTech Connect

    Knispel, B.; Allen, B.; Aulbert, C.; Bock, O.; Fehrmann, H.; Lazarus, P.; Bogdanov, S.; Anderson, D.; Bhat, N. D. R.; Brazier, A.; Chatterjee, S.; Cordes, J. M.; Camilo, F.; Crawford, F.; Deneva, J. S.; Desvignes, G.; Freire, P. C. C.; Hammer, D.; Hessels, J. W. T.; Jenet, F. A.

    2011-05-01

    We report the discovery of the 20.7 ms binary pulsar J1952+2630, made using the distributed computing project Einstein@Home in Pulsar ALFA survey observations with the Arecibo telescope. Follow-up observations with the Arecibo telescope confirm the binary nature of the system. We obtain a circular orbital solution with an orbital period of 9.4 hr, a projected orbital radius of 2.8 lt-s, and a mass function of f = 0.15 M☉ by analysis of spin period measurements. No evidence of orbital eccentricity is apparent; we set a 2σ upper limit e ≲ 1.7 × 10⁻³. The orbital parameters suggest a massive white dwarf companion with a minimum mass of 0.95 M☉, assuming a pulsar mass of 1.4 M☉. Most likely, this pulsar belongs to the rare class of intermediate-mass binary pulsars. Future timing observations will aim to determine the parameters of this system further, measure relativistic effects, and elucidate the nature of the companion star.
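    As a quick consistency check of the numbers quoted above: the minimum companion mass follows from the binary mass function f = (m2 sin i)³ / (m1 + m2)² evaluated at an edge-on orbit (sin i = 1). The bisection solver below is a generic sketch of that calculation, not the authors' analysis code.

```python
def mass_function(m2, m1, sin_i=1.0):
    """Binary mass function in solar masses: f = (m2 sin i)^3 / (m1 + m2)^2."""
    return (m2 * sin_i) ** 3 / (m1 + m2) ** 2

def min_companion_mass(f, m1, lo=0.01, hi=10.0):
    """Bisect for m2 at sin i = 1; the mass function is increasing in m2."""
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if mass_function(mid, m1) < f:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# f = 0.15 and an assumed 1.4 M_sun pulsar give m2 close to the quoted
# 0.95 M_sun minimum (the small difference comes from rounding of f)
m2_min = min_companion_mass(f=0.15, m1=1.4)
```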

  17. Machine Learning in Computer-aided Diagnosis of the Thorax and Colon in CT: A Survey.

    PubMed

    Suzuki, Kenji

    2013-04-01

    Computer-aided detection (CADe) and diagnosis (CAD) have been a rapidly growing, active area of research in medical imaging. Machine learning (ML) plays an essential role in CAD, because objects such as lesions and organs may not be represented accurately by a simple equation; thus, medical pattern recognition essentially requires "learning from examples." One of the most popular uses of ML is the classification of objects such as lesion candidates into certain classes (e.g., abnormal or normal, and lesions or non-lesions) based on input features (e.g., contrast and area) obtained from segmented lesion candidates. The task of ML is to determine "optimal" boundaries for separating classes in the multidimensional feature space which is formed by the input features. ML algorithms for classification include linear discriminant analysis (LDA), quadratic discriminant analysis (QDA), multilayer perceptrons, and support vector machines (SVM). Recently, pixel/voxel-based ML (PML) emerged in medical image processing/analysis, which uses pixel/voxel values in images directly, instead of features calculated from segmented lesions, as input information; thus, feature calculation or segmentation is not required. In this paper, ML techniques used in CAD schemes for detection and diagnosis of lung nodules in thoracic CT and for detection of polyps in CT colonography (CTC) are surveyed and reviewed.

  18. ACME: automated cell morphology extractor for comprehensive reconstruction of cell membranes.

    PubMed

    Mosaliganti, Kishore R; Noche, Ramil R; Xiong, Fengzhu; Swinburne, Ian A; Megason, Sean G

    2012-01-01

    The quantification of cell shape, cell migration, and cell rearrangements is important for addressing classical questions in developmental biology such as patterning and tissue morphogenesis. Time-lapse microscopic imaging of transgenic embryos expressing fluorescent reporters is the method of choice for tracking morphogenetic changes and establishing cell lineages and fate maps in vivo. However, the manual steps involved in curating thousands of putative cell segmentations have been a major bottleneck in the application of these technologies, especially for cell membranes. Segmentation of cell membranes, while more difficult than nuclear segmentation, is necessary for quantifying the relations between changes in cell morphology and morphogenesis. We present a novel and fully automated method to first reconstruct membrane signals and then segment out cells from 3D membrane images even in dense tissues. The approach has three stages: 1) detection of local membrane planes, 2) voting to fill structural gaps, and 3) region segmentation. We demonstrate the superior performance of the algorithms quantitatively on time-lapse confocal and two-photon images of zebrafish neuroectoderm and paraxial mesoderm by comparing its results with those derived from human inspection. We also compared with synthetic microscopic images generated by simulating the process of imaging with fluorescent reporters under varying conditions of noise. Both the over-segmentation and under-segmentation percentages of our method are around 5%. The volume overlap of individual cells, compared to expert manual segmentation, is consistently over 84%. By using our software (ACME) to study somite formation, we were able to segment touching cells with high accuracy and reliably quantify changes in morphogenetic parameters such as cell shape and size, and the arrangement of epithelial and mesenchymal cells. Our software has been developed and tested on Windows, Mac, and Linux platforms and is available

  19. A five-layer users' need hierarchy of computer input device selection: a contextual observation survey of computer users with cervical spinal injuries (CSI).

    PubMed

    Tsai, Tsai-Hsuan; Nash, Robert J; Tseng, Kevin C

    2009-05-01

    This article presents how the researchers went about answering the research question 'How does assistive technology impact computer use among individuals with cervical spinal cord injury?' through an in-depth investigation into the real-life situations of computer operators with cervical spinal cord injuries (CSI). An in-depth survey was carried out to provide insight into the functional abilities and limitations, habitual practices and preferences, choices and utilisation of input devices, personal and/or technical assistance, environmental set-up and arrangements, and special requirements of 20 experienced computer users with cervical spinal cord injuries. Following the survey findings, a five-layer CSI users' needs hierarchy of input device selection and use was proposed. These needs were ranked in order, beginning with the most basic criterion at the bottom of the pyramid; lower-level criteria must be met before one moves on to the higher level. The users' needs hierarchy for CSI computer users had not been applied in previous research work and establishes a rationale for the development of alternative input devices. If an input device achieves the criteria set out in the needs hierarchy, then a good match of person and technology will be achieved.

  20. Teacher Professional Engagement and Constructivist-Compatible Computer Use. Teaching, Learning, and Computing: 1998 National Survey. Report #7.

    ERIC Educational Resources Information Center

    Becker, Henry Jay; Riel, Margaret M.

    This report describes aspects of the professional engagement of American teachers and examines relationships between professional engagement and teaching practice, including instruction involving computer use. Professional engagement is measured by: the frequency that teachers had informal substantive communications with other teachers at their…

  1. Successful use of tablet personal computers and wireless technologies for the 2011 Nepal Demographic and Health Survey.

    PubMed

    Paudel, Deepak; Ahmed, Marie; Pradhan, Anjushree; Lal Dangol, Rajendra

    2013-08-01

    Computer-Assisted Personal Interviewing (CAPI), coupled with the use of mobile and wireless technology, is growing as a data collection methodology. Nepal, a geographically diverse and resource-scarce country, implemented the 2011 Nepal Demographic and Health Survey, a nationwide survey of major health indicators, using tablet personal computers (tablet PCs) and wireless technology for the first time in the country. This paper synthesizes responses on the benefits and challenges of using new technology in such a challenging environment from the 89 interviewers who administered the survey. Overall, feedback from the interviewers indicates that the use of tablet PCs and wireless technology to administer the survey demonstrated potential to improve data quality and reduce data collection time, benefits that outweigh manageable challenges, such as storage and transport of the tablet PCs during fieldwork, limited options for confidential interview space due to screen readability issues under direct sunlight, and inconsistent electricity supply at times. The introduction of this technology holds great promise for improving data availability and quality, even in a context with limited infrastructure and extremely difficult terrain.

  2. Successful use of tablet personal computers and wireless technologies for the 2011 Nepal Demographic and Health Survey

    PubMed Central

    Paudel, Deepak; Ahmed, Marie; Pradhan, Anjushree; Lal Dangol, Rajendra

    2013-01-01

    ABSTRACT Computer-Assisted Personal Interviewing (CAPI), coupled with the use of mobile and wireless technology, is growing as a data collection methodology. Nepal, a geographically diverse and resource-scarce country, implemented the 2011 Nepal Demographic and Health Survey, a nationwide survey of major health indicators, using tablet personal computers (tablet PCs) and wireless technology for the first time in the country. This paper synthesizes responses on the benefits and challenges of using new technology in such a challenging environment from the 89 interviewers who administered the survey. Overall, feedback from the interviewers indicates that the use of tablet PCs and wireless technology to administer the survey demonstrated potential to improve data quality and reduce data collection time—benefits that outweigh manageable challenges, such as storage and transport of the tablet PCs during fieldwork, limited options for confidential interview space due to screen readability issues under direct sunlight, and inconsistent electricity supply at times. The introduction of this technology holds great promise for improving data availability and quality, even in a context with limited infrastructure and extremely difficult terrain. PMID:25276539

  3. Computer Programs for Library Operations; Results of a Survey Conducted Between Fall 1971 and Spring 1972.

    ERIC Educational Resources Information Center

    Liberman, Eva; And Others

    Many library operations involving large data banks lend themselves readily to computer operation. In setting up library computer programs, in changing or expanding programs, cost in programming and time delays could be substantially reduced if the programmers had access to library computer programs being used by other libraries, providing similar…

  4. Pre-Service ELT Teachers' Attitudes towards Computer Use: A Turkish Survey

    ERIC Educational Resources Information Center

    Sariçoban, Arif

    2013-01-01

    Problem Statement: Computer technology plays a crucial role in foreign/second language (L2) instruction, and as such, L2 teachers display different attitudes towards the use of computers in their teaching activities. It is important to know what attitudes these teachers hold towards the use of computers and whether they have these varying…

  5. Computer Software Used in U.S. Army Anthropometric Survey 1987-1988

    DTIC Science & Technology

    1988-06-30

    Fragment from the scanned report's measurement list and appendix: Wrist-Wall Length; Wrist-Wall Length, Extended. APPENDIX D: The Biographical Data Questionnaire of the US Army Anthropometric Survey (ANSUR), covering personal history items such as birthdate and age.

  6. Perceived problems with computer gaming and internet use among adolescents: measurement tool for non-clinical survey studies

    PubMed Central

    2014-01-01

    Background Existing instruments for measuring problematic computer and console gaming and internet use are often lengthy and often based on a pathological perspective. The objective was to develop and present a new and short non-clinical measurement tool for perceived problems related to computer use and gaming among adolescents and to study the association between screen time and perceived problems. Methods Cross-sectional school survey of 11-, 13-, and 15-year old students in thirteen schools in the City of Aarhus, Denmark, participation rate 89%, n = 2100. The main exposure was time spent on weekdays on computer and console gaming and internet use for communication and surfing. The outcome measures were three indexes on perceived problems related to computer and console gaming and internet use. Results The three new indexes showed high face validity and acceptable internal consistency. Most schoolchildren with high screen time did not experience problems related to computer use. Still, there was a strong and graded association between time use and perceived problems related to computer gaming, console gaming (only boys) and internet use, odds ratios ranging from 6.90 to 10.23. Conclusion The three new measures of perceived problems related to computer and console gaming and internet use among adolescents are appropriate, reliable and valid for use in non-clinical surveys about young people’s everyday life and behaviour. These new measures do not assess Internet Gaming Disorder as it is listed in the DSM and therefore have no parity with DSM criteria. We found an increasing risk of perceived problems with increasing time spent with gaming and internet use. Nevertheless, most schoolchildren who spent much time with gaming and internet use did not experience problems. PMID:24731270

  7. Computing the Deflection of the Vertical for Improving Aerial Surveys: A Comparison between EGM2008 and ITALGEO05 Estimates

    PubMed Central

    Barzaghi, Riccardo; Carrion, Daniela; Pepe, Massimiliano; Prezioso, Giuseppina

    2016-01-01

    Recent studies on the influence of the anomalous gravity field in GNSS/INS applications have shown that neglecting the impact of the deflection of vertical in aerial surveys induces horizontal and vertical errors in the measurement of an object that is part of the observed scene; these errors can vary from a few tens of centimetres to over one meter. The works reported in the literature refer to vertical deflection values based on global geopotential model estimates. In this paper we compared this approach with the one based on local gravity data and collocation methods. In particular, denoted by ξ and η, the two mutually-perpendicular components of the deflection of the vertical vector (in the north and east directions, respectively), their values were computed by collocation in the framework of the Remove-Compute-Restore technique, applied to the gravity database used for estimating the ITALGEO05 geoid. Following this approach, these values have been computed at different altitudes that are relevant in aerial surveys. The (ξ, η) values were then also estimated using the high degree EGM2008 global geopotential model and compared with those obtained in the previous computation. The analysis of the differences between the two estimates has shown that the (ξ, η) global geopotential model estimate can be reliably used in aerial navigation applications that require the use of sensors connected to a GNSS/INS system only above a given height (e.g., 3000 m in this paper) that must be defined by simulations. PMID:27472333

  8. Asteroids, comets, Kuiper Belt objects, meteors: the ACM (AKM) 2002 perspective

    NASA Astrophysics Data System (ADS)

    Binzel, Richard P.

    2002-11-01

    An impressionistic overview is given on the state of the science of Asteroider, Kometer, Meteorer (AKM), where the Swedish spelling is adopted in recognition of the origin of the ACM conference series. Asteroider (asteroids) is a field that has come of age to the point of asking sophisticated geological and geophysical questions based on dedicated spacecraft missions. Kometer (comets) are coming of age with a strong international emphasis toward new comet missions and ever-increasing sophistication of Earth-based observations. K also stands for Kuiper belt objects, a field that had not been invented when our conference series began but whose inclusion we embrace and recognize as being integral to our science. (Hence the recommendation that we adopt the moniker AKM to proclaim fully our inclusivity.) KBOs are an emerging field, perhaps analogous to a fast-growing child. The presently known number of KBOs is comparable to the number of known main-belt asteroids in 1900, suggesting that we are just beginning to learn about this region. Meteorer (meteors) is a rejuvenated field that has enjoyed spectacular recent successes in detailed predictions of the Leonid shower and in recording the fall and recovery of the Neuschwanstein meteorite. The future outlook portends the greatest advancement in K (Kometer and KBOs), with broad interdisciplinary implications.

  9. Functional requirements of computer systems for the U.S. Geological Survey, Water Resources Division, 1988-97

    USGS Publications Warehouse

    Hathaway, R.M.; McNellis, J.M.

    1989-01-01

    Investigating the occurrence, quantity, quality, distribution, and movement of the Nation's water resources is the principal mission of the U.S. Geological Survey's Water Resources Division. Reports of these investigations are published and available to the public. To accomplish this mission, the Division requires substantial computer technology to process, store, and analyze data from more than 57,000 hydrologic sites. The Division's computer resources are organized through the Distributed Information System Program Office that manages the nationwide network of computers. The contract that provides the major computer components for the Water Resources Division's Distributed Information System expires in 1991. Five work groups were organized to collect the information needed to procure a new generation of computer systems for the U.S. Geological Survey, Water Resources Division. Each group was assigned a major Division activity and asked to describe its functional requirements of computer systems for the next decade. The work groups and major activities are: (1) hydrologic information; (2) hydrologic applications; (3) geographic information systems; (4) reports and electronic publishing; and (5) administrative. The work groups identified 42 functions and described their functional requirements for 1988, 1992, and 1997. A few new functions, such as Decision Support Systems and Executive Information Systems, were identified, but most are the same as performed today. Although the number of functions will remain about the same, steady growth in the size, complexity, and frequency of many functions is predicted for the next decade. No compensating increase in the Division's staff is anticipated during this period. To handle the increased workload and perform these functions, new approaches will be developed that use advanced computer technology. The advanced technology is required in a unified, tightly coupled system that will support all functions simultaneously.

  10. Computer ethics: A capstone course

    SciTech Connect

    Fisher, T.G.; Abunawass, A.M.

    1994-12-31

    This paper presents a capstone course on computer ethics required for all computer science majors in our program. The course was designed to encourage students to evaluate their own personal value systems in terms of the established values in computer science as represented by the ACM Code of Ethics. The structure, activities, and topics of the course as well as assessment of the students are presented. Observations on various course components and student evaluations of the course are also presented.

  11. Tradeoffs Between Synchronization, Communication, and Work in Parallel Linear Algebra Computations

    DTIC Science & Technology

    2014-01-25

    [24] A. Tiskin. Communication-efficient parallel generic pairwise elimination. Future Generation Computer Systems, 23(2):179–188, 2007. [25] S. Warshall. A theorem on boolean matrices. J. ACM, 9:11–12, January 1962.

  12. A survey of students' ethical attitudes using computer-related scenarios

    SciTech Connect

    Hanchey, C.M.; Kingsbury, J.

    1994-12-31

    Many studies exist that examine ethical beliefs and attitudes of university students attending medium or large institutions. There are also many studies which examine ethical attitudes and beliefs of computer science and computer information systems majors. None, however, examines ethical attitudes of university students (regardless of undergraduate major) at a small, Christian, liberal arts institution regarding computer-related situations. This paper will present data accumulated by an on-going study in which students are presented seven scenarios--all of which involve some aspect of computing technology. These students were randomly selected from a small, Christian, liberal-arts university.

  13. A Survey of High-Quality Computational Libraries and their Impact in Science and Engineering Applications

    SciTech Connect

    Drummond, L.A.; Hernandez, V.; Marques, O.; Roman, J.E.; Vidal, V.

    2004-09-20

    Recently, a number of important scientific and engineering problems have been successfully studied and solved by means of computational modeling and simulation. Many of these computational models and simulations benefited from the use of available software tools and libraries to achieve high performance and portability. In this article, we present a reference matrix of the performance of robust, reliable and widely used tools mapped to scientific and engineering applications that use them. We aim at regularly maintaining and disseminating this matrix to the computational science community. This matrix will contain information on state-of-the-art computational tools, their applications and their use.

  14. Computer Game Theories for Designing Motivating Educational Software: A Survey Study

    ERIC Educational Resources Information Center

    Ang, Chee Siang; Rao, G. S. V. Radha Krishna

    2008-01-01

    The purpose of this study is to evaluate computer game theories for educational software. We propose a framework for designing engaging educational games based on contemporary game studies which includes ludology and narratology. Ludology focuses on the study of computer games as play and game activities, while narratology revolves around the…

  15. A Survey and Evaluation of Simulators Suitable for Teaching Courses in Computer Architecture and Organization

    ERIC Educational Resources Information Center

    Nikolic, B.; Radivojevic, Z.; Djordjevic, J.; Milutinovic, V.

    2009-01-01

    Courses in Computer Architecture and Organization are regularly included in Computer Engineering curricula. These courses are usually organized in such a way that students obtain not only a purely theoretical experience, but also a practical understanding of the topics lectured. This practical work is usually done in a laboratory using simulators…

  16. Survey of Turbulence Models for the Computation of Turbulent Jet Flow and Noise

    NASA Technical Reports Server (NTRS)

    Nallasamy, N.

    1999-01-01

    The report presents an overview of jet noise computation utilizing the computational fluid dynamic solution of the turbulent jet flow field. The jet flow solution obtained with an appropriate turbulence model provides the turbulence characteristics needed for the computation of jet mixing noise. A brief account of turbulence models that are relevant for the jet noise computation is presented. The jet flow solutions that have been directly used to calculate jet noise are first reviewed. Then, the turbulent jet flow studies that compute the turbulence characteristics that may be used for noise calculations are summarized. In particular, flow solutions obtained with the k-ε model, algebraic Reynolds stress model, and Reynolds stress transport equation model are reviewed. Since small-scale jet mixing noise predictions can be improved by utilizing anisotropic turbulence characteristics, turbulence models that can provide the Reynolds stress components must now be considered for jet flow computations. In this regard, algebraic stress models and Reynolds stress transport models are good candidates. Reynolds stress transport models involve more modeling and computational effort and time compared to algebraic stress models. Hence, it is recommended that an algebraic Reynolds stress model (ASM) be implemented in flow solvers to compute the Reynolds stress components.

  17. Does Computer Survey Technology Improve Reports on Alcohol and Illicit Drug Use in the General Population? A Comparison Between Two Surveys with Different Data Collection Modes In France

    PubMed Central

    Beck, François; Guignard, Romain; Legleye, Stéphane

    2014-01-01

    Background Previous studies have shown that survey methodology can greatly influence prevalence estimates for alcohol and illicit drug use. The aim of this article is to assess the effect of data collection modes on alcohol misuse and drug use reports by comparing national estimates from computer-assisted telephone interviews (CATI) and audio-computer-assisted self interviews (A-CASI). Methods Design: Two national representative surveys conducted in 2005 in France by CATI (n = 24,674) and A-CASI (n = 8,111). Participants: French-speaking individuals aged 18–64 years old. Measurements: Alcohol misuse according to the CAGE test, cannabis use (lifetime, last year, 10+ in last month) and experimentation with cocaine, LSD, heroin, amphetamines, ecstasy, were measured with the same questions and wordings in the two surveys. Multivariate logistic regressions controlling for sociodemographic characteristics (age, educational level, marital status and professional status) were performed. Analyses were conducted on the whole sample and stratified by age (18–29 and 30–44 years old) and gender. Data for 45–64 year-olds were not analysed due to limited numbers. Results Overall national estimates were similar for 9 out of the 10 examined measures. However, after adjustment, A-CASI provided higher use for most types of illicit drugs among the youngest men (adjusted odds ratio, or OR, of 1.64 [1.08–2.49] for cocaine, 1.62 [1.10–2.38] for ecstasy, 1.99 [1.17–3.37] for LSD, 2.17 [1.07–4.43] for heroin, and 2.48 [1.41–4.35] for amphetamines), whereas use amongst women was similar in CATI and A-CASI, except for LSD in the 30–44 age group (OR = 3.60 [1.64–7.89]). Reported alcohol misuse was higher with A-CASI, for all ages and genders. Conclusions Although differences in the results over the whole population were relatively small between the surveys, the effect of data collection mode seemed to vary according to age and gender. PMID:24465720

  18. Selective desulfurization of cysteine in the presence of Cys(Acm) in polypeptides obtained by native chemical ligation.

    PubMed

    Pentelute, Brad L; Kent, Stephen B H

    2007-02-15

    Increased versatility for the synthesis of proteins and peptides by native chemical ligation requires the ability to ligate at positions other than Cys. Here, we report that Raney nickel can be used under standard conditions for the selective desulfurization of Cys in the presence of Cys(Acm). This simple and practical tactic enables the more common Xaa-Ala junctions to be used as ligation sites for the chemical synthesis of Cys-containing peptides and proteins. [reaction: see text].

  19. Geologic, geotechnical, and geophysical properties of core from the Acme Fire-Pit-1 drill hole, Sheridan County, Wyoming

    USGS Publications Warehouse

    Collins, Donley S.

    1983-01-01

    A preliminary core study from the Acme Fire-Pit-1 drill hole, Sheridan County, Wyoming, revealed that the upper portion of the core had been baked by a fire confined to the underlying Monarch coal bed. The baked (clinkered) sediment above the Monarch coal bed was determined to have higher point-load strength values (greater than 2 MPa) than the sediment under the burned coal

  20. Segmentation of solid subregion of high grade gliomas in MRI images based on active contour model (ACM)

    NASA Astrophysics Data System (ADS)

    Seow, P.; Win, M. T.; Wong, J. H. D.; Abdullah, N. A.; Ramli, N.

    2016-03-01

    Gliomas are tumours arising from the interstitial tissue of the brain which are heterogeneous, infiltrative and possess ill-defined borders. Tumour subregions (e.g. solid enhancing part, edema and necrosis) are often used for tumour characterisation. Tumour demarcation into substructures facilitates glioma staging and provides essential information. Manual segmentation has several drawbacks: it is laborious, time consuming, subject to intra- and inter-rater variability, and hindered by the diversity in the appearance of tumour tissues. In this work, an active contour model (ACM) was used to segment the solid enhancing subregion of the tumour. 2D brain image acquisition data using a 3T MRI fast spoiled gradient echo sequence post gadolinium from four histologically proven high-grade glioma patients were obtained. Preprocessing of the images, which included subtraction and skull stripping, was performed, followed by ACM segmentation. The results of the automatic segmentation method were compared against the manual delineation of the tumour by a trainee radiologist. Both results were further validated by an experienced neuroradiologist, and brief quantitative evaluations (pixel area and difference ratio) were performed. Preliminary results on the clinical data showed the potential of the ACM for fast and large-scale tumour segmentation in medical imaging.
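A minimal sketch of the region-based idea behind a two-phase Chan-Vese-type ACM, using the data term only (the curvature/length penalty of a full active contour is omitted, so the update reduces to iterative two-means reassignment), run on a synthetic image with a bright "enhancing" blob; the test image and all names are invented for illustration:

```python
import numpy as np

# synthetic post-contrast slice: a bright "enhancing" blob on a shaded background
yy, xx = np.mgrid[0:64, 0:64]
img = 0.2 + 0.02 * np.sin(xx / 5.0)
blob = (yy - 30) ** 2 + (xx - 34) ** 2 < 10 ** 2
img[blob] += 0.6

def two_phase_acm(img, iters=20):
    """Chan-Vese-style two-phase segmentation, data term only: alternately
    update the inside/outside mean intensities and reassign each pixel to
    the region whose mean it is closer to (no curvature penalty)."""
    inside = img > img.mean()                 # crude initial contour
    for _ in range(iters):
        c1, c2 = img[inside].mean(), img[~inside].mean()
        inside = (img - c1) ** 2 < (img - c2) ** 2
    return inside

mask = two_phase_acm(img)
# overlap with the known blob, a pixel-area style evaluation like the paper's
dice = 2.0 * (mask & blob).sum() / (mask.sum() + blob.sum())
```

On this clean synthetic image the data term alone recovers the blob; real MRI data is what motivates the smoothness term and the preprocessing steps described in the abstract.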

  1. TOPICAL REVIEW: A survey of signal processing algorithms in brain computer interfaces based on electrical brain signals

    NASA Astrophysics Data System (ADS)

    Bashashati, Ali; Fatourechi, Mehrdad; Ward, Rabab K.; Birch, Gary E.

    2007-06-01

    Brain computer interfaces (BCIs) aim at providing a non-muscular channel for sending commands to the external world using the electroencephalographic activity or other electrophysiological measures of the brain function. An essential factor in the successful operation of BCI systems is the methods used to process the brain signals. In the BCI literature, however, there is no comprehensive review of the signal processing techniques used. This work presents the first such comprehensive survey of all BCI designs using electrical signal recordings published prior to January 2006. Detailed results from this survey are presented and discussed. The following key research questions are addressed: (1) what are the key signal processing components of a BCI, (2) what signal processing algorithms have been used in BCIs and (3) which signal processing techniques have received more attention?

  2. Wireless Computing Architecture

    DTIC Science & Technology

    2009-07-01

    mechanisms are relevant to a broad spectrum of applications, but are particularly important to data broadcast in wireless distributed computing...significantly improve applications where reliable data broadcast is required. For example, unmanned aerial vehicles (UAVs) may use Rainbow to distribute...68-74. 8. Dean, J., Ghemawat, S., “MapReduce: simplified data processing on large clusters”, Communications of the ACM, 51, 1, 2008, pp. 107-113

  3. How We Surveyed Doctors to Learn What They Want from Computers and Technology

    ERIC Educational Resources Information Center

    Bardyn, Tania; Young, Caroline; Lombardi, Lin C.

    2008-01-01

    Librarians at New York City's Bellevue Hospital Center needed to write a 3-year strategic plan that included technology data. In this article, they describe how they surveyed doctors and residents about their technology and internet use to determine what the Bellevue Medical Library needed to do in order to support those who deliver medical care.…

  4. Computer Based Instruction in Saudi Education: A Survey of Commercially Produced Software.

    ERIC Educational Resources Information Center

    Al-Saleh, Bader A.; Al-Debassi, Saleh M.

    This study addressed the status quo of instructional software produced by national Saudi Arabian software companies as well as the utilization of commercially produced software at selected 1-12 private schools in Riyadh, Saudi Arabia. Descriptive data from a survey of general managers of four major software producers are reported, as well as from…

  5. Improving Inpatient Surveys: Web-Based Computer Adaptive Testing Accessed via Mobile Phone QR Codes

    PubMed Central

    2016-01-01

    Background The National Health Service (NHS) 70-item inpatient questionnaire surveys inpatients on their perceptions of their hospitalization experience. However, it imposes more burden on the patient than other similar surveys. The literature shows that computerized adaptive testing (CAT) based on item response theory can help shorten the item length of a questionnaire without compromising its precision. Objective Our aim was to investigate whether CAT can be (1) efficient with item reduction and (2) used with quick response (QR) codes scanned by mobile phones. Methods After downloading the 2008 inpatient survey data from the Picker Institute Europe website and analyzing the difficulties of this 70-item questionnaire, we used an author-made Excel program using the Rasch partial credit model to simulate 1000 patients’ true scores following a standard normal distribution. The CAT was compared to two other scenarios, answering all items (AAI) and a randomized selection method (RSM), with respect to item length (efficiency) and measurement accuracy. The author-made Web-based CAT program for gathering patient feedback was effectively accessed from mobile phones by scanning the QR code. Results We found that the CAT can be more efficient for patients answering questions (ie, fewer items to respond to) than either AAI or RSM without compromising its measurement accuracy. A Web-based CAT inpatient survey accessed by scanning a QR code on a mobile phone was viable for gathering inpatient satisfaction responses. Conclusions With advances in technology, patients can now be offered alternatives for providing feedback about hospitalization satisfaction. This Web-based CAT is a possible option in health care settings for reducing the number of survey items, as well as offering an innovative QR code access. PMID:26935793
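The CAT procedure described here can be illustrated with a toy adaptive loop: pick the most informative remaining item for the current ability estimate, record the response, and re-estimate ability. The sketch below uses a dichotomous Rasch model rather than the paper's partial credit model, with an invented 70-item bank and deterministic toy responses:

```python
import numpy as np

bank = np.linspace(-3, 3, 70)      # difficulties of a hypothetical 70-item bank
theta_true = 0.8                   # simulated examinee ability

def prob(theta, b):                # Rasch model: P(correct | ability, difficulty)
    return 1.0 / (1.0 + np.exp(-(theta - b)))

def mle_theta(bs, ys, iters=20):
    """Newton-Raphson MLE of ability from administered items, clipped to [-4, 4]."""
    theta = 0.0
    for _ in range(iters):
        p = prob(theta, bs)
        score = np.sum(ys - p)                # first derivative of log-likelihood
        info = np.sum(p * (1.0 - p))          # Fisher information
        theta = float(np.clip(theta + score / info, -4, 4))
    return theta

administered, responses = [], []
theta_hat = 0.0
for _ in range(12):                # CAT stops after 12 items instead of all 70
    free = [i for i in range(len(bank)) if i not in administered]
    # for the Rasch model, Fisher information is maximal when b is closest to theta
    nxt = min(free, key=lambda i: abs(bank[i] - theta_hat))
    administered.append(nxt)
    # deterministic toy response: correct iff the item is easier than the true ability
    responses.append(1.0 if bank[nxt] < theta_true else 0.0)
    theta_hat = mle_theta(bank[administered], np.array(responses))
```

With only 12 of the 70 items administered, the ability estimate lands near the simulated true value, which is the efficiency argument the abstract makes for CAT over answering all items.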

  6. Dynamic MRI-based computer aided diagnostic systems for early detection of kidney transplant rejection: A survey

    NASA Astrophysics Data System (ADS)

    Mostapha, Mahmoud; Khalifa, Fahmi; Alansary, Amir; Soliman, Ahmed; Gimel'farb, Georgy; El-Baz, Ayman

    2013-10-01

    Early detection of renal transplant rejection is important to implement appropriate medical and immune therapy in patients with transplanted kidneys. In the literature, a large number of computer-aided diagnostic (CAD) systems using different image modalities, such as ultrasound (US), magnetic resonance imaging (MRI), computed tomography (CT), and radionuclide imaging, have been proposed for early detection of kidney diseases. A typical CAD system for kidney diagnosis consists of a set of processing steps including: motion correction, segmentation of the kidney and/or its internal structures (e.g., cortex, medulla), construction of agent kinetic curves, functional parameter estimation, diagnosis, and assessment of the kidney status. In this paper, we survey the current state-of-the-art CAD systems that have been developed for kidney disease diagnosis using dynamic MRI. In addition, the paper addresses several challenges that researchers face in developing efficient, fast and reliable CAD systems for the early detection of kidney diseases.

  7. A Prediction of the Damping Properties of Hindered Phenol AO-60/polyacrylate Rubber (AO-60/ACM) Composites through Molecular Dynamics Simulation

    NASA Astrophysics Data System (ADS)

    Yang, Da-Wei; Zhao, Xiu-Ying; Zhang, Geng; Li, Qiang-Guo; Wu, Si-Zhu

    2016-05-01

    Molecular dynamics (MD) simulation, a molecular-level method, was applied to predict the damping properties of AO-60/polyacrylate rubber (AO-60/ACM) composites before experimental measurements were performed. MD simulation results revealed that two types of hydrogen bond were formed: type A, (AO-60) -OH•••O=C- (ACM), and type B, (AO-60) -OH•••O=C- (AO-60). The AO-60/ACM composites were then fabricated and tested to verify the accuracy of the MD simulation through dynamic mechanical thermal analysis (DMTA). DMTA results showed that the introduction of AO-60 could remarkably improve the damping properties of the composites, including an increase in the glass transition temperature (Tg) along with the loss factor (tan δ), and indicated that the AO-60/ACM (98/100) composite had the best damping performance, as verified experimentally.

  8. Drug Metabolism in Preclinical Drug Development: A Survey of the Discovery Process, Toxicology, and Computational Tools.

    PubMed

    Issa, Naiem T; Wathieu, Henri; Ojo, Abiola; Byers, Stephen W; Dakshanamurthy, Sivanesan

    2017-03-15

    Increased R & D spending and high failure rates exist in drug development, due in part to inadequate prediction of drug metabolism and its consequences in the human body. Hence, there is a need for computational methods to supplement and complement current biological assessment strategies. In this review, we provide an overview of drug metabolism in pharmacology, and discuss the current in vitro and in vivo strategies for assessing drug metabolism in preclinical drug development. We highlight computational tools available to the scientific community for the in silico prediction of drug metabolism, and examine how these tools have been implemented to produce drug-target signatures relevant to metabolic routes. Computational workflows that assess drug metabolism and its toxicological and pharmacokinetic effects, such as by applying the adverse outcome pathway framework for risk assessment, may improve the efficiency and speed of preclinical drug development.

  9. A functional collagen adhesin gene, acm, in clinical isolates of Enterococcus faecium correlates with the recent success of this emerging nosocomial pathogen.

    PubMed

    Nallapareddy, Sreedhar R; Singh, Kavindra V; Okhuysen, Pablo C; Murray, Barbara E

    2008-09-01

    Enterococcus faecium recently evolved from a generally avirulent commensal into a multidrug-resistant health care-associated pathogen causing difficult-to-treat infections, but little is known about the factors responsible for this change. We previously showed that some E. faecium strains express a cell wall-anchored collagen adhesin, Acm. Here we analyzed 90 E. faecium isolates (99% acm(+)) and found that the Acm protein was detected predominantly in clinically derived isolates, while the acm gene was present as a transposon-interrupted pseudogene in 12 of 47 isolates of nonclinical origin. A highly significant association between clinical (versus fecal or food) origin and collagen adherence was observed, with Acm detected by whole-cell enzyme-linked immunosorbent assay and flow cytometry. Thirty-seven of 41 sera from patients with E. faecium infections showed reactivity with recombinant Acm, while only 4 of 30 community and hospitalized patient control group sera reacted; antibodies to Acm were present in all 14 E. faecium endocarditis patient sera. Although pulsed-field gel electrophoresis indicated that multiple strains expressed collagen adherence, multilocus sequence typing demonstrated that the majority of collagen-adhering isolates, as well as 16 of 17 endocarditis isolates, are part of the hospital-associated E. faecium genogroup referred to as clonal complex 17 (CC17), which has emerged globally. Taken together, our findings support the hypothesis that Acm has contributed to the emergence of E. faecium and CC17 in nosocomial infections.

  10. Photo-Modeling and Cloud Computing. Applications in the Survey of Late Gothic Architectural Elements

    NASA Astrophysics Data System (ADS)

    Casu, P.; Pisu, C.

    2013-02-01

    This work proposes the application of the latest methods of photo-modeling to the study of Gothic architecture in Sardinia. The aim is to consider the versatility and ease of use of such documentation tools in order to study architecture and its ornamental details. The paper illustrates a procedure of integrated survey and restitution, with the purpose of obtaining an accurate 3D model of some Gothic portals. We combined the contact survey and the photographic survey oriented to photo-modelling. The software used is 123D Catch by Autodesk, an image-based modelling (IBM) system available free of charge. It is a web-based application that requires a few simple steps to produce a mesh from a set of unoriented photos. We tested the application on four portals, working at different scales of detail: first the whole portal and then the different architectural elements that compose it. We were able to model all the elements and to quickly extrapolate simple sections, in order to make a comparison between the moldings, highlighting similarities and differences. Working at different sites and scales of detail allowed us to test the procedure under different conditions of exposure, sunlight, accessibility, surface degradation, and material type, and with different equipment and operators, showing whether the final result could be affected by these factors. We tested a procedure, articulated in a few repeatable steps, that can be applied, with the right corrections and adaptations, to similar cases and/or larger or smaller elements.

  11. A Survey of Computer Use in Associate Degree Programs in Engineering Technology.

    ERIC Educational Resources Information Center

    Cunningham, Pearley

    As part of its annual program review process, the Department of Engineering Technology at the Community College of Allegheny County, in Pennsylvania, conducted a study of computer usage in community college engineering technology programs across the nation. Specifically, the study sought to determine the types of software, Internet access, average…

  12. Using Computers in Distance Study: Results of a Survey amongst Disabled Distance Students.

    ERIC Educational Resources Information Center

    Ommerborn, Rainer; Schuemer, Rudolf

    2002-01-01

    In the euphoria about new technologies in distance education there exists the danger of not sufficiently considering how ever increasing "virtualization" may exclude some student groups. An explorative study was conducted that asked disabled students about their experiences with using computers and the Internet. Overall, those questioned…

  13. Computational needs survey of NASA automation and robotics missions. Volume 2: Appendixes

    NASA Technical Reports Server (NTRS)

    Davis, Gloria J.

    1991-01-01

    NASA's operational use of advanced processor technology in space systems lags behind its commercial development by more than eight years. One of the factors contributing to this is the fact that mission computing requirements are frequently unknown, unstated, misrepresented, or simply not available in a timely manner. NASA must provide clear common requirements to make better use of available technology, to cut development lead time on deployable architectures, and to increase the utilization of new technology. Here, NASA, industry and academic communities are provided with a preliminary set of advanced mission computational processing requirements of automation and robotics (A and R) systems. The results were obtained in an assessment of the computational needs of current projects throughout NASA. The high percentage of responses indicated a general need for enhanced computational capabilities beyond the currently available 80386 and 68020 processor technology. Because of the need for faster processors and more memory, 90 percent of the polled automation projects have reduced or will reduce the scope of their implemented capabilities. The requirements are presented with respect to their targeted environment, identifying the applications required, system performance levels necessary to support them, and the degree to which they are met with typical programmatic constraints. Here, appendixes are provided.

  14. A Survey of Students Participating in a Computer-Assisted Education Programme

    ERIC Educational Resources Information Center

    Yel, Elif Binboga; Korhan, Orhan

    2015-01-01

    This paper mainly examines anthropometric data, data regarding the habits, experiences, and attitudes of the students about their tablet/laptop/desktop computer use, in addition to self-reported musculoskeletal discomfort levels and frequencies of students participating in a tablet-assisted interactive education programme. A two-part questionnaire…

  15. A Survey of Knowledge Management Skills Acquisition in an Online Team-Based Distributed Computing Course

    ERIC Educational Resources Information Center

    Thomas, Jennifer D. E.

    2007-01-01

    This paper investigates students' perceptions of their acquisition of knowledge management skills, namely thinking and team-building skills, resulting from the integration of various resources and technologies into an entirely team-based, online upper level distributed computing (DC) information systems (IS) course. Results seem to indicate that…

  16. COMPUTER-ASSISTED INSTRUCTION, A SURVEY OF THE LITERATURE. SECOND EDITION

    DTIC Science & Technology

    A selective review of 242 documents related to computer-assisted instruction (CAI). Principal headings: CAI Reviews and Bibliographies, Applications...of CAI, Major CAI Centers, CAI Systems Studies, CAI Languages, Instructional Theory, and Program Preparation and Evaluation. An appendix lists 140 CAI programs. The review will be updated semiannually.

  17. Effects of Gender on Computer-Mediated Communication: A Survey of University Faculty

    ERIC Educational Resources Information Center

    Valenziano, Laura

    2007-01-01

    The influence of gender on computer-mediated communication is a research area with tremendous growth. This study sought to determine what gender effects exist in email communication between professors and students. The study also explored the amount of lying and misinterpretation that occurs through online communication. The study results indicate…

  18. Using Computers in Distance Study: Results of a Survey amongst Disabled Distance Students.

    ERIC Educational Resources Information Center

    Ommerborn, Rainer; Schuemer, Rudolf

    A study at Germany's FernUniversitat sent a questionnaire to 300 enrolled distance education students (mostly adult, mostly part-time) who labeled themselves as severely disabled or chronically ill (about 2 percent of students), asking them about the types of their disabilities and their attitudes toward computer-assisted learning and online…

  19. A Survey of Computational Tools to Analyze and Interpret Whole Exome Sequencing Data

    PubMed Central

    Robinson, William A.

    2016-01-01

    Whole Exome Sequencing (WES) is the application of the next-generation technology to determine the variations in the exome and is becoming a standard approach in studying genetic variants in diseases. Understanding the exomes of individuals at single base resolution allows the identification of actionable mutations for disease treatment and management. WES technologies have shifted the bottleneck in experimental data production to computationally intensive informatics-based data analysis. Novel computational tools and methods have been developed to analyze and interpret WES data. Here, we review some of the current tools that are being used to analyze WES data. These tools range from the alignment of raw sequencing reads all the way to linking variants to actionable therapeutics. Strengths and weaknesses of each tool are discussed for the purpose of helping researchers make more informed decisions on selecting the best tools to analyze their WES data. PMID:28070503

  20. A Survey of Computational Tools to Analyze and Interpret Whole Exome Sequencing Data.

    PubMed

    Hintzsche, Jennifer D; Robinson, William A; Tan, Aik Choon

    2016-01-01

    Whole Exome Sequencing (WES) is the application of the next-generation technology to determine the variations in the exome and is becoming a standard approach in studying genetic variants in diseases. Understanding the exomes of individuals at single base resolution allows the identification of actionable mutations for disease treatment and management. WES technologies have shifted the bottleneck in experimental data production to computationally intensive informatics-based data analysis. Novel computational tools and methods have been developed to analyze and interpret WES data. Here, we review some of the current tools that are being used to analyze WES data. These tools range from the alignment of raw sequencing reads all the way to linking variants to actionable therapeutics. Strengths and weaknesses of each tool are discussed for the purpose of helping researchers make more informed decisions on selecting the best tools to analyze their WES data.
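Many of the surveyed tools apply hard filters to variant calls early in the pipeline, keeping only calls with adequate quality and read depth. A minimal, self-contained sketch of such a filtering step (the record layout, thresholds, and values are invented for illustration, not taken from any specific tool):

```python
# hypothetical variant records after alignment and calling:
# (chromosome, position, ref allele, alt allele, call quality, info fields)
records = [
    ("chr1", 10177, "A", "AC", 92.0, {"DP": 54}),
    ("chr1", 10352, "T", "TA", 11.2, {"DP": 6}),
    ("chr2", 21301, "G", "A", 250.0, {"DP": 88}),
]

def passes_hard_filter(rec, min_qual=30.0, min_depth=10):
    """Keep only calls with adequate variant quality and read depth."""
    chrom, pos, ref, alt, qual, info = rec
    return qual >= min_qual and info["DP"] >= min_depth

kept = [r for r in records if passes_hard_filter(r)]
```

Real pipelines apply many more criteria (strand bias, mapping quality, population frequency), but the keep/discard structure is the same.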

  1. Computing the Effects of Strain on Electronic States: A Survey of Methods and Issues

    DTIC Science & Technology

    2012-12-01

    Naval Research Laboratory (NRL) and the NEMO5 code (from Drs. Gerhard Klimeck and James Fonseca, Purdue University) to help in our understanding of...optoelectronics (7, 8). However, even where such effects are absent or negligible, strain still has important effects on the electronic properties ...the community of researchers with interest in accurately computing the electronic structure properties and, in particular, the excited state

  2. Computational approaches for detecting protein complexes from protein interaction networks: a survey

    PubMed Central

    2010-01-01

    Background Most proteins form macromolecular complexes to perform their biological functions. However, experimentally determined protein complex data, especially of those involving more than two protein partners, are relatively limited in the current state-of-the-art high-throughput experimental techniques. Nevertheless, many techniques (such as yeast-two-hybrid) have enabled systematic screening of pairwise protein-protein interactions en masse. Thus computational approaches for detecting protein complexes from protein interaction data are useful complements to the limited experimental methods. They can be used together with the experimental methods for mapping the interactions of proteins to understand how different proteins are organized into higher-level substructures to perform various cellular functions. Results Given the abundance of pairwise protein interaction data from high-throughput genome-wide experimental screenings, a protein interaction network can be constructed from protein interaction data by considering individual proteins as the nodes, and the existence of a physical interaction between a pair of proteins as a link. This binary protein interaction graph can then be used for detecting protein complexes using graph clustering techniques. In this paper, we review and evaluate the state-of-the-art techniques for computational detection of protein complexes, and discuss some promising research directions in this field. Conclusions Experimental results with yeast protein interaction data show that the interaction subgraphs discovered by various computational methods matched well with actual protein complexes. In addition, the computational approaches have also improved in performance over the years. Further improvements could be achieved if the quality of the underlying protein interaction data can be considered adequately to minimize the undesirable effects from the irrelevant and noisy sources, and the various biological evidences can be better

  3. A survey of computational methods and error rate estimation procedures for peptide and protein identification in shotgun proteomics

    PubMed Central

    Nesvizhskii, Alexey I.

    2010-01-01

    This manuscript provides a comprehensive review of the peptide and protein identification process using tandem mass spectrometry (MS/MS) data generated in shotgun proteomic experiments. The commonly used methods for assigning peptide sequences to MS/MS spectra are critically discussed and compared, from basic strategies to advanced multi-stage approaches. Particular attention is paid to the problem of false-positive identifications. Existing statistical approaches for assessing the significance of peptide to spectrum matches are surveyed, ranging from single-spectrum approaches such as expectation values to global error rate estimation procedures such as false discovery rates and posterior probabilities. The importance of using auxiliary discriminant information (mass accuracy, peptide separation coordinates, digestion properties, etc.) is discussed, and advanced computational approaches for joint modeling of multiple sources of information are presented. This review also includes a detailed analysis of the issues affecting the interpretation of data at the protein level, including the amplification of error rates when going from peptide to protein level, and the ambiguities in inferring the identities of sample proteins in the presence of shared peptides. Commonly used methods for computing protein-level confidence scores are discussed in detail. The review concludes with a discussion of several outstanding computational issues. PMID:20816881
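The false discovery rate estimation discussed here is commonly implemented with the target-decoy strategy: the number of decoy matches above a score threshold estimates the number of false target matches above it. A minimal sketch with simulated scores (the score distributions and counts are invented for illustration):

```python
import random

random.seed(42)
# simulated search scores: correct target matches score high; incorrect target
# matches and decoy matches share the same (null) score distribution
targets = [random.gauss(3.0, 1.0) for _ in range(900)] + \
          [random.gauss(0.0, 1.0) for _ in range(100)]
decoys = [random.gauss(0.0, 1.0) for _ in range(100)]

def fdr_at(t, targets, decoys):
    """Target-decoy FDR estimate at score threshold t: decoys passing / targets passing."""
    n_target = sum(s >= t for s in targets)
    n_decoy = sum(s >= t for s in decoys)
    return n_decoy / n_target if n_target else 0.0

def score_threshold(targets, decoys, alpha=0.01):
    """Lowest observed target score whose estimated FDR is at or below alpha."""
    for t in sorted(targets):
        if fdr_at(t, targets, decoys) <= alpha:
            return t
    return float("inf")

threshold = score_threshold(targets, decoys, alpha=0.01)
est_fdr = fdr_at(threshold, targets, decoys)
```

This assumes equal-size target and decoy searches; with unequal sizes the decoy count must be rescaled accordingly.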

  4. GTE: a new FFT based software to compute terrain correction on airborne gravity surveys in spherical approximation.

    NASA Astrophysics Data System (ADS)

    Capponi, Martina; Sampietro, Daniele; Sansò, Fernando

    2016-04-01

    The computation of the vertical attraction due to the topographic masses (Terrain Correction) is still a matter of study in both geodetic and geophysical applications. In fact it is required in high precision geoid estimation by the remove-restore technique, and it is used to isolate the gravitational effect of anomalous masses in geophysical exploration. This topographical effect can be evaluated from the knowledge of a Digital Terrain Model in different ways: e.g. by means of numerical integration, by prisms, tesseroids, polyhedra or Fast Fourier Transform (FFT) techniques. The increasing resolution of recently developed digital terrain models, the increasing number of observation points due to extensive use of airborne gravimetry, and the increasing accuracy of gravity data nowadays represent major issues for the terrain correction computation. Classical methods such as prism or point-mass approximations are indeed too slow, while Fourier-based techniques are usually too approximate for the required accuracy. In this work a new software, called Gravity Terrain Effects (GTE), developed in order to guarantee high accuracy and fast computation of terrain corrections, is presented. GTE has been designed expressly for geophysical applications, allowing the computation not only of the effect of topographic and bathymetric masses but also of those due to sedimentary layers or to the Earth crust-mantle discontinuity (the so-called Moho). In the present contribution we summarize the basic theory of the software and its practical implementation. Basically the GTE software is based on a new algorithm which, by exploiting the properties of the Fast Fourier Transform, allows the terrain correction to be quickly computed, in spherical approximation, at ground or airborne level. Tests demonstrating its performance are also described, showing GTE's capability to compute highly accurate terrain corrections in a very short time. Results obtained for a real airborne survey with GTE
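The core speed-up an FFT approach exploits is that summing a source grid against a translation-invariant kernel is a convolution, which zero-padded FFTs evaluate in O(N² log N) instead of the O(N⁴) of cell-by-cell summation. The sketch below demonstrates the equivalence on a toy grid with an illustrative planar kernel (not GTE's actual spherical kernel):

```python
import numpy as np

rng = np.random.default_rng(2)
heights = rng.random((16, 16))              # toy terrain heights (mass layer)

# illustrative attraction kernel decaying with planar distance; the real
# terrain-correction kernel in spherical approximation is more involved
ky, kx = np.mgrid[-8:8, -8:8]
kernel = 1.0 / (1.0 + ky**2 + kx**2)

def conv_direct(h, k):
    """Brute-force cell-by-cell summation, analogous to prism-by-prism methods."""
    H, W = h.shape
    KH, KW = k.shape
    out = np.zeros((H + KH - 1, W + KW - 1))
    for i in range(H):
        for j in range(W):
            out[i:i+KH, j:j+KW] += h[i, j] * k
    return out

def conv_fft(h, k):
    """The same linear convolution via zero-padded FFTs."""
    H, W = h.shape
    KH, KW = k.shape
    s = (H + KH - 1, W + KW - 1)
    return np.real(np.fft.ifft2(np.fft.fft2(h, s) * np.fft.fft2(k, s)))

direct = conv_direct(heights, kernel)
fast = conv_fft(heights, kernel)
```

Zero-padding to the full output size is what turns the FFT's circular convolution into the linear convolution the direct sum computes, so the two results agree to floating-point precision.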

  5. Computational analysis in epilepsy neuroimaging: A survey of features and methods

    PubMed Central

    Kini, Lohith G.; Gee, James C.; Litt, Brian

    2016-01-01

    Epilepsy affects 65 million people worldwide, a third of whom have seizures that are resistant to anti-epileptic medications. Some of these patients may be amenable to surgical therapy or treatment with implantable devices, but this usually requires delineation of discrete structural or functional lesion(s), which is challenging in a large percentage of these patients. Advances in neuroimaging and machine learning allow semi-automated detection of malformations of cortical development (MCDs), a common cause of drug resistant epilepsy. A frequently asked question in the field is what techniques currently exist to assist radiologists in identifying these lesions, especially subtle forms of MCDs such as focal cortical dysplasia (FCD) Type I and low grade glial tumors. Below we introduce some of the common lesions encountered in patients with epilepsy and the common imaging findings that radiologists look for in these patients. We then review and discuss the computational techniques introduced over the past 10 years for quantifying and automatically detecting these imaging findings. Due to large variations in the accuracy and implementation of these studies, specific techniques are traditionally used at individual centers, often guided by local expertise, as well as selection bias introduced by the varying prevalence of specific patient populations in different epilepsy centers. We discuss the need for a multi-institutional study that combines features from different imaging modalities as well as computational techniques to definitively assess the utility of specific automated approaches to epilepsy imaging. We conclude that sharing and comparing these different computational techniques through a common data platform provides an opportunity to rigorously test and compare the accuracy of these tools across different patient populations and geographical locations. We propose that these kinds of tools, quantitative imaging analysis methods and open data platforms for

  6. Computational analysis in epilepsy neuroimaging: A survey of features and methods.

    PubMed

    Kini, Lohith G; Gee, James C; Litt, Brian

    2016-01-01

    Epilepsy affects 65 million people worldwide, a third of whom have seizures that are resistant to anti-epileptic medications. Some of these patients may be amenable to surgical therapy or treatment with implantable devices, but this usually requires delineation of discrete structural or functional lesion(s), which is challenging in a large percentage of these patients. Advances in neuroimaging and machine learning allow semi-automated detection of malformations of cortical development (MCDs), a common cause of drug resistant epilepsy. A frequently asked question in the field is what techniques currently exist to assist radiologists in identifying these lesions, especially subtle forms of MCDs such as focal cortical dysplasia (FCD) Type I and low grade glial tumors. Below we introduce some of the common lesions encountered in patients with epilepsy and the common imaging findings that radiologists look for in these patients. We then review and discuss the computational techniques introduced over the past 10 years for quantifying and automatically detecting these imaging findings. Due to large variations in the accuracy and implementation of these studies, specific techniques are traditionally used at individual centers, often guided by local expertise, as well as selection bias introduced by the varying prevalence of specific patient populations in different epilepsy centers. We discuss the need for a multi-institutional study that combines features from different imaging modalities as well as computational techniques to definitively assess the utility of specific automated approaches to epilepsy imaging. We conclude that sharing and comparing these different computational techniques through a common data platform provides an opportunity to rigorously test and compare the accuracy of these tools across different patient populations and geographical locations. We propose that these kinds of tools, quantitative imaging analysis methods and open data platforms for

  7. SAM 2.1—A computer program for plotting and formatting surveying data for estimating peak discharges by the slope-area method

    USGS Publications Warehouse

    Hortness, J.E.

    2004-01-01

    The U.S. Geological Survey (USGS) measures discharge in streams using several methods. However, measurement of peak discharges is often impossible or impractical due to difficult access, the inherent danger of making measurements during flood events, and the timing often associated with flood events. Thus, many peak discharge values are calculated after the fact by use of indirect methods. The most common indirect method for estimating peak discharges in streams is the slope-area method. This, like other indirect methods, requires measuring the flood profile through detailed surveys. Processing the survey data for efficient entry into computer streamflow models can be time-consuming; SAM 2.1 is a program designed to expedite that process. The SAM 2.1 computer program is designed to be run in the field on a portable computer. The program processes digital surveying data obtained from an electronic surveying instrument during slope-area measurements. After all measurements have been completed, the program generates files to be input into the SAC (Slope-Area Computation program; Fulford, 1994) or HEC-RAS (Hydrologic Engineering Center-River Analysis System; Brunner, 2001) computer streamflow models so that an estimate of the peak discharge can be calculated.
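
    The slope-area computation that SAM 2.1 feeds into SAC or HEC-RAS ultimately rests on Manning's equation applied to the surveyed cross-sections. A minimal sketch of that core relation follows; the section values are hypothetical, and this is not the SAC or HEC-RAS implementation, which also handles multiple sections, energy losses, and unit conventions:

```python
def mannings_discharge(area_m2, wetted_perimeter_m, slope, n):
    """Estimate discharge (m^3/s) with Manning's equation, the relation
    underlying slope-area peak-discharge computations.

    area_m2             flow cross-sectional area from the survey
    wetted_perimeter_m  wetted perimeter of the section
    slope               water-surface (energy) slope, dimensionless
    n                   Manning roughness coefficient
    """
    r = area_m2 / wetted_perimeter_m  # hydraulic radius
    return (1.0 / n) * area_m2 * r ** (2.0 / 3.0) * slope ** 0.5

# Hypothetical surveyed section: 40 m^2 area, 25 m wetted perimeter,
# water-surface slope of 0.002, and n = 0.035.
q = mannings_discharge(40.0, 25.0, 0.002, 0.035)  # roughly 70 m^3/s
```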

  8. Survey of computed tomography technique and radiation dose in Sudanese hospitals.

    PubMed

    Suliman, I I; Abdalla, S E; Ahmed, Nada A; Galal, M A; Salih, Isam

    2011-12-01

    The purpose of this study was to survey technique and radiation absorbed dose in CT examinations of adults in Sudan and to compare the results with reference dose levels. Questionnaire forms were completed in nine hospitals for a sample of 445 CT examinations in patients. Information on patient, procedure, scanner, and technique for common CT examinations was collected. For each facility, the radiation absorbed dose was measured on CT dose phantoms measuring 16 cm (head) and 32 cm (body) in diameter and was used to calculate the normalized CT air kerma index. Volume CT air kerma index (CVOL) and CT air kerma-length product (PKL,CT) values were calculated using the measured normalized CT air kerma index and questionnaire information. Effective dose (E) estimates were determined by using PKL,CT measurements and appropriate normalized coefficients. Assuming the sample offers a fairly representative picture of CT practice patterns in Sudan, the mean CVOL and PKL,CT values were comparable to or below the reference doses: 65 mGy and 758 mGy cm, respectively, at head CT; 11.5 mGy and 327 mGy cm, respectively, at chest CT; 11.6 mGy and 437 mGy cm, respectively, at abdominal CT; and 11.0 mGy and 264 mGy cm, respectively, at pelvis CT. Estimated effective doses were 1.6, 4.6, 6.6 and 4.0 mSv, respectively. The study offered a first national dose survey and provided a means for quality control and optimization of CT practice within the country.
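
    The effective-dose step described above is a simple multiplication, E = k × PKL,CT, with a region-specific conversion coefficient k. A sketch; the k values below are the commonly tabulated adult coefficients in mSv/(mGy·cm) and are an assumption on our part, not values quoted in the abstract:

```python
# Region-specific conversion coefficients k in mSv/(mGy*cm); these are
# the commonly tabulated adult values, assumed here for illustration.
K_REGION = {"head": 0.0021, "chest": 0.014, "abdomen": 0.015, "pelvis": 0.015}

def effective_dose(region, pkl_mgy_cm):
    """Effective dose E (mSv) from the CT air kerma-length product."""
    return K_REGION[region] * pkl_mgy_cm

# Applied to the survey's mean P_KL,CT values, this reproduces its
# effective-dose estimates of about 1.6, 4.6, 6.6 and 4.0 mSv:
for region, pkl in [("head", 758), ("chest", 327),
                    ("abdomen", 437), ("pelvis", 264)]:
    print(region, effective_dose(region, pkl))
```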

  9. Biomedical informatics for computer-aided decision support systems: a survey.

    PubMed

    Belle, Ashwin; Kon, Mark A; Najarian, Kayvan

    2013-01-01

    The volumes of current patient data as well as their complexity make clinical decision making more challenging than ever for physicians and other care givers. This situation calls for the use of biomedical informatics methods to process data and form recommendations and/or predictions to assist such decision makers. The design, implementation, and use of biomedical informatics systems in the form of computer-aided decision support have become essential and widely used over the last two decades. This paper provides a brief review of such systems, their application protocols and methodologies, and the future challenges and directions they suggest.

  10. Biomedical Informatics for Computer-Aided Decision Support Systems: A Survey

    PubMed Central

    Belle, Ashwin; Kon, Mark A.; Najarian, Kayvan

    2013-01-01

    The volumes of current patient data as well as their complexity make clinical decision making more challenging than ever for physicians and other care givers. This situation calls for the use of biomedical informatics methods to process data and form recommendations and/or predictions to assist such decision makers. The design, implementation, and use of biomedical informatics systems in the form of computer-aided decision support have become essential and widely used over the last two decades. This paper provides a brief review of such systems, their application protocols and methodologies, and the future challenges and directions they suggest. PMID:23431259

  11. Computer simulations and literature survey of continuously variable transmissions for use in buses. Final report, June-December 1981

    SciTech Connect

    Barrows, T.

    1981-12-01

    Numerous studies have been conducted on the concept of flywheel energy storage for buses. Flywheel systems require a continuously variable transmission (CVT) of some type to transmit power between the flywheel and the drive wheels. However, a CVT can provide some fuel economy benefit with or without an energy-storing flywheel, which is the focus of this report. This computer study and literature review is intended to provide insight into the potential applicability of CVTs to buses. It has been suggested that such transmissions may be of interest for two reasons: (1) simple substitution of a CVT in the place of a conventional transmission may offer fuel savings by allowing the engine to operate at its most efficient speed and (2) the combination of a CVT and a flywheel allows regenerative braking, in which the vehicle kinetic energy during deceleration is captured for later re-use. This computer study and literature survey considers several examples of CVTs in buses, both with and without flywheel energy storage, and finds a predicted energy savings of 10 to 32 percent. The analysis focuses on the use of a CVT alone, without regenerative braking. Computer simulations are made to compute the fuel use of a bus with two different CVTs - one with a ratio range of 6, and the other of an infinite ratio range. For the former, assuming an efficiency of 85 percent, a fuel savings of 12 to 22 percent is predicted, depending upon the driving cycle. It is shown that a substantial part of this saving arises from the simple fact that the accessories operate at a lower speed. For this reason, a separate study of accessory speed control has been conducted, yielding a predicted fuel saving of as high as 17 percent. The report concludes that such accessory speed control may represent an attractive way to reduce fuel consumption. In comparison with the CVT, the concept is more compatible with present bus technology.

  12. Incorporating Colour Information for Computer-Aided Diagnosis of Melanoma from Dermoscopy Images: A Retrospective Survey and Critical Analysis

    PubMed Central

    Drew, Mark S.

    2016-01-01

    Cutaneous melanoma is the most life-threatening form of skin cancer. Although advanced melanoma is often considered as incurable, if detected and excised early, the prognosis is promising. Today, clinicians use computer vision in an increasing number of applications to aid early detection of melanoma through dermatological image analysis (dermoscopy images, in particular). Colour assessment is essential for the clinical diagnosis of skin cancers. Due to this diagnostic importance, many studies have either focused on or employed colour features as a constituent part of their skin lesion analysis systems. These studies range from using low-level colour features, such as simple statistical measures of colours occurring in the lesion, to availing themselves of high-level semantic features such as the presence of blue-white veil, globules, or colour variegation in the lesion. This paper provides a retrospective survey and critical analysis of contributions in this research direction. PMID:28096807
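
    As an illustration of the "low-level" end of the colour-feature spectrum the survey describes, here is a sketch computing simple per-channel statistics over lesion pixels; the function and data are hypothetical and not drawn from any surveyed system:

```python
from statistics import mean, pstdev

def colour_features(pixels):
    """Simple statistical colour measures over lesion pixels.

    pixels: iterable of (R, G, B) tuples from the segmented lesion.
    Returns per-channel mean and population standard deviation; the
    standard deviations serve as a crude proxy for colour variegation.
    """
    channels = list(zip(*pixels))  # -> (all R, all G, all B)
    feats = {}
    for name, values in zip("RGB", channels):
        feats[f"mean_{name}"] = mean(values)
        feats[f"std_{name}"] = pstdev(values)
    return feats

# A perfectly uniform patch shows zero variegation in every channel.
uniform = colour_features([(120, 80, 60)] * 4)
assert uniform["std_R"] == uniform["std_G"] == uniform["std_B"] == 0
```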

  13. Self-report computer-based survey of technology use by people with intellectual and developmental disabilities.

    PubMed

    Tanis, Emily Shea; Palmer, Susan; Wehmeyer, Michael; Davies, Daniel K; Stock, Steven E; Lobb, Kathy; Bishop, Barbara

    2012-02-01

    Advancements of technologies in the areas of mobility, hearing and vision, communication, and daily living for people with intellectual and developmental disabilities have the potential to greatly enhance independence and self-determination. Previous research, however, suggests that there is a technological divide with regard to the use of such technologies by people with intellectual and developmental disabilities when compared with the use reported by the general public. To provide current information with regard to technology use by people with intellectual and developmental disabilities by examining the technology needs, use, and barriers to such use experienced by 180 adults with intellectual and developmental disabilities, we used QuestNet, a self-directed computer survey program. Results suggest that although there has been progress in technology acquisition and use by people with intellectual and developmental disabilities, an underutilization of technologies across the population remains.

  14. A Self-Report Computer-Based Survey of Technology Use by People with Intellectual and Developmental Disabilities

    PubMed Central

    Tanis, Emily Shea; Palmer, Susan B.; Wehmeyer, Michael L.; Davies, Danial; Stock, Steven; Lobb, Kathy; Bishop, Barbara

    2014-01-01

    Advancements of technologies in the areas of mobility, hearing and vision, communication, and daily living for people with intellectual and developmental disabilities (IDD) have the potential to greatly enhance independence and self-determination. Previous research, however, suggests that there is a “technological divide” with regard to the use of such technologies by people with IDD when compared with the general public. The present study sought to provide current information with regard to technology use by people with IDD by examining the technology needs, use, and barriers to such use experienced by 180 adults with IDD through QuestNet, a self-directed computer survey program. The study findings suggest that although there has been progress in technology acquisition and use by people with IDD, there remains an underutilization of technologies across the population. PMID:22316226

  15. Creating a New Model Curriculum: A Rationale for "Computing Curricula 1990".

    ERIC Educational Resources Information Center

    Bruce, Kim B.

    1991-01-01

    Describes a model for the design of undergraduate curricula in the discipline of computing that was developed by the ACM/IEEE (Association for Computing Machinery/Institute of Electrical and Electronics Engineers) Computer Society Joint Curriculum Task Force. Institutional settings and structures in which computing degrees are awarded are…

  16. A review of brain-computer interface games and an opinion survey from researchers, developers and users.

    PubMed

    Ahn, Minkyu; Lee, Mijin; Choi, Jinyoung; Jun, Sung Chan

    2014-08-11

    In recent years, research on Brain-Computer Interface (BCI) technology for healthy users has attracted considerable interest, and BCI games are especially popular. This study reviews the current status of, and describes future directions in, the field of BCI games. To this end, we conducted a literature search and found that BCI control paradigms using electroencephalographic signals (motor imagery, P300, steady state visual evoked potential and passive approach reading mental state) have been the primary focus of research. We also conducted a survey of nearly three hundred participants that included researchers, game developers and users around the world. From this survey, we found that all three groups (researchers, developers and users) agreed on the significant influence and applicability of BCI and BCI games, and they all selected prostheses, rehabilitation and games as the most promising BCI applications. User and developer groups tended to give low priority to passive BCI and the whole head sensor array. Developers gave higher priorities to "the easiness of playing" and the "development platform" as important elements for BCI games and the market. Based on our assessment, we discuss the critical point at which BCI games will be able to progress from their current stage to widespread marketing to consumers. In conclusion, we propose three critical elements important for expansion of the BCI game market: standards, gameplay and appropriate integration.

  17. A Review of Brain-Computer Interface Games and an Opinion Survey from Researchers, Developers and Users

    PubMed Central

    Ahn, Minkyu; Lee, Mijin; Choi, Jinyoung; Jun, Sung Chan

    2014-01-01

    In recent years, research on Brain-Computer Interface (BCI) technology for healthy users has attracted considerable interest, and BCI games are especially popular. This study reviews the current status of, and describes future directions in, the field of BCI games. To this end, we conducted a literature search and found that BCI control paradigms using electroencephalographic signals (motor imagery, P300, steady state visual evoked potential and passive approach reading mental state) have been the primary focus of research. We also conducted a survey of nearly three hundred participants that included researchers, game developers and users around the world. From this survey, we found that all three groups (researchers, developers and users) agreed on the significant influence and applicability of BCI and BCI games, and they all selected prostheses, rehabilitation and games as the most promising BCI applications. User and developer groups tended to give low priority to passive BCI and the whole head sensor array. Developers gave higher priorities to “the easiness of playing” and the “development platform” as important elements for BCI games and the market. Based on our assessment, we discuss the critical point at which BCI games will be able to progress from their current stage to widespread marketing to consumers. In conclusion, we propose three critical elements important for expansion of the BCI game market: standards, gameplay and appropriate integration. PMID:25116904

  18. TOPICAL REVIEW: The nucleation mechanism of protein folding: a survey of computer simulation studies

    NASA Astrophysics Data System (ADS)

    Faísca, Patrícia F. N.

    2009-09-01

    The nucleation mechanism of protein folding, originally proposed by Baldwin in the early 1970s, was first observed by Shakhnovich and co-workers two decades later in the context of Monte Carlo simulations of a simple lattice model. At about the same time the extensive use of phi-value analysis provided the first experimental evidence that the folding of Chymotrypsin-inhibitor 2, a small single-domain protein, which folds with two-state kinetics, is also driven by a nucleation mechanism. Since then, the nucleation mechanism has generally been considered the most common form of folding mechanism amongst two-state proteins. However, recent experimental data have put forward the idea that this may not necessarily be so, since the accuracy of the experimentally determined phi values, which are used to identify the critical (i.e. nucleating) residues, is typically poor. Here, we provide a survey of in silico results on the nucleation mechanism, ranging from simple lattice Monte Carlo to more sophisticated off-lattice molecular dynamics simulations, and discuss them in light of experimental data.

  19. Superfund Record of Decision (EPA Region 5): Acme Solvent Reclaiming, Winnebago County, IL. (Second remedial action), December 1990

    SciTech Connect

    Not Available

    1990-12-31

    The 20-acre Acme Solvent Reclaiming site is a former industrial disposal site in Winnebago County, Illinois. Land use in the area is mixed agricultural and residential. From 1960 to 1973, Acme Solvent Reclaiming disposed of paints, oils, and still bottoms onsite from its solvent reclamation plant. Wastes were dumped into depressions created from previous quarrying and landscaping operations, and empty drums also were stored onsite. State investigations in 1981 identified elevated levels of chlorinated organic compounds in ground water. A 1985 Record of Decision (ROD) provided for excavation and onsite incineration of 26,000 cubic yards of contaminated soil and sludge, supplying home carbon treatment units to affected residences, and further study of ground water and bedrock. During illegal removal actions taken by PRPs in 1986, 40,000 tons of soil and sludge were removed from the site. The selected remedial action for the site includes excavating and treating 6,000 tons of soil and sludge from two waste areas, using low-temperature thermal stripping; treating residuals using solidification, if necessary, followed by onsite or offsite disposal; treating the remaining contaminated soil and possibly bedrock using soil/bedrock vapor extraction; consolidating the remaining contaminated soil onsite with any treatment residuals, followed by capping; incinerating offsite 8,000 gallons of liquids and sludge from two remaining tanks, and disposing of the tanks offsite; providing an alternate water supply to residents with contaminated wells; pumping and onsite treatment of VOC-contaminated ground water.

  20. Insights into Degron Recognition by APC/C Coactivators from the Structure of an Acm1-Cdh1 Complex

    PubMed Central

    He, Jun; Chao, William C.H.; Zhang, Ziguo; Yang, Jing; Cronin, Nora; Barford, David

    2013-01-01

    Summary The anaphase-promoting complex/cyclosome (APC/C) regulates sister chromatid segregation and the exit from mitosis. Selection of most APC/C substrates is controlled by coactivator subunits (either Cdc20 or Cdh1) that interact with substrate destruction motifs—predominantly the destruction (D) box and KEN box degrons. How coactivators recognize D box degrons and how this is inhibited by APC/C regulatory proteins is not defined at the atomic level. Here, from the crystal structure of S. cerevisiae Cdh1 in complex with its specific inhibitor Acm1, which incorporates D and KEN box pseudosubstrate motifs, we describe the molecular basis for D box recognition. Additional interactions between Acm1 and Cdh1 identify a third protein-binding site on Cdh1 that is likely to confer coactivator-specific protein functions including substrate association. We provide a structural rationalization for D box and KEN box recognition by coactivators and demonstrate that many noncanonical APC/C degrons bind APC/C coactivators at the D box coreceptor. PMID:23707760

  1. Accurate treatments of electrostatics for computer simulations of biological systems: A brief survey of developments and existing problems

    NASA Astrophysics Data System (ADS)

    Yi, Sha-Sha; Pan, Cong; Hu, Zhong-Han

    2015-12-01

    Modern computer simulations of biological systems often involve an explicit treatment of the complex interactions among a large number of molecules. While it is straightforward to compute the short-ranged Van der Waals interaction in classical molecular dynamics simulations, it has been a long-standing challenge to develop accurate methods for the long-ranged Coulomb interaction. In this short review, we discuss three types of methodologies for the accurate treatment of electrostatics in simulations of explicit molecules: truncation-type methods, Ewald-type methods, and mean-field-type methods. Throughout the discussion, we outline the formulations and developments of these methods, emphasize the intrinsic connections among the three types of methods, and focus on the existing problems, which are often associated with the boundary conditions of electrostatics. This brief survey is summarized with a short perspective on future trends in method development and applications in the field of biological simulations. Project supported by the National Natural Science Foundation of China (Grant Nos. 91127015 and 21522304), the Open Project from the State Key Laboratory of Theoretical Physics, and the Innovation Project from the State Key Laboratory of Supramolecular Structure and Materials.
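
    Of the three families discussed, truncation-type methods are the simplest: the pairwise Coulomb sum is cut off beyond a chosen radius. A toy sketch of that idea (Gaussian units with k = 1; no periodic boundaries or minimum-image convention, which production simulations would require; Ewald-type methods would instead split the sum into real-space and reciprocal-space parts):

```python
import math

def coulomb_energy_truncated(positions, charges, r_cut):
    """Pairwise Coulomb energy with a plain spherical cutoff,
    the simplest truncation-type treatment of electrostatics."""
    energy = 0.0
    n = len(positions)
    for i in range(n):
        for j in range(i + 1, n):
            r = math.dist(positions[i], positions[j])
            if r < r_cut:
                energy += charges[i] * charges[j] / r
    return energy

# Two opposite unit charges 1.0 apart: E = -1 when the pair lies
# inside the cutoff, and the interaction is (wrongly) dropped when
# the cutoff is too small -- the basic artifact of truncation.
pos = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
q = [1.0, -1.0]
print(coulomb_energy_truncated(pos, q, 2.0))  # -1.0
print(coulomb_energy_truncated(pos, q, 0.5))  # 0.0
```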

  2. Parallel and Distributed Computing Combinatorial Algorithms

    DTIC Science & Technology

    1993-10-01

    Report documentation (grant F49620-92-J-0125, Dr. Leighton): research on several problems involving parallel and distributed computing and combinatorial optimization, reported in numerous papers, including work on network decomposition (B. Awerbuch et al., Proceedings of the Eleventh Annual ACM Symposium on Principles of Distributed Computing, August 1992).

  3. ACM-based automatic liver segmentation from 3-D CT images by combining multiple atlases and improved mean-shift techniques.

    PubMed

    Ji, Hongwei; He, Jiangping; Yang, Xin; Deklerck, Rudi; Cornelis, Jan

    2013-05-01

    In this paper, we present an autocontext model (ACM)-based automatic liver segmentation algorithm, which combines ACM, multiatlases, and mean-shift techniques to segment the liver from 3-D CT images. Our algorithm is a learning-based method and can be divided into two stages. At the first stage, i.e., the training stage, ACM is performed to learn a sequence of classifiers in each atlas space (based on each atlas and other aligned atlases). With the use of multiple atlases, multiple sequences of ACM-based classifiers are obtained. At the second stage, i.e., the segmentation stage, the test image is segmented in each atlas space by applying each sequence of ACM-based classifiers. The final segmentation result is obtained by fusing segmentation results from all atlas spaces via a multiclassifier fusion technique. Specifically, in order to speed up segmentation, given a test image, we first use an improved mean-shift algorithm to perform over-segmentation and then implement region-based image labeling instead of the original inefficient pixel-based image labeling. The proposed method is evaluated on the datasets of the MICCAI 2007 liver segmentation challenge. The experimental results show that the average volume overlap error and the average surface distance achieved by our method are 8.3% and 1.5 mm, respectively, which are comparable to the results reported in the existing state-of-the-art work on liver segmentation.
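
    The average volume overlap error quoted above is conventionally defined from the Jaccard index of the segmented and reference voxel sets, VOE = 1 − |A ∩ B| / |A ∪ B|. A minimal sketch over toy voxel-index sets (the MICCAI evaluation operates on full 3-D label volumes):

```python
def volume_overlap_error(seg, ref):
    """Volume overlap error: 1 minus the Jaccard index of the
    segmented (seg) and reference (ref) voxel sets."""
    seg, ref = set(seg), set(ref)
    return 1.0 - len(seg & ref) / len(seg | ref)

# Toy 2-D "volumes": 3 voxels agree out of 5 in the union,
# so the overlap error is 1 - 3/5 = 40%.
a = {(0, 0), (0, 1), (1, 0), (1, 1)}
b = {(0, 1), (1, 0), (1, 1), (2, 1)}
voe = volume_overlap_error(a, b)
```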

  4. A survey and taxonomy on energy efficient resource allocation techniques for cloud computing systems

    SciTech Connect

    Hameed, Abdul; Khoshkbarforoushha, Alireza; Ranjan, Rajiv; Jayaraman, Prem Prakash; Kolodziej, Joanna; Balaji, Pavan; Zeadally, Sherali; Malluhi, Qutaibah Marwan; Tziritas, Nikos; Vishnu, Abhinav; Khan, Samee U.; Zomaya, Albert

    2014-06-06

    In a cloud computing paradigm, energy efficient allocation of different virtualized ICT resources (servers, storage disks, and networks, and the like) is a complex problem due to the presence of heterogeneous application (e.g., content delivery networks, MapReduce, web applications, and the like) workloads having contentious allocation requirements in terms of ICT resource capacities (e.g., network bandwidth, processing speed, response time, etc.). Several recent papers have tried to address the issue of improving energy efficiency in allocating cloud resources to applications with varying degree of success. However, to the best of our knowledge there is no published literature on this subject that clearly articulates the research problem and provides research taxonomy for succinct classification of existing techniques. Hence, the main aim of this paper is to identify open challenges associated with energy efficient resource allocation. In this regard, the study, first, outlines the problem and existing hardware and software-based techniques available for this purpose. Furthermore, available techniques already presented in the literature are summarized based on the energy-efficient research dimension taxonomy. The advantages and disadvantages of the existing techniques are comprehensively analyzed against the proposed research dimension taxonomy namely: resource adaption policy, objective function, allocation method, allocation operation, and interoperability.

  5. A survey of radiation dose to patients and operators during radiofrequency ablation using computed tomography.

    PubMed

    Saidatul, A; Azlan, Ca; Megat Amin, Msa; Abdullah, Bjj; Ng, Kh

    2010-01-01

    Computed tomography (CT) fluoroscopy is able to give real-time images to a physician undertaking minimally invasive procedures such as biopsies, percutaneous drainage, and radiofrequency ablation (RFA). Both operators executing the procedure and patients are thus at risk of radiation exposure during CT fluoroscopy. This study focuses on the radiation exposure present during a series of RFA procedures, and used Gafchromic film (Type XR-QA; International Specialty Products, USA) and thermoluminescent dosimeters (TLD-100H; Bicron, USA) to measure the radiation received by patients undergoing treatment and by operators subject to scatter radiation. The voltage was held constant at 120 kVp and the current at 70 mA, with 5 mm slice thickness. The duration of irradiation was between 150 and 638 seconds. Ultimately, from a sample of 30 liver RFA procedures, the study revealed that the operator received the highest dose at the hands, followed by the eyes and thyroid, while secondary staff dosage was moderately uniform across all measured parts of the body.

  6. Recent Evolution of the Introductory Curriculum in Computing.

    ERIC Educational Resources Information Center

    Tucker, Allen B.; Garnick, David K.

    1991-01-01

    Traces the evolution of introductory computing courses for undergraduates based on the Association for Computing Machinery (ACM) guidelines published in "Curriculum 78." Changes in the curricula are described, including the role of discrete mathematics and theory; and the need for a broader model for designing introductory courses is…

  7. An Undergraduate Computer Science Curriculum for the Hearing Impaired.

    ERIC Educational Resources Information Center

    Perkins, A. Louise

    1995-01-01

    Presents an example section from a computer-science-integrated curriculum that was originally based on the Association of Computing Machinery (ACM) 1978 curriculum. The curriculum was designed to allow both instructors and students to move away from teaching and learning facts. (DDR)

  8. ReRep: Computational detection of repetitive sequences in genome survey sequences (GSS)

    PubMed Central

    Otto, Thomas D; Gomes, Leonardo HF; Alves-Ferreira, Marcelo; de Miranda, Antonio B; Degrave, Wim M

    2008-01-01

    Background Genome survey sequences (GSS) offer a preliminary global view of a genome since, unlike ESTs, they cover coding as well as non-coding DNA and include repetitive regions of the genome. A more precise estimation of the nature, quantity and variability of repetitive sequences very early in a genome sequencing project is of considerable importance, as such data strongly influence the estimation of genome coverage, library quality and progress in scaffold construction. Also, the elimination of repetitive sequences from the initial assembly process is important to avoid errors and unnecessary complexity. Repetitive sequences are also of interest in a variety of other studies, for instance as molecular markers. Results We designed and implemented a straightforward pipeline called ReRep, which combines bioinformatics tools for identifying repetitive structures in a GSS dataset. In a case study, we first applied the pipeline to a set of 970 GSSs, sequenced in our laboratory from the human pathogen Leishmania braziliensis, the causative agent of leishmaniasis, an important public health problem in Brazil. We also verified the applicability of ReRep to new sequencing technologies using a set of 454 reads of Escherichia coli. The behaviour of several parameters in the algorithm is evaluated and suggestions are made for tuning of the analysis. Conclusion The ReRep approach for identification of repetitive elements in GSS datasets proved to be straightforward and efficient. Several potential repetitive sequences were found in a L. braziliensis GSS dataset generated in our laboratory, and further validated by the analysis of a more complete genomic dataset from the EMBL and Sanger Centre databases. ReRep also identified most of the E. coli K12 repeats prior to assembly in an example dataset obtained by automated sequencing using 454 technology.
The parameters controlling the algorithm behaved consistently and may be tuned to the properties of the dataset, in particular
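
    A toy illustration of the kind of repeat screening such a pipeline performs: flagging over-represented k-mers across a set of reads. This is not ReRep's actual algorithm, which chains dedicated bioinformatics tools; it only shows the underlying idea:

```python
from collections import Counter

def find_repetitive_kmers(reads, k=8, min_count=3):
    """Count every k-mer across the reads and return those occurring
    at least min_count times -- candidates for repetitive sequence."""
    counts = Counter()
    for read in reads:
        for i in range(len(read) - k + 1):
            counts[read[i:i + k]] += 1
    return {kmer: n for kmer, n in counts.items() if n >= min_count}

# The ACGTACGT motif recurs across both toy reads and is flagged;
# k-mers unique to the flanking sequence are not.
reads = ["ACGTACGTACGTACGT", "TTTTACGTACGTAAAA"]
repeats = find_repetitive_kmers(reads)
```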

  9. Role of computed tomography before lumbar puncture: a survey of clinical practice

    PubMed Central

    Greig, P R; Goroszeniuk, D

    2006-01-01

    Introduction It is becoming increasingly common to request computed tomography (CT) to rule out space occupying lesions before lumbar puncture (LP), even in patients with no clinical signs. Imaging trends within a busy district general hospital in Oxfordshire, UK were analysed, with the results used to clarify when imaging should be considered mandatory. Method A retrospective six month sample was obtained comprising all adults considered for LP. Observed frequencies of abnormal examination findings compared with abnormal investigations were used to determine sensitivity, specificity, positive predictive, and negative predictive values to assess the validity of using a normal clinical examination as a basis for excluding CT. Results 64 patients were considered for LP. In total, 58 patients underwent LP, with a single patient receiving two. After an abnormal CT scan, six patients did not undergo a planned LP. In all six of these cases subarachnoid haemorrhage was detected, and in all cases this was considered a probable diagnosis. In no case was an LP precluded by an unsuspected space occupying lesion. Neurological examination showed a sensitivity of 0.72 (0.52 to 0.93), specificity 0.78 (0.64 to 0.91), positive predictive value 0.61 (0.41 to 0.83), and negative predictive value 0.85 (0.73 to 0.97). Discussion The high sensitivity and negative predictive values support normal neurological examination as an effective predictor of a normal CT scan. This permits the recommendation that, in cases where subarachnoid haemorrhage is not suspected, a CT scan can be avoided provided there are no abnormal findings on physical or fundoscopic examination. PMID:16517796
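
    The four validity measures quoted in the abstract follow directly from a 2×2 confusion table of examination findings against CT results. A sketch with hypothetical counts (the study's underlying cell counts are not reported in this abstract):

```python
def diagnostic_stats(tp, fp, fn, tn):
    """Standard diagnostic-test measures from a 2x2 table:
    tp/fp/fn/tn = true/false positives and negatives."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),  # positive predictive value
        "npv": tn / (tn + fn),  # negative predictive value
    }

# Hypothetical counts chosen only to illustrate the calculation;
# e.g. sensitivity = 13/18, close to the paper's 0.72 point estimate.
stats = diagnostic_stats(tp=13, fp=8, fn=5, tn=32)
```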

  10. State-of-the-art and dissemination of computational tools for drug-design purposes: a survey among Italian academics and industrial institutions.

    PubMed

    Artese, Anna; Alcaro, Stefano; Moraca, Federica; Reina, Rocco; Ventura, Marzia; Costantino, Gabriele; Beccari, Andrea R; Ortuso, Francesco

    2013-05-01

    During the first edition of the Computationally Driven Drug Discovery meeting, held in November 2011 at Dompé Pharma (L'Aquila, Italy), a questionnaire regarding the diffusion and the use of computational tools for drug-design purposes in both academia and industry was distributed among all participants. This is a follow-up of a previously reported investigation carried out among a few companies in 2007. The new questionnaire implemented five sections dedicated to: research group identification and classification; 18 different computational techniques; software information; hardware data; and economical business considerations. In this article, together with a detailed history of the different computational methods, a statistical analysis of the survey results that enabled the identification of the prevalent computational techniques adopted in drug-design projects is reported and a profile of the computational medicinal chemist currently working in academia and pharmaceutical companies in Italy is highlighted.

  11. Assessment of Universal Healthcare Coverage in a District of North India: A Rapid Cross-Sectional Survey Using Tablet Computers

    PubMed Central

    Singh, Tarundeep; Roy, Pritam; Jamir, Limalemla; Gupta, Saurav; Kaur, Navpreet; Jain, D. K.; Kumar, Rajesh

    2016-01-01

    Objective A rapid survey was carried out in Shaheed Bhagat Singh Nagar District of Punjab state in India to ascertain health seeking behavior and out-of-pocket health expenditures. Methods Using multistage cluster sampling design, 1,008 households (28 clusters x 36 households in each cluster) were selected proportionately from urban and rural areas. Households were selected through a house-to-house survey during April and May 2014 whose members had (a) experienced illness in the past 30 days, (b) had illness lasting longer than 30 days, (c) were hospitalized in the past 365 days, or (d) had women who were currently pregnant or experienced childbirth in the past two years. In these selected households, trained investigators, using a tablet computer-based structured questionnaire, enquired about the socio-demographics, nature of illness, source of healthcare, and healthcare and household expenditure. The data was transmitted daily to a central server using wireless communication network. Mean healthcare expenditures were computed for various health conditions. Catastrophic healthcare expenditure was defined as more than 10% of the total annual household expenditure on healthcare. Chi square test for trend was used to compare catastrophic expenditures on hospitalization between households classified into expenditure quartiles. Results The mean monthly household expenditure was 15,029 Indian Rupees (USD 188.2). Nearly 14.2% of the household expenditure was on healthcare. Fever, respiratory tract diseases, gastrointestinal diseases were the common acute illnesses, while heart disease, diabetes mellitus, and respiratory diseases were the more common chronic diseases. Hospitalizations were mainly due to cardiovascular diseases, gastrointestinal problems, and accidents. Only 17%, 18%, 20% and 31% of the healthcare for acute illnesses, chronic illnesses, hospitalizations and childbirth was sought in the government health facilities. 
Average expenditure in government health
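    The catastrophic-expenditure definition above is a simple ratio test against the 10% threshold. A minimal sketch; the household below is illustrative only, constructed from the survey's mean monthly expenditure and mean health share:

    ```python
    def is_catastrophic(annual_health_spend, annual_total_spend, threshold=0.10):
        """Catastrophic if healthcare exceeds 10% of total annual household expenditure."""
        return annual_health_spend > threshold * annual_total_spend

    # Illustrative household built from the survey's mean monthly expenditure of
    # 15,029 INR; a 14.2% health share exceeds the 10% threshold.
    annual_total = 15029 * 12
    annual_health = 0.142 * annual_total
    flag = is_catastrophic(annual_health, annual_total)  # True
    ```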

  12. CLIC-ACM: generic modular rad-hard data acquisition system based on CERN GBT versatile link

    NASA Astrophysics Data System (ADS)

    Bielawski, B.; Locci, F.; Magnoni, S.

    2015-01-01

    CLIC is a world-wide collaboration to study the next ``terascale'' lepton collider, relying upon a very innovative concept of two-beam-acceleration. This accelerator, currently under study, will be composed of a sequence of 21,000 two-beam modules. Each module requires more than 300 analogue and digital signals which need to be acquired and controlled in a synchronous way. CLIC-ACM (Acquisition and Control Module) is the 'generic' control and acquisition module developed to accommodate the controls of all these signals for the various sub-systems and their related specifications in terms of data bandwidth, triggering and timing synchronization. This paper describes the system architecture with respect to its radiation-tolerance, power consumption and scalability.

  13. Annual evaporite deposition at the acme of the Messinian salinity crisis: evidence for solar-lunar climate forcing

    NASA Astrophysics Data System (ADS)

    Manzi, Vinicio; Gennari, Rocco; Lugli, Stefano; Roveri, Marco; Scafetta, Nicola; Schreiber, B. Charlotte

    2013-04-01

    We studied two evaporite successions (one halite and the other gypsum) consisting of annual varves in order to reconstruct the paleoclimatic and paleoenvironmental conditions existing during the acme of the Messinian salinity crisis (MSC; ≈5.5 Ma), when huge volumes of evaporites accumulated on the floor of the Mediterranean basin. The spectral analyses of these varved evaporitic successions reveal significant peaks in periodicity at around 3-5, 9, 11-13, 20-27 and 50-100 yr. The deposition of varved sedimentary deposits is usually controlled by climate conditions. A comparison with modern precipitation data in the western Mediterranean shows that during the acme of the MSC the climate was not in a permanent evaporitic stage, but in a dynamic state where evaporite deposition was controlled by quasi-periodic climate oscillations similar to modern analogs including Quasi-Biennial Oscillation, El Niño Southern Oscillation, and decadal to secular lunar- and solar-induced cycles. Particularly, we found a significant quasi-decadal oscillation with a prominent 9-year peak that is also common in modern temperature records and is present in both the contemporary Atlantic Multidecadal Oscillation (AMO) index and Pacific Decadal Oscillation (PDO) index. These cyclical patterns are common to both ancient and modern climate records because they can be associated with solar and solar-lunar tidal cycles. During the Messinian, the Mediterranean basin as well as the global ocean, were characterized by somewhat different continent distribution, ocean size, geography, hydrological connections, and ice-sheet volume with respect to the modern configuration. The recognition of modern-style climate oscillations during the Messinian, however, suggests that, although local geographic factors acted as pre-conditioning factors turning the Mediterranean Sea into a giant brine pool, external climate forcing, regulated by solar-lunar cycles and largely independent of those local geographic
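    The cyclicities reported above come from spectral analysis of annual (varve) thickness series. A minimal periodogram sketch on synthetic data, assuming a pure 9-year cycle; none of the numbers below come from the actual Messinian record:

    ```python
    import math

    def periodogram_power(series, periods):
        """Power of a mean-removed annual series at candidate periods (in years),
        via projection onto sine/cosine at each trial frequency."""
        n = len(series)
        mean = sum(series) / n
        x = [v - mean for v in series]
        power = {}
        for p in periods:
            w = 2 * math.pi / p
            c = sum(x[t] * math.cos(w * t) for t in range(n))
            s = sum(x[t] * math.sin(w * t) for t in range(n))
            power[p] = c * c + s * s
        return power

    # Synthetic 300-year varve-thickness record with a 9-year cycle (illustration).
    thickness = [1.0 + 0.3 * math.sin(2 * math.pi * t / 9) for t in range(300)]
    power = periodogram_power(thickness, periods=range(2, 51))
    best = max(power, key=power.get)  # the 9-year peak dominates
    ```

    On a real record one would detrend first and test peak significance against a noise background; this sketch only shows how a quasi-decadal peak is located.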

  14. A Review of Models for Teacher Preparation Programs for Precollege Computer Science Education.

    ERIC Educational Resources Information Center

    Deek, Fadi P.; Kimmel, Howard

    2002-01-01

    Discusses the need for adequate precollege computer science education and focuses on the issues of teacher preparation programs and requirements needed to teach high school computer science. Presents models of teacher preparation programs and compares state requirements with Association for Computing Machinery (ACM) recommendations. (Author/LRW)

  15. Characterization of PVL/ACME-positive methicillin-resistant Staphylococcus aureus (genotypes ST8-MRSA-IV and ST5-MRSA-II) isolated from a university hospital in Japan.

    PubMed

    Kawaguchiya, Mitsuyo; Urushibara, Noriko; Yamamoto, Dai; Yamashita, Toshiharu; Shinagawa, Masaaki; Watanabe, Naoki; Kobayashi, Nobumichi

    2013-02-01

    The ST8 methicillin-resistant Staphylococcus aureus (MRSA) with Staphylococcal cassette chromosome mec (SCCmec) type IVa, known as USA300, is a prevalent community-acquired MRSA (CA-MRSA) clone in the United States and has been spreading worldwide. The USA300 characteristically harbors Panton-Valentine Leukocidin (PVL) genes and the arginine catabolic mobile element (ACME, type I). Prevalence and molecular characteristics of PVL(+) and/or ACME(+) S. aureus were investigated in a university hospital located in northern Japan, for 1,366 S. aureus isolates, including 601 MRSA strains derived from clinical specimens collected from 2008 to 2010. The PVL gene was identified in three MRSA strains with SCCmec IV, which belonged to ST8, spa type t008, coagulase type III, and agr type I. Two PVL-positive MRSA strains also had type I ACME, and were isolated from skin abscesses of outpatients who had not travelled abroad recently. One of these PVL(+)/ACME(+) strains carried tet(K), msrA, and aph(3')-IIIa, showing resistance to kanamycin, tetracycline, erythromycin, and ciprofloxacin, suggesting acquisition of more resistance than ST8 CA-MRSA strains previously reported in Japan. In contrast, another PVL(+)/ACME(+) strain and a PVL(+)/ACME(-) strain were susceptible to more antimicrobials and had fewer virulence factors than PVL(-)/ACME(+) MRSA strains. Besides the two PVL(+) MRSA strains, ACME (type-ΔII) was identified in seven MRSA strains with SCCmec II belonging to ST5, one of three spa types (t002, t067, and t071), coagulase type II, and agr type II. These PVL(-)/ACME(+) MRSA strains showed multiple drug resistance and harbored various toxin genes as observed for ST5 PVL(-)/ACME(-) MRSA-II. The present study suggested the spread of ST8-MRSA-IV in northern Japan, and a potential significance of ACME-positive ST5-MRSA-II as an emerging MRSA clone in a hospital.

  16. Proceeding of the ACM/IEEE-CS Joint Conference on Digital Libraries (1st, Roanoke, Virginia, June 24-28, 2001).

    ERIC Educational Resources Information Center

    Association for Computing Machinery, New York, NY.

    Papers in this Proceedings of the ACM/IEEE-CS Joint Conference on Digital Libraries (Roanoke, Virginia, June 24-28, 2001) discuss: automatic genre analysis; text categorization; automated name authority control; automatic event generation; linked active content; designing e-books for legal research; metadata harvesting; mapping the…

  17. Macro- and microstructural diversity of sea urchin teeth revealed by large-scale micro-computed tomography survey

    NASA Astrophysics Data System (ADS)

    Ziegler, Alexander; Stock, Stuart R.; Menze, Björn H.; Smith, Andrew B.

    2012-10-01

    Sea urchins (Echinodermata: Echinoidea) generally possess an intricate jaw apparatus that incorporates five teeth. Although echinoid teeth consist of calcite, their complex internal design results in biomechanical properties far superior to those of inorganic forms of the constituent material. While the individual elements (or microstructure) of echinoid teeth provide general insight into processes of biomineralization, the cross-sectional shape (or macrostructure) of echinoid teeth is useful for phylogenetic and biomechanical inferences. However, studies of sea urchin tooth macro- and microstructure have traditionally been limited to a few readily available species, effectively disregarding a potentially high degree of structural diversity that could be informative in a number of ways. Having scanned numerous sea urchin species using micro-computed tomography (µCT) and synchrotron µCT, we report a large variation in macro- and microstructure of sea urchin teeth. In addition, we describe aberrant tooth shapes and apply 3D visualization protocols that permit accelerated visual access to the complex microstructure of sea urchin teeth. Our broad survey identifies key taxa for further in-depth study and integrates previously assembled data on fossil species into a more comprehensive systematic analysis of sea urchin teeth. In order to circumvent the imprecise, word-based description of tooth shape, we introduce shape analysis algorithms that will permit the numerical and therefore more objective description of tooth macrostructure. Finally, we discuss how synchrotron µCT datasets permit virtual models of tooth microstructure to be generated as well as the simulation of tooth mechanics based on finite element modeling.

  18. Radiation Dose from Whole-Body F-18 Fluorodeoxyglucose Positron Emission Tomography/Computed Tomography: Nationwide Survey in Korea

    PubMed Central

    2016-01-01

    The purpose of this study was to estimate average radiation exposure from 18F-fluorodeoxyglucose (FDG) positron emission tomography/computed tomography (PET/CT) examinations and to analyze possible factors affecting the radiation dose. A nation-wide questionnaire survey was conducted involving all institutions that operate PET/CT scanners in Korea. From the response, radiation doses from injected FDG and CT examination were calculated. A total of 105 PET/CT scanners in 73 institutions were included in the analysis (response rate of 62.4%). The average FDG injected activity was 310 ± 77 MBq and 5.11 ± 1.19 MBq/kg. The average effective dose from FDG was estimated to be 5.89 ± 1.46 mSv. The average CT dose index and dose-length product were 4.60 ± 2.47 mGy and 429.2 ± 227.6 mGy∙cm, which corresponded to 6.26 ± 3.06 mSv. The radiation doses from FDG and CT were significantly lower in case of newer scanners than older ones (P < 0.001). Advanced PET technologies such as time-of-flight acquisition and point-spread function recovery were also related to low radiation dose (P < 0.001). In conclusion, the average radiation dose from FDG PET/CT is estimated to be 12.2 mSv. The radiation dose from FDG PET/CT is reduced with more recent scanners equipped with image-enhancing algorithms. PMID:26908992
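    The averages above combine two standard conversions: injected FDG activity times a dose coefficient, and CT dose-length product times a region-specific conversion factor. A sketch reproducing the reported figures, assuming the ICRP adult coefficient of 0.019 mSv/MBq for 18F-FDG and a trunk DLP factor of 0.0146 mSv/(mGy·cm) implied by the reported CT numbers (both coefficients are assumptions, not values stated in the abstract):

    ```python
    # Assumed conversion coefficients (see lead-in); not stated in the abstract.
    FDG_COEFF = 0.019   # mSv per MBq of injected 18F-FDG
    DLP_COEFF = 0.0146  # mSv per mGy*cm of CT dose-length product

    def petct_effective_dose(injected_mbq, dlp_mgy_cm):
        """Split a whole-body FDG PET/CT exam into PET, CT, and total effective dose."""
        pet = injected_mbq * FDG_COEFF
        ct = dlp_mgy_cm * DLP_COEFF
        return pet, ct, pet + ct

    # The survey's average injected activity (310 MBq) and DLP (429.2 mGy*cm):
    pet, ct, total = petct_effective_dose(310, 429.2)  # ~5.89, ~6.27, ~12.2 mSv
    ```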

  19. Assumptions, Trust, and Names in Computer Security Protocols

    DTIC Science & Technology

    2011-06-01

    some of these issues in Section 2.3.) Quine expands his approach in [28]. In [29], Kaplan compares the approaches of Frege and Quine and then attempts...according to Kaplan), the second occurrence of ‘Ortcutt’ does not mean anything, at least as far as logic is concerned. But for Frege , the second occurrence...6th ACM Conference on Computer and Communications Security (CCS ’99). ACM, 1999, pp. 52–62. [25] G. Frege , “Über sinn und bedeutung,” Zeitschrift für

  20. Summary of selected computer programs produced by the U.S. Geological Survey for simulation of ground-water flow and quality, 1994

    USGS Publications Warehouse

    Appel, Charles A.; Reilly, Thomas E.

    1994-01-01

    A summary list of reports that document numerical methods that simulate ground-water flow and quality is presented. The list documents the reference by giving a description of each model program, its numerical features, an expression of the number of past applications and where to obtain a copy. All reports included in the list have been published or developed by the U.S. Geological Survey and most contain listings of the computer programs.

  1. Novel Techniques for Secure Use of Public Cloud Computing Resources

    DTIC Science & Technology

    2015-09-17

    SIGCOMM Computer Communication Review, volume 43, 513–514. ACM, 2013. [61] Jeong, Ik Rae and Jeong Ok Kwon. “Analysis of some keyword search schemes in...Government’s Information Infrastruc- ture”, 1993. URL http://govinfo.library.unt.edu/npr/library/reports/it09.html. [92] Rhee, Hyun Sook, Ik Rae Jeong

  2. A Placement Test for Computer Science: Design, Implementation, and Analysis

    ERIC Educational Resources Information Center

    Nugent, Gwen; Soh, Leen-Kiat; Samal, Ashok; Lang, Jeff

    2006-01-01

    An introductory CS1 course presents problems for educators and students due to students' diverse background in programming knowledge and exposure. Students who enroll in CS1 also have different expectations and motivations. Prompted by the curricular guidelines for undergraduate programmes in computer science released in 2001 by the ACM/IEEE, and…

  3. Teaching Perspectives among Introductory Computer Programming Faculty in Higher Education

    ERIC Educational Resources Information Center

    Mainier, Michael J.

    2011-01-01

    This study identified the teaching beliefs, intentions, and actions of 80 introductory computer programming (CS1) faculty members from institutions of higher education in the United States using the Teacher Perspectives Inventory. Instruction method used inside the classroom, categorized by ACM CS1 curriculum guidelines, was also captured along…

  4. Detection of structural and numerical chromosomal abnormalities by ACM-FISH analysis in sperm of oligozoospermic infertility patients

    SciTech Connect

    Schmid, T E; Brinkworth, M H; Hill, F; Sloter, E; Kamischke, A; Marchetti, F; Nieschlag, E; Wyrobek, A J

    2003-11-10

    Modern reproductive technologies are enabling the treatment of infertile men with severe disturbances of spermatogenesis. The possibility of elevated frequencies of genetically and chromosomally defective sperm has become an issue of concern with the increased usage of intracytoplasmic sperm injection (ICSI), which can enable men with severely impaired sperm production to father children. Several papers have been published about aneuploidy in oligozoospermic patients, but relatively little is known about chromosome structural aberrations in the sperm of these patients. We examined sperm from infertile, oligozoospermic individuals for structural and numerical chromosomal abnormalities using a multicolor ACM FISH assay that utilizes DNA probes specific for three regions of chromosome 1 to detect human sperm that carry numerical chromosomal abnormalities plus two categories of structural aberrations: duplications and deletions of 1pter and 1cen, and chromosomal breaks within the 1cen-1q12 region. There was a significant increase in the average frequencies of sperm with duplications and deletions in the infertility patients compared with the healthy concurrent controls. There was also a significantly elevated level of breaks within the 1cen-1q12 region. There was no evidence for an increase in chromosome-1 disomy, or in diploidy. Our data reveal that oligozoospermia is associated with chromosomal structural abnormalities, suggesting that oligozoospermic men carry a higher burden of transmissible chromosome damage. The findings raise the possibility of elevated levels of transmissible chromosomal defects following ICSI treatment.

  5. Hydrologic effects of phreatophyte control, Acme-Artesia reach of the Pecos River, New Mexico, 1967-82

    USGS Publications Warehouse

    Welder, G.E.

    1988-01-01

    The U.S. Bureau of Reclamation began a phreatophyte clearing and control program in the bottom land of the Acme-Artesia reach of the Pecos River in March 1967. The initial cutting of 19,000 acres of saltcedar trees, the dominant phreatophyte in the area, was completed in May 1969. Saltcedar regrowth continued each year until July 1975, when root plowing eradicated most of the regrowth. The major objective of the clearing and control program was to salvage water that could be put to beneficial use. Measurements of changes in the water table in the bottom land and changes in the base flow of the Pecos River were made in order to determine the hydrologic effects of the program. Some salvage of water was indicated, but it is not readily recognized as an increase in base flow. The quantity of salvage probably is less than the average annual base-flow gain of 19,110 acre-ft in the reach during 1967-82. (Author's abstract)

  6. A multicopper oxidase is essential for manganese oxidation and laccase-like activity in Pedomicrobium sp. ACM 3067.

    PubMed

    Ridge, Justin P; Lin, Marianne; Larsen, Eloise I; Fegan, Mark; McEwan, Alastair G; Sly, Lindsay I

    2007-04-01

    Pedomicrobium sp. ACM 3067 is a budding-hyphal bacterium belonging to the alpha-Proteobacteria which is able to oxidize soluble Mn2+ to insoluble manganese oxide. A cosmid, from a whole-genome library, containing the putative genes responsible for manganese oxidation was identified and a primer-walking approach yielded 4350 bp of novel sequence. Analysis of this sequence showed the presence of a predicted three-gene operon, moxCBA. The moxA gene product showed homology to multicopper oxidases (MCOs) and contained the characteristic four copper-binding motifs (A, B, C and D) common to MCOs. An insertion mutation of moxA showed that this gene was essential for both manganese oxidation and laccase-like activity. The moxB gene product showed homology to a family of outer membrane proteins which are essential for Type I secretion in Gram-negative bacteria. moxBA has not been observed in other manganese-oxidizing bacteria but homologues were identified in the genomes of several bacteria including Sinorhizobium meliloti 1021 and Agrobacterium tumefaciens C58. These results suggest that moxBA and its homologues constitute a family of genes encoding an MCO and a predicted component of the Type I secretion system.

  7. Phase-Field Modeling and Computation of Crack Propagation and Fracture

    DTIC Science & Technology

    2014-04-07

    Computational Mechanics was renamed the Thomas J.R. Hughes Medal. “Isogeometric Analysis,” Charlemagne Distinguished Lecture, Aachen Institute for...Advanced Study in Computational Engineering Science, (AICES), RWTH – Aachen University, Aachen, Germany, October 24, 2012. ACM 2013, Advances in

  8. Using Relational Schemata in a Computer Immune System to Detect Multiple-Packet Network Intrusions

    DTIC Science & Technology

    2002-03-01

    used to perform portscans on between one and ten computers using SYN scanning, FIN scanning, and UDP scanning of victim machines. Both sequential and...for Computing Machinery, 1997. [Stan00] Staniford, Stuart, et al. “Practical Automated Detection of Stealthy Portscans.” Proceedings of the 7th ACM

  9. Peak data for U.S. Geological Survey gaging stations, Texas network and computer program to estimate peak-streamflow frequency

    USGS Publications Warehouse

    Slade, R.M.; Asquith, W.H.

    1996-01-01

    About 23,000 annual peak streamflows and about 400 historical peak streamflows exist for about 950 stations in the surface-water data-collection network of Texas. These data are presented on a computer diskette along with the corresponding dates, gage heights, and information concerning the basin and the nature or cause of the flood. Also on the computer diskette is a U.S. Geological Survey computer program that estimates peak-streamflow frequency based on annual and historical peak streamflow. The program estimates peak streamflow for 2-, 5-, 10-, 25-, 50-, and 100-year recurrence intervals and is based on guidelines established by the Interagency Advisory Committee on Water Data. Explanations are presented for installing the program, and an example is presented with discussion of its options.
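    The Interagency Advisory Committee on Water Data guidelines (Bulletin 17B) fit a log-Pearson Type III distribution to the annual peaks. A minimal sketch of that fit, assuming the Wilson-Hilferty approximation for the frequency factor and a hypothetical peak series; the program's regional-skew weighting, historical-peak adjustment, and low-outlier tests are omitted:

    ```python
    import math
    from statistics import NormalDist, mean, stdev

    def lp3_quantile(peaks, return_period):
        """Peak flow for a given recurrence interval from a log-Pearson Type III
        fit to annual peaks, using the Wilson-Hilferty frequency factor."""
        logs = [math.log10(q) for q in peaks]
        m, s = mean(logs), stdev(logs)
        n = len(logs)
        # Station skew of the log-transformed peaks
        g = (n / ((n - 1) * (n - 2))) * sum(((x - m) / s) ** 3 for x in logs)
        z = NormalDist().inv_cdf(1 - 1 / return_period)  # standard normal quantile
        if abs(g) < 1e-6:
            k = z  # zero skew reduces to the normal case
        else:
            k = (2 / g) * ((1 + g * z / 6 - g ** 2 / 36) ** 3 - 1)
        return 10 ** (m + k * s)

    # Hypothetical annual peak series (cfs), for illustration only:
    peaks = [1200, 950, 3100, 780, 1500, 2200, 640, 1750, 2900, 1100,
             860, 1300, 2400, 700, 1900]
    q100 = lp3_quantile(peaks, 100)  # estimated 100-year peak
    q2 = lp3_quantile(peaks, 2)      # estimated 2-year (median) peak
    ```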

  10. WTP Calculation Sheet: Determining the LAW Glass Former Constituents and Amounts for G2 and Acm Models. 24590-LAW-M4C-LFP-00002, Rev. B

    SciTech Connect

    Gimpel, Rodney F.; Kruger, Albert A.

    2013-12-16

    The purpose of this calculation is to determine the LAW glass former recipe and additives with their respective amounts. The methodology and equations contained herein are to be used in the G2 and ACM models until better information is supplied by R&T efforts. This revision includes calculations that determines the mass and volume of the bulk chemicals/minerals needed per batch. Plus, it contains calculations (for the G2 model) to help prevent overflow in LAW Feed Preparation Vessel.

  11. A system of computer programs (WAT_MOVE) for transferring data among data bases in the US Geological Survey National Water Information System

    SciTech Connect

    Rogers, G.D.; Kerans, B.K.

    1991-11-01

    This report describes WAT_MOVE, a system of computer programs that was developed for moving National Water Information System data between US Geological Survey distributed computer databases. WAT_MOVE has three major sub-systems: one for retrieval, one for loading, and one for purging. The retrieval sub-system creates transaction files of retrieved data for transfer and invokes a file transfer to send the transaction files to the receiving site. The loading sub-system reads the control and transaction files retrieved from the source database and loads the data in the appropriate files. The purging sub-system deletes data from a database. Although WAT_MOVE was developed for use by the Geological Survey's Hydrologic Investigations Program of the Yucca Mountain Project Branch, the software can be beneficial to any office maintaining data in the Site File, ADAPS (Automated Data Processing System), GWSI (Ground-Water Site Inventory), and QW (Quality of Water) sub-systems of the National Water Information System. The software also can be used to move data between databases on a single network node or to modify data within a database.

  12. A 90-day subchronic feeding study of genetically modified maize expressing Cry1Ac-M protein in Sprague-Dawley rats.

    PubMed

    Liu, Pengfei; He, Xiaoyun; Chen, Delong; Luo, Yunbo; Cao, Sishuo; Song, Huan; Liu, Ting; Huang, Kunlun; Xu, Wentao

    2012-09-01

    The cry1Ac-M gene, coding one of Bacillus thuringiensis (Bt) crystal proteins, was introduced into maize H99 × Hi IIB genome to produce insect-resistant GM maize BT-38. The food safety assessment of the BT-38 maize was conducted in Sprague-Dawley rats in a 90-day feeding study. We incorporated maize grains from BT-38 and H99 × Hi IIB into rodent diets at three concentrations (12.5%, 25%, 50%) and administered to Sprague-Dawley rats (n=10/sex/group) for 90 days. A commercialized rodent diet was fed to an additional group as control group. Body weight, feed consumption and toxicological response variables were measured, and gross as well as microscopic pathology were examined. Moreover, detection of residual Cry1Ac-M protein in the serum of rats fed with GM maize was conducted. No death or adverse effects were observed in the current feeding study. No adverse differences in the values of the response variables were observed between rats that consumed diets containing GM maize BT-38 and non-GM maize H99 × Hi IIB. No detectable Cry1Ac-M protein was found in the serum of rats after feeding diets containing GM maize for 3 months. The results demonstrated that BT-38 maize is as safe as conventional non-GM maize.

  13. Acme Landfill Expansion. Appendices.

    DTIC Science & Technology

    1982-01-01

    existing operation area; plus use of a portion of the 178-acre southern parcel, sloping away from the existing hills. (Areas A and B) 3. Use of existing...acre parcel, less 40 acres for dredge spoil, and plus use of a portion of the 178-acre southern parcel, sloping away from the existing hills. (Areas...acre southern parcel filling against the existing hills. (Areas A, B, C, and D) 6. Use of existing operation area; plus use of the full 200-acre

  14. Use of handheld computers with global positioning systems for probability sampling and data entry in household surveys.

    PubMed

    Vanden Eng, Jodi L; Wolkon, Adam; Frolov, Anatoly S; Terlouw, Dianne J; Eliades, M James; Morgah, Kodjo; Takpa, Vincent; Dare, Aboudou; Sodahlon, Yao K; Doumanou, Yao; Hawley, William A; Hightower, Allen W

    2007-08-01

    We introduce an innovative method that uses personal digital assistants (PDAs) equipped with global positioning system (GPS) units in household surveys to select a probability-based sample and perform PDA-based interviews. Our approach uses PDAs with GPS to rapidly map all households in selected areas, choose a random sample, and navigate back to the sampled households to conduct an interview. We present recent field experience in two large-scale nationally representative household surveys to assess insecticide-treated bed net coverage as part of malaria control efforts in Africa. The successful application of this method resulted in statistically valid samples; quality-controlled data entry; and rapid aggregation, analyses, and availability of preliminary results within days of completing the field work. We propose this method as an alternative to the Expanded Program on Immunization cluster sample method when a fast, statistically valid survey is required in an environment with little census information at the enumeration area level.

  15. Social presence reinforcement and computer-mediated communication: the effect of the solicitor's photography on compliance to a survey request made by e-mail.

    PubMed

    Guéguen, Nicolas; Jacob, Céline

    2002-04-01

    Personal information is scarce in computer-mediated communication. So when information about the sender is attached with an e-mail, this could induce a positive feeling toward the sender. An experiment was carried out where a male and a female student-solicitor, by way of an e-mail, requested a student-subject to participate in a survey. In half of the cases, a digital photograph of the solicitor appeared at the end of the e-mail. Results show that subjects agreed more readily to the request in the experimental condition than in the control condition where no digital photograph was sent with the e-mail. The importance of social information on computer-mediated communication is used to explain such results.

  16. Survey of new vector computers: The CRAY 1S from CRAY research; the CYBER 205 from CDC and the parallel computer from ICL - architecture and programming

    NASA Technical Reports Server (NTRS)

    Gentzsch, W.

    1982-01-01

    Problems which can arise with vector and parallel computers are discussed in a user oriented context. Emphasis is placed on the algorithms used and the programming techniques adopted. Three recently developed supercomputers are examined and typical application examples are given in CRAY FORTRAN, CYBER 205 FORTRAN and DAP (distributed array processor) FORTRAN. The systems' performance is compared. The addition of parts of two N x N arrays is considered. The influence of the architecture on the algorithms and programming language is demonstrated. Numerical analysis of magnetohydrodynamic differential equations by an explicit difference method is illustrated, showing very good results for all three systems. The prognosis for supercomputer development is assessed.

  17. Computer-science guest-lecture series at Langston University sponsored by the U.S. Geological Survey; abstracts, 1992-93

    USGS Publications Warehouse

    Steele, K. S.

    1994-01-01

    Langston University, a Historically Black University located at Langston, Oklahoma, has a computing and information science program within the Langston University Division of Business. Since 1984, Langston University has participated in the Historically Black College and University program of the U.S. Department of Interior, which provided education, training, and funding through a combined earth-science and computer-technology cooperative program with the U.S. Geological Survey (USGS). USGS personnel have presented guest lectures at Langston University since 1984. Students have been enthusiastic about the lectures, and as a result of this program, 13 Langston University students have been hired by the USGS on a part-time basis while they continued their education at the University. The USGS expanded the offering of guest lectures in 1992 by increasing the number of visits to Langston University, and by inviting the participation of speakers from throughout the country. The objectives of the guest-lecture series are to assist Langston University in offering state-of-the-art education in the computer sciences, to provide students with an opportunity to learn from and interact with skilled computer-science professionals, and to develop a pool of potential future employees for part-time and full-time employment. This report includes abstracts for guest-lecture presentations during the 1992-93 school year.

  18. Documentation of computer programs to compute and display pathlines using results from the U.S. Geological Survey modular three-dimensional finite-difference ground-water flow model

    USGS Publications Warehouse

    Pollock, David W.

    1989-01-01

    A particle tracking post-processing package was developed to compute three-dimensional path lines based on output from steady-state simulations obtained with the U.S. Geological Survey modular 3-dimensional finite difference groundwater flow model. The package consists of two FORTRAN 77 computer programs: (1) MODPATH, which calculates pathlines, and (2) MODPATH-PLOT, which presents results graphically. MODPATH uses a semi-analytical particle tracking scheme. The method is based on the assumption that each directional velocity component varies linearly within a grid cell in its own coordinate direction. This assumption allows an analytical expression to be obtained describing the flow path within a grid cell. Given the initial position of a particle anywhere in a cell, the coordinates of any other point along its path line within the cell, and the time of travel between them, can be computed directly. Data is input to MODPATH and MODPATH-PLOT through a combination of files and interactive dialogue. Examples of how to use MODPATH and MODPATH-PLOT are provided for a sample problem. Listings of the computer codes and detailed descriptions of input data format and program options are also presented. (Author's abstract)
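    The semi-analytical scheme above yields a closed-form path because a velocity that varies linearly across a cell integrates to an exponential in time. A one-dimensional sketch of that idea (MODPATH itself applies this per cell in all three coordinate directions; the function names here are illustrative, not MODPATH's):

    ```python
    import math

    def pollock_1d(x1, x2, v1, v2, xp, t):
        """Particle position after time t inside a cell spanning [x1, x2], where the
        x-velocity varies linearly from v1 at face x1 to v2 at face x2. Solving
        dx/dt = v1 + A*(x - x1) gives an exponential in time; result is capped at
        the cell faces."""
        A = (v2 - v1) / (x2 - x1)      # velocity gradient within the cell
        vp = v1 + A * (xp - x1)        # velocity at the particle's position
        if abs(A) < 1e-12:             # uniform velocity: straight-line motion
            x_new = xp + vp * t
        else:
            x_new = x1 + (vp * math.exp(A * t) - v1) / A
        return min(max(x_new, x1), x2)

    def exit_time(x1, x2, v1, v2, xp):
        """Travel time from xp to the downstream face x2 (assumes v1, v2 > 0)."""
        A = (v2 - v1) / (x2 - x1)
        vp = v1 + A * (xp - x1)
        if abs(A) < 1e-12:
            return (x2 - xp) / vp
        return math.log(v2 / vp) / A
    ```

    For example, in a unit cell with face velocities 1 and 2, a particle starting at the upstream face exits after ln(2) time units, and stepping it forward by exactly that time lands it on the downstream face.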

  19. Computational Analysis and Simulation of Empathic Behaviors: a Survey of Empathy Modeling with Behavioral Signal Processing Framework.

    PubMed

    Xiao, Bo; Imel, Zac E; Georgiou, Panayiotis; Atkins, David C; Narayanan, Shrikanth S

    2016-05-01

    Empathy is an important psychological process that facilitates human communication and interaction. Enhancement of empathy has profound significance in a range of applications. In this paper, we review emerging directions of research on computational analysis of empathy expression and perception as well as empathic interactions, including their simulation. We summarize the work on empathic expression analysis by the targeted signal modalities (e.g., text, audio, and facial expressions). We categorize empathy simulation studies into theory-based emotion space modeling or application-driven user and context modeling. We summarize challenges in computational study of empathy including conceptual framing and understanding of empathy, data availability, appropriate use and validation of machine learning techniques, and behavior signal processing. Finally, we propose a unified view of empathy computation and offer a series of open problems for future research.

  20. Nuclear fuel cycle risk assessment: survey and computer compilation of risk-related literature. [Once-through Cycle and Plutonium Recycle

    SciTech Connect

    Yates, K.R.; Schreiber, A.M.; Rudolph, A.W.

    1982-10-01

    The US Nuclear Regulatory Commission has initiated the Fuel Cycle Risk Assessment Program to provide risk assessment methods for assistance in the regulatory process for nuclear fuel cycle facilities other than reactors. Both the once-through cycle and plutonium recycle are being considered. A previous report generated by this program defines and describes fuel cycle facilities, or elements, considered in the program. This report, the second from the program, describes the survey and computer compilation of fuel cycle risk-related literature. Sources of available information on the design, safety, and risk associated with the defined set of fuel cycle elements were searched and documents obtained were catalogued and characterized with respect to fuel cycle elements and specific risk/safety information. Both US and foreign surveys were conducted. Battelle's computer-based BASIS information management system was used to facilitate the establishment of the literature compilation. A complete listing of the literature compilation and several useful indexes are included. Future updates of the literature compilation will be published periodically. 760 annotated citations are included.

  1. Ethics in the computer age. Conference proceedings

    SciTech Connect

    Kizza, J.M.

    1994-12-31

    These proceedings contain the papers presented at the Ethics in the Computer Age conference held in Gatlinburg, Tennessee, November 11-13, 1994. The conference was sponsored by ACM SIGCAS (Computers and Society), to which I am very grateful. The Ethics in the Computer Age conference sequence started in 1991 with the first conference on the campus of the University of Tennessee at Chattanooga. The second was held at the same location a year later. These two conferences were limited to invited speakers, but their success was overwhelming. This is the third in the sequence and the first truly international one. Plans are already under way for the fourth in 1996.

  2. Inter-method reliability of paper surveys and computer assisted telephone interviews in a randomized controlled trial of yoga for low back pain

    PubMed Central

    2014-01-01

    Background Little is known about the reliability of different methods of survey administration in low back pain trials. This analysis was designed to determine the reliability of responses to self-administered paper surveys compared to computer assisted telephone interviews (CATI) for the primary outcomes of pain intensity and back-related function, and secondary outcomes of patient satisfaction, SF-36, and global improvement among participants enrolled in a study of yoga for chronic low back pain. Results Pain intensity, back-related function, and both physical and mental health components of the SF-36 showed excellent reliability at all three time points; ICC scores ranged from 0.82 to 0.98. Pain medication use showed good reliability; kappa statistics ranged from 0.68 to 0.78. Patient satisfaction had moderate to excellent reliability; ICC scores ranged from 0.40 to 0.86. Global improvement showed poor reliability at 6 weeks (ICC = 0.24) and 12 weeks (ICC = 0.10). Conclusion CATI shows excellent reliability for primary outcomes and at least some secondary outcomes when compared to self-administered paper surveys in a low back pain yoga trial. Having two reliable options for data collection may be helpful to increase response rates for core outcomes in back pain trials. Trial registration ClinicalTrials.gov: NCT01761617. Date of trial registration: December 4, 2012. PMID:24716775
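    For the binary outcomes above (e.g., pain medication use), inter-method agreement is summarized with Cohen's kappa, which discounts the agreement expected by chance. A minimal sketch of the statistic (not the study's code):

```python
def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa for two paired sequences of categorical ratings,
    e.g. paper-survey vs. CATI responses from the same participants."""
    n = len(ratings_a)
    categories = set(ratings_a) | set(ratings_b)
    # Observed agreement: fraction of pairs where both methods agree
    observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Chance agreement: product of each method's marginal proportions
    expected = sum(
        (ratings_a.count(c) / n) * (ratings_b.count(c) / n) for c in categories
    )
    return (observed - expected) / (1 - expected)
```

    By the conventional reading used in the abstract, kappa in the 0.6-0.8 range counts as good agreement; intraclass correlation coefficients play the analogous role for the continuous outcomes.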

  3. Survey of Current Practice in Computer and Information Technology in the Youth Training Scheme. Publication No. 2.

    ERIC Educational Resources Information Center

    Brown, Alan; Mills, Julian

    A study examined the computer and information technology (CIT) training provided in 61 training schemes in 10 regions throughout the United Kingdom under the auspices of the Youth Training Scheme. Of the 52 programs for which data on the time spent on CIT were available, 12 offered 5 days or less of off-the-job training with little other…

  4. A Survey of the Applicants for the National Science Foundation Summer Institute on Computer Science at Shippensburg State College.

    ERIC Educational Resources Information Center

    Kelley, Thomas J., Jr.

    The 1969 National Science Foundation Summer Institute on Computer Science at Shippensburg State College attracted 765 applicants for the 30 available positions. The average applicant was 37 years of age, lived 878 miles from Shippensburg, and taught in a public high school with an enrollment of 1193 students. He had taught for 11 years, 8 1/2 in…

  5. Coming into Focus: The Treatment of African-Americans in Post Civil War United States History Survey Texts.

    ERIC Educational Resources Information Center

    Cha-Jua, Sundiata Keita; Weems, Robert E., Jr.

    1994-01-01

    Asserts that during the past two decades, the coverage of African Americans in college-level U.S. history textbooks has improved in both quantity and quality. Reports that, despite these advances, a survey of 14 textbooks revealed a significant portion of African American history remains beyond the boundaries of these texts. (ACM)

  6. Experimental determination of the partitioning coefficient and volatility of important BVOC oxidation products using the Aerosol Collection Module (ACM) coupled to a PTR-ToF-MS

    NASA Astrophysics Data System (ADS)

    Gkatzelis, G.; Hohaus, T.; Tillmann, R.; Schmitt, S. H.; Yu, Z.; Schlag, P.; Wegener, R.; Kaminski, M.; Kiendler-Scharr, A.

    2015-12-01

    Atmospheric aerosol can alter the Earth's radiative budget and global climate, and can also affect human health. A dominant contributor to submicrometer particulate matter (PM) is organic aerosol (OA). OA can be either directly emitted, e.g. through combustion processes (primary OA), or formed through the oxidation of organic gases (secondary organic aerosol, SOA). A detailed understanding of SOA formation is important because SOA constitutes a major contribution to total OA. The partitioning of individual SOA components between the gas and particle phase, as well as their volatility, is still poorly understood, which adds uncertainty and thus complicates climate modelling. In this work, a new experimental methodology was used for compound-specific analysis of organic aerosol. The Aerosol Collection Module (ACM) is a newly developed instrument that deploys an aerodynamic lens to separate the gas and particle phases of an aerosol. The particle phase is directed to a cooled sampling surface. After collection, particles are thermally desorbed and transferred to a detector for further analysis. In the present work, the ACM was coupled to a Proton Transfer Reaction-Time of Flight-Mass Spectrometer (PTR-ToF-MS) to detect and quantify organic compounds partitioning between the gas and particle phase. This experimental approach was used in a set of experiments at the atmosphere simulation chamber SAPHIR to investigate SOA formation. Ozone oxidation, with subsequent photochemical aging, of β-pinene, limonene, and real plant emissions from Pinus sylvestris (Scots pine) was studied. Simultaneous measurement of the gas and particle phase using the ACM-PTR-ToF-MS allows partitioning coefficients of important BVOC oxidation products to be reported. Additionally, volatility trends and changes of the SOA with photochemical aging are investigated and compared for all systems studied.
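    A partitioning coefficient of the kind such simultaneous gas/particle measurements yield is commonly defined in the Pankow absorptive-partitioning sense; the sketch below shows that arithmetic under that assumption (our notation and units, not necessarily those used in this study):

```python
def partitioning_coefficient(c_particle, c_gas, m_oa):
    """Pankow-type gas/particle partitioning coefficient K_p (m^3/ug):
    ratio of particle- to gas-phase concentration of a compound,
    normalized by the absorbing organic-aerosol mass m_oa (ug/m^3)."""
    return (c_particle / c_gas) / m_oa

def particle_fraction(kp, m_oa):
    """Equilibrium fraction of a compound found in the particle phase;
    follows directly from the definition of K_p above."""
    return kp * m_oa / (1.0 + kp * m_oa)
```

    In this framing, a compound is split roughly 50/50 between phases when K_p * m_oa = 1, which is why partitioning shifts toward the particle phase as OA mass loading grows.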

  7. A Survey of Current Temperature Dependent Elastic-Plastic-Creep Constitutive Laws for Applicability to Finite Element Computer Codes.

    DTIC Science & Technology

    1980-05-31

    …data that several phenomena which should be modelled by the constitutive theory are: (1) the Bauschinger effect for reverse loading, (2) the nonunique… evidence of its importance. Although significant work has been done to obtain working constitutive models, in many cases the theory has not been cast… nonlinear viscoelasticity theory for applicability to the conservation of momentum. Based on physical accuracy as well as computational efficiency the

  8. How to Implement Rigorous Computer Science Education in K-12 Schools? Some Answers and Many Questions

    ERIC Educational Resources Information Center

    Hubwieser, Peter; Armoni, Michal; Giannakos, Michail N.

    2015-01-01

    Aiming to collect various concepts, approaches, and strategies for improving computer science education in K-12 schools, we edited this second special issue of the "ACM TOCE" journal. Our intention was to collect a set of case studies from different countries that would describe all relevant aspects of specific implementations of…

  9. Discovering MicroRNA-Regulatory Modules in Multi-Dimensional Cancer Genomic Data: A Survey of Computational Methods

    PubMed Central

    Walsh, Christopher J.; Hu, Pingzhao; Batt, Jane; dos Santos, Claudia C.

    2016-01-01

    MicroRNAs (miRs) are small single-stranded noncoding RNA that function in RNA silencing and post-transcriptional regulation of gene expression. An increasing number of studies have shown that miRs play an important role in tumorigenesis, and understanding the regulatory mechanism of miRs in this gene regulatory network will help elucidate the complex biological processes at play during malignancy. Despite advances, determination of miR–target interactions (MTIs) and identification of functional modules composed of miRs and their specific targets remain a challenge. A large amount of data generated by high-throughput methods from various sources are available to investigate MTIs. The development of data-driven tools to harness these multi-dimensional data has resulted in significant progress over the past decade. In parallel, large-scale cancer genomic projects are allowing new insights into the commonalities and disparities of miR–target regulation across cancers. In the first half of this review, we explore methods for identification of pairwise MTIs, and in the second half, we explore computational tools for discovery of miR-regulatory modules in a cancer-specific and pan-cancer context. We highlight strengths and limitations of each of these tools as a practical guide for the computational biologists. PMID:27721651

  10. Recommendations for monitoring and evaluation of in-patient Computer-based Provider Order Entry systems: results of a Delphi survey.

    PubMed

    Sittig, Dean F; Campbell, Emily; Guappone, Ken; Dykstra, Richard; Ash, Joan S

    2007-10-11

    A survey of 20 clinical informaticists with experience in implementing Computer-based Provider Order Entry (CPOE) systems revealed the lack of easily accessible measurements of success. Using a Delphi approach, the authors, together with a group of CPOE experts, selected eight key CPOE-related measures to assess system availability, use, benefits, and e-iatrogenesis. We suggest that collecting these measures on a widespread, national basis would be wise stewardship and would result in tighter feedback about both clinician workflow and patient safety. Establishing reliable benchmarks against which new implementations and existing systems can be compared will enhance organizations' ability to effectively manage, and hence to realize the full benefits of, their CPOE implementations.

  11. Design of ET(B) receptor agonists: NMR spectroscopic and conformational studies of ET7-21[Leu7, Aib11, Cys(Acm)15].

    PubMed

    Hewage, Chandralal M; Jiang, Lu; Parkinson, John A; Ramage, Robert; Sadler, Ian H

    2002-03-01

    In a previous report we have shown that the endothelin-B receptor-selective linear endothelin peptide, ET-1[Cys (Acm)1,15, Ala3, Leu7, Aib11], folds into an alpha-helical conformation in a methanol-d3/water co-solvent [Hewage et al. (1998) FEBS Lett., 425, 234-238]. To study the requirements for the structure-activity relationships, truncated analogues of this peptide were subjected to further studies. Here we report the solution conformation of ET7-21[Leu7, Aib11, Cys(Acm)15], in a methanol-d3/water co-solvent at pH 3.6, by NMR spectroscopic and molecular modelling studies. Further truncation of this short peptide results in it displaying poor agonist activity. The modelled structure shows that the peptide folds into an alpha-helical conformation between residues Lys9-His16, whereas the C-terminus prefers no fixed conformation. This truncated linear endothelin analogue is pivotal for designing endothelin-B receptor agonists.

  12. Evidence for heterogeneity of astrocyte de-differentiation in vitro: astrocytes transform into intermediate precursor cells following induction of ACM from scratch-insulted astrocytes.

    PubMed

    Yang, Hao; Qian, Xin-Hong; Cong, Rui; Li, Jing-wen; Yao, Qin; Jiao, Xi-Ying; Ju, Gong; You, Si-Wei

    2010-04-01

    Our previous study definitively demonstrated that mature astrocytes can undergo a de-differentiation process and further transform into pluripotential neural stem cells (NSCs), which might well arise from the effect of diffusible factors released from scratch-insulted astrocytes. However, neurospheres passaged from a single neurosphere derived from de-differentiated astrocytes displayed completely distinct differentiation behaviors, namely heterogeneity of differentiation. The heterogeneity in cell differentiation has become a crucial but elusive issue. In this study, we show that purified astrocytes could de-differentiate into intermediate precursor cells (IPCs), which express the IPC markers NG2 and A2B5, when scratch-insulted astrocyte-conditioned medium (ACM) was added to the culture. Apart from the number of NG2(+) and A2B5(+) cells, the percentage of proliferative cells, as labeled with BrdU, progressively increased with prolonged culture over periods ranging from 1 to 10 days. Meanwhile, the protein level of A2B5 in cells also increased significantly. These results revealed that not all astrocytes de-differentiate fully into NSCs directly when induced by ACM; rather, they generate intermediate or more restricted precursor cells that might undergo progressive de-differentiation to generate NSCs.

  13. A survey of advancements in nucleic acid-based logic gates and computing for applications in biotechnology and biomedicine.

    PubMed

    Wu, Cuichen; Wan, Shuo; Hou, Weijia; Zhang, Liqin; Xu, Jiehua; Cui, Cheng; Wang, Yanyue; Hu, Jun; Tan, Weihong

    2015-03-04

    Nucleic acid-based logic devices were first introduced in 1994. Since then, science has seen the emergence of new logic systems for mimicking mathematical functions, diagnosing disease and even imitating biological systems. The unique features of nucleic acids, such as facile and high-throughput synthesis, Watson-Crick complementary base pairing, and predictable structures, together with the aid of programming design, have led to the widespread applications of nucleic acids (NA) for logic gate and computing in biotechnology and biomedicine. In this feature article, the development of in vitro NA logic systems will be discussed, as well as the expansion of such systems using various input molecules for potential cellular, or even in vivo, applications.

  14. A Survey of Advancements in Nucleic Acid-Based Logic Gates and Computing for Applications in Biotechnology and Biomedicine

    PubMed Central

    Wu, Cuichen; Wan, Shuo; Hou, Weijia; Zhang, Liqin; Xu, Jiehua; Cui, Cheng; Wang, Yanyue; Hu, Jun

    2015-01-01

    Nucleic acid-based logic devices were first introduced in 1994. Since then, science has seen the emergence of new logic systems for mimicking mathematical functions, diagnosing disease and even imitating biological systems. The unique features of nucleic acids, such as facile and high-throughput synthesis, Watson-Crick complementary base pairing, and predictable structures, together with the aid of programming design, have led to the widespread applications of nucleic acids (NA) for logic gating and computing in biotechnology and biomedicine. In this feature article, the development of in vitro NA logic systems will be discussed, as well as the expansion of such systems using various input molecules for potential cellular, or even in vivo, applications. PMID:25597946

  15. HIV-related risk behaviors among the general population: a survey using Audio Computer-Assisted Self-Interview in 3 cities in Vietnam.

    PubMed

    Vu, Lan T H; Nadol, Patrick; Le, Linh Cu

    2015-03-01

    This study used a confidential survey method, Audio Computer-Assisted Self-Interview (ACASI), to gather data about HIV-related risk knowledge and behaviors among the general population in Vietnam. The study sample included 1371 people aged 15 to 49 years in 3 cities: Hanoi, Da Nang, and Can Tho. Results indicated that 7% of participants had ever had nonconsensual sex, and 3.6% had ever had a one-night stand. The percentage of male participants who reported ever having had sex with sex workers was 9.6%, and 4.3% reported ever injecting drugs. The proportion of respondents who had ever been tested for HIV was 17.6%. The risk factors and attitudes reported in the survey indicate the importance of analyzing risk behaviors related to HIV infection among the general population. Young people, especially men in more urbanized settings, are engaging in risky behaviors and may act as a "bridge" for the transmission of HIV from high-risk groups to the general population in Vietnam.

  16. A survey of U.S.A. acute care hospitals' computer-based provider order entry system infusion levels.

    PubMed

    Sittig, Dean F; Guappone, Ken; Campbell, Emily M; Dykstra, Richard H; Ash, Joan S

    2007-01-01

    We developed and fielded a survey to help clinical information system designers, developers, and implementers better understand the infusion level, or the extent and sophistication of CPOE feature availability and use by clinicians within acute care hospitals across the United States of America. In the 176 responding hospitals, we found that CPOE had been in place a median of 5 years and that the median percentage of orders entered electronically was 90.5%. Greater than 96% of the sites used CPOE to enter pharmacy, laboratory and imaging orders; 82% were able to access all aspects of the clinical information system with a single sign-on; 86% of the respondents had order sets, drug-drug interaction warnings, and pop-up alerts even though nearly all hospitals were community hospitals with commercial systems; and 90% had a CPOE committee with a clinician representative in place. While CPOE has not been widely adopted after over 30 years of experimentation, there is still much that can be learned from this relatively small number of highly infused (with CPOE and clinical decision support) organizations.

  17. Surveying System

    NASA Technical Reports Server (NTRS)

    1988-01-01

    Sunrise Geodetic Surveys are setting up their equipment for a town survey. Their equipment differs from conventional surveying systems that employ transit, rod, and chain to measure angles and distances. They are using ISTAC Inc.'s Model 2002 positioning system, which offers fast, accurate surveying using signals from orbiting satellites. The special utility of the ISTAC Model 2002 is that it can provide positioning of the highest accuracy from Navstar PPS signals because it requires no knowledge of secret codes. It operates by comparing the frequency and time phase of a Navstar signal arriving at one ISTAC receiver with the reception of the same set of signals by another receiver. Data are computer processed and translated into three-dimensional position data: latitude, longitude, and elevation.

  18. Computer Programs for Obtaining and Analyzing Daily Mean Streamflow Data from the U.S. Geological Survey National Water Information System Web Site

    USGS Publications Warehouse

    Granato, Gregory E.

    2009-01-01

    Research Council, 2004). The USGS maintains the National Water Information System (NWIS), a distributed network of computers and file servers used to store and retrieve hydrologic data (Mathey, 1998; U.S. Geological Survey, 2008). NWISWeb is an online version of this database that includes water data from more than 24,000 streamflow-gaging stations throughout the United States (U.S. Geological Survey, 2002, 2008). Information from NWISWeb is commonly used to characterize streamflows at gaged sites and to help predict streamflows at ungaged sites. Five computer programs were developed for obtaining and analyzing streamflow from the National Water Information System (NWISWeb). The programs were developed as part of a study by the U.S. Geological Survey, in cooperation with the Federal Highway Administration, to develop a stochastic empirical loading and dilution model. The programs were developed because reliable, efficient, and repeatable methods are needed to access and process streamflow information and data. The first program is designed to facilitate the downloading and reformatting of NWISWeb streamflow data. The second program is designed to facilitate graphical analysis of streamflow data. The third program is designed to facilitate streamflow-record extension and augmentation to help develop long-term statistical estimates for sites with limited data. The fourth program is designed to facilitate statistical analysis of streamflow data. The fifth program is a preprocessor to create batch input files for the U.S. Environmental Protection Agency DFLOW3 program for calculating low-flow statistics. These computer programs were developed to facilitate the analysis of daily mean streamflow data for planning-level water-quality analyses but also are useful for many other applications pertaining to streamflow data and statistics. These programs and the associated documentation are included on the CD-ROM accompanying this report. 
This report and the appendixes on the
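    The kind of NWISWeb retrieval the first program automates can be illustrated against the USGS daily-values web service. The sketch below only constructs the request URL; the endpoint and the codes 00060 (discharge) and 00003 (daily mean) are standard USGS conventions, but the options Granato's programs actually use are not documented here.

```python
from urllib.parse import urlencode

def nwis_daily_url(site, start, end, parameter="00060", statistic="00003"):
    """Build a USGS NWIS daily-values request for one streamflow gage.
    parameter 00060 = discharge (cubic feet per second);
    statistic 00003 = daily mean value."""
    base = "https://waterservices.usgs.gov/nwis/dv/"
    query = urlencode({
        "format": "rdb",        # tab-delimited output, convenient for scripts
        "sites": site,          # USGS gaging-station number
        "startDT": start,       # ISO dates, e.g. "2020-01-01"
        "endDT": end,
        "parameterCd": parameter,
        "statCd": statistic,
    })
    return f"{base}?{query}"
```

    For example, `nwis_daily_url("01646500", "2020-01-01", "2020-12-31")` (a hypothetical station number used for illustration) yields a URL whose response can be parsed into the daily mean series the analysis programs consume.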

  19. Survey of computed tomography doses and establishment of national diagnostic reference levels in the Republic of Belarus.

    PubMed

    Kharuzhyk, S A; Matskevich, S A; Filjustin, A E; Bogushevich, E V; Ugolkova, S A

    2010-01-01

    Computed tomography dose index (CTDI) was measured on eight CT scanners at seven public hospitals in the Republic of Belarus. The effective dose was calculated using normalised values of effective dose per dose-length product (DLP) over various body regions. Considerable variations of the dose values were observed. Mean effective doses amounted to 1.4 +/- 0.4 mSv for brain, 2.6 +/- 1.0 mSv for neck, 6.9 +/- 2.2 mSv for thorax, 7.0 +/- 2.3 mSv for abdomen and 8.8 +/- 3.2 mSv for pelvis. Diagnostic reference levels (DRLs) were proposed by calculating the third quartiles of dose value distributions (body region/volume CTDI, mGy/DLP, mGy cm): brain/60/730, neck/55/640, thorax/20/500, abdomen/25/600 and pelvis/25/490. It is evident that the protocols need to be optimised on some of the CT scanners, in view of the fact that these are the first formulated DRLs for the Republic of Belarus.
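    The dose arithmetic in this survey follows the standard conversion E ≈ k × DLP with region-specific coefficients, and the proposed DRLs are third quartiles of the surveyed dose distributions. A sketch of that arithmetic (the k values below are the commonly cited adult coefficients from European guidance, assumed here for illustration, not necessarily those used in the study):

```python
import statistics

# Effective dose per unit DLP, in mSv per (mGy*cm); assumed adult coefficients.
K_PER_REGION = {"brain": 0.0021, "neck": 0.0059, "thorax": 0.014,
                "abdomen": 0.015, "pelvis": 0.015}

def effective_dose(dlp_mgy_cm, region):
    """Effective dose E (mSv) = k * DLP for a given body region."""
    return K_PER_REGION[region] * dlp_mgy_cm

def diagnostic_reference_level(dose_values):
    """DRL: third quartile (75th percentile) of the surveyed dose values."""
    q1, q2, q3 = statistics.quantiles(dose_values, n=4)
    return q3
```

    With the proposed thorax DRL of DLP = 500 mGy·cm, this conversion gives E = 0.014 × 500 = 7.0 mSv, consistent with the mean thoracic dose of 6.9 ± 2.2 mSv reported above.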

  20. Preconsult interactive computer-assisted client assessment survey for common mental disorders in a community health centre: a randomized controlled trial

    PubMed Central

    Ahmad, Farah; Lou, Wendy; Shakya, Yogendra; Ginsburg, Liane; Ng, Peggy T.; Rashid, Meb; Dinca-Panaitescu, Serban; Ledwos, Cliff; McKenzie, Kwame

    2017-01-01

    Background: Access disparities for mental health care exist for vulnerable ethnocultural and immigrant groups. Community health centres that serve these groups could be supported further by interactive, computer-based, self-assessments. Methods: An interactive computer-assisted client assessment survey (iCCAS) tool was developed for preconsult assessment of common mental disorders (using the Patient Health Questionnaire [PHQ-9], Generalized Anxiety Disorder 7-item [GAD-7] scale, Primary Care Post-traumatic Stress Disorder [PTSD-PC] screen and CAGE [concern/cut-down, anger, guilt and eye-opener] questionnaire), with point-of-care reports. The pilot randomized controlled trial recruited adult patients, fluent in English or Spanish, who were seeing a physician or nurse practitioner at the partnering community health centre in Toronto. Randomization into iCCAS or usual care was computer generated, and allocation was concealed in sequentially numbered, opaque envelopes that were opened after consent. The objectives were to examine the interventions' efficacy in improving mental health discussion (primary) and symptom detection (secondary). Data were collected by exit survey and chart review. Results: Of the 1248 patients assessed, 190 were eligible for participation. Of these, 148 were randomly assigned (response rate 78%). The iCCAS (n = 75) and usual care (n = 72) groups were similar in sociodemographics; 98% were immigrants, and 68% were women. Mental health discussion occurred for 58.7% of patients in the iCCAS group and 40.3% in the usual care group (p ≤ 0.05). The effect remained significant while controlling for potential covariates (language, sex, education, employment) in generalized linear mixed model (GLMM; adjusted odds ratio [OR] 2.2; 95% confidence interval [CI] 1.1-4.5). Mental health symptom detection occurred for 38.7% of patients in the iCCAS group and 27.8% in the usual care group (p > 0.05). The effect was not significant beyond potential

  1. Potential and limitations of X-Ray micro-computed tomography in arthropod neuroanatomy: A methodological and comparative survey

    PubMed Central

    Sombke, Andy; Lipke, Elisabeth; Michalik, Peter; Uhl, Gabriele; Harzsch, Steffen

    2015-01-01

    Classical histology or immunohistochemistry combined with fluorescence or confocal laser scanning microscopy are common techniques in arthropod neuroanatomy, and these methods often require time-consuming and difficult dissections and sample preparations. Moreover, these methods are prone to artifacts due to compression and distortion of tissues, which often result in information loss and especially affect the spatial relationships of the examined parts of the nervous system in their natural anatomical context. Noninvasive approaches such as X-ray micro-computed tomography (micro-CT) can overcome such limitations and have been shown to be a valuable tool for understanding and visualizing internal anatomy and structural complexity. Nevertheless, knowledge about the potential of this method for analyzing the anatomy and organization of nervous systems, especially of taxa with smaller body size (e.g., many arthropods), is limited. This study set out to analyze the brains of selected arthropods with micro-CT, and to compare these results with available histological and immunohistochemical data. Specifically, we explored the influence of different sample preparation procedures. Our study shows that micro-CT is highly suitable for analyzing arthropod neuroarchitecture in situ and allows specific neuropils to be distinguished within the brain to extract quantitative data such as neuropil volumes. Moreover, data acquisition is considerably faster compared with many classical histological techniques. Thus, we conclude that micro-CT is highly suitable for targeting neuroanatomy, as it reduces the risk of artifacts and is faster than classical techniques. J. Comp. Neurol. 523:1281–1295, 2015. © 2015 Wiley Periodicals, Inc. PMID:25728683

  2. Enterococcus faecium biofilm formation: identification of major autolysin AtlAEfm, associated Acm surface localization, and AtlAEfm-independent extracellular DNA Release.

    PubMed

    Paganelli, Fernanda L; Willems, Rob J L; Jansen, Pamela; Hendrickx, Antoni; Zhang, Xinglin; Bonten, Marc J M; Leavis, Helen L

    2013-04-16

    Enterococcus faecium is an important multidrug-resistant nosocomial pathogen causing biofilm-mediated infections in patients with medical devices. Insight into E. faecium biofilm pathogenesis is pivotal for the development of new strategies to prevent and treat these infections. In several bacteria, a major autolysin is essential for extracellular DNA (eDNA) release in the biofilm matrix, contributing to biofilm attachment and stability. In this study, we identified and functionally characterized the major autolysin of E. faecium E1162 by a bioinformatic genome screen followed by insertional gene disruption of six putative autolysin genes. Insertional inactivation of locus tag EfmE1162_2692 resulted in resistance to lysis, reduced eDNA release, deficient cell attachment, decreased biofilm, decreased cell wall hydrolysis, and significant chaining compared to that of the wild type. Therefore, locus tag EfmE1162_2692 was considered the major autolysin in E. faecium and renamed atlAEfm. In addition, AtlAEfm was implicated in cell surface exposure of Acm, a virulence factor in E. faecium, and thereby facilitates binding to collagen types I and IV. This is a novel feature of enterococcal autolysins not described previously. Furthermore, we identified (and localized) autolysin-independent DNA release in E. faecium that contributes to cell-cell interactions in the atlAEfm mutant and is important for cell separation. In conclusion, AtlAEfm is the major autolysin in E. faecium and contributes to biofilm stability and Acm localization, making AtlAEfm a promising target for treatment of E. faecium biofilm-mediated infections. IMPORTANCE Nosocomial infections caused by Enterococcus faecium have rapidly increased, and treatment options have become more limited. This is due not only to increasing resistance to antibiotics but also to biofilm-associated infections. DNA is released in biofilm matrix via cell lysis, caused by autolysin, and acts as a matrix stabilizer. 
In this study

  3. Enhanced delegated computing using coherence

    NASA Astrophysics Data System (ADS)

    Barz, Stefanie; Dunjko, Vedran; Schlederer, Florian; Moore, Merritt; Kashefi, Elham; Walmsley, Ian A.

    2016-03-01

    A longstanding question is whether it is possible to delegate computational tasks securely—such that neither the computation nor the data is revealed to the server. Recently, both a classical and a quantum solution to this problem were found [C. Gentry, in Proceedings of the 41st Annual ACM Symposium on the Theory of Computing (Association for Computing Machinery, New York, 2009), pp. 167-178; A. Broadbent, J. Fitzsimons, and E. Kashefi, in Proceedings of the 50th Annual Symposium on Foundations of Computer Science (IEEE Computer Society, Los Alamitos, CA, 2009), pp. 517-526]. Here, we study the first step towards the interplay between classical and quantum approaches and show how coherence can be used as a tool for secure delegated classical computation. We show that a client with limited computational capacity—restricted to an XOR gate—can perform universal classical computation by manipulating information carriers that may occupy superpositions of two states. Using single photonic qubits or coherent light, we experimentally implement secure delegated classical computations between an independent client and a server, which are installed in two different laboratories and separated by 50 m . The server has access to the light sources and measurement devices, whereas the client may use only a restricted set of passive optical devices to manipulate the information-carrying light beams. Thus, our work highlights how minimal quantum and classical resources can be combined and exploited for classical computing.
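    A purely classical intuition for the XOR-restricted client: XOR masking (a one-time pad) hides the input perfectly from the server, and the client can unmask the server's answer with XOR alone, but only for functions that are linear over GF(2); the paper's point is that quantum coherence lifts this restriction toward universal computation. An illustrative classical sketch (ours, not the experimental protocol):

```python
import secrets

def delegate_linear(bits, f_linear):
    """Client hides each input bit with a one-time pad; the server evaluates
    a GF(2)-linear function on the masked bits; the client removes the pad.
    Correct because f(m XOR k) = f(m) XOR f(k) whenever f is linear."""
    pad = [secrets.randbelow(2) for _ in bits]       # client's secret key
    masked = [b ^ k for b, k in zip(bits, pad)]      # all the server ever sees
    server_result = f_linear(masked)                 # computed on masked data
    return server_result ^ f_linear(pad)             # client unmasks with XOR

def parity(bits):
    """An example GF(2)-linear function: XOR of all input bits."""
    return sum(bits) % 2
```

    For a nonlinear function such as AND, the final XOR correction fails, which is exactly the gap that the coherent information carriers in the experiment are used to close.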

  4. Papers Presented at the ACM SIGCSE Technical Symposium on Academic Education in Computer Science [held in Houston, Texas, November 16, 1970].

    ERIC Educational Resources Information Center

    Aiken, Robert M., Ed.

    1970-01-01

    The papers given at this symposium were selected for their description of how specific problems were tackled, and with what success, as opposed to proposals unsupported by experience. The goal was to permit the audience to profit from the trials (and errors) of others. The eighteen papers presented are: "Business and the University Computer…

  5. A Survey of Parallel Computing

    DTIC Science & Technology

    1988-07-01

    CENTERS 153 John von Neumann Center (JVNC) near Princeton, New Jersey. Each center is equipped with state-of-the-art supercomputing equipment and a staff to...offers state-of-the-art, networked workstations for interactive work on the Cray X-MP and for other research purposes such as analyzing results...corporations. Designated employees from participating corporations receive training tailored to their needs, access to state-of-the-art workstations and

  6. Chemical Laser Computer Code Survey,

    DTIC Science & Technology

    1980-12-01

    DOCUMENTATION: Resonator Geometry Synthesis Code Requirement (V. L. Gamiz); Incorporate General Resonator into Ray Trace Code (W. H. Southwell)... Synthesis Code Development (L. R. Stidham) CATEGORY ATTITUDE OPTICS KINETICS GASDYNAMICS None * None * None LEVEL Simple Fabry Perot Simple Saturated Gain... Synthesis Code Requirement (V. L. Gamiz); Incorporate General Resonator into Ray Trace Code (W. H. Southwell); Space Optimization Algorithms and Equations (W

  7. Prior to the Oral Therapy, What Do We Know About HCV-4 in Egypt: A Randomized Survey of Prevalence and Risks Using Data Mining Computed Analysis

    PubMed Central

    Abd Elrazek, Abd Elrazek; Bilasy, Shymaa E.; Elbanna, Abduh E. M.; Elsherif, Abd Elhalim A.

    2014-01-01

    Abstract Hepatitis C virus (HCV) affects over 180 million people worldwide and is the leading cause of chronic liver disease and hepatocellular carcinoma. HCV is classified into seven major genotypes and a series of subtypes. In general, HCV genotype 4 (HCV-4) is common in the Middle East and Africa, where it is responsible for more than 80% of HCV infections. Although HCV-4 accounts for approximately 20% of the 180 million cases of chronic hepatitis C worldwide, it has not yet been a major subject of research. The aim of the current study is to survey the morbidities and disease complications among the Egyptian population infected with HCV-4, using mainly advanced data-mining computational methods complemented by statistical analysis. Six thousand six hundred sixty subjects, aged between 17 and 58 years, from different Egyptian Governorates were screened for HCV infection by ELISA and qualitative PCR. HCV-positive patients were further investigated for the incidence of liver cirrhosis and esophageal varices. The obtained data were analyzed by a data mining approach. Among the 6660 subjects enrolled in this survey, 1018 patients (15.28%) were HCV-positive. The proportion of infected males was significantly higher than that of females: 61.6% versus 38.4% (P = 0.0052). Around two-thirds of infected patients (635/1018; 62.4%) presented with liver cirrhosis. Additionally, approximately half of the cirrhotic patients (301/635; 47.4%) showed degrees of large esophageal varices (LEVs), with higher variceal grades observed in males. Age at esophageal variceal development was 47 ± 1 years. Data mining analysis yielded esophageal wall thickness (>6.5 mm), determined by conventional U/S, as the only independent predictor for esophageal varices. This study emphasizes the high prevalence of HCV infection among the Egyptian population, in particular among males. Egyptians with HCV-4 infection are at a higher risk of developing liver cirrhosis and esophageal varices. Data mining, a new

  8. A survey of surveys

    SciTech Connect

    Kent, S.M.

    1994-11-01

    A new era for the field of Galactic structure is about to be opened with the advent of wide-area digital sky surveys. In this article, the author reviews the status and prospects for research for three new ground-based surveys: the Sloan Digital Sky Survey (SDSS), the Deep Near-Infrared Survey of the Southern Sky (DENIS), and the Two Micron All Sky Survey (2MASS). These surveys will permit detailed studies of Galactic structure and stellar populations in the Galaxy with unprecedented detail. Extracting the information, however, will be challenging.

  9. Do Home Computers Improve Educational Outcomes? Evidence from Matched Current Population Surveys and the National Longitudinal Survey of Youth 1997. National Poverty Center Working Paper Series #06-01

    ERIC Educational Resources Information Center

    Beltran, Daniel O.; Das, Kuntal K.; Fairlie, Robert W.

    2006-01-01

    Nearly twenty million children in the United States do not have computers in their homes. The role of "home" computers in the educational process, however, has drawn very little attention in the previous literature. We use panel data from the two main U.S. datasets that include recent information on computer ownership among children--the…

  10. Surveying Future Surveys

    NASA Astrophysics Data System (ADS)

    Carlstrom, John E.

    2016-06-01

    The now standard model of cosmology has been tested and refined by the analysis of increasingly sensitive, large astronomical surveys, especially with statistically significant millimeter-wave surveys of the cosmic microwave background and optical surveys of the distribution of galaxies. This talk will offer a glimpse of the future, which promises an acceleration of this trend with cosmological information coming from new surveys across the electromagnetic spectrum as well as particles and even gravitational waves.

  11. Construction of improved temperature-sensitive and mobilizable vectors and their use for constructing mutations in the adhesin-encoding acm gene of poorly transformable clinical Enterococcus faecium strains.

    PubMed

    Nallapareddy, Sreedhar R; Singh, Kavindra V; Murray, Barbara E

    2006-01-01

    Inactivation by allelic exchange in clinical isolates of the emerging nosocomial pathogen Enterococcus faecium has been hindered by lack of efficient tools, and, in this study, transformation of clinical isolates was found to be particularly problematic. For this reason, a vector for allelic replacement (pTEX5500ts) was constructed that includes (i) the pWV01-based gram-positive repAts replication region, which is known to confer a high degree of temperature intolerance, (ii) Escherichia coli oriR from pUC18, (iii) two extended multiple-cloning sites located upstream and downstream of one of the marker genes for efficient cloning of flanking regions for double-crossover mutagenesis, (iv) transcriptional terminator sites to terminate undesired readthrough, and (v) a synthetic extended promoter region containing the cat gene for allelic exchange and a high-level gentamicin resistance gene, aph(2'')-Id, to distinguish double-crossover recombination, both of which are functional in gram-positive and gram-negative backgrounds. To demonstrate the functionality of this vector, the vector was used to construct an acm (encoding an adhesin to collagen from E. faecium) deletion mutant of a poorly transformable multidrug-resistant E. faecium endocarditis isolate, TX0082. The acm-deleted strain, TX6051 (TX0082Deltaacm), was shown to lack Acm on its surface, which resulted in the abolishment of the collagen adherence phenotype observed in TX0082. A mobilizable derivative (pTEX5501ts) that contains oriT of Tn916 to facilitate conjugative transfer from the transformable E. faecalis strain JH2Sm::Tn916 to E. faecium was also constructed. Using this vector, the acm gene of a nonelectroporable E. faecium wound isolate was successfully interrupted. Thus, pTEX5500ts and its mobilizable derivative demonstrated their roles as important tools by helping to create the first reported allelic replacement in E. 
faecium; the constructed acm deletion mutant will be useful for assessing the

  12. Autonomic Computing

    DTIC Science & Technology

    2006-04-01

    L. L.; Goldszmidt, G. S.; Harper, R. E.; Krishnakumar, S. M.; Pruett, G.; & Yassur, B. A. "Management of Application Complexes in Multitier...03] Agarwal, M.; Bhat, V.; Liu, H.; Matossian, V.; Putty, V.; Schmidt, C.; Zhang, G.; Zhen, L.; Parashar, M.; Rutgers, Khargharia, B.; & Hariri, S...New York, NY: ACM Press, 2005. [Bantz 03] Bantz, D. F.; Bisdikian, C.; Challener, D.; Karidis, J. P.; Mastrianni, S.; Mohindra, A.; Shea, D. G

  13. Digital video delivery for a digital library in computer science

    NASA Astrophysics Data System (ADS)

    Fox, Edward A.; Abdulla, Ghaleb

    1994-04-01

    With support from four NSF awards we aim to develop a prototype digital library in computer science and apply it to improve undergraduate education. First, Project Envision, 'A User-Centered Database from the Computer Science Literature,' 1991-94, deals with translation, coding standards including SGML, retrieval/previewing/presentation/browsing/linking, human-computer interaction, and construction of a partial archive using text and multimedia materials provided by ACM. Second, 'Interactive Learning with a Digital Library in Computer Science,' 1993-96, supported by NSF and ACM with additional assistance from other publishers, focuses on improving learning through delivery of materials from the archive. Third, 'Networked Multimedia File System with HyTime,' funded by NSF through the SUCCEED coalition, considers networking support for distributed multimedia applications and the use of HyTime for description of such applications. Fourth, equipment support comes from the Information Access Laboratory allotment of the 'Interactive Accessibility: Breaking Barriers to the Power of Computing' grant funded by NSF for 1993-98. In this paper we report on plans and work with digital video relating to these projects. In particular we focus on our analysis of the requirements for a multimedia digital library in computer science and our experience with MPEG as it applies to that library.

  14. Testing the abundant center model using range-wide demographic surveys of two coastal dune plants.

    PubMed

    Samis, Karen E; Eckert, Christopher G

    2007-07-01

    It is widely accepted that species are most abundant at the center of their geographic ranges and become progressively rarer toward range limits. Although the abundant center model (ACM) has rarely been tested with range-wide surveys, it influences much thinking about the ecology and evolution of species' distributions. We tested ACM predictions using two unrelated but ecologically similar plants, Camissonia cheiranthifolia and Abronia umbellata. We intensively sampled both throughout their one-dimensional distributions within the Pacific coastal dunes of North America, from northern Baja California, Mexico, to southern Oregon, USA. Data from > 1100 herbarium specimens indicated that these limits have been stable for at least the last 100 years. Range-wide field surveys detected C. cheiranthifolia at 87% of 124 sites and A. umbellata at 54% of 113 sites, but site occupancy did not decline significantly toward range limits for either species. Permutation analysis did not detect a significant fit of geographical variation in local density to the ACM. Mean density did not correlate negatively with mean individual performance (plant size or number of seeds/plant), probably because both species occur at low densities. Although size and seeds per plant varied widely, central populations tended to have the highest values for size only. For C. cheiranthifolia, we observed asymmetry in the pattern of variation between the northern and southern halves of the range consistent with the long-standing prediction that range limits are imposed by different ecological factors in different parts of the geographical distribution. However, these asymmetries were difficult to interpret and likely reflect evolutionary differentiation as well as plastic responses to ecological variation. Both density and seeds per plant contributed to variation in seed production per unit area. In C. cheiranthifolia only, sites with highest seed production tended to occur at the range center, as

  15. Computer technology forecasting at the National Laboratories

    SciTech Connect

    Peskin, A M

    1980-01-01

    The DOE Office of ADP Management organized a group of scientists and computer professionals, mostly from their own national laboratories, to prepare an annually updated technology forecast to accompany the Department's five-year ADP Plan. The activities of the task force were originally reported in an informal presentation made at the ACM Conference in 1978. This presentation represents an update of that report. It also deals with the process of applying the results obtained at a particular computing center, Brookhaven National Laboratory. Computer technology forecasting is a difficult and hazardous endeavor, but it can reap considerable advantage. The forecast performed on an industry-wide basis can be applied to the particular needs of a given installation, and thus give installation managers considerable guidance in planning. A beneficial side effect of this process is that it forces installation managers, who might otherwise tend to preoccupy themselves with immediate problems, to focus on longer term goals and means to their ends. (RWR)

  16. A regional land use survey based on remote sensing and other data: A report on a LANDSAT and computer mapping project, volume 2

    NASA Technical Reports Server (NTRS)

    Nez, G. (Principal Investigator); Mutter, D.

    1977-01-01

    The author has identified the following significant results. The project mapped land use/cover classifications from LANDSAT computer compatible tape data and combined those results with other multisource data via computer mapping/compositing techniques to analyze various land use planning/natural resource management problems. Data were analyzed on 1:24,000 scale maps at 1.1 acre resolution. LANDSAT analysis software and linkages with other computer mapping software were developed. Significant results were also achieved in training, communication, and identification of needs for developing the LANDSAT/computer mapping technologies into operational tools for use by decision makers.

  17. A Computational Wireless Network Backplane: Performance in a Distributed Speaker Identification Application Postprint

    DTIC Science & Technology

    2008-12-01

    traffic patterns are intense but constrained to a local area. Examples include peer-to-peer applications or sensor data processing in the region. In such...vol. 30, no. 4, pp. 68–74, 1997. [7] J. Dean and S. Ghemawat, "MapReduce: simplified data processing on large clusters," Commun. ACM, vol. 51, no. 1...DWARF, a general distributed application execution framework for wireless ad-hoc networks which dynamically allocates computation resources and manages

  18. The Challenge of Computers.

    ERIC Educational Resources Information Center

    Leger, Guy

    Computers may change teachers' lifestyles, teaching styles, and perhaps even their personal values. A brief survey of the history of computers demonstrates the incredible pace at which computer technology is moving ahead. The cost and size of microchips will continue to decline dramatically over the next 20 years, while the capability and variety…

  19. Quantum computing with trapped ions

    SciTech Connect

    Hughes, R.J.

    1998-01-01

    The significance of quantum computation for cryptography is discussed. Following a brief survey of the requirements for quantum computational hardware, an overview of the ion trap quantum computation project at Los Alamos is presented. The physical limitations to quantum computation with trapped ions are analyzed and an assessment of the computational potential of the technology is made.

  20. QADATA user's manual; an interactive computer program for the retrieval and analysis of the results from the external blind sample quality- assurance project of the U.S. Geological Survey

    USGS Publications Warehouse

    Lucey, K.J.

    1990-01-01

    The U.S. Geological Survey conducts an external blind sample quality-assurance project for its National Water Quality Laboratory in Denver, Colorado, based on the analysis of reference water samples. Reference samples containing selected inorganic and nutrient constituents are disguised as environmental samples at the Survey's office in Ocala, Florida, and are sent periodically through other Survey offices to the laboratory. The results of this blind sample project indicate the quality of analytical data produced by the laboratory. This report provides instructions on the use of QADATA, an interactive, menu-driven program that allows users to retrieve the results of the blind sample quality-assurance project. The QADATA program, which is available on the U.S. Geological Survey's national computer network, accesses a blind sample database that contains more than 50,000 determinations from the last five water years for approximately 40 constituents at various concentrations. The data can be retrieved from the database for any user-defined time period and for any or all available constituents. After the user defines the retrieval, the program prepares statistical tables, control charts, and precision plots and generates a report which can be transferred to the user's office through the computer network. A discussion of the interpretation of the program output is also included. This quality-assurance information will permit users to document the quality of the analytical results received from the laboratory. The blind sample data are entered into the database within weeks after being produced by the laboratory and can be retrieved to meet the needs of specific projects or programs. (USGS)

  1. 'Towers in the Tempest' Computer Animation Submission

    NASA Technical Reports Server (NTRS)

    Shirah, Greg

    2008-01-01

    The following describes a computer animation that has been submitted to the ACM/SIGGRAPH 2008 computer graphics conference: 'Towers in the Tempest' clearly communicates recent scientific research into how hurricanes intensify. This intensification can be caused by a phenomenon called a 'hot tower.' For the first time, research meteorologists have run complex atmospheric simulations at a very fine temporal resolution of 3 minutes. Combining this simulation data with satellite observations enables detailed study of 'hot towers.' The science of 'hot towers' is described using satellite observation data, conceptual illustrations, and volumetric atmospheric simulation data. The movie starts by showing a 'hot tower' observed in three-dimensional precipitation radar data of Hurricane Bonnie from NASA's Tropical Rainfall Measuring Mission (TRMM) spacecraft. Next, the dynamics of a hurricane and the formation of 'hot towers' are briefly explained using conceptual illustrations. Finally, volumetric cloud, wind, and vorticity data from a supercomputer simulation of Hurricane Bonnie are shown using volume techniques such as ray marching.
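    The volume ray marching mentioned above can be sketched as a front-to-back compositing loop along one ray through the density field. This is a generic illustration of the technique, not code from the animation; the constant white emission and the `absorption` coefficient are assumptions made for the example.

    ```python
    import math

    def ray_march(density_samples, step, absorption):
        """Front-to-back compositing of density samples along a single ray.

        Returns (accumulated_color, remaining_transmittance) for a white
        emissive medium (assumed for simplicity)."""
        color, transmittance = 0.0, 1.0
        for d in density_samples:
            # opacity contributed by this step (Beer-Lambert absorption)
            alpha = 1.0 - math.exp(-absorption * d * step)
            color += transmittance * alpha   # emission weighted by visibility
            transmittance *= 1.0 - alpha     # light remaining behind this step
        return color, transmittance
    ```

    An empty ray (all densities zero) returns color 0.0 and transmittance 1.0; a dense ray saturates toward color 1.0 as the transmittance decays to zero.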

  2. Reliability automation tool (RAT) for fault tolerance computation

    NASA Astrophysics Data System (ADS)

    Singh, N. S. S.; Hamid, N. H.; Asirvadam, V. S.

    2012-09-01

    As CMOS transistors are reduced in size, circuits built from these nano-scale transistors naturally become less reliable. This reduction in reliability, a key measure of circuit performance, has introduced many challenges in designing modern logic integrated circuits, making reliability modeling an increasingly important consideration and driving a need to compute reliability measures for nano-scale circuits. This paper describes the development of a reliability automation tool (RAT) for circuit reliability computation. The tool is developed in the Matlab programming language and is based on the reliability evaluation model called the Probabilistic Transfer Matrix (PTM). RAT allows users to significantly speed up the reliability assessment of nano-scale circuits. Users provide a circuit netlist as input to RAT for its reliability computation. The netlist describes the circuit in terms of a Gate Profile Matrix (GPM), an Adjacency Computation Matrix (ACM), and a Grid Layout Matrix (GLM), which specify, respectively, the types of logic gates, the interconnections between these gates, and their layout in a given circuit design. Here, the reliability assessment by RAT is carried out on a Full Adder circuit as the benchmark test circuit.
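    A minimal sketch of the PTM idea: each gate is a matrix of output probabilities per input combination, and circuit reliability is the probability that the faulty circuit matches the ideal one, averaged over inputs. The two-NAND topology and the symmetric output-flip error model below are assumptions for illustration, not details of RAT or its Full Adder benchmark.

    ```python
    import numpy as np

    def gate_ptm(truth_table, p_err):
        """Probabilistic transfer matrix for a one-output gate.
        Row i = input combination i; columns = P(output=0), P(output=1)."""
        ptm = np.zeros((len(truth_table), 2))
        for i, out in enumerate(truth_table):
            ptm[i, out] = 1.0 - p_err      # correct output
            ptm[i, 1 - out] = p_err        # flipped output
        return ptm

    NAND = [1, 1, 1, 0]  # outputs for inputs 00, 01, 10, 11

    def circuit_reliability(p_err):
        """Reliability of out = NAND(NAND(a, b), c), averaged over all inputs."""
        faulty, ideal = gate_ptm(NAND, p_err), gate_ptm(NAND, 0.0)
        total = 0.0
        for a in (0, 1):
            for b in (0, 1):
                for c in (0, 1):
                    d1, i1 = faulty[2*a + b], ideal[2*a + b]  # first-gate dists
                    # propagate the distribution through the second gate (x, c)
                    d2 = d1[0] * faulty[c] + d1[1] * faulty[2 + c]
                    i2 = i1[0] * ideal[c] + i1[1] * ideal[2 + c]
                    total += d2[int(np.argmax(i2))]  # P(matches ideal output)
        return total / 8.0
    ```

    With no gate errors the reliability is exactly 1; it decreases as the per-gate error probability grows, which is the quantity a PTM-based tool evaluates for larger netlists.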

  3. Freedom from the Tyranny of the Campus Main-Frame: Handling the Statistical Analysis of a 10-year Survey Research Study with a Personal Computer.

    ERIC Educational Resources Information Center

    Hickman, Linda J.

    Technological advances in microcomputer hardware and software, including size of memory and increasingly more sophisticated statistical application packages, create a new era in educational research. The alternative to costly main-frame computer data processing and statistical analysis is explored in this paper. In the first section, typical…

  4. A Survey to Determine the Knowledge and Skills Needed by Clerical Workers in First-Level Entry Occupations in Digital Computer Installations.

    ERIC Educational Resources Information Center

    Jones, Adaline Dorothy Seitz

    The purposes of this study were to determine the occupational opportunities for which high school graduates can qualify in the field of digital computer installations, the knowledge and skills needed for employment, the training needed, the pattern of advancement, the effect of automatic coding, and significant recent developments. Sixty-nine…

  5. PEP surveying procedures and equipment

    SciTech Connect

    Linker, F.

    1982-06-01

    The PEP Survey and Alignment System, which employs both laser-based and optical survey methods, is described. The laser is operated in conjunction with the Tektronix 4051 computer and surveying instruments such as ARM and SAM, a system designed to automate data input, reduction, and production of alignment instructions. The laser system is used when surveying ring quadrupoles, main bend magnets, and sextupoles, and is optional when surveying RF cavities and insertion quadrupoles. Optical methods usually require that data be manually entered into the computer for alignment, but in some cases an element can be aligned using nominal values of fiducial locations without use of the computer. Optical surveying is used in the alignment of NIT and SIT, low-field bend magnets, wigglers, RF cavities, and insertion quadrupoles.

  6. The stabilization effect of dielectric constant and acidic amino acids on arginine-arginine (Arg-Arg) pairings: database survey and computational studies.

    PubMed

    Zhang, Zhengyan; Xu, Zhijian; Yang, Zhuo; Liu, Yingtao; Wang, Jin'an; Shao, Qiang; Li, Shujin; Lu, Yunxiang; Zhu, Weiliang

    2013-05-02

    Database survey in this study revealed that about one-third of the protein structures deposited in the Protein Data Bank (PDB) contain arginine-arginine (Arg-Arg) pairing with a carbon···carbon (CZ···CZ) interaction distance less than 5 Å. All the Arg-Arg pairings were found to bury in a polar environment composed of acidic residues, water molecules, and strong polarizable or negatively charged moieties from binding site or bound ligand. Most of the Arg-Arg pairings are solvent exposed and 68.3% Arg-Arg pairings are stabilized by acidic residues, forming Arg-Arg-Asp/Glu clusters. Density functional theory (DFT) was then employed to study the effect of environment on the pairing structures. It was revealed that Arg-Arg pairings become thermodynamically stable (about -1 kcal/mol) as the dielectric constant increases to 46.8 (DMSO), in good agreement with the results of the PDB survey. DFT calculations also demonstrated that perpendicular Arg-Arg pairing structures are favorable in low dielectric constant environment, while in high dielectric constant environment parallel structures are favorable. Additionally, the acidic residues can stabilize the Arg-Arg pairing structures to a large degree. Energy decomposition analysis of Arg-Arg pairings and Arg-Arg-Asp/Glu clusters showed that both solvation and electrostatic energies contribute significantly to their stability. The results reported herein should be very helpful for understanding Arg-Arg pairing and its application in drug design.

  7. Can Compute, Won't Compute: Women's Participation in the Culture of Computing.

    ERIC Educational Resources Information Center

    Wilson, Fiona

    2003-01-01

    Surveys of 130 psychology students and 52 computer science students (20 of the latter were interviewed) indicated that more males read computer magazines and were confident in computer use. Many did not perceive an equity problem. Men seemed to feel the equity situation is improving. Some felt that women do not enjoy computing as much as men and…

  8. Computer vision

    NASA Technical Reports Server (NTRS)

    Gennery, D.; Cunningham, R.; Saund, E.; High, J.; Ruoff, C.

    1981-01-01

    The field of computer vision is surveyed and assessed, key research issues are identified, and possibilities for a future vision system are discussed. The problems of descriptions of two and three dimensional worlds are discussed. The representation of such features as texture, edges, curves, and corners are detailed. Recognition methods are described in which cross correlation coefficients are maximized or numerical values for a set of features are measured. Object tracking is discussed in terms of the robust matching algorithms that must be devised. Stereo vision, camera control and calibration, and the hardware and systems architecture are discussed.

  9. Survey of digital filtering

    NASA Technical Reports Server (NTRS)

    Nagle, H. T., Jr.

    1972-01-01

    A three part survey is made of the state-of-the-art in digital filtering. Part one presents background material including sampled data transformations and the discrete Fourier transform. Part two, digital filter theory, gives an in-depth coverage of filter categories, transfer function synthesis, quantization and other nonlinear errors, filter structures and computer aided design. Part three presents hardware mechanization techniques. Implementations by general purpose, mini-, and special-purpose computers are presented.
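    The filter structures and transfer-function synthesis topics this survey covers reduce in practice to a difference equation. The direct-form implementation below is a generic sketch of that idea, not code from the survey; the coefficient conventions (`b` feedforward, `a` feedback with `a[0]` normalizing) are the usual ones, assumed here.

    ```python
    def iir_filter(b, a, x):
        """Apply the difference equation
            a[0]*y[n] = sum_k b[k]*x[n-k] - sum_{k>=1} a[k]*y[n-k]
        directly (direct-form I). An FIR filter is the special case a = [1.0]."""
        y = []
        for n in range(len(x)):
            acc = sum(b[k] * x[n - k] for k in range(len(b)) if n - k >= 0)
            acc -= sum(a[k] * y[n - k] for k in range(1, len(a)) if n - k >= 0)
            y.append(acc / a[0])
        return y
    ```

    For example, `iir_filter([0.5, 0.5], [1.0], x)` is a two-tap moving average, while `iir_filter([0.5], [1.0, -0.5], x)` is a one-pole recursive smoother whose impulse response decays by half each sample.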

  10. Results of the deepest all-sky survey for continuous gravitational waves on LIGO S6 data running on the Einstein@Home volunteer distributed computing project

    NASA Astrophysics Data System (ADS)

    Abbott, B. P.; Abbott, R.; Abbott, T. D.; Abernathy, M. R.; Acernese, F.; Ackley, K.; Adams, C.; Adams, T.; Addesso, P.; Adhikari, R. X.; Adya, V. B.; Affeldt, C.; Agathos, M.; Agatsuma, K.; Aggarwal, N.; Aguiar, O. D.; Aiello, L.; Ain, A.; Allen, B.; Allocca, A.; Altin, P. A.; Anderson, S. B.; Anderson, W. G.; Arai, K.; Araya, M. C.; Arceneaux, C. C.; Areeda, J. S.; Arnaud, N.; Arun, K. G.; Ascenzi, S.; Ashton, G.; Ast, M.; Aston, S. M.; Astone, P.; Aufmuth, P.; Aulbert, C.; Babak, S.; Bacon, P.; Bader, M. K. M.; Baker, P. T.; Baldaccini, F.; Ballardin, G.; Ballmer, S. W.; Barayoga, J. C.; Barclay, S. E.; Barish, B. C.; Barker, D.; Barone, F.; Barr, B.; Barsotti, L.; Barsuglia, M.; Barta, D.; Bartlett, J.; Bartos, I.; Bassiri, R.; Basti, A.; Batch, J. C.; Baune, C.; Bavigadda, V.; Bazzan, M.; Bejger, M.; Bell, A. S.; Berger, B. K.; Bergmann, G.; Berry, C. P. L.; Bersanetti, D.; Bertolini, A.; Betzwieser, J.; Bhagwat, S.; Bhandare, R.; Bilenko, I. A.; Billingsley, G.; Birch, J.; Birney, R.; Biscans, S.; Bisht, A.; Bitossi, M.; Biwer, C.; Bizouard, M. A.; Blackburn, J. K.; Blair, C. D.; Blair, D. G.; Blair, R. M.; Bloemen, S.; Bock, O.; Boer, M.; Bogaert, G.; Bogan, C.; Bohe, A.; Bond, C.; Bondu, F.; Bonnand, R.; Boom, B. A.; Bork, R.; Boschi, V.; Bose, S.; Bouffanais, Y.; Bozzi, A.; Bradaschia, C.; Brady, P. R.; Braginsky, V. B.; Branchesi, M.; Brau, J. E.; Briant, T.; Brillet, A.; Brinkmann, M.; Brisson, V.; Brockill, P.; Broida, J. E.; Brooks, A. F.; Brown, D. A.; Brown, D. D.; Brown, N. M.; Brunett, S.; Buchanan, C. C.; Buikema, A.; Bulik, T.; Bulten, H. J.; Buonanno, A.; Buskulic, D.; Buy, C.; Byer, R. L.; Cabero, M.; Cadonati, L.; Cagnoli, G.; Cahillane, C.; Calderón Bustillo, J.; Callister, T.; Calloni, E.; Camp, J. B.; Cannon, K. C.; Cao, J.; Capano, C. D.; Capocasa, E.; Carbognani, F.; Caride, S.; Casanueva Diaz, J.; Casentini, C.; Caudill, S.; Cavaglià, M.; Cavalier, F.; Cavalieri, R.; Cella, G.; Cepeda, C. 
B.; Cerboni Baiardi, L.; Cerretani, G.; Cesarini, E.; Chamberlin, S. J.; Chan, M.; Chao, S.; Charlton, P.; Chassande-Mottin, E.; Cheeseboro, B. D.; Chen, H. Y.; Chen, Y.; Cheng, C.; Chincarini, A.; Chiummo, A.; Cho, H. S.; Cho, M.; Chow, J. H.; Christensen, N.; Chu, Q.; Chua, S.; Chung, S.; Ciani, G.; Clara, F.; Clark, J. A.; Cleva, F.; Coccia, E.; Cohadon, P.-F.; Colla, A.; Collette, C. G.; Cominsky, L.; Constancio, M.; Conte, A.; Conti, L.; Cook, D.; Corbitt, T. R.; Cornish, N.; Corsi, A.; Cortese, S.; Costa, C. A.; Coughlin, M. W.; Coughlin, S. B.; Coulon, J.-P.; Countryman, S. T.; Couvares, P.; Cowan, E. E.; Coward, D. M.; Cowart, M. J.; Coyne, D. C.; Coyne, R.; Craig, K.; Creighton, J. D. E.; Creighton, T. D.; Cripe, J.; Crowder, S. G.; Cumming, A.; Cunningham, L.; Cuoco, E.; Dal Canton, T.; Danilishin, S. L.; D'Antonio, S.; Danzmann, K.; Darman, N. S.; Dasgupta, A.; Da Silva Costa, C. F.; Dattilo, V.; Dave, I.; Davier, M.; Davies, G. S.; Daw, E. J.; Day, R.; De, S.; DeBra, D.; Debreczeni, G.; Degallaix, J.; De Laurentis, M.; Deléglise, S.; Del Pozzo, W.; Denker, T.; Dent, T.; Dergachev, V.; De Rosa, R.; DeRosa, R. T.; DeSalvo, R.; Devine, R. C.; Dhurandhar, S.; Díaz, M. C.; Di Fiore, L.; Di Giovanni, M.; Di Girolamo, T.; Di Lieto, A.; Di Pace, S.; Di Palma, I.; Di Virgilio, A.; Dolique, V.; Donovan, F.; Dooley, K. L.; Doravari, S.; Douglas, R.; Downes, T. P.; Drago, M.; Drever, R. W. P.; Driggers, J. C.; Ducrot, M.; Dwyer, S. E.; Edo, T. B.; Edwards, M. C.; Effler, A.; Eggenstein, H.-B.; Ehrens, P.; Eichholz, J.; Eikenberry, S. S.; Engels, W.; Essick, R. C.; Etzel, T.; Evans, M.; Evans, T. M.; Everett, R.; Factourovich, M.; Fafone, V.; Fair, H.; Fairhurst, S.; Fan, X.; Fang, Q.; Farinon, S.; Farr, B.; Farr, W. M.; Favata, M.; Fays, M.; Fehrmann, H.; Fejer, M. M.; Fenyvesi, E.; Ferrante, I.; Ferreira, E. C.; Ferrini, F.; Fidecaro, F.; Fiori, I.; Fiorucci, D.; Fisher, R. 
P.; Flaminio, R.; Fletcher, M.; Fournier, J.-D.; Frasca, S.; Frasconi, F.; Frei, Z.; Freise, A.; Frey, R.; Frey, V.; Fritschel, P.; Frolov, V. V.; Fulda, P.; Fyffe, M.; Gabbard, H. A. G.; Gair, J. R.; Gammaitoni, L.; Gaonkar, S. G.; Garufi, F.; Gaur, G.; Gehrels, N.; Gemme, G.; Geng, P.; Genin, E.; Gennai, A.; George, J.; Gergely, L.; Germain, V.; Ghosh, Abhirup; Ghosh, Archisman; Ghosh, S.; Giaime, J. A.; Giardina, K. D.; Giazotto, A.; Gill, K.; Glaefke, A.; Goetz, E.; Goetz, R.; Gondan, L.; González, G.; Gonzalez Castro, J. M.; Gopakumar, A.; Gordon, N. A.; Gorodetsky, M. L.; Gossan, S. E.; Gosselin, M.; Gouaty, R.; Grado, A.; Graef, C.; Graff, P. B.; Granata, M.; Grant, A.; Gras, S.; Gray, C.; Greco, G.; Green, A. C.; Groot, P.; Grote, H.; Grunewald, S.; Guidi, G. M.; Guo, X.; Gupta, A.; Gupta, M. K.; Gushwa, K. E.; Gustafson, E. K.; Gustafson, R.; Hacker, J. J.; Hall, B. R.; Hall, E. D.; Hammond, G.; Haney, M.; Hanke, M. M.; Hanks, J.; Hanna, C.; Hanson, J.; Hardwick, T.; Harms, J.; Harry, G. M.; Harry, I. W.; Hart, M. J.; Hartman, M. T.; Haster, C.-J.; Haughian, K.; Heidmann, A.; Heintze, M. C.; Heitmann, H.; Hello, P.; Hemming, G.; Hendry, M.; Heng, I. S.; Hennig, J.; Henry, J.; Heptonstall, A. W.; Heurs, M.; Hild, S.; Hoak, D.; Hofman, D.; Holt, K.; Holz, D. E.; Hopkins, P.; Hough, J.; Houston, E. A.; Howell, E. J.; Hu, Y. M.; Huang, S.; Huerta, E. A.; Huet, D.; Hughey, B.; Husa, S.; Huttner, S. H.; Huynh-Dinh, T.; Indik, N.; Ingram, D. R.; Inta, R.; Isa, H. N.; Isac, J.-M.; Isi, M.; Isogai, T.; Iyer, B. R.; Izumi, K.; Jacqmin, T.; Jang, H.; Jani, K.; Jaranowski, P.; Jawahar, S.; Jian, L.; Jiménez-Forteza, F.; Johnson, W. W.; Jones, D. I.; Jones, R.; Jonker, R. J. G.; Ju, L.; K, Haris; Kalaghatgi, C. V.; Kalogera, V.; Kandhasamy, S.; Kang, G.; Kanner, J. B.; Kapadia, S. J.; Karki, S.; Karvinen, K. S.; Kasprzack, M.; Katsavounidis, E.; Katzman, W.; Kaufer, S.; Kaur, T.; Kawabe, K.; Kéfélian, F.; Kehl, M. S.; Keitel, D.; Kelley, D. 
B.; Kells, W.; Kennedy, R.; Key, J. S.; Khalili, F. Y.; Khan, I.; Khan, S.; Khan, Z.; Khazanov, E. A.; Kijbunchoo, N.; Kim, Chi-Woong; Kim, Chunglee; Kim, J.; Kim, K.; Kim, N.; Kim, W.; Kim, Y.-M.; Kimbrell, S. J.; King, E. J.; King, P. J.; Kissel, J. S.; Klein, B.; Kleybolte, L.; Klimenko, S.; Koehlenbeck, S. M.; Koley, S.; Kondrashov, V.; Kontos, A.; Korobko, M.; Korth, W. Z.; Kowalska, I.; Kozak, D. B.; Kringel, V.; Krishnan, B.; Królak, A.; Krueger, C.; Kuehn, G.; Kumar, P.; Kumar, R.; Kuo, L.; Kutynia, A.; Lackey, B. D.; Landry, M.; Lange, J.; Lantz, B.; Lasky, P. D.; Laxen, M.; Lazzaro, C.; Leaci, P.; Leavey, S.; Lebigot, E. O.; Lee, C. H.; Lee, H. K.; Lee, H. M.; Lee, K.; Lenon, A.; Leonardi, M.; Leong, J. R.; Leroy, N.; Letendre, N.; Levin, Y.; Lewis, J. B.; Li, T. G. F.; Libson, A.; Littenberg, T. B.; Lockerbie, N. A.; Lombardi, A. L.; London, L. T.; Lord, J. E.; Lorenzini, M.; Loriette, V.; Lormand, M.; Losurdo, G.; Lough, J. D.; Lück, H.; Lundgren, A. P.; Lynch, R.; Ma, Y.; Machenschalk, B.; MacInnis, M.; Macleod, D. M.; Magaña-Sandoval, F.; Magaña Zertuche, L.; Magee, R. M.; Majorana, E.; Maksimovic, I.; Malvezzi, V.; Man, N.; Mandic, V.; Mangano, V.; Mansell, G. L.; Manske, M.; Mantovani, M.; Marchesoni, F.; Marion, F.; Márka, S.; Márka, Z.; Markosyan, A. S.; Maros, E.; Martelli, F.; Martellini, L.; Martin, I. W.; Martynov, D. V.; Marx, J. N.; Mason, K.; Masserot, A.; Massinger, T. J.; Masso-Reid, M.; Mastrogiovanni, S.; Matichard, F.; Matone, L.; Mavalvala, N.; Mazumder, N.; McCarthy, R.; McClelland, D. E.; McCormick, S.; McGuire, S. C.; McIntyre, G.; McIver, J.; McManus, D. J.; McRae, T.; McWilliams, S. T.; Meacher, D.; Meadors, G. D.; Meidam, J.; Melatos, A.; Mendell, G.; Mercer, R. A.; Merilh, E. L.; Merzougui, M.; Meshkov, S.; Messenger, C.; Messick, C.; Metzdorff, R.; Meyers, P. M.; Mezzani, F.; Miao, H.; Michel, C.; Middleton, H.; Mikhailov, E. E.; Milano, L.; Miller, A. L.; Miller, A.; Miller, B. 
B.; Miller, J.; Millhouse, M.; Minenkov, Y.; Ming, J.; Mirshekari, S.; Mishra, C.; Mitra, S.; Mitrofanov, V. P.; Mitselmakher, G.; Mittleman, R.; Moggi, A.; Mohan, M.; Mohapatra, S. R. P.; Montani, M.; Moore, B. C.; Moore, C. J.; Moraru, D.; Moreno, G.; Morriss, S. R.; Mossavi, K.; Mours, B.; Mow-Lowry, C. M.; Mueller, G.; Muir, A. W.; Mukherjee, Arunava; Mukherjee, D.; Mukherjee, S.; Mukund, N.; Mullavey, A.; Munch, J.; Murphy, D. J.; Murray, P. G.; Mytidis, A.; Nardecchia, I.; Naticchioni, L.; Nayak, R. K.; Nedkova, K.; Nelemans, G.; Nelson, T. J. N.; Neri, M.; Neunzert, A.; Newton, G.; Nguyen, T. T.; Nielsen, A. B.; Nissanke, S.; Nitz, A.; Nocera, F.; Nolting, D.; Normandin, M. E. N.; Nuttall, L. K.; Oberling, J.; Ochsner, E.; O'Dell, J.; Oelker, E.; Ogin, G. H.; Oh, J. J.; Oh, S. H.; Ohme, F.; Oliver, M.; Oppermann, P.; Oram, Richard J.; O'Reilly, B.; O'Shaughnessy, R.; Ottaway, D. J.; Overmier, H.; Owen, B. J.; Pai, A.; Pai, S. A.; Palamos, J. R.; Palashov, O.; Palomba, C.; Pal-Singh, A.; Pan, H.; Pankow, C.; Pannarale, F.; Pant, B. C.; Paoletti, F.; Paoli, A.; Papa, M. A.; Paris, H. R.; Parker, W.; Pascucci, D.; Pasqualetti, A.; Passaquieti, R.; Passuello, D.; Patricelli, B.; Patrick, Z.; Pearlstone, B. L.; Pedraza, M.; Pedurand, R.; Pekowsky, L.; Pele, A.; Penn, S.; Perreca, A.; Perri, L. M.; Phelps, M.; Piccinni, O. J.; Pichot, M.; Piergiovanni, F.; Pierro, V.; Pillant, G.; Pinard, L.; Pinto, I. M.; Pitkin, M.; Poe, M.; Poggiani, R.; Popolizio, P.; Post, A.; Powell, J.; Prasad, J.; Predoi, V.; Prestegard, T.; Price, L. R.; Prijatelj, M.; Principe, M.; Privitera, S.; Prix, R.; Prodi, G. A.; Prokhorov, L.; Puncken, O.; Punturo, M.; Puppo, P.; Pürrer, M.; Qi, H.; Qin, J.; Qiu, S.; Quetschke, V.; Quintero, E. A.; Quitzow-James, R.; Raab, F. J.; Rabeling, D. S.; Radkins, H.; Raffai, P.; Raja, S.; Rajan, C.; Rakhmanov, M.; Rapagnani, P.; Raymond, V.; Razzano, M.; Re, V.; Read, J.; Reed, C. M.; Regimbau, T.; Rei, L.; Reid, S.; Reitze, D. H.; Rew, H.; Reyes, S. 
D.; Ricci, F.; Riles, K.; Rizzo, M.; Robertson, N. A.; Robie, R.; Robinet, F.; Rocchi, A.; Rolland, L.; Rollins, J. G.; Roma, V. J.; Romano, R.; Romanov, G.; Romie, J. H.; Rosińska, D.; Rowan, S.; Rüdiger, A.; Ruggi, P.; Ryan, K.; Sachdev, S.; Sadecki, T.; Sadeghian, L.; Sakellariadou, M.; Salconi, L.; Saleem, M.; Salemi, F.; Samajdar, A.; Sammut, L.; Sanchez, E. J.; Sandberg, V.; Sandeen, B.; Sanders, J. R.; Sassolas, B.; Saulson, P. R.; Sauter, O. E. S.; Savage, R. L.; Sawadsky, A.; Schale, P.; Schilling, R.; Schmidt, J.; Schmidt, P.; Schnabel, R.; Schofield, R. M. S.; Schönbeck, A.; Schreiber, E.; Schuette, D.; Schutz, B. F.; Scott, J.; Scott, S. M.; Sellers, D.; Sengupta, A. S.; Sentenac, D.; Sequino, V.; Sergeev, A.; Setyawati, Y.; Shaddock, D. A.; Shaffer, T.; Shahriar, M. S.; Shaltev, M.; Shapiro, B.; Shawhan, P.; Sheperd, A.; Shoemaker, D. H.; Shoemaker, D. M.; Siellez, K.; Siemens, X.; Sieniawska, M.; Sigg, D.; Silva, A. D.; Singer, A.; Singer, L. P.; Singh, A.; Singh, R.; Singhal, A.; Sintes, A. M.; Slagmolen, B. J. J.; Smith, J. R.; Smith, N. D.; Smith, R. J. E.; Son, E. J.; Sorazu, B.; Sorrentino, F.; Souradeep, T.; Srivastava, A. K.; Staley, A.; Steinke, M.; Steinlechner, J.; Steinlechner, S.; Steinmeyer, D.; Stephens, B. C.; Stone, R.; Strain, K. A.; Straniero, N.; Stratta, G.; Strauss, N. A.; Strigin, S.; Sturani, R.; Stuver, A. L.; Summerscales, T. Z.; Sun, L.; Sunil, S.; Sutton, P. J.; Swinkels, B. L.; Szczepańczyk, M. J.; Tacca, M.; Talukder, D.; Tanner, D. B.; Tápai, M.; Tarabrin, S. P.; Taracchini, A.; Taylor, R.; Theeg, T.; Thirugnanasambandam, M. P.; Thomas, E. G.; Thomas, M.; Thomas, P.; Thorne, K. A.; Thrane, E.; Tiwari, S.; Tiwari, V.; Tokmakov, K. V.; Toland, K.; Tomlinson, C.; Tonelli, M.; Tornasi, Z.; Torres, C. V.; Torrie, C. I.; Töyrä, D.; Travasso, F.; Traylor, G.; Trifirò, D.; Tringali, M. C.; Trozzo, L.; Tse, M.; Turconi, M.; Tuyenbayev, D.; Ugolini, D.; Unnikrishnan, C. S.; Urban, A. L.; Usman, S. 
A.; Vahlbruch, H.; Vajente, G.; Valdes, G.; van Bakel, N.; van Beuzekom, M.; van den Brand, J. F. J.; Van Den Broeck, C.; Vander-Hyde, D. C.; van der Schaaf, L.; van Heijningen, J. V.; van Veggel, A. A.; Vardaro, M.; Vass, S.; Vasúth, M.; Vaulin, R.; Vecchio, A.; Vedovato, G.; Veitch, J.; Veitch, P. J.; Venkateswara, K.; Verkindt, D.; Vetrano, F.; Viceré, A.; Vinciguerra, S.; Vine, D. J.; Vinet, J.-Y.; Vitale, S.; Vo, T.; Vocca, H.; Vorvick, C.; Voss, D. V.; Vousden, W. D.; Vyatchanin, S. P.; Wade, A. R.; Wade, L. E.; Wade, M.; Walker, M.; Wallace, L.; Walsh, S.; Wang, G.; Wang, H.; Wang, M.; Wang, X.; Wang, Y.; Ward, R. L.; Warner, J.; Was, M.; Weaver, B.; Wei, L.-W.; Weinert, M.; Weinstein, A. J.; Weiss, R.; Wen, L.; Weßels, P.; Westphal, T.; Wette, K.; Whelan, J. T.; Whiting, B. F.; Williams, R. D.; Williamson, A. R.; Willis, J. L.; Willke, B.; Wimmer, M. H.; Winkler, W.; Wipf, C. C.; Wittel, H.; Woan, G.; Woehler, J.; Worden, J.; Wright, J. L.; Wu, D. S.; Wu, G.; Yablon, J.; Yam, W.; Yamamoto, H.; Yancey, C. C.; Yu, H.; Yvert, M.; ZadroŻny, A.; Zangrando, L.; Zanolin, M.; Zendri, J.-P.; Zevin, M.; Zhang, L.; Zhang, M.; Zhang, Y.; Zhao, C.; Zhou, M.; Zhou, Z.; Zhu, S. J.; Zhu, X.; Zucker, M. E.; Zuraw, S. E.; Zweizig, J.; LIGO Scientific Collaboration; Virgo Collaboration

    2016-11-01

    We report results of a deep all-sky search for periodic gravitational waves from isolated neutron stars in data from the S6 LIGO science run. The search was possible thanks to the computing power provided by the volunteers of the Einstein@Home distributed computing project. We find no significant signal candidate and set the most stringent upper limits to date on the amplitude of gravitational wave signals from the target population. At the frequency of best strain sensitivity, between 170.5 and 171 Hz, we set a 90% confidence upper limit of 5.5×10^-25, while at the high end of our frequency range, around 505 Hz, we achieve upper limits ≃10^-24. At 230 Hz we can exclude sources with ellipticities greater than 10^-6 within 100 pc of Earth with a fiducial value of the principal moment of inertia of 10^38 kg m^2. If we assume a higher (lower) gravitational wave spin-down we constrain farther (closer) objects to higher (lower) ellipticities.
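    As a consistency check on the exclusion quoted above, the standard quadrupole-emission estimate for a triaxial rotating neutron star, h0 = 4π²G·I·ε·f²/(c⁴·d) with f the gravitational-wave frequency, can be evaluated for the fiducial source. This is the textbook relation, not a formula quoted from the paper itself:

```python
import math

# Standard quadrupole estimate of the GW strain amplitude from a triaxial
# neutron star: h0 = 4*pi^2 * G * I * eps * f^2 / (c^4 * d), f = GW frequency.
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8          # speed of light, m/s
PC = 3.086e16        # metres per parsec

def h0(eps, moment_of_inertia, f_gw, dist_m):
    return 4 * math.pi**2 * G * moment_of_inertia * eps * f_gw**2 / (c**4 * dist_m)

# Fiducial source from the abstract: eps = 1e-6, I = 1e38 kg m^2,
# f = 230 Hz, d = 100 pc
h = h0(1e-6, 1e38, 230.0, 100 * PC)   # ~5.6e-25, comparable to the quoted limits
```

The result is indeed of the same order as the quoted upper limits near 230 Hz, which is why sources of that ellipticity can be excluded out to roughly 100 pc.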

  11. Willingness of Patients with Breast Cancer in the Adjuvant and Metastatic Setting to Use Electronic Surveys (ePRO) Depends on Sociodemographic Factors, Health-related Quality of Life, Disease Status and Computer Skills

    PubMed Central

    Graf, J.; Simoes, E.; Wißlicen, K.; Rava, L.; Walter, C. B.; Hartkopf, A.; Keilmann, L.; Taran, A.; Wallwiener, S.; Fasching, P.; Brucker, S. Y.; Wallwiener, M.

    2016-01-01

    Introduction: Because of the often unfavorable prognosis, particularly for patients with metastases, health-related quality of life is extremely important for breast cancer patients. In recent years, data on patient-relevant endpoints have increasingly been collected electronically; however, knowledge of the acceptance and practicability of, and barriers to, this form of data collection remains limited. Material and Methods: A questionnaire was completed by 96 patients to determine to what extent existing computer skills, disease status, health-related quality of life and sociodemographic factors affect patients' potential willingness to use electronic methods of data collection (ePRO). Results: 52 of 96 (55 %) patients reported a priori that they could envisage using ePRO. Patients who a priori preferred a paper-based survey (pPRO) tended to be older (ePRO 53 years vs. pPRO 62 years; p = 0.0014) and typically had lower levels of education (p = 0.0002), were in poorer health (p = 0.0327) and had fewer computer skills (p = 0.0003). Conclusion: Barriers to the prospective use of ePRO were identified in older patients and patients with a lower quality of life. Given the appropriate conditions with regard to age, education and current health status, opportunities to participate should be provided to encourage patients' willingness to take part and ensure the validity of survey results. Focusing on ease of use of ePRO applications and making applications more patient-oriented and straightforward appears to be the way forward. PMID:27239062

  12. Computed Tomography Imaging Spectrometer (CTIS) with 2D Reflective Grating for Ultraviolet to Long-Wave Infrared Detection Especially Useful for Surveying Transient Events

    NASA Technical Reports Server (NTRS)

    Wilson, Daniel W. (Inventor); Maker, Paul D. (Inventor); Muller, Richard E. (Inventor); Mouroulis, Pantazis Z. (Inventor)

    2003-01-01

    The optical system of this invention is a unique type of imaging spectrometer, i.e. an instrument that can determine the spectra of all points in a two-dimensional scene. The general type of imaging spectrometer under which this invention falls has been termed a computed-tomography imaging spectrometer (CTIS). CTISs have the ability to perform spectral imaging of scenes containing rapidly moving objects or evolving features, hereafter referred to as transient scenes. This invention, a reflective CTIS with a unique two-dimensional reflective grating, can operate in any wavelength band from the ultraviolet through long-wave infrared. Although this spectrometer is especially useful for rapidly occurring events, it is also useful for investigation of some slow-moving phenomena, as in the life sciences.

  13. Computed tomography imaging spectrometer (CTIS) with 2D reflective grating for ultraviolet to long-wave infrared detection especially useful for surveying transient events

    NASA Technical Reports Server (NTRS)

    Wilson, Daniel W. (Inventor); Maker, Paul D. (Inventor); Muller, Richard E. (Inventor); Mouroulis, Pantazis Z. (Inventor)

    2003-01-01

    The optical system of this invention is a unique type of imaging spectrometer, i.e. an instrument that can determine the spectra of all points in a two-dimensional scene. The general type of imaging spectrometer under which this invention falls has been termed a computed-tomography imaging spectrometer (CTIS). CTISs have the ability to perform spectral imaging of scenes containing rapidly moving objects or evolving features, hereafter referred to as transient scenes. This invention, a reflective CTIS with a unique two-dimensional reflective grating, can operate in any wavelength band from the ultraviolet through long-wave infrared. Although this spectrometer is especially useful for rapidly occurring events, it is also useful for investigation of some slow-moving phenomena, as in the life sciences.
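    The tomographic reconstruction step a CTIS relies on can be illustrated with a toy inversion of g = Hf, where g is the multiplexed detector data, H the system matrix, and f the flattened spatial-spectral cube. The sketch below uses a multiplicative EM/MART-style update, a common choice for this kind of nonnegative inversion; the matrix sizes and values are invented for illustration and are not from the patent:

```python
# Toy CTIS-style inversion: detector data g are a nonnegative linear mixing
# (system matrix H, 4 measurements x 3 unknowns) of the spatial-spectral
# cube f. Recover f with a multiplicative EM/MART-style iteration.
H = [[0.9, 0.1, 0.3],
     [0.2, 0.8, 0.4],
     [0.5, 0.6, 0.1],
     [0.3, 0.2, 0.7]]
f_true = [1.0, 2.0, 0.5]                         # "true" cube, flattened
g = [sum(H[i][j] * f_true[j] for j in range(3)) for i in range(4)]

f = [1.0, 1.0, 1.0]                              # flat initial estimate
col_sums = [sum(H[i][j] for i in range(4)) for j in range(3)]
for _ in range(2000):
    pred = [sum(H[i][j] * f[j] for j in range(3)) for i in range(4)]
    ratio = [g[i] / pred[i] for i in range(4)]   # measured / predicted
    # multiplicative update keeps every component of f nonnegative
    f = [f[j] * sum(H[i][j] * ratio[i] for i in range(4)) / col_sums[j]
         for j in range(3)]
```

With noiseless, consistent data the iterates approach the true cube; real CTIS inversions work the same way on much larger, sparse system matrices.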

  14. Audio computer-assisted survey instrument versus face-to-face interviews: optimal method for detecting high-risk behaviour in pregnant women and their sexual partners in the south of Brazil.

    PubMed

    Yeganeh, N; Dillavou, C; Simon, M; Gorbach, P; Santos, B; Fonseca, R; Saraiva, J; Melo, M; Nielsen-Saines, K

    2013-04-01

    Audio computer-assisted survey instrument (ACASI) has been shown to decrease under-reporting of socially undesirable behaviours, but has not been evaluated in pregnant women at risk of HIV acquisition in Brazil. We assigned HIV-negative pregnant women receiving routine antenatal care in Porto Alegre, Brazil, and their partners to receive a survey regarding high-risk sexual behaviours and drug use via ACASI (n = 372) or face-to-face (FTF) (n = 283) interviews. Logistic regression showed that compared with FTF, pregnant women interviewed via ACASI were significantly more likely to report themselves as single (14% versus 6%), having >5 sexual partners (35% versus 29%), having oral sex (42% versus 35%), using intravenous drugs (5% versus 0%), smoking cigarettes (23% versus 16%), drinking alcohol (13% versus 8%) and using condoms during pregnancy (32% versus 17%). Therefore, ACASI may be a useful method for assessing risk behaviours in pregnant women, especially in relation to drug and alcohol use.
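    The size of such a mode effect is often expressed as an odds ratio. A minimal sketch using the single-status percentages from the abstract; the helper function is illustrative, not taken from the study:

```python
def odds_ratio(p1, p2):
    """Odds ratio comparing a reported proportion under two interview modes."""
    return (p1 / (1 - p1)) / (p2 / (1 - p2))

# Self-reported single status: 14% under ACASI vs. 6% under FTF
or_single = odds_ratio(0.14, 0.06)   # ~2.55: odds of reporting ~2.5x higher
```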

  15. Efficient Computational Research Protocol to Survey Free Energy Surface for Solution Chemical Reaction in the QM/MM Framework: The FEG-ER Methodology and Its Application to Isomerization Reaction of Glycine in Aqueous Solution.

    PubMed

    Takenaka, Norio; Kitamura, Yukichi; Nagaoka, Masataka

    2016-03-03

    In solution chemical reactions, we often need to consider a multidimensional free energy (FE) surface (FES), which is analogous to a Born-Oppenheimer potential energy surface. To survey the FES, an efficient computational research protocol is proposed within the QM/MM framework: (i) we first obtain the stable states (or transition states) involved by optimizing their structures on the FES in a stepwise fashion, finally using the free energy gradient (FEG) method, and then (ii) we directly and efficiently obtain the FE differences among arbitrary states on the FES by employing the QM/MM method with energy representation (ER), i.e., the QM/MM-ER method. To validate the accuracy and efficiency of the calculation, we applied the above FEG-ER methodology to a typical isomerization reaction of glycine in aqueous solution, and reproduced the experimental value of the reaction FE quite satisfactorily. Further, it was found that the structural relaxation of the solute in the QM/MM force field is not negligible for estimating the FES correctly. We believe that the present research protocol should become a prevailing computational strategy and will play a promising and important role in solution chemistry toward solution reaction ergodography.
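    Step (i) of such a protocol, locating stationary states on the FES with a free-energy-gradient method, amounts to following the ensemble-averaged force on the solute coordinates. A toy one-dimensional sketch, with a model surface F(q) = (q² - 1)² and Gaussian noise standing in for averaging over solvent configurations (all values hypothetical):

```python
import random

# Toy FEG-style optimization on a model 1-D free-energy surface
# F(q) = (q^2 - 1)^2, whose minima sit at q = +/-1. The mean force
# -dF/dq is estimated by averaging noisy force samples, mimicking an
# ensemble average over solvent configurations.
rng = random.Random(1)

def mean_force(q, n_samples=2000):
    exact = -4.0 * q * (q * q - 1.0)            # -dF/dq for the model surface
    return sum(exact + rng.gauss(0.0, 0.5) for _ in range(n_samples)) / n_samples

q = 0.3                        # initial structure guess, off the minimum
for _ in range(200):
    q += 0.05 * mean_force(q)  # steepest-descent step along the mean force
# q has relaxed to the stable state near q = 1
```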

  16. Seismic, side-scan survey, diving, and coring data analyzed by a Macintosh II™ computer and inexpensive software provide answers to a possible offshore extension of landslides at Palos Verdes Peninsula, California

    SciTech Connect

    Dill, R.F.; Slosson, J.E.; McEachen, D.B.

    1990-05-01

    A Macintosh II™ computer and commercially available software were used to analyze and depict the topography, construct an isopach sediment thickness map, plot core positions, and locate the geology of an offshore area facing an active landslide on the southern side of Palos Verdes Peninsula, California. Profile data from side-scan sonar, 3.5 kHz, and Boomer subbottom high-resolution seismic, diving, echo sounder traverses, and cores - all controlled with a mini Ranger II navigation system - were placed in MacGridzo™ and WingZ™ software programs. The computer-plotted data from seven sources were used to construct maps with overlays for evaluating the possibility of a shoreside landslide extending offshore. The poster session describes the offshore survey system and demonstrates the development of the computer database, its placement into the MacGridzo™ gridding program, and transfer of gridded navigational locations to the WingZ™ database and graphics program. Data will be manipulated to show how sea-floor features are enhanced and how isopach data were used to interpret the possibility of landslide displacement and Holocene sea-level rise. The software permits rapid assessment of data using computerized overlays and provides a simple, inexpensive means of constructing and evaluating information in map form and preparing final written reports. This system could be useful in many other areas where seismic profiles, precision navigational locations, soundings, diver observations, and cores provide a great volume of information that must be compared on regional plots to develop field maps for geological evaluation and reports.
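    The gridding step at the heart of such a workflow, interpolating scattered soundings onto a regular grid, can be sketched with simple inverse-distance weighting; the scheme an actual gridding package uses may differ, and the soundings below are invented:

```python
# Inverse-distance-weighted interpolation of scattered (x, y, z) soundings
# onto an arbitrary grid point (xg, yg). Hypothetical data for illustration.
def idw(points, xg, yg, power=2.0):
    """points: list of (x, y, z) soundings; returns interpolated z at (xg, yg)."""
    num = den = 0.0
    for x, y, z in points:
        d2 = (x - xg) ** 2 + (y - yg) ** 2
        if d2 == 0.0:
            return z                       # exact hit on a data point
        w = 1.0 / d2 ** (power / 2.0)      # closer soundings weigh more
        num += w * z
        den += w
    return num / den

soundings = [(0, 0, 10.0), (1, 0, 12.0), (0, 1, 14.0)]
z = idw(soundings, 0.5, 0.5)   # equidistant from all three -> their mean, 12.0
```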

  17. Sanitary Surveys

    EPA Pesticide Factsheets

    A sanitary survey is an on-site review of a public water system’s water source, facilities, equipment, operation, and maintenance. Surveys point out sanitary deficiencies and assess a system’s capability to supply safe drinking water.

  18. "Suntelligence" Survey

    MedlinePlus

    ... to the American Academy of Dermatology's "Suntelligence" sun-smart survey. Please answer the following questions to measure ... be able to view a ranking of major cities suntelligence based on residents' responses to this survey. ...

  19. Computer program for simulation of variable recharge with the U. S. Geological Survey modular finite-difference ground-water flow model (MODFLOW)

    USGS Publications Warehouse

    Kontis, A.L.

    2001-01-01

    The Variable-Recharge Package is a computerized method designed for use with the U.S. Geological Survey three-dimensional finite-difference ground-water flow model (MODFLOW-88) to simulate areal recharge to an aquifer. It is suitable for simulations of aquifers in which the relation between ground-water levels and land surface can affect the amount and distribution of recharge. The method is based on the premise that recharge to an aquifer cannot occur where the water level is at or above land surface. Consequently, recharge will vary spatially in simulations in which the Variable-Recharge Package is applied, if the water levels are sufficiently high. The input data required by the program for each model cell that can potentially receive recharge include the average land-surface elevation and a quantity termed "water available for recharge," which is equal to precipitation minus evapotranspiration. The Variable-Recharge Package also can be used to simulate recharge to a valley-fill aquifer in which the valley fill and the adjoining uplands are explicitly simulated. Valley-fill aquifers, which are the most common type of aquifer in the glaciated northeastern United States, receive much of their recharge from upland sources as channeled and/or unchanneled surface runoff and as lateral ground-water flow. Surface runoff in the uplands is generated in the model when the applied water available for recharge is rejected because simulated water levels are at or above land surface. The surface runoff can be distributed to other parts of the model by (1) applying the amount of the surface runoff that flows to upland streams (channeled runoff) to explicitly simulated streams that flow onto the valley floor, and/or (2) applying the amount that flows downslope toward the valley-fill aquifer (unchanneled runoff) to specified model cells, typically those near the valley wall.
An example model of an idealized valley- fill aquifer is presented to demonstrate application of the
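    The package's core rejection rule, no recharge where the simulated head reaches land surface, with the rejected water becoming runoff to be routed elsewhere, can be sketched as follows; the per-cell dictionary representation is illustrative and is not the actual MODFLOW input format:

```python
# Sketch of the variable-recharge rule: a cell accepts its available
# recharge only while the simulated head is below land surface; otherwise
# the water is rejected and accumulates as surface runoff.
def apply_variable_recharge(cells):
    """cells: list of dicts with 'head', 'land_surface', 'available'."""
    runoff = 0.0
    for cell in cells:
        if cell['head'] >= cell['land_surface']:
            cell['recharge'] = 0.0        # water table at/above land: reject
            runoff += cell['available']   # becomes channeled/unchanneled runoff
        else:
            cell['recharge'] = cell['available']
    return runoff

cells = [
    {'head': 95.0, 'land_surface': 100.0, 'available': 0.002},   # accepts
    {'head': 101.0, 'land_surface': 100.0, 'available': 0.002},  # rejects
]
rejected = apply_variable_recharge(cells)   # 0.002 routed as runoff
```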

  20. Computer Series, 3: Computer Graphics for Chemical Education.

    ERIC Educational Resources Information Center

    Soltzberg, Leonard J.

    1979-01-01

    Surveys the current scene in computer graphics from the point of view of a chemistry educator. Discusses the scope of current applications of computer graphics in chemical education, and provides information about hardware and software systems to promote communication with vendors of computer graphics equipment. (HM)

  1. Survey of Anatomy and Root Canal Morphology of Maxillary First Molars Regarding Age and Gender in an Iranian Population Using Cone-Beam Computed Tomography

    PubMed Central

    Naseri, Mandana; Safi, Yaser; Akbarzadeh Baghban, Alireza; Khayat, Akbar; Eftekhar, Leila

    2016-01-01

    Introduction: The purpose of this study was to investigate the root and canal morphology of maxillary first molars with regard to patients’ age and gender, using cone-beam computed tomography (CBCT). Methods and Materials: A total of 149 CBCT scans from 92 (67.1%) female and 57 (31.3%) male patients with a mean age of 40.5 years were evaluated. Tooth length, presence of root fusion, number of roots and canals, canal types based on Vertucci’s classification, deviation of root and apical foramen in coronal and sagittal planes, and the correlation of all items with gender and age were recorded. The Mann-Whitney U, Kruskal-Wallis and Fisher’s exact tests were used to analyze these items. Results: The rate of root fusion was 1.3%. Multiple canals were present in the following frequencies: four canals 78.5%, five canals 11.4% and three canals 10.1%. An additional canal was detected in 86.6% of mesiobuccal roots, in which Vertucci’s type VI configuration was the most prevalent, followed by types II and I. Type I was the most common in distobuccal and palatal roots. There was no statistically significant difference in canal configurations in relation to gender and age, nor in the incidence of root or canal numbers (P>0.05). The mean tooth length was 19.3 mm in female and 20.3 mm in male patients, a statistically significant difference (P<0.05). Evaluation of root deviation showed that a general pattern of straight-distal most commonly occurred in the mesiobuccal root, and straight-straight in the distobuccal and palatal roots. In mesiobuccal roots, straight and distal deviations were more dominant in males and females, respectively (P<0.05). The prevalence of apical foramen deviation in mesiobuccal and palatal roots differed statistically with gender. Conclusion: The root and canal configuration of the Iranian population showed different features from those of other populations. PMID:27790259

  2. Computer Education for Engineers, Part III.

    ERIC Educational Resources Information Center

    McCullough, Earl S.; Lofy, Frank J.

    1989-01-01

    Reports the results of the third survey of computer use in engineering education, conducted in the fall of 1987, comparing them with the 1981 and 1984 results. Summarizes survey data on computer course credits, languages, equipment use, CAD/CAM instruction, faculty access, and computer graphics. (YP)

  3. Computers in the World of College English.

    ERIC Educational Resources Information Center

    Tannheimer, Charlotte

    This sabbatical report surveys some computer software presently being developed, already in use, and/or available, and describes computer use in several Massachusetts colleges. A general introduction to computers, word processors, artificial intelligence, and computer assisted instruction is provided, as well as a discussion of what computers can…

  4. Cryptography, quantum computation and trapped ions

    SciTech Connect

    Hughes, Richard J.

    1998-03-01

    The significance of quantum computation for cryptography is discussed. Following a brief survey of the requirements for quantum computational hardware, an overview of the ion trap quantum computation project at Los Alamos is presented. The physical limitations to quantum computation with trapped ions are analyzed and an assessment of the computational potential of the technology is made.

  5. Immature osteoblastic MG63 cells possess two calcitonin gene-related peptide receptor subtypes that respond differently to [Cys(Acm)(2,7)] calcitonin gene-related peptide and CGRP(8-37).

    PubMed

    Kawase, Tomoyuki; Okuda, Kazuhiro; Burns, Douglas M

    2005-10-01

    Calcitonin gene-related peptide (CGRP) is clearly an anabolic factor in skeletal tissue, but the distribution of CGRP receptor (CGRPR) subtypes in osteoblastic cells is poorly understood. We previously demonstrated that the CGRPR expressed in osteoblastic MG63 cells does not match exactly the known characteristics of the classic subtype 1 receptor (CGRPR1). The aim of the present study was to further characterize the MG63 CGRPR using a selective agonist of the putative CGRPR2, [Cys(Acm)(2,7)]CGRP, and a relatively specific antagonist of CGRPR1, CGRP(8-37). [Cys(Acm)(2,7)]CGRP acted as a significant agonist only upon ERK dephosphorylation, whereas this analog effectively antagonized CGRP-induced cAMP production and phosphorylation of cAMP response element-binding protein (CREB) and p38 MAPK. Although it had no agonistic action when used alone, CGRP(8-37) potently blocked CGRP actions on cAMP, CREB, and p38 MAPK but had less of an effect on ERK. Schild plot analysis of the latter data revealed that the apparent pA2 value for ERK is clearly distinguishable from those of the other three plots as judged using the 95% confidence intervals. Additional assays using 3-isobutyl-1-methylxanthine or the PKA inhibitor N-(2-[p-bromocinnamylamino]ethyl)-5-isoquinolinesulfonamide hydrochloride (H-89) indicated that the cAMP-dependent pathway was predominantly responsible for CREB phosphorylation, partially involved in ERK dephosphorylation, and not involved in p38 MAPK phosphorylation. Considering previous data from Scatchard analysis of [125I]CGRP binding in connection with these results, these findings suggest that MG63 cells possess two functionally distinct CGRPR subtypes that show almost identical affinity for CGRP but different sensitivity to CGRP analogs: one is best characterized as a variation of CGRPR1, and the second may be a novel variant of CGRPR2.
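    The Schild analysis mentioned above regresses log(DR − 1) on log[B], where DR is the agonist dose ratio at antagonist concentration [B], and reads pA2 off the x-intercept. A minimal sketch with synthetic dose ratios constructed so that K_B = 10⁻⁷ M (hence pA2 = 7); none of these numbers are from the study:

```python
import math

# Schild regression: fit log10(DR - 1) = slope * log10([B]) + intercept.
# The x-intercept is -intercept/slope, and pA2 is its negative.
def schild_pA2(conc, dose_ratio):
    xs = [math.log10(b) for b in conc]
    ys = [math.log10(dr - 1.0) for dr in dose_ratio]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    return intercept / slope      # pA2 = -(x-intercept)

# Synthetic competitive-antagonist data with unit slope and K_B = 1e-7 M
pA2 = schild_pA2([1e-7, 1e-6, 1e-5], [2.0, 11.0, 101.0])   # pA2 ~ 7 by construction
```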

  6. Theory Survey or Survey Theory?

    ERIC Educational Resources Information Center

    Dean, Jodi

    2010-01-01

    Matthew Moore's survey of political theorists in U.S. American colleges and universities is an impressive contribution to political science (Moore 2010). It is the first such survey of political theory as a subfield, the response rate is very high, and the answers to the survey questions provide new information about how political theorists look…

  7. Robotic Surveying

    SciTech Connect

    Suzy Cantor-McKinney; Michael Kruzic

    2007-03-01

    -actuated functions to be controlled by an onboard computer. The computer-controlled Speedrower was developed at Carnegie Mellon University to automate agricultural harvesting. Harvesting tasks require the vehicle to cover a field using minimally overlapping rows at slow speeds in a similar manner to geophysical data acquisition. The Speedrower had demonstrated its ability to perform as it had already logged hundreds of acres of autonomous harvesting. This project is the first use of autonomous robotic technology on a large-scale for geophysical surveying.

  8. Heterotic computing: exploiting hybrid computational devices.

    PubMed

    Kendon, Viv; Sebald, Angelika; Stepney, Susan

    2015-07-28

    Current computational theory deals almost exclusively with single models: classical, neural, analogue, quantum, etc. In practice, researchers use ad hoc combinations, realizing only recently that they can be fundamentally more powerful than the individual parts. A Theo Murphy meeting brought together theorists and practitioners of various types of computing, to engage in combining the individual strengths to produce powerful new heterotic devices. 'Heterotic computing' is defined as a combination of two or more computational systems such that they provide an advantage over either substrate used separately. This post-meeting collection of articles provides a wide-ranging survey of the state of the art in diverse computational paradigms, together with reflections on their future combination into powerful and practical applications.

  9. Computers and Computer Resources.

    ERIC Educational Resources Information Center

    Bitter, Gary

    1980-01-01

    This resource directory provides brief evaluative descriptions of six popular home computers and lists selected sources of educational software, computer books, and magazines. For a related article on microcomputers in the schools, see p53-58 of this journal issue. (SJL)

  10. Columbia Gorge Community College Business Survey.

    ERIC Educational Resources Information Center

    McKee, Jonathon V.

    This is a report on a business survey conducted by Columbia Gorge Community College (CGCC) (Oregon) to review the success and quality of the college's degree and certificate programs in business administration, computer application systems, and computer information systems. The community college surveyed 104 local businesses to verify the…

  11. ARM User Survey Report

    SciTech Connect

    Roeder, LR

    2010-06-22

    The objective of this survey was to obtain user feedback to, among other things, determine how to organize the exponentially growing data within the Atmospheric Radiation Measurement (ARM) Climate Research Facility, and identify users’ preferred data analysis system. The survey findings appear to have met this objective, having received approximately 300 responses that give insight into the type of work users perform, usage of the data, percentage of data analysis users might perform on an ARM-hosted computing resource, downloading volume level where users begin having reservations, opinion about usage if given more powerful computing resources (including ability to manipulate data), types of tools that would be most beneficial to them, preferred programming language and data analysis system, level of importance for certain types of capabilities, and finally, level of interest in participating in a code-sharing community.

  12. SURVEY INSTRUMENT

    DOEpatents

    Borkowski, C J

    1954-01-19

    This pulse-type survey instrument is suitable for readily detecting α particles in the presence of high β and γ backgrounds. The instruments may also be used to survey for neutrons, β particles and γ rays by employing suitably designed interchangeable probes and selecting an operating potential to correspond to the particular probe.

  13. Computer representation of molecular surfaces

    SciTech Connect

    Max, N.L.

    1981-07-06

    This review article surveys recent work on computer representation of molecular surfaces. Several different algorithms are discussed for producing vector or raster drawings of space-filling models formed as the union of spheres. Other smoother surfaces are also considered.
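    A common numerical route to such union-of-spheres surfaces is point sampling in the style of Shrake and Rupley: scatter near-uniform points on each atom's sphere and keep only those not buried inside any other sphere. A minimal sketch, with a hypothetical atom list:

```python
import math

def sphere_points(n):
    """Roughly uniform points on the unit sphere via the golden-spiral method."""
    pts = []
    phi = math.pi * (3.0 - math.sqrt(5.0))      # golden angle
    for i in range(n):
        y = 1.0 - 2.0 * (i + 0.5) / n
        r = math.sqrt(1.0 - y * y)
        pts.append((math.cos(phi * i) * r, y, math.sin(phi * i) * r))
    return pts

def exposed_area(atoms, n=500):
    """atoms: list of (x, y, z, radius). Exposed area of the union of spheres."""
    total = 0.0
    for i, (x, y, z, rad) in enumerate(atoms):
        exposed = 0
        for px, py, pz in sphere_points(n):
            sx, sy, sz = x + rad * px, y + rad * py, z + rad * pz
            # a surface point counts only if it lies outside every other sphere
            if all((sx - a) ** 2 + (sy - b) ** 2 + (sz - c) ** 2 >= rj * rj
                   for j, (a, b, c, rj) in enumerate(atoms) if j != i):
                exposed += 1
        total += 4.0 * math.pi * rad * rad * exposed / n
    return total

# Sanity check: one isolated unit sphere exposes its full area, 4*pi
area = exposed_area([(0.0, 0.0, 0.0, 1.0)])
```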

  14. Quantum chromodynamics with advanced computing

    SciTech Connect

    Kronfeld, Andreas S.; /Fermilab

    2008-07-01

    We survey results in lattice quantum chromodynamics from groups in the USQCD Collaboration. The main focus is on physics, but many aspects of the discussion are aimed at an audience of computational physicists.

  15. Proceedings of the ACM (Association for Computing Machinery, Inc.) SIGSOFT/SIGPLAN, Software Engineering Symposium on Practical Software Development Environments (2nd) Held in Palo Alto, California on 9-11 December 1986.

    DTIC Science & Technology

    new implementation schemes for interactive program design, construction, debugging and testing. Topics of interest included: monolingual and... multilingual environments, production quality environments, database support for environments, knowledge based environments, workstation based environments

  16. Computers and Young Children

    ERIC Educational Resources Information Center

    Lacina, Jan

    2007-01-01

    Technology is a way of life for most Americans. A recent study published by the National Writing Project (2007) found that Americans believe that computers have a positive effect on writing skills. The importance of learning to use technology ranked just below learning to read and write, and 74 percent of the survey respondents noted that children…

  17. Computers and Computer Cultures.

    ERIC Educational Resources Information Center

    Papert, Seymour

    1981-01-01

    Instruction using computers is viewed as different from most other approaches to education, by allowing more than right or wrong answers, by providing models for systematic procedures, by shifting the boundary between formal and concrete processes, and by influencing the development of thinking in many new ways. (MP)

  18. A Tale of Two Surveys

    NASA Astrophysics Data System (ADS)

    Holberg, J. B.; Oswalt, T. D.; Sion, E.

    2017-03-01

    We compare two white dwarf survey populations: a recent all-sky, distance-limited population of nearby white dwarfs extending to 25 pc that contains 232 members, and a large magnitude-limited spectroscopic population, the SDSS DR7 survey, which contains over 20 000 DA stars. We derive distances and interstellar reddening estimates for the DR7 DA stars and compute luminosities and ages. Various aspects of the two samples are compared, including mass distributions, luminosity distributions, and cooling age distributions.
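    The luminosity step in surveys like these rests on the standard distance-modulus relation. As an illustrative sketch (not the authors' actual pipeline), deriving an absolute magnitude and a solar-unit luminosity from an apparent magnitude, a distance, and an extinction estimate looks like:

```python
import math

def absolute_magnitude(m_apparent, distance_pc, extinction=0.0):
    """Absolute magnitude via the distance modulus: M = m - 5*log10(d/10 pc) - A."""
    return m_apparent - 5.0 * math.log10(distance_pc / 10.0) - extinction

def luminosity_solar(M, M_sun=4.83):
    """Luminosity in solar units from absolute magnitudes: L/Lsun = 10**((Msun - M)/2.5)."""
    return 10.0 ** ((M_sun - M) / 2.5)

# A hypothetical star at 25 pc with m = 15 and negligible reddening:
M = absolute_magnitude(15.0, 25.0)  # about 13.0
L = luminosity_solar(M)             # a faint, white-dwarf-like luminosity
```

    Reddening enters through the `extinction` term, which is why the DR7 analysis needs interstellar reddening estimates before luminosities can be computed.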

  19. Detecting and Jamming Dynamic Communication Networks in Anti-Access Environments

    DTIC Science & Technology

    2011-03-01

    2006. ACM. [30] Chen Wang and Li Xiao. Sensor localization in concave environments. ACM Trans. Sen. Netw., 4(1):1-31, 2008. [31] Kui Wu, Yong Gao...Chong Liu, Jianping Pan, and Dandan Huang. Robust range-free localization in wireless sensor networks. Mob. Netw. Appl., 12(5):392-405, 2007. [33...sensor network survey. Comput. Netw., 52(12):2292-2330, 2008. [37] Yong Yuan, Zongkai Yang, Min Chen, and Jianhua He. A survey on information

  20. Computers in Public Education Study.

    ERIC Educational Resources Information Center

    HBJ Enterprises, Highland Park, NJ.

    This survey conducted for the National Institute of Education reports the use of computers in U.S. public schools in the areas of instructional computing, student accounting, management of educational resources, research, guidance, testing, and library applications. From a stratified random sample of 1800 schools in varying geographic areas and…

  1. Computer Music

    NASA Astrophysics Data System (ADS)

    Cook, Perry R.

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and the study of perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.).

  2. Computer animation challenges for computational fluid dynamics

    NASA Astrophysics Data System (ADS)

    Vines, Mauricio; Lee, Won-Sook; Mavriplis, Catherine

    2012-07-01

    Computer animation requirements differ from those of traditional computational fluid dynamics (CFD) investigations in that visual plausibility and rapid frame update rates trump physical accuracy. We present an overview of the main techniques for fluid simulation in computer animation, starting with Eulerian grid approaches, the Lattice Boltzmann method, Fourier transform techniques and Lagrangian particle introduction. Adaptive grid methods, precomputation of results for model reduction, parallelisation and computation on graphical processing units (GPUs) are reviewed in the context of accelerating simulation computations for animation. A survey of current specific approaches for the application of these techniques to the simulation of smoke, fire, water, bubbles, mixing, phase change and solid-fluid coupling is also included. Adding plausibility to results through particle introduction, turbulence detail and concentration on regions of interest by level set techniques has elevated the degree of accuracy and realism of recent animations. Basic approaches are described here. Techniques to control the simulation to produce a desired visual effect are also discussed. Finally, some references to rendering techniques and haptic applications are mentioned to provide the reader with a complete picture of the challenges of simulating fluids in computer animation.
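    Of the Eulerian grid approaches surveyed, the one popularized in animation by Stam's "stable fluids" rests on unconditionally stable semi-Lagrangian advection, which is what buys rapid frame rates at the expense of physical accuracy. A minimal sketch of one advection step (illustrative only; the grid layout and clamped boundary handling are simplifying assumptions):

```python
import numpy as np

def advect_semi_lagrangian(q, vx, vy, dt):
    """One unconditionally stable semi-Lagrangian advection step on a 2D grid.
    q: scalar field (e.g. smoke density); vx, vy: velocity fields in cells/second."""
    ny, nx = q.shape
    j, i = np.meshgrid(np.arange(ny), np.arange(nx), indexing="ij")
    # Trace each cell centre backwards along the velocity field...
    x = np.clip(i - dt * vx, 0, nx - 1)
    y = np.clip(j - dt * vy, 0, ny - 1)
    # ...and bilinearly interpolate the old field at the departure point.
    x0, y0 = np.floor(x).astype(int), np.floor(y).astype(int)
    x1, y1 = np.minimum(x0 + 1, nx - 1), np.minimum(y0 + 1, ny - 1)
    fx, fy = x - x0, y - y0
    top = q[y0, x0] * (1 - fx) + q[y0, x1] * fx
    bot = q[y1, x0] * (1 - fx) + q[y1, x1] * fx
    return top * (1 - fy) + bot * fy
```

    Because the update only samples existing values, it never blows up for large time steps, which is exactly the visual-plausibility-over-accuracy trade-off the abstract describes.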

  3. Computer Music

    NASA Astrophysics Data System (ADS)

    Cook, Perry

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.). Although most people would think that analog synthesizers and electronic music substantially predate the use of computers in music, many experiments and complete computer music systems were being constructed and used as early as the 1950s.

  4. What Are Probability Surveys?

    EPA Pesticide Factsheets

    The National Aquatic Resource Surveys (NARS) use probability-survey designs to assess the condition of the nation’s waters. In probability surveys (also known as sample surveys or statistical surveys), sampling sites are selected randomly.
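    The defining feature is that sites are drawn at random with known inclusion probabilities, which is what permits design-based population estimates. A minimal, hypothetical sketch of an equal-probability draw from a sampling frame (not NARS's actual design, which uses more sophisticated spatially balanced selection):

```python
import random

def select_probability_sample(frame, n, seed=None):
    """Equal-probability sample without replacement from a sampling frame of
    candidate sites; every site has the same, known inclusion probability n/N."""
    rng = random.Random(seed)
    return rng.sample(frame, n)

# Hypothetical frame of 500 candidate lakes; draw 20 sites.
lakes = [f"lake-{i:03d}" for i in range(500)]
sites = select_probability_sample(lakes, 20, seed=42)
# Each lake had inclusion probability 20/500 = 0.04, so unbiased
# condition estimates (with quantifiable error bounds) are possible.
```

    Fixing the seed makes the draw reproducible, a common requirement when a survey design must be documented and audited.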

  5. 75 FR 52507 - Proposed Information Collection; Comment Request; Annual Capital Expenditures Survey

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-26

    ... Census Bureau Proposed Information Collection; Comment Request; Annual Capital Expenditures Survey AGENCY... 2012 Annual Capital Expenditures Survey (ACES). The annual survey collects data on fixed assets and depreciation, sales and receipts, capitalized computer software, and capital expenditures for new and...

  6. Survey Sense.

    ERIC Educational Resources Information Center

    Pollick, Anne M.

    1995-01-01

    This article provides advice on how to plan and conduct an alumni census through the mail, drawing on the experiences of Stonehill College in North Easton, Massachusetts, which undertook such a survey in 1992. It focuses on costs, information needs, questionnaire design, mailing considerations, reporting the results, and expected response rates.…

  7. Complexity Survey.

    ERIC Educational Resources Information Center

    Gordon, Sandra L.; Anderson, Beth C.

    To determine whether consensus existed among teachers about the complexity of common classroom materials, a survey was administered to 66 pre-service and in-service kindergarten and prekindergarten teachers. Participants were asked to rate 14 common classroom materials as simple, complex, or super-complex. Simple materials have one obvious part,…

  8. Cooling Computers.

    ERIC Educational Resources Information Center

    Birken, Marvin N.

    1967-01-01

    Numerous decisions must be made in the design of computer air conditioning, each determined by a combination of economics, physical, and esthetic characteristics, and computer requirements. Several computer air conditioning systems are analyzed--(1) underfloor supply and overhead return, (2) underfloor plenum and overhead supply with computer unit…

  9. Pygmalion's Computer.

    ERIC Educational Resources Information Center

    Peelle, Howard A.

    Computers have undoubtedly entered the educational arena, mainly in the areas of computer-assisted instruction (CAI) and artificial intelligence, but whether educators should embrace computers and exactly how they should use them are matters of great debate. The use of computers in support of educational administration is widely accepted.…

  10. Survey of Mobile Robots.

    DTIC Science & Technology

    1985-12-01

    Survey of Mobile Robots. Anita M. Flynn, Research Scientist, MIT Artificial Intelligence...case of an attempt at system building before the technology for the components was available. 1.3 The Stanford Cart 1973-1981. From 1973 to 1981, work...autonomous and yet still exhibit a high level of sophistication. Rapidly changing technology, including both the advent of the home computer and

  11. Geophex Airborne Unmanned Survey System

    SciTech Connect

    Won, I.L.; Keiswetter, D.

    1995-12-31

    Ground-based surveys place personnel at risk due to the proximity of buried unexploded ordnance (UXO) items or by exposure to radioactive materials and hazardous chemicals. The purpose of this effort is to design, construct, and evaluate a portable, remotely-piloted, airborne, geophysical survey system. This non-intrusive system will provide stand-off capability to conduct surveys and detect buried objects, structures, and conditions of interest at hazardous locations. During a survey, the operators remain remote from, but within visual distance of, the site. The sensor system never contacts the Earth, but can be positioned near the ground so that weak geophysical anomalies can be detected. The Geophex Airborne Unmanned Survey System (GAUSS) is designed to detect and locate small-scale anomalies at hazardous sites using magnetic and electromagnetic survey techniques. The system consists of a remotely-piloted, radio-controlled, model helicopter (RCH) with flight computer, light-weight geophysical sensors, an electronic positioning system, a data telemetry system, and a computer base-station. The report describes GAUSS and its test results.

  12. Proposed Expansion of Acme Landfill Operations.

    DTIC Science & Technology

    1982-08-01

    wastes from landfill effective January 1, 1983: PCB's, cyanides, pesticides, toxic metals, halogenated organics, and non-halogenated organics. (Section...1983, the six substances slated for restrictive action are: PCB's, pesticides, toxic metals, cyanide, halogenated organics, and non-halogenated...regrading as needed. Expansive soils in the site areas for Alternatives A, B, and C would affect pavements and light structures they support. Moisture

  13. The ACM Periodical Bank: A Retrospective View.

    ERIC Educational Resources Information Center

    Clarke, Jack A.

    1980-01-01

    Evaluates a cooperative venture in interlibrary lending of periodicals planned and executed by ten midwestern colleges. The study traces the consortium's history from 1967 to the present, describing successes and problems. (RAA)

  14. The Acme of Skill: Nonkinetic Warfare

    DTIC Science & Technology

    2008-05-01

    Introduction. The term nonkinetic warfare may seem to be an oxymoron. How can warfare be described as...the hallmark of the previous two generations of warfare. The authors attribute the evolution of warfare largely to the advancement in technology and

  15. World survey of CAM

    NASA Astrophysics Data System (ADS)

    Hatvany, J.; Merchant, M. E.; Rathmill, K.; Yoshikawa, H.

    The worldwide state of the art and development trends in CAM are surveyed, emphasizing flexible manufacturing systems (FMS), robotics, computer-aided process planning, and computer-aided scheduling. The use of FMS, NC machine tools, DNC systems, and unmanned and nearly unmanned factories is discussed as the state of the art in the USA, Japan, Western Europe, and Eastern Europe. For the same areas, trends are projected, including the use of graphics and languages in CAM, and metamorphic machine tools. A Delphi-type forecast and its conclusions are presented: a CAM system for manufacture is projected for 1985, robots equalling humans in assembly capability for 1990, and fifty percent replacement of direct labor in automobile final assembly by programmable automation by 1995. An attempt is made to outline a methodical approach to forecasting the development of CAM over the next 10-15 years. Key issues in CAM proliferation, including financial and social aspects, are addressed.

  16. Faculty of Education Students' Computer Self-Efficacy Beliefs and Their Attitudes towards Computers and Implementing Computer Supported Education

    ERIC Educational Resources Information Center

    Berkant, Hasan Güner

    2016-01-01

    This study investigates faculty of education students' computer self-efficacy beliefs and their attitudes towards computers and implementing computer supported education. This study is descriptive and based on a correlational survey model. The final sample consisted of 414 students studying in the faculty of education of a Turkish university. The…

  17. Hardware survey for the avionics test bed

    NASA Technical Reports Server (NTRS)

    Cobb, J. M.

    1981-01-01

    A survey of major hardware items that could possibly be used in the development of an avionics test bed for space shuttle attached or autonomous large space structures was conducted in NASA Johnson Space Center Building 16. The results of the survey are organized to show the hardware by laboratory usage. Computer systems in each laboratory are described in some detail.

  18. On teaching computer ethics within a computer science department.

    PubMed

    Quinn, Michael J

    2006-04-01

    The author has surveyed a quarter of the accredited undergraduate computer science programs in the United States. More than half of these programs offer a 'social and ethical implications of computing' course taught by a computer science faculty member, and there appears to be a trend toward teaching ethics classes within computer science departments. Although the decision to create an 'in house' computer ethics course may sometimes be a pragmatic response to pressure from the accreditation agency, this paper argues that teaching ethics within a computer science department can provide students and faculty members with numerous benefits. The paper lists topics that can be covered in a computer ethics course and offers some practical suggestions for making the course successful.

  19. Social Attitudes and the Computer Revolution

    ERIC Educational Resources Information Center

    Lee, Robert S.

    1970-01-01

    Presents the results of a nationwide survey of attitudes toward computers in two categories: (1) the computer as a purposeful instrument, helpful in science, industry, and space exploration, and (2) the computer as a relatively autonomous machine that can perform the functions of human thinking. (MB)

  20. Computing Newsletter for Schools of Business.

    ERIC Educational Resources Information Center

    Couger, J. Daniel, Ed.

    1973-01-01

    The first of the two issues included here reports on various developments concerning the use of computers for schools of business. One-page articles cover these topics: widespread use of simulation games, survey of computer use in higher education, ten new computer cases which teach techniques for management analysis, advantages of the use of…

  1. Computers in Public Broadcasting: Who, What, Where.

    ERIC Educational Resources Information Center

    Yousuf, M. Osman

    This handbook offers guidance to public broadcasting managers on computer acquisition and development activities. Based on a 1981 survey of planned and current computer uses conducted by the Corporation for Public Broadcasting (CPB) Information Clearinghouse, computer systems in public radio and television broadcasting stations are listed by…

  2. Using Personal Computers to Promote Economic Development.

    ERIC Educational Resources Information Center

    ECO Northwest, Ltd., Helena, MT.

    A study was conducted to determine the feasibility of increasing economic development within Montana through the use of personal computers in small businesses. A statewide mail survey of 1,650 businesses (employing between 4 and 25 employees) was conducted to determine the current status of computer use and the potential for expanding computer use…

  3. Online Hand Holding in Fixing Computer Glitches

    ERIC Educational Resources Information Center

    Goldsborough, Reid

    2005-01-01

    According to most surveys, computer manufacturers such as HP put out reliable products, and computers in general are less troublesome than in the past. But personal computers are still prone to bugs, conflicts, viruses, spyware infestations, hacker and phishing attacks, and--most of all--user error. Unfortunately, technical support from computer…

  4. Computer Organizational Techniques Used by Office Personnel.

    ERIC Educational Resources Information Center

    Alexander, Melody

    1995-01-01

    According to survey responses from 404 of 532 office personnel, 81.7% enjoy working with computers; the majority save files on their hard drives, use disk labels and storage files, do not use subdirectories or compress data, and do not make backups of floppy disks. Those with higher degrees, more computer experience, and more daily computer use…

  5. A regional land use survey based on remote sensing and other data: A report on a LANDSAT and computer mapping project, volume 1. [Arizona, Colorado, Montana, New Mexico, Utah, and Wyoming

    NASA Technical Reports Server (NTRS)

    Nez, G. (Principal Investigator); Mutter, D.

    1977-01-01

    The author has identified the following significant results. New LANDSAT analysis software and linkages with other computer mapping software were developed. Significant results were also achieved in training, communication, and identification of needs for developing the LANDSAT/computer mapping technologies into operational tools for use by decision makers.

  6. Multiple Surveys of Students and Survey Fatigue

    ERIC Educational Resources Information Center

    Porter, Stephen R.; Whitcomb, Michael E.; Weitzer, William H.

    2004-01-01

    This chapter reviews the literature on survey fatigue and summarizes a research project that indicates that administering multiple surveys in one academic year can significantly suppress response rates in later surveys. (Contains 4 tables.)

  7. Cameron Station remedial investigation: Final asbestos survey report. Final report

    SciTech Connect

    1992-02-01

    Woodward-Clyde Federal Services (WCFS) conducted a comprehensive asbestos survey of the facilities at Cameron Station as part of its contract with the US Army Toxic and Hazardous Materials Agency (USATHAMA) to perform a remedial investigation and feasibility study (RI/FS) at the base. The purpose of the survey, which was initiated August 23, 1990 in response to the Base Realignment And Closure Environmental Restoration Strategy (BRAC), was to identify friable and non-friable asbestos-containing material (ACM), provide options for asbestos abatement, provide cost estimates for both abatement and operations and maintenance, and identify conditions requiring immediate action in Cameron Station's 24 buildings. BRAC states that only friable asbestos which presents a threat to health and safety shall be removed; non-friable asbestos, or friable asbestos which is encapsulated or in good repair, shall be left in place and identified to the buyer per GSA agreement. The investigation followed protocols that met or exceeded the requirements of 40 CFR 763, the EPA regulations promulgated under the Asbestos Hazard Emergency Response Act (AHERA).

  8. Competence, continuing education, and computers.

    PubMed

    Hegge, Margaret; Powers, Penny; Hendrickx, Lori; Vinson, Judith

    2002-01-01

    A survey of RNs in South Dakota was performed to determine their perceived level of competence, the extent to which their continuing nursing education (CNE) needs are being met, and their use of computers for CNE. Nationally certified nurses rated themselves significantly more competent than nurses who are not nationally certified. Fewer than half of the RNs reported their CNE needs were being met despite geographic access to CNE and programs available in their specialty. Three-fourths of nurses had computers at home while 76% had computers at work, yet fewer than 20% of nurses used these computers for CNE.

  9. Computational dosimetry

    SciTech Connect

    Siebert, B.R.L.; Thomas, R.H.

    1996-01-01

    The paper presents a definition of the term "Computational Dosimetry" that is interpreted as the sub-discipline of computational physics which is devoted to radiation metrology. It is shown that computational dosimetry is more than a mere collection of computational methods. Computational simulations directed at basic understanding and modelling are important tools provided by computational dosimetry, while another very important application is the support that it can give to the design, optimization and analysis of experiments. However, the primary task of computational dosimetry is to reduce the variance in the determination of absorbed dose (and its related quantities), for example in the disciplines of radiological protection and radiation therapy. In this paper emphasis is given to the discussion of potential pitfalls in the applications of computational dosimetry and recommendations are given for their avoidance. The need for comparison of calculated and experimental data whenever possible is strongly stressed.

  10. Computational Toxicology

    EPA Science Inventory

    'Computational toxicology' is a broad term that encompasses all manner of computer-facilitated informatics, data-mining, and modeling endeavors in relation to toxicology, including exposure modeling, physiologically based pharmacokinetic (PBPK) modeling, dose-response modeling, ...

  11. Cloud Computing

    SciTech Connect

    Pete Beckman and Ian Foster

    2009-12-04

    Chicago Matters: Beyond Burnham (WTTW). Chicago has become a world center of "cloud computing." Argonne experts Pete Beckman and Ian Foster explain what "cloud computing" is and how you probably already use it on a daily basis.

  12. Computer Starters!

    ERIC Educational Resources Information Center

    Instructor, 1983

    1983-01-01

    Instructor's Computer-Using Teachers Board members give practical tips on how to get a classroom ready for a new computer, introduce students to the machine, and help them learn about programing and computer literacy. Safety, scheduling, and supervision requirements are noted. (PP)

  13. Computer Literacy.

    ERIC Educational Resources Information Center

    San Marcos Unified School District, CA.

    THE FOLLOWING IS THE FULL TEXT OF THIS DOCUMENT: After viewing many computer-literacy programs, we believe San Marcos Junior High School has developed a unique program which will truly develop computer literacy. Our hope is to give all students a comprehensive look at computers as they go through their two years here. They will not only learn the…

  14. Distributed Computing.

    ERIC Educational Resources Information Center

    Ryland, Jane N.

    1988-01-01

    The microcomputer revolution, in which small and large computers have gained tremendously in capability, has created a distributed computing environment. This circumstance presents administrators with the opportunities and the dilemmas of choosing appropriate computing resources for each situation. (Author/MSE)

  15. Laser Surveying

    NASA Technical Reports Server (NTRS)

    1978-01-01

    NASA technology has produced a laser-aided system for surveying land boundaries in difficult terrain. It does the job more accurately than conventional methods, takes only one-third the time normally required, and is considerably less expensive. In surveying to mark property boundaries, the objective is to establish an accurate heading between two "corner" points. This is conventionally accomplished by erecting a "range pole" at one point and sighting it from the other point through an instrument called a theodolite. But how do you take a heading between two points which are not visible to each other, for instance, when tall trees, hills or other obstacles obstruct the line of sight? That was the problem confronting the U.S. Department of Agriculture's Forest Service. The Forest Service manages 187 million acres of land in 44 states and Puerto Rico. Unfortunately, National Forest System lands are not contiguous but intermingled in complex patterns with privately-owned land. In recent years much of the private land has been undergoing development for purposes ranging from timber harvesting to vacation resorts. There is a need for precise boundary definition so that both private owners and the Forest Service can manage their properties with confidence that they are not trespassing on the other's land.
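    The heading the surveyors need is simply the grid bearing between the two corner points. As a hedged illustration of the underlying geometry only (planar easting/northing coordinates assumed; this is not the Forest Service's actual laser procedure):

```python
import math

def heading_degrees(e1, n1, e2, n2):
    """Grid bearing (degrees clockwise from north) from point 1 to point 2,
    using planar easting/northing coordinates as in a UTM-style survey grid."""
    return math.degrees(math.atan2(e2 - e1, n2 - n1)) % 360.0

# A point due east of the observer lies at bearing 90 degrees;
# due north is 0, due west is 270.
bearing = heading_degrees(0.0, 0.0, 100.0, 0.0)
```

    The hard part the laser system solves is obtaining the intermediate positions when the two corners are not intervisible; once coordinates exist, the heading itself is this one-line computation.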

  16. Farmland Survey

    NASA Technical Reports Server (NTRS)

    1988-01-01

    A 1981 U.S. Department of Agriculture (USDA) study estimated that the nation is converting farmland to non-agricultural uses at the rate of 3 million acres a year. Seeking information on farmland loss in Florida, the state legislature, in 1984, directed establishment of a program for development of accurate data to enable intelligent legislation of state growth management. Thus was born Florida's massive Mapping and Monitoring of Agricultural Lands Project (MMALP). It employs data from the NASA-developed Landsat Earth resources survey satellite system as a quicker, less expensive alternative to ground surveying. The 3-year project involved an inventory of Florida's 36 million acres, classifying land as cropland, pastureland, citrus, woodlands, wetland, water, and populated areas. Direction was assigned to the Florida Department of Community Affairs (DCA) with assistance from the DOT. With the cooperation of the USDA Soil Conservation Service, DCA decided that combining soil data with the Landsat land cover data would make available to land use planners a more comprehensive view of a county's land potential.

  17. Portable Computer

    NASA Technical Reports Server (NTRS)

    1985-01-01

    SPOC, a navigation monitoring computer used by NASA in a 1983 mission, was a modification of a commercial computer called GRiD Compass, produced by GRiD Systems Corporation. SPOC was chosen because of its small size, large storage capacity, and high processing speed. The principal modification required was a fan to cool the computer. SPOC automatically computes position, orbital paths, communication locations, etc. Some of the modifications were adapted for commercial applications. The computer is presently used in offices for conferences, for on-site development, and by the army as part of a field communications systems.

  18. 78 FR 50374 - Proposed Information Collection; Comment Request; Information and Communication Technology Survey

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-19

    ... software (computers and peripheral equipment; ICT equipment, excluding computers and peripherals; electromedical and electrotherapeutic apparatus; and computer software, including payroll associated with software development). The survey also collects capital expenditures data on the four types of...

  19. Computer sciences

    NASA Technical Reports Server (NTRS)

    Smith, Paul H.

    1988-01-01

    The Computer Science Program provides advanced concepts, techniques, system architectures, algorithms, and software for both space and aeronautics information sciences and computer systems. The overall goal is to provide the technical foundation within NASA for the advancement of computing technology in aerospace applications. The research program is improving the state of knowledge of fundamental aerospace computing principles and advancing computing technology in space applications such as software engineering and information extraction from data collected by scientific instruments in space. The program includes the development of special algorithms and techniques to exploit the computing power provided by high performance parallel processors and special purpose architectures. Research is being conducted in the fundamentals of data base logic and improvement techniques for producing reliable computing systems.

  20. Computers improve sonar seabed maps

    SciTech Connect

    Not Available

    1984-05-01

    A software package for computer aided mapping of sonar (CAMOS) has been developed in Norway. It has automatic mosaic presentation, which produces fully scale-rectified side scan sonograms automatically plotted on geographical and UTM map grids. The program is the first of its kind in the world. The maps produced by this method are more accurate and detailed than those produced by conventional methods. The main applications of CAMOS are: seafloor mapping; pipeline route surveys; pipeline inspection surveys; platform site surveys; geological mapping and geotechnical investigations. With the aerial-photograph quality of the CAMOS maps, a more accurate and visual representation of the seabed is achieved.

  1. Infrastructure Survey 2011

    ERIC Educational Resources Information Center

    Group of Eight (NJ1), 2012

    2012-01-01

    In 2011, the Group of Eight (Go8) conducted a survey on the state of its buildings and infrastructure. The survey is the third Go8 Infrastructure survey, with previous surveys being conducted in 2007 and 2009. The current survey updated some of the information collected in the previous surveys. It also collated data related to aspects of the…

  2. The ASCI Network for SC '98: Dense Wave Division Multiplexing for Distributed and Distance Computing

    SciTech Connect

    Adams, R.L.; Butman, W.; Martinez, L.G.; Pratt, T.J.; Vahle, M.O.

    1999-06-01

    This document highlights DISCOM's distance computing and communication team activities at the 1998 Supercomputing conference in Orlando, Florida. The conference is sponsored by the IEEE and ACM. Sandia National Laboratories, Lawrence Livermore National Laboratory, and Los Alamos National Laboratory have participated in this conference for ten years. For the last three years, the three laboratories have shared a joint booth at the conference under the DOE's Accelerated Strategic Computing Initiative (ASCI). The DISCOM communication team uses the forum to demonstrate and focus communications and networking developments. At SC '98, DISCOM demonstrated the capabilities of Dense Wave Division Multiplexing and exhibited an OC48 ATM encryptor. We also coordinated the other networking activities within the booth. This paper documents those accomplishments, discusses the details of their implementation, and describes how these demonstrations support overall strategies in ATM networking.

  3. A Survey of Collectives

    NASA Technical Reports Server (NTRS)

    Tumer, Kagan; Wolpert, David

    2004-01-01

    Due to the increasing sophistication and miniaturization of computational components, complex, distributed systems of interacting agents are becoming ubiquitous. Such systems, where each agent aims to optimize its own performance, but where there is a well-defined set of system-level performance criteria, are called collectives. The fundamental problem in analyzing/designing such systems is in determining how the combined actions of self-interested agents lead to 'coordinated' behavior on a large scale. Examples of artificial systems which exhibit such behavior include packet routing across a data network, control of an array of communication satellites, coordination of multiple deployables, and dynamic job scheduling across a distributed computer grid. Examples of natural systems include ecosystems, economies, and the organelles within a living cell. No current scientific discipline provides a thorough understanding of the relation between the structure of collectives and how well they meet their overall performance criteria. Although still very young, research on collectives has resulted in successes both in understanding and designing such systems. It is expected that as it matures and draws upon other disciplines related to collectives, this field will greatly expand the range of computationally addressable tasks. Moreover, in addition to drawing on them, such a fully developed field of collective intelligence may provide insight into already established scientific fields, such as mechanism design, economics, game theory, and population biology. This chapter provides a survey of the emerging science of collectives.

  4. A Survey of Automated Activities in the Libraries of Great Britain and the Commonwealth Countries; Volume 2, World Survey Series.

    ERIC Educational Resources Information Center

    Patrinostro, Frank S., Comp.; Sanders, Nancy P., Ed.

    Concerned with identifying computer-based library projects in Great Britain and the Commonwealth countries, this survey is based primarily on survey questionnaires, though information was also gathered from extensive research of the literature. This published report of the survey findings is divided into four parts: (1) an analysis of the Library…

  5. Computer Anxiety and Other Factors Preventing Computer Use among United States Secondary Agricultural Educators.

    ERIC Educational Resources Information Center

    Fletcher, William E.; Deeds, Jacquelyn P.

    1994-01-01

    Survey responses from 176 of 224 secondary agriculture teachers showed that 40.9% had mild to severe computer anxiety, while 59% were relaxed using computers. Math ability was not related to anxiety. Those with less than 10 years of teaching experience were more likely to be computer literate. More support from all administrative levels would increase…

  6. Post Graduate Students' Computing Confidence, Computer and Internet Usage at Kuvempu University--An Indian Study

    ERIC Educational Resources Information Center

    Dange, Jagannath K.

    2010-01-01

    There is a common belief that students entering Post Graduation have appropriate computing skills for study purposes and that there is no longer a felt need for computer training programmes in tertiary education. First-year Post Graduation students were surveyed in 2009 and asked about their Education and Computing backgrounds. Further, the…

  7. The computer-literate nurse.

    PubMed

    Bryson, D M

    1991-01-01

    This study investigated the perceptions of nursing educators concerning the amount and kinds of computer training that should occur in the nursing degree program. Data were collected in two phases: a semi-structured interview of experts in the application of the computer to nursing and a random sample of nursing educators in 2-year and 4-year nursing degree programs. The panel of experts identified objectives within each of seven domains: programming and algorithms, skills in computer usage, major uses and applications, limitations of computers, personal and social aspects, and relevant values and attitudes. The responses of this panel were used to generate a universe of computer literacy objectives. The sample of nursing educators then identified a subset of objectives within the universe that they felt nursing students should master in order to be computer literate. The survey found that nursing educators desire graduates of nursing degree programs to understand how a computer works and to develop skills in using application programs. They do not expect nursing graduates to acquire programming skills; however, they do expect the graduates to acquire skills in using the computer as a tool in nursing. These skills include using a word processor for writing nursing care plans, using computer-aided instruction as a learning tool, using a hospital computer information system, using a computerized library database, and using software for statistical computations.

  8. Computer finds ore

    NASA Astrophysics Data System (ADS)

    Bell, Peter M.

    Artificial intelligence techniques are being used for the first time to evaluate geophysical, geochemical, and geologic data and theory in order to locate ore deposits. After several years of development, an intelligent computer code has been formulated and applied to the Mount Tolman area in Washington state. In a project funded by the United States Geological Survey and the National Science Foundation a set of computer programs, under the general title Prospector, was used successfully to locate a previously unknown ore-grade porphyry molybdenum deposit in the vicinity of Mount Tolman (Science, Sept. 3, 1982). The general area of the deposit had been known to contain exposures of porphyry mineralization. Between 1964 and 1978, exploration surveys had been run by the Bear Creek Mining Company, and later exploration was done in the area by the Amax Corporation. Some of the geophysical data and geochemical and other prospecting surveys were incorporated into the programs, and mine exploration specialists contributed to a set of rules for Prospector. The rules were encoded as ‘inference networks’ to form the ‘expert system’ on which the artificial intelligence codes were based. The molybdenum ore deposit discovered by the test is large, located subsurface, and has an areal extent of more than 18 km².
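    The 'inference network' style of reasoning described above can be illustrated with a toy odds-likelihood update, the evidence-combination scheme popularized by expert systems of this era. This is a hedged sketch of the general technique only, not Prospector's actual rule base; the prior probability and likelihood ratios below are invented for illustration:

```python
def combine_evidence(prior_prob, likelihood_ratios):
    """Toy odds-likelihood update used in classic inference-network expert
    systems: posterior odds = prior odds * product of likelihood ratios."""
    odds = prior_prob / (1.0 - prior_prob)
    for lr in likelihood_ratios:
        odds *= lr
    return odds / (1.0 + odds)

# hypothetical rules: two favorable field observations, one unfavorable
p = combine_evidence(0.01, [20.0, 5.0, 0.5])
```

Each rule contributes a likelihood ratio greater than 1 when an observation favors the hypothesis (here, an ore deposit) and less than 1 when it argues against it; the product updates the prior odds.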

  9. Computer Interview Problem Assessment of Psychiatric Patients

    PubMed Central

    Angle, Hugh V.; Ellinwood, Everett H.; Carroll, Judith

    1978-01-01

    Behavioral Assessment information, a more general form of Problem-Oriented Record data, appears to have many useful clinical qualities and was selected to be the information content for a computer interview system. This interview system was designed to assess problematic behaviors of psychiatric patients. The computer interview covered 29 life problem areas and took patients from four to eight hours to complete. In two reliability studies, the computer interview was compared to human interviews. A greater number of general and specific patient problems were identified in the computer interview than in the human interviews. The attitudes of computer patients and clinicians receiving the computer reports were surveyed.

  10. Predictors for electronic survey completion in healthcare research.

    PubMed

    Beling, Jennifer; Libertini, Linda S; Sun, Zhiyuan; Masina, V Maria; Albert, Nancy M

    2011-05-01

    Few studies have examined patients' preferences for and predictors of completing health surveys by paper versus Internet. The purpose of this study was to examine if participants of registry research preferred to complete health surveys by the Internet or paper, and if demographics and previous computer experiences were associated with health survey completion method preference. Using a descriptive design and convenience sample, participants of colorectal surgery registries completed an 18-item survey about Internet use and personal characteristics. Multiple linear regressions were used to determine predictors of total Internet use and access and survey preference. In 526 participants, preference for Internet-based health survey completion was associated with younger age, higher education, computer ownership, and using e-health medical records (all P ≤ .01). Those who previously completed Internet-based health surveys were more often married or divorced and computer owners and had electronic access to health records (all P ≤ .001). After multivariable regression, the Internet use/access sum score was associated with computer ownership, using a secure Web-based system and preference for completing electronic health surveys (all P < .001). In conclusion, after controlling for demographics, computer ownership, comfort in using Web-based systems including surveys, and access to computerized health records predicted preference for completing research-based health surveys by the Internet.

  11. 2005 Navy MWR Customer Survey

    DTIC Science & Technology

    2007-07-01

    Cybernet cafe --- --- 44% Game room/amusement machines 30% 31% 43% Breakfast at MWR facilities --- --- 30% Bingo --- --- 4% Question 18. For each...On-base movies/theatres 79% Outdoor activities 79% Swimming pools 79% Computers/Internet service 78% Bingo 78% Game room/amusement machines 77...surveys via cell phone - First scientific Navy-wide assessment of cell phone and text messaging use • Importance and Use sections - New in 2005: Bingo

  12. The IRAS Minor Planet Survey

    DTIC Science & Technology

    1992-12-01

    tronomical Satellite (IRAS) and to compute albedos and diameters from their IRAS fluxes. It also presents listings of the results obtained. These...how this material should be referenced. The primary purpose of the Infrared Astronomical Satellite (IRAS) was to survey the sky in four wavelength...bands centered near 12, 25, 60 and 100 μm. The satellite was launched in January 1983 and obtained observations until November 1983. In this period it

  13. Colorado Student Enrollment in Mathematics and Science, Fall 1991. A Survey of Enrollment of Colorado Public School Students, Grades 7-12, in Science, Mathematics, and Computer Science Courses.

    ERIC Educational Resources Information Center

    Hennes, James D.; And Others

    In recent years the graduation requirements in mathematics and science have increased in response to state and national goals calling for increased competency by U.S. graduates in those areas. Data on course enrollment in science, mathematics, and computer science in grades 7-12 were collected from 89 percent of the schools in Colorado in October,…

  14. Developing Computation

    ERIC Educational Resources Information Center

    McIntosh, Alistair

    2004-01-01

    In this article, the author presents the results of a state project that focused on the effect of developing informal written computation processes through Years 2-4. The "developing computation" project was conducted in Tasmania over the two years 2002-2003 and involved nine schools: five government schools, two Catholic schools, and…

  15. Computing Life

    ERIC Educational Resources Information Center

    National Institute of General Medical Sciences (NIGMS), 2009

    2009-01-01

    Computer advances now let researchers quickly search through DNA sequences to find gene variations that could lead to disease, simulate how flu might spread through one's school, and design three-dimensional animations of molecules that rival any video game. By teaming computers and biology, scientists can answer new and old questions that could…

  16. Computer News

    ERIC Educational Resources Information Center

    Science Activities: Classroom Projects and Curriculum Ideas, 2007

    2007-01-01

    This article presents several news stories about computers and technology. (1) Applied Science Associates of Narragansett, Rhode Island is providing computer modeling technology to help locate the remains to the USS Bonhomme Richard, which sank in 1779 after claiming a Revolutionary War victory. (2) Whyville, the leading edu-tainment virtual world…

  17. Grid Computing

    NASA Astrophysics Data System (ADS)

    Foster, Ian

    2001-08-01

    The term "Grid Computing" refers to the use, for computational purposes, of emerging distributed Grid infrastructures: that is, network and middleware services designed to provide on-demand and high-performance access to all important computational resources within an organization or community. Grid computing promises to enable both evolutionary and revolutionary changes in the practice of computational science and engineering based on new application modalities such as high-speed distributed analysis of large datasets, collaborative engineering and visualization, desktop access to computation via "science portals," rapid parameter studies and Monte Carlo simulations that use all available resources within an organization, and online analysis of data from scientific instruments. In this article, I examine the status of Grid computing circa 2000, briefly reviewing some relevant history, outlining major current Grid research and development activities, and pointing out likely directions for future work. I also present a number of case studies, selected to illustrate the potential of Grid computing in various areas of science.

  18. Computational Pathology

    PubMed Central

    Louis, David N.; Feldman, Michael; Carter, Alexis B.; Dighe, Anand S.; Pfeifer, John D.; Bry, Lynn; Almeida, Jonas S.; Saltz, Joel; Braun, Jonathan; Tomaszewski, John E.; Gilbertson, John R.; Sinard, John H.; Gerber, Georg K.; Galli, Stephen J.; Golden, Jeffrey A.; Becich, Michael J.

    2016-01-01

    Context We define the scope and needs within the new discipline of computational pathology, a discipline critical to the future of both the practice of pathology and, more broadly, medical practice in general. Objective To define the scope and needs of computational pathology. Data Sources A meeting was convened in Boston, Massachusetts, in July 2014 prior to the annual Association of Pathology Chairs meeting, and it was attended by a variety of pathologists, including individuals highly invested in pathology informatics as well as chairs of pathology departments. Conclusions The meeting made recommendations to promote computational pathology, including clearly defining the field and articulating its value propositions; asserting that the value propositions for health care systems must include means to incorporate robust computational approaches to implement data-driven methods that aid in guiding individual and population health care; leveraging computational pathology as a center for data interpretation in modern health care systems; stating that realizing the value proposition will require working with institutional administrations, other departments, and pathology colleagues; declaring that a robust pipeline should be fostered that trains and develops future computational pathologists, for those with both pathology and non-pathology backgrounds; and deciding that computational pathology should serve as a hub for data-related research in health care systems. The dissemination of these recommendations to pathology and bioinformatics departments should help facilitate the development of computational pathology. PMID:26098131

  19. Computer Insecurity.

    ERIC Educational Resources Information Center

    Wilson, David L.

    1994-01-01

    College administrators recently appealed to students and faculty to change their computer passwords after security experts announced that tens of thousands had been stolen by computer hackers. Federal officials are investigating. Such attacks are not uncommon, but the most effective solutions are either inconvenient or cumbersome. (MSE)

  20. Computational astrophysics

    NASA Technical Reports Server (NTRS)

    Miller, Richard H.

    1987-01-01

    Astronomy is an area of applied physics in which unusually beautiful objects challenge the imagination to explain observed phenomena in terms of known laws of physics. It is a field that has stimulated the development of physical laws and of mathematical and computational methods. Current computational applications are discussed in terms of stellar and galactic evolution, galactic dynamics, and particle motions.
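    The particle-motion computations mentioned above typically rest on symplectic integrators. Below is a minimal kick-drift-kick leapfrog sketch for gravitational N-body motion, a generic illustration in normalized units with G = 1, not code drawn from the article:

```python
import numpy as np

def leapfrog(pos, vel, masses, dt, n_steps, G=1.0):
    """Kick-drift-kick leapfrog integration of gravitational N-body motion.
    pos, vel: (n, d) arrays; masses: (n,). Returns updated pos, vel."""
    def accel(p):
        a = np.zeros_like(p)
        for i in range(len(p)):
            for j in range(len(p)):
                if i != j:
                    d = p[j] - p[i]
                    a[i] += G * masses[j] * d / np.linalg.norm(d) ** 3
        return a
    for _ in range(n_steps):
        vel = vel + 0.5 * dt * accel(pos)   # half kick
        pos = pos + dt * vel                # drift
        vel = vel + 0.5 * dt * accel(pos)   # half kick
    return pos, vel

# equal-mass binary on a circular orbit (normalized units, G = 1)
pos = np.array([[-1.0, 0.0], [1.0, 0.0]])
vel = np.array([[0.0, -0.5], [0.0, 0.5]])
masses = np.array([1.0, 1.0])
pos, vel = leapfrog(pos, vel, masses, dt=0.01, n_steps=1000)
```

Because the scheme is symplectic, its energy error stays bounded over long integrations, which is why leapfrog variants dominate galactic-dynamics codes.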

  1. Computer Recreations.

    ERIC Educational Resources Information Center

    Dewdney, A. K.

    1989-01-01

    Discussed are three examples of computer graphics including biomorphs, Truchet tilings, and fractal popcorn. The graphics are shown and the basic algorithm using multiple iteration of a particular function or mathematical operation is described. An illustration of a snail shell created by computer graphics is presented. (YP)

  2. Computer Graphics.

    ERIC Educational Resources Information Center

    Halpern, Jeanne W.

    1970-01-01

    Computer graphics have been called the most exciting development in computer technology. At the University of Michigan, three kinds of graphics output equipment are now being used: symbolic printers, line plotters or drafting devices, and cathode-ray tubes (CRT). Six examples are given that demonstrate the range of graphics use at the University.…

  3. I, Computer

    ERIC Educational Resources Information Center

    Barack, Lauren

    2005-01-01

    What child hasn't chatted with friends through a computer? But chatting with a computer? Some Danish scientists have literally put a face on their latest software program, bringing to virtual life storyteller Hans Christian Andersen, who engages users in actual conversations. The digitized Andersen resides at the Hans Christian Andersen Museum in…

  4. Teaching perspectives among introductory computer programming faculty in higher education

    NASA Astrophysics Data System (ADS)

    Mainier, Michael J.

    This study identified the teaching beliefs, intentions, and actions of 80 introductory computer programming (CS1) faculty members from institutions of higher education in the United States using the Teacher Perspectives Inventory. Instruction method used inside the classroom, categorized by ACM CS1 curriculum guidelines, was also captured along with information to develop a demographic profile of respondents. The combined belief, intention, and action scores of introductory computer programming faculty displayed a dominant trend within the apprenticeship perspective while indicating a general preference for the imperative-first instruction method. This result indicates a possible misalignment between these teachers' underlying commitment to simulating the experience of computer programming and their instructional approach of lecture and textbook learning. The factors of teaching experience and first language were found to have significant influence on faculty, particularly within the social reform perspective: established faculty members possess the intent to change society for the better, while instructors born outside of the U.S. are more likely to actually teach through this perspective.

  5. A Web of Resources for Introductory Computer Science.

    ERIC Educational Resources Information Center

    Rebelsky, Samuel A.

    As the field of Computer Science has grown, the syllabus of the introductory Computer Science course has changed significantly. No longer is it a simple introduction to programming or a tutorial on computer concepts and applications. Rather, it has become a survey of the field of Computer Science, touching on a wide variety of topics from digital…

  6. Computers in Schools of Southeast Texas in 1994.

    ERIC Educational Resources Information Center

    Henderson, David L.; Renfrow, Raylene

    This paper reviews literature on the use of computers at work and home, computer skills needed by new teachers, and suggestions for administrators to support computer usage in schools. A survey of 52 school districts serving the Houston area of southeast Texas is reported, indicating that 22,664 computers were in use, with a mean of 436 computers…

  7. A survey of underwater-acoustic ray tracing techniques

    NASA Astrophysics Data System (ADS)

    Jones, R. M.

    1983-06-01

    A survey of techniques and features available in underwater acoustic ray tracing computer programs is presented. The survey includes methods for constructing raypath trajectories, constructing eigenrays, ray-intensity calculations, and ray-theory corrections. The survey also includes models for sound speed (including interpolation methods), ocean bottom (including both bathymetry and reflection coefficient), ocean surface reflection coefficient, dissipation, temperature, salinity, and ocean current. In addition, methods for displaying models and methods for presenting ray tracing results are surveyed.
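    As a flavor of the raypath-construction methods such programs implement, the following is a minimal 2-D ray marcher for a horizontally stratified ocean using Snell's invariant cos(θ)/c = const. It is a simplified sketch (straight steps, no bathymetry or surface reflection), not any of the surveyed programs:

```python
import math

def trace_ray(c_of_z, z0, theta0, ds=1.0, n_steps=2000):
    """March a 2-D acoustic ray through a depth-dependent sound-speed
    profile c(z). theta0 is the initial grazing angle from the horizontal
    (positive = downgoing, z increasing with depth); Snell's law keeps
    cos(theta)/c constant along the ray. Returns a list of (range, depth)."""
    a = math.cos(theta0) / c_of_z(z0)  # Snell invariant along the ray
    z, r = z0, 0.0
    sign = 1.0 if theta0 >= 0 else -1.0
    path = [(r, z)]
    for _ in range(n_steps):
        cos_t = min(1.0, a * c_of_z(z))
        sin_t = math.sqrt(1.0 - cos_t * cos_t)
        z_try = z + sign * ds * sin_t
        if a * c_of_z(z_try) >= 1.0:  # next step would pass the turning depth
            sign = -sign              # ...so the ray refracts back instead
            z_try = z + sign * ds * sin_t
        r += ds * cos_t
        z = z_try
        path.append((r, z))
    return path

# constant sound speed: the ray is a straight line at the launch angle
straight = trace_ray(lambda z: 1500.0, 0.0, math.radians(10), n_steps=2000)

# speed increasing with depth: a downgoing ray refracts back toward the surface
refracted = trace_ray(lambda z: 1500.0 + 0.05 * z, 0.0, math.radians(5), n_steps=4000)
```

In a constant profile the traced ray is straight; in a profile where c increases with depth, the ray turns at the depth where cos(θ) would exceed 1, the behavior eigenray and intensity calculations build upon.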

  8. Coal-seismic, desktop computer programs in BASIC; Part 7, Display and compute shear-pair seismograms

    USGS Publications Warehouse

    Hasbrouck, W.P.

    1983-01-01

    Processing of geophysical data taken with the U.S. Geological Survey's coal-seismic system is done with a desk-top, stand-alone computer. Programs for this computer are written in the extended BASIC language utilized by the Tektronix 4051 Graphic System. This report discusses and presents five computer programs used to display and compute shear-pair seismograms.

  9. Universal computer test stand (recommended computer test requirements). [for space shuttle computer evaluation

    NASA Technical Reports Server (NTRS)

    1973-01-01

    Techniques are considered for characterizing aerospace computers, with the space shuttle application as the end usage. The system-level digital problems which have been encountered and documented are surveyed. From the large cross section of tests, an optimum set is recommended that has a high probability of discovering documented system-level digital problems within laboratory environments. A baseline hardware/software system is defined that is required as a laboratory tool to test aerospace computers. Hardware and software baselines, and the additions necessary to interface the UTE to aerospace computers for test purposes, are outlined.

  10. Multiresolution Wavelet Based Adaptive Numerical Dissipation Control for Shock-Turbulence Computations

    NASA Technical Reports Server (NTRS)

    Sjoegreen, B.; Yee, H. C.

    2001-01-01

    The recently developed essentially fourth-order or higher low-dissipative shock-capturing scheme of Yee, Sandham and Djomehri (1999) aims at minimizing numerical dissipation for high-speed compressible viscous flows containing shocks, shears and turbulence. To detect non-smooth behavior and control the amount of numerical dissipation to be added, Yee et al. employed an artificial compression method (ACM) of Harten (1978), but utilized it in an entirely different context than Harten originally intended. The ACM sensor consists of two tuning parameters and is highly physical problem dependent. To minimize the tuning of parameters and physical problem dependence, new sensors with improved detection properties are proposed. The new sensors are derived from appropriate non-orthogonal wavelet basis functions, and they can be used to completely switch off the extra numerical dissipation outside shock layers. The non-dissipative spatial base scheme of arbitrarily high order of accuracy can be maintained without compromising its stability at all parts of the domain where the solution is smooth. Two types of redundant non-orthogonal wavelet basis functions are considered. One is the B-spline wavelet (Mallat & Zhong 1992) used by Gerritsen and Olsson (1996) in an adaptive mesh refinement method to determine regions where refinement should be done. The other is a modification of the multiresolution method of Harten (1995), converting it to a new, redundant, non-orthogonal wavelet. The wavelet sensor is then obtained by computing the estimated Lipschitz exponent of a chosen physical quantity (or vector) to be sensed on a chosen wavelet basis function. Both wavelet sensors can be viewed as dual-purpose adaptive methods leading to dynamic numerical dissipation control and improved grid adaptation indicators. Consequently, they are useful not only for shock-turbulence computations but also for computational aeroacoustics and numerical combustion.
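    The sensing idea in this abstract — estimating a local Lipschitz exponent from the decay of wavelet-coefficient magnitudes across scales, and flagging non-smooth regions where extra dissipation is needed — can be sketched in a few lines. This is a generic illustration using a derivative-of-Gaussian wavelet, not the authors' B-spline or Harten-multiresolution sensors:

```python
import numpy as np

def wavelet_sensor(u, scales=(2, 4, 8)):
    """Estimate a local smoothness (Lipschitz) exponent at each grid point
    from the decay |W_s u(x)| ~ C * s^alpha of wavelet-coefficient
    magnitudes across scales s. Small alpha flags shocks/discontinuities;
    larger alpha means the solution is locally smooth."""
    mags = []
    for s in scales:
        k = np.arange(-4 * s, 4 * s + 1)
        # first-derivative-of-Gaussian wavelet, 1/s ("L1") scale normalization
        psi = -(k / s) * np.exp(-((k / s) ** 2) / 2.0) / s
        mags.append(np.abs(np.convolve(u, psi, mode="same")) + 1e-12)
    # least-squares slope of log|W| versus log s = estimated exponent
    coef = np.polyfit(np.log(np.array(scales)), np.log(np.array(mags)), 1)
    return coef[0]

x = np.arange(512) / 512.0
u = np.sin(2 * np.pi * x) + (x >= 0.5)   # smooth wave plus a jump at x = 0.5
alpha = wavelet_sensor(u)
# alpha is near 1 in the smooth region and small at the discontinuity
```

A dissipation controller would then blend in the shock-capturing terms only where the estimated exponent falls below a threshold, leaving the high-order base scheme untouched elsewhere.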

  11. Cloud Computing

    DTIC Science & Technology

    2009-11-12

    Eucalyptus Systems • Provides an open-source application that can be used to implement a cloud computing environment on a datacenter • Trying to establish an...Summary Cloud Computing is in essence an economic model • It is a different way to acquire and manage IT resources There are multiple cloud providers...edgeplatform.html • Amazon Elastic Compute Cloud (EC2): http://aws.amazon.com/ec2/ • Amazon Simple Storage Solution (S3): http://aws.amazon.com/s3/ • Eucalyptus

  12. Optical computing.

    NASA Technical Reports Server (NTRS)

    Stroke, G. W.

    1972-01-01

    Applications of the optical computer include an approach for increasing the sharpness of images obtained from the most powerful electron microscopes and fingerprint/credit card identification. The information-handling capability of the various optical computing processes is very great. Modern synthetic-aperture radars scan upward of 100,000 resolvable elements per second. Fields which have assumed major importance on the basis of optical computing principles are optical image deblurring, coherent side-looking synthetic-aperture radar, and correlative pattern recognition. Some examples of the most dramatic image deblurring results are shown.
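    Optical image deblurring of the kind described here is, mathematically, deconvolution. A minimal digital analogue is the Wiener filter, shown below as a generic sketch (the image and point-spread function are synthetic, not taken from the article):

```python
import numpy as np

def wiener_deblur(blurred, psf, k=1e-3):
    """Frequency-domain Wiener deconvolution: invert the blur transfer
    function H, regularized by k so near-zeros of H do not blow up."""
    H = np.fft.fft2(psf, s=blurred.shape)
    G = np.fft.fft2(blurred)
    return np.real(np.fft.ifft2(np.conj(H) * G / (np.abs(H) ** 2 + k)))

# synthetic test scene: a smooth image blurred by a circular Gaussian PSF
n = 64
yy, xx = np.mgrid[0:n, 0:n]
image = np.sin(2 * np.pi * xx / n) * np.cos(2 * np.pi * yy / n)
t = np.arange(n) - n // 2
g = np.exp(-t ** 2 / 8.0)
psf = np.outer(g, g)
psf = np.roll(psf, (-(n // 2), -(n // 2)), axis=(0, 1))  # center at (0, 0)
psf /= psf.sum()
blurred = np.real(np.fft.ifft2(np.fft.fft2(image) * np.fft.fft2(psf)))
restored = wiener_deblur(blurred, psf)
```

The regularization constant k keeps the division stable at frequencies where H is small, which is exactly where measurement noise would otherwise be amplified.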

  13. Alumni Perspectives Survey, 2010. Survey Report

    ERIC Educational Resources Information Center

    Sheikh, Sabeen

    2010-01-01

    During the months of April and September of 2009, the Graduate Management Admission Council[R] (GMAC[R]) conducted the Alumni Perspectives Survey, a longitudinal study of prior respondents to the Global Management Education Graduate Survey of management students nearing graduation. A total of 3,708 alumni responded to the April 2009 survey,…

  14. 2012 Alumni Perspectives Survey. Survey Report

    ERIC Educational Resources Information Center

    Leach, Laura

    2012-01-01

    Conducted in September 2011, this Alumni Perspectives Survey by the Graduate Management Admission Council (GMAC) is a longitudinal study of respondents to the Global Management Education Graduate Survey, the annual GMAC[R] exit survey of graduate management students in their final year of business school. This 12th annual report includes responses…

  15. Computer Science Research at Langley

    NASA Technical Reports Server (NTRS)

    Voigt, S. J. (Editor)

    1982-01-01

    A workshop was held at Langley Research Center, November 2-5, 1981, to highlight ongoing computer science research at Langley and to identify additional areas of research based upon the computer user requirements. A panel discussion was held in each of nine application areas, and these are summarized in the proceedings. Slides presented by the invited speakers are also included. A survey of scientific, business, data reduction, and microprocessor computer users helped identify areas of focus for the workshop. Several areas of computer science which are of most concern to the Langley computer users were identified during the workshop discussions. These include graphics, distributed processing, programmer support systems and tools, database management, and numerical methods.

  16. Astronomical surveys and big data

    NASA Astrophysics Data System (ADS)

    Mickaelian, Areg M.

    Recent all-sky and large-area astronomical surveys and their catalogued data over the whole range of the electromagnetic spectrum, from γ-rays to radio waves, are reviewed, including Fermi-GLAST and INTEGRAL in γ-ray, ROSAT, XMM and Chandra in X-ray, GALEX in UV, SDSS and several POSS I and POSS II-based catalogues (APM, MAPS, USNO, GSC) in the optical range, 2MASS in NIR, WISE and AKARI IRC in MIR, IRAS and AKARI FIS in FIR, NVSS and FIRST in radio range, and many others, as well as the most important surveys giving optical images (DSS I and II, SDSS, etc.), proper motions (Tycho, USNO, Gaia), variability (GCVS, NSVS, ASAS, Catalina, Pan-STARRS), and spectroscopic data (FBS, SBS, Case, HQS, HES, SDSS, CALIFA, GAMA). An overall understanding of the coverage along the whole wavelength range and comparisons between various surveys are given: galaxy redshift surveys, QSO/AGN, radio, Galactic structure, and Dark Energy surveys. Astronomy has entered the Big Data era, with Astrophysical Virtual Observatories and Computational Astrophysics playing an important role in using and analyzing big data for new discoveries.

  17. Applied technology center business plan and market survey

    NASA Technical Reports Server (NTRS)

    Hodgin, Robert F.; Marchesini, Roberto

    1990-01-01

    A business plan and market survey for the Applied Technology Center (ATC), a non-profit corporation for computer technology transfer and development, are presented. The mission of the ATC is to stimulate innovation in state-of-the-art and leading-edge computer-based technology. The ATC encourages the practical utilization of late-breaking computer technologies by firms of all varieties.

  18. Computer Stimulation

    ERIC Educational Resources Information Center

    Moore, John W.; Moore, Elizabeth

    1977-01-01

    Discusses the computer simulation approach of "Limits to Growth," in which interactions of five variables (population, pollution, resources, food per capita, and industrial output per capita) indicate the status of the world. Reviews other books that predict the future of the world. (CS)

  19. Computer Poker

    ERIC Educational Resources Information Center

    Findler, Nicholas V.

    1978-01-01

    This familiar card game has interested mathematicians, economists, and psychologists as a model of decision-making in the real world. It is now serving as a vehicle for investigations in computer science. (Author/MA)

  20. Evolutionary Computing

    SciTech Connect

    Patton, Robert M; Cui, Xiaohui; Jiao, Yu; Potok, Thomas E

    2008-01-01

    The rate at which information overwhelms humans is significantly greater than the rate at which humans have learned to process, analyze, and leverage it. To overcome this challenge, new methods of computing must be formulated, and scientists and engineers have looked to nature for inspiration in developing these new methods. Consequently, evolutionary computing has emerged as a new paradigm for computing, and has rapidly demonstrated its ability to solve real-world problems where traditional techniques have failed. This field of work has now become quite broad and encompasses areas ranging from artificial life to neural networks. This chapter focuses specifically on two sub-areas of nature-inspired computing: Evolutionary Algorithms and Swarm Intelligence.
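    A minimal concrete instance of the evolutionary-algorithm side of this field is Gaussian mutation plus truncation ("survival of the fittest") selection on a real-valued genome. This is a generic textbook-style sketch, not code from the chapter; the toy fitness function is invented:

```python
import numpy as np

def evolve(fitness, dim=8, pop_size=30, generations=300, seed=1):
    """Minimal (mu + lambda) evolutionary algorithm: Gaussian mutation of a
    real-valued population followed by truncation selection of the fittest."""
    rng = np.random.default_rng(seed)
    pop = rng.normal(size=(pop_size, dim))
    for _ in range(generations):
        children = pop + rng.normal(scale=0.1, size=pop.shape)  # mutate
        both = np.vstack([pop, children])
        scores = np.array([fitness(ind) for ind in both])
        pop = both[np.argsort(scores)[-pop_size:]]              # select
    return pop[-1], fitness(pop[-1])

# toy objective: maximize -sum((x - 3)^2), whose optimum is x = 3 everywhere
best, score = evolve(lambda x: -np.sum((x - 3.0) ** 2))
```

Swarm-intelligence methods follow the same population-and-variation pattern but replace truncation selection with attraction toward the best solutions found so far.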

  1. Computer Calculus.

    ERIC Educational Resources Information Center

    Steen, Lynn Arthur

    1981-01-01

    The development of symbolic computer algebra designed to manipulate abstract mathematical expressions is discussed. The ability of this software to mimic the standard patterns of human problem solving represents a major advance toward "true" artificial intelligence. (MP)

  2. Personal Computers.

    ERIC Educational Resources Information Center

    Toong, Hoo-min D.; Gupta, Amar

    1982-01-01

    Describes the hardware, software, applications, and current proliferation of personal computers (microcomputers). Includes discussions of microprocessors, memory, output (including printers), application programs, the microcomputer industry, and major microcomputer manufacturers (Apple, Radio Shack, Commodore, and IBM). (JN)

  3. The influence of computer literacy and computer anxiety on computer self-efficacy: the moderating effect of gender.

    PubMed

    Lee, Chun-Lin; Huang, Ming-Kuei

    2014-03-01

    Although researchers have published many studies on computer literacy and anxiety related to computer self-efficacy, there are two gaps in relevant literature. First, the effects of computer literacy and computer anxiety on computer self-efficacy are considered separately, yet their interaction effect is neglected. Second, the role of individual gender characteristics in the relationships between computer literacy and anxiety on computer self-efficacy is far from clear. To address these two concerns, this study empirically investigates the interaction effect between computer literacy and computer anxiety, and the moderating role of gender. This study tests hypotheses using survey data from people who have experience using computers in Taiwan, and uses hierarchical regression to analyze the models. Results indicate that computer literacy can help form positive computer self-efficacy more effectively for males than for females, and computer anxiety can lead to more negative computer self-efficacy for females than for males. A three-way interaction also exists among computer literacy, computer anxiety, and gender. The results, research contributions, and limitations are discussed, and implications for future studies are suggested.
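    The kind of moderation analysis this study reports — product (interaction) terms entered into a regression alongside main effects — can be sketched generically. The data below are simulated with made-up coefficients; the variable names only echo the study's constructs:

```python
import numpy as np

# Simulated data with built-in moderation effects (coefficients invented):
# literacy raises the outcome more for one gender, anxiety lowers it more
# for the other, and a three-way literacy x anxiety x gender term is added.
rng = np.random.default_rng(0)
n = 2000
literacy = rng.normal(size=n)
anxiety = rng.normal(size=n)
gender = rng.integers(0, 2, size=n)          # 0/1 dummy coding (illustrative)
y = (0.5 * literacy - 0.4 * anxiety
     + 0.3 * literacy * gender - 0.3 * anxiety * gender
     + 0.2 * literacy * anxiety * gender
     + rng.normal(scale=0.5, size=n))

# full model: main effects, all two-way products, and the three-way product
X = np.column_stack([
    np.ones(n), literacy, anxiety, gender,
    literacy * anxiety, literacy * gender, anxiety * gender,
    literacy * anxiety * gender,
])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
# beta[-1] recovers the simulated three-way interaction (about 0.2)
```

In a true hierarchical analysis the blocks of product terms are entered stepwise and the R² change is tested at each step; the sketch simply fits the full model and reads off the interaction coefficients.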

  4. LHC Computing

    SciTech Connect

    Lincoln, Don

    2015-07-28

    The LHC is the world’s highest energy particle accelerator and scientists use it to record an unprecedented amount of data. This data is recorded in electronic format and it requires an enormous computational infrastructure to convert the raw data into conclusions about the fundamental rules that govern matter. In this video, Fermilab’s Dr. Don Lincoln gives us a sense of just how much data is involved and the incredible computer resources that makes it all possible.

  5. Quantum Computing

    DTIC Science & Technology

    1998-04-01

    information representation and processing technology, although faster than the wheels and gears of the Charles Babbage computation machine, is still in...the same computational complexity class as the Babbage machine, with bits of information represented by entities which obey classical (non-quantum...nuclear double resonances Charles M Bowden and Jonathan P. Dowling Weapons Sciences Directorate, AMSMI-RD-WS-ST Missile Research, Development, and

  6. The State of Computers in the State of Arkansas.

    ERIC Educational Resources Information Center

    Schoppmeyer, Martin W.; And Others

    To explore and document the status of computer use in Arkansas, a survey was sent to each of the 310 school superintendents in the state, and 221 surveys were returned. Results indicated that only a minority of the schools had a computer in every room; these tended to be placed in lower grade classrooms. Excepting kindergarten, the majority of…

  7. Computer Viruses: An Assessment of Student Perceptions.

    ERIC Educational Resources Information Center

    Jones, Mary C.; Arnett, Kirk P.

    1992-01-01

    A majority of 213 college business students surveyed had knowledge of computer viruses; one-fourth had been exposed to them. Many believed that computer professionals are responsible for prevention and cure. Educators should make students aware of multiple sources of infection, the breadth and extent of possible damage, and viral detection and…

  8. Computing Strategies in Small Universities and Colleges.

    ERIC Educational Resources Information Center

    Coughlin, Patrick J.

    A survey was conducted to identify the patterns of academic and administrative computer services in use--or planned for the near future--in small colleges and universities as they relate to such strategic policy areas as: (1) management/governance structure; (2) personnel-staff; (3) personnel-faculty; (4) academic computing; (5) library services;…

  9. Graph Partitioning Models for Parallel Computing

    SciTech Connect

    Hendrickson, B.; Kolda, T.G.

    1999-03-02

    Calculations can naturally be described as graphs in which vertices represent computation and edges reflect data dependencies. By partitioning the vertices of a graph, the calculation can be divided among processors of a parallel computer. However, the standard methodology for graph partitioning minimizes the wrong metric and lacks expressibility. We survey several recently proposed alternatives and discuss their relative merits.
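The "wrong metric" the abstract refers to is the standard edge cut; a minimal sketch (illustrative Python, not code from the report) of what that classic objective actually counts:

```python
# Minimal sketch of the standard edge-cut metric for a vertex partition.
# The survey's point is that minimizing this count only approximates the
# true communication cost; this code just shows the classic measure.

def edge_cut(edges, part):
    """Count edges whose endpoints fall in different parts.

    edges: iterable of (u, v) pairs; part: dict mapping vertex -> part id.
    """
    return sum(1 for u, v in edges if part[u] != part[v])

# A 4-vertex path 0-1-2-3 split into two halves cuts exactly one edge.
edges = [(0, 1), (1, 2), (2, 3)]
part = {0: 0, 1: 0, 2: 1, 3: 1}
print(edge_cut(edges, part))  # -> 1
```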

  10. Computational chemistry

    NASA Technical Reports Server (NTRS)

    Arnold, J. O.

    1987-01-01

    With the advent of supercomputers, modern computational chemistry algorithms and codes, a powerful tool was created to help fill NASA's continuing need for information on the properties of matter in hostile or unusual environments. Computational resources provided under the National Aerodynamics Simulator (NAS) program were a cornerstone for recent advancements in this field. Properties of gases, materials, and their interactions can be determined from solutions of the governing equations. In the case of gases, for example, radiative transition probabilites per particle, bond-dissociation energies, and rates of simple chemical reactions can be determined computationally as reliably as from experiment. The data are proving to be quite valuable in providing inputs to real-gas flow simulation codes used to compute aerothermodynamic loads on NASA's aeroassist orbital transfer vehicles and a host of problems related to the National Aerospace Plane Program. Although more approximate, similar solutions can be obtained for ensembles of atoms simulating small particles of materials with and without the presence of gases. Computational chemistry has application in studying catalysis, properties of polymers, all of interest to various NASA missions, including those previously mentioned. In addition to discussing these applications of computational chemistry within NASA, the governing equations and the need for supercomputers for their solution is outlined.

  11. Online Computer Gaming: A Comparison of Adolescent and Adult Gamers

    ERIC Educational Resources Information Center

    Griffiths, M. D.; Davies, Mark N. O.; Chappell, Darren

    2004-01-01

    Despite the growing popularity of online game playing, there have been no surveys comparing adolescent and adult players. Therefore, an online questionnaire survey was used to examine various factors of online computer game players (n=540) who played the most popular online game Everquest. The survey examined basic demographic information, playing…

  12. Creation of a full color geologic map by computer: A case history from the Port Moller project resource assessment, Alaska Peninsula: A section in Geologic studies in Alaska by the U.S. Geological Survey, 1988

    USGS Publications Warehouse

    Wilson, Frederic H.

    1989-01-01

    Graphics programs on computers can facilitate the compilation and production of geologic maps, including full color maps of publication quality. This paper describes the application of two different programs, GSMAP and ARC/INFO, to the production of a geologic map of the Port Moller and adjacent 1:250,000-scale quadrangles on the Alaska Peninsula. GSMAP was used at first because of easy digitizing on inexpensive computer hardware. Limitations in its editing capability led to transfer of the digital data to ARC/INFO, a Geographic Information System, which has better editing and also added data analysis capability. Although these improved capabilities are accompanied by increased complexity, the availability of ARC/INFO's data analysis capability provides unanticipated advantages. It allows digital map data to be processed as one of multiple data layers for mineral resource assessment. As a result of development of both software packages, it is now easier to apply both software packages to geologic map production. Both systems accelerate the drafting and revision of maps and enhance the compilation process. Additionally, ARC/INFO's analysis capability enhances the geologist's ability to develop answers to questions of interest that were previously difficult or impossible to obtain.

  13. Locating waterfowl observations on aerial surveys

    USGS Publications Warehouse

    Butler, W.I.; Hodges, J.I.; Stehn, R.A.

    1995-01-01

    We modified standard aerial survey data collection to obtain the geographic location for each waterfowl observation on surveys in Alaska during 1987-1993. Using transect navigation with GPS (global positioning system), data recording on continuously running tapes, and a computer data input program, we located observations with an average deviation along transects of 214 m. The method provided flexibility in survey design and data analysis. Although developed for geese nesting near the coast of the Yukon-Kuskokwim Delta, the methods are widely applicable and were used on other waterfowl surveys in Alaska to map distribution and relative abundance of waterfowl. Accurate location data with GIS analysis and display may improve precision and usefulness of data from any aerial transect survey.
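The average deviation reported above is a point-to-line distance; a hedged sketch of how such a deviation might be computed (an assumed method with a hypothetical `deviation_m` helper, not the authors' code), using a flat-earth approximation that is reasonable over the few hundred metres involved:

```python
# Hedged sketch: perpendicular distance (metres) from an observation to its
# transect line, via a local equirectangular projection. The metres-per-degree
# constants are standard mid-latitude approximations, not survey parameters.
import math

def deviation_m(obs, a, b):
    """Distance in metres from point obs to the line through a and b.

    Points are (lat, lon) in degrees.
    """
    lat0 = math.radians((a[0] + b[0]) / 2)   # project about mean latitude

    def xy(p):  # degrees -> metres in a local plane
        return (111320.0 * p[1] * math.cos(lat0), 110540.0 * p[0])

    (x1, y1), (x2, y2), (px, py) = xy(a), xy(b), xy(obs)
    # standard point-to-line distance via the cross-product magnitude
    num = abs((x2 - x1) * (y1 - py) - (x1 - px) * (y2 - y1))
    return num / math.hypot(x2 - x1, y2 - y1)

# An observation 0.002 degrees of latitude off an east-west transect sits
# roughly 220 m from the line, comparable to the 214 m average reported.
print(round(deviation_m((61.002, -164.5), (61.0, -165.0), (61.0, -164.0))))
```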

  14. Computational mechanics

    SciTech Connect

    Goudreau, G.L.

    1993-03-01

    The Computational Mechanics thrust area sponsors research into the underlying solid, structural and fluid mechanics and heat transfer necessary for the development of state-of-the-art general purpose computational software. The scale of computational capability spans office workstations, departmental computer servers, and Cray-class supercomputers. The DYNA, NIKE, and TOPAZ codes have achieved world fame through our broad collaborators program, in addition to their strong support of on-going Lawrence Livermore National Laboratory (LLNL) programs. Several technology transfer initiatives have been based on these established codes, teaming LLNL analysts and researchers with counterparts in industry, extending code capability to specific industrial interests of casting, metalforming, and automobile crash dynamics. The next-generation solid/structural mechanics code, ParaDyn, is targeted toward massively parallel computers, which will extend performance from gigaflop to teraflop power. Our work for FY-92 is described in the following eight articles: (1) Solution Strategies: New Approaches for Strongly Nonlinear Quasistatic Problems Using DYNA3D; (2) Enhanced Enforcement of Mechanical Contact: The Method of Augmented Lagrangians; (3) ParaDyn: New Generation Solid/Structural Mechanics Codes for Massively Parallel Processors; (4) Composite Damage Modeling; (5) HYDRA: A Parallel/Vector Flow Solver for Three-Dimensional, Transient, Incompressible Viscous Flow; (6) Development and Testing of the TRIM3D Radiation Heat Transfer Code; (7) A Methodology for Calculating the Seismic Response of Critical Structures; and (8) Reinforced Concrete Damage Modeling.

  15. Research on Computer-Based Education for Reading Teachers: A 1989 Update. Results of the First National Assessment of Computer Competence.

    ERIC Educational Resources Information Center

    Balajthy, Ernest

    Results of the 1985-86 National Assessment of Educational Progress (NAEP) survey of American students' knowledge of computers suggest that American schools have a long way to go before computers can be said to have made a significant impact. The survey covered the 3rd, 7th, and 11th grade levels and assessed competence in knowledge of computers,…

  16. Community Perception Survey, 2001.

    ERIC Educational Resources Information Center

    Rasmussen, Patricia; Silverman, Barbara

    This document is a report on the 2001 Community Perception Survey administered by Mt. San Antonio College (SAC) (California). The survey gathered public perception data of SAC services and programs. The survey was mailed to 773 service area community leaders; 160 (21%) responded. Survey results showed that: (1) 70% had knowledge of SAC programs…

  17. Quantum computers.

    PubMed

    Ladd, T D; Jelezko, F; Laflamme, R; Nakamura, Y; Monroe, C; O'Brien, J L

    2010-03-04

    Over the past several decades, quantum information science has emerged to seek answers to the question: can we gain some advantage by storing, transmitting and processing information encoded in systems that exhibit unique quantum properties? Today it is understood that the answer is yes, and many research groups around the world are working towards the highly ambitious technological goal of building a quantum computer, which would dramatically improve computational power for particular tasks. A number of physical systems, spanning much of modern physics, are being developed for quantum computation. However, it remains unclear which technology, if any, will ultimately prove successful. Here we describe the latest developments for each of the leading approaches and explain the major challenges for the future.

  18. Qubus computation

    NASA Astrophysics Data System (ADS)

    Munro, W. J.; Nemoto, Kae; Spiller, T. P.; van Loock, P.; Braunstein, Samuel L.; Milburn, G. J.

    2006-08-01

    Processing information quantum mechanically is known to enable new communication and computational scenarios that cannot be accessed with conventional information technology (IT). We present here a new approach to scalable quantum computing---a "qubus computer"---which realizes qubit measurement and quantum gates through interacting qubits with a quantum communication bus mode. The qubits could be "static" matter qubits or "flying" optical qubits, but the scheme we focus on here is particularly suited to matter qubits. Universal two-qubit quantum gates may be effected by schemes which involve measurement of the bus mode, or by schemes where the bus disentangles automatically and no measurement is needed. This approach enables a parity gate between qubits, mediated by a bus, enabling near-deterministic Bell state measurement and entangling gates. Our approach is therefore the basis for very efficient, scalable QIP, and provides a natural method for distributing such processing, combining it with quantum communication.

  19. Computational Psychiatry

    PubMed Central

    Wang, Xiao-Jing; Krystal, John H.

    2014-01-01

    Psychiatric disorders such as autism and schizophrenia arise from abnormalities in brain systems that underlie cognitive, emotional and social functions. The brain is enormously complex and its abundant feedback loops on multiple scales preclude intuitive explication of circuit functions. In close interplay with experiments, theory and computational modeling are essential for understanding how, precisely, neural circuits generate flexible behaviors and their impairments give rise to psychiatric symptoms. This Perspective highlights recent progress in applying computational neuroscience to the study of mental disorders. We outline basic approaches, including identification of core deficits that cut across disease categories, biologically-realistic modeling bridging cellular and synaptic mechanisms with behavior, and model-aided diagnosis. The need for new research strategies in psychiatry is urgent. Computational psychiatry potentially provides powerful tools for elucidating pathophysiology that may inform both diagnosis and treatment. To achieve this promise will require investment in cross-disciplinary training and research in this nascent field. PMID:25442941

  20. Computing for particle physics. Report of the HEPAP Subpanel on Computer Needs for the Next Decade

    NASA Astrophysics Data System (ADS)

    1985-08-01

    The increasing importance of computation to the future progress in high energy physics is documented. Experimental computing demands are analyzed for the near future (four to ten years). The computer industry's plans for the near term and long term are surveyed as they relate to the solution of high energy physics computing problems. This survey includes large processors and the future role of alternatives to commercial mainframes. The needs for low speed and high speed networking are assessed, and the need for an integrated network for high energy physics is evaluated. Software requirements are analyzed. The role to be played by multiple processor systems is examined. The computing needs associated with elementary particle theory are briefly summarized. Computing needs associated with the Superconducting Super Collider are analyzed. Recommendations are offered for expanding computing capabilities in high energy physics and for networking between the laboratories.

  1. Computing for particle physics. Report of the HEPAP subpanel on computer needs for the next decade

    SciTech Connect

    Not Available

    1985-08-01

    The increasing importance of computation to the future progress in high energy physics is documented. Experimental computing demands are analyzed for the near future (four to ten years). The computer industry's plans for the near term and long term are surveyed as they relate to the solution of high energy physics computing problems. This survey includes large processors and the future role of alternatives to commercial mainframes. The needs for low speed and high speed networking are assessed, and the need for an integrated network for high energy physics is evaluated. Software requirements are analyzed. The role to be played by multiple processor systems is examined. The computing needs associated with elementary particle theory are briefly summarized. Computing needs associated with the Superconducting Super Collider are analyzed. Recommendations are offered for expanding computing capabilities in high energy physics and for networking between the laboratories. (LEW)

  2. Computational mechanics

    SciTech Connect

    Raboin, P J

    1998-01-01

    The Computational Mechanics thrust area is a vital and growing facet of the Mechanical Engineering Department at Lawrence Livermore National Laboratory (LLNL). This work supports the development of computational analysis tools in the areas of structural mechanics and heat transfer. Over 75 analysts depend on thrust area-supported software running on a variety of computing platforms to meet the demands of LLNL programs. Interactions with the Department of Defense (DOD) High Performance Computing and Modernization Program and the Defense Special Weapons Agency are of special importance as they support our ParaDyn project in its development of new parallel capabilities for DYNA3D. Working with DOD customers has been invaluable to driving this technology in directions mutually beneficial to the Department of Energy. Other projects associated with the Computational Mechanics thrust area include work with the Partnership for a New Generation Vehicle (PNGV) for ''Springback Predictability'' and with the Federal Aviation Administration (FAA) for the ''Development of Methodologies for Evaluating Containment and Mitigation of Uncontained Engine Debris.'' In this report for FY-97, there are five articles detailing three code development activities and two projects that synthesized new code capabilities with new analytic research in damage/failure and biomechanics. The articles this year are: (1) Energy- and Momentum-Conserving Rigid-Body Contact for NIKE3D and DYNA3D; (2) Computational Modeling of Prosthetics: A New Approach to Implant Design; (3) Characterization of Laser-Induced Mechanical Failure Damage of Optical Components; (4) Parallel Algorithm Research for Solid Mechanics Applications Using Finite Element Analysis; and (5) An Accurate One-Step Elasto-Plasticity Algorithm for Shell Elements in DYNA3D.

  3. The AAS Workforce Survey

    NASA Astrophysics Data System (ADS)

    Postman, Marc; Norman, D. J.; Evans, N. R.; Ivie, R.

    2014-01-01

    The AAS Demographics Committee, on behalf of the AAS, was tasked with initiating a biennial survey to improve the Society's ability to serve its members and to inform the community about changes in the community's demographics. A survey, based in part on similar surveys for other scientific societies, was developed in the summer of 2012 and was publicly launched in January 2013. The survey randomly targeted 2500 astronomers who are members of the AAS. The survey was closed 4 months later (April 2013). The response rate was excellent - 63% (1583 people) completed the survey. I will summarize the results from this survey, highlighting key results and plans for their broad dissemination.

  4. Distributed GPU Computing in GIScience

    NASA Astrophysics Data System (ADS)

    Jiang, Y.; Yang, C.; Huang, Q.; Li, J.; Sun, M.

    2013-12-01

    Geoscientists strive to discover potential principles and patterns hidden inside ever-growing Big Data for scientific discoveries. To better achieve this objective, more capable computing resources are required to process, analyze and visualize Big Data (Ferreira et al., 2003; Li et al., 2013). Current CPU-based computing techniques cannot promptly meet the computing challenges caused by the increasing amount of datasets from different domains, such as social media, earth observation, and environmental sensing (Li et al., 2013), while CPU-based computing resources structured as clusters or supercomputers are costly. In the past several years, as GPU-based technology has matured in both capability and performance, GPU-based computing has emerged as a new computing paradigm. Compared to the traditional microprocessor, the modern GPU is a compelling alternative with outstanding parallel processing capability, cost-effectiveness, and efficiency (Owens et al., 2008), although it was initially designed for graphical rendering in the visualization pipeline. This presentation reports a distributed GPU computing framework for integrating GPU-based computing within a distributed environment. Within this framework, 1) for each single computer, both GPU-based and CPU-based computing resources can be fully utilized to improve the performance of visualizing and processing Big Data; 2) within a network environment, a variety of computers can be used to build a virtual supercomputer to support CPU-based and GPU-based computing in a distributed computing environment; 3) GPUs, as specific graphics-targeted devices, are used to greatly improve the rendering efficiency in distributed geo-visualization, especially for 3D/4D visualization. Key words: Geovisualization, GIScience, Spatiotemporal Studies. Reference: 1. Ferreira de Oliveira, M. C., & Levkowitz, H. (2003). From visual data exploration to visual data mining: A survey. IEEE Transactions on Visualization and Computer Graphics.

  5. Computer viruses

    NASA Technical Reports Server (NTRS)

    Denning, Peter J.

    1988-01-01

    The worm, Trojan horse, bacterium, and virus are destructive programs that attack information stored in a computer's memory. Virus programs, which propagate by incorporating copies of themselves into other programs, are a growing menace in the late-1980s world of unprotected, networked workstations and personal computers. Limited immunity is offered by memory protection hardware, digitally authenticated object programs, and antibody programs that kill specific viruses. Additional immunity can be gained from the practice of digital hygiene, primarily the refusal to use software from untrusted sources. Full immunity requires attention in a social dimension, the accountability of programmers.
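As one concrete illustration of the "digitally authenticated object programs" mentioned above, a hedged sketch (the bytes and helper names are mine, not from the article) of hash-based integrity checking: record a cryptographic digest of each program and flag any later mismatch as possible tampering.

```python
# Hedged illustration: any modification to a program's bytes changes its
# SHA-256 digest, so a digest recorded at install time detects tampering.
import hashlib

def digest(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

program = b"\x7fELF original object code"   # hypothetical program image
recorded = digest(program)                  # stored when the program is trusted

infected = program + b" virus payload"      # a virus appends itself
print(digest(infected) == recorded)         # -> False: tampering detected
print(digest(program) == recorded)          # -> True: unmodified program passes
```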

  6. LHC Computing

    ScienceCinema

    Lincoln, Don

    2016-07-12

    The LHC is the world’s highest energy particle accelerator and scientists use it to record an unprecedented amount of data. This data is recorded in electronic format and it requires an enormous computational infrastructure to convert the raw data into conclusions about the fundamental rules that govern matter. In this video, Fermilab’s Dr. Don Lincoln gives us a sense of just how much data is involved and the incredible computer resources that make it all possible.

  7. Computer systems

    NASA Technical Reports Server (NTRS)

    Olsen, Lola

    1992-01-01

    In addition to the discussions, Ocean Climate Data Workshop hosts gave participants an opportunity to hear about, see, and test for themselves some of the latest computer tools now available for those studying climate change and the oceans. Six speakers described computer systems and their functions. The introductory talks were followed by demonstrations to small groups of participants and some opportunities for participants to get hands-on experience. After this familiarization period, attendees were invited to return during the course of the Workshop and have one-on-one discussions and further hands-on experience with these systems. Brief summaries or abstracts of introductory presentations are addressed.

  8. Computational Hearing

    DTIC Science & Technology

    1998-11-01

    ranging from the anatomy and physiology of the auditory pathway to the perception of speech and music under both ideal and not-so-ideal (but more...physiology of various parts of the auditory pathway, to auditory prostheses, speech and audio coding, computational models of pitch and timbre , the role of

  9. Library Computing.

    ERIC Educational Resources Information Center

    Dayall, Susan A.; And Others

    1987-01-01

    Six articles on computers in libraries discuss training librarians and staff to use new software; appropriate technology; system upgrades of the Research Libraries Group's information system; pre-IBM PC microcomputers; multiuser systems for small to medium-sized libraries; and a library user's view of the traditional card catalog. (EM)

  10. Computational trigonometry

    SciTech Connect

    Gustafson, K.

    1994-12-31

    By means of the author`s earlier theory of antieigenvalues and antieigenvectors, a new computational approach to iterative methods is presented. This enables an explicit trigonometric understanding of iterative convergence and provides new insights into the sharpness of error bounds. Direct applications to Gradient descent, Conjugate gradient, GCR(k), Orthomin, CGN, GMRES, CGS, and other matrix iterative schemes will be given.

  11. Computational Estimation

    ERIC Educational Resources Information Center

    Fung, Maria G.; Latulippe, Christine L.

    2010-01-01

    Elementary school teachers are responsible for constructing the foundation of number sense in youngsters, and so it is recommended that teacher-training programs include an emphasis on number sense to ensure the development of dynamic, productive computation and estimation skills in students. To better prepare preservice elementary school teachers…

  12. Business Computers.

    ERIC Educational Resources Information Center

    Canipe, Stephen L.

    A brief definition of some fundamentals of microcomputers and of the ways they may be used in small businesses can help potential buyers make informed purchases. Hardware (the mechanical devices from which computers are made) described here are the video display, keyboard, central processing unit, "random access" and "read only" memories, cassette…

  13. Computer Guerrillas.

    ERIC Educational Resources Information Center

    Immel, A. Richard

    1983-01-01

    Describes several cases in which microcomputers were used to prevent large organizations (e.g., utility companies, U.S. Government Forestry Commission) from carrying out actions considered not to be in the public's best interests. The use of the computers by social activists in their efforts to halt environmental destruction is discussed. (EAO)

  14. Computer Corner.

    ERIC Educational Resources Information Center

    Mason, Margie

    1985-01-01

    This article: describes how to prevent pins on game paddles from breaking; suggests using needlepoint books for ideas to design computer graphics; lists a BASIC program to create a Christmas tree, with extension activities; suggests a LOGO Christmas activity; and describes a book on the development of microcomputers. (JN)

  15. Computational Physics.

    ERIC Educational Resources Information Center

    Borcherds, P. H.

    1986-01-01

    Describes an optional course in "computational physics" offered at the University of Birmingham. Includes an introduction to numerical methods and presents exercises involving fast-Fourier transforms, non-linear least-squares, Monte Carlo methods, and the three-body problem. Recommends adding laboratory work into the course in the…

  16. Computational Musicology.

    ERIC Educational Resources Information Center

    Bel, Bernard; Vecchione, Bernard

    1993-01-01

    Asserts that a revolution has been occurring in musicology since the 1970s. Contends that music has changed from being only a source of emotion to appearing more open to science and techniques based on computer technology. Describes recent research and other writings about the topic and provides an extensive bibliography. (CFR)

  17. Computer Corner.

    ERIC Educational Resources Information Center

    Smith, David A.; And Others

    1986-01-01

    APL was invented specifically as a mathematical teaching tool, and is an excellent vehicle for teaching mathematical concepts using computers. This article illustrates the use of APL in teaching many different topics in mathematics, including logic, set theory, functions, statistics, linear algebra, and matrices. (MNS)

  18. Networking computers.

    PubMed

    McBride, D C

    1997-03-01

    This decade the role of the personal computer has shifted dramatically from a desktop device designed to increase individual productivity and efficiency to an instrument of communication linking people and machines in different places with one another. A computer in one city can communicate with another that may be thousands of miles away. Networking is how this is accomplished. Just like the voice network used by the telephone, computer networks transmit data and other information via modems over these same telephone lines. A network can be created over both short and long distances. Networks can be established within a hospital or medical building or over many hospitals or buildings covering many geographic areas. Those confined to one location are called LANs, local area networks. Those that link computers in one building to those at other locations are known as WANs, or wide area networks. The ultimate wide area network is the one we've all been hearing so much about these days--the Internet, and its World Wide Web. Setting up a network is a process that requires careful planning and commitment. To avoid potential pitfalls and to make certain the network you establish meets your needs today and several years down the road, several steps need to be followed. This article reviews the initial steps involved in getting ready to network.
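The client/server idea described above can be sketched in a few lines; a minimal illustrative example (not from the article) in which two endpoints exchange bytes over a network socket, with both ends running in one process over the loopback interface:

```python
# Minimal sketch of networked computers: a tiny TCP server and client on
# the loopback interface, exchanging one message.
import socket
import threading

server = socket.socket()               # TCP/IPv4 by default
server.bind(("127.0.0.1", 0))          # port 0: let the OS pick a free port
server.listen(1)
port = server.getsockname()[1]

def serve():
    conn, _ = server.accept()
    conn.sendall(b"hello from the LAN")  # reply to whoever connects
    conn.close()

t = threading.Thread(target=serve)
t.start()

client = socket.create_connection(("127.0.0.1", port))
chunks = []
while True:                            # read until the server closes
    data = client.recv(1024)
    if not data:
        break
    chunks.append(data)
message = b"".join(chunks)
client.close()
t.join()
server.close()
print(message.decode())  # -> hello from the LAN
```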

  19. COMPUTATIONAL SOCIOLINGUISTICS.

    ERIC Educational Resources Information Center

    SEDELOW, WALTER A., JR.

    The use of the computer may be one of the ways in which varied linguistic interests (sociolinguistics, psycholinguistics) come to be rendered interrelated and even intellectually coherent. (The criterion of coherence is set here at monism as to models.) One of the author's major interests is a systematic approach to scientific creativity,…

  20. Computational Mathematics

    DTIC Science & Technology

    2012-03-06

    Marsha Berger, NYU) Inclusion of the Adaptation/Adjoint module, Embedded Boundary Methods in the software package Cart3D --- Transition to NASA...ONR, DOE, AFRL, DIA Cart3D used for computing Formation Flight to reduce drag and improve energy efficiency Application to Explosively Formed

  1. Cooling Technology for Electronic Computers

    NASA Astrophysics Data System (ADS)

    Nakayama, Wataru

    The rapid growth of data processing speed in computers has been sustained by advances in cooling technology. This article first presents a review of the published data on heat loads in recent Japanese large-scale computers. The survey indicates that, since around 1980, the high-level integration of microelectronic circuits has brought about an almost fourfold increase in the power dissipation from logic chips. The integration has also driven the evolution of multichip modules and new schemes of electronic interconnection. Forced-convection air cooling and liquid cooling coupled with thermal connectors are discussed with reference to the designs employed in actual computers. More advanced cooling schemes are also discussed. Finally, the importance of thermal environmental control of computer rooms is emphasized.

  2. Evaluation of Advanced Computing Techniques and Technologies: Reconfigurable Computing

    NASA Technical Reports Server (NTRS)

    Wells, B. Earl

    2003-01-01

    The focus of this project was threefold: to survey the technology of reconfigurable computing and determine its level of maturity and suitability for NASA applications; to better understand and assess the effectiveness of the reconfigurable design paradigm utilized within the HAL-15 reconfigurable computer system, which was made available to NASA MSFC for this purpose by Star Bridge Systems, Inc.; and to implement at least one application that would benefit from the performance levels possible with reconfigurable hardware. It was originally proposed that experiments in fault tolerance and dynamic reconfigurability would be performed, but time constraints mandated that these be pursued as future research.

  3. A Computing Cluster for Numerical Simulation

    DTIC Science & Technology

    2006-10-23

    "Contact and Friction for Cloth Animation", SIGGRAPH 2002, ACM TOG 21, 594-603 (2002). "* [BHTF] Bao, Z., Hong, J.-M., Teran, J. and Fedkiw, R...Simulation of Large Bodies of Water by Coupling Two and Three Dimensional Techniques", SIGGRAPH 2006, ACM TOG 25, 805-811 (2006). "* [ITF] Irving, G., Teran...O'Brien (2006) "* [TSBNLF] Teran, J., Sifakis, E., Blemker, S., Ng Thow Hing, V., Lau, C. and Fedkiw, R., "Creating and Simulating Skeletal Muscle from the

  4. A search for stratiform massive-sulfide exploration targets in Appalachian Devonian rocks; a case study using computer-assisted attribute-coincidence mapping

    USGS Publications Warehouse

    Wedow, Helmuth

    1983-01-01

    The empirical model for sediment-associated, stratiform, exhalative, massive-sulfide deposits presented by D. Large in 1979 and 1980 has been redesigned to permit its use in a computer-assisted search for exploration-target areas in Devonian rocks of the Appalachian region using attribute-coincidence mapping (ACM). Some 36 gridded-data maps and selected maps derived therefrom were developed to show the orthogonal patterns, using the 7-1/2 minute quadrangle as an information cell, of geologic data patterns relevant to the empirical model. From these map and data files, six attribute-coincidence maps were prepared to illustrate both variation in the application of ACM techniques and the extent of possible significant exploration-target areas. As a result of this preliminary work in ACM, four major (and some lesser) exploration-target areas needing further study and analysis have been defined as follows: 1) in western and central New York in the outcrop area of lowermost Upper Devonian rocks straddling the Clarendon-Linden fault; 2) in western Virginia and eastern West Virginia in an area largely coincident with the well-known 'Oriskany' Mn-Fe ores; 3) an area in West Virginia, Maryland, and Virginia along and nearby the trend of the Alabama-New York lineament of King and Zietz approximately between 38- and 40-degrees N. latitude; and 4) an area in northeastern Ohio overlying an area coincident with a significant thickness of Silurian salt and high modern seismic activity. Some lesser, smaller areas suggested by relatively high coincidence may also be worthy of further study.
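
    The coincidence-mapping step this entry describes reduces to counting, cell by cell, how many binary attribute maps are "on". Below is a minimal sketch of that idea; the grid size, the number of attribute layers, and the threshold are hypothetical illustrations, not values from the USGS study:

```python
import numpy as np

# Each attribute layer marks the cells (7-1/2 minute quadrangles in the
# study; a small 4x5 grid here) where one criterion of the exploration
# model holds (1) or not (0). Layers are random for illustration only.
rng = np.random.default_rng(0)
attribute_layers = rng.integers(0, 2, size=(6, 4, 5))  # 6 binary attribute maps

# The attribute-coincidence map is the per-cell count of satisfied
# criteria; cells with high counts are candidate exploration targets.
coincidence = attribute_layers.sum(axis=0)
targets = np.argwhere(coincidence >= 5)  # cells meeting at least 5 of 6 criteria
```

    Varying which layers enter the sum, or weighting them, yields the different coincidence maps the abstract mentions.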

  5. Business aspects and sustainability for healthgrids - an expert survey.

    PubMed

    Scholz, Stefan; Semler, Sebastian C; Breitner, Michael H

    2009-01-01

    Grid computing initiatives in medicine and life sciences are under pressure to prove their sustainability. While some first business model frameworks have been outlined, few practical experiences have been considered. This gap has been narrowed by an international survey of 33 grid computing experts, with biomedical and non-biomedical backgrounds, on business aspects. The experts surveyed were cautiously optimistic about a sustainable implementation of grid computing within a mid-term timeline. They identified marketable application areas, stated the underlying value proposition, outlined trends, and specified critical success factors. Overall, their answers provide a stable basis for a road map of sustainable grid computing solutions for medicine and life sciences.

  6. Specification/Verification of Temporal Properties for Distributed Systems: Issues and Approaches. Volume 1

    DTIC Science & Technology

    1990-02-01

    Philip A. Bernstein and Nathan Goodman. Concurrency control in distributed database systems. ACM Computing Surveys, 13(2):185-221, June 1981. [5] K. J... Sequential Processes. Series in Computer Science. Prentice-Hall International, Englewood Cliffs, NJ, 1985. ... [24] A. L. Hopkins Jr., T. Basil Smith, III, and J

  7. Adaptive Explicitly Parallel Instruction Computing

    DTIC Science & Technology

    2000-12-16

    1993. [17] James F. Blinn. Jim Blinn’s corner: Fugue for MMX. IEEE Computer Graphics and Applications, 17(2):88–93, March/April 1997. Makes several... processors. IEEE Transactions on Computers, C-29(4):308–316, April 1980. [22] Doug Burger and James R. Goodman. Guest editors’ introduction: Billion... sequencing and scheduling: A survey. Ann. Discrete Mathematics, 5:287–326, 1979. [58] C. Ebeling, D. C. Green and P. Franklin. RaPiD – reconfigurable

  8. High resolution survey for topographic surveying

    NASA Astrophysics Data System (ADS)

    Luh, L. C.; Setan, H.; Majid, Z.; Chong, A. K.; Tan, Z.

    2014-02-01

    In this decade, the terrestrial laser scanner (TLS) has become popular in many fields such as reconstruction, monitoring, surveying, as-built documentation of facilities, archaeology, and topographic surveying. This is due to its high speed of data collection, about 50,000 to 1,000,000 three-dimensional (3D) points per second at high accuracy. The main advantage of a 3D representation of the data is that it is closer to the real world. The aim of this paper is therefore to show the use of High-Definition Surveying (HDS), also known as 3D laser scanning, for topographic survey. This research investigates the effectiveness of terrestrial laser scanning for topographic survey through a field test carried out at Universiti Teknologi Malaysia (UTM), Skudai, Johor. The 3D laser scanner used in this study is a Leica ScanStation C10. Data acquisition was carried out by applying the traversing method. In this study, the result of the topographic survey meets first-class survey standards. At the completion of this study, a standard procedure was proposed for topographic data acquisition using laser scanning systems. This proposed procedure serves as a guideline for users who wish to fully utilize laser scanning systems in topographic surveys.

  9. 2012 Corporate Recruiters Survey. Survey Report

    ERIC Educational Resources Information Center

    Estrada, Rebecca

    2012-01-01

    This paper presents the results from the 2012 Corporate Recruiters Survey conducted by the Graduate Management Admission Council[R] (GMAC[R]). Conducted annually since 2001, this survey examines the job outlook for recent graduate business students as well as employer needs and expectations. The objectives of this study are to obtain a picture of…

  10. Advanced Algorithms and Statistics for MOS Surveys

    NASA Astrophysics Data System (ADS)

    Bolton, A. S.

    2016-10-01

    This paper presents an individual view on the current state of computational data processing and statistics for inference and discovery in multi-object spectroscopic surveys, supplemented by a historical perspective and a few present-day applications. It is more op-ed than review, and hopefully more readable as a result.

  11. A survey of big data research

    PubMed Central

    Fang, Hua; Zhang, Zhaoyang; Wang, Chanpaul Jin; Daneshmand, Mahmoud; Wang, Chonggang; Wang, Honggang

    2015-01-01

    Big data create values for business and research, but pose significant challenges in terms of networking, storage, management, analytics and ethics. Multidisciplinary collaborations from engineers, computer scientists, statisticians and social scientists are needed to tackle, discover and understand big data. This survey presents an overview of big data initiatives, technologies and research in industries and academia, and discusses challenges and potential solutions. PMID:26504265

  12. Aerial radiation surveys

    SciTech Connect

    Jobst, J.

    1980-01-01

    A recent aerial radiation survey of the surroundings of the Vitro mill in Salt Lake City shows that uranium mill tailings have been removed to many locations outside their original boundary. To date, 52 remote sites have been discovered within a 100 square kilometer aerial survey perimeter surrounding the mill; 9 of these were discovered with the recent aerial survey map. Five additional sites, also discovered by aerial survey, contained uranium ore, milling equipment, or radioactive slag. Because of the success of this survey, plans are being made to extend the aerial survey program to other parts of the Salt Lake valley where diversions of Vitro tailings are also known to exist.

  13. RATIO COMPUTER

    DOEpatents

    Post, R.F.

    1958-11-11

    An electronic computer circuit is described for producing an output voltage proportional to the product or quotient of the voltages of a pair of input signals. In essence, the disclosed invention provides a computer having two channels adapted to receive separate input signals, each having amplifiers with like fixed amplification factors and like negative feedback amplifiers. One of the channels receives a constant signal for comparison purposes, whereby a difference signal is produced to control the amplification factors of the variable feedback amplifiers. The output of the other channel is thereby proportional to the product or quotient of the input signals, depending upon the relation of the input to the fixed signal in the first-mentioned channel.
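
    The servo principle in this abstract can be imitated numerically: a feedback loop drives a gain shared by both channels until the reference channel's output matches the fixed comparison signal, and the second channel then reads out the ratio. A hedged sketch follows; the loop rate, step count, and reference value are arbitrary choices for illustration, not parameters from the patent:

```python
# Numerical model of the two-channel ratio circuit: a servo loop adjusts
# the shared gain G until channel 1's output (a * G) matches the fixed
# reference, so G settles at e_ref / a; channel 2 then outputs
# b * G = e_ref * b / a, a voltage proportional to the quotient b / a.
def ratio_computer(a, b, e_ref=1.0, steps=2000, rate=0.05):
    g = 1.0  # common amplification factor of both channels
    for _ in range(steps):
        error = e_ref - a * g  # difference signal in the reference channel
        g += rate * error      # feedback adjusts the shared gain
    return b * g               # channel-2 output ~= e_ref * b / a

print(round(ratio_computer(2.0, 3.0), 3))  # → 1.5, i.e. b/a with e_ref = 1
```

    With a unit reference the same loop yields the quotient directly; feeding the reference channel a signal instead of a constant gives the product-style behavior the abstract alludes to.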

  14. Computational Combustion

    SciTech Connect

    Westbrook, C K; Mizobuchi, Y; Poinsot, T J; Smith, P J; Warnatz, J

    2004-08-26

    Progress in the field of computational combustion over the past 50 years is reviewed. Particular attention is given to those classes of models that are common to most system modeling efforts, including fluid dynamics, chemical kinetics, liquid sprays, and turbulent flame models. The developments in combustion modeling are placed into the time-dependent context of the accompanying exponential growth in computer capabilities and Moore's Law. Superimposed on this steady growth, the occasional sudden advances in modeling capabilities are identified and their impacts are discussed. Integration of submodels into system models for spark ignition, diesel and homogeneous charge, compression ignition engines, surface and catalytic combustion, pulse combustion, and detonations are described. Finally, the current state of combustion modeling is illustrated by descriptions of a very large jet lifted 3D turbulent hydrogen flame with direct numerical simulation and 3D large eddy simulations of practical gas burner combustion devices.

  15. Computational Physics

    NASA Astrophysics Data System (ADS)

    Thijssen, Jos

    2013-10-01

    1. Introduction; 2. Quantum scattering with a spherically symmetric potential; 3. The variational method for the Schrödinger equation; 4. The Hartree-Fock method; 5. Density functional theory; 6. Solving the Schrödinger equation in periodic solids; 7. Classical equilibrium statistical mechanics; 8. Molecular dynamics simulations; 9. Quantum molecular dynamics; 10. The Monte Carlo method; 11. Transfer matrix and diagonalisation of spin chains; 12. Quantum Monte Carlo methods; 13. The finite element method for partial differential equations; 14. The lattice Boltzmann method for fluid dynamics; 15. Computational methods for lattice field theories; 16. High performance computing and parallelism; Appendix A. Numerical methods; Appendix B. Random number generators; References; Index.

  16. Role of Computer Assisted Instruction (CAI) in an Introductory Computer Concepts Course.

    ERIC Educational Resources Information Center

    Skudrna, Vincent J.

    1997-01-01

    Discusses the role of computer assisted instruction (CAI) in undergraduate education via a survey of related literature and specific applications. Describes an undergraduate computer concepts course and includes appendices of instructions, flowcharts, programs, sample student work in accounting, COBOL instructional model, decision logic in a…

  17. Computing as a Matter of Course. The Instructional Use of Computers at Dartmouth College.

    ERIC Educational Resources Information Center

    Nevison, John M.

    The faculty of Dartmouth College was surveyed to measure the extent of instructional computing in undergraduate courses, including the total amount, the spread across the curriculum, and the variety of individual efforts. The on-campus teaching faculty received a short, four-question card asking if they had used computing in their courses. Those…

  18. Perceived Social Supports, Computer Self-Efficacy, and Computer Use among High School Students

    ERIC Educational Resources Information Center

    Hsiao, Hsi-Chi; Tu, Ya-Ling; Chung, Hsin-Nan

    2012-01-01

    This study investigated the function of social supports and computer self-efficacy in predicting high school students' perceived effect of computer use. The study used the survey method to collect data. Questionnaires were distributed to high school students in Taiwan; 620 questionnaires were distributed and 525 were gathered…

  19. Singularity computations

    NASA Technical Reports Server (NTRS)

    Swedlow, J. L.

    1976-01-01

    An approach is described for singularity computations based on a numerical method for elastoplastic flow to delineate radial and angular distribution of field quantities and measure the intensity of the singularity. The method is applicable to problems in solid mechanics and lends itself to certain types of heat flow and fluid motion studies. Its use is not limited to linear, elastic, small strain, or two-dimensional situations.

  20. Spatial Computation

    DTIC Science & Technology

    2003-12-01

    particular program, synthesized under compiler control from the application source code. The translation is illustrated in Figure 1.4. From now on, when we use... very efficient method of exploring the design of complex application-specific system-on-a-chip devices using only the application source code. • New... computation gates. This frees, but also complicates, the compilation process. In order to handle the great semantic gap between the source code and the

  1. Computational enzymology.

    PubMed

    Lonsdale, Richard; Ranaghan, Kara E; Mulholland, Adrian J

    2010-04-14

    Molecular simulations and modelling are changing the science of enzymology. Calculations can provide detailed, atomic-level insight into the fundamental mechanisms of biological catalysts. Computational enzymology is a rapidly developing area, and is testing theories of catalysis, challenging 'textbook' mechanisms, and identifying novel catalytic mechanisms. Increasingly, modelling is contributing directly to experimental studies of enzyme-catalysed reactions. Potential practical applications include interpretation of experimental data, catalyst design and drug development.

  2. Computational Electromagnetics

    DTIC Science & Technology

    2011-02-20

    a collaboration between Caltech’s postdoctoral associate N. Albin and OB) have shown that, for a variety of reasons, the first-order... KZK approximation", Nathan Albin, Oscar P. Bruno, Theresa Y. Cheung and Robin O. Cleveland, preprint (2011). "A Spectral FC Solver for the Compressible Navier-Stokes Equations in General Domains I: Explicit time-stepping", Nathan Albin and Oscar P. Bruno, to appear in Journal of Computational Physics

  3. The use of computer-assisted surgery as an educational tool for the training of orthopedic surgery residents in pedicle screw placement: a pilot study and survey among orthopedic residents

    PubMed Central

    Aoude, Ahmed; Alhamzah, Hamzah; Fortin, Maryse; Jarzem, Peter; Ouellet, Jean; Weber, Michael H.

    2016-01-01

    Background The training of orthopedic residents in adequate pedicle screw placement is very important. We sought to investigate orthopedic residents’ perspectives on the use of computer-assisted surgery (CAS) in a training trial. Methods Orthopedic residents were randomly assigned to independently place a screw using the free-hand technique and the CAS technique on 1 of 3 cadavers (Cobb angles 5°, 15° and 67°) at randomly selected thoracolumbar vertebral levels. All residents were blinded to their colleagues’ pedicle screw placements and were asked to complete a short questionnaire at the end of the session to evaluate their experience with CAS. We obtained CT images for each cadaver to assess pedicle screw placement accuracy and classified placement as A) screw completely in pedicle, B) screw < 2 mm outside pedicle, C) screw 2–4 mm outside pedicle, or D) screw > 4 mm outside pedicle. Results Twenty-four orthopedic residents participated in this trial study. In total, 65% preferred using the free-hand technique in an educational setting even though most (60%) said that CAS is safer. The main reason for free-hand technique preference was the difficult technical aspects encountered with CAS. In addition, accuracy of pedicle screw placement in this trial showed that 5 screws were classified as A or B (safe zone) and 19 as grade C or D (unsafe zone) using the free-hand technique compared with 15 and 9, respectively, using CAS (p = 0.008). Conclusion Orthopedic residents perceived CAS as safe and demonstrated improved accuracy in pedicle screw placement in a single setting. However, the residents preferred the free-hand technique in an educational setting owing to the difficult technical aspects of CAS. PMID:28234614

  4. Computer Simulation Of Cyclic Oxidation

    NASA Technical Reports Server (NTRS)

    Probst, H. B.; Lowell, C. E.

    1990-01-01

    Computer model developed to simulate cyclic oxidation of metals. With relatively few input parameters, kinetics of cyclic oxidation simulated for wide variety of temperatures, durations of cycles, and total numbers of cycles. Program written in BASICA and run on any IBM-compatible microcomputer. Used in variety of ways to aid experimental research. In minutes, effects of duration of cycle and/or number of cycles on oxidation kinetics of material surveyed.
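
    The kind of model this entry describes can be sketched in a few lines: the oxide scale grows parabolically during each hot dwell and a fraction spalls off on cooldown, so the net specimen weight first rises, then falls. The growth constant, spall fraction, oxygen mass fraction, and cycle count below are illustrative assumptions, not the NASA program's parameters:

```python
# Minimal cyclic-oxidation sketch: parabolic scale growth per hot cycle,
# fixed-fraction spallation per cooldown, and a net weight-change proxy
# (oxygen gained in the retained scale minus metal lost in spalled oxide).
def cyclic_oxidation(kp=0.01, spall_frac=0.05, n_cycles=100, f_oxygen=0.3):
    scale = 0.0          # retained oxide (arbitrary mass units)
    spalled_total = 0.0  # cumulative oxide lost by spalling
    weight = []
    for _ in range(n_cycles):
        scale = (scale**2 + kp) ** 0.5   # parabolic growth during the hot dwell
        lost = spall_frac * scale        # fixed fraction spalls on cooldown
        scale -= lost
        spalled_total += lost
        weight.append(f_oxygen * scale - (1 - f_oxygen) * spalled_total)
    return weight

curve = cyclic_oxidation()  # rises while growth dominates, then declines
```

    Sweeping `spall_frac` or `n_cycles` reproduces, in miniature, the parameter surveys the abstract mentions.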

  5. 77 FR 20367 - Proposed Information Collection; Comment Request; Computer and Internet Use Supplement to the...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-04

    ...; Computer and Internet Use Supplement to the Census Bureau's Current Population Survey AGENCY: National... this reinstatement will be a revised set of computer and Internet usage survey questions. II. Method of Collection Personal visits and telephone interviews, using computer-assisted telephone interviewing...

  6. Sex Differences in Secondary School Students' Attitudes toward Computers.

    ERIC Educational Resources Information Center

    Collis, Betty

    1985-01-01

    Summarizes results of a study measuring eighth- and twelfth-grade students' attitudes toward computers. Sex and age differences, computer literacy course impact, and correlation of student attitudes toward computers and mathematics and science are assessed. A table giving means and standard deviations of responses to survey items is included. (MBR)

  7. Computer and Software Use in Teaching the Beginning Statistics Course.

    ERIC Educational Resources Information Center

    Bartz, Albert E.; Sabolik, Marisa A.

    2001-01-01

    Surveys the extent of computer usage in the beginning statistics course and the variety of statistics software used. Finds that 69% of the psychology departments used computers in beginning statistics courses and 90% used computer-assisted data analysis in statistics or other courses. (CMK)

  8. Effect of Mailing Address Style on Survey Response Rate.

    ERIC Educational Resources Information Center

    Cookingham, Frank G.

    This study determined the effect of using mailing labels prepared by a letter-quality computer printer on survey response rate. D. A. Dillman's personalization approach to conducting mail surveys suggests that envelopes with addresses typed directly on them may produce a higher response rate than envelopes with addresses typed on self-adhesive…

  9. Social Media and Archives: A Survey of Archive Users

    ERIC Educational Resources Information Center

    Washburn, Bruce; Eckert, Ellen; Proffitt, Merrilee

    2013-01-01

    In April and May of 2012, the Online Computer Library Center (OCLC) Research conducted a survey of users of archives to learn more about their habits and preferences. In particular, they focused on the roles that social media, recommendations, reviews, and other forms of user-contributed annotation play in archival research. OCLC surveyed faculty,…

  10. Autonomous mobile robot for radiologic surveys

    DOEpatents

    Dudar, A.M.; Wagner, D.G.; Teese, G.D.

    1994-06-28

    An apparatus is described for conducting radiologic surveys. The apparatus comprises in the main a robot capable of following a preprogrammed path through an area, a radiation monitor adapted to receive input from a radiation detector assembly, ultrasonic transducers for navigation and collision avoidance, and an on-board computer system including an integrator for interfacing the radiation monitor and the robot. Front and rear bumpers are attached to the robot by bumper mounts. The robot may be equipped with memory boards for the collection and storage of radiation survey information. The on-board computer system is connected to a remote host computer via a UHF radio link. The apparatus is powered by a rechargeable 24-volt DC battery, and is stored at a docking station when not in use and/or for recharging. A remote host computer contains a stored database defining paths between points in the area where the robot is to operate, including but not limited to the locations of walls, doors, stationary furniture and equipment, and sonic markers if used. When a program consisting of a series of paths is downloaded to the on-board computer system, the robot conducts a floor survey autonomously at any preselected rate. When the radiation monitor detects contamination, the robot resurveys the area at reduced speed and resumes its preprogrammed path if the contamination is not confirmed. If the contamination is confirmed, the robot stops and sounds an alarm. 5 figures.
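
    The confirm-before-alarm logic in this patent is a small state machine: a first above-threshold reading triggers a slow re-survey, a clean re-check resumes the path, and a confirmed reading stops the robot and sounds the alarm. A sketch of that logic, with hypothetical names, threshold, and readings not taken from the patent:

```python
# One decision step of the survey logic: given a monitor reading and
# whether we are currently re-checking a prior hit, return the action
# and the updated re-survey flag.
def survey_step(reading, threshold, resurvey):
    if reading < threshold:
        return ("continue", False)      # clean reading: follow the path
    if not resurvey:
        return ("resurvey_slow", True)  # first hit: slow down and re-check
    return ("stop_and_alarm", False)    # confirmed hit: stop and alarm

actions = []
resurvey = False
for r in [0.1, 0.2, 5.0, 4.8, 0.3]:     # simulated readings, threshold 1.0
    action, resurvey = survey_step(r, 1.0, resurvey)
    actions.append(action)
# actions → ["continue", "continue", "resurvey_slow", "stop_and_alarm", "continue"]
```

    In the real apparatus the alarm state would halt the loop; the sketch keeps processing readings only to show each transition.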

  11. Autonomous mobile robot for radiologic surveys

    DOEpatents

    Dudar, Aed M.; Wagner, David G.; Teese, Gregory D.

    1994-01-01

    An apparatus for conducting radiologic surveys. The apparatus comprises in the main a robot capable of following a preprogrammed path through an area, a radiation monitor adapted to receive input from a radiation detector assembly, ultrasonic transducers for navigation and collision avoidance, and an on-board computer system including an integrator for interfacing the radiation monitor and the robot. Front and rear bumpers are attached to the robot by bumper mounts. The robot may be equipped with memory boards for the collection and storage of radiation survey information. The on-board computer system is connected to a remote host computer via a UHF radio link. The apparatus is powered by a rechargeable 24-volt DC battery, and is stored at a docking station when not in use and/or for recharging. A remote host computer contains a stored database defining paths between points in the area where the robot is to operate, including but not limited to the locations of walls, doors, stationary furniture and equipment, and sonic markers if used. When a program consisting of a series of paths is downloaded to the on-board computer system, the robot conducts a floor survey autonomously at any preselected rate. When the radiation monitor detects contamination, the robot resurveys the area at reduced speed and resumes its preprogrammed path if the contamination is not confirmed. If the contamination is confirmed, the robot stops and sounds an alarm.

  12. Surveys: an introduction.

    PubMed

    Rubenfeld, Gordon D

    2004-10-01

    Surveys are a valuable research tool for studying the knowledge, attitudes, and behavior of a study population. This article explores quantitative analyses of written questionnaires as instruments for survey research. Obtaining accurate and precise information from a survey requires minimizing the possibility of bias from inappropriate sampling or a flawed survey instrument, and this article describes strategies to minimize sampling bias by increasing response rates, comparing responders to nonresponders, and identifying the appropriate sampling population. It is crucial that the survey instrument be valid, meaning that it actually measures what the investigator intends it to measure. In developing a valid survey instrument, it can be useful to adapt survey instruments that were developed by other researchers and to conduct extensive pilot-testing of your survey instrument.

  13. National Health Care Survey

    Cancer.gov

    This survey encompasses a family of health care provider surveys, including information about the facilities that supply health care, the services rendered, and the characteristics of the patients served.

  14. Water Use: A Survey

    ERIC Educational Resources Information Center

    Fleming, Rose Glee; Warden, Jessie

    1976-01-01

    A survey of Florida State University students showed that their current laundry practices result in over-consumption of energy and water. The survey also yielded some concrete suggestions to the students for improving their conservation practices. (Author/BP)

  15. Computers: from ethos and ethics to mythos and religion. Notes on the new frontier between computers and philosophy

    SciTech Connect

    Mitcham, C.

    1986-01-01

    This essay surveys recent studies concerning the social, cultural, ethical and religious dimensions of computers. The argument is that computers have certain cultural influences which call for ethical analysis. Further suggestions are that American culture is itself reflected in new ways in the high-technology computer milieu, and that ethical issues entail religious ones which are being largely ignored. 28 references.

  16. Kiso Supernova Survey (KISS): Survey strategy

    NASA Astrophysics Data System (ADS)

    Morokuma, Tomoki; Tominaga, Nozomu; Tanaka, Masaomi; Mori, Kensho; Matsumoto, Emiko; Kikuchi, Yuki; Shibata, Takumi; Sako, Shigeyuki; Aoki, Tsutomu; Doi, Mamoru; Kobayashi, Naoto; Maehara, Hiroyuki; Matsunaga, Noriyuki; Mito, Hiroyuki; Miyata, Takashi; Nakada, Yoshikazu; Soyano, Takao; Tarusawa, Ken'ichi; Miyazaki, Satoshi; Nakata, Fumiaki; Okada, Norio; Sarugaku, Yuki; Richmond, Michael W.; Akitaya, Hiroshi; Aldering, Greg; Arimatsu, Ko; Contreras, Carlos; Horiuchi, Takashi; Hsiao, Eric Y.; Itoh, Ryosuke; Iwata, Ikuru; Kawabata, Koji S.; Kawai, Nobuyuki; Kitagawa, Yutaro; Kokubo, Mitsuru; Kuroda, Daisuke; Mazzali, Paolo; Misawa, Toru; Moritani, Yuki; Morrell, Nidia; Okamoto, Rina; Pavlyuk, Nikolay; Phillips, Mark M.; Pian, Elena; Sahu, Devendra; Saito, Yoshihiko; Sano, Kei; Stritzinger, Maximilian D.; Tachibana, Yutaro; Taddia, Francesco; Takaki, Katsutoshi; Tateuchi, Ken; Tomita, Akihiko; Tsvetkov, Dmitry; Ui, Takahiro; Ukita, Nobuharu; Urata, Yuji; Walker, Emma S.; Yoshii, Taketoshi

    2014-12-01

    The Kiso Supernova Survey (KISS) is a high-cadence optical wide-field supernova (SN) survey. The primary goal of the survey is to catch the very early light of a SN, during the shock breakout phase. Detection of SN shock breakouts combined with multi-band photometry obtained with other facilities would provide detailed physical information on the progenitor stars of SNe. The survey is performed using a 2.2° × 2.2° field-of-view instrument on the 1.05-m Kiso Schmidt telescope, the Kiso Wide Field Camera (KWFC). We take a 3-min exposure in g-band once every hour in our survey, reaching magnitude g ~ 20-21. About 100 nights of telescope time per year have been spent on the survey since 2012 April. The number of the shock breakout detections is estimated to be of the order of 1 during our three-year project. This paper summarizes the KISS project including the KWFC observing setup, the survey strategy, the data reduction system, and CBET-reported SNe discovered so far by KISS.

  17. Quality indexing with computer-aided lexicography

    NASA Technical Reports Server (NTRS)

    Buchan, Ronald L.

    1992-01-01

    Indexing with computers is a far cry from indexing with the first indexing tool, the manual card sorter. With the aid of computer-aided lexicography, both indexing and indexing tools can provide standardization, consistency, and accuracy, resulting in greater quality control than ever before. A brief survey of computer activity in indexing is presented with detailed illustrations from NASA activity. Applications from techniques mentioned, such as Retrospective Indexing (RI), can be made to many indexing systems. In addition to improving the quality of indexing with computers, the improved efficiency with which certain tasks can be done is demonstrated.

  18. The current status of super computers

    NASA Technical Reports Server (NTRS)

    Knight, J. C.

    1978-01-01

    In this paper, commercially available super computers are surveyed. Computer performance in general is limited by circuit speeds and physical size. Assuming the use of the fastest technology, super computers typically use parallelism in the form of either vector processing or array processing to obtain performance. The Burroughs Scientific Processor is an array computer with 16 separate processors, the Cray-1 and CDC STAR-100 are vector processors, the Goodyear Aerospace STARAN is an array processor with up to 8192 single bit processors, and the Systems Development Corporation PEPE is a collection of up to 288 separate processors.

  19. The Effect of Survey Mode on High School Risk Behavior Data: A Comparison between Web and Paper-Based Surveys

    ERIC Educational Resources Information Center

    Raghupathy, Shobana; Hahn-Smith, Stephen

    2013-01-01

    There has been increasing interest in using of web-based surveys--rather than paper based surveys--for collecting data on alcohol and other drug use in middle and high schools in the US. However, prior research has indicated that respondent confidentiality is an underlying concern with online data collection especially when computer-assisted…

  20. A Survey of Automated Activities in the Libraries of Mexico, Central America and South America; Volume 4, World Survey Series.

    ERIC Educational Resources Information Center

    Patrinostro, Frank S., Comp.; Sanders, Nancy P., Ed.

    The intent of this fourth volume of the "Survey of Automated Activities in the Libraries of the World" is to identify and describe computer-based library projects in the Latin American countries. Information was drawn from survey questionnaires sent to individual libraries. However, few of the South American libraries responded, and as a…