Sample records for computing research association

  1. Values and Objectives in Computing Education Research

    ERIC Educational Resources Information Center

    Pears, Arnold; Malmi, Lauri

    2009-01-01

    What is Computing Education Research (CER), why are we doing this type of research, and what should the community achieve? As associate editors to this special edition we provide our perspectives and discuss how they have influenced the evolution of the Koli Calling International Conference on Computing Education Research over the last nine years.…

  2. Computer-Based Assessments. Information Capsule. Volume 0918

    ERIC Educational Resources Information Center

    Blazer, Christie

    2010-01-01

    This Information Capsule reviews research conducted on computer-based assessments. Advantages and disadvantages associated with computer-based testing programs are summarized and research on the comparability of computer-based and paper-and-pencil assessments is reviewed. Overall, studies suggest that for most students, there are few if any…

  3. The AAHA Computer Program. American Animal Hospital Association.

    PubMed

    Albers, J W

    1986-07-01

    The American Animal Hospital Association Computer Program should benefit all small animal practitioners. Through the availability of well-researched and well-developed certified software, veterinarians will have increased confidence in their purchase decisions. With the expansion of computer applications to improve practice management efficiency, veterinary computer systems will further justify their initial expense. The development of the Association's veterinary computer network will provide a variety of important services to the profession.

  4. Computational Approaches to Phenotyping

    PubMed Central

    Lussier, Yves A.; Liu, Yang

    2007-01-01

    The recent completion of the Human Genome Project has made possible a high-throughput “systems approach” for accelerating the elucidation of molecular underpinnings of human diseases, and subsequent derivation of molecular-based strategies to more effectively prevent, diagnose, and treat these diseases. Although altered phenotypes are among the most reliable manifestations of altered gene functions, research using systematic analysis of phenotype relationships to study human biology is still in its infancy. This article focuses on the emerging field of high-throughput phenotyping (HTP) phenomics research, which aims to capitalize on novel high-throughput computation and informatics technology developments to derive genomewide molecular networks of genotype–phenotype associations, or “phenomic associations.” The HTP phenomics research field faces the challenge of technological research and development to generate novel tools in computation and informatics that will allow researchers to amass, access, integrate, organize, and manage phenotypic databases across species and enable genomewide analysis to associate phenotypic information with genomic data at different scales of biology. Key state-of-the-art technological advancements critical for HTP phenomics research are covered in this review. In particular, we highlight the power of computational approaches to conduct large-scale phenomics studies. PMID:17202287

  5. CESDIS

    NASA Technical Reports Server (NTRS)

    1994-01-01

    CESDIS, the Center of Excellence in Space Data and Information Sciences, was established jointly by NASA, the Universities Space Research Association (USRA), and the University of Maryland in 1988 to focus on the design of advanced computing techniques and data systems to support NASA Earth and space science research programs. CESDIS is operated by USRA under contract to NASA. The Director, Associate Director, Staff Scientists, and administrative staff are located on-site at NASA's Goddard Space Flight Center in Greenbelt, Maryland. The primary CESDIS mission is to increase the connection between computer science and engineering research programs at colleges and universities and NASA groups working with computer applications in Earth and space science. The 1993-94 CESDIS year included a broad range of computer science research applied to NASA problems. This report provides an overview of these research projects and programs as well as a summary of the various other activities of CESDIS in support of NASA and the university research community. We have had an exciting and challenging year.

  6. Telepresence: A "Real" Component in a Model to Make Human-Computer Interface Factors Meaningful in the Virtual Learning Environment

    ERIC Educational Resources Information Center

    Selverian, Melissa E. Markaridian; Lombard, Matthew

    2009-01-01

    A thorough review of the research relating to Human-Computer Interface (HCI) form and content factors in the education, communication and computer science disciplines reveals strong associations of meaningful perceptual "illusions" with enhanced learning and satisfaction in the evolving classroom. Specifically, associations emerge…

  7. Challenges and opportunities of cloud computing for atmospheric sciences

    NASA Astrophysics Data System (ADS)

    Pérez Montes, Diego A.; Añel, Juan A.; Pena, Tomás F.; Wallom, David C. H.

    2016-04-01

    Cloud computing is an emerging technological solution widely used in many fields. Initially developed as a flexible way of managing peak demand, it has begun to make its way into scientific research. One of the greatest advantages of cloud computing for scientific research is independence from access to a large cyberinfrastructure to fund or perform a research project. Cloud computing can avoid maintenance expenses for large supercomputers and has the potential to 'democratize' access to high-performance computing, giving funding bodies flexibility in allocating budgets for the computational costs associated with a project. Two of the most challenging problems in atmospheric sciences are computational cost and uncertainty in meteorological forecasting and climate projections. The two problems are closely related: uncertainty can usually be reduced when computational resources are available to better reproduce a phenomenon or to perform a larger number of experiments. Here we present results of the application of cloud computing resources to climate modeling, using the cloud infrastructures of three major vendors and two climate models. We show how the cloud infrastructure compares in performance to traditional supercomputers and how it provides the capability to complete experiments in shorter periods of time. The associated monetary cost is also analyzed. Finally, we discuss the future potential of this technology for meteorological and climatological applications, from the point of view of both operational use and research.

  8. Gender and stereotypes in motivation to study computer programming for careers in multimedia

    NASA Astrophysics Data System (ADS)

    Doubé, Wendy; Lang, Catherine

    2012-03-01

    A multimedia university programme with relatively equal numbers of male and female students in elective programming subjects provided a rare opportunity to investigate female motivation to study and pursue computer programming in a career. The MSLQ was used to survey 85 participants. In common with research into deterrence of females from STEM domains, females displayed significantly lower self-efficacy and expectancy for success. In contrast to research into deterrence of females from STEM domains, both genders placed similar high values on computer programming and shared high extrinsic and intrinsic goal orientation. The authors propose that the stereotype associated with a creative multimedia career could attract female participation in computer programming whereas the stereotype associated with computer science could be a deterrent.

  9. Network and computing infrastructure for scientific applications in Georgia

    NASA Astrophysics Data System (ADS)

    Kvatadze, R.; Modebadze, Z.

    2016-09-01

    The status of the network and computing infrastructure and the services available to the research and education community of Georgia are presented. The Research and Educational Networking Association (GRENA) provides Internet connectivity, cyber security, technical support, and other network services. Computing resources used by the research teams are located at GRENA and at major state universities. The GE-01-GRENA site is included in the European Grid infrastructure. The paper also contains information about the programs of the Learning Center and the research and development projects in which GRENA is participating.

  10. CESDIS

    NASA Technical Reports Server (NTRS)

    1994-01-01

    CESDIS, the Center of Excellence in Space Data and Information Sciences, was established jointly by NASA, the Universities Space Research Association (USRA), and the University of Maryland in 1988 to focus on the design of advanced computing techniques and data systems to support NASA Earth and space science research programs. CESDIS is operated by USRA under contract to NASA. The Director, Associate Director, Staff Scientists, and administrative staff are located on-site at NASA's Goddard Space Flight Center in Greenbelt, Maryland. The primary CESDIS mission is to increase the connection between computer science and engineering research programs at colleges and universities and NASA groups working with computer applications in Earth and space science. Research areas of primary interest at CESDIS include: 1) High performance computing, especially software design and performance evaluation for massively parallel machines; 2) Parallel input/output and data storage systems for high performance parallel computers; 3) Data base and intelligent data management systems for parallel computers; 4) Image processing; 5) Digital libraries; and 6) Data compression. CESDIS funds multiyear projects at U.S. universities and colleges. Proposals are accepted in response to calls for proposals and are selected on the basis of peer reviews. Funds are provided to support faculty and graduate students working at their home institutions. Project personnel visit Goddard during academic recess periods to attend workshops, present seminars, and collaborate with NASA scientists on research projects. Additionally, CESDIS takes on specific research tasks of shorter duration for computer science research requested by NASA Goddard scientists.

  11. Activities of the Research Institute for Advanced Computer Science

    NASA Technical Reports Server (NTRS)

    Oliger, Joseph

    1994-01-01

    The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on June 6, 1983. RIACS is privately operated by USRA, a consortium of universities with research programs in the aerospace sciences, under contract with NASA. The primary mission of RIACS is to provide research and expertise in computer science and scientific computing to support the scientific missions of NASA ARC. The research carried out at RIACS must change its emphasis from year to year in response to NASA ARC's changing needs and technological opportunities. Research at RIACS is currently being done in the following areas: (1) parallel computing; (2) advanced methods for scientific computing; (3) high performance networks; and (4) learning systems. RIACS technical reports are usually preprints of manuscripts that have been submitted to research journals or conference proceedings. A list of these reports for the period January 1, 1994 through December 31, 1994 is in the Reports and Abstracts section of this report.

  12. Case Study: Creation of a Degree Program in Computer Security. White Paper.

    ERIC Educational Resources Information Center

    Belon, Barbara; Wright, Marie

    This paper reports on research into the field of computer security, and undergraduate degrees offered in that field. Research described in the paper reveals only one computer security program at the associate's degree level in the entire country. That program, at Texas State Technical College in Waco, is a 71-credit-hour program leading to an…

  13. The OSG open facility: A sharing ecosystem

    DOE PAGES

    Jayatilaka, B.; Levshina, T.; Rynge, M.; ...

    2015-12-23

    The Open Science Grid (OSG) ties together individual experiments' computing power, connecting their resources to create a large, robust computing grid. This computing infrastructure started primarily as a collection of sites associated with large HEP experiments such as ATLAS, CDF, CMS, and DZero. In the years since, the OSG has broadened its focus to also address the needs of other US researchers and increased delivery of Distributed High Throughput Computing (DHTC) to users from a wide variety of disciplines via the OSG Open Facility. Presently, the Open Facility delivers about 100 million computing wall hours per year to researchers who are not already associated with the owners of the computing sites; this is primarily accomplished by harvesting and organizing the temporarily unused capacity (i.e., opportunistic cycles) from the sites in the OSG. Using these methods, OSG resource providers and scientists share computing hours with researchers in many other fields to enable their science, striving to make sure that this computing power is used with maximal efficiency. Furthermore, we believe that expanded access to DHTC is an essential tool for scientific innovation, and work continues in expanding this service.

  14. Conference Abstracts: AEDS '84.

    ERIC Educational Resources Information Center

    Baird, William E.

    1985-01-01

    The Association of Educational Data Systems (AEDS) conference included 102 presentations. Abstracts of seven of these presentations are provided. Topic areas considered include LOGO, teaching probability through a computer game, writing effective computer assisted instructional materials, computer literacy, research on instructional…

  15. A Quantitative Model for Assessing Visual Simulation Software Architecture

    DTIC Science & Technology

    2011-09-01

    [Extraction fragments from the record's front matter.] Dissertation committee: Arnold Buss, Research Associate Professor of MOVES; LtCol Jeff Boleng, PhD, Associate Professor of Computer Science, U.S. Air Force Academy; Rudy Darken, Professor of Computer Science (Dissertation Supervisor); Ted Lewis, Professor of Computer Science; Richard Riehle, Professor of Practice. The excerpt also contains reference fragments, including Henry, S., & Kafura, D. (1984), on the evaluation of software metrics (Elsevier Science Ltd., New York, NY, USA).

  16. Interfacing laboratory instruments to multiuser, virtual memory computers

    NASA Technical Reports Server (NTRS)

    Generazio, Edward R.; Stang, David B.; Roth, Don J.

    1989-01-01

    Incentives, problems and solutions associated with interfacing laboratory equipment with multiuser, virtual memory computers are presented. The major difficulty concerns how to utilize these computers effectively in a medium sized research group. This entails optimization of hardware interconnections and software to facilitate multiple instrument control, data acquisition and processing. The architecture of the system that was devised, and associated programming and subroutines are described. An example program involving computer controlled hardware for ultrasonic scan imaging is provided to illustrate the operational features.

  17. A Gaussian Approximation Approach for Value of Information Analysis.

    PubMed

    Jalal, Hawre; Alarid-Escudero, Fernando

    2018-02-01

    Most decisions are associated with uncertainty. Value of information (VOI) analysis quantifies the opportunity loss associated with choosing a suboptimal intervention based on current imperfect information. VOI can inform the value of collecting additional information, resource allocation, research prioritization, and future research designs. However, in practice, VOI remains underused due to many conceptual and computational challenges associated with its application. Expected value of sample information (EVSI) is rooted in Bayesian statistical decision theory and measures the value of information from a finite sample. The past few years have witnessed a dramatic growth in computationally efficient methods to calculate EVSI, including metamodeling. However, little research has been done to simplify the experimental data collection step inherent to all EVSI computations, especially for correlated model parameters. This article proposes a general Gaussian approximation (GA) of the traditional Bayesian updating approach based on the original work by Raiffa and Schlaifer to compute EVSI. The proposed approach uses a single probabilistic sensitivity analysis (PSA) data set and involves 2 steps: 1) a linear metamodel step to compute the EVSI on the preposterior distributions and 2) a GA step to compute the preposterior distribution of the parameters of interest. The proposed approach is efficient and can be applied for a wide range of data collection designs involving multiple non-Gaussian parameters and unbalanced study designs. Our approach is particularly useful when the parameters of an economic evaluation are correlated or interact.
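
    The two-step idea in this abstract (a linear metamodel fitted to PSA draws, followed by a Gaussian-approximation shrinkage toward a preposterior) can be illustrated with a minimal sketch. This is not the authors' implementation: the model, the prior effective sample size n0, and all numbers are hypothetical, and the two-strategy EVSI formula shown is the standard textbook one, not theirs.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical PSA data set: one uncertain parameter theta and the
# incremental net benefit (INB) of intervention A vs B for each draw.
n_psa = 10_000
theta = rng.normal(0.5, 0.2, n_psa)                       # prior draws
inb = 1000.0 * theta - 400.0 + rng.normal(0, 50, n_psa)   # model output

# Step 1 -- linear metamodel: regress INB on theta.
X = np.column_stack([np.ones(n_psa), theta])
beta, *_ = np.linalg.lstsq(X, inb, rcond=None)

# Step 2 -- Gaussian approximation: for a future study of size n, the
# preposterior mean shrinks each prior draw toward the prior mean with
# weight n / (n + n0), where n0 is an assumed prior effective sample size.
n0, n = 25, 100
shrink = n / (n + n0)
theta_prepost = theta.mean() + shrink * (theta - theta.mean())

# EVSI for two strategies: expected gain from deciding after the study.
inb_prepost = np.column_stack([np.ones(n_psa), theta_prepost]) @ beta
evsi = np.mean(np.maximum(inb_prepost, 0.0)) - max(inb_prepost.mean(), 0.0)
print(round(evsi, 1))
```

    By Jensen's inequality the EVSI computed this way is always non-negative, and it grows toward the expected value of perfect information as n increases.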

  18. Factors influencing health professions students' use of computers for data analysis at three Ugandan public medical schools: a cross-sectional survey.

    PubMed

    Munabi, Ian G; Buwembo, William; Bajunirwe, Francis; Kitara, David Lagoro; Joseph, Ruberwa; Peter, Kawungezi; Obua, Celestino; Quinn, John; Mwaka, Erisa S

    2015-02-25

    Effective utilization of computers and their applications in medical education and research is of paramount importance to students. The objective of this study was to determine the association between owning a computer and use of computers for research data analysis and the other factors influencing health professions students' computer use for data analysis. We conducted a cross sectional study among undergraduate health professions students at three public universities in Uganda using a self-administered questionnaire. The questionnaire was composed of questions on participant demographics, students' participation in research, computer ownership, and use of computers for data analysis. Descriptive and inferential statistics (uni-variable and multi-level logistic regression analysis) were used to analyse data. The level of significance was set at 0.05. Six hundred (600) of 668 questionnaires were completed and returned (response rate 89.8%). A majority of respondents were male (68.8%) and 75.3% reported owning computers. Overall, 63.7% of respondents reported that they had ever done computer based data analysis. The following factors were significant predictors of having ever done computer based data analysis: ownership of a computer (Adj. OR 1.80, p = 0.02), recently completed course in statistics (Adj. OR 1.48, p = 0.04), and participation in research (Adj. OR 2.64, p < 0.01). Owning a computer, participation in research and undertaking courses in research methods influence undergraduate students' use of computers for research data analysis. Students are increasingly participating in research, and thus need to have competencies for the successful conduct of research. Medical training institutions should encourage both curricular and extra-curricular efforts to enhance research capacity in line with the modern theories of adult learning.
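
    The odds ratios reported in this abstract are adjusted estimates from the authors' multi-level logistic regression; as a hedged illustration of the underlying quantity, here is how an unadjusted odds ratio and its Woolf 95% confidence interval are computed from a 2x2 table. The counts below are invented for the example and are not the study's data.

```python
import math

# Hypothetical 2x2 table (illustrative counts, not the study's data):
# rows = owns a computer (yes/no), cols = ever did computer-based analysis.
owners_yes, owners_no = 320, 132   # owners: did / did not analyse data
nonown_yes, nonown_no = 62, 86     # non-owners: did / did not

odds_owners = owners_yes / owners_no
odds_nonown = nonown_yes / nonown_no
odds_ratio = odds_owners / odds_nonown

# Woolf's method: 95% CI on the log odds ratio, then exponentiate.
se = math.sqrt(1/owners_yes + 1/owners_no + 1/nonown_yes + 1/nonown_no)
lo = math.exp(math.log(odds_ratio) - 1.96 * se)
hi = math.exp(math.log(odds_ratio) + 1.96 * se)
print(round(odds_ratio, 2), round(lo, 2), round(hi, 2))
```

    An adjusted OR, such as the study's 1.80 for computer ownership, comes from a regression that conditions on the other covariates, so it is generally closer to 1 than the crude table-based estimate when predictors are correlated.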

  19. Applied Computational Fluid Dynamics at NASA Ames Research Center

    NASA Technical Reports Server (NTRS)

    Holst, Terry L.; Kwak, Dochan (Technical Monitor)

    1994-01-01

    The field of Computational Fluid Dynamics (CFD) has advanced to the point where it can now be used for many applications in fluid mechanics research and aerospace vehicle design. A few applications being explored at NASA Ames Research Center will be presented and discussed. The examples presented will range in speed from hypersonic to low speed incompressible flow applications. Most of the results will be from numerical solutions of the Navier-Stokes or Euler equations in three space dimensions for general geometry applications. Computational results will be used to highlight the presentation as appropriate. Advances in computational facilities including those associated with NASA's CAS (Computational Aerosciences) Project of the Federal HPCC (High Performance Computing and Communications) Program will be discussed. Finally, opportunities for future research will be presented and discussed. All material will be taken from non-sensitive, previously-published and widely-disseminated work.

  20. Ergonomics standards and guidelines for computer workstation design and the impact on users' health - a review.

    PubMed

    Woo, E H C; White, P; Lai, C W K

    2016-03-01

    This paper presents an overview of global ergonomics standards and guidelines for design of computer workstations, with particular focus on their inconsistency and associated health risk impact. Overall, considerable disagreements were found in the design specifications of computer workstations globally, particularly in relation to the results from previous ergonomics research and the outcomes from current ergonomics standards and guidelines. To cope with the rapid advancement in computer technology, this article provides justifications and suggestions for modifications in the current ergonomics standards and guidelines for the design of computer workstations. Practitioner Summary: A research gap exists in ergonomics standards and guidelines for computer workstations. We explore the validity and generalisability of ergonomics recommendations by comparing previous ergonomics research through to recommendations and outcomes from current ergonomics standards and guidelines.

  1. Developmental Studies of Computer Programming Skills. A Symposium: Annual Meeting of the American Educational Research Association (New Orleans, Louisiana, April 23-27, 1984). Technical Report No. 29.

    ERIC Educational Resources Information Center

    Kurland, D. Midian, Ed.

    The five papers in this symposium contribute to a dialog on the aims and methods of computer education, and indicate directions future research must take if necessary information is to be available to make informed decisions about the use of computers in schools. The first two papers address the question of what is required for a student to become…

  2. Computational Fluid Dynamics. [numerical methods and algorithm development

    NASA Technical Reports Server (NTRS)

    1992-01-01

    This collection of papers was presented at the Computational Fluid Dynamics (CFD) Conference held at Ames Research Center in California on March 12 through 14, 1991. It is an overview of CFD activities at NASA Lewis Research Center. The main thrust of computational work at Lewis is aimed at propulsion systems. Specific issues related to propulsion CFD and associated modeling will also be presented. Examples of results obtained with the most recent algorithm development will also be presented.

  3. Australian Infection Control Association members' use of skills and resources that promote evidence-based infection control.

    PubMed

    Murphy, C L; McLaws, M

    2000-04-01

    To adopt an evidence-based approach, professionals must be able to access, identify, interpret, and critically appraise best evidence. Critical appraisal requires essential skills, such as computer literacy and an understanding of research principles. These skills also are required for professionals to contribute to evidence. In 1996, members of the Australian Infection Control Association were surveyed to establish a profile including the extent to which they were reading infection control publications, using specific documents for policy and guideline development, developing and undertaking research, publishing research, and using computers. The relationships between demographics, computer use, and research activity were examined. The response rate was 63.4% (630/993). The study group comprised mostly women (96.1%), and most (66.4%) were older than 40 years of age. Median infection control experience was 4 years (mean, 5.4 years; range, <12 months to 35 years). When developing guidelines and policies (92.7%; 584/630), infection control professionals reviewed State Health Department Infection Control Guidelines and Regulations. Research relating to infection control was undertaken by 21.5% (135/628) of the sample, and 27.6% (37/134) of this group published their research findings. Of the respondents (51.1%; 318/622) who used a computer to undertake infection control tasks, the majority (89.0%) used a personal computer for word processing. Regardless of infection control experience, Australian infection control professionals must be adequately prepared to contribute to, access, appraise, and where appropriate, apply best evidence to their practice. We suggest that computer literacy, an understanding of research principles, and familiarity with infection control literature are three essential skills that infection control professionals must possess and regularly exercise.

  4. Issues of data governance associated with data mining in medical research: experiences from an empirical study.

    PubMed

    Nahar, Jesmin; Imam, Tasadduq; Tickle, Kevin S; Garcia-Alonso, Debora

    2013-01-01

    This chapter is a review of data mining techniques used in medical research. It will cover the existing applications of these techniques in the identification of diseases, and also present the authors' research experiences in medical disease diagnosis and analysis. A computational diagnosis approach can have a significant impact on accurate diagnosis and result in time and cost effective solutions. The chapter will begin with an overview of computational intelligence concepts, followed by details on different classification algorithms. Use of association learning, a well recognised data mining procedure, will also be discussed. Many of the datasets considered in existing medical data mining research are imbalanced, and the chapter focuses on this issue as well. Lastly, the chapter outlines the need of data governance in this research domain.

  5. Naval Postgraduate School Research. Volume 8, Number 2, June 1998

    DTIC Science & Technology

    1998-06-01

    [Extraction fragments from the record's front matter.] NPS Research, Volume 8, Number 2, June 1998. Office of the Dean of Research, Naval Postgraduate School, Monterey, California. The excerpt covers the Signal Enhancement Laboratory, Department of Electrical and Computer Engineering: Research Associate Professor Richard W. Adler and Research Associate Wilbur R. Vincent, whose work concerns electromagnetic environmental effects.

  6. Methods for Computationally Efficient Structured CFD Simulations of Complex Turbomachinery Flows

    NASA Technical Reports Server (NTRS)

    Herrick, Gregory P.; Chen, Jen-Ping

    2012-01-01

    This research presents more efficient computational methods by which to perform multi-block structured Computational Fluid Dynamics (CFD) simulations of turbomachinery, thus facilitating higher-fidelity solutions of complicated geometries and their associated flows. This computational framework offers flexibility in allocating resources to balance process count and wall-clock computation time, while facilitating research interests of simulating axial compressor stall inception with more complete gridding of the flow passages and rotor tip clearance regions than is typically practiced with structured codes. The paradigm presented herein facilitates CFD simulation of previously impractical geometries and flows. These methods are validated and demonstrate improved computational efficiency when applied to complicated geometries and flows.

  7. LBNL Computational Research and Theory Facility Groundbreaking - Full Press Conference. February 1st, 2012

    ScienceCinema

    Yelick, Kathy

    2018-01-24

    Energy Secretary Steven Chu, along with Berkeley Lab and UC leaders, broke ground on the Lab's Computational Research and Theory (CRT) facility yesterday. The CRT will be at the forefront of high-performance supercomputing research and be DOE's most efficient facility of its kind. Joining Secretary Chu as speakers were Lab Director Paul Alivisatos, UC President Mark Yudof, Office of Science Director Bill Brinkman, and UC Berkeley Chancellor Robert Birgeneau. The festivities were emceed by Associate Lab Director for Computing Sciences, Kathy Yelick, and Berkeley Mayor Tom Bates joined in the shovel ceremony.

  8. LBNL Computational Research and Theory Facility Groundbreaking. February 1st, 2012

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yelick, Kathy

    2012-02-02

    Energy Secretary Steven Chu, along with Berkeley Lab and UC leaders, broke ground on the Lab's Computational Research and Theory (CRT) facility yesterday. The CRT will be at the forefront of high-performance supercomputing research and be DOE's most efficient facility of its kind. Joining Secretary Chu as speakers were Lab Director Paul Alivisatos, UC President Mark Yudof, Office of Science Director Bill Brinkman, and UC Berkeley Chancellor Robert Birgeneau. The festivities were emceed by Associate Lab Director for Computing Sciences, Kathy Yelick, and Berkeley Mayor Tom Bates joined in the shovel ceremony.

  9. LBNL Computational Research and Theory Facility Groundbreaking. February 1st, 2012

    ScienceCinema

    Yelick, Kathy

    2017-12-09

    Energy Secretary Steven Chu, along with Berkeley Lab and UC leaders, broke ground on the Lab's Computational Research and Theory (CRT) facility yesterday. The CRT will be at the forefront of high-performance supercomputing research and be DOE's most efficient facility of its kind. Joining Secretary Chu as speakers were Lab Director Paul Alivisatos, UC President Mark Yudof, Office of Science Director Bill Brinkman, and UC Berkeley Chancellor Robert Birgeneau. The festivities were emceed by Associate Lab Director for Computing Sciences, Kathy Yelick, and Berkeley Mayor Tom Bates joined in the shovel ceremony.

  10. Sexual Assault Perpetrators' Tactics: Associations with Their Personal Characteristics and Aspects of the Incident

    ERIC Educational Resources Information Center

    Abbey, Antonia; Jacques-Tiura, Angela J.

    2011-01-01

    Past theory and empirical research have consistently associated a number of risk factors with sexual assault perpetration. This study extends past research by considering if the tactics which perpetrators use to obtain sex are associated with these risk factors or with characteristics of the sexual assault. Audio computer-assisted self-interviews…

  11. Does Recreational Computer Use Affect High School Achievement?

    ERIC Educational Resources Information Center

    Bowers, Alex J.; Berland, Matthew

    2013-01-01

    Historically, the relationship between student academic achievement and use of computers for fun and video gaming has been described from a multitude of perspectives, from positive, to negative, to neutral. However, recent research has indicated that computer use and video gaming may be positively associated with achievement, yet these studies…

  12. The Contribution of Visualization to Learning Computer Architecture

    ERIC Educational Resources Information Center

    Yehezkel, Cecile; Ben-Ari, Mordechai; Dreyfus, Tommy

    2007-01-01

    This paper describes a visualization environment and associated learning activities designed to improve learning of computer architecture. The environment, EasyCPU, displays a model of the components of a computer and the dynamic processes involved in program execution. We present the results of a research program that analysed the contribution of…

  13. Efficient computational methods to study new and innovative signal detection techniques in SETI

    NASA Technical Reports Server (NTRS)

    Deans, Stanley R.

    1991-01-01

    The purpose of the research reported here is to provide a rapid computational method for computing various statistical parameters associated with overlapped Hann spectra. These results are important for the Targeted Search part of the Search for ExtraTerrestrial Intelligence (SETI) Microwave Observing Project.
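    The abstract does not give the paper's rapid method, but the quantity it targets can be illustrated directly. Below is a brute-force sketch (the 50% overlap, segment length, and signal are illustrative assumptions, not the paper's parameters) that computes power spectra of overlapped, Hann-windowed segments and per-bin statistics:

```python
import numpy as np

def overlapped_hann_spectra(x, nfft=64, overlap=0.5):
    """Power spectra of overlapped, Hann-windowed segments of x."""
    window = np.hanning(nfft)
    step = int(nfft * (1 - overlap))
    n_segments = (len(x) - nfft) // step + 1
    spectra = np.empty((n_segments, nfft // 2 + 1))
    for i in range(n_segments):
        segment = x[i * step : i * step + nfft] * window
        spectra[i] = np.abs(np.fft.rfft(segment)) ** 2
    return spectra

# Statistical parameters across the overlapped spectra: per-bin mean and variance.
rng = np.random.default_rng(0)
x = rng.standard_normal(4096)
spectra = overlapped_hann_spectra(x)
mean_power = spectra.mean(axis=0)
var_power = spectra.var(axis=0)
```

    The direct computation scales with the number of segments; the cited work's contribution is a faster route to such statistics.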

  14. PNNL's Data Intensive Computing research battles Homeland Security threats

    ScienceCinema

    David Thurman; Joe Kielman; Katherine Wolf; David Atkinson

    2018-05-11

    The Pacific Northwest National Laboratory's (PNNL's) approach to data intensive computing (DIC) is focused on three key research areas: hybrid hardware architecture, software architectures, and analytic algorithms. Advancements in these areas will help to address and solve DIC issues associated with capturing, managing, analyzing and understanding, in near real time, data at volumes and rates that push the frontiers of current technologies.

  15. A brief overview of NASA Langley's research program in formal methods

    NASA Technical Reports Server (NTRS)

    1992-01-01

    An overview of NASA Langley's research program in formal methods is presented. The major goal of this work is to bring formal methods technology to a sufficiently mature level for use by the United States aerospace industry. Towards this goal, work is underway to design and formally verify a fault-tolerant computing platform suitable for advanced flight control applications. Also, several direct technology transfer efforts have been initiated that apply formal methods to critical subsystems of real aerospace computer systems. The research team consists of six NASA civil servants and contractors from Boeing Military Aircraft Company, Computational Logic Inc., Odyssey Research Associates, SRI International, University of California at Davis, and Vigyan Inc.

  16. Technical Evaluation Report for Symposium AVT-147: Computational Uncertainty in Military Vehicle Design

    NASA Technical Reports Server (NTRS)

    Radespiel, Rolf; Hemsch, Michael J.

    2007-01-01

    The complexity of modern military systems, as well as the cost and difficulty associated with experimentally verifying system and subsystem designs, makes high-fidelity simulation an attractive alternative for design and development. The predictive ability of simulations such as computational fluid dynamics (CFD) and computational structural mechanics (CSM) has matured significantly. However, for numerical simulations to be used with confidence in design and development, quantitative measures of uncertainty must be available. The AVT-147 Symposium was established to compile state-of-the-art methods of assessing computational uncertainty, to identify future research and development needs associated with these methods, and to present examples of how these needs are being addressed and how the methods are being applied. Papers were solicited that address uncertainty estimation associated with high-fidelity, physics-based simulations. The solicitation included papers that identify sources of error and uncertainty in numerical simulation from the industry perspective or from the disciplinary or cross-disciplinary research perspective. Examples of the industry perspective were to include how computational uncertainty methods are used to reduce system risk in various stages of design or development.

  17. Assessment of computational issues associated with analysis of high-lift systems

    NASA Technical Reports Server (NTRS)

    Balasubramanian, R.; Jones, Kenneth M.; Waggoner, Edgar G.

    1992-01-01

    Thin-layer Navier-Stokes calculations for wing-fuselage configurations from subsonic to hypersonic flow regimes are now possible. However, efficient, accurate solutions for using these codes for two- and three-dimensional high-lift systems have yet to be realized. A brief overview of salient experimental and computational research is presented. An assessment of the state-of-the-art relative to high-lift system analysis and identification of issues related to grid generation and flow physics which are crucial for computational success in this area are also provided. Research in support of the high-lift elements of NASA's High Speed Research and Advanced Subsonic Transport Programs which addresses some of the computational issues is presented. Finally, fruitful areas of concentrated research are identified to accelerate overall progress for high lift system analysis and design.

  18. MicroRNAs and complex diseases: from experimental results to computational models.

    PubMed

    Chen, Xing; Xie, Di; Zhao, Qi; You, Zhu-Hong

    2017-10-17

    MicroRNAs (miRNAs) have been discovered at a rapid pace in plants, green algae, viruses and animals. As one of the most important components in the cell, miRNAs play an increasingly important role in various essential biological processes. Over recent decades, numerous experimental methods and computational models have been designed and implemented to identify novel miRNA-disease associations. In this review, the functions of miRNAs, miRNA-target interactions, miRNA-disease associations and several important publicly available miRNA-related databases are discussed in detail. In particular, given that an increasing number of miRNA-disease associations have been experimentally confirmed, we selected five important miRNA-related human diseases and five crucial disease-related miRNAs and provide corresponding introductions. Identifying disease-related miRNAs has become an important goal of biomedical research, one that will accelerate the understanding of disease pathogenesis at the molecular level and the design of molecular tools for disease diagnosis, treatment and prevention. Computational models have become an important means of identifying novel miRNA-disease associations: they can select the most promising miRNA-disease pairs for experimental validation and significantly reduce the time and cost of biological experiments. Here, we reviewed 20 state-of-the-art computational models for predicting miRNA-disease associations from different perspectives. Finally, we summarize four factors that make predicting potential disease-related miRNAs difficult, a framework for constructing powerful computational models to predict potential miRNA-disease associations (including five feasible and important research schemas), and future directions for further development of computational models.
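    The review does not specify any single model, but many of the similarity-based predictors it surveys follow a common recipe: score unobserved miRNA-disease pairs by propagating known associations through a miRNA-similarity network. A minimal toy sketch of that idea (the association matrix and cosine-similarity scoring below are illustrative assumptions, not any published model):

```python
import numpy as np

# Rows: miRNAs, columns: diseases; 1 = experimentally confirmed association.
A = np.array([
    [1, 1, 0, 0],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [1, 1, 1, 0],
], dtype=float)

# Cosine similarity between miRNA association profiles.
norms = np.linalg.norm(A, axis=1, keepdims=True)
S = (A @ A.T) / (norms @ norms.T)

# Score each miRNA-disease pair by similarity-weighted known associations,
# then rank the unobserved pairs as candidates for experimental validation.
scores = S @ A
candidates = np.argwhere(A == 0)
ranked = sorted(candidates.tolist(), key=lambda p: -scores[p[0], p[1]])
```

    Ranking candidates this way is what lets such models prioritize a handful of pairs for wet-lab validation instead of testing all unobserved pairs.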

  19. Body image dissatisfaction, physical activity and screen-time in Spanish adolescents.

    PubMed

    Añez, Elizabeth; Fornieles-Deu, Albert; Fauquet-Ars, Jordi; López-Guimerà, Gemma; Puntí-Vidal, Joaquim; Sánchez-Carracedo, David

    2018-01-01

    This cross-sectional study contributes to the literature on whether body dissatisfaction is a barrier or facilitator to engaging in physical activity, and investigates the impact of mass-media messages, via computer time, on body dissatisfaction. High-school students (N = 1501) reported their physical activity, computer time (homework/leisure) and body dissatisfaction. Researchers measured students' weight and height. Analyses revealed that body dissatisfaction was negatively associated with physical activity for both genders, whereas computer time was associated only with girls' body dissatisfaction. Specifically, as computer homework time increased, body dissatisfaction decreased; as computer leisure time increased, body dissatisfaction increased. Weight-related interventions should improve body image and physical activity simultaneously, while interventions on critical consumption of mass media should include a computer component.

  20. Research Projects, Technical Reports and Publications

    NASA Technical Reports Server (NTRS)

    Oliger, Joseph

    1996-01-01

    The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on June 6, 1983. RIACS is privately operated by USRA, a consortium of universities with research programs in the aerospace sciences, under contract with NASA. The primary mission of RIACS is to provide research and expertise in computer science and scientific computing to support the scientific missions of NASA ARC. The research carried out at RIACS must change its emphasis from year to year in response to NASA ARC's changing needs and technological opportunities. A flexible scientific staff is provided through a university faculty visitor program, a post doctoral program, and a student visitor program. Not only does this provide appropriate expertise but it also introduces scientists outside of NASA to NASA problems. A small group of core RIACS staff provides continuity and interacts with an ARC technical monitor and scientific advisory group to determine the RIACS mission. RIACS activities are reviewed and monitored by a USRA advisory council and ARC technical monitor. Research at RIACS is currently being done in the following areas: Advanced Methods for Scientific Computing, and High Performance Networks. During this report period, Professor Antony Jameson of Princeton University, Professor Wei-Pai Tang of the University of Waterloo, Professor Marsha Berger of New York University, Professor Tony Chan of UCLA, Associate Professor David Zingg of the University of Toronto, Canada, and Assistant Professor Andrew Sohn of the New Jersey Institute of Technology have been visiting RIACS. From January 1, 1996 through September 30, 1996, RIACS had three staff scientists, four visiting scientists, one post-doctoral scientist, three consultants, two research associates and one research assistant. RIACS held a joint workshop with Code 1 on 29-30 July 1996.
The workshop was held to discuss needs and opportunities in basic research in computer science in and for NASA applications. There were 14 talks given by NASA, industry and university scientists, and three open discussion sessions. There were approximately fifty participants. A proceedings is being prepared. It is planned to hold similar workshops on an annual basis. RIACS technical reports are usually preprints of manuscripts that have been submitted to research journals or conference proceedings. A list of these reports for the period January 1, 1996 through September 30, 1996 is in the Reports and Abstracts section of this report.

  1. RIACS

    NASA Technical Reports Server (NTRS)

    Moore, Robert C.

    1998-01-01

    The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on June 6, 1983. RIACS is privately operated by USRA, a consortium of universities that serves as a bridge between NASA and the academic community. Under a five-year cooperative agreement with NASA, research at RIACS is focused on areas that are strategically enabling to the Ames Research Center's role as NASA's Center of Excellence for Information Technology. The primary mission of RIACS is to carry out research and development in computer science. This work is devoted in the main to tasks that are strategically enabling with respect to NASA's bold mission in space exploration and aeronautics. There are three foci for this work: (1) Automated Reasoning, (2) Human-Centered Computing, and (3) High Performance Computing and Networking. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. Through its visiting scientist program, RIACS facilitates the participation of university-based researchers, including both faculty and students, in the research activities of NASA and RIACS. RIACS researchers work in close collaboration with NASA computer scientists on projects such as the Remote Agent Experiment on the Deep Space One mission and Super-Resolution Surface Modeling.

  2. Prosodic Stress, Information, and Intelligibility of Speech in Noise

    DTIC Science & Technology

    2009-02-28

    across periods during which acoustic information has been suppressed. 15. SUBJECT TERMS Robust speech intelligibility Computational model of...Research Fellow at the Department of Computer Science at the University of Southern California). This research involved superimposing acoustic and...presented at an invitational-only session of the Acoustical Society of America’s and European Acoustic Association’s joint meeting in 2008. In summary, the

  3. The Peer Assisted Teaching Model for Undergraduate Research at a HBCU

    ERIC Educational Resources Information Center

    Wu, Liyun; Lewis, Marilyn W.

    2018-01-01

    Despite wide application of research skills in higher education, undergraduate students reported research and computer anxiety, and low association between research and their professional goals. This study aims to assess whether peer-assisted mentoring programs would promote positive changes in undergraduates' attitudes toward research. Using a…

  4. USRA/RIACS

    NASA Technical Reports Server (NTRS)

    Oliger, Joseph

    1992-01-01

    The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on 6 June 1983. RIACS is privately operated by USRA, a consortium of universities with research programs in the aerospace sciences, under a cooperative agreement with NASA. The primary mission of RIACS is to provide research and expertise in computer science and scientific computing to support the scientific missions of NASA ARC. The research carried out at RIACS must change its emphasis from year to year in response to NASA ARC's changing needs and technological opportunities. A flexible scientific staff is provided through a university faculty visitor program, a post doctoral program, and a student visitor program. Not only does this provide appropriate expertise but it also introduces scientists outside of NASA to NASA problems. A small group of core RIACS staff provides continuity and interacts with an ARC technical monitor and scientific advisory group to determine the RIACS mission. RIACS activities are reviewed and monitored by a USRA advisory council and ARC technical monitor. Research at RIACS is currently being done in the following areas: Parallel Computing; Advanced Methods for Scientific Computing; Learning Systems; High Performance Networks and Technology; Graphics, Visualization, and Virtual Environments.

  5. Understanding sequence similarity and framework analysis between centromere proteins using computational biology.

    PubMed

    Doss, C George Priya; Chakrabarty, Chiranjib; Debajyoti, C; Debottam, S

    2014-11-01

    Mysteries surrounding the recruitment pathways, cell cycle regulation mechanisms, spindle checkpoint assembly, and chromosome segregation processes of centromere proteins are a centre of attraction in cancer research. With established databases, a range of computational platforms now makes it possible to examine almost all the physiological and biochemical evidence in disease-associated phenotypes. Using existing computational methods, we have utilized amino acid residues to understand the similarity within the evolutionary variance of different associated centromere proteins. This study of sequence similarity, protein-protein networking, co-expression analysis, and the evolutionary trajectory of centromere proteins will speed up the understanding of centromere biology and create a road map for upcoming researchers initiating clinical sequencing work using centromere proteins.

  6. Wildlife software: procedures for publication of computer software

    USGS Publications Warehouse

    Samuel, M.D.

    1990-01-01

    Computers and computer software have become an integral part of the practice of wildlife science. Computers now play an important role in teaching, research, and management applications. Because of the specialized nature of wildlife problems, specific computer software is usually required to address a given problem (e.g., home range analysis). This type of software is not usually available from commercial vendors and therefore must be developed by those wildlife professionals with particular skill in computer programming. Current journal publication practices generally prevent a detailed description of computer software associated with new techniques. In addition, peer review of journal articles does not usually include a review of associated computer software. Thus, many wildlife professionals are usually unaware of computer software that would meet their needs or of major improvements in software they commonly use. Indeed most users of wildlife software learn of new programs or important changes only by word of mouth.

  7. The effects of integrating service learning into computer science: an inter-institutional longitudinal study

    NASA Astrophysics Data System (ADS)

    Payton, Jamie; Barnes, Tiffany; Buch, Kim; Rorrer, Audrey; Zuo, Huifang

    2015-07-01

    This study is a follow-up to one published in computer science education in 2010 that reported preliminary results showing a positive impact of service learning on student attitudes associated with success and retention in computer science. That paper described how service learning was incorporated into a computer science course in the context of the Students & Technology in Academia, Research, and Service (STARS) Alliance, an NSF-supported broadening participation in computing initiative that aims to diversify the computer science pipeline through innovative pedagogy and inter-institutional partnerships. The current paper describes how the STARS Alliance has expanded to diverse institutions, all using service learning as a vehicle for broadening participation in computing and enhancing attitudes and behaviors associated with student success. Results supported the STARS model of service learning for enhancing computing efficacy and computing commitment and for providing diverse students with many personal and professional development benefits.

  8. International Federation of Library Associations Annual Conference Papers. General Research Libraries Division: Parliamentary Libraries and National Libraries Sections (47th, Leipzig, East Germany, August 17-22, 1981).

    ERIC Educational Resources Information Center

    Gude, Gilbert; And Others

    This set of papers presented to the General Research Libraries Division of the International Federation of Library Associations (IFLA) during its 47th annual conference (1981) includes: "The Effect of the Introduction of Computers on Library and Research Staff," by Gilbert Gude; "Libraries as Information Service Agencies…

  9. Simulation Accelerator

    NASA Technical Reports Server (NTRS)

    1998-01-01

    Under a NASA SBIR (Small Business Innovative Research) contract, (NAS5-30905), EAI Simulation Associates, Inc., developed a new digital simulation computer, Starlight(tm). With an architecture based on the analog model of computation, Starlight(tm) outperforms all other computers on a wide range of continuous system simulation. This system is used in a variety of applications, including aerospace, automotive, electric power and chemical reactors.

  10. PRACE - The European HPC Infrastructure

    NASA Astrophysics Data System (ADS)

    Stadelmeyer, Peter

    2014-05-01

    The mission of PRACE (Partnership for Advanced Computing in Europe) is to enable high-impact scientific discovery and engineering research and development across all disciplines to enhance European competitiveness for the benefit of society. PRACE seeks to realize this mission by offering world-class computing and data management resources and services through a peer review process. This talk gives a general overview of PRACE and the PRACE research infrastructure (RI). PRACE is established as an international not-for-profit association with its seat in Brussels, and the PRACE RI is a pan-European supercomputing infrastructure which offers access to computing and data management resources at partner sites distributed throughout Europe. Besides a short summary of the organization, history, and activities of PRACE, it is explained how scientists and researchers from academia and industry around the world can access PRACE systems and which education and training activities are offered by PRACE. The overview also contains a selection of PRACE contributions to societal challenges and ongoing activities. Examples of the latter include, among others, petascaling, an application benchmark suite, best-practice guides for efficient use of key architectures, application enabling and scaling, new programming models, and industrial applications. The PRACE RI provides a persistent world-class high performance computing service for scientists and researchers from academia and industry in Europe. The computer systems and their operations accessible through PRACE are provided by 4 PRACE members (BSC representing Spain, CINECA representing Italy, GCS representing Germany and GENCI representing France).
The Implementation Phase of PRACE receives funding from the EU's Seventh Framework Programme (FP7/2007-2013) under grant agreements RI-261557, RI-283493 and RI-312763. For more information, see www.prace-ri.eu

  11. Prevalence of neck pain and headaches: impact of computer use and other associative factors.

    PubMed

    Smith, L; Louw, Q; Crous, L; Grimmer-Somers, K

    2009-02-01

    Headaches and neck pain are reported to be among the most prevalent musculoskeletal complaints in the general population. A significant body of research has reported a high prevalence of headaches and neck pain among adolescents. Sitting for lengthy periods in fixed postures, such as at computer terminals, may result in adolescent neck pain and headaches. The aim of this paper was to report the association between computer use (exposure) and headaches and neck pain (outcome) among adolescent school students in a developing country. A cross-sectional study was conducted, and a comprehensive data collection instrument was used to collect data from 1073 high-school students. Headaches were associated with high psychosocial scores and were more common among girls. We found a concerning association between neck pain and high hours of computing for school students, and have confirmed the need to educate new computer users (school students) about appropriate ergonomics and postural health.

  12. RIACS

    NASA Technical Reports Server (NTRS)

    Moore, Robert C.

    1998-01-01

    The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on June 6, 1983. RIACS is privately operated by USRA, a consortium of universities that serves as a bridge between NASA and the academic community. Under a five-year cooperative agreement with NASA, research at RIACS is focused on areas that are strategically enabling to the Ames Research Center's role as NASA's Center of Excellence for Information Technology. Research is carried out by a staff of full-time scientists, augmented by visitors, students, post-doctoral candidates and visiting university faculty. The primary mission of RIACS is to carry out research and development in computer science. This work is devoted in the main to tasks that are strategically enabling with respect to NASA's bold mission in space exploration and aeronautics. There are three foci for this work: Automated Reasoning, Human-Centered Computing, and High Performance Computing and Networking. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. Through its visiting scientist program, RIACS facilitates the participation of university-based researchers, including both faculty and students, in the research activities of NASA and RIACS. RIACS researchers work in close collaboration with NASA computer scientists on projects such as the Remote Agent Experiment on the Deep Space One mission and Super-Resolution Surface Modeling.

  13. A literature review of neck pain associated with computer use: public health implications

    PubMed Central

    Green, Bart N

    2008-01-01

    Prolonged use of computers during daily work activities and recreation is often cited as a cause of neck pain. This review of the literature identifies public health aspects of neck pain as associated with computer use. While some retrospective studies support the hypothesis that frequent computer operation is associated with neck pain, few prospective studies reveal causal relationships. Many risk factors are identified in the literature. Primary prevention strategies have largely been confined to addressing environmental exposure to ergonomic risk factors, since to date, no clear cause for this work-related neck pain has been acknowledged. Future research should include identifying causes of work related neck pain so that appropriate primary prevention strategies may be developed and to make policy recommendations pertaining to prevention. PMID:18769599

  14. The Underrepresentation of Women in Computing Fields: A Synthesis of Literature Using a Life Course Perspective

    ERIC Educational Resources Information Center

    Main, Joyce B.; Schimpf, Corey

    2017-01-01

    Using a life course perspective, this literature review synthesizes research on women's underrepresentation in computing fields across four life stages: 1) pre-high school; 2) high school; 3) college major choice and persistence; and 4) post-baccalaureate employment. Issues associated with access to, and use of, computing resources at the pre-high…

  15. RIACS/USRA

    NASA Technical Reports Server (NTRS)

    Oliger, Joseph

    1993-01-01

    The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on 6 June 1983. RIACS is privately operated by USRA, a consortium of universities with research programs in the aerospace sciences, under contract with NASA. The primary mission of RIACS is to provide research and expertise in computer science and scientific computing to support the scientific missions of NASA ARC. The research carried out at RIACS must change its emphasis from year to year in response to NASA ARC's changing needs and technological opportunities. A flexible scientific staff is provided through a university faculty visitor program, a post doctoral program, and a student visitor program. Not only does this provide appropriate expertise but it also introduces scientists outside of NASA to NASA problems. A small group of core RIACS staff provides continuity and interacts with an ARC technical monitor and scientific advisory group to determine the RIACS mission. RIACS activities are reviewed and monitored by a USRA advisory council and ARC technical monitor. Research at RIACS is currently being done in the following areas: Parallel Computing, Advanced Methods for Scientific Computing, High Performance Networks and Technology, and Learning Systems. Parallel compiler techniques, adaptive numerical methods for flows in complicated geometries, and optimization were identified as important problems to investigate for ARC's involvement in the Computational Grand Challenges of the next decade.

  16. Proceedings of Selected Research Paper Presentations at the 1988 Convention of the Association for Educational Communications and Technology and Sponsored by the Research and Theory Division (10th, New Orleans, Louisiana, January 14-19, 1988).

    ERIC Educational Resources Information Center

    Simonson, Michael R., Ed.; Frederick, Jacqueline K., Ed.

    1988-01-01

    The 54 papers in this volume represent some of the most current thinking in educational communications and technology. Individual papers address the following topics: feedback in computer-assisted instruction (CAI); cognitive style and cognitive strategies in CAI; persuasive film-making; learning strategies; computer technology and children's word…

  17. Parameters that affect parallel processing for computational electromagnetic simulation codes on high performance computing clusters

    NASA Astrophysics Data System (ADS)

    Moon, Hongsik

    What is the impact of multicore and associated advanced technologies on computational software for science? Most researchers and students have multicore laptops or desktops for their research, and they need computing power to run computational software packages. Computing power was initially derived from Central Processing Unit (CPU) clock speed. That changed when increases in clock speed became constrained by power requirements. Chip manufacturers turned to multicore CPU architectures and associated technological advancements to create the CPUs of the future. Most software applications benefited from the increased computing power the same way that increases in clock speed helped applications run faster. However, for Computational ElectroMagnetics (CEM) software developers, this change was not an obvious benefit; it appeared to be a detriment. Developers were challenged to find a way to correctly utilize the advancements in hardware so that their codes could benefit. The solution was parallelization, and this dissertation details the investigation to address these challenges. Prior to multicore CPUs, advanced computer technologies were compared by running benchmark software, and the metric was FLoating-point Operations Per Second (FLOPS), which indicates system performance for scientific applications that make heavy use of floating-point calculations. Is FLOPS an effective metric for parallelized CEM simulation tools on new multicore systems? Parallel CEM software needs to be benchmarked not only by FLOPS but also by the performance of other parameters related to the type and utilization of the hardware, such as CPU, Random Access Memory (RAM), hard disk, network, etc. The codes need to be optimized for more than just FLOPS, and new parameters must be included in benchmarking. In this dissertation, the parallel CEM software named High Order Basis Based Integral Equation Solver (HOBBIES) is introduced.
This code was developed to address the needs of the changing computer hardware platforms in order to provide fast, accurate and efficient solutions to large, complex electromagnetic problems. The research in this dissertation proves that the performance of parallel code is intimately related to the configuration of the computer hardware and can be maximized for different hardware platforms. To benchmark and optimize the performance of parallel CEM software, a variety of large, complex projects are created and executed on a variety of computer platforms. The computer platforms used in this research are detailed in this dissertation. The projects run as benchmarks are also described in detail and results are presented. The parameters that affect parallel CEM software on High Performance Computing Clusters (HPCC) are investigated. This research demonstrates methods to maximize the performance of parallel CEM software code.
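    To make the FLOPS metric discussed above concrete, here is a minimal sketch (not the dissertation's benchmark suite) that estimates sustained FLOPS from a dense matrix multiplication, which performs roughly 2*n^3 floating-point operations; the matrix size and repeat count are illustrative assumptions:

```python
import time
import numpy as np

def measure_flops(n=512, repeats=5):
    """Estimate sustained FLOPS from an n x n dense matrix multiplication,
    which costs roughly 2*n**3 floating-point operations. Takes the best
    of several runs to reduce timing noise."""
    a = np.random.rand(n, n)
    b = np.random.rand(n, n)
    best = float("inf")
    for _ in range(repeats):
        t0 = time.perf_counter()
        _ = a @ b
        best = min(best, time.perf_counter() - t0)
    return 2 * n**3 / best

flops = measure_flops()
```

    A kernel like this stresses floating-point throughput almost exclusively, which is exactly why FLOPS alone can mislead: codes dominated by memory, disk, or network traffic will not approach the number it reports.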

  18. University Libraries and Other General Research Libraries Section. General Research Libraries Division. Papers.

    ERIC Educational Resources Information Center

    International Federation of Library Associations, The Hague (Netherlands).

    Papers on university and other research libraries, presented at the 1983 International Federation of Library Associations (IFLA) conference, include: (1) "The Impact of Technology on Users of Academic and Research Libraries," in which C. Lee Jones (United States) focuses on the impact of technical advances in computing and…

  19. Researcher Biographies

    Science.gov Websites

    Research interests: mechanical system design sensitivity analysis and optimization of linear and nonlinear structural systems; reliability analysis and reliability-based design optimization; computational methods in… Committee member, ISSMO; Associate Editor, Mechanics Based Design of Structures and Machines; Associate…

  20. Multiple-User, Multitasking, Virtual-Memory Computer System

    NASA Technical Reports Server (NTRS)

    Generazio, Edward R.; Roth, Don J.; Stang, David B.

    1993-01-01

    Computer system designed and programmed to serve multiple users in research laboratory. Provides for computer control and monitoring of laboratory instruments, acquisition and analysis of data from those instruments, and interaction with users via remote terminals. System provides fast access to shared central processing units and associated large (from megabytes to gigabytes) memories. Underlying concept of system also applicable to monitoring and control of industrial processes.

  1. Computer-Supported Instructional Communication: A Multidisciplinary Account of Relevant Factors

    ERIC Educational Resources Information Center

    Rummel, Nikol; Kramer, Nicole

    2010-01-01

    The papers in the present special issue summarize research that aims at compiling and understanding variables associated with successful communication in computer-supported instructional settings. Secondly, the papers add to the question of how adaptiveness of instructional communication may be achieved. A particular strength of the special issue…

  2. Computer-Assisted Learning in Language Arts

    ERIC Educational Resources Information Center

    Serwer, Blanche L.; Stolurow, Lawrence M.

    1970-01-01

    A description of computer program segments in the feasibility and development phase of Operationally Relevant Activities for Children's Language Experience (Project ORACLE); original form of this paper was prepared by Serwer for presentation to annual meeting of New England Research Association (1st, Boston College, June 5-6, 1969). (Authors/RD)

  3. Commentary: New Technologies on the Horizon for Teaching

    ERIC Educational Resources Information Center

    Parslow, Graham R.

    2013-01-01

    A well-researched report has listed the technologies that should increasingly feature in teaching. It is projected that in the coming year there will be increased use of cloud computing, mobile applications, social exchanges, and tablet computing. The New Media Consortium (NMC) that produced the report is an international association of…

  4. Formal and Informal Learning in a Computer Clubhouse Environment

    ERIC Educational Resources Information Center

    McDougall, Anne; Lowe, Jenny; Hopkins, Josie

    2004-01-01

    This paper outlines the establishment and running of an after-school Computer Clubhouse, describing aspects of the leadership, mentoring and learning activities undertaken there. Research data has been collected from examination of documents associated with the Clubhouse, interviews with its founders, Director, session leaders and mentors, and…

  5. Research of aerohydrodynamic and aeroelastic processes on PNRPU HPC system

    NASA Astrophysics Data System (ADS)

    Modorskii, V. Ya.; Shevelev, N. A.

    2016-10-01

    Research of aerohydrodynamic and aeroelastic processes on the High Performance Computing Complex at PNRPU is actively conducted within the university's priority development direction "Aviation engine and gas turbine technology". Work is carried out in two areas: development and use of domestic software, and use of well-known foreign licensed applied software packages. In addition, a third direction, associated with the verification of computational experiments through physical modeling on unique proprietary experimental installations, is being developed.

  6. Sparse distributed memory and related models

    NASA Technical Reports Server (NTRS)

    Kanerva, Pentti

    1992-01-01

    Described here is sparse distributed memory (SDM) as a neural-net associative memory. It is characterized by two weight matrices and by a large internal dimension - the number of hidden units is much larger than the number of input or output units. The first matrix, A, is fixed and possibly random, and the second matrix, C, is modifiable. The SDM is compared and contrasted to (1) computer memory, (2) correlation-matrix memory, (3) feed-forward artificial neural network, (4) cortex of the cerebellum, (5) Marr and Albus models of the cerebellum, and (6) Albus' cerebellar model arithmetic computer (CMAC). Several variations of the basic SDM design are discussed: the selected-coordinate and hyperplane designs of Jaeckel, the pseudorandom associative neural memory of Hassoun, and SDM with real-valued input variables by Prager and Fallside. SDM research conducted mainly at the Research Institute for Advanced Computer Science (RIACS) in 1986-1991 is highlighted.
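    The two-matrix design described above can be illustrated with a minimal read/write sketch: a fixed random address matrix A selects "active" hard locations within a Hamming radius, and a modifiable contents matrix C accumulates bipolar data. The dimensions, radius, and coding below are simplifying assumptions, not Kanerva's original parameters:

```python
import numpy as np

rng = np.random.default_rng(0)
N, M, D = 256, 1000, 64     # address bits, hard locations, data bits

A = rng.integers(0, 2, (M, N))   # fixed, random location addresses
C = np.zeros((M, D))             # modifiable contents counters
RADIUS = 120                     # Hamming activation radius

def active(addr):
    # Locations whose address lies within RADIUS of the query address
    return np.count_nonzero(A != addr, axis=1) <= RADIUS

def write(addr, data):
    # data is bipolar (+1/-1); add it into every active location
    C[active(addr)] += data

def read(addr):
    # Sum the active counters, then threshold back to bipolar
    return np.where(C[active(addr)].sum(axis=0) >= 0, 1, -1)

addr = rng.integers(0, 2, N)
data = rng.choice([-1, 1], D)
write(addr, data)
recalled = read(addr)
print(np.array_equal(recalled, data))
```

    Because the same address activates the same set of locations at read time, a single stored pattern is recovered exactly; with many stored patterns the summed counters recover the majority signal.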

  7. Computer Science Research Review 1974-75

    DTIC Science & Technology

    1975-08-01

    Faculty and Visitors: Mario Barbacci, Research Associate. B.S., Universidad Nacional de Ingenieria, Lima, Peru (1966)…Engineer, Universidad Nacional de Ingenieria, Lima, Peru (1968); Ph.D., Carnegie-Mellon University (1974). Carnegie, 1969: Design Automation

  8. Library Theory and Research Section. Education and Research Division. Papers.

    ERIC Educational Resources Information Center

    International Federation of Library Associations, The Hague (Netherlands).

    Papers on library/information science theory and research, which were presented at the 1983 International Federation of Library Associations (IFLA) conference, include: (1) "The Role of the Library in Computer-Aided Information and Documentation Systems," in which Wolf D. Rauch (West Germany) asserts that libraries must adapt to the…

  9. The 1987 RIACS annual report

    NASA Technical Reports Server (NTRS)

    1987-01-01

    The Research Institute for Advanced Computer Science (RIACS) was established at the NASA Ames Research Center in June of 1983. RIACS is privately operated by the Universities Space Research Association (USRA), a consortium of 64 universities with graduate programs in the aerospace sciences, under several Cooperative Agreements with NASA. RIACS's goal is to provide preeminent leadership in basic and applied computer science research as partners in support of NASA's goals and missions. In pursuit of this goal, RIACS contributes to several of the grand challenges in science and engineering facing NASA: flying an airplane inside a computer; determining the chemical properties of materials under hostile conditions in the atmospheres of earth and the planets; sending intelligent machines on unmanned space missions; creating a one-world network that makes all scientific resources, including those in space, accessible to all the world's scientists; providing intelligent computational support to all stages of the process of scientific investigation from problem formulation to results dissemination; and developing accurate global models for climatic behavior throughout the world. In working with these challenges, we seek novel architectures, and novel ways to use them, that exploit the potential of parallel and distributed computation and make possible new functions that are beyond the current reach of computing machines. The investigation includes pattern computers as well as the more familiar numeric and symbolic computers, and it includes networked systems of resources distributed around the world. We believe that successful computer science research is interdisciplinary: it is driven by (and drives) important problems in other disciplines. We believe that research should be guided by a clear long-term vision with planned milestones. And we believe that our environment must foster and exploit innovation. 
Our activities and accomplishments for the calendar year 1987 and our plans for 1988 are reported.

  10. A design methodology for portable software on parallel computers

    NASA Technical Reports Server (NTRS)

    Nicol, David M.; Miller, Keith W.; Chrisman, Dan A.

    1993-01-01

    This final report for research that was supported by grant number NAG-1-995 documents our progress in addressing two difficulties in parallel programming. The first difficulty is developing software that will execute quickly on a parallel computer. The second difficulty is transporting software between dissimilar parallel computers. In general, we expect that more hardware-specific information will be included in software designs for parallel computers than in designs for sequential computers. This inclusion is an instance of portability being sacrificed for high performance. New parallel computers are being introduced frequently. Trying to keep software on the current high-performance hardware, a software developer almost continually faces yet another expensive software transportation. The problem of the proposed research is to create a design methodology that helps designers to more precisely control both portability and hardware-specific programming details. The proposed research emphasizes programming for scientific applications. We completed our study of the parallelizability of a subsystem of the NASA Earth Radiation Budget Experiment (ERBE) data processing system. This work is summarized in section two. A more detailed description is provided in Appendix A ('Programming Practices to Support Eventual Parallelism'). Mr. Chrisman, a graduate student, wrote and successfully defended a Ph.D. dissertation proposal which describes our research associated with the issues of software portability and high performance. The list of research tasks is specified in the proposal. The proposal 'A Design Methodology for Portable Software on Parallel Computers' is summarized in section three and is provided in its entirety in Appendix B. We are currently studying a proposed subsystem of the NASA Clouds and the Earth's Radiant Energy System (CERES) data processing system. This software is the proof-of-concept for the Ph.D. dissertation. 
We have implemented and measured the performance of a portion of this subsystem on the Intel iPSC/2 parallel computer. These results are provided in section four. Our future work is summarized in section five, our acknowledgements are stated in section six, and references for published papers associated with NAG-1-995 are provided in section seven.

  11. Final Technical Progress Report; Closeout Certifications; CSSV Newsletter Volume I; CSSV Newsletter Volume II; CSSV Activity Journal; CSSV Final Financial Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Houston, Johnny L; Geter, Kerry

    This Project's third year of implementation in 2007-2008, the final year, was designated by Elizabeth City State University (ECSU), in cooperation with the National Association of Mathematicians (NAM) Inc., as an effort to promote research and research training programs in computational science - scientific visualization (CSSV). A major goal of the Project was to attract energetic and productive faculty, graduate, and upper-division undergraduate students of diverse ethnicities to a program that investigates science and computational science issues of long-term interest to the Department of Energy (DoE) and the nation. The breadth and depth of computational science - scientific visualization and the magnitude of resources available are enormous, permitting a variety of research activities. ECSU's Computational Science-Science Visualization Center will serve as a conduit for directing users to these enormous resources.

  12. Reinforcement learning in depression: A review of computational research.

    PubMed

    Chen, Chong; Takahashi, Taiki; Nakagawa, Shin; Inoue, Takeshi; Kusumi, Ichiro

    2015-08-01

    Despite being considered primarily a mood disorder, major depressive disorder (MDD) is characterized by cognitive and decision-making deficits. Recent research has employed computational models of reinforcement learning (RL) to address these deficits. The computational approach has the advantage of making explicit predictions about learning and behavior, specifying the process parameters of RL, differentiating between model-free and model-based RL, and enabling computational model-based functional magnetic resonance imaging and electroencephalography. With these merits, computational psychiatry has been an emerging field, and here we review specific studies that focused on MDD. Considerable evidence suggests that MDD is associated with impaired brain signals of reward prediction error and expected value ('wanting'), decreased reward sensitivity ('liking') and/or learning (be it model-free or model-based), etc., although the causality remains unclear. These parameters may serve as valuable intermediate phenotypes of MDD, linking general clinical symptoms to underlying molecular dysfunctions. We believe future computational research at clinical, systems, and cellular/molecular/genetic levels will propel us toward a better understanding of the disease. Copyright © 2015 Elsevier Ltd. All rights reserved.
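    The reward prediction error and learning-rate parameters that such studies estimate can be illustrated with a minimal model-free update rule (a Rescorla-Wagner/Q-learning sketch on a two-armed bandit; the learning rate, reward probabilities, and greedy choice rule are illustrative assumptions, not a model from the reviewed studies):

```python
import random

def run_bandit(p_reward=(0.8, 0.2), alpha=0.1, trials=500, seed=1):
    """Model-free value learning on a two-armed bandit.

    Each trial: choose the higher-valued arm (greedy), observe reward,
    and update with the reward prediction error delta = r - Q[a].
    """
    random.seed(seed)
    q = [0.0, 0.0]
    for _ in range(trials):
        a = 0 if q[0] >= q[1] else 1
        r = 1.0 if random.random() < p_reward[a] else 0.0
        delta = r - q[a]           # reward prediction error ('wanting' signal)
        q[a] += alpha * delta      # learning-rate-weighted value update
    return q

print(run_bandit())
```

    In computational psychiatry, parameters such as alpha or the magnitude of delta responses are fit to behavioral or neural data and compared between patient and control groups.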

  13. Reproducible Computing: a new Technology for Statistics Education and Educational Research

    NASA Astrophysics Data System (ADS)

    Wessa, Patrick

    2009-05-01

    This paper explains how the R Framework (http://www.wessa.net) and a newly developed Compendium Platform (http://www.freestatistics.org) allow us to create, use, and maintain documents that contain empirical research results which can be recomputed and reused in derived work. It is illustrated that this technological innovation can be used to create educational applications that can be shown to support effective learning of statistics and associated analytical skills. It is explained how a Compendium can be created by anyone, without the need to understand the technicalities of scientific word processing (LaTeX) or statistical computing (R code). The proposed Reproducible Computing system allows educational researchers to objectively measure key aspects of the actual learning process based on individual and constructivist activities such as: peer review, collaboration in research, computational experimentation, etc. The system was implemented and tested in three statistics courses in which Compendia were used to create an interactive e-learning environment that simulated the real-world process of empirical scientific research.

  14. Next Generation Distributed Computing for Cancer Research

    PubMed Central

    Agarwal, Pankaj; Owzar, Kouros

    2014-01-01

    Advances in next generation sequencing (NGS) and mass spectrometry (MS) technologies have provided many new opportunities and angles for extending the scope of translational cancer research while creating tremendous challenges in data management and analysis. The resulting informatics challenge is invariably not amenable to the use of traditional computing models. Recent advances in scalable computing and associated infrastructure, particularly distributed computing for Big Data, can provide solutions for addressing these challenges. In this review, the next generation of distributed computing technologies that can address these informatics problems is described from the perspective of three key components of a computational platform, namely computing, data storage and management, and networking. A broad overview of scalable computing is provided to set the context for a detailed description of Hadoop, a technology that is being rapidly adopted for large-scale distributed computing. A proof-of-concept Hadoop cluster, set up for performance benchmarking of NGS read alignment, is described as an example of how to work with Hadoop. Finally, Hadoop is compared with a number of other current technologies for distributed computing. PMID:25983539

  15. Next generation distributed computing for cancer research.

    PubMed

    Agarwal, Pankaj; Owzar, Kouros

    2014-01-01

    Advances in next generation sequencing (NGS) and mass spectrometry (MS) technologies have provided many new opportunities and angles for extending the scope of translational cancer research while creating tremendous challenges in data management and analysis. The resulting informatics challenge is invariably not amenable to the use of traditional computing models. Recent advances in scalable computing and associated infrastructure, particularly distributed computing for Big Data, can provide solutions for addressing these challenges. In this review, the next generation of distributed computing technologies that can address these informatics problems is described from the perspective of three key components of a computational platform, namely computing, data storage and management, and networking. A broad overview of scalable computing is provided to set the context for a detailed description of Hadoop, a technology that is being rapidly adopted for large-scale distributed computing. A proof-of-concept Hadoop cluster, set up for performance benchmarking of NGS read alignment, is described as an example of how to work with Hadoop. Finally, Hadoop is compared with a number of other current technologies for distributed computing.
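    The MapReduce model underlying Hadoop, as used in the read-alignment benchmark above, can be sketched without a cluster: a map phase emits key-value pairs, a shuffle groups them by key, and a reduce phase aggregates each group. The toy record format and per-chromosome counting task below are illustrative assumptions, not the benchmarked pipeline:

```python
from collections import defaultdict

# Toy records: (read_id, chromosome) pairs standing in for NGS alignments
records = [("r1", "chr1"), ("r2", "chr2"), ("r3", "chr1"), ("r4", "chr1")]

def mapper(record):
    # Emit (key, 1) for each aligned read, keyed by chromosome
    read_id, chrom = record
    yield chrom, 1

def shuffle(pairs):
    # Group values by key, as the framework does between map and reduce
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reducer(key, values):
    # Aggregate the per-chromosome counts
    return key, sum(values)

pairs = [kv for rec in records for kv in mapper(rec)]
counts = dict(reducer(k, v) for k, v in shuffle(pairs).items())
print(counts)  # aligned-read counts per chromosome
```

    In Hadoop itself, the map and reduce functions run on many nodes and the shuffle moves data across the network, but the programming contract is the same as in this in-process sketch.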

  16. Annual Proceedings of Selected Research and Development Papers Presented at the National Convention of the Association for Educational Communications and Technology (22nd, Long Beach, California, February 16-20, 2000).

    ERIC Educational Resources Information Center

    Sparks, Kristin E., Ed.; Simonson, Michael, Ed.

    2000-01-01

    Subjects addressed by the 35 papers in this proceedings include: computer-based cooperative, collaborative and individual learning; comparison of students' and teachers' computer affect and behavior effects on performance; effects of headings and computer experience in CBI; use of CSCA to support argumentation skills in legal education; drama's…

  17. Computational Modeling of Space Physiology

    NASA Technical Reports Server (NTRS)

    Lewandowski, Beth E.; Griffin, Devon W.

    2016-01-01

    The Digital Astronaut Project (DAP), within NASA's Human Research Program, develops and implements computational modeling for use in the mitigation of human health and performance risks associated with long duration spaceflight. Over the past decade, DAP developed models to provide insights into space flight related changes to the central nervous system, cardiovascular system and the musculoskeletal system. Examples of the models and their applications include biomechanical models applied to advanced exercise device development, bone fracture risk quantification for mission planning, accident investigation, bone health standards development, and occupant protection. The International Space Station (ISS), in its role as a testing ground for long duration spaceflight, has been an important platform for obtaining human spaceflight data. DAP has used preflight, in-flight and post-flight data from short and long duration astronauts for computational model development and validation. Examples include preflight and post-flight bone mineral density data, muscle cross-sectional area, and muscle strength measurements. Results from computational modeling supplement space physiology research by informing experimental design. Using these computational models, DAP personnel can easily identify both important factors associated with a phenomenon and areas where data are lacking. This presentation will provide examples of DAP computational models, the data used in model development and validation, and applications of the model.

  18. Early experiences in developing and managing the neuroscience gateway.

    PubMed

    Sivagnanam, Subhashini; Majumdar, Amit; Yoshimoto, Kenneth; Astakhov, Vadim; Bandrowski, Anita; Martone, MaryAnn; Carnevale, Nicholas T

    2015-02-01

    The last few decades have seen the emergence of computational neuroscience as a mature field where researchers are interested in modeling complex and large neuronal systems and require access to high performance computing machines and associated cyber infrastructure to manage computational workflow and data. The neuronal simulation tools used in this research field are also implemented for parallel computers and suitable for high performance computing machines. But using these tools on complex high performance computing machines remains a challenge because of issues with acquiring computer time on these machines, which are located at national supercomputer centers, with their complex user interfaces, and with data management and retrieval. The Neuroscience Gateway is being developed to alleviate and/or hide these barriers to entry for computational neuroscientists. It hides or eliminates, from the point of view of the users, all the administrative and technical barriers and makes parallel neuronal simulation tools easily available and accessible on complex high performance computing machines. It handles the running of jobs and data management and retrieval. This paper shares the early experiences in bringing up this gateway and describes the software architecture it is based on, how it is implemented, and how users can use this for computational neuroscience research using high performance computing at the back end. We also look at parallel scaling of some publicly available neuronal models and analyze the recent usage data of the neuroscience gateway.

  19. Early experiences in developing and managing the neuroscience gateway

    PubMed Central

    Sivagnanam, Subhashini; Majumdar, Amit; Yoshimoto, Kenneth; Astakhov, Vadim; Bandrowski, Anita; Martone, MaryAnn; Carnevale, Nicholas T.

    2015-01-01

    The last few decades have seen the emergence of computational neuroscience as a mature field where researchers are interested in modeling complex and large neuronal systems and require access to high performance computing machines and associated cyber infrastructure to manage computational workflow and data. The neuronal simulation tools used in this research field are also implemented for parallel computers and suitable for high performance computing machines. But using these tools on complex high performance computing machines remains a challenge because of issues with acquiring computer time on these machines, which are located at national supercomputer centers, with their complex user interfaces, and with data management and retrieval. The Neuroscience Gateway is being developed to alleviate and/or hide these barriers to entry for computational neuroscientists. It hides or eliminates, from the point of view of the users, all the administrative and technical barriers and makes parallel neuronal simulation tools easily available and accessible on complex high performance computing machines. It handles the running of jobs and data management and retrieval. This paper shares the early experiences in bringing up this gateway and describes the software architecture it is based on, how it is implemented, and how users can use this for computational neuroscience research using high performance computing at the back end. We also look at parallel scaling of some publicly available neuronal models and analyze the recent usage data of the neuroscience gateway. PMID:26523124

  20. Computers on the Battlefield: Can They Survive?

    DTIC Science & Technology

    1983-01-01

    presents the research findings of senior fellows, faculty, students, and associates of the University and its component institutions, the National War…Directorate and NDU Press; Director of Research and Publisher, Colonel John E. Endicott, USAF; Associate Director and Professor of Research, Colonel…2-1 Command and Control Network; 2-2 Relative Cost to Fix an Error During System Development

  1. An automated framework for hypotheses generation using literature.

    PubMed

    Abedi, Vida; Zand, Ramin; Yeasin, Mohammed; Faisal, Fazle Elahi

    2012-08-29

    In biomedicine, exploratory studies and hypothesis generation often begin with researching existing literature to identify a set of factors and their association with diseases, phenotypes, or biological processes. Many scientists are overwhelmed by the sheer volume of literature on a disease when they plan to generate a new hypothesis or study a biological phenomenon. The situation is even worse for junior investigators who often find it difficult to formulate new hypotheses or, more importantly, corroborate whether their hypothesis is consistent with existing literature. It is a daunting task to keep abreast of so much being published and also remember all combinations of direct and indirect associations. Fortunately, there is a growing trend of using literature mining and knowledge discovery tools in biomedical research. However, there is still a large gap between the huge amount of effort and resources invested in disease research and the little effort in harvesting the published knowledge. The proposed hypothesis generation framework (HGF) finds "crisp semantic associations" among entities of interest - that is a step towards bridging such gaps. The proposed HGF shares end goals similar to those of SWAN but is more holistic in nature, and was designed and implemented using scalable and efficient computational models of disease-disease interaction. The integration of mapping ontologies with latent semantic analysis is critical in capturing domain-specific direct and indirect "crisp" associations, and in making assertions about entities (such as disease X is associated with a set of factors Z). Pilot studies were performed using two diseases. A comparative analysis of the computed "associations" and "assertions" with curated expert knowledge was performed to validate the results. It was observed that the HGF is able to capture "crisp" direct and indirect associations, and provide knowledge discovery on demand. 
The proposed framework is fast, efficient, and robust in generating new hypotheses to identify factors associated with a disease. A full integrated Web service application is being developed for wide dissemination of the HGF. A large-scale study by the domain experts and associated researchers is underway to validate the associations and assertions computed by the HGF.
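    The HGF itself combines ontology mapping with latent semantic analysis; as a deliberately simplified stand-in for its indirect-association step, the sketch below uses Swanson-style ABC linking, where a disease-factor link and a factor-gene link suggest a candidate disease-gene hypothesis. The entity names and link sets are hypothetical:

```python
# Swanson-style ABC linking: if A co-occurs with B, and B with C,
# then A-C is a candidate indirect association (a new hypothesis).
direct = {
    "diseaseX": {"factorZ1", "factorZ2"},
    "geneY": {"factorZ1"},
    "factorZ1": {"diseaseX", "geneY"},
    "factorZ2": {"diseaseX"},
}

def indirect_candidates(a, c):
    # Intermediate entities B that connect A and C in the literature graph
    return direct.get(a, set()) & direct.get(c, set())

# diseaseX and geneY never co-occur directly, but share factorZ1
print(indirect_candidates("diseaseX", "geneY"))
```

    Frameworks like the HGF replace these binary link sets with weighted semantic associations mined from full abstracts, but the direct-versus-indirect distinction is the same.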

  2. The EPA CompTox Chemistry Dashboard - an online resource ...

    EPA Pesticide Factsheets

    The U.S. Environmental Protection Agency (EPA) Computational Toxicology Program integrates advances in biology, chemistry, and computer science to help prioritize chemicals for further research based on potential human health risks. This work involves computational and data driven approaches that integrate chemistry, exposure and biological data. As an outcome of these efforts the National Center for Computational Toxicology (NCCT) has measured, assembled and delivered an enormous quantity and diversity of data for the environmental sciences including high-throughput in vitro screening data, in vivo and functional use data, exposure models and chemical databases with associated properties. A series of software applications and databases have been produced over the past decade to deliver these data. Recent work has focused on the development of a new architecture that assembles the resources into a single platform. With a focus on delivering access to Open Data streams, web service integration accessibility and a user-friendly web application the CompTox Dashboard provides access to data associated with ~720,000 chemical substances. These data include research data in the form of bioassay screening data associated with the ToxCast program, experimental and predicted physicochemical properties, product and functional use information and related data of value to environmental scientists. This presentation will provide an overview of the CompTox Dashboard and its va

  3. Interactive and Multimedia Contents Associated with a System for Computer-Aided Assessment

    ERIC Educational Resources Information Center

    Paiva, Rui C.; Ferreira, Milton S.; Mendes, Ana G.; Eusébio, Augusto M. J.

    2015-01-01

    This article presents a research study addressing the development, implementation, evaluation, and use of Interactive Modules for Online Training (MITO) of mathematics in higher education. This work was carried out in the context of the MITO project, which combined several features of the learning and management system Moodle, the computer-aided…

  4. Interdisciplinary Facilities that Support Collaborative Teaching and Learning

    ERIC Educational Resources Information Center

    Asoodeh, Mike; Bonnette, Roy

    2006-01-01

    It has become widely accepted that the computer is an indispensable tool in the study of science and technology. Thus, in recent years curricular programs such as Industrial Technology and associated scientific disciplines have been adopting and adapting the computer as a tool in new and innovative ways to support teaching, learning, and research.…

  5. Identifying Predictors of Achievement in the Newly Defined Information Literacy: A Neural Network Analysis

    ERIC Educational Resources Information Center

    Sexton, Randall; Hignite, Michael; Margavio, Thomas M.; Margavio, Geanie W.

    2009-01-01

    Information Literacy is a concept that evolved as a result of efforts to move technology-based instructional and research efforts beyond the concepts previously associated with "computer literacy." While computer literacy was largely a topic devoted to knowledge of hardware and software, information literacy is concerned with students' abilities…

  6. Associations between Screen-Based Sedentary Behaviour and Anxiety Symptoms in Mothers with Young Children

    PubMed Central

    Teychenne, Megan; Hinkley, Trina

    2016-01-01

    Objectives Anxiety is a serious illness and women (including mothers with young children) are at particular risk. Although physical activity (PA) may reduce anxiety risk, little research has investigated the link between sedentary behaviour and anxiety risk. The aim of this study was to examine the association between screen-based sedentary behaviour and anxiety symptoms, independent of PA, amongst mothers with young children. Methods During 2013–2014, 528 mothers with children aged 2–5 years completed self-report measures of recreational screen-based sedentary behaviour (TV/DVD/video viewing, computer/e-games/hand held device use) and anxiety symptoms (using the Hospital Anxiety and Depression Scale, HADS-A). Linear regression analyses examined the cross-sectional association between screen-based sedentary behaviour and anxiety symptoms. Results In models that adjusted for key demographic and behavioural covariates (including moderate- to vigorous-intensity PA, MVPA), computer/device use (B = 0.212; 95% CI = 0.048, 0.377) and total screen time (B = 0.109; 95% CI = 0.014, 0.205) were positively associated with heightened anxiety symptoms. TV viewing was not associated with anxiety symptoms in either model. Conclusions Higher levels of recreational computer or handheld device use and overall screen time may be linked to higher risk of anxiety symptoms in mothers with young children, independent of MVPA. Further longitudinal and intervention research is required to determine temporal associations. PMID:27191953
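    The adjusted associations reported above come from linear regression models of the form anxiety = b0 + b1*screen_time + b2*MVPA + …; a minimal sketch of fitting such a model by ordinary least squares follows (the synthetic data, coefficients, and variable ranges are illustrative assumptions, not the study's data):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200
screen = rng.uniform(0, 8, n)    # hours/day of recreational screen time
mvpa = rng.uniform(0, 2, n)      # hours/day of moderate-to-vigorous PA
# Synthetic outcome: anxiety score rises with screen time, falls with MVPA
anxiety = 4 + 0.2 * screen - 0.5 * mvpa + rng.normal(0, 0.5, n)

# Design matrix with intercept; solve the least-squares problem directly
X = np.column_stack([np.ones(n), screen, mvpa])
coef, *_ = np.linalg.lstsq(X, anxiety, rcond=None)
b0, b_screen, b_mvpa = coef
print(f"B(screen)={b_screen:.2f}, B(mvpa)={b_mvpa:.2f}")
```

    Including MVPA as a column in the design matrix is what makes the screen-time coefficient "independent of PA" in the sense used by the abstract.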

  7. Stability Analysis of Finite Difference Approximations to Hyperbolic Systems,and Problems in Applied and Computational Matrix and Operator Theory

    DTIC Science & Technology

    1990-12-07

Fundação Calouste Gulbenkian, Instituto Gulbenkian de Ciência, Centro de Cálculo Científico, Coimbra, 1973. 28, Dirac, P. A. M., Spinors in Hilbert Space...Office of Scientific Research grants 1965 Mathematical Association of America Editorial Prize for the article entitled: "Linear Transformations on...matrices" 1966 L.R. Ford Memorial Prize awarded by the Mathematical Association of America for the article, "Permanents" 1989 Outstanding Computer

  8. Interdisciplinary research and education at the biology-engineering-computer science interface: a perspective.

    PubMed

    Tadmor, Brigitta; Tidor, Bruce

    2005-09-01

    Progress in the life sciences, including genome sequencing and high-throughput experimentation, offers an opportunity for understanding biology and medicine from a systems perspective. This 'new view', which complements the more traditional component-based approach, involves the integration of biological research with approaches from engineering disciplines and computer science. The result is more than a new set of technologies. Rather, it promises a fundamental reconceptualization of the life sciences based on the development of quantitative and predictive models to describe crucial processes. To achieve this change, learning communities are being formed at the interface of the life sciences, engineering and computer science. Through these communities, research and education will be integrated across disciplines and the challenges associated with multidisciplinary team-based science will be addressed.

  9. Knowledge and utilization of computer-software for statistics among Nigerian dentists.

    PubMed

    Chukwuneke, F N; Anyanechi, C E; Obiakor, A O; Amobi, O; Onyejiaka, N; Alamba, I

    2013-01-01

The use of computer software for statistical analysis has transformed health information and data, simplifying access, storage, retrieval and analysis in the field of research. This survey was therefore carried out to assess the level of knowledge and utilization of computer software for statistical analysis among dental researchers in eastern Nigeria. Questionnaires on the use of computer software for statistical analysis were randomly distributed to 65 practicing dental surgeons with more than 5 years' experience in the tertiary academic hospitals in eastern Nigeria. The focus was on: years of clinical experience; research work experience; and knowledge and application of computer software for data processing and statistical analysis. Sixty-two (62/65; 95.4%) of these questionnaires were returned anonymously and used in the data analysis. Twenty-nine (29/62; 46.8%) respondents had 5-10 years of clinical experience, none of whom had completed the specialist training programme. Practitioners with more than 10 years of clinical experience numbered 33 (33/62; 53.2%), of whom 15 (15/33; 45.5%) were specialists, representing 24.2% (15/62) of the total number of respondents. All 15 specialists were actively involved in research activities, but only five (5/15; 33.3%) could use statistical software unaided. This study identified poor utilization of computer software for statistical analysis among dental researchers in eastern Nigeria, strongly associated with a lack of early exposure to such software, especially during undergraduate training. This calls for the introduction of a computer training programme into the dental curriculum to enable practitioners to develop the habit of using statistical software in their research.

  10. Use of the computer for research on student thinking in physics

    NASA Astrophysics Data System (ADS)

    Grayson, Diane J.; McDermott, Lillian C.

    1996-05-01

    This paper describes the use of the computer-based interview as a research technique for investigating how students think about physics. Two computer programs provide the context: one intended for instruction, the other for research. The one designed for use as an instructional aid displays the motion of a ball rolling along a track that has level and inclined segments. The associated motion graphs are also shown. The other program, which was expressly designed for use in research, is based on the simulated motion of a modified Atwood's machine. The programs require students to predict the effect of the initial conditions and system parameters on the motion or on a graph of the motion. The motion that would actually occur is then displayed. The investigation focuses on the reasoning used by the students as they try to resolve discrepancies between their predictions and observations.

  11. Some Thoughts Regarding Practical Quantum Computing

    NASA Astrophysics Data System (ADS)

    Ghoshal, Debabrata; Gomez, Richard; Lanzagorta, Marco; Uhlmann, Jeffrey

    2006-03-01

Quantum computing has become an important area of research in computer science because of its potential to provide more efficient algorithmic solutions to certain problems than are possible with classical computing. The ability of performing parallel operations over an exponentially large computational space has proved to be the main advantage of the quantum computing model. In this regard, we are particularly interested in the potential applications of quantum computers to enhance real software systems of interest to the defense, industrial, scientific and financial communities. However, while much has been written in popular and scientific literature about the benefits of the quantum computational model, several of the problems associated with the practical implementation of real-life complex software systems in quantum computers are often ignored. In this presentation we will argue that practical quantum computation is not as straightforward as commonly advertised, even if the technological problems associated with the manufacturing and engineering of large-scale quantum registers were solved overnight. We will discuss some of the frequently overlooked difficulties that plague quantum computing in the areas of memories, I/O, addressing schemes, compilers, oracles, approximate information copying, logical debugging, error correction and fault-tolerant computing protocols.

  12. Resident research associateships, postdoctoral research awards 1989: opportunities for research at the U.S. Geological Survey, U.S. Department of the Interior

    USGS Publications Warehouse

    ,; ,

    1989-01-01

    The scientists of the U.S. Geological Survey are engaged in a wide range of geologic, geophysical, geochemical, hydrologic, and cartographic programs, including the application of computer science to them. These programs offer exciting possibilities for scientific achievement and professional growth to young scientists through participation as Research Associates.

  13. Tools and techniques for computational reproducibility.

    PubMed

    Piccolo, Stephen R; Frampton, Michael B

    2016-07-11

When reporting research findings, scientists document the steps they followed so that others can verify and build upon the research. When those steps have been described in sufficient detail that others can retrace the steps and obtain similar results, the research is said to be reproducible. Computers play a vital role in many research disciplines and present both opportunities and challenges for reproducibility. Computers can be programmed to execute analysis tasks, and those programs can be repeated and shared with others. The deterministic nature of most computer programs means that the same analysis tasks, applied to the same data, will often produce the same outputs. However, in practice, computational findings often cannot be reproduced because of complexities in how software is packaged, installed, and executed, and because of limitations associated with how scientists document analysis steps. Many tools and techniques are available to help overcome these challenges; here we describe seven such strategies. With a broad scientific audience in mind, we describe the strengths and limitations of each approach, as well as the circumstances under which each might be applied. No single strategy is sufficient for every scenario; thus we emphasize that it is often useful to combine approaches.
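Two of the simplest habits underlying such strategies, deterministic seeding and provenance recording, can be sketched in a few lines of stdlib-only Python (the function names and record fields here are illustrative, not from the paper):

```python
import hashlib
import json
import platform
import random
import sys

def run_analysis(data, seed=42):
    """Deterministic analysis step: seeding the RNG makes a rerun identical."""
    random.seed(seed)
    sample = random.sample(data, k=3)  # reproducible because of the fixed seed
    return sorted(sample)

def provenance(data):
    """Record the facts a reader needs to retrace this run."""
    return {
        "python": platform.python_version(),
        "platform": sys.platform,
        "data_sha256": hashlib.sha256(json.dumps(data).encode()).hexdigest(),
    }

data = list(range(10))
first = run_analysis(data)
second = run_analysis(data)
assert first == second  # same seed, same data -> same result
print(provenance(data))
```

A checksum of the input data plus the interpreter version catches the two most common silent reproducibility failures: the data changed, or the environment did.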

  14. Home media and children's achievement and behavior.

    PubMed

    Hofferth, Sandra L

    2010-01-01

    This study provides a national picture of the time American 6- to 12-year-olds spent playing video games, using the computer, and watching TV at home in 1997 and 2003, and the association of early use with their achievement and behavior as adolescents. Girls benefited from computer use more than boys, and Black children benefited more than White children. Greater computer use in middle childhood was associated with increased achievement for White and Black girls, and for Black but not White boys. Increased video game play was associated with an improved ability to solve applied problems for Black girls but lower verbal achievement for all girls. For boys, increased video game play was linked to increased aggressive behavior problems. © 2010 The Author. Child Development © 2010 Society for Research in Child Development, Inc.

  15. MoCog1: A computer simulation of recognition-primed human decision making

    NASA Technical Reports Server (NTRS)

    Gevarter, William B.

    1991-01-01

This report describes the successful results of the first stage of a research effort to develop a 'sophisticated' computer model of human cognitive behavior. Most human decision-making is of the experience-based, relatively straightforward, largely automatic type of response to internal goals and drives, utilizing cues and opportunities perceived from the current environment. This report describes the development of the architecture and computer program associated with such 'recognition-primed' decision-making. The resultant computer program was successfully utilized as a vehicle to simulate findings that relate how an individual's implicit theories orient them toward particular goals, with resultant cognitions, affects, and behavior in response to their environment. The present work is an expanded version and is based on research reported while the author was an employee of NASA ARC.

  16. RAID v2.0: an updated resource of RNA-associated interactions across organisms.

    PubMed

    Yi, Ying; Zhao, Yue; Li, Chunhua; Zhang, Lin; Huang, Huiying; Li, Yana; Liu, Lanlan; Hou, Ping; Cui, Tianyu; Tan, Puwen; Hu, Yongfei; Zhang, Ting; Huang, Yan; Li, Xiaobo; Yu, Jia; Wang, Dong

    2017-01-04

With the development of biotechnologies and computational prediction algorithms, the number of experimentally determined and computationally predicted RNA-associated interactions has grown rapidly in recent years. However, diverse RNA-associated interactions are scattered over a wide variety of resources and organisms, whereas a fully comprehensive view of diverse RNA-associated interactions is still not available for any species. Hence, we have updated the RAID database to version 2.0 (RAID v2.0, www.rna-society.org/raid/) by integrating experimental and computationally predicted interactions from manual literature curation and other database resources under one common framework. The new developments in RAID v2.0 include (i) an over 850-fold increase in RNA-associated interactions compared with the previous version; (ii) numerous resources integrated with experimental or computational prediction evidence for each RNA-associated interaction; (iii) a reliability assessment for each RNA-associated interaction based on an integrative confidence score; and (iv) an increase of species coverage to 60. Consequently, RAID v2.0 recruits more than 5.27 million RNA-associated interactions, including more than 4 million RNA-RNA interactions and more than 1.2 million RNA-protein interactions, referring to nearly 130 000 RNA/protein symbols across 60 species. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.

  17. Computational intelligence in bioinformatics: SNP/haplotype data in genetic association study for common diseases.

    PubMed

    Kelemen, Arpad; Vasilakos, Athanasios V; Liang, Yulan

    2009-09-01

    Comprehensive evaluation of common genetic variations through association of single-nucleotide polymorphism (SNP) structure with common complex disease in the genome-wide scale is currently a hot area in human genome research due to the recent development of the Human Genome Project and HapMap Project. Computational science, which includes computational intelligence (CI), has recently become the third method of scientific enquiry besides theory and experimentation. There have been fast growing interests in developing and applying CI in disease mapping using SNP and haplotype data. Some of the recent studies have demonstrated the promise and importance of CI for common complex diseases in genomic association study using SNP/haplotype data, especially for tackling challenges, such as gene-gene and gene-environment interactions, and the notorious "curse of dimensionality" problem. This review provides coverage of recent developments of CI approaches for complex diseases in genetic association study with SNP/haplotype data.
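The CI methods this review covers build on, and are benchmarked against, simple single-SNP association tests. As a point of reference, the classic Pearson chi-square on a 2x2 allele-count table (case/control rows, alternate/reference allele columns) can be computed directly; the counts below are hypothetical, purely for illustration:

```python
def chi_square_2x2(case_alt, case_ref, ctrl_alt, ctrl_ref):
    """Pearson chi-square statistic for a 2x2 allele-count table
    (rows: case/control; columns: alternate/reference allele)."""
    table = [[case_alt, case_ref], [ctrl_alt, ctrl_ref]]
    total = case_alt + case_ref + ctrl_alt + ctrl_ref
    row = [sum(r) for r in table]
    col = [case_alt + ctrl_alt, case_ref + ctrl_ref]
    stat = 0.0
    for i in range(2):
        for j in range(2):
            expected = row[i] * col[j] / total  # count expected under independence
            stat += (table[i][j] - expected) ** 2 / expected
    return stat

# Hypothetical SNP: 120/80 alt/ref alleles in cases vs. 90/110 in controls.
print(round(chi_square_2x2(120, 80, 90, 110), 2))
```

In a genome-wide scan this test is repeated per SNP, which is exactly where the "curse of dimensionality" and interaction-detection problems mentioned above arise, since marginal tests like this one cannot see gene-gene effects.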

  18. Promising Areas for Psychometric Research.

    ERIC Educational Resources Information Center

    Angoff, William H.

    1988-01-01

    An overview of four papers on useful future directions for psychometric research is provided. The papers were drawn from American Psychological Association symposia; they cover the nature of general intelligence, item bias and selection, cut scores, equating problems, computer-adaptive testing, and individual and group achievement measurement.…

  19. Technology in the Service of Creativity: Computer Assisted Writing Project--Stetson Middle School, Philadelphia, Pennsylvania. Final Report.

    ERIC Educational Resources Information Center

    Bender, Evelyn

    The American Library Association's Carroll Preston Baber Research Award supported this project on the use, impact and feasibility of a computer assisted writing facility located in the library of Stetson Middle School in Philadelphia, an inner city school with a population of minority, "at risk" students. The writing facility consisted…

  20. The Mind Research Network - Mental Illness Neuroscience Discovery Grant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roberts, J.; Calhoun, V.

The scientific and technological programs of the Mind Research Network (MRN) reflect DOE missions in basic science and associated instrumentation, computational modeling, and experimental techniques. MRN's technical goals over the course of this project have been to develop and apply integrated, multi-modality functional imaging techniques derived from a decade of DOE-supported research and technology development.

  1. A Review of New Brunswick's Dedicated Notebook Research Project: One-to-One Computing--A Compelling Classroom-Change Intervention

    ERIC Educational Resources Information Center

    Milton, Penny

    2008-01-01

    The Canadian Education Association (CEA) was commissioned by Hewlett-Packard Canada to create a case study describing the development, implementation and outcomes of New Brunswick's Dedicated Notebook Research Project. The New Brunswick Department of Education designed its research project to assess impacts on teaching and learning of dedicated…

  2. CloudMan as a platform for tool, data, and analysis distribution.

    PubMed

    Afgan, Enis; Chapman, Brad; Taylor, James

    2012-11-27

Cloud computing provides an infrastructure that facilitates large scale computational analysis in a scalable, democratized fashion. However, in this context it is difficult to ensure sharing of an analysis environment and associated data in a scalable and precisely reproducible way. CloudMan (usecloudman.org) enables individual researchers to easily deploy, customize, and share their entire cloud analysis environment, including data, tools, and configurations. With the enabled customization and sharing of instances, CloudMan can be used as a platform for collaboration. The presented solution improves accessibility of cloud resources, tools, and data to the level of an individual researcher and contributes toward reproducibility and transparency of research solutions.

  3. RKNNMDA: Ranking-based KNN for MiRNA-Disease Association prediction.

    PubMed

    Chen, Xing; Wu, Qiao-Feng; Yan, Gui-Ying

    2017-07-03

Cumulative verified experimental studies have demonstrated that microRNAs (miRNAs) could be closely related with the development and progression of human complex diseases. Based on the assumption that functionally similar miRNAs may have a strong correlation with phenotypically similar diseases and vice versa, researchers have developed various effective computational models which combine heterogeneous biologic data sets, including disease similarity networks, miRNA similarity networks, and the known disease-miRNA association network, to identify potential relationships between miRNAs and diseases in biomedical research. Considering the limitations of previous computational studies, we introduced a novel computational method of Ranking-based KNN for miRNA-Disease Association prediction (RKNNMDA) to predict potentially related miRNAs for diseases, and our method obtained an AUC of 0.8221 based on leave-one-out cross validation. In addition, RKNNMDA was applied to 3 kinds of important human cancers for further performance evaluation. The results showed that 96%, 80% and 94% of the predicted top 50 potentially related miRNAs for Colon Neoplasms, Esophageal Neoplasms, and Prostate Neoplasms, respectively, have been confirmed in the experimental literature. Moreover, RKNNMDA could be used to predict potential miRNAs for diseases without any known miRNAs, and it is anticipated that RKNNMDA would be of great use for novel miRNA-disease association identification.
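The reported AUC of 0.8221 under leave-one-out cross validation can be read as the probability that a held-out known miRNA-disease pair is scored above a random unknown pair. That rank-based (Mann-Whitney) form of AUC is straightforward to compute; the scores below are hypothetical, not taken from the paper:

```python
def auc(pos_scores, neg_scores):
    """AUC as the probability that a known association outscores a
    non-association (Mann-Whitney U / (n_pos * n_neg)); ties count half."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

# Hypothetical leave-one-out scores: held-out true pairs vs. candidate pairs.
positives = [0.9, 0.8, 0.75, 0.6]
negatives = [0.7, 0.5, 0.4, 0.3, 0.2]
print(auc(positives, negatives))
```

An AUC of 0.5 corresponds to random ranking and 1.0 to a perfect ranking, which is why cross-validated AUC is the standard yardstick for association-prediction methods like RKNNMDA.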

  4. RKNNMDA: Ranking-based KNN for MiRNA-Disease Association prediction

    PubMed Central

    Chen, Xing; Yan, Gui-Ying

    2017-01-01

Cumulative verified experimental studies have demonstrated that microRNAs (miRNAs) could be closely related with the development and progression of human complex diseases. Based on the assumption that functionally similar miRNAs may have a strong correlation with phenotypically similar diseases and vice versa, researchers have developed various effective computational models which combine heterogeneous biologic data sets, including disease similarity networks, miRNA similarity networks, and the known disease-miRNA association network, to identify potential relationships between miRNAs and diseases in biomedical research. Considering the limitations of previous computational studies, we introduced a novel computational method of Ranking-based KNN for miRNA-Disease Association prediction (RKNNMDA) to predict potentially related miRNAs for diseases, and our method obtained an AUC of 0.8221 based on leave-one-out cross validation. In addition, RKNNMDA was applied to 3 kinds of important human cancers for further performance evaluation. The results showed that 96%, 80% and 94% of the predicted top 50 potentially related miRNAs for Colon Neoplasms, Esophageal Neoplasms, and Prostate Neoplasms, respectively, have been confirmed in the experimental literature. Moreover, RKNNMDA could be used to predict potential miRNAs for diseases without any known miRNAs, and it is anticipated that RKNNMDA would be of great use for novel miRNA-disease association identification. PMID:28421868

  5. Multimodal neuroelectric interface development

    NASA Technical Reports Server (NTRS)

    Trejo, Leonard J.; Wheeler, Kevin R.; Jorgensen, Charles C.; Rosipal, Roman; Clanton, Sam T.; Matthews, Bryan; Hibbs, Andrew D.; Matthews, Robert; Krupka, Michael

    2003-01-01

We are developing electromyographic and electroencephalographic methods, which draw control signals for human-computer interfaces from the human nervous system. We have made progress in four areas: 1) real-time pattern recognition algorithms for decoding sequences of forearm muscle activity associated with control gestures; 2) signal-processing strategies for computer interfaces using electroencephalogram (EEG) signals; 3) a flexible computation framework for neuroelectric interface research; and 4) noncontact sensors, which measure electromyogram or EEG signals without resistive contact to the body.

  6. Technical Services Workstations. SPEC Kit 213.

    ERIC Educational Resources Information Center

    Brugger, Judith M., Comp.; And Others

    Technical services workstations (TSWs) are personal computers that have been customized for use in technical services departments. To gather information on their use and prevalence in research libraries, the Program for Cooperative Cataloging Standing Committee on Automation surveyed the 119 members of the Association of Research Libraries (ARL)…

  7. Improving Family Forest Knowledge Transfer through Social Network Analysis

    ERIC Educational Resources Information Center

    Gorczyca, Erika L.; Lyons, Patrick W.; Leahy, Jessica E.; Johnson, Teresa R.; Straub, Crista L.

    2012-01-01

    To better engage Maine's family forest landowners our study used social network analysis: a computational social science method for identifying stakeholders, evaluating models of engagement, and targeting areas for enhanced partnerships. Interviews with researchers associated with a research center were conducted to identify how social network…

  8. Efficacy of brain-computer interface-driven neuromuscular electrical stimulation for chronic paresis after stroke.

    PubMed

    Mukaino, Masahiko; Ono, Takashi; Shindo, Keiichiro; Fujiwara, Toshiyuki; Ota, Tetsuo; Kimura, Akio; Liu, Meigen; Ushiba, Junichi

    2014-04-01

Brain-computer interface technology is of great interest to researchers as a potential therapeutic measure for people with severe neurological disorders. The aim of this study was to examine the efficacy of a brain-computer interface by comparing conventional neuromuscular electrical stimulation and brain-computer interface-driven neuromuscular electrical stimulation, using an A-B-A-B withdrawal single-subject design. A 38-year-old male with severe hemiplegia due to a putaminal haemorrhage participated in this study. The design involved 2 epochs. In epoch A, the patient attempted to open his fingers during the application of neuromuscular electrical stimulation, irrespective of his actual brain activity. In epoch B, neuromuscular electrical stimulation was applied only when a significant motor-related cortical potential was observed in the electroencephalogram. The subject initially showed diffuse functional magnetic resonance imaging activation and small electroencephalogram responses while attempting finger movement. Epoch A was associated with few neurological or clinical signs of improvement. Epoch B, with a brain-computer interface, was associated with marked lateralization of electroencephalogram (EEG) and blood oxygenation level dependent responses. Voluntary electromyogram (EMG) activity, with significant EEG-EMG coherence, was also prompted. Clinical improvement in upper-extremity function and muscle tone was observed. These results indicate that self-directed training with a brain-computer interface may induce activity-dependent cortical plasticity and promote functional recovery. This preliminary clinical investigation encourages further research using a controlled design.

  9. A Modeling Framework for Optimal Computational Resource Allocation Estimation: Considering the Trade-offs between Physical Resolutions, Uncertainty and Computational Costs

    NASA Astrophysics Data System (ADS)

    Moslehi, M.; de Barros, F.; Rajagopal, R.

    2014-12-01

Hydrogeological models that represent flow and transport in subsurface domains are usually large-scale, computationally complex, and uncertain in their characteristics. Uncertainty quantification for predicting flow and transport in heterogeneous formations often entails a numerical Monte Carlo framework, which repeatedly simulates the model according to a random field representing the hydrogeological characteristics of the field. The physical resolution (e.g. the grid resolution associated with the physical space) for the simulation is customarily chosen based on recommendations in the literature, independent of the number of Monte Carlo realizations. This practice may lead to either excessive computational burden or inaccurate solutions. We propose an optimization-based methodology that considers the trade-off between the following conflicting objectives: the time associated with computational costs, the statistical convergence of the model predictions, and the physical errors corresponding to numerical grid resolution. In this research, we optimally allocate computational resources by developing a model for the overall error based on a joint statistical and numerical analysis, and optimizing that error model subject to a given computational constraint. The derived expression for the overall error explicitly takes into account the joint dependence between the discretization error of the physical space and the statistical error associated with the Monte Carlo realizations. The accuracy of the proposed framework is verified by applying it to several computationally extensive examples. This framework helps hydrogeologists determine the optimum physical and statistical resolutions that minimize the error for a given computational budget. Moreover, the influence of the available computational resources and the geometric properties of the contaminant source zone on the optimum resolutions is investigated. We conclude that the computational cost associated with optimal allocation can be substantially reduced compared with prevalent recommendations in the literature.
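The trade-off this abstract describes can be made concrete with a toy error model: a discretization term that shrinks with grid spacing h and a statistical term that shrinks with the number of Monte Carlo realizations N, minimized under a fixed compute budget. All coefficients, exponents, the cost model, and the budget below are illustrative assumptions, not the authors' calibrated expressions:

```python
def total_error(h, n, c_disc=1.0, p=2, c_stat=1.0):
    """Toy overall-error model: discretization term ~ h^p plus
    Monte Carlo statistical term ~ 1/sqrt(N)."""
    return c_disc * h ** p + c_stat / n ** 0.5

def best_allocation(budget, grids=(0.05, 0.1, 0.2, 0.4), dim=2):
    """Choose grid spacing h and realization count N minimizing the modeled
    error, where one realization on spacing h costs (1/h)^dim cell-updates."""
    best = None
    for h in grids:
        n = int(budget / (1.0 / h) ** dim)  # largest affordable N at this h
        if n < 1:
            continue
        err = total_error(h, n)
        if best is None or err < best[0]:
            best = (err, h, n)
    return best

err, h, n = best_allocation(budget=1_000_000)
print(h, n, round(err, 4))
```

The grid search shows the abstract's point: the finest grid is not optimal, because it leaves too few realizations affordable for the statistical error to converge.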

  10. Computer aided design and manufacturing: analysis and development of research issues

    NASA Astrophysics Data System (ADS)

    Taylor, K.; Jadeja, J. C.

    2005-11-01

The paper focuses on current issues in the areas of computer aided manufacturing and design. The importance of integrating CAD and CAM is analyzed, and the issues associated with that integration, along with recent advancements in the field, are documented. The development of methods for enhancing productivity is explored. A research experiment was conducted in the laboratories of West Virginia University with the objective of showing the effects of various machining parameters on production. Graphical results and their interpretations are supplied to illustrate the main purpose of the experimentation.

  11. Computer vision syndrome: A review.

    PubMed

    Gowrisankaran, Sowjanya; Sheedy, James E

    2015-01-01

Computer vision syndrome (CVS) is a collection of symptoms related to prolonged work at a computer display. This article reviews the current knowledge about the symptoms, related factors and treatment modalities for CVS. Relevant literature on CVS published during the past 65 years was analyzed. Symptoms reported by computer users are classified into internal ocular symptoms (strain and ache), external ocular symptoms (dryness, irritation, burning), visual symptoms (blur, double vision) and musculoskeletal symptoms (neck and shoulder pain). The major factors associated with CVS are environmental (improper lighting, display position and viewing distance) and/or dependent on the user's visual abilities (uncorrected refractive error, oculomotor disorders and tear film abnormalities). Although the factors associated with CVS have been identified, the physiological mechanisms that underlie CVS are not completely understood. Additionally, advances in technology have led to the increased use of hand-held devices, which might impose somewhat different visual challenges compared to desktop displays. Further research is required to better understand the physiological mechanisms underlying CVS and symptoms associated with the use of hand-held and stereoscopic displays.

  12. $10M Gift Supports "Data Recycling" at UCSF.

    PubMed

    2017-10-01

    The University of California, San Francisco's Institute for Computational Health Sciences has received a $10 million gift to support "data recycling" investigations. The approach to medical research involves mining existing data to potentially uncover new uses for existing drugs and help improve clinical care. ©2017 American Association for Cancer Research.

  13. Jetliner Alert Systems

    NASA Technical Reports Server (NTRS)

    1983-01-01

    NASA research and design has significantly improved crew alert systems. The Engine Indication and Crew Alerting System (EICAS), developed by Psycho-Linguistic Research Associates, is technologically advanced and able to order alerts by priority. Ames has also developed computer controlled voice synthesizers for readouts during difficult landing approaches. This is available to airplane manufacturers.

  14. Making a Connection between Computational Modeling and Educational Research.

    ERIC Educational Resources Information Center

    Carbonaro, Michael

    2003-01-01

    Bruner, Goodnow, and Austin's (1956) research on concept development is reexamined from a connectionist perspective. A neural network was constructed which associates positive and negative instances of a concept with corresponding attribute values. Results suggest the simultaneous learning of attributes guided the network in constructing a faster…

  15. RIACS FY2002 Annual Report

    NASA Technical Reports Server (NTRS)

    Leiner, Barry M.; Gross, Anthony R. (Technical Monitor)

    2002-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. Operated by the Universities Space Research Association (a non-profit university consortium), RIACS is located at the NASA Ames Research Center, Moffett Field, California. It currently operates under a multiple year grant/cooperative agreement that began on October 1, 1997 and is up for renewal in September 2003. Ames has been designated NASA's Center of Excellence in Information Technology. In this capacity, Ames is charged with the responsibility to build an Information Technology (IT) Research Program that is preeminent within NASA. RIACS serves as a bridge between NASA Ames and the academic community, and RIACS scientists and visitors work in close collaboration with NASA scientists. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. RIACS research focuses on the three cornerstones of IT research necessary to meet the future challenges of NASA missions: 1) Automated Reasoning for Autonomous Systems; 2) Human-Centered Computing; and 3) High Performance Computing and Networking. In addition, RIACS collaborates with NASA scientists to apply IT research to a variety of NASA application domains including aerospace technology, earth science, life sciences, and astrobiology. RIACS also engages in other activities, such as workshops, seminars, visiting scientist programs and student summer programs, designed to encourage and facilitate collaboration between the university and NASA IT research communities.

  16. Introduction to USRA

    NASA Technical Reports Server (NTRS)

    Davis, M. H. (Editor); Singy, A. (Editor)

    1994-01-01

    The Universities Space Research Association (USRA) was incorporated 25 years ago in the District of Columbia as a private nonprofit corporation under the auspices of the National Academy of Sciences. Institutional membership in the association has grown from 49 colleges and universities, when it was founded, to 76 in 1993. USRA provides a mechanism through which universities can cooperate effectively with one another, with the government, and with other organizations to further space science and technology and to promote education in these areas. Its mission is carried out through the institutes, centers, divisions, and programs that are described in detail in this booklet. These include the Lunar and Planetary Institute, the Institute for Computer Applications in Science and Engineering (ICASE), the Research Institute for Advanced Computer Science (RIACS), and the Center of Excellence in Space Data and Information Sciences (CESDIS).

  17. Situated Computing: The Next Frontier for HCI Research

    DTIC Science & Technology

    2002-01-01

    population works and lives with information. Most individuals interact with information through a single portal: a personal desktop or laptop...of single devices, nor will one person necessarily own each device. This leap of imagination requires that human-computer interaction (HCI...wireless technologies, including Bluetooth [16], IrDA [22] (Infrared Data Association- standards for infrared communications) and HomeRF TM [21

  18. PNNL pushing scientific discovery through data intensive computing breakthroughs

    ScienceCinema

    Deborah Gracio; David Koppenaal; Ruby Leung

    2018-05-18

    The Pacific Northwest National Laboratory's approach to data intensive computing (DIC) is focused on three key research areas: hybrid hardware architectures, software architectures, and analytic algorithms. Advancements in these areas will help to address, and solve, DIC issues associated with capturing, managing, analyzing and understanding, in near real time, data at volumes and rates that push the frontiers of current technologies.

  19. Computer use, language, and literacy in safety net clinic communication.

    PubMed

    Ratanawongsa, Neda; Barton, Jennifer L; Lyles, Courtney R; Wu, Michael; Yelin, Edward H; Martinez, Diana; Schillinger, Dean

    2017-01-01

    Patients with limited health literacy (LHL) and limited English proficiency (LEP) experience suboptimal communication and health outcomes. Electronic health record implementation in safety net clinics may affect communication with LHL and LEP patients. We investigated the associations between safety net clinician computer use and patient-provider communication for patients with LEP and LHL. We video-recorded encounters at 5 academically affiliated US public hospital clinics between English- and Spanish-speaking patients with chronic conditions and their primary and specialty care clinicians. We analyzed changes in communication behaviors (coded with the Roter Interaction Analysis System) with each additional point on a clinician computer use score, controlling for clinician type and visit length and stratified by English proficiency and health literacy status. Greater clinician computer use was associated with more biomedical statements (+12.4, P = .03) and less positive affect (-0.6, P < .01) from LEP/LHL patients. In visits with patients with adequate English proficiency/health literacy, greater clinician computer use was associated with less positive patient affect (-0.9, P < .01), fewer clinician psychosocial statements (-3.5, P < .05), greater clinician verbal dominance (+0.09, P < .01), and lower ratings on quality of care and communication. Higher clinician computer use was associated with more biomedical focus with LEP/LHL patients, and clinician verbal dominance and lower ratings with patients with adequate English proficiency and health literacy. Implementation research should explore interventions to enhance relationship-centered communication for diverse patient populations in the computer era. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association.
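
    The "change per additional point on a clinician computer use score" analysis described above is, at its core, a slope estimate. As a toy sketch only (not the authors' actual RIAS pipeline, and with invented data), a least-squares slope over coded behavior counts could be computed as:

```python
import numpy as np

def per_point_change(computer_use_score, behavior_count):
    """Least-squares slope: the modeled change in a coded communication
    behavior per additional point on a clinician computer-use score."""
    slope, _intercept = np.polyfit(computer_use_score, behavior_count, 1)
    return slope

# Hypothetical coded counts for four visits with scores 0..3:
trend = per_point_change([0, 1, 2, 3], [10, 12, 14, 16])  # slope of 2.0
```

    In the study itself, such coefficients would also be adjusted for clinician type and visit length; this sketch shows only the unadjusted trend.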

  20. Knowledge Discovery as an Aid to Organizational Creativity.

    ERIC Educational Resources Information Center

    Siau, Keng

    2000-01-01

    This article presents the concept of knowledge discovery, a process of searching for associations in large volumes of computer data, as an aid to creativity. It then discusses the various techniques in knowledge discovery. Mednick's associative theory of creative thought serves as the theoretical foundation for this research. (Contains…
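
    The "searching for associations in large volumes of computer data" step this abstract describes is classically implemented as frequent-itemset mining. The following is a minimal, illustrative Apriori-style sketch (the function and example data are invented for illustration; the article itself names no specific algorithm):

```python
def frequent_itemsets(transactions, min_support):
    """Apriori-style search for itemsets whose support (fraction of
    transactions containing them) is at least min_support. Relies on
    the Apriori principle: every subset of a frequent itemset is itself
    frequent, so candidates are built only from surviving itemsets."""
    n = len(transactions)
    transactions = [frozenset(t) for t in transactions]
    items = {i for t in transactions for i in t}
    current = {frozenset([i]) for i in items}  # candidate 1-itemsets
    result = {}
    k = 1
    while current:
        counts = {c: sum(1 for t in transactions if c <= t) for c in current}
        current = {c for c in current if counts[c] / n >= min_support}
        result.update({c: counts[c] / n for c in current})
        k += 1
        # Candidate k-itemsets: unions of surviving (k-1)-itemsets.
        current = {a | b for a in current for b in current if len(a | b) == k}
    return result
```

    On four toy transactions such as {milk, bread}, {milk, eggs}, {milk, bread, eggs}, {bread} with a 0.5 support threshold, the miner keeps the three single items plus the pairs {milk, bread} and {milk, eggs}, and prunes the rest.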

  1. Synergy between Information and Communications Technologies and Educational Action Research and Collaborative Construction of Our Active Identities

    ERIC Educational Resources Information Center

    Davis, Niki; Morrow, Donna

    2010-01-01

    Bridget Somekh's contributions to the debate on the theory and practice of action research and associated methodologies have often been gained through leadership of innovative action and research with computers in education. A review of her work provides evidence of the journey that starts with an appreciation of the wonders of technology before…

  2. Proceedings of Selected Research and Development Presentations at the 1996 National Convention of the Association for Educational Communications and Technology Sponsored by the Research and Theory Division (18th, Indianapolis, IN, 1996).

    ERIC Educational Resources Information Center

    Simonson, Michael R., Ed.; And Others

    1996-01-01

    This proceedings volume contains 77 papers. Subjects addressed include: image processing; new faculty research methods; preinstructional activities for preservice teacher education; computer "window" presentation styles; interface design; stress management instruction; cooperative learning; graphical user interfaces; student attitudes,…

  3. Computation of Discrete Slanted Hole Film Cooling Flow Using the Navier-Stokes Equations.

    DTIC Science & Technology

    1982-07-01

    [Scanned DTIC report; the OCR text of this abstract is unrecoverable. Legible fragments identify Report R82-910002-4, Scientific Research Associates Inc., Glastonbury, CT: computation of discrete slanted hole film cooling flow using the Navier-Stokes equations.]

  4. Computation of Large Turbulence Structures and Noise of Supersonic Jets

    NASA Technical Reports Server (NTRS)

    Tam, Christopher

    1996-01-01

    Our research effort concentrated on obtaining an understanding of the generation mechanisms and the prediction of the three components of supersonic jet noise. In addition, we also developed a computational method for calculating the mean flow of turbulent high-speed jets. Below is a short description of the highlights of our contributions in each of these areas: (a) Broadband shock associated noise, (b) Turbulent mixing noise, (c) Screech tones and impingement tones, (d) Computation of the mean flow of turbulent jets.

  5. The challenge of ubiquitous computing in health care: technology, concepts and solutions. Findings from the IMIA Yearbook of Medical Informatics 2005.

    PubMed

    Bott, O J; Ammenwerth, E; Brigl, B; Knaup, P; Lang, E; Pilgram, R; Pfeifer, B; Ruderich, F; Wolff, A C; Haux, R; Kulikowski, C

    2005-01-01

    To review recent research efforts in the field of ubiquitous computing in health care. To identify current research trends and further challenges for medical informatics. Analysis of the contents of the Yearbook on Medical Informatics 2005 of the International Medical Informatics Association (IMIA). The Yearbook of Medical Informatics 2005 includes 34 original papers selected from 22 peer-reviewed scientific journals related to several distinct research areas: health and clinical management, patient records, health information systems, medical signal processing and biomedical imaging, decision support, knowledge representation and management, education and consumer informatics as well as bioinformatics. A special section on ubiquitous health care systems is devoted to recent developments in the application of ubiquitous computing in health care. Besides additional synoptical reviews of each of the sections the Yearbook includes invited reviews concerning E-Health strategies, primary care informatics and wearable healthcare. Several publications demonstrate the potential of ubiquitous computing to enhance effectiveness of health services delivery and organization. But ubiquitous computing is also a societal challenge, caused by the surrounding but unobtrusive character of this technology. Contributions from nearly all of the established sub-disciplines of medical informatics are demanded to turn the visions of this promising new research field into reality.

  6. From chalkboard, slides, and paper to e-learning: How computing technologies have transformed anatomical sciences education.

    PubMed

    Trelease, Robert B

    2016-11-01

    Until the late-twentieth century, primary anatomical sciences education was relatively unenhanced by advanced technology and dependent on the mainstays of printed textbooks, chalkboard- and photographic projection-based classroom lectures, and cadaver dissection laboratories. But over the past three decades, diffusion of innovations in computer technology transformed the practices of anatomical education and research, along with other aspects of work and daily life. Increasing adoption of first-generation personal computers (PCs) in the 1980s paved the way for the first practical educational applications, and visionary anatomists foresaw the usefulness of computers for teaching. While early computers lacked high-resolution graphics capabilities and interactive user interfaces, applications with video discs demonstrated the practicality of programming digital multimedia linking descriptive text with anatomical imaging. Desktop publishing established that computers could be used for producing enhanced lecture notes, and commercial presentation software made it possible to give lectures using anatomical and medical imaging, as well as animations. Concurrently, computer processing supported the deployment of medical imaging modalities, including computed tomography, magnetic resonance imaging, and ultrasound, that were subsequently integrated into anatomy instruction. Following its public birth in the mid-1990s, the World Wide Web became the ubiquitous multimedia networking technology underlying the conduct of contemporary education and research. Digital video, structural simulations, and mobile devices have been more recently applied to education. Progressive implementation of computer-based learning methods interacted with waves of ongoing curricular change, and such technologies have been deemed crucial for continuing medical education reforms, providing new challenges and opportunities for anatomical sciences educators. Anat Sci Educ 9: 583-602. 
© 2016 American Association of Anatomists.

  7. Prevalence of and factors associated with musculoskeletal symptoms in the spine attributed to computer use in undergraduate students.

    PubMed

    Kanchanomai, Siriluck; Janwantanakul, Prawit; Pensri, Praneet; Jiamjarasrangsi, Wiroj

    2012-01-01

    This study aimed to investigate the 3-month prevalence of musculoskeletal symptoms at the spine attributed to computer use and to identify biopsychosocial factors associated with the prevalence in undergraduate students. Participants were undergraduate students at a public university in Thailand. A cross-sectional survey was conducted with a self-administered questionnaire delivered to 3,545 students. A total of 2,511 students (73.7%) returned the questionnaires. Cervical symptoms (22.3%) were the most frequently reported, followed by thoracic (11%) and lumbar symptoms (10.7%). Female sex, daily computer use of more than three hours, and a keyboard positioned too high were significantly associated with a high prevalence of cervical symptoms. Higher undergraduate year of study and a keyboard positioned too high were significantly associated with a high prevalence of thoracic symptoms. Higher undergraduate year of study and daily computer use of more than three hours were significantly related to a high prevalence of lumbar symptoms. Better-than-normal mental health status was associated with a low prevalence of lumbar symptoms. Spinal symptoms are common among undergraduate students. Various factors were identified to be associated with high prevalence of spinal symptoms. Further research investigating the causal relation between these factors and musculoskeletal symptoms should be conducted.

  8. Pacific Research Platform - Creation of a West Coast Big Data Freeway System Applied to the CONNected objECT (CONNECT) Data Mining Framework for Earth Science Knowledge Discovery

    NASA Astrophysics Data System (ADS)

    Sellars, S. L.; Nguyen, P.; Tatar, J.; Graham, J.; Kawsenuk, B.; DeFanti, T.; Smarr, L.; Sorooshian, S.; Ralph, M.

    2017-12-01

    A new era in computational earth sciences is within our grasps with the availability of ever-increasing earth observational data, enhanced computational capabilities, and innovative computation approaches that allow for the assimilation, analysis and ability to model the complex earth science phenomena. The Pacific Research Platform (PRP), CENIC and associated technologies such as the Flash I/O Network Appliance (FIONA) provide scientists a unique capability for advancing towards this new era. This presentation reports on the development of multi-institutional rapid data access capabilities and data pipeline for applying a novel image characterization and segmentation approach, CONNected objECT (CONNECT) algorithm to study Atmospheric River (AR) events impacting the Western United States. ARs are often associated with torrential rains, swollen rivers, flash flooding, and mudslides. CONNECT is computationally intensive, reliant on very large data transfers, storage and data mining techniques. The ability to apply the method to multiple variables and datasets located at different University of California campuses has previously been challenged by inadequate network bandwidth and computational constraints. The presentation will highlight how the inter-campus CONNECT data mining framework improved from our prior download speeds of 10MB/s to 500MB/s using the PRP and the FIONAs. We present a worked example using the NASA MERRA data to describe how the PRP and FIONA have provided researchers with the capability for advancing knowledge about ARs. Finally, we will discuss future efforts to expand the scope to additional variables in earth sciences.

  9. Galaxy CloudMan: delivering cloud compute clusters.

    PubMed

    Afgan, Enis; Baker, Dannon; Coraor, Nate; Chapman, Brad; Nekrutenko, Anton; Taylor, James

    2010-12-21

    Widespread adoption of high-throughput sequencing has greatly increased the scale and sophistication of computational infrastructure needed to perform genomic research. An alternative to building and maintaining local infrastructure is "cloud computing", which, in principle, offers on demand access to flexible computational infrastructure. However, cloud computing resources are not yet suitable for immediate "as is" use by experimental biologists. We present a cloud resource management system that makes it possible for individual researchers to compose and control an arbitrarily sized compute cluster on Amazon's EC2 cloud infrastructure without any informatics requirements. Within this system, an entire suite of biological tools packaged by the NERC Bio-Linux team (http://nebc.nerc.ac.uk/tools/bio-linux) is available for immediate consumption. The provided solution makes it possible, using only a web browser, to create a completely configured compute cluster ready to perform analysis in less than five minutes. Moreover, we provide an automated method for building custom deployments of cloud resources. This approach promotes reproducibility of results and, if desired, allows individuals and labs to add or customize an otherwise available cloud system to better meet their needs. The expected knowledge and associated effort with deploying a compute cluster in the Amazon EC2 cloud is not trivial. The solution presented in this paper eliminates these barriers, making it possible for researchers to deploy exactly the amount of computing power they need, combined with a wealth of existing analysis software, to handle the ongoing data deluge.

  10. Enabling the First Ever Measurement of Coherent Neutrino Scattering Through Background Neutron Measurements.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reyna, David; Betty, Rita

    Using High Performance Computing to Examine the Processes of Neurogenesis Underlying Pattern Separation/Completion of Episodic Information - Sandia researchers developed novel methods and metrics for studying the computational function of neurogenesis, thus generating substantial impact in the neuroscience and neural computing communities. This work could benefit applications in machine learning and other analysis activities. The purpose of this project was to computationally model the impact of neural population dynamics within the neurobiological memory system in order to examine how subareas in the brain enable pattern separation and completion of information in memory across time as associated experiences.

  11. How should a speech recognizer work?

    PubMed

    Scharenborg, Odette; Norris, Dennis; Bosch, Louis; McQueen, James M

    2005-11-12

    Although researchers studying human speech recognition (HSR) and automatic speech recognition (ASR) share a common interest in how information processing systems (human or machine) recognize spoken language, there is little communication between the two disciplines. We suggest that this lack of communication follows largely from the fact that research in these related fields has focused on the mechanics of how speech can be recognized. In Marr's (1982) terms, emphasis has been on the algorithmic and implementational levels rather than on the computational level. In this article, we provide a computational-level analysis of the task of speech recognition, which reveals the close parallels between research concerned with HSR and ASR. We illustrate this relation by presenting a new computational model of human spoken-word recognition, built using techniques from the field of ASR that, in contrast to current existing models of HSR, recognizes words from real speech input. 2005 Lawrence Erlbaum Associates, Inc.
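
    A computational-level analysis in Marr's sense asks what mapping is computed, not how. As a toy, hedged illustration only (not the authors' model, which operates on real speech), word recognition can be framed as finding the lexicon entry closest to the input evidence, here with edit distance over phone strings:

```python
def edit_distance(a, b):
    """Classic dynamic-programming Levenshtein distance between two sequences."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,          # deletion
                           cur[j - 1] + 1,       # insertion
                           prev[j - 1] + (ca != cb)))  # substitution
        prev = cur
    return prev[-1]

def recognize(phones, lexicon):
    """Return the lexicon word whose phone string best matches the input."""
    return min(lexicon, key=lambda w: edit_distance(phones, lexicon[w]))

# A hypothetical three-word phone lexicon:
lexicon = {"cat": "k ae t".split(), "cap": "k ae p".split(), "bat": "b ae t".split()}
```

    For example, the noisy input "b ae d" is closest to "bat". Real models of human and automatic speech recognition replace this distance with probabilistic evidence accumulated over continuous acoustic input.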

  12. Marshal Wrubel and the Electronic Computer as an Astronomical Instrument

    NASA Astrophysics Data System (ADS)

    Mutschlecner, J. P.; Olsen, K. H.

    1998-05-01

    In 1960, Marshal H. Wrubel, professor of astrophysics at Indiana University, published an influential review paper under the title, "The Electronic Computer as an Astronomical Instrument." This essay pointed out the enormous potential of the electronic computer as an instrument of observational and theoretical research in astronomy, illustrated programming concepts, and made specific recommendations for the increased use of computers in astronomy. He noted that, with a few scattered exceptions, computer use by the astronomical community had heretofore been "timid and sporadic." This situation was to improve dramatically in the next few years. By the late 1950s, general-purpose, high-speed, "mainframe" computers were just emerging from the experimental, developmental stage, but few were affordable by or available to academic and research institutions not closely associated with large industrial or national defense programs. Yet by 1960 Wrubel had spent a decade actively pioneering and promoting the imaginative application of electronic computation within the astronomical community. Astronomy upper-level undergraduate and graduate students at Indiana were introduced to computing, and Ph.D. candidates who he supervised applied computer techniques to problems in theoretical astrophysics. He wrote an early textbook on programming, taught programming classes, and helped establish and direct the Research Computing Center at Indiana, later named the Wrubel Computing Center in his honor. He and his students created a variety of algorithms and subroutines and exchanged these throughout the astronomical community by distributing the Astronomical Computation News Letter. Nationally as well as internationally, Wrubel actively cooperated with other groups interested in computing applications for theoretical astrophysics, often through his position as secretary of the IAU commission on Stellar Constitution.

  13. CloudMan as a platform for tool, data, and analysis distribution

    PubMed Central

    2012-01-01

    Background Cloud computing provides an infrastructure that facilitates large scale computational analysis in a scalable, democratized fashion. However, in this context it is difficult to ensure sharing of an analysis environment and associated data in a scalable and precisely reproducible way. Results CloudMan (usecloudman.org) enables individual researchers to easily deploy, customize, and share their entire cloud analysis environment, including data, tools, and configurations. Conclusions With the enabled customization and sharing of instances, CloudMan can be used as a platform for collaboration. The presented solution improves accessibility of cloud resources, tools, and data to the level of an individual researcher and contributes toward reproducibility and transparency of research solutions. PMID:23181507

  14. Facilitating higher-fidelity simulations of axial compressor instability and other turbomachinery flow conditions

    NASA Astrophysics Data System (ADS)

    Herrick, Gregory Paul

    The quest to accurately capture flow phenomena with length-scales both short and long and to accurately represent complex flow phenomena within disparately sized geometry inspires a need for an efficient, high-fidelity, multi-block structured computational fluid dynamics (CFD) parallel computational scheme. This research presents and demonstrates a more efficient computational method by which to perform multi-block structured CFD parallel computational simulations, thus facilitating higher-fidelity solutions of complicated geometries (due to the inclusion of grids for "small" flow areas which are often merely modeled) and their associated flows. This computational framework offers greater flexibility and user-control in allocating the resource balance between process count and wall-clock computation time. The principal modifications implemented in this revision consist of a "multiple grid block per processing core" software infrastructure and an analytic computation of viscous flux Jacobians. The development of this scheme is largely motivated by the desire to simulate axial compressor stall inception with more complete gridding of the flow passages (including rotor tip clearance regions) than has been previously done while maintaining high computational efficiency (i.e., minimal consumption of computational resources), and thus this paradigm shall be demonstrated with an examination of instability in a transonic axial compressor. However, the paradigm presented herein facilitates CFD simulation of myriad previously impractical geometries and flows and is not limited to detailed analyses of axial compressor flows.
While the simulations presented herein were technically possible under the previous structure of the subject software, they were much less computationally efficient and thus not pragmatically feasible; the previous research using this software to perform three-dimensional, full-annulus, time-accurate, unsteady, full-stage (with sliding-interface) simulations of rotating stall inception in axial compressors utilized tip clearance periodic models, while the scheme here is demonstrated by a simulation of axial compressor stall inception utilizing gridded rotor tip clearance regions. As will be discussed, much previous research (experimental, theoretical, and computational) has suggested that understanding clearance flow behavior is critical to understanding stall inception, and previous computational research efforts which have used tip clearance models have begged the question, "What about the clearance flows?". This research begins to address that question.
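
    The resource balance described above, several grid blocks mapped onto each processing core, is at heart a load-balancing (multiprocessor scheduling) problem. The sketch below shows one standard heuristic, longest-processing-time-first; it is illustrative only, not the dissertation's actual scheduler, and the block sizes are invented:

```python
import heapq

def assign_blocks(block_cells, n_cores):
    """Longest-processing-time-first heuristic: sort grid blocks by cell
    count (a proxy for work) in descending order, then repeatedly hand
    the next block to the currently least-loaded core."""
    heap = [(0, core, []) for core in range(n_cores)]  # (load, core id, blocks)
    heapq.heapify(heap)
    for b in sorted(range(len(block_cells)), key=lambda b: -block_cells[b]):
        load, core, blocks = heapq.heappop(heap)  # least-loaded core
        blocks.append(b)
        heapq.heappush(heap, (load + block_cells[b], core, blocks))
    heap.sort(key=lambda entry: entry[1])  # order results by core id
    return [blocks for _, _, blocks in heap], [load for load, _, _ in heap]
```

    For five blocks of sizes 8, 7, 6, 5, 4 on two cores, the heuristic yields per-core loads of 17 and 13; not optimal (a 15/15 split exists) but close, which is typical of this greedy rule.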

  15. The Grad Cohort Workshop: Evaluating an Intervention to Retain Women Graduate Students in Computing

    PubMed Central

    Stout, Jane G.; Tamer, Burçin; Wright, Heather M.; Clarke, Lori A.; Dwarkadas, Sandhya; Howard, Ayanna M.

    2017-01-01

    Women engaged in computing career tracks are vastly outnumbered by men and often must contend with negative stereotypes about their innate technical aptitude. Research suggests women's marginalized presence in computing may result in women psychologically disengaging, and ultimately dropping out, perpetuating women's underrepresentation in computing. To combat this vicious cycle, the Computing Research Association's Committee on the Status of Women in Computing Research (CRA-W) runs a multi-day mentorship workshop for women graduate students called Grad Cohort, which consists of a speaker series and networking opportunities. We studied the long-term impact of Grad Cohort on women Ph.D. students' (a) dedication to becoming well-known in one's field, and giving back to the community (professional goals), (b) the degree to which one feels computing is an important element of “who they are” (computing identity), and (c) beliefs that computing skills are innate (entity beliefs). Of note, entity beliefs are known to be demoralizing and can lead to disengagement from academic endeavors. We compared a propensity score matched sample of women and men Ph.D. students in computing programs who had never participated in Grad Cohort to a sample of past Grad Cohort participants. Grad Cohort participants reported interest in becoming well-known in their field to a greater degree than women non-participants, and to an equivalent degree as men. Also, Grad Cohort participants reported stronger interest in giving back to the community than their peers. Further, whereas women non-participants identified with computing to a lesser degree than men and held stronger entity beliefs than men, Grad Cohort participants' computing identity and entity beliefs were equivalent to men. Importantly, stronger entity beliefs predicted a weaker computing identity among students, with the exception of Grad Cohort participants. 
This latter finding suggests Grad Cohort may shield students' computing identity from the damaging nature of entity beliefs. Together, these findings suggest Grad Cohort may fortify women's commitment to pursuing computing research careers and move the needle toward greater gender diversity in computing. PMID:28119657

  16. The Grad Cohort Workshop: Evaluating an Intervention to Retain Women Graduate Students in Computing.

    PubMed

    Stout, Jane G; Tamer, Burçin; Wright, Heather M; Clarke, Lori A; Dwarkadas, Sandhya; Howard, Ayanna M

    2016-01-01

    Women engaged in computing career tracks are vastly outnumbered by men and often must contend with negative stereotypes about their innate technical aptitude. Research suggests women's marginalized presence in computing may result in women psychologically disengaging, and ultimately dropping out, perpetuating women's underrepresentation in computing. To combat this vicious cycle, the Computing Research Association's Committee on the Status of Women in Computing Research (CRA-W) runs a multi-day mentorship workshop for women graduate students called Grad Cohort, which consists of a speaker series and networking opportunities. We studied the long-term impact of Grad Cohort on women Ph.D. students' (a) dedication to becoming well-known in one's field, and giving back to the community (professional goals), (b) the degree to which one feels computing is an important element of "who they are" (computing identity), and (c) beliefs that computing skills are innate (entity beliefs). Of note, entity beliefs are known to be demoralizing and can lead to disengagement from academic endeavors. We compared a propensity score matched sample of women and men Ph.D. students in computing programs who had never participated in Grad Cohort to a sample of past Grad Cohort participants. Grad Cohort participants reported interest in becoming well-known in their field to a greater degree than women non-participants, and to an equivalent degree as men. Also, Grad Cohort participants reported stronger interest in giving back to the community than their peers. Further, whereas women non-participants identified with computing to a lesser degree than men and held stronger entity beliefs than men, Grad Cohort participants' computing identity and entity beliefs were equivalent to men. Importantly, stronger entity beliefs predicted a weaker computing identity among students, with the exception of Grad Cohort participants. 
This latter finding suggests Grad Cohort may shield students' computing identity from the damaging nature of entity beliefs. Together, these findings suggest Grad Cohort may fortify women's commitment to pursuing computing research careers and move the needle toward greater gender diversity in computing.
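
    The "propensity score matched sample" in this study design can be sketched as follows. This is a toy illustration only: a plain gradient-descent logistic fit plus greedy nearest-neighbor matching with replacement, not the authors' actual procedure, and all data are invented:

```python
import numpy as np

def propensity_scores(X, treated, lr=0.1, steps=2000):
    """Fit P(treated | X) with a simple gradient-descent logistic
    regression and return each unit's fitted propensity score."""
    Xd = np.column_stack([np.ones(len(X)), X])  # add an intercept column
    w = np.zeros(Xd.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-Xd @ w))
        w += lr * Xd.T @ (treated - p) / len(Xd)
    return 1.0 / (1.0 + np.exp(-Xd @ w))

def nearest_neighbor_match(scores, treated):
    """Greedy 1:1 matching with replacement: pair each treated unit
    with the control whose propensity score is closest."""
    t_idx = np.flatnonzero(treated == 1)
    c_idx = np.flatnonzero(treated == 0)
    return [(int(i), int(c_idx[np.argmin(np.abs(scores[c_idx] - scores[i]))]))
            for i in t_idx]

# Toy covariate that strongly predicts treatment:
X = np.array([[0.0], [1.0], [2.0], [3.0], [4.0], [5.0]])
treated = np.array([0, 0, 0, 1, 1, 1])
scores = propensity_scores(X, treated)
pairs = nearest_neighbor_match(scores, treated)
```

    Production analyses would instead use an established library and check covariate balance after matching; the point here is only the two-stage shape of the method.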

  17. The ISCB Student Council Internship Program: Expanding computational biology capacity worldwide.

    PubMed

    Anupama, Jigisha; Francescatto, Margherita; Rahman, Farzana; Fatima, Nazeefa; DeBlasio, Dan; Shanmugam, Avinash Kumar; Satagopam, Venkata; Santos, Alberto; Kolekar, Pandurang; Michaut, Magali; Guney, Emre

    2018-01-01

    Education and training are two essential ingredients for a successful career. On one hand, universities provide students a curriculum for specializing in one's field of study, and on the other, internships complement coursework and provide invaluable training experience for a fruitful career. Consequently, undergraduates and graduates are encouraged to undertake an internship during the course of their degree. The opportunity to explore one's research interests in the early stages of their education is important for students because it improves their skill set and gives their career a boost. In the long term, this helps to close the gap between skills and employability among students across the globe and balance the research capacity in the field of computational biology. However, training opportunities are often scarce for computational biology students, particularly for those who reside in less-privileged regions. Aimed at helping students develop research and academic skills in computational biology and alleviating the divide across countries, the Student Council of the International Society for Computational Biology introduced its Internship Program in 2009. The Internship Program is committed to providing access to computational biology training, especially for students from developing regions, and improving competencies in the field. Here, we present how the Internship Program works and the impact of the internship opportunities so far, along with the challenges associated with this program.

  18. Media Use and Health Outcomes in Adolescents: Findings from a Nationally Representative Survey

    PubMed Central

    Casiano, Hygiea; Kinley, D. Jolene; Katz, Laurence Y.; Chartier, Mariette J.; Sareen, Jitender

    2012-01-01

    Objective: Examine the association between quantity of media use and health outcomes in adolescents. Method: Multiple logistic regression analyses were conducted with the Canadian Community Health Survey 1.1 (youth aged 12–19 (n=9137)) to determine the association between hours of use of television/videos, video games, and computers/Internet, and health outcomes including depression, alcohol dependence, binge drinking, suicidal ideation, help-seeking behaviour, risky sexual activity, and obesity. Results: Obesity was associated with frequent television/video use (Adjusted Odds Ratio (AOR) 1.10). Depression and risky sexual behaviour were less likely in frequent video game users (AOR 0.87 and 0.73). Binge drinking was less likely in frequent users of video games (AOR 0.92) and computers/Internet (AOR 0.90). Alcohol dependence was less likely in frequent computer/Internet users (AOR 0.89). Conclusions: Most health outcomes, except for obesity, were not associated with using media in youth. Further research into the appropriate role of media will help harness its full potential. PMID:23133464
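
    The adjusted odds ratios (AORs) reported above come from exponentiating logistic-regression coefficients. A small, hedged helper (generic statistics, not the paper's code; the coefficient and standard error below are invented for illustration):

```python
import math

def adjusted_odds_ratio(beta, se):
    """A logistic-regression coefficient is the change in log-odds per
    unit of the predictor; exp(beta) is the (adjusted) odds ratio, and
    exp(beta +/- 1.96*se) gives a 95% Wald confidence interval."""
    return math.exp(beta), (math.exp(beta - 1.96 * se), math.exp(beta + 1.96 * se))

# A coefficient of log(1.10) corresponds to an AOR of 1.10 (cf. the
# television/obesity association above); the standard error is made up:
aor, (lo, hi) = adjusted_odds_ratio(math.log(1.10), 0.02)
```

    An interval that excludes 1.0, as here, indicates a statistically significant association at the 5% level.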

  19. Expressing Youth Voice through Video Games and Coding

    ERIC Educational Resources Information Center

    Martin, Crystle

    2017-01-01

    A growing body of research focuses on the impact of video games and coding on learning. The research often elevates learning the technical skills associated with video games and coding or the importance of problem solving and computational thinking, which are, of course, necessary and relevant. However, the literature less often explores how young…

  20. A survey of GPU-based medical image computing techniques

    PubMed Central

    Shi, Lin; Liu, Wen; Zhang, Heye; Xie, Yongming

    2012-01-01

    Medical imaging currently plays a crucial role throughout clinical practice, from medical scientific research to diagnostics and treatment planning. However, medical imaging procedures are often computationally demanding because of the large three-dimensional (3D) medical datasets that must be processed in practical clinical applications. With the rapidly improving performance of graphics processors, better programming support, and an excellent price-to-performance ratio, the graphics processing unit (GPU) has emerged as a competitive parallel computing platform for computationally expensive and demanding tasks in a wide range of medical imaging applications. The major purpose of this survey is to provide a comprehensive reference for newcomers and researchers involved in GPU-based medical image processing. Within this survey, the continuous advancement of GPU computing is reviewed and the existing traditional applications in three areas of medical image processing, namely segmentation, registration, and visualization, are surveyed. The potential advantages and associated challenges of current GPU-based medical imaging are also discussed to inspire future applications in medicine. PMID:23256080

  1. Scheduling based on a dynamic resource connection

    NASA Astrophysics Data System (ADS)

    Nagiyev, A. E.; Botygin, I. A.; Shersntneva, A. I.; Konyaev, P. A.

    2017-02-01

    The practical use of distributed computing systems is associated with many problems, including organizing effective interaction among the agents located at the nodes of the system, configuring each node of the system to perform a specific task, distributing the system's available information and computational resources effectively, and controlling the multithreading that implements the logic of the research problems being solved. The article describes a method of computational load balancing in distributed automated systems oriented toward multi-agent and multi-threaded data processing. A scheme for controlling the processing of requests from terminal devices is proposed that provides effective dynamic scaling of computing power under peak load. The results of model experiments on the developed load-scheduling algorithm are set out. These results show that the algorithm remains effective even with a significant increase in the number of connected nodes and in the scale of the distributed computing system's architecture.
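The idea of dynamically connecting computing power under peak load can be sketched with a toy scheduler that routes each request to the least-loaded node and attaches a fresh node once every existing one is saturated. This illustrates only the general concept, not the algorithm developed in the article; node names and capacities are invented:

```python
from collections import defaultdict

class LoadBalancer:
    """Toy scheduler: route requests to the least-loaded node and
    dynamically connect new nodes when existing ones hit capacity."""

    def __init__(self, capacity_per_node=10):
        self.capacity = capacity_per_node
        self.load = defaultdict(int)   # node -> active requests
        self.nodes = []

    def connect(self, node):
        self.nodes.append(node)

    def dispatch(self, request_id):
        # Pick the node with the fewest active requests.
        node = min(self.nodes, key=self.load.__getitem__)
        if self.load[node] >= self.capacity:
            # Peak load: attach a fresh node before dispatching.
            node = f"node-{len(self.nodes)}"
            self.connect(node)
        self.load[node] += 1
        return node

    def complete(self, node):
        self.load[node] -= 1

lb = LoadBalancer(capacity_per_node=2)
lb.connect("node-0")
placements = [lb.dispatch(i) for i in range(5)]
print(placements)  # → ['node-0', 'node-0', 'node-1', 'node-1', 'node-2']
```

With a per-node capacity of 2, the five requests spill onto dynamically attached nodes, which is the scaling behavior the abstract describes.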

  2. ceRNAs in plants: computational approaches and associated challenges for target mimic research.

    PubMed

    Paschoal, Alexandre Rossi; Lozada-Chávez, Irma; Domingues, Douglas Silva; Stadler, Peter F

    2017-05-30

    The competing endogenous RNA hypothesis has gained increasing attention as a potential global regulatory mechanism of microRNAs (miRNAs), and as a powerful tool to predict the function of many noncoding RNAs, including miRNAs themselves. Most studies have focused on animals, although target mimic (TM) discovery, together with important computational and experimental advances, has developed in plants over the past decade. Thus, our contribution summarizes recent progress in computational approaches for the research of miRNA:TM interactions. We divided this article into three main contributions. First, a general overview of research on TMs in plants is presented with practical descriptions of the available literature, tools, data, databases and computational reports. Second, we describe a common protocol for the computational and experimental analyses of TMs. Third, we provide a bioinformatics approach for the prediction of TM motifs potentially cross-targeting members within the same miRNA family or across different families, based on the identification of consensus miRNA-binding sites from known TMs across sequenced genomes, transcriptomes and known miRNAs. This computational approach is promising because, in contrast to animals, miRNA families in plants are large, with identical or similar members, several of which are also highly conserved. Of the three consensus TM motifs found with our approach, MIM166, MIM171 and MIM159/319, the last has found strong support in the recent experimental work by Reichel and Millar [Specificity of plant microRNA TMs: cross-targeting of mir159 and mir319. J Plant Physiol 2015;180:45-8]. Finally, we discuss the major computational and associated experimental challenges that will have to be faced in future ceRNA studies. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
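The core computational step behind TM prediction, scanning transcripts for near-complementary miRNA-binding sites, can be sketched as follows. This is a deliberately crude stand-in for real tools: the sequences and mismatch threshold are invented, and actual TM predictors additionally require the characteristic central bulge and score binding free energy.

```python
# Watson-Crick complements for RNA bases
COMP = {"A": "U", "U": "A", "G": "C", "C": "G"}

def revcomp(rna):
    return "".join(COMP[b] for b in reversed(rna))

def find_tm_sites(transcript, mirna, max_mismatches=3):
    """Return (position, mismatches) where the transcript is
    near-complementary to the miRNA."""
    site = revcomp(mirna)           # perfect-match binding site
    k = len(site)
    hits = []
    for i in range(len(transcript) - k + 1):
        window = transcript[i:i + k]
        mism = sum(a != b for a, b in zip(window, site))
        if mism <= max_mismatches:
            hits.append((i, mism))
    return hits

mir = "UGGAGCUCCCUUCAUUCCAAU"        # hypothetical miRNA sequence
tx = "AAGG" + revcomp(mir) + "CCUUA" # toy transcript embedding one site
print(find_tm_sites(tx, mir))        # → [(4, 0)]
```

A genome-scale version of this scan, run across transcriptomes for many miRNA family members at once, is the kind of consensus-site search the third contribution describes.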

  3. Computational Electromagnetics Application to Small Geometric Anomalies and Associated Uncertainty Evaluation

    DTIC Science & Technology

    2010-02-28

    implemented a fast method to enable the statistical characterization of electromagnetic interference and compatibility (EMI/EMC) phenomena on electrically...higher accuracy is needed, e.g., to compute higher-moment statistics. To address this problem, we have developed adaptive stochastic collocation methods... (Sponsoring agency: AF Office of Scientific Research, Arlington, VA.)

  4. Development of Reduced-Order Models for Aeroelastic and Flutter Prediction Using the CFL3Dv6.0 Code

    NASA Technical Reports Server (NTRS)

    Silva, Walter A.; Bartels, Robert E.

    2002-01-01

    A reduced-order model (ROM) is developed for aeroelastic analysis using the CFL3D version 6.0 computational fluid dynamics (CFD) code, recently developed at the NASA Langley Research Center. This latest version of the flow solver includes a deforming mesh capability, a modal structural definition for nonlinear aeroelastic analyses, and a parallelization capability that provides a significant increase in computational efficiency. Flutter results for the AGARD 445.6 Wing computed using CFL3D v6.0 are presented, including discussion of associated computational costs. Modal impulse responses of the unsteady aerodynamic system are then computed using the CFL3Dv6 code and transformed into state-space form. Important numerical issues associated with the computation of the impulse responses are presented. The unsteady aerodynamic state-space ROM is then combined with a state-space model of the structure to create an aeroelastic simulation using the MATLAB/SIMULINK environment. The MATLAB/SIMULINK ROM is used to rapidly compute aeroelastic transients including flutter. The ROM shows excellent agreement with the aeroelastic analyses computed using the CFL3Dv6.0 code directly.
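One standard way to transform impulse responses into state-space form is the Eigensystem Realization Algorithm (ERA); the sketch below identifies a low-order discrete-time model from scalar Markov parameters and checks that it reproduces them. It is illustrative only: a hypothetical 2-state system stands in for the CFD-derived aerodynamic responses, and this is not the paper's CFL3Dv6.0/SIMULINK workflow.

```python
import numpy as np

# True (hidden) discrete-time system, used only to generate impulse-response data
A = np.array([[0.9, 0.2], [-0.2, 0.7]])
B = np.array([[1.0], [0.0]])
C = np.array([[1.0, 1.0]])

# Markov parameters h_k = C A^(k-1) B, k = 1..N (the "impulse responses")
N = 40
h = [(C @ np.linalg.matrix_power(A, k - 1) @ B).item() for k in range(1, N + 1)]

def era(h, r, m=15):
    """Eigensystem Realization Algorithm: identify an order-r (A, B, C)
    from scalar Markov parameters h[0], h[1], ..."""
    H0 = np.array([[h[i + j] for j in range(m)] for i in range(m)])      # Hankel
    H1 = np.array([[h[i + j + 1] for j in range(m)] for i in range(m)])  # shifted
    U, s, Vt = np.linalg.svd(H0)
    Ur, Vr = U[:, :r], Vt[:r, :]
    Sr = np.diag(np.sqrt(s[:r]))
    Sinv = np.diag(1 / np.sqrt(s[:r]))
    Ar = Sinv @ Ur.T @ H1 @ Vr.T @ Sinv
    Br = (Sr @ Vr)[:, :1]
    Cr = (Ur @ Sr)[:1, :]
    return Ar, Br, Cr

Ar, Br, Cr = era(h, r=2)
h_rom = [(Cr @ np.linalg.matrix_power(Ar, k) @ Br).item() for k in range(N)]
print(max(abs(a - b) for a, b in zip(h, h_rom)))  # ≈ 0 (exact realization)
```

Because the data here are exactly rank 2, the realization reproduces the impulse response to machine precision; with noisy CFD-derived responses, the singular-value truncation chooses the ROM order.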

  5. ACM Conference on Research and Development in Information Retrieval: Proceedings of the Annual Conference of the Association for Computing Machinery (Pisa, Italy, September 8-10, 1986).

    ERIC Educational Resources Information Center

    Rabitti, Fausto, Ed.

    Intended to identify and encourage research, development, and applications of information retrieval, the principal objective of this conference was to provide an international forum to promote an understanding of current research and to stimulate the exchange of ideas and experiences in information retrieval systems. Introductory material for…

  6. AstroML: Python-powered Machine Learning for Astronomy

    NASA Astrophysics Data System (ADS)

    Vander Plas, Jake; Connolly, A. J.; Ivezic, Z.

    2014-01-01

    As astronomical data sets grow in size and complexity, automated machine learning and data mining methods are becoming an increasingly fundamental component of research in the field. The astroML project (http://astroML.org) provides a common repository for practical examples of the data mining and machine learning tools used and developed by astronomical researchers, written in Python. The astroML module contains a host of general-purpose data analysis and machine learning routines, loaders for openly-available astronomical datasets, and fast implementations of specific computational methods often used in astronomy and astrophysics. The associated website features hundreds of examples of these routines being used for analysis of real astronomical datasets, while the associated textbook provides a curriculum resource for graduate-level courses focusing on practical statistics, machine learning, and data mining approaches within Astronomical research. This poster will highlight several of the more powerful and unique examples of analysis performed with astroML, all of which can be reproduced in their entirety on any computer with the proper packages installed.

  7. Associations between parental rules, style of communication and children's screen time.

    PubMed

    Bjelland, Mona; Soenens, Bart; Bere, Elling; Kovács, Éva; Lien, Nanna; Maes, Lea; Manios, Yannis; Moschonis, George; te Velde, Saskia J

    2015-10-01

    Research suggests an inverse association between parental rules and screen time in pre-adolescents, and that parents' style of communication with their children is related to the children's time spent watching TV. The aims of this study were to examine associations of parental rules and parental style of communication with children's screen time and perceived excessive screen time in five European countries. UP4FUN was a multi-centre, cluster randomised controlled trial with pre- and post-test measurements in each of five countries: Belgium, Germany, Greece, Hungary and Norway. Questionnaires were completed by the children at school and the parent questionnaire was brought home. Three structural equation models were tested based on measures of screen time and parental style of communication from the pre-test questionnaires. Of the 152 schools invited, 62 (41%) agreed to participate. In total 3325 children (average age 11.2 years, 51% girls) and 3038 parents (81% mothers) completed the pre-test questionnaire. The average TV/DVD times across the countries were between 1.5 and 1.8 h/day, while less time was used for computer/games console (0.9-1.4 h/day). The children's perceived parental style of communication was quite consistent for TV/DVD and computer/games console. The presence of rules was significantly associated with less time watching TV/DVD and less computer/games console time. Moreover, the use of an autonomy-supportive style was negatively related to both time watching TV/DVD and computer/games console time. The use of a controlling style was related positively to perceived excessive time used on TV/DVD and excessive time used on computer/games console. With a few exceptions, results were similar across the five countries. 
This study suggests that an autonomy-supportive style of communicating rules for TV/DVD or computer/games console use is negatively related to children's time watching TV/DVD and computer/games console time. In contrast, a controlling style is associated with more screen time and, in particular, with more perceived excessive screen time. Longitudinal research is needed to further examine effects of parental style of communication on children's screen time, as well as possible reciprocal effects. International Standard Randomized Controlled Trial Number Register, registration number: ISRCTN34562078. Date applied: 29/07/2011; date assigned: 11/10/2011.

  8. CFD for hypersonic propulsion

    NASA Technical Reports Server (NTRS)

    Povinelli, Louis A.

    1991-01-01

    An overview is given of research activity on the application of computational fluid dynamics (CFD) for hypersonic propulsion systems. After initial consideration of the highly integrated nature of the air-breathing hypersonic engine and airframe, attention is directed toward computations carried out for the components of the engine. A generic inlet configuration is considered in order to demonstrate the highly three-dimensional viscous flow behavior occurring within rectangular inlets. Reacting flow computations for simple jet injection as well as for more complex combustion chambers are then discussed in order to show the capability of viscous finite-rate chemical-reaction computer simulations. Finally, the nozzle flow fields are demonstrated, showing the existence of complex shear layers and shock structure in the exhaust plume. The general issues associated with code validation, as well as the specific issues associated with the use of CFD for design, are discussed. A prognosis for the success of CFD in the design of future propulsion systems is offered.

  9. CFD for hypersonic propulsion

    NASA Technical Reports Server (NTRS)

    Povinelli, Louis A.

    1990-01-01

    An overview is given of research activity on the application of computational fluid dynamics (CFD) for hypersonic propulsion systems. After initial consideration of the highly integrated nature of the air-breathing hypersonic engine and airframe, attention is directed toward computations carried out for the components of the engine. A generic inlet configuration is considered in order to demonstrate the highly three-dimensional viscous flow behavior occurring within rectangular inlets. Reacting flow computations for simple jet injection as well as for more complex combustion chambers are then discussed in order to show the capability of viscous finite-rate chemical-reaction computer simulations. Finally, the nozzle flow fields are demonstrated, showing the existence of complex shear layers and shock structure in the exhaust plume. The general issues associated with code validation, as well as the specific issues associated with the use of CFD for design, are discussed. A prognosis for the success of CFD in the design of future propulsion systems is offered.

  10. Outcomes and challenges of global high-resolution non-hydrostatic atmospheric simulations using the K computer

    NASA Astrophysics Data System (ADS)

    Satoh, Masaki; Tomita, Hirofumi; Yashiro, Hisashi; Kajikawa, Yoshiyuki; Miyamoto, Yoshiaki; Yamaura, Tsuyoshi; Miyakawa, Tomoki; Nakano, Masuo; Kodama, Chihiro; Noda, Akira T.; Nasuno, Tomoe; Yamada, Yohei; Fukutomi, Yoshiki

    2017-12-01

    This article reviews the major outcomes of a 5-year (2011-2016) project using the K computer to perform global numerical atmospheric simulations based on the non-hydrostatic icosahedral atmospheric model (NICAM). The K computer was made available to the public in September 2012 and was used as a primary resource for Japan's Strategic Programs for Innovative Research (SPIRE), an initiative to investigate five strategic research areas; the NICAM project fell under the research area of climate and weather simulation sciences. Combining NICAM with high-performance computing has created new opportunities in three areas of research: (1) higher-resolution global simulations that produce more realistic representations of convective systems, (2) multi-member ensemble simulations that are able to perform extended-range forecasts 10-30 days in advance, and (3) multi-decadal simulations for climatology and variability. Before the K computer era, NICAM was used to demonstrate realistic simulations of intra-seasonal oscillations, including the Madden-Julian oscillation (MJO), only as a case-study approach. Thanks to the big leap in the computational performance of the K computer, we could greatly increase the number of MJO events simulated, in addition to extending integration times and horizontal resolution. We conclude that the high-resolution global non-hydrostatic model, as used in this five-year project, improves the ability to forecast intra-seasonal oscillations and associated tropical cyclogenesis compared with that of the relatively coarser operational models currently in use. The impacts of the sub-kilometer-resolution simulations and the multi-decadal simulations using NICAM are also reviewed.

  11. Galaxy CloudMan: delivering cloud compute clusters

    PubMed Central

    2010-01-01

    Background Widespread adoption of high-throughput sequencing has greatly increased the scale and sophistication of computational infrastructure needed to perform genomic research. An alternative to building and maintaining local infrastructure is “cloud computing”, which, in principle, offers on-demand access to flexible computational infrastructure. However, cloud computing resources are not yet suitable for immediate “as is” use by experimental biologists. Results We present a cloud resource management system that makes it possible for individual researchers to compose and control an arbitrarily sized compute cluster on Amazon’s EC2 cloud infrastructure without any informatics requirements. Within this system, an entire suite of biological tools packaged by the NERC Bio-Linux team (http://nebc.nerc.ac.uk/tools/bio-linux) is available for immediate consumption. The provided solution makes it possible, using only a web browser, to create a completely configured compute cluster ready to perform analysis in less than five minutes. Moreover, we provide an automated method for building custom deployments of cloud resources. This approach promotes reproducibility of results and, if desired, allows individuals and labs to add to or customize an otherwise available cloud system to better meet their needs. Conclusions The knowledge and effort required to deploy a compute cluster in the Amazon EC2 cloud are not trivial. The solution presented in this paper eliminates these barriers, making it possible for researchers to deploy exactly the amount of computing power they need, combined with a wealth of existing analysis software, to handle the ongoing data deluge. PMID:21210983

  12. USRA/RIACS

    NASA Technical Reports Server (NTRS)

    Oliger, Joseph

    1992-01-01

    The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on June 6, 1983. RIACS is privately operated by USRA, a consortium of universities with research programs in the aerospace sciences, under a cooperative agreement with NASA. The primary mission of RIACS is to provide research and expertise in computer science and scientific computing to support the scientific missions of NASA ARC. The research carried out at RIACS must change its emphasis from year to year in response to NASA ARC's changing needs and technological opportunities. A flexible scientific staff is provided through a university faculty visitor program, a post-doctoral program, and a student visitor program. Not only does this provide appropriate expertise but it also introduces scientists outside of NASA to NASA problems. A small group of core RIACS staff provides continuity and interacts with an ARC technical monitor and scientific advisory group to determine the RIACS mission. RIACS activities are reviewed and monitored by a USRA advisory council and ARC technical monitor. Research at RIACS is currently being done in the following areas: (1) parallel computing; (2) advanced methods for scientific computing; (3) learning systems; (4) high performance networks and technology; and (5) graphics, visualization, and virtual environments. In the past year, parallel compiler techniques and adaptive numerical methods for flows in complicated geometries were identified as important problems to investigate for ARC's involvement in the Computational Grand Challenges of the next decade. A summer student visitor program concluded during these six months: six visiting graduate students worked on projects over the summer and presented seminars on their work at the conclusion of their visits. 
RIACS technical reports are usually preprints of manuscripts that have been submitted to research journals or conference proceedings. A list of these reports for the period July 1, 1992 through December 31, 1992 is provided.

  13. A Bayesian Method for Evaluating and Discovering Disease Loci Associations

    PubMed Central

    Jiang, Xia; Barmada, M. Michael; Cooper, Gregory F.; Becich, Michael J.

    2011-01-01

    Background A genome-wide association study (GWAS) typically involves examining representative SNPs in individuals from some population. A GWAS data set can concern a million SNPs and may soon concern billions. Researchers investigate the association of each SNP individually with a disease, and it is becoming increasingly commonplace to also analyze multi-SNP associations. Techniques for handling so many hypotheses include the Bonferroni correction and recently developed Bayesian methods. These methods can encounter problems. Most importantly, they are not applicable to a complex multi-locus hypothesis which has several competing hypotheses rather than only a null hypothesis. A method that computes the posterior probability of complex hypotheses is a pressing need. Methodology/Findings We introduce the Bayesian network posterior probability (BNPP) method which addresses the difficulties. The method represents the relationship between a disease and SNPs using a directed acyclic graph (DAG) model, and computes the likelihood of such models using a Bayesian network scoring criterion. The posterior probability of a hypothesis is computed based on the likelihoods of all competing hypotheses. The BNPP can not only be used to evaluate a hypothesis that has previously been discovered or suspected, but also to discover new disease loci associations. The results of experiments using simulated and real data sets are presented. Our results concerning simulated data sets indicate that the BNPP exhibits both better evaluation and discovery performance than does a p-value based method. For the real data sets, previous findings in the literature are confirmed and additional findings are found. Conclusions/Significance We conclude that the BNPP resolves a pressing problem by providing a way to compute the posterior probability of complex multi-locus hypotheses. A researcher can use the BNPP to determine the expected utility of investigating a hypothesis further. 
Furthermore, we conclude that the BNPP is a promising method for discovering disease loci associations. PMID:21853025
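The key step, normalizing a hypothesis's score against all competing hypotheses rather than testing against a lone null, can be sketched with a toy example: score each candidate parent set of the disease node with a Dirichlet-multinomial marginal likelihood and normalize under a uniform prior. Everything below (data, priors, binary SNP coding) is invented for illustration; the actual BNPP applies Bayesian network scoring criteria to GWAS-scale data.

```python
import math
import random

random.seed(1)

# Toy data: binary genotypes at two SNPs; disease depends on SNP 0 only
data = []
for _ in range(500):
    s1, s2 = random.randint(0, 1), random.randint(0, 1)
    d = 1 if random.random() < (0.4 if s1 else 0.1) else 0
    data.append((s1, s2, d))

def log_marginal(parent_idx):
    """Bayesian-network score of 'disease <- parents' with Beta(1,1)
    priors (a simplified, binary-variable marginal likelihood)."""
    counts = {}
    for row in data:
        cfg = tuple(row[i] for i in parent_idx)
        n1, n0 = counts.get(cfg, (0, 0))
        counts[cfg] = (n1 + row[2], n0 + (1 - row[2]))
    lml = 0.0
    for n1, n0 in counts.values():
        lml += (math.lgamma(2) - math.lgamma(2 + n1 + n0)
                + math.lgamma(1 + n1) + math.lgamma(1 + n0))
    return lml

# Competing hypotheses: which SNPs are parents of the disease node
hypotheses = [(), (0,), (1,), (0, 1)]
scores = [log_marginal(h) for h in hypotheses]
m = max(scores)
weights = [math.exp(s - m) for s in scores]      # uniform prior over hypotheses
posterior = [w / sum(weights) for w in weights]
for h, p in zip(hypotheses, posterior):
    print(h, round(p, 3))
```

With the disease simulated to depend only on SNP 0, essentially all posterior mass should concentrate on the hypothesis (0,), with the remainder mostly on the over-parameterized (0, 1).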

  14. Computer use, language, and literacy in safety net clinic communication

    PubMed Central

    Barton, Jennifer L; Lyles, Courtney R; Wu, Michael; Yelin, Edward H; Martinez, Diana; Schillinger, Dean

    2017-01-01

    Objective: Patients with limited health literacy (LHL) and limited English proficiency (LEP) experience suboptimal communication and health outcomes. Electronic health record implementation in safety net clinics may affect communication with LHL and LEP patients. We investigated the associations between safety net clinician computer use and patient-provider communication for patients with LEP and LHL. Materials and Methods: We video-recorded encounters at 5 academically affiliated US public hospital clinics between English- and Spanish-speaking patients with chronic conditions and their primary and specialty care clinicians. We analyzed changes in communication behaviors (coded with the Roter Interaction Analysis System) with each additional point on a clinician computer use score, controlling for clinician type and visit length and stratified by English proficiency and health literacy status. Results: Greater clinician computer use was associated with more biomedical statements (+12.4, P = .03) and less positive affect (−0.6, P < .01) from LEP/LHL patients. In visits with patients with adequate English proficiency/health literacy, greater clinician computer use was associated with less positive patient affect (−0.9, P < .01), fewer clinician psychosocial statements (−3.5, P < .05), greater clinician verbal dominance (+0.09, P < .01), and lower ratings on quality of care and communication. Conclusion: Higher clinician computer use was associated with more biomedical focus with LEP/LHL patients, and clinician verbal dominance and lower ratings with patients with adequate English proficiency and health literacy. Discussion: Implementation research should explore interventions to enhance relationship-centered communication for diverse patient populations in the computer era. PMID:27274017

  15. High performance network and channel-based storage

    NASA Technical Reports Server (NTRS)

    Katz, Randy H.

    1991-01-01

    In the traditional mainframe-centered view of a computer system, storage devices are coupled to the system through complex hardware subsystems called input/output (I/O) channels. With the dramatic shift towards workstation-based computing, and its associated client/server model of computation, storage facilities are now found attached to file servers and distributed throughout the network. We discuss the underlying technology trends that are leading to high performance network-based storage, namely advances in networks, storage devices, and I/O controller and server architectures. We review several commercial systems and research prototypes that are leading to a new approach to high performance computing based on network-attached storage.

  16. Bridging the Digital Divide in Diabetes: Family Support and Implications for Health Literacy

    PubMed Central

    Mayberry, Lindsay S.; Kripalani, Sunil; Rothman, Russell L.

    2011-01-01

    Abstract Background Patient web portals (PWPs) offer patients remote access to their medical record and communication with providers. Adults with health literacy limitations are less likely to access and use health information technology (HIT), including PWPs. In diabetes, PWP use has been associated with patient satisfaction, patient–provider communication, and glycemic control. Methods Using mixed methods, we explored the relationships between health literacy, numeracy, and computer literacy and the usage of a PWP and HIT. Participants (N=61 adults with type 2 diabetes) attended focus groups and completed surveys, including measures of health literacy, numeracy, and computer anxiety (an indicator of computer literacy) and frequency of PWP and HIT use. Results Computer literacy was positively associated with health literacy (r=0.41, P<0.001) and numeracy (r=0.35, P<0.001), but health literacy was not associated with numeracy. Participants with limited health literacy (23%), numeracy (43%), or computer literacy (25%) were no less likely to access PWPs or HIT, but lower health literacy was associated with less frequent use of a computer to research diabetes medications or treatments. In focus groups, participants spontaneously commented on family support when accessing and using PWPs or HIT for diabetes management. Conclusions Participants reported family members facilitated access and usage of HIT, taught them usage skills, and acted as online delegates. Participant statements suggest family members may bridge the HIT “digital divide” in diabetes by helping adults access a PWP or HIT for diabetes management. PMID:21718098

  17. A method for mapping fire hazard and risk across multiple scales and its application in fire management

    Treesearch

    Robert E. Keane; Stacy A. Drury; Eva C. Karau; Paul F. Hessburg; Keith M. Reynolds

    2010-01-01

    This paper presents modeling methods for mapping fire hazard and fire risk using a research model called FIREHARM (FIRE Hazard and Risk Model) that computes common measures of fire behavior, fire danger, and fire effects to spatially portray fire hazard over space. FIREHARM can compute a measure of risk associated with the distribution of these measures over time using...
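The hazard-versus-risk distinction above can be illustrated with a toy calculation: hazard is a fire-behavior measure computed under fixed conditions, while risk integrates that measure's distribution over simulated year-to-year variability. The surrogate flame-length model and the moisture distribution below are invented for illustration and are not FIREHARM's equations.

```python
import random

random.seed(7)

def flame_length(fuel_load, moisture):
    # Toy surrogate for a fire-behavior model: drier, heavier fuels burn hotter
    return max(0.0, fuel_load * (1.0 - moisture) * 2.0)

def risk(fuel_load, threshold=1.5, years=1000):
    """P(flame length exceeds threshold) across simulated moisture years."""
    exceed = sum(
        flame_length(fuel_load, random.betavariate(2, 2)) > threshold
        for _ in range(years)
    )
    return exceed / years

for fuel in (0.5, 1.5, 3.0):
    print(f"fuel={fuel}: hazard={flame_length(fuel, 0.5):.1f}, "
          f"risk={risk(fuel):.2f}")
```

Two cells with the same hazard under average moisture can carry different risk once the spread of possible years is accounted for, which is the distinction the mapping method exploits.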

  18. Defects Associated with Solidification of Melt Processed Superalloys for the Aerospace Industry

    DTIC Science & Technology

    2008-07-23

    resulting computational model will be in a form that is usable in their efforts to design new alloys and processing routes. Given the broad research...thermodynamics modeling by Asta and Woodward. The permeability of dendritic arrays in superalloys has been determined using three-dimensional reconstructions of...the solid-liquid mush and finite-element fluid simulations by Pollock and Spowart. Close interaction with industry ensured that computational

  19. Communications and Computers in the 21st Century. Hearing before the Technology Policy Task Force of the Committee on Science, Space, and Technology. House of Representatives, One Hundredth Congress, First Session.

    ERIC Educational Resources Information Center

    Congress of the U.S., Washington, DC. House Committee on Science, Space and Technology.

    Based upon the premise that manufacturing, communications, and computers are the key to productivity, this hearing before the Technology Policy Task Force was held to examine how the federal government interacts with universities, engineering research centers, professional associations, and private businesses in these areas. This document contains…

  20. Technology & Disability: Research, Design, Practice, and Policy. Proceedings of the RESNA International Conference (25th, Minneapolis, Minnesota, June 27-July 1, 2002).

    ERIC Educational Resources Information Center

    Simpson, Richard, Ed.

    These proceedings of the 2002 annual RESNA (Association for the Advancement of Rehabilitation Technology) conference include more than 200 presentations on all facets of assistive technology, including concurrent sessions, scientific platform sessions, interactive poster presentations, computer demonstrations, and the research symposium. The…

  1. Health Literacy Assessment of the STOFHLA: Paper versus Electronic Administration Continuation Study

    ERIC Educational Resources Information Center

    Chesser, Amy K.; Keene Woods, Nikki; Wipperman, Jennifer; Wilson, Rachel; Dong, Frank

    2014-01-01

    Low health literacy is associated with poor health outcomes. Research is needed to understand the mechanisms and pathways of its effects. Computer-based assessment tools may improve efficiency and cost-effectiveness of health literacy research. The objective of this preliminary study was to assess if administration of the Short Test of Functional…

  2. Computer Modeling of Complete IC Fabrication Process.

    DTIC Science & Technology

    1984-01-01

    Venson Shaw 10. C. S. Chang 11. Elizabeth Batson 12. Richard Pinto 13. Jacques Beauduoin SPEAKERS: 1. Tayo Akinwande 2. Dimitri Antoniadis 3. Walter...Numerical Model of Polysilicon Emitter Contacts in Bipolar Transistors,’ To be published IEEE Trans. Electron Devices. [34] M. R. Pinto, R. W. Dutton...Received PhD, Spring 1982) Balaji Swaminathan (Received PhD, Spring 1983) Len Mei Research Associate Michael Kump Research Assistant Mark Pinto Research

  3. The Association between Students' Use of an Electronic Voting System and their Learning Outcomes

    ERIC Educational Resources Information Center

    Kennedy, G. E.; Cutts, Q. I.

    2005-01-01

    This paper reports on the use of an electronic voting system (EVS) in a first-year computing science subject. Previous investigations suggest that students' use of an EVS would be positively associated with their learning outcomes. However, no research has established this relationship empirically. This study sought to establish whether there was…

  4. A Transcript Analysis of Graduates of Three Community College of Philadelphia Curricula between the Years 1985 and 1992. Institutional Research Report #83.

    ERIC Educational Resources Information Center

    Terzian, Aram L.; Obetz, Wayne S.

    A study was conducted at the Community College of Philadelphia (CCP) to examine the course-taking patterns of 94 graduates of the associate in arts (AA) curriculum, 1,957 graduates of the associate in general studies (AGS) curriculum, and 99 graduates of the associate in science (AS) curriculum. Using a computer-based approach to transcript…

  5. Applications of complex systems theory in nursing education, research, and practice.

    PubMed

    Clancy, Thomas R; Effken, Judith A; Pesut, Daniel

    2008-01-01

    The clinical and administrative processes in today's healthcare environment are becoming increasingly complex. Multiple providers, new technology, competition, and the growing ubiquity of information all contribute to the notion of health care as a complex system. A complex system (CS) is characterized by a highly connected network of entities (e.g., physical objects, people, or groups of people) from which higher-order behavior emerges. Research in the transdisciplinary field of CS has focused on the use of computational modeling and simulation as a methodology for analyzing CS behavior. The creation of virtual worlds through computer simulation allows researchers to analyze multiple variables simultaneously and begin to understand behaviors that are common regardless of the discipline. The application of CS principles, mediated through computer simulation, informs nursing practice of the benefits and drawbacks of new procedures, protocols, and practices before they must actually be implemented. The inclusion of new computational tools and their applications in nursing education is also gaining attention. For example, education in CS and applied computational methods has been endorsed by the Institute of Medicine, the American Organization of Nurse Executives, and the American Association of Colleges of Nursing as essential training for nurse leaders. The purpose of this article is to review the current research literature on CS science within the context of expert practice and its implications for educating nurses in leadership roles. The article focuses on three broad areas: CS defined; literature review and exemplars from CS research; and applications of CS theory in nursing leadership education. The article also highlights the key role nursing informaticists play in integrating emerging computational tools in the analysis of complex nursing systems.

  6. Computational Sustainability Tools Illuminating the Triple Value Model

    EPA Science Inventory

    The National Research Council (NRC) report Sustainability and the U.S. EPA recommends development of a "sustainability toolbox" to address challenges associated with implementing sustainability at the EPA. The three pillars of sustainability - industrial, social, and en...

  7. Computational models of intergroup competition and warfare.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Letendre, Kenneth; Abbott, Robert G.

    2011-11-01

    This document reports on the research of Kenneth Letendre, the recipient of a Sandia Graduate Research Fellowship at the University of New Mexico. Warfare is an extreme form of intergroup competition in which individuals make extreme sacrifices for the benefit of their nation or other group to which they belong. Among animals, limited, non-lethal competition is the norm. It is not fully understood what factors lead to warfare. We studied the global variation in the frequency of civil conflict among countries of the world, and its positive association with variation in the intensity of infectious disease. We demonstrated that the burden of human infectious disease importantly predicts the frequency of civil conflict and tested a causal model for this association based on the parasite-stress theory of sociality. We also investigated the organization of social foraging by colonies of harvester ants in the genus Pogonomyrmex, using both field studies and computer models.

  8. Boolean and brain-inspired computing using spin-transfer torque devices

    NASA Astrophysics Data System (ADS)

    Fan, Deliang

    Several completely new approaches (such as spintronics, carbon nanotubes, graphene, TFETs, etc.) to information processing and data storage technologies are emerging to address the time frame beyond the current Complementary Metal-Oxide-Semiconductor (CMOS) roadmap. The high-speed magnetization switching of a nano-magnet due to current-induced spin-transfer torque (STT) has been demonstrated in recent experiments. Such STT devices can be explored in compact, low-power memory and logic design. In order to truly leverage STT-device-based computing, researchers must rethink circuits, architectures, and computing models, since STT devices are unlikely to be drop-in replacements for CMOS. The potential of STT-device-based computing will be best realized by considering new computing models that are inherently suited to the characteristics of STT devices, and new applications that are enabled by their unique capabilities, thereby attaining performance that CMOS cannot achieve. The goal of this research is to conduct synergistic exploration at the architecture, circuit, and device levels for Boolean and brain-inspired computing using nanoscale STT devices. Specifically, we first show that non-volatile STT devices can be used in designing configurable Boolean logic blocks. We propose a spin-memristor threshold logic (SMTL) gate design, where a memristive crossbar array performs current-mode summation of binary inputs and a low-power current-mode spintronic threshold device carries out the energy-efficient threshold operation. Next, for brain-inspired computing, we have exploited different spin-transfer torque device structures that can implement the hard-limiting and soft-limiting artificial neuron transfer functions, respectively.
We apply such STT-based neurons (or 'spin-neurons') in various neural network architectures, such as hierarchical temporal memory and feed-forward neural networks, to perform "human-like" cognitive computing, which shows more than two orders of magnitude lower energy consumption than state-of-the-art CMOS implementations. Finally, we show that the dynamics of an injection-locked Spin Hall Effect Spin-Torque Oscillator (SHE-STO) cluster can be exploited as a robust multi-dimensional distance metric for associative computing, image/video analysis, etc. Our simulation results show that the proposed system architecture with injection-locked SHE-STOs and the associated CMOS interface circuits is suitable for robust and energy-efficient associative computing and pattern matching.
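    Functionally, the SMTL gate described above reduces to a weighted sum (the crossbar's current-mode summation) followed by a thresholding transfer function, with hard-limiting and soft-limiting variants for the neurons. A minimal sketch of that behavior; the inputs, weights, and function names here are illustrative, not device-level models:

```python
import numpy as np

def crossbar_sum(inputs, weights):
    """Weighted summation of binary inputs (the crossbar dot product)."""
    return float(np.dot(inputs, weights))

def hard_limit(x, threshold=0.0):
    """Step-like transfer function (hard-limiting neuron)."""
    return 1 if x >= threshold else 0

def soft_limit(x):
    """Sigmoid transfer function (soft-limiting neuron)."""
    return 1.0 / (1.0 + np.exp(-x))

inputs = np.array([1, 0, 1, 1])             # binary inputs
weights = np.array([0.5, -0.3, 0.8, -0.6])  # illustrative weights

s = crossbar_sum(inputs, weights)  # 0.5 + 0.8 - 0.6 = 0.7
print(hard_limit(s))               # 1
print(round(soft_limit(s), 3))     # 0.668
```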

  9. The ISCB Student Council Internship Program: Expanding computational biology capacity worldwide

    PubMed Central

    Anupama, Jigisha; Shanmugam, Avinash Kumar; Santos, Alberto; Michaut, Magali

    2018-01-01

    Education and training are two essential ingredients for a successful career. On one hand, universities provide students with a curriculum for specializing in one’s field of study, and on the other, internships complement coursework and provide invaluable training experience for a fruitful career. Consequently, undergraduates and graduates are encouraged to undertake an internship during the course of their degree. The opportunity to explore their research interests in the early stages of their education is important for students because it improves their skill set and gives their careers a boost. In the long term, this helps to close the gap between skills and employability among students across the globe and balance the research capacity in the field of computational biology. However, training opportunities are often scarce for computational biology students, particularly for those who reside in less-privileged regions. Aimed at helping students develop research and academic skills in computational biology and alleviating the divide across countries, the Student Council of the International Society for Computational Biology introduced its Internship Program in 2009. The Internship Program is committed to providing access to computational biology training, especially for students from developing regions, and improving competencies in the field. Here, we present how the Internship Program works and the impact of the internship opportunities so far, along with the challenges associated with this program. PMID:29346365

  10. Introduction to the LaRC central scientific computing complex

    NASA Technical Reports Server (NTRS)

    Shoosmith, John N.

    1993-01-01

    The computers and associated equipment that make up the Central Scientific Computing Complex of the Langley Research Center are briefly described. The electronic networks that provide access to the various components of the complex and a number of areas that can be used by Langley and contractor staff for special applications (scientific visualization, image processing, software engineering, and grid generation) are also described. Flight simulation facilities that use the central computers are described. Management of the complex, procedures for its use, and available services and resources are discussed. This document is intended for new users of the complex, for current users who wish to keep apprised of changes, and for visitors who need to understand the role of central scientific computers at Langley.

  11. Advances in natural language processing.

    PubMed

    Hirschberg, Julia; Manning, Christopher D

    2015-07-17

    Natural language processing employs computational techniques for the purpose of learning, understanding, and producing human language content. Early computational approaches to language research focused on automating the analysis of the linguistic structure of language and developing basic technologies such as machine translation, speech recognition, and speech synthesis. Today's researchers refine and make use of such tools in real-world applications, creating spoken dialogue systems and speech-to-speech translation engines, mining social media for information about health or finance, and identifying sentiment and emotion toward products and services. We describe successes and challenges in this rapidly advancing area. Copyright © 2015, American Association for the Advancement of Science.

  12. The Role of Parents and Related Factors on Adolescent Computer Use

    PubMed Central

    Epstein, Jennifer A.

    2012-01-01

    Background Research suggested the importance of parents on their adolescents’ computer activity. Spending too much time on the computer for recreational purposes in particular has been found to be related to areas of public health concern in children/adolescents, including obesity and substance use. Design and Methods The goal of the research was to determine the association between recreational computer use and potentially linked factors (parental monitoring, social influences to use computers including parents, age of first computer use, self-control, and particular internet activities). Participants (aged 13-17 years and residing in the United States) were recruited via the Internet to complete an anonymous survey online using a survey tool. The target sample of 200 participants who completed the survey was achieved. The sample’s average age was 16, and 63% were girls. Results A set of regressions with recreational computer use as the dependent variable was run. Conclusions Less parental monitoring, younger age at first computer use, listening to or downloading music from the internet more frequently, using the internet for educational purposes less frequently, and parents’ use of the computer for pleasure were related to spending a greater percentage of time on non-school computer use. These findings suggest the importance of parental monitoring and parental computer use on their children’s own computer use, and the influence of some internet activities on adolescent computer use. Finally, programs are needed that help parents delay the age at which their children start using computers and place limits on recreational computer use. PMID:25170449

  13. Scientific Discovery through Advanced Computing in Plasma Science

    NASA Astrophysics Data System (ADS)

    Tang, William

    2005-03-01

    Advanced computing is generally recognized to be an increasingly vital tool for accelerating progress in scientific research during the 21st Century. For example, the Department of Energy's ``Scientific Discovery through Advanced Computing'' (SciDAC) Program was motivated in large measure by the fact that formidable scientific challenges in its research portfolio could best be addressed by utilizing the combination of the rapid advances in super-computing technology together with the emergence of effective new algorithms and computational methodologies. The imperative is to translate such progress into corresponding increases in the performance of the scientific codes used to model complex physical systems such as those encountered in high temperature plasma research. If properly validated against experimental measurements and analytic benchmarks, these codes can provide reliable predictive capability for the behavior of a broad range of complex natural and engineered systems. This talk reviews recent progress and future directions for advanced simulations with some illustrative examples taken from the plasma science applications area. Significant recent progress has been made in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics, giving increasingly good agreement between experimental observations and computational modeling. This was made possible by the combination of access to powerful new computational resources together with innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning a huge range in time and space scales. In particular, the plasma science community has made excellent progress in developing advanced codes for which computer run-time and problem size scale well with the number of processors on massively parallel machines (MPP's). 
A good example is the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPP's to produce three-dimensional, general geometry, nonlinear particle simulations which have accelerated progress in understanding the nature of plasma turbulence in magnetically-confined high temperature plasmas. These calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In general, results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. The associated scientific excitement should serve to stimulate improved cross-cutting collaborations with other fields and also to help attract bright young talent to the computational science area.

  14. Derivation of Tropospheric Ozone Climatology and Trends from TOMS Data

    NASA Technical Reports Server (NTRS)

    Newchurch, Michael J.; McPeters, Rich; Logan, Jennifer; Kim, Jae-Hwan

    2002-01-01

    This research addresses the following three objectives: (1) Derive tropospheric ozone columns from the TOMS instruments by computing the difference between total-ozone columns over cloudy areas and over clear areas in the tropics; (2) Compute secular trends in Nimbus-7 derived tropospheric Ozone column amounts and associated potential trends in the decadal-scale tropical cloud climatology; (3) Explain the occurrence of anomalously high ozone retrievals over high ice clouds.
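    The cloud-differential approach in objective (1) is, at its core, simple column arithmetic: over a high cloud the instrument retrieves only the ozone above the cloud top, so subtracting that cloudy-scene column from a nearby clear-scene total column leaves an estimate of the tropospheric column below. A sketch with hypothetical Dobson-unit values, not actual TOMS retrievals:

```python
def tropospheric_column(total_clear_du, above_cloud_du):
    """Tropospheric ozone estimate from paired clear/cloudy retrievals (DU)."""
    return total_clear_du - above_cloud_du

total_clear = 262.0   # clear-scene total ozone column (DU), hypothetical
above_cloud = 237.0   # cloudy-scene (above-cloud) column (DU), hypothetical
print(tropospheric_column(total_clear, above_cloud))  # 25.0 DU
```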

  15. Bionimbus: a cloud for managing, analyzing and sharing large genomics datasets

    PubMed Central

    Heath, Allison P; Greenway, Matthew; Powell, Raymond; Spring, Jonathan; Suarez, Rafael; Hanley, David; Bandlamudi, Chai; McNerney, Megan E; White, Kevin P; Grossman, Robert L

    2014-01-01

    Background As large genomics and phenotypic datasets are becoming more common, it is increasingly difficult for most researchers to access, manage, and analyze them. One possible approach is to provide the research community with several petabyte-scale cloud-based computing platforms containing these data, along with tools and resources to analyze it. Methods Bionimbus is an open source cloud-computing platform that is based primarily upon OpenStack, which manages on-demand virtual machines that provide the required computational resources, and GlusterFS, which is a high-performance clustered file system. Bionimbus also includes Tukey, which is a portal, and associated middleware that provides a single entry point and a single sign on for the various Bionimbus resources; and Yates, which automates the installation, configuration, and maintenance of the software infrastructure required. Results Bionimbus is used by a variety of projects to process genomics and phenotypic data. For example, it is used by an acute myeloid leukemia resequencing project at the University of Chicago. The project requires several computational pipelines, including pipelines for quality control, alignment, variant calling, and annotation. For each sample, the alignment step requires eight CPUs for about 12 h. BAM file sizes ranged from 5 GB to 10 GB for each sample. Conclusions Most members of the research community have difficulty downloading large genomics datasets and obtaining sufficient storage and computer resources to manage and analyze the data. Cloud computing platforms, such as Bionimbus, with data commons that contain large genomics datasets, are one choice for broadening access to research data in genomics. PMID:24464852

  16. Current Models of Digital Scholarly Communication: Results of an Investigation Conducted by Ithaka for the Association of Research Libraries

    ERIC Educational Resources Information Center

    Maron, Nancy L.; Smith, K. Kirby

    2008-01-01

    As electronic resources for scholarship proliferate, more and more scholars turn to their computers rather than to print sources to conduct their research. The decentralized distribution of these new model works can make it difficult to fully appreciate their scope and number, even for university librarians tasked with knowing about valuable…

  17. STARS: An Integrated, Multidisciplinary, Finite-Element, Structural, Fluids, Aeroelastic, and Aeroservoelastic Analysis Computer Program

    NASA Technical Reports Server (NTRS)

    Gupta, K. K.

    1997-01-01

    A multidisciplinary, finite element-based, highly graphics-oriented, linear and nonlinear analysis capability that includes such disciplines as structures, heat transfer, linear aerodynamics, computational fluid dynamics, and controls engineering has been achieved by integrating several new modules in the original STARS (STructural Analysis RoutineS) computer program. Each individual analysis module is general-purpose in nature and is effectively integrated to yield aeroelastic and aeroservoelastic solutions of complex engineering problems. Examples of advanced NASA Dryden Flight Research Center projects analyzed by the code in recent years include the X-29A, F-18 High Alpha Research Vehicle/Thrust Vectoring Control System, B-52/Pegasus Generic Hypersonics, National AeroSpace Plane (NASP), SR-71/Hypersonic Launch Vehicle, and High Speed Civil Transport (HSCT) projects. Extensive graphics capabilities exist for convenient model development and postprocessing of analysis results. The program is written in modular form in standard FORTRAN language to run on a variety of computers, such as the IBM RISC/6000, SGI, DEC, Cray, and personal computer; associated graphics codes use OpenGL and IBM/graPHIGS language for color depiction. This program is available from COSMIC, the NASA agency for distribution of computer programs.

  18. Games at work: the recreational use of computer games during working hours.

    PubMed

    Reinecke, Leonard

    2009-08-01

    The present study investigated the recreational use of video and computer games in the workplace. In an online survey, 833 employed users of online casual games reported on their use of computer games during working hours. The data indicate that playing computer games in the workplace elicits substantial levels of recovery experience. Recovery experience associated with gameplay was the strongest predictor for the use of games in the workplace. Furthermore, individuals with higher levels of work-related fatigue reported stronger recovery experience during gameplay and showed a higher tendency to play games during working hours than did persons with lower levels of work strain. Additionally, the social situation at work was found to have a significant influence on the use of games. Persons receiving less social support from colleagues and supervisors played games at work more frequently than did individuals with higher levels of social support. Furthermore, job control was positively related to the use of games at work. In sum, the results of the present study illustrate that computer games have a significant recovery potential. Implications of these findings for research on personal computer use during work and for games research in general are discussed.

  19. ERA 1103 UNIVAC 2 Calculating Machine

    NASA Image and Video Library

    1955-09-21

    The new 10-by 10-Foot Supersonic Wind Tunnel at the Lewis Flight Propulsion Laboratory included high tech data acquisition and analysis systems. The reliable gathering of pressure, speed, temperature, and other data from test runs in the facilities was critical to the research process. Throughout the 1940s and early 1950s female employees, known as computers, recorded all test data and performed initial calculations by hand. The introduction of punch card computers in the late 1940s gradually reduced the number of hands-on calculations. In the mid-1950s new computational machines were installed in the office building of the 10-by 10-Foot tunnel. The new systems included this UNIVAC 1103 vacuum tube computer—the lab’s first centralized computer system. The programming was done on paper tape and fed into the machine. The 10-by 10 computer center also included the Lewis-designed Computer Automated Digital Encoder (CADDE) and Digital Automated Multiple Pressure Recorder (DAMPR) systems which converted test data to binary-coded decimal numbers and recorded test pressures automatically, respectively. The systems primarily served the 10-by 10, but were also applied to the other large facilities. Engineering Research Associates (ERA) developed the initial UNIVAC computer for the Navy in the late 1940s. In 1952 the company designed a commercial version, the UNIVAC 1103. The 1103 was the first computer designed by Seymour Cray and the first commercially successful computer.

  20. Cost Computations for Cyber Fighter Associate

    DTIC Science & Technology

    2015-05-01

    associate. Aberdeen Proving Ground (MD): Army Research Laboratory (US); in press. 2 Harman D, Brown S, Henz B, Marvel LM. A communication protocol... Harman, et al.2 A specific class called ListenThread was created for multithreaded listeners. When ListenThread is instantiated, it is passed a given...2. Harman D, Brown S, Henz B, Marvel LM. A communication protocol for CyAMS and the cyber associate interface. Aberdeen Proving Ground (MD): US Army

  1. Community-driven computational biology with Debian Linux.

    PubMed

    Möller, Steffen; Krabbenhöft, Hajo Nils; Tille, Andreas; Paleino, David; Williams, Alan; Wolstencroft, Katy; Goble, Carole; Holland, Richard; Belhachemi, Dominique; Plessy, Charles

    2010-12-21

    The Open Source movement and its technologies are popular in the bioinformatics community because they provide freely available tools and resources for research. In order to feed the steady demand for updates on software and associated data, a service infrastructure is required for sharing and providing these tools to heterogeneous computing environments. The Debian Med initiative provides ready and coherent software packages for medical informatics and bioinformatics. These packages can be used together in Taverna workflows via the UseCase plugin to manage execution on local or remote machines. If such packages are available in cloud computing environments, the underlying hardware and the analysis pipelines can be shared along with the software. Debian Med closes the gap between developers and users. It provides a simple method for offering new releases of software and data resources, thus provisioning a local infrastructure for computational biology. For geographically distributed teams it can ensure they are working on the same versions of tools, in the same conditions. This contributes to the world-wide networking of researchers.

  2. JAX Colony Management System (JCMS): an extensible colony and phenotype data management system.

    PubMed

    Donnelly, Chuck J; McFarland, Mike; Ames, Abigail; Sundberg, Beth; Springer, Dave; Blauth, Peter; Bult, Carol J

    2010-04-01

    The Jackson Laboratory Colony Management System (JCMS) is a software application for managing data and information related to research mouse colonies, associated biospecimens, and experimental protocols. JCMS runs directly on computers that run one of the PC Windows operating systems, but can be accessed via web browser interfaces from any computer running a Windows, Macintosh, or Linux operating system. JCMS can be configured for a single user or multiple users in small- to medium-size work groups. The target audience for JCMS includes laboratory technicians, animal colony managers, and principal investigators. The application provides operational support for colony management and experimental workflows, sample and data tracking through transaction-based data entry forms, and date-driven work reports. Flexible query forms allow researchers to retrieve database records based on user-defined criteria. Recent advances in handheld computers with integrated barcode readers, middleware technologies, web browsers, and wireless networks add to the utility of JCMS by allowing real-time access to the database from any networked computer.

  3. "But it doesn't come naturally": how effort expenditure shapes the benefit of growth mindset on women's sense of intellectual belonging in computing

    NASA Astrophysics Data System (ADS)

    Stout, Jane G.; Blaney, Jennifer M.

    2017-10-01

    Research suggests growth mindset, or the belief that knowledge is acquired through effort, may enhance women's sense of belonging in male-dominated disciplines, like computing. However, other research indicates women who spend a great deal of time and energy in technical fields experience a low sense of belonging. The current study assessed the benefits of a growth mindset on women's (and men's) sense of intellectual belonging in computing, accounting for the amount of time and effort dedicated to academics. We define "intellectual belonging" as the sense that one is believed to be a competent member of the community. Whereas a stronger growth mindset was associated with stronger intellectual belonging for men, a growth mindset only boosted women's intellectual belonging when they did not work hard on academics. Our findings suggest, paradoxically, women may not benefit from a growth mindset in computing when they exert a lot of effort.
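    The moderation effect reported above (a growth-mindset benefit that vanishes at high effort) corresponds to a negative mindset-by-effort interaction term in a regression. A sketch on simulated data; the coefficients are invented for illustration and are not the study's estimates:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
mindset = rng.uniform(0, 1, n)  # growth-mindset score (simulated)
effort = rng.uniform(0, 1, n)   # time/effort spent on academics (simulated)

# Belonging rises with mindset only when effort is low: the negative
# interaction coefficient cancels the mindset benefit at high effort.
belonging = 1.0 + 0.8 * mindset + 0.2 * effort - 0.9 * mindset * effort

# Fit intercept, main effects, and interaction by least squares.
X = np.column_stack([np.ones(n), mindset, effort, mindset * effort])
coef, *_ = np.linalg.lstsq(X, belonging, rcond=None)
print(np.round(coef, 2))  # recovers ~[1.0, 0.8, 0.2, -0.9]
```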

  4. The association between computer literacy and training on clinical productivity and user satisfaction in using the electronic medical record in Saudi Arabia.

    PubMed

    Alasmary, May; El Metwally, Ashraf; Househ, Mowafa

    2014-08-01

    The association of computer literacy and training with clinical productivity and user satisfaction for a recently implemented Electronic Medical Record (EMR) system in Prince Sultan Medical Military City (PSMMC) was investigated. The scope of this study was to explore the association between age, occupation, and computer literacy and the clinical productivity and user satisfaction of the newly implemented EMR at PSMMC, as well as the association of user satisfaction with age and position. A self-administered questionnaire was distributed to all doctors and nurses working in the Alwazarat Family and Community Center (a health center in PSMMC). A convenience sample of 112 healthcare providers (65 nurses and 47 physicians) completed the questionnaire. A combination of correlation, one-way ANOVA, and t-tests was used to answer the research questions. Participants had high levels of self-reported computer literacy and satisfaction with the system; both levels were higher among physicians than among nurses. A moderate but significant (at the p < 0.01 level) correlation was found between computer literacy and users' satisfaction with the system (R = 0.343). Age was weakly but significantly (at p < 0.05) positively correlated with satisfaction with the system (R = 0.29). Self-reported system productivity and satisfaction were significantly correlated at p < 0.01 (R = 0.509). High satisfaction with training on using the system was not positively correlated with overall satisfaction with the system. This study demonstrated that EMR users with high computer literacy skills were more satisfied with using the EMR than users with low computer literacy skills.
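    The R values reported in this record are Pearson correlation coefficients. A self-contained sketch of the computation, using made-up literacy and satisfaction scores rather than the PSMMC data:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Made-up survey scores for illustration only (1-5 Likert-style).
literacy = [3, 4, 2, 5, 4, 3, 5, 2]
satisfaction = [2, 4, 2, 5, 3, 3, 4, 3]
print(round(pearson_r(literacy, satisfaction), 3))  # 0.808
```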

  5. MoCog1: A computer simulation of recognition-primed human decision making, considering emotions

    NASA Technical Reports Server (NTRS)

    Gevarter, William B.

    1992-01-01

    The successful results of the first stage of a research effort to develop a versatile computer model of motivated human cognitive behavior are reported. Most human decision making appears to be an experience-based, relatively straightforward, largely automatic response to situations, utilizing cues and opportunities perceived from the current environment. The development, considering emotions, of the architecture and computer program associated with such 'recognition-primed' decision-making is described. The resultant computer program (MoCog1) was successfully utilized as a vehicle to simulate earlier findings that relate how an individual's implicit theories orient the individual toward particular goals, with resultant cognitions, affects, and behavior in response to their environment.
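    Recognition-primed decision making, as summarized above, amounts to matching perceived cues against stored experience and taking the first action whose pattern fits well enough, rather than deliberating over alternatives. A toy sketch of that loop; the cue patterns, actions, and matching threshold are invented and do not reflect MoCog1's actual architecture:

```python
# Stored experience: cue patterns paired with their typical actions
# (all invented for illustration).
EXPERIENCE = [
    {"cues": {"alarm", "smoke"}, "action": "evacuate"},
    {"cues": {"alarm"}, "action": "investigate"},
]

def decide(perceived, min_overlap=1.0):
    """Return the action of the first stored pattern whose cues are covered
    by the perceived situation; fall back to slower analytical reasoning."""
    for pattern in EXPERIENCE:
        overlap = len(pattern["cues"] & perceived) / len(pattern["cues"])
        if overlap >= min_overlap:
            return pattern["action"]
    return "deliberate"

print(decide({"alarm", "smoke", "night"}))  # evacuate
print(decide({"alarm"}))                    # investigate
print(decide({"rain"}))                     # deliberate
```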

  6. MoCog1: A computer simulation of recognition-primed human decision making

    NASA Technical Reports Server (NTRS)

    Gevarter, William B.

    1991-01-01

    The results of the first stage of a research effort to develop a 'sophisticated' computer model of human cognitive behavior are described. Most human decision making is an experience-based, relatively straight-forward, largely automatic response to internal goals and drives, utilizing cues and opportunities perceived from the current environment. The development of the architecture and computer program (MoCog1) associated with such 'recognition-primed' decision making is discussed. The resultant computer program was successfully utilized as a vehicle to simulate earlier findings that relate how an individual's implicit theories orient the individual toward particular goals, with resultant cognitions, affects, and behavior in response to their environment.

  7. Infrastructure for Training and Partnerships: California Water and Coastal Ocean Resources

    NASA Technical Reports Server (NTRS)

    Siegel, David A.; Dozier, Jeffrey; Gautier, Catherine; Davis, Frank; Dickey, Tommy; Dunne, Thomas; Frew, James; Keller, Arturo; MacIntyre, Sally; Melack, John

    2000-01-01

    The purpose of this project was to advance the existing ICESS/Bren School computing infrastructure to allow scientists, students, and research trainees the opportunity to interact with environmental data and simulations in near-real time. Improvements made with the funding from this project have helped to strengthen the research efforts within both units, fostered graduate research training, and helped fortify partnerships with government and industry. With this funding, we were able to expand our computational environment in which computer resources, software, and data sets are shared by ICESS/Bren School faculty researchers in all areas of Earth system science. All of the graduate and undergraduate students associated with the Donald Bren School of Environmental Science and Management and the Institute for Computational Earth System Science have benefited from the infrastructure upgrades accomplished by this project. Additionally, the upgrades fostered a significant number of research projects (attached is a list of the projects that benefited from the upgrades). As originally proposed, funding for this project provided the following infrastructure upgrades: 1) a modern file management system capable of interoperating UNIX and NT file systems that can scale to 6.7 TB, 2) a Qualstar 40-slot tape library with two AIT tape drives and Legato Networker backup/archive software, 3) previously unavailable import/export capability for data sets on Zip, Jaz, DAT, 8mm, CD, and DLT media in addition to a 622 Mb/s Internet 2 connection, 4) network switches capable of 100 Mbps to 128 desktop workstations, 5) the Portable Batch System (PBS) computational task scheduler, and 6) two Compaq/Digital Alpha XP1000 compute servers, each with 1.5 GB of RAM, along with an SGI Origin 2000 (purchased partially using funds from this project along with funding from various other sources) to be used for very large computations, as required for simulation of mesoscale meteorology or climate.

  8. The impact of computer science in molecular medicine: enabling high-throughput research.

    PubMed

    de la Iglesia, Diana; García-Remesal, Miguel; de la Calle, Guillermo; Kulikowski, Casimir; Sanz, Ferran; Maojo, Víctor

    2013-01-01

    The Human Genome Project and the explosion of high-throughput data have transformed the areas of molecular and personalized medicine, which are producing a wide range of studies and experimental results and providing new insights for developing medical applications. Research in many interdisciplinary fields is resulting in data repositories and computational tools that support a wide diversity of tasks: genome sequencing, genome-wide association studies, analysis of genotype-phenotype interactions, drug toxicity and side effects assessment, prediction of protein interactions and diseases, development of computational models, biomarker discovery, and many others. The authors of the present paper have developed several inventories covering tools, initiatives and studies in different computational fields related to molecular medicine: medical informatics, bioinformatics, clinical informatics and nanoinformatics. With these inventories, created by mining the scientific literature, we have carried out several reviews of these fields, providing researchers with a useful framework to locate, discover, search and integrate resources. In this paper we present an analysis of the state-of-the-art as it relates to computational resources for molecular medicine, based on results compiled in our inventories, as well as results extracted from a systematic review of the literature and other scientific media. The review assesses the impact of the related publications and the available data and software resources for molecular medicine. It aims to provide information that can be useful to support ongoing research and work to improve diagnostics and therapeutics based on molecular-level insights.

  9. Computer and visual display terminals (VDT) vision syndrome (CVDTS).

    PubMed

    Parihar, J K S; Jain, Vaibhav Kumar; Chaturvedi, Piyush; Kaushik, Jaya; Jain, Gunjan; Parihar, Ashwini K S

    2016-07-01

    Computer and visual display terminals have become an essential part of the modern lifestyle, simplifying household work as well as office tasks. However, prolonged use of these devices is not without complications. Computer and visual display terminal syndrome is a constellation of ocular and extraocular symptoms associated with prolonged use of visual display terminals. The syndrome is gaining importance in the modern era because of the widespread use of these technologies in day-to-day life. It is associated with asthenopic symptoms, visual blurring, dry eyes, musculoskeletal symptoms such as neck pain, back pain, shoulder pain, and carpal tunnel syndrome, psychosocial factors, venous thromboembolism, shoulder tendonitis, and elbow epicondylitis. Proper identification of symptoms and causative factors is necessary for accurate diagnosis and management. This article focuses on the various aspects of the syndrome described in the previous literature. Further research is needed for a better understanding of its complex pathophysiology and management.

  10. The Effort to Reduce a Muscle Fatigue Through Gymnastics Relaxation and Ergonomic Approach for Computer Users in Central Building State University of Medan

    NASA Astrophysics Data System (ADS)

    Gultom, Syamsul; Darma Sitepu, Indra; Hasibuan, Nurman

    2018-03-01

    Fatigue due to long and continuous computer usage can lead to problems of dominant fatigue associated with decreased performance and work motivation. Specific targets achieved in the first phase of this research were: (1) identifying complaints of workers using computers, using the Bourdon Wiersma test kit, and (2) finding the right relaxation and work-posture design as a solution to reduce muscle fatigue in computer-based workers. The study used a research and development method, which aims to produce new products or refine existing ones. The final products are a prototype back-holder, a monitor filter, and a relaxation exercise routine, together with a manual on how to perform it while in front of the computer, to lower the fatigue level of computer users in Unimed's Administration Center. In the first phase, observations and interviews were conducted to identify the fatigue level of employees using computers at Unimed's Administration Center with the Bourdon Wiersma test, with the following results: (1) the average velocity time of respondents in BAUK, BAAK, and BAPSI after working, with a speed interpretation value of 8.4 (WS 13), was in a good enough category; (2) the average accuracy of respondents in BAUK, BAAK, and BAPSI after working, with an accuracy interpretation value of 5.5 (WS 8), was in the doubtful category, showing that computer users at Unimed's Administration Center experienced significant tiredness; (3) the consistency of the fatigue-level measurements of computer users at Unimed's Administration Center after working, with an interpretation value of 5.5 (WS 8), was also in the doubtful category, which means that computer users at Unimed's Administration Center suffered extreme fatigue.
In phase II, based on the results of the first phase, the researchers offer solutions: the back-holder prototype, the monitor filter, and a properly designed relaxation exercise to reduce the fatigue level. Furthermore, to maximize the exercise itself, a manual will be given to employees who regularly work in front of computers at Unimed's Administration Center.

  11. Computational dynamic approaches for temporal omics data with applications to systems medicine.

    PubMed

    Liang, Yulan; Kelemen, Arpad

    2017-01-01

    Modeling and predicting biological dynamic systems while simultaneously estimating their kinetic structural and functional parameters is extremely important in systems and computational biology. It is key to understanding the complexity of human health, drug response, disease susceptibility, and pathogenesis for systems medicine. Temporal omics data, used to measure dynamic biological systems, are essential for discovering complex biological interactions and clinical mechanisms and causation. However, delineating the possible associations and causalities of genes, proteins, metabolites, cells, and other biological entities from high-throughput time-course omics data is challenging, and conventional experimental techniques are not suited to it in the big omics era. In this paper, we present various recently developed dynamic trajectory and causal network approaches for temporal omics data, which are extremely useful for researchers who want to start working in this challenging research area. Moreover, we present applications to various biological systems, health conditions, and disease statuses, along with examples that summarize state-of-the-art performance on different specific mining tasks. We critically discuss the merits, drawbacks, and limitations of the approaches, and the associated main challenges for the years ahead. The most recent computing tools and software for analyzing specific problem types, associated platform resources, and other potentials of the dynamic trajectory and interaction methods are also presented and discussed in detail.
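
    As a toy illustration of the kind of dynamic modeling and kinetic-parameter estimation described above (hypothetical, not from the paper), the sketch below integrates a one-variable decay model dx/dt = -k*x with Euler steps, then recovers k from the simulated time course by a log-linear least-squares fit:

```python
# Toy example (not from the paper): simulate dx/dt = -k*x and
# estimate the kinetic parameter k back from the trajectory.
import math

def simulate(k, x0, dt, steps):
    """Euler integration of dx/dt = -k*x."""
    xs = [x0]
    for _ in range(steps):
        xs.append(xs[-1] + dt * (-k * xs[-1]))
    return xs

def estimate_k(xs, dt):
    """Least-squares slope of log(x) versus time estimates -k."""
    ts = [i * dt for i in range(len(xs))]
    ys = [math.log(x) for x in xs]
    n = len(ts)
    tbar, ybar = sum(ts) / n, sum(ys) / n
    slope = (sum((t - tbar) * (y - ybar) for t, y in zip(ts, ys))
             / sum((t - tbar) ** 2 for t in ts))
    return -slope

traj = simulate(k=0.5, x0=10.0, dt=0.01, steps=200)
k_hat = estimate_k(traj, dt=0.01)   # close to the true k = 0.5
```

    Real temporal omics data pose the same inverse problem at far higher dimension, with noise and hidden states, which is why the dynamic trajectory and causal network methods surveyed here are needed.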

  12. Security Approaches in Using Tablet Computers for Primary Data Collection in Clinical Research

    PubMed Central

    Wilcox, Adam B.; Gallagher, Kathleen; Bakken, Suzanne

    2013-01-01

    Next-generation tablets (iPads and Android tablets) may potentially improve the collection and management of clinical research data. The widespread adoption of tablets, coupled with decreased software and hardware costs, has led to increased consideration of tablets for primary research data collection. When using tablets for the Washington Heights/Inwood Infrastructure for Comparative Effectiveness Research (WICER) project, we found that the devices give rise to inherent security issues associated with the potential use of cloud-based data storage approaches. This paper identifies and describes major security considerations for primary data collection with tablets; proposes a set of architectural strategies for implementing data collection forms with tablet computers; and discusses the security, cost, and workflow of each strategy. The paper briefly reviews the strategies with respect to their implementation for three primary data collection activities for the WICER project. PMID:25848559

  13. Security approaches in using tablet computers for primary data collection in clinical research.

    PubMed

    Wilcox, Adam B; Gallagher, Kathleen; Bakken, Suzanne

    2013-01-01

    Next-generation tablets (iPads and Android tablets) may potentially improve the collection and management of clinical research data. The widespread adoption of tablets, coupled with decreased software and hardware costs, has led to increased consideration of tablets for primary research data collection. When using tablets for the Washington Heights/Inwood Infrastructure for Comparative Effectiveness Research (WICER) project, we found that the devices give rise to inherent security issues associated with the potential use of cloud-based data storage approaches. This paper identifies and describes major security considerations for primary data collection with tablets; proposes a set of architectural strategies for implementing data collection forms with tablet computers; and discusses the security, cost, and workflow of each strategy. The paper briefly reviews the strategies with respect to their implementation for three primary data collection activities for the WICER project.

  14. Field testing of eco-speed control using V2I communication.

    DOT National Transportation Integrated Search

    2016-04-15

    This research focused on the development of an Eco-Cooperative Adaptive Cruise Control (Eco-CACC) system and addressed the implementation issues associated with applying it in the field. The Eco-CACC system computes and recommends a fuel-efficient...

  15. Bounding the electrostatic free energies associated with linear continuum models of molecular solvation.

    PubMed

    Bardhan, Jaydeep P; Knepley, Matthew G; Anitescu, Mihai

    2009-03-14

    The importance of electrostatic interactions in molecular biology has driven extensive research toward the development of accurate and efficient theoretical and computational models. Linear continuum electrostatic theory has been surprisingly successful, but the computational costs associated with solving the associated partial differential equations (PDEs) preclude the theory's use in most dynamical simulations. Modern generalized-Born models for electrostatics can reproduce PDE-based calculations to within a few percent and are extremely computationally efficient but do not always faithfully reproduce interactions between chemical groups. Recent work has shown that a boundary-integral-equation formulation of the PDE problem leads naturally to a new approach called boundary-integral-based electrostatics estimation (BIBEE) to approximate electrostatic interactions. In the present paper, we prove that the BIBEE method can be used to rigorously bound the actual continuum-theory electrostatic free energy. The bounds are validated using a set of more than 600 proteins. Detailed numerical results are presented for structures of the peptide met-enkephalin taken from a molecular-dynamics simulation. These bounds, in combination with our demonstration that the BIBEE methods accurately reproduce pairwise interactions, suggest a new approach toward building a highly accurate yet computationally tractable electrostatic model.
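
    For readers new to linear continuum electrostatics, the textbook special case underlying these models is the Born model for a single spherical ion; the sketch below (illustrative only, not part of the BIBEE method itself) computes its solvation free energy:

```python
# Illustrative only: the Born model is the simplest linear continuum
# electrostatics result; BIBEE bounds the general boundary-integral
# problem of which this is the one-sphere special case.
def born_solvation_energy(q, a, eps):
    """Born-model electrostatic solvation free energy (Gaussian units):
    dG = -(q^2 / 2a) * (1 - 1/eps).
    Negative: transfer of the ion into the solvent is favorable."""
    return -(q * q / (2.0 * a)) * (1.0 - 1.0 / eps)

# A unit charge of radius 2 in a water-like dielectric (eps ~ 80):
dG = born_solvation_energy(q=1.0, a=2.0, eps=80.0)
```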

  16. Bounding the electrostatic free energies associated with linear continuum models of molecular solvation

    NASA Astrophysics Data System (ADS)

    Bardhan, Jaydeep P.; Knepley, Matthew G.; Anitescu, Mihai

    2009-03-01

    The importance of electrostatic interactions in molecular biology has driven extensive research toward the development of accurate and efficient theoretical and computational models. Linear continuum electrostatic theory has been surprisingly successful, but the computational costs associated with solving the associated partial differential equations (PDEs) preclude the theory's use in most dynamical simulations. Modern generalized-Born models for electrostatics can reproduce PDE-based calculations to within a few percent and are extremely computationally efficient but do not always faithfully reproduce interactions between chemical groups. Recent work has shown that a boundary-integral-equation formulation of the PDE problem leads naturally to a new approach called boundary-integral-based electrostatics estimation (BIBEE) to approximate electrostatic interactions. In the present paper, we prove that the BIBEE method can be used to rigorously bound the actual continuum-theory electrostatic free energy. The bounds are validated using a set of more than 600 proteins. Detailed numerical results are presented for structures of the peptide met-enkephalin taken from a molecular-dynamics simulation. These bounds, in combination with our demonstration that the BIBEE methods accurately reproduce pairwise interactions, suggest a new approach toward building a highly accurate yet computationally tractable electrostatic model.

  17. geneGIS: Computational Tools for Spatial Analyses of DNA Profiles with Associated Photo-Identification and Telemetry Records of Marine Mammals

    DTIC Science & Technology

    2012-09-30

    computational tools provide the ability to display, browse, select, filter and summarize spatio-temporal relationships of these individual-based...her research assistant at Esri, Shaun Walbridge, and members of the Marine Mammal Institute (MMI), including Tomas Follet and Debbie Steel. This...Genomics Laboratory, MMI, OSU. As part of the geneGIS initiative, these SPLASH photo-identification records and the geneSPLASH DNA profiles

  18. Guidelines for developing vectorizable computer programs

    NASA Technical Reports Server (NTRS)

    Miner, E. W.

    1982-01-01

    Some fundamental principles for developing computer programs which are compatible with array-oriented computers are presented. The emphasis is on basic techniques for structuring computer codes which are applicable in FORTRAN and do not require a special programming language or exact a significant penalty on a scalar computer. Researchers who are using numerical techniques to solve problems in engineering can apply these basic principles and thus develop transportable computer programs (in FORTRAN) which contain much vectorizable code. The vector architecture of the ASC is discussed so that the requirements of array processing can be better appreciated. The "vectorization" of a finite-difference viscous shock-layer code is used as an example to illustrate the benefits and some of the difficulties involved. Increases in computing speed with vectorization are illustrated with results from the viscous shock-layer code and from a finite-element shock tube code. The applicability of these principles was substantiated through running programs on other computers with array-associated computing characteristics, such as the Hewlett-Packard (H-P) 1000-F.
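
    The report's central guideline, that loops whose iterations are independent can be vectorized while loop-carried dependencies cannot, holds outside FORTRAN as well; the Python sketch below (an illustration of the principle, not code from the report) contrasts the two cases:

```python
# Illustration of the vectorization guideline (not from the report):
# independent iterations vectorize; recurrences do not.

def saxpy(a, x, y):
    """Element-wise a*x[i] + y[i]: every iteration is independent,
    so a vectorizing compiler (or an array processor) can compute
    all elements simultaneously."""
    return [a * xi + yi for xi, yi in zip(x, y)]

def prefix_sum(x):
    """out[i] depends on out[i-1]: a loop-carried dependency (recurrence),
    NOT directly vectorizable without restructuring the algorithm
    (e.g. into a parallel scan)."""
    out = []
    total = 0.0
    for xi in x:
        total += xi
        out.append(total)
    return out
```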

  19. A Policy Assessment of Priorities and Functional Needs for the Military Computer-Assisted Instruction Terminal

    DTIC Science & Technology

    1975-12-01

    ceases to act as a testing ground for determining user needs when demand is low relative to high initial development costs. Customers are forced to...ANASTASIO Associate Director Data Analysis Research Division Educational Testing Service Princeton, NJ 08540 (609)921-9000 Director of Educational...technology and educational psychology - Indiana University. Research interestes in adaptive, interactive instructional systems. Management

  20. PREDICTORS OF COMPUTER USE IN COMMUNITY-DWELLING ETHNICALLY DIVERSE OLDER ADULTS

    PubMed Central

    Werner, Julie M.; Carlson, Mike; Jordan-Marsh, Maryalice; Clark, Florence

    2011-01-01

    Objective: In this study we analyzed self-reported computer use, demographic variables, psychosocial variables, and health and well-being variables collected from 460 ethnically diverse, community-dwelling elders in order to investigate the relationship computer use has with demographics, well-being and other key psychosocial variables in older adults. Background: Although younger elders with more education, those who employ active coping strategies, or those who are low in anxiety levels are thought to use computers at higher rates than others, previous research has produced mixed or inconclusive results regarding ethnic, gender, and psychological factors, or has concentrated on computer-specific psychological factors only (e.g., computer anxiety). Few such studies have employed large sample sizes or have focused on ethnically diverse populations of community-dwelling elders. Method: With a large number of overlapping predictors, zero-order analysis alone is poorly equipped to identify variables that are independently associated with computer use. Accordingly, both zero-order and stepwise logistic regression analyses were conducted to determine the correlates of two types of computer use: email and general computer use. Results: Younger age, greater level of education, non-Hispanic ethnicity, behaviorally active coping style, general physical health, and role-related emotional health each independently predicted computer usage. Conclusion: Study findings highlight differences in computer usage, especially in regard to Hispanic ethnicity and specific health and well-being factors. Application: Potential applications of this research include future intervention studies, individualized computer-based activity programming, or customizable software and user interface design for older adults responsive to a variety of personal characteristics and capabilities. PMID:22046718

  1. Predictors of computer use in community-dwelling, ethnically diverse older adults.

    PubMed

    Werner, Julie M; Carlson, Mike; Jordan-Marsh, Maryalice; Clark, Florence

    2011-10-01

    In this study, we analyzed self-reported computer use, demographic variables, psychosocial variables, and health and well-being variables collected from 460 ethnically diverse, community-dwelling elders to investigate the relationship computer use has with demographics, well-being, and other key psychosocial variables in older adults. Although younger elders with more education, those who employ active coping strategies, or those who are low in anxiety levels are thought to use computers at higher rates than do others, previous research has produced mixed or inconclusive results regarding ethnic, gender, and psychological factors or has concentrated on computer-specific psychological factors only (e.g., computer anxiety). Few such studies have employed large sample sizes or have focused on ethnically diverse populations of community-dwelling elders. With a large number of overlapping predictors, zero-order analysis alone is poorly equipped to identify variables that are independently associated with computer use. Accordingly, both zero-order and stepwise logistic regression analyses were conducted to determine the correlates of two types of computer use: e-mail and general computer use. Results indicate that younger age, greater level of education, non-Hispanic ethnicity, behaviorally active coping style, general physical health, and role-related emotional health each independently predicted computer usage. Study findings highlight differences in computer usage, especially in regard to Hispanic ethnicity and specific health and well-being factors. Potential applications of this research include future intervention studies, individualized computer-based activity programming, or customizable software and user interface design for older adults responsive to a variety of personal characteristics and capabilities.
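
    As a minimal sketch of the multivariate modeling behind such analyses (synthetic data and parameters, not the study's code or dataset), the example below fits a logistic regression by gradient descent for a single hypothetical predictor in which lower values are associated with the outcome:

```python
# Hypothetical sketch: a minimal logistic regression fit by gradient
# descent, the kind of multivariate model that stepwise procedures
# build up predictor by predictor. Data below are synthetic.
import math

def fit_logistic(X, y, lr=0.1, epochs=2000):
    """Fit logistic regression by gradient descent on the log-loss.
    Returns weights [intercept, w1, w2, ...]."""
    w = [0.0] * (len(X[0]) + 1)
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
            p = 1.0 / (1.0 + math.exp(-z))
            err = p - yi                  # d(log-loss)/dz
            w[0] -= lr * err
            for j, xj in enumerate(xi):
                w[j + 1] -= lr * err * xj
    return w

def predict(w, xi):
    """Predicted probability of the outcome for one observation."""
    z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
    return 1.0 / (1.0 + math.exp(-z))

# One standardized predictor (think "age"); lower values -> outcome = 1.
X = [[-1.0], [-0.8], [-0.5], [0.4], [0.7], [1.0]]
y = [1, 1, 1, 0, 0, 0]
w = fit_logistic(X, y)
```

    A stepwise procedure would repeat such fits while adding or dropping predictors, keeping only those that remain independently associated with the outcome.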

  2. Waggle: A Framework for Intelligent Attentive Sensing and Actuation

    NASA Astrophysics Data System (ADS)

    Sankaran, R.; Jacob, R. L.; Beckman, P. H.; Catlett, C. E.; Keahey, K.

    2014-12-01

    Advances in sensor-driven computation and computationally steered sensing will greatly enable future research in fields including environmental and atmospheric sciences. We will present "Waggle," an open-source hardware and software infrastructure developed with two goals: (1) reducing the separation and latency between sensing and computing and (2) improving the reliability and longevity of sensing-actuation platforms in challenging and costly deployments. Inspired by "deep-space probe" systems, the Waggle platform design includes features that can support longitudinal studies, deployments with varying communication links, and remote management capabilities. Waggle lowers the barrier for scientists to incorporate real-time data from their sensors into their computations and to manipulate the sensors or provide feedback through actuators. A standardized software and hardware design allows quick addition of new sensors/actuators and associated software in the nodes and enables them to be coupled with computational codes both in situ and on external compute infrastructure. The Waggle framework currently drives the deployment of two observational systems: a portable and self-sufficient weather platform for study of small-scale effects in Chicago's urban core and an open-ended distributed instrument in Chicago that aims to support several research pursuits across a broad range of disciplines including urban planning, microbiology, and computer science. Built around open-source software, hardware, and the Linux OS, the Waggle system comprises two components: the Waggle field-node and the Waggle cloud-computing infrastructure. The Waggle field-node affords a modular, scalable, fault-tolerant, secure, and extensible platform for hosting sensors and actuators in the field. It supports in situ computation and data storage, and integration with cloud-computing infrastructure.
The Waggle cloud infrastructure is designed with the goal of scaling to several hundreds of thousands of Waggle nodes. It supports aggregating data from sensors hosted by the nodes, staging computation, relaying feedback to the nodes and serving data to end-users. We will discuss the Waggle design principles and their applicability to various observational research pursuits, and demonstrate its capabilities.

  3. United States Air Force Summer Faculty Research Program (1983). Program Management Report.

    DTIC Science & Technology

    1983-12-01

    845-5011 Dr. John Eoll Degree: Ph.D., Astrophysics, 1976 Assistant Professor Specialty: Radiation Transport, Fluid Lenoir-Rhyne College Dynamics...Applications Newark, DE 19711 Assigned: RADC (302) 738-8173 Dr. Gregory Jones Degree: Ph.D., Mathematics, 1972 Associate Professor Specialty: Computability...1965 Associate Professor Specialty: Magnetic Resonance, University of Dayton Transport Properties Physics Department Assigned: ML Dayton, OH 45469

  4. Multi-modality imaging: Bird's-eye view from the 2014 American Heart Association Scientific Sessions.

    PubMed

    AlJaroudi, Wael A; Einstein, Andrew J; Chaudhry, Farooq A; Lloyd, Steven G; Hage, Fadi G

    2015-04-01

    A large number of studies were presented at the 2014 American Heart Association Scientific Sessions. In this review, we will summarize key studies in nuclear cardiology, computed tomography, echocardiography, and cardiac magnetic resonance imaging. This brief review will be helpful for readers of the Journal who are interested in being updated on the latest research covering these imaging modalities.

  5. Technologies for Large Data Management in Scientific Computing

    NASA Astrophysics Data System (ADS)

    Pace, Alberto

    2014-01-01

    In recent years, intense usage of computing has been the main strategy of investigation in several scientific research projects. Progress in computing technology has opened unprecedented opportunities for systematic collection of experimental data and the associated analysis that were considered impossible only a few years ago. This paper focuses on the strategies in use: it reviews the various components that are necessary for an effective solution that ensures the storage, long-term preservation, and worldwide distribution of the large quantities of data that are necessary in a large scientific research project. The paper also mentions several examples of data management solutions used in High Energy Physics for the CERN Large Hadron Collider (LHC) experiments in Geneva, Switzerland, which generate more than 30,000 terabytes of data every year that need to be preserved, analyzed, and made available to a community of several tens of thousands of scientists worldwide.

  6. Discovery of the Kalman filter as a practical tool for aerospace and industry

    NASA Technical Reports Server (NTRS)

    Mcgee, L. A.; Schmidt, S. F.

    1985-01-01

    The sequence of events which led the researchers at Ames Research Center to the early discovery of the Kalman filter shortly after its introduction into the literature is recounted. The scientific breakthroughs and reformulations that were necessary to transform Kalman's work into a useful tool for a specific aerospace application are described. The resulting extended Kalman filter, as it is now known, is often still referred to simply as the Kalman filter. As the filter's use gained in popularity in the scientific community, the problems of implementation on small spaceborne and airborne computers led to a square-root formulation of the filter to overcome numerical difficulties associated with computer word length. The work that led to this new formulation is also discussed, including the first airborne computer implementation and flight test. Since then the applications of the extended and square-root formulations of the Kalman filter have grown rapidly throughout the aerospace industry.
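
    For readers unfamiliar with the filter itself, a scalar predict/update cycle is sketched below (a minimal illustration, not the Ames formulation); the square-root variants mentioned above propagate a factor of the covariance P rather than P itself, which avoids the loss of positive-definiteness that short computer word lengths can cause:

```python
# Minimal illustrative sketch of one scalar Kalman filter cycle
# (not the Ames implementation).
def kalman_step(x, P, z, q, r, a=1.0, h=1.0):
    """One predict/update cycle.
    x, P : prior state estimate and its variance
    z    : new measurement
    q, r : process / measurement noise variances
    a, h : state-transition and measurement scalars."""
    # Predict: propagate the state and inflate uncertainty by q.
    x_pred = a * x
    P_pred = a * P * a + q
    # Update: blend prediction and measurement via the Kalman gain.
    K = P_pred * h / (h * P_pred * h + r)
    x_new = x_pred + K * (z - h * x_pred)
    P_new = (1.0 - K * h) * P_pred
    return x_new, P_new

x, P = 0.0, 1.0                                   # vague initial estimate
x, P = kalman_step(x, P, z=1.2, q=0.01, r=0.5)    # one measurement
```

    After the update, the estimate has moved toward the measurement and the variance P has shrunk; iterating the cycle over a measurement stream is the whole filter.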

  7. Functional Assays to Screen and Dissect Genomic Hits: Doubling Down on the National Investment in Genomic Research.

    PubMed

    Musunuru, Kiran; Bernstein, Daniel; Cole, F Sessions; Khokha, Mustafa K; Lee, Frank S; Lin, Shin; McDonald, Thomas V; Moskowitz, Ivan P; Quertermous, Thomas; Sankaran, Vijay G; Schwartz, David A; Silverman, Edwin K; Zhou, Xiaobo; Hasan, Ahmed A K; Luo, Xiao-Zhong James

    2018-04-01

    The National Institutes of Health have made substantial investments in genomic studies and technologies to identify DNA sequence variants associated with human disease phenotypes. The National Heart, Lung, and Blood Institute has been at the forefront of these commitments to ascertain genetic variation associated with heart, lung, blood, and sleep diseases and related clinical traits. Genome-wide association studies, exome- and genome-sequencing studies, and exome-genotyping studies of the National Heart, Lung, and Blood Institute-funded epidemiological and clinical case-control studies are identifying large numbers of genetic variants associated with heart, lung, blood, and sleep phenotypes. However, investigators face challenges in identification of genomic variants that are functionally disruptive among the myriad of computationally implicated variants. Studies to define mechanisms of genetic disruption encoded by computationally identified genomic variants require reproducible, adaptable, and inexpensive methods to screen candidate variant and gene function. High-throughput strategies will permit a tiered variant discovery and genetic mechanism approach that begins with rapid functional screening of a large number of computationally implicated variants and genes for discovery of those that merit mechanistic investigation. As such, improved variant-to-gene and gene-to-function screens, and adequate support for such studies, are critical to accelerating the translation of genomic findings. In this White Paper, we outline the variety of novel technologies, assays, and model systems that are making such screens faster, cheaper, and more accurate, referencing published work and ongoing work supported by the National Heart, Lung, and Blood Institute's R21/R33 Functional Assays to Screen Genomic Hits program. We discuss priorities that can accelerate the impressive but incomplete progress represented by big data genomic research.

  8. Oncologists partner with Watson on genomics.

    PubMed

    2015-08-01

    A new collaboration between IBM Watson Health and more than a dozen cancer centers uses the power of cognitive computing to dramatically reduce the time it takes to analyze data from patients' DNA and identify targeted treatment options.

  9. Research into Queueing Network Theory.

    DTIC Science & Technology

    1977-09-01

    and Zeigler, B. (1975) "Equilibrium properties of arbitrarily interconnected queueing networks," Tech. Report 75-4, Computer and Communication...Associate. The project was extremely fortunate to secure the services of Dr. Wendel. Dr. Wendel was a project member for one month in the summer of

  10. Bionimbus: a cloud for managing, analyzing and sharing large genomics datasets.

    PubMed

    Heath, Allison P; Greenway, Matthew; Powell, Raymond; Spring, Jonathan; Suarez, Rafael; Hanley, David; Bandlamudi, Chai; McNerney, Megan E; White, Kevin P; Grossman, Robert L

    2014-01-01

    As large genomics and phenotypic datasets are becoming more common, it is increasingly difficult for most researchers to access, manage, and analyze them. One possible approach is to provide the research community with several petabyte-scale cloud-based computing platforms containing these data, along with tools and resources to analyze it. Bionimbus is an open source cloud-computing platform that is based primarily upon OpenStack, which manages on-demand virtual machines that provide the required computational resources, and GlusterFS, which is a high-performance clustered file system. Bionimbus also includes Tukey, which is a portal, and associated middleware that provides a single entry point and a single sign on for the various Bionimbus resources; and Yates, which automates the installation, configuration, and maintenance of the software infrastructure required. Bionimbus is used by a variety of projects to process genomics and phenotypic data. For example, it is used by an acute myeloid leukemia resequencing project at the University of Chicago. The project requires several computational pipelines, including pipelines for quality control, alignment, variant calling, and annotation. For each sample, the alignment step requires eight CPUs for about 12 h. BAM file sizes ranged from 5 GB to 10 GB for each sample. Most members of the research community have difficulty downloading large genomics datasets and obtaining sufficient storage and computer resources to manage and analyze the data. Cloud computing platforms, such as Bionimbus, with data commons that contain large genomics datasets, are one choice for broadening access to research data in genomics. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
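The per-sample figures quoted in this record (8 CPUs for about 12 h per alignment, 5-10 GB per BAM) support a simple back-of-envelope capacity estimate; a minimal sketch, in which the 100-sample cohort size is a hypothetical and not a figure from the paper:

```python
# Figures below are taken from the abstract; cohort size is invented.
CPUS_PER_SAMPLE = 8      # alignment uses 8 CPUs per sample
HOURS_PER_SAMPLE = 12    # ... for about 12 hours
BAM_GB_RANGE = (5, 10)   # BAM files ranged from 5 GB to 10 GB per sample

def estimate(n_samples):
    """Rough CPU-hour and BAM-storage estimate for an alignment run."""
    cpu_hours = n_samples * CPUS_PER_SAMPLE * HOURS_PER_SAMPLE
    storage_gb = (n_samples * BAM_GB_RANGE[0], n_samples * BAM_GB_RANGE[1])
    return cpu_hours, storage_gb

cpu_hours, (gb_lo, gb_hi) = estimate(100)  # hypothetical 100-sample cohort
print(cpu_hours, gb_lo, gb_hi)  # 9600 CPU-hours, 500-1000 GB of BAM files
```

Estimates of this kind are what motivate moving such workloads onto on-demand cloud platforms rather than individual lab machines.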

  11. Description of 'REQUEST-KYUSHYU' for KYUKEICHO regional data base

    NASA Astrophysics Data System (ADS)

    Takimoto, Shin'ichi

    The Kyushu Economic Research Association (an incorporated foundation) recently launched its regional database service, 'REQUEST-Kyushu'. It is a full-scale database compiled from the information and know-how that the Association has accumulated over forty years. It comprises a regional information database of journal and newspaper articles and a statistical information database of economic statistics. The former is searched on a personal computer, with search results (original text) delivered by facsimile; the latter is also searched on a personal computer, where the data can be processed, edited, or downloaded. This paper describes the characteristics, content, and system outline of 'REQUEST-Kyushu'.

  12. Engagement, Persistence, and Gender in Computer Science: Results of a Smartphone ESM Study.

    PubMed

    Milesi, Carolina; Perez-Felkner, Lara; Brown, Kevin; Schneider, Barbara

    2017-01-01

    While the underrepresentation of women in the fast-growing STEM field of computer science (CS) has been much studied, no consensus exists on the key factors influencing this widening gender gap. Possible suspects include gender differences in aptitude, interest, and academic environment. Our study contributes to this literature by applying student engagement research to study the experiences of college students studying CS, to assess the degree to which differences in men's and women's engagement may help account for gender inequity in the field. Specifically, we use the Experience Sampling Method (ESM) to evaluate in real time the engagement of college students during varied activities and environments. Over the course of a full week in the fall semester and a full week in the spring semester, 165 students majoring in CS at two Research I universities were "beeped" several times a day via a smartphone app prompting them to fill out a short questionnaire including open-ended and scaled items. These responses were paired with administrative data and over 2 years of transcript data provided by their institutions. We used mean comparisons and logistic regression analysis to compare enrollment and persistence patterns among CS men and women. Results suggest that despite the obstacles associated with women's underrepresentation in computer science, women were more likely to continue taking computer science courses when they felt challenged and skilled in their initial computer science classes. We discuss implications for further research.
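As a rough illustration of the kind of logistic-regression analysis this record describes (modeling persistence from engagement measures), here is a plain gradient-descent sketch; the data and variable names are invented for illustration and are not the study's dataset or software:

```python
import math

def fit_logistic(X, y, lr=0.5, steps=2000):
    """Plain stochastic-gradient logistic regression (illustrative only).

    Returns weights [bias, w1, w2, ...] maximizing the log-likelihood of
    binary outcomes y given feature rows X.
    """
    w = [0.0] * (len(X[0]) + 1)
    for _ in range(steps):
        for xi, yi in zip(X, y):
            z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
            p = 1.0 / (1.0 + math.exp(-z))   # predicted persistence probability
            err = yi - p
            w[0] += lr * err
            for j, xj in enumerate(xi):
                w[j + 1] += lr * err * xj
    return w

# Hypothetical averaged ESM responses (felt_challenged, felt_skilled, 0-1 scale)
# and whether the student kept taking CS courses (1 = persisted).
X = [[0.9, 0.8], [0.8, 0.7], [0.7, 0.9], [0.6, 0.6],
     [0.3, 0.2], [0.2, 0.4], [0.4, 0.3], [0.1, 0.2]]
y = [1, 1, 1, 1, 0, 0, 0, 0]
w = fit_logistic(X, y)
# Positive weights on both engagement measures: higher reported challenge and
# skill are associated with higher predicted odds of persisting.
```

In practice such analyses would be run in a statistics package with controls and standard errors; this sketch only shows the core model shape.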

  13. The Unified Medical Language System: an informatics research collaboration.

    PubMed

    Humphreys, B L; Lindberg, D A; Schoolman, H M; Barnett, G O

    1998-01-01

    In 1986, the National Library of Medicine (NLM) assembled a large multidisciplinary, multisite team to work on the Unified Medical Language System (UMLS), a collaborative research project aimed at reducing fundamental barriers to the application of computers to medicine. Beyond its tangible products, the UMLS Knowledge Sources, and its influence on the field of informatics, the UMLS project is an interesting case study in collaborative research and development. It illustrates the strengths and challenges of substantive collaboration among widely distributed research groups. Over the past decade, advances in computing and communications have minimized the technical difficulties associated with UMLS collaboration and also facilitated the development, dissemination, and use of the UMLS Knowledge Sources. The spread of the World Wide Web has increased the visibility of the information access problems caused by multiple vocabularies and many information sources which are the focus of UMLS work. The time is propitious for building on UMLS accomplishments and making more progress on the informatics research issues first highlighted by the UMLS project more than 10 years ago.

  14. Collaborative mining and interpretation of large-scale data for biomedical research insights.

    PubMed

    Tsiliki, Georgia; Karacapilidis, Nikos; Christodoulou, Spyros; Tzagarakis, Manolis

    2014-01-01

    Biomedical research is becoming increasingly interdisciplinary and collaborative in nature. Researchers need to efficiently and effectively collaborate and make decisions by meaningfully assembling, mining, and analyzing available large-scale volumes of complex multi-faceted data residing in different sources. In line with research findings revealing that, in spite of recent advances in data mining and computational analysis, humans can easily detect patterns that computer algorithms may have difficulty in finding, this paper reports on the practical use of an innovative web-based collaboration support platform in a biomedical research context. Arguing that dealing with data-intensive and cognitively complex settings is not a technical problem alone, the proposed platform adopts a hybrid approach that builds on the synergy between machine and human intelligence to facilitate the underlying sense-making and decision-making processes. User experience shows that the platform enables more informed and quicker decisions by displaying aggregated information according to users' needs, while also exploiting the associated human intelligence.

  15. Collaborative Mining and Interpretation of Large-Scale Data for Biomedical Research Insights

    PubMed Central

    Tsiliki, Georgia; Karacapilidis, Nikos; Christodoulou, Spyros; Tzagarakis, Manolis

    2014-01-01

    Biomedical research is becoming increasingly interdisciplinary and collaborative in nature. Researchers need to efficiently and effectively collaborate and make decisions by meaningfully assembling, mining, and analyzing available large-scale volumes of complex multi-faceted data residing in different sources. In line with research findings revealing that, in spite of recent advances in data mining and computational analysis, humans can easily detect patterns that computer algorithms may have difficulty in finding, this paper reports on the practical use of an innovative web-based collaboration support platform in a biomedical research context. Arguing that dealing with data-intensive and cognitively complex settings is not a technical problem alone, the proposed platform adopts a hybrid approach that builds on the synergy between machine and human intelligence to facilitate the underlying sense-making and decision-making processes. User experience shows that the platform enables more informed and quicker decisions by displaying aggregated information according to users' needs, while also exploiting the associated human intelligence. PMID:25268270

  16. Association between screen viewing duration and sleep duration, sleep quality, and excessive daytime sleepiness among adolescents in Hong Kong.

    PubMed

    Mak, Yim Wah; Wu, Cynthia Sau Ting; Hui, Donna Wing Shun; Lam, Siu Ping; Tse, Hei Yin; Yu, Wing Yan; Wong, Ho Ting

    2014-10-28

    Screen viewing is considered to have adverse impacts on the sleep of adolescents. Although there has been a considerable amount of research on the association between screen viewing and sleep, most studies have focused on specific types of screen viewing devices, such as televisions and computers. The present study investigated how long currently prevalent screen viewing devices (including televisions, personal computers, mobile phones, and portable video devices) are viewed in relation to sleep duration, sleep quality, and daytime sleepiness among Hong Kong adolescents (N = 762). Television and computer viewing remain prevalent, but were not correlated with sleep variables. Mobile phone viewing was correlated with all sleep variables, while portable video device viewing was correlated only with daytime sleepiness. The results demonstrated a trend of increase in the prevalence and types of screen viewing and their effects on the sleep patterns of adolescents.

  17. Do Quantitative Measures of Research Productivity Correlate with Academic Rank in Oral and Maxillofacial Surgery?

    PubMed

    Susarla, Srinivas M; Dodson, Thomas B; Lopez, Joseph; Swanson, Edward W; Calotta, Nicholas; Peacock, Zachary S

    2015-08-01

    Academic promotion is linked to research productivity. The purpose of this study was to assess the correlation between quantitative measures of academic productivity and academic rank among academic oral and maxillofacial surgeons. This was a cross-sectional study of full-time academic oral and maxillofacial surgeons in the United States. The predictor variables were categorized as demographic (gender, medical degree, research doctorate, other advanced degree) and quantitative measures of academic productivity (total number of publications, total number of citations, maximum number of citations for a single article, i10-index [number of publications with ≥ 10 citations], and h-index [number of publications h with ≥ h citations each]). The outcome variable was current academic rank (instructor, assistant professor, associate professor, professor, or endowed professor). Descriptive, bivariate, and multiple regression statistics were computed to evaluate associations between the predictors and academic rank. Receiver operating characteristic curves were computed to identify thresholds for academic promotion. The sample consisted of 324 academic oral and maxillofacial surgeons, of whom 11.7% were female, 40% had medical degrees, and 8% had research doctorates. The h-index was the most strongly correlated with academic rank (ρ = 0.62, p < 0.001). H-indexes of ≥ 4, ≥ 8, and ≥ 13 were identified as thresholds for promotion to associate professor, professor, and endowed professor, respectively (p < 0.001). This study found that the h-index was strongly correlated with academic rank among oral and maxillofacial surgery faculty members and thus suggests that promotions committees should consider using the h-index as an additional method to assess research activity.
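The two citation indices defined in this record are easy to compute directly from a list of per-publication citation counts; a minimal sketch (function names and the sample citation list are illustrative, not from the study):

```python
def h_index(citations):
    """Largest h such that h publications have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

def i10_index(citations):
    """Number of publications with at least 10 citations."""
    return sum(1 for c in citations if c >= 10)

# A hypothetical surgeon with these counts sits right at the study's
# associate-professor threshold (h-index >= 4):
print(h_index([25, 8, 5, 4, 3, 1]))    # 4
print(i10_index([25, 8, 5, 4, 3, 1]))  # 1
```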

  18. NASA Langley Research and Technology-Transfer Program in Formal Methods

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Caldwell, James L.; Carreno, Victor A.; Holloway, C. Michael; Miner, Paul S.; DiVito, Ben L.

    1995-01-01

    This paper presents an overview of the NASA Langley research program in formal methods. The major goals of this work are to make formal methods practical for use on life-critical systems, and to orchestrate the transfer of this technology to U.S. industry through the use of carefully designed demonstration projects. Several direct technology transfer efforts have been initiated that apply formal methods to critical subsystems of real aerospace computer systems. The research team consists of five NASA civil servants and contractors from Odyssey Research Associates, SRI International, and VIGYAN Inc.

  19. Summary of the Tandem Cylinder Solutions from the Benchmark Problems for Airframe Noise Computations-I Workshop

    NASA Technical Reports Server (NTRS)

    Lockard, David P.

    2011-01-01

    Fifteen submissions in the tandem cylinders category of the First Workshop on Benchmark problems for Airframe Noise Computations are summarized. Although the geometry is relatively simple, the problem involves complex physics. Researchers employed various block-structured, overset, unstructured and embedded Cartesian grid techniques and considerable computational resources to simulate the flow. The solutions are compared against each other and experimental data from 2 facilities. Overall, the simulations captured the gross features of the flow, but resolving all the details which would be necessary to compute the noise remains challenging. In particular, how to best simulate the effects of the experimental transition strip, and the associated high Reynolds number effects, was unclear. Furthermore, capturing the spanwise variation proved difficult.

  20. Robust Flutter Margin Analysis that Incorporates Flight Data

    NASA Technical Reports Server (NTRS)

    Lind, Rick; Brenner, Martin J.

    1998-01-01

    An approach for computing worst-case flutter margins has been formulated in a robust stability framework. Uncertainty operators are included with a linear model to describe modeling errors and flight variations. The structured singular value, mu, computes a stability margin that directly accounts for these uncertainties. This approach introduces a new method of computing flutter margins and an associated new parameter for describing these margins. The mu margins are robust margins that indicate worst-case stability estimates with respect to the defined uncertainty. Worst-case flutter margins are computed for the F/A-18 Systems Research Aircraft using uncertainty sets generated by flight data analysis. The robust margins demonstrate that flight conditions for flutter may lie closer to the flight envelope than previously estimated by p-k analysis.
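For context, the structured singular value invoked in this record has a standard definition (due to Doyle and Packard; it is not spelled out in the abstract itself). For a matrix M and a prescribed block-diagonal uncertainty set **Δ**:

```latex
\mu_{\boldsymbol{\Delta}}(M) \;=\;
  \Bigl(\min\bigl\{\,\bar{\sigma}(\Delta) \;:\;
  \Delta \in \boldsymbol{\Delta},\; \det(I - M\Delta) = 0 \,\bigr\}\Bigr)^{-1},
```

with \(\mu_{\boldsymbol{\Delta}}(M) = 0\) if no \(\Delta \in \boldsymbol{\Delta}\) makes \(I - M\Delta\) singular. The size of the smallest destabilizing structured perturbation is thus \(1/\mu\), which is why mu yields a direct worst-case stability margin with respect to the defined uncertainty.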

  1. Direction and Integration of Experimental Ground Test Capabilities and Computational Methods

    NASA Technical Reports Server (NTRS)

    Dunn, Steven C.

    2016-01-01

    This paper groups and summarizes the salient points and findings from two AIAA conference panels targeted at defining the direction, with associated key issues and recommendations, for the integration of experimental ground testing and computational methods. Each panel session utilized rapporteurs to capture comments from both the panel members and the audience. Additionally, a virtual panel of several experts were consulted between the two sessions and their comments were also captured. The information is organized into three time-based groupings, as well as by subject area. These panel sessions were designed to provide guidance to both researchers/developers and experimental/computational service providers in defining the future of ground testing, which will be inextricably integrated with the advancement of computational tools.

  2. Model of Procedure Usage – Results from a Qualitative Study to Inform Design of Computer-Based Procedures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johanna H Oxstrand; Katya L Le Blanc

    The nuclear industry is constantly trying to find ways to decrease the human error rate, especially the human errors associated with procedure use. As a step toward the goal of improving procedure use performance, researchers, together with the nuclear industry, have been looking at replacing the current paper-based procedures with computer-based procedure systems. The concept of computer-based procedures is not new by any means; however, most research has focused on procedures used in the main control room. Procedures reviewed in these efforts are mainly emergency operating procedures and normal operating procedures. Based on lessons learned from these previous efforts, we are now exploring a less studied application for computer-based procedures: field procedures, i.e. procedures used by nuclear equipment operators and maintenance technicians. The Idaho National Laboratory, the Institute for Energy Technology, and participants from the U.S. commercial nuclear industry are collaborating in an applied research effort with the objective of developing requirements and specifications for a computer-based procedure system to be used by field operators. The goal is to identify the types of human errors that can be mitigated by using computer-based procedures and how to best design the computer-based procedures to do this. The underlying philosophy in the research effort is "Stop - Start - Continue", i.e. what features from the use of paper-based procedures should we not incorporate (Stop), what should we keep (Continue), and what new features or work processes should be added (Start). One step in identifying the Stop - Start - Continue was to conduct a baseline study where affordances related to the current usage of paper-based procedures were identified. The purpose of the study was to develop a model of paper-based procedure use which will help to identify desirable features for computer-based procedure prototypes.
Affordances such as note taking, markups, sharing procedures between fellow coworkers, the use of multiple procedures at once, etc. were considered. The model describes which affordances associated with paper-based procedures should be transferred to computer-based procedures as well as what features should not be incorporated. The model also provides a means to identify what new features not present in paper-based procedures need to be added to the computer-based procedures to further enhance performance. The next step is to use the requirements and specifications to develop concepts and prototypes of computer-based procedures. User tests and other data collection efforts will be conducted to ensure that the real issues with field procedures and their usage are being addressed and solved in the best manner possible. This paper describes the baseline study, the construction of the model of procedure use, and the requirements and specifications for computer-based procedures that were developed based on the model. It also addresses how the model and the insights gained from it were used to develop concepts and prototypes for computer-based procedures.

  3. Research summary, January 1989 - June 1990

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The Research Institute for Advanced Computer Science (RIACS) was established at NASA ARC in June of 1983. RIACS is privately operated by the Universities Space Research Association (USRA), a consortium of 62 universities with graduate programs in the aerospace sciences, under a Cooperative Agreement with NASA. RIACS serves as the representative of the USRA universities at ARC. This document reports our activities and accomplishments for the period 1 Jan. 1989 - 30 Jun. 1990. The following topics are covered: learning systems, networked systems, and parallel systems.

  4. Long non-coding RNAs and complex diseases: from experimental results to computational models.

    PubMed

    Chen, Xing; Yan, Chenggang Clarence; Zhang, Xu; You, Zhu-Hong

    2017-07-01

    LncRNAs have attracted much attention from researchers worldwide in recent decades. With rapid advances in both experimental technology and computational prediction algorithms, thousands of lncRNAs have been identified in eukaryotic organisms ranging from nematodes to humans in the past few years. Mounting research evidence indicates that lncRNAs are involved in almost the whole life cycle of cells through different mechanisms and play important roles in many critical biological processes. It is therefore not surprising that mutations and dysregulation of lncRNAs contribute to the development of various human complex diseases. In this review, we first give a brief introduction to the functions of lncRNAs, five important lncRNA-related diseases, five critical disease-related lncRNAs, and some important publicly available lncRNA-related databases covering sequence, expression, function, etc. To date, only a limited number of lncRNAs have been experimentally linked to human diseases. Therefore, analyzing available lncRNA-disease associations and predicting potential human lncRNA-disease associations have become important tasks of bioinformatics, which would benefit the understanding of human complex disease mechanisms at the lncRNA level, disease biomarker detection, and disease diagnosis, treatment, prognosis, and prevention. Furthermore, we introduce some state-of-the-art computational models that can be effectively used to identify disease-related lncRNAs on a large scale and to select the most promising disease-related lncRNAs for experimental validation. We also analyze the limitations of these models and discuss future directions for developing computational models for lncRNA research. © The Author 2016. Published by Oxford University Press.

  5. Space plasma branch at NRL

    NASA Astrophysics Data System (ADS)

    The Naval Research Laboratory (Washington, D.C.) formed the Space Plasma Branch within its Plasma Physics Division on July 1. Vithal Patel, former Program Director of Magnetospheric Physics, National Science Foundation, also joined NRL on the same date as Associate Superintendent of the Plasma Physics Division. Barret Ripin is head of the newly organized branch. The Space Plasma branch will do basic and applied space plasma research using a multidisciplinary approach. It consolidates traditional rocket and satellite space experiments, space plasma theory and computation, with laboratory space-related experiments. About 40 research scientists, postdoctoral fellows, engineers, and technicians are divided among its five sections. The Theory and Computation sections are led by Joseph Huba and Joel Fedder, the Space Experiments section is led by Paul Rodriguez, and the Pharos Laser Facility and Laser Experiments sections are headed by Charles Manka and Jacob Grun.

  6. Continued research on selected parameters to minimize community annoyance from airplane noise

    NASA Technical Reports Server (NTRS)

    Frair, L.

    1981-01-01

    Results from continued research on selected parameters to minimize community annoyance from airport noise are reported. First, a review of the initial work on this problem is presented. Then the research focus is expanded by considering multiobjective optimization approaches for this problem. A multiobjective optimization algorithm review from the open literature is presented. This is followed by the multiobjective mathematical formulation for the problem of interest. A discussion of the appropriate solution algorithm for the multiobjective formulation is conducted. Alternate formulations and associated solution algorithms are discussed and evaluated for this airport noise problem. Selected solution algorithms that have been implemented are then used to produce computational results for example airports. These computations involved finding the optimal operating scenario for a moderate size airport and a series of sensitivity analyses for a smaller example airport.

  7. Digital Screen Media and Cognitive Development.

    PubMed

    Anderson, Daniel R; Subrahmanyam, Kaveri

    2017-11-01

    In this article, we examine the impact of digital screen devices, including television, on cognitive development. Although we know that young infants and toddlers are using touch screen devices, we know little about their comprehension of the content that they encounter on them. In contrast, research suggests that children begin to comprehend child-directed television starting at ∼2 years of age. The cognitive impact of these media depends on the age of the child, the kind of programming (educational programming versus programming produced for adults), the social context of viewing, as well as the particular kind of interactive media (eg, computer games). For children <2 years old, television viewing has mostly negative associations, especially for language and executive function. For preschool-aged children, television viewing has been found to have both positive and negative outcomes, and a large body of research suggests that educational television has a positive impact on cognitive development. Beyond the preschool years, children mostly consume entertainment programming, and cognitive outcomes are not well explored in research. The use of computer games as well as educational computer programs can lead to gains in academically relevant content and other cognitive skills. This article concludes by identifying topics and goals for future research and provides recommendations based on current research-based knowledge. Copyright © 2017 by the American Academy of Pediatrics.

  8. Computer-facilitated rapid HIV testing in emergency care settings: provider and patient usability and acceptability.

    PubMed

    Spielberg, Freya; Kurth, Ann E; Severynen, Anneleen; Hsieh, Yu-Hsiang; Moring-Parris, Daniel; Mackenzie, Sara; Rothman, Richard

    2011-06-01

    Providers in emergency care settings (ECSs) often face barriers to expanded HIV testing. We undertook formative research to understand the potential utility of a computer tool, "CARE," to facilitate rapid HIV testing in ECSs. Computer tool usability and acceptability were assessed among 35 adult patients, and provider focus groups were held, in two ECSs in Washington State and Maryland. The computer tool was usable by patients of varying computer literacy. Patients appreciated the tool's privacy and lack of judgment and their ability to reflect on HIV risks and create risk reduction plans. Staff voiced concerns regarding ECS-based HIV testing generally, including resources for follow-up of newly diagnosed people. Computer-delivered HIV testing support was acceptable and usable among low-literacy populations in two ECSs. Such tools may help circumvent some practical barriers associated with routine HIV testing in busy settings though linkages to care will still be needed.

  9. Multidisciplinary High-Fidelity Analysis and Optimization of Aerospace Vehicles. Part 1; Formulation

    NASA Technical Reports Server (NTRS)

    Walsh, J. L.; Townsend, J. C.; Salas, A. O.; Samareh, J. A.; Mukhopadhyay, V.; Barthelemy, J.-F.

    2000-01-01

    An objective of the High Performance Computing and Communication Program at the NASA Langley Research Center is to demonstrate multidisciplinary shape and sizing optimization of a complete aerospace vehicle configuration by using high-fidelity, finite element structural analysis and computational fluid dynamics aerodynamic analysis in a distributed, heterogeneous computing environment that includes high performance parallel computing. A software system has been designed and implemented to integrate a set of existing discipline analysis codes, some of them computationally intensive, into a distributed computational environment for the design of a high-speed civil transport configuration. The paper describes the engineering aspects of formulating the optimization by integrating these analysis codes and associated interface codes into the system. The discipline codes are integrated by using the Java programming language and a Common Object Request Broker Architecture (CORBA) compliant software product. A companion paper presents currently available results.

  10. Unobtrusive monitoring of computer interactions to detect cognitive status in elders.

    PubMed

    Jimison, Holly; Pavel, Misha; McKanna, James; Pavel, Jesse

    2004-09-01

    The U.S. has experienced a rapid growth in the use of computers by elders. E-mail, Web browsing, and computer games are among the most common routine activities for this group of users. In this paper, we describe techniques for unobtrusively monitoring naturally occurring computer interactions to detect sustained changes in cognitive performance. Researchers have demonstrated the importance of the early detection of cognitive decline. Users over the age of 75 are at risk for medically related cognitive problems and confusion, and early detection allows for more effective clinical intervention. In this paper, we present algorithms for inferring a user's cognitive performance using monitoring data from computer games and psychomotor measurements associated with keyboard entry and mouse movement. The inferences are then used to classify significant performance changes, and additionally, to adapt computer interfaces with tailored hints and assistance when needed. These methods were tested in a group of elders in a residential facility.
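As a hedged illustration of the psychomotor keyboard measurements this record mentions (not the authors' actual algorithm), inter-keystroke timing features of the kind such monitoring systems might track can be derived from key-press timestamps:

```python
from statistics import mean, stdev

def interkey_features(timestamps):
    """Return (mean, stdev) of inter-keystroke intervals, in seconds.

    `timestamps` are key-press times for one typing burst; slower or more
    erratic intervals are the sort of psychomotor signal that could feed a
    longitudinal model of cognitive performance.
    """
    intervals = [b - a for a, b in zip(timestamps, timestamps[1:])]
    return mean(intervals), stdev(intervals)

m, s = interkey_features([0.0, 0.21, 0.45, 0.62, 0.95])  # m ≈ 0.24 s
```

In a real deployment these per-session features would be tracked over months, with sustained shifts (rather than single noisy sessions) flagged for clinical review.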

  11. PREFACE: 9th World Congress on Computational Mechanics and 4th Asian Pacific Congress on Computational Mechanics

    NASA Astrophysics Data System (ADS)

    Khalili, N.; Valliappan, S.; Li, Q.; Russell, A.

    2010-07-01

    The use of mathematical models of natural phenomena has underpinned science and engineering for centuries, but until the advent of modern computers and computational methods, the full utility of most of these models remained outside the reach of the engineering communities. Since World War II, advances in computational methods have transformed the way engineering and science is undertaken throughout the world. Today, theories of mechanics of solids and fluids, electromagnetism, heat transfer, plasma physics, and other scientific disciplines are implemented through computational methods in engineering analysis, design, manufacturing, and in studying broad classes of physical phenomena. The discipline concerned with the application of computational methods is now a key area of research, education, and application throughout the world. In the early 1980s, the International Association for Computational Mechanics (IACM) was founded to promote activities related to computational mechanics and has made impressive progress. The most important scientific event of IACM is the World Congress on Computational Mechanics. The first was held in Austin (USA) in 1986 and then in Stuttgart (Germany) in 1990, Chiba (Japan) in 1994, Buenos Aires (Argentina) in 1998, Vienna (Austria) in 2002, Beijing (China) in 2004, Los Angeles (USA) in 2006 and Venice (Italy) in 2008. The 9th World Congress on Computational Mechanics is held in conjunction with the 4th Asian Pacific Congress on Computational Mechanics under the auspices of the Australian Association for Computational Mechanics (AACM), the Asian Pacific Association for Computational Mechanics (APACM) and the International Association for Computational Mechanics (IACM). The 1st Asian Pacific Congress was in Sydney (Australia) in 2001, then in Beijing (China) in 2004 and Kyoto (Japan) in 2007.
The WCCM/APCOM 2010 publications consist of a printed book of abstracts given to delegates, along with 247 full-length peer-reviewed papers published with free access online in IOP Conference Series: Materials Science and Engineering. The editors acknowledge the help of the paper reviewers in maintaining a high standard of assessment and the co-operation of the authors in complying with the requirements of the editors and the reviewers. We would also like to take this opportunity to thank the members of the Local Organising Committee and the International Scientific Committee for helping make WCCM/APCOM 2010 a successful event. We also thank The University of New South Wales, The University of Newcastle, the Centre for Infrastructure Engineering and Safety (CIES), IACM, APACM, and AACM for their financial support, along with the United States Association for Computational Mechanics for the Travel Awards made available. N. Khalili S. Valliappan Q. Li A. Russell 19 July 2010 Sydney, Australia

  12. An Amphibious Ship-To-Shore Simulation for Use on an IBM PC (Personal Computer)

    DTIC Science & Technology

    1984-09-01

    ...research, for instance, will be geared toward a technically oriented person who is familiar with computers, programming and the associated logic. A...problem, often vaguely stated by the decision maker, into precise and operational terms [Ref. Hz p.51]. The analysis begins with specification of the

  13. Productivity enhancement planning using participative management concepts

    NASA Technical Reports Server (NTRS)

    White, M. E.; Kukla, J. C.

    1985-01-01

    A productivity enhancement project which used participative management for both planning and implementation is described. The process and results associated with using participative management to plan and implement a computer terminal upgrade project, where the computer terminals are used by research and development (R&D) personnel, are reported. The upgrade improved the productivity of R&D personnel substantially, and their commitment to the implementation is high. Successful utilization of participative management for this project has laid a foundation for a continued style shift toward participation within the organization.

  14. A Research Program in Computer Technology. 1982 Annual Technical Report

    DTIC Science & Technology

    1983-03-01

    for the Defense Advanced Research Projects Agency. The research applies computer science and technology to areas of high DoD/military impact. The ISI...implement the plan; New Computing Environment - investigation and adaptation of developing computer technologies to serve the research and military user communities; and Computer

  15. Parental education associations with children's body composition: mediation effects of energy balance-related behaviors within the ENERGY-project.

    PubMed

    Fernández-Alvira, Juan M; te Velde, Saskia J; De Bourdeaudhuij, Ilse; Bere, Elling; Manios, Yannis; Kovacs, Eva; Jan, Natasa; Brug, Johannes; Moreno, Luis A

    2013-06-21

    It is well known that the prevalence of overweight and obesity is considerably higher among youth from lower socio-economic families, but there is little information about the role of some energy balance-related behaviors in the association between socio-economic status and childhood overweight and obesity. The objective of this paper was to assess the possible mediation role of energy balance-related behaviors in the association between parental education and children's body composition. Data were obtained from the cross-sectional study of the "EuropeaN Energy balance Research to prevent excessive weight Gain among Youth" (ENERGY) project. A total of 2121 boys and 2516 girls aged 10 to 12 from Belgium, Greece, Hungary, the Netherlands, Norway, Slovenia and Spain were included in the analyses. Data were obtained via questionnaires assessing obesity-related dietary, physical activity and sedentary behaviors, and via objectively measured basic anthropometric indicators (weight, height, waist circumference). The possible mediating effect of sugared drinks intake, breakfast consumption, active transportation to school, sports participation, TV viewing, computer use and sleep duration in the association between parental education and children's body composition was explored via MacKinnon's product-of-coefficients test in single and multiple mediation models. Two different body composition indicators were included in the models, namely Body Mass Index and waist circumference. The association between parental education and children's body composition was partially mediated by breakfast consumption, sports participation, TV viewing and computer use. Additionally, a suppression effect was found for sugared drinks intake. No mediation effect was found for active transportation and sleep duration. The significant mediators explained a higher proportion of the association between parental education and waist circumference compared to the association between parental education and BMI.
Tailored overweight and obesity prevention strategies in low SES preadolescent populations should incorporate specific messages focusing on the importance of encouraging daily breakfast consumption, increasing sports participation and decreasing TV viewing and computer use. However, longitudinal research to support these findings is needed.
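
    MacKinnon's product-of-coefficients idea used above can be illustrated with a minimal sketch: the indirect effect is the product of the X-to-M path (a) and the M-to-Y path adjusted for X (b). This simplified version works on standardized (correlation-scale) variables and omits the standard error that the full test also computes; the variable names and data below are hypothetical, not from the ENERGY study.

```python
import statistics as st

def corr(x, y):
    """Pearson correlation (population form)."""
    mx, my = st.mean(x), st.mean(y)
    sx, sy = st.pstdev(x), st.pstdev(y)
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / (len(x) * sx * sy)

def indirect_effect(x, m, y):
    """Product-of-coefficients estimate a*b on standardized variables:
    a is the X->M path, b is the M->Y path adjusted for X."""
    rxm, rmy, rxy = corr(x, m), corr(m, y), corr(x, y)
    a = rxm
    b = (rmy - rxy * rxm) / (1 - rxm ** 2)
    return a * b

# Hypothetical data: parental education (x), TV hours (m), BMI z-score (y);
# m falls as x rises and y tracks m, so part of the x->y link runs through m
x = [0, 0, 1, 1, 2, 2, 3, 3]
m = [5, 4, 4, 3, 3, 2, 2, 1]
y = [0.9, 1.0, 0.8, 0.6, 0.7, 0.4, 0.3, 0.2]
```

    Here the indirect effect comes out negative: higher parental education is associated with less TV time, which in turn is associated with lower BMI.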

  16. EUROPLANET-RI modelling service for the planetary science community: European Modelling and Data Analysis Facility (EMDAF)

    NASA Astrophysics Data System (ADS)

    Khodachenko, Maxim; Miller, Steven; Stoeckler, Robert; Topf, Florian

    2010-05-01

    Computational modeling and observational data analysis are two major aspects of modern scientific research, and both are nowadays under extensive development and application. Many of the scientific goals of planetary space missions require robust models of planetary objects and environments as well as efficient data analysis algorithms, to predict conditions for mission planning and to interpret the experimental data. Europe has great strength in these areas, but it is insufficiently coordinated; individual groups, models, techniques and algorithms need to be coupled and integrated. The existing level of scientific cooperation, and the technical capabilities for operative communication, allow considerable progress in the development of a distributed international Research Infrastructure (RI) based on Europe's existing computational modelling and data analysis centers, providing the scientific community with dedicated services in the fields of their computational and data analysis expertise. These services will emerge as a product of the collaborative communication and joint research efforts of the numerical and data analysis experts together with planetary scientists. The major goal of EUROPLANET-RI / EMDAF is to make computational models and data analysis algorithms associated with particular national RIs and teams, as well as their outputs, more readily available to their potential user community and more tailored to scientific user requirements, without compromising front-line specialized research on model and data analysis algorithm development and software implementation. This objective will be met through four key subdivisions/tasks of EMDAF: 1) an Interactive Catalogue of Planetary Models; 2) a Distributed Planetary Modelling Laboratory; 3) a Distributed Data Analysis Laboratory; and 4) enabling Models and Routines for High Performance Computing Grids.
Using the advantages of the coordinated operation and efficient communication between the involved computational modelling, research and data analysis expert teams and their related research infrastructures, EMDAF will provide a 1) flexible, 2) scientific-user-oriented, and 3) continuously developing and rapidly upgraded computational and data analysis service to support and intensify European planetary research. At the beginning, EMDAF will create a set of demonstrators and operational tests of this service in key areas of European planetary science. This work will aim at the following objectives: (a) development and implementation of tools for distant interactive communication between planetary scientists and computing experts (including related RIs); (b) development of standard routine packages and user-friendly interfaces for operation of the existing numerical codes and data analysis algorithms by specialized planetary scientists; (c) development of a prototype of numerical modelling services "on demand" for space missions and planetary researchers; (d) development of a prototype of data analysis services "on demand" for space missions and planetary researchers; (e) development of a prototype of coordinated interconnected simulations of planetary phenomena and objects (global multi-model simulators); (f) provision of demonstrators of the coordinated use of high performance computing facilities (super-computer networks), done in cooperation with the European HPC Grid DEISA.

  17. Social correlates of leisure-time sedentary behaviours in Canadian adults.

    PubMed

    Huffman, S; Szafron, M

    2017-03-01

    Research on the correlates of sedentary behaviour among adults is needed to design health interventions to modify this behaviour. This study explored the associations of social correlates with leisure-time sedentary behaviour of Canadian adults, and whether these associations differ between different types of sedentary behaviour. A sample of 12,021 Canadian adults was drawn from the 2012 Canadian Community Health Survey, and analyzed using binary logistic regression to model the relationships that marital status, the presence of children in the household, and social support have with overall time spent sitting, using a computer, playing video games, watching television, and reading during leisure time. Covariates included gender, age, education, income, employment status, perceived health, physical activity level, body mass index (BMI), and province or territory of residence. Extensive computer time was primarily negatively related to being in a common law relationship, and primarily positively related to being single/never married. Being single/never married was positively associated with extensive sitting time in men only. Having children under 12 in the household was protective against extensive video game and reading times. Increasing social support was negatively associated with extensive computer time in men and women, while among men increasing social support was positively associated with extensive sitting time. Computer, video game, television, and reading time have unique correlates among Canadian adults. Marital status, the presence of children in the household, and social support should be considered in future analyses of sedentary activities in adults.

  18. Impaired associative learning in schizophrenia: behavioral and computational studies

    PubMed Central

    Diwadkar, Vaibhav A.; Flaugher, Brad; Jones, Trevor; Zalányi, László; Ujfalussy, Balázs; Keshavan, Matcheri S.

    2008-01-01

    Associative learning is a central building block of human cognition and in large part depends on mechanisms of synaptic plasticity, memory capacity and fronto–hippocampal interactions. A disorder like schizophrenia is thought to be characterized by altered plasticity, and impaired frontal and hippocampal function. Understanding the expression of this dysfunction through appropriate experimental studies, and understanding the processes that may give rise to impaired behavior through biologically plausible computational models will help clarify the nature of these deficits. We present a preliminary computational model designed to capture learning dynamics in healthy control and schizophrenia subjects. Experimental data was collected on a spatial-object paired-associate learning task. The task evinces classic patterns of negatively accelerated learning in both healthy control subjects and patients, with patients demonstrating lower rates of learning than controls. Our rudimentary computational model of the task was based on biologically plausible assumptions, including the separation of dorsal/spatial and ventral/object visual streams, implementation of rules of learning, the explicit parameterization of learning rates (a plausible surrogate for synaptic plasticity), and learning capacity (a plausible surrogate for memory capacity). Reductions in learning dynamics in schizophrenia were well-modeled by reductions in learning rate and learning capacity. The synergy between experimental research and a detailed computational model of performance provides a framework within which to infer plausible biological bases of impaired learning dynamics in schizophrenia. PMID:19003486
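
    The roles of the two parameters named in the abstract, learning rate and learning capacity, can be seen in a toy sketch of a negatively accelerated learning curve. This assumes a simple exponential-approach form and hypothetical parameter values; the model in the paper is considerably more detailed (separate dorsal/ventral streams, explicit learning rules).

```python
import math

def learning_curve(trial, rate, capacity):
    """Negatively accelerated learning: performance rises fastest early on,
    then flattens toward the learner's capacity (exponential approach)."""
    return capacity * (1.0 - math.exp(-rate * trial))

# Hypothetical parameter values: a reduced rate (plasticity surrogate) and
# a reduced capacity (memory surrogate) jointly slow and lower the curve
control = [learning_curve(t, rate=0.5, capacity=0.95) for t in range(1, 9)]
patient = [learning_curve(t, rate=0.25, capacity=0.75) for t in range(1, 9)]
```

    Both curves show shrinking trial-to-trial gains, but the "patient" curve rises more slowly and saturates lower, which is the qualitative pattern the abstract describes.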

  19. Revisiting Mathematics Manipulative Materials

    ERIC Educational Resources Information Center

    Swan, Paul; Marshall, Linda

    2010-01-01

    It is over 12 years since "APMC" published Bob Perry and Peter Howard's research on the use of mathematics manipulative materials in primary mathematics classrooms. Since then the availability of virtual manipulatives and associated access to computers and interactive whiteboards have caused educators to rethink the use of mathematics…

  20. ARC Collaborative Research Seminar Series

    Science.gov Websites

    ...been used to formulate design rules for hydration-based TES systems. Don Siegel is an Associate... structural-acoustics, design of complex systems, and blast event simulations. Technology that he developed... interests include advanced fatigue and fracture assessment methodologies, computational methods for...

  1. Provenance Usage in the OceanLink Project

    NASA Astrophysics Data System (ADS)

    Narock, T.; Arko, R. A.; Carbotte, S. M.; Chandler, C. L.; Cheatham, M.; Fils, D.; Finin, T.; Hitzler, P.; Janowicz, K.; Jones, M.; Krisnadhi, A.; Lehnert, K. A.; Mickle, A.; Raymond, L. M.; Schildhauer, M.; Shepherd, A.; Wiebe, P. H.

    2014-12-01

    A wide spectrum of maturing methods and tools, collectively characterized as the Semantic Web, is helping to vastly improve the dissemination of scientific research. The OceanLink project, an NSF EarthCube Building Block, is utilizing semantic technologies to integrate geoscience data repositories, library holdings, conference abstracts, and funded research awards. Provenance is a vital component in meeting both the scientific and engineering requirements of OceanLink. Provenance plays a key role in justification and understanding when presenting users with results aggregated from multiple sources. In the engineering sense, provenance enables the identification of new data and the ability to determine which data sources to query. Additionally, OceanLink will leverage human and machine computation for crowdsourcing, text mining, and co-reference resolution. The results of these computations, and their associated provenance, will be folded back into the constituent systems to continually enhance precision and utility. We will touch on the various roles provenance is playing in OceanLink as well as present our use of the PROV Ontology and associated Ontology Design Patterns.

  2. The Evidence and Conclusion Ontology (ECO): Supporting GO Annotations.

    PubMed

    Chibucos, Marcus C; Siegele, Deborah A; Hu, James C; Giglio, Michelle

    2017-01-01

    The Evidence and Conclusion Ontology (ECO) is a community resource for describing the various types of evidence that are generated during the course of a scientific study and which are typically used to support assertions made by researchers. ECO describes multiple evidence types, including evidence resulting from experimental (i.e., wet lab) techniques, evidence arising from computational methods, statements made by authors (whether or not supported by evidence), and inferences drawn by researchers curating the literature. In addition to summarizing the evidence that supports a particular assertion, ECO also offers a means to document whether a computer or a human performed the process of making the annotation. Incorporating ECO into an annotation system makes it possible to leverage the structure of the ontology such that associated data can be grouped hierarchically, users can select data associated with particular evidence types, and quality control pipelines can be optimized. Today, over 30 resources, including the Gene Ontology, use the Evidence and Conclusion Ontology to represent both evidence and how annotations are made.

  3. Swept-Wing Ice Accretion Characterization and Aerodynamics

    NASA Technical Reports Server (NTRS)

    Broeren, Andy P.; Potapczuk, Mark G.; Riley, James T.; Villedieu, Philippe; Moens, Frederic; Bragg, Michael B.

    2013-01-01

    NASA, FAA, ONERA, the University of Illinois and Boeing have embarked on a significant, collaborative research effort to address the technical challenges associated with icing on large-scale, three-dimensional swept wings. The overall goal is to improve the fidelity of experimental and computational simulation methods for swept-wing ice accretion formation and resulting aerodynamic effect. A seven-phase research effort has been designed that incorporates ice-accretion and aerodynamic experiments and computational simulations. As the baseline, full-scale, swept-wing-reference geometry, this research will utilize the 65% scale Common Research Model configuration. Ice-accretion testing will be conducted in the NASA Icing Research Tunnel for three hybrid swept-wing models representing the 20%, 64% and 83% semispan stations of the baseline-reference wing. Three-dimensional measurement techniques are being developed and validated to document the experimental ice-accretion geometries. Artificial ice shapes of varying geometric fidelity will be developed for aerodynamic testing over a large Reynolds number range in the ONERA F1 pressurized wind tunnel and in a smaller-scale atmospheric wind tunnel. Concurrent research will be conducted to explore and further develop the use of computational simulation tools for ice accretion and aerodynamics on swept wings. The combined results of this research effort will result in an improved understanding of the ice formation and aerodynamic effects on swept wings. The purpose of this paper is to describe this research effort in more detail and report on the current results and status to date.

  5. Telescience workstation

    NASA Technical Reports Server (NTRS)

    Brown, Robert L.; Doyle, Dee; Haines, Richard F.; Slocum, Michael

    1989-01-01

    As part of the Telescience Testbed Pilot Program, the Universities Space Research Association/Research Institute for Advanced Computer Science (USRA/RIACS) proposed to support remote communication by providing a network of human/machine interfaces, computer resources, and experimental equipment which allows: remote science, collaboration, technical exchange, and multimedia communication. The telescience workstation is intended to provide a local computing environment for telescience. The purposes of the program are as follows: (1) to provide a suitable environment to integrate existing and new software for a telescience workstation; (2) to provide a suitable environment to develop new software in support of telescience activities; (3) to provide an interoperable environment so that a wide variety of workstations may be used in the telescience program; (4) to provide a supportive infrastructure and a common software base; and (5) to advance, apply, and evaluate the telescience technology base. A prototype telescience computing environment designed to bring practicing scientists in domains other than computer science into a modern style of doing their computing was created and deployed. This environment, the Telescience Windowing Environment, Phase 1 (TeleWEn-1), met some, but not all, of the goals stated above. The TeleWEn-1 provided a window-based workstation environment and a set of tools for text editing, document preparation, electronic mail, multimedia mail, raster manipulation, and system management.

  6. Health Literacy Assessment of the STOFHLA: Paper versus electronic administration continuation study.

    PubMed

    Chesser, Amy K; Keene Woods, Nikki; Wipperman, Jennifer; Wilson, Rachel; Dong, Frank

    2014-02-01

    Low health literacy is associated with poor health outcomes. Research is needed to understand the mechanisms and pathways of its effects. Computer-based assessment tools may improve the efficiency and cost-effectiveness of health literacy research. The objective of this preliminary study was to assess whether administration of the Short Test of Functional Health Literacy in Adults (STOFHLA) through a computer-based medium was comparable to the paper-based test in terms of accuracy and time to completion. A randomized, crossover design was used to compare computer versus paper formats of the STOFHLA at a Midwestern family medicine residency program. Eighty participants were initially randomized to either the computer (n = 42) or paper (n = 38) format of the STOFHLA. After a 30-day washout period, participants returned to complete the other version of the STOFHLA. Data analysis revealed no significant difference between paper- and computer-based surveys (p = .9401; N = 57). The majority of participants showed "adequate" health literacy on both the paper- and computer-based surveys (100% and 97% of participants, respectively). Results of the electronic administration of the STOFHLA were equivalent to those of the paper administration for evaluation of adult health literacy. Future investigations should focus on expanded populations in multiple health care settings and validation of other health literacy screening tools in a clinical setting.

  7. USAF Summer Faculty Research Program. 1981 Research Reports. Volume I.

    DTIC Science & Technology

    1981-10-01

    Kent, OH 44242 (216) 672-2816 Dr. Martin D. Altschuler Degree: PhD, Physics and Astronomy, 1964 Associate Professor Specialty: Robot Vision, Surface...line inspection and control, computer-aided manufacturing, robot vision, mapping of machine parts and castings, etc. The technique we developed...posture, reduced healing time and bacteria level, and improved capacity for work endurance and efficiency. Federal agencies, such as the FDA and

  8. Indirect Costs of Health Research--How They are Computed, What Actions are Needed. Report by the Comptroller General of the United States.

    ERIC Educational Resources Information Center

    Comptroller General of the U.S., Washington, DC.

    A review by the General Accounting Office of various aspects of indirect costs associated with federal health research grants is presented. After an introduction detailing the scope of the review and defining indirect costs and federal participation, the report focuses on the causes of the rapid increase of indirect costs. Among findings was that…

  9. USAF Summer Research Program - 1993 Summer Research Extension Program Final Reports, Volume 1B, Armstrong Laboratory

    DTIC Science & Technology

    1994-11-01

    For example, ...scotopic components of the ERG flash response are significantly attenuated by retinitis pigmentosa [7]. It is possible... RETINAL DAMAGE Bernard S. Gerstman Associate Professor Department of Physics Florida International University University Park Miami, FL 33199 Final...and Florida International University April 1994 A COMPUTATIONAL THERMAL MODEL AND THEORETICAL THERMODYNAMIC MODEL OF LASER INDUCED RETINAL DAMAGE

  10. The Air Force Interactive Meteorological System: A Research Tool for Satellite Meteorology

    DTIC Science & Technology

    1992-12-02

    NFARnet itself is a subnet of the global computer network INTERNET that links nearly all U.S. government research facilities and universities along...required input to a generalized mathematical solution to the satellite/earth coordinate transform used for earth location of GOES sensor data. A direct...capability also exists to convert absolute coordinates to relative coordinates for transformations associated with gridded fields. 3. Spatial objective

  11. Survey of Ergonomics Databases in Member Countries of the International Ergonomics Association.

    DTIC Science & Technology

    1986-07-01

    Psychology Salud y Trabajo Journal of Auditory Research Scandinavian Journal of Psychology Journal of Consumer Research Scandinavian Journal of Rehabilitation...Biological Engineering and Computing World Textile Abstracts Medicina del Lavoro Medicina y Seguridad del Trabajo Zeitschrift für Arbeitswissenschaft Mens en ...applicability" in selecting subject areas, many data found in... sources of data will be misinterpreted or missed entirely. Thus the effectiveness of

  12. The cyber threat landscape: Challenges and future research directions

    NASA Astrophysics Data System (ADS)

    Gil, Santiago; Kott, Alexander; Barabási, Albert-László

    2014-07-01

    While much attention has been paid to the vulnerability of computer networks to node and link failure, there is limited systematic understanding of the factors that determine the likelihood that a node (computer) is compromised. We therefore collect threat log data in a university network to study the patterns of threat activity for individual hosts. We relate this information to the properties of each host as observed through network-wide scans, establishing associations between the network services a host is running and the kinds of threats to which it is susceptible. We propose a methodology to associate services to threats inspired by the tools used in genetics to identify statistical associations between mutations and diseases. The proposed approach allows us to determine probabilities of infection directly from observation, offering an automated high-throughput strategy to develop comprehensive metrics for cyber-security.
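
    The genetics-inspired association step described above can be sketched with a simple odds ratio over per-host observations, scoring how strongly running a given service co-occurs with seeing a given threat. The hosts, service names, and threat labels below are invented for illustration; the paper's actual methodology is more elaborate.

```python
def odds_ratio(hosts, service, threat):
    """2x2 association between running a service and logging a threat.
    `hosts` is a list of (services, threats) set pairs; a 0.5 (Haldane)
    correction avoids division by zero in sparse tables."""
    a = b = c = d = 0
    for services, threats in hosts:
        if service in services:
            if threat in threats: a += 1   # service present, threat seen
            else:                 b += 1   # service present, no threat
        else:
            if threat in threats: c += 1   # service absent, threat seen
            else:                 d += 1   # service absent, no threat
    return ((a + .5) * (d + .5)) / ((b + .5) * (c + .5))

# Hypothetical threat-log observations: (services running, threats logged)
hosts = [
    ({"ssh", "http"},  {"sql-injection"}),
    ({"http"},         {"sql-injection"}),
    ({"ssh"},          set()),
    ({"smtp"},         {"spam-relay"}),
    ({"http", "smtp"}, {"sql-injection", "spam-relay"}),
    ({"ssh"},          set()),
]
```

    In this toy data, hosts running "http" are strongly enriched for "sql-injection" (odds ratio well above 1), while "ssh" shows no such enrichment, which is the kind of service-threat signal the approach is after.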

  13. Computers in aeronautics and space research at the Lewis Research Center

    NASA Technical Reports Server (NTRS)

    1991-01-01

    This brochure presents a general discussion of the role of computers in aerospace research at NASA's Lewis Research Center (LeRC). Four particular areas of computer applications are addressed: computer modeling and simulation, computer assisted engineering, data acquisition and analysis, and computer controlled testing.

  14. An Incremental High-Utility Mining Algorithm with Transaction Insertion

    PubMed Central

    Gan, Wensheng; Zhang, Binbin

    2015-01-01

    Association-rule mining is commonly used to discover useful and meaningful patterns from a very large database. It only considers the occurrence frequencies of items to reveal the relationships among itemsets. Traditional association-rule mining is, however, not suitable in real-world applications since the purchased items from a customer may have various factors, such as profit or quantity. High-utility mining was designed to overcome the limitations of association-rule mining by considering both the quantity and profit measures. Most high-utility mining algorithms are designed to handle static databases. Fewer studies handle dynamic high-utility mining with transaction insertion, which otherwise requires database rescans and incurs the combinational explosion of the pattern-growth mechanism. In this paper, an efficient incremental algorithm with transaction insertion is designed to reduce computations without candidate generation, based on utility-list structures. The enumeration tree and the relationships between 2-itemsets are also adopted in the proposed algorithm to speed up the computations. Several experiments are conducted to show the performance of the proposed algorithm in terms of runtime, memory consumption, and number of generated patterns. PMID:25811038
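
    The utility notion that separates this work from frequency-based association-rule mining can be shown with a toy example: an itemset's utility sums quantity times unit profit over the transactions that contain it. This brute-force sketch uses hypothetical transactions and profits (the paper's utility-list structures exist precisely to avoid this kind of exhaustive enumeration on real databases).

```python
from itertools import combinations

# Hypothetical transactions: item -> purchased quantity
transactions = [
    {"bread": 2, "milk": 1},
    {"bread": 1, "caviar": 1},
    {"milk": 3, "caviar": 2},
    {"bread": 4, "milk": 2, "caviar": 1},
]
# Hypothetical unit profits: frequency alone would favor bread,
# but utility weights each quantity by its profit
profit = {"bread": 1, "milk": 2, "caviar": 10}

def utility(itemset, txn):
    """Utility of an itemset in one transaction: sum of quantity x profit,
    counted only if the transaction contains every item of the set."""
    if not all(i in txn for i in itemset):
        return 0
    return sum(txn[i] * profit[i] for i in itemset)

def high_utility_itemsets(txns, minutil):
    """Brute-force enumeration of all itemsets whose total utility
    across the database reaches the minimum utility threshold."""
    items = sorted({i for t in txns for i in t})
    result = {}
    for k in range(1, len(items) + 1):
        for combo in combinations(items, k):
            total = sum(utility(combo, t) for t in txns)
            if total >= minutil:
                result[combo] = total
    return result
```

    With a threshold of 30, the rarely bought but profitable caviar (alone and with milk) qualifies, while the frequently bought bread does not, illustrating why utility and frequency can rank itemsets very differently.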

  15. Dynamics of co-authorship and productivity across different fields of scientific research.

    PubMed

    Parish, Austin J; Boyack, Kevin W; Ioannidis, John P A

    2018-01-01

    We aimed to assess which factors correlate with collaborative behavior and whether such behavior associates with scientific impact (citations and becoming a principal investigator). We used the R index, which is defined for each author as log(Np)/log(I1), where I1 is the number of co-authors who appear in at least I1 papers written by that author and Np is his/her total number of papers. Higher R means lower collaborative behavior, i.e. not working much with others, or not collaborating repeatedly with the same co-authors. Across 249,054 researchers who had published ≥30 papers in 2000-2015 but had not published anything before 2000, R varied across scientific fields. Lower values of R (more collaboration) were seen in physics, medicine, infectious disease and brain sciences, and higher values of R were seen for social science, computer science and engineering. Among the 9,314 most productive researchers already reaching Np ≥ 30 and I1 ≥ 4 by the end of 2006, R mostly remained stable from 2006 to 2015, with small increases seen in physics, chemistry, and medicine. Both US-based authorship and male gender were associated with higher values of R (lower collaboration), although the effect was small. Lower values of R (more collaboration) were associated with higher citation impact (h-index), and the effect was stronger in certain fields (physics, medicine, engineering, health sciences) than in others (brain sciences, computer science, infectious disease, chemistry). Finally, for a subset of 400 U.S. researchers in medicine, infectious disease and brain sciences, higher R (lower collaboration) was associated with a higher chance of being a principal investigator by 2016. Our analysis maps the patterns and evolution of collaborative behavior across scientific disciplines.
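
    The R index defined above is straightforward to compute: I1 is an h-index-like quantity taken over an author's co-authors, and R = log(Np)/log(I1). A minimal sketch with a hypothetical publication list follows (the paper restricts its analysis to authors with Np ≥ 30 and I1 ≥ 4; this sketch assumes I1 > 1, since log(I1) vanishes otherwise).

```python
import math
from collections import Counter

def i1_index(papers):
    """I1: the largest k such that k co-authors each appear on
    at least k of the author's papers (an h-index over co-authors)."""
    counts = sorted(Counter(c for p in papers for c in p).values(), reverse=True)
    k = 0
    while k < len(counts) and counts[k] >= k + 1:
        k += 1
    return k

def r_index(papers):
    """R = log(Np)/log(I1); higher R means less repeated collaboration.
    Assumes I1 > 1 (the paper analyzes authors with I1 >= 4)."""
    return math.log(len(papers)) / math.log(i1_index(papers))

# Hypothetical publication list: each paper is the set of co-authors
papers = [
    {"A", "B"}, {"A", "B", "C"}, {"A", "C"}, {"B", "C"},
    {"A", "D"}, {"B"}, {"C"}, {"A", "B", "C", "D"},
]
```

    Here three co-authors (A, B, C) each appear on at least three papers, so I1 = 3 and R = log(8)/log(3); an author with the same Np but only one-off collaborators would have a much higher R.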

  16. Community-driven computational biology with Debian Linux

    PubMed Central

    2010-01-01

    Background The Open Source movement and its technologies are popular in the bioinformatics community because they provide freely available tools and resources for research. In order to feed the steady demand for updates on software and associated data, a service infrastructure is required for sharing and providing these tools to heterogeneous computing environments. Results The Debian Med initiative provides ready and coherent software packages for medical informatics and bioinformatics. These packages can be used together in Taverna workflows via the UseCase plugin to manage execution on local or remote machines. If such packages are available in cloud computing environments, the underlying hardware and the analysis pipelines can be shared along with the software. Conclusions Debian Med closes the gap between developers and users. It provides a simple method for offering new releases of software and data resources, thus provisioning a local infrastructure for computational biology. For geographically distributed teams it can ensure they are working on the same versions of tools, in the same conditions. This contributes to the world-wide networking of researchers. PMID:21210984

  17. Enrichment of Human-Computer Interaction in Brain-Computer Interfaces via Virtual Environments

    PubMed Central

    Víctor Rodrigo, Mercado-García

    2017-01-01

    Tridimensional representations stimulate cognitive processes that are the core and foundation of human-computer interaction (HCI). Those cognitive processes take place while a user navigates and explores a virtual environment (VE) and are mainly related to spatial memory storage, attention, and perception. VEs have many distinctive features (e.g., involvement, immersion, and presence) that can significantly improve HCI in highly demanding and interactive systems such as brain-computer interfaces (BCIs). A BCI is a nonmuscular communication channel that attempts to reestablish the interaction between an individual and his/her environment. Although BCI research started in the sixties, this technology is not yet efficient or reliable for everyone at any time. Over the past few years, researchers have argued that the main BCI flaws could be associated with HCI issues. The evidence presented thus far shows that VEs can (1) set out working environmental conditions, (2) maximize the efficiency of BCI control panels, (3) implement navigation systems based not only on user intentions but also on user emotions, and (4) regulate user mental state to increase the differentiation between control and noncontrol modalities. PMID:29317861

  18. The OSG Open Facility: an on-ramp for opportunistic scientific computing

    NASA Astrophysics Data System (ADS)

    Jayatilaka, B.; Levshina, T.; Sehgal, C.; Gardner, R.; Rynge, M.; Würthwein, F.

    2017-10-01

    The Open Science Grid (OSG) is a large, robust computing grid that started primarily as a collection of sites associated with large HEP experiments such as ATLAS, CDF, CMS, and DZero, but has evolved in recent years to a much larger user and resource platform. In addition to meeting the US LHC community’s computational needs, the OSG continues to be one of the largest providers of distributed high-throughput computing (DHTC) to researchers from a wide variety of disciplines via the OSG Open Facility. The Open Facility consists of OSG resources that are available opportunistically to users other than resource owners and their collaborators. In the past two years, the Open Facility has doubled its annual throughput to over 200 million wall hours. More than half of these resources are used by over 100 individual researchers from over 60 institutions in fields such as biology, medicine, math, economics, and many others. Over 10% of these individual users utilized in excess of 1 million computational hours each in the past year. The largest source of these cycles is temporary unused capacity at institutions affiliated with US LHC computational sites. An increasing fraction, however, comes from university HPC clusters and large national infrastructure supercomputers offering unused capacity. Such expansions have allowed the OSG to provide ample computational resources to both individual researchers and small groups as well as sizable international science collaborations such as LIGO, AMS, IceCube, and sPHENIX. Opening up access to the Fermilab FabrIc for Frontier Experiments (FIFE) project has also allowed experiments such as mu2e and NOvA to make substantial use of Open Facility resources, the former with over 40 million wall hours in a year. We present how this expansion was accomplished as well as future plans for keeping the OSG Open Facility at the forefront of enabling scientific research by way of DHTC.

  19. Aeroelasticity Benchmark Assessment: Subsonic Fixed Wing Program

    NASA Technical Reports Server (NTRS)

    Florance, Jennifer P.; Chwalowski, Pawel; Wieseman, Carol D.

    2010-01-01

    The fundamental technical challenge in computational aeroelasticity is the accurate prediction of unsteady aerodynamic phenomena and their effect on the aeroelastic response of a vehicle. Currently, a benchmarking standard for use in validating the accuracy of computational aeroelasticity codes does not exist. Many aeroelastic data sets have been obtained in wind-tunnel and flight testing throughout the world; however, none has been globally presented or accepted as an ideal data set. There are numerous reasons for this. One reason is that such aeroelastic data sets often focus on the aeroelastic phenomena alone (flutter, for example) and do not contain associated information such as unsteady pressures and time-correlated structural dynamic deflections. Other available data sets focus solely on the unsteady pressures and do not address the aeroelastic phenomena. Other deficiencies can include the omission of relevant data, such as flutter frequency, and/or the acquisition of only qualitative deflection data. In addition to these content deficiencies, all of the available data sets present both experimental and computational technical challenges. Experimental issues include facility influences, nonlinearities beyond those being modeled, and data processing. From the computational perspective, technical challenges include modeling geometric complexities, coupling between the flow and the structure, grid issues, and boundary conditions. The Aeroelasticity Benchmark Assessment task seeks to examine the existing potential experimental data sets and ultimately choose the one that is viewed as the most suitable for computational benchmarking. An initial computational evaluation of that configuration will then be performed using the Langley-developed computational fluid dynamics (CFD) software FUN3D as part of its code validation process. In addition to the benchmarking activity, this task also includes an examination of future research directions. Researchers within the Aeroelasticity Branch will examine other experimental efforts within the Subsonic Fixed Wing (SFW) program (such as testing of the NASA Common Research Model (CRM)) and other NASA programs and assess aeroelasticity issues and research topics.

  20. The OSG Open Facility: An On-Ramp for Opportunistic Scientific Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jayatilaka, B.; Levshina, T.; Sehgal, C.

    The Open Science Grid (OSG) is a large, robust computing grid that started primarily as a collection of sites associated with large HEP experiments such as ATLAS, CDF, CMS, and DZero, but has evolved in recent years to a much larger user and resource platform. In addition to meeting the US LHC community’s computational needs, the OSG continues to be one of the largest providers of distributed high-throughput computing (DHTC) to researchers from a wide variety of disciplines via the OSG Open Facility. The Open Facility consists of OSG resources that are available opportunistically to users other than resource owners and their collaborators. In the past two years, the Open Facility has doubled its annual throughput to over 200 million wall hours. More than half of these resources are used by over 100 individual researchers from over 60 institutions in fields such as biology, medicine, math, economics, and many others. Over 10% of these individual users utilized in excess of 1 million computational hours each in the past year. The largest source of these cycles is temporary unused capacity at institutions affiliated with US LHC computational sites. An increasing fraction, however, comes from university HPC clusters and large national infrastructure supercomputers offering unused capacity. Such expansions have allowed the OSG to provide ample computational resources to both individual researchers and small groups as well as sizable international science collaborations such as LIGO, AMS, IceCube, and sPHENIX. Opening up access to the Fermilab FabrIc for Frontier Experiments (FIFE) project has also allowed experiments such as mu2e and NOvA to make substantial use of Open Facility resources, the former with over 40 million wall hours in a year. We present how this expansion was accomplished as well as future plans for keeping the OSG Open Facility at the forefront of enabling scientific research by way of DHTC.

  1. Hydrodynamic Analyses and Evaluation of New Fluid Film Bearing Concepts

    NASA Technical Reports Server (NTRS)

    Keith, Theo G., Jr.; Dimofte, Florin

    1998-01-01

    Over the past several years, numerical and experimental investigations have been performed on a waved journal bearing. The research work was undertaken by Dr. Florin Dimofte, a Senior Research Associate in the Mechanical Engineering Department at the University of Toledo. Dr. Theo Keith, Distinguished University Professor in the Mechanical Engineering Department was the Technical Coordinator of the project. The wave journal bearing is a bearing with a slight but precise variation in its circular profile such that a waved profile is circumscribed on the inner bearing diameter. The profile has a wave amplitude that is equal to a fraction of the bearing clearance. Prior to this period of research on the wave bearing, computer codes were written and an experimental facility was established. During this period of research considerable effort was directed towards the study of the bearing's stability. The previously developed computer codes and the experimental facility were of critical importance in performing this stability research. A collection of papers and reports were written to describe the results of this work. The attached captures that effort and represents the research output during the grant period.

  2. Center for Interface Science and Catalysis | Theory

    Science.gov Websites

    Stanford School of Engineering. Using computational methods, the center is developing a quantitative description of chemical processes at the solid-gas and solid-liquid interfaces in order to overcome challenges associated with the atomic-scale design of catalysts for chemical transformations.

  3. Emergent Network Defense

    ERIC Educational Resources Information Center

    Crane, Earl Newell

    2013-01-01

    The research problem that inspired this effort is the challenge of managing the security of systems in large-scale heterogeneous networked environments. Human intervention is slow and limited: humans operate at much slower speeds than networked computer communications and there are few humans associated with each network. Enabling each node in the…

  4. 76 FR 23537 - Hass Avocado Promotion, Research, and Information Order; Importer Associations and Assessment...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-27

    ... DEPARTMENT OF AGRICULTURE Agricultural Marketing Service [Document Number AMS-FV-10-0063] Hass... AGENCY: Agricultural Marketing Service, USDA. ACTION: Notice. This notice announces an updated... Agricultural Marketing Service at http://www.ams.usda.gov/FVPromotion . The updated computation will become...

  5. Modeling Code Is Helping Cleveland Develop New Products

    NASA Technical Reports Server (NTRS)

    1998-01-01

    Master Builders, Inc., is a 350-person company in Cleveland, Ohio, that develops and markets specialty chemicals for the construction industry. Developing new products involves creating many potential samples and running numerous tests to characterize the samples' performance. Company engineers enlisted NASA's help to replace cumbersome physical testing with computer modeling of the samples' behavior. Since the NASA Lewis Research Center's Structures Division develops mathematical models and associated computation tools to analyze the deformation and failure of composite materials, its researchers began a two-phase effort to modify Lewis' Integrated Composite Analyzer (ICAN) software for Master Builders' use. Phase I has been completed, and Master Builders is pleased with the results. The company is now working to begin implementation of Phase II.

  6. Benefit from NASA

    NASA Image and Video Library

    1998-01-01

    Don Sirois, an Auburn University research associate, and Bruce Strom, a mechanical engineering Co-Op Student, are evaluating the dimensional characteristics of an aluminum automobile engine casting. More accurate metal casting processes may reduce the weight of some cast metal products used in automobiles, such as engines. Research in low gravity has taken an important first step toward making metal products used in homes, automobiles, and aircraft less expensive, safer, and more durable. Auburn University and industry are partnering with NASA to develop one of the first accurate computer model predictions of molten metals and molding materials used in a manufacturing process called casting. Ford Motor Company's casting plant in Cleveland, Ohio is using NASA-sponsored computer modeling information to improve the casting process of automobile and light-truck engine blocks.

  7. Improving Metal Casting Process

    NASA Technical Reports Server (NTRS)

    1998-01-01

    Don Sirois, an Auburn University research associate, and Bruce Strom, a mechanical engineering Co-Op Student, are evaluating the dimensional characteristics of an aluminum automobile engine casting. More accurate metal casting processes may reduce the weight of some cast metal products used in automobiles, such as engines. Research in low gravity has taken an important first step toward making metal products used in homes, automobiles, and aircraft less expensive, safer, and more durable. Auburn University and industry are partnering with NASA to develop one of the first accurate computer model predictions of molten metals and molding materials used in a manufacturing process called casting. Ford Motor Company's casting plant in Cleveland, Ohio is using NASA-sponsored computer modeling information to improve the casting process of automobile and light-truck engine blocks.

  8. GPU-accelerated computing for Lagrangian coherent structures of multi-body gravitational regimes

    NASA Astrophysics Data System (ADS)

    Lin, Mingpei; Xu, Ming; Fu, Xiaoyu

    2017-04-01

    Based on a well-established theoretical foundation, Lagrangian Coherent Structures (LCSs) have elicited widespread research on the intrinsic structures of dynamical systems in many fields, including the field of astrodynamics. Although the application of LCSs in dynamical problems seems straightforward theoretically, its associated computational cost is prohibitive. We propose a block decomposition algorithm developed on Compute Unified Device Architecture (CUDA) platform for the computation of the LCSs of multi-body gravitational regimes. In order to take advantage of GPU's outstanding computing properties, such as Shared Memory, Constant Memory, and Zero-Copy, the algorithm utilizes a block decomposition strategy to facilitate computation of finite-time Lyapunov exponent (FTLE) fields of arbitrary size and timespan. Simulation results demonstrate that this GPU-based algorithm can satisfy double-precision accuracy requirements and greatly decrease the time needed to calculate final results, increasing speed by approximately 13 times. Additionally, this algorithm can be generalized to various large-scale computing problems, such as particle filters, constellation design, and Monte-Carlo simulation.
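    As a concrete illustration of the FTLE fields this abstract refers to, here is a minimal CPU sketch on the standard double-gyre flow, a common LCS benchmark. It does not reproduce the paper's CUDA block-decomposition strategy, and all names and parameter values are assumptions for illustration:

```python
import numpy as np

def ftle_double_gyre(nx=60, ny=30, t0=0.0, T=10.0, steps=200):
    """Finite-time Lyapunov exponent (FTLE) field for the double-gyre flow
    on [0, 2] x [0, 1] (illustrative CPU version of the GPU computation)."""
    A, eps, om = 0.1, 0.25, 2 * np.pi / 10  # standard double-gyre parameters

    def vel(t, x, y):
        a = eps * np.sin(om * t)
        b = 1 - 2 * a
        f = a * x**2 + b * x
        dfdx = 2 * a * x + b
        u = -np.pi * A * np.sin(np.pi * f) * np.cos(np.pi * y)
        v = np.pi * A * np.cos(np.pi * f) * np.sin(np.pi * y) * dfdx
        return u, v

    xs = np.linspace(0, 2, nx)
    ys = np.linspace(0, 1, ny)
    X, Y = np.meshgrid(xs, ys)
    x, y = X.copy(), Y.copy()
    dt = T / steps
    t = t0
    for _ in range(steps):          # advect the whole grid (forward Euler)
        u, v = vel(t, x, y)
        x, y = x + dt * u, y + dt * v
        t += dt
    # flow-map gradient via finite differences on the advected grid
    dxdX = np.gradient(x, xs, axis=1); dxdY = np.gradient(x, ys, axis=0)
    dydX = np.gradient(y, xs, axis=1); dydY = np.gradient(y, ys, axis=0)
    # largest eigenvalue of the Cauchy-Green tensor C = F^T F at each point
    c11 = dxdX**2 + dydX**2
    c12 = dxdX * dxdY + dydX * dydY
    c22 = dxdY**2 + dydY**2
    tr, det = c11 + c22, c11 * c22 - c12**2
    lmax = tr / 2 + np.sqrt(np.maximum((tr / 2)**2 - det, 0))
    return np.log(np.maximum(lmax, 1e-12)) / (2 * abs(T))
```

    Ridges of the returned FTLE field approximate repelling LCSs; because each grid point is evaluated independently, tiled (block-decomposed) evaluation on GPU hardware is natural.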

  9. High-Performance Computational Analysis of Glioblastoma Pathology Images with Database Support Identifies Molecular and Survival Correlates.

    PubMed

    Kong, Jun; Wang, Fusheng; Teodoro, George; Cooper, Lee; Moreno, Carlos S; Kurc, Tahsin; Pan, Tony; Saltz, Joel; Brat, Daniel

    2013-12-01

    In this paper, we present a novel framework for microscopic image analysis of nuclei, data management, and high-performance computation to support translational research involving nuclear morphometry features, molecular data, and clinical outcomes. Our image analysis pipeline consists of nuclei segmentation and feature computation facilitated by high-performance computing with coordinated execution on multi-core CPUs and graphics processing units (GPUs). All data derived from image analysis are managed in a spatial relational database supporting highly efficient scientific queries. We applied our image analysis workflow to 159 glioblastomas (GBM) from The Cancer Genome Atlas dataset. With integrative studies, we found that statistics of four specific nuclear features were significantly associated with patient survival. Additionally, we correlated nuclear features with molecular data and found interesting results that support pathologic domain knowledge. We found that Proneural subtype GBMs had the smallest mean of nuclear Eccentricity and the largest means of nuclear Extent and MinorAxisLength. We also found that gene expression of the stem cell marker MYC and the cell proliferation marker MKI67 was correlated with nuclear features. To complement and inform pathologists of relevant diagnostic features, we queried the most representative nuclear instances from each patient population based on genetic and transcriptional classes. Our results demonstrate that specific nuclear features carry prognostic significance and associations with transcriptional and genetic classes, highlighting the potential of high-throughput pathology image analysis as a complementary approach to human-based review and translational research.

  10. The Effects of Linear Microphone Array Changes on Computed Sound Exposure Level Footprints

    NASA Technical Reports Server (NTRS)

    Mueller, Arnold W.; Wilson, Mark R.

    1997-01-01

    Airport land planning commissions often are faced with determining how much area around an airport is affected by the sound exposure levels (SELs) associated with helicopter operations. This paper presents a study of the effects that changing the size and composition of a microphone array has on the computed SEL contour (ground footprint) areas used by such commissions. Descent flight acoustic data measured by a fifteen-microphone array were reprocessed for five different combinations of microphones within this array. This resulted in data for six different arrays for which SEL contours were computed. The fifteen-microphone array was defined as the 'baseline' array since it contained the greatest amount of data. The computations used a newly developed technique, the Acoustic Re-propagation Technique (ART), which uses parts of the NASA noise prediction program ROTONET. After the areas of the SEL contours were calculated, the differences between the areas were determined. The area differences for the six arrays show that a five- and a three-microphone array (with spacing typical of that required by the FAA FAR Part 36 noise certification procedure) compare well with the fifteen-microphone array. All data were obtained from a database resulting from a joint project conducted by NASA and U.S. Army researchers at Langley and Ames Research Centers. A brief description of the joint project test design, microphone array set-up, and data reduction methodology associated with the database is also given.
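    The SEL metric behind these footprints has a standard definition: ten times the base-10 logarithm of the time-integrated squared pressure ratio, normalized to a 1-second reference duration. A minimal sketch follows (the function name and interface are assumptions, not taken from the NASA study or ROTONET):

```python
import numpy as np

def sound_exposure_level(p, fs, p_ref=20e-6):
    """Sound exposure level (dB) of a pressure time history.

    p     : array of acoustic pressure samples in pascals
    fs    : sample rate in Hz (so dt = 1/fs; reference duration is 1 s)
    p_ref : reference pressure, 20 micropascals for air
    """
    energy = np.sum((np.asarray(p) / p_ref) ** 2) / fs  # integral of (p/p_ref)^2 dt
    return 10.0 * np.log10(energy)
```

    A 1-second signal held at 0.02 Pa, for instance, integrates to 60 dB SEL; halving the exposure duration at the same level drops the SEL by about 3 dB, which is why footprint areas are sensitive to flight-path timing as well as to level.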

  11. The EPA Comptox Chemistry Dashboard: A Web-Based Data ...

    EPA Pesticide Factsheets

    The U.S. Environmental Protection Agency (EPA) Computational Toxicology Program integrates advances in biology, chemistry, and computer science to help prioritize chemicals for further research based on potential human health risks. This work involves computational and data-driven approaches that integrate chemistry, exposure, and biological data. As an outcome of these efforts, the National Center for Computational Toxicology (NCCT) has measured, assembled, and delivered an enormous quantity and diversity of data for the environmental sciences, including high-throughput in vitro screening data, in vivo and functional use data, exposure models, and chemical databases with associated properties. A series of software applications and databases have been produced over the past decade to deliver these data, but recent developments have focused on a new software architecture that assembles the resources into a single platform. A new web application, the CompTox Chemistry Dashboard, provides access to data associated with ~720,000 chemical substances. These data include experimental and predicted physicochemical property data, bioassay screening data associated with the ToxCast program, product and functional use information, and a myriad of related data of value to environmental scientists. The dashboard provides chemical-based searching based on chemical names, synonyms, and CAS Registry Numbers. Flexible search capabilities allow for chemical identification.

  12. American Association of Orthodontists Foundation Craniofacial Growth Legacy Collection: Overview of a powerful tool for orthodontic research and teaching.

    PubMed

    Baumrind, Sheldon; Curry, Sean

    2015-08-01

    This article reports on the current status of the American Association of Orthodontists Foundation (AAOF) Craniofacial Growth Legacy Collection--an AAOF-supported multi-institutional project that uses the Internet and cloud computing to collect and share craniofacial images and data for orthodontic research and education. The project gives investigators and clinicians all over the world online access to longitudinal information on craniofacial development in untreated children with malocclusions of various types. It also is a unique source of control samples for testing the validity of consensually accepted beliefs about the effects of orthodontic treatment or of failure to treat. Copyright © 2015 American Association of Orthodontists. Published by Elsevier Inc. All rights reserved.

  13. Physical Medicine and Rehabilitation Resident Use of iPad Mini Mobile Devices.

    PubMed

    Niehaus, William; Boimbo, Sandra; Akuthota, Venu

    2015-05-01

    Previous research on the use of tablet devices in residency programs has been undertaken in radiology and medicine or with standard-sized tablet devices. With new, smaller tablet devices, there is an opportunity to assess their effect on resident behavior. This prospective study attempts to evaluate resident behavior after receiving a smaller tablet device, specifically whether smaller tablet computers facilitate residents' daily tasks. Prospective study that administered surveys to evaluate tablet computer use. Residency program. Thirteen physical medicine and rehabilitation residents. Residents were provided 16-GB iPad Minis and surveyed with REDCap to collect usage information at baseline, 3 months, and 6 months. Survey analysis was conducted using SAS (SAS Institute, Cary, NC) for descriptive analysis. To evaluate multiple areas of resident education, the following tasks were selected: accessing e-mail, logging duty hours, logging procedures, researching clinical information, accessing medical journals, reviewing didactic presentations, and completing evaluations. Measurements were then taken of (1) residents' responses to how tablet computers made it easier to perform the aforementioned tasks and (2) residents' responses to how tablet computers affected the frequency with which they performed the aforementioned tasks. After being provided tablet computers, our physical medicine and rehabilitation residents reported significantly greater access to e-mail, medical journals, and didactic material. Receiving tablet computers was also reported to increase the frequency with which residents accessed e-mail, researched clinical information, accessed medical journals, reviewed didactic presentations, and completed evaluations. After receiving a tablet computer, residents reported an increase in the use of calendar programs, note-taking programs, PDF readers, online storage programs, and file organization programs. These physical medicine and rehabilitation residents reported that tablet computers increased access to e-mail, presentation material, and medical journals. Tablet computers were also reported to increase the frequency with which residents were able to complete tasks associated with residency training. Copyright © 2015 American Academy of Physical Medicine and Rehabilitation. Published by Elsevier Inc. All rights reserved.

  14. Research | Computational Science | NREL

    Science.gov Websites

    Research. NREL's computational science experts use advanced high-performance computing (HPC) technologies, thereby accelerating the transformation of our nation's energy system. Enabling High-Impact Research: NREL's computational science capabilities enable high-impact research. Some recent examples…

  15. Designing Research Services: Cross-Disciplinary Administration and the Research Lifecycle

    NASA Astrophysics Data System (ADS)

    Madden, G.

    2017-12-01

    The sheer number of technical and administrative offices involved in the research lifecycle, and the lack of shared governance and shared processes across those offices, creates challenges to the successful preservation of research outputs. Universities need a more integrated approach to the research lifecycle that allows us to: recognize a research project as it is being initiated; identify the data associated with the research project; document and track any compliance, security, access, and publication requirements associated with the research and its data; follow the research and its associated components across the research lifecycle; and finally recognize that the research has come to a close so we can trigger the various preservation, access, and communications processes that close the loop, inform the public, and promote the continued progress of science. Such an approach will require cooperation, communications, and shared workflow tools that tie together (often across many years) PIs, research design methodologists, grants offices, contract negotiators, central research administrators, research compliance specialists, desktop IT support units, server administrators, high performance computing facilities, data centers, specialized data transfer networks, institutional research repositories, institutional data repositories, and research communications groups, all of which play a significant role in the technical or administrative success of research. This session will focus on progress towards improving cross-disciplinary administrative and technical cooperation at Penn State University, with an emphasis on generalizable approaches that can be adopted elsewhere.

  16. Chemical reacting flows

    NASA Technical Reports Server (NTRS)

    Mularz, Edward J.; Sockol, Peter M.

    1987-01-01

    Future aerospace propulsion concepts involve the combination of liquid or gaseous fuels in a highly turbulent internal air stream. Accurate predictive computer codes which can simulate the fluid mechanics, chemistry, and turbulence combustion interaction of these chemical reacting flows will be a new tool that is needed in the design of these future propulsion concepts. Experimental and code development research is being performed at Lewis to better understand chemical reacting flows with the long term goal of establishing these reliable computer codes. The approach to understanding chemical reacting flows is to look at separate simple parts of this complex phenomena as well as to study the full turbulent reacting flow process. As a result research on the fluid mechanics associated with chemical reacting flows was initiated. The chemistry of fuel-air combustion is also being studied. Finally, the phenomena of turbulence-combustion interaction is being investigated. This presentation will highlight research, both experimental and analytical, in each of these three major areas.

  17. Chemical reacting flows

    NASA Technical Reports Server (NTRS)

    Mularz, Edward J.; Sockol, Peter M.

    1990-01-01

    Future aerospace propulsion concepts involve the combustion of liquid or gaseous fuels in a highly turbulent internal airstream. Accurate predictive computer codes which can simulate the fluid mechanics, chemistry, and turbulence-combustion interaction of these chemical reacting flows will be a new tool that is needed in the design of these future propulsion concepts. Experimental and code development research is being performed at LeRC to better understand chemical reacting flows with the long-term goal of establishing these reliable computer codes. Our approach to understand chemical reacting flows is to look at separate, more simple parts of this complex phenomenon as well as to study the full turbulent reacting flow process. As a result, we are engaged in research on the fluid mechanics associated with chemical reacting flows. We are also studying the chemistry of fuel-air combustion. Finally, we are investigating the phenomenon of turbulence-combustion interaction. Research, both experimental and analytical, is highlighted in each of these three major areas.

  18. Coi-wiz: An interactive computer wizard for analyzing cardiac optical signals.

    PubMed

    Yuan, Xiaojing; Uyanik, Ilyas; Situ, Ning; Xi, Yutao; Cheng, Jie

    2009-01-01

    A number of revolutionary techniques have been developed for cardiac electrophysiology research to better study the various arrhythmia mechanisms, which can enhance ablating strategies for cardiac arrhythmias. Once three-dimensional, high-resolution cardiac optical imaging data are acquired, it is time-consuming to go through them manually and try to identify the patterns associated with various arrhythmia symptoms. In this paper, we present an interactive computer wizard that helps cardiac electrophysiology researchers visualize and analyze high-resolution cardiac optical imaging data. The wizard provides a file interface that accommodates different file formats. A series of analysis algorithms output waveforms, activation and action potential maps after spatial and temporal filtering, velocity fields, and heterogeneity measures. The interactive GUI allows the researcher to identify the region of interest in both the spatial and temporal domains, thus enabling them to study different heart chambers of their choice.

  19. Extracting Depth From Motion Parallax in Real-World and Synthetic Displays

    NASA Technical Reports Server (NTRS)

    Hecht, Heiko; Kaiser, Mary K.; Aiken, William; Null, Cynthia H. (Technical Monitor)

    1994-01-01

    In psychophysical studies on human sensitivity to visual motion parallax (MP), the use of computer displays is pervasive. However, a number of potential problems are associated with such displays: cue conflicts arise when observers accommodate to the screen surface, and observer head and body movements are often not reflected in the displays. We investigated observers' sensitivity to depth information in MP (slant, depth order, relative depth) using various real-world displays and their computer-generated analogs. Angle judgments of real-world stimuli were consistently superior to judgments that were based on computer-generated stimuli. Similar results were found for perceived depth order and relative depth. The perceptual competence of observers tends to be underestimated in research that is based on computer-generated displays, and such findings cannot be generalized to more realistic viewing situations.

  20. Proceedings of Selected Research Paper Presentations at the Convention of the Association for Educational Communications and Technology and Sponsored by the Research and Theory Division (11th, Dallas, Texas, February 1-5, 1989).

    ERIC Educational Resources Information Center

    Simonson, Michael R., Ed.; Frey, Diane, Ed.

    1989-01-01

    The 46 papers in this volume represent some of the most current thinking in educational communications and technology. Individual papers address the following topics: gender differences in the selection of elective computer science courses and in the selection of non-traditional careers; instruction for individuals with different cognitive styles;…

  1. An Interrogative Model of Computer-Aided Adaptive Testing: Some Experimental Evidence

    DTIC Science & Technology

    1988-09-01

    research report, Office of Naval Research, Arlington, VA, June 1986. Brown, J. S. and Harris, A., "Artificial Intelligence and...Building an Intelligent Tutoring System," in Methods and Tactics in Cognitive Science (Eds. Kintsch, Miller, and Polson), Lawrence Erlbaum Associates...Education, Washington, DC, November 1984. Sivasankaran, T. R. and Bui, Tung X., "A Bayesian Diagnostic Model for Intelligent CAI Systems

  2. USAF Summer Research Program - 1993 Graduate Student Research Program Final Reports, Volume 6, AEDC, FJSRL and WHMC

    DTIC Science & Technology

    1993-12-01

    Mechanical Engineering Associate, PhD Laboratory: PL/VT Division Engineering University of Texas, San Antonio Vol-Page No: 3-26 San Antonio, TX 7824-9065...parameters. The modules can be primitive or compound. Primitive modules represent the elementary computation units and define their interfaces. The...linear under varying conditions for the range of processor numbers. Discussion Performance: Our evaluation of the performance measurement results is the

  3. Analytic and Computational Studies on Micro-Propulsion and Micro-detonics

    DTIC Science & Technology

    2006-08-22

    Professor. • Dr. Aslan Kasimov, Postdoctoral Research Associate (Stewart), May 2004-June 2005. • Mr. Aslan Kasimov, Graduate Student (Stewart...Short continuing as his Ph.D. advisor. 2. Research completed with AFOSR support (a) Research summary: Prof. D.S. Stewart (1) A.R. Kasimov and D.S...Theory of Instability and Nonlinear Evolution of Self-Sustained Detonation Waves”. Ph.D., Spring 2004. • Mr. Dave Kessler, Graduate Student (Short

  4. The Unified Medical Language System

    PubMed Central

    Humphreys, Betsy L.; Lindberg, Donald A. B.; Schoolman, Harold M.; Barnett, G. Octo

    1998-01-01

    In 1986, the National Library of Medicine (NLM) assembled a large multidisciplinary, multisite team to work on the Unified Medical Language System (UMLS), a collaborative research project aimed at reducing fundamental barriers to the application of computers to medicine. Beyond its tangible products, the UMLS Knowledge Sources, and its influence on the field of informatics, the UMLS project is an interesting case study in collaborative research and development. It illustrates the strengths and challenges of substantive collaboration among widely distributed research groups. Over the past decade, advances in computing and communications have minimized the technical difficulties associated with UMLS collaboration and also facilitated the development, dissemination, and use of the UMLS Knowledge Sources. The spread of the World Wide Web has increased the visibility of the information access problems caused by multiple vocabularies and many information sources, which are the focus of UMLS work. The time is propitious for building on UMLS accomplishments and making more progress on the informatics research issues first highlighted by the UMLS project more than 10 years ago. PMID:9452981

  5. Engagement, Persistence, and Gender in Computer Science: Results of a Smartphone ESM Study

    PubMed Central

    Milesi, Carolina; Perez-Felkner, Lara; Brown, Kevin; Schneider, Barbara

    2017-01-01

    While the underrepresentation of women in the fast-growing STEM field of computer science (CS) has been much studied, no consensus exists on the key factors influencing this widening gender gap. Possible suspects include gender differences in aptitude, interest, and academic environment. Our study contributes to this literature by applying student engagement research to the experiences of college students studying CS, to assess the degree to which differences in men's and women's engagement may help account for gender inequity in the field. Specifically, we use the Experience Sampling Method (ESM) to evaluate in real time the engagement of college students during varied activities and environments. Over the course of a full week in fall semester and a full week in spring semester, 165 students majoring in CS at two Research I universities were “beeped” several times a day via a smartphone app prompting them to fill out a short questionnaire including open-ended and scaled items. These responses were paired with administrative data and over two years of transcript data provided by their institutions. We used mean comparisons and logistic regression analysis to compare enrollment and persistence patterns among CS men and women. Results suggest that despite the obstacles associated with women's underrepresentation in computer science, women are more likely to continue taking computer science courses when they felt challenged and skilled in their initial computer science classes. We discuss implications for further research. PMID:28487664
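    The persistence analysis above relies on logistic regression. A minimal from-scratch version, with a single hypothetical predictor standing in for the study's engagement measures (the feature and function names are illustrative, not the paper's actual covariates), looks like:

    ```python
    import math

    def fit_logistic(xs, ys, lr=0.1, epochs=2000):
        """Logistic regression (persist = 1 / leave = 0) by batch gradient descent.

        xs: list of feature vectors, e.g. one engagement score per student.
        ys: binary outcomes. Returns (weights, bias).
        """
        ndim = len(xs[0])
        w = [0.0] * ndim
        b = 0.0
        for _ in range(epochs):
            gw = [0.0] * ndim
            gb = 0.0
            for x, y in zip(xs, ys):
                z = sum(wi * xi for wi, xi in zip(w, x)) + b
                p = 1.0 / (1.0 + math.exp(-z))        # predicted persistence probability
                err = p - y                           # gradient of the log-loss
                for d in range(ndim):
                    gw[d] += err * x[d]
                gb += err
            w = [wi - lr * gi / len(xs) for wi, gi in zip(w, gw)]
            b -= lr * gb / len(xs)
        return w, b
    ```

    A positive fitted weight would correspond to the abstract's finding that higher engagement is associated with higher odds of persisting.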

  6. TARGET Research Goals

    Cancer.gov

    TARGET researchers use various sequencing and array-based methods to examine the genomes, transcriptomes, and, for some diseases, epigenomes of select childhood cancers. This “multi-omic” approach generates a comprehensive profile of molecular alterations for each cancer type. Alterations are changes in DNA or RNA, such as rearrangements in chromosome structure or variations in gene expression, respectively. Through computational analyses and assays to validate biological function, TARGET researchers predict which alterations disrupt the function of a gene or pathway and promote cancer growth, progression, and/or survival. Researchers identify candidate therapeutic targets and/or prognostic markers from the cancer-associated alterations.

  7. Revisiting Cognitive Tools: Shifting the Focus to Tools-in-Use

    ERIC Educational Resources Information Center

    Kim, Minchi C.

    2012-01-01

    Many studies have been conducted on the topics of tools, computers, and technology designed to promote student learning. However, researchers have rarely raised the critical issue of the lack of consensus on conceptualizing "cognitive tools" or the possible challenges associated with previous definitions. By examining the limitations on research…

  8. Delta Pi Epsilon National Research Conference Proceedings (Philadelphia, Pennsylvania, November 10-12, 1994).

    ERIC Educational Resources Information Center

    Delta Pi Epsilon Society, Little Rock, AR.

    Selected papers are as follows: "Are Office Support Personnel Aware of the Ergonomical Issues Associated with Computer Keyboarding?" (Evans); "Background and Characteristics of Japanese Students Who Enroll in an American Two-Year Information Processing Program Taught in Japan" (Morgan, Wiggs); "Business Education's (BE)…

  9. Intersectional Computer-Supported Collaboration in Business Writing: Learning through Challenged Performance

    ERIC Educational Resources Information Center

    Remley, Dirk

    2009-01-01

    Carter (2007) identifies four meta-genres associated with writing activities that can help students learn discipline-specific writing skills relative to standards within a given field: these include problem solving, empirical approaches to analysis, selection of sources to use within research, and production of materials that meet accepted…

  10. Electronic Journals in Academic Libraries: A Comparison of ARL and Non-ARL Libraries.

    ERIC Educational Resources Information Center

    Shemberg, Marian; Grossman, Cheryl

    1999-01-01

    Describes a survey dealing with academic library provision of electronic journals and other electronic resources that compared ARL (Association of Research Libraries) members to non-ARL members. Highlights include full-text electronic journals; computers in libraries; online public access catalogs; interlibrary loan and electronic reserves; access…

  11. Information Competency and Creative Initiative of Personality and Their Manifestation in Activity

    ERIC Educational Resources Information Center

    Tabachuk, Natalia P.; Ledovskikh, Irina A.; Shulika, Nadezhda A.; Karpova, Irina V.; Kazinets, Victor A.; Polichka, Anatolii E.

    2018-01-01

    The relevance of the research is due to the global trends of development of the information society that are associated with the rapid advancement of civilization (IT penetration, increased computer availability, variability) and innovation processes in the sphere of education (competency-based approach, humanization and humanitarization). These…

  12. Annual Research Briefs - 2000: Center for Turbulence Research

    NASA Technical Reports Server (NTRS)

    2000-01-01

    This report contains the 2000 annual progress reports of the postdoctoral Fellows and visiting scholars of the Center for Turbulence Research (CTR). It summarizes the research efforts undertaken under the core CTR program. Last year, CTR sponsored sixteen resident Postdoctoral Fellows, nine Research Associates, and two Senior Research Fellows, hosted seven short term visitors, and supported four doctoral students. The Research Associates are supported by the Departments of Defense and Energy. The reports in this volume are divided into five groups. The first group largely consists of the new areas of interest at CTR. It includes efficient algorithms for molecular dynamics, stability in protoplanetary disks, and experimental and numerical applications of evolutionary optimization algorithms for jet flow control. The next group of reports is in experimental, theoretical, and numerical modeling efforts in turbulent combustion. As more challenging computations are attempted, the need for additional theoretical and experimental studies in combustion has emerged. A pacing item for computation of nonpremixed combustion is the prediction of extinction and re-ignition phenomena, which is currently being addressed at CTR. The third group of reports is in the development of accurate and efficient numerical methods, which has always been an important part of CTR's work. This is the tool development part of the program which supports our high fidelity numerical simulations in such areas as turbulence in complex geometries, hypersonics, and acoustics. The final two groups of reports are concerned with LES and RANS prediction methods. There has been significant progress in wall modeling for LES of high Reynolds number turbulence and in validation of the v^2-f model for industrial applications.

  13. Using the cloud to speed-up calibration of watershed-scale hydrologic models (Invited)

    NASA Astrophysics Data System (ADS)

    Goodall, J. L.; Ercan, M. B.; Castronova, A. M.; Humphrey, M.; Beekwilder, N.; Steele, J.; Kim, I.

    2013-12-01

    This research focuses on using the cloud to address computational challenges associated with hydrologic modeling. One example is calibration of a watershed-scale hydrologic model, which can take days of execution time on typical computers. While parallel algorithms for model calibration exist and some researchers have used multi-core computers or clusters to run these algorithms, these solutions do not fully address the challenge because (i) calibration can still be too time consuming even on multicore personal computers and (ii) few in the community have the time and expertise needed to manage a compute cluster. Given this, another option for addressing this challenge that we are exploring through this work is the use of the cloud to speed up calibration of watershed-scale hydrologic models. The cloud used in this capacity provides a means for renting a specific number and type of machines for only the time needed to perform a calibration model run. The cloud allows one to precisely balance the duration of the calibration with the financial costs so that, if the budget allows, the calibration can be performed more quickly by renting more machines. Focusing specifically on the SWAT hydrologic model and a parallel version of the DDS calibration algorithm, we show significant speed-ups across a range of watershed sizes using up to 256 cores to perform a model calibration. The tool provides a simple web-based user interface and the ability to monitor job submission and progress during calibration. Finally, the talk concludes with initial work to leverage the cloud for other tasks associated with hydrologic modeling, including tasks related to preparing inputs for constructing place-based hydrologic models.
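    The DDS (dynamically dimensioned search) algorithm referenced above can be sketched in serial form as follows. This is a minimal sketch after Tolson and Shoemaker's published description, with a toy quadratic standing in for a SWAT model run; in the work described, each objective evaluation is a full model execution farmed out to rented cloud machines, and the clamping at the bounds is a simplification of the usual reflection rule.

    ```python
    import math
    import random

    def dds(objective, lo, hi, n_iter=300, r=0.2, seed=0):
        """Minimal dynamically dimensioned search (DDS) minimiser.

        At iteration i, each dimension is perturbed with probability
        1 - ln(i)/ln(n_iter), so the search narrows from global to local
        as the iteration budget is spent. Greedy acceptance keeps the
        best parameter set found so far.
        """
        rng = random.Random(seed)
        ndim = len(lo)
        best = [rng.uniform(lo[d], hi[d]) for d in range(ndim)]
        best_val = objective(best)
        for i in range(1, n_iter + 1):
            p = 1.0 - math.log(i) / math.log(n_iter)
            dims = [d for d in range(ndim) if rng.random() < p]
            if not dims:                                   # always perturb at least one
                dims = [rng.randrange(ndim)]
            cand = best[:]
            for d in dims:
                cand[d] += rng.gauss(0.0, r * (hi[d] - lo[d]))
                cand[d] = min(max(cand[d], lo[d]), hi[d])  # clamp to bounds
            val = objective(cand)
            if val <= best_val:                            # greedy acceptance
                best, best_val = cand, val
        return best, best_val
    ```

    Parallel variants evaluate many candidates per iteration, which is what makes renting a burst of cloud cores attractive for this algorithm.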

  14. Flow Field Analysis of Fully Coupled Computations of a Flexible Wing undergoing Stall Flutter

    DTIC Science & Technology

    2016-01-01

    unsteady aerodynamic loads due to structural displacements. In terms of actuation, most, if not all, active ∗Research Associate, Department of...flutter suppression techniques, conventional trailing edge flap actuators with a bandwidth of 10 Hz were used. Interestingly, the frequencies associated...influence of the flow features on the aeroelastic instability are quantified. Finally, the influence of actuation through a blowing port at 75% span is

  15. A Systems Engineering Survey of Artificial Intelligence and Smart Sensor Networks in a Network-Centric Environment

    DTIC Science & Technology

    2009-09-01

    problems, to better model the problem solving of computer systems. This research brought about the intertwining of AI and cognitive psychology. Much of...where symbol sequences are sequential intelligent states of the network, and must be classified as normal, abnormal, or unknown. These symbols...is associated with abnormal behavior; and abcbc is associated with unknown behavior, as it fits no known behavior. Predicted outcomes from

  16. Portable Computer Technology (PCT) Research and Development Program Phase 2

    NASA Technical Reports Server (NTRS)

    Castillo, Michael; McGuire, Kenyon; Sorgi, Alan

    1995-01-01

    This project report focuses on: (1) Design and development of two Advanced Portable Workstation 2 (APW 2) units. These units incorporate advanced technology features such as a low power Pentium processor, a high resolution color display, National Television Standards Committee (NTSC) video handling capabilities, a Personal Computer Memory Card International Association (PCMCIA) interface, and Small Computer System Interface (SCSI) and ethernet interfaces. (2) Use of these units to integrate and demonstrate advanced wireless network and portable video capabilities. (3) Qualification of the APW 2 systems for use in specific experiments aboard the Mir Space Station. A major objective of the PCT Phase 2 program was to help guide future choices in computing platforms and techniques for meeting National Aeronautics and Space Administration (NASA) mission objectives. The focus was on the development of optimal configurations of computing hardware, software applications, and network technologies for use on NASA missions.

  17. Analysis of Biosignals During Immersion in Computer Games.

    PubMed

    Yeo, Mina; Lim, Seokbeen; Yoon, Gilwon

    2017-11-17

    The number of computer game users is increasing as computers and various IT devices connected to the Internet are commonplace across all ages. In this research, in order to find the relevance of behavioral activity and its associated biosignals, biosignal changes before, during, and after computer games were measured and analyzed for 31 subjects. For this purpose, a device to measure electrocardiogram, photoplethysmogram and skin temperature was developed such that the effect of motion artifacts could be minimized. The device was made wearable for convenient measurement. The game selected for the experiments was League of Legends™. Analysis of the pulse transit time, heart rate variability and skin temperature showed increased sympathetic nerve activity during computer games, while the parasympathetic nerves became less active. Interestingly, the sympathetic predominance group showed less change in heart rate variability as compared to the normal group. The results can be valuable for studying internet gaming disorder.
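    The heart rate variability analysis mentioned above starts from the intervals between successive heartbeats. A minimal sketch of the standard time-domain metrics (the function name and returned keys are illustrative, not the paper's implementation):

    ```python
    import math

    def hrv_metrics(rr_ms):
        """Time-domain HRV from a list of RR intervals in milliseconds.

        SDNN  : standard deviation of all RR intervals.
        RMSSD : root mean square of successive differences, the metric
                most commonly tied to parasympathetic (vagal) activity.
        """
        n = len(rr_ms)
        mean = sum(rr_ms) / n
        sdnn = math.sqrt(sum((x - mean) ** 2 for x in rr_ms) / (n - 1))
        diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
        rmssd = math.sqrt(sum(d * d for d in diffs) / len(diffs))
        hr = 60000.0 / mean                       # mean heart rate in bpm
        return {"mean_rr": mean, "sdnn": sdnn, "rmssd": rmssd, "hr": hr}
    ```

    Comparing these metrics before, during, and after play is one way to quantify the sympathetic/parasympathetic shift the study reports.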

  18. Managing geometric information with a data base management system

    NASA Technical Reports Server (NTRS)

    Dube, R. P.

    1984-01-01

    The strategies for managing computer based geometry are described. The computer model of geometry is the basis for communication, manipulation, and analysis of shape information. The research on Integrated Programs for Aerospace-Vehicle Design (IPAD) focuses on the use of data base management system (DBMS) technology to manage engineering/manufacturing data. The objective of IPAD is to develop a computer based engineering complex which automates the storage, management, protection, and retrieval of engineering data. In particular, this facility must manage geometry information as well as associated data. The approach taken on the IPAD project to achieve this objective is discussed. Geometry management in current systems and the approach taken in the early IPAD prototypes are examined.

  19. Examining effectiveness of tailorable computer-assisted therapy programmes for substance misuse: Programme usage and clinical outcomes data from Breaking Free Online.

    PubMed

    Elison, Sarah; Jones, Andrew; Ward, Jonathan; Davies, Glyn; Dugdale, Stephanie

    2017-11-01

    When evaluating complex, tailorable digital behavioural interventions, additional approaches may be required alongside established methodologies such as randomised controlled trials (RCTs). Research evaluating a computer-assisted therapy (CAT) programme for substance misuse, Breaking Free Online (BFO), is informed by Medical Research Council (MRC) guidance recommending examination of 'mechanisms of action' of individual intervention strategies, which is relevant when evaluating digital interventions with content that may evolve over time. To report findings from examination of mechanisms of action of tailoring advice within the BFO programme and outcomes from specific intervention strategies. Analysis of covariance and linear regressions were used to assess intervention completion data, and psychometric and clinical outcomes, for 2311 service users accessing drug and alcohol treatment services across the UK. Tailoring advice provided to users appeared to prompt them to prioritise completion of intervention strategies associated with their areas of highest biopsychosocial impairment. Completion of specific intervention strategies within BFO were associated with specific clinical outcomes, with a dose response also being found. Mechanisms of action analyses revealed the primacy of cognitions, with cognitive restructuring strategies being associated with improvements in mental health, severity of substance dependence, quality of life and global biopsychosocial functioning. The MRC framework provides an evolved research paradigm within the field of digital behavioural change. By assessing baseline profiles of need, BFO can target the most appropriate clinical content for individual users. Mechanisms of action research can be used to inform modifications to BFO to continually update clinical content and the technology platform. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. Surviving the Glut: The Management of Event Streams in Cyberphysical Systems

    NASA Astrophysics Data System (ADS)

    Buchmann, Alejandro

    Alejandro Buchmann is Professor in the Department of Computer Science, Technische Universität Darmstadt, where he heads the Databases and Distributed Systems Group. He received his MS (1977) and PhD (1980) from the University of Texas at Austin. He was an Assistant/Associate Professor at the Institute for Applied Mathematics and Systems IIMAS/UNAM in Mexico, doing research on databases for CAD, geographic information systems, and object-oriented databases. At Computer Corporation of America (later Xerox Advanced Information Systems) in Cambridge, Mass., he worked in the areas of active databases and real-time databases, and at GTE Laboratories, Waltham, in the areas of distributed object systems and the integration of heterogeneous legacy systems. In 1991 he returned to academia and joined T.U. Darmstadt. His current research interests are at the intersection of middleware, databases, event-based distributed systems, ubiquitous computing, and very large distributed systems (P2P, WSN). Much of the current research is concerned with guaranteeing quality of service and reliability properties in these systems, for example, scalability, performance, transactional behaviour, consistency, and end-to-end security. Many research projects involve collaboration with industry and cover a broad spectrum of application domains. Further information can be found at http://www.dvs.tu-darmstadt.de

  1. Assessment of CFD capability for prediction of hypersonic shock interactions

    NASA Astrophysics Data System (ADS)

    Knight, Doyle; Longo, José; Drikakis, Dimitris; Gaitonde, Datta; Lani, Andrea; Nompelis, Ioannis; Reimann, Bodo; Walpot, Louis

    2012-01-01

    The aerothermodynamic loadings associated with shock wave boundary layer interactions (shock interactions) must be carefully considered in the design of hypersonic air vehicles. The capability of Computational Fluid Dynamics (CFD) software to accurately predict hypersonic shock wave laminar boundary layer interactions is examined. A series of independent computations performed by researchers in the US and Europe are presented for two generic configurations (double cone and cylinder) and compared with experimental data. The results illustrate the current capabilities and limitations of modern CFD methods for these flows.

  2. Mathematical and computational modelling of skin biophysics: a review

    PubMed Central

    2017-01-01

    The objective of this paper is to provide a review on some aspects of the mathematical and computational modelling of skin biophysics, with special focus on constitutive theories based on nonlinear continuum mechanics from elasticity, through anelasticity, including growth, to thermoelasticity. Microstructural and phenomenological approaches combining imaging techniques are also discussed. Finally, recent research applications on skin wrinkles will be presented to highlight the potential of physics-based modelling of skin in tackling global challenges such as ageing of the population and the associated skin degradation, diseases and traumas. PMID:28804267

  3. Mathematical and computational modelling of skin biophysics: a review

    NASA Astrophysics Data System (ADS)

    Limbert, Georges

    2017-07-01

    The objective of this paper is to provide a review on some aspects of the mathematical and computational modelling of skin biophysics, with special focus on constitutive theories based on nonlinear continuum mechanics from elasticity, through anelasticity, including growth, to thermoelasticity. Microstructural and phenomenological approaches combining imaging techniques are also discussed. Finally, recent research applications on skin wrinkles will be presented to highlight the potential of physics-based modelling of skin in tackling global challenges such as ageing of the population and the associated skin degradation, diseases and traumas.

  4. NASA LeRC/Akron University Graduate Cooperative Fellowship Program and Graduate Student Researchers Program

    NASA Technical Reports Server (NTRS)

    Fertis, D. G.; Simon, A. L.

    1981-01-01

    The requisite methodology to solve linear and nonlinear problems associated with the static and dynamic analysis of rotating machinery, their static and dynamic behavior, and the interaction between the rotating and nonrotating parts of an engine is developed. Linear and nonlinear structural engine problems are investigated by developing solution strategies and interactive computational methods whereby the user and computer can communicate directly in making analysis decisions. Representative examples include modifying structural models, changing material parameters, selecting analysis options, and coupling with interactive graphical display for pre- and postprocessing capability.

  5. Analysis of cognitive theories in artificial intelligence and psychology in relation to the qualitative process of emotion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Semrau, P.

    The purpose of this study was to analyze selected cognitive theories in the areas of artificial intelligence (A.I.) and psychology to determine the role of emotions in the cognitive or intellectual processes. Understanding the relationship of emotions to processes of intelligence has implications for constructing theories of aesthetic response and A.I. systems in art. Psychological theories were examined that demonstrated the changing nature of the research in emotion related to cognition. The basic techniques in A.I. were reviewed and the A.I. research was analyzed to determine the process of cognition and the role of emotion. The A.I. research emphasized the digital, quantifiable character of the computer and associated cognitive models and programs. In conclusion, the cognitive-emotive research in psychology and the cognitive research in A.I. emphasized quantification methods over analog and qualitative characteristics required for a holistic explanation of cognition. Further A.I. research needs to examine the qualitative aspects of values, attitudes, and beliefs on influencing the creative thinking processes. Inclusion of research related to qualitative problem solving in art provides a more comprehensive base of study for examining the area of intelligence in computers.

  6. Evaluative studies in nuclear medicine research: emission computed tomography assessment. Final report, January 1-December 31, 1981

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Potchen, E.J.; Harris, G.I.; Gift, D.A.

    The report provides information on an assessment of the potential short and long term benefits of emission computed tomography (ECT) in biomedical research and patient care. Work during the past year has been augmented by the development and use of an opinion survey instrument to reach a wider representation of knowledgeable investigators and users of this technology. This survey instrument is reproduced in an appendix. Information derived from analysis of the opinion survey, and used in conjunction with results of independent staff studies of available sources, provides the basis for the discussions given in following sections of PET applications in the brain, of technical factors, and of economic implications. Projections of capital and operating costs on a per study basis were obtained from a computerized, pro forma accounting model and are compared with the survey cost estimates for both research and clinical modes of application. The results of a cash-flow model analysis of the relationship between projected economic benefit of PET research to disease management and the costs associated with such research are presented and discussed.

  7. NASA's computer science research program

    NASA Technical Reports Server (NTRS)

    Larsen, R. L.

    1983-01-01

    Following a major assessment of NASA's computing technology needs, a new program of computer science research has been initiated by the Agency. The program includes work in concurrent processing, management of large scale scientific databases, software engineering, reliable computing, and artificial intelligence. The program is driven by applications requirements in computational fluid dynamics, image processing, sensor data management, real-time mission control and autonomous systems. It consists of university research, in-house NASA research, and NASA's Research Institute for Advanced Computer Science (RIACS) and Institute for Computer Applications in Science and Engineering (ICASE). The overall goal is to provide the technical foundation within NASA to exploit advancing computing technology in aerospace applications.

  8. Stuck on Screens: Patterns of Computer and Gaming Station Use in Youth Seen in a Psychiatric Clinic

    PubMed Central

    Baer, Susan; Bogusz, Elliot; Green, David A.

    2011-01-01

    Objective: Computer and gaming-station use has become entrenched in the culture of our youth. Parents of children with psychiatric disorders report concerns about overuse, but research in this area is limited. The goal of this study is to evaluate computer/gaming-station use in adolescents in a psychiatric clinic population and to examine the relationship between use and functional impairment. Method: 102 adolescents, ages 11–17, from out-patient psychiatric clinics participated. Amount of computer/gaming-station use, type of use (gaming or non-gaming), and presence of addictive features were ascertained along with emotional/functional impairment. Multivariate linear regression was used to examine correlations between patterns of use and impairment. Results: Mean screen time was 6.7±4.2 hrs/day. Presence of addictive features was positively correlated with emotional/functional impairment. Time spent on computer/gaming-station use was not correlated overall with impairment after controlling for addictive features, but non-gaming time was positively correlated with risky behavior in boys. Conclusions: Youth with psychiatric disorders are spending much of their leisure time on the computer/gaming-station and a substantial subset show addictive features of use which is associated with impairment. Further research to develop measures and to evaluate risk is needed to identify the impact of this problem. PMID:21541096

  9. Perceptual factors that influence use of computer enhanced visual displays

    NASA Technical Reports Server (NTRS)

    Littman, David; Boehm-Davis, Debbie

    1993-01-01

    This document is the final report for the NASA/Langley contract entitled 'Perceptual Factors that Influence Use of Computer Enhanced Visual Displays.' The document consists of two parts. The first part contains a discussion of the problem to which the grant was addressed, a brief discussion of work performed under the grant, and several issues suggested for follow-on work. The second part, presented as Appendix I, contains the annual report produced by Dr. Ann Fulop, the Postdoctoral Research Associate who worked on-site in this project. The main focus of this project was to investigate perceptual factors that might affect a pilot's ability to use computer generated information that is projected into the same visual space that contains information about real world objects. For example, computer generated visual information can identify the type of an attacking aircraft, or its likely trajectory. Such computer generated information must not be so bright that it adversely affects a pilot's ability to perceive other potential threats in the same volume of space. Or, perceptual attributes of computer generated and real display components should not contradict each other in ways that lead to problems of accommodation and, thus, distance judgments. The purpose of the research carried out under this contract was to begin to explore the perceptual factors that contribute to effective use of these displays.

  10. Association between Screen Viewing Duration and Sleep Duration, Sleep Quality, and Excessive Daytime Sleepiness among Adolescents in Hong Kong

    PubMed Central

    Mak, Yim Wah; Wu, Cynthia Sau Ting; Hui, Donna Wing Shun; Lam, Siu Ping; Tse, Hei Yin; Yu, Wing Yan; Wong, Ho Ting

    2014-01-01

    Screen viewing is considered to have adverse impacts on the sleep of adolescents. Although there has been a considerable amount of research on the association between screen viewing and sleep, most studies have focused on specific types of screen viewing devices such as televisions and computers. The present study investigated the duration with which currently prevalent screen viewing devices (including televisions, personal computers, mobile phones, and portable video devices) are viewed in relation to sleep duration, sleep quality, and daytime sleepiness among Hong Kong adolescents (N = 762). Television and computer viewing remain prevalent, but were not correlated with sleep variables. Mobile phone viewing was correlated with all sleep variables, while portable video device viewing was shown to be correlated only with daytime sleepiness. The results demonstrated a trend of increase in the prevalence and types of screen viewing and their effects on the sleep patterns of adolescents. PMID:25353062

  11. Resin-composite blocks for dental CAD/CAM applications.

    PubMed

    Ruse, N D; Sadoun, M J

    2014-12-01

    Advances in digital impression technology and manufacturing processes have led to a dramatic paradigm shift in dentistry and to the widespread use of computer-aided design/computer-aided manufacturing (CAD/CAM) in the fabrication of indirect dental restorations. Research and development in materials suitable for CAD/CAM applications are currently the most active field in dental materials. Two classes of materials are used in the production of CAD/CAM restorations: glass-ceramics/ceramics and resin composites. While glass-ceramics/ceramics have overall superior mechanical and esthetic properties, resin-composite materials may offer significant advantages related to their machinability and intra-oral reparability. This review summarizes recent developments in resin-composite materials for CAD/CAM applications, focusing on both commercial and experimental materials. © International & American Associations for Dental Research.

  12. GSTARS computer models and their applications, Part II: Applications

    USGS Publications Warehouse

    Simoes, F.J.M.; Yang, C.T.

    2008-01-01

    In part 1 of this two-paper series, a brief summary of the basic concepts and theories used in developing the Generalized Stream Tube model for Alluvial River Simulation (GSTARS) computer models was presented. Part 2 provides examples that illustrate some of the capabilities of the GSTARS models and how they can be applied to solve a wide range of river and reservoir sedimentation problems. Laboratory and field case studies are used and the examples show representative applications of the earlier and of the more recent versions of GSTARS. Some of the more recent capabilities implemented in GSTARS3, one of the latest versions of the series, are also discussed here with more detail. © 2008 International Research and Training Centre on Erosion and Sedimentation and the World Association for Sedimentation and Erosion Research.

  13. Computer-Assisted Diagnosis of the Sleep Apnea-Hypopnea Syndrome: A Review

    PubMed Central

    Alvarez-Estevez, Diego; Moret-Bonillo, Vicente

    2015-01-01

    Automatic diagnosis of the Sleep Apnea-Hypopnea Syndrome (SAHS) has become an important area of research due to the growing interest in the field of sleep medicine and the costs associated with its manual diagnosis. The increment and heterogeneity of the different techniques, however, make it somewhat difficult to adequately follow the recent developments. A literature review within the area of computer-assisted diagnosis of SAHS has been performed comprising the last 15 years of research in the field. Screening approaches, methods for the detection and classification of respiratory events, comprehensive diagnostic systems, and an outline of current commercial approaches are reviewed. An overview of the different methods is presented together with validation analysis and critical discussion of the current state of the art. PMID:26266052

  14. Parallel Distributed Processing Theory in the Age of Deep Networks.

    PubMed

    Bowers, Jeffrey S

    2017-12-01

    Parallel distributed processing (PDP) models in psychology are the precursors of deep networks used in computer science. However, only PDP models are associated with two core psychological claims, namely that all knowledge is coded in a distributed format and cognition is mediated by non-symbolic computations. These claims have long been debated in cognitive science, and recent work with deep networks speaks to this debate. Specifically, single-unit recordings show that deep networks learn units that respond selectively to meaningful categories, and researchers are finding that deep networks need to be supplemented with symbolic systems to perform some tasks. Given the close links between PDP and deep networks, it is surprising that research with deep networks is challenging PDP theory. Copyright © 2017. Published by Elsevier Ltd.

  15. BioSMACK: a linux live CD for genome-wide association analyses.

    PubMed

    Hong, Chang Bum; Kim, Young Jin; Moon, Sanghoon; Shin, Young-Ah; Go, Min Jin; Kim, Dong-Joon; Lee, Jong-Young; Cho, Yoon Shin

    2012-01-01

    Recent advances in high-throughput genotyping technologies have enabled us to conduct a genome-wide association study (GWAS) on a large cohort. However, analyzing millions of single nucleotide polymorphisms (SNPs) is still a difficult task for researchers conducting a GWAS. Several difficulties, such as compatibilities and dependencies, are often encountered by researchers during the installation and use of analytical software. This is a huge obstacle to any research institute without computing facilities and specialists. Therefore, a proper research environment is an urgent need for researchers working on GWAS. We developed BioSMACK to provide a research environment for GWAS that requires no configuration and is easy to use. BioSMACK is based on the Ubuntu Live CD that offers a complete Linux-based operating system environment without installation. Moreover, we provide users with a GWAS manual consisting of a series of guidelines for GWAS and useful examples. BioSMACK is freely available at http://ksnp.cdc.go.kr/biosmack.
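
    The basic per-SNP computation that the analysis tools bundled on such a live CD automate can be sketched as a 2x2 allelic chi-square test comparing allele counts in cases versus controls. This is a generic illustration, not BioSMACK's own code, and the counts below are invented.

    ```python
    # Pearson chi-square test on a 2x2 table of allele counts (cases vs. controls).
    # A large statistic indicates a case/control allele-frequency difference at this SNP.
    def allelic_chi2(case_a, case_b, ctrl_a, ctrl_b):
        """Chi-square statistic for a 2x2 allele-count table."""
        table = [[case_a, case_b], [ctrl_a, ctrl_b]]
        total = case_a + case_b + ctrl_a + ctrl_b
        row = [case_a + case_b, ctrl_a + ctrl_b]
        col = [case_a + ctrl_a, case_b + ctrl_b]
        chi2 = 0.0
        for i in range(2):
            for j in range(2):
                expected = row[i] * col[j] / total  # count expected under independence
                chi2 += (table[i][j] - expected) ** 2 / expected
        return chi2

    # Hypothetical SNP where allele A is enriched in cases.
    stat = allelic_chi2(case_a=300, case_b=200, ctrl_a=250, ctrl_b=250)
    print(stat)
    ```

    In a real GWAS this test (or a regression-based equivalent) is repeated for each of millions of SNPs, which is why a preconfigured environment matters.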

  16. Adolescent Sedentary Behaviors: Correlates Differ for Television Viewing and Computer Use

    PubMed Central

    Babey, Susan H.; Hastert, Theresa A.; Wolstein, Joelle

    2013-01-01

    Purpose Sedentary behavior is associated with obesity in youth. Understanding correlates of specific sedentary behaviors can inform the development of interventions to reduce sedentary time. The current research examines correlates of leisure computer use and television viewing among California adolescents. Methods Using data from the 2005 California Health Interview Survey (CHIS), we examined individual, family and environmental correlates of two sedentary behaviors among 4,029 adolescents: leisure computer use and television watching. Results Linear regression analyses adjusting for a range of factors indicated several differences in the correlates of television watching and computer use. Correlates of additional time spent watching television included male sex, American Indian and African American race, lower household income, lower levels of physical activity, lower parent educational attainment, and additional hours worked by parents. Correlates of a greater amount of time spent using the computer for fun included older age, Asian race, higher household income, lower levels of physical activity, less parental knowledge of free time activities, and living in neighborhoods with higher proportions of non-white residents and higher proportions of low-income residents. Only physical activity was associated similarly with both watching television and computer use. Conclusions These results suggest that correlates of time spent on television watching and leisure computer use are different. Reducing screen time is a potentially successful strategy in combating childhood obesity, and understanding differences in the correlates of different screen time behaviors can inform the development of more effective interventions to reduce sedentary time. PMID:23260837

  17. Simulating smokers' acceptance of modifications in a cessation program.

    PubMed Central

    Spoth, R

    1992-01-01

    Recent research has underscored the importance of assessing barriers to smokers' acceptance of cessation programs. This paper illustrates the use of computer simulations to gauge smokers' response to program modifications which may produce barriers to participation. It also highlights methodological issues encountered in conducting this work. Computer simulations were based on conjoint analysis, a consumer research method which enables measurement of smokers' relative preference for various modifications of cessation programs. Results from two studies are presented in this paper. The primary study used a randomly selected sample of 218 adult smokers who participated in a computer-assisted phone interview. Initially, the study assessed smokers' relative utility rating of 30 features of cessation programs. Utility data were used in computer-simulated comparisons of a low-cost, self-help oriented program under development and five other existing programs. A baseline version of the program under development and two modifications (for example, use of a support group with a higher level of cost) were simulated. Both the baseline version and modifications received a favorable response vis-à-vis comparison programs. Modifications requiring higher program costs were, however, associated with moderately reduced levels of favorable consumer response. The second study used a sample of 70 smokers who responded to an expanded set of smoking cessation program features focusing on program packaging. This secondary study incorporated in-person, computer-assisted interviews at a shopping mall, with smokers viewing an artist's mock-up of various program options on display. A similar pattern of responses to simulated program modifications emerged, with monetary cost apparently playing a key role. The significance of conjoint-based computer simulation as a tool in program development or dissemination, salient methodological issues, and implications for further research are discussed. PMID:1738813
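
    The conjoint-based simulation described above can be sketched with the standard first-choice rule: each simulated respondent "chooses" the program with the highest summed part-worth utility, and preference share is the fraction choosing each option. All feature names and utilities below are invented for illustration; the study's actual features and data are not reproduced here.

    ```python
    # First-choice conjoint simulation: respondents pick the program whose
    # features sum to the highest personal part-worth utility.
    from collections import Counter

    programs = {
        "baseline": ["self_help", "low_cost"],
        "modified": ["self_help", "support_group"],  # support group raises cost
    }

    # Per-respondent part-worth utilities for each feature (hypothetical).
    respondents = [
        {"self_help": 0.5, "low_cost": 1.2, "support_group": 0.4},
        {"self_help": 0.5, "low_cost": 0.3, "support_group": 1.1},
        {"self_help": 0.5, "low_cost": 0.9, "support_group": 0.6},
    ]

    def choice(partworths):
        # Program with the highest total utility for this respondent.
        return max(programs, key=lambda p: sum(partworths[f] for f in programs[p]))

    shares = Counter(choice(r) for r in respondents)
    print(shares)  # e.g. Counter({'baseline': 2, 'modified': 1})
    ```

    Rerunning the choice simulation after changing a program's feature list (e.g. adding the support group) shows how preference share shifts, which is the core of the modification comparisons reported above.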

  19. Screen time is associated with depression and anxiety in Canadian youth.

    PubMed

    Maras, Danijela; Flament, Martine F; Murray, Marisa; Buchholz, Annick; Henderson, Katherine A; Obeid, Nicole; Goldfield, Gary S

    2015-04-01

    This study examined the relationships between screen time and symptoms of depression and anxiety in a large community sample of Canadian youth. Participants were 2482 English-speaking grade 7 to 12 students. Cross-sectional data collected between 2006 and 2010 as part of the Research on Eating and Adolescent Lifestyles (REAL) study were used. Mental health status was assessed using the Children's Depression Inventory and the Multidimensional Anxiety Scale for Children-10. Screen time (hours/day of TV, video games, and computer) was assessed using the Leisure-Time Sedentary Activities questionnaire. Linear multiple regressions indicated that after controlling for age, sex, ethnicity, parental education, geographic area, physical activity, and BMI, duration of screen time was associated with severity of depression (β=0.23, p<0.001) and anxiety (β=0.07, p<0.01). Video game playing (β=0.13, p<0.001) and computer use (β=0.17, p<0.001) but not TV viewing were associated with more severe depressive symptoms. Video game playing (β=0.11, p<0.001) was associated with severity of anxiety. Screen time may represent a risk factor or marker of anxiety and depression in adolescents. Future research is needed to determine if reducing screen time aids the prevention and treatment of these psychiatric disorders in youth. Crown Copyright © 2015. Published by Elsevier Inc. All rights reserved.
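
    The "linear multiple regression controlling for covariates" analysis reported above can be sketched with ordinary least squares on standardized variables, so the fitted coefficient is comparable to the reported betas. The data, effect sizes, and single covariate below are invented for illustration, not taken from the REAL study.

    ```python
    # Standardized multiple linear regression of an outcome on screen time
    # while controlling for one covariate, via OLS with numpy.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 500
    activity = rng.normal(size=n)                  # hypothetical covariate
    screen = rng.normal(size=n) - 0.3 * activity   # hypothetical screen-time hours
    depression = 0.23 * screen - 0.2 * activity + rng.normal(size=n)

    def z(x):
        # Standardize to mean 0, SD 1 so coefficients are standardized betas.
        return (x - x.mean()) / x.std()

    # Design matrix: intercept, screen time, covariate (all standardized).
    X = np.column_stack([np.ones(n), z(screen), z(activity)])
    beta, *_ = np.linalg.lstsq(X, z(depression), rcond=None)
    print(beta[1])  # estimated standardized coefficient for screen time
    ```

    Adding further covariate columns (age, sex, parental education, etc.) to `X` is how such models "control for" those factors.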

  20. Acquisition of gamma camera and physiological data by computer.

    PubMed

    Hack, S N; Chang, M; Line, B R; Cooper, J A; Robeson, G H

    1986-11-01

    We have designed, implemented, and tested a new Research Data Acquisition System (RDAS) that permits a general purpose digital computer to acquire signals from both gamma camera sources and physiological signal sources concurrently. This system overcomes the limited multi-source, high speed data acquisition capabilities found in most clinically oriented nuclear medicine computers. The RDAS can simultaneously input signals from up to four gamma camera sources with a throughput of 200 kHz per source and from up to eight physiological signal sources with an aggregate throughput of 50 kHz. Rigorous testing has found the RDAS to exhibit acceptable linearity and timing characteristics. In addition, flood images obtained by this system were compared with flood images acquired by a commercial nuclear medicine computer system. National Electrical Manufacturers Association performance standards of the flood images were found to be comparable.

  1. Bringing Precision Medicine to Community Oncologists.

    PubMed

    2017-01-01

    Quest Diagnostics has teamed up with Memorial Sloan Kettering Cancer Center and IBM Watson Health to offer IBM Watson Genomics to its network of community cancer centers and hospitals. This new service aims to advance precision medicine by combining genomic tumor sequencing with the power of cognitive computing. ©2017 American Association for Cancer Research.

  2. Towards a Research Model for Distance Education-Contributions from the Telecommuting Literature.

    ERIC Educational Resources Information Center

    Dick, Geoffrey N.

    This paper draws on an extensive review of literature associated with telecommuting and looks at features that might affect the offering and take-up of distance education, particularly distance education involving computer applications, telecommunications and web-based, off-campus delivery of courses or components of courses. The issue is…

  3. Change in Computer Access and the Academic Achievement of Immigrant Children

    ERIC Educational Resources Information Center

    Moon, Ui Jeong; Hofferth, Sandra

    2018-01-01

    Background/Context: Increased interest in the correlates of media devices available to children has led to research indicating that access to and use of technology are positively associated with children's academic achievement. However, the digital divide remains; not all children have access to digital technologies, and not all children can…

  4. The AT Odyssey Continues. Proceedings of the RESNA 2001 Annual Conference (Reno, Nevada, June 22-26, 2001). Volume 21.

    ERIC Educational Resources Information Center

    Simpson, Richard, Ed.

    These proceedings of the annual RESNA (Association for the Advancement of Rehabilitation Technology) conference include more than 200 presentations on all facets of assistive technology, including concurrent sessions, scientific platform sessions, interactive poster presentations, computer demonstrations, and the research symposia. The scientific…

  5. Selected Papers of the Southeastern Writing Center Association.

    ERIC Educational Resources Information Center

    Roberts, David H., Ed.; Wolff, William C., Ed.

    Addressing a variety of concerns of writing center directors and staff, directors of freshman composition, and English department chairs, the papers in this collection discuss writing center research and evaluation, writing center tutors, and computers in the writing center. The titles of the essays and their authors are as follows: (1) "Narrative…

  6. Thunderstorm Research International Program (TRIP 77) report to management

    NASA Technical Reports Server (NTRS)

    Taiani, A. J.

    1977-01-01

    A post analysis of the previous day's weather, followed by the day's forecast and an outlook on weather conditions for the following day is given. The normal NOAA weather charts were used, complemented by the latest GOES satellite pictures, the latest rawinsonde sounding, and the computer-derived thunderstorm probability forecasts associated with the sounding.

  7. Salary Compression: A Time-Series Ratio Analysis of ARL Position Classifications

    ERIC Educational Resources Information Center

    Seaman, Scott

    2007-01-01

    Although salary compression has previously been identified in such professional schools as engineering, business, and computer science, there is now evidence of salary compression among Association of Research Libraries members. Using salary data from the "ARL Annual Salary Survey", this study analyzes average annual salaries from 1994-1995…

  8. Mixed-Initiative Development of Plans With Expressive Temporal Constraints

    DTIC Science & Technology

    2007-06-14

    One technique that is commonly used in DTP solving, incremental full-path consistency, exploits this property by maintaining a stack of the... M. Pollack was elected to the CRA (Computing Research Association) Board of Directors, 2007-2009. M. Moffitt won the IBM 2007 Josef Raviv

  9. Management of Library Security. SPEC Kit 247 and SPEC Flyer 247.

    ERIC Educational Resources Information Center

    Soete, George J., Comp.; Zimmerman, Glen, Comp.

    This SPEC (Systems and Procedures Exchange Center) Kit and Flyer reports results of a survey conducted in January 1999 that examined how ARL (Association of Research Libraries) member libraries assure the safety and security of persons, library materials, physical facilities, furnishings, computer equipment, etc. Forty-five of the 122 ARL member…

  10. Attitudes toward ecosystem management in the United States, 1992-1998

    Treesearch

    David N. Bengston; George Xu; David P. Fan

    2001-01-01

    Ecosystem management has been formally adopted by a large number of state and federal agencies and by forest products firms and associations. But little research has examined people's attitudes toward this new approach to natural resources management. This study used computer methods to measure favorable and unfavorable attitudes toward ecosystem management...

  11. Correlating Computed and Flight Instructor Assessments of Straight-In Landing Approaches by Novice Pilots on a Flight Simulator

    NASA Technical Reports Server (NTRS)

    Heath, Bruce E.; Khan, M. Javed; Rossi, Marcia; Ali, Syed Firasat

    2005-01-01

    The rising cost of flight training and the low cost of powerful computers have resulted in increasing use of PC-based flight simulators. This has prompted FAA standards regulating such use and allowing aspects of training on simulators meeting these standards to be substituted for flight time. However, the FAA regulations require an authorized flight instructor as part of the training environment. Thus, while costs associated with flight time have been reduced, the cost associated with the need for a flight instructor still remains. The obvious area of research, therefore, has been to develop intelligent simulators. However, the two main challenges of such attempts have been training strategies and assessment. The research reported in this paper was conducted to evaluate various performance metrics of a straight-in landing approach by 33 novice pilots flying a light single engine aircraft simulation. These metrics were compared to assessments of these flights by two flight instructors to establish a correlation between the two techniques in an attempt to determine a composite performance metric for this flight maneuver.
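
    The comparison described above, between computed performance metrics and instructor assessments, amounts to computing a correlation coefficient between the two scorings. A minimal Pearson-correlation sketch follows; the metric name, ratings, and values are invented, not the study's data.

    ```python
    # Pearson correlation between a computed flight-performance metric and
    # instructor ratings of the same approaches.
    from statistics import mean

    def pearson_r(xs, ys):
        """Pearson product-moment correlation coefficient."""
        mx, my = mean(xs), mean(ys)
        num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        den = (sum((x - mx) ** 2 for x in xs) * sum((y - my) ** 2 for y in ys)) ** 0.5
        return num / den

    glidepath_rmse = [12.0, 8.5, 15.2, 6.1, 9.9]  # hypothetical computed error metric
    instructor_score = [2, 3, 1, 4, 3]            # hypothetical 1-5 instructor ratings
    r = pearson_r(glidepath_rmse, instructor_score)
    print(r)  # strongly negative here: higher tracking error, lower rating
    ```

    A strong correlation between a computed metric and instructor scores is what would justify using that metric as a composite performance measure in an intelligent simulator.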

  12. Center for Computing Research Summer Research Proceedings 2015.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bradley, Andrew Michael; Parks, Michael L.

    2015-12-18

    The Center for Computing Research (CCR) at Sandia National Laboratories organizes a summer student program each summer, in coordination with the Computer Science Research Institute (CSRI) and Cyber Engineering Research Institute (CERI).

  13. A computer-based education intervention to enhance surrogates' informed consent for genomics research.

    PubMed

    Shelton, Ann K; Freeman, Bradley D; Fish, Anne F; Bachman, Jean A; Richardson, Lloyd I

    2015-03-01

    Many research studies conducted today in critical care have a genomics component. Patients' surrogates asked to authorize participation in genomics research for a loved one in the intensive care unit may not be prepared to make informed decisions about a patient's participation in the research. To examine the effectiveness of a new, computer-based education module on surrogates' understanding of the process of informed consent for genomics research. A pilot study was conducted with visitors in the waiting rooms of 2 intensive care units in a Midwestern tertiary care medical center. Visitors were randomly assigned to the experimental (education module plus a sample genomics consent form; n = 65) or the control (sample genomics consent form only; n = 69) group. Participants later completed a test on informed genomics consent. Understanding the process of informed consent was greater (P = .001) in the experimental group than in the control group. Specifically, compared with the control group, the experimental group had a greater understanding of 8 of 13 elements of informed consent: intended benefits of research (P = .02), definition of surrogate consenter (P = .001), withdrawal from the study (P = .001), explanation of risk (P = .002), purpose of the institutional review board (P = .001), definition of substituted judgment (P = .03), compensation for harm (P = .001), and alternative treatments (P = .004). Computer-based education modules may be an important addition to conventional approaches for obtaining informed consent in the intensive care unit. Preparing patients' family members who may consider serving as surrogate consenters is critical to facilitating genomics research in critical care. ©2015 American Association of Critical-Care Nurses.

  14. Nutritional metabolomics: Progress in addressing complexity in diet and health

    PubMed Central

    Jones, Dean P.; Park, Youngja; Ziegler, Thomas R.

    2013-01-01

    Nutritional metabolomics is rapidly maturing to use small molecule chemical profiling to support integration of diet and nutrition in complex biosystems research. These developments are critical to facilitate transition of nutritional sciences from population-based to individual-based criteria for nutritional research, assessment and management. This review addresses progress in making these approaches manageable for nutrition research. Important concept developments concerning the exposome, predictive health and complex pathobiology, serve to emphasize the central role of diet and nutrition in integrated biosystems models of health and disease. Improved analytic tools and databases for targeted and non-targeted metabolic profiling, along with bioinformatics, pathway mapping and computational modeling, are now used for nutrition research on diet, metabolism, microbiome and health associations. These new developments enable metabolome-wide association studies (MWAS) and provide a foundation for nutritional metabolomics, along with genomics, epigenomics and health phenotyping, to support integrated models required for personalized diet and nutrition forecasting. PMID:22540256

  15. Explanation-aware computing of the prognosis for breast cancer supported by IK-DCBRC: Technical innovation.

    PubMed

    Khelassi, Abdeldjalil

    2014-01-01

    Active research is being conducted to determine the prognosis for breast cancer. However, uncertainty is a major obstacle in this domain of medical research. In that context, explanation-aware computing has the potential for providing meaningful interactions between complex medical applications and users, which would ensure a significant reduction of uncertainty and risks. This paper presents an explanation-aware agent, supported by the Intensive Knowledge-Distributed Case-Based Reasoning Classifier (IK-DCBRC), to reduce the uncertainty and risks associated with the diagnosis of breast cancer. A meaningful explanation is generated by inferring from a rule-based system according to the level of abstraction and the reasoning traces. The computer-aided detection is conducted by IK-DCBRC, a multi-agent system that applies the case-based reasoning paradigm and a fuzzy similarity function for automatic prognosis by classifying breast tumors as malignant or benign from patterns of cytological images. A meaningful interaction between the physician and the computer-aided diagnosis system, IK-DCBRC, is achieved via an intelligent agent. The physician can observe the trace of reasoning, terms, justifications, and the strategy to be used, in order to decrease the risks and doubts associated with the automatic diagnosis. The capability of the system we have developed was proven by an example in which conflicts were clarified and transparency was ensured. The explanation agent ensures the transparency of the automatic diagnosis of breast cancer supported by IK-DCBRC, which decreases uncertainty and risks and detects some conflicts.

  16. A blueprint for computational analysis of acoustical scattering from orchestral panel arrays

    NASA Astrophysics Data System (ADS)

    Burns, Thomas

    2005-09-01

    Orchestral panel arrays have been a topic of interest to acousticians, and it is reasonable to expect optimal design criteria to result from a combination of musician surveys, on-stage empirical data, and computational modeling of various configurations. Preparing a musicians survey to identify specific mechanisms of perception and sound quality is best suited for a clinically experienced hearing scientist. Measuring acoustical scattering from a panel array and discerning the effects from various boundaries is best suited for the experienced researcher in engineering acoustics. Analyzing a numerical model of the panel arrays is best suited for the tools typically used in computational engineering analysis. Toward this end, a streamlined process will be described using PROENGINEER to define a panel array geometry in 3-D, a commercial mesher to numerically discretize this geometry, SYSNOISE to solve the associated boundary element integral equations, and MATLAB to visualize the results. The model was run (background priority) on an SGI Altix (Linux) server with 12 CPUs, 24 Gbytes of RAM, and 1 Tbyte of disk space. These computational resources are available to research teams interested in this topic and willing to write and pursue grants.

  17. Effect of Counterflow Jet on a Supersonic Reentry Capsule

    NASA Technical Reports Server (NTRS)

    Chang, Chau-Lyan; Venkatachari, Balaji Shankar; Cheng, Gary C.

    2006-01-01

    Recent NASA initiatives for space exploration have reinvigorated research on Apollo-like capsule vehicles. Aerothermodynamic characteristics of these capsule configurations during reentry play a crucial role in the performance and safety of the planetary entry probes and the crew exploration vehicles. At issue are the forebody thermal shield protection and afterbody aeroheating predictions. Due to the lack of flight or wind tunnel measurements at hypersonic speed, design decisions on such vehicles would rely heavily on computational results. Validation of current computational tools against experimental measurement thus becomes one of the most important tasks for general hypersonic research. This paper is focused on time-accurate numerical computations of hypersonic flows over a set of capsule configurations, which employ a counterflow jet to offset the detached bow shock. The accompanying increased shock stand-off distance and modified heat transfer characteristics associated with the counterflow jet may provide guidance for future design of hypersonic reentry capsules. The newly emerged space-time conservation element solution element (CESE) method is used to perform time-accurate, unstructured mesh Navier-Stokes computations for all cases investigated. The results show good agreement between experimental and numerical Schlieren pictures. Surface heat flux and aerodynamic force predictions of the capsule configurations are discussed in detail.

  18. Measuring the bias against low-income country research: an Implicit Association Test.

    PubMed

    Harris, Matthew; Macinko, James; Jimenez, Geronimo; Mullachery, Pricila

    2017-11-06

    With an increasing array of innovations and research emerging from low-income countries there is a growing recognition that even high-income countries could learn from these contexts. It is well known that the source of a product influences perception of that product, but little research has examined whether this applies also in evidence-based medicine and decision-making. In order to examine likely barriers to learning from low-income countries, this study uses established methods in cognitive psychology to explore whether healthcare professionals and researchers implicitly associate good research with rich countries more so than with poor countries. Computer-based Implicit Association Test (IAT) distributed to healthcare professionals and researchers. Stimuli representing Rich Countries were chosen from OECD members in the top ten (>$36,000 per capita) World Bank rankings and Poor Countries were chosen from the bottom thirty (<$1000 per capita) countries by GDP per capita, in both cases giving attention to regional representation. Stimuli representing Research were descriptors of the motivation (objective/biased), value (useful/worthless), clarity (precise/vague), process (transparent/dishonest), and trustworthiness (credible/unreliable) of research. IAT results are presented as a Cohen's d statistic. Quantile regression was used to assess the contribution of covariates (e.g. age, sex, country of origin) to different values of IAT responses that correspond to different levels of implicit bias. Poisson regression was used to model dichotomized responses to the explicit bias item. Three hundred twenty-one tests were completed in a four-week period between March and April 2015. The mean Implicit Association Test result (a standardized mean relative latency between congruent and non-congruent categories) for the sample was 0.57 (95% CI 0.52 to 0.61), indicating that on average our sample exhibited moderately strong implicit associations between Rich Countries and Good Research. People over 40 years of age were less likely to exhibit pro-poor implicit associations, and being a peer reviewer contributes to a more pro-poor association. The majority of our participants associate Good Research with Rich Countries, compared to Poor Countries. Implicit associations such as these might disfavor research from poor countries in research evaluation, evidence-based medicine and diffusion of innovations.
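
    The "standardized mean relative latency" score the abstract describes can be sketched as the usual IAT D-style statistic: the difference in mean response latency between incongruent and congruent blocks, divided by the pooled standard deviation of all latencies. The latencies below are invented for illustration, not the study's data.

    ```python
    # IAT D-style score: standardized latency difference between incongruent
    # and congruent pairing blocks. Positive scores indicate faster responses
    # when "Rich Country" is paired with "Good Research".
    from statistics import mean, stdev

    def iat_d_score(congruent_ms, incongruent_ms):
        """(mean incongruent - mean congruent) / pooled SD of all latencies."""
        pooled_sd = stdev(congruent_ms + incongruent_ms)
        return (mean(incongruent_ms) - mean(congruent_ms)) / pooled_sd

    congruent = [650, 700, 620, 680, 710, 640]    # faster: rich+good pairing (ms)
    incongruent = [780, 820, 760, 800, 840, 790]  # slower: poor+good pairing (ms)
    d = iat_d_score(congruent, incongruent)
    print(round(d, 2))
    ```

    Published IAT scoring procedures also trim extreme latencies and handle error trials; this sketch omits those steps to show only the core standardized-difference computation.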

  19. Synergies and Distinctions between Computational Disciplines in Biomedical Research: Perspective from the Clinical and Translational Science Award Programs

    PubMed Central

    Bernstam, Elmer V.; Hersh, William R.; Johnson, Stephen B.; Chute, Christopher G.; Nguyen, Hien; Sim, Ida; Nahm, Meredith; Weiner, Mark; Miller, Perry; DiLaura, Robert P.; Overcash, Marc; Lehmann, Harold P.; Eichmann, David; Athey, Brian D.; Scheuermann, Richard H.; Anderson, Nick; Starren, Justin B.; Harris, Paul A.; Smith, Jack W.; Barbour, Ed; Silverstein, Jonathan C.; Krusch, David A.; Nagarajan, Rakesh; Becich, Michael J.

    2010-01-01

    Clinical and translational research increasingly requires computation. Projects may involve multiple computationally-oriented groups including information technology (IT) professionals, computer scientists and biomedical informaticians. However, many biomedical researchers are not aware of the distinctions among these complementary groups, leading to confusion, delays and sub-optimal results. Although written from the perspective of clinical and translational science award (CTSA) programs within academic medical centers, the paper addresses issues that extend beyond clinical and translational research. The authors describe the complementary but distinct roles of operational IT, research IT, computer science and biomedical informatics using a clinical data warehouse as a running example. In general, IT professionals focus on technology. The authors distinguish between two types of IT groups within academic medical centers: central or administrative IT (supporting the administrative computing needs of large organizations) and research IT (supporting the computing needs of researchers). Computer scientists focus on general issues of computation such as designing faster computers or more efficient algorithms, rather than specific applications. In contrast, informaticians are concerned with data, information and knowledge. Biomedical informaticians draw on a variety of tools, including but not limited to computers, to solve information problems in health care and biomedicine. The paper concludes with recommendations regarding administrative structures that can help to maximize the benefit of computation to biomedical research within academic health centers. PMID:19550198

  20. Applicability of computational systems biology in toxicology.

    PubMed

    Kongsbak, Kristine; Hadrup, Niels; Audouze, Karine; Vinggaard, Anne Marie

    2014-07-01

    Systems biology as a research field has emerged within the last few decades. Systems biology, often defined as the antithesis of the reductionist approach, integrates information about individual components of a biological system. In integrative systems biology, large data sets from various sources and databases are used to model and predict effects of chemicals on, for instance, human health. In toxicology, computational systems biology enables identification of important pathways and molecules from large data sets; tasks that can be extremely laborious when performed by a classical literature search. However, computational systems biology offers more advantages than providing a high-throughput literature search; it may form the basis for establishment of hypotheses on potential links between environmental chemicals and human diseases, which would be very difficult to establish experimentally. This is possible due to the existence of comprehensive databases containing information on networks of human protein-protein interactions and protein-disease associations. Experimentally determined targets of the specific chemical of interest can be fed into these networks to obtain additional information that can be used to establish hypotheses on links between the chemical and human diseases. Such information can also be applied for designing more intelligent animal/cell experiments that can test the established hypotheses. Here, we describe how and why to apply an integrative systems biology method in the hypothesis-generating phase of toxicological research. © 2014 Nordic Association for the Publication of BCPT (former Nordic Pharmacological Society).
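The workflow described above, feeding a chemical's experimentally determined protein targets into interaction and disease networks to generate hypotheses, can be sketched minimally. The proteins, edges, and disease links below are toy placeholders, not curated database content:

```python
# Hypothetical toy networks; real analyses use curated databases of
# protein-protein interactions and protein-disease associations.
ppi = {
    "AHR":  {"ARNT", "ESR1"},
    "ARNT": {"AHR", "HIF1A"},
    "ESR1": {"AHR", "BRCA1"},
}
protein_disease = {
    "ESR1":  {"breast cancer"},
    "BRCA1": {"breast cancer"},
    "HIF1A": {"renal carcinoma"},
}

def candidate_diseases(chemical_targets, ppi, protein_disease):
    """Expand a chemical's direct protein targets by one PPI step, then
    collect diseases associated with the expanded protein set, ranked by
    how many proteins support each disease link."""
    expanded = set(chemical_targets)
    for protein in chemical_targets:
        expanded |= ppi.get(protein, set())
    hits = {}
    for protein in expanded:
        for disease in protein_disease.get(protein, set()):
            hits[disease] = hits.get(disease, 0) + 1
    return sorted(hits.items(), key=lambda kv: -kv[1])

# A chemical whose only measured target is AHR still yields a disease
# hypothesis through its network neighbourhood:
print(candidate_diseases({"AHR"}, ppi, protein_disease))
```

The resulting ranked disease list is a hypothesis-generating output, to be tested with the more intelligent animal/cell experiments the abstract describes.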

  1. Use of the Computer for Research on Instruction and Student Understanding in Physics.

    NASA Astrophysics Data System (ADS)

    Grayson, Diane Jeanette

    This dissertation describes an investigation of how the computer may be utilized to perform research on instruction and on student understanding in physics. The research was conducted within three content areas: kinematics, waves and dynamics. The main focus of the research on instruction was the determination of factors needed for a computer program to be instructionally effective. The emphasis in the research on student understanding was the identification of specific conceptual and reasoning difficulties students encounter with the subject matter. Most of the research was conducted using the computer-based interview, a technique developed during the early part of the work, conducted within the domain of kinematics. In a computer-based interview, a student makes a prediction about how a particular system will behave under given circumstances, observes a simulation of the event on a computer screen, and then is asked by an interviewer to explain any discrepancy between prediction and observation. In the course of the research, a model was developed for producing educational software. The model has three important components: (i) research on student difficulties in the content area to be addressed, (ii) observations of students using the computer program, and (iii) consequent program modification. This model was used to guide the development of an instructional computer program dealing with graphical representations of transverse pulses. Another facet of the research involved the design of a computer program explicitly for the purposes of research. A computer program was written that simulates a modified Atwood's machine. The program was then used in computer-based interviews and proved to be an effective means of probing student understanding of dynamics concepts. In order to ascertain whether or not the student difficulties identified were peculiar to the computer, laboratory-based interviews with real equipment were also conducted. 
The laboratory-based interviews were designed to parallel the computer-based interviews as closely as possible. The results of both types of interviews are discussed in detail. The dissertation concludes with a discussion of some of the benefits of using the computer in physics instruction and physics education research. Attention is also drawn to some of the limitations of the computer as a research instrument or instructional device.

  2. Emerging Uses of Computer Technology in Qualitative Research.

    ERIC Educational Resources Information Center

    Parker, D. Randall

    The application of computer technology in qualitative research and evaluation ranges from simple word processing to doing sophisticated data sorting and retrieval. How computer software can be used for qualitative research is discussed. Researchers should consider the use of computers in data analysis in light of their own familiarity and comfort…

  3. Text Mining for Neuroscience

    NASA Astrophysics Data System (ADS)

    Tirupattur, Naveen; Lapish, Christopher C.; Mukhopadhyay, Snehasis

    2011-06-01

    Text mining, sometimes alternately referred to as text analytics, refers to the process of extracting high-quality knowledge from the analysis of textual data. Text mining has a wide variety of applications in areas such as biomedical science, news analysis, and homeland security. In this paper, we describe an approach and some relatively small-scale experiments which apply text mining to neuroscience research literature to find novel associations among a diverse set of entities. Neuroscience is a discipline which encompasses an exceptionally wide range of experimental approaches and rapidly growing interest. This combination results in an overwhelmingly large and often diffuse literature which makes a comprehensive synthesis difficult. Understanding the relations or associations among the entities appearing in the literature not only improves the researchers' current understanding of recent advances in their field, but also provides an important computational tool to formulate novel hypotheses and thereby assist in scientific discoveries. We describe a methodology to automatically mine the literature and form novel associations through direct analysis of published texts. The method first retrieves a set of documents from databases such as PubMed using a set of relevant domain terms. In the current study these terms yielded a set of documents ranging from 160,909 to 367,214 documents. Each document is then represented in a numerical vector form, from which an Association Graph is computed that represents relationships between all pairs of domain terms, based on co-occurrence. Association graphs can then be subjected to various graph-theoretic algorithms such as transitive closure and cycle (circuit) detection to derive additional information, and can also be visually presented to a human researcher for understanding. 
In this paper, we present three relatively small-scale problem-specific case studies to demonstrate that such an approach is very successful in replicating a neuroscience expert's mental model of object-object associations entirely by means of text mining. These preliminary results provide confidence that this type of text-mining-based research approach is an extremely powerful tool for better understanding the literature and driving novel discovery for the neuroscience community.
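The co-occurrence association graph and transitive closure described above can be illustrated with a minimal sketch. The "documents" and domain terms here are invented toys; the study itself works over PubMed retrievals and numerical vector representations:

```python
from itertools import combinations

# Toy corpus: each "document" reduced to the set of domain terms it contains.
docs = [
    {"dopamine", "prefrontal cortex", "reward"},
    {"dopamine", "addiction"},
    {"addiction", "reward"},
    {"prefrontal cortex", "working memory"},
]

# Association graph: terms are nodes; an (undirected) edge joins any two
# terms that co-occur in at least one document.
edges = set()
for doc in docs:
    for a, b in combinations(sorted(doc), 2):
        edges.add((a, b))
        edges.add((b, a))

# Transitive closure (Floyd-Warshall style) surfaces indirect associations.
nodes = {term for doc in docs for term in doc}
closure = set(edges)
for k in nodes:
    for i in nodes:
        for j in nodes:
            if (i, k) in closure and (k, j) in closure:
                closure.add((i, j))

# "dopamine" and "working memory" never co-occur directly, but the closure
# links them via "prefrontal cortex".
print(("dopamine", "working memory") in edges)    # False
print(("dopamine", "working memory") in closure)  # True
```

Real systems would weight edges by co-occurrence counts rather than treating them as binary, but the graph-theoretic post-processing is the same idea.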

  4. Investigations into Gravitational Wave Emission from Compact Body Inspiral Into Massive Black Holes

    NASA Technical Reports Server (NTRS)

    Hughes, Scott A.

    2004-01-01

    Much of the grant's support (and associated time) was used in developmental activity, building infrastructure for the core of the work that the grant supports. Though infrastructure development was the bulk of the activity supported this year, important progress was made in research as well. The two most important "infrastructure" items were in computing hardware and personnel. Research activities were primarily focused on improving and extending Hughes' Teukolsky-equation-based gravitational-wave generator. Several improvements have been incorporated into this generator.

  5. Study of the Effects of Photometric Geometry on Spectral Reflectance Measurements

    NASA Technical Reports Server (NTRS)

    Helfenstein, Paul

    1998-01-01

    The objective of this research is to investigate how the spectrophotometric properties of planetary surface materials depend on photometric geometry by refining and applying radiative transfer theory to data obtained from spacecraft and telescope observations of planetary surfaces, studies of laboratory analogs, and computer simulations. The goal is to perfect the physical interpretation of photometric parameters in the context of planetary surface geological properties and processes. The purpose of this report is to document the research achievements associated with this study.

  6. Computation of Thermodynamic Equilibria Pertinent to Nuclear Materials in Multi-Physics Codes

    NASA Astrophysics Data System (ADS)

    Piro, Markus Hans Alexander

    Nuclear energy plays a vital role in supporting electrical needs and fulfilling commitments to reduce greenhouse gas emissions. Continuing research is necessary to improve the predictive capabilities of fuel behaviour models, in order to reduce costs and to meet increasingly stringent safety requirements set by the regulator. Moreover, a renewed interest in nuclear energy has given rise to a "nuclear renaissance" and the necessity to design the next generation of reactors. In support of this goal, significant research efforts have been dedicated to the advancement of numerical modelling and computational tools in simulating various physical and chemical phenomena associated with nuclear fuel behaviour. In effect, this undertaking collects the experience and observations of a past generation of nuclear engineers and scientists in a meaningful way for future design purposes. There is an increasing desire to integrate thermodynamic computations directly into multi-physics nuclear fuel performance and safety codes. A new equilibrium thermodynamic solver is being developed with this matter as a primary objective. This solver is intended to provide thermodynamic material properties and boundary conditions for continuum transport calculations. There are several concerns with the use of existing commercial thermodynamic codes: computational performance; limited capabilities in handling large multi-component systems of interest to the nuclear industry; convenient incorporation into other codes with quality assurance considerations; and licensing entanglements associated with code distribution. The software developed in this research aims to address all of these concerns. The approach taken in this work exploits fundamental principles of equilibrium thermodynamics to simplify the numerical optimization equations. 
In brief, the chemical potentials of all species and phases in the system are constrained by estimates of the chemical potentials of the system components at each iterative step, and the objective is to minimize the residuals of the mass balance equations. Several numerical advantages are achieved through this simplification. In particular, computational expense is reduced and the rate of convergence is enhanced. Furthermore, the software has demonstrated the ability to solve systems involving as many as 118 component elements. An early version of the code has already been integrated into the Advanced Multi-Physics (AMP) code under development by the Oak Ridge National Laboratory, Los Alamos National Laboratory, Idaho National Laboratory and Argonne National Laboratory. Keywords: Engineering, Nuclear -- 0552, Engineering, Material Science -- 0794, Chemistry, Mathematics -- 0405, Computer Science -- 0984

  7. An autonomous molecular computer for logical control of gene expression.

    PubMed

    Benenson, Yaakov; Gil, Binyamin; Ben-Dor, Uri; Adar, Rivka; Shapiro, Ehud

    2004-05-27

    Early biomolecular computer research focused on laboratory-scale, human-operated computers for complex computational problems. Recently, simple molecular-scale autonomous programmable computers were demonstrated allowing both input and output information to be in molecular form. Such computers, using biological molecules as input data and biologically active molecules as outputs, could produce a system for 'logical' control of biological processes. Here we describe an autonomous biomolecular computer that, at least in vitro, logically analyses the levels of messenger RNA species, and in response produces a molecule capable of affecting levels of gene expression. The computer operates at a concentration of close to a trillion computers per microlitre and consists of three programmable modules: a computation module, that is, a stochastic molecular automaton; an input module, by which specific mRNA levels or point mutations regulate software molecule concentrations, and hence automaton transition probabilities; and an output module, capable of controlled release of a short single-stranded DNA molecule. This approach might be applied in vivo to biochemical sensing, genetic engineering and even medical diagnosis and treatment. As a proof of principle we programmed the computer to identify and analyse mRNA of disease-related genes associated with models of small-cell lung cancer and prostate cancer, and to produce a single-stranded DNA molecule modelled after an anticancer drug.

  8. Children, computer exposure and musculoskeletal outcomes: the development of pathway models for school and home computer-related musculoskeletal outcomes.

    PubMed

    Harris, Courtenay; Straker, Leon; Pollock, Clare; Smith, Anne

    2015-01-01

    Children's computer use is rapidly growing, together with reports of related musculoskeletal outcomes. Models and theories of adult-related risk factors demonstrate multivariate risk factors associated with computer use. Children's use of computers is different from adults' computer use at work. This study developed and tested a child-specific model demonstrating multivariate relationships between musculoskeletal outcomes, computer exposure and child factors. Using pathway modelling, factors such as gender, age, television exposure, computer anxiety, sustained attention (flow), socio-economic status and somatic complaints (headache and stomach pain) were found to have effects on children's reports of musculoskeletal symptoms. The potential for children's computer exposure to follow a dose-response relationship was also evident. Developing a child-related model can assist in understanding risk factors for children's computer use and support the development of recommendations to encourage children to use this valuable resource in educational, recreational and communication environments in a safe and productive manner. Computer use is an important part of children's school and home life. Application of this developed model, that encapsulates related risk factors, enables practitioners, researchers, teachers and parents to develop strategies that assist young people to use information technology for school, home and leisure in a safe and productive manner.

  9. Implementation of an Audio Computer-Assisted Self-Interview (ACASI) System in a General Medicine Clinic

    PubMed Central

    Deamant, C.; Smith, J.; Garcia, D.; Angulo, F.

    2015-01-01

    Background: Routine implementation of instruments to capture patient-reported outcomes could guide clinical practice and facilitate health services research. Audio interviews facilitate self-interviews across literacy levels. Objectives: To evaluate the time burden for patients, and factors associated with response times for an audio computer-assisted self-interview (ACASI) system integrated into the clinical workflow. Methods: We developed an ACASI system, integrated with a research data warehouse. Instruments for symptom burden, self-reported health, depression screening, tobacco use, and patient satisfaction were administered through touch-screen monitors in the general medicine clinic at the Cook County Health & Hospitals System during April 8, 2011-July 27, 2012. We performed a cross-sectional study to evaluate the mean time burden per item and for each module of instruments; we evaluated factors associated with longer response latency. Results: Among 1,670 interviews, the mean per-question response time was 18.4 [SD, 6.1] seconds. By multivariable analysis, age was most strongly associated with prolonged response time and increased per decade compared to < 50 years as follows (additional seconds per question; 95% CI): 50–59 years (1.4; 0.7 to 2.1 seconds); 60–69 (3.4; 2.6 to 4.1); 70–79 (5.1; 4.0 to 6.1); and 80–89 (5.5; 4.1 to 7.0). Response times also were longer for Spanish language (3.9; 2.9 to 4.9); no home computer use (3.3; 2.8 to 3.9); and low mental self-reported health (0.6; 0.0 to 1.1). However, most interviews were completed within 10 minutes. Conclusions: An ACASI software system can be included in a patient visit and adds minimal time burden. The burden was greatest for older patients, interviews in Spanish, and for those with less computer exposure. A patient's self-reported health had minimal impact on response times. PMID:25848420

  10. Implementation of an audio computer-assisted self-interview (ACASI) system in a general medicine clinic: patient response burden.

    PubMed

    Trick, W E; Deamant, C; Smith, J; Garcia, D; Angulo, F

    2015-01-01

    Routine implementation of instruments to capture patient-reported outcomes could guide clinical practice and facilitate health services research. Audio interviews facilitate self-interviews across literacy levels. To evaluate time burden for patients, and factors associated with response times for an audio computer-assisted self interview (ACASI) system integrated into the clinical workflow. We developed an ACASI system, integrated with a research data warehouse. Instruments for symptom burden, self-reported health, depression screening, tobacco use, and patient satisfaction were administered through touch-screen monitors in the general medicine clinic at the Cook County Health & Hospitals System during April 8, 2011-July 27, 2012. We performed a cross-sectional study to evaluate the mean time burden per item and for each module of instruments; we evaluated factors associated with longer response latency. Among 1,670 interviews, the mean per-question response time was 18.4 [SD, 6.1] seconds. By multivariable analysis, age was most strongly associated with prolonged response time and increased per decade compared to < 50 years as follows (additional seconds per question; 95% CI): 50-59 years (1.4; 0.7 to 2.1 seconds); 60-69 (3.4; 2.6 to 4.1); 70-79 (5.1; 4.0 to 6.1); and 80-89 (5.5; 4.1 to 7.0). Response times also were longer for Spanish language (3.9; 2.9 to 4.9); no home computer use (3.3; 2.8 to 3.9); and, low mental self-reported health (0.6; 0.0 to 1.1). However, most interviews were completed within 10 minutes. An ACASI software system can be included in a patient visit and adds minimal time burden. The burden was greatest for older patients, interviews in Spanish, and for those with less computer exposure. A patient's self-reported health had minimal impact on response times.

  11. Research on Computers in Mathematics Education, IV. The Use of Computers in Mathematics Education Resource Series.

    ERIC Educational Resources Information Center

    Kieren, Thomas E.

    This last paper in a set of four reviews research on a wide variety of computer applications in the mathematics classroom. It covers computer-based instruction, especially drill-and-practice and tutorial modes; computer-managed instruction; and computer-augmented problem-solving. Analytical comments on the findings and status of the research are…

  12. The Ames Virtual Environment Workstation: Implementation issues and requirements

    NASA Technical Reports Server (NTRS)

    Fisher, Scott S.; Jacoby, R.; Bryson, S.; Stone, P.; Mcdowall, I.; Bolas, M.; Dasaro, D.; Wenzel, Elizabeth M.; Coler, C.; Kerr, D.

    1991-01-01

    This presentation describes recent developments in the implementation of a virtual environment workstation in the Aerospace Human Factors Research Division of NASA's Ames Research Center. Introductory discussions are presented on the primary research objectives and applications of the system and on the system's current hardware and software configuration. Principal attention is then focused on unique issues and problems encountered in the workstation's development, with emphasis on its ability to meet original design specifications for computational graphics performance and for associated human factors requirements necessary to provide a compelling sense of presence and efficient interaction in the virtual environment.

  13. The Secure Medical Research Workspace: An IT Infrastructure to Enable Secure Research on Clinical Data

    PubMed Central

    Owen, Phillips; Mostafa, Javed; Lamm, Brent; Wang, Xiaoshu; Schmitt, Charles P.; Ahalt, Stanley C.

    2013-01-01

    Clinical data have tremendous value for translational research, but only if security and privacy concerns can be addressed satisfactorily. A collaboration of clinical and informatics teams, including RENCI, NC TraCS, UNC's School of Information and Library Science, Information Technology Service's Research Computing and other partners at the University of North Carolina at Chapel Hill have developed a system called the Secure Medical Research Workspace (SMRW) that enables researchers to use clinical data securely for research. SMRW significantly minimizes the risk presented when using identified clinical data, thereby protecting patients, researchers, and institutions associated with the data. The SMRW is built on a novel combination of virtualization and data leakage protection and can be combined with other protection methodologies and scaled to production levels. PMID:23751029

  14. Ocean Sciences meets Big Data Analytics

    NASA Astrophysics Data System (ADS)

    Hurwitz, B. L.; Choi, I.; Hartman, J.

    2016-02-01

    Hundreds of researchers worldwide have joined forces in the Tara Oceans Expedition to create an unprecedented planetary-scale dataset comprised of state-of-the-art next-generation sequencing, microscopy, and physical/chemical metadata to explore ocean biodiversity. This summer the complete collection of data from the 2009-2013 Tara voyage was released. Yet, despite herculean efforts by the Tara Oceans Consortium to make raw data and computationally derived assemblies and gene catalogs available, most researchers are stymied by the sheer volume of the data. Specifically, the most tantalizing research questions lie in understanding the unifying principles that guide the distribution of organisms across the sea and affect climate and ecosystem function. To use the data in this capacity, researchers must download, integrate, and analyze more than 7.2 trillion bases of metagenomic data and associated metadata from viruses, bacteria, archaea and small eukaryotes at their own data centers (~9 TB of raw data). Accessing large-scale data sets in this way impedes scientists from replicating and building on prior work. To this end, we are developing a data platform called the Ocean Cloud Commons (OCC) as part of the iMicrobe project. The OCC is built using an algorithm we developed to pre-compute massive comparative metagenomic analyses in a Hadoop big data framework. By maintaining data in a cloud commons, researchers have access to scalable computation and real-time analytics to promote the integrated and broad use of planetary-scale datasets, such as Tara.

  15. NCCA 2010 Water

    EPA Pesticide Factsheets

    Data from the National Aquatic Resource Surveys: The following data are available for download as comma separated values (.csv) files. Sort the table using the pull-down menus or headers to more easily locate the data. Right-click on the file name and select Save Link As to save the file to your computer. Make sure to also download the companion metadata file (.txt) for the list of field labels. See the survey technical document for more information on the data analyses. This dataset is associated with the following publications: Yurista, P., J. Kelly, and J. Scharold. Great Lakes nearshore-offshore: Distinct water quality regions. JOURNAL OF GREAT LAKES RESEARCH. International Association for Great Lakes Research, Ann Arbor, MI, USA, 42: 375-385, (2016). Kelly, J., P. Yurista, M. Starry, J. Scharold, W. Bartsch, and A. Cotter. The first US National Coastal Condition Assessment survey in the Great Lakes: Development of the GIS frame and exploration of spatial variation in nearshore water quality results. JOURNAL OF GREAT LAKES RESEARCH. International Association for Great Lakes Research, Ann Arbor, MI, USA, 41: 1060-1074, (2015).

  16. The Implication of Using NVivo Software in Qualitative Data Analysis: Evidence-Based Reflections.

    PubMed

    Zamawe, F C

    2015-03-01

    For a long time, electronic data analysis has been associated with quantitative methods. However, Computer-Assisted Qualitative Data Analysis Software (CAQDAS) packages are increasingly being developed. Although CAQDAS has existed for decades, very few qualitative health researchers report using it. This may be due to the difficulties that one has to go through to master the software and the misconceptions that are associated with using CAQDAS. While the issue of mastering CAQDAS has received ample attention, little has been done to address the misconceptions associated with CAQDAS. In this paper, the author reflects on his experience of interacting with one of the popular CAQDAS packages (NVivo) in order to provide evidence-based implications of using the software. The key message is that unlike statistical software, the main function of CAQDAS is not to analyse data but rather to aid the analysis process, which the researcher must always remain in control of. In other words, researchers must equally know that no software can analyse qualitative data. CAQDAS packages are basically data management tools that support the researcher during analysis.

  17. Greater power and computational efficiency for kernel-based association testing of sets of genetic variants.

    PubMed

    Lippert, Christoph; Xiang, Jing; Horta, Danilo; Widmer, Christian; Kadie, Carl; Heckerman, David; Listgarten, Jennifer

    2014-11-15

    Set-based variance component tests have been identified as a way to increase power in association studies by aggregating weak individual effects. However, the choice of test statistic has been largely ignored even though it may play an important role in obtaining optimal power. We compared a standard statistical test (a score test) with a recently developed likelihood ratio (LR) test. Further, when correction for hidden structure is needed, or gene-gene interactions are sought, state-of-the-art algorithms for both the score and LR tests can be computationally impractical. Thus we develop new computationally efficient methods. After reviewing theoretical differences in performance between the score and LR tests, we find empirically on real data that the LR test generally has more power. In particular, on 15 of 17 real datasets, the LR test yielded at least as many associations as the score test (up to 23 more associations), whereas the score test yielded at most one more association than the LR test in the two remaining datasets. On synthetic data, we find that the LR test yielded up to 12% more associations, consistent with our results on real data, but we also observe a regime of extremely small signal where the score test yielded up to 25% more associations than the LR test, consistent with theory. Finally, our computational speedups now enable (i) efficient LR testing when the background kernel is full rank, and (ii) efficient score testing when the background kernel changes with each test, as for gene-gene interaction tests. The latter yielded a factor of 2000 speedup on a cohort of size 13,500. Software is available at http://research.microsoft.com/en-us/um/redmond/projects/MSCompBio/Fastlmm/. Contact: heckerma@microsoft.com. Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press.
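A minimal sketch of a set-based variance-component score statistic of the kind compared here (SKAT-style, with a linear kernel over a variant set), using a permutation p-value in place of the asymptotic null distribution that production implementations use. All data below are simulated:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: n individuals, m rare variants in one gene set.
n, m = 200, 10
G = rng.binomial(2, 0.05, size=(n, m)).astype(float)  # genotype matrix (0/1/2)
beta = np.zeros(m)
beta[:3] = 0.8                                        # three causal variants
y = G @ beta + rng.normal(size=n)                     # continuous phenotype

# Variance-component score statistic with linear kernel K = G G^T:
# Q = r^T K r, where r are residuals under the null (here: mean-centred y).
r = y - y.mean()
K = G @ G.T
Q = float(r @ K @ r)

# Permutation null: shuffling residuals breaks the genotype-phenotype link.
perm_stats = []
for _ in range(500):
    rp = rng.permutation(r)
    perm_stats.append(float(rp @ K @ rp))
p_value = (1 + sum(q >= Q for q in perm_stats)) / 501
print(f"Q = {Q:.1f}, permutation p = {p_value:.3f}")
```

Aggregating the weak per-variant effects into a single quadratic form is what gives set-based tests their power advantage over testing each variant individually.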

  18. A critical analysis of the internal logic in the Life-Space Assessment (LSA) composite score and suggested solutions.

    PubMed

    Siordia, Carlos

    2016-06-01

    An individual's ability to live independently is commonly measured in health research interested in identifying risk factors associated with disablement processes. In order to inform clinical practice, population research has attempted to identify the contraction of "lived-space" by using various survey instruments. Studies assessing habitual movements over the environment with the Life-Space Assessment (LSA) survey instrument should carefully consider how the LSA Composite Score (LSA-CS) is computed. Until now, no publication has carefully delineated the assumptions guiding the internal logic used in the computation of the LSA-CS. Because the internal logic of the LSA may need further justification, a non-data-editing scoring algorithm should be considered: compute the LSA-CS using only non-edited data. The paper first delineates the logic guiding the algorithm used in the formation of the LSA-CS and explains how the scoring creates and changes participant responses when they conflict with its internal logic. An easy-to-use SAS® 9.3 program for estimating a Non-Data-Edited LSA-CS (NDE-LSA-CS) is also presented. Researchers interested in assessing lived-space should carefully consider whether the internal logic of the LSA-CS is warranted. Clinicians should know it is important to understand the strengths and weaknesses of outcome measures used when deciding whether to apply the results of research to direct clinical practice. © The Author(s) 2015.
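One published formulation of the LSA-CS multiplies, for each attained life-space level, the level number by frequency and independence weights and sums the products (giving a 0-120 range). The sketch below assumes that formulation; the participant's responses are invented for illustration:

```python
# Levels 1-5 range from rooms adjacent to the bedroom up to places
# outside one's town. For each level attained, the respondent reports
# frequency (1 = <1/week, 2 = 1-3/week, 3 = 4-6/week, 4 = daily) and
# independence (2 = no help, 1.5 = equipment only, 1 = personal assistance).

def lsa_composite(responses):
    """responses: dict mapping level (1-5) -> (frequency, independence).
    Levels not attained are simply omitted and contribute zero."""
    return sum(level * freq * indep
               for level, (freq, indep) in responses.items())

# Hypothetical participant: daily independent movement at levels 1-3,
# 1-3x/week equipment-assisted trips at level 4, level 5 not attained.
participant = {1: (4, 2.0), 2: (4, 2.0), 3: (4, 2.0), 4: (2, 1.5)}
print(lsa_composite(participant))  # (1+2+3)*4*2 + 4*2*1.5 = 60.0
```

A non-data-editing variant of this computation, as the paper argues, would score exactly what the participant reported rather than first rewriting responses that conflict with the instrument's internal logic.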

  19. Eddy Current Influences on the Dynamic Behaviour of Magnetic Suspension Systems

    NASA Technical Reports Server (NTRS)

    Britcher, Colin P.; Bloodgood, Dale V.

    1998-01-01

    This report will summarize some results from a multi-year research effort at NASA Langley Research Center aimed at the development of an improved capability for practical modelling of eddy current effects in magnetic suspension systems. Particular attention is paid to large-gap systems, although generic results applicable to both large-gap and small-gap systems are presented. It is shown that eddy currents can significantly affect the dynamic behavior of magnetic suspension systems, but that these effects can be amenable to modelling and measurement. Theoretical frameworks are presented, together with comparisons of computed and experimental data particularly related to the Large Angle Magnetic Suspension Test Fixture at NASA Langley Research Center, and the Annular Suspension and Pointing System at Old Dominion University. In both cases, practical computations are capable of providing reasonable estimates of important performance-related parameters. The most difficult case is seen to be that of eddy currents in highly permeable material, due to the low skin depths. Problems associated with specification of material properties and areas for future research are discussed.
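
The difficulty the report notes with highly permeable material follows directly from the classical skin-depth relation δ = sqrt(2ρ/(ωμ)): raising the relative permeability μr shrinks the depth by sqrt(μr), confining eddy currents to a thin surface layer. A quick numerical sketch (the material values are standard textbook figures, not data from the report):

```python
import math

MU0 = 4 * math.pi * 1e-7        # vacuum permeability, H/m

def skin_depth(resistivity, freq_hz, mu_r=1.0):
    """Classical skin depth: delta = sqrt(2*rho / (omega * mu))."""
    omega = 2 * math.pi * freq_hz
    return math.sqrt(2 * resistivity / (omega * MU0 * mu_r))

d_cu = skin_depth(1.68e-8, 60)               # copper at 60 Hz: roughly 8.4 mm
d_perm = skin_depth(1.68e-8, 60, mu_r=1000)  # same conductor, mu_r = 1000
# d_perm is sqrt(1000) (about 32x) smaller -- hence the modelling difficulty.
```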

  20. An Integrative Bioinformatics Approach for Knowledge Discovery

    NASA Astrophysics Data System (ADS)

    Peña-Castillo, Lourdes; Phan, Sieu; Famili, Fazel

    The vast amount of data being generated by large scale omics projects and the computational approaches developed to deal with this data have the potential to accelerate the advancement of our understanding of the molecular basis of genetic diseases. This better understanding may have profound clinical implications and transform medical practice; for instance, therapeutic management could be prescribed based on the patient’s genetic profile instead of being based on aggregate data. Current efforts have established the feasibility and utility of integrating and analysing heterogeneous genomic data to identify molecular associations with pathogenesis. However, since these initiatives are data-centric, they either restrict the research community to specific data sets or to a certain application domain, or force researchers to develop their own analysis tools. To fully exploit the potential of omics technologies, robust computational approaches need to be developed and made available to the community. This research addresses this challenge and proposes an integrative approach to facilitate knowledge discovery from diverse datasets and contribute to the advancement of genomic medicine.

  1. Computational mechanics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Raboin, P J

    1998-01-01

    The Computational Mechanics thrust area is a vital and growing facet of the Mechanical Engineering Department at Lawrence Livermore National Laboratory (LLNL). This work supports the development of computational analysis tools in the areas of structural mechanics and heat transfer. Over 75 analysts depend on thrust area-supported software running on a variety of computing platforms to meet the demands of LLNL programs. Interactions with the Department of Defense (DOD) High Performance Computing and Modernization Program and the Defense Special Weapons Agency are of special importance as they support our ParaDyn project in its development of new parallel capabilities for DYNA3D. Working with DOD customers has been invaluable to driving this technology in directions mutually beneficial to the Department of Energy. Other projects associated with the Computational Mechanics thrust area include work with the Partnership for a New Generation Vehicle (PNGV) for ''Springback Predictability'' and with the Federal Aviation Administration (FAA) for the ''Development of Methodologies for Evaluating Containment and Mitigation of Uncontained Engine Debris.'' In this report for FY-97, there are five articles detailing three code development activities and two projects that synthesized new code capabilities with new analytic research in damage/failure and biomechanics. The articles this year are: (1) Energy- and Momentum-Conserving Rigid-Body Contact for NIKE3D and DYNA3D; (2) Computational Modeling of Prosthetics: A New Approach to Implant Design; (3) Characterization of Laser-Induced Mechanical Failure Damage of Optical Components; (4) Parallel Algorithm Research for Solid Mechanics Applications Using Finite Element Analysis; and (5) An Accurate One-Step Elasto-Plasticity Algorithm for Shell Elements in DYNA3D.

  2. Topical perspective on massive threading and parallelism.

    PubMed

    Farber, Robert M

    2011-09-01

    Unquestionably computer architectures have undergone a recent and noteworthy paradigm shift that now delivers multi- and many-core systems with tens to many thousands of concurrent hardware processing elements per workstation or supercomputer node. GPGPU (General Purpose Graphics Processor Unit) technology in particular has attracted significant attention as new software development capabilities, namely CUDA (Compute Unified Device Architecture) and OpenCL™, have made it possible for students as well as small and large research organizations to achieve excellent speedup for many applications over more conventional computing architectures. The current scientific literature reflects this shift with numerous examples of GPGPU applications that have achieved one, two, and in some special cases, three orders of magnitude increased computational performance through the use of massive threading to exploit parallelism. Multi-core architectures are also evolving quickly to exploit both massive threading and massive parallelism, as in the 1.3-million-thread Blue Waters supercomputer. The challenge confronting scientists in planning future experimental and theoretical research efforts (be they individual efforts with one computer or collaborative efforts proposing to use the largest supercomputers in the world) is how to capitalize on these new massively threaded computational architectures, especially as not all computational problems will scale to massive parallelism. In particular, the costs associated with restructuring software (and potentially redesigning algorithms) to exploit the parallelism of these multi- and many-threaded machines must be considered along with application scalability and lifespan. This perspective is an overview of the current state of threading and parallelism, with some insight into the future. Published by Elsevier Inc.
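
The caveat that not all computational problems will scale to massive parallelism is usually quantified with Amdahl's law: if only a fraction p of the work parallelizes, speedup is capped at 1/(1 − p) no matter how many threads are available. A minimal sketch:

```python
def amdahl_speedup(parallel_fraction, n_workers):
    """Amdahl's law: speedup = 1 / ((1 - p) + p / n)."""
    p = parallel_fraction
    return 1.0 / ((1.0 - p) + p / n_workers)

# Even with 95% parallel work, a million threads cannot beat a 20x speedup:
ceiling = amdahl_speedup(0.95, 1_000_000)
```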

  3. A characterization of workflow management systems for extreme-scale applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferreira da Silva, Rafael; Filgueira, Rosa; Pietri, Ilia

    We present that the automation of the execution of computational tasks is at the heart of improving scientific productivity. Over the last years, scientific workflows have been established as an important abstraction that captures data processing and computation of large and complex scientific applications. By allowing scientists to model and express entire data processing steps and their dependencies, workflow management systems relieve scientists from the details of an application and manage its execution on a computational infrastructure. As the resource requirements of today’s computational and data science applications that process vast amounts of data keep increasing, there is a compelling case for a new generation of advances in high-performance computing, commonly termed as extreme-scale computing, which will bring forth multiple challenges for the design of workflow applications and management systems. This paper presents a novel characterization of workflow management systems using features commonly associated with extreme-scale computing applications. We classify 15 popular workflow management systems in terms of workflow execution models, heterogeneous computing environments, and data access methods. Finally, the paper also surveys workflow applications and identifies gaps for future research on the road to extreme-scale workflows and management systems.
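
The core service these systems provide (running each task only after its dependencies complete) reduces to a topological ordering of the workflow DAG. A minimal sketch using Kahn's algorithm; the task names are hypothetical:

```python
from collections import deque

def execution_order(tasks):
    """Order workflow tasks so each runs after all of its prerequisites.

    `tasks` maps a task name to the set of task names it depends on.
    Raises ValueError if the workflow contains a dependency cycle.
    """
    indegree = {t: len(deps) for t, deps in tasks.items()}
    dependents = {t: [] for t in tasks}
    for t, deps in tasks.items():
        for dep in deps:
            dependents[dep].append(t)
    ready = deque(sorted(t for t, n in indegree.items() if n == 0))
    order = []
    while ready:
        t = ready.popleft()
        order.append(t)
        for nxt in dependents[t]:
            indegree[nxt] -= 1
            if indegree[nxt] == 0:
                ready.append(nxt)
    if len(order) != len(tasks):
        raise ValueError("workflow contains a dependency cycle")
    return order

workflow = {
    "fetch":   set(),
    "clean":   {"fetch"},
    "analyze": {"clean"},
    "plot":    {"analyze"},
    "report":  {"analyze", "plot"},
}
order = execution_order(workflow)
```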

  4. The Impact of Time Delay on the Content of Discussions at a Computer-Mediated Conference

    NASA Astrophysics Data System (ADS)

    Huntley, Byron C.; Thatcher, Andrew

    2008-11-01

    This study investigates the relationship between the content of computer-mediated discussions and the time delay between online postings. The study aims to broaden understanding of the dynamics of computer-mediated discussion regarding the time delay and the actual content of computer-mediated discussions (knowledge construction, social aspects, number of words, and number of postings), which has barely been researched. The computer-mediated discussions of the CybErg 2005 virtual conference served as the sample for this study. The Interaction Analysis Model [1] was utilized to analyze the level of knowledge construction in the content of the computer-mediated discussions. Correlations were computed for all combinations of the variables. The results demonstrate that knowledge construction, social aspects and number of words generated within postings were independent of, and not affected by, the time delay between the postings and the posting from which the reply was formulated. When greater numbers of words were utilized within postings, this was typically associated with a greater level of knowledge construction. Social aspects in the discussion were found to neither advantage nor disadvantage the overall effectiveness of the computer-mediated discussion.

  5. A characterization of workflow management systems for extreme-scale applications

    DOE PAGES

    Ferreira da Silva, Rafael; Filgueira, Rosa; Pietri, Ilia; ...

    2017-02-16

    We present that the automation of the execution of computational tasks is at the heart of improving scientific productivity. Over the last years, scientific workflows have been established as an important abstraction that captures data processing and computation of large and complex scientific applications. By allowing scientists to model and express entire data processing steps and their dependencies, workflow management systems relieve scientists from the details of an application and manage its execution on a computational infrastructure. As the resource requirements of today’s computational and data science applications that process vast amounts of data keep increasing, there is a compelling case for a new generation of advances in high-performance computing, commonly termed as extreme-scale computing, which will bring forth multiple challenges for the design of workflow applications and management systems. This paper presents a novel characterization of workflow management systems using features commonly associated with extreme-scale computing applications. We classify 15 popular workflow management systems in terms of workflow execution models, heterogeneous computing environments, and data access methods. Finally, the paper also surveys workflow applications and identifies gaps for future research on the road to extreme-scale workflows and management systems.

  6. Connecting Education with Careers. Business Education Association for Career and Technical Education Annual Convention Proceedings (Orlando, Florida, December 11-15, 1999).

    ERIC Educational Resources Information Center

    Wilkinson, Kelly S., Ed.

    This document contains five refereed research papers on connecting education with careers through business education. "The Different Skill Levels Students Possess When Entering Computer Software Applications College Courses" (Michael McDonald) reports on a 1998 survey examining the perceived skill level differences of college students…

  7. Tools and Strategies for Engaging the Supervisor in Technology-Supported Work-Based Learning, Evaluation Research

    ERIC Educational Resources Information Center

    Bianco, Manuela; Collis, Betty

    2004-01-01

    This study reports the results of the formative evaluations of two computer-supported tools and the associated strategies for their use. Tools and strategies embedded in web-based courses can increase a supervisor's involvement in helping employees transfer learning onto the workplace. Issues relating to characteristics of the tools and strategies…

  8. A Meta Analytical Review of the Relationship between Personal Epistemology and Self-Regulated Learning

    ERIC Educational Resources Information Center

    Alpaslan, Muhammet Mustafa; Yalvac, Bugrahan; Willson, Victor

    2017-01-01

    Recently, researchers have begun associating personal epistemology with self-regulated learning. Therefore, in the literature there is a need to examine what degree the studies have supported the relationship between the two. The purpose of this meta-analysis is two folds: a) to compute the mean effect size for the relations between personal…

  9. Introducing a "Means-End" Approach to Human-Computer Interaction: Why Users Choose Particular Web Sites Over Others.

    ERIC Educational Resources Information Center

    Subramony, Deepak Prem

    Gutman's means-end theory, widely used in market research, identifies three levels of abstraction: attributes, consequences, and values--associated with the use of products, representing the process by which physical attributes of products gain personal meaning for users. The primary methodological manifestation of means-end theory is the…

  10. Assessing Risk for Sexual Offenders in New Zealand: Development and Validation of a Computer-Scored Risk Measure

    ERIC Educational Resources Information Center

    Skelton, Alexander; Riley, David; Wales, David; Vess, James

    2006-01-01

    A growing research base supports the predictive validity of actuarial methods of risk assessment with sexual offenders. These methods use clearly defined variables with demonstrated empirical association with re-offending. The advantages of actuarial measures for screening large numbers of offenders quickly and economically are further enhanced…

  11. Experimentation and Research in The Birkman Program of the Austin College Total Institutional Project, 1972-1975.

    ERIC Educational Resources Information Center

    Austin Coll., Sherman, TX.

    The Birkman Method is proprietary, and consists of a battery of psychological instruments, an occupational interest survey, and associated reports, a self-awareness seminar based on the questionnaire, and the supporting computer software and data banks. The questions and occupations are designed to explore basic areas of a person's values,…

  12. The Write Help: A Handbook for Computers in Classrooms. Report No. 6.

    ERIC Educational Resources Information Center

    Mehan, Hugh, Ed.; Souviney, Randall, Ed.

    The result of a year of research and development in the classroom, the language arts activities presented in this handbook are designed for use with microcomputers in elementary and junior high classrooms. The first chapter reviews the current uses of microcomputers in the classroom and identifies the problems associated with the prevailing…

  13. Comparing Children's Performance on and Preference for a Number-Line Estimation Task: Tablet versus Paper and Pencil

    ERIC Educational Resources Information Center

    Piatt, Carley; Coret, Marian; Choi, Michael; Volden, Joanne; Bisanz, Jeffrey

    2016-01-01

    Tablet computers (tablets) are positioned to be powerful, innovative, effective, and motivating research and assessment tools. We addressed two questions critical for evaluating the appropriateness of using tablets to study number-line estimation, a skill associated with math achievement and argued to be central to numerical cognition. First, is…

  14. Mathematics and My Career. A Collection of Essays.

    ERIC Educational Resources Information Center

    Turner, Nura Dorothea Rains, Ed.

    The essays in this booklet have been written by persons who had ranked in the top one percent of the 1958-60 Upstate New York Mathematical Association of America Contests. Personal accounts are given of the role of mathematics in the authors' education and career. The careers described include applied mathematics, computer research and programing,…

  15. Revisiting an Old Methodology for Teaching Counting, Computation, and Place Value: The Effectiveness of the Finger Calculation Method for At-Risk Children

    ERIC Educational Resources Information Center

    Calder Stegemann, Kim; Grünke, Matthias

    2014-01-01

    Number sense is critical to the development of higher order mathematic abilities. However, some children have difficulty acquiring these fundamental skills and the knowledge base of effective interventions/remediation is relatively limited. Based on emerging neuro-scientific research which has identified the association between finger…

  16. The Contribution of CALL to Advanced-Level Foreign/Second Language Instruction

    ERIC Educational Resources Information Center

    Burston, Jack; Arispe, Kelly

    2016-01-01

    This paper evaluates the contribution of instructional technology to advanced-level foreign/second language learning (AL2) over the past thirty years. It is shown that the most salient feature of AL2 practice and associated Computer-Assisted Language Learning (CALL) research are their rarity and restricted nature. Based on an analysis of four…

  17. Ethnography at a Distance: Globally Mobile Parents Choosing International Schools

    ERIC Educational Resources Information Center

    Forsey, Martin; Breidenstein, Georg; Krüger, Oliver; Roch, Anna

    2015-01-01

    The research we report on was conducted from our computer desktops. We have not met the people we have studied; they are part of what Eichhorn described as a "textual community", gathered around the threads of online conversations associated with a website servicing the needs of English-language speakers in Germany. The thread in…

  18. The k-d Tree: A Hierarchical Model for Human Cognition.

    ERIC Educational Resources Information Center

    Vandendorpe, Mary M.

    This paper discusses a model of information storage and retrieval, the k-d tree (Bentley, 1975), a binary, hierarchical tree with multiple associate terms, which has been explored in computer research, and it is suggested that this model could be useful for describing human cognition. Included are two models of human long-term memory--networks and…

  19. Aerospace technology as a source of new ideas.

    NASA Technical Reports Server (NTRS)

    Hamilton, J. T.

    1972-01-01

    It is shown that technological products and processes resulting from aeronautical and space research and development can be a significant source of new product or product improvement ideas. The problems associated with technology transfer are discussed. As an example, the commercialization of NASTRAN, NASA's structural analysis computer program, is discussed. Some other current application projects are also outlined.

  20. Understanding Islamist political violence through computational social simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Watkins, Jennifer H; Mackerrow, Edward P; Patelli, Paolo G

    Understanding the process that enables political violence is of great value in reducing the future demand for and support of violent opposition groups. Methods are needed that allow alternative scenarios and counterfactuals to be scientifically researched. Computational social simulation shows promise in developing 'computer experiments' that would be unfeasible or unethical in the real world. Additionally, the process of modeling and simulation reveals and challenges assumptions that may not be noted in theories, exposes areas where data are not available, and provides a rigorous, repeatable, and transparent framework for analyzing the complex dynamics of political violence. This paper demonstrates the computational modeling process using two simulation techniques: system dynamics and agent-based modeling. The benefits and drawbacks of both techniques are discussed. In developing these social simulations, we discovered that the social science concepts and theories needed to accurately simulate the associated psychological and social phenomena were lacking.
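
To make the system-dynamics technique concrete, here is a deliberately toy two-stock model integrated with Euler steps. The stocks, flows, and parameter values are hypothetical illustrations of the method, not the paper's actual model of political violence:

```python
def simulate(grievance=0.02, exit_rate=0.05, steps=200, dt=0.1):
    """Euler-integrate a two-stock system-dynamics model.

    Stocks: a moderate population M and a radicalized population R.
    Flows:  dM/dt = -grievance*M + exit_rate*R
            dR/dt =  grievance*M - exit_rate*R
    The total population is conserved by construction.
    """
    M, R = 1000.0, 0.0
    for _ in range(steps):
        flow_in = grievance * M * dt     # radicalization flow
        flow_out = exit_rate * R * dt    # de-radicalization flow
        M += flow_out - flow_in
        R += flow_in - flow_out
    return M, R

moderate, radicalized = simulate()
```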

  1. Using Tutte polynomials to analyze the structure of the benzodiazepines

    NASA Astrophysics Data System (ADS)

    Cadavid Muñoz, Juan José

    2014-05-01

    Graph theory in general, and Tutte polynomials in particular, are implemented for analyzing the chemical structure of the benzodiazepines. Similarity analyses are used with the Tutte polynomials to find other molecules that are similar to the benzodiazepines and therefore might show similar psycho-active actions for medical purposes, in order to avoid the drawbacks associated with benzodiazepine-based medicine. For each type of benzodiazepine, Tutte polynomials are computed and some numeric characteristics are obtained, such as the number of spanning trees and the number of spanning forests. Computations are done using Maple's GraphTheory computer algebra package. The obtained analytical results are of great importance in pharmaceutical engineering. As a future research line, the computational chemistry program Spartan will be used to extend this work and compare its results with those obtained from the Tutte polynomials of the benzodiazepines.
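
The spanning-tree count mentioned above is exactly the Tutte polynomial evaluated at T(1, 1), and it can be checked independently of Maple via Kirchhoff's matrix-tree theorem (any cofactor of the graph Laplacian). A short sketch:

```python
import numpy as np

def spanning_tree_count(adj):
    """Number of spanning trees via Kirchhoff's matrix-tree theorem.

    This equals the Tutte polynomial evaluated at T(1, 1): take any
    cofactor of the graph Laplacian L = D - A.
    """
    A = np.asarray(adj, dtype=float)
    L = np.diag(A.sum(axis=1)) - A
    return round(np.linalg.det(L[1:, 1:]))   # delete row/column 0

# Complete graph K4: Cayley's formula gives 4^(4-2) = 16 spanning trees.
K4 = [[0, 1, 1, 1], [1, 0, 1, 1], [1, 1, 0, 1], [1, 1, 1, 0]]
```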

  2. Computational physics in RISC environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rhoades, C.E. Jr.

    The new high performance Reduced Instruction Set Computers (RISC) promise near Cray-level performance at near personal-computer prices. This paper explores the performance, conversion and compatibility issues associated with developing, testing and using our traditional, large-scale simulation models in the RISC environments exemplified by the IBM RS6000 and MIPS R3000 machines. The questions of operating systems (CTSS versus UNIX), compilers (Fortran, C, pointers) and data are addressed in detail. Overall, it is concluded that the RISC environments are practical for a very wide range of computational physics activities. Indeed, all but the very largest two- and three-dimensional codes will work quite well, particularly in a single user environment. Easily projected hardware-performance increases will revolutionize the field of computational physics. The way we do research will change profoundly in the next few years. There is, however, nothing more difficult to plan, nor more dangerous to manage, than the creation of this new world.

  3. Computational physics in RISC environments. Revision 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rhoades, C.E. Jr.

    The new high performance Reduced Instruction Set Computers (RISC) promise near Cray-level performance at near personal-computer prices. This paper explores the performance, conversion and compatibility issues associated with developing, testing and using our traditional, large-scale simulation models in the RISC environments exemplified by the IBM RS6000 and MIPS R3000 machines. The questions of operating systems (CTSS versus UNIX), compilers (Fortran, C, pointers) and data are addressed in detail. Overall, it is concluded that the RISC environments are practical for a very wide range of computational physics activities. Indeed, all but the very largest two- and three-dimensional codes will work quite well, particularly in a single user environment. Easily projected hardware-performance increases will revolutionize the field of computational physics. The way we do research will change profoundly in the next few years. There is, however, nothing more difficult to plan, nor more dangerous to manage, than the creation of this new world.

  4. High Temperature Composite Analyzer (HITCAN) demonstration manual, version 1.0

    NASA Technical Reports Server (NTRS)

    Singhal, S. N.; Lackney, J. J.; Murthy, P. L. N.

    1993-01-01

    This manual comprises a variety of demonstration cases for the HITCAN (HIgh Temperature Composite ANalyzer) code. HITCAN is a general purpose computer program for predicting nonlinear global structural and local stress-strain response of arbitrarily oriented, multilayered high temperature metal matrix composite structures. HITCAN is written in FORTRAN 77 computer language and has been configured and executed on the NASA Lewis Research Center CRAY XMP and YMP computers. Detailed description of all program variables and terms used in this manual may be found in the User's Manual. The demonstration includes various cases to illustrate the features and analysis capabilities of the HITCAN computer code. These cases include: (1) static analysis, (2) nonlinear quasi-static (incremental) analysis, (3) modal analysis, (4) buckling analysis, (5) fiber degradation effects, (6) fabrication-induced stresses for a variety of structures; namely, beam, plate, ring, shell, and built-up structures. A brief discussion of each demonstration case with the associated input data file is provided. Sample results taken from the actual computer output are also included.

  5. [Computer games in childhood and adolescence: relations to addictive behavior, ADHD, and aggression].

    PubMed

    Frölich, Jan; Lehmkuhl, Gerd; Döpfner, Manfred

    2009-09-01

    Playing computer games has become one of the main leisure activities in children and adolescents and increasingly replaces traditional playing and interactional activities. There might exist developmental benefits or positive effects of computer games that can be used for educational or therapeutic purposes. More importantly, several studies have demonstrated that excessive computer game playing is associated with behavior that features all components of non-chemical addiction, and the prevalences across all age groups seem to be strikingly high. This overview relies on a Medline search. Its objective is to describe motivational and developmental characteristics attributed to computer games as well as the prevalences of computer playing in children and adolescents, to better understand the risks for addictive use. We especially focus on the relations of excessive computer playing with attention-deficit hyperactivity disorder (ADHD) and aggressive behavior. The results demonstrate that children with ADHD are especially vulnerable to addictive use of computer games due to their neuropsychological profile. Moreover, excessive violent computer game playing might be a significant risk variable for aggressive behavior in the presence of personality traits with aggressive cognitions and behavior scripts in the consumers. The increasing clinical significance of addictive computer game playing urgently necessitates the development of diagnostic and therapeutic tools for clinical practice as well as cooperation with allied disciplines.

  6. Differential patterns of laptop use and associated musculoskeletal discomfort in male and female college students.

    PubMed

    Bubric, Katherine; Hedge, Alan

    2016-11-22

    Laptop computers have surpassed desktop computers in popularity, especially among college student users. The portability of these devices raises concerns regarding healthy usage patterns in different settings, and there is a need to investigate the postures with which these devices are being used and associated reports of musculoskeletal discomfort. This study investigated the configurations in which laptops are used and the prevalence of musculoskeletal discomfort associated with laptop use in a survey of college students. The purpose of this was to identify differences in discomfort and/or postural choice between males and females. A sample of 90 male and 96 female college students completed an online questionnaire consisting of demographic questions, musculoskeletal discomfort indicators and questions regarding configurations of laptop use. Over 53% of participants reported experiencing musculoskeletal discomfort while using a laptop computer, with females reporting a higher prevalence of neck discomfort (p = 0.05) and shoulder discomfort (p = 0.006) than males. Participants reported using a laptop most commonly in positions at a desk and on a bed. Females were more likely than males to use a laptop on a bed with the computer positioned on their lap or with their legs crossed (p < 0.05). Males were more likely than females to work in positions necessitating a large trunk deviation to operate the laptop (p < 0.05), such as "sitting on a sofa with your feet on the floor (knees at 90°), bending over to use laptop that is resting on a coffee table or ottoman in front of you". A significant proportion of college students report experiencing musculoskeletal discomfort associated with laptop computer use. Sex differences exist in both choice of configurations and musculoskeletal discomfort associated with laptop use. Due to the portability of laptop computers, they are used in a variety of configurations and environments. This study identifies a number of different ways that laptop computers are used that have not been considered in previous research. These patterns of laptop use can be used to inform future work on the impact of technology use on discomfort.

  7. FORESEE: Fully Outsourced secuRe gEnome Study basEd on homomorphic Encryption

    PubMed Central

    2015-01-01

    Background The increasing availability of genome data motivates massive research studies in personalized treatment and precision medicine. Public cloud services provide a flexible way to mitigate the storage and computation burden of conducting genome-wide association studies (GWAS). However, data privacy has been a widespread concern when sharing sensitive information in a cloud environment. Methods We presented a novel framework (FORESEE: Fully Outsourced secuRe gEnome Study basEd on homomorphic Encryption) to fully outsource GWAS (i.e., chi-square statistic computation) using homomorphic encryption. The proposed framework enables secure divisions over encrypted data. We introduced two division protocols (i.e., secure errorless division and secure approximation division) with a trade-off between complexity and accuracy in computing chi-square statistics. Results The proposed framework was evaluated for the task of chi-square statistic computation with two case-control datasets from the 2015 iDASH genome privacy protection challenge. Experimental results show that the performance of FORESEE can be significantly improved through algorithmic optimization and parallel computation. Remarkably, the secure approximation division provides a significant performance gain without missing any significant SNPs in the chi-square association test using the aforementioned datasets. Conclusions Unlike many existing HME-based studies, in which final results need to be computed by the data owner due to the lack of a secure division operation, the proposed FORESEE framework supports complete outsourcing to the cloud and outputs the final encrypted chi-square statistics. PMID:26733391
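
For reference, the chi-square statistic that FORESEE computes under encryption reduces, for a 2x2 allele-count case-control table, to the standard closed form below. The counts here are made up for illustration; the framework's contribution is performing this arithmetic, including the division, on encrypted values:

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square for a 2x2 table of allele counts.

    Rows are cases (a, b) and controls (c, d); columns are the two alleles.
    chi2 = n * (a*d - b*c)^2 / ((a+b) * (c+d) * (a+c) * (b+d))
    """
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

stat = chi_square_2x2(60, 40, 40, 60)   # hypothetical allele counts
```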

  8. Excessive computer game playing among Norwegian adults: self-reported consequences of playing and association with mental health problems.

    PubMed

    Wenzel, H G; Bakken, I J; Johansson, A; Götestam, K G; Øren, Anita

    2009-12-01

    Computer games are the most advanced form of gaming. For most people, playing is an uncomplicated leisure activity; however, for a minority the gaming becomes excessive and is associated with negative consequences. The aim of the present study was to investigate computer game-playing behaviour in the general adult Norwegian population, and to explore mental health problems and self-reported consequences of playing. The survey includes 3,405 adults 16 to 74 years old (Norway 2007, response rate 35.3%). Overall, 65.5% of the respondents reported having ever played computer games (16-29 years, 93.9%; 30-39 years, 85.0%; 40-59 years, 56.2%; 60-74 years, 25.7%). Among 2,170 players, 89.8% reported playing less than 1 hr. as a daily average over the last month, 5.0% played 1-2 hr. daily, 3.1% played 2-4 hr. daily, and 2.2% reported playing > 4 hr. daily. The strongest risk factor for playing > 4 hr. daily was being an online player, followed by male gender, and single marital status. Reported negative consequences of computer game playing increased strongly with average daily playing time. Furthermore, the prevalence of self-reported sleeping problems, depression, suicidal ideation, anxiety, obsessions/compulsions, and alcohol/substance abuse increased with increasing playing time. This study showed that adult populations should also be included in research on computer game-playing behaviour and its consequences.

  10. Overview: early history of crop growth and photosynthesis modeling.

    PubMed

    El-Sharkawy, Mabrouk A

    2011-02-01

    As in industrial and engineering systems, there is a need to quantitatively study and analyze the many constituents of complex natural biological systems and agro-ecosystems via research-based mechanistic modeling. This objective is normally addressed by developing mathematically built descriptions of multilevel biological processes that provide biologists a means to quantitatively integrate experimental research findings, which might lead to a better understanding of whole systems and their interactions with surrounding environments. Aided by the computational power of the computer technology then available, pioneering cropping-system simulations were undertaken in the second half of the 20th century by several research groups across continents. This overview summarizes that initial pioneering effort to simulate plant growth and photosynthesis of crop canopies, focusing on the discovery of gaps in the scientific knowledge of the time. Examples are given of gaps where experimental research was needed to improve the validity and application of the constructed models, so that their benefit to mankind was enhanced. Such research necessitates close collaboration among experimentalists and model builders while adopting a multidisciplinary/inter-institutional approach. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  11. NASA/DOD Aerospace Knowledge Diffusion Research Project. Report 35: The use of computer networks in aerospace engineering

    NASA Technical Reports Server (NTRS)

    Bishop, Ann P.; Pinelli, Thomas E.

    1995-01-01

    This study used survey methods to explore and describe the use of computer networks by aerospace engineers. The study population included 2000 randomly selected U.S. aerospace engineers and scientists who subscribed to Aerospace Engineering. A total of 950 usable questionnaires were received by the cutoff date of July 1994. Study results contribute to existing knowledge about both computer network use and the nature of engineering work and communication. We found that 74 percent of mail survey respondents personally used computer networks. Electronic mail, file transfer, and remote login were the most widely used applications. Networks were used less often than face-to-face interactions in performing work tasks, but about equally with reading and telephone conversations, and more often than mail or fax. Network use was associated with a range of technical, organizational, and personal factors: lack of compatibility across systems, cost, inadequate access and training, and unwillingness to embrace new technologies and modes of work appear to discourage network use. The greatest positive impacts from networking appear to be increases in the amount of accurate and timely information available, better exchange of ideas across organizational boundaries, and enhanced work flexibility, efficiency, and quality. Involvement with classified or proprietary data and type of organizational structure did not distinguish network users from nonusers. The findings can be used by people involved in the design and implementation of networks in engineering communities to inform the development of more effective networking systems, services, and policies.

  12. Alliance for Computational Science Collaboration HBCU Partnership at Fisk University. Final Report 2001

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Collins, W. E.

    2004-08-16

    Computational Science plays a big role in research and development in mathematics, science, engineering and biomedical disciplines. The Alliance for Computational Science Collaboration (ACSC) has the goal of training African-American and other minority scientists in the computational science field for eventual employment with the Department of Energy (DOE). The involvement of Historically Black Colleges and Universities (HBCU) in the Alliance provides avenues for producing future DOE African-American scientists. Fisk University has been participating in this program through grants from the DOE. The DOE grant supported computational science activities at Fisk University. The research areas included energy-related projects, distributed computing, visualization of scientific systems, and biomedical computing. Students' involvement in computational science research included undergraduate summer research at Oak Ridge National Lab, on-campus research involving the participation of undergraduates, participation of undergraduate students and faculty members in workshops, and mentoring of students. These activities enhanced research and education in computational science, thereby adding to Fisk University's spectrum of research and educational capabilities. Among the successes of the computational science activities is the acceptance of three undergraduate students to graduate schools with full scholarships beginning fall 2002 (one into a master's degree program and two into doctoral degree programs).

  13. Review on pen-and-paper-based observational methods for assessing ergonomic risk factors of computer work.

    PubMed

    Rahman, Mohd Nasrull Abdol; Mohamad, Siti Shafika

    2017-01-01

    Computer work is associated with musculoskeletal disorders (MSDs), and several methods have been developed to assess the risk factors of computer work related to MSDs. This review aims to give an overview of the pen-and-paper-based observational methods currently available for assessing ergonomic risk factors of computer work. We searched an electronic database for materials from 1992 until 2015. The selected methods focused on computer work, pen-and-paper observational methods, office risk factors, and musculoskeletal disorders. This review assessed the risk factors covered by each pen-and-paper observational method associated with computer work, along with its reliability and validity. Two evaluators independently carried out this review. Seven observational methods used to assess exposure to office risk factors for work-related musculoskeletal disorders were identified. The risk factors covered by current pen-and-paper-based observational tools were postures, office components, force, and repetition. Of the seven methods, only five had been tested for reliability; they proved reliable and were rated as moderate to good. Of the seven methods, only four had been tested for validity, with moderate results. Many observational tools already exist, but no single tool appears to cover all of the risk factors (working posture, office components, force, repetition, and office environment) at office workstations and computer work. Although the most important factor in developing a tool is proper validation of the exposure assessment technique, several of the existing observational methods have not been tested for reliability and validity. Furthermore, this review could provide researchers with ways to improve pen-and-paper-based observational methods for assessing ergonomic risk factors of computer work.

  14. The impact of computer use on therapeutic alliance and continuance in care during the mental health intake.

    PubMed

    Rosen, Daniel C; Nakash, Ora; Alegría, Margarita

    2016-03-01

    Advances in information technology within clinical practice have rapidly expanded over recent years. Despite the documented benefits of using electronic health records, which often necessitate computer use during the clinical encounter, little is known about the impact of computer use during the mental health visit and its effect on the quality of the therapeutic alliance. We investigated the association between computer use and quality of the working alliance and continuance in care in 104 naturalistic mental health intake sessions. Data were collected from 8 safety-net outpatient clinics in the Northeast offering mental health services to a diverse client population. All intakes were video recorded. Use of a computer during the intake session was ascertained directly from the recording of the session (n = 22; 21.15% of intakes). Working alliance was assessed from the session videotapes by independent reliable coders, using the Working Alliance Inventory, Observer Form-bond scale. Therapist computer use was significantly associated with the quality of the observer-rated therapeutic alliance (coefficient = -6.29, SE = 2.2, p < .01; Cohen's effect size d = -0.76) and with clients' continuance in care (odds ratio = 0.11, CI = 0.03-0.38; p < .001). The quality of the observer-rated working alliance and clients' continuance in care were significantly lower in intakes in which the therapist used a computer during the session. Findings sound a cautionary note about advancing computer use within the mental health intake, and demonstrate the need for future research to identify the specific behaviors that promote or hinder a strong working alliance within the context of psychotherapy in the technological era. (c) 2016 APA, all rights reserved.
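The continuance-in-care association above is reported as an odds ratio. As a minimal arithmetic illustration (the counts below are hypothetical and are not the study's data), an odds ratio for a 2x2 table can be computed as:

```python
def odds_ratio(a: int, b: int, c: int, d: int) -> float:
    """Odds ratio for a 2x2 table:
        a = exposed, outcome present;    b = exposed, outcome absent
        c = unexposed, outcome present;  d = unexposed, outcome absent
    All counts here are hypothetical."""
    return (a / b) / (c / d)

# e.g., computer used vs. not, by continuance in care (invented counts)
print(round(odds_ratio(5, 17, 55, 21), 2))  # → 0.11
```

An odds ratio well below 1, as in the study's reported 0.11, indicates that the outcome (continuance in care) was far less likely in the exposed (computer-use) group.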

  15. The state of ergonomics for mobile computing technology.

    PubMed

    Dennerlein, Jack T

    2015-01-01

    Because mobile computing technologies, such as notebook computers, smart mobile phones, and tablet computers, afford users many different configurations through their intended mobility, there is concern about their effects on musculoskeletal pain and a need for usage recommendations. Therefore, the main goal of this paper is to determine which best practices surrounding the use of mobile computing devices can be gleaned from current field and laboratory studies. An expert review was completed. Field studies have documented the various user configurations, often including non-neutral postures, that users adopt when using mobile technology, along with some evidence suggesting that longer duration of use is associated with more discomfort. It is therefore prudent for users to take advantage of their mobility and not get stuck in any given posture for too long. The use of accessories such as appropriate cases or riser stands, as well as external keyboards and pointing devices, can also improve postures and comfort. Overall, the state of ergonomics for mobile technology is a work in progress, and there are more research questions to be addressed.

  16. Computer simulation models as tools for identifying research needs: A black duck population model

    USGS Publications Warehouse

    Ringelman, J.K.; Longcore, J.R.

    1980-01-01

    Existing data on the mortality and production rates of the black duck (Anas rubripes) were used to construct a WATFIV computer simulation model. The yearly cycle was divided into 8 phases: mortality during hunting, wintering, the reproductive period, molt, post-molt, and juvenile dispersal, plus production from original and renesting attempts. The program computes population changes for sex and age classes during each phase. After completion of a standard simulation run with all variable default values in effect, a sensitivity analysis was conducted by changing each of 50 input variables, 1 at a time, to assess the responsiveness of the model to changes in each variable. Thirteen variables resulted in a substantial change in population level. Adult mortality factors were important during the hunting and wintering phases. All production and mortality associated with original nesting attempts were sensitive, as was juvenile dispersal mortality. By identifying those factors which invoke the greatest population change, and providing an indication of the accuracy required in estimating these factors, the model helps to identify those variables which would be the most profitable topics for future research.
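The one-at-a-time sensitivity analysis described above can be sketched with a toy population model. This is a simplified stand-in for the 8-phase black duck model; the phase names and rates below are hypothetical, not the model's actual parameters:

```python
def final_population(params: dict) -> float:
    """Toy annual cycle: survival through a few phases, then production.
    A simplified stand-in for the 8-phase model (hypothetical rates)."""
    n = params["initial_pop"]
    for phase in ("hunting_survival", "winter_survival", "molt_survival"):
        n *= params[phase]
    n += n * params["nest_success"] * params["young_per_nest"] * params["juvenile_survival"]
    return n

defaults = {"initial_pop": 1000.0, "hunting_survival": 0.8,
            "winter_survival": 0.9, "molt_survival": 0.95,
            "nest_success": 0.5, "young_per_nest": 4.0,
            "juvenile_survival": 0.4}

baseline = final_population(defaults)
# One-at-a-time sensitivity: perturb each input by +10% and record
# the relative change in final population.
for name in defaults:
    if name == "initial_pop":
        continue
    perturbed = dict(defaults, **{name: defaults[name] * 1.1})
    change = (final_population(perturbed) - baseline) / baseline
    print(f"{name}: {change:+.1%}")  # survival rates: +10.0%; production rates: +4.4%
```

Variables producing the largest relative change in the final population are the ones where estimation accuracy matters most, which is how the model points to profitable research topics.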

  17. Experimental investigation of the persuasive impact of computer generated presentation graphics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vogel, D.R.

    1986-01-01

    Computer generated presentation graphics are increasingly becoming a tool to aid management in communicating information and to cause an audience to accept a point of view or take action. Unfortunately, technological capability significantly exceeds current levels of user understanding and effective application. This research examines experimentally one aspect of this problem: the persuasive impact of characteristics of computer generated presentation graphics. The research was founded in theory based on the message-learning approach to persuasion. Characteristics examined were color versus black and white, text versus image enhancement, and overhead transparencies versus 35 mm slides. Treatments were presented in association with a videotaped presentation intended to persuade subjects to invest time and money in a set of time management seminars. Data were collected using pre-measure, post-measure, and post-measure follow-up questionnaires. Presentation support had a direct impact on perceptions of the presenter as well as components of persuasion, i.e., attention, comprehension, yielding, and retention. Further, a strong positive relationship existed between enhanced perceptions of the presenter and attention and yielding.

  18. Simulation and analysis of a geopotential research mission

    NASA Technical Reports Server (NTRS)

    Schutz, B. E.

    1987-01-01

    Computer simulations were performed for a Geopotential Research Mission (GRM) to enable study of the gravitational sensitivity of the range-rate measurements between the two satellites and to provide a set of simulated measurements to assist in the evaluation of techniques developed for the determination of the gravity field. The simulations were conducted with two satellites in near circular, frozen orbits at 160 km altitude, separated by 300 km. High precision numerical integration of the polar orbits was used with a gravitational field complete to degree and order 360. The set of simulated data for a mission duration of about 32 days was generated on a Cray X-MP computer. The results presented cover the most recent simulation, S8703, and include a summary of the numerical integration of the simulated trajectories, a summary of the requirements to compute nominal reference trajectories to meet the initial orbit determination requirements for the recovery of the geopotential, an analysis of the nature of the one-way integrated Doppler measurements associated with the simulation, and a discussion of the data set to be made available.

  19. Design and analysis of advanced flight planning concepts

    NASA Technical Reports Server (NTRS)

    Sorensen, John A.

    1987-01-01

    The objectives of this continuing effort are to develop and evaluate new algorithms and advanced concepts for flight management and flight planning. This includes the minimization of fuel or direct operating costs, the integration of the airborne flight management and ground-based flight planning processes, and the enhancement of future traffic management systems design. Flight management (FMS) concepts are for on-board profile computation and steering of transport aircraft in the vertical plane between a city pair and along a given horizontal path. Flight planning (FPS) concepts are for the pre-flight ground based computation of the three-dimensional reference trajectory that connects the city pair and specifies the horizontal path, fuel load, and weather profiles for initializing the FMS. As part of these objectives, a new computer program called EFPLAN has been developed and utilized to study advanced flight planning concepts. EFPLAN represents an experimental version of an FPS. It has been developed to generate reference flight plans compatible as input to an FMS and to provide various options for flight planning research. This report describes EFPLAN and the associated research conducted in its development.

  20. Jackson State University's Center for Spatial Data Research and Applications: New facilities and new paradigms

    NASA Technical Reports Server (NTRS)

    Davis, Bruce E.; Elliot, Gregory

    1989-01-01

    Jackson State University recently established the Center for Spatial Data Research and Applications, a Geographical Information System (GIS) and remote sensing laboratory. Taking advantage of new technologies and new directions in the spatial (geographic) sciences, JSU is building a Center of Excellence in Spatial Data Management. New opportunities for research, applications, and employment are emerging. GIS requires fundamental shifts and new demands in traditional computer science and geographic training. The Center is not merely another computer lab but one setting the pace in a new applied frontier. GIS and its associated technologies are discussed. The Center's facilities are described. An ARC/INFO GIS runs on a VAX mainframe, with numerous workstations. Image processing packages include ELAS, LIPS, VICAR, and ERDAS. A host of hardware and software peripherals are used in support. Numerous projects are underway, such as the construction of a Gulf of Mexico environmental data base, development of AI in image processing, a land use dynamics study of metropolitan Jackson, and others. A new academic interdisciplinary program in Spatial Data Management is under development, combining courses in Geography and Computer Science. The broad range of JSU's GIS and remote sensing activities is addressed. The impacts on changing paradigms in the university and in the professional world conclude the discussion.

  1. Online Self-Administered Cognitive Testing Using the Amsterdam Cognition Scan: Establishing Psychometric Properties and Normative Data.

    PubMed

    Feenstra, Heleen Em; Vermeulen, Ivar E; Murre, Jaap Mj; Schagen, Sanne B

    2018-05-30

    Online tests enable efficient self-administered assessments and consequently facilitate large-scale data collection for many fields of research. The Amsterdam Cognition Scan is a new online neuropsychological test battery that measures a broad variety of cognitive functions. The aims of this study were to evaluate the psychometric properties of the Amsterdam Cognition Scan and to establish regression-based normative data. The Amsterdam Cognition Scan was self-administered twice from home, with an interval of 6 weeks, by 248 healthy Dutch-speaking adults aged 18 to 81 years. Test-retest reliability was moderate to high and comparable with that of equivalent traditional tests (intraclass correlation coefficients: .45 to .80; .83 for the Amsterdam Cognition Scan total score). Multiple regression analyses indicated that (1) participants' age negatively influenced all (12) cognitive measures, (2) gender was associated with performance on six measures, and (3) education level was positively associated with performance on four measures. In addition, we observed influences of tested computer skills and of self-reported amount of computer use on cognitive performance. Demographic characteristics that proved to influence Amsterdam Cognition Scan test performance were included in regression-based predictive formulas to establish demographically adjusted normative data. Initial results from a healthy adult sample indicate that the Amsterdam Cognition Scan has high usability and can give reliable measures of various generic cognitive ability areas. For future use, the influence of computer skills and experience should be further studied, and for repeated measurements, computer configuration should be consistent. The reported normative data allow for initial interpretation of Amsterdam Cognition Scan performances. ©Heleen EM Feenstra, Ivar E Vermeulen, Jaap MJ Murre, Sanne B Schagen. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 30.05.2018.
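The regression-based normative approach described above can be sketched as follows. The regression weights and residual SD below are invented for illustration; they are not the published Amsterdam Cognition Scan norms:

```python
def adjusted_z(score: float, age: float, female: int, edu: int) -> float:
    """Demographically corrected z-score from a (hypothetical) linear
    normative model: predicted = b0 + b1*age + b2*female + b3*edu."""
    b0, b1, b2, b3 = 55.0, -0.30, 1.5, 0.8   # made-up regression weights
    resid_sd = 6.0                            # made-up residual SD
    predicted = b0 + b1 * age + b2 * female + b3 * edu
    return (score - predicted) / resid_sd

# A 70-year-old man with education level 5 scoring 40:
print(round(adjusted_z(40, 70, 0, 5), 2))  # → 0.33
```

The idea is that a raw score is compared against what the regression predicts for a person of the same age, gender, and education, so the resulting z-score reflects deviation from the demographically matched norm rather than from the whole sample.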

  2. A Research Program in Computer Technology. 1986 Annual Technical Report

    DTIC Science & Technology

    1989-08-01

    Annual Technical Report, 1 July 1985 - June 1986: A Research Program in Computer Technology, ISI/SR-87-178, USC Information Sciences Institute (Unclassified). Personal author(s): ISI Research Staff. Subject terms: survivable networks; distributed processing, local networks, personal computers, workstation environment; computer acquisition, Strategic Computing.

  3. Computer Plotting Data Points in the Engine Research Building

    NASA Image and Video Library

    1956-09-21

    A female computer plotting compressor data in the Engine Research Building at the NACA’s Lewis Flight Propulsion Laboratory. The Computing Section was introduced during World War II to relieve short-handed research engineers of some of the tedious data-taking work. The computers made the initial computations and plotted the data graphically. The researcher then analyzed the data and either summarized the findings in a report or made modifications or ran the test again. With the introduction of mechanical computer systems in the 1950s the female computers learned how to encode the punch cards. As the data processing capabilities increased, fewer female computers were needed. Many left on their own to start families, while others earned mathematical degrees and moved into advanced positions.

  4. Task scheduling in dataflow computer architectures

    NASA Technical Reports Server (NTRS)

    Katsinis, Constantine

    1994-01-01

    Dataflow computers provide a platform for the solution of a large class of computational problems, which includes digital signal processing and image processing. Many typical applications are represented by a set of tasks which can be repetitively executed in parallel as specified by an associated dataflow graph. Research in this area aims to model these architectures, develop scheduling procedures, and predict the transient and steady state performance. Researchers at NASA have created a model and developed associated software tools which are capable of analyzing a dataflow graph and predicting its runtime performance under various resource and timing constraints. These models and tools were extended and used in this work. Experiments using these tools revealed certain properties of such graphs that require further study. Specifically, the transient behavior at the beginning of the execution of a graph can have a significant effect on the steady state performance. Transformation and retiming of the application algorithm and its initial conditions can produce a different transient behavior and consequently different steady state performance. The effect of such transformations on the resource requirements or under resource constraints requires extensive study. Task scheduling to obtain maximum performance (based on user-defined criteria), or to satisfy a set of resource constraints, can also be significantly affected by a transformation of the application algorithm. Since task scheduling is performed by heuristic algorithms, further research is needed to determine if new scheduling heuristics can be developed that can exploit such transformations. This work has provided the initial development for further long-term research efforts. A simulation tool was completed to provide insight into the transient and steady state execution of a dataflow graph. 
A set of scheduling algorithms was completed which can operate in conjunction with the modeling and performance tools previously developed. Initial studies on the performance of these algorithms were done to examine the effects of application algorithm transformations as measured by such quantities as number of processors, time between outputs, time between input and output, communication time, and memory size.
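As a minimal illustration of the kind of heuristic task scheduling discussed above (not the actual NASA tools or algorithms developed in this work), a greedy list scheduler for a dataflow DAG might look like:

```python
from collections import deque

def list_schedule(tasks, deps, durations, n_proc):
    """Greedy list scheduling of a dataflow graph (a DAG) onto n_proc
    processors: repeatedly take a ready task and place it on the
    processor that frees up first. A simple illustrative heuristic."""
    indeg = {t: 0 for t in tasks}
    succ = {t: [] for t in tasks}
    for a, b in deps:                    # edge a -> b: a feeds b
        indeg[b] += 1
        succ[a].append(b)
    ready = deque(t for t in tasks if indeg[t] == 0)
    proc_free = [0] * n_proc             # when each processor is next idle
    finish = {}
    while ready:
        t = ready.popleft()
        # all predecessors of t have finished by the time it is ready
        pred_done = max((finish[p] for p, q in deps if q == t), default=0)
        i = min(range(n_proc), key=lambda k: proc_free[k])
        start = max(proc_free[i], pred_done)
        finish[t] = start + durations[t]
        proc_free[i] = finish[t]
        for s in succ[t]:
            indeg[s] -= 1
            if indeg[s] == 0:
                ready.append(s)
    return max(finish.values())          # makespan

tasks = ["a", "b", "c", "d"]
deps = [("a", "c"), ("b", "c"), ("c", "d")]
durations = {"a": 2, "b": 3, "c": 1, "d": 2}
print(list_schedule(tasks, deps, durations, n_proc=2))  # → 6
```

Transforming or retiming the graph (reordering dependencies or shifting initial conditions), as the report discusses, changes which tasks are ready when, and hence can change the makespan such a heuristic achieves.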

  5. Government Cloud Computing Policies: Potential Opportunities for Advancing Military Biomedical Research.

    PubMed

    Lebeda, Frank J; Zalatoris, Jeffrey J; Scheerer, Julia B

    2018-02-07

    This position paper summarizes the development and the present status of Department of Defense (DoD) and other government policies and guidances regarding cloud computing services. Due to the heterogeneous and growing biomedical big datasets, cloud computing services offer an opportunity to mitigate the associated storage and analysis requirements. Having on-demand network access to a shared pool of flexible computing resources creates a consolidated system that should reduce potential duplications of effort in military biomedical research. Interactive, online literature searches were performed with Google, at the Defense Technical Information Center, and at two National Institutes of Health research portfolio information sites. References cited within some of the collected documents also served as literature resources. We gathered, selected, and reviewed DoD and other government cloud computing policies and guidances published from 2009 to 2017. These policies were intended to consolidate computer resources within the government and reduce costs by decreasing the number of federal data centers and by migrating electronic data to cloud systems. Initial White House Office of Management and Budget information technology guidelines were developed for cloud usage, followed by policies and other documents from the DoD, the Defense Health Agency, and the Armed Services. Security standards from the National Institute of Standards and Technology, the Government Services Administration, the DoD, and the Army were also developed. Government Services Administration and DoD Inspectors General monitored cloud usage by the DoD. A 2016 Government Accountability Office report characterized cloud computing as being economical, flexible and fast. A congressionally mandated independent study reported that the DoD was active in offering a wide selection of commercial cloud services in addition to its milCloud system. 
    Our findings from the Department of Health and Human Services indicated that the security infrastructure in cloud services may be more compliant with the Health Insurance Portability and Accountability Act of 1996 regulations than traditional methods. To gauge the DoD's adoption of cloud technologies, proposed metrics included cost factors, ease of use, automation, availability, accessibility, security, and policy compliance. Since 2009, plans and policies have been developed for the use of cloud technology to help consolidate and reduce the number of data centers, which was expected to reduce costs, improve environmental factors, enhance information technology security, and maintain mission support for service members. Cloud technologies were also expected to improve employee efficiency and productivity. Federal cloud computing policies within the last decade also offered increased opportunities to advance military healthcare. It was assumed that these opportunities would benefit consumers of healthcare and health science data by allowing more access to centralized cloud computer facilities to store, analyze, search, and share relevant data, to enhance standardization, and to reduce potential duplications of effort. We recommend that cloud computing be considered by DoD biomedical researchers for increasing connectivity, presumably by facilitating communications and data sharing, among the various intra- and extramural laboratories. We also recommend that policies and other guidances be updated to include developing additional metrics that will help stakeholders evaluate the above-mentioned assumptions and expectations. Published by Oxford University Press on behalf of the Association of Military Surgeons of the United States 2018. This work is written by (a) US Government employee(s) and is in the public domain in the US.

  6. Human-computer interaction: psychological aspects of the human use of computing.

    PubMed

    Olson, Gary M; Olson, Judith S

    2003-01-01

    Human-computer interaction (HCI) is a multidisciplinary field in which psychology and other social sciences unite with computer science and related technical fields with the goal of making computing systems that are both useful and usable. It is a blend of applied and basic research, both drawing from psychological research and contributing new ideas to it. New technologies continuously challenge HCI researchers with new options, as do the demands of new audiences and uses. A variety of usability methods have been developed that draw upon psychological principles. HCI research has expanded beyond its roots in the cognitive processes of individual users to include social and organizational processes involved in computer usage in real environments as well as the use of computers in collaboration. HCI researchers need to be mindful of the longer-term changes brought about by the use of computing in a variety of venues.

  7. Research Activity in Computational Physics utilizing High Performance Computing: Co-authorship Network Analysis

    NASA Astrophysics Data System (ADS)

    Ahn, Sul-Ah; Jung, Youngim

    2016-10-01

    The research activities of computational physicists utilizing high performance computing are analyzed by bibliometric approaches. This study aims to provide computational physicists utilizing high-performance computing, as well as policy planners, with useful bibliometric results for an assessment of research activities. To achieve this purpose, we carried out a co-authorship network analysis of journal articles to assess the research activities of researchers in high-performance computational physics as a case study. For this study, we used journal articles from Elsevier's Scopus database covering the time period 2004-2013. We ranked authors in the physics field utilizing high-performance computing by the number of papers published during the ten years from 2004. Finally, we drew the co-authorship network for the 45 top-ranked authors and their coauthors, and described some features of the co-authorship network in relation to author rank. Suggestions for further studies are discussed.
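The core of such a co-authorship analysis, ranking authors by paper count and weighting co-author links by joint papers, can be sketched as follows. The author lists below are hypothetical stand-ins for the Scopus records the study analyzed:

```python
from collections import Counter
from itertools import combinations

# Hypothetical author lists of journal articles (illustrative only)
papers = [
    ["Kim", "Lee", "Park"],
    ["Kim", "Lee"],
    ["Park", "Choi"],
    ["Kim", "Choi", "Lee"],
]

# Author rank by number of papers.
paper_counts = Counter(a for authors in papers for a in authors)

# Co-authorship edges weighted by number of joint papers.
edges = Counter()
for authors in papers:
    for pair in combinations(sorted(authors), 2):
        edges[pair] += 1

print(paper_counts.most_common(2))   # → [('Kim', 3), ('Lee', 3)]
print(edges[("Kim", "Lee")])         # → 3
```

The study's network of 45 top authors and their coauthors is this same construction at scale, with node rank given by `paper_counts` and edge weight by joint publications.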

  8. Assessment methodology for computer-based instructional simulations.

    PubMed

    Koenig, Alan; Iseli, Markus; Wainess, Richard; Lee, John J

    2013-10-01

    Computer-based instructional simulations are becoming more and more ubiquitous, particularly in military and medical domains. As the technology that drives these simulations grows ever more sophisticated, the underlying pedagogical models for how instruction, assessment, and feedback are implemented within these systems must evolve accordingly. In this article, we review some of the existing educational approaches to medical simulations, and present pedagogical methodologies that have been used in the design and development of games and simulations at the University of California, Los Angeles, Center for Research on Evaluation, Standards, and Student Testing. In particular, we present a methodology for how automated assessments of computer-based simulations can be implemented using ontologies and Bayesian networks, and discuss their advantages and design considerations for pedagogical use. Reprint & Copyright © 2013 Association of Military Surgeons of the U.S.
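As a concrete illustration of the assessment approach described above, the following sketch implements the smallest possible Bayesian network (a single Skill node with a TaskOutcome observation model) and updates the belief in mastery from observed simulation outcomes. The probabilities and variable names are hypothetical; the CRESST systems described in the article use richer ontologies and multi-node networks.

```python
def posterior_skill(prior_skilled, p_correct_given_skill, outcomes):
    """Sequential Bayes update on a two-node network Skill -> TaskOutcome.
    p_correct_given_skill maps skill state (True/False) to the probability
    of performing a task correctly; outcomes is the observed sequence."""
    p = prior_skilled
    for correct in outcomes:
        like_s = p_correct_given_skill[True] if correct else 1 - p_correct_given_skill[True]
        like_u = p_correct_given_skill[False] if correct else 1 - p_correct_given_skill[False]
        num = like_s * p
        p = num / (num + like_u * (1 - p))  # posterior becomes next prior
    return p

# hypothetical CPT: skilled trainees succeed 90% of the time, unskilled 30%
belief = posterior_skill(0.5, {True: 0.9, False: 0.3}, [True, True, False, True])
```

After three successes and one failure, the belief in mastery rises from 0.5 to roughly 0.79.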

  9. NASA/DOD Aerospace Knowledge Diffusion Research Project. Paper 39: The role of computer networks in aerospace engineering

    NASA Technical Reports Server (NTRS)

    Bishop, Ann P.; Pinelli, Thomas E.

    1994-01-01

    This paper presents selected results from an empirical investigation into the use of computer networks in aerospace engineering. Such networks allow aerospace engineers to communicate with people and access remote resources through electronic mail, file transfer, and remote log-in. The study drew its subjects from private sector, government and academic organizations in the U.S. aerospace industry. Data presented here were gathered in a mail survey, conducted in Spring 1993, that was distributed to aerospace engineers performing a wide variety of jobs. Results from the mail survey provide a snapshot of the current use of computer networks in the aerospace industry, suggest factors associated with the use of networks, and identify perceived impacts of networks on aerospace engineering work and communication.

  10. Reynolds Number Effects on Leading Edge Radius Variations of a Supersonic Transport at Transonic Conditions

    NASA Technical Reports Server (NTRS)

    Rivers, S. M. B.; Wahls, R. A.; Owens, L. R.

    2001-01-01

    A computational study focused on leading-edge radius effects and associated Reynolds number sensitivity for a High Speed Civil Transport configuration at transonic conditions was conducted as part of NASA's High Speed Research Program. The primary purposes were to assess the capabilities of computational fluid dynamics to predict Reynolds number effects for a range of leading-edge radius distributions on a second-generation supersonic transport configuration, and to evaluate the potential performance benefits of each at the transonic cruise condition. Five leading-edge radius distributions are described, and the potential performance benefit including the Reynolds number sensitivity for each is presented. Computational results for two leading-edge radius distributions are compared with experimental results acquired in the National Transonic Facility over a broad Reynolds number range.

  11. Statistical physics of hard combinatorial optimization: Vertex cover problem

    NASA Astrophysics Data System (ADS)

    Zhao, Jin-Hua; Zhou, Hai-Jun

    2014-07-01

    Typical-case computational complexity is a research topic at the boundary of computer science, applied mathematics, and statistical physics. In the last twenty years, the replica-symmetry-breaking mean field theory of spin glasses and the associated message-passing algorithms have greatly deepened our understanding of typical-case computational complexity. In this paper, we use the vertex cover problem, a basic nondeterministic-polynomial (NP)-complete combinatorial optimization problem of wide application, as an example to introduce the statistical physics methods and algorithms. We do not go into the technical details but emphasize mainly the intuitive physical meanings of the message-passing equations. A reader unfamiliar with the field should be able to understand, to a large extent, the physics behind the mean field approaches and to adapt the mean field methods to solving other optimization problems.
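The message-passing algorithms the paper surveys are beyond a short sketch, but the problem itself is easy to state in code. Below is the classic maximal-matching 2-approximation for minimum vertex cover, included only to make the optimization problem concrete; it is not one of the statistical-physics algorithms discussed by the authors.

```python
def vertex_cover_2approx(edges):
    """Maximal-matching 2-approximation for minimum vertex cover:
    scan the edges, and whenever one is uncovered add both endpoints."""
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:
            cover.update((u, v))
    return cover

edges = [(1, 2), (2, 3), (3, 4), (4, 1), (2, 4)]
cover = vertex_cover_2approx(edges)
assert all(u in cover or v in cover for u, v in edges)  # every edge covered
```

The returned cover is guaranteed to be at most twice the size of the optimum, which is exactly the kind of worst-case bound that typical-case analysis refines.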

  12. Runway exit designs for capacity improvement demonstrations. Phase 2: Computer model development

    NASA Technical Reports Server (NTRS)

    Trani, A. A.; Hobeika, A. G.; Kim, B. J.; Nunna, V.; Zhong, C.

    1992-01-01

    The development of a computer simulation/optimization model is described, whose purposes are to: (1) estimate the optimal locations of existing and proposed runway turnoffs; and (2) estimate the geometric design requirements associated with newly developed high-speed turnoffs. The model, named REDIM 2.0, represents a stand-alone application to be used by airport planners, designers, and researchers alike to estimate optimal turnoff locations. The main procedures implemented in the software package are described in detail, and possible applications are illustrated using six major runway scenarios. The main output of the computer program is an estimate of the weighted average runway occupancy time for a user-defined aircraft population. The location and geometric characteristics of each turnoff are also provided to the user.
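REDIM's optimization machinery is much richer than this, but its headline output metric is simple: a weighted average runway occupancy time (ROT) over the user-defined aircraft population. A minimal sketch, with made-up fleet fractions and per-class ROT values:

```python
def weighted_avg_rot(fleet_mix, rot_seconds):
    """Weighted average runway occupancy time for an aircraft population:
    sum of (fraction of operations by class i) * (ROT of class i)."""
    assert abs(sum(fleet_mix.values()) - 1.0) < 1e-9  # fractions must sum to 1
    return sum(frac * rot_seconds[cls] for cls, frac in fleet_mix.items())

# illustrative fleet mix and per-class runway occupancy times in seconds
mix = {"small": 0.3, "large": 0.5, "heavy": 0.2}
rot = {"small": 45.0, "large": 52.0, "heavy": 60.0}
avg_rot = weighted_avg_rot(mix, rot)
```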

  13. Integrating information technologies as tools for surgical research.

    PubMed

    Schell, Scott R

    2005-10-01

    Surgical research is dependent upon information technologies. Selecting the computer, operating system, and software tools that best support the surgical investigator's needs requires careful planning before research commences. This manuscript presents a brief tutorial on how surgical investigators can best select these information technologies, with comparisons of existing systems, software, and solutions, and recommendations for choosing among them. Privacy concerns, based upon HIPAA and other regulations, now require careful proactive attention to avoid legal penalties, civil litigation, and financial loss. Security issues are included in the discussion of the selection and application of information technology. This material was derived from a segment of the Association for Academic Surgery's Fundamentals of Surgical Research course.

  14. Marcus Theory of Ion-Pairing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roy, Santanu; Baer, Marcel D.; Mundy, Christopher J.

    We present a theory for ion pair dissociation and association, motivated by the concepts of the Marcus theory of electron transfer. Despite the extensive research on ion-pairing in many chemical and biological processes, much can be learned from the exploration of collective reaction coordinates. To this end, we explore two reaction coordinates, ion pair distance and coordination number. The study of the correlation between these reaction coordinates provides a new insight into the mechanism and kinetics of ion pair dissociation and association in water. The potentials of mean force on these 2D surfaces, computed from molecular dynamics simulations of different monovalent ion pairs, reveal a Marcus-like mechanism for ion-pairing: Water molecules rearrange, forming an activated coordination state prior to ion pair dissociation or association, followed by relaxation of the coordination state due to further water rearrangement. Like Marcus theory, we find the existence of an inverted region where the transition rates are slower with increasing exergonicity. This study provides a new perspective for future investigations of ion-pairing and transport. SR, CJM, and GKS were supported by the U.S. Department of Energy, Office of Science, Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences, and Biosciences. MDB was supported by MS3 (Materials Synthesis and Simulation Across Scales) Initiative, a Laboratory Directed Research and Development Program at Pacific Northwest National Laboratory (PNNL). The research was performed using PNNL Institutional Computing. PNNL is a multi-program national laboratory operated by Battelle for the U.S. Department of Energy.
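The 2D potential of mean force described above is, in essence, a Boltzmann inversion of the sampled probability over the two reaction coordinates, F(r, n) = -kT ln P(r, n). The sketch below illustrates that computation on a made-up toy trajectory; it is not the authors' analysis pipeline, and real PMFs are typically obtained from biased-sampling simulations.

```python
import math
from collections import Counter

def pmf_2d(states, kT=1.0):
    """Potential of mean force over discretized 2D reaction coordinates:
    F(r, n) = -kT * ln P(r, n), shifted so the global minimum is zero."""
    counts = Counter(states)
    total = sum(counts.values())
    F = {s: -kT * math.log(c / total) for s, c in counts.items()}
    f_min = min(F.values())
    return {s: f - f_min for s, f in F.items()}

# toy trajectory of (distance_bin, coordination_number) states
traj = [(1, 6)] * 70 + [(2, 5)] * 20 + [(3, 5)] * 10
F = pmf_2d(traj)  # F[(1, 6)] is the free-energy minimum
```

The less-visited coordination state (2, 5) sits kT ln(70/20) above the minimum, which is how barriers between coordination states show up on such surfaces.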

  15. An overview of computer viruses in a research environment

    NASA Technical Reports Server (NTRS)

    Bishop, Matt

    1991-01-01

    The threat of attack by computer viruses is in reality a very small part of a much more general threat, specifically threats aimed at subverting computer security. Here, computer viruses are examined as a form of malicious logic in a research and development environment. A relation is drawn between viruses and various models of security and integrity. Current research techniques aimed at controlling the threats posed to computer systems by computer viruses in particular, and malicious logic in general, are examined. Finally, the vulnerabilities of research and development systems that malicious logic and computer viruses may exploit are briefly examined.

  16. Opportunities for Breakthroughs in Large-Scale Computational Simulation and Design

    NASA Technical Reports Server (NTRS)

    Alexandrov, Natalia; Alter, Stephen J.; Atkins, Harold L.; Bey, Kim S.; Bibb, Karen L.; Biedron, Robert T.; Carpenter, Mark H.; Cheatwood, F. McNeil; Drummond, Philip J.; Gnoffo, Peter A.

    2002-01-01

    Opportunities for breakthroughs in the large-scale computational simulation and design of aerospace vehicles are presented. Computational fluid dynamics tools to be used within multidisciplinary analysis and design methods are emphasized. The opportunities stem from speedups and robustness improvements in the underlying unit operations associated with simulation (geometry modeling, grid generation, physical modeling, analysis, etc.). Further, an improved programming environment can synergistically integrate these unit operations to leverage the gains. The speedups result from reducing the problem setup time through geometry modeling and grid generation operations, and reducing the solution time through the operation counts associated with solving the discretized equations to a sufficient accuracy. The opportunities are addressed only at a general level here, but an extensive list of references containing further details is included. The opportunities discussed are being addressed through the Fast Adaptive Aerospace Tools (FAAST) element of the Advanced Systems Concept to Test (ASCoT) and the third Generation Reusable Launch Vehicles (RLV) projects at NASA Langley Research Center. The overall goal is to enable greater inroads into the design process with large-scale simulations.

  17. The influence of leg-to-body ratio (LBR) on judgments of female physical attractiveness: assessments of computer-generated images varying in LBR.

    PubMed

    Frederick, David A; Hadji-Michael, Maria; Furnham, Adrian; Swami, Viren

    2010-01-01

    The leg-to-body ratio (LBR), which is reliably associated with developmental stability and health outcomes, is an understudied component of human physical attractiveness. Several studies examining the effects of LBR on aesthetic judgments have been limited by the reliance on stimuli composed of hand-drawn silhouettes. In the present study, we developed a new set of female computer-generated images portraying eight levels of LBR that fell within the typical range of human variation. A community sample of 207 Britons in London and students from two samples drawn from a US university (Ns=940, 114) rated the physical attractiveness of the images. We found that mid-ranging female LBRs were perceived as maximally attractive. The present research overcomes some of the problems associated with past work on LBR and aesthetic preferences through use of computer-generated images rather than hand-drawn images and provides an instrument that may be useful in future investigations of LBR preferences. Copyright 2009 Elsevier Ltd. All rights reserved.

  18. The use of cross-section warping functions in composite rotor blade analysis

    NASA Technical Reports Server (NTRS)

    Kosmatka, J. B.

    1992-01-01

    During the contracted period, our research was concentrated in three areas. The first was the development of an accurate and computationally efficient method for predicting the cross-section warping functions in an arbitrary cross-section composed of isotropic and/or anisotropic materials. The second area of research was the development of a general higher-order one-dimensional theory for anisotropic beams. The third area of research was the development of an analytical model for assessing the extension-bend-twist coupling behavior of nonhomogeneous anisotropic beams with initial twist. In the remaining six chapters of this report, the three research areas and associated sub-research areas are covered independently, including separate introductions, theoretical developments, numerical results, and references.

  19. Research in progress and other activities of the Institute for Computer Applications in Science and Engineering

    NASA Technical Reports Server (NTRS)

    1993-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics and computer science during the period April 1, 1993 through September 30, 1993. The major categories of the current ICASE research program are: (1) applied and numerical mathematics, including numerical analysis and algorithm development; (2) theoretical and computational research in fluid mechanics in selected areas of interest to LaRC, including acoustics and combustion; (3) experimental research in transition and turbulence and aerodynamics involving LaRC facilities and scientists; and (4) computer science.

  20. Research in progress in applied mathematics, numerical analysis, fluid mechanics, and computer science

    NASA Technical Reports Server (NTRS)

    1994-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period October 1, 1993 through March 31, 1994. The major categories of the current ICASE research program are: (1) applied and numerical mathematics, including numerical analysis and algorithm development; (2) theoretical and computational research in fluid mechanics in selected areas of interest to LaRC, including acoustics and combustion; (3) experimental research in transition and turbulence and aerodynamics involving LaRC facilities and scientists; and (4) computer science.

  1. Computational Pathology: A Path Ahead.

    PubMed

    Louis, David N; Feldman, Michael; Carter, Alexis B; Dighe, Anand S; Pfeifer, John D; Bry, Lynn; Almeida, Jonas S; Saltz, Joel; Braun, Jonathan; Tomaszewski, John E; Gilbertson, John R; Sinard, John H; Gerber, Georg K; Galli, Stephen J; Golden, Jeffrey A; Becich, Michael J

    2016-01-01

    We define the scope and needs of the new discipline of computational pathology, a discipline critical to the future of both the practice of pathology and, more broadly, medical practice in general. A meeting was convened in Boston, Massachusetts, in July 2014 prior to the annual Association of Pathology Chairs meeting, and it was attended by a variety of pathologists, including individuals highly invested in pathology informatics as well as chairs of pathology departments. The meeting made recommendations to promote computational pathology, including: clearly defining the field and articulating its value propositions; asserting that the value propositions for health care systems must include means to incorporate robust computational approaches that implement data-driven methods to aid in guiding individual and population health care; leveraging computational pathology as a center for data interpretation in modern health care systems; stating that realizing the value proposition will require working with institutional administrations, other departments, and pathology colleagues; declaring that a robust pipeline should be fostered that trains and develops future computational pathologists, drawn from both pathology and nonpathology backgrounds; and deciding that computational pathology should serve as a hub for data-related research in health care systems. The dissemination of these recommendations to pathology and bioinformatics departments should help facilitate the development of computational pathology.

  2. Addressing the translational dilemma: dynamic knowledge representation of inflammation using agent-based modeling.

    PubMed

    An, Gary; Christley, Scott

    2012-01-01

    Given the panoply of system-level diseases that result from disordered inflammation, such as sepsis, atherosclerosis, cancer, and autoimmune disorders, understanding and characterizing the inflammatory response is a key target of biomedical research. Untangling the complex behavioral configurations associated with a process as ubiquitous as inflammation represents a prototype of the translational dilemma: the ability to translate mechanistic knowledge into effective therapeutics. A critical failure point in the current research environment is a throughput bottleneck at the level of evaluating hypotheses of mechanistic causality; these hypotheses represent the key step toward the application of knowledge for therapy development and design. Addressing the translational dilemma will require utilizing the ever-increasing power of computers and computational modeling to increase the efficiency of the scientific method in the identification and evaluation of hypotheses of mechanistic causality. More specifically, development needs to focus on facilitating the ability of non-computer trained biomedical researchers to utilize and instantiate their knowledge in dynamic computational models. This is termed "dynamic knowledge representation." Agent-based modeling is an object-oriented, discrete-event, rule-based simulation method that is well suited for biomedical dynamic knowledge representation. Agent-based modeling has been used in the study of inflammation at multiple scales. The ability of agent-based modeling to encompass multiple scales of biological process as well as spatial considerations, coupled with an intuitive modeling paradigm, suggests that this modeling framework is well suited for addressing the translational dilemma. This review describes agent-based modeling, gives examples of its applications in the study of inflammation, and introduces a proposed general expansion of the use of modeling and simulation to augment the generation and evaluation of knowledge by the biomedical research community at large.

  3. Greater power and computational efficiency for kernel-based association testing of sets of genetic variants

    PubMed Central

    Lippert, Christoph; Xiang, Jing; Horta, Danilo; Widmer, Christian; Kadie, Carl; Heckerman, David; Listgarten, Jennifer

    2014-01-01

    Motivation: Set-based variance component tests have been identified as a way to increase power in association studies by aggregating weak individual effects. However, the choice of test statistic has been largely ignored even though it may play an important role in obtaining optimal power. We compared a standard statistical test—a score test—with a recently developed likelihood ratio (LR) test. Further, when correction for hidden structure is needed, or gene–gene interactions are sought, state-of-the-art algorithms for both the score and LR tests can be computationally impractical. Thus we develop new computationally efficient methods. Results: After reviewing theoretical differences in performance between the score and LR tests, we find empirically on real data that the LR test generally has more power. In particular, on 15 of 17 real datasets, the LR test yielded at least as many associations as the score test—up to 23 more associations—whereas the score test yielded at most one more association than the LR test in the two remaining datasets. On synthetic data, we find that the LR test yielded up to 12% more associations, consistent with our results on real data, but also observe a regime of extremely small signal where the score test yielded up to 25% more associations than the LR test, consistent with theory. Finally, our computational speedups now enable (i) efficient LR testing when the background kernel is full rank, and (ii) efficient score testing when the background kernel changes with each test, as for gene–gene interaction tests. The latter yielded a factor of 2000 speedup on a cohort of size 13 500. Availability: Software available at http://research.microsoft.com/en-us/um/redmond/projects/MSCompBio/Fastlmm/. Contact: heckerma@microsoft.com Supplementary information: Supplementary data are available at Bioinformatics online. PMID:25075117
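The mechanics of a likelihood-ratio test are easy to sketch. The fragment below computes the LR statistic and its p-value for one degree of freedom using the identity that the chi-square(1) survival function is erfc(sqrt(x / 2)). The log-likelihood values are made up, and note that for variance-component tests the null hypothesis lies on a boundary of the parameter space, so the true null distribution is a chi-square mixture that this simple sketch ignores.

```python
import math

def lr_test_df1(loglik_null, loglik_alt):
    """Likelihood-ratio statistic and p-value for 1 degree of freedom,
    using the chi-square(1) survival function erfc(sqrt(x / 2))."""
    stat = 2.0 * (loglik_alt - loglik_null)
    return stat, math.erfc(math.sqrt(stat / 2.0))

# made-up fitted log-likelihoods under the null and alternative models
stat, p = lr_test_df1(loglik_null=-105.3, loglik_alt=-100.0)
```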

  4. Using Computational Toxicology to Enable Risk-Based ...

    EPA Pesticide Factsheets

    Slide presentation at the Drug Safety Gordon Research Conference 2016 on research efforts in NCCT to enable Computational Toxicology to support risk assessment.

  5. Efficient Monte Carlo Estimation of the Expected Value of Sample Information Using Moment Matching.

    PubMed

    Heath, Anna; Manolopoulou, Ioanna; Baio, Gianluca

    2018-02-01

    The Expected Value of Sample Information (EVSI) is used to calculate the economic value of a new research strategy. Although this value would be important to both researchers and funders, there are very few practical applications of the EVSI. This is due to the computational difficulty of calculating the EVSI in practical health economic models using nested simulations. We present an approximation method for the EVSI that is framed in a Bayesian setting and is based on estimating the distribution of the posterior mean of the incremental net benefit across all possible future samples, known as the distribution of the preposterior mean. Specifically, this distribution is estimated using moment matching coupled with simulations that are available for probabilistic sensitivity analysis, which is typically mandatory in health economic evaluations. This novel approximation method is applied to a health economic model that has previously been used to assess the performance of other EVSI estimators, and it estimates the EVSI accurately. The computational time for this method is competitive with other methods. In summary, we have developed a new calculation method for the EVSI that is computationally efficient and accurate. The method relies on some additional simulation, and so can be expensive in models with a large computational cost.
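For intuition, the computation the authors are streamlining can be sketched with a toy conjugate model. In the Beta-Bernoulli example below (all priors, gains, and costs invented for illustration), conjugacy gives the posterior mean in closed form, so only the outer loop over simulated future datasets is Monte Carlo; in realistic non-conjugate health economic models that inner step is itself a nested simulation, which is the cost the moment-matching method is designed to avoid.

```python
import random

def evsi_beta_bernoulli(a, b, n, gain, cost, n_outer=20000, seed=1):
    """EVSI for a Beta(a, b) prior on a response probability p and a
    proposed n-patient Bernoulli study, with incremental net benefit
    INB(p) = gain * p - cost. Conjugacy gives the posterior mean in
    closed form, so only the outer loop over datasets is Monte Carlo."""
    rng = random.Random(seed)
    prior_mean = a / (a + b)
    value_now = max(0.0, gain * prior_mean - cost)  # best decision today
    total = 0.0
    for _ in range(n_outer):
        p = rng.betavariate(a, b)                    # sample a "true" p
        x = sum(rng.random() < p for _ in range(n))  # simulate the study
        post_mean = (a + x) / (a + b + n)            # preposterior mean
        total += max(0.0, gain * post_mean - cost)   # best decision then
    return total / n_outer - value_now

evsi = evsi_beta_bernoulli(a=2, b=2, n=20, gain=1000.0, cost=500.0)
```

With n = 0 the study carries no information and the EVSI is exactly zero, a useful sanity check on any EVSI estimator.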

  6. An autonomous molecular computer for logical control of gene expression

    PubMed Central

    Benenson, Yaakov; Gil, Binyamin; Ben-Dor, Uri; Adar, Rivka; Shapiro, Ehud

    2013-01-01

    Early biomolecular computer research focused on laboratory-scale, human-operated computers for complex computational problems [1-7]. Recently, simple molecular-scale autonomous programmable computers were demonstrated [8-15], allowing both input and output information to be in molecular form. Such computers, using biological molecules as input data and biologically active molecules as outputs, could produce a system for ‘logical’ control of biological processes. Here we describe an autonomous biomolecular computer that, at least in vitro, logically analyses the levels of messenger RNA species, and in response produces a molecule capable of affecting levels of gene expression. The computer operates at a concentration of close to a trillion computers per microlitre and consists of three programmable modules: a computation module, that is, a stochastic molecular automaton [12-17]; an input module, by which specific mRNA levels or point mutations regulate software molecule concentrations, and hence automaton transition probabilities; and an output module, capable of controlled release of a short single-stranded DNA molecule. This approach might be applied in vivo to biochemical sensing, genetic engineering and even medical diagnosis and treatment. As a proof of principle we programmed the computer to identify and analyse mRNA of disease-related genes [18-22] associated with models of small-cell lung cancer and prostate cancer, and to produce a single-stranded DNA molecule modelled after an anticancer drug. PMID:15116117
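The logical core of such a diagnostic automaton can be caricatured in software. The toy below (hypothetical gene names and thresholds; no relation to the actual biochemical implementation) checks a disease signature rule by rule, with an optional per-step transition-error probability standing in for the stochastic automaton transitions.

```python
import random

def diagnose(mrna_levels, signature, p_error=0.0, rng=None):
    """Toy stochastic automaton: start in state 'yes' and check each
    (gene, high_expected) rule in turn; a failed check, or a random
    transition error with probability p_error, moves the automaton to
    the absorbing 'no' state. A final 'yes' would trigger drug release."""
    rng = rng or random.Random(0)
    state = "yes"
    for gene, high_expected in signature:
        observed_high = mrna_levels[gene] > 1.0  # arbitrary expression threshold
        if observed_high != high_expected or rng.random() < p_error:
            state = "no"
            break
    return state

# hypothetical two-gene signature: GENE_A overexpressed, GENE_B underexpressed
signature = [("GENE_A", True), ("GENE_B", False)]
verdict = diagnose({"GENE_A": 2.4, "GENE_B": 0.3}, signature)
```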

  7. Toward an automated parallel computing environment for geosciences

    NASA Astrophysics Data System (ADS)

    Zhang, Huai; Liu, Mian; Shi, Yaolin; Yuen, David A.; Yan, Zhenzhen; Liang, Guoping

    2007-08-01

    Software for geodynamic modeling has not kept up with the fast growing computing hardware and network resources. In the past decade supercomputing power has become available to most researchers in the form of affordable Beowulf clusters and other parallel computer platforms. However, to take full advantage of such computing power requires developing parallel algorithms and associated software, a task that is often too daunting for geoscience modelers whose main expertise is in geosciences. We introduce here an automated parallel computing environment built on open-source algorithms and libraries. Users interact with this computing environment by specifying the partial differential equations, solvers, and model-specific properties using an English-like modeling language in the input files. The system then automatically generates the finite element codes that can be run on distributed or shared memory parallel machines. This system is dynamic and flexible, allowing users to address different problems in geosciences. It is capable of providing web-based services, enabling users to generate source codes online. This unique feature will facilitate high-performance computing to be integrated with distributed data grids in the emerging cyber-infrastructures for geosciences. In this paper we discuss the principles of this automated modeling environment and provide examples to demonstrate its versatility.

  8. Future fundamental combustion research for aeropropulsion systems

    NASA Technical Reports Server (NTRS)

    Mularz, E. J.

    1985-01-01

    The physical fluid mechanics, heat transfer, and chemical kinetic processes that occur in the combustion chamber of aeropropulsion systems were investigated. With component requirements becoming more severe for future engines, the current design methodology needs new tools to obtain the optimum configuration in a reasonable design and development cycle. Research efforts in the last few years were encouraging, but to achieve these benefits research is required into the fundamental aerothermodynamic processes of combustion. It is recommended that research continue in the areas of flame stabilization, combustor aerodynamics, heat transfer, multiphase flow and atomization, turbulent reacting flows, and chemical kinetics. Associated with each of these engineering sciences is the need for research into computational methods to accurately describe and predict these complex physical processes. Research needs in each of these areas are highlighted.

  9. Provision of Computer Printing Capabilities to Library Patrons. SPEC Kit 183.

    ERIC Educational Resources Information Center

    Taylor, Suzanne; Welch, C. Brigid, Ed.

    1992-01-01

    This publication reports on the results of a 1992 survey of 80 Association of Research Libraries (ARL) member libraries on their use of printers for recording references found in online public access catalog (OPAC) and CD-ROM searches. As was found in a 1990 survey on the same topic, only a small proportion of libraries reported that they charged…

  10. New Partnerships: People, Technology, and Organizations. Proceedings of the International ADCIS Conference (35th, Nashville, Tennessee, February 15-19, 1994).

    ERIC Educational Resources Information Center

    Orey, Michael, Ed.

    The theme of the Association for the Development of Computer-Based Instructional Systems (ADCIS) 1994 conference was "New Partnerships: People, Technology, and Organizations." Included in the 38 papers and abstracts compiled in this document are the following topics: hypermedia; the National Research and Education Network and K-12…

  11. Proceedings of Selected Research and Development Papers Presented at the National Convention of the Association for Educational Communications and Technology [AECT] (21st, Houston, Texas, February 10-14, 1999).

    ERIC Educational Resources Information Center

    Sparks, Kristin E., Ed.; Simonson, Michael, Ed.

    1999-01-01

    Subjects addressed by the 65 papers in this proceedings include: challenges for emerging instructional designers; instructional technology clinical experience; color coding and field dependence; effects of visualization on cognitive development; effects of learning structure and summarization during computer-based instruction; individually-guided…

  12. An Analysis of Young Students' Thinking When Completing Basic Coding Tasks Using Scratch Jnr. on the iPad

    ERIC Educational Resources Information Center

    Falloon, G.

    2016-01-01

    Recent government moves in many countries have seen coding included in school curricula, or promoted as part of computing, mathematics or science programmes. While these moves have generally been associated with a need to engage more young people in technology study, research has hinted at possible benefits from learning to program including…

  13. Learning Analytics and Computational Techniques for Detecting and Evaluating Patterns in Learning: An Introduction to the Special Issue

    ERIC Educational Resources Information Center

    Martin, Taylor; Sherin, Bruce

    2013-01-01

    The learning sciences community's interest in learning analytics (LA) has been growing steadily over the past several years. Three recent symposia on the theme (at the American Educational Research Association 2011 and 2012 annual conferences, and the International Conference of the Learning Sciences 2012), organized by Paulo Blikstein, led…

  14. Analyzing Conflict Dynamics with the Aid of an Interactive Microworld Simulator of a Fishing Dispute

    ERIC Educational Resources Information Center

    Kuperman, Ranan D.

    2010-01-01

    This article presents findings from a research project that uses an interactive simulator of an imaginary fishing dispute. Subjects operating the simulator play the role of a state leader, while the computer program controls the behavior of a contending state as well as provides all the environmental data associated with the conflict. The…

  15. What Are Young People Doing on Internet? Use of ICT, Parental Supervision Strategies and Exposure to Risks

    ERIC Educational Resources Information Center

    Giménez, Ana M.; Luengo, José A.; Bartrina, M. José

    2017-01-01

    Introduction: Current research emphasizes young people's access to and use of social networks, chat and WhatsApp. However, this situation is not associated with active parental mediation to protect them from the risks involved. This study analyzes Murcian students' perception of cell phone and computer use, parental mediation strategies and their…

  16. Community Science Workshops: A Powerful and Feasible Model for Serving Underserved Youth. An Evaluation Brief

    ERIC Educational Resources Information Center

    Inverness Research Associates, 2007

    2007-01-01

    The people at Inverness Research Associates spent 12 years studying Community Science Workshops (CSW) in California and in six other states. They gathered statistics on the scale, scope, and cost-efficiency of CSW services to youth. They observed youth at work in the shops--taking apart computers, repairing bikes, growing plants, and so on--and…

  17. Exploring the quantum speed limit with computer games

    NASA Astrophysics Data System (ADS)

    Sørensen, Jens Jakob W. H.; Pedersen, Mads Kock; Munch, Michael; Haikka, Pinja; Jensen, Jesper Halkjær; Planke, Tilo; Andreasen, Morten Ginnerup; Gajdacz, Miroslav; Mølmer, Klaus; Lieberoth, Andreas; Sherson, Jacob F.

    2016-04-01

    Humans routinely solve problems of immense computational complexity by intuitively forming simple, low-dimensional heuristic strategies. Citizen science (or crowd sourcing) is a way of exploiting this ability by presenting scientific research problems to non-experts. ‘Gamification’—the application of game elements in a non-game context—is an effective tool with which to enable citizen scientists to provide solutions to research problems. The citizen science games Foldit, EteRNA and EyeWire have been used successfully to study protein and RNA folding and neuron mapping, but so far gamification has not been applied to problems in quantum physics. Here we report on Quantum Moves, an online platform gamifying optimization problems in quantum physics. We show that human players are able to find solutions to difficult problems associated with the task of quantum computing. Players succeed where purely numerical optimization fails, and analyses of their solutions provide insights into the problem of optimization of a more profound and general nature. Using player strategies, we have thus developed a few-parameter heuristic optimization method that efficiently outperforms the most prominent established numerical methods. The numerical complexity associated with time-optimal solutions increases for shorter process durations. To understand this better, we produced a low-dimensional rendering of the optimization landscape. This rendering reveals why traditional optimization methods fail near the quantum speed limit (that is, the shortest process duration with perfect fidelity). Combined analyses of optimization landscapes and heuristic solution strategies may benefit wider classes of optimization problems in quantum physics and beyond.
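    The speed-limit notion in this abstract (the shortest process duration with perfect fidelity) can be illustrated on the simplest possible system. The sketch below is not the paper's code: it evolves a two-level state transfer under a constant drive (with hbar = 1 and an arbitrary illustrative Rabi frequency `omega`) and checks that the fidelity first reaches 1 exactly at the Mandelstam-Tamm time pi/(2*dE).

    ```python
    import numpy as np
    from scipy.linalg import expm

    # Two-level state transfer |0> -> |1> under H = (omega/2) * sigma_x, hbar = 1.
    omega = 2.0 * np.pi                       # illustrative Rabi frequency
    sx = np.array([[0, 1], [1, 0]], dtype=complex)
    H = 0.5 * omega * sx

    psi0 = np.array([1, 0], dtype=complex)    # start in |0>
    target = np.array([0, 1], dtype=complex)  # aim for |1>

    # Mandelstam-Tamm bound: T_QSL = pi / (2 * dE), with dE the energy
    # uncertainty of the initial state under H (here dE = omega/2).
    dE = omega / 2.0
    T_qsl = np.pi / (2.0 * dE)

    def fidelity(T):
        """Transfer fidelity |<1| exp(-i H T) |0>|^2 after evolving for time T."""
        psiT = expm(-1j * H * T) @ psi0
        return abs(np.vdot(target, psiT)) ** 2

    print(fidelity(T_qsl))        # ~1.0: perfect transfer exactly at the limit
    print(fidelity(0.5 * T_qsl))  # ~0.5: shorter durations cannot reach fidelity 1
    ```

    For driven multi-level systems like the paper's atom-transport task no such closed form exists, which is why the optimization landscape near the limit becomes hard for numerical methods.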

  18. Exploring the quantum speed limit with computer games.

    PubMed

    Sørensen, Jens Jakob W H; Pedersen, Mads Kock; Munch, Michael; Haikka, Pinja; Jensen, Jesper Halkjær; Planke, Tilo; Andreasen, Morten Ginnerup; Gajdacz, Miroslav; Mølmer, Klaus; Lieberoth, Andreas; Sherson, Jacob F

    2016-04-14

    Humans routinely solve problems of immense computational complexity by intuitively forming simple, low-dimensional heuristic strategies. Citizen science (or crowd sourcing) is a way of exploiting this ability by presenting scientific research problems to non-experts. 'Gamification'--the application of game elements in a non-game context--is an effective tool with which to enable citizen scientists to provide solutions to research problems. The citizen science games Foldit, EteRNA and EyeWire have been used successfully to study protein and RNA folding and neuron mapping, but so far gamification has not been applied to problems in quantum physics. Here we report on Quantum Moves, an online platform gamifying optimization problems in quantum physics. We show that human players are able to find solutions to difficult problems associated with the task of quantum computing. Players succeed where purely numerical optimization fails, and analyses of their solutions provide insights into the problem of optimization of a more profound and general nature. Using player strategies, we have thus developed a few-parameter heuristic optimization method that efficiently outperforms the most prominent established numerical methods. The numerical complexity associated with time-optimal solutions increases for shorter process durations. To understand this better, we produced a low-dimensional rendering of the optimization landscape. This rendering reveals why traditional optimization methods fail near the quantum speed limit (that is, the shortest process duration with perfect fidelity). Combined analyses of optimization landscapes and heuristic solution strategies may benefit wider classes of optimization problems in quantum physics and beyond.

  19. Proceedings of the Workshop on software tools for distributed intelligent control systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Herget, C.J.

    1990-09-01

The Workshop on Software Tools for Distributed Intelligent Control Systems was organized by Lawrence Livermore National Laboratory for the United States Army Headquarters Training and Doctrine Command and the Defense Advanced Research Projects Agency. The goals of the workshop were to identify the current state of the art in tools which support control systems engineering design and implementation; identify research issues associated with writing software tools which would provide a design environment to assist engineers in multidisciplinary control design and implementation; formulate a potential investment strategy to resolve the research issues and develop public domain code which can form the core of more powerful engineering design tools; and recommend test cases to focus the software development process and test associated performance metrics. Recognizing that the development of software tools for distributed intelligent control systems will require a multidisciplinary effort, experts in systems engineering, control systems engineering, and computer science were invited to participate in the workshop. In particular, experts who could address the following topics were selected: operating systems, engineering data representation and manipulation, emerging standards for manufacturing data, mathematical foundations, coupling of symbolic and numerical computation, user interface, system identification, system representation at different levels of abstraction, system specification, system design, verification and validation, automatic code generation, and integration of modular, reusable code.

  20. MKRMDA: multiple kernel learning-based Kronecker regularized least squares for MiRNA-disease association prediction.

    PubMed

    Chen, Xing; Niu, Ya-Wei; Wang, Guang-Hui; Yan, Gui-Ying

    2017-12-12

Recently, as the research of microRNA (miRNA) continues, there is plenty of experimental evidence indicating that miRNA could be associated with the development and progression of various human complex diseases. Hence, it is necessary and urgent to pay more attention to the relevant study of predicting disease-associated miRNAs, which may be helpful for effective prevention, diagnosis and treatment of human diseases. Especially, constructing computational methods to predict potential miRNA-disease associations is worthy of more studies because of their feasibility and effectiveness. In this work, we developed a novel computational model of multiple kernel learning-based Kronecker regularized least squares for MiRNA-disease association prediction (MKRMDA), which could reveal potential miRNA-disease associations by automatically optimizing the combination of multiple kernels for disease and miRNA. MKRMDA obtained AUCs of 0.9040 and 0.8446 in global and local leave-one-out cross validation, respectively. Meanwhile, MKRMDA achieved average AUCs of 0.8894 ± 0.0015 in fivefold cross validation. Furthermore, we conducted three different kinds of case studies on some important human cancers for further performance evaluation. In the case studies of colonic cancer, esophageal cancer and lymphoma based on known miRNA-disease associations in the HMDDv2.0 database, 76, 94 and 88% of the corresponding top 50 predicted miRNAs were confirmed by experimental reports, respectively. In another two kinds of case studies, for new diseases without any known associated miRNAs and for diseases only with known associations in the HMDDv1.0 database, the verified ratios of two different cancers were 88 and 94%, respectively. All the results mentioned above adequately showed the reliable prediction ability of MKRMDA. We anticipate that MKRMDA could serve to facilitate further developments in the field and follow-up investigations by biomedical researchers.
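    The two ingredients named in this abstract, a weighted combination of multiple kernels and Kronecker regularized least squares, can be sketched in a few lines. This is not the authors' MKRMDA implementation: the toy kernels and association matrix below are invented, and the solver is the standard eigendecomposition form of Kron-RLS under those assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def combine_kernels(kernels, weights):
        """Weighted sum of base kernel matrices (weights normalized to sum to 1)."""
        w = np.asarray(weights, dtype=float) / np.sum(weights)
        return sum(wi * K for wi, K in zip(w, kernels))

    def kron_rls(K_mirna, K_disease, Y, lam=1.0):
        """Solve vec(F) = (K_d kron K_m)(K_d kron K_m + lam*I)^-1 vec(Y)
        via per-side eigendecompositions, never forming the Kronecker product."""
        lm, Um = np.linalg.eigh(K_mirna)
        ld, Ud = np.linalg.eigh(K_disease)
        L = np.outer(lm, ld)          # pairwise eigenvalues of the product kernel
        filt = L / (L + lam)          # shrinkage applied per eigen-pair
        return Um @ (filt * (Um.T @ Y @ Ud)) @ Ud.T

    # Toy problem: 6 miRNAs x 4 diseases, two hypothetical kernels per side.
    n_m, n_d = 6, 4
    def random_kernel(n):
        A = rng.standard_normal((n, n))
        return A @ A.T / n            # symmetric positive semidefinite

    Km = combine_kernels([random_kernel(n_m), random_kernel(n_m)], [0.7, 0.3])
    Kd = combine_kernels([random_kernel(n_d), random_kernel(n_d)], [0.5, 0.5])
    Y = np.zeros((n_m, n_d))
    Y[0, 1] = Y[2, 0] = Y[3, 3] = Y[5, 2] = 1.0   # toy known associations

    F = kron_rls(Km, Kd, Y, lam=0.5)
    print(F.shape)  # (6, 4) matrix of association scores
    ```

    Larger `lam` shrinks the scores toward zero; the full method additionally learns the kernel weights, which this sketch fixes by hand.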

  1. miRwayDB: a database for experimentally validated microRNA-pathway associations in pathophysiological conditions

    PubMed Central

    Das, Sankha Subhra; Saha, Pritam

    2018-01-01

    Abstract MicroRNAs (miRNAs) are well-known as key regulators of diverse biological pathways. A series of experimental evidences have shown that abnormal miRNA expression profiles are responsible for various pathophysiological conditions by modulating genes in disease associated pathways. In spite of the rapid increase in research data confirming such associations, scientists still do not have access to a consolidated database offering these miRNA-pathway association details for critical diseases. We have developed miRwayDB, a database providing comprehensive information of experimentally validated miRNA-pathway associations in various pathophysiological conditions utilizing data collected from published literature. To the best of our knowledge, it is the first database that provides information about experimentally validated miRNA mediated pathway dysregulation as seen specifically in critical human diseases and hence indicative of a cause-and-effect relationship in most cases. The current version of miRwayDB collects an exhaustive list of miRNA-pathway association entries for 76 critical disease conditions by reviewing 663 published articles. Each database entry contains complete information on the name of the pathophysiological condition, associated miRNA(s), experimental sample type(s), regulation pattern (up/down) of miRNA, pathway association(s), targeted member of dysregulated pathway(s) and a brief description. In addition, miRwayDB provides miRNA, gene and pathway score to evaluate the role of a miRNA regulated pathways in various pathophysiological conditions. The database can also be used for other biomedical approaches such as validation of computational analysis, integrated analysis and prediction of computational model. It also offers a submission page to submit novel data from recently published studies. We believe that miRwayDB will be a useful tool for miRNA research community. Database URL: http://www.mirway.iitkgp.ac.in PMID:29688364

  2. Session on High Speed Civil Transport Design Capability Using MDO and High Performance Computing

    NASA Technical Reports Server (NTRS)

    Rehder, Joe

    2000-01-01

    Since the inception of CAS in 1992, NASA Langley has been conducting research into applying multidisciplinary optimization (MDO) and high performance computing toward reducing aircraft design cycle time. The focus of this research has been the development of a series of computational frameworks and associated applications that increased in capability, complexity, and performance over time. The culmination of this effort is an automated high-fidelity analysis capability for a high speed civil transport (HSCT) vehicle installed on a network of heterogeneous computers with a computational framework built using Common Object Request Broker Architecture (CORBA) and Java. The main focus of the research in the early years was the development of the Framework for Interdisciplinary Design Optimization (FIDO) and associated HSCT applications. While the FIDO effort was eventually halted, work continued on HSCT applications of ever increasing complexity. The current application, HSCT4.0, employs high fidelity CFD and FEM analysis codes. For each analysis cycle, the vehicle geometry and computational grids are updated using new values for design variables. Processes for aeroelastic trim, loads convergence, displacement transfer, stress and buckling, and performance have been developed. In all, a total of 70 processes are integrated in the analysis framework. Many of the key processes include automatic differentiation capabilities to provide sensitivity information that can be used in optimization. A software engineering process was developed to manage this large project. Defining the interactions among 70 processes turned out to be an enormous, but essential, task. A formal requirements document was prepared that defined data flow among processes and subprocesses. A design document was then developed that translated the requirements into actual software design. 
A validation program was defined and implemented to ensure that codes integrated into the framework produced the same results as their standalone counterparts. Finally, a Commercial Off the Shelf (COTS) configuration management system was used to organize the software development. A computational environment, CJOpt, based on the Common Object Request Broker Architecture (CORBA) and the Java programming language, has been developed as a framework for multidisciplinary analysis and optimization. The environment exploits the parallelisms inherent in the application and distributes the constituent disciplines on machines best suited to their needs. In CJOpt, a discipline code is "wrapped" as an object. An interface to the object identifies the functionality (services) provided by the discipline, defined in Interface Definition Language (IDL) and implemented using Java. The results of using the HSCT4.0 capability are described. A summary of lessons learned is also presented. The use of some of the processes, codes, and techniques by industry is highlighted. The application of the methodology developed in this research to other aircraft is described. Finally, we show how the experience gained is being applied to entirely new vehicles, such as the Reusable Space Transportation System. Additional information is contained in the original.

  3. Computer Vision Syndrome and Associated Factors Among Medical and Engineering Students in Chennai

    PubMed Central

    Logaraj, M; Madhupriya, V; Hegde, SK

    2014-01-01

Background: Almost all institutions, colleges, universities and homes today use computers regularly. Very little research has been carried out on Indian users, especially college students, regarding the effects of computer use on eye and vision related problems. Aim: The aim of this study was to assess the prevalence of computer vision syndrome (CVS) among medical and engineering students and the factors associated with the same. Subjects and Methods: A cross-sectional study was conducted among medical and engineering college students of a University situated in the suburban area of Chennai. Students who used a computer in the month preceding the date of study were included in the study. The participants were surveyed using a pre-tested structured questionnaire. Results: Among engineering students, the prevalence of CVS was found to be 81.9% (176/215), while among medical students it was found to be 78.6% (158/201). A significantly higher proportion of engineering students, 40.9% (88/215), used computers for 4-6 h/day as compared to medical students, 10% (20/201) (P < 0.001). The reported symptoms of CVS were higher among engineering students compared with medical students. Students who used a computer for 4-6 h were at significantly higher risk of developing redness (OR = 1.2, 95% CI = 1.0-3.1, P = 0.04), burning sensation (OR = 2.1, 95% CI = 1.3-3.1, P < 0.01) and dry eyes (OR = 1.8, 95% CI = 1.1-2.9, P = 0.02) compared to those who used a computer for less than 4 h. A significant correlation was found between increased hours of computer use and the symptoms of redness, burning sensation, blurred vision and dry eyes. Conclusion: The present study revealed that more than three-fourths of the students complained of at least one of the symptoms of CVS while working on the computer. PMID:24761234
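    The odds ratios and 95% confidence intervals quoted in this abstract come from 2x2 tables of symptom presence versus exposure (hours of computer use). A minimal sketch of that computation, using Woolf's log-odds-ratio standard error; the counts below are hypothetical, not the study's data.

    ```python
    import math

    def odds_ratio_ci(a, b, c, d, z=1.96):
        """OR and 95% CI for a 2x2 table:
             a = exposed & symptomatic,    b = exposed & symptom-free,
             c = unexposed & symptomatic,  d = unexposed & symptom-free."""
        or_ = (a * d) / (b * c)
        se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # SE of log(OR), Woolf's method
        lo = math.exp(math.log(or_) - z * se)
        hi = math.exp(math.log(or_) + z * se)
        return or_, (lo, hi)

    # Hypothetical counts: 40/60 heavy users vs. 90/220 light users report dry eyes.
    or_, (lo, hi) = odds_ratio_ci(40, 60, 90, 220)
    print(round(or_, 2), round(lo, 2), round(hi, 2))
    ```

    An interval whose lower bound stays above 1.0, as for burning sensation and dry eyes in the study, is what marks the association as statistically significant at the 5% level.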

  4. Computer vision syndrome and associated factors among medical and engineering students in chennai.

    PubMed

    Logaraj, M; Madhupriya, V; Hegde, Sk

    2014-03-01

Almost all institutions, colleges, universities and homes today use computers regularly. Very little research has been carried out on Indian users, especially college students, regarding the effects of computer use on eye and vision related problems. The aim of this study was to assess the prevalence of computer vision syndrome (CVS) among medical and engineering students and the factors associated with the same. A cross-sectional study was conducted among medical and engineering college students of a University situated in the suburban area of Chennai. Students who used a computer in the month preceding the date of study were included in the study. The participants were surveyed using a pre-tested structured questionnaire. Among engineering students, the prevalence of CVS was found to be 81.9% (176/215), while among medical students it was found to be 78.6% (158/201). A significantly higher proportion of engineering students, 40.9% (88/215), used computers for 4-6 h/day as compared to medical students, 10% (20/201) (P < 0.001). The reported symptoms of CVS were higher among engineering students compared with medical students. Students who used a computer for 4-6 h were at significantly higher risk of developing redness (OR = 1.2, 95% CI = 1.0-3.1, P = 0.04), burning sensation (OR = 2.1, 95% CI = 1.3-3.1, P < 0.01) and dry eyes (OR = 1.8, 95% CI = 1.1-2.9, P = 0.02) compared to those who used a computer for less than 4 h. A significant correlation was found between increased hours of computer use and the symptoms of redness, burning sensation, blurred vision and dry eyes. The present study revealed that more than three-fourths of the students complained of at least one of the symptoms of CVS while working on the computer.

  5. The prevalence of computer-related musculoskeletal complaints in female college students.

    PubMed

    Hamilton, Audra G; Jacobs, Karen; Orsmond, Gael

    2005-01-01

The purpose of this study was to determine the prevalence of computer-related musculoskeletal complaints in female college students. This research also explored whether the number of hours per day spent using a computer, type of computer used (laptop vs. desktop), or academic major was related to the presence of musculoskeletal complaints. Additionally, "job strain", a measure of job stress which can affect the physical health of an individual, was measured to determine whether students feel stress from the job of "student" and if so, whether it contributed to these complaints. Two surveys, The Boston University Computer and Health Survey and the Job Content Questionnaire [9], were distributed to 111 female college students to measure musculoskeletal complaints and job strain. Seventy-two surveys were returned. Chi-square and logistic regression were used to analyze the data. The results indicated that 80.6% of the participants reported computer-related musculoskeletal complaints in the two weeks prior to completing the survey, although none of the examined factors were associated with the complaints. It is notable, however, that 82% of the students reported spending 0-6 hours/day using a computer, with almost 28% reporting 4-6 hours/day of usage. Eleven percent of the participants reported using the computer more than 8 hours/day. Of those students who use a laptop computer for all computer use, 90.1% reported musculoskeletal complaints. The students reported that they did not experience job strain. Further studies should be performed using a survey specifically intended for college students. The majority of female college students in this study reported musculoskeletal discomfort during or after computer use. Although a statistical correlation could not be made, students using laptop computers reported a higher incidence of musculoskeletal symptoms than those using desktop computers. 
Additionally, female college students did not seem to experience job strain. Future research should continue on larger, more diverse samples of students to better understand the prevalence and contributors of musculoskeletal complaints, how college students experience job strain (stress), and whether these two factors are related.

  6. A cloud-based data network approach for translational cancer research.

    PubMed

    Xing, Wei; Tsoumakos, Dimitrios; Ghanem, Moustafa

    2015-01-01

    We develop a new model and associated technology for constructing and managing self-organizing data to support translational cancer research studies. We employ a semantic content network approach to address the challenges of managing cancer research data. Such data is heterogeneous, large, decentralized, growing and continually being updated. Moreover, the data originates from different information sources that may be partially overlapping, creating redundancies as well as contradictions and inconsistencies. Building on the advantages of elasticity of cloud computing, we deploy the cancer data networks on top of the CELAR Cloud platform to enable more effective processing and analysis of Big cancer data.

  7. Introducing the Geneva Multimodal expression corpus for experimental research on emotion perception.

    PubMed

    Bänziger, Tanja; Mortillaro, Marcello; Scherer, Klaus R

    2012-10-01

    Research on the perception of emotional expressions in faces and voices is exploding in psychology, the neurosciences, and affective computing. This article provides an overview of some of the major emotion expression (EE) corpora currently available for empirical research and introduces a new, dynamic, multimodal corpus of emotion expressions, the Geneva Multimodal Emotion Portrayals Core Set (GEMEP-CS). The design features of the corpus are outlined and justified, and detailed validation data for the core set selection are presented and discussed. Finally, an associated database with microcoded facial, vocal, and body action elements, as well as observer ratings, is introduced.

  8. Synthesis of aircraft structures using integrated design and analysis methods

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, J.; Goetz, R. C.

    1978-01-01

    A systematic research is reported to develop and validate methods for structural sizing of an airframe designed with the use of composite materials and active controls. This research program includes procedures for computing aeroelastic loads, static and dynamic aeroelasticity, analysis and synthesis of active controls, and optimization techniques. Development of the methods is concerned with the most effective ways of integrating and sequencing the procedures in order to generate structural sizing and the associated active control system, which is optimal with respect to a given merit function constrained by strength and aeroelasticity requirements.

  9. Procrastinating Behavior in Computer-Based Learning Environments to Predict Performance: A Case Study in Moodle

    PubMed Central

    Cerezo, Rebeca; Esteban, María; Sánchez-Santillán, Miguel; Núñez, José C.

    2017-01-01

    Introduction: Research about student performance has traditionally considered academic procrastination as a behavior that has negative effects on academic achievement. Although there is much evidence for this in class-based environments, there is a lack of research on Computer-Based Learning Environments (CBLEs). Therefore, the purpose of this study is to evaluate student behavior in a blended learning program and specifically procrastination behavior in relation to performance through Data Mining techniques. Materials and Methods: A sample of 140 undergraduate students participated in a blended learning experience implemented in a Moodle (Modular Object Oriented Developmental Learning Environment) Management System. Relevant interaction variables were selected for the study, taking into account student achievement and analyzing data by means of association rules, a mining technique. The association rules were arrived at and filtered through two selection criteria: 1, rules must have an accuracy over 0.8 and 2, they must be present in both sub-samples. Results: The findings of our study highlight the influence of time management in online learning environments, particularly on academic achievement, as there is an association between procrastination variables and student performance. Conclusion: Negative impact of procrastination in learning outcomes has been observed again but in virtual learning environments where practical implications, prevention of, and intervention in, are different from class-based learning. These aspects are discussed to help resolve student difficulties at various ages. PMID:28883801
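    The rule-filtering step this abstract describes (keep rules with accuracy, i.e. confidence, over 0.8 that appear in both sub-samples) can be sketched directly. The behavior labels below are invented placeholders, not the study's actual Moodle interaction variables.

    ```python
    # Each "transaction" is the set of binarized behaviors observed for one student.
    def confidence(transactions, antecedent, consequent):
        """Confidence of rule antecedent -> consequent: P(consequent | antecedent)."""
        has_a = [t for t in transactions if antecedent <= t]   # subset test
        if not has_a:
            return 0.0
        return sum(1 for t in has_a if consequent <= t) / len(has_a)

    def mine_rules(sub1, sub2, candidates, min_conf=0.8):
        """Keep candidate rules whose confidence exceeds min_conf in BOTH sub-samples."""
        kept = []
        for ante, cons in candidates:
            if confidence(sub1, ante, cons) > min_conf and \
               confidence(sub2, ante, cons) > min_conf:
                kept.append((ante, cons))
        return kept

    # Toy sub-samples with hypothetical behavior labels.
    sub1 = [{"late_submit", "low_grade"}, {"late_submit", "low_grade"},
            {"early_submit", "high_grade"}, {"late_submit", "low_grade"}]
    sub2 = [{"late_submit", "low_grade"}, {"late_submit", "low_grade"},
            {"early_submit", "low_grade"}]
    candidates = [({"late_submit"}, {"low_grade"}),
                  ({"early_submit"}, {"high_grade"})]

    # Only the first rule survives: the second fails in sub2.
    print(mine_rules(sub1, sub2, candidates))
    ```

    Requiring the rule in both sub-samples acts as a simple stability check, guarding against rules that a single split of the data would have produced by chance.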

  10. Procrastinating Behavior in Computer-Based Learning Environments to Predict Performance: A Case Study in Moodle.

    PubMed

    Cerezo, Rebeca; Esteban, María; Sánchez-Santillán, Miguel; Núñez, José C

    2017-01-01

Introduction: Research about student performance has traditionally considered academic procrastination as a behavior that has negative effects on academic achievement. Although there is much evidence for this in class-based environments, there is a lack of research on Computer-Based Learning Environments (CBLEs). Therefore, the purpose of this study is to evaluate student behavior in a blended learning program and specifically procrastination behavior in relation to performance through Data Mining techniques. Materials and Methods: A sample of 140 undergraduate students participated in a blended learning experience implemented in a Moodle (Modular Object Oriented Developmental Learning Environment) Management System. Relevant interaction variables were selected for the study, taking into account student achievement and analyzing data by means of association rules, a mining technique. The association rules were arrived at and filtered through two selection criteria: 1, rules must have an accuracy over 0.8 and 2, they must be present in both sub-samples. Results: The findings of our study highlight the influence of time management in online learning environments, particularly on academic achievement, as there is an association between procrastination variables and student performance. Conclusion: Negative impact of procrastination in learning outcomes has been observed again but in virtual learning environments where practical implications, prevention of, and intervention in, are different from class-based learning. These aspects are discussed to help resolve student difficulties at various ages.

  11. Issues in human/computer control of dexterous remote hands

    NASA Technical Reports Server (NTRS)

    Salisbury, K.

    1987-01-01

    Much research on dexterous robot hands has been aimed at the design and control problems associated with their autonomous operation, while relatively little research has addressed the problem of direct human control. It is likely that these two modes can be combined in a complementary manner yielding more capability than either alone could provide. While many of the issues in mixed computer/human control of dexterous hands parallel those found in supervisory control of traditional remote manipulators, the unique geometry and capabilities of dexterous hands pose many new problems. Among these are the control of redundant degrees of freedom, grasp stabilization and specification of non-anthropomorphic behavior. An overview is given of progress made at the MIT AI Laboratory in control of the Salisbury 3 finger hand, including experiments in grasp planning and manipulation via controlled slip. It is also suggested how we might introduce human control into the process at a variety of functional levels.

  12. Spray and High-Pressure Flow Computations in the National Combustion Code (NCC) Improved

    NASA Technical Reports Server (NTRS)

    Raju, Manthena S.

    2002-01-01

    Sprays occur in a wide variety of industrial and power applications and in materials processing. A liquid spray is a two-phase flow with a gas as the continuous phase and a liquid as the dispersed phase in the form of droplets or ligaments. The interactions between the two phases--which are coupled through exchanges of mass, momentum, and energy--can occur in different ways at disparate time and length scales involving various thermal, mass, and fluid dynamic factors. An understanding of the flow, combustion, and thermal properties of a rapidly vaporizing spray requires careful modeling of the ratecontrolling processes associated with turbulent transport, mixing, chemical kinetics, evaporation, and spreading rates of the spray, among many other factors. With the aim of developing an efficient solution procedure for use in multidimensional combustor modeling, researchers at the NASA Glenn Research Center have advanced the state-of-the-art in spray computations in several important ways.

  13. Computational structural mechanics methods research using an evolving framework

    NASA Technical Reports Server (NTRS)

    Knight, N. F., Jr.; Lotts, C. G.; Gillian, R. E.

    1990-01-01

    Advanced structural analysis and computational methods that exploit high-performance computers are being developed in a computational structural mechanics research activity sponsored by the NASA Langley Research Center. These new methods are developed in an evolving framework and applied to representative complex structural analysis problems from the aerospace industry. An overview of the methods development environment is presented, and methods research areas are described. Selected application studies are also summarized.

  14. Research in applied mathematics, numerical analysis, and computer science

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering (ICASE) in applied mathematics, numerical analysis, and computer science is summarized and abstracts of published reports are presented. The major categories of the ICASE research program are: (1) numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; (2) control and parameter identification; (3) computational problems in engineering and the physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and (4) computer systems and software, especially vector and parallel computers.

  15. Data Crosscutting Requirements Review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kleese van Dam, Kerstin; Shoshani, Arie; Plata, Charity

    2013-04-01

In April 2013, a diverse group of researchers from the U.S. Department of Energy (DOE) scientific community assembled to assess data requirements associated with DOE-sponsored scientific facilities and large-scale experiments. Participants in the review included facilities staff, program managers, and scientific experts from the offices of Basic Energy Sciences, Biological and Environmental Research, High Energy Physics, and Advanced Scientific Computing Research. As part of the meeting, review participants discussed key issues associated with three distinct aspects of the data challenge: 1) processing, 2) management, and 3) analysis. These discussions identified commonalities and differences among the needs of varied scientific communities. They also helped to articulate gaps between current approaches and future needs, as well as the research advances that will be required to close these gaps. Moreover, the review provided a rare opportunity for experts from across the Office of Science to learn about their collective expertise, challenges, and opportunities. The "Data Crosscutting Requirements Review" generated specific findings and recommendations for addressing large-scale data crosscutting requirements.

  16. 48 CFR 252.227-7018 - Rights in noncommercial technical data and computer software-Small Business Innovation Research...

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... technical data and computer software-Small Business Innovation Research (SBIR) Program. 252.227-7018 Section... Clauses 252.227-7018 Rights in noncommercial technical data and computer software—Small Business... Noncommercial Technical Data and Computer Software—Small Business Innovation Research (SBIR) Program (MAR 2011...

  17. Models of Educational Computing @ Home: New Frontiers for Research on Technology in Learning.

    ERIC Educational Resources Information Center

    Kafai, Yasmin B.; Fishman, Barry J.; Bruckman, Amy S.; Rockman, Saul

    2002-01-01

    Discusses models of home educational computing that are linked to learning in school and recommends the need for research that addresses the home as a computer-based learning environment. Topics include a history of research on educational computing at home; technological infrastructure, including software and compatibility; Internet access;…

  18. Division of Computer Research Summary of Awards. Fiscal Year 1984.

    ERIC Educational Resources Information Center

    National Science Foundation, Washington, DC. Directorate for Mathematical and Physical Sciences.

    Provided in this report are summaries of grants awarded by the National Science Foundation Division of Computer Research in fiscal year 1984. Similar areas of research are grouped (for the purposes of this report only) into these major categories: (1) computational mathematics; (2) computer systems design; (3) intelligent systems; (4) software…

  19. A parallel-processing approach to computing for the geographic sciences; applications and systems enhancements

    USGS Publications Warehouse

    Crane, Michael; Steinwand, Dan; Beckmann, Tim; Krpan, Greg; Liu, Shu-Guang; Nichols, Erin; Haga, Jim; Maddox, Brian; Bilderback, Chris; Feller, Mark; Homer, George

    2001-01-01

The overarching goal of this project is to build a spatially distributed infrastructure for information science research by forming a team of information science researchers and providing them with similar hardware and software tools to perform collaborative research. Four geographically distributed Centers of the U.S. Geological Survey (USGS) are developing their own clusters of low-cost personal computers into parallel computing environments that provide a cost-effective way for the USGS to increase participation in the high-performance computing community. Referred to as Beowulf clusters, these hybrid systems provide the robust computing power required for conducting information science research into parallel computing systems and applications.

  20. Do mothers affect daughter's behaviors? Diet, physical activity, and sedentary behaviors in Kuwaiti mother-daughter dyads.

    PubMed

    Shaban, Lemia H; Vaccaro, Joan A; Sukhram, Shiryn D; Huffman, Fatma G

    2018-01-01

The objective of the study was to evaluate 169 Kuwaiti mother-daughter dyads for associations in health behaviors: healthy eating, physical activity, daughters' perceived body weight, time spent with computer/video games, and time viewing television. Female students aged 10-14 years were selected from private and public schools in the State of Kuwait. Results demonstrated that daughters exhibited behaviors similar to their mothers' in perceived eating behavior, physical activity, computer/video game use, and TV screen time. Future research is essential to determine the role of mothers in effective health behavior intervention strategies for female Kuwaiti adolescents.

  1. Issues and recommendations associated with distributed computation and data management systems for the space sciences

    NASA Technical Reports Server (NTRS)

    1986-01-01

The primary purpose of the report is to explore management approaches and technology developments for computation and data management systems designed to meet future needs in the space sciences. The report builds on work presented in previous reports on the solar-terrestrial and planetary sciences, broadening the outlook to all of the space sciences and considering policy aspects related to coordination between data centers, missions, and ongoing research activities, because the rapid growth of data and the wide geographic distribution of relevant facilities are expected to present especially troublesome problems for data archiving, distribution, and analysis.

  2. Numerical experience with a class of algorithms for nonlinear optimization using inexact function and gradient information

    NASA Technical Reports Server (NTRS)

    Carter, Richard G.

    1989-01-01

For optimization problems associated with engineering design, parameter estimation, image reconstruction, and other optimization/simulation applications, low accuracy function and gradient values are frequently much less expensive to obtain than high accuracy values. Here, researchers investigate the computational performance of trust region methods for nonlinear optimization when high accuracy evaluations are unavailable or prohibitively expensive, and confirm earlier theoretical predictions that the algorithm remains convergent even with relative gradient errors of 0.5 or more. The proper choice of the amount of accuracy to use in function and gradient evaluations can result in orders-of-magnitude savings in computational cost.

  3. Sensory perception in autism.

    PubMed

    Robertson, Caroline E; Baron-Cohen, Simon

    2017-11-01

    Autism is a complex neurodevelopmental condition, and little is known about its neurobiology. Much of autism research has focused on the social, communication and cognitive difficulties associated with the condition. However, the recent revision of the diagnostic criteria for autism has brought another key domain of autistic experience into focus: sensory processing. Here, we review the properties of sensory processing in autism and discuss recent computational and neurobiological insights arising from attention to these behaviours. We argue that sensory traits have important implications for the development of animal and computational models of the condition. Finally, we consider how difficulties in sensory processing may relate to the other domains of behaviour that characterize autism.

  4. HNET - A National Computerized Health Network

    PubMed Central

    Casey, Mark; Hamilton, Richard

    1988-01-01

The HNET system demonstrated conceptually and technically a national text (and limited bit mapped graphics) computer network for use between innovative members of the health care industry. The HNET configuration of a leased high speed national packet switching network connecting any number of mainframe, mini, and micro computers was unique in its relatively low capital costs and freedom from obsolescence. With multiple simultaneous conferences, databases, bulletin boards, calendars, and advanced electronic mail and surveys, it is marketable to innovative hospitals, clinics, physicians, health care associations and societies, nurses, multisite research projects, libraries, etc. Electronic publishing and education capabilities along with integrated voice and video transmission are identified as future enhancements.

  5. Analysis of random point images with the use of symbolic computation codes and generalized Catalan numbers

    NASA Astrophysics Data System (ADS)

    Reznik, A. L.; Tuzikov, A. V.; Solov'ev, A. A.; Torgov, A. V.

    2016-11-01

    Original codes and combinatorial-geometrical computational schemes are presented, which are developed and applied for finding exact analytical formulas that describe the probability of errorless readout of random point images recorded by a scanning aperture with a limited number of threshold levels. Combinatorial problems encountered in the course of the study and associated with the new generalization of Catalan numbers are formulated and solved. An attempt is made to find the explicit analytical form of these numbers, which is, on the one hand, a necessary stage of solving the basic research problem and, on the other hand, an independent self-consistent problem.
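The abstract does not define the paper's generalization of the Catalan numbers, but the classical case it builds on has the closed form C_n = C(2n, n)/(n + 1) and an equivalent convolution recurrence. A minimal sketch of the classical numbers only (not the authors' generalization):

```python
from math import comb

def catalan(n: int) -> int:
    # Closed form: C_n = C(2n, n) / (n + 1)
    return comb(2 * n, n) // (n + 1)

def catalan_recurrence(n: int) -> int:
    # Convolution recurrence: C_0 = 1, C_{m+1} = sum_{i=0}^{m} C_i * C_{m-i}
    c = [1]
    for m in range(n):
        c.append(sum(c[i] * c[m - i] for i in range(m + 1)))
    return c[n]

print([catalan(n) for n in range(6)])  # [1, 1, 2, 5, 14, 42]
```

Both routes agree; the recurrence form is the one that typically appears in combinatorial-geometrical counting arguments like those described above.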

  6. High Performance Computing Facility Operational Assessment 2015: Oak Ridge Leadership Computing Facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barker, Ashley D.; Bernholdt, David E.; Bland, Arthur S.

Oak Ridge National Laboratory’s (ORNL’s) Leadership Computing Facility (OLCF) continues to surpass its operational target goals: supporting users; delivering fast, reliable systems; creating innovative solutions for high-performance computing (HPC) needs; and managing risks, safety, and security aspects associated with operating one of the most powerful computers in the world. The results can be seen in the cutting-edge science delivered by users and the praise from the research community. Calendar year (CY) 2015 was filled with outstanding operational results and accomplishments: a very high rating from users on overall satisfaction that ties the highest-ever mark set in CY 2014; the greatest number of core-hours delivered to research projects; the largest percentage of capability usage since the OLCF began tracking the metric in 2009; and success in delivering on the allocation of 60, 30, and 10% of core hours offered for the INCITE (Innovative and Novel Computational Impact on Theory and Experiment), ALCC (Advanced Scientific Computing Research Leadership Computing Challenge), and Director’s Discretionary programs, respectively. These accomplishments, coupled with the extremely high utilization rate, represent the fulfillment of the promise of Titan: maximum use by maximum-size simulations. The impact of all of these successes and more is reflected in the accomplishments of OLCF users, with publications this year in notable journals Nature, Nature Materials, Nature Chemistry, Nature Physics, Nature Climate Change, ACS Nano, Journal of the American Chemical Society, and Physical Review Letters, as well as many others.
The achievements included in the 2015 OLCF Operational Assessment Report reflect first-ever or largest simulations in their communities; for example, Titan enabled engineers in Los Angeles and the surrounding region to design and begin building improved critical infrastructure by enabling the highest-resolution Cybershake map for Southern California to date. The Titan system provides the largest extant heterogeneous architecture for computing and computational science. Usage is high, delivering on the promise of a system well-suited for capability simulations for science. This success is due in part to innovations in tracking and reporting the activity on the compute nodes, and using this information to further enable and optimize applications, extending and balancing workload across the entire node. The OLCF continues to invest in innovative processes, tools, and resources necessary to meet continuing user demand. The facility’s leadership in data analysis and workflows was featured at the Department of Energy (DOE) booth at SC15, for the second year in a row, highlighting work with researchers from the National Library of Medicine coupled with unique computational and data resources serving experimental and observational data across facilities. Effective operations of the OLCF play a key role in the scientific missions and accomplishments of its users. Building on the exemplary year of 2014, as shown by the 2014 Operational Assessment Report (OAR) review committee response in Appendix A, this OAR delineates the policies, procedures, and innovations implemented by the OLCF to continue delivering a multi-petaflop resource for cutting-edge research. This report covers CY 2015, which, unless otherwise specified, denotes January 1, 2015, through December 31, 2015.

  7. Does Formal Research Training Lead to Academic Success in Plastic Surgery? A Comprehensive Analysis of U.S. Academic Plastic Surgeons.

    PubMed

    Lopez, Joseph; Ameri, Afshin; Susarla, Srinivas M; Reddy, Sashank; Soni, Ashwin; Tong, J W; Amini, Neda; Ahmed, Rizwan; May, James W; Lee, W P Andrew; Dorafshar, Amir

    2016-01-01

It is currently unknown whether formal research training has an influence on academic advancement in plastic surgery. The purpose of this study was to determine whether formal research training was associated with higher research productivity, academic rank, and procurement of extramural National Institutes of Health (NIH) funding in plastic surgery, comparing academic surgeons who completed such research training with those without. This was a cross-sectional study of full-time academic plastic surgeons in the United States. The main predictor variable was formal research training, defined as completion of a postdoctoral research fellowship or attainment of a Doctor of Philosophy (PhD). The primary outcome was scientific productivity measured by the Hirsch index (h-index: the largest number h such that h of an author's publications have at least h citations each). The secondary outcomes were academic rank and NIH funding. Descriptive, bivariate, and multiple regression statistics were computed. A total of 607 academic surgeons were identified from 94 Accreditation Council for Graduate Medical Education-accredited plastic surgery training programs. In all, 179 (29.5%) surgeons completed formal research training. The mean h-index was 11.7 ± 9.9, and 58 (9.6%) surgeons had successfully procured NIH funding. The distribution of academic rank was the following: endowed professor (5.4%), professor (23.9%), associate professor (23.4%), assistant professor (46.0%), and instructor (1.3%). In a multiple regression analysis, completion of formal research training was significantly predictive of a higher h-index and successful procurement of NIH funding. Current evidence demonstrates that formal research training is associated with higher scientific productivity and increased likelihood of future NIH funding. Copyright © 2016 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
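The h-index used as the primary outcome above can be computed directly from a list of per-paper citation counts. A minimal sketch with hypothetical counts (not the study's data):

```python
def h_index(citations: list[int]) -> int:
    # h = the largest h such that at least h papers have >= h citations each.
    # Sort descending; h is the last rank i where the i-th paper has >= i citations.
    counts = sorted(citations, reverse=True)
    h = 0
    for i, c in enumerate(counts, start=1):
        if c >= i:
            h = i
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3]))  # 4: four papers each have at least 4 citations
```

The fifth paper has only 3 citations, so h stops at 4; order of the input list does not matter.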

  8. Advanced Scientific Computing Research Exascale Requirements Review. An Office of Science review sponsored by Advanced Scientific Computing Research, September 27-29, 2016, Rockville, Maryland

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Almgren, Ann; DeMar, Phil; Vetter, Jeffrey

The widespread use of computing in the American economy would not be possible without a thoughtful, exploratory research and development (R&D) community pushing the performance edge of operating systems, computer languages, and software libraries. These are the tools and building blocks — the hammers, chisels, bricks, and mortar — of the smartphone, the cloud, and the computing services on which we rely. Engineers and scientists need ever-more specialized computing tools to discover new material properties for manufacturing, make energy generation safer and more efficient, and provide insight into the fundamentals of the universe, for example. The research division of the U.S. Department of Energy’s (DOE’s) Office of Advanced Scientific Computing Research (ASCR Research) ensures that these tools and building blocks are being developed and honed to meet the extreme needs of modern science. See also http://exascaleage.org/ascr/ for additional information.

  9. The Applicability of Emerging Quantum Computing Capabilities to Exo-Planet Research

    NASA Astrophysics Data System (ADS)

    Correll, Randall; Worden, S.

    2014-01-01

In conjunction with the Universities Space Research Association and Google, Inc., NASA Ames has acquired a quantum computing device built by DWAVE Systems with approximately 512 “qubits.” Quantum computers have the feature that their capabilities to find solutions to problems with large numbers of variables scale linearly with the number of variables rather than exponentially with that number. These devices may have significant applicability to detection of exoplanet signals in noisy data. We have therefore explored the application of quantum computing to analyze stellar transiting exoplanet data from NASA’s Kepler Mission. The analysis of the case studies was done using DWAVE Systems’ BlackBox compiler software emulator, although one dataset was run successfully on DWAVE Systems’ 512-qubit Vesuvius machine. The approach first extracts a list of candidate transits from the photometric lightcurve of a given Kepler target, and then applies a quantum annealing algorithm to find periodicity matches between subsets of the candidate transit list. We examined twelve case studies and were successful in reproducing the results of the Kepler science pipeline in finding validated exoplanets, and matched the results for a pair of candidate exoplanets. We conclude that the current implementation of the algorithm is not sufficiently challenging to require a quantum computer as opposed to a conventional computer. We are developing more robust algorithms better tailored to the quantum computer and believe that our approach has the potential to extract exoplanet transits in some cases where a conventional approach would not in Kepler data. Additionally, we believe the new quantum capabilities may have even greater relevance for new exoplanet data sets such as that contemplated for NASA’s Transiting Exoplanet Survey Satellite (TESS) and other astrophysics data sets.
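The periodicity-matching criterion underlying the approach above can be illustrated classically, before any quantum-annealing encoding: a candidate transit time matches a trial period if it falls an integer number of periods from a reference epoch. A hedged sketch (function name, tolerance, and candidate list are hypothetical, not from the paper):

```python
def periodicity_matches(times, period, epoch=None, tol=0.01):
    # Return the candidate transit times consistent with a strict ephemeris
    # time = epoch + k * period for integer k, within tolerance tol (days).
    epoch = times[0] if epoch is None else epoch
    hits = []
    for t in times:
        k = round((t - epoch) / period)      # nearest integer cycle count
        if abs((t - epoch) - k * period) <= tol:
            hits.append(t)
    return hits

# Candidate transits: a period-2.5 signal buried among spurious events
cands = [0.0, 1.1, 2.5, 3.7, 5.0, 7.5, 8.9, 10.0]
print(periodicity_matches(cands, period=2.5))  # [0.0, 2.5, 5.0, 7.5, 10.0]
```

The paper's contribution is recasting the search over subsets and trial periods as an annealing problem; this sketch shows only the match test applied to one trial period.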

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gilbert, Jack A.; Quinn, Robert A.; Debelius, Justine

    Rapid advances in DNA sequencing, metabolomics, proteomics and computation dramatically increase accessibility of microbiome studies and identify links between the microbiome and disease. Microbial time-series and multiple molecular perspectives enable Microbiome-Wide Association Studies (MWAS), analogous to Genome-Wide Association Studies (GWAS). Rapid research advances point towards actionable results, although approved clinical tests based on MWAS are still in the future. Appreciating the complexity of interactions between diet, chemistry, health and the microbiome, and determining the frequency of observations needed to capture and integrate this dynamic interface, is paramount for addressing the need for personalized and precision microbiome-based diagnostics and therapies.

  11. Hierarchical Parallelism in Finite Difference Analysis of Heat Conduction

    NASA Technical Reports Server (NTRS)

    Padovan, Joseph; Krishna, Lala; Gute, Douglas

    1997-01-01

    Based on the concept of hierarchical parallelism, this research effort resulted in highly efficient parallel solution strategies for very large scale heat conduction problems. Overall, the method of hierarchical parallelism involves the partitioning of thermal models into several substructured levels wherein an optimal balance into various associated bandwidths is achieved. The details are described in this report. Overall, the report is organized into two parts. Part 1 describes the parallel modelling methodology and associated multilevel direct, iterative and mixed solution schemes. Part 2 establishes both the formal and computational properties of the scheme.

  12. Design of Intelligent Robot as A Tool for Teaching Media Based on Computer Interactive Learning and Computer Assisted Learning to Improve the Skill of University Student

    NASA Astrophysics Data System (ADS)

    Zuhrie, M. S.; Basuki, I.; Asto B, I. G. P.; Anifah, L.

    2018-01-01

The focus of the research is a teaching module that incorporates manufacturing, mechanical design planning, microprocessor-based control systems, and robot maneuverability. Computer-interactive and computer-assisted learning are strategies that emphasize the use of computers and learning aids (computer assisted learning) in teaching and learning activities. This research applied the 4-D research and development model suggested by Thiagarajan et al. (1974), which consists of four stages: Define, Design, Develop, and Disseminate. The research was conducted using a development design with the objective of producing a learning tool in the form of intelligent robot modules and kits based on Computer Interactive Learning and Computer Assisted Learning. Data from the Indonesia Robot Contest during the period 2009-2015 show that the developed modules confirm the fourth stage of the model, dissemination. The modules guide students in producing an intelligent robot tool for teaching based on Computer Interactive Learning and Computer Assisted Learning. Students’ responses also showed positive feedback on the robotics module and computer-based interactive learning.

  13. Computer science security research and human subjects: emerging considerations for research ethics boards.

    PubMed

    Buchanan, Elizabeth; Aycock, John; Dexter, Scott; Dittrich, David; Hvizdak, Erin

    2011-06-01

    This paper explores the growing concerns with computer science research, and in particular, computer security research and its relationship with the committees that review human subjects research. It offers cases that review boards are likely to confront, and provides a context for appropriate consideration of such research, as issues of bots, clouds, and worms enter the discourse of human subjects review.

  14. Cohort mortality study of garment industry workers exposed to formaldehyde: update and internal comparisons.

    PubMed

    Meyers, Alysha R; Pinkerton, Lynne E; Hein, Misty J

    2013-09-01

    To further evaluate the association between formaldehyde and leukemia, we extended follow-up through 2008 for a cohort mortality study of 11,043 US formaldehyde-exposed garment workers. We computed standardized mortality ratios and standardized rate ratios stratified by year of first exposure, exposure duration, and time since first exposure. Associations between exposure duration and rates of leukemia and myeloid leukemia were further examined using Poisson regression models. Compared to the US population, myeloid leukemia mortality was elevated but overall leukemia mortality was not. In internal analyses, overall leukemia mortality increased with increasing exposure duration and this trend was statistically significant. We continue to see limited evidence of an association between formaldehyde and leukemia. However, the extended follow-up did not strengthen previously observed associations. In addition to continued epidemiologic research, we recommend further research to evaluate the biological plausibility of a causal relation between formaldehyde and leukemia. Copyright © 2013 Wiley Periodicals, Inc.
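The standardized mortality ratios computed above compare observed deaths to the deaths expected if the cohort had experienced the reference population's stratum-specific rates. A minimal sketch with hypothetical strata and rates (not the study's data):

```python
def expected_deaths(person_years: dict, ref_rates: dict) -> float:
    # Sum over strata (e.g. age bands): person-years at risk * reference rate
    return sum(person_years[s] * ref_rates[s] for s in person_years)

def smr(observed: int, expected: float) -> float:
    # Standardized mortality ratio: observed deaths / expected deaths;
    # SMR > 1 indicates mortality above the reference population.
    return observed / expected

# Hypothetical strata, person-years, and rates (deaths per person-year)
py = {"40-49": 12000.0, "50-59": 9000.0}
rates = {"40-49": 0.001, "50-59": 0.003}
e = expected_deaths(py, rates)        # 12 + 27 = 39 expected deaths
print(round(smr(48, e), 2))           # 1.23 -> mortality 23% above reference
```

Internal comparisons such as the study's standardized rate ratios instead use one cohort stratum as its own referent, which removes the healthy-worker comparison with the general population.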

  15. Understanding Aprun Use Patterns

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, Hwa-Chun Wendy

    2009-05-06

On the Cray XT, aprun is the command to launch an application to a set of compute nodes reserved through the Application Level Placement Scheduler (ALPS). At the National Energy Research Scientific Computing Center (NERSC), interactive aprun is disabled. That is, invocations of aprun have to go through the batch system. Batch scripts can and often do contain several apruns which either use subsets of the reserved nodes in parallel, or use all reserved nodes in consecutive apruns. In order to better understand how NERSC users run on the XT, it is necessary to associate aprun information with jobs. It is surprisingly more challenging than it sounds. In this paper, we describe those challenges and how we solved them to produce daily per-job reports for completed apruns. We also describe additional uses of the data, e.g. adjusting charging policy accordingly or associating node failures with jobs/users, and plans for enhancements.

  16. Iterative CT reconstruction using coordinate descent with ordered subsets of data

    NASA Astrophysics Data System (ADS)

    Noo, F.; Hahn, K.; Schöndube, H.; Stierstorfer, K.

    2016-04-01

Image reconstruction based on iterative minimization of a penalized weighted least-squares criterion has become an important topic of research in X-ray computed tomography. This topic is motivated by increasing evidence that such a formalism may enable a significant reduction in dose imparted to the patient while maintaining or improving image quality. One important issue associated with this iterative image reconstruction concept is slow convergence and the associated computational effort. For this reason, there is interest in finding methods that produce approximate versions of the targeted image with a small number of iterations and an acceptable level of discrepancy. We introduce here a novel method to produce such approximations: ordered subsets in combination with iterative coordinate descent. Preliminary results demonstrate that this method can produce, within 10 iterations and using only a constant image as initial condition, satisfactory reconstructions that retain the noise properties of the targeted image.
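The ordered-subsets idea described above, updating with only one subset of the data per sub-iteration and scaling that subset's gradient to approximate the full gradient, can be sketched on a plain least-squares problem. This is a simplified stand-in, not the paper's method: it uses gradient steps rather than coordinate descent, omits the penalty and weights, and the interleaved partitioning and step size are assumptions:

```python
import numpy as np

def os_least_squares(A, y, n_subsets=4, n_epochs=30):
    # Ordered-subsets gradient descent for min_x ||A x - y||^2.
    m, n = A.shape
    x = np.zeros(n)                                   # constant (zero) initial image
    subsets = [np.arange(s, m, n_subsets) for s in range(n_subsets)]
    # Conservative step size from the largest scaled subset Lipschitz constant
    # (spectral norm squared of each subset's rows, times n_subsets).
    L = n_subsets * max(np.linalg.norm(A[r], 2) ** 2 for r in subsets)
    for _ in range(n_epochs):
        for r in subsets:
            # Subset gradient, scaled by n_subsets to mimic the full gradient.
            grad = n_subsets * A[r].T @ (A[r] @ x - y[r])
            x -= grad / L
    return x

rng = np.random.default_rng(0)
A = rng.normal(size=(200, 5))
x_true = rng.normal(size=5)
x_hat = os_least_squares(A, A @ x_true)               # noiseless, consistent data
print(np.allclose(x_hat, x_true, atol=1e-3))          # True
```

Each epoch touches all the data once but applies n_subsets updates, which is the source of the early-iteration acceleration the abstract reports for the coordinate-descent variant.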

  17. Why Don't All Professors Use Computers?

    ERIC Educational Resources Information Center

    Drew, David Eli

    1989-01-01

    Discusses the adoption of computer technology at universities and examines reasons why some professors don't use computers. Topics discussed include computer applications, including artificial intelligence, social science research, statistical analysis, and cooperative research; appropriateness of the technology for the task; the Computer Aptitude…

  18. Good enough practices in scientific computing.

    PubMed

    Wilson, Greg; Bryan, Jennifer; Cranston, Karen; Kitzes, Justin; Nederbragt, Lex; Teal, Tracy K

    2017-06-01

    Computers are now essential in all branches of science, but most researchers are never taught the equivalent of basic lab skills for research computing. As a result, data can get lost, analyses can take much longer than necessary, and researchers are limited in how effectively they can work with software and data. Computing workflows need to follow the same practices as lab projects and notebooks, with organized data, documented steps, and the project structured for reproducibility, but researchers new to computing often don't know where to start. This paper presents a set of good computing practices that every researcher can adopt, regardless of their current level of computational skill. These practices, which encompass data management, programming, collaborating with colleagues, organizing projects, tracking work, and writing manuscripts, are drawn from a wide variety of published sources from our daily lives and from our work with volunteer organizations that have delivered workshops to over 11,000 people since 2010.

  19. Genevar: a database and Java application for the analysis and visualization of SNP-gene associations in eQTL studies.

    PubMed

    Yang, Tsun-Po; Beazley, Claude; Montgomery, Stephen B; Dimas, Antigone S; Gutierrez-Arcelus, Maria; Stranger, Barbara E; Deloukas, Panos; Dermitzakis, Emmanouil T

    2010-10-01

    Genevar (GENe Expression VARiation) is a database and Java tool designed to integrate multiple datasets, and provides analysis and visualization of associations between sequence variation and gene expression. Genevar allows researchers to investigate expression quantitative trait loci (eQTL) associations within a gene locus of interest in real time. The database and application can be installed on a standard computer in database mode and, in addition, on a server to share discoveries among affiliations or the broader community over the Internet via web services protocols. http://www.sanger.ac.uk/resources/software/genevar.

  20. An Overview of NASA's Intelligent Systems Program

    NASA Technical Reports Server (NTRS)

    Cooke, Daniel E.; Norvig, Peter (Technical Monitor)

    2001-01-01

NASA and the Computer Science Research community are poised to enter a critical era, an era in which, it seems, each needs the other. Market forces, driven by the immediate economic viability of computer science research results, place Computer Science in a relatively novel position. These forces impact how research is done and could, in the worst case, drive the field away from significant innovation, opting instead for incremental advances that result in greater stability in the market place. NASA, however, requires significant advances in computer science research in order to accomplish the exploration and science agenda it has set out for itself. NASA may indeed be poised to advance computer science research in this century much the way it advanced aeronautics research in the last.

  1. 78 FR 26626 - Applications for New Awards; National Institute on Disability and Rehabilitation Research...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-07

    ... Rehabilitation Research--Disability and Rehabilitation Research Projects--Inclusive Cloud and Web Computing... Rehabilitation Research Projects (DRRPs)--Inclusive Cloud and Web Computing Notice inviting applications for new...#DRRP . Priorities: Priority 1--DRRP on Inclusive Cloud and Web Computing-- is from the notice of final...

  2. Advanced Biomedical Computing Center (ABCC) | DSITP

    Cancer.gov

The Advanced Biomedical Computing Center (ABCC), located in Frederick, Maryland (MD), provides HPC resources for both NIH/NCI intramural scientists and the extramural biomedical research community. Its mission is to provide HPC support, to engage in collaborative research, and to conduct in-house research in various areas of computational biology and biomedical research.

  3. Graphics supercomputer for computational fluid dynamics research

    NASA Astrophysics Data System (ADS)

    Liaw, Goang S.

    1994-11-01

The objective of this project is to purchase a state-of-the-art graphics supercomputer to improve the Computational Fluid Dynamics (CFD) research capability at Alabama A & M University (AAMU) and to support Air Force research projects. A cutting-edge graphics supercomputer system, Onyx VTX, from Silicon Graphics Computer Systems (SGI), was purchased and installed. Other equipment, including a desktop personal computer, PC-486 DX2 with a built-in 10-BaseT Ethernet card, a 10-BaseT hub, an Apple Laser Printer Select 360, and a notebook computer from Zenith, was also purchased. A reading room has been converted to a research computer lab by adding furniture and an air conditioning unit in order to provide an appropriate working environment for researchers and the purchased equipment. All the purchased equipment was successfully installed and is fully functional. Several research projects, including two existing Air Force projects, are being performed using these facilities.

  4. Advanced computations in plasma physics

    NASA Astrophysics Data System (ADS)

    Tang, W. M.

    2002-05-01

Scientific simulation in tandem with theory and experiment is an essential tool for understanding complex plasma behavior. In this paper we review recent progress and future directions for advanced simulations in magnetically confined plasmas with illustrative examples chosen from magnetic confinement research areas such as microturbulence, magnetohydrodynamics, magnetic reconnection, and others. Significant recent progress has been made in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics, giving increasingly good agreement between experimental observations and computational modeling. This was made possible by innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales together with access to powerful new computational resources. In particular, the fusion energy science community has made excellent progress in developing advanced codes for which computer run-time and problem size scale well with the number of processors on massively parallel machines (MPPs). A good example is the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPPs to produce three-dimensional, general geometry, nonlinear particle simulations which have accelerated progress in understanding the nature of turbulence self-regulation by zonal flows. It should be emphasized that these calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In general, results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments.
The associated scientific excitement should serve to stimulate improved cross-cutting collaborations with other fields and also to help attract bright young talent to plasma science.

  5. How Computer Literacy and Socioeconomic Status Affect Attitudes Toward a Web-Based Cohort: Results From the NutriNet-Santé Study

    PubMed Central

    Méjean, Caroline; Andreeva, Valentina A; Kesse-Guyot, Emmanuelle; Fassier, Philippine; Galan, Pilar; Hercberg, Serge; Touvier, Mathilde

    2015-01-01

    Background: In spite of the growing literature in the field of e-epidemiology, clear evidence about computer literacy or attitudes toward respondent burden among e-cohort participants is largely lacking. Objective: We assessed the computer and Internet skills of participants in the NutriNet-Santé Web-based cohort. We then explored attitudes toward the study demands/respondent burden according to levels of computer literacy and sociodemographic status. Methods: Self-reported data from 43,028 e-cohort participants were collected in 2013 via a Web-based questionnaire. We employed unconditional logistic and linear regression analyses. Results: Approximately one-quarter of participants (23.79%, 10,235/43,028) reported being inexperienced in terms of computer use. Regarding attitudes toward participant burden, women tended to be more favorable (eg, “The overall website use is easy”) than were men (OR 0.65, 95% CI 0.59-0.71, P<.001), whereas better educated participants (>12 years of schooling) were less likely to accept the demands associated with participation (eg, “I receive questionnaires too often”) compared to their less educated counterparts (OR 1.62, 95% CI 1.48-1.76, P<.001). Conclusions: A substantial proportion of participants had low computer/Internet skills, suggesting that this does not represent a barrier to participation in Web-based cohorts. Our study also suggests that several subgroups of participants with lower computer skills (eg, women or those with lower educational level) might more readily accept the demands associated with participation in the Web cohort. These findings can help guide future Web-based research strategies. PMID:25648178
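
    The odds ratios quoted in the Results above can be illustrated, in miniature, from a 2×2 contingency table. The sketch below uses made-up counts (not NutriNet-Santé data) to show the standard odds-ratio and Wald 95% CI calculation that underlies such comparisons.

```python
import math

# Hypothetical 2x2 table (NOT NutriNet-Sante data): agreement with
# "I receive questionnaires too often" by education level.
#                  agrees  disagrees
# >12 y schooling    480       520
# <=12 y schooling   380       620
a, b = 480, 520   # exposed group: agree / disagree
c, d = 380, 620   # reference group: agree / disagree

# Odds ratio, with a Wald 95% confidence interval computed on the log scale
odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
ci_low = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
ci_high = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

print(f"OR = {odds_ratio:.2f}, 95% CI {ci_low:.2f}-{ci_high:.2f}")
```

    A logistic regression with a single binary covariate reproduces the same OR as exp(coefficient); the table form just makes the arithmetic visible.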

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hules, John

    This 1998 annual report from the National Energy Research Scientific Computing Center (NERSC) presents the year in review for the following categories: Computational Science; Computer Science and Applied Mathematics; and Systems and Services. Also presented are science highlights in the following categories: Basic Energy Sciences; Biological and Environmental Research; Fusion Energy Sciences; High Energy and Nuclear Physics; and Advanced Scientific Computing Research and Other Projects.

  7. Computers in Education: Realizing the Potential. Chairmen's Report of a Research Conference, Pittsburgh, Pennsylvania, November 20-24, 1982.

    ERIC Educational Resources Information Center

    Lesgold, Alan; Reif, Frederick

    The future of computers in education and the research needed to realize the computer's potential are discussed in this report, which presents a summary and the conclusions from an invitational conference involving 40 computer scientists, psychologists, educational researchers, teachers, school administrators, and parents. The summary stresses the…

  8. A prototype Upper Atmospheric Research Collaboratory (UARC)

    NASA Technical Reports Server (NTRS)

    Clauer, C. R.; Atkins, D. E.; Weymouth, T. E.; Olson, G. M.; Niciejewski, R.; Finholt, T. A.; Prakash, A.; Rasmussen, C. E.; Killeen, T.; Rosenberg, T. J.

    1995-01-01

    The National Collaboratory concept has great potential for enabling 'critical mass' working groups and highly interdisciplinary research projects. We report here on a new program to build a prototype collaboratory using the Sondrestrom Upper Atmospheric Research Facility in Kangerlussuaq, Greenland and a group of associated scientists. The Upper Atmospheric Research Collaboratory (UARC) is a joint venture of researchers in upper atmospheric and space science, computer science, and behavioral science to develop a testbed for collaborative remote research. We define the 'collaboratory' as an advanced information technology environment which enables teams to work together over distance and time on a wide variety of intellectual tasks. It provides: (1) human-to-human communications using shared computer tools and work spaces; (2) group access and use of a network of information, data, and knowledge sources; and (3) remote access and control of instruments for data acquisition. The UARC testbed is being implemented to support a distributed community of space scientists so that they have network access to the remote instrument facility in Kangerlussuaq and are able to interact among geographically distributed locations. The goal is to enable them to use the UARC, rather than physically travel to Greenland, to conduct team research campaigns. Even on short notice, participants will be able to meet through the collaboratory from their home institutions to operate a battery of remote interactive observations and to acquire, process, and interpret the data.

  9. Cognitive computing and eScience in health and life science research: artificial intelligence and obesity intervention programs.

    PubMed

    Marshall, Thomas; Champagne-Langabeer, Tiffiany; Castelli, Darla; Hoelscher, Deanna

    2017-12-01

    To present research models based on artificial intelligence and discuss the concept of cognitive computing and eScience as disruptive factors in health and life science research methodologies. The paper identifies big data as a catalyst to innovation and the development of artificial intelligence, presents a framework for computer-supported human problem solving, and describes a transformation of research support models. This framework includes traditional computer support; federated cognition using machine learning and cognitive agents to augment human intelligence; and a semi-autonomous/autonomous cognitive model, based on deep machine learning, which supports eScience. The paper provides a forward view of the impact of artificial intelligence on human-computer support and research methods in health and life science research. Cognitive computing and eScience research models that augment or amplify human task performance with artificial intelligence are discussed as novel and innovative systems for developing more effective adaptive obesity intervention programs.

  10. Computer Science Research at Langley

    NASA Technical Reports Server (NTRS)

    Voigt, S. J. (Editor)

    1982-01-01

    A workshop was held at Langley Research Center, November 2-5, 1981, to highlight ongoing computer science research at Langley and to identify additional areas of research based upon the computer user requirements. A panel discussion was held in each of nine application areas, and these are summarized in the proceedings. Slides presented by the invited speakers are also included. A survey of scientific, business, data reduction, and microprocessor computer users helped identify areas of focus for the workshop. Several areas of computer science which are of most concern to the Langley computer users were identified during the workshop discussions. These include graphics, distributed processing, programmer support systems and tools, database management, and numerical methods.

  11. Cloud computing security.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shin, Dongwan; Claycomb, William R.; Urias, Vincent E.

    Cloud computing is a paradigm rapidly being embraced by government and industry as a solution for cost-savings, scalability, and collaboration. While a multitude of applications and services are available commercially for cloud-based solutions, research in this area has yet to address the full spectrum of potential challenges facing cloud computing. This tutorial aims to provide researchers with a fundamental understanding of cloud computing, with the goals of identifying a broad range of potential research topics and inspiring a new surge in research to address current issues. We will also discuss real implementations of research-oriented cloud computing systems for both academia and government, including configuration options, hardware issues, challenges, and solutions.

  12. Validation of numerical model for cook stove using Reynolds averaged Navier-Stokes based solver

    NASA Astrophysics Data System (ADS)

    Islam, Md. Moinul; Hasan, Md. Abdullah Al; Rahman, Md. Mominur; Rahaman, Md. Mashiur

    2017-12-01

    Biomass-fired cook stoves have, for many years, been the main cooking appliance for the rural people of developing countries, and several research efforts have been carried out to find efficient stove designs. In the present study, a numerical model of an improved household cook stove is developed to analyze the heat transfer and flow behavior of gas during operation, and the model is validated against experimental results. Computations employ a non-premixed combustion model, with the Reynolds-averaged Navier-Stokes (RANS) equations and the κ-ε model governing the turbulent flow within the computational domain. The computational results agree well with the experiment, and the developed numerical model can be used to predict the effect of different biomasses on the efficiency of the cook stove.
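
    The validation workflow described above, comparing a discretized model's output against reference results, can be illustrated on a problem far simpler than stove CFD. The sketch below is a 1-D steady heat-conduction finite-difference solver checked against its analytic solution; it is an assumed toy example, not the authors' RANS model.

```python
import numpy as np

# Illustrative sketch: validate a finite-difference model against an
# exact solution, mirroring (in miniature) the model-validation step
# described above. This is NOT the authors' RANS cook-stove model.

# 1-D steady conduction: d2T/dx2 = 0, T(0) = 300 K, T(L) = 600 K
L, n = 0.1, 51                      # rod length [m], number of grid points
x = np.linspace(0.0, L, n)
T_left, T_right = 300.0, 600.0

# Assemble the linear system: Dirichlet rows at the ends,
# the standard second-difference stencil at interior nodes
A = np.zeros((n, n))
b = np.zeros(n)
A[0, 0] = A[-1, -1] = 1.0
b[0], b[-1] = T_left, T_right
for i in range(1, n - 1):
    A[i, i-1], A[i, i], A[i, i+1] = 1.0, -2.0, 1.0

T_num = np.linalg.solve(A, b)
T_exact = T_left + (T_right - T_left) * x / L   # linear analytic profile

# "Agreement with experiment" here is a machine-precision match
print(np.max(np.abs(T_num - T_exact)))
```

    Real CFD validation compares against measured data with discretization and measurement error, but the structure is the same: solve the discretized equations, then compare the field against a trusted reference.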

  13. Advanced computational tools for 3-D seismic analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barhen, J.; Glover, C.W.; Protopopescu, V.A.

    1996-06-01

    The global objective of this effort is to develop advanced computational tools for 3-D seismic analysis and to test the products using a model dataset developed under the joint aegis of the United States' Society of Exploration Geophysicists (SEG) and the European Association of Exploration Geophysicists (EAEG). The goal is to enhance the value to the oil industry of the SEG/EAEG modeling project, carried out with US Department of Energy (DOE) funding in FY 1993-95. The primary objective of the ORNL Center for Engineering Systems Advanced Research (CESAR) is to spearhead the computational innovations that would enable a revolutionary advance in 3-D seismic analysis. The CESAR effort is carried out in collaboration with world-class domain experts from leading universities, and in close coordination with other national laboratories and oil industry partners.

  14. Tractable Experiment Design via Mathematical Surrogates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, Brian J.

    This presentation summarizes the development and implementation of quantitative design criteria, motivated by targeted inference objectives, for identifying new, potentially expensive computational or physical experiments. The first application is concerned with estimating features of quantities of interest arising from complex computational models, such as quantiles or failure probabilities. A sequential strategy is proposed for iterative refinement of the importance distributions used to efficiently sample the uncertain inputs to the computational model. In the second application, effective use of mathematical surrogates is investigated to help alleviate the analytical and numerical intractability often associated with Bayesian experiment design. This approach allows for the incorporation of prior information into the design process without the need for gross simplification of the design criterion. Illustrative examples of both design problems will be presented as an argument for the relevance of these research problems.
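
    The first application above, estimating a small failure probability by sampling the uncertain inputs from an importance distribution rather than their nominal distribution, can be sketched on a toy problem. The limit state and distributions below are assumptions for illustration, not taken from the report.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "computational model": failure when x > 3, with input x ~ N(0, 1).
# True failure probability P(X > 3) is about 1.35e-3.
threshold = 3.0
n = 100_000

# Plain Monte Carlo: almost all samples miss the rare failure region,
# so the estimate is noisy for a fixed budget of model evaluations.
x = rng.standard_normal(n)
p_mc = np.mean(x > threshold)

# Importance sampling: draw from N(threshold, 1), centered on the
# failure region, and reweight each sample by the density ratio
# f(y)/q(y) of the nominal to the importance distribution.
y = rng.normal(threshold, 1.0, n)
w = np.exp(-0.5 * y**2) / np.exp(-0.5 * (y - threshold)**2)
p_is = np.mean((y > threshold) * w)

print(p_mc, p_is)
```

    The sequential strategy described above would refine the importance distribution iteratively (e.g., recentering it on the failure region found so far); this sketch shows only a single fixed importance distribution.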

  15. Comment on "Most computational hydrology is not reproducible, so is it really science?" by Christopher Hutton et al.

    NASA Astrophysics Data System (ADS)

    Añel, Juan A.

    2017-03-01

    Nowadays, the majority of the scientific community is not aware of the risks and problems associated with an inadequate use of computer systems for research, mostly for reproducibility of scientific results. Such reproducibility can be compromised by the lack of clear standards and insufficient methodological description of the computational details involved in an experiment. In addition, the inappropriate application or ignorance of copyright laws can have undesirable effects on access to aspects of great importance to the design of experiments, and therefore to the interpretation of results. Plain Language Summary: This article highlights several important issues for ensuring the scientific reproducibility of results within the current scientific framework, going beyond simple documentation. Several specific examples are discussed in the field of hydrological modeling.

  16. Alliance for Computational Science Collaboration: HBCU Partnership at Alabama A&M University Continuing High Performance Computing Research and Education at AAMU

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Qian, Xiaoqing; Deng, Z. T.

    2009-11-10

    This is the final report for Department of Energy (DOE) project DE-FG02-06ER25746, entitled "Continuing High Performance Computing Research and Education at AAMU". The three-year project began on August 15, 2006, and ended on August 14, 2009. Its objective was to enhance high performance computing research and education capabilities at Alabama A&M University (AAMU), and to train African-American and other minority students and scientists in the computational science field for eventual employment with DOE. AAMU successfully completed all the proposed research and educational tasks. Through the support of DOE, AAMU was able to provide opportunities to minority students through summer internships and the DOE computational science scholarship program. In the past three years, AAMU (1) supported three graduate research assistants in image processing for a hypersonic shockwave control experiment and in computational science related areas; (2) recruited and provided full financial support for six AAMU undergraduate summer research interns to participate in the Research Alliance in Math and Science (RAMS) program at Oak Ridge National Laboratory (ORNL); (3) awarded 30 highly competitive DOE High Performance Computing Scholarships ($1500 each) to qualified top AAMU undergraduate students in science and engineering majors; (4) improved the high performance computing laboratory at AAMU with the addition of three high performance Linux workstations; and (5) conducted image analysis for the electromagnetic shockwave control experiment and computation of shockwave interactions to verify the design and operation of the AAMU supersonic wind tunnel. The high performance computing research and education activities at AAMU had a great impact on minority students. As praised by the Accreditation Board for Engineering and Technology (ABET) in 2009, "The work on high performance computing that is funded by the Department of Energy provides scholarships to undergraduate students as computational science scholars. This is a wonderful opportunity to recruit under-represented students." Three ASEE papers were published in the 2007, 2008, and 2009 proceedings of the ASEE Annual Conferences, and presentations of these papers were made at those conferences. It is critical to continue these research and education activities.

  17. The CSM testbed software system: A development environment for structural analysis methods on the NAS CRAY-2

    NASA Technical Reports Server (NTRS)

    Gillian, Ronnie E.; Lotts, Christine G.

    1988-01-01

    The Computational Structural Mechanics (CSM) Activity at Langley Research Center is developing methods for structural analysis on modern computers. To facilitate that research effort, an applications development environment has been constructed to insulate the researcher from the many computer operating systems of a widely distributed computer network. The CSM Testbed development system was ported to the Numerical Aerodynamic Simulator (NAS) Cray-2 at the Ames Research Center to provide a high-end computational capability. This paper describes the implementation experiences, the resulting capability, and the future directions for the Testbed on supercomputers.

  18. First 3 years of operation of RIACS (Research Institute for Advanced Computer Science) (1983-1985)

    NASA Technical Reports Server (NTRS)

    Denning, P. J.

    1986-01-01

    The focus of the Research Institute for Advanced Computer Science (RIACS) is to explore matches between advanced computing architectures and the processes of scientific research. An architecture evaluation of the MIT static dataflow machine, specification of a graphical language for expressing distributed computations, and specification of an expert system for aiding in grid generation for two-dimensional flow problems were initiated. Research projects for 1984 and 1985 are summarized.

  19. An Analysis on the Effect of Computer Self-Efficacy over Scientific Research Self-Efficacy and Information Literacy Self-Efficacy

    ERIC Educational Resources Information Center

    Tuncer, Murat

    2013-01-01

    The present research investigates reciprocal relations among computer self-efficacy, scientific research self-efficacy, and information literacy self-efficacy. Research findings demonstrate that, according to standardized regression coefficients, computer self-efficacy has a positive effect on information literacy self-efficacy. Likewise it has been detected…

  20. 48 CFR 252.227-7018 - Rights in noncommercial technical data and computer software-Small Business Innovation Research...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... technical data and computer software-Small Business Innovation Research (SBIR) Program. 252.227-7018 Section... Innovation Research (SBIR) Program. As prescribed in 227.7104(a), use the following clause: Rights in Noncommercial Technical Data and Computer Software—Small Business Innovation Research (SBIR) Program (JUN 1995...