Sample records for methods publications computer

  1. 47 CFR 80.771 - Method of computing coverage.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 5 2010-10-01 2010-10-01 false Method of computing coverage. 80.771 Section 80... STATIONS IN THE MARITIME SERVICES Standards for Computing Public Coast Station VHF Coverage § 80.771 Method of computing coverage. Compute the +17 dBu contour as follows: (a) Determine the effective antenna...

  2. Assessing Tax Form Distribution Costs: A Proposed Method for Computing the Dollar Value of Tax Form Distribution in a Public Library.

    ERIC Educational Resources Information Center

    Casey, James B.

    1998-01-01

    Explains how a public library can compute the actual cost of distributing tax forms to the public by listing all direct and indirect costs and demonstrating the formulae and necessary computations. Supplies directions for calculating costs involved for all levels of staff as well as associated public relations efforts, space, and utility costs.…

  3. Climate Change Discourse in Mass Media: Application of Computer-Assisted Content Analysis

    ERIC Educational Resources Information Center

    Kirilenko, Andrei P.; Stepchenkova, Svetlana O.

    2012-01-01

    Content analysis of mass media publications has become a major scientific method used to analyze public discourse on climate change. We propose a computer-assisted content analysis method to extract prevalent themes and analyze discourse changes over an extended period in an objective and quantifiable manner. The method includes the following: (1)…

  4. Testing and Validation of Computational Methods for Mass Spectrometry.

    PubMed

    Gatto, Laurent; Hansen, Kasper D; Hoopmann, Michael R; Hermjakob, Henning; Kohlbacher, Oliver; Beyer, Andreas

    2016-03-04

    High-throughput methods based on mass spectrometry (proteomics, metabolomics, lipidomics, etc.) produce a wealth of data that cannot be analyzed without computational methods. The impact of the choice of method on the overall result of a biological study is often underappreciated, but different methods can result in very different biological findings. It is thus essential to evaluate and compare the correctness and relative performance of computational methods. The volume of the data as well as the complexity of the algorithms render unbiased comparisons challenging. This paper discusses some problems and challenges in testing and validation of computational methods. We discuss the different types of data (simulated and experimental validation data) as well as different metrics to compare methods. We also introduce a new public repository for mass spectrometric reference data sets (http://compms.org/RefData) that contains a collection of publicly available data sets for performance evaluation of a wide range of different methods.

  5. 47 CFR 80.771 - Method of computing coverage.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 47 Telecommunication 5 2012-10-01 2012-10-01 false Method of computing coverage. 80.771 Section 80.771 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) SAFETY AND SPECIAL RADIO SERVICES STATIONS IN THE MARITIME SERVICES Standards for Computing Public Coast Station VHF Coverage § 80.771 Method...

  6. A Computer-Assisted Instruction in Teaching Abstract Statistics to Public Affairs Undergraduates

    ERIC Educational Resources Information Center

    Ozturk, Ali Osman

    2012-01-01

    This article attempts to demonstrate the applicability of a computer-assisted instruction supported with simulated data in teaching abstract statistical concepts to political science and public affairs students in an introductory research methods course. The software is called the Elaboration Model Computer Exercise (EMCE) in that it takes a great…

  7. Advanced Computational Methods for Optimization of Non-Periodic Inspection Intervals for Aging Infrastructure

    DTIC Science & Technology

    2017-01-05

    AFRL-AFOSR-JP-TR-2017-0002, Advanced Computational Methods for Optimization of Non-Periodic Inspection Intervals for Aging Infrastructure, Manabu... Grant number FA2386... Distribution unlimited: public release. This report for the project titled 'Advanced Computational Methods for Optimization of

  8. Leveraging Social Computing for Personalized Crisis Communication using Social Media

    PubMed Central

    Leykin, Dmitry; Aharonson-Daniel, Limor; Lahad, Mooli

    2016-01-01

    Introduction: The extensive use of social media in modern life redefines social interaction and communication. Communication plays an important role in mitigating, or exacerbating, the psychological and behavioral responses to critical incidents and disasters. As recent disasters demonstrated, people tend to converge to social media during and following emergencies. Authorities can then use this media and other computational methods to gain insights from the public, mainly to enhance situational awareness, but also to improve their communication with the public and public adherence to instructions. Methods: The current review presents a conceptual framework for studying psychological aspects of crisis and risk communication using the social media through social computing. Results: Advanced analytical tools can be integrated in the processes and objectives of crisis communication. The availability of the computational techniques can improve communication with the public by a process of Hyper-Targeted Crisis Communication. Discussion: The review suggests that using advanced computational tools for target-audience profiling and linguistic matching in social media, can facilitate more sensitive and personalized emergency communication. PMID:27092290

  9. [Design and study of parallel computing environment of Monte Carlo simulation for particle therapy planning using a public cloud-computing infrastructure].

    PubMed

    Yokohama, Noriya

    2013-07-01

    This report describes the design of architectures and performance measurements for a parallel computing environment running Monte Carlo simulations for particle therapy planning on a high-performance computing (HPC) instance within a public cloud-computing infrastructure. Performance measurements showed approximately 28 times faster execution than a single-thread architecture, combined with improved stability. A study of methods for optimizing system operations also indicated lower cost.

  10. Evaluation of Visualization Tools for Computer Network Defense Analysts: Display Design, Methods, and Results for a User Study

    DTIC Science & Technology

    2016-11-01

    Evaluation of Visualization Tools for Computer Network Defense Analysts: Display Design, Methods, and Results for a User Study, by Christopher J Garneau and Robert F Erbacher, US Army Research Laboratory, November 2016. Approved for public release. Reporting period: January 2013–September 2015.

  11. [Economic efficiency of computer monitoring of health].

    PubMed

    Il'icheva, N P; Stazhadze, L L

    2001-01-01

    Presents a method of computer-based health monitoring built on modern information technologies in public health. The method helps an outpatient clinic organize its preventive activities at a high level and substantially reduces losses of time and money. The efficiency of such preventive measures, together with the growing number of computer and Internet users, suggests that these methods are promising and that further studies in this field are needed.

  12. Cumulative reports and publications through December 31, 1991

    NASA Technical Reports Server (NTRS)

    1992-01-01

    A reports and publications list is given from the Institute for Computer Applications in Science and Engineering (ICASE) through December 31, 1991. The major categories of the current ICASE research program are; numerical methods, control and parameter identification problems, computational problems in engineering and the physical sciences, and computer systems and software. Since ICASE reports are intended to be preprints of articles that will appear in journals or conference proceedings, the published reference is included when available.

  13. Progress in computational toxicology.

    PubMed

    Ekins, Sean

    2014-01-01

    Computational methods have been widely applied to toxicology across pharmaceutical, consumer product and environmental fields over the past decade. Progress in computational toxicology is now reviewed. A literature review was performed on computational models for hepatotoxicity (e.g. for drug-induced liver injury (DILI)), cardiotoxicity, renal toxicity and genotoxicity. In addition, various publications have been highlighted that use machine learning methods. Several computational toxicology model datasets from past publications were used to compare Bayesian and Support Vector Machine (SVM) learning methods. The increasing amounts of data for defined toxicology endpoints have enabled machine learning models that have been increasingly used for predictions. It is shown that across many different models Bayesian and SVM perform similarly based on cross-validation data. Considerable progress has been made in computational toxicology in a decade, in both model development and the availability of larger-scale or 'big data' models. Future efforts in toxicology data generation will likely provide us with hundreds of thousands of compounds that are readily accessible for machine learning models. These models will cover relevant chemistry space for pharmaceutical, consumer product and environmental applications.
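
    As a minimal sketch of the kind of comparison described above, the snippet below pits a naive Bayes ("Bayesian") classifier against an SVM under five-fold cross-validation. The data are synthetic stand-ins generated on the fly, not the paper's toxicology datasets.

    ```python
    # Hedged sketch: cross-validated Bayesian-vs-SVM comparison on synthetic
    # stand-in data for a binary toxicology endpoint (not the paper's data).
    from sklearn.datasets import make_classification
    from sklearn.model_selection import cross_val_score
    from sklearn.naive_bayes import GaussianNB
    from sklearn.svm import SVC

    # Synthetic "compound descriptor" matrix with a binary toxicity label.
    X, y = make_classification(n_samples=500, n_features=50, random_state=0)

    for name, model in [("Bayesian (naive Bayes)", GaussianNB()),
                        ("SVM (RBF kernel)", SVC(kernel="rbf"))]:
        scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
        print(f"{name}: mean 5-fold ROC AUC = {scores.mean():.3f}")
    ```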

  14. The Observation of Bahasa Indonesia Official Computer Terms Implementation in Scientific Publication

    NASA Astrophysics Data System (ADS)

    Gunawan, D.; Amalia, A.; Lydia, M. S.; Muthaqin, M. I.

    2018-03-01

    The government of the Republic of Indonesia issued a regulation to replace the foreign-language computer terms in earlier use with official computer terms in Bahasa Indonesia. This regulation was stipulated in Presidential Decree No. 2 of 2001 concerning the introduction of official computer terms in Bahasa Indonesia (known as Senarai Padanan Istilah/SPI). Sixteen years later, the people of Indonesia, particularly academics, should have adopted the official computer terms in their official publications. This observation was conducted to discover how widely the official computer terms are used in scientific publications written in Bahasa Indonesia. The data sources are publications by academics, particularly in the computer science field. The method is divided into four stages. The first stage is metadata harvesting using the Open Archives Initiative Protocol for Metadata Harvesting (OAI-PMH). The second converts the harvested documents (in PDF format) to plain text. The third stage is text preprocessing in preparation for string matching. The final stage searches for the official computer terms, based on the 629 SPI terms, using the Boyer-Moore algorithm. We observed 240,781 foreign computer terms in 1,156 scientific publications from six universities. This result shows that foreign computer terms are still widely used by academics.
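
    Since the abstract names the Boyer-Moore algorithm for the final search stage, a compact sketch of the closely related Boyer-Moore-Horspool variant (bad-character rule only) is given below. The two SPI terms are a tiny hypothetical sample, not entries from the actual 629-term list.

    ```python
    # Hedged sketch of the search stage: a Boyer-Moore-Horspool scan for one
    # term inside preprocessed plain text.
    def horspool_find(text: str, pattern: str) -> int:
        """Return the index of the first occurrence of pattern, or -1."""
        m, n = len(pattern), len(text)
        if m == 0 or m > n:
            return -1
        # Bad-character table: shift distance for each character in the pattern.
        shift = {pattern[i]: m - 1 - i for i in range(m - 1)}
        i = m - 1
        while i < n:
            j = 0
            while j < m and text[i - j] == pattern[m - 1 - j]:
                j += 1
            if j == m:
                return i - m + 1
            i += shift.get(text[i], m)
        return -1

    terms = ["tetikus", "papan tombol"]  # hypothetical SPI entries
    doc = "pengguna menggerakkan tetikus dan papan tombol"
    print([t for t in terms if horspool_find(doc, t) >= 0])  # both found
    ```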

  15. 48 CFR 252.204-7014 - Limitations on the Use or Disclosure of Information by Litigation Support Contractors.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    .... Computer software does not include computer data bases or computer software documentation. Litigation... includes technical data and computer software, but does not include information that is lawfully, publicly available without restriction. Technical data means recorded information, regardless of the form or method...

  16. Secure Genomic Computation through Site-Wise Encryption

    PubMed Central

    Zhao, Yongan; Wang, XiaoFeng; Tang, Haixu

    2015-01-01

    Commercial clouds provide on-demand IT services for big-data analysis, which have become an attractive option for users who have no access to comparable infrastructure. However, utilizing these services for human genome analysis is highly risky, as human genomic data contains identifiable information of human individuals and their disease susceptibility. Therefore, currently, no computation on personal human genomic data is conducted on public clouds. To address this issue, here we present a site-wise encryption approach to encrypt whole human genome sequences, which can be subject to secure searching of genomic signatures on public clouds. We implemented this method within the Hadoop framework, and tested it on the case of searching disease markers retrieved from the ClinVar database against patients’ genomic sequences. The secure search runs only one order of magnitude slower than the simple search without encryption, indicating our method is ready to be used for secure genomic computation on public clouds. PMID:26306278
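
    The abstract does not spell out the scheme itself; the sketch below only illustrates the general idea behind site-wise encryption, with one illustrative simplification: encrypt each (position, allele) site with a deterministic keyed function, so that equality searches for disease markers still work over ciphertext. All names and the key are hypothetical.

    ```python
    # Hedged illustration of site-wise deterministic encryption: each genomic
    # site becomes a keyed HMAC token, so marker lookups run on ciphertext.
    # This is a toy sketch of the general idea, not the paper's exact scheme.
    import hashlib
    import hmac

    KEY = b"site-wise-demo-key"  # hypothetical secret key

    def encrypt_site(position: int, allele: str) -> str:
        msg = f"{position}:{allele}".encode()
        return hmac.new(KEY, msg, hashlib.sha256).hexdigest()

    # Patient genome as (position, allele) pairs; marker from a ClinVar-like list.
    genome = [(101, "A"), (102, "G"), (103, "T")]
    encrypted_genome = {encrypt_site(p, a) for p, a in genome}

    marker = (102, "G")  # hypothetical disease marker
    print(encrypt_site(*marker) in encrypted_genome)  # True: marker present
    ```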

  17. Secure Genomic Computation through Site-Wise Encryption.

    PubMed

    Zhao, Yongan; Wang, XiaoFeng; Tang, Haixu

    2015-01-01

    Commercial clouds provide on-demand IT services for big-data analysis, which have become an attractive option for users who have no access to comparable infrastructure. However, utilizing these services for human genome analysis is highly risky, as human genomic data contains identifiable information of human individuals and their disease susceptibility. Therefore, currently, no computation on personal human genomic data is conducted on public clouds. To address this issue, here we present a site-wise encryption approach to encrypt whole human genome sequences, which can be subject to secure searching of genomic signatures on public clouds. We implemented this method within the Hadoop framework, and tested it on the case of searching disease markers retrieved from the ClinVar database against patients' genomic sequences. The secure search runs only one order of magnitude slower than the simple search without encryption, indicating our method is ready to be used for secure genomic computation on public clouds.

  18. Leveraging Social Computing for Personalized Crisis Communication using Social Media.

    PubMed

    Leykin, Dmitry; Aharonson-Daniel, Limor; Lahad, Mooli

    2016-03-24

    The extensive use of social media in modern life redefines social interaction and communication. Communication plays an important role in mitigating, or exacerbating, the psychological and behavioral responses to critical incidents and disasters. As recent disasters demonstrated, people tend to converge to social media during and following emergencies. Authorities can then use this media and other computational methods to gain insights from the public, mainly to enhance situational awareness, but also to improve their communication with the public and public adherence to instructions. The current review presents a conceptual framework for studying psychological aspects of crisis and risk communication using the social media through social computing. Advanced analytical tools can be integrated in the processes and objectives of crisis communication. The availability of the computational techniques can improve communication with the public by a process of Hyper-Targeted Crisis Communication. The review suggests that using advanced computational tools for target-audience profiling and linguistic matching in social media, can facilitate more sensitive and personalized emergency communication.

  19. QMC Goes BOINC: Using Public Resource Computing to Perform Quantum Monte Carlo Calculations

    NASA Astrophysics Data System (ADS)

    Rainey, Cameron; Engelhardt, Larry; Schröder, Christian; Hilbig, Thomas

    2008-10-01

    Theoretical modeling of magnetic molecules traditionally involves the diagonalization of quantum Hamiltonian matrices. However, as the complexity of these molecules increases, the matrices become so large that this process becomes unusable. An additional challenge to this modeling is that many repetitive calculations must be performed, further increasing the need for computing power. Both of these obstacles can be overcome by using a quantum Monte Carlo (QMC) method and a distributed computing project. We have recently implemented a QMC method within the Spinhenge@home project, which is a Public Resource Computing (PRC) project where private citizens allow part-time usage of their PCs for scientific computing. The use of PRC for scientific computing will be described in detail, as well as how you can contribute to the project. See, e.g., L. Engelhardt et al., Angew. Chem. Int. Ed. 47, 924 (2008); C. Schröder, in Distributed & Grid Computing - Science Made Transparent for Everyone. Principles, Applications and Supporting Communities (Weber, M.H.W., ed., 2008). Project URL: http://spin.fh-bielefeld.de

  20. Computational perspectives in the history of science: to the memory of Peter Damerow.

    PubMed

    Laubichler, Manfred D; Maienschein, Jane; Renn, Jürgen

    2013-03-01

    Computational methods and perspectives can transform the history of science by enabling the pursuit of novel types of questions, dramatically expanding the scale of analysis (geographically and temporally), and offering novel forms of publication that greatly enhance access and transparency. This essay presents a brief summary of a computational research system for the history of science, discussing its implications for research, education, and publication practices and its connections to the open-access movement and similar transformations in the natural and social sciences that emphasize big data. It also argues that computational approaches help to reconnect the history of science to individual scientific disciplines.

  21. [Text mining, a method for computer-assisted analysis of scientific texts, demonstrated by an analysis of author networks].

    PubMed

    Hahn, P; Dullweber, F; Unglaub, F; Spies, C K

    2014-06-01

    Searching for relevant publications is becoming more difficult with the increasing number of scientific articles. Text mining, as a specific form of computer-based data analysis, may be helpful in this context. Highlighting relations between authors and finding relevant publications concerning a specific subject using text analysis programs are illustrated graphically by two worked examples.
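
    As a rough illustration of the author-network side of such an analysis, the sketch below builds a weighted co-authorship graph from a few hypothetical publication records with networkx; the article's own examples and tooling may differ.

    ```python
    # Hedged sketch: a co-authorship network from hypothetical records.
    from itertools import combinations

    import networkx as nx

    papers = [  # each entry lists one paper's authors (made-up data)
        ["Hahn P", "Dullweber F"],
        ["Hahn P", "Unglaub F", "Spies CK"],
        ["Unglaub F", "Spies CK"],
    ]

    G = nx.Graph()
    for authors in papers:
        for a, b in combinations(authors, 2):  # every author pair co-occurs
            w = G.get_edge_data(a, b, {"weight": 0})["weight"]
            G.add_edge(a, b, weight=w + 1)     # count joint papers

    # Rank authors by number of distinct co-authors.
    print(sorted(G.degree, key=lambda pair: -pair[1]))
    ```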

  22. Trainable multiscript orientation detection

    NASA Astrophysics Data System (ADS)

    Van Beusekom, Joost; Rangoni, Yves; Breuel, Thomas M.

    2010-01-01

    Detecting the correct orientation of document images is an important step in large-scale digitization processes, as most subsequent document analysis and optical character recognition methods assume an upright document page. Many methods have been proposed to solve the problem, most of which are based on computing the ratio of ascenders to descenders. Unfortunately, this cannot be used for scripts that have neither ascenders nor descenders. We therefore present a trainable method that uses character similarity to compute the correct orientation. A connected-component-based distance measure is computed to compare the characters of the document image to characters whose orientation is known; the orientation yielding the lowest distance is taken as the correct one. Training is easily achieved by exchanging the reference characters for characters of the script to be analyzed. Evaluation of the proposed approach showed accuracies above 99% for Latin and Japanese script from the public UW-III and UW-II datasets. An accuracy of 98.9% was obtained for Fraktur on a non-public dataset. Comparison of the proposed method to two methods using ascender/descender-ratio-based orientation detection shows a significant improvement.
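
    A highly simplified sketch of the decision rule is given below: rotate the page's character crops through the four candidate orientations and keep the orientation whose crops are, on average, closest to reference characters of known orientation. A naive pixel distance stands in for the paper's connected-component distance measure, and all crops are assumed normalized to one square size.

    ```python
    # Hedged sketch of orientation detection by character similarity.
    import numpy as np

    def nearest_ref_distance(component: np.ndarray, refs: list) -> float:
        """Naive pixel distance to the closest reference character."""
        return min(np.abs(component - r).mean() for r in refs)

    def detect_orientation(components: list, refs: list) -> int:
        """Return the rotation (0/90/180/270) with the lowest mean distance."""
        best_rot, best_score = 0, float("inf")
        for k in range(4):  # k quarter-turns
            rotated = [np.rot90(c, k) for c in components]
            score = np.mean([nearest_ref_distance(c, refs) for c in rotated])
            if score < best_score:
                best_rot, best_score = 90 * k, score
        return best_rot

    # Toy demo: a crude asymmetric glyph, and a "page" scanned upside down.
    ref = np.zeros((8, 8))
    ref[1:7, 3] = 1.0   # vertical stroke
    ref[1, 3:6] = 1.0   # top bar makes the glyph orientation-sensitive
    print(detect_orientation([np.rot90(ref, 2)], [ref]))  # 180
    ```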

  23. Comprehensive Thematic T-Matrix Reference Database: A 2014-2015 Update

    NASA Technical Reports Server (NTRS)

    Mishchenko, Michael I.; Zakharova, Nadezhda; Khlebtsov, Nikolai G.; Videen, Gorden; Wriedt, Thomas

    2015-01-01

    The T-matrix method is one of the most versatile and efficient direct computer solvers of the macroscopic Maxwell equations and is widely used for the computation of electromagnetic scattering by single and composite particles, discrete random media, and particles in the vicinity of an interface separating two half-spaces with different refractive indices. This paper is the seventh update to the comprehensive thematic database of peer-reviewed T-matrix publications initiated by us in 2004 and includes relevant publications that have appeared since 2013. It also lists a number of earlier publications overlooked previously.

  24. Computation of Standard Errors

    PubMed Central

    Dowd, Bryan E; Greene, William H; Norton, Edward C

    2014-01-01

    Objectives We discuss the problem of computing the standard errors of functions involving estimated parameters and provide the relevant computer code for three different computational approaches using two popular computer packages. Study Design We show how to compute the standard errors of several functions of interest: the predicted value of the dependent variable for a particular subject, and the effect of a change in an explanatory variable on the predicted value of the dependent variable for an individual subject and average effect for a sample of subjects. Empirical Application Using a publicly available dataset, we explain three different methods of computing standard errors: the delta method, Krinsky–Robb, and bootstrapping. We provide computer code for Stata 12 and LIMDEP 10/NLOGIT 5. Conclusions In most applications, choice of the computational method for standard errors of functions of estimated parameters is a matter of convenience. However, when computing standard errors of the sample average of functions that involve both estimated parameters and nonstochastic explanatory variables, it is important to consider the sources of variation in the function's values. PMID:24800304
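
    The paper's own code targets Stata and LIMDEP; as a language-neutral illustration, the sketch below estimates the standard error of a predicted value by bootstrapping (one of the three methods discussed) on synthetic data.

    ```python
    # Hedged sketch: bootstrap standard error of an OLS prediction at x0.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 200
    x = rng.normal(size=n)
    y = 1.0 + 2.0 * x + rng.normal(size=n)  # synthetic data, true line 1 + 2x
    x0 = 0.5                                # subject at which we predict

    def predict_at(xs, ys, point):
        slope, intercept = np.polyfit(xs, ys, 1)
        return intercept + slope * point

    draws = []
    for _ in range(1000):                   # resample observations with replacement
        idx = rng.integers(0, n, size=n)
        draws.append(predict_at(x[idx], y[idx], x0))

    print(f"prediction {predict_at(x, y, x0):.3f}, "
          f"bootstrap SE {np.std(draws, ddof=1):.3f}")
    ```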

  25. Practices in source code sharing in astrophysics

    NASA Astrophysics Data System (ADS)

    Shamir, Lior; Wallin, John F.; Allen, Alice; Berriman, Bruce; Teuben, Peter; Nemiroff, Robert J.; Mink, Jessica; Hanisch, Robert J.; DuPrie, Kimberly

    2013-02-01

    While software and algorithms have become increasingly important in astronomy, the majority of authors who publish computational astronomy research do not share the source code they develop, making it difficult to replicate and reuse the work. In this paper we discuss the importance of sharing scientific source code with the entire astrophysics community, and propose that journals require authors to make their code publicly available when a paper is published. That is, we suggest that a paper that involves a computer program not be accepted for publication unless the source code becomes publicly available. The adoption of such a policy by editors, editorial boards, and reviewers will improve the ability to replicate scientific results, and will also make computational astronomy methods more available to other researchers who wish to apply them to their data.

  26. Toward the Geoscience Paper of the Future: Best practices for documenting and sharing research from data to software to provenance

    NASA Astrophysics Data System (ADS)

    Gil, Yolanda; David, Cédric H.; Demir, Ibrahim; Essawy, Bakinam T.; Fulweiler, Robinson W.; Goodall, Jonathan L.; Karlstrom, Leif; Lee, Huikyo; Mills, Heath J.; Oh, Ji-Hyun; Pierce, Suzanne A.; Pope, Allen; Tzeng, Mimi W.; Villamizar, Sandra R.; Yu, Xuan

    2016-10-01

    Geoscientists now live in a world rich with digital data and methods, and their computational research cannot be fully captured in traditional publications. The Geoscience Paper of the Future (GPF) presents an approach to fully document, share, and cite all their research products including data, software, and computational provenance. This article proposes best practices for GPF authors to make data, software, and methods openly accessible, citable, and well documented. The publication of digital objects empowers scientists to manage their research products as valuable scientific assets in an open and transparent way that enables broader access by other scientists, students, decision makers, and the public. Improving documentation and dissemination of research will accelerate the pace of scientific discovery by improving the ability of others to build upon published work.

  27. The Relationship between Teachers' Computer Self-Efficacy and Technology Integration in a School District's Bring Your Own Technology Initiative

    ERIC Educational Resources Information Center

    Ellis, Ashley F.

    2014-01-01

    The purpose of this mixed methods program evaluation study was to investigate the ways in which one public school district and its teachers implemented a Bring Your Own Technology (BYOT) initiative. This study also measured teachers' computer self-efficacy, as measured by Cassidy and Eachus' (2002) Computer User Self-Efficacy Scale, and…

  28. 32 CFR 701.38 - Technical data.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... OFFICIAL RECORDS AVAILABILITY OF DEPARTMENT OF THE NAVY RECORDS AND PUBLICATION OF DEPARTMENT OF THE NAVY DOCUMENTS AFFECTING THE PUBLIC FOIA Definitions and Terms § 701.38 Technical data. Recorded information, regardless of form or method of the recording, of a scientific or technical nature (including computer...

  29. 32 CFR 701.38 - Technical data.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... OFFICIAL RECORDS AVAILABILITY OF DEPARTMENT OF THE NAVY RECORDS AND PUBLICATION OF DEPARTMENT OF THE NAVY DOCUMENTS AFFECTING THE PUBLIC FOIA Definitions and Terms § 701.38 Technical data. Recorded information, regardless of form or method of the recording, of a scientific or technical nature (including computer...

  30. 32 CFR 701.38 - Technical data.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... OFFICIAL RECORDS AVAILABILITY OF DEPARTMENT OF THE NAVY RECORDS AND PUBLICATION OF DEPARTMENT OF THE NAVY DOCUMENTS AFFECTING THE PUBLIC FOIA Definitions and Terms § 701.38 Technical data. Recorded information, regardless of form or method of the recording, of a scientific or technical nature (including computer...

  31. Comprehensive Thematic T-Matrix Reference Database: A 2015-2017 Update

    NASA Technical Reports Server (NTRS)

    Mishchenko, Michael I.; Zakharova, Nadezhda; Khlebtsov, Nikolai G.; Videen, Gorden; Wriedt, Thomas

    2017-01-01

    The T-matrix method pioneered by Peter C. Waterman is one of the most versatile and efficient numerically exact computer solvers of the time-harmonic macroscopic Maxwell equations. It is widely used for the computation of electromagnetic scattering by single and composite particles, discrete random media, periodic structures (including metamaterials), and particles in the vicinity of plane or rough interfaces separating media with different refractive indices. This paper is the eighth update to the comprehensive thematic database of peer-reviewed T-matrix publications initiated in 2004 and lists relevant publications that have appeared since 2015. It also references a small number of earlier publications overlooked previously.

  32. Comprehensive thematic T-matrix reference database: A 2015-2017 update

    NASA Astrophysics Data System (ADS)

    Mishchenko, Michael I.; Zakharova, Nadezhda T.; Khlebtsov, Nikolai G.; Videen, Gorden; Wriedt, Thomas

    2017-11-01

    The T-matrix method pioneered by Peter C. Waterman is one of the most versatile and efficient numerically exact computer solvers of the time-harmonic macroscopic Maxwell equations. It is widely used for the computation of electromagnetic scattering by single and composite particles, discrete random media, periodic structures (including metamaterials), and particles in the vicinity of plane or rough interfaces separating media with different refractive indices. This paper is the eighth update to the comprehensive thematic database of peer-reviewed T-matrix publications initiated in 2004 and lists relevant publications that have appeared since 2015. It also references a small number of earlier publications overlooked previously.

  33. Cost-Benefit Analysis for ECIA Chapter 1 and State DPPF Programs Comparing Groups Receiving Regular Program Instruction and Groups Receiving Computer Assisted Instruction/Computer Management System (CAI/CMS). 1986-87.

    ERIC Educational Resources Information Center

    Chamberlain, Ed

    A cost-benefit study was conducted to determine the effectiveness of a computer assisted instruction/computer management system (CAI/CMS) as an alternative to conventional methods of teaching reading within Chapter 1 and DPPF-funded programs of the Columbus (Ohio) Public Schools. The Chapter 1 funded Compensatory Language Experiences and Reading…

  34. Verification of Spatial Forecasts of Continuous Meteorological Variables Using Categorical and Object-Based Methods

    DTIC Science & Technology

    2016-08-01

    Verification of Spatial Forecasts of Continuous Meteorological Variables Using Categorical and Object-Based Methods, by John W Raby and Huaqing Cai, Computational and Information Sciences Directorate, ARL. Approved for public release; distribution...

  35. Cumulative reports and publications through December 31, 1989

    NASA Technical Reports Server (NTRS)

    1990-01-01

    A complete list of reports from the Institute for Computer Applications in Science and Engineering (ICASE) is presented. The major categories of the current ICASE research program are: numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; control and parameter identification problems, with emphasis on effectual numerical methods; computational problems in engineering and the physical sciences, particularly fluid dynamics, acoustics, structural analysis, and chemistry; computer systems and software, especially vector and parallel computers, microcomputers, and data management. Since ICASE reports are intended to be preprints of articles that will appear in journals or conference proceedings, the published reference is included when it is available.

  36. Large-scale virtual screening on public cloud resources with Apache Spark.

    PubMed

    Capuccini, Marco; Ahmed, Laeeq; Schaal, Wesley; Laure, Erwin; Spjuth, Ola

    2017-01-01

    Structure-based virtual screening is an in-silico method to screen a target receptor against a virtual molecular library. Applying docking-based screening to large molecular libraries can be computationally expensive; however, it constitutes a trivially parallelizable task. Most of the available parallel implementations are based on the message passing interface, relying on low-failure-rate hardware and fast network connections. Google's MapReduce revolutionized large-scale analysis, enabling the processing of massive datasets on commodity hardware and cloud resources, providing transparent scalability and fault tolerance at the software level. Open source implementations of MapReduce include Apache Hadoop and the more recent Apache Spark. We developed a method to run existing docking-based screening software on distributed cloud resources, utilizing the MapReduce approach. We benchmarked our method, which is implemented in Apache Spark, docking a publicly available target receptor against ~2.2 M compounds. The performance experiments show a good parallel efficiency (87%) when running in a public cloud environment. Our method enables parallel structure-based virtual screening on public cloud resources or commodity computer clusters. The degree of scalability that we achieve allows for trying out our method on relatively small libraries first and then scaling to larger libraries. Our implementation is named Spark-VS and is freely available as open source from GitHub (https://github.com/mcapuccini/spark-vs).
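
    The MapReduce pattern described above can be sketched in a few lines of PySpark. The docking call below is a hypothetical stand-in (Spark-VS actually invokes real docking software on each ligand), and the library file name is made up, so treat this as the shape of the computation only.

    ```python
    # Hedged sketch of map-style virtual screening on Spark: score every
    # ligand in a SMILES library in parallel and keep the best-scoring hits.
    from pyspark import SparkContext

    def dock_score(smiles: str) -> float:
        # Placeholder for a call to docking software; higher = better here.
        return float(len(smiles) % 10)

    sc = SparkContext(appName="toy-virtual-screen")
    library = sc.textFile("library.smi")              # one SMILES per line (hypothetical file)
    top = (library.map(lambda s: (dock_score(s), s))  # map: score each ligand
                  .top(100))                          # reduce: keep 100 best scores
    for score, smiles in top:
        print(score, smiles)
    sc.stop()
    ```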

  37. Computational Methods for Failure Analysis and Life Prediction

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K. (Compiler); Harris, Charles E. (Compiler); Housner, Jerrold M. (Compiler); Hopkins, Dale A. (Compiler)

    1993-01-01

    This conference publication contains the presentations and discussions from the joint UVA/NASA Workshop on Computational Methods for Failure Analysis and Life Prediction held at NASA Langley Research Center 14-15 Oct. 1992. The presentations focused on damage failure and life predictions of polymer-matrix composite structures. They covered some of the research activities at NASA Langley, NASA Lewis, Southwest Research Institute, industry, and universities. Both airframes and propulsion systems were considered.

  38. Whole earth modeling: developing and disseminating scientific software for computational geophysics.

    NASA Astrophysics Data System (ADS)

    Kellogg, L. H.

    2016-12-01

    Historically, a great deal of specialized scientific software for modeling and data analysis has been developed by individual researchers or small groups of scientists working on their own specific research problems. As the magnitude of available data and computer power has increased, so has the complexity of scientific problems addressed by computational methods, creating both a need to sustain existing scientific software and a need to expand its development to take advantage of new algorithms, new software approaches, and new computational hardware. To that end, communities like the Computational Infrastructure for Geodynamics (CIG) have been established to support the use of best practices in scientific computing for solid earth geophysics research and teaching. Working as a scientific community enables computational geophysicists to take advantage of technological developments, improve the accuracy and performance of software, build on prior software development, and collaborate more readily. The CIG community, and others, have adopted an open-source development model, in which code is developed and disseminated by the community in an open fashion, using version control and software repositories like Git. One emerging issue is how to adequately identify and credit the intellectual contributions involved in creating open source scientific software. The traditional method of disseminating scientific ideas, peer-reviewed publication, was not designed for reviewing or crediting scientific software, although emerging publication strategies such as software journals are attempting to address the need. We are piloting an integrated approach in which authors are identified and credited as scientific software is developed and run. Successful software citation requires integration with scholarly publication and indexing mechanisms as well, to assign credit, ensure discoverability, and provide provenance for software.

  39. BEYOND THE PRINT—VIRTUAL PALEONTOLOGY IN SCIENCE PUBLISHING, OUTREACH, AND EDUCATION

    PubMed Central

    LAUTENSCHLAGER, STEPHAN; RÜCKLIN, MARTIN

    2015-01-01

    Virtual paleontology unites a variety of computational techniques and methods for the visualization and analysis of fossils. Due to their great potential and increasing availability, these methods have become immensely popular in the last decade. However, communicating the wealth of digital information and results produced by the various techniques is still hampered by traditional methods of publication. Transferring and processing three-dimensional information, such as interactive models or animations, into scientific publications still poses a challenge. Here, we present different methods and applications to communicate digital data in academia, outreach and education. Three-dimensional PDFs, QR codes, anaglyph stereo imaging, and rapid prototyping—methods routinely used in the engineering, entertainment, or medical industries—are outlined and evaluated for their potential in science publishing and public engagement. Although limitations remain, these are simple, mostly cost-effective, and powerful tools to create novel and innovative resources for education, public engagement, or outreach. PMID:26306051

  40. Equivalency of Paper versus Tablet Computer Survey Data

    ERIC Educational Resources Information Center

    Ravert, Russell D.; Gomez-Scott, Jessica; Donnellan, M. Brent

    2015-01-01

    Survey responses collected via paper surveys and computer tablets were compared to test for differences between those methods of obtaining self-report data. College students (N = 258) were recruited in public campus locations and invited to complete identical surveys on either paper or iPad tablet. Only minor homogeneity differences were found…

  41. Letter to the Editor: Use of Publicly Available Image Resources

    DOE PAGES

    Armato, Samuel G.; Drukker, Karen; Li, Feng; ...

    2017-05-11

    Here we write with regard to the Academic Radiology article entitled “Computer-aided Diagnosis for Lung Cancer: Usefulness of Nodule Heterogeneity” by Drs. Nishio and Nagashima (1). The authors report on a computerized method to classify lung nodules present in computed tomography (CT) scans as benign or malignant.

  42. Computer Software: Does It Support a New View of Reading?

    ERIC Educational Resources Information Center

    Case, Carolyn J.

    A study examined commercially available computer software to ascertain its degree of congruency with current methods of reading instruction (the Interactive model) at the first and second grade levels. A survey was conducted of public school educators in Connecticut and experts in the field to determine their level of satisfaction with available…

  43. A Workshop on the Integration of Numerical and Symbolic Computing Methods Held in Saratoga Springs, New York on July 9-11, 1990

    DTIC Science & Technology

    1991-04-01

    SUMMARY OF COMPLETED PROJECT (for public use): The summary (about 200 words) must be self-contained and intelligible to a scientifically literate reader...dialogue among researchers in symbolic methods and numerical computation, and their applications in certain disciplines of artificial intelligence...Lozano-Perez, Artificial Intelligence Laboratory, Massachusetts Institute of Technology; Purdue University, West Lafayette, IN 47907

  44. Structural dynamics branch research and accomplishments to FY 1992

    NASA Technical Reports Server (NTRS)

    Lawrence, Charles

    1992-01-01

    This publication contains a collection of fiscal year 1992 research highlights from the Structural Dynamics Branch at NASA LeRC. Highlights from the branch's major work areas--Aeroelasticity, Vibration Control, Dynamic Systems, and Computational Structural Methods--are included in the report, as well as a listing of the fiscal year 1992 branch publications.

  45. 37 CFR 201.26 - Recordation of documents pertaining to computer shareware and donation of public domain computer...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... pertaining to computer shareware and donation of public domain computer software. 201.26 Section 201.26... public domain computer software. (a) General. This section prescribes the procedures for submission of legal documents pertaining to computer shareware and the deposit of public domain computer software...

  46. 37 CFR 201.26 - Recordation of documents pertaining to computer shareware and donation of public domain computer...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... pertaining to computer shareware and donation of public domain computer software. 201.26 Section 201.26... public domain computer software. (a) General. This section prescribes the procedures for submission of legal documents pertaining to computer shareware and the deposit of public domain computer software...

  47. 37 CFR 201.26 - Recordation of documents pertaining to computer shareware and donation of public domain computer...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... pertaining to computer shareware and donation of public domain computer software. 201.26 Section 201.26... public domain computer software. (a) General. This section prescribes the procedures for submission of legal documents pertaining to computer shareware and the deposit of public domain computer software...

  48. 37 CFR 201.26 - Recordation of documents pertaining to computer shareware and donation of public domain computer...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... pertaining to computer shareware and donation of public domain computer software. 201.26 Section 201.26... public domain computer software. (a) General. This section prescribes the procedures for submission of legal documents pertaining to computer shareware and the deposit of public domain computer software...

  49. Bringing computational science to the public.

    PubMed

    McDonagh, James L; Barker, Daniel; Alderson, Rosanna G

    2016-01-01

    The increasing use of computers in science allows for the scientific analyses of large datasets at an increasing pace. We provided examples and interactive demonstrations at Dundee Science Centre as part of the 2015 Women in Science festival, to present aspects of computational science to the general public. We used low-cost Raspberry Pi computers to provide hands on experience in computer programming and demonstrated the application of computers to biology. Computer games were used as a means to introduce computers to younger visitors. The success of the event was evaluated by voluntary feedback forms completed by visitors, in conjunction with our own self-evaluation. This work builds on the original work of the 4273π bioinformatics education program of Barker et al. (2013, BMC Bioinform. 14:243). 4273π provides open source education materials in bioinformatics. This work looks at the potential to adapt similar materials for public engagement events. It appears, at least in our small sample of visitors (n = 13), that basic computational science can be conveyed to people of all ages by means of interactive demonstrations. Children as young as five were able to successfully edit simple computer programs with supervision. This was, in many cases, their first experience of computer programming. The feedback is predominantly positive, showing strong support for improving computational science education, but also included suggestions for improvement. Our conclusions are necessarily preliminary. However, feedback forms suggest methods were generally well received among the participants; "Easy to follow. Clear explanation" and "Very easy. Demonstrators were very informative." Our event, held at a local Science Centre in Dundee, demonstrates that computer games and programming activities suitable for young children can be performed alongside a more specialised and applied introduction to computational science for older visitors.

  50. Information Dissemination of Public Health Emergency on Social Networks and Intelligent Computation

    PubMed Central

    Hu, Hongzhi; Mao, Huajuan; Hu, Xiaohua; Hu, Feng; Sun, Xuemin; Jing, Zaiping; Duan, Yunsuo

    2015-01-01

    Due to the extensive social influence, public health emergency has attracted great attention in today's society. The booming social network is becoming a main information dissemination platform of those events and caused high concerns in emergency management, among which a good prediction of information dissemination in social networks is necessary for estimating the event's social impacts and making a proper strategy. However, information dissemination is largely affected by complex interactive activities and group behaviors in social network; the existing methods and models are limited to achieve a satisfactory prediction result due to the open changeable social connections and uncertain information processing behaviors. ACP (artificial societies, computational experiments, and parallel execution) provides an effective way to simulate the real situation. In order to obtain better information dissemination prediction in social networks, this paper proposes an intelligent computation method under the framework of TDF (Theory-Data-Feedback) based on ACP simulation system which was successfully applied to the analysis of A (H1N1) Flu emergency. PMID:26609303

  51. Information Dissemination of Public Health Emergency on Social Networks and Intelligent Computation.

    PubMed

    Hu, Hongzhi; Mao, Huajuan; Hu, Xiaohua; Hu, Feng; Sun, Xuemin; Jing, Zaiping; Duan, Yunsuo

    2015-01-01

    Due to the extensive social influence, public health emergency has attracted great attention in today's society. The booming social network is becoming a main information dissemination platform of those events and caused high concerns in emergency management, among which a good prediction of information dissemination in social networks is necessary for estimating the event's social impacts and making a proper strategy. However, information dissemination is largely affected by complex interactive activities and group behaviors in social network; the existing methods and models are limited to achieve a satisfactory prediction result due to the open changeable social connections and uncertain information processing behaviors. ACP (artificial societies, computational experiments, and parallel execution) provides an effective way to simulate the real situation. In order to obtain better information dissemination prediction in social networks, this paper proposes an intelligent computation method under the framework of TDF (Theory-Data-Feedback) based on ACP simulation system which was successfully applied to the analysis of A (H1N1) Flu emergency.

  52. Cumulative reports and publications through December 31, 1984

    NASA Technical Reports Server (NTRS)

    1985-01-01

    A complete list of the Institute for Computer Applications in Science and Engineering (ICASE) reports is given. Since ICASE reports are intended to be preprints of articles that will appear in journals or conference proceedings, the published reference is included when it is available. Topics include numerical methods, parameter identification, fluid dynamics, acoustics, structural analysis, and computers.

  53. ANALYTIC ELEMENT MODELING FOR SOURCE WATER ASSESSMENTS OF PUBLIC WATER SUPPLY WELLS: CASE STUDIES IN GLACIAL OUTWASH AND BASIN-AND-RANGE

    EPA Science Inventory

    Over the last 10 years the EPA has invested in analytic elements as a computational method used in public domain software supporting capture zone delineation for source water assessments and wellhead protection. The current release is called WhAEM2000 (wellhead analytic element ...

  54. Distributed Factorization Computation on Multiple Volunteered Mobile Resource to Break RSA Key

    NASA Astrophysics Data System (ADS)

    Jaya, I.; Hardi, S. M.; Tarigan, J. T.; Zamzami, E. M.; Sihombing, P.

    2017-01-01

    As with other common asymmetric encryption schemes, RSA can be cracked using a series of mathematical calculations: the private key used to decrypt a message can be computed from the public key. However, finding the private key may require a massive amount of calculation. In this paper, we propose a method to perform distributed computation of RSA's private key. The proposed method uses multiple volunteered mobile devices that contribute during the calculation process. Our objective is to demonstrate how volunteer computing on mobile devices may be a feasible option for reducing the time required to break a weak RSA encryption, and to observe the behavior and running time of the application on mobile devices.
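
    The abstract does not detail the factorization algorithm or work-unit protocol, so the sketch below only illustrates the partitioning idea: disjoint search ranges are handed to volunteer devices, and the first factor found yields the private exponent. Trial division on a tiny textbook modulus stands in for whatever method the authors actually distribute.

    ```python
    # Hedged sketch: distribute a factor search over disjoint ranges, then
    # derive the private exponent d from the recovered factor (textbook RSA).
    from math import isqrt

    n, e = 3233, 17                       # toy public key (n = 53 * 61)

    def search_range(lo: int, hi: int):
        """Work unit a single volunteer device would receive."""
        for p in range(lo | 1, hi, 2):    # odd candidates only
            if n % p == 0:
                return p
        return None

    # Simulate handing disjoint chunks to several devices.
    chunk, found = 20, None
    for lo in range(3, isqrt(n) + 1, chunk):
        found = search_range(lo, lo + chunk)
        if found:
            break

    p = found
    q = n // p
    d = pow(e, -1, (p - 1) * (q - 1))     # private exponent
    print(p, q, d)                        # 53 61 2753
    ```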

  55. Computer Science and Technology Publications. NBS Publications List 84.

    ERIC Educational Resources Information Center

    National Bureau of Standards (DOC), Washington, DC. Inst. for Computer Sciences and Technology.

    This bibliography lists publications of the Institute for Computer Sciences and Technology of the National Bureau of Standards. Publications are listed by subject in the areas of computer security, computer networking, and automation technology. Sections list publications of: (1) current Federal Information Processing Standards; (2) computer…

  56. Deep classification hashing for person re-identification

    NASA Astrophysics Data System (ADS)

    Wang, Jiabao; Li, Yang; Zhang, Xiancai; Miao, Zhuang; Tao, Gang

    2018-04-01

    With the spread of surveillance in public spaces, person re-identification is becoming more and more important. Large-scale databases call for efficient computation and storage, and hashing is one of the most important techniques. In this paper, we propose a new deep classification hashing network that introduces a new binary appropriation layer into traditional ImageNet-pretrained CNN models. It outputs binary-appropriate features, which can be easily quantized into binary hash codes for Hamming similarity comparison. Experiments show that our deep hashing method can outperform state-of-the-art methods on the public CUHK03 and Market1501 datasets.
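
    The retrieval step implied above (quantize near-binary activations to hash codes, then rank by Hamming distance) can be sketched as follows. The feature vectors are random stand-ins for the outputs of the paper's binary appropriation layer.

    ```python
    # Hedged sketch: binarize features and rank a gallery by Hamming distance.
    import numpy as np

    rng = np.random.default_rng(1)
    gallery = rng.random((1000, 64))   # 1000 gallery images, 64-dim features
    query = rng.random(64)

    def to_code(features: np.ndarray) -> np.ndarray:
        return (features > 0.5).astype(np.uint8)  # threshold to {0, 1}

    codes = to_code(gallery)
    q = to_code(query)
    hamming = (codes != q).sum(axis=1)  # per-code count of differing bits
    print(np.argsort(hamming)[:5])      # indices of the 5 nearest identities
    ```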

  57. Bringing numerous methods for expression and promoter analysis to a public cloud computing service.

    PubMed

    Polanski, Krzysztof; Gao, Bo; Mason, Sam A; Brown, Paul; Ott, Sascha; Denby, Katherine J; Wild, David L

    2018-03-01

    Every year, a large number of novel algorithms are introduced to the scientific community for a myriad of applications, but using these across different research groups is often troublesome, due to suboptimal implementations and specific dependency requirements. This does not have to be the case, as public cloud computing services can easily house tractable implementations within self-contained dependency environments, making the methods easily accessible to a wider public. We have taken 14 popular methods, the majority related to expression data or promoter analysis, developed these up to a good implementation standard and housed the tools in isolated Docker containers which we integrated into the CyVerse Discovery Environment, making these easily usable for a wide community as part of the CyVerse UK project. The integrated apps can be found at http://www.cyverse.org/discovery-environment, while the raw code is available at https://github.com/cyversewarwick and the corresponding Docker images are housed at https://hub.docker.com/r/cyversewarwick/. Contact: info@cyverse.warwick.ac.uk or D.L.Wild@warwick.ac.uk. Supplementary data are available at Bioinformatics online.

  58. 37 CFR 201.26 - Recordation of documents pertaining to computer shareware and donation of public domain computer...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... pertaining to computer shareware and donation of public domain computer software. 201.26 Section 201.26... of public domain computer software. (a) General. This section prescribes the procedures for... software under section 805 of Public Law 101-650, 104 Stat. 5089 (1990). Documents recorded in the...

  59. Secure and Efficient Signature Scheme Based on NTRU for Mobile Payment

    NASA Astrophysics Data System (ADS)

    Xia, Yunhao; You, Lirong; Sun, Zhe; Sun, Zhixin

    2017-10-01

    Mobile payment is becoming more and more popular; however, traditional public-key encryption algorithms place high demands on hardware and are not well suited to mobile terminals with limited computing resources. In addition, these public-key encryption algorithms are not resistant to quantum computing. This paper studies the NTRU public-key algorithm, which resists quantum attacks, by analyzing the influence of the parameters q and k on the probability of generating a reasonable signature value. Two methods are proposed to improve that probability: first, increase the value of the parameter q; second, add an authentication condition that must be met for a reasonable signature during the signing phase. Experimental results show that the proposed scheme leaks no private-key information through the signature value and increases the probability of generating a reasonable signature. It also improves the signing rate and avoids propagation of invalid signatures in the network, although parameter selection is subject to certain restrictions.

  60. A Computational Approach to Qualitative Analysis in Large Textual Datasets

    PubMed Central

    Evans, Michael S.

    2014-01-01

    In this paper I introduce computational techniques to extend qualitative analysis into the study of large textual datasets. I demonstrate these techniques by using probabilistic topic modeling to analyze a broad sample of 14,952 documents published in major American newspapers from 1980 through 2012. I show how computational data mining techniques can identify and evaluate the significance of qualitatively distinct subjects of discussion across a wide range of public discourse. I also show how examining large textual datasets with computational methods can overcome methodological limitations of conventional qualitative methods, such as how to measure the impact of particular cases on broader discourse, how to validate substantive inferences from small samples of textual data, and how to determine if identified cases are part of a consistent temporal pattern. PMID:24498398
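
    A minimal sketch of the core technique, probabilistic topic modeling, is shown below on a four-document toy corpus using scikit-learn; the study's own pipeline, run over 14,952 newspaper documents, is of course far larger.

    ```python
    # Hedged sketch: LDA topic modeling on a tiny hypothetical corpus.
    from sklearn.decomposition import LatentDirichletAllocation
    from sklearn.feature_extraction.text import CountVectorizer

    docs = [  # made-up stand-ins for newspaper articles
        "city council votes on school budget and taxes",
        "new vaccine trial shows strong immune response",
        "school teachers protest budget cuts at city hall",
        "hospital reports flu vaccine shortage this winter",
    ]

    vec = CountVectorizer(stop_words="english")
    counts = vec.fit_transform(docs)
    lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(counts)

    vocab = vec.get_feature_names_out()
    for k, topic in enumerate(lda.components_):   # top words per topic
        top_words = [vocab[i] for i in topic.argsort()[-4:][::-1]]
        print(f"topic {k}: {top_words}")
    ```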

  61. Key management of the double random-phase-encoding method using public-key encryption

    NASA Astrophysics Data System (ADS)

    Saini, Nirmala; Sinha, Aloka

    2010-03-01

    Public-key encryption has been used to encode the key of the encryption process. In the proposed technique, an input image is encrypted with the double random-phase-encoding method using the extended fractional Fourier transform. The keys of the encryption process are encoded using the Rivest-Shamir-Adleman (RSA) public-key encryption algorithm, and the encoded key is then transmitted to the receiver along with the encrypted image. In the decryption process, the encoded key is first decrypted using the secret key, and the encrypted image is then decrypted using the retrieved key parameters. The proposed technique has an advantage over the double random-phase-encoding method in that the problem associated with transmitting the key is eliminated by public-key encryption. Computer simulations have been carried out to validate the proposed technique.
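
    A compact numpy sketch of double random-phase encoding is given below, using ordinary Fourier transforms in place of the paper's extended fractional Fourier transform. The mask seed plays the role of the key material that RSA would encrypt for transmission; the RSA step itself is omitted.

    ```python
    # Hedged sketch of double random-phase encoding (Fourier-domain variant).
    import numpy as np

    def phase_masks(shape, seed):
        rng = np.random.default_rng(seed)
        m1 = np.exp(2j * np.pi * rng.random(shape))  # input-plane mask
        m2 = np.exp(2j * np.pi * rng.random(shape))  # Fourier-plane mask
        return m1, m2

    def drpe_encrypt(img, seed):
        m1, m2 = phase_masks(img.shape, seed)
        return np.fft.ifft2(np.fft.fft2(img * m1) * m2)

    def drpe_decrypt(enc, seed):
        m1, m2 = phase_masks(enc.shape, seed)
        return np.abs(np.fft.ifft2(np.fft.fft2(enc) / m2) / m1)

    img = np.random.rand(8, 8)                      # toy "image"
    enc = drpe_encrypt(img, seed=42)                # seed = the shared key
    print(np.allclose(drpe_decrypt(enc, 42), img))  # True
    ```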

  62. Fixed-Base Comb with Window-Non-Adjacent Form (NAF) Method for Scalar Multiplication

    PubMed Central

    Seo, Hwajeong; Kim, Hyunjin; Park, Taehwan; Lee, Yeoncheol; Liu, Zhe; Kim, Howon

    2013-01-01

    Elliptic curve cryptography (ECC) is one of the most promising public-key techniques in terms of short key size and various crypto protocols. For this reason, many studies on the implementation of ECC on resource-constrained devices within a practical execution time have been conducted. To this end, we must focus on scalar multiplication, which is the most expensive operation in ECC. A number of studies have proposed pre-computation and advanced scalar multiplication using a non-adjacent form (NAF) representation, and more sophisticated approaches have employed a width-w NAF representation and a modified pre-computation table. In this paper, we propose a new pre-computation method in which zero occurrences are much more frequent than in previous methods. This method can be applied to ordinary group scalar multiplication, but it requires a large pre-computation table, so we combined the previous method with ours for practical purposes. This novel structure introduces a feature that finely adjusts the trade-off between speed and table size, so the pre-computation table can be customized for a given purpose. Finally, we can establish a customized look-up table for embedded microprocessors. PMID:23881143
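
    For reference, the plain NAF recoding that these width-w and comb techniques build on can be written in a few lines; this is the textbook algorithm, not the paper's modified pre-computation table.

    ```python
    # Textbook non-adjacent form (NAF): digits in {-1, 0, 1} with no two
    # adjacent non-zeros, reducing point additions in scalar multiplication.
    def naf(k: int) -> list:
        """Return the NAF digits of k, least significant first."""
        digits = []
        while k > 0:
            if k % 2:
                d = 2 - (k % 4)   # +1 if k = 1 (mod 4), -1 if k = 3 (mod 4)
                k -= d
            else:
                d = 0
            digits.append(d)
            k //= 2
        return digits

    print(naf(7))    # [-1, 0, 0, 1]    since 7 = -1 + 8
    print(naf(13))   # [1, 0, -1, 0, 1] since 13 = 1 - 4 + 16
    ```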

  63. Advanced wave field sensing using computational shear interferometry

    NASA Astrophysics Data System (ADS)

    Falldorf, Claas; Agour, Mostafa; Bergmann, Ralf B.

    2014-07-01

    In this publication we give a brief introduction to the field of Computational Shear Interferometry (CoSI), which allows arbitrary wave fields to be determined from a set of shear interferograms. We discuss limitations of the method with respect to the coherence of the underlying wave field and present various numerical methods to recover it from its sheared representations. Finally, we show experimental results on digital holography of objects with rough surfaces using a fiber-coupled light-emitting diode, as well as quantitative phase-contrast imaging and numerical refocusing in Differential Interference Contrast (DIC) microscopy.

  4. Computational Fluid Dynamics Symposium on Aeropropulsion

    NASA Technical Reports Server (NTRS)

    1991-01-01

    Recognizing the considerable advances that have been made in computational fluid dynamics, the Internal Fluid Mechanics Division of NASA Lewis Research Center sponsored this symposium with the objective of providing a forum for exchanging information regarding recent developments in numerical methods, physical and chemical modeling, and applications. This conference publication is a compilation of 4 invited and 34 contributed papers presented in six sessions: algorithms one and two, turbomachinery, turbulence, components application, and combustors. Topics include numerical methods, grid generation, chemically reacting flows, turbulence modeling, inlets, nozzles, and unsteady flows.

  5. Computer-aided drug discovery research at a global contract research organization

    NASA Astrophysics Data System (ADS)

    Kitchen, Douglas B.

    2017-03-01

    Computer-aided drug discovery started at Albany Molecular Research, Inc. in 1997. Over nearly 20 years, the role of cheminformatics and computational chemistry has grown throughout the pharmaceutical industry and at AMRI. This paper will describe the infrastructure and roles of CADD throughout drug discovery and some of the lessons learned regarding the success of several methods. Various contributions provided by computational chemistry and cheminformatics in chemical library design, hit triage, hit-to-lead and lead optimization are discussed. Some frequently used computational chemistry techniques are described. The ways in which they may contribute to discovery projects are presented based on a few examples from recent publications.

  6. Computer-aided drug discovery research at a global contract research organization.

    PubMed

    Kitchen, Douglas B

    2017-03-01

    Computer-aided drug discovery started at Albany Molecular Research, Inc. in 1997. Over nearly 20 years, the role of cheminformatics and computational chemistry has grown throughout the pharmaceutical industry and at AMRI. This paper will describe the infrastructure and roles of CADD throughout drug discovery and some of the lessons learned regarding the success of several methods. Various contributions provided by computational chemistry and cheminformatics in chemical library design, hit triage, hit-to-lead and lead optimization are discussed. Some frequently used computational chemistry techniques are described. The ways in which they may contribute to discovery projects are presented based on a few examples from recent publications.

  7. Computers in Public Broadcasting: Who, What, Where.

    ERIC Educational Resources Information Center

    Yousuf, M. Osman

    This handbook offers guidance to public broadcasting managers on computer acquisition and development activities. Based on a 1981 survey of planned and current computer uses conducted by the Corporation for Public Broadcasting (CPB) Information Clearinghouse, computer systems in public radio and television broadcasting stations are listed by…

  8. Gaussian Multiscale Aggregation Applied to Segmentation in Hand Biometrics

    PubMed Central

    de Santos Sierra, Alberto; Ávila, Carmen Sánchez; Casanova, Javier Guerra; del Pozo, Gonzalo Bailador

    2011-01-01

    This paper presents an image segmentation algorithm based on Gaussian multiscale aggregation oriented to hand biometric applications. The method is able to isolate the hand from a wide variety of background textures such as carpets, fabric, glass, grass, soil or stones. The evaluation was carried out by using a publicly available synthetic database with 408,000 hand images in different backgrounds, comparing the performance in terms of accuracy and computational cost to two competitive segmentation methods existing in the literature, namely Lossy Data Compression (LDC) and Normalized Cuts (NCuts). The results highlight that the proposed method outperforms current competitive segmentation methods with regard to computational cost, time performance, accuracy and memory usage. PMID:22247658

  9. Gaussian multiscale aggregation applied to segmentation in hand biometrics.

    PubMed

    de Santos Sierra, Alberto; Avila, Carmen Sánchez; Casanova, Javier Guerra; del Pozo, Gonzalo Bailador

    2011-01-01

    This paper presents an image segmentation algorithm based on Gaussian multiscale aggregation oriented to hand biometric applications. The method is able to isolate the hand from a wide variety of background textures such as carpets, fabric, glass, grass, soil or stones. The evaluation was carried out by using a publicly available synthetic database with 408,000 hand images in different backgrounds, comparing the performance in terms of accuracy and computational cost to two competitive segmentation methods existing in the literature, namely Lossy Data Compression (LDC) and Normalized Cuts (NCuts). The results highlight that the proposed method outperforms current competitive segmentation methods with regard to computational cost, time performance, accuracy and memory usage.

  10. Computer Attitude, and the Impact of Personal Characteristics and Information and Communication Technology Adoption Patterns on Performance of Teaching Faculty in Higher Education in Ghana, West Africa

    ERIC Educational Resources Information Center

    Larbi-Apau, Josephine A.

    2011-01-01

    This study examined computer attitude, and the impact of personal characteristics and ICT adoption patterns on performance of multidisciplinary teaching faculty in three public universities in Ghana. A cross-sectional, mixed-methods research design was applied to collect data and information. Quantitative data from 164 respondents were analyzed…

  11. Establishing Information Security Systems via Optical Imaging

    DTIC Science & Technology

    2015-08-11

    Figure captions only: computational ghost imaging setup with SLM (spatial light modulator), BSC (non-polarizing beam splitter cube), and CCD (charge-coupled device); Fig. 5, a schematic setup for the proposed method using holography, showing the interference between reference and object beams. Distribution Code A: Approved for public release, distribution is unlimited.

  12. Methods for design and evaluation of integrated hardware-software systems for concurrent computation

    NASA Technical Reports Server (NTRS)

    Pratt, T. W.

    1985-01-01

    Research activities and publications are briefly summarized. The major tasks reviewed are: (1) VAX implementation of the PISCES parallel programming environment; (2) Apollo workstation network implementation of the PISCES environment; (3) FLEX implementation of the PISCES environment; (4) sparse matrix iterative solver in PISCES Fortran; (5) image processing application of PISCES; and (6) a formal model of concurrent computation being developed.

  13. High-performance parallel computing in the classroom using the public goods game as an example

    NASA Astrophysics Data System (ADS)

    Perc, Matjaž

    2017-07-01

    The use of computers in statistical physics is common because the sheer number of equations that describe the behaviour of an entire system particle by particle often makes it impossible to solve them exactly. Monte Carlo methods form a particularly important class of numerical methods for solving problems in statistical physics. Although these methods are simple in principle, their proper use requires a good command of statistical mechanics, as well as considerable computational resources. The aim of this paper is to demonstrate how the usage of widely accessible graphics cards on personal computers can elevate the computing power in Monte Carlo simulations by orders of magnitude, thus allowing live classroom demonstration of phenomena that would otherwise be out of reach. As an example, we use the public goods game on a square lattice where two strategies compete for common resources in a social dilemma situation. We show that the second-order phase transition to an absorbing phase in the system belongs to the directed percolation universality class, and we compare the time needed to arrive at this result by means of the main processor and by means of a suitable graphics card. Parallel computing on graphics processing units has been developed actively during the last decade, to the point where today the learning curve for entry is anything but steep for those familiar with programming. The subject is thus ripe for inclusion in graduate and advanced undergraduate curricula, and we hope that this paper will facilitate this process in the realm of physics education. To that end, we provide a documented source code for an easy reproduction of presented results and for further development of Monte Carlo simulations of similar systems.
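
    A minimal single-CPU sketch of the described Monte Carlo simulation is given below: the public goods game on a square lattice with Fermi-rule strategy imitation. The lattice size, synergy factor, noise, and step count are illustrative; the paper's contribution, the GPU parallelization, is not shown.

        # Minimal CPU sketch of the Monte Carlo simulation described above: the
        # public goods game on a square lattice with Fermi-rule strategy
        # imitation. Parameters (lattice size, r, K, steps) are illustrative.
        import random
        import math

        L, r, K = 50, 3.8, 0.5          # lattice size, synergy factor, noise
        lattice = [[random.randint(0, 1) for _ in range(L)] for _ in range(L)]  # 1 = cooperator

        def neighbors(x, y):
            return [((x + 1) % L, y), ((x - 1) % L, y), (x, (y + 1) % L), (x, (y - 1) % L)]

        def payoff(x, y):
            """Accumulated payoff from the five groups the site (x, y) belongs to."""
            total = 0.0
            for gx, gy in [(x, y)] + neighbors(x, y):
                members = [(gx, gy)] + neighbors(gx, gy)
                nc = sum(lattice[i][j] for i, j in members)
                total += r * nc / len(members) - lattice[x][y]   # share minus own contribution
            return total

        for step in range(10000):
            x, y = random.randrange(L), random.randrange(L)
            nx, ny = random.choice(neighbors(x, y))
            if lattice[x][y] != lattice[nx][ny]:
                # Fermi rule: adopt the neighbor's strategy with payoff-dependent probability.
                p = 1.0 / (1.0 + math.exp((payoff(x, y) - payoff(nx, ny)) / K))
                if random.random() < p:
                    lattice[x][y] = lattice[nx][ny]

        print("cooperator fraction:", sum(map(sum, lattice)) / L**2)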

  14. Comprehensive T-matrix Reference Database: A 2009-2011 Update

    NASA Technical Reports Server (NTRS)

    Zakharova, Nadezhda T.; Videen, G.; Khlebtsov, Nikolai G.

    2012-01-01

    The T-matrix method is one of the most versatile and efficient theoretical techniques widely used for the computation of electromagnetic scattering by single and composite particles, discrete random media, and particles in the vicinity of an interface separating two half-spaces with different refractive indices. This paper presents an update to the comprehensive database of peer-reviewed T-matrix publications compiled by us previously and includes the publications that appeared since 2009. It also lists several earlier publications not included in the original database.

  15. An effective and secure key-management scheme for hierarchical access control in E-medicine system.

    PubMed

    Odelu, Vanga; Das, Ashok Kumar; Goswami, Adrijit

    2013-04-01

    Recently, several hierarchical access control schemes have been proposed in the literature to provide security for e-medicine systems. However, most of them are either insecure against the man-in-the-middle attack or require high storage and computational overheads. Wu and Chen proposed a key management method to solve dynamic access control problems in a user hierarchy based on a hybrid cryptosystem. Though their scheme improves computational efficiency over Nikooghadam et al.'s approach, it suffers from a large storage space for public parameters in the public domain and from computational inefficiency due to costly elliptic curve point multiplication. Recently, Nikooghadam and Zakerolhosseini showed that Wu-Chen's scheme is vulnerable to the man-in-the-middle attack. In order to remedy this security weakness in Wu-Chen's scheme, they proposed a secure scheme which is again based on ECC (elliptic curve cryptography) and an efficient one-way hash function. However, their scheme incurs a huge computational cost for providing verification of public information in the public domain, as it uses ECC digital signatures, which are costly when compared to a symmetric-key cryptosystem. In this paper, we propose an effective access control scheme in a user hierarchy which is based only on a symmetric-key cryptosystem and an efficient one-way hash function. We show that our scheme significantly reduces the storage space for both public and private domains, as well as the computational complexity, when compared to Wu-Chen's scheme, Nikooghadam-Zakerolhosseini's scheme, and other related schemes. Through informal and formal security analysis, we further show that our scheme is secure against different attacks, including the man-in-the-middle attack. Moreover, dynamic access control problems in our scheme are also solved efficiently compared to other related schemes, making our scheme much more suitable for practical applications of e-medicine systems.
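
    The flavor of a purely symmetric-key/hash-based hierarchy can be sketched as follows. This is a hypothetical textbook construction for illustration, not the authors' scheme: each class derives its children's keys by hashing its own key with the child's identity, so ancestors can derive descendants' keys while the one-way hash blocks the reverse direction.

        # Hedged sketch of hash-based hierarchical key derivation (a hypothetical
        # construction, not the authors' exact scheme): a security class derives
        # each child's key by hashing its own key with the child's identity, so
        # ancestors can compute descendants' keys but not vice versa.
        import hashlib

        def derive_child_key(parent_key: bytes, child_id: str) -> bytes:
            return hashlib.sha256(parent_key + child_id.encode()).digest()

        # Hypothetical hierarchy: hospital -> cardiology -> patient records.
        root_key = b"\x00" * 32                       # placeholder master secret
        cardiology_key = derive_child_key(root_key, "cardiology")
        records_key = derive_child_key(cardiology_key, "patient-records")

        # The root can re-derive records_key on demand; a holder of records_key
        # cannot feasibly recover cardiology_key or root_key.
        assert records_key == derive_child_key(
            derive_child_key(root_key, "cardiology"), "patient-records")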

  16. A Community Publication and Dissemination System for Hydrology Education Materials

    NASA Astrophysics Data System (ADS)

    Ruddell, B. L.

    2015-12-01

    Hosted by CUAHSI and the Science Education Resource Center (SERC), federated by the National Science Digital Library (NSDL), and allied with the Water Data Center (WDC), Hydrologic Information System (HIS), and HydroShare projects, a simple cyberinfrastructure has been launched for the publication and dissemination of data- and model-driven university hydrology education materials. This lightweight system's metadata describes learning content as a data-driven module with defined data inputs and outputs. This structure allows a user to mix and match modules to create sequences of content that teach both hydrology and computer learning outcomes. Importantly, this modular infrastructure allows an instructor to substitute a module based on updated computer methods for one based on outdated computer methods, hopefully solving the problem of rapid obsolescence that has hampered previous community efforts. The prototype system is now available from CUAHSI and SERC, with some example content. The system is designed to catalog, link to, make visible, and make accessible the existing and future contributions of the community; this system does not create content. Submissions from hydrology educators are eagerly solicited, especially for existing content.

  17. Cut set-based risk and reliability analysis for arbitrarily interconnected networks

    DOEpatents

    Wyss, Gregory D.

    2000-01-01

    Method for computing all-terminal reliability for arbitrarily interconnected networks such as the United States public switched telephone network. The method includes an efficient search algorithm to generate minimal cut sets for nonhierarchical networks directly from the network connectivity diagram. Efficiency of the search algorithm stems in part from its basis on only link failures. The method also includes a novel quantification scheme that likewise reduces computational effort associated with assessing network reliability based on traditional risk importance measures. Vast reductions in computational effort are realized since combinatorial expansion and subsequent Boolean reduction steps are eliminated through analysis of network segmentations using a technique of assuming node failures to occur on only one side of a break in the network, and repeating the technique for all minimal cut sets generated with the search algorithm. The method functions equally well for planar and non-planar networks.
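
    As background, a common way minimal cut sets quantify network unreliability is the rare-event (union bound) approximation under independent link failures. The hypothetical four-node ring below is for illustration and is not the patented algorithm.

        # Sketch of how minimal cut sets quantify network unreliability (a common
        # rare-event approximation, not the patented algorithm itself): with
        # independent link failures, summing each minimal cut set's failure
        # probability upper-bounds the all-terminal unreliability.
        from itertools import combinations

        # Hypothetical 4-node ring network: links a-d with failure probabilities.
        link_fail = {"a": 1e-3, "b": 2e-3, "c": 1e-3, "d": 5e-4}

        # For all-terminal connectivity of a ring, the minimal cut sets are
        # exactly the pairs of links: the ring disconnects iff two links fail.
        min_cut_sets = [set(pair) for pair in combinations(link_fail, 2)]

        def cut_probability(cut):
            p = 1.0
            for link in cut:
                p *= link_fail[link]
            return p

        unreliability_bound = sum(cut_probability(c) for c in min_cut_sets)
        print(f"approximate all-terminal unreliability: {unreliability_bound:.3e}")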

  18. An Innovative Method for Evaluating Strategic Goals in a Public Agency: Conservation Leadership in the U.S. Forest Service

    Treesearch

    David N. Bengston; David P. Fan

    1999-01-01

    This article presents an innovative methodology for evaluating strategic planning goals in a public agency. Computer-coded content analysis was used to evaluate attitudes expressed in about 28,000 on-line news media stories about the U.S. Department of Agriculture Forest Service and its strategic goal of conservation leadership. Three dimensions of conservation...

  19. A computational method for drug repositioning using publicly available gene expression data.

    PubMed

    Shabana, K M; Abdul Nazeer, K A; Pradhan, Meeta; Palakal, Mathew

    2015-01-01

    The identification of new therapeutic uses for existing drugs, or drug repositioning, offers the possibility of faster drug development, reduced risk, lower cost and shorter paths to approval. The advent of high-throughput microarray technology has enabled comprehensive monitoring of the transcriptional response associated with various disease states and drug treatments. These data can be used to characterize disease and drug effects and thereby give a measure of the association between a given drug and a disease. Several computational methods have been proposed in the literature that make use of publicly available transcriptional data to reposition drugs against diseases. In this work, we carry out a data mining process using publicly available gene expression data sets associated with a few diseases and drugs, to identify existing drugs that can be repositioned to target genes implicated in lung cancer and breast cancer. Three strong candidates for repurposing have been identified: Letrozole and GDC-0941 against lung cancer, and Ribavirin against breast cancer. Letrozole and GDC-0941 are drugs currently used in breast cancer treatment, and Ribavirin is used in the treatment of Hepatitis C.
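
    A common scoring idea in expression-based repositioning, in the spirit of the work above, ranks drugs by how strongly their expression profile reverses the disease signature; the gene set and fold-changes below are hypothetical.

        # Hedged sketch of a signature-reversal score used in expression-based
        # drug repositioning (illustrative data, not the paper's pipeline): a
        # drug whose expression changes anti-correlate with a disease signature
        # is a repositioning candidate.
        from scipy.stats import spearmanr

        genes = ["EGFR", "KRAS", "TP53", "MYC", "PTEN"]

        # Hypothetical log fold-changes (disease vs normal, drug vs vehicle).
        disease_signature = [2.1, 1.8, -1.5, 2.4, -2.0]
        drug_profiles = {
            "drug_A": [-1.9, -1.6, 1.4, -2.2, 1.8],   # reverses the signature
            "drug_B": [2.0, 1.7, -1.3, 2.1, -1.9],    # mimics the disease
        }

        for drug, profile in drug_profiles.items():
            rho, _ = spearmanr(disease_signature, profile)
            print(f"{drug}: reversal score = {-rho:+.2f}")   # higher = better candidate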

  20. DISTMIX: direct imputation of summary statistics for unmeasured SNPs from mixed ethnicity cohorts.

    PubMed

    Lee, Donghyung; Bigdeli, T Bernard; Williamson, Vernell S; Vladimirov, Vladimir I; Riley, Brien P; Fanous, Ayman H; Bacanu, Silviu-Alin

    2015-10-01

    To increase the signal resolution for large-scale meta-analyses of genome-wide association studies, genotypes at unmeasured single nucleotide polymorphisms (SNPs) are commonly imputed using large multi-ethnic reference panels. However, the ever-increasing size and ethnic diversity of both reference panels and cohorts makes genotype imputation computationally challenging for moderately sized computer clusters. Moreover, genotype imputation requires subject-level genetic data, which, unlike the summary statistics provided by virtually all studies, is not publicly available. While there are much less demanding methods that avoid the genotype imputation step by directly imputing SNP statistics, e.g. Directly Imputing summary STatistics (DIST) proposed by our group, their implicit assumptions make them applicable only to ethnically homogeneous cohorts. To decrease the computational and access requirements for the analysis of cosmopolitan cohorts, we propose DISTMIX, which extends DIST capabilities to the analysis of mixed-ethnicity cohorts. The method uses a relevant reference panel to directly impute unmeasured SNP statistics based only on statistics at measured SNPs and estimated/user-specified ethnic proportions. Simulations show that the proposed method adequately controls the Type I error rates. The 1000 Genomes panel imputation of summary statistics from the ethnically diverse Psychiatric Genomics Consortium Schizophrenia Phase 2 data suggests that, when compared to genotype imputation methods, DISTMIX offers comparable imputation accuracy for only a fraction of the computational resources. DISTMIX software, its reference population data, and usage examples are publicly available at http://code.google.com/p/distmix. dlee4@vcu.edu Supplementary Data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press.
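
    The core DIST-style computation that DISTMIX extends is the conditional mean of a multivariate normal: unmeasured z-scores are imputed from measured ones through reference-panel LD correlations. The toy example below omits DISTMIX's mixing of reference panels by ethnic proportions; the LD matrix and z-scores are hypothetical.

        # Sketch of the DIST-style conditional-mean imputation underlying DISTMIX
        # (toy numbers; DISTMIX additionally mixes panels by ethnic proportions).
        import numpy as np

        # Hypothetical LD (correlation) matrix from a reference panel:
        # SNPs 0 and 1 are measured; SNP 2 is unmeasured.
        ld = np.array([[1.0, 0.3, 0.8],
                       [0.3, 1.0, 0.4],
                       [0.8, 0.4, 1.0]])
        z_measured = np.array([4.2, 1.1])

        sigma_mm = ld[:2, :2]          # measured-measured LD
        sigma_um = ld[2:, :2]          # unmeasured-measured LD

        # Conditional mean and variance of the unmeasured z-score.
        w = sigma_um @ np.linalg.inv(sigma_mm)
        z_imputed = (w @ z_measured).item()
        var = ld[2, 2] - (w @ sigma_um.T).item()
        print(f"imputed z = {z_imputed:.2f}, imputation variance = {var:.2f}")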

  1. NASA spinoffs to public service

    NASA Technical Reports Server (NTRS)

    Ault, L. A.; Cleland, J. G.

    1989-01-01

    The National Aeronautics and Space Administration (NASA) Technology Utilization (TU) Division of the Office of Commercial Programs has been quite successful in directing the transfer of technology into the public sector. NASA developments of particular interest have been those in the areas of aerodynamics and aviation transport, safety, sensors, electronics and computing, and satellites and remote sensing. NASA technology has helped law enforcement, firefighting, public transportation, education, search and rescue, and practically every other sector of activity serving the U.S. public. NASA works closely with public service agencies and associations, especially those serving local needs of citizens, to expedite technology transfer benefits. A number of examples exist to demonstrate the technology transfer methods and the opportunities of NASA spinoffs to public service.

  2. An image hiding method based on cascaded iterative Fourier transform and public-key encryption algorithm

    NASA Astrophysics Data System (ADS)

    Zhang, B.; Sang, Jun; Alam, Mohammad S.

    2013-03-01

    An image hiding method based on the cascaded iterative Fourier transform and a public-key encryption algorithm was proposed. Firstly, the original secret image was encrypted into two phase-only masks M1 and M2 via the cascaded iterative Fourier transform (CIFT) algorithm. Then, the public-key encryption algorithm RSA was adopted to encrypt M2 into M2'. Finally, a host image was enlarged by extending each pixel into 2×2 pixels, and each element in M1 and M2' was multiplied by a superimposition coefficient and added to or subtracted from two different elements in the 2×2 pixels of the enlarged host image. To recover the secret image from the stego-image, the two masks were extracted from the stego-image without the original host image. By applying a public-key encryption algorithm, key distribution was facilitated; moreover, compared with image hiding methods based on optical interference, the proposed method may achieve higher robustness by exploiting the characteristics of the CIFT algorithm. Computer simulations show that this method has good robustness against image processing.

  3. Identifying Methods for Monitoring Foodborne Illness: Review of Existing Public Health Surveillance Techniques.

    PubMed

    Oldroyd, Rachel A; Morris, Michelle A; Birkin, Mark

    2018-06-06

    Traditional methods of monitoring foodborne illness are associated with problems of untimeliness and underreporting. In recent years, alternative data sources such as social media data have been used to monitor the incidence of disease in the population (infodemiology and infoveillance). These data sources prove timelier than traditional general practitioner data, they can help to fill the gaps in the reporting process, and they often include additional metadata that is useful for supplementary research. The aim of the study was to identify and formally analyze research papers using consumer-generated data, such as social media data or restaurant reviews, to quantify a disease or public health ailment. Studies of this nature are scarce within the food safety domain, therefore identification and understanding of transferrable methods in other health-related fields are of particular interest. Structured scoping methods were used to identify and analyze primary research papers using consumer-generated data for disease or public health surveillance. The title, abstract, and keyword fields of 5 databases were searched using predetermined search terms. A total of 5239 papers matched the search criteria, of which 145 were taken to full-text review; 62 papers were deemed relevant and were subjected to data characterization and thematic analysis. The majority of studies (40/62, 65%) focused on the surveillance of influenza-like illness. Only 10 studies (16%) used consumer-generated data to monitor outbreaks of foodborne illness. Twitter data (58/62, 94%) and Yelp reviews (3/62, 5%) were the most commonly used data sources. Studies reporting high correlations against baseline statistics used advanced statistical and computational approaches to calculate the incidence of disease. These include classification and regression approaches, clustering approaches, and lexicon-based approaches. Although they are computationally intensive due to the requirement of training data, studies using classification approaches reported the best performance. By analyzing studies in digital epidemiology, computer science, and public health, this paper has identified and analyzed methods of disease monitoring that can be transferred to foodborne disease surveillance. These methods fall into 4 main categories: basic approach, classification and regression, clustering approaches, and lexicon-based approaches. Although studies using a basic approach to calculate disease incidence generally report good performance against baseline measures, they are sensitive to chatter generated by media reports. More computationally advanced approaches are required to filter spurious messages and protect predictive systems against false alarms. Research using consumer-generated data for monitoring influenza-like illness is expansive; however, research regarding the use of restaurant reviews and social media data in the context of food safety is limited. Considering the advantages reported in this review, methods using consumer-generated data for foodborne disease surveillance warrant further investment. ©Rachel A Oldroyd, Michelle A Morris, Mark Birkin. Originally published in JMIR Public Health and Surveillance (http://publichealth.jmir.org), 06.06.2018.

  4. Project JOVE. [microgravity experiments and applications

    NASA Technical Reports Server (NTRS)

    Lyell, M. J.

    1994-01-01

    The goal of this project is to investigate new areas of research pertaining to free surface-interface fluids mechanics and/or microgravity which have potential commercial applications. This paper presents an introduction to ferrohydrodynamics (FHD), and discusses some applications. Also, computational methods for solving free surface flow problems are presented in detail. Both have diverse applications in industry and in microgravity fluids applications. Three different modeling schemes for FHD flows are addressed and the governing equations, including Maxwell's equations, are introduced. In the area of computational modeling of free surface flows, both Eulerian and Lagrangian schemes are discussed. The state of the art in computational methods applied to free surface flows is elucidated. In particular, adaptive grids and re-zoning methods are discussed. Additional research results are addressed and copies of the publications produced under the JOVE Project are included.

  5. Analysis of self-reported versus biomarker based smoking prevalence: methodology to compute corrected smoking prevalence rates.

    PubMed

    Jain, Ram B

    2017-07-01

    Prevalence of smoking is needed to estimate the need for future public health resources. The objective was to compute and compare smoking prevalence rates by using self-reported smoking statuses, two serum cotinine (SCOT) based biomarker methods, and one urinary 4-(methylnitrosamino)-1-(3-pyridyl)-1-butanol (NNAL) based biomarker method. These estimates were then used to develop correction factors applicable to self-reported prevalences to arrive at corrected smoking prevalence rates. Data from the National Health and Nutrition Examination Survey (NHANES) for 2007-2012 for those aged ≥20 years (N = 16826) were used. The self-reported prevalence rate for the total population was 21.6% when computed as the weighted number of self-reported smokers divided by the weighted number of all participants, and 24% when computed as the weighted number of self-reported smokers divided by the weighted number of self-reported smokers and nonsmokers. The corrected prevalence rate was found to be 25.8%. A 1% underestimate in smoking prevalence is equivalent to failing to identify 2.2 million smokers in the US in a given year. This underestimation, if not corrected, could lead to a serious gap between the public health services available and those needed to provide adequate preventive and corrective treatment to smokers.
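
    The correction-factor arithmetic is straightforward, as the toy computation below shows using the rates reported in the abstract.

        # Worked example of applying a biomarker-based correction to a
        # self-reported prevalence rate, using the rates reported above.
        self_reported = 21.6              # % of total population, self-report only
        corrected = 25.8                  # % after biomarker-based correction
        correction_factor = corrected / self_reported
        print(f"correction factor: {correction_factor:.3f}")   # ~1.19

        # Per the abstract, each 1% underestimate corresponds to roughly
        # 2.2 million unidentified US smokers in a given year.
        underestimate_pct = corrected - self_reported
        print(f"smokers missed: ~{underestimate_pct * 2.2:.1f} million")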

  6. Small Private Key PKS on an Embedded Microprocessor

    PubMed Central

    Seo, Hwajeong; Kim, Jihyun; Choi, Jongseok; Park, Taehwan; Liu, Zhe; Kim, Howon

    2014-01-01

    Multivariate quadratic (MQ) cryptography requires the use of long public and private keys to ensure a sufficient security level, but this is not favorable to embedded systems, which have limited system resources. Recently, various approaches to MQ cryptography using reduced public keys have been studied. As a result of this, at CHES2011 (Cryptographic Hardware and Embedded Systems, 2011), a small public key MQ scheme was proposed, and its feasible implementation on an embedded microprocessor was reported at CHES2012. However, the implementation of a small private key MQ scheme was not reported. For efficient implementation, random number generators can contribute to reduce the key size, but the cost of using a random number generator is much more complex than computing MQ on modern microprocessors. Therefore, no feasible results have been reported on embedded microprocessors. In this paper, we propose a feasible implementation on embedded microprocessors for a small private key MQ scheme using a pseudo-random number generator and a hash function based on a block cipher exploiting a hardware Advanced Encryption Standard (AES) accelerator. To speed up the performance, we apply various implementation methods, including parallel computation, on-the-fly computation, optimized logarithm representation, vinegar monomials and assembly programming. The proposed method reduces the private key size by about 99.9% and boosts signature generation and verification by 5.78% and 12.19%, respectively, compared with previous results in CHES2012. PMID:24651722

  7. Small private key MQPKS on an embedded microprocessor.

    PubMed

    Seo, Hwajeong; Kim, Jihyun; Choi, Jongseok; Park, Taehwan; Liu, Zhe; Kim, Howon

    2014-03-19

    Multivariate quadratic (MQ) cryptography requires the use of long public and private keys to ensure a sufficient security level, but this is not favorable to embedded systems, which have limited system resources. Recently, various approaches to MQ cryptography using reduced public keys have been studied. As a result of this, at CHES2011 (Cryptographic Hardware and Embedded Systems, 2011), a small public key MQ scheme was proposed, and its feasible implementation on an embedded microprocessor was reported at CHES2012. However, the implementation of a small private key MQ scheme was not reported. For efficient implementation, random number generators can contribute to reduce the key size, but the cost of using a random number generator is much more complex than computing MQ on modern microprocessors. Therefore, no feasible results have been reported on embedded microprocessors. In this paper, we propose a feasible implementation on embedded microprocessors for a small private key MQ scheme using a pseudo-random number generator and a hash function based on a block cipher exploiting a hardware Advanced Encryption Standard (AES) accelerator. To speed up the performance, we apply various implementation methods, including parallel computation, on-the-fly computation, optimized logarithm representation, vinegar monomials and assembly programming. The proposed method reduces the private key size by about 99.9% and boosts signature generation and verification by 5.78% and 12.19%, respectively, compared with previous results in CHES2012.
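
    The seed-expansion idea that makes a small private key possible can be sketched as follows: store only a short seed and deterministically regenerate the long key on demand. The paper uses a block-cipher generator on a hardware AES accelerator; a hash-counter stream stands in here, and the sizes are illustrative.

        # Hedged sketch of the seed-expansion idea used above to shrink the
        # private key: store a short seed and regenerate the long MQ private key
        # with a deterministic PRNG on demand. The paper uses an AES-accelerator
        # based generator; a hash-counter stream stands in here.
        import hashlib

        def expand_key(seed: bytes, length: int) -> bytes:
            out = bytearray()
            counter = 0
            while len(out) < length:
                out += hashlib.sha256(seed + counter.to_bytes(4, "big")).digest()
                counter += 1
            return bytes(out[:length])

        seed = b"\x01\x02\x03\x04" * 4                 # 16-byte stored secret
        private_key = expand_key(seed, 4096)           # regenerated, never stored

        # Regeneration is deterministic, so signing can rebuild the same key later.
        assert private_key == expand_key(seed, 4096)
        print(f"stored {len(seed)} bytes instead of {len(private_key)}")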

  8. Teaching an Interdisciplinary Graduate-Level Methods Course in an Openly-Networked Connected Learning Environment: A Glass Half-Full

    ERIC Educational Resources Information Center

    Secret, Mary; Bryant, Nita L.; Cummings, Cory R.

    2017-01-01

    Our paper describes the design and delivery of an online interdisciplinary social science research methods course (ISRM) for graduate students in sociology, education, social work, and public administration. Collaborative activities and learning took place in two types of computer-mediated learning environments: a closed Blackboard course…

  9. Innovative Methods for High Resolution Imaging

    DTIC Science & Technology

    2012-08-02

    Findings, recent publications, and presentations in the areas of lenslet array imaging, wavefront encoding, and non-negative matrix factorization for computational optical sensing and imaging.

  10. Genetic Epidemiology and Public Health: The Evolution From Theory to Technology.

    PubMed

    Fallin, M Daniele; Duggal, Priya; Beaty, Terri H

    2016-03-01

    Genetic epidemiology represents a hybrid of epidemiologic designs and statistical models that explicitly consider both genetic and environmental risk factors for disease. It is a relatively new field in public health; the term was first coined only 35 years ago. In this short time, the field has been through a major evolution, changing from a field driven by theory, without the technology for genetic measurement or computational capacity to apply much of the designs and methods developed, to a field driven by rapidly expanding technology in genomic measurement and computational analyses while epidemiologic theory struggles to keep up. In this commentary, we describe 4 different eras of genetic epidemiology, spanning this evolution from theory to technology, what we have learned, what we have added to the broader field of public health, and what remains to be done. © The Author 2016. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  11. Automated Literature Searches for Longitudinal Tracking of Cancer Research Training Program Graduates.

    PubMed

    Padilla, Luz A; Desmond, Renee A; Brooks, C Michael; Waterbor, John W

    2018-06-01

    A key outcome measure of cancer research training programs is the number of cancer-related peer-reviewed publications after training. Because program graduates do not routinely report their publications, staff must periodically conduct electronic literature searches on each graduate. The purpose of this study is to compare findings of an innovative computer-based automated search program versus repeated manual literature searches to identify post-training peer-reviewed publications. In late 2014, manual searches for publications by former R25 students identified 232 cancer-related articles published by 112 of 543 program graduates. In 2016, a research assistant was instructed in performing Scopus literature searches for comparison with individual PubMed searches on our 543 program graduates. Through 2014, Scopus found 304 cancer publications, 220 of which had been retrieved manually, plus an additional 84 papers. However, Scopus missed 12 publications found manually. Together, both methods found 316 publications. The automated method found 96.2% of the 316 publications while individual searches found only 73.4%. An automated search method such as using the Scopus database is a key tool for conducting comprehensive literature searches, but it must be supplemented with periodic manual searches to find the initial publications of program graduates. A time-saving feature of Scopus is the periodic automatic alerts of new publications. Although a training period is needed and initial costs can be high, an automated search method is worthwhile due to its high sensitivity and efficiency in the long term.

  12. MultiPhyl: a high-throughput phylogenomics webserver using distributed computing

    PubMed Central

    Keane, Thomas M.; Naughton, Thomas J.; McInerney, James O.

    2007-01-01

    With the number of fully sequenced genomes increasing steadily, there is greater interest in performing large-scale phylogenomic analyses from large numbers of individual gene families. Maximum likelihood (ML) has been shown repeatedly to be one of the most accurate methods for phylogenetic construction. Recently, there have been a number of algorithmic improvements in maximum-likelihood-based tree search methods. However, it can still take a long time to analyse the evolutionary history of many gene families using a single computer. Distributed computing refers to a method of combining the computing power of multiple computers in order to perform some larger overall calculation. In this article, we present the first high-throughput implementation of a distributed phylogenetics platform, MultiPhyl, capable of using the idle computational resources of many heterogeneous non-dedicated machines to form a phylogenetics supercomputer. MultiPhyl allows a user to upload hundreds or thousands of amino acid or nucleotide alignments simultaneously and perform computationally intensive tasks such as model selection, tree searching and bootstrapping of each of the alignments using many desktop machines. The program implements a set of 88 amino acid models and 56 nucleotide maximum likelihood models and a variety of statistical methods for choosing between alternative models. A MultiPhyl webserver is available for public use at: http://www.cs.nuim.ie/distributed/multiphyl.php. PMID:17553837

  13. Proportional Topology Optimization: A New Non-Sensitivity Method for Solving Stress Constrained and Minimum Compliance Problems and Its Implementation in MATLAB

    PubMed Central

    Biyikli, Emre; To, Albert C.

    2015-01-01

    A new topology optimization method called Proportional Topology Optimization (PTO) is presented. As a non-sensitivity method, PTO is simple to understand, easy to implement, and is both efficient and accurate. It is implemented in two MATLAB programs to solve the stress constrained and minimum compliance problems. Descriptions of the algorithm and computer programs are provided in detail. The method is applied to solve three numerical examples for both types of problems. The method shows comparable efficiency and accuracy with an existing optimality criteria method which computes sensitivities. Also, the PTO stress constrained algorithm and minimum compliance algorithm are compared by feeding the output from one algorithm to the other in an alternating manner, where the former yields lower maximum stress and volume fraction but higher compliance compared to the latter. Advantages and disadvantages of the proposed method and future work are discussed. The computer programs are self-contained and publicly shared at the website www.ptomethod.org. PMID:26678849
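
    The proportional update at the heart of PTO can be sketched without the finite element machinery; the stress array below is a placeholder for what an FEA solve would return each iteration, and the exponent, blending, and bounds are illustrative.

        # Simplified sketch of the proportional update at the heart of PTO:
        # material is distributed among elements in proportion to a response
        # measure, with no sensitivity computation. The stress array is a
        # stand-in for FEA output; parameters are illustrative.
        import numpy as np

        n_elem, target_volume, q = 8, 4.0, 2.0        # elements, volume budget, exponent
        x = np.full(n_elem, target_volume / n_elem)   # initial uniform densities

        for it in range(50):
            stress = np.linspace(1.0, 3.0, n_elem) * x      # placeholder for FEA output
            weight = stress ** q
            x_new = target_volume * weight / weight.sum()   # proportional distribution
            # Blend with history and apply box constraints (a full PTO loop
            # redistributes any clipped excess to keep the volume exact).
            x = np.clip(0.5 * x + 0.5 * x_new, 0.01, 1.0)

        print(np.round(x, 3))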

  14. Computational biology for ageing

    PubMed Central

    Wieser, Daniela; Papatheodorou, Irene; Ziehm, Matthias; Thornton, Janet M.

    2011-01-01

    High-throughput genomic and proteomic technologies have generated a wealth of publicly available data on ageing. Easy access to these data, and their computational analysis, is of great importance in order to pinpoint the causes and effects of ageing. Here, we provide a description of the existing databases and computational tools on ageing that are available for researchers. We also describe the computational approaches to data interpretation in the field of ageing including gene expression, comparative and pathway analyses, and highlight the challenges for future developments. We review recent biological insights gained from applying bioinformatics methods to analyse and interpret ageing data in different organisms, tissues and conditions. PMID:21115530

  15. Application of the control volume mixed finite element method to a triangular discretization

    USGS Publications Warehouse

    Naff, R.L.

    2012-01-01

    A two-dimensional control volume mixed finite element method is applied to the elliptic equation. Discretization of the computational domain is based on triangular elements. Shape functions and test functions are formulated on the basis of an equilateral reference triangle with unit edges. A pressure support based on the linear interpolation of elemental edge pressures is used in this formulation. Comparisons are made between results from the standard mixed finite element method and this control volume mixed finite element method. Published 2011. This article is a US Government work and is in the public domain in the USA. © 2012 John Wiley & Sons, Ltd.

  16. Comprehensive T-Matrix Reference Database: A 2007-2009 Update

    NASA Technical Reports Server (NTRS)

    Mishchenko, Michael I.; Zakharova, Nadia T.; Videen, Gorden; Khlebtsov, Nikolai G.; Wriedt, Thomas

    2010-01-01

    The T-matrix method is among the most versatile, efficient, and widely used theoretical techniques for the numerically exact computation of electromagnetic scattering by homogeneous and composite particles, clusters of particles, discrete random media, and particles in the vicinity of an interface separating two half-spaces with different refractive indices. This paper presents an update to the comprehensive database of T-matrix publications compiled by us previously and includes the publications that appeared since 2007. It also lists several earlier publications not included in the original database.

  17. Returns on Investment in California County Departments of Public Health

    PubMed Central

    2016-01-01

    Objectives. To estimate the average return on investment for the overall activities of county departments of public health in California. Methods. I gathered the elements necessary to estimate the average return on investment for county departments of public health in California during the period 2001 to 2008–2009. These came from peer-reviewed journal articles published as part of a larger project to develop a method for determining return on investment for public health by using a health economics framework. I combined these elements by using the standard formula for computing return on investment, and performed a sensitivity analysis. Then I compared the return on investment for county departments of public health with the returns on investment generated for various aspects of medical care. Results. The estimated return on investment from $1 invested in county departments of public health in California ranges from $67.07 to $88.21. Conclusions. The very large estimated return on investment for California county departments of public health relative to the return on investment for selected aspects of medical care suggests that public health is a wise investment. PMID:27310339
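
    The standard formula referred to above is a simple ratio; the numbers below are hypothetical and merely show the form of the computation.

        # The standard return-on-investment formula referred to above, with
        # hypothetical numbers: dollars returned per dollar invested.
        benefits = 1_500_000.0   # estimated dollar value of health benefits
        costs = 20_000.0         # department expenditure
        roi = benefits / costs   # dollars returned per $1 invested
        print(f"${roi:.2f} returned per $1 invested")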

  18. Computational aeroacoustics and numerical simulation of supersonic jets

    NASA Technical Reports Server (NTRS)

    Morris, Philip J.; Long, Lyle N.

    1996-01-01

    The research project has been a computational study of computational aeroacoustics algorithms and numerical simulations of the flow and noise of supersonic jets. During this study a new method for the implementation of solid wall boundary conditions for complex geometries in three dimensions has been developed. In addition, a detailed study of the simulation of the flow in and noise from supersonic circular and rectangular jets has been conducted. Extensive comparisons have been made with experimental measurements. A summary of the results of the research program is attached as the main body of this report in the form of two publications. Also, the report lists the names of the students who were supported by this grant, their degrees, and the titles of their dissertations. In addition, a list of presentations and publications made by the Principal Investigators and the research students is also included.

  19. Comprehensive T-Matrix Reference Database: A 2012-2013 Update

    NASA Technical Reports Server (NTRS)

    Mishchenko, Michael I.; Videen, Gorden; Khlebtsov, Nikolai G.; Wriedt, Thomas

    2013-01-01

    The T-matrix method is one of the most versatile, efficient, and accurate theoretical techniques widely used for numerically exact computer calculations of electromagnetic scattering by single and composite particles, discrete random media, and particles imbedded in complex environments. This paper presents the fifth update to the comprehensive database of peer-reviewed T-matrix publications initiated by us in 2004 and includes relevant publications that have appeared since 2012. It also lists several earlier publications not incorporated in the original database, including Peter Waterman's reports from the 1960s illustrating the history of the T-matrix approach and demonstrating that John Fikioris and Peter Waterman were the true pioneers of the multi-sphere method otherwise known as the generalized Lorenz-Mie theory.

  20. Who's in the Queue? A Demographic Analysis of Public Access Computer Users and Uses in U.S. Public Libraries. Research Brief Number 4

    ERIC Educational Resources Information Center

    Manjarrez, Carlos A.; Schoembs, Kyle

    2011-01-01

    Over the past decade, policy discussions about public access computing in libraries have focused on the role that these institutions play in bridging the digital divide. In these discussions, public access computing services are generally targeted at individuals who either cannot afford a computer and Internet access, or have never received formal…

  1. 32 CFR 310.52 - Computer matching publication and review requirements.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 32 National Defense 2 2012-07-01 2012-07-01 false Computer matching publication and review... OF DEFENSE (CONTINUED) PRIVACY PROGRAM DOD PRIVACY PROGRAM Computer Matching Program Procedures § 310.52 Computer matching publication and review requirements. (a) DoD Components shall identify the...

  2. 32 CFR 310.52 - Computer matching publication and review requirements.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 32 National Defense 2 2014-07-01 2014-07-01 false Computer matching publication and review... OF DEFENSE (CONTINUED) PRIVACY PROGRAM DOD PRIVACY PROGRAM Computer Matching Program Procedures § 310.52 Computer matching publication and review requirements. (a) DoD Components shall identify the...

  3. 32 CFR 310.52 - Computer matching publication and review requirements.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 32 National Defense 2 2013-07-01 2013-07-01 false Computer matching publication and review... OF DEFENSE (CONTINUED) PRIVACY PROGRAM DOD PRIVACY PROGRAM Computer Matching Program Procedures § 310.52 Computer matching publication and review requirements. (a) DoD Components shall identify the...

  4. 32 CFR 310.52 - Computer matching publication and review requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 32 National Defense 2 2010-07-01 2010-07-01 false Computer matching publication and review... OF DEFENSE (CONTINUED) PRIVACY PROGRAM DOD PRIVACY PROGRAM Computer Matching Program Procedures § 310.52 Computer matching publication and review requirements. (a) DoD Components shall identify the...

  5. 32 CFR 310.52 - Computer matching publication and review requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 32 National Defense 2 2011-07-01 2011-07-01 false Computer matching publication and review... OF DEFENSE (CONTINUED) PRIVACY PROGRAM DOD PRIVACY PROGRAM Computer Matching Program Procedures § 310.52 Computer matching publication and review requirements. (a) DoD Components shall identify the...

  6. Effects of Turbulence Model on Prediction of Hot-Gas Lateral Jet Interaction in a Supersonic Crossflow

    DTIC Science & Technology

    2015-07-01

    Computing time was provided by the US Department of Defense (DOD) High Performance Computing Modernization program at the US Army Research Laboratory. The three-dimensional, compressible, Reynolds-averaged Navier-Stokes (RANS) equations are solved using a finite volume method with a point-implicit time-integration scheme.

  7. Reusable design: A proposed approach to Public Health Informatics system design

    PubMed Central

    2011-01-01

    Background. Since it was first defined in 1995, Public Health Informatics (PHI) has become a recognized discipline, with a research agenda, defined domain-specific competencies and a specialized corpus of technical knowledge. Information systems form a cornerstone of PHI research and implementation, representing significant progress for the nascent field. However, PHI does not advocate or incorporate standard, domain-appropriate design methods for implementing public health information systems. Reusable design is generalized design advice that can be reused in a range of similar contexts. We propose that PHI create and reuse information design knowledge by taking a systems approach that incorporates design methods from the disciplines of Human-Computer Interaction, Interaction Design and other related disciplines. Discussion. Although PHI operates in a domain with unique characteristics, many design problems in public health correspond to classic design problems, suggesting that existing design methods and solution approaches are applicable to the design of public health information systems. Among the numerous methodological frameworks used in other disciplines, we identify scenario-based design and participatory design as two widely-employed methodologies that are appropriate for adoption as PHI standards. We make the case that these methods show promise to create reusable design knowledge in PHI. Summary. We propose the formalization of a set of standard design methods within PHI that can be used to pursue a strategy of design knowledge creation and reuse for cost-effective, interoperable public health information systems. We suggest that all public health informaticians should be able to use these design methods and the methods should be incorporated into PHI training. PMID:21333000

  8. On public space design for Chinese urban residential area based on integrated architectural physics environment evaluation

    NASA Astrophysics Data System (ADS)

    Dong, J. Y.; Cheng, W.; Ma, C. P.; Tan, Y. T.; Xin, L. S.

    2017-04-01

    The residential public space is an important part of designing an ecological residence, and a proper physics environment in public space is of great significance for urban residences in China. Applying computer-aided design software to residential design can effectively avoid a mismatch between design intent and actual conditions of use, as well as negative impacts on users caused by a poor architectural physics environment of buildings. This paper largely adopts a design method that analyzes the architectural physics environment of residential public space. By analyzing and evaluating the various physical environments, a suitability assessment is obtained for the residential public space, thereby guiding the space design.

  9. Computational methods for identifying miRNA sponge interactions.

    PubMed

    Le, Thuc Duy; Zhang, Junpeng; Liu, Lin; Li, Jiuyong

    2017-07-01

    Recent findings show that coding genes are not the only targets that miRNAs interact with. In fact, there is a pool of different RNAs competing with each other to attract miRNAs for interactions, thus acting as competing endogenous RNAs (ceRNAs). The ceRNAs indirectly regulate each other via the titration mechanism, i.e. the increasing concentration of a ceRNA will decrease the number of miRNAs that are available for interacting with other targets. The cross-talks between ceRNAs, i.e. their interactions mediated by miRNAs, have been identified as the drivers in many disease conditions, including cancers. In recent years, some computational methods have emerged for identifying ceRNA-ceRNA interactions. However, there remain great challenges and opportunities for developing computational methods to provide new insights into ceRNA regulatory mechanisms. In this paper, we review the publicly available databases of ceRNA-ceRNA interactions and the computational methods for identifying ceRNA-ceRNA interactions (also known as miRNA sponge interactions). We also conduct a comparison study of the methods with a breast cancer dataset. Our aim is to provide a current snapshot of the advances of the computational methods in identifying miRNA sponge interactions and to discuss the remaining challenges. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
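
    One widely used two-step criterion of the kind this review surveys calls a pair of candidate ceRNAs an interaction when they share significantly many miRNAs and are positively co-expressed; the sketch below applies a hypergeometric test and Pearson correlation to hypothetical data.

        # Hedged sketch of a common two-step criterion for calling miRNA sponge
        # (ceRNA-ceRNA) interactions, of the kind this review surveys: the pair
        # must share significantly many miRNAs and be positively co-expressed.
        # All data below are hypothetical.
        from scipy.stats import hypergeom, pearsonr

        all_mirnas = 500                      # miRNAs in the background set
        targets_a = {"miR-21", "miR-155", "miR-200", "miR-34"}
        targets_b = {"miR-21", "miR-155", "miR-200", "miR-17"}
        shared = targets_a & targets_b

        # P(sharing at least |shared| miRNAs by chance)
        p_shared = hypergeom.sf(len(shared) - 1, all_mirnas,
                                len(targets_a), len(targets_b))

        # Co-expression across hypothetical samples
        expr_a = [2.1, 3.0, 4.2, 5.1, 6.0, 7.2]
        expr_b = [1.9, 3.2, 4.0, 5.3, 5.8, 7.5]
        r, p_corr = pearsonr(expr_a, expr_b)

        if p_shared < 0.05 and r > 0 and p_corr < 0.05:
            print(f"candidate ceRNA pair: {len(shared)} shared miRNAs, r = {r:.2f}")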

  10. Computational Fact Checking from Knowledge Networks

    PubMed Central

    Ciampaglia, Giovanni Luca; Shiralkar, Prashant; Rocha, Luis M.; Bollen, Johan; Menczer, Filippo; Flammini, Alessandro

    2015-01-01

    Traditional fact checking by expert journalists cannot keep up with the enormous volume of information that is now generated online. Computational fact checking may significantly enhance our ability to evaluate the veracity of dubious information. Here we show that the complexities of human fact checking can be approximated quite well by finding the shortest path between concept nodes under properly defined semantic proximity metrics on knowledge graphs. Framed as a network problem, this approach is feasible with efficient computational techniques. We evaluate this approach by examining tens of thousands of claims related to history, entertainment, geography, and biographical information using a public knowledge graph extracted from Wikipedia. Statements independently known to be true consistently receive higher support via our method than do false ones. These findings represent a significant step toward scalable computational fact-checking methods that may one day mitigate the spread of harmful misinformation. PMID:26083336
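
    A simplified variant of the degree-penalized path scoring can be sketched with networkx on a toy knowledge graph; the weighting below follows the spirit of the paper's semantic proximity rather than its exact definition.

        # Simplified variant of the degree-penalized path scoring described
        # above, on a toy knowledge graph: a claim (subject, object) scores
        # highly when a short path connects the two concepts without passing
        # through overly generic (high-degree) intermediate nodes.
        import math
        import networkx as nx

        G = nx.Graph()
        G.add_edges_from([
            ("Barack Obama", "Honolulu"), ("Honolulu", "Hawaii"),
            ("Barack Obama", "United States"), ("Hawaii", "United States"),
            ("Canada", "United States"), ("Mexico", "United States"),
            ("Texas", "United States"),
        ])

        def truth_score(graph, subject, obj):
            # Penalize entering high-degree (generic) nodes along the path.
            def weight(u, v, d):
                return math.log(1 + graph.degree(v))
            try:
                cost = nx.dijkstra_path_length(graph, subject, obj, weight=weight)
            except nx.NetworkXNoPath:
                return 0.0
            return 1.0 / (1.0 + cost)

        print(truth_score(G, "Barack Obama", "Hawaii"))   # higher: supported claim
        print(truth_score(G, "Barack Obama", "Canada"))   # lower: weaker support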

  11. Roads on the U.S. National Forests: An Analysis of Public Attitudes, Beliefs, and Values Expressed in the News Media

    Treesearch

    David N. Bengston; David P. Fan

    1999-01-01

    Public attitudes, beliefs, and underlying values about roads on the U.S. national forests expressed in more than 4,000 on-line news stories during a 3-year period are analyzed by using computer methods. The belief that forest roads provide access for recreation was expressed most frequently, accounting for about 40% of all beliefs expressed. The belief that roads cause...

  12. Translations from Kommunist, Number 18, December 1977.

    DTIC Science & Technology

    1978-01-26

    Reproduction of these copyrighted journal articles is prohibited without permission from the copyright agency of the Soviet Union. English title: Translations from Kommunist, No. 18, December 1977; Russian title: Kommunist. Among the contents: calls to strengthen the experimental research and production base and to ensure the extensive use of network planning methods, computers, progressive design methods, and so on.

  13. Program Office Guide to Ada. Edition 1

    DTIC Science & Technology

    1986-09-17

    Mark V. Ziemba, 2Lt, USAF, Project Officer, Software Engineering Tools & Methods; Arthur G. DeCelles, Capt, USAF, Program Manager, Computer… Abstract security classification: Unclassified; distribution unlimited.

  14. Making Cloud Computing Available For Researchers and Innovators (Invited)

    NASA Astrophysics Data System (ADS)

    Winsor, R.

    2010-12-01

    High Performance Computing (HPC) facilities exist in most academic institutions but are almost invariably over-subscribed. Access is allocated based on academic merit, the only practical method of assigning valuable finite compute resources. Cloud computing, on the other hand, and particularly commercial clouds, draws flexibly on an almost limitless resource, as long as the user has sufficient funds to pay the bill. How can the commercial cloud model be applied to scientific computing? Is there a case to be made for a publicly available research cloud, and how would it be structured? This talk will explore these themes and describe how Cybera, a not-for-profit non-governmental organization in Alberta, Canada, aims to leverage its high-speed research and education network to provide cloud computing facilities for a much wider user base.

  15. Challenges in Reproducibility, Replicability, and Comparability of Computational Models and Tools for Neuronal and Glial Networks, Cells, and Subcellular Structures.

    PubMed

    Manninen, Tiina; Aćimović, Jugoslava; Havela, Riikka; Teppola, Heidi; Linne, Marja-Leena

    2018-01-01

    The possibility to replicate and reproduce published research results is one of the biggest challenges in all areas of science. In computational neuroscience, there are thousands of models available. However, it is rarely possible to reimplement the models based on the information in the original publication, let alone rerun the models just because the model implementations have not been made publicly available. We evaluate and discuss the comparability of a versatile choice of simulation tools: tools for biochemical reactions and spiking neuronal networks, and relatively new tools for growth in cell cultures. The replicability and reproducibility issues are considered for computational models that are equally diverse, including the models for intracellular signal transduction of neurons and glial cells, in addition to single glial cells, neuron-glia interactions, and selected examples of spiking neuronal networks. We also address the comparability of the simulation results with one another to comprehend if the studied models can be used to answer similar research questions. In addition to presenting the challenges in reproducibility and replicability of published results in computational neuroscience, we highlight the need for developing recommendations and good practices for publishing simulation tools and computational models. Model validation and flexible model description must be an integral part of the tool used to simulate and develop computational models. Constant improvement on experimental techniques and recording protocols leads to increasing knowledge about the biophysical mechanisms in neural systems. This poses new challenges for computational neuroscience: extended or completely new computational methods and models may be required. Careful evaluation and categorization of the existing models and tools provide a foundation for these future needs, for constructing multiscale models or extending the models to incorporate additional or more detailed biophysical mechanisms. Improving the quality of publications in computational neuroscience, enabling progressive building of advanced computational models and tools, can be achieved only through adopting publishing standards which underline replicability and reproducibility of research results.

  16. Challenges in Reproducibility, Replicability, and Comparability of Computational Models and Tools for Neuronal and Glial Networks, Cells, and Subcellular Structures

    PubMed Central

    Manninen, Tiina; Aćimović, Jugoslava; Havela, Riikka; Teppola, Heidi; Linne, Marja-Leena

    2018-01-01

    The possibility of replicating and reproducing published research results is one of the biggest challenges in all areas of science. In computational neuroscience, there are thousands of models available. However, it is rarely possible to reimplement the models based on the information in the original publication, let alone rerun them, simply because the model implementations have not been made publicly available. We evaluate and discuss the comparability of a versatile selection of simulation tools: tools for biochemical reactions and spiking neuronal networks, and relatively new tools for growth in cell cultures. The replicability and reproducibility issues are considered for computational models that are equally diverse, including models for the intracellular signal transduction of neurons and glial cells, in addition to single glial cells, neuron-glia interactions, and selected examples of spiking neuronal networks. We also address the comparability of the simulation results with one another, to determine whether the studied models can be used to answer similar research questions. In addition to presenting the challenges in reproducibility and replicability of published results in computational neuroscience, we highlight the need for developing recommendations and good practices for publishing simulation tools and computational models. Model validation and flexible model description must be an integral part of the tool used to simulate and develop computational models. Constant improvement in experimental techniques and recording protocols leads to increasing knowledge about the biophysical mechanisms in neural systems. This poses new challenges for computational neuroscience: extended or completely new computational methods and models may be required. Careful evaluation and categorization of the existing models and tools provide a foundation for these future needs, for constructing multiscale models or extending the models to incorporate additional or more detailed biophysical mechanisms. Improving the quality of publications in computational neuroscience, and enabling the progressive building of advanced computational models and tools, can be achieved only by adopting publishing standards that emphasize the replicability and reproducibility of research results. PMID:29765315

  17. 76 FR 67418 - Request for Comments on NIST Special Publication 500-293, US Government Cloud Computing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-01

    ...-1659-01] Request for Comments on NIST Special Publication 500-293, US Government Cloud Computing... Publication 500-293, US Government Cloud Computing Technology Roadmap, Release 1.0 (Draft). This document is... (USG) agencies to accelerate their adoption of cloud computing. The roadmap has been developed through...

  18. Solving optimization problems by the public goods game

    NASA Astrophysics Data System (ADS)

    Javarone, Marco Alberto

    2017-09-01

    We introduce a method based on the Public Goods Game for solving optimization tasks. In particular, we focus on the Traveling Salesman Problem, an NP-hard problem whose search space grows exponentially with the number of cities. The proposed method considers a population whose agents are provided with a random solution to the given problem. Agents then interact by playing the Public Goods Game, using the fitness of their solutions as the currency of the game. Notably, agents with better solutions provide higher contributions, while those with worse ones tend to imitate the solutions of richer agents to increase their fitness. Numerical simulations show that the proposed method can compute exact solutions, as well as suboptimal ones, in the considered search spaces. As a result, beyond proposing a new heuristic for combinatorial optimization problems, our work aims to highlight the potential of evolutionary game theory beyond its current horizons.
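
    As a concrete illustration, the following is a minimal, hedged sketch of the imitation dynamic as we read the abstract; it is not the authors' implementation. A tour's inverse length serves as its fitness (the game currency), and poorer agents copy and perturb the tours of richer agents; the 2-swap mutation and the acceptance rule are assumptions added to make the sketch self-contained.

      import random

      random.seed(1)
      CITIES = [(random.random(), random.random()) for _ in range(10)]

      def tour_length(tour):
          # Total length of the closed tour through all cities.
          return sum(((CITIES[a][0] - CITIES[b][0]) ** 2 +
                      (CITIES[a][1] - CITIES[b][1]) ** 2) ** 0.5
                     for a, b in zip(tour, tour[1:] + tour[:1]))

      def fitness(tour):
          # Shorter tours are "richer" players in the game.
          return 1.0 / tour_length(tour)

      agents = [random.sample(range(10), 10) for _ in range(30)]
      for _ in range(2000):
          a, b = random.sample(range(len(agents)), 2)
          rich, poor = (a, b) if fitness(agents[a]) > fitness(agents[b]) else (b, a)
          # The poorer agent imitates the richer agent's tour, with a random
          # 2-swap mutation so the population keeps exploring.
          tour = agents[rich][:]
          i, j = random.sample(range(10), 2)
          tour[i], tour[j] = tour[j], tour[i]
          if fitness(tour) > fitness(agents[poor]):
              agents[poor] = tour

      best = max(agents, key=fitness)
      print("best tour length:", round(tour_length(best), 3))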

  19. Calculation of Level of Comfort of the Micro-Climate in Buildings During the Estimation of the Energy-Saving Measures

    NASA Astrophysics Data System (ADS)

    Prorokova, M. V.; Bukhmirov, V. V.

    2016-02-01

    The article describes a method for evaluating the microclimate comfort of residential, public, and administrative buildings. The method is based on calculating a coefficient of thermal comfort for a person in the room; corrections are then introduced for the asymmetry of thermal radiation, radiative cooling, and air quality. The method serves as the basis for a computer program.

  20. Pacific Symposium on Biocomputing 2002/2003/2004

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    A.Keith Dunker

    2004-10-26

    Brief introduction to the Pacific Symposium on Biocomputing: The Pacific Symposium on Biocomputing is an international, multidisciplinary conference covering current research in the theory and the application of computational methods in problems of biological significance. Researchers from the United States, the Asian Pacific nations and around the world gather each year at PSB to exchange research results and discuss open issues in all aspects of computational biology. PSB provides a forum for work on databases, algorithms, interfaces, visualization, modeling and other computational methods, as applied to biological problems. The data-rich areas of molecular biology are emphasized. PSB is the only meeting in the bioinformatics field with sessions defined dynamically each year in response to specific proposals from the participants. Sessions are organized by leaders in emerging areas to provide forums for publication and for discussion of research in biocomputing "hot topics". PSB therefore enables discussion of emerging methods and approaches in this rapidly changing field. PSB has been designated as one of the major meetings in this field by the recently established International Society for Computational Biology (see www.iscb.org). Papers and presentations are peer reviewed, typically with 3 reviews per paper plus editorial oversight from the conference organizers. The accepted papers are published in an archival proceedings volume, which is indexed by PubMed and available electronically (see http://psb.stanford.edu/). Finally, given the tight schedule from submission of papers to their publication, typically 5 to 5 1/2 months, the PSB proceedings each year represent one of the most up-to-date surveys of current trends in bioinformatics.

  1. FLAPS (Fatigue Life Analysis Programs): Computer Programs to Predict Cyclic Life Using the Total Strain Version of Strainrange Partitioning and Other Life Prediction Methods. Users' Manual and Example Problems, Version 1.0

    NASA Technical Reports Server (NTRS)

    Arya, Vinod K.; Halford, Gary R. (Technical Monitor)

    2003-01-01

    This manual presents the FLAPS computer programs for characterizing and predicting the fatigue and creep-fatigue resistance of metallic materials in the high-temperature, long-life regime for isothermal and nonisothermal fatigue. The programs use the Total Strain version of Strainrange Partitioning (TS-SRP) and several other life prediction methods described in this manual. The user should be thoroughly familiar with TS-SRP and these life prediction methods before attempting to use any of these programs; improper understanding can lead to incorrect use of the methods and erroneous life predictions. An extensive database has also been developed in a parallel effort. The database is probably the largest source of high-temperature creep-fatigue test data available in the public domain and can be used with other life-prediction methods as well. This users' manual, software, and database are all in the public domain and can be obtained by contacting the author. The Compact Disk (CD) accompanying this manual contains an executable file for the FLAPS program, two datasets required for the example problems in the manual, and the creep-fatigue data in a format compatible with these programs.

  2. Improving public transportation systems with self-organization: A headway-based model and regulation of passenger alighting and boarding.

    PubMed

    Carreón, Gustavo; Gershenson, Carlos; Pineda, Luis A

    2017-01-01

    The equal headway instability (the fact that a configuration with regular time intervals between vehicles tends to be volatile) is a common regulation problem in public transportation systems. An unsatisfactory regulation results in low efficiency and possible collapses of the service. Computational simulations have shown that self-organizing methods can regulate the headway adaptively beyond the theoretical optimum. In this work, we develop a computer simulation of metro systems, fed with real data from the Mexico City Metro, to test the current regulatory method against a novel self-organizing approach. The current model considers overall system data such as minimum and maximum waiting times at stations, while the self-organizing method regulates the headway in a decentralized manner using local information such as passenger inflow and the positions of neighboring trains. The simulation shows that the self-organizing method improves performance over the current one, as it adapts to environmental changes at the timescale on which they occur. The correlation between the simulation of the current model and empirical observations carried out in the Mexico City Metro provides a basis for calculating the expected performance of the self-organizing method should it be implemented in the real system. We also performed a pilot study at the Balderas station to regulate the alighting and boarding of passengers through guide signs on platforms. The analysis of empirical data shows a reduction in the waiting time of trains at stations. Finally, we provide recommendations to improve public transportation systems.

  3. Improving public transportation systems with self-organization: A headway-based model and regulation of passenger alighting and boarding

    PubMed Central

    Gershenson, Carlos; Pineda, Luis A.

    2017-01-01

    The equal headway instability (the fact that a configuration with regular time intervals between vehicles tends to be volatile) is a common regulation problem in public transportation systems. An unsatisfactory regulation results in low efficiency and possible collapses of the service. Computational simulations have shown that self-organizing methods can regulate the headway adaptively beyond the theoretical optimum. In this work, we develop a computer simulation of metro systems, fed with real data from the Mexico City Metro, to test the current regulatory method against a novel self-organizing approach. The current model considers overall system data such as minimum and maximum waiting times at stations, while the self-organizing method regulates the headway in a decentralized manner using local information such as passenger inflow and the positions of neighboring trains. The simulation shows that the self-organizing method improves performance over the current one, as it adapts to environmental changes at the timescale on which they occur. The correlation between the simulation of the current model and empirical observations carried out in the Mexico City Metro provides a basis for calculating the expected performance of the self-organizing method should it be implemented in the real system. We also performed a pilot study at the Balderas station to regulate the alighting and boarding of passengers through guide signs on platforms. The analysis of empirical data shows a reduction in the waiting time of trains at stations. Finally, we provide recommendations to improve public transportation systems. PMID:29287120

  4. 36 CFR 1254.32 - What rules apply to public access use of the Internet on NARA-supplied computers?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... access use of the Internet on NARA-supplied computers? 1254.32 Section 1254.32 Parks, Forests, and Public... of the Internet on NARA-supplied computers? (a) Public access computers (workstations) are available for Internet use in all NARA research rooms. The number of workstations varies per location. We...

  5. 36 CFR 1254.32 - What rules apply to public access use of the Internet on NARA-supplied computers?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... access use of the Internet on NARA-supplied computers? 1254.32 Section 1254.32 Parks, Forests, and Public... of the Internet on NARA-supplied computers? (a) Public access computers (workstations) are available for Internet use in all NARA research rooms. The number of workstations varies per location. We...

  6. 36 CFR 1254.32 - What rules apply to public access use of the Internet on NARA-supplied computers?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... access use of the Internet on NARA-supplied computers? 1254.32 Section 1254.32 Parks, Forests, and Public... of the Internet on NARA-supplied computers? (a) Public access computers (workstations) are available... should not expect privacy while using these workstations. These workstations are operated and maintained...

  7. 36 CFR 1254.32 - What rules apply to public access use of the Internet on NARA-supplied computers?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... access use of the Internet on NARA-supplied computers? 1254.32 Section 1254.32 Parks, Forests, and Public... of the Internet on NARA-supplied computers? (a) Public access computers (workstations) are available... should not expect privacy while using these workstations. These workstations are operated and maintained...

  8. Structural dynamics branch research and accomplishments

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Summaries are presented of fiscal year 1989 research highlights from the Structural Dynamics Branch at NASA Lewis Research Center. Highlights from the branch's major work areas include aeroelasticity, vibration control, dynamic systems, and computational structural methods. A listing of the fiscal year 1989 branch publications is given.

  9. Online Catalogs and Their Users.

    ERIC Educational Resources Information Center

    Broadus, Robert N.

    1983-01-01

    Review of research on online public access catalogs sponsored by Council on Library Resources notes the scope and method (questionnaires administered to catalog users and nonusers in 29 participating institutions) and findings and applications (including organizational setting and computer system, catalog use and satisfaction, and implications).…

  10. Population Education Accessions Lists, July-December 1986.

    ERIC Educational Resources Information Center

    United Nations Educational, Scientific, and Cultural Organization, Bangkok (Thailand). Regional Office for Education in Asia and the Pacific.

    Part I of this resource guide contains listings of instructional materials, computer-assisted instructions, classroom activities and teaching methods. Part II deals with the knowledge base of population education. These publications are divided into 11 topics including: (1) demography; (2) documentation; (3) education (including environmental,…

  11. 26 CFR 1.509(a)-3 - Broadly, publicly supported organizations.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... this paragraph, if applicable, the basic consideration is whether its organizational structure... paragraph in determining whether the organizational structure, programs or activities, and method of... subparagraph based on a computation period of taxable years 1971 through 1974 or 1972 through 1975, such an...

  12. But Is It Nutritious? Computer Analysis Creates Healthier Meals.

    ERIC Educational Resources Information Center

    Corrigan, Kathleen A.; Aumann, Margaret B.

    1993-01-01

    A computerized menu-planning method, "Nutrient Standard Menu Planning" (NSMP), uses today's technology to create healthier menus. Field tested in 20 California school districts, the advantages of NSMP are cost effectiveness, increased flexibility, greater productivity, improved public relations, improved finances, and improved student…

  13. Computationally efficient method for localizing the spiral rotor source using synthetic intracardiac electrograms during atrial fibrillation.

    PubMed

    Shariat, M H; Gazor, S; Redfearn, D

    2015-08-01

    Atrial fibrillation (AF), the most common sustained cardiac arrhythmia, is an extremely costly public health problem. Catheter-based ablation is a common minimally invasive procedure to treat AF. Contemporary mapping methods are highly dependent on the accuracy of anatomic localization of rotor sources within the atria. In this paper, using simulated atrial intracardiac electrograms (IEGMs) during AF, we propose a computationally efficient method for localizing the tip of an electrical rotor with an Archimedean/arithmetic spiral wavefront. The proposed method uses the locations of a catheter's electrodes and their IEGM activation times to estimate the unknown parameters of the spiral wavefront, including its tip location. The method is able to localize the spiral as soon as the wave hits three electrodes of the catheter. Our simulation results show that the method can efficiently localize spiral wavefronts that rotate either clockwise or counterclockwise.
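
    The sketch below is a hedged illustration of this style of localization, not the authors' algorithm: the wavefront is modeled as an Archimedean spiral r = b(ω(t − t0) − φ) around an unknown tip, and the tip is recovered from electrode positions and activation times by nonlinear least squares. The parameterization, the nine-electrode synthetic catheter, and the initial guess are all our assumptions; the paper's three-electrode result presumably relies on additional constraints from the known catheter geometry.

      import numpy as np
      from scipy.optimize import least_squares

      def activation_time(params, xy):
          # Archimedean spiral arm r = b * (omega * (t - t0) - phi), solved
          # for the time t at which the arm reaches each point in xy.
          x0, y0, b, omega, t0 = params
          dx, dy = xy[:, 0] - x0, xy[:, 1] - y0
          r = np.hypot(dx, dy)
          phi = np.arctan2(dy, dx)
          return t0 + (r / b + phi) / omega

      # Synthetic ground truth (tip at (0.5, -0.3)) and a 3x3 grid catheter.
      truth = np.array([0.5, -0.3, 0.8, 2 * np.pi, 0.0])
      gx, gy = np.meshgrid(np.linspace(1, 3, 3), np.linspace(1, 3, 3))
      electrodes = np.column_stack([gx.ravel(), gy.ravel()])
      times = activation_time(truth, electrodes)

      # Fit the tip location and spiral parameters to the activation times.
      fit = least_squares(lambda p: activation_time(p, electrodes) - times,
                          x0=np.array([1.0, 0.0, 1.0, 5.0, 0.1]))
      print("estimated tip:", fit.x[:2])  # should land near (0.5, -0.3)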

  14. Galaxy morphology - An unsupervised machine learning approach

    NASA Astrophysics Data System (ADS)

    Schutter, A.; Shamir, L.

    2015-09-01

    Structural properties provide valuable information about the formation and evolution of galaxies and are important for understanding the past, present, and future universe. Here we use unsupervised machine learning methodology to analyze a network of similarities between galaxy morphological types and automatically deduce a morphological sequence of galaxies. Application of the method to the EFIGI catalog shows that the morphological scheme produced by the algorithm is largely in agreement with the De Vaucouleurs system, demonstrating the ability of computer vision and machine learning methods to automatically profile galaxy morphological sequences. The unsupervised analysis method is based on comprehensive computer vision techniques that compute the visual similarities between the different morphological types. Rather than relying on human cognition, the proposed system deduces the similarities between sets of galaxy images in an automatic manner, and is therefore not limited by the number of galaxies being analyzed. The source code of the method is publicly available, and the protocol of the experiment is included in the paper so that the experiment can be replicated and the method can be used to analyze user-defined datasets of galaxy images.

  15. Guidelines in preparing computer-generated plots for NASA technical reports with the LaRC graphics output system

    NASA Technical Reports Server (NTRS)

    Taylor, N. L.

    1983-01-01

    In response to a need for improved computer-generated plots that are acceptable to the Langley publication process, the LaRC Graphics Output System has been modified to encompass the publication requirements, and a guideline has been established. This guideline deals only with the publication requirements of computer-generated plots. The report explains the capability that authors of NASA technical reports can use to obtain publication-quality computer-generated plots for the Langley publication process. The rules applied in developing this guideline, and examples illustrating the rules, are included.

  16. Subject-enabled analytics model on measurement statistics in health risk expert system for public health informatics.

    PubMed

    Chung, Chi-Jung; Kuo, Yu-Chen; Hsieh, Yun-Yu; Li, Tsai-Chung; Lin, Cheng-Chieh; Liang, Wen-Miin; Liao, Li-Na; Li, Chia-Ing; Lin, Hsueh-Chun

    2017-11-01

    This study applied open source technology to establish a subject-enabled analytics model that can enhance measurement statistics of case studies with public health data in cloud computing. The infrastructure of the proposed model comprises three domains: 1) the health measurement data warehouse (HMDW) for the case study repository, 2) the self-developed modules of online health risk information statistics (HRIStat) for cloud computing, and 3) the prototype of a Web-based process automation system in statistics (PASIS) for the health risk assessment of case studies with subject-enabled evaluation. The system design employed freeware including Java applications, MySQL, and R packages to drive a health risk expert system (HRES). In the design, the HRIStat modules enforce the typical analytics methods for biomedical statistics, and the PASIS interfaces enable process automation of the HRES for cloud computing. The Web-based model supports two modes, step-by-step analysis and an auto-computing process, respectively for preliminary evaluation and real-time computation. The proposed model was evaluated by recomputing prior studies on the epidemiological measurement of diseases caused by either environmental heavy metal exposures or clinical complications in hospital. The simulation validity was verified against commercial statistics software. The model was installed on a stand-alone computer and on a cloud-server workstation to verify computing performance for data amounts of more than 230K sets. Both setups reached an efficiency of about 10^5 sets per second. The Web-based PASIS interface can be used for cloud computing, and the HRIStat module can be flexibly expanded with advanced subjects for measurement statistics. The analytics procedure of the HRES prototype is capable of providing assessment criteria prior to estimating the potential risk to public health. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. 36 CFR § 1254.32 - What rules apply to public access use of the Internet on NARA-supplied computers?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... access use of the Internet on NARA-supplied computers? § 1254.32 Section § 1254.32 Parks, Forests, and... public access use of the Internet on NARA-supplied computers? (a) Public access computers (workstations... equipment. (b) You should not expect privacy while using these workstations. These workstations are operated...

  18. International Developments in Computer Science.

    DTIC Science & Technology

    1982-06-01

    background on China's scientific research and on their computer science before 1978. A useful companion to the directory is another publication of the...bimonthly publication in Portuguese; occasional translation of foreign articles into Portuguese. Data News: A bimonthly industry newsletter. Sistemas ...computer-related topics; Spanish. Delta: Publication of local users group; Spanish. Sistemas: Publication of System Engineers of Colombia; Spanish. CUBA

  19. Center for Interface Science and Catalysis | Theory

    Science.gov Websites

    Stanford School of Engineering. ...to overcome challenges associated with the atomic-scale design of catalysts for chemical ...computational methods we are developing a quantitative description of chemical processes at the solid-gas and...

  20. Three-dimensional reconstructions from computed tomographic scans on smartphones and tablets: a simple tutorial for the ward and operating room using public domain software.

    PubMed

    Ketoff, Serge; Khonsari, Roman Hossein; Schouman, Thomas; Bertolus, Chloé

    2014-11-01

    Handling 3-dimensional reconstructions of computed tomographic scans on portable devices is problematic because of the size of the Digital Imaging and Communications in Medicine (DICOM) stacks. The authors provide a user-friendly method allowing the production, transfer, and sharing of good-quality 3-dimensional reconstructions on smartphones and tablets. Copyright © 2014 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.

  1. Factors influencing health professions students' use of computers for data analysis at three Ugandan public medical schools: a cross-sectional survey.

    PubMed

    Munabi, Ian G; Buwembo, William; Bajunirwe, Francis; Kitara, David Lagoro; Joseph, Ruberwa; Peter, Kawungezi; Obua, Celestino; Quinn, John; Mwaka, Erisa S

    2015-02-25

    Effective utilization of computers and their applications in medical education and research is of paramount importance to students. The objective of this study was to determine the association between owning a computer and the use of computers for research data analysis, and the other factors influencing health professions students' computer use for data analysis. We conducted a cross-sectional study among undergraduate health professions students at three public universities in Uganda using a self-administered questionnaire. The questionnaire was composed of questions on participant demographics, students' participation in research, computer ownership, and use of computers for data analysis. Descriptive and inferential statistics (univariable and multi-level logistic regression analysis) were used to analyse the data. The level of significance was set at 0.05. Six hundred (600) of 668 questionnaires were completed and returned (response rate 89.8%). A majority of respondents were male (68.8%), and 75.3% reported owning computers. Overall, 63.7% of respondents reported that they had ever done computer-based data analysis. The following factors were significant predictors of having ever done computer-based data analysis: ownership of a computer (adj. OR 1.80, p = 0.02), a recently completed course in statistics (adj. OR 1.48, p = 0.04), and participation in research (adj. OR 2.64, p < 0.01). Owning a computer, participation in research, and undertaking courses in research methods influence undergraduate students' use of computers for research data analysis. Students are increasingly participating in research and thus need to have competencies for the successful conduct of research. Medical training institutions should encourage both curricular and extra-curricular efforts to enhance research capacity in line with modern theories of adult learning.
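
    For readers unfamiliar with the reported statistics, the sketch below shows how adjusted odds ratios of this kind are typically obtained from a multivariable logistic regression. The data are simulated placeholders, not the study's survey responses, and the coefficients are arbitrary.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(0)
      n = 600  # same size as the survey's returned questionnaires
      owns_computer = rng.integers(0, 2, n)
      stats_course = rng.integers(0, 2, n)
      did_research = rng.integers(0, 2, n)

      # Simulate the binary outcome (ever done computer-based data
      # analysis) from an arbitrary "true" model.
      lin = -0.5 + 0.6 * owns_computer + 0.4 * stats_course + 1.0 * did_research
      y = (rng.random(n) < 1 / (1 + np.exp(-lin))).astype(float)

      X = sm.add_constant(np.column_stack([owns_computer, stats_course,
                                           did_research]))
      fit = sm.Logit(y, X).fit(disp=0)
      print("adjusted odds ratios:", np.exp(fit.params[1:]))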

  2. Net present value analysis: appropriate for public utilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davidson, W.N. III

    1980-08-28

    The net-present-value technique widely used by unregulated companies for capital budgeting can also be applied to regulated public utilities. Used to decide whether an investment is worthwhile, the NPV technique compares an investment's discounted future cash flows with its initial outlay or cost. The type of project most appropriate for an NPV analysis is one designed to lower costs. Efficiency-improving investments can be adequately evaluated by the NPV method, which in certain cases is easier to use than some of the more complicated revenue-requirement computer models.
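
    A minimal worked example of the technique, with hypothetical figures: discount each year's cost savings back to the present and subtract the initial outlay; a positive NPV means the investment is worthwhile.

      def npv(rate, initial_outlay, cash_flows):
          # NPV = -C0 + sum over t of CF_t / (1 + r)^t
          return -initial_outlay + sum(
              cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1))

      # A $100,000 efficiency retrofit saving $18,000/year for 10 years,
      # discounted at 10%: NPV is about +$10,600, so the project passes.
      print(round(npv(0.10, 100_000, [18_000] * 10), 2))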

  3. Reduction of collisional-radiative models for transient, atomic plasmas

    NASA Astrophysics Data System (ADS)

    Abrantes, Richard June; Karagozian, Ann; Bilyeu, David; Le, Hai

    2017-10-01

    Interactions between plasmas and any radiation field, whether from lasers or plasma emissions, introduce many computational challenges. One of these challenges involves resolving the atomic physics, which can influence other physical phenomena in the radiated system. In this work, a collisional-radiative (CR) model with reduction capabilities is developed to capture the atomic physics at a reduced computational cost. Although the model is designed with any element in mind, it is currently supplemented by LANL's argon database, which includes the relevant collisional and radiative processes for all of the ionic stages. Using the detailed data set as the true solution, reduction mechanisms in the form of Boltzmann grouping, uniform grouping, and quasi-steady-state (QSS) are implemented and compared against the true solution. Effects on the transient plasma stemming from the grouping methods are compared. Distribution A: Approved for public release; unlimited distribution, PA (Public Affairs) Clearance Number 17449. This work was supported by the Air Force Office of Scientific Research (AFOSR), Grant Number 17RQCOR463 (Dr. Jason Marshall).

  4. Guidelines and standard procedures for continuous water-quality monitors: Site selection, field operation, calibration, record computation, and reporting

    USGS Publications Warehouse

    Wagner, Richard J.; Mattraw, Harold C.; Ritz, George F.; Smith, Brett A.

    2000-01-01

    The U.S. Geological Survey uses continuous water-quality monitors to assess variations in the quality of the Nation's surface water. A common system configuration for data collection is the four-parameter water-quality monitoring system, which collects temperature, specific conductance, dissolved oxygen, and pH data, although systems can be configured to measure other properties such as turbidity or chlorophyll. The sensors that are used to measure these water properties require careful field observation, cleaning, and calibration procedures, as well as thorough procedures for the computation and publication of final records. Data from sensors can be used in conjunction with collected samples and chemical analyses to estimate chemical loads. This report provides guidelines for site-selection considerations, sensor test methods, field procedures, error correction, data computation, and review and publication processes. These procedures have evolved over the past three decades, and the process continues to evolve with newer technologies.

  5. Beyond Theory: Improving Public Relations Writing through Computer Technology.

    ERIC Educational Resources Information Center

    Neff, Bonita Dostal

    Computer technology (primarily word processing) enables the student of public relations writing to improve the writing process through increased flexibility in writing, enhanced creativity, increased support of management skills and team work. A new instructional model for computer use in public relations courses at Purdue University Calumet…

  6. An Application of Discrete Mathematics to Coding Theory.

    ERIC Educational Resources Information Center

    Donohoe, L. Joyce

    1992-01-01

    Presents a public-key cryptosystem application to introduce students to several topics in discrete mathematics. A computer algorithm using recursive methods is presented to solve a problem in which one person wants to send a coded message to a second person while keeping the message secret from a third person. (MDH)

  7. ENVIRONMENTAL RESEARCH BRIEF : ANALYTIC ELEMENT MODELING OF GROUND-WATER FLOW AND HIGH PERFORMANCE COMPUTING

    EPA Science Inventory

    Several advances in the analytic element method have been made to enhance its performance and facilitate three-dimensional ground-water flow modeling in a regional aquifer setting. First, a new public domain modular code (ModAEM) has been developed for modeling ground-water flow ...

  8. Public Schools Energy Conservation Measures, Report Number 4: Hindman Elementary School, Hindman, Kentucky.

    ERIC Educational Resources Information Center

    American Association of School Administrators, Arlington, VA.

    Presented is a study identifying and evaluating opportunities for decreasing energy use at Hindman Elementary School, Hindman, Kentucky. Methods used in this engineering investigation include building surveys, computer simulations and cost estimates. Findings revealed that modifications to the school's boiler, temperature controls, electrical…

  9. Structural Dynamics Branch research and accomplishments for FY 1990

    NASA Technical Reports Server (NTRS)

    1991-01-01

    Presented here is a collection of FY 1990 research highlights from the Structural Dynamics Branch at the NASA Lewis Research Center. Highlights are from the branch's major work areas: aeroelasticity, vibration control, dynamic systems, and computational structural methods. A listing is given of FY 1990 branch publications.

  10. APPLICATION OF NEURAL COMPUTING METHODS FOR INTERPRETING PHOSPHOLIPID FATTY ACID PROFILES OF NATURAL MICROBIAL COMMUNITIES. (R826944)

    EPA Science Inventory

    The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...

  11. Cumulative reports and publications through 31 December 1983

    NASA Technical Reports Server (NTRS)

    1983-01-01

    All reports for the calendar years 1975 through December 1983 are listed by author. Since ICASE reports are intended to be preprints of articles for journals and conference proceedings, the published reference is included when available. Thirteen older journal and conference proceedings references are included as well as five additional reports by ICASE personnel. Major categories of research covered include: (1) numerical methods, with particular emphasis on the development and analysis of basic algorithms; (2) computational problems in engineering and the physical sciences, particularly fluid dynamics, acoustics, structural analysis, and chemistry; and (3) computer systems and software, especially vector and parallel computers, microcomputers, and data management.

  12. Data distribution method of workflow in the cloud environment

    NASA Astrophysics Data System (ADS)

    Wang, Yong; Wu, Junjuan; Wang, Ying

    2017-08-01

    Cloud computing provides workflow applications with high-efficiency computation and large storage capacity, but it also brings challenges for the protection of trade secrets and other private data. Because handling private data increases data transmission time, this paper presents a new data allocation algorithm, based on the degree of collaborative data damage, that improves on the existing allocation strategy; the security of the public-cloud computation depends on the private cloud. In the initial stage, a static allocation method partitions only the non-confidential data; in the operational stage, as data continue to be generated, the data distribution scheme is adjusted dynamically. The experimental results show that the improved method is effective in reducing the data transmission time.

  13. Revision by means of computer-mediated peer discussions

    NASA Astrophysics Data System (ADS)

    Soong, Benson; Mercer, Neil; Er, Siew Shin

    2010-05-01

    In this article, we provide a discussion on our revision method (termed prescriptive tutoring) aimed at revealing students' misconceptions and misunderstandings by getting them to solve physics problems with an anonymous partner via the computer. It is currently being implemented and evaluated in a public secondary school in Singapore, and statistical analysis of our initial small-scale study shows that students in the experimental group significantly outperformed students in both the control and alternative intervention groups. In addition, students in the experimental group perceived that they had gained improved understanding of the physics concepts covered during the intervention, and reported that they would like to continue revising physics concepts using the intervention methods.

  14. CNN for breaking text-based CAPTCHA with noise

    NASA Astrophysics Data System (ADS)

    Liu, Kaixuan; Zhang, Rong; Qing, Ke

    2017-07-01

    A CAPTCHA ("Completely Automated Public Turing test to tell Computers and Human Apart") system is a program that most humans can pass but current computer programs could hardly pass. As the most common type of CAPTCHAs , text-based CAPTCHA has been widely used in different websites to defense network bots. In order to breaking textbased CAPTCHA, in this paper, two trained CNN models are connected for the segmentation and classification of CAPTCHA images. Then base on these two models, we apply sliding window segmentation and voting classification methods realize an end-to-end CAPTCHA breaking system with high success rate. The experiment results show that our method is robust and effective in breaking text-based CAPTCHA with noise.

  15. Public computing options for individuals with cognitive impairments: survey outcomes.

    PubMed

    Fox, Lynn Elizabeth; Sohlberg, McKay Moore; Fickas, Stephen; Lemoncello, Rik; Prideaux, Jason

    2009-09-01

    To examine availability and accessibility of public computing for individuals with cognitive impairment (CI) who reside in the USA. A telephone survey was administered as a semi-structured interview to 145 informants representing seven types of public facilities across three geographically distinct regions using a snowball sampling technique. An Internet search of wireless (Wi-Fi) hotspots supplemented the survey. Survey results showed the availability of public computer terminals and Internet hotspots was greatest in the urban sample, followed by the mid-sized and rural cities. Across seven facility types surveyed, libraries had the highest percentage of access barriers, including complex queue procedures, login and password requirements, and limited technical support. University assistive technology centres and facilities with a restricted user policy, such as brain injury centres, had the lowest incidence of access barriers. Findings suggest optimal outcomes for people with CI will result from a careful match of technology and the user that takes into account potential barriers and opportunities to computing in an individual's preferred public environments. Trends in public computing, including the emergence of widespread Wi-Fi and limited access to terminals that permit auto-launch applications, should guide development of technology designed for use in public computing environments.

  16. A secure protocol for protecting the identity of providers when disclosing data for disease surveillance

    PubMed Central

    Hu, Jun; Mercer, Jay; Peyton, Liam; Kantarcioglu, Murat; Malin, Bradley; Buckeridge, David; Samet, Saeed; Earle, Craig

    2011-01-01

    Background: Providers have been reluctant to disclose patient data for public-health purposes. Even if patient privacy is ensured, the desire to protect provider confidentiality has been an important driver of this reluctance. Methods: Six requirements for a surveillance protocol were defined that satisfy the confidentiality needs of providers and ensure utility to public health. The authors developed a secure multi-party computation protocol using the Paillier cryptosystem to allow the disclosure of stratified case counts and denominators to meet these requirements. The authors evaluated the protocol in a simulated environment on its computation performance and its ability to detect disease outbreak clusters. Results: Theoretical and empirical assessments demonstrate that all requirements are met by the protocol. A system implementing the protocol scales linearly in computation time as the number of providers is increased. The absolute time to perform the computations was 12.5 s for data from 3000 practices. This is acceptable performance, given that the reporting would normally be done at 24 h intervals. The accuracy of detecting disease outbreak clusters was unchanged compared with a non-secure distributed surveillance protocol, with an F-score higher than 0.92 for outbreaks involving 500 or more cases. Conclusion: The protocol and associated software provide a practical method for providers to disclose patient data for sentinel, syndromic or other indicator-based surveillance while protecting patient privacy and the identity of individual providers. PMID:21486880
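
    To make the cryptographic idea concrete, here is a toy sketch of Paillier-style aggregation in the spirit of the protocol, not the authors' software: each provider encrypts its case count, the ciphertexts are multiplied, and the product decrypts to the sum, so no individual provider's count is revealed. The 10007/10009 key is for illustration only (real deployments use moduli of 2048 bits or more), and pow(x, -1, n) requires Python 3.8+.

      from math import gcd
      import random

      # Toy Paillier keypair; g = n + 1 is the standard simple choice.
      p, q = 10007, 10009
      n, n2 = p * q, (p * q) ** 2
      lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)  # lcm(p-1, q-1)
      g = n + 1
      mu = pow(lam, -1, n)  # valid since L(g^lam mod n^2) = lam mod n

      def encrypt(m):
          r = random.randrange(1, n)
          while gcd(r, n) != 1:
              r = random.randrange(1, n)
          return (pow(g, m, n2) * pow(r, n, n2)) % n2

      def decrypt(c):
          L = (pow(c, lam, n2) - 1) // n  # L(u) = (u - 1) / n
          return (L * mu) % n

      # Each provider encrypts its stratified case count; multiplying the
      # ciphertexts adds the plaintexts homomorphically.
      provider_counts = [3, 0, 7, 2]  # hypothetical per-provider counts
      aggregate = 1
      for ct in (encrypt(m) for m in provider_counts):
          aggregate = (aggregate * ct) % n2

      assert decrypt(aggregate) == sum(provider_counts)
      print("aggregate case count:", decrypt(aggregate))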

  17. Managing expectations when publishing tools and methods for computational proteomics.

    PubMed

    Martens, Lennart; Kohlbacher, Oliver; Weintraub, Susan T

    2015-05-01

    Computational tools are pivotal in proteomics because they are crucial for identification, quantification, and statistical assessment of data. The gateway to finding the best choice of a tool or approach for a particular problem is frequently journal articles, yet there is often an overwhelming variety of options that makes it hard to decide on the best solution. This is particularly difficult for nonexperts in bioinformatics. The maturity, reliability, and performance of tools can vary widely because publications may appear at different stages of development. A novel idea might merit early publication despite only offering proof-of-principle, while it may take years before a tool can be considered mature, and by that time it might be difficult for a new publication to be accepted because of a perceived lack of novelty. After discussions with members of the computational mass spectrometry community, we describe here proposed recommendations for organization of informatics manuscripts as a way to set the expectations of readers (and reviewers) through three different manuscript types that are based on existing journal designations. Brief Communications are short reports describing novel computational approaches where the implementation is not necessarily production-ready. Research Articles present both a novel idea and mature implementation that has been suitably benchmarked. Application Notes focus on a mature and tested tool or concept and need not be novel but should offer advancement from improved quality, ease of use, and/or implementation. Organizing computational proteomics contributions into these three manuscript types will facilitate the review process and will also enable readers to identify the maturity and applicability of the tool for their own workflows.

  18. A National Survey of the Public's Attitudes Toward Computers.

    ERIC Educational Resources Information Center

    American Federation of Information Processing Societies, Montvale, NJ.

    The general public's attitudes towards continually expanding computer usage are frequently speculated about but far from understood. This study is aimed at providing objective data on the public's attitudes towards computers, their uses, their perceived impact on the American economy as well as on the individual, and their future uses. The…

  19. Protecting genomic data analytics in the cloud: state of the art and opportunities.

    PubMed

    Tang, Haixu; Jiang, Xiaoqian; Wang, Xiaofeng; Wang, Shuang; Sofia, Heidi; Fox, Dov; Lauter, Kristin; Malin, Bradley; Telenti, Amalio; Xiong, Li; Ohno-Machado, Lucila

    2016-10-13

    The outsourcing of genomic data into public cloud computing settings raises concerns over privacy and security. Significant advancements in secure computation methods have emerged over the past several years, but such techniques need to be rigorously evaluated for their ability to support the analysis of human genomic data in an efficient and cost-effective manner. With respect to public cloud environments, there are concerns about the inadvertent exposure of human genomic data to unauthorized users. In analyses involving multiple institutions, there is additional concern about data being used beyond the agreed research scope and being processed in untrusted computational environments, which may not satisfy institutional policies. To systematically investigate these issues, the NIH-funded National Center for Biomedical Computing iDASH (integrating Data for Analysis, 'anonymization' and SHaring) hosted the second Critical Assessment of Data Privacy and Protection competition to assess the capacity of cryptographic technologies for protecting computation over human genomes in the cloud and promoting cross-institutional collaboration. Data scientists were challenged to design and engineer practical algorithms for secure outsourcing of genome computation tasks in working software, whereby analyses are performed only on encrypted data. They were also challenged to develop approaches to enable secure collaboration on data from genomic studies generated by multiple organizations (e.g., medical centers) to jointly compute aggregate statistics without sharing individual-level records. The results of the competition indicated that secure computation techniques can enable comparative analysis of human genomes, but greater efficiency (in terms of compute time and memory utilization) is needed before they are sufficiently practical for real-world environments.

  20. The Mesa Arizona Pupil Tracking System

    NASA Technical Reports Server (NTRS)

    Wright, D. L.

    1973-01-01

    A computer-based Pupil Tracking/Teacher Monitoring System was designed for Mesa Public Schools, Mesa, Arizona. The established objectives of the system were to: (1) facilitate the economical collection and storage of student performance data necessary to objectively evaluate the relative effectiveness of teachers, instructional methods, materials, and applied concepts; and (2) identify, on a daily basis, those students requiring special attention in specific subject areas. The system encompasses computer hardware/software and integrated curricula progression/administration devices. It provides daily evaluation and monitoring of performance as students progress at class or individualized rates. In the process, it notifies the student and collects information necessary to validate or invalidate subject presentation devices, methods, materials, and measurement devices in terms of direct benefit to the students. The system utilizes a small-scale computer (e.g., IBM 1130) to assure low-cost replicability, and may be used for many subjects of instruction.

  1. Analysis of CO2 trapping capacities and long-term migration for geological formations in the Norwegian North Sea using MRST-co2lab

    NASA Astrophysics Data System (ADS)

    Møll Nilsen, Halvor; Lie, Knut-Andreas; Andersen, Odd

    2015-06-01

    MRST-co2lab is a collection of open-source computational tools for modeling large-scale and long-time migration of CO2 in conductive aquifers, combining ideas from basin modeling, computational geometry, hydrology, and reservoir simulation. Herein, we employ the methods of MRST-co2lab to study long-term CO2 storage on the scale of hundreds of megatonnes. We consider public data sets of two aquifers from the Norwegian North Sea and use geometrical methods for identifying structural traps, percolation-type methods for identifying potential spill paths, and vertical-equilibrium methods for efficient simulation of structural, residual, and solubility trapping in a thousand-year perspective. In particular, we investigate how data resolution affects estimates of storage capacity and discuss workflows for identifying good injection sites and optimizing injection strategies.

  2. BCILAB: a platform for brain-computer interface development

    NASA Astrophysics Data System (ADS)

    Kothe, Christian Andreas; Makeig, Scott

    2013-10-01

    Objective. The past two decades have seen dramatic progress in our ability to model brain signals recorded by electroencephalography, functional near-infrared spectroscopy, etc., and to derive real-time estimates of user cognitive state, response, or intent for a variety of purposes: to restore communication by the severely disabled, to effect brain-actuated control and, more recently, to augment human-computer interaction. Continuing these advances, largely achieved through increases in computational power and methods, requires software tools to streamline the creation, testing, evaluation and deployment of new data analysis methods. Approach. Here we present BCILAB, an open-source MATLAB-based toolbox built to address the need for the development and testing of brain-computer interface (BCI) methods by providing an organized collection of over 100 pre-implemented methods and method variants, an easily extensible framework for the rapid prototyping of new methods, and a highly automated framework for systematic testing and evaluation of new implementations. Main results. To validate and illustrate the use of the framework, we present two sample analyses of publicly available data sets from recent BCI competitions and from a rapid serial visual presentation task. We demonstrate the straightforward use of BCILAB to obtain results compatible with the current BCI literature. Significance. The aim of the BCILAB toolbox is to provide the BCI community a powerful toolkit for methods research and evaluation, thereby helping to accelerate the pace of innovation in the field, while complementing the existing spectrum of tools for real-time BCI experimentation, deployment and use.

  3. A Comparative Assessment of Computer Literacy of Private and Public Secondary School Students in Lagos State, Nigeria

    ERIC Educational Resources Information Center

    Osunwusi, Adeyinka Olumuyiwa; Abifarin, Michael Segun

    2013-01-01

    The aim of this study was to conduct a comparative assessment of computer literacy of private and public secondary school students. Although the definition of computer literacy varies widely, this study treated computer literacy in terms of access to, and use of, computers and the internet, basic knowledge and skills required to use computers and…

  4. Flight Dynamics and Control of a Morphing UAV: Bio inspired by Natural Fliers

    DTIC Science & Technology

    2017-02-17

    Approved for public release: distribution unlimited. IV Modelling and Sizing: the Tornado Vortex Lattice Method (VLM) was used for aerodynamic prediction... Tornado is Vortex Lattice Method software programmed in MATLAB; it was selected due to its fast solving time and ability to be controlled through...custom MATLAB scripts. Tornado VLM models the wing as a thin sheet of discrete vortices and computes the pressure and force distributions around the

  5. Jaccard distance based weighted sparse representation for coarse-to-fine plant species recognition.

    PubMed

    Zhang, Shanwen; Wu, Xiaowei; You, Zhuhong

    2017-01-01

    Leaf-based plant species recognition plays an important role in ecological protection; however, its application to large, modern leaf databases has been a long-standing obstacle due to computational cost and feasibility. Recognizing these limitations, we propose a Jaccard distance based sparse representation (JDSR) method, which adopts a two-stage, coarse-to-fine strategy for plant species recognition. In the first stage, we use the Jaccard distance between the test sample and each training sample to coarsely determine the candidate classes of the test sample. The second stage applies a Jaccard distance based weighted sparse representation classification (WSRC), which aims to approximately represent the test sample in the training space and classify it by the approximation residuals. Since the training model of our JDSR method involves far fewer, but more informative, representatives, the method is expected to overcome the high computational and memory costs of traditional sparse representation based classification. Comparative experimental results on a public leaf image database demonstrate that the proposed method outperforms other existing feature extraction and SRC based plant recognition methods in terms of both accuracy and computational speed.
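
    A small sketch of the coarse first stage as we read it; the binary feature vectors here are random placeholders, and the candidate-set size is an assumption (the abstract does not specify either): compute the Jaccard distance from the test sample to every training sample and keep the classes of the nearest neighbors as candidates for the finer WSRC stage.

      import numpy as np

      def jaccard_distance(a, b):
          # 1 - |a AND b| / |a OR b| for binary feature vectors.
          union = np.logical_or(a, b).sum()
          return 1.0 - np.logical_and(a, b).sum() / union if union else 0.0

      rng = np.random.default_rng(3)
      train = rng.integers(0, 2, size=(100, 64)).astype(bool)  # 100 samples
      labels = rng.integers(0, 10, size=100)                   # 10 species
      test = rng.integers(0, 2, size=64).astype(bool)

      d = np.array([jaccard_distance(test, s) for s in train])
      candidates = np.unique(labels[np.argsort(d)[:15]])  # coarse candidates
      print("candidate species for the fine WSRC stage:", candidates)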

  6. Spatial Statistics for Tumor Cell Counting and Classification

    NASA Astrophysics Data System (ADS)

    Wirjadi, Oliver; Kim, Yoo-Jin; Breuel, Thomas

    Counting and classifying cells in histological sections is a standard task in histology. One example is the grading of meningiomas, benign tumors of the meninges, which requires assessing the fraction of proliferating cells in an image. As this process is very time consuming when performed manually, automation is required. To address such problems, we propose a novel application of Markov point process methods in computer vision, leading to algorithms for computing the locations of circular objects in images. In contrast to previous algorithms using such spatial statistics methods in image analysis, the present one is fully trainable. This is achieved by combining point process methods with statistical classifiers. Using simulated data, the method proposed in this paper is shown to be more accurate and more robust to noise than standard image processing methods. On the publicly available SIMCEP benchmark for cell image analysis algorithms, the cell count performance of the present method is significantly more accurate than results published elsewhere, especially when cells form dense clusters. Furthermore, the proposed system performs as well as a state-of-the-art algorithm for the computer-aided histological grading of meningiomas when combined with a simple k-nearest neighbor classifier for identifying proliferating cells.

  7. DREAMTools: a Python package for scoring collaborative challenges

    PubMed Central

    Cokelaer, Thomas; Bansal, Mukesh; Bare, Christopher; Bilal, Erhan; Bot, Brian M.; Chaibub Neto, Elias; Eduati, Federica; de la Fuente, Alberto; Gönen, Mehmet; Hill, Steven M.; Hoff, Bruce; Karr, Jonathan R.; Küffner, Robert; Menden, Michael P.; Meyer, Pablo; Norel, Raquel; Pratap, Abhishek; Prill, Robert J.; Weirauch, Matthew T.; Costello, James C.; Stolovitzky, Gustavo; Saez-Rodriguez, Julio

    2016-01-01

    DREAM challenges are community competitions designed to advance computational methods and address fundamental questions in systems biology and translational medicine. Each challenge asks participants to develop and apply computational methods to either predict unobserved outcomes or identify unknown model parameters given a set of training data. Computational methods are evaluated using an automated scoring metric, scores are posted to a public leaderboard, and methods are published to facilitate community discussions on how to build improved methods. By engaging participants from a wide range of science and engineering backgrounds, DREAM challenges can comparatively evaluate a wide range of statistical, machine learning, and biophysical methods. Here, we describe DREAMTools, a Python package for evaluating DREAM challenge scoring metrics. DREAMTools provides a command line interface that enables researchers to test new methods on past challenges, as well as a framework for scoring new challenges. As of March 2016, DREAMTools includes more than 80% of completed DREAM challenges. DREAMTools complements the data, metadata, and software tools available at the DREAM website http://dreamchallenges.org and on the Synapse platform at https://www.synapse.org. Availability: DREAMTools is a Python package. Releases and documentation are available at http://pypi.python.org/pypi/dreamtools. The source code is available at http://github.com/dreamtools/dreamtools. PMID:27134723

  8. Art History Interactive Videodisc Project at the University of Iowa.

    ERIC Educational Resources Information Center

    Sustik, Joan M.

    A project which developed a retrieval system to evaluate the advantages and disadvantages of an interactive computer and video display system over traditional methods for using a slide library is described in this publication. The art school slide library of the University of Iowa stores transparencies which are arranged alphabetically within…

  9. CACTUS: Command and Control Training Using Knowledge-Based Simulations

    ERIC Educational Resources Information Center

    Hartley, Roger; Ravenscroft, Andrew; Williams, R. J.

    2008-01-01

    The CACTUS project was concerned with command and control training of large incidents where public order may be at risk, such as large demonstrations and marches. The training requirements and objectives of the project are first summarized justifying the use of knowledge-based computer methods to support and extend conventional training…

  10. Practical Tools for Content Development: Pre-Service Teachers' Experiences and Perceptions

    ERIC Educational Resources Information Center

    Yurtseven Avci, Zeynep; Eren, Esra; Seckin Kapucu, Munise

    2016-01-01

    This study adopts a phenomenology approach as its research design method to investigate pre-service teachers' experiences and perceptions of using practical tools for content development. The participants are twenty-four pre-service teachers who were taking the Computer II course during the 2013-2014 spring semester at a public university in Turkey. During…

  11. Completing the Link between Exposure Science and Toxicology for Improved Environmental Health Decision Making: The Aggregate Exposure Pathway Framework

    EPA Science Inventory

    Driven by major scientific advances in analytical methods, biomonitoring, computation, and a newly articulated vision for a greater impact in public health, the field of exposure science is undergoing a rapid transition from a field of observation to a field of prediction. Deploy...

  12. Virginia ridesharing statistics : methodologies for determining carpooler and vanpool average life bases and the average fuel economy of commuter vehicles.

    DOT National Transportation Integrated Search

    1985-01-01

    The objective of this research was to investigate methods of computing average life values for carpoolers and vanpools in Virginia. These statistics are to be used by the Rail and Public Transportation Division in evaluating the efficiency and cost-e...

  13. Information Communication Technology Policy and Public Primary Schools' Efficiency in Rwanda

    ERIC Educational Resources Information Center

    Munyengabe, Sylvestre; Haiyan, He; Yiyi, Zhao

    2018-01-01

    Teaching and learning processes have been developed through different methods and materials; nowadays, the introduction of computers and other ICT tools in different forms and at different levels of education has been found to be highly influential in the education systems of different countries. The main objective of this study was to correlate Information…

  14. vFitness: a web-based computing tool for improving estimation of in vitro HIV-1 fitness experiments

    PubMed Central

    2010-01-01

    Background The replication rate (or fitness) between viral variants has been investigated in vivo and in vitro for human immunodeficiency virus (HIV). HIV fitness plays an important role in the development and persistence of drug resistance. The accurate estimation of viral fitness relies on complicated computations based on statistical methods. This calls for tools that are easy to access and intuitive to use for various experiments of viral fitness. Results Based on a mathematical model and several statistical methods (least-squares approach and measurement error models), a Web-based computing tool has been developed for improving estimation of virus fitness in growth competition assays of human immunodeficiency virus type 1 (HIV-1). Conclusions Unlike the two-point calculation used in previous studies, the estimation here uses linear regression methods with all observed data in the competition experiment to more accurately estimate relative viral fitness parameters. The dilution factor is introduced for making the computational tool more flexible to accommodate various experimental conditions. This Web-based tool is implemented in C# language with Microsoft ASP.NET, and is publicly available on the Web at http://bis.urmc.rochester.edu/vFitness/. PMID:20482791

  15. vFitness: a web-based computing tool for improving estimation of in vitro HIV-1 fitness experiments.

    PubMed

    Ma, Jingming; Dykes, Carrie; Wu, Tao; Huang, Yangxin; Demeter, Lisa; Wu, Hulin

    2010-05-18

    The replication rate (or fitness) between viral variants has been investigated in vivo and in vitro for human immunodeficiency virus (HIV). HIV fitness plays an important role in the development and persistence of drug resistance. The accurate estimation of viral fitness relies on complicated computations based on statistical methods. This calls for tools that are easy to access and intuitive to use for various experiments of viral fitness. Based on a mathematical model and several statistical methods (least-squares approach and measurement error models), a Web-based computing tool has been developed for improving estimation of virus fitness in growth competition assays of human immunodeficiency virus type 1 (HIV-1). Unlike the two-point calculation used in previous studies, the estimation here uses linear regression methods with all observed data in the competition experiment to more accurately estimate relative viral fitness parameters. The dilution factor is introduced for making the computational tool more flexible to accommodate various experimental conditions. This Web-based tool is implemented in C# language with Microsoft ASP.NET, and is publicly available on the Web at http://bis.urmc.rochester.edu/vFitness/.
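
    As a rough illustration of the regression idea described above, the sketch below fits log(mutant/wild-type) against time under the usual exponential-growth competition model; the data values are invented, and the dilution-factor correction and measurement-error models of vFitness are omitted:

```python
import numpy as np

# Invented growth-competition data: sampling times (days) and measured copy
# numbers of the mutant (m) and wild-type (w) variants.
t = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
m = np.array([1.0e3, 2.4e3, 6.1e3, 1.5e4, 3.6e4])
w = np.array([1.0e3, 1.9e3, 3.8e3, 7.1e3, 1.4e4])

# Under exponential growth, m(t) = m0*exp(gm*t) and w(t) = w0*exp(gw*t), so
# log(m/w) is linear in t with slope d = gm - gw, the net fitness difference.
# Regressing on all time points is the improvement over the two-point rule.
d, log_ratio0 = np.polyfit(t, np.log(m / w), 1)
print(f"estimated fitness difference d = {d:.3f} per day")
```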

  16. Steady and unsteady three-dimensional transonic flow computations by integral equation method

    NASA Technical Reports Server (NTRS)

    Hu, Hong

    1994-01-01

    This is the final technical report of the research performed under the grant: NAG1-1170, from the National Aeronautics and Space Administration. The report consists of three parts. The first part presents the work on unsteady flows around a zero-thickness wing. The second part presents the work on steady flows around non-zero thickness wings. The third part presents the massively parallel processing implementation and performance analysis of integral equation computations. At the end of the report, publications resulting from this grant are listed and attached.

  17. Public health situation awareness: toward a semantic approach

    NASA Astrophysics Data System (ADS)

    Mirhaji, Parsa; Richesson, Rachel L.; Turley, James P.; Zhang, Jiajie; Smith, Jack W.

    2004-04-01

    We propose a knowledge-based public health situation awareness system. The basis for this system is an explicit representation of public health situation awareness concepts and their interrelationships. This representation is based upon the users' (public health decision makers) cognitive model of the world, and optimized towards the efficacy of performance and relevance to the public health situation awareness processes and tasks. In our approach, explicit domain knowledge is the foundation for interpretation of public health data, as opposed to conventional systems in which statistical methods are the essence of the processes. Objectives: To develop a prototype knowledge-based system for public health situation awareness and to demonstrate the utility of knowledge-intensive approaches in integrating heterogeneous information, eliminating the effects of incomplete and poor-quality surveillance data, handling uncertainty in syndrome and aberration detection, and visualizing complex information structures in public health surveillance settings, particularly in the context of bioterrorism (BT) preparedness. The system employs the Resource Description Framework (RDF) and additional layers of more expressive languages to explicate the knowledge of domain experts into machine-interpretable and computable problem-solving modules that can then guide users and computer systems in sifting through the most "relevant" data for syndrome and outbreak detection and investigation of the root cause of the event. The Center for Biosecurity and Public Health Informatics Research is developing a prototype knowledge-based system around influenza, which has complex natural disease patterns, many public health implications, and is a potential agent for bioterrorism. The preliminary data from this effort may demonstrate superior performance in information integration, syndrome and aberration detection, information access through information visualization, and cross-domain investigation of the root causes of public health events.

  18. SLDAssay: A software package and web tool for analyzing limiting dilution assays.

    PubMed

    Trumble, Ilana M; Allmon, Andrew G; Archin, Nancie M; Rigdon, Joseph; Francis, Owen; Baldoni, Pedro L; Hudgens, Michael G

    2017-11-01

    Serial limiting dilution (SLD) assays are used in many areas of infectious disease related research. This paper presents SLDAssay, a free and publicly available R software package and web tool for analyzing data from SLD assays. SLDAssay computes the maximum likelihood estimate (MLE) for the concentration of target cells, with corresponding exact and asymptotic confidence intervals. Exact and asymptotic goodness of fit p-values, and a bias-corrected (BC) MLE are also provided. No other publicly available software currently implements the BC MLE or the exact methods. For validation of SLDAssay, results from Myers et al. (1994) are replicated. Simulations demonstrate the BC MLE is less biased than the MLE. Additionally, simulations demonstrate that exact methods tend to give better confidence interval coverage and goodness-of-fit tests with lower type I error than the asymptotic methods. Additional advantages of using exact methods are also discussed. Copyright © 2017 Elsevier B.V. All rights reserved.
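
    A minimal sketch of the concentration MLE under the single-hit Poisson model conventionally used for limiting dilution data (the assay layout and counts below are invented; SLDAssay itself adds the bias-corrected MLE, exact confidence intervals, and goodness-of-fit tests):

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Invented SLD layout: input cells per well, replicate wells, positive wells.
cells = np.array([1.0e6, 2.0e5, 4.0e4, 8.0e3])
n     = np.array([2, 4, 6, 8])
pos   = np.array([2, 3, 2, 0])

def neg_loglik(tau):
    """Single-hit Poisson model: P(well positive) = 1 - exp(-tau * cells),
    where tau is the target-cell concentration per input cell."""
    p = np.clip(1.0 - np.exp(-tau * cells), 1e-12, 1.0 - 1e-12)  # guard logs
    return -np.sum(pos * np.log(p) + (n - pos) * np.log1p(-p))

res = minimize_scalar(neg_loglik, bounds=(1e-12, 1e-2), method="bounded")
print(f"MLE concentration: {res.x:.3e} per input cell")
```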

  19. Extension of research data repository system to support direct compute access to biomedical datasets: enhancing Dataverse to support large datasets

    PubMed Central

    McKinney, Bill; Meyer, Peter A.; Crosas, Mercè; Sliz, Piotr

    2016-01-01

    Access to experimental X-ray diffraction image data is important for validation and reproduction of macromolecular models and indispensable for the development of structural biology processing methods. In response to the evolving needs of the structural biology community, we recently established a diffraction data publication system, the Structural Biology Data Grid (SBDG, data.sbgrid.org), to preserve primary experimental datasets supporting scientific publications. All datasets published through the SBDG are freely available to the research community under a public domain dedication license, with metadata compliant with the DataCite Schema (schema.datacite.org). A proof-of-concept study demonstrated community interest and utility. Publication of large datasets is a challenge shared by several fields, and the SBDG has begun collaborating with the Institute for Quantitative Social Science at Harvard University to extend the Dataverse (dataverse.org) open-source data repository system to structural biology datasets. Several extensions are necessary to support the size and metadata requirements for structural biology datasets. In this paper, we describe one such extension—functionality supporting preservation of filesystem structure within Dataverse—which is essential for both in-place computation and supporting non-http data transfers. PMID:27862010

  20. 76 FR 11435 - Privacy Act of 1974; Computer Matching Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-02

    ... Security Administration. SUMMARY: Pursuant to the Computer Matching and Privacy Protection Act of 1988, Public Law 100-503, the Computer Matching and Privacy Protections Amendments of 1990, Pub. L. 101-508... Interpreting the Provisions of Public Law 100-503, the Computer Matching and Privacy Protection Act of 1988...

  1. Big Data’s Role in Precision Public Health

    PubMed Central

    Dolley, Shawn

    2018-01-01

    Precision public health is an emerging practice to more granularly predict and understand public health risks and customize treatments for more specific and homogeneous subpopulations, often using new data, technologies, and methods. Big data is one element that has consistently helped to achieve these goals, through its ability to deliver to practitioners a volume and variety of structured or unstructured data not previously possible. Big data has enabled more widespread and specific research and trials of stratifying and segmenting populations at risk for a variety of health problems. Examples of success using big data are surveyed in surveillance and signal detection, predicting future risk, targeted interventions, and understanding disease. Using novel big data or big data approaches has risks that remain to be resolved. The continued growth in volume and variety of available data, decreased costs of data capture, and emerging computational methods mean big data success will likely be a required pillar of precision public health into the future. This review article aims to identify the precision public health use cases where big data has added value, identify classes of value that big data may bring, and outline the risks inherent in using big data in precision public health efforts. PMID:29594091

  2. Big Data's Role in Precision Public Health.

    PubMed

    Dolley, Shawn

    2018-01-01

    Precision public health is an emerging practice to more granularly predict and understand public health risks and customize treatments for more specific and homogeneous subpopulations, often using new data, technologies, and methods. Big data is one element that has consistently helped to achieve these goals, through its ability to deliver to practitioners a volume and variety of structured or unstructured data not previously possible. Big data has enabled more widespread and specific research and trials of stratifying and segmenting populations at risk for a variety of health problems. Examples of success using big data are surveyed in surveillance and signal detection, predicting future risk, targeted interventions, and understanding disease. Using novel big data or big data approaches has risks that remain to be resolved. The continued growth in volume and variety of available data, decreased costs of data capture, and emerging computational methods mean big data success will likely be a required pillar of precision public health into the future. This review article aims to identify the precision public health use cases where big data has added value, identify classes of value that big data may bring, and outline the risks inherent in using big data in precision public health efforts.

  3. Computers and the Internet: Tools for Youth Empowerment

    PubMed Central

    2005-01-01

    Background Youth are often disenfranchised in their communities and may feel they have little voice. Since computers are an important aspect of youth culture, they may offer solutions for increasing youth participation in communities. Objective This qualitative case study investigated the perceptions of 19 (predominantly female) inner-city school youth about their use of computers and the Internet in a school-based community development project. Methods Youth working with public health nurses in a school-based community development project communicated with local community members using computer-mediated communication, surveyed peers online, built websites, searched for information online, and prepared project materials using computers and the Internet. Participant observation, semistructured interviews, analysis of online messages, and online- and paper-based surveys were used to gather data about youth’s and adults’ perceptions and use of the technologies. Constant comparison method and between-method triangulation were used in the analysis to confirm the existence of themes. Results Not all youth were interested in working with computers. Some electronic messages from adults were perceived to be critical, and writing to adults was intimidating for some youth. In addition, technical problems were experienced. Despite these barriers, most youth perceived that using computers and the Internet reduced their anxiety concerning communication with adults, increased their control when dealing with adults, raised their perception of their social status, increased participation within the community, supported reflective thought, increased efficiency, and improved their access to resources. Conclusions Overall, youth perceived computers and the Internet to be empowering tools, and they should be encouraged to use such technology to support them in community initiatives. PMID:16403715

  4. Examining the Relationship between Psychosocial Work Factors and Musculoskeletal Discomfort among Computer Users in Malaysia

    PubMed Central

    Zakerian, SA; Subramaniam, ID

    2011-01-01

    Background: With computers rapidly carving a niche in virtually every nook and crevice of today’s fast-paced society, musculoskeletal disorders are becoming more prevalent among computer users, who comprise a wide spectrum of the Malaysian population, including office workers. While the extant literature depicts extensive research on musculoskeletal disorders in general, the five dimensions of psychosocial work factors (job demands, job contentment, job control, computer-related problems and social interaction) attributed to work-related musculoskeletal disorders have been neglected. This study examines the aforementioned elements in detail as they pertain to musculoskeletal disorders, focusing in particular on 120 office workers at Malaysian public sector organizations whose jobs require intensive computer usage. Methods: Research was conducted between March and July 2009 in public service organizations in Malaysia. This study was conducted via a survey utilizing self-completed questionnaires and diaries. The relationship between psychosocial work factors and musculoskeletal discomfort was ascertained through regression analyses, which revealed that some factors were more important than others. Results: The results indicate a significant relationship between psychosocial work factors and musculoskeletal discomfort among computer users. Several of these factors, such as job control, computer-related problems, and social interaction, are found to be more important than others in musculoskeletal discomfort. Conclusion: With computer usage on the rise, the prevalence of musculoskeletal discomfort could lead to unnecessary disabilities; hence the vital need for greater attention to this aspect in the workplace, to alleviate potential problems in the future. PMID:23113058

  5. Predicting "Hot" and "Warm" Spots for Fragment Binding.

    PubMed

    Rathi, Prakash Chandra; Ludlow, R Frederick; Hall, Richard J; Murray, Christopher W; Mortenson, Paul N; Verdonk, Marcel L

    2017-05-11

    Computational fragment mapping methods aim to predict hotspots on protein surfaces where small fragments will bind. Such methods are popular for druggability assessment as well as structure-based design. However, to date, researchers developing or using such tools have had no clear way of assessing the performance of these methods. Here, we introduce the first diverse, high-quality validation set for computational fragment mapping. The set contains 52 diverse examples of fragment binding "hot" and "warm" spots from the Protein Data Bank (PDB). Additionally, we describe PLImap, a novel protocol for fragment mapping based on the Protein-Ligand Informatics force field (PLIff). We evaluate PLImap against the new fragment mapping test set, and compare its performance to that of simple shape-based algorithms and fragment docking using GOLD. PLImap is made publicly available from https://bitbucket.org/AstexUK/pli.

  6. Measurement of breast volume using body scan technology (computer-aided anthropometry).

    PubMed

    Veitch, Daisy; Burford, Karen; Dench, Phil; Dean, Nicola; Griffin, Philip

    2012-01-01

    Assessment of breast volume is an important tool for preoperative planning in various breast surgeries and other applications, such as bra development. Accurate assessment can improve the consistency and quality of surgery outcomes. This study outlines a non-invasive method to measure breast volume using a whole body 3D laser surface anatomy scanner, the Cyberware WBX. It expands on a previous publication where this method was validated against patients undergoing mastectomy. It specifically outlines and expands the computer-aided anthropometric (CAA) method for extracting breast volumes in a non-invasive way from patients enrolled in a breast reduction study at Flinders Medical Centre, South Australia. This step-by-step description allows others to replicate this work and provides an additional tool to assist them in their own clinical practice and development of designs.

  7. Selected papers in the applied computer sciences 1992

    USGS Publications Warehouse

    Wiltshire, Denise A.

    1992-01-01

    This compilation of short papers reports on technical advances in the applied computer sciences. The papers describe computer applications in support of earth science investigations and research. This is the third volume in the series "Selected Papers in the Applied Computer Sciences." Listed below are the topics addressed in the compilation: integration of geographic information systems and expert systems for resource management; visualization of topography using digital image processing; development of a ground-water database for the southeastern United States using a geographic information system; integration and aggregation of stream-drainage data using a geographic information system; procedures used in production of digital geologic coverage using compact disc read-only memory (CD-ROM) technology; and automated methods for producing a technical publication on estimated water use in the United States.

  8. Project Final Report: Ubiquitous Computing and Monitoring System (UCoMS) for Discovery and Management of Energy Resources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tzeng, Nian-Feng; White, Christopher D.; Moreman, Douglas

    2012-07-14

    The UCoMS research cluster has spearheaded three research areas since August 2004, including wireless and sensor networks, Grid computing, and petroleum applications. The primary goals of UCoMS research are three-fold: (1) creating new knowledge to push forward the technology forefronts on pertinent research on the computing and monitoring aspects of energy resource management, (2) developing and disseminating software codes and toolkits for the research community and the public, and (3) establishing system prototypes and testbeds for evaluating innovative techniques and methods. Substantial progress and diverse accomplishments have been made by research investigators in their respective areas of expertise cooperatively on such topics as sensors and sensor networks, wireless communication and systems, and computational Grids, particularly relevant to petroleum applications.

  9. Creating, documenting and sharing network models.

    PubMed

    Crook, Sharon M; Bednar, James A; Berger, Sandra; Cannon, Robert; Davison, Andrew P; Djurfeldt, Mikael; Eppler, Jochen; Kriener, Birgit; Furber, Steve; Graham, Bruce; Plesser, Hans E; Schwabe, Lars; Smith, Leslie; Steuber, Volker; van Albada, Sacha

    2012-01-01

    As computational neuroscience matures, many simulation environments are available that are useful for neuronal network modeling. However, methods for successfully documenting models for publication and for exchanging models and model components among these projects are still under development. Here we briefly review existing software and applications for network model creation, documentation and exchange. Then we discuss a few of the larger issues facing the field of computational neuroscience regarding network modeling and suggest solutions to some of these problems, concentrating in particular on standardized network model terminology, notation, and descriptions and explicit documentation of model scaling. We hope this will enable and encourage computational neuroscientists to share their models more systematically in the future.

  10. Surface Modeling, Grid Generation, and Related Issues in Computational Fluid Dynamic (CFD) Solutions

    NASA Technical Reports Server (NTRS)

    Choo, Yung K. (Compiler)

    1995-01-01

    The NASA Steering Committee for Surface Modeling and Grid Generation (SMAGG) sponsored a workshop on surface modeling, grid generation, and related issues in Computational Fluid Dynamics (CFD) solutions at Lewis Research Center, Cleveland, Ohio, May 9-11, 1995. The workshop provided a forum to identify industry needs, strengths, and weaknesses of the five grid technologies (patched structured, overset structured, Cartesian, unstructured, and hybrid), and to exchange thoughts about where each technology will be in 2 to 5 years. The workshop also provided opportunities for engineers and scientists to present new methods, approaches, and applications in SMAGG for CFD. This Conference Publication (CP) consists of papers on industry overview, NASA overview, five grid technologies, new methods/approaches/applications, and software systems.

  11. Improved Hybrid Modeling of Spent Fuel Storage Facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bibber, Karl van

    This work developed a new computational method for improving the ability to calculate the neutron flux in deep-penetration radiation shielding problems that contain areas with strong streaming. The "gold standard" method for radiation transport is Monte Carlo (MC), as it samples the physics exactly and requires few approximations. Historically, however, MC was not useful for shielding problems because of the computational challenge of following particles through dense shields. Instead, deterministic methods, which are superior in terms of computational effort for these problem types but are not as accurate, were used. Hybrid methods, which use deterministic solutions to improve MC calculations through a process called variance reduction, can make it tractable from a computational time and resource use perspective to use MC for deep-penetration shielding. Perhaps the most widespread and accessible of these methods are the Consistent Adjoint Driven Importance Sampling (CADIS) and Forward-Weighted CADIS (FW-CADIS) methods. For problems containing strong anisotropies, such as power plants with pipes through walls, spent fuel cask arrays, active interrogation, and locations with small air gaps or plates embedded in water or concrete, hybrid methods are still insufficiently accurate. In this work, a new method for generating variance reduction parameters for strongly anisotropic, deep-penetration radiation shielding studies was developed. This method generates an alternate form of the adjoint scalar flux quantity, Φ_Ω, which is used by both CADIS and FW-CADIS to generate variance reduction parameters for local and global response functions, respectively. The new method, called CADIS-Ω, was implemented in the Denovo/ADVANTG software. Results indicate that the flux generated by CADIS-Ω incorporates localized angular anisotropies in the flux more effectively than standard methods, and CADIS-Ω outperformed CADIS in several test problems. This initial work indicates that CADIS-Ω may be highly useful for shielding problems with strong angular anisotropies, benefiting the public by increasing accuracy at lower computational effort for many problems of energy, security, and economic importance.

  12. 45 CFR 79.27 - Computation of time.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 45 Public Welfare 1 2010-10-01 2010-10-01 false Computation of time. 79.27 Section 79.27 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL ADMINISTRATION PROGRAM FRAUD CIVIL REMEDIES § 79.27 Computation of time. (a) In computing any period of time under this part or in an order issued...

  13. ComputerTown. A Do-It-Yourself Community Computer Project.

    ERIC Educational Resources Information Center

    Loop, Liza; And Others

    This manual based on Menlo Park's experiences in participating in a nationwide experimental computer literacy project provides guidelines for the development and management of a ComputerTown. This project, which was begun in September 1981 in the Menlo Park Public Library, concentrates on providing public access to microcomputers through hands-on…

  14. 76 FR 12397 - Privacy Act of 1974, as Amended; Computer Matching Program (SSA/Bureau of the Public Debt (BPD...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-07

    ...; Computer Matching Program (SSA/ Bureau of the Public Debt (BPD))--Match Number 1038 AGENCY: Social Security... as shown above. SUPPLEMENTARY INFORMATION: A. General The Computer Matching and Privacy Protection... containing SSNs extracted from the Supplemental Security Record database. Exchanges for this computer...

  15. A New Method of Synthetic Aperture Radar Image Reconstruction Using Modified Convolution Back-Projection Algorithm.

    DTIC Science & Technology

    1986-08-01

    Approved for public release. …from this set of projections. The Convolution Back-Projection (CBP) algorithm is a widely used technique in Computer-Aided Tomography (CAT). In this work, a modified CBP algorithm is applied to synthetic aperture radar image reconstruction. University of Illinois at Urbana-Champaign, 1985.

  16. Exploiting Data Sparsity in Parallel Matrix Powers Computations

    DTIC Science & Technology

    2013-05-03

    …matrices of the form A = D + USV^H, where D is sparse and USV^H has low rank but may be dense. Matrices of this form arise in many practical applications, …methods, numerical partial differential equation solvers, and preconditioned iterative methods. If A has this form, our algorithm enables a communication…
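
    A minimal sketch of why this structure pays off, assuming A = D + USV^H with sparse D and skinny U, S, V: each application of A costs one sparse matvec plus two thin products, so the dense low-rank part is never formed (sizes and values below are illustrative):

```python
import numpy as np
import scipy.sparse as sp

n, k = 2000, 5                                   # illustrative sizes
D = sp.random(n, n, density=1e-3, format="csr")  # sparse part
U, S, V = np.random.randn(n, k), np.random.randn(k, k), np.random.randn(n, k)

def apply_A(x):
    """y = (D + U S V^H) x without forming the dense n-by-n matrix:
    one sparse matvec plus two skinny (n-by-k) products."""
    return D @ x + U @ (S @ (V.conj().T @ x))

# Repeated application yields the matrix powers kernel x, Ax, A^2 x, ...
powers = [np.random.randn(n)]
for _ in range(4):
    powers.append(apply_A(powers[-1]))
```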

  17. A Probabilistic Finite Element Analysis of Residual Stress Formation in Shrink-Fit Ceramic/Steel Gun Barrels

    DTIC Science & Technology

    2002-01-01

    In the present work, the Advanced Mean Value method developed by Millwater and co-workers is used [6-10]; see, e.g., H. R. Millwater and Y.-T. Wu, "Computational Structural Reliability Analysis of a…Turbine Blade," Proceedings International Gas Turbine and Aeroengine Congress and Exposition, Cincinnati, OH, May 24-27, 1993.

  18. Parallel mutual information estimation for inferring gene regulatory networks on GPUs

    PubMed Central

    2011-01-01

    Background Mutual information is a measure of similarity between two variables. It has been widely used in various application domains including computational biology, machine learning, statistics, image processing, and financial computing. Previously used simple histogram based mutual information estimators lack the precision in quality compared to kernel based methods. The recently introduced B-spline function based mutual information estimation method is competitive to the kernel based methods in terms of quality but at a lower computational complexity. Results We present a new approach to accelerate the B-spline function based mutual information estimation algorithm with commodity graphics hardware. To derive an efficient mapping onto this type of architecture, we have used the Compute Unified Device Architecture (CUDA) programming model to design and implement a new parallel algorithm. Our implementation, called CUDA-MI, can achieve speedups of up to 82 using double precision on a single GPU compared to a multi-threaded implementation on a quad-core CPU for large microarray datasets. We have used the results obtained by CUDA-MI to infer gene regulatory networks (GRNs) from microarray data. The comparisons to existing methods including ARACNE and TINGe show that CUDA-MI produces GRNs of higher quality in less time. Conclusions CUDA-MI is publicly available open-source software, written in CUDA and C++ programming languages. It obtains significant speedup over sequential multi-threaded implementation by fully exploiting the compute capability of commonly used CUDA-enabled low-cost GPUs. PMID:21672264
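
    For orientation, a minimal CPU sketch of the simple histogram-based estimator that the abstract contrasts with kernel and B-spline methods (the B-spline estimator instead spreads each sample over several bins; the data and bin count here are illustrative):

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Histogram estimator of I(X;Y) in bits: each sample falls in exactly
    one 2-D bin, which is what limits this estimator's precision."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)          # marginal of X
    py = pxy.sum(axis=0, keepdims=True)          # marginal of Y
    nz = pxy > 0                                 # avoid log(0)
    return np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz]))

rng = np.random.default_rng(0)
x = rng.normal(size=10_000)
print(mutual_information(x, x + 0.5 * rng.normal(size=x.size)))
```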

  19. Cloud Computing for DoD

    DTIC Science & Technology

    2012-05-01

    NASA Nebula Platform: a cloud computing pilot program at NASA Ames that integrates open-source components into a seamless, self-service platform; used for mission support and for education and public outreach (NASA Nebula, 2010). NSF-supported cloud research: support for cloud computing in… References: Mell, P. & Grance, T. (2011), The NIST Definition of Cloud Computing, NIST Special Publication 800-145; NASA Nebula (2010).

  20. The Galaxy platform for accessible, reproducible and collaborative biomedical analyses: 2016 update

    PubMed Central

    Afgan, Enis; Baker, Dannon; van den Beek, Marius; Blankenberg, Daniel; Bouvier, Dave; Čech, Martin; Chilton, John; Clements, Dave; Coraor, Nate; Eberhard, Carl; Grüning, Björn; Guerler, Aysam; Hillman-Jackson, Jennifer; Von Kuster, Greg; Rasche, Eric; Soranzo, Nicola; Turaga, Nitesh; Taylor, James; Nekrutenko, Anton; Goecks, Jeremy

    2016-01-01

    High-throughput data production technologies, particularly ‘next-generation’ DNA sequencing, have ushered in widespread and disruptive changes to biomedical research. Making sense of the large datasets produced by these technologies requires sophisticated statistical and computational methods, as well as substantial computational power. This has led to an acute crisis in the life sciences, as researchers without informatics training attempt to perform computation-dependent analyses. Since 2005, the Galaxy project has worked to address this problem by providing a framework that makes advanced computational tools usable by non-experts. Galaxy seeks to make data-intensive research more accessible, transparent and reproducible by providing a Web-based environment in which users can perform computational analyses and have all of the details automatically tracked for later inspection, publication, or reuse. In this report we highlight recently added features enabling biomedical analyses on a large scale. PMID:27137889

  1. Laser Spot Tracking Based on Modified Circular Hough Transform and Motion Pattern Analysis

    PubMed Central

    Krstinić, Damir; Skelin, Ana Kuzmanić; Milatić, Ivan

    2014-01-01

    Laser pointers are one of the most widely used interactive and pointing devices in different human-computer interaction systems. Existing approaches to vision-based laser spot tracking are designed for controlled indoor environments with the main assumption that the laser spot is very bright, if not the brightest, spot in images. In this work, we are interested in developing a method for an outdoor, open-space environment, which could be implemented on embedded devices with limited computational resources. Under these circumstances, none of the assumptions of existing methods for laser spot tracking can be applied, yet a novel and fast method with robust performance is required. Throughout the paper, we will propose and evaluate an efficient method based on modified circular Hough transform and Lucas–Kanade motion analysis. Encouraging results on a representative dataset demonstrate the potential of our method in an uncontrolled outdoor environment, while achieving maximal accuracy indoors. Our dataset and ground truth data are made publicly available for further development. PMID:25350502

  2. Laser spot tracking based on modified circular Hough transform and motion pattern analysis.

    PubMed

    Krstinić, Damir; Skelin, Ana Kuzmanić; Milatić, Ivan

    2014-10-27

    Laser pointers are one of the most widely used interactive and pointing devices in different human-computer interaction systems. Existing approaches to vision-based laser spot tracking are designed for controlled indoor environments with the main assumption that the laser spot is very bright, if not the brightest, spot in images. In this work, we are interested in developing a method for an outdoor, open-space environment, which could be implemented on embedded devices with limited computational resources. Under these circumstances, none of the assumptions of existing methods for laser spot tracking can be applied, yet a novel and fast method with robust performance is required. Throughout the paper, we will propose and evaluate an efficient method based on modified circular Hough transform and Lucas-Kanade motion analysis. Encouraging results on a representative dataset demonstrate the potential of our method in an uncontrolled outdoor environment, while achieving maximal accuracy indoors. Our dataset and ground truth data are made publicly available for further development.
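
    A minimal two-stage sketch using OpenCV, which ships both building blocks named in the abstract (circular Hough transform and Lucas-Kanade optical flow); the file name and detector thresholds are illustrative, and this is not the authors' implementation:

```python
import cv2
import numpy as np

# Hypothetical input video (file name illustrative).
cap = cv2.VideoCapture("laser_sequence.avi")
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

# Stage 1: circular Hough transform (stock OpenCV version here) proposes
# small circular candidates that may be the laser spot.
circles = cv2.HoughCircles(prev_gray, cv2.HOUGH_GRADIENT, dp=1, minDist=20,
                           param1=100, param2=15, minRadius=2, maxRadius=8)

ok, frame = cap.read()
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

if circles is not None:
    pts = circles[0, :, :2].astype(np.float32).reshape(-1, 1, 2)
    # Stage 2: Lucas-Kanade optical flow tracks each candidate; motion
    # pattern analysis (not shown) would then reject candidates that do
    # not move like a hand-held laser spot.
    next_pts, status, err = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts, None)
    print(next_pts[status.ravel() == 1])
```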

  3. Differential privacy based on importance weighting

    PubMed Central

    Ji, Zhanglong

    2014-01-01

    This paper analyzes a novel method for publishing data while still protecting privacy. The method is based on computing weights that make an existing dataset, for which there are no confidentiality issues, analogous to the dataset that must be kept private. The existing dataset may be genuine but public already, or it may be synthetic. The weights are importance sampling weights, but to protect privacy, they are regularized and have noise added. The weights allow statistical queries to be answered approximately while provably guaranteeing differential privacy. We derive an expression for the asymptotic variance of the approximate answers. Experiments show that the new mechanism performs well even when the privacy budget is small, and when the public and private datasets are drawn from different populations. PMID:24482559
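
    A deliberately simplified sketch of the idea, with crude histogram density-ratio weights, clipping as the regularization, and Laplace noise added to the query answer (the paper estimates the weights more carefully and regularizes and perturbs the weights themselves):

```python
import numpy as np

rng = np.random.default_rng(1)
public  = rng.normal(0.0, 1.0, size=5_000)   # public dataset
private = rng.normal(0.5, 1.0, size=5_000)   # private dataset to be mimicked

# Crude histogram density-ratio weights w(x) ~ p_private(x) / p_public(x).
edges  = np.linspace(-4.0, 5.0, 40)
h_pub  = np.histogram(public,  bins=edges, density=True)[0]
h_priv = np.histogram(private, bins=edges, density=True)[0]
idx    = np.clip(np.digitize(public, edges) - 1, 0, len(h_pub) - 1)
w      = h_priv[idx] / np.maximum(h_pub[idx], 1e-9)

C, eps = 5.0, 1.0            # weight clip (regularization), privacy budget
w = np.minimum(w, C)

def private_query(f):
    """Approximate E_private[f(X)] from public data via importance weighting,
    with Laplace noise calibrated to the clipped weights (f assumed in [0,1])."""
    return np.mean(w * f(public)) + rng.laplace(scale=C / (eps * public.size))

print(private_query(lambda x: (x > 0).astype(float)))  # approx. P_private(X > 0)
```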

  4. Use of Lean response to improve pandemic influenza surge in public health laboratories.

    PubMed

    Isaac-Renton, Judith L; Chang, Yin; Prystajecky, Natalie; Petric, Martin; Mak, Annie; Abbott, Brendan; Paris, Benjamin; Decker, K C; Pittenger, Lauren; Guercio, Steven; Stott, Jeff; Miller, Joseph D

    2012-01-01

    A novel influenza A (H1N1) virus detected in April 2009 rapidly spread around the world. North American provincial and state laboratories have well-defined roles and responsibilities, including providing accurate, timely test results for patients and information for regional public health and other decision makers. We used the multidisciplinary response and rapid implementation of process changes based on Lean methods at the provincial public health laboratory in British Columbia, Canada, to improve laboratory surge capacity in the 2009 influenza pandemic. Observed and computer-simulated evaluation results from the rapid process changes showed that use of Lean tools successfully expanded surge capacity, which enabled response to the 10-fold increase in testing demands.

  5. Mentorship and competencies for applied chronic disease epidemiology.

    PubMed

    Lengerich, Eugene J; Siedlecki, Jennifer C; Brownson, Ross; Aldrich, Tim E; Hedberg, Katrina; Remington, Patrick; Siegel, Paul Z

    2003-01-01

    To understand the potential and establish a framework for mentoring as a method to develop professional competencies of state-level applied chronic disease epidemiologists, model mentorship programs were reviewed, specific competencies were identified, and competencies were then matched to essential public health services. Although few existing mentorship programs in public health were identified, common themes in other professional mentorship programs support the potential of mentoring as an effective means to develop capacity for applied chronic disease epidemiology. Proposed competencies for chronic disease epidemiologists in a mentorship program include planning, analysis, communication, basic public health, informatics and computer knowledge, and cultural diversity. Mentoring may constitute a viable strategy to build chronic disease epidemiology capacity, especially in public health agencies where resource and personnel system constraints limit opportunities to recruit and hire new staff.

  6. Computers for Political Change: PeaceNet and Public Data Access.

    ERIC Educational Resources Information Center

    Downing, John D. H.

    1989-01-01

    Describes two computer communication projects: PeaceNet, devoted to peace issues; and Public Data Access, devoted to making U.S. government information more broadly available. Discusses the potential of new technology (computer communication) for grass-roots political movements. (SR)

  7. Differences in Views of School Principals and Teachers Regarding Technology Integration

    ERIC Educational Resources Information Center

    Claro, Magdalena; Nussbaum, Miguel; López, Ximena; Contardo, Victoria

    2017-01-01

    This paper studies the similarities and differences among the views of school principals and teachers regarding a mobile computer lab (MCL) initiative implemented in 1,591 public schools in Chile. It also characterizes the aspects in which their views diverge. A mixed methods study was carried out in two stages: first, a quantitative stage, where…

  8. Becoming an Interdisciplinary Scientist: An Analysis of Students' Experiences in Three Computer Science Doctoral Programmes

    ERIC Educational Resources Information Center

    Calatrava Moreno, María del Carmen; Danowitz, Mary Ann

    2016-01-01

    The aim of this study was to identify how and why doctoral students do interdisciplinary research. A mixed-methods approach utilising bibliometric analysis of the publications of 195 students identified those who had published interdisciplinary research. This objective measurement of the interdisciplinarity, applying the Rao-Stirling index to Web…

  9. 76 FR 47648 - Proposed Collection; Comment Request for Form 8697

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-05

    ... Reduction Act of 1995, Public Law 104-13 (44 U.S.C. 3506(c)(2)(A)). Currently, the IRS is soliciting... Contracts. DATES: Written comments should be received on or before October 4, 2011 to be assured of...: Interest Computation Under the Look-Back Method for Completed Long-Term Contracts. OMB Number: 1545-1031...

  10. Communicating forest management science and practices through visualized and animated media approaches to community presentations: An exploration and assessment

    Treesearch

    Donald E. Zimmerman; Carol Akerelrea; Jane Kapler Smith; Garrett J. O' Keefe

    2006-01-01

    Natural-resource managers have used a variety of computer-mediated presentation methods to communicate management practices to diverse publics. We explored the effects of visualizing and animating predictions from mathematical models in computerized presentations explaining forest succession (forest growth and change through time), fire behavior, and management options...

  11. CAG - computer-aid-georeferencing, or rapid sharing, restructuring and presentation of environmental data using remote-server georeferencing for the GE clients. Educational and scientific implications.

    NASA Astrophysics Data System (ADS)

    Hronusov, V. V.

    2006-12-01

    We suggest a method of using external public servers for rearranging, restructuring, and rapidly sharing environmental data for quick presentation in numerous GE clients. The method adds a new approach to the presentation (publication) of the mostly static data stored in the public domain (e.g., Blue Marble, Visible Earth, etc.). The approach works by publishing freely accessible spreadsheets which contain sufficient information and links to the data. Because most large depositories of environmental monitoring data have a rather simple net address system as well as a simple hierarchy, mostly based on the date and type of the data, it is possible to construct the HTTP link to the file which contains the data. Publication of new data on the server is recorded by simply entering a new address into a cell in the spreadsheet. At the moment we use the EditGrid (www.editgrid.com) system as the spreadsheet platform. The generation of KML code is achieved on the basis of XML data and XSLT procedures. Since the EditGrid environment supports "fetch" and similar commands, it is possible to create "smart-adaptive" KML generation on the fly based on data streams from RSS and XML sources. Previous GIS-based methods could combine high-definition data from various sources, but large-scale comparisons of dynamic processes have usually been out of reach of the technology. The suggested method allows an unlimited number of GE clients to view, review, and compare dynamic and static processes from previously un-combinable sources, and on unprecedented scales. The ease of automated or computer-assisted georeferencing has already led to the translation of about 3000 raster public-domain images and point and linear data sources into the GE language. In addition, the suggested method allows a user to create rapid animations to demonstrate dynamic processes; such products are in high demand in education, meteorology, and volcanology, and potentially in a number of industries. In general, the new approach, which we have tested on numerous projects, saves time and effort in creating huge amounts of georeferenced data of various kinds, and thus provides an excellent tool for education and science.

  12. Constructing Pairing-Friendly Elliptic Curves under Embedding Degree 1 for Securing Critical Infrastructures.

    PubMed

    Wang, Maocai; Dai, Guangming; Choo, Kim-Kwang Raymond; Jayaraman, Prem Prakash; Ranjan, Rajiv

    2016-01-01

    Information confidentiality is an essential requirement for cyber security in critical infrastructure. Identity-based cryptography, an increasingly popular branch of cryptography, is widely used to protect information confidentiality in the critical infrastructure sector because the user's public key can be computed directly from the user's identity. However, computational requirements complicate the practical application of identity-based cryptography. In order to improve the efficiency of identity-based cryptography, this paper presents an effective method to construct pairing-friendly elliptic curves with low Hamming weight 4 under embedding degree 1. Based on an analysis of the Complex Multiplication (CM) method, the soundness of our method for calculating the characteristic of the finite field is proved. Three algorithms for constructing pairing-friendly elliptic curves are then put forward. Ten elliptic curves with low Hamming weight 4 under 160 bits are presented to demonstrate the utility of our approach. Finally, the evaluation also indicates that computing the Tate pairing is more efficient with our curves than with those of Bertoni et al.

  13. Constructing Pairing-Friendly Elliptic Curves under Embedding Degree 1 for Securing Critical Infrastructures

    PubMed Central

    Dai, Guangming

    2016-01-01

    Information confidentiality is an essential requirement for cyber security in critical infrastructure. Identity-based cryptography, an increasingly popular branch of cryptography, is widely used to protect information confidentiality in the critical infrastructure sector because the user’s public key can be computed directly from the user’s identity. However, computational requirements complicate the practical application of identity-based cryptography. In order to improve the efficiency of identity-based cryptography, this paper presents an effective method to construct pairing-friendly elliptic curves with low Hamming weight 4 under embedding degree 1. Based on an analysis of the Complex Multiplication (CM) method, the soundness of our method for calculating the characteristic of the finite field is proved. Three algorithms for constructing pairing-friendly elliptic curves are then put forward. Ten elliptic curves with low Hamming weight 4 under 160 bits are presented to demonstrate the utility of our approach. Finally, the evaluation also indicates that computing the Tate pairing is more efficient with our curves than with those of Bertoni et al. PMID:27564373
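
    A minimal sketch of the embedding-degree condition behind the construction: embedding degree 1 means the subgroup order r already divides p - 1, so pairing values lie in GF(p) itself (the numbers below are toy values, far below cryptographic size):

```python
from sympy import isprime

def embedding_degree(p, r, kmax=50):
    """Smallest k with r | p**k - 1: the embedding degree of a curve over
    GF(p) having a subgroup of prime order r."""
    assert isprime(p) and isprime(r)
    k, pk = 1, p % r
    while pk != 1 and k <= kmax:
        k, pk = k + 1, (pk * p) % r
    return k

# Embedding degree 1 means r | p - 1, so the Tate pairing can be computed
# entirely in GF(p). Toy example: r = 17 divides p - 1 = 102.
print(embedding_degree(103, 17))   # -> 1
```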

  14. Methodology and Estimates of Scour at Selected Bridge Sites in Alaska

    USGS Publications Warehouse

    Heinrichs, Thomas A.; Kennedy, Ben W.; Langley, Dustin E.; Burrows, Robert L.

    2001-01-01

    The U.S. Geological Survey estimated scour depths at 325 bridges in Alaska as part of a cooperative agreement with the Alaska Department of Transportation and Public Facilities. The department selected these sites from approximately 806 State-owned bridges as potentially susceptible to scour during extreme floods. Pier scour and contraction scour were computed for the selected bridges by using methods recommended by the Federal Highway Administration. The U.S. Geological Survey used a four-step procedure to estimate scour: (1) Compute magnitudes of the 100- and 500-year floods. (2) Determine cross-section geometry and hydraulic properties for each bridge site. (3) Compute the water-surface profile for the 100- and 500-year floods. (4) Compute contraction and pier scour. This procedure is unique because the cross sections were developed from existing data on file to make a quantitative estimate of scour. This screening method has the advantage of providing scour depths and bed elevations for comparison with bridge-foundation elevations without the time and expense of a field survey. Four examples of bridge-scour analyses are summarized in the appendix.

  15. Using Microsoft Excel to compute the 5% overall site X/Q value and the 95th percentile of the distribution of doses to the nearest maximally exposed offsite individual (MEOI).

    PubMed

    Vickers, Linda D

    2010-05-01

    This paper describes the method using Microsoft Excel (Microsoft Corporation, Redmond, WA) to compute the 5% overall site X/Q value and the 95th percentile of the distribution of doses to the nearest maximally exposed offsite individual (MEOI) in accordance with guidance from DOE-STD-3009-1994 and U.S. NRC Regulatory Guide 1.145-1982. The accurate determination of the 5% overall site X/Q value is the most important factor in the computation of the 95th percentile of the distribution of doses to the nearest MEOI. This method should be used to validate software codes that compute the X/Q. The 95th percentile of the distribution of doses to the nearest MEOI must be compared to the U.S. DOE Evaluation Guide of 25 rem to determine the relative severity of hazard to the public from a postulated, unmitigated design basis accident that involves an offsite release of radioactive material.
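
    A minimal sketch of the same percentile computation in Python rather than Excel, assuming a year of hourly X/Q values is available; every number below is invented, and the sector-dependent details of the regulatory procedure are omitted:

```python
import numpy as np

rng = np.random.default_rng(2)
# Invented year of hourly atmospheric dispersion factors X/Q (s/m^3).
xoq_hourly = rng.lognormal(mean=-11.0, sigma=1.2, size=8760)

# The "5% overall site X/Q" is the value exceeded no more than 5% of the
# time, i.e. the 95th percentile of the hourly distribution.
xoq_5pct = np.percentile(xoq_hourly, 95)

# Dose to the MEOI scales linearly with X/Q for a fixed source term and
# dose conversion factor, so the same percentile carries over to dose.
source_term, dcf = 1.0e12, 5.0e-8            # invented values
dose_95 = xoq_5pct * source_term * dcf
print(f"5% site X/Q = {xoq_5pct:.2e} s/m^3, 95th-pct dose = {dose_95:.2e}")
```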

  16. A bibliometric analysis of natural language processing in medical research.

    PubMed

    Chen, Xieling; Xie, Haoran; Wang, Fu Lee; Liu, Ziqing; Xu, Juan; Hao, Tianyong

    2018-03-22

    Natural language processing (NLP) has come to play an increasingly significant role in advancing medicine, and a rich body of research on NLP methods and applications for medical information processing is available. It is of great significance to conduct a deep analysis to understand the recent development of the NLP-empowered medical research field; however, few studies have examined the research status of this field. Therefore, this study aims to quantitatively assess the academic output of NLP in the medical research field. We conducted a bibliometric analysis on NLP-empowered medical research publications retrieved from PubMed for the period 2007-2016. The analysis focused on three aspects. First, the literature distribution characteristics were obtained with a statistical analysis method. Second, a network analysis method was used to reveal scientific collaboration relations. Finally, thematic discovery and evolution were reflected using an affinity propagation clustering method. There were 1405 NLP-empowered medical research publications published during the 10 years, with an average annual growth rate of 18.39%. The 10 most productive publication sources together contributed more than 50% of the total publications. The USA had the highest number of publications, and a moderately significant correlation between a country's publications and its GDP per capita was revealed. Denny, Joshua C was the most productive author, and Mayo Clinic was the most productive affiliation. The annual co-affiliation and co-country rates reached 64.04% and 15.79% in 2016, respectively. Ten main thematic areas were identified, including computational biology, terminology mining, information extraction, text classification, social media as a data source, and information retrieval. In conclusion, this bibliometric analysis of NLP-empowered medical research publications uncovers the recent research status of the field. The results can assist relevant researchers, especially newcomers, in understanding the research development systematically, seeking scientific cooperation partners, optimizing research topic choices, and monitoring new scientific or technological activities.

  17. A Partially-Stirred Batch Reactor Model for Under-Ventilated Fire Dynamics

    NASA Astrophysics Data System (ADS)

    McDermott, Randall; Weinschenk, Craig

    2013-11-01

    A simple discrete quadrature method is developed for closure of the mean chemical source term in large-eddy simulations (LES) and implemented in the publicly available fire model, Fire Dynamics Simulator (FDS). The method is cast as a partially-stirred batch reactor model for each computational cell. The model has three distinct components: (1) a subgrid mixing environment, (2) a mixing model, and (3) a set of chemical rate laws. The subgrid probability density function (PDF) is described by a linear combination of Dirac delta functions with quadrature weights set to satisfy simple integral constraints for the computational cell. It is shown that under certain limiting assumptions, the present method reduces to the eddy dissipation concept (EDC). The model is used to predict carbon monoxide concentrations in direct numerical simulation (DNS) of a methane slot burner and in LES of an under-ventilated compartment fire.
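
    A minimal sketch of the delta-function quadrature for a single cell and a single scalar, assuming two environments and a toy rate law (the constraints and kinetics in FDS are richer):

```python
# Two-delta subgrid PDF for a scalar Y in one LES cell: the weights and
# abscissas must satisfy the integral constraints
#   w1 + w2 = 1   and   w1*Y1 + w2*Y2 = Ybar (the resolved cell mean).
Ybar, Y1, Y2 = 0.3, 0.0, 0.8       # invented mixed/unmixed states
w2 = (Ybar - Y1) / (Y2 - Y1)
w1 = 1.0 - w2

def omega_dot(Y):
    """Toy chemical rate law (illustrative only, not FDS's kinetics)."""
    return Y * (1.0 - Y)

# Mean source term closed by quadrature over the delta functions; note it
# differs from naively evaluating the rate at the cell mean, omega_dot(Ybar).
mean_rate = w1 * omega_dot(Y1) + w2 * omega_dot(Y2)
print(mean_rate, omega_dot(Ybar))   # 0.06 vs 0.21
```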

  18. Active classifier selection for RGB-D object categorization using a Markov random field ensemble method

    NASA Astrophysics Data System (ADS)

    Durner, Maximilian; Márton, Zoltán.; Hillenbrand, Ulrich; Ali, Haider; Kleinsteuber, Martin

    2017-03-01

    In this work, a new ensemble method for the task of category recognition in different environments is presented. The focus is on service robotic perception in an open environment, where the robot's task is to recognize previously unseen objects of predefined categories, based on training on a public dataset. We propose an ensemble learning approach to be able to flexibly combine complementary sources of information (different state-of-the-art descriptors computed on color and depth images), based on a Markov Random Field (MRF). By exploiting its specific characteristics, the MRF ensemble method can also be executed as a Dynamic Classifier Selection (DCS) system. In the experiments, the committee- and topology-dependent performance boost of our ensemble is shown. Despite reduced computational costs and using less information, our strategy performs on the same level as common ensemble approaches. Finally, the impact of large differences between datasets is analyzed.

  19. Computational prediction of muon stopping sites using ab initio random structure searching (AIRSS)

    NASA Astrophysics Data System (ADS)

    Liborio, Leandro; Sturniolo, Simone; Jochym, Dominik

    2018-04-01

    The stopping site of the muon in a muon-spin relaxation experiment is in general unknown. There are some techniques that can be used to guess the muon stopping site, but they often rely on approximations and are not generally applicable to all cases. In this work, we propose a purely theoretical method to predict muon stopping sites in crystalline materials from first principles. The method is based on a combination of ab initio calculations, random structure searching, and machine learning, and it has successfully predicted the Mu_T and Mu_BC stopping sites of muonium in Si, diamond, and Ge, as well as the muonium stopping site in LiF, without any recourse to experimental results. The method makes use of Soprano, a Python library developed to aid ab initio computational crystallography, which was publicly released and contains all the software tools necessary to reproduce our analysis.

  20. Computing ordinary least-squares parameter estimates for the National Descriptive Model of Mercury in Fish

    USGS Publications Warehouse

    Donato, David I.

    2013-01-01

    A specialized technique is used to compute weighted ordinary least-squares (OLS) estimates of the parameters of the National Descriptive Model of Mercury in Fish (NDMMF) in less time using less computer memory than general methods. The characteristics of the NDMMF allow the two products X'X and X'y in the normal equations to be filled out in a second or two of computer time during a single pass through the N data observations. As a result, the matrix X does not have to be stored in computer memory and the computationally expensive matrix multiplications generally required to produce X'X and X'y do not have to be carried out. The normal equations may then be solved to determine the best-fit parameters in the OLS sense. The computational solution based on this specialized technique requires O(8p2+16p) bytes of computer memory for p parameters on a machine with 8-byte double-precision numbers. This publication includes a reference implementation of this technique and a Gaussian-elimination solver in preliminary custom software.
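
    A minimal sketch of the single-pass technique, assuming observations arrive as a stream of (row, response, weight) triples; this is generic, not the reference implementation the publication includes:

```python
import numpy as np

def stream_ols(observations, p):
    """Accumulate the normal equations X'X and X'y in a single pass, so the
    N-by-p design matrix X never has to be stored or multiplied explicitly."""
    xtx = np.zeros((p, p))
    xty = np.zeros(p)
    for x_row, y, w in observations:       # one weighted observation at a time
        xtx += w * np.outer(x_row, x_row)  # O(p^2) memory, independent of N
        xty += w * y * x_row
    return np.linalg.solve(xtx, xty)       # best-fit parameters in the OLS sense

# Toy stream: y = 0.5 + 2*x1 - x2 with small noise and unit weights.
rng = np.random.default_rng(0)
rows = ((np.array([1.0, x1, x2]), 0.5 + 2*x1 - x2 + 0.01*rng.normal(), 1.0)
        for x1, x2 in rng.normal(size=(10_000, 2)))
print(stream_ols(rows, p=3))   # approx. [0.5, 2.0, -1.0]
```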

  1. Publication Bias in Methodological Computational Research.

    PubMed

    Boulesteix, Anne-Laure; Stierle, Veronika; Hapfelmeier, Alexander

    2015-01-01

    The problem of publication bias has long been discussed in research fields such as medicine. There is a consensus that publication bias is a reality and that solutions should be found to reduce it. In methodological computational research, including cancer informatics, publication bias may also be at work. The publication of negative research findings is certainly also a relevant issue, but has attracted very little attention to date. The present paper aims at providing a new formal framework to describe the notion of publication bias in the context of methodological computational research, facilitate and stimulate discussions on this topic, and increase awareness in the scientific community. We report an exemplary pilot study that aims at gaining experiences with the collection and analysis of information on unpublished research efforts with respect to publication bias, and we outline the encountered problems. Based on these experiences, we try to formalize the notion of publication bias.

  2. A Platform-Independent Plugin for Navigating Online Radiology Cases.

    PubMed

    Balkman, Jason D; Awan, Omer A

    2016-06-01

    Software methods that enable navigation of radiology cases on various digital platforms differ between handheld devices and desktop computers. This has resulted in poor compatibility of online radiology teaching files across mobile smartphones, tablets, and desktop computers. A standardized, platform-independent, or "agnostic" approach for presenting online radiology content was produced in this work by leveraging modern hypertext markup language (HTML) and JavaScript web software technology. We describe the design and evaluation of this software, demonstrate its use across multiple viewing platforms, and make it publicly available as a model for future development efforts.

  3. Theoretical basis of the DOE-2 building energy use analysis program

    NASA Astrophysics Data System (ADS)

    Curtis, R. B.

    1981-04-01

    A user-oriented, public-domain computer program was developed that enables architects and engineers to perform design and retrofit studies of the energy use of buildings under realistic weather conditions. DOE-2.1A has been named by the US DOE as the standard evaluation technique for the Congressionally mandated building energy performance standards (BEPS). A number of program design decisions were made that determine the breadth of applicability of DOE-2.1. Such design decisions are intrinsic to all building energy use analysis computer programs and determine the types of buildings or the kinds of HVAC systems that can be modeled. In particular, the weighting factor method used in DOE-2 has both advantages and disadvantages relative to other computer programs.

  4. Quantum communication and information processing

    NASA Astrophysics Data System (ADS)

    Beals, Travis Roland

    Quantum computers enable dramatically more efficient algorithms for solving certain classes of computational problems, but, in doing so, they create new problems. In particular, Shor's Algorithm allows for efficient cryptanalysis of many public-key cryptosystems. As public key cryptography is a critical component of present-day electronic commerce, it is crucial that a working, secure replacement be found. Quantum key distribution (QKD), first developed by C.H. Bennett and G. Brassard, offers a partial solution, but many challenges remain, both in terms of hardware limitations and in designing cryptographic protocols for a viable large-scale quantum communication infrastructure. In Part I, I investigate optical lattice-based approaches to quantum information processing. I look at details of a proposal for an optical lattice-based quantum computer, which could potentially be used for both quantum communications and for more sophisticated quantum information processing. In Part III, I propose a method for converting and storing photonic quantum bits in the internal state of periodically-spaced neutral atoms by generating and manipulating a photonic band gap and associated defect states. In Part II, I present a cryptographic protocol which allows for the extension of present-day QKD networks over much longer distances without the development of new hardware. I also present a second, related protocol which effectively solves the authentication problem faced by a large QKD network, thus making QKD a viable, information-theoretic secure replacement for public key cryptosystems.

  5. Computers in Public Education Study.

    ERIC Educational Resources Information Center

    HBJ Enterprises, Highland Park, NJ.

    This survey conducted for the National Institute of Education reports the use of computers in U.S. public schools in the areas of instructional computing, student accounting, management of educational resources, research, guidance, testing, and library applications. From a stratified random sample of 1800 schools in varying geographic areas and…

  6. 36 CFR 1194.26 - Desktop and portable computers.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 36 Parks, Forests, and Public Property 3 2014-07-01 2014-07-01 false Desktop and portable computers. 1194.26 Section 1194.26 Parks, Forests, and Public Property ARCHITECTURAL AND TRANSPORTATION... § 1194.26 Desktop and portable computers. (a) All mechanically operated controls and keys shall comply...

  7. 36 CFR 1194.26 - Desktop and portable computers.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 36 Parks, Forests, and Public Property 3 2012-07-01 2012-07-01 false Desktop and portable computers. 1194.26 Section 1194.26 Parks, Forests, and Public Property ARCHITECTURAL AND TRANSPORTATION... § 1194.26 Desktop and portable computers. (a) All mechanically operated controls and keys shall comply...

  8. 36 CFR 1194.26 - Desktop and portable computers.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 36 Parks, Forests, and Public Property 3 2011-07-01 2011-07-01 false Desktop and portable computers. 1194.26 Section 1194.26 Parks, Forests, and Public Property ARCHITECTURAL AND TRANSPORTATION... § 1194.26 Desktop and portable computers. (a) All mechanically operated controls and keys shall comply...

  9. 36 CFR 1194.26 - Desktop and portable computers.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 36 Parks, Forests, and Public Property 3 2010-07-01 2010-07-01 false Desktop and portable computers. 1194.26 Section 1194.26 Parks, Forests, and Public Property ARCHITECTURAL AND TRANSPORTATION... § 1194.26 Desktop and portable computers. (a) All mechanically operated controls and keys shall comply...

  10. Study of basic computer competence among public health nurses in Taiwan.

    PubMed

    Yang, Kuei-Feng; Yu, Shu; Lin, Ming-Sheng; Hsu, Chia-Ling

    2004-03-01

    Rapid advances in information technology and media have made distance learning on the Internet possible. This new model of learning allows greater efficiency and flexibility in knowledge acquisition. Since basic computer competence is a prerequisite for this new learning model, this study was conducted to examine the basic computer competence of public health nurses in Taiwan and explore factors influencing computer competence. A national cross-sectional randomized study was conducted with 329 public health nurses. A questionnaire was used to collect data and was delivered by mail. Results indicate that the basic computer competence of public health nurses in Taiwan still needs to be improved (mean = 57.57 ± 2.83, total score range 26-130). Among the five most frequently used software programs, nurses were most knowledgeable about Word and least knowledgeable about PowerPoint. Stepwise multiple regression analysis revealed eight variables (weekly number of hours spent online at home, weekly amount of time spent online at work, weekly frequency of computer use at work, previous computer training, computer at workplace and Internet access, job position, education level, and age) that significantly influenced computer competence and together accounted for 39.0% of the variance. In conclusion, greater computer competence, broader educational programs regarding computer technology, and a greater emphasis on computers at work are necessary to increase the usefulness of distance learning via the Internet in Taiwan. Building a user-friendly environment is important in developing this new media model of learning for the future.

  11. Extension of research data repository system to support direct compute access to biomedical datasets: enhancing Dataverse to support large datasets.

    PubMed

    McKinney, Bill; Meyer, Peter A; Crosas, Mercè; Sliz, Piotr

    2017-01-01

    Access to experimental X-ray diffraction image data is important for validation and reproduction of macromolecular models and indispensable for the development of structural biology processing methods. In response to the evolving needs of the structural biology community, we recently established a diffraction data publication system, the Structural Biology Data Grid (SBDG, data.sbgrid.org), to preserve primary experimental datasets supporting scientific publications. All datasets published through the SBDG are freely available to the research community under a public domain dedication license, with metadata compliant with the DataCite Schema (schema.datacite.org). A proof-of-concept study demonstrated community interest and utility. Publication of large datasets is a challenge shared by several fields, and the SBDG has begun collaborating with the Institute for Quantitative Social Science at Harvard University to extend the Dataverse (dataverse.org) open-source data repository system to structural biology datasets. Several extensions are necessary to support the size and metadata requirements for structural biology datasets. In this paper, we describe one such extension, functionality supporting preservation of file system structure within Dataverse, which is essential for both in-place computation and supporting non-HTTP data transfers. © 2016 New York Academy of Sciences.

  12. Complex Conjugated certificateless-based signcryption with differential integrated factor for secured message communication in mobile network

    PubMed Central

    Rajagopalan, S. P.

    2017-01-01

    Certificateless-based signcryption overcomes inherent shortcomings of traditional Public Key Infrastructure (PKI) and the key escrow problem. It provides efficient methods to design PKIs with public verifiability and ciphertext authenticity with minimum dependency. As a classic primitive in public key cryptography, signcryption verifies the validity of ciphertext without decryption, combining authentication, confidentiality, public verifiability and ciphertext authenticity much more efficiently than the traditional approach. In this paper, we first define a security model for certificateless-based signcryption, called the Complex Conjugate Differential Integrated Factor (CC-DIF) scheme, which introduces complex conjugates through the security parameter and improves the secured message distribution rate. However, both the partial private key and the secret value change with respect to time. To overcome this weakness, a new certificateless-based signcryption scheme is proposed that sets the private key through a Differential (Diff) Equation using an Integration Factor (DiffEIF), minimizing computational cost and communication overhead. The scheme is thereby proven secure (i.e., it improves the secured message distribution rate) against attacks on certificateless access control and signcryption. In addition, compared with three other existing schemes, the CC-DIF scheme has the least computational cost and communication overhead for secured message communication in mobile networks. PMID:29040290

  13. Complex Conjugated certificateless-based signcryption with differential integrated factor for secured message communication in mobile network.

    PubMed

    Alagarsamy, Sumithra; Rajagopalan, S P

    2017-01-01

    Certificateless-based signcryption overcomes inherent shortcomings of traditional Public Key Infrastructure (PKI) and the key escrow problem. It provides efficient methods to design PKIs with public verifiability and ciphertext authenticity with minimum dependency. As a classic primitive in public key cryptography, signcryption verifies the validity of ciphertext without decryption, combining authentication, confidentiality, public verifiability and ciphertext authenticity much more efficiently than the traditional approach. In this paper, we first define a security model for certificateless-based signcryption, called the Complex Conjugate Differential Integrated Factor (CC-DIF) scheme, which introduces complex conjugates through the security parameter and improves the secured message distribution rate. However, both the partial private key and the secret value change with respect to time. To overcome this weakness, a new certificateless-based signcryption scheme is proposed that sets the private key through a Differential (Diff) Equation using an Integration Factor (DiffEIF), minimizing computational cost and communication overhead. The scheme is thereby proven secure (i.e., it improves the secured message distribution rate) against attacks on certificateless access control and signcryption. In addition, compared with three other existing schemes, the CC-DIF scheme has the least computational cost and communication overhead for secured message communication in mobile networks.

  14. Optic cup segmentation from fundus images for glaucoma diagnosis.

    PubMed

    Hu, Man; Zhu, Chenghao; Li, Xiaoxing; Xu, Yongli

    2017-01-02

    Glaucoma is a serious disease that can cause complete, permanent blindness, and its early diagnosis is very difficult. In recent years, computer-aided screening and diagnosis of glaucoma has made considerable progress. The optic cup segmentation from fundus images is an extremely important part for the computer-aided screening and diagnosis of glaucoma. This paper presented an automatic optic cup segmentation method that used both color difference information and vessel bends information from fundus images to determine the optic cup boundary. During the implementation of this algorithm, not only were the locations of the 2 types of information points used, but also the confidences of the information points were evaluated. In this way, the information points with higher confidence levels contributed more to the determination of the final cup boundary. The proposed method was evaluated using a public database for fundus images. The experimental results demonstrated that the cup boundaries obtained by the proposed method were more consistent than existing methods with the results obtained by ophthalmologists.

  15. Multiscale Mathematics for Biomass Conversion to Renewable Hydrogen

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Katsoulakis, Markos

    2014-08-09

    Our two key accomplishments in the first three years were towards the development of (1) a mathematically rigorous and at the same time computationally flexible framework for parallelization of Kinetic Monte Carlo methods, and its implementation on GPUs, and (2) spatial multilevel coarse-graining methods for Monte Carlo sampling and molecular simulation. A common underlying theme in both these lines of our work is the development of numerical methods which are at the same time both computationally efficient and reliable, the latter in the sense that they provide controlled-error approximations for coarse observables of the simulated molecular systems. Finally, our key accomplishment in the last year of the grant is that we started developing (3) pathwise information theory-based and goal-oriented sensitivity analysis and parameter identification methods for complex high-dimensional dynamics and in particular of nonequilibrium extended (high-dimensional) systems. We discuss these three research directions in some detail below, along with the related publications.

  16. Methods for extracting social network data from chatroom logs

    NASA Astrophysics Data System (ADS)

    Osesina, O. Isaac; McIntire, John P.; Havig, Paul R.; Geiselman, Eric E.; Bartley, Cecilia; Tudoreanu, M. Eduard

    2012-06-01

    Identifying social network (SN) links within computer-mediated communication platforms without explicit relations among users poses challenges to researchers. Our research aims to extract SN links in internet chat with multiple users engaging in synchronous overlapping conversations all displayed in a single stream. We approached this problem using three methods which build on previous research. Response-time analysis builds on temporal proximity of chat messages; word context usage builds on keywords analysis and direct addressing which infers links by identifying the intended message recipient from the screen name (nickname) referenced in the message [1]. Our analysis of word usage within the chat stream also provides contexts for the extracted SN links. To test the capability of our methods, we used publicly available data from Internet Relay Chat (IRC), a real-time computer-mediated communication (CMC) tool used by millions of people around the world. The extraction performances of individual methods and their hybrids were assessed relative to a ground truth (determined a priori via manual scoring).
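
    A toy sketch of the direct-addressing heuristic described above, assuming messages arrive as (sender, text) pairs: a link is inferred whenever a message opens with another participant's nickname. The regex and function name are illustrative; the study combines this cue with response-time and word-context analysis.

        import re
        from collections import Counter

        def direct_address_links(messages, nicknames):
            """messages: iterable of (sender, text) pairs; returns
            directed (sender, addressee) link counts."""
            links = Counter()
            for sender, text in messages:
                m = re.match(r"^\s*(\w+)\s*[:,]", text)  # e.g. "alice: you there?"
                if m and m.group(1) in nicknames and m.group(1) != sender:
                    links[(sender, m.group(1))] += 1
            return links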

  17. Optic cup segmentation from fundus images for glaucoma diagnosis

    PubMed Central

    Hu, Man; Zhu, Chenghao; Li, Xiaoxing; Xu, Yongli

    2017-01-01

    Glaucoma is a serious disease that can cause complete, permanent blindness, and its early diagnosis is very difficult. In recent years, computer-aided screening and diagnosis of glaucoma has made considerable progress. The optic cup segmentation from fundus images is an extremely important part for the computer-aided screening and diagnosis of glaucoma. This paper presented an automatic optic cup segmentation method that used both color difference information and vessel bends information from fundus images to determine the optic cup boundary. During the implementation of this algorithm, not only were the locations of the 2 types of information points used, but also the confidences of the information points were evaluated. In this way, the information points with higher confidence levels contributed more to the determination of the final cup boundary. The proposed method was evaluated using a public database for fundus images. The experimental results demonstrated that the cup boundaries obtained by the proposed method were more consistent than existing methods with the results obtained by ophthalmologists. PMID:27764542

  18. The iPad as an Instructional Tool: An Examination of Teacher Implementation Experiences

    ERIC Educational Resources Information Center

    Benton, Brandie Kay

    2012-01-01

    At present, handheld devices and tablet computers are infiltrating public schools across the nation, the most popular model being the Apple iPad. Schools and teachers are attempting to integrate the devices and are using a variety of methods and models for implementation. The purpose of this study was to examine the implementation of the iPad as…

  19. Effects of using visualization and animation in presentations to communities about forest succession and fire behavior potential

    Treesearch

    Jane Kapler Smith; Donald E. Zimmerman; Carol Akerelrea; Garrett O' Keefe

    2008-01-01

    Natural resource managers use a variety of computer-mediated presentation methods to communicate management practices to the public. We explored the effects of using the Stand Visualization System to visualize and animate predictions from the Forest Vegetation Simulator-Fire and Fuels Extension in presentations explaining forest succession (forest growth and change...

  20. Modeling Human Behavior with Fuzzy and Soft Computing Methods

    DTIC Science & Technology

    2017-12-13

  1. 77 FR 66729 - National Oil and Hazardous Substances Pollution Contingency Plan; Revision To Increase Public...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-07

    ... technology, to include computer telecommunications or other electronic means, that the lead agency is... assess the capacity and resources of the public to utilize and maintain an electronic- or computer... the technology, to include computer telecommunications or other electronic means, that the lead agency...

  2. 77 FR 64834 - Computational Fluid Dynamics Best Practice Guidelines for Dry Cask Applications

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-23

    ... NUCLEAR REGULATORY COMMISSION [NRC-2012-0250] Computational Fluid Dynamics Best Practice... public comments on draft NUREG-2152, ``Computational Fluid Dynamics Best Practice Guidelines for Dry Cask... System (ADAMS): You may access publicly-available documents online in the NRC Library at http://www.nrc...

  3. How You Can Protect Public Access Computers "and" Their Users

    ERIC Educational Resources Information Center

    Huang, Phil

    2007-01-01

    By providing the public with online computing facilities, librarians make available a world of information resources beyond their traditional print materials. Internet-connected computers in libraries greatly enhance the opportunity for patrons to enjoy the benefits of the digital age. Unfortunately, as hackers become more sophisticated and…

  4. Use of Lean Response to Improve Pandemic Influenza Surge in Public Health Laboratories

    PubMed Central

    Chang, Yin; Prystajecky, Natalie; Petric, Martin; Mak, Annie; Abbott, Brendan; Paris, Benjamin; Decker, K.C.; Pittenger, Lauren; Guercio, Steven; Stott, Jeff; Miller, Joseph D.

    2012-01-01

    A novel influenza A (H1N1) virus detected in April 2009 rapidly spread around the world. North American provincial and state laboratories have well-defined roles and responsibilities, including providing accurate, timely test results for patients and information for regional public health and other decision makers. We used the multidisciplinary response and rapid implementation of process changes based on Lean methods at the provincial public health laboratory in British Columbia, Canada, to improve laboratory surge capacity in the 2009 influenza pandemic. Observed and computer-simulated evaluation results from rapid process changes showed that use of Lean tools successfully expanded surge capacity, which enabled response to the 10-fold increase in testing demands. PMID:22257385

  5. BioInfra.Prot: A comprehensive proteomics workflow including data standardization, protein inference, expression analysis and data publication.

    PubMed

    Turewicz, Michael; Kohl, Michael; Ahrens, Maike; Mayer, Gerhard; Uszkoreit, Julian; Naboulsi, Wael; Bracht, Thilo; Megger, Dominik A; Sitek, Barbara; Marcus, Katrin; Eisenacher, Martin

    2017-11-10

    The analysis of high-throughput mass spectrometry-based proteomics data must address the specific challenges of this technology. To this end, the comprehensive proteomics workflow offered by the de.NBI service center BioInfra.Prot provides indispensable components for the computational and statistical analysis of this kind of data. These components include tools and methods for spectrum identification and protein inference, protein quantification, expression analysis as well as data standardization and data publication. All particular methods of the workflow which address these tasks are state-of-the-art or cutting edge. As has been shown in previous publications, each of these methods is adequate to solve its specific task and gives competitive results. However, the methods included in the workflow are continuously reviewed, updated and improved to adapt to new scientific developments. All of these particular components and methods are available as stand-alone BioInfra.Prot services or as a complete workflow. Since BioInfra.Prot provides manifold fast communication channels to get access to all components of the workflow (e.g., via the BioInfra.Prot ticket system: bioinfraprot@rub.de) users can easily benefit from this service and get support by experts. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  6. 42 CFR 121.2 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 42 Public Health 1 2012-10-01 2012-10-01 false Definitions. 121.2 Section 121.2 Public Health... PROCUREMENT AND TRANSPLANTATION NETWORK § 121.2 Definitions. As used in this part— Act means the Public Health..., transplant recipient, or organ donor. OPTN computer match program means a set of computer-based instructions...

  7. 42 CFR 121.2 - Definitions.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 1 2011-10-01 2011-10-01 false Definitions. 121.2 Section 121.2 Public Health... PROCUREMENT AND TRANSPLANTATION NETWORK § 121.2 Definitions. As used in this part— Act means the Public Health..., transplant recipient, or organ donor. OPTN computer match program means a set of computer-based instructions...

  8. Returns on Investment in California County Departments of Public Health.

    PubMed

    Brown, Timothy T

    2016-08-01

    To estimate the average return on investment for the overall activities of county departments of public health in California. I gathered the elements necessary to estimate the average return on investment for county departments of public health in California during the period 2001 to 2008-2009. These came from peer-reviewed journal articles published as part of a larger project to develop a method for determining return on investment for public health by using a health economics framework. I combined these elements by using the standard formula for computing return on investment, and performed a sensitivity analysis. Then I compared the return on investment for county departments of public health with the returns on investment generated for various aspects of medical care. The estimated return on investment from $1 invested in county departments of public health in California ranges from $67.07 to $88.21. The very large estimated return on investment for California county departments of public health relative to the return on investment for selected aspects of medical care suggests that public health is a wise investment.
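
    As a back-of-the-envelope illustration of the standard return-on-investment formula the study applies (the numbers below are made up to match the reported lower bound, not the study's actual inputs):

        def return_on_investment(total_benefit, total_cost):
            """Net dollars returned per dollar invested."""
            return (total_benefit - total_cost) / total_cost

        # $68.07 of benefit per $1.00 invested nets $67.07 per dollar:
        print(return_on_investment(68.07, 1.00))   # -> 67.07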

  9. Covariance analysis for evaluating head trackers

    NASA Astrophysics Data System (ADS)

    Kang, Donghoon

    2017-10-01

    Existing methods for evaluating the performance of head trackers usually rely on publicly available face databases, which contain facial images and the ground truths of their corresponding head orientations. However, most of the existing publicly available face databases are constructed by assuming that a frontal head orientation can be determined by compelling the person under examination to look straight ahead at the camera on the first video frame. Since nobody can accurately direct one's head toward the camera, this assumption may be unrealistic. Rather than obtaining estimation errors, we present a method for computing the covariance of estimation error rotations to evaluate the reliability of head trackers. As an uncertainty measure of estimators, the Schatten 2-norm of a square root of error covariance (or the algebraic average of relative error angles) can be used. The merit of the proposed method is that it does not disturb the person under examination by asking him to direct his head toward certain directions. Experimental results using real data validate the usefulness of our method.
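
    A sketch of the uncertainty measure described above, assuming estimated and ground-truth orientations are available as SciPy Rotation objects (the function name is illustrative). Per-frame error rotations are reduced to rotation vectors, and the Schatten 2-norm of the square root of their covariance reduces to the square root of the covariance trace.

        import numpy as np
        from scipy.spatial.transform import Rotation

        def error_rotation_uncertainty(r_estimated, r_true):
            """Both arguments: equal-length sequences of Rotation objects."""
            # Relative error rotation per frame: R_err = R_est * R_true^-1.
            err_vecs = np.array([(e * t.inv()).as_rotvec()
                                 for e, t in zip(r_estimated, r_true)])
            cov = np.cov(err_vecs.T)   # 3x3 covariance of error rotations
            # Schatten 2-norm of cov^(1/2) equals sqrt(trace(cov)).
            return float(np.sqrt(np.trace(cov)))

        # Example: identical orientations give zero uncertainty.
        rots = [Rotation.from_euler("z", a, degrees=True) for a in (0, 45, 90)]
        print(error_rotation_uncertainty(rots, rots))   # -> 0.0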

  10. Comparison of several methods for estimating low speed stability derivatives

    NASA Technical Reports Server (NTRS)

    Fletcher, H. S.

    1971-01-01

    Methods presented in five different publications have been used to estimate the low-speed stability derivatives of two unpowered airplane configurations. One configuration had unswept lifting surfaces; the other was the D-558-II swept-wing research airplane. The results of the computations were compared with each other, with existing wind-tunnel data, and with flight-test data for the D-558-II configuration to assess the relative merits of the methods for estimating derivatives. The results of the study indicated that, in general, for low subsonic speeds, no one publication's methods appeared consistently better for estimating all derivatives.

  11. Sentiment analysis of Chinese microblogging based on sentiment ontology: a case study of `7.23 Wenzhou Train Collision'

    NASA Astrophysics Data System (ADS)

    Shi, Wei; Wang, Hongwei; He, Shaoyi

    2013-12-01

    Sentiment analysis of microblogging texts can facilitate both organisations' public opinion monitoring and governments' response strategy development. Nevertheless, most existing analysis methods are applied to Twitter, with little work on sentiment analysis of Chinese microblogging (Weibo), and they generally rely on large amounts of manually annotated training data or machine learning to perform sentiment classification, which makes them difficult to apply. This paper addresses these problems and employs a sentiment ontology model to examine sentiment analysis of Chinese microblogging. We conduct a sentiment analysis of all public microblogging posts about the '7.23 Wenzhou Train Collision' broadcast by Sina microblogging users between 23 July and 1 August 2011. For every day in this time period, we first extract eight dimensions of sentiment (expectation, joy, love, surprise, anxiety, sorrow, anger, and hate), and then build a fuzzy sentiment ontology based on HowNet and semantic similarity for sentiment analysis; we also establish methods for computing the influence and sentiment of microblogging texts; and we finally explore the change in public sentiment after the '7.23 Wenzhou Train Collision'. The results show that the established sentiment analysis method is readily applicable, and that changes in the different emotional values can reflect the success or failure of the government's guidance of public opinion.
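
    A toy sketch in the spirit of the lexicon-and-ontology scoring described above, counting hits per emotion dimension in a tokenized post. The real system weights membership with HowNet-based semantic similarity and fuzzy sets, which this sketch omits; the English word lists are illustrative stand-ins.

        # Illustrative lexicon; the paper's ontology covers eight dimensions
        # with HowNet-derived Chinese entries and fuzzy membership weights.
        EMOTIONS = {
            "joy": {"happy", "glad", "delighted"},
            "sorrow": {"sad", "grieve", "mourn"},
            "anger": {"angry", "furious", "outraged"},
        }

        def emotion_profile(tokens):
            """Count lexicon hits per emotion dimension for one post."""
            return {dim: sum(tok in words for tok in tokens)
                    for dim, words in EMOTIONS.items()}

        print(emotion_profile("we mourn the victims and are outraged".split()))
        # -> {'joy': 0, 'sorrow': 1, 'anger': 1}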

  12. 77 FR 62231 - DOE/Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-12

    .... Facilities update. ESnet-5. Early Career technical talks. Co-design. Innovative and Novel Computational Impact on Theory and Experiment (INCITE). Public Comment (10-minute rule). Public Participation: The...

  13. 78 FR 6087 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-29

    ... INFORMATION CONTACT: Melea Baker, Office of Advanced Scientific Computing Research; SC-21/Germantown Building... Theory and Experiment (INCITE) Public Comment (10-minute rule) Public Participation: The meeting is open...

  14. Feasibility of Acoustic Doppler Velocity Meters for the Production of Discharge Records from U.S. Geological Survey Streamflow-Gaging Stations

    USGS Publications Warehouse

    Morlock, Scott E.; Nguyen, Hieu T.; Ross, Jerry H.

    2002-01-01

    It is feasible to use acoustic Doppler velocity meters (ADVM's) installed at U.S. Geological Survey (USGS) streamflow-gaging stations to compute records of river discharge. ADVM's are small acoustic current meters that use the Doppler principle to measure water velocities in a two-dimensional plane. Records of river discharge can be computed from stage and ADVM velocity data using the 'index velocity' method. The ADVM-measured velocities are used as an estimator or 'index' of the mean velocity in the channel. In evaluations of ADVM's for the computation of records of river discharge, the USGS installed ADVM's at three streamflow-gaging stations in Indiana: Kankakee River at Davis, Fall Creek at Millersville, and Iroquois River near Foresman. The ADVM evaluation study period was from June 1999 to February 2001. Discharge records were computed using ADVM data from each station. Discharge records also were computed using conventional stage-discharge methods of the USGS. The records produced from ADVM and conventional methods were compared with discharge record hydrographs and statistics. Overall, the records compared closely for the Kankakee River and Fall Creek stations. For the Iroquois River station, variable backwater was present and affected the comparison; because the ADVM record compensates for backwater, the ADVM record may be superior to the conventional record. For the three stations, the ADVM records were judged to be of a quality acceptable to USGS standards for publication, and near real-time ADVM-computed discharges are served on USGS real-time data World Wide Web pages.
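
    A minimal sketch of the index-velocity computation described above: calibrate a linear rating between ADVM index velocity and channel-mean velocity, then combine it with a stage-area relation to compute discharge. The linear form, coefficients, and area_of_stage helper are illustrative placeholders, not USGS ratings.

        import numpy as np

        def fit_index_rating(v_index, v_mean):
            """Least-squares fit of v_mean = a + b * v_index from
            concurrent calibration measurements."""
            b, a = np.polyfit(v_index, v_mean, 1)  # slope, then intercept
            return a, b

        def discharge(stage, v_index, area_of_stage, a, b):
            """Q = A(stage) * estimated mean channel velocity."""
            return area_of_stage(stage) * (a + b * v_index)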

  15. Multilinear Computing and Multilinear Algebraic Geometry

    DTIC Science & Technology

    2016-08-10

    Subject terms: tensors, multilinearity, algebraic geometry, numerical computations, computational tractability, high...

  16. 42 CFR 417.588 - Computation of adjusted average per capita cost (AAPCC).

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ..., COMPETITIVE MEDICAL PLANS, AND HEALTH CARE PREPAYMENT PLANS Medicare Payment: Risk Basis § 417.588 Computation... 42 Public Health 3 2012-10-01 2012-10-01 false Computation of adjusted average per capita cost (AAPCC). 417.588 Section 417.588 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF...

  17. 42 CFR 417.588 - Computation of adjusted average per capita cost (AAPCC).

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... MEDICAL PLANS, AND HEALTH CARE PREPAYMENT PLANS Medicare Payment: Risk Basis § 417.588 Computation of... 42 Public Health 3 2011-10-01 2011-10-01 false Computation of adjusted average per capita cost (AAPCC). 417.588 Section 417.588 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF...

  18. 42 CFR 417.588 - Computation of adjusted average per capita cost (AAPCC).

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... MEDICAL PLANS, AND HEALTH CARE PREPAYMENT PLANS Medicare Payment: Risk Basis § 417.588 Computation of... 42 Public Health 3 2010-10-01 2010-10-01 false Computation of adjusted average per capita cost (AAPCC). 417.588 Section 417.588 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF...

  19. Computer-Based National Information Systems. Technology and Public Policy Issues.

    ERIC Educational Resources Information Center

    Congress of the U.S., Washington, DC. Office of Technology Assessment.

    A general introduction to computer based national information systems, and the context and basis for future studies are provided in this report. Chapter One, the introduction, summarizes computers and information systems and their relation to society, the structure of information policy issues, and public policy issues. Chapter Two describes the…

  20. 41 CFR 301-11.532 - How should we compute the employee's ITRA?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 41 Public Contracts and Property Management 4 2010-07-01 2010-07-01 false How should we compute the employee's ITRA? 301-11.532 Section 301-11.532 Public Contracts and Property Management Federal...-11.532 How should we compute the employee's ITRA? You should follow the procedures prescribed for the...

  1. Campus Computing 1993. The USC National Survey of Desktop Computing in Higher Education.

    ERIC Educational Resources Information Center

    Green, Kenneth C.; Eastman, Skip

    A national survey of desktop computing in higher education was conducted in spring and summer 1993 at over 2500 institutions. Data were responses from public and private research universities, public and private four-year colleges and community colleges. Respondents (N=1011) were individuals specifically responsible for the operation and future…

  2. The "Magic" of Wireless Access in the Library

    ERIC Educational Resources Information Center

    Balas, Janet L.

    2006-01-01

    It seems that the demand for public access computers grows exponentially every time a library network is expanded, making it impossible to ever have enough computers available for patrons. One solution that many libraries are implementing to ease the demand for public computer use is to offer wireless technology that allows patrons to bring in…

  3. Adding It Up: Is Computer Use Associated with Higher Achievement in Public Elementary Mathematics Classrooms?

    ERIC Educational Resources Information Center

    Kao, Linda Lee

    2009-01-01

    Despite support for technology in schools, there is little evidence indicating whether using computers in public elementary mathematics classrooms is associated with improved outcomes for students. This exploratory study examined data from the Early Childhood Longitudinal Study, investigating whether students' frequency of computer use was related…

  4. 36 CFR § 1194.26 - Desktop and portable computers.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 36 Parks, Forests, and Public Property 3 2013-07-01 2012-07-01 true Desktop and portable computers. 1194.26 Section 1194.26 Parks, Forests, and Public Property ARCHITECTURAL AND TRANSPORTATION... § 1194.26 Desktop and portable computers. (a) All mechanically operated controls and keys shall comply...

  5. 76 FR 12398 - Privacy Act of 1974, as Amended; Computer Matching Program (SSA/Bureau of the Public Debt (BPD...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-07

    ... SOCIAL SECURITY ADMINISTRATION [Docket No. SSA 2010-0034] Privacy Act of 1974, as Amended; Computer Matching Program (SSA/ Bureau of the Public Debt (BPD))--Match Number 1304 AGENCY: Social Security... as shown above. SUPPLEMENTARY INFORMATION: A. General The Computer Matching and Privacy Protection...

  6. Campus Computing 1991. The EDUCOM-USC Survey of Desktop Computing in Higher Education.

    ERIC Educational Resources Information Center

    Green, Kenneth C.; Eastman, Skip

    A national survey of desktop computing in higher education was conducted in 1991 of 2500 institutions. Data were responses from public and private research universities, public and private four-year colleges, and community colleges. Respondents (N=1099) were individuals specifically responsible for the operation and future direction of academic…

  7. Implementation of Audio Computer-Assisted Interviewing Software in HIV/AIDS Research

    PubMed Central

    Pluhar, Erika; Yeager, Katherine A.; Corkran, Carol; McCarty, Frances; Holstad, Marcia McDonnell; Denzmore-Nwagbara, Pamela; Fielder, Bridget; DiIorio, Colleen

    2007-01-01

    Computer assisted interviewing (CAI) has begun to play a more prominent role in HIV/AIDS prevention research. Despite the increased popularity of CAI, particularly audio computer assisted self-interviewing (ACASI), some research teams are still reluctant to implement ACASI technology due to lack of familiarity with the practical issues related to using these software packages. The purpose of this paper is to describe the implementation of one particular ACASI software package, the Questionnaire Development System™ (QDS™), in several nursing and HIV/AIDS prevention research settings. We present acceptability and satisfaction data from two large-scale public health studies in which we have used QDS with diverse populations. We also address issues related to developing and programming a questionnaire, discuss practical strategies related to planning for and implementing ACASI in the field, including selecting equipment, training staff, and collecting and transferring data, and summarize advantages and disadvantages of computer assisted research methods. PMID:17662924

  8. 41 CFR 60-30.3 - Computation of time.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 41 Public Contracts and Property Management 1 2011-07-01 2009-07-01 true Computation of time. 60-30.3 Section 60-30.3 Public Contracts and Property Management Other Provisions Relating to Public... these rules or in an order issued hereunder, the time begins with the day following the act, event, or...

  9. 41 CFR 60-30.3 - Computation of time.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 41 Public Contracts and Property Management 1 2010-07-01 2010-07-01 true Computation of time. 60-30.3 Section 60-30.3 Public Contracts and Property Management Other Provisions Relating to Public... these rules or in an order issued hereunder, the time begins with the day following the act, event, or...

  10. 45 CFR 233.24 - Retrospective budgeting; determining eligibility and computing the assistance payment in the...

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 45 Public Welfare 2 2013-10-01 2012-10-01 true Retrospective budgeting; determining eligibility and computing the assistance payment in the initial one or two months. 233.24 Section 233.24 Public Welfare Regulations Relating to Public Welfare OFFICE OF FAMILY ASSISTANCE (ASSISTANCE PROGRAMS...

  11. 45 CFR 233.24 - Retrospective budgeting; determining eligibility and computing the assistance payment in the...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 45 Public Welfare 2 2010-10-01 2010-10-01 false Retrospective budgeting; determining eligibility and computing the assistance payment in the initial one or two months. 233.24 Section 233.24 Public Welfare Regulations Relating to Public Welfare OFFICE OF FAMILY ASSISTANCE (ASSISTANCE PROGRAMS...

  12. 45 CFR 233.25 - Retrospective budgeting; computing the assistance payment after the initial one or two months.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 45 Public Welfare 2 2010-10-01 2010-10-01 false Retrospective budgeting; computing the assistance payment after the initial one or two months. 233.25 Section 233.25 Public Welfare Regulations Relating to Public Welfare OFFICE OF FAMILY ASSISTANCE (ASSISTANCE PROGRAMS), ADMINISTRATION FOR CHILDREN AND...

  13. 45 CFR 233.25 - Retrospective budgeting; computing the assistance payment after the initial one or two months.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 45 Public Welfare 2 2013-10-01 2012-10-01 true Retrospective budgeting; computing the assistance payment after the initial one or two months. 233.25 Section 233.25 Public Welfare Regulations Relating to Public Welfare OFFICE OF FAMILY ASSISTANCE (ASSISTANCE PROGRAMS), ADMINISTRATION FOR CHILDREN AND...

  14. Hypercube matrix computation task

    NASA Technical Reports Server (NTRS)

    Calalo, Ruel H.; Imbriale, William A.; Jacobi, Nathan; Liewer, Paulett C.; Lockhart, Thomas G.; Lyzenga, Gregory A.; Lyons, James R.; Manshadi, Farzin; Patterson, Jean E.

    1988-01-01

    A major objective of the Hypercube Matrix Computation effort at the Jet Propulsion Laboratory (JPL) is to investigate the applicability of a parallel computing architecture to the solution of large-scale electromagnetic scattering problems. Three scattering analysis codes are being implemented and assessed on a JPL/California Institute of Technology (Caltech) Mark 3 Hypercube. The codes, which utilize different underlying algorithms, give a means of evaluating the general applicability of this parallel architecture. The three analysis codes being implemented are a frequency domain method of moments code, a time domain finite difference code, and a frequency domain finite elements code. These analysis capabilities are being integrated into an electromagnetics interactive analysis workstation which can serve as a design tool for the construction of antennas and other radiating or scattering structures. The first two years of work on the Hypercube Matrix Computation effort are summarized. The summary includes both new developments and results as well as work previously reported in the Hypercube Matrix Computation Task: Final Report for 1986 to 1987 (JPL Publication 87-18).

  15. Soft computing methods for geoidal height transformation

    NASA Astrophysics Data System (ADS)

    Akyilmaz, O.; Özlüdemir, M. T.; Ayan, T.; Çelik, R. N.

    2009-07-01

    Soft computing techniques, such as fuzzy logic and artificial neural network (ANN) approaches, have enabled researchers to create precise models for use in many scientific and engineering applications. Applications that can be employed in geodetic studies include the estimation of earth rotation parameters and the determination of mean sea level changes. Another important field of geodesy in which these computing techniques can be applied is geoidal height transformation. We report here our use of a conventional polynomial model, the Adaptive Network-based Fuzzy (or in some publications, Adaptive Neuro-Fuzzy) Inference System (ANFIS), an ANN and a modified ANN approach to approximate geoid heights. These approximation models have been tested on a number of test points. The results obtained through the transformation processes from ellipsoidal heights into local levelling heights have also been compared.
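
    A sketch of the conventional polynomial baseline mentioned above, assuming local benchmark coordinates and observed geoid heights: a second-order bivariate surface fitted by least squares. The ANFIS and ANN models in the paper replace this simple basis; names are illustrative.

        import numpy as np

        def fit_geoid_surface(x, y, n_obs):
            """Fit N(x, y) = c0 + c1*x + c2*y + c3*x*y + c4*x^2 + c5*y^2
            to geoid heights n_obs observed at benchmarks (x, y)."""
            A = np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])
            coeffs, *_ = np.linalg.lstsq(A, n_obs, rcond=None)
            return coeffs   # evaluate the same basis to interpolate new points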

  16. Computing the Effects of Strain on Electronic States: A Survey of Methods and Issues

    DTIC Science & Technology

    2012-12-01

    We present a ... lays the foundations for first-principles approaches, including discussion of spin-orbit coupling. Section 3 presents an overview of empirical ... addition and removal energies of the independent-electron approximation. For simplicity, the energy levels in the figure have been presented as if they ...

  17. Computational Biomathematics: Toward Optimal Control of Complex Biological Systems

    DTIC Science & Technology

    2016-09-26

    ... equations seems daunting. However, we are currently working on parameter estimation methods that show some promise. In this approach, we generate data from ...

  18. Constrained Fisher Scoring for a Mixture of Factor Analyzers

    DTIC Science & Technology

    2016-09-01

    ... expectation-maximization algorithm with similar computational requirements. Lastly, we demonstrate the efficacy of the proposed method for learning a ...

  19. Stream temperature investigations: field and analytic methods

    USGS Publications Warehouse

    Bartholow, J.M.

    1989-01-01

    Alternative public domain stream and reservoir temperature models are contrasted with SNTEMP. A distinction is made between steady-flow and dynamic-flow models and their respective capabilities. Regression models are offered as an alternative approach for some situations, with appropriate mathematical formulas suggested. Appendices provide information on State and Federal agencies that are good data sources, vendors for field instrumentation, and small computer programs useful in data reduction.

  20. Least-squares Legendre spectral element solutions to sound propagation problems.

    PubMed

    Lin, W H

    2001-02-01

    This paper presents a novel algorithm and numerical results of sound wave propagation. The method is based on a least-squares Legendre spectral element approach for spatial discretization and the Crank-Nicolson [Proc. Cambridge Philos. Soc. 43, 50-67 (1947)] and Adams-Bashforth [D. Gottlieb and S. A. Orszag, Numerical Analysis of Spectral Methods: Theory and Applications (CBMS-NSF Monograph, Siam 1977)] schemes for temporal discretization to solve the linearized acoustic field equations for sound propagation. Two types of NASA Computational Aeroacoustics (CAA) Workshop benchmark problems [ICASE/LaRC Workshop on Benchmark Problems in Computational Aeroacoustics, edited by J. C. Hardin, J. R. Ristorcelli, and C. K. W. Tam, NASA Conference Publication 3300, 1995a] are considered: a narrow Gaussian sound wave propagating in a one-dimensional space without flows, and the reflection of a two-dimensional acoustic pulse off a rigid wall in the presence of a uniform flow of Mach 0.5 in a semi-infinite space. The first problem was used to examine the numerical dispersion and dissipation characteristics of the proposed algorithm. The second problem was to demonstrate the capability of the algorithm in treating sound propagation in a flow. Comparisons were made of the computed results with analytical results and results obtained by other methods. It is shown that all results computed by the present method are in good agreement with the analytical solutions and results of the first problem agree very well with those predicted by other schemes.
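
    For reference, a Crank-Nicolson step for a generic semi-discrete system du/dt = Lu takes the standard textbook form below; the paper's actual operator is the full least-squares Legendre spectral element discretization.

        \left( I - \frac{\Delta t}{2} L \right) u^{n+1}
          = \left( I + \frac{\Delta t}{2} L \right) u^{n}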

  1. Advances in medical image computing.

    PubMed

    Tolxdorff, T; Deserno, T M; Handels, H; Meinzer, H-P

    2009-01-01

    Medical image computing has become a key technology in high-tech applications in medicine and a ubiquitous part of modern imaging systems and the related processes of clinical diagnosis and intervention. Over the past years significant progress has been made in the field, both on the methodological and on the application level. Despite this progress there are still big challenges to meet in order to establish image processing routinely in health care. In this issue, selected contributions of the German Conference on Medical Image Processing (BVM) are assembled to present the latest advances in the field of medical image computing. The winners of scientific awards of the German Conference on Medical Image Processing (BVM) 2008 were invited to submit a manuscript on their latest developments and results for possible publication in Methods of Information in Medicine. Finally, seven excellent papers were selected to describe important aspects of recent advances in the field of medical image processing. The selected papers give an impression of the breadth and heterogeneity of new developments. New methods for improved image segmentation, non-linear image registration and modeling of organs are presented together with applications of image analysis methods in different medical disciplines. Furthermore, state-of-the-art tools and techniques to support the development and evaluation of medical image processing systems in practice are described. The selected articles describe different aspects of the intense development in medical image computing. The image processing methods presented enable new insights into the patient's image data and have the future potential to improve medical diagnostics and patient treatment.

  2. Impact of the mass media on calls to the CDC National AIDS Hotline.

    PubMed

    Fan, D P

    1996-06-01

    This paper considers new computer methodologies for assessing the impact of different types of public health information. The example used public service announcements (PSAs) and mass media news to predict the volume of attempts to call the CDC National AIDS Hotline from December 1992 through the end of 1993. The analysis relied solely on data from electronic databases. Newspaper stories and television news transcripts were obtained from the NEXIS electronic database and were scored by machine for AIDS coverage. The PSA database was generated by computer monitoring of advertising distributed by the Centers for Disease Control and Prevention (CDC) and by others. The volume of call attempts was collected automatically by the private branch exchange (PBX) of the Hotline telephone system. The call attempts, the PSAs and the news story data were related to each other using both a standard time series method and the statistical model of ideodynamics. The analysis indicated that the only significant explanatory variable for the call attempts was PSAs produced by the CDC. One possible explanation was that these commercials all included the Hotline telephone number while the other information sources did not.

  3. Protein-protein interaction predictions using text mining methods.

    PubMed

    Papanikolaou, Nikolas; Pavlopoulos, Georgios A; Theodosiou, Theodosios; Iliopoulos, Ioannis

    2015-03-01

    It is beyond any doubt that proteins and their interactions play an essential role in most complex biological processes. The understanding of their function individually, but also in the form of protein complexes is of a great importance. Nowadays, despite the plethora of various high-throughput experimental approaches for detecting protein-protein interactions, many computational methods aiming to predict new interactions have appeared and gained interest. In this review, we focus on text-mining based computational methodologies, aiming to extract information for proteins and their interactions from public repositories such as literature and various biological databases. We discuss their strengths, their weaknesses and how they complement existing experimental techniques by simultaneously commenting on the biological databases which hold such information and the benchmark datasets that can be used for evaluating new tools. Copyright © 2014 Elsevier Inc. All rights reserved.

  4. A review of computer-aided design/computer-aided manufacture techniques for removable denture fabrication.

    PubMed

    Bilgin, Mehmet Selim; Baytaroğlu, Ebru Nur; Erdem, Ali; Dilber, Erhan

    2016-01-01

    The aim of this review was to investigate the usage of computer-aided design/computer-aided manufacture (CAD/CAM) technologies, such as milling and rapid prototyping (RP), for removable denture fabrication. An electronic search was conducted in the PubMed/MEDLINE, ScienceDirect, Google Scholar, and Web of Science databases. Databases were searched from 1987 to 2014. The search was performed using a variety of keywords including CAD/CAM, complete/partial dentures, RP, rapid manufacturing, digitally designed, milled, computerized, and machined. The identified developments (in chronological order), techniques, advantages, and disadvantages of CAD/CAM and RP for removable denture fabrication are summarized. The initial keyword search identified 78 publications. The abstracts of these 78 articles were scanned for relevance to the topic, and 52 publications were selected for detailed reading. The full text of these articles was obtained and examined in detail. In total, 40 articles that discussed the techniques, advantages, and disadvantages of CAD/CAM and RP for removable denture fabrication were incorporated in this review, and 16 of these papers are summarized in the table. Following review of all relevant publications, it can be concluded that current innovations and technological developments in CAD/CAM and RP allow the digital planning and manufacturing of removable dentures from start to finish. According to the literature review, CAD/CAM techniques and supportive maxillomandibular relationship transfer devices are developing rapidly. In the near future, fabricating removable dentures may become a matter of medical informatics rather than of technical staff and manual procedures. However, the methods still have several limitations.

  5. A review of computer-aided design/computer-aided manufacture techniques for removable denture fabrication

    PubMed Central

    Bilgin, Mehmet Selim; Baytaroğlu, Ebru Nur; Erdem, Ali; Dilber, Erhan

    2016-01-01

    The aim of this review was to investigate the use of computer-aided design/computer-aided manufacture (CAD/CAM) technologies, such as milling and rapid prototyping (RP), for removable denture fabrication. An electronic search was conducted in the PubMed/MEDLINE, ScienceDirect, Google Scholar, and Web of Science databases, covering 1987 to 2014. The search was performed using a variety of keywords including CAD/CAM, complete/partial dentures, RP, rapid manufacturing, digitally designed, milled, computerized, and machined. The identified developments (in chronological order), techniques, advantages, and disadvantages of CAD/CAM and RP for removable denture fabrication are summarized. The initial search returned 78 publications; after their abstracts were screened, 52 were selected for full-text reading, and 40 articles that discussed the techniques, advantages, and disadvantages of CAD/CAM and RP for removable denture fabrication were incorporated in this review, 16 of which are summarized in the table. Following review of all relevant publications, it can be concluded that current innovations and technological developments of CAD/CAM and RP allow removable dentures to be digitally planned and manufactured from start to finish. The literature also indicates that CAD/CAM techniques and supportive maxillomandibular relationship transfer devices are developing rapidly. In the near future, fabricating removable dentures may become a task of medical informatics rather than one requiring technical staff and manual procedures, although the methods still have several limitations. PMID:27095912

  6. Sum and mean. Standard programs for activation analysis.

    PubMed

    Lindstrom, R M

    1994-01-01

    Two computer programs in use for over a decade in the Nuclear Methods Group at NIST illustrate the utility of standard software: programs widely available and widely used, in which (ideally) well-tested public algorithms produce results that are well understood, and thereby capable of comparison, within the community of users. Sum interactively computes the position, net area, and uncertainty of the area of spectral peaks, and can give better results than automatic peak search programs when peaks are very small, very large, or unusually shaped. Mean combines unequal measurements of a single quantity, tests for consistency, and obtains the weighted mean and six measures of its uncertainty.
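
    As a rough illustration of what a program like Mean computes, the following sketch (a toy in Python, not the NIST code) combines unequal measurements by inverse-variance weighting, tests consistency with a chi-squared statistic, and reports the internal and external uncertainties, two of the common uncertainty measures for a weighted mean; the sample numbers are invented.

      import math

      def combine(values, sigmas):
          """Inverse-variance weighted mean with internal/external uncertainties."""
          weights = [1.0 / s ** 2 for s in sigmas]
          mean = sum(w * x for w, x in zip(weights, values)) / sum(weights)
          internal = math.sqrt(1.0 / sum(weights))       # from quoted uncertainties
          chi2 = sum(w * (x - mean) ** 2 for w, x in zip(weights, values))
          dof = len(values) - 1
          external = internal * math.sqrt(chi2 / dof)    # inflated by observed scatter
          return mean, internal, external, chi2 / dof

      m, ui, ue, red_chi2 = combine([10.1, 9.8, 10.4], [0.2, 0.3, 0.25])
      print(f"mean={m:.3f} internal={ui:.3f} external={ue:.3f} chi2/dof={red_chi2:.2f}")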

  7. Computers in Louisiana Public Schools: A Trend Analysis, 1980-1990.

    ERIC Educational Resources Information Center

    Jordan, Daniel W.

    A survey instrument was developed to determine the status of microcomputers in the public schools of Louisiana. The instrument solicited information concerning the number of microcomputers in the schools, the manufacturers of the computers, and how the computers were being used in the school system. The survey was mailed to 66 districts in the…

  8. Observations on the Use of Computer and Broadcast Television Technology in One Public Elementary School.

    ERIC Educational Resources Information Center

    Hoge, John Douglas

    This paper provides participant observations regarding the use of computer and broadcast television technology at a suburban public elementary school in Athens, Georgia during the 1995-1996 school year. The paper describes the hardware and software available in the school, and the use and misuse of computers and broadcast television in the…

  9. Computer-Assisted Management of Instruction in Veterinary Public Health

    ERIC Educational Resources Information Center

    Holt, Elsbeth; And Others

    1975-01-01

    Reviews a course in Food Hygiene and Public Health at the University of Illinois College of Veterinary Medicine in which students are sequenced through a series of computer-based lessons or autotutorial slide-tape lessons, the computer also being used to route, test, and keep records. Since grades indicated mastery of the subject, the course will…

  10. 41 CFR 301-11.632 - How should we compute the employee's ITRA?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 41 Public Contracts and Property Management 4 2010-07-01 2010-07-01 false How should we compute the employee's ITRA? 301-11.632 Section 301-11.632 Public Contracts and Property Management Federal... § 301-11.632 How should we compute the employee's ITRA? You should follow the procedures prescribed for...

  11. Climate@Home: Crowdsourcing Climate Change Research

    NASA Astrophysics Data System (ADS)

    Xu, C.; Yang, C.; Li, J.; Sun, M.; Bambacus, M.

    2011-12-01

    Climate change deeply impacts human wellbeing. Significant resources have been invested in building supercomputers capable of running advanced climate models, which help scientists understand climate change mechanisms and predict its trend. Although climate change influences all human beings, the general public is largely excluded from the research. On the other hand, scientists are eagerly seeking communication media to effectively enlighten the public about climate change and its consequences. The Climate@Home project aims to connect the two ends with an innovative solution: crowdsourcing climate computing to the general public by harvesting volunteered computing resources from participants. A distributed web-based computing platform will be built to support climate computing, and the general public can 'plug in' their personal computers to participate in the research. People contribute the spare computing power of their computers to run a computer model that scientists use to predict climate change. Traditionally, only supercomputers could handle such a large processing load. By orchestrating massive numbers of personal computers to perform atomized data processing tasks, investments in new supercomputers, the energy consumed by supercomputers, and the carbon released by supercomputers are reduced. Meanwhile, the platform forms a social network of climate researchers and the general public, which may be leveraged to raise climate awareness among the participants. A portal is to be built as the gateway to the Climate@Home project. Three types of roles and the corresponding functionalities are designed and supported; the end users include citizen participants, climate scientists, and project managers. Citizen participants connect their computing resources to the platform by downloading and installing a computing engine on their personal computers. Computer climate models are defined at the server side. Climate scientists configure model parameters through the portal user interface and then launch the computing task. Next, data is atomized and distributed to computing engines running on citizen participants' computers. Scientists receive notifications on the completion of computing tasks and examine modeling results via the visualization modules of the portal. Computing tasks, computing resources, and participants are managed by project managers via portal tools. A portal prototype has been built for proof of concept. Three forums have been set up for different groups of users to share information on the science, technology, and educational outreach aspects. A Facebook account has been set up to distribute messages via the most popular social networking platform, and new threads are synchronized from the forums to Facebook. A mapping tool displays the geographic locations of participants and the status of tasks on each client node. A group of users has been invited to test functions such as the forums, blogs, and computing resource monitoring.
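
    A minimal sketch of the task-atomization idea described above, assuming nothing about the actual Climate@Home implementation: a server-side queue splits a hypothetical parameter sweep into tasks, and worker threads stand in for the participants' computing engines that pull tasks and return results.

      import queue, threading

      tasks, results = queue.Queue(), queue.Queue()

      def toy_climate_model(params):
          # Stand-in for a real climate model run.
          return sum(params) / len(params)

      def engine(worker_id):
          # One participant's "computing engine": pull tasks until none remain.
          while True:
              try:
                  task_id, params = tasks.get_nowait()
              except queue.Empty:
                  return
              results.put((task_id, worker_id, toy_climate_model(params)))

      # Server side: atomize a (made-up) parameter sweep into independent tasks.
      for task_id, sensitivity in enumerate([0.5, 1.0, 1.5, 2.0]):
          tasks.put((task_id, [sensitivity, sensitivity + 0.1]))

      workers = [threading.Thread(target=engine, args=(i,)) for i in range(3)]
      for w in workers: w.start()
      for w in workers: w.join()
      while not results.empty():
          print(results.get())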

  12. Privacy-preserving public auditing for data integrity in cloud

    NASA Astrophysics Data System (ADS)

    Shaik Saleem, M.; Murali, M.

    2018-04-01

    Cloud computing, which has attracted extensive attention from both the research community and industry, provides a large pool of computing resources through virtualized sharing methods, including storage, processing power, applications, and services. Cloud users are vended on-demand resources as they need them. An outsourced file can easily be tampered with because it is stored in a third-party service provider's databases, and users have no direct control over their data; providing security assurance for user data has therefore become a primary concern for cloud service providers. Cloud servers do not take responsibility for data loss and do not by themselves provide such assurance. Remote data integrity checking (RDIC) allows a data owner to verify that a storage server is truthfully storing the owner's data. RDIC comprises a security model and ID-based RDIC, which is responsible for the security of each server and ensures the data privacy of the cloud user against the third-party verifier. In general, clients can check the integrity of their cloud data themselves by running a two-party RDIC protocol, but in the two-party setting the verification result, coming from either the data holder or the cloud server, may be considered one-sided. The public verifiability feature of RDIC gives all users the right to verify whether the original data have been modified. To ensure the transparency of publicly verifiable RDIC protocols, a third-party auditor (TPA) with the knowledge and efficiency to perform the verification is assumed to exist.
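
    For intuition only, here is a toy spot-check in the spirit of RDIC, with all names hypothetical: the owner tags each block with a keyed hash before outsourcing, the verifier challenges a random subset of block indices, and the server answers from its stored copy. Real RDIC schemes instead use homomorphic authenticators so the verifier never downloads the blocks, and ID-based variants remove the key management this sketch ignores.

      import hashlib, os, random

      BLOCK = 4096

      def tag_blocks(data, key):
          # Owner: one keyed, position-bound hash tag per block.
          blocks = [data[i:i + BLOCK] for i in range(0, len(data), BLOCK)]
          return [hashlib.sha256(key + i.to_bytes(4, "big") + b).hexdigest()
                  for i, b in enumerate(blocks)]

      def prove(stored, indices):                    # runs on the cloud server
          return [stored[i * BLOCK:(i + 1) * BLOCK] for i in indices]

      data = os.urandom(10 * BLOCK)
      key = os.urandom(16)
      tags = tag_blocks(data, key)                   # owner keeps only the tags
      idx = random.sample(range(len(tags)), 3)       # verifier's random challenge
      answer = prove(data, idx)                      # server's response
      print("data intact:", all(
          hashlib.sha256(key + i.to_bytes(4, "big") + b).hexdigest() == tags[i]
          for i, b in zip(idx, answer)))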

  13. The 100 most-cited original articles in cardiac computed tomography: A bibliometric analysis.

    PubMed

    O'Keeffe, Michael E; Hanna, Tarek N; Holmes, Davis; Marais, Olivia; Mohammed, Mohammed F; Clark, Sheldon; McLaughlin, Patrick; Nicolaou, Savvas; Khosa, Faisal

    2016-01-01

    Bibliometric analysis is the application of statistical methods to quantitative data about scientific publications. It can evaluate research performance, author productivity, and manuscript impact. To the best of our knowledge, no bibliometric analysis has focused on cardiac computed tomography (CT). The purpose of this paper was to compile a list of the 100 most-cited articles in the cardiac CT literature using Scopus and Web of Science (WOS). A list of the 100 most-cited articles was compiled by order of citation frequency, as well as a list of the top 10 most-cited guideline and review articles and the 20 most-cited articles of the years 2014-2015. The database of 100 most-cited articles was analyzed to identify characteristics of highly cited publications. For each manuscript, the number of authors, study design, size of patient cohort, and departmental affiliations were cataloged. The 100 most-cited articles were published from 1990 to 2012, with the majority (53) published between 2005 and 2009. The total number of citations varied from 196 to 3354, and the number of citations per year varied from 9.5 to 129.0, with a median of 30.9 and a mean of 38.7. The majority of publications had a patient sample size of 200 or fewer. The USA and Germany were the nations with the highest number of frequently cited publications. This bibliometric analysis provides insights on the most-cited articles published on the subject of cardiac CT and calcium volume, thus helping to characterize the field and guide future research. Copyright © 2016 Society of Cardiovascular Computed Tomography. Published by Elsevier Inc. All rights reserved.
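
    A minimal sketch of the citations-per-year metric reported above; the records below are invented, not entries from the study's list.

      from statistics import mean, median

      # Hypothetical records: (title, publication year, total citations as of 2016).
      records = [("Article A", 1990, 3354),
                 ("Article B", 2005, 1200),
                 ("Article C", 2012, 780)]

      def citations_per_year(total, year, current_year=2016):
          return total / max(current_year - year, 1)

      rates = [citations_per_year(c, y) for _, y, c in records]
      for (title, year, total), rate in zip(records, rates):
          print(f"{title} ({year}): {rate:.1f} citations/year")
      print(f"median {median(rates):.1f}, mean {mean(rates):.1f}")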

  14. Benchmarking information needs and use in the Tennessee public health community*

    PubMed Central

    Lee, Patricia; Giuse, Nunzia B.; Sathe, Nila A.

    2003-01-01

    Objective: The objective is to provide insight into public health officials' information needs and promote access to data repositories and communication tools. Methods: Survey questions were identified by a focus group with members drawn from the fields of librarianship, public health, and informatics. The resulting comprehensive information needs survey, organized in five distinct broad categories, was distributed to 775 Tennessee public health workers from ninety-five counties in 1999 as part of the National Library of Medicine–funded Partners in Information Access contract. Results: The assessment pooled responses from 571 public health workers (73% return rate) representing seventy-two of ninety-five counties (53.4% urban and 46.6% rural) about their information-seeking behaviors, frequency of resources used, computer skills, and level of Internet access. Sixty-four percent of urban and 43% of rural respondents had email access at work, and more than 50% of both urban and rural respondents had email at home (N = 289). Approximately 70% of urban and 78% of rural public health officials never or seldom used or needed the Centers for Disease Control (CDC) Website. Frequency data pooled from eleven job categories representing a subgroup of 232 health care professionals showed 72% never or seldom used or needed MEDLINE. Electronic resources used daily or weekly were email, Internet search engines, internal databases and mailing lists, and the Tennessee Department of Health Website. Conclusions: While, due to the small sample size, the data cannot be generalized to the larger population, a clear trend of significant barriers to computer and Internet access can be identified across the public health community. This contributes to an overall limited use of existing electronic resources that inhibits evidence-based practice. PMID:12883562

  15. A survey of current trends in computational drug repositioning.

    PubMed

    Li, Jiao; Zheng, Si; Chen, Bin; Butte, Atul J; Swamidass, S Joshua; Lu, Zhiyong

    2016-01-01

    Computational drug repositioning or repurposing is a promising and efficient tool for discovering new uses for existing drugs and holds great potential for precision medicine in the age of big data. The explosive growth of large-scale genomic and phenotypic data, as well as data on small molecular compounds with granted regulatory approval, is enabling new developments for computational repositioning. To achieve the shortest path toward new drug indications, advanced data processing and analysis strategies are critical for making sense of these heterogeneous molecular measurements. In this review, we show recent advancements in the critical areas of computational drug repositioning from multiple aspects. First, we summarize available data sources and the corresponding computational repositioning strategies. Second, we characterize the commonly used computational techniques. Third, we discuss validation strategies for repositioning studies, including both computational and experimental methods. Finally, we highlight potential opportunities and use-cases, including a few target areas such as cancers. We conclude with a brief discussion of the remaining challenges in computational drug repositioning. Published by Oxford University Press 2015. This work is written by US Government employees and is in the public domain in the US.

  16. Electronic Publishing.

    ERIC Educational Resources Information Center

    Lancaster, F. W.

    1989-01-01

    Describes various stages involved in the applications of electronic media to the publishing industry. Highlights include computer typesetting, or photocomposition; machine-readable databases; the distribution of publications in electronic form; computer conferencing and electronic mail; collaborative authorship; hypertext; hypermedia publications;…

  17. Needs assessment for next generation computer-aided mammography reference image databases and evaluation studies.

    PubMed

    Horsch, Alexander; Hapfelmeier, Alexander; Elter, Matthias

    2011-11-01

    Breast cancer is globally a major threat for women's health. Screening and adequate follow-up can significantly reduce the mortality from breast cancer. Human second reading of screening mammograms can increase breast cancer detection rates, whereas this has not been proven for current computer-aided detection systems as "second reader". Critical factors include the detection accuracy of the systems and the screening experience and training of the radiologist with the system. When assessing the performance of systems and system components, the choice of evaluation methods is particularly critical. Core assets herein are reference image databases and statistical methods. We have analyzed characteristics and usage of the currently largest publicly available mammography database, the Digital Database for Screening Mammography (DDSM) from the University of South Florida, in literature indexed in Medline, IEEE Xplore, SpringerLink, and SPIE, with respect to type of computer-aided diagnosis (CAD) (detection, CADe, or diagnostics, CADx), selection of database subsets, choice of evaluation method, and quality of descriptions. 59 publications presenting 106 evaluation studies met our selection criteria. In 54 studies (50.9%), the selection of test items (cases, images, regions of interest) extracted from the DDSM was not reproducible. Only 2 CADx studies, not any CADe studies, used the entire DDSM. The number of test items varies from 100 to 6000. Different statistical evaluation methods are chosen. Most common are train/test (34.9% of the studies), leave-one-out (23.6%), and N-fold cross-validation (18.9%). Database-related terminology tends to be imprecise or ambiguous, especially regarding the term "case". Overall, both the use of the DDSM as data source for evaluation of mammography CAD systems, and the application of statistical evaluation methods were found highly diverse. Results reported from different studies are therefore hardly comparable. Drawbacks of the DDSM (e.g. varying quality of lesion annotations) may contribute to the reasons. But larger bias seems to be caused by authors' own decisions upon study design. RECOMMENDATIONS/CONCLUSION: For future evaluation studies, we derive a set of 13 recommendations concerning the construction and usage of a test database, as well as the application of statistical evaluation methods.
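
    Of the evaluation methods tallied above, N-fold cross-validation is the easiest to make reproducible; the sketch below (an illustration, not code from the paper) shows how fixing the random seed makes the exact split reportable, which addresses the reproducibility gap the authors criticize.

      import random

      def k_fold_indices(n_items, k, seed=42):
          """Yield (train, test) index lists for reproducible k-fold cross-validation."""
          rng = random.Random(seed)        # fixing the seed makes the split reportable
          idx = list(range(n_items))
          rng.shuffle(idx)
          folds = [idx[i::k] for i in range(k)]
          for i in range(k):
              test = folds[i]
              train = [j for f in folds[:i] + folds[i + 1:] for j in f]
              yield train, test

      for train, test in k_fold_indices(10, 5):
          print(len(train), "train /", len(test), "test:", sorted(test))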

  18. Architecture and Initial Development of a Digital Library Platform for Computable Knowledge Objects for Health.

    PubMed

    Flynn, Allen J; Bahulekar, Namita; Boisvert, Peter; Lagoze, Carl; Meng, George; Rampton, James; Friedman, Charles P

    2017-01-01

    Throughout the world, biomedical knowledge is routinely generated and shared through primary and secondary scientific publications. However, there is too much latency between publication of knowledge and its routine use in practice. To address this latency, what is actionable in scientific publications can be encoded to make it computable. We have created a purpose-built digital library platform to hold, manage, and share actionable, computable knowledge for health called the Knowledge Grid Library. Here we present it with its system architecture.

  19. CT brush and CancerZap!: two video games for computed tomography dose minimization.

    PubMed

    Alvare, Graham; Gordon, Richard

    2015-05-12

    X-ray dose from computed tomography (CT) scanners has become a significant public health concern. All CT scanners spray x-ray photons across a patient, including those using compressive sensing algorithms. New technologies make it possible to aim x-ray beams where they are most needed to form a diagnostic or screening image. We have designed a computer game, CT Brush, that takes advantage of this new flexibility. It uses a standard MART algorithm (Multiplicative Algebraic Reconstruction Technique), but with a user defined dynamically selected subset of the rays. The image appears as the player moves the CT brush over an initially blank scene, with dose accumulating with every "mouse down" move. The goal is to find the "tumor" with as few moves (least dose) as possible. We have successfully implemented CT Brush in Java and made it available publicly, requesting crowdsourced feedback on improving the open source code. With this experience, we also outline a "shoot 'em up game" CancerZap! for photon limited CT. We anticipate that human computing games like these, analyzed by methods similar to those used to understand eye tracking, will lead to new object dependent CT algorithms that will require significantly less dose than object independent nonlinear and compressive sensing algorithms that depend on sprayed photons. Preliminary results suggest substantial dose reduction is achievable.
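
    A minimal sketch of a MART update restricted to a user-selected subset of rays, the core idea behind CT Brush as described above; the system matrix and phantom are toy values, and the real game reconstructs full images.

      import numpy as np

      def mart_subset(A, b, ray_subset, n_iters=50, lam=0.5):
          """Multiplicative ART over a chosen subset of rays (toy CT Brush-style update).

          A: (n_rays, n_pixels) system matrix; b: measured ray sums.
          Only rays in ray_subset contribute, so dose scales with the rays used.
          """
          x = np.ones(A.shape[1])                         # strictly positive start
          for _ in range(n_iters):
              for i in ray_subset:
                  est = A[i] @ x
                  if est > 0:
                      x *= (b[i] / est) ** (lam * A[i])   # multiplicative update
          return x

      # 2-pixel toy phantom, 3 possible rays; the "player" has uncovered rays 0 and 2.
      A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
      truth = np.array([2.0, 3.0])
      b = A @ truth
      print(mart_subset(A, b, ray_subset=[0, 2]))         # converges toward [2, 3]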

  20. Constructing a Foundational Platform Driven by Japan's K Supercomputer for Next-Generation Drug Design.

    PubMed

    Brown, J B; Nakatsui, Masahiko; Okuno, Yasushi

    2014-12-01

    The cost of pharmaceutical R&D has risen enormously, both worldwide and in Japan. However, Japan faces a particularly difficult situation in that its population is aging rapidly, and the cost of pharmaceutical R&D affects not only the industry but the entire medical system as well. To attempt to reduce costs, the newly launched K supercomputer is available for big data drug discovery and structural simulation-based drug discovery. We have implemented both primary (direct) and secondary (infrastructure, data processing) methods for the two types of drug discovery, custom-tailored to make maximal use of the 88,128 compute nodes/CPUs of K, and evaluated the implementations. We present two types of results. In the first, we executed the virtual screening of nearly 19 billion compound-protein interactions and calculated the accuracy of predictions against publicly available experimental data. In the second investigation, we implemented a very computationally intensive binding free energy algorithm and found that our binding free energies were considerably accurate when validated against another type of publicly available experimental data. The common feature of both result types is the scale at which the computations were executed. The frameworks presented in this article provide perspectives and applications that, while tuned to the computing resources available in Japan, are equally applicable to any equivalent large-scale infrastructure provided elsewhere. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. Computers and the internet: tools for youth empowerment.

    PubMed

    Valaitis, Ruta K

    2005-10-04

    Youth are often disenfranchised in their communities and may feel they have little voice. Since computers are an important aspect of youth culture, they may offer solutions to increasing youth participation in communities. This qualitative case study investigated the perceptions of 19 (predominantly female) inner-city school youth about their use of computers and the Internet in a school-based community development project. Youth working with public health nurses in a school-based community development project communicated with local community members using computer-mediated communication, surveyed peers online, built websites, searched for information online, and prepared project materials using computers and the Internet. Participant observation, semistructured interviews, analysis of online messages, and online- and paper-based surveys were used to gather data about youth's and adults' perceptions and use of the technologies. The constant comparison method and between-method triangulation were used in the analysis to establish the themes. Not all youth were interested in working with computers. Some electronic messages from adults were perceived to be critical, and writing to adults was intimidating for some youth. In addition, technical problems were experienced. Despite these barriers, most youth perceived that using computers and the Internet reduced their anxiety concerning communication with adults, increased their control when dealing with adults, raised their perception of their social status, increased participation within the community, supported reflective thought, increased efficiency, and improved their access to resources. Overall, youth perceived computers and the Internet to be empowering tools, and they should be encouraged to use such technology to support them in community initiatives.

  2. Conditional cooperation and confusion in public-goods experiments

    PubMed Central

    Burton-Chellew, Maxwell N.; El Mouden, Claire; West, Stuart A.

    2016-01-01

    Economic experiments are often used to study if humans altruistically value the welfare of others. A canonical result from public-good games is that humans vary in how they value the welfare of others, dividing into fair-minded conditional cooperators, who match the cooperation of others, and selfish noncooperators. However, an alternative explanation for the data is that individuals vary in their understanding of how to maximize income, with misunderstanding leading to the appearance of cooperation. We show that (i) individuals divide into the same behavioral types when playing with computers, whose welfare they cannot be concerned about; (ii) behavior across games with computers and humans is correlated and can be explained by variation in understanding of how to maximize income; (iii) misunderstanding correlates with higher levels of cooperation; and (iv) standard control questions do not guarantee understanding. These results cast doubt on certain experimental methods and demonstrate that a common assumption in behavioral economics experiments, that choices reveal motivations, will not necessarily hold. PMID:26787890

  3. Application of data mining approaches to drug delivery.

    PubMed

    Ekins, Sean; Shimada, Jun; Chang, Cheng

    2006-11-30

    Computational approaches play a key role in all areas of the pharmaceutical industry from data mining, experimental and clinical data capture to pharmacoeconomics and adverse events monitoring. They will likely continue to be indispensable assets along with a growing library of software applications. This is primarily due to the increasingly massive amount of biology, chemistry and clinical data, which is now entering the public domain mainly as a result of NIH and commercially funded projects. We are therefore in need of new methods for mining this mountain of data in order to enable new hypothesis generation. The computational approaches include, but are not limited to, database compilation, quantitative structure activity relationships (QSAR), pharmacophores, network visualization models, decision trees, machine learning algorithms and multidimensional data visualization software that could be used to improve drug delivery after mining public and/or proprietary data. We will discuss some areas of unmet needs in the area of data mining for drug delivery that can be addressed with new software tools or databases of relevance to future pharmaceutical projects.

  4. Computer programs to characterize alloys and predict cyclic life using the total strain version of strainrange partitioning: Tutorial and users manual, version 1.0

    NASA Technical Reports Server (NTRS)

    Saltsman, James F.

    1992-01-01

    This manual presents computer programs for characterizing and predicting fatigue and creep-fatigue resistance of metallic materials in the high-temperature, long-life regime for isothermal and nonisothermal fatigue. The programs use the total strain version of Strainrange Partitioning (TS-SRP). An extensive database has also been developed in a parallel effort. This database is probably the largest source of high-temperature, creep-fatigue test data available in the public domain and can be used with other life prediction methods as well. This users manual, software, and database are all in the public domain and are available through COSMIC (382 East Broad Street, Athens, GA 30602; (404) 542-3265, FAX (404) 542-4807). Two disks accompany this manual. The first disk contains the source code, executable files, and sample output from these programs. The second disk contains the creep-fatigue data in a format compatible with these programs.

  5. Conditional cooperation and confusion in public-goods experiments.

    PubMed

    Burton-Chellew, Maxwell N; El Mouden, Claire; West, Stuart A

    2016-02-02

    Economic experiments are often used to study if humans altruistically value the welfare of others. A canonical result from public-good games is that humans vary in how they value the welfare of others, dividing into fair-minded conditional cooperators, who match the cooperation of others, and selfish noncooperators. However, an alternative explanation for the data is that individuals vary in their understanding of how to maximize income, with misunderstanding leading to the appearance of cooperation. We show that (i) individuals divide into the same behavioral types when playing with computers, whose welfare they cannot be concerned about; (ii) behavior across games with computers and humans is correlated and can be explained by variation in understanding of how to maximize income; (iii) misunderstanding correlates with higher levels of cooperation; and (iv) standard control questions do not guarantee understanding. These results cast doubt on certain experimental methods and demonstrate that a common assumption in behavioral economics experiments, that choices reveal motivations, will not necessarily hold.

  6. Use of the internet to study the utility values of the public.

    PubMed Central

    Lenert, Leslie A.; Sturley, Ann E.

    2002-01-01

    One of the most difficult tasks in cost-effectiveness analysis is the measurement of quality weights (utilities) for health states. The task is difficult because subjects often lack familiarity with the health states they are asked to rate and because utility measures, such as the standard gamble, ask subjects to perform tasks that are complex and far from everyday experience. A large body of research suggests that computer methods can play an important role in explaining health states and measuring utilities. However, administering computer surveys to a "general public" sample, the most relevant sample for cost-effectiveness analysis, is logistically difficult. In this paper, we describe a software system designed to allow the study of general population preferences in a volunteer Internet survey panel. The approach, which relied on oversampling of ethnic groups and older members of the panel, produced a data set with an ethnically, chronologically, and geographically diverse group of respondents, but was not successful in replicating the joint distribution of demographic patterns in the population. PMID:12463862

  7. Dictionary-driven protein annotation.

    PubMed

    Rigoutsos, Isidore; Huynh, Tien; Floratos, Aris; Parida, Laxmi; Platt, Daniel

    2002-09-01

    Computational methods seeking to automatically determine the properties (functional, structural, physicochemical, etc.) of a protein directly from the sequence have long been the focus of numerous research groups. With the advent of advanced sequencing methods and systems, the number of amino acid sequences that are being deposited in the public databases has been increasing steadily. This has in turn generated a renewed demand for automated approaches that can annotate individual sequences and complete genomes quickly, exhaustively and objectively. In this paper, we present one such approach that is centered around and exploits the Bio-Dictionary, a collection of amino acid patterns that completely covers the natural sequence space and can capture functional and structural signals that have been reused during evolution, within and across protein families. Our annotation approach also makes use of a weighted, position-specific scoring scheme that is unaffected by the over-representation of well-conserved proteins and protein fragments in the databases used. For a given query sequence, the method permits one to determine, in a single pass, the following: local and global similarities between the query and any protein already present in a public database; the likeness of the query to all available archaeal/bacterial/eukaryotic/viral sequences in the database as a function of amino acid position within the query; the character of secondary structure of the query as a function of amino acid position within the query; the cytoplasmic, transmembrane or extracellular behavior of the query; the nature and position of binding domains, active sites, post-translationally modified sites, signal peptides, etc. In terms of performance, the proposed method is exhaustive, objective and allows for the rapid annotation of individual sequences and full genomes. Annotation examples are presented and discussed in Results, including individual queries and complete genomes that were released publicly after we built the Bio-Dictionary that is used in our experiments. Finally, we have computed the annotations of more than 70 complete genomes and made them available on the World Wide Web at http://cbcsrv.watson.ibm.com/Annotations/.
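
    The Bio-Dictionary itself is not reproduced here; as a generic illustration of annotating a query with weighted sequence patterns, the sketch below scans a sequence with two hypothetical regular-expression motifs and accumulates a per-residue score.

      import re

      # Toy patterns with weights; both motifs and weights are invented.
      patterns = {r"G.GK[ST]": 2.5,   # hypothetical P-loop-like motif
                  r"C..C": 1.0}       # hypothetical zinc-finger-like fragment

      def annotate(seq):
          # Every residue covered by a pattern match accumulates that pattern's weight.
          scores = [0.0] * len(seq)
          for pat, w in patterns.items():
              for m in re.finditer(pat, seq):
                  for pos in range(m.start(), m.end()):
                      scores[pos] += w
          return scores

      seq = "MAGESGAGKSTLLK"
      for pos, (aa, s) in enumerate(zip(seq, annotate(seq))):
          if s > 0:
              print(pos, aa, s)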

  8. Analysis of citation networks as a new tool for scientific research

    DOE PAGES

    Vasudevan, R. K.; Ziatdinov, M.; Chen, C.; ...

    2016-12-06

    The rapid growth of scientific publications necessitates new methods to understand the direction of scientific research within fields of study, ascertain the importance of particular groups, authors, or institutions, compute metrics that can determine the importance (centrality) of particular seminal papers, and provide insight into the social (collaboration) networks that are present. We present one such method based on analysis of citation networks, using the freely available CiteSpace Program. We use citation network analysis on three examples, including a single material that has been widely explored in the last decade (BiFeO3), two small subfields with a minimal number of authors (flexoelectricity and Kitaev physics), and a much wider field with thousands of publications pertaining to a single technique (scanning tunneling microscopy). Interpretation of the analysis and key insights into the fields, such as whether the fields are experiencing resurgence or stagnation, are discussed, and author or collaboration networks that are prominent are determined. Such methods represent a paradigm shift in our way of dealing with the large volume of scientific publications and could change the way literature searches and reviews are conducted, as well as how the impact of specific work is assessed.
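
    A minimal sketch of the kind of citation-network centrality computation such tools perform, using the networkx library on an invented six-paper network; CiteSpace itself offers much richer burst and cluster analyses.

      import networkx as nx

      # Toy citation network: edges point from citing paper to cited paper.
      G = nx.DiGraph([("P3", "P1"), ("P4", "P1"), ("P4", "P2"),
                      ("P5", "P3"), ("P5", "P4"), ("P6", "P4")])

      in_deg = dict(G.in_degree())                  # raw citation counts
      betweenness = nx.betweenness_centrality(G)    # brokerage between clusters
      for node in G:
          print(node, "cited", in_deg[node], "times, betweenness",
                round(betweenness[node], 3))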

  9. Reusable data in public health data-bases-problems encountered in Danish Children's Database.

    PubMed

    Høstgaard, Anna Marie; Pape-Haugaard, Louise

    2012-01-01

    Denmark has unique health informatics databases, e.g., "The Children's Database", which since 2009 has held data on all Danish children from birth until 17 years of age. In the current set-up a number of potential sources of errors exist, both technical and human, which means that the data is flawed. This gives rise to erroneous statistics and makes the data unsuitable for research purposes. In order to make the data usable, it is necessary to develop new methods for validating the data generation process at the municipal/regional/national level. The present ongoing research project combines two research areas, Public Health Informatics and Computer Science, and uses both ethnographic and system engineering research methods. The project is expected to generate new generic methods and knowledge about electronic data collection and transmission in different social contexts and by different social groups, and thus to be of international importance, since this is sparsely documented from the Public Health Informatics perspective. This paper presents the preliminary results, which indicate that the health information technology used ought to be redesigned, with a thorough insight into work practices as the point of departure.

  10. Analysis of citation networks as a new tool for scientific research

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vasudevan, R. K.; Ziatdinov, M.; Chen, C.

    The rapid growth of scientific publications necessitates new methods to understand the direction of scientific research within fields of study, ascertain the importance of particular groups, authors, or institutions, compute metrics that can determine the importance (centrality) of particular seminal papers, and provide insight into the social (collaboration) networks that are present. We present one such method based on analysis of citation networks, using the freely available CiteSpace Program. We use citation network analysis on three examples, including a single material that has been widely explored in the last decade (BiFeO3), two small subfields with a minimal number of authors (flexoelectricity and Kitaev physics), and a much wider field with thousands of publications pertaining to a single technique (scanning tunneling microscopy). Interpretation of the analysis and key insights into the fields, such as whether the fields are experiencing resurgence or stagnation, are discussed, and author or collaboration networks that are prominent are determined. Such methods represent a paradigm shift in our way of dealing with the large volume of scientific publications and could change the way literature searches and reviews are conducted, as well as how the impact of specific work is assessed.

  11. 43 CFR 38.2 - Computation of hourly, daily, weekly, and biweekly adjusted rates of pay.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 43 Public Lands: Interior 1 2010-10-01 2010-10-01 false Computation of hourly, daily, weekly, and biweekly adjusted rates of pay. 38.2 Section 38.2 Public Lands: Interior Office of the Secretary of the Interior PAY OF U.S. PARK POLICE-INTERIM GEOGRAPHIC ADJUSTMENTS § 38.2 Computation of hourly, daily, weekly...

  12. Principles of Protein Stability and Their Application in Computational Design.

    PubMed

    Goldenzweig, Adi; Fleishman, Sarel

    2018-01-26

    Proteins are increasingly used in basic and applied biomedical research. Many proteins, however, are only marginally stable and can be expressed in limited amounts, thus hampering research and applications. Research has revealed the thermodynamic, cellular, and evolutionary principles and mechanisms that underlie marginal stability. With this growing understanding, computational stability design methods have advanced over the past two decades starting from methods that selectively addressed only some aspects of marginal stability. Current methods are more general and, by combining phylogenetic analysis with atomistic design, have shown drastic improvements in solubility, thermal stability, and aggregation resistance while maintaining the protein's primary molecular activity. Stability design is opening the way to rational engineering of improved enzymes, therapeutics, and vaccines and to the application of protein design methodology to large proteins and molecular activities that have proven challenging in the past. Expected final online publication date for the Annual Review of Biochemistry Volume 87 is June 20, 2018. Please see http://www.annualreviews.org/page/journal/pubdates for revised estimates.

  13. Gene context analysis in the Integrated Microbial Genomes (IMG) data management system.

    PubMed

    Mavromatis, Konstantinos; Chu, Ken; Ivanova, Natalia; Hooper, Sean D; Markowitz, Victor M; Kyrpides, Nikos C

    2009-11-24

    Computational methods for determining the function of genes in newly sequenced genomes have traditionally been based on sequence similarity to genes whose function has been identified experimentally. Function prediction methods can be extended using gene context analysis approaches such as examining the conservation of chromosomal gene clusters, gene fusion events, and co-occurrence profiles across genomes. Context analysis is based on the observation that functionally related genes often have similar gene context, and it relies on the identification of such events across a phylogenetically diverse collection of genomes. We have used the data management system of the Integrated Microbial Genomes (IMG) as the framework to implement and explore the power of gene context analysis methods because it provides one of the largest available genome integrations. Visualization and search tools to facilitate gene context analysis have been developed and applied across all publicly available archaeal and bacterial genomes in IMG. These computations are now maintained as part of IMG's regular genome content update cycle. IMG is available at: http://img.jgi.doe.gov.
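
    Among the gene context signals mentioned above, co-occurrence profiles are the simplest to illustrate; the sketch below (toy profiles, not IMG data) scores pairs of genes by the Jaccard similarity of their presence/absence patterns across genomes.

      # Each profile marks presence (1) or absence (0) of a gene across six genomes;
      # genes with similar profiles are candidates for functional association.
      profiles = {
          "geneA": [1, 1, 0, 1, 0, 1],
          "geneB": [1, 1, 0, 1, 0, 0],
          "geneC": [0, 0, 1, 0, 1, 0],
      }

      def jaccard(p, q):
          both = sum(1 for a, b in zip(p, q) if a and b)
          either = sum(1 for a, b in zip(p, q) if a or b)
          return both / either if either else 0.0

      names = sorted(profiles)
      for i, g1 in enumerate(names):
          for g2 in names[i + 1:]:
              print(g1, g2, round(jaccard(profiles[g1], profiles[g2]), 2))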

  14. 3D optic disc reconstruction via a global fundus stereo algorithm.

    PubMed

    Bansal, M; Sizintsev, M; Eledath, J; Sawhney, H; Pearson, D J; Stone, R A

    2013-01-01

    This paper presents a novel method to recover the 3D structure of the optic disc in the retina from two uncalibrated fundus images. Retinal images are commonly uncalibrated when acquired clinically, creating rectification challenges as well as significant radiometric and blur differences within the stereo pair. By exploiting structural peculiarities of the retina, we modified the Graph Cuts computational stereo method (one of the current state-of-the-art methods) to yield a high-quality algorithm for fundus stereo reconstruction. Extensive qualitative and quantitative experimental evaluation (where OCT scans are used as 3D ground truth) on our own and publicly available datasets shows the superiority of the proposed method in comparison to other alternatives.

  15. Inferring gene and protein interactions using PubMed citations and consensus Bayesian networks.

    PubMed

    Deeter, Anthony; Dalman, Mark; Haddad, Joseph; Duan, Zhong-Hui

    2017-01-01

    The PubMed database offers an extensive set of publication data that can be useful, yet inherently complex to use without automated computational techniques. Data repositories such as the Genomic Data Commons (GDC) and the Gene Expression Omnibus (GEO) offer experimental data storage and retrieval as well as curated gene expression profiles. Genetic interaction databases, including Reactome and Ingenuity Pathway Analysis, offer pathway and experiment data analysis using data curated from these publications and data repositories. We have created a method to generate and analyze consensus networks, inferring potential gene interactions, using large numbers of Bayesian networks generated by data mining publications in the PubMed database. Through the concept of network resolution, these consensus networks can be tailored to represent possible genetic interactions. We designed a set of experiments to confirm that our method is stable across variation in both sample and topological input sizes. Using gene product interactions from the KEGG pathway database and data mining PubMed publication abstracts, we verify that regardless of the network resolution or the inferred consensus network, our method is capable of inferring meaningful gene interactions through consensus Bayesian network generation with multiple, randomized topological orderings. Our method can not only confirm the existence of currently accepted interactions, but has the potential to hypothesize new ones as well. We show that our method confirms the existence of known gene interactions such as JAK-STAT-PI3K-AKT-mTOR, infers novel gene interactions such as RAS-Bcl-2 and RAS-AKT, and finds significant pathway-pathway interactions between the JAK-STAT signaling and Cardiac Muscle Contraction KEGG pathways.
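
    A minimal sketch of the consensus idea, with each learned Bayesian network reduced to an edge set and the gene names purely illustrative; the frequency threshold here is only loosely analogous to the paper's network resolution, an assumption of this sketch rather than the authors' definition.

      from collections import Counter

      # Three hypothetical networks learned from different publication samples.
      networks = [
          {("JAK", "STAT"), ("STAT", "PI3K"), ("RAS", "AKT")},
          {("JAK", "STAT"), ("STAT", "PI3K"), ("PI3K", "AKT")},
          {("JAK", "STAT"), ("RAS", "AKT"), ("PI3K", "AKT")},
      ]

      def consensus(nets, threshold):
          # Keep edges whose relative frequency across networks clears the threshold.
          counts = Counter(e for net in nets for e in net)
          return {e for e, c in counts.items() if c / len(nets) >= threshold}

      print(consensus(networks, threshold=0.5))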

  16. Applying spatial analysis tools in public health: an example using SaTScan to detect geographic targets for colorectal cancer screening interventions.

    PubMed

    Sherman, Recinda L; Henry, Kevin A; Tannenbaum, Stacey L; Feaster, Daniel J; Kobetz, Erin; Lee, David J

    2014-03-20

    Epidemiologists are gradually incorporating spatial analysis into health-related research as geocoded cases of disease become widely available and health-focused geospatial computer applications are developed. One health-focused application of spatial analysis is cluster detection. Using cluster detection to identify geographic areas with high-risk populations and then screening those populations for disease can improve cancer control. SaTScan is a free cluster-detection software application used by epidemiologists around the world to describe spatial clusters of infectious and chronic disease, as well as disease vectors and risk factors. The objectives of this article are to describe how spatial analysis can be used in cancer control to detect geographic areas in need of colorectal cancer screening intervention, identify issues commonly encountered by SaTScan users, detail how to select the appropriate methods for using SaTScan, and explain how method selection can affect results. As an example, we used various methods to detect areas in Florida where the population is at high risk for late-stage diagnosis of colorectal cancer. We found that much of our analysis was underpowered and that no single method detected all clusters of statistical or public health significance. However, all methods detected 1 area as high risk; this area is potentially a priority area for a screening intervention. Cluster detection can be incorporated into routine public health operations, but the challenge is to identify areas in which the burden of disease can be alleviated through public health intervention. Reliance on SaTScan's default settings does not always produce pertinent results.
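
    For intuition about what a scan statistic does, the sketch below implements a toy purely spatial Poisson scan with circular windows over invented regional counts; SaTScan adds Monte Carlo hypothesis testing, elliptic windows, and space-time scanning, all omitted here.

      import math

      # Each region: (x, y, observed cases, expected cases) -- hypothetical numbers.
      regions = [(0, 0, 12, 6.0), (0, 1, 10, 6.0), (1, 0, 4, 6.0),
                 (5, 5, 5, 6.0), (5, 6, 6, 6.0), (6, 5, 7, 6.0)]
      N = sum(r[2] for r in regions)
      E_total = sum(r[3] for r in regions)

      def llr(n, e):
          # Poisson log-likelihood ratio for a window with n observed, e expected.
          if n <= e:
              return 0.0                   # only scan for high-risk clusters
          return n * math.log(n / e) + (N - n) * math.log((N - n) / (E_total - e))

      best = None
      for cx, cy, _, _ in regions:         # candidate circle centers
          for radius in (1.0, 2.0):
              inside = [(n, e) for x, y, n, e in regions
                        if (x - cx) ** 2 + (y - cy) ** 2 <= radius ** 2]
              n, e = sum(i[0] for i in inside), sum(i[1] for i in inside)
              score = llr(n, e)
              if best is None or score > best[0]:
                  best = (score, (cx, cy), radius, n, e)
      print("best cluster (LLR, center, radius, observed, expected):", best)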

  17. The Use of Public Computing Facilities by Library Patrons: Demography, Motivations, and Barriers

    ERIC Educational Resources Information Center

    DeMaagd, Kurt; Chew, Han Ei; Huang, Guanxiong; Khan, M. Laeeq; Sreenivasan, Akshaya; LaRose, Robert

    2013-01-01

    Public libraries play an important part in the development of a community. Today, they are seen as more than store houses of books; they are also responsible for the dissemination of online, and offline information. Public access computers are becoming increasingly popular as more and more people understand the need for internet access. Using a…

  18. COMPUTER TECHNOLOGY AND SOCIAL CHANGE,

    DTIC Science & Technology

    This paper presents a discussion of the social, political, economic and psychological problems associated with the rapid growth and development of...public officials and responsible groups is required to increase public understanding of the computer as a powerful tool, to select appropriate

  19. Global Flowfield About the V-22 Tiltrotor Aircraft

    NASA Technical Reports Server (NTRS)

    Meakin, Robert L.

    1996-01-01

    This final report includes five publications that resulted from the studies of the global flowfield about the V-22 Tiltrotor Aircraft. The first of the five is 'The Chimera Method of Simulation for Unsteady Three-Dimensional Viscous Flow', as presented in 'Computational Fluid Dynamics Review 1995.' The remaining papers, all presented at AIAA conferences, are 'Unsteady Simulation of the Viscous Flow About a V-22 Rotor and Wing in Hover', 'An Efficient Means of Adaptive Refinement Within Systems of Overset Grids', 'On the Spatial and Temporal Accuracy of Overset Grid Methods for Moving Body Problems', and 'Moving Body Overset Grid Methods for Complete Aircraft Tiltrotor Simulations.'

  20. The Impact of the Role of an Instructional Technology Facilitator on Teacher Efficacy in Classroom Technology Integration in Two Rural Public Schools in Northwestern North Carolina

    ERIC Educational Resources Information Center

    Adams, Karri Campbell

    2015-01-01

    The purpose of this study was to contribute to a limited body of research on the impact of the role of the school-level instructional technology facilitator on teacher technology efficacy. This mixed-methods study involved the administration of a survey instrument designed to measure teacher technology efficacy, the Computer Technology Integration…

  1. An Extensible NetLogo Model for Visualizing Message Routing Protocols

    DTIC Science & Technology

    2017-08-01

    the hard sciences to the social sciences to computer-generated art. NetLogo represents the world as a set of...describe the model is shown here; for the supporting methods, refer to the source code. ...if ticks - last-inject > time-to-inject [inject] if run# > #runs [stop] end... Next, we present some basic statistics collected for the

  2. Coprime and Nested Arrays: A New Paradigm for Sampling in Space and Time

    DTIC Science & Technology

    2015-09-14

    International Conference on Computers and Devices for Communication (CODEC), Kolkata, India, December 2015. 2) Plenary speaker at the Asia Pacific...National Symposium on Mathematical Methods and Applications, Chennai, India, Dec. 2013. This is held to honor the late Srinivasa Ramanujan every year...4) Plenary speaker at the International Conf. on Comm. and Signal Processing (ICCSP), Calicut, India, 2011. List of publications during the above

  3. Uncover the Cloud for Geospatial Sciences and Applications to Adopt Cloud Computing

    NASA Astrophysics Data System (ADS)

    Yang, C.; Huang, Q.; Xia, J.; Liu, K.; Li, J.; Xu, C.; Sun, M.; Bambacus, M.; Xu, Y.; Fay, D.

    2012-12-01

    Cloud computing is emerging as the future infrastructure for providing computing resources to support and enable scientific research, engineering development, application construction, and workforce education. On the other hand, there is considerable doubt about the readiness of cloud computing to support a variety of scientific research, development, and education activities. This research, a project funded by NASA SMD, investigates through holistic studies how ready cloud computing is to support geosciences. Four applications with different computing characteristics, including data, computing, concurrent, and spatiotemporal intensities, are used to test this readiness. Three popular and representative cloud platforms, Amazon EC2, Microsoft Azure, and NASA Nebula, as well as a traditional cluster, are utilized in the study. The results illustrate that the cloud is ready to some degree, but more research needs to be done to fully realize the cloud benefits as advertised by many vendors and defined by NIST. Specifically: 1) most cloud platforms can help stand up a new computing instance, a new computer, in a few minutes as envisioned, and are therefore ready to support most computing needs in an on-demand fashion; 2) load balancing and elasticity, a defining characteristic, is ready in some cloud platforms, such as Amazon EC2, to support bigger jobs that need responses within minutes, while others do not yet support elasticity and load balancing well, and all cloud platforms need further research and development to support real-time applications at the subminute level; 3) the user interfaces and functionality of cloud platforms vary widely; some are professional and well supported/documented, such as Amazon EC2, while others need significant improvement before the general public can adopt cloud computing without professional training or knowledge of computing infrastructure; 4) security is a big concern in cloud computing; given the sharing spirit of cloud computing, it is very hard to ensure higher-level security unless a private cloud is built for a specific organization without public access, and public cloud platforms do not yet support the FISMA medium level and may never be able to support the FISMA high level; 5) HPC jobs are not well supported, with only Amazon EC2 supporting them well. The research results are being used by NASA and other agencies in considering cloud computing adoption. We hope the publication of this research will also help the public adopt cloud computing.

  4. Texture classification of lung computed tomography images

    NASA Astrophysics Data System (ADS)

    Pheng, Hang See; Shamsuddin, Siti M.

    2013-03-01

    Current development of algorithms in computer-aided diagnosis (CAD) schemes is growing rapidly to assist radiologists in medical image interpretation. Texture analysis of computed tomography (CT) scans is an important preliminary stage in computerized detection and classification systems for lung cancer. Among the different types of image feature analysis, Haralick texture features with a variety of statistical measures have been widely used in image texture description. The extraction of texture feature values is essential for a CAD system, especially in the classification of normal and abnormal tissue on cross-sectional CT images. This paper compares experimental results using texture extraction and different machine learning methods for classifying normal and abnormal tissues in lung CT images. The machine learning methods involved in this assessment are Artificial Immune Recognition System (AIRS), Naive Bayes, Decision Tree (J48), and Backpropagation Neural Network. AIRS is found to provide high accuracy (99.2%) and sensitivity (98.0%) in the assessment. For experiment and testing purposes, publicly available datasets in the Reference Image Database to Evaluate Therapy Response (RIDER) are used as study cases.
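
    A minimal sketch of Haralick-style texture extraction via a gray-level co-occurrence matrix, using scikit-image (graycomatrix/graycoprops, available under these names since scikit-image 0.19) on a random stand-in patch rather than a RIDER CT slice.

      import numpy as np
      from skimage.feature import graycomatrix, graycoprops

      # Toy 8-level patch standing in for a CT region of interest; a real CAD
      # pipeline would quantize and crop ROIs from the lung CT slice.
      rng = np.random.default_rng(0)
      patch = rng.integers(0, 8, size=(32, 32), dtype=np.uint8)

      glcm = graycomatrix(patch, distances=[1], angles=[0, np.pi / 2],
                          levels=8, symmetric=True, normed=True)
      features = {prop: graycoprops(glcm, prop).mean()
                  for prop in ("contrast", "homogeneity", "energy", "correlation")}
      print(features)   # feature vector a classifier (e.g., Naive Bayes) would consume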

  5. Lessons from a doctoral thesis.

    PubMed

    Peiris, A N; Mueller, R A; Sheridan, D P

    1990-01-01

    The production of a doctoral thesis is a time-consuming affair that until recently was done in conjunction with professional publishing services. Advances in computer technology have made many sophisticated desktop publishing techniques available to the microcomputer user. We describe the computer method used, the problems encountered, and the solutions improvised in the production of a doctoral thesis by computer. The Apple Macintosh was selected for its ease of use and intrinsic graphics capabilities. A scanner was used to incorporate text from published papers into a word processing program. The body of the text was updated and supplemented with new sections. Scanned graphics from the published papers were less suitable for publication, and the original data were replotted and modified with a graphics-drawing program. Graphics were imported and incorporated in the text. Final hard copy was produced by a laser printer and bound with both conventional and rapid new binding techniques. Microcomputer-based desktop processing methods provide a rapid and cost-effective means of communicating the written word. We anticipate that this evolving technology will have increased use by physicians in both the private and academic sectors.

  6. A practical approximation algorithm for solving massive instances of hybridization number for binary and nonbinary trees.

    PubMed

    van Iersel, Leo; Kelk, Steven; Lekić, Nela; Scornavacca, Celine

    2014-05-05

    Reticulate events play an important role in determining evolutionary relationships. The problem of computing the minimum number of such events to explain discordance between two phylogenetic trees is a hard computational problem. Even for binary trees, exact solvers struggle to solve instances with reticulation number larger than 40-50. Here we present CycleKiller and NonbinaryCycleKiller, the first methods to produce solutions verifiably close to optimality for instances with hundreds or even thousands of reticulations. Using simulations, we demonstrate that these algorithms run quickly for large and difficult instances, producing solutions that are very close to optimality. As a spin-off from our simulations we also present TerminusEst, which is the fastest exact method currently available that can handle nonbinary trees: this is used to measure the accuracy of the NonbinaryCycleKiller algorithm. All three methods are based on extensions of previous theoretical work (SIDMA 26(4):1635-1656, TCBB 10(1):18-25, SIDMA 28(1):49-66) and are publicly available. We also apply our methods to real data.

  7. Routine history as compared to audio computer-assisted self-interview for prenatal care history taking.

    PubMed

    Mears, Molly; Coonrod, Dean V; Bay, R Curtis; Mills, Terry E; Watkins, Michelle C

    2005-09-01

    To compare endorsement rates obtained with audio computer-assisted self-interview versus routine prenatal history. A cross-sectional study compared items captured with the routine history to those captured with a computer interview (a computer screen displaying, and computer audio reading, the questions, with responses entered by touch screen). The subjects were women (n=174) presenting to a public hospital clinic for prenatal care. The prevalence of positive responses using the computer interview was significantly greater (p < 0.01) than with the routine history for induced abortion (16.8% versus 4.0%), lifetime smoking (12.8% versus 5.2%), intimate partner violence (10.0% versus 2.4%), ectopic pregnancy (5.2% versus 1.1%) and family history of mental retardation (6.7% versus 0.6%). Significant differences were not found for history of spontaneous abortion, hypertension, epilepsy, thyroid disease, smoking during pregnancy, gynecologic surgery, abnormal Pap test, neural tube defect or cystic fibrosis family history. However, in all cases, prevalence was equal or greater with the computer interview. Women were more likely to report sensitive and high-risk behavior, such as smoking history, intimate partner violence and elective abortion, with the computer interview. The computer interview elicited equal or increased patient reporting of positive responses and may therefore be an accurate method of obtaining an initial history.

  8. A Study of the Correlation between Computer Games and Adolescent Behavioral Problems

    PubMed Central

    Shokouhi-Moqhaddam, Solmaz; Khezri-Moghadam, Noshiravan; Javanmard, Zeinab; Sarmadi-Ansar, Hassan; Aminaee, Mehran; Shokouhi-Moqhaddam, Majid; Zivari-Rahman, Mahmoud

    2013-01-01

    Background Today, owing to developing communication technologies, computer games and other audio-visual media are social phenomena that are very attractive to, and have a great effect on, children and adolescents. The increasing popularity of these games among children and adolescents has created public uncertainty about their plausible harmful effects. This study aimed to investigate the correlation between computer games and behavioral problems in male guidance school students. Methods This was a descriptive-correlational study of 384 randomly chosen male guidance school students. They were asked to answer the researchers' questionnaire about computer games and Achenbach's Youth Self-Report (YSR). Findings The results of this study indicated a direct, statistically significant correlation (at the 95% confidence level) between the amount of game playing among adolescents and anxiety/depression, withdrawal/depression, rule-breaking behaviors, aggression, and social problems. However, there was no statistically significant correlation between the amount of computer game usage and physical complaints, thinking problems, or attention problems. In addition, there was a significant correlation between the students' place of residence and their parents' occupations, and computer game use. Conclusion Computer games lead to anxiety, depression, withdrawal, rule-breaking behavior, aggression, and social problems in adolescents. PMID:24494157

  9. The Influence of Using Momentum and Impulse Computer Simulation to Senior High School Students’ Concept Mastery

    NASA Astrophysics Data System (ADS)

    Kaniawati, I.; Samsudin, A.; Hasopa, Y.; Sutrisno, A. D.; Suhendi, E.

    2016-08-01

    This research is motivated by students' lack of mastery of abstract physics concepts. The study therefore aims to improve senior high school students' mastery of momentum and impulse concepts through the use of computer simulation. The research method employed was a pre-experimental, one-group pretest-posttest design. A total of 36 grade-11 science students at a public senior high school in Bandung formed the sample. The instruments used to determine the increase in students' concept mastery were a multiple-choice pretest and posttest. After computer simulations were used in physics learning, students' mastery of momentum and impulse concepts increased, as indicated by a normalized gain of 0.64, in the medium category.
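
    The normalized gain reported above is, in the usual Hake convention, the ratio of the actual pretest-to-posttest improvement to the maximum possible improvement. A minimal sketch of the computation, with hypothetical score arrays rather than the study's data:

      import numpy as np

      def normalized_gain(pre, post, max_score=100.0):
          # Hake's class-average normalized gain: <g> = (post% - pre%) / (100% - pre%).
          pre_pct = 100.0 * np.mean(pre) / max_score
          post_pct = 100.0 * np.mean(post) / max_score
          return (post_pct - pre_pct) / (100.0 - pre_pct)

      # Hypothetical pretest/posttest scores for illustration only.
      pre = np.array([30, 45, 25, 50])
      post = np.array([70, 85, 60, 90])
      print(round(normalized_gain(pre, post), 2))  # 0.62, in the "medium" band (0.3-0.7)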

  10. Current algorithmic solutions for peptide-based proteomics data generation and identification.

    PubMed

    Hoopmann, Michael R; Moritz, Robert L

    2013-02-01

    Peptide-based proteomic data sets are ever increasing in size and complexity. These data sets provide computational challenges when attempting to quickly analyze spectra and obtain correct protein identifications. Database search and de novo algorithms must consider high-resolution MS/MS spectra and alternative fragmentation methods. Protein inference is a tricky problem when analyzing large data sets of degenerate peptide identifications. Combining multiple algorithms for improved peptide identification puts significant strain on computational systems when investigating large data sets. This review highlights some of the recent developments in peptide and protein identification algorithms for analyzing shotgun mass spectrometry data when encountering the aforementioned hurdles. Also explored are the roles that analytical pipelines, public spectral libraries, and cloud computing play in the evolution of peptide-based proteomics.

  11. Science | Argonne National Laboratory

    Science.gov Websites

    Research to understand, predict, and ultimately control matter and energy at the electronic, atomic, and molecular levels.

  12. Argonne Research Library | Argonne National Laboratory

    Science.gov Websites


  13. [Datanet 1 and the convergence of the computer and telecommunications].

    PubMed

    de Wit, Onno

    2008-01-01

    This article describes the efforts of the Dutch national company for telecommunication, PTT, in introducing and developing a public network for data communication in the Netherlands in the last decades of the twentieth century. As early as the 1960s, private companies started to connect their local computers. As a result, small private computer networks started to emerge. As the state company offering general access to public services in telephony, the PTT strove to develop a public data network, accessible to every user and telephone subscriber. This ambition was realized with Datanet 1, the public data network which was officially opened in 1982. In the years that followed, Datanet became the dominant network for data transmission, despite competing efforts by private companies and computer manufacturers. The large-scale application of Datanet in public municipal administration serves as a case study for the development of data communication in practice, which shows that there was a gradual migration from X.25 to TCP/IP protocols. The article concludes by stating that the introduction and development of data transmission transformed the role of the PTT in Dutch society, brought new working practices, new services and new responsibilities, and resulted in a whole new phase in the history of the computer.

  14. Computers and Children: Problems and Possibilities.

    ERIC Educational Resources Information Center

    Siegfried, Pat

    1983-01-01

    Discusses the use of computers by children, highlighting a definition of computer literacy, computer education in schools, computer software, microcomputers, programming languages, and public library involvement. Seven references and a 40-item bibliography are included. (EJS)

  15. A hybrid cloud read aligner based on MinHash and kmer voting that preserves privacy

    NASA Astrophysics Data System (ADS)

    Popic, Victoria; Batzoglou, Serafim

    2017-05-01

    Low-cost clouds can alleviate the compute and storage burden of the genome sequencing data explosion. However, moving personal genome data analysis to the cloud can raise serious privacy concerns. Here, we devise a method named Balaur, a privacy preserving read mapper for hybrid clouds based on locality sensitive hashing and kmer voting. Balaur can securely outsource a substantial fraction of the computation to the public cloud, while being highly competitive in accuracy and speed with non-private state-of-the-art read aligners on short read data. We also show that the method is significantly faster than the state of the art in long read mapping. Therefore, Balaur can enable institutions handling massive genomic data sets to shift part of their analysis to the cloud without sacrificing accuracy or exposing sensitive information to an untrusted third party.
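
    Balaur's source is not reproduced here, but the MinHash idea it builds on is easy to illustrate: a signature of per-hash minima over a read's kmer set lets similar reads collide with high probability, which is the essence of locality sensitive hashing. A minimal sketch with hypothetical parameters, using only the standard library:

      import hashlib

      def kmers(seq, k=16):
          # All overlapping kmers of a read.
          return {seq[i:i+k] for i in range(len(seq) - k + 1)}

      def minhash_signature(seq, num_hashes=8, k=16):
          # For each of num_hashes salted hash functions, keep the minimum hash
          # value over the read's kmers; similar kmer sets give similar signatures.
          sig = []
          for salt in range(num_hashes):
              sig.append(min(
                  int.from_bytes(hashlib.blake2b(f"{salt}:{km}".encode(),
                                                 digest_size=8).digest(), "big")
                  for km in kmers(seq, k)))
          return tuple(sig)

      a = minhash_signature("ACGTACGTACGTACGTACGTACGTACGT")
      b = minhash_signature("ACGTACGTACGTACGTACGTACGTACGA")  # one substitution
      # Fraction of matching components estimates the Jaccard similarity of the kmer sets.
      print(sum(x == y for x, y in zip(a, b)) / len(a))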

  16. A hybrid cloud read aligner based on MinHash and kmer voting that preserves privacy

    PubMed Central

    Popic, Victoria; Batzoglou, Serafim

    2017-01-01

    Low-cost clouds can alleviate the compute and storage burden of the genome sequencing data explosion. However, moving personal genome data analysis to the cloud can raise serious privacy concerns. Here, we devise a method named Balaur, a privacy preserving read mapper for hybrid clouds based on locality sensitive hashing and kmer voting. Balaur can securely outsource a substantial fraction of the computation to the public cloud, while being highly competitive in accuracy and speed with non-private state-of-the-art read aligners on short read data. We also show that the method is significantly faster than the state of the art in long read mapping. Therefore, Balaur can enable institutions handling massive genomic data sets to shift part of their analysis to the cloud without sacrificing accuracy or exposing sensitive information to an untrusted third party. PMID:28508884

  17. Rigid-Docking Approaches to Explore Protein-Protein Interaction Space.

    PubMed

    Matsuzaki, Yuri; Uchikoga, Nobuyuki; Ohue, Masahito; Akiyama, Yutaka

    Protein-protein interactions play core roles in living cells, especially in regulatory systems. As information on proteins has rapidly accumulated in publicly available databases, much effort has been made to obtain a better picture of protein-protein interaction networks using protein tertiary structure data. Predicting relevant interacting partners from their tertiary structure is a challenging task, and computer science methods have the potential to assist with this. Protein-protein rigid docking has been utilized by several projects. Docking-based approaches have two advantages: they can suggest binding poses of predicted binding partners, which helps in understanding the interaction mechanisms, and comparing the docking results of binders and non-binders can lead to an understanding of the specificity of protein-protein interactions from a structural viewpoint. In this review we focus on explaining current computational prediction methods for predicting pairwise direct protein-protein interactions that form protein complexes.

  18. Robust Arm and Hand Tracking by Unsupervised Context Learning

    PubMed Central

    Spruyt, Vincent; Ledda, Alessandro; Philips, Wilfried

    2014-01-01

    Hand tracking in video is an increasingly popular research field due to the rise of novel human-computer interaction methods. However, robust and real-time hand tracking in unconstrained environments remains a challenging task due to the high number of degrees of freedom and the non-rigid character of the human hand. In this paper, we propose an unsupervised method to automatically learn the context in which a hand is embedded. This context includes the arm and any other object that coherently moves along with the hand. We introduce two novel methods to incorporate this context information into a probabilistic tracking framework, and introduce a simple yet effective solution to estimate the position of the arm. Finally, we show that our method greatly increases robustness against occlusion and cluttered background, without degrading tracking performance if no contextual information is available. The proposed real-time algorithm is shown to outperform the current state-of-the-art by evaluating it on three publicly available video datasets. Furthermore, a novel dataset is created and made publicly available for the research community. PMID:25004155

  19. 75 FR 70899 - Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-19

    ... submit to the Office of Management and Budget (OMB) for clearance the following proposal for collection... Annual Burden Hours: 2,952. Public Computer Center Reports (Quarterly and Annually) Number of Respondents... specific to Infrastructure and Comprehensive Community Infrastructure, Public Computer Center, and...

  20. 77 FR 14340 - Proposed Information Collection; Comment Request; Broadband Technology Opportunities Program...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-09

    ... (PPRs) to capture quarterly and annual reports for each project type (Infrastructure, Public Computer... Information Collection; Comment Request; Broadband Technology Opportunities Program (BTOP) Quarterly and..., which included competitive grants to expand public computer center capacity and innovative programs to...

  1. Systems Epidemiology: What’s in a Name?

    PubMed Central

    Dammann, O.; Gray, P.; Gressens, P.; Wolkenhauer, O.; Leviton, A.

    2014-01-01

    Systems biology is an interdisciplinary effort to integrate molecular, cellular, tissue, organ, and organism levels of function into computational models that facilitate the identification of general principles. Systems medicine adds a disease focus. Systems epidemiology adds yet another level consisting of antecedents that might contribute to the disease process in populations. In etiologic and prevention research, systems-type thinking about multiple levels of causation will allow epidemiologists to identify contributors to disease at multiple levels as well as their interactions. In public health, systems epidemiology will contribute to the improvement of syndromic surveillance methods. We encourage the creation of computational simulation models that integrate information about disease etiology, pathogenetic data, and the expertise of investigators from different disciplines. PMID:25598870

  2. Audit of the annual reports of directors of public health: production methods.

    PubMed

    Sidhu, K S

    1992-07-01

    An audit of the production methods used for Directors of Public Health (DsPH) annual reports on the health of their local populations. Postal questionnaire survey. 23 Departments of Public Health in the West Midlands Region. Costs and problems relating to the different production techniques used. The majority of DsPH favoured reports with figures and graphs. This led to most DsPH using in-house desktop publishing or employing external graphic designers. Those using the former technique had more problems related to computers and felt they spent too much medical time working on the document. However, they also valued the relatively low cost, editorial freedom and the ability to correct mistakes easily. Departments which employed external graphic designers generally paid more, but appreciated the extra time made available by delegating the work. They also felt that the expertise was valuable in document design. However, inaccuracies were cited as being more difficult to correct. Perhaps the best way of producing an annual report is to amalgamate the two commonest production techniques (i.e. external graphic design and in-house desktop publishing).

  3. Programmers, professors, and parasites: credit and co-authorship in computer science.

    PubMed

    Solomon, Justin

    2009-12-01

    This article presents an in-depth analysis of past and present publishing practices in academic computer science to suggest the establishment of a more consistent publishing standard. Historical precedent for academic publishing in computer science is established through the study of anecdotes as well as statistics collected from databases of published computer science papers. After examining these facts alongside information about analogous publishing situations and standards in other scientific fields, the article concludes with a list of basic principles that should be adopted in any computer science publishing standard. These principles would contribute to the reliability and scientific nature of academic publications in computer science and would allow for more straightforward discourse in future publications.

  4. Synchronization of random bit generators based on coupled chaotic lasers and application to cryptography.

    PubMed

    Kanter, Ido; Butkovski, Maria; Peleg, Yitzhak; Zigzag, Meital; Aviad, Yaara; Reidler, Igor; Rosenbluh, Michael; Kinzel, Wolfgang

    2010-08-16

    Random bit generators (RBGs) constitute an important tool in cryptography, stochastic simulations and secure communications. The latter in particular has some difficult requirements: high generation rates of unpredictable bit strings and secure key-exchange protocols over public channels. Deterministic algorithms generate pseudo-random number sequences at high rates; however, their unpredictability is limited by the very nature of their deterministic origin. Recently, physical RBGs based on chaotic semiconductor lasers were shown to exceed Gbit/s rates. Whether secure synchronization of two high-rate physical RBGs is possible remains an open question. Here we propose a method whereby two fast RBGs based on mutually coupled chaotic lasers are synchronized. Using information-theoretic analysis we demonstrate security against a powerful computational eavesdropper, capable of noiseless amplification, where all parameters are publicly known. The method is also extended to secure synchronization of a small network of three RBGs.

  5. PREFACE: 9th World Congress on Computational Mechanics and 4th Asian Pacific Congress on Computational Mechanics

    NASA Astrophysics Data System (ADS)

    Khalili, N.; Valliappan, S.; Li, Q.; Russell, A.

    2010-07-01

    The use of mathematical models of natural phenomena has underpinned science and engineering for centuries, but until the advent of modern computers and computational methods, the full utility of most of these models remained outside the reach of the engineering communities. Since World War II, advances in computational methods have transformed the way engineering and science are undertaken throughout the world. Today, theories of mechanics of solids and fluids, electromagnetism, heat transfer, plasma physics, and other scientific disciplines are implemented through computational methods in engineering analysis, design, manufacturing, and in studying broad classes of physical phenomena. The discipline concerned with the application of computational methods is now a key area of research, education, and application throughout the world. In the early 1980s, the International Association for Computational Mechanics (IACM) was founded to promote activities related to computational mechanics and has made impressive progress. The most important scientific event of IACM is the World Congress on Computational Mechanics. The first was held in Austin (USA) in 1986, followed by Stuttgart (Germany) in 1990, Chiba (Japan) in 1994, Buenos Aires (Argentina) in 1998, Vienna (Austria) in 2002, Beijing (China) in 2004, Los Angeles (USA) in 2006, and Venice (Italy) in 2008. The 9th World Congress on Computational Mechanics is held in conjunction with the 4th Asian Pacific Congress on Computational Mechanics under the auspices of the Australian Association for Computational Mechanics (AACM), the Asian Pacific Association for Computational Mechanics (APACM) and the International Association for Computational Mechanics (IACM). The 1st Asian Pacific Congress was held in Sydney (Australia) in 2001, then in Beijing (China) in 2004 and Kyoto (Japan) in 2007. The WCCM/APCOM 2010 publications consist of a printed book of abstracts given to delegates, along with 247 full-length peer-reviewed papers published with free access online in IOP Conference Series: Materials Science and Engineering. The editors acknowledge the help of the paper reviewers in maintaining a high standard of assessment and the co-operation of the authors in complying with the requirements of the editors and the reviewers. We would also like to take this opportunity to thank the members of the Local Organising Committee and the International Scientific Committee for helping make WCCM/APCOM 2010 a successful event. We also thank The University of New South Wales, The University of Newcastle, the Centre for Infrastructure Engineering and Safety (CIES), IACM, APACM and AACM for their financial support, along with the United States Association for Computational Mechanics for the Travel Awards made available. N. Khalili S. Valliappan Q. Li A. Russell 19 July 2010 Sydney, Australia

  6. The Mayo Clinic Author Catalog: A Living Repository of Medical Knowledge

    PubMed Central

    Key, Jack D.; Sholtz, Katherine J.

    1973-01-01

    Since 1907 records have been kept of publications by staff members of the Mayo Clinic, and this information has been invaluable. The Author Catalog has proved itself such a useful tool for the Mayo Clinic that other libraries, large and small, may wish to consider adopting such a service. The Mayo medical complex is a large institution with more than 500 staff and faculty members engaged in the publication of clinical, educational, and research findings. The great amount of cross-disciplinary cooperation and interdepartmental research makes essential an up-to-date record of what is going on. The Mayo Clinic Library developed a comprehensive computerized method for identifying research and for identifying and indexing publications of Mayo staff members. At the end of 1971 more than 25,000 citations had been stored on computer tape. PMID:4122094

  7. YNOGK: A New Public Code for Calculating Null Geodesics in the Kerr Spacetime

    NASA Astrophysics Data System (ADS)

    Yang, Xiaolin; Wang, Jiancheng

    2013-07-01

    Following the work of Dexter & Agol, we present a new public code for the fast calculation of null geodesics in the Kerr spacetime. Using Weierstrass's and Jacobi's elliptic functions, we express all coordinates and affine parameters as analytical and numerical functions of a parameter p, which is an integral value along the geodesic. This is the main difference between our code and previous similar ones. The advantage of this treatment is that the information about the turning points does not need to be specified in advance by the user, and many applications such as imaging, the calculation of line profiles, and the observer-emitter problem, become root-finding problems. All elliptic integrations are computed by Carlson's elliptic integral method as in Dexter & Agol, which guarantees the fast computational speed of our code. The formulae to compute the constants of motion given by Cunningham & Bardeen have been extended, which allow one to readily handle situations in which the emitter or the observer has an arbitrary distance from, and motion state with respect to, the central compact object. The validation of the code has been extensively tested through applications to toy problems from the literature. The source FORTRAN code is freely available for download on our Web site http://www1.ynao.ac.cn/~yangxl/yxl.html.
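
    The Carlson method mentioned above reduces the Legendre elliptic integrals to a few symmetric standard forms. As a small illustration (not taken from the YNOGK Fortran source), SciPy exposes Carlson's R_F, from which the complete elliptic integral of the first kind follows directly; this assumes SciPy >= 1.8 for scipy.special.elliprf:

      # K(m) via Carlson's symmetric form R_F, the building block of
      # Carlson-style elliptic integration: K(m) = R_F(0, 1 - m, 1).
      from scipy.special import elliprf, ellipk

      m = 0.7  # parameter m = k^2
      K_carlson = elliprf(0.0, 1.0 - m, 1.0)
      print(K_carlson, ellipk(m))  # the two values agree to machine precision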

  8. VizieR Online Data Catalog: Historical and HST Astrometry of Sirius A,B (Bond+, 2017)

    NASA Astrophysics Data System (ADS)

    Bond, H. E.; Schaefer, G. H.; Gilliland, R. L.; Holberg, J. B.; Mason, B. D.; Lindenblad, I. W.; Seitz-McLeese, M.; Arnett, W. D.; Demarque, P.; Spada, F.; Young, P. A.; Barstow, M. A.; Burleigh, M. R.; Gudehus, D.

    2017-05-01

    We have assembled a compilation of published historical measurements of the position angle (PA) and the angular separation of Sirius B relative to Sirius A. Our tabulation is based on a critical review of measures contained in the Washington Double Star Catalog maintained at the USNO and on our additional literature searches. Notes included in the tabulation give extensive commentary on the historical observations. Many early publications provided measures averaged over multiple nights or even an entire observing season for the purpose of reducing computational labor in subsequent analyses. With modern computers, there is no need for such averaging, so we opted to present the individual measures whenever available. However, if an observer reported more than one measurement on a given night, we computed the mean position for that night. If the original publication only reported a mean across several nights, we tabulated that mean as reported. The visual micrometer observations did not always include a contemporaneous measurement of both the PA and the separation. These omissions are listed as -99.0 in the table. The measurement uncertainties were assigned through our orbital fitting method described in the paper. Measurements that were rejected from the orbital solution are identified in the Notes column and are listed with uncertainties of 0. (3 data files).

  9. Implementation of audio computer-assisted interviewing software in HIV/AIDS research.

    PubMed

    Pluhar, Erika; McDonnell Holstad, Marcia; Yeager, Katherine A; Denzmore-Nwagbara, Pamela; Corkran, Carol; Fielder, Bridget; McCarty, Frances; Diiorio, Colleen

    2007-01-01

    Computer-assisted interviewing (CAI) has begun to play a more prominent role in HIV/AIDS prevention research. Despite the increased popularity of CAI, particularly audio computer-assisted self-interviewing (ACASI), some research teams are still reluctant to implement ACASI technology because of lack of familiarity with the practical issues related to using these software packages. The purpose of this report is to describe the implementation of one particular ACASI software package, the Questionnaire Development System (QDS; Nova Research Company, Bethesda, MD), in several nursing and HIV/AIDS prevention research settings. The authors present acceptability and satisfaction data from two large-scale public health studies in which they have used QDS with diverse populations. They also address issues related to developing and programming a questionnaire; discuss practical strategies related to planning for and implementing ACASI in the field, including selecting equipment, training staff, and collecting and transferring data; and summarize advantages and disadvantages of computer-assisted research methods.

  10. Multimedia for occupational safety and health training: a pilot study examining a multimedia learning theory.

    PubMed

    Wallen, Erik S; Mulloy, Karen B

    2006-10-01

    Occupational diseases are a significant problem affecting public health. Safety training is an important method of preventing occupational illness. Training is increasingly being delivered by computer, although theories of learning from computer-based multimedia have been tested almost entirely on college students. This study was designed to determine whether these theories might also be applied to safety training applications for working adults. Participants viewed computer-based multimedia respirator-use training with either concurrent narration or narration prior to the animation, or unrelated safety training. Participants then took a five-item transfer test which measured their ability to use their knowledge in new and creative ways. Participants who viewed either version of the computer-based multimedia training did significantly better than the control group on the transfer test. The results of this pilot study suggest that design guidelines developed for younger learners may be effective for training workers in occupational safety and health, although more investigation is needed.

  11. RECOLA2: REcursive Computation of One-Loop Amplitudes 2

    NASA Astrophysics Data System (ADS)

    Denner, Ansgar; Lang, Jean-Nicolas; Uccirati, Sandro

    2018-03-01

    We present the Fortran95 program RECOLA2 for the perturbative computation of next-to-leading-order transition amplitudes in the Standard Model of particle physics and extended Higgs sectors. New theories are implemented via model files in the 't Hooft-Feynman gauge in the conventional formulation of quantum field theory and in the Background-Field method. The present version includes model files for the Two-Higgs-Doublet Model and the Higgs-Singlet Extension of the Standard Model. We support standard renormalization schemes for the Standard Model as well as many commonly used renormalization schemes in extended Higgs sectors. Within these models the computation of next-to-leading-order polarized amplitudes and squared amplitudes, optionally summed over spin and colour, is fully automated for any process. RECOLA2 allows the computation of colour- and spin-correlated leading-order squared amplitudes that are needed in the dipole subtraction formalism. RECOLA2 is publicly available for download at http://recola.hepforge.org.

  12. Public Domain Microcomputer Software for Forestry.

    ERIC Educational Resources Information Center

    Martin, Les

    A project was conducted to develop a computer forestry/forest products bibliography applicable to high school and community college vocational/technical programs. The project director contacted curriculum clearinghouses, computer companies, and high school and community college instructors in order to obtain listings of public domain programs for…

  13. Funding Public Computing Centers: Balancing Broadband Availability and Expected Demand

    ERIC Educational Resources Information Center

    Jayakar, Krishna; Park, Eun-A

    2012-01-01

    The National Broadband Plan (NBP) recently announced by the Federal Communication Commission visualizes a significantly enhanced commitment to public computing centers (PCCs) as an element of the Commission's plans for promoting broadband availability. In parallel, the National Telecommunications and Information Administration (NTIA) has…

  14. 21 CFR 20.117 - New drug information.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... covered by such applications. (b) Other computer printouts containing IND and NDA information are... computer printouts are available for public inspection in the Food and Drug Administration's Freedom of Information Public Room: (1) A numerical listing of all new drug applications and abbreviated new drug...

  15. Protecting Public-Access Computers in Libraries.

    ERIC Educational Resources Information Center

    King, Monica

    1999-01-01

    Describes one public library's development of a computer-security plan, along with helpful products used. Discussion includes Internet policy, physical protection of hardware, basic protection of the operating system and software on the network, browser dilemmas and maintenance, creating clear intuitive interface, and administering fair use and…

  16. A fast and objective multidimensional kernel density estimation method: fastKDE

    DOE PAGES

    O'Brien, Travis A.; Kashinath, Karthik; Cavanaugh, Nicholas R.; ...

    2016-03-07

    Numerous facets of scientific research implicitly or explicitly call for the estimation of probability densities. Histograms and kernel density estimates (KDEs) are two commonly used techniques for estimating such information, with the KDE generally providing a higher fidelity representation of the probability density function (PDF). Both methods require specification of either a bin width or a kernel bandwidth. While techniques exist for choosing the kernel bandwidth optimally and objectively, they are computationally intensive, since they require repeated calculation of the KDE. A solution for objectively and optimally choosing both the kernel shape and width has recently been developed by Bernacchia and Pigolotti (2011). While this solution theoretically applies to multidimensional KDEs, it has not been clear how to practically do so. A method for practically extending the Bernacchia-Pigolotti KDE to multidimensions is introduced. This multidimensional extension is combined with a recently-developed computational improvement to their method that makes it computationally efficient: a 2D KDE on 10^5 samples takes only 1 s on a modern workstation. This fast and objective KDE method, called the fastKDE method, retains the excellent statistical convergence properties that have been demonstrated for univariate samples. The fastKDE method exhibits statistical accuracy that is comparable to state-of-the-science KDE methods publicly available in R, and it produces kernel density estimates several orders of magnitude faster. The fastKDE method does an excellent job of encoding covariance information for bivariate samples. This property allows for direct calculation of conditional PDFs with fastKDE. It is demonstrated how this capability might be leveraged for detecting non-trivial relationships between quantities in physical systems, such as transitional behavior.
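
    fastKDE itself is distributed as a public package, and its usage is not reproduced here; instead, the sketch below shows the standard SciPy Gaussian KDE that represents the kind of baseline fastKDE is benchmarked against, estimating a bivariate PDF on a grid from synthetic data (all parameters hypothetical):

      import numpy as np
      from scipy.stats import gaussian_kde

      # Synthetic correlated bivariate sample, standing in for real data.
      rng = np.random.default_rng(1)
      xy = rng.multivariate_normal([0, 0], [[1.0, 0.7], [0.7, 1.0]], size=10_000).T

      # Classic KDE baseline: Gaussian kernel, bandwidth chosen by Scott's rule.
      kde = gaussian_kde(xy)

      # Evaluate the estimated PDF on a 100 x 100 grid.
      grid = np.mgrid[-3:3:100j, -3:3:100j]
      pdf = kde(grid.reshape(2, -1)).reshape(100, 100)
      print(pdf.max())  # peak density near the origin, as expected for this sample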

  17. Efficient experimental design for uncertainty reduction in gene regulatory networks.

    PubMed

    Dehghannasiri, Roozbeh; Yoon, Byung-Jun; Dougherty, Edward R

    2015-01-01

    An accurate understanding of interactions among genes plays a major role in developing therapeutic intervention methods. Gene regulatory networks often contain a significant amount of uncertainty. The process of prioritizing biological experiments to reduce the uncertainty of gene regulatory networks is called experimental design. Under such a strategy, the experiments with high priority are suggested to be conducted first. The authors have already proposed an optimal experimental design method based upon the objective for modeling gene regulatory networks, such as deriving therapeutic interventions. The experimental design method utilizes the concept of mean objective cost of uncertainty (MOCU). MOCU quantifies the expected increase of cost resulting from uncertainty. The optimal experiment to be conducted first is the one which leads to the minimum expected remaining MOCU subsequent to the experiment. In the process, one must find the optimal intervention for every gene regulatory network compatible with the prior knowledge, which can be prohibitively expensive when the size of the network is large. In this paper, we propose a computationally efficient experimental design method. This method incorporates a network reduction scheme by introducing a novel cost function that takes into account the disruption in the ranking of potential experiments. We then estimate the approximate expected remaining MOCU at a lower computational cost using the reduced networks. Simulation results based on synthetic and real gene regulatory networks show that the proposed approximate method has close performance to that of the optimal method but at lower computational cost. The proposed approximate method also outperforms the random selection policy significantly. A MATLAB software implementing the proposed experimental design method is available at http://gsp.tamu.edu/Publications/supplementary/roozbeh15a/.
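
    The MOCU concept above can be made concrete with a toy uncertainty class. In the sketch below (illustrative numbers only, not the authors' networks or software), each uncertain model is reduced to a row of intervention costs; the robust intervention minimizes expected cost, and MOCU is the expected excess of that robust choice over each model-specific optimum:

      import numpy as np

      # cost[i, j] = cost of applying intervention j when model i is the true network.
      # Rows: three models in the uncertainty class; columns: four interventions.
      cost = np.array([[3.0, 1.0, 4.0, 2.0],
                       [2.0, 3.0, 1.0, 2.5],
                       [4.0, 2.0, 2.0, 1.0]])
      prior = np.array([0.5, 0.3, 0.2])  # belief over which model is true

      expected_cost = prior @ cost            # expected cost of each intervention
      robust = int(np.argmin(expected_cost))  # intervention that is best on average
      best_per_model = cost.min(axis=1)       # cost of each model-specific optimum

      # MOCU: expected cost increase from acting robustly instead of knowing the model.
      mocu = float(prior @ (cost[:, robust] - best_per_model))
      print(robust, round(mocu, 3))  # intervention 1, MOCU = 0.8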

  18. Efficient experimental design for uncertainty reduction in gene regulatory networks

    PubMed Central

    2015-01-01

    Background An accurate understanding of interactions among genes plays a major role in developing therapeutic intervention methods. Gene regulatory networks often contain a significant amount of uncertainty. The process of prioritizing biological experiments to reduce the uncertainty of gene regulatory networks is called experimental design. Under such a strategy, the experiments with high priority are suggested to be conducted first. Results The authors have already proposed an optimal experimental design method based upon the objective for modeling gene regulatory networks, such as deriving therapeutic interventions. The experimental design method utilizes the concept of mean objective cost of uncertainty (MOCU). MOCU quantifies the expected increase of cost resulting from uncertainty. The optimal experiment to be conducted first is the one which leads to the minimum expected remaining MOCU subsequent to the experiment. In the process, one must find the optimal intervention for every gene regulatory network compatible with the prior knowledge, which can be prohibitively expensive when the size of the network is large. In this paper, we propose a computationally efficient experimental design method. This method incorporates a network reduction scheme by introducing a novel cost function that takes into account the disruption in the ranking of potential experiments. We then estimate the approximate expected remaining MOCU at a lower computational cost using the reduced networks. Conclusions Simulation results based on synthetic and real gene regulatory networks show that the proposed approximate method has close performance to that of the optimal method but at lower computational cost. The proposed approximate method also outperforms the random selection policy significantly. A MATLAB software implementing the proposed experimental design method is available at http://gsp.tamu.edu/Publications/supplementary/roozbeh15a/. PMID:26423515

  19. Breast tumor malignancy modelling using evolutionary neural logic networks.

    PubMed

    Tsakonas, Athanasios; Dounias, Georgios; Panagi, Georgia; Panourgias, Evangelia

    2006-01-01

    The present work proposes a computer-assisted methodology for effective modelling of the diagnostic decision for breast tumor malignancy. The suggested approach is based on innovative hybrid computational intelligence algorithms applied to related cytological data contained in past medical records. The experimental data used in this study were gathered in the early 1990s at the University of Wisconsin, based on post-diagnostic cytological observations performed by expert medical staff. Data were encoded in a computer database and various alternative modelling techniques were applied to them in an attempt to form diagnostic models. Previous methods included standard optimisation techniques as well as artificial intelligence approaches, and a variety of related publications exists in the modern literature on the subject. In this report, a hybrid computational intelligence approach is suggested, which effectively combines modern mathematical logic principles, neural computation and genetic programming. The approach proves promising both in terms of diagnostic accuracy and generalization capabilities, and in terms of comprehensibility and practical importance for the related medical staff.

  20. Trends in Social Science: The Impact of Computational and Simulative Models

    NASA Astrophysics Data System (ADS)

    Conte, Rosaria; Paolucci, Mario; Cecconi, Federico

    This paper discusses current progress in the computational social sciences. Specifically, it examines the following questions: Are the computational social sciences exhibiting positive or negative developments? What are the roles of agent-based models and simulation (ABM), network analysis, and other "computational" methods within this dynamic? (Conte, The necessity of intelligent agents in social simulation, Advances in Complex Systems, 3(01n04), 19-38, 2000; Conte 2010; Macy, Annual Review of Sociology, 143-166, 2002). Are there objective indicators of scientific growth that can be applied to different scientific areas, allowing for comparison among them? In this paper, some answers to these questions are presented and discussed. In particular, comparisons among different disciplines in the social and computational sciences are shown, taking into account their respective growth trends in the number of publication citations over the last few decades (culled from Google Scholar). After a short discussion of the methodology adopted, results of keyword-based queries are presented, unveiling some unexpected local impacts of simulation on the takeoff of traditionally poorly productive disciplines.

  1. Fast parallel molecular algorithms for DNA-based computation: solving the elliptic curve discrete logarithm problem over GF(2^n).

    PubMed

    Li, Kenli; Zou, Shuting; Xv, Jin

    2008-01-01

    Elliptic curve cryptographic algorithms convert input data to unrecognizable encryption and the unrecognizable data back again into its original decrypted form. The security of this form of encryption hinges on the enormous difficulty that is required to solve the elliptic curve discrete logarithm problem (ECDLP), especially over GF(2^n), n ∈ Z+. This paper describes an effective method to find solutions to the ECDLP by means of a molecular computer. We propose that this research accomplishment would represent a breakthrough for applied biological computation and this paper demonstrates that in principle this is possible. Three DNA-based algorithms: a parallel adder, a parallel multiplier, and a parallel inverse over GF(2^n) are described. The biological operation time of all of these algorithms is polynomial with respect to n. Considering this analysis, cryptography using a public key might be less secure. In this respect, a principal contribution of this paper is to provide enhanced evidence of the potential of molecular computing to tackle such ambitious computations.
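
    For orientation, the sketch below states the ECDLP on a toy curve over a small prime field (a stand-in for the binary fields GF(2^n) used in the paper) and solves it by brute force, the approach that becomes infeasible at cryptographic sizes and that the proposed molecular algorithms aim to attack in parallel:

      # Toy elliptic curve y^2 = x^3 + 2x + 2 over GF(97); a prime-field stand-in
      # for GF(2^n). ECDLP: given G and Q = k*G, recover k.
      P_MOD, A = 97, 2

      def ec_add(P, Q):
          # Affine point addition; None represents the point at infinity.
          if P is None:
              return Q
          if Q is None:
              return P
          (x1, y1), (x2, y2) = P, Q
          if x1 == x2 and (y1 + y2) % P_MOD == 0:
              return None
          if P == Q:
              lam = (3 * x1 * x1 + A) * pow(2 * y1, -1, P_MOD) % P_MOD
          else:
              lam = (y2 - y1) * pow(x2 - x1, -1, P_MOD) % P_MOD
          x3 = (lam * lam - x1 - x2) % P_MOD
          return (x3, (lam * (x1 - x3) - y1) % P_MOD)

      def ec_mul(k, P):
          # Double-and-add scalar multiplication.
          R = None
          while k:
              if k & 1:
                  R = ec_add(R, P)
              P, k = ec_add(P, P), k >> 1
          return R

      G = (0, 14)        # on the curve: 14^2 = 196 = 2 (mod 97) = 0^3 + 2*0 + 2
      Q = ec_mul(5, G)   # public point; the "secret" multiplier is 5

      # Brute-force ECDLP: step through successive multiples of G until Q is hit.
      R, k = G, 1
      while R != Q:
          R, k = ec_add(R, G), k + 1
      print(k)  # 5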

  2. Fast Parallel Molecular Algorithms for DNA-Based Computation: Solving the Elliptic Curve Discrete Logarithm Problem over GF(2^n)

    PubMed Central

    Li, Kenli; Zou, Shuting; Xv, Jin

    2008-01-01

    Elliptic curve cryptographic algorithms convert input data to unrecognizable encryption and the unrecognizable data back again into its original decrypted form. The security of this form of encryption hinges on the enormous difficulty that is required to solve the elliptic curve discrete logarithm problem (ECDLP), especially over GF(2^n), n ∈ Z+. This paper describes an effective method to find solutions to the ECDLP by means of a molecular computer. We propose that this research accomplishment would represent a breakthrough for applied biological computation and this paper demonstrates that in principle this is possible. Three DNA-based algorithms: a parallel adder, a parallel multiplier, and a parallel inverse over GF(2^n) are described. The biological operation time of all of these algorithms is polynomial with respect to n. Considering this analysis, cryptography using a public key might be less secure. In this respect, a principal contribution of this paper is to provide enhanced evidence of the potential of molecular computing to tackle such ambitious computations. PMID:18431451

  3. News | Computing

    Science.gov Websites


  4. An accurate and efficient method to predict the electronic excitation energies of BODIPY fluorescent dyes.

    PubMed

    Wang, Jia-Nan; Jin, Jun-Ling; Geng, Yun; Sun, Shi-Ling; Xu, Hong-Liang; Lu, Ying-Hua; Su, Zhong-Min

    2013-03-15

    Recently, the extreme learning machine neural network (ELMNN) has been proposed as a valid computing method for successfully predicting nonlinear optical properties (Wang et al., J. Comput. Chem. 2012, 33, 231). In this work, we first follow this line of work to predict electronic excitation energies using the ELMNN method. Significantly, the root mean square deviation between the predicted and experimental electronic excitation energies of 90 4,4-difluoro-4-bora-3a,4a-diaza-s-indacene (BODIPY) derivatives has been reduced to 0.13 eV. Second, four groups of molecular descriptors are considered when building the computing models. The results show that the quantum chemical descriptors have the closest intrinsic relation with the electronic excitation energy values. Finally, a user-friendly web server (EEEBPre: Prediction of electronic excitation energies for BODIPY dyes), freely accessible to the public at http://202.198.129.218, has been built for prediction. This web server returns predicted electronic excitation energy values of BODIPY dyes that are highly consistent with experimental values. We hope that this web server will be helpful to theoretical and experimental chemists in related research.
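
    The extreme learning machine behind the model above is a single-hidden-layer network whose input weights are random and fixed, so training reduces to one linear least-squares solve for the output weights. A minimal sketch, with random arrays standing in for the BODIPY descriptors and energies (all names hypothetical):

      import numpy as np

      rng = np.random.default_rng(0)

      # Stand-ins for molecular descriptors X and excitation energies y (eV).
      X = rng.normal(size=(90, 12))
      y = rng.normal(loc=2.5, scale=0.3, size=90)

      # ELM: random fixed input weights, nonlinear hidden layer, linear readout.
      n_hidden = 64
      W = rng.normal(size=(12, n_hidden))
      b = rng.normal(size=n_hidden)
      H = np.tanh(X @ W + b)  # hidden-layer activations

      # Training is a single least-squares solve (no backpropagation).
      beta, *_ = np.linalg.lstsq(H, y, rcond=None)

      rmse = np.sqrt(np.mean((H @ beta - y) ** 2))  # training RMSE in eV
      print(round(float(rmse), 3))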

  5. A novel Bayesian framework for discriminative feature extraction in Brain-Computer Interfaces.

    PubMed

    Suk, Heung-Il; Lee, Seong-Whan

    2013-02-01

    As there has been a paradigm shift in the learning load from a human subject to a computer, machine learning has been considered as a useful tool for Brain-Computer Interfaces (BCIs). In this paper, we propose a novel Bayesian framework for discriminative feature extraction for motor imagery classification in an EEG-based BCI in which the class-discriminative frequency bands and the corresponding spatial filters are optimized by means of the probabilistic and information-theoretic approaches. In our framework, the problem of simultaneous spatiospectral filter optimization is formulated as the estimation of an unknown posterior probability density function (pdf) that represents the probability that a single-trial EEG of predefined mental tasks can be discriminated in a state. In order to estimate the posterior pdf, we propose a particle-based approximation method by extending a factored-sampling technique with a diffusion process. An information-theoretic observation model is also devised to measure discriminative power of features between classes. From the viewpoint of classifier design, the proposed method naturally allows us to construct a spectrally weighted label decision rule by linearly combining the outputs from multiple classifiers. We demonstrate the feasibility and effectiveness of the proposed method by analyzing the results and its success on three public databases.

  6. Microcomputer Laboratory Design.

    DTIC Science & Technology

    1983-03-01

    Approved for public release; distribution unlimited. ...and implemented to support Airborne Digital Computation, AE 4641, a course involving a study of the methods used for digital computation in ...University, 1975. Submitted in partial fulfillment of the requirements for the degree of Master of Science in Aeronautical Engineering from the Naval...

  7. A uniform technique for flood frequency analysis.

    USGS Publications Warehouse

    Thomas, W.O.

    1985-01-01

    This uniform technique consisted of fitting the logarithms of annual peak discharges to a Pearson Type III distribution using the method of moments. The objective was to adopt a consistent approach for the estimation of floodflow frequencies that could be used in computing average annual flood losses for project evaluation. In addition, a consistent approach was needed for defining equitable flood-hazard zones as part of the National Flood Insurance Program. -from ASCE Publications Information
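
    The procedure above fits a Pearson Type III distribution to the base-10 logarithms of the annual peaks by the method of moments, i.e., from the mean, standard deviation, and skew of the logs. A minimal sketch with hypothetical peak-flow data rather than a real gauging record:

      import numpy as np
      from scipy.stats import pearson3, skew

      # Hypothetical annual peak discharges; a real analysis needs a long record.
      peaks = np.array([1200., 3400., 2100., 5600., 1800., 2900., 4300.,
                        1500., 3800., 2600., 7200., 2200., 3100., 1900.])
      logq = np.log10(peaks)

      # Method of moments on the logs: mean, standard deviation, and skew.
      m, s, g = logq.mean(), logq.std(ddof=1), skew(logq, bias=False)

      # 100-year flood = 0.99 quantile of the fitted log-Pearson Type III distribution.
      q100 = 10 ** pearson3.ppf(0.99, g, loc=m, scale=s)
      print(round(float(q100)))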

  8. Bibliography of mass spectroscopy literature for 1972 compiled by a computer method. Volume I. Bibliography and author index

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Capellen, J.; Svec, H.J.; Sage, C.R.

    1975-08-01

    This report covers the year 1972, and lists approximately 10,000 articles of interest to mass spectroscopists. This two-volume report consists of three sections. Vol. I contains bibliography and author index sections. The bibliography section lists the authors, the title, and the publication data for each article. The author index lists the authors' names and the reference numbers of their articles. (auth)

  9. Cloud Computing: An Overview

    NASA Astrophysics Data System (ADS)

    Qian, Ling; Luo, Zhiguo; Du, Yujian; Guo, Leitao

    In order to support the maximum number of users and elastic service with the minimum resource, Internet service providers invented cloud computing. Within a few years, the emerging cloud computing became the hottest technology. From the publication of core papers by Google since 2003, to the commercialization of Amazon EC2 in 2006, and to the service offering of AT&T Synaptic Hosting, cloud computing has evolved from internal IT systems to public services, from a cost-saving tool to a revenue generator, and from ISPs to telecoms. This paper introduces the concept, history, pros and cons of cloud computing, as well as the value chain and standardization efforts.

  10. Security and Privacy Qualities of Medical Devices: An Analysis of FDA Postmarket Surveillance

    PubMed Central

    Kramer, Daniel B.; Baker, Matthew; Ransford, Benjamin; Molina-Markham, Andres; Stewart, Quinn; Fu, Kevin; Reynolds, Matthew R.

    2012-01-01

    Background Medical devices increasingly depend on computing functions such as wireless communication and Internet connectivity for software-based control of therapies and network-based transmission of patients’ stored medical information. These computing capabilities introduce security and privacy risks, yet little is known about the prevalence of such risks within the clinical setting. Methods We used three comprehensive, publicly available databases maintained by the Food and Drug Administration (FDA) to evaluate recalls and adverse events related to security and privacy risks of medical devices. Results Review of weekly enforcement reports identified 1,845 recalls; 605 (32.8%) of these included computers, 35 (1.9%) stored patient data, and 31 (1.7%) were capable of wireless communication. Searches of databases specific to recalls and adverse events identified only one event with a specific connection to security or privacy. Software-related recalls were relatively common, and most (81.8%) mentioned the possibility of upgrades, though only half of these provided specific instructions for the update mechanism. Conclusions Our review of recalls and adverse events from federal government databases reveals sharp inconsistencies with databases at individual providers with respect to security and privacy risks. Recalls related to software may increase security risks because of unprotected update and correction mechanisms. To detect signals of security and privacy problems that adversely affect public health, federal postmarket surveillance strategies should rethink how to effectively and efficiently collect data on security and privacy problems in devices that increasingly depend on computing systems susceptible to malware. PMID:22829874

  11. Wildlife software: procedures for publication of computer software

    USGS Publications Warehouse

    Samuel, M.D.

    1990-01-01

    Computers and computer software have become an integral part of the practice of wildlife science. Computers now play an important role in teaching, research, and management applications. Because of the specialized nature of wildlife problems, specific computer software is usually required to address a given problem (e.g., home range analysis). This type of software is not usually available from commercial vendors and therefore must be developed by those wildlife professionals with particular skill in computer programming. Current journal publication practices generally prevent a detailed description of computer software associated with new techniques. In addition, peer review of journal articles does not usually include a review of associated computer software. Thus, many wildlife professionals are usually unaware of computer software that would meet their needs or of major improvements in software they commonly use. Indeed most users of wildlife software learn of new programs or important changes only by word of mouth.

  12. ACToR: Aggregated Computational Toxicology Resource (T)

    EPA Science Inventory

    The EPA Aggregated Computational Toxicology Resource (ACToR) is a set of databases compiling information on chemicals in the environment from a large number of public and in-house EPA sources. ACToR has 3 main goals: (1) To serve as a repository of public toxicology information ...

  13. Locating Critical Circular and Unconstrained Failure Surface in Slope Stability Analysis with Tailored Genetic Algorithm

    NASA Astrophysics Data System (ADS)

    Pasik, Tomasz; van der Meij, Raymond

    2017-12-01

    This article presents an efficient search method for representative circular and unconstrained slip surfaces using a tailored genetic algorithm. Searches for unconstrained slip planes with rigid equilibrium methods are as yet uncommon in engineering practice, and few publications regarding truly free slip planes exist. The proposed method is an effective procedure resulting from the right combination of initial population type and selection, crossover and mutation methods. The procedure needs little computational effort to find the optimum, unconstrained slip plane. The methodology described in this paper is implemented using Mathematica. The implementation, along with further explanations, is fully presented so the results can be reproduced. Sample slope stability calculations are performed for four cases, along with a detailed result interpretation. Two cases are compared with analyses described in earlier publications. The remaining two are practical cases of slope stability analyses of dikes in the Netherlands. These four cases show the benefits of analyzing slope stability with a rigid equilibrium method combined with a genetic algorithm. The paper concludes by describing the possibilities and limitations of using a genetic algorithm in the context of the slope stability problem.
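
    As an illustration of the tailored-GA idea (not the authors' Mathematica implementation), the sketch below evolves slip-circle parameters (x, y, r) against a stand-in fitness function; in a real analysis the fitness would be the factor of safety returned by a rigid equilibrium method such as Bishop's:

      import numpy as np

      rng = np.random.default_rng(42)

      def fitness(c):
          # Stand-in for the factor of safety of slip circle (x, y, r); a real
          # implementation would evaluate a rigid equilibrium method here.
          x, y, r = c
          return (x - 10.0) ** 2 + (y - 25.0) ** 2 + (r - 8.0) ** 2  # minimize

      bounds = np.array([[0.0, 50.0], [0.0, 50.0], [1.0, 30.0]])  # x, y, r ranges
      pop = rng.uniform(bounds[:, 0], bounds[:, 1], size=(40, 3))  # initial population

      for _ in range(200):
          f = np.array([fitness(c) for c in pop])
          elite = pop[np.argsort(f)[:10]]  # rank selection: keep the 10 fittest
          # Crossover: blend two random elite parents; mutation: Gaussian jitter.
          parents = elite[rng.integers(0, 10, size=(40, 2))]
          w = rng.uniform(size=(40, 1))
          pop = w * parents[:, 0] + (1 - w) * parents[:, 1]
          pop += rng.normal(scale=0.5, size=pop.shape)
          pop = np.clip(pop, bounds[:, 0], bounds[:, 1])

      best = min(pop, key=fitness)
      print(np.round(best, 2))  # converges near the stand-in optimum (10, 25, 8)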

  14. A hybrid parallel architecture for electrostatic interactions in the simulation of dissipative particle dynamics

    NASA Astrophysics Data System (ADS)

    Yang, Sheng-Chun; Lu, Zhong-Yuan; Qian, Hu-Jun; Wang, Yong-Lei; Han, Jie-Ping

    2017-11-01

    In this work, we upgraded the electrostatic interaction method of CU-ENUF (Yang, et al., 2016), which first applied CUNFFT (nonequispaced Fourier transforms based on CUDA) to the reciprocal-space electrostatic computation and made the computation of electrostatic interactions run entirely on the GPU. The upgraded edition of CU-ENUF runs in a hybrid parallel way: the computation is first parallelized across multiple computer nodes, and then further on the GPU installed in each computer. With this parallel strategy, the size of the simulation system is never restricted to the throughput of a single CPU or GPU. The most critical technical problem is how to parallelize CUNFFT within this strategy, which is conquered effectively by deep-seated research of basic principles and some algorithm skills. Furthermore, the upgraded method is capable of computing electrostatic interactions for both atomistic molecular dynamics (MD) and dissipative particle dynamics (DPD). Finally, the benchmarks conducted for validation and performance indicate that the upgraded method not only presents good precision when suitable parameters are set, but also gives an efficient way to compute electrostatic interactions for huge simulation systems. Program Files doi: http://dx.doi.org/10.17632/zncf24fhpv.1 Licensing provisions: GNU General Public License 3 (GPL) Programming language: C, C++, and CUDA C Supplementary material: The program is designed for effective electrostatic interactions of large-scale simulation systems, and runs on computers equipped with NVIDIA GPUs. It has been tested on (a) a single computer node with an Intel(R) Core(TM) i7-3770 @ 3.40 GHz (CPU) and a GTX 980 Ti (GPU), and (b) MPI-parallel computer nodes with the same configurations. Nature of problem: For molecular dynamics simulation, the electrostatic interaction is the most time-consuming computation because of its long-range character and slow convergence in simulation space, and it takes up most of the total simulation time. Although the parallel method CU-ENUF (Yang et al., 2016) based on GPU achieved a qualitative leap over previous methods in the computation of electrostatic interactions, its capability is limited by the throughput of a single GPU for super-scale simulation systems. Therefore, we need an effective method to handle the calculation of electrostatic interactions efficiently for simulation systems of super-scale size. Solution method: We constructed a hybrid parallel architecture in which CPUs and GPUs are combined to accelerate the electrostatic computation effectively. First, the simulation system is divided into many subtasks via a domain-decomposition method. Then MPI (Message Passing Interface) is used to implement CPU-parallel computation, with each computer node corresponding to a particular subtask, and each subtask in a computer node is further executed efficiently in parallel on the GPU. In this hybrid parallel method, the most critical technical problem is how to parallelize a CUNFFT (nonequispaced fast Fourier transform based on CUDA) within the parallel strategy, which is conquered effectively by deep-seated research of basic principles and some algorithm skills. Restrictions: HP-ENUF is mainly oriented to super-scale system simulations, in which the performance superiority is shown adequately.
However, for a small simulation system containing fewer than 10⁶ particles, the multiple-node mode offers no clear efficiency advantage over the single-node mode, and may even be slower, because of network latency among the computer nodes. References: (1) S.-C. Yang, H.-J. Qian, Z.-Y. Lu, Appl. Comput. Harmon. Anal., 2016, http://dx.doi.org/10.1016/j.acha.2016.04.009. (2) S.-C. Yang, Y.-L. Wang, G.-S. Jiao, H.-J. Qian, Z.-Y. Lu, J. Comput. Chem. 37 (2016) 378. (3) S.-C. Yang, Y.-L. Zhu, H.-J. Qian, Z.-Y. Lu, Appl. Chem. Res. Chin. Univ., 2017, http://dx.doi.org/10.1007/s40242-016-6354-5. (4) Y.-L. Zhu, H. Liu, Z.-W. Li, H.-J. Qian, G. Milano, Z.-Y. Lu, J. Comput. Chem. 34 (2013) 2197.
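
    The following is a minimal Python sketch of the hybrid parallel pattern described in the solution method above: MPI domain decomposition across nodes, with a placeholder standing in for the per-node GPU (CUNFFT) step. It assumes mpi4py and numpy; the function local_reciprocal_energy and all parameter values are illustrative assumptions, not part of the HP-ENUF code, which is written in C/C++/CUDA.

        # Sketch: MPI domain decomposition with a per-node compute step.
        # 'local_reciprocal_energy' is a placeholder for the GPU/CUNFFT
        # kernel; here it is a trivial direct sum over the local subdomain,
        # and cross-subdomain pairs are ignored (a real code would exchange
        # ghost particles between neighboring domains).
        import numpy as np
        from mpi4py import MPI

        def local_reciprocal_energy(positions, charges):
            e = 0.0
            for i in range(len(charges)):
                for j in range(i + 1, len(charges)):
                    r = np.linalg.norm(positions[i] - positions[j])
                    e += charges[i] * charges[j] / r
            return e

        comm = MPI.COMM_WORLD
        rank, size = comm.Get_rank(), comm.Get_size()

        rng = np.random.default_rng(seed=rank)
        n_local = 200                            # particles owned by this node
        pos = rng.random((n_local, 3))
        pos[:, 0] = (pos[:, 0] + rank) / size    # slab decomposition along x
        q = rng.choice([-1.0, 1.0], n_local)

        e_local = local_reciprocal_energy(pos, q)
        e_total = comm.allreduce(e_local, op=MPI.SUM)  # sum over all nodes
        if rank == 0:
            print("total electrostatic energy (sketch):", e_total)

    Run under, e.g., mpiexec -n 4 python sketch.py; in the real scheme each rank would hand its subtask to its own GPU instead of the Python placeholder.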

  15. 43 CFR 44.23 - How does the Department certify payment computations?

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Certified Public Accountant, or an independent public accountant, that the statement has been audited in... guidelines that State auditors, independent Certified Public Accountants, or independent public accountants...

  16. 43 CFR 44.23 - How does the Department certify payment computations?

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Certified Public Accountant, or an independent public accountant, that the statement has been audited in... guidelines that State auditors, independent Certified Public Accountants, or independent public accountants...

  17. 43 CFR 44.23 - How does the Department certify payment computations?

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Certified Public Accountant, or an independent public accountant, that the statement has been audited in... guidelines that State auditors, independent Certified Public Accountants, or independent public accountants...

  18. 43 CFR 44.23 - How does the Department certify payment computations?

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Certified Public Accountant, or an independent public accountant, that the statement has been audited in... guidelines that State auditors, independent Certified Public Accountants, or independent public accountants...

  19. 45 CFR 612.2 - Public reading room.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 45 Public Welfare 3 2012-10-01 2012-10-01 false Public reading room. 612.2 Section 612.2 Public Welfare Regulations Relating to Public Welfare (Continued) NATIONAL SCIENCE FOUNDATION AVAILABILITY OF... available for public inspection and copying and has computers and printers available for public use in...

  20. 45 CFR 612.2 - Public reading room.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 45 Public Welfare 3 2011-10-01 2011-10-01 false Public reading room. 612.2 Section 612.2 Public Welfare Regulations Relating to Public Welfare (Continued) NATIONAL SCIENCE FOUNDATION AVAILABILITY OF... available for public inspection and copying and has computers and printers available for public use in...

  1. 45 CFR 612.2 - Public reading room.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 45 Public Welfare 3 2014-10-01 2014-10-01 false Public reading room. 612.2 Section 612.2 Public Welfare Regulations Relating to Public Welfare (Continued) NATIONAL SCIENCE FOUNDATION AVAILABILITY OF... available for public inspection and copying and has computers and printers available for public use in...

  2. 45 CFR 612.2 - Public reading room.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 45 Public Welfare 3 2013-10-01 2013-10-01 false Public reading room. 612.2 Section 612.2 Public Welfare Regulations Relating to Public Welfare (Continued) NATIONAL SCIENCE FOUNDATION AVAILABILITY OF... available for public inspection and copying and has computers and printers available for public use in...

  3. 45 CFR 612.2 - Public reading room.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 45 Public Welfare 3 2010-10-01 2010-10-01 false Public reading room. 612.2 Section 612.2 Public Welfare Regulations Relating to Public Welfare (Continued) NATIONAL SCIENCE FOUNDATION AVAILABILITY OF... available for public inspection and copying and has computers and printers available for public use in...

  4. Inferring gene and protein interactions using PubMed citations and consensus Bayesian networks

    PubMed Central

    Dalman, Mark; Haddad, Joseph; Duan, Zhong-Hui

    2017-01-01

    The PubMed database offers an extensive set of publication data that can be useful, yet inherently complex to use without automated computational techniques. Data repositories such as the Genomic Data Commons (GDC) and the Gene Expression Omnibus (GEO) offer experimental data storage and retrieval as well as curated gene expression profiles. Genetic interaction databases, including Reactome and Ingenuity Pathway Analysis, offer pathway and experiment data analysis using data curated from these publications and data repositories. We have created a method to generate and analyze consensus networks, inferring potential gene interactions, using large numbers of Bayesian networks generated by data mining publications in the PubMed database. Through the concept of network resolution, these consensus networks can be tailored to represent possible genetic interactions. We designed a set of experiments to confirm that our method is stable across variation in both sample and topological input sizes. Using gene product interactions from the KEGG pathway database and data mining PubMed publication abstracts, we verify that regardless of the network resolution or the inferred consensus network, our method is capable of inferring meaningful gene interactions through consensus Bayesian network generation with multiple, randomized topological orderings. Our method can not only confirm the existence of currently accepted interactions, but also has the potential to hypothesize new ones. We show our method confirms the existence of known gene interactions such as JAK-STAT-PI3K-AKT-mTOR, infers novel gene interactions such as RAS-Bcl-2 and RAS-AKT, and finds significant pathway-pathway interactions between the JAK-STAT signaling and Cardiac Muscle Contraction KEGG pathways. PMID:29049295
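
    A minimal sketch of the consensus idea described above: edges that recur across many candidate networks are kept when their frequency clears a chosen "network resolution" threshold. The candidate edge sets below are hypothetical stand-ins for Bayesian networks learned under randomized topological orderings; the learning step itself is not shown.

        # Keep an edge in the consensus network if it appears in at least
        # a 'resolution' fraction of the candidate networks.
        from collections import Counter

        def consensus_network(candidate_networks, resolution):
            """candidate_networks: list of edge sets, e.g. {('JAK', 'STAT'), ...}."""
            counts = Counter(e for net in candidate_networks for e in net)
            n = len(candidate_networks)
            return {edge for edge, c in counts.items() if c / n >= resolution}

        nets = [
            {("JAK", "STAT"), ("PI3K", "AKT")},
            {("JAK", "STAT"), ("RAS", "AKT")},
            {("JAK", "STAT"), ("PI3K", "AKT"), ("RAS", "AKT")},
        ]
        # Raising the resolution shrinks the consensus toward edges that
        # appear in every candidate network.
        print(consensus_network(nets, resolution=0.6))  # all three edges
        print(consensus_network(nets, resolution=1.0))  # {('JAK', 'STAT')}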

  5. Effect of Addiction to Computer Games on Physical and Mental Health of Female and Male Students of Guidance School in City of Isfahan

    PubMed Central

    Zamani, Eshrat; Chashmi, Maliheh; Hedayati, Nasim

    2009-01-01

    Background: This study aimed to investigate the effects of addiction to computer games on the physical and mental health of students. Methods: The study population includes all students in the second year of public guidance schools in the city of Isfahan in the 2009-2010 educational year. The sample of 564 students was selected by multi-stage stratified sampling. Dependent variables include general health in the dimensions of physical health, anxiety and sleeplessness, and impaired social functioning. Data were collected using the General Health Questionnaire (GHQ-28) scale and a questionnaire on addiction to computer games. Pearson's correlation coefficient and structural modeling were used for data analysis. Findings: There was a significant positive correlation between students' computer game addiction and their physical and mental health in the dimensions of physical health, anxiety and sleeplessness. There was a significant negative relationship between addiction to computer games and impaired social functioning. Conclusion: The results of this study are in agreement with the findings of other studies around the world. As the results show, addiction to computer games affects various dimensions of health, increasing physical problems, anxiety and depression while decreasing impaired social functioning. PMID:24494091

  6. What do computer scientists tweet? Analyzing the link-sharing practice on Twitter.

    PubMed

    Schmitt, Marco; Jäschke, Robert

    2017-01-01

    Twitter communication has permeated every sphere of society. Highlighting and sharing small pieces of information with possibly vast audiences, or with small circles of the interested, has some value in almost any aspect of social life. But what exactly is the value for a scientific field? We perform a comprehensive study of computer scientists using Twitter and their tweeting behavior concerning the sharing of web links. Discerning the domains, hosts and individual web pages being tweeted, and the differences between computer scientists and a Twitter sample, enables us to look in depth at the Twitter-based information sharing practices of a scientific community. Additionally, we aim at providing a deeper understanding of the role and impact of altmetrics in computer science and take a glance at the publications mentioned on Twitter that are most relevant for the computer science community. Our results show a link sharing culture that concentrates more heavily on public and professional quality information than the Twitter sample does. The results also show a broad variety in linked sources, and especially in linked publications: some publications are clearly related to community-specific interests of computer scientists, while others relate strongly to attention mechanisms in social media. This reflects the observation that Twitter is a hybrid form of social media, between an information service and a social network service. Overall, the computer scientists' style of usage leans toward the information-oriented side and, to some degree, toward professional usage. Therefore, altmetrics are of considerable use in analyzing computer science.

  7. What do computer scientists tweet? Analyzing the link-sharing practice on Twitter

    PubMed Central

    Schmitt, Marco

    2017-01-01

    Twitter communication has permeated every sphere of society. Highlighting and sharing small pieces of information with possibly vast audiences, or with small circles of the interested, has some value in almost any aspect of social life. But what exactly is the value for a scientific field? We perform a comprehensive study of computer scientists using Twitter and their tweeting behavior concerning the sharing of web links. Discerning the domains, hosts and individual web pages being tweeted, and the differences between computer scientists and a Twitter sample, enables us to look in depth at the Twitter-based information sharing practices of a scientific community. Additionally, we aim at providing a deeper understanding of the role and impact of altmetrics in computer science and take a glance at the publications mentioned on Twitter that are most relevant for the computer science community. Our results show a link sharing culture that concentrates more heavily on public and professional quality information than the Twitter sample does. The results also show a broad variety in linked sources, and especially in linked publications: some publications are clearly related to community-specific interests of computer scientists, while others relate strongly to attention mechanisms in social media. This reflects the observation that Twitter is a hybrid form of social media, between an information service and a social network service. Overall, the computer scientists’ style of usage leans toward the information-oriented side and, to some degree, toward professional usage. Therefore, altmetrics are of considerable use in analyzing computer science. PMID:28636619

  8. Ames Research Center publications: A continuing bibliography, 1980

    NASA Technical Reports Server (NTRS)

    1981-01-01

    This bibliography lists formal NASA publications, journal articles, books, chapters of books, patents, contractor reports, and computer programs that were issued by Ames Research Center and indexed by Scientific and Technical Aerospace Reports, Limited Scientific and Technical Aerospace Reports, International Aerospace Abstracts, and Computer Program Abstracts in 1980. Citations are arranged by directorate, type of publication, and NASA accession numbers. Subject, personal author, corporate source, contract number, and report/accession number indexes are provided.

  9. Laser speckle imaging for lesion detection on tooth

    NASA Astrophysics Data System (ADS)

    Gavinho, Luciano G.; Silva, João V. P.; Damazio, João H.; Sfalcin, Ravana A.; Araujo, Sidnei A.; Pinto, Marcelo M.; Olivan, Silvia R. G.; Prates, Renato A.; Bussadori, Sandra K.; Deana, Alessandro M.

    2018-02-01

    Computer vision technologies for diagnostic imaging applied to oral lesions, specifically carious lesions of the teeth, are in their early years of development. Dental caries is a public health problem that worries countries around the world, as it affects almost the entire population at least once in each individual's life. The present work demonstrates current techniques for obtaining information about lesions on teeth by segmenting laser speckle images (LSI). A laser speckle image results from laser light reflection on a rough surface; it was long treated as noise, but it has important features that carry information about the illuminated surface. Even though these are basic images, only a few works have analyzed them with computer vision methods. In this article, we present the latest results of our group, in which computer vision techniques were adapted to segment laser speckle images for diagnostic purposes. These methods are applied to segment images into healthy and lesioned regions of the tooth, and they have proven effective in the diagnosis of early-stage lesions, which are often imperceptible to traditional diagnostic methods in clinical practice. The first method uses first-order statistical models, segmenting the image by comparing the mean and standard deviation of pixel intensities. The second method is based on the chi-square (χ²) distance between image histograms, bringing a significant improvement in diagnostic precision, while a third method introduces fractal geometry, using the fractal dimension to expose the difference between lesioned and healthy areas of a tooth more precisely than the other segmentation methods. So far, we observe efficient segmentation of the carious regions. A software package was developed to execute and demonstrate the applicability of the models.
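
    A minimal sketch of the second (chi-square histogram distance) approach, under stated assumptions: each image patch's intensity histogram is compared against a reference histogram taken from a known-healthy region, and patches at a large distance are flagged as candidate lesion areas. The patch size, bin count and threshold are illustrative choices, not values from the study.

        # Flag patches whose intensity histogram differs strongly (chi-square
        # distance) from a reference histogram of a known-healthy region.
        import numpy as np

        def chi2_distance(h1, h2, eps=1e-10):
            return 0.5 * np.sum((h1 - h2) ** 2 / (h1 + h2 + eps))

        def segment_speckle(image, reference_patch, patch=16, bins=32, thresh=5.0):
            ref_hist, _ = np.histogram(reference_patch, bins=bins, range=(0, 1),
                                       density=True)
            mask = np.zeros(image.shape, dtype=bool)
            for i in range(0, image.shape[0] - patch + 1, patch):
                for j in range(0, image.shape[1] - patch + 1, patch):
                    win = image[i:i + patch, j:j + patch]
                    h, _ = np.histogram(win, bins=bins, range=(0, 1), density=True)
                    if chi2_distance(h, ref_hist) > thresh:
                        mask[i:i + patch, j:j + patch] = True  # candidate lesion
            return mask

        rng = np.random.default_rng(0)
        img = rng.random((64, 64))          # stand-in for a speckle image
        img[20:40, 20:40] *= 0.4            # darker, lower-contrast "lesion"
        mask = segment_speckle(img, reference_patch=img[:16, :16])
        print("flagged pixels:", int(mask.sum()))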

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Brien, Travis A.; Kashinath, Karthik; Cavanaugh, Nicholas R.

    Numerous facets of scientific research implicitly or explicitly call for the estimation of probability densities. Histograms and kernel density estimates (KDEs) are two commonly used techniques for estimating such information, with the KDE generally providing a higher-fidelity representation of the probability density function (PDF). Both methods require specification of either a bin width or a kernel bandwidth. While techniques exist for choosing the kernel bandwidth optimally and objectively, they are computationally intensive, since they require repeated calculation of the KDE. A solution for objectively and optimally choosing both the kernel shape and width has recently been developed by Bernacchia and Pigolotti (2011). While this solution theoretically applies to multidimensional KDEs, it has not been clear how to do so in practice. A method for practically extending the Bernacchia-Pigolotti KDE to multiple dimensions is introduced. This multidimensional extension is combined with a recently developed computational improvement to their method that makes it efficient: a 2D KDE on 10⁵ samples takes only 1 s on a modern workstation. This fast and objective KDE method, called the fastKDE method, retains the excellent statistical convergence properties that have been demonstrated for univariate samples. The fastKDE method exhibits statistical accuracy comparable to state-of-the-science KDE methods publicly available in R, and it produces kernel density estimates several orders of magnitude faster. The fastKDE method does an excellent job of encoding covariance information for bivariate samples. This property allows direct calculation of conditional PDFs with fastKDE. It is demonstrated how this capability might be leveraged for detecting non-trivial relationships between quantities in physical systems, such as transitional behavior.
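
    A short sketch of the joint-then-conditional PDF computation highlighted above, using scipy's gaussian_kde (a rule-of-thumb-bandwidth estimator) as a simple stand-in for the objective fastKDE estimator; the data and evaluation grid are illustrative.

        # Estimate a bivariate joint PDF, then derive p(y | x) = p(x, y) / p(x).
        # gaussian_kde is a generic KDE; fastKDE plays this role (objectively
        # and much faster) in the work described above.
        import numpy as np
        from scipy.stats import gaussian_kde

        rng = np.random.default_rng(1)
        n = 2_000
        x = rng.normal(size=n)
        y = 0.8 * x + 0.6 * rng.normal(size=n)       # correlated pair

        kde = gaussian_kde(np.vstack([x, y]))

        gx = np.linspace(-3, 3, 61)
        gy = np.linspace(-3, 3, 61)
        X, Y = np.meshgrid(gx, gy)
        joint = kde(np.vstack([X.ravel(), Y.ravel()])).reshape(X.shape)

        dy = gy[1] - gy[0]
        px = joint.sum(axis=0) * dy                  # marginal p(x) on the grid
        cond = joint / px[None, :]                   # column j holds p(y | x = gx[j])
        j = np.argmin(np.abs(gx - 1.0))
        print("E[y | x ~ 1] ~", np.sum(gy * cond[:, j]) * dy)   # ~0.8 expected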

  11. Peculiarities of organization of project and research activity of students in computer science, physics and technology

    NASA Astrophysics Data System (ADS)

    Stolyarov, I. V.

    2017-01-01

    The author of this article supervises project and research activity of students in the areas of computer science, physics, engineering and biology, based on experience acquired in these fields. The pupils regularly win competitions and conferences at different levels; for example, three were finalists of Intel ISEF in 2013 in Phoenix (Arizona, USA) and in 2014 in Los Angeles (California, USA). In 2013 A. Makarychev received the "Small Nobel Prize" in the Computer Science section and a special sponsor's award from the company CAST. Scientific themes and methods suggested by the author and developed in joint publications with students from Russia, Germany and Austria have resulted in invention patents and registration certificates from ROSPATENT. The article presents the results of the implementation of specific software and hardware systems in physics, engineering and medicine.

  12. All Roads Lead to Portal

    ERIC Educational Resources Information Center

    Heid, Susan D.

    2007-01-01

    Portals are taking off on campuses nationwide. According to "Campus Computing 2006," the Campus Computing Project's survey of 540 two- and four-year public and private colleges and universities across the US, portal deployment for four-year public residential universities jumped from 28 to 74 percent of responding institutions between the…

  13. The Ever-Present Demand for Public Computing Resources. CDS Spotlight

    ERIC Educational Resources Information Center

    Pirani, Judith A.

    2014-01-01

    This Core Data Service (CDS) Spotlight focuses on public computing resources, including lab/cluster workstations in buildings, virtual lab/cluster workstations, kiosks, laptop and tablet checkout programs, and workstation access in unscheduled classrooms. The findings are derived from 758 CDS 2012 participating institutions. A dataset of 529…

  14. 45 CFR 150.429 - Computation of time and extensions of time.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 45 Public Welfare 1 2010-10-01 2010-10-01 false Computation of time and extensions of time. 150.429 Section 150.429 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES REQUIREMENTS RELATING TO HEALTH CARE ACCESS CMS ENFORCEMENT IN GROUP AND INDIVIDUAL INSURANCE MARKETS Administrative Hearings...

  15. 45 CFR 150.429 - Computation of time and extensions of time.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 45 Public Welfare 1 2011-10-01 2011-10-01 false Computation of time and extensions of time. 150.429 Section 150.429 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES REQUIREMENTS RELATING TO HEALTH CARE ACCESS CMS ENFORCEMENT IN GROUP AND INDIVIDUAL INSURANCE MARKETS Administrative Hearings...

  16. The Reality, Direction, and Future of Computerized Publications.

    ERIC Educational Resources Information Center

    Levenstein, Nicholas

    1994-01-01

    Considers potential of personal computers, comparing development of computer today to that of cars in the 1920s. Examines recent changes in communications, university publications, and cultural habits. Explores technological issues, looking at modems and fiber optic cables, better screens, and CD-ROM and optical magnetic drives. (NB)

  17. 48 CFR 53.105 - Computer generation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 2 2010-10-01 2010-10-01 false Computer generation. 53...) CLAUSES AND FORMS FORMS General 53.105 Computer generation. (a) Agencies may computer-generate the... be computer generated by the public. Unless prohibited by agency regulations, forms prescribed by...

  18. 48 CFR 53.105 - Computer generation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 2 2011-10-01 2011-10-01 false Computer generation. 53...) CLAUSES AND FORMS FORMS General 53.105 Computer generation. (a) Agencies may computer-generate the... be computer generated by the public. Unless prohibited by agency regulations, forms prescribed by...

  19. The Geoscience Paper of the Future: Best Practices for Documenting and Sharing Research from Data to Software to Provenance

    NASA Astrophysics Data System (ADS)

    Gil, Y.; Yu, X.; David, C. H.; Demir, I.; Essawy, B.; Fulweiler, R. W.; Goodall, J. L.; Karlstrom, L.; Lee, H.; Mills, H. J.; Pierce, S. A.; Pope, A.; Tzeng, M.; Villamizar, S. R.

    2016-12-01

    Geoscientists live in a world rich with digital data and methods, and their computational research cannot be fully captured in traditional publications. The Geoscience Paper of the Future (GPF) proposes best practices for GPF authors to make data, software, and methods openly accessible, citable, and well documented. Those best practices come from recommendations by both scholars and organizations concerning open science, reproducible publications, and digital scholarship. The publication of digital objects empowers scientists to manage their research products as valuable scientific assets in an open and transparent way that enables broader access by other scientists, students, decision makers, and the public. Improving documentation and dissemination of research will accelerate the pace of scientific discovery by improving the ability of others to build upon published work. This presentation summarizes these best practices, as well as the practical experiences of several GPF authors in different geosciences disciplines. It will also discuss existing challenges for authors and publishers to produce GPFs in practice, and the opportunities to develop new approaches and infrastructure to implement those best practices. The adoption of GPF recommendations requires awareness and social change in the scientific community, including clear communication of the benefits and best practices that may be new to geoscientists.

  20. The Role of Parents and Related Factors on Adolescent Computer Use

    PubMed Central

    Epstein, Jennifer A.

    2012-01-01

    Background Research has suggested the importance of parents in their adolescents’ computer activity. Spending too much time on the computer for recreational purposes in particular has been found to be related to areas of public health concern in children/adolescents, including obesity and substance use. Design and Methods The goal of the research was to determine the association between recreational computer use and potentially linked factors (parental monitoring, social influences to use computers including parents, age of first computer use, self-control, and particular internet activities). Participants (aged 13-17 years and residing in the United States) were recruited via the Internet to complete an anonymous survey online using a survey tool. The target sample of 200 participants who completed the survey was achieved. The sample’s average age was 16, and 63% were girls. Results A set of regressions with recreational computer use as the dependent variable was run. Conclusions Less parental monitoring, younger age at first computer use, listening to or downloading music from the internet more frequently, using the internet for educational purposes less frequently, and parents’ use of the computer for pleasure were related to spending a greater percentage of time on non-school computer use. These findings suggest the importance of parental monitoring and parental computer use on their children’s own computer use, and the influence of some internet activities on adolescent computer use. Finally, programs aimed at parents are needed to help them increase the age at which their children start using computers and to help them place limits on recreational computer use. PMID:25170449

  1. Computer loss experience and predictions

    NASA Astrophysics Data System (ADS)

    Parker, Donn B.

    1996-03-01

    The types of losses organizations must anticipate have become more difficult to predict because of the eclectic nature of computers and data communications, and because of the decrease in news media reporting of computer-related losses as they become commonplace. Total business crime is conjectured to be decreasing in frequency and increasing in loss per case as a result of increasing computer use. Computer crimes are probably increasing, however, as their share of the decreasing business crime rate grows. Ultimately all business crime will involve computers in some way, and we could see a decline of both together. The important information security measures in high-loss business crime generally concern controls over authorized people engaged in unauthorized activities. Such controls include authentication of users, analysis of detailed audit records, unannounced audits, segregation of development and production systems and duties, shielding the viewing of screens, and security awareness and motivation controls in high-value transaction areas. Computer crimes that involve highly publicized, intriguing computer misuse methods, such as privacy violations, radio frequency emanations eavesdropping, and computer viruses, have been reported in waves that periodically have saturated the news media during the past 20 years. We must be able to anticipate such highly publicized crimes and reduce the impact and embarrassment they cause. On the basis of our most recent experience, I propose nine new types of computer crime to be aware of: computer larceny (theft and burglary of small computers), automated hacking (use of computer programs to intrude), electronic data interchange fraud (business transaction fraud), Trojan bomb extortion and sabotage (code secretly inserted into others' systems that can be triggered to cause damage), LANarchy (unknown equipment in use), desktop forgery (computerized forgery and counterfeiting of documents), information anarchy (indiscriminate use of crypto without control), Internet abuse (antisocial use of data communications), and international industrial espionage (governments stealing business secrets). A wide variety of safeguards are necessary to deal with these new crimes. The most powerful controls include (1) carefully controlled use of cryptography and digital signatures with good key management and overriding business and government decryption capability, and (2) use of tokens such as smart cards to increase the strength of secret passwords for authentication of computer users. Jewelry-type security for small computers--including registration of serial numbers and security inventorying of equipment, software, and connectivity--will be necessary. Other safeguards include automatic monitoring of computer use and detection of unusual activities, segmentation and filtering of networks, special paper and ink for documents, and reduction of paper documents. Finally, international cooperation of governments to create trusted environments for business is essential.

  2. Mining SNPs from EST sequences using filters and ensemble classifiers.

    PubMed

    Wang, J; Zou, Q; Guo, M Z

    2010-05-04

    Abundant single nucleotide polymorphisms (SNPs) provide the most complete information for genome-wide association studies. However, due to the bottleneck of manual discovery of putative SNPs and the inaccessibility of the original sequencing reads, it is essential to develop a more efficient and accurate computational method for automated SNP detection. We propose a novel computational method, implemented as SNPDigger, to rapidly find true SNPs in publicly available EST (expressed sequence tag) databases. EST sequences are clustered and aligned, and SNP candidates are then obtained according to a measure of redundant frequency. Several new informative biological features, such as structural neighbor profiles and the physical position of the SNP, were extracted from the EST sequences, and the effectiveness of these features was demonstrated. An ensemble classifier, which employs a carefully selected feature set, was included to handle the imbalanced training data. The sensitivity and specificity of our method both exceeded 80% for human genetic data in cross-validation. Our method enables detection of SNPs from the user's own EST dataset and can be used for species with no available genome data. Our tests showed that this method can effectively guide SNP discovery in ESTs and will be useful for reducing the cost of biological analyses.
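
    A minimal sketch of an ensemble classifier for imbalanced training data in the spirit described above: each base learner sees all minority-class examples plus a fresh random undersample of the majority class, and predictions are combined by majority vote. The EST-derived features are not reproduced here; inputs are generic vectors, and all names and parameters are illustrative.

        # Balanced-undersampling ensemble for imbalanced binary data.
        import numpy as np
        from sklearn.tree import DecisionTreeClassifier

        def train_balanced_ensemble(X, y, n_members=15, seed=0):
            rng = np.random.default_rng(seed)
            minority, majority = X[y == 1], X[y == 0]
            members = []
            for _ in range(n_members):
                idx = rng.choice(len(majority), size=len(minority), replace=False)
                Xb = np.vstack([minority, majority[idx]])
                yb = np.concatenate([np.ones(len(minority)),
                                     np.zeros(len(minority))])
                members.append(DecisionTreeClassifier(max_depth=5).fit(Xb, yb))
            return members

        def predict_vote(members, X):
            votes = np.mean([m.predict(X) for m in members], axis=0)
            return (votes >= 0.5).astype(int)

        rng = np.random.default_rng(1)
        X = rng.normal(size=(2000, 10))
        y = (X[:, 0] + 0.5 * rng.normal(size=2000) > 1.5).astype(int)  # ~7% positives
        ens = train_balanced_ensemble(X, y)
        print("predicted positives:", predict_vote(ens, X).sum(), "true:", y.sum())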

  3. LUNGx Challenge for computerized lung nodule classification

    DOE PAGES

    Armato, Samuel G.; Drukker, Karen; Li, Feng; ...

    2016-12-19

    The purpose of this work is to describe the LUNGx Challenge for the computerized classification of lung nodules on diagnostic computed tomography (CT) scans as benign or malignant and report the performance of participants’ computerized methods along with that of six radiologists who participated in an observer study performing the same Challenge task on the same dataset. The Challenge provided sets of calibration and testing scans, established a performance assessment process, and created an infrastructure for case dissemination and result submission. Ten groups applied their own methods to 73 lung nodules (37 benign and 36 malignant) that were selected to achieve approximate size matching between the two cohorts. Area under the receiver operating characteristic curve (AUC) values for these methods ranged from 0.50 to 0.68; only three methods performed statistically better than random guessing. The radiologists’ AUC values ranged from 0.70 to 0.85; three radiologists performed statistically better than the best-performing computer method. The LUNGx Challenge compared the performance of computerized methods in the task of differentiating benign from malignant lung nodules on CT scans, placed in the context of the performance of radiologists on the same task. The continued public availability of the Challenge cases will provide a valuable resource for the medical imaging research community.
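
    A small sketch of the kind of evaluation the Challenge performed: computing AUC for a classifier's malignancy scores on 73 nodules and using a bootstrap interval to judge whether the method beats random guessing (AUC = 0.5). The scores below are synthetic; only the class counts are taken from the record.

        # AUC with a bootstrap 95% interval against the chance level of 0.5.
        import numpy as np
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(0)
        y = np.array([0] * 37 + [1] * 36)           # 37 benign, 36 malignant
        scores = y * 0.3 + rng.random(73)           # weakly informative scores

        auc = roc_auc_score(y, scores)
        boot = [roc_auc_score(y[i], scores[i])
                for i in (rng.integers(0, 73, 73) for _ in range(2000))
                if len(np.unique(y[i])) == 2]       # skip one-class resamples
        lo, hi = np.percentile(boot, [2.5, 97.5])
        print(f"AUC = {auc:.2f}, 95% bootstrap CI = [{lo:.2f}, {hi:.2f}]")
        print("better than chance" if lo > 0.5 else "not distinguishable from chance")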

  4. LUNGx Challenge for computerized lung nodule classification

    PubMed Central

    Armato, Samuel G.; Drukker, Karen; Li, Feng; Hadjiiski, Lubomir; Tourassi, Georgia D.; Engelmann, Roger M.; Giger, Maryellen L.; Redmond, George; Farahani, Keyvan; Kirby, Justin S.; Clarke, Laurence P.

    2016-01-01

    Abstract. The purpose of this work is to describe the LUNGx Challenge for the computerized classification of lung nodules on diagnostic computed tomography (CT) scans as benign or malignant and report the performance of participants’ computerized methods along with that of six radiologists who participated in an observer study performing the same Challenge task on the same dataset. The Challenge provided sets of calibration and testing scans, established a performance assessment process, and created an infrastructure for case dissemination and result submission. Ten groups applied their own methods to 73 lung nodules (37 benign and 36 malignant) that were selected to achieve approximate size matching between the two cohorts. Area under the receiver operating characteristic curve (AUC) values for these methods ranged from 0.50 to 0.68; only three methods performed statistically better than random guessing. The radiologists’ AUC values ranged from 0.70 to 0.85; three radiologists performed statistically better than the best-performing computer method. The LUNGx Challenge compared the performance of computerized methods in the task of differentiating benign from malignant lung nodules on CT scans, placed in the context of the performance of radiologists on the same task. The continued public availability of the Challenge cases will provide a valuable resource for the medical imaging research community. PMID:28018939

  5. LUNGx Challenge for computerized lung nodule classification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Armato, Samuel G.; Drukker, Karen; Li, Feng

    The purpose of this work is to describe the LUNGx Challenge for the computerized classification of lung nodules on diagnostic computed tomography (CT) scans as benign or malignant and report the performance of participants’ computerized methods along with that of six radiologists who participated in an observer study performing the same Challenge task on the same dataset. The Challenge provided sets of calibration and testing scans, established a performance assessment process, and created an infrastructure for case dissemination and result submission. Ten groups applied their own methods to 73 lung nodules (37 benign and 36 malignant) that were selected to achieve approximate size matching between the two cohorts. Area under the receiver operating characteristic curve (AUC) values for these methods ranged from 0.50 to 0.68; only three methods performed statistically better than random guessing. The radiologists’ AUC values ranged from 0.70 to 0.85; three radiologists performed statistically better than the best-performing computer method. The LUNGx Challenge compared the performance of computerized methods in the task of differentiating benign from malignant lung nodules on CT scans, placed in the context of the performance of radiologists on the same task. The continued public availability of the Challenge cases will provide a valuable resource for the medical imaging research community.

  6. A Comparative Survey of Methods for Remote Heart Rate Detection From Frontal Face Videos

    PubMed Central

    Wang, Chen; Pun, Thierry; Chanel, Guillaume

    2018-01-01

    Remotely measuring physiological activity can provide substantial benefits for both medical and affective computing applications. Recent research has proposed different methodologies for the unobtrusive detection of heart rate (HR) using human face recordings. These methods are based on subtle color changes or motions of the face due to cardiovascular activity, which are invisible to human eyes but can be captured by digital cameras. Several approaches have been proposed, ranging from signal processing to machine learning. However, these methods have been evaluated on different datasets, and there is consequently no consensus on method performance. In this article, we describe and evaluate several methods defined in the literature, from 2008 until the present day, for the remote detection of HR using human face recordings. The general HR processing pipeline is divided into three stages: face video processing, face blood volume pulse (BVP) signal extraction, and HR computation. Approaches presented in the paper are classified and grouped according to each stage. At each stage, algorithms are analyzed and compared based on their performance using the public database MAHNOB-HCI. Results reported in this article are limited to the MAHNOB-HCI dataset. Results show that the extracted face skin area contains more BVP information. Blind source separation and peak detection methods are more robust to head motions when estimating HR. PMID:29765940
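
    A minimal sketch of the last two pipeline stages described above (BVP signal extraction and HR computation), under stated assumptions: a per-frame mean green-channel trace from the face region is taken as given, band-pass filtered to the plausible cardiac band, and HR is read off the dominant spectral peak. The synthetic trace, band limits and frame rate are illustrative, and blind source separation over the RGB channels is omitted for brevity.

        # BVP extraction (band-pass filter) and HR computation (spectral peak).
        import numpy as np
        from scipy.signal import butter, filtfilt

        fps = 30.0
        t = np.arange(0, 30, 1 / fps)                 # 30 s of video
        hr_true = 72 / 60.0                           # 72 bpm in Hz
        # Synthetic trace: weak cardiac pulse buried in drift and noise.
        trace = 0.05 * np.sin(2 * np.pi * hr_true * t)
        trace += 0.5 * np.sin(2 * np.pi * 0.1 * t)    # slow illumination drift
        trace += 0.05 * np.random.default_rng(0).normal(size=t.size)

        # Band-pass 0.7-4 Hz (42-240 bpm), the plausible cardiac band.
        b, a = butter(3, [0.7 / (fps / 2), 4.0 / (fps / 2)], btype="band")
        bvp = filtfilt(b, a, trace)

        # HR = frequency of the dominant peak within the cardiac band.
        freqs = np.fft.rfftfreq(bvp.size, 1 / fps)
        spec = np.abs(np.fft.rfft(bvp))
        band = (freqs >= 0.7) & (freqs <= 4.0)
        hr_bpm = 60 * freqs[band][np.argmax(spec[band])]
        print(f"estimated HR: {hr_bpm:.1f} bpm")      # ~72 expected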

  7. A Secure Alignment Algorithm for Mapping Short Reads to Human Genome.

    PubMed

    Zhao, Yongan; Wang, Xiaofeng; Tang, Haixu

    2018-05-09

    Elastic and inexpensive computing resources such as clouds have been recognized as a useful solution for analyzing massive human genomic data (e.g., acquired by using next-generation sequencers) in biomedical research. However, outsourcing human genome computation to public or commercial clouds has been hindered by privacy concerns: even a small number of human genome sequences contain sufficient information to identify the donor of the genomic data. This issue cannot be directly addressed by existing security and cryptographic techniques (such as homomorphic encryption), because they are too heavyweight to carry out practical genome computation tasks on massive data. In this article, we present a secure algorithm to accomplish read mapping, one of the most basic tasks in human genomic data analysis, based on a hybrid cloud computing model. Compared with existing approaches, our algorithm delegates most computation to the public cloud, performing only encryption and decryption on the private cloud, and thus makes maximum use of the computing resources of the public cloud. Furthermore, our algorithm reports results similar to those of nonsecure read mapping algorithms, including the alignment between reads and the reference genome, which can be used directly in downstream analysis such as the inference of genomic variations. We implemented the algorithm in C++ and Python on a hybrid cloud system, in which the public cloud uses an Apache Spark system.
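
    A conceptual sketch (not the authors' algorithm) of the hybrid-cloud division of labor described above: the private side keys and hashes k-mer seeds before upload, the public side matches only opaque hashes, and the private side interprets the hits. Every name and the toy sequences are illustrative; a real system adds seed extension, verification and stronger cryptographic protections.

        # Keyed-hash seed matching split between private and public clouds.
        import hmac, hashlib

        KEY = b"held-only-on-the-private-cloud"

        def seed_hash(kmer: str) -> str:
            return hmac.new(KEY, kmer.encode(), hashlib.sha256).hexdigest()

        def private_index(reference: str, k: int = 8):
            # Private cloud: hash every reference k-mer, keep a position map.
            # (Duplicate k-mers keep only the last position in this sketch.)
            return {seed_hash(reference[i:i + k]): i
                    for i in range(len(reference) - k + 1)}

        def public_match(read_hashes, hashed_index):
            # Public cloud: sees only opaque hashes, never raw sequence.
            return [hashed_index.get(h) for h in read_hashes]

        ref = "ACGTACGTTTGACCAGTACGGA"
        read = "TTGACCAG"
        index = private_index(ref)                    # built privately, uploaded
        hits = public_match([seed_hash(read)], index) # runs on the public cloud
        print("candidate mapping position:", hits[0]) # 8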

  8. Public Health Surveillance and Meaningful Use Regulations: A Crisis of Opportunity

    PubMed Central

    Sundwall, David N.

    2012-01-01

    The Health Information Technology for Economic and Clinical Health Act is intended to enhance reimbursement of health care providers for meaningful use of electronic health records systems. This presents both opportunities and challenges for public health departments. To earn incentive payments, clinical providers must exchange specified types of data with the public health system, such as immunization and syndromic surveillance data and notifiable disease reporting. However, a crisis looms because public health’s information technology systems largely lack the capabilities to accept the types of data proposed for exchange. Cloud computing may be a solution for public health information systems. Through shared computing resources, public health departments could reap the benefits of electronic reporting within federal funding constraints. PMID:22390523

  9. Public health surveillance and meaningful use regulations: a crisis of opportunity.

    PubMed

    Lenert, Leslie; Sundwall, David N

    2012-03-01

    The Health Information Technology for Economic and Clinical Health Act is intended to enhance reimbursement of health care providers for meaningful use of electronic health records systems. This presents both opportunities and challenges for public health departments. To earn incentive payments, clinical providers must exchange specified types of data with the public health system, such as immunization and syndromic surveillance data and notifiable disease reporting. However, a crisis looms because public health's information technology systems largely lack the capabilities to accept the types of data proposed for exchange. Cloud computing may be a solution for public health information systems. Through shared computing resources, public health departments could reap the benefits of electronic reporting within federal funding constraints.

  10. A Resource Management Tool for Public Health Continuity of Operations During Disasters

    PubMed Central

    Turner, Anne M.; Reeder, Blaine; Wallace, James C.

    2014-01-01

    Objective We developed and validated a user-centered information system to support the local planning of public health continuity of operations for the Community Health Services Division, Public Health - Seattle & King County, Washington. Methods The Continuity of Operations Data Analysis (CODA) system was designed as a prototype developed using requirements identified through participatory design. CODA uses open-source software that links personnel contact and licensing information with needed skills and clinic locations for 821 employees at 14 public health clinics in Seattle and King County. Using a web-based interface, CODA can visualize locations of personnel in relationship to clinics to assist clinic managers in allocating public health personnel and resources under dynamic conditions. Results Based on user input, the CODA prototype was designed as a low-cost, user-friendly system to inventory and manage public health resources. In emergency conditions, the system can run on a stand-alone battery-powered laptop computer. A formative evaluation by managers of multiple public health centers confirmed the prototype design’s usefulness. Emergency management administrators also provided positive feedback about the system during a separate demonstration. Conclusions Validation of the CODA information design prototype by public health managers and emergency management administrators demonstrates the potential usefulness of building a resource management system using open-source technologies and participatory design principles. PMID:24618165

  11. Tomography reconstruction methods for damage diagnosis of wood structure in construction field

    NASA Astrophysics Data System (ADS)

    Qiu, Qiwen; Lau, Denvid

    2018-03-01

    The structural integrity of wood building elements plays a critical role in public safety, which calls for effective methods of diagnosing internal damage in the wood. Conventionally, non-destructive testing (NDT) methods such as X-ray computed tomography, thermography, radar imaging reconstruction, ultrasonic tomography, nuclear magnetic imaging techniques, and sonic tomography have been used to obtain information about the internal structure of wood. In this paper, the applications, advantages and disadvantages of these traditional tomography methods are reviewed. Additionally, the present article gives an overview of a recently developed tomography approach that relies on the use of mechanical and electromagnetic waves for assessing the structural integrity of wood buildings. This developed tomography reconstruction method is believed to provide a more accurate, reliable, and comprehensive assessment of wood structural integrity.

  12. Cloud computing for genomic data analysis and collaboration.

    PubMed

    Langmead, Ben; Nellore, Abhinav

    2018-04-01

    Next-generation sequencing has made major strides in the past decade. Studies based on large sequencing data sets are growing in number, and public archives for raw sequencing data have been doubling in size every 18 months. Leveraging these data requires researchers to use large-scale computational resources. Cloud computing, a model whereby users rent computers and storage from large data centres, is a solution that is gaining traction in genomics research. Here, we describe how cloud computing is used in genomics for research and large-scale collaborations, and argue that its elasticity, reproducibility and privacy features make it ideally suited for the large-scale reanalysis of publicly available archived data, including privacy-protected data.

  13. Turbulence Measurements and Computations for the Prediction of Broadband Noise in High Bypass Ratio Fans

    NASA Technical Reports Server (NTRS)

    Devenport, William J.; Ragab, Saad A.

    2000-01-01

    Work was performed under this grant with a view to providing the experimental and computational results needed to improve the prediction of broadband stator noise in large bypass ratio aircraft engines. The central hypothesis of our study was that a large fraction of this noise was generated by the fan tip leakage vortices; more specifically, that these vortices are a significant component of the fan wake turbulence and that they contain turbulent eddies of a type that can produce significant broadband noise. To test this hypothesis we originally proposed experimental work and computations with the following objectives: (1) to build a large scale two-dimensional cascade with a tip gap and a stationary endwall that, as far as possible, simulates the fan tip geometry, (2) to build a moving endwall for use with the large scale cascade, (3) to measure, in detail, the turbulence structure and spectrum generated by the blade wake and tip leakage vortex, for both endwall configurations, (4) to use CFD to compute the flow and turbulence distributions for both the experimental configurations and the ADP fan, (5) to provide the experimental and CFD results for the cascades and the physical understanding gained from their study as a basis for improving the broadband noise prediction method. In large part these objectives have been achieved. The most important achievements and findings of our experimental and computational efforts are summarized below. The bibliography at the end of this report includes a list of all publications produced to date under this project. Note that this list is necessarily incomplete, as the task of publication (particularly in journal papers) continues.

  14. Cloud services on an astronomy data center

    NASA Astrophysics Data System (ADS)

    Solar, Mauricio; Araya, Mauricio; Farias, Humberto; Mardones, Diego; Wang, Zhong

    2016-08-01

    The research on computational methods for astronomy performed during the first phase of the Chilean Virtual Observatory (ChiVO) led to the development of functional prototypes, implementing state-of-the-art computational methods and proposing new algorithms and techniques. The ChiVO software architecture is based on the IVOA protocols and standards. These protocols and standards are grouped in layers, with emphasis on the application and data layers, because their basic standards define the minimum operation that a VO should support. As a preliminary verification, the current implementation works with a 1 TB dataset that comes from the reduction of ALMA cycle 0 data. This research was mainly focused on spectroscopic data cubes from the ALMA cycle 0 public data. As the dataset grows, with ALMA cycle 1 public data also increasing every month, data processing is becoming a major bottleneck for scientific research in astronomy. When designing the ChiVO, we focused on improving both computation and I/O costs, and this led us to configure a data center with 424 high-speed cores at 2.6 GHz, 1 PB of storage (distributed across hard disk drives (HDD) and solid state drives (SSD)) and a high-speed InfiniBand interconnect. We are developing a cloud-based e-infrastructure for ChiVO services, in order to have a coherent framework for developing novel web services for on-line data processing in the ChiVO. We are currently parallelizing these new algorithms and techniques using HPC tools to speed up big data processing, and we will report our results in terms of data size, data distribution, number of cores and response time, in order to compare different processing and storage configurations.

  15. Integrated system dynamics toolbox for water resources planning.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reno, Marissa Devan; Passell, Howard David; Malczynski, Leonard A.

    2006-12-01

    Public mediated resource planning is quickly becoming the norm rather than the exception. Unfortunately, supporting tools are lacking that interactively engage the public in the decision-making process and integrate over the myriad values that influence water policy. In the pages of this report we document the first steps toward developing a specialized decision framework to meet this need; specifically, a modular and generic resource-planning ''toolbox''. The technical challenge lies in the integration of the disparate systems of hydrology, ecology, climate, demographics, economics, policy and law, each of which influence the supply and demand for water. Specifically, these systems, their associated processes, and most importantly the constitutive relations that link them must be identified, abstracted, and quantified. For this reason, the toolbox forms a collection of process modules and constitutive relations that the analyst can ''swap'' in and out to model the physical and social systems unique to their problem. This toolbox with all of its modules is developed within the common computational platform of system dynamics linked to a Geographical Information System (GIS). Development of this resource-planning toolbox represents an important foundational element of the proposed interagency center for Computer Aided Dispute Resolution (CADRe). The Center's mission is to manage water conflict through the application of computer-aided collaborative decision-making methods. The Center will promote the use of decision-support technologies within collaborative stakeholder processes to help stakeholders find common ground and create mutually beneficial water management solutions. The Center will also serve to develop new methods and technologies to help federal, state and local water managers find innovative and balanced solutions to the nation's most vexing water problems. The toolbox is an important step toward achieving the technology development goals of this center.

  16. Secure Encapsulation and Publication of Biological Services in the Cloud Computing Environment

    PubMed Central

    Zhang, Weizhe; Wang, Xuehui; Lu, Bo; Kim, Tai-hoon

    2013-01-01

    Secure encapsulation and publication of bioinformatics software products based on web services are presented, and the basic functions of biological information processing are realized in the cloud computing environment. In the encapsulation phase, the workflow and functions of the bioinformatics software are analyzed, the encapsulation interfaces are designed, and the runtime interaction between users and computers is simulated. In the publication phase, the execution and management mechanisms and principles of the GRAM components are analyzed. Functions such as remote user job submission and job status query are implemented by using the GRAM components, and the services of the bioinformatics software are published to remote users. Finally, a basic prototype system of the biological cloud is achieved. PMID:24078906

  17. Secure encapsulation and publication of biological services in the cloud computing environment.

    PubMed

    Zhang, Weizhe; Wang, Xuehui; Lu, Bo; Kim, Tai-hoon

    2013-01-01

    Secure encapsulation and publication of bioinformatics software products based on web services are presented, and the basic functions of biological information processing are realized in the cloud computing environment. In the encapsulation phase, the workflow and functions of the bioinformatics software are analyzed, the encapsulation interfaces are designed, and the runtime interaction between users and computers is simulated. In the publication phase, the execution and management mechanisms and principles of the GRAM components are analyzed. Functions such as remote user job submission and job status query are implemented by using the GRAM components, and the services of the bioinformatics software are published to remote users. Finally, a basic prototype system of the biological cloud is achieved.

  18. Computer Aided Method for System Safety and Reliability Assessments

    DTIC Science & Technology

    2008-09-01

    program between 1998 and 2003. This tool was not marketed in the public domain after the CRV program ended. The other tool is called eXpress, and it...supports Government reviewed and approved analysis methodologies which can then be shared with other government agencies and industry partners...

  19. Modelling chemical reactions by QM/MM calculations: the case of the tautomerization in fireflies bioluminescent systems

    NASA Astrophysics Data System (ADS)

    Berraud-Pache, Romain; Garcia-Iriepa, Cristina; Navizet, Isabelle

    2018-04-01

    In less than half a century, the hybrid QM/MM method has become one of the most widely used techniques to model molecules embedded in a complex environment. A well-known application of the QM/MM method is to biological systems. Nowadays, one can understand how enzymatic reactions work or compute spectroscopic properties, like the wavelength of emission. Here, we have tackled the issue of modelling chemical reactions inside proteins. We have studied a bioluminescent system, fireflies, and deciphered whether a keto-enol tautomerization is possible inside the protein. The two tautomers are candidates to be the emissive molecule of the bioluminescence, but no definitive conclusion has been reached. One hypothesis is to consider a possible keto-enol tautomerization, as it has already been observed in water. A joint approach combining extensive MD simulations with the computation of key intermediates, such as transition states (TS), using QM/MM calculations is presented in this publication. We also describe the procedure and the difficulties met during this approach, in order to provide a guide for this kind of chemical reaction using QM/MM methods.
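
    As an illustrative follow-up, once a QM/MM study yields a keto-enol free-energy gap, the relative tautomer populations follow directly from Boltzmann statistics. The sketch below assumes a hypothetical 2 kcal/mol gap; the value is not taken from the paper.

        # Tautomer populations from a free-energy difference (Boltzmann).
        # dG below is a hypothetical placeholder, not a result from the paper.
        import math

        R = 1.987e-3      # gas constant, kcal/(mol K)
        T = 298.15        # temperature, K
        dG = 2.0          # hypothetical free-energy gap (enol - keto), kcal/mol

        K = math.exp(-dG / (R * T))       # equilibrium constant [enol]/[keto]
        frac_enol = K / (1 + K)
        print(f"K = {K:.3f}, enol fraction = {frac_enol:.1%}")   # ~3.3%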

  20. Probabilistic performance estimators for computational chemistry methods: The empirical cumulative distribution function of absolute errors

    NASA Astrophysics Data System (ADS)

    Pernot, Pascal; Savin, Andreas

    2018-06-01

    Benchmarking studies in computational chemistry use reference datasets to assess the accuracy of a method through error statistics. The commonly used error statistics, such as the mean signed and mean unsigned errors, do not inform end-users on the expected amplitude of prediction errors attached to these methods. We show that, the distributions of model errors being neither normal nor zero-centered, these error statistics cannot be used to infer prediction error probabilities. To overcome this limitation, we advocate for the use of more informative statistics, based on the empirical cumulative distribution function of unsigned errors, namely, (1) the probability for a new calculation to have an absolute error below a chosen threshold and (2) the maximal amplitude of errors one can expect with a chosen high confidence level. Those statistics are also shown to be well suited for benchmarking and ranking studies. Moreover, the standard error on all benchmarking statistics depends on the size of the reference dataset. Systematic publication of these standard errors would be very helpful to assess the statistical reliability of benchmarking conclusions.
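
    A minimal sketch of the two ECDF-based statistics advocated above, together with a bootstrap standard error to convey the dataset-size dependence of benchmarking statistics. The synthetic error sample is skewed and non-zero-centered to mimic the model-error distributions described in the abstract.

        # ECDF-based benchmarking statistics on absolute errors.
        import numpy as np

        rng = np.random.default_rng(0)
        errors = rng.gamma(2.0, 1.5, size=200) - 1.0   # synthetic signed errors
        abs_err = np.abs(errors)

        def p_below(e, thresh):
            return np.mean(e < thresh)    # ECDF evaluated at the threshold

        def q_high(e, conf=0.95):
            return np.quantile(e, conf)   # amplitude not exceeded at 'conf'

        print("P(|err| < 1.0) =", p_below(abs_err, 1.0))
        print("95% error amplitude =", round(float(q_high(abs_err)), 2))

        # Bootstrap standard error of the 95% amplitude (shrinks with more data).
        boots = [q_high(rng.choice(abs_err, size=abs_err.size))
                 for _ in range(2000)]
        print("std. error =", round(float(np.std(boots)), 2))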

  1. Modeling Chemical Reactions by QM/MM Calculations: The Case of the Tautomerization in Fireflies Bioluminescent Systems

    PubMed Central

    Berraud-Pache, Romain; Garcia-Iriepa, Cristina; Navizet, Isabelle

    2018-01-01

    In less than half a century, the hybrid QM/MM method has become one of the most widely used techniques to model molecules embedded in a complex environment. A well-known application of the QM/MM method is to biological systems. Nowadays, one can understand how enzymatic reactions work or compute spectroscopic properties, like the wavelength of emission. Here, we have tackled the issue of modeling chemical reactions inside proteins. We have studied a bioluminescent system, fireflies, and deciphered whether a keto-enol tautomerization is possible inside the protein. The two tautomers are candidates to be the emissive molecule of the bioluminescence, but no definitive conclusion has been reached. One hypothesis is to consider a possible keto-enol tautomerization, as it has already been observed in water. A joint approach combining extensive MD simulations with the computation of key intermediates, such as transition states (TS), using QM/MM calculations is presented in this publication. We also describe the procedure and the difficulties met during this approach, in order to provide a guide for this kind of chemical reaction using QM/MM methods. PMID:29719820

  2. Modeling Chemical Reactions by QM/MM Calculations: The Case of the Tautomerization in Fireflies Bioluminescent Systems.

    PubMed

    Berraud-Pache, Romain; Garcia-Iriepa, Cristina; Navizet, Isabelle

    2018-01-01

    In less than half a century, the hybrid QM/MM method has become one of the most widely used techniques to model molecules embedded in a complex environment. A well-known application of the QM/MM method is to biological systems. Nowadays, one can understand how enzymatic reactions work or compute spectroscopic properties, like the wavelength of emission. Here, we have tackled the issue of modeling chemical reactions inside proteins. We have studied a bioluminescent system, fireflies, and deciphered whether a keto-enol tautomerization is possible inside the protein. The two tautomers are candidates to be the emissive molecule of the bioluminescence, but no definitive conclusion has been reached. One hypothesis is to consider a possible keto-enol tautomerization, as it has already been observed in water. A joint approach combining extensive MD simulations with the computation of key intermediates, such as transition states (TS), using QM/MM calculations is presented in this publication. We also describe the procedure and the difficulties met during this approach, in order to provide a guide for this kind of chemical reaction using QM/MM methods.

  3. A General Introduction of the Earthquake Early Warning System Technology Developed in China

    NASA Astrophysics Data System (ADS)

    Wang, T.

    2015-12-01

    Since the Wenchuan earthquake in 2008, dramatic progress on earthquake early warning (EEW) has been made by the Institute of Care-life (ICL) in China. ICL's research on EEW covers the choice of appropriate sensors, methods of installing them, automatic processing of seismic-wave data for EEW, and methods of delivering EEW warnings to the public, schools, and life-line projects. ICL innovatively applies distributed computing and cloud computing technology. So far, ICL has deployed over 5500 EEW sensors in China, 5 times the number of EEW sensors in Japan, covering more than 2.1 million square kilometers. Since June 2011, over 5000 earthquakes, 28 of them destructive, have triggered the EEWS with no false alert. The root mean square (RMS) error of the magnitude estimates for the 28 destructive quakes is 0.32. In addition, innovative work has been done to suppress false and missed alarms, which pushes forward the application of EEW in China. The technology is also being applied in Nepal.

  4. Communicating credibly in an incredible environment-yucca mountain public outreach tools and techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sheldon, S.R.; Muller, E.

    Open disclosure and public understanding of major issues surrounding the Yucca Mountain Project is a consistent goal for Clark County, Nevada, which represents nearly 80 percent of Nevada's total population. Recent enhancements to the County's communication methods employ emerging technology as well as traditional public relations tactics. The County's communication methods engage the public through highly visual displays, exhibits, informative and entertaining video programs, school presentations, creative print inserts, public interaction and news media. The program provides information based on the county's research studies and findings on property values, the environment, tourism, public health and safety, increased costs for emergency services and the potential disproportionate effects on Native American tribes and other minority populations in the area. Multi-cultural Dialogue: Nevada, particularly southern Nevada and the Las Vegas area, has experienced explosive growth in the last decade. The fastest growing demographic groups in Nevada are Hispanics (nearly 23% in Las Vegas) and Asians (approx. 8%). Clark County's Nuclear Waste Multi-cultural Program is designed to reach residents from these emerging segments of the population. Educational video programs: While officially opposed to the project, Clark County is committed to providing Nevada residents with accurate, timely and objective information about Yucca Mountain and its potential impacts to the state. Since the actual operation of the repository, if approved by the Nuclear Regulatory Commission, is about a decade away, the program includes presentations for middle and high school students on age-appropriate topics. Work with indigenous tribes: American Indian tribes in Southern Nevada participated in an unprecedented video program presenting the unique views and perspectives of the American Indian tribes directly impacted by the proposed repository. Monitoring program: To track economic, fiscal and social changes over time, the monitoring program comprises indicators in several core areas, including environmental, economic, community well-being, fiscal, developmental, and public health and safety indicators. Its purpose is to highlight and monitor the most meaningful indicators of performance and perception in key service areas. The monitoring program is promoted within the public outreach program to make Nevada residents aware of this important resource of information. Internet Activities: Interactive quizzes, informational postings, electronic newsletters and podcasts draw a demographic that prefers getting information from computer sources. Lively, interesting and ethnically diverse podcast episodes provide access to audio shows, which can be downloaded to MP3 players or a standard computer. (authors)

  5. Next Generation Sequence Analysis and Computational Genomics Using Graphical Pipeline Workflows

    PubMed Central

    Torri, Federica; Dinov, Ivo D.; Zamanyan, Alen; Hobel, Sam; Genco, Alex; Petrosyan, Petros; Clark, Andrew P.; Liu, Zhizhong; Eggert, Paul; Pierce, Jonathan; Knowles, James A.; Ames, Joseph; Kesselman, Carl; Toga, Arthur W.; Potkin, Steven G.; Vawter, Marquis P.; Macciardi, Fabio

    2012-01-01

    Whole-genome and exome sequencing have already proven to be essential and powerful methods to identify genes responsible for simple Mendelian inherited disorders. These methods can be applied to complex disorders as well, and have been adopted as one of the current mainstream approaches in population genetics. These achievements have been made possible by next generation sequencing (NGS) technologies, which require substantial bioinformatics resources to analyze the dense and complex sequence data. The huge analytical burden of data from genome sequencing might be seen as a bottleneck slowing the publication of NGS papers at this time, especially in psychiatric genetics. We review the existing methods for processing NGS data, to place into context the rationale for the design of a computational resource. We describe our method, the Graphical Pipeline for Computational Genomics (GPCG), to perform the computational steps required to analyze NGS data. The GPCG implements flexible workflows for basic sequence alignment, sequence data quality control, single nucleotide polymorphism analysis, copy number variant identification, annotation, and visualization of results. These workflows cover all the analytical steps required for NGS data, from processing the raw reads to variant calling and annotation. The current version of the pipeline is freely available at http://pipeline.loni.ucla.edu. These applications of NGS analysis may gain clinical utility in the near future (e.g., identifying miRNA signatures in diseases) when the bioinformatics approach is made feasible. Taken together, the annotation tools and strategies that have been developed to retrieve information and test hypotheses about the functional role of variants present in the human genome will help to pinpoint the genetic risk factors for psychiatric disorders. PMID:23139896
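
    As a rough illustration of the command-line stages such a workflow automates (alignment, sorting and indexing, variant calling), here is a minimal wrapper sketch; the tools named (bwa, samtools, bcftools) are standard, but the specific commands, options, and file names are illustrative assumptions, not the GPCG's actual configuration:

      import subprocess

      def run(cmd):
          """Run one pipeline stage, echoing the command for provenance."""
          print("+", " ".join(cmd))
          subprocess.run(cmd, check=True)

      ref, reads, sample = "ref.fa", "reads.fq", "sample"   # hypothetical inputs

      run(["bwa", "index", ref])                        # build the reference index
      with open(f"{sample}.sam", "w") as sam:           # align the raw reads
          subprocess.run(["bwa", "mem", ref, reads], check=True, stdout=sam)
      run(["samtools", "sort", "-o", f"{sample}.bam", f"{sample}.sam"])
      run(["samtools", "index", f"{sample}.bam"])
      # pile up bases and call SNPs/short indels
      run(["bcftools", "mpileup", "-f", ref, "-o", f"{sample}.bcf", f"{sample}.bam"])
      run(["bcftools", "call", "-mv", "-o", f"{sample}.vcf", f"{sample}.bcf"])

    A graphical pipeline replaces such ad hoc scripts with validated, reusable modules and handles scheduling, provenance, and error recovery across compute resources.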

  6. reCAPTCHA: human-based character recognition via Web security measures.

    PubMed

    von Ahn, Luis; Maurer, Benjamin; McMillen, Colin; Abraham, David; Blum, Manuel

    2008-09-12

    CAPTCHAs (Completely Automated Public Turing test to tell Computers and Humans Apart) are widespread security measures on the World Wide Web that prevent automated programs from abusing online services. They do so by asking humans to perform a task that computers cannot yet perform, such as deciphering distorted characters. Our research explored whether such human effort can be channeled into a useful purpose: helping to digitize old printed material by asking users to decipher scanned words from books that computerized optical character recognition failed to recognize. We showed that this method can transcribe text with a word accuracy exceeding 99%, matching the guarantee of professional human transcribers. Our apparatus is deployed in more than 40,000 Web sites and has transcribed over 440 million words.

  7. NETTAB 2012 on "Integrated Bio-Search"

    PubMed Central

    2014-01-01

    The NETTAB 2012 workshop, held in Como on November 14-16, 2012, was devoted to "Integrated Bio-Search", that is, to technologies, methods, architectures, systems and applications for searching, retrieving, integrating and analyzing data, information and knowledge, with the aim of answering complex bio-medical-molecular questions: some of the most challenging issues in bioinformatics today. It brought together about 80 researchers working in the fields of Bioinformatics, Computational Biology, Biology, Computer Science and Engineering. More than 50 scientific contributions, including keynote and tutorial talks, oral communications, posters and software demonstrations, were presented at the workshop. This preface provides a brief overview of the workshop and introduces the peer-reviewed manuscripts that were accepted for publication in this Supplement. PMID:24564635

  8. 48 CFR 227.7202-1 - Policy.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... OF DEFENSE GENERAL CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7202-1 Policy. (a) Commercial computer software or commercial computer software documentation shall be acquired under the licenses customarily provided to the public...

  9. 48 CFR 227.7202-1 - Policy.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... OF DEFENSE GENERAL CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7202-1 Policy. (a) Commercial computer software or commercial computer software documentation shall be acquired under the licenses customarily provided to the public...

  10. 48 CFR 227.7202-1 - Policy.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... OF DEFENSE GENERAL CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7202-1 Policy. (a) Commercial computer software or commercial computer software documentation shall be acquired under the licenses customarily provided to the public...

  11. 48 CFR 227.7202-1 - Policy.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... OF DEFENSE GENERAL CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7202-1 Policy. (a) Commercial computer software or commercial computer software documentation shall be acquired under the licenses customarily provided to the public...

  12. 48 CFR 227.7202-1 - Policy.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... OF DEFENSE GENERAL CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7202-1 Policy. (a) Commercial computer software or commercial computer software documentation shall be acquired under the licenses customarily provided to the public...

  13. Librarian of the Year 2009: Team Cedar Rapids

    ERIC Educational Resources Information Center

    Berry, John N., III

    2009-01-01

    When the flood came to Cedar Rapids, IA, the Cedar Rapids Public Library (CRPL) lost 160,000 items, including large parts of its adult and youth collections, magazines, newspapers, reference materials, CDs, and DVDs. Most of its public access computers were destroyed, as were its computer lab and microfilm equipment. The automatic circulation and…

  14. Policy Issues in Computer Networks: Multi-Access Information Systems.

    ERIC Educational Resources Information Center

    Lyons, Patrice A.

    As computer databases become more publicly accessible through public networks, there is a growing need to provide effective protection for proprietary information. Without adequate assurances that their works will be protected, authors and other copyright owners may be reluctant to allow the full text of their works to be accessed through computer…

  15. Technology to the People: Kirsten Thompson--Milwaukee Public Library

    ERIC Educational Resources Information Center

    Library Journal, 2004

    2004-01-01

    Considering Kirsten Thompson's career in the Milwaukee Public Library (MPL), it is no surprise that as an undergraduate she studied both computer programming and psychology. The combination is the driving force behind her multiple projects at MPL, which have all focused on bringing technology to users. "Having the computers is great, but you…

  16. Computationally predicted Adverse Outcome Pathway networks for liver-related diseases using publicly available data sources: Case studies and lessons learned

    EPA Science Inventory

    The Adverse Outcome Pathway (AOP) framework summarizes key information about mechanistic events leading to an adverse health or ecological outcome. In recent years computationally predicted AOPs (cpAOP) making use of publicly available data have been proposed as a means of accele...

  17. Computer-Focused Russian Bilingual Instructional Program, 1988-89. OREA Report.

    ERIC Educational Resources Information Center

    Berney, Tomi D.; Gritzer, Glenn

    In its fourth year, the Computer-Focused Russian Bilingual Instructional Program provided instructional and support activities to 276 Russian-speaking students, most of whom were limited English proficient, at 4 public and 2 private high schools in Brooklyn. Instructional activities varied by site. Public school students took English as a Second…

  18. 45 CFR 1630.13 - Time.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 45 Public Welfare 4 2010-10-01 2010-10-01 false Time. 1630.13 Section 1630.13 Public Welfare... § 1630.13 Time. (a) Computation. Time limits specified in this part shall be computed in accordance with... recipient's written request for good cause, grant an extension of time and shall so notify the recipient in...

  19. 45 CFR 501.7 - Time.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 45 Public Welfare 3 2010-10-01 2010-10-01 false Time. 501.7 Section 501.7 Public Welfare..., DEPARTMENT OF JUSTICE RULES OF PRACTICE SUBPOENAS, DEPOSITIONS, AND OATHS § 501.7 Time. (a) Computation. In computing any period of time prescribed or allowed by the regulations, by order of the Commission, or by any...

  20. 45 CFR 501.7 - Time.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 45 Public Welfare 3 2012-10-01 2012-10-01 false Time. 501.7 Section 501.7 Public Welfare..., DEPARTMENT OF JUSTICE RULES OF PRACTICE SUBPOENAS, DEPOSITIONS, AND OATHS § 501.7 Time. (a) Computation. In computing any period of time prescribed or allowed by the regulations, by order of the Commission, or by any...

  1. 45 CFR 1630.13 - Time.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 45 Public Welfare 4 2012-10-01 2012-10-01 false Time. 1630.13 Section 1630.13 Public Welfare... § 1630.13 Time. (a) Computation. Time limits specified in this part shall be computed in accordance with... recipient's written request for good cause, grant an extension of time and shall so notify the recipient in...

  2. 45 CFR 501.7 - Time.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 45 Public Welfare 3 2013-10-01 2013-10-01 false Time. 501.7 Section 501.7 Public Welfare..., DEPARTMENT OF JUSTICE RULES OF PRACTICE SUBPOENAS, DEPOSITIONS, AND OATHS § 501.7 Time. (a) Computation. In computing any period of time prescribed or allowed by the regulations, by order of the Commission, or by any...

  3. 45 CFR 501.7 - Time.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 45 Public Welfare 3 2011-10-01 2011-10-01 false Time. 501.7 Section 501.7 Public Welfare..., DEPARTMENT OF JUSTICE RULES OF PRACTICE SUBPOENAS, DEPOSITIONS, AND OATHS § 501.7 Time. (a) Computation. In computing any period of time prescribed or allowed by the regulations, by order of the Commission, or by any...

  4. 45 CFR 1630.13 - Time.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 45 Public Welfare 4 2011-10-01 2011-10-01 false Time. 1630.13 Section 1630.13 Public Welfare... § 1630.13 Time. (a) Computation. Time limits specified in this part shall be computed in accordance with... recipient's written request for good cause, grant an extension of time and shall so notify the recipient in...

  5. 45 CFR 1630.13 - Time.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 45 Public Welfare 4 2013-10-01 2013-10-01 false Time. 1630.13 Section 1630.13 Public Welfare... § 1630.13 Time. (a) Computation. Time limits specified in this part shall be computed in accordance with... recipient's written request for good cause, grant an extension of time and shall so notify the recipient in...

  6. 45 CFR 1630.13 - Time.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 45 Public Welfare 4 2014-10-01 2014-10-01 false Time. 1630.13 Section 1630.13 Public Welfare... § 1630.13 Time. (a) Computation. Time limits specified in this part shall be computed in accordance with... recipient's written request for good cause, grant an extension of time and shall so notify the recipient in...

  7. 45 CFR 501.7 - Time.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 45 Public Welfare 3 2014-10-01 2014-10-01 false Time. 501.7 Section 501.7 Public Welfare..., DEPARTMENT OF JUSTICE RULES OF PRACTICE SUBPOENAS, DEPOSITIONS, AND OATHS § 501.7 Time. (a) Computation. In computing any period of time prescribed or allowed by the regulations, by order of the Commission, or by any...

  8. Localization of optic disc and fovea in retinal images using intensity based line scanning analysis.

    PubMed

    Kamble, Ravi; Kokare, Manesh; Deshmukh, Girish; Hussin, Fawnizu Azmadi; Mériaudeau, Fabrice

    2017-08-01

    Accurate detection of diabetic retinopathy (DR) depends mainly on the identification of retinal landmarks such as the optic disc and fovea. Existing methods suffer from limited accuracy and high computational complexity. To address this issue, this paper presents a novel approach for fast and accurate localization of the optic disc (OD) and fovea using one-dimensional scanned intensity profile analysis. The proposed method effectively uses both time- and frequency-domain information for localization of the OD. The final OD center is located using signal peak-valley detection in the time domain and discontinuity detection in the frequency domain. The fovea center is then located, with the help of the detected OD location, using signal valley analysis. Experiments were conducted on the MESSIDOR dataset, where the OD was successfully located in 1197 out of 1200 images (99.75%) and the fovea in 1196 out of 1200 images (99.66%), with an average computation time of 0.52 s. A large-scale evaluation was carried out on nine publicly available databases. The proposed method is highly efficient at quickly and accurately localizing the OD and fovea together, compared with other state-of-the-art methods. Copyright © 2017 Elsevier Ltd. All rights reserved.
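
    The record describes, but does not show, the line-scanning idea. The toy sketch below (synthetic image, illustrative parameters) reproduces only the basic time-domain ingredient, scanning one-dimensional intensity profiles and keeping the strongest peak; the published method's frequency-domain discontinuity test and valley analysis are not reproduced here:

      import numpy as np
      from scipy.signal import find_peaks

      def locate_bright_peak(gray_image):
          """Scan each row's intensity profile and return the (row, column)
          whose smoothed profile contains the most prominent peak."""
          best = (0, 0, -np.inf)
          kernel = np.ones(15) / 15.0                # moving-average smoothing
          for r, row in enumerate(gray_image):
              profile = np.convolve(row, kernel, mode="same")
              peaks, props = find_peaks(profile, prominence=0.05)
              if peaks.size:
                  i = np.argmax(props["prominences"])
                  if props["prominences"][i] > best[2]:
                      best = (r, peaks[i], props["prominences"][i])
          return best[:2]

      # synthetic test: dark background with one bright disc-like blob
      yy, xx = np.mgrid[:100, :100]
      img = np.exp(-((yy - 40) ** 2 + (xx - 70) ** 2) / 50.0)
      print(locate_bright_peak(img))                 # approximately (40, 70)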

  9. Automated detection of dark and bright lesions in retinal images for early detection of diabetic retinopathy.

    PubMed

    Akram, Usman M; Khan, Shoab A

    2012-10-01

    There is an ever-increasing interest in the development of automatic medical diagnosis systems due to the advancement in computing technology and also to improve the service by medical community. The knowledge about health and disease is required for reliable and accurate medical diagnosis. Diabetic Retinopathy (DR) is one of the most common causes of blindness and it can be prevented if detected and treated early. DR has different signs and the most distinctive are microaneurysm and haemorrhage which are dark lesions and hard exudates and cotton wool spots which are bright lesions. Location and structure of blood vessels and optic disk play important role in accurate detection and classification of dark and bright lesions for early detection of DR. In this article, we propose a computer aided system for the early detection of DR. The article presents algorithms for retinal image preprocessing, blood vessel enhancement and segmentation and optic disk localization and detection which eventually lead to detection of different DR lesions using proposed hybrid fuzzy classifier. The developed methods are tested on four different publicly available databases. The presented methods are compared with recently published methods and the results show that presented methods outperform all others.

  10. Fast and accurate reference-free alignment of subtomograms.

    PubMed

    Chen, Yuxiang; Pfeffer, Stefan; Hrabe, Thomas; Schuller, Jan Michael; Förster, Friedrich

    2013-06-01

    In cryoelectron tomography, alignment and averaging of subtomograms, each depicting the same macromolecule, improves the resolution compared to the individual subtomograms. Major challenges of subtomogram alignment are noise enhancement due to overfitting, the bias of an initial reference in the iterative alignment process, and the computational cost of processing increasingly large amounts of data. Here, we propose an efficient and accurate alignment algorithm via a generalized convolution theorem, which allows computation of a constrained correlation function using spherical harmonics. This formulation dramatically increases the computational speed of rotational matching compared to rotation search in Cartesian space and, in contrast to other spherical-harmonics-based approaches, does so without sacrificing accuracy. Using this sampling method, a reference-free alignment procedure is proposed to tackle reference bias and overfitting, which also includes contrast transfer function correction by Wiener filtering. Application of the method to simulated data allowed us to obtain resolutions near the ground truth. For two experimental datasets, ribosomes from yeast lysate and purified 20S proteasomes, we achieved reconstructions of approximately 20 Å and 16 Å, respectively. The software is ready-to-use and has been made public to the community. Copyright © 2013 Elsevier Inc. All rights reserved.
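
    The speed-up rests on a convolution theorem: correlation over all rotations is computed from Fourier coefficients rather than by explicit rotation search. A much simplified one-dimensional analogue (cyclic rotations of a periodic signal, not 3D rotations via spherical harmonics) illustrates the principle:

      import numpy as np

      def rotational_correlation(f, g):
          """Correlation of two periodic signals over every cyclic rotation,
          in O(N log N) via the FFT instead of O(N^2) by explicit search."""
          F, G = np.fft.fft(f), np.fft.fft(g)
          return np.fft.ifft(F * np.conj(G)).real

      n = 360
      theta = np.linspace(0, 2 * np.pi, n, endpoint=False)
      f = np.cos(3 * theta) + 0.5 * np.sin(7 * theta)  # signal on the circle
      g = np.roll(f, 45)                               # same signal, rotated
      corr = rotational_correlation(g, f)
      print(np.argmax(corr))                           # recovers the 45-sample shift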

  11. ELSI Bibliography: Ethical legal and social implications of the Human Genome Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yesley, M.S.

    This second edition of the ELSI Bibliography provides a current and comprehensive resource for identifying publications on the major topics related to the ethical, legal and social issues (ELSI) of the Human Genome Project. Since the first edition of the ELSI Bibliography was printed last year, new publications and earlier ones identified by additional searching have doubled our computer database of ELSI publications to over 5600 entries. The second edition of the ELSI Bibliography reflects this growth of the underlying computer database. Researchers should note that an extensive collection of publications in the database is available for public use at the General Law Library of Los Alamos National Laboratory (LANL).

  12. Key Lessons in Building "Data Commons": The Open Science Data Cloud Ecosystem

    NASA Astrophysics Data System (ADS)

    Patterson, M.; Grossman, R.; Heath, A.; Murphy, M.; Wells, W.

    2015-12-01

    Cloud computing technology has created a shift around data and data analysis by allowing researchers to push computation to data as opposed to having to pull data to an individual researcher's computer. Subsequently, cloud-based resources can provide unique opportunities to capture computing environments used both to access raw data in its original form and also to create analysis products which may be the source of data for tables and figures presented in research publications. Since 2008, the Open Cloud Consortium (OCC) has operated the Open Science Data Cloud (OSDC), which provides scientific researchers with computational resources for storing, sharing, and analyzing large (terabyte and petabyte-scale) scientific datasets. OSDC has provided compute and storage services to over 750 researchers in a wide variety of data intensive disciplines. Recently, internal users have logged about 2 million core hours each month. The OSDC also serves the research community by colocating these resources with access to nearly a petabyte of public scientific datasets in a variety of fields also accessible for download externally by the public. In our experience operating these resources, researchers are well served by "data commons," meaning cyberinfrastructure that colocates data archives, computing, and storage infrastructure and supports essential tools and services for working with scientific data. In addition to the OSDC public data commons, the OCC operates a data commons in collaboration with NASA and is developing a data commons for NOAA datasets. As cloud-based infrastructures for distributing and computing over data become more pervasive, we ask, "What does it mean to publish data in a data commons?" Here we present the OSDC perspective and discuss several services that are key in architecting data commons, including digital identifier services.

  13. Z-boson decays to a vector quarkonium plus a photon

    NASA Astrophysics Data System (ADS)

    Bodwin, Geoffrey T.; Chung, Hee Sok; Ee, June-Haak; Lee, Jungil

    2018-01-01

    We compute the decay rates for the processes Z → V + γ, where Z is the Z boson, γ is the photon, and V is one of the vector quarkonia J/ψ or ϒ(nS), with n = 1, 2, or 3. Our computations include corrections through relative orders α_s and v² and resummations of logarithms of m_Z²/m_Q², to all orders in α_s, at next-to-leading-logarithmic accuracy. (v is the velocity of the heavy quark Q or the heavy antiquark Q̄ in the quarkonium rest frame, and m_Z and m_Q are the masses of Z and Q, respectively.) Our calculations are the first to include both the order-α_s correction to the light-cone distribution amplitude and the resummation of logarithms of m_Z²/m_Q², and they are the first calculations for the ϒ(2S) and ϒ(3S) final states. The resummations of logarithms of m_Z²/m_Q² that are associated with the order-α_s and order-v² corrections are carried out by making use of the Abel-Padé method. We confirm the analytic result for the order-v² correction that was presented in a previous publication, and we correct the relative sign of the direct and indirect amplitudes and some choices of scales in that publication. Our branching fractions for Z → J/ψ + γ and Z → ϒ(1S) + γ differ by 2.0σ and −4.0σ, respectively, from the branching fractions that are given in the most recent publication on this topic (in units of the uncertainties that are given in that publication). However, we argue that the uncertainties in the rates are underestimated in that publication.

  14. Computational Approaches to Chemical Hazard Assessment

    PubMed Central

    Luechtefeld, Thomas; Hartung, Thomas

    2018-01-01

    Summary Computational prediction of toxicity has reached new heights as a result of decades of growth in the magnitude and diversity of biological data. Public packages for statistics and machine learning make model creation faster. New theory in machine learning and cheminformatics enables integration of chemical structure, toxicogenomics, simulated and physical data in the prediction of chemical health hazards, and other toxicological information. Our earlier publications have characterized a toxicological dataset of unprecedented scale resulting from the European REACH legislation (Registration Evaluation Authorisation and Restriction of Chemicals). These publications dove into potential use cases for regulatory data and some models for exploiting this data. This article analyzes the options for the identification and categorization of chemicals, moves on to the derivation of descriptive features for chemicals, discusses different kinds of targets modeled in computational toxicology, and ends with a high-level perspective of the algorithms used to create computational toxicology models. PMID:29101769

  15. Dictionary-driven protein annotation

    PubMed Central

    Rigoutsos, Isidore; Huynh, Tien; Floratos, Aris; Parida, Laxmi; Platt, Daniel

    2002-01-01

    Computational methods seeking to automatically determine the properties (functional, structural, physicochemical, etc.) of a protein directly from the sequence have long been the focus of numerous research groups. With the advent of advanced sequencing methods and systems, the number of amino acid sequences that are being deposited in the public databases has been increasing steadily. This has in turn generated a renewed demand for automated approaches that can annotate individual sequences and complete genomes quickly, exhaustively and objectively. In this paper, we present one such approach that is centered around and exploits the Bio-Dictionary, a collection of amino acid patterns that completely covers the natural sequence space and can capture functional and structural signals that have been reused during evolution, within and across protein families. Our annotation approach also makes use of a weighted, position-specific scoring scheme that is unaffected by the over-representation of well-conserved proteins and protein fragments in the databases used. For a given query sequence, the method permits one to determine, in a single pass, the following: local and global similarities between the query and any protein already present in a public database; the likeness of the query to all available archaeal/bacterial/eukaryotic/viral sequences in the database as a function of amino acid position within the query; the character of secondary structure of the query as a function of amino acid position within the query; the cytoplasmic, transmembrane or extracellular behavior of the query; the nature and position of binding domains, active sites, post-translationally modified sites, signal peptides, etc. In terms of performance, the proposed method is exhaustive, objective and allows for the rapid annotation of individual sequences and full genomes. Annotation examples are presented and discussed in Results, including individual queries and complete genomes that were released publicly after we built the Bio-Dictionary that is used in our experiments. Finally, we have computed the annotations of more than 70 complete genomes and made them available on the World Wide Web at http://cbcsrv.watson.ibm.com/Annotations/. PMID:12202776
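
    As background, here is what a weighted, position-specific scoring scheme looks like in its simplest textbook form (a generic sketch, not the Bio-Dictionary's actual scheme; the pattern occurrences and pseudocount are made up):

      import math
      from collections import Counter

      AA = "ACDEFGHIKLMNPQRSTVWY"

      def build_profile(occurrences, pseudo=1.0):
          """Log-odds score per position from aligned pattern occurrences,
          against a uniform amino-acid background."""
          bg = 1.0 / len(AA)
          total = len(occurrences) + pseudo * len(AA)
          profile = []
          for i in range(len(occurrences[0])):
              counts = Counter(s[i] for s in occurrences)
              profile.append({a: math.log(((counts[a] + pseudo) / total) / bg)
                              for a in AA})
          return profile

      def score(profile, window):
          return sum(col[a] for col, a in zip(profile, window))

      prof = build_profile(["ACDE", "ACDF", "ACEE"])   # toy occurrences
      print(score(prof, "ACDE"), score(prof, "WYWY"))  # high vs. low score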

  16. Public library computer training for older adults to access high-quality Internet health information

    PubMed Central

    Xie, Bo; Bugg, Julie M.

    2010-01-01

    An innovative experiment to develop and evaluate a public library computer training program to teach older adults to access and use high-quality Internet health information involved a productive collaboration among public libraries, the National Institute on Aging and the National Library of Medicine of the National Institutes of Health (NIH), and a Library and Information Science (LIS) academic program at a state university. One hundred and thirty-one older adults aged 54–89 participated in the study between September 2007 and July 2008. Key findings include: a) participants had overwhelmingly positive perceptions of the training program; b) after learning about two NIH websites (http://nihseniorhealth.gov and http://medlineplus.gov) from the training, many participants started using these online resources to find high quality health and medical information and, further, to guide their decision-making regarding a health- or medically-related matter; and c) computer anxiety significantly decreased (p < .001) while computer interest and efficacy significantly increased (p = .001 and p < .001, respectively) from pre- to post-training, suggesting statistically significant improvements in computer attitudes between pre- and post-training. The findings have implications for public libraries, LIS academic programs, and other organizations interested in providing similar programs in their communities. PMID:20161649

  17. Adoption and implementation of a computer-delivered HIV/STD risk-reduction intervention for African American adolescent females seeking services at county health departments: implementation optimization is urgently needed.

    PubMed

    DiClemente, Ralph J; Bradley, Erin; Davis, Teaniese L; Brown, Jennifer L; Ukuku, Mary; Sales, Jessica M; Rose, Eve S; Wingood, Gina M

    2013-06-01

    Although group-delivered HIV/sexually transmitted disease (STD) risk-reduction interventions for African American adolescent females have proven efficacious, they require significant financial and staffing resources to implement and may not be feasible in personnel- and resource-constrained public health clinics. We conducted a study assessing adoption and implementation of an evidence-based HIV/STD risk-reduction intervention that was translated from a group-delivered modality to a computer-delivered modality to facilitate use in county public health departments. Usage of the computer-delivered intervention was low across 8 participating public health clinics. Further investigation is needed to optimize implementation by identifying, understanding, and surmounting barriers that hamper timely and efficient implementation of technology-delivered HIV/STD risk-reduction interventions in county public health clinics.

  18. Adoption and Implementation of a Computer-delivered HIV/STD Risk-Reduction Intervention for African American Adolescent Females Seeking Services at County Health Departments: Implementation Optimization is Urgently Needed

    PubMed Central

    DiClemente, Ralph J.; Bradley, Erin; Davis, Teaniese L.; Brown, Jennifer L.; Ukuku, Mary; Sales, Jessica M.; Rose, Eve S.; Wingood, Gina M.

    2013-01-01

    Although group-delivered HIV/STD risk-reduction interventions for African American adolescent females have proven efficacious, they require significant financial and staffing resources to implement and may not be feasible in personnel- and resource-constrained public health clinics. We conducted a study assessing adoption and implementation of an evidence-based HIV/STD risk-reduction intervention that was translated from a group-delivered modality to a computer-delivered modality to facilitate use in county public health departments. Usage of the computer-delivered intervention was low across eight participating public health clinics. Further investigation is needed to optimize implementation by identifying, understanding and surmounting barriers that hamper timely and efficient implementation of technology-delivered HIV/STD risk-reduction interventions in county public health clinics. PMID:23673891

  19. 77 FR 45345 - DOE/Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-31

    ... Recompetition results for Scientific Discovery through Advanced Computing (SciDAC) applications Co-design Public... DEPARTMENT OF ENERGY DOE/Advanced Scientific Computing Advisory Committee AGENCY: Office of... the Advanced Scientific Computing Advisory Committee (ASCAC). The Federal Advisory Committee Act (Pub...

  20. Outline of CS application experiments

    NASA Astrophysics Data System (ADS)

    Otsu, Y.; Kondoh, K.; Matsumoto, M.

    1985-09-01

    To promote and investigate the practical application of satellite use, CS application experiments were performed for various social needs, including those of public services such as the National Police Agency and the Japanese National Railways, computer network services, news material transmission, and advanced teleconferencing. Public-service satellite communications systems were developed and tested. Based on the results obtained, several public services have implemented CS-2 for practical disaster-backup use. Practical-application computer network and enhanced video-conference experiments have also been performed.

  1. Aggregated Computational Toxicology Online Resource

    EPA Pesticide Factsheets

    Aggregated Computational Toxicology Online Resource (ACToR) is EPA's online aggregator of all the public sources of chemical toxicity data. ACToR aggregates data from over 1,000 public sources on over 500,000 chemicals and is searchable by chemical name, other identifiers, and chemical structure. It can be used to query a specific chemical and find all publicly available hazard, exposure, and risk-assessment data. It also provides access to EPA's ToxCast, ToxRefDB, DSSTox, and Dashboard data.

  2. Steponas Kolupaila's contribution to hydrological science development

    NASA Astrophysics Data System (ADS)

    Valiuškevičius, Gintaras

    2017-08-01

    Steponas Kolupaila (1892-1964) was an important figure in 20th-century hydrology and one of the pioneers of scientific water gauging in Europe. His research on the reliability of hydrological data and measurement methods was particularly important and contributed to the development of empirical hydrological calculation methods. Kolupaila was one of the first to standardise water-gauging methods internationally. He created several original hydrological and hydraulic calculation methods (his discharge assessment method for the winter period was particularly significant). His innate abilities and frequent travel made Kolupaila a universal specialist in various fields and an active public figure. He revealed his multilayered scientific and cultural experience in his most famous book, Bibliography of Hydrometry. This book introduced the unique European hydrological measurement and computation methods of the time to the community of world hydrologists and allowed the development and adaptation of these methods across the world.

  3. 41 CFR 105-64.110 - When may GSA establish computer matching programs?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... computer matching programs? 105-64.110 Section 105-64.110 Public Contracts and Property Management Federal... GSA establish computer matching programs? (a) System managers will establish computer matching... direction of the GSA Data Integrity Board that will be established when and if computer matching programs...

  4. 41 CFR 105-64.110 - When may GSA establish computer matching programs?

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... computer matching programs? 105-64.110 Section 105-64.110 Public Contracts and Property Management Federal... GSA establish computer matching programs? (a) System managers will establish computer matching... direction of the GSA Data Integrity Board that will be established when and if computer matching programs...

  5. 41 CFR 105-64.110 - When may GSA establish computer matching programs?

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... computer matching programs? 105-64.110 Section 105-64.110 Public Contracts and Property Management Federal... GSA establish computer matching programs? (a) System managers will establish computer matching... direction of the GSA Data Integrity Board that will be established when and if computer matching programs...

  6. 41 CFR 105-64.110 - When may GSA establish computer matching programs?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... computer matching programs? 105-64.110 Section 105-64.110 Public Contracts and Property Management Federal... GSA establish computer matching programs? (a) System managers will establish computer matching... direction of the GSA Data Integrity Board that will be established when and if computer matching programs...

  7. 41 CFR 105-64.110 - When may GSA establish computer matching programs?

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... computer matching programs? 105-64.110 Section 105-64.110 Public Contracts and Property Management Federal... GSA establish computer matching programs? (a) System managers will establish computer matching... direction of the GSA Data Integrity Board that will be established when and if computer matching programs...

  8. Prevalence and Correlates of Problematic Internet Experiences and Computer-Using Time: A Two-Year Longitudinal Study in Korean School Children

    PubMed Central

    Stewart, Robert; Lee, Ju-Yeon; Kim, Jae-Min; Kim, Sung-Wan; Shin, Il-Seon; Yoon, Jin-Sang

    2014-01-01

    Objective To measure the prevalence of and factors associated with online inappropriate sexual exposure, cyber-bullying victimisation, and computer-using time in early adolescence. Methods A two-year, prospective school survey was performed with 1,173 children aged 13 at baseline. Data collected included demographic factors, bullying experience, depression, anxiety, coping strategies, self-esteem, psychopathology, attention-deficit hyperactivity disorder symptoms, and school performance. These factors were investigated in relation to problematic Internet experiences and computer-using time at age 15. Results The prevalence of online inappropriate sexual exposure, cyber-bullying victimisation, academic-purpose computer overuse, and game-purpose computer overuse was 31.6%, 19.2%, 8.5%, and 21.8%, respectively, at age 15. Having older siblings, more weekly pocket money, depressive symptoms, anxiety symptoms, and passive coping strategy were associated with reported online sexual harassment. Male gender, depressive symptoms, and anxiety symptoms were associated with reported cyber-bullying victimisation. Female gender was associated with academic-purpose computer overuse, while male gender, lower academic level, increased height, and having older siblings were associated with game-purpose computer-overuse. Conclusion Different environmental and psychological factors predicted different aspects of problematic Internet experiences and computer-using time. This knowledge is important for framing public health interventions to educate adolescents about, and prevent, internet-derived problems. PMID:24605120

  9. Unconstrained and contactless hand geometry biometrics.

    PubMed

    de-Santos-Sierra, Alberto; Sánchez-Ávila, Carmen; Del Pozo, Gonzalo Bailador; Guerra-Casanova, Javier

    2011-01-01

    This paper presents a hand biometric system for contact-less, platform-free scenarios, proposing innovative methods in feature extraction, template creation, and template matching. The evaluation of the proposed method considers both the use of three contact-less, publicly available hand databases and a comparison of its performance to two competitive pattern recognition techniques from the literature: support vector machines (SVM) and k-nearest neighbour (k-NN). The results highlight that the proposed method outperforms existing approaches in the literature in terms of computational cost, accuracy in human identification, number of extracted features, and number of samples required for template creation. The proposed method is a suitable solution for human identification in contact-less scenarios based on hand biometrics, providing a feasible solution for devices with limited hardware, such as mobile devices.

  10. Unconstrained and Contactless Hand Geometry Biometrics

    PubMed Central

    de-Santos-Sierra, Alberto; Sánchez-Ávila, Carmen; del Pozo, Gonzalo Bailador; Guerra-Casanova, Javier

    2011-01-01

    This paper presents a hand biometric system for contact-less, platform-free scenarios, proposing innovative methods in feature extraction, template creation, and template matching. The evaluation of the proposed method considers both the use of three contact-less, publicly available hand databases and a comparison of its performance to two competitive pattern recognition techniques from the literature: Support Vector Machines (SVM) and k-Nearest Neighbour (k-NN). The results highlight that the proposed method outperforms existing approaches in the literature in terms of computational cost, accuracy in human identification, number of extracted features, and number of samples required for template creation. The proposed method is a suitable solution for human identification in contact-less scenarios based on hand biometrics, providing a feasible solution for devices with limited hardware, such as mobile devices. PMID:22346634
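
    For readers unfamiliar with the k-NN baseline used in the comparison, the sketch below shows k-NN identification on precomputed feature vectors (the feature extraction that is the paper's actual contribution is not shown; the data here are synthetic):

      import numpy as np

      def knn_identify(gallery, labels, probe, k=3):
          """Return the majority identity among the k gallery templates
          nearest to the probe (Euclidean distance)."""
          d = np.linalg.norm(gallery - probe, axis=1)
          nearest = labels[np.argsort(d)[:k]]
          ids, counts = np.unique(nearest, return_counts=True)
          return ids[np.argmax(counts)]

      rng = np.random.default_rng(1)
      # three subjects, five enrolled 4-dimensional templates each
      gallery = np.vstack([rng.normal(i, 0.3, size=(5, 4)) for i in range(3)])
      labels = np.repeat(np.arange(3), 5)
      print(knn_identify(gallery, labels, rng.normal(1, 0.3, size=4)))  # likely 1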

  11. Nonlinear Krylov and moving nodes in the method of lines

    NASA Astrophysics Data System (ADS)

    Miller, Keith

    2005-11-01

    We report on some successes and problem areas in the Method of Lines from our work with moving node finite element methods. First, we report on our "nonlinear Krylov accelerator" for the modified Newton's method on the nonlinear equations of our stiff ODE solver. Since 1990 it has been robust, simple, cheap, and automatic on all our moving node computations. We publicize further trials with it here because it should be of great general usefulness to all those solving evolutionary equations. Second, we discuss the need for reliable automatic choice of spatially variable time steps. Third, we discuss the need for robust and efficient iterative solvers for the difficult linearized equations (Jx=b) of our stiff ODE solver. Here, the 1997 thesis of Zulu Xaba has made significant progress.
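
    Miller's accelerator belongs to the family of methods now commonly called Anderson acceleration. The sketch below shows that general idea for a fixed-point iteration (a generic illustration under that assumption, not his exact algorithm):

      import numpy as np

      def accelerated_fixed_point(g, x0, m=5, tol=1e-10, maxit=100):
          """Accelerate x <- g(x) by combining the last m residuals,
          in the spirit of nonlinear-Krylov/Anderson acceleration."""
          x = np.asarray(x0, dtype=float)
          X, R = [], []                       # histories of iterates, residuals
          for _ in range(maxit):
              r = g(x) - x                    # residual of the fixed-point map
              if np.linalg.norm(r) < tol:
                  return x
              X.append(x.copy()); R.append(r.copy())
              X, R = X[-m:], R[-m:]
              if len(R) == 1:
                  x = x + r                   # plain Picard step to start
              else:
                  dR = np.column_stack([R[i+1] - R[i] for i in range(len(R)-1)])
                  dX = np.column_stack([X[i+1] - X[i] for i in range(len(X)-1)])
                  # least-squares weights minimizing the combined residual
                  gamma = np.linalg.lstsq(dR, r, rcond=None)[0]
                  x = x + r - (dX + dR) @ gamma
          return x

      # usage: solve x = cos(x) componentwise, a simple contraction
      print(accelerated_fixed_point(np.cos, np.array([1.0, 0.5])))  # ~0.739085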

  12. Research on Key Technologies of Cloud Computing

    NASA Astrophysics Data System (ADS)

    Zhang, Shufen; Yan, Hongcan; Chen, Xuebin

    With the development of multi-core processors, virtualization, distributed storage, broadband Internet, and automatic management, a new computing mode named cloud computing has emerged. It distributes computation tasks over a resource pool consisting of massive numbers of computers, so that application systems can obtain computing power, storage space, and software services according to their demand. It can concentrate all the computing resources and manage them automatically through software, without human intervention. This frees application providers from tedious details, letting them focus on their business; it favors innovation and reduces cost. The ultimate goal of cloud computing is to provide computation, services, and applications as a public utility, so that people can use computer resources just as they use water, electricity, gas, and telephone service. Currently, the understanding of cloud computing is developing and changing constantly, and cloud computing still has no unanimous definition. This paper describes the three main service forms of cloud computing: SaaS, PaaS, and IaaS; compares the definitions of cloud computing given by Google, Amazon, IBM, and other companies; summarizes the basic characteristics of cloud computing; and discusses key technologies such as data storage, data management, virtualization, and the programming model.

  13. Google Earth Engine: a new cloud-computing platform for global-scale earth observation data and analysis

    NASA Astrophysics Data System (ADS)

    Moore, R. T.; Hansen, M. C.

    2011-12-01

    Google Earth Engine is a new technology platform that enables monitoring and measurement of changes in the earth's environment, at planetary scale, on a large catalog of earth observation data. The platform offers intrinsically-parallel computational access to thousands of computers in Google's data centers. Initial efforts have focused primarily on global forest monitoring and measurement, in support of REDD+ activities in the developing world. The intent is to put this platform into the hands of scientists and developing world nations, in order to advance the broader operational deployment of existing scientific methods, and strengthen the ability for public institutions and civil society to better understand, manage and report on the state of their natural resources. Earth Engine currently hosts online nearly the complete historical Landsat archive of L5 and L7 data collected over more than twenty-five years. Newly-collected Landsat imagery is downloaded from USGS EROS Center into Earth Engine on a daily basis. Earth Engine also includes a set of historical and current MODIS data products. The platform supports generation, on-demand, of spatial and temporal mosaics, "best-pixel" composites (for example to remove clouds and gaps in satellite imagery), as well as a variety of spectral indices. Supervised learning methods are available over the Landsat data catalog. The platform also includes a new application programming framework, or "API", that allows scientists access to these computational and data resources, to scale their current algorithms or develop new ones. Under the covers of the Google Earth Engine API is an intrinsically-parallel image-processing system. Several forest monitoring applications powered by this API are currently in development and expected to be operational in 2011. Combining science with massive data and technology resources in a cloud-computing framework can offer advantages of computational speed, ease-of-use and collaboration, as well as transparency in data and methods. Methods developed for global processing of MODIS data to map land cover are being adopted for use with Landsat data. Specifically, the MODIS Vegetation Continuous Field product methodology has been applied for mapping forest extent and change at national scales using Landsat time-series data sets. Scaling this method to continental and global scales is enabled by Google Earth Engine computing capabilities. By combining the supervised learning VCF approach with the Landsat archive and cloud computing, unprecedented monitoring of land cover dynamics is enabled.

  14. [Scientometrics and bibliometrics of biomedical engineering periodicals and papers].

    PubMed

    Zhao, Ping; Xu, Ping; Li, Bingyan; Wang, Zhengrong

    2003-09-01

    This investigation was made to reveal the current status, research trends, and research level of biomedical engineering in mainland China by means of scientometrics, and to assess the quality of four domestic publications by bibliometrics. We identified all articles of the four publications by searching Chinese and foreign databases from 1997 to 2001. All articles collected or cited by these databases were retrieved and statistically analyzed to find the relevant distributions, including databases, years, authors, institutions, subject headings, and subheadings. The sources of supporting funds and the related articles were analyzed as well. The results showed that two of the journals were cited by two foreign databases and five Chinese databases simultaneously. The output of the Journal of Biomedical Engineering was the highest: its number of original papers cited by EI and CA and its total number of fund-sponsored papers were higher than those of the others, but the yearly number and percentage of biomedical articles cited by EI decreased overall. A core group of domestic authors and institutions has emerged in the field of biomedical engineering. Their research topics were mainly concentrated on ten subject headings, including biocompatible materials, computer-assisted signal processing, electrocardiography, computer-assisted image processing, biomechanics, algorithms, electroencephalography, automatic data processing, mechanical stress, hemodynamics, mathematical computing, microcomputers, theoretical models, etc. The main subheadings were concentrated on instrumentation, physiopathology, diagnosis, therapy, ultrasonography, physiology, analysis, surgery, pathology, methods, etc.

  15. ExScalibur: A High-Performance Cloud-Enabled Suite for Whole Exome Germline and Somatic Mutation Identification.

    PubMed

    Bao, Riyue; Hernandez, Kyle; Huang, Lei; Kang, Wenjun; Bartom, Elizabeth; Onel, Kenan; Volchenboum, Samuel; Andrade, Jorge

    2015-01-01

    Whole exome sequencing has facilitated the discovery of causal genetic variants associated with human diseases at deep coverage and low cost. In particular, the detection of somatic mutations from tumor/normal pairs has provided insights into the cancer genome. Although there is an abundance of publicly available software for the detection of germline and somatic variants, concordance is generally limited among variant callers and alignment algorithms. Successful integration of variants detected by multiple methods requires in-depth knowledge of the software, access to high-performance computing resources, and advanced programming techniques. We present ExScalibur, a set of fully automated, highly scalable, and modular pipelines for whole exome data analysis. The suite integrates multiple alignment and variant-calling algorithms for the accurate detection of germline and somatic mutations with close to 99% sensitivity and specificity. ExScalibur implements streamlined execution of analytical modules, real-time monitoring of pipeline progress, robust error handling, and intuitive documentation, allowing for increased reproducibility and sharing of results and workflows. It runs on local computers, high-performance computing clusters, and cloud environments. In addition, we provide a data-analysis report utility to facilitate visualization of the results, offering interactive exploration of quality-control files, read alignments, and variant calls and assisting downstream customization of potential disease-causing mutations. ExScalibur is open-source and is also available as a public image on the Amazon cloud.

  16. 29 CFR 541.710 - Employees of public agencies.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... DELIMITING THE EXEMPTIONS FOR EXECUTIVE, ADMINISTRATIVE, PROFESSIONAL, COMPUTER AND OUTSIDE SALES EMPLOYEES Definitions and Miscellaneous Provisions § 541.710 Employees of public agencies. (a) An employee of a public...

  17. 29 CFR 541.710 - Employees of public agencies.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... DELIMITING THE EXEMPTIONS FOR EXECUTIVE, ADMINISTRATIVE, PROFESSIONAL, COMPUTER AND OUTSIDE SALES EMPLOYEES Definitions and Miscellaneous Provisions § 541.710 Employees of public agencies. (a) An employee of a public...

  18. 29 CFR 541.710 - Employees of public agencies.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... DELIMITING THE EXEMPTIONS FOR EXECUTIVE, ADMINISTRATIVE, PROFESSIONAL, COMPUTER AND OUTSIDE SALES EMPLOYEES Definitions and Miscellaneous Provisions § 541.710 Employees of public agencies. (a) An employee of a public...

  19. 29 CFR 541.710 - Employees of public agencies.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... DELIMITING THE EXEMPTIONS FOR EXECUTIVE, ADMINISTRATIVE, PROFESSIONAL, COMPUTER AND OUTSIDE SALES EMPLOYEES Definitions and Miscellaneous Provisions § 541.710 Employees of public agencies. (a) An employee of a public...

  20. Public perceptions of quarantine: community-based telephone survey following an infectious disease outbreak

    PubMed Central

    2009-01-01

    Background The use of restrictive measures such as quarantine draws into sharp relief the dynamic interplay between the individual rights of the citizen on the one hand and the collective rights of the community on the other. Concerns regarding infectious disease outbreaks (SARS, pandemic influenza) have intensified the need to understand public perceptions of quarantine and other social distancing measures. Methods We conducted a telephone survey of the general population in the Greater Toronto Area in Ontario, Canada. Computer-assisted telephone interviewing (CATI) technology was used. A final sample of 500 individuals was achieved through standard random-digit dialing. Results Our data indicate strong public support for the use of quarantine when required and for serious legal sanctions against those who fail to comply. This support is contingent both on the implementation of legal safeguards to protect against inappropriate use and on the provision of psychosocial supports for those affected. Conclusion To engender strong public support for quarantine and other restrictive measures, government officials and public health policy-makers would do well to implement a comprehensive system of supports and safeguards, to educate and inform frontline public health workers, and to engage the public at large in an open dialogue on the ethical use of restrictive measures during infectious disease outbreaks. PMID:20015400

  1. Gps-Denied Geo-Localisation Using Visual Odometry

    NASA Astrophysics Data System (ADS)

    Gupta, Ashish; Chang, Huan; Yilmaz, Alper

    2016-06-01

    The primary method for geo-localization is GPS, which has issues of localization accuracy, power consumption, and unavailability. This paper proposes a novel approach to geo-localization in a GPS-denied environment for a mobile platform. Our approach has two principal components: public-domain transport network data, available in GIS databases or OpenStreetMap; and the trajectory of the mobile platform, estimated using visual odometry and 3D view geometry. The transport map information is abstracted as a graph data structure, where roads of various types are modelled as graph edges and, typically, intersections are modelled as graph nodes. A real-time search for the trajectory in the graph yields the geo-location of the mobile platform. Our approach uses a simple visual sensor and has a low memory and computational footprint. In this paper, we demonstrate our method for trajectory estimation and provide examples of geo-localization using public-domain map data. With the rapid proliferation of visual sensors as part of automated driving technology and the continuous growth of public-domain map data, our approach has the potential to augment, or even supplant, GPS-based navigation, since it functions in all environments.
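
    To make the graph abstraction concrete, here is a toy sketch (a made-up five-node road network; in practice the graph would be built from OpenStreetMap data) that matches a sequence of visual-odometry turn estimates against paths in the road graph:

      import math

      # intersections as nodes with coordinates, roads as adjacency lists
      nodes = {"A": (0, 0), "B": (0, 1), "C": (1, 1), "D": (1, 0), "E": (2, 1)}
      edges = {"A": ["B", "D"], "B": ["A", "C"], "C": ["B", "D", "E"],
               "D": ["A", "C"], "E": ["C"]}

      def bearing(u, v):
          (x1, y1), (x2, y2) = nodes[u], nodes[v]
          return math.atan2(y2 - y1, x2 - x1)

      def turn(a, b, c):
          """Signed turn angle at node b, normalized to [-pi, pi)."""
          d = bearing(b, c) - bearing(a, b)
          return (d + math.pi) % (2 * math.pi) - math.pi

      def match_trajectory(observed_turns, tol=0.2):
          """Depth-first search for all node paths whose turn sequence matches
          the odometry estimates within tol radians; a real system would prune
          incrementally as the platform moves."""
          matches = []
          def extend(path):
              k = len(path) - 2                  # turns fixed so far
              if k > 0 and abs(turn(*path[-3:]) - observed_turns[k - 1]) > tol:
                  return                         # inconsistent prefix: prune
              if k == len(observed_turns):
                  matches.append(path)
                  return
              for nxt in edges[path[-1]]:
                  if nxt != path[-2]:            # disallow immediate U-turns
                      extend(path + [nxt])
          for u in nodes:
              for v in edges[u]:
                  extend([u, v])
          return matches

      # one ~90-degree left turn followed by one right turn; both directions
      # of the same route match this turn sequence
      print(match_trajectory([math.pi / 2, -math.pi / 2]))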

  2. Introduction to Library Public Services. Sixth Edition. Library and Information Science Text Series.

    ERIC Educational Resources Information Center

    Evans, G. Edward; Amodeo, Anthony J.; Carter, Thomas L.

    This book covers the role, purpose, and philosophy related to each of the major functional areas of library public service. This sixth edition, on the presumption that most readers now know the basic facts about computer hardware, drops the previous edition's chapter on computer basics and instead integrates specific technological…

  3. 47 CFR 0.465 - Request for copies of materials which are available, or made available, for public inspection.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... COMMUNICATIONS COMMISSION GENERAL COMMISSION ORGANIZATION General Information Public Information and Inspection... this service shall be ten cents ($0.10) per page or $5 per computer disk in addition to charges for staff time as provided in § 0.467. For copies prepared with other media, such as computer tapes...

  4. 47 CFR 0.465 - Request for copies of materials which are available, or made available, for public inspection.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... COMMUNICATIONS COMMISSION GENERAL COMMISSION ORGANIZATION General Information Public Information and Inspection... this service shall be ten cents ($0.10) per page or $5 per computer disk in addition to charges for staff time as provided in § 0.467. For copies prepared with other media, such as computer tapes...

  5. 47 CFR 0.465 - Request for copies of materials which are available, or made available, for public inspection.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... COMMUNICATIONS COMMISSION GENERAL COMMISSION ORGANIZATION General Information Public Information and Inspection... this service shall be ten cents ($0.10) per page or $5 per computer disk in addition to charges for staff time as provided in § 0.467. For copies prepared with other media, such as computer tapes...

  6. The Computer Experience Microvan Program: A Cooperative Endeavor to Improve University-Public School Relations through Technology.

    ERIC Educational Resources Information Center

    Amodeo, Luiza B.; Martin, Jeanette

    To a large extent the Southwest can be described as a rural area. Under these circumstances, programs for public understanding of technology become, first of all, exercises in logistics. In 1982, New Mexico State University introduced a program to inform teachers about computer technology. This program takes microcomputers into rural classrooms…

  7. Selection and Integration of a Computer Simulation for Public Budgeting and Finance (PBS 116).

    ERIC Educational Resources Information Center

    Banas, Ed Jr.

    1998-01-01

    Describes the development of a course on public budgeting and finance, which integrated the use of SimCity Classic, a computer-simulation software, with traditional lecture, guest speakers, and collaborative-learning activities. Explains the rationale for the course design and discusses the results from the first semester of teaching the course.…

  8. 77 FR 31026 - Use of Computer Simulation of the United States Blood Supply in Support of Planning for Emergency...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-24

    ... enhancements to extend the model predictions from red blood cell units to other blood components, such as...] Use of Computer Simulation of the United States Blood Supply in Support of Planning for Emergency...: Notice of public workshop. The Food and Drug Administration (FDA) is announcing a public workshop...

  9. Y2K Resources for Public Libraries.

    ERIC Educational Resources Information Center

    Foster, Janet

    1999-01-01

    Presents information for public libraries on computer-related vulnerabilities as the century turns from 1999 to 2000. Highlights include: general Y2K information; the Y2K Bug and PCs; Y2K sites for librarians; Online Computer Library Center (OCLC) and USMARC; technological developments in cyberspace; and a list of Web sites and Y2K resources. (AEF)

  10. The FELICIA bulletin board system and the IRBIS anonymous FTP server: Computer security information sources for the DOE community. CIAC-2302

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Orvis, W.J.

    1993-11-03

    The Computer Incident Advisory Capability (CIAC) operates two information servers for the DOE community, FELICIA (formerly FELIX) and IRBIS. FELICIA is a computer Bulletin Board System (BBS) that can be accessed by telephone with a modem. IRBIS is an anonymous ftp server that can be accessed on the Internet. Both of these servers contain all of the publicly available CIAC, CERT, NIST, and DDN bulletins, virus descriptions, the VIRUS-L moderated virus bulletin board, copies of public domain and shareware virus-detection/protection software, and copies of useful public domain and shareware utility programs. This guide describes how to connect to these systems and obtain files from them.
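
    Since IRBIS is described as a standard anonymous FTP server, a retrieval can be scripted with nothing but the Python standard library. The sketch below is ours, not part of the guide; the host name and remote path are placeholders, since the document does not give the server's address.

    ```python
    # Sketch: fetch a bulletin from an anonymous FTP server with ftplib.
    # HOST and REMOTE_PATH are hypothetical placeholders.
    from ftplib import FTP

    HOST = "ftp.example.gov"
    REMOTE_PATH = "pub/ciac/bulletins/ciac-2302.txt"

    with FTP(HOST) as ftp:
        ftp.login()  # no arguments -> anonymous login
        with open("ciac-2302.txt", "wb") as fh:
            ftp.retrbinary(f"RETR {REMOTE_PATH}", fh.write)
    ```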

  11. Computers and Data Processing. Subject Bibliography.

    ERIC Educational Resources Information Center

    United States Government Printing Office, Washington, DC.

    This annotated bibliography of U.S. Government publications contains over 90 entries on topics including telecommunications standards, U.S. competitiveness in high technology industries, computer-related crimes, capacity management of information technology systems, the application of computer technology in the Soviet Union, computers and…

  12. Redox potentials and pKa for benzoquinone from density functional theory based molecular dynamics.

    PubMed

    Cheng, Jun; Sulpizi, Marialore; Sprik, Michiel

    2009-10-21

    The density functional theory based molecular dynamics (DFTMD) method for the computation of redox free energies presented in previous publications, and its more recent modification for the computation of acidity constants, are reviewed. The method uses a half-reaction scheme based on reversible insertion/removal of electrons and protons. The proton insertion is assisted by restraining potentials acting as chaperones. The procedure for relating the calculated deprotonation free energies to Brønsted acidities (pKa) and the oxidation free energies to electrode potentials with respect to the normal hydrogen electrode is discussed in some detail. The method is validated in an application to the reduction of aqueous 1,4-benzoquinone. The conversion of hydroquinone to quinone can take place via a number of alternative pathways consisting of combinations of acid dissociations, oxidations, or dehydrogenations. The free energy changes of all elementary steps (ten in total) are computed. The accuracy of the calculations is assessed by comparing the energies of different pathways for the same reaction (Hess's law) and by comparison to experiment. This two-sided test enables us to separate the errors related to the restrictions on the length and time scales accessible to DFTMD from the errors introduced by the DFT approximation. It is found that the DFT approximation is the main source of error for oxidation free energies.
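
    The final step the abstract mentions, relating a deprotonation free energy to a Brønsted acidity, follows the standard relation pKa = ΔG_deprot / (kBT ln 10). A small worked example, with a free-energy value that is purely illustrative rather than taken from the paper:

    ```python
    # Convert a computed deprotonation free energy (eV) to a pKa at 298.15 K
    # via pKa = dG / (kB * T * ln 10). The input value is illustrative.
    import math

    K_B = 8.617333262e-5   # Boltzmann constant, eV/K
    T = 298.15             # temperature, K

    def pka_from_deprotonation(dg_ev):
        return dg_ev / (K_B * T * math.log(10))

    print(round(pka_from_deprotonation(0.59), 1))  # 0.59 eV -> ~10.0
    ```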

  13. Computational Tools to Assess Turbine Biological Performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Richmond, Marshall C.; Serkowski, John A.; Rakowski, Cynthia L.

    2014-07-24

    Public Utility District No. 2 of Grant County (GCPUD) operates the Priest Rapids Dam (PRD), a hydroelectric facility on the Columbia River in Washington State. The dam contains 10 Kaplan-type turbine units that are now more than 50 years old. Plans are underway to refit these aging turbines with new runners. The Columbia River at PRD is a migratory pathway for several species of juvenile and adult salmonids, so passage of fish through the dam is a major consideration when upgrading the turbines. In this paper, a method for turbine biological performance assessment (BioPA) is demonstrated. Using this method, a suite of biological performance indicators is computed based on simulated data from a CFD model of a proposed turbine design. Each performance indicator is a measure of the probability of exposure to a certain dose of an injury mechanism. Using known relationships between the dose of an injury mechanism and the frequency of injury (dose–response) from laboratory or field studies, the likelihood of fish injury for a turbine design can be computed from the performance indicator. By comparing the values of the indicators from proposed designs, the engineer can identify the more promising alternatives. We present an application of the BioPA method for baseline risk assessment calculations for the existing Kaplan turbines at PRD, which will be used as the minimum biological performance that a proposed new design must achieve.
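
    The core of the BioPA calculation described above is turning sampled exposure doses into an expected injury rate through a laboratory dose–response curve. The sketch below shows that step only, under our own assumptions: the logistic curve shape, its parameters, and the sampled doses are illustrative, not values from the study.

    ```python
    # Sketch: expected injury rate from sampled injury-mechanism doses and
    # an assumed logistic dose-response curve (all parameters illustrative).
    import math

    def injury_probability(dose, d50=500.0, slope=0.01):
        """Probability of injury at a given dose; d50 is the dose giving
        a 50% injury frequency."""
        return 1.0 / (1.0 + math.exp(-slope * (dose - d50)))

    def performance_indicator(doses, d50=500.0, slope=0.01):
        """Expected injury rate over doses sampled from simulated passage."""
        return sum(injury_probability(d, d50, slope) for d in doses) / len(doses)

    sampled_doses = [120.0, 340.0, 610.0, 220.0, 480.0]  # illustrative samples
    print(f"expected injury rate: {performance_indicator(sampled_doses):.3f}")
    ```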

  14. A window-based time series feature extraction method.

    PubMed

    Katircioglu-Öztürk, Deniz; Güvenir, H Altay; Ravens, Ursula; Baykal, Nazife

    2017-10-01

    This study proposes a robust similarity score-based time series feature extraction method termed Window-based Time series Feature ExtraCtion (WTC). WTC generates domain-interpretable results and has low computational complexity, rendering it useful for densely sampled and populated time series datasets. In this study, WTC is applied to a proprietary action potential (AP) time series dataset on human cardiomyocytes and to three precordial leads from a publicly available electrocardiogram (ECG) dataset. WTC is then compared, in terms of predictive accuracy and computational complexity, with the shapelet transform and the fast shapelet transform (an accelerated variant of the shapelet transform). The results indicate that WTC achieves slightly higher classification performance with significantly lower execution time than its shapelet-based alternatives. With respect to its interpretable features, WTC has the potential to enable medical experts to explore definitive common trends in novel datasets. Copyright © 2017 Elsevier Ltd. All rights reserved.
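
    The abstract does not spell out WTC's internals, so the following is only a generic sketch of the window-based, similarity-score idea it names: slide a fixed-width window over a series and keep, per reference window, the best z-normalised distance as an interpretable feature. Window width, distance measure, and reference choice are all our assumptions, not the published algorithm.

    ```python
    # Generic sketch of window-based similarity features (not WTC itself):
    # best z-normalised Euclidean distance of each reference window to any
    # window of the series.
    import numpy as np

    def window_features(series, references, width):
        windows = np.lib.stride_tricks.sliding_window_view(series, width)
        mu = windows.mean(axis=1, keepdims=True)
        sd = windows.std(axis=1, keepdims=True) + 1e-12
        znorm = (windows - mu) / sd
        feats = []
        for ref in references:
            r = (ref - ref.mean()) / (ref.std() + 1e-12)
            feats.append(np.linalg.norm(znorm - r, axis=1).min())
        return np.array(feats)

    rng = np.random.default_rng(0)
    series = rng.normal(size=200)
    refs = [rng.normal(size=20), rng.normal(size=20)]
    print(window_features(series, refs, width=20))
    ```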

  15. Employability and career experiences of international graduates of MSc Public Health: a mixed methods study.

    PubMed

    Buunaaisie, C; Manyara, A M; Annett, H; Bird, E L; Bray, I; Ige, J; Jones, M; Orme, J; Pilkington, P; Evans, D

    2018-05-08

    This article aims to describe the public health career experiences of international graduates of a Master of Science in Public Health (MSc PH) programme and to contribute to developing the evidence base on international public health workforce capacity development. A sequential mixed methods study was conducted between January 2017 and April 2017. Ninety-seven international graduates of one UK university's MSc PH programme were invited to take part in an online survey, followed by semistructured interviews with respondents who consented to be interviewed. We computed descriptive statistics for the quantitative data, and the qualitative data were thematically analysed. The response rate was 48.5%. Most respondents (63%) were employed by various agencies within 1 year after graduation. Others (15%) were at different stages of doctoral studies. Respondents reported enhanced roles after graduation in areas such as public health policy analysis (74%); planning, implementation and evaluation of public health interventions (74%); leadership roles (72%); and research (70%). The most commonly perceived skills relevant to the respondents' present jobs were critical analysis (87%), multidisciplinary thinking (86%), public health leadership (84%) and research (77%). Almost all respondents (90%) were confident in conducting research. Respondents recommended the provision of longer public health placement opportunities, elective courses on project management and advanced statistics, and 'internationalisation' of the programme's curriculum. The study reveals the relevance of higher education in public health to developing the career prospects and skills of graduates. International graduates of this MSc PH programme were satisfied with the relevance and impact of the skills they acquired during their studies. The outcomes of this study can be used to inform curriculum reform. Employers' perspectives of the capabilities of these graduates, however, need further consideration. Copyright © 2018 The Royal Society for Public Health. Published by Elsevier Ltd. All rights reserved.

  16. Prospects for Finite-Difference Time-Domain (FDTD) Computational Electrodynamics

    NASA Astrophysics Data System (ADS)

    Taflove, Allen

    2002-08-01

    FDTD is the most powerful numerical solution of Maxwell's equations for structures having internal details. Relative to moment-method and finite-element techniques, FDTD can accurately model such problems with 100 times more field unknowns and with nonlinear and/or time-variable parameters. Hundreds of FDTD theory and applications papers are published each year. Currently, there are at least 18 commercial FDTD software packages for solving problems in: defense (especially vulnerability to electromagnetic pulse and high-power microwaves); design of antennas and microwave devices/circuits; electromagnetic compatibility; bioelectromagnetics (especially assessment of cellphone-generated RF absorption in human tissues); signal integrity in computer interconnects; and design of micro-photonic devices (especially photonic bandgap waveguides, microcavities, and lasers). This paper explores emerging prospects for FDTD computational electromagnetics brought about by continuing advances in computer capabilities and FDTD algorithms. We conclude that advances already in place point toward the usage by 2015 of ultralarge-scale (up to 1E11 field unknowns) FDTD electromagnetic wave models covering the frequency range from about 0.1 Hz to 1E17 Hz. We expect that this will yield significant benefits for our society in areas as diverse as computing, telecommunications, defense, and public health and safety.
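
    For readers unfamiliar with the method, the essence of FDTD is a leapfrog update of electric and magnetic fields staggered in space and time on a Yee grid. The one-dimensional sketch below shows that update loop; the grid size, Courant number, source, and simple reflecting boundaries are illustrative choices, not any particular package's defaults.

    ```python
    # Minimal 1-D FDTD leapfrog update with a Gaussian hard source and
    # fixed (reflecting) boundaries; all parameters are illustrative.
    import numpy as np

    nx, nt = 400, 800
    ez = np.zeros(nx)       # electric field on integer grid points
    hy = np.zeros(nx - 1)   # magnetic field, staggered half a cell
    c = 0.5                 # Courant number (c0*dt/dx), <= 1 for stability

    for t in range(nt):
        hy += c * np.diff(ez)        # update H from the spatial curl of E
        ez[1:-1] += c * np.diff(hy)  # update E from the spatial curl of H
        ez[nx // 4] += np.exp(-((t - 40) / 12.0) ** 2)  # Gaussian source

    print(f"peak |Ez| after {nt} steps: {np.abs(ez).max():.3e}")
    ```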

  17. Computational Exposure Science: An Emerging Discipline to ...

    EPA Pesticide Factsheets

    Background: Computational exposure science represents a frontier of environmental science that is emerging and quickly evolving. Objectives: In this commentary, we define this burgeoning discipline, describe a framework for implementation, and review some key ongoing research elements that are advancing the science with respect to exposure to chemicals in consumer products. Discussion: The fundamental elements of computational exposure science include the development of reliable, computationally efficient predictive exposure models; the identification, acquisition, and application of data to support and evaluate these models; and the generation of improved methods for extrapolating across chemicals. We describe our efforts in each of these areas and provide examples that demonstrate both progress and potential. Conclusions: Computational exposure science, linked with comparable efforts in toxicology, is ushering in a new era of risk assessment that greatly expands our ability to evaluate chemical safety and sustainability and to protect public health. The National Exposure Research Laboratory’s (NERL’s) Human Exposure and Atmospheric Sciences Division (HEASD) conducts research in support of EPA’s mission to protect human health and the environment. HEASD’s research program supports Goal 1 (Clean Air) and Goal 4 (Healthy People) of EPA’s strategic plan. More specifically, our division conducts research to characterize the movement of pollutants from the source

  18. The Effect of Public Service Advertising on Cardiovascular Disease in Korea

    PubMed Central

    JANG, Juhyun; NA, Baeg Ju; LEE, Moo-Sik; SEO, Soonryu; SUNG, Changhyun; KIM, Hyun Joo; LEE, Jin Yong

    2016-01-01

    Background: Public Service Advertising (PSA) is a public interest message disseminated in the form of an advertisement, and its main purpose is to promote public behavioral change regarding a social issue. The Korea Centers for Disease Control and Prevention (KCDC) has been delivering PSAs through various media. However, the effect of these PSAs has never been evaluated. The purpose of this study was to estimate the effects of a broadcast PSA produced by KCDC on cardiovascular disease (CVD). Methods: One thousand adult participants throughout 15 provinces in Korea were chosen through the quota sampling method in 2012. A face-to-face survey with 13 questions was conducted using a Computer Assisted Personal Interview (CAPI) system. Previous exposure to the PSA message, understanding, and behavioral intention to change were assessed. Results: After watching the PSA, about 75% of participants answered that they could understand the contents well and 70% were willing to change their behaviors associated with CVD. However, only 24% of participants answered that they had watched the PSA during the past year. Conclusion: The PSA had positive effects on increasing the level of understanding and the intention to change behaviors regarding CVD. However, the level of exposure was low. KCDC should make an effort to increase the public exposure level, which could be an important success factor for the PSA. In addition, KCDC should consider customized PSAs for vulnerable people such as multi-cultural families, the disabled, and the elderly. PMID:27928529

  19. Now and next-generation sequencing techniques: future of sequence analysis using cloud computing.

    PubMed

    Thakur, Radhe Shyam; Bandopadhyay, Rajib; Chaudhary, Bratati; Chatterjee, Sourav

    2012-01-01

    Advances in the field of sequencing techniques have resulted in the greatly accelerated production of huge sequence datasets. This presents immediate challenges in database maintenance at datacenters and additional computational challenges in data mining and sequence analysis. Together these place a significant burden on traditional stand-alone computer resources, and to reach sound conclusions quickly and efficiently, virtualized resources and computation on a pay-as-you-go basis (together termed "cloud computing") have recently emerged. The collective resources of the datacenter, including both hardware and software, can be made available publicly, the arrangement then being termed a public cloud, with resources provided in a virtual mode to clients who pay according to the resources they employ. Examples of public companies providing these resources include Amazon, Google, and Joyent. The computational workload is shifted to the provider, which also implements required hardware and software upgrades over time. A virtual environment corresponding to the computational and data storage needs of the user is created in the cloud via the internet. The task is then performed, the results are transmitted to the user, and the environment is finally deleted after all tasks are completed. In this discussion, we focus on the basics of cloud computing, then analyze the prerequisites and overall working of clouds. Finally, the applications of cloud computing in biological systems, particularly in comparative genomics, genome informatics, and SNP detection, are discussed with reference to traditional workflows.
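
    The create-compute-delete lifecycle described above can be made concrete with any public cloud SDK. As a sketch only, here it is against Amazon EC2 via boto3 (Amazon being one of the providers the abstract names); the AMI ID and instance type are placeholders, and a real sequence-analysis pipeline would also stage data to object storage and handle errors and retries.

    ```python
    # Sketch: pay-as-you-go lifecycle on EC2 with boto3 (IDs are placeholders).
    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    # 1. Create the virtual environment: one instance sized for the job.
    resp = ec2.run_instances(
        ImageId="ami-0123456789abcdef0",  # hypothetical analysis image
        InstanceType="c5.4xlarge",
        MinCount=1,
        MaxCount=1,
    )
    instance_id = resp["Instances"][0]["InstanceId"]

    # 2. ... run the sequence-analysis task and retrieve the results ...

    # 3. Delete the environment, ending the pay-as-you-go charge.
    ec2.terminate_instances(InstanceIds=[instance_id])
    ```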

  20. Associations between neck musculoskeletal complaints and work related factors among public service computer workers in Kaunas.

    PubMed

    Kaliniene, Gintaré; Ustinaviciene, Ruta; Skemiene, Lina; Januskevicius, Vidmantas

    2013-10-01

    Information technologies have been developing very rapidly, including in occupational settings. Epidemiological studies have shown that employees who work with computers are more likely to complain of musculoskeletal disorders (MSD). The aim of this study was to evaluate associations between neck MSD and individual and work related factors. The investigation, which consisted of two parts - a questionnaire study (using the Nordic Musculoskeletal Questionnaire and the Copenhagen Psychosocial Questionnaire) and direct observation (evaluating the ergonomic work environment using the RULA method) - was carried out in three randomly selected public sector companies of Kaunas. The study population consisted of 513 public service office workers. The survey showed that neck MSDs were very common in the investigated population, with a prevalence rate of 65.7%. Neck MSDs were significantly associated with older age, longer work experience, high quantitative and cognitive job demands, working for longer than 2 h without taking a break, and a higher ergonomic risk score. In the fully adjusted model, working for longer than 2 h without taking a break had the strongest association with neck complaints. It was confirmed that neck MSDs were significantly associated with individual factors as well as conditions of work; therefore, preventive actions against neck complaints should target the psychosocial and ergonomic work environment as well as individual factors.
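
    To illustrate the kind of "fully adjusted" model such associations come from, the sketch below fits a logistic regression of neck complaints on the break-pattern exposure with individual, psychosocial, and ergonomic covariates. The file and column names are assumptions about how survey data like this might be coded; they are not the study's actual dataset or model specification.

    ```python
    # Sketch: adjusted logistic regression with statsmodels; the CSV and
    # its column names are hypothetical stand-ins for the survey data.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("survey.csv")  # hypothetical: one row per office worker

    model = smf.logit(
        "neck_msd ~ works_2h_no_break + age + work_experience_years"
        " + quantitative_demands + cognitive_demands + rula_score",
        data=df,
    ).fit()

    # Odds ratios with 95% confidence intervals, as typically reported.
    ors = np.exp(model.conf_int())
    ors["OR"] = np.exp(model.params)
    print(ors)
    ```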
