Sample records for practical computer analysis

  1. Performance review using sequential sampling and a practice computer.

    PubMed

    Difford, F

    1988-06-01

    The use of sequential sample analysis for repeated performance review is described with examples from several areas of practice. The value of a practice computer in providing a random sample from a complete population, evaluating the parameters of a sequential procedure, and producing a structured worksheet is discussed. It is suggested that sequential analysis has advantages over conventional sampling in the area of performance review in general practice.
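
    As a hedged illustration of the kind of sequential procedure this record describes, the sketch below implements a Wald sequential probability ratio test over a stream of audited records; the acceptable and unacceptable error rates and the sample data are hypothetical, not values taken from the paper.

```python
import math

def sprt_review(outcomes, p0=0.05, p1=0.15, alpha=0.05, beta=0.10):
    """Wald sequential probability ratio test on a stream of audit outcomes.

    outcomes: iterable of 0/1 flags (1 = criterion not met for that record).
    p0: acceptable error rate, p1: unacceptable error rate (hypothetical values).
    Returns ("accept", n), ("reject", n) or ("continue", n).
    """
    upper = math.log((1 - beta) / alpha)   # cross upward: performance unacceptable
    lower = math.log(beta / (1 - alpha))   # cross downward: performance acceptable
    llr, n = 0.0, 0
    for x in outcomes:
        n += 1
        llr += math.log(p1 / p0) if x else math.log((1 - p1) / (1 - p0))
        if llr >= upper:
            return "reject", n
        if llr <= lower:
            return "accept", n
    return "continue", n

# 25 consecutive records meeting the criterion: the test accepts early,
# which is the sampling saving the abstract refers to.
print(sprt_review([0] * 25))
```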

  2. The SQL Server Database for Non Computer Professional Teaching Reform

    ERIC Educational Resources Information Center

    Liu, Xiangwei

    2012-01-01

    This paper summarizes the teaching methods used in the SQL Server database course for non-computer-science majors and analyzes the current state of the course. Based on the characteristics of the curriculum for non-computer majors, it proposes several teaching reforms, puts them into practice, and improves students' analytical ability, practical ability and…

  3. For operation of the Computer Software Management and Information Center (COSMIC)

    NASA Technical Reports Server (NTRS)

    Carmon, J. L.

    1983-01-01

    Computer programs for large systems of normal equations, interactive digital signal processing, structural analysis of cylindrical thrust chambers, swirling turbulent axisymmetric recirculating flows in practical isothermal combustor geometries, computation of three-dimensional combustor performance, a thermal radiation analysis system, transient response analysis, and a software design analysis are summarized.

  4. Self-Directed Student Research through Analysis of Microarray Datasets: A Computer-Based Functional Genomics Practical Class for Masters-Level Students

    ERIC Educational Resources Information Center

    Grenville-Briggs, Laura J.; Stansfield, Ian

    2011-01-01

    This report describes a linked series of Masters-level computer practical workshops. They comprise an advanced functional genomics investigation, based upon analysis of a microarray dataset probing yeast DNA damage responses. The workshops require the students to analyse highly complex transcriptomics datasets, and were designed to stimulate…

  5. Computer Center: It's Time to Take Inventory.

    ERIC Educational Resources Information Center

    Spain, James D.

    1984-01-01

    Describes typical instructional applications of computers. Areas considered include: (1) instructional simulations and animations; (2) data analysis; (3) drill and practice; (4) student evaluation; (5) development of computer models and simulations; (6) biometrics or biostatistics; and (7) direct data acquisition and analysis. (JN)

  6. Computer Code for Transportation Network Design and Analysis

    DOT National Transportation Integrated Search

    1977-01-01

    This document describes the results of research into the application of the mathematical programming technique of decomposition to practical transportation network problems. A computer code called Catnap (for Control Analysis Transportation Network A...

  7. A visual study of computers on doctors' desks.

    PubMed

    Pearce, Christopher; Walker, Hannah; O'Shea, Carolyn

    2008-01-01

    General practice has rapidly computerised over the past ten years, thereby changing the nature of general practice rooms. Most general practice consulting rooms were designed and created in an era without computer hardware, establishing a pattern of work around maximising the doctor-patient relationship. General practitioners (GPs) and patients have had to integrate the computer into this environment. Twenty GPs allowed access to their rooms and consultations as part of a larger study. The results are based on an analysis of still shots of the consulting rooms. Analysis used dramaturgical methodology; thus the room is described as though it is the setting for a play. First, several desk areas were identified: a shared or patient area, a working area, a clinical area and an administrative area. Then, within that framework, we were able to identify two broad categories of setting, one inclusive of the patient and one exclusive. With the increasing significance of the computer in the three-way doctor-patient-computer relationship, an understanding of the social milieu in which the three players in the consultation interact (the staging) will inform further analysis of the interaction, and allow a framework for assessing the effects of different computer placements.

  8. Applications of Automation Methods for Nonlinear Fracture Test Analysis

    NASA Technical Reports Server (NTRS)

    Allen, Phillip A.; Wells, Douglas N.

    2013-01-01

    Using automated and standardized computer tools to calculate the pertinent test result values has several advantages: 1. allowing high-fidelity solutions to complex nonlinear phenomena that would be impractical to express in written equation form; 2. eliminating errors associated with the interpretation and programming of analysis procedures from the text of test standards; 3. lessening the need for expertise in the areas of solid mechanics, fracture mechanics, numerical methods, and/or finite element modeling to achieve sound results; and 4. providing one computer tool and/or one set of solutions for all users for a more "standardized" answer. In summary, this approach allows a non-expert with rudimentary training to get the best practical solution based on the latest understanding with minimum difficulty. Other existing ASTM standards that cover complicated phenomena use standard computer programs: 1. ASTM C1340/C1340M-10 - Standard Practice for Estimation of Heat Gain or Loss Through Ceilings Under Attics Containing Radiant Barriers by Use of a Computer Program; 2. ASTM F 2815 - Standard Practice for Chemical Permeation through Protective Clothing Materials: Testing Data Analysis by Use of a Computer Program; 3. ASTM E2807 - Standard Specification for 3D Imaging Data Exchange, Version 1.0. The verification, validation, and round-robin processes required of a computer tool closely parallel the methods that are used to ensure the solution validity for equations included in test standards. The use of automated analysis tools allows the creation and practical implementation of advanced fracture mechanics test standards that capture the physics of a nonlinear fracture mechanics problem without adding undue burden or expense to the user. The presented approach forms a bridge between the equation-based fracture testing standards of today and the next generation of standards solving complex problems through analysis automation.

  9. Image/Time Series Mining Algorithms: Applications to Developmental Biology, Document Processing and Data Streams

    ERIC Educational Resources Information Center

    Tataw, Oben Moses

    2013-01-01

    Interdisciplinary research in computer science requires the development of computational techniques for practical application in different domains. This usually requires careful integration of different areas of technical expertise. This dissertation presents image and time series analysis algorithms, with practical interdisciplinary applications…

  10. Computers in general practice: the patient's voice

    PubMed Central

    Potter, A. R.

    1981-01-01

    Analysis of answers to a questionnaire on the use of computers in general practice showed that 19 per cent of patients in two practices in Staffordshire would be worried if their general practitioner used a computer to store medical records. Twenty-seven per cent of patients would be unwilling to speak frankly about personal matters to their general practitioner if he or she used a computer and 7 per cent said that they would change to another doctor. Fifteen per cent stated that their general practitioner already had information about them that they would not want to be included in a computerized record of their medical history. PMID:7328555

  11. Does attitude matter in computer use in Australian general practice? A zero-inflated Poisson regression analysis.

    PubMed

    Khan, Asaduzzaman; Western, Mark

    The purpose of this study was to explore factors that facilitate or hinder effective use of computers in Australian general medical practice. This study is based on data extracted from a national telephone survey of 480 general practitioners (GPs) across Australia. Clinical functions performed by GPs using computers were examined using zero-inflated Poisson (ZIP) regression modelling. About 17% of GPs were not using a computer for any clinical function, while 18% reported using computers for all clinical functions. The ZIP model showed that computer anxiety was negatively associated with effective computer use, while practitioners' belief about the usefulness of computers was positively associated with effective computer use. Being a female GP or working in a partnership or group practice increased the odds of effectively using computers for clinical functions. To fully capitalise on the benefits of computer technology, GPs need to be convinced that this technology is useful and can make a difference.
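
    A minimal sketch of a zero-inflated Poisson fit of the kind reported here, assuming the statsmodels library; the simulated survey variables (anxiety, usefulness, gender) are hypothetical stand-ins, not the study's data.

```python
import numpy as np
import pandas as pd
from statsmodels.discrete.count_model import ZeroInflatedPoisson

# Hypothetical survey extract: count of clinical functions performed by computer.
rng = np.random.default_rng(0)
n = 480
df = pd.DataFrame({
    "anxiety": rng.normal(0, 1, n),       # computer anxiety score (assumed)
    "usefulness": rng.normal(0, 1, n),    # perceived usefulness score (assumed)
    "female": rng.integers(0, 2, n),
})
rate = np.exp(1.0 - 0.4 * df["anxiety"] + 0.5 * df["usefulness"] + 0.2 * df["female"])
counts = rng.poisson(rate.to_numpy())
y = np.where(rng.random(n) < 0.17, 0, counts)   # ~17% structural zeros, as in the abstract

X = np.column_stack([np.ones(n), df["anxiety"], df["usefulness"], df["female"]])
model = ZeroInflatedPoisson(y, X, exog_infl=np.ones((n, 1)), inflation="logit")
result = model.fit(maxiter=200, disp=False)
print(result.summary())
```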

  12. Cardiology office computer use: primer, pointers, pitfalls.

    PubMed

    Shepard, R B; Blum, R I

    1986-10-01

    An office computer is a utility, like an automobile, with benefits and costs that are both direct and hidden and potential for disaster. For the cardiologist or cardiovascular surgeon, the increasing power and decreasing costs of computer hardware and the availability of software make use of an office computer system an increasingly attractive possibility. Management of office business functions is common; handling and scientific analysis of practice medical information are less common. The cardiologist can also access national medical information systems for literature searches and for interactive further education. Selection and testing of programs and the entire computer system before purchase of computer hardware will reduce the chances of disappointment or serious problems. Personnel pretraining and planning for office information flow and medical information security are necessary. Some cardiologists design their own office systems, buy hardware and software as needed, write programs for themselves and carry out the implementation themselves. For most cardiologists, the better course will be to take advantage of the professional experience of expert advisors. This article provides a starting point from which the practicing cardiologist can approach considering, specifying or implementing an office computer system for business functions and for scientific analysis of practice results.

  13. Computer-Assisted Analysis of Spontaneous Speech: Quantification of Basic Parameters in Aphasic and Unimpaired Language

    ERIC Educational Resources Information Center

    Hussmann, Katja; Grande, Marion; Meffert, Elisabeth; Christoph, Swetlana; Piefke, Martina; Willmes, Klaus; Huber, Walter

    2012-01-01

    Although generally accepted as an important part of aphasia assessment, detailed analysis of spontaneous speech is rarely carried out in clinical practice mostly due to time limitations. The Aachener Sprachanalyse (ASPA; Aachen Speech Analysis) is a computer-assisted method for the quantitative analysis of German spontaneous speech that allows for…

  14. From micro to mainframe. A practical approach to perinatal data processing.

    PubMed

    Yeh, S Y; Lincoln, T

    1985-04-01

    A new, practical approach to perinatal data processing for a large obstetric population is described. This was done with a microcomputer for data entry and a mainframe computer for data reduction. The Screen Oriented Data Access (SODA) program was used to generate the data entry form and to input data into the Apple II Plus computer. Data were stored on diskettes and transmitted through a modem and telephone line to the IBM 370/168 computer. The Statistical Analysis System (SAS) program was used for statistical analyses and report generation. This approach was found to be most practical, flexible, and economical.

  15. The impact of distributed computing on education

    NASA Technical Reports Server (NTRS)

    Utku, S.; Lestingi, J.; Salama, M.

    1982-01-01

    In this paper, developments in digital computer technology since the early Fifties are reviewed briefly, and the parallelism which exists between these developments and developments in analysis and design procedures of structural engineering is identified. The recent trends in digital computer technology are examined in order to establish the fact that distributed processing is now an accepted philosophy for further developments. The impact of this on the analysis and design practices of structural engineering is assessed by first examining these practices from a data processing standpoint to identify the key operations and data bases, and then fitting them to the characteristics of distributed processing. The merits and drawbacks of the present philosophy in educating structural engineers are discussed and projections are made for the industry-academia relations in the distributed processing environment of structural analysis and design. An ongoing experiment of distributed computing in a university environment is described.

  16. Correction for spatial averaging in laser speckle contrast analysis

    PubMed Central

    Thompson, Oliver; Andrews, Michael; Hirst, Evan

    2011-01-01

    Practical laser speckle contrast analysis systems face a problem of spatial averaging of speckles, due to the pixel size in the cameras used. Existing practice is to use a system factor in speckle contrast analysis to account for spatial averaging. The linearity of the system factor correction has not previously been confirmed. The problem of spatial averaging is illustrated using computer simulation of time-integrated dynamic speckle, and the linearity of the correction confirmed using both computer simulation and experimental results. The valid linear correction allows various useful compromises in the system design. PMID:21483623
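
    For illustration, a minimal sketch of local speckle contrast with a system (beta) factor for spatial averaging; the window size, the beta value, and the K_measured² = β·K_ideal² correction model are assumptions made for this sketch, not details taken from the paper.

```python
import numpy as np
from scipy.signal import convolve2d

def local_speckle_contrast(image, window=7, beta=1.0):
    """Local speckle contrast K = sigma / mean over a sliding window.

    Spatial averaging at the detector is often modelled as
    K_measured**2 = beta * K_ideal**2, so the correction applied here
    is K / sqrt(beta). Window and beta are illustrative values.
    """
    img = image.astype(float)
    kernel = np.ones((window, window)) / window**2
    mean = convolve2d(img, kernel, mode="same", boundary="symm")
    mean_sq = convolve2d(img**2, kernel, mode="same", boundary="symm")
    std = np.sqrt(np.maximum(mean_sq - mean**2, 0.0))
    return std / np.maximum(mean, 1e-12) / np.sqrt(beta)

# Fully developed, unaveraged speckle has contrast close to 1; simulate it with
# exponentially distributed intensities and check the corrected estimate.
speckle = np.random.default_rng(1).exponential(scale=1.0, size=(256, 256))
print(local_speckle_contrast(speckle, window=7, beta=0.9).mean())
```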

  17. Guidelines for computer security in general practice.

    PubMed

    Schattner, Peter; Pleteshner, Catherine; Bhend, Heinz; Brouns, Johan

    2007-01-01

    As general practice becomes increasingly computerised, data security becomes increasingly important for both patient health and the efficient operation of the practice. To develop guidelines for computer security in general practice based on a literature review, an analysis of available information on current practice and a series of key stakeholder interviews. While the guideline was produced in the context of Australian general practice, we have developed a template that is also relevant for other countries. Current data on computer security measures was sought from Australian divisions of general practice. Semi-structured interviews were conducted with general practitioners (GPs), the medical software industry, senior managers within government responsible for health IT (information technology) initiatives, technical IT experts, divisions of general practice and a member of a health information consumer group. The respondents were asked to assess both the likelihood and the consequences of potential risks in computer security being breached. The study suggested that the most important computer security issues in general practice were: the need for a nominated IT security coordinator; having written IT policies, including a practice disaster recovery plan; controlling access to different levels of electronic data; doing and testing backups; protecting against viruses and other malicious codes; installing firewalls; undertaking routine maintenance of hardware and software; and securing electronic communication, for example via encryption. This information led to the production of computer security guidelines, including a one-page summary checklist, which were subsequently distributed to all GPs in Australia. This paper maps out a process for developing computer security guidelines for general practice. The specific content will vary in different countries according to their levels of adoption of IT, and cultural, technical and other health service factors. Making these guidelines relevant to local contexts should help maximise their uptake.

  18. SMART USE OF COMPUTER-AIDED SPERM ANALYSIS (CASA) TO CHARACTERIZE SPERM MOTION

    EPA Science Inventory

    Computer-aided sperm analysis (CASA) has evolved over the past fifteen years to provide an objective, practical means of measuring and characterizing the velocity and pattern of sperm motion. CASA instruments use video frame-grabber boards to capture multiple images of spermato...
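
    A hedged sketch of the basic kinematic quantities such instruments report (curvilinear velocity, straight-line velocity, linearity), computed here from a toy centroid track at an assumed frame rate; it is not any vendor's CASA algorithm.

```python
import numpy as np

def casa_kinematics(xy, fps=60.0):
    """Basic CASA-style kinematics from one tracked centroid path.

    xy : (N, 2) array of centroid positions in micrometres, one row per frame.
    fps: video frame rate (hypothetical value).
    Returns curvilinear velocity (VCL) and straight-line velocity (VSL) in um/s,
    plus linearity (LIN = VSL / VCL).
    """
    xy = np.asarray(xy, dtype=float)
    duration = (len(xy) - 1) / fps
    step_lengths = np.linalg.norm(np.diff(xy, axis=0), axis=1)
    vcl = step_lengths.sum() / duration                  # along the full track
    vsl = np.linalg.norm(xy[-1] - xy[0]) / duration      # first point to last
    lin = vsl / vcl if vcl > 0 else 0.0
    return vcl, vsl, lin

track = [(0, 0), (2, 1), (4, -1), (6, 1), (8, 0)]        # toy 5-frame track
print(casa_kinematics(track, fps=60.0))
```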

  19. Evaluating Computer-Related Incidents on Campus

    ERIC Educational Resources Information Center

    Rothschild, Daniel; Rezmierski, Virginia

    2004-01-01

    The Computer Incident Factor Analysis and Categorization (CIFAC) Project at the University of Michigan began in September 2003 with grants from EDUCAUSE and the National Science Foundation (NSF). The project's primary goal is to create a best-practices security framework for colleges and universities based on rigorous quantitative analysis of…

  20. Relationship between quality of care and choice of clinical computing system: retrospective analysis of family practice performance under the UK's quality and outcomes framework.

    PubMed

    Kontopantelis, Evangelos; Buchan, Iain; Reeves, David; Checkland, Kath; Doran, Tim

    2013-08-02

    To investigate the relationship between performance on the UK Quality and Outcomes Framework pay-for-performance scheme and choice of clinical computer system. Retrospective longitudinal study. Data for 2007-2008 to 2010-2011, extracted from the clinical computer systems of general practices in England. All English practices participating in the pay-for-performance scheme: average 8257 each year, covering over 99% of the English population registered with a general practice. Levels of achievement on 62 quality-of-care indicators, measured as: reported achievement (levels of care after excluding inappropriate patients); population achievement (levels of care for all patients with the relevant condition) and percentage of available quality points attained. Multilevel mixed effects multiple linear regression models were used to identify population, practice and clinical computing system predictors of achievement. Seven clinical computer systems were consistently active in the study period, collectively holding approximately 99% of the market share. Of all population and practice characteristics assessed, choice of clinical computing system was the strongest predictor of performance across all three outcome measures. Differences between systems were greatest for intermediate outcomes indicators (eg, control of cholesterol levels). Under the UK's pay-for-performance scheme, differences in practice performance were associated with the choice of clinical computing system. This raises the question of whether particular system characteristics facilitate higher quality of care, better data recording or both. Inconsistencies across systems need to be understood and addressed, and researchers need to be cautious when generalising findings from samples of providers using a single computing system.
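
    A minimal sketch of a mixed-effects regression of the general kind described, assuming the statsmodels library; the column names, covariates and simulated data are hypothetical stand-ins, and the model is far simpler than the one actually fitted in the study.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical practice-year data: achievement (%) by clinical computing system,
# with a random intercept per practice. All names and effects are illustrative.
rng = np.random.default_rng(42)
rows = []
for p in range(300):
    system = rng.choice(["SysA", "SysB", "SysC"])
    practice_effect = rng.normal(0, 2)
    for year in range(4):
        base = {"SysA": 88.0, "SysB": 90.0, "SysC": 91.5}[system]
        rows.append({
            "practice_id": p,
            "year": year,
            "computing_system": system,
            "deprivation": rng.normal(0, 1),
            "achievement": base + practice_effect + 0.3 * year + rng.normal(0, 3),
        })
df = pd.DataFrame(rows)

model = smf.mixedlm(
    "achievement ~ C(computing_system) + deprivation + year",
    data=df,
    groups=df["practice_id"],          # random intercept for each practice
)
print(model.fit(reml=True).summary())
```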

  1. A PROBABILISTIC APPROACH FOR ANALYSIS OF UNCERTAINTY IN THE EVALUATION OF WATERSHED MANAGEMENT PRACTICES

    EPA Science Inventory

    A computational framework is presented for analyzing the uncertainty in model estimates of water quality benefits of best management practices (BMPs) in two small (<10 km²) watersheds in Indiana. The analysis specifically recognizes the significance of the difference b...

  2. Continuing educational needs in computers and informatics. McGill survey of family physicians.

    PubMed Central

    McClaran, J.; Snell, L.; Duarte-Franco, E.

    2000-01-01

    OBJECTIVE: To describe family physicians' perceived educational needs in computers and informatics. DESIGN: Mailed survey. SETTING: General or family practices in Canada. PARTICIPANTS: Physicians (489 responded to a mailing sent to 2,500 physicians) who might attend sessions at the McGill Centre for CME. Two duplicate questionnaires were excluded from the analysis. METHOD: Four domains were addressed: practice profile, clinical CME needs, professional CME needs, and preferred learning formats. Data were entered on dBASE IV; analyses were performed on SPSS. MAIN FINDINGS: In the 487 questionnaires retained for analysis, "informatics and computers" was mentioned more than any other clinical diagnostic area, any other professional area, and all but three patient groups and service areas as a topic where improvement in knowledge and skills was needed in the coming year. Most physicians had no access to computer support for practice (62.6%); physicians caring for neonates, toddlers, or hospital inpatients were more likely to report some type of computer support. CONCLUSIONS: Family physicians selected knowledge and skills for computers and informatics as an area for improvement in the coming year more frequently than they selected most traditional clinical CME topics. This educational need is particularly great in small towns and in settings where some computerized hospital data are already available. PMID:10790816

  3. Finite-data-size study on practical universal blind quantum computation

    NASA Astrophysics Data System (ADS)

    Zhao, Qiang; Li, Qiong

    2018-07-01

    The universal blind quantum computation with weak coherent pulses protocol is a practical scheme to allow a client to delegate a computation to a remote server while keeping the computation hidden. However, in the practical protocol, a finite data size will influence the preparation efficiency in the remote blind qubit state preparation (RBSP). In this paper, a modified RBSP protocol with two decoy states is studied in the finite-data-size regime. The issue of its statistical fluctuations is analyzed thoroughly. The theoretical analysis and simulation results show that the two-decoy-state case with statistical fluctuation is closer to the asymptotic case than the one-decoy-state case with statistical fluctuation. In particular, the two-decoy-state protocol can achieve a longer communication distance than the one-decoy-state case in this statistical-fluctuation situation.

  4. Home care nurses' attitudes toward computers. A confirmatory factor analysis of the Stronge and Brodt instrument.

    PubMed

    Stricklin, Mary Lou; Bierer, S Beth; Struk, Cynthia

    2003-01-01

    Point-of-care technology for home care use will be the final step in enterprise-wide healthcare electronic communications. Successful implementation of home care point-of-care technology hinges upon nurses' attitudes toward point-of-care technology and its use in clinical practice. This study addresses the factors associated with home care nurses' attitudes using Stronge and Brodt's Nurses Attitudes Toward Computers instrument. In this study, the Nurses Attitudes Toward Computers instrument was administered to a convenience sample of 138 nurses employed by a large midwestern home care agency, with an 88% response rate. Confirmatory factor analysis corroborated the instrument's 3-dimensional factor structure for practicing nurses, with factors labeled nurses' work, security issues, and perceived barriers. Results from the confirmatory factor analysis also suggest that these 3 factors are internally correlated and represent multiple dimensions of a higher-order construct labeled nurses' attitudes toward computers. Additionally, two of these factors, nurses' work and perceived barriers, each appears to explain more variance in nurses' attitudes toward computers than security issues. Instrument reliability was high for the sample (.90), with subscale reliabilities ranging from .86 to .70.
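
    The reliabilities quoted above are Cronbach's alpha values; a minimal numpy sketch of that calculation on hypothetical Likert responses follows.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) matrix of item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical Likert responses from 138 nurses on a 20-item attitude scale.
rng = np.random.default_rng(3)
latent = rng.normal(0, 1, (138, 1))
responses = np.clip(np.round(3 + latent + rng.normal(0, 0.8, (138, 20))), 1, 5)
print(round(cronbach_alpha(responses), 2))
```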

  5. Evaluating computer capabilities in a primary care practice-based research network.

    PubMed

    Ariza, Adolfo J; Binns, Helen J; Christoffel, Katherine Kaufer

    2004-01-01

    We wanted to assess computer capabilities in a primary care practice-based research network and to understand how receptive the practices were to new ideas for automation of practice activities and research. This study was conducted among members of the Pediatric Practice Research Group (PPRG). A survey to assess computer capabilities was developed to explore hardware types, software programs, Internet connectivity and data transmission; views on privacy and security; and receptivity to future electronic data collection approaches. Of the 40 PPRG practices participating in the study during the autumn of 2001, all used IBM-compatible systems. Of these, 45% used stand-alone desktops, 40% had networked desktops, and approximately 15% used laptops and minicomputers. A variety of software packages were used, with most practices (82%) having software for some aspect of patient care documentation, patient accounting (90%), business support (60%), and management reports and analysis (97%). The main obstacles to expanding use of computers in patient care were insufficient staff training (63%) and privacy concerns (82%). If provided with training and support, most practices indicated they were willing to consider an array of electronic data collection options for practice-based research activities. There is wide variability in hardware and software use in the pediatric practice setting. Implementing electronic data collection in the PPRG would require a substantial start-up effort and ongoing training and support at the practice site.

  6. Relationship between quality of care and choice of clinical computing system: retrospective analysis of family practice performance under the UK's quality and outcomes framework

    PubMed Central

    Kontopantelis, Evangelos; Buchan, Iain; Reeves, David; Checkland, Kath; Doran, Tim

    2013-01-01

    Objectives To investigate the relationship between performance on the UK Quality and Outcomes Framework pay-for-performance scheme and choice of clinical computer system. Design Retrospective longitudinal study. Setting Data for 2007–2008 to 2010–2011, extracted from the clinical computer systems of general practices in England. Participants All English practices participating in the pay-for-performance scheme: average 8257 each year, covering over 99% of the English population registered with a general practice. Main outcome measures Levels of achievement on 62 quality-of-care indicators, measured as: reported achievement (levels of care after excluding inappropriate patients); population achievement (levels of care for all patients with the relevant condition) and percentage of available quality points attained. Multilevel mixed effects multiple linear regression models were used to identify population, practice and clinical computing system predictors of achievement. Results Seven clinical computer systems were consistently active in the study period, collectively holding approximately 99% of the market share. Of all population and practice characteristics assessed, choice of clinical computing system was the strongest predictor of performance across all three outcome measures. Differences between systems were greatest for intermediate outcomes indicators (eg, control of cholesterol levels). Conclusions Under the UK's pay-for-performance scheme, differences in practice performance were associated with the choice of clinical computing system. This raises the question of whether particular system characteristics facilitate higher quality of care, better data recording or both. Inconsistencies across systems need to be understood and addressed, and researchers need to be cautious when generalising findings from samples of providers using a single computing system. PMID:23913774

  7. Wildlife software: procedures for publication of computer software

    USGS Publications Warehouse

    Samuel, M.D.

    1990-01-01

    Computers and computer software have become an integral part of the practice of wildlife science. Computers now play an important role in teaching, research, and management applications. Because of the specialized nature of wildlife problems, specific computer software is usually required to address a given problem (e.g., home range analysis). This type of software is not usually available from commercial vendors and therefore must be developed by those wildlife professionals with particular skill in computer programming. Current journal publication practices generally prevent a detailed description of computer software associated with new techniques. In addition, peer review of journal articles does not usually include a review of associated computer software. Thus, many wildlife professionals are usually unaware of computer software that would meet their needs or of major improvements in software they commonly use. Indeed most users of wildlife software learn of new programs or important changes only by word of mouth.

  8. Analysis of basic clustering algorithms for numerical estimation of statistical averages in biomolecules.

    PubMed

    Anandakrishnan, Ramu; Onufriev, Alexey

    2008-03-01

    In statistical mechanics, the equilibrium properties of a physical system of particles can be calculated as the statistical average over accessible microstates of the system. In general, these calculations are computationally intractable since they involve summations over an exponentially large number of microstates. Clustering algorithms are one of the methods used to numerically approximate these sums. The most basic clustering algorithms first sub-divide the system into a set of smaller subsets (clusters). Then, interactions between particles within each cluster are treated exactly, while all interactions between different clusters are ignored. These smaller clusters have far fewer microstates, making the summation over these microstates tractable. These algorithms have been previously used for biomolecular computations, but remain relatively unexplored in this context. Presented here is a theoretical analysis of the error and computational complexity for the two most basic clustering algorithms that were previously applied in the context of biomolecular electrostatics. We derive a tight, computationally inexpensive, error bound for the equilibrium state of a particle computed via these clustering algorithms. For some practical applications, it is the root mean square error, which can be significantly lower than the error bound, that may be more important. We show that there is a strong empirical relationship between error bound and root mean square error, suggesting that the error bound could be used as a computationally inexpensive metric for predicting the accuracy of clustering algorithms for practical applications. An example of error analysis for such an application, the computation of the average charge of ionizable amino acids in proteins, is given, demonstrating that the clustering algorithm can be accurate enough for practical purposes.
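
    A toy sketch of the basic clustering idea (exact enumeration within clusters, couplings between clusters ignored), shown here on a small spin-like model rather than the biomolecular electrostatics system of the paper; all parameters are illustrative.

```python
import numpy as np
from itertools import product

def cluster_average(J, h, clusters, beta=1.0):
    """Approximate the thermal average of each site variable s_i in {-1, +1}.

    Within each cluster the sum over microstates is exact; couplings J[i, j]
    between sites in different clusters are simply ignored (the basic scheme).
    J, h and the clustering are toy inputs, not the system studied in the paper.
    """
    avg = np.zeros(len(h))
    for cluster in clusters:
        Z, acc = 0.0, np.zeros(len(cluster))
        for spins in product([-1, 1], repeat=len(cluster)):
            s = dict(zip(cluster, spins))
            energy = -sum(h[i] * s[i] for i in cluster)
            energy -= sum(J[i, j] * s[i] * s[j]
                          for i in cluster for j in cluster if i < j)
            w = np.exp(-beta * energy)
            Z += w
            acc += w * np.array(spins)
        for k, i in enumerate(cluster):
            avg[i] = acc[k] / Z
    return avg

# Site 2 is only weakly coupled to the rest, so the clustering drops those terms.
J = np.array([[0, 1.0, 0.1], [0, 0, 0.05], [0, 0, 0]])
h = np.array([0.2, -0.1, 0.3])
print(cluster_average(J, h, clusters=[[0, 1], [2]]))
```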

  9. Multiphysics Simulations: Challenges and Opportunities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keyes, David; McInnes, Lois C.; Woodward, Carol

    2013-02-12

    We consider multiphysics applications from algorithmic and architectural perspectives, where "algorithmic" includes both mathematical analysis and computational complexity, and "architectural" includes both software and hardware environments. Many diverse multiphysics applications can be reduced, en route to their computational simulation, to a common algebraic coupling paradigm. Mathematical analysis of multiphysics coupling in this form is not always practical for realistic applications, but model problems representative of applications discussed herein can provide insight. A variety of software frameworks for multiphysics applications have been constructed and refined within disciplinary communities and executed on leading-edge computer systems. We examine several of these, expose some commonalities among them, and attempt to extrapolate best practices to future systems. From our study, we summarize challenges and forecast opportunities.

  10. On computations of variance, covariance and correlation for interval data

    NASA Astrophysics Data System (ADS)

    Kishida, Masako

    2017-02-01

    In many practical situations, the data on which statistical analysis is to be performed is only known with interval uncertainty. Different combinations of values from the interval data usually lead to different values of variance, covariance, and correlation. Hence, it is desirable to compute the endpoints of possible values of these statistics. This problem is, however, NP-hard in general. This paper shows that the problem of computing the endpoints of possible values of these statistics can be rewritten as the problem of computing skewed structured singular values ν, for which there exist feasible (polynomial-time) algorithms that compute reasonably tight bounds in most practical cases. This allows one to find tight intervals of the aforementioned statistics for interval data.
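
    A brute-force sketch for the upper endpoint on very small datasets, which makes concrete the exponential cost that motivates the ν-based reformulation; it is not the paper's algorithm.

```python
from itertools import product
from statistics import variance

def max_variance(intervals):
    """Largest sample variance attainable when each value may lie anywhere
    in its interval.

    Because the sample variance is a convex function of the data vector, its
    maximum over the box is attained at a corner, so enumerating the 2**n
    endpoint combinations is exact -- but exponential in n, which is the cost
    the skewed-structured-singular-value approach in the cited paper avoids.
    """
    best = 0.0
    for choice in product(*[(a, b) for a, b in intervals]):
        best = max(best, variance(choice))
    return best

print(max_variance([(1.0, 1.2), (2.1, 2.4), (0.8, 1.1), (3.0, 3.3)]))
```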

  11. Accelerate Healthcare Data Analytics: An Agile Practice to Perform Collaborative and Reproducible Analyses.

    PubMed

    Hao, Bibo; Sun, Wen; Yu, Yiqin; Li, Jing; Hu, Gang; Xie, Guotong

    2016-01-01

    Recent advances in cloud computing and machine learning have made it more convenient for researchers to gain insights from massive healthcare data, yet performing analyses on healthcare data in current practice still lacks efficiency. Moreover, collaborating among different researchers and sharing analysis results remain challenging. In this paper, we developed a practice to make the analytics process collaborative and analysis results reproducible by exploiting and extending Jupyter Notebook. After applying this practice in our use cases, we can perform analyses and deliver results with less effort and in a shorter time compared to our previous practice.

  12. 13C-based metabolic flux analysis: fundamentals and practice.

    PubMed

    Yang, Tae Hoon

    2013-01-01

    Isotope-based metabolic flux analysis is one of the emerging technologies applied to system level metabolic phenotype characterization in metabolic engineering. Among the developed approaches, (13)C-based metabolic flux analysis has been established as a standard tool and has been widely applied to quantitative pathway characterization of diverse biological systems. To implement (13)C-based metabolic flux analysis in practice, comprehending the underlying mathematical and computational modeling fundamentals is of importance along with carefully conducted experiments and analytical measurements. Such knowledge is also crucial when designing (13)C-labeling experiments and properly acquiring key data sets essential for in vivo flux analysis implementation. In this regard, the modeling fundamentals of (13)C-labeling systems and analytical data processing are the main topics we will deal with in this chapter. Along with this, the relevant numerical optimization techniques are addressed to help implementation of the entire computational procedures aiming at (13)C-based metabolic flux analysis in vivo.
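
    A heavily simplified sketch of the outer fitting loop in such analyses, assuming scipy; the labeling simulator is a hypothetical placeholder for a real isotopomer/EMU model, and the measurements are made up.

```python
import numpy as np
from scipy.optimize import least_squares

def simulate_labeling(free_fluxes):
    """Hypothetical placeholder: maps free fluxes to predicted mass-isotopomer
    fractions. A real 13C-MFA model solves the labeling balances of the network
    here; in this toy model only the flux ratio is identifiable."""
    v1, v2 = free_fluxes
    total = v1 + v2
    return np.array([v1 / total, 0.7 * v2 / total, 0.3 * v2 / total])

measured = np.array([0.62, 0.27, 0.11])      # hypothetical MS measurements
sigma = np.array([0.01, 0.01, 0.01])         # measurement standard deviations

def residuals(free_fluxes):
    return (simulate_labeling(free_fluxes) - measured) / sigma

fit = least_squares(residuals, x0=[1.0, 1.0], bounds=([1e-6, 1e-6], [10.0, 10.0]))
print("estimated free fluxes:", fit.x, "SSR:", 2 * fit.cost)
```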

  13. Computer technology applications in industrial and organizational psychology.

    PubMed

    Crespin, Timothy R; Austin, James T

    2002-08-01

    This article reviews computer applications developed and utilized by industrial-organizational (I-O) psychologists, both in practice and in research. A primary emphasis is on applications developed for Internet usage, because this "network of networks" changes the way I-O psychologists work. The review focuses on traditional and emerging topics in I-O psychology. The first topic involves information technology applications in measurement, defined broadly across levels of analysis (persons, groups, organizations) and domains (abilities, personality, attitudes). Discussion then focuses on individual learning at work, both in formal training and in coping with continual automation of work. A section on job analysis follows, illustrating the role of computers and the Internet in studying jobs. Shifting focus to the group level of analysis, we briefly review how information technology is being used to understand and support cooperative work. Finally, special emphasis is given to the emerging "third discipline" in I-O psychology research-computational modeling of behavioral events in organizations. Throughout this review, themes of innovation and dissemination underlie a continuum between research and practice. The review concludes by setting a framework for I-O psychology in a computerized and networked world.

  14. Developing Tools for Research on School Leadership Development: An Illustrative Case of a Computer Simulation

    ERIC Educational Resources Information Center

    Showanasai, Parinya; Lu, Jiafang; Hallinger, Philip

    2013-01-01

    Purpose: The extant literature on school leadership development is dominated by conceptual analysis, descriptive studies of current practice, critiques of current practice, and prescriptions for better ways to approach practice. Relatively few studies have examined impact of leadership development using experimental methods, among which even fewer…

  15. An Investigation of Student Practices in Asynchronous Computer Conferencing Courses

    ERIC Educational Resources Information Center

    Peters, Vanessa L.; Hewitt, Jim

    2010-01-01

    This study investigated the online practices of students enrolled in graduate-level distance education courses. Using interviews and a questionnaire as data sources, the study sought to: (a) identify common practices that students adopt in asynchronous discussions, and (b) gain an understanding of why students adopt them. An analysis of the data…

  16. Practical quality control tools for curves and surfaces

    NASA Technical Reports Server (NTRS)

    Small, Scott G.

    1992-01-01

    Curves (geometry) and surfaces created by Computer Aided Geometric Design systems in the engineering environment must satisfy two basic quality criteria: the geometric shape must have the desired engineering properties; and the objects must be parameterized in a way which does not cause computational difficulty for geometric processing and engineering analysis. Interactive techniques are described which are in use at Boeing to evaluate the quality of aircraft geometry prior to Computational Fluid Dynamic analysis, including newly developed methods for examining surface parameterization and its effects.

  17. Towards practical multiscale approach for analysis of reinforced concrete structures

    NASA Astrophysics Data System (ADS)

    Moyeda, Arturo; Fish, Jacob

    2017-12-01

    We present a novel multiscale approach for analysis of reinforced concrete structural elements that overcomes two major hurdles in utilization of multiscale technologies in practice: (1) coupling between material and structural scales due to consideration of large representative volume elements (RVE), and (2) computational complexity of solving complex nonlinear multiscale problems. The former is accomplished using a variant of computational continua framework that accounts for sizeable reinforced concrete RVEs by adjusting the location of quadrature points. The latter is accomplished by means of reduced order homogenization customized for structural elements. The proposed multiscale approach has been verified against direct numerical simulations and validated against experimental results.

  18. Teacher Education Faculty and Computer Competency.

    ERIC Educational Resources Information Center

    Barger, Robert N.; Armel, Donald

    A project was introduced in the College of Education at Eastern Illinois University to assist faculty, through inservice training, to become more knowledgeable about computer applications and limitations. Practical needs of faculty included word processing, statistical analysis, database manipulation, electronic mail, file transfers, file…

  19. The Computer Integration into the EFL Instruction in Indonesia: An Analysis of Two University Instructors in Integrating Computer Technology into EFL Instruction to Encourage Students' Language Learning Engagement

    ERIC Educational Resources Information Center

    Prihatin, Pius N.

    2012-01-01

    Computer technology has been popular for teaching English as a foreign language in non-English speaking countries. This case study explored the way language instructors designed and implemented computer-based instruction so that students are engaged in English language learning. This study explored the beliefs, practices and perceptions of…

  20. Universal, computer facilitated, steady state oscillator, closed loop analysis theory and some applications to precision oscillators

    NASA Technical Reports Server (NTRS)

    Parzen, Benjamin

    1992-01-01

    The theory of oscillator analysis in the immittance domain should be read in conjunction with the additional theory presented here. The combined theory enables the computer simulation of the steady state oscillator. The simulation makes the calculation of the oscillator total steady state performance practical, including noise at all oscillator locations. Some specific precision oscillators are analyzed.

  1. 3-D Electromagnetic field analysis of wireless power transfer system using K computer

    NASA Astrophysics Data System (ADS)

    Kawase, Yoshihiro; Yamaguchi, Tadashi; Murashita, Masaya; Tsukada, Shota; Ota, Tomohiro; Yamamoto, Takeshi

    2018-05-01

    We analyze the electromagnetic field of a wireless power transfer system using the 3-D parallel finite element method on the K computer, a supercomputer in Japan. It is clarified that the electromagnetic field of the wireless power transfer system can be analyzed in a practical time using parallel computation on the K computer; moreover, the accuracy of the loss calculation improves as the mesh division of the shield becomes finer.

  2. Computer classes and games in virtual reality environment to reduce loneliness among students of an elderly reference center

    PubMed Central

    Antunes, Thaiany Pedrozo Campos; de Oliveira, Acary Souza Bulle; Crocetta, Tania Brusque; Antão, Jennifer Yohanna Ferreira de Lima; Barbosa, Renata Thais de Almeida; Guarnieri, Regiani; Massetti, Thais; Monteiro, Carlos Bandeira de Mello; de Abreu, Luiz Carlos

    2017-01-01

    Introduction: Physical and mental changes associated with aging commonly lead to a decrease in communication capacity, reducing social interactions and increasing loneliness. Computer classes for older adults make significant contributions to social and cognitive aspects of aging. Games in a virtual reality (VR) environment stimulate the practice of communicative and cognitive skills and might also bring benefits to older adults. Furthermore, they might help to initiate contact with modern technology. The purpose of this study protocol is to evaluate the effects of practicing VR games during computer classes on the level of loneliness of students of an elderly reference center. Methods and Analysis: This study will be a prospective longitudinal study with a randomised cross-over design, with subjects aged 50 years and older, of both genders, spontaneously enrolled in computer classes for beginners. Data collection will be done at 3 moments: moment 0 (T0) – at baseline; moment 1 (T1) – after 8 typical computer classes; and moment 2 (T2) – after 8 computer classes which include 15 minutes for practicing games in a VR environment. A characterization questionnaire, the short version of the Social and Emotional Loneliness Scale for Adults (SELSA-S) and 3 games with VR (Random, MoviLetrando, and Reaction Time) will be used. For the intervention phase 4 other games will be used: Coincident Timing, Motor Skill Analyser, Labyrinth, and Fitts. The statistical analysis will compare the evolution in loneliness perception, performance, and reaction time during the practice of the games between the 3 moments of data collection. Performance and reaction time during the practice of the games will also be correlated to the loneliness perception. Ethics and Dissemination: The protocol is approved by the host institution's ethics committee under the number 52305215.3.0000.0082. Results will be disseminated via peer-reviewed journal articles and conferences. This clinical trial is registered at ClinicalTrials.gov identifier: NCT02798081. PMID:28272198

  3. A Practical Engineering Approach to Predicting Fatigue Crack Growth in Riveted Lap Joints

    NASA Technical Reports Server (NTRS)

    Harris, Charles E.; Piascik, Robert S.; Newman, James C., Jr.

    1999-01-01

    An extensive experimental database has been assembled from very detailed teardown examinations of fatigue cracks found in rivet holes of fuselage structural components. Based on this experimental database, a comprehensive analysis methodology was developed to predict the onset of widespread fatigue damage in lap joints of fuselage structure. Several computer codes were developed with specialized capabilities to conduct the various analyses that make up the comprehensive methodology. Over the past several years, the authors have interrogated various aspects of the analysis methods to determine the degree of computational rigor required to produce numerical predictions with acceptable engineering accuracy. This study led to the formulation of a practical engineering approach to predicting fatigue crack growth in riveted lap joints. This paper describes the practical engineering approach and compares predictions with the results from several experimental studies.
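
    As a much simpler illustration of the kind of calculation such a methodology automates, here is a hedged sketch that integrates a Paris-law crack-growth relation cycle by cycle; the material constants and geometry factor are illustrative, not the authors' values or method.

```python
import math

def cycles_to_grow(a0, a_final, delta_sigma, C=1.0e-11, m=3.0, F=1.12, da=1.0e-5):
    """Integrate da/dN = C * (dK)**m with dK = F * delta_sigma * sqrt(pi * a).

    a0, a_final : initial and final crack length (m)
    delta_sigma : stress range (MPa)
    C, m, F are illustrative constants, so the result is a rough
    order-of-magnitude figure, not a certified prediction.
    """
    a, cycles = a0, 0.0
    while a < a_final:
        dK = F * delta_sigma * math.sqrt(math.pi * a)   # MPa*sqrt(m)
        dadN = C * dK**m                                # m per cycle
        cycles += da / dadN
        a += da
    return cycles

print(f"{cycles_to_grow(0.001, 0.01, delta_sigma=100.0):.3e} cycles")
```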

  4. A Practical Engineering Approach to Predicting Fatigue Crack Growth in Riveted Lap Joints

    NASA Technical Reports Server (NTRS)

    Harris, C. E.; Piascik, R. S.; Newman, J. C., Jr.

    2000-01-01

    An extensive experimental database has been assembled from very detailed teardown examinations of fatigue cracks found in rivet holes of fuselage structural components. Based on this experimental database, a comprehensive analysis methodology was developed to predict the onset of widespread fatigue damage in lap joints of fuselage structure. Several computer codes were developed with specialized capabilities to conduct the various analyses that make up the comprehensive methodology. Over the past several years, the authors have interrogated various aspects of the analysis methods to determine the degree of computational rigor required to produce numerical predictions with acceptable engineering accuracy. This study led to the formulation of a practical engineering approach to predicting fatigue crack growth in riveted lap joints. This paper describes the practical engineering approach and compares predictions with the results from several experimental studies.

  5. Legal issues of computer imaging in plastic surgery: a primer.

    PubMed

    Chávez, A E; Dagum, P; Koch, R J; Newman, J P

    1997-11-01

    Although plastic surgeons are increasingly incorporating computer imaging techniques into their practices, many fear the possibility of legally binding themselves to achieve surgical results identical to those reflected in computer images. Computer imaging allows surgeons to manipulate digital photographs of patients to project possible surgical outcomes. Some of the many benefits imaging techniques pose include improving doctor-patient communication, facilitating the education and training of residents, and reducing administrative and storage costs. Despite the many advantages computer imaging systems offer, however, surgeons understandably worry that imaging systems expose them to immense legal liability. The possible exploitation of computer imaging by novice surgeons as a marketing tool, coupled with the lack of consensus regarding the treatment of computer images, adds to the concern of surgeons. A careful analysis of the law, however, reveals that surgeons who use computer imaging carefully and conservatively, and adopt a few simple precautions, substantially reduce their vulnerability to legal claims. In particular, surgeons face possible claims of implied contract, failure to instruct, and malpractice from their use or failure to use computer imaging. Nevertheless, legal and practical obstacles frustrate each of those causes of actions. Moreover, surgeons who incorporate a few simple safeguards into their practice may further reduce their legal susceptibility.

  6. Protecting genomic data analytics in the cloud: state of the art and opportunities.

    PubMed

    Tang, Haixu; Jiang, Xiaoqian; Wang, Xiaofeng; Wang, Shuang; Sofia, Heidi; Fox, Dov; Lauter, Kristin; Malin, Bradley; Telenti, Amalio; Xiong, Li; Ohno-Machado, Lucila

    2016-10-13

    The outsourcing of genomic data into public cloud computing settings raises concerns over privacy and security. Significant advancements in secure computation methods have emerged over the past several years, but such techniques need to be rigorously evaluated for their ability to support the analysis of human genomic data in an efficient and cost-effective manner. With respect to public cloud environments, there are concerns about the inadvertent exposure of human genomic data to unauthorized users. In analyses involving multiple institutions, there is additional concern about data being used beyond the agreed research scope and being processed in untrusted computational environments, which may not satisfy institutional policies. To systematically investigate these issues, the NIH-funded National Center for Biomedical Computing iDASH (integrating Data for Analysis, 'anonymization' and SHaring) hosted the second Critical Assessment of Data Privacy and Protection competition to assess the capacity of cryptographic technologies for protecting computation over human genomes in the cloud and promoting cross-institutional collaboration. Data scientists were challenged to design and engineer practical algorithms for secure outsourcing of genome computation tasks in working software, whereby analyses are performed only on encrypted data. They were also challenged to develop approaches to enable secure collaboration on data from genomic studies generated by multiple organizations (e.g., medical centers) to jointly compute aggregate statistics without sharing individual-level records. The results of the competition indicated that secure computation techniques can enable comparative analysis of human genomes, but greater efficiency (in terms of compute time and memory utilization) is needed before they are sufficiently practical for real-world environments.
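
    As a toy illustration of the multi-institution aggregation task described, a sketch of additive secret sharing over a prime field, by which sites can jointly compute a sum without revealing individual contributions; it is illustrative only, not one of the competition entries, and it omits the authenticated channels a real deployment needs.

```python
import secrets

PRIME = 2**61 - 1   # field modulus for the toy scheme

def share(value, n_parties):
    """Split an integer into n additive shares that sum to value mod PRIME."""
    shares = [secrets.randbelow(PRIME) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % PRIME)
    return shares

# Three medical centers each hold a private count (e.g., carriers of a variant).
private_counts = [124, 87, 305]
n = len(private_counts)

# Each center sends one share to each party; no party ever sees a raw count.
all_shares = [share(c, n) for c in private_counts]
partial_sums = [sum(all_shares[c][p] for c in range(n)) % PRIME for p in range(n)]

# Publishing the partial sums reveals only the aggregate statistic.
print("joint count:", sum(partial_sums) % PRIME)   # 516
```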

  7. Computer-based quantitative computed tomography image analysis in idiopathic pulmonary fibrosis: A mini review.

    PubMed

    Ohkubo, Hirotsugu; Nakagawa, Hiroaki; Niimi, Akio

    2018-01-01

    Idiopathic pulmonary fibrosis (IPF) is the most common type of progressive idiopathic interstitial pneumonia in adults. Many computer-based image analysis methods of chest computed tomography (CT) used in patients with IPF include the mean CT value of the whole lungs, density histogram analysis, density mask technique, and texture classification methods. Most of these methods offer good assessment of pulmonary functions, disease progression, and mortality. Each method has merits that can be used in clinical practice. One of the texture classification methods is reported to be superior to visual CT scoring by radiologist for correlation with pulmonary function and prediction of mortality. In this mini review, we summarize the current literature on computer-based CT image analysis of IPF and discuss its limitations and several future directions.
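
    A minimal numpy sketch of the density-histogram and mean-lung-attenuation measures mentioned above, applied to a hypothetical array of lung-voxel Hounsfield units; the threshold in the density-mask style index is illustrative, not clinically validated.

```python
import numpy as np

def ct_density_summary(lung_hu, mask_threshold=-700):
    """Simple quantitative CT summaries over lung voxels (Hounsfield units).

    mask_threshold is an illustrative cut-off for a density-mask style measure
    (fraction of voxels above the threshold, a crude high-attenuation index).
    """
    lung_hu = np.asarray(lung_hu, dtype=float)
    hist, edges = np.histogram(lung_hu, bins=64, range=(-1024, 0), density=True)
    z = (lung_hu - lung_hu.mean()) / lung_hu.std()
    return {
        "mean_lung_attenuation": lung_hu.mean(),
        "histogram_skewness": float(np.mean(z**3)),
        "high_attenuation_fraction": float(np.mean(lung_hu > mask_threshold)),
        "histogram": (hist, edges),
    }

voxels = np.random.default_rng(7).normal(-820, 90, size=200_000)   # toy lung voxels
summary = ct_density_summary(voxels)
print(summary["mean_lung_attenuation"], summary["high_attenuation_fraction"])
```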

  8. Linking Questions and Evidence

    ERIC Educational Resources Information Center

    Tenenberg, Josh; McCartney, Robert

    2008-01-01

    This special issue features a set of papers recently published in the 3rd International Workshop on Computing Education Research (ICER'07). The papers were selected because they closely meet the publication criteria for ACM/JERIC: stemming from computing education practice, grounded in relevant literature, containing analysis of primary empirical…

  9. Information Technology: A Community of Practice. A Workplace Analysis

    ERIC Educational Resources Information Center

    Guerrero, Tony

    2014-01-01

    Information Technology (IT) encompasses all aspects of computing technology. IT is concerned with issues relating to supporting technology users and meeting their needs within an organizational and societal context through the selection, creation, application, integration, and administration of computing technologies (Lunt, et. al., 2008). The…

  10. A Practice-Oriented Bifurcation Analysis for Pulse Energy Converters: A Stability Margin

    NASA Astrophysics Data System (ADS)

    Kolokolov, Yury; Monovskaya, Anna

    The popularity of systems of pulse energy conversion (PEC-systems) for practical applications is due to the heightened efficiency of energy conversion processes achieved with comparatively simple realizations. Nevertheless, a PEC-system represents a nonlinear object with a variable structure, and bifurcation analysis remains the basic tool to describe the evolution of PEC dynamics. The paper is devoted to the discussion of whether the scientific viewpoint on natural nonlinear dynamics evolution can be applied in practical applications. We focus on the problems connected with stability boundaries of an operating regime. The results of both small-signal analysis and computational bifurcation analysis are considered in the parametrical space in comparison with the results of the experimental identification of the zonal heterogeneity of the operating process. This allows us to propose an adapted stability margin as a sufficiently safe distance before the point after which the operating process begins to lose stability. Such a stability margin can extend the permissible operating domain in the parametrical space by using cause-and-effect relations in the context of the natural regularities of nonlinear dynamics. Reasoning and discussion are based on the experimental and computational results for a synchronous buck converter with pulse-width modulation. The presented results can be useful, first of all, for PEC-systems with significant variation of equivalent inductance and/or capacitance. We believe that the discussion supports the viewpoint that contemporary methods of computational and experimental bifurcation analysis offer both the analytical abilities and the experimental techniques needed for promising, practice-oriented solutions for PEC-systems.

  11. Real Time Text Analysis

    NASA Astrophysics Data System (ADS)

    Senthilkumar, K.; Ruchika Mehra Vijayan, E.

    2017-11-01

    This paper aims to illustrate real-time analysis of large-scale data. As a practical implementation, we perform sentiment analysis on live Twitter feeds for each individual tweet. To analyze sentiments, we train our data model on SentiWordNet, a polarity-assigned WordNet sample by Princeton University. Our main objective is to efficiently analyze large-scale data on the fly using distributed computation. The Apache Spark and Apache Hadoop ecosystem is used as the distributed computation platform, with Java as the development language.
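
    A minimal structured-streaming sketch of such a pipeline, written in PySpark rather than the Java mentioned in the record; the live Twitter feed is replaced by a local socket source and the tiny polarity lexicon is a hypothetical stand-in for SentiWordNet.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import FloatType

spark = SparkSession.builder.appName("LiveTweetSentiment").getOrCreate()

# Tiny stand-in for SentiWordNet: word -> polarity in [-1, 1] (hypothetical values).
LEXICON = {"good": 0.6, "great": 0.8, "bad": -0.7, "awful": -0.9, "ok": 0.1}

def polarity(text):
    words = (text or "").lower().split()
    return float(sum(LEXICON.get(w, 0.0) for w in words))

polarity_udf = F.udf(polarity, FloatType())

# One line per tweet arriving on a local socket (replace with a real Twitter source).
tweets = (spark.readStream.format("socket")
          .option("host", "localhost").option("port", 9999).load())

scored = tweets.withColumn("sentiment", polarity_udf(F.col("value")))

query = scored.writeStream.outputMode("append").format("console").start()
query.awaitTermination()
```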

  12. Comparative analysis of techniques for evaluating the effectiveness of aircraft computing systems

    NASA Technical Reports Server (NTRS)

    Hitt, E. F.; Bridgman, M. S.; Robinson, A. C.

    1981-01-01

    Performability analysis is a technique developed for evaluating the effectiveness of fault-tolerant computing systems in multiphase missions. Performability was evaluated for its accuracy, practical usefulness, and relative cost. The evaluation was performed by applying performability and the fault tree method to a set of sample problems ranging from simple to moderately complex. The problems involved as many as five outcomes, two to five mission phases, permanent faults, and some functional dependencies. Transient faults and software errors were not considered. A different analyst was responsible for each technique. Significantly more time and effort were required to learn performability analysis than the fault tree method. Performability is inherently as accurate as fault tree analysis. For the sample problems, fault trees were more practical and less time consuming to apply, while performability required less ingenuity and was more checkable. Performability offers some advantages for evaluating very complex problems.

  13. Interfacing Email Tutoring: Shaping an Emergent Literate Practice.

    ERIC Educational Resources Information Center

    Anderson, Dana

    2002-01-01

    Presents a descriptive analysis of 29 online writing lab sites for email tutoring, currently the most popular mode of computer-mediated collaboration. Considers how email tutoring interfaces represent the literate practice of email tutoring, shaping expectations and experiences consistent with its literate aims. Suggests that email tutoring…

  14. Conformational Analysis of Drug Molecules: A Practical Exercise in the Medicinal Chemistry Course

    ERIC Educational Resources Information Center

    Yuriev, Elizabeth; Chalmers, David; Capuano, Ben

    2009-01-01

    Medicinal chemistry is a specialized, scientific discipline. Computational chemistry and structure-based drug design constitute important themes in the education of medicinal chemists. This problem-based task is associated with structure-based drug design lectures. It requires students to use computational techniques to investigate conformational…

  15. STARS: An integrated general-purpose finite element structural, aeroelastic, and aeroservoelastic analysis computer program

    NASA Technical Reports Server (NTRS)

    Gupta, Kajal K.

    1991-01-01

    The details of an integrated general-purpose finite element structural analysis computer program which is also capable of solving complex multidisciplinary problems is presented. Thus, the SOLIDS module of the program possesses an extensive finite element library suitable for modeling most practical problems and is capable of solving statics, vibration, buckling, and dynamic response problems of complex structures, including spinning ones. The aerodynamic module, AERO, enables computation of unsteady aerodynamic forces for both subsonic and supersonic flow for subsequent flutter and divergence analysis of the structure. The associated aeroservoelastic analysis module, ASE, effects aero-structural-control stability analysis yielding frequency responses as well as damping characteristics of the structure. The program is written in standard FORTRAN to run on a wide variety of computers. Extensive graphics, preprocessing, and postprocessing routines are also available pertaining to a number of terminals.

  16. Information-seeking practices of dental hygienists.

    PubMed Central

    Gravois, S L; Fisher, W; Bowen, D M

    1995-01-01

    This paper reports on a survey of the information-seeking, critical-analysis, and computer-application practices of dental hygienists. Questionnaires were mailed to a convenience sample of seventy-one dental hygiene practitioners. A 62% response rate was achieved. Results indicated that discussions with colleagues, continuing education courses, journals, and newsletters were the sources used most frequently for professional development and information retrieval. To evaluate professional information, these hygienists tended to rely on personal experience, credibility of the journal, and discussions with colleagues. Word processing was the most frequently used computer application; online database searching was rare in this group. Computer use within the employment setting was primarily for business rather than clinical applications. Many hygienists were interested in attending continuing education courses on use of computers to acquire professional information. PMID:8547904

  17. Analyzing and Modifying Coaching Behaviors by Means of Computer Aided Observation.

    ERIC Educational Resources Information Center

    Partridge, David; Franks, Ian M.

    1996-01-01

    This study examined the effectiveness of the Coaching Analysis Instrument (CAI), which collects data and provides feedback on coaches' verbal behaviors as they organize and instruct athletes during practices. Practice sessions were videotaped and analyzed. CAI results helped modify the coaches' behaviors in positive ways. (SM)

  18. Content Analysis: What Are They Talking About?

    ERIC Educational Resources Information Center

    Strijbos, Jan-Willem; Martens, Rob L.; Prins, Frans J.; Jochems, Wim M. G.

    2006-01-01

    Quantitative content analysis is increasingly used to surpass surface level analyses in computer-supported collaborative learning (e.g., counting messages), but critical reflection on accepted practice has generally not been reported. A review of CSCL conference proceedings revealed a general vagueness in definitions of units of analysis. In…

  19. Fusing Symbolic and Numerical Diagnostic Computations

    NASA Technical Reports Server (NTRS)

    James, Mark

    2007-01-01

    X-2000 Anomaly Detection Language denotes a developmental computing language, and the software that establishes and utilizes the language, for fusing two diagnostic computer programs, one implementing a numerical analysis method and the other a symbolic analysis method, into a unified event-based decision analysis software system for realtime detection of events (e.g., failures) in a spacecraft, aircraft, or other complex engineering system. The numerical analysis method is performed by beacon-based exception analysis for multi-missions (BEAM), which has been discussed in several previous NASA Tech Briefs articles. The symbolic analysis method is, more specifically, an artificial-intelligence method of the knowledge-based, inference-engine type, and its implementation is exemplified by the Spacecraft Health Inference Engine (SHINE) software. The goal in developing the capability to fuse numerical and symbolic diagnostic components is to increase the depth of analysis beyond that previously attainable, thereby increasing the degree of confidence in the computed results. In practical terms, the sought improvement is to enable detection of all or most events, with no or few false alarms.

  20. Cost-effective use of minicomputers to solve structural problems

    NASA Technical Reports Server (NTRS)

    Storaasli, O. O.; Foster, E. P.

    1978-01-01

    Minicomputers are receiving increased use throughout the aerospace industry. Until recently, their use focused primarily on process control and numerically controlled tooling applications, while their exposure to and the opportunity for structural calculations has been limited. With the increased availability of this computer hardware, the question arises as to the feasibility and practicality of carrying out comprehensive structural analysis on a minicomputer. This paper presents results on the potential for using minicomputers for structural analysis by (1) selecting a comprehensive, finite-element structural analysis system in use on large mainframe computers; (2) implementing the system on a minicomputer; and (3) comparing the performance of the minicomputers with that of a large mainframe computer for the solution to a wide range of finite element structural analysis problems.

  1. A System Architecture for Efficient Transmission of Massive DNA Sequencing Data.

    PubMed

    Sağiroğlu, Mahmut Şamil; Külekci, M Oğuzhan

    2017-11-01

    The DNA sequencing data analysis pipelines require significant computational resources. In that sense, cloud computing infrastructures appear as a natural choice for this processing. However, the first practical difficulty in reaching the cloud computing services is the transmission of the massive DNA sequencing data from where they are produced to where they will be processed. The daily practice here begins with compressing the data in FASTQ file format, and then sending these data via fast data transmission protocols. In this study, we address the weaknesses in that daily practice and present a new system architecture that incorporates the computational resources available on the client side while dynamically adapting itself to the available bandwidth. Our proposal considers the real-life scenarios, where the bandwidth of the connection between the parties may fluctuate, and also the computing power on the client side may be of any size ranging from moderate personal computers to powerful workstations. The proposed architecture aims at utilizing both the communication bandwidth and the computing resources for satisfying the ultimate goal of reaching the results as early as possible. We present a prototype implementation of the proposed architecture, and analyze several real-life cases, which provide useful insights for the sequencing centers, especially on deciding when to use a cloud service and in what conditions.
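
    The core idea of spending client-side compute to reduce what must cross a constrained link can be sketched as a small compression/upload pipeline. This is only an illustration of the pipelining pattern, not the authors' system: the file name, chunk size, and `send` callable are placeholders, and the dynamic adaptation of compression effort to measured bandwidth described in the paper is omitted.

```python
import gzip
import queue
import threading

chunks = queue.Queue(maxsize=4)   # bounded queue: compression pauses if the upload lags

def compress(fastq_path, chunk_bytes=64 * 1024 * 1024):
    """Read the FASTQ file in blocks and compress each block on the client."""
    with open(fastq_path, "rb") as f:
        while block := f.read(chunk_bytes):
            chunks.put(gzip.compress(block))
    chunks.put(None)              # sentinel marking the end of the stream

def upload(send):
    """Drain compressed chunks and hand them to whatever transfer protocol is in use."""
    while (payload := chunks.get()) is not None:
        send(payload)

producer = threading.Thread(target=compress, args=("reads.fastq",))
producer.start()
upload(lambda payload: None)      # replace the lambda with a real transfer call
producer.join()
```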

  2. Manual vs. computer-assisted sperm analysis: can CASA replace manual assessment of human semen in clinical practice?

    PubMed

    Talarczyk-Desole, Joanna; Berger, Anna; Taszarek-Hauke, Grażyna; Hauke, Jan; Pawelczyk, Leszek; Jedrzejczak, Piotr

    2017-01-01

    The aim of the study was to check the quality of the computer-assisted sperm analysis (CASA) system in comparison to the reference manual method, as well as the standardization of computer-assisted semen assessment. The study was conducted between January and June 2015 at the Andrology Laboratory of the Division of Infertility and Reproductive Endocrinology, Poznań University of Medical Sciences, Poland. The study group consisted of 230 men who gave sperm samples for the first time in our center as part of an infertility investigation. The samples underwent manual and computer-assisted assessment of concentration, motility and morphology. A total of 184 samples were examined twice: manually, according to the 2010 WHO recommendations, and with CASA, using the program settings provided by the manufacturer. Additionally, 46 samples underwent two manual analyses and two computer-assisted analyses. A p-value < 0.05 was considered statistically significant. Statistically significant differences between CASA and manual measurements were found for all of the investigated sperm parameters except non-progressive motility. In the group of patients where all analyses with each method were performed twice on the same sample, we found no significant differences between the two assessments of the same sample, either in the samples analyzed manually or in those analyzed with CASA, although the standard deviation was higher in the CASA group. Our results suggest that computer-assisted sperm analysis requires further improvement for a wider application in clinical practice.

  3. Parallel-vector computation for structural analysis and nonlinear unconstrained optimization problems

    NASA Technical Reports Server (NTRS)

    Nguyen, Duc T.

    1990-01-01

    Practical engineering applications can often be formulated in the form of a constrained optimization problem. There are several solution algorithms for solving a constrained optimization problem. One approach is to convert a constrained problem into a series of unconstrained problems. Furthermore, unconstrained solution algorithms can be used as part of the constrained solution algorithms. Structural optimization is an iterative process: one starts with an initial design, and a finite element structural analysis is then performed to calculate the response of the system (such as displacements, stresses, eigenvalues, etc.). Based upon the sensitivity information on the objective and constraint functions, an optimizer such as ADS or IDESIGN can be used to find the new, improved design. For the structural analysis phase, the equation solver for the system of simultaneous, linear equations plays a key role since it is needed for static, eigenvalue, or dynamic analysis. For practical, large-scale structural analysis-synthesis applications, computational time can be excessively large. Thus, it is necessary to have a new structural analysis-synthesis code which employs new solution algorithms to exploit both the parallel and vector capabilities offered by modern, high-performance computers such as the Convex, Cray-2 and Cray-YMP. The objective of this research project is, therefore, to incorporate the latest developments in the parallel-vector equation solver PVSOLVE into a widely used finite-element production code, SAP-4. Furthermore, several nonlinear unconstrained optimization subroutines have also been developed and tested under a parallel computer environment. The unconstrained optimization subroutines are not only useful in their own right, but they can also be incorporated into a more popular constrained optimization code, such as ADS.

  4. Evaluating How the Computer-Supported Collaborative Learning Community Fosters Critical Reflective Practices

    ERIC Educational Resources Information Center

    Ma, Ada W.W.

    2013-01-01

    In recent research, little attention has been paid to issues of methodology and analysis methods to evaluate the quality of the collaborative learning community. To address such issues, an attempt is made to adopt the Activity System Model as an analytical framework to examine the relationship between computer supported collaborative learning…

  5. Negotiating Knowledge Contribution to Multiple Discourse Communities: A Doctoral Student of Computer Science Writing for Publication

    ERIC Educational Resources Information Center

    Li, Yongyan

    2006-01-01

    Despite the rich literature on disciplinary knowledge construction and multilingual scholars' academic literacy practices, little is known about how novice scholars are engaged in knowledge construction in negotiation with various target discourse communities. In this case study, with a focused analysis of a Chinese computer science doctoral…

  6. Computers in the Schools: How Will Educators Cope with the Revolution?

    ERIC Educational Resources Information Center

    Gleason, Gerald T.; Reed, Timothy

    A study was implemented to conduct a long-range observation and analysis of the process by which computers are channeled into educational practice. Data collection involved a structured interview with knowledgeable representatives of 35 school districts in Wisconsin. Participating schools were selected randomly and stratified by size. Questions in…

  7. Difference-Equation/Flow-Graph Circuit Analysis

    NASA Technical Reports Server (NTRS)

    Mcvey, I. M.

    1988-01-01

    Numerical technique enables rapid, approximate analyses of electronic circuits containing linear and nonlinear elements. Practiced in a variety of computer languages on large and small computers; for sufficiently simple circuits, programmable hand calculators can be used. Although some combinations of circuit elements make numerical solutions diverge, the technique enables quick identification of divergence and correction of circuit models so that solutions converge.
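
    As an illustration of the difference-equation idea (not taken from the Tech Brief itself), the short sketch below steps an RC low-pass circuit forward in time; choosing too large a step relative to the RC time constant reproduces the kind of divergence the brief warns about. Component values and step size are invented for the example.

```python
import numpy as np

# Forward-difference simulation of an RC low-pass circuit driven by a unit step.
R, C = 1.0e3, 1.0e-6      # 1 kOhm, 1 uF  ->  time constant RC = 1 ms
dt = 1.0e-5               # step size well below RC keeps the recursion stable
steps = 500

v_in = np.ones(steps)     # unit step input
v_out = np.zeros(steps)
for n in range(1, steps):
    # Difference form of C * dV/dt = (v_in - v_out) / R
    v_out[n] = v_out[n - 1] + dt * (v_in[n - 1] - v_out[n - 1]) / (R * C)

print(v_out[-1])          # approaches 1.0 as the capacitor charges
```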

  8. The emergence of spatial cyberinfrastructure.

    PubMed

    Wright, Dawn J; Wang, Shaowen

    2011-04-05

    Cyberinfrastructure integrates advanced computer, information, and communication technologies to empower computation-based and data-driven scientific practice and improve the synthesis and analysis of scientific data in a collaborative and shared fashion. As such, it now represents a paradigm shift in scientific research that has facilitated easy access to computational utilities and streamlined collaboration across distance and disciplines, thereby enabling scientific breakthroughs to be reached more quickly and efficiently. Spatial cyberinfrastructure seeks to resolve longstanding complex problems of handling and analyzing massive and heterogeneous spatial datasets as well as the necessity and benefits of sharing spatial data flexibly and securely. This article provides an overview and potential future directions of spatial cyberinfrastructure. The remaining four articles of the special feature are introduced and situated in the context of providing empirical examples of how spatial cyberinfrastructure is extending and enhancing scientific practice for improved synthesis and analysis of both physical and social science data. The primary focus of the articles is spatial analyses using distributed and high-performance computing, sensor networks, and other advanced information technology capabilities to transform massive spatial datasets into insights and knowledge.

  9. The emergence of spatial cyberinfrastructure

    PubMed Central

    Wright, Dawn J.; Wang, Shaowen

    2011-01-01

    Cyberinfrastructure integrates advanced computer, information, and communication technologies to empower computation-based and data-driven scientific practice and improve the synthesis and analysis of scientific data in a collaborative and shared fashion. As such, it now represents a paradigm shift in scientific research that has facilitated easy access to computational utilities and streamlined collaboration across distance and disciplines, thereby enabling scientific breakthroughs to be reached more quickly and efficiently. Spatial cyberinfrastructure seeks to resolve longstanding complex problems of handling and analyzing massive and heterogeneous spatial datasets as well as the necessity and benefits of sharing spatial data flexibly and securely. This article provides an overview and potential future directions of spatial cyberinfrastructure. The remaining four articles of the special feature are introduced and situated in the context of providing empirical examples of how spatial cyberinfrastructure is extending and enhancing scientific practice for improved synthesis and analysis of both physical and social science data. The primary focus of the articles is spatial analyses using distributed and high-performance computing, sensor networks, and other advanced information technology capabilities to transform massive spatial datasets into insights and knowledge. PMID:21467227

  10. Multiphysics Simulations: Challenges and Opportunities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keyes, D.; McInnes, L. C.; Woodward, C.

    This report is an outcome of the workshop Multiphysics Simulations: Challenges and Opportunities, sponsored by the Institute of Computing in Science (ICiS). Additional information about the workshop, including relevant reading and presentations on multiphysics issues in applications, algorithms, and software, is available via https://sites.google.com/site/icismultiphysics2011/. We consider multiphysics applications from algorithmic and architectural perspectives, where 'algorithmic' includes both mathematical analysis and computational complexity and 'architectural' includes both software and hardware environments. Many diverse multiphysics applications can be reduced, en route to their computational simulation, to a common algebraic coupling paradigm. Mathematical analysis of multiphysics coupling in this form is not always practical for realistic applications, but model problems representative of applications discussed herein can provide insight. A variety of software frameworks for multiphysics applications have been constructed and refined within disciplinary communities and executed on leading-edge computer systems. We examine several of these, expose some commonalities among them, and attempt to extrapolate best practices to future systems. From our study, we summarize challenges and forecast opportunities. We also initiate a modest suite of test problems encompassing features present in many applications.

  11. Tutorial: Parallel Computing of Simulation Models for Risk Analysis.

    PubMed

    Reilly, Allison C; Staid, Andrea; Gao, Michael; Guikema, Seth D

    2016-10-01

    Simulation models are widely used in risk analysis to study the effects of uncertainties on outcomes of interest in complex problems. Often, these models are computationally complex and time consuming to run. This latter point may be at odds with time-sensitive evaluations or may limit the number of parameters that are considered. In this article, we give an introductory tutorial focused on parallelizing simulation code to better leverage modern computing hardware, enabling risk analysts to better utilize simulation-based methods for quantifying uncertainty in practice. This article is aimed primarily at risk analysts who use simulation methods but do not yet utilize parallelization to decrease the computational burden of these models. The discussion is focused on conceptual aspects of embarrassingly parallel computer code and software considerations. Two complementary examples are shown using the languages MATLAB and R. A brief discussion of hardware considerations is located in the Appendix. © 2016 Society for Risk Analysis.
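
    The tutorial's worked examples are written in MATLAB and R; to keep the code examples in this collection in one language, the same embarrassingly parallel pattern is sketched below in Python. The simulation function is a placeholder, and the essential ingredients are only that replications share no state and carry independent seeds.

```python
import multiprocessing as mp
import random

def one_replication(seed):
    """Placeholder for one expensive, independent simulation run."""
    rng = random.Random(seed)
    return sum(rng.gauss(0.0, 1.0) for _ in range(10_000))

if __name__ == "__main__":
    seeds = range(1_000)                    # one seed per independent replication
    with mp.Pool() as pool:                 # uses all available cores by default
        outcomes = pool.map(one_replication, seeds)
    print("min:", min(outcomes), "max:", max(outcomes))
```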

  12. The Research of Computer Aided Farm Machinery Designing Method Based on Ergonomics

    NASA Astrophysics Data System (ADS)

    Gao, Xiyin; Li, Xinling; Song, Qiang; Zheng, Ying

    With the development of the agricultural economy, the variety of farm machinery products is gradually increasing, and ergonomics questions are becoming more and more prominent. The widespread application of computer-aided machinery design makes it possible for farm machinery design to be intuitive, flexible and convenient. At present, because existing computer-aided ergonomics software lacks a human body database suited to farm machinery design in China, ergonomics analyses of farm machinery designs show deviations. This article proposes using the open database interface procedure in CATIA to establish a human body database aimed at farm machinery design; reading the human body data into the ergonomics module of CATIA produces a virtual human body for practical applications, and the human posture analysis and human activity analysis modules can then be used to analyze the ergonomics of farm machinery. In this way, a computer-aided farm machinery design method based on ergonomics can be realized.

  13. Thermohydrodynamic Analysis of Cryogenic Liquid Turbulent Flow Fluid Film Bearings

    NASA Technical Reports Server (NTRS)

    San Andres, Luis

    1996-01-01

    This report describes a thermohydrodynamic analysis and computer programs for the prediction of the static and dynamic force response of fluid film bearings for cryogenic applications. The research performed addressed effectively the most important theoretical and practical issues related to the operation and performance of cryogenic fluid film bearings. Five computer codes have been licensed by the Texas A&M University to NASA centers and contractors and a total of 14 technical papers have been published.

  14. Genomics Virtual Laboratory: A Practical Bioinformatics Workbench for the Cloud

    PubMed Central

    Afgan, Enis; Sloggett, Clare; Goonasekera, Nuwan; Makunin, Igor; Benson, Derek; Crowe, Mark; Gladman, Simon; Kowsar, Yousef; Pheasant, Michael; Horst, Ron; Lonie, Andrew

    2015-01-01

    Background Analyzing high throughput genomics data is a complex and compute intensive task, generally requiring numerous software tools and large reference data sets, tied together in successive stages of data transformation and visualisation. A computational platform enabling best practice genomics analysis ideally meets a number of requirements, including: a wide range of analysis and visualisation tools, closely linked to large user and reference data sets; workflow platform(s) enabling accessible, reproducible, portable analyses, through a flexible set of interfaces; highly available, scalable computational resources; and flexibility and versatility in the use of these resources to meet demands and expertise of a variety of users. Access to an appropriate computational platform can be a significant barrier to researchers, as establishing such a platform requires a large upfront investment in hardware, experience, and expertise. Results We designed and implemented the Genomics Virtual Laboratory (GVL) as a middleware layer of machine images, cloud management tools, and online services that enable researchers to build arbitrarily sized compute clusters on demand, pre-populated with fully configured bioinformatics tools, reference datasets and workflow and visualisation options. The platform is flexible in that users can conduct analyses through web-based (Galaxy, RStudio, IPython Notebook) or command-line interfaces, and add/remove compute nodes and data resources as required. Best-practice tutorials and protocols provide a path from introductory training to practice. The GVL is available on the OpenStack-based Australian Research Cloud (http://nectar.org.au) and the Amazon Web Services cloud. The principles, implementation and build process are designed to be cloud-agnostic. Conclusions This paper provides a blueprint for the design and implementation of a cloud-based Genomics Virtual Laboratory. We discuss scope, design considerations and technical and logistical constraints, and explore the value added to the research community through the suite of services and resources provided by our implementation. PMID:26501966

  15. Genomics Virtual Laboratory: A Practical Bioinformatics Workbench for the Cloud.

    PubMed

    Afgan, Enis; Sloggett, Clare; Goonasekera, Nuwan; Makunin, Igor; Benson, Derek; Crowe, Mark; Gladman, Simon; Kowsar, Yousef; Pheasant, Michael; Horst, Ron; Lonie, Andrew

    2015-01-01

    Analyzing high throughput genomics data is a complex and compute intensive task, generally requiring numerous software tools and large reference data sets, tied together in successive stages of data transformation and visualisation. A computational platform enabling best practice genomics analysis ideally meets a number of requirements, including: a wide range of analysis and visualisation tools, closely linked to large user and reference data sets; workflow platform(s) enabling accessible, reproducible, portable analyses, through a flexible set of interfaces; highly available, scalable computational resources; and flexibility and versatility in the use of these resources to meet demands and expertise of a variety of users. Access to an appropriate computational platform can be a significant barrier to researchers, as establishing such a platform requires a large upfront investment in hardware, experience, and expertise. We designed and implemented the Genomics Virtual Laboratory (GVL) as a middleware layer of machine images, cloud management tools, and online services that enable researchers to build arbitrarily sized compute clusters on demand, pre-populated with fully configured bioinformatics tools, reference datasets and workflow and visualisation options. The platform is flexible in that users can conduct analyses through web-based (Galaxy, RStudio, IPython Notebook) or command-line interfaces, and add/remove compute nodes and data resources as required. Best-practice tutorials and protocols provide a path from introductory training to practice. The GVL is available on the OpenStack-based Australian Research Cloud (http://nectar.org.au) and the Amazon Web Services cloud. The principles, implementation and build process are designed to be cloud-agnostic. This paper provides a blueprint for the design and implementation of a cloud-based Genomics Virtual Laboratory. We discuss scope, design considerations and technical and logistical constraints, and explore the value added to the research community through the suite of services and resources provided by our implementation.

  16. Protocol for the PINCER trial: a cluster randomised trial comparing the effectiveness of a pharmacist-led IT-based intervention with simple feedback in reducing rates of clinically important errors in medicines management in general practices

    PubMed Central

    Avery, Anthony J; Rodgers, Sarah; Cantrill, Judith A; Armstrong, Sarah; Elliott, Rachel; Howard, Rachel; Kendrick, Denise; Morris, Caroline J; Murray, Scott A; Prescott, Robin J; Cresswell, Kathrin; Sheikh, Aziz

    2009-01-01

    Background Medication errors are an important cause of morbidity and mortality in primary care. The aims of this study are to determine the effectiveness, cost effectiveness and acceptability of a pharmacist-led information-technology-based complex intervention compared with simple feedback in reducing proportions of patients at risk from potentially hazardous prescribing and medicines management in general (family) practice. Methods Research subject group: "At-risk" patients registered with computerised general practices in two geographical regions in England. Design: Parallel group pragmatic cluster randomised trial. Interventions: Practices will be randomised to either: (i) Computer-generated feedback; or (ii) Pharmacist-led intervention comprising computer-generated feedback, educational outreach and dedicated support. Primary outcome measures: The proportion of patients in each practice at six and 12 months post intervention: - with a computer-recorded history of peptic ulcer being prescribed non-selective non-steroidal anti-inflammatory drugs - with a computer-recorded diagnosis of asthma being prescribed beta-blockers - aged 75 years and older receiving long-term prescriptions for angiotensin converting enzyme inhibitors or loop diuretics without a recorded assessment of renal function and electrolytes in the preceding 15 months. Secondary outcome measures: These relate to a number of other examples of potentially hazardous prescribing and medicines management. Economic analysis: An economic evaluation will be done of the cost per error avoided, from the perspective of the UK National Health Service (NHS), comparing the pharmacist-led intervention with simple feedback. Qualitative analysis: A qualitative study will be conducted to explore the views and experiences of health care professionals and NHS managers concerning the interventions, and investigate possible reasons why the interventions prove effective, or conversely prove ineffective. Sample size: 34 practices in each of the two treatment arms would provide at least 80% power (two-tailed alpha of 0.05) to demonstrate a 50% reduction in error rates for each of the three primary outcome measures in the pharmacist-led intervention arm compared with an 11% reduction in the simple feedback arm. Discussion At the time of submission of this article, 72 general practices have been recruited (36 in each arm of the trial) and the interventions have been delivered. Analysis has not yet been undertaken. Trial registration Current controlled trials ISRCTN21785299 PMID:19409095

  17. Incorporating Computer-Aided Language Sample Analysis into Clinical Practice

    ERIC Educational Resources Information Center

    Price, Lisa Hammett; Hendricks, Sean; Cook, Colleen

    2010-01-01

    Purpose: During the evaluation of language abilities, the needs of the child are best served when multiple types and sources of data are included in the evaluation process. Current educational policies and practice guidelines further dictate the use of authentic assessment data to inform diagnosis and treatment planning. Language sampling and…

  18. Farmers' Preferences for Methods of Receiving Information on New or Innovative Farming Practices.

    ERIC Educational Resources Information Center

    Riesenberg, Lou E.; Gor, Christopher Obel

    1989-01-01

    Survey of 386 Idaho farmers (response rate 58 percent) identified preferred methods of receiving information on new or innovative farming practices. Analysis revealed preference for interpersonal methods (demonstrations, tours, and field trips) over mass media such as computer-assisted instruction (CAI) and home study, although younger farmers,…

  19. Turbulent Dispersion Modelling in a Complex Urban Environment - Data Analysis and Model Development

    DTIC Science & Technology

    2010-02-01

    Technology Laboratory (Dstl) is used as a benchmark for comparison. Comparisons are also made with some more practically oriented computational fluid dynamics...predictions. To achieve clarity in the range of approaches available for practical models of contaminant dispersion in urban areas, an overview of...complexity of those methods is simplified to a degree that allows straightforward practical implementation and application. Using these results as a

  20. Linguistics, Computers, and the Language Teacher. A Communicative Approach.

    ERIC Educational Resources Information Center

    Underwood, John H.

    This analysis of the state of the art of computer programs and programming for language teaching has two parts. In the first part, an overview of the theory and practice of language teaching, Noam Chomsky's view of language, and the implications and problems of generative theory are presented. The theory behind the input model of language…

  1. Computer-Based Testing in the Medical Curriculum: A Decade of Experiences at One School

    ERIC Educational Resources Information Center

    McNulty, John; Chandrasekhar, Arcot; Hoyt, Amy; Gruener, Gregory; Espiritu, Baltazar; Price, Ron, Jr.

    2011-01-01

    This report summarizes more than a decade of experiences with implementing computer-based testing across a 4-year medical curriculum. Practical considerations are given to the fields incorporated within an item database and their use in the creation and analysis of examinations, security issues in the delivery and integrity of examinations,…

  2. Conductors of the Digitized Underground Railroad: Black Teachers Empower Pedagogies with Computer Technology

    ERIC Educational Resources Information Center

    Fredrick, Rona M.

    2007-01-01

    An interpretive case study framed by critical race theory (CRT) and African-centered theory is used to examine the teaching practices of two transformative African American teachers whose practices transformed the thinking and lives of their students. The analysis illustrates that computer technology has helped the teachers in engaging in…

  3. Spherical roller bearing analysis. SKF computer program SPHERBEAN. Volume 2: User's manual

    NASA Technical Reports Server (NTRS)

    Kleckner, R. J.; Dyba, G. J.

    1980-01-01

    The user's guide for the SPHERBEAN computer program for prediction of the thermomechanical performance characteristics of high speed lubricated double row spherical roller bearings is presented. The material presented is structured to guide the user in the practical and correct implementation of SPHERBEAN. Input and output, guidelines for program use, and sample executions are detailed.

  4. Reconfiguring practice: the interdependence of experimental procedure and computing infrastructure in distributed earthquake engineering.

    PubMed

    De La Flor, Grace; Ojaghi, Mobin; Martínez, Ignacio Lamata; Jirotka, Marina; Williams, Martin S; Blakeborough, Anthony

    2010-09-13

    When transitioning local laboratory practices into distributed environments, the interdependent relationship between experimental procedure and the technologies used to execute experiments becomes highly visible and a focal point for system requirements. We present an analysis of ways in which this reciprocal relationship is reconfiguring laboratory practices in earthquake engineering as a new computing infrastructure is embedded within three laboratories in order to facilitate the execution of shared experiments across geographically distributed sites. The system has been developed as part of the UK Network for Earthquake Engineering Simulation e-Research project, which links together three earthquake engineering laboratories at the universities of Bristol, Cambridge and Oxford. We consider the ways in which researchers have successfully adapted their local laboratory practices through the modification of experimental procedure so that they may meet the challenges of coordinating distributed earthquake experiments.

  5. Statistical analysis tables for truncated or censored samples

    NASA Technical Reports Server (NTRS)

    Cohen, A. C.; Cooley, C. G.

    1971-01-01

    Compilation describes characteristics of truncated and censored samples, and presents six illustrations of practical use of tables in computing mean and variance estimates for normal distribution using selected samples.
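
    Where the compilation relied on printed tables, equivalent estimates can now be obtained numerically. The sketch below is a modern stand-in rather than the report's method: it maximizes the likelihood of a right-censored normal sample directly, with the censoring point and data invented for illustration.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

# Simulated right-censored sample: values above c are only known to exceed c.
rng = np.random.default_rng(0)
full = rng.normal(loc=10.0, scale=2.0, size=200)
c = 12.0
observed = full[full <= c]
n_censored = int(np.sum(full > c))

def neg_log_likelihood(params):
    mu, log_sigma = params
    sigma = np.exp(log_sigma)                          # keeps sigma positive
    ll = norm.logpdf(observed, mu, sigma).sum()        # fully observed values
    ll += n_censored * norm.logsf(c, mu, sigma)        # each censored value: log P(X > c)
    return -ll

result = minimize(neg_log_likelihood, x0=[observed.mean(), np.log(observed.std())])
mu_hat, sigma_hat = result.x[0], np.exp(result.x[1])
print("mean estimate:", mu_hat, "std estimate:", sigma_hat)
```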

  6. Interactive Multimedia Instruction versus Traditional Training Programmes: Analysis of Their Effectiveness and Perception

    ERIC Educational Resources Information Center

    Shanthy, T. Rajula; Thiagarajan, R.

    2011-01-01

    In this article, the practicability of introducing computer multimedia as an educational tool was compared with the traditional approach for training sugarcane growers in ratoon management practices in three villages of Tamil Nadu state, India, using a pre-test, post-test control group experimental design. A CD-ROM was developed as a multimedia…

  7. The Reflective Macintosh: A Computer-Assisted Approach to Understanding and Improving Managerial Practice. Project Report.

    ERIC Educational Resources Information Center

    Kerchner, Charles; And Others

    The early stages of a microcomputer-based project to integrate managerial knowledge and practice are described in this report. Analysis of the problem-framing process that effective principals use to reduce complex problems into more manageable ones forms the basis of the project. Three cognitive-mapping techniques are used to understand the…

  8. The iPlant Collaborative: Cyberinfrastructure for Enabling Data to Discovery for the Life Sciences.

    PubMed

    Merchant, Nirav; Lyons, Eric; Goff, Stephen; Vaughn, Matthew; Ware, Doreen; Micklos, David; Antin, Parker

    2016-01-01

    The iPlant Collaborative provides life science research communities access to comprehensive, scalable, and cohesive computational infrastructure for data management; identity management; collaboration tools; and cloud, high-performance, high-throughput computing. iPlant provides training, learning material, and best practice resources to help all researchers make the best use of their data, expand their computational skill set, and effectively manage their data and computation when working as distributed teams. iPlant's platform permits researchers to easily deposit and share their data and deploy new computational tools and analysis workflows, allowing the broader community to easily use and reuse those data and computational analyses.

  9. A Framework for Understanding Physics Students' Computational Modeling Practices

    NASA Astrophysics Data System (ADS)

    Lunk, Brandon Robert

    With the growing push to include computational modeling in the physics classroom, we are faced with the need to better understand students' computational modeling practices. While existing research on programming comprehension explores how novices and experts generate programming algorithms, little of this discusses how domain content knowledge, and physics knowledge in particular, can influence students' programming practices. In an effort to better understand this issue, I have developed a framework for modeling these practices based on a resource stance towards student knowledge. A resource framework models knowledge as the activation of vast networks of elements called "resources." Much like neurons in the brain, resources that become active can trigger cascading events of activation throughout the broader network. This model emphasizes the connectivity between knowledge elements and provides a description of students' knowledge base. Together with resources, the concepts of "epistemic games" and "frames" provide a means for addressing the interaction between content knowledge and practices. Although this framework has generally been limited to describing conceptual and mathematical understanding, it also provides a means for addressing students' programming practices. In this dissertation, I will demonstrate this facet of a resource framework as well as fill in an important missing piece: a set of epistemic games that can describe students' computational modeling strategies. The development of this theoretical framework emerged from the analysis of video data of students generating computational models during the laboratory component of a Matter & Interactions: Modern Mechanics course. Student participants across two semesters were recorded as they worked in groups to fix pre-written computational models that were initially missing key lines of code. Analysis of this video data showed that the students' programming practices were highly influenced by their existing physics content knowledge, particularly their knowledge of analytic procedures. While this existing knowledge was often applied in inappropriate circumstances, the students were still able to display a considerable amount of understanding of the physics content and of analytic solution procedures. These observations could not be adequately accommodated by the existing literature on programming comprehension. In extending the resource framework to the task of computational modeling, I model students' practices in terms of three important elements. First, a knowledge base includes resources for understanding physics, math, and programming structures. Second, a mechanism for monitoring and control describes students' expectations as being directed towards numerical, analytic, qualitative, or rote solution approaches, which can be influenced by the problem representation. Third, a set of solution approaches, many of which were identified in this study, describe what aspects of the knowledge base students use and how they use that knowledge to enact their expectations. This framework allows us as researchers to track student discussions and pinpoint the source of difficulties. This work opens up many avenues of potential research. First, this framework gives researchers a vocabulary for extending Resource Theory to other domains of instruction, such as modeling how physics students use graphs. Second, this framework can be used as the basis for modeling expert physicists' programming practices.
Important instructional implications also follow from this research. Namely, as we broaden the use of computational modeling in the physics classroom, our instructional practices should focus on helping students understand the step-by-step nature of programming in contrast to the already salient analytic procedures.

  10. Kruskal-Wallis test: BASIC computer program to perform nonparametric one-way analysis of variance and multiple comparisons on ranks of several independent samples.

    PubMed

    Theodorsson-Norheim, E

    1986-08-01

    Multiple t tests at a fixed p level are frequently used to analyse biomedical data where analysis of variance followed by multiple comparisons or the adjustment of the p values according to Bonferroni would be more appropriate. The Kruskal-Wallis test is a nonparametric 'analysis of variance' which may be used to compare several independent samples. The present program is written in an elementary subset of BASIC and will perform the Kruskal-Wallis test followed by multiple comparisons between the groups on practically any computer programmable in BASIC.
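
    For readers working in a current environment rather than BASIC, the same analysis can be reproduced in a few lines of Python. The sketch below uses SciPy's Kruskal-Wallis test and, as a stand-in for the program's follow-up procedure, Bonferroni-adjusted pairwise Mann-Whitney comparisons; the data are invented.

```python
import itertools
from scipy.stats import kruskal, mannwhitneyu

# Three independent samples (illustrative values).
groups = {
    "a": [3.1, 2.8, 4.0, 3.6, 2.9],
    "b": [5.2, 4.8, 5.9, 6.1, 5.5],
    "c": [3.4, 3.9, 4.2, 3.1, 3.8],
}

h, p = kruskal(*groups.values())
print(f"Kruskal-Wallis H = {h:.2f}, p = {p:.4f}")

# Pairwise follow-up comparisons with a Bonferroni adjustment.
pairs = list(itertools.combinations(groups, 2))
for g1, g2 in pairs:
    _, p_pair = mannwhitneyu(groups[g1], groups[g2], alternative="two-sided")
    print(f"{g1} vs {g2}: adjusted p = {min(1.0, p_pair * len(pairs)):.4f}")
```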

  11. Ethics Regulation in Social Computing Research: Examining the Role of Institutional Review Boards.

    PubMed

    Vitak, Jessica; Proferes, Nicholas; Shilton, Katie; Ashktorab, Zahra

    2017-12-01

    The parallel rise of pervasive data collection platforms and computational methods for collecting, analyzing, and drawing inferences from large quantities of user data has advanced social computing research, investigating digital traces to understand mediated behaviors of individuals, groups, and societies. At the same time, methods employed to access these data have raised questions about ethical research practices. This article provides insights into U.S. institutional review boards' (IRBs) attitudes and practices regulating social computing research. Through descriptive and inferential analysis of survey data from staff at 59 IRBs at research universities, we examine how IRBs evaluate the growing variety of studies using pervasive digital data. Findings unpack the difficulties IRB staff face evaluating increasingly technical research proposals while highlighting the belief in their ability to surmount these difficulties. They also indicate a lack of consensus among IRB staff about what should be reviewed and a willingness to work closely with researchers.

  12. Propagation of Computational Uncertainty Using the Modern Design of Experiments

    NASA Technical Reports Server (NTRS)

    DeLoach, Richard

    2007-01-01

    This paper describes the use of formally designed experiments to aid in the error analysis of a computational experiment. A method is described by which the underlying code is approximated with relatively low-order polynomial graduating functions represented by truncated Taylor series approximations to the true underlying response function. A resource-minimal approach is outlined by which such graduating functions can be estimated from a minimum number of case runs of the underlying computational code. Certain practical considerations are discussed, including ways and means of coping with high-order response functions. The distributional properties of prediction residuals are presented and discussed. A practical method is presented for quantifying that component of the prediction uncertainty of a computational code that can be attributed to imperfect knowledge of independent variable levels. This method is illustrated with a recent assessment of uncertainty in computational estimates of Space Shuttle thermal and structural reentry loads attributable to ice and foam debris impact on ascent.
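
    A stripped-down version of the idea is sketched below: a handful of case runs of a stand-in "code" are fit with a low-order polynomial, and uncertainty in the independent variable is then propagated through the cheap surrogate by Monte Carlo. The function, design points, and input uncertainty are all invented, and the formally designed experiment used in the paper to choose the case runs is replaced here by a naive grid.

```python
import numpy as np

def expensive_code(x):
    """Placeholder for the underlying computational code."""
    return np.sin(x) + 0.1 * x**2

# A small set of case runs of the code over the range of interest.
design = np.linspace(0.0, 2.0, 7)
responses = np.array([expensive_code(x) for x in design])

# Low-order polynomial graduating function (a truncated-Taylor-like surrogate).
coeffs = np.polyfit(design, responses, deg=2)

# Imperfect knowledge of the independent variable level: x ~ N(1.0, 0.05**2).
rng = np.random.default_rng(1)
x_samples = rng.normal(1.0, 0.05, size=100_000)
y_samples = np.polyval(coeffs, x_samples)      # surrogate is essentially free to evaluate

print("predicted mean:", y_samples.mean(),
      "prediction std due to input uncertainty:", y_samples.std())
```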

  13. Transonic Flow Field Analysis for Wing-Fuselage Configurations

    NASA Technical Reports Server (NTRS)

    Boppe, C. W.

    1980-01-01

    A computational method for simulating the aerodynamics of wing-fuselage configurations at transonic speeds is developed. The finite difference scheme is characterized by a multiple embedded mesh system coupled with a modified or extended small disturbance flow equation. This approach permits a high degree of computational resolution in addition to coordinate system flexibility for treating complex realistic aircraft shapes. To augment the analysis method and permit applications to a wide range of practical engineering design problems, an arbitrary fuselage geometry modeling system is incorporated as well as methodology for computing wing viscous effects. Configuration drag is broken down into its friction, wave, and lift induced components. Typical computed results for isolated bodies, isolated wings, and wing-body combinations are presented. The results are correlated with experimental data. A computer code which employs this methodology is described.

  14. A case study for cloud based high throughput analysis of NGS data using the globus genomics system

    DOE PAGES

    Bhuvaneshwar, Krithika; Sulakhe, Dinanath; Gauba, Robinder; ...

    2015-01-01

    Next generation sequencing (NGS) technologies produce massive amounts of data requiring a powerful computational infrastructure, high quality bioinformatics software, and skilled personnel to operate the tools. We present a case study of a practical solution to this data management and analysis challenge that simplifies terabyte scale data handling and provides advanced tools for NGS data analysis. These capabilities are implemented using the “Globus Genomics” system, which is an enhanced Galaxy workflow system made available as a service that offers users the capability to process and transfer data easily, reliably and quickly to address end-to-end NGS analysis requirements. The Globus Genomics system is built on Amazon's cloud computing infrastructure. The system takes advantage of elastic scaling of compute resources to run multiple workflows in parallel and it also helps meet the scale-out analysis needs of modern translational genomics research.

  15. Intelligent Data Analysis in the 21st Century

    NASA Astrophysics Data System (ADS)

    Cohen, Paul; Adams, Niall

    When IDA began, data sets were small and clean, data provenance and management were not significant issues, workflows and grid computing and cloud computing didn’t exist, and the world was not populated with billions of cellphone and computer users. The original conception of intelligent data analysis — automating some of the reasoning of skilled data analysts — has not been updated to account for the dramatic changes in what skilled data analysis means, today. IDA might update its mission to address pressing problems in areas such as climate change, habitat loss, education, and medicine. It might anticipate data analysis opportunities five to ten years out, such as customizing educational trajectories to individual students, and personalizing medical protocols. Such developments will elevate the conference and our community by shifting our focus from arbitrary measures of the performance of isolated algorithms to the practical, societal value of intelligent data analysis systems.

  16. A case study for cloud based high throughput analysis of NGS data using the globus genomics system

    PubMed Central

    Bhuvaneshwar, Krithika; Sulakhe, Dinanath; Gauba, Robinder; Rodriguez, Alex; Madduri, Ravi; Dave, Utpal; Lacinski, Lukasz; Foster, Ian; Gusev, Yuriy; Madhavan, Subha

    2014-01-01

    Next generation sequencing (NGS) technologies produce massive amounts of data requiring a powerful computational infrastructure, high quality bioinformatics software, and skilled personnel to operate the tools. We present a case study of a practical solution to this data management and analysis challenge that simplifies terabyte scale data handling and provides advanced tools for NGS data analysis. These capabilities are implemented using the “Globus Genomics” system, which is an enhanced Galaxy workflow system made available as a service that offers users the capability to process and transfer data easily, reliably and quickly to address end-to-end NGS analysis requirements. The Globus Genomics system is built on Amazon's cloud computing infrastructure. The system takes advantage of elastic scaling of compute resources to run multiple workflows in parallel and it also helps meet the scale-out analysis needs of modern translational genomics research. PMID:26925205

  17. On the convergence of nanotechnology and Big Data analysis for computer-aided diagnosis.

    PubMed

    Rodrigues, Jose F; Paulovich, Fernando V; de Oliveira, Maria Cf; de Oliveira, Osvaldo N

    2016-04-01

    An overview is provided of the challenges involved in building computer-aided diagnosis systems capable of precise medical diagnostics based on integration and interpretation of data from different sources and formats. The availability of massive amounts of data and computational methods associated with the Big Data paradigm has brought hope that such systems may soon be available in routine clinical practices, which is not the case today. We focus on visual and machine learning analysis of medical data acquired with varied nanotech-based techniques and on methods for Big Data infrastructure. Because diagnosis is essentially a classification task, we address the machine learning techniques with supervised and unsupervised classification, making a critical assessment of the progress already made in the medical field and the prospects for the near future. We also advocate that successful computer-aided diagnosis requires a merge of methods and concepts from nanotechnology and Big Data analysis.

  18. Template Interfaces for Agile Parallel Data-Intensive Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramakrishnan, Lavanya; Gunter, Daniel; Pastorello, Gilerto Z.

    Tigres provides a programming library to compose and execute large-scale data-intensive scientific workflows from desktops to supercomputers. DOE User Facilities and large science collaborations are increasingly generating large enough data sets that it is no longer practical to download them to a desktop to operate on them. They are instead stored at centralized compute and storage resources such as high performance computing (HPC) centers. Analysis of this data requires an ability to run on these facilities, but with current technologies, scaling an analysis to an HPC center and to a large data set is difficult even for experts. Tigres is addressing the challenge of enabling collaborative analysis of DOE Science data through a new concept of reusable "templates" that enable scientists to easily compose, run and manage collaborative computational tasks. These templates define common computation patterns used in analyzing a data set.

  19. Practical Use of Computationally Frugal Model Analysis Methods

    DOE PAGES

    Hill, Mary C.; Kavetski, Dmitri; Clark, Martyn; ...

    2015-03-21

    Computationally frugal methods of model analysis can provide substantial benefits when developing models of groundwater and other environmental systems. Model analysis includes ways to evaluate model adequacy and to perform sensitivity and uncertainty analysis. Frugal methods typically require 10s of parallelizable model runs; their convenience allows for other uses of the computational effort. We suggest that model analysis be posed as a set of questions used to organize methods that range from frugal to expensive (requiring 10,000 model runs or more). This encourages focus on method utility, even when methods have starkly different theoretical backgrounds. We note that many frugal methods are more useful when unrealistic process-model nonlinearities are reduced. Inexpensive diagnostics are identified for determining when frugal methods are advantageous. Examples from the literature are used to demonstrate local methods and the diagnostics. We suggest that the greater use of computationally frugal model analysis methods would allow questions such as those posed in this work to be addressed more routinely, allowing the environmental sciences community to obtain greater scientific insight from the many ongoing and future modeling efforts.

  20. A Comparison of Missing-Data Procedures for Arima Time-Series Analysis

    ERIC Educational Resources Information Center

    Velicer, Wayne F.; Colby, Suzanne M.

    2005-01-01

    Missing data are a common practical problem for longitudinal designs. Time-series analysis is a longitudinal method that involves a large number of observations on a single unit. Four different missing-data methods (deletion, mean substitution, mean of adjacent observations, and maximum likelihood estimation) were evaluated. Computer-generated…

  1. 76 FR 23824 - Guidance for Industry: “Computer Crossmatch” (Computerized Analysis of the Compatibility Between...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-28

    ... the Compatibility Between the Donor's Cell Type and the Recipient's Serum or Plasma Type... Crossmatch' (Computerized Analysis of the Compatibility between the Donor's Cell Type and the Recipient's... donor's cell type and the recipient's serum or plasma type. The guidance describes practices that we...

  2. Digital Data Collection and Analysis: Application for Clinical Practice

    ERIC Educational Resources Information Center

    Ingram, Kelly; Bunta, Ferenc; Ingram, David

    2004-01-01

    Technology for digital speech recording and speech analysis is now readily available for all clinicians who use a computer. This article discusses some advantages of moving from analog to digital recordings and outlines basic recording procedures. The purpose of this article is to familiarize speech-language pathologists with computerized audio…

  3. Guidelines for Using the "Q" Test in Meta-Analysis

    ERIC Educational Resources Information Center

    Maeda, Yukiko; Harwell, Michael R.

    2016-01-01

    The "Q" test is regularly used in meta-analysis to examine variation in effect sizes. However, the assumptions of "Q" are unlikely to be satisfied in practice prompting methodological researchers to conduct computer simulation studies examining its statistical properties. Narrative summaries of this literature are available but…

  4. Practical quantification of necrosis in histological whole-slide images.

    PubMed

    Homeyer, André; Schenk, Andrea; Arlt, Janine; Dahmen, Uta; Dirsch, Olaf; Hahn, Horst K

    2013-06-01

    Since the histological quantification of necrosis is a common task in medical research and practice, we evaluate different image analysis methods for quantifying necrosis in whole-slide images. In a practical usage scenario, we assess the impact of different classification algorithms and feature sets on both accuracy and computation time. We show how a well-chosen combination of multiresolution features and an efficient postprocessing step enables the accurate quantification of necrosis in gigapixel images in less than a minute. The results are general enough to be applied to other areas of histological image analysis as well. Copyright © 2013 Elsevier Ltd. All rights reserved.

  5. A Simple Method for Automated Equilibration Detection in Molecular Simulations.

    PubMed

    Chodera, John D

    2016-04-12

    Molecular simulations intended to compute equilibrium properties are often initiated from configurations that are highly atypical of equilibrium samples, a practice which can generate a distinct initial transient in mechanical observables computed from the simulation trajectory. Traditional practice in simulation data analysis recommends this initial portion be discarded to equilibration, but no simple, general, and automated procedure for this process exists. Here, we suggest a conceptually simple automated procedure that does not make strict assumptions about the distribution of the observable of interest in which the equilibration time is chosen to maximize the number of effectively uncorrelated samples in the production timespan used to compute equilibrium averages. We present a simple Python reference implementation of this procedure and demonstrate its utility on typical molecular simulation data.
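
    The selection rule described above is easy to prototype. The sketch below is an independent illustration of the idea in Python, not the author's reference implementation; the autocorrelation-based estimate of the statistical inefficiency is a common textbook form and is an assumption of this sketch.

        import numpy as np

        def statistical_inefficiency(a):
            # g = 1 + 2 * sum of autocorrelations, truncated at the first
            # non-positive autocorrelation estimate.
            a = np.asarray(a, dtype=float)
            n, da = a.size, a - a.mean()
            var = da.var()
            if n < 3 or var == 0.0:
                return 1.0
            g = 1.0
            for t in range(1, n - 1):
                c = np.dot(da[: n - t], da[t:]) / ((n - t) * var)
                if c <= 0.0:
                    break
                g += 2.0 * c * (1.0 - t / n)
            return max(g, 1.0)

        def detect_equilibration(a, stride=10):
            # Choose the discard time t0 that maximizes the number of
            # effectively uncorrelated samples N_eff = (N - t0) / g(t0).
            a = np.asarray(a, dtype=float)
            best_t0, best_neff = 0, 0.0
            for t0 in range(0, a.size - 2, stride):
                neff = (a.size - t0) / statistical_inefficiency(a[t0:])
                if neff > best_neff:
                    best_t0, best_neff = t0, neff
            return best_t0, best_neff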

  6. A simple method for automated equilibration detection in molecular simulations

    PubMed Central

    Chodera, John D.

    2016-01-01

    Molecular simulations intended to compute equilibrium properties are often initiated from configurations that are highly atypical of equilibrium samples, a practice which can generate a distinct initial transient in mechanical observables computed from the simulation trajectory. Traditional practice in simulation data analysis recommends this initial portion be discarded to equilibration, but no simple, general, and automated procedure for this process exists. Here, we suggest a conceptually simple automated procedure that does not make strict assumptions about the distribution of the observable of interest, in which the equilibration time is chosen to maximize the number of effectively uncorrelated samples in the production timespan used to compute equilibrium averages. We present a simple Python reference implementation of this procedure, and demonstrate its utility on typical molecular simulation data. PMID:26771390

  7. Artificial Intelligence in Medical Practice: The Question to the Answer?

    PubMed

    Miller, D Douglas; Brown, Eric W

    2018-02-01

    Computer science advances and ultra-fast computing speeds find artificial intelligence (AI) broadly benefitting modern society-forecasting weather, recognizing faces, detecting fraud, and deciphering genomics. AI's future role in medical practice remains an unanswered question. Machines (computers) learn to detect patterns not decipherable using biostatistics by processing massive datasets (big data) through layered mathematical models (algorithms). Correcting algorithm mistakes (training) adds to AI predictive model confidence. AI is being successfully applied for image analysis in radiology, pathology, and dermatology, with diagnostic speed exceeding, and accuracy paralleling, medical experts. While diagnostic confidence never reaches 100%, combining machines plus physicians reliably enhances system performance. Cognitive programs are impacting medical practice by applying natural language processing to read the rapidly expanding scientific literature and collate years of diverse electronic medical records. In this and other ways, AI may optimize the care trajectory of chronic disease patients, suggest precision therapies for complex illnesses, reduce medical errors, and improve subject enrollment into clinical trials. Copyright © 2018 Elsevier Inc. All rights reserved.

  8. Assessing Scientific Practices Using Machine-Learning Methods: How Closely Do They Match Clinical Interview Performance?

    NASA Astrophysics Data System (ADS)

    Beggrow, Elizabeth P.; Ha, Minsu; Nehm, Ross H.; Pearl, Dennis; Boone, William J.

    2014-02-01

    The landscape of science education is being transformed by the new Framework for Science Education (National Research Council, A framework for K-12 science education: practices, crosscutting concepts, and core ideas. The National Academies Press, Washington, DC, 2012), which emphasizes the centrality of scientific practices—such as explanation, argumentation, and communication—in science teaching, learning, and assessment. A major challenge facing the field of science education is developing assessment tools that are capable of validly and efficiently evaluating these practices. Our study examined the efficacy of a free, open-source machine-learning tool for evaluating the quality of students' written explanations of the causes of evolutionary change relative to three other approaches: (1) human-scored written explanations, (2) a multiple-choice test, and (3) clinical oral interviews. A large sample of undergraduates (n = 104) exposed to varying amounts of evolution content completed all three assessments: a clinical oral interview, a written open-response assessment, and a multiple-choice test. Rasch analysis was used to compute linear person measures and linear item measures on a single logit scale. We found that the multiple-choice test displayed poor person and item fit (mean square outfit >1.3), while both oral interview measures and computer-generated written response measures exhibited acceptable fit (average mean square outfit for interview: person 0.97, item 0.97; computer: person 1.03, item 1.06). Multiple-choice test measures were more weakly associated with interview measures (r = 0.35) than the computer-scored explanation measures (r = 0.63). Overall, Rasch analysis indicated that computer-scored written explanation measures (1) have the strongest correspondence to oral interview measures; (2) are capable of capturing students' normative scientific and naive ideas as accurately as human-scored explanations, and (3) more validly detect understanding than the multiple-choice assessment. These findings demonstrate the great potential of machine-learning tools for assessing key scientific practices highlighted in the new Framework for Science Education.
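
    The dichotomous Rasch model behind the person and item measures reported here is conventionally written as (standard form, not quoted from the study):

        P(X_{ni} = 1 \mid \theta_n, \delta_i) = \frac{\exp(\theta_n - \delta_i)}{1 + \exp(\theta_n - \delta_i)}

    where θn is the ability of person n and δi the difficulty of item i, both expressed on the same logit scale, which is what allows person and item measures to be compared directly.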

  9. Secure distributed genome analysis for GWAS and sequence comparison computation.

    PubMed

    Zhang, Yihua; Blanton, Marina; Almashaqbeh, Ghada

    2015-01-01

    The rapid increase in the availability and volume of genomic data makes significant advances in biomedical research possible, but sharing of genomic data poses challenges due to the highly sensitive nature of such data. To address the challenges, a competition for secure distributed processing of genomic data was organized by the iDASH research center. In this work we propose techniques for securing computation with real-life genomic data for minor allele frequency and chi-squared statistics computation, as well as distance computation between two genomic sequences, as specified by the iDASH competition tasks. We put forward novel optimizations, including a generalization of a version of mergesort, which might be of independent interest. We provide implementation results of our techniques based on secret sharing that demonstrate practicality of the suggested protocols and also report on performance improvements due to our optimization techniques. This work describes our techniques, findings, and experimental results developed and obtained as part of iDASH 2015 research competition to secure real-life genomic computations and shows feasibility of securely computing with genomic data in practice.

  10. Secure distributed genome analysis for GWAS and sequence comparison computation

    PubMed Central

    2015-01-01

    Background The rapid increase in the availability and volume of genomic data makes significant advances in biomedical research possible, but sharing of genomic data poses challenges due to the highly sensitive nature of such data. To address the challenges, a competition for secure distributed processing of genomic data was organized by the iDASH research center. Methods In this work we propose techniques for securing computation with real-life genomic data for minor allele frequency and chi-squared statistics computation, as well as distance computation between two genomic sequences, as specified by the iDASH competition tasks. We put forward novel optimizations, including a generalization of a version of mergesort, which might be of independent interest. Results We provide implementation results of our techniques based on secret sharing that demonstrate practicality of the suggested protocols and also report on performance improvements due to our optimization techniques. Conclusions This work describes our techniques, findings, and experimental results developed and obtained as part of iDASH 2015 research competition to secure real-life genomic computations and shows feasibility of securely computing with genomic data in practice. PMID:26733307

  11. Parallel peak pruning for scalable SMP contour tree computation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carr, Hamish A.; Weber, Gunther H.; Sewell, Christopher M.

    As data sets grow to exascale, automated data analysis and visualisation are increasingly important, to intermediate human understanding and to reduce demands on disk storage via in situ analysis. Trends in the architecture of high performance computing systems require analysis algorithms that make effective use of combinations of massively multicore and distributed systems. One of the principal analytic tools is the contour tree, which analyses relationships between contours to identify features of more than local importance. Unfortunately, the predominant algorithms for computing the contour tree are explicitly serial, and founded on serial metaphors, which has limited the scalability of this form of analysis. While there is some work on distributed contour tree computation, and separately on hybrid GPU-CPU computation, there is no efficient algorithm with strong formal guarantees on performance allied with fast practical performance. In this paper, we report the first shared-memory (SMP) algorithm for fully parallel contour tree computation, with formal guarantees of O(lg n lg t) parallel steps and O(n lg n) work, and implementations with up to 10x parallel speedup in OpenMP and up to 50x speedup in NVIDIA Thrust.

  12. Efficient Privacy-Aware Record Integration.

    PubMed

    Kuzu, Mehmet; Kantarcioglu, Murat; Inan, Ali; Bertino, Elisa; Durham, Elizabeth; Malin, Bradley

    2013-01-01

    The integration of information dispersed among multiple repositories is a crucial step for accurate data analysis in various domains. In support of this goal, it is critical to devise procedures for identifying similar records across distinct data sources. At the same time, to adhere to privacy regulations and policies, such procedures should protect the confidentiality of the individuals to whom the information corresponds. Various private record linkage (PRL) protocols have been proposed to achieve this goal, involving secure multi-party computation (SMC) and similarity preserving data transformation techniques. SMC methods provide secure and accurate solutions to the PRL problem, but are prohibitively expensive in practice, mainly due to excessive computational requirements. Data transformation techniques offer more practical solutions, but incur the cost of information leakage and false matches. In this paper, we introduce a novel model for practical PRL, which (1) affords controlled and limited information leakage and (2) avoids false matches resulting from data transformation. Initially, we partition the data sources into blocks to eliminate comparisons for records that are unlikely to match. Then, to identify matches, we apply an efficient SMC technique between the candidate record pairs. To enable efficiency and privacy, our model leaks a controlled amount of obfuscated data prior to the secure computations. Applied obfuscation relies on differential privacy which provides strong privacy guarantees against adversaries with arbitrary background knowledge. In addition, we illustrate the practical nature of our approach through an empirical analysis with data derived from public voter records.

  13. Software and the Scientist: Coding and Citation Practices in Geodynamics

    NASA Astrophysics Data System (ADS)

    Hwang, Lorraine; Fish, Allison; Soito, Laura; Smith, MacKenzie; Kellogg, Louise H.

    2017-11-01

    In geodynamics as in other scientific areas, computation has become a core component of research, complementing field observation, laboratory analysis, experiment, and theory. Computational tools for data analysis, mapping, visualization, modeling, and simulation are essential for all aspects of the scientific workflow. Specialized scientific software is often developed by geodynamicists for their own use, and this effort represents a distinctive intellectual contribution. Drawing on a geodynamics community that focuses on developing and disseminating scientific software, we assess the current practices of software development and attribution, as well as attitudes about the need and best practices for software citation. We analyzed publications by participants in the Computational Infrastructure for Geodynamics and conducted mixed method surveys of the solid earth geophysics community. From this we learned that coding skills are typically learned informally. Participants considered good code to be trusted, reusable, readable, and not overly complex, and considered a good coder to be one who participates in the community in an open and reasonable manner, contributing to both long- and short-term community projects. Participants strongly supported citing software, as reflected by the high rate at which software packages were named in the literature and the high rate of citations in the references. However, clear instructions from developers on how to cite, and education of users on what to cite, are lacking. In addition, citations did not always lead to discoverability of the resource. A unique identifier for the software package itself, community education, and citation tools would contribute to better attribution practices.

  14. Computer ergonomics: the medical practice guide to developing good computer habits.

    PubMed

    Hills, Laura

    2011-01-01

    Medical practice employees are likely to use computers for at least some of their work. Some sit several hours each day at computer workstations. Therefore, it is important that members of your medical practice team develop good computer work habits and that they know how to align equipment, furniture, and their bodies to prevent strain, stress, and computer-related injuries. This article delves into the field of computer ergonomics-the design of computer workstations and work habits to reduce user fatigue, discomfort, and injury. It describes practical strategies medical practice employees can use to improve their computer work habits. Specifically, this article describes the proper use of the computer workstation chair, the ideal placement of the computer monitor and keyboard, and the best lighting for computer work areas and tasks. Moreover, this article includes computer ergonomic guidelines especially for bifocal and progressive lens wearers and offers 10 tips for proper mousing. Ergonomically correct posture, movements, positioning, and equipment are all described in detail to enable the frequent computer user in your medical practice to remain healthy, pain-free, and productive.

  15. Desktop microsimulation: a tool to improve efficiency in the medical office practice.

    PubMed

    Montgomery, James B; Linville, Beth A; Slonim, Anthony D

    2013-01-01

    Because the economic crisis in the United States continues to have an impact on healthcare organizations, industry leaders must optimize their decision making. Discrete-event computer simulation is a quality tool with a demonstrated track record of improving the precision of analysis for process redesign. However, the use of simulation to consolidate practices and design efficiencies into an unfinished medical office building was a unique task. A discrete-event computer simulation package was used to model the operations and forecast future results for four orthopedic surgery practices. The scenarios were created to allow an evaluation of the impact of process change on the output variables of exam room utilization, patient queue size, and staff utilization. The model helped with decisions regarding space allocation and efficient exam room use by demonstrating the impact of process changes in patient queues at check-in/out, x-ray, and cast room locations when compared to the status quo model. The analysis impacted decisions on facility layout, patient flow, and staff functions in this newly consolidated practice. Simulation was found to be a useful tool for process redesign and decision making even prior to building occupancy. © 2011 National Association for Healthcare Quality.

  16. An analytical procedure for evaluating shuttle abort staging aerodynamic characteristics

    NASA Technical Reports Server (NTRS)

    Meyer, R.

    1973-01-01

    An engineering analysis and computer code (AERSEP) for predicting Space Shuttle Orbiter - HO Tank longitudinal aerodynamic characteristics during abort separation has been developed. Computed results are applicable at Mach numbers above 2 for angles of attack between plus and minus 10 degrees. No practical restrictions on orbiter-tank relative positioning are indicated for tank-under-orbiter configurations. Input data requirements and computer running times are minimal, facilitating program use for parametric studies, test planning, and trajectory analysis. In a majority of cases, AERSEP orbiter-tank interference predictions are as accurate as state-of-the-art estimates for interference-free or isolated-vehicle configurations. AERSEP isolated-orbiter predictions also show excellent correlation with data.

  17. Modeling Criminal Activity in Urban Landscapes

    NASA Astrophysics Data System (ADS)

    Brantingham, Patricia; Glässer, Uwe; Jackson, Piper; Vajihollahi, Mona

    Computational and mathematical methods arguably have an enormous potential for serving practical needs in crime analysis and prevention by offering novel tools for crime investigations and experimental platforms for evidence-based policy making. We present a comprehensive formal framework and tool support for mathematical and computational modeling of criminal behavior to facilitate systematic experimental studies of a wide range of criminal activities in urban environments. The focus is on spatial and temporal aspects of different forms of crime, including opportunistic and serial violent crimes. However, the proposed framework provides a basis to push beyond conventional empirical research and engage the use of computational thinking and social simulations in the analysis of terrorism and counter-terrorism.

  18. The iPlant Collaborative: Cyberinfrastructure for Enabling Data to Discovery for the Life Sciences

    PubMed Central

    Merchant, Nirav; Lyons, Eric; Goff, Stephen; Vaughn, Matthew; Ware, Doreen; Micklos, David; Antin, Parker

    2016-01-01

    The iPlant Collaborative provides life science research communities access to comprehensive, scalable, and cohesive computational infrastructure for data management; identity management; collaboration tools; and cloud, high-performance, high-throughput computing. iPlant provides training, learning material, and best practice resources to help all researchers make the best use of their data, expand their computational skill set, and effectively manage their data and computation when working as distributed teams. iPlant’s platform permits researchers to easily deposit and share their data and deploy new computational tools and analysis workflows, allowing the broader community to easily use and reuse those data and computational analyses. PMID:26752627

  19. Programmers, professors, and parasites: credit and co-authorship in computer science.

    PubMed

    Solomon, Justin

    2009-12-01

    This article presents an in-depth analysis of past and present publishing practices in academic computer science to suggest the establishment of a more consistent publishing standard. Historical precedent for academic publishing in computer science is established through the study of anecdotes as well as statistics collected from databases of published computer science papers. After examining these facts alongside information about analogous publishing situations and standards in other scientific fields, the article concludes with a list of basic principles that should be adopted in any computer science publishing standard. These principles would contribute to the reliability and scientific nature of academic publications in computer science and would allow for more straightforward discourse in future publications.

  20. MIT Lincoln Laboratory Takes the Mystery Out of Supercomputing

    DTIC Science & Technology

    2017-01-18

    analysis, designing sensors, and developing algorithms. In 2008, the Lincoln demonstrated the largest single problem ever run on a computer using ... computation. As we design and prototype these devices, the use of leading-edge engineering practices has become the de facto standard. This includes... MIT Lincoln Laboratory Takes the Mystery Out of Supercomputing, by Dr. Jeremy Kepner. The introduction of multicore and manycore processors

  1. Development of a Aerothermoelastic-Acoustics Simulation Capability of Flight Vehicles

    NASA Technical Reports Server (NTRS)

    Gupta, K. K.; Choi, S. B.; Ibrahim, A.

    2010-01-01

    A novel numerical, finite-element-based analysis methodology suitable for accurate and efficient simulation of practical, complex flight vehicles is presented in this paper. An associated computer code, developed in this connection, is also described in some detail. Thermal effects of high-speed flow obtained from a heat conduction analysis are incorporated in the modal analysis, which in turn affects the unsteady flow arising from the interaction of elastic structures with the air. Numerical examples pertaining to representative problems are given in detail, testifying to the efficacy of the advocated techniques. This is a unique implementation of temperature effects in a finite element CFD based multidisciplinary simulation analysis capability involving large-scale computations.

  2. AstroML: Python-powered Machine Learning for Astronomy

    NASA Astrophysics Data System (ADS)

    Vander Plas, Jake; Connolly, A. J.; Ivezic, Z.

    2014-01-01

    As astronomical data sets grow in size and complexity, automated machine learning and data mining methods are becoming an increasingly fundamental component of research in the field. The astroML project (http://astroML.org) provides a common repository for practical examples of the data mining and machine learning tools used and developed by astronomical researchers, written in Python. The astroML module contains a host of general-purpose data analysis and machine learning routines, loaders for openly-available astronomical datasets, and fast implementations of specific computational methods often used in astronomy and astrophysics. The associated website features hundreds of examples of these routines being used for analysis of real astronomical datasets, while the associated textbook provides a curriculum resource for graduate-level courses focusing on practical statistics, machine learning, and data mining approaches within Astronomical research. This poster will highlight several of the more powerful and unique examples of analysis performed with astroML, all of which can be reproduced in their entirety on any computer with the proper packages installed.

  3. Computer vision syndrome among computer office workers in a developing country: an evaluation of prevalence and risk factors.

    PubMed

    Ranasinghe, P; Wathurapatha, W S; Perera, Y S; Lamabadusuriya, D A; Kulatunga, S; Jayawardana, N; Katulanda, P

    2016-03-09

    Computer vision syndrome (CVS) is a group of visual symptoms experienced in relation to the use of computers. Nearly 60 million people suffer from CVS globally, resulting in reduced productivity at work and reduced quality of life of the computer worker. The present study aims to describe the prevalence of CVS and its associated factors among a nationally-representative sample of Sri Lankan computer workers. Two thousand five hundred computer office workers were invited for the study from all nine provinces of Sri Lanka between May and December 2009. A self-administered questionnaire was used to collect socio-demographic data, symptoms of CVS and its associated factors. A binary logistic regression analysis was performed in all patients with 'presence of CVS' as the dichotomous dependent variable and age, gender, duration of occupation, daily computer usage, pre-existing eye disease, not using a visual display terminal (VDT) filter, adjusting brightness of screen, use of contact lenses, angle of gaze and ergonomic practices knowledge as the continuous/dichotomous independent variables. A similar binary logistic regression analysis was performed in all patients with 'severity of CVS' as the dichotomous dependent variable and other continuous/dichotomous independent variables. Sample size was 2210 (response rate 88.4%). Mean age was 30.8 ± 8.1 years and 50.8% of the sample were males. The 1-year prevalence of CVS in the study population was 67.4%. Female gender (OR: 1.28), duration of occupation (OR: 1.07), daily computer usage (OR: 1.10), pre-existing eye disease (OR: 4.49), not using a VDT filter (OR: 1.02), use of contact lenses (OR: 3.21) and ergonomics practices knowledge (OR: 1.24) were all significantly associated with the presence of CVS. The duration of occupation (OR: 1.04) and presence of pre-existing eye disease (OR: 1.54) were significantly associated with the presence of 'severe CVS'. Sri Lankan computer workers had a high prevalence of CVS. Female gender, longer duration of occupation, higher daily computer usage, pre-existing eye disease, not using a VDT filter, use of contact lenses and higher ergonomics practices knowledge were all significantly associated with the presence of CVS. The factors associated with the severity of CVS were the duration of occupation and presence of pre-existing eye disease.
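
    A hedged sketch of the kind of binary logistic regression described above, using statsmodels in Python; the file name and column names are hypothetical placeholders for the study variables, not the actual data.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        # Hypothetical data set: one row per worker, cvs coded 0/1.
        df = pd.read_csv("cvs_survey.csv")

        model = smf.logit(
            "cvs ~ female + years_occupation + daily_hours"
            " + eye_disease + vdt_filter + contact_lenses + ergonomics_knowledge",
            data=df,
        ).fit()

        # Odds ratios and 95% confidence intervals, as reported in the abstract.
        odds_ratios = np.exp(model.params)
        conf_int = np.exp(model.conf_int())
        print(odds_ratios, conf_int, sep="\n")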

  4. ULg Spectra: An Interactive Software Tool to Improve Undergraduate Students' Structural Analysis Skills

    ERIC Educational Resources Information Center

    Agnello, Armelinda; Carre, Cyril; Billen, Roland; Leyh, Bernard; De Pauw, Edwin; Damblon, Christian

    2018-01-01

    The analysis of spectroscopic data to solve chemical structures requires practical skills and drills. In this context, we have developed ULg Spectra, a computer-based tool designed to improve the ability of learners to perform complex reasoning. The identification of organic chemical compounds involves gathering and interpreting complementary…

  5. A Meta-Analysis of Writing Instruction for Students in the Elementary Grades

    ERIC Educational Resources Information Center

    Graham, Steve; McKeown, Debra; Kiuhara, Sharlene; Harris, Karen R.

    2012-01-01

    In an effort to identify effective instructional practices for teaching writing to elementary grade students, we conducted a meta-analysis of the writing intervention literature, focusing our efforts on true and quasi-experiments. We located 115 documents that included the statistics for computing an effect size (ES). We calculated an average…

  6. Generalized causal mediation and path analysis: Extensions and practical considerations.

    PubMed

    Albert, Jeffrey M; Cho, Jang Ik; Liu, Yiying; Nelson, Suchitra

    2018-01-01

    Causal mediation analysis seeks to decompose the effect of a treatment or exposure among multiple possible paths and provide causally interpretable path-specific effect estimates. Recent advances have extended causal mediation analysis to situations with a sequence of mediators or multiple contemporaneous mediators. However, available methods still have limitations, and computational and other challenges remain. The present paper provides an extended causal mediation and path analysis methodology. The new method, implemented in the new R package, gmediation (described in a companion paper), accommodates both a sequence (two stages) of mediators and multiple mediators at each stage, and allows for multiple types of outcomes following generalized linear models. The methodology can also handle unsaturated models and clustered data. Addressing other practical issues, we provide new guidelines for the choice of a decomposition, and for the choice of a reference group multiplier for the reduction of Monte Carlo error in mediation formula computations. The new method is applied to data from a cohort study to illuminate the contribution of alternative biological and behavioral paths in the effect of socioeconomic status on dental caries in adolescence.
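
    For orientation, the single-mediator special case of the path-specific decomposition reads (standard counterfactual notation, not the paper's two-stage, multiple-mediator generalization):

        \mathrm{TE} = E\!\left[Y(1, M(1))\right] - E\!\left[Y(0, M(0))\right]
                    = \underbrace{E\!\left[Y(1, M(0))\right] - E\!\left[Y(0, M(0))\right]}_{\text{natural direct effect}}
                    + \underbrace{E\!\left[Y(1, M(1))\right] - E\!\left[Y(1, M(0))\right]}_{\text{natural indirect effect}}

    The paper's methodology, as implemented in the gmediation R package, extends this decomposition to two stages of mediators and multiple mediators per stage.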

  7. Knowledge and utilization of computer-software for statistics among Nigerian dentists.

    PubMed

    Chukwuneke, F N; Anyanechi, C E; Obiakor, A O; Amobi, O; Onyejiaka, N; Alamba, I

    2013-01-01

    The use of computer software for statistical analysis has transformed health information and data to their simplest form in the areas of access, storage, retrieval and analysis in the field of research. This survey therefore was carried out to assess the level of knowledge and utilization of computer software for statistical analysis among dental researchers in eastern Nigeria. Questionnaires on the use of computer software for statistical analysis were randomly distributed to 65 practicing dental surgeons of above 5 years experience in the tertiary academic hospitals in eastern Nigeria. The focus was on: years of clinical experience; research work experience; knowledge and application of computer-generated software for data processing and statistical analysis. Sixty-two (62/65; 95.4%) of these questionnaires were returned anonymously, which were used in our data analysis. Twenty-nine (29/62; 46.8%) respondents fall within those with 5-10 years of clinical experience, out of which none has completed the specialist training programme. Practitioners with above 10 years of clinical experience were 33 (33/62; 53.2%), out of which 15 (15/33; 45.5%) are specialists, representing 24.2% (15/62) of the total number of respondents. All 15 specialists are actively involved in research activities and only five (5/15; 33.3%) can carry out software-based statistical analysis unaided. This study has identified poor utilization of computer software for statistical analysis among dental researchers in eastern Nigeria. This is strongly associated with a lack of early exposure to the use of such software, especially during undergraduate training. This calls for the introduction of a computer training programme in the dental curriculum to enable practitioners to develop the habit of using computer software for their research.

  8. Comparison of meaningful learning characteristics in simulated nursing practice after traditional versus computer-based simulation method: a qualitative videography study.

    PubMed

    Poikela, Paula; Ruokamo, Heli; Teräs, Marianne

    2015-02-01

    Nursing educators must ensure that nursing students acquire the necessary competencies; finding the most purposeful teaching methods and encouraging learning through meaningful learning opportunities is necessary to meet this goal. We investigated student learning in a simulated nursing practice using videography. The purpose of this paper is to examine how two different teaching methods presented students' meaningful learning in a simulated nursing experience. The 6-hour study was divided into three parts: part I, general information; part II, training; and part III, simulated nursing practice. Part II was delivered by two different methods: a computer-based simulation and a lecture. The study was carried out in the simulated nursing practice in two universities of applied sciences, in Northern Finland. The participants in parts II and I were 40 first year nursing students; 12 student volunteers continued to part III. Qualitative analysis method was used. The data were collected using video recordings and analyzed by videography. The students who used a computer-based simulation program were more likely to report meaningful learning themes than those who were first exposed to lecture method. Educators should be encouraged to use computer-based simulation teaching in conjunction with other teaching methods to ensure that nursing students are able to receive the greatest educational benefits. Copyright © 2014 Elsevier Ltd. All rights reserved.

  9. The computation of three-dimensional flows using unstructured grids

    NASA Technical Reports Server (NTRS)

    Morgan, K.; Peraire, J.; Peiro, J.; Hassan, O.

    1991-01-01

    A general method is described for automatically discretizing, into unstructured assemblies of tetrahedra, the three-dimensional solution domains of complex shape which are of interest in practical computational aerodynamics. An algorithm for the solution of the compressible Euler equations which can be implemented on such general unstructured tetrahedral grids is described. This is an explicit cell-vertex scheme which follows a general Taylor-Galerkin philosophy. The approach is employed to compute a transonic inviscid flow over a standard wing and the results are shown to compare favorably with experimental observations. As a more practical demonstration, the method is then applied to the analysis of inviscid flow over a complete modern fighter configuration. The effect of using mesh adaptivity is illustrated when the method is applied to the solution of high speed flow in an engine inlet.

  10. Computer use in primary care practices in Canada.

    PubMed

    Anisimowicz, Yvonne; Bowes, Andrea E; Thompson, Ashley E; Miedema, Baukje; Hogg, William E; Wong, Sabrina T; Katz, Alan; Burge, Fred; Aubrey-Bassler, Kris; Yelland, Gregory S; Wodchis, Walter P

    2017-05-01

    To examine the use of computers in primary care practices. The international Quality and Cost of Primary Care study was conducted in Canada in 2013 and 2014 using a descriptive cross-sectional survey method to collect data from practices across Canada. Participating practices filled out several surveys, one of them being the Family Physician Survey, from which this study collected its data. All 10 Canadian provinces. A total of 788 family physicians. A computer use scale measured the extent to which family physicians integrated computers into their practices, with higher scores indicating a greater integration of computer use in practice. Analyses included t tests and χ2 tests comparing new and traditional models of primary care on measures of computer use and electronic health record (EHR) use, as well as descriptive statistics. Nearly all (97.5%) physicians reported using a computer in their practices, with moderately high computer use scale scores (mean [SD] score of 5.97 [2.96] out of 9), and many (65.7%) reported using EHRs. Physicians with practices operating under new models of primary care reported incorporating computers into their practices to a greater extent (mean [SD] score of 6.55 [2.64]) than physicians operating under traditional models did (mean [SD] score of 5.33 [3.15]; t(726.60) = 5.84; P < .001; Cohen d = 0.42, 95% CI 0.808 to 1.627) and were more likely to report using EHRs (73.8% vs 56.7%; [Formula: see text]; P < .001; odds ratio = 2.15). Overall, there was a statistically significant variability in computer use across provinces. Most family physicians in Canada have incorporated computers into their practices for administrative and scholarly activities; however, EHRs have not been adopted consistently across the country. Physicians with practices operating under the new, more collaborative models of primary care use computers more comprehensively and are more likely to use EHRs than those in practices operating under traditional models of primary care. Copyright © the College of Family Physicians of Canada.
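
    A minimal sketch of the between-model comparison on the computer use scale, with synthetic scores generated only to make the example runnable (the summary statistics are borrowed from the abstract, the data are not the study's):

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        new_model = rng.normal(6.55, 2.64, 400)      # new primary care models
        traditional = rng.normal(5.33, 3.15, 380)    # traditional models

        # Welch's t test (unequal variances), as the unequal group SDs suggest.
        t, p = stats.ttest_ind(new_model, traditional, equal_var=False)

        # Cohen's d with a pooled standard deviation.
        pooled_sd = np.sqrt((new_model.var(ddof=1) + traditional.var(ddof=1)) / 2.0)
        d = (new_model.mean() - traditional.mean()) / pooled_sd
        print(f"t = {t:.2f}, p = {p:.3g}, d = {d:.2f}")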

  11. Applications of complex systems theory in nursing education, research, and practice.

    PubMed

    Clancy, Thomas R; Effken, Judith A; Pesut, Daniel

    2008-01-01

    The clinical and administrative processes in today's healthcare environment are becoming increasingly complex. Multiple providers, new technology, competition, and the growing ubiquity of information all contribute to the notion of health care as a complex system. A complex system (CS) is characterized by a highly connected network of entities (e.g., physical objects, people or groups of people) from which higher order behavior emerges. Research in the transdisciplinary field of CS has focused on the use of computational modeling and simulation as a methodology for analyzing CS behavior. The creation of virtual worlds through computer simulation allows researchers to analyze multiple variables simultaneously and begin to understand behaviors that are common regardless of the discipline. The application of CS principles, mediated through computer simulation, informs nursing practice of the benefits and drawbacks of new procedures, protocols and practices before having to actually implement them. The inclusion of new computational tools and their applications in nursing education is also gaining attention. For example, education in CSs and applied computational applications has been endorsed by The Institute of Medicine, the American Organization of Nurse Executives and the American Association of Colleges of Nursing as essential training of nurse leaders. The purpose of this article is to review current research literature regarding CS science within the context of expert practice and implications for the education of nurse leadership roles. The article focuses on 3 broad areas: CS defined, literature review and exemplars from CS research and applications of CS theory in nursing leadership education. The article also highlights the key role nursing informaticists play in integrating emerging computational tools in the analysis of complex nursing systems.

  12. [Algorithms of artificial neural networks--practical application in medical science].

    PubMed

    Stefaniak, Bogusław; Cholewiński, Witold; Tarkowska, Anna

    2005-12-01

    Artificial Neural Networks (ANN) may be an alternative and complementary tool to typical statistical analysis. However, despite the many ready-to-use computer implementations of various ANN algorithms, artificial intelligence is applied to data processing relatively rarely. This paper presents practical aspects of the scientific application of ANN in medicine using widely available algorithms. The main steps of an analysis with ANN are discussed, from material selection and division into groups to the quality assessment of the obtained results. The most frequent, typical sources of error, as well as a comparison of the ANN method with modeling by regression analysis, are also described.

  13. Computer classes and games in virtual reality environment to reduce loneliness among students of an elderly reference center: Study protocol for a randomised cross-over design.

    PubMed

    Antunes, Thaiany Pedrozo Campos; Oliveira, Acary Souza Bulle de; Crocetta, Tania Brusque; Antão, Jennifer Yohanna Ferreira de Lima; Barbosa, Renata Thais de Almeida; Guarnieri, Regiani; Massetti, Thais; Monteiro, Carlos Bandeira de Mello; Abreu, Luiz Carlos de

    2017-03-01

    Physical and mental changes associated with aging commonly lead to a decrease in communication capacity, reducing social interactions and increasing loneliness. Computer classes for older adults make significant contributions to social and cognitive aspects of aging. Games in a virtual reality (VR) environment stimulate the practice of communicative and cognitive skills and might also bring benefits to older adults. Furthermore, it might help to initiate their contact to the modern technology. The purpose of this study protocol is to evaluate the effects of practicing VR games during computer classes on the level of loneliness of students of an elderly reference center. This study will be a prospective longitudinal study with a randomised cross-over design, with subjects aged 50 years and older, of both genders, spontaneously enrolled in computer classes for beginners. Data collection will be done in 3 moments: moment 0 (T0) - at baseline; moment 1 (T1) - after 8 typical computer classes; and moment 2 (T2) - after 8 computer classes which include 15 minutes for practicing games in VR environment. A characterization questionnaire, the short version of the Short Social and Emotional Loneliness Scale for Adults (SELSA-S) and 3 games with VR (Random, MoviLetrando, and Reaction Time) will be used. For the intervention phase 4 other games will be used: Coincident Timing, Motor Skill Analyser, Labyrinth, and Fitts. The statistical analysis will compare the evolution in loneliness perception, performance, and reaction time during the practice of the games between the 3 moments of data collection. Performance and reaction time during the practice of the games will also be correlated to the loneliness perception. The protocol is approved by the host institution's ethics committee under the number 52305215.3.0000.0082. Results will be disseminated via peer-reviewed journal articles and conferences. This clinical trial is registered at ClinicalTrials.gov identifier: NCT02798081.

  14. Computer-aided head film analysis: the University of California San Francisco method.

    PubMed

    Baumrind, S; Miller, D M

    1980-07-01

    Computer technology is already assuming an important role in the management of orthodontic practices. The next 10 years are likely to see expansion in computer usage into the areas of diagnosis, treatment planning, and treatment-record keeping. In the areas of diagnosis and treatment planning, one of the first problems to be attacked will be the automation of head film analysis. The problems of constructing computer-aided systems for this purpose are considered herein in the light of the authors' 10 years of experience in developing a similar system for research purposes. The need for building in methods for automatic detection and correction of gross errors is discussed and the authors' method for doing so is presented. The construction of a rudimentary machine-readable data base for research and clinical purposes is described.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    I. W. Ginsberg

    Multiresolutional decompositions known as spectral fingerprints are often used to extract spectral features from multispectral/hyperspectral data. In this study, the authors investigate the use of wavelet-based algorithms for generating spectral fingerprints. The wavelet-based algorithms are compared to the currently used method, traditional convolution with first-derivative Gaussian filters. The comparison analysis consists of two parts: (a) the computational expense of the new method is compared with the computational costs of the current method and (b) the outputs of the wavelet-based methods are compared with those of the current method to determine any practical differences in the resulting spectral fingerprints. The results show that the wavelet-based algorithms can greatly reduce the computational expense of generating spectral fingerprints, while practically no differences exist in the resulting fingerprints. The analysis is conducted on a database of hyperspectral signatures, namely, Hyperspectral Digital Image Collection Experiment (HYDICE) signatures. The reduction in computational expense is by a factor of about 30, and the average Euclidean distance between resulting fingerprints is on the order of 0.02.
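
    A hedged sketch of the two fingerprinting routes being compared, convolution with first-derivative Gaussian filters versus a wavelet decomposition, applied to a synthetic spectrum (the scales, wavelet choice, and signal are illustrative assumptions, not the study's settings):

        import numpy as np
        from scipy.ndimage import gaussian_filter1d
        import pywt

        # Synthetic stand-in for a hyperspectral signature.
        wavelengths = np.linspace(400.0, 2500.0, 512)
        spectrum = (np.exp(-((wavelengths - 1200.0) / 150.0) ** 2)
                    + 0.05 * np.sin(wavelengths / 40.0))

        # Current method: first-derivative Gaussian filters at several scales.
        gaussian_fp = [gaussian_filter1d(spectrum, sigma=s, order=1)
                       for s in (2, 4, 8, 16)]

        # Wavelet-based alternative: one multiresolution decomposition.
        wavelet_fp = pywt.wavedec(spectrum, "db4", level=4)

        print([f.shape for f in gaussian_fp])
        print([c.shape for c in wavelet_fp])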

  16. An analysis of computerization in primary care practices.

    PubMed

    Condon, James V; Smith, Sherry P

    2002-12-01

    To remain profitable, primary care practices, the front-line health care providers, must provide excellent patient care and reduce expenses while providing payers with accurate data. Many primary care practices have turned to computer technology to achieve these goals. This study examined the degree of computerization of primary care providers in the Augusta, Georgia, metropolitan area as well as the level of awareness of the Health Insurance Portability and Accountability Act (HIPAA) by primary care providers and its potential effect on their future computerization plans. The study's findings are presented and discussed as well as a number of recommendations for practice managers.

  17. Protocol for the PINCER trial: a cluster randomised trial comparing the effectiveness of a pharmacist-led IT-based intervention with simple feedback in reducing rates of clinically important errors in medicines management in general practices.

    PubMed

    Avery, Anthony J; Rodgers, Sarah; Cantrill, Judith A; Armstrong, Sarah; Elliott, Rachel; Howard, Rachel; Kendrick, Denise; Morris, Caroline J; Murray, Scott A; Prescott, Robin J; Cresswell, Kathrin; Sheikh, Aziz

    2009-05-01

    Medication errors are an important cause of morbidity and mortality in primary care. The aims of this study are to determine the effectiveness, cost effectiveness and acceptability of a pharmacist-led information-technology-based complex intervention compared with simple feedback in reducing proportions of patients at risk from potentially hazardous prescribing and medicines management in general (family) practice. RESEARCH SUBJECT GROUP: "At-risk" patients registered with computerised general practices in two geographical regions in England. Parallel group pragmatic cluster randomised trial. Practices will be randomised to either: (i) Computer-generated feedback; or (ii) Pharmacist-led intervention comprising computer-generated feedback, educational outreach and dedicated support. The proportion of patients in each practice at six and 12 months post intervention: - with a computer-recorded history of peptic ulcer being prescribed non-selective non-steroidal anti-inflammatory drugs; - with a computer-recorded diagnosis of asthma being prescribed beta-blockers; - aged 75 years and older receiving long-term prescriptions for angiotensin converting enzyme inhibitors or loop diuretics without a recorded assessment of renal function and electrolytes in the preceding 15 months. SECONDARY OUTCOME MEASURES: These relate to a number of other examples of potentially hazardous prescribing and medicines management. An economic evaluation will be done of the cost per error avoided, from the perspective of the UK National Health Service (NHS), comparing the pharmacist-led intervention with simple feedback. QUALITATIVE ANALYSIS: A qualitative study will be conducted to explore the views and experiences of health care professionals and NHS managers concerning the interventions, and investigate possible reasons why the interventions prove effective, or conversely prove ineffective. 34 practices in each of the two treatment arms would provide at least 80% power (two-tailed alpha of 0.05) to demonstrate a 50% reduction in error rates for each of the three primary outcome measures in the pharmacist-led intervention arm compared with an 11% reduction in the simple feedback arm. At the time of submission of this article, 72 general practices have been recruited (36 in each arm of the trial) and the interventions have been delivered. Analysis has not yet been undertaken.

  18. Tally and geometry definition influence on the computing time in radiotherapy treatment planning with MCNP Monte Carlo code.

    PubMed

    Juste, B; Miro, R; Gallardo, S; Santos, A; Verdu, G

    2006-01-01

    The present work has simulated the photon and electron transport in a Theratron 780 (MDS Nordion) (60)Co radiotherapy unit, using the Monte Carlo transport code, MCNP (Monte Carlo N-Particle), version 5. In order to become computationally more efficient in view of taking part in the practical field of radiotherapy treatment planning, this work is focused mainly on the analysis of dose results and on the required computing time of different tallies applied in the model to speed up calculations.

  19. Computing the modal mass from the state space model in combined experimental-operational modal analysis

    NASA Astrophysics Data System (ADS)

    Cara, Javier

    2016-05-01

    Modal parameters comprise natural frequencies, damping ratios, modal vectors and modal masses. In a theoretic framework, these parameters are the basis for the solution of vibration problems using the theory of modal superposition. In practice, they can be computed from input-output vibration data: the usual procedure is to estimate a mathematical model from the data and then to compute the modal parameters from the estimated model. The most popular models for input-output data are based on the frequency response function, but in recent years the state space model in the time domain has become popular among researchers and practitioners of modal analysis with experimental data. In this work, the equations to compute the modal parameters from the state space model when input and output data are available (like in combined experimental-operational modal analysis) are derived in detail using invariants of the state space model: the equations needed to compute natural frequencies, damping ratios and modal vectors are well known in the operational modal analysis framework, but the equation needed to compute the modal masses has not generated much interest in technical literature. These equations are applied to both a numerical simulation and an experimental study in the last part of the work.
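
    A sketch of the well-known part of that computation, recovering natural frequencies and damping ratios from the eigenvalues of an identified discrete-time state matrix (the modal-mass step discussed in the paper also needs the input and output matrices and is omitted here; the 1-DOF test system is hypothetical):

        import numpy as np
        from scipy.linalg import expm

        def modal_parameters(A, dt):
            # Discrete-time poles -> continuous-time poles -> frequencies and damping.
            lam, vec = np.linalg.eig(A)
            s = np.log(lam) / dt
            wn = np.abs(s)                 # natural angular frequencies [rad/s]
            zeta = -np.real(s) / wn        # damping ratios
            return wn / (2.0 * np.pi), zeta, vec

        # Hypothetical 1-DOF system: fn = 2 Hz, zeta = 2 %, sampled at 100 Hz.
        wn, z, dt = 2.0 * np.pi * 2.0, 0.02, 0.01
        Ac = np.array([[0.0, 1.0], [-wn ** 2, -2.0 * z * wn]])
        fn_est, zeta_est, _ = modal_parameters(expm(Ac * dt), dt)
        print(fn_est, zeta_est)   # each mode appears as a complex-conjugate pair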

  20. Primary care physicians' perspectives on computer-based health risk assessment tools for chronic diseases: a mixed methods study.

    PubMed

    Voruganti, Teja R; O'Brien, Mary Ann; Straus, Sharon E; McLaughlin, John R; Grunfeld, Eva

    2015-09-24

    Health risk assessment tools compute an individual's risk of developing a disease. Routine use of such tools by primary care physicians (PCPs) is potentially useful in chronic disease prevention. We sought physicians' awareness and perceptions of the usefulness, usability and feasibility of performing assessments with computer-based risk assessment tools in primary care settings. Focus groups and usability testing with a computer-based risk assessment tool were conducted with PCPs from both university-affiliated and community-based practices. Analysis was derived from grounded theory methodology. PCPs (n = 30) were aware of several risk assessment tools although only select tools were used routinely. The decision to use a tool depended on how use impacted practice workflow and whether the tool had credibility. Participants felt that embedding tools in the electronic medical records (EMRs) system might allow for health information from the medical record to auto-populate into the tool. User comprehension of risk could also be improved with computer-based interfaces that present risk in different formats. In this study, PCPs chose to use certain tools more regularly because of usability and credibility. Despite there being differences in the particular tools a clinical practice used, there was general appreciation for the usefulness of tools for different clinical situations. Participants characterised particular features of an ideal tool, feeling strongly that embedding risk assessment tools in the EMR would maximise accessibility and use of the tool for chronic disease management. However, appropriate practice workflow integration and features that facilitate patient understanding at point-of-care are also essential.

  1. Value of wireless personal digital assistants for practice: perceptions of advanced practice nurses.

    PubMed

    Garrett, Bernard; Klein, Gerri

    2008-08-01

    The aims were to explore advanced practice nurses' perceptions of wireless Personal Digital Assistant technologies, to establish the type and range of tools that would be useful to support their practice and to identify any requirements and limitations that may impact the implementation of wireless Personal Digital Assistants in practice. The wireless Personal Digital Assistant is becoming established as a hand-held computing tool for healthcare professionals. The reflections of advanced practice nurses about the value of wireless Personal Digital Assistants and their potential to contribute to improved patient care have not been investigated. A qualitative interpretivist design was used to explore advanced practice nurses' perceptions of the value of wireless Personal Digital Assistant technologies to support their practice. The data were collected using survey questionnaires and individual and focus group interviews with nurse practitioners, clinical nurse specialists and information technology managers based in British Columbia, Canada. An open-coding content analysis was performed using qualitative data analysis software. Wireless Personal Digital Assistant's use supports the principles of pervasivity and is a technology rapidly being adopted by advanced practice nurses. Some nurses indicated a reluctance to integrate wireless Personal Digital Assistant technologies into their practices because of the cost and the short technological life cycle of these devices. Many of the barriers which precluded the use of wireless networks within facilities are being removed. Nurses demonstrated a complex understanding of wireless Personal Digital Assistant technologies and gave good rationales for its integration in their practice. Nurses identified improved client care as the major benefit of this technology in practice and the type and range of tools they identified included clinical reference tools such as drug and diagnostic/laboratory reference applications and wireless communications. Nurses in this study support integrating wireless mobile computing technologies into their practice to improve client care.

  2. LOCAL ORTHOGONAL CUTTING METHOD FOR COMPUTING MEDIAL CURVES AND ITS BIOMEDICAL APPLICATIONS

    PubMed Central

    Einstein, Daniel R.; Dyedov, Vladimir

    2010-01-01

    Medial curves have a wide range of applications in geometric modeling and analysis (such as shape matching) and biomedical engineering (such as morphometry and computer assisted surgery). The computation of medial curves poses significant challenges, both in terms of theoretical analysis and practical efficiency and reliability. In this paper, we propose a definition and analysis of medial curves and also describe an efficient and robust method called local orthogonal cutting (LOC) for computing medial curves. Our approach is based on three key concepts: a local orthogonal decomposition of objects into substructures, a differential geometry concept called the interior center of curvature (ICC), and integrated stability and consistency tests. These concepts lend themselves to robust numerical techniques and result in an algorithm that is efficient and noise resistant. We illustrate the effectiveness and robustness of our approach with some highly complex, large-scale, noisy biomedical geometries derived from medical images, including lung airways and blood vessels. We also present comparisons of our method with some existing methods. PMID:20628546

  3. Computer-aided roll pass design in rolling of airfoil shapes

    NASA Technical Reports Server (NTRS)

    Akgerman, N.; Lahoti, G. D.; Altan, T.

    1980-01-01

    This paper describes two computer-aided design (CAD) programs developed for modeling the shape rolling process for airfoil sections. The first program, SHPROL, uses a modular upper-bound method of analysis and predicts the lateral spread, elongation, and roll torque. The second program, ROLPAS, predicts the stresses, roll separating force, the roll torque and the details of metal flow by simulating the rolling process, using the slab method of analysis. ROLPAS is an interactive program; it offers graphic display capabilities and allows the user to interact with the computer via a keyboard, CRT, and a light pen. The accuracy of the computerized models was evaluated by (a) rolling a selected airfoil shape at room temperature from 1018 steel and isothermally at high temperature from Ti-6Al-4V, and (b) comparing the experimental results with computer predictions. The comparisons indicated that the CAD systems, described here, are useful for practical engineering purposes and can be utilized in roll pass design and analysis for airfoil and similar shapes.

  4. Clinical applications of biomechanics cinematography.

    PubMed

    Woodle, A S

    1986-10-01

    Biomechanics cinematography is the analysis of movement of living organisms through the use of cameras, image projection systems, electronic digitizers, and computers. This article is a comparison of cinematographic systems and details practical uses of the modality in research and education.

  5. Beyond Logging of Fingertip Actions: Analysis of Collaborative Learning Using Multiple Sources of Data

    ERIC Educational Resources Information Center

    Avouris, N.; Fiotakis, G.; Kahrimanis, G.; Margaritis, M.; Komis, V.

    2007-01-01

    In this article, we discuss key requirements for collecting behavioural data concerning technology-supported collaborative learning activities. It is argued that the common practice of analysis of computer generated log files of user interactions with software tools is not enough for building a thorough view of the activity. Instead, more…

  6. Local Orthogonal Cutting Method for Computing Medial Curves and Its Biomedical Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jiao, Xiangmin; Einstein, Daniel R.; Dyedov, Volodymyr

    2010-03-24

    Medial curves have a wide range of applications in geometric modeling and analysis (such as shape matching) and biomedical engineering (such as morphometry and computer assisted surgery). The computation of medial curves poses significant challenges, both in terms of theoretical analysis and practical efficiency and reliability. In this paper, we propose a definition and analysis of medial curves and also describe an efficient and robust method for computing medial curves. Our approach is based on three key concepts: a local orthogonal decomposition of objects into substructures, a differential geometry concept called the interior center of curvature (ICC), and integrated stability and consistency tests. These concepts lend themselves to robust numerical techniques including eigenvalue analysis, weighted least squares approximations, and numerical minimization, resulting in an algorithm that is efficient and noise resistant. We illustrate the effectiveness and robustness of our approach with some highly complex, large-scale, noisy biomedical geometries derived from medical images, including lung airways and blood vessels. We also present comparisons of our method with some existing methods.
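
    The local orthogonal decomposition at the heart of this approach can be illustrated with a minimal sketch (not the authors' LOC implementation): eigenvalue analysis of the covariance of a point neighborhood yields a dominant axis direction and an orthogonal cutting plane. All data below are synthetic.

```python
# Minimal sketch (not the authors' LOC implementation): local orthogonal
# decomposition of a noisy point-cloud neighborhood via eigenvalue analysis
# of its covariance matrix. The eigenvector with the largest eigenvalue
# approximates the local axis of a tubular structure; the other two span
# the orthogonal cutting plane.
import numpy as np

def local_orthogonal_frame(points):
    """points: (n, 3) array of neighborhood coordinates."""
    centroid = points.mean(axis=0)
    cov = np.cov((points - centroid).T)          # 3x3 covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)       # ascending eigenvalues
    axis = eigvecs[:, -1]                        # dominant direction
    cutting_plane = eigvecs[:, :2]               # orthogonal complement
    return centroid, axis, cutting_plane, eigvals

# Synthetic noisy tube segment along x as a smoke test
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 200)
pts = np.column_stack([t, 0.02 * rng.normal(size=t.size),
                          0.02 * rng.normal(size=t.size)])
c, axis, plane, ev = local_orthogonal_frame(pts)
print("dominant axis (should be ~x):", np.round(axis, 3))
```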

  7. Boundary element analysis of post-tensioned slabs

    NASA Astrophysics Data System (ADS)

    Rashed, Youssef F.

    2015-06-01

    In this paper, the boundary element method is applied to carry out the structural analysis of post-tensioned flat slabs. The shear-deformable plate-bending model is employed. The effect of the pre-stressing cables is taken into account via the equivalent load method. The formulation is automated using a computer program, which uses quadratic boundary elements. Verification samples are presented, and finally a practical application is analyzed where results are compared against those obtained from the finite element method. The proposed method is efficient in terms of computer storage and processing time as well as the ease in data input and modifications.

  8. Supersonic second order analysis and optimization program user's manual

    NASA Technical Reports Server (NTRS)

    Clever, W. C.

    1984-01-01

    Approximate nonlinear inviscid theoretical techniques for predicting aerodynamic characteristics and surface pressures for relatively slender vehicles at supersonic and moderate hypersonic speeds were developed. Emphasis was placed on approaches that would be responsive to a conceptual configuration design level of effort. Second order small disturbance theory was utilized to meet this objective. Numerical codes were developed for analysis and design of relatively general three dimensional geometries. Results from the computations indicate good agreement with experimental results for a variety of wing, body, and wing-body shapes. Computational times of about one minute per case on a CDC 176 are typical for practical aircraft arrangements.

  9. Programming Probabilistic Structural Analysis for Parallel Processing Computer

    NASA Technical Reports Server (NTRS)

    Sues, Robert H.; Chen, Heh-Chyun; Twisdale, Lawrence A.; Chamis, Christos C.; Murthy, Pappu L. N.

    1991-01-01

    The ultimate goal of this research program is to make Probabilistic Structural Analysis (PSA) computationally efficient and hence practical for the design environment by achieving large scale parallelism. The paper identifies the multiple levels of parallelism in PSA, identifies methodologies for exploiting this parallelism, describes the development of a parallel stochastic finite element code, and presents results of two example applications. It is demonstrated that speeds within five percent of those theoretically possible can be achieved. A special-purpose numerical technique, the stochastic preconditioned conjugate gradient method, is also presented and demonstrated to be extremely efficient for certain classes of PSA problems.

  10. The Morningside Initiative: Collaborative Development of a Knowledge Repository to Accelerate Adoption of Clinical Decision Support

    DTIC Science & Technology

    2010-01-01

    Comparative Effectiveness Research, or other efforts to determine best practices and to develop guidelines based on meta-analysis and evidence-based medicine. An...authoritative reviews or other evidence-based medicine sources, but they have been made unambiguous and computable – a process which sounds...best practice recommendation created through an evidence-based medicine (EBM) development process. The lifecycle envisions four stages of refinement

  11. Self-directed student research through analysis of microarray datasets: a computer-based functional genomics practical class for masters-level students.

    PubMed

    Grenville-Briggs, Laura J; Stansfield, Ian

    2011-01-01

    This report describes a linked series of Masters-level computer practical workshops. They comprise an advanced functional genomics investigation, based upon analysis of a microarray dataset probing yeast DNA damage responses. The workshops require the students to analyse highly complex transcriptomics datasets, and were designed to stimulate active learning through experience of current research methods in bioinformatics and functional genomics. They seek to closely mimic a realistic research environment, and require the students first to propose research hypotheses, then test those hypotheses using specific sections of the microarray dataset. The complexity of the microarray data gives students the freedom to propose their own unique hypotheses; this research latitude was highly regarded by students and is a strength of this practical. In addition, the focus on DNA damage by radiation and mutagenic chemicals allows them to place their results in a human medical context, and successfully sparks broad interest in the subject material. In evaluation, 79% of students scored the practical workshops on a five-point scale as 4 or 5 (totally effective) for student learning. More broadly, the general use of microarray data as a "student research playground" is also discussed. Copyright © 2011 Wiley Periodicals, Inc.

  12. Computational perspectives in the history of science: to the memory of Peter Damerow.

    PubMed

    Laubichler, Manfred D; Maienschein, Jane; Renn, Jürgen

    2013-03-01

    Computational methods and perspectives can transform the history of science by enabling the pursuit of novel types of questions, dramatically expanding the scale of analysis (geographically and temporally), and offering novel forms of publication that greatly enhance access and transparency. This essay presents a brief summary of a computational research system for the history of science, discussing its implications for research, education, and publication practices and its connections to the open-access movement and similar transformations in the natural and social sciences that emphasize big data. It also argues that computational approaches help to reconnect the history of science to individual scientific disciplines.

  13. Machine learning applications in genetics and genomics.

    PubMed

    Libbrecht, Maxwell W; Noble, William Stafford

    2015-06-01

    The field of machine learning, which aims to develop computer algorithms that improve with experience, holds promise to enable computers to assist humans in the analysis of large, complex data sets. Here, we provide an overview of machine learning applications for the analysis of genome sequencing data sets, including the annotation of sequence elements and epigenetic, proteomic or metabolomic data. We present considerations and recurrent challenges in the application of supervised, semi-supervised and unsupervised machine learning methods, as well as of generative and discriminative modelling approaches. We provide general guidelines to assist in the selection of these machine learning methods and their practical application for the analysis of genetic and genomic data sets.

  14. Design of hat-stiffened composite panels loaded in axial compression

    NASA Astrophysics Data System (ADS)

    Paul, T. K.; Sinha, P. K.

    An integrated step-by-step analysis procedure for the design of axially compressed stiffened composite panels is outlined. The analysis makes use of the effective width concept. A computer code, BUSTCOP, is developed incorporating various aspects of buckling such as skin buckling, stiffener crippling and column buckling. Other salient features of the computer code include capabilities for generation of data based on micromechanics theories and hygrothermal analysis, and for prediction of strength failure. Parametric studies carried out on a hat-stiffened structural element indicate that, for all practical purposes, composite panels exhibit higher structural efficiency. Some hybrid laminates with outer layers made of aluminum alloy also show great promise for flight vehicle structural applications.

  15. Computer use in primary care practices in Canada

    PubMed Central

    Anisimowicz, Yvonne; Bowes, Andrea E.; Thompson, Ashley E.; Miedema, Baukje; Hogg, William E.; Wong, Sabrina T.; Katz, Alan; Burge, Fred; Aubrey-Bassler, Kris; Yelland, Gregory S.; Wodchis, Walter P.

    2017-01-01

    Abstract Objective To examine the use of computers in primary care practices. Design The international Quality and Cost of Primary Care study was conducted in Canada in 2013 and 2014 using a descriptive cross-sectional survey method to collect data from practices across Canada. Participating practices filled out several surveys, one of them being the Family Physician Survey, from which this study collected its data. Setting All 10 Canadian provinces. Participants A total of 788 family physicians. Main outcome measures A computer use scale measured the extent to which family physicians integrated computers into their practices, with higher scores indicating a greater integration of computer use in practice. Analyses included t tests and χ² tests comparing new and traditional models of primary care on measures of computer use and electronic health record (EHR) use, as well as descriptive statistics. Results Nearly all (97.5%) physicians reported using a computer in their practices, with moderately high computer use scale scores (mean [SD] score of 5.97 [2.96] out of 9), and many (65.7%) reported using EHRs. Physicians with practices operating under new models of primary care reported incorporating computers into their practices to a greater extent (mean [SD] score of 6.55 [2.64]) than physicians operating under traditional models did (mean [SD] score of 5.33 [3.15]; t(726.60) = 5.84; P < .001; Cohen d = 0.42, 95% CI 0.808 to 1.627) and were more likely to report using EHRs (73.8% vs 56.7%; χ²(1) = 25.43; P < .001; odds ratio = 2.15). Overall, there was statistically significant variability in computer use across provinces. Conclusion Most family physicians in Canada have incorporated computers into their practices for administrative and scholarly activities; however, EHRs have not been adopted consistently across the country. Physicians with practices operating under the new, more collaborative models of primary care use computers more comprehensively and are more likely to use EHRs than those in practices operating under traditional models of primary care. PMID:28500211

  16. Computer vision for general purpose visual inspection: a fuzzy logic approach

    NASA Astrophysics Data System (ADS)

    Chen, Y. H.

    In automatic visual industrial inspection, computer vision systems have been widely used. Such systems are often application specific, and therefore require domain knowledge in order to have a successful implementation. Since visual inspection can be viewed as a decision making process, it is argued that the integration of fuzzy logic analysis and computer vision systems provides a practical approach to general purpose visual inspection applications. This paper describes the development of an integrated fuzzy-rule-based automatic visual inspection system. Domain knowledge about a particular application is represented as a set of fuzzy rules. From the status of predefined fuzzy variables, the set of fuzzy rules is defuzzified to give the inspection results. A practical application to the inspection of IC marks (often English characters and a company logo) is demonstrated; it shows more consistent results than a conventional thresholding method.
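
    A toy illustration of the fuzzy-rule idea described above (not the paper's system) is sketched below: two hypothetical fuzzy variables describing an IC mark feed a two-rule base, and a weighted-average defuzzification yields an accept score. The membership functions and rules are illustrative assumptions.

```python
# Toy sketch of a fuzzy-rule-based inspection decision (illustrative only).
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-9),
                                 (c - x) / (c - b + 1e-9)), 0.0)

def inspect(contrast, completeness):
    low_c, high_c = tri(contrast, 0.0, 0.0, 0.6), tri(contrast, 0.4, 1.0, 1.0)
    low_m, high_m = tri(completeness, 0.0, 0.0, 0.7), tri(completeness, 0.5, 1.0, 1.0)
    # Rules: IF contrast high AND completeness high THEN accept (1.0)
    #        IF contrast low  OR  completeness low  THEN reject (0.0)
    accept_strength = min(high_c, high_m)
    reject_strength = max(low_c, low_m)
    # Weighted-average (Sugeno-style) defuzzification
    total = accept_strength + reject_strength + 1e-9
    return (accept_strength * 1.0 + reject_strength * 0.0) / total

print("good mark  ->", round(inspect(contrast=0.9, completeness=0.95), 2))
print("faded mark ->", round(inspect(contrast=0.3, completeness=0.90), 2))
```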

  17. Design Optimization Method for Composite Components Based on Moment Reliability-Sensitivity Criteria

    NASA Astrophysics Data System (ADS)

    Sun, Zhigang; Wang, Changxi; Niu, Xuming; Song, Yingdong

    2017-08-01

    In this paper, a Reliability-Sensitivity Based Design Optimization (RSBDO) methodology for the design of ceramic matrix composite (CMC) components is proposed. A practical and efficient method for reliability analysis and sensitivity analysis of complex components with arbitrary distribution parameters is investigated by using the perturbation method, the response surface method, the Edgeworth series and the sensitivity analysis approach. The RSBDO methodology is then established by incorporating the sensitivity calculation model into the RBDO methodology. Finally, the proposed RSBDO methodology is applied to the design of CMC components. By comparing with Monte Carlo simulation, the numerical results demonstrate that the proposed methodology provides an accurate, convergent and computationally efficient method for reliability-based finite element modeling in engineering practice.
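
    The RSBDO method itself relies on perturbation, response-surface and Edgeworth-series approximations; as a point of reference, the brute-force Monte Carlo baseline of the kind the authors compare against can be sketched as follows, estimating a failure probability and a crude sensitivity with respect to a distribution parameter. The limit state and distribution parameters below are illustrative only.

```python
# Brute-force Monte Carlo reference (illustrative limit state, not the paper's
# model): failure probability of g(X) = strength - stress, plus a crude
# finite-difference sensitivity of P_f to the mean strength. Reusing the same
# seed gives common random numbers, which smooths the finite difference.
import numpy as np

def failure_probability(mu_strength, n=200_000, seed=0):
    rng = np.random.default_rng(seed)            # common random numbers
    strength = rng.normal(mu_strength, 25.0, size=n)
    stress = rng.normal(300.0, 40.0, size=n)
    return float(np.mean(strength - stress < 0.0))

pf = failure_probability(420.0)
dpf = (failure_probability(425.0) - failure_probability(415.0)) / 10.0
print(f"P_failure ~ {pf:.4f}, dP_f/d(mean strength) ~ {dpf:.6f} per unit")
```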

  18. Connecting People to Places : Spatiotemporal Analysis of Transit Supply Using Travel Time Cubes

    DOT National Transportation Integrated Search

    2016-06-01

    Despite its importance, temporal measures of accessibility are rarely used in transit research or practice. This is primarily due to the inherent difficulty and complexity in computing time-based accessibility metrics. Estimating origin-to-destinatio...

  19. Social Network Analysis of Elders' Health Literacy and their Use of Online Health Information.

    PubMed

    Jang, Haeran; An, Ji-Young

    2014-07-01

    Utilizing social network analysis, this study aimed to analyze the main keywords in the literature regarding the health literacy of and the use of online health information by aged persons over 65. Medical Subject Heading keywords were extracted from articles on the PubMed database of the National Library of Medicine. For health literacy, 110 articles out of 361 were initially extracted. Seventy-one keywords out of 1,021 were finally selected after removing repeated keywords and applying pruning. Regarding the use of online health information, 19 articles out of 26 were selected. One hundred forty-four keywords were initially extracted. After removing the repeated keywords, 74 keywords were finally selected. Health literacy was found to be strongly connected with 'Health knowledge, attitudes, practices' and 'Patient education as topic.' 'Computer literacy' had strong connections with 'Internet' and 'Attitude towards computers.' 'Computer literacy' was connected to 'Health literacy,' and was studied according to the parameters 'Attitude towards health' and 'Patient education as topic.' The use of online health information was strongly connected with 'Health knowledge, attitudes, practices,' 'Consumer health information,' 'Patient education as topic,' etc. In the network, 'Computer literacy' was connected with 'Health education,' 'Patient satisfaction,' 'Self-efficacy,' 'Attitude to computer,' etc. Research on older citizens' health literacy and their use of online health information was conducted together with study of computer literacy, patient education, attitude towards health, health education, patient satisfaction, etc. In particular, self-efficacy was noted as an important keyword. Further research should be conducted to identify the effective outcomes of self-efficacy in the area of interest.
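
    The kind of keyword co-occurrence network used in this type of analysis can be sketched as follows; the keyword sets are illustrative stand-ins, not the study's MeSH data.

```python
# Minimal sketch of a keyword co-occurrence network: keywords appearing in
# the same article are linked, edge weights count co-occurrences, and degree
# centrality highlights the "hub" keywords. Keywords below are illustrative.
from itertools import combinations
import networkx as nx

articles = [  # hypothetical keyword sets per article
    {"Health literacy", "Patient education as topic", "Internet"},
    {"Health literacy", "Computer literacy", "Attitude to computers"},
    {"Computer literacy", "Internet", "Self-efficacy"},
]

G = nx.Graph()
for keywords in articles:
    for a, b in combinations(sorted(keywords), 2):
        if G.has_edge(a, b):
            G[a][b]["weight"] += 1
        else:
            G.add_edge(a, b, weight=1)

centrality = nx.degree_centrality(G)
for kw, c in sorted(centrality.items(), key=lambda kv: -kv[1]):
    print(f"{kw}: {c:.2f}")
```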

  20. Chemorheology of reactive systems: Finite element analysis

    NASA Technical Reports Server (NTRS)

    Douglas, C.; Roylance, D.

    1982-01-01

    The equations which govern the nonisothermal flow of reactive fluids are outlined, and the means by which finite element analysis is used to solve these equations for the sort of arbitrary boundary conditions encountered in industrial practice are described. The performance of the computer code is illustrated by several trial problems, selected more for their value in providing insight to polymer processing flows than as practical production problems. Although a good deal remains to be learned as to the performance and proper use of this numerical technique, it is undeniably useful in providing better understanding of today's complicated polymer processing problems.

  1. Articles on Practical Cybernetics. Computer-Developed Computers; Heuristics and Modern Sciences; Linguistics and Practice; Cybernetics and Moral-Ethical Considerations; and Men and Machines at the Chessboard.

    ERIC Educational Resources Information Center

    Berg, A. I.; And Others

    Five articles which were selected from a Russian language book on cybernetics and then translated are presented here. They deal with the topics of: computer-developed computers, heuristics and modern sciences, linguistics and practice, cybernetics and moral-ethical considerations, and computer chess programs. (Author/JY)

  2. Launching genomics into the cloud: deployment of Mercury, a next generation sequence analysis pipeline.

    PubMed

    Reid, Jeffrey G; Carroll, Andrew; Veeraraghavan, Narayanan; Dahdouli, Mahmoud; Sundquist, Andreas; English, Adam; Bainbridge, Matthew; White, Simon; Salerno, William; Buhay, Christian; Yu, Fuli; Muzny, Donna; Daly, Richard; Duyk, Geoff; Gibbs, Richard A; Boerwinkle, Eric

    2014-01-29

    Massively parallel DNA sequencing generates staggering amounts of data. Decreasing cost, increasing throughput, and improved annotation have expanded the diversity of genomics applications in research and clinical practice. This expanding scale creates analytical challenges: accommodating peak compute demand, coordinating secure access for multiple analysts, and sharing validated tools and results. To address these challenges, we have developed the Mercury analysis pipeline and deployed it in local hardware and the Amazon Web Services cloud via the DNAnexus platform. Mercury is an automated, flexible, and extensible analysis workflow that provides accurate and reproducible genomic results at scales ranging from individuals to large cohorts. By taking advantage of cloud computing and with Mercury implemented on the DNAnexus platform, we have demonstrated a powerful combination of a robust and fully validated software pipeline and a scalable computational resource that, to date, we have applied to more than 10,000 whole genome and whole exome samples.

  3. [The application of the computer technologies for the mathematical simulation of the ethmoidal labyrinth].

    PubMed

    Markeeva, M V; Mareev, O V; Nikolenko, V N; Mareev, G O; Danilova, T V; Fadeeva, E A; Fedorov, R V

    The objective of the present work was to study the relationship between the dimensions of the ethmoidal labyrinth and the skull in the subjects differing in the nose shape by means of the factorial and correlation analysis with the application of the modern computer-assisted methods for the three-dimensional reconstruction of the skull. We developed an original method for computed craniometry with the use the original program that made it possible to determine the standard intravital craniometrics characteristics of the human skull with a high degree of accuracy based on the results of analysis of 200 computed tomograms of the head. It was shown that the length of the inferior turbinated bones and the posterior edge of the orbital plate is of special relevance for practically all parameters of the ethmoidal labyrinth. Also, the width of the choanae positively relates to the height of the ethmoidal labyrinth.

  4. Image analysis and modeling in medical image computing. Recent developments and advances.

    PubMed

    Handels, H; Deserno, T M; Meinzer, H-P; Tolxdorff, T

    2012-01-01

    Medical image computing is of growing importance in medical diagnostics and image-guided therapy. Nowadays, image analysis systems integrating advanced image computing methods are used in practice e.g. to extract quantitative image parameters or to support the surgeon during a navigated intervention. However, the grade of automation, accuracy, reproducibility and robustness of medical image computing methods has to be increased to meet the requirements in clinical routine. In the focus theme, recent developments and advances in the field of modeling and model-based image analysis are described. The introduction of models in the image analysis process enables improvements of image analysis algorithms in terms of automation, accuracy, reproducibility and robustness. Furthermore, model-based image computing techniques open up new perspectives for prediction of organ changes and risk analysis of patients. Selected contributions are assembled to present latest advances in the field. The authors were invited to present their recent work and results based on their outstanding contributions to the Conference on Medical Image Computing BVM 2011 held at the University of Lübeck, Germany. All manuscripts had to pass a comprehensive peer review. Modeling approaches and model-based image analysis methods showing new trends and perspectives in model-based medical image computing are described. Complex models are used in different medical applications and medical images like radiographic images, dual-energy CT images, MR images, diffusion tensor images as well as microscopic images are analyzed. The applications emphasize the high potential and the wide application range of these methods. The use of model-based image analysis methods can improve segmentation quality as well as the accuracy and reproducibility of quantitative image analysis. Furthermore, image-based models enable new insights and can lead to a deeper understanding of complex dynamic mechanisms in the human body. Hence, model-based image computing methods are important tools to improve medical diagnostics and patient treatment in future.

  5. Systematic Analysis of the Decision Rules of Traditional Chinese Medicine

    PubMed Central

    Bin-Rong, Ma; Xi-Yuan, Jiang; Su-Ming, Liso; Huai-ning, Zhu; Xiu-ru, Lin

    1981-01-01

    Chinese traditional medicine has evolved over many centuries, and has accumulated a body of observed relationships between symptoms, signs and prognoses, and the efficacy of alternative treatments and prescriptions. With the assistance of a computer-based clinical data base for recording the diagnostic and therapeutic practice of skilled practitioners of Chinese traditional medicine, a systematic program is being conducted to identify and define the clinical decision-making rules that underlie current practice.

  6. Analysis of half diallel mating designs I: a practical analysis procedure for ANOVA approximation.

    Treesearch

    G.R. Johnson; J.N. King

    1998-01-01

    Procedures to analyze half-diallel mating designs using the SAS statistical package are presented. The procedure requires two runs of PROC VARCOMP and results in estimates of additive and non-additive genetic variation. The procedures described can be modified to work on most statistical software packages which can compute variance component estimates. The...

  7. Spatial-temporal discriminant analysis for ERP-based brain-computer interface.

    PubMed

    Zhang, Yu; Zhou, Guoxu; Zhao, Qibin; Jin, Jing; Wang, Xingyu; Cichocki, Andrzej

    2013-03-01

    Linear discriminant analysis (LDA) has been widely adopted to classify event-related potential (ERP) in brain-computer interface (BCI). Good classification performance of the ERP-based BCI usually requires sufficient data recordings for effective training of the LDA classifier, and hence a long system calibration time which however may depress the system practicability and cause the users resistance to the BCI system. In this study, we introduce a spatial-temporal discriminant analysis (STDA) to ERP classification. As a multiway extension of the LDA, the STDA method tries to maximize the discriminant information between target and nontarget classes through finding two projection matrices from spatial and temporal dimensions collaboratively, which reduces effectively the feature dimensionality in the discriminant analysis, and hence decreases significantly the number of required training samples. The proposed STDA method was validated with dataset II of the BCI Competition III and dataset recorded from our own experiments, and compared to the state-of-the-art algorithms for ERP classification. Online experiments were additionally implemented for the validation. The superior classification performance in using few training samples shows that the STDA is effective to reduce the system calibration time and improve the classification accuracy, thereby enhancing the practicability of ERP-based BCI.
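
    For context, the conventional LDA baseline that STDA extends can be sketched on synthetic ERP-like data; STDA itself instead learns separate spatial and temporal projections to reduce feature dimensionality and the number of training samples required. This is not the authors' implementation.

```python
# Conventional LDA baseline for ERP classification on synthetic data (not the
# STDA algorithm): channel-by-time epochs are flattened into feature vectors
# and classified with shrinkage LDA.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(42)
n_trials, n_channels, n_samples = 200, 16, 50

X = rng.normal(size=(n_trials, n_channels, n_samples))
y = rng.integers(0, 2, size=n_trials)            # 1 = target, 0 = nontarget
X[y == 1, :, 20:30] += 0.5                       # crude P300-like deflection

X_flat = X.reshape(n_trials, -1)                 # 800-dimensional features
clf = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto")
clf.fit(X_flat[:150], y[:150])
print("held-out accuracy:", clf.score(X_flat[150:], y[150:]))
```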

  8. [AERA. Dream machines and computing practices at the Mathematical Center].

    PubMed

    Alberts, Gerard; De Beer, Huub T

    2008-01-01

    Dream machines may be just as effective as the ones materialised. Their symbolic thrust can be quite powerful. The Amsterdam 'Mathematisch Centrum' (Mathematical Center), founded February 11, 1946, created a Computing Department in an effort to realise its goal of serving society. When Aad van Wijngaarden was appointed as head of the Computing Department, however, he claimed space for scientific research and computer construction, next to computing as a service. Still, the computing service following the five stage style of Hartree's numerical analysis remained a dominant characteristic of the work of the Computing Department. The high level of ambition held by Aad van Wijngaarden led to ever renewed projections of big automatic computers, symbolised by the never-built AERA. Even a machine that was actually constructed, the ARRA which followed A.D. Booth's design of the ARC, never made it into real operation. It did serve Van Wijngaarden to bluff his way into the computer age by midsummer 1952. Not until January 1954 did the computing department have a working stored program computer, which for reasons of policy went under the same name: ARRA. After just one other machine, the ARMAC, had been produced, a separate company, Electrologica, was set up for the manufacture of computers, which produced the rather successful X1 computer. The combination of ambition and absence of a working machine led to a high level of work on programming, way beyond the usual ideas of libraries of subroutines. Edsger W. Dijkstra in particular led the way to an emphasis on the duties of the programmer within the pattern of numerical analysis. Programs generating programs, known elsewhere as autocoding systems, were at the 'Mathematisch Centrum' called 'superprograms'. Practical examples were usually called a 'complex', in Dutch, where in English one might say 'system'. Historically, this is where software begins. Dekker's matrix complex, Dijkstra's interrupt system, Dijkstra and Zonneveld's ALGOL compiler--which for housekeeping contained 'the complex'--were actual examples of such super programs. In 1960 this compiler gave the Mathematical Center a leading edge in the early development of software.

  9. A practice course to cultivate students' comprehensive ability of photoelectricity

    NASA Astrophysics Data System (ADS)

    Lv, Yong; Liu, Yang; Niu, Chunhui; Liu, Lishuang

    2017-08-01

    After studying many theoretical courses, it is important and urgent for students in the specialty of optoelectronic information science and engineering to cultivate their comprehensive ability in photoelectricity. We set up a comprehensive practice course named "Integrated Design of Optoelectronic Information System" (IDOIS) so that students can integrate their knowledge of optics, electronics and computer programming to design, install and debug an optoelectronic system with independent functions. Eight years of practice show that this course can train students' abilities in the analysis, design/development and debugging of photoelectric systems, and improve their abilities in document retrieval, design proposal and summary report writing, teamwork, innovation consciousness and skill.

  10. Bayesian Latent Class Analysis Tutorial.

    PubMed

    Li, Yuelin; Lord-Bessen, Jennifer; Shiyko, Mariya; Loeb, Rebecca

    2018-01-01

    This article is a how-to guide on Bayesian computation using Gibbs sampling, demonstrated in the context of Latent Class Analysis (LCA). It is written for students in quantitative psychology or related fields who have a working knowledge of Bayes Theorem and conditional probability and have experience in writing computer programs in the statistical language R. The overall goals are to provide an accessible and self-contained tutorial, along with a practical computation tool. We begin with how Bayesian computation is typically described in academic articles. Technical difficulties are addressed by a hypothetical, worked-out example. We show how Bayesian computation can be broken down into a series of simpler calculations, which can then be assembled together to complete a computationally more complex model. The details are described much more explicitly than what is typically available in elementary introductions to Bayesian modeling so that readers are not overwhelmed by the mathematics. Moreover, the provided computer program shows how Bayesian LCA can be implemented with relative ease. The computer program is then applied in a large, real-world data set and explained line-by-line. We outline the general steps in how to extend these considerations to other methodological applications. We conclude with suggestions for further readings.
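
    A minimal, self-contained sketch of the approach described above (written in Python rather than the article's R, and not the article's program) alternates three Gibbs steps for a two-class latent class model with binary items: sample class memberships, class proportions, and item endorsement probabilities. All data are synthetic.

```python
# Gibbs sampler sketch for a two-class latent class model with binary items.
import numpy as np

rng = np.random.default_rng(1)
N, J, K = 300, 5, 2

# Synthetic data from a known two-class model
true_theta = np.array([[0.9, 0.8, 0.85, 0.2, 0.1],
                       [0.2, 0.1, 0.15, 0.8, 0.9]])
true_z = rng.integers(0, K, size=N)
Y = rng.binomial(1, true_theta[true_z])          # (N, J) binary responses

pi = np.full(K, 1.0 / K)                         # class proportions
theta = rng.uniform(0.3, 0.7, size=(K, J))       # item endorsement probs

for it in range(2000):
    # 1. Class memberships given pi and theta
    loglik = (Y[:, None, :] * np.log(theta) +
              (1 - Y[:, None, :]) * np.log(1 - theta)).sum(axis=2)
    logpost = np.log(pi) + loglik                # (N, K)
    prob = np.exp(logpost - logpost.max(axis=1, keepdims=True))
    prob /= prob.sum(axis=1, keepdims=True)
    z = np.array([rng.choice(K, p=p) for p in prob])

    # 2. Class proportions given memberships (Dirichlet(1, ..., 1) prior)
    counts = np.bincount(z, minlength=K)
    pi = rng.dirichlet(1.0 + counts)

    # 3. Item probabilities given memberships (Beta(1, 1) prior)
    for k in range(K):
        yk = Y[z == k]
        theta[k] = rng.beta(1.0 + yk.sum(axis=0),
                            1.0 + (len(yk) - yk.sum(axis=0)))

print("estimated class proportions:", np.round(pi, 2))
print("estimated item probabilities:\n", np.round(theta, 2))
```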

  11. Reviewing and Viewing.

    ERIC Educational Resources Information Center

    Clements, Douglas H., Ed.; And Others

    1988-01-01

    Presents reviews of three software packages. Includes "Cube Builder: A 3-D Geometry Tool," which allows students to build three-dimensional shapes; "Number Master," a multipurpose practice program for whole number computation; and "Safari Search: Problem Solving and Inference," which focuses on decision making in mathematical analysis. (PK)

  12. Medical students’ attitudes and perspectives regarding novel computer-based practical spot tests compared to traditional practical spot tests

    PubMed Central

    Wijerathne, Buddhika; Rathnayake, Geetha

    2013-01-01

    Background Most universities currently practice traditional practical spot tests to evaluate students. However, traditional methods have several disadvantages. Computer-based examination techniques are becoming more popular among medical educators worldwide. Therefore incorporating the computer interface in practical spot testing is a novel concept that may minimize the shortcomings of traditional methods. Assessing students’ attitudes and perspectives is vital in understanding how students perceive the novel method. Methods One hundred and sixty medical students were randomly allocated to either a computer-based spot test (n=80) or a traditional spot test (n=80). The students rated their attitudes and perspectives regarding the spot test method soon after the test. The results were described comparatively. Results Students had higher positive attitudes towards the computer-based practical spot test compared to the traditional spot test. Their recommendations to introduce the novel practical spot test method for future exams and to other universities were statistically significantly higher. Conclusions The computer-based practical spot test is viewed as more acceptable to students than the traditional spot test. PMID:26451213

  13. Adaptation and validation of the Evidence-Based Practice Belief and Implementation scales for French-speaking Swiss nurses and allied healthcare providers.

    PubMed

    Verloo, Henk; Desmedt, Mario; Morin, Diane

    2017-09-01

    To evaluate two psychometric properties of the French versions of the Evidence-Based Practice Beliefs and Evidence-Based Practice Implementation scales, namely their internal consistency and construct validity. The Evidence-Based Practice Beliefs and Evidence-Based Practice Implementation scales developed by Melnyk et al. are recognised as valid, reliable instruments in English. However, no psychometric validation for their French versions existed. Secondary analysis of a cross-sectional survey. Source data came from a cross-sectional descriptive study sample of 382 nurses and other allied healthcare providers. Cronbach's alpha was used to evaluate internal consistency, and principal axis factor analysis and varimax rotation were computed to determine construct validity. The French Evidence-Based Practice Beliefs and Evidence-Based Practice Implementation scales showed excellent reliability, with Cronbach's alphas close to the scores established by Melnyk et al.'s original versions. Principal axis factor analysis showed medium-to-high factor loading scores without obtaining collinearity. Principal axis factor analysis with varimax rotation of the 16-item Evidence-Based Practice Beliefs scale resulted in a four-factor loading structure. Principal axis factor analysis with varimax rotation of the 17-item Evidence-Based Practice Implementation scale revealed a two-factor loading structure. Further research should attempt to understand why the French Evidence-Based Practice Implementation scale showed a two-factor loading structure but Melnyk et al.'s original has only one. The French versions of the Evidence-Based Practice Beliefs and Evidence-Based Practice Implementation scales can both be considered valid and reliable instruments for measuring Evidence-Based Practice beliefs and implementation. The results suggest that the French Evidence-Based Practice Beliefs and Evidence-Based Practice Implementation scales are valid and reliable and can therefore be used to evaluate the effectiveness of organisational strategies aimed at increasing professionals' confidence in Evidence-Based Practice, supporting its use and implementation. © 2017 John Wiley & Sons Ltd.
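
    The internal-consistency statistic reported here, Cronbach's alpha, can be computed directly from the item scores; a short sketch on simulated Likert-type data (not the study's data) follows.

```python
# Cronbach's alpha on simulated Likert-type items (illustrative data):
# alpha = k/(k-1) * (1 - sum(item variances) / variance of the total score).
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, k_items) array of item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_var / total_var)

rng = np.random.default_rng(7)
latent = rng.normal(size=(382, 1))                   # shared trait
scores = np.clip(np.round(3 + latent + rng.normal(scale=0.8, size=(382, 16))),
                 1, 5)                               # 16 five-point items
print("Cronbach's alpha:", round(cronbach_alpha(scores), 2))
```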

  14. Layered Architectures for Quantum Computers and Quantum Repeaters

    NASA Astrophysics Data System (ADS)

    Jones, Nathan C.

    This chapter examines how to organize quantum computers and repeaters using a systematic framework known as layered architecture, where machine control is organized in layers associated with specialized tasks. The framework is flexible and could be used for analysis and comparison of quantum information systems. To demonstrate the design principles in practice, we develop architectures for quantum computers and quantum repeaters based on optically controlled quantum dots, showing how a myriad of technologies must operate synchronously to achieve fault-tolerance. Optical control makes information processing in this system very fast, scalable to large problem sizes, and extendable to quantum communication.

  15. Editorial

    NASA Astrophysics Data System (ADS)

    Liu, Shuai

    Fractal represents a special feature of nature and functional objects. However, fractal based computing can be applied to many research domains because of its fixed property of resisting deformation, variable parameters and many unpredictable changes. Theoretical research and practical application of fractal based computing have been hotspots for 30 years and will continue to be. There are many pending issues awaiting solutions in this domain; thus this thematic issue, containing 14 papers, publishes state-of-the-art developments in the theory and application of fractal based computing, including mathematical analysis and novel engineering applications. The topics include fractal and multifractal features in applications and in the solution of nonlinear ODEs and equations.

  16. ELM Meets Urban Big Data Analysis: Case Studies

    PubMed Central

    Chen, Huajun; Chen, Jiaoyan

    2016-01-01

    In recent years, the rapid progress of urban computing has engendered big issues that create both opportunities and challenges. The heterogeneity and large volume of data, and the big difference between physical and virtual worlds, have made it difficult to solve practical problems in urban computing quickly. In this paper, we propose a general application framework of ELM for urban computing. We present several real case studies of the framework, such as smog-related health hazard prediction and optimal retail store placement. Experiments involving urban data in China show the efficiency, accuracy, and flexibility of our proposed framework. PMID:27656203
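
    A minimal extreme learning machine (ELM), the learner underlying the proposed framework, can be sketched in a few lines: random hidden-layer weights are fixed and only the output weights are fit by regularized least squares. The data and hyperparameters below are illustrative, not the paper's.

```python
# Minimal ELM sketch: fixed random hidden layer, ridge-regularized
# least-squares fit of the output weights. Illustrative data only.
import numpy as np

class TinyELM:
    def __init__(self, n_hidden=100, reg=1e-3, seed=0):
        self.n_hidden, self.reg = n_hidden, reg
        self.rng = np.random.default_rng(seed)

    def _hidden(self, X):
        return np.tanh(X @ self.W + self.b)

    def fit(self, X, y):
        d = X.shape[1]
        self.W = self.rng.normal(size=(d, self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = self._hidden(X)
        A = H.T @ H + self.reg * np.eye(self.n_hidden)
        self.beta = np.linalg.solve(A, H.T @ y)
        return self

    def predict(self, X):
        return self._hidden(X) @ self.beta

rng = np.random.default_rng(3)
X = rng.uniform(-1, 1, size=(500, 4))
y = np.sin(X[:, 0]) + X[:, 1] ** 2 + 0.05 * rng.normal(size=500)
model = TinyELM().fit(X[:400], y[:400])
mse = np.mean((model.predict(X[400:]) - y[400:]) ** 2)
print("test MSE:", round(float(mse), 4))
```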

  17. An Analysis of Computer Aided Design (CAD) Packages Used at MSFC for the Recent Initiative to Integrate Engineering Activities

    NASA Technical Reports Server (NTRS)

    Smith, Leigh M.; Parker, Nelson C. (Technical Monitor)

    2002-01-01

    This paper analyzes the use of Computer Aided Design (CAD) packages at NASA's Marshall Space Flight Center (MSFC). It examines the effectiveness of recent efforts to standardize CAD practices across MSFC engineering activities. An assessment of the roles played by management, designers, analysts, and manufacturers in this initiative will be explored. Finally, solutions are presented for better integration of CAD across MSFC in the future.

  18. Hybrid computational and experimental approach for the study and optimization of mechanical components

    NASA Astrophysics Data System (ADS)

    Furlong, Cosme; Pryputniewicz, Ryszard J.

    1998-05-01

    Increased demands on the performance and efficiency of mechanical components impose challenges on their engineering design and optimization, especially when new and more demanding applications must be developed in relatively short periods of time while satisfying design objectives, as well as cost and manufacturability. In addition, reliability and durability must be taken into consideration. As a consequence, effective quantitative methodologies, computational and experimental, should be applied in the study and optimization of mechanical components. Computational investigations enable parametric studies and the determination of critical engineering design conditions, while experimental investigations, especially those using optical techniques, provide qualitative and quantitative information on the actual response of the structure of interest to the applied load and boundary conditions. We discuss a hybrid experimental and computational approach for investigation and optimization of mechanical components. The approach is based on analytical, computational, and experimental resolution methodologies in the form of computational modeling, noninvasive optical techniques, and fringe prediction analysis tools. Practical application of the hybrid approach is illustrated with representative examples that demonstrate the viability of the approach as an effective engineering tool for analysis and optimization.

  19. Using Microsoft PowerPoint as an Astronomical Image Analysis Tool

    NASA Astrophysics Data System (ADS)

    Beck-Winchatz, Bernhard

    2006-12-01

    Engaging students in the analysis of authentic scientific data is an effective way to teach them about the scientific process and to develop their problem solving, teamwork and communication skills. In astronomy several image processing and analysis software tools have been developed for use in school environments. However, the practical implementation in the classroom is often difficult because the teachers may not have the comfort level with computers necessary to install and use these tools, they may not have adequate computer privileges and/or support, and they may not have the time to learn how to use specialized astronomy software. To address this problem, we have developed a set of activities in which students analyze astronomical images using basic tools provided in PowerPoint. These include measuring sizes, distances, and angles, and blinking images. In contrast to specialized software, PowerPoint is broadly available on school computers. Many teachers are already familiar with PowerPoint, and the skills developed while learning how to analyze astronomical images are highly transferable. We will discuss several practical examples of measurements, including the following:
    - Variations in the distances to the sun and moon from their angular sizes
    - Magnetic declination from images of shadows
    - Diameter of the moon from lunar eclipse images
    - Sizes of lunar craters
    - Orbital radii of the Jovian moons and mass of Jupiter
    - Supernova and comet searches
    - Expansion rate of the universe from images of distant galaxies
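
    The first measurement in the list above rests on the small-angle relation, physical diameter = distance x angular size (in radians), so relative distances follow directly from measured angular sizes. A worked example with illustrative angular sizes (not classroom data):

```python
# Small-angle estimate of the Moon's distance from its measured angular size.
# The angular sizes below are illustrative, not real classroom measurements.
import math

MOON_DIAMETER_KM = 3474.8

def distance_from_angular_size(diameter_km, angular_size_deg):
    return diameter_km / math.radians(angular_size_deg)

# Apparent angular size measured from two images (e.g., near perigee vs. apogee)
for label, ang_deg in [("perigee image", 0.558), ("apogee image", 0.491)]:
    d = distance_from_angular_size(MOON_DIAMETER_KM, ang_deg)
    print(f"{label}: angular size {ang_deg} deg -> distance ~{d:,.0f} km")
```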

  20. Comparison of the Effects of Computer-Based Practice and Conceptual Understanding Interventions on Mathematics Fact Retention and Generalization

    ERIC Educational Resources Information Center

    Kanive, Rebecca; Nelson, Peter M.; Burns, Matthew K.; Ysseldyke, James

    2014-01-01

    The authors' purpose was to determine the effects of computer-based practice and conceptual interventions on computational fluency and word-problem solving of fourth- and fifth-grade students with mathematics difficulties. A randomized pretest-posttest control group design found that students assigned to the computer-based practice intervention…

  1. A self-analysis of the NASA-TLX workload measure.

    PubMed

    Noyes, Jan M; Bruneau, Daniel P J

    2007-04-01

    Computer use and, more specifically, the administration of tests and materials online continue to proliferate. A number of subjective, self-report workload measures exist, but the National Aeronautics and Space Administration-Task Load Index (NASA-TLX) is probably the most well known and used. The aim of this paper is to consider the workload costs associated with the computer-based and paper versions of the NASA-TLX measure. It was found that there is a significant difference between the workload scores for the two media, with the computer version of the NASA-TLX incurring more workload. This has implications for the practical use of the NASA-TLX as well as for other computer-based workload measures.
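
    For readers unfamiliar with the instrument, the overall NASA-TLX score is conventionally a weighted combination of six subscale ratings (0-100), with weights taken from 15 pairwise comparisons of the subscales. A short sketch with illustrative numbers:

```python
# Weighted NASA-TLX overall workload: each subscale's weight is the number of
# times it was chosen in the 15 pairwise comparisons (weights sum to 15).
# Ratings and tallies below are illustrative, not data from this study.
SUBSCALES = ["Mental", "Physical", "Temporal", "Performance", "Effort",
             "Frustration"]

ratings = {"Mental": 70, "Physical": 20, "Temporal": 55,
           "Performance": 40, "Effort": 65, "Frustration": 35}
tallies = {"Mental": 5, "Physical": 0, "Temporal": 3,
           "Performance": 2, "Effort": 4, "Frustration": 1}   # sums to 15

assert sum(tallies.values()) == 15
overall = sum(ratings[s] * tallies[s] for s in SUBSCALES) / 15.0
print(f"weighted overall workload: {overall:.1f} / 100")
```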

  2. Digital image processing for information extraction.

    NASA Technical Reports Server (NTRS)

    Billingsley, F. C.

    1973-01-01

    The modern digital computer has made practical image processing techniques for handling nonlinear operations in both the geometrical and the intensity domains, various types of nonuniform noise cleanup, and the numerical analysis of pictures. An initial requirement is that a number of anomalies caused by the camera (e.g., geometric distortion, MTF roll-off, vignetting, and nonuniform intensity response) must be taken into account or removed to avoid their interference with the information extraction process. Examples illustrating these operations are discussed along with computer techniques used to emphasize details, perform analyses, classify materials by multivariate analysis, detect temporal differences, and aid in human interpretation of photos.

  3. Fast linear feature detection using multiple directional non-maximum suppression.

    PubMed

    Sun, C; Vallotton, P

    2009-05-01

    The capacity to detect linear features is central to image analysis, computer vision and pattern recognition and has practical applications in areas such as neurite outgrowth detection, retinal vessel extraction, skin hair removal, plant root analysis and road detection. Linear feature detection often represents the starting point for image segmentation and image interpretation. In this paper, we present a new algorithm for linear feature detection using multiple directional non-maximum suppression with symmetry checking and gap linking. Given its low computational complexity, the algorithm is very fast. We show in several examples that it performs very well in terms of both sensitivity and continuity of detected linear features.
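
    A simplified sketch of the general idea (not the authors' exact algorithm) follows: the image is filtered with oriented line kernels, and a pixel is retained only if its best response is a local maximum across the feature, i.e., along the direction perpendicular to the best-responding orientation.

```python
# Simplified directional non-maximum suppression for line detection.
import numpy as np
from scipy.ndimage import correlate

def oriented_kernel(angle_deg, length=7):
    """Average-along-a-line kernel minus its mean (crude oriented line filter)."""
    k = np.zeros((length, length))
    c = length // 2
    t = np.arange(-c, c + 1)
    a = np.deg2rad(angle_deg)
    rows = np.clip(np.round(c - t * np.sin(a)).astype(int), 0, length - 1)
    cols = np.clip(np.round(c + t * np.cos(a)).astype(int), 0, length - 1)
    k[rows, cols] = 1.0 / length
    return k - k.mean()

def detect_lines(img, angles=(0, 45, 90, 135)):
    responses = np.stack([correlate(img, oriented_kernel(a)) for a in angles])
    best = responses.argmax(axis=0)
    resp = responses.max(axis=0)
    perp = {0: (1, 0), 45: (1, 1), 90: (0, 1), 135: (1, -1)}  # (row, col) offsets
    keep = np.zeros_like(resp, dtype=bool)
    for i, a in enumerate(angles):
        dr, dc = perp[a]
        left = np.roll(resp, (dr, dc), axis=(0, 1))
        right = np.roll(resp, (-dr, -dc), axis=(0, 1))
        keep |= (best == i) & (resp >= left) & (resp >= right) & (resp > 0)
    return keep

img = np.zeros((64, 64))
img[32, 10:54] = 1.0                      # a horizontal line
img += 0.05 * np.random.default_rng(0).normal(size=img.shape)
mask = detect_lines(img)
print("detected pixels on the line row:", int(mask[32].sum()))
```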

  4. Tools and techniques for computational reproducibility.

    PubMed

    Piccolo, Stephen R; Frampton, Michael B

    2016-07-11

    When reporting research findings, scientists document the steps they followed so that others can verify and build upon the research. When those steps have been described in sufficient detail that others can retrace the steps and obtain similar results, the research is said to be reproducible. Computers play a vital role in many research disciplines and present both opportunities and challenges for reproducibility. Computers can be programmed to execute analysis tasks, and those programs can be repeated and shared with others. The deterministic nature of most computer programs means that the same analysis tasks, applied to the same data, will often produce the same outputs. However, in practice, computational findings often cannot be reproduced because of complexities in how software is packaged, installed, and executed, and because of limitations associated with how scientists document analysis steps. Many tools and techniques are available to help overcome these challenges; here we describe seven such strategies. With a broad scientific audience in mind, we describe the strengths and limitations of each approach, as well as the circumstances under which each might be applied. No single strategy is sufficient for every scenario; thus we emphasize that it is often useful to combine approaches.

  5. RELIABLE COMPUTATION OF HOMOGENEOUS AZEOTROPES. (R824731)

    EPA Science Inventory

    Abstract

    It is important to determine the existence and composition of homogeneous azeotropes in the analysis of phase behavior and in the synthesis and design of separation systems, from both theoretical and practical standpoints. A new method for reliably locating an...

  6. Technology Education Practical Activities for Elementary School Teachers.

    ERIC Educational Resources Information Center

    Pedras, Melvin J.; Braukmann, Jim

    This report contains four learning modules designed to support a range of objectives that include increasing technological literacy, and improving written and verbal communication skills, psychomotor skills, computational skills, geometry, analysis, problem solving, and other critical thinking skills. The activities described in each module…

  7. Eye-gaze control of the computer interface: Discrimination of zoom intent

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldberg, J.H.; Schryver, J.C.

    1993-10-01

    An analysis methodology and associated experiment were developed to assess whether definable and repeatable signatures of eye-gaze characteristics are evident, preceding a decision to zoom-in, zoom-out, or not to zoom at a computer interface. This user intent discrimination procedure can have broad application in disability aids and telerobotic control. Eye-gaze was collected from 10 subjects in a controlled experiment, requiring zoom decisions. The eye-gaze data were clustered, then fed into a multiple discriminant analysis (MDA) for optimal definition of heuristics separating the zoom-in, zoom-out, and no-zoom conditions. Confusion matrix analyses showed that a number of variable combinations classified at a statistically significant level, but practical significance was more difficult to establish. Composite contour plots demonstrated the regions in parameter space consistently assigned by the MDA to unique zoom conditions. Peak classification occurred at about 1200-1600 msec. Improvements in the methodology to achieve practical real-time zoom control are considered.

  8. Computational Fluid Dynamic Analysis of Hydrodynamic forces on inundated bridge decks

    NASA Astrophysics Data System (ADS)

    Afzal, Bushra; Guo, Junke; Kerenyi, Kornel

    2010-11-01

    The hydraulic forces experienced by an inundated bridge deck have great importance in the design of bridges. Flood flows or hurricanes add significant hydrodynamic loading on bridges, possibly resulting in failure of the bridge superstructures. The objective of this study is to establish validated computational practices that address the research needs of the transportation community via computational fluid dynamic simulations, building on the reduced-scale experiments conducted at the Turner-Fairbank Highway Research Center. Three bridge deck prototypes were used: a typical six-girder highway bridge deck, a three-girder deck, and a streamlined deck designed to better withstand the hydraulic forces. Results of the study showed that the streamlined deck significantly reduces the drag, lift, and moment coefficients in comparison to the other bridge deck types. The CFD results matched the experimental data in terms of the relationship between inundation ratio and force measured at the bridge. The results of the present research will provide a tool for designing new bridges and retrofitting old ones.

  9. Computer literacy in nursing education. An overview.

    PubMed

    Newbern, V B

    1985-09-01

    Nursing educators are beginning to realize that computer literacy has become a survival skill for the profession. They understand that literacy must be at a level that assures the ability to manage and control the flood of available information and provides an openness and awareness of future technologic possibilities. The computer has been on college campuses for a number of years, used primarily for record storage and retrieval. However, early on a few nurse educators saw the potential for its use as a practice tool. Out of this foresight came both formal and nonformal educational offerings. The evolution of formal coursework in computer literacy has moved from learning about the computer to learning with the computer. Today the use of the computer is expanding geometrically as microcomputers become common. Graduate students and faculty use them for literature searches and data analysis. Undergraduates are routinely using computer-assisted instruction. Coursework in computer technology is fast becoming a given for nursing students and computer competency a requisite for faculty. However, inculcating computer competency in faculty and student repertoires is not an easy task. There are problems related to motivation, resources, and control. Territorial disputes between schools and colleges must be arbitrated. The interface with practice must be addressed. The paucity of adequate software is a real concern. But the potential is enormous, probably restricted only by human creativity. The possibilities for teaching and learning are profound, especially if geographical constraints can be effaced and scarce resources can be shared at minimal cost. Extremely sophisticated research designs and evaluation methodologies can be used routinely.(ABSTRACT TRUNCATED AT 250 WORDS)

  10. Scalable Algorithms for Clustering Large Geospatiotemporal Data Sets on Manycore Architectures

    NASA Astrophysics Data System (ADS)

    Mills, R. T.; Hoffman, F. M.; Kumar, J.; Sreepathi, S.; Sripathi, V.

    2016-12-01

    The increasing availability of high-resolution geospatiotemporal data sets from sources such as observatory networks, remote sensing platforms, and computational Earth system models has opened new possibilities for knowledge discovery using data sets fused from disparate sources. Traditional algorithms and computing platforms are impractical for the analysis and synthesis of data sets of this size; however, new algorithmic approaches that can effectively utilize the complex memory hierarchies and the extremely high levels of available parallelism in state-of-the-art high-performance computing platforms can enable such analysis. We describe a massively parallel implementation of accelerated k-means clustering and some optimizations to boost computational intensity and utilization of wide SIMD lanes on state-of-the art multi- and manycore processors, including the second-generation Intel Xeon Phi ("Knights Landing") processor based on the Intel Many Integrated Core (MIC) architecture, which includes several new features, including an on-package high-bandwidth memory. We also analyze the code in the context of a few practical applications to the analysis of climatic and remotely-sensed vegetation phenology data sets, and speculate on some of the new applications that such scalable analysis methods may enable.
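
    The two computational kernels that such scalable, accelerated k-means implementations optimize, the distance/assignment step and the centroid update, are shown in the serial NumPy sketch below; it is for illustration only and is unrelated to the authors' manycore code.

```python
# Compact, vectorized k-means sketch (serial, illustrative data).
import numpy as np

def kmeans(X, k, n_iter=50, seed=0):
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iter):
        # Assignment step: squared Euclidean distances to every centroid
        d2 = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
        labels = d2.argmin(axis=1)
        # Update step: mean of the points assigned to each cluster
        for j in range(k):
            members = X[labels == j]
            if len(members):
                centroids[j] = members.mean(axis=0)
    return labels, centroids

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(loc=m, scale=0.3, size=(200, 2))
               for m in ([0, 0], [3, 3], [0, 3])])
labels, centroids = kmeans(X, k=3)
print("cluster sizes:", np.bincount(labels))
```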

  11. A new framework for comprehensive, robust, and efficient global sensitivity analysis: 1. Theory

    NASA Astrophysics Data System (ADS)

    Razavi, Saman; Gupta, Hoshin V.

    2016-01-01

    Computer simulation models are continually growing in complexity with increasingly more factors to be identified. Sensitivity Analysis (SA) provides an essential means for understanding the role and importance of these factors in producing model responses. However, conventional approaches to SA suffer from (1) an ambiguous characterization of sensitivity, and (2) poor computational efficiency, particularly as the problem dimension grows. Here, we present a new and general sensitivity analysis framework (called VARS), based on an analogy to "variogram analysis," that provides an intuitive and comprehensive characterization of sensitivity across the full spectrum of scales in the factor space. We prove, theoretically, that Morris (derivative-based) and Sobol (variance-based) methods and their extensions are special cases of VARS, and that their SA indices can be computed as by-products of the VARS framework. Synthetic functions that resemble actual model response surfaces are used to illustrate the concepts, and show VARS to be as much as two orders of magnitude more computationally efficient than the state-of-the-art Sobol approach. In a companion paper, we propose a practical implementation strategy, and demonstrate the effectiveness, efficiency, and reliability (robustness) of the VARS framework on real-data case studies.
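
    The core idea can be illustrated (this is not the authors' VARS implementation) by estimating a directional variogram of a toy model response along each factor, gamma_i(h) = 0.5 E[(f(x + h e_i) - f(x))^2]; its magnitude across perturbation scales h reflects that factor's importance.

```python
# Directional variogram of a toy response surface as a sensitivity measure.
# The model and perturbation scales are illustrative only.
import numpy as np

def model(x):                       # toy response surface in [0, 1]^3
    return np.sin(6 * x[..., 0]) + 0.3 * x[..., 1] ** 2 + 0.01 * x[..., 2]

rng = np.random.default_rng(0)
base = rng.uniform(0, 1, size=(5000, 3))       # random base points
hs = np.array([0.05, 0.1, 0.2, 0.3])           # perturbation scales

for i in range(3):
    gammas = []
    for h in hs:
        shifted = base.copy()
        shifted[:, i] = np.clip(shifted[:, i] + h, 0, 1)
        gammas.append(0.5 * np.mean((model(shifted) - model(base)) ** 2))
    print(f"factor {i}: directional variogram at h={list(hs)} ->",
          np.round(gammas, 4))
```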

  12. Defining Computational Thinking for Mathematics and Science Classrooms

    NASA Astrophysics Data System (ADS)

    Weintrop, David; Beheshti, Elham; Horn, Michael; Orton, Kai; Jona, Kemi; Trouille, Laura; Wilensky, Uri

    2016-02-01

    Science and mathematics are becoming computational endeavors. This fact is reflected in the recently released Next Generation Science Standards and the decision to include "computational thinking" as a core scientific practice. With this addition, and the increased presence of computation in mathematics and scientific contexts, a new urgency has come to the challenge of defining computational thinking and providing a theoretical grounding for what form it should take in school science and mathematics classrooms. This paper presents a response to this challenge by proposing a definition of computational thinking for mathematics and science in the form of a taxonomy consisting of four main categories: data practices, modeling and simulation practices, computational problem solving practices, and systems thinking practices. In formulating this taxonomy, we draw on the existing computational thinking literature, interviews with mathematicians and scientists, and exemplary computational thinking instructional materials. This work was undertaken as part of a larger effort to infuse computational thinking into high school science and mathematics curricular materials. In this paper, we argue for the approach of embedding computational thinking in mathematics and science contexts, present the taxonomy, and discuss how we envision the taxonomy being used to bring current educational efforts in line with the increasingly computational nature of modern science and mathematics.

  13. An analysis of errors, discrepancies, and variation in opioid prescriptions for adult outpatients at a teaching hospital

    PubMed Central

    Bicket, Mark C.; Kattail, Deepa; Yaster, Myron; Wu, Christopher L.; Pronovost, Peter

    2017-01-01

    Objective: To determine opioid prescribing patterns and rate of three types of errors, discrepancies, and variation from ideal practice. Design: Retrospective review of opioid prescriptions processed at an outpatient pharmacy. Setting: Tertiary institutional medical center. Patients: We examined 510 consecutive opioid medication prescriptions for adult patients processed at an institutional outpatient pharmacy in June 2016 for patient, provider, and prescription characteristics. Main Outcome Measure(s): We analyzed prescriptions for deviation from best practice guidelines, lack of two patient identifiers, and noncompliance with Drug Enforcement Agency (DEA) rules. Results: Mean patient age (SD) was 47.5 years (17.4). The most commonly prescribed opioid was oxycodone (71%), usually not combined with acetaminophen. Practitioners prescribed tablet formulation to 92% of the sample, averaging 57 (47) pills. We identified at least one error on 42% of prescriptions. Among all prescriptions, 9% deviated from best practice guidelines, 21% failed to include two patient identifiers, and 41% were noncompliant with DEA rules. Errors occurred in 89% of handwritten prescriptions, 0% of electronic health record (EHR) computer-generated prescriptions, and 12% of non-EHR computer-generated prescriptions. Inter-rater reliability by kappa was 0.993. Conclusions: Inconsistencies in opioid prescribing remain common. Handwritten prescriptions continue to demonstrate higher associations of errors, discrepancies, and variation from ideal practice and government regulations. All computer-generated prescriptions adhered to best practice guidelines and contained two patient identifiers, and all EHR prescriptions were fully compliant with DEA rules. PMID:28345746

  14. Obtaining Content Weights for Test Specifications from Job Analysis Task Surveys: An Application of the Many-Facets Rasch Model

    ERIC Educational Resources Information Center

    Wang, Ning; Stahl, John

    2012-01-01

    This article discusses the use of the Many-Facets Rasch Model, via the FACETS computer program (Linacre, 2006a), to scale job/practice analysis survey data as well as to combine multiple rating scales into single composite weights representing the tasks' relative importance. Results from the Many-Facets Rasch Model are compared with those…

  15. Social Network Analysis of Elders' Health Literacy and their Use of Online Health Information

    PubMed Central

    Jang, Haeran

    2014-01-01

    Objectives: Utilizing social network analysis, this study aimed to analyze the main keywords in the literature regarding the health literacy of and the use of online health information by aged persons over 65. Methods: Medical Subject Heading keywords were extracted from articles on the PubMed database of the National Library of Medicine. For health literacy, 110 articles out of 361 were initially extracted. Seventy-one keywords out of 1,021 were finally selected after removing repeated keywords and applying pruning. Regarding the use of online health information, 19 articles out of 26 were selected. One hundred forty-four keywords were initially extracted. After removing the repeated keywords, 74 keywords were finally selected. Results: Health literacy was found to be strongly connected with 'Health knowledge, attitudes, practices' and 'Patient education as topic.' 'Computer literacy' had strong connections with 'Internet' and 'Attitude towards computers.' 'Computer literacy' was connected to 'Health literacy,' and was studied according to the parameters 'Attitude towards health' and 'Patient education as topic.' The use of online health information was strongly connected with 'Health knowledge, attitudes, practices,' 'Consumer health information,' 'Patient education as topic,' etc. In the network, 'Computer literacy' was connected with 'Health education,' 'Patient satisfaction,' 'Self-efficacy,' 'Attitude to computer,' etc. Conclusions: Research on older citizens' health literacy and their use of online health information was conducted together with study of computer literacy, patient education, attitude towards health, health education, patient satisfaction, etc. In particular, self-efficacy was noted as an important keyword. Further research should be conducted to identify the effective outcomes of self-efficacy in the area of interest. PMID:25152835
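    A minimal sketch of the kind of keyword co-occurrence network such a study builds, using networkx; the keyword lists below are hypothetical stand-ins rather than the article's actual MeSH data:

    ```python
    # Build a keyword co-occurrence graph (edge weight = number of articles in which
    # two keywords appear together) and rank keywords by degree centrality.
    import itertools
    import networkx as nx

    article_keywords = [
        ["Health literacy", "Health knowledge, attitudes, practices", "Patient education as topic"],
        ["Computer literacy", "Internet", "Attitude towards computers"],
        ["Computer literacy", "Health literacy", "Patient education as topic"],
    ]

    G = nx.Graph()
    for keywords in article_keywords:
        for a, b in itertools.combinations(sorted(set(keywords)), 2):
            weight = G.get_edge_data(a, b, {"weight": 0})["weight"] + 1
            G.add_edge(a, b, weight=weight)

    for kw, c in sorted(nx.degree_centrality(G).items(), key=lambda kv: -kv[1]):
        print(f"{c:.2f}  {kw}")
    ```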

  16. Glitch game testers: The design and study of a learning environment for computational production with young African American males

    NASA Astrophysics Data System (ADS)

    DiSalvo, Elizabeth Betsy

    The implementation of a learning environment for young African American males, called the Glitch Game Testers, was launched in 2009. The development of this program was based on formative work that looked at the contrasting use of digital games between young African American males and individuals who chose to become computer science majors. Through analysis of cultural values and digital game play practices, the program was designed to intertwine authentic game development practices and computer science learning. The resulting program employed 25 African American male high school students to test pre-release digital games full-time in the summer and part-time in the school year, with an hour of each day dedicated to learning introductory computer science. Outcomes for persisting in computer science education are remarkable; of the 16 participants who had graduated from high school as of 2012, 12 have gone on to school in computing-related majors. These outcomes, and the participants' enthusiasm for engaging in computing, are in sharp contrast to the crisis in African American male education and learning motivation. The research presented in this dissertation discusses the formative research that shaped the design of Glitch, the evaluation of the implementation of Glitch, and a theoretical investigation of the way in which participants navigated conflicting motivations in learning environments.

  17. The Effects of Social Environments on Time Spent Gaming: Focusing on the Effects of Communities and Neighborhoods.

    PubMed

    Lim, Tee Teng; Jung, Sun Young; Kim, Eunyi

    2018-04-01

    This study examined the impact of community and neighborhood on time spent computer gaming. Computer gaming for over 20 hours a week was set as the cutoff for "engaged use" of computer games. The analysis used data on about 1,800 subjects who participated in the Korean Children and Youth Panel Survey. The main findings are as follows: first, structural community characteristics and neighborhood social capital affected the engaged use of computer games. Second, adolescents residing in regions with a higher divorce rate or higher residential mobility were more likely to exhibit engaged use of computer games. Third, adolescents who perceived greater neighborhood social capital were less likely to exhibit engaged use of computer games. Based on these findings, practical implications and directions for further study are suggested.

  18. Internal fluid mechanics research on supercomputers for aerospace propulsion systems

    NASA Technical Reports Server (NTRS)

    Miller, Brent A.; Anderson, Bernhard H.; Szuch, John R.

    1988-01-01

    The Internal Fluid Mechanics Division of the NASA Lewis Research Center is combining the key elements of computational fluid dynamics, aerothermodynamic experiments, and advanced computational technology to bring internal computational fluid mechanics (ICFM) to a state of practical application for aerospace propulsion systems. The strategies used to achieve this goal are to: (1) pursue an understanding of flow physics, surface heat transfer, and combustion via analysis and fundamental experiments, (2) incorporate improved understanding of these phenomena into verified 3-D CFD codes, and (3) utilize state-of-the-art computational technology to enhance experimental and CFD research. Presented is an overview of the ICFM program in high-speed propulsion, including work in inlets, turbomachinery, and chemical reacting flows. Ongoing efforts to integrate new computer technologies, such as parallel computing and artificial intelligence, into high-speed aeropropulsion research are described.

  19. Preferred computer activities among individuals with dementia: a pilot study.

    PubMed

    Tak, Sunghee H; Zhang, Hongmei; Hong, Song Hee

    2015-03-01

    Computers offer new activities that are easily accessible, cognitively stimulating, and enjoyable for individuals with dementia. The current descriptive study examined preferred computer activities among nursing home residents with different severity levels of dementia. A secondary data analysis was conducted using activity observation logs from 15 study participants with dementia (severe = 115 logs, moderate = 234 logs, and mild = 124 logs) who participated in a computer activity program. Significant differences existed in preferred computer activities among groups with different severity levels of dementia. Participants with severe dementia spent significantly more time watching slide shows with music than those with both mild and moderate dementia (F [2,12] = 9.72, p = 0.003). Preference in playing games also differed significantly across the three groups. It is critical to consider individuals' interests and functional abilities when computer activities are provided for individuals with dementia. A practice guideline for tailoring computer activities is detailed. Copyright 2015, SLACK Incorporated.

  20. Interesting viewpoints to those who will put Ada into practice

    NASA Technical Reports Server (NTRS)

    Carlsson, Arne

    1986-01-01

    Ada will most probably be used as the programming language for computers in the NASA Space Station. It is reasonable to suppose that Ada will be used for at least the embedded computers, because the high software costs for these embedded computers were the reason why Ada activities were initiated about ten years ago. The on-board computers are designed for use in space applications, where maintenance by man is impossible. All manipulation of such computers has to be performed autonomously or remotely with commands from the ground. In a manned Space Station some maintenance work can be performed by service people on board, but there are still many applications which require autonomous computers, for example, vital Space Station functions and unmanned orbital transfer vehicles. Those aspects which have emerged from the analysis of Ada characteristics, together with experience of the requirements for embedded on-board computers in space applications, are examined.

  1. Connecting congregations: technology resources influence parish nurse practice.

    PubMed

    Zerull, Lisa M; Near, Kelly K; Ragon, Bart; Farrell, Sarah P

    2009-01-01

    This descriptive pilot study evaluated the influence of health resource information education and the use of Web-based communication technology on the professional practice of the parish nurse in the congregational setting. Five parish nurse participants from varied denominations in rural and nonrural Virginia received a laptop computer, printer, video projector, and webcam along with high-speed Internet access in each congregational setting. The nurses attended two group education sessions that incorporated computer applications and training in accessing and using quality health information resources and communication applications such as a group "chat" software and webcam to communicate with others through high-speed Internet access. Qualitative analysis from semistructured interviews of nurses confirmed that participants found the project to be beneficial in terms of awareness, education, and applicability of technology use in parish nurse practice. Quantitative data from preproject and postproject surveys found significant differences in nurses' abilities and confidence with technology use and application. Findings showed that the knowledge and experience gained from this study enhanced parish nurse practice and confidence in using technology for communication, health education, and counseling.

  2. Bayesian networks and statistical analysis application to analyze the diagnostic test accuracy

    NASA Astrophysics Data System (ADS)

    Orzechowski, P.; Makal, Jaroslaw; Onisko, A.

    2005-02-01

    A computer-aided BPH diagnosis system based on a Bayesian network is described in the paper. First results are compared with those of a conventional statistical method. Various statistical methods have been used successfully in medicine for years; however, the clear advantages of probabilistic methods make them useful in newly created systems, which are common in medicine but for which full and reliable domain knowledge is not available. The article presents the advantages of the computer-aided BPH diagnosis system in clinical practice for urologists.

  3. Environmental Impact Analysis Process. Volume 1

    DTIC Science & Technology

    1987-04-01

    defined in the Society of Automotive Engineers (SAE) Aerospace Recommended Practice 865A "Definition and Procedures for Computing the Perceived Noise Level..."

  4. A Practical Model for Forecasting New Freshman Enrollment during the Application Period.

    ERIC Educational Resources Information Center

    Paulsen, Michael B.

    1989-01-01

    A simple and effective model for forecasting freshman enrollment during the application period is presented step by step. The model requires minimal and readily available information, uses a simple linear regression analysis on a personal computer, and provides updated monthly forecasts. (MSE)
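    A minimal sketch of the kind of forecast such a model produces, assuming hypothetical application and enrollment counts: final enrollment is regressed on applications received to date, and the fit is refreshed each month as new counts arrive:

    ```python
    # Simple linear regression forecast of freshman enrollment from applications
    # received by a given point in the cycle (all numbers are hypothetical).
    import numpy as np

    apps_to_date = np.array([310, 295, 342, 360, 328])      # prior years, same calendar point
    final_enrollment = np.array([512, 489, 554, 571, 533])

    slope, intercept = np.polyfit(apps_to_date, final_enrollment, 1)
    this_year_apps = 350
    print(f"Forecasted enrollment: {slope * this_year_apps + intercept:.0f}")
    ```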

  5. A Moment of Mindfulness: Computer-Mediated Mindfulness Practice Increases State Mindfulness.

    PubMed

    Mahmood, Lynsey; Hopthrow, Tim; Randsley de Moura, Georgina

    2016-01-01

    Three studies investigated the use of a 5-minute, computer-mediated mindfulness practice in increasing levels of state mindfulness. In Study 1, 54 high school students completed the computer-mediated mindfulness practice in a lab setting and Toronto Mindfulness Scale (TMS) scores were measured before and after the practice. In Study 2 (N = 90) and Study 3 (N = 61), the mindfulness practice was tested with an entirely online sample to test the delivery of the 5-minute mindfulness practice via the internet. In Study 2 and 3, we found a significant increase in TMS scores in the mindful condition, but not in the control condition. These findings highlight the impact of a brief, mindfulness practice for single-session, computer-mediated use to increase mindfulness as a state.

  6. A Moment of Mindfulness: Computer-Mediated Mindfulness Practice Increases State Mindfulness

    PubMed Central

    Mahmood, Lynsey; Hopthrow, Tim; Randsley de Moura, Georgina

    2016-01-01

    Three studies investigated the use of a 5-minute, computer-mediated mindfulness practice in increasing levels of state mindfulness. In Study 1, 54 high school students completed the computer-mediated mindfulness practice in a lab setting and Toronto Mindfulness Scale (TMS) scores were measured before and after the practice. In Study 2 (N = 90) and Study 3 (N = 61), the mindfulness practice was tested with an entirely online sample to test the delivery of the 5-minute mindfulness practice via the internet. In Study 2 and 3, we found a significant increase in TMS scores in the mindful condition, but not in the control condition. These findings highlight the impact of a brief, mindfulness practice for single-session, computer-mediated use to increase mindfulness as a state. PMID:27105428

  7. A cost-utility analysis of the use of preoperative computed tomographic angiography in abdomen-based perforator flap breast reconstruction.

    PubMed

    Offodile, Anaeze C; Chatterjee, Abhishek; Vallejo, Sergio; Fisher, Carla S; Tchou, Julia C; Guo, Lifei

    2015-04-01

    Computed tomographic angiography is a diagnostic tool increasingly used for preoperative vascular mapping in abdomen-based perforator flap breast reconstruction. This study compared the use of computed tomographic angiography and the conventional practice of Doppler ultrasonography only in postmastectomy reconstruction using a cost-utility model. Following a comprehensive literature review, a decision analytic model was created using the three most clinically relevant health outcomes in free autologous breast reconstruction with computed tomographic angiography versus Doppler ultrasonography only. Cost and utility estimates for each health outcome were used to derive the quality-adjusted life-years and incremental cost-utility ratio. One-way sensitivity analysis was performed to scrutinize the robustness of the authors' results. Six studies and 782 patients were identified. Cost-utility analysis revealed a baseline cost savings of $3179, a gain in quality-adjusted life-years of 0.25. This yielded an incremental cost-utility ratio of -$12,716, implying a dominant choice favoring preoperative computed tomographic angiography. Sensitivity analysis revealed that computed tomographic angiography was costlier when the operative time difference between the two techniques was less than 21.3 minutes. However, the clinical advantage of computed tomographic angiography over Doppler ultrasonography only showed that computed tomographic angiography would still remain the cost-effective option even if it offered no additional operating time advantage. The authors' results show that computed tomographic angiography is a cost-effective technology for identifying lower abdominal perforators for autologous breast reconstruction. Although the perfect study would be a randomized controlled trial of the two approaches with true cost accrual, the authors' results represent the best available evidence.
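    The reported incremental cost-utility ratio follows directly from the two baseline figures given in the abstract, as this short check shows:

    ```python
    # Incremental cost-utility ratio = incremental cost / incremental QALYs.
    delta_cost = -3179.0   # negative: computed tomographic angiography saves money
    delta_qaly = 0.25      # positive: and gains quality-adjusted life-years
    print(delta_cost / delta_qaly)   # -12716.0, the dominant result reported above
    ```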

  8. Supercomputing resources empowering superstack with interactive and integrated systems

    NASA Astrophysics Data System (ADS)

    Rückemann, Claus-Peter

    2012-09-01

    This paper presents results from the development and implementation of Superstack algorithms for dynamic use with integrated systems and supercomputing resources. Processing of geophysical data, known as geoprocessing, is an essential part of the analysis of geoscientific data. The theory of Superstack algorithms and their practical application on modern computing architectures were inspired by developments in the processing of seismic data, first on mainframes and in recent years in high-end scientific computing applications. Several stacking algorithms are known, but when seismic data have a low signal-to-noise ratio, iterative algorithms such as the Superstack can support analysis and interpretation. The new Superstack algorithms are in use with wave theory and optical phenomena on high-performance computing resources, for huge data sets as well as for sophisticated application scenarios in geosciences and archaeology.

  9. The influence of deliberate practice on musical achievement: a meta-analysis.

    PubMed

    Platz, Friedrich; Kopiez, Reinhard; Lehmann, Andreas C; Wolf, Anna

    2014-01-01

    Deliberate practice (DP) is a task-specific structured training activity that plays a key role in understanding skill acquisition and explaining individual differences in expert performance. Relevant activities that qualify as DP have to be identified in every domain. For example, for training in classical music, solitary practice is a typical training activity during skill acquisition. To date, no meta-analysis on the quantifiable effect size of deliberate practice on attained performance in music has been conducted. Yet the identification of a quantifiable effect size could be relevant for the current discussion on the role of various factors on individual difference in musical achievement. Furthermore, a research synthesis might enable new computational approaches to musical development. Here we present the first meta-analysis on the role of deliberate practice in the domain of musical performance. A final sample size of 13 studies (total N = 788) was carefully extracted to satisfy the following criteria: reported durations of task-specific accumulated practice as predictor variables and objectively assessed musical achievement as the target variable. We identified an aggregated effect size of r_c = 0.61; 95% CI [0.54, 0.67] for the relationship between task-relevant practice (which by definition includes DP) and musical achievement. Our results corroborate the central role of long-term (deliberate) practice for explaining expert performance in music.
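    For readers unfamiliar with how a pooled correlation and its confidence interval are usually obtained, the sketch below shows a standard inverse-variance-weighted aggregation on the Fisher-z scale with hypothetical study values; it illustrates the general approach, not the authors' exact procedure or corrections:

    ```python
    # Fisher-z meta-analytic aggregation of per-study correlations (hypothetical data).
    import numpy as np

    r = np.array([0.55, 0.70, 0.48, 0.62])   # per-study correlations
    n = np.array([60, 45, 120, 80])          # per-study sample sizes

    z = np.arctanh(r)                        # Fisher z transform
    w = n - 3.0                              # inverse of var(z) = 1 / (n - 3)
    z_bar = np.sum(w * z) / np.sum(w)
    se = 1.0 / np.sqrt(np.sum(w))
    lo, hi = np.tanh(z_bar - 1.96 * se), np.tanh(z_bar + 1.96 * se)
    print(f"pooled r = {np.tanh(z_bar):.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
    ```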

  10. Teachers' Organization of Participation Structures for Teaching Science with Computer Technology

    NASA Astrophysics Data System (ADS)

    Subramaniam, Karthigeyan

    2016-08-01

    This paper describes a qualitative study that investigated the nature of the participation structures and how the participation structures were organized by four science teachers when they constructed and communicated science content in their classrooms with computer technology. Participation structures focus on the activity structures and processes in social settings like classrooms, thereby providing glimpses into the complex dynamics of teacher-student interactions, configurations, and conventions during collective meaning making and knowledge creation. Data included observations, interviews, and focus group interviews. Analysis revealed that the dominant participation structure evident within participants' instruction with computer technology was a (Teacher) initiation-(Student and Teacher) response sequences-(Teacher) evaluate participation structure. Three key events characterized how participants organized this participation structure in their classrooms: setting the stage for interactive instruction, the joint activity, and maintaining accountability. Implications include the following: (1) teacher educators need to tap into the knowledge base that underscores science teachers' learning-to-teach philosophies when computer technology is used in instruction. (2) Teacher educators need to emphasize the essential idea that learning and cognition are not situated within the computer technology but within the pedagogical practices, specifically the participation structures. (3) The pedagogical practices developed with the integration or use of computer technology, underpinned by the teachers' own knowledge of classroom contexts and curriculum, need to be the focus for how students learn science content with computer technology, instead of just focusing on how computer technology alone supports students' learning of science content.

  11. 77 FR 64834 - Computational Fluid Dynamics Best Practice Guidelines for Dry Cask Applications

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-23

    ... NUCLEAR REGULATORY COMMISSION [NRC-2012-0250] Computational Fluid Dynamics Best Practice... public comments on draft NUREG-2152, ``Computational Fluid Dynamics Best Practice Guidelines for Dry Cask... System (ADAMS): You may access publicly-available documents online in the NRC Library at http://www.nrc...

  12. Cloud Infrastructures for In Silico Drug Discovery: Economic and Practical Aspects

    PubMed Central

    Clematis, Andrea; Quarati, Alfonso; Cesini, Daniele; Milanesi, Luciano; Merelli, Ivan

    2013-01-01

    Cloud computing opens new perspectives for small-medium biotechnology laboratories that need to perform bioinformatics analysis in a flexible and effective way. This seems particularly true for hybrid clouds that couple the scalability offered by general-purpose public clouds with the greater control and ad hoc customizations supplied by the private ones. A hybrid cloud broker, acting as an intermediary between users and public providers, can support customers in the selection of the most suitable offers, optionally adding the provisioning of dedicated services with higher levels of quality. This paper analyses some economic and practical aspects of exploiting cloud computing in a real research scenario for the in silico drug discovery in terms of requirements, costs, and computational load based on the number of expected users. In particular, our work is aimed at supporting both the researchers and the cloud broker delivering an IaaS cloud infrastructure for biotechnology laboratories exposing different levels of nonfunctional requirements. PMID:24106693

  13. GT-WGS: an efficient and economic tool for large-scale WGS analyses based on the AWS cloud service.

    PubMed

    Wang, Yiqi; Li, Gen; Ma, Mark; He, Fazhong; Song, Zhuo; Zhang, Wei; Wu, Chengkun

    2018-01-19

    Whole-genome sequencing (WGS) plays an increasingly important role in clinical practice and public health. Due to the big data size, WGS data analysis is usually compute-intensive and IO-intensive. Currently it usually takes 30 to 40 h to finish a 50× WGS analysis task, which is far from the ideal speed required by the industry. Furthermore, the high-end infrastructure required by WGS computing is costly in terms of time and money. In this paper, we aim to improve the time efficiency of WGS analysis and minimize the cost by elastic cloud computing. We developed a distributed system, GT-WGS, for large-scale WGS analyses utilizing the Amazon Web Services (AWS). Our system won first prize in the Wind and Cloud challenge held by the Genomics and Cloud Technology Alliance conference (GCTA) committee. The system makes full use of the dynamic pricing mechanism of AWS. We evaluate the performance of GT-WGS with a 55× WGS dataset (400GB fastq) provided by the GCTA 2017 competition. In the best case, it only took 18.4 min to finish the analysis, and the AWS cost of the whole process was only 16.5 US dollars. The accuracy of GT-WGS is 99.9% consistent with that of the Genome Analysis Toolkit (GATK) best practice. We also evaluated the performance of GT-WGS on a real-world dataset provided by the XiangYa hospital, which consists of a 5× whole-genome dataset with 500 samples, and on average GT-WGS managed to finish one 5× WGS analysis task in 2.4 min at a cost of $3.6. WGS is already playing an important role in guiding therapeutic intervention. However, its application is limited by the time cost and computing cost. GT-WGS excelled as an efficient and affordable WGS analysis tool to address this problem. The demo video and supplementary materials of GT-WGS can be accessed at https://github.com/Genetalks/wgs_analysis_demo.

  14. Hybrid, experimental and computational, investigation of mechanical components

    NASA Astrophysics Data System (ADS)

    Furlong, Cosme; Pryputniewicz, Ryszard J.

    1996-07-01

    Computational and experimental methodologies have unique features for the analysis and solution of a wide variety of engineering problems. Computations provide results that depend on selection of input parameters such as geometry, material constants, and boundary conditions which, for correct modeling purposes, have to be appropriately chosen. In addition, it is relatively easy to modify the input parameters in order to computationally investigate different conditions. Experiments provide solutions which characterize the actual behavior of the object of interest subjected to specific operating conditions. However, it is impractical to experimentally perform parametric investigations. This paper discusses the use of a hybrid, computational and experimental, approach for study and optimization of mechanical components. Computational techniques are used for modeling the behavior of the object of interest while it is experimentally tested using noninvasive optical techniques. Comparisons are performed through a fringe predictor program used to facilitate the correlation between both techniques. In addition, experimentally obtained quantitative information, such as displacements and shape, can be applied in the computational model in order to improve this correlation. The result is a validated computational model that can be used for performing quantitative analyses and structural optimization. Practical application of the hybrid approach is illustrated with a representative example which demonstrates the viability of the approach as an engineering tool for structural analysis and optimization.

  15. An Aggregate IRT Procedure for Exploratory Factor Analysis

    ERIC Educational Resources Information Center

    Camilli, Gregory; Fox, Jean-Paul

    2015-01-01

    An aggregation strategy is proposed to potentially address practical limitation related to computing resources for two-level multidimensional item response theory (MIRT) models with large data sets. The aggregate model is derived by integration of the normal ogive model, and an adaptation of the stochastic approximation expectation maximization…

  16. Rethinking Technology-Enhanced Physics Teacher Education: From Theory to Practice

    ERIC Educational Resources Information Center

    Milner-Bolotin, Marina

    2016-01-01

    This article discusses how modern technology, such as electronic response systems, PeerWise system, data collection and analysis tools, computer simulations, and modeling software can be used in physics methods courses to promote teacher-candidates' professional competencies and their positive attitudes about mathematics and science education. We…

  17. Empathy in Distance Learning Design Practice

    ERIC Educational Resources Information Center

    Matthews, Michael T.; Williams, Gregory S.; Yanchar, Stephen C.; McDonald, Jason K.

    2017-01-01

    The notion of designer empathy has become a cornerstone of design philosophy in fields such as product design, human-computer interaction, and service design. But the literature on instructional designer empathy and learner analysis suggests that distance learning designers are generally quite removed from the learners with whom they could be…

  18. Rock mechanics. Practical use in civil engineering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Murakami, S.

    1985-01-01

    Thanks to recent developments in computer technology, the systematic analysis of the stability and behavior of rock is gradually maturing into the discipline of rock mechanics. Although its progress still lags behind that of engineering geology, the book aims to contribute to the systematization of the subject. Examples of design are given.

  19. Psychometric Measurement Models and Artificial Neural Networks

    ERIC Educational Resources Information Center

    Sese, Albert; Palmer, Alfonso L.; Montano, Juan J.

    2004-01-01

    The study of measurement models in psychometrics by means of dimensionality reduction techniques such as Principal Components Analysis (PCA) is a very common practice. In recent times, an upsurge of interest in the study of artificial neural networks apt to computing a principal component extraction has been observed. Despite this interest, the…

  20. Testing Different Model Building Procedures Using Multiple Regression.

    ERIC Educational Resources Information Center

    Thayer, Jerome D.

    The stepwise regression method of selecting predictors for computer assisted multiple regression analysis was compared with forward, backward, and best subsets regression, using 16 data sets. The results indicated the stepwise method was preferred because of its practical nature, when the models chosen by different selection methods were similar…

  1. Designing Adaptive Instruction for Teams: A Meta-Analysis

    ERIC Educational Resources Information Center

    Sottilare, Robert A.; Shawn Burke, C.; Salas, Eduardo; Sinatra, Anne M.; Johnston, Joan H.; Gilbert, Stephen B.

    2018-01-01

    The goal of this research was the development of a practical architecture for the computer-based tutoring of teams. This article examines the relationship of team behaviors as antecedents to successful team performance and learning during adaptive instruction guided by Intelligent Tutoring Systems (ITSs). Adaptive instruction is a training or…

  2. The semantics of Chemical Markup Language (CML) for computational chemistry : CompChem.

    PubMed

    Phadungsukanan, Weerapong; Kraft, Markus; Townsend, Joe A; Murray-Rust, Peter

    2012-08-07

    This paper introduces a subdomain chemistry format for storing computational chemistry data called CompChem. It has been developed based on the design, concepts and methodologies of Chemical Markup Language (CML) by adding computational chemistry semantics on top of the CML Schema. The format allows a wide range of ab initio quantum chemistry calculations of individual molecules to be stored. These calculations include, for example, single point energy calculation, molecular geometry optimization, and vibrational frequency analysis. The paper also describes the supporting infrastructure, such as processing software, dictionaries, validation tools and database repositories. In addition, some of the challenges and difficulties in developing common computational chemistry dictionaries are discussed. The uses of CompChem are illustrated by two practical applications.

  3. The semantics of Chemical Markup Language (CML) for computational chemistry : CompChem

    PubMed Central

    2012-01-01

    This paper introduces a subdomain chemistry format for storing computational chemistry data called CompChem. It has been developed based on the design, concepts and methodologies of Chemical Markup Language (CML) by adding computational chemistry semantics on top of the CML Schema. The format allows a wide range of ab initio quantum chemistry calculations of individual molecules to be stored. These calculations include, for example, single point energy calculation, molecular geometry optimization, and vibrational frequency analysis. The paper also describes the supporting infrastructure, such as processing software, dictionaries, validation tools and database repositories. In addition, some of the challenges and difficulties in developing common computational chemistry dictionaries are discussed. The uses of CompChem are illustrated by two practical applications. PMID:22870956

  4. Aquatic Toxic Analysis by Monitoring Fish Behavior Using Computer Vision: A Recent Progress

    PubMed Central

    Fu, Longwen; Liu, Zuoyi

    2018-01-01

    Video-tracking-based biological early warning systems have made great progress with advanced computer vision and machine learning methods. The ability to video-track multiple biological organisms has improved greatly in recent years. Video-based behavioral monitoring has become a common tool for acquiring quantified behavioral data for aquatic risk assessment. Investigation of behavioral responses under chemical and environmental stress has been boosted by rapidly developing machine learning and artificial intelligence. In this paper, we introduce the fundamentals of video tracking and present pioneering work on the precise tracking of groups of individuals in 2D and 3D space. Technical and practical issues encountered in video tracking are explained. Subsequently, toxic analysis based on fish behavioral data is summarized. Frequently used computational methods and machine learning are explained with their applications in aquatic toxicity detection and abnormal pattern analysis. Finally, the advantages of recently developed deep learning approaches in toxicity prediction are presented. PMID:29849612

  5. Launching genomics into the cloud: deployment of Mercury, a next generation sequence analysis pipeline

    PubMed Central

    2014-01-01

    Background: Massively parallel DNA sequencing generates staggering amounts of data. Decreasing cost, increasing throughput, and improved annotation have expanded the diversity of genomics applications in research and clinical practice. This expanding scale creates analytical challenges: accommodating peak compute demand, coordinating secure access for multiple analysts, and sharing validated tools and results. Results: To address these challenges, we have developed the Mercury analysis pipeline and deployed it in local hardware and the Amazon Web Services cloud via the DNAnexus platform. Mercury is an automated, flexible, and extensible analysis workflow that provides accurate and reproducible genomic results at scales ranging from individuals to large cohorts. Conclusions: By taking advantage of cloud computing and with Mercury implemented on the DNAnexus platform, we have demonstrated a powerful combination of a robust and fully validated software pipeline and a scalable computational resource that, to date, we have applied to more than 10,000 whole genome and whole exome samples. PMID:24475911

  6. The multi-modal Australian ScienceS Imaging and Visualization Environment (MASSIVE) high performance computing infrastructure: applications in neuroscience and neuroinformatics research

    PubMed Central

    Goscinski, Wojtek J.; McIntosh, Paul; Felzmann, Ulrich; Maksimenko, Anton; Hall, Christopher J.; Gureyev, Timur; Thompson, Darren; Janke, Andrew; Galloway, Graham; Killeen, Neil E. B.; Raniga, Parnesh; Kaluza, Owen; Ng, Amanda; Poudel, Govinda; Barnes, David G.; Nguyen, Toan; Bonnington, Paul; Egan, Gary F.

    2014-01-01

    The Multi-modal Australian ScienceS Imaging and Visualization Environment (MASSIVE) is a national imaging and visualization facility established by Monash University, the Australian Synchrotron, the Commonwealth Scientific Industrial Research Organization (CSIRO), and the Victorian Partnership for Advanced Computing (VPAC), with funding from the National Computational Infrastructure and the Victorian Government. The MASSIVE facility provides hardware, software, and expertise to drive research in the biomedical sciences, particularly advanced brain imaging research using synchrotron x-ray and infrared imaging, functional and structural magnetic resonance imaging (MRI), x-ray computer tomography (CT), electron microscopy and optical microscopy. The development of MASSIVE has been based on best practice in system integration methodologies, frameworks, and architectures. The facility has: (i) integrated multiple different neuroimaging analysis software components, (ii) enabled cross-platform and cross-modality integration of neuroinformatics tools, and (iii) brought together neuroimaging databases and analysis workflows. MASSIVE is now operational as a nationally distributed and integrated facility for neuroinfomatics and brain imaging research. PMID:24734019

  7. Introduction to Social Network Analysis

    NASA Astrophysics Data System (ADS)

    Zaphiris, Panayiotis; Ang, Chee Siang

    Social Network analysis focuses on patterns of relations between and among people, organizations, states, etc. It aims to describe networks of relations as fully as possible, identify prominent patterns in such networks, trace the flow of information through them, and discover what effects these relations and networks have on people and organizations. Social network analysis offers a very promising potential for analyzing human-human interactions in online communities (discussion boards, newsgroups, virtual organizations). This Tutorial provides an overview of this analytic technique and demonstrates how it can be used in Human Computer Interaction (HCI) research and practice, focusing especially on Computer Mediated Communication (CMC). This topic acquires particular importance these days, with the increasing popularity of social networking websites (e.g., youtube, myspace, MMORPGs etc.) and the research interest in studying them.

  8. Design of a Performance-Responsive Drill and Practice Algorithm for Computer-Based Training.

    ERIC Educational Resources Information Center

    Vazquez-Abad, Jesus; LaFleur, Marc

    1990-01-01

    Reviews criticisms of the use of drill and practice programs in educational computing and describes potentials for its use in instruction. Topics discussed include guidelines for developing computer-based drill and practice; scripted training courseware; item format design; item bank design; and a performance-responsive algorithm for item…

  9. Achieving benefit for patients in primary care informatics: the report of an international consensus workshop at Medinfo 2007.

    PubMed

    de Lusignan, Simon; Teasdale, Sheila

    2007-01-01

    Landmark reports suggest that sharing health data between clinical computer systems should improve patient safety and the quality of care. Enhancing the use of informatics in primary care is usually a key part of these strategies. To synthesise the learning from the international use of informatics in primary care. The workshop was attended by 21 delegates drawn from all continents. There were presentations from USA, UK and the Netherlands, and informal updates from Australia, Argentina, and Sweden and the Nordic countries. These presentations were discussed in a workshop setting to identify common issues. Key principles were synthesised through a post-workshop analysis and then sorted into themes. Themes emerged about the deployment of informatics which can be applied at health service, practice and individual clinical consultation level: (1) At the health service or provider level, success appeared proportional to the extent of collaboration between a broad range of stakeholders and identification of leaders. (2) Within the practice much is currently being achieved with legacy computer systems and apparently outdated coding systems. This includes prescribing safety alerts, clinical audit and promoting computer data recording and quality. (3) In the consultation the computer is a 'big player' and may make traditional models of the consultation redundant. We should make more efforts to share learning; develop clear internationally acceptable definitions; highlight gaps between pockets of excellence and real-world practice, and most importantly suggest how they might be bridged. Knowledge synthesis from different health systems may provide a greater understanding of how the third actor (the computer) is best used in primary care.

  10. A network-analysis-based comparative study of the throughput behavior of polymer melts in barrier screw geometries

    NASA Astrophysics Data System (ADS)

    Aigner, M.; Köpplmayr, T.; Kneidinger, C.; Miethlinger, J.

    2014-05-01

    Barrier screws are widely used in the plastics industry. Due to the extreme diversity of their geometries, describing the flow behavior is difficult and rarely done in practice. We present a systematic approach based on networks that uses tensor algebra and numerical methods to model and calculate selected barrier screw geometries in terms of pressure, mass flow, and residence time. In addition, we report the results of three-dimensional simulations using the commercially available ANSYS Polyflow software. The major drawbacks of three-dimensional finite-element-method (FEM) simulations are that they require vast computational power and large quantities of memory, and consume considerable time to create a geometric model by computer-aided design (CAD) and to complete a flow calculation. Consequently, a modified 2.5-dimensional finite volume method, termed network analysis, is preferable. The results obtained by network analysis and FEM simulations correlated well. Network analysis provides an efficient alternative to complex FEM software in terms of computing power and memory consumption. Furthermore, typical barrier screw geometries can be parameterized and used for flow calculations without time-consuming CAD construction.
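    A strongly simplified sketch of the network idea, assuming a Newtonian fluid and hypothetical conductances: channel segments become graph edges, mass conservation at each junction yields a linear system, and nodal pressures follow from a single solve. The authors' 2.5-dimensional method for polymer melts is considerably more elaborate, since shear-thinning behaviour requires iterating on the conductances:

    ```python
    # Solve a small flow network for nodal pressures; edge "conductance" plays the
    # role of an inverse hydraulic resistance (Newtonian assumption, toy numbers).
    import numpy as np

    nodes = 4                                    # 0 = inlet, 3 = outlet
    edges = [(0, 1, 2.0), (0, 2, 1.0), (1, 3, 1.5), (2, 3, 1.0), (1, 2, 0.5)]

    A = np.zeros((nodes, nodes))                 # graph Laplacian of conductances
    for i, j, g in edges:
        A[i, i] += g
        A[j, j] += g
        A[i, j] -= g
        A[j, i] -= g

    p = np.zeros(nodes)
    p[0], p[3] = 100.0, 0.0                      # prescribed inlet/outlet pressures
    free = [1, 2]                                # interior junctions with unknown pressure
    p[free] = np.linalg.solve(A[np.ix_(free, free)], -A[np.ix_(free, [0, 3])] @ p[[0, 3]])

    flows = {(i, j): g * (p[i] - p[j]) for i, j, g in edges}
    print(p, flows)
    ```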

  11. Game-Based Practice versus Traditional Practice in Computer-Based Writing Strategy Training: Effects on Motivation and Achievement

    ERIC Educational Resources Information Center

    Proske, Antje; Roscoe, Rod D.; McNamara, Danielle S.

    2014-01-01

    Achieving sustained student engagement with practice in computer-based writing strategy training can be a challenge. One potential solution is to foster engagement by embedding practice in educational games; yet there is currently little research comparing the effectiveness of game-based practice versus more traditional forms of practice. In this…

  12. Good enough practices in scientific computing.

    PubMed

    Wilson, Greg; Bryan, Jennifer; Cranston, Karen; Kitzes, Justin; Nederbragt, Lex; Teal, Tracy K

    2017-06-01

    Computers are now essential in all branches of science, but most researchers are never taught the equivalent of basic lab skills for research computing. As a result, data can get lost, analyses can take much longer than necessary, and researchers are limited in how effectively they can work with software and data. Computing workflows need to follow the same practices as lab projects and notebooks, with organized data, documented steps, and the project structured for reproducibility, but researchers new to computing often don't know where to start. This paper presents a set of good computing practices that every researcher can adopt, regardless of their current level of computational skill. These practices, which encompass data management, programming, collaborating with colleagues, organizing projects, tracking work, and writing manuscripts, are drawn from a wide variety of published sources from our daily lives and from our work with volunteer organizations that have delivered workshops to over 11,000 people since 2010.

  13. GPU accelerated dynamic functional connectivity analysis for functional MRI data.

    PubMed

    Akgün, Devrim; Sakoğlu, Ünal; Esquivel, Johnny; Adinoff, Bryon; Mete, Mutlu

    2015-07-01

    Recent advances in multi-core processors and graphics-card-based computational technologies have paved the way for an improved and dynamic utilization of parallel computing techniques. Numerous applications have been implemented for the acceleration of computationally intensive problems in various computational science fields including bioinformatics, in which big data problems are prevalent. In neuroimaging, dynamic functional connectivity (DFC) analysis is a computationally demanding method used to investigate dynamic functional interactions among different brain regions or networks identified with functional magnetic resonance imaging (fMRI) data. In this study, we implemented and analyzed a parallel DFC algorithm based on thread-based and block-based approaches. The thread-based approach was designed to parallelize DFC computations and was implemented in both Open Multi-Processing (OpenMP) and Compute Unified Device Architecture (CUDA) programming platforms. Another approach developed in this study to better utilize the CUDA architecture is the block-based approach, where parallelization involves smaller parts of fMRI time courses obtained by sliding windows. Experimental results showed that the proposed parallel design solutions enabled by the GPUs significantly reduce the computation time for DFC analysis. Multicore implementation using OpenMP on an 8-core processor provides up to a 7.7× speed-up. GPU implementation using CUDA yielded substantial accelerations ranging from 18.5× to 157× speed-up once thread-based and block-based approaches were combined in the analysis. The proposed parallel programming solutions showed that multi-core processor and CUDA-supported GPU implementations accelerated the DFC analyses significantly. The developed algorithms make DFC analysis more practical for multi-subject studies and for more dynamic analyses. Copyright © 2015 Elsevier Ltd. All rights reserved.
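    The kernel being parallelized is, at its core, a sliding-window correlation between regional time courses. The sketch below shows that kernel in plain NumPy on synthetic data; it is not the OpenMP/CUDA code developed in the study:

    ```python
    # Sliding-window Pearson correlation between two ROI time courses.
    import numpy as np

    def sliding_window_corr(x, y, window, step=1):
        starts = range(0, len(x) - window + 1, step)
        return np.array([np.corrcoef(x[s:s + window], y[s:s + window])[0, 1] for s in starts])

    rng = np.random.default_rng(42)
    ts1 = rng.normal(size=300)                   # synthetic fMRI time courses
    ts2 = 0.5 * ts1 + rng.normal(size=300)
    dfc = sliding_window_corr(ts1, ts2, window=30, step=5)
    print(dfc.shape, dfc[:5])
    ```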

  14. A Gaussian Approximation Approach for Value of Information Analysis.

    PubMed

    Jalal, Hawre; Alarid-Escudero, Fernando

    2018-02-01

    Most decisions are associated with uncertainty. Value of information (VOI) analysis quantifies the opportunity loss associated with choosing a suboptimal intervention based on current imperfect information. VOI can inform the value of collecting additional information, resource allocation, research prioritization, and future research designs. However, in practice, VOI remains underused due to many conceptual and computational challenges associated with its application. Expected value of sample information (EVSI) is rooted in Bayesian statistical decision theory and measures the value of information from a finite sample. The past few years have witnessed a dramatic growth in computationally efficient methods to calculate EVSI, including metamodeling. However, little research has been done to simplify the experimental data collection step inherent to all EVSI computations, especially for correlated model parameters. This article proposes a general Gaussian approximation (GA) of the traditional Bayesian updating approach based on the original work by Raiffa and Schlaifer to compute EVSI. The proposed approach uses a single probabilistic sensitivity analysis (PSA) data set and involves 2 steps: 1) a linear metamodel step to compute the EVSI on the preposterior distributions and 2) a GA step to compute the preposterior distribution of the parameters of interest. The proposed approach is efficient and can be applied for a wide range of data collection designs involving multiple non-Gaussian parameters and unbalanced study designs. Our approach is particularly useful when the parameters of an economic evaluation are correlated or interact.
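    The sketch below illustrates only the linear-metamodel ingredient (step 1 in the abstract) on a hypothetical single-parameter PSA sample: regressing incremental net benefit on the parameter of interest yields a value-of-information estimate for that parameter. The Gaussian-approximation preposterior step that turns this into an EVSI calculation is not shown:

    ```python
    # Linear metamodel on a probabilistic sensitivity analysis (PSA) sample:
    # fitted conditional expectation of incremental net benefit given theta.
    import numpy as np

    rng = np.random.default_rng(0)
    theta = rng.normal(0.1, 0.05, size=5000)              # hypothetical PSA draws of one parameter
    nb_new = 20000 * theta + rng.normal(0, 500, 5000)     # net benefit, new intervention
    nb_std = np.full(5000, 1000.0)                        # net benefit, comparator
    inb = nb_new - nb_std                                 # incremental net benefit

    beta = np.polyfit(theta, inb, 1)                      # linear metamodel
    inb_hat = np.polyval(beta, theta)

    # Value of learning theta perfectly: E[max(0, E[INB | theta])] - max(0, E[INB]).
    value = np.mean(np.maximum(inb_hat, 0.0)) - max(np.mean(inb), 0.0)
    print(f"metamodel-based value-of-information estimate: {value:.1f}")
    ```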

  15. Convergence acceleration of computer methods for grounding analysis in stratified soils

    NASA Astrophysics Data System (ADS)

    Colominas, I.; París, J.; Navarrina, F.; Casteleiro, M.

    2010-06-01

    The design of safe grounding systems in electrical installations is essential to ensure the protection of the equipment, the continuity of the power supply and the safety of persons. In order to achieve these goals, it is necessary to compute the equivalent electrical resistance of the system and the potential distribution on the earth surface when a fault condition occurs. In recent years the authors have developed a numerical formulation based on the BEM for the analysis of grounding systems embedded in uniform and layered soils. As is well known, in practical cases the underlying series have a poor rate of convergence, and the use of multilayer soil models entails a prohibitive computational cost. In this paper we present an efficient technique based on the Aitken δ2-process in order to improve the rate of convergence of the series expansions involved.
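    Aitken's δ2-process accelerates a slowly convergent sequence s_n via s_n' = s_n - (s_{n+1} - s_n)^2 / (s_{n+2} - 2 s_{n+1} + s_n). The sketch below applies it to the partial sums of a generic slowly convergent series, not to the grounding-analysis series expansions themselves:

    ```python
    # Aitken delta-squared acceleration of a sequence of partial sums.
    import numpy as np

    def aitken(s):
        s = np.asarray(s, dtype=float)
        num = (s[1:-1] - s[:-2]) ** 2
        den = s[2:] - 2.0 * s[1:-1] + s[:-2]
        return s[:-2] - num / den

    # Partial sums of the alternating harmonic series, which converges slowly to ln 2.
    partial = np.cumsum((-1.0) ** np.arange(20) / np.arange(1, 21))
    print(partial[-1])           # ~0.6688, still far from ln 2 ~ 0.6931 after 20 terms
    print(aitken(partial)[-1])   # much closer after a single acceleration pass
    ```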

  16. Innovation and Integration: Case Studies of Effective Teacher Practices in the Use of Handheld Computers

    ERIC Educational Resources Information Center

    Chavez, Raymond Anthony

    2010-01-01

    Previous research conducted on the use of handheld computers in K-12 education has focused on how handheld computer use affects student motivation, engagement, and productivity. These four case studies sought to identify effective teacher practices in the integration of handhelds into the curriculum and the factors that affect those practices. The…

  17. The Role of Computer Technology in Teaching Reading and Writing: Preschool Teachers' Beliefs and Practices

    ERIC Educational Resources Information Center

    Ihmeideh, Fathi

    2010-01-01

    This study investigated preschool teachers' beliefs and practices regarding the use of computer technology in teaching reading and writing in Jordan. The researcher developed a questionnaire consisting of two scales--Teachers' Beliefs Scale (TB Scale) and Teachers' Practices Scale (TP Scale)--to examine the role of computer technology in teaching…

  18. Interfacing computers and the internet with your allergy practice.

    PubMed

    Bernstein, Jonathan A

    2004-10-01

    Computers and the internet have begun to play a prominent role in the medical profession and, in particular, the allergy specialty. Computer technology is being used more frequently for patient and physician education, asthma management in children and adults, including environmental control, generating patient databases for research and clinical practice and in marketing and e-commerce. This article will review how computers and the internet have begun to interface with the allergy subspecialty practice in these various areas.

  19. Post-Optimality Analysis In Aerospace Vehicle Design

    NASA Technical Reports Server (NTRS)

    Braun, Robert D.; Kroo, Ilan M.; Gage, Peter J.

    1993-01-01

    This analysis pertains to the applicability of optimal sensitivity information to aerospace vehicle design. An optimal sensitivity (or post-optimality) analysis refers to computations performed once the initial optimization problem is solved. These computations may be used to characterize the design space about the present solution and infer changes in this solution as a result of constraint or parameter variations, without reoptimizing the entire system. The present analysis demonstrates that post-optimality information generated through first-order computations can be used to accurately predict the effect of constraint and parameter perturbations on the optimal solution. This assessment is based on the solution of an aircraft design problem in which the post-optimality estimates are shown to be within a few percent of the true solution over the practical range of constraint and parameter variations. Through solution of a reusable, single-stage-to-orbit, launch vehicle design problem, this optimal sensitivity information is also shown to improve the efficiency of the design process. For a hierarchically decomposed problem, this computational efficiency is realized by estimating the main-problem objective gradient through optimal sensitivity calculations. By reducing the need for finite differentiation of a re-optimized subproblem, a significant decrease in the number of objective function evaluations required to reach the optimal solution is obtained.
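    For reference, the standard first-order result that post-optimality analysis rests on (stated here from general optimization theory rather than quoted from the paper) gives the sensitivity of the optimal objective $J^*$ to a problem parameter $p$ as the partial derivative of the Lagrangian evaluated at the optimum:

    $$\frac{dJ^*}{dp} \;=\; \left.\frac{\partial \mathcal{L}}{\partial p}\right|_{x^*,\,\lambda^*} \;=\; \frac{\partial J}{\partial p} \;+\; {\lambda^*}^{\mathsf T}\,\frac{\partial g}{\partial p},$$

    where $g$ collects the active constraints and $\lambda^*$ the optimal Lagrange multipliers; a small perturbation $\Delta p$ therefore shifts the optimal objective by approximately $(dJ^*/dp)\,\Delta p$ without reoptimizing.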

  20. [Sedentary behaviour of 13-year-olds and its association with selected health behaviours, parenting practices and body mass].

    PubMed

    Jodkowska, Maria; Tabak, Izabela; Oblacińska, Anna; Stalmach, Magdalena

    2013-01-01

    1. To estimate the time spent in sedentary behaviour (watching TV, using the computer, doing homework). 2. To assess the link between the total time spent watching TV, using the computer, and doing homework and dietary habits, physical activity, parental practices and body mass. A cross-sectional study was conducted in Poland in 2008 among 13-year-olds (n=600). They self-reported their time spent on TV viewing, computer use and homework. Their dietary behaviours, physical activity (MVPA) and parenting practices were also self-reported. Height and weight were measured by school nurses. Descriptive statistics and correlation were used in the analysis. The mean time spent watching television on school days was 2.3 hours for girls and 2.2 for boys. Boys spent significantly more time using the computer than girls (1.8 and 1.5 hours, respectively), while girls took longer doing homework (1.7 and 1.3 hours, respectively). Mean screen time was about 4 hours on school days and about 6 hours at the weekend, and was significantly longer for boys on weekdays. Screen time was positively associated with intake of sweets, chips, soft drinks and "fast food" and with eating meals while watching TV, and negatively with regularity of meals and parental supervision. There was no correlation between screen time and either physical activity or body mass. Sedentary behaviours and physical activity are not competing behaviours in Polish teenagers, but their relationship with unhealthy dietary patterns may lead to the development of obesity. Good parental practices, including both mother's and father's supervision, seem to be crucial for limiting screen time in their children. Parents should become aware that appropriate monitoring of their children's lifestyle is a crucial element of health education in the prevention of lifestyle diseases. This is a task for both healthcare workers and educational staff.

  1. BOOK REVIEW: Vortex Methods: Theory and Practice

    NASA Astrophysics Data System (ADS)

    Cottet, G.-H.; Koumoutsakos, P. D.

    2001-03-01

    The book Vortex Methods: Theory and Practice presents a comprehensive account of the numerical technique for solving fluid flow problems. It provides a very nice balance between the theoretical development and analysis of the various techniques and their practical implementation. In fact, the presentation of the rigorous mathematical analysis of these methods instills confidence in their implementation. The book goes into some detail on the more recent developments that attempt to account for viscous effects, in particular the presence of viscous boundary layers in some flows of interest. The presentation is very readable, with most points illustrated with well-chosen examples, some quite sophisticated. It is a very worthy reference book that should appeal to a large body of readers, from those interested in the mathematical analysis of the methods to practitioners of computational fluid dynamics. The use of the book as a text is compromised by its lack of exercises for students, but it could form the basis of a graduate special topics course. Juan Lopez

  2. The outcomes of anxiety, confidence, and self-efficacy with Internet health information retrieval in older adults: a pilot study.

    PubMed

    Chu, Adeline; Mastel-Smith, Beth

    2010-01-01

    Technology has a great impact on nursing practice. With the increasing numbers of older Americans using computers and the Internet in recent years, nurses have the capability to deliver effective and efficient health education to their patients and the community. Based on the theoretical framework of Bandura's self-efficacy theory, the pilot project reported findings from a 5-week computer course on Internet health searches in older adults, 65 years or older, at a senior activity learning center. Twelve participants were recruited and randomized to either the intervention or the control group. Measures of computer anxiety, computer confidence, and computer self-efficacy scores were analyzed at baseline, at the end of the program, and 6 weeks after the completion of the program. Analysis was conducted with repeated-measures analysis of variance. Findings showed participants who attended a structured computer course on Internet health information retrieval reported lowered anxiety and increased confidence and self-efficacy at the end of the 5-week program and 6 weeks after the completion of the program as compared with participants who were not in the program. The study demonstrated that a computer course can help reduce anxiety and increase confidence and self-efficacy in online health searches in older adults.

  3. Thermal-hydraulics Analysis of a Radioisotope-powered Mars Hopper Propulsion System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robert C. O'Brien; Andrew C. Klein; William T. Taitano

    Results of thermal-hydraulics analyses, produced using a combined suite of computational design and analysis codes, are presented for the preliminary design of a concept Radioisotope Thermal Rocket (RTR) propulsion system. Modeling of the transient heating and steady-state temperatures of the system is presented. Simulation results for propellant blowdown during impulsive operation are also presented. The results from this study support the feasibility of a practical, thermally capacitive RTR propulsion system.

  4. Real longitudinal data analysis for real people: building a good enough mixed model.

    PubMed

    Cheng, Jing; Edwards, Lloyd J; Maldonado-Molina, Mildred M; Komro, Kelli A; Muller, Keith E

    2010-02-20

    Mixed effects models have become very popular, especially for the analysis of longitudinal data. One challenge is how to build a good enough mixed effects model. In this paper, we suggest a systematic strategy for addressing this challenge and introduce easily implemented practical advice to build mixed effects models. A general discussion of the scientific strategies motivates the recommended five-step procedure for model fitting. The need to model both the mean structure (the fixed effects) and the covariance structure (the random effects and residual error) creates the fundamental flexibility and complexity. Some very practical recommendations help to conquer the complexity. Centering, scaling, and full-rank coding of all the predictor variables radically improve the chances of convergence, computing speed, and numerical accuracy. Applying computational and assumption diagnostics from univariate linear models to mixed model data greatly helps to detect and solve the related computational problems. The approach helps to fit more general covariance models, a crucial step in selecting a credible covariance model needed for defensible inference. A detailed demonstration of the recommended strategy is based on data from a published study of a randomized trial of a multicomponent intervention to prevent young adolescents' alcohol use. The discussion highlights a need for additional covariance and inference tools for mixed models, as well as the need to improve how scientists and statisticians teach and review the process of finding a good enough mixed model. (c) 2009 John Wiley & Sons, Ltd.
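
    A minimal sketch of one of these recommendations (centering and scaling a predictor before fitting a random-intercept longitudinal model); the data are simulated and the variable names are illustrative, not the alcohol-trial data analyzed in the paper:

      # Hedged sketch, assuming numpy, pandas and statsmodels are available.
      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(0)
      n_subj, n_obs = 40, 6
      subj = np.repeat(np.arange(n_subj), n_obs)
      age = np.tile(np.linspace(12.0, 17.0, n_obs), n_subj)        # raw predictor
      b0 = rng.normal(0.0, 1.0, n_subj)                            # random intercepts
      y = 5.0 + 0.8 * age + b0[subj] + rng.normal(0.0, 0.5, subj.size)
      df = pd.DataFrame({"y": y, "age": age, "subject": subj})

      # centering and scaling the predictor, as recommended above
      df["age_c"] = (df["age"] - df["age"].mean()) / df["age"].std()

      # random-intercept mixed model; a richer covariance structure can be
      # introduced later via re_formula once this simpler fit behaves well
      fit = smf.mixedlm("y ~ age_c", df, groups=df["subject"]).fit()
      print(fit.summary())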

  5. Structure preserving parallel algorithms for solving the Bethe–Salpeter eigenvalue problem

    DOE PAGES

    Shao, Meiyue; da Jornada, Felipe H.; Yang, Chao; ...

    2015-10-02

    The Bethe–Salpeter eigenvalue problem is a dense structured eigenvalue problem arising from discretized Bethe–Salpeter equation in the context of computing exciton energies and states. A computational challenge is that at least half of the eigenvalues and the associated eigenvectors are desired in practice. In this paper, we establish the equivalence between Bethe–Salpeter eigenvalue problems and real Hamiltonian eigenvalue problems. Based on theoretical analysis, structure preserving algorithms for a class of Bethe–Salpeter eigenvalue problems are proposed. We also show that for this class of problems all eigenvalues obtained from the Tamm–Dancoff approximation are overestimated. In order to solve large scale problems of practical interest, we discuss parallel implementations of our algorithms targeting distributed memory systems. Finally, several numerical examples are presented to demonstrate the efficiency and accuracy of our algorithms.
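
    A small numerical illustration of the structure involved (a hedged sketch on a random, safely definite test matrix, not the paper's parallel algorithms): the Bethe–Salpeter Hamiltonian has the block form H = [[A, B], [-conj(B), -conj(A)]] with A Hermitian and B symmetric, and its positive eigenvalues can be compared against the Tamm–Dancoff approximation, i.e. the eigenvalues of A alone:

      # Hedged sketch; A is made comfortably definite so the eigenvalues of H are
      # real and the comparison with the Tamm–Dancoff values is meaningful.
      import numpy as np

      rng = np.random.default_rng(1)
      n = 6
      M = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
      A = M @ M.conj().T + 5.0 * np.eye(n)           # Hermitian, positive definite
      S = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
      B = 0.1 * (S + S.T)                            # small complex symmetric coupling

      H = np.block([[A, B], [-B.conj(), -A.conj()]])
      bse = np.sort(np.linalg.eigvals(H).real)[n:]   # positive branch of the spectrum
      tda = np.sort(np.linalg.eigvalsh(A))           # Tamm–Dancoff eigenvalues
      print(np.all(tda >= bse))                      # TDA overestimates: expected True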

  6. Interpretation of impeller flow calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tuzson, J.

    1993-09-01

    Most available computer programs are analysis and not design programs. Therefore the intervention of the designer is indispensable. Guidelines are needed to evaluate the degree of fluid mechanic perfection of a design which is compromised for practical reasons. A new way of plotting the computer output is proposed here which illustrates the energy distribution throughout the flow. The consequence of deviating from optimal flow pattern is discussed and specific cases are reviewed. A criterion is derived for the existence of a jet/wake flow pattern and for the minimum wake mixing loss.

  7. LANDSAT technology transfer to the private and public sectors through community colleges and other locally available institutions

    NASA Technical Reports Server (NTRS)

    Rogers, R. H. (Principal Investigator)

    1980-01-01

    The results achieved during the first eight months of a program to transfer LANDSAT technology to practicing professionals in the private and public sectors (grass roots) through community colleges and other locally available institutions are reported. The approach offers hands-on interactive analysis training and demonstrations through the use of color desktop computer terminals communicating with a host computer by telephone lines. The features of the terminals and associated training materials are reviewed together with plans for their use in training and demonstration projects.

  8. Physics and engineering aspects of cell and tissue imaging systems: microscopic devices and computer assisted diagnosis.

    PubMed

    Chen, Xiaodong; Ren, Liqiang; Zheng, Bin; Liu, Hong

    2013-01-01

    Conventional optical microscopes have been used widely in scientific research and in clinical practice. Modern digital microscopic devices combine the power of optical imaging with computerized analysis, archiving and communication techniques, and have great potential in pathological examinations for improving the efficiency and accuracy of clinical diagnosis. This chapter reviews the basic optical principles of conventional microscopes, fluorescence microscopes and electron microscopes. The recent developments and future clinical applications of advanced digital microscopic imaging methods and computer assisted diagnosis schemes are also discussed.

  9. A study of the dynamics of rotating space stations with elastically connected counterweight and attached flexible appendages. Volume 1: Theory

    NASA Technical Reports Server (NTRS)

    Austin, F.; Markowitz, J.; Goldenberg, S.; Zetkov, G. A.

    1973-01-01

    The formulation of a mathematical model for predicting the dynamic behavior of rotating flexible space station configurations was conducted. The overall objectives of the study were: (1) to develop the theoretical techniques for determining the behavior of a realistically modeled rotating space station, (2) to provide a versatile computer program for the numerical analysis, and (3) to present practical concepts for experimental verification of the analytical results. The mathematical model and its associated computer program are described.

  10. Computer-Mediated Social Support for Physical Activity: A Content Analysis

    ERIC Educational Resources Information Center

    Stragier, Jeroen; Mechant, Peter; De Marez, Lieven; Cardon, Greet

    2018-01-01

    Purpose: Online fitness communities are a recent phenomenon experiencing growing user bases. They can be considered as online social networks in which recording, monitoring, and sharing of physical activity (PA) are the most prevalent practices. They have added a new dimension to the social experience of PA in which online peers function as…

  11. Learning about Language and Learners from Computer Programs

    ERIC Educational Resources Information Center

    Cobb, Tom

    2010-01-01

    Making Nation's text analysis software accessible via the World Wide Web has opened up an exploration of how his learning principles can best be realized in practice. This paper discusses 3 representative episodes in the ongoing exploration. The first concerns an examination of the assumptions behind modeling what texts look like to learners with…

  12. The Use of Images in Intelligent Advisor Systems.

    ERIC Educational Resources Information Center

    Boulet, Marie-Michele

    This paper describes the intelligent advisor system, named CODAMA, used in teaching a university-level systems analysis and design course. The paper discusses: (1) the use of CODAMA to assist students to transfer theoretical knowledge to the practical; (2) details of how CODAMA is applied in conjunction with a computer-aided software engineering…

  13. Second Life as a Support Element for Learning Electronic Related Subjects: A Real Case

    ERIC Educational Resources Information Center

    Beltran Sierra, Luis M.; Gutierrez, Ronald S.; Garzon-Castro, Claudia L.

    2012-01-01

    Looking for more active and motivating methodological alternatives from the students' perspective, ones that promote analysis and investigation abilities, make the student a more participative agent and facilitate certain learning processes, a practical study was conducted in the University of La Sabana (Chia, Colombia), in Computing Engineering…

  14. Health adaptation policy for climate vulnerable groups: a 'critical computational linguistics' analysis.

    PubMed

    Seidel, Bastian M; Bell, Erica

    2014-11-28

    Many countries are developing or reviewing national adaptation policy for climate change, but the extent to which these policies meet the health needs of vulnerable groups has not been assessed. This study examines the adequacy of such policies for nine known climate-vulnerable groups: people with mental health conditions, Aboriginal people, culturally and linguistically diverse groups, aged people, people with disabilities, rural communities, children, women, and socioeconomically disadvantaged people. The study analyses an exhaustive sample of national adaptation policy documents from Annex 1 ('developed') countries of the United Nations Framework Convention on Climate Change: 20 documents from 12 countries. A 'critical computational linguistics' method was used involving novel software-driven quantitative mapping and traditional critical discourse analysis. The study finds that references to vulnerable groups are sparse or non-existent, and are poorly connected to language about practical strategies and socio-economic contexts, which are themselves also scarce. The conclusions offer strategies for developing policy that is better informed by a 'social determinants of health' definition of climate vulnerability, consistent with best practice in the literature and global policy prescriptions.

  15. Modeling Biodegradation and Reactive Transport: Analytical and Numerical Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sun, Y; Glascoe, L

    The computational modeling of the biodegradation of contaminated groundwater systems accounting for biochemical reactions coupled to contaminant transport is a valuable tool for both the field engineer/planner with limited computational resources and the expert computational researcher less constrained by time and computer power. There exist several analytical and numerical computer models that have been and are being developed to cover the practical needs put forth by users to fulfill this spectrum of computational demands. Generally, analytical models provide rapid and convenient screening tools running on very limited computational power, while numerical models can provide more detailed information with consequent requirements of greater computational time and effort. While these analytical and numerical computer models can provide accurate and adequate information to produce defensible remediation strategies, decisions based on inadequate modeling output or on over-analysis can have costly and risky consequences. In this chapter we consider both analytical and numerical modeling approaches to biodegradation and reactive transport. Both approaches are discussed and analyzed in terms of achieving bioremediation goals, recognizing that there is always a tradeoff between computational cost and the resolution of simulated systems.
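
    As a small, hedged illustration of the analytical-versus-numerical tradeoff (a generic steady-state advection-dispersion-decay problem with made-up parameters, not any model discussed in the chapter), a closed-form screening solution can be checked against a finite-difference solve:

      # Steady-state 1D transport with first-order decay: D*C'' - v*C' - k*C = 0,
      # C(0) = C0, C -> 0 far downstream.  Analytical screening solution vs. a
      # central-difference numerical solution on a truncated domain.
      import numpy as np

      D, v, k, C0, L, n = 1.0, 0.5, 0.2, 1.0, 40.0, 400
      x = np.linspace(0.0, L, n + 1)
      dx = x[1] - x[0]

      lam = (v - np.sqrt(v**2 + 4.0 * D * k)) / (2.0 * D)   # decaying root
      c_analytical = C0 * np.exp(lam * x)

      lower = D / dx**2 + v / (2.0 * dx)
      diag = -2.0 * D / dx**2 - k
      upper = D / dx**2 - v / (2.0 * dx)
      A = np.zeros((n - 1, n - 1))
      b = np.zeros(n - 1)
      for i in range(n - 1):
          A[i, i] = diag
          if i > 0:
              A[i, i - 1] = lower
          if i < n - 2:
              A[i, i + 1] = upper
      b[0] = -lower * C0                          # Dirichlet condition at x = 0
      c_numerical = np.concatenate(([C0], np.linalg.solve(A, b), [0.0]))

      print(np.max(np.abs(c_numerical - c_analytical)))    # small discretization error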

  16. Computer Activities for Persons With Dementia.

    PubMed

    Tak, Sunghee H; Zhang, Hongmei; Patel, Hetal; Hong, Song Hee

    2015-06-01

    The study examined participant's experience and individual characteristics during a 7-week computer activity program for persons with dementia. The descriptive study with mixed methods design collected 612 observational logs of computer sessions from 27 study participants, including individual interviews before and after the program. Quantitative data analysis included descriptive statistics, correlational coefficients, t-test, and chi-square. Content analysis was used to analyze qualitative data. Each participant averaged 23 sessions and 591min for 7 weeks. Computer activities included slide shows with music, games, internet use, and emailing. On average, they had a high score of intensity in engagement per session. Women attended significantly more sessions than men. Higher education level was associated with a higher number of different activities used per session and more time spent on online games. Older participants felt more tired. Feeling tired was significantly correlated with a higher number of weeks with only one session attendance per week. More anticholinergic medications taken by participants were significantly associated with a higher percentage of sessions with disengagement. The findings were significant at p < .05. Qualitative content analysis indicated tailoring computer activities appropriate to individual's needs and functioning is critical. All participants needed technical assistance. A framework for tailoring computer activities may provide guidance on developing and maintaining treatment fidelity of tailored computer activity interventions among persons with dementia. Practice guidelines and education protocols may assist caregivers and service providers to integrate computer activities into homes and aging services settings. © The Author 2015. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  17. Effects of Computer-Based Practice on the Acquisition and Maintenance of Basic Academic Skills for Children with Moderate to Intensive Educational Needs

    ERIC Educational Resources Information Center

    Everhart, Julie M.; Alber-Morgan, Sheila R.; Park, Ju Hee

    2011-01-01

    This study investigated the effects of computer-based practice on the acquisition and maintenance of basic academic skills for two children with moderate to intensive disabilities. The special education teacher created individualized computer games that enabled the participants to independently practice academic skills that corresponded with their…

  18. Fracture risk assessment: improved evaluation of vertebral integrity among metastatic cancer patients to aid in surgical decision-making

    NASA Astrophysics Data System (ADS)

    Augustine, Kurt E.; Camp, Jon J.; Holmes, David R.; Huddleston, Paul M.; Lu, Lichun; Yaszemski, Michael J.; Robb, Richard A.

    2012-03-01

    Failure of the spine's structural integrity from metastatic disease can lead to both pain and neurologic deficit. Fractures that require treatment occur in over 30% of bony metastases. Our objective is to use computed tomography (CT) in conjunction with analytic techniques that have been previously developed to predict fracture risk in cancer patients with metastatic disease to the spine. Current clinical practice for cancer patients with spine metastasis often requires an empirical decision regarding spinal reconstructive surgery. Early image-based software systems used for CT analysis are time consuming and poorly suited for clinical application. The Biomedical Image Resource (BIR) at Mayo Clinic, Rochester has developed an image analysis computer program that calculates, from CT scans, the residual load-bearing capacity of a vertebra with metastatic cancer. The Spine Cancer Assessment (SCA) program is built on a platform designed for clinical practice, with a workflow format that allows for rapid selection of patient CT exams, followed by guided image analysis tasks, resulting in a fracture risk report. The analysis features allow the surgeon to quickly isolate a single vertebra and obtain an immediate pre-surgical multiple parallel section composite beam fracture risk analysis based on algorithms developed at Mayo Clinic. The analysis software is undergoing clinical validation studies. We expect this approach will facilitate patient management and the use of reliable guidelines for selecting among various treatment options based on fracture risk.

  19. Cloud Fingerprinting: Using Clock Skews To Determine Co Location Of Virtual Machines

    DTIC Science & Technology

    2016-09-01

    Cloud computing has quickly revolutionized computing practices of organizations, to include the Department of Defense. However, security concerns...

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bakosi, Jozsef; Christon, Mark A.; Francois, Marianne M.

    Progress is reported on computational capabilities for the grid-to-rod-fretting (GTRF) problem of pressurized water reactors. Numeca's Hexpress/Hybrid mesh generator is demonstrated as an excellent alternative for generating computational meshes for complex flow geometries, such as in GTRF. Mesh assessment is carried out using standard industrial computational fluid dynamics practices. Hydra-TH, a simulation code developed at LANL for reactor thermal-hydraulics, is demonstrated on hybrid meshes, containing different element types. A series of new Hydra-TH calculations has been carried out collecting turbulence statistics. Preliminary results on the newly generated meshes are discussed; full analysis will be documented in the L3 milestone, THM.CFD.P5.05, Sept. 2012.

  1. A tutorial on the use of ROC analysis for computer-aided diagnostic systems.

    PubMed

    Scheipers, Ulrich; Perrey, Christian; Siebers, Stefan; Hansen, Christian; Ermert, Helmut

    2005-07-01

    The application of the receiver operating characteristic (ROC) curve for computer-aided diagnostic systems is reviewed. A statistical framework is presented and different methods of evaluating the classification performance of computer-aided diagnostic systems, and, in particular, systems for ultrasonic tissue characterization, are derived. Most classifiers that are used today are dependent on a separation threshold, which can be chosen freely in many cases. The separation threshold separates the range of output values of the classification system into different target groups, thus conducting the actual classification process. In the first part of this paper, threshold specific performance measures, e.g., sensitivity and specificity, are presented. In the second part, a threshold-independent performance measure, the area under the ROC curve, is reviewed. Only the use of separation threshold-independent performance measures provides classification results that are overall representative for computer-aided diagnostic systems. The following text was motivated by the lack of a complete and definite discussion of the underlying subject in available textbooks, references and publications. Most manuscripts published so far address the theme of performance evaluation using ROC analysis in a manner too general to be practical for everyday use in the development of computer-aided diagnostic systems. Nowadays, the user of computer-aided diagnostic systems typically handles huge amounts of numerical data, not always distributed normally. Many assumptions made in more or less theoretical works on ROC analysis are no longer valid for real-life data. The paper aims at closing the gap between theoretical works and real-life data. The review provides the interested scientist with information needed to conduct ROC analysis and to integrate algorithms performing ROC analysis into classification systems while understanding the basic principles of classification.
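
    A compact sketch of the two kinds of measures discussed above (simulated classifier outputs, not data from any diagnostic system): threshold-dependent sensitivity and specificity, and the threshold-independent area under the ROC curve computed via the Mann-Whitney statistic:

      import numpy as np

      rng = np.random.default_rng(42)
      scores_pos = rng.normal(1.0, 1.0, 200)     # classifier output, diseased cases
      scores_neg = rng.normal(0.0, 1.0, 300)     # classifier output, healthy cases

      def sens_spec(threshold):
          sensitivity = np.mean(scores_pos >= threshold)
          specificity = np.mean(scores_neg < threshold)
          return sensitivity, specificity

      print(sens_spec(0.5))                      # depends on the chosen threshold

      # AUC via the Mann-Whitney statistic: probability that a random positive
      # case scores higher than a random negative case (ties counted as 0.5)
      diff = scores_pos[:, None] - scores_neg[None, :]
      auc = np.mean((diff > 0) + 0.5 * (diff == 0))
      print(auc)                                 # threshold-independent summary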

  2. Technical Considerations on Scanning and Image Analysis for Amyloid PET in Dementia.

    PubMed

    Akamatsu, Go; Ohnishi, Akihito; Aita, Kazuki; Ikari, Yasuhiko; Yamamoto, Yasuji; Senda, Michio

    2017-01-01

    Brain imaging techniques, such as computed tomography (CT), magnetic resonance imaging (MRI), single photon emission computed tomography (SPECT), and positron emission tomography (PET), can provide essential and objective information for the early and differential diagnosis of dementia. Amyloid PET is especially useful to evaluate the amyloid-β pathological process as a biomarker of Alzheimer's disease. This article reviews critical points about technical considerations on the scanning and image analysis methods for amyloid PET. Each amyloid PET agent has its own proper administration instructions and recommended uptake time, scan duration, and the method of image display and interpretation. In addition, we have introduced general scanning information, including subject positioning, reconstruction parameters, and quantitative and statistical image analysis. We believe that this article could make amyloid PET a more reliable tool in clinical study and practice.

  3. Designing a Hydro-Economic Collaborative Computer Decision Support System: Approaches, Best Practices, Lessons Learned, and Future Trends

    NASA Astrophysics Data System (ADS)

    Rosenberg, D. E.

    2008-12-01

    Designing and implementing a hydro-economic computer model to support or facilitate collaborative decision making among multiple stakeholders or users can be challenging and daunting. Collaborative modeling is distinguished from and more difficult than non-collaborative efforts because of a large number of users with different backgrounds, disagreement or conflict among stakeholders regarding problem definitions, modeling roles, and analysis methods, plus evolving ideas of model scope and scale and needs for information and analysis as stakeholders interact, use the model, and learn about the underlying water system. This presentation reviews the lifecycle for collaborative model making and identifies some key design decisions that stakeholders and model developers must make to develop robust and trusted, verifiable and transparent, integrated and flexible, and ultimately useful models. It advances some best practices to implement and program these decisions. Among these best practices are 1) modular development of data-aware input, storage, manipulation, results recording and presentation components plus ways to couple and link to other models and tools, 2) explicitly structure both input data and the metadata that describes data sources, who acquired it, gaps, and modifications or translations made to put the data in a form usable by the model, 3) provide in-line documentation on model inputs, assumptions, calculations, and results plus ways for stakeholders to document their own model use and share results with others, and 4) flexibly program with graphical object-oriented properties and elements that allow users or the model maintainers to easily see and modify the spatial, temporal, or analysis scope as the collaborative process moves forward. We draw on examples of these best practices from the existing literature, the author's prior work, and some new applications just underway. The presentation concludes by identifying some future directions for collaborative modeling including geo-spatial display and analysis, real-time operations, and internet-based tools plus the design and programming needed to implement these capabilities.

  4. Large-scale neural circuit mapping data analysis accelerated with the graphical processing unit (GPU).

    PubMed

    Shi, Yulin; Veidenbaum, Alexander V; Nicolau, Alex; Xu, Xiangmin

    2015-01-15

    Modern neuroscience research demands computing power. Neural circuit mapping studies such as those using laser scanning photostimulation (LSPS) produce large amounts of data and require intensive computation for post hoc processing and analysis. Here we report on the design and implementation of a cost-effective desktop computer system for accelerated experimental data processing with recent GPU computing technology. A new version of Matlab software with GPU enabled functions is used to develop programs that run on Nvidia GPUs to harness their parallel computing power. We evaluated both the central processing unit (CPU) and GPU-enabled computational performance of our system in benchmark testing and practical applications. The experimental results show that the GPU-CPU co-processing of simulated data and actual LSPS experimental data clearly outperformed the multi-core CPU with up to a 22× speedup, depending on computational tasks. Further, we present a comparison of numerical accuracy between GPU and CPU computation to verify the precision of GPU computation. In addition, we show how GPUs can be effectively adapted to improve the performance of commercial image processing software such as Adobe Photoshop. To our best knowledge, this is the first demonstration of GPU application in neural circuit mapping and electrophysiology-based data processing. Together, GPU enabled computation enhances our ability to process large-scale data sets derived from neural circuit mapping studies, allowing for increased processing speeds while retaining data precision. Copyright © 2014 Elsevier B.V. All rights reserved.
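
    A hedged sketch of the general pattern (not the authors' Matlab/LSPS pipeline): offload a bulk array computation to the GPU with CuPy, then check timing and numerical agreement against the CPU result. It assumes a CUDA-capable GPU and an installed cupy package:

      import time
      import numpy as np
      import cupy as cp

      x_cpu = np.random.rand(2000, 2000).astype(np.float32)

      t0 = time.perf_counter()
      ref = x_cpu @ x_cpu.T                     # CPU reference computation
      t_cpu = time.perf_counter() - t0

      x_gpu = cp.asarray(x_cpu)                 # host -> device transfer
      t0 = time.perf_counter()
      out_gpu = x_gpu @ x_gpu.T                 # same computation on the GPU
      cp.cuda.Stream.null.synchronize()         # wait for the kernel to finish
      t_gpu = time.perf_counter() - t0

      print(f"CPU {t_cpu:.3f} s   GPU {t_gpu:.3f} s")
      print(np.allclose(ref, cp.asnumpy(out_gpu), rtol=1e-3))   # precision check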

  5. Large scale neural circuit mapping data analysis accelerated with the graphical processing unit (GPU)

    PubMed Central

    Shi, Yulin; Veidenbaum, Alexander V.; Nicolau, Alex; Xu, Xiangmin

    2014-01-01

    Background Modern neuroscience research demands computing power. Neural circuit mapping studies such as those using laser scanning photostimulation (LSPS) produce large amounts of data and require intensive computation for post-hoc processing and analysis. New Method Here we report on the design and implementation of a cost-effective desktop computer system for accelerated experimental data processing with recent GPU computing technology. A new version of Matlab software with GPU enabled functions is used to develop programs that run on Nvidia GPUs to harness their parallel computing power. Results We evaluated both the central processing unit (CPU) and GPU-enabled computational performance of our system in benchmark testing and practical applications. The experimental results show that the GPU-CPU co-processing of simulated data and actual LSPS experimental data clearly outperformed the multi-core CPU with up to a 22x speedup, depending on computational tasks. Further, we present a comparison of numerical accuracy between GPU and CPU computation to verify the precision of GPU computation. In addition, we show how GPUs can be effectively adapted to improve the performance of commercial image processing software such as Adobe Photoshop. Comparison with Existing Method(s) To our best knowledge, this is the first demonstration of GPU application in neural circuit mapping and electrophysiology-based data processing. Conclusions Together, GPU enabled computation enhances our ability to process large-scale data sets derived from neural circuit mapping studies, allowing for increased processing speeds while retaining data precision. PMID:25277633

  6. An assessment of finite-element modeling techniques for thick-solid/thin-shell joints analysis

    NASA Technical Reports Server (NTRS)

    Min, J. B.; Androlake, S. G.

    1993-01-01

    The subject of finite-element modeling has long been of critical importance to the practicing designer/analyst who is often faced with obtaining an accurate and cost-effective structural analysis of a particular design. Typically, these two goals are in conflict. The purpose here is to discuss finite-element modeling for solid/shell connections (joints), which are significant for the practicing modeler. Several approaches are currently in use, but various assumptions frequently restrict their application. The techniques currently used in practical applications were tested, especially to see which is best suited to the computer-aided design (CAD) environment. Some basic thoughts regarding each technique are also discussed. Based on the results, suggestions are given for obtaining reliable results in geometrically complex joints where the deformation and stress behavior are complicated.

  7. A PRACTICAL ONTOLOGY FOR THE LARGE-SCALE MODELING OF SCHOLARLY ARTIFACTS AND THEIR USAGE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    RODRIGUEZ, MARKO A.; BOLLEN, JOHAN; VAN DE SOMPEL, HERBERT

    2007-01-30

    The large-scale analysis of scholarly artifact usage is constrained primarily by current practices in usage data archiving, privacy issues concerned with the dissemination of usage data, and the lack of a practical ontology for modeling the usage domain. As a remedy to the third constraint, this article presents a scholarly ontology that was engineered to represent those classes for which large-scale bibliographic and usage data exists, supports usage research, and whose instantiation is scalable to the order of 50 million articles along with their associated artifacts (e.g. authors and journals) and an accompanying 1 billion usage events. The real-world instantiation of the presented abstract ontology is a semantic network model of the scholarly community which makes the scholarly process amenable to statistical analysis and computational support. The authors present the ontology, discuss its instantiation, and provide some example inference rules for calculating various scholarly artifact metrics.

  8. Composite theory applied to elastomers

    NASA Technical Reports Server (NTRS)

    Clark, S. K.

    1986-01-01

    Reinforced elastomers form the basis for most of the structural or load carrying applications of rubber products. Computer based structural analysis in the form of finite element codes was highly successful in refining structural design in both isotropic materials and rigid composites. This has led the rubber industry to attempt to make use of such techniques in the design of structural cord-rubber composites. While such efforts appear promising, they were not easy to achieve for several reasons. Among these is a distinct lack of a clearly defined set of material property descriptors suitable for computer analysis. There are substantial differences between conventional steel, aluminum, or even rigid composites such as graphite-epoxy, and textile-cord reinforced rubber. These differences, which are both conceptual and practical, are discussed.

  9. Use of Failure in IS Development Statistics: Lessons for IS Curriculum Design

    ERIC Educational Resources Information Center

    Longenecker, Herbert H., Jr.; Babb, Jeffry; Waguespack, Leslie; Tastle, William; Landry, Jeff

    2016-01-01

    The evolution of computing education reflects the history of the professional practice of computing. Keeping computing education current has been a major challenge due to the explosive advances in technologies. Academic programs in Information Systems, a long-standing computing discipline, develop and refine the theory and practice of computing…

  10. Computer conferencing: Choices and strategies

    NASA Technical Reports Server (NTRS)

    Smith, Jill Y.

    1991-01-01

    Computer conferencing permits meeting through the computer while sharing a common file. The primary advantages of computer conferencing are that participants may (1) meet simultaneously or nonsimultaneously, and (2) contribute across geographic distance and time zones. Due to these features, computer conferencing offers a viable meeting option for distributed business teams. Past research and practice is summarized denoting practical uses of computer conferencing as well as types of meeting activities ill suited to the medium. Additionally, effective team strategies are outlined which maximize the benefits of computer conferencing.

  11. Qualitative data analysis: conceptual and practical considerations.

    PubMed

    Liamputtong, Pranee

    2009-08-01

    Qualitative inquiry requires that collected data is organised in a meaningful way, and this is referred to as data analysis. Through analytic processes, researchers turn what can be voluminous data into understandable and insightful analysis. This paper sets out the different approaches that qualitative researchers can use to make sense of their data including thematic analysis, narrative analysis, discourse analysis and semiotic analysis and discusses the ways that qualitative researchers can analyse their data. I first discuss salient issues in performing qualitative data analysis, and then proceed to provide some suggestions on different methods of data analysis in qualitative research. Finally, I provide some discussion on the use of computer-assisted data analysis.

  12. A Crafts-Oriented Approach to Computing in High School: Introducing Computational Concepts, Practices, and Perspectives with Electronic Textiles

    ERIC Educational Resources Information Center

    Kafai, Yasmin B.; Lee, Eunkyoung; Searle, Kristin; Fields, Deborah; Kaplan, Eliot; Lui, Debora

    2014-01-01

    In this article, we examine the use of electronic textiles (e-textiles) for introducing key computational concepts and practices while broadening perceptions about computing. The starting point of our work was the design and implementation of a curriculum module using the LilyPad Arduino in a pre-AP high school computer science class. To…

  13. Learning Together; part 2: training costs and health gain - a cost analysis.

    PubMed

    Cullen, Katherine; Riches, Wendy; Macaulay, Chloe; Spicer, John

    2017-01-01

    Learning Together is a complex educational intervention aimed at improving health outcomes for children and young people. There is an additional cost as two doctors are seeing patients together for a longer appointment than a standard general practice (GP) appointment. Our approach combines the impact of the training clinics on activity in South London in 2014-15 with health gain, using NICE guidance and standards to allow comparison of training options. Activity data was collected from Training Practices hosting Learning Together. A computer based model was developed to analyse the costs of the Learning Together intervention compared to usual training in a partial economic evaluation. The results of the model were used to value the health gain required to make the intervention cost effective. Data were returned for 363 patients booked into 61 clinics across 16 Training Practices. Learning Together clinics resulted in an increase in costs of £37 per clinic. Threshold analysis illustrated one child with a common illness like constipation needs to be well for two weeks, in one Practice hosting four training clinics for the clinics to be considered cost effective. Learning Together is of minimal training cost. Our threshold analysis produced a rubric that can be used locally to test cost effectiveness at a Practice or Programme level.

  14. Accelerating Large Scale Image Analyses on Parallel, CPU-GPU Equipped Systems

    PubMed Central

    Teodoro, George; Kurc, Tahsin M.; Pan, Tony; Cooper, Lee A.D.; Kong, Jun; Widener, Patrick; Saltz, Joel H.

    2014-01-01

    The past decade has witnessed a major paradigm shift in high performance computing with the introduction of accelerators as general purpose processors. These computing devices make available very high parallel computing power at low cost and power consumption, transforming current high performance platforms into heterogeneous CPU-GPU equipped systems. Although the theoretical performance achieved by these hybrid systems is impressive, taking practical advantage of this computing power remains a very challenging problem. Most applications are still deployed to either GPU or CPU, leaving the other resource under- or un-utilized. In this paper, we propose, implement, and evaluate a performance aware scheduling technique along with optimizations to make efficient collaborative use of CPUs and GPUs on a parallel system. In the context of feature computations in large scale image analysis applications, our evaluations show that intelligently co-scheduling CPUs and GPUs can significantly improve performance over GPU-only or multi-core CPU-only approaches. PMID:25419545

  15. From bed to bench: bridging from informatics practice to theory: an exploratory analysis.

    PubMed

    Haux, R; Lehmann, C U

    2014-01-01

    In 2009, Applied Clinical Informatics (ACI)--focused on applications in clinical informatics--was launched as a companion journal to Methods of Information in Medicine (MIM). Both journals are official journals of the International Medical Informatics Association. To explore which congruencies and interdependencies exist in publications from theory to practice and from practice to theory and to determine existing gaps. Major topics discussed in ACI and MIM were analyzed. We explored if the intention of publishing companion journals to provide an information bridge from informatics theory to informatics practice and vice versa could be supported by this model. In this manuscript we will report on congruencies and interdependences from practice to theory and on major topics in MIM. Retrospective, prolective observational study on recent publications of ACI and MIM. All publications of the years 2012 and 2013 were indexed and analyzed. Hundred and ninety-six publications were analyzed (ACI 87, MIM 109). In MIM publications, modelling aspects as well as methodological and evaluation approaches for the analysis of data, information, and knowledge in biomedicine and health care were frequently raised - and often discussed from an interdisciplinary point of view. Important themes were ambient-assisted living, anatomic spatial relations, biomedical informatics as scientific discipline, boosting, coding, computerized physician order entry, data analysis, grid and cloud computing, health care systems and services, health-enabling technologies, health information search, health information systems, imaging, knowledge-based decision support, patient records, signal analysis, and web science. Congruencies between journals could be found in themes, but with a different focus on content. Interdependencies from practice to theory, found in these publications, were only limited. Bridging from informatics theory to practice and vice versa remains a major component of successful research and practice as well as a major challenge.

  16. Stigma of People with Epilepsy in China: Views of health professionals, teachers, employers and community leaders

    PubMed Central

    Yang, Rongrong; Wang, Wenzhi; Snape, Dee; Chen, Gong; Zhang, Lei; Wu, Jianzhong; Baker, Gus A; Zheng, Xiaoying; Jacoby, Ann

    2011-01-01

    To identify the possible sources of stigma of epilepsy in key informant groups, “mini-ethnographic” studies were conducted in rural and urban locations in China. Data from 45 semi-structured interviews and 8 focus group discussions (6 persons each) were analysed to investigate the world experienced by people with epilepsy. Underpinned by a social constructionist approach to data analysis, emerging themes were identified with the use of computer-assisted data analysis (NVivo 8). A hierarchical model was then constructed, to include: Practical Level issues: attitudes to risk, attitudes towards costs of epilepsy; and Cultural Level issues: the contrast between rurality and tradition and urbanization and modernity in the Chinese context. The analysis enriches current research on factors and sources of stigma of epilepsy and highlights issues for future practice. PMID:21606005

  17. Aircraft optimization by a system approach: Achievements and trends

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw

    1992-01-01

    Recently emerging methodology for optimal design of aircraft treated as a system of interacting physical phenomena and parts is examined. The methodology is found to coalesce into methods for hierarchic, non-hierarchic, and hybrid systems all dependent on sensitivity analysis. A separate category of methods has also evolved independent of sensitivity analysis, hence suitable for discrete problems. References and numerical applications are cited. Massively parallel computer processing is seen as enabling technology for practical implementation of the methodology.

  18. A Lumped Computational Model for Sodium Sulfur Battery Analysis

    NASA Astrophysics Data System (ADS)

    Wu, Fan

    Due to the cost of materials and time-consuming testing procedures, development of new batteries is a slow and expensive practice. The purpose of this study is to develop a computational model and assess the capabilities of such a model designed to aid in the design process and control of sodium sulfur batteries. To this end, a transient lumped computational model derived from an integral analysis of the transport of species, energy and charge throughout the battery has been developed. The computation processes are coupled with the use of Faraday's law, and solutions for the species concentrations, electrical potential and current are produced in a time marching fashion. Properties required for solving the governing equations are calculated and updated as a function of time based on the composition of each control volume. The proposed model is validated against multi-dimensional simulations and experimental results from the literature, and simulation results using the proposed model are presented and analyzed. The computational model and electrochemical model used to solve the equations for the lumped model are compared with similar ones found in the literature. The results obtained from the current model compare favorably with those from experiments and other models.
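
    A hedged sketch of the time-marching coupling described above (one lumped control volume, constant current, illustrative numbers; not the paper's model): Faraday's law converts the applied current into moles of sodium consumed per time step:

      F = 96485.0        # C/mol, Faraday constant
      I = 30.0           # A, assumed constant discharge current
      dt = 1.0           # s, time step
      n_na = 5.0         # mol, assumed sodium inventory in the control volume

      t = 0.0
      while n_na > 0.0 and t < 3600.0:
          n_na -= I * dt / F        # Faraday's law: one electron per sodium atom
          t += dt

      print(f"sodium remaining after {t:.0f} s at {I:.0f} A: {n_na:.3f} mol")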

  19. Privacy-preserving GWAS analysis on federated genomic datasets.

    PubMed

    Constable, Scott D; Tang, Yuzhe; Wang, Shuang; Jiang, Xiaoqian; Chapin, Steve

    2015-01-01

    The biomedical community benefits from the increasing availability of genomic data to support meaningful scientific research, e.g., Genome-Wide Association Studies (GWAS). However, high quality GWAS usually requires a large amount of samples, which can grow beyond the capability of a single institution. Federated genomic data analysis holds the promise of enabling cross-institution collaboration for effective GWAS, but it raises concerns about patient privacy and medical information confidentiality (as data are being exchanged across institutional boundaries), which becomes an inhibiting factor for the practical use. We present a privacy-preserving GWAS framework on federated genomic datasets. Our method is to layer the GWAS computations on top of secure multi-party computation (MPC) systems. This approach allows two parties in a distributed system to mutually perform secure GWAS computations, but without exposing their private data outside. We demonstrate our technique by implementing a framework for minor allele frequency counting and χ2 statistics calculation, one of typical computations used in GWAS. For efficient prototyping, we use a state-of-the-art MPC framework, i.e., Portable Circuit Format (PCF) 1. Our experimental results show promise in realizing both efficient and secure cross-institution GWAS computations.
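
    For concreteness, the sketch below shows the statistic itself in plaintext (illustrative genotype counts; the secure multi-party computation layer that is the paper's contribution is not reproduced here): minor allele frequencies and the allelic chi-square test from case/control counts at one SNP:

      import numpy as np

      # genotype counts [AA, Aa, aa]
      cases    = np.array([220, 210,  70])
      controls = np.array([260, 190,  50])

      def allele_counts(g):
          # number of reference (A) and alternate (a) alleles
          return np.array([2 * g[0] + g[1], g[1] + 2 * g[2]])

      obs = np.vstack([allele_counts(cases), allele_counts(controls)]).astype(float)
      maf = obs[:, 1] / obs.sum(axis=1)
      print("minor allele frequency (cases, controls):", maf)

      # 2x2 chi-square statistic of allele counts vs. phenotype
      expected = obs.sum(axis=1, keepdims=True) * obs.sum(axis=0) / obs.sum()
      chi2 = ((obs - expected) ** 2 / expected).sum()
      print("allelic chi-square:", chi2)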

  20. Effect of yoga on musculoskeletal discomfort and motor functions in professional computer users.

    PubMed

    Telles, Shirley; Dash, Manoj; Naveen, K V

    2009-01-01

    The self-rated musculoskeletal discomfort, hand grip strength, tapping speed, and low back and hamstring flexibility (based on a sit and reach task) were assessed in 291 professional computer users. They were then randomized as Yoga (YG; n=146) and Wait-list control (WL; n=145) groups. Follow-up assessments for both groups were after 60 days during which the YG group practiced yoga for 60 minutes daily, for 5 days in a week. The WL group spent the same time in their usual recreational activities. At the end of 60 days, the YG group (n=62) showed a significant decrease in the frequency, intensity and degree of interference due to musculoskeletal discomfort, an increase in bilateral hand grip strength, the right hand tapping speed, and low back and hamstring flexibility (repeated measures ANOVA and post hoc analysis with Bonferroni adjustment). In contrast, the WL group (n=56) showed an increase in musculoskeletal discomfort and a decrease in left hand tapping speed. The results suggest that yoga practice is a useful addition to the routine of professional computer users.

  1. Exact posterior computation in non-conjugate Gaussian location-scale parameters models

    NASA Astrophysics Data System (ADS)

    Andrade, J. A. A.; Rathie, P. N.

    2017-12-01

    In Bayesian analysis the class of conjugate models makes it possible to obtain exact posterior distributions; however, this class is quite restrictive in the sense that it involves only a few distributions. In fact, most practical applications involve non-conjugate models, so approximate methods, such as MCMC algorithms, are required. Although these methods can deal with quite complex structures, some practical problems can make their application very time-consuming: for example, when heavy-tailed distributions are used, convergence may be difficult and the Metropolis-Hastings algorithm can become very slow, in addition to the extra work inevitably required to choose efficient candidate generator distributions. In this work, we draw attention to special functions as tools for Bayesian computation and propose an alternative method for obtaining the posterior distribution in non-conjugate Gaussian models in exact form. We use complex integration methods based on the H-function in order to obtain the posterior distribution and some of its posterior quantities in an explicit, computable form. Two examples are provided in order to illustrate the theory.
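
    The H-function machinery itself is beyond a short sketch, but as a point of contrast with MCMC, the kind of non-conjugate Gaussian model in question can also be handled by brute-force quadrature on a toy example (illustrative data and prior; this is not the paper's method):

      import numpy as np
      from scipy import stats
      from scipy.integrate import quad

      y = np.array([1.2, 0.7, 1.9, 1.1, 0.4])     # observed data (assumed)
      sigma = 1.0                                  # known scale

      def unnormalized(mu):
          like = np.prod(stats.norm.pdf(y, loc=mu, scale=sigma))
          prior = stats.t.pdf(mu, df=3)            # heavy-tailed, non-conjugate prior
          return like * prior

      Z, _ = quad(unnormalized, -20, 20)           # normalizing constant
      post_mean, _ = quad(lambda m: m * unnormalized(m) / Z, -20, 20)
      print(f"posterior mean of mu: {post_mean:.3f}")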

  2. Integrating the Apache Big Data Stack with HPC for Big Data

    NASA Astrophysics Data System (ADS)

    Fox, G. C.; Qiu, J.; Jha, S.

    2014-12-01

    There is perhaps a broad consensus as to important issues in practical parallel computing as applied to large scale simulations; this is reflected in supercomputer architectures, algorithms, libraries, languages, compilers and best practice for application development. However, the same is not so true for data intensive computing, even though commercial clouds devote much more resources to data analytics than supercomputers devote to simulations. We look at a sample of over 50 big data applications to identify characteristics of data intensive applications and to deduce needed runtime and architectures. We suggest a big data version of the famous Berkeley dwarfs and NAS parallel benchmarks and use these to identify a few key classes of hardware/software architectures. Our analysis builds on combining HPC with ABDS, the Apache big data software stack that is widely used in modern cloud computing. Initial results on clouds and HPC systems are encouraging. We propose the development of SPIDAL - Scalable Parallel Interoperable Data Analytics Library - built on system and data abstractions suggested by the HPC-ABDS architecture. We discuss how it can be used in several application areas including Polar Science.

  3. Design of a receiver operating characteristic (ROC) study of 10:1 lossy image compression

    NASA Astrophysics Data System (ADS)

    Collins, Cary A.; Lane, David; Frank, Mark S.; Hardy, Michael E.; Haynor, David R.; Smith, Donald V.; Parker, James E.; Bender, Gregory N.; Kim, Yongmin

    1994-04-01

    The digital archiving system at Madigan Army Medical Center (MAMC) uses a 10:1 lossy data compression algorithm for most forms of computed radiography. A systematic study on the potential effect of lossy image compression on patient care has been initiated with a series of studies focused on specific diagnostic tasks. The studies are based upon the receiver operating characteristic (ROC) method of analysis for diagnostic systems. The null hypothesis is that observer performance with approximately 10:1 compressed and decompressed images is not different from using original, uncompressed images for detecting subtle pathologic findings seen on computed radiographs of bone, chest, or abdomen, when viewed on a high-resolution monitor. Our design involves collecting cases from eight pathologic categories. Truth is determined by committee using confirmatory studies performed during routine clinical practice whenever possible. Software has been developed to aid in case collection and to allow reading of the cases for the study using stand-alone Siemens Litebox workstations. Data analysis uses two methods, ROC analysis and free-response ROC (FROC) methods. This study will be one of the largest ROC/FROC studies of its kind and could benefit clinical radiology practice using PACS technology. The study design and results from a pilot FROC study are presented.

  4. A Computer Program for Practical Semivariogram Modeling and Ordinary Kriging: A Case Study of Porosity Distribution in an Oil Field

    NASA Astrophysics Data System (ADS)

    Mert, Bayram Ali; Dag, Ahmet

    2017-12-01

    In this study, a practical and educational geostatistical program (JeoStat) was first developed, and an example analysis of porosity distribution using oilfield data is then presented. With this program, two- or three-dimensional variogram analysis can be performed using normal, log-normal or indicator-transformed data. In these analyses, JeoStat offers users seven commonly used theoretical variogram models (Spherical, Gaussian, Exponential, Linear, Generalized Linear, Hole Effect and Paddington Mix). These theoretical models can be easily and quickly fitted to the experimental variograms using a mouse. JeoStat uses the ordinary kriging interpolation technique for computing point or block estimates, and cross-validation tests for validating the fitted theoretical model. All the results obtained by the analysis, as well as graphics such as histograms, variograms and kriging estimation maps, can be saved to the hard drive, and the numerical values of any point in a map can be inspected using the mouse and text boxes. This program is available free of charge to students, researchers, consultants and corporations of any size. The JeoStat software package and source codes are available at: http://www.jeostat.com/JeoStat_2017.0.rar.
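
    A hedged sketch of one step that JeoStat automates (not JeoStat's source code): least-squares fitting of the spherical model to an experimental semivariogram, with illustrative lag distances and values:

      import numpy as np
      from scipy.optimize import curve_fit

      def spherical(h, nugget, sill, a):
          # spherical semivariogram: rises as 1.5(h/a) - 0.5(h/a)^3, flat beyond range a
          h = np.asarray(h, dtype=float)
          g = nugget + sill * (1.5 * h / a - 0.5 * (h / a) ** 3)
          return np.where(h < a, g, nugget + sill)

      lags  = np.array([50, 100, 150, 200, 250, 300, 350, 400.0])
      gamma = np.array([0.9, 1.6, 2.1, 2.5, 2.7, 2.8, 2.8, 2.85])  # experimental values

      params, _ = curve_fit(spherical, lags, gamma, p0=[0.5, 2.0, 300.0],
                            bounds=(0, np.inf))
      nugget, psill, a_range = params
      print(f"nugget {nugget:.2f}, partial sill {psill:.2f}, range {a_range:.0f}")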

  5. The Information Needs of Practicing Physicians in Northeastern New York State*

    PubMed Central

    Strasser, Theresa C.

    2012-01-01

    The information needs of practicing physicians in seventeen counties of upstate New York were surveyed by questionnaire. A 45.6% response, or 258 usable replies, was obtained. Computer-aided market analysis indicated that the areas of greatest need for improved information were new developments in specialties and government regulations relating to health care. Sources most frequently used were journal papers, colleagues, and books, in that order. Specialty-related differences occurred with both specific information needs and source use. Degree date, geographical location, and type of practice (hospital, nonhospital, private, and so on), and involvement in research or education were also analyzed in relation to information needs and sources. Implications for library service are discussed. PMID:23509429

  6. Providing Practical Applications of Computer Technology for Fifth Grade Students in Career Awareness Laboratories.

    ERIC Educational Resources Information Center

    Pereno, Joan S.

    This practicum addressed the problem of providing practical computer application experiences to fifth grade students as they relate to real life work situations. The primary goal was to have students become cognizant of computer functions within the work setting as contrasted with viewing computer activities as instruments used for games or…

  7. Neurostimulation options for failed back surgery syndrome: The need for rational and objective measurements. Proposal of an international clinical network using an integrated database and health economic analysis: the PROBACK network.

    PubMed

    Rigoard, P; Slavin, K

    2015-03-01

    In the context of failed back surgery syndrome (FBSS) treatment, the current practice in neurostimulation varies from center-to-center and most clinical decisions are based on an individual diagnosis. Neurostimulation evaluation tools and pain relief assessment are of major concern, as they now constitute one of the main biases of clinical trials. Moreover, the proliferation of technological devices, in a fertile and unsatisfied market, fosters and only furthers the confusion. There are three options available to apply scientific debates to our daily neurostimulation practice: intentional ignorance, standardized evidence-based practice or alternative data mining approach. In view of the impossibility of conducting multiple randomized clinical trials comparing various devices, one by one, the proposed concept would be to redefine the indications and the respective roles of the various spinal cord and peripheral nerve stimulation devices with large-scale computational modeling/data mining approach, by conducting a multicenter prospective database registry, supported by a clinician's global network called "PROBACK". We chose to specifically analyze 6 parameters: device coverage performance/coverage selectivity/persistence of the long-term electrical response (technical criteria) and comparative mapping of patient pain relief/persistence of the long-term clinical response/safety and complications occurrence (clinical criteria). Two types of analysis will be performed: immediate analysis (including cost analysis) and computational analysis, i.e. demonstration of the robustness of certain correlations of variables, in order to extract response predictors. By creating an international prospective database, the purpose of the PROBACK project was to set up a process of extraction and comparative analysis of data derived from the selection, implantation and follow-up of FBSS patients candidates for implanted neurostimulation. This evaluation strategy should help to change the opinions of each implanter and each health system towards a more rational decision-making approach subtended by mathematical reality. Copyright © 2014 Elsevier Masson SAS. All rights reserved.

  8. Effect of Spray Cone Angle on Flame Stability in an Annular Gas Turbine Combustor

    NASA Astrophysics Data System (ADS)

    Mishra, R. K.; Kumar, S. Kishore; Chandel, Sunil

    2016-04-01

    The effect of fuel spray cone angle in an aero gas turbine combustor has been studied using computational fluid dynamics (CFD) and full-scale combustor testing. For the CFD analysis, a 22.5° sector of an annular combustor is modeled and the governing equations are solved using the eddy dissipation combustion model in the ANSYS CFX computational package. The analysis has been carried out at 125 kPa and 303 K inlet conditions for spray cone angles from 60° to 140°. The lean blowout limits are established by studying the behavior of the combustion zone during transient engine operation from an initial steady-state condition. The computational study has been followed by testing of the practical full-scale annular combustor in an aerothermal test facility. The experimental results are in good agreement with the computational predictions. The lean blowout fuel-air ratio increases as the spray cone angle is decreased at constant operating pressure and temperature. At higher spray cone angles, the flame and the high-temperature zone move upstream close to the atomizer face and a uniform flame is sustained over a wide region, giving better flame stability.

  9. Product or waste? Importation and end-of-life processing of computers in Peru.

    PubMed

    Kahhat, Ramzy; Williams, Eric

    2009-08-01

    This paper considers the importation of used personal computers (PCs) in Peru and domestic practices in their production, reuse, and end-of-life processing. The empirical pillars of this study are analysis of government data describing trade in used and new computers and surveys and interviews of computer sellers, refurbishers, and recyclers. The United States is the primary source of used PCs imported to Peru. Analysis of shipment value (as measured by trade statistics) shows that 87-88% of imported used computers had a price higher than the ideal recycle value of constituent materials. The official trade in end-of-life computers is thus driven by reuse as opposed to recycling. The domestic reverse supply chain of PCs is well developed with extensive collection, reuse, and recycling. Environmental problems identified include open burning of copper-bearing wires to remove insulation and landfilling of CRT glass. Distinct from informal recycling in China and India, printed circuit boards are usually not recycled domestically but exported to Europe for advanced recycling or to China for (presumably) informal recycling. It is notable that purely economic considerations lead to circuit boards being exported to Europe where environmental standards are stringent, presumably due to higher recovery of precious metals.

  10. A Review of Methods for Analysis of the Expected Value of Information.

    PubMed

    Heath, Anna; Manolopoulou, Ioanna; Baio, Gianluca

    2017-10-01

    In recent years, value-of-information analysis has become more widespread in health economic evaluations, specifically as a tool to guide further research and perform probabilistic sensitivity analysis. This is partly due to methodological advancements allowing for the fast computation of a typical summary known as the expected value of partial perfect information (EVPPI). A recent review discussed some approximation methods for calculating the EVPPI, but as the research has been active over the intervening years, that review does not discuss some key estimation methods. Therefore, this paper presents a comprehensive review of these new methods. We begin by providing the technical details of these computation methods. We then present two case studies in order to compare the estimation performance of these new methods. We conclude that a method based on nonparametric regression offers the best combination of accuracy, computational time, and ease of implementation for calculating the EVPPI. This means that the EVPPI can now be used practically in health economic evaluations, especially as all the methods are developed in parallel with R functions and a web app to aid practitioners.
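
    As a concrete illustration of the regression-based approach the review favours, the sketch below estimates the EVPPI from hypothetical probabilistic sensitivity analysis (PSA) samples. The two-strategy decision model, its parameter values, and the use of a simple polynomial fit (in place of the GAM or Gaussian-process regressions used in the literature) are illustrative assumptions, not the authors' implementation.

    ```python
    # Hedged sketch: regression-based EVPPI estimation from synthetic PSA samples.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 10_000
    wtp = 20_000                                   # willingness-to-pay threshold

    # Hypothetical PSA samples: theta is the parameter of (partial) interest,
    # the remaining inputs are nuisance parameters of a two-strategy model.
    theta = rng.normal(0.5, 0.2, n)                # uncertain QALY gain of new treatment
    usage = rng.normal(1.0, 0.2, n)                # nuisance multiplier (e.g., uptake)
    cost = rng.gamma(16.0, 625.0, n)               # uncertain incremental cost (mean 10,000)

    nb = np.column_stack([np.zeros(n),             # net benefit of standard care (reference)
                          wtp * theta * usage - cost])   # net benefit of new treatment

    ev_current = nb.mean(axis=0).max()             # value of a decision made now

    # Regress each strategy's net benefit on theta; the fitted values
    # approximate the conditional expectation E[NB | theta].
    fitted = np.empty_like(nb)
    for d in range(nb.shape[1]):
        coef = np.polyfit(theta, nb[:, d], deg=3)
        fitted[:, d] = np.polyval(coef, theta)

    # EVPPI = E_theta[ max_d E(NB_d | theta) ] - max_d E(NB_d)
    evppi = fitted.max(axis=1).mean() - ev_current
    print(f"EVPPI per patient: {evppi:,.0f} monetary units")
    ```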

  11. Frequency modulation television analysis: Threshold impulse analysis. [with computer program

    NASA Technical Reports Server (NTRS)

    Hodge, W. H.

    1973-01-01

    A computer program is developed to calculate the FM threshold impulse rates as a function of the carrier-to-noise ratio for a specified FM system. The system parameters and a vector of 1024 integers, representing the probability density of the modulating voltage, are required as input parameters. The computer program is utilized to calculate threshold impulse rates for twenty-four sets of measured probability data supplied by NASA and for sinusoidal and Gaussian modulating waveforms. As a result of the analysis several conclusions are drawn: (1) The use of preemphasis in an FM television system improves the threshold by reducing the impulse rate. (2) Sinusoidal modulation produces a total impulse rate which is a practical upper bound for the impulse rates of TV signals providing the same peak deviations. (3) As the moment of the FM spectrum about the center frequency of the predetection filter increases, the impulse rate tends to increase. (4) A spectrum having an expected frequency above (below) the center frequency of the predetection filter produces a higher negative (positive) than positive (negative) impulse rate.

  12. Steady shape analysis of tomographic pumping tests for characterization of aquifer heterogeneities

    USGS Publications Warehouse

    Bohling, Geoffrey C.; Zhan, Xiaoyong; Butler, James J.; Zheng, Li

    2002-01-01

    Hydraulic tomography, a procedure involving the performance of a suite of pumping tests in a tomographic format, provides information about variations in hydraulic conductivity at a level of detail not obtainable with traditional well tests. However, analysis of transient data from such a suite of pumping tests represents a substantial computational burden. Although steady state responses can be analyzed to reduce this computational burden significantly, the time required to reach steady state will often be too long for practical applications of the tomography concept. In addition, uncertainty regarding the mechanisms driving the system to steady state can propagate to adversely impact the resulting hydraulic conductivity estimates. These disadvantages of a steady state analysis can be overcome by exploiting the simplifications possible under the steady shape flow regime. At steady shape conditions, drawdown varies with time but the hydraulic gradient does not. Thus transient data can be analyzed with the computational efficiency of a steady state model. In this study, we demonstrate the value of the steady shape concept for inversion of hydraulic tomography data and investigate its robustness with respect to improperly specified boundary conditions.
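
    The sketch below illustrates the steady-shape idea on synthetic Theis drawdowns at two hypothetical observation wells: drawdown keeps growing, but the drawdown difference between the wells (a proxy for the hydraulic gradient) levels off, marking the window that can be analyzed with steady-state efficiency. The aquifer parameters and the 1% tolerance are arbitrary assumptions, not values from the study.

    ```python
    # Hedged sketch: detect the steady-shape window from synthetic drawdown data.
    import numpy as np
    from scipy.special import exp1                 # exponential integral E1 (Theis well function)

    Q, T, S = 1.0e-3, 5.0e-4, 1.0e-4               # pumping rate, transmissivity, storativity
    r1, r2 = 5.0, 20.0                             # radial distances of the two wells (m)
    t = np.logspace(1, 5, 200)                     # time since pumping began (s)

    def theis(r, t):
        u = r**2 * S / (4.0 * T * t)
        return Q / (4.0 * np.pi * T) * exp1(u)

    s1, s2 = theis(r1, t), theis(r2, t)
    diff = s1 - s2                                 # drawdown difference between the wells

    # Steady shape: the difference changes negligibly per log-cycle of time,
    # even though s1 and s2 themselves keep increasing.
    rel_change = np.abs(np.gradient(diff, np.log(t))) / diff
    steady_shape = rel_change < 0.01
    print("steady shape reached at t ~ %.0f s" % t[np.argmax(steady_shape)])
    ```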

  13. Correcting for Indirect Range Restriction in Meta-Analysis: Testing a New Meta-Analytic Procedure

    ERIC Educational Resources Information Center

    Le, Huy; Schmidt, Frank L.

    2006-01-01

    Using computer simulation, the authors assessed the accuracy of J. E. Hunter, F. L. Schmidt, and H. Le's (2006) procedure for correcting for indirect range restriction, the most common type of range restriction, in comparison with the conventional practice of applying the Thorndike Case II correction for direct range restriction. Hunter et…

  14. Online Reading Practices of Students Who Are Deaf/Hard of Hearing

    ERIC Educational Resources Information Center

    Donne, Vicki; Rugg, Natalie

    2015-01-01

    This study sought to investigate reading perceptions, computer use perceptions, and online reading comprehension strategy use of 26 students who are deaf/hard of hearing in grades 4 through 8 attending public school districts in a tri-state area of the U.S. Students completed an online questionnaire and descriptive analysis indicated that students…

  15. A Fine-Grained Analysis of the Effects of Negative Evidence with and without Metalinguistic Information in Language Development

    ERIC Educational Resources Information Center

    Lado, Beatriz; Bowden, Harriet Wood; Stafford, Catherine A; Sanz, Cristina

    2014-01-01

    The current study compared the effectiveness of computer-delivered task-essential practice coupled with feedback consisting of (1) negative evidence with metalinguistic information (NE+MI) or (2) negative evidence without metalinguistic information (NE-MI) in promoting absolute beginners' (n = 58) initial learning of aspects of Latin…

  16. Re-Aligning Research into Teacher Education for CALL and Bringing It into the Mainstream

    ERIC Educational Resources Information Center

    Motteram, Gary

    2014-01-01

    This paper explores three research projects conducted by the writer and others with a view to demonstrating the importance of effective theory and methodology in the analysis of teaching situations where Computer Assisted Language Learning (CALL), teacher practice and teacher education meet. It argues that there is a tendency in the field of…

  17. Comparative Analysis, Hypercard, and the Future of Social Studies Education.

    ERIC Educational Resources Information Center

    Jennings, James M.

    This research paper seeks to address new theories of learning and instructional practices that will be needed to meet the demands of 21st century education. A brief review of the literature on the topics of constructivism, reflective inquiry, and multicultural education, which form the major elements of a computer-based system called HyperCAP, are…

  18. The Contribution of CALL to Advanced-Level Foreign/Second Language Instruction

    ERIC Educational Resources Information Center

    Burston, Jack; Arispe, Kelly

    2016-01-01

    This paper evaluates the contribution of instructional technology to advanced-level foreign/second language learning (AL2) over the past thirty years. It is shown that the most salient feature of AL2 practice and associated Computer-Assisted Language Learning (CALL) research are their rarity and restricted nature. Based on an analysis of four…

  19. Rural School Finance: A Critical Analysis of Current Practice in Illinois.

    ERIC Educational Resources Information Center

    Lows, Raymond L.

    School finance is basic to understanding and improving the condition of rural education. Information necessary for financial planning and policy development at the state and local levels is stored in large computer readable data bases and needs to be accessed. Pertinent data elements need to be extracted from the data base and relationships among…

  20. Establishing a communications link between two different, incompatible, personal computers: with practical examples and illustrations and program code.

    PubMed

    Davidson, R W

    1985-01-01

    The increasing need to exchange data can be met by personal microcomputers. The need to transfer information stored in one type of personal computer to another, incompatible type is often encountered when integrating multiple sources of information in medical research and practice. A practical example is demonstrated with two relatively inexpensive, commonly used computers, the IBM PC jr. and the Apple IIe. The basic input/output (I/O) interface chips for serial communication in the two computers are joined together using a null-modem connector and cable to form a communications link. Using BASIC (Beginner's All-purpose Symbolic Instruction Code) and the Disk Operating System (DOS), the communications handshaking protocol and file transfer are established between the two computers. The BASIC programming languages used are Applesoft (Apple personal computer) and PC BASIC (IBM personal computer).
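
    A modern analogue of the article's approach (not its BASIC/DOS code) is sketched below using pySerial: two machines joined by a null-modem cable exchange a file line by line with a simple acknowledgement handshake. The port names, baud rate and ACK/EOF protocol are illustrative assumptions.

    ```python
    # Hedged sketch: line-by-line file transfer over a null-modem serial link.
    import serial  # pip install pyserial

    PORT, BAUD = "/dev/ttyUSB0", 9600              # hypothetical sender-side port

    def send_file(path: str) -> None:
        with serial.Serial(PORT, BAUD, timeout=5) as link, open(path, "rb") as fh:
            for line in fh:
                link.write(line)                   # send one record
                if link.readline().strip() != b"ACK":   # wait for receiver handshake
                    raise IOError("receiver did not acknowledge record")
            link.write(b"EOF\r\n")                 # end-of-file marker

    def receive_file(path: str, port: str = "/dev/ttyUSB1") -> None:
        with serial.Serial(port, BAUD, timeout=5) as link, open(path, "wb") as fh:
            while True:
                line = link.readline()
                if not line or line.strip() == b"EOF":
                    break
                fh.write(line)
                link.write(b"ACK\r\n")             # acknowledge each record
    ```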

  1. A 3-D Approach for Teaching and Learning about Surface Water Systems through Computational Thinking, Data Visualization and Physical Models

    NASA Astrophysics Data System (ADS)

    Caplan, B.; Morrison, A.; Moore, J. C.; Berkowitz, A. R.

    2017-12-01

    Understanding water is central to understanding environmental challenges. Scientists use 'big data' and computational models to develop knowledge about the structure and function of complex systems, and to make predictions about changes in climate, weather, hydrology, and ecology. Large environmental systems-related data sets and simulation models are difficult for high school teachers and students to access and make sense of. Comp Hydro, a collaboration across four states and multiple school districts, integrates computational thinking and data-related science practices into water systems instruction to enhance development of scientific model-based reasoning, through curriculum, assessment and teacher professional development. Comp Hydro addresses the need for 1) teaching materials for using data and physical models of hydrological phenomena, 2) building teachers' and students' comfort or familiarity with data analysis and modeling, and 3) infusing the computational knowledge and practices necessary to model and visualize hydrologic processes into instruction. Comp Hydro teams in Baltimore, MD and Fort Collins, CO are integrating teaching about surface water systems into high school courses focusing on flooding (MD) and surface water reservoirs (CO). This interactive session will highlight the successes and challenges of our physical and simulation models in helping teachers and students develop proficiency with computational thinking about surface water. We also will share insights from comparing teacher-led vs. project-led development of curriculum and our simulations.

  2. Analytical solutions for coagulation and condensation kinetics of composite particles

    NASA Astrophysics Data System (ADS)

    Piskunov, Vladimir N.

    2013-04-01

    The processes by which composite particles form from a mixture of different materials are essential to many practical problems: analysis of the consequences of accidental releases into the atmosphere, simulation of precipitation formation in clouds, and description of multi-phase processes in chemical reactors and industrial facilities. Computer codes developed for numerical simulation of these processes require optimization of the computational methods and verification of the numerical programs. The kinetic equations of composite particle formation are given in this work in a concise form (integrated over the impurity content). Coagulation, condensation and external sources associated with nucleation are taken into account. Analytical solutions were obtained in a number of model cases, and general laws for the redistribution of impurity fractions were defined. The results can be applied to develop numerical algorithms that considerably reduce the simulation effort, as well as to verify numerical programs that calculate the formation kinetics of composite particles in problems of practical importance.
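
    For orientation, the sketch below time-steps the discrete Smoluchowski coagulation equations for the constant-kernel model case, one of the classical settings with a known analytical solution that can be used to verify such codes. It is a generic illustration under those assumptions, not the author's formulation; condensation, sources and impurity fractions are omitted.

    ```python
    # Hedged sketch: explicit Euler integration of discrete Smoluchowski coagulation
    # with a constant kernel, checked against the analytic total number density.
    import numpy as np

    K = 1.0                       # constant coagulation kernel
    nbins = 128                   # particle sizes k = 1..nbins (monomer units)
    n = np.zeros(nbins)
    n[0] = 1.0                    # monodisperse initial condition, N0 = 1
    dt, nsteps = 2.0e-3, 2500     # final time t = 5

    for _ in range(nsteps):
        gain = np.zeros(nbins)
        for k in range(1, nbins):                 # gain of size k+1 from i + j collisions
            gain[k] = 0.5 * K * np.dot(n[:k], n[k - 1::-1])
        loss = K * n * n.sum()                    # loss of each size by collision with any particle
        n += dt * (gain - loss)

    t = dt * nsteps
    analytic = 1.0 / (1.0 + 0.5 * K * 1.0 * t)    # N(t) = N0 / (1 + K*N0*t/2)
    print(f"numerical N(t) = {n.sum():.4f},  analytic N(t) = {analytic:.4f}")
    ```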

  3. Forensic Analysis of Compromised Computers

    NASA Technical Reports Server (NTRS)

    Wolfe, Thomas

    2004-01-01

    Directory Tree Analysis File Generator is a Practical Extraction and Reporting Language (PERL) script that simplifies and automates the collection of information for forensic analysis of compromised computer systems. During such an analysis, it is sometimes necessary to collect and analyze information about files on a specific directory tree. Directory Tree Analysis File Generator collects information of this type (except information about directories) and writes it to a text file. In particular, the script asks the user for the root of the directory tree to be processed, the name of the output file, and the number of subtree levels to process. The script then processes the directory tree and puts out the aforementioned text file. The format of the text file is designed to enable the submission of the file as input to a spreadsheet program, wherein the forensic analysis is performed. The analysis usually consists of sorting files and examination of such characteristics of files as ownership, time of creation, and time of most recent access, all of which characteristics are among the data included in the text file.
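
    A Python analogue of the kind of collection the script performs might look like the sketch below (this is not the original PERL code): walk a directory tree to a requested depth and write per-file ownership, size and timestamp data to a CSV file ready for spreadsheet import. The column choices and command-line interface are assumptions; the owner lookup is Unix-only.

    ```python
    # Hedged sketch: collect per-file metadata from a directory tree for forensic review.
    import csv, os, pwd, sys, time

    def stamp(epoch: float) -> str:
        return time.strftime("%Y-%m-%d %H:%M:%S", time.localtime(epoch))

    def dump_tree(root: str, out_csv: str, max_depth: int) -> None:
        """Walk `root` down to `max_depth` levels and write per-file metadata to CSV."""
        root = os.path.abspath(root)
        with open(out_csv, "w", newline="") as fh:
            writer = csv.writer(fh)
            writer.writerow(["path", "owner", "size", "ctime", "mtime", "atime"])
            for dirpath, dirnames, filenames in os.walk(root):
                depth = dirpath[len(root):].count(os.sep)
                if depth >= max_depth:
                    dirnames[:] = []                      # stop descending below max_depth
                for name in filenames:                    # directories themselves are skipped
                    full = os.path.join(dirpath, name)
                    st = os.lstat(full)                   # lstat: do not follow symlinks
                    owner = pwd.getpwuid(st.st_uid).pw_name   # Unix-only (pwd module)
                    writer.writerow([full, owner, st.st_size,
                                     stamp(st.st_ctime), stamp(st.st_mtime), stamp(st.st_atime)])

    if __name__ == "__main__":
        dump_tree(sys.argv[1], sys.argv[2], int(sys.argv[3]))
    ```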

  4. DREAMS and IMAGE: A Model and Computer Implementation for Concurrent, Life-Cycle Design of Complex Systems

    NASA Technical Reports Server (NTRS)

    Hale, Mark A.; Craig, James I.; Mistree, Farrokh; Schrage, Daniel P.

    1995-01-01

    Computing architectures are being assembled that extend concurrent engineering practices by providing more efficient execution and collaboration on distributed, heterogeneous computing networks. Built on the successes of initial architectures, requirements for a next-generation design computing infrastructure can be developed. These requirements concentrate on those needed by a designer in decision-making processes from product conception to recycling and can be categorized in two areas: design process and design information management. A designer both designs and executes design processes throughout design time to achieve better product and process capabilities while expending fewer resources. In order to accomplish this, information, or more appropriately design knowledge, needs to be adequately managed during product and process decomposition as well as recomposition. A foundation has been laid that captures these requirements in a design architecture called DREAMS (Developing Robust Engineering Analysis Models and Specifications). In addition, a computing infrastructure, called IMAGE (Intelligent Multidisciplinary Aircraft Generation Environment), is being developed that satisfies design requirements defined in DREAMS and incorporates enabling computational technologies.

  5. Computer Drawing Method for Operating Characteristic Curve of PV Power Plant Array Unit

    NASA Astrophysics Data System (ADS)

    Tan, Jianbin

    2018-02-01

    For the engineering design of large-scale grid-connected photovoltaic (PV) power stations and the development of the many associated simulation and analysis systems, the operating characteristic curves of PV array units must be drawn accurately by computer, for which a segmented non-linear interpolation algorithm is proposed. The calculation method takes the component performance parameters as the main design basis, from which the computer obtains five characteristic performance points of the PV module. Combining these with the series and parallel configuration of the PV array, the computer can draw the performance curve of the PV array unit. The resulting data can also be passed to the PV development software module, improving the practical operation of the PV array unit.
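
    One common engineering approximation consistent with this description (though not necessarily the paper's exact algorithm) is sketched below: reconstruct the module I-V curve from its datasheet points and scale it by the series/parallel configuration of the array unit. The datasheet values and array layout are hypothetical.

    ```python
    # Hedged sketch: module I-V curve from datasheet points, scaled to an array unit.
    import numpy as np

    def module_iv(Isc, Voc, Imp, Vmp, npts=200):
        # Simplified explicit model: I = Isc * (1 - C1*(exp(V/(C2*Voc)) - 1))
        C2 = (Vmp / Voc - 1.0) / np.log(1.0 - Imp / Isc)
        C1 = (1.0 - Imp / Isc) * np.exp(-Vmp / (C2 * Voc))
        V = np.linspace(0.0, Voc, npts)
        I = Isc * (1.0 - C1 * (np.exp(V / (C2 * Voc)) - 1.0))
        return V, np.clip(I, 0.0, None)

    def array_iv(V_mod, I_mod, Ns, Np):
        return V_mod * Ns, I_mod * Np        # series adds voltage, parallel adds current

    # Hypothetical 300 W-class module and a 20-series x 10-parallel array unit
    Vm, Im = module_iv(Isc=9.7, Voc=45.0, Imp=9.1, Vmp=33.0)
    Va, Ia = array_iv(Vm, Im, Ns=20, Np=10)
    print("array unit Pmp ~ %.0f W" % (Va * Ia).max())
    ```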

  6. Differences in nursing practice environment among US acute care unit types: a descriptive study.

    PubMed

    Choi, JiSun; Boyle, Diane K

    2014-11-01

    The hospital nursing practice environment has been found to be crucial for better nurse and patient outcomes. Yet little is known about the professional nursing practice environment at the unit level where nurses provide 24-hour bedside care to patients. To examine differences in nursing practice environments among 11 unit types (critical care, step-down, medical, surgical, combined medical-surgical, obstetric, neonatal, pediatric, psychiatric, perioperative, and emergency) and by Magnet status overall, as well as four specific aspects of the practice environment. Cross-sectional study. 5322 nursing units in 519 US acute care hospitals. The nursing practice environment was measured by the Practice Environment Scale of the Nursing Work Index. The Practice Environment Scale of the Nursing Work Index mean composite and four subscale scores were computed at the unit level. Two statistical approaches (one-way analysis of covariance and multivariate analysis of covariance) were employed with a Tukey-Kramer post hoc test. In general, the nursing practice environment was favorable in all unit types. There were significant differences in the nursing practice environment among the 11 unit types and by Magnet status. Pediatric units had the most favorable practice environment and medical-surgical units had the least favorable. A consistent finding across all unit types except neonatal units was that the staffing and resource adequacy subscale scored the lowest compared with all other Practice Environment Scale of the Nursing Work Index subscales (nursing foundations for quality of care, nurse manager ability, leadership, and support, and nurse-physician relations). Unit nursing practice environments were more favorable in Magnet than non-Magnet hospitals. Findings indicate that there are significant variations in unit nursing practice environments among 11 unit types and by hospital Magnet status. Both hospital-level and unit-specific strategies should be considered to achieve an excellent nursing practice environment in all hospital units. Copyright © 2014 Elsevier Ltd. All rights reserved.

  7. Effective approach to spectroscopy and spectral analysis techniques using Matlab

    NASA Astrophysics Data System (ADS)

    Li, Xiang; Lv, Yong

    2017-08-01

    With the development of electronic information, computers and networks, modern education technology has entered a new era, which has had a great impact on the teaching process. Spectroscopy and spectral analysis is an elective course for Optoelectronic Information Science and Engineering. The teaching objective of this course is to master the basic concepts and principles of spectroscopy and spectral analysis and the basic technical means of testing, and then to let students learn how the principles and technology of spectroscopy are used to study the structure and state of materials and how the technology is developing. MATLAB (matrix laboratory) is a multi-paradigm numerical computing environment and fourth-generation programming language developed by MathWorks; it allows matrix manipulation and the plotting of functions and data. Based on teaching practice, this paper summarizes the application of MATLAB to the teaching of spectroscopy, an approach suitable for most current multimedia-assisted teaching.

  8. Comparative Investigation of Normal Modes and Molecular Dynamics of Hepatitis C NS5B Protein

    NASA Astrophysics Data System (ADS)

    Asafi, M. S.; Yildirim, A.; Tekpinar, M.

    2016-04-01

    Understanding the dynamics of proteins has many practical implications for finding cures for many protein-related diseases. Normal mode analysis and molecular dynamics are widely used physics-based computational methods for investigating the dynamics of proteins. In this work, we studied the dynamics of the Hepatitis C NS5B protein with molecular dynamics and normal mode analysis. Principal components obtained from a 100-nanosecond molecular dynamics simulation show good overlaps with normal modes calculated with a coarse-grained elastic network model. Coarse-grained normal mode analysis takes at least an order of magnitude less time. Encouraged by these good overlaps and short computation times, we further analyzed the low-frequency normal modes of Hepatitis C NS5B. Motion directions and average spatial fluctuations have been analyzed in detail. Finally, the biological implications of these motions for drug design efforts against Hepatitis C infections are elaborated.
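
    The overlap calculation described above can be sketched as follows, with random unit vectors standing in for the actual PCA and elastic-network modes; the protein size, mode counts and data are placeholders, not the study's results.

    ```python
    # Hedged sketch: single-mode and cumulative overlaps between MD principal
    # components and coarse-grained normal modes (both as 3N-dimensional unit vectors).
    import numpy as np

    rng = np.random.default_rng(1)
    n_atoms = 100                                   # hypothetical number of C-alpha atoms
    dim = 3 * n_atoms

    def unit_rows(m):
        return m / np.linalg.norm(m, axis=1, keepdims=True)

    pcs = unit_rows(rng.normal(size=(10, dim)))     # stand-ins for the top 10 PCA modes
    nms = unit_rows(rng.normal(size=(20, dim)))     # stand-ins for 20 low-frequency ENM modes

    overlap = np.abs(pcs @ nms.T)                   # O_ij = |p_i . m_j|
    cum_overlap = np.sqrt((overlap**2).sum(axis=1)) # how well the ENM subspace spans each PC

    for i, (best, cum) in enumerate(zip(overlap.max(axis=1), cum_overlap), 1):
        print(f"PC{i}: best single-mode overlap {best:.2f}, cumulative overlap {cum:.2f}")
    ```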

  9. The finite element method in low speed aerodynamics

    NASA Technical Reports Server (NTRS)

    Baker, A. J.; Manhardt, P. D.

    1975-01-01

    The finite element procedure is shown to be of significant impact in design of the 'computational wind tunnel' for low speed aerodynamics. The uniformity of the mathematical differential equation description, for viscous and/or inviscid, multi-dimensional subsonic flows about practical aerodynamic system configurations, is utilized to establish the general form of the finite element algorithm. Numerical results for inviscid flow analysis, as well as viscous boundary layer, parabolic, and full Navier Stokes flow descriptions verify the capabilities and overall versatility of the fundamental algorithm for aerodynamics. The proven mathematical basis, coupled with the distinct user-orientation features of the computer program embodiment, indicate near-term evolution of a highly useful analytical design tool to support computational configuration studies in low speed aerodynamics.

  10. Life Prediction for a CMC Component Using the NASALIFE Computer Code

    NASA Technical Reports Server (NTRS)

    Gyekenyesi, John Z.; Murthy, Pappu L. N.; Mital, Subodh K.

    2005-01-01

    The computer code, NASALIFE, was used to provide estimates for life of an SiC/SiC stator vane under varying thermomechanical loading conditions. The primary intention of this effort is to show how the computer code NASALIFE can be used to provide reasonable estimates of life for practical propulsion system components made of advanced ceramic matrix composites (CMC). Simple loading conditions provided readily observable and acceptable life predictions. Varying the loading conditions such that low cycle fatigue and creep were affected independently provided expected trends in the results for life due to varying loads and life due to creep. Analysis was based on idealized empirical data for the 9/99 Melt Infiltrated SiC fiber reinforced SiC.

  11. Use of computational fluid dynamics in respiratory medicine.

    PubMed

    Fernández Tena, Ana; Casan Clarà, Pere

    2015-06-01

    Computational Fluid Dynamics (CFD) is a computer-based tool for simulating fluid movement. The main advantages of CFD over other fluid mechanics studies include: substantial savings in time and cost, the analysis of systems or conditions that are very difficult to simulate experimentally (as is the case of the airways), and a practically unlimited level of detail. We used the Ansys-Fluent CFD program to develop a conducting airway model to simulate different inspiratory flow rates and the deposition of inhaled particles of varying diameters, obtaining results consistent with those reported in the literature using other procedures. We hope this approach will enable clinicians to further individualize the treatment of different respiratory diseases. Copyright © 2014 SEPAR. Published by Elsevier Espana. All rights reserved.

  12. Computer screening for palliative care needs in primary care: a mixed-methods study.

    PubMed

    Mason, Bruce; Boyd, Kirsty; Steyn, John; Kendall, Marilyn; Macpherson, Stella; Murray, Scott A

    2018-05-01

    Though the majority of people could benefit from palliative care before they die, most do not receive this approach, especially those with multimorbidity and frailty. GPs find it difficult to identify such patients. To refine and evaluate the utility of a computer application (AnticiPal) to help primary care teams screen their registered patients for people who could benefit from palliative care. A mixed-methods study of eight GP practices in Scotland, conducted in 2016-2017. After a search development cycle the authors adopted a mixed-methods approach, combining analysis of the number of people identified by the search with qualitative observations of the computer search as used by primary care teams, and interviews with professionals and patients. The search identified 0.8% of 62 708 registered patients. A total of 27 multidisciplinary meetings were observed, and eight GPs and 10 patients were interviewed. GPs thought the search identified many unrecognised patients with advanced multimorbidity and frailty, but were concerned about workload implications of assessment and care planning. Patients and carers endorsed the value of proactive identification of people with advanced illness. GP practices can use computer searching to generate lists of patients for review and care planning. The challenges of starting a conversation about the future remain. However, most patients regard key components of palliative care (proactive planning, including sharing information with urgent care services) as important. Screening for people with deteriorating health at risk from unplanned care is a current focus for quality improvement and should not be limited by labelling it solely as 'palliative care'. © British Journal of General Practice 2018.

  13. Computational Fluid Dynamics Technology for Hypersonic Applications

    NASA Technical Reports Server (NTRS)

    Gnoffo, Peter A.

    2003-01-01

    Several current challenges in computational fluid dynamics and aerothermodynamics for hypersonic vehicle applications are discussed. Example simulations are presented from code validation and code benchmarking efforts to illustrate capabilities and limitations. Opportunities to advance the state-of-art in algorithms, grid generation and adaptation, and code validation are identified. Highlights of diverse efforts to address these challenges are then discussed. One such effort to re-engineer and synthesize the existing analysis capability in LAURA, VULCAN, and FUN3D will provide context for these discussions. The critical (and evolving) role of agile software engineering practice in the capability enhancement process is also noted.

  14. Efficient Monte Carlo Estimation of the Expected Value of Sample Information Using Moment Matching.

    PubMed

    Heath, Anna; Manolopoulou, Ioanna; Baio, Gianluca

    2018-02-01

    The Expected Value of Sample Information (EVSI) is used to calculate the economic value of a new research strategy. Although this value would be important to both researchers and funders, there are very few practical applications of the EVSI. This is due to computational difficulties associated with calculating the EVSI in practical health economic models using nested simulations. We present an approximation method for the EVSI that is framed in a Bayesian setting and is based on estimating the distribution of the posterior mean of the incremental net benefit across all possible future samples, known as the distribution of the preposterior mean. Specifically, this distribution is estimated using moment matching coupled with simulations that are available for probabilistic sensitivity analysis, which is typically mandatory in health economic evaluations. This novel approximation method is applied to a health economic model that has previously been used to assess the performance of other EVSI estimators and accurately estimates the EVSI. The computational time for this method is competitive with other methods. We have developed a new calculation method for the EVSI which is computationally efficient and accurate. This novel method relies on some additional simulation so can be expensive in models with a large computational cost.

  15. Methodological considerations for the evaluation of EEG mapping data: a practical example based on a placebo/diazepam crossover trial.

    PubMed

    Jähnig, P; Jobert, M

    1995-01-01

    Quantitative EEG is a sensitive method for measuring pharmacological effects on the central nervous system. Nowadays, computers enable EEG data to be stored and spectral parameters to be computed for signals obtained from a large number of electrode locations. However, the statistical analysis of such vast amounts of EEG data is complicated due to the limited number of subjects usually involved in pharmacological studies. In the present study, data from a trial aimed at comparing diazepam and placebo were used to investigate different properties of EEG mapping data and to compare different methods of data analysis. Both the topography and the temporal changes of EEG activity were investigated using descriptive data analysis, which is based on an inspection of patterns of pd values (descriptive p values) assessed for all pair-wise tests for differences in time or treatment. An empirical measure (tri-mean) for the computation of group maps is suggested, allowing a better description of group effects with skewed data of small sample size. Finally, both the investigation of maps based on principal component analysis and the notion of distance between maps are discussed and applied to the analysis of the data collected under diazepam treatment, exemplifying the evaluation of pharmacodynamic drug effects.
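
    The tri-mean referred to above is usually taken to be Tukey's weighted combination of the quartiles; a minimal sketch on a hypothetical skewed sample (not the trial's data) is given below, and the paper's exact variant may differ.

    ```python
    # Hedged sketch: Tukey's tri-mean as a robust location estimate for skewed,
    # small-sample spectral power values (one electrode, one frequency band).
    import numpy as np

    def trimean(x):
        q1, med, q3 = np.percentile(x, [25, 50, 75])
        return (q1 + 2.0 * med + q3) / 4.0

    rng = np.random.default_rng(2)
    power = rng.lognormal(mean=0.0, sigma=0.8, size=16)   # small, skewed sample (n = 16)
    print(f"mean = {power.mean():.3f}, median = {np.median(power):.3f}, "
          f"tri-mean = {trimean(power):.3f}")
    ```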

  16. Integrating on campus problem based learning and practice based learning: issues and challenges in using computer mediated communication.

    PubMed

    Conway, J; Sharkey, R

    2002-10-01

    The Faculty of Nursing, University of Newcastle, Australia, has been keen to initiate strategies that enhance student learning and nursing practice. Two strategies are problem based learning (PBL) and clinical practice. The Faculty has maintained a comparatively high proportion of the undergraduate hours in the clinical setting in times when financial constraints suggest that simulations and on-campus laboratory experiences may be less expensive. Increasingly, computer-based technologies are becoming sufficiently refined to support the exploration of nursing practice in a non-traditional lecture/tutorial environment. In 1998, a group of faculty members proposed that computer mediated instruction would provide an opportunity for partnership between students, academics and clinicians that would promote more positive outcomes for all and maintain the integrity of the PBL approach. This paper discusses the similarities between problem based and practice based learning and presents the findings of an evaluative study of the implementation of a practice based learning model that uses computer mediated communication to promote integration of practice experiences with the broader goals of the undergraduate curriculum.

  17. Secure software practices among Malaysian software practitioners: An exploratory study

    NASA Astrophysics Data System (ADS)

    Mohamed, Shafinah Farvin Packeer; Baharom, Fauziah; Deraman, Aziz; Yahya, Jamaiah; Mohd, Haslina

    2016-08-01

    Secure software practices are gaining increasing importance among software practitioners and researchers due to the rise of computer crime in the software industry. They have become one of the determinant factors for producing high-quality software. Even though their importance has been recognized, their current practice in the software industry is still scarce, particularly in Malaysia. Thus, an exploratory study was conducted among software practitioners in Malaysia to study their experiences and practices in real-world projects. This paper discusses the findings from the study, which involved 93 software practitioners. A structured questionnaire was utilized for data collection, whilst statistical methods such as frequency, mean, and cross tabulation were used for data analysis. Outcomes from this study reveal that software practitioners are becoming increasingly aware of the importance of secure software practices; however, they lack appropriate implementation, which could affect the quality of the software produced.

  18. Fiji: an open-source platform for biological-image analysis.

    PubMed

    Schindelin, Johannes; Arganda-Carreras, Ignacio; Frise, Erwin; Kaynig, Verena; Longair, Mark; Pietzsch, Tobias; Preibisch, Stephan; Rueden, Curtis; Saalfeld, Stephan; Schmid, Benjamin; Tinevez, Jean-Yves; White, Daniel James; Hartenstein, Volker; Eliceiri, Kevin; Tomancak, Pavel; Cardona, Albert

    2012-06-28

    Fiji is a distribution of the popular open-source software ImageJ focused on biological-image analysis. Fiji uses modern software engineering practices to combine powerful software libraries with a broad range of scripting languages to enable rapid prototyping of image-processing algorithms. Fiji facilitates the transformation of new algorithms into ImageJ plugins that can be shared with end users through an integrated update system. We propose Fiji as a platform for productive collaboration between computer science and biology research communities.

  19. Numerical analysis of stiffened shells of revolution. Volume 3: Users' manual for STARS-2B, 2V, shell theory automated for rotational structures, 2 (buckling, vibrations), digital computer programs

    NASA Technical Reports Server (NTRS)

    Svalbonas, V.

    1973-01-01

    The User's manual for the shell theory automated for rotational structures (STARS) 2B and 2V (buckling, vibrations) is presented. Several features of the program are: (1) arbitrary branching of the shell meridians, (2) arbitrary boundary conditions, (3) minimum input requirements to describe a complex, practical shell of revolution structure, and (4) accurate analysis capability using a minimum number of degrees of freedom.

  20. Simplified diagnostic coding sheet for computerized data storage and analysis in ophthalmology.

    PubMed

    Tauber, J; Lahav, M

    1987-11-01

    A review of currently available diagnostic coding systems revealed that most are either too abbreviated or too detailed. We have compiled a simplified diagnostic coding sheet based on the International Classification of Diseases, Ninth Revision (ICD-9), which is both complete and easy to use in a general practice. The information is transferred to a computer, which uses the relevant ICD-9 diagnoses as a database, and can be retrieved later for display of patients' problems or analysis of clinical data.
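
    A minimal sketch of the storage-and-retrieval idea (hypothetical schema and example entries, not the authors' system) follows: coded diagnoses are stored per visit and can later be listed per patient or tallied across the practice.

    ```python
    # Hedged sketch: storing ICD-9-style coded diagnoses and retrieving them later.
    import sqlite3

    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE visit_dx (patient_id TEXT, visit_date TEXT, icd9 TEXT, label TEXT)")
    rows = [
        ("P001", "1987-03-02", "366.9", "Cataract, unspecified"),
        ("P001", "1987-09-14", "365.9", "Glaucoma, unspecified"),
        ("P002", "1987-04-21", "366.9", "Cataract, unspecified"),
    ]
    con.executemany("INSERT INTO visit_dx VALUES (?, ?, ?, ?)", rows)

    # Display one patient's problem list
    for r in con.execute("SELECT visit_date, icd9, label FROM visit_dx "
                         "WHERE patient_id = ? ORDER BY visit_date", ("P001",)):
        print("P001:", *r)

    # Simple clinical-data analysis: diagnosis frequency across the practice
    for r in con.execute("SELECT icd9, label, COUNT(*) FROM visit_dx GROUP BY icd9, label"):
        print("count:", *r)
    ```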

  1. Applications of the pipeline environment for visual informatics and genomics computations

    PubMed Central

    2011-01-01

    Background Contemporary informatics and genomics research require efficient, flexible and robust management of large heterogeneous data, advanced computational tools, powerful visualization, reliable hardware infrastructure, interoperability of computational resources, and detailed data and analysis-protocol provenance. The Pipeline is a client-server distributed computational environment that facilitates the visual graphical construction, execution, monitoring, validation and dissemination of advanced data analysis protocols. Results This paper reports on the applications of the LONI Pipeline environment to address two informatics challenges - graphical management of diverse genomics tools, and the interoperability of informatics software. Specifically, this manuscript presents the concrete details of deploying general informatics suites and individual software tools to new hardware infrastructures, the design, validation and execution of new visual analysis protocols via the Pipeline graphical interface, and integration of diverse informatics tools via the Pipeline eXtensible Markup Language syntax. We demonstrate each of these processes using several established informatics packages (e.g., miBLAST, EMBOSS, mrFAST, GWASS, MAQ, SAMtools, Bowtie) for basic local sequence alignment and search, molecular biology data analysis, and genome-wide association studies. These examples demonstrate the power of the Pipeline graphical workflow environment to enable integration of bioinformatics resources which provide a well-defined syntax for dynamic specification of the input/output parameters and the run-time execution controls. Conclusions The LONI Pipeline environment http://pipeline.loni.ucla.edu provides a flexible graphical infrastructure for efficient biomedical computing and distributed informatics research. The interactive Pipeline resource manager enables the utilization and interoperability of diverse types of informatics resources. The Pipeline client-server model provides computational power to a broad spectrum of informatics investigators - experienced developers and novice users, user with or without access to advanced computational-resources (e.g., Grid, data), as well as basic and translational scientists. The open development, validation and dissemination of computational networks (pipeline workflows) facilitates the sharing of knowledge, tools, protocols and best practices, and enables the unbiased validation and replication of scientific findings by the entire community. PMID:21791102

  2. Exploiting symmetries in the modeling and analysis of tires

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.; Andersen, C. M.; Tanner, John A.

    1989-01-01

    A computational procedure is presented for reducing the size of the analysis models of tires having unsymmetric material, geometry and/or loading. The two key elements of the procedure when applied to anisotropic tires are: (1) decomposition of the stiffness matrix into the sum of orthotropic and nonorthotropic parts; and (2) successive application of the finite-element method and the classical Rayleigh-Ritz technique. The finite-element method is first used to generate a few global approximation vectors (or modes). Then the amplitudes of these modes are computed by using the Rayleigh-Ritz technique. The proposed technique has high potential for handling practical tire problems with anisotropic materials, unsymmetric imperfections and asymmetric loading. It is also particularly useful with three-dimensional finite-element models of tires.
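
    The second element, projection onto a few global approximation vectors followed by a Rayleigh-Ritz solve for their amplitudes, is sketched generically below with a random symmetric test matrix; the basis construction and load are placeholders, not the tire model.

    ```python
    # Hedged sketch: reduced-basis Rayleigh-Ritz solution K_r q = f_r with K_r = Phi^T K Phi.
    import numpy as np

    rng = np.random.default_rng(3)
    ndof, nmodes = 500, 12

    A = rng.normal(size=(ndof, ndof))
    K = A @ A.T + ndof * np.eye(ndof)          # stand-in SPD "stiffness" matrix

    _, vecs = np.linalg.eigh(K)
    Phi = vecs[:, :nmodes]                     # global approximation vectors ("modes");
                                               # in the tire analysis these come from a
                                               # preliminary finite-element solution

    # Load chosen to lie mostly within the retained modes, so the reduction is accurate.
    f = Phi @ rng.normal(size=nmodes) + 0.01 * rng.normal(size=ndof)

    K_r = Phi.T @ K @ Phi                      # reduced stiffness
    f_r = Phi.T @ f                            # reduced load
    q = np.linalg.solve(K_r, f_r)              # Rayleigh-Ritz amplitudes of the modes
    u_approx = Phi @ q                         # recomposed displacement field

    u_full = np.linalg.solve(K, f)
    print("relative error:", np.linalg.norm(u_full - u_approx) / np.linalg.norm(u_full))
    ```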

  3. Encoder fault analysis system based on Moire fringe error signal

    NASA Astrophysics Data System (ADS)

    Gao, Xu; Chen, Wei; Wan, Qiu-hua; Lu, Xin-ran; Xie, Chun-yu

    2018-02-01

    To address the faults and code errors that arise in practical applications of photoelectric shaft encoders, a fast and accurate encoder fault analysis system is developed from the perspective of Moire fringe photoelectric signal processing. A DSP28335 is selected as the core processor, a high-speed serial A/D converter acquisition card is used, and a temperature-measuring circuit based on the AD7420 is designed. Discrete samples of the Moire fringe error signal are collected at different temperatures and sent to the host computer through wireless transmission. The host computer displays the error-signal quality index and the fault type, based on the error-signal identification method. The error-signal quality can then be used to diagnose the state of erroneous codes through the human-machine interface.
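
    One plausible set of quality indices for the quadrature Moire channels is sketched below on synthetic signals (DC offset, amplitude imbalance, and deviation from the nominal 90° phase shift); these metrics and the signal model are assumptions for illustration, and the paper's own quality index may differ.

    ```python
    # Hedged sketch: quality metrics for synthetic quadrature Moire fringe signals.
    import numpy as np

    rng = np.random.default_rng(4)
    n = 4096
    phase = np.linspace(0, 40 * np.pi, n)                      # many fringe periods
    sin_ch = 1.00 * np.sin(phase) + 0.02 + 0.01 * rng.normal(size=n)
    cos_ch = 0.93 * np.cos(phase + np.deg2rad(3)) - 0.04 + 0.01 * rng.normal(size=n)

    def offset_amplitude(x):
        off = x.mean()
        amp = np.sqrt(2.0) * np.std(x - off)                   # sine amplitude from its RMS
        return off, amp

    off_s, amp_s = offset_amplitude(sin_ch)
    off_c, amp_c = offset_amplitude(cos_ch)

    # Phase error from the normalized correlation of the zero-mean channels:
    # mean[sin(p) * cos(p + d)] over whole periods = -0.5 * sin(d)
    corr = np.mean((sin_ch - off_s) * (cos_ch - off_c)) / (0.5 * amp_s * amp_c)
    phase_err_deg = np.degrees(np.arcsin(-corr))

    print(f"offsets: {off_s:+.3f}, {off_c:+.3f}  amplitude ratio: {amp_c/amp_s:.3f}  "
          f"phase error: {phase_err_deg:+.2f} deg")
    ```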

  4. Handheld Computer Use in U.S. Family Practice Residency Programs

    PubMed Central

    Criswell, Dan F.; Parchman, Michael L.

    2002-01-01

    Objective: The purpose of the study was to evaluate the uses of handheld computers (also called personal digital assistants, or PDAs) in family practice residency programs in the United States. Study Design: In November 2000, the authors mailed a questionnaire to the program directors of all American Academy of Family Physicians (AAFP) and American College of Osteopathic Family Practice (ACOFP) residency programs in the United States. Measurements: Data and patterns of the use and non-use of handheld computers were identified. Results: Approximately 50 percent (306 of 610) of the programs responded to the survey. Two thirds of the programs reported that handheld computers were used in their residencies, and an additional 14 percent had plans for implementation within 24 months. Both the Palm and the Windows CE operating systems were used, with the Palm operating system the most common. Military programs had the highest rate of use (8 of 10 programs, 80 percent), and osteopathic programs had the lowest (23 of 55 programs, 42 percent). Of programs that reported handheld computer use, 45 percent had required handheld computer applications that are used uniformly by all users. Funding for handheld computers and related applications was non-budgeted in 76 percent of the programs in which handheld computers were used. In programs providing a budget for handheld computers, the average annual budget per user was $461.58. Interested faculty or residents, rather than computer information services personnel, performed upkeep and maintenance of handheld computers in 72 percent of the programs in which the computers are used. In addition to the installed calendar, memo pad, and address book, the most common clinical uses of handheld computers in the programs were as medication reference tools, electronic textbooks, and clinical computational or calculator-type programs. Conclusions: Handheld computers are widely used in family practice residency programs in the United States. Although handheld computers were designed as electronic organizers, in family practice residencies they are used as medication reference tools, electronic textbooks, and clinical computational programs and to track activities that were previously associated with desktop database applications. PMID:11751806

  5. Handheld computer use in U.S. family practice residency programs.

    PubMed

    Criswell, Dan F; Parchman, Michael L

    2002-01-01

    The purpose of the study was to evaluate the uses of handheld computers (also called personal digital assistants, or PDAs) in family practice residency programs in the United States. In November 2000, the authors mailed a questionnaire to the program directors of all American Academy of Family Physicians (AAFP) and American College of Osteopathic Family Practice (ACOFP) residency programs in the United States. Data and patterns of the use and non-use of handheld computers were identified. Approximately 50 percent (306 of 610) of the programs responded to the survey. Two thirds of the programs reported that handheld computers were used in their residencies, and an additional 14 percent had plans for implementation within 24 months. Both the Palm and the Windows CE operating systems were used, with the Palm operating system the most common. Military programs had the highest rate of use (8 of 10 programs, 80 percent), and osteopathic programs had the lowest (23 of 55 programs, 42 percent). Of programs that reported handheld computer use, 45 percent had required handheld computer applications that are used uniformly by all users. Funding for handheld computers and related applications was non-budgeted in 76 percent of the programs in which handheld computers were used. In programs providing a budget for handheld computers, the average annual budget per user was 461.58 dollars. Interested faculty or residents, rather than computer information services personnel, performed upkeep and maintenance of handheld computers in 72 percent of the programs in which the computers are used. In addition to the installed calendar, memo pad, and address book, the most common clinical uses of handheld computers in the programs were as medication reference tools, electronic textbooks, and clinical computational or calculator-type programs. Handheld computers are widely used in family practice residency programs in the United States. Although handheld computers were designed as electronic organizers, in family practice residencies they are used as medication reference tools, electronic textbooks, and clinical computational programs and to track activities that were previously associated with desktop database applications.

  6. Analysis of the knowledge and opinions of students and qualified dentists regarding the use of computers.

    PubMed

    Castelló Castañeda, Coral; Ríos Santos, Jose Vicente; Bullón, Pedro

    2008-01-01

    Dentists are currently required to make multiple diagnoses and treatment decisions every day, and the information necessary to do this satisfactorily doubles in volume every five years. Knowledge therefore rapidly becomes out of date, and it is often impossible to remember established information while assimilating new concepts. This may result in a significant lack of knowledge in the future, which would jeopardize the success of treatments. To remedy and prevent this situation, we nowadays have access to modern computing systems, with extensive databases, which help us retain the information necessary for daily practice and access it instantaneously. The objectives of this study were therefore to determine how widespread the use of computing is in this environment and to determine the opinion of students and qualified dentists regarding its use in dentistry. Ninety people took part in the study, divided into three groups: students, newly qualified dentists, and experts. A high percentage (93.30%) were shown to use a computer, but their level of computing knowledge was predominantly moderate. The place where a computer is used most is the home, which suggests that the majority own a computer. Analysis of the results obtained for the evaluation of computers in teaching showed that the participants thought that computing saved a great deal of time and had great potential for providing an image (in terms of marketing), and they considered it a very innovative and stimulating tool.

  7. Shell stability analysis in a computer aided engineering (CAE) environment

    NASA Technical Reports Server (NTRS)

    Arbocz, J.; Hol, J. M. A. M.

    1993-01-01

    The development of 'DISDECO', the Delft Interactive Shell DEsign COde is described. The purpose of this project is to make the accumulated theoretical, numerical and practical knowledge of the last 25 years or so readily accessible to users interested in the analysis of buckling sensitive structures. With this open ended, hierarchical, interactive computer code the user can access from his workstation successively programs of increasing complexity. The computational modules currently operational in DISDECO provide the prospective user with facilities to calculate the critical buckling loads of stiffened anisotropic shells under combined loading, to investigate the effects the various types of boundary conditions will have on the critical load, and to get a complete picture of the degrading effects the different shapes of possible initial imperfections might cause, all in one interactive session. Once a design is finalized, its collapse load can be verified by running a large refined model remotely from behind the workstation with one of the current generation 2-dimensional codes, with advanced capabilities to handle both geometric and material nonlinearities.

  8. Structural Loads Analysis for Wave Energy Converters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    van Rij, Jennifer A; Yu, Yi-Hsiang; Guo, Yi

    2017-06-03

    This study explores and verifies the generalized body-modes method for evaluating the structural loads on a wave energy converter (WEC). Historically, WEC design methodologies have focused primarily on accurately evaluating hydrodynamic loads, while methodologies for evaluating structural loads have yet to be fully considered and incorporated into the WEC design process. As wave energy technologies continue to advance, however, it has become increasingly evident that an accurate evaluation of the structural loads will enable an optimized structural design, as well as the potential utilization of composites and flexible materials, and hence reduce WEC costs. Although there are many computational fluid dynamics, structural analyses and fluid-structure-interaction (FSI) codes available, the application of these codes is typically too computationally intensive to be practical in the early stages of the WEC design process. The generalized body-modes method, however, is a reduced order, linearized, frequency-domain FSI approach, performed in conjunction with the linear hydrodynamic analysis, with computation times that could realistically be incorporated into the WEC design process.

  9. Computer multitasking with Desqview 386 in a family practice.

    PubMed Central

    Davis, A E

    1990-01-01

    Computers are now widely used in medical practice for accounting and secretarial tasks. However, it has been much more difficult to use computers in more physician-related activities of daily practice. I investigated the Desqview multitasking system on a 386 computer as a solution to this problem. Physician-directed tasks of management of patient charts, retrieval of reference information, word processing, appointment scheduling and office organization were each managed by separate programs. Desqview allowed instantaneous switching back and forth between the various programs. I compared the time and cost savings and the need for physician input between Desqview 386, a 386 computer alone and an older, XT computer. Desqview significantly simplified the use of computer programs for medical information management and minimized the necessity for physician intervention. The time saved was 15 minutes per day; the costs saved were estimated to be $5000 annually. PMID:2383848

  10. Improving Communicative Competence through Synchronous Communication in Computer-Supported Collaborative Learning Environments: A Systematic Review

    ERIC Educational Resources Information Center

    Huang, Xi

    2018-01-01

    Computer-supported collaborative learning facilitates the extension of second language acquisition into social practice. Studies on its achievement effects speak directly to the pedagogical notion of treating communicative practice in synchronous computer-mediated communication (SCMC): real-time communication that takes place between human beings…

  11. Theory-Guided Technology in Computer Science.

    ERIC Educational Resources Information Center

    Ben-Ari, Mordechai

    2001-01-01

    Examines the history of major achievements in computer science as portrayed by winners of the prestigious Turing award and identifies a possibly unique activity called Theory-Guided Technology (TGT). Researchers develop TGT by using theoretical results to create practical technology. Discusses reasons why TGT is practical in computer science and…

  12. A Framework for Understanding Physics Students' Computational Modeling Practices

    ERIC Educational Resources Information Center

    Lunk, Brandon Robert

    2012-01-01

    With the growing push to include computational modeling in the physics classroom, we are faced with the need to better understand students' computational modeling practices. While existing research on programming comprehension explores how novices and experts generate programming algorithms, little of this discusses how domain content…

  13. Learning about Computers through Art History and Art Practice.

    ERIC Educational Resources Information Center

    Lichtman, Loy

    1996-01-01

    Describes a Victoria University (Australia) program that combines art history, computer graphics, and studio practice. Discusses the social applications of technology, the creation and manipulation of computer imagery, and the ways that these impact traditional concepts of art. The program has proven particularly successful with female students.…

  14. Inter-expert agreement and similarity analysis of traditional diagnoses and acupuncture prescriptions in textbook- and pragmatic-based practices.

    PubMed

    Alvim, Danielle Terra; Ferreira, Arthur Sá

    2018-02-01

    This study examined (1) the agreement of acupuncture experts with textbook prescriptions and among themselves, and (2) the association between similar traditional diagnoses and textbook acupuncture prescriptions, examining whether pragmatic practice (i.e., modifying prescriptions according to personal clinical practice) alters such an association. A computational analysis quantified the diagnosis-prescription association from a textbook. Eight acupuncture experts were independently interviewed. Experts modified the textbook prescriptions according to their pragmatic practice. Experts mostly agreed (19-90%) or strongly agreed (0-29%) with the textbook prescriptions, with no-better-than-chance agreement on their ratings (Light's κ = 0.036, CI 95%  = [0.003; 0.081]). The number of manifestations in traditional diagnoses weakly explains the variability (Spearman's ρ = 0.260, p = 0.038) of the number of acupoints in prescriptions. The association between similar traditional diagnoses and acupuncture prescriptions is strong in the textbook (γ = 0.720, CI 95%  = [0.658, 0.783]), whereas pragmatic practice had little effect on this association (γ = 0.724-0.769). Copyright © 2017 Elsevier Ltd. All rights reserved.
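
    The agreement statistics reported above can be reproduced on hypothetical ratings as in the sketch below, with Light's kappa computed as the mean of pairwise Cohen's kappas across experts; all data here are simulated placeholders, not the study's ratings.

    ```python
    # Hedged sketch: Light's kappa across experts and Spearman's rho between
    # diagnosis size and prescription size, on simulated data.
    from itertools import combinations
    import numpy as np
    from scipy.stats import spearmanr
    from sklearn.metrics import cohen_kappa_score

    rng = np.random.default_rng(5)
    n_items, n_experts = 40, 8
    levels = ["disagree", "agree", "strongly agree"]

    # Hypothetical ratings of textbook prescriptions by each expert
    ratings = rng.choice(levels, size=(n_experts, n_items), p=[0.25, 0.6, 0.15])

    pairwise = [cohen_kappa_score(ratings[i], ratings[j])
                for i, j in combinations(range(n_experts), 2)]
    light_kappa = float(np.mean(pairwise))

    # Hypothetical counts: manifestations per diagnosis vs. acupoints per prescription
    n_manifestations = rng.integers(3, 12, size=n_items)
    n_acupoints = n_manifestations // 2 + rng.integers(0, 6, size=n_items)
    rho, p = spearmanr(n_manifestations, n_acupoints)

    print(f"Light's kappa = {light_kappa:.3f}; Spearman's rho = {rho:.3f} (p = {p:.3f})")
    ```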

  15. An online forum as a qualitative research method: practical issues.

    PubMed

    Im, Eun-Ok; Chee, Wonshik

    2006-01-01

    Despite the positive aspects of online forums as a qualitative research method, very little is known on the practical issues involved in using online forums for data collection, especially for a qualitative research project. The aim of this study was to describe the practical issues encountered in implementing an online forum as a qualitative component of a larger study on cancer pain experience. Throughout the study process, the research staff recorded issues ranging from minor technical problems to serious ethical dilemmas as they arose and wrote memos about them. The memos and written records of the discussions were reviewed and analyzed using content analysis. Two practical issues related to credibility were identified: (a) a high response and retention rate and (b) automatic transcripts. An issue related to dependability was the participants' forgetfulness. The issues related to confirmability were difficulties in theoretical saturation and unstandardized computer and Internet jargon. A security issue related to hacking attempts was noted as well. The analysis of these issues suggests several implications for future researchers who want to use online forums as a qualitative data collection method.

  16. Performance Analysis, Design Considerations, and Applications of Extreme-Scale In Situ Infrastructures

    DOE PAGES

    Ayachit, Utkarsh; Bauer, Andrew; Duque, Earl P. N.; ...

    2016-11-01

    A key trend facing extreme-scale computational science is the widening gap between computational and I/O rates, and the challenge that follows is how to best gain insight from simulation data when it is increasingly impractical to save it to persistent storage for subsequent visual exploration and analysis. One approach to this challenge is centered around the idea of in situ processing, where visualization and analysis processing is performed while data is still resident in memory. Our paper examines several key design and performance issues related to the idea of in situ processing at extreme scale on modern platforms: scalability, overhead, performance measurement and analysis, comparison and contrast with a traditional post hoc approach, and interfacing with simulation codes. We illustrate these principles in practice with studies, conducted on large-scale HPC platforms, that include a miniapplication and multiple science application codes, one of which demonstrates in situ methods in use at greater than 1M-way concurrency.
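
    The minimal sketch below illustrates only the general in situ idea described above (an analysis callback invoked while data is still in memory, keeping small reductions instead of writing full state); it is not the paper's infrastructure, and the function names are invented for illustration.

        # Generic in situ sketch: the simulation hands each timestep's in-memory
        # state to an analysis callback instead of writing it to persistent storage.
        import numpy as np

        def simulate(n_steps, n_cells, analyze):
            state = np.zeros(n_cells)
            for step in range(n_steps):
                # Stand-in for an expensive solver update.
                state = state + np.random.standard_normal(n_cells) * 0.01
                # In situ hook: analysis runs while data is still resident in memory.
                analyze(step, state)

        summaries = []

        def analyze(step, state):
            # Keep only a small reduction (here, mean and max) rather than raw state.
            summaries.append((step, float(state.mean()), float(state.max())))

        simulate(n_steps=100, n_cells=100_000, analyze=analyze)
        print(summaries[-1])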

  17. Computer-Based Technologies in Dentistry: Types and Applications

    PubMed Central

    Albuha Al-Mussawi, Raja’a M.; Farid, Farzaneh

    2016-01-01

    During dental education, dental students learn how to examine patients, make diagnoses, plan treatment and perform dental procedures perfectly and efficiently. However, advances in computer-based technologies including virtual reality (VR) simulators, augmented reality (AR) and computer aided design/computer aided manufacturing (CAD/CAM) systems have resulted in new modalities for instruction and practice of dentistry. Virtual reality dental simulators enable repeated, objective and assessable practice in various controlled situations. Superimposition of three-dimensional (3D) virtual images on actual images in AR allows surgeons to simultaneously visualize the surgical site and superimpose informative 3D images of invisible regions on the surgical site to serve as a guide. The use of CAD/CAM systems for designing and manufacturing of dental appliances and prostheses has been well established. This article reviews computer-based technologies, their application in dentistry and their potentials and limitations in promoting dental education, training and practice. Practitioners will be able to choose from a broader spectrum of options in their field of practice by becoming familiar with new modalities of training and practice. PMID:28392819

  18. Computer-Based Technologies in Dentistry: Types and Applications.

    PubMed

    Albuha Al-Mussawi, Raja'a M; Farid, Farzaneh

    2016-06-01

    During dental education, dental students learn how to examine patients, make diagnoses, plan treatment and perform dental procedures perfectly and efficiently. However, advances in computer-based technologies including virtual reality (VR) simulators, augmented reality (AR) and computer aided design/computer aided manufacturing (CAD/CAM) systems have resulted in new modalities for instruction and practice of dentistry. Virtual reality dental simulators enable repeated, objective and assessable practice in various controlled situations. Superimposition of three-dimensional (3D) virtual images on actual images in AR allows surgeons to simultaneously visualize the surgical site and superimpose informative 3D images of invisible regions on the surgical site to serve as a guide. The use of CAD/CAM systems for designing and manufacturing of dental appliances and prostheses has been well established. This article reviews computer-based technologies, their application in dentistry and their potentials and limitations in promoting dental education, training and practice. Practitioners will be able to choose from a broader spectrum of options in their field of practice by becoming familiar with new modalities of training and practice.

  19. Design of a Workstation by a Cognitive Approach

    PubMed Central

    Jaspers, MWM; Steen, T.; Geelen, M.; van den Bos, C.

    2001-01-01

    Ensuring acceptance of computer systems that are easy to use, provide the desired functionality, and fit into users' work practices requires improved methods for system design and evaluation. Both designing and evaluating workstations that link up smoothly with the daily routine of physicians' work require a thorough understanding of their working practices. The application of methods from cognitive science may contribute to such an understanding of the activities involved in medical information processing. We used cognitive task analysis in designing a physicians' workstation; it appears to be a promising method for ensuring that a system meets user needs.

  20. Computer modeling of a hot filament diamond deposition reactor

    NASA Technical Reports Server (NTRS)

    Kuczmarski, Maria A.; Washlock, Paul A.; Angus, John C.

    1991-01-01

    A commercial fluid mechanics program, FLUENT, has been applied to the modeling of a hot-filament diamond deposition reactor. Streamlines and contours of constant temperature and species concentrations are obtained for practical reactor geometries and conditions. The modeling is presently restricted to two-dimensional simulations and to a chemical mechanism of ten independent homogeneous and surface reactions. Comparisons are made between predicted power consumption, substrate temperature, and concentrations of atomic hydrogen and methyl-radical with values taken from the literature. The results to date indicate that the modeling can aid in the rational design and analysis of practical reactor configurations.

  1. Cognitive Task Analysis and Intelligent Computer-Based Training Systems: Lessons Learned from Coached Practice Environments in Air Force Avionics.

    ERIC Educational Resources Information Center

    Katz, Sandra N.; Hall, Ellen; Lesgold, Alan

    This paper describes some results of a collaborative effort between the University of Pittsburgh and the Air Force to develop advanced troubleshooting training for F-15 maintenance technicians. The focus is on the cognitive task methodology used in the development of three intelligent tutoring systems to inform their instructional content and…

  2. The Practical Impact of Recent Computer Advances on the Analysis and Design of Large Scale Networks

    DTIC Science & Technology

    1974-12-01

    Communications, ICC-74, June 17-19, Minneapolis, Minnesota, pp. 31C-1-21C-5. 28. Gitman, I., R. M. Van Slyke and H. Frank, "On Splitting Random Access Broadcast...1974. 29. Gitman, I., "On the Capacity of Slotted ALOHA Network and Some Design Problems," IEEE Transactions on Communications, March, 1975. 30

  3. Systematic Review of Studies Promoting the Use of Assistive Technology Devices by Young Children with Disabilities. Practical Evaluation Reports, Volume 5, Number 1

    ERIC Educational Resources Information Center

    Dunst, Carl J.; Trivette, Carol M.; Hamby, Deborah W.; Simkus, Andrew

    2013-01-01

    Findings from a meta-analysis of studies investigating the use of five different assistive technology devices (switch interfaces, powered mobility, computers, augmentative communication, weighted/pressure vests) with young children with disabilities are reported. One hundred and nine studies including 1,342 infants, toddlers, and preschoolers were…

  4. Language and Discourse Analysis with Coh-Metrix: Applications from Educational Material to Learning Environments at Scale

    ERIC Educational Resources Information Center

    Dowell, Nia M. M.; Graesser, Arthur C.; Cai, Zhiqiang

    2016-01-01

    The goal of this article is to preserve and distribute the information presented at the LASI (2014) workshop on Coh-Metrix, a theoretically grounded, computational linguistics facility that analyzes texts on multiple levels of language and discourse. The workshop focused on the utility of Coh-Metrix in discourse theory and educational practice. We…

  5. Supporting Regularized Logistic Regression Privately and Efficiently.

    PubMed

    Li, Wenfa; Liu, Hongzhe; Yang, Peng; Xie, Wei

    2016-01-01

    As one of the most popular statistical and machine learning models, logistic regression with regularization has found wide adoption in biomedicine, social sciences, information technology, and so on. These domains often involve data of human subjects that are contingent upon strict privacy regulations. Concerns over data privacy make it increasingly difficult to coordinate and conduct large-scale collaborative studies, which typically rely on cross-institution data sharing and joint analysis. Our work here focuses on safeguarding regularized logistic regression, a widely used statistical model that has not yet been investigated from a data security and privacy perspective. We consider a common use scenario of multi-institution collaborative studies, such as in the form of research consortia or networks as widely seen in genetics, epidemiology, social sciences, etc. To make our privacy-enhancing solution practical, we demonstrate a non-conventional and computationally efficient method leveraging distributed computing and strong cryptography to provide comprehensive protection over individual-level and summary data. Extensive empirical evaluations on several studies validate the privacy guarantee, efficiency and scalability of our proposal. We also discuss the practical implications of our solution for large-scale studies and applications from various disciplines, including genetic and biomedical studies, smart grid, network analysis, etc.
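
    For context, the sketch below fits a plain L2-regularized logistic regression on synthetic data; it shows only the base statistical model, not the distributed computing or cryptographic protections the study proposes.

        # Minimal sketch of L2-regularized logistic regression on synthetic data.
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(1)
        X = rng.standard_normal((500, 10))
        true_w = rng.standard_normal(10)
        y = (X @ true_w + 0.5 * rng.standard_normal(500) > 0).astype(int)

        X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)

        # C is the inverse regularization strength; smaller C = stronger L2 penalty.
        model = LogisticRegression(penalty="l2", C=1.0, max_iter=1000)
        model.fit(X_train, y_train)
        print("test accuracy:", model.score(X_test, y_test))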

  6. Patient and health care professional views and experiences of computer agent-supported health care.

    PubMed

    Neville, Ron G; Greene, Alexandra C; Lewis, Sue

    2006-01-01

    To explore patient and health care professional (HCP) views towards the use of multi-agent computer systems in their GP practice. Qualitative analysis of in-depth interviews and analysis of transcriptions. Urban health centre in Dundee, Scotland. Five representative healthcare professionals and 11 patients. Emergent themes from interviews revealed participants' attitudes and beliefs, which were coded and indexed. Patients and HCPs had similar beliefs, attitudes and views towards the implementation of multi-agent systems (MAS). Both felt modern communication methods were useful to supplement, not supplant, face-to-face consultations between doctors and patients. This was based on the immense trust these patients placed in their doctors in this practice, which extended to trust in their choice of communication technology and security. Rapid access to medical information increased patients' sense of shared partnership and self-efficacy. Patients and HCPs expressed respect for each other's time and were keen to embrace technology that made interactions more efficient, including for the altruistic benefit of others less technically competent. Patients and HCPs welcomed the introduction of agent technology to the delivery of health care. Widespread use will depend more on the trust patients place in their own GP than on technological issues.

  7. Supporting Regularized Logistic Regression Privately and Efficiently

    PubMed Central

    Li, Wenfa; Liu, Hongzhe; Yang, Peng; Xie, Wei

    2016-01-01

    As one of the most popular statistical and machine learning models, logistic regression with regularization has found wide adoption in biomedicine, social sciences, information technology, and so on. These domains often involve data of human subjects that are contingent upon strict privacy regulations. Concerns over data privacy make it increasingly difficult to coordinate and conduct large-scale collaborative studies, which typically rely on cross-institution data sharing and joint analysis. Our work here focuses on safeguarding regularized logistic regression, a widely used statistical model that has not yet been investigated from a data security and privacy perspective. We consider a common use scenario of multi-institution collaborative studies, such as in the form of research consortia or networks as widely seen in genetics, epidemiology, social sciences, etc. To make our privacy-enhancing solution practical, we demonstrate a non-conventional and computationally efficient method leveraging distributed computing and strong cryptography to provide comprehensive protection over individual-level and summary data. Extensive empirical evaluations on several studies validate the privacy guarantee, efficiency and scalability of our proposal. We also discuss the practical implications of our solution for large-scale studies and applications from various disciplines, including genetic and biomedical studies, smart grid, network analysis, etc. PMID:27271738

  8. 16 CFR 4.3 - Time.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 16 Commercial Practices 1 2010-01-01 2010-01-01 false Time. 4.3 Section 4.3 Commercial Practices FEDERAL TRADE COMMISSION ORGANIZATION, PROCEDURES AND RULES OF PRACTICE MISCELLANEOUS RULES § 4.3 Time. (a) Computation. Computation of any period of time prescribed or allowed by the rules in this chapter, by order of...

  9. 16 CFR 4.3 - Time.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 16 Commercial Practices 1 2011-01-01 2011-01-01 false Time. 4.3 Section 4.3 Commercial Practices FEDERAL TRADE COMMISSION ORGANIZATION, PROCEDURES AND RULES OF PRACTICE MISCELLANEOUS RULES § 4.3 Time. (a) Computation. Computation of any period of time prescribed or allowed by the rules in this chapter, by order of...

  10. 16 CFR 4.3 - Time.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 16 Commercial Practices 1 2014-01-01 2014-01-01 false Time. 4.3 Section 4.3 Commercial Practices FEDERAL TRADE COMMISSION ORGANIZATION, PROCEDURES AND RULES OF PRACTICE MISCELLANEOUS RULES § 4.3 Time. (a) Computation. Computation of any period of time prescribed or allowed by the rules in this chapter, by order of...

  11. 16 CFR 4.3 - Time.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 16 Commercial Practices 1 2012-01-01 2012-01-01 false Time. 4.3 Section 4.3 Commercial Practices FEDERAL TRADE COMMISSION ORGANIZATION, PROCEDURES AND RULES OF PRACTICE MISCELLANEOUS RULES § 4.3 Time. (a) Computation. Computation of any period of time prescribed or allowed by the rules in this chapter, by order of...

  12. 16 CFR 4.3 - Time.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 16 Commercial Practices 1 2013-01-01 2013-01-01 false Time. 4.3 Section 4.3 Commercial Practices FEDERAL TRADE COMMISSION ORGANIZATION, PROCEDURES AND RULES OF PRACTICE MISCELLANEOUS RULES § 4.3 Time. (a) Computation. Computation of any period of time prescribed or allowed by the rules in this chapter, by order of...

  13. Some Thoughts Regarding Practical Quantum Computing

    NASA Astrophysics Data System (ADS)

    Ghoshal, Debabrata; Gomez, Richard; Lanzagorta, Marco; Uhlmann, Jeffrey

    2006-03-01

    Quantum computing has become an important area of research in computer science because of its potential to provide more efficient algorithmic solutions to certain problems than are possible with classical computing. The ability to perform parallel operations over an exponentially large computational space has proved to be the main advantage of the quantum computing model. In this regard, we are particularly interested in the potential applications of quantum computers to enhance real software systems of interest to the defense, industrial, scientific and financial communities. However, while much has been written in popular and scientific literature about the benefits of the quantum computational model, several of the problems associated with the practical implementation of real-life complex software systems in quantum computers are often ignored. In this presentation we will argue that practical quantum computation is not as straightforward as commonly advertised, even if the technological problems associated with the manufacturing and engineering of large-scale quantum registers were solved overnight. We will discuss some of the frequently overlooked difficulties that plague quantum computing in the areas of memories, I/O, addressing schemes, compilers, oracles, approximate information copying, logical debugging, error correction and fault-tolerant computing protocols.

  14. The role of information technology usage in physician practice satisfaction.

    PubMed

    Menachemi, Nir; Powers, Thomas L; Brooks, Robert G

    2009-01-01

    Despite the growing use of information technology (IT) in medical practices, little is known about the relationship between IT and physician satisfaction. The objective of this study was to examine the relationship between physician IT adoption (of various applications) and overall practice satisfaction, as well as satisfaction with the level of computerization at the practice. Data from a Florida survey examining physicians' use of IT and satisfaction were analyzed. Odds ratios (ORs), adjusted for physician demographics and practice characteristics, were computed utilizing logistic regressions to study the independent relationship of electronic health record (EHR) usage, PDA usage, use of e-mail with patients, and the use of disease management software with satisfaction. In addition, we examined the relationship between satisfaction with IT and overall satisfaction with the current medical practice. In multivariate analysis, EHR users were 5 times more likely to be satisfied with the level of computerization in their practice (OR = 4.93, 95% CI = 3.68-6.61) and 1.8 times more likely to be satisfied with their overall medical practice (OR = 1.77, 95% CI = 1.35-2.32). PDA use was also associated with an increase in satisfaction with the level of computerization (OR = 1.23, 95% CI = 1.02-1.47) and with the overall medical practice (OR = 1.30, 95% CI = 1.07-1.57). E-mail use with patients was negatively related to satisfaction with the level of computerization in the practice (OR = 0.69, 95% CI = 0.54-0.90). Last, physicians who were satisfied with IT were 4 times more likely to be satisfied with the current state of their medical practice (OR = 3.97, 95% CI = 3.29-4.81). Physician users of IT applications, especially EHRs, are generally satisfied with these technologies. Potential adopters and/or policy makers interested in influencing IT adoption should consider the positive impact that computer automation can have on medical practice.
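
    As a hedged illustration of how adjusted odds ratios of this kind are typically obtained, the sketch below fits a logistic regression on synthetic data and exponentiates the coefficients and their confidence limits; the variable names and effect sizes are invented, and the study's survey data are not reproduced.

        # Illustrative only: adjusted odds ratios and 95% CIs from a logistic
        # regression on synthetic data (OR = exp(coefficient)).
        import numpy as np
        import pandas as pd
        import statsmodels.api as sm

        rng = np.random.default_rng(2)
        n = 2000
        ehr_user = rng.integers(0, 2, n)           # hypothetical EHR use (0/1)
        pda_user = rng.integers(0, 2, n)           # hypothetical PDA use (0/1)
        age = rng.normal(48, 10, n)                # covariate

        # Simulate "satisfied with practice" with higher odds for EHR users.
        logit = -0.5 + 0.6 * ehr_user + 0.25 * pda_user - 0.01 * (age - 48)
        satisfied = rng.random(n) < 1 / (1 + np.exp(-logit))

        X = sm.add_constant(pd.DataFrame({"ehr": ehr_user, "pda": pda_user, "age": age}))
        fit = sm.Logit(satisfied.astype(int), X).fit(disp=0)

        odds_ratios = np.exp(fit.params)
        ci = np.exp(fit.conf_int()).rename(columns={0: "2.5%", 1: "97.5%"})
        print(pd.concat([odds_ratios.rename("OR"), ci], axis=1))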

  15. Doctors' experience with handheld computers in clinical practice: qualitative study.

    PubMed

    McAlearney, Ann Scheck; Schweikhart, Sharon B; Medow, Mitchell A

    2004-05-15

    To examine doctors' perspectives about their experiences with handheld computers in clinical practice. Qualitative study of eight focus groups consisting of doctors with diverse training and practice patterns. Six practice settings across the United States and two additional focus group sessions held at a national meeting of general internists. 54 doctors who did or did not use handheld computers. Doctors who used handheld computers in clinical practice seemed generally satisfied with them and reported diverse patterns of use. Users perceived that the devices helped them increase productivity and improve patient care. Barriers to use concerned the device itself and personal and perceptual constraints, with perceptual factors such as comfort with technology, preference for paper, and the impression that the devices are not easy to use somewhat difficult to overcome. Participants suggested that organisations can help promote handheld computers by providing advice on purchase, usage, training, and user support. Participants expressed concern about reliability and security of the device but were particularly concerned about dependency on the device and over-reliance as a substitute for clinical thinking. Doctors expect handheld computers to become more useful, and most seem interested in leveraging (getting the most value from) their use. Key opportunities with handheld computers included their use as a stepping stone to build doctors' comfort with other information technology and ehealth initiatives and providing point of care support that helps improve patient care.

  16. A comparison of computer-assisted detection (CAD) programs for the identification of colorectal polyps: performance and sensitivity analysis, current limitations and practical tips for radiologists.

    PubMed

    Bell, L T O; Gandhi, S

    2018-06-01

    To directly compare the accuracy and speed of analysis of two commercially available computer-assisted detection (CAD) programs in detecting colorectal polyps. In this retrospective single-centre study, patients who had colorectal polyps identified on computed tomography colonography (CTC) and subsequent lower gastrointestinal endoscopy, were analysed using two commercially available CAD programs (CAD1 and CAD2). Results were compared against endoscopy to ascertain sensitivity and positive predictive value (PPV) for colorectal polyps. Time taken for CAD analysis was also calculated. CAD1 demonstrated a sensitivity of 89.8%, PPV of 17.6% and mean analysis time of 125.8 seconds. CAD2 demonstrated a sensitivity of 75.5%, PPV of 44.0% and mean analysis time of 84.6 seconds. The sensitivity and PPV for colorectal polyps and CAD analysis times can vary widely between current commercially available CAD programs. There is still room for improvement. Generally, there is a trade-off between sensitivity and PPV, and so further developments should aim to optimise both. Information on these factors should be made routinely available, so that an informed choice on their use can be made. This information could also potentially influence the radiologist's use of CAD results. Copyright © 2018 The Royal College of Radiologists. Published by Elsevier Ltd. All rights reserved.
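
    For reference, sensitivity and positive predictive value follow directly from true-positive, false-positive and false-negative counts. The counts below are hypothetical, chosen only so that the formulas reproduce CAD1's reported 89.8% sensitivity and 17.6% PPV; the study's actual counts are not given in the abstract.

        # Worked example: sensitivity = TP / (TP + FN), PPV = TP / (TP + FP).
        def sensitivity_and_ppv(tp, fp, fn):
            sensitivity = tp / (tp + fn)
            ppv = tp / (tp + fp)
            return sensitivity, ppv

        # Hypothetical CAD detections against an endoscopy reference standard.
        tp, fp, fn = 44, 206, 5
        sens, ppv = sensitivity_and_ppv(tp, fp, fn)
        print(f"sensitivity = {sens:.1%}, PPV = {ppv:.1%}")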

  17. Spatial distribution of clinical computer systems in primary care in England in 2016 and implications for primary care electronic medical record databases: a cross-sectional population study.

    PubMed

    Kontopantelis, Evangelos; Stevens, Richard John; Helms, Peter J; Edwards, Duncan; Doran, Tim; Ashcroft, Darren M

    2018-02-28

    UK primary care databases (PCDs) are used by researchers worldwide to inform clinical practice. These databases have been primarily tied to single clinical computer systems, but little is known about the adoption of these systems by primary care practices or their geographical representativeness. We explore the spatial distribution of clinical computing systems and discuss the implications for the longevity and regional representativeness of these resources. Cross-sectional study. English primary care clinical computer systems. 7526 general practices in August 2016. Spatial mapping of family practices in England in 2016 by clinical computer system at two geographical levels, the lower Clinical Commissioning Group (CCG, 209 units) and the higher National Health Service regions (14 units). Data for practices included numbers of doctors, nurses and patients, and area deprivation. Of 7526 practices, Egton Medical Information Systems (EMIS) was used in 4199 (56%), SystmOne in 2552 (34%) and Vision in 636 (9%). Great regional variability was observed for all systems, with EMIS having a stronger presence in the West of England, London and the South; SystmOne in the East and some regions in the South; and Vision in London, the South, Greater Manchester and Birmingham. PCDs based on single clinical computer systems are geographically clustered in England. For example, Clinical Practice Research Datalink and The Health Improvement Network, the most popular primary care databases in terms of research outputs, are based on the Vision clinical computer system, used by <10% of practices and heavily concentrated in three major conurbations and the South. Researchers need to be aware of the analytical challenges posed by clustering, and barriers to accessing alternative PCDs need to be removed. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  18. The acceptability of computer applications to group practices.

    PubMed

    Zimmerman, J; Gordon, R S; Tao, D K; Boxerman, S B

    1978-01-01

    Of the 72 identified group practices in a midwest urban environment, 39 were found to use computers. The practices had been influenced strongly by vendors in their selection of an automated system or service, and had usually spent less than a work-month analyzing their needs and reviewing alternate ways in which those needs could be met. Ninety-seven percent of the practices had some financial applications and 64% had administrative applications, but only 2.5% had medical applications. For half the practices at least 2 months elapsed from the time the automated applications were put into operation until they were considered to be integrated into the office routine. Advantages experienced by at least a third of the practices using computers were that the work was done faster, information was more readily available, and costs were reduced. The most common disadvantage was inflexibility. Most (89%) of the practices believed that automation was preferable to their previous manual system.

  19. Practical considerations in experimental computational sensing

    NASA Astrophysics Data System (ADS)

    Poon, Phillip K.

    Computational sensing has demonstrated the ability to ameliorate or eliminate many trade-offs in traditional sensors. Rather than attempting to form a perfect image, then sampling at the Nyquist rate, and reconstructing the signal of interest prior to post-processing, the computational sensor attempts to utilize a priori knowledge, active or passive coding of the signal-of-interest combined with a variety of algorithms to overcome the trade-offs or to improve various task-specific metrics. While it is a powerful approach to radically new sensor architectures, published research tends to focus on architecture concepts and positive results. Little attention is given to the practical issues faced when implementing computational sensing prototypes. I will discuss the various practical challenges that I encountered while developing three separate applications of computational sensors. The first is a compressive sensing based object tracking camera, the SCOUT, which exploits the sparsity of motion between consecutive frames while using no moving parts to create a pseudo-random shift-variant point-spread function. The second is a spectral imaging camera, the AFSSI-C, which uses a modified version of Principal Component Analysis with a Bayesian strategy to adaptively design spectral filters for direct spectral classification using a digital micro-mirror device (DMD) based architecture. The third demonstrates two separate architectures for spectral unmixing, using either an adaptive algorithm or a hybrid technique combining Maximum Noise Fraction with random filter selection, on a liquid-crystal-on-silicon based computational spectral imager, the LCSI. All of these applications demonstrate a variety of challenges that have been addressed or continue to challenge the computational sensing community. One issue is calibration, since many computational sensors require an inversion step and, in the case of compressive sensing, lack redundancy in the measurement data. Another issue is over-multiplexing: as more light is collected per sample, the finite dynamic range and quantization resolution can begin to degrade the recovery of the relevant information. A priori knowledge of the sparsity and/or other statistics of the signal or noise is often used by computational sensors to outperform their isomorphic counterparts. This is demonstrated in all three of the sensors I have developed. These challenges and others will be discussed using a case-study approach through these three applications.
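
    As a generic illustration of the sparsity-exploiting recovery that compressive sensors such as the SCOUT rely on, the sketch below recovers a sparse signal from underdetermined random measurements with ISTA (iterative soft-thresholding); it is not the SCOUT, AFSSI-C or LCSI processing chain.

        # Generic compressive-sensing sketch: recover sparse x from y = A @ x
        # with m < n measurements, using ISTA.
        import numpy as np

        rng = np.random.default_rng(3)
        n, m, k = 256, 100, 8                # signal length, measurements, nonzeros

        x_true = np.zeros(n)
        x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)

        A = rng.standard_normal((m, n)) / np.sqrt(m)   # random sensing matrix
        y = A @ x_true

        lam = 0.01
        step = 1.0 / np.linalg.norm(A, 2) ** 2         # 1 / Lipschitz constant of A^T A

        x = np.zeros(n)
        for _ in range(500):
            grad = A.T @ (A @ x - y)
            z = x - step * grad
            x = np.sign(z) * np.maximum(np.abs(z) - lam * step, 0.0)  # soft threshold

        print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))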

  20. Soldiers in the Blogosphere Using New Media to Help Win the War for Public Opinion

    DTIC Science & Technology

    2009-04-01

    relaxed its policy on blogging, but still bans the use of community networking sites such as MySpace on government computers. The Army has practically ignored these new media sources to get its story...

  1. Hardware accelerated high performance neutron transport computation based on AGENT methodology

    NASA Astrophysics Data System (ADS)

    Xiao, Shanjie

    The spatial heterogeneity of next-generation Gen-IV nuclear reactor core designs brings challenges to neutron transport analysis. The Arbitrary Geometry Neutron Transport (AGENT) code is a three-dimensional neutron transport analysis code being developed at the Laboratory for Neutronics and Geometry Computation (NEGE) at Purdue University. It can accurately describe spatial heterogeneity in a hierarchical structure through the R-function solid modeler. The previous version of AGENT coupled the 2D transport MOC solver and the 1D diffusion NEM solver to solve the three-dimensional Boltzmann transport equation. In this research, the 2D/1D coupling methodology was expanded to couple two transport solvers, the radial 2D MOC solver and the axial 1D MOC solver, for better accuracy. The expansion was benchmarked with the widely applied C5G7 benchmark models and two fast breeder reactor models, and showed good agreement with the reference Monte Carlo results. In practice, accurate neutron transport analysis for a full reactor core is still time-consuming, which limits its application. Therefore, another component of my research focused on designing specific hardware, based on the reconfigurable computing technique, to accelerate AGENT computations. This is the first time an application of this type has been used in reactor physics and neutron transport for reactor design. The most time-consuming part of the AGENT algorithm was identified, and the architecture of the AGENT acceleration system was designed based on this analysis. Through parallel computation on the specially designed, highly efficient architecture, the FPGA-based acceleration design achieves high performance at a much lower working frequency than CPUs. Design simulations show that the acceleration design would be able to speed up large-scale AGENT computations by about 20 times. The high-performance AGENT acceleration system will drastically shorten the computation time for 3D full-core neutron transport analysis, making the AGENT methodology unique and advantageous, and thus extending the possible application range of neutron transport analysis in both industrial engineering and academic research.

  2. Cost-Effectiveness and Cost-Utility of Internet-Based Computer Tailoring for Smoking Cessation

    PubMed Central

    Evers, Silvia MAA; de Vries, Hein; Hoving, Ciska

    2013-01-01

    Background Although effective smoking cessation interventions exist, information is limited about their cost-effectiveness and cost-utility. Objective To assess the cost-effectiveness and cost-utility of an Internet-based multiple computer-tailored smoking cessation program and tailored counseling by practice nurses working in Dutch general practices compared with an Internet-based multiple computer-tailored program only and care as usual. Methods The economic evaluation was embedded in a randomized controlled trial, for which 91 practice nurses recruited 414 eligible smokers. Smokers were randomized to receive multiple tailoring and counseling (n=163), multiple tailoring only (n=132), or usual care (n=119). Self-reported cost and quality of life were assessed during a 12-month follow-up period. Prolonged abstinence and 24-hour and 7-day point prevalence abstinence were assessed at 12-month follow-up. The trial-based economic evaluation was conducted from a societal perspective. Uncertainty was accounted for by bootstrapping (1000 times) and sensitivity analyses. Results No significant differences were found between the intervention arms with regard to baseline characteristics or effects on abstinence, quality of life, and addiction level. However, participants in the multiple tailoring and counseling group reported significantly more annual health care–related costs than participants in the usual care group. Cost-effectiveness analysis, using prolonged abstinence as the outcome measure, showed that the mere multiple computer-tailored program had the highest probability of being cost-effective. Compared with usual care, in this group €5100 had to be paid for each additional abstinent participant. With regard to cost-utility analyses, using quality of life as the outcome measure, usual care was probably most efficient. Conclusions To our knowledge, this was the first study to determine the cost-effectiveness and cost-utility of an Internet-based smoking cessation program with and without counseling by a practice nurse. Although the Internet-based multiple computer-tailored program seemed to be the most cost-effective treatment, the cost-utility was probably highest for care as usual. However, to ease the interpretation of cost-effectiveness results, future research should aim at identifying an acceptable cutoff point for the willingness to pay per abstinent participant. PMID:23491820
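
    The sketch below illustrates, on synthetic data, the kind of non-parametric bootstrap used to express uncertainty around an incremental cost-effectiveness ratio; the cost distributions, abstinence rates and resampling scheme are assumptions, not the trial's.

        # Hedged sketch: bootstrapping an incremental cost-effectiveness ratio (ICER)
        # on synthetic per-participant data.
        import numpy as np

        rng = np.random.default_rng(4)
        n = 300

        # Hypothetical per-participant costs (EUR) and abstinence outcomes (0/1).
        cost_intervention = rng.gamma(shape=2.0, scale=400.0, size=n)
        cost_usual_care = rng.gamma(shape=2.0, scale=300.0, size=n)
        abstinent_intervention = rng.random(n) < 0.25
        abstinent_usual_care = rng.random(n) < 0.10

        def icer(c_new, e_new, c_old, e_old):
            # Incremental cost per additional abstinent participant.
            return (c_new.mean() - c_old.mean()) / (e_new.mean() - e_old.mean())

        # Non-parametric bootstrap (1000 resamples) for the ICER's uncertainty.
        boot = []
        for _ in range(1000):
            i = rng.integers(0, n, n)
            j = rng.integers(0, n, n)
            boot.append(icer(cost_intervention[i], abstinent_intervention[i],
                             cost_usual_care[j], abstinent_usual_care[j]))

        boot = np.array(boot)
        print("point ICER:", icer(cost_intervention, abstinent_intervention,
                                  cost_usual_care, abstinent_usual_care))
        print("bootstrap 2.5th-97.5th percentiles:", np.percentile(boot, [2.5, 97.5]))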

  3. Engaging Women in Computer Science and Engineering: Promising Practices for Promoting Gender Equity in Undergraduate Research Experiences

    ERIC Educational Resources Information Center

    Kim, Karen A.; Fann, Amy J.; Misa-Escalante, Kimberly O.

    2011-01-01

    Building on research that identifies and addresses issues of women's underrepresentation in computing, this article describes promising practices in undergraduate research experiences that promote women's long-term interest in computer science and engineering. Specifically, this article explores whether and how REU programs include programmatic…

  4. Cutting Technology Costs with Refurbished Computers

    ERIC Educational Resources Information Center

    Dessoff, Alan

    2010-01-01

    Many district administrators are finding that they can save money on computers by buying preowned ones instead of new ones. The practice has other benefits as well: It allows districts to give more computers to more students who need them, and it also promotes good environmental practices by keeping the machines out of landfills, where they…

  5. Using Computer-Aided Instruction to Support the Systematic Practice of Phonological Skills in Beginning Readers

    ERIC Educational Resources Information Center

    Wild, Mary

    2009-01-01

    The paper reports the results of a randomised control trial investigating the use of computer-aided instruction (CAI) for practising phonological awareness skills with beginning readers. Two intervention groups followed the same phonological awareness programme: one group undertook practice exercises using a computer and the other group undertook…

  6. Correlation between Academic and Skills-Based Tests in Computer Networks

    ERIC Educational Resources Information Center

    Buchanan, William

    2006-01-01

    Computing-related programmes and modules have many problems, especially related to large class sizes, large-scale plagiarism, module franchising, and an increased requirement from students for increased amounts of hands-on, practical work. This paper presents a practical computer networks module which uses a mixture of online examinations and a…

  7. Cane Toad or Computer Mouse? Real and Computer-Simulated Laboratory Exercises in Physiology Classes

    ERIC Educational Resources Information Center

    West, Jan; Veenstra, Anneke

    2012-01-01

    Traditional practical classes in many countries are being rationalised to reduce costs. The challenge for university educators is to provide students with the opportunity to reinforce theoretical concepts by running something other than a traditional practical program. One alternative is to replace wet labs with comparable computer simulations.…

  8. Computers in the examination room and the electronic health record: physicians' perceived impact on clinical encounters before and after full installation and implementation.

    PubMed

    Doyle, Richard J; Wang, Nina; Anthony, David; Borkan, Jeffrey; Shield, Renee R; Goldman, Roberta E

    2012-10-01

    We compared physicians' self-reported attitudes and behaviours regarding electronic health record (EHR) use before and after installation of computers in patient examination rooms and transition to full implementation of an EHR in a family medicine training practice, to identify the anticipated and observed effects of these changes on physicians' practices and clinical encounters. We conducted two individual qualitative interviews with family physicians: the first before, and the second 8 months after, full implementation of the EHR and installation of computers in the examination rooms. Data were analysed through project team discussions and subsequent coding with qualitative analysis software. At the first interviews, physicians frequently expressed concerns about the potential negative effect of the EHR on quality of care and physician-patient interaction, the adequacy of their skills in EHR use, and privacy and confidentiality. Nevertheless, most physicians also anticipated multiple benefits, including improved accessibility of patient data and online health information. In the second interviews, physicians reported that their concerns did not persist. Many anticipated benefits were realized, appearing to facilitate collaborative physician-patient relationships. Physicians reported a greater teaching role with patients and sharing online medical information and treatment plan decisions. Before computer installation and full EHR implementation, physicians expressed concerns about the impact of computer use on patient care. After installation and implementation, however, many concerns were mitigated. Using computers in the examination rooms to document and access patients' records along with online medical information and decision-making tools appears to contribute to improved physician-patient communication and collaboration.

  9. Earth Science Informatics Comes of Age

    NASA Technical Reports Server (NTRS)

    Jodha, Siri; Khalsa, S.; Ramachandran, Rahul

    2014-01-01

    The volume and complexity of Earth science data have steadily increased, placing ever-greater demands on researchers, software developers and data managers tasked with handling such data. Additional demands arise from requirements being levied by funding agencies and governments to better manage, preserve and provide open access to data. Fortunately, over the past 10-15 years significant advances in information technology, such as increased processing power, advanced programming languages, more sophisticated and practical standards, and near-ubiquitous internet access have made the jobs of those acquiring, processing, distributing and archiving data easier. These advances have also led to an increasing number of individuals entering the field of informatics as it applies to Geoscience and Remote Sensing. Informatics is the science and technology of applying computers and computational methods to the systematic analysis, management, interchange, and representation of data, information, and knowledge. Informatics also encompasses the use of computers and computational methods to support decisionmaking and other applications for societal benefits.

  10. A novel evaluation method for building construction project based on integrated information entropy with reliability theory.

    PubMed

    Bai, Xiao-ping; Zhang, Xi-wei

    2013-01-01

    Selecting construction schemes for a building engineering project is a complex multiobjective optimization decision process in which many indexes must be considered to find the optimum scheme. Aiming at this problem, this paper selects cost, progress, quality, and safety as the four first-order evaluation indexes, uses a quantitative method for the cost index, uses integrated qualitative and quantitative methodologies for the progress, quality, and safety indexes, and integrates engineering economics, reliability theories, and information entropy theory to present a new evaluation method for building construction projects. Combined with a practical case, this paper also presents the detailed computing processes and steps, including selecting all order indexes, establishing the index matrix, computing score values of all order indexes, computing the synthesis score, sorting all selected schemes, and making the analysis and decision. The presented method can offer valuable references for risk computation in building construction projects.
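
    A minimal sketch of the entropy-weighting and synthesis-scoring steps is given below, assuming a small scheme-by-index matrix already oriented so that larger values are better; it illustrates the generic entropy-weight procedure only, not the paper's combination with reliability theory.

        # Generic entropy-weight scoring of candidate schemes.
        import numpy as np

        # Hypothetical scores of 4 construction schemes on 4 indexes
        # (cost, progress, quality, safety), already oriented so larger = better.
        X = np.array([
            [0.70, 0.60, 0.80, 0.75],
            [0.65, 0.80, 0.70, 0.70],
            [0.80, 0.55, 0.75, 0.85],
            [0.60, 0.70, 0.85, 0.65],
        ])

        m, n = X.shape
        P = X / X.sum(axis=0)                       # column-normalized proportions
        k = 1.0 / np.log(m)
        entropy = -k * (P * np.log(P)).sum(axis=0)  # information entropy per index
        diversity = 1.0 - entropy                   # degree of differentiation
        weights = diversity / diversity.sum()       # entropy weights

        scores = X @ weights                        # synthesis score per scheme
        ranking = np.argsort(scores)[::-1]
        print("weights:", np.round(weights, 3))
        print("scheme ranking (best first):", ranking)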

  11. Patients and Computers as Reminders to Screen for Diabetes in Family Practice

    PubMed Central

    Kenealy, Tim; Arroll, Bruce; Petrie, Keith J

    2005-01-01

    Background In New Zealand, more than 5% of people aged 50 years and older have undiagnosed diabetes; most of them attend family practitioners (FPs) at least once a year. Objectives To test the effectiveness of patients or computers as reminders to screen for diabetes in patients attending FPs. Design A randomized-controlled trial compared screening rates in 4 intervention arms: patient reminders, computer reminders, both reminders, and usual care. The trial lasted 2 months. The patient reminder was a diabetes risk self-assessment sheet filled in by patients and given to the FP during the consultation. The computer reminder was an icon that flashed only for patients considered eligible for screening. Participants One hundred and seven FPs. Measurements The primary outcome was whether each eligible patient, who attended during the trial, was or was not tested for blood glucose. Analysis was by intention to treat and allowed for clustering by FP. Results Patient reminders (odds ratio [OR] 1.72, 95% confidence interval [CI] 1.21, 2.43), computer reminders (OR 2.55, 1.68, 3.88), and both reminders (OR 1.69, 1.11, 2.59) were all effective compared with usual care. Computer reminders were more effective than patient reminders (OR 1.49, 1.07, 2.07). Patients were more likely to be screened if they visited the FP repeatedly, if patients were non-European, if they were “regular” patients of the practice, and if their FP had a higher screening rate prior to the study. Conclusions Patient and computer reminders were effective methods to increase screening for diabetes. However, the effects were not additive. PMID:16191138

  12. Improving finite element results in modeling heart valve mechanics.

    PubMed

    Earl, Emily; Mohammadi, Hadi

    2018-06-01

    Finite element analysis is a well-established computational tool which can be used for the analysis of soft tissue mechanics. Due to the structural complexity of the leaflet tissue of the heart valve, the currently available finite element models do not adequately represent the leaflet tissue. A method of addressing this issue is to implement computationally expensive finite element models, characterized by precise constitutive models including high-order and high-density mesh techniques. In this study, we introduce a novel numerical technique that enhances the results obtained from coarse mesh finite element models to provide accuracy comparable to that of fine mesh finite element models while maintaining a relatively low computational cost. Introduced in this study is a method by which the computational expense required to solve linear and nonlinear constitutive models, commonly used in heart valve mechanics simulations, is reduced while continuing to account for large and infinitesimal deformations. This continuum model is developed based on a least-squares procedure coupled with the finite difference method, under the assumption that the components of the strain tensor are available at all nodes of the finite element mesh model. The suggested numerical technique is easy to implement, practically efficient, and requires less computational time compared to currently available commercial finite element packages such as ANSYS and/or ABAQUS.
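
    The sketch below is a generic, one-dimensional illustration of least-squares field recovery combined with finite differences (noisy nodal values fitted by a low-order polynomial, then differentiated numerically); it is not the authors' formulation for heart valve leaflets, and the field and mesh are invented.

        # Generic 1-D illustration: least-squares recovery of a nodal field
        # followed by finite-difference differentiation.
        import numpy as np

        rng = np.random.default_rng(5)
        x_nodes = np.linspace(0.0, 1.0, 11)                    # coarse 1-D mesh nodes
        strain_true = 0.5 + 2.0 * x_nodes - 3.0 * x_nodes**2   # assumed smooth field
        strain_nodes = strain_true + 0.02 * rng.standard_normal(x_nodes.size)

        # Least-squares fit of a quadratic to the noisy nodal values.
        V = np.vander(x_nodes, 3)                              # basis [x^2, x, 1]
        coeffs, *_ = np.linalg.lstsq(V, strain_nodes, rcond=None)

        # Evaluate the recovered field on a finer grid and take finite differences.
        x_fine = np.linspace(0.0, 1.0, 101)
        strain_fit = np.polyval(coeffs, x_fine)
        strain_gradient = np.gradient(strain_fit, x_fine)      # central differences

        print("fitted coefficients [a, b, c]:", np.round(coeffs, 3))
        print("max |d(strain)/dx| on fine grid:", np.abs(strain_gradient).max())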

  13. Blood Pump Development Using Rocket Engine Flow Simulation Technology

    NASA Technical Reports Server (NTRS)

    Kwak, Dochan; Kiris, Cetin

    2001-01-01

    This paper reports the progress made towards developing complete blood flow simulation capability in humans, especially in the presence of artificial devices such as valves and ventricular assist devices. Device modeling poses unique challenges distinct from computing the blood flow in natural hearts and arteries. Many elements are needed to quantify the flow in these devices, such as flow solvers, geometry modeling including flexible walls, moving boundary procedures and physiological characterization of blood. As a first step, computational technology developed for aerospace applications was extended to the analysis and development of a ventricular assist device (VAD), i.e., a blood pump. The blood flow in a VAD is practically incompressible and Newtonian, and thus an incompressible Navier-Stokes solution procedure can be applied. A primitive variable formulation is used in conjunction with the overset grid approach to handle complex moving geometry. The primary purpose of developing the incompressible flow analysis capability was to quantify the flow in advanced turbopumps for space propulsion systems. The same procedure has been extended to the development of the NASA-DeBakey VAD, which is based on an axial blood pump. Due to massive computing requirements, high-end computing is necessary for simulating three-dimensional flow in these pumps. Computational, experimental, and clinical results are presented.

  14. Zero side force volute development

    NASA Technical Reports Server (NTRS)

    Anderson, P. G.; Franz, R. J.; Farmer, R. C.; Chen, Y. S.

    1995-01-01

    Collector scrolls on high-performance centrifugal pumps are currently designed with methods based on very approximate flowfield models. Such design practices result in some volute configurations causing excessive side loads even at design flowrates. The purpose of this study was to develop and verify computational design tools which may be used to optimize volute configurations with respect to avoiding excessive loads on the bearings. The new design methodology consisted of a volute grid generation module and a computational fluid dynamics (CFD) module to describe the volute geometry and predict the radial forces for a given flow condition, respectively. Initially, the CFD module was used to predict the impeller and the volute flowfields simultaneously; however, the required computation time was found to be excessive for parametric design studies. A second computational procedure was developed which utilized an analytical impeller flowfield model and an ordinary differential equation to describe the impeller/volute coupling obtained from the literature, Adkins & Brennen (1988). The second procedure resulted in a 20- to 30-fold increase in computational speed for an analysis. The volute design analysis was validated by postulating a volute geometry, constructing a volute to this configuration, and measuring the steady radial forces over a range of flow coefficients. Excellent agreement between model predictions and observed pump operation proves the computational impeller/volute pump model to be a valuable design tool. Further applications are recommended to fully establish the benefits of this new methodology.

  15. A Secure Alignment Algorithm for Mapping Short Reads to Human Genome.

    PubMed

    Zhao, Yongan; Wang, Xiaofeng; Tang, Haixu

    2018-05-09

    Elastic and inexpensive computing resources such as clouds have been recognized as a useful solution for analyzing massive human genomic data (e.g., acquired by using next-generation sequencers) in biomedical research. However, outsourcing human genome computation to public or commercial clouds has been hindered by privacy concerns: even a small number of human genome sequences contain sufficient information for identifying the donor of the genomic data. This issue cannot be directly addressed by existing security and cryptographic techniques (such as homomorphic encryption), because they are too heavyweight to carry out practical genome computation tasks on massive data. In this article, we present a secure algorithm to accomplish read mapping, one of the most basic tasks in human genomic data analysis, based on a hybrid cloud computing model. Compared with existing approaches, our algorithm delegates most computation to the public cloud, while only performing encryption and decryption on the private cloud, and thus makes the maximum use of the computing resources of the public cloud. Furthermore, our algorithm reports results similar to those of nonsecure read mapping algorithms, including the alignment between reads and the reference genome, which can be directly used in downstream analysis such as the inference of genomic variations. We implemented the algorithm in C++ and Python on a hybrid cloud system, in which the public cloud uses an Apache Spark system.
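
    The sketch below caricatures the hybrid-cloud division of labor described above: the private side keyed-hashes short seeds (k-mers) so the public side can match only opaque digests against a hashed reference index, with verification and extension kept private. The seeding scheme, function names and key handling are illustrative assumptions, not the paper's protocol.

        # Hedged, simplified hybrid-cloud sketch: plaintext stays on the private
        # side; the public side sees only keyed hashes of seeds.
        import hmac
        import hashlib

        SECRET_KEY = b"held-only-on-the-private-cloud"
        K = 12  # seed (k-mer) length

        def keyed_hash(seed: str) -> str:
            return hmac.new(SECRET_KEY, seed.encode(), hashlib.sha256).hexdigest()

        def private_index_reference(reference: str):
            # Private cloud: hash every reference k-mer once, keep position lists.
            index = {}
            for pos in range(len(reference) - K + 1):
                index.setdefault(keyed_hash(reference[pos:pos + K]), []).append(pos)
            return index

        def public_match(hashed_index, hashed_read_seeds):
            # Public cloud: sees only opaque digests, returns candidate positions.
            return {h: hashed_index.get(h, []) for h in hashed_read_seeds}

        reference = "ACGTACGTTAGCCTAGGCTAACGTACGTTAGC"
        read = "TAGCCTAGGCTA"

        hashed_index = private_index_reference(reference)        # private side
        hashed_seeds = [keyed_hash(read[i:i + K]) for i in range(len(read) - K + 1)]
        candidates = public_match(hashed_index, hashed_seeds)     # public side
        print(candidates)   # private side would verify and extend these candidates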

  16. Computer-Assisted Instruction in Practical Nursing Education

    ERIC Educational Resources Information Center

    Kelley, Maureen

    1976-01-01

    Existing computer-assisted instructional programs for nursing students are studied and their application to the education of practical nurses is considered in the light of the recent history of nursing education. (Author)

  17. A Patient Record-Filing System for Family Practice

    PubMed Central

    Levitt, Cheryl

    1988-01-01

    The efficient storage and easy retrieval of quality records are a central concern of good family practice. Many physicians starting out in practice have difficulty choosing a practical and lasting system for storing their records. Some who have established practices are installing computers in their offices and finding that their filing systems are worn, outdated, and incompatible with computerized systems. This article describes a new filing system installed simultaneously with a new computer system in a family-practice teaching centre. The approach adopted solved all identifiable problems and is applicable in family practices of all sizes.

  18. Advances in computer imaging/applications in facial plastic surgery.

    PubMed

    Papel, I D; Jiannetto, D F

    1999-01-01

    Rapidly progressing computer technology, ever-increasing patient expectations, and a confusing medicolegal environment require clarification of the role of computer imaging/applications. Advances in computer technology and its applications are reviewed. A brief historical discussion is included for perspective. Improvements in both hardware and software with the advent of digital imaging have allowed great increases in speed and accuracy in patient imaging. This facilitates doctor-patient communication and possibly realistic patient expectations. Patients seeking cosmetic surgery now often expect preoperative imaging. Although society in general has become more litigious, a literature search up to 1998 reveals no lawsuits directly involving computer imaging. It appears that conservative utilization of computer imaging by the facial plastic surgeon may actually reduce liability and promote communication. Recent advances have significantly enhanced the value of computer imaging in the practice of facial plastic surgery. These technological advances appear to provide a useful technique for the practice of facial plastic surgery, and inclusion of computer imaging should be given serious consideration as an adjunct to clinical practice.

  19. Digital Pathology: Data-Intensive Frontier in Medical Imaging

    PubMed Central

    Cooper, Lee A. D.; Carter, Alexis B.; Farris, Alton B.; Wang, Fusheng; Kong, Jun; Gutman, David A.; Widener, Patrick; Pan, Tony C.; Cholleti, Sharath R.; Sharma, Ashish; Kurc, Tahsin M.; Brat, Daniel J.; Saltz, Joel H.

    2013-01-01

    Pathology is a medical subspecialty that practices the diagnosis of disease. Microscopic examination of tissue reveals information enabling the pathologist to render accurate diagnoses and to guide therapy. The basic process by which anatomic pathologists render diagnoses has remained relatively unchanged over the last century, yet advances in information technology now offer significant opportunities in image-based diagnostic and research applications. Pathology has lagged behind other healthcare practices such as radiology where digital adoption is widespread. As devices that generate whole slide images become more practical and affordable, practices will increasingly adopt this technology and eventually produce an explosion of data that will quickly eclipse the already vast quantities of radiology imaging data. These advances are accompanied by significant challenges for data management and storage, but they also introduce new opportunities to improve patient care by streamlining and standardizing diagnostic approaches and uncovering disease mechanisms. Computer-based image analysis is already available in commercial diagnostic systems, but further advances in image analysis algorithms are warranted in order to fully realize the benefits of digital pathology in medical discovery and patient care. In coming decades, pathology image analysis will extend beyond the streamlining of diagnostic workflows and minimizing interobserver variability and will begin to provide diagnostic assistance, identify therapeutic targets, and predict patient outcomes and therapeutic responses. PMID:25328166

  20. Evaluating the Motivational Impact of CALL Systems: Current Practices and Future Directions

    ERIC Educational Resources Information Center

    Bodnar, Stephen; Cucchiarini, Catia; Strik, Helmer; van Hout, Roeland

    2016-01-01

    A major aim of computer-assisted language learning (CALL) is to create computer environments that facilitate students' second language (L2) acquisition. To achieve this aim, CALL employs technological innovations to create novel types of language practice. Evaluations of the new practice types serve the important role of distinguishing effective…

  1. Best Practices for Crash Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Fasanella, Edwin L.; Jackson, Karen E.

    2002-01-01

    Aviation safety can be greatly enhanced by the expeditious use of computer simulations of crash impact. Unlike automotive impact testing, which is now routine, experimental crash tests of even small aircraft are expensive and complex due to the high cost of the aircraft and the myriad of crash impact conditions that must be considered. Ultimately, the goal is to utilize full-scale crash simulations of aircraft for design evaluation and certification. The objective of this publication is to describe "best practices" for modeling aircraft impact using explicit nonlinear dynamic finite element codes such as LS-DYNA, DYNA3D, and MSC.Dytran. Although "best practices" is somewhat relative, it is hoped that the authors' experience will help others to avoid some of the common pitfalls in modeling that are not documented in one single publication. In addition, a discussion of experimental data analysis, digital filtering, and test-analysis correlation is provided. Finally, some examples of aircraft crash simulations are described in several appendices following the main report.
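
    As a hedged example of the digital filtering mentioned above, the sketch below low-pass filters a synthetic crash acceleration pulse with a zero-phase Butterworth filter; the filter order and cutoff are generic choices for illustration, not the report's recommended channel class.

        # Low-pass filtering of a noisy impact acceleration pulse.
        import numpy as np
        from scipy.signal import butter, filtfilt

        fs = 10_000.0                                   # sample rate, Hz
        t = np.arange(0.0, 0.2, 1.0 / fs)

        # Synthetic half-sine crash pulse (about 30 g peak, 50 ms) plus ringing noise.
        pulse = 30.0 * np.sin(np.pi * t / 0.05) * (t < 0.05)
        noisy = pulse + 3.0 * np.random.default_rng(6).standard_normal(t.size)

        # 4th-order low-pass Butterworth at 300 Hz, applied forward and backward
        # (filtfilt) so the filtered pulse is not phase-shifted.
        b, a = butter(N=4, Wn=300.0, btype="low", fs=fs)
        filtered = filtfilt(b, a, noisy)

        print("peak raw acceleration:", noisy.max())
        print("peak filtered acceleration:", filtered.max())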

  2. WINPEPI updated: computer programs for epidemiologists, and their teaching potential

    PubMed Central

    2011-01-01

    Background The WINPEPI computer programs for epidemiologists are designed for use in practice and research in the health field and as learning or teaching aids. The programs are free, and can be downloaded from the Internet. Numerous additions have been made in recent years. Implementation There are now seven WINPEPI programs: DESCRIBE, for use in descriptive epidemiology; COMPARE2, for use in comparisons of two independent groups or samples; PAIRSetc, for use in comparisons of paired and other matched observations; LOGISTIC, for logistic regression analysis; POISSON, for Poisson regression analysis; WHATIS, a "ready reckoner" utility program; and ETCETERA, for miscellaneous other procedures. The programs now contain 122 modules, each of which provides a number, sometimes a large number, of statistical procedures. The programs are accompanied by a Finder that indicates which modules are appropriate for different purposes. The manuals explain the uses, limitations and applicability of the procedures, and furnish formulae and references. Conclusions WINPEPI is a handy resource for a wide variety of statistical routines used by epidemiologists. Because of its ready availability, portability, ease of use, and versatility, WINPEPI has a considerable potential as a learning and teaching aid, both with respect to practical procedures in the planning and analysis of epidemiological studies, and with respect to important epidemiological concepts. It can also be used as an aid in the teaching of general basic statistics. PMID:21288353

  3. Reverse engineering biological networks: applications in immune responses to bio-toxins.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Martino, Anthony A.; Sinclair, Michael B.; Davidson, George S.

    Our aim is to determine the network of events, or the regulatory network, that defines an immune response to a bio-toxin. As a model system, we are studying the T cell regulatory network triggered through tyrosine kinase receptor activation using a combination of pathway stimulation and time-series microarray experiments. Our approach is composed of five steps: (1) microarray experiments and data error analysis, (2) data clustering, (3) data smoothing and discretization, (4) network reverse engineering, and (5) network dynamics analysis and fingerprint identification. The technological outcome of this study is a suite of experimental protocols and computational tools that reverse engineer regulatory networks given gene expression data. The practical biological outcome of this work is an immune response fingerprint in terms of gene expression levels. Inferring regulatory networks from microarray data is a new field of investigation that is no more than five years old. To the best of our knowledge, this work is the first attempt that integrates experiments, error analyses, data clustering, inference, and network analysis to solve a practical problem. Our systematic approach of counting, enumeration, and sampling networks matching experimental data is new to the field of network reverse engineering. The resulting mathematical analyses and computational tools lead to new results on their own and should be useful to others who analyze and infer networks.
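
    Two of the pipeline steps, clustering and discretization, can be sketched on toy data. The cluster count, smoothing window, discretization thresholds, and synthetic expression profiles below are illustrative assumptions; the study's own protocols are considerably more involved.

```python
# Sketch of two pipeline steps, clustering and discretisation, on toy
# time-series expression profiles. All parameters and data are illustrative.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
timepoints = 8
# toy expression matrix: 60 genes x 8 time points, three response patterns
up   = np.cumsum(rng.uniform(0.1, 0.5, (20, timepoints)), axis=1)
down = -np.cumsum(rng.uniform(0.1, 0.5, (20, timepoints)), axis=1)
flat = rng.normal(0.0, 0.2, (20, timepoints))
expr = np.vstack([up, down, flat])

# Clustering step: group genes with similar temporal profiles.
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(expr)

# Smoothing and discretisation step: moving average, then map to {-1, 0, +1}.
kernel = np.ones(3) / 3.0
smoothed = np.apply_along_axis(lambda g: np.convolve(g, kernel, mode="same"), 1, expr)
states = np.digitize(smoothed, bins=[-0.5, 0.5]) - 1   # below -0.5, between, above +0.5

print("cluster sizes:", np.bincount(labels))
print("state counts (-1, 0, +1):", np.bincount((states + 1).ravel()))
```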

  4. EHR adoption among doctors who treat the elderly.

    PubMed

    Yeager, Valerie A; Menachemi, Nir; Brooks, Robert G

    2010-12-01

    The purpose of this study is to examine Electronic Health Record (EHR) adoption among Florida doctors who treat the elderly. This analysis contributes to the EHR adoption literature by determining if doctors who disproportionately treat the elderly differ from their counterparts with respect to the utilization of an important quality-enhancing health information technology application. This study is based on a primary survey of a large, statewide sample of doctors practising in outpatient settings in Florida. Logistic regression analysis was used to determine whether doctors who treat a high volume of elderly (HVE) patients were different with respect to EHR adoption. Our analyses included responses from 1724 doctors. In multivariate analyses controlling for doctor age, training, computer sophistication, practice size and practice setting, HVE doctors were significantly less likely to adopt EHR. Specifically, compared with their counterparts, HVE doctors were observed to be 26.7% less likely to be utilizing an EHR system (OR=0.733, 95% CI 0.547-0.982). We also found that doctor age is negatively related to EHR adoption, while practice size and doctor computer savviness are positively associated. Despite the fact that EHR adoption has improved in recent years, doctors in Florida who serve the elderly are less likely to adopt EHRs. As long as HVE doctors are adopting EHR systems at slower rates, the elderly patients treated by these doctors will be at a disadvantage with respect to potential benefits offered by this technology. © 2010 Blackwell Publishing Ltd.

  5. A comparative study and validation of upwind and central-difference Navier-Stokes codes for high-speed flows

    NASA Technical Reports Server (NTRS)

    Rudy, David H.; Kumar, Ajay; Thomas, James L.; Gnoffo, Peter A.; Chakravarthy, Sukumar R.

    1988-01-01

    A comparative study was made using 4 different computer codes for solving the compressible Navier-Stokes equations. Three different test problems were used, each of which has features typical of high speed internal flow problems of practical importance in the design and analysis of propulsion systems for advanced hypersonic vehicles. These problems are the supersonic flow between two walls, one of which contains a 10 deg compression ramp, the flow through a hypersonic inlet, and the flow in a 3-D corner formed by the intersection of two symmetric wedges. Three of the computer codes use similar recently developed implicit upwind differencing technology, while the fourth uses a well established explicit method. The computed results were compared with experimental data where available.

  6. Changes in the social context and conduct of eating in four Nordic countries between 1997 and 2012.

    PubMed

    Holm, Lotte; Lauridsen, Drude; Lund, Thomas Bøker; Gronow, Jukka; Niva, Mari; Mäkelä, Johanna

    2016-08-01

    How have eating patterns changed in modern life? In public and academic debate concern has been expressed that the social function of eating may be challenged by de-structuration and the dissolution of traditions. We analyzed changes in the social context and conduct of eating in four Nordic countries over the period 1997-2012. We focused on three interlinked processes often claimed to be distinctive of modern eating: delocalization of eating from private households to commercial settings, individualization in the form of more eating alone, and informalization, implying more casual codes of conduct. We based the analysis on data from two surveys conducted in Denmark, Finland, Norway and Sweden in 1997 and 2012. The surveys reported in detail one day of eating in representative samples of adult populations in the four countries (N = 4823 and N = 8242). We compared data regarding where, with whom, and for how long people ate, and whether parallel activities took place while eating. While Nordic people's primary location for eating remained the home and the workplace, the practices of eating in haste, and while watching television increased and using tablets, computers and smartphones while eating was frequent in 2012. Propensity to eat alone increased slightly in Denmark and Norway, and decreased slightly in Sweden. While such practices vary with socio-economic background, regression analysis showed several changes were common across the Nordic populations. However, the new practice of using tablets, computers, and smartphones while eating was strongly associated with young age. Further, each of the practices appeared to be related to different types of meal. We conclude that while the changes in the social organization of eating were not dramatic, signs of individualization and informalization could be detected. Copyright © 2016 Elsevier Ltd. All rights reserved.

  7. Computer Simulations Improve University Instructional Laboratories

    PubMed Central

    2004-01-01

    Laboratory classes are commonplace and essential in biology departments but can sometimes be cumbersome, unreliable, and a drain on time and resources. As university intakes increase, pressure on budgets and staff time can often lead to reduction in practical class provision. Frequently, the ability to use laboratory equipment, mix solutions, and manipulate test animals are essential learning outcomes, and “wet” laboratory classes are thus appropriate. In others, however, interpretation and manipulation of the data are the primary learning outcomes, and here, computer-based simulations can provide a cheaper, easier, and less time- and labor-intensive alternative. We report the evaluation of two computer-based simulations of practical exercises: the first in chromosome analysis, the second in bioinformatics. Simulations can provide significant time savings to students (by a factor of four in our first case study) without affecting learning, as measured by performance in assessment. Moreover, under certain circumstances, performance can be improved by the use of simulations (by 7% in our second case study). We concluded that the introduction of these simulations can significantly enhance student learning where consideration of the learning outcomes indicates that it might be appropriate. In addition, they can offer significant benefits to teaching staff. PMID:15592599

  8. Live interactive computer music performance practice

    NASA Astrophysics Data System (ADS)

    Wessel, David

    2002-05-01

    A live-performance musical instrument can be assembled around current lap-top computer technology. One adds a controller such as a keyboard or other gestural input device, a sound diffusion system, some form of connectivity processor(s) providing for audio I/O and gestural controller input, and reactive real-time native signal processing software. A system consisting of a hand gesture controller; software for gesture analysis and mapping, machine listening, composition, and sound synthesis; and a controllable radiation pattern loudspeaker are described. Interactivity begins in the set up wherein the speaker-room combination is tuned with an LMS procedure. This system was designed for improvisation. It is argued that software suitable for carrying out an improvised musical dialog with another performer poses special challenges. The processes underlying the generation of musical material must be very adaptable, capable of rapid changes in musical direction. Machine listening techniques are used to help the performer adapt to new contexts. Machine learning can play an important role in the development of such systems. In the end, as with any musical instrument, human skill is essential. Practice is required not only for the development of musically appropriate human motor programs but for the adaptation of the computer-based instrument as well.
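
    The room tuning mentioned above rests on a least-mean-squares (LMS) adaptive procedure. The sketch below shows the basic LMS update identifying a toy "room" impulse response; the step size, tap count, and signals are illustrative and not taken from the described system.

```python
# Minimal LMS adaptive filter: adapt FIR weights so the filtered excitation
# tracks the measured (room-coloured) response. Step size and tap count are
# illustrative, not parameters of the described system.
import numpy as np

def lms(x, d, n_taps=32, mu=0.01):
    w = np.zeros(n_taps)
    y = np.zeros_like(d)
    for n in range(n_taps - 1, len(x)):
        x_vec = x[n - n_taps + 1:n + 1][::-1]   # x[n], x[n-1], ..., x[n-n_taps+1]
        y[n] = w @ x_vec
        e = d[n] - y[n]                         # error drives the adaptation
        w += 2 * mu * e * x_vec
    return w, y

rng = np.random.default_rng(0)
x = rng.standard_normal(5000)                   # excitation signal
room = np.array([1.0, 0.4, -0.2, 0.1])          # toy room impulse response
d = np.convolve(x, room, mode="full")[:len(x)]  # measured response
w, _ = lms(x, d)
print("identified leading taps:", np.round(w[:4], 2))
```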

  9. Cloud Service Selection Using Multicriteria Decision Analysis

    PubMed Central

    Anuar, Nor Badrul; Shiraz, Muhammad; Haque, Israat Tanzeena

    2014-01-01

    Cloud computing (CC) has recently been receiving tremendous attention from the IT industry and academic researchers. CC leverages its unique services to cloud customers in a pay-as-you-go, anytime, anywhere manner. Cloud services provide dynamically scalable services through the Internet on demand. Therefore, service provisioning plays a key role in CC. The cloud customer must be able to select appropriate services according to his or her needs. Several approaches have been proposed to solve the service selection problem, including multicriteria decision analysis (MCDA). MCDA enables the user to choose from among a number of available choices. In this paper, we analyze the application of MCDA to service selection in CC. We identify and synthesize several MCDA techniques and provide a comprehensive analysis of this technology for general readers. In addition, we present a taxonomy derived from a survey of the current literature. Finally, we highlight several state-of-the-art practical aspects of MCDA implementation in cloud computing service selection. The contributions of this study are four-fold: (a) focusing on the state-of-the-art MCDA techniques, (b) highlighting the comparative analysis and suitability of several MCDA methods, (c) presenting a taxonomy through extensive literature review, and (d) analyzing and summarizing the cloud computing service selections in different scenarios. PMID:24696645

  10. Cloud service selection using multicriteria decision analysis.

    PubMed

    Whaiduzzaman, Md; Gani, Abdullah; Anuar, Nor Badrul; Shiraz, Muhammad; Haque, Mohammad Nazmul; Haque, Israat Tanzeena

    2014-01-01

    Cloud computing (CC) has recently been receiving tremendous attention from the IT industry and academic researchers. CC leverages its unique services to cloud customers in a pay-as-you-go, anytime, anywhere manner. Cloud services provide dynamically scalable services through the Internet on demand. Therefore, service provisioning plays a key role in CC. The cloud customer must be able to select appropriate services according to his or her needs. Several approaches have been proposed to solve the service selection problem, including multicriteria decision analysis (MCDA). MCDA enables the user to choose from among a number of available choices. In this paper, we analyze the application of MCDA to service selection in CC. We identify and synthesize several MCDA techniques and provide a comprehensive analysis of this technology for general readers. In addition, we present a taxonomy derived from a survey of the current literature. Finally, we highlight several state-of-the-art practical aspects of MCDA implementation in cloud computing service selection. The contributions of this study are four-fold: (a) focusing on the state-of-the-art MCDA techniques, (b) highlighting the comparative analysis and suitability of several MCDA methods, (c) presenting a taxonomy through extensive literature review, and (d) analyzing and summarizing the cloud computing service selections in different scenarios.
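
    A minimal sketch of one MCDA technique, the weighted-sum model, applied to a hypothetical cloud service ranking is shown below; the criteria, weights, and scores are invented, and the paper surveys many richer methods (e.g., AHP, TOPSIS) beyond this simplest one.

```python
# Weighted-sum MCDA sketch for ranking cloud services. Criteria, weights and
# scores are invented for illustration only.
import numpy as np

criteria = ["cost", "latency", "availability", "security"]
weights  = np.array([0.30, 0.20, 0.30, 0.20])      # must sum to 1
benefit  = np.array([False, False, True, True])    # cost/latency: lower is better

# rows = candidate services, columns = raw criterion values (hypothetical)
services = {"ServiceA": [0.12, 45.0, 99.90, 7.0],
            "ServiceB": [0.09, 60.0, 99.50, 8.0],
            "ServiceC": [0.15, 30.0, 99.99, 6.0]}
X = np.array(list(services.values()), dtype=float)

# normalise each column to [0, 1]; invert cost-type criteria
norm = (X - X.min(0)) / (X.max(0) - X.min(0))
norm[:, ~benefit] = 1.0 - norm[:, ~benefit]

scores = norm @ weights
for name, s in sorted(zip(services, scores), key=lambda t: -t[1]):
    print(f"{name}: {s:.3f}")
```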

  11. Practical applications of hand-held computers in dermatology.

    PubMed

    Goldblum, Orin M

    2002-09-01

    For physicians, hand-held computers are gaining popularity as point of care reference tools. The convergence of hand-held computers, the Internet, and wireless networks will enable these devices to assume more essential roles as mobile transmitters and receivers of digital medical information. In addition to serving as portable medical reference sources, these devices can be Internet-enabled, allowing them to communicate over wireless wide and local area networks. With enhanced wireless connectivity, hand-held computers can be used at the point of patient care for charge capture, electronic prescribing, laboratory test ordering, laboratory result retrieval, web access, e-mail communication, and other clinical and administrative tasks. Physicians in virtually every medical specialty have begun using these devices in various ways. This review of hand-held computer use in dermatology illustrates practical examples of the many different ways hand-held computers can be effectively used by the practicing dermatologist.

  12. Extension of the ADjoint Approach to a Laminar Navier-Stokes Solver

    NASA Astrophysics Data System (ADS)

    Paige, Cody

    The use of adjoint methods is common in computational fluid dynamics to reduce the cost of the sensitivity analysis in an optimization cycle. The forward mode ADjoint is a combination of an adjoint sensitivity analysis method with a forward mode automatic differentiation (AD) and is a modification of the reverse mode ADjoint method proposed by Mader et al.[1]. A colouring acceleration technique is presented to reduce the computational cost increase associated with forward mode AD. The forward mode AD facilitates the implementation of the laminar Navier-Stokes (NS) equations. The forward mode ADjoint method is applied to a three-dimensional computational fluid dynamics solver. The resulting Euler and viscous ADjoint sensitivities are compared to the reverse mode Euler ADjoint derivatives and a complex-step method to demonstrate the reduced computational cost and accuracy. Both comparisons demonstrate the benefits of the colouring method and the practicality of using a forward mode AD. [1] Mader, C.A., Martins, J.R.R.A., Alonso, J.J., and van der Weide, E. (2008) ADjoint: An approach for the rapid development of discrete adjoint solvers. AIAA Journal, 46(4):863-873. doi:10.2514/1.29123.
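
    The forward-mode AD on which the method builds can be illustrated with dual numbers, where each value carries its directional derivative through a single sweep. The sketch below is a generic illustration of forward-mode AD, not the ADjoint solver or its colouring scheme; the test function is arbitrary.

```python
# Forward-mode automatic differentiation with dual numbers: each value carries
# its derivative, so one sweep yields one directional derivative. Generic
# illustration only; not the ADjoint implementation.
import math

class Dual:
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.dot + o.dot)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val * o.val, self.val * o.dot + self.dot * o.val)
    __rmul__ = __mul__

def sin(x):
    return Dual(math.sin(x.val), math.cos(x.val) * x.dot)

# derivative of f(x) = x*sin(x) + 3x at x = 1.2, seeded with dot = 1
x = Dual(1.2, 1.0)
f = x * sin(x) + 3 * x
print(f.val, f.dot)   # function value and exact derivative in one pass
```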

  13. Case-Deletion Diagnostics for Maximum Likelihood Multipoint Quantitative Trait Locus Linkage Analysis

    PubMed Central

    Mendoza, Maria C.B.; Burns, Trudy L.; Jones, Michael P.

    2009-01-01

    Objectives Case-deletion diagnostic methods are tools that allow identification of influential observations that may affect parameter estimates and model fitting conclusions. The goal of this paper was to develop two case-deletion diagnostics, the exact case deletion (ECD) and the empirical influence function (EIF), for detecting outliers that can affect results of sib-pair maximum likelihood quantitative trait locus (QTL) linkage analysis. Methods Subroutines to compute the ECD and EIF were incorporated into the maximum likelihood QTL variance estimation components of the linkage analysis program MAPMAKER/SIBS. Performance of the diagnostics was compared in simulation studies that evaluated the proportion of outliers correctly identified (sensitivity), and the proportion of non-outliers correctly identified (specificity). Results Simulations involving nuclear family data sets with one outlier showed EIF sensitivities approximated ECD sensitivities well for outlier-affected parameters. Sensitivities were high, indicating the outlier was identified a high proportion of the time. Simulations also showed the enormous computational time advantage of the EIF. Diagnostics applied to body mass index in nuclear families detected observations influential on the lod score and model parameter estimates. Conclusions The EIF is a practical diagnostic tool that has the advantages of high sensitivity and quick computation. PMID:19172086
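
    The contrast between exact case deletion and the empirical influence function can be illustrated on the simplest possible estimator, the sample mean, where the two happen to coincide exactly; for likelihood-based QTL linkage parameters the EIF is only a fast approximation. The data below are synthetic, and the sketch is not the MAPMAKER/SIBS implementation.

```python
# Exact case deletion (refit without each observation) versus the empirical
# influence approximation, shown for the sample mean with a planted outlier.
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(0.0, 1.0, 50)
x[10] = 8.0                        # planted outlier

n, mean = len(x), x.mean()

# Exact case deletion: recompute the estimate with observation i removed.
ecd = np.array([np.delete(x, i).mean() - mean for i in range(n)])

# Empirical influence approximation: deleting observation i changes the mean
# by about -(x_i - mean)/(n - 1); for the mean this happens to be exact.
eif = -(x - mean) / (n - 1)

print("largest |ECD| at index", int(np.argmax(np.abs(ecd))))
print("largest |EIF| at index", int(np.argmax(np.abs(eif))))
print("max |ECD - EIF|:", float(np.max(np.abs(ecd - eif))))
```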

  14. SPLASH program for three dimensional fluid dynamics with free surface boundaries

    NASA Astrophysics Data System (ADS)

    Yamaguchi, A.

    1996-05-01

    This paper describes a three dimensional computer program SPLASH that solves Navier-Stokes equations based on the Arbitrary Lagrangian Eulerian (ALE) finite element method. SPLASH has been developed for application to the fluid dynamics problems including the moving boundary of a liquid metal cooled Fast Breeder Reactor (FBR). To apply SPLASH code to the free surface behavior analysis, a capillary model using a cubic Spline function has been developed. Several sample problems, e.g., free surface oscillation, vortex shedding development, and capillary tube phenomena, are solved to verify the computer program. In the analyses, the numerical results are in good agreement with the theoretical value or experimental observance. Also SPLASH code has been applied to an analysis of a free surface sloshing experiment coupled with forced circulation flow in a rectangular tank. This is a simplified situation of the flow field in a reactor vessel of the FBR. The computational simulation well predicts the general behavior of the fluid flow inside and the free surface behavior. Analytical capability of the SPLASH code has been verified in this study and the application to more practical problems such as FBR design and safety analysis is under way.

  15. The study on the parallel processing based time series correlation analysis of RBC membrane flickering in quantitative phase imaging

    NASA Astrophysics Data System (ADS)

    Lee, Minsuk; Won, Youngjae; Park, Byungjun; Lee, Seungrag

    2017-02-01

    Not only the static characteristics but also the dynamic characteristics of the red blood cell (RBC) contain useful information for blood diagnosis. Quantitative phase imaging (QPI) can capture sample images with subnanometer-scale depth resolution and millisecond-scale temporal resolution. QPI has been used in various studies of RBC diagnosis, and parallel computing algorithms have recently been developed to reduce the time needed to extract RBC information from QPI data; however, previous studies have focused on static parameters such as cell morphology or simple dynamic parameters such as the root mean square (RMS) of the membrane fluctuations. Previously, we presented a practical blood test method using time-series correlation analysis of RBC membrane flickering with QPI. However, that method's long computation time limited its clinical application. In this study, we present an accelerated time-series correlation analysis of RBC membrane flickering using a parallel computing algorithm. The fractal scaling exponents obtained for the surrounding medium and for normal RBCs were consistent with our previous research.
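
    The parallelization idea can be sketched as splitting per-pixel temporal statistics of a phase-image stack across worker processes. The lag-1 autocorrelation used below is only a stand-in for the study's full time-series correlation analysis, and the array sizes, process count, and random data are illustrative.

```python
# Sketch of the parallelisation idea: a per-pixel temporal statistic of a
# phase-image stack, with pixels split across worker processes. Sizes and the
# statistic itself are illustrative stand-ins for the study's analysis.
import numpy as np
from multiprocessing import Pool

def autocorr_lag1(trace):
    """Normalised lag-1 autocorrelation of one pixel's height fluctuation."""
    t = trace - trace.mean()
    denom = np.dot(t, t)
    return float(np.dot(t[:-1], t[1:]) / denom) if denom > 0 else 0.0

def process_chunk(chunk):
    return [autocorr_lag1(trace) for trace in chunk]

if __name__ == "__main__":
    frames, h, w = 512, 64, 64                     # toy stack: time x height x width
    stack = np.random.default_rng(0).normal(size=(frames, h, w))
    traces = stack.reshape(frames, -1).T           # one fluctuation trace per pixel
    chunks = np.array_split(traces, 4)
    with Pool(4) as pool:
        results = pool.map(process_chunk, chunks)
    corr_map = np.concatenate([np.array(r) for r in results]).reshape(h, w)
    print("mean lag-1 autocorrelation:", float(corr_map.mean()))
```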

  16. The computer in office medical practice.

    PubMed

    Dowdle, John

    2002-04-01

    There will continue to be change and evolution in the medical office environment. As voice recognition systems continue to improve, instant creation of office notes with the absence of dictation may be commonplace. As medical and computer technology evolves, we must continue to evaluate the many new computer systems that can assist us in our clinical office practice.

  17. A Survey and Evaluation of Simulators Suitable for Teaching Courses in Computer Architecture and Organization

    ERIC Educational Resources Information Center

    Nikolic, B.; Radivojevic, Z.; Djordjevic, J.; Milutinovic, V.

    2009-01-01

    Courses in Computer Architecture and Organization are regularly included in Computer Engineering curricula. These courses are usually organized in such a way that students obtain not only a purely theoretical experience, but also a practical understanding of the topics lectured. This practical work is usually done in a laboratory using simulators…

  18. Do the Effects of Computer-Assisted Practice Differ for Children with Reading Disabilities with and without IQ-Achievement Discrepancy?

    ERIC Educational Resources Information Center

    Jimenez, Juan E.; Ortiz, Maria del Rosario; Rodrigo, Mercedes; Hernandez-Valle, Isabel; Ramirez, Gustavo; Estevez, Adelina; O'Shanahan, Isabel; Trabaue, Maria de la Luz

    2003-01-01

    A study assessed whether the effects of computer-assisted practice on visual word recognition differed for 73 Spanish children with reading disabilities with or without aptitude-achievement discrepancy. Computer-assisted intervention improved word recognition. However, children with dyslexia had more difficulties than poor readers during…

  19. Exploiting graphics processing units for computational biology and bioinformatics.

    PubMed

    Payne, Joshua L; Sinnott-Armstrong, Nicholas A; Moore, Jason H

    2010-09-01

    Advances in the video gaming industry have led to the production of low-cost, high-performance graphics processing units (GPUs) that possess more memory bandwidth and computational capability than central processing units (CPUs), the standard workhorses of scientific computing. With the recent release of general-purpose GPUs and NVIDIA's GPU programming language, CUDA, graphics engines are being adopted widely in scientific computing applications, particularly in the fields of computational biology and bioinformatics. The goal of this article is to concisely present an introduction to GPU hardware and programming, aimed at the computational biologist or bioinformaticist. To this end, we discuss the primary differences between GPU and CPU architecture, introduce the basics of the CUDA programming language, and discuss important CUDA programming practices, such as the proper use of coalesced reads, data types, and memory hierarchies. We highlight each of these topics in the context of computing the all-pairs distance between instances in a dataset, a common procedure in numerous disciplines of scientific computing. We conclude with a runtime analysis of the GPU and CPU implementations of the all-pairs distance calculation. We show our final GPU implementation to outperform the CPU implementation by a factor of 1700.
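
    The benchmark computation, the all-pairs Euclidean distance, is sketched below as a naive double loop and a vectorised NumPy version; the CUDA kernel described in the article parallelises the same pairwise arithmetic across GPU threads. The data size here is illustrative.

```python
# All-pairs Euclidean distance, the benchmark computation in the article,
# shown as a naive double loop and a vectorised NumPy reference.
import numpy as np

def all_pairs_naive(X):
    n = X.shape[0]
    D = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            D[i, j] = np.sqrt(np.sum((X[i] - X[j]) ** 2))
    return D

def all_pairs_vectorised(X):
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)
    return np.sqrt(np.maximum(d2, 0.0))   # clip tiny negatives from round-off

X = np.random.default_rng(0).random((200, 16))
assert np.allclose(all_pairs_naive(X), all_pairs_vectorised(X), atol=1e-8)
print("distance matrix shape:", all_pairs_vectorised(X).shape)
```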

  20. Psychiatrists’ Comfort Using Computers and Other Electronic Devices in Clinical Practice

    PubMed Central

    Fochtmann, Laura J.; Clarke, Diana E.; Barber, Keila; Hong, Seung-Hee; Yager, Joel; Mościcki, Eve K.; Plovnick, Robert M.

    2015-01-01

    This report highlights findings from the Study of Psychiatrists’ Use of Informational Resources in Clinical Practice, a cross-sectional Web- and paper-based survey that examined psychiatrists’ comfort using computers and other electronic devices in clinical practice. One-thousand psychiatrists were randomly selected from the American Medical Association Physician Masterfile and asked to complete the survey between May and August, 2012. A total of 152 eligible psychiatrists completed the questionnaire (response rate 22.2 %). The majority of psychiatrists reported comfort using computers for educational and personal purposes. However, 26 % of psychiatrists reported not using or not being comfortable using computers for clinical functions. Psychiatrists under age 50 were more likely to report comfort using computers for all purposes than their older counterparts. Clinical tasks for which computers were reportedly used comfortably, specifically by psychiatrists younger than 50, included documenting clinical encounters, prescribing, ordering laboratory tests, accessing read-only patient information (e.g., test results), conducting internet searches for general clinical information, accessing online patient educational materials, and communicating with patients or other clinicians. Psychiatrists generally reported comfort using computers for personal and educational purposes. However, use of computers in clinical care was less common, particularly among psychiatrists 50 and older. Information and educational resources need to be available in a variety of accessible, user-friendly, computer and non-computer-based formats, to support use across all ages. Moreover, ongoing training and technical assistance with use of electronic and mobile device technologies in clinical practice is needed. Research on barriers to clinical use of computers is warranted. PMID:26667248

  1. Psychiatrists' Comfort Using Computers and Other Electronic Devices in Clinical Practice.

    PubMed

    Duffy, Farifteh F; Fochtmann, Laura J; Clarke, Diana E; Barber, Keila; Hong, Seung-Hee; Yager, Joel; Mościcki, Eve K; Plovnick, Robert M

    2016-09-01

    This report highlights findings from the Study of Psychiatrists' Use of Informational Resources in Clinical Practice, a cross-sectional Web- and paper-based survey that examined psychiatrists' comfort using computers and other electronic devices in clinical practice. One-thousand psychiatrists were randomly selected from the American Medical Association Physician Masterfile and asked to complete the survey between May and August, 2012. A total of 152 eligible psychiatrists completed the questionnaire (response rate 22.2 %). The majority of psychiatrists reported comfort using computers for educational and personal purposes. However, 26 % of psychiatrists reported not using or not being comfortable using computers for clinical functions. Psychiatrists under age 50 were more likely to report comfort using computers for all purposes than their older counterparts. Clinical tasks for which computers were reportedly used comfortably, specifically by psychiatrists younger than 50, included documenting clinical encounters, prescribing, ordering laboratory tests, accessing read-only patient information (e.g., test results), conducting internet searches for general clinical information, accessing online patient educational materials, and communicating with patients or other clinicians. Psychiatrists generally reported comfort using computers for personal and educational purposes. However, use of computers in clinical care was less common, particularly among psychiatrists 50 and older. Information and educational resources need to be available in a variety of accessible, user-friendly, computer and non-computer-based formats, to support use across all ages. Moreover, ongoing training and technical assistance with use of electronic and mobile device technologies in clinical practice is needed. Research on barriers to clinical use of computers is warranted.

  2. Gold rush - A swarm dynamics in games

    NASA Astrophysics Data System (ADS)

    Zelinka, Ivan; Bukacek, Michal

    2017-07-01

    This paper focuses on swarm intelligence techniques and their practical use in computer games. The aim is to show how swarm dynamics can be generated by a multiplayer game and then recorded, analyzed, and eventually controlled. We also discuss the possibility of using swarm intelligence in place of game players. Based on our previous experiments, two games that use swarm algorithms are mentioned briefly here: the strategy game StarCraft: Brood War, and TicTacToe, in which the SOMA algorithm has also taken the role of a player against a human opponent. The open research reported here has shown the potential benefit of swarm computation in the field of strategy games and of player strategies based on recording and analyzing swarm behavior. We propose a new game called Gold Rush as an experimental environment for human or artificial swarm behavior and its subsequent analysis.

  3. Predictive Anomaly Management for Resilient Virtualized Computing Infrastructures

    DTIC Science & Technology

    2015-05-27

    Publications reported under this effort include: "PREC: Practical Root Exploit Containment for Android Devices," ACM Conference on Data and Application Security and Privacy (CODASPY), 03-MAR-14; and Hiep Nguyen, Yongmin Tan, Xiaohui Gu, "Propagation-aware Anomaly Localization for Cloud Hosted Distributed Applications," ACM Workshop on Managing Large-Scale Systems via the Analysis of System Logs and the Application of Machine Learning Techniques (SLAML), in conjunction with SOSP, 05-OCT-11.

  4. An Analysis of Mathematics Education Students' Skills in the Process of Programming and Their Practices of Integrating It into Their Teaching

    ERIC Educational Resources Information Center

    Gökçe, Semirhan; Yenmez, Arzu Aydogan; Özpinar, Ilknur

    2017-01-01

    Recent developments in technology have changed the learner's profile and the learning outcomes. Today, with the emergence of higher-order thinking skills and computer literacy skills, teaching through traditional methods is likely to fail to achieve the learning outcomes. That is why teachers and teacher candidates are expected to have computer…

  5. Computational and Genomic Analysis of Mycobacteriophage: A Longitudinal Study of Technology Engineered Biology Courses That Implemented an Inquiry Based Laboratory Practice Designed to Enhance, Encourage, and Empower Student Learning

    ERIC Educational Resources Information Center

    Hollowell, Gail P.; Osler, James E.; Hester, April L.

    2015-01-01

    This paper provides an applied research rationale for a longitudinal investigation that involved teaching a "Technology Engineered Science Education Course" via an Interactive Laboratory Based Genomics Curriculum. The Technology Engineering [TE] methodology was first introduced at the SAPES: South Atlantic Philosophy of Education…

  6. PLANETSYS, a Computer Program for the Steady State and Transient Thermal Analysis of a Planetary Power Transmission System: User's Manual

    NASA Technical Reports Server (NTRS)

    Hadden, G. B.; Kleckner, R. J.; Ragen, M. A.; Dyba, G. J.; Sheynin, L.

    1981-01-01

    The material presented is structured to guide the user in the practical and correct implementation of PLANETSYS which is capable of simulating the thermomechanical performance of a multistage planetary power transmission. In this version of PLANETSYS, the user can select either SKF or NASA models in calculating lubricant film thickness and traction forces.

  7. Hiring Practices of African American Males in Academic Leadership Positions at American Colleges and Universities: An Employment Trends and Disparate Impact Analysis

    ERIC Educational Resources Information Center

    Jackson, Jerlando F. L.

    2006-01-01

    This study examined the status of African American males in academic leadership positions at American colleges and universities in comparison with other males (e.g., Asian). Guided by disparate impact theory, descriptive trend analyses and impact ratios were computed using the 1993 and 1999 National Study of Postsecondary Faculty (NSOPF). These…

  8. Aeroelastic-Acoustics Simulation of Flight Systems

    NASA Technical Reports Server (NTRS)

    Gupta, Kajal K.; Choi, S.; Ibrahim, A.

    2009-01-01

    This paper describes the details of a numerical finite element (FE) based analysis procedure and a resulting code for the simulation of the acoustics phenomenon arising from aeroelastic interactions. Both CFD and structural simulations are based on FE discretization employing unstructured grids. The sound pressure level (SPL) on structural surfaces is calculated from the root mean square (RMS) of the unsteady pressure and the acoustic wave frequencies are computed from a fast Fourier transform (FFT) of the unsteady pressure distribution as a function of time. The resulting tool proves to be unique as it is designed to analyze complex practical problems, involving large scale computations, in a routine fashion.
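
    The two post-processing steps named above, SPL from the RMS of an unsteady pressure and acoustic frequencies from an FFT, are sketched below on a synthetic signal; the 20 µPa reference pressure is the standard acoustic convention, while the signal itself is invented.

```python
# Post-processing steps named in the abstract: sound pressure level from the
# RMS of an unsteady pressure signal, and the dominant acoustic frequency from
# an FFT. The synthetic signal is illustrative only.
import numpy as np

P_REF = 20e-6                                   # standard reference pressure, Pa

def spl_db(p_fluct):
    return 20.0 * np.log10(np.sqrt(np.mean(p_fluct ** 2)) / P_REF)

def dominant_frequency(p_fluct, dt):
    spec = np.fft.rfft(p_fluct - p_fluct.mean())
    freqs = np.fft.rfftfreq(len(p_fluct), dt)
    return freqs[np.argmax(np.abs(spec))]

dt = 1.0e-4
t = np.arange(0.0, 1.0, dt)
p = 2.0 * np.sin(2 * np.pi * 150.0 * t) \
    + 0.3 * np.random.default_rng(0).normal(size=t.size)
print(f"SPL = {spl_db(p):.1f} dB, dominant frequency = {dominant_frequency(p, dt):.0f} Hz")
```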

  9. Real-time radar signal processing using GPGPU (general-purpose graphic processing unit)

    NASA Astrophysics Data System (ADS)

    Kong, Fanxing; Zhang, Yan Rockee; Cai, Jingxiao; Palmer, Robert D.

    2016-05-01

    This study introduces a practical approach to developing a real-time signal processing chain for general phased array radar on NVIDIA GPUs (Graphics Processing Units) using CUDA (Compute Unified Device Architecture) libraries such as cuBLAS and cuFFT, which are adopted from open source libraries and optimized for NVIDIA GPUs. The processed results are rigorously verified against those from the CPUs. Performance, benchmarked as computation time for various input data cube sizes, is compared across GPUs and CPUs. The analysis demonstrates that GPGPU (general-purpose GPU) real-time processing of the array radar data is possible with relatively low-cost commercial GPUs.

  10. Time-Domain Impedance Boundary Conditions for Computational Aeroacoustics

    NASA Technical Reports Server (NTRS)

    Tam, Christopher K. W.; Auriault, Laurent

    1996-01-01

    It is an accepted practice in aeroacoustics to characterize the properties of an acoustically treated surface by a quantity known as impedance. Impedance is a complex quantity. As such, it is designed primarily for frequency-domain analysis. Time-domain boundary conditions that are the equivalent of the frequency-domain impedance boundary condition are proposed. Both single frequency and model broadband time-domain impedance boundary conditions are provided. It is shown that the proposed boundary conditions, together with the linearized Euler equations, form well-posed initial boundary value problems. Unlike ill-posed problems, they are free from spurious instabilities that would render time-marching computational solutions impossible.

  11. International Instrumentation Symposium, 38th, Las Vegas, NV, Apr. 26-30, 1992, Proceedings

    NASA Astrophysics Data System (ADS)

    The present volume on aerospace instrumentation discusses computer applications, blast and shock, implementation of the Clean Air Act amendments, and thermal systems. Attention is given to measurement uncertainty/flow measurement, data acquisition and processing, force/acceleration/motion measurements, and hypersonics/reentry vehicle systems. Topics addressed include wind tunnels, real time systems, and pressure effects. Also discussed are a distributed data and control system for space simulation and thermal testing, a stepwise shockwave velocity determinator, computer tracking and decision making, the use of silicon diodes for detecting the liquid-vapor interface in hydrogen, and practical methods for analysis of uncertainty propagation.

  12. Doctors' experience with handheld computers in clinical practice: qualitative study

    PubMed Central

    McAlearney, Ann Scheck; Schweikhart, Sharon B; Medow, Mitchell A

    2004-01-01

    Objective To examine doctors' perspectives about their experiences with handheld computers in clinical practice. Design Qualitative study of eight focus groups consisting of doctors with diverse training and practice patterns. Setting Six practice settings across the United States and two additional focus group sessions held at a national meeting of general internists. Participants 54 doctors who did or did not use handheld computers. Results Doctors who used handheld computers in clinical practice seemed generally satisfied with them and reported diverse patterns of use. Users perceived that the devices helped them increase productivity and improve patient care. Barriers to use concerned the device itself and personal and perceptual constraints, with perceptual factors such as comfort with technology, preference for paper, and the impression that the devices are not easy to use somewhat difficult to overcome. Participants suggested that organisations can help promote handheld computers by providing advice on purchase, usage, training, and user support. Participants expressed concern about reliability and security of the device but were particularly concerned about dependency on the device and over-reliance as a substitute for clinical thinking. Conclusions Doctors expect handheld computers to become more useful, and most seem interested in leveraging (getting the most value from) their use. Key opportunities with handheld computers included their use as a stepping stone to build doctors' comfort with other information technology and ehealth initiatives and providing point of care support that helps improve patient care. PMID:15142920

  13. ReSeqTools: an integrated toolkit for large-scale next-generation sequencing based resequencing analysis.

    PubMed

    He, W; Zhao, S; Liu, X; Dong, S; Lv, J; Liu, D; Wang, J; Meng, Z

    2013-12-04

    Large-scale next-generation sequencing (NGS)-based resequencing detects sequence variations, constructs evolutionary histories, and identifies phenotype-related genotypes. However, NGS-based resequencing studies generate extraordinarily large amounts of data, making computations difficult. Effective use and analysis of these data for NGS-based resequencing studies remains a difficult task for individual researchers. Here, we introduce ReSeqTools, a full-featured toolkit for NGS (Illumina sequencing)-based resequencing analysis, which processes raw data, interprets mapping results, and identifies and annotates sequence variations. ReSeqTools provides abundant scalable functions for routine resequencing analysis in different modules to facilitate customization of the analysis pipeline. ReSeqTools is designed to use compressed data files as input or output to save storage space and facilitates faster and more computationally efficient large-scale resequencing studies in a user-friendly manner. It offers abundant practical functions and generates useful statistics during the analysis pipeline, which significantly simplifies resequencing analysis. Its integrated algorithms and abundant sub-functions provide a solid foundation for special demands in resequencing projects. Users can combine these functions to construct their own pipelines for other purposes.

  14. Constructivist-Compatible Beliefs and Practices among U.S. Teachers. Teaching, Learning, and Computing: 1998 National Survey Report #4.

    ERIC Educational Resources Information Center

    Ravitz, Jason L.; Becker, Henry Jay; Wong, YanTien

    This report, the fourth in a series from the spring 1998 national survey, "Teaching, Learning, and Computing," examines teachers' survey responses that describe the frequency with which their teaching practice involves those five types of activities and the frequency with which their practice involves more traditional transmission and…

  15. Computer-Based Counselor-in-Training Supervision: Ethical and Practical Implications for Counselor Educators and Supervisors

    ERIC Educational Resources Information Center

    Vaccaro, Nicole; Lambie, Glenn W.

    2007-01-01

    Computer-based clinical supervision of counselors-in-training is becoming more prevalent (M. Reisch & L. Jarman-Rohde, 2000); however, its use is still in its infancy, and ethical standards have not been established regarding its practice. There exists a dearth of literature focusing on the ethical practice and development of supervisees when…

  16. Educational Technology: Best Practices from America's Schools.

    ERIC Educational Resources Information Center

    Bozeman, William C.; Baumbach, Donna J.

    This book begins with an overview of computer technology concepts, including computer system configurations, computer communications, and software. Instructional computer applications are then discussed; topics include computer-assisted instruction, computer-managed instruction, computer-enhanced instruction, LOGO, authoring programs, presentation…

  17. Numerically accurate computational techniques for optimal estimator analyses of multi-parameter models

    NASA Astrophysics Data System (ADS)

    Berger, Lukas; Kleinheinz, Konstantin; Attili, Antonio; Bisetti, Fabrizio; Pitsch, Heinz; Mueller, Michael E.

    2018-05-01

    Modelling unclosed terms in partial differential equations typically involves two steps: First, a set of known quantities needs to be specified as input parameters for a model, and second, a specific functional form needs to be defined to model the unclosed terms by the input parameters. Both steps involve a certain modelling error, with the former known as the irreducible error and the latter referred to as the functional error. Typically, only the total modelling error, which is the sum of functional and irreducible error, is assessed, but the concept of the optimal estimator enables the separate analysis of the total and the irreducible errors, yielding a systematic modelling error decomposition. In this work, attention is paid to the techniques themselves required for the practical computation of irreducible errors. Typically, histograms are used for optimal estimator analyses, but this technique is found to add a non-negligible spurious contribution to the irreducible error if models with multiple input parameters are assessed. Thus, the error decomposition of an optimal estimator analysis becomes inaccurate, and misleading conclusions concerning modelling errors may be drawn. In this work, numerically accurate techniques for optimal estimator analyses are identified and a suitable evaluation of irreducible errors is presented. Four different computational techniques are considered: a histogram technique, artificial neural networks, multivariate adaptive regression splines, and an additive model based on a kernel method. For multiple input parameter models, only artificial neural networks and multivariate adaptive regression splines are found to yield satisfactorily accurate results. Beyond a certain number of input parameters, the assessment of models in an optimal estimator analysis even becomes practically infeasible if histograms are used. The optimal estimator analysis in this paper is applied to modelling the filtered soot intermittency in large eddy simulations using a dataset of a direct numerical simulation of a non-premixed sooting turbulent flame.
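
    The irreducible error of an optimal estimator analysis is the variance left after conditioning the target on the model inputs. The sketch below estimates it once with a histogram (binned conditional mean) and once with a small neural network regressor for a two-parameter toy model; the data, bin count, and network size are illustrative assumptions, not the paper's configuration.

```python
# Optimal-estimator sketch: estimate the irreducible error, the variance left
# after conditioning the target on the inputs, with a histogram (binned
# conditional mean) and with a small regression network. Toy data only.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n = 20000
x = rng.uniform(-1.0, 1.0, size=(n, 2))                    # two input parameters
target = np.sin(3 * x[:, 0]) * x[:, 1] + 0.1 * rng.standard_normal(n)

# Histogram technique: bin the 2-D input space, use the bin mean as E[q | inputs].
bins = 20
ix = np.minimum(((x + 1.0) / 2.0 * bins).astype(int), bins - 1)
flat = ix[:, 0] * bins + ix[:, 1]
counts = np.maximum(np.bincount(flat, minlength=bins**2), 1)
bin_mean = np.bincount(flat, weights=target, minlength=bins**2) / counts
irr_hist = np.mean((target - bin_mean[flat]) ** 2)

# Regression technique (here a small neural network) for the same conditional mean.
net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=500, random_state=0)
net.fit(x, target)
irr_net = np.mean((target - net.predict(x)) ** 2)

print(f"irreducible error estimate: histogram {irr_hist:.4f}, network {irr_net:.4f}")
print("true noise variance:", 0.1 ** 2)
```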

  18. Using computers in the exam room.

    PubMed

    McGrath, Debra

    2009-01-01

    Purchasing an electronic health records system is the first step to assimilation of the new system into the fabric of a practice. The next hurdle is use of the electronic health record as close to the point of patient care as possible, which requires the clinician to use a computer. This article presents some of the unique challenges of using computers to document patient encounters and some practical advice and considerations for improving the use of computers at the bedside.

  19. Detailed Requirements Analysis for a Management Information System for the Department of Family Practice and Community Medicine at Silas B. Hays Army Community Hospital, Fort Ord, California

    DTIC Science & Technology

    1989-03-01

    Chapter II. Chapter III discusses the theory of information systems and the analysis and design of such systems. The last section of Chapter II introduces... Improved personnel morale and job satisfaction. Doctors and hospital administrators are trying to recover from the medical computing lag which has... discussed below). The primary source of equipment authorizations is the Table of Distribution and Allowances (TDA), which shows the equipment authorized to be

  20. A methodological approach to the analysis of egocentric social networks in public health research: a practical example.

    PubMed

    Djomba, Janet Klara; Zaletel-Kragelj, Lijana

    2016-12-01

    Research on social networks in public health focuses on how social structures and relationships influence health and health-related behaviour. While the sociocentric approach is used to study complete social networks, the egocentric approach is gaining popularity because of its focus on individuals, groups and communities. One of the participants of the healthy lifestyle health education workshop 'I'm moving', included in the study of social support for exercise, was randomly selected. The participant was denoted as the ego and members of her/his social network as the alteri. Data were collected by personal interviews using a self-made questionnaire. Numerical methods and computer programmes for the analysis of social networks were used to demonstrate the analysis. The size, composition and structure of the egocentric social network were obtained by a numerical analysis. The analysis of composition included homophily and homogeneity. Moreover, the analysis of the structure included the degree of the egocentric network, the strength of the ego-alter ties and the average strength of ties. Visualisation of the network was performed by three freely available computer programmes, namely: Egonet.QF, E-net and Pajek. The computer programmes were described and compared in terms of their usefulness. Both numerical analysis and visualisation have their benefits. The decision about which approach to use depends on the purpose of the social network analysis. While the numerical analysis can be used in large-scale population-based studies, visualisation of personal networks can help health professionals in creating, performing and evaluating preventive programmes, especially those focused on behaviour change.
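
    A numerical summary of a small egocentric network can be computed directly, for example with the networkx package, as sketched below; the ego, alteri, tie strengths, and attributes are invented for illustration, and the calculation is independent of the Egonet.QF, E-net, and Pajek programmes discussed in the paper.

```python
# Numerical summary of a small egocentric network with networkx: network size,
# ego degree, average ego-alter tie strength, and a simple homophily share.
# The alteri, tie weights and attributes are invented for illustration.
import networkx as nx

G = nx.Graph()
ego = "ego"
alteri = {"A": {"sex": "F"}, "B": {"sex": "M"}, "C": {"sex": "F"}, "D": {"sex": "F"}}
ties = {"A": 3, "B": 1, "C": 2, "D": 3}               # tie strength 1 (weak) .. 3 (strong)

G.add_node(ego, sex="F")
for name, attrs in alteri.items():
    G.add_node(name, **attrs)
    G.add_edge(ego, name, weight=ties[name])
G.add_edge("A", "C", weight=1)                        # an alter-alter tie

degree = G.degree(ego)
avg_strength = sum(d["weight"] for _, _, d in G.edges(ego, data=True)) / degree
same_sex = sum(G.nodes[a]["sex"] == G.nodes[ego]["sex"] for a in alteri) / len(alteri)
print(f"network size: {len(alteri)}, ego degree: {degree}, "
      f"mean tie strength: {avg_strength:.2f}, same-sex share: {same_sex:.2f}")
```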

  1. Atmospheric model development in support of SEASAT. Volume 1: Summary of findings

    NASA Technical Reports Server (NTRS)

    Kesel, P. G.

    1977-01-01

    Atmospheric analysis and prediction models of varying (grid) resolution were developed. The models were tested using real observational data for the purpose of assessing the impact of grid resolution on short range numerical weather prediction. The discretionary model procedures were examined so that the computational viability of SEASAT data might be enhanced during the conduct of (future) sensitivity tests. The analysis effort covers: (1) examining the procedures for allowing data to influence the analysis; (2) examining the effects of varying the weights in the analysis procedure; (3) testing and implementing procedures for solving the minimization equation in an optimal way; (4) describing the impact of grid resolution on analysis; and (5) devising and implementing numerous practical solutions to analysis problems, generally.

  2. EPA CHEMICAL PRIORITIZATION COMMUNITY OF PRACTICE.

    EPA Science Inventory

    IN 2005 THE NATIONAL CENTER FOR COMPUTATIONAL TOXICOLOGY (NCCT) ORGANIZED THE EPA CHEMICAL PRIORITIZATION COMMUNITY OF PRACTICE (CPCP) TO PROVIDE A FORUM FOR DISCUSSING THE UTILITY OF COMPUTATIONAL CHEMISTRY, HIGH-THROUGHPUT SCREENING (HTS) AND VARIOUS TOXICOGENOMIC TECHNOLOGIES FOR CH...

  3. Damage Based Analysis (DBA) - Theory, Derivation and Practical Application Using Both an Acceleration and Pseudo Velocity Approach

    NASA Technical Reports Server (NTRS)

    Grillo, Vince

    2017-01-01

    The objective of this presentation is to give a brief overview of the theory behind the (DBA) method, an overview of the derivation and a practical application of the theory using the Python computer language. The Theory and Derivation will use both Acceleration and Pseudo Velocity methods to derive a series of equations for processing by Python. We will take the results and compare both Acceleration and Pseudo Velocity methods and discuss implementation of the Python functions. Also, we will discuss the efficiency of the methods and the amount of computer time required for the solution. In conclusion, (DBA) offers a powerful method to evaluate the amount of energy imparted into a system in the form of both Amplitude and Duration during qualification testing and flight environments. Many forms of steady state and transient vibratory motion can be characterized using this technique. (DBA) provides a more robust alternative to traditional methods such as Power Spectral Density (PSD) using a maximax approach.

  4. Damage Based Analysis (DBA): Theory, Derivation and Practical Application - Using Both an Acceleration and Pseudo-Velocity Approach

    NASA Technical Reports Server (NTRS)

    Grillo, Vince

    2016-01-01

    The objective of this presentation is to give a brief overview of the theory behind the (DBA) method, an overview of the derivation and a practical application of the theory using the Python computer language. The Theory and Derivation will use both Acceleration and Pseudo Velocity methods to derive a series of equations for processing by Python. We will take the results and compare both Acceleration and Pseudo Velocity methods and discuss implementation of the Python functions. Also, we will discuss the efficiency of the methods and the amount of computer time required for the solution. In conclusion, (DBA) offers a powerful method to evaluate the amount of energy imparted into a system in the form of both Amplitude and Duration during qualification testing and flight environments. Many forms of steady state and transient vibratory motion can be characterized using this technique. (DBA) provides a more robust alternative to traditional methods such as Power Spectral Density (PSD) using a Maximax approach.
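
    In the spirit of the Python processing described above, the sketch below computes a pseudo-velocity spectrum by sweeping single-degree-of-freedom oscillators over a base-acceleration pulse and forming PV = omega_n * |z|max from the peak relative displacement; the half-sine pulse, 5% damping, and frequency grid are illustrative assumptions, not the presentation's derivation.

```python
# Pseudo-velocity sketch: sweep SDOF oscillators over a base acceleration
# pulse, take the peak relative displacement z, and form PV = wn * |z|_max.
# Pulse, damping and frequency grid are illustrative assumptions.
import numpy as np
from scipy.signal import lsim

def pseudo_velocity_spectrum(accel, t, freqs_hz, zeta=0.05):
    pv = []
    for fn in freqs_hz:
        wn = 2.0 * np.pi * fn
        # relative displacement transfer function: Z/A = -1 / (s^2 + 2*zeta*wn*s + wn^2)
        _, z, _ = lsim(([-1.0], [1.0, 2.0 * zeta * wn, wn**2]), accel, t)
        pv.append(wn * np.max(np.abs(z)))
    return np.array(pv)

fs = 20000.0
t = np.arange(0.0, 0.1, 1.0 / fs)
pulse = np.where(t < 0.011, 500.0 * np.sin(np.pi * t / 0.011), 0.0)   # half-sine, m/s^2
freqs = np.logspace(1, 3, 30)                                          # 10 Hz .. 1 kHz
pv = pseudo_velocity_spectrum(pulse, t, freqs)
print(f"peak pseudo velocity: {pv.max():.2f} m/s near {freqs[pv.argmax()]:.0f} Hz")
```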

  5. Biomarkers in Breast Cancer – An Update

    PubMed Central

    Schmidt, M.; Fasching, P. A.; Beckmann, M. W.; Kölbl, H.

    2012-01-01

    The therapy of choice for breast cancer patients requiring adjuvant chemo- or radiotherapy is increasingly guided by the principle of weighing the individual effectiveness of the therapy against the associated side effects. This has only been made possible by the discovery and validation of modern biomarkers. In the last decades and in the last few years some biomarkers have been integrated in clinical practice and a number have been included in modern study concepts. The importance of biomarkers lies not merely in their prognostic value indicating the future course of disease but also in their use to predict patient response to therapy. Due to the many subgroups, mathematical models and computer-assisted analysis are increasingly being used to assess the prognostic information obtained from established clinical and histopathological factors. In addition to describing some recent computer programmes this overview will focus on established molecular markers which have already been extensively validated in clinical practice and on new molecular markers identified by genome-wide studies. PMID:26640290

  6. Privacy and Data Security under Cloud Computing Arrangements: The Legal Framework and Practical Do's and Don'ts

    ERIC Educational Resources Information Center

    Buckman, Joel; Gold, Stephanie

    2012-01-01

    This article outlines privacy and data security compliance issues facing postsecondary education institutions when they utilize cloud computing and concludes with a practical list of do's and don'ts. Cloud computing does not change an institution's privacy and data security obligations. It does involve reliance on a third party, which requires an…

  7. Exploring the Effects of Web-Mediated Computational Thinking on Developing Students' Computing Skills in a Ubiquitous Learning Environment

    ERIC Educational Resources Information Center

    Tsai, Chia-Wen; Shen, Pei-Di; Tsai, Meng-Chuan; Chen, Wen-Yu

    2017-01-01

    Much application software education in Taiwan can hardly be regarded as practical. The researchers in this study provided a flexible means of ubiquitous learning (u-learning) with a mobile app for students to access the learning material. In addition, the authors also adopted computational thinking (CT) to help students develop practical computing…

  8. Cybersecurity Workforce Development and the Protection of Critical Infrastructure

    DTIC Science & Technology

    2017-03-31

    communications products, and limited travel for site visits and conferencing. The CSCC contains a developed web-based coordination site, computer ... the CSCC. The Best Practices Analyst position maintains a list of best practices, computer-related patches, and standard operating procedures (SOP... involved in conducting vulnerability assessments of computer networks. To adequately exercise and experiment with industry standard software, it was

  9. A practically unconditionally gradient stable scheme for the N-component Cahn-Hilliard system

    NASA Astrophysics Data System (ADS)

    Lee, Hyun Geun; Choi, Jeong-Whan; Kim, Junseok

    2012-02-01

    We present a practically unconditionally gradient stable conservative nonlinear numerical scheme for the N-component Cahn-Hilliard system modeling the phase separation of an N-component mixture. The scheme is based on a nonlinear splitting method and is solved by an efficient and accurate nonlinear multigrid method. The scheme allows us to convert the N-component Cahn-Hilliard system into a system of N-1 binary Cahn-Hilliard equations and significantly reduces the required computer memory and CPU time. We observe that our numerical solutions are consistent with the linear stability analysis results. We also demonstrate the efficiency of the proposed scheme with various numerical experiments.
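
    For orientation only, the sketch below advances the binary Cahn-Hilliard equation with a simple semi-implicit Fourier-spectral step that treats the stiff fourth-order term implicitly; it is far simpler than (and not equivalent to) the practically unconditionally gradient stable nonlinear-splitting multigrid scheme of the paper, and the grid size, interface parameter, and time step are illustrative.

```python
# Semi-implicit spectral step for the binary Cahn-Hilliard equation
#   phi_t = laplacian(phi^3 - phi) - eps^2 * biharmonic(phi),
# with the fourth-order term treated implicitly. A much simpler illustration
# than the paper's N-component nonlinear-splitting multigrid scheme.
import numpy as np

N, L, eps, dt = 128, 2.0 * np.pi, 0.1, 1.0e-3
k = np.fft.fftfreq(N, d=L / N) * 2.0 * np.pi
k2 = k[:, None]**2 + k[None, :]**2             # |k|^2 on the 2-D grid

rng = np.random.default_rng(0)
phi = 0.1 * rng.standard_normal((N, N))        # small random initial mixture

def step(phi):
    nonlinear = phi**3 - phi
    phi_hat = (np.fft.fft2(phi) - dt * k2 * np.fft.fft2(nonlinear)) \
              / (1.0 + dt * eps**2 * k2**2)
    return np.real(np.fft.ifft2(phi_hat))

for _ in range(200):
    phi = step(phi)
print("mean composition (conserved):", float(phi.mean()))
print("phi range after coarsening:", float(phi.min()), float(phi.max()))
```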

  10. Cardiovascular imaging and image processing: Theory and practice - 1975; Proceedings of the Conference, Stanford University, Stanford, Calif., July 10-12, 1975

    NASA Technical Reports Server (NTRS)

    Harrison, D. C.; Sandler, H.; Miller, H. A.

    1975-01-01

    The present collection of papers outlines advances in ultrasonography, scintigraphy, and commercialization of medical technology as applied to cardiovascular diagnosis in research and clinical practice. Particular attention is given to instrumentation, image processing and display. As necessary concomitants to mathematical analysis, recently improved magnetic recording methods using tape or disks and high-speed computers of large capacity are coming into use. Major topics include Doppler ultrasonic techniques, high-speed cineradiography, three-dimensional imaging of the myocardium with isotopes, sector-scanning echocardiography, and commercialization of the echocardioscope. Individual items are announced in this issue.

  11. Collaborative Software Development in Support of Fast Adaptive AeroSpace Tools (FAAST)

    NASA Technical Reports Server (NTRS)

    Kleb, William L.; Nielsen, Eric J.; Gnoffo, Peter A.; Park, Michael A.; Wood, William A.

    2003-01-01

    A collaborative software development approach is described. The software product is an adaptation of proven computational capabilities combined with new capabilities to form the Agency's next generation aerothermodynamic and aerodynamic analysis and design tools. To efficiently produce a cohesive, robust, and extensible software suite, the approach uses agile software development techniques; specifically, project retrospectives, the Scrum status meeting format, and a subset of Extreme Programming's coding practices are employed. Examples are provided which demonstrate the substantial benefits derived from employing these practices. Also included is a discussion of issues encountered when porting legacy Fortran 77 code to Fortran 95 and a Fortran 95 coding standard.

  12. An Update on Physician Practice Cost Shares

    PubMed Central

    Dayhoff, Debra A.; Cromwell, Jerry; Rosenbach, Margo L.

    1993-01-01

    The 1988 physicians' practice costs and income survey (PPCIS) collected detailed costs, revenues, and incomes data for a sample of 3,086 physicians. These data are utilized to update the Health Care Financing Administration (HCFA) cost shares used in calculating the medicare economic index (MEI) and the geographic practice cost index (GPCI). Cost shares were calculated for the national sample, for 16 specialty groupings, for urban and rural areas, and for 9 census divisions. Although statistical tests reveal that cost shares differ across specialties and geographic areas, sensitivity analysis shows that these differences are small enough to have trivial effects in computing the MEI and GPCI. These results may inform policymakers on one aspect of the larger issue of whether physician payments should vary by geographic location or specialty. PMID:10130573

  13. Contemporary analysis of practicing otolaryngologists.

    PubMed

    Harrill, Willard C; Melon, David E; Seshul, Merritt J; Katz, Marc S; Zanation, Adam M

    2018-05-04

    To investigate contemporary issues facing practicing otolaryngologists including workforce dynamics, ancillary service modeling, otolaryngic allergy integration, ambulatory surgery center utilization, and relevant certificate of need legislation. A cross-sectional survey analysis of academic and private practicing otolaryngologists in North and South Carolina in 2016. A cross-sectional survey was e-mailed to 510 practicing otolaryngologists in North and South Carolina. A 21.3% survey response rate was achieved. Otolaryngology workforce was defined by horizontal aggregation of otolaryngologists into larger group models, with fewer solo practitioners being replaced by younger otolaryngologists or employing otolaryngology extenders. Excluding academic practice, few otolaryngologists have chosen direct hospital employment as a career option, although otolaryngologists with fewer years of practice are pursuing that option with greater frequency. Ancillary services showed audiology and hearing aid services being the most common, followed by otolaryngic allergy, point-of-service computed tomography, and ultrasound. Although otolaryngologists tend to avoid vertical integration, ambulatory surgery center (ASC) ownership trends favor a joint venture model with a hospital system partner. Most otolaryngologists favor changes to certificate of need legislation to improve patient access to these lower-cost facilities, regardless of whether they currently utilize or have access to an ASC. Otolaryngology is uniquely positioned to adapt and respond to current paradigm shifts within ambulatory medicine. Further analysis is needed to prepare current and future otolaryngologists for the demands and opportunities these challenges pose as patient-centered care models and consumer dynamics shape future patient expectations and utilization of healthcare. 5. Laryngoscope, 2018. © 2018 The American Laryngological, Rhinological and Otological Society, Inc.

  14. From Mind to Body: Is Mental Practice Effective on Strength Gains? A Meta-Analysis.

    PubMed

    Manochio, João Paulo; Lattari, Eduardo; Portugal, Eduardo Matta Mello; Monteiro-Junior, Renato Sobral; Paes, Flávia; Budde, Henning; de Tarso Veras Farinatti, Paulo; Arias-Carrión, Oscar; Wegner, Mirko; Carta, Mauro Giovanni; Mura, Gioia; Ferreira Rocha, Nuno Barbosa; Almada, Leonardo Ferreira; Nardi, Antonio Egidio; Yuan, Ti-Fei; Machado, Sergio

    2015-01-01

    Mental practice is an internal reproduction of a motor act whose intention is to promote learning and improve motor skills. Some studies have shown that other cognitive strategies also increase strength and muscular resistance in healthy people by enhancing performance during dynamic tasks. Mental training sessions may be essential to improving muscle strength in different subjects. The aim of this study was to systematically review and meta-analyze studies that assessed whether mental practice is effective in improving muscular strength. We conducted a computerized search in PubMed/Medline, ISI Web of Knowledge and Scielo, complemented by manual searches, for papers written in English between 1991 and 2014. There were 44 studies in PubMed/Medline, 631 in ISI Web of Knowledge, 11 in Scielo and 3 in the manual searches. After excluding studies that were duplicates, unrelated to the topic by title and abstract, or differed in samples and methodologies, a meta-analysis of 4 studies was carried out to identify the dose-response relationship. We did not find evidence that mental practice is effective in increasing strength in healthy individuals. There is no evidence that mental practice alone can induce strength gains or optimize training effects.

  15. Prevalence and determinants of risky sexual practice in Ethiopia: Systematic review and Meta-analysis.

    PubMed

    Muche, Achenef Asmamaw; Kassa, Getachew Mullu; Berhe, Abadi Kidanemariam; Fekadu, Gedefaw Abeje

    2017-09-06

    Risky sexual practice is a major public health problem in Ethiopia. There are various studies on the prevalence and determinants of risky sexual practice in different regions of the country, but there is no study which shows the national estimate of risky sexual practices in Ethiopia. Therefore, this review was conducted to estimate the national pooled prevalence of risky sexual practice and its risk factors in Ethiopia. The Preferred Reporting Items for Systematic Reviews and Meta-Analyses guideline was followed to review published and unpublished studies in Ethiopia. The databases used were PubMed, Google Scholar, CINAHL and African Journals Online. Search terms were risky sexual behavior, risky sexual practice, unprotected sex, multiple sexual partner, early sexual initiation, and/or Ethiopia. The Joanna Briggs Institute Meta-Analysis of Statistics Assessment and Review Instrument was used for critical appraisal. The meta-analysis was conducted using Review Manager software. Descriptive information of the studies was presented in narrative form and quantitative results were presented in forest plots. The Cochran Q test and I² statistic were used to test heterogeneity across studies. The pooled prevalence estimate and odds ratios with 95% confidence intervals were computed with a random-effects model. A total of 31 studies with 43,695 participants were included in the meta-analysis. The pooled prevalence of risky sexual practice was 42.80% (95% CI: 35.64%, 49.96%). Being male (OR: 1.69; 95% CI: 1.21, 2.37), substance use (OR: 3.42; 95% CI: 1.41, 8.31), peer pressure (OR: 3.41; 95% CI: 1.69, 6.87) and watching pornography (OR: 3.6; 95% CI: 2.21, 5.86) were factors associated with an increase in risky sexual practices. The prevalence of risky sexual practices is high in Ethiopia. Being male, substance use, peer pressure and viewing pornographic materials were found to be associated with risky sexual practices. Therefore, life skills training is recommended to reduce peer pressure among individuals. Interventions should be designed to reduce substance use and viewing of pornography.
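
    The pooling step described above can be illustrated with a minimal random-effects computation. The sketch below uses the standard DerSimonian-Laird estimator and invented study counts; it is not the review's actual analysis, which was run in Review Manager.

      import math

      # Hypothetical per-study data: (participants with risky practice, sample size).
      studies = [(120, 300), (85, 250), (200, 410), (60, 180)]

      p = [c / n for c, n in studies]                              # study prevalences
      v = [pi * (1 - pi) / n for pi, (c, n) in zip(p, studies)]    # within-study variances

      w = [1 / vi for vi in v]                                     # fixed-effect weights
      p_fixed = sum(wi * pi for wi, pi in zip(w, p)) / sum(w)

      # DerSimonian-Laird between-study variance (tau^2).
      Q = sum(wi * (pi - p_fixed) ** 2 for wi, pi in zip(w, p))
      C = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
      tau2 = max(0.0, (Q - (len(studies) - 1)) / C)

      # Random-effects pooled prevalence with a 95% confidence interval.
      w_star = [1 / (vi + tau2) for vi in v]
      pooled = sum(wi * pi for wi, pi in zip(w_star, p)) / sum(w_star)
      se = math.sqrt(1 / sum(w_star))
      print(f"pooled prevalence = {pooled:.3f} "
            f"(95% CI {pooled - 1.96 * se:.3f} to {pooled + 1.96 * se:.3f})")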

  16. Data-Aware Retrodiction for Asynchronous Harmonic Measurement in a Cyber-Physical Energy System.

    PubMed

    Liu, Youda; Wang, Xue; Liu, Yanchi; Cui, Sujin

    2016-08-18

    Cyber-physical energy systems provide a networked solution for safety, reliability and efficiency problems in smart grids. On the demand side, the secure and trustworthy energy supply requires real-time supervising and online power quality assessing. Harmonics measurement is necessary in power quality evaluation. However, under the large-scale distributed metering architecture, harmonic measurement faces the out-of-sequence measurement (OOSM) problem, which is the result of latencies in sensing or the communication process and brings deviations in data fusion. This paper depicts a distributed measurement network for large-scale asynchronous harmonic analysis and exploits a nonlinear autoregressive model with exogenous inputs (NARX) network to reorder the out-of-sequence measuring data. The NARX network gets the characteristics of the electrical harmonics from practical data rather than the kinematic equations. Thus, the data-aware network approximates the behavior of the practical electrical parameter with real-time data and improves the retrodiction accuracy. Theoretical analysis demonstrates that the data-aware method maintains a reasonable consumption of computing resources. Experiments on a practical testbed of a cyber-physical system are implemented, and harmonic measurement and analysis accuracy are adopted to evaluate the measuring mechanism under a distributed metering network. Results demonstrate an improvement of the harmonics analysis precision and validate the asynchronous measuring method in cyber-physical energy systems.
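
    The retrodiction idea can be sketched with a plain linear ARX regression: a delayed sample is reconstructed from lagged outputs and a lagged exogenous input. This is only a simplified stand-in with synthetic data; the cited work trains a neural NARX network rather than a linear model.

      import numpy as np

      rng = np.random.default_rng(0)
      t = np.arange(2000)
      u = np.sin(2 * np.pi * t / 50)                     # exogenous input (e.g., fundamental)
      y = 0.3 * np.sin(2 * np.pi * t / 50 + 0.4) + 0.05 * rng.standard_normal(t.size)

      lags = 5
      # Regressors built from lagged outputs and lagged inputs (ARX structure).
      X = np.asarray([np.concatenate((y[k - lags:k], u[k - lags:k]))
                      for k in range(lags, t.size)])
      theta, *_ = np.linalg.lstsq(X, y[lags:], rcond=None)   # least-squares ARX fit

      # "Retrodict" a sample that arrived late from its neighbours' history.
      k = 1500
      estimate = np.concatenate((y[k - lags:k], u[k - lags:k])) @ theta
      print(f"measured {y[k]:.4f}, retrodicted {estimate:.4f}")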

  17. Embedding Secure Coding Instruction into the IDE: Complementing Early and Intermediate CS Courses with ESIDE

    ERIC Educational Resources Information Center

    Whitney, Michael; Lipford, Heather Richter; Chu, Bill; Thomas, Tyler

    2018-01-01

    Many of the software security vulnerabilities that people face today can be remediated through secure coding practices. A critical step toward the practice of secure coding is ensuring that our computing students are educated on these practices. We argue that secure coding education needs to be included across a computing curriculum. We are…

  18. Immersogeometric cardiovascular fluid–structure interaction analysis with divergence-conforming B-splines

    PubMed Central

    Kamensky, David; Hsu, Ming-Chen; Yu, Yue; Evans, John A.; Sacks, Michael S.; Hughes, Thomas J. R.

    2016-01-01

    This paper uses a divergence-conforming B-spline fluid discretization to address the long-standing issue of poor mass conservation in immersed methods for computational fluid–structure interaction (FSI) that represent the influence of the structure as a forcing term in the fluid subproblem. We focus, in particular, on the immersogeometric method developed in our earlier work, analyze its convergence for linear model problems, then apply it to FSI analysis of heart valves, using divergence-conforming B-splines to discretize the fluid subproblem. Poor mass conservation can manifest as effective leakage of fluid through thin solid barriers. This leakage disrupts the qualitative behavior of FSI systems such as heart valves, which exist specifically to block flow. Divergence-conforming discretizations can enforce mass conservation exactly, avoiding this problem. To demonstrate the practical utility of immersogeometric FSI analysis with divergence-conforming B-splines, we use the methods described in this paper to construct and evaluate a computational model of an in vitro experiment that pumps water through an artificial valve. PMID:28239201

  19. Quantification of sensory and food quality: the R-index analysis.

    PubMed

    Lee, Hye-Seong; van Hout, Danielle

    2009-08-01

    The accurate quantification of sensory difference/similarity between foods, as well as consumer acceptance/preference and concepts, is greatly needed to optimize and maintain food quality. The R-Index is one class of measures of the degree of difference/similarity, and was originally developed for sensory difference tests for food quality control, product development, and so on. The index is based on signal detection theory and is free of the response bias that can invalidate difference testing protocols, including categorization and same-different and A-Not A tests. It is also a nonparametric analysis, making no assumptions about sensory distributions, and is simple to compute and understand. The R-Index is also flexible in its application. Methods based on R-Index analysis have been used as detection and sensory difference tests, as simple alternatives to hedonic scaling, and for the measurement of consumer concepts. This review indicates the various computational strategies for the R-Index and its practical applications to consumer and sensory measurements in food science.
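
    A minimal sketch of one common way to compute the R-Index from ratings is given below; it treats the index as the probability (ties counted as half) that a randomly chosen "signal" sample is rated above a randomly chosen "noise" sample. The ratings are invented for illustration.

      # Higher ratings mean more "signal-like" (e.g., "sure it is different") responses.
      signal_ratings = [6, 5, 5, 4, 6, 3, 5]   # e.g., test product
      noise_ratings = [3, 4, 2, 5, 3, 4, 2]    # e.g., control product

      wins = ties = 0
      for s in signal_ratings:
          for n in noise_ratings:
              if s > n:
                  wins += 1
              elif s == n:
                  ties += 1

      r_index = (wins + 0.5 * ties) / (len(signal_ratings) * len(noise_ratings))
      print(f"R-Index = {r_index:.3f}")   # 0.5 corresponds to no detectable difference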

  20. Comparative analysis of two discretizations of Ricci curvature for complex networks.

    PubMed

    Samal, Areejit; Sreejith, R P; Gu, Jiao; Liu, Shiping; Saucan, Emil; Jost, Jürgen

    2018-06-05

    We have performed an empirical comparison of two distinct notions of discrete Ricci curvature for graphs or networks, namely, the Forman-Ricci curvature and Ollivier-Ricci curvature. Importantly, these two discretizations of the Ricci curvature were developed based on different properties of the classical smooth notion, and thus, the two notions shed light on different aspects of network structure and behavior. Nevertheless, our extensive computational analysis in a wide range of both model and real-world networks shows that the two discretizations of Ricci curvature are highly correlated in many networks. Moreover, we show that if one considers the augmented Forman-Ricci curvature which also accounts for the two-dimensional simplicial complexes arising in graphs, the observed correlation between the two discretizations is even higher, especially, in real networks. Besides the potential theoretical implications of these observations, the close relationship between the two discretizations has practical implications whereby Forman-Ricci curvature can be employed in place of Ollivier-Ricci curvature for faster computation in larger real-world networks whenever coarse analysis suffices.
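
    The Forman-Ricci side of the comparison is simple enough to sketch directly. The snippet below uses the combinatorial forms commonly quoted for unweighted graphs (plain: 4 - deg(u) - deg(v); augmented: the same plus three times the number of triangles through the edge) and assumes the networkx package; it is an illustration, not the authors' code.

      import networkx as nx

      def forman_curvature(G, u, v, augmented=False):
          """Forman-Ricci curvature of edge (u, v) in an unweighted, undirected graph."""
          curv = 4 - G.degree(u) - G.degree(v)
          if augmented:
              # Triangles through the edge contribute as two-dimensional faces.
              curv += 3 * len(list(nx.common_neighbors(G, u, v)))
          return curv

      G = nx.karate_club_graph()
      for edge in list(G.edges())[:5]:
          print(edge, forman_curvature(G, *edge), forman_curvature(G, *edge, augmented=True))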

  1. Memory consolidation and contextual interference effects with computer games.

    PubMed

    Shewokis, Patricia A

    2003-10-01

    Some investigators of the contextual interference effect contend that there is a direct relation between the amount of practice and the contextual interference effect based on the prediction that the improvement in learning tasks in a random practice schedule, compared to a blocked practice schedule, increases in magnitude as the amount of practice during acquisition on the tasks increases. Research using computer games in contextual interference studies has yielded a large effect (f = .50) with a random practice schedule advantage during transfer. These investigations had a total of 36 and 72 acquisition trials, respectively. The present study tested this prediction by having 72 college students, who were randomly assigned to a blocked or random practice schedule, practice 102 trials of three computer-game tasks across three days. After a 24-hr. interval, 6 retention and 5 transfer trials were performed. Dependent variables were time to complete an event in seconds and number of errors. No significant differences were found for retention and transfer. These results are discussed in terms of how the amount of practice, task-related factors, and memory consolidation mediate the contextual interference effect.

  2. The Pursuit of K: Reflections on the Current State-of-the-Art in Stress Intensity Factor Solutions for Practical Aerospace Applications

    NASA Technical Reports Server (NTRS)

    McClung, R. Craig; Lee, Yi-Der; Cardinal, Joseph W.; Guo, Yajun

    2012-01-01

    The elastic stress intensity factor (SIF, commonly denoted as K) is the foundation of practical fracture mechanics (FM) analysis for aircraft structures. This single parameter describes the first-order effects of stress magnitude and distribution as well as the geometry of both structure/component and crack. Hence, the calculation of K is often the most significant step in fatigue analysis based on FM. This presentation will provide several reflections on the current state-of-the-art in SIF solution methods used for practical aerospace applications, including a brief historical perspective, descriptions of some recent and ongoing advances, and comments on some remaining challenges. Newman and Raju made significant early contributions to practical structural analysis by developing closed-form SIF equations for surface and corner cracks in simplified geometries, often based on empirical fits of finite element (FE) solutions. Those solutions (and others like them) were sometimes revised as new analyses were conducted or limitations discovered. The foundational solutions have exhibited striking longevity, despite the relatively "coarse" FE models employed many decades ago. However, in recent years, the accumulation of different generations of solutions for the same nominal geometry has led to some confusion (which solution is correct?), and steady increases in computational capabilities have facilitated the discovery of inaccuracies in some (not all!) of the legacy solutions. Some examples of problems and solutions are presented and discussed, including the challenge of maintaining consistency with legacy design applications. As computational power has increased, the prospect of calculating large numbers of SIF solutions for specific complex geometries with advanced numerical methods has grown more attractive. Fawaz and Andersson, for example, have been generating literally millions of new SIF solutions for different combinations of multiple cracks under simplified loading schemes using p-version FE methods. These data are invaluable, but questions remain about their practical use, because the tabular databases of key results needed to support practical life analysis can occupy gigabytes of storage for only a few classes of geometries. The prospect of using such advanced numerical methods to calculate in real time only those K solutions actually needed to support a specific crack growth analysis is also tempting, but the stark reality is that the computational cost is still so high that the approach is not practical except for specific, critical application problems. Some thoughts are offered about alternative paradigms. Compounding approaches are some of the earliest building blocks of SIF development for more complex geometries. These approaches are especially attractive because of their very low computational cost and their conceptual robustness; they are, in some ways, an intriguing contrast and complement to the brute-force numerical methods. In recent years, researchers at NRC-Canada have published remarkable results showing how compounding approaches can be used to generate accurate solutions for very difficult problems. Examples are provided of some successes--and some limitations--using this approach. These closed-form, tabulated numerical, and compounding approaches have typically been used for simple remote loading with simple load paths to the crack. However, many significant cracks occur in complex stress gradient fields. 
This is a job for weight function (WF) methods, where the arbitrary stress distribution on the crack plane in the corresponding uncracked body (typically determined using FE methods) is used to determine K. Several significant recent advances in WF methods and solutions are highlighted here. Fueled by advanced 3D numerical methods, many new solutions have been generated for classic geometries such as surface and corner cracks with wide ranges of geometrical validity. A new WF formulation has also been developed for part-through cracks considering the arbitrary stress gradients in all directions in the crack plane (so-called bivariant solutions). Basic WF methods have recently been combined with analytical expressions for crack plane stresses to develop a large family of accurate SIF solutions for corner, surface, and through cracks at internal or external notches with very wide ranges of shapes, sizes, acuities, and offsets. Finally, WF solutions are much faster than FE or boundary element solutions, but can still be much slower than simple closed-form solutions, especially for bivariant solutions that can require 2D numerical integration. Novel pre-integration and dynamic tabular methods have been developed that substantially increase the speed of these advanced WF solutions. The practical utility of advanced SIF methods, including both WF and direct numerical methods, is greatly enhanced if the FM life analysis can be directly and efficiently linked with digital models of the actual structure or component (e.g., FE models for stress analysis). Two recent advances of this type will be described. One approach directly interfaces the FM life analysis with the FE model of the uncracked component (including stress results). Through a powerful graphical user interface, simplified FM life models can be constructed (and visualized) directly on the component model, with the computer collecting the geometry and stress gradient information needed for the life calculation. An even more powerful paradigm uses expert logic to automatically build an optimum simple fracture model at any and every desired location in the component model, perform the life calculation, and even generate fatigue crack growth life contour maps, all with minimal user intervention. This paradigm has also been extended to the automatic calculation of fracture risk, considering uncertainty or variability in key input parameters such as initial crack size or location. Another new integrated approach links the engineering life analysis, the component model, and a 3D numerical fracture analysis built with the same component model to generate a table of SIF values at a specific location that can then be employed efficiently to perform the life calculation. Some attention must be given to verification and validation (V&V) issues and challenges: how good are these SIF solutions, how good is good enough, and does anyone believe the life answer? It is important to think critically about the different sources of error or uncertainty and to perform V&V in a hierarchical, building-block manner. Some accuracy issues for SIF solutions, for example, may actually involve independent material behavior issues, such as constraint loss effects for crack fronts near component surfaces, and can be a source of confusion. Recommendations are proposed for improved V&V approaches.
This presentation will briefly but critically survey the range of issues and advances mentioned above, with a particular view towards assembling an integrated approach that combines different methods to create practical tools for real-world design and analysis problems. Examples will be selectively drawn from the recent literature, from recent enhancements in the NASGRO and DARWIN computer codes, and from previously unpublished research

  3. An Online Forum As a Qualitative Research Method: Practical Issues

    PubMed Central

    Im, Eun-Ok; Chee, Wonshik

    2008-01-01

    Background Despite positive aspects of online forums as a qualitative research method, very little is known about practical issues involved in using online forums for data collection, especially for a qualitative research project. Objectives The purpose of this paper is to describe the practical issues that the researchers encountered in implementing an online forum as a qualitative component of a larger study on cancer pain experience. Method Throughout the study process, the research staff recorded issues ranging from minor technical problems to serious ethical dilemmas as they arose and wrote memos about them. The memos and written records of discussions were reviewed and analyzed using the content analysis suggested by Weber. Results Two practical issues related to credibility were identified: a high response and retention rate and automatic transcripts. An issue related to dependability was the participants’ easy forgetfulness. The issues related to confirmability were difficulties in theoretical saturation and unstandardized computer and Internet jargon. A security issue related to hacking attempts was noted as well. Discussion The analysis of these issues suggests several implications for future researchers who want to use online forums as a qualitative data collection method. PMID:16849979

  4. Methods of space radiation dose analysis with applications to manned space systems

    NASA Technical Reports Server (NTRS)

    Langley, R. W.; Billings, M. P.

    1972-01-01

    The full potential of state-of-the-art space radiation dose analysis for manned missions has not been exploited. Point doses have been overemphasized, and the critical dose to the bone marrow has been only crudely approximated, despite the existence of detailed man models and computer codes for dose integration in complex geometries. The method presented makes it practical to account for the geometrical detail of the astronaut as well as the vehicle. Discussed are the major assumptions involved and the concept of applying the results of detailed proton dose analysis to the real-time interpretation of on-board dosimetric measurements.

  5. Lung sound analysis for wheeze episode detection.

    PubMed

    Jain, Abhishek; Vepa, Jithendra

    2008-01-01

    Listening to and interpreting lung sounds with a stethoscope has long been an important component of screening and diagnosing lung diseases. However, this practice has always been vulnerable to poor audibility, inter-observer variation (between different physicians) and poor reproducibility. Computerized analysis of lung sounds is therefore seen as a likely aid to objective diagnosis of lung diseases. In this paper we aim at automatic analysis of lung sounds for wheeze episode detection and quantification. The proposed algorithm integrates and analyses a set of parameters based on the ATS (American Thoracic Society) definition of wheezes. It is robust and computationally simple, and yielded a sensitivity of 84% and a specificity of 86%.

  6. [Computational medical imaging (radiomics) and potential for immuno-oncology].

    PubMed

    Sun, R; Limkin, E J; Dercle, L; Reuzé, S; Zacharaki, E I; Chargari, C; Schernberg, A; Dirand, A S; Alexis, A; Paragios, N; Deutsch, É; Ferté, C; Robert, C

    2017-10-01

    The arrival of immunotherapy has profoundly changed the management of multiple cancers, obtaining unexpected tumour responses. However, until now, the majority of patients do not respond to these new treatments. The identification of biomarkers to determine precociously responding patients is a major challenge. Computational medical imaging (also known as radiomics) is a promising and rapidly growing discipline. This new approach consists in the analysis of high-dimensional data extracted from medical imaging, to further describe tumour phenotypes. This approach has the advantages of being non-invasive, capable of evaluating the tumour and its microenvironment in their entirety, thus characterising spatial heterogeneity, and being easily repeatable over time. The end goal of radiomics is to determine imaging biomarkers as decision support tools for clinical practice and to facilitate better understanding of cancer biology, allowing the assessment of the changes throughout the evolution of the disease and the therapeutic sequence. This review will develop the process of computational imaging analysis and present its potential in immuno-oncology. Copyright © 2017 Société française de radiothérapie oncologique (SFRO). Published by Elsevier SAS. All rights reserved.

  7. Fundamental analysis of the failure of polymer-based fiber reinforced composites

    NASA Technical Reports Server (NTRS)

    Kanninen, M. F.; Rybicki, E. F.; Griffith, W. I.; Broek, D.

    1976-01-01

    A mathematical model is described which will permit predictions of the strength of fiber reinforced composites containing known flaws to be made from the basic properties of their constituents. The approach was to embed a local heterogeneous region (LHR) surrounding the crack tip into an anisotropic elastic continuum. The model is intended to (1) permit an explicit analysis of the micromechanical processes involved in the fracture process, and (2) remain simple enough to be useful in practical computations. Computations for arbitrary flaw size and orientation under arbitrary applied load combinations were performed for unidirectional composites with linear elastic-brittle constituent behavior. The mechanical properties were nominally those of graphite epoxy. With the rupture properties arbitrarily varied to test the capability of the model to reflect real fracture modes in fiber composites, it was shown that fiber breakage, matrix crazing, crack bridging, matrix-fiber debonding, and axial splitting can all occur during a period of (gradually) increasing load prior to catastrophic fracture. The computations reveal qualitatively the sequential nature of the stable crack growth process that precedes fracture.

  8. On the elastic–plastic decomposition of crystal deformation at the atomic scale

    DOE PAGES

    Stukowski, Alexander; Arsenlis, A.

    2012-03-02

    Given two snapshots of an atomistic system, taken at different stages of the deformation process, one can compute the incremental deformation gradient field, F, as defined by continuum mechanics theory, from the displacements of atoms. However, such a kinematic analysis of the total deformation does not reveal the respective contributions of elastic and plastic deformation. We develop a practical technique to perform the multiplicative decomposition of the deformation field, F = F_e F_p, into elastic and plastic parts for the case of crystalline materials. The described computational analysis method can be used to quantify plastic deformation in a material due to crystal slip-based mechanisms in molecular dynamics and molecular statics simulations. The knowledge of the plastic deformation field, F_p, and its variation with time can provide insight into the number, motion and localization of relevant crystal defects such as dislocations. As a result, the computed elastic field, F_e, provides information about inhomogeneous lattice strains and lattice rotations induced by the presence of defects.
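
    A minimal numerical illustration of this multiplicative split is given below. The deformation gradient F and the slip-based plastic part F_p are invented here; given both, the elastic part follows as F_e = F F_p^{-1}, which is the relation the record describes (the atomistic estimation of F and F_p itself is not reproduced).

      import numpy as np

      # Invented total deformation gradient.
      F = np.array([[1.02, 0.05, 0.00],
                    [0.00, 0.99, 0.01],
                    [0.00, 0.00, 1.01]])

      # Plastic part from simple shear of magnitude gamma on one slip system
      # with slip direction s and plane normal n (all invented for illustration).
      gamma = 0.04
      s = np.array([1.0, 0.0, 0.0])
      n = np.array([0.0, 1.0, 0.0])
      F_p = np.eye(3) + gamma * np.outer(s, n)

      F_e = F @ np.linalg.inv(F_p)                  # multiplicative split F = F_e F_p
      E_e = 0.5 * (F_e.T @ F_e - np.eye(3))         # elastic Green-Lagrange strain
      print(F_e)
      print(E_e)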

  9. A Reduced Order Model for Whole-Chip Thermal Analysis of Microfluidic Lab-on-a-Chip Systems

    PubMed Central

    Wang, Yi; Song, Hongjun; Pant, Kapil

    2013-01-01

    This paper presents a Krylov subspace projection-based Reduced Order Model (ROM) for whole microfluidic chip thermal analysis, including conjugate heat transfer. Two key steps in the reduced order modeling procedure are described in detail, including (1) the acquisition of a 3D full-scale computational model in the state-space form to capture the dynamic thermal behavior of the entire microfluidic chip; and (2) the model order reduction using the Block Arnoldi algorithm to markedly lower the dimension of the full-scale model. Case studies using practically relevant thermal microfluidic chip are undertaken to establish the capability and to evaluate the computational performance of the reduced order modeling technique. The ROM is compared against the full-scale model and exhibits good agreement in spatiotemporal thermal profiles (<0.5% relative error in pertinent time scales) and over three orders-of-magnitude acceleration in computational speed. The salient model reusability and real-time simulation capability renders it amenable for operational optimization and in-line thermal control and management of microfluidic systems and devices. PMID:24443647
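
    The projection step can be sketched generically: build an orthonormal basis of a block Krylov subspace generated by the system matrix and the input matrix, then project the state-space matrices onto it. The sketch below uses a plain block Arnoldi-style iteration on an invented dense model; the paper's 3D thermal model, moment-matching point and implementation details are not reproduced here.

      import numpy as np

      def block_arnoldi_basis(A, B, m):
          """Orthonormal basis of span{B, AB, ..., A^(m-1) B} (simplified, no deflation)."""
          V, _ = np.linalg.qr(B)
          blocks = [V]
          for _ in range(m - 1):
              W = A @ blocks[-1]
              for Vj in blocks:                  # Gram-Schmidt against earlier blocks
                  W = W - Vj @ (Vj.T @ W)
              Q, _ = np.linalg.qr(W)
              blocks.append(Q)
          return np.hstack(blocks)

      # Invented first-order model  x' = A x + B u,  y = C x.
      rng = np.random.default_rng(1)
      n = 200
      A = -np.eye(n) + 0.01 * rng.standard_normal((n, n))
      B = rng.standard_normal((n, 2))
      C = rng.standard_normal((3, n))

      V = block_arnoldi_basis(A, B, m=5)           # reduced order = 5 blocks x 2 inputs = 10
      A_r, B_r, C_r = V.T @ A @ V, V.T @ B, C @ V  # Galerkin-projected reduced model
      print(A.shape, "->", A_r.shape)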

  10. Studies Related to Computer-Assisted Instruction. Semi-Annual Progress Report on Contract Nonr-624(18) October 1, 1968 through March 31, 1969.

    ERIC Educational Resources Information Center

    Glaser, Robert

    A study of response latency in a drill-and-practice task showed that variability in latency measures could be reduced by the use of self-pacing procedures, but not by the detailed analysis of latency into separate components. Experiments carried out on instructional history variables in teaching a mirror image, oblique line discrimination, showed…

  11. Improved work zone design guidelines and enhanced model of travel delays in work zones : Phase I, portability and scalability of interarrival and service time probability distribution functions for different locations in Ohio and the establishment of impr

    DOT National Transportation Integrated Search

    2006-01-01

    The project focuses on two major issues - the improvement of current work zone design practices and an analysis of : vehicle interarrival time (IAT) and speed distributions for the development of a digital computer simulation model for : queues and t...

  12. Machine learning in genetics and genomics

    PubMed Central

    Libbrecht, Maxwell W.; Noble, William Stafford

    2016-01-01

    The field of machine learning promises to enable computers to assist humans in making sense of large, complex data sets. In this review, we outline some of the main applications of machine learning to genetic and genomic data. In the process, we identify some recurrent challenges associated with this type of analysis and provide general guidelines to assist in the practical application of machine learning to real genetic and genomic data. PMID:25948244

  13. Increasing Touch-Keyboarding Skills in the Middle School Student: "KeyWords" vs. "Type To Learn," Hand Covers vs. No Hand Covers.

    ERIC Educational Resources Information Center

    Reagan, Steven Dallas

    A computer teacher in a middle school in East Tennessee observed that his students were entering the middle school program with computer familiarity but without the touch keyboarding skills necessary to operate the computer efficiently. It was also observed that even with instruction and practice using drill and practice keyboarding software, the…

  14. Computer-aided analysis with Image J for quantitatively assessing psoriatic lesion area.

    PubMed

    Sun, Z; Wang, Y; Ji, S; Wang, K; Zhao, Y

    2015-11-01

    Body surface area is important in determining the severity of psoriasis. However, an objective, reliable, and practical method is still needed for this purpose. We performed computer image analysis (CIA) of the psoriatic area using the ImageJ freeware to determine whether this method could be used for objective evaluation of the psoriatic area. Fifteen psoriasis patients were randomized to be treated with adalimumab or placebo in a clinical trial. At each visit, the psoriasis area of each body site was estimated by two physicians (E-method), and standard photographs were taken. The psoriasis area in the pictures was assessed with CIA using semi-automatic threshold selection (T-method) or manual selection (M-method, gold standard). The results of the three methods were analyzed, with reliability and influencing factors evaluated. Both the T- and E-methods correlated strongly with the M-method, with the T-method showing a slightly stronger correlation. Both the T- and E-methods had good consistency between evaluators. All three methods were able to detect the change in psoriatic area after treatment, although the E-method tended to overestimate. CIA with the ImageJ freeware is reliable and practicable for quantitatively assessing the lesional area of psoriasis. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
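
    The T-method can be caricatured with a few lines of thresholding. The sketch below segments a bright "lesion" patch in a synthetic image with numpy and reports its area fraction; the study itself used the ImageJ freeware on clinical photographs, so the image, values and threshold here are invented.

      import numpy as np

      rng = np.random.default_rng(2)
      skin = rng.uniform(0.2, 0.4, size=(200, 200))                   # synthetic background "skin"
      skin[60:120, 80:150] = rng.uniform(0.6, 0.9, size=(60, 70))     # brighter "lesion" patch

      threshold = 0.5                       # stand-in for the semi-automatic threshold
      lesion_mask = skin > threshold
      lesion_fraction = lesion_mask.mean()  # lesion area as a fraction of the imaged field
      print(f"lesion area: {lesion_mask.sum()} px ({100 * lesion_fraction:.1f}% of image)")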

  15. Single-trial detection of visual evoked potentials by common spatial patterns and wavelet filtering for brain-computer interface.

    PubMed

    Tu, Yiheng; Huang, Gan; Hung, Yeung Sam; Hu, Li; Hu, Yong; Zhang, Zhiguo

    2013-01-01

    Event-related potentials (ERPs) are widely used in brain-computer interface (BCI) systems as input signals conveying a subject's intention. A fast and reliable single-trial ERP detection method can be used to develop a BCI system with both high speed and high accuracy. However, most of single-trial ERP detection methods are developed for offline EEG analysis and thus have a high computational complexity and need manual operations. Therefore, they are not applicable to practical BCI systems, which require a low-complexity and automatic ERP detection method. This work presents a joint spatial-time-frequency filter that combines common spatial patterns (CSP) and wavelet filtering (WF) for improving the signal-to-noise (SNR) of visual evoked potentials (VEP), which can lead to a single-trial ERP-based BCI.
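
    The CSP half of the joint filter can be sketched as a generalized eigenproblem on class-covariance matrices. The snippet below assumes scipy and synthetic epochs shaped (trials, channels, samples); it omits the wavelet-filtering stage and any detail specific to the cited VEP system.

      import numpy as np
      from scipy.linalg import eigh

      def csp_filters(epochs_a, epochs_b):
          """Spatial filters maximising variance for class A while minimising it for class B."""
          def avg_cov(epochs):
              covs = [x @ x.T / np.trace(x @ x.T) for x in epochs]   # normalised covariances
              return np.mean(covs, axis=0)

          Ca, Cb = avg_cov(epochs_a), avg_cov(epochs_b)
          vals, W = eigh(Ca, Ca + Cb)          # generalized symmetric eigenproblem
          return W[:, np.argsort(vals)[::-1]]  # sort filters by discriminative power

      rng = np.random.default_rng(0)
      epochs_target = rng.standard_normal((40, 8, 256))      # invented "target" trials
      epochs_nontarget = rng.standard_normal((40, 8, 256))   # invented "non-target" trials
      W = csp_filters(epochs_target, epochs_nontarget)
      filtered = W[:, :2].T @ epochs_target[0]               # keep the two strongest filters
      print(W.shape, filtered.shape)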

  16. Fun and Arithmetic Practice with Days and Dates.

    ERIC Educational Resources Information Center

    Richbart, Lynn A.

    1985-01-01

    Two worksheets are given, outlining algorithms to help students determine the day of the week an event will occur and to find the date for Easter. The activity provides computational practice. A computer program for determining Easter is also included. (MNS)
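
    The Easter computation lends itself to a compact example. The sketch below uses the well-known anonymous Gregorian ("Meeus/Jones/Butcher") computus, which may differ from the exact worksheet algorithm or the program mentioned in the record; the day-of-week check simply reuses the computed date.

      import datetime

      def easter(year):
          """Gregorian Easter Sunday via the anonymous ("Meeus/Jones/Butcher") computus."""
          a = year % 19
          b, c = divmod(year, 100)
          d, e = divmod(b, 4)
          f = (b + 8) // 25
          g = (b - f + 1) // 3
          h = (19 * a + b - d - g + 15) % 30
          i, k = divmod(c, 4)
          l = (32 + 2 * e + 2 * i - h - k) % 7
          m = (a + 11 * h + 22 * l) // 451
          month, day = divmod(h + l - 7 * m + 114, 31)
          return datetime.date(year, month, day + 1)

      for year in (2024, 2025, 2026):
          d = easter(year)
          print(year, d.isoformat(), d.strftime("%A"))   # day-of-week practice check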

  17. Visualization techniques for computer network defense

    NASA Astrophysics Data System (ADS)

    Beaver, Justin M.; Steed, Chad A.; Patton, Robert M.; Cui, Xiaohui; Schultz, Matthew

    2011-06-01

    Effective visual analysis of computer network defense (CND) information is challenging due to the volume and complexity of both the raw and analyzed network data. A typical CND is comprised of multiple niche intrusion detection tools, each of which performs network data analysis and produces a unique alerting output. The state-of-the-practice in the situational awareness of CND data is the prevalent use of custom-developed scripts by Information Technology (IT) professionals to retrieve, organize, and understand potential threat events. We propose a new visual analytics framework, called the Oak Ridge Cyber Analytics (ORCA) system, for CND data that allows an operator to interact with all detection tool outputs simultaneously. Aggregated alert events are presented in multiple coordinated views with timeline, cluster, and swarm model analysis displays. These displays are complemented with both supervised and semi-supervised machine learning classifiers. The intent of the visual analytics framework is to improve CND situational awareness, to enable an analyst to quickly navigate and analyze thousands of detected events, and to combine sophisticated data analysis techniques with interactive visualization such that patterns of anomalous activities may be more easily identified and investigated.

  18. Segmentation and Image Analysis of Abnormal Lungs at CT: Current Approaches, Challenges, and Future Trends

    PubMed Central

    Mansoor, Awais; Foster, Brent; Xu, Ziyue; Papadakis, Georgios Z.; Folio, Les R.; Udupa, Jayaram K.; Mollura, Daniel J.

    2015-01-01

    The computer-based process of identifying the boundaries of lung from surrounding thoracic tissue on computed tomographic (CT) images, which is called segmentation, is a vital first step in radiologic pulmonary image analysis. Many algorithms and software platforms provide image segmentation routines for quantification of lung abnormalities; however, nearly all of the current image segmentation approaches apply well only if the lungs exhibit minimal or no pathologic conditions. When moderate to high amounts of disease or abnormalities with a challenging shape or appearance exist in the lungs, computer-aided detection systems may be highly likely to fail to depict those abnormal regions because of inaccurate segmentation methods. In particular, abnormalities such as pleural effusions, consolidations, and masses often cause inaccurate lung segmentation, which greatly limits the use of image processing methods in clinical and research contexts. In this review, a critical summary of the current methods for lung segmentation on CT images is provided, with special emphasis on the accuracy and performance of the methods in cases with abnormalities and cases with exemplary pathologic findings. The currently available segmentation methods can be divided into five major classes: (a) thresholding-based, (b) region-based, (c) shape-based, (d) neighboring anatomy–guided, and (e) machine learning–based methods. The feasibility of each class and its shortcomings are explained and illustrated with the most common lung abnormalities observed on CT images. In an overview, practical applications and evolving technologies combining the presented approaches for the practicing radiologist are detailed. ©RSNA, 2015 PMID:26172351

  19. Computer/Mobile Device Screen Time of Children and Their Eye Care Behavior: The Roles of Risk Perception and Parenting.

    PubMed

    Chang, Fong-Ching; Chiu, Chiung-Hui; Chen, Ping-Hung; Miao, Nae-Fang; Chiang, Jeng-Tung; Chuang, Hung-Yi

    2018-03-01

    This study assessed the computer/mobile device screen time and eye care behavior of children and examined the roles of risk perception and parental practices. Data were obtained from a sample of 2,454 child-parent dyads recruited from 30 primary schools in Taipei city and New Taipei city, Taiwan, in 2016. Self-administered questionnaires were collected from students and parents. Fifth-grade students spend more time on new media (computer/smartphone/tablet: 16 hours a week) than on traditional media (television: 10 hours a week). The average daily screen time (3.5 hours) for these children exceeded the American Academy of Pediatrics recommendations (≤2 hours). Multivariate analysis results showed that after controlling for demographic factors, the parents with higher levels of risk perception and parental efficacy were more likely to mediate their child's eye care behavior. Children who reported lower academic performance, who were from non-intact families, reported lower levels of risk perception of mobile device use, had parents who spent more time using computers and mobile devices, and had lower levels of parental mediation were more likely to spend more time using computers and mobile devices; whereas children who reported higher academic performance, higher levels of risk perception, and higher levels of parental mediation were more likely to engage in higher levels of eye care behavior. Risk perception by children and parental practices are associated with the amount of screen time that children regularly engage in and their level of eye care behavior.

  20. Private genome analysis through homomorphic encryption

    PubMed Central

    2015-01-01

    Background The rapid development of genome sequencing technology allows researchers to access large genome datasets. However, outsourcing the data processing to the cloud poses high risks for personal privacy. The aim of this paper is to give a practical solution for this problem using homomorphic encryption. In our approach, all the computations can be performed in an untrusted cloud without requiring the decryption key or any interaction with the data owner, which preserves the privacy of genome data. Methods We present evaluation algorithms for secure computation of the minor allele frequencies and χ2 statistic in a genome-wide association study setting. We also describe how to privately compute the Hamming distance and approximate Edit distance between encrypted DNA sequences. Finally, we compare performance details of using two practical homomorphic encryption schemes - the BGV scheme by Gentry, Halevi and Smart and the YASHE scheme by Bos, Lauter, Loftus and Naehrig. Results The approach with the YASHE scheme analyzes data from 400 people within about 2 seconds and picks a variant associated with disease from 311 spots. For another task, using the BGV scheme, it took about 65 seconds to securely compute the approximate Edit distance for DNA sequences of size 5K and figure out the differences between them. Conclusions The performance numbers for BGV are better than YASHE when homomorphically evaluating deep circuits (like the Hamming distance algorithm or approximate Edit distance algorithm). On the other hand, it is more efficient to use the YASHE scheme for a low-degree computation, such as minor allele frequencies or χ2 test statistic in a case-control study. PMID:26733152
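
    As a plaintext point of reference for what the encrypted pipeline evaluates, the snippet below computes a minor allele frequency and the allelic chi-square statistic from a 2x2 table of case/control allele counts. The counts are invented, and no homomorphic encryption library is used here.

      # 2x2 table of (minor, major) allele counts for cases and controls (invented).
      case_minor, case_major = 130, 270
      ctrl_minor, ctrl_major = 90, 310

      table = [[case_minor, case_major], [ctrl_minor, ctrl_major]]
      row_totals = [sum(r) for r in table]
      col_totals = [case_minor + ctrl_minor, case_major + ctrl_major]
      total = sum(row_totals)

      chi2 = 0.0
      for i in range(2):
          for j in range(2):
              expected = row_totals[i] * col_totals[j] / total
              chi2 += (table[i][j] - expected) ** 2 / expected

      maf_cases = case_minor / row_totals[0]
      print(f"MAF(cases) = {maf_cases:.3f}, chi-square = {chi2:.3f}")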

  1. Universal blind quantum computation for hybrid system

    NASA Astrophysics Data System (ADS)

    Huang, He-Liang; Bao, Wan-Su; Li, Tan; Li, Feng-Guang; Fu, Xiang-Qun; Zhang, Shuo; Zhang, Hai-Long; Wang, Xiang

    2017-08-01

    As progress on building quantum computers continues, first-generation practical quantum computers will become available to ordinary users in a cloud style similar to today's IBM Quantum Experience. Clients can remotely access the quantum servers using simple devices. In such a situation, it is of prime importance to protect the security of the client's information. Blind quantum computation protocols enable a client with limited quantum technology to delegate her quantum computation to a quantum server without leaking any private information. To date, blind quantum computation has been considered only for individual quantum systems. However, a practical universal quantum computer is likely to be a hybrid system. Here, we take the first step toward constructing a framework of blind quantum computation for hybrid systems, which provides a more feasible way toward scalable blind quantum computation.

  2. Problems Related to Parallelization of CFD Algorithms on GPU, Multi-GPU and Hybrid Architectures

    NASA Astrophysics Data System (ADS)

    Błażewicz, Marek; Kurowski, Krzysztof; Ludwiczak, Bogdan; Napierała, Krystyna

    2010-09-01

    Computational Fluid Dynamics (CFD) is a branch of fluid mechanics that uses numerical methods and algorithms to solve and analyze fluid flows. CFD is used in various domains, such as oil and gas reservoir uncertainty analysis, aerodynamic body shape optimization (e.g. planes, cars, ships, sport helmets, skis), analysis of natural phenomena, numerical simulation for weather forecasting, and realistic visualizations. CFD problems are very complex and need a lot of computational power to obtain results in a reasonable time. We have implemented a parallel application for two-dimensional CFD simulation with a free surface approximation (MAC method) using new hardware architectures, in particular multi-GPU and hybrid computing environments. For this purpose we decided to use NVIDIA graphics cards with the CUDA environment due to its simplicity of programming and good computational performance. We used a finite difference discretization of the Navier-Stokes equations, where the fluid is propagated over an Eulerian grid. In this model, the behavior of the fluid inside a cell depends only on the properties of the local, surrounding cells, so it is well suited to the GPU-based architecture. In this paper we demonstrate how to use the computing power of GPUs efficiently for CFD. Additionally, we present some best practices to help users analyze and improve the performance of CFD applications executed on GPUs. Finally, we discuss various challenges around the multi-GPU implementation using the example of matrix multiplication.

  3. The direction of cloud computing for Malaysian education sector in 21st century

    NASA Astrophysics Data System (ADS)

    Jaafar, Jazurainifariza; Rahman, M. Nordin A.; Kadir, M. Fadzil A.; Shamsudin, Syadiah Nor; Saany, Syarilla Iryani A.

    2017-08-01

    In the 21st century, technology has turned the learning environment into a new way of education, making learning systems more effective and systematic. Nowadays, education institutions face many challenges in ensuring that the teaching and learning process runs smoothly and manageably. Some of the challenges in current education management are the lack of integrated systems, high cost of maintenance, difficulty of configuration and deployment, and the complexity of storage provision. Digital learning is an instructional practice that uses technology to make the learning experience more effective and the education process more systematic and attractive. Digital learning can be considered one of the prominent applications implemented in a cloud computing environment. Cloud computing is a type of networked resource that provides on-demand services, where users can access applications from any location and at any time. It also promises to minimize the cost of maintenance and provides flexible data storage capacity. The aim of this article is to review the definition and types of cloud computing for improving digital learning management as required in 21st century education. The analysis of the digital learning context focuses on primary schools in Malaysia. Types of cloud applications and services in the education sector are also discussed. Finally, a gap analysis and directions for cloud computing in the education sector to face the 21st century challenges are suggested.

  4. A Bayesian network meta-analysis for binary outcome: how to do it.

    PubMed

    Greco, Teresa; Landoni, Giovanni; Biondi-Zoccai, Giuseppe; D'Ascenzo, Fabrizio; Zangrillo, Alberto

    2016-10-01

    This study presents an overview of conceptual and practical issues of a network meta-analysis (NMA), particularly focusing on its application to randomised controlled trials with a binary outcome of interest. We start from general considerations on NMA to specifically appraise how to collect study data, structure the analytical network and specify the requirements for different models and parameter interpretations, with the ultimate goal of providing physicians and clinician-investigators a practical tool to understand pros and cons of NMA. Specifically, we outline the key steps, from the literature search to sensitivity analysis, necessary to perform a valid NMA of binomial data, exploiting Markov Chain Monte Carlo approaches. We also apply this analytical approach to a case study on the beneficial effects of volatile agents compared to total intravenous anaesthetics for surgery to further clarify the statistical details of the models, diagnostics and computations. Finally, datasets and models for the freeware WinBUGS package are presented for the anaesthetic agent example. © The Author(s) 2013.

  5. Comparison of software packages for detecting differential expression in RNA-seq studies

    PubMed Central

    Seyednasrollah, Fatemeh; Laiho, Asta

    2015-01-01

    RNA-sequencing (RNA-seq) has rapidly become a popular tool to characterize transcriptomes. A fundamental research problem in many RNA-seq studies is the identification of reliable molecular markers that show differential expression between distinct sample groups. Together with the growing popularity of RNA-seq, a number of data analysis methods and pipelines have already been developed for this task. Currently, however, there is no clear consensus about the best practices yet, which makes the choice of an appropriate method a daunting task especially for a basic user without a strong statistical or computational background. To assist the choice, we perform here a systematic comparison of eight widely used software packages and pipelines for detecting differential expression between sample groups in a practical research setting and provide general guidelines for choosing a robust pipeline. In general, our results demonstrate how the data analysis tool utilized can markedly affect the outcome of the data analysis, highlighting the importance of this choice. PMID:24300110

  6. Comparison of software packages for detecting differential expression in RNA-seq studies.

    PubMed

    Seyednasrollah, Fatemeh; Laiho, Asta; Elo, Laura L

    2015-01-01

    RNA-sequencing (RNA-seq) has rapidly become a popular tool to characterize transcriptomes. A fundamental research problem in many RNA-seq studies is the identification of reliable molecular markers that show differential expression between distinct sample groups. Together with the growing popularity of RNA-seq, a number of data analysis methods and pipelines have already been developed for this task. Currently, however, there is no clear consensus about the best practices yet, which makes the choice of an appropriate method a daunting task especially for a basic user without a strong statistical or computational background. To assist the choice, we perform here a systematic comparison of eight widely used software packages and pipelines for detecting differential expression between sample groups in a practical research setting and provide general guidelines for choosing a robust pipeline. In general, our results demonstrate how the data analysis tool utilized can markedly affect the outcome of the data analysis, highlighting the importance of this choice. © The Author 2013. Published by Oxford University Press.

  7. Uncertainty Analysis via Failure Domain Characterization: Polynomial Requirement Functions

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Munoz, Cesar A.; Narkawicz, Anthony J.; Kenny, Sean P.; Giesy, Daniel P.

    2011-01-01

    This paper proposes an uncertainty analysis framework based on the characterization of the uncertain parameter space. This characterization enables the identification of worst-case uncertainty combinations and the approximation of the failure and safe domains with a high level of accuracy. Because these approximations are comprised of subsets of readily computable probability, they enable the calculation of arbitrarily tight upper and lower bounds to the failure probability. A Bernstein expansion approach is used to size hyper-rectangular subsets while a sum of squares programming approach is used to size quasi-ellipsoidal subsets. These methods are applicable to requirement functions whose functional dependency on the uncertainty is a known polynomial. Some of the most prominent features of the methodology are the substantial desensitization of the calculations from the uncertainty model assumed (i.e., the probability distribution describing the uncertainty) as well as the accommodation for changes in such a model with a practically insignificant amount of computational effort.

  8. Uncertainty Analysis via Failure Domain Characterization: Unrestricted Requirement Functions

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2011-01-01

    This paper proposes an uncertainty analysis framework based on the characterization of the uncertain parameter space. This characterization enables the identification of worst-case uncertainty combinations and the approximation of the failure and safe domains with a high level of accuracy. Because these approximations are comprised of subsets of readily computable probability, they enable the calculation of arbitrarily tight upper and lower bounds to the failure probability. The methods developed herein, which are based on nonlinear constrained optimization, are applicable to requirement functions whose functional dependency on the uncertainty is arbitrary and whose explicit form may even be unknown. Some of the most prominent features of the methodology are the substantial desensitization of the calculations from the assumed uncertainty model (i.e., the probability distribution describing the uncertainty) as well as the accommodation for changes in such a model with a practically insignificant amount of computational effort.

  9. Thermo-hydro-mechanical-chemical processes in fractured-porous media: Benchmarks and examples

    NASA Astrophysics Data System (ADS)

    Kolditz, O.; Shao, H.; Görke, U.; Kalbacher, T.; Bauer, S.; McDermott, C. I.; Wang, W.

    2012-12-01

    The book comprises an assembly of benchmarks and examples for porous media mechanics collected over the last twenty years. Analysis of thermo-hydro-mechanical-chemical (THMC) processes is essential to many applications in environmental engineering, such as geological waste deposition, geothermal energy utilisation, carbon capture and storage, water resources management, hydrology, and even climate change. In order to assess the feasibility as well as the safety of geotechnical applications, process-based modelling is the only tool for putting numbers to, i.e. quantifying, future scenarios. This places a huge responsibility on the reliability of computational tools. Benchmarking is an appropriate methodology to verify the quality of modelling tools based on best practices. Moreover, benchmarking and code comparison foster community efforts. The benchmark book is part of the OpenGeoSys initiative - an open source project to share knowledge and experience in environmental analysis and scientific computation.

  10. Fundamental analysis of the failure of polymer-based fiber reinforced composites

    NASA Technical Reports Server (NTRS)

    Kanninen, M. F.; Rybicki, E. F.; Griffith, W. I.; Broek, D.

    1975-01-01

    A mathematical model predicting the strength of unidirectional fiber reinforced composites containing known flaws and with linear elastic-brittle material behavior was developed. The approach was to embed a local heterogeneous region surrounding the crack tip into an anisotropic elastic continuum. This (1) permits an explicit analysis of the micromechanical processes involved in the fracture, and (2) remains simple enough to be useful in practical computations. Computations for arbitrary flaw size and orientation under arbitrary applied loads were performed. The mechanical properties were those of graphite epoxy. With the rupture properties arbitrarily varied to test the capabilities of the model to reflect real fracture modes, it was shown that fiber breakage, matrix crazing, crack bridging, matrix-fiber debonding, and axial splitting can all occur during a period of (gradually) increasing load prior to catastrophic failure. The calculations also reveal the sequential nature of the stable crack growth process preceding fracture.

  11. Efficient computation paths for the systematic analysis of sensitivities

    NASA Astrophysics Data System (ADS)

    Greppi, Paolo; Arato, Elisabetta

    2013-01-01

    A systematic sensitivity analysis requires computing the model on all points of a multi-dimensional grid covering the domain of interest, defined by the ranges of variability of the inputs. The issues to efficiently perform such analyses on algebraic models are handling solution failures within and close to the feasible region and minimizing the total iteration count. Scanning the domain in the obvious order is sub-optimal in terms of total iterations and is likely to cause many solution failures. The problem of choosing a better order can be translated geometrically into finding Hamiltonian paths on certain grid graphs. This work proposes two paths, one based on a mixed-radix Gray code and the other, a quasi-spiral path, produced by a novel heuristic algorithm. Some simple, easy-to-visualize examples are presented, followed by performance results for the quasi-spiral algorithm and the practical application of the different paths in a process simulation tool.
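
    A minimal sketch of the ordering idea, assuming a hypothetical model solver that is warm-started from the previous grid point; the boustrophedon ("snake") enumeration below visits every point of a multi-dimensional grid so that consecutive points differ by a single step in a single input, in the spirit of the mixed-radix Gray-code path (the quasi-spiral heuristic of the paper is not reproduced here).

      def snake_order(shape):
          """Yield every index of a grid with the given shape so that consecutive
          indices differ by one step in exactly one coordinate (boustrophedon path)."""
          def rec(prefix):
              d = len(prefix)
              if d == len(shape):
                  yield tuple(prefix)
                  return
              forward = sum(prefix) % 2 == 0    # direction flips with the parity of the outer indices
              idx = range(shape[d]) if forward else range(shape[d] - 1, -1, -1)
              for i in idx:
                  yield from rec(prefix + [i])
          yield from rec([])

      # Map grid indices to input values and scan the domain, warm-starting each
      # solve from the previous solution (solver and model are placeholders).
      grids = {"T": [300, 320, 340, 360], "p": [1.0, 2.0, 3.0], "phi": [0.1, 0.2]}
      names = list(grids)
      previous_solution = None
      for idx in snake_order(tuple(len(grids[n]) for n in names)):
          point = {n: grids[n][i] for n, i in zip(names, idx)}
          # previous_solution = solve_model(point, start=previous_solution)   # hypothetical solver call
          print(idx, point)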

  12. Application of the actor model to large scale NDE data analysis

    NASA Astrophysics Data System (ADS)

    Coughlin, Chris

    2018-03-01

    The Actor model of concurrent computation discretizes a problem into a series of independent units or actors that interact only through the exchange of messages. Without direct coupling between individual components, an Actor-based system is inherently concurrent and fault-tolerant. These traits lend themselves to so-called "Big Data" applications in which the volume of data to analyze requires a distributed multi-system design. For a practical demonstration of the Actor computational model, a system was developed to assist with the automated analysis of Nondestructive Evaluation (NDE) datasets using the open source Myriad Data Reduction Framework. A machine learning model trained to detect damage in two-dimensional slices of C-Scan data was deployed in a streaming data processing pipeline. To demonstrate the flexibility of the Actor model, the pipeline was deployed on a local system and re-deployed as a distributed system without recompiling, reconfiguring, or restarting the running application.
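
    A minimal queue-and-thread sketch of the Actor idea described above, assuming hypothetical slice-analysis logic: each actor owns a private mailbox, reacts only to messages, and shares no state, which is what makes such a pipeline straightforward to scale out or redeploy. The Myriad Data Reduction Framework's actual API is not shown here.

      import queue
      import threading

      class Actor(threading.Thread):
          """An actor reacts only to messages placed in its private mailbox."""
          def __init__(self, downstream=None):
              super().__init__(daemon=True)
              self.mailbox = queue.Queue()
              self.downstream = downstream

          def send(self, msg):
              self.mailbox.put(msg)

          def run(self):
              while True:
                  msg = self.mailbox.get()
                  if msg is None:                 # poison pill shuts the actor down
                      if self.downstream:
                          self.downstream.send(None)
                      break
                  result = self.receive(msg)
                  if result is not None and self.downstream:
                      self.downstream.send(result)

          def receive(self, msg):
              raise NotImplementedError

      class SliceAnalyzer(Actor):
          def receive(self, msg):
              # Placeholder for a trained damage-detection model applied to one C-scan slice.
              return {"slice": msg["slice"], "damage": max(msg["data"]) > 0.8}

      class Reporter(Actor):
          def receive(self, msg):
              print("slice", msg["slice"], "damage detected:", msg["damage"])

      reporter = Reporter()
      analyzer = SliceAnalyzer(downstream=reporter)
      reporter.start()
      analyzer.start()
      for i, row in enumerate([[0.1, 0.2], [0.5, 0.9], [0.3, 0.4]]):
          analyzer.send({"slice": i, "data": row})
      analyzer.send(None)                         # propagate shutdown through the pipeline
      analyzer.join()
      reporter.join()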

  13. Deep learning for computational biology.

    PubMed

    Angermueller, Christof; Pärnamaa, Tanel; Parts, Leopold; Stegle, Oliver

    2016-07-29

    Technological advances in genomics and imaging have led to an explosion of molecular and cellular profiling data from large numbers of samples. This rapid increase in biological data dimension and acquisition rate is challenging conventional analysis strategies. Modern machine learning methods, such as deep learning, promise to leverage very large data sets for finding hidden structure within them, and for making accurate predictions. In this review, we discuss applications of this new breed of analysis approaches in regulatory genomics and cellular imaging. We provide background on what deep learning is, and the settings in which it can be successfully applied to derive biological insights. In addition to presenting specific applications and providing tips for practical use, we also highlight possible pitfalls and limitations to guide computational biologists in when and how to make the most of this new technology. © 2016 The Authors. Published under the terms of the CC BY 4.0 license.

  14. On-Line Robust Modal Stability Prediction using Wavelet Processing

    NASA Technical Reports Server (NTRS)

    Brenner, Martin J.; Lind, Rick

    1998-01-01

    Wavelet analysis for filtering and system identification has been used to improve the estimation of aeroservoelastic stability margins. The conservatism of the robust stability margins is reduced with parametric and nonparametric time-frequency analysis of flight data in the model validation process. Nonparametric wavelet processing of data is used to reduce the effects of external disturbances and unmodeled dynamics. Parametric estimates of modal stability are also extracted using the wavelet transform. Computation of robust stability margins for stability boundary prediction depends on uncertainty descriptions derived from the data for model validation. The F-18 High Alpha Research Vehicle aeroservoelastic flight test data demonstrates improved robust stability prediction by extension of the stability boundary beyond the flight regime. Guidelines and computation times are presented to show the efficiency and practical aspects of these procedures for on-line implementation. Feasibility of the method is shown for processing flight data from time-varying nonstationary test points.
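
    A minimal sketch of the nonparametric wavelet filtering step described above, assuming the third-party PyWavelets package and a synthetic noisy modal response: detail coefficients below a soft threshold are suppressed before reconstruction, attenuating disturbance and noise content. This is a generic illustration only, not the authors' processing chain.

      import numpy as np
      import pywt

      rng = np.random.default_rng(0)
      t = np.linspace(0.0, 2.0, 1024)
      # Synthetic "modal response": a decaying oscillation buried in noise.
      signal = np.exp(-1.5 * t) * np.sin(2 * np.pi * 12.0 * t)
      noisy = signal + 0.3 * rng.standard_normal(t.size)

      # Multi-level discrete wavelet decomposition, soft-threshold the detail
      # coefficients, and reconstruct the denoised record.
      coeffs = pywt.wavedec(noisy, "db4", level=5)
      sigma = np.median(np.abs(coeffs[-1])) / 0.6745          # robust noise estimate
      thresh = sigma * np.sqrt(2.0 * np.log(noisy.size))      # universal threshold
      denoised_coeffs = [coeffs[0]] + [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
      denoised = pywt.waverec(denoised_coeffs, "db4")[: noisy.size]

      print("rms error before:", np.sqrt(np.mean((noisy - signal) ** 2)))
      print("rms error after: ", np.sqrt(np.mean((denoised - signal) ** 2)))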

  15. An efficient, large-scale, non-lattice-detection algorithm for exhaustive structural auditing of biomedical ontologies.

    PubMed

    Zhang, Guo-Qiang; Xing, Guangming; Cui, Licong

    2018-04-01

    One of the basic challenges in developing structural methods for systematic auditing of the quality of biomedical ontologies is the computational cost usually involved in exhaustive sub-graph analysis. We introduce ANT-LCA, a new algorithm for computing all non-trivial lowest common ancestors (LCA) of each pair of concepts in the hierarchical order induced by an ontology. The computation of LCA is a fundamental step for the non-lattice approach to ontology quality assurance. Distinct from existing approaches, ANT-LCA only computes LCAs for non-trivial pairs, those having at least one common ancestor. To skip all trivial pairs that may be of no practical interest, ANT-LCA employs a simple but innovative algorithmic strategy combining topological order and dynamic programming to keep track of non-trivial pairs. We provide correctness proofs and demonstrate a substantial reduction in computational time for two of the largest biomedical ontologies: SNOMED CT and the Gene Ontology (GO). ANT-LCA achieved an average computation time of 30 and 3 sec per version for SNOMED CT and GO, respectively, about 2 orders of magnitude faster than the best known approaches. Our algorithm overcomes a fundamental computational barrier in sub-graph based structural analysis of large ontological systems. It enables the implementation of a new breed of structural auditing methods that not only identify potential problematic areas, but also automatically suggest changes to fix the issues. Such structural auditing methods can lead to more effective tools supporting ontology quality assurance work. Copyright © 2018 Elsevier Inc. All rights reserved.
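
    A minimal sketch of the underlying notion, assuming Python 3.9+ (for graphlib) and a small hypothetical is-a hierarchy given as a parent list: ancestor sets are built by dynamic programming in topological order, trivial pairs (no common ancestor) are skipped, and the lowest common ancestors of a non-trivial pair are the maximal elements of the intersection of the two ancestor sets. This naive version illustrates the concept only; it does not reproduce the scalability of ANT-LCA.

      from graphlib import TopologicalSorter
      from itertools import combinations

      # Hypothetical toy ontology: child -> set of parents (edges point upward).
      parents = {
          "a": set(), "b": {"a"}, "c": {"a"},
          "d": {"b", "c"}, "e": {"b", "c"}, "f": {"a"},
      }

      # Dynamic programming in topological order: ancestors(v) = parents(v) plus
      # the ancestors of every parent.
      anc = {}
      for v in TopologicalSorter(parents).static_order():
          anc[v] = set(parents[v])
          for p in parents[v]:
              anc[v] |= anc[p]

      def lowest_common_ancestors(x, y):
          common = anc[x] & anc[y]
          if not common:
              return set()                        # trivial pair: skipped by ANT-LCA
          # Keep only the maximal elements: common ancestors with no descendant in the set.
          return {c for c in common if not any(c in anc[other] for other in common)}

      for x, y in combinations(parents, 2):
          lcas = lowest_common_ancestors(x, y)
          if lcas:
              print(x, y, "->", sorted(lcas))     # pairs with 2+ LCAs are non-lattice pairs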

  16. Facilitating NASA Earth Science Data Processing Using Nebula Cloud Computing

    NASA Astrophysics Data System (ADS)

    Chen, A.; Pham, L.; Kempler, S.; Theobald, M.; Esfandiari, A.; Campino, J.; Vollmer, B.; Lynnes, C.

    2011-12-01

    Cloud Computing technology has been used to offer high-performance and low-cost computing and storage resources for both scientific problems and business services. Several cloud computing services have been implemented in the commercial arena, e.g. Amazon's EC2 & S3, Microsoft's Azure, and Google App Engine. There are also research and application programs being launched in academia and government to utilize Cloud Computing. NASA launched the Nebula Cloud Computing platform in 2008, an Infrastructure as a Service (IaaS) offering that delivers on-demand distributed virtual computers. Nebula users can receive required computing resources as a fully outsourced service. The NASA Goddard Earth Science Data and Information Service Center (GES DISC) migrated several of its applications to the Nebula as a proof of concept, including: a) the Simple, Scalable, Script-based Science Processor for Measurements (S4PM) for processing scientific data; b) the Atmospheric Infrared Sounder (AIRS) data processing workflow for processing AIRS raw data; and c) the GES-DISC Interactive Online Visualization ANd aNalysis Infrastructure (GIOVANNI) for online access to, analysis of, and visualization of Earth science data. This work aims to evaluate the practicability and adaptability of the Nebula. The initial work focused on the AIRS data processing workflow, which consists of a series of algorithms used to process raw AIRS level 0 data and output AIRS level 2 geophysical retrievals. Migrating the entire workflow to the Nebula platform is challenging, but practicable. After installing several supporting libraries and the processing code itself, the workflow is able to process AIRS data in a similar fashion to its current (non-cloud) configuration. We compared the performance of processing 2 days of AIRS level 0 data through level 2 using a Nebula virtual computer and a local Linux computer. The results show that the Nebula has significantly better performance than the local machine. Much of the difference was due to newer equipment in the Nebula than in the legacy computer, which is suggestive of a potential economic advantage beyond elastic power, i.e., access to up-to-date hardware vs. legacy hardware that must be maintained past its prime to amortize the cost. In addition to a trade study of the advantages and challenges of porting complex processing to the cloud, a tutorial was developed to enable further progress in utilizing the Nebula for Earth Science applications and to better understand the potential of Cloud Computing for further data- and computing-intensive Earth Science research. In particular, highly bursty computing such as that experienced in the user-demand-driven Giovanni system may become more tractable in a Cloud environment. Our future work will continue to focus on migrating more GES DISC applications and instances, e.g. Giovanni instances, to the Nebula platform, and on moving mature migrated applications into operation on the Nebula.

  17. Practical considerations for obtaining high quality quantitative computed tomography data of the skeletal system.

    PubMed

    Troy, Karen L; Edwards, W Brent

    2018-05-01

    Quantitative CT (QCT) analysis involves the calculation of specific parameters such as bone volume and density from CT image data, and can be a powerful tool for understanding bone quality and quantity. However, without careful attention to detail during all steps of the acquisition and analysis process, data can be of poor- to unusable-quality. Good quality QCT for research requires meticulous attention to detail and standardization of all aspects of data collection and analysis to a degree that is uncommon in a clinical setting. Here, we review the literature to summarize practical and technical considerations for obtaining high quality QCT data, and provide examples of how each recommendation affects calculated variables. We also provide an overview of the QCT analysis technique to illustrate additional opportunities to improve data reproducibility and reliability. Key recommendations include: standardizing the scanner and data acquisition settings, minimizing image artifacts, selecting an appropriate reconstruction algorithm, and maximizing repeatability and objectivity during QCT analysis. The goal of the recommendations is to reduce potential sources of error throughout the analysis, from scan acquisition to the interpretation of results. Copyright © 2018 Elsevier Inc. All rights reserved.
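
    One of the standardization steps implied above is converting scanner Hounsfield units to bone mineral density using a calibration phantom scanned alongside the subject. A minimal sketch under assumed, hypothetical phantom values and region-of-interest measurements: a linear fit maps the measured HU of the phantom rods to their known equivalent densities, and that fit is then applied to bone ROIs. The specific calibration procedure of any given scanner or study protocol will differ.

      import numpy as np

      # Hypothetical calibration phantom: known equivalent densities (mg/cm^3 K2HPO4)
      # and the mean Hounsfield units measured in each rod's region of interest.
      known_density = np.array([0.0, 50.0, 100.0, 200.0])
      measured_hu = np.array([2.0, 55.0, 109.0, 215.0])

      # Least-squares linear calibration: density = slope * HU + intercept.
      slope, intercept = np.polyfit(measured_hu, known_density, 1)

      def hu_to_density(hu):
          return slope * np.asarray(hu) + intercept

      bone_roi_hu = np.array([180.0, 220.0, 260.0])            # hypothetical bone ROI means
      print("calibration: density = %.3f * HU + %.2f" % (slope, intercept))
      print("estimated BMD (mg/cm^3):", np.round(hu_to_density(bone_roi_hu), 1))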

  18. P3: a practice focused learning environment

    NASA Astrophysics Data System (ADS)

    Irving, Paul W.; Obsniuk, Michael J.; Caballero, Marcos D.

    2017-09-01

    There has been an increased focus on the integration of practices into physics curricula, with a particular emphasis on integrating computation into the undergraduate curriculum of scientists and engineers. In this paper, we present a university-level, introductory physics course for science and engineering majors at Michigan State University called P3 (projects and practices in physics) that is centred around providing introductory physics students with the opportunity to appropriate various science and engineering practices. The P3 design integrates computation with analytical problem solving and is built upon a curriculum foundation of problem-based learning, the principles of constructive alignment and the theoretical framework of community of practice. The design includes an innovative approach to computational physics instruction, instructional scaffolds, and a unique approach to assessment that enables instructors to guide students in the development of the practices of a physicist. We present the very positive student-related outcomes of the design, gathered via attitudinal and conceptual inventories and research interviews in which students reflect on their experiences in the P3 classroom.

  19. BioVeL: a virtual laboratory for data analysis and modelling in biodiversity science and ecology.

    PubMed

    Hardisty, Alex R; Bacall, Finn; Beard, Niall; Balcázar-Vargas, Maria-Paula; Balech, Bachir; Barcza, Zoltán; Bourlat, Sarah J; De Giovanni, Renato; de Jong, Yde; De Leo, Francesca; Dobor, Laura; Donvito, Giacinto; Fellows, Donal; Guerra, Antonio Fernandez; Ferreira, Nuno; Fetyukova, Yuliya; Fosso, Bruno; Giddy, Jonathan; Goble, Carole; Güntsch, Anton; Haines, Robert; Ernst, Vera Hernández; Hettling, Hannes; Hidy, Dóra; Horváth, Ferenc; Ittzés, Dóra; Ittzés, Péter; Jones, Andrew; Kottmann, Renzo; Kulawik, Robert; Leidenberger, Sonja; Lyytikäinen-Saarenmaa, Päivi; Mathew, Cherian; Morrison, Norman; Nenadic, Aleksandra; de la Hidalga, Abraham Nieva; Obst, Matthias; Oostermeijer, Gerard; Paymal, Elisabeth; Pesole, Graziano; Pinto, Salvatore; Poigné, Axel; Fernandez, Francisco Quevedo; Santamaria, Monica; Saarenmaa, Hannu; Sipos, Gergely; Sylla, Karl-Heinz; Tähtinen, Marko; Vicario, Saverio; Vos, Rutger Aldo; Williams, Alan R; Yilmaz, Pelin

    2016-10-20

    Making forecasts about biodiversity and giving support to policy relies increasingly on large collections of data held electronically, and on substantial computational capability and capacity to analyse, model, simulate and predict using such data. However, the physically distributed nature of data resources and of expertise in advanced analytical tools creates many challenges for the modern scientist. Across the wider biological sciences, presenting such capabilities on the Internet (as "Web services") and using scientific workflow systems to compose them for particular tasks is a practical way to carry out robust "in silico" science. However, use of this approach in biodiversity science and ecology has thus far been quite limited. BioVeL is a virtual laboratory for data analysis and modelling in biodiversity science and ecology, freely accessible via the Internet. BioVeL includes functions for accessing and analysing data through curated Web services; for performing complex in silico analysis through exposure of R programs, workflows, and batch processing functions; for on-line collaboration through sharing of workflows and workflow runs; for experiment documentation through reproducibility and repeatability; and for computational support via seamless connections to supporting computing infrastructures. We developed and improved more than 60 Web services with significant potential in many different kinds of data analysis and modelling tasks. We composed reusable workflows using these Web services, also incorporating R programs. Deploying these tools into an easy-to-use and accessible 'virtual laboratory', free via the Internet, we applied the workflows in several diverse case studies. We opened the virtual laboratory for public use and through a programme of external engagement we actively encouraged scientists and third party application and tool developers to try out the services and contribute to the activity. Our work shows we can deliver an operational, scalable and flexible Internet-based virtual laboratory to meet new demands for data processing and analysis in biodiversity science and ecology. In particular, we have successfully integrated existing and popular tools and practices from different scientific disciplines to be used in biodiversity and ecological research.

  20. The Computational Infrastructure for Geodynamics as a Community of Practice

    NASA Astrophysics Data System (ADS)

    Hwang, L.; Kellogg, L. H.

    2016-12-01

    Computational Infrastructure for Geodynamics (CIG), geodynamics.org, originated in 2005 out of community recognition that the efforts of individual or small groups of researchers to develop scientifically-sound software is impossible to sustain, duplicates effort, and makes it difficult for scientists to adopt state-of-the art computational methods that promote new discovery. As a community of practice, participants in CIG share an interest in computational modeling in geodynamics and work together on open source software to build the capacity to support complex, extensible, scalable, interoperable, reliable, and reusable software in an effort to increase the return on investment in scientific software development and increase the quality of the resulting software. The group interacts regularly to learn from each other and better their practices formally through webinar series, workshops, and tutorials and informally through listservs and hackathons. Over the past decade, we have learned that successful scientific software development requires at a minimum: collaboration between domain-expert researchers, software developers and computational scientists; clearly identified and committed lead developer(s); well-defined scientific and computational goals that are regularly evaluated and updated; well-defined benchmarks and testing throughout development; attention throughout development to usability and extensibility; understanding and evaluation of the complexity of dependent libraries; and managed user expectations through education, training, and support. CIG's code donation standards provide the basis for recently formalized best practices in software development (geodynamics.org/cig/dev/best-practices/). Best practices include use of version control; widely used, open source software libraries; extensive test suites; portable configuration and build systems; extensive documentation internal and external to the code; and structured, human readable input formats.

  1. Benchmarking of Computational Models for NDE and SHM of Composites

    NASA Technical Reports Server (NTRS)

    Wheeler, Kevin; Leckey, Cara; Hafiychuk, Vasyl; Juarez, Peter; Timucin, Dogan; Schuet, Stefan; Hafiychuk, Halyna

    2016-01-01

    Ultrasonic wave phenomena constitute the leading physical mechanism for nondestructive evaluation (NDE) and structural health monitoring (SHM) of solid composite materials such as carbon-fiber-reinforced polymer (CFRP) laminates. Computational models of ultrasonic guided-wave excitation, propagation, scattering, and detection in quasi-isotropic laminates can be extremely valuable in designing practically realizable NDE and SHM hardware and software with desired accuracy, reliability, efficiency, and coverage. This paper presents comparisons of guided-wave simulations for CFRP composites implemented using three different simulation codes: two commercial finite-element analysis packages, COMSOL and ABAQUS, and a custom code implementing the Elastodynamic Finite Integration Technique (EFIT). Comparisons are also made to experimental laser Doppler vibrometry data and theoretical dispersion curves.

  2. Computer and internet use by ophthalmologists and trainees in an academic centre.

    PubMed

    Somal, Kirandeep; Lam, Wai-Ching; Tam, Eric

    2009-06-01

    The purpose of this study was to determine computer, internet, and department web site use by members of the Department of Ophthalmology and Vision Sciences at the University of Toronto in Toronto, Ont. Cross-sectional analysis. Eighty-eight members of the Department of Ophthalmology and Vision Sciences who responded to a survey. One hundred forty-eight department members (93 staff, 24 residents, and 31 fellows) were invited via e-mail to complete an online survey looking at computer and internet use. Participation was voluntary. Individuals who did not fill in an online response were sent a paper copy of the survey. No identifying fields were used in the data analysis. A response rate of 59% (88/148) was obtained. Fifty-nine percent of respondents described their computer skill as "good" or better; 86.4% utilized a computer in their clinical practice. Performance of computer-related tasks included accessing e-mail (98.9%), accessing medical literature (87.5%), conducting personal affairs (83%), and accessing conference/round schedules (65.9%). The survey indicated that 89.1% of respondents accessed peer-reviewed material online, including eMedicine (60.2%) and UpToDate articles (48.9%). Thirty-three percent of department members reported never having visited the department web site. Impediments to web site use included information not up to date (27.3%), information not of interest (22.1%), and difficulty locating the web site (20.8%). The majority of ophthalmologists and trainees in an academic centre utilize computer and internet resources for various tasks. A weak linear correlation was found between lower age of respondent and higher self-evaluated experience with computers (r = -0.43). Although use of the current department web site was low, respondents were interested in seeing improvements to the web site to increase its utility.

  3. The Unlock Project: a Python-based framework for practical brain-computer interface communication "app" development.

    PubMed

    Brumberg, Jonathan S; Lorenz, Sean D; Galbraith, Byron V; Guenther, Frank H

    2012-01-01

    In this paper we present a framework for reducing the development time needed for creating applications for use in non-invasive brain-computer interfaces (BCI). Our framework is primarily focused on facilitating rapid software "app" development akin to current efforts in consumer portable computing (e.g. smart phones and tablets). This is accomplished by handling intermodule communication without direct user or developer implementation, instead relying on a core subsystem for communication of standard, internal data formats. We also provide a library of hardware interfaces for common mobile EEG platforms for immediate use in BCI applications. A use-case example is described in which a user with amyotrophic lateral sclerosis participated in an electroencephalography-based BCI protocol developed using the proposed framework. We show that our software environment is capable of running in real-time with updates occurring 50-60 times per second and limited computational overhead (5 ms system lag), while providing accurate data acquisition and signal analysis.
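
    A minimal sketch of the decoupling idea described above, assuming a hypothetical EEG source and decoder rather than the Unlock Project's actual interfaces: an acquisition thread pushes samples into a queue in a standard internal format, and the app loop drains the queue at roughly 60 updates per second, so signal analysis never blocks acquisition.

      import queue
      import random
      import threading
      import time

      samples = queue.Queue()

      def acquire(stop, rate_hz=256):
          """Hypothetical EEG driver: push (timestamp, channels) tuples into the queue."""
          while not stop.is_set():
              samples.put((time.time(), [random.gauss(0.0, 1.0) for _ in range(8)]))
              time.sleep(1.0 / rate_hz)

      def decode(window):
          # Placeholder decoder: any real SSVEP/P300 classifier would go here.
          return sum(abs(v) for _, chans in window for v in chans) / max(len(window), 1)

      stop = threading.Event()
      threading.Thread(target=acquire, args=(stop,), daemon=True).start()

      try:
          for _ in range(120):                        # ~2 s of app updates at ~60 Hz
              tick = time.time()
              window = []
              while not samples.empty():              # drain everything acquired since last update
                  window.append(samples.get_nowait())
              print("decoded score:", round(decode(window), 3))
              time.sleep(max(0.0, 1.0 / 60.0 - (time.time() - tick)))
      finally:
          stop.set()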

  4. Computational tools for exact conditional logistic regression.

    PubMed

    Corcoran, C; Mehta, C; Patel, N; Senchaudhuri, P

    Logistic regression analyses are often challenged by the inability of unconditional likelihood-based approximations to yield consistent, valid estimates and p-values for model parameters. This can be due to sparseness or separability in the data. Conditional logistic regression, though useful in such situations, can also be computationally unfeasible when the sample size or number of explanatory covariates is large. We review recent developments that allow efficient approximate conditional inference, including Monte Carlo sampling and saddlepoint approximations. We demonstrate through real examples that these methods enable the analysis of significantly larger and more complex data sets. We find in this investigation that for these moderately large data sets Monte Carlo seems a better alternative, as it provides unbiased estimates of the exact results and can be executed in less CPU time than can the single saddlepoint approximation. Moreover, the double saddlepoint approximation, while computationally the easiest to obtain, offers little practical advantage. It produces unreliable results and cannot be computed when a maximum likelihood solution does not exist. Copyright 2001 John Wiley & Sons, Ltd.

  5. Computational analysis of drop formation before and after the first singularity: the fate of free and satellite drops during simple dripping and DOD drop formation

    NASA Astrophysics Data System (ADS)

    Chen, Alvin U.; Basaran, Osman A.

    2000-11-01

    Drop formation from a capillary --- dripping mode --- or an ink jet nozzle --- drop-on-demand (DOD) mode --- falls into a class of scientifically challenging yet practically useful free surface flows that exhibit a finite time singularity, i.e. the breakup of an initially single liquid mass into two or more fragments. While computational tools to model such problems have been developed recently, they lack the accuracy needed to quantitatively predict all the dynamics observed in experiments. Here we present a new finite element method (FEM) based on a robust algorithm for elliptic mesh generation and remeshing to handle extremely large interface deformations. The new algorithm allows continuation of computations beyond the first singularity to track fates of both primary and any satellite drops. The accuracy of the computations is demonstrated by comparison of simulations with experimental measurements made possible with an ultra high-speed digital imager capable of recording 100 million frames per second.

  6. Computer system design description for SY-101 hydrogen mitigation test project data acquisition and control system (DACS-1)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ermi, A.M.

    1997-05-01

    Description of the Proposed Activity/REPORTABLE OCCURRENCE or PIAB: This ECN changes the computer system design description support document describing the computer system used to control, monitor and archive the processes and outputs associated with the Hydrogen Mitigation Test Pump installed in SY-101. There is no new activity or procedure associated with the updating of this reference document. The updating of this computer system design description maintains an agreed-upon documentation program initiated within the test program and carried into operations at time of turnover to maintain configuration control as outlined by design authority practicing guidelines. There are no new credible failure modes associated with the updating of information in a support description document. The failure analysis of each change was reviewed at the time of implementation of the Systems Change Request for all the processes changed. This document simply provides a history of implementation and current system status.

  7. Flood inundation extent mapping based on block compressed tracing

    NASA Astrophysics Data System (ADS)

    Shen, Dingtao; Rui, Yikang; Wang, Jiechen; Zhang, Yu; Cheng, Liang

    2015-07-01

    Flood inundation extent, depth, and duration are important factors affecting flood hazard evaluation. At present, flood inundation analysis is based mainly on a seeded region-growing algorithm, which is an inefficient process because it requires excessive recursive computations and is incapable of processing massive datasets. To address this problem, we propose a block compressed tracing algorithm for mapping the flood inundation extent, which reads the DEM data in blocks before transferring them to raster compression storage. This allows a larger amount of data to be processed in a smaller amount of computer memory, overcoming the main limitation of the regular seeded region-growing algorithm. In addition, the use of a raster boundary tracing technique allows the algorithm to avoid the time-consuming computations required by seeded region growing. Finally, we conduct a comparative evaluation in the Chin-sha River basin; the results show that the proposed method solves the problem of flood inundation extent mapping based on massive DEM datasets with higher computational efficiency than the original method, which makes it suitable for practical applications.
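
    For reference, a minimal sketch of the baseline the paper improves upon: seeded region growing over a small synthetic DEM, written iteratively (an explicit stack) to avoid the deep recursion identified above as a weakness. Cells are flooded when they are connected to the seed and lie below the given water level; the block-compression and boundary-tracing machinery of the proposed method is not reproduced here.

      import numpy as np

      def flood_extent(dem, seed, water_level):
          """Return a boolean mask of cells connected to `seed` with elevation below `water_level`."""
          rows, cols = dem.shape
          flooded = np.zeros_like(dem, dtype=bool)
          stack = [seed]
          while stack:                                    # explicit stack instead of recursion
              r, c = stack.pop()
              if not (0 <= r < rows and 0 <= c < cols):
                  continue
              if flooded[r, c] or dem[r, c] >= water_level:
                  continue
              flooded[r, c] = True
              stack.extend([(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)])
          return flooded

      dem = np.array([[3.0, 3.0, 3.0, 3.0],
                      [3.0, 1.0, 1.5, 3.0],
                      [3.0, 1.2, 3.0, 1.0],       # the 1.0 at (2, 3) is below water but not connected
                      [3.0, 3.0, 3.0, 3.0]])
      mask = flood_extent(dem, seed=(1, 1), water_level=2.0)
      print(mask.astype(int))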

  8. Efficient Computation of Closed-loop Frequency Response for Large Order Flexible Systems

    NASA Technical Reports Server (NTRS)

    Maghami, Peiman G.; Giesy, Daniel P.

    1997-01-01

    An efficient and robust computational scheme is given for the calculation of the frequency response function of a large order, flexible system implemented with a linear, time-invariant control system. Advantage is taken of the highly structured sparsity of the system matrix of the plant based on a model of the structure using normal mode coordinates. The computational time per frequency point of the new computational scheme is a linear function of system size, a significant improvement over traditional, full-matrix techniques whose computational times per frequency point range from quadratic to cubic functions of system size. This permits the practical frequency domain analysis of systems of much larger order than is possible with traditional, full-matrix techniques. Formulations are given for both open- and closed-loop systems. Numerical examples are presented showing the advantages of the present formulation over traditional approaches, both in speed and in accuracy. Using a model with 703 structural modes, a speed-up of almost two orders of magnitude was observed while accuracy improved by up to 5 decimal places.
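
    A minimal numpy sketch of why modal structure helps, under simplified assumptions (an open-loop plant in normal mode coordinates with unit modal mass and decoupled modal damping, so the dynamics separate mode by mode): the per-frequency "solve" reduces to an elementwise division, linear in the number of modes, instead of the cubic cost of a dense matrix factorization. The closed-loop bookkeeping of the paper is omitted.

      import numpy as np

      n_modes = 300
      rng = np.random.default_rng(1)
      wn = np.sort(rng.uniform(5.0, 500.0, n_modes))        # modal frequencies (rad/s)
      zeta = np.full(n_modes, 0.01)                         # modal damping ratios
      phi_in = rng.standard_normal(n_modes)                 # input shape in modal coordinates
      phi_out = rng.standard_normal(n_modes)                # output shape in modal coordinates

      def freq_response(omegas):
          """O(n) per frequency point: the modal equations are decoupled, so each
          frequency requires only an elementwise division and a dot product."""
          H = np.empty(omegas.size, dtype=complex)
          for k, w in enumerate(omegas):
              denom = wn**2 - w**2 + 2j * zeta * wn * w
              H[k] = np.sum(phi_out * phi_in / denom)
          return H

      omegas = np.linspace(1.0, 600.0, 2000)
      H = freq_response(omegas)
      print("peak gain (dB):", 20 * np.log10(np.abs(H)).max())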

  9. Computational Hemodynamics Involving Artificial Devices

    NASA Technical Reports Server (NTRS)

    Kwak, Dochan; Kiris, Cetin; Feiereisen, William (Technical Monitor)

    2001-01-01

    This paper reports the progress being made towards developing a complete blood flow simulation capability in humans, especially in the presence of artificial devices such as valves and ventricular assist devices. Device modeling poses unique challenges different from computing the blood flow in natural hearts and arteries. Many elements are needed, such as flow solvers, geometry modeling including flexible walls, moving boundary procedures and physiological characterization of blood. As a first step, computational technology developed for aerospace applications was extended in the recent past to the analysis and development of mechanical devices. The blood flow in these devices is practically incompressible and Newtonian, and thus various incompressible Navier-Stokes solution procedures can be selected depending on the choice of formulations, variables and numerical schemes. Two primitive variable formulations used are discussed as well as the overset grid approach to handle complex moving geometry. This procedure has been applied to several artificial devices. Among these, recent progress made in developing the DeBakey axial-flow blood pump will be presented from a computational point of view. Computational and clinical issues will be discussed in detail as well as additional work needed.

  10. Analysis Report for Exascale Storage Requirements for Scientific Data.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruwart, Thomas M.

    Over the next 10 years, the Department of Energy will be transitioning from Petascale to Exascale Computing, resulting in data storage, networking, and infrastructure requirements increasing by three orders of magnitude. The technologies and best practices used today are the result of a relatively slow evolution of ancestral technologies developed in the 1950s and 1960s. These include magnetic tape, magnetic disk, networking, databases, file systems, and operating systems. These technologies will continue to evolve over the next 10 to 15 years on a reasonably predictable path. Experience with the challenges involved in transitioning these fundamental technologies from Terascale to Petascale computing systems has raised questions about how they will scale another 3 or 4 orders of magnitude to meet the requirements imposed by Exascale computing systems. This report is focused on the most concerning scaling issues with data storage systems as they relate to High Performance Computing, and presents options for a path forward. Given the ability to store exponentially increasing amounts of data, far more advanced concepts and use of metadata will be critical to managing data in Exascale computing systems.

  11. Refining Pragmatically-Appropriate Oral Communication via Computer-Simulated Conversations

    ERIC Educational Resources Information Center

    Sydorenko, Tetyana; Daurio, Phoebe; Thorne, Steven L.

    2018-01-01

    To address the problem of limited opportunities for practicing second language speaking in interaction, especially delicate interactions requiring pragmatic competence, we describe computer simulations designed for the oral practice of extended pragmatic routines and report on the affordances of such simulations for learning pragmatically…

  12. Implementing Computer Technologies: Teachers' Perceptions and Practices

    ERIC Educational Resources Information Center

    Wozney, Lori; Venkatesh, Vivek; Abrami, Philip

    2006-01-01

    This study investigates personal and setting characteristics, teacher attitudes, and current computer technology practices among 764 elementary and secondary teachers from both private and public school sectors in Quebec. Using expectancy-value theory, the Technology Implementation Questionnaire (TIQ) was developed; it consists of 33 belief items…

  13. Lab4CE: A Remote Laboratory for Computer Education

    ERIC Educational Resources Information Center

    Broisin, Julien; Venant, Rémi; Vidal, Philippe

    2017-01-01

    Remote practical activities have been demonstrated to be efficient when learners come to acquire inquiry skills. In computer science education, virtualization technologies are gaining popularity as this technological advance enables instructors to implement realistic practical learning activities, and learners to engage in authentic and…

  14. 75 FR 28296 - Denso Manufacturing of Michigan Including On-Site Leased Workers From Adecco Employment Services...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-20

    ... Anchor Staffing, Capitol Software Systems, Donohue Computer Services, Historic Northside Family Practice, Scripture and Associates, Summit Software Services DD, Tacworldwide Companies, Talent Trax, Tek Systems ...

  15. Clinicians’ perceptions and the relevant computer-based information needs towards the practice of evidence based medicine

    PubMed Central

    Jiang, Guoqian; Ogasawara, Katsuhiko; Endoh, Akira; Sakurai, Tsunetaro

    2003-01-01

    We conducted a survey among 100 clinicians in a university hospital to determine clinicians’ attitudes and the relevant computer-based information needs towards the practice of evidence-based medicine in an outpatient setting. PMID:14728387

  16. User-Driven Sampling Strategies in Image Exploitation

    DOE PAGES

    Harvey, Neal R.; Porter, Reid B.

    2013-12-23

    Visual analytics and interactive machine learning both try to leverage the complementary strengths of humans and machines to solve complex data exploitation tasks. These fields overlap most significantly when training is involved: the visualization or machine learning tool improves over time by exploiting observations of the human-computer interaction. This paper focuses on one aspect of the human-computer interaction that we call user-driven sampling strategies. Unlike relevance feedback and active learning sampling strategies, where the computer selects which data to label at each iteration, we investigate situations where the user selects which data is to be labeled at each iteration. User-driven sampling strategies can emerge in many visual analytics applications but they have not been fully developed in machine learning. We discovered that user-driven sampling strategies suggest new theoretical and practical research questions for both visualization science and machine learning. In this paper we identify and quantify the potential benefits of these strategies in a practical image analysis application. We find user-driven sampling strategies can sometimes provide significant performance gains by steering tools towards local minima that have lower error than tools trained with all of the data. Furthermore, in preliminary experiments we find these performance gains are particularly pronounced when the user is experienced with the tool and application domain.
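
    A minimal sketch of the user-driven loop described above, assuming scikit-learn and a stand-in "user" who picks which samples to label at each iteration (here, simply the unlabeled points nearest a region of interest): the tool is retrained only on what the user has chosen to label, in contrast to active learning, where the machine picks its own queries.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(0)
      X = rng.standard_normal((500, 2))
      y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)        # hidden ground truth

      # Seed the labeled pool with one example of each class so the first fit is valid.
      labeled = [int(np.flatnonzero(y == 0)[0]), int(np.flatnonzero(y == 1)[0])]
      clf = LogisticRegression()

      for iteration in range(5):
          clf.fit(X[labeled], y[labeled])
          print(f"iteration {iteration}: {len(labeled)} labels, accuracy {clf.score(X, y):.3f}")
          # Stand-in for the human analyst: the "user" labels the unlabeled points nearest a
          # region they happen to care about (here, near the origin), rather than letting
          # the machine choose its own queries.
          unlabeled = np.setdiff1d(np.arange(len(X)), labeled)
          focus = np.linalg.norm(X[unlabeled], axis=1)
          labeled.extend(unlabeled[np.argsort(focus)[:10]].tolist())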

  17. User-driven sampling strategies in image exploitation

    NASA Astrophysics Data System (ADS)

    Harvey, Neal; Porter, Reid

    2013-12-01

    Visual analytics and interactive machine learning both try to leverage the complementary strengths of humans and machines to solve complex data exploitation tasks. These fields overlap most significantly when training is involved: the visualization or machine learning tool improves over time by exploiting observations of the human-computer interaction. This paper focuses on one aspect of the human-computer interaction that we call user-driven sampling strategies. Unlike relevance feedback and active learning sampling strategies, where the computer selects which data to label at each iteration, we investigate situations where the user selects which data is to be labeled at each iteration. User-driven sampling strategies can emerge in many visual analytics applications but they have not been fully developed in machine learning. User-driven sampling strategies suggest new theoretical and practical research questions for both visualization science and machine learning. In this paper we identify and quantify the potential benefits of these strategies in a practical image analysis application. We find user-driven sampling strategies can sometimes provide significant performance gains by steering tools towards local minima that have lower error than tools trained with all of the data. In preliminary experiments we find these performance gains are particularly pronounced when the user is experienced with the tool and application domain.

  18. Four PPPPerspectives on computational creativity in theory and in practice

    NASA Astrophysics Data System (ADS)

    Jordanous, Anna

    2016-04-01

    Computational creativity is the modelling, simulating or replicating of creativity computationally. In examining and learning from these "creative systems", from what perspective should the creativity of a system be considered? Are we interested in the creativity of the system's output? Or of its creative processes? Features of the system? Or how it operates within its environment? Traditionally computational creativity has focused more on creative systems' products or processes, though this focus has widened recently. Creativity research offers the Four Ps of creativity: Person/Producer, Product, Process and Press/Environment. This paper presents the Four Ps, explaining each in the context of creativity research and how it relates to computational creativity. To illustrate the usefulness of the Four Ps in taking broader perspectives on creativity in its computational treatment, the concepts of novelty and value are explored using the Four Ps, highlighting aspects of novelty and value that may otherwise be overlooked. Analysis of recent research in computational creativity finds that although each of the Four Ps appears in the body of computational creativity work, individual pieces of work often do not acknowledge all Four Ps, missing opportunities to widen their work's relevance. We can see, though, that high-status computational creativity papers do typically address all Four Ps. This paper argues that the broader views of creativity afforded by the Four Ps is vital in guiding us towards more comprehensively useful computational investigations of creativity.

  19. On aerodynamic wake analysis and its relation to total aerodynamic drag in a wind tunnel environment

    NASA Astrophysics Data System (ADS)

    Guterres, Rui M.

    The present work was developed with the goal of advancing the state of the art in the application of three-dimensional wake data analysis to the quantification of aerodynamic drag on a body in a low speed wind tunnel environment. Analysis of the existing tools, their strengths and limitations is presented. Improvements to the existing analysis approaches were made. Software tools were developed to integrate the analysis into a practical tool. A comprehensive derivation of the equations needed for drag computations based on three dimensional separated wake data is developed. A set of complete steps ranging from the basic mathematical concept to the applicable engineering equations is presented. An extensive experimental study was conducted. Three representative body types were studied in varying ground effect conditions. A detailed qualitative wake analysis using wake imaging and two and three dimensional flow visualization was performed. Several significant features of the flow were identified and their relation to the total aerodynamic drag established. A comprehensive wake study of this type is shown to be in itself a powerful tool for the analysis of the wake aerodynamics and its relation to body drag. Quantitative wake analysis techniques were developed. Significant post processing and data conditioning tools and precision analysis were developed. The quality of the data is shown to be in direct correlation with the accuracy of the computed aerodynamic drag. Steps are taken to identify the sources of uncertainty. These are quantified when possible and the accuracy of the computed results is seen to significantly improve. When post processing alone does not resolve issues related to precision and accuracy, solutions are proposed. The improved quantitative wake analysis is applied to the wake data obtained. Guidelines are established that will lead to more successful implementation of these tools in future research programs. Close attention is paid to implementation of issues that are of crucial importance for the accuracy of the results and that are not detailed in the literature. The impact of ground effect on the flows in hand is qualitatively and quantitatively studied. Its impact on the accuracy of the computations as well as the wall drag incompatibility with the theoretical model followed are discussed. The newly developed quantitative analysis provides significantly increased accuracy. The aerodynamic drag coefficient is computed within one percent of balance measured value for the best cases.
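
    A minimal sketch of the core wake-integration quantity discussed above, under simplifying assumptions (incompressible flow, pressure recovered to freestream at the survey plane, cross-flow terms neglected): profile drag is estimated from the momentum deficit, D ~ rho * integral of u*(U_inf - u) over the wake survey plane, here filled with synthetic data. The full formulation and correction terms developed in the work are not reproduced.

      import numpy as np

      rho, U_inf = 1.225, 30.0                     # air density (kg/m^3), freestream speed (m/s)
      y = np.linspace(-0.5, 0.5, 81)               # survey-plane coordinates (m)
      z = np.linspace(-0.3, 0.3, 61)
      Y, Z = np.meshgrid(y, z, indexing="ij")

      # Synthetic wake survey: a Gaussian velocity deficit standing in for measured data.
      u = U_inf * (1.0 - 0.25 * np.exp(-((Y / 0.12) ** 2 + (Z / 0.08) ** 2)))

      # Momentum-deficit drag estimate on a uniform grid: D ~ rho * sum(u * (U_inf - u)) * dA.
      dA = (y[1] - y[0]) * (z[1] - z[0])
      drag = rho * np.sum(u * (U_inf - u)) * dA
      q_inf, ref_area = 0.5 * rho * U_inf**2, 0.1  # hypothetical reference area (m^2)
      print(f"estimated profile drag: {drag:.2f} N  (CD = {drag / (q_inf * ref_area):.4f})")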

  20. Induction of Social Behavior in Zebrafish: Live Versus Computer Animated Fish as Stimuli

    PubMed Central

    Qin, Meiying; Wong, Albert; Seguin, Diane

    2014-01-01

    Abstract The zebrafish offers an excellent compromise between system complexity and practical simplicity and has been suggested as a translational research tool for the analysis of human brain disorders associated with abnormalities of social behavior. Unlike laboratory rodents zebrafish are diurnal, thus visual cues may be easily utilized in the analysis of their behavior and brain function. Visual cues, including the sight of conspecifics, have been employed to induce social behavior in zebrafish. However, the method of presentation of these cues and the question of whether computer animated images versus live stimulus fish have differential effects have not been systematically analyzed. Here, we compare the effects of five stimulus presentation types: live conspecifics in the experimental tank or outside the tank, playback of video-recorded live conspecifics, computer animated images of conspecifics presented by two software applications, the previously employed General Fish Animator, and a new application Zebrafish Presenter. We report that all stimuli were equally effective and induced a robust social response (shoaling) manifesting as reduced distance between stimulus and experimental fish. We conclude that presentation of live stimulus fish, or 3D images, is not required and 2D computer animated images are sufficient to induce robust and consistent social behavioral responses in zebrafish. PMID:24575942
