Development of Computer-Based Experiment Set on Simple Harmonic Motion of Mass on Springs
ERIC Educational Resources Information Center
Musik, Panjit
2017-01-01
The development of computer-based experiment sets has become necessary in teaching physics in schools so that students can learn from their real experiences. The purpose of this study is to create and to develop a computer-based experiment set on the simple harmonic motion of a mass on springs for teaching and learning physics. The average period of…
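For reference, the standard relation such an experiment set verifies is the mass-spring period formula; plotting T² against m gives a line of slope 4π²/k, from which the spring constant follows:

```latex
T = 2\pi\sqrt{\frac{m}{k}}
\qquad\Longrightarrow\qquad
T^{2} = \frac{4\pi^{2}}{k}\,m .
```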
Computer-Based vs Paper-Based Examinations: Perceptions of University Teachers
ERIC Educational Resources Information Center
Jamil, Mubashrah; Tariq, R. H.; Shami, P. A.
2012-01-01
This research reported teachers' perceptions about computer-based (CB) vs. paper-based (PB) examinations. Teachers were divided into 7 major categories i.e., gender, departments, designations, qualifications, teaching experiences, computer training certifications and CB examination experiences, which were the key factors to be observed and…
Evolving technologies for Space Station Freedom computer-based workstations
NASA Technical Reports Server (NTRS)
Jensen, Dean G.; Rudisill, Marianne
1990-01-01
Viewgraphs on evolving technologies for Space Station Freedom computer-based workstations are presented. The human-computer computer software environment modules are described. The following topics are addressed: command and control workstation concept; cupola workstation concept; Japanese experiment module RMS workstation concept; remote devices controlled from workstations; orbital maneuvering vehicle free flyer; remote manipulator system; Japanese experiment module exposed facility; Japanese experiment module small fine arm; flight telerobotic servicer; human-computer interaction; and workstation/robotics related activities.
Experience with a UNIX based batch computing facility for H1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gerhards, R.; Kruener-Marquis, U.; Szkutnik, Z.
1994-12-31
A UNIX based batch computing facility for the H1 experiment at DESY is described. The ultimate goal is to replace the DESY IBM mainframe by a multiprocessor SGI Challenge series computer, using the UNIX operating system, for most of the computing tasks in H1.
ERIC Educational Resources Information Center
Grimley, Michael; Green, Richard; Nilsen, Trond; Thompson, David
2012-01-01
Computer games are purported to be effective instructional tools that enhance motivation and improve engagement. The aim of this study was to investigate how tertiary student experiences change when instruction was computer game based compared to lecture based, and whether experiences differed between high and low achieving students. Participants…
Optimization of Multi-Fidelity Computer Experiments via the EQIE Criterion
DOE Office of Scientific and Technical Information (OSTI.GOV)
He, Xu; Tuo, Rui; Jeff Wu, C. F.
2017-01-31
Computer experiments based on mathematical models are powerful tools for understanding physical processes. This article addresses the problem of kriging-based optimization for deterministic computer experiments with tunable accuracy. Our approach is to use multi-fidelity computer experiments with increasing accuracy levels and a nonstationary Gaussian process model. We propose an optimization scheme that sequentially adds new computer runs by following two criteria. The first criterion, called EQI, scores candidate inputs at a given level of accuracy, and the second criterion, called EQIE, scores candidate combinations of inputs and accuracy. In simulation results and a real example using finite element analysis, our method outperforms the expected improvement (EI) criterion, which works for single-accuracy experiments.
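For context on the EI baseline the abstract mentions, here is a minimal, self-contained sketch of kriging-based sequential optimization with the expected improvement criterion on a single-accuracy toy problem. The fixed-hyperparameter RBF Gaussian process and the toy objective are assumptions of this sketch, not the authors' EQI/EQIE implementation:

```python
# Sequential kriging-based minimization with expected improvement (EI):
# fit a GP to the runs so far, score candidates by EI, run the best one.
import numpy as np
from scipy.stats import norm

def rbf(A, B, length=0.2):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * length ** 2))

def gp_posterior(X, y, Xs, noise=1e-6):
    K = rbf(X, X) + noise * np.eye(len(X))      # kernel matrix with jitter
    Ks = rbf(X, Xs)
    sol = np.linalg.solve(K, Ks)
    mu = sol.T @ y                               # posterior mean
    var = np.clip(np.diag(rbf(Xs, Xs) - Ks.T @ sol), 1e-12, None)
    return mu, np.sqrt(var)

def expected_improvement(mu, sigma, best):
    z = (best - mu) / sigma                      # minimization convention
    return (best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

f = lambda x: np.sin(8 * x[:, 0]) + x[:, 0]      # toy "computer experiment"
X = np.array([[0.1], [0.5], [0.9]]); y = f(X)    # initial design
cand = np.linspace(0, 1, 201)[:, None]           # candidate inputs
for _ in range(10):                              # sequentially add new runs
    mu, sigma = gp_posterior(X, y, cand)
    x_new = cand[np.argmax(expected_improvement(mu, sigma, y.min()))]
    X = np.vstack([X, x_new]); y = np.append(y, f(x_new[None, :]))
print("best input:", X[y.argmin()], "best value:", y.min())
```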
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ahrens, J.P.; Shapiro, L.G.; Tanimoto, S.L.
1997-04-01
This paper describes a computing environment which supports computer-based scientific research work. Key features include support for automatic distributed scheduling and execution and computer-based scientific experimentation. A new flexible and extensible scheduling technique that is responsive to a user's scheduling constraints, such as the ordering of program results and the specification of task assignments and processor utilization levels, is presented. An easy-to-use constraint language for specifying scheduling constraints, based on the relational database query language SQL, is described along with a search-based algorithm for fulfilling these constraints. A set of performance studies shows that the environment can schedule and execute program graphs on a network of workstations as the user requests. A method for automatically generating computer-based scientific experiments is described. Experiments provide a concise method of specifying a large collection of parameterized program executions. The environment achieved significant speedups when executing experiments; for a large collection of scientific experiments, an average speedup of 3.4 on an average of 5.5 scheduled processors was obtained.
Mayer, Richard E; Hegarty, Mary; Mayer, Sarah; Campbell, Julie
2005-12-01
In 4 experiments, students received a lesson consisting of computer-based animation and narration or a lesson consisting of paper-based static diagrams and text. The lessons used the same words and graphics in the paper-based and computer-based versions to explain the process of lightning formation (Experiment 1), how a toilet tank works (Experiment 2), how ocean waves work (Experiment 3), and how a car's braking system works (Experiment 4). On subsequent retention and transfer tests, the paper group performed significantly better than the computer group on 4 of 8 comparisons, and there was no significant difference on the rest. These results support the static media hypothesis, in which static illustrations with printed text reduce extraneous processing and promote germane processing as compared with narrated animations.
ERIC Educational Resources Information Center
Psycharis, Sarantos
2016-01-01
The computational experiment approach considers models as the fundamental instructional units of Inquiry Based Science and Mathematics Education (IBSE) and STEM Education, where the model takes the place of the "classical" experimental set-up and simulation replaces the experiment. Argumentation in IBSE and STEM education is related to the…
A Computer-Based Simulation of an Acid-Base Titration
ERIC Educational Resources Information Center
Boblick, John M.
1971-01-01
Reviews the advantages of computer simulated environments for experiments, referring in particular to acid-base titrations. Includes pre-lab instructions and a sample computer printout of a student's use of an acid-base simulation. Ten references. (PR)
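A minimal sketch of the core calculation such a simulation performs for a strong acid titrated with a strong base (illustrative only, not Boblick's original program; water autoionization is neglected, so values very close to the equivalence point are not meaningful):

```python
# pH along a strong acid-strong base titration curve.
import numpy as np

def titration_ph(v_base_ml, c_acid=0.10, v_acid_l=0.050, c_base=0.10):
    v_base = np.asarray(v_base_ml, dtype=float) / 1000.0   # mL -> L
    excess_h = c_acid * v_acid_l - c_base * v_base         # mol of excess H+
    v_total = v_acid_l + v_base
    return np.where(
        excess_h > 0,
        -np.log10(np.abs(excess_h) / v_total),             # before equivalence
        14.0 + np.log10(np.abs(excess_h) / v_total),       # after equivalence
    )

volumes = [0.0, 25.0, 49.0, 49.9, 50.1, 51.0, 75.0]        # mL of base added
print(np.round(titration_ph(volumes), 2))
```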
Reactor transient control in support of PFR/TREAT TUCOP experiments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burrows, D.R.; Larsen, G.R.; Harrison, L.J.
1984-01-01
Unique energy deposition and experiment control requirements posed by the PFR/TREAT series of transient undercooling/overpower (TUCOP) experiments resulted in equally unique TREAT reactor operations. New reactor control computer algorithms were written and used with the TREAT reactor control computer system to perform such functions as early power burst generation (based on test train flow conditions), burst generation produced by a step insertion of reactivity following a controlled power ramp, and shutdown (SCRAM) initiators based on both test train conditions and energy deposition. Specialized hardware was constructed to simulate test train inputs to the control computer system so that computer algorithms could be tested in real time without irradiating the experiment.
ERIC Educational Resources Information Center
García Laborda, Jesús; López Santiago, Mercedes; Otero de Juan, Nuria; Álvarez Álvarez, Alfredo
2014-01-01
Current evolutions of language testing have led to integrating computers in FSP assessments both in oral and written communicative tasks. This paper deals with two main issues: learners' expectations about the types of questions in FSP computer based assessments and the relation with their own experience. This paper describes the experience of 23…
NASA Astrophysics Data System (ADS)
Wan, Junwei; Chen, Hongyan; Zhao, Jing
2017-08-01
To meet the real-time, reliability, and safety requirements of aerospace experiments, a single-center cloud computing application verification platform was constructed. At the IaaS level, the feasibility of applying cloud computing technology to the field of aerospace experiments was tested and verified. Based on an analysis of the test results, a preliminary conclusion is reached: a cloud computing platform can be applied to compute-intensive aerospace experiment workloads; for I/O-intensive workloads, traditional physical machines are recommended.
AnnotCompute: annotation-based exploration and meta-analysis of genomics experiments
Zheng, Jie; Stoyanovich, Julia; Manduchi, Elisabetta; Liu, Junmin; Stoeckert, Christian J.
2011-01-01
The ever-increasing scale of biological data sets, particularly those arising in the context of high-throughput technologies, requires the development of rich data exploration tools. In this article, we present AnnotCompute, an information discovery platform for repositories of functional genomics experiments such as ArrayExpress. Our system leverages semantic annotations of functional genomics experiments with controlled vocabulary and ontology terms, such as those from the MGED Ontology, to compute conceptual dissimilarities between pairs of experiments. These dissimilarities are then used to support two types of exploratory analysis—clustering and query-by-example. We show that our proposed dissimilarity measures correspond to a user's intuition about conceptual dissimilarity, and can be used to support effective query-by-example. We also evaluate the quality of clustering based on these measures. While AnnotCompute can support a richer data exploration experience, its effectiveness is limited in some cases, due to the quality of available annotations. Nonetheless, tools such as AnnotCompute may provide an incentive for richer annotations of experiments. Code is available for download at http://www.cbil.upenn.edu/downloads/AnnotCompute. Database URL: http://www.cbil.upenn.edu/annotCompute/ PMID:22190598
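An illustrative stand-in for the annotation-based dissimilarity idea (AnnotCompute's actual measures over MGED Ontology annotations are more elaborate; the term strings and accession IDs below are invented): a Jaccard dissimilarity between controlled-vocabulary term sets, used to rank a repository for query-by-example:

```python
# Rank experiments by conceptual dissimilarity of their annotation term sets.
def jaccard_dissimilarity(terms_a: set, terms_b: set) -> float:
    if not terms_a and not terms_b:
        return 0.0
    return 1.0 - len(terms_a & terms_b) / len(terms_a | terms_b)

query = {"organism:Homo sapiens", "assay:transcription profiling", "factor:age"}
repository = {
    "E-0001": {"organism:Homo sapiens", "assay:transcription profiling", "factor:dose"},
    "E-0002": {"organism:Mus musculus", "assay:ChIP-chip", "factor:genotype"},
}
ranked = sorted(repository, key=lambda e: jaccard_dissimilarity(query, repository[e]))
print(ranked)  # experiments most conceptually similar to the query come first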
Computer Based Simulation of Laboratory Experiments.
ERIC Educational Resources Information Center
Edward, Norrie S.
1997-01-01
Examines computer based simulations of practical laboratory experiments in engineering. Discusses the aims and achievements of lab work (cognitive, process, psychomotor, and affective); types of simulations (model building and behavioral); and the strengths and weaknesses of simulations. Describes the development of a centrifugal pump simulation,…
The Information Science Experiment System - The computer for science experiments in space
NASA Technical Reports Server (NTRS)
Foudriat, Edwin C.; Husson, Charles
1989-01-01
The concept of the Information Science Experiment System (ISES), potential experiments, and system requirements are reviewed. The ISES is conceived as a computer resource in space whose aim is to assist computer, earth, and space science experiments, to develop and demonstrate new information processing concepts, and to provide an experiment base for developing new information technology for use in space systems. The discussion covers system hardware and architecture, operating system software, the user interface, and the ground communication link.
DIRAC in Large Particle Physics Experiments
NASA Astrophysics Data System (ADS)
Stagni, F.; Tsaregorodtsev, A.; Arrabito, L.; Sailer, A.; Hara, T.; Zhang, X.; Consortium, DIRAC
2017-10-01
The DIRAC project is developing interware to build and operate distributed computing systems. It provides a development framework and a rich set of services for both Workload and Data Management tasks of large scientific communities. A number of High Energy Physics and Astrophysics collaborations have adopted DIRAC as the base for their computing models. DIRAC was initially developed for the LHCb experiment at LHC, CERN. Later, the Belle II, BES III and CTA experiments as well as the linear collider detector collaborations started using DIRAC for their computing systems. Some of the experiments built their DIRAC-based systems from scratch, others migrated from previous solutions, ad hoc or based on different middlewares. Adaptation of DIRAC for a particular experiment was enabled through the creation of extensions to meet its specific requirements. Each experiment has a heterogeneous set of computing and storage resources at its disposal that are aggregated through DIRAC into a coherent pool. Users from different experiments can interact with the system in different ways depending on their specific tasks, expertise level and previous experience, using command line tools, python APIs or Web Portals. In this contribution we will summarize the experience of using DIRAC in particle physics collaborations. The problems of migration to DIRAC from previous systems and their solutions will be presented. An overview of specific DIRAC extensions will be given. We hope that this review will be useful for experiments considering an update, or for those designing their computing models.
AEC Experiment Establishes Computer Link Between California and Paris
The feasibility of a worldwide information retrieval system which would tie a computer base of information to terminals on the… The experiment demonstrated that a terminal in Paris could search a computer in California and display the resulting…
Integrated design, execution, and analysis of arrayed and pooled CRISPR genome-editing experiments.
Canver, Matthew C; Haeussler, Maximilian; Bauer, Daniel E; Orkin, Stuart H; Sanjana, Neville E; Shalem, Ophir; Yuan, Guo-Cheng; Zhang, Feng; Concordet, Jean-Paul; Pinello, Luca
2018-05-01
CRISPR (clustered regularly interspaced short palindromic repeats) genome-editing experiments offer enormous potential for the evaluation of genomic loci using arrayed single guide RNAs (sgRNAs) or pooled sgRNA libraries. Numerous computational tools are available to help design sgRNAs with optimal on-target efficiency and minimal off-target potential. In addition, computational tools have been developed to analyze deep-sequencing data resulting from genome-editing experiments. However, these tools are typically developed in isolation and oftentimes are not readily translatable into laboratory-based experiments. Here, we present a protocol that describes in detail both the computational and benchtop implementation of an arrayed and/or pooled CRISPR genome-editing experiment. This protocol provides instructions for sgRNA design with CRISPOR (computational tool for the design, evaluation, and cloning of sgRNA sequences), experimental implementation, and analysis of the resulting high-throughput sequencing data with CRISPResso (computational tool for analysis of genome-editing outcomes from deep-sequencing data). This protocol allows for design and execution of arrayed and pooled CRISPR experiments in 4-5 weeks by non-experts, as well as computational data analysis that can be performed in 1-2 d by both computational and noncomputational biologists alike using web-based and/or command-line versions.
ERIC Educational Resources Information Center
Psycharis, Sarantos
2016-01-01
In this study, an instructional design model, based on the computational experiment approach, was employed in order to explore the effects of the formative assessment strategies and scientific abilities rubrics on students' engagement in the development of inquiry-based pedagogical scenario. In the following study, rubrics were used during the…
Perceptions and performance using computer-based testing: One institution's experience.
Bloom, Timothy J; Rich, Wesley D; Olson, Stephanie M; Adams, Michael L
2018-02-01
The purpose of this study was to evaluate student and faculty perceptions of the transition to a required computer-based testing format and to identify any impact of this transition on student exam performance. Separate questionnaires sent to students and faculty asked about perceptions of and problems with computer-based testing. Exam results from program-required courses for two years prior to and two years following the adoption of computer-based testing were compared to determine if this testing format impacted student performance. Responses to Likert-type questions about perceived ease of use showed no difference between students with one and three semesters experience with computer-based testing. Of 223 student-reported problems, 23% related to faculty training with the testing software. Students most commonly reported improved feedback (46% of responses) and ease of exam-taking (17% of responses) as benefits to computer-based testing. Faculty-reported difficulties were most commonly related to problems with student computers during an exam (38% of responses) while the most commonly identified benefit was collecting assessment data (32% of responses). Neither faculty nor students perceived an impact on exam performance due to computer-based testing. An analysis of exam grades confirmed there was no consistent performance difference between the paper and computer-based formats. Both faculty and students rapidly adapted to using computer-based testing. There was no evidence that switching to computer-based testing had any impact on student exam performance.
Progress in Machine Learning Studies for the CMS Computing Infrastructure
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bonacorsi, Daniele; Kuznetsov, Valentin; Magini, Nicolo
2017-12-06
Computing systems for the LHC experiments have developed together with Grids worldwide. While a complete description of the original Grid-based infrastructure and services for the LHC experiments and its recent evolutions can be found elsewhere, it is worth mentioning here the scale of the computing resources needed to fulfill the needs of the LHC experiments in Run-1 and Run-2 so far.
Using Computer-Based "Experiments" in the Analysis of Chemical Reaction Equilibria
ERIC Educational Resources Information Center
Li, Zhao; Corti, David S.
2018-01-01
The application of the Reaction Monte Carlo (RxMC) algorithm to standard textbook problems in chemical reaction equilibria is discussed. The RxMC method is a molecular simulation algorithm for studying the equilibrium properties of reactive systems, and therefore provides the opportunity to develop computer-based "experiments" for the…
How Do Students Experience Testing on the University Computer?
ERIC Educational Resources Information Center
Whittington, Dale; And Others
1995-01-01
Reports a study of the administration mode, scores, and testing experiences of students taking the PreProfessional Skills Test (PPST) under differing conditions (computer based and paper and pencil). PPST scores and surveys of the students revealed varied test-taking strategies and computer-related alterations in test difficulty, construct,…
Computer-Based Molecular Modelling: Finnish School Teachers' Experiences and Views
ERIC Educational Resources Information Center
Aksela, Maija; Lundell, Jan
2008-01-01
Modern computer-based molecular modelling opens up new possibilities for chemistry teaching at different levels. This article presents a case study seeking insight into Finnish school teachers' use of computer-based molecular modelling in teaching chemistry, into the different working and teaching methods used, and their opinions about necessary…
Implementing Computer Based Laboratories
NASA Astrophysics Data System (ADS)
Peterson, David
2001-11-01
Physics students at Francis Marion University will complete several required laboratory exercises utilizing computer-based Vernier probes. The simple pendulum, the acceleration due to gravity, simple harmonic motion, radioactive half lives, and radiation inverse square law experiments will be incorporated into calculus-based and algebra-based physics courses. Assessment of student learning and faculty satisfaction will be carried out by surveys and test results. Cost effectiveness and time effectiveness assessments will be presented. Majors in Computational Physics, Health Physics, Engineering, Chemistry, Mathematics and Biology take these courses, and assessments will be categorized by major. To enhance the computer skills of students enrolled in the courses, MAPLE will be used for further analysis of the data acquired during the experiments. Assessment of these enhancement exercises will also be presented.
NASA Astrophysics Data System (ADS)
Spiliotopoulos, I.; Mirmont, M.; Kruijff, M.
2008-08-01
This paper highlights the flight preparation and mission performance of a PC104-based On-Board Computer for ESA's second Young Engineer's Satellite (YES2), with additional attention to the flight software design and experience of QNX as a multi-process real-time operating system. This combination of Commercial-Off-The-Shelf (COTS) technologies is an accessible option for small satellites with high computational demands.
A Comparison of Web-Based and Face-to-Face Functional Measurement Experiments
ERIC Educational Resources Information Center
Van Acker, Frederik; Theuns, Peter
2010-01-01
Information Integration Theory (IIT) is concerned with how people combine information into an overall judgment. A method is hereby presented to perform Functional Measurement (FM) experiments, the methodological counterpart of IIT, on the Web. In a comparison of Web-based FM experiments, face-to-face experiments, and computer-based experiments in…
Jack Rabbit Pretest Data For TATB Based IHE Model Development
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hart, M M; Strand, O T; Bosson, S T
The Jack Rabbit Pretest series consisted of 5 focused hydrodynamic experiments, 2021E PT3, PT4, PT5, PT6, and PT7. They were fired in March and April of 2008 at the Contained Firing Facility, Site 300, Lawrence Livermore National Laboratory, Livermore, California. These experiments measured dead-zone formation and impulse gradients created during the detonation of TATB-based insensitive high explosive. This document contains reference data tables for all 5 experiments. These data tables include: (1) measured laser velocimetry of the experiment diagnostic plate; (2) computed diagnostic plate profile contours through velocity integration; (3) computed center axis pressures through velocity differentiation. All times are in microseconds, referenced from detonator circuit current start. All dimensions are in millimeters. Schematic axi-symmetric cross sections are shown for each experiment. These schematics detail the materials used and the dimensions of the experiment and component parts. This should allow anyone to evaluate their TATB-based insensitive high explosive detonation model against experiment. These data are particularly relevant for examining reactive flow detonation model predictions in computational simulation of dead-zone formation and the resulting impulse gradients produced by detonating TATB-based explosive.
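As an illustration of the two derived quantities listed above, a sketch assuming evenly sampled plate velocimetry and a pressure inferred from plate areal density times acceleration (the velocity trace and the pressure relation are assumptions of this sketch, not statements of the report's method):

```python
# Derive plate displacement (velocity integration) and an inferred drive
# pressure (velocity differentiation) from a synthetic velocimetry record.
import numpy as np

t = np.linspace(0.0, 20.0e-6, 2001)                  # s, from detonator current start
u = 800.0 * (1 - np.exp(-t / 3.0e-6))                # m/s, synthetic plate velocity
rho_areal = 7850.0 * 0.01                            # kg/m^2, e.g. 10 mm steel plate

# cumulative trapezoidal integration: velocity -> displacement profile
profile = np.concatenate([[0.0], np.cumsum(0.5 * (u[1:] + u[:-1]) * np.diff(t))])
# differentiation: p = (m/A) du/dt  (assumed relation for this sketch)
pressure = rho_areal * np.gradient(u, t)             # Pa

print(f"plate displacement at 20 us: {profile[-1] * 1e3:.2f} mm")
print(f"peak inferred pressure: {pressure.max() / 1e9:.2f} GPa")
```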
Computer-Based Testing in the Medical Curriculum: A Decade of Experiences at One School
ERIC Educational Resources Information Center
McNulty, John; Chandrasekhar, Arcot; Hoyt, Amy; Gruener, Gregory; Espiritu, Baltazar; Price, Ron, Jr.
2011-01-01
This report summarizes more than a decade of experiences with implementing computer-based testing across a 4-year medical curriculum. Practical considerations are given to the fields incorporated within an item database and their use in the creation and analysis of examinations, security issues in the delivery and integrity of examinations,…
Experience with a sophisticated computer based authoring system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gardner, P.R.
1984-04-01
In the November 1982 issue of the ADCIS SIG CBT Newsletter, the editor arrives at two conclusions regarding Computer Based Authoring Systems (CBAS): (1) CBAS drastically reduces programming time and the need for expert programmers, and (2) CBAS appears to have minimal impact on initial lesson design. Both of these comments have significant impact on any cost-benefit analysis for Computer-Based Training. Westinghouse Hanford Company (WHC) recently purchased a sophisticated CBAS, the WISE/SMART system from Wicat (Orem, UT), for use in the Nuclear Power Industry. This report details our experience with this system relative to items (1) and (2) above; lesson design time will be compared with lesson input time. Also provided will be the WHC experience in the use of subject matter experts (though computer neophytes) for the design and inputting of CBT materials.
Two Computer-Assisted Experiments
ERIC Educational Resources Information Center
Kraftmakher, Yaakov
2013-01-01
Two computer-assisted experiments are described: (i) determination of the speed of ultrasound waves in water and (ii) measurement of the thermal expansion of an aluminum-based alloy. A new data-acquisition system developed by PASCO scientific is used. In both experiments, the "Keep" mode of recording data is employed: the data are…
Learning Oceanography from a Computer Simulation Compared with Direct Experience at Sea
ERIC Educational Resources Information Center
Winn, William; Stahr, Frederick; Sarason, Christian; Fruland, Ruth; Oppenheimer, Peter; Lee, Yen-Ling
2006-01-01
Considerable research has compared how students learn science from computer simulations with how they learn from "traditional" classes. Little research has compared how students learn science from computer simulations with how they learn from direct experience in the real environment on which the simulations are based. This study compared two…
Examining the Feasibility and Effect of Transitioning GED Tests to Computer
ERIC Educational Resources Information Center
Higgins, Jennifer; Patterson, Margaret Becker; Bozman, Martha; Katz, Michael
2010-01-01
This study examined the feasibility of administering GED Tests using a computer based testing system with embedded accessibility tools and the impact on test scores and test-taker experience when GED Tests are transitioned from paper to computer. Nineteen test centers across five states successfully installed the computer based testing program,…
Women and Computer Based Technologies: A Feminist Perspective.
ERIC Educational Resources Information Center
Morritt, Hope
The use of computer based technologies by professional women in education is examined through a feminist standpoint theory in this paper. The theory is grounded in eight claims which form the basis of the conceptual framework for the study. The experiences of nine women participants with computer based technologies were categorized using three…
Study of Fluid Experiment System (FES)/CAST/Holographic Ground System (HGS)
NASA Technical Reports Server (NTRS)
Workman, Gary L.; Cummings, Rick; Jones, Brian
1992-01-01
The use of holographic and schlieren optical techniques for studying concentration gradients in solidification processes has been employed by several investigators over the years. The HGS facility at MSFC has been a primary resource in researching this capability. Consequently, scientific personnel have been able to utilize these techniques in both ground-based research and in space experiments. An important event in the scientific utilization of the HGS facilities was the TGS crystal growth and the casting and solidification technology (CAST) experiments that were flown on the International Microgravity Laboratory (IML) mission in March of this year. The preparation and processing of these space observations are the primary experiments reported in this work. This project provides ground-based studies to optimize the holographic techniques used to acquire information about the crystal growth processes flown on IML. Since the ground-based studies will be compared with the space-based experimental results, it is necessary to conduct sufficient ground-based studies to best determine how the experiment worked in space. The current capabilities in computer-based systems for image processing and numerical computation have certainly assisted in those efforts. As anticipated, this study has shown that these advanced computing capabilities are helpful in the data analysis of such experiments.
Computer-Based Experiments to Measure RC.
ERIC Educational Resources Information Center
Hart, Francis X.
2000-01-01
Finds that few electricity and magnetism experiments make use of computers for data acquisition. Reports on the use of a Vernier system for the measurement of the RC time constant for the charging and discharging of a capacitor. (CCM)
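For reference, the measured curves in such an experiment follow the standard charging and discharging laws, and fitting the logarithm of the discharge gives the time constant directly:

```latex
V_{\text{charge}}(t) = V_0\left(1 - e^{-t/RC}\right),
\qquad
V_{\text{discharge}}(t) = V_0\, e^{-t/RC},
\qquad
\ln V_{\text{discharge}} = \ln V_0 - \frac{t}{RC}.
```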
NASA Astrophysics Data System (ADS)
Veltri, Pierangelo
The use of computer-based solutions for data management in biology and clinical science has helped to improve quality of life and to obtain research results in shorter time. Indeed, new algorithms and high performance computation have been used in proteomics and genomics studies for treating chronic diseases (e.g., drug design) as well as for supporting clinicians both in diagnosis (e.g., image-based diagnosis) and in patient care (e.g., computer-based analysis of information gathered from patients). In this paper we survey examples of computer-based techniques applied in both biological and clinical contexts. The reported applications also reflect experience with real applications at the University Medical School of Catanzaro, as well as the national project Staywell SH 2.0, which involves many research centers and companies aiming to study and improve citizen wellness.
Computer Game-Based Learning: Perceptions and Experiences of Senior Chinese Adults
ERIC Educational Resources Information Center
Wang, Feihong; Lockee, Barbara B.; Burton, John K.
2012-01-01
The purpose of this study was to investigate senior Chinese adults' potential acceptance of computer game-based learning (CGBL) by probing their perceptions of computer game play and their perceived impacts of game play on their learning of computer skills and life satisfaction. A total of 60 senior adults from a local senior adult learning center…
ERIC Educational Resources Information Center
Hsu, Ting-Chia
2016-01-01
In this study, a peer assessment system using the grid-based knowledge classification approach was developed to improve students' performance during computer skills training. To evaluate the effectiveness of the proposed approach, an experiment was conducted in a computer skills certification course. The participants were divided into three…
Building Computer-Based Experiments in Psychology without Programming Skills.
Ruisoto, Pablo; Bellido, Alberto; Ruiz, Javier; Juanes, Juan A
2016-06-01
Research in Psychology usually requires building and running experiments. Although this task has traditionally required scripting, recent computer tools based on graphical interfaces offer new opportunities in this field for researchers without programming skills. The purpose of this study is to illustrate and provide a comparative overview of two of the main free, open source "point and click" software packages for building and running experiments in Psychology: PsychoPy and OpenSesame. Recommendations for their potential use are further discussed.
The Application of Web-based Computer-assisted Instruction Courseware within Health Assessment
NASA Astrophysics Data System (ADS)
Xiuyan, Guo
Health assessment is a clinical nursing course that places emphasis on clinical skills. The application of computer-assisted instruction in the field of nursing teaching solved problems found in the traditional lecture class. This article reports teaching experience with web-based computer-assisted instruction, based upon a two-year study of computer-assisted instruction courseware use within the course Health Assessment. The computer-assisted instruction courseware could strengthen teaching structure, simulate clinical situations, create teaching scenarios, and facilitate students' study.
Using Amazon's Elastic Compute Cloud to dynamically scale CMS computational resources
NASA Astrophysics Data System (ADS)
Evans, D.; Fisk, I.; Holzman, B.; Melo, A.; Metson, S.; Pordes, R.; Sheldon, P.; Tiradani, A.
2011-12-01
Large international scientific collaborations such as the Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider have traditionally addressed their data reduction and analysis needs by building and maintaining dedicated computational infrastructure. Emerging cloud computing services such as Amazon's Elastic Compute Cloud (EC2) offer short-term CPU and storage resources with costs based on usage. These services allow experiments to purchase computing resources as needed, without significant prior planning and without long term investments in facilities and their management. We have demonstrated that services such as EC2 can successfully be integrated into the production-computing model of CMS, and find that they work very well as worker nodes. The cost-structure and transient nature of EC2 services makes them inappropriate for some CMS production services and functions. We also found that the resources are not truly "on-demand" as limits and caps on usage are imposed. Our trial workflows allow us to make a cost comparison between EC2 resources and dedicated CMS resources at a University, and conclude that it is most cost effective to purchase dedicated resources for the "base-line" needs of experiments such as CMS. However, if the ability to use cloud computing resources is built into an experiment's software framework before demand requires their use, cloud computing resources make sense for bursting during times when spikes in usage are required.
An Interactive, Web-based High Performance Modeling Environment for Computational Epidemiology.
Deodhar, Suruchi; Bisset, Keith R; Chen, Jiangzhuo; Ma, Yifei; Marathe, Madhav V
2014-07-01
We present an integrated interactive modeling environment to support public health epidemiology. The environment combines a high resolution individual-based model with a user-friendly web-based interface that allows analysts to access the models and the analytics back-end remotely from a desktop or a mobile device. The environment is based on a loosely-coupled service-oriented-architecture that allows analysts to explore various counter factual scenarios. As the modeling tools for public health epidemiology are getting more sophisticated, it is becoming increasingly hard for non-computational scientists to effectively use the systems that incorporate such models. Thus an important design consideration for an integrated modeling environment is to improve ease of use such that experimental simulations can be driven by the users. This is achieved by designing intuitive and user-friendly interfaces that allow users to design and analyze a computational experiment and steer the experiment based on the state of the system. A key feature of a system that supports this design goal is the ability to start, stop, pause and roll-back the disease propagation and intervention application process interactively. An analyst can access the state of the system at any point in time and formulate dynamic interventions based on additional information obtained through state assessment. In addition, the environment provides automated services for experiment set-up and management, thus reducing the overall time for conducting end-to-end experimental studies. We illustrate the applicability of the system by describing computational experiments based on realistic pandemic planning scenarios. The experiments are designed to demonstrate the system's capability and enhanced user productivity.
Computer-Based Mapping for Curriculum Development.
ERIC Educational Resources Information Center
Allen, Brockenbrough S.; And Others
This article describes the results of a three-month experiment in the use of computer-based semantic networks for curriculum development. A team of doctoral and master's degree students developed a 1200-item computer database representing a tentative "domain of competency" for a proposed MA degree in Workforce Education and Lifelong…
Determination of Absolute Zero Using a Computer-Based Laboratory
ERIC Educational Resources Information Center
Amrani, D.
2007-01-01
We present a simple computer-based laboratory experiment for evaluating absolute zero in degrees Celsius, which can be performed in college and undergraduate physical sciences laboratory courses. With a computer, absolute zero apparatus can help demonstrators or students to observe the relationship between temperature and pressure and use…
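The computation behind this experiment is a linear extrapolation: at constant volume, gas pressure is linear in Celsius temperature, and absolute zero is the temperature at which the fitted line reaches zero pressure:

```latex
P(t) = a + b\,t
\qquad\Longrightarrow\qquad
P = 0 \ \text{at}\ t_{0} = -\frac{a}{b} \approx -273.15\,^{\circ}\mathrm{C}.
```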
An Empirical Study of User Experience on Touch Mice
ERIC Educational Resources Information Center
Chou, Jyh Rong
2016-01-01
The touch mouse is a new type of computer mouse that provides users with a new way of touch-based environment to interact with computers. For more than a decade, user experience (UX) has grown into a core concept of human-computer interaction (HCI), describing a user's perceptions and responses that result from the use of a product in a particular…
Tran, Phuoc; Subrahmanyam, Kaveri
2013-01-01
The use of computers in the home has become very common among young children. This paper reviews research on the effects of informal computer use and identifies potential pathways through which computers may impact children's development. Based on the evidence reviewed, we present the following guidelines for arranging informal computer experiences that will promote the development of children's academic, cognitive and social skills: (1) children should be encouraged to use computers for moderate amounts of time (2-3 days a week for an hour or two per day) and (2) children's use of computers should (a) include non-violent action-based computer games as well as educational games, (b) not displace social activities but should instead be arranged to provide opportunities for social engagement with peers and family members and (c) involve content with pro-social and non-violent themes. We conclude the paper with questions that must be addressed in future research. In summary, this paper reviews research on the effects of informal computer use on children's academic, cognitive and social skills and, based on the evidence presented, offers guidelines to enable parents, teachers and other adults to arrange informal computer experiences so as to maximise their potential benefit for children's development.
Boevé, Anja J; Meijer, Rob R; Albers, Casper J; Beetsma, Yta; Bosker, Roel J
2015-01-01
The introduction of computer-based testing in high-stakes examining in higher education is developing rather slowly due to institutional barriers (the need for extra facilities, ensuring test security) and teacher and student acceptance. From the existing literature it is unclear whether computer-based exams produce results similar to paper-based exams and whether student acceptance can change as a result of administering computer-based exams. In this study, we compared results from a computer-based and a paper-based exam in a sample of psychology students and found no differences in total scores across the two modes. Furthermore, we investigated student acceptance and change in acceptance of computer-based examining. After taking the computer-based exam, fifty percent of the students preferred paper-and-pencil exams over computer-based exams and about a quarter preferred a computer-based exam. We conclude that computer-based exam total scores are similar to paper-based exam scores, but that for the acceptance of high-stakes computer-based exams it is important that students practice and get familiar with this new mode of test administration.
ERIC Educational Resources Information Center
Nix, Ingrid; Wyllie, Ali
2011-01-01
Many institutions encourage formative computer-based assessment (CBA), yet competing priorities mean that learners are necessarily selective about what they engage in. So how can we motivate them to engage? Can we facilitate learners to take more control of shaping their learning experience? To explore this, the Learning with Interactive…
An experiment for determining the Euler load by direct computation
NASA Technical Reports Server (NTRS)
Thurston, Gaylen A.; Stein, Peter A.
1986-01-01
A direct algorithm is presented for computing the Euler load of a column from experimental data. The method is based on exact inextensional theory for imperfect columns, which predicts two distinct deflected shapes at loads near the Euler load. The bending stiffness of the column appears in the expression for the Euler load along with the column length; therefore, the experimental data allow a direct computation of bending stiffness. Experiments on graphite-epoxy columns of rectangular cross-section are reported in the paper. The bending stiffness of each composite column computed from experiment is compared with predictions from laminated plate theory.
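For context, the classical Euler formula for a pinned-pinned column (the boundary condition is assumed here) links the load to the bending stiffness and length, which is why a measured Euler load allows direct computation of EI:

```latex
P_E = \frac{\pi^{2} EI}{L^{2}}
\qquad\Longrightarrow\qquad
EI = \frac{P_E\,L^{2}}{\pi^{2}}.
```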
Educating Laboratory Science Learners at a Distance Using Interactive Television
ERIC Educational Resources Information Center
Reddy, Christopher
2014-01-01
Laboratory science classes offered to students learning at a distance require a methodology that allows for the completion of tactile activities. Literature describes three different methods of solving the distance laboratory dilemma: kit-based laboratory experience, computer-based laboratory experience, and campus-based laboratory experience,…
A meta-analysis of outcomes from the use of computer-simulated experiments in science education
NASA Astrophysics Data System (ADS)
Lejeune, John Van
The purpose of this study was to synthesize the findings from existing research on the effects of computer simulated experiments on students in science education. Results from 40 reports were integrated by the process of meta-analysis to examine the effect of computer-simulated experiments and interactive videodisc simulations on student achievement and attitudes. Findings indicated significant positive differences in both low-level and high-level achievement of students who use computer-simulated experiments and interactive videodisc simulations as compared to students who used more traditional learning activities. No significant differences in retention, student attitudes toward the subject, or toward the educational method were found. Based on the findings of this study, computer-simulated experiments and interactive videodisc simulations should be used to enhance students' learning in science, especially in cases where the use of traditional laboratory activities are expensive, dangerous, or impractical.
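A typical effect-size computation behind such a synthesis (the abstract does not state which metric this study used) is the standardized mean difference between simulation and traditional-instruction groups:

```latex
d = \frac{\bar{X}_{\text{sim}} - \bar{X}_{\text{trad}}}{s_{\text{pooled}}},
\qquad
s_{\text{pooled}} = \sqrt{\frac{(n_1 - 1)s_1^{2} + (n_2 - 1)s_2^{2}}{n_1 + n_2 - 2}} .
```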
vFitness: a web-based computing tool for improving estimation of in vitro HIV-1 fitness experiments.
Ma, Jingming; Dykes, Carrie; Wu, Tao; Huang, Yangxin; Demeter, Lisa; Wu, Hulin
2010-05-18
The replication rate (or fitness) between viral variants has been investigated in vivo and in vitro for human immunodeficiency virus (HIV). HIV fitness plays an important role in the development and persistence of drug resistance. The accurate estimation of viral fitness relies on complicated computations based on statistical methods. This calls for tools that are easy to access and intuitive to use for various experiments of viral fitness. Based on a mathematical model and several statistical methods (least-squares approach and measurement error models), a Web-based computing tool has been developed for improving estimation of virus fitness in growth competition assays of human immunodeficiency virus type 1 (HIV-1). Unlike the two-point calculation used in previous studies, the estimation here uses linear regression methods with all observed data in the competition experiment to more accurately estimate relative viral fitness parameters. The dilution factor is introduced for making the computational tool more flexible to accommodate various experimental conditions. This Web-based tool is implemented in C# language with Microsoft ASP.NET, and is publicly available on the Web at http://bis.urmc.rochester.edu/vFitness/.
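A minimal sketch of the regression idea the abstract describes, under the simplifying assumptions of an exponential growth-competition model with no dilution-factor correction (the sample numbers below are invented for illustration):

```python
# Relative fitness from a growth competition assay: the slope of the log
# ratio of variant abundances over time, fitted by least squares to all
# time points rather than just two.
import numpy as np

days = np.array([0.0, 2.0, 4.0, 6.0, 8.0])          # sampling times
mutant = np.array([0.10, 0.18, 0.30, 0.45, 0.62])   # mutant fraction at each time
wildtype = 1.0 - mutant

log_ratio = np.log(mutant / wildtype)
slope, intercept = np.polyfit(days, log_ratio, 1)   # least-squares line
print(f"relative fitness (log ratio per day): {slope:.3f}")
```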
ERIC Educational Resources Information Center
Frein, Scott T.
2011-01-01
This article describes three experiments comparing paper-and-pencil tests (PPTs) to computer-based tests (CBTs) in terms of test method preferences and student performance. In Experiment 1, students took tests using three methods: PPT in class, CBT in class, and CBT at the time and place of their choosing. Results indicate that test method did not…
Design and deployment of an elastic network test-bed in IHEP data center based on SDN
NASA Astrophysics Data System (ADS)
Zeng, Shan; Qi, Fazhi; Chen, Gang
2017-10-01
High energy physics experiments produce huge amounts of raw data, but because of the shared nature of network resources, there is no guarantee of available bandwidth for each experiment, which may cause link congestion problems. On the other side, with the development of cloud computing technologies, IHEP has established a cloud platform based on OpenStack which ensures the flexibility of computing and storage resources, and more and more computing applications have been deployed on virtual machines created by OpenStack. However, under the traditional network architecture, network capacity cannot be acquired elastically, which becomes the bottleneck restricting the flexible application of cloud computing. To solve the above problems, we propose an elastic cloud data center network architecture based on SDN, and we design a high-performance controller cluster based on OpenDaylight. In the end, we present our current test results.
Kron, Frederick W; Fetters, Michael D; Scerbo, Mark W; White, Casey B; Lypson, Monica L; Padilla, Miguel A; Gliva-McConvey, Gayle A; Belfore, Lee A; West, Temple; Wallace, Amelia M; Guetterman, Timothy C; Schleicher, Lauren S; Kennedy, Rebecca A; Mangrulkar, Rajesh S; Cleary, James F; Marsella, Stacy C; Becker, Daniel M
2017-04-01
To assess advanced communication skills among second-year medical students exposed either to a computer simulation (MPathic-VR) featuring virtual humans, or to a multimedia computer-based learning module, and to understand each group's experiences and learning preferences. A single-blinded, mixed methods, randomized, multisite trial compared MPathic-VR (N=210) to computer-based learning (N=211). Primary outcomes: communication scores during repeat interactions with MPathic-VR's intercultural and interprofessional communication scenarios and scores on a subsequent advanced communication skills objective structured clinical examination (OSCE). Multivariate analysis of variance was used to compare outcomes. Secondary outcomes: student attitude surveys and qualitative assessments of their experiences with MPathic-VR or computer-based learning. MPathic-VR-trained students improved their intercultural and interprofessional communication performance between their first and second interactions with each scenario. They also achieved significantly higher composite scores on the OSCE than computer-based learning-trained students. Attitudes and experiences were more positive among students trained with MPathic-VR, who valued its providing immediate feedback, teaching nonverbal communication skills, and preparing them for emotion-charged patient encounters. MPathic-VR was effective in training advanced communication skills and in enabling knowledge transfer into a more realistic clinical situation. MPathic-VR's virtual human simulation offers an effective and engaging means of advanced communication training.
The development of acoustic experiments for off-campus teaching and learning
NASA Astrophysics Data System (ADS)
Wild, Graham; Swan, Geoff
2011-05-01
In this article, we show the implementation of a computer-based digital storage oscilloscope (DSO) and function generator (FG) using the computer's soundcard for off-campus acoustic experiments. The microphone input is used for the DSO, and a speaker jack is used as the FG. In an effort to reduce the cost of implementing the experiment, we examine software available for free, online. A small number of applications were compared in terms of their interface and functionality, for both the DSO and the FG. The software was then used to investigate standing waves in pipes using the computer-based DSO. Standing wave theory taught in high school and in first year physics is based on a one-dimensional model. With the use of the DSO's fast Fourier transform function, the experimental uncertainty alone was not sufficient to account for the difference observed between the measured and the calculated frequencies. Hence the original experiment was expanded to include the end correction effect. The DSO was also used for other simple acoustics experiments, in areas such as the physics of music.
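The end-correction analysis mentioned above follows from the resonance condition for a pipe closed at one end: the effective acoustic length exceeds the physical length by roughly 0.6 times the pipe radius for an unflanged opening, which the one-dimensional model omits:

```latex
f_n = \frac{n\,v}{4\,L_{\text{eff}}}, \quad n = 1, 3, 5, \dots,
\qquad
L_{\text{eff}} = L + 0.6\,r .
```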
Critical Computer Literacy: Computers in First-Year Composition as Topic and Environment.
ERIC Educational Resources Information Center
Duffelmeyer, Barbara Blakely
2000-01-01
Addresses how first-year students' understanding of computers is shaped by cultural assumptions about technology. Presents three meaning perspectives on technology that students expressed based on formative experiences they have had with it. Discusses implications for how computers and composition scholars incorporate computer technology into…
Zhang, Jie; Wu, Xiaohong; Yu, Yanmei; Luo, Daisheng
2013-01-01
In optical printed Chinese character recognition (OPCCR), many classifiers have been proposed, among which the support vector machine (SVM) may be the best. However, an SVM is inherently a two-class classifier, and applying it to the many classes of OPCCR is computationally expensive. We therefore propose a neighbor-classes-based SVM (NC-SVM) to reduce the computational cost of SVM. Experiments on NC-SVM classification for OPCCR have been done. The results show that the proposed NC-SVM effectively reduces computation time in OPCCR.
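The abstract does not spell out the NC-SVM construction, but the general idea of restricting pairwise SVM voting to a few neighboring candidate classes can be sketched as follows; the cheap nearest-centroid prefilter and all names here are illustrative assumptions, not the authors' design.

```python
import itertools
from collections import Counter
import numpy as np
from sklearn.neighbors import NearestCentroid
from sklearn.svm import SVC

class NeighborClassSVM:
    """Vote only among the k classes nearest to the query point."""
    def __init__(self, k=3):
        self.k = k

    def fit(self, X, y):
        self.nc = NearestCentroid().fit(X, y)        # cheap prefilter
        self.pair_svms = {}
        for a, b in itertools.combinations(self.nc.classes_, 2):
            mask = (y == a) | (y == b)
            self.pair_svms[(a, b)] = SVC().fit(X[mask], y[mask])
        return self

    def predict_one(self, x):
        # keep only the k classes whose centroids are closest to x
        d = np.linalg.norm(self.nc.centroids_ - x, axis=1)
        cand = sorted(self.nc.classes_[np.argsort(d)[: self.k]])
        votes = Counter()
        for a, b in itertools.combinations(cand, 2):
            votes[self.pair_svms[(a, b)].predict(x[None, :])[0]] += 1
        return votes.most_common(1)[0][0]
```

With thousands of character classes, evaluating only k(k-1)/2 pairwise SVMs for a small k instead of all pairs at recognition time is where a saving of the kind the paper claims would come from.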
Drawert, Brian; Trogdon, Michael; Toor, Salman; Petzold, Linda; Hellander, Andreas
2016-01-01
Computational experiments using spatial stochastic simulations have led to important new biological insights, but they require specialized tools and a complex software stack, as well as large and scalable compute and data analysis resources due to the large computational cost associated with Monte Carlo computational workflows. The complexity of setting up and managing a large-scale distributed computation environment to support productive and reproducible modeling can be prohibitive for practitioners in systems biology. This results in a barrier to the adoption of spatial stochastic simulation tools, effectively limiting the type of biological questions addressed by quantitative modeling. In this paper, we present PyURDME, a new, user-friendly spatial modeling and simulation package, and MOLNs, a cloud computing appliance for distributed simulation of stochastic reaction-diffusion models. MOLNs is based on IPython and provides an interactive programming platform for development of sharable and reproducible distributed parallel computational experiments.
NASA Astrophysics Data System (ADS)
Markauskaite, Lina; Kelly, Nick; Jacobson, Michael J.
2017-12-01
This paper gives a grounded cognition account of model-based learning of complex scientific knowledge related to socio-scientific issues, such as climate change. It draws on the results from a study of high school students learning about the carbon cycle through computational agent-based models and investigates two questions: First, how do students ground their understanding about the phenomenon when they learn and solve problems with computer models? Second, what are common sources of mistakes in students' reasoning with computer models? Results show that students ground their understanding in computer models in five ways: direct observation, straight abstraction, generalisation, conceptualisation, and extension. Students also incorporate into their reasoning their knowledge and experiences that extend beyond phenomena represented in the models, such as attitudes about unsustainable carbon emission rates, human agency, external events, and the nature of computational models. The most common difficulties of the students relate to seeing the modelled scientific phenomenon and connecting results from the observations with other experiences and understandings about the phenomenon in the outside world. An important contribution of this study is the constructed coding scheme for establishing different ways of grounding, which helps to understand some challenges that students encounter when they learn about complex phenomena with agent-based computer models.
Boevé, Anja J.; Meijer, Rob R.; Albers, Casper J.; Beetsma, Yta; Bosker, Roel J.
2015-01-01
The introduction of computer-based testing in high-stakes examining in higher education is developing rather slowly due to institutional barriers (the need for extra facilities, ensuring test security) and teacher and student acceptance. From the existing literature it is unclear whether computer-based exams will produce results similar to paper-based exams and whether student acceptance can change as a result of administering computer-based exams. In this study, we compared results from a computer-based and a paper-based exam in a sample of psychology students and found no differences in total scores across the two modes. Furthermore, we investigated student acceptance and change in acceptance of computer-based examining. After taking the computer-based exam, fifty percent of the students preferred paper-and-pencil exams over computer-based exams and about a quarter preferred a computer-based exam. We conclude that computer-based exam total scores are similar to paper-based exam scores, but that for the acceptance of high-stakes computer-based exams it is important that students practice and get familiar with this new mode of test administration. PMID:26641632
Hand-held computer operating system program for collection of resident experience data.
Malan, T K; Haffner, W H; Armstrong, A Y; Satin, A J
2000-11-01
To describe a system for recording resident experience involving hand-held computers with the Palm Operating System (3Com, Inc., Santa Clara, CA). Hand-held personal computers (PCs) are popular, easy to use, inexpensive, portable, and can share data among other operating systems. Residents in our program carry individual hand-held database computers to record Residency Review Committee (RRC) reportable patient encounters. Each resident's data is transferred to a single central relational database compatible with Microsoft Access (Microsoft Corporation, Redmond, WA). Patient data entry and subsequent transfer to a central database are accomplished with commercially available software that requires minimal computer expertise to implement and maintain. The central database can then be used for statistical analysis or to create required RRC resident experience reports. As a result, the data collection and transfer process takes less time for residents and program director alike than paper-based or central computer-based systems. The system of collecting resident encounter data using hand-held computers with the Palm Operating System is easy to use, relatively inexpensive, accurate, and secure. The user-friendly system provides prompt, complete, and accurate data, enhancing the education of residents while facilitating the job of the program director.
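The data flow described (per-resident handheld records merged into one central relational database and summarized for RRC reports) can be illustrated with a small sketch. This is not the authors' commercial software; the table and column names are hypothetical.

```python
import sqlite3

def merge_resident_data(central_db_path, resident_rows):
    """Append one resident's synced encounter rows to the central database."""
    con = sqlite3.connect(central_db_path)
    con.execute("""CREATE TABLE IF NOT EXISTS encounters (
                       resident_id TEXT, encounter_date TEXT,
                       procedure_name TEXT, role TEXT)""")
    con.executemany("INSERT INTO encounters VALUES (?, ?, ?, ?)",
                    resident_rows)
    con.commit()
    # an RRC-style experience summary per resident and procedure
    return con.execute("""SELECT resident_id, procedure_name, COUNT(*)
                          FROM encounters
                          GROUP BY resident_id, procedure_name""").fetchall()

rows = [("res01", "2000-09-14", "cesarean delivery", "primary")]
print(merge_resident_data("central.db", rows))
```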
The Influence of Computer-Based Text Editors on the Revision Strategies of Inexperienced Writers.
ERIC Educational Resources Information Center
Collier, Richard M.
A study sought to determine the effect of computer-based text editing on the revision strategies of inexperienced writers. Four subjects, none of whom had experience with computers or word processors, were selected from an introductory college composition course and required to master the basic terminal functions that would be necessary for…
ERIC Educational Resources Information Center
Yilmaz, Nursel; Alici, Sule
2011-01-01
The purpose of this study was to investigate pre-service early childhood teachers' attitudes towards using Computer Based Education (CBE) while implementing science activities. More specifically, the present study examined the effect of different variables such as gender, year in program, experience in preschool, owning a computer, and the…
Linear programming computational experience with onyx
DOE Office of Scientific and Technical Information (OSTI.GOV)
Atrek, E.
1994-12-31
ONYX is a linear programming software package based on an efficient variation of the gradient projection method. When fully configured, it is intended for application to industrial size problems. While the computational experience is limited at the time of this abstract, the technique is found to be robust and competitive with existing methodology in terms of both accuracy and speed. An overview of the approach is presented together with a description of program capabilities, followed by a discussion of up-to-date computational experience with the program. Conclusions include advantages of the approach and envisioned future developments.
Heterogeneous concurrent computing with exportable services
NASA Technical Reports Server (NTRS)
Sunderam, Vaidy
1995-01-01
Heterogeneous concurrent computing, based on the traditional process-oriented model, is approaching its functionality and performance limits. An alternative paradigm, based on the concept of services, supporting data driven computation, and built on a lightweight process infrastructure, is proposed to enhance the functional capabilities and the operational efficiency of heterogeneous network-based concurrent computing. TPVM is an experimental prototype system supporting exportable services, thread-based computation, and remote memory operations that is built as an extension of and an enhancement to the PVM concurrent computing system. TPVM offers a significantly different computing paradigm for network-based computing, while maintaining a close resemblance to the conventional PVM model in the interest of compatibility and ease of transition. Preliminary experiences have demonstrated that the TPVM framework presents a natural yet powerful concurrent programming interface, while being capable of delivering performance improvements of up to thirty percent.
Issues in Text Design and Layout for Computer Based Communications.
ERIC Educational Resources Information Center
Andresen, Lee W.
1991-01-01
Discussion of computer-based communications (CBC) focuses on issues involved with screen design and layout for electronic text, based on experiences with electronic messaging, conferencing, and publishing within the Australian Open Learning Information Network (AOLIN). Recommendations for research on design and layout for printed text are also…
The Impact of Internet-Based Instruction on Teacher Education: The "Paradigm Shift."
ERIC Educational Resources Information Center
Lan, Jiang JoAnn
This study incorporated Internet-based instruction into two education technology courses for preservice teachers. One was a required, undergraduate, beginning-level educational computing course. The other was a graduate, advanced-level computing course. The experiment incorporated Internet-based instruction into course delivery in order to create…
Evaluating the Usage of Cloud-Based Collaboration Services through Teamwork
ERIC Educational Resources Information Center
Qin, Li; Hsu, Jeffrey; Stern, Mel
2016-01-01
With the proliferation of cloud computing for both organizational and educational use, cloud-based collaboration services are transforming how people work in teams. The authors investigated the determinants of the usage of cloud-based collaboration services including teamwork quality, computer self-efficacy, and prior experience, as well as its…
Developing Simulations in Multi-User Virtual Environments to Enhance Healthcare Education
ERIC Educational Resources Information Center
Rogers, Luke
2011-01-01
Computer-based clinical simulations are a powerful teaching and learning tool because of their ability to expand healthcare students' clinical experience by providing practice-based learning. Despite the benefits of traditional computer-based clinical simulations, there are significant issues that arise when incorporating them into a flexible,…
A qualitative study of technophobic students' reactions to a technology-rich college science course
NASA Astrophysics Data System (ADS)
Guttschow, Gena Lee
The use of technology in education has grown rapidly in the last 20 years. In fact, many of today's college students have had some sort of computer in their elementary school classrooms. One might think that this consistent exposure to computers would foster positive attitudes about computers, but this is not always the case. Currently, a substantial number of college students dislike interacting with technology. People who dislike interacting with technology are often referred to as "technophobic". Technophobic people have negative thoughts and feelings about technology, and they often have a desire to avoid interacting with it. Technophobic students' negative feelings about technology have the potential to interfere with their learning when technology is utilized as a tool for instruction of school subjects. As computer use becomes prevalent and in many instances mandatory in education, the issue of technophobia increasingly needs to be understood and addressed. This is a qualitative study designed with the intent of gaining an understanding of the experiences of technophobic students who are required to use technology to learn science in a college class. Six developmental college students enrolled in a computer-based anatomy and physiology class were chosen to participate in the study based on their high technophobia scores. They were interviewed three times during the quarter and videotaped once. The interview data were transcribed, coded, and analyzed. The analysis resulted in six case studies describing each participant's experience and 11 themes representing overlapping areas in the participants' worlds of experience. A discussion of the themes, the meaning they hold for me as a science educator, and how they relate to the existing literature is presented. The participants' descriptions of their experiences showed that the technophobic students did use the computers and learned skills when they had to in order to complete assignments. It was also revealed that the technophobic participants' negative attitudes did not improve after learning computer skills. Lastly, based on the participants' experiences, it seems important to start a class with step-by-step computer training, teaching foundational computer skills, and slowly progress towards autonomous computer exploration.
Conway, J; Sharkey, R
2002-10-01
The Faculty of Nursing, University of Newcastle, Australia, has been keen to initiate strategies that enhance student learning and nursing practice. Two strategies are problem based learning (PBL) and clinical practice. The Faculty has maintained a comparatively high proportion of the undergraduate hours in the clinical setting in times when financial constraints suggest that simulations and on-campus laboratory experiences may be less expensive. Increasingly, computer based technologies are becoming sufficiently refined to support the exploration of nursing practice in a non-traditional lecture/tutorial environment. In 1998, a group of faculty members proposed that computer mediated instruction would provide an opportunity for partnership between students, academics and clinicians that would promote more positive outcomes for all and maintain the integrity of the PBL approach. This paper discusses the similarities between problem based and practice based learning and presents the findings of an evaluative study of the implementation of a practice based learning model that uses computer mediated communication to promote integration of practice experiences with the broader goals of the undergraduate curriculum.
Plant, Richard R
2016-03-01
There is an ongoing 'replication crisis' across the field of psychology, in which researchers, funders, and members of the public are questioning the results of some scientific studies and the validity of the data they are based upon. However, few have considered that a growing proportion of research in modern psychology is conducted using a computer. Could the hardware and software, or experiment generator, used to run an experiment itself be a cause of millisecond timing error and subsequent replication failure? This article serves as a reminder that millisecond timing accuracy in psychology studies remains an important issue and that care needs to be taken to ensure that studies can be replicated on current computer hardware and software.
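One low-level check this concern suggests is probing how fine-grained the machine's clock actually is before trusting millisecond stimulus or response timestamps. A minimal sketch follows; it probes the clock alone, not display refresh or input-device latency, which also matter for the timing errors the article discusses.

```python
import time

def clock_increments(n=100_000):
    """Collect the smallest observable steps of the high-resolution timer."""
    deltas = []
    prev = time.perf_counter()
    for _ in range(n):
        now = time.perf_counter()
        if now != prev:
            deltas.append(now - prev)
            prev = now
    return min(deltas), sum(deltas) / len(deltas)

smallest, mean = clock_increments()
print(f"smallest tick: {smallest * 1e6:.2f} us, mean: {mean * 1e6:.2f} us")
```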
NASA Technical Reports Server (NTRS)
Marchese, Anthony J.; Dryer, Frederick L.
1997-01-01
This program supports the engineering design, data analysis, and data interpretation requirements for the study of initially single component, spherically symmetric, isolated droplet combustion studies. Experimental emphasis is on the study of simple alcohols (methanol, ethanol) and alkanes (n-heptane, n-decane) as fuels with time dependent measurements of drop size, flame-stand-off, liquid-phase composition, and finally, extinction. Experiments have included bench-scale studies at Princeton, studies in the 2.2 and 5.18 drop towers at NASA-LeRC, and both the Fiber Supported Droplet Combustion (FSDC-1, FSDC-2) and the free Droplet Combustion Experiment (DCE) studies aboard the shuttle. Test matrix and data interpretation are performed through spherically-symmetric, time-dependent numerical computations which embody detailed sub-models for physical and chemical processes. The computed burning rate, flame stand-off, and extinction diameter are compared with the respective measurements for each individual experiment. In particular, the data from FSDC-1 and subsequent space-based experiments provide the opportunity to compare all three types of data simultaneously with the computed parameters. Recent numerical efforts are extending the computational tools to consider time dependent, axisymmetric 2-dimensional reactive flow situations.
Mairesse, Olivier; Hofmans, Joeri; Theuns, Peter
2008-05-01
We propose a free, easy-to-use computer program that does not require prior knowledge of computer programming to generate and run experiments using textual or pictorial stimuli. Although the FM Experiment Builder suite was initially programmed for building and conducting FM experiments, it can also be applied to non-FM experiments that necessitate randomized, single, or multifactorial designs. The program is highly configurable, allowing multilingual use and a wide range of different response formats. The outputs of the experiments are Microsoft Excel compatible .xls files that allow easy copy-paste of the results into Weiss's FM CalSTAT program (2006) or any other statistical package. Its Java-based structure is compatible with both Windows and Macintosh operating systems, and its compactness (< 1 MB) makes it easily distributable over the Internet.
A Weibull distribution accrual failure detector for cloud computing.
Liu, Jiaxi; Wu, Zhibo; Wu, Jin; Dong, Jian; Zhao, Yao; Wen, Dongxin
2017-01-01
Failure detectors are a fundamental component for building high-availability distributed systems. To meet the requirements of complicated large-scale distributed systems, accrual failure detectors that can adapt to multiple applications have been studied extensively. However, several implementations of accrual failure detectors do not adapt well to the cloud service environment. To solve this problem, a new accrual failure detector based on the Weibull distribution, called the Weibull Distribution Failure Detector, has been proposed specifically for cloud computing. It can adapt to the dynamic and unexpected network conditions in cloud computing. The performance of the Weibull Distribution Failure Detector is evaluated and compared based on public classical experiment data and cloud computing experiment data. The results show that the Weibull Distribution Failure Detector has better performance in terms of speed and accuracy in unstable scenarios, especially in cloud computing.
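The paper's detector is not reproduced here, but the accrual idea with a Weibull model can be sketched: estimate the Weibull scale from observed heartbeat inter-arrival times and report a suspicion level phi, the negative log10 of the probability that the current silence is still consistent with a live process. The fixed shape parameter and the method-of-moments fit below are simplifying assumptions, not the paper's estimation procedure.

```python
import math

class WeibullAccrualDetector:
    def __init__(self, shape=1.5):            # assumed fixed Weibull shape k
        self.k = shape
        self.samples = []

    def heartbeat(self, interval):
        self.samples.append(interval)         # observed inter-arrival time, s

    def phi(self, t_since_last):
        mean = sum(self.samples) / len(self.samples)
        lam = mean / math.gamma(1 + 1 / self.k)    # Weibull scale from mean
        survival = math.exp(-(t_since_last / lam) ** self.k)
        return -math.log10(max(survival, 1e-300))

det = WeibullAccrualDetector()
for dt in (1.0, 1.1, 0.9, 1.2):               # seconds between heartbeats
    det.heartbeat(dt)
print(det.phi(0.5))                           # low suspicion
print(det.phi(3.0))                           # high suspicion of failure
```

An application then declares failure when phi crosses a threshold of its own choosing, which is what lets one accrual detector serve applications with different timeliness requirements.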
Computer-aided visualization and analysis system for sequence evaluation
Chee, M.S.
1998-08-18
A computer system for analyzing nucleic acid sequences is provided. The computer system is used to perform multiple methods for determining unknown bases by analyzing the fluorescence intensities of hybridized nucleic acid probes. The results of individual experiments are improved by processing nucleic acid sequences together. Comparative analysis of multiple experiments is also provided by displaying reference sequences in one area and sample sequences in another area on a display device. 27 figs.
Computer-aided visualization and analysis system for sequence evaluation
Chee, Mark S.; Wang, Chunwei; Jevons, Luis C.; Bernhart, Derek H.; Lipshutz, Robert J.
2004-05-11
A computer system for analyzing nucleic acid sequences is provided. The computer system is used to perform multiple methods for determining unknown bases by analyzing the fluorescence intensities of hybridized nucleic acid probes. The results of individual experiments are improved by processing nucleic acid sequences together. Comparative analysis of multiple experiments is also provided by displaying reference sequences in one area and sample sequences in another area on a display device.
Computer-aided visualization and analysis system for sequence evaluation
Chee, Mark S.
1998-08-18
A computer system for analyzing nucleic acid sequences is provided. The computer system is used to perform multiple methods for determining unknown bases by analyzing the fluorescence intensities of hybridized nucleic acid probes. The results of individual experiments are improved by processing nucleic acid sequences together. Comparative analysis of multiple experiments is also provided by displaying reference sequences in one area and sample sequences in another area on a display device.
Computer-aided visualization and analysis system for sequence evaluation
Chee, Mark S.
2003-08-19
A computer system for analyzing nucleic acid sequences is provided. The computer system is used to perform multiple methods for determining unknown bases by analyzing the fluorescence intensities of hybridized nucleic acid probes. The results of individual experiments may be improved by processing nucleic acid sequences together. Comparative analysis of multiple experiments is also provided by displaying reference sequences in one area and sample sequences in another area on a display device.
CSNS computing environment Based on OpenStack
NASA Astrophysics Data System (ADS)
Li, Yakang; Qi, Fazhi; Chen, Gang; Wang, Yanming; Hong, Jianshu
2017-10-01
Cloud computing allows for more flexible configuration of IT resources and optimized hardware utilization, and it can provide computing services according to real need. We are applying this computing mode to the China Spallation Neutron Source (CSNS) computing environment. Firstly, the CSNS experiment and its computing scenarios and requirements are introduced in this paper. Secondly, the design and practice of a cloud computing platform based on OpenStack are demonstrated from the aspects of the cloud computing system framework, network, storage and so on. Thirdly, some improvements we made to OpenStack are discussed further. Finally, the current status of the CSNS cloud computing environment is summarized at the end of this paper.
Inquiry-Based Learning in Remote Sensing: A Space Balloon Educational Experiment
ERIC Educational Resources Information Center
Mountrakis, Giorgos; Triantakonstantis, Dimitrios
2012-01-01
Teaching remote sensing in higher education has been traditionally restricted in lecture and computer-aided laboratory activities. This paper presents and evaluates an engaging inquiry-based educational experiment. The experiment was incorporated in an introductory remote sensing undergraduate course to bridge the gap between theory and…
ERIC Educational Resources Information Center
Pike, Ronald E.; Pittman, Jason M.; Hwang, Drew
2017-01-01
This paper investigates the use of a cloud computing environment to facilitate the teaching of web development at a university in the Southwestern United States. A between-subjects study of students in a web development course was conducted to assess the merits of a cloud computing environment instead of personal computers for developing websites.…
Project-Based Teaching-Learning Computer-Aided Engineering Tools
ERIC Educational Resources Information Center
Simoes, J. A.; Relvas, C.; Moreira, R.
2004-01-01
Computer-aided design, computer-aided manufacturing, computer-aided analysis, reverse engineering and rapid prototyping are tools that play an important key role within product design. These are areas of technical knowledge that must be part of engineering and industrial design courses' curricula. This paper describes our teaching experience of…
ERIC Educational Resources Information Center
Maola, Joseph; Kane, Gary
1976-01-01
Subjects, who were Occupational Work Experience students, were randomly assigned to individual guidance from either a computerized occupational information system, to a counselor-based information system or to a control group. Results demonstrate a hierarchical learning effect: The computer group learned more than the counseled group, which…
Computer Assisted Thermography And Its Application In Ovulation Detection
NASA Astrophysics Data System (ADS)
Rao, K. H.; Shah, A. V.
1984-08-01
Hardware and software of a computer-assisted image analyzing system used for infrared images in medical applications are discussed. The application of computer-assisted thermography (CAT) as a complementary diagnostic tool in centralized diagnostic management is proposed. The authors adopted computer-assisted thermography to study physiological changes in the breasts related to the hormones characterizing the menstrual cycle of a woman. Based on clinical experiments followed by thermal image analysis, they suggest that the 'differential skin temperature' (DST) be measured to detect the fertility interval in the menstrual cycle of a woman.
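The abstract leaves DST undefined beyond its name; on one plausible reading (a difference between region-averaged skin temperatures extracted from the thermal images), the computation is elementary. All numbers and region choices below are fabricated placeholders.

```python
import numpy as np

roi_a = np.array([[34.1, 34.3], [34.0, 34.2]])   # deg C, region of interest A
roi_b = np.array([[33.5, 33.6], [33.4, 33.7]])   # deg C, region of interest B
dst = float(roi_a.mean() - roi_b.mean())
print(f"DST = {dst:.2f} deg C")
```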
1997-02-01
…application with a strong resemblance to a video game, concern has been raised that prior video game experience might have a moderating effect on scores. Much … such as spatial ability. The effects of computer or video game experience on work sample scores have not been systematically investigated. The purpose of this study was to evaluate the incremental validity of prior video game experience over that of general aptitude as a predictor of work sample test…
Wilkinson, Ann; While, Alison E; Roberts, Julia
2009-04-01
This paper is a report of a review to describe and discuss the psychometric properties of instruments used in healthcare education settings measuring experience and attitudes of healthcare students regarding their information and communication technology skills and their use of computers and the Internet for education. Healthcare professionals are expected to be computer and information literate at registration. A previous review of evaluative studies of computer-based learning suggests that methods of measuring learners' attitudes to computers and computer aided learning are problematic. A search of eight health and social science databases located 49 papers, the majority published between 1995 and January 2007, focusing on the experience and attitudes of students in the healthcare professions towards computers and e-learning. An integrative approach was adopted, with narrative description of findings. Criteria for inclusion were quantitative studies using survey tools with samples of healthcare students and concerning computer and information literacy skills, access to computers, experience with computers and use of computers and the Internet for education purposes. Since the 1980s a number of instruments have been developed, mostly in the United States of America, to measure attitudes to computers, anxiety about computer use, information and communication technology skills, satisfaction and more recently attitudes to the Internet and computers for education. The psychometric properties are poorly described. Advances in computers and technology mean that many earlier tools are no longer valid. Measures of the experience and attitudes of healthcare students to the increased use of e-learning require development in line with computer and technology advances.
Spacecraft crew procedures from paper to computers
NASA Technical Reports Server (NTRS)
Oneal, Michael; Manahan, Meera
1991-01-01
Described here is a research project that uses human factors and computer systems knowledge to explore and help guide the design and creation of an effective Human-Computer Interface (HCI) for spacecraft crew procedures. By having a computer system behind the user interface, it is possible to have increased procedure automation, related system monitoring, and personalized annotation and help facilities. The research project includes the development of computer-based procedure system HCI prototypes and a testbed for experiments that measure the effectiveness of HCI alternatives in order to make design recommendations. The testbed will include a system for procedure authoring, editing, training, and execution. Progress on developing HCI prototypes for a middeck experiment performed on Space Shuttle Mission STS-34 and for upcoming medical experiments is discussed. The status of the experimental testbed is also discussed.
Comfort and experience with online learning: trends over nine years and associations with knowledge.
Cook, David A; Thompson, Warren G
2014-07-01
Some evidence suggests that attitude toward computer-based instruction is an important determinant of success in online learning. We sought to determine how comfort using computers and perceptions of prior online learning experiences have changed over the past decade, and how these associate with learning outcomes. Each year from 2003-2011 we conducted a prospective trial of online learning. As part of each year's study, we asked medicine residents about their comfort using computers and if their previous experiences with online learning were favorable. We assessed knowledge using a multiple-choice test. We used regression to analyze associations and changes over time. 371 internal medicine and family medicine residents participated. Neither comfort with computers nor perceptions of prior online learning experiences showed a significant change across years (p > 0.61), with mean comfort rating 3.96 (maximum 5 = very comfortable) and mean experience rating 4.42 (maximum 6 = strongly agree [favorable]). Comfort showed no significant association with knowledge scores (p = 0.39) but perceptions of prior experiences did, with a 1.56% rise in knowledge score for a 1-point rise in experience score (p = 0.02). Correlations among comfort, perceptions of prior experiences, and number of prior experiences were all small and not statistically significant. Comfort with computers and perceptions of prior experience with online learning remained stable over nine years. Prior good experiences (but not comfort with computers) demonstrated a modest association with knowledge outcomes, suggesting that prior course satisfaction may influence subsequent learning.
Benchmarking gate-based quantum computers
NASA Astrophysics Data System (ADS)
Michielsen, Kristel; Nocon, Madita; Willsch, Dennis; Jin, Fengping; Lippert, Thomas; De Raedt, Hans
2017-11-01
With the advent of public access to small gate-based quantum processors, it becomes necessary to develop a benchmarking methodology such that independent researchers can validate the operation of these processors. We explore the usefulness of a number of simple quantum circuits as benchmarks for gate-based quantum computing devices and show that circuits performing identity operations are very simple, scalable and sensitive to gate errors and are therefore very well suited for this task. We illustrate the procedure by presenting benchmark results for the IBM Quantum Experience, a cloud-based platform for gate-based quantum computing.
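The identity-circuit benchmark lends itself to a compact illustration: repeated X-X pairs should leave a qubit in |0>, so any |1> outcomes on hardware expose accumulated gate error. Below is a sketch under the assumption that the qiskit and qiskit-aer packages are installed; the local simulator is noiseless, so the interesting counts appear only on a real device, and this is not the authors' exact benchmark suite.

```python
from qiskit import QuantumCircuit, transpile
from qiskit_aer import AerSimulator

qc = QuantumCircuit(1, 1)
for _ in range(10):          # X followed by X is the identity
    qc.x(0)
    qc.x(0)
qc.measure(0, 0)

sim = AerSimulator()
counts = sim.run(transpile(qc, sim), shots=4096).result().get_counts()
print(counts)                # ideally all '0'; '1' counts signal gate errors
```

Because the ideal outcome is known exactly and the depth scales trivially with the number of gate pairs and qubits, this construction has the scalability and error sensitivity the authors highlight.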
Computer-aided visualization and analysis system for sequence evaluation
Chee, Mark S.
1999-10-26
A computer system (1) for analyzing nucleic acid sequences is provided. The computer system is used to perform multiple methods for determining unknown bases by analyzing the fluorescence intensities of hybridized nucleic acid probes. The results of individual experiments may be improved by processing nucleic acid sequences together. Comparative analysis of multiple experiments is also provided by displaying reference sequences in one area (814) and sample sequences in another area (816) on a display device (3).
Computer-aided visualization and analysis system for sequence evaluation
Chee, Mark S.
2001-06-05
A computer system (1) for analyzing nucleic acid sequences is provided. The computer system is used to perform multiple methods for determining unknown bases by analyzing the fluorescence intensities of hybridized nucleic acid probes. The results of individual experiments may be improved by processing nucleic acid sequences together. Comparative analysis of multiple experiments is also provided by displaying reference sequences in one area (814) and sample sequences in another area (816) on a display device (3).
A Computer Simulation of Community Pharmacy Practice for Educational Use.
Bindoff, Ivan; Ling, Tristan; Bereznicki, Luke; Westbury, Juanita; Chalmers, Leanne; Peterson, Gregory; Ollington, Robert
2014-11-15
To provide a computer-based learning method for pharmacy practice that is as effective as paper-based scenarios, but more engaging and less labor-intensive. We developed a flexible and customizable computer simulation of community pharmacy. Using it, the students would be able to work through scenarios which encapsulate the entirety of a patient presentation. We compared the traditional paper-based teaching method to our computer-based approach using equivalent scenarios. The paper-based group had 2 tutors while the computer group had none. Both groups were given a prescenario and postscenario clinical knowledge quiz and survey. Students in the computer-based group had generally greater improvements in their clinical knowledge score, and third-year students using the computer-based method also showed more improvements in history taking and counseling competencies. Third-year students also found the simulation fun and engaging. Our simulation of community pharmacy provided an educational experience as effective as the paper-based alternative, despite the lack of a human tutor.
Challenges to Software/Computing for Experimentation at the LHC
NASA Astrophysics Data System (ADS)
Banerjee, Sunanda
The demands that future high energy physics experiments place on software and computing have led the experiments to plan the related activities as full-fledged projects and to investigate new methodologies and languages to meet the challenges. The paths taken by the four LHC experiments ALICE, ATLAS, CMS and LHCb are coherently put together in an LHC-wide framework based on Grid technology. The current status and understandings are broadly outlined.
ERIC Educational Resources Information Center
Cree, George S.; McNorgan, Chris; McRae, Ken
2006-01-01
The authors present data from 2 feature verification experiments designed to determine whether distinctive features have a privileged status in the computation of word meaning. They use an attractor-based connectionist model of semantic memory to derive predictions for the experiments. Contrary to central predictions of the conceptual structure…
ERIC Educational Resources Information Center
Perram, John W.; Andersen, Morten; Ellekilde, Lars-Peter; Hjorth, Poul G.
2004-01-01
This paper discusses experience with alternative assessment strategies for an introductory course in dynamical systems, where the use of computer algebra and calculus is fully integrated into the learning process, so that the standard written examination would not be appropriate. Instead, students' competence was assessed by grading three large…
The Laboratory-Based Economics Curriculum.
ERIC Educational Resources Information Center
King, Paul G.; LaRoe, Ross M.
1991-01-01
Describes the liberal arts, computer laboratory-based economics program at Denison University (Ohio). Includes as goals helping students to (1) understand deductive arguments, (2) learn to apply theory in real-world situations, and (3) test and modify theory when necessary. Notes that the program combines computer laboratory experiments for…
Computer Mediated Communication: Online Instruction and Interactivity.
ERIC Educational Resources Information Center
Lavooy, Maria J.; Newlin, Michael H.
2003-01-01
Explores the different forms and potential applications of computer mediated communication (CMC) for Web-based and Web-enhanced courses. Based on their experiences with three different Web courses (Research Methods in Psychology, Statistical Methods in Psychology, and Basic Learning Processes) taught repeatedly over the last five years, the…
ComputerTown. A Do-It-Yourself Community Computer Project.
ERIC Educational Resources Information Center
Loop, Liza; And Others
This manual, based on Menlo Park's experiences in participating in a nationwide experimental computer literacy project, provides guidelines for the development and management of a ComputerTown. This project, which was begun in September 1981 in the Menlo Park Public Library, concentrates on providing public access to microcomputers through hands-on…
ERIC Educational Resources Information Center
Blandford, A. E.; Smith, P. R.
1986-01-01
Describes the style of design of computer simulations developed by Computer Assisted Teaching Unit at Queen Mary College with reference to user interface, input and initialization, input data vetting, effective display screen use, graphical results presentation, and need for hard copy. Procedures and problems relating to academic involvement are…
ERIC Educational Resources Information Center
Kim, Jieun; Ryu, Hokyoung; Katuk, Norliza; Wang, Ruili; Choi, Gyunghyun
2014-01-01
The present study aims to show if a skill-challenge balancing (SCB) instruction strategy can assist learners to motivationally engage in computer-based learning. Csikszentmihalyi's flow theory (self-control, curiosity, focus of attention, and intrinsic interest) was applied to an account of the optimal learning experience in SCB-based learning…
NASA Astrophysics Data System (ADS)
Sisodia, Mitali; Shukla, Abhishek; Pathak, Anirban
2017-12-01
A scheme for distributed quantum measurement that allows nondestructive or indirect Bell measurement was proposed by Gupta et al. [1]. In the present work, Gupta et al.'s scheme is experimentally realized using the five-qubit superconductivity-based quantum computer that IBM Corporation has recently placed in the cloud. The experiment confirmed that the Bell state can be constructed and measured in a nondestructive manner with reasonably high fidelity. A comparison of the outcomes of this study with the results obtained earlier in an NMR-based experiment (Samal et al. (2010) [10]) has also been performed. The study indicates that to build a scalable SQUID-based quantum computer, errors introduced by the gates (in the present technology) have to be reduced considerably.
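The nondestructive measurement at issue amounts to reading the Bell state's stabilizers (ZZ and XX) through ancilla qubits so that the pair itself is never measured. Below is a textbook stabilizer sketch of that idea in Qiskit; it is illustrative, and the circuit Gupta et al. propose and the one run on the IBM device may differ in detail.

```python
from qiskit import QuantumCircuit

qc = QuantumCircuit(4, 2)        # q0,q1: Bell pair; q2,q3: ancillas
qc.h(0)
qc.cx(0, 1)                      # prepare |Phi+> on q0,q1

qc.cx(0, 2)                      # ZZ stabilizer (parity) onto ancilla q2
qc.cx(1, 2)

qc.h(3)                          # XX stabilizer (phase) onto ancilla q3
qc.cx(3, 0)
qc.cx(3, 1)
qc.h(3)

qc.measure(2, 0)                 # ancilla outcomes identify the Bell state;
qc.measure(3, 1)                 # q0 and q1 themselves are never measured
```

For |Phi+> both ancilla readouts are 0, and the four possible outcome pairs distinguish the four Bell states while leaving the pair available for further use.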
Designing for deeper learning in a blended computer science course for middle school students
NASA Astrophysics Data System (ADS)
Grover, Shuchi; Pea, Roy; Cooper, Stephen
2015-04-01
The focus of this research was to create and test an introductory computer science course for middle school. Titled "Foundations for Advancing Computational Thinking" (FACT), the course aims to prepare and motivate middle school learners for future engagement with algorithmic problem solving. FACT was also piloted as a seven-week course on Stanford's OpenEdX MOOC platform for blended in-class learning. Unique aspects of FACT include balanced pedagogical designs that address the cognitive, interpersonal, and intrapersonal aspects of "deeper learning"; a focus on pedagogical strategies for mediating and assessing for transfer from block-based to text-based programming; curricular materials for remedying misperceptions of computing; and "systems of assessments" (including formative and summative quizzes and tests, directed as well as open-ended programming assignments, and a transfer test) to get a comprehensive picture of students' deeper computational learning. Empirical investigations, accomplished over two iterations of a design-based research effort with students (aged 11-14 years) in a public school, sought to examine student understanding of algorithmic constructs, and how well students transferred this learning from Scratch to text-based languages. Changes in student perceptions of computing as a discipline were measured. Results and mixed-method analyses revealed that students in both studies (1) achieved substantial learning gains in algorithmic thinking skills, (2) were able to transfer their learning from Scratch to a text-based programming context, and (3) achieved significant growth toward a more mature understanding of computing as a discipline. Factor analyses of prior computing experience, multivariate regression analyses, and qualitative analyses of student projects and artifact-based interviews were conducted to better understand the factors affecting learning outcomes. Prior computing experiences (as measured by a pretest) and math ability were found to be strong predictors of learning outcomes.
Accelerated Application Development: The ORNL Titan Experience
Joubert, Wayne; Archibald, Richard K.; Berrill, Mark A.; ...
2015-05-09
The use of computational accelerators such as NVIDIA GPUs and Intel Xeon Phi processors is now widespread in the high performance computing community, with many applications delivering impressive performance gains. However, programming these systems for high performance, performance portability and software maintainability has been a challenge. In this paper we discuss experiences porting applications to the Titan system. Titan, which began planning in 2009 and was deployed for general use in 2013, was the first multi-petaflop system based on accelerator hardware. To ready applications for accelerated computing, a preparedness effort was undertaken prior to delivery of Titan. In this paper we report experiences and lessons learned from this process and describe how users are currently making use of computational accelerators on Titan.
Designing for Learner Engagement with Computer Based Testing
ERIC Educational Resources Information Center
Walker, Richard; Handley, Zoe
2016-01-01
The issues influencing student engagement with high-stakes computer-based exams were investigated, drawing on feedback from two cohorts of international MA Education students encountering this assessment method for the first time. Qualitative data from surveys and focus groups on the students' examination experience were analysed, leading to the…
Automated Content Synthesis for Interactive Remote Instruction.
ERIC Educational Resources Information Center
Maly, K.; Overstreet, C. M.; Gonzalez, A.; Denbar, M. L.; Cutaran, R.; Karunaratne, N.
This paper describes IRI (Interactive Remote Instruction), a computer-based system built at Old Dominion University (Virginia) in order to support distance education. The system is based on the concept of a virtual classroom where students at different locations have the same synchronous class experience, using networked computers to communicate…
DOT National Transportation Integrated Search
2007-08-01
This research was conducted to develop and test a personal computer-based study procedure (PCSP) with secondary task loading for use in human factors laboratory experiments in lieu of a driving simulator to test reading time and understanding of traf...
A Framework for the Specification of the Semantics and the Dynamics of Instructional Applications
ERIC Educational Resources Information Center
Buendia-Garcia, Felix; Diaz, Paloma
2003-01-01
An instructional application consists of a set of resources and activities to implement interacting, interrelated, and structured experiences oriented towards achieving specific educational objectives. The development of computer-based instructional applications has to follow a well defined process, so models for computer-based instructional…
Tensor methodology and computational geometry in direct computational experiments in fluid mechanics
NASA Astrophysics Data System (ADS)
Degtyarev, Alexander; Khramushin, Vasily; Shichkina, Julia
2017-07-01
The paper considers a generalized functional and algorithmic construction of direct computational experiments in fluid dynamics. The notation of tensor mathematics is naturally embedded in the finite-element operations used to construct the numerical schemes. A large fluid particle, which has a finite size, its own weight, and internal displacement and deformation, is considered as an elementary computational object. The tensor representation of computational objects yields a straightforward, linear, and unique approximation of elementary volumes and the fluid particles inside them. The proposed approach allows the use of explicit numerical schemes, which is an important condition for increasing the efficiency of the developed algorithms through numerical procedures with natural parallelism. Its advantages are achieved, among other things, by representing the motion of large particles of a continuous medium in dual coordinate systems and by computing operations in the projections of these two coordinate systems with direct and inverse transformations. A new method for the mathematical representation and synthesis of computational experiments based on the large-particle method is thus proposed.
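The dual-coordinate bookkeeping the abstract alludes to can be shown in miniature: push a global vector into the coordinates of a non-orthogonal cell basis with the inverse transformation and recover it with the direct one. This is only the transformation device, not the authors' fluid scheme; the basis below is arbitrary.

```python
import numpy as np

B = np.array([[1.0, 0.2, 0.0],
              [0.0, 1.0, 0.1],
              [0.0, 0.0, 1.0]])   # columns: basis vectors of a deformed cell

v = np.array([0.3, -1.0, 2.0])    # vector in global Cartesian coordinates
v_cell = np.linalg.solve(B, v)    # inverse transform: global -> cell
v_back = B @ v_cell               # direct transform: cell -> global
assert np.allclose(v, v_back)
```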
Situation awareness and trust in computer-based procedures in nuclear power plant operations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Throneburg, E. B.; Jones, J. M.
2006-07-01
Situation awareness and trust are two issues that need to be addressed in the design of computer-based procedures for nuclear power plants. Situation awareness, in relation to computer-based procedures, concerns the operators' knowledge of the plant's state while following the procedures. Trust concerns the amount of faith that the operators put into the automated procedures, which can affect situation awareness. This paper first discusses the advantages and disadvantages of computer-based procedures. It then discusses the known aspects of situation awareness and trust as applied to computer-based procedures in nuclear power plants. An outline of a proposed experiment is then presented that includes methods of measuring situation awareness and trust so that these aspects can be analyzed for further study. (authors)
ERIC Educational Resources Information Center
Deryakulu, Deniz; Olkun, Sinan
2009-01-01
This study examined Turkish computer teachers' professional memories telling of their experiences with school administrators and supervisors. Seventy-four computer teachers participated in the study. Content analysis of the memories revealed that the most frequently mentioned themes concerning school administrators were "unsupportive…
Web-Based Job Submission Interface for the GAMESS Computational Chemistry Program
ERIC Educational Resources Information Center
Perri, M. J.; Weber, S. H.
2014-01-01
A Web site is described that facilitates use of the free computational chemistry software: General Atomic and Molecular Electronic Structure System (GAMESS). Its goal is to provide an opportunity for undergraduate students to perform computational chemistry experiments without the need to purchase expensive software.
ERIC Educational Resources Information Center
Teo, T.; Lee, C. B.; Chai, C. S.
2008-01-01
Computers are increasingly widespread, influencing many aspects of our social and work lives. As we move into a technology-based society, it is important that classroom experiences with computers are made available for all students. The purpose of this study is to examine pre-service teachers' attitudes towards computers. This study extends the…
ERIC Educational Resources Information Center
Kononets, Natalia
2015-01-01
This article focuses on the introduction of resource-based learning in the computer-cycle disciplines of an agrarian college. It tests an approach to creating e-learning resources on free hosting services and their further use in the classroom. It notes that the…
A validation of well-being and happiness surveys for administration via the Internet.
Howell, Ryan T; Rodzon, Katrina S; Kurai, Mark; Sanchez, Amy H
2010-08-01
Internet research is appealing because it is a cost- and time-efficient way to access a large number of participants; however, the validity of Internet research for important subjective well-being (SWB) surveys has not been adequately assessed. The goal of the present study was to validate the Satisfaction With Life Scale (SWLS; Diener, Emmons, Larsen, & Griffin, 1985), the Positive and Negative Affect Schedule (PANAS-X; Watson & Clark, 1994), and the Subjective Happiness Scale (SHS; Lyubomirsky & Lepper, 1999) for use on the Internet. This study compared the quality of data collected using paper-based (paper-and-pencil version in a lab setting), computer-based (Web-based version in a lab setting), and Internet (Web-based version on a computer of the participant's choosing) surveys for these three measures of SWB. The paper-based and computer-based experiment recruited two college student samples; the Internet experiments recruited a college student sample and an adult sample responding to ads on different social-networking Web sites. This study provides support for the reliability, validity, and generalizability of the Internet format of the SWLS, PANAS-X, and SHS. Across the three experiments, the results indicate that the computer-based and Internet surveys had means, standard deviations, reliabilities, and factor structures that were similar to those of the paper-based versions. The discussion examines the difficulty of higher attrition for the Internet version, the need to examine reverse-coded items in the future, and the possibility that unhappy individuals are more likely to participate in Internet surveys of SWB.
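Reliability checks of the kind this validation rests on are straightforward to compute. Below is a sketch of Cronbach's alpha for a five-item, 1-7 scale such as the SWLS; the response matrix is fabricated for shape only and is not study data.

```python
import numpy as np

def cronbach_alpha(items):
    """items: respondents x items matrix of scale responses."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()     # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)      # variance of scale totals
    return k / (k - 1) * (1 - item_var / total_var)

rng = np.random.default_rng(0)
trait = rng.normal(4, 1, size=(100, 1))            # shared component
resp = np.clip(np.round(trait + rng.normal(0, 0.7, size=(100, 5))), 1, 7)
print(round(cronbach_alpha(resp), 2))
```

Comparing alpha (and factor structure) across paper-based, computer-based, and Internet administrations is essentially the comparison the study reports.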
DSMC Simulations of Hypersonic Flows and Comparison With Experiments
NASA Technical Reports Server (NTRS)
Moss, James N.; Bird, Graeme A.; Markelov, Gennady N.
2004-01-01
This paper presents computational results obtained with the direct simulation Monte Carlo (DSMC) method for several biconic test cases in which shock interactions and flow separation-reattachment are key features of the flow. Recent ground-based experiments have been performed for several biconic configurations, and surface heating rate and pressure measurements have been proposed for code validation studies. The present focus is to expand on current validation activities for a relatively new DSMC code called DS2V, developed by Bird (the second author). Comparisons with experiments and other computations help clarify the agreement currently being achieved between computations and experiments and identify the range of measurement variability of the proposed validation data when benchmarked against the current computations. For the test cases with significant vibrational nonequilibrium, the effect of the vibrational energy surface accommodation on heating and other quantities is demonstrated.
Wind-tunnel based definition of the AFE aerothermodynamic environment. [Aeroassist Flight Experiment
NASA Technical Reports Server (NTRS)
Miller, Charles G.; Wells, W. L.
1992-01-01
The Aeroassist Flight Experiment (AFE), scheduled to be performed in 1994, will serve as a precursor for aeroassisted space transfer vehicles (ASTV's) and is representative of entry concepts being considered for missions to Mars. Rationale for the AFE is reviewed briefly, as are the various experiments carried aboard the vehicle. The approach used to determine hypersonic aerodynamic and aerothermodynamic characteristics over a wide range of simulation parameters in ground-based facilities is presented. Facilities, instrumentation and test procedures employed in the establishment of the database are discussed. Measurements illustrating the effects of hypersonic simulation parameters, particularly normal-shock density ratio (an important parameter for hypersonic blunt bodies), and attitude on aerodynamic and aerothermodynamic characteristics are presented, and predictions from computational fluid dynamics (CFD) computer codes are compared with measurements.
Zhao, Ming; Rattanatamrong, Prapaporn; DiGiovanna, Jack; Mahmoudi, Babak; Figueiredo, Renato J; Sanchez, Justin C; Príncipe, José C; Fortes, José A B
2008-01-01
Dynamic data-driven brain-machine interfaces (DDDBMI) have great potential to advance the understanding of neural systems and improve the design of brain-inspired rehabilitative systems. This paper presents a novel cyberinfrastructure that couples in vivo neurophysiology experimentation with massive computational resources to provide seamless and efficient support of DDDBMI research. Closed-loop experiments can be conducted with in vivo data acquisition, reliable network transfer, parallel model computation, and real-time robot control. Behavioral experiments with live animals are supported with real-time guarantees. Offline studies can be performed with various configurations for extensive analysis and training. A Web-based portal is also provided to allow users to conveniently interact with the cyberinfrastructure, conducting both experimentation and analysis. New motor control models are developed based on this approach, including recursive least-squares based (RLS) and reinforcement learning based (RLBMI) algorithms. The results from an online RLBMI experiment show that the cyberinfrastructure can successfully support DDDBMI experiments and meet the desired real-time requirements.
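The recursive least-squares (RLS) decoder mentioned above can be illustrated with a minimal sketch; the update below is the textbook RLS recursion, not the authors' implementation, and all names and constants are illustrative.

```python
import numpy as np

def rls_update(w, P, x, d, lam=0.99):
    """One recursive least-squares step: adapt the weight vector w so
    that w @ x tracks the desired response d; lam is the forgetting
    factor and P the running inverse input-correlation estimate."""
    Px = P @ x
    k = Px / (lam + x @ Px)            # gain vector
    e = d - w @ x                      # a priori prediction error
    w = w + k * e
    P = (P - np.outer(k, Px)) / lam
    return w, P

# Toy usage: recover a 3-tap linear mapping from noisy samples.
rng = np.random.default_rng(0)
w_true = np.array([0.5, -1.0, 2.0])
w, P = np.zeros(3), 100.0 * np.eye(3)
for _ in range(500):
    x = rng.standard_normal(3)
    d = w_true @ x + 0.01 * rng.standard_normal()
    w, P = rls_update(w, P, x, d)
print(w)   # close to w_true
```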
ERIC Educational Resources Information Center
Shen, Pei-Di; Lee, Tsang-Hsiung; Tsai, Chia-Wen
2007-01-01
Contrary to conventional expectations, the reality of computing education in Taiwan's vocational schools is not so practically oriented, and thus reveals much room for improvement. In this context, we conducted a quasi-experiment to examine the effects of applying web-based problem-based learning (PBL), web-based self-regulated learning (SRL), and…
Introducing Cloud Computing Topics in Curricula
ERIC Educational Resources Information Center
Chen, Ling; Liu, Yang; Gallagher, Marcus; Pailthorpe, Bernard; Sadiq, Shazia; Shen, Heng Tao; Li, Xue
2012-01-01
The demand for graduates with exposure in Cloud Computing is on the rise. For many educational institutions, the challenge is to decide on how to incorporate appropriate cloud-based technologies into their curricula. In this paper, we describe our design and experiences of integrating Cloud Computing components into seven third/fourth-year…
The Use of Audio in Computer-Based Instruction.
ERIC Educational Resources Information Center
Koroghlanian, Carol M.; Sullivan, Howard J.
This study investigated the effects of audio and text density on the achievement, time-in-program, and attitudes of 134 undergraduates. Data concerning the subjects' preexisting computer skills and experience, as well as demographic information, were also collected. The instruction in visual design principles was delivered by computer and included…
Computer Series, 102: Bits and Pieces, 40.
ERIC Educational Resources Information Center
Birk, James P., Ed.
1989-01-01
Discussed are seven computer programs: (1) a computer graphics experiment for organic chemistry laboratory; (2) a gel filtration simulation; (3) judging spelling correctness; (4) interfacing the TLC548 ADC; (5) a digitizing circuit for the Apple II game port; (6) a chemical information base; and (7) an IBM PC article database. (MVL)
Inventing Motivates and Prepares Student Teachers for Computer-Based Learning
ERIC Educational Resources Information Center
Glogger-Frey, I.; Kappich, J.; Schwonke, R.; Holzäpfel, L.; Nückles, M.; Renkl, A.
2015-01-01
A brief, problem-oriented phase such as an inventing activity is one potential instructional method for preparing learners not only cognitively but also motivationally for learning. Student teachers often need to overcome motivational barriers in order to use computer-based learning opportunities. In a preliminary experiment, we found that student…
Use of Humorous Visuals To Enhance Computer-Based-Instruction.
ERIC Educational Resources Information Center
Snetsinger, Wendy; Grabowski, Barbara
It was hypothesized that a visual strategy that incorporates a humorous theme and cartoons with humorous comments relevant to the content helps motivate students to focus on and retain computer-based instructional material. An experiment to assess this hypothesis was undertaken with 43 college students who received a humorous presentation on…
ERIC Educational Resources Information Center
VanLehn, Kurt
2011-01-01
This article is a review of experiments comparing the effectiveness of human tutoring, computer tutoring, and no tutoring. "No tutoring" refers to instruction that teaches the same content without tutoring. The computer tutoring systems were divided by their granularity of the user interface interaction into answer-based, step-based, and…
Cognition, Corpora, and Computing: Triangulating Research in Usage-Based Language Learning
ERIC Educational Resources Information Center
Ellis, Nick C.
2017-01-01
Usage-based approaches explore how we learn language from our experience of language. Related research thus involves the analysis of the usage from which learners learn and of learner usage as it develops. This program involves considerable data recording, transcription, and analysis, using a variety of corpus and computational techniques, many of…
Computer-Based Learning of Neuroanatomy: A Longitudinal Study of Learning, Transfer, and Retention
ERIC Educational Resources Information Center
Chariker, Julia H.; Naaz, Farah; Pani, John R.
2011-01-01
A longitudinal experiment was conducted to evaluate the effectiveness of new methods for learning neuroanatomy with computer-based instruction. Using a three-dimensional graphical model of the human brain and sections derived from the model, tools for exploring neuroanatomy were developed to encourage "adaptive exploration". This is an…
ERIC Educational Resources Information Center
Katz, Mary Maxwell; And Others
Teacher isolation is a significant problem in the science teaching profession. Traditional inservice solutions are often plagued by logistical difficulties or occur too infrequently to build ongoing teacher networks. Educational Technology Center (ETC) researchers reasoned that computer-based conferencing might promote collegial exchange among…
Model-Based and Model-Free Pavlovian Reward Learning: Revaluation, Revision and Revelation
Dayan, Peter; Berridge, Kent C.
2014-01-01
Evidence supports at least two methods for learning about reward and punishment and making predictions for guiding actions. One method, called model-free, progressively acquires cached estimates of the long-run values of circumstances and actions from retrospective experience. The other method, called model-based, uses representations of the environment, expectations and prospective calculations to make cognitive predictions of future value. Extensive attention has been paid to both methods in computational analyses of instrumental learning. By contrast, although a full computational analysis has been lacking, Pavlovian learning and prediction has typically been presumed to be solely model-free. Here, we revise that presumption and review compelling evidence from Pavlovian revaluation experiments showing that Pavlovian predictions can involve their own form of model-based evaluation. In model-based Pavlovian evaluation, prevailing states of the body and brain influence value computations, and thereby produce powerful incentive motivations that can sometimes be quite new. We consider the consequences of this revised Pavlovian view for the computational landscape of prediction, response and choice. We also revisit differences between Pavlovian and instrumental learning in the control of incentive motivation. PMID:24647659
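The contrast drawn above, cached model-free values versus prospective model-based evaluation, can be illustrated with a toy Pavlovian revaluation sketch in Python; the two-state world, learning rate, and devaluation step are illustrative assumptions, not the authors' model.

```python
import numpy as np

# Toy Pavlovian world: a cue (state 0) reliably leads to an outcome (state 1).
T = np.array([[0.0, 1.0],
              [0.0, 0.0]])       # transition model
r = np.array([0.0, 1.0])         # reward on entering each state
alpha = 0.1                      # learning rate

# Model-free: cache the cue's value from sampled experience (TD-style).
V = np.zeros(2)
for _ in range(500):
    V[0] += alpha * (r[1] - V[0])    # outcome is terminal, so no bootstrap term

# Model-based: derive the cue's value prospectively from the model.
print(V[0], T[0] @ r)            # both ~1.0 after training

# Revaluation: devalue the outcome without re-experiencing the cue.
r[1] = 0.0
print(V[0], T[0] @ r)  # cached value stays ~1.0; model-based drops to 0 at once
```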
Learning and teaching with a computer scanner
NASA Astrophysics Data System (ADS)
Planinsic, G.; Gregorcic, B.; Etkina, E.
2014-09-01
This paper introduces the readers to simple inquiry-based activities (experiments with supporting questions) that one can do with a computer scanner to help students learn and apply the concepts of relative motion in 1 and 2D, vibrational motion and the Doppler effect. We also show how to use these activities to help students think like scientists. They will conduct simple experiments, construct different explanations for their observations, test their explanations in new experiments and represent their ideas in multiple ways.
Outline of CS application experiments
NASA Astrophysics Data System (ADS)
Otsu, Y.; Kondoh, K.; Matsumoto, M.
1985-09-01
To promote and investigate practical applications of satellites, CS application experiments were performed for various social activity needs, including those of public services such as the National Police Agency and the Japanese National Railway, computer network services, news material transmissions, and advanced teleconference activities. Public-service satellite communications systems were developed and tested. Based on the results obtained, several public services have implemented CS-2 for practical disaster-backup uses. Practical computer network and enhanced video-conference experiments have also been performed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Zi-Kui; Gleeson, Brian; Shang, Shunli
This project developed computational tools that can complement and support experimental efforts in order to enable discovery and more efficient development of Ni-base structural materials and coatings. The project goal was reached through an integrated computation-predictive and experimental-validation approach, including first-principles calculations, thermodynamic CALPHAD (CALculation of PHAse Diagram), and experimental investigations on compositions relevant to Ni-base superalloys and coatings in terms of oxide layer growth and microstructure stabilities. The developed description included composition ranges typical for coating alloys and, hence, allows for prediction of thermodynamic properties for these material systems. The calculation of phase compositions, phase fractions, and phase stabilities, which are directly related to properties such as ductility and strength, was a valuable contribution, along with the collection of computational tools that are required to meet the increasing demands for strong, ductile and environmentally-protective coatings. Specifically, a suitable thermodynamic description for the Ni-Al-Cr-Co-Si-Hf-Y system was developed for bulk alloy and coating compositions. Experiments were performed to validate and refine the thermodynamics from the CALPHAD modeling approach. Additionally, alloys produced using predictions from the current computational models were studied in terms of their oxidation performance. Finally, results obtained from experiments aided in the development of a thermodynamic modeling automation tool called ESPEI/pycalphad for more rapid discovery and development of new materials.
Spin-based quantum computation in multielectron quantum dots
NASA Astrophysics Data System (ADS)
Hu, Xuedong; Das Sarma, S.
2001-10-01
In a quantum computer the hardware and software are intrinsically connected because the quantum Hamiltonian (or more precisely its time development) is the code that runs the computer. We demonstrate this subtle and crucial relationship by considering the example of electron-spin-based solid-state quantum computer in semiconductor quantum dots. We show that multielectron quantum dots with one valence electron in the outermost shell do not behave simply as an effective single-spin system unless special conditions are satisfied. Our work compellingly demonstrates that a delicate synergy between theory and experiment (between software and hardware) is essential for constructing a quantum computer.
Processing of the WLCG monitoring data using NoSQL
NASA Astrophysics Data System (ADS)
Andreeva, J.; Beche, A.; Belov, S.; Dzhunov, I.; Kadochnikov, I.; Karavakis, E.; Saiz, P.; Schovancova, J.; Tuckett, D.
2014-06-01
The Worldwide LHC Computing Grid (WLCG) today includes more than 150 computing centres where more than 2 million jobs are being executed daily and petabytes of data are transferred between sites. Monitoring the computing activities of the LHC experiments, over such a huge heterogeneous infrastructure, is extremely demanding in terms of computation, performance and reliability. Furthermore, the generated monitoring flow is constantly increasing, which represents another challenge for the monitoring systems. While existing solutions are traditionally based on Oracle for data storage and processing, recent developments evaluate NoSQL for processing large-scale monitoring datasets. NoSQL databases are getting increasingly popular for processing datasets at the terabyte and petabyte scale using commodity hardware. In this contribution, the integration of NoSQL data processing in the Experiment Dashboard framework is described along with first experiences of using this technology for monitoring the LHC computing activities.
Spacelab experiment computer study. Volume 1: Executive summary (presentation)
NASA Technical Reports Server (NTRS)
Lewis, J. L.; Hodges, B. C.; Christy, J. O.
1976-01-01
A quantitative cost for various Spacelab flight hardware configurations is provided, along with varied software development options. A cost analysis of Spacelab computer hardware and software is presented, based on utilization of a central experiment computer with optional auxiliary equipment. Ground rules and assumptions used in deriving the costing methods for all options in the Spacelab experiment study are presented and analyzed, and the options, along with their cost considerations, are discussed. It is concluded that Spacelab program cost for software development and maintenance is independent of experiment hardware and software options, that the distributed standard computer concept simplifies software integration without a significant increase in cost, and that decisions on flight computer hardware configurations should not be made until payload selection for a given mission and a detailed analysis of the mission requirements are completed.
ERIC Educational Resources Information Center
Jafari, Mina; Welden, Alicia Rae; Williams, Kyle L.; Winograd, Blair; Mulvihill, Ellen; Hendrickson, Heidi P.; Lenard, Michael; Gottfried, Amy; Geva, Eitan
2017-01-01
In this paper, we report on the implementation of a novel compute-to-learn pedagogy, which is based upon the theories of situated cognition and meaningful learning. The "compute-to-learn" pedagogy is designed to simulate an authentic research experience as part of the undergraduate curriculum, including project development, teamwork,…
ERIC Educational Resources Information Center
Bayrak, Bekir; Kanli, Uygar; Kandil Ingeç, Sebnem
2007-01-01
In this study, the research problem was: "Is computer-based physics instruction as effective as laboratory-intensive physics instruction with regard to the academic success of 9th grade students on electric circuits?" For this experimental research, a pre-test and post-test design was applied with an experiment and a control…
NASA Astrophysics Data System (ADS)
Nurjanah; Dahlan, J. A.; Wibisono, Y.
2017-02-01
This paper aims to design and develop computer-based e-learning teaching materials for improving the mathematical understanding ability and spatial sense of junior high school students. The particular aims are (1) to obtain a teaching material design, an evaluation model, and instruments to measure the mathematical understanding ability and spatial sense of junior high school students; (2) to conduct trials of the computer-based e-learning teaching material model, assessment, and instruments; (3) to complete the teaching material models of computer-based e-learning and assessment; and (4) to produce the resulting research product, computer-based e-learning teaching materials in the form of an interactive learning disc. The research method used in this study is developmental research, conducted by thought experiment and instruction experiment. The results showed that the teaching materials could be used very well, based on validation of the computer-based e-learning teaching materials by 5 multimedia experts. The judgements of face and content validity from the 5 validators agree for each test item of mathematical understanding ability and spatial sense. The reliability coefficients of the mathematical understanding ability and spatial sense tests are 0.929 and 0.939, respectively, which is very high, while the validity of both tests meets high and very high criteria.
NASA Astrophysics Data System (ADS)
Liang, Ruiyu; Xi, Ji; Bao, Yongqiang
2017-07-01
To improve the performance of gain compensation based on three-segment sound pressure level (SPL) in hearing aids, an improved multichannel loudness compensation method based on eight-segment SPL was proposed. First, a uniform cosine-modulated filter bank was designed, and adjacent channels with low or gradual slopes were adaptively merged to obtain the corresponding non-uniform cosine-modulated filter bank according to the audiogram of the hearing-impaired person. Second, the input speech was decomposed into sub-band signals and the SPL of every sub-band signal was computed. Meanwhile, the audible SPL range from 0 dB SPL to 120 dB SPL was equally divided into eight segments; based on these segments, a different prescription formula was designed for each segment to compute more detailed compensation gains according to the audiogram and the computed SPL. Finally, the enhanced signal was synthesized. Objective experiments showed that the signals decomposed by the cosine-modulated filter bank have little distortion, and that the hearing aids speech perception index (HASPI) and hearing aids speech quality index (HASQI) increased by 0.083 and 0.082 on average, respectively. Subjective experiments showed the proposed algorithm can effectively improve speech recognition for six hearing-impaired persons.
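A minimal sketch of the sub-band SPL computation and eight-segment mapping described above, assuming a standard 20 µPa reference pressure and a calibrated signal; the function names, calibration, and segment bounds are illustrative, not the authors' code.

```python
import numpy as np

P_REF = 20e-6   # standard reference pressure (Pa) anchoring 0 dB SPL

def band_spl(x, cal=1.0):
    """SPL of one sub-band signal x, assuming x * cal is in pascals."""
    rms = np.sqrt(np.mean((x * cal) ** 2))
    return 20.0 * np.log10(max(rms, 1e-12) / P_REF)

def spl_segment(spl, n_segments=8, lo=0.0, hi=120.0):
    """Map an SPL value onto one of n equal segments of the lo..hi dB SPL
    range, mirroring the eight-segment division described above."""
    width = (hi - lo) / n_segments
    return int(np.clip((spl - lo) // width, 0, n_segments - 1))

# Toy usage: a 1 kHz sub-band tone (~71 dB SPL if taken as pascals).
fs = 16000
t = np.arange(fs) / fs
x = 0.1 * np.sin(2 * np.pi * 1000 * t)
spl = band_spl(x)
print(spl, spl_segment(spl))
```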
Exploring Electronics Laboratory Experiments Using Computer Software
ERIC Educational Resources Information Center
Gandole, Yogendra Babarao
2011-01-01
The roles of teachers and students are changing, and there are undoubtedly ways of learning not yet discovered. However, computer and software technology may play a significant role in identifying problems, presenting solutions, and supporting life-long learning. It is clear that computer-based educational technology has reached the point where…
An Analysis of Creative Process Learning in Computer Game Activities through Player Experiences
ERIC Educational Resources Information Center
Inchamnan, Wilawan
2016-01-01
This research investigates the extent to which creative processes can be fostered through computer gaming. It focuses on creative components in games that have been specifically designed for educational purposes: Digital Game Based Learning (DGBL). A behavior analysis for measuring the creative potential of computer game activities and learning…
Integration of Computer Technology Into an Introductory-Level Neuroscience Laboratory
ERIC Educational Resources Information Center
Evert, Denise L.; Goodwin, Gregory; Stavnezer, Amy Jo
2005-01-01
We describe 3 computer-based neuroscience laboratories. In the first 2 labs, we used commercially available interactive software to enhance the study of functional and comparative neuroanatomy and neurophysiology. In the remaining lab, we used customized software and hardware in 2 psychophysiological experiments. With the use of the computer-based…
The Role of Context-Related Parameters in Adults' Mental Computational Acts
ERIC Educational Resources Information Center
Naresh, Nirmala; Presmeg, Norma
2012-01-01
Researchers who have carried out studies pertaining to mental computation and everyday mathematics point out that adults and children reason intuitively based upon experiences within specific contexts; they use invented strategies of their own to solve real-life problems. We draw upon research areas of mental computation and everyday mathematics…
Integrating Data Base into the Elementary School Science Program.
ERIC Educational Resources Information Center
Schlenker, Richard M.
This document describes seven science activities that combine scientific principles and computers. The objectives for the activities are to show students how the computer can be used as a tool to store and arrange scientific data, provide students with experience using the computer as a tool to manage scientific data, and provide students with…
Using a Computer-based Messaging System at a High School To Increase School/Home Communication.
ERIC Educational Resources Information Center
Burden, Mitzi K.
Minimal communication between school and home was found to contribute to low performance by students at McDuffie High School (South Carolina). This report describes the experience of establishing a computer-based telephone messaging system in the high school and involving parents, teachers, and students in its use. Additional strategies employed…
ERIC Educational Resources Information Center
Kwon, Oh-Woog; Lee, Kiyoung; Kim, Young-Kil; Lee, Yunkeun
2015-01-01
This paper introduces a Dialog-Based Computer-Assisted second-Language Learning (DB-CALL) system using semantic and grammar correctness evaluations and the results of its experiment. While the system dialogues with English learners about a given topic, it automatically evaluates the grammar and content properness of their English utterances, then…
ERIC Educational Resources Information Center
Stefanski, Angela J.; Leitze, Amy; Fife-Demski, Veronica M.
2018-01-01
This collective case study used methods of discourse analysis to consider what computer-mediated collaboration might reveal about preservice teachers' sense-making in a field-based practicum as they learn to teach reading to children identified as struggling readers. Researchers agree that field-based experiences coupled with time for reflection…
ERIC Educational Resources Information Center
Tsai, Chia-Wen
2015-01-01
This research investigated, via quasi-experiments, the effects of web-based co-regulated learning (CRL) on developing students' computing skills. Two classes of 68 undergraduates in a one-semester course titled "Applied Information Technology: Data Processing" were chosen for this research. The first class (CRL group, n = 38) received…
Pedagogy and Processes for a Computer Programming Outreach Workshop--The Bridge to College Model
ERIC Educational Resources Information Center
Tangney, Brendan; Oldham, Elizabeth; Conneely, Claire; Barrett, Stephen; Lawlor, John
2010-01-01
This paper describes a model for computer programming outreach workshops aimed at second-level students (ages 15-16). Participants engage in a series of programming activities based on the Scratch visual programming language, and a very strong group-based pedagogy is followed. Participants are not required to have any prior programming experience.…
A method of non-contact reading code based on computer vision
NASA Astrophysics Data System (ADS)
Zhang, Chunsen; Zong, Xiaoyu; Guo, Bingxuan
2018-03-01
To guarantee the security of information exchange between internal and external networks (trusted and untrusted networks), a non-contact code-reading method based on machine vision is proposed, which differs from existing physical network-isolation methods. Using a computer monitor, a camera, and other equipment, the information to be exchanged is processed through image coding, generation of a standard image, display and capture of the actual image, computation of a homography matrix, image distortion correction with calibration, and decoding. This achieves secure, non-contact, one-way transmission of computer information between the internal and external networks. The effectiveness of the proposed method is verified by experiments on real computer text data, and a data transfer speed of 24 kb/s is achieved. The experiments show that this algorithm has the characteristics of high security, high speed, and little loss of information, which can meet the daily needs of confidentiality departments to update data effectively and reliably. It solves the difficulty of exchanging computer information between secret and non-secret networks, and has distinctive originality, practicability, and practical research value.
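The homography step in the pipeline above can be sketched with OpenCV; the corner coordinates and file name below are hypothetical placeholders, not values from the paper.

```python
import cv2
import numpy as np

# Corners of the coded image as displayed on the monitor, and the same
# corners as detected in the camera frame (all values hypothetical).
src = np.float32([[0, 0], [1279, 0], [1279, 719], [0, 719]])       # monitor
dst = np.float32([[102, 87], [1150, 95], [1132, 690], [88, 672]])  # camera

H, _ = cv2.findHomography(src, dst)   # homography: monitor -> camera

# Warp the captured frame back to the monitor's geometry before decoding.
frame = cv2.imread("captured_frame.png")              # hypothetical file
rectified = cv2.warpPerspective(frame, H, (1280, 720),
                                flags=cv2.WARP_INVERSE_MAP)
```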
Role of the ATLAS Grid Information System (AGIS) in Distributed Data Analysis and Simulation
NASA Astrophysics Data System (ADS)
Anisenkov, A. V.
2018-03-01
In modern high-energy physics experiments, particular attention is paid to the global integration of information and computing resources into a unified system for efficient storage and processing of experimental data. Annually, the ATLAS experiment performed at the Large Hadron Collider at the European Organization for Nuclear Research (CERN) produces tens of petabytes of raw data from the recording electronics and several petabytes of data from the simulation system. For processing and storage of such super-large volumes of data, the computing model of the ATLAS experiment is based on a heterogeneous, geographically distributed computing environment, which includes the worldwide LHC computing grid (WLCG) infrastructure and is able to meet the requirements of the experiment for processing huge data sets and provide a high degree of their accessibility (hundreds of petabytes). The paper considers the ATLAS grid information system (AGIS) used by the ATLAS collaboration to describe the topology and resources of the computing infrastructure, to configure and connect the high-level software systems of computer centers, and to describe and store all possible parameters, control, configuration, and other auxiliary information required for the effective operation of the ATLAS distributed computing applications and services. The role of the AGIS system in the development of a unified description of the computing resources provided by grid sites, supercomputer centers, and cloud computing into a consistent information model for the ATLAS experiment is outlined. This approach has allowed the collaboration to extend the computing capabilities of the WLCG project and integrate supercomputers and cloud computing platforms into the software components of the production and distributed analysis workload management system (PanDA, ATLAS).
Pain Assessment and Management in Nursing Education Using Computer-based Simulations.
Romero-Hall, Enilda
2015-08-01
It is very important for nurses to have a clear understanding of the patient's pain experience and of management strategies. However, a review of the nursing literature shows that one of the main barriers to proper pain management practice is lack of knowledge. Nursing schools are in a unique position to address the gap in pain management knowledge by facilitating the acquisition and use of knowledge by the next generation of nurses. The purpose of this article is to discuss the role of computer-based simulations as a reliable educational technology strategy that can enhance the learning experience of nursing students acquiring pain management knowledge and practice. Computer-based simulations provide a significant number of learning affordances that can help change nursing students' attitudes and behaviors toward and practice of pain assessment and management.
Edwardson, S R; Pejsa, J
1993-01-01
A computer-based tutorial for teaching nursing financial management concepts was developed using the macro function of a commercially available spreadsheet program. The goals of the tutorial were to provide students with an experience with spreadsheets as a computer tool and to teach selected financial management concepts. Preliminary results show the tutorial was well received by students. Suggestions are made for overcoming the general lack of computer sophistication among students.
ERIC Educational Resources Information Center
Wu, Hsiao-Chi; Shen, Pei-Di; Chen, Yi-Fen; Tsai, Chia-Wen
2016-01-01
Web-based learning is generally a solitary process without teachers' on-the-spot assistance. In this study, a quasi-experiment was conducted to explore the effects of various combinations of Web-Based Cognitive Apprenticeship (WBCA) and Time Management (TM) on the development of students' computing skills. Three class cohorts of 124 freshmen in a…
Search and retrieval of office files using dBASE 3
NASA Technical Reports Server (NTRS)
Breazeale, W. L.; Talley, C. R.
1986-01-01
Described is a method of automating the office files retrieval process using a commercially available software package (dBASE III). The resulting product is a menu-driven computer program which requires no computer skills to operate. One part of the document is written for the potential user who has minimal computer experience and uses sample menu screens to explain the program, while a second part is oriented towards the computer-literate individual and includes rather detailed descriptions of the methodology and search routines. Although many of the programming techniques are explained, this document is not intended to be a tutorial on dBASE III. It is hoped that the document will serve as a stimulus for other applications of dBASE III.
Using OSG Computing Resources with (iLC)Dirac
NASA Astrophysics Data System (ADS)
Sailer, A.; Petric, M.; CLICdp Collaboration
2017-10-01
CPU cycles for small experiments and projects can be scarce, so making use of all available resources, whether dedicated or opportunistic, is mandatory. While enabling uniform access to the LCG computing elements (ARC, CREAM), the DIRAC grid interware was not able to use OSG computing elements (GlobusCE, HTCondor-CE) without dedicated support at the grid site through so-called 'SiteDirectors', which directly submit to the local batch system. This in turn requires additional dedicated effort for small experiments on the grid site. Adding interfaces to the OSG CEs through the respective grid middleware therefore allows accessing them within the DIRAC software without additional site-specific infrastructure. This enables greater use of opportunistic resources for experiments and projects without dedicated clusters or an established computing infrastructure with the DIRAC software. To allow sending jobs to HTCondor-CE and legacy Globus computing elements inside DIRAC, the required wrapper classes were developed. Not only is the usage of these types of computing elements now completely transparent for all DIRAC instances, which makes DIRAC a flexible solution for OSG-based virtual organisations, but it also allows LCG grid sites to move to the HTCondor-CE software without shutting DIRAC-based VOs out of their site. In these proceedings we detail how we interfaced the DIRAC system to the HTCondor-CE and Globus computing elements, explain the encountered obstacles and the solutions developed, and describe how the linear collider community uses resources in the OSG.
Grid Computing at GSI for ALICE and FAIR - present and future
NASA Astrophysics Data System (ADS)
Schwarz, Kilian; Uhlig, Florian; Karabowicz, Radoslaw; Montiel-Gonzalez, Almudena; Zynovyev, Mykhaylo; Preuss, Carsten
2012-12-01
The future FAIR experiments CBM and PANDA have computing requirements that fall in a category that could currently not be satisfied by one single computing centre. One needs a larger, distributed computing infrastructure to cope with the amount of data to be simulated and analysed. Since 2002, GSI has operated a tier2 centre for ALICE@CERN. The central component of the GSI computing facility and hence the core of the ALICE tier2 centre is an LSF/SGE batch farm, currently split into three subclusters with a total of 15000 CPU cores shared by the participating experiments, and accessible both locally and soon also completely via Grid. In terms of data storage, a 5.5 PB Lustre file system, directly accessible from all worker nodes, is maintained, as well as a 300 TB xrootd-based Grid storage element. Based on this existing expertise, and utilising ALICE's middleware 'AliEn', the Grid infrastructure for PANDA and CBM is being built. Besides a tier0 centre at GSI, the computing Grids of the two FAIR collaborations now encompass more than 17 sites in 11 countries and are constantly expanding. The operation of the distributed FAIR computing infrastructure benefits significantly from the experience gained with the ALICE tier2 centre. A close collaboration between ALICE Offline and FAIR provides mutual advantages. The employment of a common Grid middleware as well as compatible simulation and analysis software frameworks ensure significant synergy effects.
A versatile software package for inter-subject correlation based analyses of fMRI.
Kauppi, Jukka-Pekka; Pajula, Juha; Tohka, Jussi
2014-01-01
In the inter-subject correlation (ISC) based analysis of functional magnetic resonance imaging (fMRI) data, the extent of shared processing across subjects during the experiment is determined by calculating correlation coefficients between the fMRI time series of the subjects in the corresponding brain locations. This implies that ISC can be used to analyze fMRI data without explicitly modeling the stimulus, and thus ISC is a potential method for analyzing fMRI data acquired under complex naturalistic stimuli. Despite the suitability of the ISC-based approach for analyzing complex fMRI data, no generic software tools have been made available for this purpose, limiting the widespread use of ISC-based analysis techniques in the neuroimaging community. In this paper, we present a graphical user interface (GUI) based software package, ISC Toolbox, implemented in Matlab for computing various ISC-based analyses. Many advanced computations such as comparison of ISCs between different stimuli, time window ISC, and inter-subject phase synchronization are supported by the toolbox. The analyses are coupled with re-sampling based statistical inference. The ISC-based analyses are data and computation intensive, and the ISC Toolbox is equipped with mechanisms to execute the parallel computations in a cluster environment automatically, with automatic detection of the cluster environment in use. Currently, SGE-based (Oracle Grid Engine, Son of a Grid Engine, or Open Grid Scheduler) and Slurm environments are supported. In this paper, we present a detailed account of the methods behind the ISC Toolbox and the implementation of the toolbox, and demonstrate possible uses of the toolbox by summarizing selected example applications. We also report computation time experiments using both a single desktop computer and two grid environments, demonstrating that parallelization effectively reduces the computing time. The ISC Toolbox is available at https://code.google.com/p/isc-toolbox/
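The core ISC computation described above (correlating subjects' time series at the same brain location) reduces to a few lines of numpy; the sketch below shows only the mean pairwise-correlation variant on illustrative toy data, while the toolbox itself supports far more.

```python
import numpy as np

def isc(ts):
    """Inter-subject correlation at one brain location.
    ts: (n_subjects, n_timepoints) array of fMRI time series. Returns
    the mean pairwise Pearson correlation across subjects."""
    r = np.corrcoef(ts)                    # n_subjects x n_subjects matrix
    iu = np.triu_indices(ts.shape[0], k=1) # upper triangle, no diagonal
    return r[iu].mean()

# Toy usage: 5 subjects sharing a stimulus-driven component plus noise.
rng = np.random.default_rng(1)
shared = rng.standard_normal(200)
ts = shared + 0.8 * rng.standard_normal((5, 200))
print(isc(ts))    # well above zero, reflecting shared processing
```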
Web-based analysis and publication of flow cytometry experiments.
Kotecha, Nikesh; Krutzik, Peter O; Irish, Jonathan M
2010-07-01
Cytobank is a Web-based application for storage, analysis, and sharing of flow cytometry experiments. Researchers use a Web browser to log in and use a wide range of tools developed for basic and advanced flow cytometry. In addition to providing access to standard cytometry tools from any computer, Cytobank creates a platform and community for developing new analysis and publication tools. Figure layouts created on Cytobank are designed to allow transparent access to the underlying experiment annotation and data processing steps. Since all flow cytometry files and analysis data are stored on a central server, experiments and figures can be viewed or edited by anyone with the proper permission, from any computer with Internet access. Once a primary researcher has performed the initial analysis of the data, collaborators can engage in experiment analysis and make their own figure layouts using the gated, compensated experiment files. Cytobank is available to the scientific community at http://www.cytobank.org.
Sato, Emi; Matsuda, Kouhei
2018-06-11
The purpose of this study was to examine cerebral blood flow in the frontal cortex area during personality self-rating tasks. Our two hypotheses were that (1) cerebral blood flow varies based on the personality rating condition and (2) cerebral blood flow varies based on personality traits. This experiment measured cerebral blood flow under 3 personal-computer rating conditions and 2 questionnaire conditions. Comparing the rating conditions, the results of a t-test indicated that cerebral blood flow was higher in the questionnaire condition than in the personal-computer condition. With respect to the Big Five, the correlation results showed that cerebral blood flow during the personality rating task changed according to the agreeableness trait. A five-cluster analysis of individual differences indicated that certain personality traits were related to the factors that increased or decreased cerebral blood flow. An analysis of variance indicated significant effects for openness to experience and Behavioural Activation System drive; participants with high intellectual curiosity were motivated in this experiment, and thus their cerebral blood flow may have increased. The significance of this experiment is that, by employing certain performance measures, we could examine differences in physical changes based on personality traits.
Koivunen, Marita; Välimäki, Maritta; Jakobsson, Tiina; Pitkänen, Anneli
2008-01-01
This article describes the systematic process in which an evidence-based approach was used to develop a curriculum designed to support the computer and Internet skills of nurses in psychiatric hospitals in Finland. The pressure on organizations to have skilled and motivated nurses who use modern information and communication technology in health care organizations has increased due to rapid technology development at the international and national levels. However, less frequently has the development of those computer education curricula been based on evidence-based knowledge. First, we identified psychiatric nurses' learning experiences and barriers to computer use by examining written essays. Second, nurses' computer skills were surveyed. Last, evidence from the literature was scrutinized to find effective methods that can be used to teach and learn computer use in health care. This information was integrated and used for the development process of an education curriculum designed to support nurses' computer and Internet skills.
ERIC Educational Resources Information Center
Hedman, Leif; Sharafi, Parvaneh
2004-01-01
This case study explores how educational training and clinical practice that uses personal computers (PCs) and Personal Digital Assistants (PDAs) to access Internet-based medical information, affects the engagement modes of students, flow experience components, and IT-competence. A questionnaire assessing these variables was administered before…
The Ultimate Sampling Dilemma in Experience-Based Decision Making
ERIC Educational Resources Information Center
Fiedler, Klaus
2008-01-01
Computer simulations and 2 experiments demonstrate the ultimate sampling dilemma, which constitutes a serious obstacle to inductive inferences in a probabilistic world. Participants were asked to take the role of a manager who is to make purchasing decisions based on positive versus negative feedback about 3 providers in 2 different product…
MorphoHawk: Geometric-based Software for Manufacturing and More
DOE Office of Scientific and Technical Information (OSTI.GOV)
Keith Arterburn
2001-04-01
Hollywood movies portray facial recognition as a perfected technology, but the reality is that sophisticated computers and algorithmic calculations are far from perfect. In fact, the most sophisticated and successful computer for recognizing faces and other imagery is still the human brain, with more than 10 billion nerve cells. Beginning at birth, humans process data and connect optical and sensory experiences, creating an unparalleled accumulation of data that lets people associate images with life experiences, emotions, and knowledge. Computers are powerful, rapid, and tireless, but still cannot compare to the highly sophisticated relational calculations and associations that the human computer can produce in connecting 'what we see with what we know.'
NASA Technical Reports Server (NTRS)
Vallee, J.; Wilson, T.
1976-01-01
Results are reported of the first experiments for a computer conference management information system at the National Aeronautics and Space Administration. Between August 1975 and March 1976, two NASA projects with geographically separated participants (NASA scientists) used the PLANET computer conferencing system for portions of their work. The first project was a technology assessment of future transportation systems. The second project involved experiments with the Communication Technology Satellite. As part of this project, pre- and postlaunch operations were discussed in a computer conference. These conferences also provided the context for an analysis of the cost of computer conferencing. In particular, six cost components were identified: (1) terminal equipment, (2) communication with a network port, (3) network connection, (4) computer utilization, (5) data storage and (6) administrative overhead.
ERIC Educational Resources Information Center
Lee, Eun-Ju
2007-01-01
This experiment examined what situational and dispositional features moderate the effects of linguistic gender cues on gender stereotyping in anonymous, text-based computer-mediated communication. Participants played a trivia game with an ostensible partner via computer, whose comments represented either prototypically masculine or feminine…
ERIC Educational Resources Information Center
Stotter, Philip L.; Culp, George H.
An experimental course in organic chemistry utilized computer-assisted instructional (CAI) techniques. The CAI lessons provided tutorial drill and practice and simulated experiments and reactions. The Conversational Language for Instruction and Computing was used, along with a CDC 6400-6600 system; students scheduled and completed the lessons at…
ERIC Educational Resources Information Center
Davies, Alison; Ramsay, Jill; Lindfield, Helen; Couperthwaite, John
2005-01-01
This paper examines BSc Physiotherapy students' experiences of developing their neurological observational and analytical skills using a blend of traditional classroom activities and computer-based materials at the University of Birmingham. New teaching and learning resources were developed and supported in the School of Health Sciences using Web…
ERIC Educational Resources Information Center
WING, RICHARD L.; AND OTHERS
The purpose of the experiment was to produce and evaluate 3 computer-based economics games as a method of individualizing instruction for grade 6 students. 26 experimental subjects played 2 economics games, while a control group received conventional instruction on similar material. In the Sumerian Game, students seated at the typewriter terminals…
Computer-Based Experiment for Determining Planck's Constant Using LEDs
ERIC Educational Resources Information Center
Zhou, Feng; Cloninger, Todd
2008-01-01
Visible light emitting diodes (LEDs) have been widely used as power indicators. However, after the power is switched off, it takes a while for the LED to go off. Many students were fascinated by this simple demonstration. In this paper, by making use of computer-based data acquisition and modeling, we show the voltage across the LED undergoing an…
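Although the abstract is truncated, the standard LED estimate of Planck's constant equates the photon energy hc/λ with the electrical energy eV_th at the LED's threshold (turn-on) voltage, giving h ≈ eV_th·λ/c; the sketch below uses illustrative wavelengths and voltages, not data from the paper.

```python
# Estimate Planck's constant from LED threshold voltages: h ~ e * V_th * lambda / c.
e = 1.602e-19   # elementary charge (C)
c = 2.998e8     # speed of light (m/s)

leds = {        # wavelength (m) -> measured threshold voltage (V), illustrative
    630e-9: 1.90,   # red
    590e-9: 2.00,   # yellow
    525e-9: 2.35,   # green
}

estimates = [e * v * lam / c for lam, v in leds.items()]
h_est = sum(estimates) / len(estimates)
print(f"h ~ {h_est:.2e} J s")   # compare with the accepted 6.63e-34 J s
```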
ERIC Educational Resources Information Center
Langheinrich, Jessica; Bogner, Franz X.
2015-01-01
As non-scientific conceptions interfere with learning processes, teachers need both, to know about them and to address them in their classrooms. For our study, based on 182 eleventh graders, we analyzed the level of conceptual understanding by implementing the "draw and write" technique during a computer-supported gene technology module.…
ERIC Educational Resources Information Center
Akpinar, Ercan
2014-01-01
This study investigates the effects of using interactive computer animations based on predict-observe-explain (POE) as a presentation tool on primary school students' understanding of the static electricity concepts. A quasi-experimental pre-test/post-test control group design was utilized in this study. The experiment group consisted of 30…
ERIC Educational Resources Information Center
Wilkerson, Joyce A.; Elkins, Susan A.
This qualitative case study assessed web-based instruction in a computer-aided design/computer-assisted manufacturing (CAD/CAM) course designed for workforce development. The study examined students' and instructors' experience in a CAD/CAM course delivered exclusively on the Internet, evaluating course content and delivery, clarity of…
ERIC Educational Resources Information Center
Martin, Michael J.
2004-01-01
With new and inexpensive computer-based methods, measuring the speed of light and the Earth's radius--historically difficult endeavors--can be simple enough to be tackled by high school and college students working in labs that have limited budgets. In this article, the author describes two methods of estimating the Earth's radius using two…
ERIC Educational Resources Information Center
Tsai, Chia-Wen; Lee, Tsang-Hsiung; Shen, Pei-Di
2013-01-01
Many private vocational schools in Taiwan have taken to enrolling students with lower levels of academic achievement. The authors re-designed a course and conducted a series of quasi-experiments to develop students' long-term computing skills, and examined the longitudinal effects of web-enabled, problem-based learning (PBL) and self-regulated…
ERIC Educational Resources Information Center
Tsai, Chia-Wen; Shen, Pei-Di; Lin, Rong-An
2015-01-01
This study investigated, via quasi-experiments, the effects of student-centered project-based learning with initiation (SPBL with Initiation) on the development of students' computing skills. In this study, 96 elementary school students were selected from four class sections taking a course titled "Digital Storytelling" and were assigned…
A computer-based physics laboratory apparatus: Signal generator software
NASA Astrophysics Data System (ADS)
Thanakittiviroon, Tharest; Liangrocapart, Sompong
2005-09-01
This paper describes a computer-based physics laboratory apparatus to replace expensive instruments such as high-precision signal generators. This apparatus uses a sound card in a common personal computer to give sinusoidal signals with an accurate frequency that can be programmed to give different frequency signals repeatedly. An experiment on standing waves on an oscillating string uses this apparatus. In conjunction with interactive lab manuals, which have been developed using personal computers in our university, we achieve a complete set of low-cost, accurate, and easy-to-use equipment for teaching a physics laboratory.
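A minimal modern sketch of the same idea, using the sound card as a programmable sine generator; the sounddevice library, sweep range, and amplitude are assumptions, not the authors' implementation.

```python
import numpy as np
import sounddevice as sd   # assumed audio-output library

def play_tone(freq, duration=2.0, fs=44100, amplitude=0.5):
    """Emit a sine wave of the given frequency through the sound card,
    the role the paper's signal-generator software fills."""
    t = np.arange(int(duration * fs)) / fs
    x = amplitude * np.sin(2 * np.pi * freq * t)
    sd.play(x.astype(np.float32), fs)
    sd.wait()

# Step through driving frequencies to hunt for string resonances.
for f in range(20, 101, 10):
    play_tone(f)
```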
A fast point-cloud computing method based on spatial symmetry of Fresnel field
NASA Astrophysics Data System (ADS)
Wang, Xiangxiang; Zhang, Kai; Shen, Chuan; Zhu, Wenliang; Wei, Sui
2017-10-01
Computer-generated holography (CGH) faces a great challenge in real-time holographic video display systems: producing the high space-bandwidth product (SBP) required. This paper is based on the point-cloud method and takes advantage of the reversibility of Fresnel diffraction along the propagation direction and of the spatial symmetry of the fringe pattern of a point source, known as a Gabor zone plate, which can therefore serve as a basis for fast calculation of the diffraction field in CGH. A fast Fresnel CGH method based on the novel look-up table (N-LUT) method is proposed: first, the principal fringe patterns (PFPs) at a virtual plane are pre-calculated by the acceleration algorithm and stored; second, the Fresnel diffraction fringe pattern at the dummy plane is obtained; finally, the field is propagated from the dummy plane to the hologram plane. Simulation experiments and optical experiments based on liquid crystal on silicon (LCOS) were set up to demonstrate the validity of the proposed method; while ensuring the quality of the 3D reconstruction, the proposed method can shorten the computation time and improve computational efficiency.
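For orientation, a naive (un-accelerated) point-cloud CGH sketch shows the Gabor-zone-plate fringes that LUT-style methods precompute and reuse; the wavelength, pixel pitch, and toy point cloud are illustrative assumptions, not parameters from the paper.

```python
import numpy as np

wavelength = 532e-9   # assumed laser wavelength (m)
pitch = 8e-6          # assumed SLM pixel pitch (m)
N = 512               # hologram resolution (N x N samples)
k = 2 * np.pi / wavelength

ax = (np.arange(N) - N / 2) * pitch
X, Y = np.meshgrid(ax, ax)

def point_fringe(x0, y0, z0):
    """Spherical-wave fringe (Gabor-zone-plate pattern) of one object
    point; a point-cloud CGH sums these patterns over all points, which
    LUT-based methods accelerate by reusing precomputed, shifted fringes."""
    r = np.sqrt((X - x0) ** 2 + (Y - y0) ** 2 + z0 ** 2)
    return np.exp(1j * k * r) / r

points = [(0.0, 0.0, 0.10), (2e-4, -1e-4, 0.12)]   # toy point cloud (m)
field = sum(point_fringe(*p) for p in points)
hologram = np.angle(field)   # phase-only hologram, e.g. for an LCOS SLM
```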
Draghici, Sorin; Tarca, Adi L; Yu, Longfei; Ethier, Stephen; Romero, Roberto
2008-03-01
The BioArray Software Environment (BASE) is a very popular MIAME-compliant, web-based microarray data repository. However, in BASE, as in most other microarray data repositories, the experiment annotation and raw data uploading can be very time-consuming, especially for large microarray experiments. We developed KUTE (Karmanos Universal daTabase for microarray Experiments) as a plug-in for BASE 2.0 that addresses these issues. KUTE provides an automatic experiment annotation feature and a completely redesigned data work-flow that dramatically reduce the human-computer interaction time. For instance, in BASE 2.0 a typical Affymetrix experiment involving 100 arrays required 4 h 30 min of user interaction time for experiment annotation, and 45 min for data upload/download. In contrast, for the same experiment, KUTE required only 28 min of user interaction time for experiment annotation, and 3.3 min for data upload/download. http://vortex.cs.wayne.edu/kute/index.html.
An Overview of Recent Developments in Computational Aeroelasticity
NASA Technical Reports Server (NTRS)
Bennett, Robert M.; Edwards, John W.
2004-01-01
The motivation for Computational Aeroelasticity (CA) and the elements of one type of the analysis or simulation process are briefly reviewed. The need for streamlining and improving the overall process to reduce elapsed time and improve overall accuracy is discussed. Further effort is needed to establish the credibility of the methodology, obtain experience, and to incorporate the experience base to simplify the method for future use. Experience with the application of a variety of Computational Aeroelasticity programs is summarized for the transonic flutter of two wings, the AGARD 445.6 wing and a typical business jet wing. There is a compelling need for a broad range of additional flutter test cases for further comparisons. Some existing data sets that may offer CA challenges are presented.
Operating a Geiger Müller tube using a PC sound card
NASA Astrophysics Data System (ADS)
Azooz, A. A.
2009-01-01
In this paper, a simple MATLAB-based PC program that enables the computer to function as a replacement for the electronic scaler-counter system associated with a Geiger-Müller (GM) tube is described. The program utilizes the ability of MATLAB to acquire data directly from the computer sound card. The signal from the GM tube is applied to the computer sound card via the line-in port. All standard GM experiments, pulse shape and statistical analysis experiments can be carried out using this system. A new visual demonstration of dead time effects is also presented.
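The counting step can be sketched as follows (a minimal illustration, not the paper's MATLAB code; it assumes the line-in samples are already available as a NumPy array, and the threshold and dead-time values are arbitrary choices):

```python
import numpy as np

def count_pulses(samples, rate, threshold=0.5, dead_time=1e-4):
    """Count upward threshold crossings separated by more than dead_time."""
    above = samples > threshold
    # indices where the signal rises through the threshold
    rises = np.flatnonzero(~above[:-1] & above[1:])
    count, last_t = 0, -np.inf
    for i in rises:
        t = i / rate
        if t - last_t > dead_time:   # ignore re-triggers within dead time
            count += 1
            last_t = t
    return count

rate = 44100
demo = 0.05 * np.random.randn(rate)   # one second of noise floor
demo[10::4410] = 1.0                  # ten synthetic GM pulses
print(count_pulses(demo, rate))       # -> 10
```

Lengthening the dead-time parameter and watching the count drop reproduces, in software, the dead-time demonstration the abstract mentions.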
Microcomputer-Based Digital Signal Processing Laboratory Experiments.
ERIC Educational Resources Information Center
Tinari, Jr., Rocco; Rao, S. Sathyanarayan
1985-01-01
Describes a system (Apple II microcomputer interfaced to flexible, custom-designed digital hardware) which can provide: (1) Fast Fourier Transform (FFT) computation on real-time data with a video display of spectrum; (2) frequency synthesis experiments using the inverse FFT; and (3) real-time digital filtering experiments. (JN)
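As a modern NumPy illustration of the inverse-FFT frequency synthesis mentioned in item (2) above (our own minimal sketch, not the lab's Apple II code; the bin choices are arbitrary):

```python
import numpy as np

N = 1024                       # record length
spectrum = np.zeros(N // 2 + 1, dtype=complex)
spectrum[5] = 1.0              # one tone: 5 cycles per record
spectrum[12] = 0.5j            # a second tone, quarter-period phase shift

signal = np.fft.irfft(spectrum, n=N)         # time-domain synthesis
spectrum_back = np.abs(np.fft.rfft(signal))  # verify: peaks at bins 5 and 12
```

Synthesizing a waveform from chosen bins and then re-analyzing it closes the loop between the FFT and inverse-FFT experiments described in the abstract.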
Accessible Earth: An accessible study abroad capstone for the geoscience curriculum
NASA Astrophysics Data System (ADS)
Bennett, R. A.; Lamb, D. A.
2017-12-01
International capstone field courses offer geoscience students opportunities to reflect upon their knowledge, develop intercultural competence, appreciate diversity, and recognize themselves as geoscientists on a global scale. Such experiences are often described as pivotal to a geoscientist's education, a rite of passage. However, field-based experiences present insurmountable barriers to many students, undermining the goal of inclusive excellence. Nevertheless, there remains a widespread belief that successful geoscientists are those able to traverse inaccessible terrain. One path forward from this apparent dilemma is emerging as we take steps to address a parallel challenge: as we move into the 21st century the geoscience workforce will require an ever increasing range of skills, including analysis, modeling, communication, and computational proficiency. Computer programming, laboratory experimentation, numerical simulation, etc. are inherently more accessible than fieldwork, yet equally valuable. Students interested in pursuing such avenues may be better served by capstone experiences that align more closely with their career goals. Moreover, many of the desirable learning outcomes attributed to field-based education are not unique to immersion in remote, inaccessible locations. Affective and cognitive gains may also result from social bonding through extended time with peers and mentors, creative synthesis of knowledge, project-based learning, and intercultural experience. Developing an inclusive course for the geoscience curriculum requires considering all learners, including different genders, ages, physical abilities, familial dynamics, and a multitude of other attributes. The Accessible Earth Study Abroad Program endeavors to provide geoscience students an inclusive capstone experience focusing on modern geophysical observation systems (satellite-based observations and permanent networks of ground-based instruments), computational thinking and methods of data science, scientific collaboration, and professional development. In this presentation, we will describe our thought process for creating the Accessible Earth curriculum, our successes to date, and the anticipated challenges ahead.
Experimental Probability in Elementary School
ERIC Educational Resources Information Center
Andrew, Lane
2009-01-01
Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.
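A minimal sketch of probability-by-experiment in this spirit (our own example, not from the article): estimate the chance of rolling a sum of 7 with two dice by simulation, then compare with the exact value 6/36. The trial count is an arbitrary choice.

```python
import random

trials = 100_000
hits = sum(1 for _ in range(trials)
           if random.randint(1, 6) + random.randint(1, 6) == 7)

print(f"experimental: {hits / trials:.4f}   theoretical: {6 / 36:.4f}")
```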
Experiments in Computing: A Survey
Tedre, Matti; Moisseinen, Nella
2014-01-01
Experiments play a central role in science. The role of experiments in computing is, however, unclear. Questions about the relevance of experiments in computing attracted little attention until the 1980s. As the discipline then saw a push towards experimental computer science, a variety of technically, theoretically, and empirically oriented views on experiments emerged. As a consequence of those debates, today's computing fields use experiments and experiment terminology in a variety of ways. This paper analyzes experimentation debates in computing. It presents five ways in which debaters have conceptualized experiments in computing: feasibility experiment, trial experiment, field experiment, comparison experiment, and controlled experiment. This paper has three aims: to clarify experiment terminology in computing; to contribute to disciplinary self-understanding of computing; and, due to computing's centrality in other fields, to promote understanding of experiments in modern science in general. PMID:24688404
Pfeiffer, Ulrich J; Schilbach, Leonhard; Timmermans, Bert; Kuzmanovic, Bojana; Georgescu, Alexandra L; Bente, Gary; Vogeley, Kai
2014-11-01
There is ample evidence that human primates strive for social contact and experience interactions with conspecifics as intrinsically rewarding. Focusing on gaze behavior as a crucial means of human interaction, this study employed a unique combination of neuroimaging, eye-tracking, and computer-animated virtual agents to assess the neural mechanisms underlying this component of behavior. In the interaction task, participants believed that during each interaction the agent's gaze behavior could be controlled either by another participant or by a computer program. Their task was to indicate, based on the agent's reaction, whether they experienced a given interaction as an interaction with another human participant or with the computer program. Unbeknownst to them, the agent was always controlled by a computer, which enabled a systematic manipulation of gaze reactions by varying the degree to which the agent engaged in joint attention. This created a tool to distinguish neural activity underlying the subjective experience of being engaged in social and non-social interaction. In contrast to previous research, it allows neural activity to be measured while participants experience active engagement in real-time social interactions. Results demonstrate that gaze-based interactions with a perceived human partner are associated with activity in the ventral striatum, a core component of reward-related neurocircuitry. In contrast, interactions with a computer-driven agent activate attention networks. Comparisons of neural activity during interaction with behaviorally naïve and explicitly cooperative partners demonstrate different temporal dynamics of the reward system and indicate that the mere experience of engagement in social interaction is sufficient to recruit this system.
BioSig3D: High Content Screening of Three-Dimensional Cell Culture Models
Bilgin, Cemal Cagatay; Fontenay, Gerald; Cheng, Qingsu; Chang, Hang; Han, Ju; Parvin, Bahram
2016-01-01
BioSig3D is a computational platform for high-content screening of three-dimensional (3D) cell culture models that are imaged in full 3D volume. It provides an end-to-end solution for designing high content screening assays, based on colony organization that is derived from segmentation of nuclei in each colony. BioSig3D also enables visualization of raw and processed 3D volumetric data for quality control, and integrates advanced bioinformatics analysis. The system consists of multiple computational and annotation modules that are coupled together with a strong use of controlled vocabularies to reduce ambiguities between different users. It is a web-based system that allows users to: design an experiment by defining experimental variables, upload a large set of volumetric images into the system, analyze and visualize the dataset, and either display computed indices as a heatmap, or phenotypic subtypes for heterogeneity analysis, or download computed indices for statistical analysis or integrative biology. BioSig3D has been used to profile baseline colony formations with two experiments: (i) morphogenesis of a panel of human mammary epithelial cell lines (HMEC), and (ii) heterogeneity in colony formation using an immortalized non-transformed cell line. These experiments reveal intrinsic growth properties of well-characterized cell lines that are routinely used for biological studies. BioSig3D is being released with seed datasets and video-based documentation. PMID:26978075
Collaborative Working Architecture for IoT-Based Applications.
Mora, Higinio; Signes-Pont, María Teresa; Gil, David; Johnsson, Magnus
2018-05-23
The new sensing applications need enhanced computing capabilities to handle the requirements of complex and huge data processing. The Internet of Things (IoT) concept brings processing and communication features to devices. In addition, the Cloud Computing paradigm provides resources and infrastructures for performing the computations and outsourcing the work from the IoT devices. This scenario opens new opportunities for designing advanced IoT-based applications; however, much research remains to be done to make all these systems work together properly. This work proposes a collaborative model and an architecture to take advantage of the available computing resources. The resulting architecture involves a novel network design with different levels that combines sensing and processing capabilities based on the Mobile Cloud Computing (MCC) paradigm. An experiment is included to demonstrate that this approach can be used in diverse real applications. The results show the flexibility of the architecture to perform complex computational tasks of advanced applications.
Cybersecurity Workforce Development and the Protection of Critical Infrastructure
2017-03-31
communications products, and limited travel for site visits and conferencing. The CSCC contains a developed web-based coordination site, computer ...the CSCC. The Best Practices Analyst position maintains a list of best practices, computer-related patches, and standard operating procedures (SOP...involved in conducting vulnerability assessments of computer networks. To adequately exercise and experiment with industry standard software, it was
The Modeling of Human Intelligence in the Computer as Demonstrated in the Game of DIPLOMAT.
ERIC Educational Resources Information Center
Collins, James Edward; Paulsen, Thomas Dean
An attempt was made to develop human-like behavior in the computer. A theory of the human learning process was described. A computer game was presented which simulated the human capabilities of reasoning and learning. The program was required to make intelligent decisions based on past experiences and critical analysis of the present situation.…
ERIC Educational Resources Information Center
Wlodyga, Linda J.
2010-01-01
In an attempt to prepare new graduate nurses to meet the demands of health care delivery systems, the use of computer-based clinical information systems that combine hands-on experience with computer based information systems was explored. Since the introduction of Electronic Medical Records (EMR) nearly two decades ago, the demand for nurses to…
ERIC Educational Resources Information Center
Carlfjord, Siw; Johansson, Kjell; Bendtsen, Preben; Nilsen, Per; Andersson, Agneta
2010-01-01
Objective: The aim of this study was to evaluate staff experiences of the use of a computer-based concept for lifestyle testing and tailored advice implemented in routine primary health care (PHC). Design: The design of the study was a cross-sectional, retrospective survey. Setting: The study population consisted of staff at nine PHC units in the…
ERIC Educational Resources Information Center
Painter, Diane D.
2016-01-01
The four-week university-sponsored summer Computer-based Writing (CBW) Program directed by the head of a special education initial teacher licensure program gave teaching interns opportunities to work with young struggling writers in a supervised clinical setting to address keyboarding skills, writing conventions and knowledge and application of…
Development of an Augmented Reality Game to Teach Abstract Concepts in Food Chemistry
ERIC Educational Resources Information Center
Crandall, Philip G.; Engler, Robert K.; Beck, Dennis E.; Killian, Susan A.; O'Bryan, Corliss A.; Jarvis, Nathan; Clausen, Ed
2015-01-01
One of the most pressing issues for many land grant institutions is the ever increasing cost to build and operate wet chemistry laboratories. A partial solution is to develop computer-based teaching modules that take advantage of animation, web-based or off-campus learning experiences directed at engaging students' creative experiences. We…
20 CFR 345.302 - Definition of terms and phrases used in experience-rating.
Code of Federal Regulations, 2013 CFR
2013-04-01
... raised in order to bring that rate to the minimum rate of zero is multiplied by the employer's 1-year...-year compensation base of each employer whose experience-based contribution rate, computed at Step 6 of.... Effective January 1, 1991, and on the first of each subsequent calendar year, the Board will reduce each...
20 CFR 345.302 - Definition of terms and phrases used in experience-rating.
Code of Federal Regulations, 2012 CFR
2012-04-01
... raised in order to bring that rate to the minimum rate of zero is multiplied by the employer's 1-year...-year compensation base of each employer whose experience-based contribution rate, computed at Step 6 of.... Effective January 1, 1991, and on the first of each subsequent calendar year, the Board will reduce each...
20 CFR 345.302 - Definition of terms and phrases used in experience-rating.
Code of Federal Regulations, 2014 CFR
2014-04-01
... raised in order to bring that rate to the minimum rate of zero is multiplied by the employer's 1-year...-year compensation base of each employer whose experience-based contribution rate, computed at Step 6 of.... Effective January 1, 1991, and on the first of each subsequent calendar year, the Board will reduce each...
ERIC Educational Resources Information Center
Levy, Philippa
2006-01-01
This paper focuses on learners' experiences of text-based computer-mediated communication (CMC) as a means of self-expression, dialogue and debate. A detailed case study narrative and a reflective commentary are presented, drawn from a personal, practice-based inquiry into the design and facilitation of a professional development course for which…
Student Activity and Profile Datasets from an Online Video-Based Collaborative Learning Experience
ERIC Educational Resources Information Center
Martín, Estefanía; Gértrudix, Manuel; Urquiza-Fuentes, Jaime; Haya, Pablo A.
2015-01-01
This paper describes two datasets extracted from a video-based educational experience using a social and collaborative platform. The length of the trial was 3 months. It involved 111 students from two different courses. Twenty-nine came from Computer Engineering (CE) course and 82 from Media and Communication (M&C) course. They were organised…
Computers in mathematics: teacher-inservice training at a distance
NASA Astrophysics Data System (ADS)
Friedman, Edward A.; Jurkat, M. P.
1993-01-01
While research and experience show many advantages to incorporating computer technology into secondary school mathematics instruction, less than 5 percent of the nation's teachers actively use computers in their classrooms. This is the case even though mathematics teachers in grades 7-12 are often familiar with computer technology and have computers available to them in their schools. The implementation bottleneck is in-service teacher training, and there are few models of effective implementation available for teachers to emulate. Stevens Institute of Technology has been active since 1988 in research and development efforts to incorporate computers into classroom use. We have found that teachers need to see examples of classroom experience with hardware and software, and they need assistance as they experiment with applications of software and the development of lesson plans. High-bandwidth technology can greatly facilitate teacher training in this area through transmission of video documentaries, software discussions, teleconferencing, peer interactions, classroom observations, etc. We discuss the experience that Stevens has had with face-to-face teacher training as well as with satellite-based teleconferencing using one-way video and two-way audio. Included are reviews of analyses of this project by researchers from Educational Testing Service, Princeton University, and the Bank Street School of Education.
Brain-Computer Interface Based on Generation of Visual Images
Bobrov, Pavel; Frolov, Alexander; Cantor, Charles; Fedulova, Irina; Bakhnyan, Mikhail; Zhavoronkov, Alexander
2011-01-01
This paper examines the task of recognizing EEG patterns that correspond to performing three mental tasks: relaxation and imagining of two types of pictures, faces and houses. The experiments were performed using two EEG headsets: BrainProducts ActiCap and Emotiv EPOC. The Emotiv headset is becoming widely used in consumer BCI applications, allowing for large-scale EEG experiments in the future. Since classification accuracy significantly exceeded the level of random classification during the first three days of the experiment with the EPOC headset, a control experiment was performed on the fourth day using the ActiCap. The control experiment showed that utilization of high-quality research equipment can enhance classification accuracy (up to 68% in some subjects) and that the accuracy is independent of the presence of EEG artifacts related to blinking and eye movement. This study also shows that a computationally inexpensive Bayesian classifier based on covariance matrix analysis yields classification accuracy similar to that of a more sophisticated Multi-class Common Spatial Patterns (MCSP) classifier. PMID:21695206
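One plausible reading of a covariance-based Bayesian classifier can be sketched as follows (an assumption for illustration, not the authors' code): model each class as a zero-mean multivariate Gaussian over EEG channels and pick the class with the highest log-likelihood.

```python
import numpy as np

def fit(trials_by_class):
    """trials_by_class: list over classes of arrays (n_trials, channels, samples)."""
    covs = []
    for trials in trials_by_class:
        x = np.concatenate(list(trials), axis=1)   # channels x all samples
        covs.append(np.cov(x))
    return covs

def log_likelihood(trial, cov):
    # Zero-mean Gaussian log-likelihood, summed over the trial's samples.
    inv = np.linalg.inv(cov)
    _, logdet = np.linalg.slogdet(cov)
    return -0.5 * (np.einsum('ct,cd,dt->', trial, inv, trial)
                   + trial.shape[1] * logdet)

def classify(trial, covs):
    """trial: array (channels, samples); returns the most likely class index."""
    return int(np.argmax([log_likelihood(trial, c) for c in covs]))
```

Such a classifier needs only one covariance matrix per class, which is consistent with the "computationally inexpensive" characterization in the abstract.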
Usage of CT data in biomechanical research
NASA Astrophysics Data System (ADS)
Safonov, Roman A.; Golyadkina, Anastasiya A.; Kirillova, Irina V.; Kossovich, Leonid Y.
2017-02-01
Object of study: The investigation is focused on the development of personalized medicine, specifically the determination of the mechanical properties of bone tissue from in vivo data. Methods: CT, MRI, mechanical experiments on the versatile test machine Instron 5944, and numerical experiments using Python programs. Results: A medical diagnostic method that allows the mechanical properties of bone tissue to be determined from in vivo data, and a series of experiments to define the values of the mechanical parameters of bone tissue. For one and the same sample, computed tomography (CT), magnetic resonance imaging (MRI), ultrasonic investigations, and mechanical experiments on the single-column test machine Instron 5944 were carried out. A computer program for comparing CT and MRI images was created, and the grayscale values at the same points of the samples were determined on both CT and MRI images. The Hounsfield grayscale values were used to determine the rigidity (Young's modulus) and tensile strength of the samples. The obtained data were compared with the results of the mechanical experiments for verification.
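A minimal sketch of the kind of mapping involved, under loudly flagged assumptions: CT Hounsfield units are converted to an apparent density and then to Young's modulus via a power law. The calibration constants below are illustrative placeholders, not values from this study; real work calibrates them against mechanical tests like those described above.

```python
import numpy as np

def hu_to_density(hu):
    """Linear HU-to-density calibration (a, b are hypothetical constants)."""
    a, b = 0.0008, 1.0          # g/cm^3 per HU, offset
    return a * hu + b

def density_to_young_modulus(rho):
    """Power-law relation E = c * rho**d (c, d are hypothetical constants)."""
    c, d = 6850.0, 1.49         # MPa scale and exponent
    return c * rho**d

hu = np.array([200.0, 600.0, 1200.0])     # sample voxel values
print(density_to_young_modulus(hu_to_density(hu)))
```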
Dynamic partitioning as a way to exploit new computing paradigms: the cloud use case.
NASA Astrophysics Data System (ADS)
Ciaschini, Vincenzo; Dal Pra, Stefano; dell'Agnello, Luca
2015-12-01
The WLCG community and many groups in the HEP community have based their computing strategy on the Grid paradigm, which has proved successful and still meets its goals. However, Grid technology has not spread much to other communities; in the commercial world, the cloud paradigm is the emerging way to provide computing services. WLCG experiments aim to integrate their existing computing model with cloud deployments and take advantage of so-called opportunistic resources (including HPC facilities), which are usually not Grid compliant. One feature missing from the most common cloud frameworks is the concept of a job scheduler, which plays a key role in a traditional computing centre by enabling fair-share-based access to resources for the experiments in a scenario where demand greatly outstrips availability. At CNAF we are investigating the possibility of accessing the Tier-1 computing resources as an OpenStack-based cloud service. The system, exploiting the dynamic partitioning mechanism already used to enable multicore computing, allowed us to avoid a static split of the computing resources in the Tier-1 farm while preserving a share-friendly approach. The hosts in a dynamically partitioned farm may be moved to or from the partition, according to suitable policies for request and release of computing resources. Nodes requested into the partition switch their role and become available to play a different one. In the cloud use case, hosts may switch from acting as Worker Nodes in the batch farm to cloud compute nodes made available to tenants. In this paper we describe the dynamic partitioning concept, its implementation, and its integration with our current batch system, LSF.
OpenCL-based vicinity computation for 3D multiresolution mesh compression
NASA Astrophysics Data System (ADS)
Hachicha, Soumaya; Elkefi, Akram; Ben Amar, Chokri
2017-03-01
3D multiresolution mesh compression systems are still widely studied in many domains. These systems increasingly require volumetric data to be processed in real time, so performance becomes constrained by material resource usage and the need for an overall reduction in computational time. In this paper, our contribution lies entirely in computing, in real time, the triangle neighborhoods of 3D progressive meshes for a robust compression algorithm based on the scan-based wavelet transform (WT) technique. The originality of the latter algorithm is that it computes the WT with minimal memory usage by processing data as they are acquired. However, with large data, this technique is considered poor in terms of computational complexity. For that reason, this work exploits the GPU to accelerate the computation, using OpenCL as a heterogeneous programming language. Experiments demonstrate that, aside from the portability across various platforms and the flexibility guaranteed by the OpenCL-based implementation, this method can improve performance by a speedup factor of 5 compared with the sequential CPU implementation.
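For reference, the vicinity computation itself can be sketched on the CPU (the paper parallelizes this on the GPU with OpenCL; this sequential Python version only shows the data flow): two triangles are neighbors when they share an edge.

```python
from collections import defaultdict

def triangle_neighbors(triangles):
    """triangles: list of (a, b, c) vertex-index tuples."""
    edge_to_tris = defaultdict(list)
    for t, (a, b, c) in enumerate(triangles):
        for edge in ((a, b), (b, c), (c, a)):
            edge_to_tris[frozenset(edge)].append(t)
    neighbors = defaultdict(set)
    for tris in edge_to_tris.values():
        for t in tris:
            neighbors[t].update(u for u in tris if u != t)
    return neighbors

mesh = [(0, 1, 2), (1, 2, 3), (2, 3, 4)]
print(triangle_neighbors(mesh))   # tri 0 <-> tri 1, tri 1 <-> tri 2
```

The per-edge and per-triangle loops are independent, which is what makes this step amenable to a GPU kernel.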
Internet messenger based smart virtual class learning using ubiquitous computing
NASA Astrophysics Data System (ADS)
Umam, K.; Mardi, S. N. S.; Hariadi, M.
2017-06-01
Internet messenger (IM) has become an important educational technology component in college education. IM makes it possible for students to engage in learning and collaboration in a smart virtual class learning (SVCL) environment using ubiquitous computing. However, models of IM-based SVCL using ubiquitous computing, and empirical evidence that would favor their broad application to improve engagement and behavior, are still limited. In addition, the expectation that IM-based SVCL using ubiquitous computing could improve engagement and behavior in a smart class cannot be confirmed, because the majority of the reviewed studies followed instruction paradigms. This article presents a model of IM-based SVCL using ubiquitous computing and reports learners' experiences of improved engagement and behavior in learner-learner and learner-lecturer interactions. The method applied in this paper includes a design process and quantitative analysis techniques, with the purpose of identifying scenarios of ubiquitous computing and capturing the impressions of learners and lecturers about engagement and behavior and their contribution to learning.
Computational Experiments for Science and Engineering Education
NASA Technical Reports Server (NTRS)
Xie, Charles
2011-01-01
How to integrate simulation-based engineering and science (SBES) into the science curriculum smoothly is a challenging question. For the importance of SBES to be appreciated, the core value of simulations-that they help people understand natural phenomena and solve engineering problems-must be taught. A strategy to achieve this goal is to introduce computational experiments to the science curriculum to replace or supplement textbook illustrations and exercises and to complement or frame hands-on or wet lab experiments. In this way, students will have an opportunity to learn about SBES without compromising other learning goals required by the standards and teachers will welcome these tools as they strengthen what they are already teaching. This paper demonstrates this idea using a number of examples in physics, chemistry, and engineering. These exemplary computational experiments show that it is possible to create a curriculum that is both deeper and wider.
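As one concrete illustration of such a computational experiment (our own minimal example, not taken from the paper), a student can numerically drop a sphere with air drag, read off its terminal velocity, and compare with the closed-form value. All parameters are illustrative assumptions.

```python
import numpy as np

m, g, k = 0.1, 9.81, 0.05      # mass (kg), gravity (m/s^2), drag coeff (kg/m)
dt, v = 1e-3, 0.0

for _ in range(20000):          # 20 s of simulated fall, Euler integration
    a = g - (k / m) * v**2      # quadratic drag
    v += a * dt

print(f"simulated terminal velocity: {v:.3f} m/s")
print(f"theoretical sqrt(m*g/k):     {np.sqrt(m * g / k):.3f} m/s")
```

Varying the drag coefficient and re-running the "experiment" mirrors the hands-on habit of changing one variable at a time, which is exactly the practice such computational experiments are meant to cultivate.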
Devitt, P; Cehic, D; Palmer, E
1998-06-01
Student teaching of surgery has been devolved from the university in an effort to increase and broaden undergraduate clinical experience. In order to ensure uniformity of learning we have defined learning objectives and provided a computer-based package to supplement clinical teaching. A study was undertaken to evaluate the place of computer-based learning in a clinical environment. Twelve modules were provided for study during a 6-week attachment. These covered clinical problems related to cardiology, neurosurgery and gastrointestinal haemorrhage. Eighty-four fourth-year students undertook a pre- and post-test assessment on these three topics as well as acute abdominal pain. No extra learning material on the latter topic was provided during the attachment. While all students showed significant improvement in performance in the post-test assessment, those who had access to the computer material performed significantly better than did the controls. Within the topics, students in both groups performed equally well on the post-test assessment of acute abdominal pain but the control group's performance was significantly lacking on the topic of gastrointestinal haemorrhage, suggesting that the bulk of learning on this subject came from the computer material and little from the clinical attachment. This type of learning resource can be used to supplement the student's clinical experience and at the same time monitor what they learn during clinical clerkships and identify areas of weakness.
Girls back off mathematics again: the views and experiences of girls in computer-based mathematics
NASA Astrophysics Data System (ADS)
Vale, Colleen
2002-12-01
The views and experiences of girls in two co-educational mathematics classrooms in which computers were regularly used were researched. Data were collected by observation and videotaping of lessons, a questionnaire, and interviews with students and teachers. In this paper, case studies of six girls are presented. Their 'stories' reveal a diversity of experiences and views and multiple gender identities. High-achieving girls persisted as "outsiders within", other girls "backed off", and exceptional girls challenged gender stereotypes. Implications for social justice in mathematics in the age of the superhighway are discussed.
Monte Carlo Simulations for a LEP Experiment with UNIX Workstation Clusters
NASA Astrophysics Data System (ADS)
Bonesini, M.; Calegari, A.; Rossi, P.; Rossi, V.
Modular systems of RISC-CPU-based computers have been implemented for large-scale production of Monte Carlo simulated events for the DELPHI experiment at CERN. From a pilot system based on DEC 5000 CPUs, a full-size system based on a CONVEX C3820 UNIX supercomputer and a cluster of HP 735 workstations has been put into operation as a joint effort between INFN Milano and CILEA.
Colling, D.; Britton, D.; Gordon, J.; Lloyd, S.; Doyle, A.; Gronbech, P.; Coles, J.; Sansum, A.; Patrick, G.; Jones, R.; Middleton, R.; Kelsey, D.; Cass, A.; Geddes, N.; Clark, P.; Barnby, L.
2013-01-01
The Large Hadron Collider (LHC) is one of the greatest scientific endeavours to date. The construction of the collider itself and the experiments that collect data from it represent a huge investment, both financially and in terms of human effort, in our hope to understand the way the Universe works at a deeper level. Yet the volumes of data produced are so large that they cannot be analysed at any single computing centre. Instead, the experiments have all adopted distributed computing models based on the LHC Computing Grid. Without the correct functioning of this grid infrastructure the experiments would not be able to understand the data that they have collected. Within the UK, the Grid infrastructure needed by the experiments is provided by the GridPP project. We report on the operations, performance and contributions made to the experiments by the GridPP project during the years of 2010 and 2011—the first two significant years of the running of the LHC. PMID:23230163
NASA Technical Reports Server (NTRS)
Workman, Gary L.; Cummings, Rick; Jones, Brian
1992-01-01
The microgravity materials processing program has been instrumental in providing the crystal growth community with an experimental environment for better understanding the phenomena associated with growing crystals. In many applications one may pursue the growth of large single crystals that cannot be grown on Earth because of convection-driven flows. A microgravity environment is characterized by the absence of both convection and buoyancy; consequently, superior crystals can be grown in space. On the other hand, since neither convection nor buoyancy dominates the fluid flow in a microgravity environment, weaker phenomena such as surface-tension-driven flows or diffusion-limited solidification can affect crystal growth. In the case of experiments that are to be flown in space using the Fluid Experiments System (FES), diffusion-limited growth should be the dominant phenomenon. Holographic and Schlieren optical techniques for studying the concentration gradients in solidification processes have been used by several investigators over the years. The Holographic Ground System (HGS) facility at MSFC has been a primary resource in researching this capability, and scientific personnel have been able to utilize these techniques in both ground-based research and space experiments. An important event in the scientific utilization of the HGS facilities was the TGS (triglycine sulfate) Crystal Growth and Casting and Solidification Technology (CAST) experiments flown on the International Microgravity Lab (IML) mission in March of this year. The preparation and processing of these space observations are the primary experiments reported in this work. This project provides ground-based studies to optimize the holographic techniques used to acquire information about the crystal growth processes flown on IML. Since the ground-based studies will be compared with the space-based experimental results, it is necessary to conduct sufficient ground-based studies to determine how well the experiment in space worked. The current capabilities of computer-based systems for image processing and numerical computation have certainly assisted in those efforts. As anticipated, this study has shown that these advanced computing capabilities are helpful in the data analysis of such experiments.
A computer-based measure of resultant achievement motivation.
Blankenship, V
1987-08-01
Three experiments were conducted to develop a computer-based measure of individual differences in resultant achievement motivation (RAM) on the basis of level-of-aspiration, achievement motivation, and dynamics-of-action theories. In Experiment 1, the number of atypical shifts and greater responsiveness to incentives on 21 trials with choices among easy, intermediate, and difficult levels of an achievement-oriented game were positively correlated and were found to differentiate the 62 subjects (31 men, 31 women) on the amount of time they spent at a nonachievement task (watching a color design) 1 week later. In Experiment 2, test-retest reliability was established with the use of 67 subjects (15 men, 52 women). Point and no-point trials were offered in blocks, with point trials first for half the subjects and no-point trials first for the other half. Reliability was higher for the atypical-shift measure than for the incentive-responsiveness measure and was higher when points were offered first. In Experiment 3, computer anxiety was manipulated by creating a simulated computer breakdown in the experimental condition. Fifty-nine subjects (13 men, 46 women) were randomly assigned to the experimental condition or to one of two control conditions (an interruption condition and a no-interruption condition). Subjects with low RAM, as demonstrated by a low number of typical shifts, took longer to choose the achievement-oriented task, as predicted by the dynamics-of-action theory. The difference was evident in all conditions and most striking in the computer-breakdown condition. A change of focus from atypical to typical shifts is discussed.
Howson, Moira; Ritchie, Linda; Carter, Philip D; Parry, David Tudor; Koziol-McLain, Jane
2016-01-01
Background The use of Web-based interventions to deliver mental health and behavior change programs is increasingly popular. They are cost-effective, accessible, and generally effective. Often these interventions concern psychologically sensitive and challenging issues, such as depression or anxiety. The process by which a person receives and experiences therapy is important to understanding therapeutic process and outcomes. While the experience of the patient or client in traditional face-to-face therapy has been evaluated in a number of ways, there appeared to be a gap in the evaluation of patient experiences of therapeutic interventions delivered online. Evaluation of Web-based artifacts has focused either on evaluation of experience from a computer Web-design perspective through usability testing or on evaluation of treatment effectiveness. Neither of these methods focuses on the psychological experience of the person while engaged in the therapeutic process. Objective This study aimed to investigate what methods, if any, have been used to evaluate the in situ psychological experience of users of Web-based self-help psychosocial interventions. Methods A systematic literature review was undertaken of interdisciplinary databases with a focus on health and computer sciences. Studies that met a predetermined search protocol were included. Results Among 21 studies identified that examined psychological experience of the user, only 1 study collected user experience in situ. The most common method of understanding users' experience was through semistructured interviews conducted posttreatment or questionnaires administered at the end of an intervention session. The questionnaires were usually based on standardized tools used to assess user experience with traditional face-to-face treatment. Conclusions There is a lack of methods specified in the literature to evaluate the interface between Web-based mental health or behavior change artifacts and users. Main limitations in the research were the nascency of the topic and the cross-disciplinary nature of the field. There is a need to develop and deliver methods of understanding users' psychological experiences while using an intervention. PMID:27363519
Computer Model Of Fragmentation Of Atomic Nuclei
NASA Technical Reports Server (NTRS)
Wilson, John W.; Townsend, Lawrence W.; Tripathi, Ram K.; Norbury, John W.; Khan, Ferdous; Badavi, Francis F.
1995-01-01
High Charge and Energy Semiempirical Nuclear Fragmentation Model (HZEFRG1) computer program developed to be computationally efficient, user-friendly, physics-based program for generating data bases on fragmentation of atomic nuclei. Data bases generated used in calculations pertaining to such radiation-transport applications as shielding against radiation in outer space, radiation dosimetry in outer space, cancer therapy in laboratories with beams of heavy ions, and simulation studies for designing detectors for experiments in nuclear physics. Provides cross sections for production of individual elements and isotopes in breakups of high-energy heavy ions by combined nuclear and Coulomb fields of interacting nuclei. Written in ANSI FORTRAN 77.
Method of mobile robot indoor navigation by artificial landmarks with use of computer vision
NASA Astrophysics Data System (ADS)
Glibin, E. S.; Shevtsov, A. A.; Enik, O. A.
2018-05-01
The article describes an algorithm for mobile robot indoor navigation based on visual odometry. The results of an experiment identifying errors in the computed distance traveled caused by wheel slip are presented. It is shown that the use of computer vision allows erroneous robot coordinates to be corrected with the help of artificial landmarks. The control system utilizing the proposed method has been realized on the basis of an Arduino Mega 2560 controller and a single-board computer Raspberry Pi 3. The results of the experiment on mobile robot navigation using this control system are presented.
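The correction idea can be sketched as follows (a minimal illustration, not the authors' algorithm): wheel odometry drifts when the wheels slip, so whenever the camera recognizes an artificial landmark with a known map position, the pose estimate is blended toward the landmark-derived position. The gain value is an illustrative assumption, and the camera offset is taken in a world-aligned frame for brevity.

```python
import numpy as np

LANDMARKS = {7: np.array([2.0, 3.5])}    # marker id -> known map position (m)

pose = np.array([0.0, 0.0])              # odometry-integrated position

def on_odometry(delta):
    """Accumulate a wheel-odometry displacement (drifts on slip)."""
    global pose
    pose = pose + delta

def on_landmark(marker_id, offset):
    """Correct the pose from the camera-measured offset to a known marker."""
    global pose
    measured = LANDMARKS[marker_id] - offset   # implied robot position
    gain = 0.8                                 # trust placed in the vision fix
    pose = (1 - gain) * pose + gain * measured

on_odometry(np.array([2.2, 3.2]))        # slightly wrong due to slip
on_landmark(7, np.array([-0.1, 0.2]))    # marker observed at this offset
print(pose)                              # pulled toward the true position
```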
Experiences of Student Mathematics-Teachers in Computer-Based Mathematics Learning Environment
ERIC Educational Resources Information Center
Karatas, Ilhan
2011-01-01
Computer technology in mathematics education enables students to find many opportunities for investigating mathematical relationships, hypothesizing, and making generalizations. These opportunities were provided to pre-service teachers through a faculty course. At the end of the course, the teachers were assigned project tasks involving…
Classical boson sampling algorithms with superior performance to near-term experiments
NASA Astrophysics Data System (ADS)
Neville, Alex; Sparrow, Chris; Clifford, Raphaël; Johnston, Eric; Birchall, Patrick M.; Montanaro, Ashley; Laing, Anthony
2017-12-01
It is predicted that quantum computers will dramatically outperform their conventional counterparts. However, large-scale universal quantum computers are yet to be built. Boson sampling is a rudimentary quantum algorithm tailored to the platform of linear optics, which has sparked interest as a rapid way to demonstrate such quantum supremacy. Photon statistics are governed by intractable matrix functions, which suggests that sampling from the distribution obtained by injecting photons into a linear optical network could be solved more quickly by a photonic experiment than by a classical computer. The apparently low resource requirements for large boson sampling experiments have raised expectations of a near-term demonstration of quantum supremacy by boson sampling. Here we present classical boson sampling algorithms and theoretical analyses of prospects for scaling boson sampling experiments, showing that near-term quantum supremacy via boson sampling is unlikely. Our classical algorithm, based on Metropolised independence sampling, allowed the boson sampling problem to be solved for 30 photons with standard computing hardware. Compared to current experiments, a demonstration of quantum supremacy over a successful implementation of these classical methods on a supercomputer would require the number of photons and experimental components to increase by orders of magnitude, while tackling exponentially scaling photon loss.
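The "intractable matrix functions" here are permanents. For a small self-contained illustration (our own sketch, not the paper's algorithm), Ryser's formula computes the permanent in O(2^n · n^2) time, exponential in the number of photons n:

```python
from itertools import combinations

def permanent(a):
    """Ryser's formula for an n x n matrix given as nested lists."""
    n = len(a)
    total = 0.0
    for k in range(1, n + 1):
        for cols in combinations(range(n), k):
            prod = 1.0
            for row in a:
                prod *= sum(row[c] for c in cols)
            total += (-1) ** k * prod
    return (-1) ** n * total

print(permanent([[1, 2], [3, 4]]))   # 1*4 + 2*3 = 10
```

The paper's point is that clever classical sampling (Metropolised independence sampling) pushes the practical crossover far beyond current photon counts, despite this exponential core.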
NASA Astrophysics Data System (ADS)
Xie, Lizhe; Hu, Yining; Chen, Yang; Shi, Luyao
2015-03-01
Projection and back-projection are the most computationally consuming parts of computed tomography (CT) reconstruction, and parallelization strategies using GPU computing techniques have been introduced to address this. In this paper we present a new parallelization scheme for both projection and back-projection. The proposed method is based on the CUDA technology of NVIDIA Corporation. Instead of building a complex model, we aimed at optimizing the existing algorithm and making it suitable for CUDA implementation so as to gain fast computation speed. Besides making use of texture fetching, which yields faster interpolation, we fixed the number of samples in the computation of the projection to ensure the synchronization of blocks and threads, thus preventing the latency caused by inconsistent computational complexity. Experimental results demonstrate the computational efficiency and imaging quality of the proposed method.
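A minimal sketch of the fixed-sample idea, written in Python with Numba's CUDA support rather than raw CUDA C (an illustration under simplified geometry, not the paper's kernel): every ray accumulates exactly the same number of samples, so all threads stay in lockstep.

```python
import numpy as np
from numba import cuda

N_SAMPLES = 256        # fixed per-ray sample count keeps threads in lockstep

@cuda.jit
def project(image, ray_starts, ray_dirs, sinogram):
    r = cuda.grid(1)
    if r < sinogram.size:
        acc = 0.0
        for s in range(N_SAMPLES):            # same trip count for every thread
            x = ray_starts[r, 0] + s * ray_dirs[r, 0]
            y = ray_starts[r, 1] + s * ray_dirs[r, 1]
            ix, iy = int(x), int(y)           # nearest neighbor; the paper uses
            if 0 <= ix < image.shape[0] and 0 <= iy < image.shape[1]:
                acc += image[ix, iy]          # hardware texture interpolation
        sinogram[r] = acc

image = np.ones((256, 256), dtype=np.float32)
starts = np.zeros((180, 2), dtype=np.float32)
dirs = np.tile(np.float32([1.0, 1.0]), (180, 1))   # one-pixel diagonal steps
sino = np.zeros(180, dtype=np.float32)
project[(180 + 63) // 64, 64](image, starts, dirs, sino)
```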
13C-based metabolic flux analysis: fundamentals and practice.
Yang, Tae Hoon
2013-01-01
Isotope-based metabolic flux analysis is one of the emerging technologies applied to system-level metabolic phenotype characterization in metabolic engineering. Among the developed approaches, 13C-based metabolic flux analysis has been established as a standard tool and has been widely applied to quantitative pathway characterization of diverse biological systems. To implement 13C-based metabolic flux analysis in practice, comprehending the underlying mathematical and computational modeling fundamentals is important, along with carefully conducted experiments and analytical measurements. Such knowledge is also crucial when designing 13C-labeling experiments and properly acquiring the key data sets essential for in vivo flux analysis. In this regard, the modeling fundamentals of 13C-labeling systems and analytical data processing are the main topics we deal with in this chapter. Along with this, the relevant numerical optimization techniques are addressed to help implement the entire computational procedure of 13C-based metabolic flux analysis in vivo.
The Use of Humor in a CBI Science Lesson To Enhance Retention.
ERIC Educational Resources Information Center
Snetsinger, Wendy; Grabowski, Barbara
This research experiment studied the effect of humor versus non-humor on learning and retention of a computer-based instructional (CBI) lesson on tick identification. The experiment also surveyed the subjects' enjoyment of the lesson material, their personal experiences with ticks, and their concerns about ticks and tick-borne diseases.…
The Historical and Situated Nature Design Experiments--Implications for Data Analysis
ERIC Educational Resources Information Center
Krange, I.; Ludvigsen, Sten
2009-01-01
This article is a methodological contribution to the use of design experiments in educational research. We discuss the implications of a historical and situated interpretation of design experiments, the consequences this has for the analysis of the collected data, and empirically based suggestions to improve the designs of the computer-based…
Centrifugal Pump Experiment for Chemical Engineering Undergraduates
ERIC Educational Resources Information Center
Vanderslice, Nicholas; Oberto, Richard; Marrero, Thomas R.
2012-01-01
The purpose of this paper is to describe a Centrifugal Pump Experiment that provided an experiential learning experience to chemical engineering undergraduates at the University of Missouri in the spring of 2010 in the Unit Operations Laboratory course. Lab equipment was used by senior students with computer-based data and control technology. In…
Advanced Tools for Smartphone-Based Experiments: Phyphox
ERIC Educational Resources Information Center
Staacks, S.; Hütz, S.; Stampfer, C.; Heinke, H.
2018-01-01
The sensors in modern smartphones are a promising and cost-effective tool for experimentation in physics education, but many experiments face practical problems. Often the phone is inaccessible during the experiment and the data usually needs to be analyzed subsequently on a computer. We address both problems by introducing a new app, called…
The Composition of Dramatic Experience: The Play Element in Student Electronic Projects.
ERIC Educational Resources Information Center
Rouzie, Albert
2000-01-01
Analyzes two student group Hypertext projects and a MOO (multiuser domain, object-oriented) project. Finds the play element can enrich readerly experience through the creation of a dramatic experience of information. Suggests that composition instructors need to recognize the play element in computer-based composition and encourage the development…
Experiments on Interfaces To Support Query Expansion.
ERIC Educational Resources Information Center
Beaulieu, M.
1997-01-01
Focuses on the user and human-computer interaction aspects of the research based on the Okapi text retrieval system. Three experiments implementing different approaches to query expansion are described, including the use of graphical user interfaces with different windowing techniques. (Author/LRW)
NASA Astrophysics Data System (ADS)
Řidký, V.; Šidlof, P.; Vlček, V.
2013-04-01
The work is devoted to comparing measured data with the results of numerical simulations. The mathematical model used was a laminar model (without turbulence) for incompressible flow. In the experiment, the behavior of a designed NACA0015 airfoil in an airflow was observed. For the numerical solution, the OpenFOAM computational package was used; this is open-source software based on the finite volume method. In the numerical solution, the displacement of the airfoil is prescribed to correspond to the experiment. The velocity at a point close to the airfoil surface is compared with the experimental data obtained from interferographic measurements of the velocity field. The numerical solution is computed on a 3D mesh composed of about 1 million orthogonal hexahedral elements. The time step is limited by the Courant number. Parallel computations are run on the supercomputers of the CIV at the Technical University in Prague (HAL and FOX) and on a computer cluster of the Faculty of Mechatronics in Liberec (HYDRA). The run time is fixed at five periods; the results from the fifth period and the average over all periods are then compared with the experiment.
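For reference, the Courant-number constraint mentioned above takes its usual form, with u the local velocity, Δt the time step, and Δx the cell size (the limit of about 1 is the conventional choice for explicit schemes; the abstract does not state the exact setting used):

```latex
\mathrm{Co} = \frac{|u|\,\Delta t}{\Delta x} \le \mathrm{Co}_{\max} \approx 1
```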
Peng, Kuan; He, Ling; Zhu, Ziqiang; Tang, Jingtian; Xiao, Jiaying
2013-12-01
Compared with commonly used analytical reconstruction methods, the frequency-domain finite element method (FEM) based approach has proven to be an accurate and flexible algorithm for photoacoustic tomography. However, the FEM-based algorithm is computationally demanding, especially for three-dimensional cases. To enhance the algorithm's efficiency, in this work a parallel computational strategy is implemented in the framework of the FEM-based reconstruction algorithm using a graphics-processing-unit parallel framework named Compute Unified Device Architecture (CUDA). A series of simulation experiments is carried out to test the accuracy and the accelerating effect of the improved method. The results obtained indicate that the parallel calculation does not change the accuracy of the reconstruction algorithm, while its computational cost is significantly reduced, by a factor of 38.9, with a GTX 580 graphics card.
Federated data storage system prototype for LHC experiments and data intensive science
NASA Astrophysics Data System (ADS)
Kiryanov, A.; Klimentov, A.; Krasnopevtsev, D.; Ryabinkin, E.; Zarochentsev, A.
2017-10-01
The rapid increase of the data volume from the experiments running at the Large Hadron Collider (LHC) has prompted the physics computing community to evaluate new data handling and processing solutions. Russian grid sites and university clusters scattered over a large area aim at the task of uniting their resources for future productive work, at the same time giving an opportunity to support large physics collaborations. In our project we address the fundamental problem of designing a computing architecture that integrates distributed storage resources for LHC experiments and other data-intensive science applications and provides access to data from heterogeneous computing facilities. Studies include development and implementation of a federated data storage prototype for Worldwide LHC Computing Grid (WLCG) centres of different levels and university clusters within one National Cloud. The prototype is based on computing resources located in Moscow, Dubna, Saint Petersburg, Gatchina and Geneva. This project intends to implement a federated distributed storage for all kinds of operations such as read/write/transfer and access via WAN from Grid centres, university clusters, supercomputers, academic and commercial clouds. The efficiency and performance of the system are demonstrated using synthetic and experiment-specific tests, including real data processing and analysis workflows from the ATLAS and ALICE experiments, as well as compute-intensive bioinformatics applications (PALEOMIX) running on supercomputers. We present the topology and architecture of the designed system, report performance and statistics for different access patterns, and show how federated data storage can be used efficiently by physicists and biologists. We also describe how sharing data on a widely distributed storage system can lead to a new computing model and reform computing styles, for instance allowing bioinformatics programs running on supercomputers to read and write data from the federated storage.
1982-01-01
physical reasoning and based on computational experience with similar equations. There is another non-automatic way: through proper scaling of all...(1979) for an automatic scheme for this scaling on a digital computer. Shampine (1980) reports a special definition of stiffness appropriate for...an analog for a laboratory that typically already has a digital computer. The digital is much more versatile. Also there does not yet exist software
A binary motor imagery tasks based brain-computer interface for two-dimensional movement control
NASA Astrophysics Data System (ADS)
Xia, Bin; Cao, Lei; Maysam, Oladazimi; Li, Jie; Xie, Hong; Su, Caixia; Birbaumer, Niels
2017-12-01
Objective. Two-dimensional movement control is a popular issue in brain-computer interface (BCI) research and has many applications in the real world. In this paper, we introduce a combined control strategy for a binary-class BCI system that allows the user to move a cursor in a two-dimensional (2D) plane. Users focus on a single moving vector to control 2D movement instead of controlling vertical and horizontal movement separately. Approach. Five participants took part in a fixed-target experiment and a random-target experiment to verify the effectiveness of the combined control strategy under fixed and random routine conditions. Both experiments were performed in a virtual 2D environment, and visual feedback was provided on the screen. Main results. The five participants achieved average hit rates of 98.9% and 99.4% for the fixed-target experiment and the random-target experiment, respectively. Significance. The results demonstrate that participants could move the cursor in the 2D plane effectively. The proposed control strategy is based only on a basic binary motor imagery BCI, which enables more people to use it in real-life applications.
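One plausible "single moving vector" scheme can be sketched as follows (an assumption for illustration, not the paper's exact rule): one decoded motor-imagery class rotates the movement vector and the other advances the cursor along it, so two classes suffice for 2D control. Turn angle and speed are arbitrary choices.

```python
import math

def step(cursor, heading, command, turn=math.radians(15), speed=5.0):
    """command: 0 = rotate the heading, 1 = move along the current heading."""
    if command == 0:
        heading += turn
    else:
        cursor = (cursor[0] + speed * math.cos(heading),
                  cursor[1] + speed * math.sin(heading))
    return cursor, heading

cursor, heading = (0.0, 0.0), 0.0
for cmd in [0, 0, 1, 1, 0, 1]:        # a stream of decoded binary MI commands
    cursor, heading = step(cursor, heading, cmd)
print(cursor, math.degrees(heading))
```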
Qayumi, A K; Kurihara, Y; Imai, M; Pachev, G; Seo, H; Hoshino, Y; Cheifetz, R; Matsuura, K; Momoi, M; Saleem, M; Lara-Guerra, H; Miki, Y; Kariya, Y
2004-10-01
This study aimed to compare the effects of computer-assisted, text-based and computer-and-text learning conditions on the performances of 3 groups of medical students in the pre-clinical years of their programme, taking into account their academic achievement to date. A fourth group of students served as a control (no-study) group. Participants were recruited from the pre-clinical years of the training programmes in 2 medical schools in Japan, Jichi Medical School near Tokyo and Kochi Medical School near Osaka. Participants were randomly assigned to 4 learning conditions and tested before and after the study on their knowledge of and skill in performing an abdominal examination, in a multiple-choice test and an objective structured clinical examination (OSCE), respectively. Information about performance in the programme was collected from school records and students were classified as average, good or excellent. Student and faculty evaluations of their experience in the study were explored by means of a short evaluation survey. Compared to the control group, all 3 study groups exhibited significant gains in performance on knowledge and performance measures. For the knowledge measure, the gains of the computer-assisted and computer-assisted plus text-based learning groups were significantly greater than the gains of the text-based learning group. The performances of the 3 groups did not differ on the OSCE measure. Analyses of gains by performance level revealed that high achieving students' learning was independent of study method. Lower achieving students performed better after using computer-based learning methods. The results suggest that computer-assisted learning methods will be of greater help to students who do not find the traditional methods effective. Explorations of the factors behind this are a matter for future research.
Effects on Training Using Illumination in Virtual Environments
NASA Technical Reports Server (NTRS)
Maida, James C.; Novak, M. S. Jennifer; Mueller, Kristian
1999-01-01
Camera-based tasks are commonly performed during orbital operations, and orbital lighting conditions, such as high-contrast shadowing and glare, are a factor in performance. Computer-based training using virtual environments is a common tool for making and keeping crew members proficient. If computer-based training included some of these harsh lighting conditions, would the crew increase their proficiency? The project goal was to determine whether computer-based training increases proficiency when one trains for a camera-based task using computer-generated virtual environments with enhanced lighting conditions, such as shadows and glare, rather than the color-shaded computer images normally used in simulators. Previous experiments were conducted using a two-degree-of-freedom docking system: test subjects had to align a boresight camera using a two-axis hand controller. Two sets of subjects were trained on two computer simulations using computer-generated virtual environments, one with simulated lighting and one without. Results revealed that when subjects were constrained by time and accuracy, those who trained with simulated lighting conditions performed significantly better than those who did not. To reinforce these results for speed and accuracy, the task complexity was increased.
Assessment of Collaborative Learning Experiences by Graphical Analysis of Wiki Contributions
ERIC Educational Resources Information Center
Palomo-Duarte, Manuel; Dodero, Juan Manuel; Medina-Bulo, Inmaculada; Rodríguez-Posada, Emilio J.; Ruiz-Rube, Iván
2014-01-01
The widespread adoption of computers and Internet in our life has reached the classrooms, where computer-supported collaborative learning (CSCL) based on wikis offers new ways of collaboration and encourages student participation. When the number of contributions from students increases, traditional assessment procedures of e-learning settings…
Matter Gravitates, but Does Gravity Matter?
ERIC Educational Resources Information Center
Groetsch, C. W.
2011-01-01
The interplay of physical intuition, computational evidence, and mathematical rigor in a simple trajectory model is explored. A thought experiment based on the model is used to elicit student conjectures on the influence of a physical parameter; a mathematical model suggests a computational investigation of the conjectures, and rigorous analysis…
A Monte Carlo Simulation of Brownian Motion in the Freshman Laboratory
ERIC Educational Resources Information Center
Anger, C. D.; Prescott, J. R.
1970-01-01
Describes a "dry-lab" experiment for the college freshman laboratory in which the essential features of Brownian motion are simulated from first principles using the Monte Carlo technique. Calculations are carried out by a computation scheme based on a computer language. Bibliography. (LC)
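As an illustration of the kind of calculation such a dry-lab exercise involves, here is a minimal Monte Carlo random walk in Python (the article's computer language is not named); the fixed step length, two-dimensional geometry, and walker counts are assumptions chosen for clarity, not details from the article.

```python
import random
import math

def brownian_walk(n_steps, step=1.0):
    """Random walk in 2D: each Monte Carlo step moves the particle
    a fixed distance in a uniformly random direction."""
    x = y = 0.0
    for _ in range(n_steps):
        theta = random.uniform(0.0, 2.0 * math.pi)
        x += step * math.cos(theta)
        y += step * math.sin(theta)
    return x, y

# Estimate the mean squared displacement over many walkers; for
# Brownian motion it grows linearly with the number of steps.
n_walkers, n_steps = 1000, 500
msd = sum(x * x + y * y for x, y in
          (brownian_walk(n_steps) for _ in range(n_walkers))) / n_walkers
print(f"<r^2> after {n_steps} steps: {msd:.1f} (theory: {n_steps * 1.0**2})")
```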
Distriblets: Java-Based Distributed Computing on the Web.
ERIC Educational Resources Information Center
Finkel, David; Wills, Craig E.; Brennan, Brian; Brennan, Chris
1999-01-01
Describes a system, written in the Java programming language, for using the World Wide Web to distribute computational tasks to multiple hosts on the Web. Describes the programs written to carry out the load distribution, the structure of a "distriblet" class, and experiences in using this system. (Author/LRW)
Training Programs in Applications Software.
ERIC Educational Resources Information Center
Modianos, Doan T.; Cornwell, Larry W.
1988-01-01
Description of training programs for using business applications software highlights implementing programs for Lotus 1-2-3 and dBASE III Plus. The amount of computer experience of the users and the difference in training methods needed are discussed, and the use of a Macintosh computer for producing notes is explained. (LRW)
Divergence of Digital World of Teachers
ERIC Educational Resources Information Center
Uzunboylu, Huseyin; Tuncay, Nazime
2010-01-01
There exists great diversity in teachers' digital world. Teachers are divided by numerous educational gaps. This paper seeks to assess the extent of the digital divide among North Cyprus vocational teachers along four axes: age, Internet access, computer access, and performance (computing knowledge/experience). A…
High level language for measurement complex control based on the computer E-100I
NASA Technical Reports Server (NTRS)
Zubkov, B. V.
1980-01-01
A high level language was designed to control the process of conducting an experiment using the computer "Elektronika-100I". Program examples are given to control the measuring and actuating devices. The procedure of including these programs in the suggested high level language is described.
Distributed Training for the Reserve Component: Instructor Handbook for Computer Conferencing.
ERIC Educational Resources Information Center
Harbour, J.; And Others
The purpose of this handbook is to provide background and teaching recommendations for instructors who will be remotely conducting Reserve Component training using asynchronous computer conferencing techniques. The recommendations in this handbook are based on an international review of the literature in distance learning and experience gained…
Computer-based learning in neuroanatomy: A longitudinal study of learning, transfer, and retention
NASA Astrophysics Data System (ADS)
Chariker, Julia H.
A longitudinal experiment was conducted to explore computer-based learning of neuroanatomy. Using a realistic 3D graphical model of neuroanatomy, and sections derived from the model, exploratory graphical tools were integrated into interactive computer programs so as to allow adaptive exploration. 72 participants learned either sectional anatomy alone or learned whole anatomy followed by sectional anatomy. Sectional anatomy was explored either in perceptually continuous animation or discretely, as in the use of an anatomical atlas. Learning was measured longitudinally to a high performance criterion. After learning, transfer to biomedical images and long-term retention were tested. Learning whole anatomy prior to learning sectional anatomy led to a more efficient learning experience. Learners demonstrated high levels of transfer from whole anatomy to sectional anatomy and from sectional anatomy to complex biomedical images. All learning groups demonstrated high levels of retention at 2-3 weeks.
Nair, Sankaran N; Czaja, Sara J; Sharit, Joseph
2007-06-01
This article explores the role of age, cognitive abilities, prior experience, and knowledge in skill acquisition for a computer-based simulated customer service task. Fifty-two participants aged 50-80 performed the task over 4 consecutive days following training. They also completed a battery that assessed prior computer experience and cognitive abilities. The data indicated that overall quality and efficiency of performance improved with practice. The predictors of initial level of performance and rate of change in performance varied according to the performance parameter assessed. Age and fluid intelligence predicted initial level and rate of improvement in overall quality, whereas crystallized intelligence and age predicted initial e-mail processing time, and crystallized intelligence predicted rate of change in e-mail processing time over days. We discuss the implications of these findings for the design of intervention strategies.
Modeling the fusion of cylindrical bioink particles in post bioprinting structure formation
NASA Astrophysics Data System (ADS)
McCune, Matt; Shafiee, Ashkan; Forgacs, Gabor; Kosztin, Ioan
2015-03-01
Cellular Particle Dynamics (CPD) is an effective computational method to describe the shape evolution and biomechanical relaxation processes in multicellular systems. Thus, CPD is a useful tool to predict the outcome of post-printing structure formation in bioprinting. The predictive power of CPD has been demonstrated for multicellular systems composed of spherical bioink units. Experiments and computer simulations were related through an independently developed theoretical formalism based on continuum mechanics. Here we generalize the CPD formalism to (i) include cylindrical bioink particles often used in specific bioprinting applications, (ii) describe the more realistic experimental situation in which both the length and the volume of the cylindrical bioink units decrease during post-printing structure formation, and (iii) directly connect CPD simulations to the corresponding experiments without the need of the intermediate continuum theory inherently based on simplifying assumptions. Work supported by NSF [PHY-0957914]. Computer time provided by the University of Missouri Bioinformatics Consortium.
Challenges in scaling NLO generators to leadership computers
NASA Astrophysics Data System (ADS)
Benjamin, D.; Childers, JT; Hoeche, S.; LeCompte, T.; Uram, T.
2017-10-01
Exascale computing resources are roughly a decade away and will be capable of 100 times more computing than current supercomputers. In the last year, Energy Frontier experiments crossed a milestone of 100 million core-hours used at the Argonne Leadership Computing Facility, Oak Ridge Leadership Computing Facility, and NERSC. The Fortran-based leading-order parton generator called Alpgen was successfully scaled to millions of threads to achieve this level of usage on Mira. Sherpa and MadGraph are next-to-leading order generators used heavily by LHC experiments for simulation. Integration times for high-multiplicity or rare processes can take a week or more on standard Grid machines, even when using all 16 cores. We will describe our ongoing work to scale the Sherpa generator to thousands of threads on leadership-class machines and reduce run-times to less than a day. This work allows the experiments to leverage large-scale parallel supercomputers for event generation today, freeing tens of millions of grid hours for other work, and paving the way for future applications (simulation, reconstruction) on these and future supercomputers.
Hwang, Han-Jeong; Han, Chang-Hee; Lim, Jeong-Hwan; Kim, Yong-Wook; Choi, Soo-In; An, Kwang-Ok; Lee, Jun-Hak; Cha, Ho-Seung; Kim, Seung Hyun; Im, Chang-Hwan
2017-03-01
Although the feasibility of brain-computer interface (BCI) systems based on steady-state visual evoked potential (SSVEP) has been extensively investigated, only a few studies have evaluated its clinical feasibility in patients with locked-in syndrome (LIS), who are the main targets of BCI technology. The main objective of this case report was to share our experiences of SSVEP-based BCI experiments involving five patients with LIS, thereby providing researchers with useful information that can potentially help them to design BCI experiments for patients with LIS. In our experiments, a four-class online SSVEP-based BCI system was implemented and applied to four of five patients repeatedly on multiple days to investigate its test-retest reliability. In the last experiments with two of the four patients, the practical usability of our BCI system was tested using a questionnaire survey. All five patients showed clear and distinct SSVEP responses at all four fundamental stimulation frequencies (6, 6.66, 7.5, 10 Hz), and responses at harmonic frequencies were also observed in three patients. Mean classification accuracy was 76.99% (chance level = 25%). The test-retest reliability experiments demonstrated stable performance of our BCI system over different days even when the initial experimental settings (e.g., electrode configuration, fixation time, visual angle) used in the first experiment were used without significant modifications. Our results suggest that SSVEP-based BCI paradigms might be successfully used to implement clinically feasible BCI systems for severely paralyzed patients. © 2016 Society for Psychophysiological Research.
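The report above does not spell out its classification algorithm, but a common baseline for a four-class SSVEP system of this kind is to compare spectral power at the stimulation frequencies (and their harmonics, which the report notes were present in some patients) and pick the largest. A minimal sketch under that assumption, with the sampling rate, epoch length, and single-channel input chosen purely for illustration:

```python
import numpy as np

def classify_ssvep(eeg, fs, freqs=(6.0, 6.66, 7.5, 10.0), harmonics=2):
    """Pick the stimulation frequency with the most spectral power.
    eeg: 1-D array (single channel); fs: sampling rate in Hz."""
    spectrum = np.abs(np.fft.rfft(eeg)) ** 2
    bins = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    scores = []
    for f in freqs:
        power = 0.0
        for h in range(1, harmonics + 1):
            idx = np.argmin(np.abs(bins - h * f))  # nearest FFT bin
            power += spectrum[idx]
        scores.append(power)
    return int(np.argmax(scores))  # index of the detected class

# Synthetic check: a noisy 7.5 Hz response should yield class 2.
fs = 250.0
t = np.arange(0, 4, 1 / fs)
eeg = np.sin(2 * np.pi * 7.5 * t) + 0.5 * np.random.randn(t.size)
print(classify_ssvep(eeg, fs))
```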
A comparison of CLIPS- and LISP-based approaches to the development of a real-time expert system
NASA Technical Reports Server (NTRS)
Frainier, R.; Groleau, N.; Bhatnagar, R.; Lam, C.; Compton, M.; Colombano, S.; Lai, S.; Szolovits, P.; Manahan, M.; Statler, I.
1990-01-01
This paper describes an ongoing expert system development effort started in 1988 which is evaluating both CLIPS- and LISP-based approaches. The expert system is being developed to a project schedule and is planned for flight on Space Shuttle Mission SLS-2 in 1992. The expert system will help astronauts do the best possible science for a vestibular physiology experiment already scheduled for that mission. The system gathers and reduces data from the experiment, flags 'interesting' results, and proposes changes in the experiment both to exploit the in-flight observations and to stay within the time allowed by Mission Control for the experiment. These tasks must all be performed in real time. Two Apple Macintosh computers are used. The CLIPS- and LISP-based environments are layered above the Macintosh computer Operating System. The 'CLIPS-based' environment includes CLIPS and HyperCard. The LISP-based environment includes Common LISP, Parmenides (a frame system), and FRuleKit (a rule system). Important evaluation factors include ease of programming, performance against real-time requirements, usability by an astronaut, robustness, and ease of maintenance. Current results on the factors of ease of programming, performance against real-time requirements, and ease of maintenance are discussed.
Organization of the secure distributed computing based on multi-agent system
NASA Astrophysics Data System (ADS)
Khovanskov, Sergey; Rumyantsev, Konstantin; Khovanskova, Vera
2018-04-01
Nowadays, developing methods for distributed computing receives much attention, and one such method is the use of multi-agent systems. Distributed computing organized over conventional networked computers can face security threats posed by the computational processes themselves. The authors have developed a unified agent algorithm for a control system governing the operation of computing network nodes, with networked PCs used as the computing nodes. The proposed multi-agent control system makes it possible, in a short time, to harness the processing power of the computers on any existing network to solve large tasks through distributed computing. Agents deployed on a computer network can configure the distributed computing system, distribute the computational load among the agent-operated computers, and optimize the system according to the computing power of the computers on the network. Connecting additional computers to the system increases the overall processing power. Adding a central agent to the multi-agent system increases the security of the distributed computing. This organization of the distributed computing system reduces problem-solving time and increases the fault tolerance (vitality) of computing processes in a changing computing environment (dynamic change in the number of computers on the network). The developed multi-agent system also detects falsification of the results of the distributed system, which could otherwise lead to wrong decisions, and checks and corrects wrong results.
NASA Astrophysics Data System (ADS)
Ge, Yuanzheng; Chen, Bin; Liu, Liang; Qiu, Xiaogang; Song, Hongbin; Wang, Yong
2018-02-01
An individual-based computational environment provides an effective solution for studying complex social events by reconstructing scenarios. Challenges remain in reconstructing the virtual scenarios and reproducing the complex evolution. In this paper, we propose a framework to reconstruct a synthetic computational environment, reproduce an epidemic outbreak, and evaluate management interventions in a virtual university. The reconstructed computational environment includes 4 fundamental components: the synthetic population, behavior algorithms, multiple social networks, and the geographic campus environment. In the virtual university, influenza H1N1 transmission experiments are conducted, and gradually enhanced interventions are evaluated and compared quantitatively. The experiment results indicate that the reconstructed virtual environment provides a solution to reproduce complex emergencies and evaluate policies to be executed in the real world.
SPREADSHEET-BASED PROGRAM FOR ERGONOMIC ADJUSTMENT OF NOTEBOOK COMPUTER AND WORKSTATION SETTINGS.
Nanthavanij, Suebsak; Prae-Arporn, Kanlayanee; Chanjirawittaya, Sorajak; Paripoonyo, Satirajit; Rodloy, Somsak
2015-06-01
This paper discusses a computer program, ErgoNBC, which provides suggestions regarding the ergonomic settings of a notebook computer (NBC), workstation components, and selected accessories in order to help computer users assume an appropriate work posture during NBC work. From the user's body height and NBC and workstation component data, ErgoNBC computes the recommended tilt angle of the NBC base unit, NBC screen angle, distance between the user and NBC, seat height, and work surface height. If necessary, an NBC base support, seat cushion, and footrest, including their settings, are recommended. An experiment involving twenty-four university students was conducted to evaluate the recommendations provided by ErgoNBC. The Rapid Upper Limb Assessment (RULA) technique was used to analyze their work postures both before and after implementing the ErgoNBC recommendations. The results clearly showed that ErgoNBC could significantly help to improve the subjects' work postures.
Sönksen, P; Williams, C
1996-07-01
In this article we have stressed that a diabetes care information system should be useful to, usable and actually used by carers at the point of patient contact. Information resulting from such encounters should, at no extra cost, furnish the needs of communication, audit, research and management. Diabeta is a clinical record system for supporting the management of patients with diabetes. It has grown 'organically' within an academic clinical unit over a period of 23 years. It is used for each and every encounter with the clinicians in our diabetes team and as such, contains an immense amount of objective clinical experience. This experience can be interrogated very easily by computer-naive clinicians using a remarkable interactive program ('Datascan') which contains statistical procedures 'embedded' in the APL computer code, eliminating the need to 'export' the data into a statistical package. The latest PC-based version is incredibly fast and this immense amount of clinical experience can be carried around on a notebook PC and be available for exploration at any time. This makes 'evidence-based medicine' available in a remarkably flexible way since it shares the accumulated objective experience of literally 'dozens' of clinicians over a period which now extends to 23 years. It adds a completely new dimension to the term 'clinical experience' and is unattainable with manual records. It would be naive to assume that such systems are easy to design, build or implement, or that the initial capital outlay required will be small although costs are falling continuously. Medicine is a highly complex activity, the essential basis of which is human interaction. Introduction of a technology into this interaction requires sensitivity to the wishes and requirements of individuals, and protection of their exchanges from third parties. The potential of computers in diabetes care is so great that these issues must be addressed through continuing research, development, evaluation and funding of new systems. This must be led by the medical profession not the computer industry.
Gene Expression Analysis: Teaching Students to Do 30,000 Experiments at Once with Microarray
ERIC Educational Resources Information Center
Carvalho, Felicia I.; Johns, Christopher; Gillespie, Marc E.
2012-01-01
Genome scale experiments routinely produce large data sets that require computational analysis, yet there are few student-based labs that illustrate the design and execution of these experiments. In order for students to understand and participate in the genomic world, teaching labs must be available where students generate and analyze large data…
QPSO-Based Adaptive DNA Computing Algorithm
Karakose, Mehmet; Cigdem, Ugur
2013-01-01
DNA (deoxyribonucleic acid) computing, a new computation model that uses DNA molecules for information storage, has been increasingly used for optimization and data analysis in recent years. However, the DNA computing algorithm has some limitations in terms of convergence speed, adaptability, and effectiveness. In this paper, a new approach for the improvement of DNA computing is proposed. This new approach aims to run the DNA computing algorithm with parameters adapted towards the desired goal using quantum-behaved particle swarm optimization (QPSO). The contributions of the proposed QPSO-based adaptive DNA computing algorithm are as follows: (1) the population size, crossover rate, maximum number of operations, enzyme and virus mutation rates, and fitness function of the DNA computing algorithm are simultaneously tuned in an adaptive process; (2) the adaptive algorithm uses QPSO for goal-driven progress, faster operation, and flexibility in data; and (3) a numerical realization of the DNA computing algorithm with the proposed approach is implemented in system identification. Two experiments with different systems were carried out to evaluate the performance of the proposed approach, with comparative results. Experimental results obtained with Matlab and FPGA demonstrate the ability to provide effective optimization, considerable convergence speed, and high accuracy relative to the plain DNA computing algorithm. PMID:23935409
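For readers unfamiliar with the tuning engine named above, the distinctive part of QPSO is its position update, which replaces classical PSO's velocity rule with sampling around an attractor. A minimal sketch, assuming a generic minimization problem rather than the paper's DNA-computing parameter tuning; the bounds, swarm size, and objective are placeholders:

```python
import numpy as np

def qpso(objective, dim, n_particles=30, iters=200, beta=0.75, seed=0):
    """Minimize `objective` with quantum-behaved PSO. Each particle is
    redrawn around an attractor p (a random mix of its personal best and
    the global best), with a spread set by the mean of all personal
    bests ("mbest")."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (n_particles, dim))
    pbest = x.copy()
    pcost = np.apply_along_axis(objective, 1, pbest)
    for _ in range(iters):
        gbest = pbest[np.argmin(pcost)]
        mbest = pbest.mean(axis=0)
        phi = rng.random((n_particles, dim))
        p = phi * pbest + (1 - phi) * gbest          # local attractors
        u = rng.random((n_particles, dim))
        sign = np.where(rng.random((n_particles, dim)) < 0.5, -1.0, 1.0)
        x = p + sign * beta * np.abs(mbest - x) * np.log(1.0 / u)
        cost = np.apply_along_axis(objective, 1, x)
        improved = cost < pcost
        pbest[improved], pcost[improved] = x[improved], cost[improved]
    return pbest[np.argmin(pcost)], pcost.min()

best, val = qpso(lambda v: np.sum(v ** 2), dim=4)  # toy sphere function
print(best, val)
```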
ERIC Educational Resources Information Center
Sparks, Kristin E., Ed.; Simonson, Michael, Ed.
2000-01-01
Subjects addressed by the 35 papers in this proceedings include: computer-based cooperative, collaborative and individual learning; comparison of students' and teachers' computer affect and behavior effects on performance; effects of headings and computer experience in CBI; use of CSCA to support argumentation skills in legal education; drama's…
A resource management architecture based on complex network theory in cloud computing federation
NASA Astrophysics Data System (ADS)
Zhang, Zehua; Zhang, Xuejie
2011-10-01
Cloud Computing Federation is a main trend of Cloud Computing. Resource management has a significant effect on the design, realization, and efficiency of a Cloud Computing Federation. A Cloud Computing Federation has the typical characteristics of a complex system; therefore, we propose a resource management architecture based on complex network theory for Cloud Computing Federation (abbreviated as RMABC) in this paper, with a detailed design of the resource discovery and resource announcement mechanisms. Compared with the existing resource management mechanisms in distributed computing systems, a Task Manager in RMABC can use historical information and current state data obtained from other Task Managers for the evolution of the complex network composed of Task Managers, and thus has advantages in resource discovery speed, fault tolerance, and adaptive ability. The results of the model experiment confirmed the advantage of RMABC in resource discovery performance.
NASA Technical Reports Server (NTRS)
Ali, Syed Firasat; Khan, Javed Khan; Rossi, Marcia J.; Crane, Peter; Heath, Bruce E.; Knighten, Tremaine; Culpepper, Christi
2003-01-01
Personal computer based flight simulators are expanding opportunities for providing low-cost pilot training. One advantage of these devices is the opportunity to incorporate instructional features into training scenarios that might not be cost effective with earlier systems. Research was conducted to evaluate the utility of different instructional features using a coordinated level turn as an aircraft maneuvering task. In study I, a comparison was made between automated computer grades of performance and certified flight instructors' grades. Each of the six student volunteers conducted a flight with level turns at two different bank angles. The automated computer grades were based on prescribed tolerances on bank angle, airspeed and altitude. Two certified flight instructors independently examined the video tapes of heads-up and instrument displays of the flights and graded them. The comparison of automated grades with the instructors' grades was based on correlations between them. In study II, a 2x2 between-subjects factorial design was used to devise and conduct an experiment. A comparison was made between real-time training and above-real-time training, and between feedback and no feedback in training. The performance measure used to monitor progress in training was based on deviations in bank angle and altitude. The performance measure was developed after completion of the experiment, including the training and test flights; it was not envisaged before the experiment. The experiment did not include self-instructions as originally planned, although feedback by the experimenter to the trainee was included in the study.
A Comparison of Computational Cognitive Models: Agent-Based Systems Versus Rule-Based Architectures
2003-03-01
The transition from the descriptive NDM theory to a computational model raises several questions: Who is an experienced decision maker? How do you model the progression from being a novice to an experienced decision maker? How does the model account for previous experiences? Are there situations where…
NASA Astrophysics Data System (ADS)
Matott, L. S.; Hymiak, B.; Reslink, C. F.; Baxter, C.; Aziz, S.
2012-12-01
As part of the NSF-sponsored 'URGE (Undergraduate Research Group Experiences) to Compute' program, Dr. Matott has been collaborating with talented Math majors to explore the design of cost-effective systems to safeguard groundwater supplies from contaminated sites. Such activity is aided by a combination of groundwater modeling, simulation-based optimization, and high-performance computing - disciplines largely unfamiliar to the students at the outset of the program. To help train and engage the students, a number of interactive and graphical software packages were utilized. Examples include: (1) a tutorial for exploring the behavior of evolutionary algorithms and other heuristic optimizers commonly used in simulation-based optimization; (2) an interactive groundwater modeling package for exploring alternative pump-and-treat containment scenarios at a contaminated site in Billings, Montana; (3) the R software package for visualizing various concepts related to subsurface hydrology; and (4) a job visualization tool for exploring the behavior of numerical experiments run on a large distributed computing cluster. Further engagement and excitement in the program was fostered by entering (and winning) a computer art competition run by the Coalition for Academic Scientific Computation (CASC). The winning submission visualizes an exhaustively mapped optimization cost surface and dramatically illustrates the phenomena of artificial minima - valley locations that correspond to designs whose costs are only partially optimal.
A Novel Approach for Creating Activity-Aware Applications in a Hospital Environment
NASA Astrophysics Data System (ADS)
Bardram, Jakob E.
Context-aware and activity-aware computing has been proposed as a way to adapt the computer to the user's ongoing activity. However, deductively moving from physical context - like location - to establishing human activity has proved difficult. This paper proposes a novel approach to activity-aware computing. Instead of inferring activities, this approach enables the user to explicitly model their activity, and then use sensor-based events to create, manage, and use these computational activities adjusted to a specific context. This approach was crafted through a user-centered design process in collaboration with a hospital department. We propose three strategies for activity-awareness: context-based activity matching, context-based activity creation, and context-based activity adaptation. We present the implementation of these strategies and an experimental evaluation of them. The experiments demonstrate that rather than considering context as information, context can be a relational property that links 'real-world activities' with their 'computational activities'.
Evolution of the Virtualized HPC Infrastructure of Novosibirsk Scientific Center
NASA Astrophysics Data System (ADS)
Adakin, A.; Anisenkov, A.; Belov, S.; Chubarov, D.; Kalyuzhny, V.; Kaplin, V.; Korol, A.; Kuchin, N.; Lomakin, S.; Nikultsev, V.; Skovpen, K.; Sukharev, A.; Zaytsev, A.
2012-12-01
Novosibirsk Scientific Center (NSC), also known worldwide as Akademgorodok, is one of the largest Russian scientific centers, hosting Novosibirsk State University (NSU) and more than 35 research organizations of the Siberian Branch of the Russian Academy of Sciences, including Budker Institute of Nuclear Physics (BINP), the Institute of Computational Technologies, and the Institute of Computational Mathematics and Mathematical Geophysics (ICM&MG). Since each institute has specific requirements on the architecture of the computing farms involved in its research field, there are currently several computing facilities hosted by NSC institutes, each optimized for a particular set of tasks; the largest of these are the NSU Supercomputer Center, the Siberian Supercomputer Center (ICM&MG), and the Grid Computing Facility of BINP. A dedicated optical network with an initial bandwidth of 10 Gb/s connecting these three facilities was built in order to make it possible to share the computing resources among the research communities, thus increasing the efficiency of operating the existing computing facilities and offering a common platform for building the computing infrastructure for future scientific projects. Unification of the computing infrastructure is achieved by extensive use of virtualization technology based on the XEN and KVM platforms. This contribution gives a thorough review of the present status and future development prospects for the NSC virtualized computing infrastructure and the experience gained while using it for running production data analysis jobs related to HEP experiments being carried out at BINP, especially the KEDR detector experiment at the VEPP-4M electron-positron collider.
Computer-aided light sheet flow visualization using photogrammetry
NASA Technical Reports Server (NTRS)
Stacy, Kathryn; Severance, Kurt; Childers, Brooks A.
1994-01-01
A computer-aided flow visualization process has been developed to analyze video images acquired from rotating and translating light sheet visualization systems. The computer process integrates a mathematical model for image reconstruction, advanced computer graphics concepts, and digital image processing to provide a quantitative and a visual analysis capability. The image reconstruction model, based on photogrammetry, uses knowledge of the camera and light sheet locations and orientations to project two-dimensional light sheet video images into three-dimensional space. A sophisticated computer visualization package, commonly used to analyze computational fluid dynamics (CFD) results, was chosen to interactively display the reconstructed light sheet images with the numerical surface geometry for the model or aircraft under study. The photogrammetric reconstruction technique and the image processing and computer graphics techniques and equipment are described. Results of the computer-aided process applied to both a wind tunnel translating light sheet experiment and an in-flight rotating light sheet experiment are presented. The capability to compare reconstructed experimental light sheet images with CFD solutions in the same graphics environment is also demonstrated.
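The reconstruction step described above amounts to back-projecting each image point through the camera model and intersecting the resulting ray with the known light-sheet plane. A minimal sketch of that geometry, assuming an ideal pinhole camera with known pose; the calibration and distortion handling of the actual NASA process are not given in the abstract, and all names here are illustrative:

```python
import numpy as np

def image_point_to_sheet(uv, f, R, c, plane_n, plane_d):
    """Project pixel (u, v) into 3D by intersecting the camera ray with
    the light-sheet plane {x : n . x = d}.
    f: focal length; R: camera-to-world rotation; c: camera center."""
    ray_cam = np.array([uv[0], uv[1], f])   # ray direction, camera frame
    ray = R @ ray_cam                       # rotate into world frame
    t = (plane_d - plane_n @ c) / (plane_n @ ray)
    return c + t * ray                      # 3D point on the sheet

# Example: camera at the origin looking down +z, sheet is the plane z = 2.
p = image_point_to_sheet((0.1, -0.05), f=1.0, R=np.eye(3),
                         c=np.zeros(3), plane_n=np.array([0, 0, 1.0]),
                         plane_d=2.0)
print(p)  # -> [ 0.2 -0.1  2. ]
```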
Computer-Aided Light Sheet Flow Visualization
NASA Technical Reports Server (NTRS)
Stacy, Kathryn; Severance, Kurt; Childers, Brooks A.
1993-01-01
A computer-aided flow visualization process has been developed to analyze video images acquired from rotating and translating light sheet visualization systems. The computer process integrates a mathematical model for image reconstruction, advanced computer graphics concepts, and digital image processing to provide a quantitative and visual analysis capability. The image reconstruction model, based on photogrammetry, uses knowledge of the camera and light sheet locations and orientations to project two-dimensional light sheet video images into three-dimensional space. A sophisticated computer visualization package, commonly used to analyze computational fluid dynamics (CFD) data sets, was chosen to interactively display the reconstructed light sheet images, along with the numerical surface geometry for the model or aircraft under study. A description is provided of the photogrammetric reconstruction technique, and the image processing and computer graphics techniques and equipment. Results of the computer aided process applied to both a wind tunnel translating light sheet experiment and an in-flight rotating light sheet experiment are presented. The capability to compare reconstructed experimental light sheet images and CFD solutions in the same graphics environment is also demonstrated.
Causal learning with local computations.
Fernbach, Philip M; Sloman, Steven A
2009-05-01
The authors proposed and tested a psychological theory of causal structure learning based on local computations. Local computations simplify complex learning problems via cues available on individual trials to update a single causal structure hypothesis. Structural inferences from local computations make minimal demands on memory, require relatively small amounts of data, and need not respect normative prescriptions as inferences that are principled locally may violate those principles when combined. Over a series of 3 experiments, the authors found (a) systematic inferences from small amounts of data; (b) systematic inference of extraneous causal links; (c) influence of data presentation order on inferences; and (d) error reduction through pretraining. Without pretraining, a model based on local computations fitted data better than a Bayesian structural inference model. The data suggest that local computations serve as a heuristic for learning causal structure. Copyright 2009 APA, all rights reserved.
TEACHING PHYSICS: A computer-based revitalization of Atwood's machine
NASA Astrophysics Data System (ADS)
Trumper, Ricardo; Gelbman, Moshe
2000-09-01
Atwood's machine is used in a microcomputer-based experiment to demonstrate Newton's second law with considerable precision. The friction force on the masses and the moment of inertia of the pulley can also be estimated.
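For reference, the dynamics such an experiment fits are, under the usual idealizations (massless inextensible string, no slip, pulley of radius r and moment of inertia I, and a net frictional torque tau_f at the axle):

```latex
% Acceleration of an Atwood machine with hanging masses m_1 > m_2:
a = \frac{(m_1 - m_2)\,g - \tau_f / r}{m_1 + m_2 + I / r^2}
```

Plotting measured a against (m_1 - m_2)g then gives 1/(m_1 + m_2 + I/r^2) as the slope and the friction term from the intercept, which is presumably how the estimates mentioned in the abstract are extracted.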
Hoang, Tuan; Tran, Dat; Huang, Xu
2013-01-01
Common Spatial Pattern (CSP) is a state-of-the-art method for feature extraction in Brain-Computer Interface (BCI) systems; however, it is designed for 2-class BCI classification problems. Current extensions of this method to multiple classes, based on subspace union and covariance matrix similarity, do not provide high performance. This paper presents a new approach to solving multi-class BCI classification problems by forming a subspace assembled from the original subspaces; the proposed method is called Approximation-based Common Principal Component (ACPC). We perform experiments on Dataset 2a of BCI Competition IV to evaluate the proposed method. This dataset was designed for motor imagery classification with 4 classes. Preliminary experiments show that the proposed ACPC feature extraction method, when combined with Support Vector Machines, outperforms CSP-based feature extraction methods on the experimental dataset.
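For context, the two-class CSP baseline that ACPC is compared against reduces to a generalized eigendecomposition of the per-class covariance matrices. A minimal sketch under an assumed data layout (trials x channels x samples); the ACPC method itself is not reproduced here:

```python
import numpy as np
from scipy.linalg import eigh

def csp_filters(X1, X2, n_pairs=2):
    """Two-class CSP. X1, X2: arrays of shape (trials, channels, samples).
    Returns 2*n_pairs spatial filters as rows."""
    def mean_cov(X):
        covs = [x @ x.T / np.trace(x @ x.T) for x in X]  # normalized covs
        return np.mean(covs, axis=0)
    C1, C2 = mean_cov(X1), mean_cov(X2)
    # Generalized eigenproblem C1 w = lambda (C1 + C2) w; eigenvalues
    # near 1 (or 0) give filters whose variance separates the classes.
    vals, vecs = eigh(C1, C1 + C2)
    order = np.argsort(vals)
    pick = np.concatenate([order[:n_pairs], order[-n_pairs:]])
    return vecs[:, pick].T

def csp_features(trial, W):
    """Normalized log-variance of the spatially filtered trial,
    the usual feature vector fed to a classifier such as an SVM."""
    Z = W @ trial
    var = Z.var(axis=1)
    return np.log(var / var.sum())
```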
Nonvolatile “AND,” “OR,” and “NOT” Boolean logic gates based on phase-change memory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Y.; Zhong, Y. P.; Deng, Y. F.
2013-12-21
Electronic devices or circuits that can implement both logic and memory functions are regarded as the building blocks for future massively parallel computing beyond the von Neumann architecture. Here we propose phase-change memory (PCM)-based nonvolatile logic gates capable of AND, OR, and NOT Boolean logic operations, verified in SPICE simulations and circuit experiments. The logic operations are computed in parallel, and the results can be stored directly in the states of the logic gates, facilitating the combination of computing and memory in the same circuit. These results are encouraging for ultralow-power and high-speed nonvolatile logic circuit design based on novel memory devices.
Kulhánek, Tomáš; Ježek, Filip; Mateják, Marek; Šilar, Jan; Kofránek, Jří
2015-08-01
This work presents experiences from teaching modeling and simulation to graduate students in the field of biomedical engineering. We emphasize the acausal and object-oriented modeling technique, and we have moved from teaching the block-oriented tool MATLAB Simulink to the acausal and object-oriented Modelica language, which can express the structure of the system rather than the process of computation. However, a block-oriented approach is also possible in Modelica, and students have a tendency to fall back on expressing the process of computation. Working through exemplar acausal domains and approaches allows students to understand the modeled problems much more deeply. The causality of the computation is derived automatically by the simulation tool.
Scout trajectory error propagation computer program
NASA Technical Reports Server (NTRS)
Myler, T. R.
1982-01-01
Since 1969, flight experience has been used as the basis for predicting Scout orbital accuracy. The data used for calculating the accuracy consists of errors in the trajectory parameters (altitude, velocity, etc.) at stage burnout as observed on Scout flights. Approximately 50 sets of errors are used in Monte Carlo analysis to generate error statistics in the trajectory parameters. A covariance matrix is formed which may be propagated in time. The mechanization of this process resulted in computer program Scout Trajectory Error Propagation (STEP) and is described herein. Computer program STEP may be used in conjunction with the Statistical Orbital Analysis Routine to generate accuracy in the orbit parameters (apogee, perigee, inclination, etc.) based upon flight experience.
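A schematic of the computation described, assuming the flight-observed burnout errors are collected as rows of a matrix and that propagation uses a state-transition matrix; the actual trajectory parameterization and dynamics inside STEP are not given in the abstract:

```python
import numpy as np

# Each row: observed burnout errors (e.g. altitude, velocity, flight-path
# angle) from one Scout flight; ~50 such rows in the real data set.
flight_errors = np.random.randn(50, 3)  # placeholder data, not flight data

# Form the sample covariance matrix of the burnout errors.
P0 = np.cov(flight_errors, rowvar=False)

def propagate(P, Phi):
    """Propagate an error covariance through linearized dynamics:
    P(t) = Phi P(t0) Phi^T."""
    return Phi @ P @ Phi.T

Phi = np.eye(3)  # placeholder state-transition matrix for one time step
P_t = propagate(P0, Phi)
print(P_t)
```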
An application for multi-person task synchronization
NASA Technical Reports Server (NTRS)
Brown, Robert L.; Doyle, Dee
1990-01-01
Computer applications are studied that will enable a group of people to synchronize their actions when following a predefined task sequence. It is assumed that the people involved only have computer workstations available to them for communication. Hence, the approach is to study how the computer can be used to help a group remain synchronized. A series of applications were designed and developed that can be used as vehicles for experimentation. An example of how this technique can be used for a remote coaching capability is explained in a report describing an experiment that simulated a Life Sciences experiment on-board Space Station Freedom, with a ground based principal investigator providing the expertise by coaching the on-orbit mission specialist.
Build a better mouse: directly-observed issues in computer use for adults with SMI.
Black, Anne C; Serowik, Kristin L; Schensul, Jean J; Bowen, Anne M; Rosen, Marc I
2013-03-01
Integrating information technology into healthcare has the potential to bring treatment to hard-to-reach people. Individuals with serious mental illness (SMI), however, may derive limited benefit from these advances in care because of lack of computer ownership and experience. To date, conclusions about the computer skills and attitudes of adults with SMI have been based primarily on self-report. In the current study, 28 psychiatric outpatients with co-occurring cocaine use were interviewed about their computer use and opinions, and 25 were then directly observed using task analysis and think aloud methods as they navigated a multi-component health informational website. Participants reported low rates of computer ownership and use, and negative attitudes towards computers. Self-reported computer skills were higher than demonstrated in the task analysis. However, some participants spontaneously expressed more positive attitudes and greater computer self-efficacy after navigating the website. Implications for increasing access to computer-based health information are discussed.
NASA Astrophysics Data System (ADS)
Tomshaw, Stephen G.
Physics education research has shown that students bring alternate conceptions to the classroom which can be quite resistant to traditional instruction methods (Clement, 1982; Halloun & Hestenes, 1985; McDermott, 1991). Microcomputer-based laboratory (MBL) experiments that employ an active-engagement strategy have been shown to improve student conceptual understanding in high school and introductory university physics courses (Thornton & Sokoloff, 1998). These (MBL) experiments require a specialized computer interface, type-specific sensors (e.g. motion detectors, force probes, accelerometers), and specialized software in addition to the standard physics experimental apparatus. Tao and Gunstone (1997) have shown that computer simulations used in an active engagement environment can also lead to conceptual change. This study investigated 69 secondary physics students' use of computer simulations of MBL activities in place of the hands-on MBL laboratory activities. The average normalized gain
Region based Brain Computer Interface for a home control application.
Akman Aydin, Eda; Bay, Omer Faruk; Guler, Inan
2015-08-01
Environment control is one of the important challenges for disabled people who suffer from neuromuscular diseases. A Brain Computer Interface (BCI) provides a communication channel between the human brain and the environment without requiring any muscular activation. The most important expectations for a home control application are high accuracy and reliable control. The region-based paradigm is a stimulus paradigm based on the oddball principle that requires selection of a target at two levels. This paper presents an application of the region-based paradigm to a smart home control application for people with neuromuscular diseases. In this study, a region-based stimulus interface containing 49 commands was designed. Five non-disabled subjects participated in the experiments. Offline analysis of the experiments yielded 95% accuracy for five flashes. This result showed that the region-based paradigm can successfully be used to select commands of a smart home control application with high accuracy at a low number of stimulus repetitions. Furthermore, no statistically significant difference was observed between the level accuracies.
Trusted measurement model based on multitenant behaviors.
Ning, Zhen-Hu; Shen, Chang-Xiang; Zhao, Yong; Liang, Peng
2014-01-01
With the rapid growth of pervasive computing, and especially cloud computing, behaviour measurement is at the core and plays a vital role. A new behaviour measurement tailored for multitenants in cloud computing is urgently needed to fundamentally establish trust relationships. Based on our previous research, we propose an improved trust relationship scheme which captures the reality of cloud computing, where multitenants share the same physical computing platform. Here, we first present the related work on multitenant behaviour; secondly, we give the scheme of behaviour measurement, in which the decoupling of multitenants is taken into account; thirdly, we explicitly explain our decoupling algorithm for multitenants; fourthly, we introduce a new way of calculating similarity for deviation control, which fits the coupled multitenants under study well; lastly, we design experiments to test our scheme.
Reconsidering Simulations in Science Education at a Distance: Features of Effective Use
ERIC Educational Resources Information Center
Blake, C.; Scanlon, E.
2007-01-01
This paper proposes a reconsideration of use of computer simulations in science education. We discuss three studies of the use of science simulations for undergraduate distance learning students. The first one, "The Driven Pendulum" simulation is a computer-based experiment on the behaviour of a pendulum. The second simulation, "Evolve" is…
ERIC Educational Resources Information Center
Armel, Donald, Ed.
Papers from a conference on microcomputers are: "Organizational Leadership through Information Technology" (John A. Anderson); "Multimedia in the Classroom--Rejuvenating the Literacy Course" (Stephen T. Anderson, Sr.); "Something New about Notetaking: A Computer-Based Instructional Experiment" (Donald Armel);…
Computer Use of a Medical Dictionary to Select Search Words.
ERIC Educational Resources Information Center
O'Connor, John
1986-01-01
Explains an experiment in text-searching retrieval for cancer questions which developed and used computer procedures (via human simulation) to select search words from medical dictionaries. This study is based on an earlier one in which search words were humanly selected, and the recall results of the two studies are compared. (Author/LRW)
Curricular Improvements through Computation and Experiment Based Learning Modules
ERIC Educational Resources Information Center
Khan, Fazeel; Singh, Kumar
2015-01-01
Engineers often need to predict how a part, mechanism or machine will perform in service, and this insight is typically achieved thorough computer simulations. Therefore, instruction in the creation and application of simulation models is essential for aspiring engineers. The purpose of this project was to develop a unified approach to teaching…
ERIC Educational Resources Information Center
Vincent, Peter
2000-01-01
Examines the outcomes of a two-year trial of computer-mediated conferencing (CMC) conducted at a British university during the final-year undergraduate course in glacial and periglacial geomorphology. Discusses the issues related to CMC and describes the experience over the last two years of using CMC conferencing. (CMK)
IP Addressing: Problem-Based Learning Approach on Computer Networks
ERIC Educational Resources Information Center
Jevremovic, Aleksandar; Shimic, Goran; Veinovic, Mladen; Ristic, Nenad
2017-01-01
The case study presented in this paper describes the pedagogical aspects and experience gathered while using an e-learning tool named IPA-PBL. Its main purpose is to provide additional motivation for adopting theoretical principles and procedures in a computer networks course. In the proposed model, the sequencing of activities of the learning…
ERIC Educational Resources Information Center
Arumi, Francisco N.
Computer programs capable of describing the thermal behavior of buildings are used to help architectural students understand environmental systems. The Numerical Simulation Laboratory at the Architectural School of the University of Texas at Austin was developed to provide the necessary software capable of simulating the energy transactions…
M-Learning: An Experiment in Using SMS to Support Learning New English Language Words
ERIC Educational Resources Information Center
Cavus, Nadire; Ibrahim, Dogan
2009-01-01
There is an increased use of wireless technologies in education all over the world. In fact, wireless technologies such as laptop computers, palmtop computers and mobile phones are revolutionizing education and transforming the traditional classroom-based learning and teaching into "anytime" and "anywhere" education. This paper investigates the use…
At the Brink: Baseline Data. The Microcomputing Program at Drexel University.
ERIC Educational Resources Information Center
McCord, Joan
Surveys of faculty and students at Drexel University were conducted to assess the effects of the university's decision in late 1982 to require all incoming freshmen to buy computers. A May 1983 survey determined faculty experiences with computers, their teaching preferences, and educational backgrounds. Based on responses from 328 faculty (93%),…
Engaging Whole-Number Knowledge for Rational-Number Learning Using a Computer-Based Tool.
ERIC Educational Resources Information Center
Hunting, Robert P.; And Others
1996-01-01
This constructivist teaching experiment investigated the role of whole-number knowledge in fraction learning. Two nine-year-old students worked for one year with a computer program (Copycat) and demonstrated that the development of whole-number knowledge and rational-number knowledge is interdependent. Contains 59 references. (KMC)
E-Learning System Using Segmentation-Based MR Technique for Learning Circuit Construction
ERIC Educational Resources Information Center
Takemura, Atsushi
2016-01-01
This paper proposes a novel e-Learning system using the mixed reality (MR) technique for technical experiments involving the construction of electronic circuits. The proposed system comprises experimenters' mobile computers and a remote analysis system. When constructing circuits, each learner uses a mobile computer to transmit image data from the…
ERIC Educational Resources Information Center
Jiang, L. Crystal; Bazarova, Natalie N.; Hancock, Jeffrey T.
2011-01-01
The present research investigated whether the attribution process through which people explain self-disclosures differs in text-based computer-mediated interactions versus face to face, and whether differences in causal attributions account for the increased intimacy frequently observed in mediated communication. In the experiment participants…
Cloud Computing Technologies in Writing Class: Factors Influencing Students' Learning Experience
ERIC Educational Resources Information Center
Wang, Jenny
2017-01-01
The proposed interactive online group, built on cloud computing technologies and the main contribution of this paper, provides easy and simple access to a cloud-based Software as a Service (SaaS) system and delivers effective educational tools for students and the teacher in after-class group writing assignment activities. Therefore, this study…
NASA Astrophysics Data System (ADS)
Read, A.; Taga, A.; O-Saada, F.; Pajchel, K.; Samset, B. H.; Cameron, D.
2008-07-01
Computing and storage resources connected by the Nordugrid ARC middleware in the Nordic countries, Switzerland and Slovenia are a part of the ATLAS computing Grid. This infrastructure is being commissioned with the ongoing ATLAS Monte Carlo simulation production in preparation for the commencement of data taking in 2008. The unique non-intrusive architecture of ARC, its straightforward interplay with the ATLAS Production System via the Dulcinea executor, and its performance during the commissioning exercise is described. ARC support for flexible and powerful end-user analysis within the GANGA distributed analysis framework is also shown. Whereas the storage solution for this Grid was earlier based on a large, distributed collection of GridFTP-servers, the ATLAS computing design includes a structured SRM-based system with a limited number of storage endpoints. The characteristics, integration and performance of the old and new storage solutions are presented. Although the hardware resources in this Grid are quite modest, it has provided more than double the agreed contribution to the ATLAS production with an efficiency above 95% during long periods of stable operation.
NASA Astrophysics Data System (ADS)
Márquez Damián, J. I.; Granada, J. R.; Malaspina, D. C.
2014-04-01
In this work we present an evaluation in ENDF-6 format of the scattering law for light and heavy water, computed using the LEAPR module of NJOY99. The models used in this evaluation are based on experimental data on light water dynamics measured by Novikov, partial structure factors obtained by Soper, and molecular dynamics calculations performed with GROMACS using a reparameterized version of the flexible SPC model by Toukan and Rahman. The models use the Egelstaff-Schofield diffusion equation for translational motion, and a continuous spectrum calculated from the velocity autocorrelation function computed with GROMACS. The scattering law for H in H2O is computed using the incoherent approximation, and the scattering laws for D and O in D2O are computed using the Sköld approximation for coherent scattering. The calculations show significant improvement over ENDF/B-VI and ENDF/B-VII when compared with measurements of the total cross section, differential scattering experiments, and quasi-elastic neutron scattering (QENS) experiments.
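For context, the Sköld approximation mentioned above builds the coherent scattering law from the self (incoherent) part and the static structure factor S(Q); a common statement of it is:

```latex
% Skold approximation for the coherent dynamic structure factor:
S_{\mathrm{coh}}(Q,\omega) \;\approx\; S(Q)\,
    S_{\mathrm{self}}\!\left(\frac{Q}{\sqrt{S(Q)}},\,\omega\right)
```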
Predicting neutron damage using TEM with in situ ion irradiation and computer modeling
NASA Astrophysics Data System (ADS)
Kirk, Marquis A.; Li, Meimei; Xu, Donghua; Wirth, Brian D.
2018-01-01
We have constructed a computer model of irradiation defect production closely coordinated with TEM and in situ ion irradiation of Molybdenum at 80 °C over a range of dose, dose rate and foil thickness. We have reexamined our previous ion irradiation data to assign appropriate error and uncertainty based on more recent work. The spatially dependent cascade cluster dynamics model is updated with recent Molecular Dynamics results for cascades in Mo. After a careful assignment of both ion and neutron irradiation dose values in dpa, TEM data are compared for both ion and neutron irradiated Mo from the same source material. Using the computer model of defect formation and evolution based on the in situ ion irradiation of thin foils, the defect microstructure, consisting of densities and sizes of dislocation loops, is predicted for neutron irradiation of bulk material at 80 °C and compared with experiment. Reasonable agreement between model prediction and experimental data demonstrates a promising direction in understanding and predicting neutron damage using a closely coordinated program of in situ ion irradiation experiment and computer simulation.
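As a deliberately simplified illustration of the rate-theory bookkeeping behind a cluster dynamics model, the zero-dimensional sketch below evolves interstitial and vacancy concentrations under cascade generation, mutual recombination, and sink absorption. The model in the paper is spatially dependent and tracks loop size distributions; all coefficients here are placeholders, not values from the study.

```python
import numpy as np
from scipy.integrate import solve_ivp

def rates(t, y, G=1e-8, k_rec=1e6, k_sink_i=1e2, k_sink_v=1e0):
    """y = [Ci, Cv]: interstitial and vacancy concentrations (per site).
    dC/dt = generation - mutual recombination - absorption at sinks."""
    Ci, Cv = y
    recomb = k_rec * Ci * Cv
    return [G - recomb - k_sink_i * Ci,
            G - recomb - k_sink_v * Cv]

sol = solve_ivp(rates, (0.0, 1e4), [0.0, 0.0], method="LSODA",
                rtol=1e-8, atol=1e-20)
print("steady-state Ci, Cv:", sol.y[:, -1])
```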
NASA Technical Reports Server (NTRS)
Lightsey, W. D.; Alhorn, D. C.; Polites, M. E.
1992-01-01
An experiment designed to test the feasibility of using rotating unbalanced-mass (RUM) devices for line and raster scanning gimbaled payloads, while expending very little power is described. The experiment is configured for ground-based testing, but the scan concept is applicable to ground-based, balloon-borne, and space-based payloads, as well as free-flying spacecraft. The servos used in scanning are defined; the electronic hardware is specified; and a computer simulation model of the system is described. Simulation results are presented that predict system performance and verify the servo designs.
Puskaric, Marin; von Helversen, Bettina; Rieskamp, Jörg
2017-08-01
Social information such as observing others can improve performance in decision making. In particular, social information has been shown to be useful when finding the best solution on one's own is difficult, costly, or dangerous. However, past research suggests that when making decisions people do not always consider other people's behaviour when it is at odds with their own experiences. Furthermore, the cognitive processes guiding the integration of social information with individual experiences are still under debate. Here, we conducted two experiments to test whether information about other persons' behaviour influenced people's decisions in a classification task. Furthermore, we examined how social information is integrated with individual learning experiences by testing different computational models. Our results show that social information had a small but reliable influence on people's classifications. The best computational model suggests that in categorization people first make up their own mind based on the non-social information, which is then updated by the social information.
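The winning model class can be summarized schematically as a two-stage combination rule: an individually formed judgment that is then shifted by the social cue. A minimal illustration (not the authors' exact parameterization) is a weighted mixture

\[
P_{\mathrm{final}}(c \mid x) \;=\; (1-w)\, P_{\mathrm{ind}}(c \mid x) \;+\; w\, P_{\mathrm{soc}}(c), \qquad 0 \le w \le 1,
\]

where P_ind is the probability of category c given the non-social information x, P_soc the choice tendency implied by the observed others, and the small but reliable social influence corresponds to a small w.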
Exploring Architectural Details Through a Wearable Egocentric Vision Device
Alletto, Stefano; Abati, Davide; Serra, Giuseppe; Cucchiara, Rita
2016-01-01
Augmented user experiences in the cultural heritage domain are in increasing demand by the new digital native tourists of the 21st century. In this paper, we propose a novel solution that aims at assisting the visitor during an outdoor tour of a cultural site using the unique first person perspective of wearable cameras. In particular, the approach exploits computer vision techniques to retrieve architectural details by means of a robust descriptor based on the covariance of local features. Using a lightweight wearable board, the solution can localize the user with respect to the 3D point cloud of the historical landmark and provide information about the details at which the user is currently looking. Experimental results validate the method both in terms of accuracy and computational effort. Furthermore, user evaluation based on real-world experiments shows that the proposal is deemed effective in enriching a cultural experience.
CMS Centres Worldwide - a New Collaborative Infrastructure
NASA Astrophysics Data System (ADS)
Taylor, Lucas
2011-12-01
The CMS Experiment at the LHC has established a network of more than fifty inter-connected "CMS Centres" at CERN and in institutes in the Americas, Asia, Australasia, and Europe. These facilities are used by people doing CMS detector and computing grid operations, remote shifts, data quality monitoring and analysis, as well as education and outreach. We present the computing, software, and collaborative tools and videoconferencing systems. These include permanently running "telepresence" video links (hardware-based H.323, EVO and Vidyo), Webcasts, and generic Web tools such as CMS-TV for broadcasting live monitoring and outreach information. Being Web-based and experiment-independent, these systems could easily be extended to other organizations. We describe the experiences of using CMS Centres Worldwide in the CMS data-taking operations as well as for major media events with several hundred TV channels, radio stations, and many more press journalists simultaneously around the world.
NASA Astrophysics Data System (ADS)
Keshet, Aviv; Ketterle, Wolfgang
2013-01-01
Atomic physics experiments often require a complex sequence of precisely timed computer controlled events. This paper describes a distributed graphical user interface-based control system designed with such experiments in mind, which makes use of off-the-shelf output hardware from National Instruments. The software makes use of a client-server separation between a user interface for sequence design and a set of output hardware servers. Output hardware servers are designed to use standard National Instruments output cards, but the client-server nature should allow this to be extended to other output hardware. Output sequences running on multiple servers and output cards can be synchronized using a shared clock. By using a field programmable gate array-generated variable frequency clock, redundant buffers can be dramatically shortened, and a time resolution of 100 ns achieved over effectively arbitrary sequence lengths.
An experiment on the use of disposable plastics as a reinforcement in concrete beams
NASA Technical Reports Server (NTRS)
Chowdhury, Mostafiz R.
1992-01-01
The concept of reinforced concrete structures is illustrated here through computer simulation and an inexpensive hands-on design experiment. The students in our construction management program use disposable plastics as reinforcement to demonstrate their understanding of reinforced concrete and prestressed concrete beams. The plastics used in the experiment range from plastic bottles to steel-reinforced auto tires. The experiment shows the extent to which plastic reinforcement increases the strength of a concrete beam. The procedure for using such throw-away plastics to explain the interaction between the reinforcement material and the concrete is discussed, along with a comparison of test results for different types of waste plastics. A computer analysis simulating the structural response is used to compare against the test results and to convey the analytical background of reinforced concrete design. This combination of computer analysis and real experimentation has proved a very useful method for teaching a math-based analytical subject to our non-engineering students.
The Strategic Nature of Changing Your Mind
ERIC Educational Resources Information Center
Walsh, Matthew M.; Anderson, John R.
2009-01-01
In two experiments, we studied how people's strategy choices emerge through an initial and then a more considered evaluation of available strategies. The experiments employed a computer-based paradigm where participants solved multiplication problems using mental and calculator solutions. In addition to recording responses and solution times, we…
Hollmann, M; Mönch, T; Mulla-Osman, S; Tempelmann, C; Stadler, J; Bernarding, J
2008-10-30
In functional MRI (fMRI), complex experiments and applications require increasingly complex parameter handling, as the experimental setup usually consists of separate software and hardware systems. Advanced real-time applications such as neurofeedback-based training or brain-computer interfaces (BCIs) may even require adaptive changes of the paradigms and experimental setup during the measurement. This would be facilitated by an automated management of the overall workflow and a control of the communication between all experimental components. We realized a concept based on an XML software framework called Experiment Description Language (EDL). All parameters relevant for real-time data acquisition, real-time fMRI (rtfMRI) statistical data analysis, stimulus presentation, and activation processing are stored in one central EDL file and processed during the experiment. A usability study comparing the central EDL parameter management with traditional approaches showed an improvement in overall experimental handling. Based on this concept, a feasibility study realizing a dynamic rtfMRI-based brain-computer interface showed that the developed system in combination with EDL was able to reliably detect and evaluate activation patterns in real time. The implementation of a centrally controlled communication between the subsystems involved in the rtfMRI experiments reduced potential inconsistencies, and will open new applications for adaptive BCIs.
What To Do, Instead of Counterproductive, Standardized Curricula and Testing.
ERIC Educational Resources Information Center
Keegan, Mark
1993-01-01
Presents evidence against the use of standardized curricula and testing and in favor of discovery-based instruction. Describes a successful experience with a relatively new form of discovery-based instruction, the scenario educational computer software. (AEF)
Network Community Detection based on the Physarum-inspired Computational Framework.
Gao, Chao; Liang, Mingxin; Li, Xianghua; Zhang, Zili; Wang, Zhen; Zhou, Zhili
2016-12-13
Community detection is a crucial problem in the structure analytics of complex networks, which can help us understand and predict the characteristics and functions of complex networks. Many methods, ranging from optimization-based algorithms to heuristic-based algorithms, have been proposed for solving this problem. Due to the inherent complexity of identifying network structure, designing an effective algorithm with higher accuracy and lower computational cost remains an open problem. Inspired by the computational capability and positive feedback mechanism in the foraging process of Physarum, a large amoeba-like cell consisting of a dendritic network of tube-like pseudopodia, a general Physarum-based computational framework for community detection is proposed in this paper. Based on the proposed framework, the inter-community edges can be distinguished from the intra-community edges in a network, and the positive feedback of the solving process in an algorithm can be further enhanced; these properties are used to improve the efficiency of original optimization-based and heuristic-based community detection algorithms, respectively. Some typical algorithms (e.g., genetic algorithm, ant colony optimization algorithm, and Markov clustering algorithm) and real-world datasets have been used to evaluate the efficiency of our proposed computational framework. Experiments show that the algorithms optimized by the Physarum-inspired computational framework perform better than the original ones, in terms of accuracy and computational cost. Moreover, a computational complexity analysis verifies the scalability of our framework.
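As a rough illustration of the positive-feedback primitive such frameworks borrow from Physarum (a Tero-style current-reinforcement solver, shown here on a toy graph; how the paper maps flux statistics to inter- versus intra-community edges is specific to the framework and not reproduced here), a minimal sketch in Python:

```python
import numpy as np

def physarum_step(adj, D, src, snk, I0=1.0, lam=0.5):
    """One Physarum iteration: solve Kirchhoff's equations for node
    pressures, then reinforce each edge conductivity in proportion
    to the flux it carries (positive feedback with decay)."""
    n = adj.shape[0]
    W = D * adj
    L = np.diag(W.sum(axis=1)) - W             # graph Laplacian
    b = np.zeros(n)
    b[src], b[snk] = I0, -I0
    p = np.zeros(n)
    p[1:] = np.linalg.solve(L[1:, 1:], b[1:])  # ground node 0
    Q = W * (p[:, None] - p[None, :])          # edge fluxes
    return lam * np.abs(Q) + (1 - lam) * D     # conductivity update

# Toy graph: a direct edge 0-3 competing with two-hop paths 0-1-3, 0-2-3.
adj = np.zeros((4, 4))
for i, j in [(0, 3), (0, 1), (1, 3), (0, 2), (2, 3)]:
    adj[i, j] = adj[j, i] = 1.0
D = adj.copy()
for _ in range(50):
    D = physarum_step(adj, D, src=0, snk=3)
print(np.round(D, 3))  # conductivity concentrates on the direct edge
```

The feedback loop (flux reinforces conductivity, which attracts more flux) is the mechanism the framework exploits to sharpen an underlying algorithm's solution.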
[Research of controlling of smart home system based on P300 brain-computer interface].
Wang, Jinjia; Yang, Chengjie
2014-08-01
Using electroencephalogram (EEG) signals to control external devices has always been a research focus in the field of brain-computer interfaces (BCI). This is especially significant for people with disabilities who have lost the capacity for movement. In this paper, the P300-based BCI and microcontroller-based wireless radio frequency (RF) technology are utilized to design a smart home control system, which can be used to control household appliances, lighting systems, and security devices directly. Experimental results showed that the system was simple, reliable and easy to popularise.
NASA Astrophysics Data System (ADS)
Uddin, M. Maruf; Fuad, Muzaddid-E.-Zaman; Rahaman, Md. Mashiur; Islam, M. Rabiul
2017-12-01
With the rapid decrease in the cost of computational infrastructure and more efficient algorithms for solving non-linear problems, Reynolds-averaged Navier-Stokes (RaNS) based Computational Fluid Dynamics (CFD) is now widely used. As a preliminary evaluation tool, CFD is used at initial design stages to calculate the hydrodynamic loads on offshore installations, ships, and other structures in the ocean. Traditionally, wedges have been studied more than circular cylinders because a cylinder section has zero deadrise angle at the instant of water impact, which increases with submergence. In the present study, the RaNS-based commercial code ANSYS Fluent is used to simulate the water entry of a circular section at constant velocity. The computational results are compared with experiment and other numerical methods.
Cyber-workstation for computational neuroscience.
Digiovanna, Jack; Rattanatamrong, Prapaporn; Zhao, Ming; Mahmoudi, Babak; Hermer, Linda; Figueiredo, Renato; Principe, Jose C; Fortes, Jose; Sanchez, Justin C
2010-01-01
A Cyber-Workstation (CW) to study in vivo, real-time interactions between computational models and large-scale brain subsystems during behavioral experiments has been designed and implemented. The design philosophy seeks to directly link the in vivo neurophysiology laboratory with scalable computing resources to enable more sophisticated computational neuroscience investigation. The architecture designed here allows scientists to develop new models and integrate them with existing models (e.g. recursive least-squares regressor) by specifying appropriate connections in a block-diagram. Then, adaptive middleware transparently implements these user specifications using the full power of remote grid-computing hardware. In effect, the middleware deploys an on-demand and flexible neuroscience research test-bed to provide the neurophysiology laboratory extensive computational power from an outside source. The CW consolidates distributed software and hardware resources to support time-critical and/or resource-demanding computing during data collection from behaving animals. This power and flexibility is important as experimental and theoretical neuroscience evolves based on insights gained from data-intensive experiments, new technologies and engineering methodologies. This paper describes briefly the computational infrastructure and its most relevant components. Each component is discussed within a systematic process of setting up an in vivo, neuroscience experiment. Furthermore, a co-adaptive brain machine interface is implemented on the CW to illustrate how this integrated computational and experimental platform can be used to study systems neurophysiology and learning in a behavior task. We believe this implementation is also the first remote execution and adaptation of a brain-machine interface.
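A minimal sketch of the block-diagram composition style described above (illustrative only: the CW's actual middleware, scheduling, and model APIs are not spelled out in the abstract, so the class names and wiring below are assumptions):

```python
class Block:
    """A model block with named inputs and a step() update."""
    def __init__(self, name, fn):
        self.name, self.fn = name, fn
        self.inputs, self.output = {}, None

    def step(self):
        self.output = self.fn(**self.inputs)

class Diagram:
    """Wires block outputs to block inputs and runs update cycles."""
    def __init__(self, blocks):
        self.blocks = blocks     # assumed already in topological order
        self.wires = []          # (src_block, dst_block, input name)

    def connect(self, src, dst, port):
        self.wires.append((src, dst, port))

    def run_cycle(self):
        for blk in self.blocks:
            blk.step()
            for src, dst, port in self.wires:
                if src is blk:
                    dst.inputs[port] = blk.output

# Hypothetical wiring: stubbed neural data feeding a trivial decoder.
neural = Block("neural", lambda: [0.2, 0.9, 0.4])          # spike rates
decoder = Block("decoder", lambda rates: sum(rates) / len(rates))
dg = Diagram([neural, decoder])
dg.connect(neural, decoder, "rates")
dg.run_cycle()
print(decoder.output)  # 0.5
```

In the CW itself, the middleware would map such user-specified connections onto remote grid-computing resources rather than running everything in one process.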
Experience with ethylene plant computer control
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nasi, M.; Darby, M.L.; Sourander, M.
This article discusses the control strategies, results, and the opinions of management and operations staff regarding a computer-based ethylene plant control system. The ethylene unit contains 9 cracking heaters, and its nameplate capacity is 200,000 tpa ethylene. Control performance during different unit loadings and with different feedstock types is reported. Converting the yield and utility consumption benefits due to computer control into monetary units gives a payback time for the system of less than 2 years.
A study of Mariner 10 flight experiences and some flight piece part failure rate computations
NASA Technical Reports Server (NTRS)
Paul, F. A.
1976-01-01
The problems and failures encountered in the Mariner 10 flight are discussed, and the data available through a quantitative accounting of all electronic piece parts on the spacecraft are summarized, together with computed failure rates for electronic piece parts. It is intended that these computed data be used in the continued updating of the failure rate base used for trade-off studies and predictions for future JPL space missions.
Hypersonic Experimental and Computational Capability, Improvement and Validation. Volume 2
NASA Technical Reports Server (NTRS)
Muylaert, Jean (Editor); Kumar, Ajay (Editor); Dujarric, Christian (Editor)
1998-01-01
The results of the phase 2 effort conducted under AGARD Working Group 18 on Hypersonic Experimental and Computational Capability, Improvement and Validation are presented in this report. The first volume, published in May 1996, mainly focused on the design methodology, plans and some initial results of experiments that had been conducted to serve as validation benchmarks. The current volume presents the detailed experimental and computational data base developed during this effort.
A CFD Heterogeneous Parallel Solver Based on Collaborating CPU and GPU
NASA Astrophysics Data System (ADS)
Lai, Jianqi; Tian, Zhengyu; Li, Hua; Pan, Sha
2018-03-01
Since the Graphics Processing Unit (GPU) has strong floating-point computation capability and high memory bandwidth for data parallelism, it has been widely used in general computing areas such as molecular dynamics (MD) and computational fluid dynamics (CFD). The emergence of the Compute Unified Device Architecture (CUDA), which reduces the complexity of program development, brings great opportunities to CFD. There are three different modes for the parallel solution of the NS equations: a parallel solver based on the CPU, a parallel solver based on the GPU, and a heterogeneous parallel solver based on collaborating CPU and GPU. GPUs are relatively rich in compute capacity but poor in memory capacity, and CPUs are the opposite. To make full use of both, a CFD heterogeneous parallel solver based on collaborating CPU and GPU has been established. Three cases are presented to analyse the solver's computational accuracy and heterogeneous parallel efficiency. The numerical results agree well with experimental results, which demonstrates that the heterogeneous parallel solver has high computational precision. The speedup on a single GPU is more than 40 for laminar flow; it decreases for turbulent flow but still reaches more than 20. Moreover, the speedup increases as the grid size becomes larger.
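One common way to realize such CPU-GPU collaboration is static domain partitioning in proportion to measured per-device throughput, so that both devices finish each time step together. A schematic helper (device names and throughput figures are placeholders, not values from the paper):

```python
def split_cells(n_cells, cells_per_sec):
    """Partition grid cells across devices in proportion to their
    measured throughput (cells processed per second)."""
    total = sum(cells_per_sec.values())
    shares = {dev: round(n_cells * r / total)
              for dev, r in cells_per_sec.items()}
    drift = n_cells - sum(shares.values())  # fix rounding drift
    shares[next(iter(shares))] += drift
    return shares

# Hypothetical calibration: the GPU processes ~40x the CPU's cells/sec,
# consistent in spirit with the laminar-flow speedup reported above.
print(split_cells(1_000_000, {"gpu0": 40.0e6, "cpu": 1.0e6}))
```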
CLOUDCLOUD : general-purpose instrument monitoring and data managing software
NASA Astrophysics Data System (ADS)
Dias, António; Amorim, António; Tomé, António
2016-04-01
An effective experiment depends on the ability to store and deliver data and information to all participant parties, regardless of their degree of involvement in the specific parts that make the experiment a whole. Having fast, efficient and ubiquitous access to data will increase visibility and discussion, such that the outcome will have already been reviewed several times, strengthening the conclusions. The CLOUD project aims at providing users with a general-purpose data acquisition, management and instrument monitoring platform that is fast, easy to use, lightweight and accessible to all participants of an experiment. This work is now implemented in the CLOUD experiment at CERN and will be fully integrated with the experiment as of 2016. Despite being used in an experiment of the scale of CLOUD, this software can also be used in any size of experiment or monitoring station, from single computers to large networks of computers, to monitor any sort of instrument output without influencing the individual instrument's DAQ. Instrument data and metadata are stored and accessed via a specially designed database architecture, and any type of instrument output is accepted using our continuously growing parsing application. Multiple databases can be used to separate different data taking periods, or a single database can be used if, for instance, an experiment is continuous. A simple web-based application gives the user total control over the monitored instruments and their data, allowing data visualization and download, upload of processed data, and the ability to edit existing instruments or add new instruments to the experiment. When in a network, new computers are immediately recognized and added to the system and are able to monitor instruments connected to them. Automatic computer integration is achieved by a locally running Python-based parsing agent that communicates with a main server application, guaranteeing that all instruments assigned to that computer are monitored with parsing intervals as fast as milliseconds. This software (server + agents + interface + database) comes in easy, ready-to-use packages that can be installed on any operating system, including Android and iOS. This software is ideal for use in modular experiments or monitoring stations with large variability in instruments and measuring methods, or in large collaborations where data require homogenization in order to be effectively transmitted to all involved parties. This work presents the software and provides a performance comparison with previously used monitoring systems in the CLOUD experiment at CERN.
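A stripped-down sketch of such a parsing agent's main loop (the actual CLOUDCLOUD protocol, endpoints, and parser registry are not given in the abstract, so the server URL and the one-line record format here are assumptions):

```python
import json
import time
import urllib.request

SERVER = "http://monitor.example.org/ingest"  # hypothetical endpoint

def parse_line(line):
    """Assumed instrument format: 'timestamp value' per output line."""
    ts, val = line.split()
    return {"t": float(ts), "v": float(val)}

def agent(path, interval=0.001):
    """Tail an instrument's output file and forward parsed records to
    the main server, without touching the instrument's own DAQ."""
    with open(path) as f:
        f.seek(0, 2)                      # start at end of file
        while True:
            line = f.readline()
            if not line:
                time.sleep(interval)      # parsing interval (can be ms)
                continue
            data = json.dumps(parse_line(line)).encode()
            req = urllib.request.Request(
                SERVER, data=data,
                headers={"Content-Type": "application/json"})
            urllib.request.urlopen(req)
```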
List of Publications of the U.S. Army Engineer Waterways Experiment Station. Volume 2
1993-09-01
This volume, compiled by the Research Library, Information Management Division, lists publications of the U.S. Army Engineer Waterways Experiment Station, including reports prepared for other agencies such as the Air Base Survivability Systems Management Office. The Waterways Experiment Station manages, conducts, and coordinates research and development in Information Management (IM) technology areas that include computer science.
The binding domain of the HMGB1 inhibitor carbenoxolone: Theory and experiment
NASA Astrophysics Data System (ADS)
Mollica, Luca; Curioni, Alessandro; Andreoni, Wanda; Bianchi, Marco E.; Musco, Giovanna
2008-05-01
We present a combined computational and experimental study of the interaction of the Box A of the HMGB1 protein and carbenoxolone, an inhibitor of its pro-inflammatory activity. The computational approach consists of classical molecular dynamics (MD) simulations based on the GROMOS force field with quantum-refined (QRFF) atomic charges for the ligand. Experimental data consist of fluorescence intensities, chemical shift displacements, saturation transfer differences and intermolecular Nuclear Overhauser Enhancement signals. Good agreement is found between observations and the conformation of the ligand-protein complex resulting from QRFF-MD. In contrast, simple docking procedures and MD based on the unrefined force field provide models inconsistent with experiment. The ligand-protein binding is dominated by non-directional interactions.
Review of blunt body wake flows at hypersonic low density conditions
NASA Technical Reports Server (NTRS)
Moss, J. N.; Price, J. M.
1996-01-01
Recent results of experimental and computational studies concerning hypersonic flows about blunted cones including their near wake are reviewed. Attention is focused on conditions where rarefaction effects are present, particularly in the wake. The experiments have been performed for a common model configuration (70 deg spherically-blunted cone) in five hypersonic facilities that encompass a significant range of rarefaction and nonequilibrium effects. Computational studies using direct simulation Monte Carlo (DSMC) and Navier-Stokes solvers have been applied to selected experiments performed in each of the facilities. In addition, computations have been made for typical flight conditions in both Earth and Mars atmospheres, hence more energetic flows than produced in the ground-based tests. Also, comparisons of DSMC calculations and forebody measurements made for the Japanese Orbital Reentry Experiment (OREX) vehicle (a 50 deg spherically-blunted cone) are presented to bridge the spectrum of ground to flight conditions.
A General Cross-Layer Cloud Scheduling Framework for Multiple IoT Computer Tasks.
Wu, Guanlin; Bao, Weidong; Zhu, Xiaomin; Zhang, Xiongtao
2018-05-23
The diversity of IoT services and applications brings enormous challenges to improving the performance of scheduling multiple computer tasks in cross-layer cloud computing systems. Unfortunately, the commonly-employed frameworks fail to adapt to the new patterns of the cross-layer cloud. To solve this issue, we design a new computer task scheduling framework for multiple IoT services in cross-layer cloud computing systems. Specifically, we first analyze the features of the cross-layer cloud and of computer tasks. Then, we design the scheduling framework based on this analysis and present detailed models to illustrate the procedures for using the framework. With the proposed framework, the IoT services deployed in cross-layer cloud computing systems can dynamically select suitable algorithms and use resources more effectively to finish computer tasks with different objectives. Finally, the algorithms are given based on the framework, and extensive experiments validate its effectiveness as well as its superiority.
Ogawa, K
1992-01-01
This paper proposes a new evaluation and prediction method for computer usability. The method is based on two previously proposed information transmission measures derived from a human-to-computer information transmission model. The model has three information transmission levels: the device, software, and task content levels. Two measures, called the device independent information measure (DI) and the computer independent information measure (CI), defined on the software and task content levels respectively, are given as the amount of information transmitted. Two information transmission rates are defined in terms of the task completion time T: the device independent information transmission rate (RDI = DI/T) and the computer independent information transmission rate (RCI = CI/T). The method utilizes the RDI and RCI rates to comparatively evaluate the usability of software and device operations on different computer systems. Experiments on three different systems, using a graphical information input task, confirm that the method offers an efficient way of determining computer usability.
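In the abstract's notation the two rates are simply

\[
R_{DI} \;=\; \frac{DI}{T}, \qquad R_{CI} \;=\; \frac{CI}{T},
\]

so comparing R_CI for the same task across systems isolates task-content throughput from device- and software-level overhead, which is what makes the relative usability comparison possible.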
The Legnaro-Padova distributed Tier-2: challenges and results
NASA Astrophysics Data System (ADS)
Badoer, Simone; Biasotto, Massimo; Costa, Fulvia; Crescente, Alberto; Fantinel, Sergio; Ferrari, Roberto; Gulmini, Michele; Maron, Gaetano; Michelotto, Michele; Sgaravatto, Massimo; Toniolo, Nicola
2014-06-01
The Legnaro-Padova Tier-2 is a computing facility serving the ALICE and CMS LHC experiments. It also supports other High Energy Physics experiments and other virtual organizations of different disciplines, which can opportunistically harness idle resources if available. The unique characteristic of this Tier-2 is its topology: the computational resources are spread over two different sites, about 15 km apart: the INFN Legnaro National Laboratories and the INFN Padova unit, connected through a 10 Gbps network link (soon to be upgraded to 20 Gbps). Nevertheless these resources are seamlessly integrated and are exposed as a single computing facility. Despite this intrinsic complexity, the Legnaro-Padova Tier-2 ranks among the best Grid sites for reliability and availability. The Tier-2 comprises about 190 worker nodes, providing about 26000 HS06 in total. These computing nodes are managed by the LSF local resource management system, and are accessible through a Grid-based interface implemented via multiple CREAM CE front-ends. dCache, xrootd and Lustre are the storage systems in use at the Tier-2: about 1.5 PB of disk space is available to users in total, through multiple access protocols. A 10 Gbps network link, planned to be doubled in the next months, connects the Tier-2 to the WAN. This link is used for the LHC Open Network Environment (LHCONE) and for other general purpose traffic. In this paper we discuss the experiences at the Legnaro-Padova Tier-2: the problems that had to be addressed, the lessons learned, and the implementation choices. We also present the tools used for daily management operations. These include DOCET, a Java-based web tool designed, implemented and maintained at the Legnaro-Padova Tier-2, and deployed also at other sites, such as the Italian LHC T1. DOCET provides a uniform interface to manage all the information about the physical resources of a computing center. It is also used as a documentation repository available to the Tier-2 operations team. Finally, we discuss the foreseen developments of the existing infrastructure, in particular the evolution from a Grid-based resource towards a Cloud-based computing facility.
Computational Workbench for Multibody Dynamics
NASA Technical Reports Server (NTRS)
Edmonds, Karina
2007-01-01
PyCraft is a computer program that provides an interactive, workbench-like computing environment for developing and testing algorithms for multibody dynamics. Examples of multibody dynamic systems amenable to analysis with the help of PyCraft include land vehicles, spacecraft, robots, and molecular models. PyCraft is based on the Spatial-Operator-Algebra (SOA) formulation for multibody dynamics. The SOA operators enable construction of simple and compact representations of complex multibody dynamical equations. Within the PyCraft computational workbench, users can, essentially, use the high-level SOA operator notation to represent a variety of dynamical quantities and algorithms and to perform computations interactively. PyCraft provides a Python-language interface to underlying C++ code. Working with SOA concepts, a user can create and manipulate Python-level operator classes in order to implement and evaluate new dynamical quantities and algorithms. During use of PyCraft, virtually all SOA-based algorithms are available for computational experiments.
Experiments with Helium-Filled Balloons
ERIC Educational Resources Information Center
Zable, Anthony C.
2010-01-01
The concepts of Newtonian mechanics, fluids, and ideal gas law physics are often treated as separate and isolated topics in the typical introductory college-level physics course, especially in the laboratory setting. To bridge these subjects, a simple experiment was developed that utilizes computer-based data acquisition sensors and a digital gram…
Theoretical Branches in Teaching Computer Science
ERIC Educational Resources Information Center
Habiballa, Hashim; Kmet, Tibor
2004-01-01
The present paper describes an educational experiment dealing with teaching the theory of formal languages and automata as well as their application concepts. It presents a practical application of an educational experiment and initial results based on comparative instruction of two samples of students (n = 56). The application concept should…
Teaching Human-Centered Security Using Nontraditional Techniques
ERIC Educational Resources Information Center
Renaud, Karen; Cutts, Quintin
2013-01-01
Computing science students amass years of programming experience and a wealth of factual knowledge in their undergraduate courses. Based on our combined years of experience, however, one of our students' abiding shortcomings is that they think there is only "one correct answer" to issues in most courses: an "idealistic"…
Teaching Agile Software Development: A Case Study
ERIC Educational Resources Information Center
Devedzic, V.; Milenkovic, S. R.
2011-01-01
This paper describes the authors' experience of teaching agile software development to students of computer science, software engineering, and other related disciplines, and comments on the implications of this and the lessons learned. It is based on the authors' eight years of experience in teaching agile software methodologies to various groups…
Investigating the Effectiveness of Classroom Diagnostic Tools
ERIC Educational Resources Information Center
Schultz, Robert K.
2012-01-01
The primary purposes of the study are to investigate what teachers experience while using the Classroom Diagnostic Tools (CDT) and to relate those experiences to the rate of growth in students' mathematics achievement. The CDT contains three components: an online computer adaptive diagnostic test, interactive web-based student reports, and…
Multiview Drawing Instruction: A Two-Location Experiment
ERIC Educational Resources Information Center
Connolly, Patrick; Holliday-Darr, Kathryn; Blasko, Dawn G.
2006-01-01
Several methods have been developed, presented, and discussed at recent ASEE and EDGD conferences on the topic of computer-based multiview drawing instruction. While small-scale and localized testing of these instruments and methods has been undertaken, no larger-scale or multi-location experiments have been attempted. This paper describes an…
A Modern and Interactive Approach to Learning Laser and Optical Communications.
ERIC Educational Resources Information Center
Minasian, Robert; Alameh, Kamal
2002-01-01
Discusses challenges in teaching lasers and optical communications to engineers, including the prohibitive cost of laboratory experiments, and describes the development of a computer-based photonics simulation experiment module which provides students with an understanding and visualization of how lasers can be modulated in telecommunications.…
Design and performance of the virtualization platform for offline computing on the ATLAS TDAQ Farm
NASA Astrophysics Data System (ADS)
Ballestrero, S.; Batraneanu, S. M.; Brasolin, F.; Contescu, C.; Di Girolamo, A.; Lee, C. J.; Pozo Astigarraga, M. E.; Scannicchio, D. A.; Twomey, M. S.; Zaytsev, A.
2014-06-01
With the LHC collider at CERN currently going through the period of Long Shutdown 1 there is an opportunity to use the computing resources of the experiments' large trigger farms for other data processing activities. In the case of the ATLAS experiment, the TDAQ farm, consisting of more than 1500 compute nodes, is suitable for running Monte Carlo (MC) production jobs that are mostly CPU and not I/O bound. This contribution gives a thorough review of the design and deployment of a virtualized platform running on this computing resource and of its use to run large groups of CernVM based virtual machines operating as a single CERN-P1 WLCG site. This platform has been designed to guarantee the security and the usability of the ATLAS private network, and to minimize interference with TDAQ's usage of the farm. Openstack has been chosen to provide a cloud management layer. The experience gained in the last 3.5 months shows that the use of the TDAQ farm for the MC simulation contributes to the ATLAS data processing at the level of a large Tier-1 WLCG site, despite the opportunistic nature of the underlying computing resources being used.
Performance Analysis of Distributed Object-Oriented Applications
NASA Technical Reports Server (NTRS)
Schoeffler, James D.
1998-01-01
The purpose of this research was to evaluate the efficiency of a distributed simulation architecture in which individual modules are made self-scheduling through a message-based communication system used to request input data from the module that is the source of those data. To make the architecture as general as possible, the message-based communication architecture was implemented using standard remote object architectures (Common Object Request Broker Architecture (CORBA) and/or Distributed Component Object Model (DCOM)). A series of experiments were run in which different systems were distributed in a variety of ways across multiple computers and the performance evaluated. The experiments were duplicated in each case so that the overhead due to message communication and data transmission could be separated from the time required to actually perform the computational update of a module each iteration. The software used to distribute the modules across multiple computers was developed in the first year of the current grant and was modified considerably to add a message-based communication scheme supported by the DCOM distributed object architecture. The resulting performance was analyzed using a model created during the first year of this grant which predicts the overhead due to CORBA and DCOM remote procedure calls and includes the effects of data passed to and from the remote objects. A report covering the distributed simulation software and the results of the performance experiments has been submitted separately. That report also discusses possible future work to apply the methodology to dynamically distribute the simulation modules so as to minimize overall computation time.
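The duplicated-run design implicitly fits a timing decomposition of roughly this form (a schematic model, not the report's exact one):

\[
T_{\mathrm{iter}} \;\approx\; T_{\mathrm{comp}} \;+\; n_{\mathrm{call}}\left(t_{\mathrm{rpc}} + \frac{s}{B}\right),
\]

where T_comp is the per-iteration computational update, n_call the number of remote data requests, t_rpc the fixed CORBA/DCOM per-call latency, s the bytes transferred per call, and B the effective bandwidth; running the same system with and without distribution separates the first term from the rest.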
Description of data base management systems activities
NASA Technical Reports Server (NTRS)
1983-01-01
One of the major responsibilities of the JPL Computing and Information Services Office is to develop and maintain a JPL plan for providing computing services to the JPL management and administrative community that will lead to improved productivity. The CISO plan to accomplish this objective has been titled 'Management and Administrative Support Systems' (MASS). The MASS plan is based on the continued use of JPL's IBM 3032 Computer system for administrative computing and for the MASS functions. The current candidate administrative Data Base Management Systems required to support the MASS include ADABASE, Cullinane IDMS and TOTAL. Previous uses of administrative Data Base Systems have been applied to specific local functions rather than in a centralized manner with elements common to the many user groups. Limited capacity data base systems have been installed in microprocessor based office automation systems in a few Project and Management Offices using Ashton-Tate dBASE II. These experiences plus some other localized in house DBMS uses have provided an excellent background for developing user and system requirements for a single DBMS to support the MASS program.
Clustering recommendations to compute agent reputation
NASA Astrophysics Data System (ADS)
Bedi, Punam; Kaur, Harmeet
2005-03-01
Traditional centralized approaches to security are difficult to apply to the multi-agent systems that are nowadays used in e-commerce applications. Developing a notion of trust that is based on the reputation of an agent can provide a softer notion of security that is sufficient for many multi-agent applications. Our paper proposes a mechanism for computing the reputation of the trustee agent for use by the trustier agent. The trustier agent computes the reputation based on its own experience as well as the experience the peer agents have with the trustee agent. The trustier agent intentionally interacts with the peer agents to get their experience information in the form of recommendations. We have also considered the case of unintentional encounters between the referee agents and the trustee agent, which can be direct or can occur indirectly through a set of interacting agents. Clustering is done to filter the noise, in the form of outliers, out of the recommendations. The trustier agent clusters the recommendations received from referee agents on the basis of the distances between recommendations, using the hierarchical agglomerative method. The dendrogram thus obtained is cut at the required similarity level, which restricts the maximum distance between any two recommendations within a cluster. The cluster with the maximum number of elements represents the views of the majority of recommenders. The centre of this cluster, computed using the c-means algorithm, represents the reputation of the trustee agent.
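A compact sketch of this pipeline using SciPy's agglomerative clustering (the distance cut and majority-cluster rule follow the abstract; the abstract computes the cluster centre with c-means, for which a plain mean stands in here, and the numbers are invented):

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

def reputation(recommendations, max_dist=0.3):
    """Cluster recommendations, keep the largest cluster (the majority
    view, filtering outliers), and return its centre as reputation."""
    X = np.asarray(recommendations, dtype=float)
    Z = linkage(X, method="average")          # hierarchical agglomerative
    labels = fcluster(Z, t=max_dist, criterion="distance")
    majority = np.bincount(labels).argmax()   # biggest cluster's label
    return X[labels == majority].mean(axis=0)

# Ratings from six referee agents; the two low outliers get filtered out.
recs = [[0.80], [0.82], [0.79], [0.85], [0.10], [0.15]]
print(reputation(recs))  # ~0.815, the majority cluster's centre
```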
Analysis and Simulation of a Blue Energy Cycle
Sharma, Ms. Ketki; Kim, Yong-Ha; Yiacoumi, Sotira; ...
2016-01-30
The mixing process of fresh water and seawater releases a significant amount of energy and is a potential source of renewable energy. This so-called 'blue energy', or salinity-gradient energy, can be harvested by a device consisting of carbon electrodes immersed in an electrolyte solution, based on the principle of capacitive double layer expansion (CDLE). In this study, we have investigated the feasibility of energy production based on the CDLE principle. Experiments and computer simulations were used to study the process. Mesoporous carbon materials, synthesized at the Oak Ridge National Laboratory, were used as electrode materials in the experiments. Neutron imaging of the blue energy cycle was conducted with cylindrical mesoporous carbon electrodes and 0.5 M lithium chloride as the electrolyte solution. For experiments conducted at 0.6 V and 0.9 V applied potential, a voltage increase of 0.061 V and 0.054 V was observed, respectively. From sequences of neutron images obtained for each step of the blue energy cycle, information on the direction and magnitude of lithium ion transport was obtained. A computer code was developed to simulate the process. Experimental data and computer simulations allowed us to predict energy production.
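The energy recovered in one CDLE cycle is the area enclosed by the cycle in the charge-voltage plane,

\[
W \;=\; \oint V \, dQ,
\]

so the measured voltage rises (0.061 V at 0.6 V bias, 0.054 V at 0.9 V) set the height of that loop and, together with the charge swing, the energy produced per cycle; this is the quantity the simulation code predicts.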
Faculty experiences with providing online courses. Thorns among the roses.
Cravener, P A
1999-01-01
This article presents a review of the literature summarizing faculty reports of their experiences with computer-mediated distance education compared with their traditional face-to-face teaching experiences. Both challenges and benefits of distance learning programs contrasted with classroom-based teaching are revealed. Specific difficulties and advantages identified by online faculty were categorized into four broad areas of impact on the teaching/learning experience: (a) faculty workload, (b) access to education, (c) adapting to technology, and (d) instructional quality. Challenges appear to be related predominantly to faculty workloads, new technologies, and online course management. Benefits identified by online educators indicate that computer-mediated distance education has high potential for expanding student access to educational resources, for providing individualized instruction, and for promoting active learning among geographically separated members of learning groups.
Classification of cancerous cells based on the one-class problem approach
NASA Astrophysics Data System (ADS)
Murshed, Nabeel A.; Bortolozzi, Flavio; Sabourin, Robert
1996-03-01
One of the most important factors in reducing the impact of cancerous diseases is early diagnosis, which requires a good and robust detection method. With the advancement of computer technologies and digital image processing, the development of a computer-based detection system has become feasible. In this paper, we introduce a new approach for the detection of cancerous cells, based on the one-class problem approach, through which the classification system need only be trained with patterns of cancerous cells. This reduces the burden of the training task by about 50%. Using this approach, a computer-based classification system built on Fuzzy ARTMAP neural networks is developed. Experiments were performed using a set of 542 patterns taken from a sample of breast cancer. Results show 98% correct identification of cancerous cells and 95% correct identification of non-cancerous cells.
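The one-class idea (train only on the positive class, then flag everything that deviates) can be illustrated with a generic one-class classifier; the sketch below uses scikit-learn's OneClassSVM as a stand-in for the paper's Fuzzy ARTMAP network, with made-up two-feature cell descriptors:

```python
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(1)
# Hypothetical 2-feature descriptors of cancerous cells (training set).
cancerous = rng.normal(loc=0.0, scale=1.0, size=(200, 2))
clf = OneClassSVM(gamma="scale", nu=0.05).fit(cancerous)

new_cells = np.array([[0.1, -0.2],   # resembles the trained class
                      [6.0, 6.0]])   # far from it
print(clf.predict(new_cells))        # +1 = cancerous-like, -1 = outlier
```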
The silicon synapse or, neural net computing.
Frenger, P
1989-01-01
Recent developments have rekindled interest in the electronic neural network, a form of parallel computer architecture loosely based on the nervous system of living creatures. This paper describes the elements of neural net computers, reviews the historical milestones in their development, and lists the advantages and disadvantages of their use. Methods for software simulation of neural network systems on existing computers, as well as creation of hardware analogues, are given. The most successful applications of these techniques, involving emulation of biological system responses, are presented. The author's experiences with neural net systems are discussed.
Ogawa, Akitoshi; Bordier, Cecile; Macaluso, Emiliano
2013-01-01
The use of naturalistic stimuli to probe sensory functions in the human brain is gaining increasing interest. Previous imaging studies examined brain activity associated with the processing of cinematographic material using both standard "condition-based" designs, as well as "computational" methods based on the extraction of time-varying features of the stimuli (e.g. motion). Here, we exploited both approaches to investigate the neural correlates of complex visual and auditory spatial signals in cinematography. In the first experiment, the participants watched a piece of a commercial movie presented in four blocked conditions: 3D vision with surround sounds (3D-Surround), 3D with monaural sound (3D-Mono), 2D-Surround, and 2D-Mono. In the second experiment, they watched two different segments of the movie both presented continuously in 3D-Surround. The blocked presentation served for standard condition-based analyses, while all datasets were submitted to computation-based analyses. The latter assessed where activity co-varied with visual disparity signals and the complexity of auditory multi-sources signals. The blocked analyses associated 3D viewing with the activation of the dorsal and lateral occipital cortex and superior parietal lobule, while the surround sounds activated the superior and middle temporal gyri (S/MTG). The computation-based analyses revealed the effects of absolute disparity in dorsal occipital and posterior parietal cortices and of disparity gradients in the posterior middle temporal gyrus plus the inferior frontal gyrus. The complexity of the surround sounds was associated with activity in specific sub-regions of S/MTG, even after accounting for changes of sound intensity. These results demonstrate that the processing of naturalistic audio-visual signals entails an extensive set of visual and auditory areas, and that computation-based analyses can track the contribution of complex spatial aspects characterizing such life-like stimuli.
Learning Evolution and the Nature of Science Using Evolutionary Computing and Artificial Life
ERIC Educational Resources Information Center
Pennock, Robert T.
2007-01-01
Because evolution in natural systems happens so slowly, it is difficult to design inquiry-based labs where students can experiment and observe evolution in the way they can when studying other phenomena. New research in evolutionary computation and artificial life provides a solution to this problem. This paper describes a new A-Life software…
ERIC Educational Resources Information Center
Jacobson, Michael J.; Taylor, Charlotte E.; Richards, Deborah
2016-01-01
In this paper, we propose computational scientific inquiry (CSI) as an innovative model for learning important scientific knowledge and new practices for "doing" science. This approach involves the use of a "game-like" virtual world for students to experience virtual biological fieldwork in conjunction with using an agent-based…
Learning and Teaching with a Computer Scanner
ERIC Educational Resources Information Center
Planinsic, G.; Gregorcic, B.; Etkina, E.
2014-01-01
This paper introduces the readers to simple inquiry-based activities (experiments with supporting questions) that one can do with a computer scanner to help students learn and apply the concepts of relative motion in 1 and 2D, vibrational motion and the Doppler effect. We also show how to use these activities to help students think like…
Suggestions for CAP-TSD mesh and time-step input parameters
NASA Technical Reports Server (NTRS)
Bland, Samuel R.
1991-01-01
Suggestions for some of the input parameters used in the CAP-TSD (Computational Aeroelasticity Program-Transonic Small Disturbance) computer code are presented. These parameters include those associated with the mesh design and time step. The guidelines are based principally on experience with a one-dimensional model problem used to study wave propagation in the vertical direction.
The Impact of the Digital Divide on First-Year Community College Students
ERIC Educational Resources Information Center
Mansfield, Malinda
2017-01-01
Some students do not possess the learning management system (LMS) and basic computer skills needed for success in first-year experience (FYE) courses. The purpose of this quantitative study, based on the Integrative Learning Design Framework and theory of transactional distance, was to identify what basic computer skills and LMS skills are needed…
ERIC Educational Resources Information Center
Sadik, Olgun
2017-01-01
The primary purpose of this study was to identify secondary computer science (CS) teachers' needs, related to knowledge, skills, and school setting, to create more effective CS education in the United States. In addition, this study examined how these needs change based on the participants' years of teaching experience as well as their background…
Designing Serious Game Interventions for Individuals with Autism.
Whyte, Elisabeth M; Smyth, Joshua M; Scherf, K Suzanne
2015-12-01
The design of "Serious games" that use game components (e.g., storyline, long-term goals, rewards) to create engaging learning experiences has increased in recent years. We examine of the core principles of serious game design and examine the current use of these principles in computer-based interventions for individuals with autism. Participants who undergo these computer-based interventions often show little evidence of the ability to generalize such learning to novel, everyday social communicative interactions. This lack of generalized learning may result, in part, from the limited use of fundamental elements of serious game design that are known to maximize learning. We suggest that future computer-based interventions should consider the full range of serious game design principles that promote generalization of learning.
Spin-neurons: A possible path to energy-efficient neuromorphic computers
NASA Astrophysics Data System (ADS)
Sharad, Mrigank; Fan, Deliang; Roy, Kaushik
2013-12-01
Recent years have witnessed growing interest in the field of brain-inspired computing based on neural-network architectures. In order to translate the related algorithmic models into powerful, yet energy-efficient cognitive-computing hardware, computing-devices beyond CMOS may need to be explored. The suitability of such devices to this field of computing would strongly depend upon how closely their physical characteristics match with the essential computing primitives employed in such models. In this work, we discuss the rationale of applying emerging spin-torque devices for bio-inspired computing. Recent spin-torque experiments have shown the path to low-current, low-voltage, and high-speed magnetization switching in nano-scale magnetic devices. Such magneto-metallic, current-mode spin-torque switches can mimic the analog summing and "thresholding" operation of an artificial neuron with high energy-efficiency. Comparison with CMOS-based analog circuit-model of a neuron shows that "spin-neurons" (spin based circuit model of neurons) can achieve more than two orders of magnitude lower energy and beyond three orders of magnitude reduction in energy-delay product. The application of spin-neurons can therefore be an attractive option for neuromorphic computers of future.
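The "analog summing and thresholding" primitive that the abstract maps onto a magneto-metallic spin-torque switch is the classic hard-threshold neuron; schematically (device physics omitted),

\[
y \;=\; \Theta\!\left(\sum_i w_i I_i \;-\; I_c\right),
\]

where the weighted input currents w_i I_i sum at the device, I_c is the critical switching current of the nanomagnet, and \Theta is the step function realized by the magnetization flipping (or not) and being read out by the following stage.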
Rivard, Justin D; Vergis, Ashley S; Unger, Bertram J; Hardy, Krista M; Andrew, Chris G; Gillman, Lawrence M; Park, Jason
2014-06-01
Computer-based surgical simulators capture a multitude of metrics based on different aspects of performance, such as speed, accuracy, and movement efficiency. However, without rigorous assessment, it may be unclear whether all, some, or none of these metrics actually reflect technical skill, which can compromise educational efforts on these simulators. We assessed the construct validity of individual performance metrics on the LapVR simulator (Immersion Medical, San Jose, CA, USA) and used these data to create task-specific summary metrics. Medical students with no prior laparoscopic experience (novices, N = 12), junior surgical residents with some laparoscopic experience (intermediates, N = 12), and experienced surgeons (experts, N = 11) all completed three repetitions of four LapVR simulator tasks. The tasks included three basic skills (peg transfer, cutting, clipping) and one procedural skill (adhesiolysis). We selected 36 individual metrics on the four tasks that assessed six different aspects of performance, including speed, motion path length, respect for tissue, accuracy, task-specific errors, and successful task completion. Four of seven individual metrics assessed for peg transfer, six of ten metrics for cutting, four of nine metrics for clipping, and three of ten metrics for adhesiolysis discriminated between experience levels. Time and motion path length were significant on all four tasks. We used the validated individual metrics to create summary equations for each task, which successfully distinguished between the different experience levels. Educators should maintain some skepticism when reviewing the plethora of metrics captured by computer-based simulators, as some but not all are valid. We showed the construct validity of a limited number of individual metrics and developed summary metrics for the LapVR. The summary metrics provide a succinct way of assessing skill with a single metric for each task, but require further validation.
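The abstract does not give the form of the summary equations; one common construction for collapsing several validated metrics into a single task score is an averaged z-score, sketched below under that assumption (the metric names and reference norms are hypothetical):

    import numpy as np

    def summary_score(metric_values, norms):
        """Average the z-scores of the validated metrics; metrics where
        lower is better (time, path length) are sign-flipped so a higher
        summary score always means better performance."""
        z = []
        for name, value in metric_values.items():
            mean, sd, lower_is_better = norms[name]
            score = (value - mean) / sd
            z.append(-score if lower_is_better else score)
        return float(np.mean(z))

    # Hypothetical reference norms: (mean, sd, lower_is_better)
    norms = {"time_s": (120.0, 30.0, True), "path_mm": (800.0, 200.0, True)}
    print(summary_score({"time_s": 90.0, "path_mm": 700.0}, norms))  # 0.75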
2013-01-01
Objective. This study compared the relationship between computer experience and performance on computerized cognitive tests and a traditional paper-and-pencil cognitive test in a sample of older adults (N = 634). Method. Participants completed computer experience and computer attitudes questionnaires, three computerized cognitive tests (Useful Field of View (UFOV) Test, Road Sign Test, and Stroop task) and a paper-and-pencil cognitive measure (Trail Making Test). Multivariate analysis of covariance was used to examine differences in cognitive performance across the four measures between those with and without computer experience after adjusting for confounding variables. Results. Although computer experience had a significant main effect across all cognitive measures, the effect sizes were similar. After controlling for computer attitudes, the relationship between computer experience and UFOV was fully attenuated. Discussion. Findings suggest that computer experience is not uniquely related to performance on computerized cognitive measures compared with paper-and-pencil measures. Because the relationship between computer experience and UFOV was fully attenuated by computer attitudes, this may imply that motivational factors are more influential to UFOV performance than computer experience. Our findings support the hypothesis that computer use is related to cognitive performance, and this relationship is not stronger for computerized cognitive measures. Implications and directions for future research are provided. PMID:22929395
NASA Astrophysics Data System (ADS)
Antoine, Marilyn V.
2011-12-01
The purpose of this research was to extend earlier research on sources of self-efficacy (Lent, Lopez, & Bieschke, 1991; Usher & Pajares, 2009) to the information technology domain. The principal investigator examined how Bandura's (1977) sources of self-efficacy information (mastery experience, vicarious experience, verbal persuasion, and physiological states) shape computer self-efficacy beliefs and influence the decision to use or not use computers. The study took place at a mid-sized Historically Black College or University in the South. A convenience sample of 105 undergraduates was drawn from students enrolled in multiple sections of two introductory computer courses. There were 67 females and 38 males. This research was a correlational study of the following variables: sources of computer self-efficacy, general computer self-efficacy, outcome expectations, computer anxiety, and intention to use computers. The principal investigator administered a survey questionnaire containing 52 Likert items to measure the major study variables. Additionally, the survey instrument collected demographic variables such as gender, age, race, intended major, classification, technology use, technology adoption category, and whether the student owns a computer. The results reveal the following: (1) Mastery experience and verbal persuasion had statistically significant relationships to general computer self-efficacy, while vicarious experience and physiological states had non-significant relationships. Mastery experience had the strongest correlation to general computer self-efficacy. (2) All of the sources of computer self-efficacy had statistically significant relationships to personal outcome expectations. Vicarious experience had the strongest correlation to personal outcome expectations. (3) All of the sources of self-efficacy had statistically significant relationships to performance outcome expectations. Vicarious experience had the strongest correlation to performance outcome expectations. (4) Mastery experience and physiological states had statistically significant relationships to computer anxiety, while vicarious experience and verbal persuasion had non-significant relationships. Physiological states had the strongest correlation to computer anxiety. (5) Mastery experience, vicarious experience, and physiological states had statistically significant relationships to intention to use computers, while verbal persuasion had a non-significant relationship. Mastery experience had the strongest correlation to intention to use computers. Gender-related findings indicate that females reported higher average mastery experience, vicarious experience, physiological states, and intention to use computers than males. Females reported lower average general computer self-efficacy, computer anxiety, verbal persuasion, personal outcome expectations, and performance outcome expectations than males. The results of this study can be used to develop strategies for increasing general computer self-efficacy, outcome expectations, and intention to use computers. The results can also be used to develop strategies for reducing computer anxiety.
Numerical Modeling of Nonlinear Thermodynamics in SMA Wires
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reynolds, D R; Kloucek, P
We present a mathematical model describing the thermodynamic behavior of shape memory alloy wires, as well as a computational technique to solve the resulting system of partial differential equations. The model consists of conservation equations based on a new Helmholtz free energy potential. The computational technique introduces a viscosity-based continuation method, which allows the model to handle dynamic applications where the temporally local behavior of solutions is desired. Computational experiments document that this combination of modeling and solution techniques appropriately predicts the thermally- and stress-induced martensitic phase transitions, as well as the hysteretic behavior and production of latent heat associated with such materials.
NASA Technical Reports Server (NTRS)
1972-01-01
The IDAPS (Image Data Processing System) is a user-oriented, computer-based, language and control system, which provides a framework or standard for implementing image data processing applications, simplifies set-up of image processing runs so that the system may be used without a working knowledge of computer programming or operation, streamlines operation of the image processing facility, and allows multiple applications to be run in sequence without operator interaction. The control system loads the operators, interprets the input, constructs the necessary parameters for each application, and calls the application. The overlay feature of the IBSYS loader (IBLDR) provides the means of running multiple operators which would otherwise overflow core storage.
Fuel Injector Design Optimization for an Annular Scramjet Geometry
NASA Technical Reports Server (NTRS)
Steffen, Christopher J., Jr.
2003-01-01
A four-parameter, three-level, central composite experiment design has been used to optimize the configuration of an annular scramjet injector geometry using computational fluid dynamics. The computational fluid dynamic solutions played the role of computer experiments, and response surface methodology was used to capture the simulation results for mixing efficiency and total pressure recovery within the scramjet flowpath. An optimization procedure, based upon the response surface results of mixing efficiency, was used to compare the optimal design configuration against the target efficiency value of 92.5%. The results of three different optimization procedures are presented and all point to the need to look outside the current design space for different injector geometries that can meet or exceed the stated mixing efficiency target.
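In response surface methodology of this kind, the CFD runs at the central-composite design points are fit with a full quadratic polynomial, which is then searched for the optimum; a sketch for two of the four parameters (design points in coded units; the response values are invented, as is the use of plain least squares):

    import numpy as np

    # Hypothetical (x1, x2) central-composite points and mixing-efficiency responses
    X = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1],
                  [-1.41, 0], [1.41, 0], [0, -1.41], [0, 1.41], [0, 0]])
    y = np.array([0.88, 0.90, 0.89, 0.93, 0.87, 0.92, 0.88, 0.91, 0.915])

    # Full quadratic model: 1, x1, x2, x1*x2, x1^2, x2^2
    A = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1],
                         X[:, 0] * X[:, 1], X[:, 0]**2, X[:, 1]**2])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)

    def surface(x1, x2):
        """Evaluate the fitted response surface at a design point."""
        return coef @ [1, x1, x2, x1 * x2, x1**2, x2**2]

    # Grid search the design space for the configuration nearest the
    # 92.5% mixing-efficiency target
    grid = np.linspace(-1.41, 1.41, 50)
    best = max((surface(a, b), a, b) for a in grid for b in grid)
    print(best)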
Posttest calculation of the PBF LOC-11B and LOC-11C experiments using RELAP4/MOD6. [PWR
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hendrix, C.E.
Comparisons between RELAP4/MOD6, Update 4 code-calculated and measured experimental data are presented for the PBF LOC-11C and LOC-11B experiments. Independent code verification techniques are now being developed and this study represents a preliminary effort applying structured criteria for developing computer models, selecting code input, and performing base-run analyses. Where deficiencies are indicated in the base-case representation of the experiment, methods of code and criteria improvement are developed and appropriate recommendations are made.
Wang, Zhaocai; Ji, Zuwen; Wang, Xiaoming; Wu, Tunhua; Huang, Wei
2017-12-01
As a promising approach to computationally intractable problems, DNA computing is an emerging research area spanning mathematics, computer science and molecular biology. The task scheduling problem, a well-known NP-complete problem, assigns n jobs to m individuals and seeks the minimum execution time of the last finished individual. In this paper, we use a biologically inspired computational model and describe a new parallel algorithm to solve the task scheduling problem by basic DNA molecular operations. We design flexible-length DNA strands to represent elements of the allocation matrix, apply appropriate biological experiment operations, and obtain solutions of the task scheduling problem in the proper length range with less than O(n^2) time complexity.
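For scale, this is the problem the molecular algorithm parallelizes: assign n jobs to m individuals and minimize the finish time of the last one. A brute-force baseline in Python makes the classical O(m^n) blow-up explicit (toy data):

    from itertools import product

    def min_makespan(job_times, m):
        """Exhaustively assign each of the n jobs to one of m individuals
        and return the smallest possible makespan. O(m^n) time: the
        exponential blow-up that motivates molecular parallelism."""
        best = float("inf")
        for assignment in product(range(m), repeat=len(job_times)):
            loads = [0.0] * m
            for job, worker in zip(job_times, assignment):
                loads[worker] += job
            best = min(best, max(loads))
        return best

    print(min_makespan([3, 5, 2, 7, 4], m=2))  # -> 11.0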
NASA Astrophysics Data System (ADS)
Wei, Hui; Gong, Guanghong; Li, Ni
2017-10-01
Computer-generated holography (CGH) is a promising 3D display technology, but it is challenged by a heavy computation load and vast memory requirements. To solve these problems, a depth-compensating CGH calculation method based on the symmetry and similarity of zone plates is proposed and implemented on a graphics processing unit (GPU). An improved LUT method is put forward to compute the distances between object points and hologram pixels in the XY direction. The concept of a depth-compensating factor is defined and used for calculating the holograms of points at different depth positions, instead of layer-based methods. The proposed method is suitable for arbitrary sampling objects, with lower memory usage and higher computational efficiency compared to other CGH methods. The effectiveness of the proposed method is validated by numerical and optical experiments.
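A minimal sketch of the zone-plate structure being exploited: the squared transverse distances form one radially symmetric table shared by all object points, and depth enters only through a scalar factor in the paraxial phase (the optical parameters below are illustrative, not the paper's):

    import numpy as np

    wavelength = 532e-9
    k = 2 * np.pi / wavelength
    N, pitch = 256, 8e-6

    # LUT of squared transverse (XY) distances between hologram pixels and
    # an object point projected onto the hologram plane; shared by all depths.
    ax = (np.arange(N) - N // 2) * pitch
    r2_lut = ax[None, :]**2 + ax[:, None]**2

    def point_hologram(z, amplitude=1.0):
        """Fresnel zone-plate field of one point; the 1/(2z) factor plays
        the role of a depth-compensating term scaling the same r^2 LUT."""
        phase = k * (z + r2_lut / (2 * z))   # paraxial approximation
        return amplitude * np.exp(1j * phase)

    # Hologram of two points at different depths, reusing one LUT
    H = point_hologram(0.10) + point_hologram(0.12)
    print(np.angle(H).shape)  # (256, 256) phase pattern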
Gate sequence for continuous variable one-way quantum computation
Su, Xiaolong; Hao, Shuhong; Deng, Xiaowei; Ma, Lingyu; Wang, Meihong; Jia, Xiaojun; Xie, Changde; Peng, Kunchi
2013-01-01
Measurement-based one-way quantum computation using cluster states as resources provides an efficient model to perform computation and information processing of quantum codes. Arbitrary Gaussian quantum computation can be implemented by sufficiently long single-mode and two-mode gate sequences. However, continuous variable gate sequences have not been realized so far due to the absence of cluster states larger than four submodes. Here we present the first continuous variable gate sequence consisting of a single-mode squeezing gate and a two-mode controlled-phase gate based on a six-mode cluster state. The quantum property of this gate sequence is confirmed by the fidelities and the quantum entanglement of the two output modes, which depend on both the squeezing and controlled-phase gates. The experiment demonstrates the feasibility of implementing Gaussian quantum computation by means of accessible gate sequences.
Impact of detector simulation in particle physics collider experiments
NASA Astrophysics Data System (ADS)
Daniel Elvira, V.
2017-06-01
Through the last three decades, accurate simulation of the interactions of particles with matter and modeling of detector geometries has proven to be of critical importance to the success of the international high-energy physics (HEP) experimental programs. For example, the detailed detector modeling and accurate physics of the Geant4-based simulation software of the CMS and ATLAS particle physics experiments at the European Organization for Nuclear Research (CERN) Large Hadron Collider (LHC) was a determining factor in these collaborations delivering physics results of outstanding quality faster than any hadron collider experiment ever before. This review article highlights the impact of detector simulation on particle physics collider experiments. It presents numerous examples of the use of simulation, from detector design and optimization, through software and computing development and testing, to cases where the use of simulation samples made a difference in the precision of the physics results and publication turnaround, from data-taking to submission. It also presents estimates of the cost and economic impact of simulation in the CMS experiment. Future experiments will collect orders of magnitude more data with increasingly complex detectors, taxing heavily the performance of simulation and reconstruction software. Consequently, exploring solutions to speed up simulation and reconstruction software to satisfy the growing demand for computing resources in a time of flat budgets is a matter that deserves immediate attention. The article ends with a short discussion of the potential solutions under consideration, based on leveraging core count growth in multicore machines, using new-generation coprocessors, and re-engineering HEP code for concurrency and parallel computing.
The process group approach to reliable distributed computing
NASA Technical Reports Server (NTRS)
Birman, Kenneth P.
1992-01-01
The difficulty of developing reliable distributed software is an impediment to applying distributed computing technology in many settings. Experience with the ISIS system suggests that a structured approach based on virtually synchronous process groups yields systems that are substantially easier to develop, exploit sophisticated forms of cooperative computation, and achieve high reliability. Six years of research on ISIS are reviewed, describing the model, its implementation challenges, and the types of applications to which ISIS has been applied.
On the Large-Scaling Issues of Cloud-based Applications for Earth Science Data
NASA Astrophysics Data System (ADS)
Hua, H.
2016-12-01
Next generation science data systems are needed to address the incoming flood of data from new missions such as NASA's SWOT and NISAR, whose SAR data volumes and data throughput rates are orders of magnitude larger than those of present day missions. Existing missions, such as OCO-2, may also require fast turn-around times for processing different science scenarios, where on-premise and even traditional HPC computing environments may not meet the high processing needs. Additionally, traditional means of procuring hardware on-premise are already limited due to facilities capacity constraints for these new missions. Experience has shown that embracing efficient cloud computing approaches for large-scale science data systems requires more than just moving existing code to cloud environments. At large cloud scales, we need to deal with scaling and cost issues. We present our experiences deploying multiple instances of our hybrid-cloud computing science data system (HySDS) to support large-scale processing of Earth Science data products. We explore optimization approaches to getting the best performance out of hybrid-cloud computing, as well as common issues that arise when dealing with large-scale computing. Novel approaches were utilized to do processing on Amazon's spot market, which can potentially offer 75%-90% cost savings but with an unpredictable computing environment driven by market forces.
The Ruggedized STD Bus Microcomputer - A low cost computer suitable for Space Shuttle experiments
NASA Technical Reports Server (NTRS)
Budney, T. J.; Stone, R. W.
1982-01-01
Previous space flight computers have been costly in terms of both hardware and software. The Ruggedized STD Bus Microcomputer is based on the commercial Mostek/Pro-Log STD Bus. Ruggedized PC cards can be based on commercial cards from more than 60 manufacturers, reducing hardware cost and design time. Software costs are minimized by using standard 8-bit microprocessors and by debugging code using commercial versions of the ruggedized flight boards while the flight hardware is being fabricated.
The November 1, 2017 issue of Cancer Research is dedicated to a collection of computational resource papers in genomics, proteomics, animal models, imaging, and clinical subjects for non-bioinformaticists looking to incorporate computing tools into their work. Scientists at Pacific Northwest National Laboratory have developed P-MartCancer, an open, web-based interactive software tool that enables statistical analyses of peptide or protein data generated from mass-spectrometry (MS)-based global proteomics experiments.
Rapid Human-Computer Interactive Conceptual Design of Mobile and Manipulative Robot Systems
2015-05-19
algorithm based on Age-Fitness Pareto Optimization (AFPO) [9] with an additional user-preference objective and a neural network-based user model, we... greater than 40, which is about 5 times further than any robot traveled in our experiments. ... The algorithm uses a client-server computational... architecture. The client here is an interactive program which takes a pair of controllers as input, simulates two copies of the robot with
A Fine-Grained and Privacy-Preserving Query Scheme for Fog Computing-Enhanced Location-Based Service
Yang, Xue; Yin, Fan; Tang, Xiaohu
2017-07-11
Location-based services (LBS), as one of the most popular location-awareness applications, has been further developed to achieve low-latency with the assistance of fog computing. However, privacy issues remain a research challenge in the context of fog computing. Therefore, in this paper, we present a fine-grained and privacy-preserving query scheme for fog computing-enhanced location-based services, hereafter referred to as FGPQ. In particular, mobile users can obtain the fine-grained searching result satisfying not only the given spatial range but also the searching content. Detailed privacy analysis shows that our proposed scheme indeed achieves the privacy preservation for the LBS provider and mobile users. In addition, extensive performance analyses and experiments demonstrate that the FGPQ scheme can significantly reduce computational and communication overheads and ensure the low-latency, which outperforms existing state-of-the art schemes. Hence, our proposed scheme is more suitable for real-time LBS searching.
NASA Astrophysics Data System (ADS)
Shegog, Ross; Lazarus, Melanie M.; Murray, Nancy G.; Diamond, Pamela M.; Sessions, Nathalie; Zsigmond, Eva
2012-10-01
The transgenic mouse model is useful for studying the causes and potential cures for human genetic diseases. Exposing high school biology students to laboratory experience in developing transgenic animal models is logistically prohibitive. Computer-based simulation, however, offers this potential in addition to advantages of fidelity and reach. This study describes and evaluates a computer-based simulation that trains advanced placement high school science students in the laboratory protocols by which a transgenic mouse model is produced. A simulation module on preparing a gene construct in the molecular biology lab was evaluated using a randomized clinical control design with advanced placement high school biology students in Mercedes, Texas (n = 44). Pre-post tests assessed procedural and declarative knowledge, time on task, and attitudes toward computers for learning and towards science careers. Students who used the simulation increased their procedural and declarative knowledge regarding molecular biology compared to those in the control condition (both p < 0.005). Significant increases continued to occur with additional use of the simulation (p < 0.001). Students in the treatment group became more positive toward using computers for learning (p < 0.001). The simulation did not significantly affect attitudes toward science in general. Computer simulations of complex transgenic protocols have the potential to provide a "virtual" laboratory experience as an adjunct to conventional educational approaches.
Kaga, Chiaki; Okochi, Mina; Tomita, Yasuyuki; Kato, Ryuji; Honda, Hiroyuki
2008-03-01
We developed a method of effective peptide screening that combines experiments and computational analysis. The method is based on the concept that screening efficiency can be enhanced, even from limited data, by use of a model derived from computational analysis that serves as a guide to screening, combined with subsequent repeated experiments. Here we focus on cell-adhesion peptides as a model application of this peptide-screening strategy. Cell-adhesion peptides were screened by use of a cell-based assay of a peptide array. Starting with the screening data obtained from a limited, random 5-mer library (643 sequences), a rule regarding structural characteristics of cell-adhesion peptides was extracted by fuzzy neural network (FNN) analysis. According to this rule, peptides with unfavored residues in certain positions that led to inefficient binding were eliminated from the random sequences. In the restricted second random library (273 sequences), the yield of cell-adhesion peptides having an adhesion rate more than 1.5-fold that of the basal array support was significantly higher (31%) than in the unrestricted random library (20%). In the restricted third library (50 sequences), the yield of cell-adhesion peptides increased to 84%. We conclude that a repeated cycle of experiments screening limited numbers of peptides can be assisted by the rule-extracting feature of FNN.
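The screening strategy is a loop: assay a library, extract a rule from the results, restrict the next library with the rule, and assay again. Below is a schematic of one round, with a hand-written positional rule standing in for the FNN-extracted one (the assay function and the banned-residue rule are hypothetical):

    import random

    AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

    def random_library(n, length=5):
        """Random 5-mer peptide library."""
        return ["".join(random.choices(AMINO_ACIDS, k=length)) for _ in range(n)]

    def passes_rule(peptide, banned):
        """Stand-in for the FNN-extracted rule: reject peptides carrying an
        unfavored residue at a given position (banned = {pos: residues})."""
        return all(peptide[pos] not in residues for pos, residues in banned.items())

    def screening_round(library, assay, banned):
        restricted = [p for p in library if passes_rule(p, banned)]
        hits = [p for p in restricted if assay(p)]          # wet-lab step
        return hits, len(hits) / max(len(restricted), 1)    # yield

    # Hypothetical assay: adhesion driven by two or more basic residues (R/K)
    assay = lambda p: sum(p.count(a) for a in "RK") >= 2
    hits, yield_ = screening_round(random_library(300), assay, banned={0: "DE"})
    print(len(hits), round(yield_, 2))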
Jiang, Yuyi; Shao, Zhiqing; Guo, Yi
2014-01-01
A complex computing problem can be solved efficiently on a system with multiple computing nodes by dividing its implementation code into several parallel processing modules or tasks that can be formulated as directed acyclic graph (DAG) problems. The DAG jobs may be mapped to and scheduled on the computing nodes to minimize the total execution time. Searching for an optimal DAG scheduling solution is considered to be NP-complete. This paper proposes a tuple molecular structure-based chemical reaction optimization (TMSCRO) method for DAG scheduling on heterogeneous computing systems, based on a very recently proposed metaheuristic method, chemical reaction optimization (CRO). Compared with other CRO-based algorithms for DAG scheduling, the design of the tuple reaction molecular structure and the four elementary reaction operators of TMSCRO is more reasonable. TMSCRO also applies the concepts of constrained critical paths (CCPs), the constrained-critical-path directed acyclic graph (CCPDAG) and the super molecule for accelerating convergence. In this paper, we have also conducted simulation experiments to verify the effectiveness and efficiency of TMSCRO upon a large set of randomly generated graphs and graphs for real world problems. PMID:25143977
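Whatever metaheuristic proposes the schedule, evaluating a candidate means list-scheduling the DAG onto heterogeneous nodes and measuring the makespan; a minimal evaluator of that kind follows (the toy DAG, node speeds and assignment are invented):

    def dag_makespan(tasks, deps, speeds, assignment):
        """Schedule tasks in topological order; each task starts when its
        node is free and all predecessors have finished. Heterogeneity
        enters through per-node speed factors."""
        finish, node_free = {}, {n: 0.0 for n in range(len(speeds))}
        for t in tasks:                  # tasks given in topological order
            node = assignment[t]
            ready = max((finish[p] for p in deps.get(t, [])), default=0.0)
            start = max(ready, node_free[node])
            finish[t] = start + tasks[t] / speeds[node]
            node_free[node] = finish[t]
        return max(finish.values())

    tasks = {"a": 4.0, "b": 3.0, "c": 2.0, "d": 5.0}   # work amounts
    deps = {"c": ["a", "b"], "d": ["c"]}               # c waits on a,b; d on c
    print(dag_makespan(tasks, deps, speeds=[1.0, 2.0],
                       assignment={"a": 0, "b": 1, "c": 1, "d": 1}))  # 7.5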
NASA Technical Reports Server (NTRS)
Dorais, Gregory A.; Kurien, James; Rajan, Kanna
1999-01-01
We describe the computer demonstration of the Remote Agent Experiment (RAX). The Remote Agent is a high-level, model-based, autonomous control agent being validated on the NASA Deep Space 1 spacecraft.
NASA Technical Reports Server (NTRS)
Kvaternik, R. G.
1975-01-01
Two computational procedures for analyzing complex structural systems for their natural modes and frequencies of vibration are presented. Both procedures are based on a substructures methodology and both employ the finite-element stiffness method to model the constituent substructures. The first procedure is a direct method based on solving the eigenvalue problem associated with a finite-element representation of the complete structure. The second procedure is a component-mode synthesis scheme in which the vibration modes of the complete structure are synthesized from modes of substructures into which the structure is divided. The analytical basis of the methods contains a combination of features which enhance the generality of the procedures. The computational procedures are notable for their versatility, computational convenience, and ease of computer implementation. They were implemented in two special-purpose computer programs. The results of the application of these programs to several structural configurations are shown and comparisons are made with experiment.
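Both procedures ultimately reduce to the generalized eigenvalue problem K*phi = omega^2*M*phi for the assembled stiffness and mass matrices; a small sketch on a 2-degree-of-freedom spring-mass chain (the matrices are illustrative, not from the report):

    import numpy as np
    from scipy.linalg import eigh

    # Stiffness and mass matrices of a 2-DOF spring-mass chain
    # (two unit masses, unit springs, one fixed end)
    K = np.array([[2.0, -1.0],
                  [-1.0, 1.0]])
    M = np.eye(2)

    # Solve K phi = w^2 M phi for natural frequencies and mode shapes
    w2, phi = eigh(K, M)
    print(np.sqrt(w2))   # natural frequencies (rad/s): ~0.618, ~1.618
    print(phi)           # mass-orthonormal mode shapes (columns)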
Skeels, Meredith M; Kurth, Ann; Clausen, Marc; Severynen, Anneleen; Garcia-Smith, Hal
2006-01-01
CARE+ is a tablet PC-based computer counseling tool designed to support medication adherence and secondary HIV prevention for people living with HIV. Thirty HIV+ men and women participated in our user study to assess usability and attitudes towards CARE+. We observed them using CARE+ for the first time and conducted a semi-structured interview afterwards. Our findings suggest computer counseling may reduce social bias and encourage participants to answer questions honestly. Participants felt that discussing sensitive subjects with a computer instead of a person reduced feelings of embarrassment and being judged, and promoted privacy. Results also confirm that potential users think computers can provide helpful counseling, and that many also want human counseling interaction. Our study also revealed that tablet PC-based applications are usable by our population of mixed experience computer users. Computer counseling holds great potential for providing assessment and health promotion to individuals with chronic conditions such as HIV.
NASA Astrophysics Data System (ADS)
Billard, Aude
2000-10-01
This paper summarizes a number of experiments in biologically inspired robotics. The common feature to all experiments is the use of artificial neural networks as the building blocks for the controllers. The experiments speak in favor of using a connectionist approach for designing adaptive and flexible robot controllers, and for modeling neurological processes. I present 1) DRAMA, a novel connectionist architecture, which has general property for learning time series and extracting spatio-temporal regularities in multi-modal and highly noisy data; 2) Robota, a doll-shaped robot, which imitates and learns a proto-language; 3) an experiment in collective robotics, where a group of 4 to 15 Khepera robots learn dynamically the topography of an environment whose features change frequently; 4) an abstract, computational model of primate ability to learn by imitation; 5) a model for the control of locomotor gaits in a quadruped legged robot.
NASA Technical Reports Server (NTRS)
Hirt, Stefanie M.; Reich, David B.; O'Connor, Michael B.
2010-01-01
Computational fluid dynamics was used to study the effectiveness of micro-ramp vortex generators to control oblique shock boundary layer interactions. Simulations were based on experiments previously conducted in the 15 x 15 cm supersonic wind tunnel at NASA Glenn Research Center. Four micro-ramp geometries were tested at Mach 2.0 varying the height, chord length, and spanwise spacing between micro-ramps. The overall flow field was examined. Additionally, key parameters such as boundary-layer displacement thickness, momentum thickness and incompressible shape factor were also examined. The computational results predicted the effects of the micro-ramps well, including the trends for the impact that the devices had on the shock boundary layer interaction. However, computing the shock boundary layer interaction itself proved to be problematic since the calculations predicted more pronounced adverse effects on the boundary layer due to the shock than were seen in the experiment.
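The integral boundary-layer parameters named in the abstract follow directly from a velocity profile: displacement thickness delta* = integral of (1 - u/U), momentum thickness theta = integral of (u/U)(1 - u/U), and shape factor H = delta*/theta (incompressible forms); a sketch with a made-up turbulent profile:

    import numpy as np

    def bl_parameters(y, u, U_e):
        """Incompressible displacement thickness, momentum thickness, and
        shape factor from a velocity profile u(y) with edge velocity U_e."""
        ratio = u / U_e
        delta_star = np.trapz(1.0 - ratio, y)          # displacement thickness
        theta = np.trapz(ratio * (1.0 - ratio), y)     # momentum thickness
        return delta_star, theta, delta_star / theta   # H = shape factor

    # Hypothetical 1/7th-power-law turbulent profile
    y = np.linspace(0.0, 0.01, 200)                    # wall-normal distance (m)
    u = 300.0 * (y / y[-1]) ** (1.0 / 7.0)             # velocity (m/s)
    print(bl_parameters(y, u, U_e=300.0))              # H ~ 1.29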
NASA Astrophysics Data System (ADS)
Buhari, Abudhahir; Zukarnain, Zuriati Ahmad; Khalid, Roszelinda; Zakir Dato', Wira Jaafar Ahmad
2016-11-01
Applications of quantum information science are moving towards bigger and better heights for the next generation of technology. In the fields of quantum cryptography and quantum computation in particular, the world has already witnessed ground-breaking tangible products and promising results. Quantum cryptography is one of the mature fields of quantum mechanics, and products are already available in the market. The current state of quantum cryptography is still under active research in order to reach the heights of digital cryptography. The complexity of quantum cryptography is high due to the combination of hardware and software. The lack of an effective simulation tool to design and analyze quantum cryptography experiments delays progress. In this paper, we propose a framework for an effective non-entanglement-based quantum cryptography simulation tool. We applied a hybrid simulation technique, i.e. discrete event, continuous event and system dynamics. We also highlight the limitations of experiments based on a commercial photonic simulation tool. Finally, we discuss ideas for achieving a one-stop simulation package for quantum-based secure key distribution experiments. All the modules of the simulation framework are viewed from the computer science perspective.
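As an example of the non-entanglement-based protocols such a simulator targets, the discrete-event core of BB84 key sifting is small; a toy sketch with an ideal channel and no eavesdropper (this is not the framework's actual module structure):

    import random

    def bb84_sift(n_pulses):
        """Toy BB84 run: Alice sends random bits in random bases, Bob
        measures in random bases; they keep only matching-basis rounds."""
        alice_bits = [random.randint(0, 1) for _ in range(n_pulses)]
        alice_bases = [random.choice("+x") for _ in range(n_pulses)]
        bob_bases = [random.choice("+x") for _ in range(n_pulses)]
        # Ideal channel: matching basis reproduces the bit, otherwise random
        bob_bits = [b if ab == bb else random.randint(0, 1)
                    for b, ab, bb in zip(alice_bits, alice_bases, bob_bases)]
        sifted = [(a, b) for a, b, ab, bb
                  in zip(alice_bits, bob_bits, alice_bases, bob_bases)
                  if ab == bb]
        return sifted  # ~50% of pulses survive sifting

    key = bb84_sift(1000)
    print(len(key), all(a == b for a, b in key))  # ideal channel: keys agree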
Ammenwerth, Elske; Mansmann, Ulrich; Iller, Carola; Eichstädter, Ronald
2003-01-01
The documentation of the nursing process is an important but often neglected part of clinical documentation. Paper-based systems have been introduced to support nursing process documentation. Frequently, however, problems such as low quality of documentation are reported. It is unclear whether computer-based documentation systems can reduce these problems and which factors influence their acceptance by users. We introduced a computer-based nursing documentation system on four wards of the University Hospitals of Heidelberg and systematically evaluated its preconditions and its effects in a pretest-posttest intervention study. For the analysis of user acceptance, we concentrated on subjective data drawn from questionnaires and interviews. A questionnaire was developed using items from published questionnaires and items that had to be developed for the special purpose of this study. The quantitative results point to two factors influencing the acceptance of a new computer-based documentation system: the previous acceptance of the nursing process and the previous amount of self-confidence when using computers. On one ward, the various acceptance scores declined sharply after the introduction of the nursing documentation system. Explorative qualitative analysis on this ward points to further success factors of computer-based nursing documentation systems. Our results can be used to assist the planning and introduction of computer-based nursing documentation systems. They demonstrate the importance of computer experience and acceptance of the nursing process on a ward, but also point to other factors such as the fit between nursing workflow and the functionality of a nursing documentation system.
ERIC Educational Resources Information Center
Chang, Shao-Chen; Hwang, Gwo-Jen
2017-01-01
In this study, a mission synchronization-based peer-assistance approach is proposed to improve students' learning performance in digital game-based learning activities. To evaluate the effectiveness of the proposed approach, an experiment has been conducted in an elementary school natural science course to examine the participants' learning…
Database Administration: Concepts, Tools, Experiences, and Problems.
ERIC Educational Resources Information Center
Leong-Hong, Belkis; Marron, Beatrice
The concepts of data base administration, the role of the data base administrator (DBA), and computer software tools useful in data base administration are described in order to assist data base technologists and managers. A study of DBA's in the Federal Government is detailed in terms of the functions they perform, the software tools they use,…
ERIC Educational Resources Information Center
Tang, Stephen; Hanneghan, Martin
2011-01-01
Game-based learning harnesses the advantages of computer games technology to create a fun, motivating and interactive virtual learning environment that promotes problem-based experiential learning. Such an approach is advocated by many commentators as providing an enhanced learning experience compared with those based on traditional didactic methods.…
Computational studies of physical properties of Nb-Si based alloys
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ouyang, Lizhi
2015-04-16
The overall goal is to provide physical properties data supplementing experiments for thermodynamic modeling and other simulations, such as phase-field simulation for microstructure and continuum simulations for mechanical properties. These predictive computational modeling and simulation efforts may yield insights that can be used to guide materials design, processing, and manufacture. Ultimately, they may lead to usable Nb-Si based alloys, which could play an important role in the current push towards greener energy. The main objectives of the proposed projects are: (1) developing a first-principles-based supercell approach for calculating thermodynamic and mechanical properties of ordered crystals and disordered lattices, including solid solutions; (2) application of the supercell approach to Nb-Si based alloys to compute physical properties data that can be used for thermodynamic modeling and other simulations to guide the optimal design of Nb-Si based alloys.
Student Experiments and Teacher Tests Using EDAQ530
ERIC Educational Resources Information Center
Kopasz, Katalin; Makra, Péter; Gingl, Zoltán
2013-01-01
Experiments, as we all know, are especially important in science education. However, their impact on improving thinking could be even greater when applied together with the methods of inquiry-based learning (IBL). In this paper we present our observations of a high-school laboratory class where students used computers to carry out and analyse real…
ERIC Educational Resources Information Center
Chen, Hsin-Liang; Choi, Gilok
2005-01-01
This study investigates socio-technical aspects of digital video libraries based on college students' learning experiences and perspectives. Forty-one students in biology classes were studied through a survey and individual interviews. Findings are presented by the students' knowledge of computer technology, experiences with AV materials, and…
Validity of Adult Retrospective Reports of Adverse Childhood Experiences: Review of the Evidence
ERIC Educational Resources Information Center
Hardt, Jochen; Rutter, Michael
2004-01-01
Background: Influential studies have cast doubt on the validity of retrospective reports by adults of their own adverse experiences in childhood. Accordingly, many researchers view retrospective reports with scepticism. Method: A computer-based search, supplemented by hand searches, was used to identify studies reported between 1980 and 2001 in…
An Ab Initio Based Potential Energy Surface for Water
NASA Technical Reports Server (NTRS)
Partridge, Harry; Schwenke, David W.; Langhoff, Stephen R. (Technical Monitor)
1996-01-01
We report a new determination of the water potential energy surface. A high quality ab initio potential energy surface (PES) and dipole moment function of water have been computed. This PES is empirically adjusted to improve the agreement between the computed line positions and those from the HITRAN 92 data base. The adjustment is small; nonetheless, including an estimate of core (oxygen 1s) electron correlation greatly improves the agreement with experiment. Of the 27,245 assigned transitions in the HITRAN 92 data base for H2(O-16), the overall root mean square (rms) deviation between the computed and observed line positions is 0.125/cm. However, the deviations do not correspond to a normal distribution: 69% of the lines have errors less than 0.05/cm. Overall, the agreement between the line intensities computed in the present work and those contained in the data base is quite good; however, there are a significant number of line strengths which differ greatly.
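The quoted statistics are simple functions of the residual vector of computed-minus-observed line positions; a sketch with fabricated residuals standing in for the 27,245 real ones (made heavy-tailed on purpose, since the abstract notes the deviations are not normally distributed):

    import numpy as np

    rng = np.random.default_rng(0)
    # Stand-in residuals (cm^-1); the real ones come from computed-minus-
    # HITRAN 92 line positions.
    residuals = rng.standard_t(df=3, size=27245) * 0.05

    rms = np.sqrt(np.mean(residuals**2))
    frac_small = np.mean(np.abs(residuals) < 0.05)
    print(f"rms = {rms:.3f} cm^-1, {100*frac_small:.0f}% of lines below 0.05")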
Computational-Model-Based Analysis of Context Effects on Harmonic Expectancy.
Morimoto, Satoshi; Remijn, Gerard B; Nakajima, Yoshitaka
2016-01-01
Expectancy for an upcoming musical chord, harmonic expectancy, is supposedly based on automatic activation of tonal knowledge. Since previous studies implicitly relied on interpretations based on Western music theory, the underlying computational processes involved in harmonic expectancy and how it relates to tonality need further clarification. In particular, short chord sequences which cannot lead to unique keys are difficult to interpret in music theory. In this study, we examined effects of preceding chords on harmonic expectancy from a computational perspective, using stochastic modeling. We conducted a behavioral experiment, in which participants listened to short chord sequences and evaluated the subjective relatedness of the last chord to the preceding ones. Based on these judgments, we built stochastic models of the computational process underlying harmonic expectancy. Following this, we compared the explanatory power of the models. Our results imply that, even when listening to short chord sequences, internally constructed and updated tonal assumptions determine the expectancy of the upcoming chord.
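One simple instance of such stochastic modeling is a first-order Markov (bigram) model over chord symbols, where the expectancy of the upcoming chord is its conditional log-probability given the previous one; a sketch with an invented toy corpus (the paper's actual models are not specified in the abstract):

    import math
    from collections import defaultdict

    def train_bigram(sequences, smoothing=1.0):
        """First-order Markov model of chord transitions with add-one
        smoothing; higher conditional probability = higher expectancy."""
        counts = defaultdict(lambda: defaultdict(float))
        vocab = {c for seq in sequences for c in seq}
        for seq in sequences:
            for prev, nxt in zip(seq, seq[1:]):
                counts[prev][nxt] += 1.0
        def logprob(prev, nxt):
            total = sum(counts[prev].values()) + smoothing * len(vocab)
            return math.log((counts[prev][nxt] + smoothing) / total)
        return logprob

    corpus = [["C", "F", "G", "C"], ["C", "G", "C"], ["F", "G", "C"]]
    expectancy = train_bigram(corpus)
    print(expectancy("G", "C"), expectancy("G", "F"))  # G->C is more expected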
DOE Office of Scientific and Technical Information (OSTI.GOV)
I. W. Ginsberg
Multiresolutional decompositions known as spectral fingerprints are often used to extract spectral features from multispectral/hyperspectral data. In this study, the authors investigate the use of wavelet-based algorithms for generating spectral fingerprints. The wavelet-based algorithms are compared to the currently used method, traditional convolution with first-derivative Gaussian filters. The comparison analysis consists of two parts: (a) the computational expense of the new method is compared with the computational costs of the current method and (b) the outputs of the wavelet-based methods are compared with those of the current method to determine any practical differences in the resulting spectral fingerprints. The results show that the wavelet-based algorithms can greatly reduce the computational expense of generating spectral fingerprints, while practically no differences exist in the resulting fingerprints. The analysis is conducted on a database of hyperspectral signatures, namely, Hyperspectral Digital Image Collection Experiment (HYDICE) signatures. The reduction in computational expense is by a factor of about 30, and the average Euclidean distance between resulting fingerprints is on the order of 0.02.
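The two fingerprinting operations being compared can be prototyped in a few lines: convolution with first-derivative-of-Gaussian filters at several scales versus a discrete wavelet decomposition; a sketch on a synthetic spectrum (the wavelet, scales and timing harness are arbitrary choices, not the study's):

    import time
    import numpy as np
    import pywt
    from scipy.ndimage import gaussian_filter1d

    spectrum = np.sin(np.linspace(0, 20, 4096)) + 0.01 * np.random.randn(4096)

    # Current method: first-derivative-of-Gaussian filters at several scales
    t0 = time.perf_counter()
    fdg = [gaussian_filter1d(spectrum, sigma=s, order=1) for s in (2, 4, 8, 16)]
    t_fdg = time.perf_counter() - t0

    # Wavelet-based method: multiresolution detail coefficients
    t0 = time.perf_counter()
    coeffs = pywt.wavedec(spectrum, "db2", level=4)
    t_wav = time.perf_counter() - t0

    print(f"speedup ~ {t_fdg / t_wav:.1f}x")
    # A fingerprint comparison would then measure, e.g., the Euclidean
    # distance between features derived from fdg and coeffs.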
ALFA: The new ALICE-FAIR software framework
NASA Astrophysics Data System (ADS)
Al-Turany, M.; Buncic, P.; Hristov, P.; Kollegger, T.; Kouzinopoulos, C.; Lebedev, A.; Lindenstruth, V.; Manafov, A.; Richter, M.; Rybalchenko, A.; Vande Vyvre, P.; Winckler, N.
2015-12-01
The commonalities between the ALICE and FAIR experiments and their computing requirements led to the development of large parts of a common software framework in an experiment independent way. The FairRoot project has already shown the feasibility of such an approach for the FAIR experiments and extending it beyond FAIR to experiments at other facilities [1, 2]. The ALFA framework is a joint development between the ALICE Online-Offline (O2) and FairRoot teams. ALFA is designed as a flexible, elastic system, which balances reliability and ease of development with performance using multi-processing and multithreading. A message-based approach has been adopted; such an approach will support the use of the software on different hardware platforms, including heterogeneous systems. Each process in ALFA assumes limited communication and reliance on other processes. Such a design will add horizontal scaling (multiple processes) to vertical scaling provided by multiple threads to meet computing and throughput demands. ALFA does not dictate any application protocols. Potentially, any content-based processor or any source can change the application protocol. The framework supports different serialization standards for data exchange between different hardware and software languages.
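ALFA's message-based transport layer (FairMQ) is built on message-queue libraries such as ZeroMQ; a minimal pyzmq sketch in the spirit of that design, with two decoupled devices joined by a PUSH/PULL pipe (the port and message layout are arbitrary, not ALFA's actual protocol):

    import zmq
    from multiprocessing import Process

    def sampler():
        """Source device: pushes serialized messages downstream."""
        sock = zmq.Context().socket(zmq.PUSH)
        sock.bind("tcp://127.0.0.1:5557")
        for i in range(5):
            sock.send_json({"event": i, "payload": [i, i * i]})
        sock.send_json({"event": -1})  # end-of-stream marker

    def processor():
        """Sink device: pulls and processes messages; it neither knows nor
        cares what produced them, which is the decoupling ALFA relies on."""
        sock = zmq.Context().socket(zmq.PULL)
        sock.connect("tcp://127.0.0.1:5557")
        while (msg := sock.recv_json())["event"] != -1:
            print("processed event", msg["event"])

    if __name__ == "__main__":
        p = Process(target=processor); p.start()
        sampler(); p.join()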
SPIRE Data-Base Management System
NASA Technical Reports Server (NTRS)
Fuechsel, C. F.
1984-01-01
Spacelab Payload Integration and Rocket Experiment (SPIRE) data-base management system (DBMS) is based on the relational model of data bases. The data bases are typically used for engineering and mission analysis tasks and, unlike most commercially available systems, allow data items and data structures to be stored in forms suitable for direct analytical computation. SPIRE DBMS is designed to support data requests from interactive users as well as applications programs.
NASA Technical Reports Server (NTRS)
Khan, Gufran Sayeed; Gubarev, Mikhail; Speegle, Chet; Ramsey, Brian
2010-01-01
The presentation includes grazing incidence X-ray optics, motivation and challenges, mid spatial frequency generation in cylindrical polishing, design considerations for the polishing lap, simulation studies and experimental results, future scope, and a summary. Topics include the current status of replication optics technology, the cylindrical polishing process using a large polishing lap, non-conformance of the polishing lap to the optics, development of software and the polishing machine, deterministic prediction of polishing, a polishing experiment under optimum conditions, and a polishing experiment based on a known error profile. Future plans include determination of non-uniformity in the polishing lap compliance, development of a polishing sequence based on a known error profile of the specimen, software for generating a mandrel polishing sequence, and the design and development of a flexible polishing lap and a computer-controlled localized polishing process.
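The presentation does not state the removal model behind the deterministic prediction of polishing; the conventional choice is Preston's law, removal depth = k_p * p * v * dwell time, which is what the sketch below assumes (the Preston coefficient and error profile are hypothetical):

    import numpy as np

    def predicted_removal(dwell, pressure, velocity, k_p=1e-13):
        """Preston's law: removal depth = k_p * p * v * dwell time."""
        return k_p * pressure * velocity * dwell

    # Known 1-D error profile of the specimen (m) on a 100-point grid
    x = np.linspace(0.0, 0.3, 100)
    error = 50e-9 * (1 + np.sin(2 * np.pi * x / 0.05)) / 2

    # Dwell time proportional to local error -> uniform residual figure
    p, v = 1.0e4, 0.5                      # Pa, m/s (illustrative)
    dwell = error / predicted_removal(1.0, p, v)
    residual = error - predicted_removal(dwell, p, v)
    print(np.max(np.abs(residual)))        # ~0: error profile removed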
ERIC Educational Resources Information Center
Alty, James L.
Dual Coding Theory has quite specific predictions about how information in different media is stored, manipulated and recalled. Different combinations of media are expected to have significant effects upon the recall and retention of information. This obviously may have important consequences in the design of computer-based programs. The paper…
Explicit Building Block Multiobjective Evolutionary Computation: Methods and Applications
2005-06-16
which is introduced in 1990 by Richard Dawkins in his book "The Selfish Gene" [34]. ... Pareto Envelope-based Selection Algorithm I and II ... IGC: Intelligent Gene Collector; OED: Orthogonal Experimental Design; MED: Main Effect... complete one experiment... ℓ′: the string length held within the computer (can be longer than the number of genes
ERIC Educational Resources Information Center
van Merrienboer, Jeroen J. G.
The contributions of instructional design to cognitive science are discussed. It is argued that both sciences have their own object of study, but share a common interest in human cognition and performance as part of instructional systems. From a case study based on experience in teaching introductory computer programming, it is concluded that both…
ERIC Educational Resources Information Center
Conzelman, Gaylord M.; And Others
1975-01-01
Describes a program at the School of Veterinary Medicine, University of California at Davis to give students experience in diagnosis and management of urinary tract diseases. Students request from computer data banks that laboratory information they deem most useful in the medical management of each clinical problem. (JT)
ERIC Educational Resources Information Center
Fox, Janna; Cheng, Liying
2015-01-01
In keeping with the trend to elicit multiple stakeholder responses to operational tests as part of test validation, this exploratory mixed methods study examines test-taker accounts of an Internet-based (i.e., computer-administered) test in the high-stakes context of proficiency testing for university admission. In 2013, as language testing…
LAWS simulation: Sampling strategies and wind computation algorithms
NASA Technical Reports Server (NTRS)
Emmitt, G. D. A.; Wood, S. A.; Houston, S. H.
1989-01-01
In general, work has continued on developing and evaluating algorithms designed to manage the Laser Atmospheric Wind Sounder (LAWS) lidar pulses and to compute the horizontal wind vectors from the line-of-sight (LOS) measurements. These efforts fall into three categories: Improvements to the shot management and multi-pair algorithms (SMA/MPA); observing system simulation experiments; and ground-based simulations of LAWS.
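Computing horizontal winds from line-of-sight (LOS) measurements is a small linear inversion: each shot measures the projection u*sin(az) + v*cos(az), so a shot pair with different azimuths determines (u, v); a sketch of the multi-pair solve (vertical velocity neglected and the scan geometry idealized):

    import numpy as np

    def wind_from_los(azimuths_deg, v_los):
        """Solve for horizontal wind (u east, v north) from >= 2 line-of-
        sight speeds, assuming negligible vertical velocity and a known
        elevation-angle scaling already removed from v_los."""
        az = np.radians(azimuths_deg)
        A = np.column_stack([np.sin(az), np.cos(az)])  # v_los = u*sin + v*cos
        sol, *_ = np.linalg.lstsq(A, v_los, rcond=None)
        return sol

    # A fore/aft shot pair crossing the same volume ~90 degrees apart
    u, v = 12.0, -3.0
    az_pair = np.array([30.0, 120.0])
    v_los = u * np.sin(np.radians(az_pair)) + v * np.cos(np.radians(az_pair))
    print(wind_from_los(az_pair, v_los))   # recovers [12.0, -3.0]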
ERIC Educational Resources Information Center
Chen, Joyce; Bankston, Ronnie
Computers are now perceived as a required resource by business, education, and government, as well as in personal life. The rates of adoption of information technologies among these groups (business, education, government, family/individual) have varied, which may have created knowledge gaps. Based on the data collected from a telephone survey in…
Pygmalion in Media-Based Learning: Effects of Quality Expectancies on Learning Outcomes
ERIC Educational Resources Information Center
Fries, Stefan; Horz, Holger; Haimerl, Charlotte
2006-01-01
Two studies investigated how quality expectations affect students' outcomes of media-based learning. Experiment 1 (N=62) demonstrated that students expecting a high-end computer-based training programme learned most, whereas students expecting a programme of ambiguous quality learned least and students having no expectations performed in between.…
Box, Simon
2014-01-01
Optimal switching of traffic lights on a network of junctions is a computationally intractable problem. In this research, road traffic networks containing signallized junctions are simulated. A computer game interface is used to enable a human ‘player’ to control the traffic light settings on the junctions within the simulation. A supervised learning approach, based on simple neural network classifiers can be used to capture human player's strategies in the game and thus develop a human-trained machine control (HuTMaC) system that approaches human levels of performance. Experiments conducted within the simulation compare the performance of HuTMaC to two well-established traffic-responsive control systems that are widely deployed in the developed world and also to a temporal difference learning-based control method. In all experiments, HuTMaC outperforms the other control methods in terms of average delay and variance over delay. The conclusion is that these results add weight to the suggestion that HuTMaC may be a viable alternative, or supplemental method, to approximate optimization for some practical engineering control problems where the optimal strategy is computationally intractable. PMID:26064570
Use of computers and the Internet by residents in US family medicine programmes.
King, Richard V; Murphy-Cullen, Cassie L; Mayo, Helen G; Marcee, Alice K; Schneider, Gregory W
2007-06-01
Computers, personal digital assistants (PDA), and the Internet are widely used as resources in medical education and clinical care. Educators who intend to incorporate these resources effectively into residency education programmes can benefit from understanding how residents currently use these tools, their skills, and their preferences. The researchers sent questionnaires to 306 US family medicine residency programmes for all of their residents to complete. Respondents were 1177 residents from 125 (41%) programmes. Access to a computer was reported by 95% of respondents. Of these, 97% of desktop and 89% of laptop computers could access the Internet. Residents accessed various educational and clinical resources. Half felt they had 'intermediate' skills at Web searches, 23% had 'some skills,' and 27% were 'quite skilled.' Those under 30 years of age reported higher skill levels. Those who experienced a Web-based curriculum in medical school reported higher search skills and greater success in finding clinical information. Respondents preferred to use technology to supplement the didactic sessions offered in resident teaching conferences. Favourable conditions exist in family medicine residency programmes to implement a blend of traditional and technology-based learning experiences. These conditions include residents' experience, skills, and preferences.
Improved atmospheric 3D BSDF model in earthlike exoplanet using ray-tracing based method
NASA Astrophysics Data System (ADS)
Ryu, Dongok; Kim, Sug-Whan; Seong, Sehyun
2012-10-01
Studies of planetary radiative transfer computation have become important elements in the disk-averaged spectral characterization of potential exoplanets. In this paper, we report an improved ray-tracing based atmospheric simulation model as part of a 3-D earth-like planet model with 3 principal sub-components, i.e. land, sea and atmosphere. Any changes in ray paths and their characteristics, such as radiative power and direction, are computed as the rays experience reflection, refraction, transmission, absorption and scattering. The improved atmospheric BSDF algorithm uses Q. Liu's combined Rayleigh and aerosol Henyey-Greenstein scattering phase function. The input cloud-free atmosphere model consists of 48 layers with vertical absorption profiles and a scattering layer, with their input characteristics taken from the GIOVANNI database. Total Solar Irradiance data are obtained from the Solar Radiation and Climate Experiment (SORCE) mission. Using the aerosol scattering computation, we first tested the atmospheric scattering effects with an imaging simulation of HRIV, EPOXI. Then we examined the computational validity of the atmospheric model against measurements of global, direct and diffuse radiation taken from NREL (National Renewable Energy Laboratory) pyranometers and pyrheliometers at a ground station, for cases of a single incident angle and of simultaneous multiple incident angles of the solar beam.
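The aerosol term cited is the Henyey-Greenstein phase function, which has a simple closed form; the sketch below mixes it with Rayleigh scattering using a free weight (the weight and asymmetry parameter are placeholders, and Q. Liu's actual combination rule is not reproduced here):

    import numpy as np

    def henyey_greenstein(cos_theta, g):
        """HG phase function, normalized over the sphere."""
        return (1 - g**2) / (4 * np.pi * (1 + g**2 - 2 * g * cos_theta) ** 1.5)

    def rayleigh(cos_theta):
        """Rayleigh (molecular) phase function, normalized over the sphere."""
        return 3.0 / (16 * np.pi) * (1 + cos_theta**2)

    def combined_phase(cos_theta, g=0.7, w_aerosol=0.6):
        """Weighted mix of aerosol (HG) and molecular (Rayleigh) scattering;
        the weight would come from the layer's optical properties."""
        return (w_aerosol * henyey_greenstein(cos_theta, g)
                + (1 - w_aerosol) * rayleigh(cos_theta))

    mu = np.cos(np.radians([0, 30, 90, 180]))
    print(combined_phase(mu))   # strongly forward-peaked for g = 0.7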
Developing Mobile Based Instruction
ERIC Educational Resources Information Center
Martin, Florence; Pastore, Raymond; Snider, Jean
2012-01-01
This paper describes an instructional design class's experience developing instruction for the mobile web. The class was taught at a southeastern university in the United States in a master's level computer-based instruction course. Two example projects are showcased and student reflections on design issues are highlighted. Additionally,…
Building a Propulsion Experiment Project Management Environment
NASA Technical Reports Server (NTRS)
Keiser, Ken; Tanner, Steve; Hatcher, Danny; Graves, Sara
2004-01-01
What do you get when you cross rocket scientists with computer geeks? It is an interactive, distributed computing web of tools and services providing a more productive environment for propulsion research and development. The Rocket Engine Advancement Program 2 (REAP2) project involves researchers at several institutions collaborating on propulsion experiments and modeling. In an effort to facilitate these collaborations among researchers at different locations and with different specializations, researchers at the Information Technology and Systems Center, University of Alabama in Huntsville, are creating a prototype web-based interactive information system in support of propulsion research. This system, to be based on experience gained in creating similar systems for NASA Earth science field experiment campaigns such as the Convection and Moisture Experiments (CAMEX), will assist in the planning and analysis of model and experiment results across REAP2 participants. The initial version of the Propulsion Experiment Project Management Environment (PExPM) consists of a controlled-access web portal facilitating the drafting and sharing of working documents and publications. Interactive tools for building and searching an annotated bibliography of publications related to REAP2 research topics have been created to help organize and maintain the results of literature searches. Work is also underway, with some initial prototypes in place, on interactive project management tools allowing project managers to schedule experiment activities, track status and report on results. This paper describes current successes, plans, and expected challenges for this project.
Retrieving relevant time-course experiments: a study on Arabidopsis microarrays.
Şener, Duygu Dede; Oğul, Hasan
2016-06-01
Understanding time-course regulation of genes in response to a stimulus is a major concern in current systems biology. The problem is usually approached by computational methods to model the gene behaviour or its networked interactions with the others by a set of latent parameters. The model parameters can be estimated through a meta-analysis of available data obtained from other relevant experiments. The key question here is how to find the relevant experiments which are potentially useful in analysing current data. In this study, the authors address this problem in the context of time-course gene expression experiments from an information retrieval perspective. To this end, they introduce a computational framework that takes a time-course experiment as a query and reports a list of relevant experiments retrieved from a given repository. These retrieved experiments can then be used to associate the environmental factors of query experiment with the findings previously reported. The model is tested using a set of time-course Arabidopsis microarrays. The experimental results show that relevant experiments can be successfully retrieved based on content similarity.
MoCog1: A computer simulation of recognition-primed human decision making
NASA Technical Reports Server (NTRS)
Gevarter, William B.
1991-01-01
This report describes the successful results of the first stage of a research effort to develop a 'sophisticated' computer model of human cognitive behavior. Most human decision-making is of the experience-based, relatively straightforward, largely automatic type of response to internal goals and drives, utilizing cues and opportunities perceived from the current environment. This report describes the development of the architecture and computer program associated with such 'recognition-primed' decision-making. The resultant computer program was successfully utilized as a vehicle to simulate findings that relate how an individual's implicit theories orient them toward particular goals, with resultant cognitions, affects, and behavior in response to their environment. The present work is an expanded version and is based on research reported while the author was an employee of NASA ARC.
Workload Characterization of CFD Applications Using Partial Differential Equation Solvers
NASA Technical Reports Server (NTRS)
Waheed, Abdul; Yan, Jerry; Saini, Subhash (Technical Monitor)
1998-01-01
Workload characterization is used for modeling and evaluation of computing systems at different levels of detail. We present workload characterization for a class of Computational Fluid Dynamics (CFD) applications that solve Partial Differential Equations (PDEs). This workload characterization focuses on three high performance computing platforms: SGI Origin2000, IBM SP-2, and a cluster of Intel Pentium Pro-based PCs. We execute extensive measurement-based experiments on these platforms to gather statistics of system resource usage, which results in workload characterization. Our workload characterization approach yields a coarse-grain resource utilization behavior that is being applied for performance modeling and evaluation of distributed high performance metacomputing systems. In addition, this study enhances our understanding of interactions between PDE solver workloads and high performance computing platforms and is useful for tuning these applications.
Girls and computer science: experiences, perceptions, and career aspirations
NASA Astrophysics Data System (ADS)
Hur, Jung Won; Andrzejewski, Carey E.; Marghitu, Daniela
2017-04-01
The purpose of this mixed methods study was to examine ways to promote computer science (CS) among girls by exploring young women's experiences and perceptions of CS as well as investigating factors affecting their career aspirations. American girls aged 10-16 participated in focus group interviews as well as pre-, post-, and follow-up surveys while attending a CS camp. The analysis of data revealed that although the participants were generally positive about the CS field, they had very limited knowledge of and experience with CS, leading to little aspiration to become computer scientists. The findings also indicated that girls' affinity for and confidence in CS were critical factors affecting their motivation for pursuing a CS-related career. The study demonstrated that participation in the CS camp motivated a small number of participants to be interested in majoring in CS, but the activity time was too short to make a significant impact. Based on the findings, we suggest that providing CS programming experiences in K-12 classrooms is important in order to boost girls' confidence and interest in CS.
Progress on the Fabric for Frontier Experiments Project at Fermilab
NASA Astrophysics Data System (ADS)
Box, Dennis; Boyd, Joseph; Dykstra, Dave; Garzoglio, Gabriele; Herner, Kenneth; Kirby, Michael; Kreymer, Arthur; Levshina, Tanya; Mhashilkar, Parag; Sharma, Neha
2015-12-01
The FabrIc for Frontier Experiments (FIFE) project is an ambitious, major-impact initiative within the Fermilab Scientific Computing Division designed to lead the computing model for Fermilab experiments. FIFE is a collaborative effort between experimenters and computing professionals to design and develop integrated computing models for experiments of varying needs and infrastructure. The major focus of the FIFE project is the development, deployment, and integration of Open Science Grid solutions for high throughput computing, data management, database access and collaboration within experiments. To accomplish this goal, FIFE has developed workflows that utilize Open Science Grid sites along with dedicated and commercial cloud resources. The FIFE project has made significant progress integrating into experiment computing operations several services, including new job submission services, software and reference data distribution through CVMFS repositories, a flexible data transfer client, and access to opportunistic resources on the Open Science Grid. The progress with current experiments and plans for expansion with additional projects will be discussed. FIFE has taken a leading role in the definition of the computing model for Fermilab experiments, aided in the design of computing for experiments beyond Fermilab, and will continue to define the future direction of high throughput computing for future physics experiments worldwide.
Exploring JavaScript and ROOT technologies to create Web-based ATLAS analysis and monitoring tools
NASA Astrophysics Data System (ADS)
Sánchez Pineda, A.
2015-12-01
We explore the potential of current web applications to create online interfaces that allow the visualization, interaction and real cut-based physics analysis and monitoring of processes through a web browser. The project consists of the initial development of web-based and cloud computing services to allow students and researchers to perform fast and very useful cut-based analysis in a browser, reading and using real data and official Monte Carlo simulations stored in ATLAS computing facilities. Several tools are considered: ROOT, JavaScript and HTML. Our study case is the current cut-based H → ZZ → llqq analysis of the ATLAS experiment. Preliminary but satisfactory results have been obtained online.
NASA Astrophysics Data System (ADS)
Suárez Araujo, Carmen Paz; Barahona da Fonseca, Isabel; Barahona da Fonseca, José; Simões da Fonseca, J.
2004-08-01
A theoretical approach that aims at the identification of the information processing that may be responsible for emotional dimensions of subjective experience is studied as an initial step in the construction of a neural net model of affective dimensions of psychological experiences. In this paper it is suggested that a form of orientated recombination of attributes can be present not only in perceptual processing but also in cognitive processing. We present an analysis of the most important emotion theories, show their neural organization, and propose the neural computation approach as an appropriate framework for generating knowledge about the neural basis of emotional experience. Finally, we present a scheme corresponding to a framework for designing a computational neural multi-system for emotion (CONEMSE).
Concepts and Plans towards fast large scale Monte Carlo production for the ATLAS Experiment
NASA Astrophysics Data System (ADS)
Ritsch, E.; Atlas Collaboration
2014-06-01
The huge success of the physics program of the ATLAS experiment at the Large Hadron Collider (LHC) during Run 1 relies upon a great number of simulated Monte Carlo events. This Monte Carlo production currently takes the biggest part of the computing resources in use by ATLAS. In this document we describe the plans to overcome the computing resource limitations for large scale Monte Carlo production in the ATLAS experiment for Run 2 and beyond. A number of fast detector simulation, digitization and reconstruction techniques are discussed, based upon a new flexible detector simulation framework. To optimally benefit from these developments, a redesigned ATLAS MC production chain is presented at the end of this document.
Unsteady Thick Airfoil Aerodynamics: Experiments, Computation, and Theory
NASA Technical Reports Server (NTRS)
Strangfeld, C.; Rumsey, C. L.; Mueller-Vahl, H.; Greenblatt, D.; Nayeri, C. N.; Paschereit, C. O.
2015-01-01
An experimental, computational and theoretical investigation was carried out to study the aerodynamic loads acting on a relatively thick NACA 0018 airfoil when subjected to pitching and surging, individually and synchronously. Both pre-stall and post-stall angles of attack were considered. Experiments were carried out in a dedicated unsteady wind tunnel, with large surge amplitudes, and airfoil loads were estimated by means of unsteady surface mounted pressure measurements. Theoretical predictions were based on Theodorsen's and Isaacs' results as well as on the relatively recent generalizations of van der Wall. Both two- and three-dimensional computations were performed on structured grids employing unsteady Reynolds-averaged Navier-Stokes (URANS). For pure surging at pre-stall angles of attack, the correspondence between experiments and theory was satisfactory; this served as a validation of Isaacs' theory. Discrepancies were traced to dynamic trailing-edge separation, even at low angles of attack. Excellent correspondence was found between experiments and theory for airfoil pitching as well as combined pitching and surging; the latter appears to be the first clear validation of van der Wall's theoretical results. Although qualitatively similar to experiment at low angles of attack, two-dimensional URANS computations yielded notable errors in the unsteady load effects of pitching, surging and their synchronous combination. The main reason is believed to be that the URANS equations do not resolve wake vorticity (explicitly modeled in the theory) or the resulting rolled-up unsteady flow structures because high values of eddy viscosity tend to "smear" the wake. At post-stall angles, three-dimensional computations illustrated the importance of modeling the tunnel side walls.
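Since the theoretical predictions cited above start from Theodorsen's lift-deficiency function, a short Python sketch of its standard definition in terms of Hankel functions of the second kind may be useful for orientation; the sample reduced frequencies are arbitrary.

    import numpy as np
    from scipy.special import hankel2

    def theodorsen(k):
        # Theodorsen's lift-deficiency function C(k) for reduced frequency k > 0,
        # written with Hankel functions of the second kind: C = H1 / (H1 + i*H0).
        return hankel2(1, k) / (hankel2(1, k) + 1j * hankel2(0, k))

    for k in (0.05, 0.2, 1.0):
        C = theodorsen(k)
        print(f"k={k}: |C|={abs(C):.3f}, phase={np.degrees(np.angle(C)):.1f} deg")

C(k) tends to 1 in the quasi-steady limit and its magnitude falls toward 0.5 at high reduced frequency, which is the attenuation the unsteady theory applies to the circulatory lift.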
Self-Organized Service Negotiation for Collaborative Decision Making
Zhang, Bo; Zheng, Ziming
2014-01-01
This paper proposes a self-organized service negotiation method for CDM in intelligent and automatic manners. It mainly includes three phases: semantic-based capacity evaluation for the CDM sponsor, trust computation of the CDM organization, and negotiation selection of the decision-making service provider (DMSP). In the first phase, the CDM sponsor produces the formal semantic description of the complex decision task for DMSP and computes the capacity evaluation values according to participator instructions from different DMSPs. In the second phase, a novel trust computation approach is presented to compute the subjective belief value, the objective reputation value, and the recommended trust value. And in the third phase, based on the capacity evaluation and trust computation, a negotiation mechanism is given to efficiently implement the service selection. The simulation experiment results show that our self-organized service negotiation method is feasible and effective for CDM. PMID:25243228
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnstad, H.
The purpose of this meeting is to discuss the current and future HEP computing support and environments from the perspective of new horizons in accelerator, physics, and computing technologies. Topics of interest to the Meeting include (but are not limited to): the forming of the HEPLIB world user group for High Energy Physics computing; mandate, desirables, coordination, organization, funding; user experience, international collaboration; the roles of national labs, universities, and industry; range of software, Monte Carlo, mathematics, physics, interactive analysis, text processors, editors, graphics, data base systems, code management tools; program libraries, frequency of updates, distribution; distributed and interactive computing, data base systems, user interface, UNIX operating systems, networking, compilers, Xlib, X-Graphics; documentation, updates, availability, distribution; code management in large collaborations, keeping track of program versions; and quality assurance, testing, conventions, standards.
Self-organized service negotiation for collaborative decision making.
Zhang, Bo; Huang, Zhenhua; Zheng, Ziming
2014-01-01
This paper proposes a self-organized service negotiation method for CDM in intelligent and automatic manners. It mainly includes three phases: semantic-based capacity evaluation for the CDM sponsor, trust computation of the CDM organization, and negotiation selection of the decision-making service provider (DMSP). In the first phase, the CDM sponsor produces the formal semantic description of the complex decision task for DMSP and computes the capacity evaluation values according to participator instructions from different DMSPs. In the second phase, a novel trust computation approach is presented to compute the subjective belief value, the objective reputation value, and the recommended trust value. And in the third phase, based on the capacity evaluation and trust computation, a negotiation mechanism is given to efficiently implement the service selection. The simulation experiment results show that our self-organized service negotiation method is feasible and effective for CDM.
Computer-aided detection of initial polyp candidates with level set-based adaptive convolution
NASA Astrophysics Data System (ADS)
Zhu, Hongbin; Duan, Chaijie; Liang, Zhengrong
2009-02-01
In order to eliminate or weaken the interference between different topological structures on the colon wall, adaptive and normalized convolution methods were used to compute the first and second order spatial derivatives of computed tomographic colonography images, which is the beginning of various geometric analyses. However, the performance of such methods greatly depends on the single-layer representation of the colon wall, which is called the starting layer (SL) in the following text. In this paper, we introduce a level set-based adaptive convolution (LSAC) method to compute the spatial derivatives, in which the level set method is employed to determine a more reasonable SL. The LSAC was applied to a computer-aided detection (CAD) scheme to detect the initial polyp candidates, and experiments showed that it benefits the CAD scheme in both the detection sensitivity and specificity as compared to our previous work.
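The starting point mentioned above, first- and second-order spatial derivatives of the CT volume, is commonly obtained by convolution with Gaussian derivative kernels. The Python sketch below shows that step with a fixed kernel width; the paper's contribution is to adapt the convolution to a level-set-derived starting layer, which this sketch does not attempt. The sigma value and the random volume are placeholders.

    import numpy as np
    from scipy.ndimage import gaussian_filter

    volume = np.random.rand(64, 64, 64)  # stand-in for a CT colonography volume

    # First- and second-order spatial derivatives obtained by convolving with
    # Gaussian derivative kernels; geometric analysis (curvature, shape index)
    # of the colon wall starts from these fields.
    gx  = gaussian_filter(volume, sigma=1.5, order=(1, 0, 0))
    gxx = gaussian_filter(volume, sigma=1.5, order=(2, 0, 0))
    print(gx.shape, gxx.shape)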
Distributed Computation of the knn Graph for Large High-Dimensional Point Sets
Plaku, Erion; Kavraki, Lydia E.
2009-01-01
High-dimensional problems arising from robot motion planning, biology, data mining, and geographic information systems often require the computation of k nearest neighbor (knn) graphs. The knn graph of a data set is obtained by connecting each point to its k closest points. As the research in the above-mentioned fields progressively addresses problems of unprecedented complexity, the demand for computing knn graphs based on arbitrary distance metrics and large high-dimensional data sets increases, exceeding resources available to a single machine. In this work we efficiently distribute the computation of knn graphs for clusters of processors with message passing. Extensions to our distributed framework include the computation of graphs based on other proximity queries, such as approximate knn or range queries. Our experiments show nearly linear speedup with over one hundred processors and indicate that similar speedup can be obtained with several hundred processors. PMID:19847318
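For reference, a single-machine version of the knn-graph construction reads roughly as follows in Python; the distributed version described in the paper instead partitions the points across processors and exchanges queries via message passing. The data sizes and metric below are arbitrary.

    import numpy as np
    from sklearn.neighbors import NearestNeighbors

    rng = np.random.default_rng(0)
    points = rng.random((1000, 32))  # a high-dimensional point set

    # Connect each point to its k closest points; the result is a sparse
    # adjacency matrix of the knn graph.
    nn = NearestNeighbors(n_neighbors=8, metric="euclidean").fit(points)
    knn_graph = nn.kneighbors_graph(points, mode="distance")
    print(knn_graph.shape, knn_graph.nnz)  # (1000, 1000), 8000 stored edges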
Buniatian, A A; Sablin, I N; Flerov, E V; Mierbekov, E M; Broĭtman, O G; Shevchenko, V V; Shitikov, I I
1995-01-01
Creation of computer monitoring systems (CMS) for operating rooms is one of the most important spheres of personal computer employment in anesthesiology. The authors developed a PC RS/AT-based CMS and effectively used it for more than 2 years. This system permits comprehensive monitoring in cardiosurgical operations by real-time processing of the values of arterial and central venous pressure, pressure in the pulmonary artery, bioelectrical activity of the brain, and two temperature values. Use of this CMS helped appreciably improve patients' safety during surgery. The possibility to assess brain function by computer monitoring of the EEG simultaneously with central hemodynamics and body temperature permits the anesthesiologist to objectively assess the depth of anesthesia and to diagnose cerebral hypoxia. The automated anesthesiological chart issued by the CMS after surgery reliably reflects the patient's status and the measures taken by the anesthesiologist.
Entanglement-Based Machine Learning on a Quantum Computer
NASA Astrophysics Data System (ADS)
Cai, X.-D.; Wu, D.; Su, Z.-E.; Chen, M.-C.; Wang, X.-L.; Li, Li; Liu, N.-L.; Lu, C.-Y.; Pan, J.-W.
2015-03-01
Machine learning, a branch of artificial intelligence, learns from previous experience to optimize performance, which is ubiquitous in various fields such as computer sciences, financial analysis, robotics, and bioinformatics. A challenge is that machine learning with the rapidly growing "big data" could become intractable for classical computers. Recently, quantum machine learning algorithms [Lloyd, Mohseni, and Rebentrost, arXiv.1307.0411] were proposed which could offer an exponential speedup over classical algorithms. Here, we report the first experimental entanglement-based classification of two-, four-, and eight-dimensional vectors to different clusters using a small-scale photonic quantum computer, which are then used to implement supervised and unsupervised machine learning. The results demonstrate the working principle of using quantum computers to manipulate and classify high-dimensional vectors, the core mathematical routine in machine learning. The method can, in principle, be scaled to larger numbers of qubits, and may provide a new route to accelerate machine learning.
Bayesian experimental design for models with intractable likelihoods.
Drovandi, Christopher C; Pettitt, Anthony N
2013-12-01
In this paper we present a methodology for designing experiments for efficiently estimating the parameters of models with computationally intractable likelihoods. The approach combines a commonly used methodology for robust experimental design, based on Markov chain Monte Carlo sampling, with approximate Bayesian computation (ABC) to ensure that no likelihood evaluations are required. The utility function considered for precise parameter estimation is based upon the precision of the ABC posterior distribution, which we form efficiently via the ABC rejection algorithm based on pre-computed model simulations. Our focus is on stochastic models and, in particular, we investigate the methodology for Markov process models of epidemics and macroparasite population evolution. The macroparasite example involves a multivariate process and we assess the loss of information from not observing all variables. © 2013, The International Biometric Society.
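To make the ABC rejection step concrete, here is a minimal Python sketch with an invented Poisson stand-in for the stochastic model; the prior, tolerance, and summary statistic are all illustrative assumptions rather than the paper's choices.

    import numpy as np

    rng = np.random.default_rng(1)

    def simulate(theta, n=50):
        # Invented stand-in for a stochastic model with intractable likelihood
        return rng.poisson(theta, size=n)

    observed = simulate(4.0)

    # ABC rejection: draw parameters from the prior and keep those whose
    # simulated summary statistic lands within a tolerance of the observed one.
    prior_draws = rng.uniform(0.0, 10.0, size=20000)
    accepted = [t for t in prior_draws
                if abs(simulate(t).mean() - observed.mean()) < 0.2]

    print(len(accepted), float(np.mean(accepted)))  # ABC posterior sample, mean near 4

In the design setting, such pre-computed simulations are reused so that scoring each candidate design requires no further likelihood evaluations.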
A CFD validation roadmap for hypersonic flows
NASA Technical Reports Server (NTRS)
Marvin, Joseph G.
1992-01-01
A roadmap for computational fluid dynamics (CFD) code validation is developed. The elements of the roadmap are consistent with air-breathing vehicle design requirements and related to the important flow path components: forebody, inlet, combustor, and nozzle. Building block and benchmark validation experiments are identified along with their test conditions and measurements. Based on evaluation criteria, recommendations for an initial CFD validation data base are given and gaps identified where future experiments would provide the needed validation data.
A CFD validation roadmap for hypersonic flows
NASA Technical Reports Server (NTRS)
Marvin, Joseph G.
1993-01-01
A roadmap for computational fluid dynamics (CFD) code validation is developed. The elements of the roadmap are consistent with air-breathing vehicle design requirements and related to the important flow path components: forebody, inlet, combustor, and nozzle. Building block and benchmark validation experiments are identified along with their test conditions and measurements. Based on evaluation criteria, recommendations for an initial CFD validation data base are given and gaps identified where future experiments would provide the needed validation data.
Early experiences in developing and managing the neuroscience gateway.
Sivagnanam, Subhashini; Majumdar, Amit; Yoshimoto, Kenneth; Astakhov, Vadim; Bandrowski, Anita; Martone, MaryAnn; Carnevale, Nicholas T
2015-02-01
The last few decades have seen the emergence of computational neuroscience as a mature field where researchers are interested in modeling complex and large neuronal systems and require access to high performance computing machines and associated cyber infrastructure to manage computational workflow and data. The neuronal simulation tools used in this research field are also implemented for parallel computers and suitable for high performance computing machines. But using these tools on complex high performance computing machines remains a challenge, because of issues with acquiring computer time on machines located at national supercomputer centers, with their complex user interfaces, and with data management and retrieval. The Neuroscience Gateway is being developed to alleviate and/or hide these barriers to entry for computational neuroscientists. It hides or eliminates, from the point of view of the users, all the administrative and technical barriers and makes parallel neuronal simulation tools easily available and accessible on complex high performance computing machines. It handles the running of jobs and data management and retrieval. This paper shares the early experiences in bringing up this gateway and describes the software architecture it is based on, how it is implemented, and how users can use it for computational neuroscience research using high performance computing at the back end. We also look at parallel scaling of some publicly available neuronal models and analyze the recent usage data of the neuroscience gateway.
Early experiences in developing and managing the neuroscience gateway
Sivagnanam, Subhashini; Majumdar, Amit; Yoshimoto, Kenneth; Astakhov, Vadim; Bandrowski, Anita; Martone, MaryAnn; Carnevale, Nicholas T.
2015-01-01
The last few decades have seen the emergence of computational neuroscience as a mature field where researchers are interested in modeling complex and large neuronal systems and require access to high performance computing machines and associated cyber infrastructure to manage computational workflow and data. The neuronal simulation tools used in this research field are also implemented for parallel computers and suitable for high performance computing machines. But using these tools on complex high performance computing machines remains a challenge, because of issues with acquiring computer time on machines located at national supercomputer centers, with their complex user interfaces, and with data management and retrieval. The Neuroscience Gateway is being developed to alleviate and/or hide these barriers to entry for computational neuroscientists. It hides or eliminates, from the point of view of the users, all the administrative and technical barriers and makes parallel neuronal simulation tools easily available and accessible on complex high performance computing machines. It handles the running of jobs and data management and retrieval. This paper shares the early experiences in bringing up this gateway and describes the software architecture it is based on, how it is implemented, and how users can use it for computational neuroscience research using high performance computing at the back end. We also look at parallel scaling of some publicly available neuronal models and analyze the recent usage data of the neuroscience gateway. PMID:26523124
Auditory presentation and synchronization in Adobe Flash and HTML5/JavaScript Web experiments.
Reimers, Stian; Stewart, Neil
2016-09-01
Substantial recent research has examined the accuracy of presentation durations and response time measurements for visually presented stimuli in Web-based experiments, with a general conclusion that accuracy is acceptable for most kinds of experiments. However, many areas of behavioral research use auditory stimuli instead of, or in addition to, visual stimuli. Much less is known about auditory accuracy using standard Web-based testing procedures. We used a millisecond-accurate Black Box Toolkit to measure the actual durations of auditory stimuli and the synchronization of auditory and visual presentation onsets. We examined the distribution of timings for 100 presentations of auditory and visual stimuli across two computers with different specs, three commonly used browsers, and code written in either Adobe Flash or JavaScript. We also examined different coding options for attempting to synchronize the auditory and visual onsets. Overall, we found that auditory durations were very consistent, but that the lags between visual and auditory onsets varied substantially across browsers and computer systems.
Implementation of DFT application on ternary optical computer
NASA Astrophysics Data System (ADS)
Junjie, Peng; Youyi, Fu; Xiaofeng, Zhang; Shuai, Kong; Xinyu, Wei
2018-03-01
Owing to its characteristics of a huge number of data bits and low energy consumption, optical computing may be used in applications such as the DFT, which need a large amount of computation and can be implemented in parallel. Accordingly, DFT implementation methods in full parallel as well as in partial parallel are presented. Based on the resources of a ternary optical computer (TOC), extensive experiments were carried out. Experimental results show that the proposed schemes are correct and feasible. They provide a foundation for further exploration of applications on the TOC that need a large amount of calculation and can be processed in parallel.
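The parallel structure being exploited is visible in the DFT definition itself: every output bin is an independent sum over the input, so bins can be computed concurrently. A minimal Python sketch of this bin-by-bin view follows; the signal length is arbitrary.

    import numpy as np

    def dft_bin(x, k):
        # One output bin of the DFT: X[k] = sum_n x[n] * exp(-2j*pi*k*n/N).
        # Each bin is independent of the others, which is what allows the
        # transform to be spread across many processors (or many TOC data
        # bits) in full parallel, or across groups of bins in partial parallel.
        n = np.arange(len(x))
        return np.sum(x * np.exp(-2j * np.pi * k * n / len(x)))

    x = np.random.rand(64)
    X = np.array([dft_bin(x, k) for k in range(len(x))])  # embarrassingly parallel
    print(np.allclose(X, np.fft.fft(x)))                  # True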
Chang, Yaw-Jen; Chang, Cheng-Hao
2016-06-01
Based on the principle of immobilized metal affinity chromatography (IMAC), it has been found that a Ni-Co alloy-coated protein chip is able to immobilize functional proteins with a His-tag attached. In this study, an intelligent computational approach was developed to improve the performance and repeatability of a Ni-Co alloy-coated protein chip. This approach started from a set of L18 orthogonal-array experiments. Based on the experimental data, the fabrication process model of a Ni-Co protein chip was established using an artificial neural network, and an optimal fabrication condition was then obtained using the Taguchi genetic algorithm. The result was validated experimentally and compared with a nitrocellulose chip. Consequently, the experimental outcomes revealed that the Ni-Co alloy-coated chip fabricated using the proposed approach had the best performance and repeatability compared with the Ni-Co chips of an L18 orthogonal array design and the nitrocellulose chip. Moreover, the low fluorescent background of the chip surface gives a more precise fluorescent detection. Based on a small number of experiments, this proposed intelligent computation approach can significantly reduce the experimental cost and improve the product's quality. © 2015 Society for Laboratory Automation and Screening.
BIOSSES: a semantic sentence similarity estimation system for the biomedical domain.
Sogancioglu, Gizem; Öztürk, Hakime; Özgür, Arzucan
2017-07-15
The amount of information available in textual format is rapidly increasing in the biomedical domain. Therefore, natural language processing (NLP) applications are becoming increasingly important to facilitate the retrieval and analysis of these data. Computing the semantic similarity between sentences is an important component in many NLP tasks including text retrieval and summarization. A number of approaches have been proposed for semantic sentence similarity estimation for generic English. However, our experiments showed that such approaches do not effectively cover biomedical knowledge and produce poor results for biomedical text. We propose several approaches for sentence-level semantic similarity computation in the biomedical domain, including string similarity measures and measures based on the distributed vector representations of sentences learned in an unsupervised manner from a large biomedical corpus. In addition, ontology-based approaches are presented that utilize general and domain-specific ontologies. Finally, a supervised regression based model is developed that effectively combines the different similarity computation metrics. A benchmark data set consisting of 100 sentence pairs from the biomedical literature is manually annotated by five human experts and used for evaluating the proposed methods. The experiments showed that the supervised semantic sentence similarity computation approach obtained the best performance (0.836 correlation with gold standard human annotations) and improved over the state-of-the-art domain-independent systems up to 42.6% in terms of the Pearson correlation metric. A web-based system for biomedical semantic sentence similarity computation, the source code, and the annotated benchmark data set are available at: http://tabilab.cmpe.boun.edu.tr/BIOSSES/ . gizemsogancioglu@gmail.com or arzucan.ozgur@boun.edu.tr. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
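One of the unsupervised measures described above, similarity of distributed sentence vectors, can be sketched in a few lines of Python. The toy word vectors below are invented for illustration; in the paper the representations are learned from a large biomedical corpus, and this sketch is not the authors' full pipeline.

    import numpy as np

    # Invented toy word vectors; real ones would be learned from a corpus.
    word_vecs = {
        "tumor":  np.array([0.9, 0.1, 0.3]),
        "growth": np.array([0.7, 0.2, 0.5]),
        "cancer": np.array([0.8, 0.1, 0.4]),
        "cells":  np.array([0.6, 0.3, 0.5]),
    }

    def sentence_vec(sentence):
        # Simple unsupervised sentence representation: average of word vectors
        vecs = [word_vecs[w] for w in sentence.lower().split() if w in word_vecs]
        return np.mean(vecs, axis=0)

    def similarity(s1, s2):
        a, b = sentence_vec(s1), sentence_vec(s2)
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

    print(similarity("tumor growth", "cancer cells"))  # cosine similarity in [-1, 1]

In the supervised regression model, scores like this one become features that are combined with string- and ontology-based measures.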
Machining fixture layout optimization using particle swarm optimization algorithm
NASA Astrophysics Data System (ADS)
Dou, Jianping; Wang, Xingsong; Wang, Lei
2011-05-01
Optimization of fixture layout (locator and clamp locations) is critical to reduce geometric error of the workpiece during the machining process. In this paper, the application of the particle swarm optimization (PSO) algorithm is presented to minimize the workpiece deformation in the machining region. A PSO-based approach is developed to optimize fixture layout by integrating ANSYS Parametric Design Language (APDL) finite element analysis to compute the objective function for a given fixture layout. A particle library approach is used to decrease the total computation time. The computational experiment on a 2D case shows that the number of function evaluations is decreased by about 96%. A case study illustrates the effectiveness and efficiency of the PSO-based optimization approach.
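A minimal PSO loop in Python is sketched below, assuming a quadratic stand-in objective; in the paper each objective evaluation is an ANSYS/APDL finite element run for the candidate locator/clamp positions, and the particle library caches such evaluations. The swarm size, coefficients, and dimension are illustrative.

    import numpy as np

    rng = np.random.default_rng(2)

    def deformation(layout):
        # Stand-in objective; the real value comes from a finite element run.
        return float(np.sum((layout - 0.3) ** 2))

    n_particles, dim = 20, 4
    x = rng.random((n_particles, dim))   # candidate fixture layouts
    v = np.zeros_like(x)
    pbest = x.copy()
    gbest = x[np.argmin([deformation(p) for p in x])].copy()

    for _ in range(100):
        r1, r2 = rng.random((2, n_particles, dim))
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
        x = x + v
        for i in range(n_particles):
            if deformation(x[i]) < deformation(pbest[i]):
                pbest[i] = x[i].copy()
        gbest = pbest[np.argmin([deformation(p) for p in pbest])].copy()

    print(gbest, deformation(gbest))  # best layout found and its objective value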
Bajaj, Chandrajit; Chen, Shun-Chuan; Rand, Alexander
2011-01-01
In order to compute polarization energy of biomolecules, we describe a boundary element approach to solving the linearized Poisson-Boltzmann equation. Our approach combines several important features including the derivative boundary formulation of the problem and a smooth approximation of the molecular surface based on the algebraic spline molecular surface. State of the art software for numerical linear algebra and the kernel independent fast multipole method is used for both simplicity and efficiency of our implementation. We perform a variety of computational experiments, testing our method on a number of actual proteins involved in molecular docking and demonstrating the effectiveness of our solver for computing molecular polarization energy. PMID:21660123
Cloud-Based Virtual Laboratory for Network Security Education
ERIC Educational Resources Information Center
Xu, Le; Huang, Dijiang; Tsai, Wei-Tek
2014-01-01
Hands-on experiments are essential for computer network security education. Existing laboratory solutions usually require significant effort to build, configure, and maintain and often do not support reconfigurability, flexibility, and scalability. This paper presents a cloud-based virtual laboratory education platform called V-Lab that provides a…
E-Learning. Trends and Issues Alert.
ERIC Educational Resources Information Center
Imel, Susan
Electronic learning, also known as e-learning, is generally defined as instruction and learning experiences that are delivered via electronic technology such as the Internet, audiotape and videotape, satellite broadcast, interactive television, and CD-ROM. Web-based learning, computer-based learning, and virtual classrooms are some of the…
Event-based Plausibility Immediately Influences On-line Language Comprehension
Matsuki, Kazunaga; Chow, Tracy; Hare, Mary; Elman, Jeffrey L.; Scheepers, Christoph; McRae, Ken
2011-01-01
In some theories of sentence comprehension, linguistically-relevant lexical knowledge such as selectional restrictions is privileged in terms of the time-course of its access and influence. We examined whether event knowledge computed by combining multiple concepts can rapidly influence language understanding even in the absence of selectional restriction violations. Specifically, we investigated whether instruments can combine with actions to influence comprehension of ensuing patients. Instrument-verb-patient triplets were created in a norming study designed to tap directly into event knowledge. In self-paced reading (Experiment 1), participants were faster to read patient nouns such as hair when they were typical of the instrument-action pair (Donna used the shampoo to wash vs. the hose to wash). Experiment 2 showed that these results were not due to direct instrument-patient relations. Experiment 3 replicated Experiment 1 using eyetracking, with effects of event typicality observed in first fixation and gaze durations on the patient noun. This research demonstrates that conceptual event-based expectations are computed and used rapidly and dynamically during on-line language comprehension. We discuss relationships among plausibility and predictability, as well as their implications. We conclude that selectional restrictions may be best considered as event-based conceptual knowledge, rather than lexical-grammatical knowledge. PMID:21517222
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Jordan Ned; Carver, Zana A.; Weber, Thomas J.
A combination experimental and computational approach was developed to predict chemical transport into saliva. A serous-acinar chemical transport assay was established to measure chemical transport with non-physiological (standard cell culture medium) and physiological (surrogate plasma and saliva medium) conditions using 3,5,6-trichloro-2-pyridinol (TCPy), a metabolite of the pesticide chlorpyrifos. High levels of TCPy protein binding were observed in cell culture medium and rat plasma, resulting in different TCPy transport behaviors in the two experimental conditions. In the non-physiological transport experiment, TCPy reached equilibrium at equivalent concentrations in apical and basolateral chambers. At higher TCPy doses, increased unbound TCPy was observed, and TCPy concentrations in apical and basolateral chambers reached equilibrium faster than at lower doses, suggesting only unbound TCPy is able to cross the cellular monolayer. In the physiological experiment, TCPy transport was slower than under non-physiological conditions, and equilibrium was achieved at different concentrations in apical and basolateral chambers at a ratio (0.034) comparable to what was previously measured in rats dosed with TCPy (saliva:blood ratio: 0.049). A cellular transport computational model was developed based on TCPy protein binding kinetics and accurately simulated all transport experiments using different permeability coefficients for the two experimental conditions (1.4 vs 0.4 cm/hr for non-physiological and physiological experiments, respectively). The computational model was integrated into a physiologically based pharmacokinetic (PBPK) model and accurately predicted TCPy concentrations in saliva of rats dosed with TCPy. Overall, this study demonstrates an approach to predict chemical transport in saliva, potentially increasing the utility of salivary biomonitoring in the future.
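The core idea, that only the unbound fraction crosses the monolayer, can be expressed as a small two-chamber ODE system. The Python sketch below is an illustrative simplification, not the paper's fitted model; the unbound fraction, permeability-area product, and volumes are assumed numbers.

    import numpy as np
    from scipy.integrate import solve_ivp

    # Illustrative two-chamber monolayer transport: only the unbound fraction
    # fu crosses the cell layer, with permeability-area product PA.
    fu, PA, V_a, V_b = 0.05, 1.0, 1.0, 1.0  # unbound fraction, mL/h, mL, mL

    def rates(t, y):
        c_apical, c_basal = y
        flux = PA * fu * (c_basal - c_apical)  # driven by unbound concentration
        return [flux / V_a, -flux / V_b]

    sol = solve_ivp(rates, (0.0, 48.0), [0.0, 100.0])
    print(sol.y[:, -1])  # the two chambers approach a common concentration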
Laghari, Samreen; Niazi, Muaz A
2016-01-01
Computer Networks have a tendency to grow at an unprecedented scale. Modern networks involve not only computers but also a wide variety of other interconnected devices ranging from mobile phones to other household items fitted with sensors. This vision of the "Internet of Things" (IoT) implies an inherent difficulty in modeling problems. It is practically impossible to implement and test all scenarios for large-scale and complex adaptive communication networks as part of Complex Adaptive Communication Networks and Environments (CACOONS). The goal of this study is to explore the use of Agent-based Modeling as part of the Cognitive Agent-based Computing (CABC) framework to model a complex communication network problem. We use Exploratory Agent-based Modeling (EABM), as part of the CABC framework, to develop an autonomous multi-agent architecture for managing carbon footprint in a corporate network. To evaluate the application of complexity in practical scenarios, we have also introduced a company-defined computer usage policy. The conducted experiments demonstrated two important results. First, a CABC-based modeling approach such as agent-based modeling can be effective for modeling complex problems in the domain of IoT. Second, the specific problem of managing the carbon footprint can be solved using a multi-agent system approach.
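A minimal exploratory agent-based sketch of the footprint question is given below in Python; each computer agent is idle or busy each hour, and a usage policy decides whether idle machines sleep. The power draws, emission factor, and idle probability are assumptions for illustration, not values from the study.

    import random

    random.seed(0)

    IDLE_W, SLEEP_W, CO2_G_PER_KWH = 60.0, 3.0, 400.0  # assumed constants

    def simulate(n_agents=100, hours=24, sleep_if_idle=True):
        grams_co2 = 0.0
        for _ in range(hours):
            for _ in range(n_agents):
                idle = random.random() < 0.6  # the agent's state this hour
                watts = SLEEP_W if (idle and sleep_if_idle) else IDLE_W
                grams_co2 += watts / 1000.0 * CO2_G_PER_KWH
        return grams_co2

    saved = simulate(sleep_if_idle=False) - simulate()
    print(f"policy saves about {saved:.0f} g CO2 per day")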
Adaptation of acoustic model experiments of STM via smartphones and tablets
NASA Astrophysics Data System (ADS)
Thees, Michael; Hochberg, Katrin; Kuhn, Jochen; Aeschlimann, Martin
2017-10-01
The importance of Scanning Tunneling Microscopy (STM) in today's research and industry leads to the question of how to include such a key technology in physics education. Manfred Euler has developed an acoustic model experiment to illustrate the fundamental measuring principles based on an analogy between quantum mechanics and acoustics. Building on earlier work, we applied mobile devices such as smartphones and tablets, instead of a computer, to record and display the experimental data, and thus converted Euler's experimental setup into a low-cost experiment that is easy to build and handle by students themselves.
The meaning of computers to a group of men who are homeless.
Miller, Kathleen Swenson; Bunch-Harrison, Stacey; Brumbaugh, Brett; Kutty, Rekha Sankaran; FitzGerald, Kathleen
2005-01-01
The purpose of this pilot study was to explore the experience with computers and the meaning of computers to a group of homeless men living in a long-term shelter. This descriptive exploratory study used semistructured interviews with seven men who had been given access to computers and had participated in individually tailored occupation-based interventions through a Work Readiness Program. Three themes emerged from analyzing the interviews: access to computers, computers as a bridge to life-skill development, and changed self-perceptions as a result of connecting to technology. Because they lacked computer knowledge and feared failure, the majority of study participants had not sought out computers available through public access. The need for access to computers, the potential use of computers as a medium for intervention, and the meaning of computers to these men who represent the digital divide are described in this study.
Research on elastic resource management for multi-queue under cloud computing environment
NASA Astrophysics Data System (ADS)
CHENG, Zhenjing; LI, Haibo; HUANG, Qiulan; Cheng, Yaodong; CHEN, Gang
2017-10-01
As a new approach to managing computing resources, virtualization technology is more and more widely applied in the high-energy physics field. A virtual computing cluster based on OpenStack was built at IHEP, using HTCondor as the job queue management system. In a traditional static cluster, a fixed number of virtual machines are pre-allocated to the job queues of different experiments. However, this method cannot be well adapted to the volatility of computing resource requirements. To solve this problem, an elastic computing resource management system under a cloud computing environment has been designed. This system performs unified management of virtual computing nodes on the basis of the job queues in HTCondor, based on dual resource thresholds as well as the quota service. A two-stage pool is designed to improve the efficiency of resource pool expansion. This paper presents several use cases of the elastic resource management system in IHEPCloud. Practical runs show that virtual computing resources dynamically expand or shrink as computing requirements change. Additionally, the CPU utilization ratio of computing resources was significantly increased when compared with traditional resource management. The system also performs well when there are multiple HTCondor schedulers and multiple job queues.
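The dual-threshold decision at the heart of such a system can be sketched in a few lines of Python; the threshold values, the one-VM step size, and the function shape below are assumptions for illustration, while the real system reads per-queue HTCondor metrics and drives OpenStack through the two-stage pool.

    def rebalance(idle_jobs, running_jobs, n_vms, quota, upper=0.8, lower=0.2):
        # Dual-threshold decision for one job queue: expand when jobs wait and
        # utilization crosses the upper threshold, shrink when it falls below
        # the lower one.
        utilization = running_jobs / n_vms if n_vms else 1.0
        if idle_jobs > 0 and utilization > upper and n_vms < quota:
            return +1   # boot a VM (two-stage pool: pre-built stage first)
        if utilization < lower and n_vms > 0:
            return -1   # retire a drained VM back to the cloud
        return 0

    print(rebalance(idle_jobs=10, running_jobs=9, n_vms=10, quota=50))  # 1
    print(rebalance(idle_jobs=0, running_jobs=1, n_vms=10, quota=50))   # -1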
The future of PanDA in ATLAS distributed computing
NASA Astrophysics Data System (ADS)
De, K.; Klimentov, A.; Maeno, T.; Nilsson, P.; Oleynik, D.; Panitkin, S.; Petrosyan, A.; Schovancova, J.; Vaniachine, A.; Wenaus, T.
2015-12-01
Experiments at the Large Hadron Collider (LHC) face unprecedented computing challenges. Heterogeneous resources are distributed worldwide at hundreds of sites, thousands of physicists analyse the data remotely, the volume of processed data is beyond the exabyte scale, while data processing requires more than a few billion hours of computing usage per year. The PanDA (Production and Distributed Analysis) system was developed to meet the scale and complexity of LHC distributed computing for the ATLAS experiment. In the process, the old batch job paradigm of locally managed computing in HEP was discarded in favour of a far more automated, flexible and scalable model. The success of PanDA in ATLAS is leading to widespread adoption and testing by other experiments. PanDA is the first exascale workload management system in HEP, already operating at more than a million computing jobs per day, and processing over an exabyte of data in 2013. There are many new challenges that PanDA will face in the near future, in addition to new challenges of scale, heterogeneity and increasing user base. PanDA will need to handle rapidly changing computing infrastructure, will require factorization of code for easier deployment, will need to incorporate additional information sources including network metrics in decision making, be able to control network circuits, handle dynamically sized workload processing, provide improved visualization, and face many other challenges. In this talk we will focus on the new features, planned or recently implemented, that are relevant to the next decade of distributed computing workload management using PanDA.
Global Sensory Qualities and Aesthetic Experience in Music.
Brattico, Pauli; Brattico, Elvira; Vuust, Peter
2017-01-01
A well-known tradition in the study of visual aesthetics holds that the experience of visual beauty is grounded in global computational or statistical properties of the stimulus, for example, scale-invariant Fourier spectrum or self-similarity. Some approaches rely on neural mechanisms, such as efficient computation, processing fluency, or the responsiveness of the cells in the primary visual cortex. These proposals are united by the fact that the contributing factors are hypothesized to be global (i.e., they concern the percept as a whole), formal or non-conceptual (i.e., they concern form instead of content), computational and/or statistical, and based on relatively low-level sensory properties. Here we consider that the study of aesthetic responses to music could benefit from the same approach. Thus, along with local features such as pitch, tuning, consonance/dissonance, harmony, timbre, or beat, also global sonic properties could be viewed as contributing toward creating an aesthetic musical experience. Several such properties are discussed and their neural implementation is reviewed in the light of recent advances in neuroaesthetics.
Tank, J; Smith, L; Spedding, G R
2017-02-06
The flight of many birds and bats, and their robotic counterparts, occurs over a range of chord-based Reynolds numbers from 1 × 10^4 to 1.5 × 10^5. It is precisely over this range where the aerodynamics of simple, rigid, fixed wings becomes extraordinarily sensitive to small changes in geometry and the environment, with two sets of consequences. The first is that practical lifting devices at this scale will likely not be simple, rigid, fixed wings. The second is that it becomes non-trivial to make baseline comparisons for experiment and computation, when either one can be wrong. Here we examine one ostensibly simple case of the NACA 0012 aerofoil and make careful comparison between the technical literature, and new experiments and computations. The agreement (or lack thereof) will establish one or more baseline results and some sensitivities around them. The idea is that the diagnostic procedures will help to guide comparisons and predictions in subsequent more complex cases.
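For orientation, the chord-based Reynolds number is Re = U c / nu; the short Python check below uses assumed bird-scale numbers and the kinematic viscosity of air to show that such flyers land squarely in the quoted sensitive range.

    # Chord-based Reynolds number Re = U * c / nu. The speed and chord are
    # assumed for illustration; nu is air's kinematic viscosity at ~20 C.
    U, c, nu = 10.0, 0.1, 1.5e-5   # speed [m/s], chord [m], viscosity [m^2/s]
    print(f"Re = {U * c / nu:.1e}")  # about 6.7e4, inside the sensitive range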
Tank, J.; Smith, L.
2017-01-01
The flight of many birds and bats, and their robotic counterparts, occurs over a range of chord-based Reynolds numbers from 1 × 10^4 to 1.5 × 10^5. It is precisely over this range where the aerodynamics of simple, rigid, fixed wings becomes extraordinarily sensitive to small changes in geometry and the environment, with two sets of consequences. The first is that practical lifting devices at this scale will likely not be simple, rigid, fixed wings. The second is that it becomes non-trivial to make baseline comparisons for experiment and computation, when either one can be wrong. Here we examine one ostensibly simple case of the NACA 0012 aerofoil and make careful comparison between the technical literature, and new experiments and computations. The agreement (or lack thereof) will establish one or more baseline results and some sensitivities around them. The idea is that the diagnostic procedures will help to guide comparisons and predictions in subsequent more complex cases. PMID:28163869
Skylab earth resources experiment package /EREP/ - Sea surface topography experiment
NASA Technical Reports Server (NTRS)
Vonbun, F. O.; Marsh, J. G.; Mcgoogan, J. T.; Leitao, C. D.; Vincent, S.; Wells, W. T.
1976-01-01
The S-193 Skylab radar altimeter was operated in a round-the-world pass on Jan. 31, 1974. The main purpose of this experiment was to test and 'measure' the variation of the sea surface topography using the Goddard Space Flight Center (GSFC) geoid model as a reference. This model is based upon 430,000 satellite and 25,000 ground gravity observations. Variations of the sea surface on the order of -40 to +60 m were observed along this pass. The 'computed' and 'measured' sea surfaces have an rms agreement on the order of 7 m. This is quite satisfactory, considering that this was the first time the sea surface has been observed directly over a distance of nearly 35,000 km and compared to a computed model. The Skylab orbit for this global pass was computed using the Goddard Earth Model (GEM 6) and S-band radar tracking data, resulting in an orbital height uncertainty of better than 5 m over one orbital period.
Analyzing high energy physics data using database computing: Preliminary report
NASA Technical Reports Server (NTRS)
Baden, Andrew; Day, Chris; Grossman, Robert; Lifka, Dave; Lusk, Ewing; May, Edward; Price, Larry
1991-01-01
A proof of concept system is described for analyzing high energy physics (HEP) data using data base computing. The system is designed to scale up to the size required for HEP experiments at the Superconducting SuperCollider (SSC) lab. These experiments will require collecting and analyzing approximately 10 to 100 million 'events' per year during proton colliding beam collisions. Each 'event' consists of a set of vectors with a total length of approx. one megabyte. This represents an increase of approx. 2 to 3 orders of magnitude in the amount of data accumulated by present HEP experiments. The system is called the HEPDBC System (High Energy Physics Database Computing System). At present, the Mark 0 HEPDBC System is completed, and can produce analysis of HEP experimental data approx. an order of magnitude faster than current production software on data sets of approx. 1 GB. The Mark 1 HEPDBC System is currently undergoing testing and is designed to analyze data sets 10 to 100 times larger.
NASA Astrophysics Data System (ADS)
Maraffi, S.
2016-12-01
Context/Purpose: We experienced a new teaching and learning technology: a Computer Class Role Playing Game (RPG) to perform educational activity in classrooms through an interactive game. This approach is new; there are some experiences with educational games, but mainly individual and not class-based. Gaming all together in a class, with a single scope for the whole class, enhances peer collaboration, cooperative problem solving and friendship. Methods: To perform the research we experimented with the games in several classes of different degrees, acquiring specific questionnaires from teachers and pupils. Results: Experimental results were outstanding: RPG, our interactive activity, exceeded by 50% the overall satisfaction compared to traditional lessons or PowerPoint-supported teaching. Interpretation: The appreciation of RPG was in agreement with the class level outcome identified by the teacher after the experimentation. Our work received excellent feedback from teachers, in terms of the efficacy of this new teaching methodology and of the achieved results. Using a new methodology closer to the student's point of view improves the innovation and creative capacities of learners, and it supports the new role of the teacher as the learners' "coach". Conclusion: This paper presents the first experimental results on the application of this new technology, based on a computer game which projects an adventure lived by the students on a wall in the class. The plots of the adventures are designed for deeper learning of Science, Technology, Engineering, Mathematics (STEM) and Social Sciences & Humanities (SSH). The pupils participate by interacting with the game through their own tablets or smartphones. The game is based on a mixed reality learning environment, giving the students the feeling of being "IN the adventure".
Astronomy Education using the Web and a Computer Algebra System
NASA Astrophysics Data System (ADS)
Flurchick, K. M.; Culver, Roger B.; Griego, Ben
2013-04-01
The combination of a web server and a Computer Algebra System, giving students the ability to explore and investigate astronomical concepts presented in a class, can help student understanding. This combination of technologies provides a framework to extend the classroom experience with independent student exploration. In this presentation we report on the development of this web-based material and some initial results of students making use of the computational tools via webMathematica™. The material developed allows the student to analyze and investigate a variety of astronomical phenomena, including topics such as the Runge-Lenz vector, descriptions of the orbits of some of the exoplanets, Bode's law and other topics related to celestial mechanics. The server-based Computer Algebra System allows for computations without installing software on the student's computer and provides a powerful environment to explore the various concepts. The current system is installed at North Carolina A&T State University and has been used in several undergraduate classes.
A Development of Lightweight Grid Interface
NASA Astrophysics Data System (ADS)
Iwai, G.; Kawai, Y.; Sasaki, T.; Watase, Y.
2011-12-01
In order to support rapid development of Grid/Cloud-aware applications, we have developed an API that abstracts distributed computing infrastructures, based on SAGA (Simple API for Grid Applications). SAGA, standardized in the Open Grid Forum (OGF), defines API specifications for access to distributed computing infrastructures such as Grid, Cloud, and local computing resources. The Universal Grid API (UGAPI), a set of command line interfaces (CLIs) and APIs, aims to offer a simpler API that combines several SAGA interfaces with richer functionality. The UGAPI CLIs offer the typical functionality required by end users for job management and file access on the different distributed computing infrastructures as well as on local computing resources. We have also built a web interface for particle therapy simulation and demonstrated a large-scale calculation using the different infrastructures at the same time. In this paper, we present how the web interface based on UGAPI and SAGA achieves more efficient utilization of computing resources across the different infrastructures, with technical details and practical experiences.
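For orientation, the SAGA job model that UGAPI wraps follows a service/description/job pattern; a minimal sketch assuming the saga-python bindings (exact package and attribute names may differ across SAGA implementations):

```python
import saga  # saga-python bindings (assumed); UGAPI layers simpler CLIs on top

# Connect a job service to a resource; the URL scheme selects the adaptor
# (e.g. fork:// for local execution, ssh:// or a Grid middleware endpoint).
js = saga.job.Service("fork://localhost")

jd = saga.job.Description()
jd.executable = "/bin/echo"
jd.arguments = ["simulation", "done"]
jd.output = "job.out"

job = js.create_job(jd)
job.run()
job.wait()
print("job finished with state:", job.state)
```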
Progress on the FabrIc for Frontier Experiments project at Fermilab
Box, Dennis; Boyd, Joseph; Dykstra, Dave; ...
2015-12-23
The FabrIc for Frontier Experiments (FIFE) project is an ambitious, major-impact initiative within the Fermilab Scientific Computing Division designed to lead the computing model for Fermilab experiments. FIFE is a collaborative effort between experimenters and computing professionals to design and develop integrated computing models for experiments of varying needs and infrastructure. The major focus of the FIFE project is the development, deployment, and integration of Open Science Grid solutions for high-throughput computing, data management, database access, and collaboration within experiments. To accomplish this goal, FIFE has developed workflows that utilize Open Science Grid sites along with dedicated and commercial cloud resources. The FIFE project has made significant progress integrating several services into experiment computing operations, including new job submission services, software and reference data distribution through CVMFS repositories, a flexible data transfer client, and access to opportunistic resources on the Open Science Grid. Progress with current experiments and plans for expansion to additional projects are discussed. FIFE has taken a leading role in defining the computing model for Fermilab experiments, has aided in the design of computing for experiments beyond Fermilab, and will continue to define the future direction of high-throughput computing for future physics experiments worldwide.
Advanced tools for smartphone-based experiments: phyphox
NASA Astrophysics Data System (ADS)
Staacks, S.; Hütz, S.; Heinke, H.; Stampfer, C.
2018-07-01
The sensors in modern smartphones are a promising and cost-effective tool for experimentation in physics education, but many experiments face practical problems: often the phone is inaccessible during the experiment, and the data usually need to be analyzed afterwards on a computer. We address both problems by introducing a new app, called 'phyphox', which is specifically designed for experiments in physics teaching. The app is free and offers the same set of features on Android and iOS.
ERIC Educational Resources Information Center
Walker, Jearl
1982-01-01
Spatial filtering, based on diffraction/interference of light waves, is a technique by which unwanted information in a picture ("noise") can be separated from wanted information. A series of experiments is described in which students can create a system that functions as an optical computer to create clearer pictures. (Author/JN)
Experience in Teaching OOAD to Various Students
ERIC Educational Resources Information Center
Boberic-Krsticev, Danijela; Tešendic, Danijela
2013-01-01
The paper elaborates on experiences and lessons learned from the course on object-oriented analysis and design (OOAD) at the Faculty of Sciences, Novi Sad. The OOAD course is taught both to students of computer science and to students of the mathematics programme. Conclusions made in this paper are based on the results of students' assignments as well as…
gPhysics--Using Smart Glasses for Head-Centered, Context-Aware Learning in Physics Experiments
ERIC Educational Resources Information Center
Kuhn, Jochen; Lukowicz, Paul; Hirth, Michael; Poxrucker, Andreas; Weppner, Jens; Younas, Junaid
2016-01-01
Smart Glasses such as Google Glass are mobile computers combining a classical head-mounted display (HMD) with several sensors. Therefore, contact-free, sensor-based experiments can be linked with related multiple representations presented near the eye. We will present a first approach to how Smart Glasses can be used as an experimental tool for…
Benefits of computer screen-based simulation in learning cardiac arrest procedures.
Bonnetain, Elodie; Boucheix, Jean-Michel; Hamet, Maël; Freysz, Marc
2010-07-01
What is the best way to train medical students early so that they acquire basic skills in cardiopulmonary resuscitation as effectively as possible? Studies have shown the benefits of high-fidelity patient simulators, but have also demonstrated their limits. New computer screen-based multimedia simulators have fewer constraints than high-fidelity patient simulators. In this area, as yet, there has been no research on the effectiveness of transfer of learning from a computer screen-based simulator to more realistic situations such as those encountered with high-fidelity patient simulators. We tested the benefits of learning cardiac arrest procedures using a multimedia computer screen-based simulator in 28 Year 2 medical students. Just before the end of the traditional resuscitation course, we compared two groups. An experiment group (EG) was first asked to learn to perform the appropriate procedures in a cardiac arrest scenario (CA1) in the computer screen-based learning environment and was then tested on a high-fidelity patient simulator in another cardiac arrest simulation (CA2). While the EG was learning to perform CA1 procedures in the computer screen-based learning environment, a control group (CG) actively continued to learn cardiac arrest procedures through practical exercises in a traditional class environment. Both groups were given the same amount of practice, exercises, and trials. The CG was then also tested on the high-fidelity patient simulator for CA2, after which it was asked to perform CA1 using the computer screen-based simulator. Performance with both simulators was scored on a precise 23-point scale. On the high-fidelity patient simulator test, the EG trained with the multimedia computer screen-based simulator performed significantly better than the CG trained with traditional exercises and practice (16.21 versus 11.13 out of 23 possible points, respectively; p < 0.001). Computer screen-based simulation appears to be effective in preparing learners to use high-fidelity patient simulators, which present simulations closer to real-life situations.
Remote sensing of land-based voids using computer enhanced infrared thermography
NASA Astrophysics Data System (ADS)
Weil, Gary J.
1989-10-01
Experiments are described in which computer-enhanced infrared thermography techniques are used to detect and describe subsurface land-based voids, such as voids surrounding buried utility pipes, voids in concrete structures such as airport taxiways, abandoned buried utility storage tanks, and caves and underground shelters. Infrared thermography also helps to evaluate bridge deck systems, highway pavements, and garage concrete. The IR thermography techniques make it possible to survey large areas quickly and efficiently. The paper also surveys the advantages and limitations of thermographic testing in comparison with other forms of NDT.
Adaptive control for eye-gaze input system
NASA Astrophysics Data System (ADS)
Zhao, Qijie; Tu, Dawei; Yin, Hairong
2004-01-01
The characteristics of vision-based human-computer interaction systems are analyzed, and current practical applications and their limiting factors are discussed. Information processing methods are put forward. To make communication flexible and spontaneous, algorithms for adaptive control of the user's head movement were designed, and event-based methods and an object-oriented computer language were used to develop the system software. Experimental testing showed that, under the given conditions, these methods and algorithms can meet the needs of the HCI.
Chen, Xiaojun; Xu, Lu; Wang, Huixiang; Wang, Fang; Wang, Qiugen; Kikinis, Ron
2017-01-01
Implant placement has been widely used in various kinds of surgery. However, accurate intraoperative drilling performance is essential to avoid injury to adjacent structures. Although some commercially available surgical navigation systems have been approved for clinical applications, these systems are expensive and their source code is not available to researchers. 3D Slicer is a free, open-source software platform for the computer-aided surgery research community. In this study, a loadable module based on Slicer has been developed and validated to support surgical navigation. This research module allows reliable calibration of the surgical drill, point-based registration, and surface-matching registration, so that the position and orientation of the surgical drill can be tracked and displayed on the computer screen in real time, with the aim of reducing risk. In accuracy verification experiments, the mean target registration errors (TRE) for point-based and surface-based registration were 0.31 ± 0.06 mm and 1.01 ± 0.06 mm respectively, which should meet clinical requirements. Both phantom and cadaver experiments demonstrated the feasibility of our surgical navigation software module. PMID:28109564
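Point-based registration of the kind validated here is commonly solved in closed form via the SVD (Kabsch/Horn) method; a minimal sketch, independent of the actual Slicer module:

```python
import numpy as np

def rigid_registration(src, dst):
    """Least-squares rigid transform (R, t) mapping src points onto dst.

    src, dst: (N, 3) arrays of corresponding fiducial points.
    Returns rotation R (3x3) and translation t (3,) minimizing ||R @ src + t - dst||.
    """
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    H = src_c.T @ dst_c                      # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t

def target_registration_error(R, t, targets_src, targets_dst):
    """Mean distance between transformed targets and their true positions (TRE)."""
    mapped = targets_src @ R.T + t
    return float(np.linalg.norm(mapped - targets_dst, axis=1).mean())
```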
ERIC Educational Resources Information Center
Paul, Sandra K.; Kranberg, Susan
The third report from a comprehensive Unesco study, this document traces the history of the application of computer-based technology to the book distribution process in the United States and indicates functional areas currently showing the effects of using this technology. Ways in which computer use is altering book distribution management…
Veteran Unemployment of Transitioning Marines
2013-11-01
military experience. C2 Marines have high AFQT scores and work with information systems; they may pursue, for example, computer science degrees in college... (i.e., they made a rational decision based on lack of information). DOD actuarial officials use the low MGIB benefit use rate to maintain program... such as computer science, to make their military skills transferable, while others may not. Marines in services, repair/maintenance, operator, and
ERIC Educational Resources Information Center
Seiter, Ellen
2007-01-01
Based on four years of experience teaching computers to 8-12 year olds, media scholar Ellen Seiter offers parents and educators practical advice on what children need to know about the Internet and when they need to know it. "The Internet Playground" argues that, contrary to the promises of technology boosters, teaching with computers is…
Classifying Web Pages by Using Knowledge Bases for Entity Retrieval
NASA Astrophysics Data System (ADS)
Kiritani, Yusuke; Ma, Qiang; Yoshikawa, Masatoshi
In this paper, we propose a novel method to classify Web pages by using knowledge bases for entity search, a typical kind of Web search for information related to a person, location, or organization. First, we map a Web page to entities according to the similarities between the page and the entities. Various methods for computing such similarity are applied: for example, we can compute the similarity between a given page and a Wikipedia article describing a certain entity, and the frequency with which an entity appears in the page is another factor used in computing the similarity. Second, we construct a directed acyclic graph, named the PEC graph, based on the relations among Web pages, entities, and categories, by referring to YAGO, a knowledge base built on Wikipedia and WordNet. Finally, by analyzing the PEC graph, we classify Web pages into categories. The results of some preliminary experiments validate the methods proposed in this paper.
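The page-to-entity mapping step can be sketched roughly as follows; the cosine bag-of-words similarity and the frequency weighting are plausible stand-ins, not necessarily the authors' exact formulas:

```python
import math
from collections import Counter

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words vectors."""
    dot = sum(a[w] * b[w] for w in a.keys() & b.keys())
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def map_page_to_entities(page_text: str, entity_articles: dict, top_k: int = 3):
    """Score each entity by text similarity to its Wikipedia-style article,
    weighted by how often the entity name occurs in the page."""
    page_bow = Counter(page_text.lower().split())
    scores = {}
    for entity, article in entity_articles.items():
        sim = cosine(page_bow, Counter(article.lower().split()))
        freq = page_text.lower().count(entity.lower())  # entity-frequency factor
        scores[entity] = sim * (1 + freq)
    return sorted(scores.items(), key=lambda kv: -kv[1])[:top_k]
```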
Experiments in cooperative-arm object manipulation with a two-armed free-flying robot. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Koningstein, Ross
1990-01-01
Developing computed-torque controllers for complex manipulator systems using current techniques and tools is difficult because those tools address the issues pertinent to simulation, as opposed to control. A new formulation of computed-torque (CT) control that leads to an automated computed-torque robot controller program is presented. This automated tool is used for simulations and experimental demonstrations of endpoint and object control from a free-flying robot. The new computed-torque formulation states the multibody control problem in an elegant, homogeneous, and practical form. A recursive dynamics algorithm is presented that numerically evaluates kinematics and dynamics terms for multibody systems given a topological description. Manipulators may be free-flying and may have closed-chain constraints. With the exception of object squeeze-force control, the algorithm does not deal with actuator redundancy. The algorithm is used to implement an automated 2D computed-torque dynamics and control package that allows joint, endpoint, orientation, momentum, and object squeeze-force control. This package obviates the need for hand-derivation of kinematics and dynamics, and is used for both simulation and experimental control. Endpoint control experiments are performed on a laboratory robot that has two arms to manipulate payloads and uses an air bearing to achieve very low drag characteristics. Simulations and experimental data for endpoint and object controllers are presented for the experimental robot, a complex dynamic system. There is a rather wide set of conditions under which CT endpoint controllers can neglect robot base accelerations (but not motions) and achieve performance comparable to that of controllers that include base accelerations in the model. The regime over which this simplification holds is explored by simulation and experiment.
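For reference, the classical computed-torque law cancels the modeled dynamics and imposes linear tracking-error dynamics; a generic sketch with placeholder model callables (this is the textbook form, not the thesis's recursive algorithm):

```python
import numpy as np

def computed_torque(q, qdot, q_des, qdot_des, qddot_des, M, C, g, Kp, Kv):
    """Classical computed-torque control law:
        tau = M(q) (qddot_des + Kv @ edot + Kp @ e) + C(q, qdot) @ qdot + g(q)
    so the closed-loop tracking error obeys  eddot + Kv edot + Kp e = 0.
    M, C, g are callables supplying the (hypothetical) manipulator model.
    """
    e = q_des - q
    edot = qdot_des - qdot
    v = qddot_des + Kv @ edot + Kp @ e   # outer-loop PD on tracking error
    return M(q) @ v + C(q, qdot) @ qdot + g(q)
```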
Doing Your Science While You're in Orbit
NASA Astrophysics Data System (ADS)
Green, Mark L.; Miller, Stephen D.; Vazhkudai, Sudharshan S.; Trater, James R.
2010-11-01
Large-scale neutron facilities such as the Spallation Neutron Source (SNS) at Oak Ridge National Laboratory need easy-to-use access to Department of Energy Leadership Computing Facilities and to experiment repository data. The Orbiter thick and thin clients and their supporting Service Oriented Architecture (SOA) based services (available at https://orbiter.sns.gov) consist of standards-based components that are reusable and extensible for accessing high performance computing, data and computational grid infrastructure, and cluster-based resources from a user-configurable interface. The primary Orbiter system goals are to (1) develop infrastructure for the creation and automation of virtual-instrumentation experiment optimization, (2) develop user interfaces for thin- and thick-client access, (3) provide a prototype incorporating major instrument simulation packages, and (4) facilitate neutron science community access and collaboration. Secure Orbiter SOA authentication and authorization are achieved through the developed Virtual File System (VFS) services, which use Role-Based Access Control (RBAC) for data repository file access, thin- and thick-client functionality and application access, and computational job workflow management. The VFS Relational Database Management System (RDMS) consists of approximately 45 database tables describing 498 user accounts and 495 groups over 432,000 directories with 904,077 repository files. Over 59 million NeXus file metadata records are associated with the 12,800 unique NeXus file field/class names generated from the 52,824 repository NeXus files. Services are currently available that enable (a) summary dashboards of data repository status with Quality of Service (QoS) metrics, (b) full-text search of data repository NeXus file field/class names within a Google-like interface, (c) a fully functional RBAC browser for the read-only data repository and shared areas, (d) user/group-defined and shared metadata for data repository files, and (e) user-, group-, repository-, and Web 2.0-based global positioning, with additional service capabilities. The SNS-based Orbiter SOA integration progress with the Distributed Data Analysis for Neutron Scattering Experiments (DANSE) software development project is summarized, with an emphasis on DANSE Central Services and the Virtual Neutron Facility (VNF). Additionally, best-practice implementations of DANSE's use of the Orbiter SOA authentication, authorization, and data transfer services are presented.
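Role-Based Access Control of the kind the VFS services implement reduces, at its core, to checking a user's roles against role-to-permission grants; a generic sketch with invented roles and users (not the Orbiter schema):

```python
# Minimal RBAC: users -> roles -> permitted actions on repository data.
ROLE_PERMS = {
    "instrument_scientist": {"read", "write"},
    "external_user": {"read"},
}
USER_ROLES = {
    "alice": {"instrument_scientist"},
    "bob": {"external_user"},
}

def can_access(user: str, action: str) -> bool:
    """True if any of the user's roles grants the requested action."""
    return any(action in ROLE_PERMS.get(role, set())
               for role in USER_ROLES.get(user, set()))

assert can_access("alice", "write")
assert not can_access("bob", "write")
```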
Cloud computing task scheduling strategy based on improved differential evolution algorithm
NASA Astrophysics Data System (ADS)
Ge, Junwei; He, Qian; Fang, Yiqiu
2017-04-01
In order to optimize cloud computing task scheduling, an improved differential evolution algorithm for cloud computing task scheduling is proposed. First, a cloud computing task scheduling model is established, and a fitness function is defined according to the model. The fitness function is then optimized with the improved differential evolution algorithm, which uses a generation-dependent dynamic selection strategy together with a dynamic mutation strategy to ensure both global and local search ability. A performance test experiment was carried out on the CloudSim simulation platform; the experimental results show that the improved differential evolution algorithm can reduce cloud computing task execution time and save user cost, achieving good optimal scheduling of cloud computing tasks.
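A sketch of differential evolution with a generation-dependent (dynamic) mutation factor, in the spirit of the improvement described; the decay schedule, encoding, and fitness function are assumptions, not the paper's exact algorithm:

```python
import numpy as np

def de_schedule(fitness, dim, pop_size=30, gens=100, F0=0.9, F1=0.4, CR=0.9, seed=0):
    """DE/rand/1/bin with a mutation factor decaying from F0 to F1 over
    generations: strong global search early, finer local search late.
    `fitness` maps a real vector (here a hypothetical task-to-VM assignment
    encoding) to a cost to be minimized."""
    rng = np.random.default_rng(seed)
    pop = rng.random((pop_size, dim))
    cost = np.array([fitness(x) for x in pop])
    for g in range(gens):
        F = F0 - (F0 - F1) * g / gens          # dynamic mutation factor
        for i in range(pop_size):
            idx = rng.choice([j for j in range(pop_size) if j != i], 3, replace=False)
            a, b, c = pop[idx]
            mutant = a + F * (b - c)
            cross = rng.random(dim) < CR       # binomial crossover mask
            trial = np.where(cross, mutant, pop[i])
            tc = fitness(trial)
            if tc < cost[i]:                   # greedy selection
                pop[i], cost[i] = trial, tc
    best = int(np.argmin(cost))
    return pop[best], cost[best]
```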
Rubrics and Exemplars in Text-Conferencing
ERIC Educational Resources Information Center
Zahara, Allan
2005-01-01
The author draws on his K-12 teaching experiences in analyzing the strengths and weaknesses of asynchronous, text-based conferencing in online education. Issues relating to Web-based versus client-driven systems in computer-mediated conferencing (CMC) are examined. The paper also discusses pedagogical and administrative implications of choosing a…
Developing and Assessing E-Learning Techniques for Teaching Forecasting
ERIC Educational Resources Information Center
Gel, Yulia R.; O'Hara Hines, R. Jeanette; Chen, He; Noguchi, Kimihiro; Schoner, Vivian
2014-01-01
In the modern business environment, managers are increasingly required to perform decision making and evaluate related risks based on quantitative information in the face of uncertainty, which in turn increases demand for business professionals with sound skills and hands-on experience with statistical data analysis. Computer-based training…
Outreach programmes to attract girls into computing: how the best laid plans can sometimes fail
NASA Astrophysics Data System (ADS)
Lang, Catherine; Fisher, Julie; Craig, Annemieke; Forgasz, Helen
2015-07-01
This article presents a reflective analysis of an outreach programme called the Digital Divas Club. This curriculum-based programme was delivered in Australian schools with the aim of stimulating junior and middle school girls' interest in computing courses and careers. We believed that we had developed a strong intervention programme based on previous literature and our collective knowledge and experience. While it was coordinated by university academics, the programme content was jointly created and modified by practicing school teachers. When the final data were compiled after four years, they showed that our programme produced a significant change in students' confidence with computing, but the hoped-for influence on their desire to pursue a career in computing did not fully eventuate. To gain deeper insight into why this may be the case, data collected from two of the schools, which sat at opposite ends of the expected programme outcomes, are interrogated in more detail in this article. We found that despite designing a programme that delivered a multi-layered positive computing experience, factors beyond our control, such as school culture and teachers' technical self-efficacy, help account for the unanticipated results. Despite our best laid plans, the expectation that this semester-long programme would influence students' longer-term career outcomes may have been aspirational at best.
Empirical flow parameters - a tool for hydraulic model validity assessment : [summary].
DOT National Transportation Integrated Search
2013-10-01
Hydraulic modeling assembles models based on generalizations of parameter values from textbooks, professional literature, computer program documentation, and engineering experience. Actual measurements adjacent to the model location are seldom availa...
NASA Astrophysics Data System (ADS)
Class, G.; Meyder, R.; Stratmanns, E.
1985-12-01
The large database for validation and development of computer codes for two-phase flow, generated at the COSIMA facility, is reviewed. The aim of COSIMA is to simulate the hydraulic, thermal, and mechanical conditions in the subchannel and the cladding of fuel rods in pressurized water reactors during the blowdown phase of a loss-of-coolant accident. In terms of fuel rod behavior, it is found that during blowdown under realistic conditions only small strains are reached; for cladding rupture, extremely high rod internal pressures are necessary. The behavior of fuel rod simulators and the effect of thermocouples attached to the cladding outer surface are clarified. Calculations performed with the codes RELAP and DRUFAN show satisfactory agreement with experiments, which can be improved by updating the phase-separation models in the codes.
Blind topological measurement-based quantum computation.
Morimae, Tomoyuki; Fujii, Keisuke
2012-01-01
Blind quantum computation is a novel secure quantum-computing protocol that enables Alice, who does not have sufficient quantum technology at her disposal, to delegate her quantum computation to Bob, who has a fully fledged quantum computer, in such a way that Bob cannot learn anything about Alice's input, output, or algorithm. A recent proof-of-principle experiment demonstrating blind quantum computation in an optical system has raised new challenges regarding the scalability of blind quantum computation in realistic noisy conditions. Here we show that fault-tolerant blind quantum computation is possible in a topologically protected manner using the Raussendorf-Harrington-Goyal scheme. The error threshold of our scheme is 4.3 × 10^-3, which is comparable to that (7.5 × 10^-3) of non-blind topological quantum computation. As an error per gate of the order of 10^-3 has already been achieved in some experimental systems, our result implies that secure cloud quantum computation is within reach.
Performance monitoring for brain-computer-interface actions.
Schurger, Aaron; Gale, Steven; Gozel, Olivia; Blanke, Olaf
2017-02-01
When presented with a difficult perceptual decision, human observers are able to make metacognitive judgements of subjective certainty. Such judgements can be made independently of and prior to any overt response to a sensory stimulus, presumably via internal monitoring. Retrospective judgements about one's own task performance, on the other hand, require first that the subject perform a task, and thus could potentially be based on motor processes, proprioception, and other sensory feedback rather than internal monitoring. With this dichotomy in mind, we set out to study performance monitoring using a brain-computer interface (BCI), with which subjects could voluntarily perform an action, moving a cursor on a computer screen, without any movement of the body and thus without somatosensory feedback. Real-time visual feedback was available to subjects during training, but not during the experiment, where the true final position of the cursor was revealed only after the subject had estimated where he or she thought it had ended up after 6 s of BCI-based cursor control. During the first half of the experiment, subjects based their assessments primarily on the prior probability of the end position of the cursor on previous trials. During the second half of the experiment, however, subjects' judgements moved significantly closer to the true end position of the cursor and away from the prior. This suggests that subjects can monitor task performance when the task is performed without overt movement of the body.
User's guide for the IEBT application
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bartoletti, T
INFOSEC Experience-Based Training (IEBT) is a simulation and modeling approach to education on information security issues and their application to system-specific operations. The IEBT philosophy is that "Experience is the Best Teacher". This approach to computer-based training aims to bridge the gap between unappealing "read the text, answer the questions" types of training (largely a test of short-term memory) and the far more costly, time-consuming, and inconvenient "real hardware" laboratory experience. Simulation and modeling support this bridge by allowing the critical or salient features to be exercised while avoiding those aspects of a real-world experience unrelated to the training goal.
Computer literacy and attitudes towards e-learning among first year medical students
Link, Thomas Michael; Marz, Richard
2006-01-01
Background: At the Medical University of Vienna, most information for students is available only online. In 2005, an e-learning project was initiated, and there are plans to introduce a learning management system. In this study, we estimate the level of students' computer skills, the number of students having difficulty with e-learning, and the number of students opposed to e-learning. Methods: The study was conducted in an introductory course on computer-based and web-based training (CBT/WBT). Students were asked to fill out an online questionnaire covering a wide range of relevant attitudes and experiences. Results: While the great majority of students possess sufficient computer skills and acknowledge the advantages of interactive and multimedia-enhanced learning material, a small percentage lacks basic computer skills and/or is very skeptical about e-learning. There is also a consistently significant, albeit weak, gender difference in available computer infrastructure and Internet access. As for student attitudes toward e-learning, we found that age, computer use, and previous exposure to computers are more important than gender. A sizable number of students, 12% of the total, make little or no use of existing e-learning offerings. Conclusion: Many students would benefit from a basic introduction to computers and to the relevant computer-based resources of the university. Given the wide range of computer skills among students, a single computer course for all students would be neither useful nor accepted. Special measures should be taken to prevent students who lack computer skills from being disadvantaged or from developing computer-hostile attitudes. PMID:16784524
Evolvable social agents for bacterial systems modeling.
Paton, Ray; Gregory, Richard; Vlachos, Costas; Saunders, Jon; Wu, Henry
2004-09-01
We present two approaches to the individual-based modeling (IbM) of bacterial ecologies and evolution using computational tools. The IbM approach is introduced, and its important complementary role in biosystems modeling is discussed. A fine-grained model of bacterial evolution is then presented, based on networks of interactivity between computational objects representing genes and proteins. This is followed by a coarser-grained agent-based model designed to explore the evolvability of adaptive behavioral strategies in artificial bacteria represented by learning classifier systems. The structure and implementation of the two proposed individual-based bacterial models are discussed, and some results from simulation experiments are presented, illustrating their adaptive properties.
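The agent-based flavor of IbM can be illustrated with a toy loop; the division threshold, nutrient limitation, and single "growth-rate gene" below are invented for illustration and are not the authors' model:

```python
import random

class Bacterium:
    """One agent: its genome is reduced to a single growth-rate parameter."""
    def __init__(self, growth_rate=1.0, energy=1.0):
        self.growth_rate = growth_rate
        self.energy = energy

    def step(self, nutrient, mutation_sd=0.05):
        """Consume nutrient; divide above an energy threshold, mutating the child."""
        self.energy += self.growth_rate * nutrient
        if self.energy > 2.0:                      # division threshold (illustrative)
            self.energy /= 2
            child_rate = max(0.0, random.gauss(self.growth_rate, mutation_sd))
            return Bacterium(child_rate, self.energy)
        return None

population = [Bacterium() for _ in range(10)]
for generation in range(50):
    nutrient = min(0.5, 25.0 / len(population))    # crude shared-resource limit
    offspring = [c for b in population if (c := b.step(nutrient))]
    population.extend(offspring)

print(len(population), sum(b.growth_rate for b in population) / len(population))
```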
CNN for breaking text-based CAPTCHA with noise
NASA Astrophysics Data System (ADS)
Liu, Kaixuan; Zhang, Rong; Qing, Ke
2017-07-01
A CAPTCHA ("Completely Automated Public Turing test to tell Computers and Humans Apart") is a test that most humans can pass but that current computer programs can hardly pass. As the most common type of CAPTCHA, text-based CAPTCHAs have been widely used on different websites to defend against network bots. To break text-based CAPTCHAs, in this paper two trained CNN models are connected for the segmentation and classification of CAPTCHA images. Based on these two models, we apply sliding-window segmentation and voting classification to realize an end-to-end CAPTCHA-breaking system with a high success rate. The experimental results show that our method is robust and effective in breaking text-based CAPTCHAs with noise.
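The sliding-window segmentation and voting classification pipeline can be sketched independently of the trained networks; here the two CNNs appear as placeholder callables, and the window width and stride are invented:

```python
import numpy as np

def _majority(group):
    """Most frequent predicted character within one vote group."""
    chars = [c for _, c in group]
    return max(set(chars), key=chars.count)

def break_captcha(image, is_char, classify, win_w=24, stride=4):
    """Slide a window across the CAPTCHA image; wherever the segmentation
    CNN (is_char) fires, record the classification CNN's prediction, then
    merge nearby windows into one character each by majority vote.

    image:    (H, W) grayscale numpy array
    is_char:  callable, window -> probability a character is centered here
    classify: callable, window -> predicted character label
    """
    _, W = image.shape
    votes = []                                   # (x position, predicted char)
    for x in range(0, W - win_w + 1, stride):
        window = image[:, x:x + win_w]
        if is_char(window) > 0.5:
            votes.append((x, classify(window)))
    result, group = [], []
    for x, ch in votes:
        if group and x - group[-1][0] > win_w:   # gap => next character starts
            result.append(_majority(group))
            group = []
        group.append((x, ch))
    if group:
        result.append(_majority(group))
    return "".join(result)
```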
1988-09-01
scheduler's knowledge of available employees' experience levels. If the scheduler lacks first-hand knowledge of employee experience levels, then assistance... to start a new system of rotating primary employees and asked that this capability be included in the program. Yet he had only a vague idea about... competitive with the DBASE prototype. The LOTUS 123 program was based around a spreadsheet that contained all the job, employee and schedule form data in a