Strain, J J; Felciano, R M; Seiver, A; Acuff, R; Fagan, L
1996-01-01
Approximately 30 minutes of computer access time are required by surgical residents at Stanford University Medical Center (SUMC) to examine the lab values of all patients on a surgical intensive care unit (ICU) service, a task that must be performed several times a day. To reduce the time accessing this information and simultaneously increase the readability and currency of the data, we have created a mobile, pen-based user interface and software system that delivers lab results to surgeons in the ICU. The ScroungeMaster system, loaded on a portable tablet computer, retrieves lab results for a subset of patients from the central laboratory computer and stores them in a local database cache. The cache can be updated on command; this update takes approximately 2.7 minutes for all ICU patients being followed by the surgeon, and can be performed as a background task while the user continues to access selected lab results. The user interface presents lab results according to physiologic system. Which labs are displayed first is governed by a layout selection algorithm based on previous accesses to the patient's lab information, physician preferences, and the nature of the patient's medical condition. Initial evaluation of the system has shown that physicians prefer the ScroungeMaster interface to that of existing systems at SUMC and are satisfied with the system's performance. We discuss the evolution of ScroungeMaster and make observations on changes to physician work flow with the presence of mobile, pen-based computing in the ICU.
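The layout selection algorithm is only summarized in this abstract; as a rough illustration, ranking lab panels by access history, physician preference, and relevance to the patient's condition could look like the following Python sketch, in which every name, weight, and field is a hypothetical assumption rather than the published algorithm:

```python
# Hypothetical sketch of a panel-ranking heuristic of the kind described:
# panels are ordered by a weighted score of recent accesses, explicit
# physician preference, and relevance to the patient's condition.
# All weights and field names are illustrative assumptions.

def rank_panels(panels, access_counts, preferred, condition_tags):
    def score(panel):
        recency = access_counts.get(panel["name"], 0)
        pref = 2.0 if panel["name"] in preferred else 0.0
        relevance = len(set(panel["systems"]) & condition_tags)
        return 1.0 * recency + pref + 3.0 * relevance
    return sorted(panels, key=score, reverse=True)

panels = [
    {"name": "renal", "systems": {"renal", "electrolytes"}},
    {"name": "hematology", "systems": {"heme"}},
    {"name": "hepatic", "systems": {"hepatic"}},
]
order = rank_panels(panels, {"hematology": 2}, {"renal"}, {"renal"})
print([p["name"] for p in order])  # renal first: preference plus condition match
```

The point of such a heuristic is that frequently viewed, preferred, and condition-relevant panels surface first, so the surgeon's most likely target is one tap away.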
Life Lab Computer Support System's Manual.
ERIC Educational Resources Information Center
Lippman, Beatrice D.; Walfish, Stephen
Step-by-step procedures for utilizing the computer support system of Miami-Dade Community College's Life Lab program are described for the following categories: (1) Registration--Student's Lists and Labels, including three separate computer programs for current listings, next semester listings, and grade listings; (2) Competence and Resource…
ERIC Educational Resources Information Center
Balakrishnan, B.; Woods, P. C.
2013-01-01
Over the years, rapid development in computer technology has engendered simulation-based laboratory (lab) in addition to the traditional hands-on (physical) lab. Many higher education institutions adopt simulation lab, replacing some existing physical lab experiments. The creation of new systems for conducting engineering lab activities has raised…
ERIC Educational Resources Information Center
Gercek, Gokhan; Saleem, Naveed
2006-01-01
Providing adequate computing lab support for Management Information Systems (MIS) and Computer Science (CS) programs is a perennial challenge for most academic institutions in the US and abroad. Factors, such as lack of physical space, budgetary constraints, conflicting needs of different courses, and rapid obsolescence of computing technology,…
Program Processes Thermocouple Readings
NASA Technical Reports Server (NTRS)
Quave, Christine A.; Nail, William, III
1995-01-01
Digital Signal Processor for Thermocouples (DART) computer program implements precise and fast method of converting voltage to temperature for large-temperature-range thermocouple applications. Written using LabVIEW software. DART available only as object code for use on Macintosh II FX or higher-series computers running System 7.0 or later and IBM PC-series and compatible computers running Microsoft Windows 3.1. Macintosh version of DART (SSC-00032) requires LabVIEW 2.2.1 or 3.0 for execution. IBM PC version (SSC-00031) requires LabVIEW 3.0 for Windows 3.1. LabVIEW software product of National Instruments and not included with program.
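DART's exact conversion method is not given in this record; thermocouple voltage-to-temperature conversion is commonly implemented as a range-specific inverse polynomial evaluated with Horner's method, as in this hedged Python sketch (the coefficients below are illustrative placeholders, not a published NIST table):

```python
# Voltage-to-temperature conversion of the kind DART performs is commonly
# implemented as a piecewise inverse polynomial T = sum(d_i * V**i) over
# a voltage range. The coefficients here are illustrative placeholders;
# real applications use standard range-specific coefficients.

def volts_to_temp(v_mv, coeffs):
    """Evaluate T = d0 + d1*V + d2*V^2 + ... with Horner's method."""
    t = 0.0
    for d in reversed(coeffs):
        t = t * v_mv + d
    return t

demo_coeffs = [0.0, 25.0, 0.05]          # placeholder d0, d1, d2
print(volts_to_temp(4.0, demo_coeffs))   # 25*4 + 0.05*16 = 100.8
```

Large-temperature-range applications typically switch between several such coefficient sets, selected by the measured voltage.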
ERIC Educational Resources Information Center
Mok, Heng Ngee; Lee, Yeow Leong; Tan, Wee Kiat
2012-01-01
This paper describes how a generic computer laboratory equipped with 52 workstations is set up for teaching IT-related courses and other general purpose usage. The authors have successfully constructed a lab management system based on decentralised, client-side software virtualisation technology using Linux and free software tools from VMware that…
SoftLab: A Soft-Computing Software for Experimental Research with Commercialization Aspects
NASA Technical Reports Server (NTRS)
Akbarzadeh-T, M.-R.; Shaikh, T. S.; Ren, J.; Hubbell, Rob; Kumbla, K. K.; Jamshidi, M
1998-01-01
SoftLab is a software environment for research and development in intelligent modeling/control using soft-computing paradigms such as fuzzy logic, neural networks, genetic algorithms, and genetic programs. SoftLab addresses the inadequacies of existing soft-computing software by supporting comprehensive multidisciplinary functionalities, from management tools to engineering systems. Furthermore, the built-in features help the user process and analyze information more efficiently through a friendly yet powerful interface, and allow the user to specify custom processing modules, extending the standard configuration of the software environment.
Computer Modeling of Complete IC Fabrication Process.
1987-05-28
James Shipley, National Semi.; Peter N. Manos, AMD; Ritu Shrivastava, Cypress Semi. Corp.; Deborah D. Maracas, Motorola, Inc.; Paramjit Singh, Rockwell Intl.; Carl F. Daegs, Sandia; Hisham Z. Massoud, Duke University; Anant Dixit, Silicon Systems; David Matthews, Hughes Research Lab; K. Jaczynski, AT&T Bell Labs; Jack C. Carlson, Motorola; Sanjay Jain, AT&T Bell Labs; Andrew Chan, Fairchild Weston Systems; Werner Juengling, AT&T Bell Labs
Macintosh/LabVIEW based control and data acquisition system for a single photon counting fluorometer
NASA Astrophysics Data System (ADS)
Stryjewski, Wieslaw J.
1991-08-01
A flexible software system has been developed for controlling fluorescence decay measurements using the virtual instrument approach offered by LabVIEW. The time-correlated single photon counting instrument operates under computer control in both manual and automatic mode. Implementation time was short and the equipment is now easier to use, reducing the training time required for new investigators. It is not difficult to customize the front panel or adapt the program to a different instrument. We found LabVIEW much more convenient to use for this application than traditional, textual computer languages.
LabVIEW: a software system for data acquisition, data analysis, and instrument control.
Kalkman, C J
1995-01-01
Computer-based data acquisition systems play an important role in clinical monitoring and in the development of new monitoring tools. LabVIEW (National Instruments, Austin, TX) is a data acquisition and programming environment that allows flexible acquisition and processing of analog and digital data. The main feature that distinguishes LabVIEW from other data acquisition programs is its highly modular graphical programming language, "G," and a large library of mathematical and statistical functions. The advantage of graphical programming is that the code is flexible, reusable, and self-documenting. Subroutines can be saved in a library and reused without modification in other programs. This dramatically reduces development time and enables researchers to develop or modify their own programs. LabVIEW uses a large amount of processing power and computer memory, thus requiring a powerful computer. A large-screen monitor is desirable when developing larger applications. LabVIEW is excellently suited for testing new monitoring paradigms, analysis algorithms, or user interfaces. The typical LabVIEW user is the researcher who wants to develop a new monitoring technique, a set of new (derived) variables by integrating signals from several existing patient monitors, closed-loop control of a physiological variable, or a physiological simulator.
Modelling the Landing of a Plane in a Calculus Lab
ERIC Educational Resources Information Center
Morante, Antonio; Vallejo, Jose A.
2012-01-01
We exhibit a simple model of a plane landing that involves only basic concepts of differential calculus, so it is suitable for a first-year calculus lab. We use the computer algebra system Maxima and the interactive geometry software GeoGebra to do the computations and graphics. (Contains 5 figures and 1 note.)
Computer systems and software engineering
NASA Technical Reports Server (NTRS)
Mckay, Charles W.
1988-01-01
The High Technologies Laboratory (HTL) was established in the fall of 1982 at the University of Houston Clear Lake. Research conducted at the High Tech Lab is focused upon computer systems and software engineering. There is a strong emphasis on the interrelationship of these areas of technology and the United States' space program. In Jan. of 1987, NASA Headquarters announced the formation of its first research center dedicated to software engineering. Operated by the High Tech Lab, the Software Engineering Research Center (SERC) was formed at the University of Houston Clear Lake. The High Tech Lab/Software Engineering Research Center promotes cooperative research among government, industry, and academia to advance the edge-of-knowledge and the state-of-the-practice in key topics of computer systems and software engineering which are critical to NASA. The center also recommends appropriate actions, guidelines, standards, and policies to NASA in matters pertinent to the center's research. Results of the research conducted at the High Tech Lab/Software Engineering Research Center have given direction to many decisions made by NASA concerning the Space Station Program.
Atmospheric simulation using a liquid crystal wavefront-controlling device
NASA Astrophysics Data System (ADS)
Brooks, Matthew R.; Goda, Matthew E.
2004-10-01
Test and evaluation of laser warning devices is important due to the increased use of laser devices in aerial applications. This research consists of an atmospheric aberrating system to enable in-lab testing of various detectors and sensors. The system employs laser light at 632.8 nm from a helium-neon source and a spatial light modulator (SLM) that induces phase changes using a birefringent liquid crystal material. Measuring the outgoing radiation from the SLM with a CCD target board and a Shack-Hartmann wavefront sensor reveals an acceptable resemblance of system output to expected atmospheric theory; over three turbulence scenarios, an error analysis shows that the measured turbulence data match theory. A wave-optics computer simulation analogous to the lab-bench design was created. Phase data, intensity data, and the computer simulation affirm the lab-bench results, so the aberrating SLM system can be operated with confidence.
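The wave-optics simulation mentioned in this abstract typically drives such an SLM with Kolmogorov phase screens. A standard FFT method filters complex white noise by the Kolmogorov power spectrum, which falls off as k^(-11/3); the following numpy sketch uses illustrative grid and r0 values, not the paper's parameters:

```python
import numpy as np

# Minimal FFT-based Kolmogorov phase screen of the kind used to drive an
# SLM in wave-optics turbulence simulation. Grid size, r0, and scaling
# constants are illustrative assumptions, not the paper's values.

def kolmogorov_screen(n=128, r0_pix=16.0, seed=0):
    rng = np.random.default_rng(seed)
    fx = np.fft.fftfreq(n)
    kx, ky = np.meshgrid(fx, fx)
    k = np.sqrt(kx**2 + ky**2)
    k[0, 0] = np.inf                      # suppress the undefined piston term
    psd = 0.023 * r0_pix**(-5.0 / 3.0) * k**(-11.0 / 3.0)
    noise = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    screen = np.fft.ifft2(noise * np.sqrt(psd)).real * n
    return screen

phase = kolmogorov_screen()
print(phase.shape)  # (128, 128)
```

Each realization of the screen is one turbulence snapshot; comparing the statistics of many such screens against theory is the kind of check the error analysis above performs.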
The community FabLab platform: applications and implications in biomedical engineering.
Stephenson, Makeda K; Dow, Douglas E
2014-01-01
Skill development in science, technology, engineering and math (STEM) education presents one of the most formidable challenges of modern society. The Community FabLab platform presents a viable solution. Each FabLab contains a suite of modern computer numerical control (CNC) equipment, electronics and computing hardware, and design, programming, computer aided design (CAD) and computer aided machining (CAM) software. FabLabs are community and educational resources and are open to the public. Development of STEM-based workforce skills such as digital fabrication and advanced manufacturing can be enhanced using this platform. Particularly notable is the potential of the FabLab platform in STEM education. The active learning environment engages and supports a diversity of learners, while the iterative learning that is supported by the FabLab rapid prototyping platform facilitates depth of understanding, creativity, innovation and mastery. The product- and project-based learning that occurs in FabLabs develops in the student a personal sense of accomplishment, self-awareness, and command of the material and technology. This helps build the interest and confidence necessary to excel in STEM and throughout life. Finally, the introduction and use of relevant technologies at every stage of the education process ensures technical familiarity and the broad knowledge base needed for work in STEM-based fields. Biomedical engineering education strives to cultivate broad technical adeptness, creativity, interdisciplinary thought, and an ability to form deep conceptual understanding of complex systems. The FabLab platform is well designed to enhance biomedical engineering education.
Spaceport Processing System Development Lab
NASA Technical Reports Server (NTRS)
Dorsey, Michael
2013-01-01
The Spaceport Processing System Development Lab (SPSDL), developed and maintained by the Systems Hardware and Engineering Branch (NE-C4), is a development lab with its own private/restricted networks. A private/restricted network is a network with restricted or no communication with other networks. This allows users from different groups to work on their own projects in their own configured environments without interfering with others using the lab's resources. The different networks in the lab have no way to talk to each other because of the way they are configured, so however a user configures software, operating systems, or equipment, it does not interfere with or carry over to any of the other networks in the lab. The SPSDL is available for any project at KSC that needs a lab environment. My job in the SPSDL was to assist in maintaining the lab to make sure it is accessible to users. This includes, but is not limited to, making sure the computers in the lab are properly running and patched with updated hardware/software. In addition, I assisted users who had issues using the resources in the lab, which could include helping to configure a restricted network for their own environment. All of this was to ensure workers were able to use the SPSDL to work on their projects without difficulty, which would in turn benefit the work done throughout KSC. When I wasn't working in the SPSDL, I helped other coworkers with smaller tasks, which included, but were not limited to, the proper disposal, moving of, or search for essential equipment. During free time, I also used NASA's resources to increase my knowledge and skills in a variety of subjects related to my major in computer engineering, particularly UNIX, networking, and embedded systems.
FAQ's | College of Engineering & Applied Science
zipped (compressed) format. This will help when the file is very large or created by one of the high-end … The labs are generally open 24/7; how will I know when a lab/system
Computing Systems | High-Performance Computing | NREL
investigate, build, and test models of complex phenomena or entire integrated systems that cannot be directly observed or manipulated in the lab, or would be too expensive or time consuming. Models and visualizations
18. VIEW OF THE GENERAL CHEMISTRY LAB. THE LABORATORY PROVIDED ...
18. VIEW OF THE GENERAL CHEMISTRY LAB. THE LABORATORY PROVIDED GENERAL ANALYTICAL AND STANDARDS CALIBRATION, AS WELL AS DEVELOPMENT OPERATIONS INCLUDING WASTE TECHNOLOGY DEVELOPMENT AND DEVELOPMENT AND TESTING OF MECHANICAL SYSTEMS FOR WEAPONS SYSTEMS. (4/4/66) - Rocky Flats Plant, General Manufacturing, Support, Records-Central Computing, Southern portion of Plant, Golden, Jefferson County, CO
Computers, Networks, and Desegregation at San Jose High Academy.
ERIC Educational Resources Information Center
Solomon, Gwen
1987-01-01
Describes magnet high school which was created in California to meet desegregation requirements and emphasizes computer technology. Highlights include local computer networks that connect science and music labs, the library/media center, business computer lab, writing lab, language arts skills lab, and social studies classrooms; software; teacher…
ERIC Educational Resources Information Center
Preusse-Burr, Beatrix
2011-01-01
Many classrooms have interactive whiteboards and several computers and many schools are equipped with a computer lab and mobile labs. However, there typically are not enough computers for every student in each classroom; mobile labs are often shared between several members of a team and time in the computer labs needs to be scheduled in advance.…
Computational Science News | Computational Science | NREL
-Cooled High-Performance Computing Technology at the ESIF February 28, 2018 NREL Launches New Website for High-Performance Computing System Users The National Renewable Energy Laboratory (NREL) Computational Science Center has launched a revamped website for users of the lab's high-performance computing (HPC
Energy and technology review, July--August, 1990
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burnham, A.K.
1990-01-01
This report highlights various research programs conducted at the Lab, including defense systems, laser research, fusion energy, biomedical and environmental sciences, engineering, physics, chemistry, materials science, and computational analysis. It also contains a statement on the state of the Lab and Laboratory Administration. (JEF)
Automatic Response to Intrusion
2002-10-01
Secure Computing Corporation Sidewinder Firewall [18]; SRI EMERALD Basic Security Module (BSM) and EMERALD File Transfer Protocol (FTP) monitors; TCP Wrappers [24]; Internet Security Systems RealSecure [31]; SRI EMERALD IDIP monitor; NAI Labs Generic Software Wrappers Prototype. Tools included EMERALD, NetRadar, NAI Labs UNIX wrappers, ARGuE, MPOG, CyberCop Server, Gauntlet, RealSecure, and the Cyber Command System.
Computer-Aided College Algebra: Learning Components that Students Find Beneficial
ERIC Educational Resources Information Center
Aichele, Douglas B.; Francisco, Cynthia; Utley, Juliana; Wescoatt, Benjamin
2011-01-01
A mixed-method study was conducted during the Fall 2008 semester to better understand the experiences of students participating in computer-aided instruction of College Algebra using the software MyMathLab. The learning environment included a computer learning system for the majority of the instruction, a support system via focus groups (weekly…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jones, Eric D.
1999-06-17
In the world of computer-based data acquisition and control, the graphical interface program LabVIEW from National Instruments is so ubiquitous that in many ways it has almost become the laboratory standard. To date, there have been approximately fifteen books concerning LabVIEW, but Professor Essick's treatise takes a completely different tack from all of the previous discussions. In the more standard treatments of the ways and wherefores of LabVIEW, such as LabVIEW Graphical Programming: Practical Applications in Instrumentation and Control by Gary W. Johnson (McGraw Hill, NY, 1997), the emphasis has been on instructing the reader how to program LabVIEW to create a Virtual Instrument (VI) on the computer for interfacing to particular instruments. LabVIEW is written in "G", a graphical programming language developed by National Instruments. In the past the emphasis has been on training the experimenter to learn "G". Without going into details here, "G" incorporates the usual loops, arithmetic expressions, etc., found in many programming languages, but in an icon (graphical) environment. The net result is that LabVIEW contains all of the standard methods needed for interfacing to instruments, data acquisition, data analysis, and graphics, as well as methodology to incorporate programs written in other languages into LabVIEW. Historically, according to Professor Essick, he developed a series of experiments for an upper-division laboratory course on computer-based instrumentation. His observation was that while many students had the necessary background in computer programming languages, there were students who had virtually no concept of writing a computer program, let alone a computer-based interfacing program. Thus began a concept for not only teaching computer-based instrumentation techniques, but also giving the beginner a method for experiencing writing a computer program.
Professor Essick saw LabVIEW as the "perfect environment in which to teach computer-based research skills." With this goal in mind, he has succeeded admirably. Advanced LabVIEW Labs presents a series of chapters devoted not only to introducing the reader to LabVIEW, but also to the concepts necessary for writing a successful computer program. Each chapter is an assignment for the student, and the set is suitable for a ten-week course. The first topic introduces the while loop and waveform chart VIs. After learning how to launch LabVIEW, the student then learns how to use LabVIEW functions such as sine and cosine. The beauty of this and subsequent chapters is that the student is introduced immediately to computer-based instruction by learning how to display the results in graph form on the screen. At each point along the way, the student is introduced not only to another LabVIEW operation, but also to such subjects as spreadsheets for data storage, numerical integration, Fourier transformations, curve-fitting algorithms, etc. The last few chapters conclude with the purpose of the learning module: computer-based instrumentation. Computer-based laboratory projects such as analog-to-digital conversion and digitizing oscilloscopes are treated. Advanced LabVIEW Labs finishes with a treatment of GPIB interfacing, and finally the student is asked to create an operating VI for temperature control. This is an excellent text, not only as a treatise on LabVIEW but also as an introduction to computer programming logic. All programmers who are struggling not only to learn how to interface computers to instruments, but also to understand top-down programming and other programming-language techniques, should add Advanced LabVIEW Labs to their computer library.
AnimatLab: a 3D graphics environment for neuromechanical simulations.
Cofer, David; Cymbalyuk, Gennady; Reid, James; Zhu, Ying; Heitler, William J; Edwards, Donald H
2010-03-30
The nervous systems of animals evolved to exert dynamic control of behavior in response to the needs of the animal and changing signals from the environment. To understand the mechanisms of dynamic control requires a means of predicting how individual neural and body elements will interact to produce the performance of the entire system. AnimatLab is a software tool that provides an approach to this problem through computer simulation. AnimatLab enables a computational model of an animal's body to be constructed from simple building blocks, situated in a virtual 3D world subject to the laws of physics, and controlled by the activity of a multicellular, multicompartment neural circuit. Sensor receptors on the body surface and inside the body respond to external and internal signals and then excite central neurons, while motor neurons activate Hill muscle models that span the joints and generate movement. AnimatLab provides a common neuromechanical simulation environment in which to construct and test models of any skeletal animal, vertebrate or invertebrate. The use of AnimatLab is demonstrated in a neuromechanical simulation of human arm flexion and the myotactic and contact-withdrawal reflexes. Copyright (c) 2010 Elsevier B.V. All rights reserved.
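The Hill muscle models mentioned in this abstract combine an activation-scaled contractile element, with force-length and force-velocity dependence, and a passive elastic element. A minimal sketch follows; the functional forms and all constants are illustrative assumptions, not AnimatLab's internal equations:

```python
import math

# Minimal Hill-type muscle force sketch: active force = activation *
# F_max * f_len(l) * f_vel(v), plus a passive spring term. The Gaussian
# force-length and hyperbolic force-velocity shapes and every constant
# here are illustrative assumptions.

def hill_force(activation, l_norm, v_norm, f_max=100.0):
    f_len = math.exp(-((l_norm - 1.0) / 0.45) ** 2)    # peaks at optimal length
    # simple force-velocity curve: shortening (v_norm > 0) reduces force,
    # mild lengthening raises it; clamped to a plateau for fast lengthening
    f_vel = (1.0 - v_norm) / (1.0 + 4.0 * v_norm) if v_norm > -0.2 else 1.5
    passive = 10.0 * max(0.0, l_norm - 1.0) ** 2       # slack below optimum
    return activation * f_max * f_len * max(0.0, f_vel) + passive

print(round(hill_force(1.0, 1.0, 0.0), 1))  # 100.0: isometric at optimal length
```

Wiring many such elements across joints, driven by simulated motor neurons, is the essence of the neuromechanical coupling the tool provides.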
An Educational Approach to Computationally Modeling Dynamical Systems
ERIC Educational Resources Information Center
Chodroff, Leah; O'Neal, Tim M.; Long, David A.; Hemkin, Sheryl
2009-01-01
Chemists have used computational science methodologies for a number of decades and their utility continues to be unabated. For this reason we developed an advanced lab in computational chemistry in which students gain understanding of general strengths and weaknesses of computation-based chemistry by working through a specific research problem.…
Teaching Calculus with Wolfram|Alpha
ERIC Educational Resources Information Center
Dimiceli, Vincent E.; Lang, Andrew S. I. D.; Locke, LeighAnne
2010-01-01
This article describes the benefits and drawbacks of using Wolfram|Alpha as the platform for teaching calculus concepts in the lab setting. It is a result of our experiences designing and creating an entirely new set of labs using Wolfram|Alpha. We present the reasoning behind our transition from using a standard computer algebra system (CAS) to…
The Study on Virtual Medical Instrument based on LabVIEW.
Chengwei, Li; Limei, Zhang; Xiaoming, Hu
2005-01-01
With the increasing performance of computers, virtual instrument technology has greatly advanced over the years, and virtual medical instrument technology has become available. This paper presents the virtual medical instrument and then, as an example, gives an application of a signal acquisition, processing, and analysis system using LabVIEW.
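The acquire-process-analyze pipeline that such a LabVIEW virtual instrument wires together graphically can be sketched in ordinary code. This hedged Python stand-in simulates acquisition of a noisy 50 Hz signal and estimates its dominant frequency by FFT; the sampling rate and noise level are illustrative choices, not the paper's:

```python
import numpy as np

# Stand-in for the acquire -> process -> analyze pipeline a virtual
# instrument implements; all parameter values are illustrative.

fs = 500.0                                 # sampling rate, Hz
t = np.arange(0, 2.0, 1.0 / fs)            # 2 s acquisition window
rng = np.random.default_rng(1)
signal = np.sin(2 * np.pi * 50.0 * t) + 0.3 * rng.normal(size=t.size)

spectrum = np.abs(np.fft.rfft(signal))     # processing: magnitude spectrum
freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)
peak = freqs[np.argmax(spectrum[1:]) + 1]  # analysis: skip the DC bin
print(peak)  # ~50 Hz dominant component
```

In a physiological application the simulated sine would be replaced by an analog input channel, but the downstream processing stages are the same.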
NASA Astrophysics Data System (ADS)
Balakrishnan, B.; Woods, P. C.
2013-05-01
Over the years, rapid development in computer technology has engendered simulation-based laboratory (lab) in addition to the traditional hands-on (physical) lab. Many higher education institutions adopt simulation lab, replacing some existing physical lab experiments. The creation of new systems for conducting engineering lab activities has raised concerns among educators on the merits and shortcomings of both physical and simulation labs; at the same time, many arguments have been raised on the differences of both labs. Investigating the effectiveness of both labs is complicated, as there are multiple factors that should be considered. In view of this challenge, a study on students' perspectives on their experience related to key aspects on engineering laboratory exercise was conducted. In this study, the Visual Auditory Read and Kinetic model was utilised to measure the students' cognitive styles. The investigation was done through a survey among participants from Multimedia University, Malaysia. The findings revealed that there are significant differences for most of the aspects in physical and simulation labs.
Co-"Lab"oration: A New Paradigm for Building a Management Information Systems Course
ERIC Educational Resources Information Center
Breimer, Eric; Cotler, Jami; Yoder, Robert
2010-01-01
We propose a new paradigm for building a Management Information Systems course that focuses on laboratory activities developed collaboratively using Computer-Mediated Communication and Collaboration tools. A highlight of our paradigm is the "practice what you preach" concept where the computer communication tools and collaboration…
A Computer Lab that Students Use but Never See
ERIC Educational Resources Information Center
Young, Jeffrey R.
2008-01-01
North Carolina State University may never build another computer lab. Instead the university has installed racks of equipment in windowless rooms where students and professors never go. This article describes a project called the Virtual Computing Lab. Users enter it remotely from their own computers in dormitory rooms or libraries. They get all…
NASA Astrophysics Data System (ADS)
Yin, Leilei; Chen, Ying-Chieh; Gelb, Jeff; Stevenson, Darren M.; Braun, Paul A.
2010-09-01
High-resolution x-ray computed tomography is a powerful non-destructive 3-D imaging method. It can offer superior resolution on objects that are opaque or low-contrast for optical microscopy. Synchrotron-based x-ray computed tomography systems have been available for scientific research, but remain difficult for broader user communities to access. This work introduces a lab-based high-resolution x-ray nanotomography system with 50 nm resolution in absorption and Zernike phase contrast modes. Using this system, we have demonstrated high-quality 3-D images of polymerized photonic crystals, which have been analyzed for band gap structures. The isotropic volumetric data show excellent consistency with other characterization results.
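Absorption-contrast tomography of the kind described rests on the Beer-Lambert law: each detector pixel records I = I0 * exp(-integral of mu along its ray), and taking -ln(I/I0) recovers the line integral used in reconstruction. A minimal numpy sketch with an illustrative phantom (not the paper's data):

```python
import numpy as np

# Absorption-contrast projection at a single angle: each detector pixel
# records I = I0 * exp(-sum of mu along the ray) (Beer-Lambert law).
# The phantom values are illustrative; real nanotomography reconstructs
# mu from projections at many angles.

mu = np.zeros((64, 64))
mu[24:40, 24:40] = 0.05        # absorbing square, per-pixel attenuation

i0 = 1.0
line_integrals = mu.sum(axis=0)          # rays along columns, one view
intensity = i0 * np.exp(-line_integrals)
recovered = -np.log(intensity / i0)      # sinogram row fed to reconstruction
print(np.allclose(recovered, line_integrals))  # True
```

Zernike phase contrast replaces this attenuation signal with one proportional to phase shift, which is what makes weakly absorbing polymer structures visible.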
Introduction to Computing: Lab Manual. Faculty Guide [and] Student Guide.
ERIC Educational Resources Information Center
Frasca, Joseph W.
This lab manual is designed to accompany a college course introducing students to computing. The exercises are designed to be completed by the average student in a supervised 2-hour block of time at a computer lab over 15 weeks. The intent of each lab session is to introduce a topic and have the student feel comfortable with the use of the machine…
Planning a Computer Lab: Considerations To Ensure Success.
ERIC Educational Resources Information Center
IALL Journal of Language Learning Technologies, 1994
1994-01-01
Presents points to consider when organizing a computer laboratory. These include the lab's overall objectives and how best to meet them; what type of students will use the lab; where the lab will be located; and what software and hardware can best meet the lab's overall objectives, population, and location requirements. Other factors include time,…
Neilson, Christine J
2010-01-01
The Saskatchewan Health Information Resources Partnership (SHIRP) provides library instruction to Saskatchewan's health care practitioners and students on placement in health care facilities as part of its mission to provide province-wide access to evidence-based health library resources. A portable computer lab was assembled in 2007 to provide hands-on training in rural health facilities that do not have computer labs of their own. Aside from some minor inconveniences, the introduction and operation of the portable lab has gone smoothly. The lab has been well received by SHIRP patrons and continues to be an essential part of SHIRP outreach.
Teaching Mathematics in the PC Lab--The Students' Viewpoints
ERIC Educational Resources Information Center
Schmidt, Karsten; Kohler, Anke
2013-01-01
The Matrix Algebra portion of the intermediate mathematics course at the Schmalkalden University Faculty of Business and Economics has been moved from a traditional classroom setting to a technology-based setting in the PC lab. A Computer Algebra System license was acquired that also allows its use on the students' own PCs. A survey was carried…
A Comparison, for Teaching Purposes, of Three Data-Acquisition Systems for the Macintosh.
ERIC Educational Resources Information Center
Swanson, Harold D.
1990-01-01
Three commercial products for data acquisition with the Macintosh computer, known by the trade names of LabVIEW, Analog Connection WorkBench, and MacLab were reviewed and compared, on the basis of actual trials, for their suitability in physiological and biological teaching laboratories. Suggestions for using these software packages are provided.…
Educationally and Cost Effective: Computers in the Classroom.
ERIC Educational Resources Information Center
Agee, Roy
1986-01-01
The author states that the educational community must provide programs that assure students they will be able to learn how to use and control computers. He discusses micro labs, prerequisites to computer literacy, curriculum development, teaching methods, simulation projects, a systems analysis project, new job titles, and primary basic skills…
Computers in Post-Secondary Developmental Education and Learning Assistance.
ERIC Educational Resources Information Center
Christ, Frank L.; McLaughlin, Richard C.
This update on computer technology--as it affects learning assistance directors and developmental education personnel--begins by reporting on new developments and changes that have taken place during the past two years in five areas: (1) hardware (microcomputer systems, low cost PC clones, combination Apple/PC machines, lab computer controllers…
Integrated Speech and Language Technology for Intelligence, Surveillance, and Reconnaissance (ISR)
2017-07-01
applying submodularity techniques to address computing challenges posed by large datasets in speech and language processing. MT and speech tools were...aforementioned research-oriented activities, the IT system administration team provided necessary support to laboratory computing and network operations...operations of SCREAM Lab computer systems and networks. Other miscellaneous activities in relation to Task Order 29 are presented in an additional fourth
1993-06-01
administering contractual support for lab-wide or multiple buys of ADP systems, software, and services. Computer systems located in the Central Computing Facility...
Black hole based quantum computing in labs and in the sky
NASA Astrophysics Data System (ADS)
Dvali, Gia; Panchenko, Mischa
2016-08-01
Analyzing some well established facts, we give a model-independent parameterization of black hole quantum computing in terms of a set of macro and micro quantities and their relations. These include the relations between the extraordinarily small energy gap of black hole qubits and important time-scales of information processing, such as the scrambling time and Page's time. We then show, confirming and extending previous results, that other systems of nature with identical quantum informatics features are attractive Bose-Einstein systems at the critical point of a quantum phase transition. Here we establish a complete isomorphism between the quantum computational properties of these two systems. In particular, we show that the quantum hair of a critical condensate is strikingly similar to the quantum hair of a black hole. Irrespective of whether one takes the similarity between the two systems as a remarkable coincidence or as a sign of a deeper underlying connection, the following is evident: black holes are not unique in their way of processing quantum information, and we can manufacture black hole based quantum computers in labs by taking advantage of quantum criticality.
Neves Tafula, Sérgio M; Moreira da Silva, Nádia; Rozanski, Verena E; Silva Cunha, João Paulo
2014-01-01
Neuroscience is an increasingly multidisciplinary and highly cooperative field where neuroimaging plays an important role. Neuroimaging's rapid evolution is demanding a growing number of computing resources and skills that need to be put in place at every lab. Typically, each group tries to set up its own servers and workstations to support its neuroimaging needs, having to learn everything from operating system management to the details of specific neuroscience software tools before any results can be obtained from each setup. This setup and learning process is replicated in every lab, even when a strong collaboration among several groups is going on. In this paper we present a new cloud service model - Brain Imaging Application as a Service (BiAaaS) - and one of its implementations - the Advanced Brain Imaging Lab (ABrIL) - in the form of a ubiquitous virtual desktop remote infrastructure that offers a set of neuroimaging computational services in an interactive, neuroscientist-friendly graphical user interface (GUI). This remote desktop has been used for several multi-institution cooperative projects with different neuroscience objectives that have already achieved important results, such as the contribution to a high-impact paper published in the January issue of the journal NeuroImage. The ABrIL system has shown its applicability in several neuroscience projects at relatively low cost, promoting truly collaborative actions and speeding up project results and their clinical applicability.
ERIC Educational Resources Information Center
Singh, Gurmukh
2012-01-01
The present article is primarily targeted for the advanced college/university undergraduate students of chemistry/physics education, computational physics/chemistry, and computer science. The most recent software system such as MS Visual Studio .NET version 2010 is employed to perform computer simulations for modeling Bohr's quantum theory of…
Lewis hybrid computing system, users manual
NASA Technical Reports Server (NTRS)
Bruton, W. M.; Cwynar, D. S.
1979-01-01
The Lewis Research Center's Hybrid Simulation Lab contains a collection of analog, digital, and hybrid (combined analog and digital) computing equipment suitable for the dynamic simulation and analysis of complex systems. This report is intended as a guide to users of these computing systems. The report describes the available equipment and outlines procedures for its use. Particular attention is given to the operation of the PACER 100 digital processor. System software for the usual digital tasks, such as compiling and editing, and Lewis-developed special-purpose software are described.
FY17 ISCR Scholar End-of-Assignment Report - Robbie Sadre
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sadre, R.
2017-10-20
Throughout this internship assignment, I did various tasks that contributed towards the starting of the SASEDS (Safe Active Scanning for Energy Delivery Systems) and CES-21 (California Energy Systems for the 21st Century) projects in the SKYFALL laboratory. The goal of the SKYFALL laboratory is to perform modeling and simulation verification of transmission power system devices while integrating with high-performance computing. The first thing I needed to do was acquire official online LabVIEW training from National Instruments. Through these online tutorial modules, I learned the basics of LabVIEW, gaining experience in connecting to NI devices through the DAQmx API as well as LabVIEW basic programming techniques (structures, loops, state machines, front panel GUI design, etc.).
Measurement system for nitrous oxide based on amperometric gas sensor
NASA Astrophysics Data System (ADS)
Siswoyo, S.; Persaud, K. C.; Phillips, V. R.; Sneath, R.
2017-03-01
Nitrous oxide is well known to be an important greenhouse gas, so monitoring and controlling its concentration and emission are very important. In this work a nitrous oxide measurement system has been developed, consisting of an amperometric sensor and a lab-made potentiostat capable of measuring currents in the picoampere range. The sensor was constructed using a gold microelectrode as the working electrode surrounded by a silver wire as a quasi-reference electrode, with tetraethylammonium perchlorate and dimethylsulphoxide as the supporting electrolyte and solvent, respectively. The lab-made potentiostat was built around a transimpedance amplifier capable of picoampere measurements and incorporated a microcontroller-based data acquisition system, controlled by a host personal computer running a dedicated program. The system was capable of detecting N2O concentrations down to 0.07% v/v.
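The current-to-concentration chain of such a readout can be sketched numerically. The feedback resistance and linear calibration sensitivity below are illustrative assumptions, not values from the paper:

```python
# Hypothetical post-processing for a transimpedance-amplifier readout:
# the amplifier output is v_out = -i_sensor * r_f, so the sensor current
# is recovered as i_sensor = -v_out / r_f. Feedback resistance and
# calibration sensitivity are assumptions for illustration only.

RF_OHMS = 1e9  # assumed 1 GOhm feedback resistor for picoampere sensitivity

def voltage_to_current(v_out, r_f=RF_OHMS):
    """Convert amplifier output voltage (V) back to sensor current (A)."""
    return -v_out / r_f

def current_to_concentration(i_sensor, sensitivity=2e-12):
    """Map sensor current (A) to N2O concentration (% v/v), assuming a
    linear calibration of `sensitivity` amperes per % v/v."""
    return i_sensor / sensitivity

v = -1.4e-4                         # measured output voltage, volts (illustrative)
i = voltage_to_current(v)           # 1.4e-13 A, i.e. 0.14 pA
print(current_to_concentration(i))  # ~0.07 (% v/v)
```

With a 1 GOhm feedback resistor, a 0.14 pA sensor current appears as a 0.14 mV output, which is why picoampere work needs such large transimpedance gains.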
Computational Labs Using VPython Complement Conventional Labs in Online and Regular Physics Classes
NASA Astrophysics Data System (ADS)
Bachlechner, Martina E.
2009-03-01
Fairmont State University has developed online physics classes for the high-school teaching certificate based on the textbook Matter and Interactions by Chabay and Sherwood. This led to using computational VPython labs in the traditional classroom setting as well, to complement conventional labs. The computational modeling process has proven to provide an excellent basis for the subsequent conventional lab and allows for a concrete experience of the difference between behavior according to a model and realistic behavior. Observations in the regular classroom setting feed back into the development of the online classes.
Interfacing LabVIEW With Instrumentation for Electronic Failure Analysis and Beyond
NASA Technical Reports Server (NTRS)
Buchanan, Randy K.; Bryan, Coleman; Ludwig, Larry
1996-01-01
The Laboratory Virtual Instrument Engineering Workbench (LabVIEW) software is designed so that equipment and processes related to control systems can be operationally linked and controlled by a computer. Various processes within the failure analysis laboratories of NASA's Kennedy Space Center (KSC) demonstrate the need for modernization and, in some cases, automation using LabVIEW. An examination of procedures and practices within the Failure Analysis Laboratory resulted in the conclusion that some device was necessary to bring potential users of LabVIEW to an operational level in minimum time. This paper outlines the process involved in creating a tutorial application to enable personnel to apply LabVIEW to their specific projects. Suggestions for furthering the extent to which LabVIEW is used are provided in the areas of data acquisition and process control.
Reflection Effects in Multimode Fiber Systems Utilizing Laser Transmitters
NASA Technical Reports Server (NTRS)
Bates, Harry E.
1991-01-01
A number of optical communication lines are now in use at NASA-Kennedy for the transmission of voice, computer data, and video signals. Currently, all of these channels use a single carrier wavelength centered near 1300 or 1550 nm. Engineering tests in the past have given indications of the growth of systematic and random noise in the RF spectrum of a fiber network as the number of connector pairs is increased. This noise seems to occur when a laser transmitter is used instead of an LED. It has been suggested that the noise is caused by back reflections created at connector-fiber interfaces. Experiments were performed to explore the effect of reflection on the transmitting laser under conditions of reflective feedback. This effort included computer integration of some of the instrumentation in the fiber optic lab using the LabVIEW software recently acquired by the lab group. The main goal was to interface the Anritsu optical and RF spectrum analyzers to the Macintosh II computer so that laser spectra and network RF spectra could be simultaneously and rapidly acquired in a form convenient for analysis. Both single-mode and multimode fiber are installed at Kennedy. Since most of it is multimode, this effort concentrated on multimode systems.
Reflection effects in multimode fiber systems utilizing laser transmitters
NASA Astrophysics Data System (ADS)
Bates, Harry E.
1991-11-01
A number of optical communication lines are now in use at NASA-Kennedy for the transmission of voice, computer data, and video signals. Currently, all of these channels use a single carrier wavelength centered near 1300 or 1550 nm. Engineering tests in the past have given indications of the growth of systematic and random noise in the RF spectrum of a fiber network as the number of connector pairs is increased. This noise seems to occur when a laser transmitter is used instead of an LED. It has been suggested that the noise is caused by back reflections created at connector-fiber interfaces. Experiments were performed to explore the effect of reflection on the transmitting laser under conditions of reflective feedback. This effort included computer integration of some of the instrumentation in the fiber optic lab using the LabVIEW software recently acquired by the lab group. The main goal was to interface the Anritsu optical and RF spectrum analyzers to the Macintosh II computer so that laser spectra and network RF spectra could be simultaneously and rapidly acquired in a form convenient for analysis. Both single-mode and multimode fiber are installed at Kennedy. Since most of it is multimode, this effort concentrated on multimode systems.
Evaluation and recommendations for work group integration within the Materials and Processes Lab
NASA Technical Reports Server (NTRS)
Farrington, Phillip A.
1992-01-01
The goal of this study was to evaluate and make recommendations for improving the level of integration of several work groups within the Materials and Processes Lab at the Marshall Space Flight Center. This evaluation has uncovered a variety of projects that could improve the efficiency and operation of the work groups as well as the overall integration of the system. In addition, this study provides the foundation for specification of a computer integrated manufacturing test bed environment in the Materials and Processes Lab.
Grid Computing in K-12 Schools. Soapbox Digest. Volume 3, Number 2, Fall 2004
ERIC Educational Resources Information Center
AEL, 2004
2004-01-01
Grid computing allows large groups of computers (either in a lab, or remote and connected only by the Internet) to extend extra processing power to each individual computer to work on components of a complex request. Grid middleware, recognizing priorities set by systems administrators, allows the grid to identify and use this power without…
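The divide-and-conquer idea behind grid computing can be sketched on a single machine. In this illustrative Python sketch a thread pool stands in for the grid's worker computers, and the chunking scheme and workload are assumptions; real grid middleware additionally handles scheduling priorities and remote machines:

```python
from concurrent.futures import ThreadPoolExecutor

# Split a complex request into components, farm the components out to a
# pool of workers (threads here stand in for remote grid computers), then
# combine the partial results into one answer.

def component(chunk):
    # stand-in for one piece of the complex request
    return sum(x * x for x in chunk)

def run_on_grid(data, n_workers=4):
    size = (len(data) + n_workers - 1) // n_workers
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        return sum(pool.map(component, chunks))

print(run_on_grid(list(range(1000))))  # same result as computing the sum locally
```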
Data recording and trend display during anaesthesia using 'MacLab'.
Kennedy, R R
1991-08-01
A single screen display of variables monitored during anaesthesia may be ergonomically superior to the 'stack' of monitors seen in many anaesthetising locations. A system based on a MacLab (Analogue Digital Instruments) analogue-to-digital convertor used in conjunction with a Macintosh computer was evaluated. The system was configured to provide trend displays of up to eight variables on a single screen. It was found to be a useful adjunct to monitoring during anaesthesia. Advantages of this system are low cost, flexibility, and the quality of the software and support provided. Limitations of this and other similar systems are discussed.
Improve Problem Solving Skills through Adapting Programming Tools
NASA Technical Reports Server (NTRS)
Shaykhian, Linda H.; Shaykhian, Gholam Ali
2007-01-01
There are numerous ways for engineers and students to become better problem-solvers. The use of command-line and visual programming tools can help to model a problem and formulate a solution through visualization. The analysis of problem attributes and constraints provides insight into the scope and complexity of the problem. The visualization aspect of the problem-solving approach tends to make students and engineers more systematic in their thought process and helps them catch errors before proceeding too far in the wrong direction. The problem-solver identifies and defines important terms, variables, rules, and procedures required for solving a problem. Every step required to construct the problem solution can be defined in program commands that produce intermediate output. This paper advocates improving problem-solving skills through the use of a programming tool. MatLab, created by MathWorks, is an interactive numerical computing environment and programming language. It is a matrix-based system that easily lends itself to matrix manipulation and to plotting of functions and data. MatLab can be used as an interactive command line or via sequences of commands saved in a file as a script or named functions. Prior programming experience is not required to use MatLab commands. GNU Octave, part of the GNU Project and a free computer program for performing numerical computations, is comparable to MatLab. MatLab visual and command programming are presented here.
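The matrix-based, inspect-every-step workflow described above can be sketched in Python with NumPy, a free environment in the same spirit as the GNU Octave comparison. The linear system solved here is illustrative:

```python
import numpy as np

# Define the problem: solve A @ x = b for x, producing intermediate
# output at each step so the problem-solver can check the work.

A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([9.0, 8.0])

det_A = np.linalg.det(A)   # nonzero determinant -> unique solution exists
x = np.linalg.solve(A, b)  # the solution vector: [2, 3]
residual = A @ x - b       # verification step: should be ~0

print(det_A, x, residual)
```

Each intermediate value (determinant, solution, residual) is the kind of checkpoint the paper argues makes problem-solvers more systematic.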
ERIC Educational Resources Information Center
Kuan, Wen-Hsuan; Tseng, Chi-Hung; Chen, Sufen; Wong, Ching-Chang
2016-01-01
We propose an integrated curriculum to establish essential abilities of computer programming for the freshmen of a physics department. The implementation of the graphical-based interfaces from Scratch to LabVIEW then to LabVIEW for Arduino in the curriculum "Computer-Assisted Instrumentation in the Design of Physics Laboratories" brings…
NASA Technical Reports Server (NTRS)
Rives, T. B.; Ingels, F. M.
1988-01-01
An analysis of the Automated Booster Assembly Checkout System (ABACS) has been conducted. A computer simulation of the ETHERNET LAN has been written. The simulation allows one to investigate different structures of the ABACS system. The simulation code is in PASCAL and is VAX compatible.
NASA Technical Reports Server (NTRS)
1988-01-01
Martin Marietta Aero and Naval Systems has advanced the CAD art to a very high level at its Robotics Laboratory. One of the company's major projects is construction of a huge Field Material Handling Robot (FMR) for the Army's Human Engineering Lab. Design of the FMR, intended to move heavy and dangerous material such as ammunition, was a triumph of CAD engineering. Separate computer programs modeled the robot's kinematics and dynamics, yielding such parameters as the strength of materials required for each component, the length of the arms, their degrees of freedom, and the power of the hydraulic system needed. The Robotics Lab went a step further and added data enabling computer simulation and animation of the robot's total operational capability under various loading and unloading conditions. A NASA computer program, the Integrated Analysis Capability (IAC) engineering database, was used. The program contains a series of modules that can stand alone or be integrated with data from sensors or software tools.
Integrated Laser Characterization, Data Acquisition, and Command and Control Test System
NASA Technical Reports Server (NTRS)
Stysley, Paul; Coyle, Barry; Lyness, Eric
2012-01-01
Satellite-based laser technology has been developed for topographical measurements of the Earth and of other planets. Lasers for such missions must be highly efficient and stable over long periods in the temperature variations of orbit. In this innovation, LabVIEW is used on an Apple Macintosh to acquire and analyze images of the laser beam as it exits the laser cavity to evaluate the laser's performance over time, and to monitor and control the environmental conditions under which the laser is tested. One computer attached to multiple cameras and instruments running LabVIEW-based software replaces a conglomeration of computers and software packages, saving hours in maintenance and data analysis, and making very long-term tests possible. This all-in-one system was written primarily using LabVIEW for Mac OS X, which allows the combining of data from multiple RS-232, USB, and Ethernet instruments for comprehensive laser analysis and control. The system acquires data from CCDs (charge-coupled devices), power meters, thermistors, and oscilloscopes over a controllable period of time. The data are saved to an HTML file that can be accessed later from a variety of data analysis programs. Also, through the LabVIEW interface, engineers can easily control laser input parameters such as current, pulse width, chiller temperature, and repetition rate. All of these parameters can be adapted and cycled over a period of time.
ICCE/ICCAI 2000 Full & Short Papers (Virtual Lab/Classroom/School).
ERIC Educational Resources Information Center
2000
This document contains the following full and short papers on virtual laboratories, classrooms, and schools from ICCE/ICCAI 2000 (International Conference on Computers in Education/International Conference on Computer-Assisted Instruction): (1) "A Collaborative Learning Support System Based on Virtual Environment Server for Multiple…
Affordances of instrumentation in general chemistry laboratories
NASA Astrophysics Data System (ADS)
Sherman, Kristin Mary Daniels
The purpose of this study is to find out what students in the first chemistry course at the undergraduate level (general chemistry for science majors) know about the affordances of instrumentation used in the general chemistry laboratory and how their knowledge develops over time. Overall, students see the PASCO(TM) system as a useful and accurate measuring tool for general chemistry labs. They see the probeware as easy to use, portable, and able to interact with computers. Students find that the PASCO(TM) probeware system is useful in their general chemistry labs, more advanced chemistry labs, and in other science classes, and can be used in a variety of labs done in general chemistry. Students learn the affordances of the probeware through the lab manual, the laboratory teaching assistant, by trial and error, and from each other. The use of probeware systems provides lab instructors the opportunity to focus on the concepts illustrated by experiments and the opportunity to spend time discussing the results. In order to teach effectively, the instructor must know the correct name of the components involved, how to assemble and disassemble it correctly, how to troubleshoot the software, and must be able to replace broken or missing components quickly. The use of podcasts or Web-based videos should increase student understanding of affordances of the probeware.
Teaching mathematics in the PC lab - the students' viewpoints
NASA Astrophysics Data System (ADS)
Schmidt, Karsten; Köhler, Anke
2013-04-01
The Matrix Algebra portion of the intermediate mathematics course at the Schmalkalden University Faculty of Business and Economics has been moved from a traditional classroom setting to a technology-based setting in the PC lab. A Computer Algebra System license was acquired that also allows its use on the students' own PCs. A survey was carried out to analyse the students' attitudes towards the use of technology in mathematics teaching.
Fast Realistic MRI Simulations Based on Generalized Multi-Pool Exchange Tissue Model.
Liu, Fang; Velikina, Julia V; Block, Walter F; Kijowski, Richard; Samsonov, Alexey A
2017-02-01
We present MRiLab, a new comprehensive simulator for large-scale realistic MRI simulations on a regular PC equipped with a modern graphical processing unit (GPU). MRiLab combines realistic tissue modeling with numerical virtualization of an MRI system and scanning experiment to enable assessment of a broad range of MRI approaches including advanced quantitative MRI methods inferring microstructure on a sub-voxel level. A flexible representation of tissue microstructure is achieved in MRiLab by employing the generalized tissue model with multiple exchanging water and macromolecular proton pools rather than a system of independent proton isochromats typically used in previous simulators. The computational power needed for simulation of the biologically relevant tissue models in large 3D objects is gained using parallelized execution on GPU. Three simulated and one actual MRI experiments were performed to demonstrate the ability of the new simulator to accommodate a wide variety of voxel composition scenarios and demonstrate detrimental effects of simplified treatment of tissue micro-organization adapted in previous simulators. GPU execution allowed ∼ 200× improvement in computational speed over standard CPU. As a cross-platform, open-source, extensible environment for customizing virtual MRI experiments, MRiLab streamlines the development of new MRI methods, especially those aiming to infer quantitatively tissue composition and microstructure.
Fast Realistic MRI Simulations Based on Generalized Multi-Pool Exchange Tissue Model
Velikina, Julia V.; Block, Walter F.; Kijowski, Richard; Samsonov, Alexey A.
2017-01-01
We present MRiLab, a new comprehensive simulator for large-scale realistic MRI simulations on a regular PC equipped with a modern graphical processing unit (GPU). MRiLab combines realistic tissue modeling with numerical virtualization of an MRI system and scanning experiment to enable assessment of a broad range of MRI approaches including advanced quantitative MRI methods inferring microstructure on a sub-voxel level. A flexible representation of tissue microstructure is achieved in MRiLab by employing the generalized tissue model with multiple exchanging water and macromolecular proton pools rather than a system of independent proton isochromats typically used in previous simulators. The computational power needed for simulation of the biologically relevant tissue models in large 3D objects is gained using parallelized execution on GPU. Three simulated and one actual MRI experiments were performed to demonstrate the ability of the new simulator to accommodate a wide variety of voxel composition scenarios and demonstrate detrimental effects of simplified treatment of tissue micro-organization adapted in previous simulators. GPU execution allowed ∼200× improvement in computational speed over standard CPU. As a cross-platform, open-source, extensible environment for customizing virtual MRI experiments, MRiLab streamlines the development of new MRI methods, especially those aiming to infer quantitatively tissue composition and microstructure. PMID:28113746
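The multi-pool exchange idea the simulator generalizes can be illustrated with a minimal two-pool sketch. This is not MRiLab code: the pool parameters are illustrative and the integration is plain Euler rather than GPU-parallelized. Longitudinal magnetizations Ma and Mb relax toward equilibrium (Ma0, Mb0) while exchanging at rates kab and kba:

```python
# Two-pool longitudinal magnetization with exchange, integrated by Euler
# stepping. Rates chosen to satisfy detailed balance kab*Ma0 == kba*Mb0,
# so the equilibrium values are also the steady state.

Ma0, Mb0 = 1.0, 0.2   # equilibrium magnetizations (water, macromolecular pools)
T1a, T1b = 1.0, 0.5   # longitudinal relaxation times, s
kab, kba = 2.0, 10.0  # exchange rates, 1/s

def simulate(Ma=0.0, Mb=0.0, dt=1e-3, t_end=10.0):
    for _ in range(int(t_end / dt)):
        dMa = (Ma0 - Ma) / T1a - kab * Ma + kba * Mb
        dMb = (Mb0 - Mb) / T1b + kab * Ma - kba * Mb
        Ma, Mb = Ma + dt * dMa, Mb + dt * dMb
    return Ma, Mb

Ma, Mb = simulate()
print(Ma, Mb)  # both pools recover toward (1.0, 0.2)
```

A full simulator evolves coupled transverse and longitudinal components for every voxel; the GPU parallelism described in the abstract comes from running this kind of per-voxel integration across millions of voxels at once.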
Space lab system analysis: Advanced Solid Rocket Motor (ASRM) communications networks analysis
NASA Technical Reports Server (NTRS)
Ingels, Frank M.; Moorhead, Robert J., II; Moorhead, Jane N.; Shearin, C. Mark; Thompson, Dale R.
1990-01-01
A synopsis of research on computer viruses and computer security is presented. A review of seven technical meetings attended is compiled. A technical discussion on the communication plans for the ASRM facility is presented, with a brief tutorial on the potential local area network media and protocols.
ERIC Educational Resources Information Center
Akhtar, S.; Warburton, S.; Xu, W.
2017-01-01
In this paper we report on the use of a purpose built Computer Support Collaborative learning environment designed to support lab-based CAD teaching through the monitoring of student participation and identified predictors of success. This was carried out by analysing data from the interactive learning system and correlating student behaviour with…
A Series of Computational Neuroscience Labs Increases Comfort with MATLAB.
Nichols, David F
2015-01-01
Computational simulations allow for a low-cost, reliable means to demonstrate complex and often inaccessible concepts to undergraduates. However, students without prior computer programming training may find working with code-based simulations intimidating and distracting. A series of computational neuroscience labs involving the Hodgkin-Huxley equations, an Integrate-and-Fire model, and a Hopfield memory network was used in an undergraduate neuroscience laboratory component of an introductory-level course. Using short, focused surveys before and after each lab, student comfort levels were shown to increase drastically, from a majority of students being uncomfortable or neutral about working in the MATLAB environment to a vast majority being comfortable working in it. Though change was reported within each lab, a series of labs was necessary in order to establish a lasting high level of comfort. Comfort working with code is an important first step in acquiring the computational skills required to address many questions within neuroscience.
A Series of Computational Neuroscience Labs Increases Comfort with MATLAB
Nichols, David F.
2015-01-01
Computational simulations allow for a low-cost, reliable means to demonstrate complex and often inaccessible concepts to undergraduates. However, students without prior computer programming training may find working with code-based simulations intimidating and distracting. A series of computational neuroscience labs involving the Hodgkin-Huxley equations, an Integrate-and-Fire model, and a Hopfield memory network was used in an undergraduate neuroscience laboratory component of an introductory-level course. Using short, focused surveys before and after each lab, student comfort levels were shown to increase drastically, from a majority of students being uncomfortable or neutral about working in the MATLAB environment to a vast majority being comfortable working in it. Though change was reported within each lab, a series of labs was necessary in order to establish a lasting high level of comfort. Comfort working with code is an important first step in acquiring the computational skills required to address many questions within neuroscience. PMID:26557798
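An integrate-and-fire model of the kind used in such labs fits in a few lines. This sketch uses Python rather than MATLAB, with illustrative parameters not taken from the course materials:

```python
# Leaky integrate-and-fire neuron, Euler-integrated:
#   tau_m * dV/dt = -(V - v_rest) + r_m * I
# The membrane charges toward v_rest + r_m*I; crossing threshold emits
# a spike and resets the potential.

def simulate_lif(i_input, v_rest=-70.0, v_thresh=-55.0, v_reset=-75.0,
                 tau_m=10.0, r_m=10.0, dt=0.1, t_end=100.0):
    """Return spike times (ms) for a constant input current i_input (nA)."""
    v = v_rest
    spikes = []
    for step in range(int(t_end / dt)):
        v += dt / tau_m * (-(v - v_rest) + r_m * i_input)
        if v >= v_thresh:          # threshold crossing -> spike, then reset
            spikes.append(step * dt)
            v = v_reset
    return spikes

print(simulate_lif(1.0))        # steady state -60 mV stays below threshold: []
print(len(simulate_lif(2.0)))   # steady state -50 mV exceeds threshold: regular spiking
```

The contrast between the two calls (subthreshold vs. suprathreshold drive) is exactly the kind of concept such labs use simulation to make concrete.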
Teaching computer interfacing with virtual instruments in an object-oriented language.
Gulotta, M
1995-01-01
LabVIEW is a graphic object-oriented computer language developed to facilitate hardware/software communication. LabVIEW is a complete computer language that can be used like Basic, FORTRAN, or C. In LabVIEW one creates virtual instruments that aesthetically look like real instruments but are controlled by sophisticated computer programs. There are several levels of data acquisition VIs that make it easy to control data flow, and many signal processing and analysis algorithms come with the software as premade VIs. In the classroom, the similarity between virtual and real instruments helps students understand how information is passed between the computer and attached instruments. The software may be used in the absence of hardware so that students can work at home as well as in the classroom. This article demonstrates how LabVIEW can be used to control data flow between computers and instruments, points out important features for signal processing and analysis, and shows how virtual instruments may be used in place of physical instrumentation. Applications of LabVIEW to the teaching laboratory are also discussed, and a plausible course outline is given. PMID:8580361
Nordstrom, M A; Mapletoft, E A; Miles, T S
1995-11-01
A solution is described for the acquisition on a personal computer of standard pulses derived from neuronal discharge, measurement of neuronal discharge times, real-time control of stimulus delivery based on specified inter-pulse interval conditions in the neuronal spike train, and on-line display and analysis of the experimental data. The hardware consisted of an Apple Macintosh IIci computer and a plug-in card (National Instruments NB-MIO16) that supports A/D, D/A, digital I/O and timer functions. The software was written in the object-oriented graphical programming language LabView. Essential elements of the source code of the LabView program are presented and explained. The use of the system is demonstrated in an experiment in which the reflex responses to muscle stretch are assessed for a single motor unit in the human masseter muscle.
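The pulse-timing step described above (measuring neuronal discharge times from standard pulses, then deriving inter-pulse intervals) can be sketched in plain code. This is not the authors' LabView program; it is a minimal Python illustration with invented sample data and an assumed threshold.

```python
# Sketch of software spike-time detection from a sampled pulse train.
# Sampling rate, threshold, and the signal itself are invented for illustration.

def detect_spike_times(samples, fs, threshold):
    """Return times (s) of upward threshold crossings in a sampled signal."""
    times = []
    above = False
    for i, v in enumerate(samples):
        if v >= threshold and not above:   # rising edge only
            times.append(i / fs)
            above = True
        elif v < threshold:
            above = False
    return times

def interpulse_intervals(times):
    """Differences between consecutive discharge times."""
    return [b - a for a, b in zip(times, times[1:])]

sig = [0, 0, 5, 0, 0, 0, 5, 5, 0, 0, 5, 0]
t = detect_spike_times(sig, fs=1000.0, threshold=2.5)
print(t)                      # crossings at samples 2, 6, 10
print(interpulse_intervals(t))
```

In the real system this logic would run against samples from the A/D card; the rising-edge latch prevents a single wide pulse from being counted twice.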
ERIC Educational Resources Information Center
Swanson, Dewey A.; Phillips, Julie A.
At the Purdue University School of Technology (PST) at Columbus, Indiana, the Total Quality Management (TQM) philosophy was used in the computer laboratories to better meet student needs. A customer satisfaction survey was conducted to gather data on lab facilities, lab assistants, and hardware/software; other sections of the survey included…
LBNL Computational Research and Theory Facility Groundbreaking - Full Press Conference. Feb 1st, 2012
Yelick, Kathy
2018-01-24
Energy Secretary Steven Chu, along with Berkeley Lab and UC leaders, broke ground on the Lab's Computational Research and Theory (CRT) facility yesterday. The CRT will be at the forefront of high-performance supercomputing research and be DOE's most efficient facility of its kind. Joining Secretary Chu as speakers were Lab Director Paul Alivisatos, UC President Mark Yudof, Office of Science Director Bill Brinkman, and UC Berkeley Chancellor Robert Birgeneau. The festivities were emceed by Associate Lab Director for Computing Sciences, Kathy Yelick, and Berkeley Mayor Tom Bates joined in the shovel ceremony.
Easily Transported CCD Systems for Use in Astronomy Labs
NASA Astrophysics Data System (ADS)
Meisel, D.
1992-12-01
Relatively inexpensive CCD cameras and portable computers are now easily obtained as commercially available products. I will describe a prototype system that can be used by introductory astronomy students, even in urban environments, to obtain useful observations of the night sky. It is based on ST-4 CCDs made by the Santa Barbara Instrument Group and Macintosh PowerBook 145 computers. Students take outdoor images directly from the college campus, bring the exposures back into the lab, and download the images onto our networked server. These stored images can then be processed (at a later time) using a variety of image processing programs, including a new astronomical version of the popular freeware NIH Image package currently under development at Geneseo. The prototype of this system will be demonstrated and available for hands-on use during the meeting. This work is supported by NSF ILI Demonstration Grant USE9250493 and grants from SUNY-Geneseo.
Heave-pitch-roll analysis and testing of air cushion landing systems
NASA Technical Reports Server (NTRS)
Boghani, A. B.; Captain, K. M.; Wormley, D. N.
1978-01-01
The analytical tools (analysis and computer simulation) needed to explain and predict the dynamic operation of air cushion landing systems (ACLS) are described. The following tasks were performed: development of improved analytical models for the fan and the trunk; formulation of a heave-pitch-roll analysis for the complete ACLS; development of a general-purpose computer simulation to evaluate the landing and taxi performance of an ACLS-equipped aircraft; and verification and refinement of the analysis by comparison with test data obtained through lab testing of a prototype cushion. Demonstrations of simulation capabilities through typical landing and taxi simulations of an ACLS aircraft are given. Initial results show that fan dynamics have a major effect on system performance. Comparison with lab test data (zero forward speed) indicates that the analysis can predict most of the key static and dynamic parameters (pressure, deflection, acceleration, etc.) within a margin of 10 to 25 percent.
NASA Astrophysics Data System (ADS)
Cowell, Martin Andrew
The world already hosts more internet-connected devices than people, and that ratio is only increasing. These devices seamlessly integrate with people's lives to collect rich data and give immediate feedback about complex systems in business, health care, transportation, and security. As every aspect of global economies integrates distributed computing into its industrial systems, these systems benefit from rich datasets. Managing the power demands of these distributed computers will be paramount to ensure the continued operation of these networks, and is elegantly addressed by including local energy harvesting and storage on a per-node basis. By replacing non-rechargeable batteries with energy harvesting, wireless sensor nodes will increase their lifetimes by an order of magnitude. This work investigates the coupling of high-power energy storage with energy harvesting technologies to power wireless sensor nodes, with sections covering device manufacturing, system integration, and mathematical modeling. First we consider the energy storage mechanisms of supercapacitors and batteries, and identify favorable characteristics in both reservoir types. We then discuss experimental methods used to manufacture high-power supercapacitors in our labs. We go on to detail the integration of our fabricated devices with collaborating labs to create functional sensor node demonstrations. With the practical knowledge gained through in-lab manufacturing and system integration, we build mathematical models to aid in device and system design. First, we model the mechanism of energy storage in porous graphene supercapacitors to aid in component architecture optimization. We then model the operation of entire sensor nodes for the purpose of optimally sizing the energy harvesting and energy reservoir components.
In consideration of deploying these sensor nodes in real-world environments, we model the operation of our energy harvesting and power management systems subject to spatially and temporally varying energy availability in order to understand sensor node reliability. Looking to the future, we see an opportunity for further research to implement machine learning algorithms to control the energy resources of distributed computing networks.
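The reservoir-sizing question described above boils down to an energy balance: given harvest and load traces, does the stored energy ever run dry? A toy Python sketch, with entirely hypothetical numbers (the dissertation's actual models are far more detailed):

```python
# Toy energy-balance model for sizing a sensor node's storage reservoir.
# All traces and capacities below are invented for illustration.

def simulate_node(harvest_mw, load_mw, capacity_mj, initial_mj, dt_s=1.0):
    """harvest_mw/load_mw: per-step power traces (mW).
    Returns (minimum, final) stored energy in mJ over the simulation."""
    e = initial_mj
    e_min = e
    for h, l in zip(harvest_mw, load_mw):
        e += (h - l) * dt_s                 # mW * s = mJ
        e = max(0.0, min(e, capacity_mj))   # clamp to physical reservoir limits
        e_min = min(e_min, e)
    return e_min, e

# Day/night harvest against a constant 1 mW load over 10 one-second steps
harvest = [3, 3, 3, 3, 3, 0, 0, 0, 0, 0]
load = [1] * 10
e_min, e_final = simulate_node(harvest, load, capacity_mj=20, initial_mj=5)
print(e_min, e_final)
```

A designer would sweep `capacity_mj` and `initial_mj` against measured traces until `e_min` stays safely above zero, which is the sizing optimization the abstract alludes to.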
NASA Technical Reports Server (NTRS)
Oreilly, Daniel; Williams, Robert; Yarborough, Kevin
1988-01-01
This is a tutorial/diagnostic system for training personnel in the use of the Space Shuttle Main Engine Controller (SSMEC) Simulation Lab. It also provides a diagnostic capability for isolating lab failures at least to the major lab component. The system was implemented using HyperCard, a hypermedia program running on Apple Macintosh computers. HyperCard proved to be a viable platform for the development and use of sophisticated tutorial systems and moderately capable diagnostic systems. This tutorial/diagnostic system uses the basic HyperCard tools to provide the tutorial. The diagnostic part of the system uses a simple interpreter written in the HyperCard language (HyperTalk) to implement the backward-chaining, rule-based logic commonly found in diagnostic systems written in Prolog. Advantages of HyperCard for developing this type of system include sophisticated graphics, animation, sound and voice capabilities, its power as a hypermedia tool, and its ability to include digitized pictures. The major disadvantage is the slow execution time for evaluation of rules (due to the interpretive processing of the language). Other disadvantages include the limitation on the size of cards, the lack of support for color and grey-scale graphics, and the lack of selectable fonts for text fields.
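Backward chaining of the kind this report implements in Hypertalk is compact enough to illustrate generically. The sketch below is a minimal Python rendering of the technique, not the report's interpreter; the rules and facts are invented placeholders.

```python
# Minimal backward-chaining rule interpreter.
# Each goal maps to a list of alternative rule bodies (an OR of ANDs).
# Rule names and facts are invented placeholders, not from the SSMEC lab.

RULES = {
    "lab_failure": [["power_supply_dead"], ["signal_path_broken"]],
    "signal_path_broken": [["cable_loose", "no_output"]],
}

def prove(goal, facts, rules):
    """True if goal is a known fact, or if every subgoal of any
    rule body for it can itself be proved (recursive backward chaining)."""
    if goal in facts:
        return True
    for body in rules.get(goal, []):
        if all(prove(subgoal, facts, rules) for subgoal in body):
            return True
    return False

print(prove("lab_failure", {"cable_loose", "no_output"}, RULES))  # True
print(prove("lab_failure", {"cable_loose"}, RULES))               # False
```

Starting from the hypothesis and recursing down to observable facts is exactly what makes this style slow in an interpreted host language, the drawback the report notes.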
Research on distributed optical fiber sensing data processing method based on LabVIEW
NASA Astrophysics Data System (ADS)
Li, Zhonghu; Yang, Meifang; Wang, Luling; Wang, Jinming; Yan, Junhong; Zuo, Jing
2018-01-01
Pipeline leak detection and leak location have received extensive attention in industry. In this paper, a distributed optical fiber sensing system is designed around a heat-supply pipeline, and the data processing method for distributed optical fiber sensing based on LabVIEW is studied in depth. The hardware system includes a laser, sensing optical fiber, wavelength division multiplexer, photoelectric detector, data acquisition card, and computer. The software system is developed using LabVIEW and applies wavelet denoising to the temperature information, which improves the SNR. By extracting characteristic values from the fiber temperature information, the system realizes temperature measurement, leak location, and measurement signal storage and inquiry. Compared with the traditional negative pressure wave method or acoustic signal method, the distributed optical fiber temperature measuring system can measure several temperatures in one measurement and locate the leak point accurately. It has broad application prospects.
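Wavelet denoising, the processing step named above, can be illustrated with a one-level Haar transform and soft thresholding. The paper's actual wavelet, decomposition level, and threshold are not given here, so this Python sketch is a simplified stand-in with invented data.

```python
# One-level Haar wavelet soft-threshold denoise: transform, shrink the
# detail (high-frequency) coefficients, reconstruct. A simplified stand-in
# for the paper's wavelet denoising; signal and threshold are invented.
import math

def haar_denoise(x, thresh):
    """x: even-length signal. Soft-threshold details, then reconstruct."""
    s = math.sqrt(2.0)
    approx = [(a + b) / s for a, b in zip(x[::2], x[1::2])]
    detail = [(a - b) / s for a, b in zip(x[::2], x[1::2])]
    # Soft thresholding shrinks small (noise-dominated) details toward zero
    detail = [math.copysign(max(abs(d) - thresh, 0.0), d) for d in detail]
    out = []
    for a, d in zip(approx, detail):
        out.extend([(a + d) / s, (a - d) / s])   # inverse Haar step
    return out

# A roughly flat temperature trace with a hot spot and small jitter
clean = haar_denoise([10.0, 10.2, 9.9, 10.1, 30.0, 30.1, 10.0, 9.8], thresh=0.5)
print(clean)
```

With the threshold set to zero the transform is exactly invertible; a positive threshold removes sample-to-sample jitter while the large step (the leak's temperature anomaly) survives, which is why the method suits leak location.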
Gaspar, Paula; Carvalho, Ana L; Vinga, Susana; Santos, Helena; Neves, Ana Rute
2013-11-01
The lactic acid bacteria (LAB) are a functionally related group of low-GC Gram-positive bacteria known essentially for their roles in bioprocessing of foods and animal feeds. Due to extensive industrial use and enormous economical value, LAB have been intensively studied and a large body of comprehensive data on their metabolism and genetics was generated throughout the years. This knowledge has been instrumental in the implementation of successful applications in the food industry, such as the selection of robust starter cultures with desired phenotypic traits. The advent of genomics, functional genomics and high-throughput experimentation combined with powerful computational tools currently allows for a systems level understanding of these food industry workhorses. The technological developments in the last decade have provided the foundation for the use of LAB in applications beyond the classic food fermentations. Here we discuss recent metabolic engineering strategies to improve particular cellular traits of LAB and to design LAB cell factories for the bioproduction of added value chemicals. Copyright © 2013 Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Dupont, Stephen
2000-01-01
Presents a selection of computers and peripherals designed to enhance the classroom. They include personal digital assistants (the AlphaSmart 30001R, CalcuScribe Duo, and DreamWriter IT); new Apple products (the iBook laptop, improved iMac, and OS 9 operating system); PC options (new Gateway and Compaq computers); and gadgets (imagiLab, the QX3…
NASA Astrophysics Data System (ADS)
Tong, Qiujie; Wang, Qianqian; Li, Xiaoyang; Shan, Bin; Cui, Xuntai; Li, Chenyu; Peng, Zhong
2016-11-01
In order to satisfy requirements for real-time performance and generality, a laser target simulator for a semi-physical simulation system based on an RTX + LabWindows/CVI platform is proposed in this paper. Compared with the upper/lower-computer simulation platform architecture used in most real-time systems today, this system has better maintainability and portability. The system runs on the Windows platform, using the Windows RTX real-time extension subsystem combined with a reflective memory network to ensure real-time performance for tasks such as calculating the simulation model, transmitting the simulation data, and maintaining real-time communication. The real-time tasks of the simulation system run under the RTSS process. At the same time, LabWindows/CVI is used to build a graphical interface and to handle non-real-time tasks such as man-machine interaction and the display and storage of simulation data, which run under a Win32 process. Through the design of RTX shared memory and a task scheduling algorithm, data interaction between the real-time RTSS process and the non-real-time Win32 process is accomplished. Experimental results show that the system has strong real-time performance, high stability, and high simulation accuracy, along with good human-computer interaction.
Logistics in the Computer Lab.
ERIC Educational Resources Information Center
Cowles, Jim
1989-01-01
Discusses ways to provide good computer laboratory facilities for elementary and secondary schools. Topics discussed include establishing the computer lab and selecting hardware; types of software; physical layout of the room; printers; networking possibilities; considerations relating to the physical environment; and scheduling methods. (LRW)
Development and design of a late-model fitness test instrument based on LabVIEW
NASA Astrophysics Data System (ADS)
Xie, Ying; Wu, Feiqing
2010-12-01
Undergraduates are pioneers of China's modernization program and undertake the historic mission of rejuvenating the nation in the 21st century, so their physical fitness is vital. A smart fitness test system can help them understand their fitness and health conditions, so that they can choose suitable approaches and make practical exercise plans according to their own situation. Following these trends, a late-model fitness test instrument based on LabVIEW has been designed to remedy defects of today's instruments. The system hardware consists of five types of sensors with their peripheral circuits, an NI USB-6251 acquisition card, and a computer, while the system software, built on LabVIEW, includes modules for user registration, data acquisition, data processing and display, and data storage. The system, featuring modularization and an open structure, can be revised according to actual needs. Test results have verified the system's stability and reliability.
Robert Leland - Associate Lab Director, Scientific Computing and Energy Analysis
The Scientific Computing and Energy Analysis directorate spans applied mathematics, visualization, data, and analysis of energy systems, technologies, and policies. Leland earned his Ph.D. in mathematics from Oxford University in 1989.
The StratusLab cloud distribution: Use-cases and support for scientific applications
NASA Astrophysics Data System (ADS)
Floros, E.
2012-04-01
The StratusLab project is integrating an open cloud software distribution that enables organizations to set up and provide their own private or public IaaS (Infrastructure as a Service) computing clouds. The StratusLab distribution capitalizes on popular infrastructure virtualization solutions like KVM, the OpenNebula virtual machine manager, the Claudia service manager, and the SlipStream deployment platform, which are further enhanced and expanded with additional components developed within the project. The StratusLab distribution covers the core aspects of a cloud IaaS architecture, namely computing (life-cycle management of virtual machines), storage, appliance management, and networking. The resulting software stack provides a packaged turn-key solution for deploying cloud computing services. The cloud computing infrastructures deployed using StratusLab can support a wide range of scientific and business use cases. Grid computing has been the primary use case pursued by the project, and for this reason the initial priority has been support for the deployment and operation of fully virtualized production-level grid sites, a goal that has already been achieved by operating such a site as part of EGI's (European Grid Initiative) pan-European grid infrastructure. In this area the project is currently working to provide non-trivial capabilities like elastic and autonomic management of grid site resources. Although grid computing has been the motivating paradigm, StratusLab's cloud distribution can support a wider range of use cases. In this direction, we have developed and currently support setting up general-purpose computing solutions like Hadoop, MPI, and Torque clusters. Regarding scientific applications, the project is collaborating closely with the bioinformatics community in order to prepare VM appliances and deploy optimized services for bioinformatics applications.
In a similar manner, additional scientific disciplines like Earth Science can take advantage of StratusLab cloud solutions. Interested users are welcome to join StratusLab's user community by getting access to the reference cloud services deployed by the project and offered to the public.
The Computer-Networked Writing Lab: One Instructor's View. ERIC Digest.
ERIC Educational Resources Information Center
Puccio, P. M.
According to an instructor of basic writing in the Writing Lab at the University of Massachusetts in Amherst, he can teach differently in a computer-networked writing lab than he did in a conventional classroom. Because the room is designed to teach writing and nothing else, it offers a congenial workspace where the teacher can interact with…
Integrating Multiple On-line Knowledge Bases for Disease-Lab Test Relation Extraction.
Zhang, Yaoyun; Soysal, Ergin; Moon, Sungrim; Wang, Jingqi; Tao, Cui; Xu, Hua
2015-01-01
A computable knowledge base containing relations between diseases and lab tests would be a great resource for many biomedical informatics applications. This paper describes our initial step towards establishing a comprehensive knowledge base of disease and lab test relations utilizing three public on-line resources. LabTestsOnline, MedlinePlus, and Wikipedia are integrated to create a freely available, computable disease-lab test knowledge base. Disease and lab test concepts are identified using MetaMap, and relations between diseases and lab tests are determined based on source-specific rules. Experimental results demonstrate a high precision for relation extraction, with Wikipedia achieving the highest precision of 87%. Combining the three sources reached a recall of 51.40% when compared with a subset of disease-lab test relations extracted from a reference book. Moreover, we found additional disease-lab test relations in the on-line resources, indicating they are complementary to existing reference books for building a comprehensive disease and lab test relation knowledge base.
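The precision and recall figures quoted above come from comparing extracted (disease, lab test) pairs against a reference set, which is straightforward to make concrete; the example pairs below are invented, not from the paper's data.

```python
# Precision/recall over (disease, lab_test) relation pairs.
# The extracted and reference pairs here are invented examples.

def precision_recall(extracted, reference):
    """Both arguments are sets of (disease, lab_test) tuples."""
    tp = len(extracted & reference)                      # true positives
    precision = tp / len(extracted) if extracted else 0.0
    recall = tp / len(reference) if reference else 0.0
    return precision, recall

extracted = {("diabetes", "HbA1c"), ("diabetes", "chest X-ray"), ("anemia", "CBC")}
reference = {("diabetes", "HbA1c"), ("anemia", "CBC"),
             ("anemia", "ferritin"), ("gout", "uric acid")}
p, r = precision_recall(extracted, reference)
print(p, r)   # 2 of 3 extracted pairs are correct; 2 of 4 reference pairs found
```

As in the paper, precision is judged against what was extracted and recall against the reference book, so the two can diverge widely when the sources are complementary.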
Pal, Reshmi; Mendelson, John; Clavier, Odile; Baggott, Mathew J; Coyle, Jeremy; Galloway, Gantt P
2016-01-01
In methamphetamine (MA) users, drug-induced neurocognitive deficits may help to determine treatment, monitor adherence, and predict relapse. To measure these relationships, we developed an iPhone app (Neurophone) to compare lab and field performance of N-Back, Stop Signal, and Stroop tasks that are sensitive to MA-induced deficits. Twenty healthy controls and 16 MA-dependent participants performed the tasks in-lab using a validated computerized platform and the Neurophone before taking the latter home and performing the tasks twice daily for two weeks. N-Back task: there were no clear differences in performance between computer-based and phone-based in-lab tests, or between phone-based in-lab and phone-based in-field tests. Stop Signal task: differences in parameters prevented comparison of the computer-based and phone-based versions. There was a significant difference in phone performance between field and lab. Stroop task: response time measured by the speech recognition engine lacked the precision to yield quantifiable results. There was no learning effect over time. On average, each participant completed 84.3% of the in-field N-Back tasks and 90.4% of the in-field Stop Signal tasks (MA-dependent participants: 74.8% and 84.3%; healthy controls: 91.4% and 95.0%, respectively). Participants rated the Neurophone easy to use. Cognitive tasks performed in-field using the Neurophone have the potential to yield results comparable to those obtained in a laboratory setting. Tasks need to be modified for use, as the app's voice recognition system is not yet adequate for timed tests.
Learning Evolution and the Nature of Science Using Evolutionary Computing and Artificial Life
ERIC Educational Resources Information Center
Pennock, Robert T.
2007-01-01
Because evolution in natural systems happens so slowly, it is difficult to design inquiry-based labs where students can experiment and observe evolution in the way they can when studying other phenomena. New research in evolutionary computation and artificial life provides a solution to this problem. This paper describes a new A-Life software…
LINUX, Virtualization, and the Cloud: A Hands-On Student Introductory Lab
ERIC Educational Resources Information Center
Serapiglia, Anthony
2013-01-01
Many students are entering Computer Science education with limited exposure to operating systems and applications other than those produced by Apple or Microsoft. This gap in familiarity with the Open Source community can quickly be bridged with a simple exercise that can also be used to strengthen two other important current computing concepts,…
ERIC Educational Resources Information Center
Ruddick, Kristie R.; Parrill, Abby L.; Petersen, Richard L.
2012-01-01
In this study, a computational molecular orbital theory experiment was implemented in a first-semester honors general chemistry course. Students used the GAMESS (General Atomic and Molecular Electronic Structure System) quantum mechanical software (as implemented in ChemBio3D) to optimize the geometry for various small molecules. Extended Huckel…
Department of Defense In-House RDT and E Activities: Management Analysis Report for Fiscal Year 1993
1994-11-01
A worldwide unique lab because it houses a high-speed modeling and simulation system, a prototype...E Division, San Diego, CA: High Performance Computing Laboratory providing a wide range of advanced computer systems for the scientific investigation...Machines CM-200 and a 256-node Thinking Machines CM-5. The CM-5 is in a very-large-memory, high-performance (32 Gbytes, >40 GFlop) configuration,
NASA Astrophysics Data System (ADS)
Smith, Wilford; Nunez, Patrick
2005-05-01
This paper describes the work being performed under the RDECOM Power and Energy (P&E) program (formerly the Combat Hybrid Power System (CHPS) program) developing hybrid power system models and integrating them into larger simulations, such as OneSAF, that can be used to find duty cycles to feed designers of hybrid power systems. This paper also describes efforts underway to link the TARDEC P&E System Integration Lab (SIL) in San Jose CA to the TARDEC Ground Vehicle Simulation Lab (GVSL) in Warren, MI. This linkage is being performed to provide a methodology for generating detailed driver profiles for use in the development of vignettes and mission profiles for system design excursions.
ERIC Educational Resources Information Center
Furberg, Anniken
2016-01-01
This paper reports on a study of teacher support in a setting where students engaged with computer-supported collaborative learning (CSCL) in science. The empirical basis is an intervention study where secondary school students and their teacher performed a lab experiment in genetics supported by a digital learning environment. The analytical…
Jackman, Patrick; Sun, Da-Wen; Elmasry, Gamal
2012-08-01
A new algorithm for the conversion of device dependent RGB colour data into device independent L*a*b* colour data without introducing noticeable error has been developed. By combining a linear colour space transform and advanced multiple regression methodologies it was possible to predict L*a*b* colour data with less than 2.2 colour units of error (CIE 1976). By transforming the red, green and blue colour components into new variables that better reflect the structure of the L*a*b* colour space, a low colour calibration error was immediately achieved (ΔE(CAL) = 14.1). Application of a range of regression models on the data further reduced the colour calibration error substantially (multilinear regression ΔE(CAL) = 5.4; response surface ΔE(CAL) = 2.9; PLSR ΔE(CAL) = 2.6; LASSO regression ΔE(CAL) = 2.1). Only the PLSR models deteriorated substantially under cross validation. The algorithm is adaptable and can be easily recalibrated to any working computer vision system. The algorithm was tested on a typical working laboratory computer vision system and delivered only a very marginal loss of colour information ΔE(CAL) = 2.35. Colour features derived on this system were able to safely discriminate between three classes of ham with 100% correct classification whereas colour features measured on a conventional colourimeter were not. Copyright © 2012 Elsevier Ltd. All rights reserved.
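The error metric used throughout this abstract, ΔE under CIE 1976, is simply the Euclidean distance in L*a*b* space. A minimal Python rendering (the sample colour triples are invented, not the paper's ham data):

```python
# CIE 1976 colour difference (Delta E*ab): Euclidean distance between
# two (L*, a*, b*) triples. Sample values are invented for illustration.
import math

def delta_e76(lab1, lab2):
    """Return the CIE 1976 colour difference between two L*a*b* triples."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(lab1, lab2)))

print(delta_e76((50.0, 10.0, 10.0), (52.0, 11.0, 8.0)))  # sqrt(4 + 1 + 4) = 3.0
```

A calibration error of ΔE around 2, as the paper's best regressions achieve, is generally near the threshold of what an observer notices, which is why the authors can call the residual loss of colour information marginal.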
Synchronized Pair Configuration in Virtualization-Based Lab for Learning Computer Networks
ERIC Educational Resources Information Center
Kongcharoen, Chaknarin; Hwang, Wu-Yuin; Ghinea, Gheorghita
2017-01-01
More studies are concentrating on using virtualization-based labs to facilitate computer or network learning concepts. Some benefits are lower hardware costs and greater flexibility in reconfiguring computer and network environments. However, few studies have investigated effective mechanisms for using virtualization fully for collaboration.…
Proceedings of Tenth Annual Software Engineering Workshop
NASA Technical Reports Server (NTRS)
1985-01-01
Papers are presented on the following topics: measurement of software technology, recent studies of the Software Engineering Lab, software management tools, expert systems, error seeding as a program validation technique, software quality assurance, software engineering environments (including knowledge-based environments), the Distributed Computing Design System, and various Ada experiments.
Software platform virtualization in chemistry research and university teaching
2009-01-01
Background Modern chemistry laboratories operate with a wide range of software applications under different operating systems, such as Windows, LINUX or Mac OS X. Instead of installing software on different computers it is possible to install those applications on a single computer using Virtual Machine software. Software platform virtualization allows a single guest operating system to execute multiple other operating systems on the same computer. We apply and discuss the use of virtual machines in chemistry research and teaching laboratories. Results Virtual machines are commonly used for cheminformatics software development and testing. Benchmarking multiple chemistry software packages we have confirmed that the computational speed penalty for using virtual machines is low and around 5% to 10%. Software virtualization in a teaching environment allows faster deployment and easy use of commercial and open source software in hands-on computer teaching labs. Conclusion Software virtualization in chemistry, mass spectrometry and cheminformatics is needed for software testing and development of software for different operating systems. In order to obtain maximum performance the virtualization software should be multi-core enabled and allow the use of multiprocessor configurations in the virtual machine environment. Server consolidation, by running multiple tasks and operating systems on a single physical machine, can lead to lower maintenance and hardware costs especially in small research labs. The use of virtual machines can prevent software virus infections and security breaches when used as a sandbox system for internet access and software testing. Complex software setups can be created with virtual machines and are easily deployed later to multiple computers for hands-on teaching classes. We discuss the popularity of bioinformatics compared to cheminformatics as well as the missing cheminformatics education at universities worldwide. PMID:20150997
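The 5% to 10% speed penalty quoted above is a relative-slowdown calculation that is easy to make explicit; the timings below are hypothetical, not the paper's benchmark data.

```python
# Relative slowdown of a benchmark run inside a VM versus native.
# The native/VM timings are hypothetical examples, not measured values.

def vm_overhead_pct(native_seconds, vm_seconds):
    """Percent slowdown of the virtualized run relative to native."""
    return 100.0 * (vm_seconds - native_seconds) / native_seconds

print(vm_overhead_pct(120.0, 128.4))  # a run inside the quoted 5-10% band
```

Repeating such runs across several chemistry packages, as the authors did, is what supports the claim that the virtualization penalty is low.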
Using LabVIEW to facilitate calibration and verification for respiratory impedance plethysmography.
Ellis, W S; Jones, R T
1991-12-01
A system for calibrating the Respitrace impedance plethysmograph was developed with the capacity to quantitatively verify the accuracy of calibration. LabVIEW software was used on a Macintosh II computer to create a user-friendly environment, with the added benefit of reducing development time. The system developed enabled a research assistant to calibrate the Respitrace within 15 min while achieving an accuracy within the normally accepted 10% deviation when the Respitrace output is compared to a water spirometer standard. The system and methods described were successfully used in a study of 10 subjects smoking cigarettes containing marijuana or cocaine under four conditions, calibrating all subjects to 10% accuracy within 15 min.
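The abstract does not detail the calibration math. A common approach for respiratory plethysmography, assumed here rather than taken from the paper, is to fit spirometer volume as a weighted sum of the rib-cage and abdomen band signals by least squares and then verify the worst-case percent deviation against the spirometer standard. A sketch with made-up sample values:

```python
import numpy as np

# Hypothetical band signals (arbitrary units) and spirometer volumes (litres).
rc  = np.array([0.2, 0.5, 0.9, 1.3, 1.7])   # rib-cage band
ab  = np.array([0.1, 0.4, 0.7, 1.1, 1.4])   # abdomen band
vol = np.array([0.32, 0.92, 1.64, 2.44, 3.16])

# Fit vol ~ a*rc + b*ab by linear least squares.
A = np.column_stack([rc, ab])
(a, b), *_ = np.linalg.lstsq(A, vol, rcond=None)

pred = a * rc + b * ab
worst_dev = 100 * np.max(np.abs(pred - vol) / vol)  # worst-case % deviation
```

A `worst_dev` under 10% would correspond to the "normally accepted 10% deviation" criterion mentioned above.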
Innovative Use of a Classroom Response System During Physics Lab
NASA Astrophysics Data System (ADS)
Walgren, Jay
2011-01-01
More and more physics instructors are making use of personal/classroom response systems or "clickers." The use of clickers to engage students with multiple-choice questions during lecture and available instructor resources for clickers have been well documented in this journal.1-4 Newer-generation clickers, which I refer to as classroom response systems (CRS), have evolved to accept numeric answers (such as 9.81) instead of just single "multiple-choice" entries (Fig. 1). This advancement is available from most major clicker companies and allows for a greater variety of engaging questions during lecture. In addition, these new "numeric ready" clickers are marketed to be used for student assessments. During a test or quiz, students' answers are entered into their clickers instead of on paper or Scantron® and immediately transmitted by wireless connection to a computer for grading and analysis. I recognize the usefulness and benefit these new-generation CRSs provide for many instructors. However, I do not use my CRS in either of the aforementioned activities. Instead, I use it in an unconventional way: I use the CRS to electronically capture students' lab data as they are performing a physics lab (Fig. 2). I set up the clickers as if I were going to use them for a test, but instead of entering answers to a test, my students enter lab data as they collect it. In this paper I discuss my use of a classroom response system during physics laboratory and three benefits that result: 1) Students are encouraged to "take ownership of" and "have integrity with" their physics lab data. 2) Students' measuring and unit conversion deficiencies are identified immediately during the lab. 3) The process of grading students' labs is simplified because the results of each student's lab calculations can be pre-calculated for the instructor using a spreadsheet. My use of clickers during lab can be implemented with most clicker systems available to instructors today.
The CRS I use is eInstruction's® Classroom Performance System™ (CPS™).5 (Fig. 1)
Community College Uses a Video-Game Lab to Lure Students to Computer Courses
ERIC Educational Resources Information Center
Young, Jeffrey R.
2007-01-01
A computer lab has become one of the most popular hangouts at Northern Virginia Community College after officials decided to load its PCs with popular video games, install a PlayStation and an Xbox, and declare it "for gamers only." The goal of this lab is to entice students to take game-design and other IT courses. John Min, dean of…
NASA Astrophysics Data System (ADS)
Manley, J.; Chegwidden, D.; Mote, A. S.; Ledley, T. S.; Lynds, S. E.; Haddad, N.; Ellins, K.
2016-02-01
EarthLabs, envisioned as a national model for high school Earth or Environmental Science lab courses, is adaptable for both undergraduate and middle school students. The collection includes ten online modules that combine to feature a global view of our planet as a dynamic, interconnected system by engaging learners in extended investigations. EarthLabs supports state and national guidelines, including the NGSS, for science content. Four modules directly guide students to discover vital aspects of the oceans, while five other modules incorporate ocean sciences to complete an understanding of Earth's climate system. Students gain a broad perspective on the key role oceans play in the fishing industry, droughts, coral reefs, hurricanes, the carbon cycle, and life on land and in the seas, as well as in driving our changing climate, by interacting with scientific research data, satellite imagery, numerical data, computer visualizations, experiments, and video tutorials. Students explore Earth system processes and build quantitative skills that enable them to objectively evaluate scientific findings for themselves as they move through ordered sequences that guide the learning. As a robust collection, EarthLabs modules engage students in extended, rigorous investigations allowing a deeper understanding of the ocean, climate and weather. This presentation provides an overview of the ten curriculum modules that comprise the EarthLabs collection developed by TERC and found at http://serc.carleton.edu/earthlabs/index.html. Evaluation data on the effectiveness and use in secondary education classrooms will be summarized.
A LabVIEW-Based Virtual Instrument System for Laser-Induced Fluorescence Spectroscopy
Wu, Qijun; Wang, Lufei; Zu, Lily
2011-01-01
We report the design and operation of a Virtual Instrument (VI) system based on LabVIEW 2009 for laser-induced fluorescence experiments. This system achieves synchronous control of equipment and acquisition of real-time fluorescence data communicating with a single computer via GPIB, USB, RS232, and parallel ports. The reported VI system can also accomplish data display, saving, and analysis, and printing the results. The VI system performs sequences of operations automatically, and this system has been successfully applied to obtain the excitation and dispersion spectra of α-methylnaphthalene. The reported VI system opens up new possibilities for researchers and increases the efficiency and precision of experiments. The design and operation of the VI system are described in detail in this paper, and the advantages that this system can provide are highlighted. PMID:22013388
Custovic, Adnan; Ainsworth, John; Arshad, Hasan; Bishop, Christopher; Buchan, Iain; Cullinan, Paul; Devereux, Graham; Henderson, John; Holloway, John; Roberts, Graham; Turner, Steve; Woodcock, Ashley; Simpson, Angela
2015-01-01
We created Asthma e-Lab, a secure web-based research environment to support consistent recording, description and sharing of data, computational/statistical methods and emerging findings across the five UK birth cohorts. The e-Lab serves as a data repository for our unified dataset and provides the computational resources and a scientific social network to support collaborative research. All activities are transparent, and emerging findings are shared via the e-Lab, linked to explanations of analytical methods, thus enabling knowledge transfer. The e-Lab facilitates the iterative interdisciplinary dialogue between clinicians, statisticians, computer scientists, mathematicians, geneticists and basic scientists, capturing collective thought behind the interpretations of findings. PMID:25805205
A Macintosh-Based Scientific Images Video Analysis System
NASA Technical Reports Server (NTRS)
Groleau, Nicolas; Friedland, Peter (Technical Monitor)
1994-01-01
A set of experiments was designed at MIT's Man-Vehicle Laboratory in order to evaluate the effects of zero gravity on the human orientation system. During many of these experiments, the movements of the eyes are recorded on high quality video cassettes. The images must be analyzed off-line to calculate the position of the eyes at every moment in time. To this end, I have implemented a simple, inexpensive computerized system which measures the angle of rotation of the eye from digitized video images. The system is implemented on a desktop Macintosh computer, processes one play-back frame per second and exhibits adequate levels of accuracy and precision. The system uses LabVIEW, a digital output board, and a video input board to control a VCR, digitize video images, analyze them, and provide a user-friendly interface for the various phases of the process. The system uses the Concept Vi LabVIEW library (Graftek's Image, Meudon la Foret, France) for image grabbing and displaying as well as translation to and from LabVIEW arrays. Graftek's software layer drives an Image Grabber board from Neotech (Eastleigh, United Kingdom). A Colour Adapter box from Neotech provides adequate video signal synchronization. The system also requires a LabVIEW-driven digital output board (MacADIOS II from GW Instruments, Cambridge, MA) controlling a slightly modified VCR remote control used mainly to advance the video tape frame by frame.
NASA Technical Reports Server (NTRS)
Schlecht, Leslie E.; Kutler, Paul (Technical Monitor)
1998-01-01
This is a proposal for a general-use system, based on the SGI IRIS workstation platform, for recording computer animation to videotape. In addition, this system would provide features for simple editing and enhancement. Described here are a list of requirements for the system and a proposed configuration including the SGI VideoLab Integrator, the VideoMedia VLAN animation controller and the Pioneer rewritable laserdisc recorder.
Virtual Computing Laboratories: A Case Study with Comparisons to Physical Computing Laboratories
ERIC Educational Resources Information Center
Burd, Stephen D.; Seazzu, Alessandro F.; Conway, Christopher
2009-01-01
Current technology enables schools to provide remote or virtual computing labs that can be implemented in multiple ways ranging from remote access to banks of dedicated workstations to sophisticated access to large-scale servers hosting virtualized workstations. This paper reports on the implementation of a specific lab using remote access to…
Integration of Computer Technology Into an Introductory-Level Neuroscience Laboratory
ERIC Educational Resources Information Center
Evert, Denise L.; Goodwin, Gregory; Stavnezer, Amy Jo
2005-01-01
We describe 3 computer-based neuroscience laboratories. In the first 2 labs, we used commercially available interactive software to enhance the study of functional and comparative neuroanatomy and neurophysiology. In the remaining lab, we used customized software and hardware in 2 psychophysiological experiments. With the use of the computer-based…
ERIC Educational Resources Information Center
Elmore, Donald E.; Guayasamin, Ryann C.; Kieffer, Madeleine E.
2010-01-01
As computational modeling plays an increasingly central role in biochemical research, it is important to provide students with exposure to common modeling methods in their undergraduate curriculum. This article describes a series of computer labs designed to introduce undergraduate students to energy minimization, molecular dynamics simulations,…
NBodyLab Simulation Experiments with GRAPE-6a and MD-GRAPE2 Acceleration
NASA Astrophysics Data System (ADS)
Johnson, V.; Ates, A.
2005-12-01
NBodyLab is an astrophysical N-body simulation testbed for student research. It is accessible via a web interface and runs as a backend framework under Linux. NBodyLab can generate data models or perform star catalog lookups, transform input data sets, perform direct summation gravitational force calculations using a variety of integration schemes, and produce analysis and visualization output products. NEMO (Teuben 1994), a popular stellar dynamics toolbox, is used for some functions. NBodyLab integrators can optionally utilize two types of low-cost desktop supercomputer accelerators, the newly available GRAPE-6a (125 Gflops peak) and the MD-GRAPE2 (64-128 Gflops peak). The initial version of NBodyLab was presented at ADASS 2002. This paper summarizes software enhancements developed subsequently, focusing on GRAPE-6a related enhancements, and gives examples of computational experiments and astrophysical research, including star cluster and solar system studies, that can be conducted with the new testbed functionality.
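The direct-summation force calculation mentioned above is an O(N²) sum over all pairs. A minimal NumPy sketch with Plummer-style softening and one leapfrog step, where the scheme and parameter choices are illustrative and not necessarily NBodyLab's:

```python
import numpy as np

def accelerations(pos, mass, eps=1e-3):
    """Direct O(N^2) summation of Newtonian gravity (G=1), softened by eps."""
    acc = np.zeros_like(pos)
    for i in range(len(mass)):
        dr = pos - pos[i]                      # vectors to every other body
        r2 = np.sum(dr * dr, axis=1) + eps**2  # softened squared distances
        r2[i] = 1.0                            # placeholder to avoid 0-division
        inv_r3 = r2 ** -1.5
        inv_r3[i] = 0.0                        # no self-force
        acc[i] = np.sum((mass * inv_r3)[:, None] * dr, axis=0)
    return acc

def leapfrog_step(pos, vel, mass, dt):
    """One kick-drift-kick leapfrog integration step."""
    vel = vel + 0.5 * dt * accelerations(pos, mass)
    pos = pos + dt * vel
    vel = vel + 0.5 * dt * accelerations(pos, mass)
    return pos, vel
```

A GRAPE board accelerates exactly the pairwise sum inside `accelerations`, which dominates the cost for large N.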
Hands in space: gesture interaction with augmented-reality interfaces.
Billinghurst, Mark; Piumsomboon, Tham; Huidong Bai
2014-01-01
Researchers at the Human Interface Technology Laboratory New Zealand (HIT Lab NZ) are investigating free-hand gestures for natural interaction with augmented-reality interfaces. They've applied the results to systems for desktop computers and mobile devices.
Assessing Usage and Maximizing Finance Lab Impact: A Case Exploration
ERIC Educational Resources Information Center
Noguera, Magdy; Budden, Michael Craig; Silva, Alberto
2011-01-01
This paper reports the results of a survey conducted to assess students' usage and perceptions of a finance lab. Finance labs differ from simple computer labs as they typically contain data boards, streaming market quotes, terminals and software that allow for real-time financial analyses. Despite the fact that such labs represent significant and…
Auto-tuning system for NMR probe with LabView
NASA Astrophysics Data System (ADS)
Quen, Carmen; Mateo, Olivia; Bernal, Oscar
2013-03-01
The typical manual NMR-tuning method is not suitable for broadband spectra spanning several megahertz linewidths. Among the main problems encountered during manual tuning are pulse-power reproducibility, baselines, and transmission line reflections, to name a few. We present the design of an auto-tuning system using the graphical programming language LabVIEW to minimize these problems. The program is designed to analyze the detected power signal of an antenna near the NMR probe and use this analysis to automatically tune the sample coil to match the impedance of the spectrometer (50 Ω). The tuning capacitors of the probe are controlled by a stepper motor through a LabVIEW/computer interface. Our program calculates the area of the power signal as an indicator to control the motor, so that disconnecting the coil to tune it through a network analyzer is unnecessary. Work supported by NSF-DMR 1105380.
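The tuning loop described above, using the area of the detected power signal as an indicator and stepping the motor until the indicator stops improving, can be sketched as a greedy search. The function names and stopping rule below are a hypothetical reconstruction, not the authors' LabVIEW code, and the sketch minimises the indicator (whether the area is minimised or maximised depends on the setup):

```python
def auto_tune(reflected_area, step_motor, max_steps=200):
    """Greedy search over stepper positions for the best impedance match.

    reflected_area(): measure the power-signal 'area' indicator
                      (assumed lower = better matched).
    step_motor(n):    turn the tuning capacitor by n motor steps (signed).
    """
    best = reflected_area()
    direction = +1
    reversals = 0
    for _ in range(max_steps):
        step_motor(direction)
        now = reflected_area()
        if now < best:
            best = now
            reversals = 0
        else:
            step_motor(-direction)       # undo the unhelpful step
            direction = -direction       # try the other way
            reversals += 1
            if reversals == 2:           # worse in both directions: done
                break
    return best
```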
ASPIRE: An Authoring System and Deployment Environment for Constraint-Based Tutors
ERIC Educational Resources Information Center
Mitrovic, Antonija; Martin, Brent; Suraweera, Pramuditha; Zakharov, Konstantin; Milik, Nancy; Holland, Jay; McGuigan, Nicholas
2009-01-01
Over the last decade, the Intelligent Computer Tutoring Group (ICTG) has implemented many successful constraint-based Intelligent Tutoring Systems (ITSs) in a variety of instructional domains. Our tutors have proven their effectiveness not only in controlled lab studies but also in real classrooms, and some of them have been commercialized.…
Continuous-Grouped-Self-Learning: In the Perspective of Lecturers, Tutors and Laboratory Instructors
ERIC Educational Resources Information Center
Azau, Mohd Azrin Mohd; Yao, Low Ming; Aik, Goo Soon; Yeong, Chin Kock; Nor, Mohamad Nizam; Abdullah, Ahmad Yusri; Jamil, Mohd Hafidz Mohamad; Yahya, Nasiruddin; Abas, Ahmad Fauzi; Saripan, M. Iqbal
2009-01-01
This paper presents the perception of lecturers, tutors and lab instructors towards the implemented Continuous-Group-Self-Learning (CGSL) in the Department of Computer and Communication System Engineering (CCSE), Universiti Putra Malaysia. This innovative system introduces mock teaching and student-lecturer role as a technique of delivery. The…
Implementation of Project Based Learning in Mechatronic Lab Course at Bandung State Polytechnic
ERIC Educational Resources Information Center
Basjaruddin, Noor Cholis; Rakhman, Edi
2016-01-01
Mechatronics is a multidisciplinary field that includes a combination of mechanics, electronics, control systems, and computer science. The main objective of mechatronics learning is to establish a comprehensive mindset in the development of mechatronic systems. Project Based Learning (PBL) is an appropriate method for use in the learning process of…
Optics and optics-based technologies education with the benefit of LabVIEW
NASA Astrophysics Data System (ADS)
Wan, Yuhong; Man, Tianlong; Tao, Shiquan
2015-10-01
The details of the design and implementation of incoherent digital holographic experiments based on LabVIEW are demonstrated in this work, in order to offer a teaching model that makes full use of LabVIEW as an educational tool. Digital incoherent holography enables holograms to be recorded from incoherent light with just a digital camera and a spatial light modulator, and the three-dimensional properties of the specimen are revealed after the hologram is reconstructed in the computer. The experiment of phase-shifting incoherent digital holography is designed and implemented based on the principle of Fresnel incoherent correlation holography. An automatic control application is developed based on LabVIEW, which combines the functions of major experimental hardware control and digital reconstruction of the holograms. The basic functions of the system are completed and a user-friendly interface is provided for easy operation. The students are encouraged and stimulated to learn and practice the basic principles of incoherent digital holography and other related optics-based technologies during the programming of the application and implementation of the system.
The Effectiveness of Using Virtual Laboratories to Teach Computer Networking Skills in Zambia
ERIC Educational Resources Information Center
Lampi, Evans
2013-01-01
The effectiveness of using virtual labs to train students in computer networking skills, when real equipment is limited or unavailable, is uncertain. The purpose of this study was to determine the effectiveness of using virtual labs to train students in the acquisition of computer network configuration and troubleshooting skills. The study was…
Laboratory Directed Research and Development Program FY 2006
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hansen
2007-03-08
The Ernest Orlando Lawrence Berkeley National Laboratory (Berkeley Lab or LBNL) is a multi-program national research facility operated by the University of California for the Department of Energy (DOE). As an integral element of DOE's National Laboratory System, Berkeley Lab supports DOE's missions in fundamental science, energy resources, and environmental quality. Berkeley Lab programs advance four distinct goals for DOE and the nation: (1) To perform leading multidisciplinary research in the computing sciences, physical sciences, energy sciences, biosciences, and general sciences in a manner that ensures employee and public safety and protection of the environment. (2) To develop and operate unique national experimental facilities for qualified investigators. (3) To educate and train future generations of scientists and engineers to promote national science and education goals. (4) To transfer knowledge and technological innovations and to foster productive relationships among Berkeley Lab's research programs, universities, and industry in order to promote national economic competitiveness.
Chaudhry, Fouad A; Ismail, Sanaa Z; Davis, Edward T
2018-05-01
Computer-assisted navigation techniques are used to optimise component placement and alignment in total hip replacement. The technology has developed over the last 10 years, but despite its advantages only 0.3% of all total hip replacements in England and Wales are done using computer navigation. One of the reasons for this is that computer-assisted technology increases operative time. A new method of pelvic registration has been developed without the need to register the anterior pelvic plane (BrainLab hip 6.0), which has been shown to improve the accuracy of THR. The purpose of this study was to find out if the new method reduces the operating time. This was a retrospective analysis comparing operating time in computer-navigated primary uncemented total hip replacement using two methods of registration. Group 1 included 128 cases performed using BrainLab versions 2.1-5.1. This version relied on the acquisition of the anterior pelvic plane for registration. Group 2 included 128 cases performed using the newest navigation software, BrainLab hip 6.0 (registration possible with the patient in the lateral decubitus position). The operating time was 65.79 (40-98) minutes using the old method of registration and 50.87 (33-74) minutes using the new method. This difference was statistically significant. The body mass index (BMI) was comparable in both groups. The study supports the use of the new method of registration to improve operating time in computer-navigated primary uncemented total hip replacements.
Report on a Highly Used Computer Aid for the Professor in his Grade and Record Keeping Tasks.
ERIC Educational Resources Information Center
Brockmeier, Richard
SPARS is a computer data base management system designed to aid the college professor in handling his students' grades and other classroom data. It can handle multiple sections and labs, and allows the professor to combine and separate these components in a variety of ways. SPARS seeks to meet the sometimes competing goals of simplicity of use and…
A simple computer-based measurement and analysis system of pulmonary auscultation sounds.
Polat, Hüseyin; Güler, Inan
2004-12-01
Listening to various lung sounds has proven to be an important diagnostic tool for detecting and monitoring certain types of lung diseases. In this study a computer-based system has been designed for easy measurement and analysis of lung sounds using the software package DasyLAB. The designed system presents the following features: it is able to digitally record the lung sounds captured with an electronic stethoscope plugged into a sound card on a portable computer, display the lung sound waveform for auscultation sites, record the lung sound in ASCII format, acoustically reproduce the lung sound, edit and print the sound waveforms, display its time-expanded waveform, compute the Fast Fourier Transform (FFT), and display the power spectrum and spectrogram.
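The FFT/power-spectrum step has a direct NumPy analogue to the DasyLAB worksheet. A sketch on a synthetic signal, where the 150 Hz tone and 8 kHz sound-card rate are hypothetical:

```python
import numpy as np

fs = 8000                          # Hz, hypothetical sound-card sampling rate
t = np.arange(0, 1.0, 1.0 / fs)
rng = np.random.default_rng(0)
# Synthetic "lung sound": a 150 Hz component buried in noise.
x = np.sin(2 * np.pi * 150 * t) + 0.3 * rng.standard_normal(t.size)

# One-sided power spectrum via the FFT.
power = np.abs(np.fft.rfft(x)) ** 2 / x.size
freqs = np.fft.rfftfreq(x.size, d=1.0 / fs)
peak_hz = freqs[np.argmax(power)]  # dominant frequency component
```

A spectrogram is the same computation applied to short overlapping windows of `x`.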
The College of Charleston's 400-Student Observational Lab Program
NASA Astrophysics Data System (ADS)
True, C. M.
2006-06-01
For over thirty years the College of Charleston has been teaching a year-long introductory astronomy course incorporating a mandatory 3-hour lab. Despite our location in a very light-polluted, coastal, high-humidity, and often cloudy metropolitan area, we have emphasized observational activities as much as possible. To accommodate our population of 300-400 students per semester, we have 28 8-inch Celestron telescopes and 25 GPS-capable 8-inch Meade LX-200 telescopes. Finally, we have a 16-inch DFM telescope adjacent to our rooftop observing decks. For indoor activities we have access to 42 computers running a variety of astronomy education software. Some of the computer activities are based on the Starry Night software (Backyard and Pro), the CLEA software from Gettysburg College, and Spectrum Explorer from Boston University. Additionally, we have labs involving cratering, eclipses and phases, coordinate systems with celestial globes, the inverse square law, spectroscopy and spectral classification, as well as others. In this presentation we will discuss the difficulties in managing a program of this size. We have approximately 14 lab sections a week. The lab manager's task involves coordinating 8-10 lab instructors and the same number of undergraduate teaching assistants, as well as trying to maintain a coherent experience between the labs and lecture sections. Our lab manuals are produced locally with yearly updates. Samples from the manuals will be available. This program has been developed by a large number of College of Charleston astronomy faculty, including Don Drost, Bob Dukes, Chris Fragile, Tim Giblin, Jon Hakkila, Bill Kubinec, Lee Lindner, Jim Neff, Laura Penny, Al Rainis, Terry Richardson, and D. J. Williams, as well as adjunct and visiting faculty Bill Baird, Kevin Bourque, Ethan Denault, Kwayera Davis, Francie Halter, and Alan Johnson. Part of this work has been funded by NSF DUE grants to the College of Charleston.
Using Computer Simulations to Integrate Learning.
ERIC Educational Resources Information Center
Liao, Thomas T.
1983-01-01
Describes the primary design criteria and the classroom activities involved in "The Yellow Light Problem," a minicourse on decision making in the secondary school Mathematics, Engineering and Science Achievement (MESA) program in California. Activities include lectures, discussions, science and math labs, computer labs, and development…
Berkeley Lab 2nd Grader Outreach
Scoggins, Jackie; Louie, Virginia
2017-12-11
The Berkeley Lab IT Department sponsored a community outreach program aimed at teaching young children about computers and networks. Second graders from LeConte Elementary School joined Lab IT Staff for a day of in-depth exercises and fun.
Jackson, M E; Gnadt, J W
1999-03-01
The object-oriented graphical programming language LabVIEW was used to implement the numerical solution to a computational model of saccade generation in primates. The computational model simulates the activity and connectivity of anatomical structures known to be involved in saccadic eye movements. The LabVIEW program provides a graphical user interface to the model that makes it easy to observe and modify the behavior of each element of the model. Essential elements of the source code of the LabVIEW program are presented and explained. A copy of the model is available for download from the internet.
Computer-Controlled System for Plasma Ion Energy Auto-Analyzer
NASA Astrophysics Data System (ADS)
Wu, Xian-qiu; Chen, Jun-fang; Jiang, Zhen-mei; Zhong, Qing-hua; Xiong, Yu-ying; Wu, Kai-hua
2003-02-01
A computer-controlled system for a plasma ion energy auto-analyzer was developed for rapid and online measurement of plasma ion energy distributions. The system intelligently controls all the equipment via an RS-232 port, a printer port and a home-built circuit. The software, designed in the LabVIEW G language, automatically fulfils all of the tasks such as system initializing, adjustment of the scanning voltage, measurement of weak currents, data processing, graphic export, etc. Using the system, only a few minutes are needed to acquire the whole ion energy distribution, which rapidly provides important parameters of plasma process techniques based on semiconductor devices and microelectronics.
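Assuming the analyzer is a retarding-field type (the abstract does not specify), the ion energy distribution is proportional to the negative derivative of the collected current with respect to the scanning voltage. A numerical sketch on synthetic I-V data:

```python
import numpy as np

# Hypothetical retarding-voltage sweep and collected ion current I(V).
V = np.linspace(0.0, 40.0, 201)             # scanning voltage, volts
I = 1.0 / (1.0 + np.exp((V - 20.0) / 2.0))  # sigmoidal cutoff near 20 eV

# Energy distribution f(E) proportional to -dI/dV, evaluated numerically.
f = -np.gradient(I, V)
E_peak = V[np.argmax(f)]                    # most probable ion energy
```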
The Macintosh Lab Monitor, Numbers 1-4.
ERIC Educational Resources Information Center
Wanderman, Richard; And Others
1987-01-01
Four issues of the "Macintosh Lab Monitor" document the Computer-Aided Writing Project at the Forman School (Connecticut) which is a college preparatory school for bright dyslexic adolescents. The project uses Macintosh computers to teach outlining, writing, organizational and thinking skills. Sample articles have the following titles:…
ERIC Educational Resources Information Center
Dowling, John, Jr.
1972-01-01
Discusses the use of a set of computer programs (FORTRAN IV) in an introductory mechanics course for science majors. One laboratory activity is described for determining the coefficient of restitution of a glider on an air track. A student evaluation for the lab is included in the appendix. (Author/TS)
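The lab's central quantity, the coefficient of restitution, is the ratio of rebound speed to approach speed. A trivial sketch in Python rather than the article's FORTRAN IV, with hypothetical photogate speeds:

```python
def restitution(v_in, v_out):
    """Coefficient of restitution e = rebound speed / approach speed."""
    return abs(v_out) / abs(v_in)

# Hypothetical glider speeds (m/s) before and after bouncing off the bumper.
e = restitution(v_in=0.50, v_out=0.46)
```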
Multicore: Fallout from a Computing Evolution
Yelick, Kathy [Director, NERSC]
2017-12-09
July 22, 2008 Berkeley Lab lecture: Parallel computing used to be reserved for big science and engineering projects, but in two years that's all changed. Even laptops and hand-helds use parallel processors. Unfortunately, the software hasn't kept pace. Kathy Yelick, Director of the National Energy Research Scientific Computing Center at Berkeley Lab, describes the resulting chaos and the computing community's efforts to develop exciting applications that take advantage of tens or hundreds of processors on a single chip.
NASA Astrophysics Data System (ADS)
Schlattauer, Leo; Parali, Levent; Pechousek, Jiri; Sabikoglu, Israfil; Celiktas, Cuneyt; Tektas, Gozde; Novak, Petr; Jancar, Ales; Prochazka, Vit
2017-09-01
This paper reports on the development of a gamma-ray spectroscopic system for the (i) recording and (ii) processing of spectra. The data read-out unit consists of a PCI digital oscilloscope, a personal computer and the LabVIEW™ programming environment. Pulse-height spectra of various sources were recorded with two NaI(Tl) detectors and analyzed, demonstrating the proper usage of the detectors. A multichannel analyzer implements Gaussian photopeak fitting. The presented method provides results in agreement with those taken from commercial spectroscopy systems. Each individual hardware or software unit can be further utilized in different spectrometric user-systems. An application of the developed system for research and teaching purposes regarding the design of digital spectrometric systems has been successfully tested in the laboratories of the Department of Experimental Physics.
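Gaussian photopeak fitting of a pulse-height spectrum, as the multichannel analyzer performs, can be sketched with SciPy on a synthetic histogram. The 662 keV-like peak position, channel range, and noise level below are all made up:

```python
import numpy as np
from scipy.optimize import curve_fit

def gauss(x, amp, mu, sigma, bg):
    """Gaussian photopeak on a flat background."""
    return amp * np.exp(-0.5 * ((x - mu) / sigma) ** 2) + bg

# Synthetic pulse-height histogram around a 137Cs-like photopeak.
ch = np.arange(600.0, 725.0)
rng = np.random.default_rng(1)
counts = gauss(ch, 1000.0, 662.0, 10.0, 50.0) + rng.normal(0.0, 5.0, ch.size)

# Initial guesses from the data, then nonlinear least-squares fit.
p0 = [counts.max(), ch[np.argmax(counts)], 5.0, counts.min()]
popt, _ = curve_fit(gauss, ch, counts, p0=p0)
centroid, fwhm = popt[1], 2.3548 * abs(popt[2])  # FWHM = 2*sqrt(2*ln 2)*sigma
```

The fitted centroid and FWHM are exactly the quantities needed for energy calibration and resolution checks.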
A digital frequency stabilization system of external cavity diode laser based on LabVIEW FPGA
NASA Astrophysics Data System (ADS)
Liu, Zhuohuan; Hu, Zhaohui; Qi, Lu; Wang, Tao
2015-10-01
Frequency stabilization of external cavity diode lasers has played an important role in physics research, and many laser frequency locking solutions have been proposed. Traditionally, the locking process was accomplished by an analog system, which has a fast feedback response. However, analog systems are susceptible to the effects of the environment. In order to improve the automation level and reliability of the frequency stabilization system, we take a grating-feedback external cavity diode laser as the laser source and set up a digital frequency stabilization system based on National Instruments' FPGA (NI FPGA). The system consists of a saturated-absorption frequency stabilization beam path, a differential photoelectric detector, an NI FPGA board and a host computer. Functions such as piezoelectric transducer (PZT) sweeping, atomic saturated-absorption signal acquisition, signal peak identification, error signal generation and laser PZT voltage feedback control are completed entirely by the LabVIEW FPGA program. Compared with the analog system, this system, built from logic-gate circuits, performs stably and reliably, and the user interface programmed in LabVIEW is friendly. Besides, benefiting from its reconfigurability, the LabVIEW program is easily ported to other NI FPGA boards. Most importantly, the system periodically checks the error signal; once an abnormal error signal is detected, the FPGA restarts the frequency stabilization process without manual control. By monitoring the fluctuation of the error signal of the atomic saturated-absorption spectral line in the locked state, we infer that the laser frequency stability can reach 1 MHz.
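The periodic error-signal check that triggers automatic relocking can be sketched as a watchdog loop. The callback names and threshold below are hypothetical stand-ins for the FPGA logic:

```python
def watchdog(read_error, relock, threshold, n_checks):
    """Periodically inspect the lock's error signal and relock on anomaly.

    read_error(): sample the error signal (hypothetical units).
    relock():     re-run the PZT sweep / lock acquisition sequence.
    """
    relocks = 0
    for _ in range(n_checks):
        if abs(read_error()) > threshold:   # abnormal: lock presumed lost
            relock()
            relocks += 1
    return relocks
```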
Student's Lab Assignments in PDE Course with MAPLE.
ERIC Educational Resources Information Center
Ponidi, B. Alhadi
Computer-aided software has been used intensively in many mathematics courses, especially in computational subjects, to solve initial value and boundary value problems in Partial Differential Equations (PDE). Many software packages were used in student lab assignments such as FORTRAN, PASCAL, MATLAB, MATHEMATICA, and MAPLE in order to accelerate…
The Ever-Present Demand for Public Computing Resources. CDS Spotlight
ERIC Educational Resources Information Center
Pirani, Judith A.
2014-01-01
This Core Data Service (CDS) Spotlight focuses on public computing resources, including lab/cluster workstations in buildings, virtual lab/cluster workstations, kiosks, laptop and tablet checkout programs, and workstation access in unscheduled classrooms. The findings are derived from 758 CDS 2012 participating institutions. A dataset of 529…
Complete LabVIEW-Controlled HPLC Lab: An Advanced Undergraduate Experience
ERIC Educational Resources Information Center
Beussman, Douglas J.; Walters, John P.
2017-01-01
Virtually all modern chemical instrumentation is controlled by computers. While software packages are continually becoming easier to use, allowing for more researchers to utilize more complex instruments, conveying some level of understanding as to how computers and instruments communicate is still an important part of the undergraduate…
From Computer Lab to Technology Class.
ERIC Educational Resources Information Center
Sherwood, Sandra
1999-01-01
Discussion of integrating technology into elementary school classrooms focuses on teacher training that is based on a three-year plan developed at an elementary school in Marathon, New York. Describes the role of a technology teacher who facilitates technology integration by running the computer lab, offering workshops, and developing inservice…
Music Learning in Your School Computer Lab.
ERIC Educational Resources Information Center
Reese, Sam
1998-01-01
States that a growing number of schools are installing general computer labs equipped to use notation, accompaniment, and sequencing software independent of MIDI keyboards. Discusses (1) how to configure the software without MIDI keyboards or external sound modules, (2) using the actual MIDI software, (3) inexpensive enhancements, and (4) the…
The Hidden Costs of Wireless Computer Labs
ERIC Educational Resources Information Center
Daly, Una
2005-01-01
Various elementary schools and middle schools across the U.S. have purchased one or more mobile laboratories. Although the wireless labs have provided more classroom computing, teachers and technology aides still have mixed views about their cost-benefit ratio. This is because the proliferation of viruses and spyware has dramatically increased…
Requirements for the Military Message System (MMS) Family: Data Types and User Commands.
1986-04-11
AD-A167 126. Requirements for the Military Message System (MMS) Family: Data Types and User Commands. Constance L. Heitmeyer, Computer Science and Systems Branch, Information Technology Division, Naval Research Laboratory, Washington, DC. April 11, 1986.
Software for Testing Electroactive Structural Components
NASA Technical Reports Server (NTRS)
Moses, Robert W.; Fox, Robert L.; Dimery, Archie D.; Bryant, Robert G.; Shams, Qamar
2003-01-01
A computer program generates a graphical user interface that, in combination with its other features, facilitates the acquisition and preprocessing of experimental data on the strain response, hysteresis, and power consumption of a multilayer composite-material structural component containing one or more built-in sensor(s) and/or actuator(s) based on piezoelectric materials. This program runs in conjunction with LabVIEW software in a computer-controlled instrumentation system. For a test, a specimen is instrumented with applied-voltage and current sensors and with strain gauges. Once the computational connection to the test setup has been made via the LabVIEW software, this program causes the test instrumentation to step through specified configurations. If the user is satisfied with the test results as displayed by the software, the user activates an icon on a front-panel display, causing the raw current, voltage, and strain data to be digitized and saved. The data are also put into a spreadsheet and can be plotted on a graph. Graphical displays are saved in an image file for future reference. The program also computes and displays the power and the phase angle between voltage and current.
The Air Force's central reference laboratory: maximizing service while minimizing cost.
Armbruster, D A
1991-11-01
The Laboratory Services Branch (Epi Lab) of the Epidemiology Division, Brooks AFB, Texas, is designated by regulation to serve as the Air Force's central reference laboratory, providing clinical laboratory testing support to all Air Force medical treatment facilities (MTFs). Epi Lab recognized that it was not offering the MTFs a service comparable to civilian reference laboratories and that, as a result, the Air Force medical system was spending hundreds of thousands of dollars yearly for commercial laboratory support. An in-house laboratory upgrade program was proposed to and approved by the USAF Surgeon General, as a Congressional Efficiencies Add project, to launch a two-phase initiative consisting of a 1-year field trial involving 30 MTFs, followed by expansion to another 60 MTFs. Major components of the program include overnight air courier service to deliver patient samples to Epi Lab, a mainframe laboratory information system, and electronic reporting of results to MTFs throughout the CONUS. Application of medical marketing concepts and the Total Quality Management (TQM) philosophy allowed Epi Lab to provide dramatically enhanced reference service at a cost savings of about $1 million to the medical system. The Epi Lab upgrade program represents an innovative problem-solving approach, combining technical and managerial improvements and resulting in substantial patient care and financial dividends. It serves as an example of the successful application of TQM and marketing within the military medical system.
Another expert system rule inference based on DNA molecule logic gates
NASA Astrophysics Data System (ADS)
Wąsiewicz, Piotr
2013-10-01
With the help of the silicon industry, microfluidic processors were invented, utilizing nano-membrane valves, pumps, and microreactors. These so-called labs-on-a-chip, combined with molecular computing, create molecular systems-on-a-chip. This work presents a new approach to the implementation of molecular inference systems. It requires a unique representation of signals by DNA molecules. The main part of this work covers the concept of logic gates based on typical genetic-engineering reactions. The presented method allows logic gates with many inputs to be constructed and executed in the same number of elementary operations, regardless of the number of input signals. Every microreactor of the lab-on-a-chip performs one unique operation on input molecules and can be connected by dataflow output-input connections to other microreactors.
Virtualization in education: Information Security lab in your hands
NASA Astrophysics Data System (ADS)
Karlov, A. A.
2016-09-01
The growing demand for qualified specialists in advanced information technologies poses serious challenges to the education and training of young personnel for science, industry, and social problems. Virtualization, as a way to isolate the user from the physical characteristics of computing resources (processors, servers, operating systems, networks, applications, etc.), has an enormous influence in the field of education in particular, increasing its efficiency, reducing its cost, and making it more widely and readily available. The study of the Information Security of computer systems is presented as an example of the use of virtualization in education.
NASA Astrophysics Data System (ADS)
da Silva, A. M. R.; de Macêdo, J. A.
2016-06-01
Motivated by technological advances and the difficulty students have in learning physics, this article describes the process of elaboration and implementation of a hypermedia system for high school teachers, involving computer simulations for teaching basic concepts of electromagnetism using free tools. With the completion and publication of the project, there will be a new possibility for students and teachers to interact with technology in the classroom and in labs.
The Next Wave: Humans, Computers, and Redefining Reality
NASA Technical Reports Server (NTRS)
Little, William
2018-01-01
The Augmented/Virtual Reality (AVR) Lab at KSC is dedicated to "exploration into the growing computer fields of Extended Reality and the Natural User Interface (it is) a proving ground for new technologies that can be integrated into future NASA projects and programs." The topics of Human Computer Interface, Human Computer Interaction, Augmented Reality, Virtual Reality, and Mixed Reality are defined; examples of work being done in these fields in the AVR Lab are given. Current and future work in Computer Vision, Speech Recognition, and Artificial Intelligence is also outlined.
[Application of virtual instrumentation technique in toxicological studies].
Moczko, Jerzy A
2005-01-01
Research investigations frequently require a direct connection of measuring equipment to the computer. Virtual instrumentation techniques considerably facilitate the programming of sophisticated acquisition-and-analysis procedures. In the standard approach these two steps are performed sequentially with separate software tools: the acquired data are transferred, using the export/import procedures of one program, to another program that executes the next step of the analysis. This procedure is cumbersome, time-consuming, and a potential source of errors. In 1987 National Instruments Corporation introduced the LabVIEW language, based on the concept of graphical programming. In contrast to conventional textual languages, it allows the researcher to concentrate on the problem being solved and omit all syntactical rules. Programs developed in LabVIEW are called virtual instruments (VIs) and are portable among different computer platforms such as PCs, Macintoshes, Sun SPARCstations, Concurrent PowerMAX stations, and HP PA-RISC workstations. This flexibility ensures that programs prepared for one platform are also usable on another. This paper describes the basic principles of connecting research equipment to computer systems.
ERIC Educational Resources Information Center
Terris, Ben
2010-01-01
Colleges are looking for ways to cut costs, and most students now own laptops. As a result, many campus technology leaders are taking a hard look at those brightly lit rooms with rows of networked computers, which cost hundreds of thousands of dollars a year to maintain. More than 11% of colleges and universities are phasing out computer labs or…
Katz, Jonathan E
2017-01-01
Laboratories tend to be amenable environments for long-term reliable operation of scientific measurement equipment. Indeed, it is not uncommon to find equipment 5, 10, or even 20+ years old still being routinely used in labs. Unfortunately, the Achilles heel for many of these devices is the control/data acquisition computer. Often these computers run older operating systems (e.g., Windows XP) and, while they might only use standard network, USB, or serial ports, they require proprietary software to be installed. Even if the original installation disks can be found, reinstallation is a burdensome process fraught with "gotchas" that can derail it: lost license keys, incompatible hardware, forgotten configuration settings, etc. If you have legacy instrumentation running, the computer is the ticking time bomb waiting to put a halt to your operation. In this chapter, I describe how to virtualize your currently running control computer. This virtualized computer "image" is easy to maintain, easy to back up, and easy to redeploy. I have used this approach multiple times in my own lab to greatly improve the robustness of my legacy devices. After completing the steps in this chapter, you will have your original control computer as well as a virtual instance of that computer, with all the software installed, ready to control your hardware should your original computer ever be decommissioned.
NASA Technical Reports Server (NTRS)
Sen, Syamal K.; Shaykhian, Gholam Ali
2011-01-01
MatLab™ (MATrix LABoratory) is a numerical computation and simulation tool used by thousands of scientists and engineers in many countries. MatLab does purely numerical calculations and can be used as a glorified calculator or an interpreted programming language; its real strength is in matrix manipulations. Computer algebra functionality is available within the MatLab environment through the "symbolic" toolbox. This feature is similar to computer algebra programs, such as Maple or Mathematica, that manipulate mathematical equations using symbolic operations. In its interpreted programming-language form (command interface), MatLab is similar to well-known programming languages such as C/C++; it supports data structures and cell arrays, and classes can be defined for object-oriented programming. As such, MatLab is equipped with most of the essential constructs of a higher-level programming language. MatLab is packaged with an editor and debugging functionality useful for analyzing large MatLab programs and finding errors. We believe there are many ways to approach real-world problems; prescribed methods ensuring that the foregoing solutions are incorporated into the design and analysis of data processing and visualization can benefit engineers and scientists by giving wider insight into the actual implementation of their experiments. This presentation will focus on the data processing and visualization aspects of engineering and scientific applications. Specifically, it will discuss methods and techniques for intermediate-level data processing covering engineering and scientific problems. MatLab programming techniques will be discussed, including reading various data file formats to produce customized publication-quality graphics, importing engineering and/or scientific data, organizing data in tabular format, exporting data for use by other software programs such as Microsoft Excel, and data presentation and visualization.
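The workflow the presentation describes (read a data file, organize it in tabular form, export a table that Excel can open) can be sketched in Python rather than MatLab; the sensor-log format and field names below are invented for the example:

```python
import csv
import io

# Hypothetical sensor log; in practice this would be read from a file.
raw = """time_s,temp_C,pressure_kPa
0.0,21.5,101.2
1.0,21.7,101.1
2.0,21.9,101.3
"""

# Import: parse the log into tabular rows keyed by column name
rows = list(csv.DictReader(io.StringIO(raw)))
temps = [float(r["temp_C"]) for r in rows]
mean_temp = sum(temps) / len(temps)

# Export: write a processed table in CSV, directly openable in Excel
out = io.StringIO()
writer = csv.writer(out)
writer.writerow(["time_s", "temp_C_minus_mean"])
for r in rows:
    writer.writerow([r["time_s"], round(float(r["temp_C"]) - mean_temp, 3)])
table = out.getvalue()
```

The same import-process-export shape applies regardless of language; MatLab's `readtable`/`writetable` or Excel-specific I/O would fill the same roles.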
Adjustable Speed Drive Project for Teaching a Servo Systems Course Laboratory
ERIC Educational Resources Information Center
Rodriguez-Resendiz, J.; Herrera-Ruiz, G.; Rivas-Araiza, E. A.
2011-01-01
This paper describes an adjustable speed drive for a three-phase motor, which has been implemented as a design for a servo system laboratory course in an engineering curriculum. The platform is controlled and analyzed in a LabVIEW environment and run on a PC. Theory is introduced in order to show the sensorless algorithms. These are computed by…
A New Flying Wire System for the Tevatron
NASA Astrophysics Data System (ADS)
Blokland, Willem; Dey, Joseph; Vogel, Greg
1997-05-01
A new Flying Wires system replaces the old system to enhance the analysis of the beam emittance, improve reliability, and handle the upcoming upgrades of the Tevatron. New VME data acquisition modules and timing modules allow more bunches to be sampled more precisely. The programming language LabVIEW, running on a Macintosh computer, controls the VME modules and the nuLogic motion board that flies the wires. LabVIEW also analyzes and stores the data, and handles local and remote commands. The new system flies three wires and fits profiles of 72 bunches to a Gaussian function within two seconds. A new console application operates the flying wires from any control console. This paper discusses the hardware and software setup, the capabilities, and measurement results of the new Flying Wires system.
Trusted Computing Technologies, Intel Trusted Execution Technology.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guise, Max Joseph; Wendt, Jeremy Daniel
2011-01-01
We describe the current state-of-the-art in Trusted Computing Technologies, focusing mainly on Intel's Trusted Execution Technology (TXT). This document is based on existing documentation and tests of two existing TXT-based systems: Intel's Trusted Boot and Invisible Things Lab's Qubes OS. We describe what features are lacking in current implementations, describe what a mature system could provide, and present a list of developments to watch. Critical systems perform operation-critical computations on high importance data. In such systems, the inputs, computation steps, and outputs may be highly sensitive. Sensitive components must be protected from both unauthorized release and unauthorized alteration: Unauthorized users should not access the sensitive input and sensitive output data, nor be able to alter them; the computation contains intermediate data with the same requirements, and executes algorithms that the unauthorized should not be able to know or alter. Due to various system requirements, such critical systems are frequently built from commercial hardware, employ commercial software, and require network access. These hardware, software, and network system components increase the risk that sensitive input data, computation, and output data may be compromised.
This presentation gives a brief introduction to EPA's computational toxicology program and the Athens Lab's role in it. The talk also covered a brief introduction to metabolomics; advantages/disadvantages of metabolomics for toxicity assessment; goals of the EPA Athens metabolomics...
Computer Labs Report to the Holodeck
ERIC Educational Resources Information Center
Raths, David
2011-01-01
In many ways, specialized computer labs are the black holes of IT organizations. Budgets, equipment, employees--even space itself--are sucked in. Given a choice, many IT shops would engage warp drive and escape their gravitational pull forever. While Captain Kirk might have looked to Scotty for a fix to the problem, colleges and universities are…
ODU-CAUSE: Computer Based Learning Lab.
ERIC Educational Resources Information Center
Sachon, Michael W.; Copeland, Gary E.
This paper describes the Computer Based Learning Lab (CBLL) at Old Dominion University (ODU) as a component of the ODU-Comprehensive Assistance to Undergraduate Science Education (CAUSE) Project. Emphasis is directed to the structure and management of the facility and to the software under development by the staff. Serving the ODU-CAUSE User Group…
TIMESERIESSTREAMING.VI: LabVIEW program for reliable data streaming of large analog time series
NASA Astrophysics Data System (ADS)
Czerwinski, Fabian; Oddershede, Lene B.
2011-02-01
With modern data acquisition devices that work fast and very precisely, scientists often face the task of dealing with huge amounts of data, which need to be rapidly processed and stored onto a hard disk. We present a LabVIEW program which reliably streams analog time series at MHz sampling rates. Its run time has virtually no limitation. We explicitly show how to use the program to extract time series from two experiments: for a photodiode detection system that tracks the position of an optically trapped particle, and for a measurement of ionic current through a glass capillary. The program is easy to use and versatile, as the input can be any type of analog signal. Also, the data streaming software is simple, highly reliable, and can be easily customized to include, e.g., real-time power spectral analysis and Allan variance noise quantification.
Program summary
Program title: TimeSeriesStreaming.VI
Catalogue identifier: AEHT_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEHT_v1_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
No. of lines in distributed program, including test data, etc.: 250
No. of bytes in distributed program, including test data, etc.: 63 259
Distribution format: tar.gz
Programming language: LabVIEW (http://www.ni.com/labview/)
Computer: Any machine running LabVIEW 8.6 or higher
Operating system: Windows XP and Windows 7
RAM: 60-360 Mbyte
Classification: 3
Nature of problem: For numerous scientific and engineering applications, it is highly desirable to have an efficient, reliable, and flexible program to perform data streaming of time series sampled at high frequencies, possibly over long time intervals. This type of data acquisition often produces very large amounts of data not easily streamed onto a computer hard disk using standard methods.
Solution method: This LabVIEW program is developed to directly stream any kind of time series onto a hard disk. Due to optimized timing and usage of computational resources, such as multicores and protocols for memory usage, the program provides extremely reliable data acquisition. In particular, it is optimized to deal with large amounts of data, e.g., taken at high sampling frequencies and over long time intervals, and can be easily customized for time series analyses.
Restrictions: Only tested in Windows-operating LabVIEW environments; must use TDMS format; acquisition cards must be LabVIEW compatible; driver DAQmx installed.
Running time: As desirable: microseconds to hours
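The chunked write-to-disk pattern such a streaming VI implements can be sketched in ordinary Python (NumPy stands in for the DAQ driver; the chunk size, chunk count, and file name are arbitrary choices for the example):

```python
import os
import tempfile

import numpy as np

def stream_to_disk(sample_chunks, path, dtype=np.float32):
    """Append chunks of samples to a binary file as they arrive, so
    memory use stays bounded no matter how long the acquisition runs
    (the producer/consumer idea behind TDMS-style streaming)."""
    written = 0
    with open(path, "ab") as f:
        for chunk in sample_chunks:
            np.asarray(chunk, dtype=dtype).tofile(f)
            written += len(chunk)
    return written

# Simulate an acquisition delivering 40 chunks of 256 samples each
rng = np.random.default_rng(1)
chunks = [rng.standard_normal(256) for _ in range(40)]
path = os.path.join(tempfile.mkdtemp(), "timeseries.bin")
n_written = stream_to_disk(chunks, path)

# Replay the streamed file to verify nothing was lost
replay = np.fromfile(path, dtype=np.float32)
```

In a real setup the chunk generator would be a hardware read loop running in its own thread, with the disk writer consuming from a queue so a slow disk never stalls the acquisition.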
Laboratory directed research and development program FY 1999
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hansen, Todd; Levy, Karin
2000-03-08
The Ernest Orlando Lawrence Berkeley National Laboratory (Berkeley Lab or LBNL) is a multi-program national research facility operated by the University of California for the Department of Energy (DOE). As an integral element of DOE's National Laboratory System, Berkeley Lab supports DOE's missions in fundamental science, energy resources, and environmental quality. Berkeley Lab programs advance four distinct goals for DOE and the nation: (1) To perform leading multidisciplinary research in the computing sciences, physical sciences, energy sciences, biosciences, and general sciences in a manner that ensures employee and public safety and protection of the environment. (2) To develop and operate unique national experimental facilities for qualified investigators. (3) To educate and train future generations of scientists and engineers to promote national science and education goals. (4) To transfer knowledge and technological innovations and to foster productive relationships among Berkeley Lab's research programs, universities, and industry in order to promote national economic competitiveness. This is the annual report on the Laboratory Directed Research and Development (LDRD) program for FY99.
None
2017-12-09
Learn what it will take to create tomorrow's net-zero energy home as scientists reveal the secrets of cool roofs, smart windows, and computer-driven energy control systems. The net-zero energy home: Scientists are working to make tomorrow's homes more than just energy efficient -- they want them to be zero energy. Iain Walker, a scientist in the Lab's Energy Performance of Buildings Group, will discuss what it takes to develop net-zero energy houses that generate as much energy as they use through highly aggressive energy efficiency and on-site renewable energy generation. Talking back to the grid: Imagine programming your house to use less energy if the electricity grid is full or prices are high. Mary Ann Piette, deputy director of Berkeley Lab's building technology department and director of the Lab's Demand Response Research Center, will discuss how new technologies are enabling buildings to listen to the grid and automatically change their thermostat settings or lighting loads, among other demands, in response to fluctuating electricity prices. The networked (and energy efficient) house: In the future, your home's lights, climate control devices, computers, windows, and appliances could be controlled via a sophisticated digital network. If it's plugged in, it'll be connected. Bruce Nordman, an energy scientist in Berkeley Lab's Energy End-Use Forecasting group, will discuss how he and other scientists are working to ensure these networks help homeowners save energy.
Computational Tools and Facilities for the Next-Generation Analysis and Design Environment
NASA Technical Reports Server (NTRS)
Noor, Ahmed K. (Compiler); Malone, John B. (Compiler)
1997-01-01
This document contains presentations from the joint UVA/NASA Workshop on Computational Tools and Facilities for the Next-Generation Analysis and Design Environment held at the Virginia Consortium of Engineering and Science Universities in Hampton, Virginia on September 17-18, 1996. The presentations focused on the computational tools and facilities for analysis and design of engineering systems, including, real-time simulations, immersive systems, collaborative engineering environment, Web-based tools and interactive media for technical training. Workshop attendees represented NASA, commercial software developers, the aerospace industry, government labs, and academia. The workshop objectives were to assess the level of maturity of a number of computational tools and facilities and their potential for application to the next-generation integrated design environment.
A Simple Technique for Securing Data at Rest Stored in a Computing Cloud
NASA Astrophysics Data System (ADS)
Sedayao, Jeff; Su, Steven; Ma, Xiaohao; Jiang, Minghao; Miao, Kai
"Cloud Computing" offers many potential benefits, including cost savings, the ability to deploy applications and services quickly, and the ease of scaling those applications and services once they are deployed. A key barrier for enterprise adoption is the confidentiality of data stored on Cloud Computing infrastructure. Our simple technique, implemented with Open Source software, solves this problem by using public key encryption to render stored data at rest unreadable by unauthorized personnel, including system administrators of the cloud computing service on which the data is stored. We validate our approach on a network measurement system implemented on PlanetLab. We then use it on a service where confidentiality is critical: a scanning application that validates external firewall implementations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnstad, H.
The purpose of this meeting is to discuss the current and future HEP computing support and environments from the perspective of new horizons in accelerator, physics, and computing technologies. Topics of interest to the Meeting include (but are not limited to): the forming of the HEPLIB world user group for High Energy Physics computing; mandate, desirables, coordination, organization, funding; user experience, international collaboration; the roles of national labs, universities, and industry; range of software, Monte Carlo, mathematics, physics, interactive analysis, text processors, editors, graphics, data base systems, code management tools; program libraries, frequency of updates, distribution; distributed and interactive computing, data base systems, user interface, UNIX operating systems, networking, compilers, Xlib, X-Graphics; documentation, updates, availability, distribution; code management in large collaborations, keeping track of program versions; and quality assurance, testing, conventions, standards.
The computer program AQUASIM was used to model biological treatment of perchlorate-contaminated water using zero-valent iron corrosion as the hydrogen gas source. The laboratory-scale column was seeded with an autohydrogenotrophic microbial consortium previously shown to degrade ...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peles, Slaven
2016-11-06
GridKit is a software development kit for interfacing power systems and power grid application software with high performance computing (HPC) libraries developed at National Labs and academia. It is also intended as interoperability layer between different numerical libraries. GridKit is not a standalone application, but comes with a suite of test examples illustrating possible usage.
Inexpensive Data Acquisition with a Sound Card
ERIC Educational Resources Information Center
Hassan, Umer; Pervaiz, Saad; Anwar, Muhammad Sabieh
2011-01-01
Signal generators, oscilloscopes, and data acquisition (DAQ) systems are standard components of the modern experimental physics laboratory. The sound card, a built-in component in the ubiquitous personal computer, can be utilized for all three of these tasks and offers an attractive option for labs in developing countries such as…
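To give a flavor of the signal-generation and analysis tasks such a sound-card lab performs, the sketch below synthesizes a test tone and recovers its frequency from the spectrum; the 44.1 kHz rate and 440 Hz tone are arbitrary choices, and a real setup would capture the samples from the card's input:

```python
import numpy as np

# "Signal generator" side: one second of a 440 Hz tone at 44.1 kHz,
# the standard sound-card sampling rate.
fs = 44_100
t = np.arange(fs) / fs
tone = 0.5 * np.sin(2 * np.pi * 440.0 * t)

# "Oscilloscope/DAQ" side: locate the tone in the magnitude spectrum.
spectrum = np.abs(np.fft.rfft(tone))
freqs = np.fft.rfftfreq(tone.size, d=1 / fs)
peak_hz = freqs[np.argmax(spectrum)]
```

One second of data gives 1 Hz frequency resolution, which is ample for audio-band physics experiments such as pendulum timing or beat-frequency measurements.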
Combining Instructionist and Constructionist Learning in a Virtual Biotech Lab.
ERIC Educational Resources Information Center
Dawabi, Peter; Wessner, Martin
The background of this paper is an internal research project at the German National Research Center for Information Technology, Integrated Publication and Information Systems Institute, (GMD-IPSI) dealing with software engineering, computer-supported cooperative learning (CSCL) and practical biotech knowledge. The project goal is to develop a…
Computer Simulations for Lab Experiences in Secondary Physics
ERIC Educational Resources Information Center
Murphy, David Shannon
2012-01-01
Physical science instruction often involves modeling natural systems, such as electricity that possess particles which are invisible to the unaided eye. The effect of these particles' motion is observable, but the particles are not directly observable to humans. Simulations have been developed in physics, chemistry and biology that, under certain…
Shared-resource computing for small research labs.
Ackerman, M J
1982-04-01
A real-time laboratory computer network is described. This network is composed of four real-time laboratory minicomputers, located in each of the four division laboratories, and a larger minicomputer in a centrally located computer room. Off-the-shelf hardware and software were used with no customization. The network is configured for resource sharing using DECnet communications software and the RSX-11M multi-user real-time operating system. The cost effectiveness of the shared-resource network and multiple real-time processing using priority scheduling is discussed. Examples of utilization within a medical research department are given.
NASA Astrophysics Data System (ADS)
Kuan, Wen-Hsuan; Tseng, Chi-Hung; Chen, Sufen; Wong, Ching-Chang
2016-06-01
We propose an integrated curriculum to establish essential abilities of computer programming for the freshmen of a physics department. The implementation of the graphical-based interfaces from Scratch to LabVIEW then to LabVIEW for Arduino in the curriculum `Computer-Assisted Instrumentation in the Design of Physics Laboratories' brings rigorous algorithm and syntax protocols together with imagination, communication, scientific applications and experimental innovation. The effectiveness of the curriculum was evaluated via statistical analysis of questionnaires, interview responses, the increase in student numbers majoring in physics, and performance in a competition. The results provide quantitative support that the curriculum removed the steep barriers to programming that arise in text-based environments, helped students gain knowledge of programming and instrumentation, and increased the students' confidence and motivation to learn physics and computer languages.
Strategies for Sharing Seismic Data Among Multiple Computer Platforms
NASA Astrophysics Data System (ADS)
Baker, L. M.; Fletcher, J. B.
2001-12-01
Seismic waveform data is readily available from a variety of sources, but it often comes in a distinct, instrument-specific data format. For example, data may be from portable seismographs, such as those made by Refraction Technology or Kinemetrics, from permanent seismograph arrays, such as the USGS Parkfield Dense Array, from public data centers, such as the IRIS Data Center, or from personal communication with other researchers through e-mail or ftp. A computer must be selected to import the data - usually whichever is the most suitable for reading the originating format. However, the computer best suited for a specific analysis may not be the same. When copies of the data are then made for analysis, a proliferation of copies of the same data results, in possibly incompatible, computer-specific formats. In addition, if an error is detected and corrected in one copy, or some other change is made, all the other copies must be updated to preserve their validity. Keeping track of what data is available, where it is located, and which copy is authoritative requires an effort that is easy to neglect. We solve this problem by importing waveform data to a shared network file server that is accessible to all our computers on our campus LAN. We use a Network Appliance file server running Sun's Network File System (NFS) software. Using an NFS client software package on each analysis computer, waveform data can then be read by our MatLab or Fortran applications without first copying the data. Since there is a single copy of the waveform data in a single location, the NFS file system hierarchy provides an implicit complete waveform data catalog and the single copy is inherently authoritative. 
Another part of our solution is to convert the original data into a blocked-binary format (known historically as USGS DR100 or VFBB format) that is interpreted by MatLab or Fortran library routines available on each computer so that the idiosyncrasies of each machine are not visible to the user. Commercial software packages, such as MatLab, also have the ability to share data in their own formats across multiple computer platforms. Our Fortran applications can create plot files in Adobe PostScript, Illustrator, and Portable Document Format (PDF) formats. Vendor support for reading these files is readily available on multiple computer platforms. We will illustrate by example our strategies for sharing seismic data among our multiple computer platforms, and we will discuss our positive and negative experiences. We will include our solutions for handling the different byte ordering, floating-point formats, and text file ``end-of-line'' conventions on the various computer platforms we use (6 different operating systems on 5 processor architectures).
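The DR100/VFBB format itself is not specified in the abstract; as a minimal sketch of the byte-order-independence idea it describes, the following hypothetical Python reader decodes a stream of 32-bit big-endian floats with an explicit byte-order prefix, so the same file decodes identically on little- and big-endian hosts (the format string and sample values are assumptions, not the actual USGS format):

```python
import struct

def read_samples_be(raw: bytes) -> list:
    """Decode 32-bit big-endian floats regardless of host byte order.

    The explicit '>' (big-endian) prefix in the struct format string
    fixes the on-disk byte order, so no per-platform byte swapping
    is needed on the reading side.
    """
    count = len(raw) // 4
    return list(struct.unpack(">%df" % count, raw[: count * 4]))

# Round-trip demo: encode on the "writing" side, decode on the "reading" side.
samples = [0.5, -1.25, 3.0]          # exactly representable in float32
raw = struct.pack(">%df" % len(samples), *samples)
decoded = read_samples_be(raw)
```

The same explicit-byte-order discipline also sidesteps the text-file end-of-line problem, since the data never pass through a platform-dependent text layer.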
The Influence of Tablet PCs on Students' Use of Multiple Representations in Lab Reports
NASA Astrophysics Data System (ADS)
Guelman, Clarisa Bercovich; De Leone, Charles; Price, Edward
2009-11-01
This study examined how different tools influenced students' use of representations in the Physics laboratory. In one section of a lab course, every student had a Tablet PC that served as a digital-ink based lab notebook. Students could seamlessly create hand-drawn graphics and equations, and write lab reports on the same computer used for data acquisition, simulation, and analysis. In another lab section, students used traditional printed lab guides, kept paper notebooks, and then wrote lab reports on regular laptops. Analysis of the lab reports showed differences between the sections' use of multiple representations, including an increased use of diagrams and equations by the Tablet users.
LabVIEW application for motion tracking using USB camera
NASA Astrophysics Data System (ADS)
Rob, R.; Tirian, G. O.; Panoiu, M.
2017-05-01
The technical state of the contact line and also the additional equipment in electric rail transport is very important for realizing the repairing and maintenance of the contact line. During its functioning, the pantograph motion must stay in standard limits. Present paper proposes a LabVIEW application which is able to track in real time the motion of a laboratory pantograph and also to acquire the tracking images. An USB webcam connected to a computer acquires the desired images. The laboratory pantograph contains an automatic system which simulates the real motion. The tracking parameters are the horizontally motion (zigzag) and the vertically motion which can be studied in separate diagrams. The LabVIEW application requires appropriate tool-kits for vision development. Therefore the paper describes the subroutines that are especially programmed for real-time image acquisition and also for data processing.
Computer Programs for Chemistry Experiments I and II.
ERIC Educational Resources Information Center
Reynard, Dale C.
This unit of instruction includes nine laboratory experiments. All of the experiments are from the D.C. Health Revision of the Chemical Education Materials Study (CHEMS) with one exception. Program six is the lab from the original version of the CHEMS program. Each program consists of three parts (1) the lab and computer hints, (2) the description…
Sneak Preview of Berkeley Lab's Science at the Theatre on June 6th, 2011
Sanii, Babak
2017-12-11
Babak Sanii provides a sneak preview of Berkeley Lab's next Science at the Theater Event: Big Thinking: The Power of Nanoscience. Berkeley Lab scientists reveal how nanoscience will bring us cleaner energy, faster computers, and improved medicine. Berkeley Repertory Theatre on June 6th, 2011.
Design and development of a solar powered mobile laboratory
NASA Astrophysics Data System (ADS)
Jiao, L.; Simon, A.; Barrera, H.; Acharya, V.; Repke, W.
2016-08-01
This paper describes the design and development of a solar powered mobile laboratory (SPML) system. The SPML provides a mobile platform that schools, universities, and communities can use to give students and staff access to laboratory environments where dedicated laboratories are not available. The lab includes equipment like 3D printers, computers, and soldering stations. The primary power source of the system is solar PV which allows the laboratory to be operated in places where the grid power is not readily available or not sufficient to power all the equipment. The main system components include PV panels, junction box, battery, charge controller, and inverter. Not only is it used to teach students and staff how to use the lab equipment, but it is also a great tool to educate the public about solar PV technologies.
Technology: Catalyst for Enhancing Chemical Education for Pre-service Teachers
NASA Astrophysics Data System (ADS)
Kumar, Vinay; Bedell, Julia Yang; Seed, Allen H.
1999-05-01
A DOE/KYEPSCoR-funded project enabled us to introduce a new curricular initiative aimed at improving the chemical education of pre-service elementary teachers. The new curriculum was developed in collaboration with the School of Education faculty. A new course for the pre-service teachers, "Discovering Chemistry with Lab" (CHE 105), was developed. The integrated lecture and lab course covers basic principles of chemistry and their applications in daily life. The course promotes reasoning and problem-solving skills and utilizes hands-on, discovery/guided-inquiry, and cooperative learning approaches. This paper describes the implementation of technology (computer-interfacing and simulation experiments) in the lab. Results of two assessment surveys conducted in the laboratory are also discussed. The key features of the lab course are eight new experiments, including four computer-interfacing/simulation experiments involving the use of Macintosh Power PCs, temperature and pH probes, and a serial box interface, and use of household materials. Several experiments and the midterm and final lab practical exams emphasize the discovery/guided-inquiry approach. The results of pre- and post-surveys showed very significant positive changes in students' attitude toward the relevancy of chemistry, use of technology (computers) in elementary school classrooms, and designing and teaching discovery-based units. Most students indicated that they would be very interested (52%) or interested (36%) in using computers in their science teaching.
NASA Astrophysics Data System (ADS)
Pohlman, Nicholas A.; Hynes, Eric; Kutz, April
2015-11-01
Lectures in introductory fluid mechanics at NIU are a combination of students with standard enrollment and students seeking honors credit for an enriching experience. Most honors students dread the additional homework problems or an extra paper assigned by the instructor. During the past three years, honors students in my class have instead collaborated to design wet-lab experiments for their peers to predict variable volume flow rates of open reservoirs driven by gravity. Rather than learning extra material, the honors students learn the Bernoulli head-loss equation earlier so they can design appropriate systems for an experimental wet lab. Prior designs incorporated minor-loss features such as a sudden contraction or multiple unions and valves. The honors students from Spring 2015 expanded the repertoire of available options by developing large-scale set-ups with multiple pipe networks that could be combined to test the flexibility of the student teams' computational programs. The engagement of bridging theory with practice was appreciated by all of the students, and multiple teams were able to predict performance within 4% accuracy. The challenges, schedules, and cost estimates of incorporating the experimental lab into an introductory fluid mechanics course will be reported.
NASA Astrophysics Data System (ADS)
Henderson, Jean Foster
The purpose of this study was to assess the effect of classroom restructuring involving computer laboratories on student achievement and student attitudes toward computers and computer courses. The effects of the targeted student attributes of gender, previous programming experience, math background, and learning style were also examined. The open lab-based class structure consisted of a traditional lecture class with a separate, unscheduled lab component in which lab assignments were completed outside of class; the closed lab-based class structure integrated a lab component within the lecture class so that half the class was reserved for lecture and half the class was reserved for students to complete lab assignments by working cooperatively with each other and under the supervision and guidance of the instructor. The sample consisted of 71 students enrolled in four intact classes of Computer Science I during the fall and spring semesters of the 2006--2007 school year at two southern universities: two classes were held in the fall (one at each university) and two classes were held in the spring (one at each university). A counterbalanced repeated measures design was used in which all students experienced both class structures for half of each semester. The order of control and treatment was rotated among the four classes. All students received the same amount of class and instructor time. A multivariate analysis of variance (MANOVA) via a multiple regression strategy was used to test the study's hypotheses. Although the overall MANOVA model was statistically significant, independent follow-up univariate analyses relative to each dependent measure found that the only significant research factor was math background: Students whose mathematics background was at the level of Calculus I or higher had significantly higher student achievement than students whose mathematics background was less than Calculus I. 
The results suggest that classroom structures that incorporate an open laboratory setting are just as effective on student achievement and attitudes as classroom structures that incorporate a closed laboratory setting. The results also suggest that math background is a strong predictor of student achievement in CS 1.
Rutkowski, Tomasz M
2015-08-01
This paper presents an applied concept of a brain-computer interface (BCI) student research laboratory (BCI-LAB) at the Life Science Center of TARA, University of Tsukuba, Japan. Several successful case studies of the student projects are reviewed together with the BCI Research Award 2014 winner case. The BCI-LAB design and project-based teaching philosophy is also explained. Future teaching and research directions summarize the review.
2001-01-03
KENNEDY SPACE CENTER, Fla. -- Under wispy white morning clouds, Space Shuttle Atlantis approaches Launch Pad 39A, which shows the Rotating Service Structure open (left) and the Fixed Service Structure (right). At the RSS, the payload canister is being lifted up to the Payload Changeout Room. This is the Shuttle’s second attempt at rollout. Jan. 2 a failed computer processor on the crawler transporter aborted the rollout and the Shuttle was returned to the Vehicle Assembly Building using a secondary computer processor on the vehicle. Atlantis will fly on mission STS-98, the seventh construction flight to the International Space Station, carrying the U.S. Laboratory, named Destiny. The lab will have five system racks already installed inside the module. After delivery of electronics in the lab, electrically powered attitude control for Control Moment Gyroscopes will be activated. Atlantis is scheduled for launch no earlier than Jan. 19, 2001, with a crew of five
Patel, Shyamal; McGinnis, Ryan S; Silva, Ikaro; DiCristofaro, Steve; Mahadevan, Nikhil; Jortberg, Elise; Franco, Jaime; Martin, Albert; Lust, Joseph; Raj, Milan; McGrane, Bryan; DePetrillo, Paolo; Aranyosi, A J; Ceruolo, Melissa; Pindado, Jesus; Ghaffari, Roozbeh
2016-08-01
Wearable sensors have the potential to enable clinical-grade ambulatory health monitoring outside the clinic. Technological advances have enabled the development of devices that can measure vital signs with great precision, and significant progress has been made towards extracting clinically meaningful information from these devices in research studies. However, translating measurement accuracies achieved in controlled settings, such as the lab and clinic, to unconstrained environments, such as the home, remains a challenge. In this paper, we present a novel wearable computing platform for unobtrusive collection of labeled datasets and a new paradigm for continuous development, deployment, and evaluation of machine learning models to ensure robust model performance as we transition from the lab to the home. Using this system, we train activity classification models across two studies and track changes in model performance as we go from constrained to unconstrained settings.
Games, Simulations and Virtual Labs for Science Education: a Compendium and Some Examples
NASA Astrophysics Data System (ADS)
Russell, R. M.
2012-12-01
We have assembled a list of computer-based simulations, games, and virtual labs for science education. This list, with links to the sources of these resources, is available online. The entries span a broad range of science, math, and engineering topics. They also span a range of target student ages, from elementary school to university students. We will provide a brief overview of this web site and the resources found on it. We will also briefly demonstrate some of our own educational simulations and games. Computer-based simulations and virtual labs are valuable resources for science educators in various settings, allowing learners to experiment and explore "what if" scenarios. Educational computer games can motivate learners in both formal and informal settings, encouraging them to spend much more time exploring a topic than they might otherwise be inclined to do. Part of this presentation is effectively a "literature review" of numerous sources of simulations, games, and virtual labs. Although we have encountered several nice collections of such resources, those collections seem to be restricted in scope. They either represent materials developed by a specific group or agency (e.g. NOAA's games web site) or are restricted to a specific discipline (e.g. geology simulations and virtual labs). This presentation directs viewers to games, simulations, and virtual labs from many different sources and spanning a broad range of STEM disciplines.
Graphics supercomputer for computational fluid dynamics research
NASA Astrophysics Data System (ADS)
Liaw, Goang S.
1994-11-01
The objective of this project is to purchase a state-of-the-art graphics supercomputer to improve the Computational Fluid Dynamics (CFD) research capability at Alabama A & M University (AAMU) and to support the Air Force research projects. A cutting-edge graphics supercomputer system, Onyx VTX, from Silicon Graphics Computer Systems (SGI), was purchased and installed. Other equipment, including a desktop personal computer (a PC-486 DX2 with a built-in 10-BaseT Ethernet card), a 10-BaseT hub, an Apple Laser Printer Select 360, and a notebook computer from Zenith, was also purchased. A reading room was converted to a research computer lab by adding furniture and an air conditioning unit in order to provide an appropriate working environment for researchers and the purchased equipment. All the purchased equipment was successfully installed and is fully functional. Several research projects, including two existing Air Force projects, are being performed using these facilities.
LabVIEW-based control and data acquisition system for cathodoluminescence experiments.
Bok, J; Schauer, P
2011-11-01
Computer automation of cathodoluminescence (CL) experiments using equipment developed in our laboratory is described. The equipment provides various experiments for CL efficiency, CL spectra, and CL time response studies. The automation was realized utilizing the graphical programming environment LabVIEW. The developed application software with procedures for equipment control and data acquisition during various CL experiments is presented. As the measured CL data are distorted by technical limitations of the equipment, such as equipment spectral sensitivity and time response, data correction algorithms were incorporated into the procedures. Some examples of measured data corrections are presented. © 2011 American Institute of Physics
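The paper's actual correction algorithms are not given in the abstract; a minimal sketch of one common correction it alludes to, dividing each measured channel by a relative spectral-sensitivity curve, is shown below (function name, channel values, and the zero-sensitivity cutoff are all hypothetical):

```python
def correct_spectrum(measured, sensitivity, min_sensitivity=1e-6):
    """Flat-field style correction: divide each measured CL intensity
    by the detector's relative spectral sensitivity, channel by channel.
    Channels where the sensitivity is effectively zero are reported as
    0.0 to avoid dividing by (near-)zero."""
    corrected = []
    for m, s in zip(measured, sensitivity):
        corrected.append(m / s if s > min_sensitivity else 0.0)
    return corrected

measured = [10.0, 20.0, 5.0]        # raw counts per channel (invented)
sensitivity = [0.5, 1.0, 0.25]      # relative detector response per channel
result = correct_spectrum(measured, sensitivity)
```

A time-response correction would follow the same pattern but as a deconvolution along the time axis rather than a per-channel division.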
Meng, Hu; Li, Jiang-Yuan; Tang, Yong-Huai
2009-01-01
The virtual instrument system for an ion analyzer, based on LabVIEW 8.0, which can measure and analyze ion concentrations in solution, was developed; it comprises a homemade conditioning circuit, a data acquisition board, and a computer. It can calibrate slope, temperature, and positioning automatically. When applied to determine the reaction rate constant by pX, it achieved live acquisition, real-time display, automatic processing of test data, and report generation, among other functions. This method greatly simplifies the experimental operation, avoids the complicated procedures and personal error of manual data processing, and improves the accuracy and repeatability of the experimental results.
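The abstract does not state how the rate constant is extracted from the acquired data; assuming first-order kinetics, one conventional reduction is a least-squares fit of ln(C) versus t, whose slope is -k. A hypothetical sketch (the data below are synthetic, not the paper's measurements):

```python
import math

def first_order_rate_constant(times, concentrations):
    """Least-squares slope of ln(C) versus t. For first-order kinetics
    C(t) = C0 * exp(-k*t), ln(C) is linear in t with slope -k."""
    ys = [math.log(c) for c in concentrations]
    n = len(times)
    mx = sum(times) / n
    my = sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(times, ys))
             / sum((x - mx) ** 2 for x in times))
    return -slope

# Synthetic concentrations generated with k = 0.3 for the demo
ts = [0.0, 1.0, 2.0, 3.0, 4.0]
cs = [math.exp(-0.3 * t) for t in ts]
k = first_order_rate_constant(ts, cs)
```

In a live system the same fit would simply be re-run over the growing buffer of acquired (t, C) pairs to update k in real time.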
2010-07-15
[Extraction-garbled abstract: recoverable fragments mention mathematical-morphology operations for extracting information from images, and property tables for composite materials based on LaB6 and diborides of transition metals (VB2, CrB2).]
Oh, Se An; Park, Jae Won; Yea, Ji Woon; Kim, Sung Kyu
2017-01-01
The objective of this study was to evaluate the setup discrepancy between BrainLAB 6 degree-of-freedom (6D) ExacTrac and cone-beam computed tomography (CBCT) used with the imaging guidance system Novalis Tx for intracranial stereotactic radiosurgery. We included 107 consecutive patients for whom white stereotactic head frame masks (R408; Clarity Medical Products, Newark, OH) were used to fix the head during intracranial stereotactic radiosurgery, between August 2012 and July 2016. The patients were immobilized in the same state for both the verification image using 6D ExacTrac and online 3D CBCT. In addition, after radiation treatment, registration between the computed tomography simulation images and the CBCT images was performed with offline 6D fusion in an offline review. The root-mean-square of the difference in the translational dimensions between the ExacTrac system and CBCT was <1.01 mm for online matching and <1.10 mm for offline matching. Furthermore, the root-mean-square of the difference in the rotational dimensions between the ExacTrac system and the CBCT was <0.82° for online matching and <0.95° for offline matching. It was concluded that while the discrepancies in residual setup errors between the ExacTrac 6D X-ray and the CBCT were minor, they should not be ignored.
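The root-mean-square figures quoted above follow the standard definition, the square root of the mean of the squared per-patient differences. As a hypothetical illustration (the difference values below are invented, not the study's data):

```python
import math

def rms(differences):
    """Root-mean-square of a list of per-patient setup differences."""
    return math.sqrt(sum(d * d for d in differences) / len(differences))

# Invented translational differences (mm) between two imaging systems
diffs_mm = [0.4, -0.6, 0.9, -1.1, 0.2]
rms_mm = rms(diffs_mm)
```

Because each term is squared, the RMS treats positive and negative setup offsets symmetrically, which is why it is the usual summary for registration discrepancies.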
The Business Education Lab and Local Area Networking for Curriculum Improvement.
ERIC Educational Resources Information Center
Seals, Georgina; And Others
This guide explains how to incorporate a local area network (LAN) into the business education curriculum. The first section defines a LAN: a communications system that links computers and other peripherals within an office or throughout nearby buildings, shares multiuser software, and sends and/or receives information. Curriculum planning…
Computer Based Data Acquisition in the Undergraduate Lab.
ERIC Educational Resources Information Center
Wepfer, William J.; Oehmke, Roger L. T.
1987-01-01
Describes a data acquisition system developed for an undergraduate engineering students' instructional laboratory at Georgia Tech. Special emphasis is placed on the design of an A/D Converter Board used to measure the viscosity and temperature of motor oil. The Simons' BASIC Program Listing for the Commodore 64 microcomputer is appended. (LRW)
Swipe In, Tap Out: Advancing Student Entrepreneurship in the CIS Sandbox
ERIC Educational Resources Information Center
Charlebois, Conner; Hentschel, Nicholas; Frydenberg, Mark
2014-01-01
The Computer Information Systems Learning and Technology Sandbox (CIS Sandbox) opened as a collaborative learning lab during the fall 2011 semester at a New England business university. The facility employs 24 student workers, who, in addition to providing core tutoring services, are encouraged to explore new technologies and take on special…
ERIC Educational Resources Information Center
Bradford, Jane T.; And Others
1996-01-01
Academic Computing Services staff and University librarians at Stetson University (DeLand, Florida) designed and implemented a three-day Internet workshop for interested faculty. The workshop included both hands-on lab sessions and discussions covering e-mail, telnet, ftp, Gopher, and World Wide Web. The planning, preparation of the lab and…
Making the Switch to Open Source Software
ERIC Educational Resources Information Center
Surran, Michael
2003-01-01
During the 2001-2002 school year, the author was struck by the reality that their computer lab would not meet the demands of their school for another year. Greater Houlton Christian Academy (www.ghca.com) is a private school in Maine, and thus does not have access to state or federal funding. This meant that financing a new computer lab would be…
Simulated Exercise Physiology Laboratories.
ERIC Educational Resources Information Center
Morrow, James R., Jr.; Pivarnik, James M.
This book consists of a lab manual and computer disks for either Apple or IBM hardware. The lab manual serves as "tour guide" for the learner going through the various lab experiences. The manual contains definitions, proper terminology, and other basic information about physiological principles. It is organized so a step-by-step procedure may be…
A Computer Engineering Curriculum for the Air Force Academy: An Implementation Plan
1985-04-01
[Extraction-garbled text: recoverable fragments include a question about whether computer engineering is needed as a result of the findings, the study's recommendation to pursue the Electrical Engineering degree, and a course schedule listing labs on a stepper motor, serial I/O, and the 8251 chip, ending with a final exam.]
Auto-tuning for NMR probe using LabVIEW
NASA Astrophysics Data System (ADS)
Quen, Carmen; Pham, Stephanie; Bernal, Oscar
2014-03-01
The typical manual NMR-tuning method is not suitable for broadband spectra spanning several megahertz in linewidth. Among the main problems encountered during manual tuning are pulse-power reproducibility, baselines, and transmission-line reflections, to name a few. We present a design of an auto-tuning system using the graphical programming language LabVIEW to minimize these problems. The program uses a simplified model of the NMR probe conditions near perfect tuning to mimic the tuning process and predict the position of the capacitor shafts needed to achieve the desired impedance. The tuning capacitors of the probe are controlled by stepper motors through a LabVIEW/computer interface. Our program calculates the effective capacitance needed to tune the probe and provides controlling parameters to advance the motors in the right direction. The impedance reading of a network analyzer can be used to correct the model parameters in real time for feedback control.
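The abstract does not disclose its probe model; for the simplest possible stand-in, a series LC resonator, the capacitance that tunes the circuit to a target frequency follows from the resonance condition f = 1/(2π√(LC)), i.e. C = 1/((2πf)²L). A hedged sketch with hypothetical component values:

```python
import math

def tuning_capacitance(freq_hz, inductance_h):
    """Capacitance that resonates a series LC circuit at freq_hz:
    C = 1 / ((2*pi*f)**2 * L)."""
    w = 2.0 * math.pi * freq_hz
    return 1.0 / (w * w * inductance_h)

# Invented values: a 1 uH probe coil tuned to 100 MHz
c_farads = tuning_capacitance(100e6, 1e-6)
```

A real auto-tuner would then convert the capacitance change into stepper-motor steps via the capacitor's calibration curve and close the loop on the network-analyzer reading, as the abstract describes.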
Li, Jun; Shi, Wenyin; Andrews, David; Werner-Wasik, Maria; Lu, Bo; Yu, Yan; Dicker, Adam; Liu, Haisong
2017-06-01
The study was aimed to compare online 6 degree-of-freedom image registrations of TrueBeam cone-beam computed tomography and BrainLab ExacTrac X-ray imaging systems for intracranial radiosurgery. Phantom and patient studies were performed on a Varian TrueBeam STx linear accelerator (version 2.5), which is integrated with a BrainLab ExacTrac imaging system (version 6.1.1). The phantom study was based on a Rando head phantom and was designed to evaluate isocenter location dependence of the image registrations. Ten isocenters at various locations representing clinical treatment sites were selected in the phantom. Cone-beam computed tomography and ExacTrac X-ray images were taken when the phantom was located at each isocenter. The patient study included 34 patients. Cone-beam computed tomography and ExacTrac X-ray images were taken at each patient's treatment position. The 6 degree-of-freedom image registrations were performed on cone-beam computed tomography and ExacTrac, and residual errors calculated from cone-beam computed tomography and ExacTrac were compared. In the phantom study, the average residual error differences (absolute values) between cone-beam computed tomography and ExacTrac image registrations were 0.17 ± 0.11 mm, 0.36 ± 0.20 mm, and 0.25 ± 0.11 mm in the vertical, longitudinal, and lateral directions, respectively. The average residual error differences in the rotation, roll, and pitch were 0.34° ± 0.08°, 0.13° ± 0.09°, and 0.12° ± 0.10°, respectively. In the patient study, the average residual error differences in the vertical, longitudinal, and lateral directions were 0.20 ± 0.16 mm, 0.30 ± 0.18 mm, 0.21 ± 0.18 mm, respectively. The average residual error differences in the rotation, roll, and pitch were 0.40°± 0.16°, 0.17° ± 0.13°, and 0.20° ± 0.14°, respectively. Overall, the average residual error differences were <0.4 mm in the translational directions and <0.5° in the rotational directions. 
ExacTrac X-ray image registration is comparable to TrueBeam cone-beam computed tomography image registration in intracranial treatments.
MatLab Script and Functional Programming
NASA Technical Reports Server (NTRS)
Shaykhian, Gholam Ali
2007-01-01
MatLab Script and Functional Programming: MatLab is one of the most widely used very high level programming languages for scientific and engineering computations. It is very user-friendly and needs practically no formal programming knowledge. Presented here are MatLab programming aspects and not just the MatLab commands for scientists and engineers who do not have formal programming training and also have no significant time to spare for learning programming to solve their real world problems. Specifically provided are programs for visualization. The MatLab seminar covers the functional and script programming aspect of MatLab language. Specific expectations are: a) Recognize MatLab commands, script and function. b) Create, and run a MatLab function. c) Read, recognize, and describe MatLab syntax. d) Recognize decisions, loops and matrix operators. e) Evaluate scope among multiple files, and multiple functions within a file. f) Declare, define and use scalar variables, vectors and matrices.
MatLab Programming for Engineers Having No Formal Programming Knowledge
NASA Technical Reports Server (NTRS)
Shaykhian, Linda H.; Shaykhian, Gholam Ali
2007-01-01
MatLab is one of the most widely used very high level programming languages for Scientific and engineering computations. It is very user-friendly and needs practically no formal programming knowledge. Presented here are MatLab programming aspects and not just the MatLab commands for scientists and engineers who do not have formal programming training and also have no significant time to spare for learning programming to solve their real world problems. Specifically provided are programs for visualization. Also, stated are the current limitations of the MatLab, which possibly can be taken care of by Mathworks Inc. in a future version to make MatLab more versatile.
Monitoring of the electrical parameters in off-grid solar power system
NASA Astrophysics Data System (ADS)
Idzkowski, Adam; Leoniuk, Katarzyna; Walendziuk, Wojciech
2016-09-01
The aim of this work was to build a monitoring system dedicated to an off-grid installation. A laboratory set-up built for that purpose was equipped with a PV panel, a battery, a charge controller, and a load. The electrical parameters of the installation were monitored using a LabJack module (data acquisition card), a self-built measuring module, and a computer running a program that measures and presents the off-grid installation's parameters. The program was written in the G language using LabVIEW software. The designed system enables analysis of the currents and voltages of the PV panel, battery, and load; it can also visualize them on charts and generate reports from the registered data. The monitoring system was verified both in a laboratory test and under real conditions, and the results of this verification are presented.
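The core arithmetic such a monitoring program performs on sampled voltage/current pairs can be sketched briefly. This is not the authors' LabVIEW code; the sample values and the function name are illustrative.

```python
# Hedged sketch: per-sample power and accumulated energy from (V, I) pairs,
# as an off-grid PV monitoring program might compute them.

def power_and_energy(samples, dt_s):
    """samples: list of (voltage_V, current_A) pairs taken every dt_s seconds.
    Returns per-sample power in W and total energy in Wh."""
    powers = [v * i for v, i in samples]
    energy_wh = sum(powers) * dt_s / 3600.0   # W*s -> Wh
    return powers, energy_wh

samples = [(17.8, 2.1), (17.5, 2.0), (17.9, 2.2)]   # invented readings
powers, energy = power_and_energy(samples, dt_s=1.0)
```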
Assessment Outcomes: Computerized Instruction in a Human Gross Anatomy Course.
ERIC Educational Resources Information Center
Bukowski, Elaine L.
2002-01-01
The first of three successive classes of beginning physical therapy students (n=17) completed a traditional cadaver anatomy lecture/lab; the next class of 17 completed a self-study computerized anatomy lab, and the next class of 20 completed both lectures and the computer lab. No differences in study times or in course or licensure exam performance appeared. Computerized self-study is a…
A remote laboratory for USRP-based software defined radio
NASA Astrophysics Data System (ADS)
Gandhinagar Ekanthappa, Rudresh; Escobar, Rodrigo; Matevossian, Achot; Akopian, David
2014-02-01
Electrical and computer engineering graduates need practical working skills with real-world electronic devices, which are addressed to some extent by hands-on laboratories. The deployment capacity of hands-on laboratories is typically constrained by insufficient equipment availability, facility shortages, and a lack of human resources for in-class support and maintenance. At the same time, at many sites, existing experimental systems are usually underutilized due to class scheduling bottlenecks. Nowadays, online education is gaining popularity, and remote laboratories have been suggested to broaden access to experimentation resources. Remote laboratories resolve many of these problems, as various costs can be shared and student access to instrumentation is facilitated in terms of access times and locations. Labs are converted to homework assignments that can be done without physical presence in the laboratory. Even though they do not provide the full sense of hands-on experimentation, remote labs are a viable alternative for underserved educational sites. This paper studies the remote modality of USRP-based radio-communication labs offered by National Instruments (NI). The labs are offered to graduate and undergraduate students, and tentative assessments support the feasibility of remote deployments.
Report to the Institutional Computing Executive Group (ICEG) August 14, 2006
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carnes, B
We have delayed this report from its normal distribution schedule for two reasons. First, due to the coverage provided in the White Paper on Institutional Capability Computing Requirements distributed in August 2005, we felt a separate 2005 ICEG report would not be value added. Second, we wished to provide some specific information about the Peloton procurement and we have just now reached a point in the process where we can make some definitive statements. The Peloton procurement will result in an almost complete replacement of current M&IC systems. We have plans to retire MCR, iLX, and GPS. We will replace them with new parallel and serial capacity systems based on the same node architecture in the new Peloton capability system named ATLAS. We are currently adding the first users to the Green Data Oasis, a large file system on the open network that will provide the institution with external collaboration data sharing. Only Thunder will remain from the current M&IC system list and it will be converted from Capability to Capacity. We are confident that we are entering a challenging yet rewarding new phase for the M&IC program. Institutional computing has been an essential component of our S&T investment strategy and has helped us achieve recognition in many scientific and technical forums. Through consistent institutional investments, M&IC has grown into a powerful unclassified computing resource that is being used across the Lab to push the limits of computing and its application to simulation science. With the addition of Peloton, the Laboratory will significantly increase the broad-based computing resources available to meet the ever-increasing demand for the large scale simulations indispensable to advancing all scientific disciplines. All Lab research efforts are bolstered through the long term development of mission driven scalable applications and platforms. The new systems will soon be fully utilized and will position Livermore to extend the outstanding science and technology breakthroughs the M&IC program has enabled to date.
NASA Technical Reports Server (NTRS)
Sen, Syamal K.; Shaykhian, Gholam Ali
2011-01-01
MatLab(R) (MATrix LABoratory) is a numerical computation and simulation tool used by thousands of scientists and engineers in many countries. MatLab does purely numerical calculations and can be used as a glorified calculator or an interpreted programming language; its real strength is in matrix manipulations. Computer algebra functionality is achieved within the MatLab environment using the "symbolic" toolbox. This feature is similar to computer algebra programs, such as Maple or Mathematica, which calculate with mathematical equations using symbolic operations. MatLab in its interpreted-language form (command interface) is similar to well-known programming languages such as C/C++, and supports data structures and cell arrays for defining classes in object-oriented programming. As such, MatLab is equipped with most of the essential constructs of a higher programming language. MatLab is packaged with an editor and debugging functionality useful for analyzing large MatLab programs and finding errors. We believe there are many ways to approach real-world problems; prescribed methods that ensure the foregoing solutions are incorporated into the design and analysis of data processing and visualization can benefit engineers and scientists in gaining wider insight into the actual implementation of their respective experiments. This presentation will focus on the data-processing and visualization aspects of engineering and scientific applications. Specifically, it will discuss methods and techniques for intermediate-level data processing covering engineering and scientific problems. MatLab programming techniques will be discussed, including reading various data file formats to produce customized publication-quality graphics, importing engineering and/or scientific data, organizing data in tabular format, exporting data for use by other software programs such as Microsoft Excel, and data presentation and visualization.
The presentation will emphasize creating practical scripts (programs) that extend the basic features of MatLab. Topics include: (1) matrix and vector analysis and manipulation; (2) mathematical functions; (3) symbolic calculations and functions; (4) importing and exporting data files; (5) program logic and flow control; (6) writing functions and passing parameters; (7) test application programs.
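The import/organize/export workflow described (reading a data file, arranging it in tabular form, exporting a summary for another program) can be sketched compactly. This is a Python illustration rather than MatLab code, and the file contents are invented.

```python
# Hedged sketch of the data-processing pipeline: import a delimited data
# file, organize it as a table, and export a summary row (CSV, which
# spreadsheet programs such as Excel can open).
import csv
import io

raw = "t_s,temp_C\n0,20.1\n1,20.4\n2,20.9\n"      # stand-in for a data file
rows = list(csv.DictReader(io.StringIO(raw)))     # import
temps = [float(r["temp_C"]) for r in rows]        # organize one column

out = io.StringIO()                               # export a summary table
writer = csv.writer(out)
writer.writerow(["n_samples", "mean_temp_C"])
writer.writerow([len(temps), sum(temps) / len(temps)])
summary_csv = out.getvalue()
```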
NASA Astrophysics Data System (ADS)
Orłowska-Szostak, Maria; Orłowski, Ryszard
2017-11-01
The paper discusses relevant aspects of calibrating a computer model describing flows in a water supply system. The authors describe an exemplary water supply system and use it as a practical illustration of calibration. A range of measures is discussed and applied that improve the convergence and effective use of calculations in the calibration process, and thereby the validity of the results obtained. To process the measurement results, i.e., to estimate pipe roughnesses, the authors used a genetic algorithm implemented in software developed by the Resan Labs company of Brazil.
Cone-beam micro-CT system based on LabVIEW software.
Ionita, Ciprian N; Hoffmann, Kenneth R; Bednarek, Daniel R; Chityala, Ravishankar; Rudin, Stephen
2008-09-01
Construction of a cone-beam computed tomography (CBCT) system for laboratory research usually requires integration of different software and hardware components. As a result, building and operating such a complex system require the expertise of researchers with significantly different backgrounds. Additionally, writing flexible code to control the hardware components of a CBCT system combined with designing a friendly graphical user interface (GUI) can be cumbersome and time-consuming. An intuitive and flexible program structure, as well as the program GUI for CBCT acquisition, is presented in this note. The program was developed in National Instruments' Laboratory Virtual Instrumentation Engineering Workbench (LabVIEW) graphical language and is designed to control a custom-built CBCT system but has also been used in a standard angiographic suite. The hardware components are commercially available to researchers and are in general provided with software drivers which are LabVIEW compatible. The program structure was designed as a sequential chain. Each step in the chain takes care of one or two hardware commands at a time; the execution of the sequence can be modified according to the CBCT system design. We have scanned and reconstructed over 200 specimens using this interface and present three examples which cover different areas of interest encountered in laboratory research. The resulting 3D data are rendered using a commercial workstation. The program described in this paper is available for use or improvement by other researchers.
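The "sequential chain" structure described, where each step issues one or two hardware commands and the sequence can be reordered per system design, can be sketched in pseudocode form. The sketch below is Python, not the authors' LabVIEW program, and the hardware calls are stand-ins with no real drivers behind them.

```python
# Hedged sketch of a sequential-chain acquisition loop: per view, move the
# rotation stage, then expose and read out a frame. All hardware functions
# here are placeholders for driver calls.

def rotate_stage(angle_deg):
    """Placeholder for a motor-controller command."""
    return f"stage@{angle_deg}"

def expose_and_grab(angle_deg):
    """Placeholder for the X-ray trigger + camera readout commands."""
    return {"angle": angle_deg, "frame": b"..."}

def acquire_cbct(n_views, arc_deg=360.0):
    """Run the chain once per view over the scan arc."""
    step = arc_deg / n_views
    frames = []
    for k in range(n_views):
        rotate_stage(k * step)               # chain step 1: motion
        frames.append(expose_and_grab(k * step))  # chain step 2: exposure
    return frames

frames = acquire_cbct(n_views=4)   # views at 0, 90, 180, 270 degrees
```

Reordering or inserting steps in the chain (e.g., a flat-field grab before the loop) amounts to editing this one sequence, which is the flexibility the note emphasizes.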
None
2018-01-16
Take a virtual tour of the campus of Thomas Jefferson National Accelerator Facility. You can see inside our two accelerators, three experimental areas, accelerator component fabrication and testing areas, high-performance computing areas and laser labs.
Simulations, Games, and Virtual Labs for Science Education: a Compendium and Some Examples
NASA Astrophysics Data System (ADS)
Russell, R. M.
2011-12-01
We have assembled a list of computer-based simulations, games, and virtual labs for science education. This list, with links to the sources of these resources, is available online. The entries span a broad range of science, math, and engineering topics. They also span a range of target student ages, from elementary school to university students. We will provide a brief overview of this web site and the resources found on it. We will also briefly demonstrate some of our own educational simulations, including the "Very, Very Simple Climate Model", and report on formative evaluations of these resources. Computer-based simulations and virtual labs are valuable resources for science educators in various settings, allowing learners to experiment and explore "what if" scenarios. Educational computer games can motivate learners in both formal and informal settings, encouraging them to spend much more time exploring a topic than they might otherwise be inclined to do. Part of this presentation is effectively a "literature review" of numerous sources of simulations, games, and virtual labs. Although we have encountered several nice collections of such resources, those collections seem to be restricted in scope. They either represent materials developed by a specific group or agency (e.g. NOAA's games web site) or are restricted to a specific discipline (e.g. geology simulations and virtual labs). This presentation directs viewers to games, simulations, and virtual labs from many different sources and spanning a broad range of STEM disciplines.
Inexpensive DAQ based physics labs
NASA Astrophysics Data System (ADS)
Lewis, Benjamin; Clark, Shane
2015-11-01
Quality Data Acquisition (DAQ) based physics labs can be designed using microcontrollers and very low cost sensors with minimal lab equipment. A prototype device with several sensors and documentation for a number of DAQ-based labs is showcased. The device connects to a computer through Bluetooth and uses a simple interface to control the DAQ and display real time graphs, storing the data in .txt and .xls formats. A full device including a larger number of sensors combined with software interface and detailed documentation would provide a high quality physics lab education for minimal cost, for instance in high schools lacking lab equipment or students taking online classes. An entire semester’s lab course could be conducted using a single device with a manufacturing cost of under $20.
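The host-side half of such a device, receiving sensor readings over a serial link and storing them as tab-separated text, can be sketched briefly. This is an illustration, not the showcased device's software; the line format (`t=...,ch0=...,ch1=...`) is an assumption.

```python
# Hedged sketch: parse raw DAQ lines (as a Bluetooth-serial microcontroller
# might emit them) into rows of a tab-separated .txt log.

def parse_line(line):
    """'t=0.10,ch0=1.23,ch1=4.56' -> (0.10, [1.23, 4.56])"""
    fields = dict(kv.split("=") for kv in line.strip().split(","))
    t = float(fields.pop("t"))
    values = [float(fields[k]) for k in sorted(fields)]  # ch0, ch1, ...
    return t, values

def to_txt(lines):
    """Format parsed samples as tab-separated rows, ready to write to .txt."""
    rows = [parse_line(ln) for ln in lines]
    return "\n".join("\t".join([f"{t:.2f}"] + [f"{v:.2f}" for v in vs])
                     for t, vs in rows)

log = to_txt(["t=0.10,ch0=1.23,ch1=4.56",
              "t=0.20,ch0=1.30,ch1=4.50"])
```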
NASA Astrophysics Data System (ADS)
Oien, R. P.; Anders, A. M.; Long, A.
2014-12-01
We present the initial results of transitioning laboratory activities in an introductory physical geology course from passive to active learning. Educational research demonstrates that student-driven investigations promote increased engagement and better retention of material. Surveys of students in introductory physical geology helped us identify lab activities which do not engage students. We designed new lab activities to be more collaborative, open-ended and "hands-on". Student feedback was most negative for lab activities which are computer-based. In response, we have removed computers from the lab space and increased the length and number of activities involving physical manipulation of samples and models. These changes required investment in lab equipment and supplies. New lab activities also include student-driven exploration of data with open-ended responses. Student-evaluations of the new lab activities will be compiled during Fall 2014 and Spring 2015 to allow us to measure the impact of the changes on student satisfaction and we will report on our findings to date. Modification of this course has been sponsored by NSF's Widening Implementation & Demonstration of Evidence Based Reforms (WIDER) program through grant #1347722 to the University of Illinois. The overall goal of the grant is to increase retention and satisfaction of STEM students in introductory courses.
New Software Architecture Options for the TCL Data Acquisition System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Valenton, Emmanuel
2014-09-01
The Turbulent Combustion Laboratory (TCL) conducts research on combustion in turbulent flow environments. To conduct this research, the TCL utilizes several pulse lasers, a traversable wind tunnel, flow controllers, scientific-grade CCD cameras, and numerous other components. The Data Acquisition (DAQ) software is responsible for managing these data-acquiring instruments and data-processing components. However, the current system is constrained to running through VXI hardware (an instrument-computer interface) that is several years old, requiring the use of an outdated version of the visual programming language LabVIEW. A new acquisition system is being programmed which will borrow heavily from either a programming model known as the Current Value Table (CVT) System or another model known as the Server-Client System. The CVT System model is, in essence, a giant spreadsheet from which data or commands may be retrieved or to which they may be written; the Server-Client System is based on network connections between a server and a client, much like the server-client model of the Internet. Currently, the bare elements of a CVT DAQ software have been implemented, consisting of client programs in addition to a server program that the CVT runs on. This system is being rigorously tested to evaluate the merits of pursuing the CVT System model and to uncover any potential flaws before further implementation. If the CVT System is chosen, which is likely, then future work will consist of building up the system until enough client programs have been created to run the individual components of the lab. The advantages of such a system will be flexibility, portability, and polymorphism. Additionally, the new DAQ software will allow the lab to replace the VXI with a newer instrument interface, the PXI, and take advantage of the capabilities of current and future versions of LabVIEW.
Data Visualization and Animation Lab (DVAL) overview
NASA Technical Reports Server (NTRS)
Stacy, Kathy; Vonofenheim, Bill
1994-01-01
The general capabilities of the Langley Research Center Data Visualization and Animation Laboratory are described. These capabilities include digital image processing, 3-D interactive computer graphics, data visualization and analysis, video-rate acquisition and processing of video images, photo-realistic modeling and animation, video report generation, and color hardcopies. A specialized video image processing system is also discussed.
ERIC Educational Resources Information Center
Smith, Peter, Ed.
This 2000 Association of Small Computer Users in Education (ASCUE) conference proceedings first highlights keynote speakers and describes the pre-conference workshops. The conference papers and abstracts that follow discuss: strategic planning for faculty, staff, and student development; a network lab; the Blackboard course delivery system;…
ERIC Educational Resources Information Center
Alexiadis, D. S.; Mitianoudis, N.
2013-01-01
Digital signal processing (DSP) has been an integral part of most electrical, electronic, and computer engineering curricula. The applications of DSP in multimedia (audio, image, video) storage, transmission, and analysis are also widely taught at both the undergraduate and post-graduate levels, as digital multimedia can be encountered in most…
ERIC Educational Resources Information Center
Mendez, Sergio; AungYong, Lisa
2014-01-01
To help students make the connection between the concepts of heat conduction and convection and real-world phenomena, we developed a combined experimental and computational module that can be incorporated into lecture or lab courses. The experimental system we present requires materials and apparatus that are readily accessible, and the procedure…
Scientific Visualization, Seeing the Unseeable
LBNL
2017-12-09
June 24, 2008 Berkeley Lab lecture: Scientific visualization transforms abstract data into readily comprehensible images, provides a vehicle for "seeing the unseeable," and plays a central role in both experimental and computational sciences. Wes Bethel, who heads the Scientific Visualization Group in the Computational Research Division, presents an overview of visualization and computer graphics, current research challenges, and future directions for the field.
Pawlik, Aleksandra; van Gelder, Celia W.G.; Nenadic, Aleksandra; Palagi, Patricia M.; Korpelainen, Eija; Lijnzaad, Philip; Marek, Diana; Sansone, Susanna-Assunta; Hancock, John; Goble, Carole
2017-01-01
Quality training in computational skills for life scientists is essential to allow them to deliver robust, reproducible and cutting-edge research. A pan-European bioinformatics programme, ELIXIR, has adopted a well-established and progressive programme of computational lab and data skills training from Software and Data Carpentry, aimed at increasing the number of skilled life scientists and building a sustainable training community in this field. This article describes the Pilot action, which introduced the Carpentry training model to the ELIXIR community. PMID:28781745
ERIC Educational Resources Information Center
Houston, Linda; Johnson, Candice
After much trial and error, the Agricultural Technical Institute of the Ohio State University (ATI/OSO) discovered that training of writing lab tutors can best be done through collaboration of the Writing Lab Coordinator with the "Development of Tutor Effectiveness" course offered at the institute. The ATI/OSO main computer lab and…
ERIC Educational Resources Information Center
Rodriguez, Santiago; Zamorano, Juan; Rosales, Francisco; Dopico, Antonio Garcia; Pedraza, Jose Luis
2007-01-01
This paper describes a complete lab work management framework designed and developed in the authors' department to help teachers to manage the small projects that students are expected to complete as lab assignments during their graduate-level computer engineering studies. The paper focuses on an application example of the framework to a specific…
Systems Engineering Building Advances Power Grid Research
Virden, Jud; Huang, Henry; Skare, Paul; Dagle, Jeff; Imhoff, Carl; Stoustrup, Jakob; Melton, Ron; Stiles, Dennis; Pratt, Rob
2018-01-16
Researchers and industry are now better equipped to tackle the nation's most pressing energy challenges through PNNL's new Systems Engineering Building, including challenges in grid modernization, buildings efficiency and renewable energy integration. This lab links real-time grid data, software platforms, specialized laboratories and advanced computing resources for the design and demonstration of new tools to modernize the grid and increase buildings energy efficiency.
Integrating LabVIEW into a distributed computing environment.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kasemir, K. U.; Pieck, M.; Dalesio, L. R.
2001-01-01
Because it is easy to learn and well suited to a self-contained desktop laboratory setup, many casual programmers prefer to develop their logic in the National Instruments LabVIEW environment. An ActiveX interface is presented that allows integration into a plant-wide distributed environment based on the Experimental Physics and Industrial Control System (EPICS). This paper discusses the design decisions and provides performance information, especially considering requirements for the Spallation Neutron Source (SNS) diagnostics system.
Exploring Space Physics Concepts Using Simulation Results
NASA Astrophysics Data System (ADS)
Gross, N. A.
2008-05-01
The Center for Integrated Space Weather Modeling (CISM), a Science and Technology Center (STC) funded by the National Science Foundation, has the goal of developing a suite of integrated physics-based computer models of the space environment that can follow the evolution of a space weather event from the Sun to the Earth. In addition to the research goals, CISM is also committed to training the next generation of space weather professionals who are imbued with a system view of space weather. This view should include an understanding of both heliospheric and geospace phenomena. To this end, CISM offers a yearly Space Weather Summer School targeted at first-year graduate students, although advanced undergraduates and space weather professionals have also attended. This summer school uses a number of innovative pedagogical techniques, including devoting each afternoon to a computer lab exercise that uses results from research-quality simulations and visualization techniques, along with ground-based and satellite data, to explore concepts introduced during the morning lectures. These labs are suitable for use in a wide variety of educational settings, from formal classroom instruction to outreach programs. The goal of this poster is to outline the goals and content of the lab materials so that instructors may evaluate their potential use in the classroom or other settings.
Polysaccharide production by lactic acid bacteria: from genes to industrial applications.
Zeidan, Ahmad A; Poulsen, Vera Kuzina; Janzen, Thomas; Buldo, Patrizia; Derkx, Patrick M F; Øregaard, Gunnar; Neves, Ana Rute
2017-08-01
The ability to produce polysaccharides with diverse biological functions is widespread in bacteria. In lactic acid bacteria (LAB), production of polysaccharides has long been associated with the technological, functional and health-promoting benefits of these microorganisms. In particular, the capsular polysaccharides and exopolysaccharides have been implicated in modulation of the rheological properties of fermented products. For this reason, screening and selection of exocellular polysaccharide-producing LAB has been extensively carried out by academia and industry. To further exploit the ability of LAB to produce polysaccharides, an in-depth understanding of their biochemistry, genetics, biosynthetic pathways, regulation and structure-function relationships is mandatory. Here, we provide a critical overview of the latest advances in the field of glycosciences in LAB. Surprisingly, the understanding of the molecular processes involved in polysaccharide synthesis is lagging behind, and has not accompanied the increasing commercial value and application potential of these polymers. Seizing the natural diversity of polysaccharides for exciting new applications will require a concerted effort encompassing in-depth physiological characterization of LAB at the systems level. Combining high-throughput experimentation with computational approaches, biochemical and structural characterization of the polysaccharides and understanding of the structure-function-application relationships is essential to achieve this ambitious goal. © FEMS 2017.
Virtual Instrument for Determining Rate Constant of Second-Order Reaction by pX Based on LabVIEW 8.0
Meng, Hu; Li, Jiang-Yuan; Tang, Yong-Huai
2009-01-01
A virtual instrument system for an ion analyzer, based on LabVIEW 8.0, which can measure and analyze ion concentrations in solution, has been developed; it comprises a homemade conditioning circuit, a data-acquisition board, and a computer. It can calibrate slope, temperature, and positioning automatically. When applied to determining the reaction rate constant by pX, it achieved live acquisition, real-time display, and automatic processing of test data, and generated a report of results, among other functions. This method greatly simplifies the experimental operation, avoids the complicated procedures and personal error of manual data processing, and improves the accuracy and repeatability of the experimental results. PMID:19730752
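The calculation behind such a determination can be stated briefly: for a second-order reaction, 1/[A] grows linearly with time (1/[A] = 1/[A]0 + kt), and a pX electrode reading gives [A] = 10^(-pX), so 1/[A] = 10^pX and the slope of 10^pX versus t is k. Below is a hedged Python sketch of that fit (not the LabVIEW program), using a plain least-squares slope on synthetic data.

```python
# Illustrative sketch: rate constant of a second-order reaction from pX
# readings, via a least-squares slope of 1/[A] = 10**pX versus time.
import math

def rate_constant_from_pX(times, pX_values):
    inv_conc = [10.0 ** pX for pX in pX_values]   # [A] = 10**(-pX)
    n = len(times)
    mt = sum(times) / n
    my = sum(inv_conc) / n
    slope = (sum((t - mt) * (y - my) for t, y in zip(times, inv_conc))
             / sum((t - mt) ** 2 for t in times))
    return slope   # k, in 1/(concentration * time) units

# Synthetic data with k = 2.0 and [A]0 = 1.0, so 1/[A] = 1 + 2t exactly.
times = [0.0, 1.0, 2.0, 3.0]
pX = [math.log10(1.0 + 2.0 * t) for t in times]   # pX = -log10[A]
k = rate_constant_from_pX(times, pX)
```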
Cloud computing: a new business paradigm for biomedical information sharing.
Rosenthal, Arnon; Mork, Peter; Li, Maya Hao; Stanford, Jean; Koester, David; Reynolds, Patti
2010-04-01
We examine how the biomedical informatics (BMI) community, especially consortia that share data and applications, can take advantage of a new resource called "cloud computing". Clouds generally offer resources on demand. In most clouds, charges are pay per use, based on large farms of inexpensive, dedicated servers, sometimes supporting parallel computing. Substantial economies of scale potentially yield costs much lower than dedicated laboratory systems or even institutional data centers. Overall, even with conservative assumptions, for applications that are not I/O intensive and do not demand a fully mature environment, the numbers suggested that clouds can sometimes provide major improvements, and should be seriously considered for BMI. Methodologically, it was very advantageous to formulate analyses in terms of component technologies; focusing on these specifics enabled us to bypass the cacophony of alternative definitions (e.g., exactly what does a cloud include) and to analyze alternatives that employ some of the component technologies (e.g., an institution's data center). Relative analyses were another great simplifier. Rather than listing the absolute strengths and weaknesses of cloud-based systems (e.g., for security or data preservation), we focus on the changes from a particular starting point, e.g., individual lab systems. We often find a rough parity (in principle), but one needs to examine individual acquisitions--is a loosely managed lab moving to a well managed cloud, or a tightly managed hospital data center moving to a poorly safeguarded cloud? 2009 Elsevier Inc. All rights reserved.
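The relative-analysis style the paper advocates, comparing a specific starting point against a specific cloud alternative rather than listing absolute strengths, often reduces to a utilization break-even. The toy sketch below is not from the paper; every price in it is an invented placeholder.

```python
# Toy relative-cost comparison: at what annual usage does pay-per-use
# cloud spending equal the cost of a dedicated server? All figures are
# illustrative placeholders, not values from the paper.

def breakeven_hours(dedicated_cost_per_year, cloud_rate_per_hour):
    """Usage hours/year at which the two options cost the same."""
    return dedicated_cost_per_year / cloud_rate_per_hour

hours = breakeven_hours(dedicated_cost_per_year=8760.0,
                        cloud_rate_per_hour=1.0)
# below this utilization, on-demand is cheaper; above it, dedicated wins
```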
Analyzing high energy physics data using database computing: Preliminary report
NASA Technical Reports Server (NTRS)
Baden, Andrew; Day, Chris; Grossman, Robert; Lifka, Dave; Lusk, Ewing; May, Edward; Price, Larry
1991-01-01
A proof of concept system is described for analyzing high energy physics (HEP) data using data base computing. The system is designed to scale up to the size required for HEP experiments at the Superconducting SuperCollider (SSC) lab. These experiments will require collecting and analyzing approximately 10 to 100 million 'events' per year during proton colliding beam collisions. Each 'event' consists of a set of vectors with a total length of approx. one megabyte. This represents an increase of approx. 2 to 3 orders of magnitude in the amount of data accumulated by present HEP experiments. The system is called the HEPDBC System (High Energy Physics Database Computing System). At present, the Mark 0 HEPDBC System is completed, and can produce analysis of HEP experimental data approx. an order of magnitude faster than current production software on data sets of approx. 1 GB. The Mark 1 HEPDBC System is currently undergoing testing and is designed to analyze data sets 10 to 100 times larger.
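A quick back-of-envelope check makes the scale concrete: at roughly one megabyte per event, the quoted 10 to 100 million events per year correspond to 10 to 100 terabytes per year.

```python
# Sanity check on the data volumes quoted in the abstract:
# 10-100 million events/year at ~1 MB per event.
events_low, events_high = 10e6, 100e6   # events per year
event_bytes = 1e6                        # ~1 megabyte per event
tb = 1e12                                # bytes per terabyte

low_tb = events_low * event_bytes / tb    # annual volume, low estimate
high_tb = events_high * event_bytes / tb  # annual volume, high estimate
```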
Summer Series 2012 - Conversation with Kathy Yelick
Yelick, Kathy, Miller, Jeff
2018-05-11
Jeff Miller, head of Public Affairs, sat down in conversation with Kathy Yelick, Associate Berkeley Lab Director, Computing Sciences, in the second of a series of powerpoint-free talks on July 18th 2012, at Berkeley Lab.
Summer Series 2012 - Conversation with Kathy Yelick
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yelick, Kathy, Miller, Jeff
2012-07-23
Jeff Miller, head of Public Affairs, sat down in conversation with Kathy Yelick, Associate Berkeley Lab Director, Computing Sciences, in the second of a series of powerpoint-free talks on July 18th 2012, at Berkeley Lab.
Computer-based Astronomy Labs for Non-science Majors
NASA Astrophysics Data System (ADS)
Smith, A. B. E.; Murray, S. D.; Ward, R. A.
1998-12-01
We describe and demonstrate two laboratory exercises, Kepler's Third Law and Stellar Structure, which are being developed for use in an astronomy laboratory class aimed at non-science majors. The labs run with Microsoft's Excel 98 (Macintosh) or Excel 97 (Windows). They can be run in a classroom setting or in an independent learning environment. The intent of the labs is twofold: first and foremost, students learn the subject matter through a series of informational frames; next, students enhance their understanding by applying their knowledge in lab procedures, while also gaining familiarity with the use and power of a widely used software package and scientific tool. No mathematical knowledge beyond basic algebra is required to complete the labs or to understand the computations in the spreadsheets, although the students are exposed to the concepts of numerical integration. The labs are contained in Excel workbook files. In the files are multiple spreadsheets, which contain either a frame with information on how to run the lab, material on the subject, or one or more procedures. Excel's VBA macro language is used to automate the labs. The macros are accessed through button interfaces positioned on the spreadsheets. This is done intentionally so that students can focus on learning the subject matter and the basic spreadsheet features without having to learn advanced Excel features all at once. Students open the file and progress through the informational frames to the procedures. After each procedure, student comments and data are automatically recorded in a preformatted Lab Report spreadsheet. Once all procedures have been completed, the student is prompted for a filename in which to save their Lab Report. The lab reports can then be printed or emailed to the instructor. The files will have full worksheet and workbook protection, and will have a "redo" feature at the end of the lab for students who want to repeat a procedure.
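The core computation of the Kepler's Third Law lab fits in a single formula: for bodies orbiting the Sun, P^2 = a^3 with P in years and a in astronomical units. A minimal sketch (in Python rather than the lab's Excel/VBA):

```python
# Kepler's third law for solar orbits: P [years] = a [AU] ** 1.5.
# Sketch of the computation the spreadsheet lab automates.

def orbital_period_years(a_au):
    """Orbital period in years from semi-major axis in AU (Sun-centered)."""
    return a_au ** 1.5

p_earth = orbital_period_years(1.0)    # 1.0 year, by definition of the units
p_mars = orbital_period_years(1.524)   # roughly 1.88 years
```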
Optical sensors for electrical elements of a medium voltage distribution network
NASA Astrophysics Data System (ADS)
De Maria, Letizia; Bartalesi, Daniele; Serragli, Paolo; Paladino, Domenico
2012-04-01
The aging of most of the components of the National transmission and distribution system can potentially influence the reliability of power supply in a Medium Voltage (MV) network. In order to prevent possible dangerous situations, selected diagnostic indicators on electrical parts exploiting reliable and potentially low-cost sensors are required. This paper presents results concerning two main research activities regarding the development and application of innovative optical sensors for the diagnostics of MV electrical components. The first concerns a multi-sensor prototype for the detection of pre-discharges in MV switchboards: it is the combination of three different types of sensors operating simultaneously to detect incipient failure and to reduce the occurrence of false alarms. The system is controlled in real time by an embedded computer through a LabView interface. The second activity refers to a diagnostic tool to provide significant real-time information about the early aging of MV/Low Voltage (LV) transformers by means of their vibration fingerprint. A miniaturized Optical Micro-Electro-Mechanical System (MEMS) based unit has been assembled for vibration measurements, wirelessly connected to a remote computer and controlled via a LabView interface. Preliminary comparative tests were carried out with standard piezoelectric accelerometers on a conventional MV/LV test transformer in open-circuit and short-circuit configurations.
A LabVIEW model incorporating an open-loop arterial impedance and a closed-loop circulatory system.
Cole, R T; Lucas, C L; Cascio, W E; Johnson, T A
2005-11-01
While numerous computer models exist for the circulatory system, many are limited in scope, contain unwanted features or incorporate complex components specific to unique experimental situations. Our purpose was to develop a basic, yet multifaceted, computer model of the left heart and systemic circulation in LabVIEW having universal appeal without sacrificing crucial physiologic features. The program we developed employs Windkessel-type impedance models in several open-loop configurations and a closed-loop model coupling a lumped impedance and ventricular pressure source. The open-loop impedance models demonstrate afterload effects on arbitrary aortic pressure/flow inputs. The closed-loop model catalogs the major circulatory waveforms with changes in afterload, preload, and left heart properties. Our model provides an avenue for expanding the use of the ventricular equations through closed-loop coupling that includes a basic coronary circuit. Tested values used for the afterload components and the effects of afterload parameter changes on various waveforms are consistent with published data. We conclude that this model offers the ability to alter several circulatory factors and digitally catalog the most salient features of the pressure/flow waveforms employing a user-friendly platform. These features make the model a useful instructional tool for students as well as a simple experimental tool for cardiovascular research.
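As an illustration of the Windkessel-type impedance models mentioned above, here is a minimal two-element Windkessel integrated with Euler's method — a Python sketch under assumed parameter values, not the authors' LabVIEW implementation:

```python
import math

def windkessel_2(flow, R=1.0, C=1.5, dt=0.001, p0=80.0):
    """Two-element Windkessel: C dP/dt = Q(t) - P/R.
    flow: inflow samples Q (mL/s); returns the pressure trace (mmHg).
    R, C, p0 are illustrative values, not from the paper."""
    p = p0
    trace = []
    for q in flow:
        dp = (q - p / R) / C   # Euler step of the afterload ODE
        p += dp * dt
        trace.append(p)
    return trace

# One 0.8 s cardiac cycle: half-sine ejection for 0.3 s, then zero flow in diastole
dt = 0.001
flow = [300 * math.sin(math.pi * t / 0.3) if (t % 0.8) < 0.3 else 0.0
        for t in (i * dt for i in range(800))]
pressure = windkessel_2(flow, dt=dt)
print(max(pressure), min(pressure))   # systolic and diastolic extremes
```

The closed-loop models in the paper couple such an impedance to a ventricular pressure source; the sketch shows only the open-loop afterload behavior.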
Computer Labs | College of Engineering & Applied Science
Experiment and application of soft x-ray grazing incidence optical scattering phenomena
NASA Astrophysics Data System (ADS)
Chen, Shuyan; Li, Cheng; Zhang, Yang; Su, Liping; Geng, Tao; Li, Kun
2017-08-01
For short-wavelength imaging systems, surface scattering is one of the important factors degrading imaging performance. Studying the non-intuitive surface scatter effects that result from practical optical fabrication tolerances is necessary for evaluating the optical performance of high-resolution short-wavelength imaging systems. In this paper, soft X-ray scattering distributions are measured with a soft X-ray reflectometer installed in our lab, for different sample mirrors, wavelengths, and grazing angles. Then, targeting a space solar telescope, these scattered-light distributions are combined with a numerical surface-scattering model of the grazing incidence imaging system to compute the PSF and encircled energy of the telescope's optical system. From this analysis and computation we conclude that surface scattering severely degrades the imaging performance of grazing incidence systems.
NASA Astrophysics Data System (ADS)
Stroobant, M.; Locritani, M.; Marini, D.; Sabbadini, L.; Carmisciano, C.; Manzella, G.; Magaldi, M.; Aliani, S.
2012-04-01
DLTM is the Ligurian Region (north Italy) cluster of the Centre of Excellence (CoE) in waterborne technologies, involving about 120 enterprises (of which more than 100 are SMEs), the University of Genoa, all the main National Research Centres dealing with maritime and marine technologies established in Liguria (CNR, INGV, ENEA-UTMAR), the NATO Undersea Research Centre (NURC) and the Experimental Centre of the Italian Navy (CSSN), and the Bank, the Port Authority and the Chamber of Commerce of the city of La Spezia. Following its mission, DLTM has recently established three Collaborative Research Laboratories focused on: 1. Computational Fluid Dynamics (CFD_Lab); 2. High Performance Computing (HPC_Lab); 3. Monitoring and Analysis of Marine Ecosystems (MARE_Lab). Their main role is to improve the relationships between the research centres and the enterprises, encouraging a systematic networking approach and the sharing of knowledge, data, services, tools and human resources. Two key objectives of MARE_Lab are the establishment of: an integrated system of observation and sea forecasting; and a Regional Marine Instrument Centre (RMIC) for oceanographic and meteorological instruments (assembled using 'shared' tools and facilities). In addition, an important and innovative research project has recently been submitted to the Italian Ministry for Education, University and Research (MIUR). This project, in agreement with the European Directives (COM2009 (544)), aims to develop a Management Information System (MIS) for oceanographic and meteorological data in the Mediterranean Sea. The availability of adequate HPC inside DLTM is, of course, an important asset for achieving useful results; for example, the Regional Ocean Modeling System (ROMS) model is currently running on a high-resolution mesh on the cluster to simulate and reproduce the circulation within the Ligurian Sea.
ROMS outputs will have broad and multidisciplinary impacts because ocean circulation affects the dispersion of different substances, such as oil spills and other pollutants, as well as sediments, nutrients and larvae. This could be an important tool for environmental preservation, prevention and remediation, laying the groundwork for integrated management of the ocean.
What is Supercomputing? A Conversation with Kathy Yelick
Yelick, Kathy
2017-12-11
In this highlight video, Jeff Miller, head of Public Affairs, sat down in conversation with Kathy Yelick, Associate Berkeley Lab Director, Computing Sciences, in the second of a series of "powerpoint-free" talks on July 18th 2012, at Berkeley Lab.
Speed control for a mobile robot
NASA Astrophysics Data System (ADS)
Kolli, Kaylan C.; Mallikarjun, Sreeram; Kola, Krishnamohan; Hall, Ernest L.
1997-09-01
Automated guided vehicles (AGVs) have many potential applications in manufacturing, medicine, space and defense. The purpose of this paper is to describe exploratory research on the design of a speed control for a modular autonomous mobile robot controller. The speed control of the traction motor is essential for safe operation of a mobile robot; autonomous operation of a vehicle must be safe, runaway-free, and collision-free. A mobile robot test-bed has been constructed using a golf cart base. The computer-controlled speed control has been implemented and works with guidance provided by a vision system and obstacle avoidance provided by ultrasonic sensor systems. A 486 computer supervises the speed control through a 3-axis motion controller. The traction motor is controlled via the computer by an EV-1 speed control. Testing of the system was done both in the lab and on an outside course with positive results. This design is a prototype, and suggestions for improvements are also given. The autonomous speed controller is applicable to any computer-controlled electric-drive mobile vehicle.
Integration of High-Performance Computing into Cloud Computing Services
NASA Astrophysics Data System (ADS)
Vouk, Mladen A.; Sills, Eric; Dreher, Patrick
High-Performance Computing (HPC) projects span a spectrum of computer hardware implementations ranging from peta-flop supercomputers, high-end tera-flop facilities running a variety of operating systems and applications, to mid-range and smaller computational clusters used for HPC application development, pilot runs and prototype staging clusters. What they all have in common is that they operate as a stand-alone system rather than a scalable and shared user re-configurable resource. The advent of cloud computing has changed the traditional HPC implementation. In this article, we will discuss a very successful production-level architecture and policy framework for supporting HPC services within a more general cloud computing infrastructure. This integrated environment, called Virtual Computing Lab (VCL), has been operating at NC State since fall 2004. Nearly 8,500,000 HPC CPU-Hrs were delivered by this environment to NC State faculty and students during 2009. In addition, we present and discuss operational data that show that integration of HPC and non-HPC (or general VCL) services in a cloud can substantially reduce the cost of delivering cloud services (down to cents per CPU hour).
Integration of the HTC Vive into the medical platform MeVisLab
NASA Astrophysics Data System (ADS)
Egger, Jan; Gall, Markus; Wallner, Jürgen; de Almeida Germano Boechat, Pedro; Hann, Alexander; Li, Xing; Chen, Xiaojun; Schmalstieg, Dieter
2017-03-01
Virtual Reality (VR) is an immersive technology that replicates an environment via computer-simulated reality. VR gets a lot of attention in computer games but also has great potential in other areas, like the medical domain. Examples are planning, simulations and training of medical interventions, like facial surgeries where an aesthetic outcome is important. However, importing medical data into VR devices is not trivial, especially when a direct connection and visualization from your own application is needed. Furthermore, most researchers don't build their medical applications from scratch; rather, they use platforms like MeVisLab, Slicer or MITK. The platforms have in common that they integrate and build upon libraries like ITK and VTK, further providing a more convenient graphical interface to them for the user. In this contribution, we demonstrate the usage of a VR device for medical data under MeVisLab. To this end, we integrated the OpenVR library into MeVisLab as a module of its own. This enables the direct and uncomplicated usage of head-mounted displays, like the HTC Vive, under MeVisLab. Summarized, medical data from other MeVisLab modules can be connected directly per drag-and-drop to our VR module and will be rendered inside the HTC Vive for an immersive inspection.
Zhao, Li; Xing, Xiao; Guo, Xuhong; Liu, Zehua; He, Yang
2014-10-01
A brain-computer interface (BCI) system enables communication with, and control of, computers and other electronic equipment through electroencephalogram (EEG) signals. This paper describes the working theory of a wireless smart home system based on BCI technology. We first elicited the steady-state visual evoked potential (SSVEP) using a single-chip microcomputer and LED-based visual stimulation of the eyes. Then, by building a power-spectrum transformation on the LabVIEW platform, we processed the EEG signals elicited under different stimulation frequencies in a timely manner and translated them into different instructions. These instructions were received by wireless transceiver equipment to control household appliances and achieve intelligent control of the specified devices. The experimental results showed that the accuracy for the 10 subjects reached 100%, and the average control time for a single device was 4 seconds; thus this design fully achieves the original purpose of a smart home system.
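The power-spectrum step that maps SSVEP responses to commands can be sketched as follows — a Python illustration of the general technique (single-bin DFT power at each candidate stimulus frequency), not the authors' LabVIEW code; the signal parameters are assumed:

```python
import math

def ssvep_classify(signal, fs, candidates):
    """Return the candidate stimulus frequency with the most spectral power."""
    def power_at(f):
        # single-bin DFT at frequency f (Hz), sampling rate fs (Hz)
        re = sum(x * math.cos(2 * math.pi * f * n / fs) for n, x in enumerate(signal))
        im = sum(x * math.sin(2 * math.pi * f * n / fs) for n, x in enumerate(signal))
        return re * re + im * im
    return max(candidates, key=power_at)

# Simulated 1 s EEG epoch: an 8 Hz SSVEP component plus a weaker 13 Hz distractor
fs = 256
sig = [math.sin(2 * math.pi * 8 * n / fs) + 0.3 * math.sin(2 * math.pi * 13 * n / fs)
       for n in range(fs)]
print(ssvep_classify(sig, fs, [8, 10, 13, 15]))   # 8
```

Each detected frequency would then be mapped to one appliance command, as the abstract describes.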
NASA Technical Reports Server (NTRS)
Yanosy, James L.
1988-01-01
Over the years, computer modeling has been used extensively in many disciplines to solve engineering problems. A set of computer program tools is proposed to assist the engineer in the various phases of the Space Station program from technology selection through flight operations. The development and application of emulation and simulation transient performance modeling tools for life support systems are examined. The results of the development and the demonstration of the utility of three computer models are presented. The first model is a detailed computer model (emulation) of a solid amine water desorption (SAWD) CO2 removal subsystem combined with much less detailed models (simulations) of a cabin, crew, and heat exchangers. This model was used in parallel with the hardware design and test of this CO2 removal subsystem. The second model is a simulation of an air revitalization system combined with a wastewater processing system to demonstrate the capabilities to study subsystem integration. The third model is that of a Space Station total air revitalization system. The station configuration consists of a habitat module, a lab module, two crews, and four connecting nodes.
Is This Real Life? Is This Just Fantasy?: Realism and Representations in Learning with Technology
NASA Astrophysics Data System (ADS)
Sauter, Megan Patrice
Students often engage in hands-on activities during science learning; however, financial and practical constraints often limit the availability of these activities. Recent advances in technology have led to increases in the use of simulations and remote labs, which attempt to recreate hands-on science learning via computer. Remote labs and simulations are interesting from a cognitive perspective because they allow for different relations between representations and their referents. Remote labs are unique in that they provide a yoked representation, meaning that the representation of the lab on the computer screen is actually linked to that which it represents: a real scientific device. Simulations merely represent the lab and are not connected to any real scientific devices. However, the type of visual representations used in the lab may modify the effects of the lab technology. The purpose of this dissertation is to examine the relation between representation and technology and its effects of students' psychological experiences using online science labs. Undergraduates participated in two studies that investigated the relation between technology and representation. In the first study, participants performed either a remote lab or a simulation incorporating one of two visual representations, either a static image or a video of the equipment. Although participants in both lab conditions learned, participants in the remote lab condition had more authentic experiences. However, effects were moderated by the realism of the visual representation. Participants who saw a video were more invested and felt the experience was more authentic. In a second study, participants performed a remote lab and either saw the same video as in the first study, an animation, or the video and an animation. Most participants had an authentic experience because both representations evoked strong feelings of presence. 
However, participants who saw the video were more likely to believe the remote technology was real. Overall, the findings suggest that participants' experiences with technology were shaped by representation. Students had more authentic experiences using the remote lab than the simulation. However, incorporating visual representations that enhance presence made these experiences even more authentic and meaningful than afforded by the technology alone.
Ramos, Rogelio; Zlatev, Roumen; Valdez, Benjamin; Stoytcheva, Margarita; Carrillo, Mónica; García, Juan-Francisco
2013-01-01
A virtual instrumentation (VI) system called VI localized corrosion image analyzer (LCIA), based on LabVIEW 2010, was developed allowing rapid, automatic, and subjective-error-free determination of the number of pits on large corroded specimens. The VI LCIA synchronously controls digital microscope image acquisition and analysis, finally producing a map file containing the coordinates of the detected probable pit-containing zones on the investigated specimen. The pit area, traverse length, and density are also determined by the VI using binary large object (blob) analysis. The resulting map file can be used further by a scanning vibrating electrode technique (SVET) system for a rapid (one-pass) "true/false" SVET check of the probable zones only, passing through the pit centers and thus avoiding a scan of the entire specimen. A complete SVET scan over the zones already proved "true" can then determine the corrosion rate in any of the zones.
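The blob-analysis step can be illustrated with a small pure-Python stand-in: connected-component labeling of a binary image with an area filter, analogous in spirit to (but not taken from) the VI LCIA:

```python
def label_pits(binary, min_area=2):
    """Find 4-connected blobs of 1-pixels in a binary image (list of 0/1 rows);
    return (centroid, area) for each blob with area >= min_area."""
    rows, cols = len(binary), len(binary[0])
    seen = [[False] * cols for _ in binary]
    pits = []
    for r in range(rows):
        for c in range(cols):
            if binary[r][c] and not seen[r][c]:
                stack, blob = [(r, c)], []          # iterative flood fill
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    blob.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < rows and 0 <= nx < cols \
                                and binary[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                if len(blob) >= min_area:           # drop single-pixel noise
                    cy = sum(p[0] for p in blob) / len(blob)
                    cx = sum(p[1] for p in blob) / len(blob)
                    pits.append(((cy, cx), len(blob)))
    return pits

img = [[0, 0, 0, 0, 0, 0],
       [0, 1, 1, 0, 0, 0],
       [0, 1, 1, 0, 1, 0],   # a 4-pixel pit and one noise pixel
       [0, 0, 0, 0, 0, 0]]
print(label_pits(img))       # [((1.5, 1.5), 4)]
```

The centroids play the role of the map-file coordinates handed to the SVET pass.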
Inference of missing data and chemical model parameters using experimental statistics
NASA Astrophysics Data System (ADS)
Casey, Tiernan; Najm, Habib
2017-11-01
A method for determining the joint parameter density of Arrhenius rate expressions through the inference of missing experimental data is presented. This approach proposes noisy hypothetical data sets from target experiments and accepts those which agree with the reported statistics, in the form of nominal parameter values and their associated uncertainties. The data exploration procedure is formalized using Bayesian inference, employing maximum entropy and approximate Bayesian computation methods to arrive at a joint density on data and parameters. The method is demonstrated in the context of reactions in the H2-O2 system for predictive modeling of combustion systems of interest. Work supported by the US DOE BES CSGB. Sandia National Labs is a multimission lab managed and operated by Nat. Technology and Eng'g Solutions of Sandia, LLC., a wholly owned subsidiary of Honeywell Intl, for the US DOE NCSA under contract DE-NA-0003525.
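The accept/reject data-exploration idea can be sketched as a simple ABC rejection sampler for Arrhenius parameters (ln k = ln A − Ea/RT). All numbers below are assumed for illustration; this is a generic sketch of the technique, not the authors' method in detail:

```python
import math, random

random.seed(0)
R = 8.314  # J/(mol K)

# Reported summary statistics for ln k at three temperatures (assumed values)
temps = [800.0, 1000.0, 1200.0]
true_lnA, true_Ea = 20.0, 1.0e5
reported = [true_lnA - true_Ea / (R * T) for T in temps]   # nominal ln k
sigma = 0.05                                               # reported uncertainty

def simulate(lnA, Ea):
    """Propose a hypothetical noisy data set from a candidate (lnA, Ea) pair."""
    return [lnA - Ea / (R * T) + random.gauss(0, sigma) for T in temps]

accepted = []
for _ in range(20000):
    lnA = random.uniform(18, 22)        # flat priors over a plausible box
    Ea = random.uniform(8e4, 1.2e5)
    sim = simulate(lnA, Ea)
    # accept when every simulated point agrees with the reported statistic
    if all(abs(s - r) < 2 * sigma for s, r in zip(sim, reported)):
        accepted.append((lnA, Ea))

mean_lnA = sum(a for a, _ in accepted) / len(accepted)
mean_Ea = sum(e for _, e in accepted) / len(accepted)
print(len(accepted), round(mean_lnA, 2), round(mean_Ea))
```

The accepted pairs approximate a joint density on (ln A, Ea) consistent with the reported statistics, which is the flavor of posterior the abstract describes.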
NASA Astrophysics Data System (ADS)
Lyu, Bo-Han; Wang, Chen; Tsai, Chun-Wei
2017-08-01
Jasper Display Corp. (JDC) offers a high-reflectivity, high-resolution Liquid Crystal on Silicon - Spatial Light Modulator (LCoS-SLM) which includes an associated controller ASIC and LabVIEW-based modulation software. Based on this LCoS-SLM, also called the Education Kit (EDK), we provide a training platform which includes a series of optical theory and experiments to university students. This EDK not only provides LabVIEW-based operation software to produce Computer Generated Holograms (CGHs) that generate basic diffraction or holographic images, but also provides simulation software to verify the experimental results simultaneously. We believe that a robust LCoS-SLM, operation software, simulation software, training system, and training course can help students to study fundamental optics, wave optics, and Fourier optics more easily. Building on this fundamental knowledge, they can develop their unique skills and create new innovations in optoelectronic applications in the future.
Remotely accessible laboratory for MEMS testing
NASA Astrophysics Data System (ADS)
Sivakumar, Ganapathy; Mulsow, Matthew; Melinger, Aaron; Lacouture, Shelby; Dallas, Tim E.
2010-02-01
We report on the construction of a remotely accessible and interactive laboratory for testing microdevices (aka: MicroElectroMechanical Systems - MEMS). Enabling expanded utilization of microdevices for research, commercial, and educational purposes is very important for driving the creation of future MEMS devices and applications. Unfortunately, the relatively high costs associated with MEMS devices and testing infrastructure make widespread access to the world of MEMS difficult. The creation of a virtual lab to control and actuate MEMS devices over the internet helps spread knowledge to a larger audience. A host laboratory has been established that contains a digital microscope, microdevices, controllers, and computers that can be logged into through the internet. The overall layout of the tele-operated MEMS laboratory system can be divided into two major parts: the server side and the client side. The server side is present at Texas Tech University, and hosts a server machine that runs the Linux operating system and is used for interfacing the MEMS lab with the outside world via the internet. The controls from the clients are transferred to the lab side through the server interface. The server interacts with the electronics required to drive the MEMS devices using a range of National Instruments hardware and LabView Virtual Instruments. An optical microscope (100 ×) with a CCD video camera is used to capture images of the operating MEMS. The server broadcasts the live video stream over the internet to the clients through the website. When a button is pressed on the website, the MEMS device responds and the video stream shows the movement in close to real time.
Quantum computing gates via optimal control
NASA Astrophysics Data System (ADS)
Atia, Yosi; Elias, Yuval; Mor, Tal; Weinstein, Yossi
2014-10-01
We demonstrate the use of optimal control to design two entropy-manipulating quantum gates which are more complex than the corresponding, commonly used, gates, such as CNOT and Toffoli (CCNOT): A two-qubit gate called polarization exchange (PE) and a three-qubit gate called polarization compression (COMP) were designed using GRAPE, an optimal control algorithm. Both gates were designed for a three-spin system. Our design provided efficient and robust nuclear magnetic resonance (NMR) radio frequency (RF) pulses for 13C2-trichloroethylene (TCE), our chosen three-spin system. We then experimentally applied these two quantum gates onto TCE at the NMR lab. Such design of these gates and others could be relevant for near-future applications of quantum computing devices.
Land classification of south-central Iowa from computer enhanced images
NASA Technical Reports Server (NTRS)
Lucas, J. R.; Taranik, J. V.; Billingsley, F. C. (Principal Investigator)
1977-01-01
The author has identified the following significant results. Enhanced LANDSAT imagery was most useful for land classification purposes, because these images could be photographically printed at large scales such as 1:63,360. The ability to see individual picture elements was no hindrance as long as general image patterns could be discerned. Low-cost photographic processing systems for color printing have proved to be effective in the utilization of computer enhanced LANDSAT products for land classification purposes. The initial investment for this type of system was very low, ranging from $100 to $200 beyond a black and white photo lab. The technical expertise can be acquired from reading a color printing and processing manual.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Darlene Roth
Completed in 2011, Albright's new Science Center includes three independent student and faculty research labs in Biology, Chemistry/Biochemistry, and Physics (separate from teaching labs). Providing independent research facilities, they eliminate disruptions in classrooms and teaching labs, encourage and accommodate increased student interest, and stimulate advanced research. The DOE grant of $369,943 enabled Albright to equip these advanced labs for 21st century science research, with much instrumentation shared among departments. The specialty labs will enable Albright to expand its student-faculty research program to meet growing interest, help attract superior science students, maximize faculty expertise, and continue exceeding its already high rates of acceptance for students applying for postgraduate education or pharmaceutical research positions. Biology instrumentation/equipment supports coursework and independent and collaborative research by students and faculty. The digital shaker, CO2 and water bath incubators (for controlled cell growth), balance, and micropipettes support cellular biology research in the advanced cell biology course and student-faculty research into heavy metal induction of heat shock proteins in cultured mammalian cells and the development of PCR markers from different populations of the native tree, Franklinia. The gravity convection oven and lyophilizer support research into physical and chemical analysis of floodplain sediments used in assessment of riparian restoration efforts. The Bio-Rad thermocycler permits fast and accurate DNA amplification as part of research into genetic diversity in small mammal populations and how those populations are affected by land-use practices and environmental management.
The Millipore water deionizing system and glassware washer provide general support of the independent research lab and ensure quality control of coursework and interdisciplinary research at the intersection of biology, chemistry, and toxicology. Grant purchases support faculty and students working in the areas of plant cellular biology, landscape ecology and wildlife management, wetland restoration, and ecotoxicology of aquatic invertebrates. Chemistry/BioChemistry instrumentation supports a wide range of research and teaching needs. The Dell quad core Xeon processors and Gaussian 09 support computational research efforts of two of our faculty. The computational work of one of these groups is part of close collaboration with one organic chemist and provides support info for the synthetic work of this professor and his students. Computational chemistry studies were also introduced into the physical chemistry laboratory course for junior chemistry concentrators. The AKTA plus system and superdex columns, Thermoscientific Sorvall RC-6 plus superspeed centrifuge, Nanodrop spectrometer, Eppendorf microfuge, Homogenizer and Pipetman pipetters were incorporated into a research project involving purification and characterization of a construct of beta 2-microglobulin by one of our biochemists. The vacuum system (glove box, stand, and pump) makes a significant contribution to the research of our inorganic chemist, the newest department member, working on research projects with four students. The glove box provides the means to carry out their synthetic work in an oxygenless atmosphere. Supporting basic research pursued by faculty and students, the remaining items (refrigerator/freezer units for flammable storage, freezer, refrigerated water bath, rotary evaporator system, vacuum oven, analytical and top-loading balances) were distributed between our biochemistry and chemistry research labs. 
The Nanodrop spectrometer, Sorvall centrifuge, and rotary evaporator system are used in several junior/senior lab courses in both biochemistry and chemistry. To date, 14 undergraduate research students have been involved in projects using the new instrumentation and equipment provided by this grant. Physics equipment acquired is radically transforming Albright research and teaching capabilities. The two main purchases are an atomic force microscope (AFM) and a scanning tunneling microscope (STM). These two devices allow us to view surfaces at much higher resolution than ever before, even to the level of individual atoms. Already the AFM has been incorporated into courses for advanced physics and biology students, allowing them to view at high resolution material such as carbon nanotubes, cell structure, and proteins. These devices offer possibilities for interdisciplinary collaboration among students and faculty in various departments that have barely begun to be tapped. Additional equipment, such as software, optical tables, lasers, and other support equipment, is also strengthening our research and teaching capabilities in optics-related areas.
Research on Modeling Technology of Virtual Robot Based on LabVIEW
NASA Astrophysics Data System (ADS)
Wang, Z.; Huo, J. L.; Y Sun, L.; Y Hao, X.
2017-12-01
Because of the dangerous working environment, the underwater operation robot for nuclear power stations requires manual teleoperation. During operation, it is necessary to indicate the position and orientation of the robot in real time. In this paper, geometric modeling of the virtual robot and its working environment is accomplished using SolidWorks software, realizing accurate modeling and assembly of the robot. LabVIEW software is then used to read the model, and forward and inverse kinematics models of the manipulator are established, realizing hierarchical modeling of the virtual robot and computer graphics modeling. Experimental results show that the method studied in this paper can be successfully applied to a robot control system.
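A forward/inverse kinematics pair of the kind described can be sketched for a planar two-link arm — a generic textbook formulation in Python, not the paper's LabVIEW model; link lengths and angles are illustrative:

```python
import math

def fk(l1, l2, t1, t2):
    """Forward kinematics: joint angles (rad) -> end-effector (x, y)."""
    x = l1 * math.cos(t1) + l2 * math.cos(t1 + t2)
    y = l1 * math.sin(t1) + l2 * math.sin(t1 + t2)
    return x, y

def ik(l1, l2, x, y):
    """Inverse kinematics (elbow-down branch) via the law of cosines."""
    d2 = x * x + y * y
    c2 = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    t2 = math.acos(max(-1.0, min(1.0, c2)))          # clamp for safety
    t1 = math.atan2(y, x) - math.atan2(l2 * math.sin(t2), l1 + l2 * math.cos(t2))
    return t1, t2

# Round trip: FK then IK recovers the original joint angles
x, y = fk(1.0, 0.8, 0.6, 0.4)
t1, t2 = ik(1.0, 0.8, x, y)
print(round(t1, 6), round(t2, 6))
```

A real 6-DOF underwater manipulator needs the full kinematic chain, but the FK/IK round trip above is the core idea being validated.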
Good enough practices in scientific computing.
Wilson, Greg; Bryan, Jennifer; Cranston, Karen; Kitzes, Justin; Nederbragt, Lex; Teal, Tracy K
2017-06-01
Computers are now essential in all branches of science, but most researchers are never taught the equivalent of basic lab skills for research computing. As a result, data can get lost, analyses can take much longer than necessary, and researchers are limited in how effectively they can work with software and data. Computing workflows need to follow the same practices as lab projects and notebooks, with organized data, documented steps, and the project structured for reproducibility, but researchers new to computing often don't know where to start. This paper presents a set of good computing practices that every researcher can adopt, regardless of their current level of computational skill. These practices, which encompass data management, programming, collaborating with colleagues, organizing projects, tracking work, and writing manuscripts, are drawn from a wide variety of published sources, from our daily lives, and from our work with volunteer organizations that have delivered workshops to over 11,000 people since 2010.
Parmitano and Cassidy in U.S. Lab
2013-05-31
ISS036-E-005515 (31 May 2013) --- European Space Agency astronaut Luca Parmitano (left) and NASA astronaut Chris Cassidy talk with fellow human beings on Earth using videoconferencing software and one of their on-board laptop computers in the U.S. lab Destiny.
Have Observatory, Will Travel.
ERIC Educational Resources Information Center
White, James C., II
1996-01-01
Describes several of the labs developed by Project CLEA (Contemporary Laboratory Experiences in Astronomy). The computer labs cover simulated spectrometer use, investigating the moons of Jupiter, radar measurements, energy flow out of the sun, classifying stellar spectra, photoelectric photometry, Doppler effect, eclipsing binary stars, and lunar…
An Algebra-Based Introductory Computational Neuroscience Course with Lab.
Fink, Christian G
2017-01-01
A course in computational neuroscience has been developed at Ohio Wesleyan University which requires no previous experience with calculus or computer programming, and which exposes students to theoretical models of neural information processing and techniques for analyzing neural data. The exploration of theoretical models of neural processes is conducted in the classroom portion of the course, while data analysis techniques are covered in lab. Students learn to program in MATLAB and are offered the opportunity to conclude the course with a final project in which they explore a topic of their choice within computational neuroscience. Results from a questionnaire administered at the beginning and end of the course indicate significant gains in student facility with core concepts in computational neuroscience, as well as with analysis techniques applied to neural data.
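A typical entry point for such a course is the leaky integrate-and-fire model; a minimal sketch in Python rather than the course's MATLAB, with assumed parameter values:

```python
def lif_spikes(current, dt=0.1, tau=10.0, v_rest=-65.0, v_thresh=-50.0, v_reset=-65.0):
    """Leaky integrate-and-fire neuron: tau dV/dt = -(V - v_rest) + I.
    Euler integration; returns the indices of time steps at which spikes occur.
    Units: ms for dt/tau, mV for voltages; I in matching arbitrary units."""
    v, spikes = v_rest, []
    for i, I in enumerate(current):
        v += dt / tau * (-(v - v_rest) + I)
        if v >= v_thresh:        # threshold crossing -> spike and reset
            spikes.append(i)
            v = v_reset
    return spikes

# 100 ms of constant suprathreshold drive produces regular spiking
spikes = lif_spikes([20.0] * 1000)
print(len(spikes))
```

Because the model needs only algebra and a loop, it fits the course's no-calculus constraint while still illustrating neural dynamics.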
Integrating all medical records to an enterprise viewer.
Li, Haomin; Duan, Huilong; Lu, Xudong; Zhao, Chenhui; An, Jiye
2005-01-01
The idea behind hospital information systems is to make all of a patient's medical reports, lab results, and images electronically available to clinicians, instantaneously, wherever they are. But the higgledy-piggledy evolution of most hospital computer systems makes it hard to integrate all these clinical records. Although several integration standards have been proposed to meet this challenge, none of them fits Chinese hospitals. In this paper, we introduce our work implementing a three-tiered-architecture enterprise viewer at Huzhou Central Hospital to integrate all existing medical information systems with limited resources.
ERIC Educational Resources Information Center
Caminero, Agustín C.; Ros, Salvador; Hernández, Roberto; Robles-Gómez, Antonio; Tobarra, Llanos; Tolbaños Granjo, Pedro J.
2016-01-01
The use of practical laboratories is a key in engineering education in order to provide our students with the resources needed to acquire practical skills. This is specially true in the case of distance education, where no physical interactions between lecturers and students take place, so virtual or remote laboratories must be used. UNED has…
NASA Technical Reports Server (NTRS)
Smith, Kevin
2011-01-01
This tutorial will explain the concepts and steps for interfacing a National Instruments LabView virtual instrument (VI) running on a Windows platform with another computer via the Object Management Group (OMG) Data Distribution Service (DDS) as implemented by Twin Oaks Computing's CoreDX. This paper is for educational purposes only; therefore, the referenced source code is simplistic and devoid of error checking. Implementation is accomplished using the C programming language.
Swarming Robot Design, Construction and Software Implementation
NASA Technical Reports Server (NTRS)
Stolleis, Karl A.
2014-01-01
This paper presents an overview of the hardware design, construction, and software design and implementation of a small, low-cost robot to be used for swarming-robot development. In addition to the work done on the robot, a full simulation of the robotic system was developed using the Robot Operating System (ROS) and its associated simulator. The eventual use of the robots will be the exploration of evolving behaviors via genetic algorithms, building on the work done at the University of New Mexico Biological Computation Lab.
ERIC Educational Resources Information Center
Ruben, Barbara
1994-01-01
Reviews a number of interactive environmental computer education networks and software packages. Computer networks include National Geographic Kids Network, Global Lab, and Global Rivers Environmental Education Network. Computer software involve environmental decision making, simulation games, tropical rainforests, the ocean, the greenhouse…
BioLab: Using Yeast Fermentation as a Model for the Scientific Method.
ERIC Educational Resources Information Center
Pigage, Helen K.; Neilson, Milton C.; Greeder, Michele M.
This document presents a science experiment demonstrating the scientific method. The experiment consists of testing the fermentation capabilities of yeasts under different circumstances. The experiment is supported with computer software called BioLab which demonstrates yeast's response to different environments. (YDS)
A future for systems and computational neuroscience in France?
Faugeras, Olivier; Frégnac, Yves; Samuelides, Manuel
2007-01-01
This special issue of the Journal of Physiology, Paris, is an outcome of NeuroComp'06, the first French conference in Computational Neuroscience. The preparation for this conference, held at Pont-à-Mousson in October 2006, was accompanied by a survey which has resulted in an up-to-date inventory of human resources and labs in France concerned with this emerging new field of research (see team directory in http://neurocomp.risc.cnrs.fr/). This thematic JPP issue gathers some of the key scientific presentations made on the occasion of this first interdisciplinary meeting, which should soon become recognized as a yearly national conference representative of a new scientific community. The present introductory paper presents the general scientific context of the conference and reviews some of the historical and conceptual foundations of Systems and Computational Neuroscience in France.
NASA Astrophysics Data System (ADS)
Weagant, Scott; Karanassios, Vassili
2015-06-01
The use of portable, handheld computing devices for the acquisition of spectrochemical data is briefly discussed using examples from the authors' laboratory. Several network topologies are evaluated. At present, a topology that uses a portable computing device for data acquisition and spectrometer control, with wireless access to the internet at one end and communication with a smartphone at the other, appears best suited for "taking part of the lab to the sample" applications. Thus, spectrometric data can be accessed from anywhere in the world.
omniClassifier: a Desktop Grid Computing System for Big Data Prediction Modeling
Phan, John H.; Kothari, Sonal; Wang, May D.
2016-01-01
Robust prediction models are important for numerous science, engineering, and biomedical applications. However, best-practice procedures for optimizing prediction models can be computationally complex, especially when choosing models from among hundreds or thousands of parameter choices. Computational complexity has further increased with the growth of data in these fields, concurrent with the era of “Big Data”. Grid computing is a potential solution to the computational challenges of Big Data. Desktop grid computing, which uses idle CPU cycles of commodity desktop machines, coupled with commercial cloud computing resources can enable research labs to gain easier and more cost effective access to vast computing resources. We have developed omniClassifier, a multi-purpose prediction modeling application that provides researchers with a tool for conducting machine learning research within the guidelines of recommended best-practices. omniClassifier is implemented as a desktop grid computing system using the Berkeley Open Infrastructure for Network Computing (BOINC) middleware. In addition to describing implementation details, we use various gene expression datasets to demonstrate the potential scalability of omniClassifier for efficient and robust Big Data prediction modeling. A prototype of omniClassifier can be accessed at http://omniclassifier.bme.gatech.edu/. PMID:27532062
NASA Technical Reports Server (NTRS)
Ingels, F. M.; Rives, T. B.
1987-01-01
An analytical analysis of the HOSC Generic Peripheral processing system was conducted. The results are summarized; they indicate that the maximum delay in performing screen change requests should be less than 2.5 s, occurring for a slow VAX host-to-video-screen I/O rate of 50 KBps. This delay is due to the average I/O rate from the video terminals to their host computer. The software structure of the main computers and the host computers will have a greater impact on screen change or refresh response times. The HOSC data system model was updated by a newly coded PASCAL-based simulation program, which was installed on the HOSC VAX system. This model is described and documented. Suggestions are offered to fine-tune the performance of the Ethernet interconnection network. Suggestions for using the Nutcracker by Excelan to trace itinerant packets, which appear on the network from time to time, were offered in discussions with the HOSC personnel. Several visits were made to the HOSC facility to install and demonstrate the simulation model.
Computer Controlled Portable Greenhouse Climate Control System for Enhanced Energy Efficiency
NASA Astrophysics Data System (ADS)
Datsenko, Anthony; Myer, Steve; Petties, Albert; Hustek, Ryan; Thompson, Mark
2010-04-01
This paper discusses a student project at Kettering University focusing on the design and construction of an energy efficient greenhouse climate control system. In order to maintain acceptable temperatures and stabilize temperature fluctuations in a portable plastic greenhouse economically, a computer controlled climate control system was developed to capture and store thermal energy incident on the structure during daylight periods and release the stored thermal energy during dark periods. The thermal storage mass for the greenhouse system consisted of a water filled base unit. The heat exchanger consisted of a system of PVC tubing. The control system used a programmable LabView computer interface to meet functional specifications that minimized temperature fluctuations and recorded data during operation. The greenhouse was a portable sized unit with a 5' x 5' footprint. Control input sensors were temperature, water level, and humidity sensors and output control devices were fan actuating relays and water fill solenoid valves. A Graphical User Interface was developed to monitor the system, set control parameters, and to provide programmable data recording times and intervals.
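The relay-actuation logic such a climate controller implements can be sketched as a bang-bang controller with hysteresis, which keeps the fans from chattering around the setpoint. This is a generic sketch, not the project's LabView code; the setpoint and dead-band values are hypothetical.

```python
def fan_command(temp_c, fan_on, setpoint=24.0, band=1.0):
    """Bang-bang fan control with hysteresis around the setpoint.

    The fan relay switches on above setpoint + band and off below
    setpoint - band; inside the dead band it keeps its last state,
    which avoids rapid relay cycling.
    """
    if temp_c > setpoint + band:
        return True
    if temp_c < setpoint - band:
        return False
    return fan_on

# Sweep a rising-then-falling temperature profile through the controller.
state = False
trace = []
for t in [22.0, 24.5, 25.5, 24.5, 22.5]:
    state = fan_command(t, state)
    trace.append(state)
print(trace)  # [False, False, True, True, False]
```

Note that the two readings of 24.5 °C produce different commands depending on history; that memory is the point of the hysteresis band.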
NASA Astrophysics Data System (ADS)
Baptista, M.; Teles, P.; Cardoso, G.; Vaz, P.
2014-11-01
Over the last decade, there was a substantial increase in the number of interventional cardiology procedures worldwide, and the corresponding ionizing radiation doses for both the medical staff and patients became a subject of concern. Interventional procedures in cardiology are normally very complex, resulting in long exposure times. Also, these interventions require the operator to work near the patient and, consequently, close to the primary X-ray beam. Moreover, due to the scattered radiation from the patient and the equipment, the medical staff is also exposed to a non-uniform radiation field that can lead to a significant exposure of sensitive body organs and tissues, such as the eye lens, the thyroid and the extremities. In order to better understand the spatial variation of the dose and dose rate distributions during an interventional cardiology procedure, the dose distribution around a C-arm fluoroscopic system, in operation in a cardiac cath lab at a Portuguese hospital, was estimated using both Monte Carlo (MC) simulations and dosimetric measurements. To model and simulate the cardiac cath lab, including the fluoroscopic equipment used to execute interventional procedures, the state-of-the-art MC radiation transport code MCNPX 2.7.0 was used. Subsequently, Thermo-Luminescent Detector (TLD) measurements were performed, in order to validate and support the simulation results obtained for the cath lab model. The preliminary results presented in this study reveal that the cardiac cath lab model was successfully validated, taking into account the good agreement between MC calculations and TLD measurements. The simulated results for the isodose curves related to the C-arm fluoroscopic system are also consistent with the dosimetric information provided by the equipment manufacturer (Siemens).
The adequacy of the implemented computational model used to simulate complex procedures and map dose distributions around the operator and the medical staff is discussed, in view of the optimization principle (and the associated ALARA objective), one of the pillars of the international system of radiological protection.
Computer-Mediated Communication in a High School: The Users Shape the Medium--Part 1.
ERIC Educational Resources Information Center
Bresler, Liora
1990-01-01
This field study represents a departure from structured, or directed, computer-mediated communication as used in its natural environment, the computer lab. Using observations, interviews, and the computer medium itself, the investigators report how high school students interact with computers and create their own agendas for computer usage and…
A "Language Lab" for Architectural Design.
ERIC Educational Resources Information Center
Mackenzie, Arch; And Others
This paper discusses a "language lab" strategy in which traditional studio learning may be supplemented by language lessons using computer graphics techniques to teach architectural grammar, a body of elements and principles that govern the design of buildings belonging to a particular architectural theory or style. Two methods of…
Image analysis and green tea color change kinetics during thin-layer drying.
Shahabi, Mohammad; Rafiee, Shahin; Mohtasebi, Seyed Saeid; Hosseinpour, Soleiman
2014-09-01
This study investigated the effect of air temperature and air flow velocity on the kinetics of color parameter changes during hot-air drying of green tea, identified the best model for the drying process, and applied a computer vision system to study the color changes during drying. In the proposed computer vision system, RGB values of the images were first converted into XYZ values and then to Commission Internationale de l'Eclairage L*a*b* color coordinates. The obtained L*, a* and b* parameters were calibrated against a Hunter-Lab colorimeter. These values were also used to calculate the color difference, chroma, hue angle and browning index. The values of L* and b* decreased, while the values of a* and the color difference (ΔE*ab) increased during hot-air drying. Drying data were fitted to three kinetic models: zero-order, first-order and fractional conversion models were used to describe the color changes of green tea. The goodness of fit was assessed using the coefficient of determination (R²) and the root-mean-square error. Results showed that the fractional conversion model fitted most of the color parameters better than the other two models.
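The RGB-to-L*a*b* conversion step described above can be sketched with the standard sRGB (D65) transformation; the paper's calibration against the Hunter-Lab colorimeter is not reproduced here, and the two sample colors are hypothetical.

```python
import math

def srgb_to_lab(r, g, b):
    """Convert 8-bit sRGB to CIE L*a*b* via XYZ (D65 white point)."""
    def linearize(c):
        c /= 255.0
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
    rl, gl, bl = linearize(r), linearize(g), linearize(b)
    # Standard sRGB -> XYZ matrix (D65).
    x = 0.4124 * rl + 0.3576 * gl + 0.1805 * bl
    y = 0.2126 * rl + 0.7152 * gl + 0.0722 * bl
    z = 0.0193 * rl + 0.1192 * gl + 0.9505 * bl
    xn, yn, zn = 0.95047, 1.0, 1.08883   # D65 reference white
    def f(t):
        return t ** (1 / 3) if t > (6 / 29) ** 3 else t / (3 * (6 / 29) ** 2) + 4 / 29
    fx, fy, fz = f(x / xn), f(y / yn), f(z / zn)
    return 116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)

def delta_e(lab1, lab2):
    """CIE76 color difference (Euclidean distance in L*a*b*)."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(lab1, lab2)))

fresh = srgb_to_lab(80, 140, 60)    # hypothetical greenish leaf color
dried = srgb_to_lab(120, 110, 70)   # hypothetical browner dried color
print(delta_e(fresh, dried))
```

A sanity check on the conversion: pure white maps to L* ≈ 100 with a* and b* ≈ 0, as required by the definition of the reference white.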
PWL 1.0 Personal WaveLab: an object-oriented workbench for seismogram analysis on Windows systems
NASA Astrophysics Data System (ADS)
Bono, Andrea; Badiali, Lucio
2005-02-01
Personal WaveLab 1.0 is intended as the starting point for an ex novo development of seismic time-series analysis procedures for Windows-based personal computers. Our objective is two-fold. Firstly, as a stand-alone application, it supports basic analysis of digital or digitised seismic waveforms. Secondly, thanks to its architectural characteristics, it can serve as the basis for the development of more complex, feature-rich applications. An expanded version of PWL, called SisPick!, is currently in use at the Istituto Nazionale di Geofisica e Vulcanologia (Italian Institute of Geophysics and Volcanology) for real-time monitoring for Civil Protection purposes. This means that about 90 users tested the application for more than a year, making its features more robust and efficient. SisPick! was also employed in the United Nations Nyragongo Project in Congo, and during the Stromboli emergency in the summer of 2002. The main strengths of the application package are ease of use, object-oriented design, good computational speed, minimal disk-space requirements and the complete absence of third-party components (including ActiveX). The Windows environment spares the user scripting or complex interaction with the system. The system is in constant development to answer the needs and suggestions of its users. Microsoft Visual Basic 6 source code, an installation package, test data sets and documentation are available at no cost.
McCallum, Ethan B; Peterson, Zoë D
2015-11-01
Factors related to the research context, such as inquiry mode, setting, and experimenter contact, may affect participants' comfort with and willingness to disclose certain sexual attitudes or admit to engaging in sensitive sexual behaviors. In this study, 255 female undergraduates (42.7 % non-White) completed a survey containing measures of sexual behavior and attitudes. The level of experimenter contact (high vs. low contact), setting (in lab vs. out of lab), and inquiry mode (pencil-and-paper vs. computer) were manipulated and participants were randomly assigned to conditions. We hypothesized that low-contact, out-of-lab, computer conditions would be associated with more liberal sexual attitudes and higher rates of reported sexual behaviors than high-contact, in-lab, and paper-and-pencil conditions, respectively. Further, we hypothesized that effects would be moderated by race, such that differences would be greater for non-White participants because of concerns that reporting socially undesirable behavior might fuel racial stereotypes. For attitudinal measures, White participants endorsed more liberal attitudes toward sex in high-contact conditions and non-White participants endorsed more liberal attitudes in low-contact conditions. For behavioral measures, non-White participants reported more behaviors on pencil-and-paper surveys than on computers. White participants demonstrated no significant mode-related differences or reported more sexual behaviors in computer conditions than paper-and-pencil conditions. Overall, results suggest that experimenter contact and mode significantly impact sexual self-report and this impact is often moderated by race.
A flexible telerobotic system for space operations
NASA Technical Reports Server (NTRS)
Sliwa, N. O.; Will, R. W.
1987-01-01
The objective and design of a proposed goal-oriented, knowledge-based telerobotic system for space operations are described. This design effort encompasses the elements of the system executive and user interface and the distribution and general structure of the knowledge base, the displays, and the task sequencing. The objective of the design effort is to provide an expandable structure for a telerobotic system that provides cooperative interaction between the human operator and computer control. The initial phase of the implementation provides a rule-based, goal-oriented script generator to interface to the existing control modes of a telerobotic research system in the Intelligent Systems Research Lab at NASA Research Center.
Study on property and stability mechanism of LAB-AEO-4 system
NASA Astrophysics Data System (ADS)
Song, Kaifei; Ge, Jijiang; Wang, Yang; Zhang, Guicai; Jiang, Ping
2017-04-01
The behaviors of binary blending systems of fatty alcohol polyoxyethylene ether (AEO-4) blended with the laurel amide betaine (LAB) were investigated at 80°C. The results indicated that the optimal ratio of the mixed LAB-AEO-4 system was 5:2. The stability mechanism of the LAB-AEO-4 system was analyzed from three aspects: dynamic surface tension, gas permeation rate and surface rheology. The results showed that the tension of the mixed system reached equilibrium more easily, the gas permeation rate constant of the mixed system decreased by about 7%, and the elastic modulus and dilational modulus increased by about 2 times compared with the single-LAB system.
A Computer-Based Simulation of an Acid-Base Titration
ERIC Educational Resources Information Center
Boblick, John M.
1971-01-01
Reviews the advantages of computer simulated environments for experiments, referring in particular to acid-base titrations. Includes pre-lab instructions and a sample computer printout of a student's use of an acid-base simulation. Ten references. (PR)
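A minimal sketch of such a titration simulation, for a strong monoprotic acid titrated with a strong base. Complete dissociation and 25 °C (Kw = 1e-14) are assumed, activity effects are ignored, and the concentrations and volumes are illustrative, not from the reviewed simulation.

```python
import math

def ph_strong_strong(c_acid, v_acid_ml, c_base, v_base_ml):
    """pH during titration of a strong monoprotic acid with a strong base."""
    n_acid = c_acid * v_acid_ml / 1000.0   # moles of acid
    n_base = c_base * v_base_ml / 1000.0   # moles of base added
    v_total = (v_acid_ml + v_base_ml) / 1000.0  # total volume, L
    if n_acid > n_base:                    # excess acid
        return -math.log10((n_acid - n_base) / v_total)
    if n_base > n_acid:                    # excess base: pH = 14 - pOH
        return 14 + math.log10((n_base - n_acid) / v_total)
    return 7.0                             # equivalence point

# 25 mL of 0.1 M HCl titrated with 0.1 M NaOH:
for v in (0.0, 12.5, 25.0, 26.0):
    print(v, round(ph_strong_strong(0.1, 25.0, 0.1, v), 2))
```

The characteristic sigmoidal curve appears in the printed values: the pH climbs slowly from about 1, then jumps sharply through 7 at the 25 mL equivalence point.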
When Everyone Is a Probe, Everyone Is a Learner
ERIC Educational Resources Information Center
Berenfeld, Boris; Krupa, Tatiana; Lebedev, Arseny; Stafeev, Sergey
2014-01-01
Most students globally have mobile devices and the Global Students Laboratory (GlobalLab) project is integrating mobility into learning. First launched in 1991, GlobalLab builds a community of learners engaged in collaborative, distributed investigations. Long relying on stationary desktop computers, or students inputting their observations by…
Berkeley Lab Wins Seven 2015 R&D 100 Awards | Berkeley Lab
The Virtual Genetics Lab: A Freely-Available Open-Source Genetics Simulation
ERIC Educational Resources Information Center
White, Brian; Bolker, Ethan; Koolar, Nikunj; Ma, Wei; Maw, Naing Naing; Yu, Chung Ying
2007-01-01
This lab is a computer simulation of transmission genetics. It presents students with a genetic phenomenon--the inheritance of a randomly selected trait. The students' task is to determine how this trait is inherited by designing their own crosses and analyzing the results produced by the software.
Using SimCPU in Cooperative Learning Laboratories.
ERIC Educational Resources Information Center
Lin, Janet Mei-Chuen; Wu, Cheng-Chih; Liu, Hsi-Jen
1999-01-01
Reports research findings of an experimental design in which cooperative-learning strategies were applied to closed-lab instruction of computing concepts. SimCPU, a software package specially designed for closed-lab usage was used by 171 high school students of four classes. Results showed that collaboration enhanced learning and that blending…
Why Pulse If You Live in Turbulent Flow? Studying the Benefits of Pulsing Behavior in Xeniid Corals
NASA Astrophysics Data System (ADS)
Samson, J. E.; Khatri, S.; Holzman, R.; Shavit, U.; Miller, L.
2016-02-01
Pulsing behavior in benthic cnidarians increases local water flows and thus mass transfer (i.e. nutrient exchange) between organisms and environment. This increased mass transfer plays an especially important role in photosynthetic organisms by increasing the exchange rate of oxygen and carbon dioxide, allowing for increased metabolic rates. For organisms living mostly in the boundary layer of quiet water bodies, the benefits of pulsing to create a (feeding) current seem to be straightforward; the benefit of increased flow around the organism is larger than the cost of sustaining an energetically expensive behavior. Xeniid corals, however, are often found in turbulent flows, and it is unclear what the benefits of pulsing behavior are in an already well-mixed environment. Using lab experiments (particle image velocimetry or PIV), computational fluid dynamics simulations (immersed boundary method), and field data, we explore the reason(s) behind this paradoxical observation. 3D video recordings from pulsing corals in the lab and in the field were used to extract the kinematics of the pulsing motion. These kinematics served as input to create computational fluid dynamics simulations that allow us to further explore and compare fluid flows resulting from different situations (presence or absence of background flow around a coral colony, for example). The PIV data collected in the lab will serve to validate these simulations. Developing our computational models further will allow us to study the potential benefit of pulsing on mass transfer and to explore the advantage of collective pulsing behavior. Xeniid corals form colonies in which collective pulsing patterns can be observed. These patterns, however, have not yet been quantified and it is unclear how they arise, since cnidarians lack a centralized nervous system.
ERIC Educational Resources Information Center
Schlenker, Richard M.; Dillon, Timothy
This document contains lab activities, problem sets, and a tape script to be accompanied by a slide show. The minicourse covers the following topics of general chemistry: kinetic-molecular theory, the Bohr atom, acids, bases, and salts, the periodic table, bonding, chemical equations, the metric system, computation of density, mass, and volume,…
ERIC Educational Resources Information Center
Chiang, Harry; Robinson, Lucy C.; Brame, Cynthia J.; Messina, Troy C.
2013-01-01
Over the past 20 years, the biological sciences have increasingly incorporated chemistry, physics, computer science, and mathematics to aid in the development and use of mathematical models. Such combined approaches have been used to address problems from protein structure-function relationships to the workings of complex biological systems.…
1994-06-01
algorithms for large, irreducibly coupled systems iteratively solve concurrent problems within different subspaces of a Hilbert space, or within different...effective on problems amenable to SIMD solution. Together with researchers at AT&T Bell Labs (Boris Lubachevsky, Albert Greenberg) we have developed...reasonable measurement. In the study of different speedups, various causes of superlinear speedup are also presented.
NASA Astrophysics Data System (ADS)
Mavelli, Fabio; Ruiz-Mirazo, Kepa
2010-09-01
'ENVIRONMENT' is a computational platform developed over the last few years to simulate stochastically the dynamics and stability of chemically reacting protocellular systems. Here we present and describe some of its main features, showing how the stochastic kinetics approach can be applied to study the time evolution of reaction networks in heterogeneous conditions, particularly when supramolecular lipid structures (micelles, vesicles, etc) coexist with aqueous domains. These conditions are of special relevance for understanding the origins of cellular, self-reproducing compartments in the context of prebiotic chemistry and evolution. We contrast our simulation results with real lab experiments, aiming to bring together theoretical and experimental research on protocell and minimal artificial cell systems.
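The stochastic kinetics approach referred to above is conventionally implemented with Gillespie's stochastic simulation algorithm. A minimal sketch for a single A → B reaction (an illustration of the method, not ENVIRONMENT's actual code):

```python
import math
import random

def gillespie_decay(n_a, k, t_max, seed=1):
    """Gillespie SSA for the single reaction A -> B with rate constant k.

    Returns the trajectory as (time, remaining A) pairs. Each step draws
    an exponential waiting time from the total propensity k * n_a.
    """
    rng = random.Random(seed)
    t, traj = 0.0, [(0.0, n_a)]
    while n_a > 0 and t < t_max:
        a0 = k * n_a                           # total propensity
        t += -math.log(1.0 - rng.random()) / a0  # exponential waiting time
        n_a -= 1                               # fire the reaction
        traj.append((t, n_a))
    return traj

traj = gillespie_decay(n_a=100, k=0.5, t_max=50.0)
print(traj[-1])
```

With many reaction channels the same structure applies: the next firing time is drawn from the summed propensities, and a second random draw selects which reaction fires.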
Scalable Entity-Based Modeling of Population-Based Systems, Final LDRD Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cleary, A J; Smith, S G; Vassilevska, T K
2005-01-27
The goal of this project has been to develop tools, capabilities and expertise in the modeling of complex population-based systems via scalable entity-based modeling (EBM). Our initial focal application domain has been the dynamics of large populations exposed to disease-causing agents, a topic of interest to the Department of Homeland Security in the context of bioterrorism. In the academic community, discrete simulation technology based on individual entities has shown initial success, but the technology has not been scaled to the problem sizes or computational resources of LLNL. Our developmental emphasis has been on the extension of this technology to parallel computers and maturation of the technology from an academic to a lab setting.
Baenninger, Philipp B; Voegeli, Susanne; Bachmann, Lucas M; Faes, Livia; Iselin, Katja; Kaufmann, Claude; Thiel, Michael A
2018-04-04
To assess the variability of osmolarity measured by the point-of-care TearLab system in healthy eyes. A systematic review was performed by searching MEDLINE, Scopus, and the Web of Science Databases until November 2016 and checking reference lists of included articles and reviews. The requirements for inclusion were the availability of TearLab results in healthy subjects and a minimum study sample of 20 eyes. Two reviewers assessed articles against the inclusion criteria, extracted relevant data, and examined the methodological quality. We computed the weighted mean osmolarity using the study size as the weighting factor and calculated the rate of subjects with osmolarity values >308 mOsm/L, the Dry Eye Workshop Report 2017 (DEWS) cut-off value for dry eye disease (DED). We repeated the analysis after excluding reports with a possible conflict of interest or missing description of subject selection. Searches retrieved 105 nonduplicate articles, and we included 33 studies investigating 1362 eyes of healthy participants who were asymptomatic and showed no clinical signs of DED. Sixty-three percent were female, and mean age was 37.3 years (range: 21.5-69.0 yr). Weighted mean osmolarity was 298 mOsm/L (95% confidence interval, 282-321 mOsm/L). The result of the subgroup analysis was similar. Overall, 386 of 1362 eyes (28.3%) fulfilled the DEWS's definition of DED (>308 mOsm/L). There is a high variability of osmolarity measurements with the TearLab system. A substantial number of healthy subjects fulfill the DEWS's definition of DED. We propose interpreting the TearLab osmolarity results cautiously and in the context of other established methods.
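The weighted-mean computation described (study size as the weighting factor) reduces to the following; the (eyes, mean mOsm/L) pairs below are made up for illustration, not the reviewed data.

```python
def weighted_mean_osmolarity(studies):
    """Weighted mean of study means, with study size as the weight.

    `studies` is a list of (n_eyes, mean_osmolarity) pairs.
    """
    total_n = sum(n for n, _ in studies)
    return sum(n * mean for n, mean in studies) / total_n

# Hypothetical (eyes, mean mOsm/L) pairs -- not the included studies:
studies = [(60, 295.0), (30, 301.0), (10, 308.0)]
print(round(weighted_mean_osmolarity(studies), 1))  # 298.1
```

Weighting by study size keeps one small outlier study from pulling the pooled estimate as far as an unweighted average of study means would.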
Analysis of the Hexapod Work Space using integration of a CAD/CAE system and the LabVIEW software
NASA Astrophysics Data System (ADS)
Herbuś, K.; Ociepka, P.
2015-11-01
The paper presents problems related to the integration of a CAD/CAE system with the LabVIEW software. The purpose of the integration is to determine the workspace of a hexapod model based on a mathematical model describing its motion. In the first stage of the integration task, a 3D model for simulating the movements of a hexapod was elaborated. This phase of the work was done in the "Motion Simulation" module of the Siemens NX CAD/CAE/CAM system. The first step was to define the components of the 3D model in the form of "links". Individual links were defined according to the function of the corresponding hexapod elements. In the model prepared for motion simulation, links were created for elements such as the electric actuators, top plate, bottom plate, ball-and-socket joints and Phillips toggle joints. Constraints of the "joint" type (e.g. revolute, slider and spherical joints) were then defined between the created "link" components, so that the computer simulation corresponds to the operation of a real hexapod. The next stage of the work was implementing the mathematical model describing the functioning of a hexapod in the LabVIEW software. At this stage, particular attention was paid to the procedures for integrating the virtual 3D hexapod model with the results of calculations performed in LabVIEW. The results relate to the specific strokes of the electric actuators depending on the position of the car on the hexapod. The integration made it possible to determine the safe operating space of a stationary hexapod, taking into consideration the safety of a person in the driving simulator designed for the disabled.
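The core of such a workspace computation is the hexapod's inverse kinematics: for each candidate pose of the top plate, the six actuator lengths are computed and checked against the stroke limits. A simplified sketch follows (yaw-only rotation for brevity, with hypothetical geometry and limits; this is not the paper's NX/LabVIEW model).

```python
import math

def leg_lengths(base_pts, plat_pts, translation, yaw):
    """Actuator lengths for a pose given as a translation of the top
    plate plus a yaw rotation (full 3-DOF rotation omitted for brevity).

    length_i = | T + Rz(yaw) * p_i - b_i |
    """
    c, s = math.cos(yaw), math.sin(yaw)
    tx, ty, tz = translation
    lengths = []
    for (bx, by, bz), (px, py, pz) in zip(base_pts, plat_pts):
        wx = tx + c * px - s * py   # platform anchor in world frame
        wy = ty + s * px + c * py
        wz = tz + pz
        lengths.append(math.dist((wx, wy, wz), (bx, by, bz)))
    return lengths

def pose_reachable(lengths, l_min, l_max):
    """A pose is inside the workspace if every actuator stays in stroke."""
    return all(l_min <= l <= l_max for l in lengths)

# Hypothetical geometry: anchors on circles of radius 1.0 (base) and 0.5 (top).
base = [(math.cos(math.radians(d)), math.sin(math.radians(d)), 0.0)
        for d in (0, 60, 120, 180, 240, 300)]
plat = [(0.5 * math.cos(math.radians(d)), 0.5 * math.sin(math.radians(d)), 0.0)
        for d in (30, 90, 150, 210, 270, 330)]
ls = leg_lengths(base, plat, (0.0, 0.0, 1.0), 0.0)
print(pose_reachable(ls, 0.8, 1.6))
```

Sweeping such a check over a grid of poses yields the safe operating space the paper describes.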
Fan, Shounian; Jiang, Yi; Jiang, Chenxi; Yang, Tianhe; Zhang, Chengyun; Liu, Junshi; Wu, Qiang; Zheng, Yaxi; Liu, Xiaoqiao
2004-10-01
The polygraph has become a necessary instrument in interventional cardiology and in fundamental medical research. In this study, the LabView development system (developed by NI in the U.S.) was used as the software platform, and a DAQ data-acquisition module with a general-purpose computer as the hardware platform; these were coupled with our self-made low-noise multi-channel preamplifier to develop a multi-channel electrocardiograph. The device provides functions such as real-time display of physiological signals; digital high-pass and low-pass filtering, 50 Hz filtering and gain adjustment; instant storage, random playback and printing; and process-controlled stimulation. In addition, it is compact, economical and easy to operate, and it could help spread interventional cardiac treatment in hospitals.
Building Automatic Grading Tools for Basic of Programming Lab in an Academic Institution
NASA Astrophysics Data System (ADS)
Harimurti, Rina; Iwan Nurhidayat, Andi; Asmunin
2018-04-01
The skill of computer programming is a core competency that must be mastered by students majoring in computer science. The best way to improve this skill is through the practice of writing many programs to solve various problems, from simple to complex. It takes hard work and a long time to check and evaluate the results of student labs one by one, especially when the number of students is large. Based on these constraints, we propose Automatic Grading Tools (AGT), an application that can evaluate and deeply check source code in C and C++. The application architecture consists of students, a web-based application, compilers, and the operating system. AGT implements the MVC architecture and uses open-source software: the Laravel framework version 5.4, PostgreSQL 9.6, Bootstrap 3.3.7, and the jQuery library. AGT has also been tested on real problems by submitting source code in C/C++ and compiling it. The test results show that the AGT application runs well.
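The checking step such a tool performs can be sketched as compiling each submission and comparing its normalized output against the expected output for each test case. This is a generic sketch in Python, not the AGT implementation, and sandboxing, resource limits and error reporting are omitted.

```python
import os
import subprocess
import tempfile

def normalize(text):
    """Comparison form insensitive to trailing whitespace and blank lines."""
    return [line.rstrip() for line in text.strip().splitlines()]

def grade(actual_outputs, expected_outputs):
    """Fraction of test cases whose normalized output matches."""
    passed = sum(normalize(a) == normalize(e)
                 for a, e in zip(actual_outputs, expected_outputs))
    return passed / len(expected_outputs)

def compile_and_run(source, stdin_data, compiler="gcc"):
    """Compile a C submission and run it on one test input.

    Assumes a C compiler is on PATH; a real grader would sandbox this.
    """
    with tempfile.TemporaryDirectory() as d:
        src, exe = os.path.join(d, "sub.c"), os.path.join(d, "sub")
        with open(src, "w") as f:
            f.write(source)
        subprocess.run([compiler, src, "-o", exe], check=True)
        out = subprocess.run([exe], input=stdin_data, text=True,
                             capture_output=True, timeout=5)
        return out.stdout

print(grade(["3\n", "7 \n"], ["3", "10"]))  # 0.5
```

Normalizing output before comparison avoids penalizing submissions for trailing newlines or spaces, a common source of spurious failures in naive string-equality graders.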
2008-03-01
Appendix 82: MATLAB© Cd Calculator Routine; FORTRAN© Subroutine of the Variable Cd Model. [Figure list recovered from the report front matter: Figure 29, overview flowchart of the Benét Labs recoil analysis code; Figure 30, overview flowchart of the recoil brake subroutine; Figure 31, detail flowchart of the recoil pressure/force calculations; Figure 32, detail flowchart of the variable Cd subroutine; Figure 33, simulated brake.]
Transforming the advanced lab: Part I - Learning goals
NASA Astrophysics Data System (ADS)
Zwickl, Benjamin; Finkelstein, Noah; Lewandowski, H. J.
2012-02-01
Within the physics education research community relatively little attention has been given to laboratory courses, especially at the upper-division undergraduate level. As part of transforming our senior-level Optics and Modern Physics Lab at the University of Colorado Boulder we are developing learning goals, revising curricula, and creating assessments. In this paper, we report on the establishment of our learning goals and a surrounding framework that have emerged from discussions with a wide variety of faculty, from a review of the literature on labs, and from identifying the goals of existing lab courses. Our goals go beyond those of specific physics content and apparatus, allowing instructors to personalize them to their contexts. We report on four broad themes and associated learning goals: Modeling (math-physics-data connection, statistical error analysis, systematic error, modeling of engineered "black boxes"), Design (of experiments, apparatus, programs, troubleshooting), Communication, and Technical Lab Skills (computer-aided data analysis, LabVIEW, test and measurement equipment).
EarthTutor: An Interactive Intelligent Tutoring System for Remote Sensing
NASA Astrophysics Data System (ADS)
Bell, A. M.; Parton, K.; Smith, E.
2005-12-01
Earth science classes in colleges and high schools use a variety of satellite image processing software to teach earth science and remote sensing principles. However, current tutorials for image processing software are often paper-based or lecture-based and do not take advantage of the full potential of the computer context to teach, immerse, and stimulate students. We present EarthTutor, an adaptive, interactive Intelligent Tutoring System (ITS) being built for NASA (National Aeronautics and Space Administration) that is integrated directly with an image processing application. The system aims to foster the use of satellite imagery in classrooms and encourage inquiry-based, hands-on scientific study of earth science by providing students with an engaging imagery analysis learning environment. EarthTutor's software is available as a plug-in to ImageJ, a free image processing system developed by the NIH (National Institutes of Health). Since it is written in Java, it can be run on almost any platform and also as an applet from the Web. Labs developed for EarthTutor combine lesson content (such as HTML web pages) with interactive activities and questions. In each lab the student learns to measure, calibrate, color, slice, plot and otherwise process and analyze earth science imagery. During the activities, EarthTutor monitors students closely as they work, which allows it to provide immediate feedback that is customized to a particular student's needs. As the student moves through the labs, EarthTutor assesses the student and tailors the presentation of the content to the student's demonstrated skill level. EarthTutor's adaptive approach is based on emerging Artificial Intelligence (AI) research. Bayesian networks are employed to model a student's proficiency with different earth science and image processing concepts. Agent behaviors are used to track the student's progress through activities and provide guidance when a student encounters difficulty. 
Through individual feedback and adaptive instruction, EarthTutor aims to offer the benefits of a one-on-one human instructor in a cost-effective, easy-to-use application. We are currently working with remote sensing experts to develop EarthTutor labs for diverse earth science subjects such as global vegetation, stratospheric ozone, oceanography, polar sea ice and natural hazards. These labs will be packaged with the first public release of EarthTutor in December 2005. Custom labs can be designed with the EarthTutor authoring tool. The tool is basic enough to allow teachers to construct tutorials to fit their classroom's curriculum and locale, but also powerful enough to allow advanced users to create highly interactive labs. Preliminary results from an ongoing pilot study demonstrate that the EarthTutor system is an effective and enjoyable teaching tool relative to traditional satellite imagery teaching methods.
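EarthTutor's Bayesian student model is not specified in detail in the abstract; a minimal knowledge-tracing-style update in the same spirit shows how a proficiency estimate can be revised after each answer. The guess/slip parameters are illustrative, not from the system:

```python
def update_mastery(p_mastery, correct, p_guess=0.2, p_slip=0.1):
    """One Bayesian update of P(student has mastered the concept)
    given a correct/incorrect answer.

    p_guess: chance of answering correctly without mastery;
    p_slip:  chance of answering incorrectly despite mastery.
    A stand-in for one node of EarthTutor's Bayesian network.
    """
    if correct:
        num = p_mastery * (1 - p_slip)
        den = num + (1 - p_mastery) * p_guess
    else:
        num = p_mastery * p_slip
        den = num + (1 - p_mastery) * (1 - p_guess)
    return num / den
```

Repeated application after each activity yields the running skill estimate that drives content tailoring.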
Social Play at the Computer: Preschoolers Scaffold and Support Peers' Computer Competence.
ERIC Educational Resources Information Center
Freeman, Nancy K.; Somerindyke, Jennifer
2001-01-01
Describes preschoolers' collaboration during free play in a computer lab, focusing on the computer's contribution to active, peer-mediated learning. Discusses these observations in terms of Parten's insights on children's social play and Vygotsky's socio-cultural learning theory, noting that the children scaffolded each other's growing computer…
Implementation of the Web-based laboratory
NASA Astrophysics Data System (ADS)
Ying, Liu; Li, Xunbo
2005-12-01
With the rapid development of Internet technologies, remote access and control via the Internet are becoming a reality. A realization of the web-based laboratory (W-LAB) is presented. The main goal of the W-LAB is to allow users to easily access and conduct experiments via the Internet. To realize remote communication, a system adopting a double client-server architecture was introduced, giving the system better security and higher functionality. The experimental environment implemented in the W-LAB integrates both a virtual lab and a remote lab. Embedded technology is used in the W-LAB system as an economical and efficient way to build the distributed infrastructure network. Furthermore, a user authentication mechanism introduced in the system effectively secures the remote communication.
Interactive, Online, Adsorption Lab to Support Discovery of the Scientific Process
NASA Astrophysics Data System (ADS)
Carroll, K. C.; Ulery, A. L.; Chamberlin, B.; Dettmer, A.
2014-12-01
Science students require more than methods practice in lab activities; they must gain an understanding of the application of the scientific process through lab work. Large classes, time constraints, and funding may limit student access to science labs, denying students access to the types of experiential learning needed to motivate and develop new scientists. Interactive, discovery-based computer simulations and virtual labs provide an alternative, low-risk opportunity for learners to engage in lab processes and activities. Students can conduct experiments, collect data, draw conclusions, and even abort a session. We have developed an online virtual lab, through which students can interactively develop as scientists as they learn about scientific concepts, lab equipment, and proper lab techniques. Our first lab topic is adsorption of chemicals to soil, but the methodology is transferrable to other topics. In addition to learning the specific procedures involved in each lab, the online activities will prompt exploration and practice in key scientific and mathematical concepts, such as unit conversion, significant digits, assessing risks, evaluating bias, and assessing quantity and quality of data. These labs are not designed to replace traditional lab instruction, but to supplement instruction on challenging or particularly time-consuming concepts. To complement classroom instruction, students can engage in a lab experience outside the lab and over a shorter time period than often required with real-world adsorption studies. More importantly, students can reflect, discuss, review, and even fail at their lab experience as part of the process to see why natural processes and scientific approaches work the way they do. Our Media Productions team has completed a series of online digital labs available at virtuallabs.nmsu.edu and scienceofsoil.com, and these virtual labs are being integrated into coursework to evaluate changes in student learning.
Exploring the changing learning environment of the gross anatomy lab.
Hopkins, Robin; Regehr, Glenn; Wilson, Timothy D
2011-07-01
The objective of this study was to assess the impact of virtual models and prosected specimens in the context of the gross anatomy lab. In 2009, student volunteers from an undergraduate anatomy class were randomly assigned to study groups in one of three learning conditions. All groups studied the muscles of mastication and completed identical learning objectives during a 45-minute lab. All groups were provided with two reference atlases. Groups were distinguished by the type of primary tools they were provided: gross prosections, a three-dimensional stereoscopic computer model, or both resources. The facilitator kept observational field notes. A pre-post multiple-choice knowledge test was administered to evaluate students' learning. No significant effect of the laboratory models was demonstrated between groups on the pre-post assessment of knowledge. Recurring observations included students' tendency to revert to individual memorization prior to the posttest, rotation of models to match views in the provided atlas, and dispersal of groups into smaller working units. The use of virtual lab resources seemed to influence the social context and learning environment of the anatomy lab. As computer-based learning methods are implemented and studied, they must be evaluated beyond their impact on knowledge gain to consider the effect technology has on students' social development.
ANALOG I/O MODULE TEST SYSTEM BASED ON EPICS CA PROTOCOL AND ACTIVEX CA INTERFACE
DOE Office of Scientific and Technical Information (OSTI.GOV)
YENG,YHOFF,L.
2003-10-13
Analog input (ADC) and output (DAC) modules play a substantial role in device-level control of accelerator and large experimental physics control systems. In order to get the best performance, features of analog modules including linearity, accuracy, crosstalk, thermal drift, and so on have to be evaluated during the preliminary design phase. Gain and offset error calibration and thermal drift compensation (if needed) may have to be done in the implementation phase as well. A natural technique for performing these tasks is to interface the analog I/O modules and GPIB-programmable test instruments with a computer, which can complete measurements or calibration automatically. A difficulty is that drivers of analog modules and test instruments usually work on totally different platforms (VxWorks vs. Windows). Developing new test routines and drivers for test instruments under VxWorks (or any other RTOS) is not a good solution, because such systems have relatively poor user interfaces and developing such software requires substantial effort. The EPICS CA protocol and ActiveX CA interface provide another choice: a PC- and LabVIEW-based test system. Analog I/O modules can be interfaced from LabVIEW test routines via the ActiveX CA interface. Test instruments can be controlled via LabVIEW drivers, most of which are provided by instrument vendors or by National Instruments. LabVIEW also provides extensive data analysis and processing functions. Using these functions, users can generate powerful test routines very easily. Several applications built for the Spallation Neutron Source (SNS) Beam Loss Monitor (BLM) system are described in this paper.
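The gain and offset error calibration mentioned above reduces, in its simplest form, to a least-squares line fit of measured versus commanded values. A sketch of that step (not the SNS/LabVIEW code; the channel is assumed linear):

```python
def gain_offset_error(set_values, measured):
    """Estimate gain and offset of an analog I/O channel by
    least-squares fit of measured vs. commanded values.

    An ideal channel gives (1.0, 0.0); deviations are the errors
    to be calibrated out. Pure-Python normal equations for a line.
    """
    n = len(set_values)
    sx = sum(set_values)
    sy = sum(measured)
    sxx = sum(x * x for x in set_values)
    sxy = sum(x * y for x, y in zip(set_values, measured))
    gain = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    offset = (sy - gain * sx) / n
    return gain, offset
```

Thermal-drift compensation would repeat this fit at several temperatures and interpolate between the resulting (gain, offset) pairs.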
Reflections on Teaching in a Wireless Laptop Lab
ERIC Educational Resources Information Center
Beasley, William; Dobda, Kathyanne W.; Wang, Lih-Ching Chen
2005-01-01
In recent years laptop computers have become increasingly popular in educational settings; wireless connectivity is a more recent development which is only now being fully explored, and which has led to the creation of the "wireless laptop lab." In this article, the authors share some of the experiences and concerns that they have encountered…
Implementing Wireless Mobile Instructional Labs: Planning Issues and Case Study
ERIC Educational Resources Information Center
McKimmy, Paul B.
2005-01-01
In April 2002, the Technology Advisory Committee of the University of Hawaii-Manoa College of Education (COE) prioritized the upgrade of existing instructional computer labs. Following several weeks of research and discussion, a decision was made to support wireless and mobile technologies during the upgrade. In June 2002, the first of three…
ERIC Educational Resources Information Center
Auld, Lawrence W. S.; Pantelidis, Veronica S.
1994-01-01
Describes the Virtual Reality and Education Lab (VREL) established at East Carolina University to study the implications of virtual reality for elementary and secondary education. Highlights include virtual reality software evaluation; hardware evaluation; computer-based curriculum objectives which could use virtual reality; and keeping current…
Brain-computer interfacing under distraction: an evaluation study
NASA Astrophysics Data System (ADS)
Brandl, Stephanie; Frølich, Laura; Höhne, Johannes; Müller, Klaus-Robert; Samek, Wojciech
2016-10-01
Objective. While motor-imagery based brain-computer interfaces (BCIs) have been studied over many years by now, most of these studies have taken place in controlled lab settings. Bringing BCI technology into everyday life is still one of the main challenges in this field of research. Approach. This paper systematically investigates BCI performance under 6 types of distractions that mimic out-of-lab environments. Main results. We report results of 16 participants and show that the performance of the standard common spatial patterns (CSP) + regularized linear discriminant analysis classification pipeline drops significantly in this ‘simulated’ out-of-lab setting. We then investigate three methods for improving the performance: (1) artifact removal, (2) ensemble classification, and (3) a 2-step classification approach. While artifact removal does not enhance the BCI performance significantly, both ensemble classification and the 2-step classification combined with CSP significantly improve the performance compared to the standard procedure. Significance. Systematically analyzing out-of-lab scenarios is crucial when bringing BCI into everyday life. Algorithms must be adapted to overcome nonstationary environments in order to tackle real-world challenges.
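The CSP stage of the standard pipeline above can be sketched with a NumPy whitening-plus-eigendecomposition implementation. This is a minimal textbook CSP, not the authors' code, and the per-trial covariance normalization is one common convention among several:

```python
import numpy as np

def csp_filters(X1, X2, n_filters=2):
    """Common spatial patterns for two motor-imagery classes.

    X1, X2: (trials, channels, samples) arrays. Returns spatial
    filters (channels, n_filters) whose projections maximize the
    variance ratio between the classes.
    """
    def avg_cov(X):
        covs = [t @ t.T / np.trace(t @ t.T) for t in X]  # trace-normalized
        return np.mean(covs, axis=0)

    C1, C2 = avg_cov(X1), avg_cov(X2)
    # whiten the composite covariance: P (C1+C2) P^T = I
    d, U = np.linalg.eigh(C1 + C2)
    P = U @ np.diag(1.0 / np.sqrt(d)) @ U.T
    # eigendecompose the whitened class-1 covariance
    vals, V = np.linalg.eigh(P @ C1 @ P.T)
    W = P.T @ V  # back-project filters to sensor space
    order = np.argsort(vals)
    # take filters from both ends of the eigenvalue spectrum
    pick = np.concatenate([order[:n_filters // 2],
                           order[-(n_filters - n_filters // 2):]])
    return W[:, pick]
```

Log-variance of the filtered signals would then feed the regularized LDA classifier; the ensemble and 2-step variants the paper evaluates wrap this same building block.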
Comparison of digital intraoral scanners by single-image capture system and full-color movie system.
Yamamoto, Meguru; Kataoka, Yu; Manabe, Atsufumi
2017-01-01
The use of dental computer-aided design/computer-aided manufacturing (CAD/CAM) restoration is rapidly increasing. This study was performed to evaluate the marginal and internal cement thickness and the adhesive gap of internal cavities comprising CAD/CAM materials using two digital impression acquisition methods and micro-computed tomography. Images obtained by a single-image acquisition system (Bluecam Ver. 4.0) and a full-color video acquisition system (Omnicam Ver. 4.2) were divided into the BL and OM groups, respectively. Silicone impressions were prepared from an ISO-standard metal mold, and CEREC Stone BC and New Fuji Rock IMP were used to create working models (n=20) in the BL and OM groups (n=10 per group), respectively. Individual inlays were designed in a conventional manner using designated software, and all restorations were prepared using CEREC inLab MC XL. These were assembled with the corresponding working models used for measurement, and the level of fit was examined by three-dimensional analysis based on micro-computed tomography. Significant differences in the marginal and internal cement thickness and adhesive gap spacing were found between the OM and BL groups. The full-color movie capture system appears to be a more optimal restoration system than the single-image capture system.
NASA Technical Reports Server (NTRS)
Kovarik, Madeline
1993-01-01
Intelligent computer aided training systems hold great promise for the application of this technology to mainstream education and training. Yet, this technology, which holds such a vast potential impact for the future of education and training, has had little impact beyond the enclaves of government research labs. This is largely due to the inaccessibility of the technology to those individuals in whose hands it can have the greatest impact, teachers and educators. Simply throwing technology at an educator and expecting them to use it as an effective tool is not the answer. This paper provides a background into the use of technology as a training tool. MindLink, developed by HyperTech Systems, provides trainers with a powerful rule-based tool that can be integrated directly into a Windows application. By embedding expert systems technology it becomes more accessible and easier to master.
Understanding of and applications for robot vision guidance at KSC
NASA Technical Reports Server (NTRS)
Shawaga, Lawrence M.
1988-01-01
The primary thrust of robotics at KSC is for the servicing of Space Shuttle remote umbilical docking functions. In order for this to occur, robots performing servicing operations must be capable of tracking a swaying Orbiter in Six Degrees of Freedom (6-DOF). Currently, in NASA KSC's Robotic Applications Development Laboratory (RADL), an ASEA IRB-90 industrial robot is being equipped with a real-time computer vision (hardware and software) system to allow it to track a simulated Orbiter interface (target) in 6-DOF. The real-time computer vision system effectively becomes the eyes for the lab robot, guiding it through a closed loop visual feedback system to move with the simulated Orbiter interface. This paper will address an understanding of this vision guidance system and how it will be applied to remote umbilical servicing at KSC. In addition, other current and future applications will be addressed.
Parallel processing and expert systems
NASA Technical Reports Server (NTRS)
Yan, Jerry C.; Lau, Sonie
1991-01-01
Whether it be monitoring the thermal subsystem of Space Station Freedom, or controlling the navigation of the autonomous rover on Mars, NASA missions in the 90's cannot enjoy an increased level of autonomy without the efficient use of expert systems. Merely increasing the computational speed of uniprocessors may not be able to guarantee that real time demands are met for large expert systems. Speed-up via parallel processing must be pursued alongside the optimization of sequential implementations. Prototypes of parallel expert systems have been built at universities and industrial labs in the U.S. and Japan. The state-of-the-art research in progress related to parallel execution of expert systems was surveyed. The survey is divided into three major sections: (1) multiprocessors for parallel expert systems; (2) parallel languages for symbolic computations; and (3) measurements of parallelism of expert system. Results to date indicate that the parallelism achieved for these systems is small. In order to obtain greater speed-ups, data parallelism and application parallelism must be exploited.
NASA/FAA North Texas Research Station Overview
NASA Technical Reports Server (NTRS)
Borchers, Paul F.
2012-01-01
NTX Research Station: NASA research assets embedded in an interesting operational air transport environment. Seven personnel (2 civil servants, 5 contractors). ARTCC, TRACON, Towers, 3 air carrier AOCs (American, Eagle, and Southwest), and 2 major airports all within 12 miles. Supports the NASA Airspace Systems Program with research products at all levels (fundamental to system level). NTX Laboratory: 5000 sq ft purpose-built, dedicated air traffic management research facility. Established data links to ARTCC, TRACON, Towers, air carriers, airport, and NASA facilities. Re-configurable computer labs, dedicated radio tower, state-of-the-art equipment.
Development of advanced avionics systems applicable to terminal-configured vehicles
NASA Technical Reports Server (NTRS)
Heimbold, R. L.; Lee, H. P.; Leffler, M. F.
1980-01-01
A technique to add the time constraint to the automatic descent feature of the existing L-1011 aircraft Flight Management System (FMS) was developed. Software modifications were incorporated in the FMS computer program and the results checked by lab simulation and on a series of eleven test flights. An arrival time dispersion (2 sigma) of 19 seconds was achieved. The 4 D descent technique can be integrated with the time-based metering method of air traffic control. Substantial reductions in delays at today's busy airports should result.
ATHLETE: Trading Complexity for Mass in Roving Vehicles
NASA Technical Reports Server (NTRS)
Wilcox, Brian H.
2013-01-01
This paper describes a scaling analysis of ATHLETE for exploration of the moon, Mars and Near-Earth Asteroids (NEAs) in comparison to a more conventional vehicle configuration. Recently, the focus of human exploration beyond LEO has been on NEAs. A low gravity testbed has been constructed in the ATHLETE lab, with six computer-controlled winches able to lift ATHLETE and payloads so as to simulate the motion of the system in the vicinity of a NEA or to simulate ATHLETE on extreme terrain in lunar or Mars gravity. Test results from this system are described.
Ramos, Rogelio; Zlatev, Roumen; Valdez, Benjamin; Stoytcheva, Margarita; Carrillo, Mónica; García, Juan-Francisco
2013-01-01
A virtual instrumentation (VI) system called the VI localized corrosion image analyzer (LCIA), based on LabVIEW 2010, was developed, allowing rapid, automatic, subjective-error-free determination of the number of pits on large corroded specimens. The VI LCIA synchronously controls the digital microscope image acquisition and its analysis, finally producing a map file containing the coordinates of the detected probable pit-containing zones on the investigated specimen. The pit area, traverse length, and density are also determined by the VI using binary large object (blob) analysis. The resulting map file can be used further by a scanning vibrating electrode technique (SVET) system for a rapid (one-pass) "true/false" SVET check of the probable zones only, passing through the pits' centers and thus avoiding a scan of the entire specimen. A complete SVET scan over the zones already proved "true" could determine the corrosion rate in any of the zones. PMID:23691434
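The blob analysis at the heart of the VI LCIA — labeling connected dark regions of the thresholded micrograph and reporting their centroids and areas, which become the SVET scan coordinates — can be sketched with a plain flood fill. This stands in for LabVIEW's image-processing tools and is not the authors' implementation:

```python
def find_pits(binary_image, min_area=1):
    """Label 4-connected foreground blobs in a thresholded image.

    binary_image: nested list of 0/1 pixels. Returns one
    (centroid_row, centroid_col, area) tuple per probable pit,
    in scan order.
    """
    rows, cols = len(binary_image), len(binary_image[0])
    seen = [[False] * cols for _ in range(rows)]
    pits = []
    for r in range(rows):
        for c in range(cols):
            if binary_image[r][c] and not seen[r][c]:
                stack, cells = [(r, c)], []
                seen[r][c] = True
                while stack:  # iterative flood fill
                    y, x = stack.pop()
                    cells.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < rows and 0 <= nx < cols \
                                and binary_image[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                if len(cells) >= min_area:  # reject single-pixel noise if desired
                    cy = sum(y for y, _ in cells) / len(cells)
                    cx = sum(x for _, x in cells) / len(cells)
                    pits.append((cy, cx, len(cells)))
    return pits
```

The centroids are exactly the "pass through the pits' centers" coordinates a one-pass SVET traverse would visit.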
Wind turbine remote control using Android devices
NASA Astrophysics Data System (ADS)
Rat, C. L.; Panoiu, M.
2018-01-01
This paper describes the remote control of a wind turbine system over the internet using an Android device, namely a tablet or a smartphone. The wind turbine workstation contains a LabVIEW program which monitors the entire wind turbine energy conversion system (WECS). The Android device connects to the LabVIEW application, working as a remote interface to the wind turbine. The communication between the devices needs to be secured because it takes place over the internet; hence, the data are encrypted before being sent through the network. The scope was the design of remote-control software capable of visualizing real-time wind turbine data through a secure connection. Since the WECS is fully automated and no full-time human operator exists, unattended access to the turbine workstation is needed. Therefore the device must not require any confirmation or permission from the computer operator in order to control it. Another condition is that the Android application must not require root access.
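The secured link is described only at a high level. One ingredient, message framing with an integrity tag, is sketched below; the Python stdlib has no AES, so only authentication is shown, and a real deployment would wrap the socket in TLS or use an encryption library. The field names and frame layout are assumptions, not the paper's protocol:

```python
import hashlib
import hmac
import json
import struct

def pack_reading(key, reading):
    """Frame one telemetry sample: 4-byte big-endian length prefix,
    JSON payload, then an HMAC-SHA256 tag so the receiving tablet
    can verify integrity and authenticity."""
    payload = json.dumps(reading, sort_keys=True).encode()
    tag = hmac.new(key, payload, hashlib.sha256).digest()
    return struct.pack(">I", len(payload)) + payload + tag

def unpack_reading(key, frame):
    """Verify the tag in constant time; return the decoded sample,
    or None if the frame was tampered with or keyed differently."""
    (length,) = struct.unpack(">I", frame[:4])
    payload, tag = frame[4:4 + length], frame[4 + length:]
    expected = hmac.new(key, payload, hashlib.sha256).digest()
    if not hmac.compare_digest(expected, tag):
        return None
    return json.loads(payload)
```

This also satisfies the no-root constraint: it is plain socket-level framing with no privileged APIs on the Android side.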
Bayomy, Hanaa; El Awadi, Mona; El Araby, Eman; Abed, Hala A
2016-12-01
Computer-assisted medical education has been developed to enhance learning and enable high-quality medical care. This study aimed to assess computer knowledge and attitude toward the inclusion of computers in medical education among second-year medical students in Benha Faculty of Medicine, Egypt, to identify limitations, and obtain suggestions for successful computer-based learning. This was a one-group pre-post-test study, which was carried out on second-year students in Benha Faculty of Medicine. A structured self-administered questionnaire was used to compare students' knowledge, attitude, limitations, and suggestions toward computer usage in medical education before and after the computer course to evaluate the change in students' responses. The majority of students were familiar with use of the mouse and keyboard, basic word processing, internet and web searching, and e-mail both before and after the computer course. The proportion of students who were familiar with software programs other than the word processing and trouble-shoot software/hardware was significantly higher after the course (P<0.001). There was a significant increase in the proportion of students who agreed on owning a computer (P=0.008), the inclusion of computer skills course in medical education, downloading lecture handouts, and computer-based exams (P<0.001) after the course. After the course, there was a significant increase in the proportion of students who agreed that the lack of central computers limited the inclusion of computer in medical education (P<0.001). 
Although the lack of computer labs, lack of Information Technology staff mentoring, large number of students, unclear course outline, and lack of internet access were more frequently reported before the course (P<0.001), the majority of students suggested the provision of computer labs, inviting Information Technology staff to support computer teaching, and the availability of free Wi-Fi internet access covering several areas in the university campus; all would support computer-assisted medical education. Medical students in Benha University are computer literate, which allows for computer-based medical education. Staff training, provision of computer labs, and internet access are essential requirements for enhancing computer usage in medical education in the university.
Wide-field Imaging System and Rapid Direction of Optical Zoom (WOZ)
2010-09-25
commercial software packages: SolidWorks, COMSOL Multiphysics, and ZEMAX optical design. SolidWorks is a computer-aided design package which has a live interface to COMSOL. COMSOL is a finite element analysis/partial differential equation solver. ZEMAX is an optical design package. Both COMSOL and ZEMAX have live interfaces to MATLAB. Our initial investigations have enabled a model in SolidWorks to be updated in COMSOL, an FEA calculation
Polyphony: A Workflow Orchestration Framework for Cloud Computing
NASA Technical Reports Server (NTRS)
Shams, Khawaja S.; Powell, Mark W.; Crockett, Tom M.; Norris, Jeffrey S.; Rossi, Ryan; Soderstrom, Tom
2010-01-01
Cloud Computing has delivered unprecedented compute capacity to NASA missions at affordable rates. Missions like the Mars Exploration Rovers (MER) and Mars Science Lab (MSL) are enjoying the elasticity that enables them to leverage hundreds, if not thousands, of machines for short durations without making any hardware procurements. In this paper, we describe Polyphony, a resilient, scalable, and modular framework that efficiently leverages a large set of computing resources to perform parallel computations. Polyphony can employ resources on the cloud, excess capacity on local machines, as well as spare resources at the supercomputing center, and it enables these resources to work in concert to accomplish a common goal. Polyphony is resilient to node failures, even if they occur in the middle of a transaction. We conclude with an evaluation of a production-ready application built on top of Polyphony to perform image-processing operations on images from around the solar system, including Mars, Saturn, and Titan.
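Polyphony's resilience to node failure can be illustrated with a toy retrying task pool; the real framework additionally coordinates cloud, local, and supercomputer resources and survives mid-transaction failures, none of which this sketch attempts:

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def run_resilient(tasks, workers=4, max_retries=3):
    """Run independent zero-argument tasks on a worker pool,
    resubmitting any task that raises, up to max_retries times.
    Returns results in task order; permanently failed tasks yield None.
    """
    results = {}
    attempts = {i: 0 for i in range(len(tasks))}
    with ThreadPoolExecutor(max_workers=workers) as pool:
        pending = {pool.submit(t): i for i, t in enumerate(tasks)}
        while pending:
            # snapshot: resubmissions join the next round
            for fut in as_completed(list(pending)):
                i = pending.pop(fut)
                try:
                    results[i] = fut.result()
                except Exception:  # simulated "node failure"
                    attempts[i] += 1
                    if attempts[i] <= max_retries:
                        pending[pool.submit(tasks[i])] = i
                    else:
                        results[i] = None  # give up on this task
    return [results[i] for i in range(len(tasks))]
```

Image-processing jobs of the kind described (per-image operations with no cross-task state) fit this embarrassingly parallel retry model well.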
Prediction of quantitative intrathoracic fluid volume to diagnose pulmonary oedema using LabVIEW.
Urooj, Shabana; Khan, M; Ansari, A Q; Lay-Ekuakille, Aimé; Salhan, Ashok K
2012-01-01
Pulmonary oedema is a life-threatening disease that requires special attention in the area of research and clinical diagnosis. Computer-based techniques are rarely used to quantify the intrathoracic fluid volume (IFV) for diagnostic purposes. This paper discusses a software program developed to detect and diagnose pulmonary oedema using LabVIEW. The software runs on anthropometric dimensions and physiological parameters, mainly transthoracic electrical impedance (TEI). This technique is accurate and faster than existing manual techniques. The LabVIEW software was used to compute the parameters required to quantify IFV. An equation relating per cent control and IFV was obtained. The results of predicted TEI and measured TEI were compared with previously reported data to validate the developed program. It was found that the predicted values of TEI obtained from the computer-based technique were much closer to the measured values of TEI. Six new subjects were enrolled to measure and predict transthoracic impedance and hence to quantify IFV. A similar difference was also observed in the measured and predicted values of TEI for the new subjects.
Berkeley lab checkpoint/restart (BLCR) for Linux clusters
Hargrove, Paul H.; Duell, Jason C.
2006-09-01
This article describes the motivation, design and implementation of Berkeley Lab Checkpoint/Restart (BLCR), a system-level checkpoint/restart implementation for Linux clusters that targets the space of typical High Performance Computing applications, including MPI. Application-level solutions, including both checkpointing and fault-tolerant algorithms, are recognized as more time- and space-efficient than system-level checkpoints, which cannot make use of any application-specific knowledge. However, system-level checkpointing allows for preemption, making it suitable for responding to fault precursors (for instance, elevated error rates from ECC memory or network CRCs, or elevated temperature from sensors). Preemption can also increase the efficiency of batch scheduling; for instance, reducing idle cycles (by allowing for shutdown without any queue draining period, or reallocation of resources to eliminate idle nodes when better-fitting jobs are queued), and reducing the average queued time (by limiting large jobs to running during off-peak hours, without the need to limit the length of such jobs). Each of these potential uses makes BLCR a valuable tool for efficient resource management in Linux clusters.
Andy Jenkins Builds Applications Development For Lab-on-a-Chip
NASA Technical Reports Server (NTRS)
2004-01-01
Andy Jenkins, an engineer for the Lab on a Chip Applications Development program, helped build the Applications Development Unit (ADU-25), a one-of-a-kind facility for controlling and analyzing processes on chips with extreme accuracy. Pressure is used to cause fluids to travel through a network of fluid pathways, or micro-channels, embossed on the chips through a process similar to the one used to print circuits on computer chips. To make customized chips for various applications, NASA has an agreement with the U.S. Army's Microdevices and Microfabrication Laboratory at Redstone Arsenal in Huntsville, Alabama, where NASA's Marshall Space Flight Center (MSFC) is located. The Marshall Center team is also collaborating with scientists at other NASA centers and at universities to develop custom chip designs for many applications, such as studying how fluidic systems work in spacecraft and identifying microbes in self-contained life support systems. Chips could even be designed for use on Earth, such as for detecting deadly microbes in heating and air systems. (NASA/MSFC/D.Stoffer)
A Virtual Instrument System for Determining Sugar Degree of Honey
Wu, Qijun; Gong, Xun
2015-01-01
This study established a LabVIEW-based virtual instrument system to measure optical activity through the communication of conventional optical instrument with computer via RS232 port. This system realized the functions for automatic acquisition, real-time display, data processing, results playback, and so forth. Therefore, it improved accuracy of the measurement results by avoiding the artificial operation, cumbersome data processing, and the artificial error in optical activity measurement. The system was applied to the analysis of the batch inspection on the sugar degree of honey. The results obtained were satisfying. Moreover, it showed advantages such as friendly man-machine dialogue, simple operation, and easily expanded functions. PMID:26504615
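The system described communicates with a conventional optical instrument over RS232 and post-processes the readings automatically. The software side can be sketched as below; the reply format is a hypothetical assumption (real polarimeters each define their own serial protocol), and the conversion is an illustrative specific-rotation-style formula, not the paper's calibration:

```python
def parse_rotation(reply: str) -> float:
    """Parse a hypothetical polarimeter reply such as 'ROT:+012.345'
    into an optical rotation in degrees. The 'ROT:' framing is an
    assumption for illustration only."""
    if not reply.startswith("ROT:"):
        raise ValueError(f"unexpected reply: {reply!r}")
    return float(reply[len("ROT:"):])

def sugar_degree(rotation_deg: float, tube_length_dm: float,
                 concentration_factor: float) -> float:
    """Illustrative specific-rotation-style conversion from measured
    rotation to a sugar-degree value (not the paper's formula)."""
    return rotation_deg / (tube_length_dm * concentration_factor)

value = sugar_degree(parse_rotation("ROT:+012.345"), 2.0, 0.5)
```

Automating the parse step is what removes the manual transcription errors the abstract mentions.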
Expert overseer for mass spectrometer system
Filby, Evan E.; Rankin, Richard A.
1991-01-01
An expert overseer for the operation and real-time management of a mass spectrometer and associated laboratory equipment. The overseer is a computer-based expert diagnostic system implemented on a computer separate from the dedicated computer used to control the mass spectrometer and produce the analysis results. An interface links the overseer to components of the mass spectrometer, components of the laboratory support system, and the dedicated control computer. Periodically, the overseer polls these devices as well as itself. These data are fed into an expert portion of the system for real-time evaluation. A knowledge base used for the evaluation includes both heuristic rules and precise operating parameters. The overseer also compares current readings to a long-term database to detect any developing trends, using a combination of statistical and heuristic rules to evaluate the results. The overseer can alert lab personnel whenever questionable readings or trends are observed, provide a background review of the problem, and suggest root causes, potential solutions, or appropriate additional tests that could be performed. The overseer can change the sequence or frequency of the polling to respond to an observation in the current data.
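The evaluation strategy described, precise operating limits combined with statistical trend detection against recent history, can be sketched as follows. The thresholds, window size, and alert wording are assumptions for illustration, not the patent's rules:

```python
import statistics
from collections import deque

class Overseer:
    """Toy sketch of the overseer's polling loop: a hard limit check
    (precise operating parameter) plus a statistical check flagging a
    reading that drifts beyond k standard deviations of recent history."""

    def __init__(self, low, high, window=20, k=3.0):
        self.low, self.high, self.k = low, high, k
        self.history = deque(maxlen=window)

    def evaluate(self, reading):
        alerts = []
        if not (self.low <= reading <= self.high):  # limit rule
            alerts.append("out of operating range")
        if len(self.history) >= 5:                  # trend/outlier rule
            mu = statistics.mean(self.history)
            sd = statistics.stdev(self.history)
            if sd > 0 and abs(reading - mu) > self.k * sd:
                alerts.append("statistically anomalous vs recent history")
        self.history.append(reading)
        return alerts
```

A real overseer would, as the abstract notes, also adapt its polling schedule in response to such alerts.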
New Technology and the Curriculum.
ERIC Educational Resources Information Center
Conklin, Joyce
1987-01-01
Hillsdale High School, in San Mateo, California, installed the nation's first 15-computer Macintosh laboratory donated by Apple Computer, Inc. This article describes the lab and the uses to which it has been put, including computer education, word processing, preparation of student publications, and creative writing instruction. (PGD)
Computer Series, 98. Electronics for Scientists: A Computer-Intensive Approach.
ERIC Educational Resources Information Center
Scheeline, Alexander; Mork, Brian J.
1988-01-01
Reports the design for a principles-before-details presentation of electronics for an instrumental analysis class. Uses computers for data collection and simulations. Requires one semester with two 2.5-hour periods and two lectures per week. Includes lab and lecture syllabi. (MVL)
Implementing Equal Access Computer Labs.
ERIC Educational Resources Information Center
Clinton, Janeen; And Others
This paper discusses the philosophy followed in Palm Beach County to adapt computer literacy curriculum, hardware, and software to meet the needs of all children. The Department of Exceptional Student Education and the Department of Instructional Computing Services cooperated in planning strategies and coordinating efforts to implement equal…
Beyond the Renderer: Software Architecture for Parallel Graphics and Visualization
NASA Technical Reports Server (NTRS)
Crockett, Thomas W.
1996-01-01
As numerous implementations have demonstrated, software-based parallel rendering is an effective way to obtain the needed computational power for a variety of challenging applications in computer graphics and scientific visualization. To fully realize their potential, however, parallel renderers need to be integrated into a complete environment for generating, manipulating, and delivering visual data. We examine the structure and components of such an environment, including the programming and user interfaces, rendering engines, and image delivery systems. We consider some of the constraints imposed by real-world applications and discuss the problems and issues involved in bringing parallel rendering out of the lab and into production.
A virtual computer lab for distance biomedical technology education.
Locatis, Craig; Vega, Anibal; Bhagwat, Medha; Liu, Wei-Li; Conde, Jose
2008-03-13
The National Library of Medicine's National Center for Biotechnology Information offers mini-courses which entail applying concepts in biochemistry and genetics to search genomics databases and other information sources. They are highly interactive and involve use of 3D molecular visualization software that can be computationally taxing. Methods were devised to offer the courses at a distance so as to provide as much functionality of a computer lab as possible, the venue where they are normally taught. The methods, which can be employed with varied videoconferencing technology and desktop sharing software, were used to deliver mini-courses at a distance in pilot applications where students could see demonstrations by the instructor and the instructor could observe and interact with students working at their remote desktops. Student ratings of the learning experience and comments to open ended questions were similar to those when the courses are offered face to face. The real time interaction and the instructor's ability to access student desktops from a distance in order to provide individual assistance and feedback were considered invaluable. The technologies and methods mimic much of the functionality of computer labs and may be usefully applied in any context where content changes frequently, training needs to be offered on complex computer applications at a distance in real time, and where it is necessary for the instructor to monitor students as they work.
Effects of Computer Animation Exercises on Student Cognitive Processes.
ERIC Educational Resources Information Center
Fowler, Will
A study examining the effects of computer animation exercises on cognitive development asked two groups of seventh graders to create computer animations, working from a simple mythic text. The ability of students to create narrative scenarios from this mythic text was analyzed. These scenarios were then recreated in the school computer lab, using…
ERIC Educational Resources Information Center
Roblyer, M. D., Ed.
Current issues in educational uses for microcomputers are addressed in this collection of 139 abstracts of papers in which computer literacy and practical applications dominate. Topics discussed include factors related to computer use in the classroom, e.g., computer lab utilization; teaching geometry, science, math, and English via…
ERIC Educational Resources Information Center
Ziegler, Blake E.
2013-01-01
Computational chemistry undergraduate laboratory courses are now part of the chemistry curriculum at many universities. However, there remains a lack of computational chemistry exercises available to instructors. This exercise is presented for students to develop skills using computational chemistry software while supplementing their knowledge of…
NASA Astrophysics Data System (ADS)
Møll Nilsen, Halvor; Lie, Knut-Andreas; Andersen, Odd
2015-06-01
MRST-co2lab is a collection of open-source computational tools for modeling large-scale and long-time migration of CO2 in conductive aquifers, combining ideas from basin modeling, computational geometry, hydrology, and reservoir simulation. Herein, we employ the methods of MRST-co2lab to study long-term CO2 storage on the scale of hundreds of megatonnes. We consider public data sets of two aquifers from the Norwegian North Sea and use geometrical methods for identifying structural traps, percolation-type methods for identifying potential spill paths, and vertical-equilibrium methods for efficient simulation of structural, residual, and solubility trapping in a thousand-year perspective. In particular, we investigate how data resolution affects estimates of storage capacity and discuss workflows for identifying good injection sites and optimizing injection strategies.
Arduino: a low-cost multipurpose lab equipment.
D'Ausilio, Alessandro
2012-06-01
Typical experiments in psychological and neurophysiological settings often require the accurate control of multiple input and output signals. These signals are often generated or recorded via computer software and/or external dedicated hardware. Dedicated hardware is usually very expensive and requires additional software to control its behavior. In the present article, I present some accuracy tests on a low-cost and open-source I/O board (Arduino family) that may be useful in many lab environments. One of the strengths of Arduinos is the possibility they afford to load the experimental script on the board's memory and let it run without interfacing with computers or external software, thus granting complete independence, portability, and accuracy. Furthermore, a large community has arisen around the Arduino idea and offers many hardware add-ons and hundreds of free scripts for different projects. Accuracy tests show that Arduino boards may be an inexpensive tool for many psychological and neurophysiological labs.
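The kind of timing-accuracy test the article runs on Arduino hardware can be illustrated on the host computer itself: repeatedly schedule a fixed interval and measure the mean interval and jitter actually achieved. The target interval and sample count below are arbitrary choices for the sketch:

```python
import time
import statistics

def measure_interval_jitter(target_ms=10.0, n=100):
    """Measure how accurately a naive software loop reproduces a target
    interval -- the style of accuracy test described for the Arduino,
    here applied to the host computer for illustration only."""
    intervals = []
    last = time.perf_counter()
    for _ in range(n):
        time.sleep(target_ms / 1000.0)
        now = time.perf_counter()
        intervals.append((now - last) * 1000.0)  # elapsed time in ms
        last = now
    return statistics.mean(intervals), statistics.stdev(intervals)

mean_ms, jitter_ms = measure_interval_jitter()
```

Running the script on the board's own memory, as the article notes, avoids exactly this host-side scheduling jitter.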
ERIC Educational Resources Information Center
Chen, Baiyun; DeMara, Ronald F.; Salehi, Soheil; Hartshorne, Richard
2018-01-01
A laboratory pedagogy interweaving weekly student portfolios with onsite formative electronic laboratory assessments (ELAs) is developed and assessed within the laboratory component of a required core course of the electrical and computer engineering (ECE) undergraduate curriculum. The approach acts to promote student outcomes, and neutralize…
Gene Expression Analysis: Teaching Students to Do 30,000 Experiments at Once with Microarray
ERIC Educational Resources Information Center
Carvalho, Felicia I.; Johns, Christopher; Gillespie, Marc E.
2012-01-01
Genome scale experiments routinely produce large data sets that require computational analysis, yet there are few student-based labs that illustrate the design and execution of these experiments. In order for students to understand and participate in the genomic world, teaching labs must be available where students generate and analyze large data…
Effectiveness of e-Lab Use in Science Teaching at the Omani Schools
ERIC Educational Resources Information Center
Al Musawi, A.; Ambusaidi, A.; Al-Balushi, S.; Al-Balushi, K.
2015-01-01
Computer and information technology can be used so that students, individually, in groups, or via electronic demonstration, can experiment and draw conclusions for the required activities in electronic form, in what is now called an "e-lab". It enables students to conduct experiments more flexibly and in an interactive way using multimedia.…
ERIC Educational Resources Information Center
Schellhammer, Karl Sebastian; Cuniberti, Gianaurelio
2017-01-01
We are hereby presenting a didactic concept for an advanced lab course that focuses on the design of donor materials for organic solar cells. Its research-related and competence-based approach qualifies the students to independently and creatively apply computational methods and to profoundly and critically discuss the results obtained. The high…
SenseCube--A Novel Inexpensive Wireless Multisensor for Physics Lab Experimentations
ERIC Educational Resources Information Center
Mehta, Vedant; Lane, Charles D.
2018-01-01
SenseCube is a multisensor capable of measuring many different real-time events and changes in environment. Most conventional sensors used in introductory-physics labs use their own software and have wires that must be attached to a computer or an alternate device to analyze the data. This makes the standard sensors time consuming, tedious, and…
NASA Astrophysics Data System (ADS)
di, L.; Deng, M.
2010-12-01
Remote sensing (RS) is an essential method to collect data for Earth science research. Huge amounts of remote sensing data, most of them in image form, have been acquired. Almost all geography departments in the world offer courses in digital processing of remote sensing images. Such courses place emphasis on how to digitally process large amounts of multi-source images for solving real-world problems. However, due to the diversity and complexity of RS images and the shortcomings of current data and processing infrastructure, obstacles to effectively teaching such courses remain. The major obstacles include 1) difficulties in finding, accessing, integrating, and using massive RS images by students and educators, and 2) inadequate processing functions and computing facilities for students to freely explore the massive data. Recent development in geospatial Web processing service systems, which make massive data, computing power, and processing capabilities available to average Internet users anywhere in the world, promises the removal of these obstacles. The GeoBrain system developed by CSISS is an example of such systems. All functions available in GRASS Open Source GIS have been implemented as Web services in GeoBrain. Petabytes of remote sensing images in NASA data centers, the USGS Landsat data archive, and NOAA CLASS are accessible transparently and processable through GeoBrain. The GeoBrain system is operated on a high-performance cluster server with large disk storage and a fast Internet connection. All GeoBrain capabilities can be accessed by any Internet-connected Web browser. Dozens of universities have used GeoBrain as an ideal platform to support data-intensive remote sensing education. This presentation gives a specific example of using GeoBrain geoprocessing services to enhance the teaching of GGS 588, Digital Remote Sensing, taught at the Department of Geography and Geoinformation Science, George Mason University.
The course uses the textbook "Introductory Digital Image Processing: A Remote Sensing Perspective" by John Jensen. The textbook is widely adopted by geography departments around the world for training students in digital processing of remote sensing images. In the traditional teaching setting for the course, the instructor prepares a set of sample remote sensing images to be used for the course. Commercial desktop remote sensing software, such as ERDAS, is used for students to do the lab exercises. The students have to do the exercises in the lab and can use only the sample images. For this specific course at GMU, we developed GeoBrain-based lab exercises. With GeoBrain, students can now explore petabytes of remote sensing images in the NASA, NOAA, and USGS data archives instead of dealing only with sample images. Students have a much more powerful computing facility available for their lab exercises. They can explore the data and do the exercises at any time and in any place, as long as they can access the Internet through a Web browser. Feedback from students about the learning experience in digital image processing with the help of GeoBrain Web processing services has been very positive. The teaching/lab materials and GeoBrain services are freely available to anyone at http://www.laits.gmu.edu.
Computational strategies for three-dimensional flow simulations on distributed computer systems
NASA Technical Reports Server (NTRS)
Sankar, Lakshmi N.; Weed, Richard A.
1995-01-01
This research effort is directed towards an examination of issues involved in porting large computational fluid dynamics codes in use within the industry to a distributed computing environment. This effort addresses strategies for implementing the distributed computing in a device independent fashion and load balancing. A flow solver called TEAM presently in use at Lockheed Aeronautical Systems Company was acquired to start this effort. The following tasks were completed: (1) The TEAM code was ported to a number of distributed computing platforms including a cluster of HP workstations located in the School of Aerospace Engineering at Georgia Tech; a cluster of DEC Alpha Workstations in the Graphics visualization lab located at Georgia Tech; a cluster of SGI workstations located at NASA Ames Research Center; and an IBM SP-2 system located at NASA ARC. (2) A number of communication strategies were implemented. Specifically, the manager-worker strategy and the worker-worker strategy were tested. (3) A variety of load balancing strategies were investigated. Specifically, the static load balancing, task queue balancing and the Crutchfield algorithm were coded and evaluated. (4) The classical explicit Runge-Kutta scheme in the TEAM solver was replaced with an LU implicit scheme. And (5) the implicit TEAM-PVM solver was extensively validated through studies of unsteady transonic flow over an F-5 wing, undergoing combined bending and torsional motion. These investigations are documented in extensive detail in the dissertation, 'Computational Strategies for Three-Dimensional Flow Simulations on Distributed Computing Systems', enclosed as an appendix.
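The task-queue load balancing investigated in task (3) can be sketched with a shared work queue from which idle workers pull the next block, so faster workers automatically take on more work. The "work" below is a stand-in computation, not the TEAM solver, and the block names are invented:

```python
import queue
import threading

def run_task_queue(tasks, n_workers=4):
    """Task-queue load balancing sketch: workers repeatedly pull the next
    available block from a shared queue until the queue is drained."""
    work = queue.Queue()
    for t in tasks:
        work.put(t)
    results, lock = {}, threading.Lock()

    def worker():
        while True:
            try:
                name, cost = work.get_nowait()
            except queue.Empty:
                return  # no work left; worker exits
            value = sum(range(cost))  # stand-in for a flow-solver block
            with lock:
                results[name] = value
            work.task_done()

    threads = [threading.Thread(target=worker) for _ in range(n_workers)]
    for th in threads:
        th.start()
    for th in threads:
        th.join()
    return results

# Eight blocks of uneven cost, distributed across four workers.
results = run_task_queue([(f"block{i}", 1000 * (i + 1)) for i in range(8)])
```

Unlike static load balancing, no block-to-worker assignment is fixed in advance, which helps when block costs are uneven or nodes differ in speed.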
DOE Office of Scientific and Technical Information (OSTI.GOV)
Underwood, Keith D; Ulmer, Craig D.; Thompson, David
Field programmable gate arrays (FPGAs) have been used as alternative computational devices for over a decade; however, they have not been used for traditional scientific computing due to their perceived lack of floating-point performance. In recent years, there has been a surge of interest in alternatives to traditional microprocessors for high performance computing. Sandia National Labs began two projects to determine whether FPGAs would be a suitable alternative to microprocessors for high performance scientific computing and, if so, how they should be integrated into the system. We present results that indicate that FPGAs could have a significant impact on future systems. FPGAs have the potential to have order of magnitude levels of performance wins on several key algorithms; however, there are serious questions as to whether the system integration challenge can be met. Furthermore, there remain challenges in FPGA programming and system level reliability when using FPGA devices. Acknowledgment: Arun Rodrigues provided valuable support and assistance in the use of the Structural Simulation Toolkit within an FPGA context. Curtis Janssen and Steve Plimpton provided valuable insights into the workings of two Sandia applications (MPQC and LAMMPS, respectively).
Red Storm Usage Model: Version 1.12.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jefferson, Karen L.; Sturtevant, Judith E.
Red Storm is an Advanced Simulation and Computing (ASC) funded massively parallel supercomputer located at Sandia National Laboratories (SNL). The Red Storm Usage Model (RSUM) documents the capabilities and the environment provided for the FY05 Tri-Lab Level II Limited Availability Red Storm User Environment Milestone and the FY05 SNL Level II Limited Availability Red Storm Platform Milestone. This document describes specific capabilities, tools, and procedures to support both local and remote users. The model is focused on the needs of the ASC user working in the secure computing environments at Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and SNL. Additionally, the Red Storm Usage Model maps the provided capabilities to the Tri-Lab ASC Computing Environment (ACE) requirements. The ACE requirements reflect the high performance computing requirements for the ASC community and have been updated in FY05 to reflect the community's needs. For each section of the RSUM, Appendix I maps the ACE requirements to the Limited Availability User Environment capabilities and includes a description of ACE requirements met and those requirements that are not met in that particular section. The Red Storm Usage Model, along with the ACE mappings, has been issued and vetted throughout the Tri-Lab community.
Babjack, Destiny L; Cernicky, Brandon; Sobotka, Andrew J; Basler, Lee; Struthers, Devon; Kisic, Richard; Barone, Kimberly; Zuccolotto, Anthony P
2015-09-01
Using differing computer platforms and audio output devices to deliver audio stimuli often introduces (1) substantial variability across labs and (2) variable time between the intended and actual sound delivery (the sound onset latency). Fast, accurate audio onset latencies are particularly important when audio stimuli need to be delivered precisely as part of studies that depend on accurate timing (e.g., electroencephalographic, event-related potential, or multimodal studies), or in multisite studies in which standardization and strict control over the computer platforms used is not feasible. This research describes the variability introduced by using differing configurations and introduces a novel approach to minimizing audio sound latency and variability. A stimulus presentation and latency assessment approach is presented using E-Prime and Chronos (a new multifunction, USB-based data presentation and collection device). The present approach reliably delivers audio stimuli with low latencies that vary by ≤1 ms, independent of hardware and Windows operating system (OS)/driver combinations. The Chronos audio subsystem adopts a buffering, aborting, querying, and remixing approach to the delivery of audio, to achieve a consistent 1-ms sound onset latency for single-sound delivery, and precise delivery of multiple sounds that achieves standard deviations of 1/10th of a millisecond without the use of advanced scripting. Chronos's sound onset latencies are small, reliable, and consistent across systems. Testing of standard audio delivery devices and configurations highlights the need for careful attention to consistency between labs, experiments, and multiple study sites in their hardware choices, OS selections, and adoption of audio delivery systems designed to sidestep the audio latency variability issue.
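The reported figures (a consistent ~1 ms onset latency with a standard deviation of roughly a tenth of a millisecond) are summaries of intended-versus-actual onset times, and computing them is straightforward. The sample times below are hypothetical, not the study's measurements:

```python
import statistics

def latency_stats(intended_ms, actual_ms):
    """Summarize sound onset latency (actual minus intended onset time).
    All times are in milliseconds."""
    latencies = [a - i for i, a in zip(intended_ms, actual_ms)]
    return {"mean_ms": statistics.mean(latencies),
            "sd_ms": statistics.stdev(latencies)}

# Hypothetical run: three sounds intended at 0, 500, and 1000 ms,
# actually heard at 1.0, 501.1, and 1000.9 ms.
stats = latency_stats([0.0, 500.0, 1000.0], [1.0, 501.1, 1000.9])
```

A hardware measurement loop (photodiode/microphone back to a timing device) would supply the `actual_ms` values in a real latency audit.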
Seqcrawler: biological data indexing and browsing platform.
Sallou, Olivier; Bretaudeau, Anthony; Roult, Aurelien
2012-07-24
Seqcrawler takes its roots in software like SRS or Lucegene. It provides an indexing platform to ease the search of data and meta-data in biological banks, and it can scale to face the current flow of data. While many biological bank search tools are available on the Internet, mainly provided by large organizations to search their own data, there is a lack of free and open source solutions for browsing one's own set of data with a flexible query system, able to scale from a single computer to a cloud system. A personal index platform will help labs and bioinformaticians to search their meta-data and also to build a larger information system with custom subsets of data. The software is scalable from a single computer to a cloud-based infrastructure. It has been successfully tested in a private cloud with 3 index shards (pieces of index) hosting ~400 million sequence records (the whole of GenBank, UniProt, PDB, and others) for a total size of 600 GB in a fault-tolerant, high-availability architecture. It has also been successfully integrated with software to add extra meta-data from BLAST results to enhance users' result analysis. Seqcrawler provides a complete open source search-and-store solution for labs or platforms needing to manage large amounts of data/meta-data with a flexible and customizable web interface. All components (search engine, visualization, and data storage), though independent, share a common and coherent data system that can be queried with a simple HTTP interface. The solution scales easily and can also provide a high-availability infrastructure.
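At the core of any such indexing platform is an inverted index mapping tokens to record identifiers. A toy stand-in for the kind of keyword search Seqcrawler provides over sequence meta-data (the record contents and identifiers below are invented for illustration):

```python
def build_index(records):
    """Build a minimal inverted index: token -> set of record ids.
    `records` maps a record id to its free-text description."""
    index = {}
    for rec_id, text in records.items():
        for token in text.lower().split():
            index.setdefault(token, set()).add(rec_id)
    return index

def search(index, *terms):
    """Return the ids of records matching every term (AND semantics)."""
    sets = [index.get(t.lower(), set()) for t in terms]
    return set.intersection(*sets) if sets else set()

idx = build_index({"P12345": "human kinase protein",
                   "Q99999": "mouse kinase enzyme"})
```

Sharding, as described in the abstract, amounts to splitting such an index across machines and merging the per-shard result sets at query time.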
Statistical Techniques Complement UML When Developing Domain Models of Complex Dynamical Biosystems.
Williams, Richard A; Timmis, Jon; Qwarnstrom, Eva E
2016-01-01
Computational modelling and simulation is increasingly being used to complement traditional wet-lab techniques when investigating the mechanistic behaviours of complex biological systems. In order to ensure computational models are fit for purpose, it is essential that the abstracted view of biology captured in the computational model, is clearly and unambiguously defined within a conceptual model of the biological domain (a domain model), that acts to accurately represent the biological system and to document the functional requirements for the resultant computational model. We present a domain model of the IL-1 stimulated NF-κB signalling pathway, which unambiguously defines the spatial, temporal and stochastic requirements for our future computational model. Through the development of this model, we observe that, in isolation, UML is not sufficient for the purpose of creating a domain model, and that a number of descriptive and multivariate statistical techniques provide complementary perspectives, in particular when modelling the heterogeneity of dynamics at the single-cell level. We believe this approach of using UML to define the structure and interactions within a complex system, along with statistics to define the stochastic and dynamic nature of complex systems, is crucial for ensuring that conceptual models of complex dynamical biosystems, which are developed using UML, are fit for purpose, and unambiguously define the functional requirements for the resultant computational model. PMID:27571414
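The descriptive statistics the authors pair with UML, for instance when summarizing heterogeneity of single-cell dynamics, can be as simple as mean, standard deviation, and coefficient of variation over per-cell measurements. The values below are illustrative, not data from the paper:

```python
import statistics

def summarize_heterogeneity(cell_peaks):
    """Descriptive statistics over per-cell response peaks -- the kind of
    complementary quantitative view that UML alone cannot capture."""
    mean = statistics.mean(cell_peaks)
    sd = statistics.stdev(cell_peaks)
    return {"n": len(cell_peaks),
            "mean": mean,
            "sd": sd,
            "cv": sd / mean}  # coefficient of variation: spread vs mean

# Hypothetical per-cell peak responses from five cells.
summary = summarize_heterogeneity([10.0, 12.0, 8.0, 11.0, 9.0])
```

Multivariate extensions (correlation, clustering, PCA) would follow the same pattern: quantify the population's spread where the UML diagram only names the entities involved.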
ERIC Educational Resources Information Center
Spennemann, Dirk H. R.; Atkinson, John; Cornforth, David
2007-01-01
Most universities have invested in extensive infrastructure in the form of computer laboratories and computer kiosks. However, is this investment justified when it is suggested that students work predominantly from home using their own computers? This paper provides an empirical study investigating how students at a regional multi-campus…
Using Flash Technology for Motivation and Assessment
ERIC Educational Resources Information Center
Deal, Walter F., III
2004-01-01
A visit to most any technology education laboratory or classroom will reveal that computers, software, and multimedia are rapidly becoming a mainstay in learning about technology and technological literacy. Almost all technology labs have at least several computers dedicated to specialized software or hardware such as Computer-aided…
CAI at CSDF: Organizational Strategies.
ERIC Educational Resources Information Center
Irwin, Margaret G.
1982-01-01
The computer assisted instruction (CAI) program at the California School for the Deaf, at Fremont, features individual Apple computers in classrooms as well as in CAI labs. When the whole class uses computers simultaneously, the teacher can help individuals, identify group weaknesses, note needs of the materials, and help develop additional CAI…
College Students' Use of the Internet.
ERIC Educational Resources Information Center
McFadden, Anna C.
1999-01-01
Studied use of the Internet by college students by determining sites selected on 6 of 70 computers in a college computer laboratory. The overwhelming use of the Internet in this open lab conformed to university acceptable-use policy, with almost no use of the computers to contact pornographic sites. (SLD)
Berkeley Lab - Materials Sciences Division
Large Hospital Buildings in the United States in 2007
2012-01-01
Hospitals consume large amounts of energy because of how they are operated and the many people who use them. They are open 24 hours a day; thousands of employees, patients, and visitors occupy the buildings daily; and sophisticated heating, ventilation, and air conditioning (HVAC) systems control the temperatures and air flow. In addition, many energy-intensive activities occur in these buildings: laundry, medical and lab equipment use, sterilization, computer and server use, food service, and refrigeration.
Terascale spectral element algorithms and implementations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fischer, P. F.; Tufo, H. M.
1999-08-17
We describe the development and implementation of an efficient spectral element code for multimillion gridpoint simulations of incompressible flows in general two- and three-dimensional domains. We review basic and recently developed algorithmic underpinnings that have resulted in good parallel and vector performance on a broad range of architectures, including the terascale computing systems now coming online at the DOE labs. Sustained performance of 219 GFLOPS has been recently achieved on 2048 nodes of the Intel ASCI-Red machine at Sandia.
Capturing Location-Privacy Preferences: Quantifying Accuracy and User-Burden Tradeoffs
2010-03-01
research groups have developed location-based services, including PARC's Active Badges [24], ActiveCampus [2], MyCampus [17], Intel's PlaceLab [11], and...groups. In Conference on Human Factors in Computing Systems (CHI), 2008. [2] L. Barkhuus and A. Dey. Location-based services for mobile telephony: A...location-based services. In OnTheMove Conferences (OTM), 2009. [6] K. Connelly, A. Khalil, and Y. Liu. Do I do what I say?: Observed versus stated privacy
Strategic Design of an Interactive Video Learning Lab (IVL).
ERIC Educational Resources Information Center
Switzer, Ralph V., Jr.; Switzer, Jamie S.
1993-01-01
Describes a study that researched elements necessary for the design of an interactive video learning (IVL) lab for business courses. Highlights include a review of pertinent literature; guidelines for the use of an IVL lab; IVL systems integration; system specifications; hardware costs; and system software. (five references) (LRW)
Impact of four training conditions on physician use of a web-based clinical decision support system.
Kealey, Edith; Leckman-Westin, Emily; Finnerty, Molly T
2013-09-01
Training has been identified as an important barrier to implementation of clinical decision support systems (CDSSs), but little is known about the effectiveness of different training approaches. Using an observational retrospective cohort design, we examined the impact of four training conditions on physician use of a CDSS: (1) computer lab training with individualized follow-up (CL-FU) (n=40), (2) computer lab training without follow-up (CL) (n=177), (3) lecture demonstration (LD) (n=16), or (4) no training (NT) (n=134). Odds ratios of any use and ongoing use under training conditions were compared to no training over a 2-year follow-up period. CL-FU was associated with the highest percentage of active users and odds of any use (90.0%, odds ratio (OR)=10.2, 95% confidence interval (CI): 3.2-32.9) and ongoing use (60.0%, OR=6.1, 95% CI: 2.6-13.7), followed by CL (any use=81.4%, OR=5.3, CI: 2.9-9.6; ongoing use=28.8%, OR=1.7, 95% CI: 1.0-3.0). LD was not superior to no training (any use=47%, ongoing use=22.4%). Training format may have differential effects on initial and long-term follow-up of CDSS use by physicians. Copyright © 2013 Elsevier B.V. All rights reserved.
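Odds ratios and Wald-style confidence intervals of the kind reported above can be computed from a 2x2 table of users versus non-users; a minimal Python sketch, using made-up counts rather than the study's actual data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and 95% CI for a 2x2 table:
    a = trained users, b = trained non-users,
    c = untrained users, d = untrained non-users."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # standard error of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Illustrative counts only (hypothetical, not the study's data):
or_, lo, hi = odds_ratio_ci(36, 4, 63, 71)
```

A CI whose lower bound exceeds 1 indicates the training condition is associated with higher odds of use than no training.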
Designing and validating the joint battlespace infosphere
NASA Astrophysics Data System (ADS)
Peterson, Gregory D.; Alexander, W. Perry; Birdwell, J. Douglas
2001-08-01
Fielding and managing the dynamic, complex information systems infrastructure necessary for defense operations presents significant opportunities for revolutionary improvements in capabilities. An example of this technology trend is the creation and validation of the Joint Battlespace Infosphere (JBI) being developed by the Air Force Research Lab. The JBI is a system of systems that integrates, aggregates, and distributes information to users at all echelons, from the command center to the battlefield. The JBI is a key enabler of the Air Force's Joint Vision 2010 core competencies, such as Information Superiority, providing increased situational awareness, planning capabilities, and dynamic execution. At the same time, creating this new operational environment introduces significant risk due to an increased dependency on computational and communications infrastructure combined with more sophisticated and frequent threats. Hence, the challenge facing the nation is to find the most effective means of exploiting new computational and communications technologies while mitigating the impact of attacks, faults, and unanticipated usage patterns.
Life Testing and Diagnostics of a Planar Out-of-Core Thermionic Converter
NASA Astrophysics Data System (ADS)
Thayer, Kevin L.; Ramalingam, Mysore L.; Young, Timothy J.; Lamp, Thomas R.
1994-07-01
This paper details the design and performance of an automated computer data acquisition system for a planar, out-of-core thermionic converter with CVD rhenium electrodes. The output characteristics of this converter have been mapped for emitter temperatures ranging from approximately 1700 K to 2000 K, and life testing of the converter is presently being performed at the design point of operation. An automated data acquisition system has been constructed to facilitate the collection of current density versus output voltage (J-V) and temperature data from the converter throughout the life test. This system minimizes the amount of human interaction necessary during the life test to measure and archive the data and present it in a usable form. The task was accomplished using a Macintosh IIcx computer, two multiple-purpose interface boards, a digital oscilloscope, a sweep generator, and National Instruments' LabVIEW application software package.
Dunne, James R; McDonald, Claudia L
2010-07-01
Pulse!! The Virtual Clinical Learning Lab at Texas A&M University-Corpus Christi, in collaboration with the United States Navy, has developed a model for research and technological development that they believe is an essential element in the future of military and civilian medical education. The Pulse!! project models a strategy for providing cross-disciplinary expertise and resources to educational, governmental, and business entities challenged with meeting looming health care crises. It includes a three-dimensional virtual learning platform that provides unlimited, repeatable, immersive clinical experiences without risk to patients, and is available anywhere there is a computer. Pulse!! utilizes expertise in the fields of medicine, medical education, computer science, software engineering, physics, computer animation, art, and architecture. Lab scientists collaborate with the commercial virtual-reality simulation industry to produce research-based learning platforms based on cutting-edge computer technology.
NDE scanning and imaging of aircraft structure
NASA Astrophysics Data System (ADS)
Bailey, Donald; Kepler, Carl; Le, Cuong
1995-07-01
The Science and Engineering Lab at McClellan Air Force Base, Sacramento, Calif., has been involved in the development and use of computer-based scanning systems for NDE (nondestructive evaluation) since 1985. This paper describes the history leading up to our current applications, which employ eddy current and ultrasonic scanning of aircraft structures that contain both metallics and advanced composites. The scanning is performed using industrialized computers interfaced to proprietary acquisition equipment and software. Examples are shown that image several types of damage, such as exfoliation and fuselage lap-joint corrosion in aluminum, impact damage, embedded foreign material, and porosity in Kevlar and graphite-epoxy composites. Image analysis techniques are reported that are performed using consumer-oriented computer hardware and software that are not NDE-specific and not expensive.
Human and Robotic Space Mission Use Cases for High-Performance Spaceflight Computing
NASA Technical Reports Server (NTRS)
Some, Raphael; Doyle, Richard; Bergman, Larry; Whitaker, William; Powell, Wesley; Johnson, Michael; Goforth, Montgomery; Lowry, Michael
2013-01-01
Spaceflight computing is a key resource in NASA space missions and a core determining factor of spacecraft capability, with ripple effects throughout the spacecraft, end-to-end system, and mission. Onboard computing can be aptly viewed as a "technology multiplier" in that advances provide direct dramatic improvements in flight functions and capabilities across the NASA mission classes, and enable new flight capabilities and mission scenarios, increasing science and exploration return. Space-qualified computing technology, however, has not advanced significantly in well over ten years and the current state of the practice fails to meet the near- to mid-term needs of NASA missions. Recognizing this gap, the NASA Game Changing Development Program (GCDP), under the auspices of the NASA Space Technology Mission Directorate, commissioned a study on space-based computing needs, looking out 15-20 years. The study resulted in a recommendation to pursue high-performance spaceflight computing (HPSC) for next-generation missions, and a decision to partner with the Air Force Research Lab (AFRL) in this development.
Health care information infrastructure: what will it be and how will we get there?
NASA Astrophysics Data System (ADS)
Kun, Luis G.
1996-02-01
During the first Health Care Technology Policy (HCTP) conference last year, during Health Care Reform, four major issues were brought up in regard to the efforts underway to develop a Computer Based Patient Record (CBPR), the National Information Infrastructure (NII) as part of the High Performance Computing & Communications (HPCC) program, and the so-called "Patient Card." More specifically, it was explained how a national information system will greatly affect the way health care delivery is provided to the United States public and reduce its costs. These four issues were: constructing a National Information Infrastructure (NII); building a Computer Based Patient Record system; bringing the collective resources of our National Laboratories to bear in developing and implementing the NII and CBPR, as well as a security system with which to safeguard the privacy rights of patients and the physician-patient privilege; and utilizing Government (e.g., DOD, DOE) capabilities (technology and human resources) to maximize resource utilization, create new jobs, and accelerate technology transfer to address health care issues. During the second HCTP conference, in mid-1995, a section entitled "Health Care Technology Assets of the Federal Government" addressed benefits of the technology transfer which should occur to maximize already developed resources. Another section, entitled "Transfer and Utilization of Government Technology Assets to the Private Sector," looked at both health-care and non-health-care related technologies, since many areas such as information technologies (i.e., imaging, communications, archival/retrieval, systems integration, information display, multimedia, heterogeneous databases, etc.) already exist and are part of our National Labs and/or other federal agencies, e.g., ARPA. Although these technologies are not labeled under "Health Care" programs, they could provide enormous value in addressing technical needs.
An additional issue deals with both the technical (hardware, software) and human expertise that resides within these labs and their possible role in creating cost effective solutions.
NASA Astrophysics Data System (ADS)
Kun, Luis G.
1995-10-01
During the first Health Care Technology Policy conference last year, during health care reform, four major issues were brought up in regard to the efforts underway to develop a computer based patient record (CBPR), the National Information Infrastructure (NII) as part of the high performance computers and communications (HPCC), and the so-called 'patient card.' More specifically it was explained how a national information system will greatly affect the way health care delivery is provided to the United States public and reduce its costs. These four issues were: (1) Constructing a national information infrastructure (NII); (2) Building a computer based patient record system; (3) Bringing the collective resources of our national laboratories to bear in developing and implementing the NII and CBPR, as well as a security system with which to safeguard the privacy rights of patients and the physician-patient privilege; (4) Utilizing government (e.g., DOD, DOE) capabilities (technology and human resources) to maximize resource utilization, create new jobs, and accelerate technology transfer to address health care issues. This year a section of this conference entitled 'Health Care Technology Assets of the Federal Government' addresses benefits of the technology transfer which should occur for maximizing already developed resources. This section, entitled 'Transfer and Utilization of Government Technology Assets to the Private Sector,' will look at both health care and non-health care related technologies since many areas such as information technologies (i.e., imaging, communications, archival/retrieval, systems integration, information display, multimedia, heterogeneous data bases, etc.) already exist and are part of our national labs and/or other federal agencies, e.g., ARPA. Although these technologies are not labeled under health care programs, they could provide enormous value in addressing technical needs.
An additional issue deals with both the technical (hardware, software) and human expertise that resides within these labs and their possible role in creating cost effective solutions.
ERIC Educational Resources Information Center
Armstrong, Matt; Comitz, Richard L.; Biaglow, Andrew; Lachance, Russ; Sloop, Joseph
2008-01-01
A novel approach to the Chemical Engineering curriculum sequence of courses at West Point enabled our students to experience a much more realistic design process, which more closely replicated a real world scenario. Students conduct the synthesis in the organic chemistry lab, then conduct computer modeling of the reaction with ChemCad and…
STS-105 Crew Training in VR Lab
2001-03-15
JSC2001-00751 (15 March 2001) --- Astronaut Scott J. Horowitz, STS-105 mission commander, uses the virtual reality lab at the Johnson Space Center (JSC) to train for his duties aboard the Space Shuttle Discovery. This type of computer interface paired with virtual reality training hardware and software helps to prepare the entire team for dealing with International Space Station (ISS) elements.
Photographic coverage of STS-112 during EVA 3 in VR Lab.
2002-08-21
JSC2002-E-34622 (21 August 2002) --- Astronaut David A. Wolf, STS-112 mission specialist, uses the virtual reality lab at the Johnson Space Center (JSC) to train for his duties aboard the Space Shuttle Atlantis. This type of computer interface paired with virtual reality training hardware and software helps to prepare the entire team for dealing with ISS elements.
2005-06-07
JSC2005-E-21191 (7 June 2005) --- Astronaut Steven G. MacLean, STS-115 mission specialist representing the Canadian Space Agency, uses the virtual reality lab at the Johnson Space Center to train for his duties aboard the space shuttle. This type of computer interface, paired with virtual reality training hardware and software, helps to prepare the entire team for dealing with space station elements.
STS-105 Crew Training in VR Lab
2001-03-15
JSC2001-00758 (15 March 2001) --- Astronaut Frederick W. Sturckow, STS-105 pilot, uses the virtual reality lab at the Johnson Space Center (JSC) to train for his duties aboard the Space Shuttle Discovery. This type of computer interface paired with virtual reality training hardware and software helps to prepare the entire team for dealing with International Space Station (ISS) elements.
2005-06-07
JSC2005-E-21192 (7 June 2005) --- Astronauts Christopher J. Ferguson (left), STS-115 pilot, and Daniel C. Burbank, mission specialist, use the virtual reality lab at the Johnson Space Center to train for their duties aboard the space shuttle. This type of computer interface, paired with virtual reality training hardware and software, helps to prepare the entire team for dealing with space station elements.
Laboratories for Teaching of Mathematical Subjects
ERIC Educational Resources Information Center
Berežný, Štefan
2017-01-01
We have adapted our two laboratories at our department based on our research results, which were presented at the conference CADGME 2014 in Halle and published in the journal. In this article we describe the hardware and software structure of the Laboratory 1: LabIT4KT-1: Laboratory of Computer Modelling and the Laboratory 2: LabIT4KT-2:…
Providing Guidance in Virtual Lab Experimentation: The Case of an Experiment Design Tool
ERIC Educational Resources Information Center
Efstathiou, Charalampos; Hovardas, Tasos; Xenofontos, Nikoletta A.; Zacharia, Zacharias C.; deJong, Ton; Anjewierden, Anjo; van Riesen, Siswa A. N.
2018-01-01
The present study employed a quasi-experimental design to assess a computer-based tool, which was intended to scaffold the task of designing experiments when using a virtual lab for the process of experimentation. In particular, we assessed the impact of this tool on primary school students' cognitive processes and inquiry skills before and after…
Problem-Based Labs and Group Projects in an Introductory University Physics Course
ERIC Educational Resources Information Center
Kohnle, Antje; Brown, C. Tom A.; Rae, Cameron F.; Sinclair, Bruce D.
2012-01-01
This article describes problem-based labs and analytical and computational project work we have been running at the University of St Andrews in an introductory physics course since 2008/2009. We have found the choice of topics, scaffolding of the process, timing in the year and facilitator guidance decisive for the success of these activities.…
Development of the STPX Spheromak System
NASA Astrophysics Data System (ADS)
Williams, R. L.; Clark, J.; Weatherford, C. A.
2015-11-01
The progress made in starting up the STPX Spheromak system, which is now installed at the Florida A&M University, is reviewed. Experimental, computational and theoretical activities are underway. The control system for firing the magnetized coaxial plasma gun and for collecting data from the diagnostic probes, based on LabView, is being tested and adapted. Preliminary results of testing the installed magnetic field probes, Langmuir triple probes, cylindrical ion probes, and optical diagnostics will be discussed. Progress in modeling this spheromak using simulation codes, such as NIMROD, will be discussed. Progress in investigating the use of algebraic topology to describe this spheromak will be reported.
Web-based system for surgical planning and simulation
NASA Astrophysics Data System (ADS)
Eldeib, Ayman M.; Ahmed, Mohamed N.; Farag, Aly A.; Sites, C. B.
1998-10-01
The growing scientific knowledge and rapid progress in medical imaging techniques have led to an increasing demand for better and more efficient methods of remote access to high-performance computer facilities. This paper introduces a web-based telemedicine project that provides interactive tools for surgical simulation and planning. The presented approach makes use of a client-server architecture based on new Internet technology, where clients use an ordinary web browser to view, send, receive, and manipulate patients' medical records, while the server uses the supercomputer facility to generate online semi-automatic segmentation, 3D visualization, surgical simulation/planning, and navigation of neuroendoscopic procedures. The supercomputer (SGI ONYX 1000) is located at the Computer Vision and Image Processing Lab, University of Louisville, Kentucky. This system is under development in cooperation with the Department of Neurological Surgery, Alliant Health Systems, Louisville, Kentucky. The server is connected via a network to the Picture Archiving and Communication System at Alliant Health Systems through a DICOM standard interface that enables authorized clients to access patients' images from different medical modalities.
NASA Astrophysics Data System (ADS)
Fuentes-Cabrera, Miguel; Anderson, John D.; Wilmoth, Jared; Ginovart, Marta; Prats, Clara; Portell-Canal, Xavier; Retterer, Scott
Microbial interactions are critical for governing community behavior and structure in natural environments. Examination of microbial interactions in the lab involves growth under ideal conditions in batch culture; conditions that occur in nature are, however, characterized by disequilibrium. Of particular interest is the role that system variables play in shaping cell-to-cell interactions and organization at ultrafine spatial scales. We seek to use experiments and agent-based modeling to help discover mechanisms relevant to microbial dynamics and interactions in the environment. Currently, we are using an agent-based model to simulate the microbial growth, dynamics, and interactions that occur on a microwell-array device developed in our lab. Bacterial cells growing in the microwells of this platform can be studied with high-throughput, high-content image analyses using brightfield and fluorescence microscopy. The agent-based model is written in the NetLogo language, which in turn is "plugged into" a computational framework that allows submitting many calculations in parallel for different initial parameters; visualizing the outcomes in an interactive phase-like diagram; and searching, with a genetic algorithm, for the parameters that lead to the optimal simulation outcome.
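A toy version of such an agent-based growth model can be sketched in a few lines. This is an illustrative Python analogue (the authors' model is written in NetLogo), with all parameters invented:

```python
import random

def simulate_well(capacity=200, p_divide=0.3, steps=40, seed_cells=2, rng=None):
    """Toy agent-based sketch of microbial growth in one microwell.
    Each step, every cell divides with probability p_divide, scaled
    down as the well approaches its carrying capacity."""
    rng = rng or random.Random(42)  # fixed seed for reproducibility
    n = seed_cells
    history = [n]
    for _ in range(steps):
        # Each agent draws independently; crowding suppresses division
        births = sum(1 for _ in range(n)
                     if rng.random() < p_divide * (1 - n / capacity))
        n += births
        history.append(n)
    return history

history = simulate_well()
```

Sweeping `p_divide` and `capacity` across many runs mimics the parallel parameter-exploration workflow the abstract describes.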
Hosny, Somaya; Mishriky, Adel M; Youssef, Mirella
2008-01-01
The Faculty of Medicine, Suez Canal University clinical skills lab was established in 1981 as the first skills lab in Egypt to cope with innovation in medical education adopted since school inauguration in 1978. Students are trained using their peers or models. Training is done weekly, guided by checklists tested for validity and reliability and updated regularly. Students receive immediate feedback on their performance. Recently, the number of students has increased, leading to challenges in providing adequate supervision and training experiences. A project to design and implement a computer-assisted training (CAT) system seemed to be a plausible solution. To assess the quality of a newly developed CAT product, faculty and students' satisfaction with it, and its impact on the learning process. The project involved preparation of multimedia video-films with a web interface for links of different scientific materials. The project was implemented on second year students. A quality check was done to assess the product's scientific content, and technical quality using questionnaires filled by 84 faculty members (139 filled forms) and 175 students (924 filled forms). For assessment of impact, results of examinations after project implementation were compared with results of 2nd year students of previous 3 years. More faculty (96.3%) were satisfied with the product and considered its quality good to excellent, compared to 93.9% of students, p < 0.001. Most faculty (76.2%) have agreed on its suitability for self-learning, while most students considered the product would be suitable after modification. The percentage of students' failures was lower after project implementation, compared to previous 3 years, p < 0.05. CAT materials developed for training of second year students in skills lab proved to be of good scientific content and quality, and suitable for self-learning. Their use was associated with lower failure rates among students. 
A randomized trial is recommended to ascertain the effectiveness of its application.
Flexible HVAC System for Lab or Classroom.
ERIC Educational Resources Information Center
Friedan, Jonathan
2001-01-01
Discusses an effort to design a heating, ventilation, and air conditioning system flexible enough to accommodate an easy conversion of classrooms to laboratories and dry labs to wet labs. The design's energy efficiency and operations and maintenance are examined. (GR)
A Paperless Lab Manual - Lessons Learned
NASA Astrophysics Data System (ADS)
Hatten, Daniel L.; Hatten, Maggie W.
1999-10-01
Every freshman entering Rose-Hulman Institute of Technology is equipped with a laptop computer and a software package that allow classroom and laboratory instructors the freedom to make computer-based assignments, publish course materials in electronic form, etc. All introductory physics laboratories and many of our classrooms are networked, and students routinely take their laptop computers to class/lab. The introductory physics laboratory manual was converted to HTML in the summer of 1997 and was made available to students over the Internet, rather than as a printed manual, during the 1998-99 school year. The aim was to reduce paper costs and allow timely updates of the laboratory experiments. A poll conducted at the end of the school year showed a generally positive student response to the online laboratory manual, with some reservations.
Galaxy CloudMan: delivering cloud compute clusters.
Afgan, Enis; Baker, Dannon; Coraor, Nate; Chapman, Brad; Nekrutenko, Anton; Taylor, James
2010-12-21
Widespread adoption of high-throughput sequencing has greatly increased the scale and sophistication of computational infrastructure needed to perform genomic research. An alternative to building and maintaining local infrastructure is "cloud computing", which, in principle, offers on demand access to flexible computational infrastructure. However, cloud computing resources are not yet suitable for immediate "as is" use by experimental biologists. We present a cloud resource management system that makes it possible for individual researchers to compose and control an arbitrarily sized compute cluster on Amazon's EC2 cloud infrastructure without any informatics requirements. Within this system, an entire suite of biological tools packaged by the NERC Bio-Linux team (http://nebc.nerc.ac.uk/tools/bio-linux) is available for immediate consumption. The provided solution makes it possible, using only a web browser, to create a completely configured compute cluster ready to perform analysis in less than five minutes. Moreover, we provide an automated method for building custom deployments of cloud resources. This approach promotes reproducibility of results and, if desired, allows individuals and labs to add or customize an otherwise available cloud system to better meet their needs. The expected knowledge and associated effort with deploying a compute cluster in the Amazon EC2 cloud is not trivial. The solution presented in this paper eliminates these barriers, making it possible for researchers to deploy exactly the amount of computing power they need, combined with a wealth of existing analysis software, to handle the ongoing data deluge.
ERIC Educational Resources Information Center
Journal of Chemical Education, 1988
1988-01-01
Reviews three computer software packages for chemistry education including "Osmosis and Diffusion" and "E.M.E. Titration Lab" for Apple II and "Simplex-V: An Interactive Computer Program for Experimental Optimization" for IBM PC. Summary ratings include ease of use, content, pedagogic value, student reaction, and…
ERIC Educational Resources Information Center
Collins, Michael J.; Vitz, Ed
1988-01-01
Examines two computer interfaced lab experiments: 1) discusses the automation of a Perkin Elmer 337 infrared spectrophotometer noting the mechanical and electronic changes needed; 2) uses the Gouy method and Lotus Measure software to automate magnetic susceptibility determinations. Methodology is described. (MVL)
EnviroLand: A Simple Computer Program for Quantitative Stream Assessment.
ERIC Educational Resources Information Center
Dunnivant, Frank; Danowski, Dan; Timmens-Haroldson, Alice; Newman, Meredith
2002-01-01
Introduces the EnviroLand computer program, which features lab simulations of theoretical calculations for quantitative analysis and environmental chemistry, and fate and transport models. Uses the program to demonstrate the nature of linear and nonlinear equations. (Author/YDS)
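The linear-equation side of such quantitative analysis amounts to least-squares calibration; a small pure-Python sketch using hypothetical calibration data (not taken from the program itself):

```python
def linear_fit(xs, ys):
    """Ordinary least-squares fit y = m*x + b, the kind of linear
    relationship (e.g., a Beer's-law calibration curve) used in
    quantitative analysis."""
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    m = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - m * sx) / n
    return m, b

# Hypothetical calibration standards: concentration (mg/L) vs. absorbance
conc = [0.0, 2.0, 4.0, 6.0, 8.0]
absorb = [0.01, 0.21, 0.40, 0.62, 0.80]
m, b = linear_fit(conc, absorb)

# Invert the fitted line to estimate an unknown's concentration
unknown = (0.50 - b) / m
```

Inverting the fitted line, as in the last step, is how a measured absorbance is turned back into a concentration.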
EarthLabs - Investigating Hurricanes: Earth's Meteorological Monsters
NASA Astrophysics Data System (ADS)
McDaris, J. R.; Dahlman, L.; Barstow, D.
2007-12-01
Earth science is one of the most important tools that the global community needs to address the pressing environmental, social, and economic issues of our time. While at times considered a second-rate science at the high school level, it is currently undergoing a major revolution in the depth of content and pedagogical vitality. As part of this revolution, labs in Earth science courses need to shift their focus from cookbook-like activities with known outcomes to open-ended investigations that challenge students to think, explore, and apply their learning. We need to establish a new model for Earth science as a rigorous lab science in policy, perception, and reality. As a concerted response to this need, five states, a coalition of scientists and educators, and an experienced curriculum team are creating a national model for a lab-based high school Earth science course named EarthLabs. This lab course will comply with the National Science Education Standards as well as the states' curriculum frameworks. The content will focus on Earth system science and environmental literacy. The lab experiences will feature a combination of field work, classroom experiments, and computer access to data and visualizations, and demonstrate the rigor and depth of a true lab course. The effort is being funded by NOAA's Environmental Literacy program. One of the prototype units of the course is Investigating Hurricanes. Hurricanes are phenomena that have tremendous impact on humanity and the resources we use. They are also the result of complex interacting Earth systems, making them perfect objects for rigorous investigation of many concepts commonly covered in Earth science courses, such as meteorology, climate, and global wind circulation. Students are able to use the same data sets, analysis tools, and research techniques that scientists employ in their research, yielding truly authentic learning opportunities.
This month-long integrated unit uses hurricanes as the story line by which students investigate the different interactions involved in hurricane generation, steering, and intensification. Students analyze a variety of visualization resources looking for patterns in occurrence and to develop an understanding of hurricane structure. They download archived data about past hurricanes and produce temporal and spatial plots to discover patterns in hurricane life cycles. They investigate the relationship between hurricane wind speed and factors such as barometric pressure and sea surface temperature by conducting spreadsheet analyses on archived data. They also conduct hands-on laboratory experiments in order to understand the physical processes that underpin energy transfer in convection, condensation, and latent heat. These activities highlight Earth science as a vital, rich, invigorating course, employing state-of-the-art technologies and in-depth labs with high relevance for our daily lives and the future.
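The wind-speed-versus-pressure analysis students perform on archived data can be sketched as a one-parameter least-squares fit; the model form and the data below are illustrative assumptions, not part of the EarthLabs materials:

```python
import math

def fit_wind_pressure(pressures_mb, winds_kt, p_env=1013.0):
    """Fit the simple empirical model wind = k * sqrt(p_env - p_center),
    one relationship students can explore between maximum wind speed
    and central barometric pressure."""
    # No-intercept least squares: k = sum(w*x) / sum(x*x),
    # where x = sqrt(pressure deficit)
    xs = [math.sqrt(p_env - p) for p in pressures_mb]
    k = sum(w * x for w, x in zip(winds_kt, xs)) / sum(x * x for x in xs)
    return k

pressures = [1000, 980, 960, 940, 920]   # central pressure, mb (made up)
winds = [40, 65, 85, 100, 115]           # max sustained wind, kt (made up)
k = fit_wind_pressure(pressures, winds)

# Predict the wind for a storm with a 950 mb central pressure
predicted = k * math.sqrt(1013 - 950)
```

Plotting wind against the square root of the pressure deficit, as students might in a spreadsheet, makes this relationship appear approximately linear.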
UBioLab: a web-LABoratory for Ubiquitous in-silico experiments.
Bartocci, E; Di Berardini, M R; Merelli, E; Vito, L
2012-03-01
The huge and dynamic collection of bioinformatic resources (e.g., data and tools) available on the Internet today represents a big challenge for biologists, who must manage and visualize them, and for bioinformaticians, who need to rapidly create and execute in-silico experiments involving resources and activities spread across the WWW hyperspace. Any framework aiming to integrate such resources as in a physical laboratory must tackle, and ideally handle in a transparent and uniform way, aspects concerning physical distribution, semantic heterogeneity, and the co-existence of different computational paradigms and, as a consequence, of different invocation interfaces (i.e., OGSA for Grid nodes, SOAP for Web Services, Java RMI for Java objects, etc.). The UBioLab framework has been designed and developed as a prototype with this objective. Several architectural features, such as being fully Web-based and combining domain ontologies, Semantic Web, and workflow techniques, give evidence of an effort in this direction. The integration of a semantic knowledge management system for distributed (bioinformatic) resources, a semantic-driven graphic environment for defining and monitoring ubiquitous workflows, and an intelligent agent-based technology for their distributed execution allows UBioLab to act as a semantic guide for bioinformaticians and biologists, providing (i) a flexible environment for visualizing, organizing, and inferring any (semantic and computational) "type" of domain knowledge (e.g., resources and activities, expressed in declarative form), (ii) a powerful engine for defining and storing semantic-driven ubiquitous in-silico experiments on the domain hyperspace, and (iii) a transparent, automatic, and distributed environment for correct experiment execution.
Applications of a digital darkroom in the forensic laboratory
NASA Astrophysics Data System (ADS)
Bullard, Barry D.; Birge, Brian
1997-02-01
Through a joint agreement with the Indiana-Marion County Forensic Laboratory Services Agency, the Institute for Forensic Imaging conducted a pilot program to investigate crime lab applications of a digital darkroom. IFI installed and staffed a state-of-the-art digital darkroom in the photography laboratory of the Indianapolis-Marion County crime lab located at Indianapolis, Indiana. The darkroom consisted of several high resolution color digital cameras, image processing computer, dye sublimation continuous tone digital printers, and CD-ROM writer. This paper describes the use of the digital darkroom in several crime lab investigations conducted during the program.
Portable classroom leads to partnership.
Le Ber, Jeanne Marie; Lombardo, Nancy T; Weber, Alice; Bramble, John
2004-01-01
Library faculty participation on the School of Medicine Curriculum Steering Committee led to a unique opportunity to partner technology and teaching utilizing the library's portable wireless classroom. The pathology lab course master expressed a desire to revise the curriculum using patient cases and direct access to the Web and library resources. Since the pathology lab lacked computers, the library's portable wireless classroom provided a solution. Originally developed to provide maximum portability and flexibility, the wireless classroom consists of ten laptop computers configured with wireless cards and an access point. While the portable wireless classroom led to a partnership with the School of Medicine, there were additional benefits and positive consequences for the library.
Simulink-aided Design and Implementation of Sensorless BLDC Motor Digital Control System
NASA Astrophysics Data System (ADS)
Zhilenkov, A. A.; Tsvetkov, Y. N.; Chistov, V. B.; Nyrkov, A. P.; Sokolov, S. S.
2017-07-01
The paper describes the process of creating a digital control system for a brushless direct-current (BLDC) motor. The target motor has no speed sensor, so the back-EMF method is used for commutation control. The authors show how to model the control system in MatLab/Simulink and test it on board an STM32F4 microcontroller. This approach yields a flexible system that can be controlled from a personal computer over communication lines. Signals in the actuator circuit can be examined without any external measuring instruments (testers, oscilloscopes, etc.), with waveforms and measured signal values output directly on the host PC.
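The back-EMF commutation idea can be sketched in a few lines: in sensorless control, the controller monitors the un-driven (floating) phase and commutates a fixed delay (nominally 30 electrical degrees) after its back-EMF crosses zero. The Python sketch below is illustrative only; the paper's actual implementation is Simulink-generated code on an STM32F4, and the function name and threshold here are our own assumptions.

```python
import math

def detect_zero_crossing(bemf_samples, threshold=0.0):
    """Return sample indices where the floating-phase back-EMF crosses
    the threshold; each crossing (plus a 30-degree delay) marks the next
    commutation instant in sensorless BLDC control."""
    crossings = []
    for i in range(1, len(bemf_samples)):
        prev, curr = bemf_samples[i - 1], bemf_samples[i]
        if (prev - threshold) * (curr - threshold) < 0:  # sign change
            crossings.append(i)
    return crossings

# Synthetic floating-phase back-EMF: one electrical period, phase-shifted
# so the waveform does not start exactly at zero
samples = [math.sin(2 * math.pi * k / 100 + 0.3) for k in range(100)]
print(detect_zero_crossing(samples))  # two crossings per electrical period
```

On real hardware the same comparison is typically done by an ADC interrupt or an analog comparator rather than a Python loop; the logic is the same.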
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-03
... anchors, both as centers for digital literacy and as hubs for access to public computers. While their... expansion of computer labs, and facilitated deployment of new educational applications that would not have... computer fees to help defray the cost of computers or training fees to help cover the cost of training...
A low-cost spectrometer for NMR measurements in the Earth's magnetic field
NASA Astrophysics Data System (ADS)
Michal, Carl A.
2010-10-01
We describe and demonstrate an inexpensive, easy-to-build, portable spectrometer for nuclear magnetic resonance measurements in the Earth's magnetic field. The spectrometer is based upon a widely available inexpensive microcontroller, which acts as a pulse programmer, audio-frequency synthesizer and digitizer, replacing what are typically the most expensive specialized components of the system. The microcontroller provides the capability to execute arbitrarily long and complicated sequences of phase-coherent, phase-modulated excitation pulses and acquire data sets of unlimited duration. Suitably packaged, the spectrometer is amenable to measurements in the research lab, in the field or in the teaching lab. The choice of components was heavily weighted by cost and availability, but required no significant sacrifice in performance. Using an existing personal computer, the resulting design can be assembled for as little as US $200. The spectrometer performance is demonstrated with spin-echo and Carr-Purcell-Meiboom-Gill pulse sequences on a water sample.
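The pulse sequences mentioned have a simple timing structure that any pulse programmer must realize. As a hedged sketch (this is a schematic event list, not the spectrometer's firmware; the labels and units are our own), a Carr-Purcell-Meiboom-Gill (CPMG) sequence can be generated as:

```python
def cpmg_sequence(tau, n_echoes):
    """Build a CPMG event list: a 90-degree excitation pulse at t = 0,
    then n_echoes 180-degree refocusing pulses at tau, 3*tau, 5*tau, ...
    Spin echoes form midway between refocusing pulses, at 2*tau, 4*tau, ...
    Times are in the same unit as tau (e.g. milliseconds)."""
    events = [(0.0, "90x")]
    for k in range(n_echoes):
        events.append(((2 * k + 1) * tau, "180y"))  # refocusing pulse
        events.append(((2 * k + 2) * tau, "echo"))  # echo maximum
    return events

for t, label in cpmg_sequence(tau=5.0, n_echoes=3):
    print(f"{t:6.1f} ms  {label}")
```

A spin-echo experiment is simply the `n_echoes=1` case; the microcontroller's job is to emit phase-coherent pulses at these instants and digitize between them.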
Software Solution Saves Dollars
ERIC Educational Resources Information Center
Trotter, Andrew
2004-01-01
This article discusses computer software that can give classrooms and computer labs the capabilities of costly PC's at a small fraction of the cost. A growing number of cost-conscious school districts are finding budget relief in low-cost computer software known as "open source" that can do everything from manage school Web sites to equip…
Cane Toad or Computer Mouse? Real and Computer-Simulated Laboratory Exercises in Physiology Classes
ERIC Educational Resources Information Center
West, Jan; Veenstra, Anneke
2012-01-01
Traditional practical classes in many countries are being rationalised to reduce costs. The challenge for university educators is to provide students with the opportunity to reinforce theoretical concepts by running something other than a traditional practical program. One alternative is to replace wet labs with comparable computer simulations.…
The NASA Augmented/Virtual Reality Lab: The State of the Art at KSC
NASA Technical Reports Server (NTRS)
Little, William
2017-01-01
The NASA Augmented Virtual Reality (AVR) Lab at Kennedy Space Center is dedicated to the investigation of Augmented Reality (AR) and Virtual Reality (VR) technologies, with the goal of determining potential uses of these technologies as human-computer interaction (HCI) devices in an aerospace engineering context. Begun in 2012, the AVR Lab has concentrated on commercially available AR and VR devices that are gaining in popularity and use in a number of fields such as gaming, training, and telepresence. We are working with such devices as the Microsoft Kinect, the Oculus Rift, the Leap Motion, the HTC Vive, motion capture systems, and the Microsoft Hololens. The focus of our work has been on human interaction with the virtual environment, which in turn acts as a communications bridge to remote physical devices and environments which the operator cannot or should not control or experience directly. Particularly in reference to dealing with spacecraft and the oftentimes hazardous environments they inhabit, it is our hope that AR and VR technologies can be utilized to increase human safety and mission success by physically removing humans from those hazardous environments while virtually putting them right in the middle of those environments.
Vashpanov, Yuriy; Choo, Hyunseung; Kim, Dongsoo Stephen
2011-01-01
This paper proposes an adsorption sensitivity control method that uses a wireless network and illumination light intensity in a photo-electromagnetic field (EMF)-based gas sensor for measurements in real time of a wide range of ammonia concentrations. The minimum measurement error for a range of ammonia concentration from 3 to 800 ppm occurs when the gas concentration magnitude corresponds with the optimal intensity of the illumination light. A simulation with LabView-engineered modules for automatic control of a new intelligent computer system was conducted to improve measurement precision over a wide range of gas concentrations. This gas sensor computer system with wireless network technology could be useful in the chemical industry for automatic detection and measurement of hazardous ammonia gas levels in real time. PMID:22346680
Computer Assisted REhabilitation (CARE) Lab: A novel approach towards Pediatric Rehabilitation 2.0.
Olivieri, Ivana; Meriggi, Paolo; Fedeli, Cristina; Brazzoli, Elena; Castagna, Anna; Roidi, Marina Luisa Rodocanachi; Angelini, Lucia
2018-01-01
Pediatric Rehabilitation therapists have always worked using a variety of off-the-shelf or custom-made objects and devices, more recently including computer based systems. These Information and Communication Technology (ICT) solutions vary widely in complexity, from easy-to-use interactive videogame consoles originally intended for entertainment purposes to sophisticated systems specifically developed for rehabilitation. This paper describes the principles underlying an innovative "Pediatric Rehabilitation 2.0" approach, based on the combination of suitable ICT solutions and traditional rehabilitation, which has been progressively refined while building up and using a computer-assisted rehabilitation laboratory. These principles are thus summarized in the acronym EPIQ, to account for the terms Ecological, Personalized, Interactive and Quantitative. The paper also presents the laboratory, which has been designed to meet the children's rehabilitation needs and to empower therapists in their work. The laboratory is equipped with commercial hardware and specially developed software called VITAMIN: a virtual reality platform for motor and cognitive rehabilitation.
Inexpensive Data Acquisition with a Sound Card
NASA Astrophysics Data System (ADS)
Hassan, Umer; Pervaiz, Saad; Anwar, Muhammad Sabieh
2011-12-01
Signal generators, oscilloscopes, and data acquisition (DAQ) systems are standard components of the modern experimental physics laboratory. The sound card, a built-in component in the ubiquitous personal computer, can be utilized for all three of these tasks1,2 and offers an attractive option for labs in developing countries such as ours—Pakistan—where affordability is always of prime concern. In this paper, we describe in a recipe fashion how the sound card is used for DAQ and signal generation.
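To illustrate the two roles the sound card plays, signal generation and acquisition, here is a minimal software sketch. It performs no actual audio I/O (in practice the buffer would be written to and read from the card through an audio API); the sample rate and function names are our own illustrative choices.

```python
import math

SAMPLE_RATE = 44100  # Hz, a standard sound-card rate

def sine_buffer(freq_hz, duration_s, amplitude=1.0):
    """Synthesize one channel of samples, as would be sent to the sound
    card's output when using it as a signal generator."""
    n = int(SAMPLE_RATE * duration_s)
    return [amplitude * math.sin(2 * math.pi * freq_hz * k / SAMPLE_RATE)
            for k in range(n)]

def rms(samples):
    """Root-mean-square level, a basic DAQ-side measurement."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

buf = sine_buffer(440.0, 0.1)   # 100 ms of a 440 Hz tone (44 full cycles)
print(round(rms(buf), 3))       # ~0.707: a full-scale sine has RMS 1/sqrt(2)
```

The same buffer-in, numbers-out pattern underlies oscilloscope-style display and spectral analysis of the card's input stream.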
2010-04-01
than 0.6 metric tons. They have landed at low elevation sites (below 1 km Mars Orbiter Laser Altimeter (MOLA)). All accepted a relatively large...Martian atmosphere, and small scale height of obstacles on the ground limit accessible landing sites to those below -1.0 km MOLA. So far the southern...landing to date is MER-Opportunity at Meridiani Planum (-1 km MOLA). Mars Science Lab (MSL) is attempting to develop an EDL system capable of delivering
2002-04-01
KENNEDY SPACE CENTER, FLA. -- At Launch Pad 39B, Space Shuttle Atlantis' payload bay doors are ready to be closed. The Shuttle payload includes the S0 Integrated Truss Structure (ITS), the Canadian Mobile Transporter, power distribution system modules, a heat pipe radiator for cooling, computers and a pair of rate gyroscopes. The mission is the 13th assembly flight to the ISS and includes four spacewalks to attach the S0 truss to the U.S. Lab Destiny. Launch is scheduled for April 4.
Mechanotransduction and the Cytoskeleton
NASA Technical Reports Server (NTRS)
Pickard, Barbara G.
1996-01-01
Two intellectual developments in the lab suggested a more powerful approach to the problem defined in the original statement. These were discussed with the Program Chief, who agreed that an alteration of approach was desirable. A second expansion of our work was on the computational optical sectioning microscope (COSM). Additionally, we hoped to visualize the channels with respect to cytoskeletal entities. We have identified a major heretofore unknown cytoskeletal structure in our representative experimental system, the onion epidermal cell, and have named this structure the endomembrane grant.
A high-resolution optical measurement system for rapid acquisition of radiation flux density maps
NASA Astrophysics Data System (ADS)
Thelen, Martin; Raeder, Christian; Willsch, Christian; Dibowski, Gerd
2017-06-01
To identify the power and flux density of concentrated solar radiation, the Institute of Solar Research at the German Aerospace Center (DLR - Deutsches Zentrum für Luft- und Raumfahrt e. V.) has used the camera-based measurement system FATMES (Flux and Temperature Measurement System) since 1995. Its low resolution, difficult handling and poor computing power required a revision of the existing measurement system. The new measurement system FMAS (Flux Mapping Acquisition System) is equipped with state-of-the-art hardware, is compatible with off-the-shelf computers and is programmed in LabView. The time required for an image evaluation is reduced by a factor of 60 compared to FATMES. FMAS is no longer tied to the Solar Furnace and High Flux Solar Simulator facilities at the DLR in Cologne but is also applicable as a mobile system. The data and the algorithms are transparent throughout the complete process. The measurement accuracy of FMAS has been determined to be within ±3 % so far. According to the comparison tests conducted, the measurement error of FATMES is at least 2 % higher.
STS-111 Training in VR lab with Expedition IV and V Crewmembers
2001-10-18
JSC2001-E-39090 (18 October 2001) --- Cosmonaut Valeri G. Korzun, Expedition Five mission commander representing Rosaviakosmos, uses the virtual reality lab at the Johnson Space Center (JSC) to train for his duties on the International Space Station (ISS). This type of computer interface, paired with virtual reality training hardware and software, helps prepare the entire team for dealing with ISS elements.
Ding Dong, You've Got Mail! A Lab Activity for Teaching the Internet of Things
ERIC Educational Resources Information Center
Frydenberg, Mark
2017-01-01
Connecting ordinary devices to the Internet is a defining characteristic of the Internet of Things. In this hands-on lab activity, students will connect a wireless doorbell to the Internet using a Raspberry Pi computer. By modifying and running a program on the Raspberry Pi to send an email or text message notifying a recipient that someone is at…
Hadlich, Marcelo Souza; Oliveira, Gláucia Maria Moraes; Feijóo, Raúl A; Azevedo, Clerio F; Tura, Bernardo Rangel; Ziemer, Paulo Gustavo Portela; Blanco, Pablo Javier; Pina, Gustavo; Meira, Márcio; Souza e Silva, Nelson Albuquerque de
2012-10-01
The standardization of images used in medicine was accomplished in 1993 with the DICOM (Digital Imaging and Communications in Medicine) standard. Many examinations use this standard, and it is increasingly necessary to design software applications capable of handling this type of image; however, such software applications are usually neither free nor open-source, which hinders their adaptation to diverse interests. Our objective was to develop and validate a free, open-source software application capable of handling DICOM coronary computed tomography angiography images. We developed and tested the ImageLab software in the evaluation of 100 tests randomly selected from a database. We carried out 600 evaluations divided between two observers using ImageLab and another software application sold with Philips Brilliance computed tomography scanners, assessing coronary lesions and plaques around the left main coronary artery (LMCA) and the anterior descending artery (ADA). To evaluate intraobserver, interobserver and intersoftware agreement, we used simple agreement and kappa statistics. The agreement observed between the software applications was generally classified as substantial or almost perfect in most comparisons. ImageLab agreed with the Philips software in the evaluation of coronary computed tomography angiography tests, especially in patients without lesions, with lesions < 50% in the LMCA and < 70% in the ADA. The agreement for lesions > 70% in the ADA was lower, but this is also observed when the anatomical reference standard is used.
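The kappa statistic used in such agreement studies corrects raw (simple) agreement for the agreement expected by chance; on the common Landis-Koch scale, 0.61-0.80 is "substantial" and 0.81-1.00 "almost perfect". A minimal sketch of Cohen's kappa for two raters, with invented labels rather than the study's data:

```python
def cohens_kappa(a, b):
    """Cohen's kappa for two raters' categorical labels:
    (observed agreement - chance agreement) / (1 - chance agreement)."""
    assert len(a) == len(b)
    n = len(a)
    categories = set(a) | set(b)
    observed = sum(x == y for x, y in zip(a, b)) / n
    # Chance agreement from each rater's marginal label frequencies
    expected = sum((a.count(c) / n) * (b.count(c) / n) for c in categories)
    return (observed - expected) / (1 - expected)

# Two observers grading 10 scans as lesion (1) / no lesion (0)
obs1 = [1, 1, 0, 0, 1, 0, 0, 1, 0, 0]
obs2 = [1, 1, 0, 0, 1, 0, 0, 0, 0, 1]
print(round(cohens_kappa(obs1, obs2), 2))  # 0.58, "moderate" agreement
```

Note that kappa is undefined when chance agreement is 1 (both raters always use one category); real implementations guard against that case.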
Galaxy CloudMan: delivering cloud compute clusters
2010-01-01
Background Widespread adoption of high-throughput sequencing has greatly increased the scale and sophistication of computational infrastructure needed to perform genomic research. An alternative to building and maintaining local infrastructure is “cloud computing”, which, in principle, offers on demand access to flexible computational infrastructure. However, cloud computing resources are not yet suitable for immediate “as is” use by experimental biologists. Results We present a cloud resource management system that makes it possible for individual researchers to compose and control an arbitrarily sized compute cluster on Amazon’s EC2 cloud infrastructure without any informatics requirements. Within this system, an entire suite of biological tools packaged by the NERC Bio-Linux team (http://nebc.nerc.ac.uk/tools/bio-linux) is available for immediate consumption. The provided solution makes it possible, using only a web browser, to create a completely configured compute cluster ready to perform analysis in less than five minutes. Moreover, we provide an automated method for building custom deployments of cloud resources. This approach promotes reproducibility of results and, if desired, allows individuals and labs to add or customize an otherwise available cloud system to better meet their needs. Conclusions The expected knowledge and associated effort with deploying a compute cluster in the Amazon EC2 cloud is not trivial. The solution presented in this paper eliminates these barriers, making it possible for researchers to deploy exactly the amount of computing power they need, combined with a wealth of existing analysis software, to handle the ongoing data deluge. PMID:21210983
Advanced Physics Labs and Undergraduate Research: Helping Them Work Together
NASA Astrophysics Data System (ADS)
Peterson, Richard W.
2009-10-01
The 2009 Advanced Lab Topical Conference in Ann Arbor affirmed the importance of advanced labs that teach crucial skills and methodologies by carefully conducting a time-honored experiment. Others, however, argued that such a constrained experiment can play a complementary role to more open-ended project experiences. A genuine ``experiment,'' where neither student nor faculty member is exactly sure of the best approach or anticipated result, can often trigger real excitement, creativity, and career direction for students while reinforcing the interface between the advanced lab and undergraduate research. Several examples are cited in the areas of AMO physics, optics, fluids, and acoustics. Colleges and universities that have dual-degree engineering, engineering physics, or applied physics programs may especially profit from interdisciplinary projects that utilize optical, electromagnetic, and acoustical measurements in conjunction with computational physics and simulation.
Image-guided thoracic surgery in the hybrid operation room.
Ujiie, Hideki; Effat, Andrew; Yasufuku, Kazuhiro
2017-01-01
There has been an increase in the use of image-guided technology to facilitate minimally invasive therapy. The next generation of minimally invasive therapy is focused on the advancement and translation of novel image-guided technologies in therapeutic interventions, including surgery, interventional pulmonology, radiation therapy, and interventional laser therapy. To establish the efficacy of different minimally invasive therapies, we have developed a hybrid operating room, known as the guided therapeutics operating room (GTx OR), at the Toronto General Hospital. The GTx OR is equipped with multi-modality image-guidance systems, featuring a dual-source, dual-energy computed tomography (CT) scanner, a robotic cone-beam CT (CBCT)/fluoroscopy system, a high-performance endobronchial ultrasound system, an endoscopic surgery system, a near-infrared (NIR) fluorescence imaging system, and navigation tracking systems. These multimodality image-guidance systems allow physicians to quickly and accurately image patients while they are on the operating table. This yields improved outcomes, since physicians are able to use image guidance during their procedures and carry out innovative multi-modality therapeutics. Multiple preclinical translational studies on innovative minimally invasive technology are being developed in our guided therapeutics laboratory (GTx Lab). The GTx Lab is equipped with technology and multimodality image-guidance systems similar to those of the GTx OR and acts as a platform for translating research into human clinical trials. Through the GTx Lab, we are able to perform basic research, such as the development of image-guided technologies, preclinical model testing, and preclinical imaging, and then translate that research into the GTx OR.
This OR allows for the utilization of new technologies in cancer therapy, including molecular imaging and other innovative imaging modalities, and therefore enables a better quality of life for patients, both during and after the procedure. In this article, we describe the capabilities of the GTx systems and discuss the first-in-human technologies used and evaluated in the GTx OR.
[Design of visualized medical images network and web platform based on MeVisLab].
Xiang, Jun; Ye, Qing; Yuan, Xun
2017-04-01
With the development trend of "Internet +", further requirements for the mobility of medical images have arisen in the medical field. In view of this demand, this paper presents a web-based visual medical imaging platform. First, the feasibility and technical points of web-based medical imaging are analyzed. CT (Computed Tomography) or MRI (Magnetic Resonance Imaging) images are reconstructed three-dimensionally with MeVisLab and packaged as X3D (Extensible 3D Graphics) files, as shown in the present paper. Then, a B/S (Browser/Server) system specially designed for 3D images is built using HTML5 and the WebGL rendering engine library, and the X3D image file is parsed and rendered by the system. The results of this study showed that the platform is suitable for multiple operating systems, realizing cross-platform and mobile access to medical image data. Future development of the medical imaging platform is also discussed: web application technology will not only promote the sharing of medical image data, but also facilitate image-based remote medical consultations and distance learning.
The Software Element of the NASA Portable Electronic Device Radiated Emissions Investigation
NASA Technical Reports Server (NTRS)
Koppen, Sandra V.; Williams, Reuben A. (Technical Monitor)
2002-01-01
NASA Langley Research Center's (LaRC) High Intensity Radiated Fields Laboratory (HIRF Lab) recently conducted a series of electromagnetic radiated emissions tests under a cooperative agreement with Delta Airlines and an interagency agreement with the FAA. The frequency spectrum environment at a commercial airport was measured on location. The environment survey provides a comprehensive picture of the complex nature of the electromagnetic environment present in those areas outside the aircraft. In addition, radiated emissions tests were conducted on portable electronic devices (PEDs) that may be brought onboard aircraft. These tests were performed in both semi-anechoic and reverberation chambers located in the HIRF Lab. The PEDs included cell phones, laptop computers, electronic toys, and family radio systems. The data generated during the tests are intended to support the research on the effect of radiated emissions from wireless devices on aircraft systems. Both test systems relied on customized control and data reduction software to provide test and instrument control, data acquisition, a user interface, real time data reduction, and data analysis. The software executed on PCs running MS Windows 98 and 2000, and used Agilent Pro Visual Engineering Environment (VEE) development software, Common Object Model (COM) technology, and MS Excel.
cStress: Towards a Gold Standard for Continuous Stress Assessment in the Mobile Environment
Hovsepian, Karen; al’Absi, Mustafa; Ertin, Emre; Kamarck, Thomas; Nakajima, Motohiro; Kumar, Santosh
2015-01-01
Recent advances in mobile health have produced several new models for inferring stress from wearable sensors. But, the lack of a gold standard is a major hurdle in making clinical use of continuous stress measurements derived from wearable sensors. In this paper, we present a stress model (called cStress) that has been carefully developed with attention to every step of computational modeling including data collection, screening, cleaning, filtering, feature computation, normalization, and model training. More importantly, cStress was trained using data collected from a rigorous lab study with 21 participants and validated on two independently collected data sets — in a lab study on 26 participants and in a week-long field study with 20 participants. In testing, the model obtains a recall of 89% and a false positive rate of 5% on lab data. On field data, the model is able to predict each instantaneous self-report with an accuracy of 72%. PMID:26543926
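The two figures reported for the lab data, recall and false positive rate, come directly from confusion-matrix counts. A minimal sketch with made-up labels (not the study's data):

```python
def recall_and_fpr(y_true, y_pred):
    """Recall = TP / (TP + FN); false positive rate = FP / (FP + TN)."""
    pairs = list(zip(y_true, y_pred))
    tp = sum(1 for t, p in pairs if t == 1 and p == 1)  # true positives
    fn = sum(1 for t, p in pairs if t == 1 and p == 0)  # missed positives
    fp = sum(1 for t, p in pairs if t == 0 and p == 1)  # false alarms
    tn = sum(1 for t, p in pairs if t == 0 and p == 0)  # true negatives
    return tp / (tp + fn), fp / (fp + tn)

# 1 = stressed, 0 = not stressed (synthetic labels for illustration)
truth = [1, 1, 1, 0, 0, 0, 0, 0, 1, 0]
preds = [1, 1, 0, 0, 0, 1, 0, 0, 1, 0]
rec, fpr = recall_and_fpr(truth, preds)
print(rec, fpr)  # 0.75 and ~0.17
```

Reporting recall together with the false positive rate, rather than raw accuracy, matters here because stress episodes are rare relative to non-stress time.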
Attitude identification for SCOLE using two infrared cameras
NASA Technical Reports Server (NTRS)
Shenhar, Joram
1991-01-01
An algorithm is presented that incorporates real time data from two infrared cameras and computes the attitude parameters of the Spacecraft COntrol Lab Experiment (SCOLE), a lab apparatus representing an offset feed antenna attached to the Space Shuttle by a flexible mast. The algorithm uses camera position data of three miniature light emitting diodes (LEDs), mounted on the SCOLE platform, permitting arbitrary camera placement and an on-line attitude extraction. The continuous nature of the algorithm allows identification of the placement of the two cameras with respect to some initial position of the three reference LEDs, followed by on-line six degrees of freedom attitude tracking, regardless of the attitude time history. A description is provided of the algorithm in the camera identification mode as well as the mode of target tracking. Experimental data from a reduced size SCOLE-like lab model, reflecting the performance of the camera identification and the tracking processes, are presented. Computer code for camera placement identification and SCOLE attitude tracking is listed.
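The pose-from-markers computation at the heart of such an algorithm can be illustrated in two dimensions. This is a simplified planar analogue only (the SCOLE system recovers the full six-degree-of-freedom attitude from two camera views of three LEDs); the function name and point sets are our own assumptions.

```python
import math

def planar_pose(ref, obs):
    """Estimate the rotation angle (radians) and centroid shift that map
    reference marker coordinates onto observed ones (2-D Kabsch-style
    least-squares fit)."""
    # Centroids of both point sets
    cr = (sum(p[0] for p in ref) / len(ref), sum(p[1] for p in ref) / len(ref))
    co = (sum(p[0] for p in obs) / len(obs), sum(p[1] for p in obs) / len(obs))
    # Accumulate dot- and cross-products of the centered points; the
    # least-squares rotation angle is atan2 of their sums
    s_sin = s_cos = 0.0
    for (rx, ry), (ox, oy) in zip(ref, obs):
        rx, ry = rx - cr[0], ry - cr[1]
        ox, oy = ox - co[0], oy - co[1]
        s_cos += rx * ox + ry * oy
        s_sin += rx * oy - ry * ox
    theta = math.atan2(s_sin, s_cos)
    return theta, (co[0] - cr[0], co[1] - cr[1])

# Three reference markers, observed after a 30-degree rotation + shift
ref = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
th = math.radians(30)
obs = [(math.cos(th) * x - math.sin(th) * y + 2.0,
        math.sin(th) * x + math.cos(th) * y - 1.0) for x, y in ref]
print(planar_pose(ref, obs)[0])  # recovers ~0.5236 rad (30 degrees)
```

In 3-D the same centered-point fit is solved with an SVD (the Kabsch algorithm), and the two-camera geometry first triangulates each LED's 3-D position.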
Biotic games and cloud experimentation as novel media for biophysics education
NASA Astrophysics Data System (ADS)
Riedel-Kruse, Ingmar; Blikstein, Paulo
2014-03-01
First-hand, open-ended experimentation is key for effective formal and informal biophysics education. We developed, tested and assessed multiple new platforms that enable students and children to directly interact with and learn about microscopic biophysical processes: (1) Biotic games that enable local and online play using galvano- and photo-tactic stimulation of micro-swimmers, illustrating concepts such as biased random walks, Low Reynolds number hydrodynamics, and Brownian motion; (2) an undergraduate course where students learn optics, electronics, micro-fluidics, real time image analysis, and instrument control by building biotic games; and (3) a graduate class on the biophysics of multi-cellular systems that contains a cloud experimentation lab enabling students to execute open-ended chemotaxis experiments on slimemolds online, analyze their data, and build biophysical models. Our work aims to generate the equivalent excitement and educational impact for biophysics as robotics and video games have had for mechatronics and computer science, respectively. We also discuss how scaled-up cloud experimentation systems can support MOOCs with true lab components and life-science research in general.
An integrated CMOS high voltage supply for lab-on-a-chip systems.
Behnam, M; Kaigala, G V; Khorasani, M; Marshall, P; Backhouse, C J; Elliott, D G
2008-09-01
Electrophoresis is a mainstay of lab-on-a-chip (LOC) implementations of molecular biology procedures and is the basis of many medical diagnostics. High voltage (HV) power supplies are necessary in electrophoresis instruments and are a significant part of the overall system cost. This cost of instrumentation is a significant impediment to making LOC technologies more widely available. We believe one approach to overcoming this problem is to use microelectronic technology (complementary metal-oxide semiconductor, CMOS) to generate and control the HV. We present a CMOS-based chip (3 mm x 2.9 mm) that generates high voltages (hundreds of volts), switches HV outputs, and is powered by a 5 V input supply (total power of 28 mW) while being controlled using a standard computer serial interface. Microchip electrophoresis with laser induced fluorescence (LIF) detection is implemented using this HV CMOS chip. With the other advancements made in the LOC community (e.g. micro-fluidic and optical devices), these CMOS chips may ultimately enable 'true' LOC solutions where essentially all the microfluidics, photonics and electronics are on a single chip.
NASA Technical Reports Server (NTRS)
Scheper, C.; Baker, R.; Frank, G.; Yalamanchili, S.; Gray, G.
1992-01-01
Systems for Space Defense Initiative (SDI) space applications typically require both high performance and very high reliability. These requirements present the systems engineer evaluating such systems with the extremely difficult problem of conducting performance and reliability trade-offs over large design spaces. A controlled development process supported by appropriate automated tools must be used to assure that the system will meet design objectives. This report describes an investigation of methods, tools, and techniques necessary to support performance and reliability modeling for SDI systems development. Models of the JPL Hypercubes, the Encore Multimax, and the C.S. Draper Lab Fault-Tolerant Parallel Processor (FTPP) parallel-computing architectures using candidate SDI weapons-to-target assignment algorithms as workloads were built and analyzed as a means of identifying the necessary system models, how the models interact, and what experiments and analyses should be performed. As a result of this effort, weaknesses in the existing methods and tools were revealed and capabilities that will be required for both individual tools and an integrated toolset were identified.
Design of the intelligent smoke alarm system based on photoelectric smoke
NASA Astrophysics Data System (ADS)
Ma, Jiangfei; Yang, Xiufang; Wang, Peipei
2017-02-01
This paper presents an intelligent smoke alarm system based on a photoelectric smoke detector and a temperature sensor. The system uses an AT89C51 MCU as the core of the hardware control and LabVIEW as the host-computer monitoring center. The sensor system acquires temperature and smoke signals; the MCU controls the A/D converter, sampling and converting the output analog signals, and the two signals are then uploaded to the host computer through the serial port. To achieve real-time monitoring of smoke and temperature in the environment, the LabVIEW monitoring platform holds, processes, analyzes and displays these sampled signals. The intelligent smoke alarm system is suitable for large shopping malls and other public places and can greatly reduce the false alarm rate. The experimental results show that the system runs well and alarms when the set threshold is reached; the threshold parameters can be adjusted according to the actual conditions of the site. The system is easy to operate, simple in structure, intelligent, low cost, and of strong practical value.
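The threshold-based alarming described above is commonly implemented with hysteresis so that readings hovering near the limit do not make the alarm chatter. The sketch below illustrates that logic only; the thresholds and readings are invented for illustration, not taken from the paper.

```python
def alarm_state(readings, on_threshold, off_threshold):
    """Latching threshold comparison with hysteresis: the alarm turns on
    when a reading reaches on_threshold and releases only once readings
    drop below the lower off_threshold."""
    states, active = [], False
    for r in readings:
        if not active and r >= on_threshold:
            active = True          # trigger the alarm
        elif active and r < off_threshold:
            active = False         # release only well below the trigger
        states.append(active)
    return states

# Normalized smoke-density samples from the ADC (synthetic)
smoke = [0.10, 0.20, 0.55, 0.48, 0.30, 0.10]
print(alarm_state(smoke, on_threshold=0.5, off_threshold=0.35))
```

With a single threshold, the 0.48 sample would have toggled the alarm off and back on; the two-threshold band keeps it latched until the smoke genuinely clears.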
The Next Generation of Lab and Classroom Computing - The Silver Lining
2016-12-01
desktop infrastructure (VDI) solution, as well as the computing solutions at three universities, was selected as the basis for comparison. The research... infrastructure, VDI, hardware cost, software cost, manpower, availability, cloud computing, private cloud, bring your own device, BYOD, thin client...
High Performance, Dependable Multiprocessor
NASA Technical Reports Server (NTRS)
Ramos, Jeremy; Samson, John R.; Troxel, Ian; Subramaniyan, Rajagopal; Jacobs, Adam; Greco, James; Cieslewski, Grzegorz; Curreri, John; Fischer, Michael; Grobelny, Eric;
2006-01-01
With the ever-increasing demand for higher bandwidth and processing capacity in today's space exploration, space science, and defense missions, the ability to efficiently apply commercial-off-the-shelf (COTS) processors for on-board computing is now a critical need. In response to this need, NASA's New Millennium Program office has commissioned the development of Dependable Multiprocessor (DM) technology for use in payload and robotic missions. The Dependable Multiprocessor technology is a COTS-based, power-efficient, high-performance, highly dependable, fault-tolerant cluster computer. To date, Honeywell has successfully demonstrated a TRL4 prototype of the Dependable Multiprocessor [1], and is now working on the development of a TRL5 prototype. For the present effort, Honeywell has teamed with the University of Florida's High-performance Computing and Simulation (HCS) Lab, and together the team has demonstrated major elements of the Dependable Multiprocessor TRL5 system.
Children, Computers, and Powerful Ideas
ERIC Educational Resources Information Center
Bull, Glen
2005-01-01
Today it is commonplace that computers and technology permeate almost every aspect of education. In the late 1960s, though, the idea that computers could serve as a catalyst for thinking about the way children learn was a radical concept. In the early 1960s, Seymour Papert joined the faculty of MIT and founded the Artificial Intelligence Lab with…
Teaching Business Statistics in a Computer Lab: Benefit or Distraction?
ERIC Educational Resources Information Center
Martin, Linda R.
2011-01-01
Teaching in a classroom configured with computers has been heralded as an aid to learning. Students receive the benefits of working with large data sets and real-world problems. However, with the advent of network and wireless connections, students can now use the computer for alternating tasks, such as emailing, web browsing, and social…
On-line determination of pork color and intramuscular fat by computer vision
NASA Astrophysics Data System (ADS)
Liao, Yi-Tao; Fan, Yu-Xia; Wu, Xue-Qian; Xie, Li-juan; Cheng, Fang
2010-04-01
In this study, the application potential of computer vision for on-line determination of CIE L*a*b* color and intramuscular fat (IMF) content of pork was evaluated. Images of pork chops from 211 pig carcasses were captured while samples were on a conveyor belt moving at 0.25 m·s⁻¹ to simulate the on-line environment. CIE L*a*b* values and IMF content were measured with a colorimeter and by chemical extraction as references. The KSW algorithm combined with region selection was employed to eliminate the fat surrounding the longissimus dorsi muscle (MLD). RGB values of the pork were counted, and five methods were applied for transforming RGB values to CIE L*a*b* values. The region-growing algorithm with multiple seed points was applied to mask out the IMF pixels within the intensity-corrected images. The performance of the proposed algorithms was verified by comparing the measured reference values with the quality characteristics obtained by image processing. The MLD region of six samples could not be identified using the KSW algorithm. Intensity nonuniformity of the pork surface in the image could be eliminated efficiently, although the IMF region of three corrected images failed to be extracted. Given the considerable variety of color and complexity of the pork surface, the CIE L*, a* and b* color of the MLD could be predicted with correlation coefficients of 0.84, 0.54 and 0.47 respectively, and IMF content could be determined with a correlation coefficient of more than 0.70. The study demonstrated that it is feasible to evaluate CIE L*a*b* values and IMF content on-line using computer vision.
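One standard route for the RGB-to-CIE L*a*b* transformation evaluated in such studies goes sRGB → XYZ → L*a*b*. The sketch below assumes sRGB primaries and a D65 white point; the paper compares five transformation methods and does not necessarily use exactly this one.

```python
def _srgb_to_linear(c):
    """Undo the sRGB gamma curve (c in [0, 1])."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def rgb_to_lab(r, g, b):
    """Convert 8-bit sRGB to CIE L*a*b* (D65 white point, 2-degree observer)."""
    rl, gl, bl = (_srgb_to_linear(v / 255.0) for v in (r, g, b))
    # sRGB -> XYZ matrix for the D65 illuminant
    x = 0.4124 * rl + 0.3576 * gl + 0.1805 * bl
    y = 0.2126 * rl + 0.7152 * gl + 0.0722 * bl
    z = 0.0193 * rl + 0.1192 * gl + 0.9505 * bl
    # Normalize by the D65 reference white
    xn, yn, zn = 0.95047, 1.0, 1.08883

    def f(t):
        # CIE cube-root compression with the linear toe for small t
        return t ** (1 / 3) if t > (6 / 29) ** 3 else t / (3 * (6 / 29) ** 2) + 4 / 29

    fx, fy, fz = f(x / xn), f(y / yn), f(z / zn)
    L = 116 * fy - 16
    a = 500 * (fx - fy)
    b_star = 200 * (fy - fz)
    return L, a, b_star
```

Sanity checks: pure white (255, 255, 255) maps to roughly L* = 100, a* ≈ 0, b* ≈ 0, and black maps to L* = 0.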
Demo of three ways to use a computer to assist in lab
NASA Technical Reports Server (NTRS)
Neville, J. P.
1990-01-01
The objective is to help the slow learner and students with a language problem, or to challenge the advanced student. Technology has advanced to the point where images generated on a computer can easily be recorded on a VCR and used as a video tutorial. This transfer can be as simple as pointing a video camera at the screen and recording the image. For more clarity and professional results, a board may be inserted into a computer which will convert the signals directly to the TV standard. Using a computer program that generates movies one can animate various principles which would normally be impossible to show or would require time-lapse photography. For example, you might show the change in shape of grains as a piece of metal is cold worked and then show the recrystallization and grain growth as heat is applied. More imaginative titles and graphics are also possible using this technique. Remedial help may also be offered via computer to those who find a specific concept difficult. A printout of specific data, details of the theory or equipment set-up can be offered. Programs are now available that will help as well as test the student in specific areas so that a Keller type approach can be used with each student to insure each knows the subject before going on to the next topic. A computer can serve as an information source and contain the microstructures, physical data and availability of each material tested in the lab. With this source present unknowns can be evaluated and various tests simulated to create a simple or complex case study lab assignment.
UWGSP6: a diagnostic radiology workstation of the future
NASA Astrophysics Data System (ADS)
Milton, Stuart W.; Han, Sang; Choi, Hyung-Sik; Kim, Yongmin
1993-06-01
The Univ. of Washington's Image Computing Systems Lab (ICSL) has been involved in research into the development of a series of PACS workstations since the mid-1980s. The most recent research, a joint UW-IBM project, attempted to create a diagnostic radiology workstation using an IBM RISC System 6000 (RS6000) computer workstation and the X-Window system. While the results are encouraging, there are inherent limitations in the workstation hardware which prevent it from providing an acceptable level of functionality for diagnostic radiology. Realizing the RS6000 workstation's limitations, a parallel effort was initiated to design a workstation, UWGSP6 (Univ. of Washington Graphics System Processor #6), that provides the required functionality. This paper documents the design of UWGSP6, which not only addresses the requirements for a diagnostic radiology workstation in terms of display resolution, response time, etc., but also includes the processing performance necessary to support key functions needed in the implementation of algorithms for computer-aided diagnosis. The paper includes a description of the workstation architecture, and specifically its image processing subsystem. Verification of the design through hardware simulation is then discussed, and finally, performance of selected algorithms based on detailed simulation is provided.
A UML model for the description of different brain-computer interface systems.
Quitadamo, Lucia Rita; Abbafati, Manuel; Saggio, Giovanni; Marciani, Maria Grazia; Cardarilli, Gian Carlo; Bianchi, Luigi
2008-01-01
BCI research lacks a universal descriptive language among labs and a unique standard model for the description of BCI systems. This results in a serious problem in comparing the performance of different BCI processes and in unifying tools and resources. In view of this, we implemented a Unified Modeling Language (UML) model for the description of virtually any BCI protocol, and we demonstrated that it can be successfully applied to the most common ones, such as P300, mu-rhythm, SCP, SSVEP, and fMRI protocols. Finally, we illustrated the advantages of utilizing a standard terminology for BCIs and how the same basic structure can be successfully adopted for the implementation of new systems.
Computer Based Simulation of Laboratory Experiments.
ERIC Educational Resources Information Center
Edward, Norrie S.
1997-01-01
Examines computer based simulations of practical laboratory experiments in engineering. Discusses the aims and achievements of lab work (cognitive, process, psychomotor, and affective); types of simulations (model building and behavioral); and the strengths and weaknesses of simulations. Describes the development of a centrifugal pump simulation,…
A Planning Guide for Instructional Networks, Part II.
ERIC Educational Resources Information Center
Daly, Kevin F.
1994-01-01
This second in a series of articles on planning for instructional computer networks focuses on site preparation, installation, service, and support. Highlights include an implementation schedule; classroom and computer lab layouts; electrical power needs; workstations; network cable; telephones; furniture; climate control; and security. (LRW)
Lab4CE: A Remote Laboratory for Computer Education
ERIC Educational Resources Information Center
Broisin, Julien; Venant, Rémi; Vidal, Philippe
2017-01-01
Remote practical activities have been demonstrated to be efficient when learners come to acquire inquiry skills. In computer science education, virtualization technologies are gaining popularity as this technological advance enables instructors to implement realistic practical learning activities, and learners to engage in authentic and…
An interactive computer lab of the galvanic cell for students in biochemistry.
Ahlstrand, Emma; Buetti-Dinh, Antoine; Friedman, Ran
2018-01-01
We describe an interactive module that can be used to teach basic concepts in electrochemistry and thermodynamics to first-year natural science students. The module is used together with an experimental laboratory and improves the students' understanding of thermodynamic quantities such as ΔrG, ΔrH, and ΔrS that are calculated but not directly measured in the lab. We also discuss how new technologies can substitute for some parts of experimental chemistry courses and improve accessibility to course material. Cloud computing platforms such as CoCalc facilitate the distribution of computer codes and allow students to access and apply interactive course tools beyond the course's scope. Despite some limitations imposed by cloud computing, the students appreciated the approach and the enhanced opportunities to discuss study questions with their classmates and instructor as facilitated by the interactive tools. © 2017 The International Union of Biochemistry and Molecular Biology, 46(1):58-65, 2018.
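The thermodynamic quantities such a module asks students to calculate follow directly from the measured cell potential and its temperature coefficient. A minimal sketch (the Daniell-cell numbers at the bottom are illustrative assumptions, not values from the course):

```python
F = 96485.0  # Faraday constant, C/mol

def reaction_thermodynamics(n, E, dE_dT, T=298.15):
    """Return (dG, dS, dH) from galvanic-cell measurements.

    dG = -n F E        Gibbs energy (J/mol) from the cell potential E (V)
    dS =  n F dE/dT    entropy (J/(mol K)) from the temperature coefficient
    dH = dG + T dS     enthalpy (J/mol) via the Gibbs-Helmholtz relation
    """
    dG = -n * F * E
    dS = n * F * dE_dT
    dH = dG + T * dS
    return dG, dS, dH

# Worked example: Daniell cell, n = 2, E ~ 1.10 V; the temperature
# coefficient here (-4.5e-4 V/K) is an assumed, illustrative value.
dG, dS, dH = reaction_thermodynamics(2, 1.10, -4.5e-4)
# dG is about -212 kJ/mol, the familiar textbook figure for this cell.
```

Students can compare these computed quantities against tabulated standard values to see why ΔrH and ΔrS are accessible from electrochemistry even though neither is measured directly.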
PatternLab for proteomics 4.0: A one-stop shop for analyzing shotgun proteomic data
Carvalho, Paulo C; Lima, Diogo B; Leprevost, Felipe V; Santos, Marlon D M; Fischer, Juliana S G; Aquino, Priscila F; Moresco, James J; Yates, John R; Barbosa, Valmir C
2017-01-01
PatternLab for proteomics is an integrated computational environment that unifies several previously published modules for analyzing shotgun proteomic data. PatternLab contains modules for formatting sequence databases, performing peptide spectrum matching, statistically filtering and organizing shotgun proteomic data, extracting quantitative information from label-free and chemically labeled data, performing statistics for differential proteomics, displaying results in a variety of graphical formats, performing similarity-driven studies with de novo sequencing data, analyzing time-course experiments, and helping with the understanding of the biological significance of data in the light of the Gene Ontology. Here we describe PatternLab for proteomics 4.0, which closely knits together all of these modules in a self-contained environment, covering the principal aspects of proteomic data analysis as a freely available and easily installable software package. All updates to PatternLab, as well as all new features added to it, have been tested over the years on millions of mass spectra. PMID:26658470
Kara, Adnane; Rouillard, Camille; Mathault, Jessy; Boisvert, Martin; Tessier, Frédéric; Landari, Hamza; Melki, Imene; Laprise-Pelletier, Myriam; Boisselier, Elodie; Fortin, Marc-André; Boilard, Eric; Greener, Jesse; Miled, Amine
2016-05-28
In this paper, we present a new modular lab-on-a-chip design for multimodal neurotransmitter (NT) sensing and niosome generation, based on a plug-and-play concept. This architecture is a first step toward an automated platform for modulating neurotransmitter concentration to understand and/or treat neurodegenerative diseases. A modular approach has been adopted in order to handle measurement, drug delivery, or both simultaneously. The system is composed of three fully independent modules: a three-channel peristaltic micropumping system, a three-channel potentiostat, and a multi-unit microfluidic system composed of pseudo-Y and cross-shaped channels containing a miniature electrode array. The system was wirelessly controlled by a computer interface and is compact, with all the microfluidic and sensing components packaged in a 5 cm × 4 cm × 4 cm box. Applied to serotonin, a linear calibration curve down to 0.125 mM, with a limit of detection of 31 μM, was collected at unfunctionalized electrodes. Added sensitivity and selectivity were achieved by incorporating functionalized electrodes for dopamine sensing. Electrode functionalization was achieved with gold nanoparticles and with DNA and an o-phenylenediamine polymer. The as-configured platform is demonstrated as a central component of an "intelligent" drug delivery system based on a feedback loop to monitor drug delivery.
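A calibration curve and detection limit of the kind reported for serotonin come from an ordinary least-squares fit of signal against concentration. The sketch below uses made-up current readings, and the 3.3·σ/slope rule for the limit of detection is one common convention, not necessarily the one the authors used.

```python
def linear_fit(x, y):
    """Ordinary least-squares slope and intercept for a calibration curve."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx

def limit_of_detection(sigma_blank, slope):
    """LOD = 3.3 * sigma_blank / slope (a common IUPAC-style convention)."""
    return 3.3 * sigma_blank / slope

# Hypothetical calibration: concentration (mM) vs. peak current (uA).
# Both the readings and the blank noise figure are illustrative assumptions.
conc = [0.125, 0.25, 0.5, 1.0, 2.0]
current = [0.26, 0.51, 1.02, 2.01, 4.03]
slope, intercept = linear_fit(conc, current)
lod = limit_of_detection(0.02, slope)  # assumed blank standard deviation, uA
```

With these made-up numbers the fitted sensitivity is about 2 µA/mM and the detection limit lands in the tens of micromolar, the same order as the 31 µM the abstract reports.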
Improve Data Mining and Knowledge Discovery Through the Use of MatLab
NASA Technical Reports Server (NTRS)
Shaykhian, Gholam Ali; Martin, Dawn (Elliott); Beil, Robert
2011-01-01
Data mining is widely used to mine business, engineering, and scientific data. Data mining uses pattern-based queries, searches, or other analyses of one or more electronic databases/datasets in order to discover or locate a predictive pattern or anomaly indicative of system failure, criminal or terrorist activity, etc. There are various algorithms, techniques and methods used to mine data, including neural networks, genetic algorithms, decision trees, the nearest neighbor method, rule induction, association analysis, slice and dice, segmentation, and clustering. These algorithms, techniques and methods for detecting patterns in a dataset have been used in the development of numerous open source and commercially available products and technologies for data mining. Data mining is best realized when latent information in a large quantity of stored data is discovered. No one technique solves all data mining problems; the challenge is to select algorithms or methods appropriate to strengthen data/text mining and trending within given datasets. In recent years, throughout industry, academia and government agencies, thousands of data systems have been designed and tailored to serve specific engineering and business needs. Many of these systems use databases with relational algebra and structured query language to categorize and retrieve data. In these systems, data analyses are limited and require prior explicit knowledge of metadata and database relations, lacking exploratory data mining and discovery of latent information. This presentation introduces MatLab(R) (MATrix LABoratory), an engineering and scientific data analysis tool, to perform data mining. MatLab was originally intended to perform purely numerical calculations (a glorified calculator). Now, in addition to having hundreds of mathematical functions, it is a programming language with hundreds of built-in standard functions and numerous available toolboxes.
MatLab's ease of data processing and visualization, and its enormous range of built-in functionality and toolboxes, make it suitable for numerical computation and simulation as well as for data mining. Engineers and scientists can take advantage of the readily available functions/toolboxes to gain wider insight into their respective data mining experiments.
Bethune-Cookman University STEM Research Lab. DOE Renovation Project
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thompson, Herbert W.
DOE funding was used to renovate 4,500 square feet of aging laboratories and classrooms that support science, engineering, and mathematics disciplines (specifically environmental science and computer engineering). The expansion of the labs was needed to support robotics and environmental science research, and to better accommodate a wide variety of teaching situations. The renovated space includes a robotics laboratory, two multi-use labs, safe spaces for the storage of instrumentation, modern ventilation equipment, and other "smart" learning venues. The renovated areas feature technologies that are environmentally friendly with reduced energy costs. A campus showcase, the laboratories are a reflection of the University's commitment to the environment and to research as a tool for teaching. As anticipated, the labs facilitate the exploration of emerging technologies that are compatible with local and regional economic plans.
WetLab-2: Wet Lab RNA SmartCycler Providing PCR Capability on ISS
NASA Technical Reports Server (NTRS)
Parra, Macarena; Schonfeld, Julie
2015-01-01
The WetLab-2 system will provide sample preparation and qRT-PCR analysis on-board the ISS, a capability to enable using the ISS as a real laboratory. The system will be validated on SpX-7, and is planned for its first PI use on SpX-9.
NASA Technical Reports Server (NTRS)
Butler, Carolyn; Spencer, Randall
1988-01-01
The improvement of the computer hardware and software of the NASA Multipurpose Differential Absorption Lidar (DIAL) system is documented. The NASA DIAL system has undergone development and experimental deployment at NASA Langley Research Center for the remote measurement of atmospheric trace gas concentrations from ground and aircraft platforms. A viable DIAL system was developed that is capable of remotely measuring O3 and H2O concentrations from an aircraft platform. The DIAL Data Acquisition System (DAS) has also undergone a number of improvements. Because of DIAL's participation in the Global Tropospheric Experiment, modifications and improvements of the system were tested and used both in the lab and in the air. This document therefore serves as an operational manual for the DIAL DAS.
NASA Astrophysics Data System (ADS)
2011-07-01
WE RECOMMEND
Fun Fly Stick Science Kit: Fun fly stick introduces electrostatics to youngsters
Special Relativity: Text makes a useful addition to the study of relativity as an undergraduate
LabVIEW(TM) 2009 Education Edition: LabVIEW sets industry standard for gathering and analysing data, signal processing, instrumentation design and control, and automation and robotics
Edison and Ford Winter Estates: Thomas Edison's home is open to the public
The Computer History Museum: Take a walk through technology history at this computer museum
WORTH A LOOK
Fast Car Physics: Book races through physics
Beautiful Invisible: The main subject of this book is theoretical physics
Quantum Theory Cannot Hurt You: A guide to physics on the large and small scale
Chaos: The Science of Predictable Random Motion: Book explores the mathematics behind chaotic behaviour
Seven Wonders of the Universe: A textual trip through the wonderful universe
HANDLE WITH CARE
Marie Curie: A Biography: Book fails to capture Curie's science
WEB WATCH
Web clips to liven up science lessons
Machine vision methods for use in grain variety discrimination and quality analysis
NASA Astrophysics Data System (ADS)
Winter, Philip W.; Sokhansanj, Shahab; Wood, Hugh C.
1996-12-01
The decreasing cost of computer technology has made it feasible to incorporate machine vision into the agriculture industry. The biggest attraction of a machine vision system is the computer's ability to be completely consistent and objective. One use is in the variety discrimination and quality inspection of grains. Algorithms have been developed using Fourier descriptors and neural networks for variety discrimination of barley seeds. RGB and morphology features have been used in the quality analysis of lentils, and probability distribution functions and L, a, b color values for borage dockage testing. These methods have been shown to be very accurate and to have high potential for agriculture. This paper presents the techniques used and results obtained from projects including: a lentil quality discriminator, a barley variety classifier, a borage dockage tester, a popcorn quality analyzer, and a pistachio nut grading system.
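Fourier descriptors of the kind used for the barley classifier treat the seed's boundary as a complex-valued sequence and take the DFT of it. A minimal sketch, independent of the paper's exact normalization choices:

```python
import cmath

def fourier_descriptors(contour, n_desc=8):
    """Translation/scale/rotation-tolerant shape descriptors of a contour.

    contour: list of (x, y) boundary points, uniformly sampled around the
    shape. Dropping F[0] removes translation, dividing by |F[1]| removes
    scale, and keeping magnitudes only removes rotation and the choice of
    starting point.
    """
    z = [complex(x, y) for x, y in contour]
    N = len(z)
    # Plain DFT of the complex boundary sequence (O(N^2), fine for sketches)
    F = [sum(z[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
         for k in range(N)]
    scale = abs(F[1])
    return [abs(F[k]) / scale for k in range(2, min(2 + n_desc, N))]
```

The resulting feature vector is what would be fed to a classifier such as the neural network mentioned in the abstract; translating or uniformly scaling the contour leaves the descriptors unchanged.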
NASA Astrophysics Data System (ADS)
Callieri, M.; Debevec, P.; Pair, J.; Scopigno, R.
2005-06-01
Offline rendering techniques have nowadays reached an astonishing level of realism, but at the cost of long computation times. The new generation of programmable graphics hardware, on the other hand, makes it possible to implement in real time some of the visual effects previously available only for cinematographic production. In a collaboration between the Visual Computing Lab (ISTI-CNR) and the Institute for Creative Technologies of the University of Southern California, a real-time demo has been developed that replicates a sequence from the short movie "The Parthenon" presented at Siggraph 2004. The application is designed to run on an immersive reality system, making it possible for a user to perceive the virtual environment with cinematographic visual quality. In this paper we present the principal ideas of the project, discussing design issues and the technical solutions used for the real-time demo.
Using PVM to host CLIPS in distributed environments
NASA Technical Reports Server (NTRS)
Myers, Leonard; Pohl, Kym
1994-01-01
It is relatively easy to enhance CLIPS (C Language Integrated Production System) to support multiple expert systems running in a distributed environment with heterogeneous machines. The task is minimized by using the PVM (Parallel Virtual Machine) code from Oak Ridge Labs to provide the distributed utility. PVM is a library of C and FORTRAN subprograms that supports distributed computing on many different UNIX platforms. A PVM daemon is easily installed on each CPU that enters the virtual machine environment. Any user with rsh or rexec access to a machine can use the one PVM daemon to obtain a generous set of distributed facilities. The ready availability of both CLIPS and PVM makes the combination of software particularly attractive for budget-conscious experimentation with heterogeneous distributed computing using multiple CLIPS executables. This paper presents a design that is sufficient to provide essential message passing functions in CLIPS and enable the full range of PVM facilities.
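The essential send/receive pattern that PVM supplies to each CLIPS executable can be illustrated with the following sketch. This uses Python threads and queues purely as an analogy; real PVM passes messages between daemons on separate hosts via calls such as pvm_send and pvm_recv, and the names below are hypothetical.

```python
import threading
import queue

def expert_system(name, inbox, outbox):
    """Toy stand-in for one distributed CLIPS executable: receive a fact
    from the coordinating process, then send back a result."""
    msg = inbox.get()                       # analogous to pvm_recv()
    outbox.put(f"{name} processed {msg}")   # analogous to pvm_send()

inbox, outbox = queue.Queue(), queue.Queue()
worker = threading.Thread(target=expert_system, args=("clips-1", inbox, outbox))
worker.start()
inbox.put("(fact sensor-reading 42)")  # coordinator sends an asserted fact
reply = outbox.get()
worker.join()
```

In the actual design the queues would be PVM message buffers, and each "worker" would be a full CLIPS inference engine running on its own machine in the virtual machine environment.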
Resolving Mixed Algal Species in Hyperspectral Images
Mehrubeoglu, Mehrube; Teng, Ming Y.; Zimba, Paul V.
2014-01-01
We investigated a lab-based hyperspectral imaging system's response to pure (single) and mixed (two) algal cultures containing known algae types and volumetric combinations, in order to characterize the system's performance. The spectral response to volumetric changes in single algal suspensions and in combinations of algal mixtures with known ratios was tested. Constrained linear spectral unmixing was applied to extract the algal content of the mixtures, based on the abundances that produced the lowest root mean square (RMS) error. Percent prediction error was computed as the difference between the actual percent volumetric content and the abundances at minimum RMS error. The best prediction errors were computed as 0.4%, 0.4% and 6.3% for the mixed spectra from three independent experiments; the worst were 5.6%, 5.4% and 13.4% for the same order of experiments. Additionally, the Beer-Lambert law was utilized to relate transmittance to different volumes of pure algal suspensions, demonstrating linear logarithmic trends for optical property measurements. PMID:24451451
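For two endmembers, the constrained unmixing step reduces to a one-parameter search for the abundance that minimizes the RMS residual, since non-negativity and sum-to-one leave a single free abundance. A minimal sketch with synthetic four-band spectra (the paper's implementation, solver, and band count will differ):

```python
def unmix_two(observed, e1, e2, steps=1000):
    """Find abundance a in [0, 1] minimizing the RMS of
    a*e1 + (1-a)*e2 - observed.

    The sum-to-one and non-negativity constraints of constrained linear
    unmixing are enforced by construction of the mixing model.
    """
    def rms(a):
        resid = [a * u + (1 - a) * v - o
                 for u, v, o in zip(e1, e2, observed)]
        return (sum(r * r for r in resid) / len(resid)) ** 0.5

    best = min((rms(i / steps), i / steps) for i in range(steps + 1))
    return best[1], best[0]  # (abundance of e1, minimum RMS error)

# Synthetic endmember spectra and a 30/70 volumetric mixture (made-up values)
e1 = [0.10, 0.40, 0.80, 0.30]
e2 = [0.60, 0.20, 0.10, 0.50]
mix = [0.3 * u + 0.7 * v for u, v in zip(e1, e2)]
a, err = unmix_two(mix, e1, e2)
```

Comparing the recovered abundance against the known volumetric ratio, as done here, is exactly how the percent prediction errors in the abstract are defined.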
Timing considerations of Helmet Mounted Display performance
NASA Technical Reports Server (NTRS)
Tharp, Gregory; Liu, Andrew; French, Lloyd; Lai, Steve; Stark, Lawrence
1992-01-01
The Helmet Mounted Display (HMD) system developed in our lab should be a useful teleoperator systems display if it increases operator performance of the desired task; it can, however, introduce degradation in performance due to display update rate constraints and communication delays. Display update rates are slowed by communication bandwidth and/or computational power limitations. We used simulated 3D tracking and pick-and-place tasks to characterize performance levels for a range of update rates. Initial experiments with 3D tracking indicate that performance levels plateau at an update rate between 10 and 20 Hz. We have found that using the HMD with delay decreases performance as delay increases.
UBioLab: a web-laboratory for ubiquitous in-silico experiments.
Bartocci, Ezio; Cacciagrano, Diletta; Di Berardini, Maria Rita; Merelli, Emanuela; Vito, Leonardo
2012-07-09
The huge and dynamic amount of bioinformatic resources (e.g., data and tools) available nowadays on the Internet represents a big challenge for biologists, as concerns their management and visualization, and for bioinformaticians, as concerns the possibility of rapidly creating and executing in-silico experiments involving resources and activities spread over the WWW hyperspace. Any framework aiming at integrating such resources as in a physical laboratory must tackle, and possibly handle in a transparent and uniform way, aspects concerning physical distribution, semantic heterogeneity, and the co-existence of different computational paradigms and, as a consequence, of different invocation interfaces (i.e., OGSA for Grid nodes, SOAP for Web Services, Java RMI for Java objects, etc.). The framework UBioLab has been designed and developed as a prototype following the above objective. Several architectural features, such as being fully Web-based and combining domain ontologies, Semantic Web and workflow techniques, give evidence of an effort in this direction. The integration of a semantic knowledge management system for distributed (bioinformatic) resources, a semantic-driven graphic environment for defining and monitoring ubiquitous workflows, and an intelligent agent-based technology for their distributed execution allows UBioLab to be a semantic guide for bioinformaticians and biologists, providing (i) a flexible environment for visualizing, organizing and inferring any (semantic and computational) "type" of domain knowledge (e.g., resources and activities, expressed in a declarative form), (ii) a powerful engine for defining and storing semantic-driven ubiquitous in-silico experiments on the domain hyperspace, as well as (iii) a transparent, automatic and distributed environment for correct experiment executions.
Mammalian Toxicology Testing: Problem Definition Study, Personnel Plan.
1981-03-01
[Garbled extract of a personnel table: job titles (Technician, Biochemist, Biologist, Bookkeeper, Cage Washer, Clinical Chemist, Compound Preparation Technician, Computer Coder, Computer Programmer, Electron Microscope Operator, Lab Technician (Chemistry), Animal ...) with qualification, location, and salary columns, e.g., Computer Programmer, BS, SF Bay Area, $18,400-$24,500.]
Mathematics and Computer Science: Exploring a Symbiotic Relationship
ERIC Educational Resources Information Center
Bravaco, Ralph; Simonson, Shai
2004-01-01
This paper describes a "learning community" designed for sophomore computer science majors who are simultaneously studying discrete mathematics. The learning community consists of three courses: Discrete Mathematics, Data Structures and an Integrative Seminar/Lab. The seminar functions as a link that integrates the two disciplines. Participation…
Gender Effects of Computer Use in a Conceptual Physics Lab Course
ERIC Educational Resources Information Center
Van Domelen, Dave
2010-01-01
It's always hard to know what to expect when bringing computers into an educational setting, as things are always changing. Student skills with computers are different today than they were 10 years ago, and 20 years ago almost counts as an alien world. Still, one hopes that some of these changes result in positive trends, such as student attitudes…
ERIC Educational Resources Information Center
Bozzone, Meg A.
1997-01-01
Purchasing custom-made desks with durable glass tops to house computers and double as student work space solved the problem of how to squeeze in additional classroom computers at Johnson Park Elementary School in Princeton, New Jersey. This article describes a K-5 grade school's efforts to overcome barriers to integrating technology. (PEN)
Eirín-López, José M
2013-01-01
The study of chromatin constitutes one of the most active research fields in the life sciences, being subject to constant revisions that continuously redefine the state of the art in its knowledge. Like every other rapidly changing field, chromatin biology requires clear and straightforward educational strategies able to efficiently translate such a vast body of knowledge to the classroom. With this aim, the present work describes a multidisciplinary computer lab designed to introduce undergraduate students to the dynamic nature of chromatin, within the context of the one-semester course "Chromatin: Structure, Function and Evolution." This exercise is organized in three parts, including (a) molecular evolutionary biology of histone families (using the H1 family as an example), (b) histone structure and variation across different animal groups, and (c) the effect of histone diversity on nucleosome structure and chromatin dynamics. By using freely available bioinformatic tools that can be run on common computers, the concept of chromatin dynamics is interactively illustrated from a comparative/evolutionary perspective. At the end of this computer lab, students are able to translate the bioinformatic information into a biochemical context in which the relevance of histone primary structure to chromatin dynamics is made evident. During the last 8 years this exercise has proven to be a powerful approach for teaching chromatin structure and dynamics, allowing students a higher degree of independence during the processes of learning and self-assessment. Copyright © 2013 International Union of Biochemistry and Molecular Biology, Inc.
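The kind of comparison the exercise builds on, relating histone primary structure across species, can be sketched as a percent-identity calculation over aligned sequences. The fragments below are invented for illustration, not real H1 sequences:

```python
# Percent identity between two aligned sequence fragments
# (sequence strings are hypothetical, for illustration only).

def percent_identity(a, b):
    assert len(a) == len(b), "sequences must be aligned"
    matches = sum(1 for x, y in zip(a, b) if x == y and x != "-")
    return 100.0 * matches / len(a)

h1_human = "MSETAPAAPAAPAPAEKT"   # invented fragment
h1_mouse = "MSETAPVAPAAPAPVEKT"   # invented fragment, 2 substitutions

print(round(percent_identity(h1_human, h1_mouse), 1))  # -> 88.9
```

Freely available tools such as BLAST or Clustal perform the alignment step itself; this sketch only shows the scoring arithmetic students would interpret.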
A convertor and user interface to import CAD files into worldtoolkit virtual reality systems
NASA Technical Reports Server (NTRS)
Wang, Peter Hor-Ching
1996-01-01
Virtual Reality (VR) is a rapidly developing human-computer interface technology. VR can be considered a three-dimensional computer-generated Virtual World (VW) which can sense particular aspects of a user's behavior, allow the user to manipulate objects interactively, and render the VW in real time accordingly. The user is totally immersed in the virtual world and feels a sense of being transported into that VW. The NASA/MSFC Computer Application Virtual Environments (CAVE) lab has been developing space-related VR applications since 1990. The VR systems in the CAVE lab are based on the VPL RB2 system, which consists of a VPL RB2 control tower, an LX eyephone, an Isotrak Polhemus sensor, two Fastrak Polhemus sensors, a Flock of Birds sensor, and two VPL DG2 DataGloves. A dynamics animator called Body Electric from VPL is used as the control system to interface with all the input/output devices and to provide network communications as well as a VR programming environment. RB2 Swivel 3D is used as the modelling program to construct the VWs. A severe limitation of the VPL VR system is its reliance on RB2 Swivel 3D, which restricts files to a maximum of 1020 objects and lacks advanced graphics texture mapping. The other limitation is that the VPL VR system is a turn-key system which does not give the user the flexibility to add new sensors or a C language interface. Recently, the NASA/MSFC CAVE lab has provided VR systems built on Sense8 WorldToolKit (WTK), a C library for creating VR development environments. WTK provides device drivers for most of the sensors and eyephones available on the VR market. WTK accepts several CAD file formats, such as the Sense8 Neutral File Format, AutoCAD DXF and 3D Studio file formats, the Wavefront OBJ file format, the VideoScape GEO file format, and Intergraph EMS stereolithography and CATIA stereolithography (STL) file formats.
WTK functions are object-oriented in their naming convention, are grouped into classes, and provide an easy C language interface. When using a CAD or modelling program to build a VW for WTK VR applications, we typically construct the stationary universe with all the geometric objects except the dynamic objects, and create each dynamic object in an individual file.
NASA Astrophysics Data System (ADS)
Russo, Luigi; Sorrentino, Marco; Polverino, Pierpaolo; Pianese, Cesare
2017-06-01
This work focuses on the development of a fast PEMFC impedance model built from both physical and geometrical variables. Buckingham's π theorem is applied to define non-dimensional parameters that suitably describe the relationships linking the physical variables involved in the process under study to the fundamental dimensions. This approach is a useful solution for problems whose first-principles models are unknown, difficult to build or computationally unfeasible. The key contributions of the proposed similarity-theory-based modelling approach are presented and discussed. The major advantage resides in its straightforward online applicability, thanks to a very low computational burden, while preserving a good level of accuracy. This makes the model suitable for several purposes, such as design, control, diagnostics, state-of-health monitoring and prognostics. Experimental data, collected in different operating conditions, have been analysed to demonstrate the capability of the model to reproduce PEMFC impedance at different loads and temperatures. This results in a reduction of the experimental effort for the FCS lab characterization. Moreover, the model can be used for scaling up, reproducing the full-stack impedance from the single-cell one, thus supporting FC design and development from lab scale to commercial-system scale.
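The dimensional bookkeeping behind Buckingham's π theorem can be sketched as follows: each variable carries an exponent vector over the base dimensions, and a candidate group is dimensionless exactly when the exponents sum to zero. This is an illustrative fluid-mechanics example (Reynolds number), not the paper's PEMFC variables:

```python
# Exponents of (mass, length, time) for a few physical variables;
# the example variables are illustrative, not the paper's.
DIMS = {
    "rho": (1, -3, 0),    # density, kg m^-3
    "v":   (0, 1, -1),    # velocity, m s^-1
    "L":   (0, 1, 0),     # characteristic length, m
    "mu":  (1, -1, -1),   # dynamic viscosity, kg m^-1 s^-1
}

def group_dimensions(powers):
    """Dimensions of prod(var**p) for a {var: p} mapping."""
    return tuple(
        sum(DIMS[name][i] * p for name, p in powers.items())
        for i in range(3)
    )

# Reynolds number Re = rho * v * L / mu should be dimensionless.
reynolds = {"rho": 1, "v": 1, "L": 1, "mu": -1}
print(group_dimensions(reynolds))  # -> (0, 0, 0)
```

In practice the π theorem is the statement that such zero-exponent combinations span the null space of the dimensional matrix, reducing the number of independent variables the model must fit.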
Neuhauser, S; Handler, J
2013-09-01
The aims of this study were to compare two different methods of quantifying the colour of the luminal surface of the equine endometrium and to relate the results to histopathological evidence of inflammation and fibrosis. The mucosal surfaces of 17 equine uteri obtained from an abattoir were assessed using a spectrophotometer and by computer-assisted analysis of photographs. Values were converted into L*a*b* colour space. Although there was significant correlation between the two methods of quantification, variations in 'brightness', 'red' and 'yellow' values were noted. Within a given uterus, measurements using the spectrophotometer did not differ significantly. Using photographic analysis, brightness differed between horns, although no differences in chromaticity were found. Histopathological classification of changes within endometria corresponded to measured differences in colour. Extensive fibrosis was associated with increased brightness and decreased chromaticity using both methods. Inflammation correlated with reduced chromaticity, when measured by spectrophotometry, and with reduced brightness and yellow values, when assessed photographically. For this technique to gain wider acceptance as a diagnostic tool, e.g. for the endoscopic evaluation of uterine mucosae in vivo, standardised illumination techniques will be required so that colours can be compared and interpreted accurately. Copyright © 2013 Elsevier Ltd. All rights reserved.
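Converting a photographed pixel into L*a*b* coordinates, so that brightness (L*) and chromaticity (a*, b*) can be compared numerically, follows the standard sRGB-to-CIELAB formulas (D65 white point). This is a generic sketch of that conversion, not the study's calibrated pipeline:

```python
def srgb_to_lab(r8, g8, b8):
    """Standard-formula sRGB (0-255) -> CIE L*a*b*, D65 white point."""
    # 1) undo the sRGB gamma curve
    def lin(c):
        c /= 255.0
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = lin(r8), lin(g8), lin(b8)
    # 2) linear RGB -> XYZ (sRGB/D65 matrix)
    x = 0.4124 * r + 0.3576 * g + 0.1805 * b
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    z = 0.0193 * r + 0.1192 * g + 0.9505 * b
    # 3) XYZ -> L*a*b*, normalised to the D65 white point
    def f(t):
        return t ** (1.0 / 3.0) if t > 0.008856 else 7.787 * t + 16.0 / 116.0
    fx, fy, fz = f(x / 0.95047), f(y / 1.0), f(z / 1.08883)
    return 116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)

L, a, b = srgb_to_lab(255, 255, 255)   # a white pixel: L* near 100, a*, b* near 0
```

A colour difference between two measurements is then just the Euclidean distance between their (L*, a*, b*) triples.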
STS-126 crew during preflight VR LAB MSS EVA2 training
2008-04-14
JSC2008-E-033771 (14 April 2008) --- Astronaut Eric A. Boe, STS-126 pilot, uses the virtual reality lab in the Space Vehicle Mockup Facility at NASA's Johnson Space Center to train for some of his duties aboard the space shuttle and space station. This type of computer interface, paired with virtual reality training hardware and software, helps to prepare the entire team for dealing with space station elements.
Probability of illness definition for the Skylab flight crew health stabilization program
NASA Technical Reports Server (NTRS)
1974-01-01
Management and analysis of crew and environmental microbiological data from SMEAT and Skylab are discussed. Samples were collected from ten different body sites on each SMEAT and Skylab crew-member on approximately 50 occasions and since several different organisms could be isolated from each sample, several thousand lab reports were generated. These lab reports were coded and entered in a computer file and from the file various tabular summaries were constructed.
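The tabulation described above, summarizing thousands of coded lab reports from a computer file, can be sketched with a few invented records; all crew labels, body sites and organism codes here are hypothetical:

```python
from collections import Counter

# Hypothetical coded lab reports: (crew_member, body_site, organism_code)
reports = [
    ("CDR", "scalp", "S_AUREUS"),
    ("CDR", "nares", "S_AUREUS"),
    ("SPT", "nares", "S_EPIDERMIDIS"),
    ("PLT", "scalp", "S_AUREUS"),
]

# Tabular summaries: counts by organism and by body site
by_organism = Counter(org for _, _, org in reports)
by_site = Counter(site for _, site, _ in reports)

print(by_organism["S_AUREUS"])  # -> 3
```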
Naval Postgraduate School Research. Volume 8, Number 2, June 1998
1998-06-01
NPS Research, Volume 8, Number 2, June 1998. Office of the Dean of Research, Naval Postgraduate School, Monterey, California. [Remainder is garbled OCR of newsletter fragments about the Signal Enhancement Laboratory, Department of Electrical and Computer Engineering: Research Associate Professor Richard W. Adler and Research Associate Wilbur R. Vincent, with research on electromagnetic environmental effects.]
Verification of Space Station Secondary Power System Stability Using Design of Experiment
NASA Technical Reports Server (NTRS)
Karimi, Kamiar J.; Booker, Andrew J.; Mong, Alvin C.; Manners, Bruce
1998-01-01
This paper describes analytical methods used in the verification of large DC power systems, with applications to the International Space Station (ISS). Large DC power systems contain many switching power converters with negative-resistance characteristics. The ISS power system presents numerous challenges with respect to system stability, such as complex sources and undefined loads. The Space Station program has developed impedance specifications for sources and loads. The overall approach to system stability consists of specific hardware requirements coupled with extensive system analysis and testing. Testing of large, complex distributed power systems is not practical due to the size and complexity of the system. Computer modeling has been extensively used to develop hardware specifications as well as to identify system configurations for lab testing. The statistical method of Design of Experiments (DoE) is used as an analysis tool for verification of these large systems. DoE reduces the number of computer runs that are necessary to analyze the performance of a complex power system consisting of hundreds of DC/DC converters. DoE also provides valuable information about the effect of changes in system parameters on the performance of the system, covers various operating scenarios, and identifies the ones with potential for instability. In this paper we describe how we have used computer modeling to analyze a large DC power system. A brief description of DoE is given. Examples applying DoE to the analysis and verification of the ISS power system are provided.
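The core DoE idea, running a structured two-level factorial over the parameters and estimating main effects from contrast averages, can be sketched on a toy response (the response function here is a stand-in, not the ISS power-system model):

```python
from itertools import product

# Toy response over three coded factors in {-1, +1}; x3 is inert.
# This stands in for an expensive power-system simulation run.
def response(x1, x2, x3):
    return 3.0 + 2.0 * x1 - 1.0 * x2

# Full two-level factorial: 2^3 = 8 runs
runs = [(x, response(*x)) for x in product((-1, 1), repeat=3)]

def main_effect(i):
    """Average response at the high level minus at the low level."""
    hi = [y for x, y in runs if x[i] == 1]
    lo = [y for x, y in runs if x[i] == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

print(main_effect(0), main_effect(1), main_effect(2))  # -> 4.0 -2.0 0.0
```

Fractional factorials shrink the run count further by aliasing high-order interactions, which is what makes DoE attractive when each "run" is a long converter-network simulation.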
Design of a Low-Cost Air Levitation System for Teaching Control Engineering.
Chacon, Jesus; Saenz, Jacobo; Torre, Luis de la; Diaz, Jose Manuel; Esquembre, Francisco
2017-10-12
Air levitation is the process by which an object is lifted without mechanical support in a stable position, by providing an upward force that counteracts the gravitational force exerted on the object. This work presents a low-cost lab implementation of an air levitation system, based on open solutions. The rapid dynamics make it especially suitable for a remote control lab. Due to the system's nature, the design can be optimized and, with some precision trade-off, kept affordable in both cost and construction effort. It was designed to be easily adapted for use both as a remote lab and as a hands-on lab.
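A control exercise on such a rig typically closes a feedback loop around the object's height. The sketch below runs a discrete PI controller against a crude first-order stand-in for the plant; the plant model and gains are invented for illustration and are not the paper's system:

```python
def simulate(setpoint=1.0, kp=2.0, ki=1.0, dt=0.01, steps=5000):
    """PI loop on a toy first-order plant x' = -x + u (all values assumed)."""
    x, integ = 0.0, 0.0            # height, integral of error
    for _ in range(steps):
        e = setpoint - x
        integ += e * dt
        u = kp * e + ki * integ    # fan command
        x += dt * (-x + u)         # Euler step of the toy plant
    return x

final_height = simulate()          # settles near the setpoint
```

The integral term is what removes the steady-state error: at equilibrium the proportional term is zero, and the integrator alone supplies the lift command.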
Yang, Qian; Zimmerman, John; Steinfeld, Aaron; Carey, Lisa; Antaki, James F.
2016-01-01
Clinical decision support tools (DSTs) are computational systems that aid healthcare decision-making. While effective in labs, almost all of these systems have failed when moved into clinical practice. Healthcare researchers have speculated that this is most likely due to a lack of user-centered HCI considerations in the design of these systems. This paper describes a field study investigating how clinicians make a heart pump implant decision, with a focus on how best to integrate an intelligent DST into their work process. Our findings reveal a lack of perceived need for and trust of machine intelligence, as well as many barriers to computer use at the point of clinical decision-making. These findings suggest an alternative perspective to the traditional use models, in which clinicians engage with DSTs at the point of making a decision. We identify situations across patients’ healthcare trajectories when decision supports would help, and we discuss new forms they might take in these situations. PMID:27833397
Novel Ultrahigh Vacuum System for Chip-Scale Trapped Ion Quantum Computing
NASA Astrophysics Data System (ADS)
Chen, Shaw-Pin; Trapped Team
2011-05-01
This presentation reports the experimental results of an ultrahigh vacuum (UHV) system as a scheme to implement scalable trapped-ion quantum computers that use micro-fabricated ion traps as fundamental building blocks. The novelty of this system resides in our design, material selection, mechanical reliability, low complexity of assembly, and reduced signal interference between DC and RF electrodes. Our system utilizes RF isolation and onsite-filtering topologies to attenuate AC signals generated from the resonator. We use a UHV-compatible printed circuit board (PCB) material to perform DC routing, while the RF high and RF ground receive separate routing via wire-wrapping. The standard PCB fabrication process enabled us to implement ceramic-based filter components adjacent to the chip trap. The DC electrodes are connected to air-side electrical feedthroughs using four 25D adaptors made with polyether ether ketone (PEEK). The assembly process of this system is straightforward and the in-chamber structure is self-supporting. We report on initial testing of this concept with a linear chip trap fabricated by Sandia National Labs.
Operating Dedicated Data Centers - Is It Cost-Effective?
NASA Astrophysics Data System (ADS)
Ernst, M.; Hogue, R.; Hollowell, C.; Strecker-Kellog, W.; Wong, A.; Zaytsev, A.
2014-06-01
The advent of cloud computing centres such as Amazon's EC2 and Google's Computing Engine has elicited comparisons with dedicated computing clusters. Discussions on appropriate usage of cloud resources (both academic and commercial) and costs have ensued. This presentation discusses a detailed analysis of the costs of operating and maintaining the RACF (RHIC and ATLAS Computing Facility) compute cluster at Brookhaven National Lab and compares them with the cost of cloud computing resources under various usage scenarios. An extrapolation of likely future cost effectiveness of dedicated computing resources is also presented.
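The comparison described above reduces to an amortized cost-per-core-hour calculation under a utilization assumption. The sketch below uses invented prices, not the RACF's actual figures:

```python
def dedicated_cost_per_core_hour(capex, lifetime_yr, opex_per_yr,
                                 cores, utilization):
    """Amortized $/core-hour for an owned cluster (all inputs assumed)."""
    delivered_hours = lifetime_yr * 8760 * utilization   # per core
    total_cost = capex + opex_per_yr * lifetime_yr
    return total_cost / (cores * delivered_hours)

cloud_rate = 0.05   # assumed on-demand $/core-hour, illustrative only

own = dedicated_cost_per_core_hour(
    capex=1_000_000, lifetime_yr=4, opex_per_yr=150_000,
    cores=5_000, utilization=0.90)

print(own < cloud_rate)  # -> True
```

The model makes the usual qualitative point explicit: at sustained high utilization a dedicated facility undercuts on-demand cloud pricing, while at low utilization the comparison flips.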
Quantitative Gait Measurement With Pulse-Doppler Radar for Passive In-Home Gait Assessment
Wang, Fang; Skubic, Marjorie; Rantz, Marilyn; Cuddihy, Paul E.
2014-01-01
In this paper, we propose a pulse-Doppler radar system for in-home gait assessment of older adults. A methodology has been developed to extract gait parameters, including walking speed and step time, using Doppler radar. The gait parameters have been validated against a Vicon motion capture system in the lab with 13 participants and 158 test runs. The study revealed that for optimal step recognition and walking speed estimation, a dual radar setup with one radar placed at foot level and the other at torso level is necessary. An excellent absolute agreement, with an intraclass correlation coefficient of 0.97, was found for step time estimation with the foot-level radar. For walking speed, although both radars show excellent consistency, both have a systematic offset relative to the ground truth due to the walking direction with respect to the radar beam. The torso-level radar performs better (9% offset on average) in the speed estimation than the foot-level radar (13%-18% offset). Quantitative analysis has been performed to compute the angles causing the systematic error. These lab results demonstrate the capability of the system to be used as a daily gait assessment tool in home environments, useful for fall risk assessment and other health care applications. The system is currently being tested in an unstructured home environment. PMID:24771566
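The Doppler relation behind such speed estimates, and the cosine error the abstract attributes to walking direction, can be sketched as follows. The carrier frequency and beam angle are illustrative assumptions, not the paper's values:

```python
import math

C = 3.0e8   # speed of light, m/s

def radial_speed(doppler_hz, carrier_hz):
    """Speed component along the beam from the measured Doppler shift."""
    return doppler_hz * C / (2.0 * carrier_hz)

def offset_fraction(angle_deg):
    """Relative underestimate of true speed when the walking path makes
    the given angle with the radar beam and the angle is ignored."""
    return 1.0 - math.cos(math.radians(angle_deg))

# Example: a 24 GHz carrier (assumed) and a 160 Hz shift give 1 m/s
# along the beam; a 25-degree beam angle underestimates speed by ~9%,
# comparable to the torso-level offset reported above.
print(round(offset_fraction(25.0), 3))  # -> 0.094
```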
Computational Methods for Inviscid and Viscous Two-and-Three-Dimensional Flow Fields.
1975-01-01
Difference Equations Over a Network, Watson Sci. Comput. Lab. Report, 1949. 173. Isaacson, E. and Keller, H. B., Analysis of Numerical Methods...element method has given a new impulse to the old mathematical theory of multivariate interpolation. We first study the one-dimensional case, which
A Monte Carlo Simulation of Brownian Motion in the Freshman Laboratory
ERIC Educational Resources Information Center
Anger, C. D.; Prescott, J. R.
1970-01-01
Describes a "dry-lab" experiment for the college freshman laboratory, in which the essential features of Brownian motion are demonstrated using the Monte Carlo technique. Calculations are carried out by a computation scheme based on a computer language. Bibliography. (LC)
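A minimal Monte Carlo "dry-lab" in the spirit of this exercise simulates 2-D random walks of unit steps and checks the signature of Brownian motion: the mean squared displacement grows linearly with the number of steps. Step counts and walker counts below are arbitrary choices:

```python
import math
import random

random.seed(1)   # reproducible runs

def walk_r2(n_steps):
    """Squared end-to-end displacement of one 2-D unit-step random walk."""
    x = y = 0.0
    for _ in range(n_steps):
        theta = random.uniform(0.0, 2.0 * math.pi)
        x += math.cos(theta)
        y += math.sin(theta)
    return x * x + y * y

n_steps, walkers = 100, 2000
mean_r2 = sum(walk_r2(n_steps) for _ in range(walkers)) / walkers
# For unit steps, <r^2> should be close to n_steps (here, ~100).
```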
ERIC Educational Resources Information Center
Roy, Ken
2005-01-01
Unless the teacher is working at an ergonomically designed workstation, using a computer can result in eyestrain, neck aches, backaches, and headaches. Unfortunately, most teachers do their keyboarding at desks, on lab tables, and in other spaces that were not designed with computer use in mind. Ergonomics is the science of adapting workstations,…
The Impact of Microcomputers on Composition Students.
ERIC Educational Resources Information Center
Hocking, Joan
To determine whether computer assisted instruction was just a fad or a viable alternative to traditional methods for teaching English composition, a microcomputer was used in a traditional college freshman English course. The class was divided into small groups: some went to the computer lab, while others worked in the classroom. Interactive…
Pedagogy and Related Criteria: The Selection of Software for Computer Assisted Language Learning
ERIC Educational Resources Information Center
Samuels, Jeffrey D.
2013-01-01
Computer-Assisted Language Learning (CALL) is an established field of academic inquiry with distinct applications for second language teaching and learning. Many CALL professionals direct language labs or language resource centers (LRCs) in which CALL software applications and generic software applications support language learning programs and…
Wireless Computing in the Library: A Successful Model at St. Louis Community College.
ERIC Educational Resources Information Center
Patton, Janice K.
2001-01-01
Describes the St. Louis Community College (Missouri) library's use of laptop computers in the instruction lab as a way to save space and wiring costs. Discusses the pros and cons of wireless library instruction; advantages include its flexibility and its ability to eliminate cabling. (NB)
Adapting NBODY4 with a GRAPE-6a Supercomputer for Web Access, Using NBodyLab
NASA Astrophysics Data System (ADS)
Johnson, V.; Aarseth, S.
2006-07-01
A demonstration site has been developed by the authors that enables researchers and students to experiment with the capabilities and performance of NBODY4 running on a GRAPE-6a over the web. NBODY4 is a sophisticated open-source N-body code for high-accuracy simulations of dense stellar systems (Aarseth 2003). In 2004, NBODY4 was successfully tested with a GRAPE-6a, yielding an unprecedented low-cost tool for astrophysical research. The GRAPE-6a is a supercomputer card developed by astrophysicists to accelerate high-accuracy N-body simulations with a cluster or a desktop PC (Fukushige et al. 2005, Makino & Taiji 1998). The GRAPE-6a card became commercially available in 2004, runs at 125 Gflops peak, has a standard PCI interface and costs less than $10,000. Researchers running the widely used NBODY6 (which does not require GRAPE hardware) can compare their own PC or laptop performance with simulations run on http://www.NbodyLab.org. Such comparisons may help justify acquisition of a GRAPE-6a. For workgroups such as university physics or astronomy departments, the demonstration site may be replicated or serve as a model for a shared computing resource. The site was constructed using an NBodyLab server-side framework.
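The kind of direct N-body integration that NBODY codes accelerate can be illustrated, at toy scale, by two equal masses on a circular orbit advanced with a leapfrog (kick-drift-kick) scheme. This sketch uses G = m = 1 units and is not NBODY4 itself:

```python
import math

def step(pos, vel, dt):
    """One kick-drift-kick leapfrog step for two unit masses, G = 1."""
    def acc(p):
        (x1, y1), (x2, y2) = p
        dx, dy = x2 - x1, y2 - y1
        r3 = (dx * dx + dy * dy) ** 1.5
        return [(dx / r3, dy / r3), (-dx / r3, -dy / r3)]
    a = acc(pos)
    vel = [(vx + 0.5 * dt * ax, vy + 0.5 * dt * ay)
           for (vx, vy), (ax, ay) in zip(vel, a)]
    pos = [(x + dt * vx, y + dt * vy)
           for (x, y), (vx, vy) in zip(pos, vel)]
    a = acc(pos)
    vel = [(vx + 0.5 * dt * ax, vy + 0.5 * dt * ay)
           for (vx, vy), (ax, ay) in zip(vel, a)]
    return pos, vel

v = math.sqrt(0.5)                      # circular-orbit speed for separation 1
pos = [(-0.5, 0.0), (0.5, 0.0)]
vel = [(0.0, -v), (0.0, v)]
for _ in range(1000):                   # ~2 orbital periods at dt = 0.01
    pos, vel = step(pos, vel, 0.01)
separation = math.dist(pos[0], pos[1])  # stays close to 1 for a good integrator
```

GRAPE hardware accelerates exactly the pairwise-force evaluation inside `acc`, which dominates the cost when the particle count is large.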
BioMEMS and Lab-on-a-Chip Course Education at West Virginia University
Liu, Yuxin
2011-01-01
With the rapid growth of Biological/Biomedical MicroElectroMechanical Systems (BioMEMS) and microfluidic-based lab-on-a-chip (LOC) technology to biological and biomedical research and applications, demands for educated and trained researchers and technicians in these fields are rapidly expanding. Universities are expected to develop educational plans to address these specialized needs in BioMEMS, microfluidic and LOC science and technology. A course entitled BioMEMS and Lab-on-a-Chip was taught recently at the senior undergraduate and graduate levels in the Department of Computer Science and Electrical Engineering at West Virginia University (WVU). The course focused on the basic principles and applications of BioMEMS and LOC technology to the areas of biomedicine, biology, and biotechnology. The course was well received and the enrolled students had diverse backgrounds in electrical engineering, material science, biology, mechanical engineering, and chemistry. Student feedback and a review of the course evaluations indicated that the course was effective in achieving its objectives. Student presentations at the end of the course were a highlight and a valuable experience for all involved. The course proved successful and will continue to be offered regularly. This paper provides an overview of the course as well as some development and future improvements. PMID:25586697
2002-04-02
KENNEDY SPACE CENTER, FLA. -- STS-110 Mission Specialist Ellen Ochoa has a final check of her launch and entry suit in preparation for launch April 4. This flight will be her fourth. The STS-110 payload includes the S0 Integrated Truss Structure (ITS), the Canadian Mobile Transporter, power distribution system modules, a heat pipe radiator for cooling, computers and a pair of rate gyroscopes. The 11-day mission is the 13th assembly flight to the ISS and includes four spacewalks to attach the S0 truss to the U.S. Lab Destiny.
2002-04-02
KENNEDY SPACE CENTER, FLA. -- STS-110 Mission Specialist Lee Morin undergoes final check of his launch and entry suit. Morin will be taking his first Shuttle flight. The STS-110 payload includes the S0 Integrated Truss Structure (ITS), the Canadian Mobile Transporter, power distribution system modules, a heat pipe radiator for cooling, computers and a pair of rate gyroscopes. The 11-day mission is the 13th assembly flight to the ISS and includes four spacewalks to attach the S0 truss to the U.S. Lab Destiny. Launch is scheduled for April 4
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bryden, Mark; Tucker, David A.
The goal of this project is to develop a merged environment for simulation and analysis (MESA) at the National Energy Technology Laboratory’s (NETL) Hybrid Performance (Hyper) project laboratory. The MESA sensor lab developed as a component of this research will provide a development platform for investigating: 1) advanced control strategies, 2) testing and development of sensor hardware, 3) various modeling in-the-loop algorithms and 4) other advanced computational algorithms for improved plant performance using sensors, real-time models, and complex systems tools.
Computer aided statistical process control for on-line instrumentation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meils, D.E.
1995-01-01
On-line chemical process instrumentation historically has been used for trending. Recent technological advances in on-line instrumentation have improved its accuracy and reliability. However, little attention has been given to validating and verifying on-line instrumentation. This paper presents two practical approaches for validating instrument performance by comparison of the on-line instrument response to either another portable instrument or another bench instrument. Because the comparison of two instruments' performance to each other requires somewhat complex statistical calculations, a computer code (Lab Stats Pack®) is used to simplify the calculations. Lab Stats Pack® also develops control charts that may be used for continuous verification of on-line instrument performance.
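The statistics such a comparison involves, paired differences between the two instruments plus 3-sigma control limits on those differences, can be sketched with invented readings. This is a generic sketch, not Lab Stats Pack itself:

```python
import statistics as st

# Invented paired readings: on-line instrument vs bench instrument
online = [10.1, 9.8, 10.3, 10.0, 9.9, 10.2, 10.1, 9.7]
bench  = [10.0, 9.9, 10.1, 10.1, 9.8, 10.1, 10.0, 9.8]

diff = [o - b for o, b in zip(online, bench)]
bias = st.mean(diff)                 # average disagreement
sd = st.stdev(diff)                  # sample std dev of the differences
ucl, lcl = bias + 3 * sd, bias - 3 * sd   # control-chart limits

in_control = all(lcl <= d <= ucl for d in diff)
print(in_control)  # -> True
```

A point falling outside the limits on the running chart would flag the on-line instrument for recalibration against the reference.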
A nested virtualization tool for information technology practical education.
Pérez, Carlos; Orduña, Juan M; Soriano, Francisco R
2016-01-01
A common problem of some information technology courses is the difficulty of providing practical exercises. Although different approaches have been followed to solve this problem, it is still an open issue, especially in security and computer network courses. This paper proposes NETinVM, a tool based on nested virtualization that includes a fully functional lab, comprising several computers and networks, in a single virtual machine. It also analyzes and evaluates how the tool has been used in different teaching environments. The results show that it makes it possible to perform demos, labs and practical exercises that would otherwise be unfeasible and that are greatly appreciated by the students. In addition, its portability makes it possible to reproduce classroom activities, as well as to support the students' autonomous work.
LabLessons: Effects of Electronic Prelabs on Student Engagement and Performance
ERIC Educational Resources Information Center
Gryczka, Patrick; Klementowicz, Edward; Sharrock, Chappel; Maxfield, MacRae; Montclare, Jin Kim
2016-01-01
Lab instructors, for both high school and undergraduate college level courses, face issues of constricted time within the lab period and limited student engagement with prelab materials. To address these issues, an online prelab delivery system named LabLessons is developed and tested out in a high school chemistry classroom. The system…
Van Oosten, Ellen B; Buse, Kathleen; Bilimoria, Diana
2017-01-01
Innovative professional development approaches are needed to address the ongoing lack of women leaders in science, technology, engineering, and math (STEM) careers. Developed from the research on women who persist in engineering and computing professions and essential elements of women's leadership development, the Leadership Lab for Women in STEM Program was launched in 2014. The Leadership Lab was created as a research-based leadership development program, offering 360-degree feedback, coaching, and practical strategies aimed at increasing the advancement and retention of women in the STEM professions. The goal is to provide women with knowledge, tools and a supportive learning environment to help them navigate, achieve, flourish, and catalyze organizational change in male-dominated and technology-driven organizations. This article describes the importance of creating unique development experiences for women in STEM fields, the genesis of the Leadership Lab, the design and content of the program, and the outcomes for the participants.
Mobile Robot Lab Project to Introduce Engineering Students to Fault Diagnosis in Mechatronic Systems
ERIC Educational Resources Information Center
Gómez-de-Gabriel, Jesús Manuel; Mandow, Anthony; Fernández-Lozano, Jesús; García-Cerezo, Alfonso
2015-01-01
This paper proposes lab work for learning fault detection and diagnosis (FDD) in mechatronic systems. These skills are important for engineering education because FDD is a key capability of competitive processes and products. The intended outcome of the lab work is that students become aware of the importance of faulty conditions and learn to…
Precision of computer-assisted core decompression drilling of the knee.
Beckmann, J; Goetz, J; Bäthis, H; Kalteis, T; Grifka, J; Perlick, L
2006-06-01
Core decompression by exact drilling into the ischemic areas is the treatment of choice in early stages of osteonecrosis of the femoral condyle. Computer-aided surgery might enhance the precision of the drilling and lower the radiation exposure time of both staff and patients. The aim of this study was to evaluate the precision of the fluoroscopically based VectorVision navigation system in an in vitro model. Thirty sawbones were prepared with a defect filled with a radiopaque gypsum sphere mimicking the osteonecrosis. Twenty sawbones were drilled under the guidance of the intraoperative navigation system VectorVision (BrainLAB, Munich, Germany); ten sawbones were drilled under fluoroscopic control only. A statistically significant difference was found in the distance to the desired mid-point of the lesion, with a mean of 0.58 mm in the navigated group and 0.98 mm in the control group. Significant differences were also found in the number of drilling corrections and the radiation time needed. The fluoroscopically based VectorVision navigation system thus shows high feasibility and precision for computer-guided drilling, with a simultaneous reduction of radiation time, and could therefore be integrated into clinical routine.
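The accuracy metric reported above is the distance from each drill endpoint to the target centre, averaged per group. The sketch below illustrates that computation with invented coordinates (mm), not the study's measurements:

```python
import math

target = (0.0, 0.0, 0.0)   # desired mid-point of the lesion (origin, by choice)

def mean_error(tips):
    """Mean Euclidean distance (mm) from drill-tip endpoints to the target."""
    return sum(math.dist(p, target) for p in tips) / len(tips)

# Invented endpoint coordinates for the two groups
navigated = [(0.3, 0.2, 0.4), (0.1, 0.5, 0.2), (0.4, 0.1, 0.3)]
freehand  = [(0.6, 0.5, 0.4), (0.9, 0.2, 0.5), (0.3, 0.8, 0.6)]

print(mean_error(navigated) < mean_error(freehand))  # -> True
```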
Data Collection with Linux in the Undergraduate Physics Lab
NASA Astrophysics Data System (ADS)
Ramey, R. Dwayne
2004-11-01
Electronic data devices such as photogates can greatly facilitate data collection in the undergraduate physics laboratory. Unfortunately, these devices have several practical drawbacks. While the photogates themselves are not particularly expensive, manufacturers of these devices have created intermediary hardware devices for data buffering and manipulation. These devices, while useful in some contexts, greatly increase the overall price of data collection and, through the use of proprietary software, limit the ability of the end user to customize the software. As an alternative, I outline the procedure for establishing a computer-based data collection system that consists of open-source software and user-constructed connections. The data collection system consists of the wiring needed to connect a data device to a computer and the software needed to collect and manipulate data. Data devices can be connected to a computer either through the USB port or through the game port of a sound card. Software capable of collecting and manipulating the data from a photogate-type device on a Linux system has been developed and will be discussed. Results for typical undergraduate photogate-based experiments will be shown, and error limits and data collection rates will be discussed for both the game port and USB connections.
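Once the software has the blocked/unblocked timestamps from a gate, the analysis is simple timing arithmetic: speed is the flag length divided by the transit time. The flag length and timestamps below are invented for illustration:

```python
FLAG_LENGTH_M = 0.05   # assumed length of the card that breaks the beam

def speed_from_gate(t_block, t_unblock):
    """Average speed (m/s) of the flag while it blocks the photogate beam."""
    return FLAG_LENGTH_M / (t_unblock - t_block)

# Invented timestamps (s): beam blocked at t = 1.000, cleared at t = 1.025
v = speed_from_gate(1.000, 1.025)   # 0.05 m over 0.025 s = 2.0 m/s
```

The error limits the abstract mentions follow from the timestamp resolution: an uncertainty dt in each timestamp propagates to roughly v * (2 * dt / transit_time) in the speed.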
A Computer Based Education (CBE) Program for Middle School Mathematics Intervention
ERIC Educational Resources Information Center
Gulley, Bill
2009-01-01
A Computer Based Education (CBE) program for intervention mathematics was developed, used, and modified over a period of three years in a computer lab at an Arizona Title I middle school. The program is described along with a rationale for the need, design, and use of such a program. Data was collected in the third year and results of the program…
Novel 3-D Computer Model Can Help Predict Pathogens’ Roles in Cancer | Poster
To understand how bacterial and viral infections contribute to human cancers, four NCI at Frederick scientists turned not to the lab bench, but to a computer. The team has created the world’s first—and currently, only—3-D computational approach for studying interactions between pathogen proteins and human proteins based on a molecular adaptation known as interface mimicry.
Generic Software for Emulating Multiprocessor Architectures.
1985-05-01
Recoverable details from the scanned report documentation page: Report AD-A157 662, "Generic Software for Emulating Multiprocessor Architectures," MIT Laboratory for Computer Science, 545 Technology Square, Cambridge, MA 02139. Keywords: computer architecture, emulation, simulation, dataflow.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vigil,Benny Manuel; Ballance, Robert; Haskell, Karen
Cielo is a massively parallel supercomputer funded by the DOE/NNSA Advanced Simulation and Computing (ASC) program, and operated by the Alliance for Computing at Extreme Scale (ACES), a partnership between Los Alamos National Laboratory (LANL) and Sandia National Laboratories (SNL). The primary Cielo compute platform is physically located at Los Alamos National Laboratory. This Cielo Computational Environment Usage Model documents the capabilities and the environment to be provided for the Q1 FY12 Level 2 Cielo Capability Computing (CCC) Platform Production Readiness Milestone. This document describes specific capabilities, tools, and procedures to support both local and remote users. The model is focused on the needs of the ASC user working in the secure computing environments at Lawrence Livermore National Laboratory (LLNL), Los Alamos National Laboratory, or Sandia National Laboratories, but also addresses the needs of users working in the unclassified environment. The Cielo Computational Environment Usage Model maps the provided capabilities to the tri-Lab ASC Computing Environment (ACE) Version 8.0 requirements. The ACE requirements reflect the high performance computing requirements for the Production Readiness Milestone user environment capabilities of the ASC community. A description of ACE requirements met, and those requirements that are not met, are included in each section of this document. The Cielo Computing Environment, along with the ACE mappings, has been issued and reviewed throughout the tri-Lab community.
Evidence of Two Functionally Distinct Ornithine Decarboxylation Systems in Lactic Acid Bacteria
Romano, Andrea; Trip, Hein; Lonvaud-Funel, Aline; Lolkema, Juke S.
2012-01-01
Biogenic amines are low-molecular-weight organic bases whose presence in food can result in health problems. The biosynthesis of biogenic amines in fermented foods mostly proceeds through amino acid decarboxylation carried out by lactic acid bacteria (LAB), but not all systems leading to biogenic amine production by LAB have been thoroughly characterized. Here, putative ornithine decarboxylation pathways consisting of a putative ornithine decarboxylase and an amino acid transporter were identified in LAB by strain collection screening and database searches. The decarboxylases were produced in heterologous hosts and purified and characterized in vitro, whereas transporters were heterologously expressed in Lactococcus lactis and functionally characterized in vivo. Amino acid decarboxylation by whole cells of the original hosts was determined as well. We concluded that two distinct types of ornithine decarboxylation systems exist in LAB. One is composed of an ornithine decarboxylase coupled to an ornithine/putrescine transmembrane exchanger. Their combined activities result in the extracellular release of putrescine. This typical amino acid decarboxylation system is present in only a few LAB strains and may contribute to metabolic energy production and/or pH homeostasis. The second system is widespread among LAB. It is composed of a decarboxylase active on ornithine and l-2,4-diaminobutyric acid (DABA) and a transporter that mediates unidirectional transport of ornithine into the cytoplasm. Diamines that result from this second system are retained within the cytosol. PMID:22247134
Artificial Intelligence (AI) Center of Excellence at the University of Pennsylvania
1995-07-01
Recoverable excerpts from the scanned report: "…that controls impact forces." Technical reports by Gerda L. Kamberova: Robust Location Estimation for MLR and Non-MLR Distributions (Dissertation Proposal), MS-CIS-92-28; A Bayesian Approach to Computer Vision Problems, MS-CIS-92-29, GRASP LAB 310 ("The object of our study is the Bayesian approach in…"); Robust Location Estimation for MLR and Non-MLR Distributions (Dissertation), MS-CIS-92-93, GRASP LAB 340 ("We study the problem of estimating an unknown…").
STS-105 Crew Training in VR Lab
2001-03-15
JSC2001-00754 (15 March 2001) --- Astronaut Patrick G. Forrester, STS-105 mission specialist, uses specialized gear in the virtual reality lab at the Johnson Space Center (JSC) to train for his duties aboard the Space Shuttle Discovery. This type of virtual reality training allows the astronauts to wear a helmet and special gloves while looking at computer displays simulating actual movements around the various locations on the International Space Station (ISS) hardware with which they will be working.
STS-109 Crew Training in VR Lab, Building 9
2001-08-08
JSC2001-E-24452 (8 August 2001) --- Astronauts John M. Grunsfeld (left), STS-109 payload commander, and Nancy J. Currie, mission specialist, use the virtual reality lab at the Johnson Space Center (JSC) to train for some of their duties aboard the Space Shuttle Columbia. This type of computer interface paired with virtual reality training hardware and software helps to prepare the entire team to perform its duties during the fourth Hubble Space Telescope (HST) servicing mission.
STS-111 Training in VR lab with Expedition IV and V Crewmembers
2001-10-18
JSC2001-E-39082 (18 October 2001) --- Cosmonaut Valeri G. Korzun (left), Expedition Five mission commander, and astronaut Carl E. Walz, Expedition Four flight engineer, use the virtual reality lab at the Johnson Space Center (JSC) to train for their duties on the International Space Station (ISS). This type of computer interface paired with virtual reality training hardware and software helps prepare the entire team for dealing with ISS elements. Korzun represents Rosaviakosmos.
The Role of Fixation and Visual Attention in Object Recognition.
1995-01-01
Recoverable excerpts from the scanned report's reference list: "…computers", Technical Report, Artificial Intelligence Lab, M.I.T., AI-Memo-915, June 1986; [29] D.P. Huttenlocher and S. Ullman, "Object Recognition Using…"; "…attention", Technical Report, Artificial Intelligence Lab, M.I.T., AI-Memo-770, Jan 1984; [35] E. Krotkov, K. Henriksen and R. Kories, "Stereo…". MIT Artificial Intelligence Laboratory; approved for public release, distribution unlimited.
Allergen screening bioassays: recent developments in lab-on-a-chip and lab-on-a-disc systems.
Ho, Ho-pui; Lau, Pui-man; Kwok, Ho-chin; Wu, Shu-yuen; Gao, Minghui; Cheung, Anthony Ka-lun; Chen, Qiulan; Wang, Guanghui; Kwan, Yiu-wa; Wong, Chun-kwok; Kong, Siu-kai
2014-01-01
Allergies occur when a person's immune system mounts an abnormal response, with or without IgE, to a normally harmless substance called an allergen. The standard skin-prick test introduces suspected allergens into the skin with lancets in order to trigger allergic reactions. This test is unpleasant and sometimes life-threatening. New tools such as lab-on-a-chip and lab-on-a-disc, which rely on microfabrication, are designed for allergy testing. These systems provide benefits such as short analysis times, enhanced sensitivity, simplified procedures, minimal consumption of sample and reagents and low cost. This article gives a summary of these systems. In particular, a cell-based assay detecting both the IgE- and non-IgE-type triggers through the study of degranulation in a centrifugal microfluidic system is highlighted.
NASA Astrophysics Data System (ADS)
Orlić, Ivica; Mekterović, Darko; Mekterović, Igor; Ivošević, Tatjana
2015-11-01
VIBA-Lab is a computer program originally developed by the author and co-workers at the National University of Singapore (NUS) as an interactive software package for simulation of Particle Induced X-ray Emission and Rutherford Backscattering Spectra. The original program has been redeveloped into VIBA-Lab 3.0, in which the user can perform semi-quantitative analysis by comparing simulated and measured spectra as well as simulate 2D elemental maps for a given 3D sample composition. The latest version has a new and more versatile user interface. It also has the latest data set of fundamental parameters such as Coster-Kronig transition rates, fluorescence yields, mass absorption coefficients and ionization cross sections for K and L lines in a wider energy range than the original program. Our short-term plan is to introduce a routine for quantitative analysis for multiple PIXE and XRF excitations. VIBA-Lab is an excellent teaching tool for students and researchers using PIXE and RBS techniques. At the same time the program helps when planning an experiment and when optimizing experimental parameters such as incident ions, their energy, detector specifications, filters, geometry, etc. By "running" a virtual experiment the user can test various scenarios until the optimal PIXE and BS spectra are obtained and in this way save a lot of expensive machine time.
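One of the fundamental parameters mentioned above, the mass absorption coefficient, enters X-ray simulation through the Beer-Lambert law for filter and sample attenuation. A minimal sketch (the coefficient and thickness below are placeholder numbers, not VIBA-Lab tabulated values):

```python
import math

def transmitted_fraction(mu_over_rho_cm2_per_g, density_g_per_cm3, thickness_cm):
    """Beer-Lambert attenuation of an X-ray line: I/I0 = exp(-(mu/rho) * rho * x)."""
    return math.exp(-mu_over_rho_cm2_per_g * density_g_per_cm3 * thickness_cm)

# Placeholder numbers for an aluminum absorber foil (illustrative only)
frac = transmitted_fraction(5.0, 2.70, 0.01)
```

Evaluating this per emission line and per filter is how a simulator weighs detector filters when the user varies geometry and filter choices.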
NASA Technical Reports Server (NTRS)
Arias, Adriel (Inventor)
2016-01-01
The main objective of the Holodeck Testbed is to create a cost-effective, realistic, and highly immersive environment that can be used to train astronauts, carry out engineering analysis, develop procedures, and support various operations tasks. Currently, the Holodeck Testbed allows users to step into a simulated ISS (International Space Station) and interact with objects, as well as perform Extra Vehicular Activities (EVA) on the surface of the Moon or Mars. The Holodeck Testbed is using the products being developed in the Hybrid Reality Lab (HRL). The HRL is combining technologies related to merging physical models with photo-realistic visuals to create a realistic and highly immersive environment. The lab also investigates technologies and concepts that are needed to allow it to be integrated with other testbeds, such as the gravity offload capability provided by the Active Response Gravity Offload System (ARGOS). My two main duties were to develop and animate models for use in the HRL environments and to work on a new way to interface with computers using Brain Computer Interface (BCI) technology. On my first task, I was able to create precise computer virtual tool models (accurate down to the thousandths or hundredths of an inch). To make these tools even more realistic, I produced animations for them so they would have the same mechanical features as the tools in real life. The computer models were also used to create 3D printed replicas that will be outfitted with tracking sensors. The sensors will allow the 3D printed models to align precisely with the computer models in the physical world and provide people with haptic/tactile feedback while wearing a VR (Virtual Reality) headset and interacting with the tools. Toward the end of my internship, the lab bought a professional-grade 3D scanner. With this, I was able to replicate more intricate tools at a much more time-effective rate.
The second task was to investigate the use of BCI to control objects inside the hybrid reality ISS environment. This task looked at using an Electroencephalogram (EEG) headset to collect brain state data that could be mapped to commands that a computer could execute. On this task, I had a setback with the hardware, which stopped working and was returned to the vendor for repair. However, I was still able to collect some data, process it, and start creating correlation algorithms between the electrical patterns in the brain and the commands we wanted the computer to carry out. I also carried out a test to investigate the comfort of the headset when worn for a long time. The knowledge gained will benefit me in my future career. I learned how to use various modeling and programming tools that included Blender, Maya, Substance Painter, Artec Studio, Github, and Unreal Engine 4. I learned how to use a professional-grade 3D scanner and 3D printer. On the BCI project I learned about data mining and how to create correlation algorithms. I also supported various demos, including a live demo of the hybrid reality lab capabilities at ComicPalooza. This internship has given me a good look into engineering at NASA. I developed a more thorough understanding of engineering and my overall confidence has grown. I have also realized that any problem can be fixed if you try hard enough, and as an engineer it is your job not only to fix problems but to embrace coming up with solutions to them.
High Precision Prediction of Functional Sites in Protein Structures
Buturovic, Ljubomir; Wong, Mike; Tang, Grace W.; Altman, Russ B.; Petkovic, Dragutin
2014-01-01
We address the problem of assigning biological function to solved protein structures. Computational tools play a critical role in identifying potential active sites and informing screening decisions for further lab analysis. A critical parameter in the practical application of computational methods is the precision, or positive predictive value. Precision measures the level of confidence the user should have in a particular computed functional assignment. Low precision annotations lead to futile laboratory investigations and waste scarce research resources. In this paper we describe an advanced version of the protein function annotation system FEATURE, which achieved 99% precision and average recall of 95% across 20 representative functional sites. The system uses a Support Vector Machine classifier operating on the microenvironment of physicochemical features around an amino acid. We also compared performance of our method with state-of-the-art sequence-level annotator Pfam in terms of precision, recall and localization. To our knowledge, no other functional site annotator has been rigorously evaluated against these key criteria. The software and predictive models are incorporated into the WebFEATURE service at http://feature.stanford.edu/wf4.0-beta. PMID:24632601
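Precision (positive predictive value) and recall, the figures of merit quoted above, are straightforward to compute from binary site predictions. A minimal sketch of the metrics themselves, not the FEATURE codebase:

```python
def precision_recall(y_true, y_pred):
    """Precision (positive predictive value) and recall for binary labels."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t and p)
    fp = sum(1 for t, p in zip(y_true, y_pred) if not t and p)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t and not p)
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    return precision, recall

# Hypothetical labels: 1 = functional site present at this residue
p, r = precision_recall([1, 1, 1, 0, 0], [1, 1, 0, 1, 0])
```

High precision matters here because, as the abstract notes, each positive prediction may trigger a laboratory investigation; a false positive wastes bench time in a way a false negative does not.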
A Reduced Order Model for Whole-Chip Thermal Analysis of Microfluidic Lab-on-a-Chip Systems
Wang, Yi; Song, Hongjun; Pant, Kapil
2013-01-01
This paper presents a Krylov subspace projection-based Reduced Order Model (ROM) for whole microfluidic chip thermal analysis, including conjugate heat transfer. Two key steps in the reduced order modeling procedure are described in detail, including (1) the acquisition of a 3D full-scale computational model in the state-space form to capture the dynamic thermal behavior of the entire microfluidic chip; and (2) the model order reduction using the Block Arnoldi algorithm to markedly lower the dimension of the full-scale model. Case studies using a practically relevant thermal microfluidic chip are undertaken to establish the capability and to evaluate the computational performance of the reduced order modeling technique. The ROM is compared against the full-scale model and exhibits good agreement in spatiotemporal thermal profiles (<0.5% relative error in pertinent time scales) and over three orders-of-magnitude acceleration in computational speed. The salient model reusability and real-time simulation capability render it amenable for operational optimization and in-line thermal control and management of microfluidic systems and devices. PMID:24443647
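The Block Arnoldi step can be sketched as follows: build an orthonormal basis V for the block Krylov subspace generated by the state matrix and input matrix, then project the state-space model onto it. A toy sketch with a random stand-in system (not the paper's thermal model; a production code would also handle rank deflation and may use shifted/inverted Krylov subspaces for better moment matching):

```python
import numpy as np

def block_arnoldi(A, B, m):
    """Orthonormal basis for the block Krylov subspace span{B, AB, ..., A^(m-1) B}."""
    V0, _ = np.linalg.qr(B)
    blocks = [V0]
    W = V0
    for _ in range(m - 1):
        W = A @ W
        for Vj in blocks:                 # modified Gram-Schmidt against earlier blocks
            W = W - Vj @ (Vj.T @ W)
        W, _ = np.linalg.qr(W)            # orthonormalize the new block
        blocks.append(W)
    return np.hstack(blocks)

rng = np.random.default_rng(0)
n, p, m = 50, 2, 5                        # 50 states, 2 inputs, 5 Krylov blocks
A = -np.eye(n) + 0.01 * rng.standard_normal((n, n))  # stand-in stable system matrix
B = rng.standard_normal((n, p))
V = block_arnoldi(A, B, m)

A_r = V.T @ A @ V                         # reduced 10x10 state matrix
B_r = V.T @ B                             # reduced input matrix
```

Simulating the reduced pair (A_r, B_r) instead of the full (A, B) is what yields the orders-of-magnitude speedup the abstract reports.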
Hoehl, Melanie M; Weißert, Michael; Dannenberg, Arne; Nesch, Thomas; Paust, Nils; von Stetten, Felix; Zengerle, Roland; Slocum, Alexander H; Steigert, Juergen
2014-06-01
This paper introduces a disposable battery-driven heating system for loop-mediated isothermal DNA amplification (LAMP) inside a centrifugally-driven DNA purification platform (LabTube). We demonstrate LabTube-based fully automated DNA purification of as few as 100 cell-equivalents of verotoxin-producing Escherichia coli (VTEC) in water, milk and apple juice in a laboratory centrifuge, followed by integrated and automated LAMP amplification with a reduction of hands-on time from 45 to 1 min. The heating system consists of two parallel SMD thick film resistors and an NTC as heating and temperature sensing elements. They are driven by a 3 V battery and controlled by a microcontroller. The LAMP reagents are stored in the elution chamber and the amplification starts immediately after the eluate is purged into the chamber. The LabTube, including a microcontroller-based heating system, demonstrates contamination-free and automated sample-to-answer nucleic acid testing within a laboratory centrifuge. The heating system can be easily parallelized within one LabTube and it is deployable for a variety of heating and electrical applications.
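The two firmware ingredients described above, converting NTC resistance to temperature and holding a LAMP setpoint (typically around 65 °C), can be sketched as follows. All component values here are hypothetical defaults, not the LabTube design values:

```python
import math

def ntc_temperature_c(r_ohm, r0_ohm=10_000.0, t0_c=25.0, beta=3950.0):
    """Beta-model NTC thermistor: 1/T = 1/T0 + ln(R/R0)/beta, temperatures in kelvin."""
    t0_k = t0_c + 273.15
    inv_t = 1.0 / t0_k + math.log(r_ohm / r0_ohm) / beta
    return 1.0 / inv_t - 273.15

def heater_on(temp_c, setpoint_c=65.0, hysteresis_c=0.5):
    """Bang-bang decision a microcontroller loop might make each control tick."""
    return temp_c < setpoint_c - hysteresis_c
```

Each tick the microcontroller would read the NTC divider voltage, derive `r_ohm`, call `ntc_temperature_c`, and switch the thick-film resistors according to `heater_on`.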
Luján, J L; Crago, P E
2004-11-01
Neuroprosthetic systems can be used to restore hand grasp and wrist control in individuals with C5/C6 spinal cord injury. A computer-based system was developed for the implementation, tuning and clinical assessment of neuroprosthetic controllers, using off-the-shelf hardware and software. The computer system turned a Pentium III PC running Windows NT into a non-dedicated, real-time system for the control of neuroprostheses. Software execution (written using the high-level programming languages LabVIEW and MATLAB) was divided into two phases: training and real-time control. During the training phase, the computer system collected input/output data by stimulating the muscles and measuring the muscle outputs in real-time, analysed the recorded data, generated a set of training data and trained an artificial neural network (ANN)-based controller. During real-time control, the computer system stimulated the muscles using stimulus pulsewidths predicted by the ANN controller in response to a sampled input from an external command source, to provide independent control of hand grasp and wrist posture. System timing was stable, reliable and capable of providing muscle stimulation at frequencies up to 24 Hz. To demonstrate the application of the test-bed, an ANN-based controller was implemented with three inputs and two independent channels of stimulation. The ANN controller's ability to control hand grasp and wrist angle independently was assessed by quantitative comparison of the outputs of the stimulated muscles with a set of desired grasp or wrist postures determined by the command signal. Controller performance results were mixed, but the platform provided the tools to implement and assess future controller designs.
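The two-phase scheme, offline ANN training followed by real-time prediction of stimulus pulsewidths from a sampled command, can be sketched with a small NumPy network: three command inputs, two output channels, as in the demonstration controller. The command-to-pulsewidth mapping below is invented for illustration; the actual system was built in LabVIEW and MATLAB:

```python
import numpy as np

rng = np.random.default_rng(1)

# --- training phase: synthetic command -> pulsewidth data (invented mapping) ---
X = rng.uniform(0.0, 1.0, (200, 3))                    # 3 command inputs
Y = np.stack([0.5 * X[:, 0] + 0.2 * X[:, 1],           # channel 1 pulsewidth
              0.3 * X[:, 1] + 0.4 * X[:, 2]], axis=1)  # channel 2 pulsewidth

W1 = 0.5 * rng.standard_normal((3, 8)); b1 = np.zeros(8)
W2 = 0.5 * rng.standard_normal((8, 2)); b2 = np.zeros(2)
lr = 0.1
for _ in range(3000):                                  # full-batch gradient descent
    H = np.tanh(X @ W1 + b1)
    P = H @ W2 + b2
    err = P - Y
    dH = (err @ W2.T) * (1.0 - H ** 2)                 # backprop through tanh
    W2 -= lr * (H.T @ err) / len(X); b2 -= lr * err.mean(axis=0)
    W1 -= lr * (X.T @ dH) / len(X); b1 -= lr * dH.mean(axis=0)

# --- real-time phase: map one sampled command to two stimulus pulsewidths ---
def predict(command):
    h = np.tanh(command @ W1 + b1)
    return h @ W2 + b2
```

In the real system, `predict` would run inside the stimulation loop at up to 24 Hz, with each output scaled to a stimulator pulsewidth.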
Field-programmable lab-on-a-chip based on microelectrode dot array architecture.
Wang, Gary; Teng, Daniel; Lai, Yi-Tse; Lu, Yi-Wen; Ho, Yingchieh; Lee, Chen-Yi
2014-09-01
The fundamentals of electrowetting-on-dielectric (EWOD) digital microfluidics are very strong: advantageous capability in the manipulation of fluids, small test volumes, precise dynamic control and detection, and microscale systems. These advantages are very important for future biochip developments, but the development of EWOD microfluidics has been hindered by the absence of integrated detector technology, standard commercial components, on-chip sample preparation, standard manufacturing technology, and end-to-end system integration. A field-programmable lab-on-a-chip (FPLOC) system based on microelectrode dot array (MEDA) architecture is presented in this research. The MEDA architecture proposes a standard EWOD microfluidic component called the 'microelectrode cell', which can be dynamically configured into microfluidic components to perform microfluidic operations of the biochip. A proof-of-concept prototype FPLOC, containing a 30 × 30 MEDA, was developed by using generic integrated circuits computer-aided design tools, and it was manufactured with standard low-voltage complementary metal-oxide-semiconductor technology, which allows smooth on-chip integration of microfluidics and microelectronics. By integrating 900 droplet detection circuits into microelectrode cells, the FPLOC has achieved large-scale integration of microfluidics and microelectronics. Compared to the full-custom and bottom-up design methods, the FPLOC provides a hierarchical top-down design approach, field-programmability and dynamic manipulation of droplets for advanced microfluidic operations.
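The MEDA idea of dynamically grouping identical microelectrode cells into a larger virtual electrode that drags a droplet can be caricatured in a few lines. This is a toy model only; real EWOD actuation involves voltage sequencing, droplet dynamics, and the on-chip detection circuits described above:

```python
class MedaGrid:
    """Toy MEDA: a droplet occupies a size x size block of microelectrode cells."""

    def __init__(self, rows=30, cols=30):
        self.rows, self.cols = rows, cols
        self.droplet = None          # top-left (row, col) of the active cell block
        self.size = 0

    def dispense(self, row, col, size=3):
        """Place a droplet as an activated block of cells."""
        self.droplet = (row, col)
        self.size = size

    def move(self, drow, dcol):
        """Activate the adjacent block; the droplet follows the actuated cells."""
        r, c = self.droplet
        r2, c2 = r + drow, c + dcol
        if 0 <= r2 <= self.rows - self.size and 0 <= c2 <= self.cols - self.size:
            self.droplet = (r2, c2)  # moves off the array edge are ignored

grid = MedaGrid()
grid.dispense(10, 10)
grid.move(0, 1)                      # shift the droplet one cell to the right
```

Because every cell is identical and individually addressable, the same grid can be reconfigured at run time into mixers, splitters, or transport paths, which is the "field-programmable" part of the FPLOC.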
Enabling Smart Manufacturing Research and Development using a Product Lifecycle Test Bed.
Helu, Moneer; Hedberg, Thomas
2015-01-01
Smart manufacturing technologies require a cyber-physical infrastructure to collect and analyze data and information across the manufacturing enterprise. This paper describes a concept for a product lifecycle test bed built on a cyber-physical infrastructure that enables smart manufacturing research and development. The test bed consists of a Computer-Aided Technologies (CAx) Lab and a Manufacturing Lab that interface through the product model creating a "digital thread" of information across the product lifecycle. The proposed structure and architecture of the test bed is presented, which highlights the challenges and requirements of implementing a cyber-physical infrastructure for manufacturing. The novel integration of systems across the product lifecycle also helps identify the technologies and standards needed to enable interoperability between design, fabrication, and inspection. Potential research opportunities enabled by the test bed are also discussed, such as providing publicly accessible CAx and manufacturing reference data, virtual factory data, and a representative industrial environment for creating, prototyping, and validating smart manufacturing technologies.