NASA's computer science research program
NASA Technical Reports Server (NTRS)
Larsen, R. L.
1983-01-01
Following a major assessment of NASA's computing technology needs, a new program of computer science research has been initiated by the Agency. The program includes work in concurrent processing, management of large scale scientific databases, software engineering, reliable computing, and artificial intelligence. The program is driven by applications requirements in computational fluid dynamics, image processing, sensor data management, real-time mission control and autonomous systems. It consists of university research, in-house NASA research, and NASA's Research Institute for Advanced Computer Science (RIACS) and Institute for Computer Applications in Science and Engineering (ICASE). The overall goal is to provide the technical foundation within NASA to exploit advancing computing technology in aerospace applications.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-07
... Rehabilitation Research--Disability and Rehabilitation Research Projects--Inclusive Cloud and Web Computing... Rehabilitation Research Projects (DRRPs)--Inclusive Cloud and Web Computing Notice inviting applications for new...#DRRP . Priorities: Priority 1--DRRP on Inclusive Cloud and Web Computing-- is from the notice of final...
NASA Technical Reports Server (NTRS)
1987-01-01
Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period April 1, 1986 through September 30, 1986 is summarized.
NASA Technical Reports Server (NTRS)
1988-01-01
This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period April 1, 1988 through September 30, 1988.
NASA Technical Reports Server (NTRS)
1984-01-01
Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science during the period October 1, 1983 through March 31, 1984 is summarized.
NASA Technical Reports Server (NTRS)
1987-01-01
Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period October 1, 1986 through March 31, 1987 is summarized.
WE-B-BRD-01: Innovation in Radiation Therapy Planning II: Cloud Computing in RT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moore, K; Kagadis, G; Xing, L
As defined by the National Institute of Standards and Technology, cloud computing is “a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction.” Despite the omnipresent role of computers in radiotherapy, cloud computing has yet to achieve widespread adoption in clinical or research applications, though the transition to such “on-demand” access is underway. As this transition proceeds, new opportunities for aggregate studies and efficient use of computational resources are set against new challenges in patient privacy protection, data integrity, and management of clinical informatics systems. In this Session, current and future applications of cloud computing and distributed computational resources will be discussed in the context of medical imaging, radiotherapy research, and clinical radiation oncology applications. Learning Objectives: 1. Understand basic concepts of cloud computing. 2. Understand how cloud computing could be used for medical imaging applications. 3. Understand how cloud computing could be employed for radiotherapy research. 4. Understand how clinical radiotherapy software applications would function in the cloud.
Some research advances in computer graphics that will enhance applications to engineering design
NASA Technical Reports Server (NTRS)
Allan, J. J., III
1975-01-01
Research in man/machine interactions and graphics hardware/software that will enhance applications to engineering design is described. Research aspects of executive systems, command languages, and networking used in the computer applications laboratory are mentioned. Finally, a few areas where little or no research is being done are identified.
NASA Technical Reports Server (NTRS)
1989-01-01
Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period October 1, 1988 through March 31, 1989 is summarized.
A Test-Bed of Secure Mobile Cloud Computing for Military Applications
2016-09-13
Applications such as searching databases are typical examples of mobile cloud computing (MCC), which has many applications in the military. [Final Report: A Test-bed of Secure Mobile Cloud Computing for Military Applications; Army Research Office, P.O. Box 12211, Research Triangle Park, NC 27709-2211; reporting period 1 Aug 2014 to 31 Jul 2016; approved for public release, distribution unlimited. Keywords: test-bed, mobile cloud computing, security, military applications.]
Research Conducted at the Institute for Computer Applications in Science and Engineering
NASA Technical Reports Server (NTRS)
1997-01-01
This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period 1 Oct. 1996 - 31 Mar. 1997.
Applied Computational Fluid Dynamics at NASA Ames Research Center
NASA Technical Reports Server (NTRS)
Holst, Terry L.; Kwak, Dochan (Technical Monitor)
1994-01-01
The field of Computational Fluid Dynamics (CFD) has advanced to the point where it can now be used for many applications in fluid mechanics research and aerospace vehicle design. A few applications being explored at NASA Ames Research Center will be presented and discussed. The examples presented will range in speed from hypersonic to low speed incompressible flow applications. Most of the results will be from numerical solutions of the Navier-Stokes or Euler equations in three space dimensions for general geometry applications. Computational results will be used to highlight the presentation as appropriate. Advances in computational facilities including those associated with NASA's CAS (Computational Aerosciences) Project of the Federal HPCC (High Performance Computing and Communications) Program will be discussed. Finally, opportunities for future research will be presented and discussed. All material will be taken from non-sensitive, previously-published and widely-disseminated work.
Computer Applications in Reading. Third Edition.
ERIC Educational Resources Information Center
Blanchard, Jay S.; And Others
Intended as a reference for researchers, teachers, and administrators, this book chronicles research, programs, and uses of computers in reading. Chapter 1 provides a broad view of computer applications in education, while Chapter 2 provides annotated references for computer based reading and language arts programs for children and adults in…
Computers in aeronautics and space research at the Lewis Research Center
NASA Technical Reports Server (NTRS)
1991-01-01
This brochure presents a general discussion of the role of computers in aerospace research at NASA's Lewis Research Center (LeRC). Four particular areas of computer applications are addressed: computer modeling and simulation, computer assisted engineering, data acquisition and analysis, and computer controlled testing.
Research in progress at the Institute for Computer Applications in Science and Engineering
NASA Technical Reports Server (NTRS)
1987-01-01
This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period April 1, 1987 through October 1, 1987.
NASA Technical Reports Server (NTRS)
1993-01-01
This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics and computer science during the period April 1, 1993 through September 30, 1993. The major categories of the current ICASE research program are: (1) applied and numerical mathematics, including numerical analysis and algorithm development; (2) theoretical and computational research in fluid mechanics in selected areas of interest to LaRC, including acoustics and combustion; (3) experimental research in transition and turbulence and aerodynamics involving LaRC facilities and scientists; and (4) computer science.
RAPPORT: running scientific high-performance computing applications on the cloud.
Cohen, Jeremy; Filippis, Ioannis; Woodbridge, Mark; Bauer, Daniela; Hong, Neil Chue; Jackson, Mike; Butcher, Sarah; Colling, David; Darlington, John; Fuchs, Brian; Harvey, Matt
2013-01-28
Cloud computing infrastructure is now widely used in many domains, but one area where there has been more limited adoption is research computing, in particular for running scientific high-performance computing (HPC) software. The Robust Application Porting for HPC in the Cloud (RAPPORT) project took advantage of existing links between computing researchers and application scientists in the fields of bioinformatics, high-energy physics (HEP) and digital humanities, to investigate running a set of scientific HPC applications from these domains on cloud infrastructure. In this paper, we focus on the bioinformatics and HEP domains, describing the applications and target cloud platforms. We conclude that, while there are many factors that need consideration, there is no fundamental impediment to the use of cloud infrastructure for running many types of HPC applications and, in some cases, there is potential for researchers to benefit significantly from the flexibility offered by cloud platforms.
Technological Applications in Science Assessment.
ERIC Educational Resources Information Center
Helgeson, Stanley L.; Kumar, David D.
Educational technology has been a focus of development and research in science teaching and learning. This document reviews research dealing with computer and hypermedia applications to assessment in science education. The paper reports the findings first for computer applications for assessment and then for hypermedia applications in assessment.…
Potential applications of computational fluid dynamics to biofluid analysis
NASA Technical Reports Server (NTRS)
Kwak, D.; Chang, J. L. C.; Rogers, S. E.; Rosenfeld, M.
1988-01-01
Computational fluid dynamics has been developed to the stage where it has become an indispensable part of aerospace research and design. In view of the advances made in aerospace applications, the computational approach can be used for biofluid mechanics research. Several flow simulation methods developed for aerospace problems are briefly discussed for potential applications to biofluids, especially to blood flow analysis.
Crane, Michael; Steinwand, Dan; Beckmann, Tim; Krpan, Greg; Liu, Shu-Guang; Nichols, Erin; Haga, Jim; Maddox, Brian; Bilderback, Chris; Feller, Mark; Homer, George
2001-01-01
The overarching goal of this project is to build a spatially distributed infrastructure for information science research by forming a team of information science researchers and providing them with similar hardware and software tools to perform collaborative research. Four geographically distributed Centers of the U.S. Geological Survey (USGS) are developing their own clusters of low-cost personal computers into parallel computing environments that provide a cost-effective way for the USGS to increase participation in the high-performance computing community. Referred to as Beowulf clusters, these hybrid systems provide the robust computing power required for conducting information science research into parallel computing systems and applications.
Computer Applications in Health Care. NCHSR Research Report Series.
ERIC Educational Resources Information Center
Medical Information Systems Cluster, Rockville, MD.
This NCHSR research program in the application of computers in health care--conducted over the ten year span 1968-1978--identified two areas of application research, an inpatient care support system, and an outpatient care support system. Both of these systems were conceived as conceptual frameworks for a related network of projects and ideas that…
DURIP: High Performance Computing in Biomathematics Applications
2017-05-10
The goal of this award was to enhance the capabilities of the Department of Applied Mathematics and Statistics (AMS) at the University of California, Santa Cruz (UCSC) to conduct research and research-related education in areas of high-performance computing in biomathematics applications.
NASA Technical Reports Server (NTRS)
Bushnell, Dennis M. (Technical Monitor)
2000-01-01
This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, computer science, fluid mechanics, and structures and materials during the period October 1, 1999 through March 31, 2000.
Xie, Tianwu; Zaidi, Habib
2016-01-01
The development of multimodality preclinical imaging techniques and the rapid growth of realistic computer simulation tools have promoted the construction and application of computational laboratory animal models in preclinical research. Since the early 1990s, over 120 realistic computational animal models have been reported in the literature and used as surrogates to characterize the anatomy of actual animals for the simulation of preclinical studies involving the use of bioluminescence tomography, fluorescence molecular tomography, positron emission tomography, single-photon emission computed tomography, microcomputed tomography, magnetic resonance imaging, and optical imaging. Other applications include electromagnetic field simulation, ionizing and nonionizing radiation dosimetry, and the development and evaluation of new methodologies for multimodality image coregistration, segmentation, and reconstruction of small animal images. This paper provides a comprehensive review of the history and fundamental technologies used for the development of computational small animal models, with a particular focus on their application in preclinical imaging as well as in nonionizing and ionizing radiation dosimetry calculations. An overview is given of the overall process involved in the design of these models, including the fundamental elements used for the construction of different types of computational models, the identification of original anatomical data, the simulation tools used for solving various computational problems, and the applications of computational animal models in preclinical research. The authors also analyze the characteristics of the categories of computational models (stylized, voxel-based, and boundary representation) and discuss the technical challenges faced at the present time as well as research needs in the future.
Emerging Uses of Computer Technology in Qualitative Research.
ERIC Educational Resources Information Center
Parker, D. Randall
The application of computer technology in qualitative research and evaluation ranges from simple word processing to doing sophisticated data sorting and retrieval. How computer software can be used for qualitative research is discussed. Researchers should consider the use of computers in data analysis in light of their own familiarity and comfort…
Distributed computing environments for future space control systems
NASA Technical Reports Server (NTRS)
Viallefont, Pierre
1993-01-01
The aim of this paper is to present the results of a CNES research project on distributed computing systems. The purpose of this research was to study the impact of the use of new computer technologies in the design and development of future space applications. The first part of this study was a state-of-the-art review of distributed computing systems. One of the interesting ideas arising from this review is the concept of a 'virtual computer' allowing the distributed hardware architecture to be hidden from a software application. The 'virtual computer' can improve system performance by adapting the best architecture (addition of computers) to the software application without having to modify its source code. This concept can also decrease the cost and obsolescence of the hardware architecture. In order to verify the feasibility of the 'virtual computer' concept, a prototype representative of a distributed space application is being developed independently of the hardware architecture.
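To make the 'virtual computer' concept concrete, the following minimal Python sketch (our illustration, not taken from the paper) shows an application written against an abstract executor interface; swapping the backend changes the execution architecture without modifying the application's source code:

    # A minimal sketch (all names are illustrative) of the 'virtual computer'
    # idea: the application talks only to an abstract executor interface, so
    # the same source code runs whether the backend is one machine or a pool
    # of added processors.
    from concurrent.futures import Executor, ThreadPoolExecutor, ProcessPoolExecutor

    def telemetry_filter(sample: float) -> float:
        # Placeholder for real application work.
        return sample * 0.5

    def run_application(executor: Executor, samples):
        # The hardware architecture behind the Executor (threads, processes,
        # remote nodes) is hidden from this code.
        return list(executor.map(telemetry_filter, samples))

    if __name__ == "__main__":
        samples = [1.0, 2.0, 3.0]
        with ThreadPoolExecutor() as backend:      # 'one computer'
            print(run_application(backend, samples))
        with ProcessPoolExecutor() as backend:     # 'added computers'
            print(run_application(backend, samples))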
ERIC Educational Resources Information Center
Kieren, Thomas E.
This last paper in a set of four reviews research on a wide variety of computer applications in the mathematics classroom. It covers computer-based instruction, especially drill-and-practice and tutorial modes; computer-managed instruction; and computer-augmented problem-solving. Analytical comments on the findings and status of the research are…
Design, Development, and Evaluation of a Mobile Learning Application for Computing Education
ERIC Educational Resources Information Center
Oyelere, Solomon Sunday; Suhonen, Jarkko; Wajiga, Greg M.; Sutinen, Erkki
2018-01-01
The study focused on the application of the design science research approach in the course of developing a mobile learning application, MobileEdu, for computing education in the Nigerian higher education context. MobileEdu facilitates the learning of computer science courses on mobile devices. The application supports ubiquitous, collaborative,…
Why Don't All Professors Use Computers?
ERIC Educational Resources Information Center
Drew, David Eli
1989-01-01
Discusses the adoption of computer technology at universities and examines reasons why some professors don't use computers. Topics discussed include computer applications, including artificial intelligence, social science research, statistical analysis, and cooperative research; appropriateness of the technology for the task; the Computer Aptitude…
Research on OpenStack of open source cloud computing in colleges and universities’ computer room
NASA Astrophysics Data System (ADS)
Wang, Lei; Zhang, Dandan
2017-06-01
In recent years, cloud computing technology has developed rapidly, especially open source cloud computing. Open source cloud computing has attracted a large number of user groups through the advantages of open source software and low cost, and it has now been promoted and applied on a large scale. In this paper, we first briefly introduce the main functions and architecture of the open source cloud computing tool OpenStack, and then discuss in depth the core problems of computer labs in colleges and universities. Building on this research, we describe the specific application and deployment of OpenStack in university computer rooms. The experimental results show that OpenStack can be used to deploy a university computer room cloud efficiently and conveniently, with stable performance and good functional value.
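As a concrete illustration of this kind of deployment, the following minimal sketch uses the openstacksdk Python client to boot one virtual lab workstation per seat; the cloud name, image, flavor, and network names are hypothetical placeholders for a real campus OpenStack installation:

    # A minimal sketch of scripting a lab-room deployment with openstacksdk;
    # image, flavor, and network names are hypothetical placeholders.
    import openstack

    # Reads credentials from OS_* environment variables or clouds.yaml.
    conn = openstack.connect(cloud="envvars")

    image = conn.compute.find_image("ubuntu-lab-image")   # hypothetical name
    flavor = conn.compute.find_flavor("m1.small")
    network = conn.network.find_network("lab-net")        # hypothetical name

    # Boot one virtual lab workstation per student seat.
    for seat in range(1, 31):
        conn.compute.create_server(
            name=f"lab-seat-{seat:02d}",
            image_id=image.id,
            flavor_id=flavor.id,
            networks=[{"uuid": network.id}],
        )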
Institute for Computer Applications in Science and Engineering (ICASE)
NASA Technical Reports Server (NTRS)
1984-01-01
Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science during the period April 1, 1983 through September 30, 1983 is summarized.
ERIC Educational Resources Information Center
Lai, Kwok-Wing
Designed to examine the application and cost-effectiveness of computer-assisted instruction (CAI) for secondary education in developing countries, this document is divided into eight chapters. A general introduction defines the research problem, describes the research methodology, and provides definitions of key terms used throughout the paper.…
Promayon, Emmanuel; Fouard, Céline; Bailet, Mathieu; Deram, Aurélien; Fiard, Gaëlle; Hungr, Nikolai; Luboz, Vincent; Payan, Yohan; Sarrazin, Johan; Saubat, Nicolas; Selmi, Sonia Yuki; Voros, Sandrine; Cinquin, Philippe; Troccaz, Jocelyne
2013-01-01
Computer Assisted Medical Intervention (CAMI hereafter) is a complex multi-disciplinary field. CAMI research requires the collaboration of experts in several fields as diverse as medicine, computer science, mathematics, instrumentation, signal processing, mechanics, modeling, automatics, optics, etc. CamiTK is a modular framework that helps researchers and clinicians collaborate to prototype CAMI applications by regrouping the knowledge and expertise from each discipline. It is an open-source, cross-platform, generic and modular tool written in C++ which can handle medical images, surgical navigation, biomedical simulations and robot control. This paper presents the Computer Assisted Medical Intervention ToolKit (CamiTK) and how it is used in various applications in our research team.
Activities of the Institute for Computer Applications in Science and Engineering
NASA Technical Reports Server (NTRS)
1985-01-01
Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period April 1, 1985 through October 2, 1985 is summarized.
Sharing Research Models: Using Software Engineering Practices for Facilitation
Bryant, Stephanie P.; Solano, Eric; Cantor, Susanna; Cooley, Philip C.; Wagener, Diane K.
2011-01-01
Increasingly, researchers are turning to computational models to understand the interplay of important variables on systems’ behaviors. Although researchers may develop models that meet the needs of their investigation, application limitations—such as nonintuitive user interface features and data input specifications—may limit the sharing of these tools with other research groups. By removing these barriers, other research groups that perform related work can leverage these work products to expedite their own investigations. The use of software engineering practices can enable managed application production and shared research artifacts among multiple research groups by promoting consistent models, reducing redundant effort, encouraging rigorous peer review, and facilitating research collaborations that are supported by a common toolset. This report discusses three established software engineering practices—the iterative software development process, object-oriented methodology, and Unified Modeling Language—and the applicability of these practices to computational model development. Our efforts to modify the MIDAS TranStat application to make it more user-friendly are presented as an example of how computational models that are based on research and developed using software engineering practices can benefit a broader audience of researchers. PMID:21687780
NASA Astrophysics Data System (ADS)
Liu, Shuai
Fractals represent a special feature of natural and functional objects. Fractal-based computing can be applied in many research domains because of its fixed properties that resist deformation, its variable parameters, and its many unpredictable changes. Theoretical research on, and practical application of, fractal-based computing have been hotspots for 30 years and will continue to be. There are many pending issues awaiting solutions in this domain; thus this thematic issue, containing 14 papers, publishes state-of-the-art developments in the theory and application of fractal-based computing, including mathematical analysis and novel engineering applications. The topics cover fractal and multifractal features in applications and in the solution of nonlinear ODEs and equations.
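As a small worked example of basic fractal computation (our illustration, not taken from the thematic issue), the following script estimates the box-counting dimension of the middle-thirds Cantor set, whose theoretical value is log 2 / log 3 ≈ 0.631:

    # Estimate the box-counting dimension of the Cantor set on [0, 1].
    import math

    def cantor_points(depth):
        # Left endpoints of the surviving intervals after `depth` removals.
        pts = [(0.0, 1.0)]
        for _ in range(depth):
            pts = [seg for a, b in pts
                   for seg in ((a, a + (b - a) / 3), (b - (b - a) / 3, b))]
        return [a for a, _ in pts]

    points = cantor_points(10)

    def boxes(points, eps):
        # Number of eps-sized boxes that contain at least one point.
        return len({math.floor(p / eps) for p in points})

    # log N(eps) / log (1/eps) approaches the fractal dimension.
    for eps in (1/27, 1/81, 1/243):
        n = boxes(points, eps)
        print(eps, n, math.log(n) / math.log(1 / eps))  # prints ~0.631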
When cloud computing meets bioinformatics: a review.
Zhou, Shuigeng; Liao, Ruiqi; Guan, Jihong
2013-10-01
In the past decades, with the rapid development of high-throughput technologies, biology research has generated an unprecedented amount of data. In order to store and process such a great amount of data, cloud computing and MapReduce were applied to many fields of bioinformatics. In this paper, we first introduce the basic concepts of cloud computing and MapReduce, and their applications in bioinformatics. We then highlight some problems challenging the applications of cloud computing and MapReduce to bioinformatics. Finally, we give a brief guideline for using cloud computing in biology research.
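To illustrate the MapReduce programming model the review discusses, here is a minimal single-machine sketch applied to a toy bioinformatics task, counting k-mers across sequencing reads; a real deployment would run the map and reduce phases on a framework such as Hadoop:

    # Single-machine sketch of the MapReduce pattern: count k-mers in reads.
    from collections import defaultdict
    from itertools import chain

    def map_phase(read, k=3):
        # Emit (k-mer, 1) pairs for one read.
        return [(read[i:i + k], 1) for i in range(len(read) - k + 1)]

    def shuffle(pairs):
        # Group values by key, as the framework does between phases.
        groups = defaultdict(list)
        for key, value in pairs:
            groups[key].append(value)
        return groups

    def reduce_phase(key, values):
        return key, sum(values)

    reads = ["ATGGCAT", "GGCATAA"]
    pairs = chain.from_iterable(map_phase(r) for r in reads)
    counts = dict(reduce_phase(k, v) for k, v in shuffle(pairs).items())
    print(counts)  # e.g. {'ATG': 1, 'TGG': 1, 'GGC': 2, ...}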
Activities of the Institute for Computer Applications in Science and Engineering (ICASE)
NASA Technical Reports Server (NTRS)
1985-01-01
Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period October 1, 1984 through March 31, 1985 is summarized.
Activities of the Institute for Computer Applications in Science and Engineering (ICASE)
NASA Technical Reports Server (NTRS)
1988-01-01
This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period October 2, 1987 through March 31, 1988.
Activities of the Institute for Computer Applications in Science and Engineering (ICASE)
NASA Technical Reports Server (NTRS)
1999-01-01
This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period April 1, 1999 through September 30, 1999.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tzeng, Nian-Feng; White, Christopher D.; Moreman, Douglas
2012-07-14
The UCoMS research cluster has spearheaded three research areas since August 2004, including wireless and sensor networks, Grid computing, and petroleum applications. The primary goals of UCoMS research are three-fold: (1) creating new knowledge to push forward the technology forefronts on pertinent research on the computing and monitoring aspects of energy resource management, (2) developing and disseminating software codes and toolkits for the research community and the public, and (3) establishing system prototypes and testbeds for evaluating innovative techniques and methods. Substantial progress and diverse accomplishments have been made by research investigators in their respective areas of expertise cooperatively on such topics as sensors and sensor networks, wireless communication and systems, and computational Grids, particularly as relevant to petroleum applications.
Research in Applied Mathematics, Fluid Mechanics and Computer Science
NASA Technical Reports Server (NTRS)
1999-01-01
This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period October 1, 1998 through March 31, 1999.
Research activities in applied mathematics, fluid mechanics, and computer science
NASA Technical Reports Server (NTRS)
1995-01-01
This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period April 1, 1995 through September 30, 1995.
NASA Technical Reports Server (NTRS)
Smith, Paul H.
1988-01-01
The Computer Science Program provides advanced concepts, techniques, system architectures, algorithms, and software for both space and aeronautics information sciences and computer systems. The overall goal is to provide the technical foundation within NASA for the advancement of computing technology in aerospace applications. The research program is improving the state of knowledge of fundamental aerospace computing principles and advancing computing technology in space applications such as software engineering and information extraction from data collected by scientific instruments in space. The program includes the development of special algorithms and techniques to exploit the computing power provided by high performance parallel processors and special purpose architectures. Research is being conducted in the fundamentals of data base logic and improvement techniques for producing reliable computing systems.
Report on Computing and Networking in the Space Science Laboratory by the SSL Computer Committee
NASA Technical Reports Server (NTRS)
Gallagher, D. L. (Editor)
1993-01-01
The Space Science Laboratory (SSL) at Marshall Space Flight Center is a multiprogram facility. Scientific research is conducted in four discipline areas: earth science and applications, solar-terrestrial physics, astrophysics, and microgravity science and applications. Representatives from each of these discipline areas participate in a Laboratory computer requirements committee, which developed this document. The purpose is to establish and discuss Laboratory objectives for computing and networking in support of science. The purpose is also to lay the foundation for a collective, multiprogram approach to providing these services. Special recognition is given to the importance of the national and international efforts of our research communities toward the development of interoperable, network-based computer applications.
Computational Particle Dynamic Simulations on Multicore Processors (CPDMu) Final Report Phase I
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schmalz, Mark S
2011-07-24
Statement of Problem - The Department of Energy has many legacy codes for simulation of computational particle dynamics and computational fluid dynamics applications that are designed to run on sequential processors and are not easily parallelized. Emerging high-performance computing architectures employ massively parallel multicore architectures (e.g., graphics processing units) to increase throughput. Parallelization of legacy simulation codes is a high priority, to achieve compatibility, efficiency, accuracy, and extensibility. General Statement of Solution - A legacy simulation application designed for implementation on mainly-sequential processors has been represented as a graph G. Mathematical transformations, applied to G, produce a graph representation G̲ for a high-performance architecture. Key computational and data movement kernels of the application were analyzed/optimized for parallel execution using the mapping G → G̲, which can be performed semi-automatically. This approach is widely applicable to many types of high-performance computing systems, such as graphics processing units or clusters comprised of nodes that contain one or more such units. Phase I Accomplishments - Phase I research decomposed/profiled computational particle dynamics simulation code for rocket fuel combustion into low and high computational cost regions (respectively, mainly sequential and mainly parallel kernels), with analysis of space and time complexity. Using the research team's expertise in algorithm-to-architecture mappings, the high-cost kernels were transformed, parallelized, and implemented on Nvidia Fermi GPUs. Measured speedups (GPU with respect to single-core CPU) were approximately 20-32X for realistic model parameters, without final optimization. Error analysis showed no loss of computational accuracy. Commercial Applications and Other Benefits - The proposed research will constitute a breakthrough in the solution of problems related to efficient parallel computation of particle and fluid dynamics simulations. These problems occur throughout DOE, military, and commercial sectors: the potential payoff is high. We plan to license or sell the solution to contractors for military and domestic applications such as disaster simulation (aerodynamic and hydrodynamic), Government agencies (hydrological and environmental simulations), and medical applications (e.g., in tomographic image reconstruction). Keywords - High-performance Computing, Graphics Processing Unit, Fluid/Particle Simulation. Summary for Members of Congress - The Department of Energy has many simulation codes that must compute faster to be effective. The Phase I research parallelized particle/fluid simulations for rocket combustion for high-performance computing systems.
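The profiling-and-mapping step can be pictured with a toy Python sketch (kernel names, costs, and the threshold are invented for illustration): the application is represented as a graph of kernels with measured costs, and high-cost kernels are assigned to the parallel device:

    # Toy sketch of cost-based kernel-to-architecture mapping.
    kernels = {
        "io_read":       {"cost": 0.02, "deps": []},
        "build_grid":    {"cost": 0.10, "deps": ["io_read"]},
        "particle_push": {"cost": 5.80, "deps": ["build_grid"]},
        "collide":       {"cost": 7.40, "deps": ["particle_push"]},
        "io_write":      {"cost": 0.05, "deps": ["collide"]},
    }

    THRESHOLD = 1.0  # seconds; kernels above this are 'high cost'

    mapping = {name: ("gpu" if k["cost"] > THRESHOLD else "cpu")
               for name, k in kernels.items()}
    print(mapping)
    # {'io_read': 'cpu', 'build_grid': 'cpu', 'particle_push': 'gpu',
    #  'collide': 'gpu', 'io_write': 'cpu'}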
Computational structural mechanics methods research using an evolving framework
NASA Technical Reports Server (NTRS)
Knight, N. F., Jr.; Lotts, C. G.; Gillian, R. E.
1990-01-01
Advanced structural analysis and computational methods that exploit high-performance computers are being developed in a computational structural mechanics research activity sponsored by the NASA Langley Research Center. These new methods are developed in an evolving framework and applied to representative complex structural analysis problems from the aerospace industry. An overview of the methods development environment is presented, and methods research areas are described. Selected application studies are also summarized.
Center for Advanced Computational Technology
NASA Technical Reports Server (NTRS)
Noor, Ahmed K.
2000-01-01
The Center for Advanced Computational Technology (ACT) was established to serve as a focal point for diverse research activities pertaining to application of advanced computational technology to future aerospace systems. These activities include the use of numerical simulations, artificial intelligence methods, multimedia and synthetic environments, and computational intelligence, in the modeling, analysis, sensitivity studies, optimization, design and operation of future aerospace systems. The Center is located at NASA Langley and is an integral part of the School of Engineering and Applied Science of the University of Virginia. The Center has four specific objectives: 1) conduct innovative research on applications of advanced computational technology to aerospace systems; 2) act as pathfinder by demonstrating to the research community what can be done (high-potential, high-risk research); 3) help in identifying future directions of research in support of the aeronautical and space missions of the twenty-first century; and 4) help in the rapid transfer of research results to industry and in broadening awareness among researchers and engineers of the state-of-the-art in applications of advanced computational technology to the analysis, design prototyping and operations of aerospace and other high-performance engineering systems. In addition to research, Center activities include helping in the planning and coordination of the activities of a multi-center team of NASA and JPL researchers who are developing an intelligent synthesis environment for future aerospace systems; organizing workshops and national symposia; as well as writing state-of-the-art monographs and NASA special publications on timely topics.
Computational mechanics and physics at NASA Langley Research Center
NASA Technical Reports Server (NTRS)
South, Jerry C., Jr.
1987-01-01
An overview is given of computational mechanics and physics at NASA Langley Research Center. Computational analysis is a major component and tool in many of Langley's diverse research disciplines, as well as in the interdisciplinary research. Examples are given for algorithm development and advanced applications in aerodynamics, transition to turbulence and turbulence simulation, hypersonics, structures, and interdisciplinary optimization.
Uncover the Cloud for Geospatial Sciences and Applications to Adopt Cloud Computing
NASA Astrophysics Data System (ADS)
Yang, C.; Huang, Q.; Xia, J.; Liu, K.; Li, J.; Xu, C.; Sun, M.; Bambacus, M.; Xu, Y.; Fay, D.
2012-12-01
Cloud computing is emerging as the future infrastructure for providing computing resources to support and enable scientific research, engineering development, and application construction, as well as workforce education. On the other hand, there is considerable doubt about the readiness of cloud computing to support a variety of scientific research, development, and education activities. This research, a project funded by NASA SMD, investigates through holistic studies how ready cloud computing is to support the geosciences. Four applications with different computing characteristics, including data, computing, concurrency, and spatiotemporal intensities, are used to test that readiness. Three popular and representative cloud platforms, Amazon EC2, Microsoft Azure, and NASA Nebula, as well as a traditional cluster, are utilized in the study. Results illustrate that the cloud is ready to some degree, but more research needs to be done to fully realize the cloud benefits as advertised by many vendors and defined by NIST. Specifically: 1) most cloud platforms can stand up a new computing instance, in effect a new computer, in a few minutes as envisioned, and are therefore ready to support most computing needs in an on-demand fashion; 2) load balance and elasticity, a defining characteristic, is ready in some cloud platforms, such as Amazon EC2, to support bigger jobs needing responses in minutes, while others are not yet ready to support elasticity and load balance well, and all cloud platforms need further research and development to support real-time applications at the subminute level; 3) the user interface and functionality of cloud platforms vary widely, and while some are professional and well supported/documented, such as Amazon EC2, others need significant improvement before the general public can adopt cloud computing without professional training or knowledge of computing infrastructure; 4) security is a big concern in cloud computing platforms; given the sharing spirit of cloud computing, it is very hard to ensure higher-level security unless a private cloud is built for a specific organization without public access, and public cloud platforms do not yet support the FISMA medium level and may never be able to support the FISMA high level; 5) HPC jobs are not well supported in the cloud, with only Amazon EC2 supporting them well. The research is being considered by NASA and other agencies in their cloud computing adoption decisions. We hope the publication of this research will also help the public adopt cloud computing.
COMPUTATIONAL TOXICOLOGY: FRAMEWORK, PARTNERSHIPS, AND PROGRAM DEVELOPMENT
Computational toxicology is a new research initiative being developed within the Office of Research and Development (ORD) of the US Environmental Protection Agency (EPA). Operationally, it is defined as the application of mathematical and computer models together with molecular c...
ERIC Educational Resources Information Center
Liao, Yuen-kuang Cliff; Chang, Huei-wen; Chen, Yu-wen
2008-01-01
A meta-analysis was performed to synthesize existing research comparing the effects of computer applications (i.e., computer-assisted instruction, computer simulations, and Web-based learning) versus traditional instruction on elementary school students' achievement in Taiwan. Forty-eight studies were located from four sources, and their…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Farbin, Amir
2015-07-15
This is the final report for the DoE Early Career Research Program grant titled "Model-Independent Dark-Matter Searches at the ATLAS Experiment and Applications of Many-core Computing to High Energy Physics".
A Brain-Computer Interface Project Applied in Computer Engineering
ERIC Educational Resources Information Center
Katona, Jozsef; Kovari, Attila
2016-01-01
Keeping up with novel methods and keeping abreast of new applications are crucial issues in engineering education. In brain research, one of the most significant research areas in recent decades, many developments have application in both modern engineering technology and education. New measurement methods in the observation of brain activity open…
Boutiques: a flexible framework to integrate command-line applications in computing platforms.
Glatard, Tristan; Kiar, Gregory; Aumentado-Armstrong, Tristan; Beck, Natacha; Bellec, Pierre; Bernard, Rémi; Bonnet, Axel; Brown, Shawn T; Camarasu-Pop, Sorina; Cervenansky, Frédéric; Das, Samir; Ferreira da Silva, Rafael; Flandin, Guillaume; Girard, Pascal; Gorgolewski, Krzysztof J; Guttmann, Charles R G; Hayot-Sasson, Valérie; Quirion, Pierre-Olivier; Rioux, Pierre; Rousseau, Marc-Étienne; Evans, Alan C
2018-05-01
We present Boutiques, a system to automatically publish, integrate, and execute command-line applications across computational platforms. Boutiques applications are installed through software containers described in a rich and flexible JSON language. A set of core tools facilitates the construction, validation, import, execution, and publishing of applications. Boutiques is currently supported by several distinct virtual research platforms, and it has been used to describe dozens of applications in the neuroinformatics domain. We expect Boutiques to improve the quality of application integration in computational platforms, to reduce redundancy of effort, to contribute to computational reproducibility, and to foster Open Science.
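For readers unfamiliar with the descriptor format, the sketch below generates a minimal Boutiques-style JSON descriptor from Python; the field names follow the published Boutiques schema, while the tool, command line, and version values are hypothetical examples rather than a descriptor from the paper:

    # Sketch of a minimal Boutiques-style descriptor written from Python.
    import json

    descriptor = {
        "name": "example-smoothing-tool",            # hypothetical tool
        "description": "Smooths a volume image.",
        "tool-version": "1.0.0",
        "schema-version": "0.5",
        "command-line": "smooth [INPUT] [OUTPUT]",
        "inputs": [
            {"id": "infile", "name": "Input file",
             "type": "File", "value-key": "[INPUT]"},
            {"id": "outname", "name": "Output name",
             "type": "String", "value-key": "[OUTPUT]"},
        ],
        "output-files": [
            {"id": "smoothed", "name": "Smoothed volume",
             "path-template": "[OUTPUT]"},
        ],
    }

    with open("descriptor.json", "w") as f:
        json.dump(descriptor, f, indent=2)
    # The result could then be checked with the Boutiques CLI, e.g.
    # `bosh validate descriptor.json`.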
Research on the application in disaster reduction for using cloud computing technology
NASA Astrophysics Data System (ADS)
Tao, Liang; Fan, Yida; Wang, Xingling
Cloud computing technology has recently been applied rapidly in different domains, promoting progress in each domain's informatization. Based on an analysis of the state of application requirements in disaster reduction, and combining the characteristics of cloud computing technology, we present research on the application of cloud computing technology in disaster reduction. First of all, we give the architecture of a disaster reduction cloud, which consists of disaster reduction infrastructure as a service (IaaS), a disaster reduction cloud application platform as a service (PaaS), and disaster reduction software as a service (SaaS). Secondly, we discuss the standard system of disaster reduction in five aspects. Thirdly, we describe the security system of the disaster reduction cloud. Finally, we conclude that the use of cloud computing technology will help us solve problems in disaster reduction and promote the development of disaster reduction.
Computational Fluid Dynamics Program at NASA Ames Research Center
NASA Technical Reports Server (NTRS)
Holst, Terry L.
1989-01-01
The Computational Fluid Dynamics (CFD) Program at NASA Ames Research Center is reviewed and discussed. The technical elements of the CFD Program are listed and briefly discussed. These elements include algorithm research, research and pilot code development, scientific visualization, advanced surface representation, volume grid generation, and numerical optimization. Next, the discipline of CFD is briefly discussed and related to other areas of research at NASA Ames including experimental fluid dynamics, computer science research, computational chemistry, and numerical aerodynamic simulation. These areas combine with CFD to form a larger area of research, which might collectively be called computational technology. The ultimate goal of computational technology research at NASA Ames is to increase the physical understanding of the world in which we live, solve problems of national importance, and increase the technical capabilities of the aerospace community. Next, the major programs at NASA Ames that either use CFD technology or perform research in CFD are listed and discussed. Briefly, this list includes turbulent/transition physics and modeling, high-speed real gas flows, interdisciplinary research, turbomachinery demonstration computations, complete aircraft aerodynamics, rotorcraft applications, powered lift flows, high alpha flows, multiple body aerodynamics, and incompressible flow applications. Some of the individual problems actively being worked in each of these areas are listed to help define the breadth or extent of CFD involvement in each of these major programs. State-of-the-art examples of various CFD applications are presented to highlight most of these areas. The main emphasis of this portion of the presentation is on examples which will not otherwise be treated at this conference by the individual presentations. Finally, a list of principal current limitations and expected future directions is given.
NASA Technical Reports Server (NTRS)
1985-01-01
Slides are reproduced that describe the importance of having high performance number crunching and graphics capability. They also indicate the types of research and development underway at Ames Research Center to ensure that, in the near term, Ames is a smart buyer and user, and that, in the long term, Ames knows the best possible solutions for number crunching and graphics needs. The drivers for this research are real computational physics applications of interest to Ames and NASA. They are concerned with how to map the applications, and how to maximize the physics learned from the results of the calculations. The computer graphics activities are aimed at getting maximum information from the three-dimensional calculations by using the real time manipulation of three-dimensional data on the Silicon Graphics workstation. Work is underway on new algorithms that will permit the display of experimental results that are sparse and random, the same way that the dense and regular computed results are displayed.
A survey of GPU-based medical image computing techniques
Shi, Lin; Liu, Wen; Zhang, Heye; Xie, Yongming
2012-01-01
Medical imaging currently plays a crucial role throughout clinical practice, from medical scientific research to diagnostics and treatment planning. However, medical imaging procedures are often computationally demanding due to the large three-dimensional (3D) medical datasets to process in practical clinical applications. With the rapidly improving performance of graphics processors, improved programming support, and an excellent price-to-performance ratio, the graphics processing unit (GPU) has emerged as a competitive parallel computing platform for computationally expensive and demanding tasks in a wide range of medical image applications. The major purpose of this survey is to provide a comprehensive reference source for starters or researchers involved in GPU-based medical image processing. Within this survey, the continuous advancement of GPU computing is reviewed and the existing traditional applications in three areas of medical image processing, namely, segmentation, registration and visualization, are surveyed. The potential advantages and associated challenges of current GPU-based medical imaging are also discussed to inspire future applications in medicine. PMID:23256080
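As a minimal example of the GPU offloading pattern the survey covers (a sketch assuming a CUDA-capable GPU and the cupy package; the volume is synthetic, not clinical data), a simple threshold segmentation can be expressed as array operations that execute on the device:

    # Minimal sketch of GPU offloading with CuPy.
    import cupy as cp

    volume = cp.random.random((128, 128, 128)).astype(cp.float32)  # toy 3D volume
    threshold = 0.7

    mask = volume > threshold   # element-wise comparison runs on the GPU
    voxels = int(mask.sum())    # reduction also executes on the device
    print(f"segmented {voxels} voxels")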
D'Alessandro, M P; Ackerman, M J; Sparks, S M
1993-11-01
Educational Technology Network (ET Net) is a free, easy to use, on-line computer conferencing system organized and funded by the National Library of Medicine that is accessible via the SprintNet (SprintNet, Reston, VA) and Internet (Merit, Ann Arbor, MI) computer networks. It is dedicated to helping bring together, in a single continuously running electronic forum, developers and users of computer applications in the health sciences, including radiology. ET Net uses the Caucus computer conferencing software (Camber-Roth, Troy, NY) running on a microcomputer. This microcomputer is located in the National Library of Medicine's Lister Hill National Center for Biomedical Communications and is directly connected to the SprintNet and the Internet networks. The advanced computer conferencing software of ET Net allows individuals who are separated in space and time to unite electronically to participate, at any time, in interactive discussions on applications of computers in radiology. A computer conferencing system such as ET Net allows radiologists to maintain contact with colleagues on a regular basis when they are not physically together. Topics of discussion on ET Net encompass all applications of computers in radiological practice, research, and education. ET Net has been in successful operation for 3 years and has a promising future aiding radiologists in the exchange of information pertaining to applications of computers in radiology.
Cost Optimization Model for Business Applications in Virtualized Grid Environments
NASA Astrophysics Data System (ADS)
Strebel, Jörg
The advent of Grid computing gives enterprises an ever increasing choice of computing options, yet research has so far hardly addressed the problem of mixing the different computing options in a cost-minimal fashion. The following paper presents a comprehensive cost model and a mixed integer optimization model which can be used to minimize the IT expenditures of an enterprise and to support decisions about when to outsource certain business software applications. A sample scenario is analyzed and promising cost savings are demonstrated. Possible applications of the model to future research questions are outlined.
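The flavor of such a cost-minimization model can be shown with a toy sketch (all costs and the capacity limit are invented for illustration): each application is assigned either in-house or to a cloud provider, subject to an in-house capacity constraint, and the cheapest feasible assignment is selected; a production model would instead be solved as a mixed integer program with capacity and service-level constraints:

    # Toy cost-minimizing assignment of applications to in-house vs. cloud.
    from itertools import product

    apps = ["CRM", "ERP", "BI"]
    inhouse_cost = {"CRM": 40, "ERP": 90, "BI": 25}   # per month, hypothetical
    cloud_cost = {"CRM": 55, "ERP": 60, "BI": 30}     # per month, hypothetical
    inhouse_capacity = 2  # at most two applications fit in-house

    best = None
    for choice in product(["inhouse", "cloud"], repeat=len(apps)):
        if choice.count("inhouse") > inhouse_capacity:
            continue  # infeasible: not enough in-house capacity
        cost = sum((inhouse_cost if where == "inhouse" else cloud_cost)[app]
                   for app, where in zip(apps, choice))
        if best is None or cost < best[0]:
            best = (cost, dict(zip(apps, choice)))

    print(best)  # (125, {'CRM': 'inhouse', 'ERP': 'cloud', 'BI': 'inhouse'})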
Research in applied mathematics, numerical analysis, and computer science
NASA Technical Reports Server (NTRS)
1984-01-01
Research conducted at the Institute for Computer Applications in Science and Engineering (ICASE) in applied mathematics, numerical analysis, and computer science is summarized and abstracts of published reports are presented. The major categories of the ICASE research program are: (1) numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; (2) control and parameter identification; (3) computational problems in engineering and the physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and (4) computer systems and software, especially vector and parallel computers.
An overview of computer-based natural language processing
NASA Technical Reports Server (NTRS)
Gevarter, W. B.
1983-01-01
Computer-based Natural Language Processing (NLP) is the key to enabling humans and their computer-based creations to interact with machines in natural language (like English, Japanese, German, etc., in contrast to formal computer languages). The doors that such an achievement can open have made this a major research area in Artificial Intelligence and Computational Linguistics. Commercial natural language interfaces to computers have recently entered the market and the future looks bright for other applications as well. This report reviews the basic approaches to such systems, the techniques utilized, applications, the state of the art of the technology, issues and research requirements, the major participants and finally, future trends and expectations. It is anticipated that this report will prove useful to engineering and research managers, potential users, and others who will be affected by this field as it unfolds.
Development of Advanced Computational Aeroelasticity Tools at NASA Langley Research Center
NASA Technical Reports Server (NTRS)
Bartels, R. E.
2008-01-01
NASA Langley Research Center has continued to develop its long standing computational tools to address new challenges in aircraft and launch vehicle design. This paper discusses the application and development of those computational aeroelastic tools. Four topic areas will be discussed: 1) Modeling structural and flow field nonlinearities; 2) Integrated and modular approaches to nonlinear multidisciplinary analysis; 3) Simulating flight dynamics of flexible vehicles; and 4) Applications that support both aeronautics and space exploration.
Applications of computer-aided text analysis in natural resources.
David N. Bengston
2000-01-01
Ten contributed papers describe the use of a variety of approaches to computer-aided text analysis and their application to a wide range of research questions related to natural resources and the environment. Taken together, these papers paint a picture of a growing and vital area of research on the human dimensions of natural resource management.
Application of Computational Toxicology to Prospective and Diagnostic Ecological Risk Assessment
Application of Computational Toxicology to Prospective and Diagnostic Ecological Risk Assessment (Presented by: Dan Villeneuve, Ph.D., Research Toxicologist, US-EPA Mid-Continent Ecology Division) (1/24/2013)
Applications of computational modeling in ballistics
NASA Technical Reports Server (NTRS)
Sturek, Walter B.
1987-01-01
The development of the technology of ballistics as applied to gun-launched Army weapon systems is the main objective of research at the U.S. Army Ballistic Research Laboratory (BRL). The primary research programs at the BRL consist of three major ballistic disciplines: exterior, interior, and terminal. The work done at the BRL in these areas has traditionally been highly dependent on experimental testing. Considerable emphasis has been placed on the development of computational modeling to augment experimental testing in the development cycle; however, the impact of computational modeling to date has been modest. With the supercomputer computational resources recently installed at the BRL, a new emphasis on the application of computational modeling to ballistics technology is taking place. The major application areas currently receiving considerable attention at the BRL are outlined, and the modeling approaches involved are indicated. An attempt is made to give some information as to the degree of success achieved and to indicate the areas of greatest need.
Applications of complex systems theory in nursing education, research, and practice.
Clancy, Thomas R; Effken, Judith A; Pesut, Daniel
2008-01-01
The clinical and administrative processes in today's healthcare environment are becoming increasingly complex. Multiple providers, new technology, competition, and the growing ubiquity of information all contribute to the notion of health care as a complex system. A complex system (CS) is characterized by a highly connected network of entities (e.g., physical objects, people or groups of people) from which higher order behavior emerges. Research in the transdisciplinary field of CS has focused on the use of computational modeling and simulation as a methodology for analyzing CS behavior. The creation of virtual worlds through computer simulation allows researchers to analyze multiple variables simultaneously and begin to understand behaviors that are common regardless of the discipline. The application of CS principles, mediated through computer simulation, informs nursing practice of the benefits and drawbacks of new procedures, protocols and practices before having to actually implement them. The inclusion of new computational tools and their applications in nursing education is also gaining attention. For example, education in CSs and applied computational applications has been endorsed by The Institute of Medicine, the American Organization of Nurse Executives and the American Association of Colleges of Nursing as essential training of nurse leaders. The purpose of this article is to review current research literature regarding CS science within the context of expert practice and implications for the education of nurse leadership roles. The article focuses on 3 broad areas: CS defined, literature review and exemplars from CS research and applications of CS theory in nursing leadership education. The article also highlights the key role nursing informaticists play in integrating emerging computational tools in the analysis of complex nursing systems.
Computer Perspectives in Recreation.
ERIC Educational Resources Information Center
Haderlie, Brian M., Ed.
This publication describes applications and/or research involved with computer use for professionals in leisure, parks, and recreation. Papers presented are: (1) "Software in the Eighties: Information Exchange and Clearinghouse Applications" (Jeff A. Stuyt); (2) "Microcomputer Applications for the Manager of the Future" (Christine Z. Howe); (3)…
The Magellan Final Report on Cloud Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Coghlan, Susan; Yelick, Katherine
The goal of Magellan, a project funded through the U.S. Department of Energy (DOE) Office of Advanced Scientific Computing Research (ASCR), was to investigate the potential role of cloud computing in addressing the computing needs of the DOE Office of Science (SC), particularly the needs of mid-range computing and future data-intensive computing workloads. A set of research questions was formed to probe various aspects of cloud computing, from performance to usability to cost. To address these questions, a distributed testbed infrastructure was deployed at the Argonne Leadership Computing Facility (ALCF) and the National Energy Research Scientific Computing Center (NERSC). The testbed was designed to be flexible and capable enough to explore a variety of computing models and hardware design points in order to understand the impact for various scientific applications. During the project, the testbed also served as a valuable resource to application scientists. Applications from a diverse set of projects, such as MG-RAST (a metagenomics analysis server), the Joint Genome Institute, the STAR experiment at the Relativistic Heavy Ion Collider, and the Laser Interferometer Gravitational Wave Observatory (LIGO), were used by the Magellan project for benchmarking within the cloud, and the project teams were also able to accomplish important production science utilizing the Magellan cloud resources.
Application of CFD in Indonesian Research: A review
NASA Astrophysics Data System (ADS)
Ambarita, H.; Siregar, M. R.; Kishinami, K.; Daimaruya, M.; Kawai, H.
2018-04-01
Computational Fluid Dynamics (CFD) is a numerical method for solving fluid flow and related governing equations using a computational tool. Studies of CFD, its methodology, and its application as a research tool are increasing. In this study, the application of CFD by Indonesian researchers is briefly reviewed. The main objective is to characterize CFD applications by Indonesian researchers. Considering its size and reputation, this study uses the Scopus publication index as its database. All documents in Scopus related to CFD and affiliated with at least one Indonesian researcher were collected for review. Research topics, CFD methods, and simulation results are reviewed in brief. The results show that 260 documents were found in the Scopus-indexed literature, divided into 125 research articles, 135 conference papers, 1 book, and 1 review. Among the research articles, only a few researchers focus on the development of CFD methodology; almost all of the articles use CFD as a research tool in a particular application, such as aircraft, wind power, and heat exchangers. The topics of the 125 research articles can be divided into 12 specific applications and 1 miscellaneous category. The most popular application is heating, ventilating, and air conditioning, followed by reactor, transportation, and heat exchanger applications. The most popular commercial CFD code is ANSYS Fluent; only a few researchers use CFX.
Boutiques: a flexible framework to integrate command-line applications in computing platforms
Glatard, Tristan; Kiar, Gregory; Aumentado-Armstrong, Tristan; Beck, Natacha; Bellec, Pierre; Bernard, Rémi; Bonnet, Axel; Brown, Shawn T; Camarasu-Pop, Sorina; Cervenansky, Frédéric; Das, Samir; Ferreira da Silva, Rafael; Flandin, Guillaume; Girard, Pascal; Gorgolewski, Krzysztof J; Guttmann, Charles R G; Hayot-Sasson, Valérie; Quirion, Pierre-Olivier; Rioux, Pierre; Rousseau, Marc-Étienne; Evans, Alan C
2018-01-01
We present Boutiques, a system to automatically publish, integrate, and execute command-line applications across computational platforms. Boutiques applications are installed through software containers described in a rich and flexible JSON language. A set of core tools facilitates the construction, validation, import, execution, and publishing of applications. Boutiques is currently supported by several distinct virtual research platforms, and it has been used to describe dozens of applications in the neuroinformatics domain. We expect Boutiques to improve the quality of application integration in computational platforms, to reduce redundancy of effort, to contribute to computational reproducibility, and to foster Open Science. PMID:29718199
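To make the descriptor idea concrete, here is a minimal Python sketch of a container-backed command-line descriptor and the template substitution it enables. The field names and the tool are hypothetical illustrations in the general spirit of Boutiques descriptors, not the official schema; consult the Boutiques documentation for the authoritative format.

```python
import json

# Illustrative descriptor in the general shape of a Boutiques JSON
# descriptor; the field names here are an assumption, not the schema.
descriptor = json.loads("""
{
  "name": "example-smoother",
  "description": "Hypothetical image-smoothing tool",
  "command-line": "smooth [INPUT] [FWHM]",
  "container-image": {"type": "docker", "image": "example/smoother:1.0"},
  "inputs": [
    {"id": "input", "type": "File",   "value-key": "[INPUT]"},
    {"id": "fwhm",  "type": "Number", "value-key": "[FWHM]"}
  ]
}
""")

def render_command(desc, values):
    """Substitute concrete input values into the command-line template."""
    cmd = desc["command-line"]
    for inp in desc["inputs"]:
        cmd = cmd.replace(inp["value-key"], str(values[inp["id"]]))
    return cmd

print(render_command(descriptor, {"input": "scan.nii.gz", "fwhm": 6}))
# -> smooth scan.nii.gz 6
```

Because the descriptor, not the platform, carries the invocation details and the container image, any platform that can parse the JSON can validate inputs and launch the tool the same way.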
NASA Technical Reports Server (NTRS)
1994-01-01
This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in the areas of (1) applied and numerical mathematics, including numerical analysis and algorithm development; (2) theoretical and computational research in fluid mechanics in selected areas of interest, including acoustics and combustion; (3) experimental research in transition and turbulence and aerodynamics involving Langley facilities and scientists; and (4) computer science.
NASA Technical Reports Server (NTRS)
1994-01-01
This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period October 1, 1993 through March 31, 1994. The major categories of the current ICASE research program are: (1) applied and numerical mathematics, including numerical analysis and algorithm development; (2) theoretical and computational research in fluid mechanics in selected areas of interest to LaRC, including acoustics and combustion; (3) experimental research in transition and turbulence and aerodynamics involving LaRC facilities and scientists; and (4) computer science.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Saffer, Shelley
2014-12-01
This is the final report of DOE award DE-SC0001132, Advanced Artificial Science, which supported the development of an artificial science and engineering research infrastructure to facilitate innovative computational modeling, analysis, and application in interdisciplinary areas of scientific investigation. This document describes the achievement of the project's goals and the resulting research made possible by this award.
NASA Technical Reports Server (NTRS)
Huang, C. J.; Motard, R. L.
1978-01-01
The computing equipment in the engineering systems simulation laboratory of the University of Houston Cullen College of Engineering is described and its advantages are summarized. The application of computer techniques in aerospace-related research, in psychology, and in chemical, civil, electrical, industrial, and mechanical engineering is described in abstracts of 84 individual projects and in reprints of published reports. The research supports programs in acoustics, energy technology, systems engineering, and environmental management as well as aerospace engineering.
CSM research: Methods and application studies
NASA Technical Reports Server (NTRS)
Knight, Norman F., Jr.
1989-01-01
Computational mechanics is that discipline of applied science and engineering devoted to the study of physical phenomena by means of computational methods based on mathematical modeling and simulation, utilizing digital computers. The discipline combines theoretical and applied mechanics, approximation theory, numerical analysis, and computer science. Computational mechanics has had a major impact on engineering analysis and design. When applied to structural mechanics, the discipline is referred to herein as computational structural mechanics. Complex structures being considered by NASA for the 1990's include composite primary aircraft structures and the space station. These structures will be much more difficult to analyze than today's structures and necessitate a major upgrade in computerized structural analysis technology. NASA has initiated a research activity in structural analysis called Computational Structural Mechanics (CSM). The broad objective of the CSM activity is to develop advanced structural analysis technology that will exploit modern and emerging computers, such as those with vector and/or parallel processing capabilities. Here, the current research directions for the Methods and Application Studies Team of the Langley CSM activity are described.
Cloud computing applications for biomedical science: A perspective.
Navale, Vivek; Bourne, Philip E
2018-06-01
Biomedical research has become a digital data-intensive endeavor, relying on secure and scalable computing, storage, and network infrastructure, which has traditionally been purchased, supported, and maintained locally. For certain types of biomedical applications, cloud computing has emerged as an alternative to locally maintained traditional computing approaches. Cloud computing offers users pay-as-you-go access to services such as hardware infrastructure, platforms, and software for solving common biomedical computational problems. Cloud computing services offer secure on-demand storage and analysis and are differentiated from traditional high-performance computing by their rapid availability and scalability of services. As such, cloud services are engineered to address big data problems and enhance the likelihood of data and analytics sharing, reproducibility, and reuse. Here, we provide an introductory perspective on cloud computing to help the reader determine its value to their own research.
1983-09-01
Artificial Intelligence: An Analysis of Potential Applications to Training (AD-A133 592; AFHRL-TP-83-28). Interim report, Denver Research Institute, J. Richardson, September 1983. Keywords: artificial intelligence, military research, computer-aided diagnosis, performance tests.
ERIC Educational Resources Information Center
Forwood, Bruce S.
This bibliography has been produced as part of a research program attempting to develop a new approach to building environment and service systems design using computer-aided design techniques. As such it not only classifies available literature on the service systems themselves, but also contains sections on the application of computers and…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-05
... related to the development and application of cloud computing for people with disabilities. Cloud computing offers the potential to provide accommodations that enable people with disabilities to access...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-30
... activities related to the development and application of cloud computing for people with disabilities. Cloud computing offers the potential to provide accommodations that enable people with disabilities to access...
Commodity Cluster Computing for Remote Sensing Applications using Red Hat LINUX
NASA Technical Reports Server (NTRS)
Dorband, John
2003-01-01
Since 1994, we have been doing research at Goddard Space Flight Center on implementing a wide variety of applications on commodity-based computing clusters. This talk describes these clusters and how they are used in these applications, including ones for remote sensing.
An Analysis of Graduate Nursing Students' Innovation-Decision Process
Kacynski, Kathryn A.; Roy, Katrina D.
1984-01-01
This study's purpose was to examine the innovation-decision process used by graduate nursing students when deciding to use computer applications. Graduate nursing students enrolled in a mandatory research class were surveyed, before and after their use of a mainframe computer for beginning data analysis, about their general attitudes towards computers, individual characteristics such as "cosmopoliteness", and their desire to learn more about a computer application. It was expected that an experimental intervention (a videotaped demonstration of interactive video instruction of cardiopulmonary resuscitation (CPR)), previous computer experience, and the subject's "cosmopoliteness" would influence attitudes towards computers and the desire to learn more about a computer application.
Cloud computing in medical imaging.
Kagadis, George C; Kloukinas, Christos; Moore, Kevin; Philbin, Jim; Papadimitroulas, Panagiotis; Alexakos, Christos; Nagy, Paul G; Visvikis, Dimitris; Hendee, William R
2013-07-01
Over the past century technology has played a decisive role in defining, driving, and reinventing procedures, devices, and pharmaceuticals in healthcare. Cloud computing has been introduced only recently but is already one of the major topics of discussion in research and clinical settings. The provision of extensive, easily accessible, and reconfigurable resources such as virtual systems, platforms, and applications with low service cost has caught the attention of many researchers and clinicians. Healthcare researchers are moving their efforts to the cloud, because they need adequate resources to process, store, exchange, and use large quantities of medical data. This Vision 20/20 paper addresses major questions related to the applicability of advanced cloud computing in medical imaging. The paper also considers security and ethical issues that accompany cloud computing.
Summary of research in progress at ICASE
NASA Technical Reports Server (NTRS)
1993-01-01
This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period October 1, 1992 through March 31, 1993.
Real science at the petascale.
Saksena, Radhika S; Boghosian, Bruce; Fazendeiro, Luis; Kenway, Owain A; Manos, Steven; Mazzeo, Marco D; Sadiq, S Kashif; Suter, James L; Wright, David; Coveney, Peter V
2009-06-28
We describe computational science research that uses petascale resources to achieve scientific results at unprecedented scales and resolution. The applications span a wide range of domains, from investigation of fundamental problems in turbulence through computational materials science research to biomedical applications at the forefront of HIV/AIDS research and cerebrovascular haemodynamics. This work was mainly performed on the US TeraGrid 'petascale' resource, Ranger, at Texas Advanced Computing Center, in the first half of 2008 when it was the largest computing system in the world available for open scientific research. We have sought to use this petascale supercomputer optimally across application domains and scales, exploiting the excellent parallel scaling performance found on up to at least 32 768 cores for certain of our codes in the so-called 'capability computing' category as well as high-throughput intermediate-scale jobs for ensemble simulations in the 32-512 core range. Furthermore, this activity provides evidence that conventional parallel programming with MPI should be successful at the petascale in the short to medium term. We also report on the parallel performance of some of our codes on up to 65 536 cores on the IBM Blue Gene/P system at the Argonne Leadership Computing Facility, which has recently been named the fastest supercomputer in the world for open science.
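The two job classes mentioned here, single large capability runs and many independent ensemble members, map onto MPI in different ways. The toy sketch below, which assumes mpi4py is installed, illustrates the distinction; the per-rank "work" is a placeholder, not any of the codes described above.

```python
# Toy contrast between ensemble-mode and capability-mode MPI usage
# (assumes mpi4py; run with e.g. `mpiexec -n 8 python sketch.py`).
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

def run_member(seed):
    # Placeholder for one independent ensemble simulation.
    return sum((seed * i) % 7 for i in range(1000))

# Ensemble mode: each rank runs its own member; no communication is
# needed until the results are gathered at the end.
member_result = run_member(seed=rank)
all_results = comm.gather(member_result, root=0)

# Capability mode instead splits one global problem across all ranks
# and combines partial results, e.g. with a reduction:
partial = sum(range(rank, 10_000, size))      # this rank's slice of work
total = comm.reduce(partial, op=MPI.SUM, root=0)

if rank == 0:
    print(f"{size} ensemble members: {all_results}")
    print(f"capability-style global sum: {total}")   # 49995000
```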
ERIC Educational Resources Information Center
Congress of the U.S., Washington, DC. Senate Committee on Commerce, Science, and Transportation.
This committee report is intended to accompany S. 1067, a bill designed to provide for a coordinated federal research program in high-performance computing (HPC). The primary objective of the legislation is given as the acceleration of research, development, and application of the most advanced computing technology in research, education, and…
Computer Gaming at Every Age: A Comparative Evaluation of Alice
ERIC Educational Resources Information Center
Seals, Cheryl D.; McMillian, Yolanda; Rouse, Kenneth; Agarwal, Ravikant; Johnson, Andrea Williams; Gilbert, Juan E.; Chapman, Richard
2008-01-01
This research has two thrusts: teaching object-oriented programming to very young audiences, and increasing student excitement about computing applications, with the long-term goal of increasing involvement in technology classes, use of computer applications, and interest in technology careers. The goal of this work was to provide…
NASA Technical Reports Server (NTRS)
1992-01-01
Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, fluid mechanics including fluid dynamics, acoustics, and combustion, aerodynamics, and computer science during the period 1 Apr. 1992 - 30 Sep. 1992 is summarized.
Research on optimal control, stabilization and computational algorithms for aerospace applications
NASA Technical Reports Server (NTRS)
Athans, M.
1985-01-01
The research carried out in the areas of optimal control and estimation theory and its applications under this grant is reviewed. A listing of the 257 publications that document the research results is presented.
Research on optimal control, stabilization and computational algorithms for aerospace applications
NASA Technical Reports Server (NTRS)
Athans, M.
1984-01-01
The research carried out in the areas of optimal control and estimation theory and its applications under this grant is reviewed. A listing of the 257 publications that document the research results is presented.
Computer technology applications in industrial and organizational psychology.
Crespin, Timothy R; Austin, James T
2002-08-01
This article reviews computer applications developed and utilized by industrial-organizational (I-O) psychologists, both in practice and in research. A primary emphasis is on applications developed for Internet usage, because this "network of networks" changes the way I-O psychologists work. The review focuses on traditional and emerging topics in I-O psychology. The first topic involves information technology applications in measurement, defined broadly across levels of analysis (persons, groups, organizations) and domains (abilities, personality, attitudes). Discussion then focuses on individual learning at work, both in formal training and in coping with continual automation of work. A section on job analysis follows, illustrating the role of computers and the Internet in studying jobs. Shifting focus to the group level of analysis, we briefly review how information technology is being used to understand and support cooperative work. Finally, special emphasis is given to the emerging "third discipline" in I-O psychology research: computational modeling of behavioral events in organizations. Throughout this review, themes of innovation and dissemination underlie a continuum between research and practice. The review concludes by setting a framework for I-O psychology in a computerized and networked world.
NASA Technical Reports Server (NTRS)
Gillian, Ronnie E.; Lotts, Christine G.
1988-01-01
The Computational Structural Mechanics (CSM) Activity at Langley Research Center is developing methods for structural analysis on modern computers. To facilitate that research effort, an applications development environment has been constructed to insulate the researcher from the many computer operating systems of a widely distributed computer network. The CSM Testbed development system was ported to the Numerical Aerodynamic Simulator (NAS) Cray-2, at the Ames Research Center, to provide a high end computational capability. This paper describes the implementation experiences, the resulting capability, and the future directions for the Testbed on supercomputers.
1983-09-01
Report AI-TR-346, Artificial Intelligence Laboratory, Massachusetts Institute of Technology, Cambridge, Massachusetts, June 19… Testbed Coordinator, 415/859-4395, Artificial Intelligence Center, Computer Science and Technology Division. Prepared for: Defense Advanced Research Projects Agency. …to support processing of aerial photographs for such military applications as cartography, intelligence, weapon guidance, and targeting.
Project JOVE. [microgravity experiments and applications
NASA Technical Reports Server (NTRS)
Lyell, M. J.
1994-01-01
The goal of this project is to investigate new areas of research pertaining to free-surface and interface fluid mechanics and/or microgravity that have potential commercial applications. This paper presents an introduction to ferrohydrodynamics (FHD) and discusses some applications. Computational methods for solving free-surface flow problems are also presented in detail. Both have diverse applications in industry and in microgravity fluids applications. Three different modeling schemes for FHD flows are addressed and the governing equations, including Maxwell's equations, are introduced. In the area of computational modeling of free-surface flows, both Eulerian and Lagrangian schemes are discussed. The state of the art in computational methods applied to free-surface flows is elucidated; in particular, adaptive grids and re-zoning methods are discussed. Additional research results are addressed and copies of the publications produced under the JOVE Project are included.
Asymmetric Core Computing for U.S. Army High-Performance Computing Applications
2009-04-01
Table-of-contents excerpts: 1. Introduction; 2. Relevant Technologies; 3. Technical Approach; 4. Research and Development Highlights (4.1 Cell, 4.2 FPGAs). The report mentions the PlayStation 4 (should one be announced) and notes that reconfigurable computing refers to performing computations using Field Programmable Gate Arrays (FPGAs).
Adjoint-Based Aerodynamic Design of Complex Aerospace Configurations
NASA Technical Reports Server (NTRS)
Nielsen, Eric J.
2016-01-01
An overview of twenty years of adjoint-based aerodynamic design research at NASA Langley Research Center is presented. Adjoint-based algorithms provide a powerful tool for efficient sensitivity analysis of complex large-scale computational fluid dynamics (CFD) simulations. Unlike alternative approaches for which computational expense generally scales with the number of design parameters, adjoint techniques yield sensitivity derivatives of a simulation output with respect to all input parameters at the cost of a single additional simulation. With modern large-scale CFD applications often requiring millions of compute hours for a single analysis, the efficiency afforded by adjoint methods is critical in realizing a computationally tractable design optimization capability for such applications.
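The efficiency claim can be made precise with the standard discrete-adjoint identity; the notation below is generic rather than specific to the Langley solvers.

```latex
% Generic discrete-adjoint sensitivity derivation.
% State q satisfies the flow-residual equation R(q, x) = 0 for design
% variables x; the output of interest is J(q(x), x).
\begin{align*}
  \frac{dJ}{dx}
    &= \frac{\partial J}{\partial x}
     + \frac{\partial J}{\partial q}\,\frac{\partial q}{\partial x},
  \qquad\text{where}\qquad
  \frac{\partial R}{\partial q}\,\frac{\partial q}{\partial x}
    = -\,\frac{\partial R}{\partial x}. \\[4pt]
  \text{Solving }
  \left(\frac{\partial R}{\partial q}\right)^{\!\top}\!\lambda
    = -\left(\frac{\partial J}{\partial q}\right)^{\!\top}
  \;\Longrightarrow\;
  \frac{dJ}{dx}
    &= \frac{\partial J}{\partial x}
     + \lambda^{\top}\,\frac{\partial R}{\partial x}.
\end{align*}
% A single adjoint solve for lambda yields dJ/dx for every design
% variable at once, so the cost is roughly one extra flow solve
% regardless of dim(x).
```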
CSM Testbed Development and Large-Scale Structural Applications
NASA Technical Reports Server (NTRS)
Knight, Norman F., Jr.; Gillian, R. E.; Mccleary, Susan L.; Lotts, C. G.; Poole, E. L.; Overman, A. L.; Macy, S. C.
1989-01-01
A research activity called Computational Structural Mechanics (CSM) conducted at the NASA Langley Research Center is described. This activity is developing advanced structural analysis and computational methods that exploit high-performance computers. Methods are developed in the framework of the CSM Testbed software system and applied to representative complex structural analysis problems from the aerospace industry. An overview of the CSM Testbed methods development environment is presented and some new numerical methods developed on a CRAY-2 are described. Selected application studies performed on the NAS CRAY-2 are also summarized.
Applied Information Systems Research Program (AISRP). Workshop 2: Meeting Proceedings
NASA Technical Reports Server (NTRS)
1992-01-01
The Earth and space science participants were able to see where the current research can be applied in their disciplines and computer science participants could see potential areas for future application of computer and information systems research. The Earth and Space Science research proposals for the High Performance Computing and Communications (HPCC) program were under evaluation. Therefore, this effort was not discussed at the AISRP Workshop. OSSA's other high priority area in computer science is scientific visualization, with the entire second day of the workshop devoted to it.
The applications of computers in biological research
NASA Technical Reports Server (NTRS)
Wei, Jennifer
1988-01-01
Research in many fields could not be done without computers. There is often a great deal of technical data, even in the biological fields, that need to be analyzed. These data, unfortunately, previously absorbed much of every researcher's time. Now, due to the steady increase in computer technology, biological researchers are able to make incredible advances in their work without the added worries of tedious and difficult tasks such as the many mathematical calculations involved in today's research and health care.
Proposed Directions for Research in Computer-Based Education.
ERIC Educational Resources Information Center
Waugh, Michael L.
Several directions for potential research efforts in the field of computer-based education (CBE) are discussed. (For the purposes of this paper, CBE is defined as any use of computers to promote learning with no intended inference as to the specific nature or organization of the educational application under discussion.) Efforts should be directed…
The research of computer multimedia assistant in college English listening
NASA Astrophysics Data System (ADS)
Zhang, Qian
2012-04-01
With the development of networked information technology, education faces more and more serious challenges. Computer multimedia applications break with traditional foreign-language teaching and bring new challenges and opportunities to education. Through multimedia, the teaching process can draw on animation, images, voice, and text, which improves learners' initiative and engagement and greatly increases learning efficiency. Traditional foreign-language teaching relies on text alone; with that method, theoretical performance is good but practical application is weak. Even after long use of computer multimedia in foreign-language teaching, many teachers remain prejudiced against it, so the method has not achieved its full effect. For all these reasons, this research has significant implications for improving the quality of foreign-language teaching.
[Research on the Application of Fuzzy Logic to Systems Analysis and Control
NASA Technical Reports Server (NTRS)
1998-01-01
Research conducted with the support of NASA Grant NCC2-275 has focused mainly on the development of fuzzy logic and soft computing methodologies and their applications to systems analysis and control, with emphasis on problem areas relevant to NASA's missions. One of the principal results of our research has been the development of a new methodology called Computing with Words (CW). Basically, in CW, words drawn from a natural language are employed in place of numbers for computing and reasoning. There are two major imperatives for computing with words. First, computing with words is a necessity when the available information is too imprecise to justify the use of numbers; second, there may be a tolerance for imprecision which can be exploited to achieve tractability, robustness, low solution cost, and better rapport with reality. Exploitation of the tolerance for imprecision is an issue of central importance in CW.
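A minimal sketch of the fuzzy machinery underneath such a methodology, assuming simple triangular membership functions: a word like "low" denotes a fuzzy set over a numeric range rather than a single number. CW itself is a broader framework; this only illustrates the basic ingredient.

```python
# Minimal fuzzy-membership sketch: a linguistic term such as "low"
# denotes a fuzzy set over a numeric range, not a single number.
def triangular(a, b, c):
    """Return a triangular membership function peaking at b."""
    def mu(x):
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)
    return mu

# Hypothetical linguistic terms for a normalized error signal.
low  = triangular(-0.1, 0.0, 0.4)
high = triangular(0.3, 1.0, 1.1)

x = 0.35
print(f"mu_low({x}) = {low(x):.2f}, mu_high({x}) = {high(x):.2f}")
# A CW-style rule ("if error is low then gain is small") would combine
# such memberships instead of computing with exact numbers.
```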
NASA Astrophysics Data System (ADS)
Veltri, Pierangelo
The use of computer-based solutions for data management in biology and clinical science has helped to improve quality of life and to deliver research results in shorter time. Indeed, new algorithms and high-performance computation have been used in proteomics and genomics studies for treating chronic diseases (e.g., drug design) as well as for supporting clinicians both in diagnosis (e.g., image-based diagnosis) and in patient care (e.g., computer-based analysis of information gathered from patients). In this paper we survey examples of computer-based techniques applied in both biological and clinical contexts. The reported applications draw on experience with real cases at the University Medical School of Catanzaro and on the national project Staywell SH 2.0, which involves many research centers and companies aiming to study and improve citizen wellness.
Focus issue: series on computational and systems biology.
Gough, Nancy R
2011-09-06
The application of computational biology and systems biology is yielding quantitative insight into cellular regulatory phenomena. For the month of September, Science Signaling highlights research featuring computational approaches to understanding cell signaling and investigation of signaling networks, a series of Teaching Resources from a course in systems biology, and various other articles and resources relevant to the application of computational biology and systems biology to the study of signal transduction.
Final Report. Center for Scalable Application Development Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mellor-Crummey, John
2014-10-26
The Center for Scalable Application Development Software (CScADS) was established as a partnership between Rice University, Argonne National Laboratory, University of California Berkeley, University of Tennessee – Knoxville, and University of Wisconsin – Madison. CScADS pursued an integrated set of activities with the aim of increasing the productivity of DOE computational scientists by catalyzing the development of systems software, libraries, compilers, and tools for leadership computing platforms. Principal Center activities were workshops to engage the research community in the challenges of leadership computing, research and development of open-source software, and work with computational scientists to help them develop codes for leadership computing platforms. This final report summarizes CScADS activities at Rice University in these areas.
Human computer confluence applied in healthcare and rehabilitation.
Viaud-Delmon, Isabelle; Gaggioli, Andrea; Ferscha, Alois; Dunne, Stephen
2012-01-01
Human computer confluence (HCC) is an ambitious research program studying how the emerging symbiotic relation between humans and computing devices can enable radically new forms of sensing, perception, interaction, and understanding. It is an interdisciplinary field, bringing together researchers from fields as varied as pervasive computing, bio-signal processing, neuroscience, electronics, robotics, and virtual and augmented reality, and it provides great potential for applications in medicine and rehabilitation.
Computational and mathematical methods in brain atlasing.
Nowinski, Wieslaw L
2017-12-01
Brain atlases have a wide range of use from education to research to clinical applications. Mathematical methods as well as computational methods and tools play a major role in the process of brain atlas building and developing atlas-based applications. Computational methods and tools cover three areas: dedicated editors for brain model creation, brain navigators supporting multiple platforms, and atlas-assisted specific applications. Mathematical methods in atlas building and developing atlas-aided applications deal with problems in image segmentation, geometric body modelling, physical modelling, atlas-to-scan registration, visualisation, interaction and virtual reality. Here I overview computational and mathematical methods in atlas building and developing atlas-assisted applications, and share my contribution to and experience in this field.
TRAINING AND RESEARCH PROGRAM IN COMPUTER APPLICATIONS.
ERIC Educational Resources Information Center
HUNKA, S.
TO MAKE EDUCATIONAL RESEARCHERS AND TEACHERS MORE AWARE OF THE VALUES OF ELECTRONIC AUTOMATION, THIS ARTICLE PROPOSES A TRAINING-RESEARCH PROGRAM USING THE IBM 360/67 AND THE IBM 1500 COMPUTERS. PARTICIPANTS WOULD BE SELECTED FROM (1) POST-DOCTORAL AND PROFESSIONAL UNIVERSITY STAFF MEMBERS ON SABBATICAL LEAVE WHOSE MAIN INTEREST IS EDUCATIONAL…
omniClassifier: a Desktop Grid Computing System for Big Data Prediction Modeling
Phan, John H.; Kothari, Sonal; Wang, May D.
2016-01-01
Robust prediction models are important for numerous science, engineering, and biomedical applications. However, best-practice procedures for optimizing prediction models can be computationally complex, especially when choosing models from among hundreds or thousands of parameter choices. Computational complexity has further increased with the growth of data in these fields, concurrent with the era of "Big Data". Grid computing is a potential solution to the computational challenges of Big Data. Desktop grid computing, which uses idle CPU cycles of commodity desktop machines, coupled with commercial cloud computing resources, can enable research labs to gain easier and more cost-effective access to vast computing resources. We have developed omniClassifier, a multi-purpose prediction modeling application that provides researchers with a tool for conducting machine learning research within the guidelines of recommended best practices. omniClassifier is implemented as a desktop grid computing system using the Berkeley Open Infrastructure for Network Computing (BOINC) middleware. In addition to describing implementation details, we use various gene expression datasets to demonstrate the potential scalability of omniClassifier for efficient and robust Big Data prediction modeling. A prototype of omniClassifier can be accessed at http://omniclassifier.bme.gatech.edu/. PMID:27532062
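Model selection of this kind parallelizes naturally because each (parameter, fold) pair is an independent work unit, which is what makes it a good fit for desktop grids. The sketch below runs the work units locally and is purely illustrative; in the real system, a BOINC-style middleware would farm them out to volunteer machines.

```python
# Toy illustration of why model selection suits a desktop grid: each
# (parameter, fold) pair is an independent work unit. Here we run them
# sequentially; a grid would execute them concurrently on idle hosts.
from itertools import product

def evaluate(param, fold):
    # Placeholder for training/testing one model on one CV fold.
    return 0.5 + 0.01 * ((param * 31 + fold * 17) % 40)

params = range(1, 101)       # e.g., 100 regularization settings
folds = range(10)            # 10-fold cross-validation
work_units = list(product(params, folds))   # 1000 independent jobs

scores = {}
for p, f in work_units:
    scores.setdefault(p, []).append(evaluate(p, f))

best = max(scores, key=lambda p: sum(scores[p]) / len(scores[p]))
print(f"{len(work_units)} work units; best parameter: {best}")
```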
Computer Science Research at Langley
NASA Technical Reports Server (NTRS)
Voigt, S. J. (Editor)
1982-01-01
A workshop was held at Langley Research Center, November 2-5, 1981, to highlight ongoing computer science research at Langley and to identify additional areas of research based upon the computer user requirements. A panel discussion was held in each of nine application areas, and these are summarized in the proceedings. Slides presented by the invited speakers are also included. A survey of scientific, business, data reduction, and microprocessor computer users helped identify areas of focus for the workshop. Several areas of computer science which are of most concern to the Langley computer users were identified during the workshop discussions. These include graphics, distributed processing, programmer support systems and tools, database management, and numerical methods.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shin, Dongwan; Claycomb, William R.; Urias, Vincent E.
Cloud computing is a paradigm rapidly being embraced by government and industry as a solution for cost-savings, scalability, and collaboration. While a multitude of applications and services are available commercially for cloud-based solutions, research in this area has yet to fully embrace the full spectrum of potential challenges facing cloud computing. This tutorial aims to provide researchers with a fundamental understanding of cloud computing, with the goals of identifying a broad range of potential research topics, and inspiring a new surge in research to address current issues. We will also discuss real implementations of research-oriented cloud computing systems for both academia and government, including configuration options, hardware issues, challenges, and solutions.
The application of computer image analysis in life sciences and environmental engineering
NASA Astrophysics Data System (ADS)
Mazur, R.; Lewicki, A.; Przybył, K.; Zaborowicz, M.; Koszela, K.; Boniecki, P.; Mueller, W.; Raba, B.
2014-04-01
The main aim of the article is to present research on the application of computer image analysis in life sciences and environmental engineering. The authors used different methods of computer image analysis in developing an innovative biotest for modern biomonitoring of water quality. The tools created were based on live organisms, namely the bioindicators Lemna minor L. and Hydra vulgaris Pallas, combined with computer image analysis to assess negative reactions during the organisms' exposure to selected water toxicants. All of these methods belong to acute toxicity tests and are particularly essential in the ecotoxicological assessment of water pollutants. The developed bioassays can be used not only in scientific research but are also applicable in environmental engineering and agriculture for studying the adverse effects on water quality of various compounds used in agriculture and industry.
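As a flavor of the underlying technique, here is a minimal sketch, assuming grayscale frames stored as NumPy arrays, of a threshold-and-measure primitive on which such a bioassay could build; the actual endpoints for Lemna minor and Hydra vulgaris in the study are more elaborate.

```python
# Minimal sketch of a threshold-and-measure step, the kind of primitive
# a computer-image-analysis biotest builds on (assumes grayscale frames
# as NumPy arrays; real Lemna/Hydra endpoints are more elaborate).
import numpy as np

def organism_area_fraction(image, threshold):
    """Fraction of pixels darker than the threshold (putative organism)."""
    return (image < threshold).mean()

rng = np.random.default_rng(0)
before = rng.uniform(0.0, 1.0, size=(64, 64))    # control frame
after = np.clip(before + 0.2, 0.0, 1.0)          # "bleached" frame

change = (organism_area_fraction(before, 0.5)
          - organism_area_fraction(after, 0.5))
print(f"area-fraction decrease after exposure: {change:.2f}")
```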
Design and implementation of space physics multi-model application integration based on web
NASA Astrophysics Data System (ADS)
Jiang, Wenping; Zou, Ziming
With the development of research on the space environment and space science, providing an online network computing environment for space weather, space environment, and space physics models to the Chinese scientific community has become increasingly important. Currently, there are two software modes for a space physics multi-model application integrated system (SPMAIS): C/S and B/S. The C/S mode, which is traditional and stand-alone, demands a team or workshop drawn from many disciplines and specialties to build its own multi-model application integrated system, and requires the client to be deployed in different physical regions when users visit the integrated system. This requirement brings two shortcomings: it reduces the efficiency of researchers who use the models to compute, and it makes the data inconvenient to access. Therefore, it is necessary to create a shared network resource environment that lets users quickly access the computing resources of space physics models from a terminal, for conducting space science research and forecasting the space environment. The SPMAIS develops high-performance, first-principles computational models of the space environment in B/S mode and uses these models to predict "space weather", to understand space mission data, and to further our understanding of the solar system. The main goal of the SPMAIS is to provide an easy and convenient user-driven environment for operating models online. To date, the SPMAIS contains dozens of space environment models, including the international AP8/AE8, IGRF, and T96 models, as well as a solar proton prediction model, a geomagnetic transmission model, and other models developed by Chinese scientists. Another function of the SPMAIS is to integrate space observation data sets that supply input data for high-speed online model computation. In this paper, the service-oriented architecture (SOA) concept, which divides a system into independent modules according to business needs, is applied to solve the problem of the physical separation between the multiple models. The classic MVC (Model-View-Controller) software design pattern is used to build the architecture of the system, and JSP + servlet + JavaBean technology is used to integrate the web application programs of the models. This solves the problem of multiple users requesting the same model-computing job and effectively balances computing tasks across servers. In addition, we complete the following tasks: establishing a standard graphical user interface based on Java Applet application programs; designing the interface between model computation and the visualization of model results; realizing three-dimensional network visualization without plug-ins; using Java3D technology to achieve interaction with three-dimensional network scenes; and improving the ability to interact with web pages and execute dynamically, including rendering of three-dimensional graphics and control of fonts and color. Through the design and implementation of the web-based SPMAIS, we provide an online computing and application runtime environment for space physics multi-model applications. Practical application shows that researchers can benefit from the system in space physics research and engineering applications.
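As a rough illustration of the duplicate-job handling and server balancing described above, here is a small Python sketch; the actual system is built with JSP, servlets, and JavaBeans in Java, and all names here are hypothetical.

```python
# Illustrative dispatcher for a multi-model web backend: identical job
# requests are served from a shared cache so the same model run is not
# recomputed for every user, and new jobs go to the least-loaded server.
# Names are hypothetical; the real SPMAIS uses JSP/servlet (Java).
def run_model(model, params):
    return f"result of {model}{sorted(params.items())}"   # stand-in

class Dispatcher:
    def __init__(self, servers):
        self.load = {s: 0 for s in servers}
        self.cache = {}

    def submit(self, model, params):
        key = (model, tuple(sorted(params.items())))
        if key in self.cache:                        # duplicate request
            return self.cache[key]
        server = min(self.load, key=self.load.get)   # least-loaded host
        self.load[server] += 1
        result = run_model(model, params)            # would run remotely
        self.cache[key] = result
        return result

d = Dispatcher(["node-a", "node-b"])
print(d.submit("T96", {"kp": 3}))
print(d.submit("T96", {"kp": 3}))   # served from cache, no new load
print(d.load)                       # {'node-a': 1, 'node-b': 0}
```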
Join the Center for Applied Scientific Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gamblin, Todd; Bremer, Timo; Van Essen, Brian
The Center for Applied Scientific Computing serves as Livermore Lab’s window to the broader computer science, computational physics, applied mathematics, and data science research communities. In collaboration with academic, industrial, and other government laboratory partners, we conduct world-class scientific research and development on problems critical to national security. CASC applies the power of high-performance computing and the efficiency of modern computational methods to the realms of stockpile stewardship, cyber and energy security, and knowledge discovery for intelligence applications.
Using high-performance networks to enable computational aerosciences applications
NASA Technical Reports Server (NTRS)
Johnson, Marjory J.
1992-01-01
One component of the U.S. Federal High Performance Computing and Communications Program (HPCCP) is the establishment of a gigabit network to provide a communications infrastructure for researchers across the nation. This gigabit network will provide new services and capabilities, in addition to increased bandwidth, to enable future applications. An understanding of these applications is necessary to guide the development of the gigabit network and other high-performance networks of the future. In this paper we focus on computational aerosciences applications run remotely using the Numerical Aerodynamic Simulation (NAS) facility located at NASA Ames Research Center. We characterize these applications in terms of network-related parameters and relate user experiences that reveal limitations imposed by the current wide-area networking infrastructure. Then we investigate how the development of a nationwide gigabit network would enable users of the NAS facility to work in new, more productive ways.
A network-based distributed, media-rich computing and information environment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Phillips, R.L.
1995-12-31
Sunrise is a Los Alamos National Laboratory (LANL) project started in October 1993. It is intended to be a prototype National Information Infrastructure development project. A main focus of Sunrise is to tie together enabling technologies (networking, object-oriented distributed computing, graphical interfaces, security, multi-media technologies, and data-mining technologies) with several specific applications. A diverse set of application areas was chosen to ensure that the solutions developed in the project are as generic as possible. Some of the application areas are materials modeling, medical records and image analysis, transportation simulations, and K-12 education. This paper provides a description of Sunrise and a view of the architecture and objectives of this evolving project. The primary objectives of Sunrise are three-fold: (1) to develop common information-enabling tools for advanced scientific research and its applications to industry; (2) to enhance the capabilities of important research programs at the Laboratory; (3) to define a new way of collaboration between computer science and industrially-relevant research.
Generic Divide and Conquer Internet-Based Computing
NASA Technical Reports Server (NTRS)
Radenski, Atanas; Follen, Gregory J. (Technical Monitor)
2001-01-01
The rapid growth of internet-based applications and the proliferation of networking technologies have been transforming traditional commercial application areas as well as computer and computational sciences and engineering. This growth stimulates the exploration of new, internet-oriented software technologies that can open new research and application opportunities not only for the commercial world, but also for the scientific and high-performance computing applications community. The general goal of this research project is to contribute to better understanding of the transition to internet-based high-performance computing and to develop solutions for some of the difficulties of this transition. More specifically, our goal is to design an architecture for generic divide and conquer internet-based computing, to develop a portable implementation of this architecture, to create an example library of high-performance divide-and-conquer computing agents that run on top of this architecture, and to evaluate the performance of these agents. We have been designing an architecture that incorporates a master task-pool server and utilizes satellite computational servers that operate on the Internet in a dynamically changing large configuration of lower-end nodes provided by volunteer contributors. Our designed architecture is intended to be complementary to and accessible from computational grids such as Globus, Legion, and Condor. Grids provide remote access to existing high-end computing resources; in contrast, our goal is to utilize idle processor time of lower-end internet nodes. Our project is focused on a generic divide-and-conquer paradigm and its applications that operate on a loose and ever changing pool of lower-end internet nodes.
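A single-process sketch of the generic divide-and-conquer task-pool pattern follows; in the architecture described, the pool would live on the master server and each dequeue would be performed by a remote volunteer node. All names are illustrative, not the project's API.

```python
# Single-process sketch of a divide-and-conquer task pool; in the
# architecture described above, the queue would live on a master server
# and each dequeue would be a volunteer internet node claiming work.
from collections import deque

THRESHOLD = 4

def solve_directly(task):
    return sum(task)                  # base case: small enough to solve

def split(task):
    mid = len(task) // 2
    return task[:mid], task[mid:]     # divide into two subtasks

pool = deque([list(range(32))])       # master seeds the pool
partials = []
while pool:
    task = pool.popleft()             # a worker claims a task
    if len(task) <= THRESHOLD:
        partials.append(solve_directly(task))
    else:
        pool.extend(split(task))      # conquer by re-enqueueing subtasks

print(sum(partials))                  # combine step: prints 496
```

The pattern is generic because only `solve_directly` and `split` depend on the application; the pool management is the same for every divide-and-conquer agent.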
eXascale PRogramming Environment and System Software (XPRESS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chapman, Barbara; Gabriel, Edgar
Exascale systems, with a thousand times the compute capacity of today's leading edge petascale computers, are expected to emerge during the next decade. Their software systems will need to facilitate the exploitation of exceptional amounts of concurrency in applications, and ensure that jobs continue to run despite the occurrence of system failures and other kinds of hard and soft errors. Adapting computations at runtime to cope with changes in the execution environment, as well as to improve power and performance characteristics, is likely to become the norm. As a result, considerable innovation is required to develop system support to meet the needs of future computing platforms. The XPRESS project aims to develop and prototype a revolutionary software system for extreme-scale computing for both exascale and strong-scaled problems. The XPRESS collaborative research project will advance the state-of-the-art in high performance computing and enable exascale computing for current and future DOE mission-critical applications and supporting systems. The goals of the XPRESS research project are to: A. enable exascale performance capability for DOE applications, both current and future; B. develop and deliver a practical computing system software X-stack, OpenX, for future practical DOE exascale computing systems; and C. provide programming methods and environments for effective means of expressing application and system software for portable exascale system execution.
A breakthrough for experiencing and understanding simulated physics
NASA Technical Reports Server (NTRS)
Watson, Val
1988-01-01
The use of computer simulation in physics research is discussed, focusing on improvements to graphic workstations. Simulation capabilities and applications of enhanced visualization tools are outlined. The elements of an ideal computer simulation are presented and the potential for improving various simulation elements is examined. The interface between the human and the computer and simulation models are considered. Recommendations are made for changes in computer simulation practices and applications of simulation technology in education.
Workshops of the Fifth International Brain-Computer Interface Meeting: Defining the Future.
Huggins, Jane E; Guger, Christoph; Allison, Brendan; Anderson, Charles W; Batista, Aaron; Brouwer, Anne-Marie A-M; Brunner, Clemens; Chavarriaga, Ricardo; Fried-Oken, Melanie; Gunduz, Aysegul; Gupta, Disha; Kübler, Andrea; Leeb, Robert; Lotte, Fabien; Miller, Lee E; Müller-Putz, Gernot; Rutkowski, Tomasz; Tangermann, Michael; Thompson, David Edward
2014-01-01
The Fifth International Brain-Computer Interface (BCI) Meeting met June 3-7, 2013, at the Asilomar Conference Grounds, Pacific Grove, California. The conference included 19 workshops covering topics in brain-computer interface and brain-machine interface research. Topics included translation of BCIs into clinical use, standardization and certification, types of brain activity to use for BCI, recording methods, the effects of plasticity, special interest topics in BCI applications, and future BCI directions. BCI research is well established and transitioning to practical use to benefit people with physical impairments. At the same time, new applications are being explored, both for people with physical impairments and beyond. Here we provide summaries of each workshop, illustrating the breadth and depth of BCI research and highlighting important issues for future research and development.
Choosing a Computer Language for Institutional Research. The AIR Professional File No. 6.
ERIC Educational Resources Information Center
Strenglein, Denise
1980-01-01
It is suggested that much thought should be given to choosing an appropriate computer language for an institutional research office, considering the sophistication of the staff, types of planned application, size and type of computer, and availability of central programming support in the institution. For offices that prepare straight reports and…
Challenges and opportunities of cloud computing for atmospheric sciences
NASA Astrophysics Data System (ADS)
Pérez Montes, Diego A.; Añel, Juan A.; Pena, Tomás F.; Wallom, David C. H.
2016-04-01
Cloud computing is an emerging technological solution widely used in many fields. Initially developed as a flexible way of managing peak demand, it has begun to make its way into scientific research. One of the greatest advantages of cloud computing for scientific research is that a project no longer depends on access to a large local cyberinfrastructure. Cloud computing can avoid the maintenance expenses of large supercomputers and has the potential to 'democratize' access to high-performance computing, giving funding bodies flexibility in allocating budgets for the computational costs associated with a project. Two of the most challenging problems in atmospheric sciences are computational cost and uncertainty in meteorological forecasting and climate projections. Both problems are closely related: usually uncertainty can be reduced when computational resources are available to better reproduce a phenomenon or to perform a larger number of experiments. Here we present results of the application of cloud computing resources to climate modeling, using the cloud infrastructures of three major vendors and two climate models. We show how the cloud infrastructure compares in performance to traditional supercomputers and how it provides the capability to complete experiments in shorter periods of time. The associated monetary cost is also analyzed. Finally, we discuss the future potential of this technology for meteorological and climatological applications, from the point of view of both operational use and research.
Collaborative Working Architecture for IoT-Based Applications.
Mora, Higinio; Signes-Pont, María Teresa; Gil, David; Johnsson, Magnus
2018-05-23
The new sensing applications need enhanced computing capabilities to handle the requirements of complex and huge data processing. The Internet of Things (IoT) concept brings processing and communication features to devices. In addition, the Cloud Computing paradigm provides resources and infrastructures for performing the computations and outsourcing the work from the IoT devices. This scenario opens new opportunities for designing advanced IoT-based applications, however, there is still much research to be done to properly gear all the systems for working together. This work proposes a collaborative model and an architecture to take advantage of the available computing resources. The resulting architecture involves a novel network design with different levels which combines sensing and processing capabilities based on the Mobile Cloud Computing (MCC) paradigm. An experiment is included to demonstrate that this approach can be used in diverse real applications. The results show the flexibility of the architecture to perform complex computational tasks of advanced applications.
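The offloading decision implied by such an MCC-style design can be stated as a simple cost comparison; the sketch below uses invented cost terms purely for illustration and is much simpler than the multi-level architecture the paper proposes.

```python
# Toy offload decision for an MCC-style sensing architecture: run a task
# locally on the IoT device or ship it to the cloud, whichever is
# estimated to be cheaper. All cost terms are made-up placeholders.
def offload_to_cloud(task_ops, payload_bytes,
                     local_ops_per_s=1e7, cloud_ops_per_s=1e10,
                     uplink_bytes_per_s=1e5, rtt_s=0.05):
    local_time = task_ops / local_ops_per_s
    cloud_time = (rtt_s + payload_bytes / uplink_bytes_per_s
                  + task_ops / cloud_ops_per_s)
    return cloud_time < local_time

print(offload_to_cloud(task_ops=1e9, payload_bytes=2e4))  # True: heavy compute
print(offload_to_cloud(task_ops=1e5, payload_bytes=2e4))  # False: cheap locally
```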
Experimental Evaluation and Workload Characterization for High-Performance Computer Architectures
NASA Technical Reports Server (NTRS)
El-Ghazawi, Tarek A.
1995-01-01
This research is conducted in the context of the Joint NSF/NASA Initiative on Evaluation (JNNIE). JNNIE is an inter-agency research program that goes beyond typical benchmarking to provide in-depth evaluations and an understanding of the factors that limit the scalability of high-performance computing systems. Many NSF and NASA centers have participated in the effort. Our research effort was an integral part of implementing JNNIE in the context of the NASA ESS grand challenge applications. Our research work under this program was composed of three distinct, but related, activities: the evaluation of NASA ESS high-performance computing testbeds using the wavelet decomposition application; the evaluation of NASA ESS testbeds using astrophysical simulation applications; and the development of an experimental model for workload characterization for understanding workload requirements. In this report, we provide a summary of findings that covers all three parts, a list of the publications that resulted from this effort, and three appendices with the details of each of the studies, based on a key publication developed under the respective work.
Wei, Jyh-Da; Tsai, Ming-Hung; Lee, Gen-Cher; Huang, Jeng-Hung; Lee, Der-Tsai
2009-01-01
Algorithm visualization is a unique research topic that integrates engineering skills such as computer graphics, system programming, database management, computer networks, etc., to facilitate algorithmic researchers in testing their ideas, demonstrating new findings, and teaching algorithm design in the classroom. Within the broad applications of algorithm visualization, there still remain performance issues that deserve further research, e.g., system portability, collaboration capability, and animation effect in 3D environments. Using modern technologies of Java programming, we develop an algorithm visualization and debugging system, dubbed GeoBuilder, for geometric computing. The GeoBuilder system features Java's promising portability, engagement of collaboration in algorithm development, and automatic camera positioning for tracking 3D geometric objects. In this paper, we describe the design of the GeoBuilder system and demonstrate its applications.
Computational biology for cardiovascular biomarker discovery.
Azuaje, Francisco; Devaux, Yvan; Wagner, Daniel
2009-07-01
Computational biology is essential in the process of translating biological knowledge into clinical practice, as well as in the understanding of biological phenomena based on the resources and technologies originating from the clinical environment. One such key contribution of computational biology is the discovery of biomarkers for predicting clinical outcomes using 'omic' information. This process involves the predictive modelling and integration of different types of data and knowledge for screening, diagnostic or prognostic purposes. Moreover, this requires the design and combination of different methodologies based on statistical analysis and machine learning. This article introduces key computational approaches and applications to biomarker discovery based on different types of 'omic' data. Although we emphasize applications in cardiovascular research, the computational requirements and advances discussed here are also relevant to other domains. We will start by introducing some of the contributions of computational biology to translational research, followed by an overview of methods and technologies used for the identification of biomarkers with predictive or classification value. The main types of 'omic' approaches to biomarker discovery will be presented with specific examples from cardiovascular research. This will include a review of computational methodologies for single-source and integrative data applications. Major computational methods for model evaluation will be described together with recommendations for reporting models and results. We will present recent advances in cardiovascular biomarker discovery based on the combination of gene expression and functional network analyses. The review will conclude with a discussion of key challenges for computational biology, including perspectives from the biosciences and clinical areas.
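To anchor the evaluation points above, here is a compact sketch of cross-validated assessment of a sparse classifier on synthetic 'omic'-style data using scikit-learn; it illustrates the reporting discipline discussed (cross-validated performance with variability), not the authors' own pipeline, and assumes scikit-learn is installed.

```python
# Compact sketch of cross-validated biomarker-panel evaluation on
# synthetic "omic"-style data; an illustration of the evaluation
# discipline discussed, not the authors' pipeline (assumes scikit-learn).
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# 200 "patients", 500 expression features, only a few truly informative.
X, y = make_classification(n_samples=200, n_features=500,
                           n_informative=10, random_state=0)

# L1 penalty induces a sparse "panel" of selected features.
model = make_pipeline(StandardScaler(),
                      LogisticRegression(penalty="l1", solver="liblinear"))
scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
print(f"cross-validated AUC: {scores.mean():.2f} +/- {scores.std():.2f}")
```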
Network and computing infrastructure for scientific applications in Georgia
NASA Astrophysics Data System (ADS)
Kvatadze, R.; Modebadze, Z.
2016-09-01
The status of the network and computing infrastructure and the services available to the research and education community of Georgia are presented. The Research and Educational Networking Association - GRENA provides Internet connectivity, network services, cyber security, and technical support. Computing resources used by the research teams are located at GRENA and at major state universities. The GE-01-GRENA site is included in the European Grid infrastructure. The paper also contains information about the programs of the Learning Center and the research and development projects in which GRENA is participating.
G2LC: Resources Autoscaling for Real Time Bioinformatics Applications in IaaS.
Hu, Rongdong; Liu, Guangming; Jiang, Jingfei; Wang, Lixin
2015-01-01
Cloud computing has started to change the way bioinformatics research is carried out. Researchers who have taken advantage of this technology can process larger amounts of data and speed up scientific discovery. The variability in data volume results in variable computing requirements; therefore, bioinformatics researchers are pursuing more reliable and efficient methods for conducting sequencing analyses. This paper proposes an automated resource provisioning method, G2LC, for bioinformatics applications in IaaS. It enables applications to output results in real time. Its main purpose is to guarantee application performance while improving resource utilization. Real sequence searching data from BLAST is used to evaluate the effectiveness of G2LC. Experimental results show that G2LC guarantees application performance while saving up to 20.14% of resources.
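G2LC's algorithm is not reproduced in the abstract; as a hedged sketch of the general autoscaling idea it refines (a plain threshold rule, with invented thresholds and without G2LC's real-time guarantees), consider:

```python
def plan_vm_count(current_vms, utilization, low=0.3, high=0.8):
    """Threshold-based scaling: add a VM when utilization threatens
    performance, release one when resources are being wasted."""
    if utilization > high:
        return current_vms + 1            # scale out to protect performance
    if utilization < low and current_vms > 1:
        return current_vms - 1            # scale in to save resources
    return current_vms                    # hold steady inside the band

vms = 4
for load in [0.85, 0.90, 0.75, 0.50, 0.25, 0.20]:
    vms = plan_vm_count(vms, load)
    print("load=%.2f -> vms=%d" % (load, vms))
```

Even this toy rule exhibits the trade-off the paper quantifies: scaling out fast enough to keep performance bounded while scaling in to cut cost.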
NASA Technical Reports Server (NTRS)
Bailey, F. R.; Kutler, Paul
1988-01-01
Discussed are the capabilities of NASA's Numerical Aerodynamic Simulation (NAS) Program and its application as an advanced supercomputing system for computational fluid dynamics (CFD) research. First, the paper describes the NAS computational system, called the NAS Processing System Network, and the advanced computational capabilities it offers as a consequence of carrying out the NAS pathfinder objective. Second, it presents examples of pioneering CFD research accomplished during NAS's first operational year. Examples are included which illustrate CFD applications for predicting fluid phenomena, complementing and supplementing experimentation, and aiding in design. Finally, pacing elements and future directions for CFD and NAS are discussed.
Advances in Computational Capabilities for Hypersonic Flows
NASA Technical Reports Server (NTRS)
Kumar, Ajay; Gnoffo, Peter A.; Moss, James N.; Drummond, J. Philip
1997-01-01
The paper reviews the growth and advances in computational capabilities for hypersonic applications over the period from the mid-1980's to the present day. The current status of the code development issues such as surface and field grid generation, algorithms, physical and chemical modeling, and validation is provided. A brief description of some of the major codes being used at NASA Langley Research Center for hypersonic continuum and rarefied flows is provided, along with their capabilities and deficiencies. A number of application examples are presented, and future areas of research to enhance accuracy, reliability, efficiency, and robustness of computational codes are discussed.
A Decade of Neural Networks: Practical Applications and Prospects
NASA Technical Reports Server (NTRS)
Kemeny, Sabrina E.
1994-01-01
The Jet Propulsion Laboratory Neural Network Workshop, sponsored by NASA and DOD, brings together sponsoring agencies, active researchers, and the user community to formulate a vision for the next decade of neural network research and application prospects. While the speed and computing power of microprocessors continue to grow at an ever-increasing pace, the demand to intelligently and adaptively deal with the complex, fuzzy, and often ill-defined world around us remains to a large extent unaddressed. Powerful, highly parallel computing paradigms such as neural networks promise to have a major impact in addressing these needs. Papers in the workshop proceedings highlight benefits of neural networks in real-world applications compared to conventional computing techniques. Topics include fault diagnosis, pattern recognition, and multiparameter optimization.
A comprehensive overview of the applications of artificial life.
Kim, Kyung-Joong; Cho, Sung-Bae
2006-01-01
We review the applications of artificial life (ALife), the creation of synthetic life on computers to study, simulate, and understand living systems. The definition and features of ALife are shown through application studies. ALife application fields treated include robot control, robot manufacturing, practical robots, computer graphics, natural phenomenon modeling, entertainment, games, music, economics, the Internet, information processing, industrial design, simulation software, electronics, security, data mining, and telecommunications. To show the status of ALife application research, this review primarily surveys about 180 ALife application articles rather than presenting a selected few. Evolutionary computation is the most popular method for designing such applications, but recently swarm intelligence, artificial immune networks, and agent-based modeling have also produced results. Applications were initially restricted to robotics and computer graphics, but at present many different applications in engineering areas are of interest.
Semantic computing and language knowledge bases
NASA Astrophysics Data System (ADS)
Wang, Lei; Wang, Houfeng; Yu, Shiwen
2017-09-01
With the semantic Web proposed as the next-generation Web, semantic computing has been drawing increasing attention in both academia and industry. A great deal of research has been conducted on the theory and methodology of the subject, and potential applications have been investigated and proposed in many fields. The progress of semantic computing made so far cannot be separated from its supporting pivot: language resources, for instance, language knowledge bases. This paper proposes three perspectives on semantic computing from a macro view and describes the current state of construction of language knowledge bases and the related research and applications carried out on the basis of these resources, via a case study in the Institute of Computational Linguistics at Peking University.
Computational fluid dynamics research
NASA Technical Reports Server (NTRS)
Chandra, Suresh; Jones, Kenneth; Hassan, Hassan; Mcrae, David Scott
1992-01-01
The focus of research in the computational fluid dynamics (CFD) area is twofold: (1) to develop new approaches for turbulence modeling so that high-speed compressible flows can be studied for applications to entry and re-entry flows; and (2) to perform research to improve CFD algorithm accuracy and efficiency for high-speed flows. Research activities, faculty and student participation, publications, and financial information are outlined.
Affective medicine. A review of affective computing efforts in medical informatics.
Luneski, A; Konstantinidis, E; Bamidis, P D
2010-01-01
Affective computing (AC) is concerned with emotional interactions performed with and through computers. It is defined as "computing that relates to, arises from, or deliberately influences emotions". AC enables investigation and understanding of the relation between human emotions and health, as well as the application of assistive and useful technologies in the medical domain. The aims are: 1) to review the general state of the art in AC and its applications in medicine, and 2) to establish synergies between the research communities of AC and medical informatics. Aspects related to the human affective state as a determinant of human health are discussed, coupled with an illustration of significant AC research and related literature output. Moreover, affective communication channels are described and their range of application fields is explored through illustrative examples. The conferences, European research projects, and research publications presented illustrate the recent increase of interest in the AC area by the medical community. Tele-home healthcare, AmI, ubiquitous monitoring, e-learning, and virtual communities with emotionally expressive characters for elderly or impaired people are a few areas where the potential of AC has been realized and applications have emerged. A number of gaps can potentially be overcome through the synergy of AC and medical informatics. The application of AC technologies parallels the advancement of the existing state of the art and the introduction of new methods. The amount of work and the projects reviewed in this paper point to an ambitious and optimistic synergistic future for the affective medicine field.
Elementary EFL Teachers' Computer Phobia and Computer Self-Efficacy in Taiwan
ERIC Educational Resources Information Center
Chen, Kate Tzuching
2012-01-01
The advent and application of computer and information technology has increased the overall success of EFL teaching; however, such success is hard to assess, and teachers prone to computer avoidance face negative consequences. Two major obstacles are high computer phobia and low computer self-efficacy. However, little research has been carried out…
ERIC Educational Resources Information Center
Tataw, Oben Moses
2013-01-01
Interdisciplinary research in computer science requires the development of computational techniques for practical application in different domains. This usually requires careful integration of different areas of technical expertise. This dissertation presents image and time series analysis algorithms, with practical interdisciplinary applications…
Computer-Mediated Communication and Virtual Groups: Applications to Interethnic Conflict
ERIC Educational Resources Information Center
Walther, Joseph B.
2009-01-01
This essay concerns applications of computer-mediated communication (CMC) research in groups toward the enhancement of relations between members of potentially hostile ethnopolitical groups. The characteristics of CMC offer several possible means of facilitating the reduction of animosity through online contact among intergroup constituents. The…
A characterization of workflow management systems for extreme-scale applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ferreira da Silva, Rafael; Filgueira, Rosa; Pietri, Ilia
We present that the automation of the execution of computational tasks is at the heart of improving scientific productivity. Over the last years, scientific workflows have been established as an important abstraction that captures data processing and computation of large and complex scientific applications. By allowing scientists to model and express entire data processing steps and their dependencies, workflow management systems relieve scientists from the details of an application and manage its execution on a computational infrastructure. As the resource requirements of today's computational and data science applications that process vast amounts of data keep increasing, there is a compelling case for a new generation of advances in high-performance computing, commonly termed as extreme-scale computing, which will bring forth multiple challenges for the design of workflow applications and management systems. This paper presents a novel characterization of workflow management systems using features commonly associated with extreme-scale computing applications. We classify 15 popular workflow management systems in terms of workflow execution models, heterogeneous computing environments, and data access methods. Finally, the paper also surveys workflow applications and identifies gaps for future research on the road to extreme-scale workflows and management systems.
Huggins, Jane E.; Guger, Christoph; Ziat, Mounia; Zander, Thorsten O.; Taylor, Denise; Tangermann, Michael; Soria-Frisch, Aureli; Simeral, John; Scherer, Reinhold; Rupp, Rüdiger; Ruffini, Giulio; Robinson, Douglas K. R.; Ramsey, Nick F.; Nijholt, Anton; Müller-Putz, Gernot; McFarland, Dennis J.; Mattia, Donatella; Lance, Brent J.; Kindermans, Pieter-Jan; Iturrate, Iñaki; Herff, Christian; Gupta, Disha; Do, An H.; Collinger, Jennifer L.; Chavarriaga, Ricardo; Chase, Steven M.; Bleichner, Martin G.; Batista, Aaron; Anderson, Charles W.; Aarnoutse, Erik J.
2017-01-01
The Sixth International Brain–Computer Interface (BCI) Meeting was held 30 May–3 June 2016 at the Asilomar Conference Grounds, Pacific Grove, California, USA. The conference included 28 workshops covering topics in BCI and brain–machine interface research. Topics included BCI for specific populations or applications, advancing BCI research through use of specific signals or technological advances, and translational and commercial issues to bring both implanted and non-invasive BCIs to market. BCI research is growing and expanding in the breadth of its applications, the depth of knowledge it can produce, and the practical benefit it can provide both for those with physical impairments and the general public. Here we provide summaries of each workshop, illustrating the breadth and depth of BCI research and highlighting important issues and calls for action to support future research and development. PMID:29152523
Small Computer Applications for Base Supply.
1984-03-01
Research on small computer utilization at base-level organizations. This research effort studies whether small computers and commercial software can assist...
Computational neuroscience across the lifespan: Promises and pitfalls.
van den Bos, Wouter; Bruckner, Rasmus; Nassar, Matthew R; Mata, Rui; Eppinger, Ben
2017-10-13
In recent years, the application of computational modeling in studies on age-related changes in decision making and learning has gained in popularity. One advantage of computational models is that they provide access to latent variables that cannot be directly observed from behavior. In combination with experimental manipulations, these latent variables can help to test hypotheses about age-related changes in behavioral and neurobiological measures at a level of specificity that is not achievable with descriptive analysis approaches alone. This level of specificity can in turn be beneficial to establish the identity of the corresponding behavioral and neurobiological mechanisms. In this paper, we will illustrate applications of computational methods using examples of lifespan research on risk taking, strategy selection and reinforcement learning. We will elaborate on problems that can occur when computational neuroscience methods are applied to data of different age groups. Finally, we will discuss potential targets for future applications and outline general shortcomings of computational neuroscience methods for research on human lifespan development. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
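As a minimal illustration of the latent-variable point, the sketch below simulates a delta-rule learner whose learning rate cannot be read directly from choices but shapes the whole learning curve; the model and numbers are generic textbook assumptions, not taken from the article:

```python
def simulate_learner(rewards, alpha):
    """Delta-rule value update: the learning rate alpha is a latent
    variable that must be inferred from behavior by model fitting."""
    value, trace = 0.0, []
    for r in rewards:
        value += alpha * (r - value)   # prediction-error update
        trace.append(round(value, 3))
    return trace

rewards = [1, 1, 0, 1, 0, 0, 1, 1]
print(simulate_learner(rewards, alpha=0.1))  # slow, stable learner
print(simulate_learner(rewards, alpha=0.7))  # fast, volatile learner
```

Fitting alpha separately for different age groups is one way such models expose age-related changes that descriptive analyses of raw choice data would miss.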
Semiannual report, 1 April - 30 September 1991
NASA Technical Reports Server (NTRS)
1991-01-01
The major categories of the current Institute for Computer Applications in Science and Engineering (ICASE) research program are: (1) numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; (2) control and parameter identification problems, with emphasis on effective numerical methods; (3) computational problems in engineering and the physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and (4) computer systems and software for parallel computers. Research in these areas is discussed.
Computational methods for 2D materials: discovery, property characterization, and application design
NASA Astrophysics Data System (ADS)
Paul, J. T.; Singh, A. K.; Dong, Z.; Zhuang, H.; Revard, B. C.; Rijal, B.; Ashton, M.; Linscheid, A.; Blonsky, M.; Gluhovic, D.; Guo, J.; Hennig, R. G.
2017-11-01
The discovery of two-dimensional (2D) materials comes at a time when computational methods are mature and can predict novel 2D materials, characterize their properties, and guide the design of 2D materials for applications. This article reviews the recent progress in computational approaches for 2D materials research. We discuss the computational techniques and provide an overview of the ongoing research in the field. We begin with an overview of known 2D materials, common computational methods, and available cyber infrastructures. We then move onto the discovery of novel 2D materials, discussing the stability criteria for 2D materials, computational methods for structure prediction, and interactions of monolayers with electrochemical and gaseous environments. Next, we describe the computational characterization of the 2D materials’ electronic, optical, magnetic, and superconducting properties and the response of the properties under applied mechanical strain and electrical fields. From there, we move on to discuss the structure and properties of defects in 2D materials, and describe methods for 2D materials device simulations. We conclude by providing an outlook on the needs and challenges for future developments in the field of computational research for 2D materials.
Three Decades of Research on Computer Applications in Health Care
Michael Fitzmaurice, J.; Adams, Karen; Eisenberg, John M.
2002-01-01
The Agency for Healthcare Research and Quality and its predecessor organizations—collectively referred to here as AHRQ—have a productive history of funding research and development in the field of medical informatics, with grant investments since 1968 totaling $107 million. Many computerized interventions that are commonplace today, such as drug interaction alerts, had their genesis in early AHRQ initiatives. This review provides a historical perspective on AHRQ investment in medical informatics research. It shows that grants provided by AHRQ resulted in achievements that include advancing automation in the clinical laboratory and radiology, assisting in technology development (computer languages, software, and hardware), evaluating the effectiveness of computer-based medical information systems, facilitating the evolution of computer-aided decision making, promoting computer-initiated quality assurance programs, backing the formation and application of comprehensive data banks, enhancing the management of specific conditions such as HIV infection, and supporting health data coding and standards initiatives. Other federal agencies and private organizations have also supported research in medical informatics, some earlier and to a greater degree than AHRQ. The results and relative roles of these related efforts are beyond the scope of this review. PMID:11861630
Workshops of the Fifth International Brain-Computer Interface Meeting: Defining the Future
Huggins, Jane E.; Guger, Christoph; Allison, Brendan; Anderson, Charles W.; Batista, Aaron; Brouwer, Anne-Marie (A.-M.); Brunner, Clemens; Chavarriaga, Ricardo; Fried-Oken, Melanie; Gunduz, Aysegul; Gupta, Disha; Kübler, Andrea; Leeb, Robert; Lotte, Fabien; Miller, Lee E.; Müller-Putz, Gernot; Rutkowski, Tomasz; Tangermann, Michael; Thompson, David Edward
2014-01-01
The Fifth International Brain-Computer Interface (BCI) Meeting was held June 3–7, 2013 at the Asilomar Conference Grounds, Pacific Grove, California. The conference included 19 workshops covering topics in brain-computer interface and brain-machine interface research. Topics included translation of BCIs into clinical use, standardization and certification, types of brain activity to use for BCI, recording methods, the effects of plasticity, special interest topics in BCI applications, and future BCI directions. BCI research is well established and transitioning to practical use to benefit people with physical impairments. At the same time, new applications are being explored, both for people with physical impairments and beyond. Here we provide summaries of each workshop, illustrating the breadth and depth of BCI research and highlighting important issues for future research and development. PMID:25485284
Formalisms for user interface specification and design
NASA Technical Reports Server (NTRS)
Auernheimer, Brent J.
1989-01-01
The application of formal methods to the specification and design of human-computer interfaces is described. A broad outline of human-computer interface problems, a description of the field of cognitive engineering and two relevant research results, the appropriateness of formal specification techniques, and potential NASA application areas are described.
On October 25 and 26, 1984, the U.S. EPA sponsored a workshop to consider the potential applications of the techniques of computational biological chemistry to problems in environmental health. Eleven extramural scientists from the various related disciplines and a similar number...
NASA Astrophysics Data System (ADS)
Wan, Junwei; Chen, Hongyan; Zhao, Jing
2017-08-01
To meet the real-time, reliability, and safety requirements of aerospace experiments, a single-center cloud computing application verification platform was constructed. At the IaaS level, the feasibility of applying cloud computing technology to the field of aerospace experiments is tested and verified. Based on the analysis of the test results, a preliminary conclusion is obtained: a cloud computing platform can serve compute-intensive aerospace experiment workloads, while for I/O-intensive workloads traditional physical machines are recommended.
A comparative analysis of soft computing techniques for gene prediction.
Goel, Neelam; Singh, Shailendra; Aseri, Trilok Chand
2013-07-01
The rapid growth of genomic sequence data for both human and nonhuman species has made analyzing these sequences, especially predicting genes in them, very important, and this is currently the focus of many research efforts. Besides its scientific interest to the molecular biology and genomics community, gene prediction is of considerable importance in human health and medicine. A variety of gene prediction techniques have been developed for eukaryotes over the past few years. This article reviews and analyzes the application of certain soft computing techniques in gene prediction. First, the problem of gene prediction and its challenges are described, followed by a presentation of different soft computing techniques and their application to gene prediction. In addition, a comparative analysis of different soft computing techniques for gene prediction is given. Finally, some limitations of current research activities and future research directions are provided. Copyright © 2013 Elsevier Inc. All rights reserved.
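To make the gene-prediction framing concrete, here is a deliberately toy sketch (synthetic sequences, an invented feature set, and a plain logistic model standing in for the fuzzy and neural techniques the review covers):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def composition(seq):
    """Simple window features: base fractions plus GC content."""
    n = len(seq)
    return [seq.count(b) / n for b in "ACGT"] + [(seq.count("G") + seq.count("C")) / n]

coding = ["ATGGCGGCGCTG", "ATGGGCGCCCTG", "ATGCCGGGCGTG"]     # toy "exon" windows
noncoding = ["ATATATATTTAA", "TTTTAAAATATA", "AATATTTTATAT"]  # toy "intergenic" windows
X = np.array([composition(s) for s in coding + noncoding])
y = np.array([1] * len(coding) + [0] * len(noncoding))

clf = LogisticRegression().fit(X, y)
print(clf.predict([composition("ATGGCCGCGTTG")]))  # 1 = predicted coding-like
```

Real gene finders operate on far richer signals (codon usage, splice sites, conservation), which is where the soft computing techniques compared in the article earn their keep.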
International Symposium on Grids and Clouds (ISGC) 2016
NASA Astrophysics Data System (ADS)
The International Symposium on Grids and Clouds (ISGC) 2016 will be held at Academia Sinica in Taipei, Taiwan from 13-18 March 2016, with co-located events and workshops. The conference is hosted by the Academia Sinica Grid Computing Centre (ASGC). The theme of ISGC 2016 is "Ubiquitous e-Infrastructures and Applications". Contemporary research is impossible without a strong IT component - researchers rely on the existence of stable and widely available e-infrastructures and their higher level functions and properties. As a result of these expectations, e-infrastructures are becoming ubiquitous, providing an environment that supports large-scale collaborations that deal with global challenges, as well as smaller and temporary research communities focusing on particular scientific problems. To support those diversified communities and their needs, the e-infrastructures themselves are becoming more layered and multifaceted, supporting larger groups of applications. Following last year's conference, ISGC 2016 continues its aim to bring together users and application developers with those responsible for the development and operation of multi-purpose ubiquitous e-infrastructures. Topics of discussion include Physics (including HEP) and Engineering Applications, Biomedicine & Life Sciences Applications, Earth & Environmental Sciences & Biodiversity Applications, Humanities, Arts, and Social Sciences (HASS) Applications, Virtual Research Environments (including middleware, tools, services, workflows, etc.), Data Management, Big Data, Networking & Security, Infrastructure & Operations, Infrastructure Clouds and Virtualisation, Interoperability, Business Models & Sustainability, Highly Distributed Computing Systems, and High Performance & Technical Computing (HPTC).
Applications of Adaptive Quantum Control to Research Questions in Solar Energy Conversion
DOE Office of Scientific and Technical Information (OSTI.GOV)
Damrauer, Niels
2017-02-07
This award supported a broad research effort at the University of Colorado at Boulder comprising synthesis, applications of computational chemistry, development of theory, exploration of material properties, and advancement of spectroscopic tools including femtosecond pulse shaping techniques. It funded six graduate students and two postdoctoral researchers.
Review of computational fluid dynamics (CFD) researches on nano fluid flow through micro channel
NASA Astrophysics Data System (ADS)
Dewangan, Satish Kumar
2018-05-01
Nanofluids are becoming promising heat transfer fluids due to their improved thermo-physical properties and heat transfer performance. Microchannel heat transfer has potential application in cooling high-power-density microchips in CPU systems, micro power systems, and many such miniature thermal systems that need advanced cooling capacity. The use of nanofluids enhances the effectiveness of such systems. Computational Fluid Dynamics (CFD) is a very powerful tool for the computational analysis of various physical processes, and its application to flow and heat transfer analysis of nanofluids is advancing rapidly. The present paper gives a brief account of the methodology of CFD and summarizes its application to nanofluid flow and heat transfer in microchannel cases.
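As a pocket-sized example of the CFD methodology the paper outlines, the sketch below marches a 1D transient heat conduction problem with an explicit finite-difference scheme; the diffusivity values are assumptions chosen only to contrast a base fluid with a nanofluid-like enhancement:

```python
def diffuse(temps, alpha, dx, dt, steps):
    """Explicit finite-difference march of the 1D heat equation,
    with fixed-temperature boundary nodes."""
    r = alpha * dt / dx**2
    assert r <= 0.5, "explicit scheme is unstable for r > 0.5"
    t = list(temps)
    for _ in range(steps):
        t = [t[0]] + [t[i] + r * (t[i+1] - 2*t[i] + t[i-1])
                      for i in range(1, len(t) - 1)] + [t[-1]]
    return [round(v, 1) for v in t]

channel = [100.0] + [20.0] * 8 + [100.0]   # hot walls, cool core (deg C)
print(diffuse(channel, alpha=1.4e-7, dx=1e-3, dt=2.0, steps=50))  # base fluid
print(diffuse(channel, alpha=2.0e-7, dx=1e-3, dt=2.0, steps=50))  # "nanofluid"
```

Production nanofluid studies solve the coupled momentum and energy equations in 3D microchannel geometries, but the discretize-and-march structure is the same.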
Computer graphics testbed to simulate and test vision systems for space applications
NASA Technical Reports Server (NTRS)
Cheatham, John B.
1991-01-01
Research activity has shifted from computer graphics and vision systems to the broader scope of applying concepts of artificial intelligence to robotics. Specifically, the research is directed toward developing Artificial Neural Networks, Expert Systems, and Laser Imaging Techniques for Autonomous Space Robots.
Implementation of cloud computing in higher education
NASA Astrophysics Data System (ADS)
Asniar; Budiawan, R.
2016-04-01
Cloud computing research is a new trend in distributed computing, where people have developed services and SOA (Service Oriented Architecture) based applications. This technology is very useful to implement, especially for higher education. This research studies the need for and feasibility of cloud computing in higher education, and then proposes a model of cloud computing services for higher education in Indonesia that can be implemented to support academic activities. A literature study is used as the research methodology to arrive at the proposed model. Finally, SaaS and IaaS are the cloud computing services proposed for implementation in higher education in Indonesia, and a hybrid cloud is the recommended service model.
Lewis Structures Technology, 1988. Volume 2: Structural Mechanics
NASA Technical Reports Server (NTRS)
1988-01-01
The Lewis Structures Division performs and disseminates results of research conducted in support of aerospace engine structures. These results have a wide range of applicability to practitioners of structural engineering mechanics beyond the aerospace arena. The engineering community was familiarized with the depth and range of research performed by the division and its academic and industrial partners. Sessions covered vibration control, fracture mechanics, ceramic component reliability, parallel computing, nondestructive evaluation, constitutive models and experimental capabilities, dynamic systems, fatigue and damage, wind turbines, hot section technology (HOST), aeroelasticity, structural mechanics codes, computational methods for dynamics, structural optimization, applications of structural dynamics, and structural mechanics computer codes.
Computer graphics and the graphic artist
NASA Technical Reports Server (NTRS)
Taylor, N. L.; Fedors, E. G.; Pinelli, T. E.
1985-01-01
A centralized computer graphics system is being developed at the NASA Langley Research Center. This system was required to satisfy multiuser needs, ranging from presentation quality graphics prepared by a graphic artist to 16-mm movie simulations generated by engineers and scientists. While the major thrust of the central graphics system was directed toward engineering and scientific applications, hardware and software capabilities to support the graphic artists were integrated into the design. This paper briefly discusses the importance of computer graphics in research; the central graphics system in terms of systems, software, and hardware requirements; the application of computer graphics to graphic arts, discussed in terms of the requirements for a graphic arts workstation; and the problems encountered in applying computer graphics to the graphic arts. The paper concludes by presenting the status of the central graphics system.
The Berlin Brain–Computer Interface: Non-Medical Uses of BCI Technology
Blankertz, Benjamin; Tangermann, Michael; Vidaurre, Carmen; Fazli, Siamac; Sannelli, Claudia; Haufe, Stefan; Maeder, Cecilia; Ramsey, Lenny; Sturm, Irene; Curio, Gabriel; Müller, Klaus-Robert
2010-01-01
Brain–computer interfacing (BCI) is a steadily growing area of research. While initially BCI research was focused on applications for paralyzed patients, increasingly more alternative applications in healthy human subjects are proposed and investigated. In particular, monitoring of mental states and decoding of covert user states have seen a strong rise of interest. Here, we present some examples of such novel applications which provide evidence for the promising potential of BCI technology for non-medical uses. Furthermore, we discuss distinct methodological improvements required to bring non-medical applications of BCI technology to a diversity of layperson target groups, e.g., ease of use, minimal training, general usability, short control latencies. PMID:21165175
USSR Report: Cybernetics, Computers and Automation Technology. No. 69.
1983-05-06
Contents include: the use of computers in multiprocessor and multistation design, control, and scientific research automation systems; design changes in the SM-2M control computer (Scientific Research Institute of Control Computers, Severodonetsk); and the Kiev Automated Control System, its design features and prospects for development (UPRAVLYAYUSHCHIYE SISTEMY I MASHINY, Nov-Dec 82).
Evaluation of the Intel iWarp parallel processor for space flight applications
NASA Technical Reports Server (NTRS)
Hine, Butler P., III; Fong, Terrence W.
1993-01-01
The potential of a DARPA-sponsored advanced processor, the Intel iWarp, for use in future SSF Data Management Systems (DMS) upgrades is evaluated through integration into the Ames DMS testbed and applications testing. The iWarp is a distributed, parallel computing system well suited for high performance computing applications such as matrix operations and image processing. The system architecture is modular, supports systolic and message-based computation, and is capable of providing massive computational power in a low-cost, low-power package. As a consequence, the iWarp offers significant potential for advanced space-based computing. This research seeks to determine the iWarp's suitability as a processing device for space missions. In particular, the project focuses on evaluating the ease of integrating the iWarp into the SSF DMS baseline architecture and the iWarp's ability to support computationally stressing applications representative of SSF tasks.
Speed challenge: a case for hardware implementation in soft-computing
NASA Technical Reports Server (NTRS)
Daud, T.; Stoica, A.; Duong, T.; Keymeulen, D.; Zebulum, R.; Thomas, T.; Thakoor, A.
2000-01-01
For over a decade, JPL has been actively involved in soft computing research on theory, architecture, applications, and electronics hardware. The driving force in all our research activities, in addition to the potential enabling technology promise, has been creation of a niche that imparts orders of magnitude speed advantage by implementation in parallel processing hardware with algorithms made especially suitable for hardware implementation. We review our work on neural networks, fuzzy logic, and evolvable hardware with selected application examples requiring real time response capabilities.
Simulation Applications at NASA Ames Research Center
NASA Technical Reports Server (NTRS)
Inouye, M.
1984-01-01
Aeronautical applications of simulation technology at Ames Research Center are described. The largest wind tunnel in the world is used to determine the flow field and aerodynamic characteristics of various aircraft, helicopter, and missile configurations. Large computers are used to obtain similar results through numerical solutions of the governing equations. Capabilities are illustrated by computer simulations of turbulence, aileron buzz, and an exhaust jet. Flight simulators are used to assess the handling qualities of advanced aircraft, particularly during takeoff and landing.
NASA Technical Reports Server (NTRS)
1995-01-01
This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period 1 Oct. 1994 - 31 Mar. 1995.
NiftyNet: a deep-learning platform for medical imaging.
Gibson, Eli; Li, Wenqi; Sudre, Carole; Fidon, Lucas; Shakir, Dzhoshkun I; Wang, Guotai; Eaton-Rosen, Zach; Gray, Robert; Doel, Tom; Hu, Yipeng; Whyntie, Tom; Nachev, Parashkev; Modat, Marc; Barratt, Dean C; Ourselin, Sébastien; Cardoso, M Jorge; Vercauteren, Tom
2018-05-01
Medical image analysis and computer-assisted intervention problems are increasingly being addressed with deep-learning-based solutions. Established deep-learning platforms are flexible but do not provide specific functionality for medical image analysis and adapting them for this domain of application requires substantial implementation effort. Consequently, there has been substantial duplication of effort and incompatible infrastructure developed across many research groups. This work presents the open-source NiftyNet platform for deep learning in medical imaging. The ambition of NiftyNet is to accelerate and simplify the development of these solutions, and to provide a common mechanism for disseminating research outputs for the community to use, adapt and build upon. The NiftyNet infrastructure provides a modular deep-learning pipeline for a range of medical imaging applications including segmentation, regression, image generation and representation learning applications. Components of the NiftyNet pipeline including data loading, data augmentation, network architectures, loss functions and evaluation metrics are tailored to, and take advantage of, the idiosyncrasies of medical image analysis and computer-assisted intervention. NiftyNet is built on the TensorFlow framework and supports features such as TensorBoard visualization of 2D and 3D images and computational graphs by default. We present three illustrative medical image analysis applications built using NiftyNet infrastructure: (1) segmentation of multiple abdominal organs from computed tomography; (2) image regression to predict computed tomography attenuation maps from brain magnetic resonance images; and (3) generation of simulated ultrasound images for specified anatomical poses. The NiftyNet infrastructure enables researchers to rapidly develop and distribute deep learning solutions for segmentation, regression, image generation and representation learning applications, or extend the platform to new applications. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.
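NiftyNet's own API is not shown in this entry; as a hedged sketch of the kind of component such a pipeline assembles, here is a tiny fully convolutional segmentation network in plain Keras (an illustration of the pattern only, not NiftyNet code or its architecture):

```python
import tensorflow as tf

# A toy per-pixel segmenter for 64x64 single-channel slices.
inputs = tf.keras.Input(shape=(64, 64, 1))
x = tf.keras.layers.Conv2D(16, 3, padding="same", activation="relu")(inputs)
x = tf.keras.layers.Conv2D(16, 3, padding="same", activation="relu")(x)
outputs = tf.keras.layers.Conv2D(1, 1, activation="sigmoid")(x)  # organ mask

model = tf.keras.Model(inputs, outputs)
model.compile(optimizer="adam", loss="binary_crossentropy")
model.summary()
```

Data loading, augmentation, loss functions, and evaluation metrics tailored to medical images are exactly the pieces NiftyNet adds around such a network so that research groups stop re-implementing them.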
NASA Computational Fluid Dynamics Conference. Volume 1: Sessions 1-6
NASA Technical Reports Server (NTRS)
1989-01-01
Presentations given at the NASA Computational Fluid Dynamics (CFD) Conference held at the NASA Ames Research Center, Moffett Field, California, March 7-9, 1989 are given. Topics covered include research facility overviews of CFD research and applications, validation programs, direct simulation of compressible turbulence, turbulence modeling, advances in Runge-Kutta schemes for solving 3-D Navier-Stokes equations, grid generation and inviscid flow computation around aircraft geometries, numerical simulation of rotorcraft, and viscous drag prediction for rotor blades.
NASA Technical Reports Server (NTRS)
Schuster, David M.; Edwards, John W.
2004-01-01
The motivation behind the inclusion of unsteady aerodynamics and aeroelastic effects in the computation of stability and control (S&C) derivatives will be discussed as they pertain to aeroelastic and aeroservoelastic analysis. This topic will be addressed in the context of two applications, the first being the estimation of S&C derivatives for a cable-mounted aeroservoelastic wind tunnel model tested in the NASA Langley Research Center (LaRC) Transonic Dynamics Tunnel (TDT). The second application will be the prediction of the nonlinear aeroservoelastic phenomenon known as Residual Pitch Oscillation (RPO) on the B-2 Bomber. Techniques and strategies used in these applications to compute S&C derivatives and perform flight simulations will be reviewed, and computational results will be presented.
1980-09-30
Research on typography is voluminous and directly applicable. Research dealing directly with the line printer used in computer output is scanty, but consistent with...available to the researcher. While this may stimulate rapid software production, it often creates sets of chain-reaction problems. Accordingly...
Handheld Computers in Education. Research Brief
ERIC Educational Resources Information Center
Education Partnerships, Inc., 2003
2003-01-01
Over the last 20 years, educators have been trying to find best practices in using technology for student learning. Some of the most widely used applications of computers have been for student learning of programming, word processing, Web research, spreadsheets, games, and Web design. The difficulty with integrating many of these activities…
NASA CST aids U.S. industry. [computational structures technology
NASA Technical Reports Server (NTRS)
Housner, Jerry M.; Pinson, Larry D.
1993-01-01
The effect of NASA's Computational Structures Technology (CST) research on aerospace vehicle design and operation is discussed. The application of this research to a proposed version of a high-speed civil transport, to composite structures in aerospace, to the study of crack growth, and to resolving field problems is addressed.
SIAM Conference on Geometric Design and Computing. Final Technical Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2002-03-11
The SIAM Conference on Geometric Design and Computing attracted 164 domestic and international researchers from academia, industry, and government. It provided a stimulating forum in which to learn about the latest developments, to discuss exciting new research directions, and to forge stronger ties between theory and applications.
Computer programming for generating visual stimuli.
Bukhari, Farhan; Kurylo, Daniel D
2008-02-01
Critical to vision research is the generation of visual displays with precise control over stimulus metrics. Generating stimuli often requires adapting commercial software or developing specialized software for specific research applications. In order to facilitate this process, we give here an overview that allows nonexpert users to generate and customize stimuli for vision research. We first give a review of relevant hardware and software considerations, to allow the selection of display hardware, operating system, programming language, and graphics packages most appropriate for specific research applications. We then describe the framework of a generic computer program that can be adapted for use with a broad range of experimental applications. Stimuli are generated in the context of trial events, allowing the display of text messages, the monitoring of subject responses and reaction times, and the inclusion of contingency algorithms. This approach allows direct control and management of computer-generated visual stimuli while utilizing the full capabilities of modern hardware and software systems. The flowchart and source code for the stimulus-generating program may be downloaded from www.psychonomic.org/archive.
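In the spirit of the generic program the article describes, the following console-only sketch shows the trial-event skeleton (fixation, stimulus, response, reaction time); a real experiment would replace the print/input calls with graphics and keyboard hooks, and everything here uses only the standard library:

```python
import random
import time

def run_trial(stimulus):
    print("\n+ fixation")
    time.sleep(0.5)                          # fixation interval
    print("stimulus:", stimulus)
    t0 = time.monotonic()
    response = input("left or right? ").strip()
    return response, time.monotonic() - t0   # response and reaction time

trials = random.sample(["left", "right"] * 3, k=6)   # randomized trial order
for stim in trials:
    resp, rt = run_trial(stim)
    print("stim=%s resp=%s correct=%s rt=%.3f s" % (stim, resp, resp == stim, rt))
```

Contingency algorithms (e.g., staircases that adapt stimulus strength to performance) slot naturally into the loop between trials.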
A survey of CPU-GPU heterogeneous computing techniques
Mittal, Sparsh; Vetter, Jeffrey S.
2015-07-04
As both CPUs and GPUs become employed in a wide range of applications, it has been acknowledged that both of these processing units (PUs) have their unique features and strengths and hence, CPU-GPU collaboration is inevitable to achieve high-performance computing. This has motivated a significant amount of research on heterogeneous computing techniques, along with the design of CPU-GPU fused chips and petascale heterogeneous supercomputers. In this paper, we survey heterogeneous computing techniques (HCTs) such as workload partitioning which enable utilizing both CPU and GPU to improve performance and/or energy efficiency. We review heterogeneous computing approaches at the runtime, algorithm, programming, compiler, and application level. Further, we review both discrete and fused CPU-GPU systems, and discuss benchmark suites designed for evaluating heterogeneous computing systems (HCSs). We believe that this paper will provide insights into the working and scope of applications of HCTs to researchers and motivate them to further harness the computational powers of CPUs and GPUs to achieve the goal of exascale performance.
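Workload partitioning, one family of HCTs surveyed, can be sketched in a few lines; the worker functions below are stand-ins for real device kernels, and the 70/30 split is an assumed throughput ratio, not a measured one:

```python
def partition(data, gpu_share):
    """Static split of a data-parallel job between two processing units."""
    cut = int(len(data) * gpu_share)
    return data[:cut], data[cut:]

def gpu_kernel(chunk):   # stand-in for a device kernel launch
    return [x * x for x in chunk]

def cpu_kernel(chunk):   # stand-in for the host-side loop
    return [x * x for x in chunk]

gpu_part, cpu_part = partition(list(range(10)), gpu_share=0.7)
result = gpu_kernel(gpu_part) + cpu_kernel(cpu_part)
print(result)
```

Dynamic schemes in the survey adjust gpu_share at runtime from observed throughput instead of fixing it in advance.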
Hyperbolic Harmonic Mapping for Surface Registration
Shi, Rui; Zeng, Wei; Su, Zhengyu; Jiang, Jian; Damasio, Hanna; Lu, Zhonglin; Wang, Yalin; Yau, Shing-Tung; Gu, Xianfeng
2016-01-01
Automatic computation of surface correspondence via harmonic map is an active research field in computer vision, computer graphics and computational geometry. It may help document and understand physical and biological phenomena and also has broad applications in the biometrics, medical imaging and motion capture industries. Although numerous studies have been devoted to harmonic map research, limited progress has been made to compute a diffeomorphic harmonic map on general topology surfaces with landmark constraints. This work conquers this problem by changing the Riemannian metric on the target surface to a hyperbolic metric so that the harmonic mapping is guaranteed to be a diffeomorphism under landmark constraints. The computational algorithms are based on Ricci flow and nonlinear heat diffusion methods. The approach is general and robust. We employ our algorithm to study the constrained surface registration problem which applies to both computer vision and medical imaging applications. Experimental results demonstrate that, by changing the Riemannian metric, the registrations are always diffeomorphic and achieve relatively high performance when evaluated with some popular surface registration evaluation standards. PMID:27187948
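The paper's hyperbolic Ricci-flow machinery is well beyond a few lines, but the underlying harmonic-relaxation idea can be sketched: move every free vertex toward the average of its neighbors while landmark vertices stay pinned. The graph and coordinates below are invented for illustration:

```python
def harmonic_relax(positions, neighbors, landmarks, iters=200):
    """Discrete harmonic map by iterative neighbor averaging;
    landmark vertices are held fixed as hard constraints."""
    pos = dict(positions)
    for _ in range(iters):
        for v, nbrs in neighbors.items():
            if v in landmarks:
                continue                       # landmark stays pinned
            pos[v] = tuple(sum(pos[n][k] for n in nbrs) / len(nbrs)
                           for k in (0, 1))
    return pos

neighbors = {"a": ["b", "c"], "b": ["a", "c", "d"],
             "c": ["a", "b", "d"], "d": ["b", "c"]}
positions = {"a": (0.0, 0.0), "b": (5.0, 5.0), "c": (9.0, 1.0), "d": (1.0, 1.0)}
print(harmonic_relax(positions, neighbors, landmarks={"a", "d"}))
```

On a flat (Euclidean) target such relaxation can fold under landmark constraints; switching the target to a hyperbolic metric is exactly what the paper does to guarantee a diffeomorphism.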
ERIC Educational Resources Information Center
Bayer, Marc Dewey
2008-01-01
Since 2004, Buffalo State College's E. H. Butler Library has used the Information Commons (IC) model to assist its 8,500 students with library research and computer applications. Campus Technology Services (CTS) plays a very active role in its IC, with a centrally located Computer Help Desk and a newly created Application Support Desk right in the…
1986-05-01
Tight Bounds for Minimax Grid Matching, with Applications to the Average Case Analysis of Algorithms. Massachusetts Institute of Technology, Laboratory for Computer Science, MIT/LCS/TM-298. Interim research report, May 1986.
Applicability of mathematical modeling to problems of environmental physiology
NASA Technical Reports Server (NTRS)
White, Ronald J.; Lujan, Barbara F.; Leonard, Joel I.; Srinivasan, R. Srini
1988-01-01
The paper traces the evolution of mathematical modeling and systems analysis from terrestrial research to research related to space biomedicine and back again to terrestrial research. Topics covered include: power spectral analysis of physiological signals; pattern recognition models for detection of disease processes; and, computer-aided diagnosis programs used in conjunction with a special on-line biomedical computer library.
PRACE - The European HPC Infrastructure
NASA Astrophysics Data System (ADS)
Stadelmeyer, Peter
2014-05-01
The mission of PRACE (Partnership for Advanced Computing in Europe) is to enable high-impact scientific discovery and engineering research and development across all disciplines to enhance European competitiveness for the benefit of society. PRACE seeks to realize this mission by offering world-class computing and data management resources and services through a peer review process. This talk gives a general overview of PRACE and the PRACE research infrastructure (RI). PRACE is established as an international not-for-profit association, and the PRACE RI is a pan-European supercomputing infrastructure which offers access to computing and data management resources at partner sites distributed throughout Europe. Besides a short summary of the organization, history, and activities of PRACE, it is explained how scientists and researchers from academia and industry from around the world can access PRACE systems, and which education and training activities are offered by PRACE. The overview also contains a selection of PRACE contributions to societal challenges and ongoing activities. Examples of the latter include, among others, petascaling, an application benchmark suite, best-practice guides for efficient use of key architectures, application enabling and scaling, new programming models, and industrial applications. The Partnership for Advanced Computing in Europe (PRACE) is an international non-profit association with its seat in Brussels. The PRACE Research Infrastructure provides a persistent world-class high performance computing service for scientists and researchers from academia and industry in Europe. The computer systems and their operations accessible through PRACE are provided by four PRACE members (BSC representing Spain, CINECA representing Italy, GCS representing Germany and GENCI representing France). The Implementation Phase of PRACE receives funding from the EU's Seventh Framework Programme (FP7/2007-2013) under grant agreements RI-261557, RI-283493 and RI-312763. For more information, see www.prace-ri.eu
Computational manufacturing as a bridge between design and production.
Tikhonravov, Alexander V; Trubetskov, Michael K
2005-11-10
Computational manufacturing of optical coatings is a research area that can be placed between theoretical designing and practical manufacturing in the same way that computational physics can be placed between theoretical and experimental physics. Investigations in this area have been performed for more than 30 years under the name of computer simulation of manufacturing and monitoring processes. Our goal is to attract attention to the increasing importance of computational manufacturing at the current state of the art in the design and manufacture of optical coatings and to demonstrate possible applications of this research tool.
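A tiny Monte Carlo in this vein shows the flavor of such simulations: deposit a multilayer design with random thickness-monitoring errors and estimate the yield within a tolerance. The design, error magnitude, and tolerance are assumed values for illustration only:

```python
import random

def simulate_run(design_nm, sigma_nm):
    """One simulated coating run: each layer lands with a random
    monitoring error; return the total stack thickness."""
    return sum(random.gauss(t, sigma_nm) for t in design_nm)

design = [120.0, 80.0, 120.0, 80.0, 120.0]   # hypothetical 5-layer stack (nm)
target = sum(design)
runs = [simulate_run(design, sigma_nm=2.0) for _ in range(10_000)]
yield_rate = sum(abs(r - target) < 5.0 for r in runs) / len(runs)
print("estimated yield within +/-5 nm: %.1f%%" % (100 * yield_rate))
```

Real computational manufacturing scores each simulated run by its spectral performance rather than raw thickness, but the simulate-then-count-yield structure is the same.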
Statistical Methodologies to Integrate Experimental and Computational Research
NASA Technical Reports Server (NTRS)
Parker, P. A.; Johnson, R. T.; Montgomery, D. C.
2008-01-01
Development of advanced algorithms for simulating engine flow paths requires the integration of fundamental experiments with the validation of enhanced mathematical models. In this paper, we provide an overview of statistical methods to strategically and efficiently conduct experiments and computational model refinement. Moreover, the integration of experimental and computational research efforts is emphasized. With a statistical engineering perspective, scientific and engineering expertise is combined with statistical sciences to gain deeper insights into experimental phenomena and code development performance, supporting the overall research objectives. The particular statistical methods discussed are design of experiments, response surface methodology, and uncertainty analysis and planning. Their application is illustrated with a coaxial free jet experiment and a turbulence model refinement investigation. Our goal is to provide an overview, focusing on concepts rather than practice, to demonstrate the benefits of using statistical methods in research and development, thereby encouraging their broader and more systematic application.
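For readers unfamiliar with these methods, the sketch below generates a small two-factor factorial design and fits a quadratic response surface by least squares; the response function is synthetic and the design is minimal:

```python
import itertools
import numpy as np

levels = [-1.0, 0.0, 1.0]                                    # coded factor levels
design = np.array(list(itertools.product(levels, levels)))   # 3^2 design
x1, x2 = design[:, 0], design[:, 1]
rng = np.random.default_rng(1)
y = 5 + 2*x1 - x2 + 0.5*x1*x2 + rng.normal(0, 0.1, len(x1))  # synthetic response

# Model matrix for a full quadratic response surface.
X = np.column_stack([np.ones_like(x1), x1, x2, x1*x2, x1**2, x2**2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print("intercept, x1, x2, x1*x2, x1^2, x2^2:", np.round(coef, 2))
```

The fitted surface then guides where the next, more expensive experimental or computational runs should be placed.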
Use of a Computer-Mediated Delphi Process to Validate a Mass Casualty Conceptual Model
Culley, Joan M.
2012-01-01
Since the original work on the Delphi technique, multiple versions have been developed and used in research and industry; however, very little empirical research has been conducted that evaluates the efficacy of using online computer, Internet, and e-mail applications to facilitate a Delphi method that can be used to validate theoretical models. The purpose of this research was to develop computer, Internet, and e-mail applications to facilitate a modified Delphi technique through which experts provide validation for a proposed conceptual model that describes the information needs for a mass-casualty continuum of care. Extant literature and existing theoretical models provided the basis for model development. Two rounds of the Delphi process were needed to satisfy the criteria for consensus and/or stability related to the constructs, relationships, and indicators in the model. The majority of experts rated the online processes favorably (mean of 6.1 on a seven-point scale). Using online Internet and computer applications to facilitate a modified Delphi process offers much promise for future research involving model building or validation. The online Delphi process provided an effective methodology for identifying and describing the complex series of events and contextual factors that influence the way we respond to disasters. PMID:21076283
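The consensus and stability criteria reported above are not given as formulas in the abstract. A common operationalization, sketched below with invented ratings on the seven-point scale, treats an item as reaching consensus when its interquartile range is narrow and as stable when its median moves little between rounds; both thresholds here are assumptions.

```python
import numpy as np

def consensus_and_stability(round1, round2, iqr_max=1.0, shift_max=0.5):
    """Judge each Delphi item: consensus if the round-2 IQR is narrow,
    stability if the median barely moves between rounds."""
    results = []
    for r1, r2 in zip(round1, round2):
        q1, med, q3 = np.percentile(r2, [25, 50, 75])
        results.append({
            "median": med,
            "consensus": (q3 - q1) <= iqr_max,
            "stability": abs(med - np.median(r1)) <= shift_max,
        })
    return results

# Hypothetical expert ratings for two model constructs, two rounds each.
round1 = [[5, 6, 7, 6, 5], [3, 6, 4, 7, 5]]
round2 = [[6, 6, 7, 6, 6], [4, 5, 5, 6, 5]]
for i, item in enumerate(consensus_and_stability(round1, round2), 1):
    print(f"item {i}: {item}")
```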
Tools for computer graphics applications
NASA Technical Reports Server (NTRS)
Phillips, R. L.
1976-01-01
Extensive research in computer graphics has produced a collection of basic algorithms and procedures whose utility spans many disciplines. These tools are described in terms of their fundamental aspects, implementations, applications, and availability. Programs which are discussed include basic data plotting, curve smoothing, and depiction of three dimensional surfaces. As an aid to potential users of these tools, particular attention is given to discussing their availability and, where applicable, their cost.
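Curve smoothing of the kind catalogued above can be illustrated with a simple centered moving average; this is a generic sketch rather than any of the original programs, and the window length is an arbitrary choice.

```python
import numpy as np

def moving_average(y, window=5):
    """Smooth a 1-D series with a centered moving average.
    np.convolve in 'same' mode zero-pads the edges, so the first and
    last few output values are biased toward zero."""
    kernel = np.ones(window) / window
    return np.convolve(y, kernel, mode="same")

# Noisy sample data: a sine wave plus random jitter.
rng = np.random.default_rng(0)
x = np.linspace(0, 2 * np.pi, 100)
y = np.sin(x) + 0.2 * rng.standard_normal(100)
print(moving_average(y)[:5])
```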
Applications of Out-of-Domain Knowledge in Students' Reasoning about Computer Program State
ERIC Educational Resources Information Center
Lewis, Colleen Marie
2012-01-01
To meet a growing demand and a projected deficit in the supply of computer professionals (NCWIT, 2009), it is of vital importance to expand students' access to computer science. However, many researchers in the computer science education community unproductively assume that some students lack an innate ability for computer science and…
Applications of artificial intelligence to scientific research
NASA Technical Reports Server (NTRS)
Prince, Mary Ellen
1986-01-01
Artificial intelligence (AI) is a growing field which is just beginning to make an impact on disciplines other than computer science. While a number of military and commercial applications were undertaken in recent years, few attempts were made to apply AI techniques to basic scientific research. There is no inherent reason for the discrepancy. The characteristics of the problem, rather than its domain, determine whether or not it is suitable for an AI approach. Expert systems, intelligent tutoring systems, and learning programs are examples of theoretical topics which can be applied to certain areas of scientific research. Further research and experimentation should eventually make it possible for computers to act as intelligent assistants to scientists.
The monitoring and managing application of cloud computing based on Internet of Things.
Luo, Shiliang; Ren, Bin
2016-07-01
Cloud computing and the Internet of Things are the two hot points in the Internet application field. The application of these two new technologies is under active discussion and research, but far less so in the field of medical monitoring and management. Thus, in this paper, we study and analyze the application of cloud computing and the Internet of Things in the medical field, and we combine the two techniques for medical monitoring and management. The model architecture for a remote monitoring cloud platform of healthcare information (RMCPHI) was established first. Then the RMCPHI architecture was analyzed. Finally, an efficient PSOSAA algorithm was proposed for the medical monitoring and managing application of cloud computing. Simulation results showed that our proposed scheme can improve efficiency by about 50%. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
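The PSOSAA algorithm is named but not specified in the abstract; a plausible reading is a hybrid of particle swarm optimization with a simulated-annealing acceptance rule. The sketch below illustrates that general idea on a toy cost function; every name and parameter in it is an assumption, not the published algorithm.

```python
import math
import random

def psosaa(cost, dim=2, particles=20, iters=100, w=0.7, c1=1.5, c2=1.5, temp=1.0):
    """Toy PSO in which worse personal bests can be accepted with a
    simulated-annealing probability that shrinks as the run cools."""
    pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(particles)]
    vel = [[0.0] * dim for _ in range(particles)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=cost)
    for t in range(iters):
        T = temp * (1 - t / iters) + 1e-9  # linear cooling schedule
        for i in range(particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * random.random() * (pbest[i][d] - pos[i][d])
                             + c2 * random.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            delta = cost(pos[i]) - cost(pbest[i])
            # Always keep improvements; sometimes keep worse moves early on.
            if delta < 0 or random.random() < math.exp(-delta / T):
                pbest[i] = pos[i][:]
        gbest = min(pbest, key=cost)
    return gbest

sphere = lambda p: sum(x * x for x in p)
print(psosaa(sphere))
```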
NASA Technical Reports Server (NTRS)
Atkinson, R. C.
1974-01-01
Results are presented from a project of research and development on strategies for optimizing the instructional process, and from the dissemination of information about applications of such research to the instructional medium of computer-assisted instruction. Accomplishments reported include construction of the author language INSTRUCT, construction of a practical CAI course in the area of computer science, and a number of investigations into the individualization of instruction, using the course as a vehicle.
Camera systems in human motion analysis for biomedical applications
NASA Astrophysics Data System (ADS)
Chin, Lim Chee; Basah, Shafriza Nisha; Yaacob, Sazali; Juan, Yeap Ewe; Kadir, Aida Khairunnisaa Ab.
2015-05-01
Human Motion Analysis (HMA) systems have been one of the major interests among researchers in the fields of computer vision, artificial intelligence, and biomedical engineering and sciences. This is due to their wide and promising biomedical applications, namely, bio-instrumentation for human-computer interfacing and surveillance systems for monitoring human behaviour, as well as analysis of biomedical signals and images for diagnosis and rehabilitation applications. This paper provides an extensive review of the camera systems used in HMA, including their taxonomy, camera types, camera calibration, and camera configuration. The review focuses on evaluating camera-system considerations for HMA systems specifically in biomedical applications. This review is important as it provides guidelines and recommendations for researchers and practitioners in selecting a camera system for an HMA system for biomedical applications.
Bernstam, Elmer V.; Hersh, William R.; Johnson, Stephen B.; Chute, Christopher G.; Nguyen, Hien; Sim, Ida; Nahm, Meredith; Weiner, Mark; Miller, Perry; DiLaura, Robert P.; Overcash, Marc; Lehmann, Harold P.; Eichmann, David; Athey, Brian D.; Scheuermann, Richard H.; Anderson, Nick; Starren, Justin B.; Harris, Paul A.; Smith, Jack W.; Barbour, Ed; Silverstein, Jonathan C.; Krusch, David A.; Nagarajan, Rakesh; Becich, Michael J.
2010-01-01
Clinical and translational research increasingly requires computation. Projects may involve multiple computationally-oriented groups including information technology (IT) professionals, computer scientists and biomedical informaticians. However, many biomedical researchers are not aware of the distinctions among these complementary groups, leading to confusion, delays and sub-optimal results. Although written from the perspective of clinical and translational science award (CTSA) programs within academic medical centers, the paper addresses issues that extend beyond clinical and translational research. The authors describe the complementary but distinct roles of operational IT, research IT, computer science and biomedical informatics using a clinical data warehouse as a running example. In general, IT professionals focus on technology. The authors distinguish between two types of IT groups within academic medical centers: central or administrative IT (supporting the administrative computing needs of large organizations) and research IT (supporting the computing needs of researchers). Computer scientists focus on general issues of computation such as designing faster computers or more efficient algorithms, rather than specific applications. In contrast, informaticians are concerned with data, information and knowledge. Biomedical informaticians draw on a variety of tools, including but not limited to computers, to solve information problems in health care and biomedicine. The paper concludes with recommendations regarding administrative structures that can help to maximize the benefit of computation to biomedical research within academic health centers. PMID:19550198
Internal fluid mechanics research on supercomputers for aerospace propulsion systems
NASA Technical Reports Server (NTRS)
Miller, Brent A.; Anderson, Bernhard H.; Szuch, John R.
1988-01-01
The Internal Fluid Mechanics Division of the NASA Lewis Research Center is combining the key elements of computational fluid dynamics, aerothermodynamic experiments, and advanced computational technology to bring internal computational fluid mechanics (ICFM) to a state of practical application for aerospace propulsion systems. The strategies used to achieve this goal are to: (1) pursue an understanding of flow physics, surface heat transfer, and combustion via analysis and fundamental experiments, (2) incorporate improved understanding of these phenomena into verified 3-D CFD codes, and (3) utilize state-of-the-art computational technology to enhance experimental and CFD research. Presented is an overview of the ICFM program in high-speed propulsion, including work in inlets, turbomachinery, and chemical reacting flows. Ongoing efforts to integrate new computer technologies, such as parallel computing and artificial intelligence, into high-speed aeropropulsion research are described.
Oluwagbemi, Olugbenga O; Adewumi, Adewole; Esuruoso, Abimbola
2012-01-01
Computational biology and bioinformatics are gradually gaining ground in Africa and other developing nations of the world. However, in these countries, some of the challenges of computational biology and bioinformatics education are inadequate infrastructure and a lack of readily available complementary and motivational tools to support learning as well as research. This has lowered the morale of many promising undergraduates, postgraduates, and researchers and discouraged them from aspiring to undertake future study in these fields. In this paper, we developed and described MACBenAbim (Multi-platform Mobile Application for Computational Biology and Bioinformatics), a flexible, user-friendly tool to search for, define, and describe the meanings of key terms in computational biology and bioinformatics, thus expanding the frontiers of knowledge of the users. This tool also has the capability of visualizing results in a mobile multi-platform context. MACBenAbim is available from the authors for non-commercial purposes.
Computing and Office Automation: Changing Variables.
ERIC Educational Resources Information Center
Staman, E. Michael
1981-01-01
Trends in computing and office automation and their applications, including planning, institutional research, and general administrative support in higher education, are discussed. Changing aspects of information processing and an increasingly larger user community are considered. The computing literacy cycle may involve programming, analysis, use…
Koyama, Michihisa; Tsuboi, Hideyuki; Endou, Akira; Takaba, Hiromitsu; Kubo, Momoji; Del Carpio, Carlos A; Miyamoto, Akira
2007-02-01
Computational chemistry can provide fundamental knowledge regarding various aspects of materials. While its impact in scientific research is greatly increasing, its contributions to industrially important issues are far from satisfactory. In order to realize industrial innovation by computational chemistry, a new concept "combinatorial computational chemistry" has been proposed by introducing the concept of combinatorial chemistry to computational chemistry. This combinatorial computational chemistry approach enables theoretical high-throughput screening for materials design. In this manuscript, we review the successful applications of combinatorial computational chemistry to deNO(x) catalysts, Fischer-Tropsch catalysts, lanthanoid complex catalysts, and cathodes of the lithium ion secondary battery.
The AIST Managed Cloud Environment
NASA Astrophysics Data System (ADS)
Cook, S.
2016-12-01
ESTO is currently in the process of developing and implementing the AIST Managed Cloud Environment (AMCE) to offer cloud computing services to ESTO-funded PIs to conduct their project research. AIST will provide projects access to a cloud computing framework that incorporates NASA security, technical, and financial standards, on which projects can freely store, run, and process data. Currently, many projects led by research groups outside of NASA do not have the awareness of requirements or the resources to implement NASA standards into their research, which limits the likelihood of infusing the work into NASA applications. Offering this environment to PIs will allow them to conduct their project research using the many benefits of cloud computing. In addition to the well-known cost and time savings that it allows, it also provides scalability and flexibility. The AMCE will facilitate infusion and end-user access by ensuring standardization and security. This approach will ultimately benefit ESTO, the science community, and the research, allowing the technology developments to have quicker and broader applications.
The AMCE (AIST Managed Cloud Environment)
NASA Astrophysics Data System (ADS)
Cook, S.
2017-12-01
ESTO has developed and implemented the AIST Managed Cloud Environment (AMCE) to offer cloud computing services to SMD-funded PIs to conduct their project research. AIST will provide projects access to a cloud computing framework that incorporates NASA security, technical, and financial standards, on which projects can freely store, run, and process data. Currently, many projects led by research groups outside of NASA do not have the awareness of requirements or the resources to implement NASA standards into their research, which limits the likelihood of infusing the work into NASA applications. Offering this environment to PIs allows them to conduct their project research using the many benefits of cloud computing. In addition to the well-known cost and time savings that it allows, it also provides scalability and flexibility. The AMCE facilitates infusion and end-user access by ensuring standardization and security. This approach will ultimately benefit ESTO, the science community, and the research, allowing the technology developments to have quicker and broader applications.
Research on phone contacts online status based on mobile cloud computing
NASA Astrophysics Data System (ADS)
Wang, Wen-jing; Ge, Wei
2013-03-01
Because of the limited storage space and CPU processing power of mobile phones, it is difficult to realize complex applications on them. With the development of cloud computing, however, computing and storage can be placed in the cloud, providing users with rich cloud services; helping users complete various functions through the browser has become the trend for future mobile communication. This article takes the online status of mobile phone contacts as an example to analyze the development and application of mobile cloud computing.
Optical information processing at NASA Ames Research Center
NASA Technical Reports Server (NTRS)
Reid, Max B.; Bualat, Maria G.; Cho, Young C.; Downie, John D.; Gary, Charles K.; Ma, Paul W.; Ozcan, Meric; Pryor, Anna H.; Spirkovska, Lilly
1993-01-01
The combination of analog optical processors with digital electronic systems offers the potential of tera-OPS computational performance, while often requiring less power and weight relative to all-digital systems. NASA is working to develop and demonstrate optical processing techniques for on-board, real time science and mission applications. Current research areas and applications under investigation include optical matrix processing for space structure vibration control and the analysis of Space Shuttle Main Engine plume spectra, optical correlation-based autonomous vision for robotic vehicles, analog computation for robotic path planning, free-space optical interconnections for information transfer within digital electronic computers, and multiplexed arrays of fiber optic interferometric sensors for acoustic and vibration measurements.
Analysis of the frontier technology of agricultural IoT and its predication research
NASA Astrophysics Data System (ADS)
Han, Shuqing; Zhang, Jianhua; Zhu, Mengshuai; Wu, Jianzhai; Shen, Chen; Kong, Fantao
2017-09-01
Agricultural IoT (Internet of Things) is developing rapidly. Nanotechnology, biotechnology, and optoelectronic technology have been successfully integrated into agricultural sensor technology. Big data, cloud computing, and artificial intelligence technology have also been successfully used in the IoT. This paper investigates the integration of agricultural sensor technology with nanotechnology, biotechnology, and optoelectronic technology, and the application of big data, cloud computing, and artificial intelligence technology in the agricultural IoT. The advantages and development of integrating nanotechnology, biotechnology, and optoelectronic technology with agricultural sensor technology are discussed. The application of big data, cloud computing, and artificial intelligence technology in the IoT and their development trends are analysed.
Gao, Yuan; Peters, Ove A; Wu, Hongkun; Zhou, Xuedong
2009-02-01
The purpose of this study was to customize an application framework by using the MeVisLab image processing and visualization platform for three-dimensional reconstruction and assessment of tooth and root canal morphology. One maxillary first molar was scanned before and after preparation with ProTaper by using micro-computed tomography. With a customized application framework based on MeVisLab, internal and external anatomy was reconstructed. Furthermore, the dimensions of root canal and radicular dentin were quantified, and effects of canal preparation were assessed. Finally, a virtual preparation with risk analysis was performed to simulate the removal of a broken instrument. This application framework provided an economical platform and met current requirements of endodontic research. The broad-based use of high-quality free software and the resulting exchange of experience might help to improve the quality of endodontic research with micro-computed tomography.
Primary School Students' Attitudes towards Computer Based Testing and Assessment in Turkey
ERIC Educational Resources Information Center
Yurdabakan, Irfan; Uzunkavak, Cicek
2012-01-01
This study investigated the attitudes of primary school students towards computer based testing and assessment in terms of different variables. The sample for this research is primary school students attending a computer based testing and assessment application via CITO-OIS. The "Scale on Attitudes towards Computer Based Testing and…
ERIC Educational Resources Information Center
McNinch, George H., Ed.; And Others
Conference presentations of research on reading comprehension, reading instruction, computer applications in reading instruction, and reading theory are compiled in this yearbook. Titles and authors of some of the articles are as follows: "A Rationale for Teaching Children with Limited English Proficiency" (M. Zintz); "Preliminary Development of a…
Computer-Mediated Communication and a Cautionary Tale of Two Cities
ERIC Educational Resources Information Center
Sadler, Randall
2007-01-01
This paper describes an action research project that investigated the pedagogical applicability of computer-mediated communication (CMC) tools for collaborative projects. The research involved two groups of students studying to become ESL/EFL teachers, one group at a university located in the US Midwest and the other in the Catalan region of…
Advanced networks and computing in healthcare
Ackerman, Michael
2011-01-01
As computing and network capabilities continue to rise, it becomes increasingly important to understand the varied applications for using them to provide healthcare. The objective of this review is to identify key characteristics and attributes of healthcare applications involving the use of advanced computing and communication technologies, drawing upon 45 research and development projects in telemedicine and other aspects of healthcare funded by the National Library of Medicine over the past 12 years. Only projects publishing in the professional literature were included in the review. Four projects did not publish beyond their final reports. In addition, the authors drew on their first-hand experience as project officers, reviewers and monitors of the work. Major themes in the corpus of work were identified, characterizing key attributes of advanced computing and network applications in healthcare. Advanced computing and network applications are relevant to a range of healthcare settings and specialties, but they are most appropriate for solving a narrower range of problems in each. Healthcare projects undertaken primarily to explore potential have also demonstrated effectiveness and depend on the quality of network service as much as bandwidth. Many applications are enabling, making it possible to provide service or conduct research that previously was not possible or to achieve outcomes in addition to those for which projects were undertaken. Most notable are advances in imaging and visualization, collaboration and sense of presence, and mobility in communication and information-resource use. PMID:21486877
ERIC Educational Resources Information Center
Yue, Kui
2009-01-01
A shape grammar is a formalism that has been widely applied, in many different fields, to analyzing designs. Computer implementation of a shape grammar interpreter is vital to both research and application. However, implementing a shape grammar interpreter is hard, especially for parametric shapes defined by open terms. This dissertation…
Use of Standardized Test Scores to Predict Success in a Computer Applications Course
ERIC Educational Resources Information Center
Harris, Robert V.
2014-01-01
In this educational study, the research problem was that each semester a variable number of community college students are unable to complete an introductory computer applications course at a community college in the state of Mississippi with a successful course letter grade. Course failure, or non-success, at the collegiate level is a negative…
Parallel Computational Fluid Dynamics: Current Status and Future Requirements
NASA Technical Reports Server (NTRS)
Simon, Horst D.; VanDalsem, William R.; Dagum, Leonardo; Kutler, Paul (Technical Monitor)
1994-01-01
One of the key objectives of the Applied Research Branch in the Numerical Aerodynamic Simulation (NAS) Systems Division at NASA Ames Research Center is the accelerated introduction of highly parallel machines into a full operational environment. In this report we discuss the performance results obtained from the implementation of some computational fluid dynamics (CFD) applications on the Connection Machine CM-2 and the Intel iPSC/860. We summarize some of the experience gained so far with the parallel testbed machines at the NAS Applied Research Branch. Then we discuss the long-term computational requirements for accomplishing some of the grand-challenge problems in computational aerosciences. We argue that only massively parallel machines will be able to meet these grand-challenge requirements, and we outline the computer science and algorithm research challenges ahead.
Provider-Independent Use of the Cloud
NASA Astrophysics Data System (ADS)
Harmer, Terence; Wright, Peter; Cunningham, Christina; Perrott, Ron
Utility computing offers researchers and businesses the potential of significant cost-savings, making it possible for them to match the cost of their computing and storage to their demand for such resources. A utility compute provider enables the purchase of compute infrastructures on-demand; when a user requires computing resources, a provider will provision a resource for them and charge them only for their period of use of that resource. There has been significant growth in the number of cloud computing resource providers, and each has a different resource usage model, application process, and application programming interface (API); developing generic multi-provider applications is thus difficult and time-consuming. We have developed an abstraction layer that provides a single resource usage model, user authentication model, and API for compute providers, enabling cloud-provider-neutral applications to be developed. In this paper we outline the issues in using external resource providers, give examples of using a number of the most popular cloud providers, and provide examples of developing provider-neutral applications. In addition, we discuss the development of the API to create a generic provisioning model based on a common architecture for cloud computing providers.
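The abstraction layer itself is not reproduced in the abstract; the sketch below shows, under assumed class and method names, how a single resource-usage model can hide provider-specific APIs so that application code stays provider-neutral.

```python
from abc import ABC, abstractmethod

class ComputeProvider(ABC):
    """Common resource-usage model: provision a resource, release it."""

    @abstractmethod
    def provision(self, cpus: int, memory_gb: int) -> str:
        """Acquire a resource and return an opaque handle."""

    @abstractmethod
    def release(self, handle: str) -> None:
        """Release a previously provisioned resource."""

class MockCloudA(ComputeProvider):
    # Stand-in for one vendor; a real adapter would call the vendor's SDK.
    def provision(self, cpus, memory_gb):
        return f"cloudA-{cpus}x{memory_gb}"

    def release(self, handle):
        print(f"released {handle}")

def run_job(provider: ComputeProvider):
    # Provider-neutral application code: only the common interface is used.
    handle = provider.provision(cpus=4, memory_gb=16)
    print(f"running on {handle}")
    provider.release(handle)

run_job(MockCloudA())
```

A real deployment would add one adapter per vendor behind the same two calls, which is the design choice that makes multi-provider applications tractable.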
Phan, Philippe; Mezghani, Neila; Aubin, Carl-Éric; de Guise, Jacques A; Labelle, Hubert
2011-07-01
Adolescent idiopathic scoliosis (AIS) is a complex spinal deformity whose assessment and treatment present many challenges. Computer applications have been developed to assist clinicians. A literature review of computer applications used in AIS evaluation and treatment has been undertaken. The algorithms used, their accuracy, and their clinical usability were analyzed. Computer applications have been used to create new classifications for AIS based on 2D and 3D features, assess scoliosis severity or risk of progression, and assist bracing and surgical treatment. It was found that classification accuracy could be improved using computer algorithms, that AIS patient follow-up and screening could be done using surface topography (thereby limiting radiation), and that bracing and surgical treatment could be optimized using simulations. Yet few computer applications are routinely used in clinics. With the development of 3D imaging and databases, huge amounts of clinical and geometrical data need to be taken into consideration when researching and managing AIS. Computer applications based on advanced algorithms will be able to handle tasks that could otherwise not be done, which can possibly improve the management of AIS patients. Clinically oriented applications, and evidence that they can improve current care, will be required for their integration into the clinical setting.
High-performance scientific computing in the cloud
NASA Astrophysics Data System (ADS)
Jorissen, Kevin; Vila, Fernando; Rehr, John
2011-03-01
Cloud computing has the potential to open up high-performance computational science to a much broader class of researchers, owing to its ability to provide on-demand, virtualized computational resources. However, before such approaches can become commonplace, user-friendly tools must be developed that hide the unfamiliar cloud environment and streamline the management of cloud resources for many scientific applications. We have recently shown that high-performance cloud computing is feasible for parallelized x-ray spectroscopy calculations. We now present benchmark results for a wider selection of scientific applications focusing on electronic structure and spectroscopic simulation software in condensed matter physics. These applications are driven by an improved portable interface that can manage virtual clusters and run various applications in the cloud. We also describe a next generation of cluster tools, aimed at improved performance and a more robust cluster deployment. Supported by NSF grant OCI-1048052.
Ames Research Center SR&T program and earth observations
NASA Technical Reports Server (NTRS)
Poppoff, I. G.
1972-01-01
An overview is presented of the research activities in earth observations at Ames Research Center. Most of the tasks involve the use of research aircraft platforms. The program is also directed toward the use of the ILLIAC IV computer for statistical analysis. Most tasks are weighted toward Pacific coast and Pacific basin problems, with emphasis on water applications, air applications, animal migration studies, and geophysics.
Computational Nanoelectronics and Nanotechnology at NASA ARC
NASA Technical Reports Server (NTRS)
Saini, Subhash; Kutler, Paul (Technical Monitor)
1998-01-01
Both physical and economic considerations indicate that the scaling era of CMOS will run out of steam around the year 2010. However, physical laws also indicate that it is possible to compute at a rate of a billion times present speeds with the expenditure of only one Watt of electrical power. NASA has long-term needs where ultra-small semiconductor devices are required for critical applications: high-performance, low-power, compact computers for intelligent autonomous vehicles and Petaflop computing technology are some key examples. To advance the design, development, and production of future-generation micro- and nano-devices, an IT Modeling and Simulation Group has been started at NASA Ames with the goal of developing an integrated simulation environment that addresses problems related to nanoelectronics and molecular nanotechnology. An overview of the nanoelectronics and nanotechnology research activities being carried out at Ames Research Center will be presented. We will also present the vision and the research objectives of the IT Modeling and Simulation Group, including the applications of nanoelectronics-based devices relevant to NASA missions.
PoPLAR: Portal for Petascale Lifescience Applications and Research
2013-01-01
Background: We are focusing specifically on fast data analysis and retrieval in bioinformatics that will have a direct impact on the quality of human health and the environment. The exponential growth of data generated in biology research, from small atoms to big ecosystems, necessitates an increasingly large computational component to perform analyses. Novel DNA sequencing technologies and complementary high-throughput approaches--such as proteomics, genomics, metabolomics, and meta-genomics--drive data-intensive bioinformatics. While individual research centers or universities could once provide for these applications, this is no longer the case. Today, only specialized national centers can deliver the level of computing resources required to meet the challenges posed by rapid data growth and the resulting computational demand. Consequently, we are developing massively parallel applications to analyze the growing flood of biological data and contribute to the rapid discovery of novel knowledge. Methods: The efforts of previous National Science Foundation (NSF) projects provided for the generation of parallel modules for widely used bioinformatics applications on the Kraken supercomputer. We have profiled and optimized the code of some of the scientific community's most widely used desktop and small-cluster-based applications, including BLAST from the National Center for Biotechnology Information (NCBI), HMMER, and MUSCLE; scaled them to tens of thousands of cores on high-performance computing (HPC) architectures; made them robust and portable to next-generation architectures; and incorporated these parallel applications in science gateways with a web-based portal. Results: This paper will discuss the various developmental stages, challenges, and solutions involved in taking bioinformatics applications from the desktop to petascale with a front-end portal for very-large-scale data analysis in the life sciences. Conclusions: This research will help to bridge the gap between the rate of data generation and the speed at which scientists can study this data. The ability to rapidly analyze data at such a large scale is having a significant, direct impact on science achieved by collaborators who are currently using these tools on supercomputers. PMID:23902523
Cumulative reports and publications
NASA Technical Reports Server (NTRS)
1993-01-01
A complete list of Institute for Computer Applications in Science and Engineering (ICASE) reports are listed. Since ICASE reports are intended to be preprints of articles that will appear in journals or conference proceedings, the published reference is included when it is available. The major categories of the current ICASE research program are: applied and numerical mathematics, including numerical analysis and algorithm development; theoretical and computational research in fluid mechanics in selected areas of interest to LaRC, including acoustics and combustion; experimental research in transition and turbulence and aerodynamics involving LaRC facilities and scientists; and computer science.
Framework Resources Multiply Computing Power
NASA Technical Reports Server (NTRS)
2010-01-01
As an early proponent of grid computing, Ames Research Center awarded Small Business Innovation Research (SBIR) funding to 3DGeo Development Inc., of Santa Clara, California, (now FusionGeo Inc., of The Woodlands, Texas) to demonstrate a virtual computer environment that linked geographically dispersed computer systems over the Internet to help solve large computational problems. By adding to an existing product, FusionGeo enabled access to resources for calculation- or data-intensive applications whenever and wherever they were needed. Commercially available as Accelerated Imaging and Modeling, the product is used by oil companies and seismic service companies, which require large processing and data storage capacities.
[Activities of Institute for Computer Applications in Science and Engineering (ICASE)
NASA Technical Reports Server (NTRS)
Bushnell, Dennis M. (Technical Monitor)
2001-01-01
This report summarizes research conducted at ICASE in applied mathematics, fluid mechanics, computer science, and structures and material sciences during the period April 1, 2000 through September 30, 2000.
Computer Code for Transportation Network Design and Analysis
DOT National Transportation Integrated Search
1977-01-01
This document describes the results of research into the application of the mathematical programming technique of decomposition to practical transportation network problems. A computer code called Catnap (for Control Analysis Transportation Network A...
Xiao, Bo; Imel, Zac E; Georgiou, Panayiotis; Atkins, David C; Narayanan, Shrikanth S
2016-05-01
Empathy is an important psychological process that facilitates human communication and interaction. Enhancement of empathy has profound significance in a range of applications. In this paper, we review emerging directions of research on computational analysis of empathy expression and perception as well as empathic interactions, including their simulation. We summarize the work on empathic expression analysis by the targeted signal modalities (e.g., text, audio, and facial expressions). We categorize empathy simulation studies into theory-based emotion space modeling or application-driven user and context modeling. We summarize challenges in computational study of empathy including conceptual framing and understanding of empathy, data availability, appropriate use and validation of machine learning techniques, and behavior signal processing. Finally, we propose a unified view of empathy computation and offer a series of open problems for future research.
Strategies for the promotion of computer applications in radiology in healthcare delivery.
Reiner, B; Siegel, E; Allman, R
1998-08-01
The objective of this paper is to identify current trends in the development and implementation of computer applications in today's ever-changing healthcare environment. Marketing strategies are discussed with the goal of promoting computer applications in radiology as a means to advance future healthcare acceptance of technologic developments from the medical imaging field. With the rapid evolution of imaging and information technologies, along with the transition to filmless imaging, radiologists must assume a proactive role in the development and application of these advancements. This expansion can be accomplished in a number of ways, including Internet-based educational programs, research partnerships, and professional membership in societies such as the Society for Computer Applications in Radiology (SCAR). Professional societies such as SCAR, in turn, should reach out to include other professionals from the healthcare community. These would include financial, administrative, and information systems disciplines, to promote these technologies in a cost-conscious and value-added manner.
Changing how and what children learn in school with computer-based technologies.
Roschelle, J M; Pea, R D; Hoadley, C M; Gordin, D N; Means, B M
2000-01-01
Schools today face ever-increasing demands in their attempts to ensure that students are well equipped to enter the workforce and navigate a complex world. Research indicates that computer technology can help support learning, and that it is especially useful in developing the higher-order skills of critical thinking, analysis, and scientific inquiry. But the mere presence of computers in the classroom does not ensure their effective use. Some computer applications have been shown to be more successful than others, and many factors influence how well even the most promising applications are implemented. This article explores the various ways computer technology can be used to improve how and what children learn in the classroom. Several examples of computer-based applications are highlighted to illustrate ways technology can enhance how children learn by supporting four fundamental characteristics of learning: (1) active engagement, (2) participation in groups, (3) frequent interaction and feedback, and (4) connections to real-world contexts. Additional examples illustrate ways technology can expand what children learn by helping them to understand core concepts in subjects like math, science, and literacy. Research indicates, however, that the use of technology as an effective learning tool is more likely to take place when embedded in a broader education reform movement that includes improvements in teacher training, curriculum, student assessment, and a school's capacity for change. To help inform decisions about the future role of computers in the classroom, the authors conclude that further research is needed to identify the uses that most effectively support learning and the conditions required for successful implementation.
Computers in Public Education Study.
ERIC Educational Resources Information Center
HBJ Enterprises, Highland Park, NJ.
This survey conducted for the National Institute of Education reports the use of computers in U.S. public schools in the areas of instructional computing, student accounting, management of educational resources, research, guidance, testing, and library applications. From a stratified random sample of 1800 schools in varying geographic areas and…
Students Computer Literacy: Perception versus Reality
ERIC Educational Resources Information Center
Wilkinson, Kelly
2006-01-01
Students believe that they are computer literate. When asked, students perceive themselves as skilled in a variety of computer applications. This research compares students' perceptions with their reality. Students did not perform well on pretests of Microsoft Office, but improved their posttests scores with instruction. The study also examined…
Perspectives on an education in computational biology and medicine.
Rubinstein, Jill C
2012-09-01
The mainstream application of massively parallel, high-throughput assays in biomedical research has created a demand for scientists educated in Computational Biology and Bioinformatics (CBB). In response, formalized graduate programs have rapidly evolved over the past decade. Concurrently, there is increasing need for clinicians trained to oversee the responsible translation of CBB research into clinical tools. Physician-scientists with dedicated CBB training can facilitate such translation, positioning themselves at the intersection between computational biomedical research and medicine. This perspective explores key elements of the educational path to such a position, specifically addressing: 1) evolving perceptions of the role of the computational biologist and the impact on training and career opportunities; 2) challenges in and strategies for obtaining the core skill set required of a biomedical researcher in a computational world; and 3) how the combination of CBB with medical training provides a logical foundation for a career in academic medicine and/or biomedical research.
Applications of computer-graphics animation for motion-perception research
NASA Technical Reports Server (NTRS)
Proffitt, D. R.; Kaiser, M. K.
1986-01-01
The advantages and limitations of using computer-animated stimuli in studying motion perception are presented and discussed. Most current programs of motion-perception research could not be pursued without the use of computer graphics animation. Computer-generated displays afford latitudes of freedom and control that are almost impossible to attain through conventional methods. There are, however, limitations to this presentational medium. At present, computer-generated displays present simplified approximations of the dynamics in natural events. Very little is known about how the differences between natural events and computer simulations influence perceptual processing. In practice, the differences are assumed to be irrelevant to the questions under study, and findings with computer-generated stimuli are assumed to generalize to natural events.
Accelerating scientific discovery : 2007 annual report.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beckman, P.; Dave, P.; Drugan, C.
2008-11-14
As a gateway for scientific discovery, the Argonne Leadership Computing Facility (ALCF) works hand in hand with the world's best computational scientists to advance research in a diverse span of scientific domains, ranging from chemistry, applied mathematics, and materials science to engineering physics and life sciences. Sponsored by the U.S. Department of Energy's (DOE) Office of Science, researchers are using the IBM Blue Gene/L supercomputer at the ALCF to study and explore key scientific problems that underlie important challenges facing our society. For instance, a research team at the University of California-San Diego/SDSC is studying the molecular basis of Parkinson's disease. The researchers plan to use the knowledge they gain to discover new drugs to treat the disease and to identify risk factors for other diseases that are equally prevalent. Likewise, scientists from Pratt & Whitney are using the Blue Gene to understand the complex processes within aircraft engines. Expanding our understanding of jet engine combustors is the secret to improved fuel efficiency and reduced emissions. Lessons learned from the scientific simulations of jet engine combustors have already led Pratt & Whitney to newer designs with unprecedented reductions in emissions, noise, and cost of ownership. ALCF staff members provide in-depth expertise and assistance to those using the Blue Gene/L and optimizing user applications. Both the Catalyst and Applications Performance Engineering and Data Analytics (APEDA) teams support the users' projects. In addition to working with scientists running experiments on the Blue Gene/L, we have become a nexus for the broader global community. In partnership with the Mathematics and Computer Science Division at Argonne National Laboratory, we have created an environment where the world's most challenging computational science problems can be addressed. Our expertise in high-end scientific computing enables us to provide guidance for applications that are transitioning to petascale, as well as to produce software that facilitates their development, such as the MPICH library, which provides a portable and efficient implementation of the MPI standard--the prevalent programming model for large-scale scientific applications--and the PETSc toolkit, which provides a programming paradigm that eases the development of many scientific applications on high-end computers.
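Since the report singles out MPICH as a portable implementation of the MPI standard, a minimal MPI program is sketched below using the mpi4py bindings; it illustrates the programming model only and is not code from the report.

```python
# Run with, e.g.: mpiexec -n 4 python hello_mpi.py
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()  # this process's id
size = comm.Get_size()  # total number of processes

# Each rank computes a partial sum; reduce combines them on rank 0.
partial = sum(range(rank * 1000, (rank + 1) * 1000))
total = comm.reduce(partial, op=MPI.SUM, root=0)
if rank == 0:
    print(f"{size} ranks computed total {total}")
```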
NASA Astrophysics Data System (ADS)
Moon, Hongsik
What is the impact of multicore and associated advanced technologies on computational software for science? Most researchers and students have multicore laptops or desktops for their research, and they need computing power to run computational software packages. Computing power was initially derived from Central Processing Unit (CPU) clock speed. That changed when increases in clock speed became constrained by power requirements. Chip manufacturers turned to multicore CPU architectures and associated technological advancements to create the CPUs for the future. Most software applications benefited from the increased computing power in the same way that increases in clock speed helped applications run faster. However, for Computational ElectroMagnetics (CEM) software developers, this change was not an obvious benefit - it appeared to be a detriment. Developers were challenged to find a way to correctly utilize the advancements in hardware so that their codes could benefit. The solution was parallelization, and this dissertation details the investigation to address these challenges. Prior to multicore CPUs, advanced computer technologies were compared on performance using benchmark software, and the metric was FLoating-point Operations Per Second (FLOPS), which indicates system performance for scientific applications that make heavy use of floating-point calculations. Is FLOPS an effective metric for parallelized CEM simulation tools on new multicore systems? Parallel CEM software needs to be benchmarked not only by FLOPS but also by the performance of other parameters related to the type and utilization of the hardware, such as CPU, Random Access Memory (RAM), hard disk, network, etc. The codes need to be optimized for more than just FLOPS, and new parameters must be included in benchmarking. In this dissertation, the parallel CEM software named High Order Basis Based Integral Equation Solver (HOBBIES) is introduced. This code was developed to address the needs of the changing computer hardware platforms in order to provide fast, accurate, and efficient solutions to large, complex electromagnetic problems. The research in this dissertation proves that the performance of parallel code is intimately related to the configuration of the computer hardware and can be maximized for different hardware platforms. To benchmark and optimize the performance of parallel CEM software, a variety of large, complex projects are created and executed on a variety of computer platforms. The computer platforms used in this research are detailed in this dissertation. The projects run as benchmarks are also described in detail, and results are presented. The parameters that affect parallel CEM software on High Performance Computing Clusters (HPCC) are investigated. This research demonstrates methods to maximize the performance of parallel CEM software code.
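FLOPS figures of the kind the dissertation questions are usually estimated by timing a known floating-point workload. The sketch below times a dense matrix multiply, which costs roughly 2n^3 floating-point operations; it is a generic illustration, not the benchmark suite used in the dissertation.

```python
import time
import numpy as np

n = 1024
a = np.random.rand(n, n)
b = np.random.rand(n, n)

start = time.perf_counter()
c = a @ b
elapsed = time.perf_counter() - start

flops = 2 * n**3  # multiplies and adds in an n-by-n matrix product
print(f"{flops / elapsed / 1e9:.2f} GFLOPS (elapsed {elapsed * 1e3:.1f} ms)")
```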
A survey of GPU-based acceleration techniques in MRI reconstructions.
Wang, Haifeng; Peng, Hanchuan; Chang, Yuchou; Liang, Dong
2018-03-01
Image reconstruction in magnetic resonance imaging (MRI) clinical applications has become increasingly complicated. However, diagnosis and treatment require very fast computational procedures. Modern competitive platforms of the graphics processing unit (GPU) have been used to make high-performance parallel computations available, and attractive to common consumers, for computing massively parallel reconstruction problems at commodity prices. GPUs have also become more and more important for reconstruction computations, especially as deep learning starts to be applied to MRI reconstruction. The motivation of this survey is to review the image reconstruction schemes of GPU computing for MRI applications and to provide a summary reference for researchers in the MRI community.
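For fully sampled Cartesian k-space, the core reconstruction step underlying the surveyed methods is an inverse 2-D FFT. The sketch below shows the CPU version in NumPy with random placeholder data; the commented-out CuPy substitution is an assumption about available hardware, not part of the survey.

```python
import numpy as np
# import cupy as cp  # on a CUDA GPU, cp can stand in for np below

# Hypothetical fully sampled k-space for one 256x256 slice.
kspace = np.random.randn(256, 256) + 1j * np.random.randn(256, 256)

# Inverse FFT with centered k-space: shift, transform, shift back.
image = np.fft.fftshift(np.fft.ifft2(np.fft.ifftshift(kspace)))
magnitude = np.abs(image)
print(magnitude.shape, magnitude.dtype)
```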
How the Theory of Computing Can Help in Space Exploration
NASA Technical Reports Server (NTRS)
Kreinovich, Vladik; Longpre, Luc
1997-01-01
The opening of the NASA Pan American Center for Environmental and Earth Sciences (PACES) at the University of Texas at El Paso made it possible to organize the student Center for Theoretical Research and its Applications in Computer Science (TRACS). In this abstract, we briefly describe the main NASA-related research directions of the TRACS center, and give an overview of the preliminary results of student research.
NASA Technical Reports Server (NTRS)
1979-01-01
A comprehensive review of all NASA airfoil research, conducted both in-house and under grant and contract, as well as a broad spectrum of airfoil research outside of NASA is presented. Emphasis is placed on the development of computational aerodynamic codes for airfoil analysis and design, the development of experimental facilities and test techniques, and all types of airfoil applications.
ERIC Educational Resources Information Center
Vigilante, Richard P.
This monograph introduces educational administrators at a variety of levels to the basic concepts and procedures in the successful implementation of educational computer systems. In the first section, the units and functions of the computer are defined, and the administrative, research, and instructional applications of educational computing are…
WebStars: Holistic, Arts-Based College Curriculum in a Computer Applications Course
ERIC Educational Resources Information Center
Karsten, Selia
2004-01-01
The purpose of my qualitative, action study was to gain a better understanding of the effects of an experimental college course in computer applications. This inquiry was made concerning both the teacher's and learners' points of view. A holistic, arts-based approach was used by the researcher/teacher in order to design, develop and facilitate a…
CT Image Sequence Processing For Wood Defect Recognition
Dongping Zhu; R.W. Conners; Philip A. Araman
1991-01-01
The research reported in this paper explores a non-destructive testing application of x-ray computed tomography (CT) in the forest products industry. This application involves a computer vision system that uses CT to locate and identify internal defects in hardwood logs. The knowledge of log defects is critical in deciding whether to veneer or to saw up a log, and how...
Language, Learning, and Identity in Social Networking Sites for Language Learning: The Case of Busuu
ERIC Educational Resources Information Center
Alvarez Valencia, Jose Aldemar
2014-01-01
Recent progress in the discipline of computer applications such as the advent of web-based communication, afforded by the Web 2.0, has paved the way for novel applications in language learning, namely, social networking. Social networking has challenged the area of Computer Mediated Communication (CMC) to expand its research palette in order to…
ERIC Educational Resources Information Center
Fuchs, Lynn S.; Fuchs, Douglas; Courey, Susan J.
2005-01-01
In this article, the authors explain how curriculum-based measurement (CBM) differs from other forms of classroom-based assessment. The development of CBM is traced from computation to concepts and applications to real-life problem solving, with examples of the assessments and illustrations of research to document technical features and utility…
The Job Training Partnership Act and Computer-Assisted Instruction. Research Report 88-13.
ERIC Educational Resources Information Center
Education Turnkey Systems, Inc., Falls Church, VA.
A study sought to (1) determine the current and potential instructional application of computers in Job Training Partnership Act (JTPA) Titles II, III, and IV programs; and (2) present policy options that would increase the effective use of this technology in employment and training programs. Research methodology involved conducting an assessment…
ERIC Educational Resources Information Center
Anohah, Ebenezer; Oyelere, Solomon Sunday; Suhonen, Jarkko
2017-01-01
The majority of the existing research regarding mobile learning in computing education has primarily focused on studying the effectiveness of, and in some cases reporting about, implemented mobile learning solutions. However, it is equally important to explore development and application perspectives on the integration of mobile learning into…
Dzemidzic, Vildana; Sokic, Emir; Tiro, Alisa; Nakas, Enita
2015-12-01
This study was aimed at investigating the reliability of a computer application for assessing the stages of cervical vertebral maturation in order to determine the stage of skeletal maturity. For this study, digital lateral cephalograms of 99 subjects (52 females and 47 males) were examined. The following selection criteria were used during sample composition: age between 9 and 16 years, absence of anomalies of the vertebrae, good general health, and no history of trauma to the cervical region. Subjects with lateral cephalograms of low quality were excluded from the study. For the purpose of this study, a computer application, Cephalometar HF V1, was developed. This application was used to mark the contours of the second, third, and fourth cervical vertebrae on the digital lateral cephalograms, which enabled the computer to determine the stage of cervical vertebral maturation. The assessment of the stages of cervical vertebral maturation was also carried out by an experienced orthodontist, according to the principles of the method proposed by Hassel and Farman. The degree of agreement between the computer application and the researcher was analyzed using the Cohen kappa statistic. The results of this study showed agreement between the computer assessment and the researcher's assessment of the cervical vertebral maturation stages, with a Cohen kappa coefficient of 0.985. The computer application Cephalometar HF V1 proved to be a reliable method for assessing the stages of cervical vertebral maturation. This program could help orthodontists identify the stage of cervical vertebral maturation when planning orthodontic treatment for patients with skeletal disharmonies.
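The Cohen kappa statistic used above corrects raw agreement for the agreement expected by chance, kappa = (p_o - p_e) / (1 - p_e). The sketch below computes it for two raters over invented stage assignments; the data are illustrative only.

```python
from collections import Counter

def cohen_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters assigning categorical labels."""
    n = len(rater_a)
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n  # observed agreement
    ca, cb = Counter(rater_a), Counter(rater_b)
    p_e = sum((ca[l] / n) * (cb[l] / n) for l in set(ca) | set(cb))  # chance agreement
    return (p_o - p_e) / (1 - p_e)

# Hypothetical CVM stage calls (1-6) by the application and the orthodontist.
app = [1, 2, 2, 3, 4, 4, 5, 6, 3, 2]
expert = [1, 2, 2, 3, 4, 5, 5, 6, 3, 2]
print(f"kappa = {cohen_kappa(app, expert):.3f}")
```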
NASA Technical Reports Server (NTRS)
Weeks, Cindy Lou
1986-01-01
Experiments were conducted at NASA Ames Research Center to define multi-tasking software requirements for multiple-instruction, multiple-data stream (MIMD) computer architectures. The focus was on specifying solutions for algorithms in the field of computational fluid dynamics (CFD). The program objectives were to allow researchers to produce usable parallel application software as soon as possible after acquiring MIMD computer equipment, to provide researchers with an easy-to-learn and easy-to-use parallel software language which could be implemented on several different MIMD machines, and to enable researchers to list preferred design specifications for future MIMD computer architectures. Analysis of CFD algorithms indicated that extensions of an existing programming language, adaptable to new computer architectures, provided the best solution to meeting program objectives. The CoFORTRAN Language was written in response to these objectives and to provide researchers a means to experiment with parallel software solutions to CFD algorithms on machines with parallel architectures.
The Application of Learning Styles to Computer Assisted Instruction in Nursing Education
1991-01-01
nursing profession is to integrate computer technology into the learning process at all levels of nursing education. In order to successfully accomplish... learning styles. * Computer technology needs to be integrated into nursing education, research, and practice. * An evaluation tool needs to be... Computer-assisted video instruction...
ERIC Educational Resources Information Center
Emery, James C., Ed.
A comprehensive review of the current status, prospects, and problems of computer networking in higher education is presented from the perspectives of both computer users and network suppliers. Several areas of computer use are considered including applications for instruction, research, and administration in colleges and universities. In the…
2017-03-23
performance computing resources made available by the US Department of Defense High Performance Computing Modernization Program at the Air Force... Department of Defense Biotechnology High Performance Computing Software Applications Institute, Telemedicine and Advanced Technology Research Center, United States Army Medical Research and Materiel Command, Fort Detrick, Maryland, USA
Physics through the 1990s: Scientific interfaces and technological applications
NASA Technical Reports Server (NTRS)
1986-01-01
The volume examines the scientific interfaces and technological applications of physics. Twelve areas are dealt with: biological physics-biophysics, the brain, and theoretical biology; the physics-chemistry interface-instrumentation, surfaces, neutron and synchrotron radiation, polymers, organic electronic materials; materials science; geophysics-tectonics, the atmosphere and oceans, planets, drilling and seismic exploration, and remote sensing; computational physics-complex systems and applications in basic research; mathematics-field theory and chaos; microelectronics-integrated circuits, miniaturization, future trends; optical information technologies-fiber optics and photonics; instrumentation; physics applications to energy needs and the environment; national security-devices, weapons, and arms control; medical physics-radiology, ultrasonics, NMR, and photonics. An executive summary and many chapters contain recommendations regarding funding, education, industry participation, small-group university research and large facility programs, government agency programs, and computer database needs.
NASA Technical Reports Server (NTRS)
1981-01-01
The Space Transportation System (STS) is discussed, including the launch processing system, the thermal protection subsystem, meteorological research, sound suppression water system, rotating service structure, improved hypergol removal systems, fiber optics research, precision positioning, remote controlled solid rocket booster nozzle plugs, ground operations for Centaur orbital transfer vehicle, parachute drying, STS hazardous waste disposal and recycle, toxic waste technology and control concepts, fast analytical densitometry study, shuttle inventory management system, operational intercommunications system improvement, and protective garment ensemble. Terrestrial applications are also covered, including LANDSAT applications to water resources, satellite freeze forecast system, application of ground penetrating radar to soil survey, turtle tracking, evaluating computer drawn ground cover maps, sparkless load pulsar, and coupling a microcomputer and computing integrator with a gas chromatograph.
NASA Technical Reports Server (NTRS)
1988-01-01
The charter of the Structures Division is to perform and disseminate results of research conducted in support of aerospace engine structures. These results have a wide range of applicability to practitioners of structural engineering mechanics beyond the aerospace arena. The specific purpose of the symposium was to familiarize the engineering structures community with the depth and range of research performed by the division and its academic and industrial partners. Sessions covered vibration control, fracture mechanics, ceramic component reliability, parallel computing, nondestructive evaluation, constitutive models and experimental capabilities, dynamic systems, fatigue and damage, wind turbines, hot section technology (HOST), aeroelasticity, structural mechanics codes, computational methods for dynamics, structural optimization, applications of structural dynamics, and structural mechanics computer codes.
Desktop supercomputer: what can it do?
NASA Astrophysics Data System (ADS)
Bogdanov, A.; Degtyarev, A.; Korkhov, V.
2017-12-01
The paper addresses the issues of solving complex problems that require supercomputers or multiprocessor clusters, which are available to most researchers nowadays. Efficient distribution of high performance computing resources according to actual application needs has been a major research topic since high-performance computing (HPC) technologies became widely introduced. At the same time, comfortable and transparent access to these resources has been a key user requirement. In this paper we discuss approaches to building a virtual private supercomputer available at the user's desktop: a virtual computing environment tailored specifically for a target user with a particular target application. We describe and evaluate possibilities to create the virtual supercomputer based on lightweight virtualization technologies, and analyze the efficiency of our approach compared to traditional methods of HPC resource management.
NASA Astrophysics Data System (ADS)
Zhao, Ben; Garbacki, Paweł; Gkantsidis, Christos; Iamnitchi, Adriana; Voulgaris, Spyros
After a decade of intensive investigation, peer-to-peer computing has established itself as an accepted research field in the general area of distributed systems. Peer-to-peer computing can be seen as the democratization of computing, overthrowing traditional hierarchical designs favored in client-server systems, largely brought about by last-mile network improvements which have made individual PCs first-class citizens in the network community. Much of the early focus in peer-to-peer systems was on best-effort file sharing applications. In recent years, however, research has focused on peer-to-peer systems that provide operational properties and functionality similar to those shown by more traditional distributed systems. These properties include stronger consistency, reliability, and security guarantees suitable to supporting traditional applications such as databases.
Computer-aided drug discovery.
Bajorath, Jürgen
2015-01-01
Computational approaches are an integral part of interdisciplinary drug discovery research. Understanding the science behind computational tools, their opportunities, and limitations is essential to make a true impact on drug discovery at different levels. If applied in a scientifically meaningful way, computational methods improve the ability to identify and evaluate potential drug molecules, but there remain weaknesses in the methods that preclude naïve applications. Herein, current trends in computer-aided drug discovery are reviewed, and selected computational areas are discussed. Approaches are highlighted that aid in the identification and optimization of new drug candidates. Emphasis is put on the presentation and discussion of computational concepts and methods, rather than case studies or application examples. As such, this contribution aims to provide an overview of the current methodological spectrum of computational drug discovery for a broad audience.
Computers as Teaching Tools: Some Examples and Guidelines.
ERIC Educational Resources Information Center
Beins, Bernard C.
The use of computers in the classroom has been touted as an important innovation in education. This article features some recently developed software for use in teaching psychology and different approaches to classroom computer use. Uses of software packages for psychological research designs are included as are applications and limitations of…
Computer-Assisted Instruction. Special Double Issue.
ERIC Educational Resources Information Center
Holmes, Glyn, Ed.
1984-01-01
This booklet presents evidence to support the idea that distinctions between the instructional and research applications of the computer are becoming blurred. The issue includes contributions from authors who are at the forefront of computer-assisted instruction (CAI) development in their respective fields. An attempt is made to represent most…
Bajorath, Jurgen
2012-01-01
We have generated a number of compound data sets and programs for different types of applications in pharmaceutical research. These data sets and programs were originally designed for our research projects and are made publicly available. Without consulting original literature sources, it is difficult to understand specific features of data sets and software tools, basic ideas underlying their design, and applicability domains. Currently, 30 different entries are available for download from our website. In this data article, we provide an overview of the data and tools we make available and designate the areas of research for which they should be useful. For selected data sets and methods/programs, detailed descriptions are given. This article should help interested readers to select data and tools for specific computational investigations. PMID:24358818
NASA Technical Reports Server (NTRS)
Beckenbach, E. S. (Editor)
1984-01-01
It is more important than ever that engineers have an understanding of the future needs of clinical and research medicine, and that physicians know something about probable future developments in instrumentation capabilities. Only by maintaining such a dialog can the most effective application of technological advances to medicine be achieved. This workshop attempted to provide this kind of information transfer in the limited field of diagnostic imaging. Biomedical research at the Jet Propulsion Laboratory is discussed, taking into account imaging results from space exploration missions, as well as biomedical research tasks based in these technologies. Attention is also given to current and future indications for magnetic resonance in medicine, high speed quantitative digital microscopy, computer processing of radiographic images, computed tomography and its modern applications, positron emission tomography, and developments related to medical ultrasound.
Bioinformatics for Exploration
NASA Technical Reports Server (NTRS)
Johnson, Kathy A.
2006-01-01
For the purpose of this paper, bioinformatics is defined as the application of computer technology to the management of biological information. It can be thought of as the science of developing computer databases and algorithms to facilitate and expedite biological research. This is a crosscutting capability that supports nearly all human health areas ranging from computational modeling, to pharmacodynamics research projects, to decision support systems within autonomous medical care. Bioinformatics serves to increase the efficiency and effectiveness of the life sciences research program. It provides data, information, and knowledge capture which further supports management of the bioastronautics research roadmap - identifying gaps that still remain and enabling the determination of which risks have been addressed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chhabildas, Lalit Chandra; Orphal, Dennis L.
HVIS 2005 was a clear success. The Symposium brought together nearly two hundred active researchers and students from thirteen countries around the world. The 84 papers presented at HVIS 2005 constitute an 'update' on current research and the state of the art of hypervelocity science. Combined with the over 7000 pages of technical papers from the eight previous Symposia, beginning in 1986, all published in the International Journal of Impact Engineering, the papers from HVIS 2005 add to the growing body of knowledge and the progressing state of the art of hypervelocity science. It is encouraging to report that even with the limited funding resources compared to two decades ago, creativity and ingenuity in hypervelocity science are alive and well. There is considerable overlap among different disciplines that allows researchers to leverage one another's work. Experimentally, higher velocities are now available in the laboratory and are ideally suited for space applications that can be tied to both civilian (NASA) and DoD military applications. Computationally, there is considerable advancement in both computer and modeling technologies. Higher computing speeds and techniques such as parallel processing allow system-level applications to be addressed directly today, much in contrast to the situation only a few years ago. Needless to say, both experimentally and computationally, the ultimate utility will depend on the curiosity and the probing questions of the individual researcher. It is quite satisfying that over two dozen students attended the symposium; hopefully this is indicative of a good pool of future researchers that will be needed in both government and civilian industries. It is also gratifying to note that novel thrust areas exploring new material phenomenology relevant to hypervelocity impact, as well as a number of other applications, are being pursued. In conclusion, considerable progress is still being made that benefits the continuous development of hypervelocity impact technology and applications, even with the relatively limited resources directed at this field.
Formal Representations of Eligibility Criteria: A Literature Review
Weng, Chunhua; Tu, Samson W.; Sim, Ida; Richesson, Rachel
2010-01-01
Standards-based, computable knowledge representations for eligibility criteria are increasingly needed to provide computer-based decision support for automated research participant screening, clinical evidence application, and clinical research knowledge management. We surveyed the literature and identified five aspects of eligibility criteria knowledge representations that contribute to the various research and clinical applications: the intended use of computable eligibility criteria, the classification of eligibility criteria, the expression language for representing eligibility rules, the encoding of eligibility concepts, and the modeling of patient data. We consider three of them (expression language, codification of eligibility concepts, and patient data modeling), to be essential constructs of a formal knowledge representation for eligibility criteria. The requirements for each of the three knowledge constructs vary for different use cases, which therefore should inform the development and choice of the constructs toward cost-effective knowledge representation efforts. We discuss the implications of our findings for standardization efforts toward sharable knowledge representation of eligibility criteria. PMID:20034594
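To make the surveyed knowledge constructs concrete, the sketch below encodes a single hypothetical eligibility criterion as data (concept codes plus thresholds) that is evaluated against a simple patient data model. This is a minimal illustration of the idea of separating the rule expression, concept encoding, and patient data; the class names, ICD-10 prefix, and thresholds are assumptions for the example, not drawn from the survey:

```python
from dataclasses import dataclass, field
from typing import Optional, Set

# Hypothetical patient data model; a real system would map to EHR standards.
@dataclass
class Patient:
    age: int
    condition_codes: Set[str] = field(default_factory=set)  # e.g., ICD-10
    hba1c: Optional[float] = None

# The criterion itself is plain data (concepts + thresholds), kept separate
# from the evaluation logic so it can be shared, versioned, and re-encoded.
CRITERION = {
    "min_age": 18,
    "dx_prefixes": ("E11",),   # illustrative ICD-10 prefix (type 2 diabetes)
    "max_hba1c": 10.0,
}

def is_eligible(p: Patient, c=CRITERION) -> bool:
    has_dx = any(code.startswith(c["dx_prefixes"]) for code in p.condition_codes)
    labs_ok = p.hba1c is not None and p.hba1c <= c["max_hba1c"]
    return p.age >= c["min_age"] and has_dx and labs_ok

print(is_eligible(Patient(age=54, condition_codes={"E11.9"}, hba1c=8.2)))  # True
```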
NASA Technical Reports Server (NTRS)
Edwards, John W.; Malone, John B.
1992-01-01
The current status of computational methods for unsteady aerodynamics and aeroelasticity is reviewed. The key features of challenging aeroelastic applications are discussed in terms of the flowfield state: low-angle high speed flows and high-angle vortex-dominated flows. The critical role played by viscous effects in determining aeroelastic stability for conditions of incipient flow separation is stressed. The need for a variety of flow modeling tools, from linear formulations to implementations of the Navier-Stokes equations, is emphasized. Estimates of computer run times for flutter calculations using several computational methods are given. Applications of these methods for unsteady aerodynamic and transonic flutter calculations for airfoils, wings, and configurations are summarized. Finally, recommendations are made concerning future research directions.
Research on Key Technologies of Cloud Computing
NASA Astrophysics Data System (ADS)
Zhang, Shufen; Yan, Hongcan; Chen, Xuebin
With the development of multi-core processors, virtualization, distributed storage, broadband Internet, and automatic management, a new computing mode named cloud computing has emerged. It distributes computation tasks over a resource pool consisting of massive numbers of computers, so application systems can obtain computing power, storage space, and software services according to demand. It concentrates all the computing resources and manages them automatically through software, without human intervention. This frees application providers from tedious details and lets them focus on their business, which favors innovation and reduces cost. The ultimate goal of cloud computing is to provide calculation, services, and applications as a public facility, so that people can use computer resources just as they use water, electricity, gas, and telephone service. Currently, the understanding of cloud computing is still developing and changing, and cloud computing has no unanimous definition. This paper describes the three main service forms of cloud computing: SaaS, PaaS, and IaaS; compares the definitions of cloud computing given by Google, Amazon, IBM, and other companies; summarizes the basic characteristics of cloud computing; and emphasizes key technologies such as data storage, data management, virtualization, and the programming model.
Automated Generation of Message-Passing Programs: An Evaluation Using CAPTools
NASA Technical Reports Server (NTRS)
Hribar, Michelle R.; Jin, Haoqiang; Yan, Jerry C.; Saini, Subhash (Technical Monitor)
1998-01-01
Scientists at NASA Ames Research Center have been developing computational aeroscience applications on highly parallel architectures over the past ten years. During that same time period, a steady transition of hardware and system software also occurred, forcing us to expend great efforts into migrating and re-coding our applications. As applications and machine architectures become increasingly complex, the cost and time required for this process will become prohibitive. In this paper, we present the first set of results in our evaluation of interactive parallelization tools. In particular, we evaluate CAPTools' ability to parallelize computational aeroscience applications. CAPTools was tested on serial versions of the NAS Parallel Benchmarks and ARC3D, a computational fluid dynamics application, on two platforms: the SGI Origin 2000 and the Cray T3E. This evaluation includes performance, amount of user interaction required, limitations and portability. Based on these results, a discussion on the feasibility of computer aided parallelization of aerospace applications is presented along with suggestions for future work.
Munabi, Ian G; Buwembo, William; Bajunirwe, Francis; Kitara, David Lagoro; Joseph, Ruberwa; Peter, Kawungezi; Obua, Celestino; Quinn, John; Mwaka, Erisa S
2015-02-25
Effective utilization of computers and their applications in medical education and research is of paramount importance to students. The objective of this study was to determine the association between owning a computer and use of computers for research data analysis and the other factors influencing health professions students' computer use for data analysis. We conducted a cross sectional study among undergraduate health professions students at three public universities in Uganda using a self-administered questionnaire. The questionnaire was composed of questions on participant demographics, students' participation in research, computer ownership, and use of computers for data analysis. Descriptive and inferential statistics (univariable and multilevel logistic regression analysis) were used to analyse data. The level of significance was set at 0.05. Six hundred (600) of 668 questionnaires were completed and returned (response rate 89.8%). A majority of respondents were male (68.8%) and 75.3% reported owning computers. Overall, 63.7% of respondents reported that they had ever done computer based data analysis. The following factors were significant predictors of having ever done computer based data analysis: ownership of a computer (Adj. OR 1.80, p = 0.02), recently completed course in statistics (Adj. OR 1.48, p = 0.04), and participation in research (Adj. OR 2.64, p < 0.01). Owning a computer, participation in research and undertaking courses in research methods influence undergraduate students' use of computers for research data analysis. Students are increasingly participating in research, and thus need to have competencies for the successful conduct of research. Medical training institutions should encourage both curricular and extra-curricular efforts to enhance research capacity in line with the modern theories of adult learning.
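The adjusted odds ratios reported above come from multivariable logistic regression; in general, exponentiating a fitted logistic coefficient yields the adjusted odds ratio for that predictor. A rough sketch of that analysis pattern with synthetic data (not the study's), using the statsmodels library:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Synthetic data standing in for the survey responses (NOT the study's data).
rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "owns_computer": rng.integers(0, 2, n),
    "stats_course":  rng.integers(0, 2, n),
    "in_research":   rng.integers(0, 2, n),
})
# Simulate the binary outcome with assumed positive effects for each predictor.
logit = -1.0 + 0.6 * df.owns_computer + 0.4 * df.stats_course + 1.0 * df.in_research
df["did_analysis"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X = sm.add_constant(df[["owns_computer", "stats_course", "in_research"]].astype(float))
fit = sm.Logit(df["did_analysis"], X).fit(disp=0)
print(np.exp(fit.params))      # exponentiated coefficients = adjusted odds ratios
print(np.exp(fit.conf_int()))  # 95% confidence intervals on the OR scale
```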
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Song
CFD (Computational Fluid Dynamics) is a widely used technique in the engineering design field. It uses mathematical methods to simulate and predict flow characteristics in a certain physical space. Since the numerical results of CFD computation are very hard to understand, VR (virtual reality) and data visualization techniques are introduced into CFD post-processing to improve the understandability and functionality of CFD computation. In many cases CFD datasets are very large (multi-gigabyte), and more and more interaction between the user and the datasets is required. For traditional VR applications, limited computing power is a major factor preventing effective visualization of large datasets. This thesis presents a new system design to speed up traditional VR applications by using parallel and distributed computing, together with the idea of using handheld devices to enhance the interaction between a user and a VR CFD application. Techniques from different research areas, including scientific visualization, parallel computing, distributed computing, and graphical user interface design, are used in the development of the final system. As a result, the new system can be flexibly built on a heterogeneous computing environment and dramatically shortens computation time.
Cloud computing for energy management in smart grid - an application survey
NASA Astrophysics Data System (ADS)
Naveen, P.; Kiing Ing, Wong; Kobina Danquah, Michael; Sidhu, Amandeep S.; Abu-Siada, Ahmed
2016-03-01
The smart grid is an emerging energy system wherein information technology tools and techniques are applied to make the grid run more efficiently. It possesses demand response capacity to help balance electrical consumption with supply. The challenges and opportunities of emerging and future smart grids can be addressed by cloud computing. To focus on these requirements, we provide an in-depth survey of different cloud computing applications for energy management in the smart grid architecture. In this survey, we present an outline of the current state of research on smart grid development. We also propose a model of cloud-based economic power dispatch for the smart grid.
NASA Astrophysics Data System (ADS)
Lele, Sanjiva K.
2002-08-01
Funds were received in April 2001 under the Department of Defense DURIP program for construction of a 48 processor high performance computing cluster. This report details the hardware which was purchased and how it has been used to enable and enhance research activities directly supported by, and of interest to, the Air Force Office of Scientific Research and the Department of Defense. The report is divided into two major sections. The first section after this summary describes the computer cluster, its setup, and some cluster performance benchmark results. The second section explains ongoing research efforts which have benefited from the cluster hardware, and presents highlights of those efforts since installation of the cluster.
FEBio: finite elements for biomechanics.
Maas, Steve A; Ellis, Benjamin J; Ateshian, Gerard A; Weiss, Jeffrey A
2012-01-01
In the field of computational biomechanics, investigators have primarily used commercial software that is neither geared toward biological applications nor sufficiently flexible to follow the latest developments in the field. This lack of a tailored software environment has hampered research progress, as well as dissemination of models and results. To address these issues, we developed the FEBio software suite (http://mrl.sci.utah.edu/software/febio), a nonlinear implicit finite element (FE) framework, designed specifically for analysis in computational solid biomechanics. This paper provides an overview of the theoretical basis of FEBio and its main features. FEBio offers modeling scenarios, constitutive models, and boundary conditions, which are relevant to numerous applications in biomechanics. The open-source FEBio software is written in C++, with particular attention to scalar and parallel performance on modern computer architectures. Software verification is a large part of the development and maintenance of FEBio, and to demonstrate the general approach, the description and results of several problems from the FEBio Verification Suite are presented and compared to analytical solutions or results from other established and verified FE codes. An additional simulation is described that illustrates the application of FEBio to a research problem in biomechanics. Together with the pre- and postprocessing software PREVIEW and POSTVIEW, FEBio provides a tailored solution for research and development in computational biomechanics.
Semiannual final report, 1 October 1991 - 31 March 1992
NASA Technical Reports Server (NTRS)
1992-01-01
A summary of research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period 1 Oct. 1991 through 31 Mar. 1992 is presented.
Translational bioinformatics in the cloud: an affordable alternative
2010-01-01
With the continued exponential expansion of publicly available genomic data and access to low-cost, high-throughput molecular technologies for profiling patient populations, computational technologies and informatics are becoming vital considerations in genomic medicine. Although cloud computing technology is being heralded as a key enabling technology for the future of genomic research, available case studies are limited to applications in the domain of high-throughput sequence data analysis. The goal of this study was to evaluate the computational and economic characteristics of cloud computing in performing a large-scale data integration and analysis representative of research problems in genomic medicine. We find that the cloud-based analysis compares favorably in both performance and cost in comparison to a local computational cluster, suggesting that cloud computing technologies might be a viable resource for facilitating large-scale translational research in genomic medicine. PMID:20691073
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aimone, James Bradley; Betty, Rita
Using High Performance Computing to Examine the Processes of Neurogenesis Underlying Pattern Separation/Completion of Episodic Information - Sandia researchers developed novel methods and metrics for studying the computational function of neurogenesis, thus generating substantial impact to the neuroscience and neural computing communities. This work could benefit applications in machine learning and other analysis activities.
NASA's Information Power Grid: Large Scale Distributed Computing and Data Management
NASA Technical Reports Server (NTRS)
Johnston, William E.; Vaziri, Arsi; Hinke, Tom; Tanner, Leigh Ann; Feiereisen, William J.; Thigpen, William; Tang, Harry (Technical Monitor)
2001-01-01
Large-scale science and engineering are done through the interaction of people, heterogeneous computing resources, information systems, and instruments, all of which are geographically and organizationally dispersed. The overall motivation for Grids is to facilitate the routine interactions of these resources in order to support large-scale science and engineering. Multi-disciplinary simulations provide a good example of a class of applications that are very likely to require aggregation of widely distributed computing, data, and intellectual resources. Such simulations - e.g. whole system aircraft simulation and whole system living cell simulation - require integrating applications and data that are developed by different teams of researchers frequently in different locations. The research teams are the only ones that have the expertise to maintain and improve the simulation code and/or the body of experimental data that drives the simulations. This results in an inherently distributed computing and data management environment.
Xiao, Bo; Imel, Zac E.; Georgiou, Panayiotis; Atkins, David C.; Narayanan, Shrikanth S.
2017-01-01
Empathy is an important psychological process that facilitates human communication and interaction. Enhancement of empathy has profound significance in a range of applications. In this paper, we review emerging directions of research on computational analysis of empathy expression and perception as well as empathic interactions, including their simulation. We summarize the work on empathic expression analysis by the targeted signal modalities (e.g., text, audio, facial expressions). We categorize empathy simulation studies into theory-based emotion space modeling or application-driven user and context modeling. We summarize challenges in computational study of empathy including conceptual framing and understanding of empathy, data availability, appropriate use and validation of machine learning techniques, and behavior signal processing. Finally, we propose a unified view of empathy computation, and offer a series of open problems for future research. PMID:27017830
A Grid Infrastructure for Supporting Space-based Science Operations
NASA Technical Reports Server (NTRS)
Bradford, Robert N.; Redman, Sandra H.; McNair, Ann R. (Technical Monitor)
2002-01-01
Emerging technologies for computational grid infrastructures have the potential for revolutionizing the way computers are used in all aspects of our lives. Computational grids are currently being implemented to provide large-scale, dynamic, and secure research and engineering environments based on standards and next-generation reusable software, enabling greater science and engineering productivity through shared resources and distributed computing for less cost than traditional architectures. Combined with the emerging technologies of high-performance networks, grids provide researchers, scientists and engineers the first real opportunity for an effective distributed collaborative environment with access to resources such as computational and storage systems, instruments, and software tools and services for the most computationally challenging applications.
Heads in the Cloud: A Primer on Neuroimaging Applications of High Performance Computing.
Shatil, Anwar S; Younas, Sohail; Pourreza, Hossein; Figley, Chase R
2015-01-01
With larger data sets and more sophisticated analyses, it is becoming increasingly common for neuroimaging researchers to push (or exceed) the limitations of standalone computer workstations. Nonetheless, although high-performance computing platforms such as clusters, grids and clouds are already in routine use by a small handful of neuroimaging researchers to increase their storage and/or computational power, the adoption of such resources by the broader neuroimaging community remains relatively uncommon. Therefore, the goal of the current manuscript is to: 1) inform prospective users about the similarities and differences between computing clusters, grids and clouds; 2) highlight their main advantages; 3) discuss when it may (and may not) be advisable to use them; 4) review some of their potential problems and barriers to access; and finally 5) give a few practical suggestions for how interested new users can start analyzing their neuroimaging data using cloud resources. Although the aim of cloud computing is to hide most of the complexity of the infrastructure management from end-users, we recognize that this can still be an intimidating area for cognitive neuroscientists, psychologists, neurologists, radiologists, and other neuroimaging researchers lacking a strong computational background. Therefore, with this in mind, we have aimed to provide a basic introduction to cloud computing in general (including some of the basic terminology, computer architectures, infrastructure and service models, etc.), a practical overview of the benefits and drawbacks, and a specific focus on how cloud resources can be used for various neuroimaging applications.
DISCRETE EVENT SIMULATION OF OPTICAL SWITCH MATRIX PERFORMANCE IN COMPUTER NETWORKS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Imam, Neena; Poole, Stephen W
2013-01-01
In this paper, we present the application of a Discrete Event Simulator (DES) for performance modeling of optical switching devices in computer networks. Network simulators are valuable tools in situations where one cannot investigate the system directly. This situation may arise if the system under study does not exist yet or the cost of studying the system directly is prohibitive. Most available network simulators are based on the paradigm of discrete-event-based simulation. As computer networks become increasingly larger and more complex, sophisticated DES tool chains have become available for both commercial and academic research. Some well-known simulators are NS2, NS3, OPNET, and OMNEST. For this research, we have applied OMNEST for the purpose of simulating multi-wavelength performance of optical switch matrices in computer interconnection networks. Our results suggest that the application of DES to computer interconnection networks provides valuable insight into device performance and aids in topology and system optimization.
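As a concrete illustration of the discrete-event paradigm these tools share, the sketch below implements a toy event-queue simulator: events are kept in a priority queue ordered by timestamp, and the simulation clock jumps from one event to the next. This is a generic illustration of the paradigm, not OMNEST's API:

```python
import heapq

# Minimal discrete-event simulation core: events are (time, action) pairs
# kept in a priority queue; the clock jumps from event to event.
class Simulator:
    def __init__(self):
        self.now = 0.0
        self._queue = []
        self._seq = 0  # tie-breaker so equal-time events pop in schedule order

    def schedule(self, delay, action):
        heapq.heappush(self._queue, (self.now + delay, self._seq, action))
        self._seq += 1

    def run(self, until):
        while self._queue and self._queue[0][0] <= until:
            self.now, _, action = heapq.heappop(self._queue)
            action()

# Toy model: packets arriving at a switch port every 2 time units.
sim = Simulator()
def arrival():
    print(f"t={sim.now:.1f}: packet arrives")
    sim.schedule(2.0, arrival)  # schedule the next arrival
sim.schedule(0.0, arrival)
sim.run(until=6.0)  # prints arrivals at t=0, 2, 4, 6
```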
Computers for real time flight simulation: A market survey
NASA Technical Reports Server (NTRS)
Bekey, G. A.; Karplus, W. J.
1977-01-01
An extensive computer market survey was made to determine those available systems suitable for current and future flight simulation studies at Ames Research Center. The primary requirement is for the computation of relatively high frequency content (5 Hz) math models representing powered lift flight vehicles. The Rotor Systems Research Aircraft (RSRA) was used as a benchmark vehicle for computation comparison studies. The general nature of helicopter simulations and a description of the benchmark model are presented, and some of the sources of simulation difficulties are examined. A description of various applicable computer architectures is presented, along with detailed discussions of leading candidate systems and comparisons between them.
Wireless Communications in Reverberant Environments
2015-01-01
Secure Wireless Agent Testbed (SWAT), the Protocol Engineering Advanced Networking (PROTEAN) Research Group, the Data Fusion Laboratory (DFL), and the...constraints of their application.
1990-06-01
reader is cautioned that computer programs developed in this research may not have been exercised for all cases of interest. While every effort has been...formats. Previous applications of these encoding formats were on industry standard computers (PC) over a 16-20 kHz channel. This report discusses the
NASA Technical Reports Server (NTRS)
1988-01-01
The research activities of the Lewis Research Center for 1988 are summarized. The projects included are within basic and applied technical disciplines essential to aeropropulsion, space propulsion, space power, and space science/applications. These disciplines are materials science and technology, structural mechanics, life prediction, internal computational fluid mechanics, heat transfer, instruments and controls, and space electronics.
The AAHA Computer Program. American Animal Hospital Association.
Albers, J W
1986-07-01
The American Animal Hospital Association Computer Program should benefit all small animal practitioners. Through the availability of well-researched and well-developed certified software, veterinarians will have increased confidence in their purchase decisions. With the expansion of computer applications to improve practice management efficiency, veterinary computer systems will further justify their initial expense. The development of the Association's veterinary computer network will provide a variety of important services to the profession.
Wilaiprasitporn, Theerawit; Yagi, Tohru
2015-01-01
This research demonstrates the orientation-modulated attention effect on visual evoked potential. We combined this finding with our previous findings about the motion-modulated attention effect and used the result to develop novel visual stimuli for a personal identification number (PIN) application based on a brain-computer interface (BCI) framework. An electroencephalography amplifier with a single electrode channel was sufficient for our application. A computationally inexpensive algorithm and small datasets were used in processing. Seven healthy volunteers participated in experiments to measure offline performance. Mean accuracy was 83.3% at 13.9 bits/min. Encouraged by these results, we plan to continue developing the BCI-based personal identification application toward real-time systems.
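The paper's exact algorithm is not described in this abstract. As a generic illustration of how evoked potentials are commonly extracted from a single EEG channel, the sketch below averages stimulus-locked epochs to suppress background activity and scores each candidate PIN digit by the peak amplitude of its averaged response; all data here are simulated and the scoring rule is an assumption, not the authors' method:

```python
import numpy as np

def detect_attended(epochs_by_stimulus):
    """epochs_by_stimulus: dict mapping stimulus id -> array (n_trials, n_samples)."""
    scores = {}
    for stim, epochs in epochs_by_stimulus.items():
        erp = epochs.mean(axis=0)          # averaging suppresses background EEG
        scores[stim] = np.abs(erp).max()   # crude peak-amplitude score
    return max(scores, key=scores.get)

rng = np.random.default_rng(1)
noise = lambda: rng.normal(0, 1, (20, 250))   # 20 trials, 250 samples each
vep = np.zeros(250)
vep[100:120] = 3.0                            # simulated evoked deflection
epochs = {d: noise() for d in range(10)}      # ten PIN digits
epochs[7] = noise() + vep                     # digit 7 is the attended target
print(detect_attended(epochs))                # expected output: 7
```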
Heterogeneous Distributed Computing for Computational Aerosciences
NASA Technical Reports Server (NTRS)
Sunderam, Vaidy S.
1998-01-01
The research supported under this award focuses on heterogeneous distributed computing for high-performance applications, with particular emphasis on computational aerosciences. The overall goal of this project was to investigate issues in, and develop solutions to, the efficient execution of computational aeroscience codes in heterogeneous concurrent computing environments. In particular, we worked in the context of the PVM[1] system and, subsequent to detailed conversion efforts and performance benchmarking, devised novel techniques to increase the efficacy of heterogeneous networked environments for computational aerosciences. Our work has been based upon the NAS Parallel Benchmark suite, but has also recently expanded in scope to include the NAS I/O benchmarks as specified in the NHT-1 document. In this report we summarize our research accomplishments under the auspices of the grant.
Mobile Computing for Aerospace Applications
NASA Technical Reports Server (NTRS)
Alena, Richard; Swietek, Gregory E. (Technical Monitor)
1994-01-01
The use of commercial computer technology in specific aerospace mission applications can reduce the cost and project cycle time required for the development of special-purpose computer systems. Additionally, the pace of technological innovation in the commercial market has made new computer capabilities available for demonstrations and flight tests. Three areas of research and development being explored by the Portable Computer Technology Project at NASA Ames Research Center are the application of commercial client/server network computing solutions to crew support and payload operations, the analysis of requirements for portable computing devices, and testing of wireless data communication links as extensions to the wired network. This paper will present computer architectural solutions to portable workstation design including the use of standard interfaces, advanced flat-panel displays and network configurations incorporating both wired and wireless transmission media. It will describe the design tradeoffs used in selecting high-performance processors and memories, interfaces for communication and peripheral control, and high resolution displays. The packaging issues for safe and reliable operation aboard spacecraft and aircraft are presented. The current status of wireless data links for portable computers is discussed from a system design perspective. An end-to-end data flow model for payload science operations from the experiment flight rack to the principal investigator is analyzed using capabilities provided by the new generation of computer products. A future flight experiment on-board the Russian MIR space station will be described in detail including system configuration and function, the characteristics of the spacecraft operating environment, the flight qualification measures needed for safety review, and the specifications of the computing devices to be used in the experiment. The software architecture chosen shall be presented. An analysis of the performance characteristics of wireless data links in the spacecraft environment will be discussed. Network performance and operation will be modeled and preliminary test results presented. A crew support application will be demonstrated in conjunction with the network metrics experiment.
MOLAR: Modular Linux and Adaptive Runtime Support for HEC OS/R Research
DOE Office of Scientific and Technical Information (OSTI.GOV)
Frank Mueller
2009-02-05
MOLAR is a multi-institution research effort that concentrates on adaptive, reliable, and efficient operating and runtime system solutions for ultra-scale high-end scientific computing on the next generation of supercomputers. This research addresses the challenges outlined by the FAST-OS (forum to address scalable technology for runtime and operating systems) and HECRTF (high-end computing revitalization task force) activities by providing a modular Linux and adaptable runtime support for high-end computing operating and runtime systems. The MOLAR research has the following goals to address these issues. (1) Create a modular and configurable Linux system that allows customized changes based on the requirements of the applications, runtime systems, and cluster management software. (2) Build runtime systems that leverage the OS modularity and configurability to improve efficiency, reliability, scalability, ease-of-use, and provide support to legacy and promising programming models. (3) Advance computer reliability, availability and serviceability (RAS) management systems to work cooperatively with the OS/R to identify and preemptively resolve system issues. (4) Explore the use of advanced monitoring and adaptation to improve application performance and predictability of system interruptions. The overall goal of the research conducted at NCSU is to develop scalable algorithms for high-availability without single points of failure and without single points of control.
NASA Astrophysics Data System (ADS)
Huang, Qian
2014-09-01
Scientific computing often requires the availability of a massive number of computers for performing large-scale simulations, and computing in mineral physics is no exception. In order to investigate physical properties of minerals at extreme conditions, computational mineral physics uses parallel computing technology to speed up performance, utilizing multiple computer resources to process a computational task simultaneously and thereby greatly reducing computation time. Traditionally, parallel computing has been addressed using High Performance Computing (HPC) solutions and installed facilities such as clusters and supercomputers. Today, there is tremendous growth in cloud computing. Infrastructure as a Service (IaaS), the on-demand and pay-as-you-go model, creates a flexible and cost-effective means to access computing resources. In this paper, a feasibility report of HPC on a cloud infrastructure is presented. It is found that current cloud services in the IaaS layer still need to improve performance to be useful to research projects. On the other hand, Software as a Service (SaaS), another type of cloud computing, is introduced into an HPC system for computing in mineral physics, and an application of it has been developed. In this paper, an overall description of this SaaS application is presented. This contribution can promote cloud application development in computational mineral physics and cross-disciplinary studies.
Marshal Wrubel and the Electronic Computer as an Astronomical Instrument
NASA Astrophysics Data System (ADS)
Mutschlecner, J. P.; Olsen, K. H.
1998-05-01
In 1960, Marshal H. Wrubel, professor of astrophysics at Indiana University, published an influential review paper under the title, "The Electronic Computer as an Astronomical Instrument." This essay pointed out the enormous potential of the electronic computer as an instrument of observational and theoretical research in astronomy, illustrated programming concepts, and made specific recommendations for the increased use of computers in astronomy. He noted that, with a few scattered exceptions, computer use by the astronomical community had heretofore been "timid and sporadic." This situation was to improve dramatically in the next few years. By the late 1950s, general-purpose, high-speed, "mainframe" computers were just emerging from the experimental, developmental stage, but few were affordable by or available to academic and research institutions not closely associated with large industrial or national defense programs. Yet by 1960 Wrubel had spent a decade actively pioneering and promoting the imaginative application of electronic computation within the astronomical community. Astronomy upper-level undergraduate and graduate students at Indiana were introduced to computing, and Ph.D. candidates who he supervised applied computer techniques to problems in theoretical astrophysics. He wrote an early textbook on programming, taught programming classes, and helped establish and direct the Research Computing Center at Indiana, later named the Wrubel Computing Center in his honor. He and his students created a variety of algorithms and subroutines and exchanged these throughout the astronomical community by distributing the Astronomical Computation News Letter. Nationally as well as internationally, Wrubel actively cooperated with other groups interested in computing applications for theoretical astrophysics, often through his position as secretary of the IAU commission on Stellar Constitution.
ICASE semiannual report, April 1 - September 30, 1989
NASA Technical Reports Server (NTRS)
1990-01-01
The Institute conducts unclassified basic research in applied mathematics, numerical analysis, and computer science in order to extend and improve problem-solving capabilities in science and engineering, particularly in aeronautics and space. The major categories of the current Institute for Computer Applications in Science and Engineering (ICASE) research program are: (1) numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; (2) control and parameter identification problems, with emphasis on effective numerical methods; (3) computational problems in engineering and the physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and (4) computer systems and software, especially vector and parallel computers. ICASE reports are considered to be primarily preprints of manuscripts that have been submitted to appropriate research journals or that are to appear in conference proceedings.
Architectural Aspects of Grid Computing and its Global Prospects for E-Science Community
NASA Astrophysics Data System (ADS)
Ahmad, Mushtaq
2008-05-01
The paper reviews imminent architectural aspects of Grid Computing for the e-Science community, supporting scientific research and business/commercial collaboration beyond physical boundaries. Grid Computing provides all the needed facilities: hardware, software, communication interfaces, high-speed Internet, safe authentication, and a secure environment for collaboration on research projects around the globe. It provides a very fast compute engine for those scientific and engineering research projects and business/commercial applications which are heavily compute intensive and/or require humongous amounts of data. It also makes possible the use of very advanced methodologies, simulation models, expert systems, and the treasure of knowledge available around the globe under the umbrella of knowledge sharing. Thus it makes possible one of the dreams of a global village for the benefit of the e-Science community across the globe.
The Sunrise project: An R&D project for a national information infrastructure prototype
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Juhnyoung
1995-02-01
Sunrise is a Los Alamos National Laboratory (LANL) project started in October 1993. It is intended to be a prototype National Information Infrastructure (NII) development project. A main focus of Sunrise is to tie together enabling technologies (networking, object-oriented distributed computing, graphical interfaces, security, multimedia technologies, and data mining technologies) with several specific applications. A diverse set of application areas was chosen to ensure that the solutions developed in the project are as generic as possible. Some of the application areas are materials modeling, medical records and image analysis, transportation simulations, and education. This paper provides a description of Sunrise and a view of the architecture and objectives of this evolving project. The primary objectives of Sunrise are three-fold: (1) To develop common information-enabling tools for advanced scientific research and its applications to industry; (2) To enhance the capabilities of important research programs at the Laboratory; and (3) To define a new way of collaboration between computer science and industrially relevant research.
A Computer-Assisted Instruction in Teaching Abstract Statistics to Public Affairs Undergraduates
ERIC Educational Resources Information Center
Ozturk, Ali Osman
2012-01-01
This article attempts to demonstrate the applicability of a computer-assisted instruction supported with simulated data in teaching abstract statistical concepts to political science and public affairs students in an introductory research methods course. The software is called the Elaboration Model Computer Exercise (EMCE) in that it takes a great…
Computer Aided Reading Diagnosis.
ERIC Educational Resources Information Center
McEneaney, John E.
Computer technologies are having an ever-increasing influence on educational research and practice in Russia and the United States. In Russia, a number of recent papers have focused on the application of the computer as a teaching tool and on its influence in instructional organization and planning. In the United States, there is a great deal of…
The Centre of High-Performance Scientific Computing, Geoverbund, ABC/J - Geosciences enabled by HPSC
NASA Astrophysics Data System (ADS)
Kollet, Stefan; Görgen, Klaus; Vereecken, Harry; Gasper, Fabian; Hendricks-Franssen, Harrie-Jan; Keune, Jessica; Kulkarni, Ketan; Kurtz, Wolfgang; Sharples, Wendy; Shrestha, Prabhakar; Simmer, Clemens; Sulis, Mauro; Vanderborght, Jan
2016-04-01
The Centre of High-Performance Scientific Computing (HPSC TerrSys) was founded in 2011 to establish a centre of competence in high-performance scientific computing in terrestrial systems and the geosciences, enabling fundamental and applied geoscientific research in the Geoverbund ABC/J (geoscientific research alliance of the Universities of Aachen, Cologne, Bonn and the Research Centre Jülich, Germany). The specific goals of HPSC TerrSys are to achieve relevance at the national and international level in (i) the development and application of HPSC technologies in the geoscientific community; (ii) student education; (iii) HPSC services and support also to the wider geoscientific community; and in (iv) the industry and public sectors via e.g., useful applications and data products. A key feature of HPSC TerrSys is the Simulation Laboratory Terrestrial Systems, which is located at the Jülich Supercomputing Centre (JSC) and provides extensive capabilities with respect to porting, profiling, tuning and performance monitoring of geoscientific software in JSC's supercomputing environment. We will present a summary of success stories of HPSC applications including integrated terrestrial model development, parallel profiling and its application from watersheds to the continent; massively parallel data assimilation using physics-based models and ensemble methods; quasi-operational terrestrial water and energy monitoring; and convection permitting climate simulations over Europe. The success stories stress the need for a formalized education of students in the application of HPSC technologies in the future.
Design and implementation of a Windows NT network to support CNC activities
NASA Technical Reports Server (NTRS)
Shearrow, C. A.
1996-01-01
The Manufacturing, Materials, & Processes Technology Division is undergoing dramatic changes to bring its manufacturing practices current with today's technological revolution. The Division is developing Computer Automated Design and Computer Automated Manufacturing (CAD/CAM) capabilities. The development of resource tracking is underway in the form of an accounting software package called Infisy. These two efforts will bring the division into the 1980's in relationship to manufacturing processes. Computer Integrated Manufacturing (CIM) is the final phase of change to be implemented. This document is a qualitative study and application of a CIM application capable of finishing the changes necessary to bring the manufacturing practices into the 1990's. The documentation provided in this qualitative research effort includes discovery of the current status of manufacturing in the Manufacturing, Materials, & Processes Technology Division, including the software, hardware, network, and mode of operation. The proposed direction of research includes a network design, computers to be used, software to be used, machine-to-computer connections, an estimated timeline for implementation, and a cost estimate. Recommendations for the division's improvement include actions to be taken, software to utilize, and computer configurations.
Richesson, Rachel L; Smerek, Michelle M; Blake Cameron, C
2016-01-01
The ability to reproducibly identify clinically equivalent patient populations is critical to the vision of learning health care systems that implement and evaluate evidence-based treatments. The use of common or semantically equivalent phenotype definitions across research and health care use cases will support this aim. Currently, there is no single consolidated repository for computable phenotype definitions, making it difficult to find all definitions that already exist, and also hindering the sharing of definitions between user groups. Drawing from our experience in an academic medical center that supports a number of multisite research projects and quality improvement studies, we articulate a framework that will support the sharing of phenotype definitions across research and health care use cases, and highlight gaps and areas that need attention and collaborative solutions. An infrastructure for re-using computable phenotype definitions and sharing experience across health care delivery and clinical research applications includes: access to a collection of existing phenotype definitions, information to evaluate their appropriateness for particular applications, a knowledge base of implementation guidance, supporting tools that are user-friendly and intuitive, and a willingness to use them. We encourage prospective researchers and health administrators to re-use existing EHR-based condition definitions where appropriate and share their results with others to support a national culture of learning health care. There are a number of federally funded resources to support these activities, and research sponsors should encourage their use.
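To make the notion of a computable phenotype definition concrete, here is a minimal sketch that encodes a hypothetical EHR-based condition definition as structured data. The code system (ICD-10-CM, where prefix E11 denotes type 2 diabetes) is real, but the specific criteria, the class, and the `matches` helper are illustrative assumptions, not a definition endorsed by the authors or drawn from any repository.

```python
# A minimal, hypothetical computable phenotype definition (illustrative only).
# Real repositories store richer metadata: provenance, validation results,
# intended use, and implementation guidance.
from dataclasses import dataclass, field

@dataclass
class PhenotypeDefinition:
    name: str
    code_system: str
    diagnosis_codes: set = field(default_factory=set)  # code prefixes
    min_occurrences: int = 2  # e.g., two coded encounters required

    def matches(self, patient_codes: list) -> bool:
        """Return True if the patient's coded diagnoses satisfy the definition."""
        hits = [c for c in patient_codes
                if any(c.startswith(p) for p in self.diagnosis_codes)]
        return len(hits) >= self.min_occurrences

# Example: a simplistic type 2 diabetes definition (ICD-10-CM prefix E11).
t2dm = PhenotypeDefinition(name="Type 2 diabetes mellitus",
                           code_system="ICD-10-CM",
                           diagnosis_codes={"E11"})
print(t2dm.matches(["E11.9", "I10", "E11.65"]))  # True: two qualifying codes
```

Sharing definitions at this level of explicitness is what allows two institutions to reproducibly identify clinically equivalent populations.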
Research and technology at the Kennedy Space Center
NASA Technical Reports Server (NTRS)
1983-01-01
Cryogenic engineering, hypergolic engineering, hazardous warning, structures and mechanics, computer sciences, communications, meteorology, technology applications, safety engineering, materials analysis, biomedicine, and engineering management and training aids research are reviewed.
Research in mathematical theory of computation. [computer programming applications
NASA Technical Reports Server (NTRS)
Mccarthy, J.
1973-01-01
Research progress in the following areas is reviewed: (1) a new version of the computer program LCF (logic for computable functions), including a facility to search for proofs automatically; (2) the description of the language PASCAL in terms of both LCF and first-order logic; (3) discussion of LISP semantics in LCF and an attempt to prove the correctness of the London compilers in a formal way; (4) design of both special-purpose and domain-independent proving procedures, with program correctness specifically in mind; (5) design of languages for describing such proof procedures; and (6) the embedding of these ideas in the first-order checker.
Time and Space Partition Platform for Safe and Secure Flight Software
NASA Astrophysics Data System (ADS)
Esquinas, Angel; Zamorano, Juan; de la Puente, Juan A.; Masmano, Miguel; Crespo, Alfons
2012-08-01
There are a number of research and development activities exploring Time and Space Partition (TSP) to implement safe and secure flight software. This approach allows different real-time applications with different levels of criticality to execute on the same computer board. In order to do that, flight applications must be isolated from each other in the temporal and spatial domains. This paper presents the first results of a partitioning platform based on the Open Ravenscar Kernel (ORK+) and the XtratuM hypervisor. ORK+ is a small, reliable real-time kernel supporting the Ada Ravenscar computational model that is central to the ASSERT development process. XtratuM supports multiple virtual machines, i.e. partitions, on a single computer and is being used in the Integrated Modular Avionics for Space study. ORK+ executes in an XtratuM partition, enabling Ada applications to share the computer board with other applications.
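As a conceptual illustration of the time-partitioning half of TSP (not the ORK+/XtratuM implementation itself), the sketch below simulates a fixed cyclic schedule in which each partition receives an exclusive slot within a repeating major frame. Partition names and slot durations are invented; a real hypervisor enforces these budgets in hardware-assisted context switches rather than a cooperative loop.

```python
# Conceptual sketch of Time and Space Partitioning (TSP) scheduling:
# a repeating "major frame" is divided into fixed slots, and each
# partition runs only inside its own slot. Purely illustrative.
import time

MAJOR_FRAME = [           # (partition name, slot duration in seconds)
    ("flight_control", 0.050),   # high-criticality partition
    ("telemetry",      0.030),   # medium-criticality partition
    ("payload",        0.020),   # low-criticality partition
]

def run_partition(name: str, budget: float) -> None:
    """Stand-in for dispatching a partition; real systems switch contexts."""
    deadline = time.monotonic() + budget
    while time.monotonic() < deadline:
        pass  # the partition's applications would execute here

for frame in range(3):           # three major frames
    for name, budget in MAJOR_FRAME:
        run_partition(name, budget)
    print(f"major frame {frame} complete")
```

The point of the fixed frame is that a fault or overrun in one partition cannot steal time budgeted to another, which is what allows mixed-criticality applications to share one board.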
Development and application of air quality models at the US ...
Overview of the development and application of air quality models at the U.S. EPA, particularly focused on the development and application of the Community Multiscale Air Quality (CMAQ) model developed within the Computational Exposure Division (CED) of the National Exposure Research Laboratory (NERL). This presentation provides a simple overview of air quality model development and application geared toward a non-technical student audience. The NERL CED develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.
[Activities of Research Institute for Advanced Computer Science
NASA Technical Reports Server (NTRS)
Gross, Anthony R. (Technical Monitor); Leiner, Barry M.
2001-01-01
The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center, Moffett Field, California. RIACS research focuses on the three cornerstones of IT research necessary to meet the future challenges of NASA missions: (1) automated reasoning for autonomous systems: techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention; such craft will be equipped to independently solve problems as they arise and fulfill their missions with minimum direction from Earth; (2) human-centered computing: many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities; (3) high-performance computing and networking: advances in the performance of computing and networking continue to have a major impact on a variety of NASA endeavors, ranging from modeling and simulation to analysis of large scientific datasets to collaborative engineering, planning, and execution. In addition, RIACS collaborates with NASA scientists to apply IT research to a variety of NASA application domains. RIACS also engages in other activities, such as workshops, seminars, visiting scientist programs, and student summer programs, designed to encourage and facilitate collaboration between the university and NASA IT research communities.
NASA Astrophysics Data System (ADS)
Lugmayr, Artur
2006-02-01
The research field of ambient media is starting to spread rapidly, and the first applications for consumer homes are on the way. Ambient media is the logical continuation of research around media. Media has been evolving from old media (e.g. print media), to integrated presentation in one form (multimedia, or new media), to generating a synthetic world (virtual reality), to media where the natural environment is the user interface (ambient media), and will be evolving towards real/synthetic indistinguishable media (bio-media or bio-multimedia). After the IT bubble burst, multimedia was lacking a vision of potential future scenarios and applications. Within this research paper the potentials, applications, and market-available solutions of mobile ambient multimedia are studied. The features of ambient mobile multimedia are manifold and include wearable computers, adaptive software, context awareness, ubiquitous computers, middleware, and wireless networks. The paper especially focuses on algorithms and methods that can be utilized to realize modern mobile ambient systems.
NASA Technical Reports Server (NTRS)
Jedlovec, Gary J.; Molthan, Andrew; Zavodsky, Bradley T.; Case, Jonathan L.; LaFontaine, Frank J.; Srikishen, Jayanthi
2010-01-01
The NASA Short-term Prediction Research and Transition Center (SPoRT)'s new "Weather in a Box" resources will provide weather research and forecast modeling capabilities for real-time application. Model output will provide additional forecast guidance and research into the impacts of new NASA satellite data sets and software capabilities. By combining several research tools and satellite products, SPoRT can generate model guidance that is strongly influenced by unique NASA contributions.
Reconfigurable vision system for real-time applications
NASA Astrophysics Data System (ADS)
Torres-Huitzil, Cesar; Arias-Estrada, Miguel
2002-03-01
Recently, a growing community of researchers has used reconfigurable systems to solve computationally intensive problems. Reconfigurability provides optimized processors for system-on-chip designs, and makes it easy to import technology to a new system through reusable modules. The main objective of this work is the investigation of a reconfigurable computer system targeted at computer vision and real-time applications. The system is intended to circumvent the inherent computational load of most window-based computer vision algorithms. It aims to support such tasks by providing an FPGA-based hardware architecture for task-specific vision applications with enough processing power, using as few hardware resources as possible, and a mechanism for building systems using this architecture. Regarding the software part of the system, a library of pre-designed, general-purpose modules that implement common window-based computer vision operations is being investigated. A common generic interface is established for these modules in order to define hardware/software components. These components can be interconnected to develop more complex applications, providing an efficient mechanism for transferring image and result data among modules. Some preliminary results are presented and discussed.
Daxini, S D; Prajapati, J M
2014-01-01
Meshfree methods are viewed as next-generation computational techniques. Given the evident limitations of conventional grid-based methods, like FEM, in dealing with problems of fracture mechanics, large deformation, and simulation of manufacturing processes, meshfree methods have gained much attention from researchers. A number of meshfree methods have been proposed to date for analyzing complex problems in various fields of engineering. The present work reviews recent developments and some earlier applications of well-known meshfree methods like EFG and MLPG to various structural mechanics and fracture mechanics applications, such as bending, buckling, free vibration analysis, sensitivity analysis and topology optimization, single- and mixed-mode crack problems, fatigue crack growth, and dynamic crack analysis, as well as some typical applications like vibration of cracked structures, thermoelastic crack problems, and failure transition in impact problems. Due to the complex nature of meshfree shape functions and the evaluation of domain integrals, meshless methods are computationally expensive compared to conventional mesh-based methods. Some improved versions of the original meshfree methods and other techniques suggested by researchers to improve the computational efficiency of meshfree methods are also reviewed here.
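To see where that expense comes from, recall the textbook moving least squares (MLS) approximant underlying EFG; this is standard background, not a formula reproduced from the review itself:

```latex
u^h(x) = \sum_{I=1}^{n} \Phi_I(x)\, u_I,
\qquad
\Phi_I(x) = \mathbf{p}^{\mathsf{T}}(x)\,\mathbf{A}^{-1}(x)\, w(x - x_I)\,\mathbf{p}(x_I),
\qquad
\mathbf{A}(x) = \sum_{I=1}^{n} w(x - x_I)\,\mathbf{p}(x_I)\,\mathbf{p}^{\mathsf{T}}(x_I),
```

where \(\mathbf{p}(x)\) is a polynomial basis and \(w\) a compactly supported weight function. Because the moment matrix \(\mathbf{A}(x)\) must be assembled and inverted at every evaluation point, the shape functions are far costlier than polynomial FEM shape functions, which is the overhead the reviewed acceleration techniques target.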
An assessment of the real-time application capabilities of the SIFT computer system
NASA Technical Reports Server (NTRS)
Butler, R. W.
1982-01-01
The real-time capabilities of the SIFT computer system, a highly reliable multicomputer architecture developed to support the flight controls of a relaxed-static-stability aircraft, are discussed. The SIFT computer system was designed to meet extremely high reliability requirements and to facilitate a formal proof of its correctness. Although SIFT represents a significant achievement in fault-tolerant system research, it presents an unusual and restrictive interface to its users. The characteristics of the user interface and its impact on application system design are assessed.
A brief overview of NASA Langley's research program in formal methods
NASA Technical Reports Server (NTRS)
1992-01-01
An overview of NASA Langley's research program in formal methods is presented. The major goal of this work is to bring formal methods technology to a sufficiently mature level for use by the United States aerospace industry. Towards this goal, work is underway to design and formally verify a fault-tolerant computing platform suitable for advanced flight control applications. Also, several direct technology transfer efforts have been initiated that apply formal methods to critical subsystems of real aerospace computer systems. The research team consists of six NASA civil servants and contractors from Boeing Military Aircraft Company, Computational Logic Inc., Odyssey Research Associates, SRI International, University of California at Davis, and Vigyan Inc.
Exploiting GPUs in Virtual Machine for BioCloud
Jo, Heeseung; Jeong, Jinkyu; Lee, Myoungho; Choi, Dong Hoon
2013-01-01
Recently, biological applications have begun to be reimplemented to exploit the many cores of GPUs for better computational performance. Therefore, by providing virtualized GPUs to VMs in a cloud computing environment, many biological applications will willingly move into the cloud to enhance their computational performance and utilize effectively unlimited cloud computing resources while reducing the expense of computations. In this paper, we propose a BioCloud system architecture that enables VMs to use GPUs in a cloud environment. Because much of the previous research has focused on mechanisms for sharing GPUs among VMs, it cannot achieve sufficient performance for biological applications, for which computation throughput is more crucial than sharing. The proposed system exploits the pass-through mode of the PCI Express (PCI-E) channel. By allowing each VM to access the underlying GPUs directly, applications can achieve almost the same performance as in a native environment. In addition, our scheme multiplexes GPUs by using the hot plug-in/out device features of the PCI-E channel. By adding or removing GPUs in each VM in an on-demand manner, VMs in the same physical host can time-share their GPUs. We implemented the proposed system using the Xen VMM and NVIDIA GPUs and showed that our prototype is highly effective for biological GPU applications in a cloud environment. PMID:23710465
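A rough sketch of how a management layer might time-share a passed-through GPU in the spirit of this scheme, using Xen's `xl` toolstack (the `xl pci-attach` and `xl pci-detach` commands hot-plug a PCI device into and out of a running domain). The domain names and the PCI address are placeholders, and IOMMU setup, synchronization, and error handling are omitted; this is not the paper's actual implementation.

```python
# Hedged sketch: time-sharing a passed-through GPU between Xen VMs by
# hot-plugging the PCI device, in the spirit of the BioCloud scheme.
# Domain names and the PCI BDF address below are placeholders.
import subprocess

GPU_BDF = "0000:03:00.0"  # placeholder PCI bus:device.function of the GPU

def attach_gpu(domain: str) -> None:
    # `xl pci-attach` hot-plugs a PCI device into a running domain.
    subprocess.run(["xl", "pci-attach", domain, GPU_BDF], check=True)

def detach_gpu(domain: str) -> None:
    # `xl pci-detach` hot-unplugs it again, freeing the GPU for another VM.
    subprocess.run(["xl", "pci-detach", domain, GPU_BDF], check=True)

# On-demand time-sharing: give the GPU to each VM in turn.
for vm in ["biovm1", "biovm2"]:
    attach_gpu(vm)
    # ... the VM runs its GPU-accelerated biological workload here ...
    detach_gpu(vm)
```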
Speeding Up Geophysical Research Using Docker Containers Within Multi-Cloud Environment.
NASA Astrophysics Data System (ADS)
Synytsky, R.; Henadiy, S.; Lobzakov, V.; Kolesnikov, L.; Starovoit, Y. O.
2016-12-01
How useful are geophysical observations for minimizing losses from natural disasters today? Do they help decrease the number of human victims during tsunamis and earthquakes? Unfortunately, such use is still at an early stage. Making these observations more useful by improving early warning and prediction systems with the help of cloud computing is a major goal worth achieving. Cloud computing technologies have been proving their ability to speed up application development in many areas for a decade already. The cloud unlocks new opportunities for geoscientists by providing access to modern data processing tools and algorithms, including real-time high-performance computing, big data processing, artificial intelligence, and others. Emerging lightweight cloud technologies, such as Docker containers, are gaining wide traction in IT because they enable faster and more efficient deployment of applications in a cloud environment. They make it possible to deploy and manage geophysical applications and systems in minutes across multiple clouds and data centers, which is of utmost importance for the next generation of applications. In this session we will demonstrate how Docker container technology within a multi-cloud environment can accelerate the development of applications specifically designed for geophysical research.
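As a small illustration of the multi-cloud deployment pattern described, the sketch below uses the Docker SDK for Python (docker-py) to launch the same containerized processing job on Docker endpoints in two different clouds. The image name, host URLs, and command are hypothetical placeholders, not artifacts of the presented system.

```python
# Sketch of running one containerized geophysical job across multiple
# cloud endpoints with the Docker SDK for Python (docker-py).
# Image name, host URLs, and command are hypothetical placeholders.
import docker

ENDPOINTS = [
    "tcp://cloud-a.example.org:2376",  # e.g., a public-cloud Docker host
    "tcp://cloud-b.example.org:2376",  # e.g., a private-cloud Docker host
]

for url in ENDPOINTS:
    client = docker.DockerClient(base_url=url)
    container = client.containers.run(
        image="example/seismic-processing:latest",  # hypothetical image
        command=["process", "--window", "60s"],
        detach=True,  # return immediately; the job runs in the background
    )
    print(f"{url}: started {container.short_id}")
```

Because the container image is identical everywhere, moving a workload between clouds reduces to pointing the client at a different endpoint, which is the portability argument the abstract makes.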
Evaluation of Advanced Computing Techniques and Technologies: Reconfigurable Computing
NASA Technical Reports Server (NTRS)
Wells, B. Earl
2003-01-01
The focus of this project was threefold: to survey reconfigurable computing technology and determine its level of maturity and suitability for NASA applications; to better understand and assess the effectiveness of the reconfigurable design paradigm utilized within the HAL-15 reconfigurable computer system, which was made available to NASA MSFC for this purpose by Star Bridge Systems, Inc.; and to implement at least one application that would benefit from the performance levels that are possible with reconfigurable hardware. It was originally proposed that experiments in fault tolerance and dynamic reconfigurability would be performed, but time constraints mandated that these be pursued as future research.
Computational predictions of zinc oxide hollow structures
NASA Astrophysics Data System (ADS)
Tuoc, Vu Ngoc; Huan, Tran Doan; Thao, Nguyen Thi
2018-03-01
Nanoporous materials are emerging as potential candidates for a wide range of technological applications in the environment, electronics, and optoelectronics, to name just a few. Within this active research area, experimental works are predominant, while theoretical/computational prediction and study of these materials face some intrinsic challenges, one of them being how to predict porous structures. We propose a computationally and technically feasible approach for predicting zinc oxide structures with hollows at the nanoscale. The designed zinc oxide hollow structures are studied with computations using the density functional tight binding and conventional density functional theory methods, revealing a variety of promising mechanical and electronic properties that can potentially find realistic future applications.
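For a sense of how such structure models are typically set up in practice, the sketch below builds a bulk wurtzite ZnO cell with the Atomic Simulation Environment (ASE). The lattice constants are common literature values, and the hollow-carving step is only indicated by a comment; the paper's actual construction procedure is not reproduced here.

```python
# Sketch: building a wurtzite ZnO cell with ASE as a starting point for
# hollow-structure models. Lattice constants are common literature values;
# the paper's actual hollow-construction procedure is not reproduced here.
from ase.build import bulk

# Wurtzite ZnO primitive cell (a ~ 3.25 A, c ~ 5.21 A).
zno = bulk("ZnO", "wurtzite", a=3.25, c=5.21)

# Repeat into a supercell large enough to host a nanoscale hollow.
supercell = zno.repeat((6, 6, 4))
print(len(supercell), "atoms")

# A hollow structure would then be carved by deleting atoms inside a
# chosen region, followed by relaxation with DFTB or DFT (omitted here).
```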
Cognitive Model Exploration and Optimization: A New Challenge for Computational Science
2010-01-01
Introduction: Research in cognitive science often involves the generation and analysis of computational cognitive models to explain various… (HPC) clusters and volunteer computing for large-scale computational resources. The majority of applications on the Department of Defense HPC clusters focus on solving partial differential equations (Post, 2009). These tend to be lean, fast models with little noise. While we lack specific…
Computational Toxicology as Implemented by the US EPA ...
Computational toxicology is the application of mathematical and computer models to help assess chemical hazards and risks to human health and the environment. Supported by advances in informatics, high-throughput screening (HTS) technologies, and systems biology, the U.S. Environmental Protection Agency (EPA) is developing robust and flexible computational tools that can be applied to the thousands of chemicals in commerce and the contaminant mixtures found in air, water, and hazardous-waste sites. The Office of Research and Development (ORD) Computational Toxicology Research Program (CTRP) is composed of three main elements. The largest component is the National Center for Computational Toxicology (NCCT), which was established in 2005 to coordinate research on chemical screening and prioritization, informatics, and systems modeling. The second element consists of related activities in the National Health and Environmental Effects Research Laboratory (NHEERL) and the National Exposure Research Laboratory (NERL). The third and final component consists of academic centers working on various aspects of computational toxicology and funded by the U.S. EPA Science to Achieve Results (STAR) program. Together these elements form the key components in the implementation of both the initial strategy, A Framework for a Computational Toxicology Research Program (U.S. EPA, 2003), and the newly released The U.S. Environmental Protection Agency's Strategic Plan for Evaluating the T…
NASA Astrophysics Data System (ADS)
Landgrebe, Anton J.
1987-03-01
An overview of research activities at the United Technologies Research Center (UTRC) in the area of Computational Fluid Dynamics (CFD) is presented. The requirement and use of various levels of computers, including supercomputers, for the CFD activities is described. Examples of CFD directed toward applications to helicopters, turbomachinery, heat exchangers, and the National Aerospace Plane are included. Helicopter rotor codes for the prediction of rotor and fuselage flow fields and airloads were developed with emphasis on rotor wake modeling. Airflow and airload predictions and comparisons with experimental data are presented. Examples are presented of recent parabolized Navier-Stokes and full Navier-Stokes solutions for hypersonic shock-wave/boundary layer interaction, and hydrogen/air supersonic combustion. In addition, other examples of CFD efforts in turbomachinery Navier-Stokes methodology and separated flow modeling are presented. A brief discussion of the 3-tier scientific computing environment is also presented, in which the researcher has access to workstations, mid-size computers, and supercomputers.
High End Computing Technologies for Earth Science Applications: Trends, Challenges, and Innovations
NASA Technical Reports Server (NTRS)
Parks, John (Technical Monitor); Biswas, Rupak; Yan, Jerry C.; Brooks, Walter F.; Sterling, Thomas L.
2003-01-01
Earth science applications of the future will stress the capabilities of even the highest performance supercomputers in the areas of raw compute power, mass storage management, and software environments. These NASA mission critical problems demand usable multi-petaflops and exabyte-scale systems to fully realize their science goals. With an exciting vision of the technologies needed, NASA has established a comprehensive program of advanced research in computer architecture, software tools, and device technology to ensure that, in partnership with US industry, it can meet these demanding requirements with reliable, cost effective, and usable ultra-scale systems. NASA will exploit, explore, and influence emerging high end computing architectures and technologies to accelerate the next generation of engineering, operations, and discovery processes for NASA Enterprises. This article captures this vision and describes the concepts, accomplishments, and the potential payoff of the key thrusts that will help meet the computational challenges in Earth science applications.
LHC@Home: a BOINC-based volunteer computing infrastructure for physics studies at CERN
NASA Astrophysics Data System (ADS)
Barranco, Javier; Cai, Yunhai; Cameron, David; Crouch, Matthew; Maria, Riccardo De; Field, Laurence; Giovannozzi, Massimo; Hermes, Pascal; Høimyr, Nils; Kaltchev, Dobrin; Karastathis, Nikos; Luzzi, Cinzia; Maclean, Ewen; McIntosh, Eric; Mereghetti, Alessio; Molson, James; Nosochkov, Yuri; Pieloni, Tatiana; Reid, Ivan D.; Rivkin, Lenny; Segal, Ben; Sjobak, Kyrre; Skands, Peter; Tambasco, Claudia; Veken, Frederik Van der; Zacharov, Igor
2017-12-01
The LHC@Home BOINC project has provided computing capacity for numerical simulations to researchers at CERN since 2004, and has since 2011 been expanded with a wider range of applications. The traditional CERN accelerator physics simulation code SixTrack enjoys continuing volunteer support, and thanks to virtualisation a number of applications from the LHC experiment collaborations and particle theory groups have joined the consolidated LHC@Home BOINC project. This paper addresses the challenges related to traditional and virtualized applications in the BOINC environment, and how volunteer computing has been integrated into the overall computing strategy of the laboratory through the consolidated LHC@Home service. Thanks to the computing power provided by volunteers joining LHC@Home, numerous accelerator beam physics studies have been carried out, yielding an improved understanding of charged particle dynamics in the CERN Large Hadron Collider (LHC) and its future upgrades. The main results are highlighted in this paper.
NASA Astrophysics Data System (ADS)
Chiner, Esther; Garcia-Vera, Victoria E.
2017-11-01
The purpose of this study was to examine students' computer attitudes and experience, as well as students' perceptions about the use of two specific software applications (Google Drive Spreadsheets and Arquimedes) in the Building Engineering context. The relationships among these variables were also examined. Ninety-two students took part in this study. Results suggest that students hold favourable computer attitudes. Moreover, a significant positive relationship was found between students' attitudes and their computer experience. Findings also show that students find the Arquimedes software more useful and of higher output quality than Google Drive Spreadsheets, while the latter is perceived to be easier to use. Regarding the relationship between students' attitudes towards the use of computers and their perceptions about the use of both software applications, a significant positive relationship was found only in the case of Arquimedes. Findings are discussed in terms of their implications for practice and further research.
The Peer Assisted Teaching Model for Undergraduate Research at a HBCU
ERIC Educational Resources Information Center
Wu, Liyun; Lewis, Marilyn W.
2018-01-01
Despite wide application of research skills in higher education, undergraduate students reported research and computer anxiety, and low association between research and their professional goals. This study aims to assess whether peer-assisted mentoring programs would promote positive changes in undergraduates' attitudes toward research. Using a…
NASA Astrophysics Data System (ADS)
Delipetrev, Blagoj
2016-04-01
Presently, most existing software is desktop-based and designed to work on a single computer, which is a major limitation in many ways: limited processing and storage capacity, accessibility, availability, etc. The only feasible solution lies in the web and the cloud. This abstract presents research and development of a cloud computing geospatial application for water resources based on free and open-source software and open standards, using a hybrid public-private cloud deployment model running on two separate virtual machines (VMs). The first (VM1) runs on Amazon Web Services (AWS) and the second (VM2) runs on a Xen cloud platform. The cloud application is developed using free and open-source software, open standards, and prototype code, and presents a framework for developing specialized cloud geospatial applications that need only a web browser to be used. This cloud application is the ultimate geospatial collaboration platform, because multiple users across the globe with an internet connection and a browser can jointly model geospatial objects, enter attribute data and information, execute algorithms, and visualize results. The presented cloud application is available all the time and accessible from everywhere; it is scalable; it works in a distributed computing environment; it creates a real-time multiuser collaboration platform; its programming-language code and components are interoperable; and it is flexible in accommodating additional components. The cloud geospatial application is implemented as a specialized water resources application with three web services: (1) data infrastructure (DI), (2) support for water resources modelling (WRM), and (3) user management. The web services run on two VMs that communicate over the internet, providing services to users. The application was tested on the Zletovica river basin case study with concurrent multiple users. The application is a state-of-the-art cloud geospatial collaboration platform. The presented solution is a prototype and can be used as a foundation for developing any specialized cloud geospatial application. Further research will focus on distributing the cloud application over additional VMs and testing the scalability and availability of the services.
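To suggest the shape of such browser-facing web services, here is a minimal Flask sketch exposing hypothetical endpoints for the data-infrastructure and modelling services. The routes, payload fields, and model call are placeholders invented for illustration, not the application's real API.

```python
# Minimal sketch of browser-facing geospatial web services (Flask).
# Endpoint names and payload fields are hypothetical placeholders,
# not the API of the application described in the abstract.
from flask import Flask, jsonify, request

app = Flask(__name__)
FEATURES = []  # stand-in for the data-infrastructure (DI) store

@app.route("/di/features", methods=["GET", "POST"])
def features():
    """Store or list geospatial objects entered collaboratively by users."""
    if request.method == "POST":
        FEATURES.append(request.get_json())
        return jsonify(status="stored", count=len(FEATURES)), 201
    return jsonify(FEATURES)

@app.route("/wrm/run", methods=["POST"])
def run_model():
    """Trigger a (placeholder) water-resources model run on stored data."""
    result = {"basin": request.get_json().get("basin"), "status": "queued"}
    return jsonify(result), 202

if __name__ == "__main__":
    app.run()  # each service could run on its own VM behind a proxy
```

Keeping all state behind HTTP services is what lets concurrent users on any browser share one live model, as in the Zletovica case study.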
A Guide to IRUS-II Application Development
1989-09-01
Stallard (editors). Research and Development in Natural Language Understanding as Part of the Strategic Computing Program, chapter 3, pages 27-34… Development in Natural Language Processing in the Strategic Computing Program. Computational Linguistics 12(2):132-136, April-June 1986. [24] Sidner, C.L.… assist developers interested in adapting IRUS-II to new application domains. Chapter 2 provides a general introduction and overview. Chapter 3 describes…
1990-09-01
APPLICATION OF A MICRO COMPUTER-BASED MANAGEMENT INFORMATION SYSTEM TO IMPROVE THE USAF SERVICE REPORTING PROCESS. Contents: I. Introduction; General Issue; Specific Research Problem…
ERIC Educational Resources Information Center
Acat, M. Bahaddin; Kilic, Abdurrahman; Girmen, Pinar; Anagun, Senegul S.
2007-01-01
The main purpose of this study is to identify the levels of necessity and applicability of the courses offered in the Departments of Computer Education and Instructional Technologies, based on the views of fourth-year and graduated students. In the study, a descriptive research model was used. The population of the study were final-year and…
High-Performance Java Codes for Computational Fluid Dynamics
NASA Technical Reports Server (NTRS)
Riley, Christopher; Chatterjee, Siddhartha; Biswas, Rupak; Biegel, Bryan (Technical Monitor)
2001-01-01
The computational science community is reluctant to write large-scale, computationally intensive applications in Java due to concerns over Java's poor performance, despite the claimed software engineering advantages of its object-oriented features. Naive Java implementations of numerical algorithms can perform poorly compared to corresponding Fortran or C implementations. To achieve high performance, Java applications must be designed with good performance as a primary goal. This paper presents the object-oriented design and implementation of two real-world applications from the field of Computational Fluid Dynamics (CFD): a finite-volume fluid flow solver (LAURA, from NASA Langley Research Center), and an unstructured mesh adaptation algorithm (2D_TAG, from NASA Ames Research Center). This work builds on our previous experience with the design of high-performance numerical libraries in Java. We examine the performance of the applications using the currently available Java infrastructure and show that the Java version of the flow solver LAURA performs almost within a factor of two of the original procedural version. Our Java version of the mesh adaptation algorithm 2D_TAG performs within a factor of 1.5 of its original procedural version on certain platforms. Our results demonstrate that object-oriented software design principles are not necessarily inimical to high performance.
Application research of Ganglia in Hadoop monitoring and management
NASA Astrophysics Data System (ADS)
Li, Gang; Ding, Jing; Zhou, Lixia; Yang, Yi; Liu, Lei; Wang, Xiaolei
2017-03-01
There are many applications of the Hadoop system in the fields of big data and cloud computing. The storage and application test bench for the seismic network at the Earthquake Administration of Tianjin uses the Hadoop system, which is operated and monitored with the open-source software Ganglia. This paper reviews the functions of Ganglia, its installation and configuration process, and its effectiveness in operating and monitoring the Hadoop system. It also briefly introduces the approach and effect of monitoring the Hadoop system with the Nagios software. This experience is valuable to the industry for monitoring systems of cloud computing platforms.
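For a sense of how such monitoring data can be consumed programmatically, the sketch below reads the XML state dump that a Ganglia gmond daemon returns to any client connecting on its default TCP port 8649, and prints a couple of standard per-host metrics. The gmond host name is a placeholder, and this is a generic consumer, not the paper's setup.

```python
# Sketch: reading cluster metrics from a Ganglia gmond daemon, which
# dumps its current state as XML to any client connecting on TCP 8649.
# The gmond host below is a placeholder.
import socket
import xml.etree.ElementTree as ET

def read_gmond(host: str = "gmond.example.org", port: int = 8649) -> bytes:
    with socket.create_connection((host, port), timeout=5) as sock:
        chunks = []
        while True:
            data = sock.recv(4096)
            if not data:  # gmond closes the connection after the dump
                break
            chunks.append(data)
    return b"".join(chunks)

root = ET.fromstring(read_gmond())
for host in root.iter("HOST"):
    for metric in host.iter("METRIC"):
        if metric.get("NAME") in ("load_one", "mem_free"):
            print(host.get("NAME"), metric.get("NAME"), metric.get("VAL"))
```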
Microstructure Applications for Battery Design | Transportation Research | NREL
NREL's Computer-Aided Engineering for Electric Drive Vehicle Batteries (CAEBAT) work includes simulating physics at the electrode microstructure level and has created a virtual design tool for battery…
ERIC Educational Resources Information Center
Ramsay, Guy
2005-01-01
While the internationalisation of higher education has made learner diversity a key consideration in tertiary pedagogical practice, research into the application of computer-mediated technologies in this domain has rarely taken into account culture. This article responds to this gap in the research by comparing "Confucian-heritage" and…
NASA Technical Reports Server (NTRS)
Stack, S. H.
1981-01-01
A computer-aided design system has recently been developed specifically for the small research group environment. The system is implemented on a Prime 400 minicomputer linked with a CDC 6600 computer. The goal was to assign the minicomputer specific tasks, such as data input and graphics, thereby reserving the large mainframe computer for time-consuming analysis codes. The basic structure of the design system consists of GEMPAK, a computer code that generates detailed configuration geometry from a minimum of input; interface programs that reformat GEMPAK geometry for input to the analysis codes; and utility programs that simplify computer access and data interpretation. The working system has had a large positive impact on the quantity and quality of research performed by the originating group. This paper describes the system, the major factors that contributed to its particular form, and presents examples of its application.
Survey data collection using Audio Computer Assisted Self-Interview.
Jones, Rachel
2003-04-01
The Audio Computer-Assisted Self-Interview (ACASI) is a computer application that allows a research participant to hear survey interview items over a computer headset and read the corresponding items on a computer monitor. The ACASI automates progression from one item to the next, skipping irrelevant items. The research participant responds by pressing a number keypad, sending the data directly into a database. The ACASI was used to enhance participants' sense of privacy. A convenience sample of 257 young urban women, ages 18 to 29 years, was interviewed in neighborhood settings concerning human immunodeficiency virus (HIV) sexual risk behaviors. Notebook computers were used to facilitate mobility. The overwhelming majority rated their experience with ACASI as easy to use. This article focuses on the use of ACASI in HIV behavioral research, its benefits, and approaches to resolving some identified problems with this method of data collection.
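The skip-logic behavior that ACASI automates can be illustrated with a tiny sketch: items are presented in order, follow-ups that a prior answer makes irrelevant are skipped, and numeric keypad answers go straight into a record. All item texts, IDs, and conditions below are invented for illustration, not taken from the study instrument.

```python
# Illustrative sketch of ACASI-style skip logic. All items are invented.
ITEMS = [
    {"id": "q1", "text": "Age in years?", "skip_if": None},
    {"id": "q2", "text": "Sexually active in past year? (1=yes, 2=no)",
     "skip_if": None},
    # q3 is only relevant if q2 == 1; otherwise it is skipped.
    {"id": "q3", "text": "Number of partners in past year?",
     "skip_if": lambda r: r.get("q2") != 1},
]

def administer(answers):
    """Walk the instrument; `answers` stands in for keypad input."""
    record = {}
    for item in ITEMS:
        if item["skip_if"] and item["skip_if"](record):
            continue  # automated skip of an irrelevant item
        record[item["id"]] = int(answers[item["id"]])
    return record  # in ACASI this is written directly to the database

print(administer({"q1": "24", "q2": "2", "q3": "0"}))  # q3 skipped
```

Automating the branching this way removes the interviewer from sensitive items entirely, which is the privacy benefit the study relies on.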
DOE Office of Scientific and Technical Information (OSTI.GOV)
Turner, James C. Jr.; Mason, Thomas; Guerrieri, Bruno
1997-10-01
Programs have been established at Florida A & M University to attract minority students to research careers in mathematics and computational science. The primary goal of the program was to increase the number of such students studying computational science via an interactive multimedia learning environment. One mechanism used for meeting this goal was the development of educational modules. This academic-year program, established within the mathematics department at Florida A&M University, introduced students to computational science projects using high-performance computers. Additional activities were conducted during the summer; these included workshops, meetings, and lectures. Through the exposure provided by this program to scientific ideas and research in computational science, it is likely that their successful applications of tools from this interdisciplinary field will be high.
Color engineering in the age of digital convergence
NASA Astrophysics Data System (ADS)
MacDonald, Lindsay W.
1998-09-01
Digital color imaging has developed over the past twenty years from specialized scientific applications into the mainstream of computing. In addition to the phenomenal growth of computer processing power and storage capacity, great advances have been made in the capabilities and cost-effectiveness of color imaging peripherals. The majority of imaging applications, including the graphic arts, video and film have made the transition from analogue to digital production methods. Digital convergence of computing, communications and television now heralds new possibilities for multimedia publishing and mobile lifestyles. Color engineering, the application of color science to the design of imaging products, is an emerging discipline that poses exciting challenges to the international color imaging community for training, research and standards.
XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem: Mid-year report FY17 Q2
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moreland, Kenneth D.; Pugmire, David; Rogers, David
The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.
XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem: Year-end report FY17.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moreland, Kenneth D.; Pugmire, David; Rogers, David
The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.
XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem. Mid-year report FY16 Q2
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moreland, Kenneth D.; Sewell, Christopher; Childs, Hank
The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.
XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem: Year-end report FY15 Q4.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moreland, Kenneth D.; Sewell, Christopher; Childs, Hank
The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.
Photonics for aerospace sensors
NASA Astrophysics Data System (ADS)
Pellegrino, John; Adler, Eric D.; Filipov, Andree N.; Harrison, Lorna J.; van der Gracht, Joseph; Smith, Dale J.; Tayag, Tristan J.; Viveiros, Edward A.
1992-11-01
The maturation in the state-of-the-art of optical components is enabling increased applications for the technology. Most notable is the ever-expanding market for fiber optic data and communications links, familiar in both commercial and military markets. The inherent properties of optics and photonics, however, have suggested that components and processors may be designed that offer advantages over more commonly considered digital approaches for a variety of airborne sensor and signal processing applications. Various academic, industrial, and governmental research groups have been actively investigating and exploiting these properties of high bandwidth, large degree of parallelism in computation (e.g., processing in parallel over a two-dimensional field), and interconnectivity, and have succeeded in advancing the technology to the stage of systems demonstration. Such advantages as computational throughput and low operating power consumption are highly attractive for many computationally intensive problems. This review covers the key devices necessary for optical signal and image processors, some of the system application demonstration programs currently in progress, and active research directions for the implementation of next-generation architectures.
Advanced technology airfoil research, volume 2. [conferences
NASA Technical Reports Server (NTRS)
1979-01-01
A comprehensive review of airfoil research is presented. The major thrust of the research is in three areas: development of computational aerodynamic codes for airfoil analysis and design, development of experimental facilities and test techniques, and all types of airfoil applications.
Animal-Related Computer Simulation Programs for Use in Education and Research. AWIC Series Number 1.
ERIC Educational Resources Information Center
Engler, Kevin P.
Computer models have definite limitations regarding the representation of biological systems, but they do have useful applications in reducing the number of animals used to study physiological systems, especially for educational purposes. This guide lists computer models that simulate living systems and can be used to demonstrate physiological,…
NASA Technical Reports Server (NTRS)
1998-01-01
Under a NASA SBIR (Small Business Innovative Research) contract, (NAS5-30905), EAI Simulation Associates, Inc., developed a new digital simulation computer, Starlight(tm). With an architecture based on the analog model of computation, Starlight(tm) outperforms all other computers on a wide range of continuous system simulation. This system is used in a variety of applications, including aerospace, automotive, electric power and chemical reactors.
Computer Science (CS) in the Compulsory Education Curriculum: Implications for Future Research
ERIC Educational Resources Information Center
Passey, Don
2017-01-01
The subject of computer science (CS) and computer science education (CSE) has relatively recently arisen as a subject for inclusion within the compulsory school curriculum. Up to this present time, a major focus of technologies in the school curriculum has in many countries been on applications of existing technologies into subject practice (both…
National Educational Computing Conference Proceedings (9th, Dallas, Texas, June 15-17, 1988).
ERIC Educational Resources Information Center
Ryan, William C., Ed.
The more than 200 papers and panel, project, and special session reports represented in this collection focus on innovations, trends, and research on the use of computers in a variety of educational settings. Of these, the full text is provided for 37 presentations and abstracts for 182. The topics discussed include: computer applications in…
RICIS Symposium 1992: Mission and Safety Critical Systems Research and Applications
NASA Technical Reports Server (NTRS)
1992-01-01
This conference deals with mission- and safety-critical systems: computer systems that control systems whose failure to operate correctly could produce the loss of life and/or property. Topics covered are: the work of standards groups, computer systems design and architecture, software reliability, process control systems, knowledge-based expert systems, and computer and telecommunication protocols.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kahn, R.E.
1983-11-01
The fifth generation of computers is described. The three disciplines involved in bringing such a new generation to reality are microelectronics, artificial intelligence, and computer systems and architecture. Applications in industry, offices, aerospace, education, health care, and retailing are outlined. An analysis is given of research efforts in the US, Japan, the U.K., and Europe. Fifth-generation programming languages are detailed.
An overview of computer vision
NASA Technical Reports Server (NTRS)
Gevarter, W. B.
1982-01-01
An overview of computer vision is provided. Image understanding and scene analysis are emphasized, and pertinent aspects of pattern recognition are treated. The basic approach to computer vision systems, the techniques utilized, applications, the current existing systems and state-of-the-art issues and research requirements, who is doing it and who is funding it, and future trends and expectations are reviewed.
Investigating an Innovative Computer Application to Improve L2 Word Recognition from Speech
ERIC Educational Resources Information Center
Matthews, Joshua; O'Toole, John Mitchell
2015-01-01
The ability to recognise words from the aural modality is a critical aspect of successful second language (L2) listening comprehension. However, little research has been reported on computer-mediated development of L2 word recognition from speech in L2 learning contexts. This report describes the development of an innovative computer application…
NASA Technical Reports Server (NTRS)
1993-01-01
Developed under a Small Business Innovation Research (SBIR) contract, RAMPANT is a CFD software package for computing flow around complex shapes. The package is flexible, fast and easy to use. It has found a great number of applications, including computation of air flow around a Nordic ski jumper, prediction of flow over an airfoil and computation of the external aerodynamics of motor vehicles.
DOE Office of Scientific and Technical Information (OSTI.GOV)
MCCLEAN, JARROD; HANER, THOMAS; STEIGER, DAMIAN
FermiLib is an open source software package designed to facilitate the development and testing of algorithms for simulations of fermionic systems on quantum computers. Fermionic simulations represent an important application of early quantum devices, with many potential high-value targets such as quantum chemistry for the development of new catalysts. This software strives to provide a link between the required domain expertise in specific fermionic applications and quantum computing, to enable more users to directly interface with, and develop for, these applications. It is an extensible Python library designed to interface with the high-performance quantum simulator ProjectQ, as well as application-specific software such as PSI4 from the domain of quantum chemistry. Such software is key to enabling effective user facilities in quantum computation research.
Framework for architecture-independent run-time reconfigurable applications
NASA Astrophysics Data System (ADS)
Lehn, David I.; Hudson, Rhett D.; Athanas, Peter M.
2000-10-01
Configurable Computing Machines (CCMs) have emerged as a technology with the computational benefits of custom ASICs as well as the flexibility and reconfigurability of general-purpose microprocessors. Significant effort from the research community has focused on techniques to move this reconfigurability from a rapid application development tool to a run-time tool. This requires the ability to change the hardware design while the application is executing, and is known as Run-Time Reconfiguration (RTR). Widespread acceptance of run-time reconfigurable custom computing depends upon the existence of high-level automated design tools. Such tools must reduce the designer's effort to port applications between different platforms as the architecture, hardware, and software evolve. A Java implementation of a high-level application framework, called Janus, is presented here. In this environment, developers create Java classes that describe the structural behavior of an application. The framework allows hardware and software modules to be freely mixed and interchanged. A compilation phase of the development process analyzes the structure of the application and adapts it to the target platform. Janus is capable of structuring the run-time behavior of an application to take advantage of the memory and computational resources available.
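The idea of hardware and software modules interchangeable behind a common interface can be sketched as follows. The class names are invented for illustration, and a real RTR framework like Janus would bind the hardware variant to an FPGA configuration rather than a host-side stub.

```python
# Conceptual sketch of the hardware/software component interchange that
# RTR frameworks such as Janus enable. Class names are invented; in a
# real system the "hardware" variant would drive an FPGA bitstream.
from abc import ABC, abstractmethod

class Module(ABC):
    """Common generic interface shared by hardware and software modules."""
    @abstractmethod
    def process(self, data: list) -> list: ...

class SoftwareFilter(Module):
    def process(self, data):
        return [x * 0.5 for x in data]  # computed on the host CPU

class HardwareFilter(Module):
    def process(self, data):
        # Placeholder: a real implementation would stream `data` to a
        # module configured onto the FPGA and read results back.
        return [x * 0.5 for x in data]

def pipeline(stages: list, data: list) -> list:
    for stage in stages:          # stages are freely mixed hw/sw modules
        data = stage.process(data)
    return data

print(pipeline([SoftwareFilter(), HardwareFilter()], [1.0, 2.0, 4.0]))
```

A compilation phase, as the abstract describes, can then decide per platform which stages run in hardware and which in software without changing the application's structure.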
Outcome of a Workshop on Applications of Protein Models in Biomedical Research
Schwede, Torsten; Sali, Andrej; Honig, Barry; Levitt, Michael; Berman, Helen M.; Jones, David; Brenner, Steven E.; Burley, Stephen K.; Das, Rhiju; Dokholyan, Nikolay V.; Dunbrack, Roland L.; Fidelis, Krzysztof; Fiser, Andras; Godzik, Adam; Huang, Yuanpeng Janet; Humblet, Christine; Jacobson, Matthew P.; Joachimiak, Andrzej; Krystek, Stanley R.; Kortemme, Tanja; Kryshtafovych, Andriy; Montelione, Gaetano T.; Moult, John; Murray, Diana; Sanchez, Roberto; Sosnick, Tobin R.; Standley, Daron M.; Stouch, Terry; Vajda, Sandor; Vasquez, Max; Westbrook, John D.; Wilson, Ian A.
2009-01-01
Summary: We describe the proceedings and conclusions from a "Workshop on Applications of Protein Models in Biomedical Research" that was held at the University of California, San Francisco on 11 and 12 July 2008. At the workshop, international scientists involved with structure modeling explored (i) how models are currently used in biomedical research, (ii) what the requirements and challenges for different applications are, and (iii) how the interaction between the computational and experimental research communities could be strengthened to advance the field. PMID:19217386
eHealth research from the user's perspective.
Hesse, Bradford W; Shneiderman, Ben
2007-05-01
The application of information technology (IT) to issues of healthcare delivery has had a long and tortuous history in the United States. Within the field of eHealth, vanguard applications of advanced computing techniques, such as applications in artificial intelligence or expert systems, have languished in spite of a track record of scholarly publication and decisional accuracy. The problem is one of purpose, of asking the right questions for the science to solve. Historically, many computer science pioneers have been tempted to ask "what can the computer do?" New advances in eHealth are prompting developers to ask "what can people do?" How can eHealth take part in national goals for healthcare reform to empower relationships between healthcare professionals and patients, healthcare teams and families, and hospitals and communities to improve health equitably throughout the population? To do this, eHealth researchers must combine best evidence from the user sciences (human factors engineering, human-computer interaction, psychology, and usability) with best evidence in medicine to create transformational improvements in the quality of care that medicine offers. These improvements should follow recommendations from the Institute of Medicine to create a healthcare system that is (1) safe, (2) effective (evidence based), (3) patient centered, and (4) timely. Relying on the eHealth researcher's intuitive grasp of systems issues, improvements should be made with considerations of users and beneficiaries at the individual (patient-physician), group (family-staff), community, and broad environmental levels.
Computational Modeling in Concert with Laboratory Studies: Application to B Cell Differentiation
Remediation is expensive, so accurate prediction of dose-response is important to help control costs. Dose response is a function of biological mechanisms. Computational models of these mechanisms improve the efficiency of research and provide the capability for prediction.
NASA Astrophysics Data System (ADS)
Cofino, A. S.; Fernández Quiruelas, V.; Blanco Real, J. C.; García Díez, M.; Fernández, J.
2013-12-01
Nowadays, Grid computing is a powerful computational tool, ready to be used by the scientific community in different areas (such as biomedicine, astrophysics, climate, etc.). However, the use of these distributed computing infrastructures (DCIs) is not yet common practice in climate research, and only a few teams and applications in this area take advantage of them. Thus, the WRF4G project's objective is to popularize the use of this technology in the atmospheric sciences. To achieve this objective, one of the most widely used applications has been taken (WRF, a limited-area model and successor of the MM5 model), which has a user community of more than 8000 researchers worldwide. This community develops its research activity in different areas and could benefit from the advantages of Grid resources (case-study simulations, regional hindcast/forecast, sensitivity studies, etc.). The WRF model is used by many groups in the climate research community to carry out downscaling simulations, so this community will also benefit. However, Grid infrastructures have some drawbacks for the execution of applications that make intensive use of CPU and memory over long periods of time. This makes it necessary to develop a specific framework (middleware) that encapsulates the application and provides appropriate services for the monitoring and management of the simulations and the data. Thus, another objective of the WRF4G project is the development of a generic adaptation of WRF to DCIs. It should simplify access to the DCIs for researchers, and also free them from the technical and computational aspects of using these DCIs. Finally, to demonstrate the ability of WRF4G to solve actual scientific challenges of interest and relevance to climate science (implying a high computational cost), we show results from different kinds of downscaling experiments, such as ERA-Interim re-analysis, CMIP5 models, and seasonal experiments. WRF4G is being used to run WRF simulations contributing to the CORDEX initiative and to other projects such as SPECS and EUPORIAS. This work has been partially funded by the European Regional Development Fund (ERDF) and the Spanish National R&D Plan 2008-2011 (CGL2011-28864).
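The abstract does not detail WRF4G's middleware. The key service it describes, keeping long CPU-intensive simulations alive on unreliable grid nodes, typically rests on a chunk-and-checkpoint pattern; a minimal Python sketch of that pattern, with entirely hypothetical names and a trivial stand-in for the model step, follows.

```python
# Hypothetical sketch of the restart pattern such middleware relies on: a long
# simulation is split into chunks, each ending in a checkpoint, so a failed
# grid job can resume from the last completed chunk. All names are made up.
import os
import pickle

CHECKPOINT = "wrf_state.pkl"

def load_state():
    if os.path.exists(CHECKPOINT):
        with open(CHECKPOINT, "rb") as f:
            return pickle.load(f)
    return {"step": 0}

def save_state(state):
    with open(CHECKPOINT, "wb") as f:
        pickle.dump(state, f)

def advance(state, steps):            # stand-in for one model integration chunk
    state["step"] += steps
    return state

state = load_state()
while state["step"] < 240:            # e.g. 240 simulated hours in 24 h chunks
    state = advance(state, 24)
    save_state(state)                 # survives node failure between chunks
print("finished at step", state["step"])
```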
Computing through Scientific Abstractions in SysBioPSE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chin, George; Stephan, Eric G.; Gracio, Deborah K.
2004-10-13
Today, biologists and bioinformaticists have a tremendous amount of computational power at their disposal. With the availability of supercomputers, burgeoning scientific databases and digital libraries such as GenBank and PubMed, and pervasive computational environments such as the Grid, biologists have access to a wealth of computational capabilities and scientific data at hand. Yet, the rapid development of computational technologies has far exceeded the typical biologist's ability to effectively apply the technology in their research. Computational sciences research and development efforts such as the Biology Workbench, BioSPICE (Biological Simulation Program for Intra-Cellular Evaluation), and BioCoRE (Biological Collaborative Research Environment) are important in connecting biologists and their scientific problems to computational infrastructures. On the Computational Cell Environment and Heuristic Entity-Relationship Building Environment projects at the Pacific Northwest National Laboratory, we are jointly developing a new breed of scientific problem solving environment called SysBioPSE that will allow biologists to access and apply computational resources in the scientific research context. In contrast to other computational science environments, SysBioPSE operates as an abstraction layer above a computational infrastructure. The goal of SysBioPSE is to allow biologists to apply computational resources in the context of the scientific problems they are addressing and the scientific perspectives from which they conduct their research. More specifically, SysBioPSE allows biologists to capture and represent scientific concepts and theories and experimental processes, and to link these views to scientific applications, data repositories, and computer systems.
Operation of the Institute for Computer Applications in Science and Engineering
NASA Technical Reports Server (NTRS)
1975-01-01
The ICASE research program is described in detail; it consists of four major categories: (1) efficient use of vector and parallel computers, with particular emphasis on the CDC STAR-100; (2) numerical analysis, with particular emphasis on the development and analysis of basic numerical algorithms; (3) analysis and planning of large-scale software systems; and (4) computational research in engineering and the natural sciences, with particular emphasis on fluid dynamics. The work in each of these areas is described in detail; other activities are discussed, and a prognosis of future activities is included.
The development of an engineering computer graphics laboratory
NASA Technical Reports Server (NTRS)
Anderson, D. C.; Garrett, R. E.
1975-01-01
Hardware and software systems developed to further research and education in interactive computer graphics were described, as well as several of the ongoing application-oriented projects, educational graphics programs, and graduate research projects. The software system consists of a FORTRAN IV subroutine package, used in conjunction with a PDP-11/40 minicomputer as the primary computation processor and the Imlac PDS-1 as an intelligent display processor. The package comprises a comprehensive set of graphics routines for dynamic, structured two-dimensional display manipulation, and numerous routines to handle a variety of input devices at the Imlac.
Application of technology developed for flight simulation at NASA Langley Research Center
NASA Technical Reports Server (NTRS)
Cleveland, Jeff I., II
1991-01-01
In order to meet the stringent time-critical requirements for real-time man-in-the-loop flight simulation, computer processing operations including mathematical model computation and data input/output to the simulators must be deterministic and be completed in as short a time as possible. Personnel at NASA's Langley Research Center are currently developing the use of supercomputers for simulation mathematical model computation for real-time simulation. This, coupled with the use of an open systems software architecture, will advance the state-of-the-art in real-time flight simulation.
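As a concrete illustration of the deadline discipline this requirement implies, the sketch below runs a fixed-period frame loop and flags any overrun. It is a generic Python illustration with an assumed frame rate, not Langley's simulation software.

```python
# Minimal sketch of the fixed-period, deadline-checked loop that real-time
# man-in-the-loop simulation requires (illustrative only).
import time

FRAME_PERIOD = 1.0 / 60.0          # assumed 60 Hz frame rate

def frame(t):                      # stand-in for model computation + I/O
    return t * t

next_deadline = time.monotonic() + FRAME_PERIOD
for i in range(120):
    frame(i)
    remaining = next_deadline - time.monotonic()
    if remaining < 0:
        print(f"frame {i} overran its deadline by {-remaining * 1e3:.2f} ms")
    else:
        time.sleep(remaining)      # idle until the next frame boundary
    next_deadline += FRAME_PERIOD
```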
Squid - a simple bioinformatics grid.
Carvalho, Paulo C; Glória, Rafael V; de Miranda, Antonio B; Degrave, Wim M
2005-08-03
BLAST is a widely used genetic research tool for analysis of similarity between nucleotide and protein sequences. This paper presents a software application entitled "Squid" that makes use of grid technology. The current version, as an example, is configured for BLAST applications, but adaptation for other computing-intensive repetitive tasks can be easily accomplished in the open source version. This enables the allocation of remote resources to perform distributed computing, making large BLAST queries viable without the need of high-end computers. Most distributed computing/grid solutions have complex installation procedures requiring a computer specialist, or have limitations regarding operating systems. Squid is a multi-platform, open-source program designed to "keep things simple" while offering high-end computing power for large scale applications. Squid also has an efficient fault tolerance and crash recovery system against data loss, being able to re-route jobs upon node failure and to recover even if the master machine fails. Our results show that a Squid application, working with N nodes and proper network resources, can process BLAST queries almost N times faster than if working with only one computer. Squid offers high-end computing, even for the non-specialist, and is freely available at the project web site. Its open-source and binary Windows distributions contain detailed instructions and a "plug-n-play" installation containing a pre-configured example.
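Squid's scheduler and wire protocol are not described in the abstract. The Python sketch below (hypothetical names, with local processes standing in for grid nodes) illustrates the general pattern the abstract attributes to it: partition the query set into jobs, dispatch them in parallel, and re-run any job whose node fails.

```python
# Illustrative only: split a large query set into chunks, farm them out to
# workers, and re-queue any job whose worker fails. Local processes stand in
# for grid nodes; run_blast_chunk is a stand-in for invoking BLAST.
from concurrent.futures import ProcessPoolExecutor, as_completed

def run_blast_chunk(chunk):
    return [f"hit:{seq}" for seq in chunk]

def distribute(sequences, n_nodes=4, chunk_size=2):
    chunks = [sequences[i:i + chunk_size]
              for i in range(0, len(sequences), chunk_size)]
    results = []
    with ProcessPoolExecutor(max_workers=n_nodes) as pool:
        futures = {pool.submit(run_blast_chunk, c): c for c in chunks}
        for fut in as_completed(futures):
            try:
                results.extend(fut.result())
            except Exception:          # node failure: re-route the job
                results.extend(run_blast_chunk(futures[fut]))
    return results

if __name__ == "__main__":
    print(distribute([f"seq{i}" for i in range(8)]))
```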
Resin-composite blocks for dental CAD/CAM applications.
Ruse, N D; Sadoun, M J
2014-12-01
Advances in digital impression technology and manufacturing processes have led to a dramatic paradigm shift in dentistry and to the widespread use of computer-aided design/computer-aided manufacturing (CAD/CAM) in the fabrication of indirect dental restorations. Research and development of materials suitable for CAD/CAM applications is currently the most active field in dental materials. Two classes of materials are used in the production of CAD/CAM restorations: glass-ceramics/ceramics and resin composites. While glass-ceramics/ceramics have overall superior mechanical and esthetic properties, resin-composite materials may offer significant advantages related to their machinability and intra-oral reparability. This review summarizes recent developments in resin-composite materials for CAD/CAM applications, focusing on both commercial and experimental materials. © International & American Associations for Dental Research.
GSTARS computer models and their applications, Part II: Applications
Simoes, F.J.M.; Yang, C.T.
2008-01-01
In part 1 of this two-paper series, a brief summary of the basic concepts and theories used in developing the Generalized Stream Tube model for Alluvial River Simulation (GSTARS) computer models was presented. Part 2 provides examples that illustrate some of the capabilities of the GSTARS models and how they can be applied to solve a wide range of river and reservoir sedimentation problems. Laboratory and field case studies are used, and the examples show representative applications of the earlier and of the more recent versions of GSTARS. Some of the more recent capabilities implemented in GSTARS3, one of the latest versions of the series, are also discussed here in more detail. © 2008 International Research and Training Centre on Erosion and Sedimentation and the World Association for Sedimentation and Erosion Research.
Use of microcomputers in health and social service applications in developing nations.
Bertrand, W E
1987-01-01
The microcomputer is creating something of a revolution in many developing nations where historically there has been a lack of access to computer power at all levels of the health sector. For the first time, practitioners and researchers, often trained in computer techniques for developing countries, have access through microcomputers to data and information manipulation in their local workplace. While the history of microcomputers in such settings is short, this article presents early evidence from several countries which indicates the usefulness of various applications. The majority of the applications reported in the literature from clinical and research laboratories consists of national database systems and special studies of morbidity and mortality. Secondary applications, including assistance in bibliographic searches and word and graphics processing, are also reviewed in this article. A summary of the most utilized microcomputer hardware configurations completes the review.
Overview of ICE Project: Integration of Computational Fluid Dynamics and Experiments
NASA Technical Reports Server (NTRS)
Stegeman, James D.; Blech, Richard A.; Babrauckas, Theresa L.; Jones, William H.
2001-01-01
Researchers at the NASA Glenn Research Center have developed a prototype integrated environment for interactively exploring, analyzing, and validating information from computational fluid dynamics (CFD) computations and experiments. The Integrated CFD and Experiments (ICE) project is a first attempt at providing a researcher with a common user interface for control, manipulation, analysis, and data storage for both experiments and simulation. ICE can be used as a live, on-line system that displays and archives data as they are gathered; as a postprocessing system for dataset manipulation and analysis; and as a control interface or "steering mechanism" for simulation codes while visualizing the results. Although the full capabilities of ICE have not been completely demonstrated, this report documents the current system. Various applications of ICE are discussed: a low-speed compressor, a supersonic inlet, real-time data visualization, and a parallel-processing simulation code interface. A detailed data model for the compressor application is included in the appendix.
Making a Significant Difference with Institutional Research.
ERIC Educational Resources Information Center
Clagett, Craig A.; Huntington, Robin B.
Focusing on the changing roles of institutional researchers (IRs) due to the widespread distribution of computer technology, this monograph explores the effective application of IR skills to maximize the impact of research on campus policy making. The discussion is centered around three major principles guiding institutional research: know the…
Research in Distance Education: A System Modeling Approach.
ERIC Educational Resources Information Center
Saba, Farhad; Twitchell, David
This demonstration of the use of a computer simulation research method based on the System Dynamics modeling technique for studying distance education reviews research methods in distance education, including the broad categories of conceptual and case studies, and presents a rationale for the application of systems research in this area. The…
Cryogenic Memories based on Spin-Singlet and Spin-Triplet Ferromagnetic Josephson Junctions
NASA Astrophysics Data System (ADS)
Gingrich, Eric
The last several decades have seen an explosion in the use and size of computers for scientific applications. The US Department of Energy has set an ExaScale computing goal for high performance computing that is projected to be unattainable by current CMOS computing designs. This has led to a renewed interest in superconducting computing as a means of beating these projections. One of the primary requirements of this thrust is the development of an efficient cryogenic memory. Estimates of the power consumption of early Rapid Single Flux Quantum (RSFQ) memory designs are on the order of megawatts, far too steep for any real application. Therefore, other memory concepts are required. S/F/S Josephson junctions, a class of device in which two superconductors (S) are separated by one or more ferromagnetic layers (F), have shown promise as memory elements. Several different systems have been proposed utilizing either the spin-singlet or spin-triplet superconducting states. This talk will discuss the concepts underpinning these devices, and the recent work done to demonstrate their feasibility. This research is supported in part by the Office of the Director of National Intelligence (ODNI), Intelligence Advanced Research Projects Activity (IARPA), via U.S. Army Research Office Contract W911NF-14-C-0115.
Educational NASA Computational and Scientific Studies (enCOMPASS)
NASA Technical Reports Server (NTRS)
Memarsadeghi, Nargess
2013-01-01
Educational NASA Computational and Scientific Studies (enCOMPASS) is an educational project of NASA Goddard Space Flight Center aimed at bridging the gap between the computational objectives and needs of NASA's scientific research, missions, and projects, and academia's latest advances in applied mathematics and computer science. enCOMPASS achieves this goal via bidirectional collaboration and communication between NASA and academia. Using developed NASA Computational Case Studies in university computer science/engineering and applied mathematics classes is a way of addressing NASA's goals of contributing to the Science, Technology, Engineering, and Mathematics (STEM) National Objective. The enCOMPASS Web site at http://encompass.gsfc.nasa.gov provides additional information. There are currently nine enCOMPASS case studies developed in areas of earth sciences, planetary sciences, and astrophysics. Some of these case studies have been published in AIP and IEEE's Computing in Science and Engineering magazine. A few university professors have used enCOMPASS case studies in their computational classes and contributed their findings to NASA scientists. In these case studies, after introducing the science area, the specific problem, and related NASA missions, students are first asked to solve a known problem using NASA data and approaches used in the past, often published in a scientific/research paper. Then, after learning about the NASA application and related computational tools and approaches for solving the proposed problem, students are given a harder problem as a challenge for them to research and develop solutions for. This project provides a model for NASA scientists and engineers on one side, and university students, faculty, and researchers in computer science and applied mathematics on the other side, to learn from each other's areas of work, computational needs and solutions, and the latest advances in research and development. This innovation takes NASA science and engineering applications to computer science and applied mathematics university classes, and makes NASA objectives part of the university curricula. There is great potential for growth and return on investment of this program, to the point where every major university in the U.S. would use at least one of these case studies in one of its computational courses, and where every NASA scientist and engineer facing a computational challenge (without having the resources or expertise to solve it) would use enCOMPASS to formulate the problem as a case study, provide it to a university, and get back solutions and ideas.
Bibliographic Research and the Love of Learning.
ERIC Educational Resources Information Center
Weiskel, Timothy C.
1985-01-01
Discusses the design and several applications of the computer program, BIBLIO-File. Designed primarily for interactive bibliographic instruction and research, this program allows users to enter, sort, index, search, and print annotated bibliographic information. (MBR)
NASA Technical Reports Server (NTRS)
1994-01-01
CESDIS, the Center of Excellence in Space Data and Information Sciences, was developed jointly by NASA, Universities Space Research Association (USRA), and the University of Maryland in 1988 to focus on the design of advanced computing techniques and data systems to support NASA Earth and space science research programs. CESDIS is operated by USRA under contract to NASA. The Director, Associate Director, Staff Scientists, and administrative staff are located on-site at NASA's Goddard Space Flight Center in Greenbelt, Maryland. The primary CESDIS mission is to increase the connection between computer science and engineering research programs at colleges and universities and NASA groups working with computer applications in Earth and space science. The 1993-94 CESDIS year included a broad range of computer science research applied to NASA problems. This report provides an overview of these research projects and programs as well as a summary of the various other activities of CESDIS in support of NASA and the university research community. We have had an exciting and challenging year.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Potok, Thomas; Schuman, Catherine; Patton, Robert
The White House and Department of Energy have been instrumental in driving the development of a neuromorphic computing program to help the United States continue its lead in basic research into (1) Beyond Exascale—high performance computing beyond Moore’s Law and von Neumann architectures, (2) Scientific Discovery—new paradigms for understanding increasingly large and complex scientific data, and (3) Emerging Architectures—assessing the potential of neuromorphic and quantum architectures. Neuromorphic computing spans a broad range of scientific disciplines from materials science to devices, to computer science, to neuroscience, all of which are required to solve the neuromorphic computing grand challenge. In our workshop we focus on the computer science aspects, specifically from a neuromorphic device through an application. Neuromorphic devices present a very different paradigm to the computer science community from traditional von Neumann architectures, which raises six major questions about building a neuromorphic application from the device level. We used these fundamental questions to organize the workshop program and to direct the workshop panels and discussions. From the white papers, presentations, panels, and discussions, there emerged several recommendations on how to proceed.
International Symposium on Grids and Clouds (ISGC) 2014
NASA Astrophysics Data System (ADS)
The International Symposium on Grids and Clouds (ISGC) 2014 will be held at Academia Sinica in Taipei, Taiwan from 23-28 March 2014, with co-located events and workshops. The conference is hosted by the Academia Sinica Grid Computing Centre (ASGC). “Bringing the data scientist to global e-Infrastructures” is the theme of ISGC 2014. The last decade has seen phenomenal growth in the production of data in all forms by all research communities, producing a deluge of data from which information and knowledge need to be extracted. Key to this success will be the data scientist - educated to use advanced algorithms, applications and infrastructures - collaborating internationally to tackle society’s challenges. ISGC 2014 will bring together researchers working in all aspects of data science from different disciplines around the world to collaborate and educate themselves in the latest achievements and techniques being used to tackle the data deluge. In addition to the regular workshops, technical presentations and plenary keynotes, ISGC this year will focus on how to grow the data science community by considering the educational foundation needed for tomorrow’s data scientist. Topics of discussion include Physics (including HEP) and Engineering Applications, Biomedicine & Life Sciences Applications, Earth & Environmental Sciences & Biodiversity Applications, Humanities & Social Sciences Applications, Virtual Research Environments (including middleware, tools, services, workflows, etc.), Data Management, Big Data, Infrastructure & Operations Management, Infrastructure Clouds and Virtualisation, Interoperability, Business Models & Sustainability, Highly Distributed Computing Systems, and High Performance & Technical Computing (HPTC).
Learning and Career Specialty Preferences of Medical School Applicants
ERIC Educational Resources Information Center
Stratton, Terry D.; Witzke, Donald B.; Elam, Carol L.; Cheever, Todd R.
2005-01-01
The present research examined relationships among medical school applicants' preferred approaches to learning, methods of instruction, and specialty areas (n=912). Based on confidential responses to a progressive series of paired comparisons, applicants' preferences for lecture (L), self-study (SS), group discussion (GD), and computers (C) were…
Computational logic: its origins and applications.
Paulson, Lawrence C
2018-02-01
Computational logic is the use of computers to establish facts in a logical formalism. Originating in nineteenth century attempts to understand the nature of mathematical reasoning, the subject now comprises a wide variety of formalisms, techniques and technologies. One strand of work follows the 'logic for computable functions (LCF) approach' pioneered by Robin Milner, where proofs can be constructed interactively or with the help of users' code (which does not compromise correctness). A refinement of LCF, called Isabelle, retains these advantages while providing flexibility in the choice of logical formalism and much stronger automation. The main application of these techniques has been to prove the correctness of hardware and software systems, but increasingly researchers have been applying them to mathematics itself.
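As a flavour of what these systems certify, here are two tiny machine-checked proofs, written in Lean 4 rather than in the LCF-family provers discussed above (Lean is chosen purely for illustration):

```lean
-- Two tiny machine-checked theorems: each proof term is validated by a small
-- trusted kernel, the safeguard the LCF tradition is built around.
theorem add_comm' (a b : Nat) : a + b = b + a :=
  Nat.add_comm a b

-- Definitional computation: the kernel evaluates both sides to the same value.
theorem two_plus_two : 2 + 2 = 4 := rfl
```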
Grid computing in large pharmaceutical molecular modeling.
Claus, Brian L; Johnson, Stephen R
2008-07-01
Most major pharmaceutical companies have employed grid computing to expand their compute resources with the intention of minimizing additional financial expenditure. Historically, one of the issues restricting widespread utilization of grid resources in molecular modeling has been the limited set of suitable applications amenable to coarse-grained parallelization. Recent advances in grid infrastructure technology, coupled with advances in application research and redesign, will enable fine-grained parallel problems, such as quantum mechanics and molecular dynamics, which were previously inaccessible to the grid environment. This will enable new science as well as increase resource flexibility to load-balance and schedule existing workloads.
[Advancements of computer chemistry in separation of Chinese medicine].
Li, Lingjuan; Hong, Hong; Xu, Xuesong; Guo, Liwei
2011-12-01
Separation techniques for Chinese medicine are not only a key technology in the research and development of Chinese medicine, but also a significant step in the modernization of Chinese medicinal preparations. Computer chemistry can build models and extract regularities from Chinese medicine systems that are full of complicated data. This paper analyzes the applicability, key technologies, basic modes, and common algorithms of computer chemistry as applied to the separation of Chinese medicine, introduces the mathematical models of extraction kinetics and the methods for setting their parameters, investigates several problems based on membrane processing of traditional Chinese medicine, and forecasts the application prospects.
Molecular electronics: The technology of sixth generation computers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jarvis, M.T.; Miller, R.K.
1987-01-01
In February 1986, Japan began the 6th Generation project. At the 1987 Economic Summit in Venice, Prime Minister Yasuhiro Nakasone opened the project to world collaboration. A project director suggests that the 6th Generation "may just be a turning point for human society." The major rationale for building molecular electronic devices is to achieve advances in computational densities and speeds. Proposed chromophore chains for molecular-scale chips, for example, could be spaced closer than today's silicon elements by a factor of almost 100. This book describes the research and proposed designs for molecular electronic devices and computers. It examines specific potential applications and the relationship of molecular electronics to silicon technology, presents the first published survey of experts on research issues, applications, and forecasts of future developments, and also includes a market forecast. An interesting suggestion of the survey is that the chemical industry may become a significant factor in the computer industry as the sixth generation unfolds.
JAX Colony Management System (JCMS): an extensible colony and phenotype data management system.
Donnelly, Chuck J; McFarland, Mike; Ames, Abigail; Sundberg, Beth; Springer, Dave; Blauth, Peter; Bult, Carol J
2010-04-01
The Jackson Laboratory Colony Management System (JCMS) is a software application for managing data and information related to research mouse colonies, associated biospecimens, and experimental protocols. JCMS runs directly on computers that run one of the PC Windows operating systems, but can be accessed via web browser interfaces from any computer running a Windows, Macintosh, or Linux operating system. JCMS can be configured for a single user or multiple users in small- to medium-size work groups. The target audience for JCMS includes laboratory technicians, animal colony managers, and principal investigators. The application provides operational support for colony management and experimental workflows, sample and data tracking through transaction-based data entry forms, and date-driven work reports. Flexible query forms allow researchers to retrieve database records based on user-defined criteria. Recent advances in handheld computers with integrated barcode readers, middleware technologies, web browsers, and wireless networks add to the utility of JCMS by allowing real-time access to the database from any networked computer.
Computational Aeroelastic Modeling of Airframes and TurboMachinery: Progress and Challenges
NASA Technical Reports Server (NTRS)
Bartels, R. E.; Sayma, A. I.
2006-01-01
Computational analyses such as computational fluid dynamics and computational structural dynamics have made major advances toward maturity as engineering tools. Computational aeroelasticity is the integration of these disciplines. As computational aeroelasticity matures it too finds an increasing role in the design and analysis of aerospace vehicles. This paper presents a survey of the current state of computational aeroelasticity with a discussion of recent research, success and continuing challenges in its progressive integration into multidisciplinary aerospace design. This paper approaches computational aeroelasticity from the perspective of the two main areas of application: airframe and turbomachinery design. An overview will be presented of the different prediction methods used for each field of application. Differing levels of nonlinear modeling will be discussed with insight into accuracy versus complexity and computational requirements. Subjects will include current advanced methods (linear and nonlinear), nonlinear flow models, use of order reduction techniques and future trends in incorporating structural nonlinearity. Examples in which computational aeroelasticity is currently being integrated into the design of airframes and turbomachinery will be presented.
Adaptive Animation of Human Motion for E-Learning Applications
ERIC Educational Resources Information Center
Li, Frederick W. B.; Lau, Rynson W. H.; Komura, Taku; Wang, Meng; Siu, Becky
2007-01-01
Human motion animation has been one of the major research topics in the field of computer graphics for decades. Techniques developed in this area help present human motions in various applications. This is crucial for enhancing the realism as well as promoting the user interest in the applications. To carry this merit to e-learning applications,…
ERIC Educational Resources Information Center
Tsai, Chia-Wen; Shen, Pei-Di; Tsai, Meng-Chuan; Chen, Wen-Yu
2017-01-01
Much application software education in Taiwan can hardly be regarded as practical. The researchers in this study provided a flexible means of ubiquitous learning (u-learning) with a mobile app for students to access the learning material. In addition, the authors also adopted computational thinking (CT) to help students develop practical computing…
NASA Technical Reports Server (NTRS)
VanZandt, John
1994-01-01
The usage model of supercomputers for scientific applications, such as computational fluid dynamics (CFD), has changed over the years. Scientific visualization has moved scientists away from looking at numbers to looking at three-dimensional images, which capture the meaning of the data. This change has impacted the system models for computing. This report details the model which is used by scientists at NASA's research centers.
ERIC Educational Resources Information Center
Barrett, Andrew J.; And Others
The Center for Interactive Technology, Applications, and Research at the College of Engineering of the University of South Florida (Tampa) has developed objective and descriptive evaluation models to assist in determining the educational potential of computer and video courseware. The computer-based courseware evaluation model and the video-based…
ERIC Educational Resources Information Center
Association for the Development of Computer-based Instructional Systems.
This volume is divided into full formal paper manuscripts and 200-word abstracts of presentations that were made without submitting formal papers. The 73 papers presented in full represent recent research and applications in the field of computer-based instruction. They are organized into 13 sections by special interest group: (1) Computer-Based…
Mobile Applications for Participatory Science
ERIC Educational Resources Information Center
Drill, Sabrina L.
2013-01-01
Citizen science, participatory research, and volunteer monitoring all describe research where data are collected by non-professional collaborators. These approaches can allow for research to be conducted at spatial and temporal scales unfeasible for professionals, especially in current budget climates. Mobile computing apps for data collection,…
NASA Technical Reports Server (NTRS)
Krishen, Kumar (Editor); Burnham, Calvin (Editor)
1995-01-01
The papers presented at the 4th International Conference Exhibition: World Congress on Superconductivity, held at the Marriott Orlando World Center, Orlando, Florida, are contained in this document and encompass the research, technology, applications, funding, political, and social aspects of superconductivity. Specifically, the areas covered included: high-temperature materials; thin films; C-60 based superconductors; persistent magnetic fields and shielding; fabrication methodology; space applications; physical applications; performance characterization; device applications; weak link effects and flux motion; accelerator technology; superconducting energy storage; future research and development directions; medical applications; granular superconductors; wire fabrication technology; computer applications; technical and commercial challenges; and power and energy applications.
NASA Technical Reports Server (NTRS)
Krishen, Kumar (Editor); Burnham, Calvin (Editor)
1995-01-01
This document contains papers presented at the 4th International Conference Exhibition: World Congress on Superconductivity held June 27-July 1, 1994 in Orlando, Florida. These papers encompass research, technology, applications, funding, political, and social aspects of superconductivity. The areas covered included: high-temperature materials; thin films; C-60 based superconductors; persistent magnetic fields and shielding; fabrication methodology; space applications; physical applications; performance characterization; device applications; weak link effects and flux motion; accelerator technology; superconducting energy storage; future research and development directions; medical applications; granular superconductors; wire fabrication technology; computer applications; technical and commercial challenges; and power and energy applications.
Performance Evaluation in Network-Based Parallel Computing
NASA Technical Reports Server (NTRS)
Dezhgosha, Kamyar
1996-01-01
Network-based parallel computing is emerging as a cost-effective alternative for solving many problems which require the use of supercomputers or massively parallel computers. The primary objective of this project has been to conduct experimental research on performance evaluation for clustered parallel computing. First, a testbed was established by augmenting our existing network of Sun SPARC workstations with PVM (Parallel Virtual Machine), a software system for linking clusters of machines. Second, a set of three basic applications was selected: a parallel search, a parallel sort, and a parallel matrix multiplication. These application programs were implemented in the C programming language under PVM. Third, we conducted performance evaluation under various configurations and problem sizes. Alternative parallel computing models and workload allocations for application programs were explored. The performance metric was limited to elapsed time or response time, which in the context of parallel computing can be expressed in terms of speedup. The results reveal that the overhead of communication latency between processes is in many cases the restricting factor to performance. That is, coarse-grain parallelism, which requires less frequent communication between processes, results in higher performance in network-based computing. Finally, we are in the final stages of installing an Asynchronous Transfer Mode (ATM) switch and four ATM interfaces (each 155 Mbps), which will allow us to extend our study to newer applications, performance metrics, and configurations.
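The speedup metric mentioned here has a standard definition that the abstract does not spell out; in our notation, with T_1 the elapsed time on one machine and T_p the elapsed time on p machines:

```latex
% Speedup and parallel efficiency (standard definitions, notation ours):
S(p) = \frac{T_1}{T_p}, \qquad E(p) = \frac{S(p)}{p}
% A simple model of why coarse grain wins: per-process work shrinks as 1/p
% while communication cost does not, so
T_p \approx \frac{T_1}{p} + T_{\mathrm{comm}}(p)
```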
Bioinspired Intelligent Algorithm and Its Applications for Mobile Robot Control: A Survey.
Ni, Jianjun; Wu, Liuying; Fan, Xinnan; Yang, Simon X
2016-01-01
Bioinspired intelligent algorithms (BIAs) are a class of intelligent computing methods whose working mechanisms are more lifelike biologically than those of other types. BIAs have made significant progress both in the understanding of neuroscience and biological systems and in applications to various fields. Mobile robot control is one of the main application fields of BIAs and has attracted more and more attention, because mobile robots can be used widely and general artificial intelligence algorithms have met a development bottleneck in this field, owing to complex computation and dependence on high-precision sensors. This paper presents a survey of recent research in BIAs, focusing on the realization of various BIAs based on different working mechanisms and on applications to mobile robot control, to help in understanding BIAs comprehensively and clearly. The survey has four primary parts: a classification of BIAs by biomimetic mechanism, a summary of several typical BIAs at different levels, an overview of current applications of BIAs in mobile robot control, and a description of some possible future directions for research. PMID:26819582
NASA Technical Reports Server (NTRS)
Rutishauser, David
2006-01-01
The motivation for this work comes from an observation that amidst the push for Massively Parallel (MP) solutions to high-end computing problems such as numerical physical simulations, large amounts of legacy code exist that are highly optimized for vector supercomputers. Because re-hosting legacy code often requires a complete re-write of the original code, which can be a very long and expensive effort, this work examines the potential to exploit reconfigurable computing machines in place of a vector supercomputer to implement an essentially unmodified legacy source code. Custom and reconfigurable computing resources could be used to emulate an original application's target platform to the extent required to achieve high performance. To arrive at an architecture that delivers the desired performance subject to limited resources involves solving a multi-variable optimization problem with constraints. Prior research in the area of reconfigurable computing has demonstrated that designing an optimum hardware implementation of a given application under hardware resource constraints is an NP-complete problem. The premise of the approach is that the general issue of applying reconfigurable computing resources to the implementation of an application, maximizing the performance of the computation subject to physical resource constraints, can be made a tractable problem by assuming a computational paradigm, such as vector processing. This research contributes a formulation of the problem and a methodology to design a reconfigurable vector processing implementation of a given application that satisfies a performance metric. A generic, parametric, architectural framework for vector processing implemented in reconfigurable logic is developed as a target for a scheduling/mapping algorithm that maps an input computation to a given instance of the architecture. This algorithm is integrated with an optimization framework to arrive at a specification of the architecture parameters that attempts to minimize execution time, while staying within resource constraints. The flexibility of using a custom reconfigurable implementation is exploited in a unique manner to leverage the lessons learned in vector supercomputer development. The vector processing framework is tailored to the application, with variable parameters that are fixed in traditional vector processing. Benchmark data that demonstrates the functionality and utility of the approach is presented. The benchmark data includes an identified bottleneck in a real case study example vector code, the NASA Langley Terminal Area Simulation System (TASS) application.
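Stated schematically, and in our notation rather than the author's, the constrained design problem this abstract describes is:

```latex
% Choose architecture parameters x (vector lengths, lanes, buffer sizes, ...)
% to minimize execution time without exceeding the reconfigurable device's
% resources (logic, memory, I/O):
\min_{x}\; T_{\mathrm{exec}}(x)
\quad\text{subject to}\quad
R_i(x) \le R_i^{\max}, \qquad i = 1, \dots, k
```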
Spatial data analytics on heterogeneous multi- and many-core parallel architectures using python
Laura, Jason R.; Rey, Sergio J.
2017-01-01
Parallel vector spatial analysis concerns the application of parallel computational methods to facilitate vector-based spatial analysis. The history of parallel computation in spatial analysis is reviewed, and this work is placed into the broader context of high-performance computing (HPC) and parallelization research. The rise of cyberinfrastructure and its manifestation in spatial analysis as CyberGIScience is seen as a main driver of renewed interest in parallel computation in the spatial sciences. Key problems in spatial analysis that have been the focus of parallel computing are covered. Chief among these are spatial optimization problems, computational geometric problems including polygonization and spatial contiguity detection, the use of Markov chain Monte Carlo simulation in spatial statistics, and parallel implementations of spatial econometric methods. Future directions for research on parallelization in computational spatial analysis are outlined.
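As a minimal illustration of the data-parallel pattern that much of this literature exploits, the Python sketch below partitions a point set among worker processes and applies a spatial predicate to each partition independently; it is a toy stand-in, not code from the paper.

```python
# Hypothetical sketch of an embarrassingly parallel spatial selection:
# points are distributed among processes and tested against a toy
# within-radius predicate independently.
from multiprocessing import Pool

CENTER, RADIUS = (0.0, 0.0), 1.0

def within_radius(point):
    (x, y), (cx, cy) = point, CENTER
    return (x - cx) ** 2 + (y - cy) ** 2 <= RADIUS ** 2

def parallel_select(points, workers=4):
    with Pool(workers) as pool:
        mask = pool.map(within_radius, points)
    return [p for p, keep in zip(points, mask) if keep]

if __name__ == "__main__":
    pts = [(0.1, 0.2), (2.0, 2.0), (-0.5, 0.4)]
    print(parallel_select(pts))      # -> [(0.1, 0.2), (-0.5, 0.4)]
```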
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Maxine D.; Leigh, Jason
2014-02-17
The Blaze high-performance visual computing system serves the high-performance computing research and education needs of the University of Illinois at Chicago (UIC). Blaze consists of a state-of-the-art, networked computer cluster and an ultra-high-resolution visualization system called CAVE2(TM) that is currently not available anywhere else in Illinois. This system is connected via a high-speed 100-Gigabit network to the State of Illinois' I-WIRE optical network, as well as to national and international high-speed networks, such as Internet2 and the Global Lambda Integrated Facility. This enables Blaze to serve as an on-ramp to national cyberinfrastructure, such as the National Science Foundation's Blue Waters petascale computer at the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign and the Department of Energy's Argonne Leadership Computing Facility (ALCF) at Argonne National Laboratory. DOE award # DE-SC005067, leveraged with NSF award #CNS-0959053 for "Development of the Next-Generation CAVE Virtual Environment (NG-CAVE)," enabled us to create a first-of-its-kind high-performance visual computing system. The UIC Electronic Visualization Laboratory (EVL) worked with two U.S. companies to advance their commercial products and maintain U.S. leadership in the global information technology economy. New applications are being enabled with the CAVE2/Blaze visual computing system that advance scientific research and education in the U.S. and globally, and help train the next-generation workforce.
NASA Astrophysics Data System (ADS)
Clay, Alexis; Delord, Elric; Couture, Nadine; Domenger, Gaël
We describe the joint research that we conduct in gesture-based emotion recognition and virtual augmentation of a stage, bridging the fields of computer science and dance. After establishing a common ground for dialogue, we could conduct a research process that equally benefits both fields. For us as computer scientists, dance is a perfect application case, and dancers' artistic creativity orients our research choices. For us as dancers, computer science provides new tools for creativity and, more importantly, a new point of view that forces us to reconsider dance from its fundamentals. In this paper we hence describe our scientific work and its implications for dance. We provide an overview of our system to augment a ballet stage, taking a dancer's emotion into account. To illustrate our work in both fields, we describe three events that mixed dance, emotion recognition and augmented reality.
Cloud Computing Boosts Business Intelligence of Telecommunication Industry
NASA Astrophysics Data System (ADS)
Xu, Meng; Gao, Dan; Deng, Chao; Luo, Zhiguo; Sun, Shaoling
Business intelligence has become an attractive topic in today's data-intensive applications, especially in the telecommunication industry. Meanwhile, cloud computing, which provides an IT supporting infrastructure with excellent scalability, large-scale storage, and high performance, has become an effective way to implement parallel data processing and data mining algorithms. BC-PDM (Big Cloud based Parallel Data Miner) is a new MapReduce-based parallel data mining platform developed by CMRI (China Mobile Research Institute) to meet the urgent requirements of business intelligence in the telecommunication industry. In this paper, the architecture, functionality and performance of BC-PDM are presented, together with an experimental evaluation and case studies of its applications. The evaluation results demonstrate both the usability and the cost-effectiveness of a cloud-computing-based business intelligence system in applications of the telecommunication industry.
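BC-PDM's API is not shown in the abstract, so the following is a generic, in-memory Python mock of MapReduce's map, shuffle, and reduce stages on a telecom-flavoured example (total call minutes per subscriber); the real platform distributes these stages across a cluster.

```python
# Generic in-memory mock of the three MapReduce stages. The record format and
# example are invented for illustration; BC-PDM's real interfaces differ.
from collections import defaultdict

records = [("alice", 3), ("bob", 7), ("alice", 5), ("carol", 2), ("bob", 1)]

def map_phase(recs):                      # emit (key, value) pairs
    for subscriber, minutes in recs:
        yield subscriber, minutes

def shuffle(pairs):                       # group values by key
    groups = defaultdict(list)
    for k, v in pairs:
        groups[k].append(v)
    return groups

def reduce_phase(groups):                 # aggregate each key's values
    return {k: sum(vs) for k, vs in groups.items()}

print(reduce_phase(shuffle(map_phase(records))))
# -> {'alice': 8, 'bob': 8, 'carol': 2}
```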
Brunner, J; Krummenauer, F; Lehr, H A
2000-04-01
Study end-points in microcirculation research are usually video-taped images rather than numeric computer print-outs. Analysis of these video-taped images for the quantification of microcirculatory parameters usually requires computer-based image analysis systems. Most software programs for image analysis are custom-made, expensive, and limited in their applicability to selected parameters and study end-points. We demonstrate herein that an inexpensive, commercially available computer software package (Adobe Photoshop), run on a Macintosh G3 computer with a built-in graphic capture board, provides versatile, easy-to-use tools for the quantification of digitized video images. Using images obtained by intravital fluorescence microscopy from the pre- and postischemic muscle microcirculation in the skinfold chamber model in hamsters, Photoshop allows simple and rapid quantification of (i) microvessel diameters, (ii) functional capillary density, and (iii) postischemic leakage of FITC-labeled high molecular weight dextran from postcapillary venules. We present evidence of the technical accuracy of the software tools and of a high degree of interobserver reliability. Inexpensive, commercially available imaging programs (i.e., Adobe Photoshop) provide versatile tools for image analysis with a wide range of potential applications in microcirculation research.
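The paper's measurements are made interactively in Photoshop. As a sketch of the same kind of quantification done programmatically, the Python fragment below computes an area-fraction proxy for perfusion by thresholding a synthetic image; the threshold and data are invented for illustration, and functional capillary density proper is a length-per-area measure rather than an area fraction.

```python
# Sketch of an area-based perfusion measure done with numpy instead of
# Photoshop's selection tools. The synthetic "fluorescence image" and the
# intensity cutoff are made up for illustration.
import numpy as np

rng = np.random.default_rng(0)
image = rng.integers(0, 256, size=(256, 256))   # stand-in for a video frame

THRESHOLD = 180                                  # assumed intensity cutoff
perfused = image >= THRESHOLD
density = perfused.mean()                        # fraction of perfused area
print(f"above-threshold area fraction: {density:.3f}")
```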
Quinn, T. Alexander; Kohl, Peter
2013-01-01
Since the development of the first mathematical cardiac cell model 50 years ago, computational modelling has become an increasingly powerful tool for the analysis of data and for the integration of information related to complex cardiac behaviour. Current models build on decades of iteration between experiment and theory, representing a collective understanding of cardiac function. All models, whether computational, experimental, or conceptual, are simplified representations of reality and, like tools in a toolbox, suitable for specific applications. Their range of applicability can be explored (and expanded) by iterative combination of ‘wet’ and ‘dry’ investigation, where experimental or clinical data are used to first build and then validate computational models (allowing integration of previous findings, quantitative assessment of conceptual models, and projection across relevant spatial and temporal scales), while computational simulations are utilized for plausibility assessment, hypothesis generation, and prediction (thereby defining further experimental research targets). When implemented effectively, this combined wet/dry research approach can support the development of a more complete and cohesive understanding of integrated biological function. This review illustrates the utility of such an approach, based on recent examples of multi-scale studies of cardiac structure and mechano-electric function. PMID:23334215
Study on the application of mobile internet cloud computing platform
NASA Astrophysics Data System (ADS)
Gong, Songchun; Fu, Songyin; Chen, Zheng
2012-04-01
The innovative development of computer technology has promoted the application of cloud computing platforms, which in essence substitute and exchange a class of resource service models to meet users' needs for different resources after adjustments in multiple respects. Cloud computing offers advantages in many respects: it not only reduces the difficulty of operating the system but also makes it easy for users to search for, acquire, and process resources. Accordingly, the author takes the management of digital libraries as the research focus of this paper and analyzes the key technologies of the mobile internet cloud computing platform in operation. The popularization of computer technology has driven the creation of digital library models, whose core idea is to strengthen the management of library resource information through computers and to construct a high-performance inquiry and search platform that allows users to access the necessary information resources at any time. Cloud computing distributes computations across a large number of distributed computers and thereby implements connected services spanning multiple computers. Digital libraries, as a typical representative of cloud computing applications, can thus be used to examine the key technologies of cloud computing.
Integrating Grid Services into the Cray XT4 Environment
DOE Office of Scientific and Technical Information (OSTI.GOV)
NERSC; Cholia, Shreyas; Lin, Hwa-Chun Wendy
2009-05-01
The 38,640-core Cray XT4 "Franklin" system at the National Energy Research Scientific Computing Center (NERSC) is a massively parallel resource available to Department of Energy researchers that also provides on-demand grid computing to the Open Science Grid. The integration of grid services on Franklin presented various challenges, including fundamental differences between the interactive and compute nodes, a stripped-down compute-node operating system without dynamic library support, a shared-root environment and idiosyncratic application launching. In our work, we describe how we resolved these challenges on a running, general-purpose production system to provide on-demand compute, storage, accounting and monitoring services through generic grid interfaces that mask the underlying system-specific details for the end user.
ENVIRONMENTAL BIOINFORMATICS AND COMPUTATIONAL TOXICOLOGY CENTER
The Center activities focused on integrating developmental efforts from the various research projects of the Center, and collaborative applications involving scientists from other institutions and EPA, to enhance research in critical areas. A representative sample of specif...
Research and technology, 1984 report
NASA Technical Reports Server (NTRS)
1984-01-01
Research and technology projects in the following areas are described: cryogenic engineering, hypergolic engineering, hazardous warning instrumentation, structures and mechanics, sensors and controls, computer sciences, communications, material analysis, biomedicine, meteorology, engineering management, logistics, training and maintenance aids, and technology applications.
Indexed Retrieval System for Navy Experimental Diving Unit Research and Evaluation Reports.
KWIC computer programs developed by the International Business Machines Corporation (IBM) were so successful in this application that they are now being applied to all of NEDU's microfilmed research files. (Author)
Enabling a Scientific Cloud Marketplace: VGL (Invited)
NASA Astrophysics Data System (ADS)
Fraser, R.; Woodcock, R.; Wyborn, L. A.; Vote, J.; Rankine, T.; Cox, S. J.
2013-12-01
The Virtual Geophysics Laboratory (VGL) provides a flexible, web-based environment where researchers can browse data and use a variety of scientific software packaged into toolkits that run in the Cloud. Both data and toolkits are published by multiple researchers and registered with the VGL infrastructure, forming a data and application marketplace. The VGL provides the basic workflow of discovery of and access to the disparate data sources, and a library of toolkits and scripting to drive the scientific codes. Computation is then performed on research or commercial Clouds. Provenance information is collected throughout the workflow and can be published alongside the results, allowing for experiment comparison and sharing with other researchers. VGL's "mix and match" approach to data, computational resources and scientific codes enables a dynamic approach to scientific collaboration. VGL allows scientists to publish their specific contribution, be it data, code, compute or workflow, knowing the VGL framework will provide the other components needed for a complete application. Other scientists can choose the pieces that suit them best to assemble an experiment. The coarse-grain workflow of the VGL framework, combined with the flexibility of the scripting library and computational toolkits, allows for significant customisation and sharing amongst the community. The VGL utilises the cloud computational and storage resources of the Australian academic research cloud provided by the NeCTAR initiative, and a large variety of data accessible from national and state agencies via the Spatial Information Services Stack (SISS - http://siss.auscope.org). VGL v1.2 screenshot: http://vgl.auscope.org
Expanded serial communication capability for the transport systems research vehicle laptop computers
NASA Technical Reports Server (NTRS)
Easley, Wesley C.
1991-01-01
A recent upgrade of the Transport Systems Research Vehicle (TSRV) operated by the Advanced Transport Operating Systems Program Office at the NASA Langley Research Center included installation of a number of Grid 1500 series laptop computers. Each unit is an 80386-based IBM PC clone. RS-232 data buses are needed for TSRV flight research programs, and it has been advantageous to extend the application of the Grids in this area. Use was made of the expansion features of the Grid internal bus to add a user-programmable serial communication channel. Software to allow use of the Grid bus expansion has been written and placed in a Turbo C library for incorporation into applications programs in a transparent manner via function calls. Port setup; interrupt-driven, two-way data transfer; and software flow control are built into the library functions.
NASA Astrophysics Data System (ADS)
Perraud, Jean-Michel; Bennett, James C.; Bridgart, Robert; Robertson, David E.
2016-04-01
Research undertaken through the Water Information Research and Development Alliance (WIRADA) has laid the foundations for continuous deterministic and ensemble short-term forecasting services. One output of this research is the software Short-term Water Information Forecasting Tools version 2 (SWIFT2). SWIFT2 is developed for use in research on short-term streamflow forecasting techniques as well as in operational forecasting services at the Australian Bureau of Meteorology. The variety of uses in research and operations requires a modular software system whose components can be arranged in applications that are fit for each particular purpose, without unnecessary software duplication. SWIFT2 modelling structures consist of sub-areas of hydrologic models, nodes and links with in-stream routing, and reservoirs. While this modelling structure is customary, SWIFT2 is built from the ground up for computation- and data-intensive applications such as the ensemble forecasts necessary for estimating the uncertainty in forecasts. Support for parallel computation on multiple processors or on a compute cluster is a primary use case. A convention is defined to store large multi-dimensional forecasting data and its metadata using the netCDF library. SWIFT2 is written in modern C++ with state-of-the-art software engineering techniques and practices. A salient technical feature is a well-defined application programming interface (API) to facilitate access from different applications and technologies. SWIFT2 is already seamlessly accessible on Windows and Linux via packages in R, Python, Matlab and .NET languages such as C# and F#. Command line or graphical front-end applications are also feasible. This poster gives an overview of the technology stack and illustrates the resulting features of SWIFT2 for users. Research and operational uses share the same common core C++ modelling shell for consistency, augmented by different software modules suitable for each context. The accessibility via interactive modelling languages is particularly amenable to using SWIFT2 in exploratory research, with a dynamic and versatile experimental modelling workflow. This does not come at the expense of the stability and reliability required for use in operations, where only mature and stable components are used.
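The abstract mentions a netCDF convention for large multi-dimensional forecast data but does not reproduce it. As a sketch of what storing an ensemble forecast looks like through the widely used netCDF4-python bindings (dimension and variable names here are our assumptions, not SWIFT2's convention):

```python
# Sketch of storing ensemble forecasts in netCDF; dimension and variable
# names are assumptions, not SWIFT2's actual convention.
import numpy as np
from netCDF4 import Dataset

members, lead_times = 5, 48                      # 5-member, 48-step forecast
flow = np.random.rand(members, lead_times)       # stand-in for model output

with Dataset("forecast.nc", "w") as nc:
    nc.createDimension("ens_member", members)
    nc.createDimension("lead_time", lead_times)
    var = nc.createVariable("streamflow", "f8", ("ens_member", "lead_time"))
    var.units = "m3 s-1"
    var[:] = flow
    nc.history = "illustrative ensemble forecast file"
```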
Transforming Polar Research with Google Glass Augmented Reality (Invited)
NASA Astrophysics Data System (ADS)
Ruthkoski, T.
2013-12-01
Augmented reality is a new technology with the potential to accelerate the advancement of science, particularly in geophysical research. Augmented reality is defined as a live, direct or indirect, view of a physical, real-world environment whose elements are augmented (or supplemented) by computer-generated sensory input such as sound, video, graphics or GPS data. When paired with advanced computing techniques on cloud resources, augmented reality has the potential to improve data collection techniques, visualizations, as well as in-situ analysis for many areas of research. Google is currently a pioneer of augmented reality technology and has released beta versions of their wearable computing device, Google Glass, to a select number of developers and beta testers. This community of 'Glass Explorers' is the vehicle from which Google shapes the future of their augmented reality device. Example applications of Google Glass in geophysical research range from use as a data gathering interface in harsh climates to an on-site visualization and analysis tool. Early participation in the shaping of the Google Glass device is an opportunity for researchers to tailor this new technology to their specific needs. The purpose of this presentation is to provide geophysical researchers with a hands-on first look at Google Glass and its potential as a scientific tool. Attendees will be given an overview of the technical specifications as well as a live demonstration of the device. Potential applications to geophysical research in polar regions will be the primary focus. The presentation will conclude with an open call to participate, during which attendees may indicate interest in developing projects that integrate Google Glass into their research. [Figure: application mockup "Penguin Counter" on the Google Glass augmented reality device.]
Security Issues in Cross-Organizational Peer-to-Peer Applications and Some Solutions
NASA Astrophysics Data System (ADS)
Gupta, Ankur; Awasthi, Lalit K.
Peer-to-Peer (P2P) networks have been widely used for sharing millions of terabytes of content, for large-scale distributed computing, and for a variety of other novel applications, due to their scalability and fault-tolerance. However, the scope of P2P networks has largely been limited to individual computers connected to the Internet. P2P networks are also notorious for blatant copyright violations and for facilitating several kinds of security attacks. Businesses and large organizations have thus stayed away from deploying P2P applications, citing security loopholes in P2P systems as the biggest reason for non-adoption. In theory, P2P applications can help fulfill many organizational requirements, such as collaboration and joint projects with other organizations, access to specialized computing infrastructure, and access to the specialized information/content and expert human knowledge available at other organizations. These potentially beneficial interactions make it necessary for the research community to alleviate the security shortcomings of P2P systems and ensure their acceptance and wide deployment. This research paper therefore examines the security issues prevalent in enabling cross-organizational P2P interactions and provides some technical insights into how some of these issues can be resolved.
Tutorial: Advanced fault tree applications using HARP
NASA Technical Reports Server (NTRS)
Dugan, Joanne Bechta; Bavuso, Salvatore J.; Boyd, Mark A.
1993-01-01
Reliability analysis of fault tolerant computer systems for critical applications is complicated by several factors. These modeling difficulties are discussed and dynamic fault tree modeling techniques for handling them are described and demonstrated. Several advanced fault tolerant computer systems are described, and fault tree models for their analysis are presented. HARP (Hybrid Automated Reliability Predictor) is a software package developed at Duke University and NASA Langley Research Center that is capable of solving the fault tree models presented.
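For readers unfamiliar with fault trees, the combinatorial core is simple; the sketch below evaluates static AND/OR gates over independent basic events in Python (HARP's contribution lies in the dynamic gates and Markov solutions beyond this sketch).

    from functools import reduce

    def p_and(probs):   # redundant: fails only if every input fails
        return reduce(lambda a, p: a * p, probs, 1.0)

    def p_or(probs):    # series: fails if any input fails
        return 1.0 - reduce(lambda a, p: a * (1.0 - p), probs, 1.0)

    # Triplicated processor (all three must fail) OR a shared-bus failure.
    p_system = p_or([p_and([1e-3] * 3), 1e-5])
    print(f"P(system failure) = {p_system:.3e}")  # ~1.000e-05, bus-dominated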
NASA Astrophysics Data System (ADS)
Burnett, W.
2016-12-01
The Department of Defense's (DoD) High Performance Computing Modernization Program (HPCMP) provides high performance computing to address the most significant challenges in computational resources, software application support and nationwide research and engineering networks. Today, the HPCMP has a critical role in ensuring the National Earth System Prediction Capability (N-ESPC) achieves initial operational status in 2019. A 2015 study commissioned by the HPCMP found that N-ESPC computational requirements will exceed interconnect bandwidth capacity due to the additional load from data assimilation and from passing data between ensemble codes. Memory bandwidth and I/O bandwidth will continue to be significant bottlenecks for the scalability of the Navy's Hybrid Coordinate Ocean Model (HYCOM) - by far the major driver of computing resource requirements in the N-ESPC. The study also found that few of the N-ESPC model developers have detailed plans to ensure their respective codes scale through 2024. Three HPCMP initiatives are designed to directly address and support these issues: Productivity Enhancement, Technology Transfer and Training (PETTT), the HPCMP Applications Software Initiative (HASI), and Frontier Projects. PETTT supports code conversion by providing assistance, expertise and training in scalable and high-end computing architectures. HASI addresses the continuing need for modern application software that executes effectively and efficiently on next-generation high-performance computers. Frontier Projects enable research and development that could not be achieved using typical HPCMP resources by providing multi-disciplinary teams access to exceptional amounts of high performance computing resources. Finally, the Navy's DoD Supercomputing Resource Center (DSRC) currently operates a 6-Petabyte system, of which Naval Oceanography receives 15% of operational use, or approximately 1 Petabyte of the processing capability. The DSRC will provide the DoD with future computing assets to initially operate the N-ESPC in 2019. This talk will further describe how DoD's HPCMP will ensure N-ESPC becomes operational, efficiently and effectively, using next-generation high performance computing.
Endodontic applications of 3D printing.
Anderson, J; Wealleans, J; Ray, J
2018-02-27
Computer-aided design (CAD) and computer-aided manufacturing (CAM) technologies can leverage cone beam computed tomography data for production of objects used in surgical and nonsurgical endodontics and in educational settings. The aim of this article was to review all current applications of 3D printing in endodontics and to speculate upon future directions for research and clinical use within the specialty. A literature search of PubMed, Ovid and Scopus was conducted using the following terms: stereolithography, 3D printing, computer aided rapid prototyping, surgical guide, guided endodontic surgery, guided endodontic access, additive manufacturing, rapid prototyping, autotransplantation rapid prototyping, CAD, CAM. Inclusion criteria were articles in the English language documenting endodontic applications of 3D printing. Fifty-one articles met inclusion criteria and were utilized. The endodontic literature on 3D printing is generally limited to case reports and pre-clinical studies. Documented solutions to endodontic challenges include: guided access with pulp canal obliteration, applications in autotransplantation, pre-surgical planning and educational modelling and accurate location of osteotomy perforation sites. Acquisition of technical expertise and equipment within endodontic practices present formidable obstacles to widespread deployment within the endodontic specialty. As knowledge advances, endodontic postgraduate programmes should consider implementing 3D printing into their curriculums. Future research directions should include clinical outcomes assessments of treatments employing 3D printed objects. Published 2018. This article is a U.S. Government work and is in the public domain in the USA.
Computer Mediated Communication: Online Instruction and Interactivity.
ERIC Educational Resources Information Center
Lavooy, Maria J.; Newlin, Michael H.
2003-01-01
Explores the different forms and potential applications of computer mediated communication (CMC) for Web-based and Web-enhanced courses. Based on their experiences with three different Web courses (Research Methods in Psychology, Statistical Methods in Psychology, and Basic Learning Processes) taught repeatedly over the last five years, the…
Turbomachinery CFD on parallel computers
NASA Technical Reports Server (NTRS)
Blech, Richard A.; Milner, Edward J.; Quealy, Angela; Townsend, Scott E.
1992-01-01
The role of multistage turbomachinery simulation in the development of propulsion system models is discussed. In particular, the need for simulations with higher fidelity and faster turnaround time is highlighted. It is shown how such fast simulations can be used in engineering-oriented environments. The use of parallel processing to achieve the required turnaround times is discussed. Current work by several researchers in this area is summarized. Parallel turbomachinery CFD research at the NASA Lewis Research Center is then highlighted. These efforts are focused on implementing the average-passage turbomachinery model on MIMD, distributed memory parallel computers. Performance results are given for inviscid, single blade row and viscous, multistage applications on several parallel computers, including networked workstations.
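The distributed-memory pattern underlying such codes is a domain decomposition with ghost-cell (halo) exchange between neighbouring partitions. Below is a minimal 1-D sketch with mpi4py, purely illustrative and not the average-passage code itself.

    from mpi4py import MPI
    import numpy as np

    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()
    left = rank - 1 if rank > 0 else MPI.PROC_NULL
    right = rank + 1 if rank < size - 1 else MPI.PROC_NULL

    u = np.zeros(12)   # 10 interior cells plus one ghost cell at each end
    u[1:-1] = rank     # dummy flow state for this partition

    # Swap edge cells with both neighbours (PROC_NULL ends are no-ops).
    comm.Sendrecv(u[1:2], dest=left, recvbuf=u[-1:], source=right)
    comm.Sendrecv(u[-2:-1], dest=right, recvbuf=u[0:1], source=left)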
Eye Tracking and Head Movement Detection: A State-of-Art Survey
2013-01-01
Eye-gaze detection and tracking have been an active research field in recent years, as they add convenience to a variety of applications. Eye tracking is considered a significant, untraditional method of human-computer interaction. Head movement detection has also received researchers' attention and interest, as it has been found to be a simple and effective interaction method. Both technologies are considered the easiest alternative interface methods. They serve a wide range of severely disabled people who are left with minimal motor abilities. For both eye tracking and head movement detection, several different approaches have been proposed and used to implement different algorithms for these technologies. Despite the amount of research done on both technologies, researchers are still trying to find robust methods to use effectively in various applications. This paper presents a state-of-the-art survey of eye tracking and head movement detection methods proposed in the literature. Examples of different fields of application for both technologies, such as human-computer interaction, driving assistance systems, and assistive technologies, are also investigated. PMID:27170851
A Review of Computational Methods in Materials Science: Examples from Shock-Wave and Polymer Physics
Steinhauser, Martin O.; Hiermaier, Stefan
2009-01-01
This review discusses several computational methods used on different length and time scales for the simulation of material behavior. First, the importance of physical modeling and its relation to computer simulation on multiple scales is discussed. Then, computational methods used on different scales are briefly reviewed, before we focus on the molecular dynamics (MD) method. Here we survey in a tutorial-like fashion some key issues, including several MD optimization techniques. Thereafter, computational examples of the capabilities of numerical simulations in materials research are discussed. We focus on recent results of shock wave simulations of a solid which are based on two different modeling approaches, and we discuss their respective assets and drawbacks with a view to their application across scales. Then, the prospects of computer simulations on the molecular length scale using coarse-grained MD methods are covered by means of examples pertaining to complex topological polymer structures, including star-polymers, biomacromolecules such as polyelectrolytes, and polymers with intrinsic stiffness. This review ends by highlighting new, emerging interdisciplinary applications of computational methods in the field of medical engineering, where the application of concepts of polymer physics and of shock waves to biological systems holds a lot of promise for improving medical applications such as extracorporeal shock wave lithotripsy or tumor treatment. PMID:20054467
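At the heart of any MD code is the time integrator; a generic velocity-Verlet step (a standard textbook scheme, not the paper's shock-wave code) looks like this in Python:

    import numpy as np

    def velocity_verlet(x, v, force, m, dt):
        """Advance positions x and velocities v by one step of size dt."""
        f0 = force(x)
        x_new = x + v * dt + 0.5 * (f0 / m) * dt ** 2
        v_new = v + 0.5 * (f0 + force(x_new)) / m * dt
        return x_new, v_new

    # Example: 1-D harmonic oscillator, F = -k x with k = m = 1.
    x, v = np.array([1.0]), np.array([0.0])
    for _ in range(1000):
        x, v = velocity_verlet(x, v, lambda q: -q, m=1.0, dt=0.01)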
Facial expression system on video using widrow hoff
NASA Astrophysics Data System (ADS)
Jannah, M.; Zarlis, M.; Mawengkang, H.
2018-03-01
Facial expression recognition is an active research area. It connects human feeling to computer applications such as human-computer interaction, data compression, facial animation, and face detection from video. The purpose of this research is to create a facial expression system that captures images from a video camera. The system uses the Widrow-Hoff learning method to train and test images with an Adaptive Linear Neuron (ADALINE) approach. System performance is evaluated by two parameters: detection rate and false positive rate. The system's accuracy depends on good technique and on the face positions used in training and testing.
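The Widrow-Hoff (least-mean-squares) rule trains an ADALINE by moving the weights against the gradient of the squared error; below is a minimal sketch in Python, with feature vectors and labels assumed for illustration:

    import numpy as np

    def train_adaline(X, y, lr=0.01, epochs=50):
        """X: (n_samples, n_features) image features; y: labels in {-1, +1}."""
        w, b = np.zeros(X.shape[1]), 0.0
        for _ in range(epochs):
            for xi, target in zip(X, y):
                error = target - (np.dot(w, xi) + b)  # linear activation
                w += lr * error * xi                  # Widrow-Hoff update
                b += lr * error
        return w, b

    # After training, classify a face image x as sign(w @ x + b).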
Lewis Structures Technology, 1988. Volume 1: Structural Dynamics
NASA Technical Reports Server (NTRS)
1988-01-01
The specific purpose of the symposium was to familiarize the engineering structures community with the depth and range of research performed by the Structures Division of the Lewis Research Center and its academic and industrial partners. Sessions covered vibration control, fracture mechanics, ceramic component reliability, parallel computing, nondestructive testing, dynamical systems, fatigue and damage, wind turbines, hot section technology, structural mechanics codes, computational methods for dynamics, structural optimization, and applications of structural dynamics.
Training Maneuver Evaluation for Reduced Order Modeling of Stability & Control Properties Using Computational Fluid Dynamics
Craig Curtis...
2013-03-01
Thesis, AFIT-ENY-13-M-28; a U.S. Government work not subject to copyright protection in the United States. A reduced order model is created. Finally, previous research in this area of study will be examined, and its application to this research will be...
Guidelines for developing vectorizable computer programs
NASA Technical Reports Server (NTRS)
Miner, E. W.
1982-01-01
Some fundamental principles for developing computer programs which are compatible with array-oriented computers are presented. The emphasis is on basic techniques for structuring computer codes which are applicable in FORTRAN and do not require a special programming language or exact a significant penalty on a scalar computer. Researchers who are using numerical techniques to solve problems in engineering can apply these basic principles and thus develop transportable computer programs (in FORTRAN) which contain much vectorizable code. The vector architecture of the ASC is discussed so that the requirements of array processing can be better appreciated. The "vectorization" of a finite-difference viscous shock-layer code is used as an example to illustrate the benefits and some of the difficulties involved. Increases in computing speed with vectorization are illustrated with results from the viscous shock-layer code and from a finite-element shock tube code. The applicability of these principles was substantiated through running programs on other computers with array-associated computing characteristics, such as the Hewlett-Packard (H-P) 1000-F.
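The central guideline translates directly to today's array-oriented tools; in the NumPy sketch below, the loop is the scalar form and the final line is the vectorizable form with no loop-carried dependence (the same restructuring applies to FORTRAN on vector hardware):

    import numpy as np

    a, b = np.random.rand(1_000_000), np.random.rand(1_000_000)

    # Scalar form: one element per loop trip.
    c = np.empty_like(a)
    for i in range(a.size):
        c[i] = 2.0 * a[i] + b[i]

    # Vectorizable form: one whole-array operation with no loop-carried
    # dependence, so it maps directly onto vector/array hardware.
    c = 2.0 * a + b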
Under a Cooperative Research and Development Agreement (CRADA), Fluent, Inc. and the US EPA National Exposure Research Laboratory (NERL) propose to improve the ability of environmental scientists to use computer modeling for environmental exposure to air pollutants in human exp...
Principles of Protein Stability and Their Application in Computational Design.
Goldenzweig, Adi; Fleishman, Sarel
2018-01-26
Proteins are increasingly used in basic and applied biomedical research. Many proteins, however, are only marginally stable and can be expressed in limited amounts, thus hampering research and applications. Research has revealed the thermodynamic, cellular, and evolutionary principles and mechanisms that underlie marginal stability. With this growing understanding, computational stability design methods have advanced over the past two decades, starting from methods that selectively addressed only some aspects of marginal stability. Current methods are more general and, by combining phylogenetic analysis with atomistic design, have shown drastic improvements in solubility, thermal stability, and aggregation resistance while maintaining the protein's primary molecular activity. Stability design is opening the way to rational engineering of improved enzymes, therapeutics, and vaccines, and to the application of protein design methodology to large proteins and molecular activities that have proven challenging in the past. Expected final online publication date for the Annual Review of Biochemistry Volume 87 is June 20, 2018. Please see http://www.annualreviews.org/page/journal/pubdates for revised estimates.
Current perspectives of CASA applications in diverse mammalian spermatozoa.
van der Horst, Gerhard; Maree, Liana; du Plessis, Stefan S
2018-03-26
Since the advent of computer-aided sperm analysis (CASA) some four decades ago, advances in computer technology and software algorithms have helped establish it as a research and diagnostic instrument for the analysis of spermatozoa. Despite mammalian spermatozoa being the most diverse cell type known, CASA is a great tool that has the capacity to provide rapid, reliable and objective quantitative assessment of sperm quality. This paper provides contemporary research findings illustrating the scientific and commercial applications of CASA and its ability to evaluate diverse mammalian spermatozoa (human, primates, rodents, domestic mammals, wildlife species) at both structural and functional levels. The potential of CASA to quantitatively measure essential aspects related to sperm subpopulations, hyperactivation, morphology and morphometry is also demonstrated. Furthermore, applications of CASA are provided for improved mammalian sperm quality assessment, evaluation of sperm functionality and the effect of different chemical substances or pathologies on sperm fertilising ability. It is clear that CASA has evolved significantly and is currently superior to many manual techniques in the research and clinical setting.
NASA Astrophysics Data System (ADS)
Santini, Maurizio
2015-11-01
X-ray computed tomography (CT) is a well-known technique nowadays; since its first practical application by Sir G. Hounsfield (Nobel Prize in Physiology or Medicine, 1979), it has continually benefited from optimising improvements, especially in medical applications. Application of CT in various engineering research fields also provides fundamental information for a wide range of uses, considering that the technique is non-destructive and allows 3D visualization without perturbation of the analysed material. Nowadays, it is technologically possible to design and realize equipment that achieves micrometric resolution and improved sensitivity in revealing differences between highly radiotransparent materials, allowing one, for example, to distinguish between different fluids (with different densities) or states of matter (as with two-phase flows). At the University of Bergamo, a prototype X-ray microCT system has been under development since 2008 and has been fully operational since 2012, with specific customizations for investigations in thermal-fluid dynamics and multiphase flow research. A technical session held at the UIT International Conference in L'Aquila (Italy), to which this paper refers, presented some microCT fundamentals to give the audience the basics needed to follow the "fil rouge" that links all the instrumentation developments up to the recent applications. Some applications currently being developed at the University of Bergamo's X-ray computed micro-tomography laboratory are then reported.
ERIC Educational Resources Information Center
Bintas, Jale; Barut, Asim
2008-01-01
The aim of this research is to compare tenth-grade students and determine their levels of success in classical versus web-based educational applications of a Turbo Pascal lesson. The research was applied to 10 A and 10 TLB students of the computer department of Izmir Karsikaya Anatolian Technical and Industrial High School in the second term of…
Topical perspective on massive threading and parallelism.
Farber, Robert M
2011-09-01
Unquestionably, computer architectures have undergone a recent and noteworthy paradigm shift that now delivers multi- and many-core systems with tens to many thousands of concurrent hardware processing elements per workstation or supercomputer node. GPGPU (General Purpose Graphics Processor Unit) technology in particular has attracted significant attention as new software development capabilities, namely CUDA (Compute Unified Device Architecture) and OpenCL™, have made it possible for students as well as small and large research organizations to achieve excellent speedup for many applications over more conventional computing architectures. The current scientific literature reflects this shift with numerous examples of GPGPU applications that have achieved one, two, and in some special cases, three orders of magnitude increased computational performance through the use of massive threading to exploit parallelism. Multi-core architectures are also evolving quickly to exploit both massive threading and massive parallelism, such as the 1.3-million-thread Blue Waters supercomputer. The challenge confronting scientists in planning future experimental and theoretical research efforts--be they individual efforts with one computer or collaborative efforts proposing to use the largest supercomputers in the world--is how to capitalize on these new massively threaded computational architectures, especially as not all computational problems will scale to massive parallelism. In particular, the costs associated with restructuring software (and potentially redesigning algorithms) to exploit the parallelism of these multi- and many-threaded machines must be considered along with application scalability and lifespan. This perspective is an overview of the current state of threading and parallelism, with some insight into the future. Published by Elsevier Inc.
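The scaling caveat is Amdahl's law: if a fraction s of the runtime is inherently serial, speedup on n processors is bounded by 1 / (s + (1 - s)/n). A quick illustration in Python:

    def amdahl_speedup(s, n):
        """Upper bound on speedup with serial fraction s on n processors."""
        return 1.0 / (s + (1.0 - s) / n)

    for n in (8, 256, 10_000):
        print(n, round(amdahl_speedup(0.05, n), 1))
    # 8 -> 5.9, 256 -> 18.6, 10000 -> 20.0: with 5% serial code the
    # speedup saturates near 20x no matter how many threads are added.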
Workflow based framework for life science informatics.
Tiwari, Abhishek; Sekhar, Arvind K T
2007-10-01
Workflow technology is a generic mechanism to integrate diverse types of available resources (databases, servers, software applications and different services) which facilitate knowledge exchange within traditionally divergent fields such as molecular biology, clinical research, computational science, physics, chemistry and statistics. Researchers can easily incorporate and access diverse, distributed tools and data to develop their own research protocols for scientific analysis. Application of workflow technology has been reported in areas like drug discovery, genomics, large-scale gene expression analysis, proteomics, and system biology. In this article, we have discussed the existing workflow systems and the trends in applications of workflow based systems.
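A workflow in this sense is just a chain of heterogeneous resources in which each step's output feeds the next; a toy sketch in Python, with the step names invented for illustration:

    from functools import reduce

    def run_workflow(data, steps):
        """Apply each step (any callable resource) in order."""
        return reduce(lambda d, step: step(d), steps, data)

    fetch_sequences = lambda gene: [gene + "_seq1", gene + "_seq2"]  # database
    align = lambda seqs: {"alignment": seqs}                         # tool
    count = lambda result: len(result["alignment"])                  # statistics

    print(run_workflow("BRCA1", [fetch_sequences, align, count]))  # -> 2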
Military clouds: utilization of cloud computing systems at the battlefield
NASA Astrophysics Data System (ADS)
Sarıkürk, Süleyman; Karaca, Volkan; Kocaman, İbrahim; Şirzai, Ahmet
2012-05-01
Cloud computing is known as a novel information technology (IT) concept, which involves facilitated and rapid access to networks, servers, data storage media, applications and services via the Internet with minimum hardware requirements. Use of information systems and technologies on the battlefield is not new. Information superiority is a force multiplier and is crucial to mission success. Recent advances in information systems and technologies provide new means for decision makers and users to gain information superiority. These developments in information technologies lead to a new term, known as network centric capability. Like network centric capable systems, cloud computing systems are operational today. In the near future, extensive use of military clouds on the battlefield is predicted. Integrating cloud computing logic into network centric applications will increase the flexibility, cost-effectiveness, efficiency and accessibility of network-centric capabilities. In this paper, cloud computing and network centric capability concepts are defined. Some commercial cloud computing products and applications are mentioned. Network centric capable applications are covered. Cloud computing supported battlefield applications are analyzed. The effects of cloud computing systems on network centric capability and on the information domain in future warfare are discussed. Battlefield opportunities and novelties which might be introduced to network centric capability by cloud computing systems are researched. The role of military clouds in future warfare is proposed in this paper. It is concluded that military clouds will be indispensable components of the future battlefield. Military clouds have the potential to improve network centric capabilities, increase situational awareness on the battlefield and facilitate the attainment of information superiority.
Goscinski, Wojtek J.; McIntosh, Paul; Felzmann, Ulrich; Maksimenko, Anton; Hall, Christopher J.; Gureyev, Timur; Thompson, Darren; Janke, Andrew; Galloway, Graham; Killeen, Neil E. B.; Raniga, Parnesh; Kaluza, Owen; Ng, Amanda; Poudel, Govinda; Barnes, David G.; Nguyen, Toan; Bonnington, Paul; Egan, Gary F.
2014-01-01
The Multi-modal Australian ScienceS Imaging and Visualization Environment (MASSIVE) is a national imaging and visualization facility established by Monash University, the Australian Synchrotron, the Commonwealth Scientific and Industrial Research Organisation (CSIRO), and the Victorian Partnership for Advanced Computing (VPAC), with funding from the National Computational Infrastructure and the Victorian Government. The MASSIVE facility provides hardware, software, and expertise to drive research in the biomedical sciences, particularly advanced brain imaging research using synchrotron x-ray and infrared imaging, functional and structural magnetic resonance imaging (MRI), x-ray computer tomography (CT), electron microscopy and optical microscopy. The development of MASSIVE has been based on best practice in system integration methodologies, frameworks, and architectures. The facility has: (i) integrated multiple different neuroimaging analysis software components, (ii) enabled cross-platform and cross-modality integration of neuroinformatics tools, and (iii) brought together neuroimaging databases and analysis workflows. MASSIVE is now operational as a nationally distributed and integrated facility for neuroinformatics and brain imaging research. PMID:24734019
NASA Astrophysics Data System (ADS)
Gomez, R.; Gentle, J.
2015-12-01
Modern data pipelines and computational processes require that meticulous methodologies be applied to ensure that the source data, algorithms, and results are properly curated, managed and retained while remaining discoverable, accessible, and reproducible. Given the complexity of understanding the scientific problem domain being researched, combined with the overhead of learning to use advanced computing technologies, it becomes paramount that the next generation of scientists and researchers learn to embrace best practices. The Integrative Computational Education and Research Traineeship (ICERT) is a National Science Foundation (NSF) Research Experience for Undergraduates (REU) Site at the Texas Advanced Computing Center (TACC). During Summer 2015, two ICERT interns joined the 3DDY project. 3DDY converts geospatial datasets into file types that can take advantage of new formats, such as natural user interfaces, interactive visualization, and 3D printing. Mentored by TACC researchers for ten weeks, students with no previous background in computational science learned to use scripts to build the first prototype of the 3DDY application, and leveraged Wrangler, the newest high performance computing (HPC) resource at TACC. Test datasets for quadrangles in central Texas were used to assemble the 3DDY workflow and code. Test files were successfully converted into the stereolithography (STL) format, which is amenable for use with 3D printers. Test files and scripts were documented and shared using the Figshare site, while metadata was documented for the 3DDY application using OntoSoft. These efforts validated a straightforward set of workflows to transform geospatial data and established the first prototype version of 3DDY. Adding the data and software management procedures helped students realize a broader set of tangible results (e.g., Figshare entries), better document their progress and the final state of their work for the research group and community, and follow a clear set of formats while filling in necessary details that might otherwise be lost. It also exposed the students to next-generation workflows and practices for digital scholarship and scientific inquiry in converting geospatial data into formats that are easy to reuse.
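The final step of such a conversion, emitting geometry a 3D printer can consume, can be as simple as writing ASCII STL facets; below is a minimal sketch (not the actual 3DDY code, which the abstract does not reproduce):

    def write_stl(path, triangles):
        """triangles: iterable of (v1, v2, v3), each vertex an (x, y, z) tuple."""
        with open(path, "w") as f:
            f.write("solid terrain\n")
            for v1, v2, v3 in triangles:
                # Zero normals; most STL readers recompute them from vertices.
                f.write("  facet normal 0 0 0\n    outer loop\n")
                for x, y, z in (v1, v2, v3):
                    f.write(f"      vertex {x} {y} {z}\n")
                f.write("    endloop\n  endfacet\n")
            f.write("endsolid terrain\n")

    # One facet of an elevation surface (dummy coordinates in metres).
    write_stl("quad.stl", [((0, 0, 10), (1, 0, 12), (0, 1, 11))])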
Biosensors with Built-In Biomolecular Logic Gates for Practical Applications
Lai, Yu-Hsuan; Sun, Sin-Cih; Chuang, Min-Chieh
2014-01-01
Molecular logic gates, designs constructed with biological and chemical molecules, have emerged as an alternative computing approach to silicon-based logic operations. These molecular computers are capable of receiving and integrating multiple stimuli of biochemical significance to generate a definitive output, opening a new research avenue to advanced diagnostics and therapeutics which demand handling of complex factors and precise control. In molecularly gated devices, Boolean logic computations can be activated by specific inputs and accurately processed via bio-recognition, bio-catalysis, and selective chemical reactions. In this review, we survey recent advances of the molecular logic approaches to practical applications of biosensors, including designs constructed with proteins, enzymes, nucleic acids, nanomaterials, and organic compounds, as well as the research avenues for future development of digitally operating “sense and act” schemes that logically process biochemical signals through networked circuits to implement intelligent control systems. PMID:25587423
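The underlying Boolean idea can be sketched in a few lines: two biochemical inputs are thresholded and combined into an AND decision that yields a definitive output. The analytes and thresholds below are invented for illustration.

    def biosensor_and(glucose_mM, lactate_mM, thr_glc=5.0, thr_lac=2.0):
        """Return 1 ('act') only if both inputs exceed their thresholds."""
        a = glucose_mM > thr_glc   # input A present?
        b = lactate_mM > thr_lac   # input B present?
        return int(a and b)

    print(biosensor_and(7.2, 3.1))  # 1: both signals above threshold
    print(biosensor_and(7.2, 0.4))  # 0: input B absent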
An Intelligent Model for Pairs Trading Using Genetic Algorithms.
Huang, Chien-Feng; Hsu, Chi-Jen; Chen, Chi-Chung; Chang, Bao Rong; Li, Chen-An
2015-01-01
Pairs trading is an important and challenging research area in computational finance, in which pairs of stocks are bought and sold in pair combinations for arbitrage opportunities. Traditional methods that solve this set of problems mostly rely on statistical methods such as regression. In contrast to the statistical approaches, recent advances in computational intelligence (CI) are leading to promising opportunities for solving problems in the financial applications more effectively. In this paper, we present a novel methodology for pairs trading using genetic algorithms (GA). Our results showed that the GA-based models are able to significantly outperform the benchmark and our proposed method is capable of generating robust models to tackle the dynamic characteristics in the financial application studied. Based upon the promising results obtained, we expect this GA-based method to advance the research in computational intelligence for finance and provide an effective solution to pairs trading for investment in practice.
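For contrast with the GA approach, the statistical baseline the paper mentions can be sketched as trading the z-score of the log-price spread of a stock pair (a generic illustration, not the authors' model; the window and entry threshold are assumptions):

    import numpy as np

    def pair_signal(px_a, px_b, window=60, entry=2.0):
        """Return +1 (long A / short B), -1 (short A / long B), or 0 (flat)."""
        spread = np.log(px_a) - np.log(px_b)   # price series as arrays
        recent = spread[-window:]
        z = (spread[-1] - recent.mean()) / recent.std()
        if z > entry:
            return -1   # spread rich: short A, long B
        if z < -entry:
            return +1   # spread cheap: long A, short B
        return 0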
On the use of distributed sensing in control of large flexible spacecraft
NASA Technical Reports Server (NTRS)
Montgomery, Raymond C.; Ghosh, Dave
1990-01-01
Distributed processing technology is being developed to process signals from distributed sensors using distributed computations. This work presents a scheme for calculating the operators required to emulate a conventional Kalman filter and regulator using such a computer. The scheme makes use of conventional Kalman theory as applied to the control of large flexible structures. The required computation of the distributed operators given the conventional Kalman filter and regulator is explained. A straightforward application of this scheme may lead to nonsmooth operators whose convergence is not apparent. This is illustrated by application to the Mini-Mast, a large flexible truss at the Langley Research Center used for research in structural dynamics and control. Techniques for developing smooth operators are presented. These involve spatial filtering as well as adjusting the design constants in the Kalman theory. Results are presented that illustrate the degree of smoothness achieved.
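For reference, one predict/update cycle of the conventional discrete Kalman filter that the distributed operators emulate, in the standard matrix form (state model x' = A x + w, measurement z = H x + v):

    import numpy as np

    def kalman_step(x, P, z, A, H, Q, R):
        x_pred = A @ x                         # state prediction
        P_pred = A @ P @ A.T + Q               # covariance prediction
        S = H @ P_pred @ H.T + R               # innovation covariance
        K = P_pred @ H.T @ np.linalg.inv(S)    # Kalman gain
        x_new = x_pred + K @ (z - H @ x_pred)  # measurement update
        P_new = (np.eye(len(x)) - K @ H) @ P_pred
        return x_new, P_new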
Discovery of the Kalman filter as a practical tool for aerospace and industry
NASA Technical Reports Server (NTRS)
Mcgee, L. A.; Schmidt, S. F.
1985-01-01
The sequence of events which led the researchers at Ames Research Center to the early discovery of the Kalman filter shortly after its introduction into the literature is recounted. The scientific breakthroughs and reformulations that were necessary to transform Kalman's work into a useful tool for a specific aerospace application are described. The resulting extended Kalman filter, as it is now known, is often still referred to simply as the Kalman filter. As the filter's use gained in popularity in the scientific community, the problems of implementation on small spaceborne and airborne computers led to a square-root formulation of the filter to overcome numerical difficulties associated with computer word length. The work that led to this new formulation is also discussed, including the first airborne computer implementation and flight test. Since then the applications of the extended and square-root formulations of the Kalman filter have grown rapidly throughout the aerospace industry.
Advances in systems biology: computational algorithms and applications.
Huang, Yufei; Zhao, Zhongming; Xu, Hua; Shyr, Yu; Zhang, Bing
2012-01-01
The 2012 International Conference on Intelligent Biology and Medicine (ICIBM 2012) was held on April 22-24, 2012 in Nashville, Tennessee, USA. The conference featured six technical sessions, one tutorial session, one workshop, and three keynote presentations that covered state-of-the-art research activities in genomics, systems biology, and intelligent computing. In addition to a major emphasis on next generation sequencing (NGS)-driven informatics, ICIBM 2012 aligned significant interests in systems biology and its applications in medicine. We highlight in this editorial the selected papers from the meeting that address the development of novel algorithms and applications in systems biology.
Application of computer virtual simulation technology in 3D animation production
NASA Astrophysics Data System (ADS)
Mo, Can
2017-11-01
With the continuous development of computer technology, the application of virtual simulation technology has been further optimized and improved. It is widely used in fields such as urban construction, interior design, industrial simulation and tourism teaching. This paper introduces the use of virtual simulation technology in 3D animation. After analyzing the characteristics of virtual simulation technology, the ways and means of applying it in 3D animation are examined. The purpose is to provide a reference for future work on improving 3D effects.
Generic Divide and Conquer Internet-Based Computing
NASA Technical Reports Server (NTRS)
Follen, Gregory J. (Technical Monitor); Radenski, Atanas
2003-01-01
The growth of Internet-based applications and the proliferation of networking technologies have been transforming traditional commercial application areas as well as computer and computational sciences and engineering. This growth stimulates the exploration of Peer-to-Peer (P2P) software technologies that can open new research and application opportunities not only for the commercial world, but also for the scientific and high-performance computing applications community. The general goal of this project is to achieve better understanding of the transition to Internet-based high-performance computing and to develop solutions for some of the technical challenges of this transition. In particular, we are interested in creating long-term motivation for end users to provide their idle processor time to support computationally intensive tasks. We believe that a practical P2P architecture should provide useful service to both clients with high-performance computing needs and contributors of lower-end computing resources. To achieve this, we are designing a dual-service architecture for P2P high-performance divide-and-conquer computing; we are also experimenting with a prototype implementation. Our proposed architecture incorporates a master server, utilizes dual satellite servers, and operates on the Internet in a dynamically changing large configuration of lower-end nodes provided by volunteer contributors. A dual satellite server comprises a high-performance computing engine and a lower-end contributor service engine. The computing engine provides generic support for divide-and-conquer computations. The service engine is intended to provide free, useful HTTP-based services to contributors of lower-end computing resources. Our proposed architecture is complementary to and accessible from computational grids, such as Globus, Legion, and Condor. Grids provide remote access to existing higher-end computing resources; in contrast, our goal is to utilize idle processor time of lower-end Internet nodes. Our project is focused on a generic divide-and-conquer paradigm and on mobile applications of this paradigm that can operate on a loose and ever-changing pool of lower-end Internet nodes.
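The generic paradigm itself is easy to state; in the sketch below, the recursive calls on the parts are exactly the tasks a P2P system would farm out to contributor nodes (shown sequentially here for clarity):

    def divide_and_conquer(task, is_small, solve, split, combine):
        if is_small(task):
            return solve(task)
        parts = split(task)   # these sub-tasks could run on volunteer nodes
        return combine([divide_and_conquer(p, is_small, solve, split, combine)
                        for p in parts])

    # Example instance: summing a large list.
    total = divide_and_conquer(
        list(range(1000)),
        is_small=lambda t: len(t) <= 16,
        solve=sum,
        split=lambda t: [t[:len(t) // 2], t[len(t) // 2:]],
        combine=sum,
    )
    print(total)  # 499500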
Programmer's Reference Manual for Dynamic Display Software System
DOT National Transportation Integrated Search
1971-01-01
In 1968, the display systems group of the Systems Laboratory of the NASA/Electronics Research Center undertook a research task in the area of computer controlled flight information systems for aerospace application. The display laboratory of the Trans...
Use of Computer and Mobile Technologies in the Treatment of Depression.
Callan, Judith A; Wright, Jesse; Siegle, Greg J; Howland, Robert H; Kepler, Britney B
2017-06-01
Major depression (MDD) is a common and disabling disorder. Research has shown that most people with MDD receive either no treatment or inadequate treatment. Computer and mobile technologies may offer solutions for the delivery of therapies to untreated or inadequately treated individuals with MDD. The authors review currently available technologies and research aimed at relieving symptoms of MDD. These technologies include computer-assisted cognitive-behavior therapy (CCBT), web-based self-help, Internet self-help support groups, mobile psychotherapeutic interventions (i.e., mobile applications or apps), technology enhanced exercise, and biosensing technology. Copyright © 2017 Elsevier Inc. All rights reserved.
Computational Psychiatry and the Challenge of Schizophrenia.
Krystal, John H; Murray, John D; Chekroud, Adam M; Corlett, Philip R; Yang, Genevieve; Wang, Xiao-Jing; Anticevic, Alan
2017-05-01
Schizophrenia research is plagued by enormous challenges in integrating and analyzing large datasets and difficulties developing formal theories related to the etiology, pathophysiology, and treatment of this disorder. Computational psychiatry provides a path to enhance analyses of these large and complex datasets and to promote the development and refinement of formal models for features of this disorder. This presentation introduces the reader to the notion of computational psychiatry and describes discovery-oriented and theory-driven applications to schizophrenia involving machine learning, reinforcement learning theory, and biophysically-informed neural circuit models. Published by Oxford University Press on behalf of the Maryland Psychiatric Research Center 2017.
ERIC Educational Resources Information Center
García, Isaías; Benavides, Carmen; Alaiz, Héctor; Alonso, Angel
2013-01-01
This paper describes research on the use of knowledge models (ontologies) for building computer-aided educational software in the field of control engineering. Ontologies are able to represent in the computer a very rich conceptual model of a given domain. This model can be used later for a number of purposes in different software applications. In…
Multiple-User, Multitasking, Virtual-Memory Computer System
NASA Technical Reports Server (NTRS)
Generazio, Edward R.; Roth, Don J.; Stang, David B.
1993-01-01
Computer system designed and programmed to serve multiple users in research laboratory. Provides for computer control and monitoring of laboratory instruments, acquisition and analysis of data from those instruments, and interaction with users via remote terminals. System provides fast access to shared central processing units and associated large (from megabytes to gigabytes) memories. Underlying concept of system also applicable to monitoring and control of industrial processes.
Computational methods and software systems for dynamics and control of large space structures
NASA Technical Reports Server (NTRS)
Park, K. C.; Felippa, C. A.; Farhat, C.; Pramono, E.
1990-01-01
Two key areas of crucial importance to the computer-based simulation of large space structures are discussed. The first area involves multibody dynamics (MBD) of flexible space structures, with applications directed to deployment, construction, and maneuvering. The second area deals with advanced software systems, with emphasis on parallel processing. The latest research thrust in the second area involves massively parallel computers.
Viscous Incompressible Flow Computations for 3-D Steady and Unsteady Flows
NASA Technical Reports Server (NTRS)
Kwak, Dochan
2001-01-01
This viewgraph presentation gives an overview of viscous incompressible flow computations for three-dimensional steady and unsteady flows. Details are given on the use of computational fluid dynamics (CFD) as an engineering tool, solution methods for incompressible Navier-Stokes equations, numerical and physical characteristics of the primitive variable approach, and the role of CFD in the past and in current engineering and research applications.
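For reference, the equations such solvers discretize are the incompressible Navier-Stokes system in primitive variables (velocity u, pressure p, density rho, kinematic viscosity nu):

    \nabla \cdot \mathbf{u} = 0, \qquad
    \frac{\partial \mathbf{u}}{\partial t} + (\mathbf{u}\cdot\nabla)\mathbf{u}
      = -\frac{1}{\rho}\nabla p + \nu\,\nabla^{2}\mathbf{u}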
A multiarchitecture parallel-processing development environment
NASA Technical Reports Server (NTRS)
Townsend, Scott; Blech, Richard; Cole, Gary
1993-01-01
A description is given of the hardware and software of a multiprocessor test bed - the second generation Hypercluster system. The Hypercluster architecture consists of a standard hypercube distributed-memory topology, with multiprocessor shared-memory nodes. By using standard, off-the-shelf hardware, the system can be upgraded to use rapidly improving computer technology. The Hypercluster's multiarchitecture nature makes it suitable for researching parallel algorithms in computational field simulation applications (e.g., computational fluid dynamics). The dedicated test-bed environment of the Hypercluster and its custom-built software allows experiments with various parallel-processing concepts such as message passing algorithms, debugging tools, and computational 'steering'. Such research would be difficult, if not impossible, to achieve on shared, commercial systems.
Constructing Scientific Applications from Heterogeneous Resources
NASA Technical Reports Server (NTRS)
Schlichting, Richard D.
1995-01-01
A new model for high-performance scientific applications in which such applications are implemented as heterogeneous distributed programs or, equivalently, meta-computations, is investigated. The specific focus of this grant was a collaborative effort with researchers at NASA and the University of Toledo to test and improve Schooner, a software interconnection system, and to explore the benefits of increased user interaction with existing scientific applications.
Commentary: New Technologies on the Horizon for Teaching
ERIC Educational Resources Information Center
Parslow, Graham R.
2013-01-01
A well-researched report has listed the technologies that should increasingly feature in teaching. It is projected that in the coming year there will be increased use of cloud computing, mobile applications, social exchanges, and tablet computing. The New Media Consortium (NMC) that produced the report is an international association of…
Dialogue-Based CALL: An Overview of Existing Research
ERIC Educational Resources Information Center
Bibauw, Serge; François, Thomas; Desmet, Piet
2015-01-01
Dialogue-based Computer-Assisted Language Learning (CALL) covers applications and systems allowing a learner to practice the target language in a meaning-focused conversational activity with an automated agent. We first present a common definition for dialogue-based CALL, based on three features: dialogue as the activity unit, computer as the…
Preparation of Teachers for Computer and Multimedia-Based Instruction in Literacy.
ERIC Educational Resources Information Center
Balajthy, Ernest
Recent developments in computer and multimedia technologies bring about the need to reconsider the education of today's teachers and future teachers and to update the technology-related content of literacy education coursework. "Application" software receives the most attention from researchers and theorists in literacy education. Use of…
48 CFR 27.404-4 - Contractor's release, publication, and use of data.
Code of Federal Regulations, 2011 CFR
2011-10-01
.... statutes. However, agencies may restrict the release or disclosure of computer software that is or is intended to be developed to the point of practical application (including for agency distribution under... applied research. Agencies may also preclude a contractor from asserting copyright in any computer...
48 CFR 27.404-4 - Contractor's release, publication, and use of data.
Code of Federal Regulations, 2012 CFR
2012-10-01
.... statutes. However, agencies may restrict the release or disclosure of computer software that is or is intended to be developed to the point of practical application (including for agency distribution under... applied research. Agencies may also preclude a contractor from asserting copyright in any computer...
48 CFR 27.404-4 - Contractor's release, publication, and use of data.
Code of Federal Regulations, 2014 CFR
2014-10-01
.... statutes. However, agencies may restrict the release or disclosure of computer software that is or is intended to be developed to the point of practical application (including for agency distribution under... applied research. Agencies may also preclude a contractor from asserting copyright in any computer...
Collaboration and Computer-Assisted Acquisition of a Second Language.
ERIC Educational Resources Information Center
Renie, Delphine; Chanier, Thierry
1995-01-01
Discusses how collaborative learning (CL) can be used in a computer-assisted learning (CAL) environment for language learning, reviewing research in the fields of applied linguistics, educational psychology, and artificial intelligence. An application of CL and CAL in the learning of French as a Second Language, focusing on interrogative…
ERIC Educational Resources Information Center
Tennyson, Robert
1984-01-01
Reviews educational applications of artificial intelligence and presents empirically-based design variables for developing a computer-based instruction management system. Taken from a programmatic research effort based on the Minnesota Adaptive Instructional System, variables include amount and sequence of instruction, display time, advisement,…
Ontology-Driven Discovery of Scientific Computational Entities
ERIC Educational Resources Information Center
Brazier, Pearl W.
2010-01-01
Many geoscientists use modern computational resources, such as software applications, Web services, scientific workflows and datasets that are readily available on the Internet, to support their research and many common tasks. These resources are often shared via human contact and sometimes stored in data portals; however, they are not necessarily…
Soft computing prediction of economic growth based in science and technology factors
NASA Astrophysics Data System (ADS)
Marković, Dušan; Petković, Dalibor; Nikolić, Vlastimir; Milovančević, Miloš; Petković, Biljana
2017-01-01
The purpose of this research is to develop and apply the Extreme Learning Machine (ELM) to forecast the gross domestic product (GDP) growth rate. In this study the GDP growth was analyzed based on ten science and technology factors. These factors were: research and development (R&D) expenditure in GDP, scientific and technical journal articles, patent applications for nonresidents, patent applications for residents, trademark applications for nonresidents, trademark applications for residents, total trademark applications, researchers in R&D, technicians in R&D and high-technology exports. The ELM results were compared with genetic programming (GP), artificial neural network (ANN) and fuzzy logic results. Based upon simulation results, it is demonstrated that ELM has better forecasting capability for the GDP growth rate.
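The ELM itself is compact: hidden-layer weights are random and fixed, and only the output weights are fitted, in one linear solve. Below is a minimal sketch in Python, where the feature matrix X would hold the ten science and technology factors:

    import numpy as np

    def elm_train(X, y, n_hidden=50, seed=0):
        rng = np.random.default_rng(seed)
        W = rng.normal(size=(X.shape[1], n_hidden))  # random input weights
        b = rng.normal(size=n_hidden)                # random biases
        H = np.tanh(X @ W + b)                       # hidden activations
        beta = np.linalg.pinv(H) @ y                 # output weights, one solve
        return W, b, beta

    def elm_predict(X, W, b, beta):
        return np.tanh(X @ W + b) @ beta             # predicted GDP growth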
Post-Positivist Research: Two Examples of Methodological Pluralism.
ERIC Educational Resources Information Center
Wildemuth, Barbara M.
1993-01-01
Discussion of positivist and interpretive approaches to research and postpositivism focuses on two studies that apply interpretive research in different ways: an exploratory study of user-developed computing applications conducted prior to a positivist study and a study of end-user searching behaviors conducted concurrently with a positivist…
Multicore: Fallout from a Computing Evolution
Yelick, Kathy [Director, NERSC]
2017-12-09
July 22, 2008 Berkeley Lab lecture: Parallel computing used to be reserved for big science and engineering projects, but in two years that's all changed. Even laptops and hand-helds use parallel processors. Unfortunately, the software hasn't kept pace. Kathy Yelick, Director of the National Energy Research Scientific Computing Center at Berkeley Lab, describes the resulting chaos and the computing community's efforts to develop exciting applications that take advantage of tens or hundreds of processors on a single chip.
Zhao, Yu; Liu, Yide; Lai, Ivan K W; Zhang, Hongfeng; Zhang, Yi
2016-03-18
As one of the latest revolutions in networking technology, social networks allow users to keep connected and exchange information. Driven by rapid wireless technology development and the diffusion of mobile devices, social networks have experienced a tremendous change based on mobile sensor computing. More and more mobile sensor network applications have appeared, along with a huge number of users. Therefore, an in-depth discussion of the human-computer interaction (HCI) issues of mobile sensor computing is required. The target of this study is to extend the discussion of HCI by examining the relationships of users' compound attitudes (i.e., affective attitude, cognitive attitude), engagement and electronic word of mouth (eWOM) behaviors in the context of mobile sensor computing. A conceptual model is developed, based on which 313 valid questionnaires are collected. The research discusses the level of impact of user-technology issues, including compound attitudes and engagement, on the eWOM of mobile sensor computing, which can inform further HCI studies of mobile sensor computing. In addition, we find that user engagement plays a mediating role between users' compound attitudes and eWOM. The research results can also help the mobile sensor computing industry to develop effective strategies and build strong consumer-product (brand) relationships.
NASA Technical Reports Server (NTRS)
Davis, Bruce E.; Elliot, Gregory
1989-01-01
Jackson State University recently established the Center for Spatial Data Research and Applications, a Geographical Information System (GIS) and remote sensing laboratory. Taking advantage of new technologies and new directions in the spatial (geographic) sciences, JSU is building a Center of Excellence in Spatial Data Management. New opportunities for research, applications, and employment are emerging. GIS requires fundamental shifts and new demands in traditional computer science and geographic training. The Center is not merely another computer lab but one setting the pace in a new applied frontier. GIS and its associated technologies are discussed. The Center's facilities are described. An ARC/INFO GIS runs on a VAX mainframe, with numerous workstations. Image processing packages include ELAS, LIPS, VICAR, and ERDAS. A host of hardware and software peripherals are used in support. Numerous projects are underway, such as the construction of a Gulf of Mexico environmental database, development of AI in image processing, a land use dynamics study of metropolitan Jackson, and others. A new academic interdisciplinary program in Spatial Data Management is under development, combining courses in Geography and Computer Science. The broad range of JSU's GIS and remote sensing activities is addressed. The discussion concludes with the impacts on changing paradigms in the university and in the professional world.
Computational logic: its origins and applications
2018-01-01
Computational logic is the use of computers to establish facts in a logical formalism. Originating in nineteenth century attempts to understand the nature of mathematical reasoning, the subject now comprises a wide variety of formalisms, techniques and technologies. One strand of work follows the ‘logic for computable functions (LCF) approach’ pioneered by Robin Milner, where proofs can be constructed interactively or with the help of users’ code (which does not compromise correctness). A refinement of LCF, called Isabelle, retains these advantages while providing flexibility in the choice of logical formalism and much stronger automation. The main application of these techniques has been to prove the correctness of hardware and software systems, but increasingly researchers have been applying them to mathematics itself. PMID:29507522
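The LCF idea mentioned in this abstract is that user-written proof code cannot compromise soundness, because a small trusted kernel checks every inference. A minimal sketch makes this concrete; the example below is written in Lean rather than Isabelle (the system named above), so the syntax is only illustrative of kernel-checked interactive proof:

```lean
-- Interactive proof of a simple arithmetic fact. Each tactic step is
-- elaborated into a kernel-checked proof term, so user tactic code
-- can fail but can never produce a false theorem (the LCF principle).
theorem add_comm_example (m n : Nat) : m + n = n + m := by
  induction n with
  | zero => simp                          -- base case: m + 0 = 0 + m
  | succ k ih =>                          -- inductive step uses hypothesis ih
    simp [Nat.add_succ, Nat.succ_add, ih]
```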
Molecular dynamics simulations and applications in computational toxicology and nanotoxicology.
Selvaraj, Chandrabose; Sakkiah, Sugunadevi; Tong, Weida; Hong, Huixiao
2018-02-01
Nanotoxicology studies the toxicity of nanomaterials and has been widely applied in biomedical research to explore toxicity in various biological systems. Investigating biological systems through in vivo and in vitro methods is expensive and time-consuming. Therefore, computational toxicology, a multi-discipline field that utilizes computational power and algorithms to examine the toxicology of biological systems, has attracted growing interest from scientists. Molecular dynamics (MD) simulations of biomolecules such as proteins and DNA are popular for understanding interactions between biological systems and chemicals in computational toxicology. In this paper, we review MD simulation methods, protocols for running MD simulations, and their applications in studies of toxicity and nanotechnology. We also briefly summarize some popular software tools for execution of MD simulations. Published by Elsevier Ltd.
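At the core of the MD engines this entry surveys is a time-stepping integrator; velocity Verlet is the standard choice. The sketch below is a toy NumPy implementation under assumed inputs (a generic force callable and a harmonic test force), not the protocol of any particular MD package:

```python
import numpy as np

def velocity_verlet(x, v, force, mass, dt, n_steps):
    """Integrate Newton's equations with the velocity Verlet scheme.

    x, v  : (N, 3) position and velocity arrays
    force : callable returning the (N, 3) force array for positions x
    mass  : (N, 1) particle masses
    dt    : time step (typically ~1 fs in biomolecular MD)
    """
    f = force(x)
    for _ in range(n_steps):
        v_half = v + 0.5 * dt * f / mass   # first half-step velocity update
        x = x + dt * v_half                # full-step position update
        f = force(x)                       # recompute forces at new positions
        v = v_half + 0.5 * dt * f / mass   # second half-step velocity update
    return x, v

# Toy example: particles in a harmonic well (not a real force field).
harmonic = lambda x: -x
x0 = np.random.default_rng(0).normal(size=(10, 3))
v0 = np.zeros((10, 3))
xf, vf = velocity_verlet(x0, v0, harmonic, mass=np.ones((10, 1)), dt=0.01, n_steps=1000)
```

Production codes layer force fields, neighbor lists, thermostats, and constraints on top of this same update loop.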
Innovative architectures for dense multi-microprocessor computers
NASA Technical Reports Server (NTRS)
Larson, Robert E.
1989-01-01
The purpose is to summarize a Phase 1 SBIR project performed for the NASA/Langley Computational Structural Mechanics Group. The project was performed from February to August 1987. The main objectives of the project were to: (1) expand upon previous research into the application of chordal ring architectures to the general problem of designing multi-microcomputer architectures, (2) attempt to identify a family of chordal rings such that each chordal ring can be simply expanded to produce the next member of the family, (3) perform a preliminary, high-level design of an expandable multi-microprocessor computer based upon chordal rings, (4) analyze the potential use of chordal ring based multi-microprocessors for sparse matrix problems and other applications arising in computational structural mechanics.
Simulation tools for robotics research and assessment
NASA Astrophysics Data System (ADS)
Fields, MaryAnne; Brewer, Ralph; Edge, Harris L.; Pusey, Jason L.; Weller, Ed; Patel, Dilip G.; DiBerardino, Charles A.
2016-05-01
The Robotics Collaborative Technology Alliance (RCTA) program focuses on four overlapping technology areas: Perception, Intelligence, Human-Robot Interaction (HRI), and Dexterous Manipulation and Unique Mobility (DMUM). In addition, the RCTA program has a requirement to assess progress of this research in standalone as well as integrated form. Since the research is evolving and the robotic platforms with unique mobility and dexterous manipulation are in the early development stage and very expensive, an alternate approach is needed for efficient assessment. Simulation of robotic systems, platforms, sensors, and algorithms is an attractive alternative to expensive field-based testing. Simulation can provide insight during development and debugging that is unavailable by many other means. This paper explores the maturity of robotic simulation systems for applications to real-world problems in robotic systems research. Open-source (such as Gazebo and Moby), commercial (Simulink, Actin, LMS), government (ANVEL/VANE), and the RCTA-developed RIVET simulation environments are examined with respect to their application in the robotic research domains of Perception, Intelligence, HRI, and DMUM. Tradeoffs for applications to representative problems from each domain are presented, along with known deficiencies and disadvantages. In particular, no single robotic simulation environment adequately covers the needs of the robotic researcher in all of the domains. Simulation for DMUM poses unique constraints on the development of physics-based computational models of the robot, the environment and objects within the environment, and the interactions between them. Most current robot simulations focus on quasi-static systems, but dynamic robotic motion places an increased emphasis on the accuracy of the computational models. In order to understand the interaction of dynamic multi-body systems, such as limbed robots, with the environment, it may be necessary to build component-level computational models to provide the simulation fidelity required for accuracy. However, the Perception domain remains the most problematic for adequate simulation performance, due to the often cartoon-like nature of computer rendering and the inability to model realistic electromagnetic radiation effects, such as multiple reflections, in real time.
Use of cloud computing in biomedicine.
Sobeslav, Vladimir; Maresova, Petra; Krejcar, Ondrej; Franca, Tanos C C; Kuca, Kamil
2016-12-01
Nowadays, biomedicine is characterised by a growing need for processing of large amounts of data in real time. This leads to new requirements for information and communication technologies (ICT). Cloud computing offers a solution to these requirements and provides many advantages, such as cost savings, elasticity and scalability of ICT use. The aim of this paper is to explore the concept of cloud computing and its use in the area of biomedicine. The authors offer a comprehensive analysis of the implementation of the cloud computing approach in biomedical research, decomposed into infrastructure, platform and service layers, and a recommendation for processing large amounts of data in biomedicine. Firstly, the paper describes the appropriate forms and technological solutions of cloud computing. Secondly, the high-end computing paradigm of cloud computing aspects is analysed. Finally, the potential and current use of applications of this technology in scientific research in biomedicine is discussed.
Application of supercomputers to computational aerodynamics
NASA Technical Reports Server (NTRS)
Peterson, V. L.
1984-01-01
Computers are playing an increasingly important role in the field of aerodynamics such that they now serve as a major complement to wind tunnels in aerospace research and development. Factors pacing advances in computational aerodynamics are identified, including the amount of computational power required to take the next major step in the discipline. Example results obtained from the successively refined forms of the governing equations are discussed, both in the context of levels of computer power required and the degree to which they either further the frontiers of research or apply to problems of practical importance. Finally, the Numerical Aerodynamic Simulation (NAS) Program - with its 1988 target of achieving a sustained computational rate of 1 billion floating point operations per second and operating with a memory of 240 million words - is discussed in terms of its goals and its projected effect on the future of computational aerodynamics.
STREAM2016: Streaming Requirements, Experience, Applications and Middleware Workshop
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fox, Geoffrey; Jha, Shantenu; Ramakrishnan, Lavanya
The Department of Energy (DOE) Office of Science (SC) facilities, including accelerators, light sources, neutron sources, and sensors that study the environment and the atmosphere, are producing streaming data that need to be analyzed for next-generation scientific discoveries. There has been an explosion of new research and technologies for stream analytics arising from the academic and private sectors. However, there has been no corresponding effort in either documenting the critical research opportunities or building a community that can create and foster productive collaborations. The two-part workshop series, STREAM: Streaming Requirements, Experience, Applications and Middleware Workshop (STREAM2015 and STREAM2016), was conducted to bring the community together and identify gaps and future efforts needed by both NSF and DOE. This report describes the discussions, outcomes and conclusions from STREAM2016: Streaming Requirements, Experience, Applications and Middleware Workshop, the second of these workshops, held on March 22-23, 2016 in Tysons, VA. STREAM2016 focused on Department of Energy (DOE) applications, computational and experimental facilities, as well as software systems. Thus, the role of "streaming and steering" as a critical mode of connecting the experimental and computing facilities was pervasive through the workshop. Given the overlap in interests and challenges with industry, the workshop had significant presence from several innovative companies and major contributors. The requirements that drive the proposed research directions, identified in this report, show an important opportunity for building a competitive research and development program around streaming data. These findings and recommendations are consistent with the vision outlined in the NRC Frontiers of Data and the National Strategic Computing Initiative (NSCI) [1, 2]. The discussions from the workshop are captured as topic areas covered in this report's sections. The report discusses four research directions driven by current and future application requirements reflecting the areas identified as important by STREAM2016. These include (i) Algorithms, (ii) Programming Models, Languages and Runtime Systems, (iii) Human-in-the-loop and Steering in Scientific Workflows, and (iv) Facilities.
Innovation Research in E-Learning
NASA Astrophysics Data System (ADS)
Wu, Bing; Xu, WenXia; Ge, Jun
This study is a productivity review of the literature on innovation research in E-Learning gleaned from the SSCI and SCIE databases. The results indicate that the number of publications on innovation research in E-Learning has been growing since 2005. The leading country in this research is England, and analysis by publication year shows that output peaked in 2010 at 25% of the total. The main source title is the British Journal of Educational Technology. The subject areas concentrate on Education & Educational Research; Computer Science, Interdisciplinary Applications; and Computer Science, Software Engineering. The research consists mainly of conceptual and empirical studies that explore E-Learning from the perspective of innovation diffusion theory; the limitations of these studies and directions for future research are also discussed.
Computational Fragment-Based Drug Design: Current Trends, Strategies, and Applications.
Bian, Yuemin; Xie, Xiang-Qun Sean
2018-04-09
Fragment-based drug design (FBDD) has been an effective methodology for drug development for decades. Successful applications of this strategy have brought both opportunities and challenges to the field of pharmaceutical science. Recent progress in computational fragment-based drug design provides an additional approach for future research in a time- and labor-efficient manner. Combining multiple in silico methodologies, computational FBDD offers flexibility in fragment library selection, protein model generation, and fragment/compound docking mode prediction. These characteristics give computational FBDD an advantage in designing novel and promising compounds for a given target. The purpose of this review is to discuss the latest advances, ranging from commonly used strategies to novel concepts and technologies, in computational fragment-based drug design. In particular, this review compares specifications and advantages between experimental and computational FBDD, and discusses limitations and future prospects.
PREDICTORS OF COMPUTER USE IN COMMUNITY-DWELLING ETHNICALLY DIVERSE OLDER ADULTS
Werner, Julie M.; Carlson, Mike; Jordan-Marsh, Maryalice; Clark, Florence
2011-01-01
Objective: In this study we analyzed self-reported computer use, demographic variables, psychosocial variables, and health and well-being variables collected from 460 ethnically diverse, community-dwelling elders in order to investigate the relationship computer use has with demographics, well-being and other key psychosocial variables in older adults. Background: Although younger elders with more education, those who employ active coping strategies, or those who are low in anxiety levels are thought to use computers at higher rates than others, previous research has produced mixed or inconclusive results regarding ethnic, gender, and psychological factors, or has concentrated on computer-specific psychological factors only (e.g., computer anxiety). Few such studies have employed large sample sizes or have focused on ethnically diverse populations of community-dwelling elders. Method: With a large number of overlapping predictors, zero-order analysis alone is poorly equipped to identify variables that are independently associated with computer use. Accordingly, both zero-order and stepwise logistic regression analyses were conducted to determine the correlates of two types of computer use: email and general computer use. Results: Results indicate that younger age, greater level of education, non-Hispanic ethnicity, behaviorally active coping style, general physical health, and role-related emotional health each independently predicted computer usage. Conclusion: Study findings highlight differences in computer usage, especially in regard to Hispanic ethnicity and specific health and well-being factors. Application: Potential applications of this research include future intervention studies, individualized computer-based activity programming, or customizable software and user interface design for older adults responsive to a variety of personal characteristics and capabilities. PMID:22046718
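For readers unfamiliar with the stepwise procedure named in the Method section above, the sketch below shows one common variant (forward selection by p-value) for a binary outcome such as email use. It is a generic illustration, not the authors' code; the predictor names and the 0.05 threshold are assumptions:

```python
import statsmodels.api as sm

def forward_stepwise_logit(X, y, threshold=0.05):
    """Forward stepwise selection for a logistic regression.

    X : pandas DataFrame of candidate predictors (age, education, ...)
    y : binary outcome, e.g. 1 = uses email, 0 = does not
    Each round adds the remaining predictor with the smallest p-value,
    stopping when none falls below `threshold`.
    """
    selected, remaining = [], list(X.columns)
    while remaining:
        pvals = {}
        for col in remaining:
            fit = sm.Logit(y, sm.add_constant(X[selected + [col]])).fit(disp=0)
            pvals[col] = fit.pvalues[col]
        best = min(pvals, key=pvals.get)
        if pvals[best] >= threshold:
            break                      # no remaining predictor qualifies
        selected.append(best)
        remaining.remove(best)
    return sm.Logit(y, sm.add_constant(X[selected])).fit(disp=0)
```

Zero-order (single-predictor) associations fall out of the same loop on its first pass, before any adjustment for overlapping predictors.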
NASA Technical Reports Server (NTRS)
Schmid, Beat; Bergstrom, Robert W.; Redemann, Jens
2002-01-01
This report is the final report for "Analysis of Atmospheric Aerosol Data Sets and Application of Radiative Transfer Models to Compute Aerosol Effects". It is a bibliographic compilation of 29 peer-reviewed publications (published, in press or submitted) produced under this Cooperative Agreement and 30 first-authored conference presentations. The tasks outlined in the various proposals are listed below with a brief comment as to the research performed. Copies of title/abstract pages of peer-reviewed publications are attached.
Computational fluid dynamics applications at McDonnell Douglas
NASA Technical Reports Server (NTRS)
Hakkinen, R. J.
1987-01-01
Representative examples are presented of applications and development of advanced Computational Fluid Dynamics (CFD) codes for aerodynamic design at the McDonnell Douglas Corporation (MDC). Transonic potential and Euler codes, interactively coupled with boundary layer computation, and solutions of slender-layer Navier-Stokes approximation are applied to aircraft wing/body calculations. An optimization procedure using evolution theory is described in the context of transonic wing design. Euler methods are presented for analysis of hypersonic configurations, and helicopter rotors in hover and forward flight. Several of these projects were accepted for access to the Numerical Aerodynamic Simulation (NAS) facility at the NASA-Ames Research Center.
Telehealth innovations in health education and training.
Conde, José G; De, Suvranu; Hall, Richard W; Johansen, Edward; Meglan, Dwight; Peng, Grace C Y
2010-01-01
Telehealth applications are increasingly important in many areas of health education and training. In addition, they will play a vital role in biomedical research and research training by facilitating remote collaborations and providing access to expensive/remote instrumentation. In order to fulfill their true potential to leverage education, training, and research activities, innovations in telehealth applications should be fostered across a range of technology fronts, including online, on-demand computational models for simulation; simplified interfaces for software and hardware; software frameworks for simulations; portable telepresence systems; artificial intelligence applications to be applied when simulated human patients are not options; and the development of more simulator applications. This article presents the results of discussion on potential areas of future development, barriers to overcome, and suggestions to translate the promise of telehealth applications into a transformed environment of training, education, and research in the health sciences.
Ubiquitous Green Computing Techniques for High Demand Applications in Smart Environments
Zapater, Marina; Sanchez, Cesar; Ayala, Jose L.; Moya, Jose M.; Risco-Martín, José L.
2012-01-01
Ubiquitous sensor network deployments, such as the ones found in Smart cities and Ambient intelligence applications, require constantly increasing high computational demands in order to process data and offer services to users. The nature of these applications imply the usage of data centers. Research has paid much attention to the energy consumption of the sensor nodes in WSNs infrastructures. However, supercomputing facilities are the ones presenting a higher economic and environmental impact due to their very high power consumption. The latter problem, however, has been disregarded in the field of smart environment services. This paper proposes an energy-minimization workload assignment technique, based on heterogeneity and application-awareness, that redistributes low-demand computational tasks from high-performance facilities to idle nodes with low and medium resources in the WSN infrastructure. These non-optimal allocation policies reduce the energy consumed by the whole infrastructure and the total execution time. PMID:23112621
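The paper's central mechanism, an application-aware assignment that pushes low-demand tasks onto low-power nodes, can be sketched as a greedy heuristic. Everything below (the field names, the wattage figures, the cheapest-feasible-node rule) is an illustrative assumption, not the authors' algorithm:

```python
def assign_tasks(tasks, nodes):
    """Greedy energy-aware workload assignment (illustrative only).

    tasks : list of dicts with 'id' and 'demand' (abstract compute units)
    nodes : list of dicts with 'id', 'free' and 'watts_per_unit'; idle
            WSN nodes carry a smaller watts_per_unit than the
            data-center nodes they off-load.
    Each task goes to the cheapest node that can still hold it.
    """
    plan = {}
    for task in sorted(tasks, key=lambda t: t["demand"], reverse=True):
        feasible = [n for n in nodes if n["free"] >= task["demand"]]
        if not feasible:
            raise RuntimeError("no node can host task %r" % task["id"])
        best = min(feasible, key=lambda n: n["watts_per_unit"])
        best["free"] -= task["demand"]     # reserve capacity on that node
        plan[task["id"]] = best["id"]
    return plan

nodes = [
    {"id": "datacenter", "free": 100, "watts_per_unit": 5.0},
    {"id": "wsn-gateway", "free": 10, "watts_per_unit": 0.5},
]
tasks = [{"id": "t1", "demand": 4}, {"id": "t2", "demand": 8}, {"id": "t3", "demand": 40}]
print(assign_tasks(tasks, nodes))   # heavy task stays in the data center
```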
The Pixhawk Open-Source Computer Vision Framework for MAVs
NASA Astrophysics Data System (ADS)
Meier, L.; Tanskanen, P.; Fraundorfer, F.; Pollefeys, M.
2011-09-01
Unmanned aerial vehicles (UAV) and micro air vehicles (MAV) are already intensively used in geodetic applications. State-of-the-art autonomous systems are, however, geared towards applications at safe, obstacle-free altitudes greater than 30 meters. Applications at lower altitudes still require a human pilot. A new application field will be the reconstruction of structures and buildings, including facades and roofs, with semi-autonomous MAVs. Ongoing research in the MAV robotics field is focusing on enabling this system class to operate at lower altitudes in proximity to nearby obstacles and humans. PIXHAWK is an open-source and open-hardware toolkit for this purpose. The quadrotor design is optimized for onboard computer vision and can connect up to four cameras to its onboard computer. The validity of the system design is shown with a fully autonomous capture flight along a building.
Mapping urban green open space in Bontang city using QGIS and cloud computing
NASA Astrophysics Data System (ADS)
Agus, F.; Ramadiani; Silalahi, W.; Armanda, A.; Kusnandar
2018-04-01
Digital mapping techniques are freely and openly available, making map-based application development easier, faster and cheaper. The rapid development of cloud computing geographic information systems means such systems can help meet the community's need for online geospatial information. Urban Green Open Space (GOS) provides great benefits as an oxygen supplier and carbon sink, and contributes to the comfort and beauty of city life. This study aims to propose a GIS Cloud Computing (CC) platform for mapping the GOS of Bontang City. The GIS-CC platform uses freely available, open-source base maps. The research used a survey method to collect GOS data obtained from the Bontang City Government, while application development used Quantum GIS-CC. The results section describes the existing GOS of Bontang City and the design of the GOS mapping application.
Fundamental Concepts of Digital Image Processing
DOE R&D Accomplishments Database
Twogood, R. E.
1983-03-01
The field of digital image processing has experienced dramatic growth and increasingly widespread applicability in recent years. Fortunately, advances in computer technology have kept pace with the rapid growth in the volume of image data in these and other applications. Digital image processing has become economical in many fields of research and in industrial and military applications. While each application has requirements unique from the others, all are concerned with faster, cheaper, more accurate, and more extensive computation. The trend is toward real-time and interactive operations, where the user of the system obtains preliminary results within a short enough time that the next decision can be made by the human processor without loss of concentration on the task at hand. An example of this is the obtaining of two-dimensional (2-D) computer-aided tomography (CAT) images. A medical decision might be made while the patient is still under observation rather than days later.
2012-11-01
[Fragmentary record; text incompletely extracted.] The simulations confirm that the PID algorithm can be applied to this cohort without the risk of hypoglycemia. Affiliation: DoD Biotechnology High-Performance Computing Software Applications Institute, Telemedicine and Advanced Technology Research Center, U.S. Army Medical Research and Materiel Command. Keywords: safe operating region, type 1 diabetes mellitus simulator. Corresponding author: Jaques Reifman, Ph.D.
Transonic CFD applications at Boeing
NASA Technical Reports Server (NTRS)
Tinoco, E. N.
1989-01-01
The use of computational methods for three-dimensional transonic flow design and analysis at the Boeing Company is presented. A range of computational tools is described, consisting of production tools for everyday use by project engineers, expert-user tools for special applications by computational researchers, and an emerging tool which may see considerable use in the near future. These methods include full potential and Euler solvers, some coupled to three-dimensional boundary layer analysis methods, for transonic flow analysis about nacelle, wing-body, wing-body-strut-nacelle, and complete aircraft configurations. As the examples presented show, such a toolbox of codes is necessary for the variety of applications typical of an industrial environment, and it makes possible aerodynamic advances not previously achievable in a timely manner, if at all.
Evolving telemedicine/ehealth technology.
Ferrante, Frank E
2005-06-01
This paper describes emerging technologies to support a rapidly changing and expanding scope of telemedicine/telehealth applications. Of primary interest here are wireless systems, emerging broadband, nanotechnology, intelligent agent applications, and grid computing. More specifically, the paper describes the changes underway in wireless designs aimed at enhancing security; some of the current work involving the development of nanotechnology applications and research into the use of intelligent agents/artificial intelligence technology to establish what are termed "Knowbots"; and a sampling of the use of Web services, such as grid computing capabilities, to support medical applications. In addition, the expansion of these technologies and the need for cost containment to sustain future health care for an increasingly mobile and aging population is discussed.
Moran, Stephan G; Key, Jason S; McGwin, Gerald; Keeley, Jason W; Davidson, James S; Rue, Loring W
2004-07-01
Head injury is a significant cause of both morbidity and mortality. Motor vehicle collisions (MVCs) are the most common source of head injury in the United States. No studies have conclusively determined the applicability of computer models for accurate prediction of head injuries sustained in actual MVCs. This study sought to determine the applicability of such models for predicting head injuries sustained by MVC occupants. The Crash Injury Research and Engineering Network (CIREN) database was queried for restrained drivers who sustained a head injury. These collisions were modeled using occupant dynamic modeling (MADYMO) software, and head injury scores were generated. The computer-generated head injury scores then were evaluated with respect to the actual head injuries sustained by the occupants to determine the applicability of MADYMO computer modeling for predicting head injury. Five occupants meeting the selection criteria for the study were selected from the CIREN database. The head injury scores generated by MADYMO were lower than expected given the actual injuries sustained. In only one case did the computer analysis predict a head injury of a severity similar to that actually sustained by the occupant. Although computer modeling accurately simulates experimental crash tests, it may not be applicable for predicting head injury in actual MVCs. Many complicating factors surrounding actual MVCs make accurate computer modeling difficult. Future modeling efforts should consider variables such as age of the occupant and should account for a wider variety of crash scenarios.
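The abstract does not say which score MADYMO reported here; occupant models of this kind conventionally summarize head loading with the Head Injury Criterion, computed from the resultant head acceleration a(t) (in g) over a bounded time window:

```latex
\mathrm{HIC} = \max_{t_1,\,t_2} \left[ (t_2 - t_1)
  \left( \frac{1}{t_2 - t_1} \int_{t_1}^{t_2} a(t)\,\mathrm{d}t \right)^{2.5} \right]
```

with the window length t2 - t1 capped (commonly at 15 or 36 ms). This is background context on head injury scoring, not a detail taken from the study.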
GeoBrain Computational Cyber-laboratory for Earth Science Studies
NASA Astrophysics Data System (ADS)
Deng, M.; di, L.
2009-12-01
Computational approaches (e.g., computer-based data visualization, analysis and modeling) are critical for conducting increasingly data-intensive Earth science (ES) studies to understand functions and changes of the Earth system. However, Earth scientists, educators, and students currently face two major barriers that prevent them from effectively using computational approaches in their learning, research and application activities. The two barriers are: 1) difficulties in finding, obtaining, and using multi-source ES data; and 2) lack of analytic functions and computing resources (e.g., analysis software, computing models, and high performance computing systems) to analyze the data. Taking advantage of recent advances in cyberinfrastructure, Web service, and geospatial interoperability technologies, GeoBrain, a project funded by NASA, has developed a prototype computational cyber-laboratory to effectively remove the two barriers. The cyber-laboratory makes ES data and computational resources at large organizations in distributed locations available to and easily usable by the Earth science community through 1) enabling seamless discovery, access and retrieval of distributed data, 2) federating and enhancing data discovery with a catalogue federation service and a semantically-augmented catalogue service, 3) customizing data access and retrieval at user request with interoperable, personalized, and on-demand data access and services, 4) automating or semi-automating multi-source geospatial data integration, 5) developing a large number of analytic functions as value-added, interoperable, and dynamically chainable geospatial Web services and deploying them in high-performance computing facilities, 6) enabling online geospatial process modeling and execution, and 7) building a user-friendly extensible web portal for users to access the cyber-laboratory resources. Users can interactively discover the needed data and perform on-demand data analysis and modeling through the web portal. The GeoBrain cyber-laboratory provides solutions to meet common needs of ES research and education, such as distributed data access and analysis services, easy access to and use of ES data, and enhanced geoprocessing and geospatial modeling capability. It greatly facilitates ES research, education, and applications. The development of the cyber-laboratory provides insights, lessons learned, and technology readiness to build more capable computing infrastructure for ES studies, which can meet the wide-ranging needs of current and future generations of scientists, researchers, educators, and students for their formal or informal educational training, research projects, career development, and lifelong learning.
NASA Computational Fluid Dynamics Conference. Volume 2: Sessions 7-12
NASA Technical Reports Server (NTRS)
1989-01-01
The objectives of the conference were to disseminate CFD research results to industry and university CFD researchers, to promote synergy among NASA CFD researchers, and to permit feedback from researchers outside of NASA on issues pacing the discipline of CFD. The focus of the conference was on the application of CFD technology but also included fundamental activities.
NASA Technical Reports Server (NTRS)
1994-01-01
CESDIS, the Center of Excellence in Space Data and Information Sciences, was developed jointly by NASA, Universities Space Research Association (USRA), and the University of Maryland in 1988 to focus on the design of advanced computing techniques and data systems to support NASA Earth and space science research programs. CESDIS is operated by USRA under contract to NASA. The Director, Associate Director, Staff Scientists, and administrative staff are located on-site at NASA's Goddard Space Flight Center in Greenbelt, Maryland. The primary CESDIS mission is to increase the connection between computer science and engineering research programs at colleges and universities and NASA groups working with computer applications in Earth and space science. Research areas of primary interest at CESDIS include: 1) High performance computing, especially software design and performance evaluation for massively parallel machines; 2) Parallel input/output and data storage systems for high performance parallel computers; 3) Database and intelligent data management systems for parallel computers; 4) Image processing; 5) Digital libraries; and 6) Data compression. CESDIS funds multiyear projects at U.S. universities and colleges. Proposals are accepted in response to calls for proposals and are selected on the basis of peer reviews. Funds are provided to support faculty and graduate students working at their home institutions. Project personnel visit Goddard during academic recess periods to attend workshops, present seminars, and collaborate with NASA scientists on research projects. Additionally, CESDIS takes on shorter-duration computer science research tasks requested by NASA Goddard scientists.
Cloudbus Toolkit for Market-Oriented Cloud Computing
NASA Astrophysics Data System (ADS)
Buyya, Rajkumar; Pandey, Suraj; Vecchiola, Christian
This keynote paper: (1) presents the 21st century vision of computing and identifies various IT paradigms promising to deliver computing as a utility; (2) defines the architecture for creating market-oriented Clouds and computing atmosphere by leveraging technologies such as virtual machines; (3) provides thoughts on market-based resource management strategies that encompass both customer-driven service management and computational risk management to sustain SLA-oriented resource allocation; (4) presents the work carried out as part of our new Cloud Computing initiative, called Cloudbus: (i) Aneka, a Platform as a Service software system containing SDK (Software Development Kit) for construction of Cloud applications and deployment on private or public Clouds, in addition to supporting market-oriented resource management; (ii) internetworking of Clouds for dynamic creation of federated computing environments for scaling of elastic applications; (iii) creation of 3rd party Cloud brokering services for building content delivery networks and e-Science applications and their deployment on capabilities of IaaS providers such as Amazon along with Grid mashups; (iv) CloudSim supporting modelling and simulation of Clouds for performance studies; (v) Energy Efficient Resource Allocation Mechanisms and Techniques for creation and management of Green Clouds; and (vi) pathways for future research.
Hastings, Janna; Chepelev, Leonid; Willighagen, Egon; Adams, Nico; Steinbeck, Christoph; Dumontier, Michel
2011-01-01
Cheminformatics is the application of informatics techniques to solve chemical problems in silico. There are many areas in biology where cheminformatics plays an important role in computational research, including metabolism, proteomics, and systems biology. One critical aspect in the application of cheminformatics in these fields is the accurate exchange of data, which is increasingly accomplished through the use of ontologies. Ontologies are formal representations of objects and their properties using a logic-based ontology language. Many such ontologies are currently being developed to represent objects across all the domains of science. Ontologies enable the definition, classification, and support for querying objects in a particular domain, enabling intelligent computer applications to be built which support the work of scientists both within the domain of interest and across interrelated neighbouring domains. Modern chemical research relies on computational techniques to filter and organise data to maximise research productivity. The objects which are manipulated in these algorithms and procedures, as well as the algorithms and procedures themselves, enjoy a kind of virtual life within computers. We will call these information entities. Here, we describe our work in developing an ontology of chemical information entities, with a primary focus on data-driven research and the integration of calculated properties (descriptors) of chemical entities within a semantic web context. Our ontology distinguishes algorithmic, or procedural information from declarative, or factual information, and renders of particular importance the annotation of provenance to calculated data. The Chemical Information Ontology is being developed as an open collaborative project. More details, together with a downloadable OWL file, are available at http://code.google.com/p/semanticchemistry/ (license: CC-BY-SA). PMID:21991315
Current trends in hardware and software for brain-computer interfaces (BCIs)
NASA Astrophysics Data System (ADS)
Brunner, P.; Bianchi, L.; Guger, C.; Cincotti, F.; Schalk, G.
2011-04-01
A brain-computer interface (BCI) provides a non-muscular communication channel to people with and without disabilities. BCI devices consist of hardware and software. BCI hardware records signals from the brain, either invasively or non-invasively, using a series of device components. BCI software then translates these signals into device output commands and provides feedback. One may categorize different types of BCI applications into the following four categories: basic research, clinical/translational research, consumer products, and emerging applications. These four categories use BCI hardware and software, but have different sets of requirements. For example, while basic research needs to explore a wide range of system configurations, and thus requires a wide range of hardware and software capabilities, applications in the other three categories may be designed for relatively narrow purposes and thus may only need a very limited subset of capabilities. This paper summarizes technical aspects for each of these four categories of BCI applications. The results indicate that BCI technology is in transition from isolated demonstrations to systematic research and commercial development. This process requires several multidisciplinary efforts, including the development of better integrated and more robust BCI hardware and software, the definition of standardized interfaces, and the development of certification, dissemination and reimbursement procedures.
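The translation step described above (brain signals in, device commands out) can be caricatured in a few lines. The sketch below is a toy single-channel pipeline; the frequency band, threshold, and command names are invented for illustration, and no real BCI package is being quoted:

```python
import numpy as np
from scipy.signal import butter, lfilter

def translate_block(eeg, fs=256, band=(8.0, 12.0), threshold=2.0):
    """Toy BCI translation step: one signal block -> one device command.

    eeg       : 1-D array, a short block of a single EEG channel
    fs        : sampling rate in Hz
    band      : frequency band of interest (here the mu/alpha band)
    threshold : band-power level separating the two output commands
    Real BCI software adds calibration, artifact handling and feedback.
    """
    b, a = butter(4, band, btype="band", fs=fs)   # 4th-order band-pass filter
    filtered = lfilter(b, a, eeg)
    power = np.mean(filtered ** 2)                # mean band power of the block
    return "SELECT" if power > threshold else "IDLE"

rng = np.random.default_rng(1)
block = rng.normal(size=256)                      # one second of synthetic signal
print(translate_block(block))
```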
NASA Technical Reports Server (NTRS)
1993-01-01
Under an Army Small Business Innovation Research (SBIR) grant, Symbiotics, Inc. developed a software system that permits users to upgrade products from standalone applications so they can communicate in a distributed computing environment. Under a subsequent NASA SBIR grant, Symbiotics added additional tools to the SOCIAL product to enable NASA to coordinate conventional systems for planning Shuttle launch support operations. Using SOCIAL, data may be shared among applications in a computer network even when the applications are written in different programming languages. The product was introduced to the commercial market in 1993 and is used to monitor and control equipment for operation support and to integrate financial networks. InQuisiX is a reuse library providing high performance classification, cataloging, searching, browsing, retrieval and synthesis capabilities. These form the foundation for software reuse, producing higher quality software at lower cost and in less time. Software Productivity Solutions, Inc. developed the technology under Small Business Innovation Research (SBIR) projects funded by NASA and the Army and is marketing InQuisiX in conjunction with Science Applications International Corporation (SAIC). The SBIR program was established to increase small business participation in federal R&D activities and to transfer government research to industry.
A Programming Framework for Scientific Applications on CPU-GPU Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Owens, John
2013-03-24
At a high level, my research interests center around designing, programming, and evaluating computer systems that use new approaches to solve interesting problems. The rapid change of technology allows a variety of different architectural approaches to computationally difficult problems, and a constantly shifting set of constraints and trends makes the solutions to these problems both challenging and interesting. One of the most important recent trends in computing has been a move to commodity parallel architectures. This sea change is motivated by the industry's inability to continue to profitably increase performance on a single processor, and instead to move to multiple parallel processors. In the period of review, my most significant work has been leading a research group looking at the use of the graphics processing unit (GPU) as a general-purpose processor. GPUs can potentially deliver superior performance on a broad range of problems compared to their CPU counterparts, but effectively mapping complex applications to a parallel programming model with an emerging programming environment is a significant and important research problem.
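The mapping problem named in the last sentence starts with spotting data parallelism. A minimal illustration in NumPy (an assumption-level sketch, since the abstract names no specific GPU API): the whole-array form is the structure that maps to one GPU thread per element, while the explicit loop reflects the serial CPU style:

```python
import numpy as np

def saxpy_serial(a, x, y):
    """Elementwise a*x + y, one element at a time (serial CPU style)."""
    out = np.empty_like(x)
    for i in range(len(x)):
        out[i] = a * x[i] + y[i]
    return out

def saxpy_parallel(a, x, y):
    """Whole-array form: every element is independent, which is exactly
    the structure a GPU exploits by assigning one thread per element."""
    return a * x + y

x = np.arange(10, dtype=np.float32)
y = np.ones_like(x)
assert np.allclose(saxpy_serial(2.0, x, y), saxpy_parallel(2.0, x, y))
```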
Designing integrated computational biology pipelines visually.
Jamil, Hasan M
2013-01-01
The long-term cost of developing and maintaining a computational pipeline that depends upon data integration and sophisticated workflow logic is too high to even contemplate "what if" or ad hoc type queries. In this paper, we introduce a novel application-building interface for computational biology research, called VizBuilder, by leveraging a recent query language called BioFlow for life sciences databases. Using VizBuilder, it is now possible to develop ad hoc, complex computational biology applications at throwaway cost. The underlying query language supports data integration and workflow construction almost transparently and fully automatically, using a best-effort approach. Users express their application by drawing it with VizBuilder icons and connecting them in a meaningful way. Completed applications are compiled and translated as BioFlow queries for execution by the data management system LifeDB, for which VizBuilder serves as a front end. We discuss VizBuilder features and functionalities in the context of a real-life application after we briefly introduce BioFlow. The architecture and design principles of VizBuilder are also discussed. Finally, we outline future extensions of VizBuilder. To our knowledge, VizBuilder is a unique system that allows visually designing computational biology pipelines involving distributed and heterogeneous resources in an ad hoc manner.
Session on High Speed Civil Transport Design Capability Using MDO and High Performance Computing
NASA Technical Reports Server (NTRS)
Rehder, Joe
2000-01-01
Since the inception of CAS in 1992, NASA Langley has been conducting research into applying multidisciplinary optimization (MDO) and high performance computing toward reducing aircraft design cycle time. The focus of this research has been the development of a series of computational frameworks and associated applications that increased in capability, complexity, and performance over time. The culmination of this effort is an automated high-fidelity analysis capability for a high speed civil transport (HSCT) vehicle installed on a network of heterogeneous computers with a computational framework built using the Common Object Request Broker Architecture (CORBA) and Java. The main focus of the research in the early years was the development of the Framework for Interdisciplinary Design Optimization (FIDO) and associated HSCT applications. While the FIDO effort was eventually halted, work continued on HSCT applications of ever increasing complexity. The current application, HSCT4.0, employs high fidelity CFD and FEM analysis codes. For each analysis cycle, the vehicle geometry and computational grids are updated using new values for design variables. Processes for aeroelastic trim, loads convergence, displacement transfer, stress and buckling, and performance have been developed. In all, a total of 70 processes are integrated in the analysis framework. Many of the key processes include automatic differentiation capabilities to provide sensitivity information that can be used in optimization. A software engineering process was developed to manage this large project. Defining the interactions among 70 processes turned out to be an enormous, but essential, task. A formal requirements document was prepared that defined data flow among processes and subprocesses. A design document was then developed that translated the requirements into actual software design. A validation program was defined and implemented to ensure that codes integrated into the framework produced the same results as their standalone counterparts. Finally, a Commercial Off the Shelf (COTS) configuration management system was used to organize the software development. A computational environment, CJOpt, based on the Common Object Request Broker Architecture (CORBA) and the Java programming language, has been developed as a framework for multidisciplinary analysis and optimization. The environment exploits the parallelism inherent in the application and distributes the constituent disciplines on machines best suited to their needs. In CJOpt, a discipline code is "wrapped" as an object. An interface to the object identifies the functionality (services) provided by the discipline, defined in Interface Definition Language (IDL) and implemented using Java. The results of using the HSCT4.0 capability are described. A summary of lessons learned is also presented. The use of some of the processes, codes, and techniques by industry is highlighted. The application of the methodology developed in this research to other aircraft is described. Finally, we show how the experience gained is being applied to entirely new vehicles, such as the Reusable Space Transportation System. Additional information is contained in the original.
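The "wrap a discipline code as an object exposing services" pattern described above is framework-agnostic. As a hedged sketch (the real system used CORBA IDL interfaces implemented in Java; the class and field names here are invented), the same idea looks like this in Python:

```python
from abc import ABC, abstractmethod

class Discipline(ABC):
    """Stand-in for a CORBA IDL interface: each discipline code is
    wrapped as an object that exposes its services to the framework."""

    @abstractmethod
    def run(self, design_vars: dict) -> dict:
        """Execute one analysis for the given design variables."""

class AeroelasticTrim(Discipline):        # hypothetical discipline wrapper
    def run(self, design_vars):
        # A real wrapper would launch the CFD/FEM codes and collect
        # their outputs; this stub just echoes a fake trim result.
        return {"trim_angle_deg": 2.1, "inputs": design_vars}

# The framework treats every wrapped code uniformly through the interface:
for step in [AeroelasticTrim()]:
    print(step.run({"wing_sweep_deg": 48.0}))
```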
SEOS frame camera applications study
NASA Technical Reports Server (NTRS)
1974-01-01
A research and development satellite is discussed which will provide opportunities for observation of transient phenomena that fall within the fixed viewing circle of the spacecraft. Possible applications of frame cameras for SEOS are evaluated. The computed lens characteristics for each camera are listed.
Applications of Goal Programming to Education.
ERIC Educational Resources Information Center
Van Dusseldorp, Ralph A.; And Others
This paper discusses goal programming, a computer-based operations research technique that is basically a modification and extension of linear programming. The authors first discuss the similarities and differences between goal programming and linear programming, then describe the limitations of goal programming and its possible applications for…