(Some) Computer Futures: Mainframes.
ERIC Educational Resources Information Center
Joseph, Earl C.
Possible futures for the world of mainframe computers can be forecast through studies identifying forces of change and their impact on current trends. Some new prospects for the future have been generated by advances in information technology; for example, recent United States successes in applied artificial intelligence (AI) have created new…
Educational Computer Utilization and Computer Communications.
ERIC Educational Resources Information Center
Singh, Jai P.; Morgan, Robert P.
As part of an analysis of educational needs and telecommunications requirements for future educational satellite systems, three studies were carried out. 1) The role of the computer in education was examined, and both current status and future requirements were analyzed. Trade-offs between remote time sharing and remote batch processing were explored…
A Qualitative Look at Preservice Teacher's Perceptions of the Future of Computers in Education.
ERIC Educational Resources Information Center
Schnackenberg, Heidi L.; Savenye, Wilhelmina C.
A qualitative study was conducted to determine the perceptions of preservice teachers on how computers will be used in schools in the future. Undergraduate students (n=40) were given a 60-minute multimedia presentation on how computer and multimedia technologies are used in schools, followed by group discussions on the ways in which computers will…
Positron Computed Tomography: Current State, Clinical Results and Future Trends
DOE R&D Accomplishments Database
Schelbert, H. R.; Phelps, M. E.; Kuhl, D. E.
1980-09-01
An overview is presented of positron computed tomography: its advantages over single photon emission tomography, its use in metabolic studies of the heart and chemical investigation of the brain, and future trends. (ACR)
Middle School Girls' Envisioned Future in Computing
ERIC Educational Resources Information Center
Friend, Michelle
2015-01-01
Experience is necessary but not sufficient to cause girls to envision a future career in computing. This study investigated the experiences and attitudes of girls who had taken three years of mandatory computer science classes in an all-girls setting in middle school, measured at the end of eighth grade. The one third of participants who were open…
Perspectives on the Future of CFD
NASA Technical Reports Server (NTRS)
Kwak, Dochan
2000-01-01
This viewgraph presentation gives an overview of the future of computational fluid dynamics (CFD), which in the past has pioneered the field of flow simulation. Over time CFD has progressed along with computing power, and numerical methods have advanced as CPU speed and memory capacity have increased. Complex configurations are routinely computed now, and direct numerical simulations (DNS) and large eddy simulations (LES) are used to study turbulence. As computing resources shifted to parallel and distributed platforms, computer science aspects such as scalability (algorithmic and implementation), portability, and transparent coding have advanced. Examples of potential future (or current) challenges include risk assessment, limitations of the heuristic model, and the development of CFD and information technology (IT) tools.
Future trends in computer waste generation in India.
Dwivedy, Maheshwar; Mittal, R K
2010-11-01
The objective of this paper is to estimate the future projection of computer waste in India and to subsequently analyze its flow at the end of the useful phase. For this purpose, the study utilizes the logistic model-based approach proposed by Yang and Williams to forecast future trends in computer waste. The model estimates the future computer penetration rate using the first-lifespan distribution and historical sales data. A bounding analysis on the future carrying capacity was simulated using the three-parameter logistic curve. The obsolete generation quantities obtained from the extrapolated penetration rates are then used to model the disposal phase. The results of the bounding analysis indicate that in the year 2020, around 41-152 million units of computers will become obsolete. The obsolete computer generation quantities are then used to estimate the end-of-life outflows using a time-series multiple lifespan model. Even a conservative estimate indicates that the recycling capacity required for PCs will reach upwards of 30 million units in 2025; in the upper-bound case, more than 150 million units could potentially be recycled. However, considering significant future investment in the e-waste recycling sector from all stakeholders in India, we propose a logistic growth in the recycling rate and estimate the required recycling capacity at between 60 and 400 million units for the lower- and upper-bound cases in 2025. Finally, we compare the future obsolete PC generation amounts of the US and India. Copyright © 2010 Elsevier Ltd. All rights reserved.
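To illustrate the type of model described above (this is a minimal sketch, not the authors' actual code), the following Python example combines a three-parameter logistic penetration forecast with a lifespan-based obsolescence estimate in the spirit of the Yang-Williams approach; the carrying capacity, growth parameters, and the Weibull lifespan assumption are hypothetical placeholder values.

import numpy as np
from scipy.stats import weibull_min

# Hypothetical three-parameter logistic penetration curve P(t) = K / (1 + a*exp(-b*t)).
K, a, b = 400e6, 50.0, 0.25            # carrying capacity (units) and growth parameters (assumed)
years = np.arange(1995, 2031)
t = years - years[0]
installed_base = K / (1.0 + a * np.exp(-b * t))

# Annual additions to the installed base, used here as a simple proxy for sales
# (replacement purchases are ignored in this sketch).
sales = np.diff(installed_base, prepend=installed_base[0])

# First-lifespan distribution assumed Weibull (shape 2, scale 5 years).
lifespan = weibull_min(c=2.0, scale=5.0)
ages = np.arange(0, 21)
retire_prob = np.diff(lifespan.cdf(ages))   # probability a unit retires at each age

# Obsolete units per year = past sales convolved with the lifespan distribution.
obsolete = np.convolve(sales, retire_prob)[: len(years)]

for y, q in zip(years, obsolete):
    if y in (2020, 2025):
        print(f"{y}: ~{q / 1e6:.0f} million units become obsolete (illustrative only)")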
Aerodynamic optimization studies on advanced architecture computers
NASA Technical Reports Server (NTRS)
Chawla, Kalpana
1995-01-01
The approach to carrying out multi-discipline aerospace design studies in the future, especially in massively parallel computing environments, comprises choosing (1) suitable solvers to compute solutions to equations characterizing a discipline, and (2) efficient optimization methods. In addition, for aerodynamic optimization problems, (3) smart methodologies must be selected to modify the surface shape. In this research effort, a 'direct' optimization method is implemented on the Cray C-90 to improve aerodynamic design. It is coupled with an existing implicit Navier-Stokes solver, OVERFLOW, to compute flow solutions. The optimization method is chosen such that it can accommodate multi-discipline optimization in future computations. In this work, however, only single-discipline aerodynamic optimization is included.
Distributed computing environments for future space control systems
NASA Technical Reports Server (NTRS)
Viallefont, Pierre
1993-01-01
The aim of this paper is to present the results of a CNES research project on distributed computing systems. The purpose of this research was to study the impact of the use of new computer technologies in the design and development of future space applications. The first part of this study was a state-of-the-art review of distributed computing systems. One of the interesting ideas arising from this review is the concept of a 'virtual computer' allowing the distributed hardware architecture to be hidden from a software application. The 'virtual computer' can improve system performance by adapting the best architecture (addition of computers) to the software application without having to modify its source code. This concept can also decrease the cost and obsolescence of the hardware architecture. In order to verify the feasibility of the 'virtual computer' concept, a prototype representative of a distributed space application is being developed independently of the hardware architecture.
CFD Vision 2030 Study: A Path to Revolutionary Computational Aerosciences
NASA Technical Reports Server (NTRS)
Slotnick, Jeffrey; Khodadoust, Abdollah; Alonso, Juan; Darmofal, David; Gropp, William; Lurie, Elizabeth; Mavriplis, Dimitri
2014-01-01
This report documents the results of a study to address the long range, strategic planning required by NASA's Revolutionary Computational Aerosciences (RCA) program in the area of computational fluid dynamics (CFD), including future software and hardware requirements for High Performance Computing (HPC). Specifically, the "Vision 2030" CFD study is to provide a knowledge-based forecast of the future computational capabilities required for turbulent, transitional, and reacting flow simulations across a broad Mach number regime, and to lay the foundation for the development of a future framework and/or environment where physics-based, accurate predictions of complex turbulent flows, including flow separation, can be accomplished routinely and efficiently in cooperation with other physics-based simulations to enable multi-physics analysis and design. Specific technical requirements from the aerospace industrial and scientific communities were obtained to determine critical capability gaps, anticipated technical challenges, and impediments to achieving the target CFD capability in 2030. A preliminary development plan and roadmap were created to help focus investments in technology development to help achieve the CFD vision in 2030.
The influence of a game-making project on male and female learners' attitudes to computing
NASA Astrophysics Data System (ADS)
Robertson, Judy
2013-03-01
There is a pressing need for gender inclusive approaches to engage young people in computer science. A recent popular approach has been to harness learners' enthusiasm for computer games to motivate them to learn computer science concepts through game authoring. This article describes a study in which 992 learners across 13 schools took part in a game-making project. It provides evidence from 225 pre-test and post-test questionnaires on how learners' attitudes to computing changed during the project, as well as qualitative reflections from the class teachers on how the project affected their learners. Results indicate that girls did not enjoy the experience as much as boys, and that in fact, the project may make pupils less inclined to study computing in the future. This has important implications for future efforts to engage young people in computing.
NASA Technical Reports Server (NTRS)
Trinh, H. P.; Gross, K. W.
1989-01-01
Computational studies have been conducted to examine the capability of a CFD code by simulating the steady-state thrust chamber internal flow. The SSME served as the sample case, and significant parameter profiles are presented and discussed. Performance predictions from TDK, the recommended JANNAF reference computer program, are compared with those from PHOENICS to establish the credibility of its results. The investigation of an overexpanded nozzle flow is particularly addressed since it plays an important role in the area ratio selection of future rocket engines. Experience gained during this as-yet-incomplete flow separation study is summarized, and future steps are outlined.
Computer Simulation in the Social Sciences/Social Studies.
ERIC Educational Resources Information Center
Klassen, Daniel L.
Computers are beginning to be used more frequently as instructional tools in secondary school social studies. This is especially true of "new social studies" programs; i.e., programs which subordinate mere mastery of factual content to the recognition of and ability to deal with the social imperatives of the future. Computer-assisted…
ERIC Educational Resources Information Center
McCredie, John W., Ed.
Ten case studies that describe the planning process and strategies employed by colleges that use computing and communication systems are presented, based on a 1981-1982 study conducted by EDUCOM. An introduction by John W. McCredie summarizes several current and future effects of the rapid spread and integration of computing and communication…
Duct flow nonuniformities study for space shuttle main engine
NASA Technical Reports Server (NTRS)
Thoenes, J.
1985-01-01
To improve the Space Shuttle Main Engine (SSME) design and for future use in the development of future-generation rocket engines, a combined experimental/analytical study was undertaken with the goals of, first, establishing an experimental data base for the flow conditions in the SSME high pressure fuel turbopump (HPFTP) hot gas manifold (HGM) and, second, setting up a computer model of the SSME HGM flow field. By using the test data to verify the computer model, it should be possible in the future to computationally scan contemplated advanced design configurations and limit costly testing to the most promising designs. The effort of establishing and using the computer model is detailed. The comparison of computational results and experimental data clearly demonstrates that computational fluid dynamics (CFD) techniques can be used successfully to predict the gross features of three-dimensional fluid flow through configurations as intricate as the SSME turbopump hot gas manifold.
Is There Such a Thing as Gender and Ethnicity of Computing?
ERIC Educational Resources Information Center
Turner, Eva
2000-01-01
Discussion of the absence of women and minority groups in computer science and information technology focuses on a study conducted at Middlesex University (England) that investigated how gender and ethnicity connected to computing are perceived by computing science students and how this may influence their decision as future computer scientists…
Parent's Guide to Computers in Education.
ERIC Educational Resources Information Center
Moursund, David
Addressed to the parents of children taking computer courses in school, this booklet outlines the rationales for computer use in schools and explains for a lay audience the features and functions of computers. A look at the school of the future shows computers aiding the study of reading, writing, arithmetic, geography, and history. The features…
Large scale systems : a study of computer organizations for air traffic control applications.
DOT National Transportation Integrated Search
1971-06-01
Based on current sizing estimates and tracking algorithms, some computer organizations applicable to future air traffic control computing systems are described and assessed. Hardware and software problem areas are defined and solutions are outlined.
Simulated Sustainable Societies: Students' Reflections on Creating Future Cities in Computer Games
ERIC Educational Resources Information Center
Nilsson, Elisabet M.; Jakobsson, Anders
2011-01-01
The empirical study, in this article, involved 42 students (ages 14-15), who used the urban simulation computer game SimCity 4 to create models of sustainable future cities. The aim was to explore in what ways the simulated "real" worlds provided by this game could be a potential facilitator for science learning contexts. The topic investigated is…
Education: AIChE Probes Impact of Computer on Future Engineering Education.
ERIC Educational Resources Information Center
Krieger, James
1983-01-01
Evaluates influence of computer assisted instruction on engineering education, considering use of computers to remove burden of doing calculations and to provide interactive self-study programs of a tutorial/remedial nature. Cites universities requiring personal computer purchase, pointing out possibility for individualized design assignments.…
What Do Computer Science Students Think about Software Piracy?
ERIC Educational Resources Information Center
Konstantakis, Nikos I.; Palaigeorgiou, George E.; Siozos, Panos D.; Tsoukalas, Ioannis A.
2010-01-01
Today, software piracy is an issue of global importance. Computer science students are the future information and communication technologies professionals and it is important to study the way they approach this issue. In this article, we attempt to study attitudes, behaviours and the corresponding reasoning of computer science students in Greece…
CT myocardial perfusion imaging: current status and future perspectives.
Yang, Dong Hyun; Kim, Young-Hak
2017-07-01
Computed tomography myocardial perfusion (CTP) combined with coronary computed tomography angiography (CCTA) may constitute a "1-stop shop" for the noninvasive diagnosis of hemodynamically significant coronary stenosis during a single CT examination. CTP shows high diagnostic performance and provides incremental value over CCTA for the detection of hemodynamically significant coronary stenosis in patients with a high Agatston calcium score or coronary artery stents. Future studies should determine the optimal protocol and clinical value of CTP for guiding revascularization strategy and prognostication. In this article, we review the current status and future perspectives of CTP, focusing on technical considerations, clinical applications, and future research topics.
Perspectives on Emerging/Novel Computing Paradigms and Future Aerospace Workforce Environments
NASA Technical Reports Server (NTRS)
Noor, Ahmed K.
2003-01-01
The accelerating pace of computing technology development shows no signs of abating. Computing power of 100 Tflop/s is likely to be reached by 2004, and Pflop/s (10^15 Flop/s) by 2007. The fundamental physical limits of computation, including information storage limits, communication limits, and computation rate limits, will likely be reached by the middle of the present millennium. To overcome these limits, novel technologies and new computing paradigms will be developed. An attempt is made in this overview to put the diverse activities related to new computing paradigms in perspective and to set the stage for the succeeding presentations. The presentation is divided into five parts. In the first part, a brief historical account is given of the development of computer and networking technologies. The second part provides brief overviews of the three emerging computing paradigms: grid, ubiquitous, and autonomic computing. The third part lists future computing alternatives and the characteristics of the future computing environment. The fourth part describes future aerospace workforce research, learning, and design environments. The fifth part lists the objectives of the workshop and some of the sources of information on future computing paradigms.
Case Study: Audio-Guided Learning, with Computer Graphics.
ERIC Educational Resources Information Center
Koumi, Jack; Daniels, Judith
1994-01-01
Describes teaching packages which involve the use of audiotape recordings with personal computers in Open University (United Kingdom) mathematics courses. Topics addressed include software development; computer graphics; pedagogic principles for distance education; feedback, including course evaluations and student surveys; and future plans.…
Improving Computer Literacy of Business Management Majors: A Case Study
ERIC Educational Resources Information Center
Johnson, David W.; Bartholomew, Kimberly W.; Miller, Duane
2006-01-01
Stakeholders, such as future employers, parents, and educators, have raised their expectations of college graduates in the area of computer literacy. Computer skills and understanding are especially critical for business management graduates, who are expected to use computer technology as a tool in every aspect of their career. Business students…
Technology Acceptance Predictors among Student Teachers and Experienced Classroom Teachers
ERIC Educational Resources Information Center
Smarkola, Claudia
2007-01-01
This study investigated 160 student teachers' and 158 experienced teachers' self-reported computer usage and their future intentions to use computer applications for school assignments. The Technology Acceptance Model (TAM) was used as the framework to determine computer usage and intentions. Statistically significant results showed that after…
ERIC Educational Resources Information Center
Sargent, John
The Office of Technology Policy analyzed Bureau of Labor Statistics' growth projections for the core occupational classifications of IT (information technology) workers to assess future demand in the United States. Classifications studied were computer engineers, systems analysts, computer programmers, database administrators, computer support…
Microcomputers and the Future.
ERIC Educational Resources Information Center
Uhlig, George E.
Dangers are inherent in predicting the future. In discussing the future of computers, specifically, it is useful to consider the brief history of computers from the development of ENIAC to microcomputers. Advances in computer technology can be seen by looking at changes in individual components, including internal and external memory, the…
Computing Literacy in the University of the Future.
ERIC Educational Resources Information Center
Gantt, Vernon W.
In exploring the impact of microcomputers and the future of the university in 1985 and beyond, a distinction should be made between computing literacy--the ability to use a computer--and computer literacy, which goes beyond successful computer use to include knowing how to program in various computer languages and understanding what goes on…
Computational aerodynamics development and outlook /Dryden Lecture in Research for 1979/
NASA Technical Reports Server (NTRS)
Chapman, D. R.
1979-01-01
Some past developments and current examples of computational aerodynamics are briefly reviewed. An assessment is made of the requirements on future computer memory and speed imposed by advanced numerical simulations, giving emphasis to the Reynolds averaged Navier-Stokes equations and to turbulent eddy simulations. Experimental scales of turbulence structure are used to determine the mesh spacings required to adequately resolve turbulent energy and shear. Assessment also is made of the changing market environment for developing future large computers, and of the projections of micro-electronics memory and logic technology that affect future computer capability. From the two assessments, estimates are formed of the future time scale in which various advanced types of aerodynamic flow simulations could become feasible. Areas of research judged especially relevant to future developments are noted.
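As a rough, hypothetical illustration of the kind of resource estimate the lecture is concerned with (this is not Chapman's actual analysis), the short Python sketch below applies the commonly cited scaling that fully resolving turbulence requires on the order of Re^(9/4) grid points, and converts that into a memory estimate under an assumed number of variables stored per point.

# Back-of-envelope estimate only; the Re^(9/4) scaling and 10 variables per grid point are assumptions.
def dns_grid_points(reynolds: float) -> float:
    """Approximate number of grid points needed to resolve all turbulent scales."""
    return reynolds ** 2.25

def memory_bytes(points: float, vars_per_point: int = 10) -> float:
    """Memory needed at 8 bytes per double-precision variable."""
    return points * vars_per_point * 8.0

for re in (1e4, 1e6, 1e8):
    n = dns_grid_points(re)
    print(f"Re = {re:.0e}: ~{n:.1e} grid points, ~{memory_bytes(n) / 1e12:.2g} TB of memory")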
[Results of the marketing research study "Acceptance of physician's office computer systems"].
Steinhausen, D; Brinkmann, F; Engelhard, A
1998-01-01
We report on a market research study on the acceptance of computer systems in physicians' offices. 11,000 returned questionnaires from surgeons--users and nonusers--were analysed. We found that most of the surgeons used their computers in a limited way, i.e. as a device for accounting. Concerning the level of utilisation, there are differences between men and women, West and East, and young and old. In this study we also analysed the computer-use behaviour of gynaecologic surgeons. As a result, two thirds of all nonusers do not intend to utilise a computer in the future.
Computers and the orthopaedic office.
Berumen, Edmundo; Barllow, Fidel Dobarganes; Fong, Fransisco Javier; Lopez, Jorge Arturo
2002-01-01
The advance of today's medicine can be linked very closely to the history of computers over the last twenty years. In the beginning, the first computers were built to help us with mathematical calculations. This has changed recently, and computers are now linked to x-ray machines, CT scanners, and MRIs. Being able to share information is one of the goals of the future. Today's computer technology has helped a great deal to allow orthopaedic surgeons from around the world to consult on a difficult case or to become part of a large database. Obtaining the results from a method of treatment using a multicentric information study can be done on a regular basis. In the future, computers will help us to retrieve information from patients' clinical history directly from a hospital database or by portable memory cards that will carry every radiograph or video from previous surgeries.
Future petroleum geologist: discussion
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davis, G.D.
1987-07-01
Robert R. Berg's (1986) article, "The Future Petroleum Geologist," summarizes the findings of the 13-member AAPG Select Committee on The Future Petroleum Geologist appointed by President William L. Fisher in July 1985. While this undertaking is laudable, particularly considering present circumstances in the petroleum industry, the committee has apparently overlooked a vital aspect concerning the future knowledge requirements of the petroleum geologist. Specifically, the Select Committee makes no mention of the need for computer literacy in its list of educational training categories. Obviously, AAPG is well aware of both the interest in computers by its membership and the increasing need for training and familiarity in this discipline. The Select Committee on The Future Petroleum Geologist, while undertaking a difficult and potentially controversial task, has omitted an important aspect of the background requirements for generations of future petroleum geologists; the committee should consider an amendment to their recommendations to reflect this increasingly important field of study.
Optimum spaceborne computer system design by simulation
NASA Technical Reports Server (NTRS)
Williams, T.; Kerner, H.; Weatherbee, J. E.; Taylor, D. S.; Hodges, B.
1973-01-01
A deterministic simulator is described which models the Automatically Reconfigurable Modular Multiprocessor System (ARMMS), a candidate computer system for future manned and unmanned space missions. Its use as a tool to study and determine the minimum computer system configuration necessary to satisfy the on-board computational requirements of a typical mission is presented. The paper describes how the computer system configuration is determined in order to satisfy the data processing demand of the various shuttle booster subsystems. The configuration which is developed as a result of studies with the simulator is optimal with respect to the efficient use of computer system resources.
Girls in computer science: A female only introduction class in high school
NASA Astrophysics Data System (ADS)
Drobnis, Ann W.
This study examined the impact of an all girls' classroom environment in a high school introductory computer science class on the student's attitudes towards computer science and their thoughts on future involvement with computer science. It was determined that an all girls' introductory class could impact the declining female enrollment and female students' efficacy towards computer science. This research was conducted in a summer school program through a regional magnet school for science and technology which these students attend during the school year. Three different groupings of students were examined for the research: female students in an all girls' class, female students in mixed-gender classes and male students in mixed-gender classes. A survey, Attitudes about Computers and Computer Science (ACCS), was designed to obtain an understanding of the students' thoughts, preconceptions, attitude, knowledge of computer science, and future intentions around computer science, both in education and career. Students in all three groups were administered the ACCS prior to taking the class and upon completion of the class. In addition, students in the all girls' class wrote in a journal throughout the course, and some of those students were also interviewed upon completion of the course. The data was analyzed using quantitative and qualitative techniques. While there were no major differences found in the quantitative data, it was determined that girls in the all girls' class were truly excited by what they had learned and were more open to the idea of computer science being a part of their future.
A National Survey of the Public's Attitudes Toward Computers.
ERIC Educational Resources Information Center
American Federation of Information Processing Societies, Montvale, NJ.
The general public's attitudes towards continually expanding computer usage are frequently speculated about but far from understood. This study is aimed at providing objective data on the public's attitudes towards computers, their uses, their perceived impact on the American economy as well as on the individual, and their future uses. The…
Payload/orbiter contamination control requirement study: Computer interface
NASA Technical Reports Server (NTRS)
Bareiss, L. E.; Hooper, V. W.; Ress, E. B.
1976-01-01
The MSFC computer facilities and future plans for them are described, with attention to the characteristics of the various computers and their availability and suitability for processing the contamination program. A listing of the CDC 6000 series and UNIVAC 1108 characteristics is presented so that programming requirements can be compared directly and differences noted.
Computer-Based National Information Systems. Technology and Public Policy Issues.
ERIC Educational Resources Information Center
Congress of the U.S., Washington, DC. Office of Technology Assessment.
A general introduction to computer based national information systems, and the context and basis for future studies are provided in this report. Chapter One, the introduction, summarizes computers and information systems and their relation to society, the structure of information policy issues, and public policy issues. Chapter Two describes the…
ALOHA System Technical Reports 16, 19, 24, 28, and 30, 1974.
ERIC Educational Resources Information Center
Hawaii Univ., Honolulu. ALOHA System.
A series of technical reports based on the Aloha System for educational computer programs provide a background on how various countries in the Pacific region developed computer capabilities and describe their current operations, as well as prospects for future expansion. Included are studies on the Japan-Hawaii TELEX and Satellite; computers at…
Teaching Hackers: School Computing Culture and the Future of Cyber-Rights.
ERIC Educational Resources Information Center
Van Buren, Cassandra
2001-01-01
Discussion of the need for ethical computing strategies and policies at the K-12 level to acculturate computer hackers away from malicious network hacking focuses on a three-year participant observation ethnographic study conducted at the New Technology High School (California) that examined the school's attempts to socialize its hackers to act…
Computational Fluid Dynamics: Past, Present, And Future
NASA Technical Reports Server (NTRS)
Kutler, Paul
1988-01-01
Paper reviews development of computational fluid dynamics and explores future prospects of the technology. Report covers such topics as computer technology, turbulence, development of solution methodology, development of algorithms, definition of flow geometries, generation of computational grids, and pre- and post-data processing.
Code of Federal Regulations, 2010 CFR
2010-04-01
... financial reporting and monthly computation by futures commission merchants and introducing brokers. 1.18... UNDER THE COMMODITY EXCHANGE ACT Minimum Financial and Related Reporting Requirements § 1.18 Records for and relating to financial reporting and monthly computation by futures commission merchants and...
Code of Federal Regulations, 2011 CFR
2011-04-01
... financial reporting and monthly computation by futures commission merchants and introducing brokers. 1.18... UNDER THE COMMODITY EXCHANGE ACT Minimum Financial and Related Reporting Requirements § 1.18 Records for and relating to financial reporting and monthly computation by futures commission merchants and...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brun, B.
1997-07-01
Computer technology has improved tremendously during the last years, with larger media capacity, more memory, and more computational power. Visual computing, with high-performance graphics interfaces and desktop computational power, has changed the way engineers accomplish everyday tasks, development work, and safety study analyses. The emergence of parallel computing will permit simulation over a larger domain. In addition, new development methods, languages, and tools have appeared in the last several years.
NASA Technical Reports Server (NTRS)
Noor, Ahmed K. (Compiler)
2003-01-01
The document contains the proceedings of the training workshop on Emerging and Future Computing Paradigms and their impact on the Research, Training and Design Environments of the Aerospace Workforce. The workshop was held at NASA Langley Research Center, Hampton, Virginia, March 18 and 19, 2003. The workshop was jointly sponsored by Old Dominion University and NASA. Workshop attendees came from NASA, other government agencies, industry and universities. The objectives of the workshop were to a) provide broad overviews of the diverse activities related to new computing paradigms, including grid computing, pervasive computing, high-productivity computing, and the IBM-led autonomic computing; and b) identify future directions for research that have high potential for future aerospace workforce environments. The format of the workshop included twenty-one, half-hour overview-type presentations and three exhibits by vendors.
Method and computer program product for maintenance and modernization backlogging
Mattimore, Bernard G; Reynolds, Paul E; Farrell, Jill M
2013-02-19
According to one embodiment, a computer program product for determining future facility conditions includes a computer readable medium having computer readable program code stored therein. The computer readable program code includes computer readable program code for calculating a time period specific maintenance cost, for calculating a time period specific modernization factor, and for calculating a time period specific backlog factor. Future facility conditions equal the time period specific maintenance cost plus the time period specific modernization factor plus the time period specific backlog factor. In another embodiment, a computer-implemented method for calculating future facility conditions includes calculating a time period specific maintenance cost, calculating a time period specific modernization factor, and calculating a time period specific backlog factor. Future facility conditions equal the time period specific maintenance cost plus the time period specific modernization factor plus the time period specific backlog factor. Other embodiments are also presented.
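The additive relation stated in the abstract can be written down directly; the Python sketch below is a minimal illustration with hypothetical input values, not the patented implementation.

from dataclasses import dataclass

@dataclass
class PeriodInputs:
    maintenance_cost: float      # time-period-specific maintenance cost
    modernization_factor: float  # time-period-specific modernization factor
    backlog_factor: float        # time-period-specific backlog factor

def future_facility_condition(p: PeriodInputs) -> float:
    # Future facility conditions = maintenance cost + modernization factor + backlog factor.
    return p.maintenance_cost + p.modernization_factor + p.backlog_factor

# Hypothetical example values for one planning period.
print(future_facility_condition(PeriodInputs(1.2e6, 0.4e6, 0.9e6)))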
Institutional computing (IC) information session
DOE Office of Scientific and Technical Information (OSTI.GOV)
Koch, Kenneth R; Lally, Bryan R
2011-01-19
The LANL Institutional Computing Program (IC) will host an information session about the current state of unclassified Institutional Computing at Los Alamos, exciting plans for the future, and the current call for proposals for science and engineering projects requiring computing. Program representatives will give short presentations and field questions about the call for proposals and future planned machines, and discuss technical support available to existing and future projects. Los Alamos has started making a serious institutional investment in open computing available to our science projects, and that investment is expected to increase even more.
Center for Advanced Computational Technology
NASA Technical Reports Server (NTRS)
Noor, Ahmed K.
2000-01-01
The Center for Advanced Computational Technology (ACT) was established to serve as a focal point for diverse research activities pertaining to application of advanced computational technology to future aerospace systems. These activities include the use of numerical simulations, artificial intelligence methods, multimedia and synthetic environments, and computational intelligence, in the modeling, analysis, sensitivity studies, optimization, design and operation of future aerospace systems. The Center is located at NASA Langley and is an integral part of the School of Engineering and Applied Science of the University of Virginia. The Center has four specific objectives: 1) conduct innovative research on applications of advanced computational technology to aerospace systems; 2) act as pathfinder by demonstrating to the research community what can be done (high-potential, high-risk research); 3) help in identifying future directions of research in support of the aeronautical and space missions of the twenty-first century; and 4) help in the rapid transfer of research results to industry and in broadening awareness among researchers and engineers of the state-of-the-art in applications of advanced computational technology to the analysis, design prototyping and operations of aerospace and other high-performance engineering systems. In addition to research, Center activities include helping in the planning and coordination of the activities of a multi-center team of NASA and JPL researchers who are developing an intelligent synthesis environment for future aerospace systems; organizing workshops and national symposia; as well as writing state-of-the-art monographs and NASA special publications on timely topics.
Modeling of Photoionized Plasmas
NASA Technical Reports Server (NTRS)
Kallman, Timothy R.
2010-01-01
In this paper I review the motivation and current status of modeling of plasmas exposed to strong radiation fields, as it applies to the study of cosmic X-ray sources. This includes some of the astrophysical issues which can be addressed, the ingredients for the models, the current computational tools, the limitations imposed by currently available atomic data, and the validity of some of the standard assumptions. I will also discuss ideas for the future: challenges associated with future missions, opportunities presented by improved computers, and goals for atomic data collection.
Diller, David J; Swanson, Jon; Bayden, Alexander S; Jarosinski, Mark; Audie, Joseph
2015-01-01
Peptides provide promising templates for developing drugs to occupy a middle space between small molecules and antibodies and for targeting 'undruggable' intracellular protein-protein interactions. Importantly, rational or in cerebro design, especially when coupled with validated in silico tools, can be used to efficiently explore chemical space and identify islands of 'drug-like' peptides to satisfy diverse drug discovery program objectives. Here, we consider the underlying principles of and recent advances in rational, computer-enabled peptide drug design. In particular, we consider the impact of basic physicochemical properties, potency and ADME/Tox opportunities and challenges, and recently developed computational tools for enabling rational peptide drug design. Key principles and practices are spotlighted by recent case studies. We close with a hypothetical future case study.
Palamar, Borys I; Vaskivska, Halyna O; Palamar, Svitlana P
In this article the authors address the significance of computer equipment for organizing cooperation between professors and future specialists. Such subject-subject interaction may be directed toward forming the professional skills of future specialists. By using information and communication technologies in the education system, a range of didactic tasks can be solved: improving the teaching of subjects in higher education, supporting the self-learning of future specialists, motivating learning and self-learning, and developing reflection in the learning process. The authors consider computer equipment an instrument for developing the intellectual skills, potential, and willingness of future specialists to solve communicative and communication tasks and problems on a creative basis. Based on the results of their research, the authors draw conclusions about the effectiveness of using computer technologies in teaching future specialists and supporting their self-learning. Inadequate provision of computer equipment in higher education institutions, a lack of appropriate educational programs, and professors' poor knowledge and use of computers have a negative impact on the organization of teaching in higher education. Computer equipment and ICT in general are instruments for developing the intellectual skills, potential, and willingness of future specialists to solve communicative and communication tasks and problems. The formation of a psychosocial environment for the development of future specialists is thus a multifaceted, complex, and didactically important issue.
The Past, Present, and Future of Computational Models of Cognitive Development
ERIC Educational Resources Information Center
Schlesinger, Matthew; McMurray, Bob
2012-01-01
Does modeling matter? We address this question by providing a broad survey of the computational models of cognitive development that have been proposed and studied over the last three decades. We begin by noting the advantages and limitations of computational models. We then describe four key dimensions across which models of development can be…
Mobile Cloud Learning for Higher Education: A Case Study of Moodle in the Cloud
ERIC Educational Resources Information Center
Wang, Minjuan; Chen, Yong; Khan, Muhammad Jahanzaib
2014-01-01
Mobile cloud learning, a combination of mobile learning and cloud computing, is a relatively new concept that holds considerable promise for future development and delivery in the education sectors. Cloud computing helps mobile learning overcome obstacles related to mobile computing. The main focus of this paper is to explore how cloud computing…
Biomolecular dynamics by computer analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eilbeck, J.C.; Lomdahl, P.S.; Scott, A.C.
1984-01-01
As numerical tools (computers and display equipment) become more powerful and the atomic structures of important biological molecules become known, the importance of detailed computation of nonequilibrium biomolecular dynamics increases. In this manuscript we report results from a well developed study of the hydrogen bonded polypeptide crystal acetanilide, a model protein. Directions for future research are suggested. 9 references, 6 figures.
Future of Department of Defense Cloud Computing Amid Cultural Confusion
2013-03-01
enterprise cloud-computing environment and transition to a public cloud service provider. Services have started the development of individual cloud-computing environments...endorsing cloud computing. It addresses related issues in matters of service culture changes and how strategic leaders will dictate the future of cloud ...through data center consolidation and individual Service-provided cloud computing.
NASA Technical Reports Server (NTRS)
1972-01-01
The design is reported of an advanced modular computer system designated the Automatically Reconfigurable Modular Multiprocessor System, which anticipates requirements for higher computing capacity and reliability for future spaceborne computers. Subjects discussed include: an overview of the architecture, mission analysis, synchronous and nonsynchronous scheduling control, reliability, and data transmission.
Hardware Considerations for Computer Based Education in the 1980's.
ERIC Educational Resources Information Center
Hirschbuhl, John J.
1980-01-01
In the future, computers will be needed to sift through the vast proliferation of available information. Among new developments in computer technology are videodiscs, microcomputers, and holography. Predictions for future developments include laser libraries for the visually handicapped and Computer Assisted Dialogue. (JN)
A Study of Cooperative, Networking, and Computer Activities in Southwestern Libraries.
ERIC Educational Resources Information Center
Corbin, John
The Southwestern Library Association (SWLA) conducted an inventory and study of the SWLA libraries in cooperative, network, and computer activities to collect data for use in planning future activities and in minimizing duplication of efforts. Questionnaires were mailed to 2,060 academic, public, and special libraries in the six SWLA states.…
Student Teachers' Perceptions on Educational Technologies' Past, Present and Future
ERIC Educational Resources Information Center
Orhan Goksun, Derya; Filiz, Ozan; Kurt, Adile Askim
2018-01-01
The aim of this study is to reveal the perceptions of Computer Education and Instructional Technologies student teachers, who are in a distance teacher education program, on the past, present, and future of educational technologies via infographics. In this study, 54 infographics, which were created by student teachers who were enrolled in Special Teaching…
A study of Mariner 10 flight experiences and some flight piece part failure rate computations
NASA Technical Reports Server (NTRS)
Paul, F. A.
1976-01-01
The problems and failures encountered in the Mariner 10 flight are discussed, and the data available through a quantitative accounting of all electronic piece parts on the spacecraft are summarized. Computed failure rates for electronic piece parts are also shown. It is intended that these computed data be used in the continued updating of the failure rate base used for trade-off studies and predictions for future JPL space missions.
Mobility, Emotion, and Universality in Future Collaboration
NASA Astrophysics Data System (ADS)
Chignell, Mark; Hosono, Naotsune; Fels, Deborah; Lottridge, Danielle; Waterworth, John
The graphical user interface has traditionally supported personal productivity, efficiency, and usability. With computer-supported cooperative work, the focus has been on typical people doing typical work in a highly rational model of interaction. Recent trends towards mobility, and emotional and universal design, are extending the user interface paradigm beyond the routine. As computing moves into the hand and away from the desktop, there is a greater need for dealing with emotions and distractions. Busy and distracted people represent a new kind of disability, but one that will be increasingly prevalent. In this panel we examine the current state of the art and prospects for future collaboration under non-normative computing requirements. This panel draws together researchers who are studying the problems of mobility, emotion, and universality. The goal of the panel is to discuss how progress in these areas will change the nature of future collaboration.
Microcomputers and the future of epidemiology.
Dean, A G
1994-01-01
The Workshop on Microcomputers and the Future of Epidemiology was held March 8-9, 1993, at the Turner Conference Center, Atlanta, GA, with 130 public health professionals participating. The purpose of the workshop was to define microcomputer needs in epidemiology and to propose future initiatives. Thirteen groups representing public health disciplines defined their needs for better and more useful data, development of computer technology appropriate to epidemiology, user support and human infrastructure development, and global communication and planning. Initiatives proposed were demonstration of health surveillance systems, new software and hardware, computer-based training, projects to establish or improve data bases and community access to data bases, improved international communication, conferences on microcomputer use in particular disciplines, a suggestion to encourage competition in the production of public-domain software, and long-range global planning for epidemiologic computing and data management. Other interested groups are urged to study, modify, and implement those ideas. PMID:7910692
Schinke, Steven P.; Fang, Lin; Cole, Kristin C. A.
2010-01-01
This 2008 study involved 546 Black- and Hispanic-American adolescent girls and their mothers from New York, New Jersey, and Connecticut. Participants provided self-report data. Analysis of covariance indicated that the experimental intervention reduced risk factors, improved protective factors, and lowered girls' alcohol use and their future intentions to use substances. The study supports the value of computer-based and gender-specific interventions that involve girls and mothers. Future work needs to replicate and strengthen study results. Research support came from the National Institute on Drug Abuse within the National Institutes of Health of the United States Public Health Service. PMID:21190404
Dynamic interactions in neural networks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arbib, M.A.; Amari, S.
The study of neural networks is enjoying a great renaissance, both in computational neuroscience, the development of information processing models of living brains, and in neural computing, the use of neurally inspired concepts in the construction of intelligent machines. This volume presents models and data on the dynamic interactions occurring in the brain, and exhibits the dynamic interactions between research in computational neuroscience and in neural computing. The authors present current research, future trends and open problems.
NASA Astrophysics Data System (ADS)
Clementi, Enrico
2012-06-01
This is the introductory chapter to the AIP Proceedings volume "Theory and Applications of Computational Chemistry: The First Decade of the Second Millennium," where we discuss the evolution of "computational chemistry." Very early variational computational chemistry developments are reported in Sections 1 to 7, 11, and 12, recalling some of the computational chemistry contributions by the author and his collaborators (from the late 1950s to the mid-1990s); perturbation techniques are not considered in this already extended work. Present-day computational chemistry is partly considered in Sections 8 to 10, where more recent studies by the author and his collaborators are discussed, including the Hartree-Fock-Heitler-London method; a more general discussion of present-day computational chemistry is presented in Section 14. The following chapters of this AIP volume provide a view of modern computational chemistry. Future computational chemistry developments can be extrapolated from the chapters of this AIP volume; further, Sections 13 and 15 present an overall analysis of computational chemistry, obtained from the Global Simulation approach, by considering the evolution of scientific knowledge confronted with the opportunities offered by modern computers.
Society for College Science Teachers: High Technology.
ERIC Educational Resources Information Center
Menefee, Robert
1983-01-01
Presents findings of a study group on high technology charged with determining a definition, assessing current educational response, and examining implications for the future. Topics addressed include: super-techs; computer-aided design/computer-aided manufacture (CAD/CAM); structural unemployment; a two-plus-two curriculum; and educational…
ERIC Educational Resources Information Center
Yu, Wei-Chieh Wayne
2013-01-01
This study was designed to examine the pre- and in-service nursing professionals' perceptions of using computers to facilitate foreign language learning as consideration for future English for nursing purposes instruction. One hundred and ninety seven Taiwanese nursing students participated in the study. Findings revealed that (1) the participants…
Studies of Human Memory and Language Processing.
ERIC Educational Resources Information Center
Collins, Allan M.
The purposes of this study were to determine the nature of human semantic memory and to obtain knowledge usable in the future development of computer systems that can converse with people. The work was based on a computer model which is designed to comprehend English text, relating the text to information stored in a semantic data base that is…
An overview of the information management component of RICIS
NASA Technical Reports Server (NTRS)
Bishop, Peter C.
1987-01-01
Information management is the RICIS (Research Institute for Computing and Information Systems) research area which covers four types of tasks initiated during the first year of research: (1) surveys - a description of the existing state of some area in computing and information systems; (2) forecasts - a description of the alternative future states of some area; (3) plans - an approach to accomplishing some objective in the future; and (4) demonstrations - working prototypes and field trials to study the feasibility and the benefits of a particular information system. The activity in these research areas is described.
Training the Future - Swamp Work Activities
2017-07-19
In the Swamp Works laboratory at NASA's Kennedy Space Center in Florida, student interns, from the left, Jeremiah House, Thomas Muller and Austin Langdon are joining agency scientists, contributing in the area of Exploration Research and Technology. House is studying computer/electrical engineering at John Brown University in Siloam Springs, Arkansas. Muller is pursuing a degree in computer engineering and control systems at Florida Tech. Langdon is an electrical engineering major at the University of Kentucky. The agency attracts its future workforce through the NASA Internships, Fellowships and Scholarships, or NIFS, Program.
Computers for real time flight simulation: A market survey
NASA Technical Reports Server (NTRS)
Bekey, G. A.; Karplus, W. J.
1977-01-01
An extensive computer market survey was made to determine those available systems suitable for current and future flight simulation studies at Ames Research Center. The primary requirement is for the computation of relatively high frequency content (5 Hz) math models representing powered lift flight vehicles. The Rotor Systems Research Aircraft (RSRA) was used as a benchmark vehicle for computation comparison studies. The general nature of helicopter simulations and a description of the benchmark model are presented, and some of the sources of simulation difficulties are examined. A description of various applicable computer architectures is presented, along with detailed discussions of leading candidate systems and comparisons between them.
Buse, Kathleen; Hill, Catherine; Benson, Kathleen
2017-01-01
While there is an extensive body of research on gender equity in engineering and computing, there have been few efforts to glean insight from a dialog among experts. To encourage collaboration and to develop a shared vision of the future research agenda, a 2 day workshop of 50 scholars who work on the topic of gender in engineering and computing was held at a rural conference center. The structure of the conference and the location allowed for time to reflect, dialog, and to craft an innovative research agenda aimed at increasing the representation of women in engineering and computing. This paper has been written by the conference organizers and details the ideas and recommendations from the scholars. The result is an innovative, collaborative approach to future research that focuses on identifying effective interventions. The new approach includes the creation of partnerships with stakeholders including businesses, government agencies, non-profits and academic institutions to allow a broader voice in setting research priorities. Researchers recommend incorporating multiple disciplines and methodologies, while expanding the use of data analytics, merging and mining existing databases and creating new datasets. The future research agenda is detailed and includes studies focused on socio-cultural interventions particularly on career choice, within undergraduate and graduate programs, and for women in professional careers. The outcome is a vision for future research that can be shared with researchers, practitioners and other stakeholders that will lead to gender equity in the engineering and computing professions. PMID:28469591
Robotics, Artificial Intelligence, Computer Simulation: Future Applications in Special Education.
ERIC Educational Resources Information Center
Moore, Gwendolyn B.; And Others
1986-01-01
Describes possible applications of new technologies to special education. Discusses results of a study designed to explore the use of robotics, artificial intelligence, and computer simulations to aid people with handicapping conditions. Presents several scenarios in which specific technological advances may contribute to special education…
Laptop Lessons: Exploring the Promise of One-to-One Computing.
ERIC Educational Resources Information Center
Carter, Kim
2001-01-01
Describes benefits of programs where schools provide laptop computers for each student. Topics include results of studies that show positive learning outcomes; funding options; implementation; protecting the equipment; resources for learning about laptop programs; staff training and support; and future possibilities, including the implications of…
SSMA Science Reviewers' Forecasts for the Future of Science Education.
ERIC Educational Resources Information Center
Jinks, Jerry; Hoffer, Terry
1989-01-01
Described is a study which was conducted as an exploratory assessment of science reviewers' perceptions for the future of science education. Arrives at interpretations for identified categories of computers and high technology, science curriculum, teacher education, training, certification, standards, teaching methods, and materials. (RT)
ERIC Educational Resources Information Center
Teo, Timothy; Luan, Wong Su; Sing, Chai Ching
2008-01-01
As computers become more ubiquitous in our everyday lives, educational settings are being transformed into places where educators and students are expected to teach and learn using computers (Lee, 2003). This study, therefore, explored pre-service teachers' self-reported future intentions to use computers in Singapore and Malaysia. A survey methodology was…
ERIC Educational Resources Information Center
Yelland, Nicola
2005-01-01
This article considers the research literature of the past decade pertaining to the use of computers in early childhood education. It notes that there have been considerable changes in all aspects of our lives over this time period and considers studies in which information and communication technologies (ICT), and in particular computers, have…
ERIC Educational Resources Information Center
Beyer, Sylvia
2014-01-01
This study addresses why women are underrepresented in Computer Science (CS). Data from 1319 American first-year college students (872 female and 447 male) indicate that gender differences in computer self-efficacy, stereotypes, interests, values, interpersonal orientation, and personality exist. If students had had a positive experience in their…
Computer Support Systems for Estimating Chemical Toxicity: Present Capabilities and Future Trends
A wide variety of computer-based artificial intelligence (AI) and decision support systems exist currently to aid in the assessment of toxicity for environmental chemicals. T...
Technological Innovation and Change: A Case Study in the Formation of Organizational Conscience.
ERIC Educational Resources Information Center
McMillan, Jill J.; Hyde, Michael J.
2000-01-01
Discusses how Wake Forest University's adoption of campus-wide computer technology exhibited critical elements of conscience formation. Details how the computer revolution challenged the customary morality of the university; describes how the community engaged in moral deliberation about its technological future; and discusses how the…
The Possibilities of Transformation: Critical Research and Peter McLaren
ERIC Educational Resources Information Center
Porfilio, Brad J.
2006-01-01
The purpose of this paper is to unveil how Peter McLaren's revolutionary brand of pedagogy, multiculturalism, and research colored my two-year qualitative research study, which unearthed twenty White female future teachers' experiences and perceptions in relationship to computing technology and male-centered computing culture. His ideas positioned…
ERIC Educational Resources Information Center
Martin-McCormick, Lynda; And Others
An advocacy packet on educational equity in computer education consists of five separate materials. A booklet entitled "Today's Guide to the Schools of the Future" contains four sections. The first section, a computer equity assessment guide, includes interview questions about school policies and allocation of resources, student and teacher…
In most transportation studies, computer models that forecast travel behavior statistics for a future year use static projections of the spatial distribution of future population and employment growth as inputs. As a result, they are unable to account for the temporally dynamic a...
DOT National Transportation Integrated Search
2015-04-01
This document describes the objectives, methods, analyses, and results of a study used to quantify the effects of future space operations on the National Airspace System (NAS), and to demonstrate the possible benefits of one proposed strategy to mi...
Neural Basis of Reinforcement Learning and Decision Making
Lee, Daeyeol; Seo, Hyojung; Jung, Min Whan
2012-01-01
Reinforcement learning is an adaptive process in which an animal utilizes its previous experience to improve the outcomes of future choices. Computational theories of reinforcement learning play a central role in the newly emerging areas of neuroeconomics and decision neuroscience. In this framework, actions are chosen according to their value functions, which describe how much future reward is expected from each action. Value functions can be adjusted not only through reward and penalty, but also by the animal’s knowledge of its current environment. Studies have revealed that a large proportion of the brain is involved in representing and updating value functions and using them to choose an action. However, how the nature of a behavioral task affects the neural mechanisms of reinforcement learning remains incompletely understood. Future studies should uncover the principles by which different computational elements of reinforcement learning are dynamically coordinated across the entire brain. PMID:22462543
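The value-function framework summarized above can be illustrated with a minimal tabular Q-learning sketch (a generic textbook-style illustration, not the specific models reviewed in this paper); the chain environment, learning rate, discount factor, and exploration rate below are all hypothetical choices:

    import random
    from collections import defaultdict

    # Minimal tabular Q-learning: Q[(state, action)] estimates the expected future
    # reward of taking an action, and is updated from the observed reward plus the
    # discounted value of the best action in the next state.
    def q_learning(env_step, n_states, n_actions, episodes=500,
                   alpha=0.1, gamma=0.9, epsilon=0.1):
        Q = defaultdict(float)
        for _ in range(episodes):
            state, done = 0, False
            while not done:
                # epsilon-greedy: actions are chosen according to their value functions
                if random.random() < epsilon:
                    action = random.randrange(n_actions)
                else:
                    action = max(range(n_actions), key=lambda a: Q[(state, a)])
                next_state, reward, done = env_step(state, action)
                best_next = max(Q[(next_state, a)] for a in range(n_actions))
                Q[(state, action)] += alpha * (reward + gamma * best_next - Q[(state, action)])
                state = next_state
        return Q

    # Hypothetical 3-state chain: action 1 moves right (reward 1.0 at the end), action 0 stays.
    def chain_step(state, action):
        if action == 1:
            nxt = state + 1
            return nxt, (1.0 if nxt == 2 else 0.0), nxt == 2
        return state, 0.0, False

    Q = q_learning(chain_step, n_states=3, n_actions=2)
    print(max(Q.items(), key=lambda kv: kv[1]))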
Prospective Optimization with Limited Resources.
Snider, Joseph; Lee, Dongpyo; Poizner, Howard; Gepshtein, Sergei
2015-09-01
The future is uncertain because some forthcoming events are unpredictable and also because our ability to foresee the myriad consequences of our own actions is limited. Here we studied how humans select actions under such extrinsic and intrinsic uncertainty, in view of an exponentially expanding number of prospects on a branching multivalued visual stimulus. A triangular grid of disks of different sizes scrolled down a touchscreen at a variable speed. The larger disks represented larger rewards. The task was to maximize the cumulative reward by touching one disk at a time in a rapid sequence, forming an upward path across the grid, while every step along the path constrained the part of the grid accessible in the future. This task captured some of the complexity of natural behavior in the risky and dynamic world, where ongoing decisions alter the landscape of future rewards. By comparing human behavior with behavior of ideal actors, we identified the strategies used by humans in terms of how far into the future they looked (their "depth of computation") and how often they attempted to incorporate new information about the future rewards (their "recalculation period"). We found that, for a given task difficulty, humans traded off their depth of computation for the recalculation period. The form of this tradeoff was consistent with a complete, brute-force exploration of all possible paths up to a resource-limited finite depth. A step-by-step analysis of the human behavior revealed that participants took into account very fine distinctions between the future rewards and that they abstained from some simple heuristics in assessment of the alternative paths, such as seeking only the largest disks or avoiding the smaller disks. The participants preferred to reduce their depth of computation or increase the recalculation period rather than sacrifice the precision of computation.
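The "depth of computation" versus "recalculation period" tradeoff described above can be made concrete with a hedged sketch of a brute-force, bounded-depth lookahead on a triangular grid of rewards (the grid layout, reward values, and step rule below are illustrative assumptions, not the actual stimulus parameters of the study):

    # Brute-force lookahead to a resource-limited depth on a triangular reward grid.
    # grid[row][col] is the reward of a disk; from (row, col) the next step can go to
    # (row + 1, col) or (row + 1, col + 1), so each choice constrains future options.
    def best_first_step(grid, row, col, depth):
        def total(r, c, d):
            # cumulative reward of the best path from (r, c) using at most d more steps
            if d == 0 or r == len(grid) - 1:
                return grid[r][c]
            return grid[r][c] + max(total(r + 1, c, d - 1), total(r + 1, c + 1, d - 1))
        left = total(row + 1, col, depth - 1)
        right = total(row + 1, col + 1, depth - 1)
        return (row + 1, col) if left >= right else (row + 1, col + 1)

    example_grid = [[3], [1, 5], [4, 1, 2], [2, 6, 1, 3]]   # hypothetical disk sizes
    print(best_first_step(example_grid, 0, 0, depth=2))     # choose a step with depth-2 lookahead

A larger depth means more paths are evaluated before each step; recalculating less often amounts to committing to several steps of a chosen path before running the search again.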
Brain-Computer Interfaces Using Sensorimotor Rhythms: Current State and Future Perspectives
Yuan, Han; He, Bin
2014-01-01
Many studies over the past two decades have shown that people can use brain signals to convey their intent to a computer using brain-computer interfaces (BCIs). BCI systems extract specific features of brain activity and translate them into control signals that drive an output. Recently, a category of BCIs that are built on the rhythmic activity recorded over the sensorimotor cortex, i.e. the sensorimotor rhythm (SMR), has attracted considerable attention among the BCIs that use noninvasive neural recordings, e.g. electroencephalography (EEG), and have demonstrated the capability of multi-dimensional prosthesis control. This article reviews the current state and future perspectives of SMR-based BCI and its clinical applications, in particular focusing on the EEG SMR. The characteristic features of SMR from the human brain are described and their underlying neural sources are discussed. The functional components of SMR-based BCI, together with its current clinical applications are reviewed. Lastly, limitations of SMR-BCIs and future outlooks are also discussed. PMID:24759276
The Future of Computer-Based Toxicity Prediction: Mechanism-Based Models vs. Information Mining Approaches
When we speak of computer-based toxicity prediction, we are generally referring to a broad array of approaches which rely primarily upon chemical structure ...
NASA Technical Reports Server (NTRS)
1973-01-01
Computer specifications for the logistics of orbital vehicle servicing were developed, and a number of alternatives to improve utilization of the space shuttle and the tug were investigated. Preliminary results indicate that space servicing offers a potential for reducing future operational and program costs over ground refurbishment of satellites. A computer code which could be developed to simulate space servicing is presented.
ERIC Educational Resources Information Center
Paul, Sandra K.; Kranberg, Susan
The third report from a comprehensive Unesco study, this document traces the history of the application of computer-based technology to the book distribution process in the United States and indicates functional areas currently showing the effects of using this technology. Ways in which computer use is altering book distribution management…
Computation of canonical correlation and best predictable aspect of future for time series
NASA Technical Reports Server (NTRS)
Pourahmadi, Mohsen; Miamee, A. G.
1989-01-01
The canonical correlation between the (infinite) past and future of a stationary time series is shown to be the limit of the canonical correlation between the (infinite) past and (finite) future, and computation of the latter is reduced to a (generalized) eigenvalue problem involving (finite) matrices. This provides a convenient and essentially finite-dimensional algorithm for computing canonical correlations and components of a time series. An upper bound is conjectured for the largest canonical correlation.
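As a hedged numerical sketch of the finite reduction described in this abstract (an illustration of the standard generalized-eigenvalue formulation of canonical correlation, not the authors' algorithm; the simulated AR(1) series and window length are arbitrary choices):

    import numpy as np
    from scipy.linalg import eigh

    # Canonical correlations between a finite "past" block X and "future" block Y of a
    # stationary series, via the generalized eigenvalue problem
    #   (Sxy Syy^{-1} Syx) v = rho^2 Sxx v.
    rng = np.random.default_rng(0)
    z = np.zeros(1000)
    for t in range(1, 1000):
        z[t] = 0.8 * z[t - 1] + rng.standard_normal()   # illustrative AR(1) series
    p = 3                                               # length of past/future windows
    X = np.column_stack([z[i:len(z) - 2 * p + i] for i in range(p)])   # past window
    Y = np.column_stack([z[p + i:len(z) - p + i] for i in range(p)])   # future window
    Xc, Yc = X - X.mean(axis=0), Y - Y.mean(axis=0)
    Sxx, Syy, Sxy = Xc.T @ Xc, Yc.T @ Yc, Xc.T @ Yc
    A = Sxy @ np.linalg.solve(Syy, Sxy.T)               # Sxy Syy^{-1} Syx (symmetric)
    rho2, _ = eigh(A, Sxx)                              # generalized eigenvalues = squared canonical correlations
    print(np.sqrt(np.clip(rho2, 0.0, 1.0))[::-1])       # canonical correlations, largest first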
Ferguson, Melanie; Henshaw, Helen
2015-09-01
The aim of this research forum article was to examine accessibility, use, and adherence to computerized and online interventions for people with hearing loss. Four intervention studies of people with hearing loss were examined: 2 auditory training studies, 1 working memory training study, and 1 study of multimedia educational support. A small proportion (approximately 15%) of participants had never used a computer, which may be a barrier to the accessibility of computer and Internet-based interventions. Computer competence was not a factor in intervention use or adherence. Computer skills and Internet access influenced participant preference for the delivery method of the multimedia educational support program. It is important to be aware of current barriers to computer and Internet-delivered interventions for people with hearing loss. However, there is a clear need to develop and future-proof hearing-related applications for online delivery.
Modeling Trait Anxiety: From Computational Processes to Personality
Raymond, James G.; Steele, J. Douglas; Seriès, Peggy
2017-01-01
Computational methods are increasingly being applied to the study of psychiatric disorders. Often, this involves fitting models to the behavior of individuals with subclinical character traits that are known vulnerability factors for the development of psychiatric conditions. Anxiety disorders can be examined with reference to the behavior of individuals high in “trait” anxiety, which is a known vulnerability factor for the development of anxiety and mood disorders. However, it is not clear how this self-report measure relates to neural and behavioral processes captured by computational models. This paper reviews emerging computational approaches to the study of trait anxiety, specifying how interacting processes susceptible to analysis using computational models could drive a tendency to experience frequent anxious states and promote vulnerability to the development of clinical disorders. Existing computational studies are described in the light of this perspective and appropriate targets for future studies are discussed. PMID:28167920
Modeling Trait Anxiety: From Computational Processes to Personality.
Raymond, James G; Steele, J Douglas; Seriès, Peggy
2017-01-01
Computational methods are increasingly being applied to the study of psychiatric disorders. Often, this involves fitting models to the behavior of individuals with subclinical character traits that are known vulnerability factors for the development of psychiatric conditions. Anxiety disorders can be examined with reference to the behavior of individuals high in "trait" anxiety, which is a known vulnerability factor for the development of anxiety and mood disorders. However, it is not clear how this self-report measure relates to neural and behavioral processes captured by computational models. This paper reviews emerging computational approaches to the study of trait anxiety, specifying how interacting processes susceptible to analysis using computational models could drive a tendency to experience frequent anxious states and promote vulnerability to the development of clinical disorders. Existing computational studies are described in the light of this perspective and appropriate targets for future studies are discussed.
The Ideology of Computer Literacy in Schools.
ERIC Educational Resources Information Center
Mangan, J. Marshall
This research project brings a critical perspective to the examination of computer literacy as an ideological form through a study of the reactions of high school teachers and students. On-site interviews with teachers and students found both acceptance of and resistance to the message of adjustment to an inevitable future of vocational and…
Automatic Student Plagiarism Detection: Future Perspectives
ERIC Educational Resources Information Center
Mozgovoy, Maxim; Kakkonen, Tuomo; Cosma, Georgina
2010-01-01
The availability and use of computers in teaching has seen an increase in the rate of plagiarism among students because of the wide availability of electronic texts online. While computer tools that have appeared in recent years are capable of detecting simple forms of plagiarism, such as copy-paste, a number of recent research studies devoted to…
Computer Technologies: Attitudes and Self-Efficacy across Undergraduate Disciplines.
ERIC Educational Resources Information Center
Kinzie, Mable B.; And Others
1994-01-01
A study of 359 undergraduate students in business (n=125), education (n=111), and nursing (n=123) in 3 state university systems investigated the use of 2 affective measures concerning aspects of computer technology. Data on construct validity, relationship between results of the two measures, and implications for future research are reported.…
ERIC Educational Resources Information Center
Ilieva, Vessela; Erguner-Tekinalp, Bengu
2012-01-01
This study examined the applications of computer-mediated student collaboration in a graduate multicultural counseling course. The course work included a reflective cultural competency building assignment that utilized online communication and collaboration using a wiki to extend and improve students' multicultural counseling and social justice…
High-Throughput Computing on High-Performance Platforms: A Case Study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oleynik, D; Panitkin, S; Turilli, Matteo
The computing systems used by LHC experiments have historically consisted of the federation of hundreds to thousands of distributed resources, ranging from small to mid-size resources. In spite of the impressive scale of the existing distributed computing solutions, the federation of small to mid-size resources will be insufficient to meet projected future demands. This paper is a case study of how the ATLAS experiment has embraced Titan -- a DOE leadership facility -- in conjunction with traditional distributed high-throughput computing to reach sustained production scales of approximately 52M core-hours a year. The three main contributions of this paper are: (i) a critical evaluation of design and operational considerations to support the sustained, scalable and production usage of Titan; (ii) a preliminary characterization of a next generation executor for PanDA to support new workloads and advanced execution modes; and (iii) early lessons for how current and future experimental and observational systems can be integrated with production supercomputers and other platforms in a general and extensible manner.
A Review Study on Cloud Computing Issues
NASA Astrophysics Data System (ADS)
Kanaan Kadhim, Qusay; Yusof, Robiah; Sadeq Mahdi, Hamid; Al-shami, Sayed Samer Ali; Rahayu Selamat, Siti
2018-05-01
Cloud computing is the most promising current implementation of utility computing in the business world, because it provides some key features over classic utility computing, such as elasticity, which allows clients to dynamically scale resources up and down at execution time. Nevertheless, cloud computing is still in a premature stage and lacks standardization. Security issues are the main challenge to cloud computing adoption. Thus, critical industries such as government organizations (ministries) are reluctant to trust cloud computing for fear of losing their sensitive data, as it resides on the cloud with no knowledge of data location, and the lack of transparency about the mechanisms Cloud Service Providers (CSPs) use to secure data and applications has created a barrier against adopting this agile computing paradigm. This study aims to review and classify the issues that surround the implementation of cloud computing, a hot area that needs to be addressed by future research.
Preparing Students for Future Learning with Teachable Agents
ERIC Educational Resources Information Center
Chin, Doris B.; Dohmen, Ilsa M.; Cheng, Britte H.; Oppezzo, Marily A.; Chase, Catherine C.; Schwartz, Daniel L.
2010-01-01
One valuable goal of instructional technologies in K-12 education is to prepare students for future learning. Two classroom studies examined whether Teachable Agents (TA) achieves this goal. TA is an instructional technology that draws on the social metaphor of teaching a computer agent to help students learn. Students teach their agent by…
Sherlock Holmes Meets the 21st Century.
ERIC Educational Resources Information Center
Flack, Jerry
1991-01-01
Mystery literature is proposed as a component of futures studies curriculum for gifted students. The article describes similarities between the behaviors of a detective and a critical thinker, the tools of futurists such as the futures wheel, and the use of such topics as computer crime and extraterrestrial life to challenge students' thinking…
Ravazzani, Giovanni; Ghilardi, Matteo; Mendlik, Thomas; Gobiet, Andreas; Corbari, Chiara; Mancini, Marco
2014-01-01
Assessing the future effects of climate change on water availability requires an understanding of how precipitation and evapotranspiration rates will respond to changes in atmospheric forcing. Use of simplified hydrological models is required because of the lack of meteorological forcings with the high space and time resolutions required to model hydrological processes in mountain river basins, and the necessity of reducing the computational costs. The main objective of this study was to quantify the differences between a simplified hydrological model, which uses only precipitation and temperature to compute the hydrological balance when simulating the impact of climate change, and an enhanced version of the model, which solves the energy balance to compute the actual evapotranspiration. For the meteorological forcing of the future scenario, at-site bias-corrected time series based on two regional climate models were used. A quantile-based error-correction approach was used to downscale the regional climate model simulations to a point scale and to reduce their error characteristics. The study shows that a simple temperature-based approach for computing the evapotranspiration is sufficiently accurate for performing hydrological impact investigations of climate change for the Alpine river basin which was studied. PMID:25285917
Ravazzani, Giovanni; Ghilardi, Matteo; Mendlik, Thomas; Gobiet, Andreas; Corbari, Chiara; Mancini, Marco
2014-01-01
Assessing the future effects of climate change on water availability requires an understanding of how precipitation and evapotranspiration rates will respond to changes in atmospheric forcing. Use of simplified hydrological models is required because of the lack of meteorological forcings with the high space and time resolutions required to model hydrological processes in mountain river basins, and the necessity of reducing the computational costs. The main objective of this study was to quantify the differences between a simplified hydrological model, which uses only precipitation and temperature to compute the hydrological balance when simulating the impact of climate change, and an enhanced version of the model, which solves the energy balance to compute the actual evapotranspiration. For the meteorological forcing of the future scenario, at-site bias-corrected time series based on two regional climate models were used. A quantile-based error-correction approach was used to downscale the regional climate model simulations to a point scale and to reduce their error characteristics. The study shows that a simple temperature-based approach for computing the evapotranspiration is sufficiently accurate for performing hydrological impact investigations of climate change for the Alpine river basin which was studied.
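One widely used temperature-based formulation is the Hargreaves reference evapotranspiration equation; the paper does not state which simplified approach it adopted, so the sketch below is only an illustrative assumption with hypothetical inputs:

    import math

    # Hargreaves reference evapotranspiration (mm/day) from temperature data only:
    #   ET0 = 0.0023 * Ra * (Tmean + 17.8) * sqrt(Tmax - Tmin)
    # where Ra is extraterrestrial radiation expressed as equivalent evaporation (mm/day).
    def hargreaves_et0(t_mean_c, t_max_c, t_min_c, ra_mm_day):
        return 0.0023 * ra_mm_day * (t_mean_c + 17.8) * math.sqrt(t_max_c - t_min_c)

    print(hargreaves_et0(t_mean_c=15.0, t_max_c=22.0, t_min_c=8.0, ra_mm_day=30.0))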
Current and Future Development of a Non-hydrostatic Unified Atmospheric Model (NUMA)
2010-09-09
Stated capabilities include: 1. highly scalable on current and future computer architectures (exascale computing and beyond, and GPUs); 2. flexibility... On exascale computing: 10 of the Top 500 are already in the petascale range; we should also keep our eyes on GPUs (e.g., Mare Nostrum). 2. Numerical
Reverse logistics system planning for recycling computers hardware: A case study
NASA Astrophysics Data System (ADS)
Januri, Siti Sarah; Zulkipli, Faridah; Zahari, Siti Meriam; Shamsuri, Siti Hajar
2014-09-01
This paper describes modeling and simulation of reverse logistics networks for the collection of used computers at one of the companies in Selangor. The study focuses on the design of a reverse logistics network for a used-computer recycling operation. The simulation modeling presented in this work allows the user to analyze the future performance of the network and to understand the complex relationship between the parties involved. The findings from the simulation suggest that the model calculates processing time and resource utilization in a predictable manner. In this study, the simulation model was developed using the Arena simulation package.
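A minimal sketch of how processing time and resource utilization might be estimated for such a collection network (a toy single-station queue with hypothetical arrival and processing rates, not the Arena model built in the study):

    import random

    # Toy simulation of a single used-computer processing station: units arrive,
    # wait for the one resource if it is busy, and are processed in arrival order.
    def simulate(n_units=1000, mean_interarrival=10.0, mean_processing=8.0, seed=1):
        random.seed(seed)
        clock = resource_free_at = busy_time = total_flow_time = 0.0
        for _ in range(n_units):
            clock += random.expovariate(1.0 / mean_interarrival)   # next arrival time
            start = max(clock, resource_free_at)                    # wait if resource busy
            service = random.expovariate(1.0 / mean_processing)
            resource_free_at = start + service
            busy_time += service
            total_flow_time += resource_free_at - clock             # time spent in the system
        return total_flow_time / n_units, busy_time / resource_free_at

    avg_time_in_system, utilization = simulate()
    print(avg_time_in_system, utilization)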
Computational predictions of zinc oxide hollow structures
NASA Astrophysics Data System (ADS)
Tuoc, Vu Ngoc; Huan, Tran Doan; Thao, Nguyen Thi
2018-03-01
Nanoporous materials are emerging as potential candidates for a wide range of technological applications in the environment, electronics, and optoelectronics, to name just a few. Within this active research area, experimental work is predominant, while theoretical/computational prediction and study of these materials face some intrinsic challenges, one of which is how to predict porous structures. We propose a computationally and technically feasible approach for predicting zinc oxide structures with hollows at the nanoscale. The designed zinc oxide hollow structures are studied with computations using the density functional tight binding and conventional density functional theory methods, revealing a variety of promising mechanical and electronic properties, which can potentially find future realistic applications.
QUANTUM COMPUTING: Quantum Entangled Bits Step Closer to IT.
Zeilinger, A
2000-07-21
In contrast to today's computers, quantum computers and information technologies may in future be able to store and transmit information not only in the state "0" or "1," but also in superpositions of the two; information will then be stored and transmitted in entangled quantum states. Zeilinger discusses recent advances toward using this principle for quantum cryptography and highlights studies into the entanglement (or controlled superposition) of several photons, atoms, or ions.
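The superposition and entanglement described here can be written down directly as state vectors (a generic, hedged illustration of the principle, not a model of the photon, atom, or ion experiments discussed):

    import numpy as np

    # A single qubit in an equal superposition of |0> and |1>.
    ket0 = np.array([1, 0], dtype=complex)
    ket1 = np.array([0, 1], dtype=complex)
    plus = (ket0 + ket1) / np.sqrt(2)

    # A two-qubit entangled Bell state (|00> + |11>) / sqrt(2): the outcome of measuring
    # one qubit fixes the other, the correlation exploited in quantum cryptography.
    bell = (np.kron(ket0, ket0) + np.kron(ket1, ket1)) / np.sqrt(2)
    probs = np.abs(bell) ** 2        # outcome probabilities for 00, 01, 10, 11
    print(np.abs(plus) ** 2, probs)  # 50/50 for |0>/|1>; 50/50 for 00 and 11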
eXascale PRogramming Environment and System Software (XPRESS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chapman, Barbara; Gabriel, Edgar
Exascale systems, with a thousand times the compute capacity of today’s leading-edge petascale computers, are expected to emerge during the next decade. Their software systems will need to facilitate the exploitation of exceptional amounts of concurrency in applications, and ensure that jobs continue to run despite the occurrence of system failures and other kinds of hard and soft errors. Adapting computations at runtime to cope with changes in the execution environment, as well as to improve power and performance characteristics, is likely to become the norm. As a result, considerable innovation is required to develop system support to meet the needs of future computing platforms. The XPRESS project aims to develop and prototype a revolutionary software system for extreme-scale computing for both exascale and strong-scaled problems. The XPRESS collaborative research project will advance the state-of-the-art in high performance computing and enable exascale computing for current and future DOE mission-critical applications and supporting systems. The goals of the XPRESS research project are to: A. enable exascale performance capability for DOE applications, both current and future; B. develop and deliver a practical computing system software X-stack, OpenX, for future practical DOE exascale computing systems; and C. provide programming methods and environments for effective means of expressing application and system software for portable exascale system execution.
NASA Technical Reports Server (NTRS)
Noor, Ahmed K.; Housner, Jerrold M.
1993-01-01
Recent advances in computer technology that are likely to impact structural analysis and design of flight vehicles are reviewed. A brief summary is given of the advances in microelectronics, networking technologies, and in the user-interface hardware and software. The major features of new and projected computing systems, including high performance computers, parallel processing machines, and small systems, are described. Advances in programming environments, numerical algorithms, and computational strategies for new computing systems are reviewed. The impact of the advances in computer technology on structural analysis and the design of flight vehicles is described. A scenario for future computing paradigms is presented, and the near-term needs in the computational structures area are outlined.
Preliminary Computational Study for Future Tests in the NASA Ames 9 foot x 7 foot Wind Tunnel
NASA Technical Reports Server (NTRS)
Pearl, Jason M.; Carter, Melissa B.; Elmiligui, Alaa A.; Winski, Courtney S.; Nayani, Sudheer N.
2016-01-01
The NASA Advanced Air Vehicles Program, Commercial Supersonics Technology Project seeks to advance tools and techniques to make over-land supersonic flight feasible. In this study, preliminary computational results are presented for future tests in the NASA Ames 9 foot x 7 foot supersonic wind tunnel to be conducted in early 2016. Shock-plume interactions and their effect on pressure signature are examined for six model geometries. Near-field pressure signatures are assessed using the CFD code USM3D to model the proposed test geometries in free air. Additionally, results obtained using the commercial grid generation software Pointwise (registered trademark) are compared to results using VGRID, the NASA Langley Research Center in-house mesh generation program.
Computers in the Classroom: The School of the Future, The Future of the School.
ERIC Educational Resources Information Center
Tapia, Ivan, Ed.
1995-01-01
Computer use in the classroom is the theme topic of this journal issue. Contents include: "Emo Welzl: 1995 Leibniz Laureate" (Hartmut Wewetzer); "Learning to Read with the Aid of a Computer: Research Project with Children Starting School" (Horst Meermann); "The Multimedia School: The Comenius Pilot Project" (Tom Sperlich); "A Very Useful Piece of…
Advances and trends in computational structural mechanics
NASA Technical Reports Server (NTRS)
Noor, A. K.
1986-01-01
Recent developments in computational structural mechanics are reviewed with reference to computational needs for future structures technology, advances in computational models for material behavior, discrete element technology, assessment and control of numerical simulations of structural response, hybrid analysis, and techniques for large-scale optimization. Research areas in computational structural mechanics which have high potential for meeting future technological needs are identified. These include prediction and analysis of the failure of structural components made of new materials, development of computational strategies and solution methodologies for large-scale structural calculations, and assessment of reliability and adaptive improvement of response predictions.
Application of computational physics within Northrop
NASA Technical Reports Server (NTRS)
George, M. W.; Ling, R. T.; Mangus, J. F.; Thompkins, W. T.
1987-01-01
An overview of Northrop programs in computational physics is presented. These programs depend on access to today's supercomputers, such as the Numerical Aerodynamical Simulator (NAS), and their future growth depends on the continuing evolution of computational engines. Descriptions here are concentrated on the following areas: computational fluid dynamics (CFD), computational electromagnetics (CEM), computer architectures, and expert systems. Current efforts and future directions in these areas are presented. The impact of advances in the CFD area is described, and parallels are drawn to analogous developments in CEM. The relationship between advances in these areas and the development of advanced (parallel) architectures and expert systems is also presented.
NASA Astrophysics Data System (ADS)
Tansey, M. K.; Flores-Lopez, F.; Young, C. A.; Huntington, J. L.
2012-12-01
Long-term planning for the management of California's water resources requires assessment of the effects of future climate changes on both water supply and demand. Considerable progress has been made on the evaluation of the effects of future climate changes on water supplies, but less information is available with regard to water demands. Uncertainty in future climate projections increases the difficulty of assessing climate impacts and evaluating long-range adaptation strategies. Compounding the uncertainty in the future climate projections is the fact that most readily available downscaled climate projections lack sufficient meteorological information to compute evapotranspiration (ET) by the widely accepted ASCE Penman-Monteith (PM) method. This study addresses potential changes in future Central Valley water demands and crop yields by examining the effects of climate change on soil evaporation, plant transpiration, growth and yield for major types of crops grown in the Central Valley of California. Five representative climate scenarios based on 112 bias-corrected, spatially downscaled CMIP3 GCM climate simulations were developed using the hybrid delta ensemble method to span a wide range of future climate uncertainty. Analysis of historical California Irrigation Management Information System meteorological data was combined with several meteorological estimation methods to compute future solar radiation, wind speed and dew point temperatures corresponding to the GCM-projected temperatures and precipitation. Future atmospheric CO2 concentrations corresponding to the 5 representative climate projections were developed based on weighting IPCC SRES emissions scenarios. The Land, Atmosphere, and Water Simulator (LAWS) model was used to compute ET and yield changes in the early, middle and late 21st century for 24 representative agricultural crops grown in the Sacramento, San Joaquin and Tulare Lake basins. Study results indicate that changes in ET and yield vary between crops due to plant-specific sensitivities to temperature, solar radiation and vapor pressure deficits. Shifts in the growth period to earlier in the year, a shortened growth period for annual crops, and extended fall growth can also exert important influences. Projected increases in CO2 concentrations in the late 21st century exert very significant influences on ET and yield for many crops. To characterize potential impacts and the range of uncertainty, changes in total agricultural water demands and yields were computed assuming that current crop types and acreages in 21 Central Valley regional planning areas remained constant throughout the 21st century for each of the 5 representative future climate scenarios.
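A hedged sketch of the quantile-based idea that often underlies such bias correction and downscaling (empirical additive quantile mapping; the arrays are placeholders, and the study's hybrid delta ensemble implementation may differ in detail):

    import numpy as np

    # Empirical quantile mapping: learn a quantile-wise correction that maps the model's
    # historical distribution onto observations, then apply it to future model output.
    def quantile_map(obs_hist, mod_hist, mod_future, n_quantiles=99):
        q = np.linspace(0.01, 0.99, n_quantiles)
        obs_q = np.quantile(obs_hist, q)
        mod_q = np.quantile(mod_hist, q)
        correction = obs_q - mod_q                           # additive correction per quantile
        return mod_future + np.interp(mod_future, mod_q, correction)

    rng = np.random.default_rng(0)
    obs = rng.normal(15.0, 5.0, 5000)   # hypothetical observed daily temperature (deg C)
    mod = rng.normal(13.0, 6.0, 5000)   # hypothetical modeled historical temperature
    fut = rng.normal(16.0, 6.0, 5000)   # hypothetical modeled future temperature
    print(quantile_map(obs, mod, fut).mean())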
Algorithms in nature: the convergence of systems biology and computational thinking
Navlakha, Saket; Bar-Joseph, Ziv
2011-01-01
Computer science and biology have enjoyed a long and fruitful relationship for decades. Biologists rely on computational methods to analyze and integrate large data sets, while several computational methods were inspired by the high-level design principles of biological systems. Recently, these two directions have been converging. In this review, we argue that thinking computationally about biological processes may lead to more accurate models, which in turn can be used to improve the design of algorithms. We discuss the similar mechanisms and requirements shared by computational and biological processes and then present several recent studies that apply this joint analysis strategy to problems related to coordination, network analysis, and tracking and vision. We also discuss additional biological processes that can be studied in a similar manner and link them to potential computational problems. With the rapid accumulation of data detailing the inner workings of biological systems, we expect this direction of coupling biological and computational studies to greatly expand in the future. PMID:22068329
Future prospect 2012-2025 - How will our business change for the next 10 years -
NASA Astrophysics Data System (ADS)
Tanaka, Sakae
2013-04-01
The purpose of this lecture is to discuss the "Future": how will our business change in the next 10 years? I believe the key lies in 3 mega-trends: "Sustainability", "Cloud Computing", and "Life Innovation". As the social environment develops, the business that is required will change, too. The future will remain invisible if you shut yourself up in a single industry. It is important to look across various business fields horizontally and to recognize key changes stereoscopically, such as demographics, economy, technology, sense of value, and lifestyle, when developing a mid- and long-term strategy. "Cloud" is a silent but real revolution in personal computing. It will bring drastic changes to every industry. It will make "voice" and "moving image" usable as interfaces for accessing your computer. Cloud computing will also make client devices more diversified and spread the range of applications widely. Fifteen years ago, the term "IT" was equivalent to "personal computer"; recently, it has come to mean the use of smartphones and tablet devices. In the next several years, TVs and car-navigation systems will be connected to broadband and become a part of personal computing. The meaning of personal computing is changing essentially year by year. In the near future, the universe of computing will expand to energy, medicine and health care, agriculture, and more. Only 20 years have passed since computers came into full-scale use. Recently, the computer has started to understand a few of our words and to talk in babble like a baby. The history of computing has just started.
Computer simulation modeling of recreation use: Current status, case studies, and future directions
David N. Cole
2005-01-01
This report compiles information about recent progress in the application of computer simulation modeling to planning and management of recreation use, particularly in parks and wilderness. Early modeling efforts are described in a chapter that provides an historical perspective. Another chapter provides an overview of modeling options, common data input requirements,...
Computer simulation studies in fluid and calcium regulation and orthostatic intolerance
NASA Technical Reports Server (NTRS)
1985-01-01
The systems analysis approach to physiological research uses mathematical models and computer simulation. Major areas of concern during prolonged space flight discussed include fluid and blood volume regulation; cardiovascular response during shuttle reentry; countermeasures for orthostatic intolerance; and calcium regulation and bone atrophy. Potential contributions of physiologic math models to future flight experiments are examined.
Visions of CSCL: Eight Provocations for the Future of the Field
ERIC Educational Resources Information Center
Wise, Alyssa Friend; Schwarz, Baruch B.
2017-01-01
The field of Computer-Supported Collaborative Learning (CSCL) is at a critical moment in its development. Internally, we face issues of fragmentation and questions about what progress is being made. Externally, the rise of social media and a variety of research communities that study the interactions within it raise questions about our unique identity…
Learning from the Learners: Preparing Future Teachers to Leverage the Benefits of Laptop Computers
ERIC Educational Resources Information Center
Grundmeyer, Trent; Peters, Randal
2016-01-01
Technology is changing the teaching and learning landscape. Teacher preparation programs must produce teachers who have new skills and strategies to leverage the benefits of laptop computers in their classrooms. This study used a phenomenological strategy to explain first-year college students' perceptions of the effects of a 1:1 laptop experience…
The Next Wave: Humans, Computers, and Redefining Reality
NASA Technical Reports Server (NTRS)
Little, William
2018-01-01
The Augmented/Virtual Reality (AVR) Lab at KSC is dedicated to "exploration into the growing computer fields of Extended Reality and the Natural User Interface (it is) a proving ground for new technologies that can be integrated into future NASA projects and programs." The topics of Human Computer Interface, Human Computer Interaction, Augmented Reality, Virtual Reality, and Mixed Reality are defined; examples of work being done in these fields in the AVR Lab are given. Current and future work in Computer Vision, Speech Recognition, and Artificial Intelligence is also outlined.
ERIC Educational Resources Information Center
Kurland, D. Midian, Ed.
The five papers in this symposium contribute to a dialog on the aims and methods of computer education, and indicate directions future research must take if necessary information is to be available to make informed decisions about the use of computers in schools. The first two papers address the question of what is required for a student to become…
Rutkowski, Tomasz M
2015-08-01
This paper presents an applied concept of a brain-computer interface (BCI) student research laboratory (BCI-LAB) at the Life Science Center of TARA, University of Tsukuba, Japan. Several successful case studies of the student projects are reviewed together with the BCI Research Award 2014 winner case. The BCI-LAB design and project-based teaching philosophy is also explained. Future teaching and research directions summarize the review.
A statistical view of FMRFamide neuropeptide diversity.
Espinoza, E; Carrigan, M; Thomas, S G; Shaw, G; Edison, A S
2000-01-01
FMRFamide-like peptide (FLP) amino acid sequences have been collected and statistically analyzed. FLP amino acid composition as a function of position in the peptide is graphically presented for several major phyla. Results of total amino acid composition and frequencies of pairs of FLP amino acids have been computed and compared with corresponding values from the entire GenBank protein sequence database. The data for pairwise distributions of amino acids should help in future structure-function studies of FLPs. To aid in future peptide discovery, a computer program and search protocol was developed to identify FLPs from the GenBank protein database without the use of keywords.
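A minimal sketch of the kind of positional and pairwise amino-acid counting described above (the peptide strings are invented placeholders, not FLP sequences from the study, and "pair frequency" here simply means co-occurrence of two residues within one peptide):

    from collections import Counter
    from itertools import combinations

    peptides = ["FMRFA", "FLRFA", "KNEFIRFA"]   # placeholder sequences

    # Amino acid composition as a function of position in the peptide.
    position_counts = Counter()
    for pep in peptides:
        for i, aa in enumerate(pep):
            position_counts[(i, aa)] += 1

    # Frequencies of pairs of residues co-occurring within a peptide.
    pair_counts = Counter()
    for pep in peptides:
        for a, b in combinations(sorted(set(pep)), 2):
            pair_counts[(a, b)] += 1

    print(position_counts.most_common(5))
    print(pair_counts.most_common(5))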
Aerothermodynamic testing requirements for future space transportation systems
NASA Technical Reports Server (NTRS)
Paulson, John W., Jr.; Miller, Charles G., III
1995-01-01
Aerothermodynamics, encompassing aerodynamics, aeroheating, and fluid dynamic and physical processes, is the genesis for the design and development of advanced space transportation vehicles. It provides crucial information to other disciplines involved in the development process such as structures, materials, propulsion, and avionics. Sources of aerothermodynamic information include ground-based facilities, computational fluid dynamic (CFD) and engineering computer codes, and flight experiments. Utilization of this triad is required to provide the optimum requirements while reducing undue design conservatism, risk, and cost. This paper discusses the role of ground-based facilities in the design of future space transportation system concepts. Testing methodology is addressed, including the iterative approach often required for the assessment and optimization of configurations from an aerothermodynamic perspective. The influence of vehicle shape and the transition from parametric studies for optimization to benchmark studies for final design and establishment of the flight data book is discussed. Future aerothermodynamic testing requirements including the need for new facilities are also presented.
Lorah, Elizabeth R; Parnell, Ashley; Whitby, Peggy Schaefer; Hantula, Donald
2015-12-01
Powerful, portable, off-the-shelf handheld devices, such as tablet based computers (i.e., iPad(®); Galaxy(®)) or portable multimedia players (i.e., iPod(®)), can be adapted to function as speech generating devices for individuals with autism spectrum disorders or related developmental disabilities. This paper reviews the research in this new and rapidly growing area and delineates an agenda for future investigations. In general, participants using these devices acquired verbal repertoires quickly. Studies comparing these devices to picture exchange or manual sign language found that acquisition was often quicker when using a tablet computer and that the vast majority of participants preferred using the device to picture exchange or manual sign language. Future research in interface design, user experience, and extended verbal repertoires is recommended.
NASA Technical Reports Server (NTRS)
Boyle, W. G.; Barton, G. W.
1979-01-01
The feasibility of computerized automation of the Analytical Laboratories Section at NASA's Lewis Research Center was considered. Since that laboratory's duties are not routine, the automation goals were set with that in mind. Four instruments were selected as the most likely automation candidates: an atomic absorption spectrophotometer, an emission spectrometer, an X-ray fluorescence spectrometer, and an X-ray diffraction unit. Two options for computer automation were described: a time-shared central computer and a system with microcomputers for each instrument connected to a central computer. A third option, presented for future planning, expands the microcomputer version. Costs and benefits for each option were considered. It was concluded that the microcomputer version best fits the goals and duties of the laboratory and that such an automated system is needed to meet the laboratory's future requirements.
The future challenge for aeropropulsion
NASA Technical Reports Server (NTRS)
Rosen, Robert; Bowditch, David N.
1992-01-01
NASA's research in aeropropulsion is focused on improving the efficiency, capability, and environmental compatibility for all classes of future aircraft. The development of innovative concepts, and theoretical, experimental, and computational tools provide the knowledge base for continued propulsion system advances. Key enabling technologies include advances in internal fluid mechanics, structures, light-weight high-strength composite materials, and advanced sensors and controls. Recent emphasis has been on the development of advanced computational tools in internal fluid mechanics, structural mechanics, reacting flows, and computational chemistry. For subsonic transport applications, very high bypass ratio turbofans with increased engine pressure ratio are being investigated to increase fuel efficiency and reduce airport noise levels. In a joint supersonic cruise propulsion program with industry, the critical environmental concerns of emissions and community noise are being addressed. NASA is also providing key technologies for the National Aerospaceplane, and is studying propulsion systems that provide the capability for aircraft to accelerate to and cruise in the Mach 4-6 speed range. The combination of fundamental, component, and focused technology development underway at NASA will make possible dramatic advances in aeropropulsion efficiency and environmental compatibility for future aeronautical vehicles.
Application of theoretical methods to increase succinate production in engineered strains.
Valderrama-Gomez, M A; Kreitmayer, D; Wolf, S; Marin-Sanguino, A; Kremling, A
2017-04-01
Computational methods have enabled the discovery of non-intuitive strategies to enhance the production of a variety of target molecules. In the case of succinate production, reviews covering the topic have not yet analyzed the impact and future potential that such methods may have. In this work, we review the application of computational methods to the production of succinic acid. We found that while a total of 26 theoretical studies were published between 2002 and 2016, only 10 studies reported the successful experimental implementation of any kind of theoretical knowledge. None of the experimental studies reported an exact application of the computational predictions. However, the combination of computational analysis with complementary strategies, such as directed evolution and comparative genome analysis, serves as a proof of concept and demonstrates that successful metabolic engineering can be guided by rational computational methods.
Bellwether Social Studies Programs.
ERIC Educational Resources Information Center
Daetz, Denney
1985-01-01
Describes and reviews commercially available computer software for social studies (SS). They are: "Jury Trial II" (utilizes artificial intelligence); "Africa" (utilizes creative graphics to teach SS facts); "Revolutions: Past, Present and Future"; "The Other Side" (examines world peace using values…
Quantum Computation: Entangling with the Future
NASA Technical Reports Server (NTRS)
Jiang, Zhang
2017-01-01
Commercial applications of quantum computation have become viable due to the rapid progress of the field in the recent years. Efficient quantum algorithms are discovered to cope with the most challenging real-world problems that are too hard for classical computers. Manufactured quantum hardware has reached unprecedented precision and controllability, enabling fault-tolerant quantum computation. Here, I give a brief introduction on what principles in quantum mechanics promise its unparalleled computational power. I will discuss several important quantum algorithms that achieve exponential or polynomial speedup over any classical algorithm. Building a quantum computer is a daunting task, and I will talk about the criteria and various implementations of quantum computers. I conclude the talk with near-future commercial applications of a quantum computer.
Programmers, professors, and parasites: credit and co-authorship in computer science.
Solomon, Justin
2009-12-01
This article presents an in-depth analysis of past and present publishing practices in academic computer science to suggest the establishment of a more consistent publishing standard. Historical precedent for academic publishing in computer science is established through the study of anecdotes as well as statistics collected from databases of published computer science papers. After examining these facts alongside information about analogous publishing situations and standards in other scientific fields, the article concludes with a list of basic principles that should be adopted in any computer science publishing standard. These principles would contribute to the reliability and scientific nature of academic publications in computer science and would allow for more straightforward discourse in future publications.
Translational bioinformatics in the cloud: an affordable alternative
2010-01-01
With the continued exponential expansion of publicly available genomic data and access to low-cost, high-throughput molecular technologies for profiling patient populations, computational technologies and informatics are becoming vital considerations in genomic medicine. Although cloud computing technology is being heralded as a key enabling technology for the future of genomic research, available case studies are limited to applications in the domain of high-throughput sequence data analysis. The goal of this study was to evaluate the computational and economic characteristics of cloud computing in performing a large-scale data integration and analysis representative of research problems in genomic medicine. We find that the cloud-based analysis compares favorably in both performance and cost in comparison to a local computational cluster, suggesting that cloud computing technologies might be a viable resource for facilitating large-scale translational research in genomic medicine. PMID:20691073
Stanczyk, Nicola Esther; Crutzen, Rik; Bolman, Catherine; Muris, Jean; de Vries, Hein
2013-02-06
Smoking tobacco is one of the most preventable causes of illness and death. Web-based tailored smoking cessation interventions have shown to be effective. Although these interventions have the potential to reach a large number of smokers, they often face high attrition rates, especially among lower educated smokers. A possible reason for the high attrition rates in the latter group is that computer-tailored smoking cessation interventions may not be attractive enough as they are mainly text-based. Video-based messages might be more effective in attracting attention and stimulating comprehension in people with a lower educational level and could therefore reduce attrition rates. The objective of the present study was to investigate whether differences exist in message-processing mechanisms (attention, comprehension, self-reference, appreciation, processing) and future adherence (intention to visit/use the website again, recommend the website to others), according to delivery strategy (video or text based messages) and educational level, to a Dutch computer-tailored smoking cessation program. Smokers who were motivated to quit within the following 6 months and who were aged over 16 were included in the program. Participants were randomly assigned to one of two conditions (video/text CT). The sample was stratified into 2 categories: lower and higher educated participants. In total, 139 participants completed the first session of the web-based tailored intervention and were subsequently asked to fill out a questionnaire assessing message-processing mechanisms and future adherence. ANOVAs and regression analyses were conducted to investigate the differences in message-processing mechanisms and future adherence with regard to delivery strategy and education. No interaction effects were found between delivery strategy (video vs text) and educational level on message-processing mechanisms and future adherence. Delivery strategy had no effect on future adherence and processing mechanisms. However, in both groups results indicated that lower educated participants showed higher attention (F(1,138)=3.97; P=.05) and processing levels (F(1,138)=4.58; P=.04). Results revealed also that lower educated participants were more inclined to visit the computer-tailored intervention website again (F(1,138)=4.43; P=.04). Computer-tailored programs have the potential to positively influence lower educated groups as they might be more involved in the computer-tailored intervention than higher educated smokers. Longitudinal studies with a larger sample are needed to gain more insight into the role of delivery strategy in tailored information and to investigate whether the intention to visit the intervention website again results in the ultimate goal of behavior change. Netherlands Trial Register (NTR3102).
Crutzen, Rik; Bolman, Catherine; Muris, Jean; de Vries, Hein
2013-01-01
Background: Smoking tobacco is one of the most preventable causes of illness and death. Web-based tailored smoking cessation interventions have shown to be effective. Although these interventions have the potential to reach a large number of smokers, they often face high attrition rates, especially among lower educated smokers. A possible reason for the high attrition rates in the latter group is that computer-tailored smoking cessation interventions may not be attractive enough as they are mainly text-based. Video-based messages might be more effective in attracting attention and stimulating comprehension in people with a lower educational level and could therefore reduce attrition rates. Objective: The objective of the present study was to investigate whether differences exist in message-processing mechanisms (attention, comprehension, self-reference, appreciation, processing) and future adherence (intention to visit/use the website again, recommend the website to others), according to delivery strategy (video or text based messages) and educational level, to a Dutch computer-tailored smoking cessation program. Methods: Smokers who were motivated to quit within the following 6 months and who were aged over 16 were included in the program. Participants were randomly assigned to one of two conditions (video/text CT). The sample was stratified into 2 categories: lower and higher educated participants. In total, 139 participants completed the first session of the web-based tailored intervention and were subsequently asked to fill out a questionnaire assessing message-processing mechanisms and future adherence. ANOVAs and regression analyses were conducted to investigate the differences in message-processing mechanisms and future adherence with regard to delivery strategy and education. Results: No interaction effects were found between delivery strategy (video vs text) and educational level on message-processing mechanisms and future adherence. Delivery strategy had no effect on future adherence and processing mechanisms. However, in both groups results indicated that lower educated participants showed higher attention (F(1,138)=3.97; P=.05) and processing levels (F(1,138)=4.58; P=.04). Results revealed also that lower educated participants were more inclined to visit the computer-tailored intervention website again (F(1,138)=4.43; P=.04). Conclusions: Computer-tailored programs have the potential to positively influence lower educated groups as they might be more involved in the computer-tailored intervention than higher educated smokers. Longitudinal studies with a larger sample are needed to gain more insight into the role of delivery strategy in tailored information and to investigate whether the intention to visit the intervention website again results in the ultimate goal of behavior change. Trial Registration: Netherlands Trial Register (NTR3102). PMID:23388554
Taylor, Michael J; Taylor, Dave; Vlaev, Ivo; Elkin, Sarah
2017-01-01
Taylor, Michael J.; Taylor, Dave; Vlaev, Ivo; Elkin, Sarah
2015-01-01
Recent advances in communication technologies enable potential provision of remote education for patients using computer-generated environments known as virtual worlds. Previous research has revealed highly variable levels of patient receptiveness to using information technologies for healthcare-related purposes. This preliminary study implemented a questionnaire investigating respiratory outpatients' attitudes toward, and access to, computer technologies in order to assess the potential for using virtual worlds to facilitate health-related education in this sample. Ninety-four patients with a chronic respiratory condition completed surveys, which were distributed at a Chest Clinic. In accordance with our prediction, younger participants were more likely to be able to use, and have access to, a computer, and some patients were keen to explore using virtual worlds for healthcare-related purposes: of those with access to computer facilities, 14.50% expressed a willingness to attend a virtual world focus group. Results indicate that future virtual world health education facilities should be designed to cater for younger patients, because this group is most likely to accept and use such facilities. Within the study sample, this is likely to comprise people diagnosed with asthma. Future work could investigate the potential of creating a virtual world asthma education facility. PMID:28239187
ERIC Educational Resources Information Center
Lee, Youngsun; Wehmeyer, Michael L.; Palmer, Susan B.; Williams-Diehm, Kendra; Davies, Daniel K.; Stock, Steven E.
2011-01-01
The purpose of this study was to investigate the impact of student-directed transition planning instruction ("Whose Future Is It Anyway?" curriculum) with a computer-based reading support program ("Rocket Reader") on the self-determination, self-efficacy and outcome expectancy, and transition planning knowledge of students with disabilities. This…
DOE Office of Scientific and Technical Information (OSTI.GOV)
John Wooley; Herbert S. Lin
This study is the first comprehensive NRC study that suggests a high-level intellectual structure for Federal agencies for supporting work at the biology/computing interface. The report seeks to establish the intellectual legitimacy of a fundamentally cross-disciplinary collaboration between biologists and computer scientists. That is, while some universities are increasingly favorable to research at the intersection, life science researchers at other universities are strongly impeded in their efforts to collaborate. This report addresses these impediments and describes proven strategies for overcoming them. An important feature of the report is the use of well-documented examples that describe clearly to individuals not trained in computer science the value and usage of computing across the biological sciences, from genes and proteins to networks and pathways, from organelles to cells, and from individual organisms to populations and ecosystems. It is hoped that these examples will be useful to students in the life sciences to motivate (continued) study in computer science that will enable them to be more facile users of computing in their future biological studies.
Computer-based visual communication in aphasia.
Steele, R D; Weinrich, M; Wertz, R T; Kleczewska, M K; Carlson, G S
1989-01-01
The authors describe their recently developed Computer-aided VIsual Communication (C-VIC) system, and report results of single-subject experimental designs probing its use with five chronic, severely impaired aphasic individuals. Studies replicate earlier results obtained with a non-computerized system, demonstrate patient competence with the computer implementation, extend the system's utility, and identify promising areas of application. Results of the single-subject experimental designs clarify patients' learning, generalization, and retention patterns, and highlight areas of performance difficulties. Future directions for the project are indicated.
General Reevaluation Report, Upper Skunk River Basin, Iowa (Ames Lake).
1987-07-01
... the beneficial effects of land enhancement are analyzed as location benefits. Computations of the effects of future growth as related to ... relationships needed to compute benefits. Computations of the effects of future growth as related to residential and commercial properties were based ... effects of increased amounts of soil conservation land treatment practices upon: soil erosion by water; sediment yields to potential reservoir ...
Medical image computing for computer-supported diagnostics and therapy. Advances and perspectives.
Handels, H; Ehrhardt, J
2009-01-01
Medical image computing has become one of the most challenging fields in medical informatics. In image-based diagnostics of the future, software assistance will become more and more important, and image analysis systems integrating advanced image computing methods are needed to extract quantitative image parameters that characterize the state and changes of image structures of interest (e.g. tumors, organs, vessels, bones, etc.) in a reproducible and objective way. Furthermore, in the field of software-assisted and navigated surgery, medical image computing methods play a key role and have opened up new perspectives for patient treatment. However, further developments are needed to increase the degree of automation, accuracy, reproducibility and robustness. Moreover, the systems developed have to be integrated into the clinical workflow. For the development of advanced image computing systems, methods from different scientific fields have to be adapted and used in combination. The principal methodologies in medical image computing are the following: image segmentation, image registration, image analysis for quantification and computer-assisted image interpretation, modeling and simulation, as well as visualization and virtual reality. In particular, model-based image computing techniques open up new perspectives for the prediction of organ changes and risk analysis of patients and will gain importance in the diagnostics and therapy of the future. From a methodological point of view, the authors identify the following future trends and perspectives in medical image computing: development of optimized application-specific systems and integration into the clinical workflow, enhanced computational models for image analysis and virtual reality training systems, integration of different image computing methods, further integration of multimodal image data and biosignals, and advanced methods for 4D medical image computing. The development of image analysis systems for diagnostic support or operation planning is a complex interdisciplinary process. Image computing methods enable new insights into the patient's image data and have the future potential to improve medical diagnostics and patient treatment.
NASA Technical Reports Server (NTRS)
Noor, Ahmed K. (Editor)
1986-01-01
The papers contained in this volume provide an overview of the advances made in a number of aspects of computational mechanics, identify some of the anticipated industry needs in this area, discuss the opportunities provided by new hardware and parallel algorithms, and outline some of the current government programs in computational mechanics. Papers are included on advances and trends in parallel algorithms, supercomputers for engineering analysis, material modeling in nonlinear finite-element analysis, the Navier-Stokes computer, and future finite-element software systems.
Current capabilities and future directions in computational fluid dynamics
NASA Technical Reports Server (NTRS)
1986-01-01
A summary of significant findings is given, followed by specific recommendations for future directions of emphasis for computational fluid dynamics development. The discussion is organized into three application areas: external aerodynamics, hypersonics, and propulsion - and is followed by a turbulence modeling synopsis.
BITNET: Past, Present, and Future.
ERIC Educational Resources Information Center
Oberst, Daniel J.; Smith, Sheldon B.
1986-01-01
Discusses history and development of the academic computer network BITNET, including BITNET Network Support Center's growth and services, and international expansion. Network users, reasons for growth, and future developments are reviewed. A BITNET applications sampler and listings of compatible computers and operating systems, sites, and…
Military clouds: utilization of cloud computing systems at the battlefield
NASA Astrophysics Data System (ADS)
Süleyman, Sarıkürk; Volkan, Karaca; İbrahim, Kocaman; Ahmet, Şirzai
2012-05-01
Cloud computing is known as a novel information technology (IT) concept, which involves facilitated and rapid access to networks, servers, data storage media, applications and services via the Internet with minimum hardware requirements. Use of information systems and technologies on the battlefield is not new. Information superiority is a force multiplier and is crucial to mission success. Recent advances in information systems and technologies provide new means to decision makers and users in order to gain information superiority. These developments in information technologies lead to a new term, known as network centric capability. Similar to network centric capable systems, cloud computing systems are operational today. In the near future, extensive use of military clouds on the battlefield is predicted. Integrating cloud computing logic into network centric applications will increase the flexibility, cost-effectiveness, efficiency and accessibility of network-centric capabilities. In this paper, cloud computing and network centric capability concepts are defined. Some commercial cloud computing products and applications are mentioned. Network centric capable applications are covered. Cloud computing supported battlefield applications are analyzed. The effects of cloud computing systems on network centric capability and on the information domain in future warfare are discussed. Battlefield opportunities and novelties which might be introduced to network centric capability by cloud computing systems are researched. The role of military clouds in future warfare is proposed in this paper. It was concluded that military clouds will be indispensable components of the future battlefield. Military clouds have the potential to improve network centric capabilities, increase situational awareness on the battlefield and facilitate the attainment of information superiority.
BigData and computing challenges in high energy and nuclear physics
NASA Astrophysics Data System (ADS)
Klimentov, A.; Grigorieva, M.; Kiryanov, A.; Zarochentsev, A.
2017-06-01
In this contribution we discuss the various aspects of the computing resource needs of experiments in High Energy and Nuclear Physics, in particular at the Large Hadron Collider. This will evolve in the future when moving from the LHC to the HL-LHC in ten years from now, when the already exascale levels of data we are processing could increase by a further order of magnitude. The distributed computing environment has been a great success, and the inclusion of new super-computing facilities, cloud computing and volunteer computing in the future is a big challenge, which we are successfully mastering with a considerable contribution from many super-computing centres around the world and from academic and commercial cloud providers. We also discuss R&D computing projects started recently at the National Research Center "Kurchatov Institute".
1988-10-20
The LOCK project, from its very beginnings as an implementation study for the Provably Secure Operating System in 1979, ... to the security field, can study to gain insight into the evaluation process. The project has developed an innovative format for the DTLS and FTLS ... the A1 Secure DBMS, a database management system (DBMS) that is currently being developed under the Advanced ...; when ... becomes available, the A1 Secure DBMS will be ported to it.
NASA Technical Reports Server (NTRS)
Hawke, Veronica; Gage, Peter; Manning, Ted
2007-01-01
ComGeom2, a tool developed to generate Common Geometry representation for multidisciplinary analysis, has been used to create a large set of geometries for use in a design study requiring analysis by two computational codes. This paper describes the process used to generate the large number of configurations and suggests ways to further automate the process and make it more efficient for future studies. The design geometry for this study is the launch abort system of the NASA Crew Launch Vehicle.
Interactive systems design and synthesis of future spacecraft concepts
NASA Technical Reports Server (NTRS)
Wright, R. L.; Deryder, D. D.; Ferebee, M. J., Jr.
1984-01-01
An interactive systems design and synthesis is performed on future spacecraft concepts using the Interactive Design and Evaluation of Advanced spacecraft (IDEAS) computer-aided design and analysis system. The capabilities and advantages of the systems-oriented interactive computer-aided design and analysis system are described. The synthesis of both large antenna and space station concepts, and space station evolutionary growth is demonstrated. The IDEAS program provides the user with both an interactive graphics and an interactive computing capability which consists of over 40 multidisciplinary synthesis and analysis modules. Thus, the user can create, analyze and conduct parametric studies and modify Earth-orbiting spacecraft designs (space stations, large antennas or platforms, and technologically advanced spacecraft) at an interactive terminal with relative ease. The IDEAS approach is useful during the conceptual design phase of advanced space missions when a multiplicity of parameters and concepts must be analyzed and evaluated in a cost-effective and timely manner.
Nanotoxicity prediction using computational modelling - review and future directions
NASA Astrophysics Data System (ADS)
Saini, Bhavna; Srivastava, Sumit
2018-04-01
Nanomaterials have stimulated various outlooks for the future in a number of industries and scientific ventures. A number of applications, such as cosmetics, medicines, and electronics, employ nanomaterials due to their various compelling properties. The unending growth of nanomaterials usage in our daily life has escalated health and environmental risks. Early nanotoxicity recognition is a big challenge. Much research is ongoing in the field of nanotoxicity, which faces several problems such as the inadequacy of proper datasets, a lack of appropriate rules, and the characterization of nanomaterials. Computational modelling would be a beneficial asset for nanomaterials researchers because it can foresee toxicity based on previous experimental data. In this study, we have reviewed work demonstrating a proper pathway for proceeding with QSAR analysis of nanomaterials for toxicity modelling. The paper aims to provide comprehensive insight into nano-QSAR, the various theories, tools and approaches used, along with an outline of future research directions to work on.
Human-Computer Interaction and Virtual Environments
NASA Technical Reports Server (NTRS)
Noor, Ahmed K. (Compiler)
1995-01-01
The proceedings of the Workshop on Human-Computer Interaction and Virtual Environments are presented along with a list of attendees. The objectives of the workshop were to assess the state-of-technology and level of maturity of several areas in human-computer interaction and to provide guidelines for focused future research leading to effective use of these facilities in the design/fabrication and operation of future high-performance engineering systems.
Algorithms for adaptive stochastic control for a class of linear systems
NASA Technical Reports Server (NTRS)
Toda, M.; Patel, R. V.
1977-01-01
Control of linear, discrete time, stochastic systems with unknown control gain parameters is discussed. Two suboptimal adaptive control schemes are derived: one is based on underestimating future control and the other is based on overestimating future control. Both schemes require little on-line computation and incorporate in their control laws some information on estimation errors. The performance of these laws is studied by Monte Carlo simulations on a computer. Two single input, third order systems are considered, one stable and the other unstable, and the performance of the two adaptive control schemes is compared with that of the scheme based on enforced certainty equivalence and the scheme where the control gain parameters are known.
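To make the certainty-equivalence idea mentioned above concrete, here is a minimal sketch, assuming a scalar discrete-time system with hypothetical parameter values and written in Python/NumPy rather than the original Monte Carlo code: the unknown control gain is estimated by recursive least squares and the controller then acts as if that estimate were exact.

```python
import numpy as np

# Minimal sketch (hypothetical parameters): certainty-equivalence adaptive control of a
# scalar discrete-time system x[k+1] = a*x[k] + b*u[k] + w[k] with unknown gain b.
# The controller estimates b by recursive least squares and then acts as if the
# estimate were exact, which is the enforced-certainty-equivalence idea the abstract
# uses as a comparison scheme.
rng = np.random.default_rng(0)
a, b_true = 0.9, 2.0           # true dynamics (b_true is unknown to the controller)
sigma_w = 0.1                  # process noise standard deviation
b_hat, P = 0.5, 10.0           # initial gain estimate and its variance
x = 1.0
for k in range(50):
    u = -a * x / b_hat                       # certainty-equivalence deadbeat control
    x_next = a * x + b_true * u + sigma_w * rng.standard_normal()
    y = x_next - a * x                       # measured contribution of b*u (plus noise)
    K = P * u / (1.0 + P * u * u)            # recursive least-squares gain
    b_hat += K * (y - b_hat * u)
    P = P * (1.0 - K * u) + 1e-6             # small floor keeps the estimator alive
    x = x_next
print(f"final state {x:+.4f}, gain estimate {b_hat:.3f} (true {b_true})")
```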
College students and computers: assessment of usage patterns and musculoskeletal discomfort.
Noack-Cooper, Karen L; Sommerich, Carolyn M; Mirka, Gary A
2009-01-01
A limited number of studies have focused on computer-use-related MSDs in college students, though risk factor exposure may be similar to that of workers who use computers. This study examined computer use patterns of college students and made comparisons to a group of previously studied computer-using professionals. 234 students completed a web-based questionnaire concerning computer use habits and physical discomfort that respondents specifically associated with computer use. As a group, students reported their computer use to be at least 'Somewhat likely' 18 out of 24 h/day, compared to 12 h for the professionals. Students reported more uninterrupted work behaviours than the professionals. Younger graduate students reported 33.7 average weekly computing hours, similar to the hours reported by younger professionals. Students generally reported more frequent upper extremity discomfort than the professionals. Frequent assumption of awkward postures was associated with frequent discomfort. The findings signal a need for intervention, including training and education, prior to entry into the workforce. Students are future workers, so it is important to determine whether their increasing exposure to computers, prior to entering the workforce, may cause them to enter already injured or not to enter their chosen profession due to upper extremity MSDs.
Computational Intelligence and Its Impact on Future High-Performance Engineering Systems
NASA Technical Reports Server (NTRS)
Noor, Ahmed K. (Compiler)
1996-01-01
This document contains presentations from the joint UVA/NASA Workshop on Computational Intelligence held at the Virginia Consortium of Engineering and Science Universities, Hampton, Virginia, June 27-28, 1995. The presentations addressed activities in the areas of fuzzy logic, neural networks, and evolutionary computations. Workshop attendees represented NASA, the National Science Foundation, the Department of Energy, National Institute of Standards and Technology (NIST), the Jet Propulsion Laboratory, industry, and academia. The workshop objectives were to assess the state of technology in the Computational intelligence area and to provide guidelines for future research.
Coastal Storm Hazards from Virginia to Maine
2015-11-01
... study, storm surge, tide, waves, wind, atmospheric pressure, and currents were the dominant storm responses computed. The effect of sea level change on ... coastal storm hazards and vulnerability nationally (USACE 2015). NACCS goals also included evaluating the effect of future sea level change (SLC) on ... the computed high-fidelity responses included storm surge, astronomical tide, waves, wave effects on water levels, storm duration, wind, and currents.
Report of the Army Science Board Summer Study on Installations 2025
2009-12-01
... stresses, behavioral health problems, and injuries associated with war. Transform: IMCOM is modernizing installation management processes, policies ... well. For example, "Prediction is very difficult, especially about the future" (Niels Bohr). Others stress that the future will be a lot like the ... "homogenization" of society; endangered species; continuous and ubiquitous computing; islanding; telecommuting; wireless proliferation across appliances.
COMPUGIRLS: Stepping Stone to Future Computer-Based Technology Pathways
ERIC Educational Resources Information Center
Lee, Jieun; Husman, Jenefer; Scott, Kimberly A.; Eggum-Wilkens, Natalie D.
2015-01-01
The COMPUGIRLS: Culturally relevant technology program for adolescent girls was developed to promote underrepresented girls' future possible selves and career pathways in computer-related technology fields. We hypothesized that the COMPUGIRLS would promote academic possible selves and self-regulation to achieve these possible selves. We compared…
Manufacturing Magic and Computational Creativity
Williams, Howard; McOwan, Peter W.
2016-01-01
This paper describes techniques in computational creativity, blending mathematical modeling and psychological insight, to generate new magic tricks. The details of an explicit computational framework capable of creating new magic tricks are summarized, and evaluated against a range of contemporary theories about what constitutes a creative system. To allow further development of the proposed system we situate this approach to the generation of magic in the wider context of other areas of application in computational creativity in performance arts. We show how approaches in these domains could be incorporated to enhance future magic generation systems, and critically review possible future applications of such magic generating computers. PMID:27375533
NASA Technical Reports Server (NTRS)
Gorospe, George E., Jr.; Daigle, Matthew J.; Sankararaman, Shankar; Kulkarni, Chetan S.; Ng, Eley
2017-01-01
Prognostic methods enable operators and maintainers to predict the future performance for critical systems. However, these methods can be computationally expensive and may need to be performed each time new information about the system becomes available. In light of these computational requirements, we have investigated the application of graphics processing units (GPUs) as a computational platform for real-time prognostics. Recent advances in GPU technology have reduced cost and increased the computational capability of these highly parallel processing units, making them more attractive for the deployment of prognostic software. We present a survey of model-based prognostic algorithms with considerations for leveraging the parallel architecture of the GPU and a case study of GPU-accelerated battery prognostics with computational performance results.
Time Triggered Protocol (TTP) for Integrated Modular Avionics
NASA Technical Reports Server (NTRS)
Motzet, Guenter; Gwaltney, David A.; Bauer, Guenther; Jakovljevic, Mirko; Gagea, Leonard
2006-01-01
Traditional avionics computing systems are federated, with each system provided on a number of dedicated hardware units. Federated applications are physically separated from one another and analysis of the systems is undertaken individually. Integrated Modular Avionics (IMA) takes these federated functions and integrates them on a common computing platform in a tightly deterministic distributed real-time network of computing modules in which the different applications can run. IMA supports different levels of criticality in the same computing resource and provides a platform for implementation of fault tolerance through hardware and application redundancy. Modular implementation has distinct benefits in design, testing and system maintainability. This paper covers the requirements for fault tolerant bus systems used to provide reliable communication between IMA computing modules. An overview of the Time Triggered Protocol (TTP) specification and its implementation as a reliable solution for IMA systems is presented. Application examples in aircraft avionics and a development system for future space applications are covered. The commercially available TTP controller can also be implemented in an FPGA, and results from implementation studies are covered. Finally, future directions for the application of TTP and related development activities are presented.
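Conceptually, a time-triggered bus organizes access as a static, table-driven TDMA schedule in which every node transmits only in its pre-assigned slot. The sketch below, with hypothetical node names and slot lengths and written in Python rather than any avionics toolchain, illustrates that deterministic round structure; it is not an implementation of the TTP/C protocol itself, which adds clock synchronization, membership, and bus-guardian services.

```python
from dataclasses import dataclass

# Illustrative sketch only: a toy TDMA round in the spirit of a time-triggered bus.
# Node names, slot lengths, and the number of rounds are made-up placeholders.
@dataclass
class Slot:
    node: str
    length_us: int      # statically allocated transmission window

SCHEDULE = [Slot("flight_ctrl", 200), Slot("nav", 150),
            Slot("actuator", 150), Slot("monitor", 100)]

def run_rounds(n_rounds: int) -> None:
    t = 0
    for r in range(n_rounds):
        for slot in SCHEDULE:
            # Each node transmits only inside its pre-assigned window, so bus access
            # is deterministic and collision-free by construction.
            print(f"t={t:5d} us  round={r}  {slot.node} transmits for {slot.length_us} us")
            t += slot.length_us

run_rounds(2)
```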
Adoption of computer-assisted learning in medical education: the educators' perspective.
Schifferdecker, Karen E; Berman, Norm B; Fall, Leslie H; Fischer, Martin R
2012-11-01
Computer-assisted learning (CAL) in medical education has been shown to be effective in the achievement of learning outcomes, but requires the input of significant resources and development time. This study examines the key elements and processes that led to the widespread adoption of a CAL program in undergraduate medical education, the Computer-assisted Learning in Paediatrics Program (CLIPP). It then considers the relative importance of elements drawn from existing theories and models for technology adoption and other studies on CAL in medical education to inform the future development, implementation and testing of CAL programs in medical education. The study used a mixed-methods explanatory design. All paediatric clerkship directors (CDs) using CLIPP were recruited to participate in a self-administered, online questionnaire. Semi-structured interviews were then conducted with a random sample of CDs to further explore the quantitative results. Factors that facilitated adoption included CLIPP's ability to fill gaps in exposure to core clinical problems, the use of a national curriculum, development by CDs, and the meeting of CDs' desires to improve teaching and student learning. An additional facilitating factor was that little time and effort were needed to implement CLIPP within a clerkship. The quantitative findings were mostly corroborated by the qualitative findings. This study indicates issues that are important in the consideration and future exploration of the development and implementation of CAL programs in medical education. The promise of CAL as a method of enhancing the process and outcomes of medical education, and its cost, increase the need for future CAL funders and developers to pay equal attention to the needs of potential adopters and the development process as they do to the content and tools in the CAL program. Important questions that remain on the optimal design, use and integration of CAL should be addressed in order to adequately inform future development. Support is needed for studies that address these critical areas. © Blackwell Publishing Ltd 2012.
ADP Analysis project for the Human Resources Management Division
NASA Technical Reports Server (NTRS)
Tureman, Robert L., Jr.
1993-01-01
The ADP (Automated Data Processing) Analysis Project was conducted for the Human Resources Management Division (HRMD) of NASA's Langley Research Center. The three major areas of work in the project were computer support, automated inventory analysis, and an ADP study for the Division. The goal of the computer support work was to determine automation needs of Division personnel and help them solve computing problems. The goal of automated inventory analysis was to find a way to analyze installed software and usage on a Macintosh. Finally, the ADP functional systems study for the Division was designed to assess future HRMD needs concerning ADP organization and activities.
Marotta, Phillip L; Voisin, Dexter R
2017-10-01
The following study assessed whether future orientation mediated the effects of peer norms and parental monitoring on delinquency and substance use among 549 African American adolescents. Structural equation modeling computed direct and indirect (mediational) relationships between parental monitoring and peer norms through future orientation. Parental monitoring significantly correlated with lower delinquency through future orientation (B = -.05, standard deviation = .01, p < .01). Future orientation mediated more than a quarter (27.70%) of the total effect of parental monitoring on delinquency. Overall findings underscore the importance of strengthening resilience factors for African American youth, especially those who live in low-income communities.
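For readers unfamiliar with the proportion-mediated statistic cited above, a minimal sketch follows. The indirect effect (-0.05) is taken from the abstract; the direct-effect value is a hypothetical placeholder chosen only so that the ratio reproduces the reported ~27.7%, and the code is not the authors' structural equation model.

```python
# Minimal sketch of the proportion-mediated computation used in mediation analyses:
# proportion = indirect effect / total effect, where total = direct + indirect.
indirect = -0.05     # monitoring -> future orientation -> delinquency (from the abstract)
direct = -0.1305     # assumed direct path, not reported in this excerpt
total = direct + indirect
print(f"proportion mediated = {indirect / total:.1%}")   # ~27.7%
```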
Heterogeneous High Throughput Scientific Computing with APM X-Gene and Intel Xeon Phi
NASA Astrophysics Data System (ADS)
Abdurachmanov, David; Bockelman, Brian; Elmer, Peter; Eulisse, Giulio; Knight, Robert; Muzaffar, Shahzad
2015-05-01
Electrical power requirements will be a constraint on the future growth of Distributed High Throughput Computing (DHTC) as used by High Energy Physics. Performance-per-watt is a critical metric for the evaluation of computer architectures for cost-efficient computing. Additionally, future performance growth will come from heterogeneous, many-core, and high computing density platforms with specialized processors. In this paper, we examine the Intel Xeon Phi Many Integrated Cores (MIC) co-processor and Applied Micro X-Gene ARMv8 64-bit low-power server system-on-a-chip (SoC) solutions for scientific computing applications. We report our experience on software porting, performance and energy efficiency and evaluate the potential for use of such technologies in the context of distributed computing systems such as the Worldwide LHC Computing Grid (WLCG).
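As a simple illustration of the performance-per-watt metric emphasized above, the sketch below divides a throughput figure by a power draw for two platforms; the platform names and numbers are placeholders, not measurements of X-Gene or Xeon Phi hardware.

```python
# Toy comparison of performance-per-watt. Throughput and power figures are made up
# for illustration only.
platforms = {
    "low_power_soc":   {"events_per_s": 1200.0, "watts": 45.0},
    "manycore_coproc": {"events_per_s": 5200.0, "watts": 230.0},
}
for name, p in platforms.items():
    ppw = p["events_per_s"] / p["watts"]   # higher is more energy efficient
    print(f"{name:15s}: {ppw:6.1f} events/s per watt")
```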
Intentions of hospital nurses to work with computers: based on the theory of planned behavior.
Shoham, Snunith; Gonen, Ayala
2008-01-01
The purpose of this study was to determine registered nurses' attitudes related to intent to use computers in the hospital setting as a predictor of their future behavior. The study was further aimed at identifying the relationship between these attitudes and selected sociological, professional, and personal factors and to describe a research model integrating these various factors. The study was based on the theory of planned behavior. A random sample of 411 registered nurses was selected from a single large medical center in Israel. The study tool was a Likert-style questionnaire. Nine different indices were used: (1) behavioral intention toward computer use; (2) general attitudes toward computer use; (3) nursing attitudes toward computer use; (4) threat involved in computer use; (5) challenge involved in computer use; (6) organizational climate; (7) departmental climate; (8) attraction to technological innovations/innovativeness; (9) self-efficacy, ability to control behavior. Strong significant positive correlations were found between the nurses' attitudes (general attitudes and nursing attitudes), self-efficacy, innovativeness, and intentions to use computers. Higher correlations were found between departmental climate and attitudes than between organizational climate and attitudes. The threat and challenge that are involved in computer use were shown as important mediating variables to the understanding of the process of predicting attitudes and intentions toward using computers.
Mass storage system experiences and future needs at the National Center for Atmospheric Research
NASA Technical Reports Server (NTRS)
Olear, Bernard T.
1992-01-01
This presentation is designed to relate some of the experiences of the Scientific Computing Division at NCAR dealing with the 'data problem'. A brief history and a development of some basic Mass Storage System (MSS) principles are given. An attempt is made to show how these principles apply to the integration of various components into NCAR's MSS. There is discussion of future MSS needs for future computing environments.
Prospective Optimization with Limited Resources
Snider, Joseph; Lee, Dongpyo; Poizner, Howard; Gepshtein, Sergei
2015-01-01
The future is uncertain because some forthcoming events are unpredictable and also because our ability to foresee the myriad consequences of our own actions is limited. Here we studied how humans select actions under such extrinsic and intrinsic uncertainty, in view of an exponentially expanding number of prospects on a branching multivalued visual stimulus. A triangular grid of disks of different sizes scrolled down a touchscreen at a variable speed. The larger disks represented larger rewards. The task was to maximize the cumulative reward by touching one disk at a time in a rapid sequence, forming an upward path across the grid, while every step along the path constrained the part of the grid accessible in the future. This task captured some of the complexity of natural behavior in the risky and dynamic world, where ongoing decisions alter the landscape of future rewards. By comparing human behavior with behavior of ideal actors, we identified the strategies used by humans in terms of how far into the future they looked (their “depth of computation”) and how often they attempted to incorporate new information about the future rewards (their “recalculation period”). We found that, for a given task difficulty, humans traded off their depth of computation for the recalculation period. The form of this tradeoff was consistent with a complete, brute-force exploration of all possible paths up to a resource-limited finite depth. A step-by-step analysis of the human behavior revealed that participants took into account very fine distinctions between the future rewards and that they abstained from some simple heuristics in assessment of the alternative paths, such as seeking only the largest disks or avoiding the smaller disks. The participants preferred to reduce their depth of computation or increase the recalculation period rather than sacrifice the precision of computation. PMID:26367309
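A crude way to picture the "depth of computation" versus "recalculation period" tradeoff described above is to code an actor that, at every step, exhaustively evaluates all paths a fixed number of rows ahead and then replans. The sketch below uses a hypothetical triangular grid of rewards and a two-way branching rule; it is a simplification for illustration, not the stimulus or the ideal-actor models used in the study.

```python
import random

# Hypothetical triangular grid of rewards: row r has r+1 "disks"; from (row, col) the
# actor may step to (row+1, col) or (row+1, col+1).
random.seed(1)
N_ROWS = 12
grid = [[random.randint(1, 9) for _ in range(r + 1)] for r in range(N_ROWS)]

def best_path_value(row: int, col: int, depth: int) -> int:
    """Brute-force value of the best path from (row, col), looking `depth` rows ahead."""
    if depth == 0 or row + 1 >= N_ROWS:
        return grid[row][col]
    children = [(row + 1, col), (row + 1, col + 1)]
    return grid[row][col] + max(best_path_value(r, c, depth - 1) for r, c in children)

# Step-by-step walk that recalculates the depth-limited plan at every move
# (i.e., a recalculation period of one step with a depth of computation of three).
row, col, collected = 0, 0, grid[0][0]
while row + 1 < N_ROWS:
    row, col = max(((row + 1, col), (row + 1, col + 1)),
                   key=lambda rc: best_path_value(rc[0], rc[1], depth=3))
    collected += grid[row][col]
print("reward collected with depth-3 lookahead:", collected)
```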
NASA Astrophysics Data System (ADS)
Huq, E.; Abdul-Aziz, O. I.
2017-12-01
We computed historical and future storm runoff scenarios for the Shingle Creek Basin, which includes the growing urban centers of central Florida (e.g., the City of Orlando). The US EPA Storm Water Management Model (SWMM 5.1) was used to develop a mechanistic hydrologic model for the basin by incorporating components of urban hydrology, hydroclimatological variables, and land use/cover features. The model was calibrated and validated with historical streamflow of 2004-2013 near the outlet of Shingle Creek. The calibrated model was used to compute the sensitivities of the stormwater budget to reference changes in hydroclimatological variables (rainfall and evapotranspiration) and land use/cover features (imperviousness, roughness). Basin stormwater budgets for the historical (2010s = 2004-2013) and future periods (2050s = 2030-2059; 2080s = 2070-2099) were also computed based on downscaled climatic projections of 20 GCMs-RCMs representing the Coupled Model Intercomparison Project (CMIP5) and anticipated changes in land use/cover. The sensitivity analyses indicated the dominant drivers of urban runoff in the basin. Comparative assessment of the historical and future stormwater runoff scenarios helped to locate basin areas that would be at a higher risk of future stormwater flooding. The importance of the study lies in providing valuable guidelines for managing stormwater flooding in central Florida and similar growing urban centers around the world.
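The reference-change sensitivity analysis described above can be sketched generically: perturb one normalized driver at a time and report the relative change in simulated runoff. The toy run_model() below is a stand-in for the calibrated basin model (SWMM itself is not called here), and its coefficients are illustrative assumptions only.

```python
# Simple relative-sensitivity sketch: one-at-a-time +10% reference changes in the
# drivers, reported as percent change in annual runoff. All numbers are illustrative.
def run_model(rainfall=1.0, evapotranspiration=1.0, imperviousness=1.0):
    """Toy normalized annual runoff as a function of normalized drivers."""
    return 0.9 * rainfall * (0.4 + 0.5 * imperviousness) - 0.25 * evapotranspiration

base = run_model()
for driver in ("rainfall", "evapotranspiration", "imperviousness"):
    perturbed = run_model(**{driver: 1.10})                 # +10% reference change
    sensitivity = 100.0 * (perturbed - base) / base
    print(f"+10% {driver:18s} -> {sensitivity:+5.1f}% change in runoff")
```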
A Man-Machine System for Contemporary Counseling Practice: Diagnosis and Prediction.
ERIC Educational Resources Information Center
Roach, Arthur J.
This paper looks at present and future capabilities for diagnosis and prediction in computer-based guidance efforts and reviews the problems and potentials which will accompany the implementation of such capabilities. In addition to necessary procedural refinement in prediction, future developments in computer-based educational and career…
Robotics, Artificial Intelligence, Computer Simulation: Future Applications in Special Education.
ERIC Educational Resources Information Center
Moore, Gwendolyn B.; And Others
The report describes three advanced technologies--robotics, artificial intelligence, and computer simulation--and identifies the ways in which they might contribute to special education. A hybrid methodology was employed to identify existing technology and forecast future needs. Following this framework, each of the technologies is defined,…
Howard, Matt C
2014-10-01
Computer self-efficacy is an often studied construct that has been shown to be related to an array of important individual outcomes. Unfortunately, existing measures of computer self-efficacy suffer from several deficiencies, including criterion contamination, outdated wording, and/or inadequate psychometric properties. For this reason, the current article presents the creation of a new computer self-efficacy measure. In Study 1, an over-representative item list is created and subsequently reduced through exploratory factor analysis to create an initial measure, and the discriminant validity of this initial measure is tested. In Study 2, the unidimensional factor structure of the initial measure is supported through confirmatory factor analysis and further reduced into a final, 12-item measure. In Study 3, the convergent and criterion validity of the 12-item measure is tested. Overall, this three study process demonstrates that the new computer self-efficacy measure has superb psychometric properties and internal reliability, and demonstrates excellent evidence for several aspects of validity. It is hoped that the 12-item computer self-efficacy measure will be utilized in future research on computer self-efficacy, which is discussed in the current article.
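As a rough illustration of the item-reduction step described above, the sketch below fits a one-factor model to simulated Likert-style responses with scikit-learn's FactorAnalysis and keeps items whose loadings exceed a cutoff; the data, the 0.40 cutoff, and the library choice are assumptions for demonstration, not the authors' EFA/CFA workflow.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

# Simulated responses: one latent trait drives 20 items with varying loadings plus noise.
rng = np.random.default_rng(7)
n_respondents, n_items = 300, 20
latent = rng.standard_normal(n_respondents)
loadings_true = rng.uniform(0.2, 0.9, n_items)
X = np.outer(latent, loadings_true) + 0.6 * rng.standard_normal((n_respondents, n_items))

fa = FactorAnalysis(n_components=1).fit(X)
loadings = fa.components_[0]                  # estimated loading of each item on the factor
keep = np.where(np.abs(loadings) >= 0.4)[0]   # retain items loading at least .40
print(f"retained {keep.size} of {n_items} items:", keep.tolist())
```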
Computational Methods Development at Ames
NASA Technical Reports Server (NTRS)
Kwak, Dochan; Smith, Charles A. (Technical Monitor)
1998-01-01
This viewgraph presentation outlines the development at Ames Research Center of advanced computational methods to provide appropriate-fidelity computational analysis/design capabilities. Current thrusts of the Ames research include: 1) methods to enhance/accelerate viscous flow simulation procedures, and the development of hybrid/polyhedral-grid procedures for viscous flow; 2) the development of real-time transonic flow simulation procedures for a production wind tunnel, and intelligent data management technology; and 3) the validation of methods and flow physics studies. The presentation gives historical precedents for the above research and speculates on its future course.
An Initial Multi-Domain Modeling of an Actively Cooled Structure
NASA Technical Reports Server (NTRS)
Steinthorsson, Erlendur
1997-01-01
A methodology for the simulation of turbine cooling flows is being developed. The methodology seeks to combine numerical techniques that optimize both accuracy and computational efficiency. Key components of the methodology include the use of multiblock grid systems for modeling complex geometries and multigrid convergence acceleration for enhancing computational efficiency in highly resolved fluid flow simulations. The use of the methodology has been demonstrated in several turbomachinery flow and heat transfer studies. Ongoing and future work involves implementing additional turbulence models, improving computational efficiency, and adding adaptive mesh refinement (AMR).
The possible usability of three-dimensional cone beam computed dental tomography in dental research
NASA Astrophysics Data System (ADS)
Yavuz, I.; Rizal, M. F.; Kiswanjaya, B.
2017-08-01
The innovations and advantages of three-dimensional cone beam computed dental tomography (3D CBCT) are continually growing, as is its potential for use in dental research. Imaging techniques are important for planning research in dentistry. Newly improved 3D CBCT imaging systems and accessory computer programs have recently been proven effective for use in dental research. The aim of this study is to introduce 3D CBCT and open a window onto future research possibilities that should be given attention in dental research.
Effect of computer game playing on baseline laparoscopic simulator skills.
Halvorsen, Fredrik H; Cvancarova, Milada; Fosse, Erik; Mjåland, Odd
2013-08-01
Studies examining the possible association between computer game playing and laparoscopic performance in general have yielded conflicting results, and a relationship between computer game playing and baseline performance on laparoscopic simulators has not been established. The aim of this study was to examine the possible association between previous and present computer game playing and baseline performance on a virtual reality laparoscopic simulator in a sample of potential future medical students. The participating students completed a questionnaire covering the weekly amount and type of computer game playing activity during the previous year and 3 years ago. They then performed 2 repetitions of 2 tasks ("gallbladder dissection" and "traverse tube") on a virtual reality laparoscopic simulator. Performance on the simulator was then analyzed for association with their computer game experience. Setting: a local high school in Norway. Forty-eight students from 2 high school classes volunteered to participate in the study. No association between prior and present computer game playing and baseline performance was found. The results were similar both for prior and present action game playing and for prior and present computer game playing in general. Our results indicate that prior and present computer game playing may not affect baseline performance in a virtual reality simulator.
Wagg, Amanda J; Callanan, Margie M; Hassett, Alexander
2018-03-16
The aim of this study is to explore how computer mediated communication has been used by a variety of healthcare professionals to support their patients and to discuss the implications that this may have for future practice. A systematized review of the literature. A review of empirical studies within the literature was carried out in April 2016 in the CINAHL, MEDLINE, ASSIA, BNI, PsycINFO, and Web of Science databases. The database searches produced 2930 titles, of which 190 publications were considered relevant to the objectives. Titles and abstracts were then reviewed and duplicates removed, producing 67 publications. Exclusion and inclusion criteria were applied. The inclusion criteria were (1) interventions that facilitate two-way communication between any healthcare professional and their patients via a computer; (2) interventions aimed at providing any type of support, e.g. emotional, tangible, informational, or esteem support; (3) English language; (4) primary empirical studies. Data quality was assessed and thematic analysis applied. Thirty-one publications were included in this study. Intervention types included email (n = 8), videoconferencing (n = 7), online social support groups (n = 9) and multifaceted interventions (n = 7). Three themes emerged from the data: increasing access to healthcare, adding value to healthcare delivery, and improving patient outcomes. Twenty-five (81%) of the studies found that computer mediated communication could produce positive effects. Computer mediated communication could be both what patients want and a way of delivering support to patients in a resource-tight environment. This has implications for a range of health support needs and professionals, including nurses, midwives and allied healthcare professionals. Reviewing the lessons learnt will ensure future interventions are tailored to the support needs of the patients, carefully planned, and mindful of the risks. Copyright © 2018 Elsevier Ltd. All rights reserved.
Computer analysis of Holter electrocardiogram.
Yanaga, T; Adachi, M; Sato, Y; Ichimaru, Y; Otsuka, K
1994-10-01
Computer analysis is indispensable for the interpretation of the Holter ECG, because it involves a large quantity of data. Computer analysis of the Holter ECG is similar to that of the conventional ECG; however, it presents some difficulties, such as frequent noise, limited analysis time, and voluminous data. The main topics in computer analysis of the Holter ECG will be arrhythmias, ST-T changes, heart rate variability, QT interval, late potentials and construction of databases. Although many papers have been published on the computer analysis of the Holter ECG, some of them are reviewed briefly in the present paper. We have studied computer analysis of VPCs, ST-T changes, heart rate variability, QT interval and Cheyne-Stokes respiration during 24-hour ambulatory ECG monitoring. Further, we have studied ambulatory palmar sweating for the evaluation of mental stress during the day. In the future, the development of "the integrated Holter system", which enables the evaluation of ventricular vulnerability and modulating factors such as psychoneural hypersensitivity, may be important.
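Two of the time-domain heart rate variability statistics routinely reported by Holter software, SDNN and RMSSD, reduce to simple formulas over the RR-interval series. The sketch below computes them on a synthetic RR series; real Holter analysis would first perform QRS detection, artifact rejection, and 24-hour segmentation, all omitted here.

```python
import numpy as np

# Synthetic RR-interval series (milliseconds) with a slow oscillation plus noise.
rng = np.random.default_rng(3)
rr_ms = 800 + 40 * np.sin(np.linspace(0, 20 * np.pi, 3000)) + 15 * rng.standard_normal(3000)

sdnn = np.std(rr_ms, ddof=1)                    # overall variability of RR intervals
rmssd = np.sqrt(np.mean(np.diff(rr_ms) ** 2))   # short-term, beat-to-beat variability
mean_hr = 60000.0 / rr_ms.mean()                # beats per minute
print(f"mean HR {mean_hr:.1f} bpm, SDNN {sdnn:.1f} ms, RMSSD {rmssd:.1f} ms")
```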
Malta Barbosa, João; Tovar, Nick; A Tuesta, Pablo; Hirata, Ronaldo; Guimarães, Nuno; Romanini, José C; Moghadam, Marjan; Coelho, Paulo G; Jahangiri, Leila
2017-07-08
This work aims to present a pilot study of a non-destructive dental histo-anatomical analysis technique, as well as to push the boundaries of the presently available restorative workflows for the fabrication of highly customized ceramic restorations. An extracted human maxillary central incisor was subjected to a micro computed tomography scan, and the acquired data were transferred to a workstation, reconstructed, segmented, evaluated and later imported into a Computer-Aided Design/Computer-Aided Manufacturing software package for the fabrication of a ceramic resin-bonded prosthesis. The obtained prosthesis presented encouraging optical behavior and was used clinically as the final restoration. The digitally layered restorative replication of natural tooth morphology presents today as a clear possibility. New clinical and laboratory-fabricated, biologically inspired digital restorative protocols are to be expected in the near future. This pilot study may represent a stimulus for future research and applications of digital imaging as well as digital restorative workflows in the service of esthetic dentistry. © 2017 Wiley Periodicals, Inc.
Optimum spaceborne computer system design by simulation
NASA Technical Reports Server (NTRS)
Williams, T.; Weatherbee, J. E.; Taylor, D. S.
1972-01-01
A deterministic digital simulation model is described which models the Automatically Reconfigurable Modular Multiprocessor System (ARMMS), a candidate computer system for future manned and unmanned space missions. Use of the model as a tool in configuring a minimum computer system for a typical mission is demonstrated. The configuration which is developed as a result of studies with the simulator is optimal with respect to the efficient use of computer system resources, i.e., the configuration derived is a minimal one. Other considerations such as increased reliability through the use of standby spares would be taken into account in the definition of a practical system for a given mission.
Information Technology and the Social Studies.
ERIC Educational Resources Information Center
Searles, John E.
1983-01-01
The information revolution is making various impacts on social studies. Students are children of this age and are learning social ideas from technology. The information revolution should be part of the social studies curriculum. Unresolved questions (e.g., Who should write computer software?) and some thoughts on the future are discussed. (RM)
Navier-Stokes computations useful in aircraft design
NASA Technical Reports Server (NTRS)
Holst, Terry L.
1990-01-01
Large scale Navier-Stokes computations about aircraft components as well as reasonably complete aircraft configurations are presented and discussed. Speed and memory requirements are described for various general problem classes, which in some cases are already being used in the industrial design environment. Recent computed results, with experimental comparisons when available, are included to highlight the presentation. Finally, prospects for the future are described and recommendations for areas of concentrated research are indicated. The future of Navier-Stokes computations is seen to be rapidly expanding across a broad front of applications, which includes the entire subsonic-to-hypersonic speed regime.
Mentat: An object-oriented macro data flow system
NASA Technical Reports Server (NTRS)
Grimshaw, Andrew S.; Liu, Jane W. S.
1988-01-01
Mentat, an object-oriented macro data flow system designed to facilitate parallelism in distributed systems, is presented. The macro data flow model is a model of computation similar to the data flow model with two principal differences: the computational complexity of the actors is much greater than in traditional data flow systems, and there are persistent actors that maintain state information between executions. Mentat is a system that combines the object-oriented programming paradigm and the macro data flow model of computation. Mentat programs use a dynamic structure called a future list to represent the future of computations.
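Mentat's future list has a loose modern analogue in the futures provided by today's standard libraries: asynchronous actor invocations return placeholders that are only forced when a downstream computation needs the value. The sketch below illustrates the idea with Python's concurrent.futures and a placeholder workload; it is not Mentat's runtime or programming model.

```python
from concurrent.futures import ThreadPoolExecutor

# Loose analogue of a "future list": results of asynchronous actor invocations are
# collected as futures and forced only when needed. The actor here is a placeholder.
def actor(x: int) -> int:
    """Stand-in for a coarse-grained macro data flow actor."""
    return x * x

with ThreadPoolExecutor(max_workers=4) as pool:
    future_list = [pool.submit(actor, i) for i in range(8)]   # fire off all invocations
    # Downstream computation proceeds only when each future's value is demanded.
    results = [f.result() for f in future_list]

print("results gathered from the future list:", results)
```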
US computer research networks: Current and future
NASA Technical Reports Server (NTRS)
Kratochvil, D.; Sood, D.; Verostko, A.
1989-01-01
During the last decade, NASA LeRC's Communication Program has conducted a series of telecommunications forecasting studies to project trends and requirements and to identify critical telecommunications technologies that must be developed to meet future requirements. The Government Networks Division of Contel Federal Systems has assisted NASA in these studies, and the current study builds upon these earlier efforts. The current major thrust of the NASA Communications Program is aimed at developing the high risk, advanced, communications satellite and terminal technologies required to significantly increase the capacity of future communications systems. Also, major new technological, economic, and social-political events and trends are now shaping the communications industry of the future. Therefore, a re-examination of future telecommunications needs and requirements is necessary to enable NASA to make management decisions in its Communications Program and to ensure the proper technologies and systems are addressed. This study, through a series of Task Orders, is helping NASA define the likely communication service needs and requirements of the future and thereby ensuring that the most appropriate technology developments are pursued.
Computers--Teaching, Technology, and Applications.
ERIC Educational Resources Information Center
Cocco, Anthony M.; And Others
1995-01-01
Includes "Managing Personality Types in the Computer Classroom" (Cocco); "External I/O Input/Output with a PC" (Fryda); "The Future of CAD/CAM Computer-Assisted Design/Computer-Assisted Manufacturing Software" (Fulton); and "Teaching Quality Assurance--A Laboratory Approach" (Wojslaw). (SK)
Heterogeneous high throughput scientific computing with APM X-Gene and Intel Xeon Phi
Abdurachmanov, David; Bockelman, Brian; Elmer, Peter; ...
2015-05-22
Electrical power requirements will be a constraint on the future growth of Distributed High Throughput Computing (DHTC) as used by High Energy Physics. Performance-per-watt is a critical metric for the evaluation of computer architectures for cost-efficient computing. Additionally, future performance growth will come from heterogeneous, many-core, and high computing density platforms with specialized processors. In this paper, we examine the Intel Xeon Phi Many Integrated Cores (MIC) co-processor and Applied Micro X-Gene ARMv8 64-bit low-power server system-on-a-chip (SoC) solutions for scientific computing applications. As a result, we report our experience on software porting, performance and energy efficiency and evaluate the potential for use of such technologies in the context of distributed computing systems such as the Worldwide LHC Computing Grid (WLCG).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chiang, Patrick
2014-01-31
The research goal of this CAREER proposal is to develop energy-efficient, VLSI interconnect circuits and systems that will facilitate future massively-parallel, high-performance computing. Extreme-scale computing will exhibit massive parallelism on multiple vertical levels, from thousands of computational units on a single processor to thousands of processors in a single data center. Unfortunately, the energy required to communicate between these units at every level (on chip, off-chip, off-rack) will be the critical limitation to energy efficiency. Therefore, the PI's career goal is to become a leading researcher in the design of energy-efficient VLSI interconnect for future computing systems.
Computational chemistry in pharmaceutical research: at the crossroads.
Bajorath, Jürgen
2012-01-01
Computational approaches are an integral part of pharmaceutical research. However, there are many unsolved key questions that limit scientific progress in the still-evolving computational field and its impact on drug discovery. Importantly, a number of these questions are not new but date back many years. Hence, it might be difficult to conclusively answer them in the foreseeable future. Moreover, the computational field as a whole is characterized by a high degree of heterogeneity, and so, unfortunately, is the quality of its scientific output. In light of this situation, it is proposed that changes in scientific standards and culture should be seriously considered now in order to lay a foundation for future progress in computational research.
Online Activity Levels Are Related to Caffeine Dependency.
Phillips, James G; Landhuis, C Erik; Shepherd, Daniel; Ogeil, Rowan P
2016-05-01
Online activity could serve in the future as behavioral markers of emotional states for computer systems (i.e., affective computing). Hence, this study considered relationships between self-reported stimulant use and online study patterns. Sixty-two undergraduate psychology students estimated their daily caffeine use, and this was related to study patterns as tracked by their use of a Learning Management System (Blackboard). Caffeine dependency was associated with less time spent online, lower rates of file access, and fewer online activities completed. Reduced breadth or depth of processing during work/study could be used as a behavioral marker of stimulant use.
Student Computer Dialogs Without Special Purpose Languages.
ERIC Educational Resources Information Center
Bork, Alfred
The phrase "student computer dialogs" refers to interactive sessions between the student and the computer. Rather than using programing languages specifically designed for computer assisted instruction (CAI), existing general purpose languages should be emphasized in the future development of student computer dialogs, as the power and…
A literature review of neck pain associated with computer use: public health implications
Green, Bart N
2008-01-01
Prolonged use of computers during daily work activities and recreation is often cited as a cause of neck pain. This review of the literature identifies public health aspects of neck pain as associated with computer use. While some retrospective studies support the hypothesis that frequent computer operation is associated with neck pain, few prospective studies reveal causal relationships. Many risk factors are identified in the literature. Primary prevention strategies have largely been confined to addressing environmental exposure to ergonomic risk factors, since to date, no clear cause for this work-related neck pain has been acknowledged. Future research should include identifying causes of work related neck pain so that appropriate primary prevention strategies may be developed and to make policy recommendations pertaining to prevention. PMID:18769599
George Leonard's View of the Computer in Education.
ERIC Educational Resources Information Center
Bork, Alfred
Relatively few individuals have attempted to view the future of computers in education, and those who have done so often tend to focus too much upon present capabilities rather than thinking about the changes that new technology will introduce in the future. George Leonard's book "Education and Ecstasy" provides an interesting picture of…
ERIC Educational Resources Information Center
Watson, J. Allen; And Others
1986-01-01
The article surveys computer usage with young handicapped children by developing three instructional scenarios (present actual, present possible, and future). Research is reviewed on computer use with very young children, cognitive theory and microcomputer learning, and social aspects of the microcomputer experience. Trends in microcomputer,…
ERIC Educational Resources Information Center
Hanks, Walter A.; Barnes, Michael D.; Merrill, Ray M.; Neiger, Brad L.
2000-01-01
Investigated how health educators currently used computers and how they expected to use them in the future. Surveys of practicing health educators at many types of sites indicated that important current abilities included Internet, word processing, and electronic presentation skills. Important future tasks and skills included developing computer…
International Futures (IFs): A Global Issues Simulation for Teaching and Research.
ERIC Educational Resources Information Center
Hughes, Barry B.
This paper describes the International Futures (IFs) computer assisted simulation game for use with undergraduates. Written in Standard Fortran IV, the model currently runs on mainframe or mini computers, but has not been adapted for micros. It has been successfully installed on Harris, Burroughs, Telefunken, CDC, Univac, IBM, and Prime machines.…
ERIC Educational Resources Information Center
Walther, Joseph B.
1994-01-01
Assesses the related effects of anticipated future interaction and different communication media (computer-mediated versus face-to-face communication) on the communication of relational intimacy and composure. Shows that the assignment of long-term versus short-term partnerships has a larger impact on anticipated future interaction reported by…
Smoke and Air Resource Management-Peering Through the Haze
A. R. Fox Riebau
1987-01-01
This paper presents a vision of the future rooted in consideration of the past 20 years in the smoke and air resource management field. This future is characterized by rapid technological development of computers for computation, communications, and remote sensing capabilities and of the possible societal responses to these advances. We discuss intellectual...
The Future's Future: Implications of Emerging Technology for Special Education Program Planning.
ERIC Educational Resources Information Center
Hofstetter, Fred T.
2001-01-01
This article reviews emerging technologies, imagines how they can be used to help learners with special needs, and recommends new special education program initiatives to help these students make a meaningful transition from school to work. Wearable computers, personal computing devices, DVD, HDTV, MP3, and personal digital assistants are…
The next generation of command post computing
NASA Astrophysics Data System (ADS)
Arnold, Ross D.; Lieb, Aaron J.; Samuel, Jason M.; Burger, Mitchell A.
2015-05-01
The future of command post computing demands an innovative new solution to address a variety of challenging operational needs. The Command Post of the Future is the Army's primary command and control decision support system, providing situational awareness and collaborative tools for tactical decision making, planning, and execution management from Corps to Company level. However, as the U.S. Army moves towards a lightweight, fully networked battalion, disconnected operations, thin client architecture and mobile computing become increasingly essential. The Command Post of the Future is not designed to support these challenges in the coming decade. Therefore, research into a hybrid blend of technologies is in progress to address these issues. This research focuses on a new command and control system utilizing the rich collaboration framework afforded by Command Post of the Future coupled with a new user interface consisting of a variety of innovative workspace designs. This new system is called Tactical Applications. This paper details a brief history of command post computing, presents the challenges facing the modern Army, and explores the concepts under consideration for Tactical Applications that meet these challenges in a variety of innovative ways.
Influence versus intent for predictive analytics in situation awareness
NASA Astrophysics Data System (ADS)
Cui, Biru; Yang, Shanchieh J.; Kadar, Ivan
2013-05-01
Predictive analytics in situation awareness requires an element to comprehend and anticipate potential adversary activities that might occur in the future. Most work in high-level fusion or predictive analytics utilizes machine learning, pattern mining, Bayesian inference, and decision tree techniques to predict future actions or states. The emergence of social computing in broader contexts has drawn interest in bringing hypotheses and techniques from social theory to algorithmic and computational settings for predictive analytics. This paper aims at answering the question of how influence and attitude (interpreted by some as intent) of adversarial actors can be formulated and computed algorithmically, as a higher-level fusion process to provide predictions of future actions. The challenges in this interdisciplinary endeavor include drawing on existing understanding of influence and attitude in both the social science and computing fields, as well as the mathematical and computational formulation for the specific context of the situation to be analyzed. The study of 'influence' has resurfaced in recent years due to the emergence of social networks in the virtualized cyber world. Theoretical analysis and techniques developed in this area are discussed in this paper in the context of predictive analysis. Meanwhile, the notion of intent, or 'attitude' in social theory terminology, is a relatively uncharted area in the computing field. Note that a key objective of predictive analytics is to identify impending or planned attacks so their 'impact' and 'threat' can be prevented. In this spirit, indirect and direct observables are drawn and derived to infer the influence network and attitude in order to predict future threats. This work proposes an integrated framework that jointly assesses adversarial actors' influence network and their attitudes as a function of past actions and action outcomes. A preliminary set of algorithms is developed and tested using the Global Terrorism Database (GTD). Our results reveal the benefits of performing joint predictive analytics with both attitude and influence. At the same time, we discover significant challenges in deriving influence and attitude from indirect observables for diverse adversarial behavior. These observations warrant further investigation of the optimal use of influence and attitude for predictive analytics, as well as the potential inclusion of other environmental or capability elements for the actors.
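As an illustration of the kind of joint influence-and-attitude fusion described above, the following is a minimal sketch, not the authors' algorithm: it blends each actor's own (hypothetical) attitude score with the attitudes of peers weighted by a (hypothetical) influence network to produce a per-actor propensity estimate. All names, weights, and parameters are illustrative assumptions.

```python
import numpy as np

# Hypothetical influence matrix: influence[i, j] is how strongly actor i sways actor j.
influence = np.array([
    [0.0, 0.6, 0.1],
    [0.2, 0.0, 0.7],
    [0.0, 0.3, 0.0],
])

# Hypothetical per-actor attitude toward a class of actions (e.g., estimated from
# past actions and their outcomes), on a 0-1 scale.
attitude = np.array([0.8, 0.3, 0.5])

def propagate(influence, attitude, rounds=10, damping=0.5):
    """Blend each actor's own attitude with the influence-weighted attitudes of peers."""
    incoming = influence.sum(axis=0)      # total influence weight arriving at each actor
    incoming[incoming == 0] = 1.0         # avoid division by zero for isolated actors
    current = attitude.copy()
    for _ in range(rounds):
        peer_view = (influence.T @ current) / incoming   # weighted average of peers' attitudes
        current = damping * attitude + (1 - damping) * peer_view
    return current

print("Predicted propensity per actor:", np.round(propagate(influence, attitude), 3))
```

In this toy setup, the actors most exposed to highly motivated peers drift toward higher propensity scores, which is the qualitative behavior a joint influence-attitude model is meant to capture.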
Study of USGS/NASA land use classification system. [computer analysis from LANDSAT data
NASA Technical Reports Server (NTRS)
Spann, G. W.
1975-01-01
The results of a computer mapping project using LANDSAT data and the USGS/NASA land use classification system are summarized. During the computer mapping portion of the project, accuracies of 67 percent to 79 percent were achieved using Level II of the classification system and a 4,000 acre test site centered on Douglasville, Georgia. Analysis of responses to a questionnaire circulated to actual and potential LANDSAT data users reveals several important findings: (1) there is a substantial desire for additional information related to LANDSAT capabilities; (2) a majority of the respondents feel computer mapping from LANDSAT data could aid present or future projects; and (3) the costs of computer mapping are substantially less than those of other methods.
LHC@Home: a BOINC-based volunteer computing infrastructure for physics studies at CERN
NASA Astrophysics Data System (ADS)
Barranco, Javier; Cai, Yunhai; Cameron, David; Crouch, Matthew; Maria, Riccardo De; Field, Laurence; Giovannozzi, Massimo; Hermes, Pascal; Høimyr, Nils; Kaltchev, Dobrin; Karastathis, Nikos; Luzzi, Cinzia; Maclean, Ewen; McIntosh, Eric; Mereghetti, Alessio; Molson, James; Nosochkov, Yuri; Pieloni, Tatiana; Reid, Ivan D.; Rivkin, Lenny; Segal, Ben; Sjobak, Kyrre; Skands, Peter; Tambasco, Claudia; Veken, Frederik Van der; Zacharov, Igor
2017-12-01
The LHC@Home BOINC project has provided computing capacity for numerical simulations to researchers at CERN since 2004, and since 2011 has been expanded with a wider range of applications. The traditional CERN accelerator physics simulation code SixTrack enjoys continuing volunteer support, and thanks to virtualisation a number of applications from the LHC experiment collaborations and particle theory groups have joined the consolidated LHC@Home BOINC project. This paper addresses the challenges related to traditional and virtualized applications in the BOINC environment, and how volunteer computing has been integrated into the overall computing strategy of the laboratory through the consolidated LHC@Home service. Thanks to the computing power provided by volunteers joining LHC@Home, numerous accelerator beam physics studies have been carried out, yielding an improved understanding of charged particle dynamics in the CERN Large Hadron Collider (LHC) and its future upgrades. The main results are highlighted in this paper.
High-End Computing Challenges in Aerospace Design and Engineering
NASA Technical Reports Server (NTRS)
Bailey, F. Ronald
2004-01-01
High-End Computing (HEC) has had a significant impact on aerospace design and engineering and is poised to have an even greater impact in the future. In this paper we describe four aerospace design and engineering challenges: Digital Flight, Launch Simulation, Rocket Fuel System and Digital Astronaut. The paper discusses modeling capabilities needed for each challenge and presents projections of near- and far-term HEC computing requirements. NASA's HEC Project Columbia is described, and programming strategies necessary to achieve high real performance are presented.
Computational chemistry and cheminformatics: an essay on the future.
Glen, Robert Charles
2012-01-01
Computers have changed the way we do science. Surrounded by a sea of data and with phenomenal computing capacity, the methodology and approach to scientific problems is evolving into a partnership between experiment, theory and data analysis. Given the pace of change of the last twenty-five years, it seems folly to speculate on the future, but along with unpredictable leaps of progress there will be a continuous evolution of capability, which points to opportunities and improvements that will certainly appear as our discipline matures.
NASA Technical Reports Server (NTRS)
Mularz, Edward J.; Sockol, Peter M.
1987-01-01
Future aerospace propulsion concepts involve the combustion of liquid or gaseous fuels in a highly turbulent internal air stream. Accurate predictive computer codes which can simulate the fluid mechanics, chemistry, and turbulence-combustion interaction of these chemical reacting flows will be a new tool needed in the design of these future propulsion concepts. Experimental and code development research is being performed at Lewis to better understand chemical reacting flows, with the long-term goal of establishing these reliable computer codes. The approach to understanding chemical reacting flows is to look at separate, simpler parts of this complex phenomenon as well as to study the full turbulent reacting flow process. As a result, research on the fluid mechanics associated with chemical reacting flows was initiated. The chemistry of fuel-air combustion is also being studied. Finally, the phenomenon of turbulence-combustion interaction is being investigated. This presentation highlights research, both experimental and analytical, in each of these three major areas.
NASA Technical Reports Server (NTRS)
Mularz, Edward J.; Sockol, Peter M.
1990-01-01
Future aerospace propulsion concepts involve the combustion of liquid or gaseous fuels in a highly turbulent internal airstream. Accurate predictive computer codes which can simulate the fluid mechanics, chemistry, and turbulence-combustion interaction of these chemical reacting flows will be a new tool that is needed in the design of these future propulsion concepts. Experimental and code development research is being performed at LeRC to better understand chemical reacting flows with the long-term goal of establishing these reliable computer codes. Our approach to understand chemical reacting flows is to look at separate, more simple parts of this complex phenomenon as well as to study the full turbulent reacting flow process. As a result, we are engaged in research on the fluid mechanics associated with chemical reacting flows. We are also studying the chemistry of fuel-air combustion. Finally, we are investigating the phenomenon of turbulence-combustion interaction. Research, both experimental and analytical, is highlighted in each of these three major areas.
Cellular Level Brain Imaging in Behaving Mammals: An Engineering Approach
Hamel, Elizabeth J.O.; Grewe, Benjamin F.; Parker, Jones G.; Schnitzer, Mark J.
2017-01-01
Fluorescence imaging offers expanding capabilities for recording neural dynamics in behaving mammals, including the means to monitor hundreds of cells targeted by genetic type or connectivity, track cells over weeks, densely sample neurons within local microcircuits, study cells too inactive to isolate in extracellular electrical recordings, and visualize activity in dendrites, axons, or dendritic spines. We discuss recent progress and future directions for imaging in behaving mammals from a systems engineering perspective, which seeks holistic consideration of fluorescent indicators, optical instrumentation, and computational analyses. Today, genetically encoded indicators of neural Ca2+ dynamics are widely used, and those of trans-membrane voltage are rapidly improving. Two complementary imaging paradigms involve conventional microscopes for studying head-restrained animals and head-mounted miniature microscopes for imaging in freely behaving animals. Overall, the field has attained sufficient sophistication that increased cooperation between those designing new indicators, light sources, microscopes, and computational analyses would greatly benefit future progress. PMID:25856491
Effects of computer-based training on procedural modifications to standard functional analyses.
Schnell, Lauren K; Sidener, Tina M; DeBar, Ruth M; Vladescu, Jason C; Kahng, SungWoo
2018-01-01
Few studies have evaluated methods for training decision-making when functional analysis data are undifferentiated. The current study evaluated computer-based training to teach 20 graduate students to arrange functional analysis conditions, analyze functional analysis data, and implement procedural modifications. Participants were exposed to training materials using interactive software during a 1-day session. Following the training, mean scores on the posttest, novel cases probe, and maintenance probe increased for all participants. These results replicate previous findings during a 1-day session and include a measure of participant acceptability of the training. Recommendations for future research on computer-based training and functional analysis are discussed. © 2017 Society for the Experimental Analysis of Behavior.
Evaluating Imaging and Computer-aided Detection and Diagnosis Devices at the FDA
Gallas, Brandon D.; Chan, Heang-Ping; D’Orsi, Carl J.; Dodd, Lori E.; Giger, Maryellen L.; Gur, David; Krupinski, Elizabeth A.; Metz, Charles E.; Myers, Kyle J.; Obuchowski, Nancy A.; Sahiner, Berkman; Toledano, Alicia Y.; Zuley, Margarita L.
2017-01-01
This report summarizes the Joint FDA-MIPS Workshop on Methods for the Evaluation of Imaging and Computer-Assist Devices. The purpose of the workshop was to gather information on the current state of the science and facilitate consensus development on statistical methods and study designs for the evaluation of imaging devices to support US Food and Drug Administration submissions. Additionally, participants expected to identify gaps in knowledge and unmet needs that should be addressed in future research. This summary is intended to document the topics that were discussed at the meeting and disseminate the lessons that have been learned through past studies of imaging and computer-aided detection and diagnosis device performance. PMID:22306064
Flight program language requirements. Volume 2: Requirements and evaluations
NASA Technical Reports Server (NTRS)
1972-01-01
The efforts and results are summarized for a study to establish requirements for a flight programming language for future onboard computer applications. Several different languages were available as potential candidates for future NASA flight programming efforts. The study centered around an evaluation of the four most pertinent existing aerospace languages. Evaluation criteria were established, and selected kernels from the current Saturn 5 and Skylab flight programs were used as benchmark problems for sample coding. An independent review of the language specifications incorporated anticipated future programming requirements into the evaluation. A set of detailed language requirements was synthesized from these activities. The details of program language requirements and of the language evaluations are described.
A Model Computer Literacy Course.
ERIC Educational Resources Information Center
Orndorff, Joseph
Designed to address the varied computer skill levels of college students, this proposed computer literacy course would be modular in format, with modules tailored to address various levels of expertise and permit individualized instruction. An introductory module would present both the history and future of computers and computing, followed by an…
Bruno Garza, J L; Eijckelhof, B H W; Johnson, P W; Raina, S M; Rynell, P W; Huysmans, M A; van Dieën, J H; van der Beek, A J; Blatter, B M; Dennerlein, J T
2012-01-01
This study, a part of the PRedicting Occupational biomechanics in OFfice workers (PROOF) study, investigated whether there are differences in field-measured forces, muscle efforts, postures, velocities and accelerations across computer activities. These parameters were measured continuously for 120 office workers performing their own work for two hours each. There were differences in nearly all forces, muscle efforts, postures, velocities and accelerations across keyboard, mouse and idle activities. Keyboard activities showed a 50% increase in the median right trapezius muscle effort when compared to mouse activities. Median shoulder rotation changed from 25 degrees internal rotation during keyboard use to 15 degrees external rotation during mouse use. Only keyboard use was associated with median ulnar deviations greater than 5 degrees. Idle activities led to the greatest variability observed in all muscle efforts and postures measured. In future studies, measurements of computer activities could be used to provide information on the physical exposures experienced during computer use. Practitioner Summary: Computer users may develop musculoskeletal disorders due to their force, muscle effort, posture and wrist velocity and acceleration exposures during computer use. We report that many physical exposures are different across computer activities. This information may be used to estimate physical exposures based on patterns of computer activities over time.
Challenges of Future High-End Computing
NASA Technical Reports Server (NTRS)
Bailey, David; Kutler, Paul (Technical Monitor)
1998-01-01
The next major milestone in high performance computing is a sustained rate of one Pflop/s (also written one petaflops, or 10^15 floating-point operations per second). In addition to prodigiously high computational performance, such systems must of necessity feature very large main memories, as well as comparably high I/O bandwidth and huge mass storage facilities. The current consensus of scientists who have studied these issues is that "affordable" petaflops systems may be feasible by the year 2010, assuming that certain key technologies continue to progress at current rates. One important question is whether applications can be structured to perform efficiently on such systems, which are expected to incorporate many thousands of processors and deeply hierarchical memory systems. To answer these questions, advanced performance modeling techniques, including simulation of future architectures and applications, may be required. It may also be necessary to formulate "latency tolerant algorithms" and other completely new algorithmic approaches for certain applications. This talk will give an overview of these challenges.
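To make the latency concern concrete, here is a back-of-the-envelope sketch comparing per-step arithmetic time with per-step communication time for a hypothetical petaflops machine; every number below is an illustrative assumption, not a figure from the talk.

```python
# Illustrative, assumed figures for a hypothetical petaflops-class machine.
peak_flops = 1e15            # 1 Pflop/s aggregate
n_procs = 10_000             # thousands of processors
flops_per_proc = peak_flops / n_procs

work_per_step = 1e7          # floating-point ops each processor performs per time step
compute_time = work_per_step / flops_per_proc      # seconds of arithmetic per step

network_latency = 10e-6      # assumed 10 microseconds per remote message
messages_per_step = 100      # assumed remote exchanges per step
comm_time = messages_per_step * network_latency

print(f"compute per step:       {compute_time * 1e6:7.1f} microseconds")
print(f"communication per step: {comm_time * 1e6:7.1f} microseconds")
# Unless messaging is overlapped with arithmetic ("latency tolerant" algorithms),
# roughly 1000 microseconds of communication dwarfs 100 microseconds of computation.
```

Sketches like this are the simplest form of the performance modeling the talk alludes to: they show why message count and overlap, not raw flop rate, dominate efficiency at this scale.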
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-26
... controls on trading; information and data relating to the index, including the design, computation and... futures contract raises novel or complex issues that require additional time for review, or if the foreign... composition, computation, or method of selection of component entities of an index referenced and defined in...
Looking towards the Future of Language Assessment: Usability of Tablet PCs in Language Testing
ERIC Educational Resources Information Center
Garcia Laborda, Jesus; Magal Royo, Teresa; Bakieva, Margarita
2016-01-01
This research addresses the change in how the Spanish University Entrance Examination can be delivered in the future. There is a wide acknowledgement that computer tests are very demanding for the delivering institutions which makes computer language testing difficult to implement. However, the use of tablet PCs can facilitate the delivery at even…
ERIC Educational Resources Information Center
Mitchell, Lynda K.; Hardy, Philippe L.
The purpose of this chapter is to envision how the era of technological revolution will affect the guidance, counseling, and student support programs of the future. Advances in computer science, telecommunications, and biotechnology are discussed. These advances have the potential to affect dramatically the services of guidance programs of the…
Structural Analysis Methods for Structural Health Management of Future Aerospace Vehicles
NASA Technical Reports Server (NTRS)
Tessler, Alexander
2007-01-01
Two finite element based computational methods, Smoothing Element Analysis (SEA) and the inverse Finite Element Method (iFEM), are reviewed, and examples of their use for structural health monitoring are discussed. Due to their versatility, robustness, and computational efficiency, the methods are well suited for real-time structural health monitoring of future space vehicles, large space structures, and habitats. The methods may be effectively employed to enable real-time processing of sensing information, specifically for identifying three-dimensional deformed structural shapes as well as the internal loads. In addition, they may be used in conjunction with evolutionary algorithms to design optimally distributed sensors. These computational tools have demonstrated substantial promise for utilization in future Structural Health Management (SHM) systems.
The UCLA MEDLARS Computer System *
Garvis, Francis J.
1966-01-01
Under a subcontract with UCLA the Planning Research Corporation has changed the MEDLARS system to make it possible to use the IBM 7094/7040 direct-couple computer instead of the Honeywell 800 for demand searches. The major tasks were the rewriting of the programs in COBOL and the copying of the stored information onto the narrower tapes that IBM computers require. (In the future NLM will copy the tapes for IBM computer users.) The differences in the software required by the two computers are noted. Major and costly revisions would be needed to adapt the large MEDLARS system to the smaller IBM 1401 and 1410 computers. In general, MEDLARS is transferable to other computers of the IBM 7000 class, the new IBM 360, and those of like size, such as the CDC 1604 or UNIVAC 1108, although additional changes are necessary. Potential future improvements are suggested. PMID:5901355
Computer Forensics Education - the Open Source Approach
NASA Astrophysics Data System (ADS)
Huebner, Ewa; Bem, Derek; Cheung, Hon
In this chapter we discuss the application of the open source software tools in computer forensics education at tertiary level. We argue that open source tools are more suitable than commercial tools, as they provide the opportunity for students to gain in-depth understanding and appreciation of the computer forensic process as opposed to familiarity with one software product, however complex and multi-functional. With the access to all source programs the students become more than just the consumers of the tools as future forensic investigators. They can also examine the code, understand the relationship between the binary images and relevant data structures, and in the process gain necessary background to become the future creators of new and improved forensic software tools. As a case study we present an advanced subject, Computer Forensics Workshop, which we designed for the Bachelor's degree in computer science at the University of Western Sydney. We based all laboratory work and the main take-home project in this subject on open source software tools. We found that without exception more than one suitable tool can be found to cover each topic in the curriculum adequately. We argue that this approach prepares students better for forensic field work, as they gain confidence to use a variety of tools, not just a single product they are familiar with.
Prediction of intestinal absorption and blood-brain barrier penetration by computational methods.
Clark, D E
2001-09-01
This review surveys the computational methods that have been developed with the aim of identifying drug candidates likely to fail later on the road to market. The specifications for such computational methods are outlined, including factors such as speed, interpretability, robustness and accuracy. Then, computational filters aimed at predicting "drug-likeness" in a general sense are discussed before methods for the prediction of more specific properties--intestinal absorption and blood-brain barrier penetration--are reviewed. Directions for future research are discussed and, in concluding, the impact of these methods on the drug discovery process, both now and in the future, is briefly considered.
CT-assisted agile manufacturing
NASA Astrophysics Data System (ADS)
Stanley, James H.; Yancey, Robert N.
1996-11-01
The next century will witness at least two great revolutions in the way goods are produced. First, workers will use the medium of virtual reality in all aspects of marketing, research, development, prototyping, manufacturing, sales and service. Second, market forces will drive manufacturing towards small-lot production and just-in-time delivery. Already, we can discern the merging of these megatrends into what some are calling agile manufacturing. Under this new paradigm, parts and processes will be designed and engineered within the mind of a computer, tooled and manufactured by the offspring of today's rapid prototyping equipment, and evaluated for performance and reliability by advanced nondestructive evaluation (NDE) techniques and sophisticated computational models. Computed tomography (CT) is the premier example of an NDE method suitable for future agile manufacturing activities. It is the only modality that provides convenient access to the full suite of engineering data that users will need to avail themselves of computer-aided design, computer-aided manufacturing, and computer-aided engineering capabilities, as well as newly emerging reverse engineering, rapid prototyping and solid freeform fabrication technologies. As such, CT is assured a central, utilitarian role in future industrial operations. An overview of this exciting future for industrial CT is presented.
Review of NASA antiskid braking research
NASA Technical Reports Server (NTRS)
Tanner, J. A.
1982-01-01
NASA antiskid braking system research programs are reviewed. These programs include experimental studies of four antiskid systems on the Langley Landing Loads Track, flight tests with a DC-9 airplane, and computer simulation studies. Results from these research efforts include identification of factors contributing to degraded antiskid performance under adverse weather conditions, tire tread temperature measurements during antiskid braking on dry runway surfaces, and an assessment of the accuracy of various brake pressure-torque computer models. This information should lead to the development of better antiskid systems in the future.
On Using Home Networks and Cloud Computing for a Future Internet of Things
NASA Astrophysics Data System (ADS)
Niedermayer, Heiko; Holz, Ralph; Pahl, Marc-Oliver; Carle, Georg
In this position paper we state four requirements for a Future Internet and sketch our initial concept. The requirements: (1) more comfort, (2) integration of home networks, (3) resources like service clouds in the network, and (4) access anywhere on any machine. Future Internet needs future quality and future comfort. There need to be new possibilities for everyone. Our focus is on higher layers and is related to the many overlay proposals. We consider them to run on top of a basic Future Internet core. A new user experience means including all user devices. Home networks and services should be a fundamental part of the Future Internet. Home networks extend access and allow interaction with the environment. Cloud Computing can provide reliable resources beyond local boundaries. For access anywhere, we also need secure storage for data and profiles in the network, in particular for access with non-personal devices (Internet terminal, ticket machine, ...).
NASA Astrophysics Data System (ADS)
Furht, Borko
In the introductory chapter we define the concept of cloud computing and cloud services, and we introduce layers and types of cloud computing. We discuss the differences between cloud computing and cloud services. New technologies that enabled cloud computing are presented next. We also discuss cloud computing features, standards, and security issues. We introduce the key cloud computing platforms, their vendors, and their offerings. We discuss cloud computing challenges and the future of cloud computing.
INCREASING SAVING BEHAVIOR THROUGH AGE-PROGRESSED RENDERINGS OF THE FUTURE SELF.
Hershfield, Hal E; Goldstein, Daniel G; Sharpe, William F; Fox, Jesse; Yeykelis, Leo; Carstensen, Laura L; Bailenson, Jeremy N
2011-11-01
Many people fail to save what they need to for retirement (Munnell, Webb, and Golub-Sass 2009). Research on excessive discounting of the future suggests that removing the lure of immediate rewards by pre-committing to decisions, or elaborating the value of future rewards can both make decisions more future-oriented. In this article, we explore a third and complementary route, one that deals not with present and future rewards, but with present and future selves. In line with thinkers who have suggested that people may fail, through a lack of belief or imagination, to identify with their future selves (Parfit 1971; Schelling 1984), we propose that allowing people to interact with age-progressed renderings of themselves will cause them to allocate more resources toward the future. In four studies, participants interacted with realistic computer renderings of their future selves using immersive virtual reality hardware and interactive decision aids. In all cases, those who interacted with virtual future selves exhibited an increased tendency to accept later monetary rewards over immediate ones.
Redundant actuator development study. [flight control systems for supersonic transport aircraft
NASA Technical Reports Server (NTRS)
Ryder, D. R.
1973-01-01
Current and past supersonic transport configurations are reviewed to assess redundancy requirements for future airplane control systems. Secondary actuators used in stability augmentation systems will probably be the most critical actuator application and require the highest level of redundancy. Two methods of actuator redundancy mechanization have been recommended for further study. Math models of the recommended systems have been developed for use in future computer simulations. A long range plan has been formulated for actuator hardware development and testing in conjunction with the NASA Flight Simulator for Advanced Aircraft.
Performance of the Widely-Used CFD Code OVERFLOW on the Pleiades Supercomputer
NASA Technical Reports Server (NTRS)
Guruswamy, Guru P.
2017-01-01
Computational performance studies were made for NASA's widely used Computational Fluid Dynamics code OVERFLOW on the Pleiades Supercomputer. Two test cases were considered: a full launch vehicle with a grid of 286 million points and a full rotorcraft model with a grid of 614 million points. Computations using up to 8000 cores were run on Sandy Bridge and Ivy Bridge nodes. Performance was monitored using times reported in the day files from the Portable Batch System utility. Results for two grid topologies are presented and compared in detail. Observations and suggestions for future work are made.
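For readers interested in how day-file wall-clock times translate into scaling behavior, here is a minimal sketch of a speedup and parallel-efficiency calculation; the core counts and timings below are illustrative placeholders, not results from the study.

```python
# Illustrative wall-clock seconds per run at different core counts
# (placeholder values, not data from the OVERFLOW/Pleiades study).
timings = {1000: 120.0, 2000: 65.0, 4000: 36.0, 8000: 22.0}

base_cores = min(timings)          # use the smallest run as the scaling baseline
base_time = timings[base_cores]

for cores in sorted(timings):
    speedup = base_time / timings[cores]
    ideal = cores / base_cores     # perfect linear scaling relative to the baseline
    efficiency = speedup / ideal
    print(f"{cores:5d} cores: speedup {speedup:4.1f}x, efficiency {efficiency:5.1%}")
```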
Application of software technology to a future spacecraft computer design
NASA Technical Reports Server (NTRS)
Labaugh, R. J.
1980-01-01
A study was conducted to determine how major improvements in spacecraft computer systems can be obtained from recent advances in hardware and software technology. Investigations into integrated circuit technology indicated that the CMOS/SOS chip set being developed for the Air Force Avionics Laboratory at Wright Patterson had the best potential for improving the performance of spaceborne computer systems. An integral part of the chip set is the bit slice arithmetic and logic unit. The flexibility allowed by microprogramming, combined with the software investigations, led to the specification of a baseline architecture and instruction set.
The Guide to Better Hospital Computer Decisions
Dorenfest, Sheldon I.
1981-01-01
A soon-to-be-published major study of hospital computer use entitled “The Guide to Better Hospital Computer Decisions” was conducted by my firm over the past 2½ years. The study required over twenty (20) man years of effort at a cost of over $300,000, and the six (6) volume final report provides more than 1,000 pages of data about how hospitals are and will be using computerized medical and business information systems. It describes the current status and future expectations for computer use in major application areas, such as, but not limited to, finance, admitting, pharmacy, laboratory, data collection and hospital or medical information systems. It also includes profiles of over 100 companies and other types of organizations providing data processing products and services to hospitals. In this paper, we discuss the need for the study, the specific objectives of the study, the methodology and approach taken to complete the study and a few major conclusions.
Marc develops high-fidelity turbulence models to enhance simulation accuracy and efficient numerical algorithms for future high-performance computing hardware architectures. Research interests: high performance computing; high order numerical methods for computational fluid dynamics; fluid…
Laboratory Computing Resource Center
An overview of computer-based natural language processing
NASA Technical Reports Server (NTRS)
Gevarter, W. B.
1983-01-01
Computer-based Natural Language Processing (NLP) is the key to enabling humans and their computer-based creations to interact with machines in natural language (like English, Japanese, German, etc., in contrast to formal computer languages). The doors that such an achievement can open have made this a major research area in Artificial Intelligence and Computational Linguistics. Commercial natural language interfaces to computers have recently entered the market, and the future looks bright for other applications as well. This report reviews the basic approaches to such systems, the techniques utilized, applications, the state of the art of the technology, issues and research requirements, the major participants, and finally, future trends and expectations. It is anticipated that this report will prove useful to engineering and research managers, potential users, and others who will be affected by this field as it unfolds.
Xiao, Bo; Imel, Zac E; Georgiou, Panayiotis; Atkins, David C; Narayanan, Shrikanth S
2016-05-01
Empathy is an important psychological process that facilitates human communication and interaction. Enhancement of empathy has profound significance in a range of applications. In this paper, we review emerging directions of research on computational analysis of empathy expression and perception as well as empathic interactions, including their simulation. We summarize the work on empathic expression analysis by the targeted signal modalities (e.g., text, audio, and facial expressions). We categorize empathy simulation studies into theory-based emotion space modeling or application-driven user and context modeling. We summarize challenges in computational study of empathy including conceptual framing and understanding of empathy, data availability, appropriate use and validation of machine learning techniques, and behavior signal processing. Finally, we propose a unified view of empathy computation and offer a series of open problems for future research.
Liu, Jun; Zhang, Liqun; Cao, Dapeng; Wang, Wenchuan
2009-12-28
Polymer nanocomposites (PNCs) often exhibit excellent mechanical, thermal, electrical and optical properties, because they combine the performances of both polymers and inorganic or organic nanoparticles. Recently, computer modeling and simulation are playing an important role in exploring the reinforcement mechanism of the PNCs and even the design of functional PNCs. This report provides an overview of the progress made in past decades in the investigation of the static, rheological and mechanical properties of polymer nanocomposites studied by computer modeling and simulation. Emphases are placed on exploring the mechanisms at the molecular level for the dispersion of nanoparticles in nanocomposites, the effects of nanoparticles on chain conformation and glass transition temperature (T(g)), as well as viscoelastic and mechanical properties. Finally, some future challenges and opportunities in computer modeling and simulation of PNCs are addressed.
An Analysis of Navigation Algorithms for Smartphones Using J2ME
NASA Astrophysics Data System (ADS)
Santos, André C.; Tarrataca, Luís; Cardoso, João M. P.
Embedded systems are considered one of the most potential areas for future innovations. Two embedded fields that will most certainly take a primary role in future innovations are mobile robotics and mobile computing. Mobile robots and smartphones are growing in number and functionalities, becoming a presence in our daily life. In this paper, we study the current feasibility of a smartphone to execute navigation algorithms. As a test case, we use a smartphone to control an autonomous mobile robot. We tested three navigation problems: Mapping, Localization and Path Planning. For each of these problems, an algorithm has been chosen, developed in J2ME, and tested on the field. Results show the current mobile Java capacity for executing computationally demanding algorithms and reveal the real possibility of using smartphones for autonomous navigation.
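To give a flavor of the path-planning component of such a navigation stack, here is a minimal breadth-first grid search sketch (the original work was written in J2ME; this sketch is in Python, and the occupancy grid, start, and goal are illustrative assumptions, not from the paper).

```python
from collections import deque

# 0 = free cell, 1 = obstacle (illustrative occupancy grid, e.g. from a mapping step)
grid = [
    [0, 0, 0, 1, 0],
    [1, 1, 0, 1, 0],
    [0, 0, 0, 0, 0],
    [0, 1, 1, 1, 0],
    [0, 0, 0, 0, 0],
]

def plan_path(grid, start, goal):
    """Breadth-first search over a 4-connected grid; returns a list of cells or None."""
    rows, cols = len(grid), len(grid[0])
    frontier = deque([start])
    came_from = {start: None}
    while frontier:
        cell = frontier.popleft()
        if cell == goal:
            path = []
            while cell is not None:       # walk back to the start to recover the route
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0 \
                    and (nr, nc) not in came_from:
                came_from[(nr, nc)] = cell
                frontier.append((nr, nc))
    return None

print(plan_path(grid, start=(0, 0), goal=(4, 4)))
```

Even this simple search is computationally cheap enough for a phone-class processor, which is the feasibility question the paper examines with more demanding mapping and localization algorithms.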
NASA Technical Reports Server (NTRS)
Khan, Gufran Sayeed; Gubarev, Mikhail; Speegle, Chet; Ramsey, Brian
2010-01-01
The presentation includes grazing incidence X-ray optics, motivation and challenges, mid-spatial-frequency generation in cylindrical polishing, design considerations for the polishing lap, simulation studies and experimental results, future scope, and a summary. Topics include the current status of replication optics technology, the cylindrical polishing process using a large-size polishing lap, non-conformance of the polishing lap to the optics, development of software and the polishing machine, deterministic prediction of polishing, a polishing experiment under optimum conditions, and a polishing experiment based on a known error profile. Future plans include determination of non-uniformity in the polishing lap compliance, development of a polishing sequence based on a known error profile of the specimen, software for generating a mandrel polishing sequence, design and development of a flexible polishing lap, and a computer-controlled localized polishing process.
Computed tomographic and cross-sectional anatomy of the normal pacu (Colossoma macropomum).
Carr, Alaina; Weber, E P Scott; Murphy, Chris J; Zwingenberger, Alison
2014-03-01
The purpose of this study was to compare and define the normal cross-sectional gross and computed tomographic (CT) anatomy for a species of bony fish to better gain insight into the use of advanced diagnostic imaging for future clinical cases. The pacu (Colossoma macropomum) was used because of its widespread presence in the aquarium trade, its relatively large body size, and its importance in research and aquaculture settings. Transverse 0.6-mm CT images of three cadaver fish were obtained and compared to corresponding frozen cross sections of the fish. Relevant anatomic structures were identified and labeled at each level; the Hounsfield unit density of major organs was established. The images presented good anatomic detail and provide a reference for future research and clinical investigation.
ERIC Educational Resources Information Center
Prince, Amber T.
Computer assisted instruction, and especially computer simulations, can help to ensure that preservice and inservice teachers learn from the right experiences. In the past, colleges of education used large mainframe computer systems to store student registration, provide simulation lessons on diagnosing reading difficulties, construct informal…
Argonne Out Loud: Computation, Big Data, and the Future of Cities
Catlett, Charlie
2018-01-16
Charlie Catlett, a Senior Computer Scientist at Argonne and Director of the Urban Center for Computation and Data at the Computation Institute of the University of Chicago and Argonne, talks about how he and his colleagues are using high-performance computing, data analytics, and embedded systems to better understand and design cities.
Computational Physics for Space Flight Applications
NASA Technical Reports Server (NTRS)
Reed, Robert A.
2004-01-01
This paper presents viewgraphs on computational physics for space flight applications. The topics include: 1) Introduction to space radiation effects in microelectronics; 2) Using applied physics to help NASA meet mission objectives; 3) Example of applied computational physics; and 4) Future directions in applied computational physics.
Lechner, William J; MacGlashan, James; Wray, Tyler B; Littman, Michael L
2017-01-01
Background: Computer-delivered interventions have been shown to be effective in reducing alcohol consumption in heavy drinking college students. However, these computer-delivered interventions rely on mouse, keyboard, or touchscreen responses for interactions between the users and the computer-delivered intervention. The principles of motivational interviewing suggest that in-person interventions may be effective, in part, because they encourage individuals to think through and speak aloud their motivations for changing a health behavior, which current computer-delivered interventions do not allow.
Objective: The objective of this study was to take the initial steps toward development of a voice-based computer-delivered intervention that can ask open-ended questions and respond appropriately to users' verbal responses, more closely mirroring a human-delivered motivational intervention.
Methods: We developed (1) a voice-based computer-delivered intervention that was run by a human controller and that allowed participants to speak their responses to scripted prompts delivered by speech generation software and (2) a text-based computer-delivered intervention that relied on the mouse, keyboard, and computer screen for all interactions. We randomized 60 heavy drinking college students to interact with the voice-based computer-delivered intervention and 30 to interact with the text-based computer-delivered intervention and compared their ratings of the systems as well as their motivation to change drinking and their drinking behavior at 1-month follow-up.
Results: Participants reported that the voice-based computer-delivered intervention engaged positively with them in the session and delivered content in a manner consistent with motivational interviewing principles. At 1-month follow-up, participants in the voice-based computer-delivered intervention condition reported significant decreases in quantity, frequency, and problems associated with drinking, and increased perceived importance of changing drinking behaviors. In comparison to the text-based computer-delivered intervention condition, those assigned to the voice-based computer-delivered intervention reported significantly fewer alcohol-related problems at the 1-month follow-up (incident rate ratio 0.60, 95% CI 0.44-0.83, P=.002). The conditions did not differ significantly on perceived importance of changing drinking or on measures of drinking quantity and frequency of heavy drinking.
Conclusions: Results indicate that it is feasible to construct a series of open-ended questions and a bank of responses and follow-up prompts that can be used in a future fully automated voice-based computer-delivered intervention that may more closely mirror human-delivered motivational interventions to reduce drinking. Such efforts will require using advanced speech recognition capabilities and machine-learning approaches to train a program to mirror the decisions made by human controllers in the voice-based computer-delivered intervention used in this study. In addition, future studies should examine enhancements that can increase the perceived warmth and empathy of voice-based computer-delivered interventions, possibly through greater personalization, improvements in the speech generation software, and embodying the computer-delivered intervention in a physical form. PMID:28659259
Self-Reported Ache, Pain, or Numbness in Feet and Use of Computers amongst Working-Age Finns.
Korpinen, Leena; Pääkkönen, Rauno; Gobba, Fabriziomaria
2016-11-07
The use of computers and other technical devices has increased. The aim of our work was to study the possible relation between self-reported foot symptoms and the use of computers and cell phones using a questionnaire. The study was carried out as a cross-sectional study by posting a questionnaire to 15,000 working-age Finns. A total of 6121 responded, and 7.1% of respondents reported that they very often experienced pain, numbness, and aches in the feet. They also often experienced other symptoms: 52.3% had symptoms in the neck, 53.5% had problems in the hip and lower back, and 14.6% often had sleeping disorders/disturbances. Only 11.2% of the respondents thought that their symptoms were connected to the use of desktop computers. We found that persons with symptoms in the feet quite often, or more often, had additional physical and mental symptoms. In future studies, it is important to take into account that persons with symptoms in the feet may very often have other symptoms, and the use of computers can influence these symptoms.
NASA Astrophysics Data System (ADS)
Falkner, Katrina; Vivian, Rebecca
2015-10-01
To support teachers to implement Computer Science curricula into classrooms from the very first year of school, teachers, schools and organisations seek quality curriculum resources to support implementation and teacher professional development. Until now, many Computer Science resources and outreach initiatives have targeted K-12 school-age children, with the intention to engage children and increase interest, rather than to formally teach concepts and skills. What is the educational quality of existing Computer Science resources, and to what extent are they suitable for classroom learning and teaching? In this paper, an assessment framework is presented to evaluate the quality of online Computer Science resources. Further, a semi-systematic review of available online Computer Science resources was conducted to evaluate resources available for classroom learning and teaching and to identify gaps in resource availability, using the Australian curriculum as a case study analysis. The findings reveal a predominance of quality resources; however, a number of critical gaps were identified. This paper provides recommendations and guidance for the development of new and supplementary resources and future research.
Mass storage system experiences and future needs at the National Center for Atmospheric Research
NASA Technical Reports Server (NTRS)
Olear, Bernard T.
1991-01-01
A summary and viewgraphs of a discussion presented at the National Space Science Data Center (NSSDC) Mass Storage Workshop are included. Some of the experiences of the Scientific Computing Division at the National Center for Atmospheric Research (NCAR) dealing with the 'data problem' are discussed. A brief history and a development of some basic mass storage system (MSS) principles are given. An attempt is made to show how these principles apply to the integration of various components into NCAR's MSS. Future MSS needs for future computing environments are discussed.
Research and the Personal Computer.
ERIC Educational Resources Information Center
Blackburn, D. A.
1989-01-01
Discussed is the history and elements of the personal computer. Its uses as a laboratory assistant and generic toolkit for mathematical analysis and modeling are included. The future of the personal computer in research is addressed. (KR)
Ti:sapphire - A theoretical assessment for its spectroscopy
NASA Astrophysics Data System (ADS)
Da Silva, A.; Boschetto, D.; Rax, J. M.; Chériaux, G.
2017-03-01
This article presents a theoretical computation of stimulated emission cross-sections from the known oscillator strength, for a broad class of materials (dielectric crystals hosting transition-metal impurity atoms). We apply the present approach to Ti:sapphire and check it by computing emission cross-section curves for both π and σ polarizations. We also establish a relationship between oscillator strength and radiative lifetime. Such an approach will allow future parametric studies of Ti:sapphire spectroscopic properties.
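For orientation, the standard electric-dipole (textbook) relation between radiative lifetime and absorption oscillator strength has the form below; this is offered as general background, and the exact expression employed in the article may differ (for example, by degeneracy or local-field correction factors).

\[
\frac{1}{\tau_{\mathrm{rad}}} = A_{21}
  = \frac{2\pi e^{2}\nu^{2}}{\varepsilon_{0} m_{e} c^{3}}\,\frac{g_{1}}{g_{2}}\, f_{12},
\]

where \(\nu\) is the transition frequency, \(g_{1}\) and \(g_{2}\) are the lower- and upper-level degeneracies, and \(f_{12}\) is the absorption oscillator strength (SI units).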
Computational Approaches to Simulation and Optimization of Global Aircraft Trajectories
NASA Technical Reports Server (NTRS)
Ng, Hok Kwan; Sridhar, Banavar
2016-01-01
This study examines three possible approaches to improving the speed of generating wind-optimal routes for air traffic at the national or global level: (a) using the resources of a supercomputer, (b) running the computations on multiple commercially available computers, and (c) implementing those same algorithms in NASA's Future ATM Concepts Evaluation Tool (FACET); each is compared to a standard implementation run on a single CPU. Wind-optimal aircraft trajectories are computed using global air traffic schedules. The run time and wait time on the supercomputer for trajectory optimization using various numbers of CPUs, ranging from 80 to 10,240 units, are compared with the total computational time for running the same computation on a single desktop computer and on multiple commercially available computers, to gauge the potential computational enhancement from parallel processing on the computer clusters. This study also re-implements the trajectory optimization algorithm to further reduce computational time through algorithm modifications and integrates it with FACET to facilitate the use of the new features, which calculate time-optimal routes between worldwide airport pairs in a wind field for use with existing FACET applications. The implementations of the trajectory optimization algorithms use the MATLAB, Python, and Java programming languages. The performance evaluations are done by comparing computational efficiencies and are based on the potential application of optimized trajectories. The paper shows that, in the absence of special privileges on a supercomputer, a cluster of commercially available computers provides a feasible approach for national and global air traffic system studies.
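As a rough sketch of the "multiple commercially available computers" idea, the snippet below distributes independent origin-destination route optimizations across worker processes; the airport pairs and the solver stub are placeholders, not FACET code or the study's actual algorithm.

```python
from multiprocessing import Pool

# Placeholder airport pairs; a real run would draw on worldwide traffic schedules.
airport_pairs = [("KSFO", "EGLL"), ("KJFK", "RJTT"), ("YSSY", "KLAX")]

def optimize_route(pair):
    """Stand-in for a wind-optimal trajectory solver for one origin-destination pair."""
    origin, destination = pair
    # ... a real solver would compute the minimum-time route in the wind field here ...
    return origin, destination, "optimal-route-placeholder"

if __name__ == "__main__":
    # Each pair is independent, so the work parallelizes trivially across cores or machines.
    with Pool(processes=4) as pool:
        for origin, dest, route in pool.map(optimize_route, airport_pairs):
            print(origin, "->", dest, ":", route)
```

Because the per-pair optimizations do not interact, this embarrassingly parallel structure is what makes clusters of ordinary computers a practical alternative to a supercomputer for such studies.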
NASA Technical Reports Server (NTRS)
Parikh, Paresh; Engelund, Walter; Armand, Sasan; Bittner, Robert
2004-01-01
A computational fluid dynamic (CFD) study is performed on the Hyper-X (X-43A) Launch Vehicle stack configuration in support of aerodynamic database generation in the transonic to hypersonic flow regime. The main aim of the study is the evaluation of a CFD method that can be used to support aerodynamic database development for similar future configurations. The CFD method uses the NASA Langley Research Center developed TetrUSS software, which is based on tetrahedral, unstructured grids. The Navier-Stokes computational method is first evaluated against a set of wind tunnel test data to gain confidence in the code's application to hypersonic Mach number flows. The evaluation includes comparison of the longitudinal stability derivatives on the complete stack configuration (which includes the X-43A/Hyper-X Research Vehicle, the launch vehicle and an adapter connecting the two), detailed surface pressure distributions at selected locations on the stack body, and component (rudder, elevons) forces and moments. The CFD method is further used to predict the stack aerodynamic performance at flow conditions where no experimental data are available, as well as for component loads for mechanical design and aero-elastic analyses. An excellent match between the computed and the test data over a range of flow conditions provides a computational tool that may be used for future similar hypersonic configurations with confidence.
Using Alice 2.0 to Design Games for People with Stroke.
Proffitt, Rachel; Kelleher, Caitlin; Baum, M Carolyn; Engsberg, Jack
2012-08-01
Computer and videogames are gaining in popularity as rehabilitation tools. Unfortunately, most systems still require extensive programming/engineering knowledge to create, something that therapists, as novice programmers, do not possess. There is software designed to allow novice programmers to create storyboards and games through simple drag-and-drop formats; however, the applications for therapeutic game development have not been studied. The purpose of this study was to have an occupational therapy (OT) student with no prior computer programming experience learn how to create computer games for persons with stroke using Alice 2.0, a drag-and-drop editor designed by Carnegie Mellon University (Pittsburgh, PA). The OT student learned how to use Alice 2.0 through a textbook, tutorials, and assistance from computer science students. She kept a journal of her process, detailing her successes and challenges. The OT student created three games for people with stroke using Alice 2.0. She found that although there were many supports in Alice for creating stories, it lacked critical pieces necessary for game design. Her recommendations for a future programming environment for therapists were that it (1) be efficient, (2) include basic game design pieces so therapists do not have to create them, (3) provide technical support, and (4) be simple. With the incorporation of these recommendations, a future programming environment for therapists will be an effective tool for therapeutic game development.
Workshops of the Fifth International Brain-Computer Interface Meeting: Defining the Future.
Huggins, Jane E; Guger, Christoph; Allison, Brendan; Anderson, Charles W; Batista, Aaron; Brouwer, Anne-Marie A-M; Brunner, Clemens; Chavarriaga, Ricardo; Fried-Oken, Melanie; Gunduz, Aysegul; Gupta, Disha; Kübler, Andrea; Leeb, Robert; Lotte, Fabien; Miller, Lee E; Müller-Putz, Gernot; Rutkowski, Tomasz; Tangermann, Michael; Thompson, David Edward
2014-01-01
The Fifth International Brain-Computer Interface (BCI) Meeting met June 3-7th, 2013 at the Asilomar Conference Grounds, Pacific Grove, California. The conference included 19 workshops covering topics in brain-computer interface and brain-machine interface research. Topics included translation of BCIs into clinical use, standardization and certification, types of brain activity to use for BCI, recording methods, the effects of plasticity, special interest topics in BCI applications, and future BCI directions. BCI research is well established and transitioning to practical use to benefit people with physical impairments. At the same time, new applications are being explored, both for people with physical impairments and beyond. Here we provide summaries of each workshop, illustrating the breadth and depth of BCI research and highlighting important issues for future research and development.
Souza, Eliana Pereira Salles de; Cabrera, Eliana Márcia Sotello; Braile, Domingo Marcolino
2010-01-01
Technological advances and the Internet have contributed to the increased disclosure and updating of knowledge and science. Scientific papers are considered the best form of disclosure of information and have been undergoing many changes, not in the way they are developed, but in the structure of publication. The Future paper, the name given to this new structure, uses hypermedia resources, allowing quick, easy and organized online access to these items. The exchange of information, comments and criticisms can be performed in real time, providing agility in the disclosure of science. The trend for the future of documents, whether from professionals or enterprises, is "cloud computing", in which all documents will be developed and updated using various devices (computer, palm, netbook, iPad) without the need to have the software installed on one's own computer, requiring only an Internet connection.
CADD medicine: design is the potion that can cure my disease
NASA Astrophysics Data System (ADS)
Manas, Eric S.; Green, Darren V. S.
2017-03-01
The acronym "CADD" is often used interchangeably to refer to "Computer Aided Drug Discovery" and "Computer Aided Drug Design". While the former definition implies the use of a computer to impact one or more aspects of discovering a drug, in this paper we contend that computational chemists are most effective when they enable teams to apply true design principles as they strive to create medicines to treat human disease. We argue that teams must bring to bear multiple sub-disciplines of computational chemistry in an integrated manner in order to utilize these principles to address the multi-objective nature of the drug discovery problem. Impact, resourcing principles, and future directions for the field are also discussed, including areas of future opportunity as well as a cautionary note about hype and hubris.
Coping with Computing Success.
ERIC Educational Resources Information Center
Breslin, Richard D.
Elements of computing success of Iona College, the challenges it currently faces, and the strategies conceived to cope with future computing needs are discussed. The college has mandated computer literacy for students and offers nine degrees in the computerized information system/management information system areas. Since planning is needed in…
Learning With Computers; Today and Tomorrow.
ERIC Educational Resources Information Center
Bork, Alfred
This paper describes the present practical use of computers in two large beginning physics courses at the University of California, Irvine; discusses the versatility and desirability of computers in the field of education; and projects the possible future directions of computer-based learning. The advantages and disadvantages of educational…
Lopez, Richard B; Stillman, Paul E; Heatherton, Todd F; Freeman, Jonathan B
2018-01-01
In this review, we present the case for using computer mouse-tracking techniques to examine psychological processes that support (and hinder) self-regulation of eating. We first argue that computer mouse-tracking is suitable for studying the simultaneous engagement of, and dynamic interactions between, multiple perceptual and cognitive processes as they unfold and interact over a fine temporal scale (i.e., hundreds of milliseconds). Next, we review recent work that implemented mouse-tracking techniques by measuring mouse movements as participants chose between various food items (of varying nutritional content). Lastly, we propose next steps for future investigations to link behavioral features from mouse-tracking paradigms, corresponding neural correlates, and downstream eating behaviors.
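Mouse-tracking studies typically reduce a recorded cursor path to summary indices; one commonly used index is the maximum deviation of the path from the straight start-to-end line. The sketch below computes that index on made-up coordinates; real studies sample x,y positions every few milliseconds during a choice trial.

# Sketch: computing "maximum deviation" (a common mouse-tracking index)
# from a recorded cursor trajectory. The coordinates below are invented.
import numpy as np

def max_deviation(xy):
    """Largest perpendicular distance of the path from the straight
    start-to-end line."""
    xy = np.asarray(xy, dtype=float)
    start, end = xy[0], xy[-1]
    direction = end - start
    length = np.linalg.norm(direction)
    if length == 0:
        return 0.0
    # Perpendicular distance of each sample from the start-end line.
    rel = xy - start
    cross = np.abs(rel[:, 0] * direction[1] - rel[:, 1] * direction[0])
    return float(np.max(cross / length))

trajectory = [(0, 0), (5, 3), (10, 9), (20, 12), (30, 10)]
print(max_deviation(trajectory))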
NASA Technical Reports Server (NTRS)
Stricker, L. T.
1973-01-01
The DORCA Applications study has been directed at development of a data bank management computer program identified as DORMAN. Because of the size of the DORCA data files and the manipulations required on that data to support analyses with the DORCA program, automated data techniques to replace time-consuming manual input generation are required. The Dynamic Operations Requirements and Cost Analysis (DORCA) program was developed for use by NASA in planning future space programs. Both programs are designed for implementation on the UNIVAC 1108 computing system. The purpose of this Executive Summary Report is to define for NASA management the basic functions of the DORMAN program and its capabilities.
The role of voice input for human-machine communication.
Cohen, P R; Oviatt, S L
1995-01-01
Optimism is growing that the near future will witness rapid growth in human-computer interaction using voice. System prototypes have recently been built that demonstrate speaker-independent, real-time speech recognition and understanding of naturally spoken utterances with vocabularies of 1000 to 2000 words, and larger. Already, computer manufacturers are building speech recognition subsystems into their new product lines. However, before this technology can be broadly useful, a substantial knowledge base is needed about human spoken language and performance during computer-based spoken interaction. This paper reviews application areas in which spoken interaction can play a significant role, assesses potential benefits of spoken interaction with machines, and compares voice with other modalities of human-computer interaction. It also discusses information that will be needed to build a firm empirical foundation for the design of future spoken and multimodal interfaces. Finally, it argues for a more systematic and scientific approach to investigating spoken input and performance with future language technology. PMID:7479803
Computational Fragment-Based Drug Design: Current Trends, Strategies, and Applications.
Bian, Yuemin; Xie, Xiang-Qun Sean
2018-04-09
Fragment-based drug design (FBDD) has been an effective methodology for drug development for decades. Successful applications of this strategy have brought both opportunities and challenges to the field of pharmaceutical science. Recent progress in computational fragment-based drug design provides an additional approach for future research in a time- and labor-efficient manner. Combining multiple in silico methodologies, computational FBDD offers flexibility in fragment library selection, protein model generation, and fragment/compound docking mode prediction. These characteristics give computational FBDD an advantage in designing novel candidate compounds for a given target. The purpose of this review is to discuss the latest advances, ranging from commonly used strategies to novel concepts and technologies in computational fragment-based drug design. In particular, this review compares the specifications and advantages of experimental and computational FBDD, and discusses limitations and future prospects.
NASA Technical Reports Server (NTRS)
Morgan, R. P.; Singh, J. P.; Rothenberg, D.; Robinson, B. E.
1975-01-01
The needs to be served, the subsectors in which the system might be used, the technology employed, and the prospects for future utilization of an educational telecommunications delivery system are described and analyzed. Educational subsectors are analyzed with emphasis on the current status and trends within each subsector. Issues which affect future development, and prospects for future use of media, technology, and large-scale electronic delivery within each subsector are included. Information on technology utilization is presented. Educational telecommunications services are identified and grouped into categories: public television and radio, instructional television, computer aided instruction, computer resource sharing, and information resource sharing. Technology based services, their current utilization, and factors which affect future development are stressed. The role of communications satellites in providing these services is discussed. Efforts to analyze and estimate future utilization of large-scale educational telecommunications are summarized. Factors which affect future utilization are identified. Conclusions are presented.
Study of basic computer competence among public health nurses in Taiwan.
Yang, Kuei-Feng; Yu, Shu; Lin, Ming-Sheng; Hsu, Chia-Ling
2004-03-01
Rapid advances in information technology and media have made distance learning on the Internet possible. This new model of learning allows greater efficiency and flexibility in knowledge acquisition. Since basic computer competence is a prerequisite for this new learning model, this study was conducted to examine the basic computer competence of public health nurses in Taiwan and explore factors influencing computer competence. A national cross-sectional randomized study was conducted with 329 public health nurses. A questionnaire was used to collect data and was delivered by mail. Results indicate that the basic computer competence of public health nurses in Taiwan still needs to be improved (mean = 57.57 ± 2.83; total score range 26-130). Among the five most frequently used software programs, nurses were most knowledgeable about Word and least knowledgeable about PowerPoint. Stepwise multiple regression analysis revealed eight variables (weekly number of hours spent online at home, weekly amount of time spent online at work, weekly frequency of computer use at work, previous computer training, computer at workplace and Internet access, job position, education level, and age) that significantly influenced computer competence, which together accounted for 39.0% of the variance. In conclusion, greater computer competence, broader educational programs regarding computer technology, and a greater emphasis on computers at work are necessary to increase the usefulness of distance learning via the Internet in Taiwan. Building a user-friendly environment is important in developing this new media model of learning for the future.
ERIC Educational Resources Information Center
Wohl, Benjamin S.; Beck, Sophie; Blair, Lynne
2017-01-01
In these early stages of implementation of the English computing curriculum policy reforms, there are uncertainties regarding the intentions of computing education for young people. To date, research regarding the English computing curriculum has been mostly concerned with the content of the curriculum, its delivery and surrounding pedagogy. In…
Petascale Computing: Impact on Future NASA Missions
NASA Technical Reports Server (NTRS)
Brooks, Walter
2006-01-01
This slide presentation reviews NASA's use of a new supercomputer, called Columbia, capable of operating at 62 teraflops and the fourth-fastest computer in the world. It will serve all mission directorates. The applications it supports include aerospace analysis and design, propulsion subsystem analysis, climate modeling, hurricane prediction, and astrophysics and cosmology.
Current state and future direction of computer systems at NASA Langley Research Center
NASA Technical Reports Server (NTRS)
Rogers, James L. (Editor); Tucker, Jerry H. (Editor)
1992-01-01
Computer systems have advanced at a rate unmatched by any other area of technology. As performance has dramatically increased there has been an equally dramatic reduction in cost. This constant cost performance improvement has precipitated the pervasiveness of computer systems into virtually all areas of technology. This improvement is due primarily to advances in microelectronics. Most people are now convinced that the new generation of supercomputers will be built using a large number (possibly thousands) of high performance microprocessors. Although the spectacular improvements in computer systems have come about because of these hardware advances, there has also been a steady improvement in software techniques. In an effort to understand how these hardware and software advances will affect research at NASA LaRC, the Computer Systems Technical Committee drafted this white paper to examine the current state and possible future directions of computer systems at the Center. This paper discusses selected important areas of computer systems including real-time systems, embedded systems, high performance computing, distributed computing networks, data acquisition systems, artificial intelligence, and visualization.
Stone, John E; Hallock, Michael J; Phillips, James C; Peterson, Joseph R; Luthey-Schulten, Zaida; Schulten, Klaus
2016-05-01
Many of the continuing scientific advances achieved through computational biology are predicated on the availability of ongoing increases in computational power required for detailed simulation and analysis of cellular processes on biologically-relevant timescales. A critical challenge facing the development of future exascale supercomputer systems is the development of new computing hardware and associated scientific applications that dramatically improve upon the energy efficiency of existing solutions, while providing increased simulation, analysis, and visualization performance. Mobile computing platforms have recently become powerful enough to support interactive molecular visualization tasks that were previously only possible on laptops and workstations, creating future opportunities for their convenient use for meetings, remote collaboration, and as head mounted displays for immersive stereoscopic viewing. We describe early experiences adapting several biomolecular simulation and analysis applications for emerging heterogeneous computing platforms that combine power-efficient system-on-chip multi-core CPUs with high-performance massively parallel GPUs. We present low-cost power monitoring instrumentation that provides sufficient temporal resolution to evaluate the power consumption of individual CPU algorithms and GPU kernels. We compare the performance and energy efficiency of scientific applications running on emerging platforms with results obtained on traditional platforms, identify hardware and algorithmic performance bottlenecks that affect the usability of these platforms, and describe avenues for improving both the hardware and applications in pursuit of the needs of molecular modeling tasks on mobile devices and future exascale computers.
Career Repertoires of IT Students: A Group Counselling Case Study in Higher Education
ERIC Educational Resources Information Center
Penttinen, Leena; Vesisenaho, Mikko
2013-01-01
Uncertainty about future career prospects has increased enormously for students enrolled in higher education Information Technology (IT) programs. However, many computer science programmes pay little attention to career counselling. This article reports the results of a pilot study intended to develop group counselling for IT students to promote…
Empathy in Future Teachers of the Pedagogical and Technological University of Colombia
ERIC Educational Resources Information Center
Herrera Torres, Lucía; Buitrago Bonilla, Rafael Enrique; Avila Moreno, Aida Karina
2016-01-01
This study analyzes cognitive and emotional empathy in students who started their training at the Education Science Faculty of the Pedagogical and Technological University of Colombia. The sample was formed by 317 students enrolled in the study programs of Preschool, Plastic Arts, Natural Sciences, Physical Education, Philosophy, Computer Science,…
75 FR 17207 - Electronic On-Board Recorders for Hours-of-Service Compliance
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-05
... to data limitation, FMCSA used outdated studies in the analysis for this rule. For future HOS rulemakings, FMCSA will use updated studies and reports to analyze impacts. \\1\\ Estimates of benefits and... percent of the long-distance drivers in 2005 said there were EOBRs or other on-board computers in their...
PREDICTORS OF COMPUTER USE IN COMMUNITY-DWELLING ETHNICALLY DIVERSE OLDER ADULTS
Werner, Julie M.; Carlson, Mike; Jordan-Marsh, Maryalice; Clark, Florence
2011-01-01
Objective: In this study we analyzed self-reported computer use, demographic variables, psychosocial variables, and health and well-being variables collected from 460 ethnically diverse, community-dwelling elders in order to investigate the relationship computer use has with demographics, well-being and other key psychosocial variables in older adults. Background: Although younger elders with more education, those who employ active coping strategies, or those who are low in anxiety levels are thought to use computers at higher rates than others, previous research has produced mixed or inconclusive results regarding ethnic, gender, and psychological factors, or has concentrated on computer-specific psychological factors only (e.g., computer anxiety). Few such studies have employed large sample sizes or have focused on ethnically diverse populations of community-dwelling elders. Method: With a large number of overlapping predictors, zero-order analysis alone is poorly equipped to identify variables that are independently associated with computer use. Accordingly, both zero-order and stepwise logistic regression analyses were conducted to determine the correlates of two types of computer use: email and general computer use. Results: Results indicate that younger age, greater level of education, non-Hispanic ethnicity, behaviorally active coping style, general physical health, and role-related emotional health each independently predicted computer usage. Conclusion: Study findings highlight differences in computer usage, especially in regard to Hispanic ethnicity and specific health and well-being factors. Application: Potential applications of this research include future intervention studies, individualized computer-based activity programming, or customizable software and user interface design for older adults responsive to a variety of personal characteristics and capabilities. PMID:22046718
Predictors of computer use in community-dwelling, ethnically diverse older adults.
Werner, Julie M; Carlson, Mike; Jordan-Marsh, Maryalice; Clark, Florence
2011-10-01
In this study, we analyzed self-reported computer use, demographic variables, psychosocial variables, and health and well-being variables collected from 460 ethnically diverse, community-dwelling elders to investigate the relationship computer use has with demographics, well-being, and other key psychosocial variables in older adults. Although younger elders with more education, those who employ active coping strategies, or those who are low in anxiety levels are thought to use computers at higher rates than do others, previous research has produced mixed or inconclusive results regarding ethnic, gender, and psychological factors or has concentrated on computer-specific psychological factors only (e.g., computer anxiety). Few such studies have employed large sample sizes or have focused on ethnically diverse populations of community-dwelling elders. With a large number of overlapping predictors, zero-order analysis alone is poorly equipped to identify variables that are independently associated with computer use. Accordingly, both zero-order and stepwise logistic regression analyses were conducted to determine the correlates of two types of computer use: e-mail and general computer use. Results indicate that younger age, greater level of education, non-Hispanic ethnicity, behaviorally active coping style, general physical health, and role-related emotional health each independently predicted computer usage. Study findings highlight differences in computer usage, especially in regard to Hispanic ethnicity and specific health and well-being factors. Potential applications of this research include future intervention studies, individualized computer-based activity programming, or customizable software and user interface design for older adults responsive to a variety of personal characteristics and capabilities.
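The analysis described in the two records above is a multivariable logistic regression of a binary computer-use outcome on demographic and psychosocial predictors. The sketch below shows that general kind of model on synthetic data; the variable names mirror the abstract, but the data, coefficients, and the stepwise selection procedure of the actual study are not reproduced here.

# Sketch of the kind of logistic-regression analysis described above, on
# synthetic data. Variable names (age, education, coping) are assumptions
# echoing the abstract; nothing here comes from the study's dataset.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 460
X = np.column_stack([
    rng.normal(75, 7, n),      # age
    rng.normal(12, 3, n),      # years of education
    rng.integers(0, 2, n),     # behaviorally active coping style (0/1)
])
logit_p = -4 - 0.05 * (X[:, 0] - 75) + 0.2 * X[:, 1] + 0.8 * X[:, 2]
y = rng.random(n) < 1 / (1 + np.exp(-logit_p))   # simulated computer use

model = sm.Logit(y.astype(int), sm.add_constant(X)).fit(disp=0)
print(model.summary(xname=["const", "age", "education", "active_coping"]))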
Seventy Years of Computing in the Nuclear Weapons Program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Archer, Billy Joe
Los Alamos has continuously been at the forefront of scientific computing since it helped found the field. This talk will explore the rich history of computing in the Los Alamos weapons program. The current status of computing will be discussed, as will the expectations for the near future.
Is There a Microcomputer in Your Future? ComputerTown Thinks The Answer Is "Yes."
ERIC Educational Resources Information Center
Harvie, Barbara; Anton, Julie
1983-01-01
The services of ComputerTown, a nonprofit computer literacy project of the People's Computer Company in Menlo Park, California with 150 worldwide affiliates, are enumerated including getting started, funding sources, selecting hardware, software selection, support materials, administrative details, special offerings (classes, events), and common…
Preparing Future Secondary Computer Science Educators
ERIC Educational Resources Information Center
Ajwa, Iyad
2007-01-01
Although nearly every college offers a major in computer science, many computer science teachers at the secondary level have received little formal training. This paper presents details of a project that could make a significant contribution to national efforts to improve computer science education by combining teacher education and professional…
ERIC Educational Resources Information Center
Peled, Abraham
1987-01-01
Discusses some of the future trends in the use of the computer in our society, suggesting that computing is now entering a new phase in which it will grow exponentially more powerful, flexible, and sophisticated in the next decade. Describes some of the latest breakthroughs in computer hardware and software technology. (TW)
Preservice Teachers' Computer Use in Single Computer Training Courses; Relationships and Predictions
ERIC Educational Resources Information Center
Zogheib, Salah
2015-01-01
Single computer courses offered at colleges of education are expected to provide preservice teachers with the skills and expertise needed to adopt computer technology in their future classrooms. However, preservice teachers still find difficulty adopting such technology. This research paper investigated relationships among preservice teachers'…
Ciullo, Stephen; Falcomata, Terry S; Pfannenstiel, Kathleen; Billingsley, Glenna
2015-01-01
Concept maps have been used to help students with learning disabilities (LD) improve literacy skills and content learning, predominantly in secondary school. However, despite increased access to classroom technology, no previous studies have examined the efficacy of computer-based concept maps to improve learning from informational text for students with LD in elementary school. In this study, we used a concurrent delayed multiple probe design to evaluate the interactive use of computer-based concept maps on content acquisition with science and social studies texts for Hispanic students with LD in Grades 4 and 5. Findings from this study suggest that students improved content knowledge during intervention relative to a traditional instruction baseline condition. Learning outcomes and social validity information are considered to inform recommendations for future research and the feasibility of classroom implementation. © The Author(s) 2014.
Korpinen, Leena; Pääkkönen, Rauno; Gobba, Fabriziomaria
2018-03-01
Recently, computer, mobile phone and Internet use has increased. This study aimed to determine the possible relation between self-reported wrist and finger symptoms (aches, pain or numbness) and the use of computers and mobile phones, and to analyze how the symptoms are specifically associated with using desktop computers, portable computers or mini-computers and mobile phones. A questionnaire was sent to 15,000 working-age Finns (age 18-65); 723 respondents reported wrist and finger symptoms often or more frequently with use. Over 80% used mobile phones daily and less than 30% used desktop computers or the Internet daily at leisure. For example, over 89.8% quite often or often experienced pain, numbness or aches in the neck, and 61.3% had aches in the hips and the lower back. Only 33.7% connected their symptoms to computer use. In the future, the development of new devices and Internet services should incorporate the ergonomics of the hands and wrists.
NASA Astrophysics Data System (ADS)
Doroszkiewicz, J. M.; Romanowicz, R. J.
2016-12-01
The standard procedure for assessing climate change impacts on future hydrological extremes consists of a chain of consecutive steps, starting from the choice of a GCM driven by an assumed CO2 scenario, through downscaling of climatic forcing to the catchment scale, estimation of hydrological extreme indices using hydrological modelling tools, and subsequent derivation of flood risk maps with the help of a hydraulic model. Among the many possible sources of uncertainty, the main ones relate to future climate scenarios, climate models, downscaling techniques, and hydrological and hydraulic models. Unfortunately, we cannot directly assess the impact of these different sources of uncertainty on future flood risk due to the lack of observations of future climate realizations. The aim of this study is an assessment of the relative impact of different sources of uncertainty on the uncertainty of flood risk maps. Due to the complexity of the processes involved, an assessment of the total uncertainty of maps of inundation probability can be very computationally demanding. As a way forward, we present an application of a hydraulic model simulator based on a nonlinear transfer function model for chosen locations along the river reach. The transfer function model parameters are estimated from simulations of the hydraulic model at each of the model cross-sections. The study shows that the application of a simulator substantially reduces the computational requirements of deriving flood risk maps under future climatic conditions. The Biala Tarnowska catchment, situated in southern Poland, is used as a case study. Future discharges at the input to the hydraulic model are obtained using the HBV model and climate projections from the EUROCORDEX project. The study describes a cascade of uncertainty related to the different stages of deriving flood risk maps under changing climate conditions. In this context it takes into account the uncertainty of future climate projections, the uncertainty of the flow routing model, the propagation of that uncertainty through the hydraulic model and, finally, the uncertainty related to the derivation of flood risk maps.
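The simulator idea above replaces repeated runs of an expensive hydraulic model with a cheap fitted function at each cross-section. The toy sketch below fits a simple static rating-curve form (an assumption made only for illustration; the study uses a dynamic nonlinear transfer function model) to a handful of hypothetical hydraulic-model runs and then emulates water levels for new discharge scenarios.

# Toy emulator sketch: fit a simple nonlinear function mapping inflow
# discharge Q to water level h at one cross-section, so the expensive
# hydraulic model need not be re-run for every climate realization.
# The rating-curve form h = a*Q**b + c is an assumption for this sketch only,
# and the training data below are invented.
import numpy as np
from scipy.optimize import curve_fit

def rating_curve(q, a, b, c):
    return a * np.power(q, b) + c

# Pretend these came from a handful of full hydraulic-model runs.
q_train = np.array([50., 100., 200., 400., 800.])
h_train = np.array([1.2, 1.9, 2.8, 4.1, 6.0])

params, _ = curve_fit(rating_curve, q_train, h_train, p0=[0.1, 0.6, 0.5])

# Emulate water levels for many future discharge scenarios cheaply.
q_future = np.linspace(60., 900., 5)
print(rating_curve(q_future, *params))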
ERIC Educational Resources Information Center
Smith, Peter, Ed.; Smith, Carol L., Ed.
2005-01-01
These 2005 Association of Small Computer Users in Education (ASCUE) conference proceedings present the theme "Campus Technology: Anticipating the Future." The proceedings introduce the ASCUE Officers and Directors and provide abstracts of the pre-conference workshops. The full-text conference papers in this document include: (1) Developing…
Communication Avoiding and Overlapping for Numerical Linear Algebra
2012-05-08
To scale numerical linear algebra problems to future exascale systems, communication cost must be avoided or overlapped. Communication-avoiding 2.5D algorithms improve scalability by reducing... The cost of communication will continue to grow relative to the cost of computation. With exascale computing as the long-term goal, the community needs to develop techniques...
A computer vision for animal ecology.
Weinstein, Ben G
2018-05-01
A central goal of animal ecology is to observe species in the natural world. The cost and challenge of data collection often limit the breadth and scope of ecological study. Ecologists often use image capture to bolster data collection in time and space. However, the ability to process these images remains a bottleneck. Computer vision can greatly increase the efficiency, repeatability and accuracy of image review. Computer vision uses image features, such as colour, shape and texture to infer image content. I provide a brief primer on ecological computer vision to outline its goals, tools and applications to animal ecology. I reviewed 187 existing applications of computer vision and divided articles into ecological description, counting and identity tasks. I discuss recommendations for enhancing the collaboration between ecologists and computer scientists and highlight areas for future growth of automated image analysis. © 2017 The Author. Journal of Animal Ecology © 2017 British Ecological Society.
Weightbearing Computed Tomography of the Foot and Ankle: Emerging Technology Topical Review.
Barg, Alexej; Bailey, Travis; Richter, Martinus; de Cesar Netto, Cesar; Lintz, François; Burssens, Arne; Phisitkul, Phinit; Hanrahan, Christopher J; Saltzman, Charles L
2018-03-01
In the last decade, cone-beam computed tomography technology with improved designs allowing flexible gantry movements has allowed both supine and standing weight-bearing imaging of the lower extremity. There is an increasing amount of literature describing the use of weightbearing computed tomography in patients with foot and ankle disorders. To date, there is no review article summarizing this imaging modality in the foot and ankle. Therefore, we performed a systematic literature review of relevant clinical studies targeting the use of weightbearing computed tomography in diagnosis of patients with foot and ankle disorders. Furthermore, this review aims to offer insight to those with interest in considering possible future research opportunities with use of this technology. Level V, expert opinion.
Brain-Computer Interfaces in Medicine
Shih, Jerry J.; Krusienski, Dean J.; Wolpaw, Jonathan R.
2012-01-01
Brain-computer interfaces (BCIs) acquire brain signals, analyze them, and translate them into commands that are relayed to output devices that carry out desired actions. BCIs do not use normal neuromuscular output pathways. The main goal of BCI is to replace or restore useful function to people disabled by neuromuscular disorders such as amyotrophic lateral sclerosis, cerebral palsy, stroke, or spinal cord injury. From initial demonstrations of electroencephalography-based spelling and single-neuron-based device control, researchers have gone on to use electroencephalographic, intracortical, electrocorticographic, and other brain signals for increasingly complex control of cursors, robotic arms, prostheses, wheelchairs, and other devices. Brain-computer interfaces may also prove useful for rehabilitation after stroke and for other disorders. In the future, they might augment the performance of surgeons or other medical professionals. Brain-computer interface technology is the focus of a rapidly growing research and development enterprise that is greatly exciting scientists, engineers, clinicians, and the public in general. Its future achievements will depend on advances in 3 crucial areas. Brain-computer interfaces need signal-acquisition hardware that is convenient, portable, safe, and able to function in all environments. Brain-computer interface systems need to be validated in long-term studies of real-world use by people with severe disabilities, and effective and viable models for their widespread dissemination must be implemented. Finally, the day-to-day and moment-to-moment reliability of BCI performance must be improved so that it approaches the reliability of natural muscle-based function. PMID:22325364
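The abstract above describes the generic BCI pipeline: acquire a brain signal, extract features, and translate them into a command. The following minimal sketch illustrates only that generic flow on a synthetic one-channel trace; the sampling rate, alpha-band feature, threshold, and command name are all assumptions for illustration, not any particular BCI system.

# Minimal, generic sketch of the BCI pipeline (signal in, feature out,
# command out), using a synthetic one-channel "EEG" trace. Real systems use
# multi-channel recordings and trained classifiers.
import numpy as np
from scipy.signal import welch

fs = 250                                     # sampling rate in Hz (assumed)
t = np.arange(0, 2.0, 1 / fs)
eeg = 10e-6 * np.sin(2 * np.pi * 11 * t) + 2e-6 * np.random.randn(t.size)

freqs, psd = welch(eeg, fs=fs, nperseg=fs)   # power spectral density
alpha = psd[(freqs >= 8) & (freqs <= 13)].mean()   # alpha-band feature

command = "move_cursor_left" if alpha > 1e-12 else "idle"   # toy threshold
print(f"alpha-band power: {alpha:.3e} -> command: {command}")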
ERIC Educational Resources Information Center
Ranade, Sanjay; Schraeder, Jeff
1991-01-01
Presents an overview of the mass storage market and discusses mass storage systems as part of computer networks. Systems for personal computers, workstations, minicomputers, and mainframe computers are described; file servers are explained; system integration issues are raised; and future possibilities are suggested. (LRW)
Multimedia and the Future of Distance Learning Technology.
ERIC Educational Resources Information Center
Barnard, John
1992-01-01
Describes recent innovations in distance learning technology, including the use of video technology; personal computers, including computer conferencing, computer-mediated communication, and workstations; multimedia, including hypermedia; Integrated Services Digital Networks (ISDN); and fiber optics. Research implications for multimedia and…
NASA Astrophysics Data System (ADS)
Cholko, Timothy; Chen, Wei; Tang, Zhiye; Chang, Chia-en A.
2018-05-01
Abnormal activity of cyclin-dependent kinase 8 (CDK8) along with its partner protein cyclin C (CycC) is a common feature of many diseases including colorectal cancer. Using molecular dynamics (MD) simulations, this study determined the dynamics of the CDK8-CycC system and we obtained detailed breakdowns of binding energy contributions for four type-I and five type-II CDK8 inhibitors. We revealed system motions and conformational changes that will affect ligand binding, confirmed the essentialness of CycC for inclusion in future computational studies, and provide guidance in development of CDK8 binders. We employed unbiased all-atom MD simulations for 500 ns on twelve CDK8-CycC systems, including apoproteins and protein-ligand complexes, then performed principal component analysis (PCA) and measured the RMSF of key regions to identify protein dynamics. Binding pocket volume analysis identified conformational changes that accompany ligand binding. Next, H-bond analysis, residue-wise interaction calculations, and MM/PBSA were performed to characterize protein-ligand interactions and find the binding energy. We discovered that CycC is vital for maintaining a proper conformation of CDK8 to facilitate ligand binding and that the system exhibits motion that should be carefully considered in future computational work. Surprisingly, we found that motion of the activation loop did not affect ligand binding. Type-I and type-II ligand binding is driven by van der Waals interactions, but electrostatic energy and entropic penalties affect type-II binding as well. Binding of both ligand types affects protein flexibility. Based on this we provide suggestions for development of tighter-binding CDK8 inhibitors and offer insight that can aid future computational studies.
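One step in the workflow above is principal component analysis (PCA) of the trajectory to identify dominant collective motions. The sketch below shows that step in its generic form on a random stand-in array; frame alignment, atom selection, and everything specific to the CDK8-CycC simulations are omitted.

# Sketch of the PCA step: find dominant collective motions from a
# (frames x 3N) coordinate matrix. A random array stands in for real,
# aligned MD trajectory frames.
import numpy as np

rng = np.random.default_rng(1)
frames, n_atoms = 500, 100
coords = rng.normal(size=(frames, 3 * n_atoms))     # placeholder trajectory

centered = coords - coords.mean(axis=0)              # remove the mean structure
cov = np.cov(centered, rowvar=False)                  # 3N x 3N covariance
eigvals, eigvecs = np.linalg.eigh(cov)                # ascending eigenvalues

# Fraction of total fluctuation captured by the top 5 principal components.
top5 = eigvals[::-1][:5]
print(top5 / eigvals.sum())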
Harrington, Susan S.; Walker, Bonnie L.
2010-01-01
Background: Older adults in small residential board and care facilities are at a particularly high risk of fire death and injury because of their characteristics and environment. Methods: The authors investigated computer-based instruction as a way to teach fire emergency planning to owners, operators, and staff of small residential board and care facilities. Participants (N = 59) were randomly assigned to a treatment or control group. Results: Study participants who completed the training significantly improved their scores from pre- to posttest when compared to a control group. Participants indicated on the course evaluation that the computers were easy to use for training (97%) and that they would like to use computers for future training courses (97%). Conclusions: This study demonstrates the potential for using interactive computer-based training as a viable alternative to instructor-led training to meet the fire safety training needs of owners, operators, and staff of small board and care facilities for the elderly. PMID:19263929
Xiao, Bo; Imel, Zac E.; Georgiou, Panayiotis; Atkins, David C.; Narayanan, Shrikanth S.
2017-01-01
Empathy is an important psychological process that facilitates human communication and interaction. Enhancement of empathy has profound significance in a range of applications. In this paper, we review emerging directions of research on computational analysis of empathy expression and perception as well as empathic interactions, including their simulation. We summarize the work on empathic expression analysis by the targeted signal modalities (e.g., text, audio, facial expressions). We categorize empathy simulation studies into theory-based emotion space modeling or application-driven user and context modeling. We summarize challenges in computational study of empathy including conceptual framing and understanding of empathy, data availability, appropriate use and validation of machine learning techniques, and behavior signal processing. Finally, we propose a unified view of empathy computation, and offer a series of open problems for future research. PMID:27017830
White-collar workers' self-reported physical symptoms associated with using computers.
Korpinen, Leena; Pääkkönen, Rauno; Gobba, Fabriziomaria
2012-01-01
The aim of our work was to study the physical symptoms of upper- and lower-level white-collar workers using a questionnaire. The study was cross-sectional with a questionnaire posted to 15,000 working-age persons. The responses (6121) included 970 upper- and 1150 lower-level white-collar workers. In the upper- and lower-level white-collar worker groups, 45.7 and 56.0%, respectively, had experienced pain, numbness and aches in the neck either pretty often or more frequently. When comparing daily computer users and nonusers, there were significant differences in pain, numbness and aches in the neck or in the shoulders. In addition, age and gender influenced some physical symptoms. In the future, it is essential to take into account that working with computers can be especially associated with physical symptoms in the neck and in the shoulders when workers use computers daily.
NASA Astrophysics Data System (ADS)
Honing, Henkjan; Zuidema, Willem
2014-09-01
The future of cognitive science will be about bridging neuroscience and behavioral studies, with essential roles played by comparative biology, formal modeling, and the theory of computation. Nowhere will this integration be more strongly needed than in understanding the biological basis of language and music. We thus strongly sympathize with the general framework that Fitch [1] proposes, and welcome the remarkably broad and readable review he presents to support it.
Cable Connected Spinning Spacecraft, 1. the Canonical Equations, 2. Urban Mass Transportation, 3
NASA Technical Reports Server (NTRS)
Sitchin, A.
1972-01-01
Work on the dynamics of cable-connected spinning spacecraft was completed by formulating the equations of motion using both the canonical equations and Lagrange's equations and programming them for numerical solution on a digital computer. These energy-based formulations will permit future addition of the effect of cable mass. Comparative runs indicate that the canonical formulation requires less computer time. Available literature on urban mass transportation was surveyed. Areas of the private rapid transit concept of urban transportation are also studied.
Goshima, Yoshio; Hida, Tomonobu; Gotoh, Toshiyuki
2012-01-01
Axonal transport plays a crucial role in neuronal morphogenesis, survival and function. Despite its importance, however, the molecular mechanisms of axonal transport remain mostly unknown because a simple and quantitative assay system for monitoring this cellular process has been lacking. In order to better characterize the mechanisms involved in axonal transport, we formulate a novel computer-assisted system for monitoring axonal transport. Potential uses of this system and implications for future studies are discussed.
Computer-Based Instruction in Military Environments: Defense Research Series. Volume 1
1987-01-01
following papers you will find both a practical and scientific basis for the way current and future training and training systems should be designed, applied...should be expended on the many payoffs of computer-based instructional techniques. As you study these papers be aware that they are only part of the...your goal is to accomplish XXX, you should next do YYY" Conceptual Model orientation: "XXX will effect YYY by accomplishing ZZZ, which in turn effects
ERIC Educational Resources Information Center
Northwestern Univ., Evanston, IL. Univ. Libraries.
In March 1974, a study was undertaken at Northwestern University to examine the role of the library in providing information services based on computerized data bases. After taking an inventory of existing data bases at Northwestern and in the greater Chicago area, a committee suggested ways to continue and expand the scope of information…
Single Cell Genomics: Approaches and Utility in Immunology
Neu, Karlynn E; Tang, Qingming; Wilson, Patrick C; Khan, Aly A
2017-01-01
Single cell genomics offers powerful tools for studying lymphocytes, which make it possible to observe rare and intermediate cell states that cannot be resolved at the population-level. Advances in computer science and single cell sequencing technology have created a data-driven revolution in immunology. The challenge for immunologists is to harness computing and turn an avalanche of quantitative data into meaningful discovery of immunological principles, predictive models, and strategies for therapeutics. Here, we review the current literature on computational analysis of single cell RNA-seq data and discuss underlying assumptions, methods, and applications in immunology, and highlight important directions for future research. PMID:28094102
NASA Technical Reports Server (NTRS)
Fishbach, L. H.
1980-01-01
The computational techniques utilized at Lewis Research Center to determine the optimum propulsion systems for future aircraft applications and to identify system tradeoffs and technology requirements are described. Cycle performance and engine weight can be calculated, along with costs and installation effects, as opposed to fuel consumption alone. Almost any conceivable turbine engine cycle can be studied. These computer codes are: NNEP, WATE, LIFCYC, INSTAL, and POD DRG. Examples are given to illustrate how these computer techniques can be applied to analyze and optimize propulsion system fuel consumption, weight and cost for representative types of aircraft and missions.
Implementation of glider guns in the light-sensitive Belousov-Zhabotinsky medium.
de Lacy Costello, Ben; Toth, Rita; Stone, Christopher; Adamatzky, Andrew; Bull, Larry
2009-02-01
In cellular automata models a glider gun is an oscillating pattern of nonquiescent states that periodically emits traveling localizations (gliders). The glider streams can be combined to construct functionally complete systems of logical gates and thus realize universal computation. The glider gun is the only means of ensuring the negation operation without additional external input and therefore is an essential component of a collision-based computing circuit. We demonstrate the existence of glider-gun-like structures in both experimental and numerical studies of an excitable chemical system-the light-sensitive Belousov-Zhabotinsky reaction. These discoveries could provide the basis for future designs of collision-based reaction-diffusion computers.
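The glider-gun concept referenced above originates in cellular automata such as Conway's Game of Life, where a glider gun is an oscillating pattern that periodically emits gliders. The sketch below gives only the background idea: the standard Life update rule and a single glider, not a glider gun and not a model of the light-sensitive Belousov-Zhabotinsky system.

# Background sketch: the Game of Life update rule and one glider. A glider gun
# is a larger oscillating pattern that emits such gliders periodically; it is
# not reproduced here. This is unrelated to the chemistry of the BZ reaction.
import numpy as np
from scipy.signal import convolve2d

KERNEL = np.array([[1, 1, 1],
                   [1, 0, 1],
                   [1, 1, 1]])

def step(grid):
    neighbours = convolve2d(grid, KERNEL, mode="same", boundary="wrap")
    return ((neighbours == 3) | ((grid == 1) & (neighbours == 2))).astype(int)

grid = np.zeros((20, 20), dtype=int)
glider = np.array([[0, 1, 0],
                   [0, 0, 1],
                   [1, 1, 1]])
grid[1:4, 1:4] = glider

for _ in range(4):          # after 4 steps the glider has shifted diagonally
    grid = step(grid)
print(grid.sum())           # the 5 live cells persist as they travel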
Current status and future direction of NASA's Space Life Sciences Program
NASA Technical Reports Server (NTRS)
White, Ronald J.; Lujan, Barbara F.
1989-01-01
The elements of the NASA Life Sciences Program that are related to manned space flight and biological scientific studies in space are reviewed. Projects included in the current program are outlined and the future direction of the program is discussed. Consideration is given to issues such as long-duration spaceflight, medical support in space, readaptation to the gravity field of earth, considerations for the Space Station, radiation hazards, environmental standards for space habitation, and human operator interaction with computers, robots, and telepresence systems.
ISCB Ebola Award for Important Future Research on the Computational Biology of Ebola Virus
Karp, Peter D.; Berger, Bonnie; Kovats, Diane; Lengauer, Thomas; Linial, Michal; Sabeti, Pardis; Hide, Winston; Rost, Burkhard
2015-01-01
Speed is of the essence in combating Ebola; thus, computational approaches should form a significant component of Ebola research. As for the development of any modern drug, computational biology is uniquely positioned to contribute through comparative analysis of the genome sequences of Ebola strains as well as 3-D protein modeling. Other computational approaches to Ebola may include large-scale docking studies of Ebola proteins with human proteins and with small-molecule libraries, computational modeling of the spread of the virus, computational mining of the Ebola literature, and creation of a curated Ebola database. Taken together, such computational efforts could significantly accelerate traditional scientific approaches. In recognition of the need for important and immediate solutions from the field of computational biology against Ebola, the International Society for Computational Biology (ISCB) announces a prize for an important computational advance in fighting the Ebola virus. ISCB will confer the ISCB Fight against Ebola Award, along with a prize of US$2,000, at its July 2016 annual meeting (ISCB Intelligent Systems for Molecular Biology (ISMB) 2016, Orlando, Florida). PMID:26097686
ISCB Ebola Award for Important Future Research on the Computational Biology of Ebola Virus.
Karp, Peter D; Berger, Bonnie; Kovats, Diane; Lengauer, Thomas; Linial, Michal; Sabeti, Pardis; Hide, Winston; Rost, Burkhard
2015-01-01
Speed is of the essence in combating Ebola; thus, computational approaches should form a significant component of Ebola research. As for the development of any modern drug, computational biology is uniquely positioned to contribute through comparative analysis of the genome sequences of Ebola strains as well as 3-D protein modeling. Other computational approaches to Ebola may include large-scale docking studies of Ebola proteins with human proteins and with small-molecule libraries, computational modeling of the spread of the virus, computational mining of the Ebola literature, and creation of a curated Ebola database. Taken together, such computational efforts could significantly accelerate traditional scientific approaches. In recognition of the need for important and immediate solutions from the field of computational biology against Ebola, the International Society for Computational Biology (ISCB) announces a prize for an important computational advance in fighting the Ebola virus. ISCB will confer the ISCB Fight against Ebola Award, along with a prize of US$2,000, at its July 2016 annual meeting (ISCB Intelligent Systems for Molecular Biology (ISMB) 2016, Orlando, Florida).
Sheriff, Kelli A; Boon, Richard T
2014-08-01
The purpose of this study was to examine the effects of computer-based graphic organizers, using Kidspiration 3© software, to solve one-step word problems. Participants included three students with mild intellectual disability enrolled in a functional academic skills curriculum in a self-contained classroom. A multiple probe single-subject research design (Horner & Baer, 1978) was used to evaluate the effectiveness of computer-based graphic organizers to solving mathematical one-step word problems. During the baseline phase, the students completed a teacher-generated worksheet that consisted of nine functional word problems in a traditional format using a pencil, paper, and a calculator. In the intervention and maintenance phases, the students were instructed to complete the word problems using a computer-based graphic organizer. Results indicated that all three of the students improved in their ability to solve the one-step word problems using computer-based graphic organizers compared to traditional instructional practices. Limitations of the study and recommendations for future research directions are discussed. Copyright © 2014 Elsevier Ltd. All rights reserved.
Squeezed Dirac and topological magnons in a bosonic honeycomb optical lattice
NASA Astrophysics Data System (ADS)
Owerre, S. A.; Nsofini, J.
2017-11-01
Quantum information storage using charge-neutral quasiparticles is expected to play a crucial role in the future of quantum computers. In this regard, magnons or collective spin-wave excitations in solid-state materials are promising candidates in the future of quantum computing. Here, we study the quantum squeezing of Dirac and topological magnons in a bosonic honeycomb optical lattice with spin-orbit interaction by utilizing the mapping to quantum spin-1/2 XYZ Heisenberg model on the honeycomb lattice with discrete Z2 symmetry and a Dzyaloshinskii-Moriya interaction. We show that the squeezed magnons can be controlled by the Z2 anisotropy and demonstrate how the noise in the system is periodically modified in the ferromagnetic and antiferromagnetic phases of the model. Our results also apply to solid-state honeycomb (anti)ferromagnetic insulators.
Squeezed Dirac and Topological Magnons in a Bosonic Honeycomb Optical Lattice.
Owerre, Solomon; Nsofini, Joachim
2017-09-20
Quantum information storage using charge-neutral quasiparticles is expected to play a crucial role in the future of quantum computers. In this regard, magnons or collective spin-wave excitations in solid-state materials are promising candidates in the future of quantum computing. Here, we study the quantum squeezing of Dirac and topological magnons in a bosonic honeycomb optical lattice with spin-orbit interaction by utilizing the mapping to quantum spin-1/2 XYZ Heisenberg model on the honeycomb lattice with discrete Z2 symmetry and a Dzyaloshinskii-Moriya interaction. We show that the squeezed magnons can be controlled by the Z2 anisotropy and demonstrate how the noise in the system is periodically modified in the ferromagnetic and antiferromagnetic phases of the model. Our results also apply to solid-state honeycomb (anti)ferromagnetic insulators. © 2017 IOP Publishing Ltd.
Squeezed Dirac and topological magnons in a bosonic honeycomb optical lattice.
Owerre, S A; Nsofini, J
2017-10-19
Quantum information storage using charge-neutral quasiparticles is expected to play a crucial role in the future of quantum computers. In this regard, magnons or collective spin-wave excitations in solid-state materials are promising candidates in the future of quantum computing. Here, we study the quantum squeezing of Dirac and topological magnons in a bosonic honeycomb optical lattice with spin-orbit interaction by utilizing the mapping to quantum spin-1/2 XYZ Heisenberg model on the honeycomb lattice with discrete Z2 symmetry and a Dzyaloshinskii-Moriya interaction. We show that the squeezed magnons can be controlled by the Z2 anisotropy and demonstrate how the noise in the system is periodically modified in the ferromagnetic and antiferromagnetic phases of the model. Our results also apply to solid-state honeycomb (anti)ferromagnetic insulators.
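For reference, a generic textbook form of the spin-1/2 XYZ Heisenberg model with a Dzyaloshinskii-Moriya (DM) term, as referenced in the three records above, is written below. The sign conventions, anisotropy parametrization, and the choice of which bonds carry the DM vector are assumptions of this sketch, not taken from the papers.

% Generic form of the quantum spin-1/2 XYZ Heisenberg Hamiltonian with a
% Dzyaloshinskii-Moriya term; conventions here are illustrative only.
\begin{equation}
H \;=\; \sum_{\langle ij\rangle}
      \left( J_x S_i^x S_j^x + J_y S_i^y S_j^y + J_z S_i^z S_j^z \right)
  \;+\; \sum_{\langle\langle ij\rangle\rangle}
      \mathbf{D}_{ij}\cdot\left(\mathbf{S}_i\times\mathbf{S}_j\right).
\end{equation}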
Can An Evolutionary Process Create English Text?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bailey, David H.
Critics of the conventional theory of biological evolution have asserted that while natural processes might result in some limited diversity, nothing fundamentally new can arise from 'random' evolution. In response, biologists such as Richard Dawkins have demonstrated that a computer program can generate a specific short phrase via evolution-like iterations starting with random gibberish. While such demonstrations are intriguing, they are flawed in that they have a fixed, pre-specified future target, whereas in real biological evolution there is no fixed future target, but only a complicated 'fitness landscape'. In this study, a significantly more sophisticated evolutionary scheme is employed to produce text segments reminiscent of a Charles Dickens novel. The aggregate size of these segments is larger than the computer program and the input Dickens text, even when comparing compressed data (as a measure of information content).
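The fixed-target demonstration that the abstract critiques (Dawkins' "weasel" style program) is easy to reproduce; the sketch below shows that baseline idea only. The study's own scheme deliberately has no fixed target, and the target phrase, mutation rate, and population size here are arbitrary choices for illustration.

# Minimal fixed-target "weasel"-style demonstration: random mutation plus
# selection converges on a pre-specified phrase. This illustrates the baseline
# the abstract criticizes, not the study's target-free evolutionary scheme.
import random
import string

TARGET = "IT WAS THE BEST OF TIMES"
ALPHABET = string.ascii_uppercase + " "

def fitness(candidate):
    return sum(a == b for a, b in zip(candidate, TARGET))

def mutate(candidate, rate=0.05):
    return "".join(random.choice(ALPHABET) if random.random() < rate else c
                   for c in candidate)

parent = "".join(random.choice(ALPHABET) for _ in TARGET)
generation = 0
while parent != TARGET:
    children = [mutate(parent) for _ in range(100)]
    parent = max(children + [parent], key=fitness)   # keep the fittest
    generation += 1
print(generation, parent)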
Computer problem-solving coaches for introductory physics: Design and usability studies
NASA Astrophysics Data System (ADS)
Ryan, Qing X.; Frodermann, Evan; Heller, Kenneth; Hsu, Leonardo; Mason, Andrew
2016-06-01
The combination of modern computing power, the interactivity of web applications, and the flexibility of object-oriented programming may finally be sufficient to create computer coaches that can help students develop metacognitive problem-solving skills, an important competence in our rapidly changing technological society. However, no matter how effective such coaches might be, they will only be useful if they are attractive to students. We describe the design and testing of a set of web-based computer programs that act as personal coaches to students while they practice solving problems from introductory physics. The coaches are designed to supplement regular human instruction, giving students access to effective forms of practice outside class. We present results from large-scale usability tests of the computer coaches and discuss their implications for future versions of the coaches.
Facial Animations: Future Research Directions & Challenges
NASA Astrophysics Data System (ADS)
Alkawaz, Mohammed Hazim; Mohamad, Dzulkifli; Rehman, Amjad; Basori, Ahmad Hoirul
2014-06-01
Computer facial animation is now used in a multitude of fields, bringing human and social studies together with the growth of computer games, films, and interactive multimedia. Authoring complex and subtle facial expressions is challenging and fraught with problems; as a result, most facial animation currently authored with general-purpose computer animation techniques is limited in production quality and quantity. With growing computer power, better understanding of faces, increasing software sophistication, and newly emerging face-centric methods that are still immature, this paper defines and categorizes the views of surveyed facial animation experts on the recent state of the field, observed bottlenecks, and developing techniques. The paper further presents a real-time simulation model of human worry and howling, with detailed discussion of the perception of astonishment, sorrow, annoyance, and panic.
Future computing platforms for science in a power constrained era
Abdurachmanov, David; Elmer, Peter; Eulisse, Giulio; ...
2015-12-23
Power consumption will be a key constraint on the future growth of Distributed High Throughput Computing (DHTC) as used by High Energy Physics (HEP). This makes performance-per-watt a crucial metric for selecting cost-efficient computing solutions. For this paper, we have done a wide survey of current and emerging architectures becoming available on the market including x86-64 variants, ARMv7 32-bit, ARMv8 64-bit, Many-Core and GPU solutions, as well as newer System-on-Chip (SoC) solutions. We compare performance and energy efficiency using an evolving set of standardized HEP-related benchmarks and power measurement techniques we have been developing. In conclusion, we evaluate the potential for use of such computing solutions in the context of DHTC systems, such as the Worldwide LHC Computing Grid (WLCG).
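The performance-per-watt metric discussed above is simple arithmetic once a benchmark throughput and an average power draw are available. The sketch below shows that calculation only; the platform names and numbers are invented and are not results from the paper, whose actual HEP benchmarks and power instrumentation are described in the text.

# Sketch of the performance-per-watt comparison: combine benchmark throughput
# (events processed per second) with the average power drawn while it ran.
# All numbers below are invented placeholders.
def perf_per_watt(events_processed, elapsed_s, power_samples_w):
    throughput = events_processed / elapsed_s                 # events / s
    avg_power = sum(power_samples_w) / len(power_samples_w)   # watts
    return throughput / avg_power                             # events / joule

platforms = {
    "x86-64 server": perf_per_watt(120_000, 600, [180, 195, 190]),
    "ARMv8 SoC":     perf_per_watt(25_000, 600, [28, 30, 29]),
}
for name, score in platforms.items():
    print(f"{name}: {score:.2f} events per joule")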
Future Directions in Computer Graphics and Visualization: From CG&A's Editorial Board
DOE Office of Scientific and Technical Information (OSTI.GOV)
Encarnacao, L. M.; Chuang, Yung-Yu; Stork, Andre
2015-01-01
With many new members joining the CG&A editorial board over the past year, and with a renewed commitment to not only document the state of the art in computer graphics research and applications but to anticipate and where possible foster future areas of scientific discourse and industrial practice, we asked editorial and advisory council members about where they see their fields of expertise going. The answers compiled here aren’t meant to be all encompassing or deterministic when it comes to the opportunities computer graphics and interactive visualization hold for the future. Instead, we aim to accomplish two things: give a more in-depth introduction of members of the editorial board to the CG&A readership and encourage cross-disciplinary discourse toward approaching, complementing, or disputing the visions laid out in this compilation.
Workshops of the Fifth International Brain-Computer Interface Meeting: Defining the Future
Huggins, Jane E.; Guger, Christoph; Allison, Brendan; Anderson, Charles W.; Batista, Aaron; Brouwer, Anne-Marie (A.-M.); Brunner, Clemens; Chavarriaga, Ricardo; Fried-Oken, Melanie; Gunduz, Aysegul; Gupta, Disha; Kübler, Andrea; Leeb, Robert; Lotte, Fabien; Miller, Lee E.; Müller-Putz, Gernot; Rutkowski, Tomasz; Tangermann, Michael; Thompson, David Edward
2014-01-01
The Fifth International Brain-Computer Interface (BCI) Meeting was held June 3-7, 2013 at the Asilomar Conference Grounds, Pacific Grove, California. The conference included 19 workshops covering topics in brain-computer interface and brain-machine interface research. Topics included translation of BCIs into clinical use, standardization and certification, types of brain activity to use for BCI, recording methods, the effects of plasticity, special interest topics in BCI applications, and future BCI directions. BCI research is well established and transitioning to practical use to benefit people with physical impairments. At the same time, new applications are being explored, both for people with physical impairments and beyond. Here we provide summaries of each workshop, illustrating the breadth and depth of BCI research and highlighting important issues for future research and development. PMID:25485284
Explore the Future: Will Books Have a Place in the Computer Classroom?
ERIC Educational Resources Information Center
Jobe, Ronald A.
The question of the place of books in a classroom using computers appears to be simple, yet it is one of vital concern to teachers. The availability of programs (few of which focus on literary appreciation), the mesmerizing qualities of the computer, its distortion of time, the increasing power of computers over teacher time, and the computer's…
INCREASING SAVING BEHAVIOR THROUGH AGE-PROGRESSED RENDERINGS OF THE FUTURE SELF
HERSHFIELD, HAL E.; GOLDSTEIN, DANIEL G.; SHARPE, WILLIAM F.; FOX, JESSE; YEYKELIS, LEO; CARSTENSEN, LAURA L.; BAILENSON, JEREMY N.
2014-01-01
Many people fail to save what they need to for retirement (Munnell, Webb, and Golub-Sass 2009). Research on excessive discounting of the future suggests that removing the lure of immediate rewards by pre-committing to decisions, or elaborating the value of future rewards can both make decisions more future-oriented. In this article, we explore a third and complementary route, one that deals not with present and future rewards, but with present and future selves. In line with thinkers who have suggested that people may fail, through a lack of belief or imagination, to identify with their future selves (Parfit 1971; Schelling 1984), we propose that allowing people to interact with age-progressed renderings of themselves will cause them to allocate more resources toward the future. In four studies, participants interacted with realistic computer renderings of their future selves using immersive virtual reality hardware and interactive decision aids. In all cases, those who interacted with virtual future selves exhibited an increased tendency to accept later monetary rewards over immediate ones. PMID:24634544
NASA Astrophysics Data System (ADS)
Cogoni, Marco; Busonera, Giovanni; Anedda, Paolo; Zanetti, Gianluigi
2015-01-01
We generalize previous studies on critical phenomena in communication networks [1,2] by adding computational capabilities to the nodes. In our model, a set of tasks with random origin, destination and computational structure is distributed on a computational network, modeled as a graph. By varying the temperature of a Metropolis Monte Carlo optimization, we explore the global latency as the resource assignment at a given time instant ranges from optimal to suboptimal. By computing the two-point correlation function for the local overload, we study the behavior of the correlation distance (both for links and nodes) while approaching the congested phase: a transition from peaked to spread g(r) is seen above a critical (Monte Carlo) temperature Tc. The average latency trend of the system is predicted by averaging over several network traffic realizations while maintaining spatially detailed information for each node: a sharp decrease of performance is found above Tc independently of the workload. The globally optimized computational resource allocation and network routing define a baseline for a future comparison of the transition behavior with respect to existing routing strategies [3,4] for different network topologies.
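The Metropolis step described above (propose a change to the task-to-node assignment, accept it with probability min(1, exp(-ΔE/T))) can be illustrated on a toy ring network. In the sketch below, the cost function (a congestion penalty plus hop distance from each task's origin) is a stand-in assumption, not the authors' latency model:

```python
import math, random

N_NODES, N_TASKS, CAPACITY = 20, 60, 4
random.seed(1)
origin = [random.randrange(N_NODES) for _ in range(N_TASKS)]   # node where each task enters
assign = [random.randrange(N_NODES) for _ in range(N_TASKS)]   # node that runs each task

def ring_dist(a, b):
    d = abs(a - b)
    return min(d, N_NODES - d)

def energy(assign):
    load = [0] * N_NODES
    for node in assign:
        load[node] += 1
    overload = sum(max(0, l - CAPACITY) ** 2 for l in load)     # congestion penalty
    comm = sum(ring_dist(origin[i], assign[i]) for i in range(N_TASKS))
    return 5.0 * overload + comm

def metropolis(assign, T, sweeps=2000):
    e = energy(assign)
    for _ in range(sweeps):
        i, new_node = random.randrange(N_TASKS), random.randrange(N_NODES)
        old_node = assign[i]
        assign[i] = new_node
        e_new = energy(assign)
        if e_new <= e or random.random() < math.exp(-(e_new - e) / T):
            e = e_new                                           # accept the move
        else:
            assign[i] = old_node                                # reject, roll back
    return e

for T in (5.0, 1.0, 0.2):
    print(f"T = {T}: final energy {metropolis(list(assign), T)}")
```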
The Southern Forest Futures Project: summary report
David N. Wear; John G. Greis
2012-01-01
The Southern Forest Futures Project provides a science-based 'futuring' analysis of the forests of the 13 States of the Southeastern United States. With findings organized in a set of scenarios and using a combination of computer models and science synthesis, the authors of the Southern Forest Futures Project examine a variety of possible futures that could shape...
Program on application of communications satellites to educational development
NASA Technical Reports Server (NTRS)
Morgan, R. P.; Singh, J. P.
1971-01-01
Interdisciplinary research in needs analysis, communications technology studies, and systems synthesis is reported. Existing and planned educational telecommunications services are studied and library utilization of telecommunications is described. Preliminary estimates are presented of ranges of utilization of educational telecommunications services for 1975 and 1985; instructional and public television, computer-aided instruction, computing resources, and information resource sharing for various educational levels and purposes. Communications technology studies include transmission schemes for still-picture television, use of Gunn effect devices, and TV receiver front ends for direct satellite reception at 12 GHz. Two major studies in the systems synthesis project concern (1) organizational and administrative aspects of a large-scale instructional satellite system to be used with schools and (2) an analysis of future development of instructional television, with emphasis on the use of video tape recorders and cable television. A communications satellite system synthesis program developed for NASA is now operational on the university IBM 360-50 computer.
Computational Modeling of Cobalt-Based Water Oxidation: Current Status and Future Challenges
Schilling, Mauro; Luber, Sandra
2018-01-01
A great deal of effort is currently devoted to the development of novel water oxidation catalysts. In this context, mechanistic studies are crucial in order to elucidate the reaction mechanisms governing this complex process and to derive new design paradigms and strategies for improving the stability and efficiency of those catalysts. This review is focused on recent theoretical mechanistic studies in the field of homogeneous cobalt-based water oxidation catalysts. In the first part, computational methodologies and protocols are summarized and evaluated on the basis of their applicability toward real catalytic or smaller model systems, whereby special emphasis is laid on the choice of an appropriate model system. In the second part, an overview of mechanistic studies is presented, from which conceptual guidelines are drawn on how to approach novel studies of catalysts and how to further develop the field of computational modeling of water oxidation reactions. PMID:29721491
Computational Modeling of Cobalt-based Water Oxidation: Current Status and Future Challenges
NASA Astrophysics Data System (ADS)
Schilling, Mauro; Luber, Sandra
2018-04-01
A great deal of effort is currently devoted to the development of novel water oxidation catalysts. In this context, mechanistic studies are crucial in order to elucidate the reaction mechanisms governing this complex process and to derive new design paradigms and strategies for improving the stability and efficiency of those catalysts. This review is focused on recent theoretical mechanistic studies in the field of homogeneous cobalt-based water oxidation catalysts. In the first part, computational methodologies and protocols are summarized and evaluated on the basis of their applicability towards real catalytic or smaller model systems, whereby special emphasis is laid on the choice of an appropriate model system. In the second part, an overview of mechanistic studies is presented, from which conceptual guidelines are drawn on how to approach novel studies of catalysts and how to further develop the field of computational modeling of water oxidation reactions.
ePatient Conference Explores Future of Personalized Medicine | NIH MedlinePlus the Magazine
Spring-Summer 2010 issue. ... better in the digital future? What is personalized medicine? Some of the nation's top health researchers, computer ...
Is There Computer Graphics after Multimedia?
ERIC Educational Resources Information Center
Booth, Kellogg S.
Computer graphics has been driven by the desire to generate real-time imagery subject to constraints imposed by the human visual system. The future of computer graphics, when off-the-shelf systems have full multimedia capability and when standard computing engines render imagery faster than real-time, remains to be seen. A dedicated pipeline for…
Computer Skills Acquisition: A Review and Future Directions for Research.
ERIC Educational Resources Information Center
Gattiker, Urs E.
A review of past research on training employees for computer-mediated work leads to the development of theory and propositions concerning the relationship between different variables, such as: (1) individual factors; (2) task and person-computer interface; (3) characteristics of training design for the acquisition of computer skills; and (4) the…
Future Computer Requirements for Computational Aerodynamics
NASA Technical Reports Server (NTRS)
1978-01-01
Recent advances in computational aerodynamics are discussed as well as motivations for and potential benefits of a National Aerodynamic Simulation Facility having the capability to solve fluid dynamic equations at speeds two to three orders of magnitude faster than presently possible with general computers. Two contracted efforts to define processor architectures for such a facility are summarized.
Towards future high performance computing: What will change? How can we be efficient?
NASA Astrophysics Data System (ADS)
Düben, Peter
2017-04-01
How can we make the most of "exascale" supercomputers that will be available soon and will perform an astonishing 1,000,000,000,000,000,000 (10^18) floating-point operations per second? How do we need to design applications to use these machines efficiently? What are the limits? We will discuss the opportunities and limits of future high performance computers from the perspective of Earth System Modelling. We will provide an overview of future challenges and outline how numerical applications will need to change to run efficiently on future supercomputers. We will also discuss how different disciplines can support each other, and address data handling and the numerical precision of data.
On Target Localization Using Combined RSS and AoA Measurements
Beko, Marko; Dinis, Rui
2018-01-01
This work revisits existing solutions to the problem of target localization in wireless sensor networks (WSNs) using integrated measurements, namely received signal strength (RSS) and angle of arrival (AoA). RSS/AoA-based target localization has recently become very popular in the research community, owing to its great applicability potential and relatively low implementation cost. A comprehensive study and detailed analysis of the state-of-the-art (SoA) solutions is therefore presented here. The work begins with the SoA approaches based on convex relaxation techniques (in general more computationally complex) and then covers less computationally complex approaches, such as those based on the generalized trust region sub-problem framework and linear least squares. A detailed analysis of the computational complexity of each solution is also reviewed, together with an extensive set of simulation results. Finally, the main conclusions are summarized, and a set of trends that might be interesting for future research in this area is identified. PMID:29671832
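One least-squares ingredient surveyed here can be illustrated with bearing-only (AoA) triangulation: each anchor's measured bearing defines a line through the target, and stacking those line equations gives an ordinary least-squares problem. The sketch below covers only that single ingredient, with made-up anchor positions and noise levels, and is not one of the combined RSS/AoA estimators reviewed in the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
anchors = np.array([[0.0, 0.0], [50.0, 0.0], [50.0, 50.0], [0.0, 50.0]])  # known positions (m)
target = np.array([32.0, 18.0])

# Noisy azimuth measurements from each anchor towards the target (2 deg std)
phi = np.arctan2(target[1] - anchors[:, 1], target[0] - anchors[:, 0])
phi += np.deg2rad(2.0) * rng.standard_normal(phi.shape)

# Each bearing gives the line constraint: sin(phi)*(x - xa) - cos(phi)*(y - ya) = 0
A = np.column_stack([np.sin(phi), -np.cos(phi)])
b = np.sin(phi) * anchors[:, 0] - np.cos(phi) * anchors[:, 1]
estimate, *_ = np.linalg.lstsq(A, b, rcond=None)
print("true:", target, "estimate:", np.round(estimate, 2))
```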
NASA Technical Reports Server (NTRS)
Fischer, James R.; Grosch, Chester; Mcanulty, Michael; Odonnell, John; Storey, Owen
1987-01-01
NASA's Office of Space Science and Applications (OSSA) gave a select group of scientists the opportunity to test and implement their computational algorithms on the Massively Parallel Processor (MPP) located at Goddard Space Flight Center, beginning in late 1985. One year later, the Working Group presented its report, which addressed the following: algorithms, programming languages, architecture, programming environments, the relation of theory to practice, and measured performance. The findings point to a number of demonstrated computational techniques for which the MPP architecture is ideally suited. For example, besides executing much faster on the MPP than on conventional computers, systolic VLSI simulation (where distances are short), lattice simulation, neural network simulation, and image problems were found to be easier to program on the MPP's architecture than on a CYBER 205 or even a VAX. The report also makes technical recommendations covering all aspects of MPP use, and recommendations concerning the future of the MPP and machines based on similar architectures, expansion of the Working Group, and study of the role of future parallel processors for space station, EOS, and the Great Observatories era.
Current concepts and future perspectives in computer-assisted navigated total knee replacement.
Matsumoto, Tomoyuki; Nakano, Naoki; Lawrence, John E; Khanduja, Vikas
2018-05-12
Total knee replacements (TKR) aim to restore stability of the tibiofemoral and patella-femoral joints and provide relief of pain and improved quality of life for the patient. In recent years, computer-assisted navigation systems have been developed with the aim of reducing human error in joint alignment and improving patient outcomes. We examined the current body of evidence surrounding the use of navigation systems and discussed their current and future role in TKR. The current body of evidence shows that the use of computer navigation systems for TKR significantly reduces outliers in the mechanical axis and coronal prosthetic position. Also, navigation systems offer an objective assessment of soft tissue balancing that had previously not been available. Although these benefits represent a technical superiority to conventional TKR techniques, there is limited evidence to show long-term clinical benefit with the use of navigation systems, with only a small number of studies showing improvement in outcome scores at short-term follow-up. Because of the increased costs and operative time associated with their use as well as the emergence of more affordable and patient-specific technologies, it is unlikely for navigation systems to become more widely used in the near future. Whilst this technology helps surgeons to achieve improved component positioning, it is important to consider the clinical and functional implications, as well as the added costs and potential learning curve associated with adopting new technology.
Kahler, Christopher W; Lechner, William J; MacGlashan, James; Wray, Tyler B; Littman, Michael L
2017-06-28
Computer-delivered interventions have been shown to be effective in reducing alcohol consumption in heavy drinking college students. However, these computer-delivered interventions rely on mouse, keyboard, or touchscreen responses for interactions between the users and the computer-delivered intervention. The principles of motivational interviewing suggest that in-person interventions may be effective, in part, because they encourage individuals to think through and speak aloud their motivations for changing a health behavior, which current computer-delivered interventions do not allow. The objective of this study was to take the initial steps toward development of a voice-based computer-delivered intervention that can ask open-ended questions and respond appropriately to users' verbal responses, more closely mirroring a human-delivered motivational intervention. We developed (1) a voice-based computer-delivered intervention that was run by a human controller and that allowed participants to speak their responses to scripted prompts delivered by speech generation software and (2) a text-based computer-delivered intervention that relied on the mouse, keyboard, and computer screen for all interactions. We randomized 60 heavy drinking college students to interact with the voice-based computer-delivered intervention and 30 to interact with the text-based computer-delivered intervention and compared their ratings of the systems as well as their motivation to change drinking and their drinking behavior at 1-month follow-up. Participants reported that the voice-based computer-delivered intervention engaged positively with them in the session and delivered content in a manner consistent with motivational interviewing principles. At 1-month follow-up, participants in the voice-based computer-delivered intervention condition reported significant decreases in quantity, frequency, and problems associated with drinking, and increased perceived importance of changing drinking behaviors. In comparison to the text-based computer-delivered intervention condition, those assigned to voice-based computer-delivered intervention reported significantly fewer alcohol-related problems at the 1-month follow-up (incident rate ratio 0.60, 95% CI 0.44-0.83, P=.002). The conditions did not differ significantly on perceived importance of changing drinking or on measures of drinking quantity and frequency of heavy drinking. Results indicate that it is feasible to construct a series of open-ended questions and a bank of responses and follow-up prompts that can be used in a future fully automated voice-based computer-delivered intervention that may mirror more closely human-delivered motivational interventions to reduce drinking. Such efforts will require using advanced speech recognition capabilities and machine-learning approaches to train a program to mirror the decisions made by human controllers in the voice-based computer-delivered intervention used in this study. In addition, future studies should examine enhancements that can increase the perceived warmth and empathy of voice-based computer-delivered intervention, possibly through greater personalization, improvements in the speech generation software, and embodying the computer-delivered intervention in a physical form. ©Christopher W Kahler, William J Lechner, James MacGlashan, Tyler B Wray, Michael L Littman. Originally published in JMIR Mental Health (http://mental.jmir.org), 28.06.2017.
Korpinen, Leena; Pääkkönen, Rauno; Gobba, Fabriziomaria
2013-01-01
The purpose of this study was to investigate the possible relation between self-reported neck symptoms (aches, pain or numbness) and use of computers/cell phones. The study was carried out as a cross-sectional study by posting a questionnaire to 15,000 working-age persons; of the 6121 respondents, 15.1% reported that they very often experienced physical symptoms in the neck. The results showed that they also had many other symptoms very often, and 49% used a computer daily at work and 83.9% used cell phones. We compared the physical/mental symptoms of persons who had neck symptoms quite often or more often with those of the other respondents, and found significant differences in physical/mental symptoms and in the use of cell phones and computers. The results suggest that, in the future, it should be taken into account that such neck symptoms can be associated with the use of cell phones or computers. In summary, we investigated the possible relation between neck symptoms and the use of computers/cell phones and found that persons who very often had symptoms in the neck also had other symptoms very often (e.g. exhaustion at work); their use of information and communication technology (e.g. computers) may be associated with these symptoms.
A socioeconomic related 'digital divide' exists in how, not if, young people use computers.
Harris, Courtenay; Straker, Leon; Pollock, Clare
2017-01-01
Government initiatives have tried to ensure uniform computer access for young people; however a divide related to socioeconomic status (SES) may still exist in the nature of information technology (IT) use. This study aimed to investigate this relationship in 1,351 Western Australian children between 6 and 17 years of age. All participants had computer access at school and 98.9% at home. Neighbourhood SES was related to computer use, IT activities, playing musical instruments, and participating in vigorous physical activity. Participants from higher SES neighbourhoods were more exposed to school computers, reading, playing musical instruments, and vigorous physical activity. Participants from lower SES neighbourhoods were more exposed to TV, electronic games, mobile phones, and non-academic computer activities at home. These patterns may impact future economic, academic, and health outcomes. Better insight into neighbourhood SES influences will assist in understanding and managing the impact of computer use on young people's health and development.
A socioeconomic related 'digital divide' exists in how, not if, young people use computers
2017-01-01
Government initiatives have tried to ensure uniform computer access for young people; however a divide related to socioeconomic status (SES) may still exist in the nature of information technology (IT) use. This study aimed to investigate this relationship in 1,351 Western Australian children between 6 and 17 years of age. All participants had computer access at school and 98.9% at home. Neighbourhood SES was related to computer use, IT activities, playing musical instruments, and participating in vigorous physical activity. Participants from higher SES neighbourhoods were more exposed to school computers, reading, playing musical instruments, and vigorous physical activity. Participants from lower SES neighbourhoods were more exposed to TV, electronic games, mobile phones, and non-academic computer activities at home. These patterns may impact future economic, academic, and health outcomes. Better insight into neighbourhood SES influences will assist in understanding and managing the impact of computer use on young people’s health and development. PMID:28362868
ERIC Educational Resources Information Center
Rensselaer Research Corp., Troy, NY.
The purpose of this study was to develop the schema and methodology for the construction of a computerized mathematical model designed to project college and university enrollments in New York State and to meet the future increased demands of higher education planners. This preliminary report describes the main structure of the proposed computer…
An Exploratory Study of a Measure of Vocational Identity for Spanish-Speaking Persons
ERIC Educational Resources Information Center
Tosado, Luis Antonio, II
2012-01-01
Two overlapping issues have given rise to this study: the need for assessment instruments to use with Spanish-speaking Latinos and the need for normative data on current and future Spanish-language instruments. Numerous career assessment instruments exist for the English-speaking population. These instruments may be administered on computer-based…
The Semiconductor Industry and Emerging Technologies: A Study Using a Modified Delphi Method
ERIC Educational Resources Information Center
Jordan, Edgar A.
2010-01-01
The purpose of this qualitative descriptive study was to determine what leaders in the semiconductor industry thought the future of computing would look like and what emerging materials showed the most promise to overcome the current theoretical limit of 10 nanometers for silicon dioxide. The researcher used a modified Delphi technique in two…
NASA Technical Reports Server (NTRS)
Eckhardt, Dave E., Jr.; Jipping, Michael J.; Wild, Chris J.; Zeil, Steven J.; Roberts, Cathy C.
1993-01-01
A study of computer engineering tool integration using the Portable Common Tool Environment (PCTE) Public Interface Standard is presented. Over a 10-week time frame, three existing software products were encapsulated to work in the Emeraude environment, an implementation of the PCTE version 1.5 standard. The software products used were a computer-aided software engineering (CASE) design tool, a software reuse tool, and a computer architecture design and analysis tool. The tool set was then demonstrated to work in a coordinated design process in the Emeraude environment. The project and the features of PCTE used are described, experience with the use of Emeraude environment over the project time frame is summarized, and several related areas for future research are summarized.
NASA Astrophysics Data System (ADS)
Hoeke, R. K.; Reyns, J.; O'Grady, J.; Becker, J. M.; Merrifield, M. A.; Roelvink, J. A.
2016-02-01
Oceanic islands are widely perceived as vulnerable to sea level rise and are characterized by steep nearshore topography and fringing reefs. In such settings, near shore dynamics and (non-tidal) water level variability tends to be dominated by wind-wave processes. These processes are highly sensitive to reef morphology and roughness and to regional wave climate. Thus sea level extremes tend to be highly localized and their likelihood can be expected to change in the future (beyond simple extrapolation of sea level rise scenarios): e.g. sea level rise may increase the effective mean depth of reef crests and flats and ocean acidification and/or increased temperatures may lead to changes in reef structure. The problem is sufficiently complex that analytic or numerical approaches are necessary to estimate current hazards and explore potential future changes. In this study, we evaluate the capacity of several analytic/empirical approaches and phase-averaged and phase-resolved numerical models at sites in the insular tropical Pacific. We consider their ability to predict time-averaged wave setup and instantaneous water level exceedance probability (or dynamic wave run-up) as well as computational cost; where possible, we compare the model results with in situ observations from a number of previous studies. Preliminary results indicate analytic approaches are by far the most computationally efficient, but tend to perform poorly when alongshore straight and parallel morphology cannot be assumed. Phase-averaged models tend to perform well with respect to wave setup in such situations, but are unable to predict processes related to individual waves or wave groups, such as infragravity motions or wave run-up. Phase-resolved models tend to perform best, but come at high computational cost, an important consideration when exploring possible future scenarios. A new approach of combining an unstructured computational grid with a quasi-phase averaged approach (i.e. only phase resolving motions below a frequency cutoff) shows promise as a good compromise between computational efficiency and resolving processes such as wave runup and overtopping in more complex bathymetric situations.
Future Directions: Advances and Implications of Virtual Environments Designed for Pain Management
Soomro, Ahmad; Riva, Giuseppe; Wiederhold, Mark D.
2014-01-01
Abstract Pain symptoms have been addressed with a variety of therapeutic measures in the past, but as we look to the future, we begin encountering new options for patient care and individual health and well-being. Recent studies indicate that computer-generated graphic environments—virtual reality (VR)—can offer effective cognitive distractions for individuals suffering from pain arising from a variety of physical and psychological illnesses. Studies also indicate the effectiveness of VR for both chronic and acute pain conditions. Future possibilities for VR to address pain-related concerns include such diverse groups as military personnel, space exploration teams, the general labor force, and our ever increasing elderly population. VR also shows promise to help in such areas as drug abuse, at-home treatments, and athletic injuries. PMID:24892206
Future directions: advances and implications of virtual environments designed for pain management.
Wiederhold, Brenda K; Soomro, Ahmad; Riva, Giuseppe; Wiederhold, Mark D
2014-06-01
Pain symptoms have been addressed with a variety of therapeutic measures in the past, but as we look to the future, we begin encountering new options for patient care and individual health and well-being. Recent studies indicate that computer-generated graphic environments--virtual reality (VR)--can offer effective cognitive distractions for individuals suffering from pain arising from a variety of physical and psychological illnesses. Studies also indicate the effectiveness of VR for both chronic and acute pain conditions. Future possibilities for VR to address pain-related concerns include such diverse groups as military personnel, space exploration teams, the general labor force, and our ever increasing elderly population. VR also shows promise to help in such areas as drug abuse, at-home treatments, and athletic injuries.
Brain-computer interfaces in the continuum of consciousness.
Kübler, Andrea; Kotchoubey, Boris
2007-12-01
To summarize recent developments and look at important future aspects of brain-computer interfaces. Recent brain-computer interface studies are largely targeted at helping severely or even completely paralysed patients. The former are only able to communicate yes or no via a single muscle twitch, and the latter are totally nonresponsive. Such patients can control brain-computer interfaces and use them to select letters, words or items on a computer screen, for neuroprosthesis control or for surfing the Internet. This condition of motor paralysis, in which cognition and consciousness appear to be unaffected, is traditionally opposed to nonresponsiveness due to disorders of consciousness. Although these groups of patients may appear to be very alike, numerous transition states between them are demonstrated by recent studies. All nonresponsive patients can be regarded on a continuum of consciousness which may vary even within short time periods. As overt behaviour is lacking, cognitive functions in such patients can only be investigated using neurophysiological methods. We suggest that brain-computer interfaces may provide a new tool to investigate cognition in disorders of consciousness, and propose a hierarchical procedure entailing passive stimulation, active instructions, volitional paradigms, and brain-computer interface operation.
NASA Astrophysics Data System (ADS)
Tian, Xin; Negenborn, Rudy R.; van Overloop, Peter-Jules; María Maestre, José; Sadowska, Anna; van de Giesen, Nick
2017-11-01
Model Predictive Control (MPC) is one of the most advanced real-time control techniques that has been widely applied to Water Resources Management (WRM). MPC can manage the water system in a holistic manner and has a flexible structure to incorporate specific elements, such as setpoints and constraints. Therefore, MPC has shown its versatile performance in many branches of WRM. Nonetheless, with the in-depth understanding of stochastic hydrology in recent studies, MPC also faces the challenge of how to cope with hydrological uncertainty in its decision-making process. A possible way to embed the uncertainty is to generate an Ensemble Forecast (EF) of hydrological variables, rather than a deterministic one. The combination of MPC and EF results in a more comprehensive approach: Multi-scenario MPC (MS-MPC). In this study, we will first assess the model performance of MS-MPC, considering an ensemble streamflow forecast. Notably, computational inefficiency may be a critical obstacle that hinders the applicability of MS-MPC. In fact, with more scenarios taken into account, the computational burden of solving an optimization problem in MS-MPC accordingly increases. To deal with this challenge, we propose the Adaptive Control Resolution (ACR) approach as a computationally efficient scheme to practically reduce the number of control variables in MS-MPC. In brief, the ACR approach uses a mixed-resolution control time step from the near future to the distant future. The ACR-MPC approach is tested on a real-world case study: an integrated flood control and navigation problem in the North Sea Canal of the Netherlands. Such an approach reduces the computation time by 18% or more in our case study. At the same time, the model performance of ACR-MPC remains close to that of conventional MPC.
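The Adaptive Control Resolution idea, one decision variable per step in the near term but one shared variable over several steps further out, can be sketched with a toy storage-level MPC averaged over an inflow ensemble. The mass-balance model, all numbers, and the choice of scipy's L-BFGS-B solver below are illustrative assumptions, not the North Sea Canal model used in the study:

```python
import numpy as np
from scipy.optimize import minimize

# 24-hour horizon; control blocks get coarser further into the future (ACR idea)
blocks = [1] * 6 + [3] * 2 + [6] * 2          # 6 x 1 h, 2 x 3 h, 2 x 6 h = 24 h
n_ctrl = len(blocks)
dt, area = 1.0, 1.0e6                         # hours, storage area (m^2), hypothetical
h0, h_set = 0.0, 0.0                          # initial and target water level (m)

rng = np.random.default_rng(0)
n_scen = 5
inflows = 50.0 + 10.0 * rng.standard_normal((n_scen, 24))   # ensemble inflow forecast (m^3/s)

def expand(u_blocks):
    """Repeat each block control over the hourly steps it covers."""
    return np.repeat(u_blocks, blocks)

def cost(u_blocks):
    u = expand(u_blocks)                      # outflow (m^3/s) per hourly step
    total = 0.0
    for q in inflows:                         # average cost over all scenarios
        h = h0 + np.cumsum((q - u) * 3600.0 * dt) / area
        total += np.sum((h - h_set) ** 2) + 1e-4 * np.sum(u ** 2)
    return total / n_scen

res = minimize(cost, x0=np.full(n_ctrl, 50.0),
               bounds=[(0.0, 150.0)] * n_ctrl, method="L-BFGS-B")
print("optimised block outflows (m^3/s):", np.round(res.x, 1))
```

With only 10 decision variables instead of 24, the optimizer has less work to do, which is the essence of the computational saving reported for ACR-MPC.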
Stone, John E.; Hallock, Michael J.; Phillips, James C.; Peterson, Joseph R.; Luthey-Schulten, Zaida; Schulten, Klaus
2016-01-01
Many of the continuing scientific advances achieved through computational biology are predicated on the availability of ongoing increases in computational power required for detailed simulation and analysis of cellular processes on biologically-relevant timescales. A critical challenge facing the development of future exascale supercomputer systems is the development of new computing hardware and associated scientific applications that dramatically improve upon the energy efficiency of existing solutions, while providing increased simulation, analysis, and visualization performance. Mobile computing platforms have recently become powerful enough to support interactive molecular visualization tasks that were previously only possible on laptops and workstations, creating future opportunities for their convenient use for meetings, remote collaboration, and as head mounted displays for immersive stereoscopic viewing. We describe early experiences adapting several biomolecular simulation and analysis applications for emerging heterogeneous computing platforms that combine power-efficient system-on-chip multi-core CPUs with high-performance massively parallel GPUs. We present low-cost power monitoring instrumentation that provides sufficient temporal resolution to evaluate the power consumption of individual CPU algorithms and GPU kernels. We compare the performance and energy efficiency of scientific applications running on emerging platforms with results obtained on traditional platforms, identify hardware and algorithmic performance bottlenecks that affect the usability of these platforms, and describe avenues for improving both the hardware and applications in pursuit of the needs of molecular modeling tasks on mobile devices and future exascale computers. PMID:27516922
Self-Reported Ache, Pain, or Numbness in Feet and Use of Computers amongst Working-Age Finns
Korpinen, Leena; Pääkkönen, Rauno; Gobba, Fabriziomaria
2016-01-01
The use of the computers and other technical devices has increased. The aim of our work was to study the possible relation between self-reported foot symptoms and use of computers and cell phones using a questionnaire. The study was carried out as a cross-sectional study by posting a questionnaire to 15,000 working-age Finns. A total of 6121 responded, and 7.1% of respondents reported that they very often experienced pain, numbness, and aches in the feet. They also often experienced other symptoms: 52.3% had symptoms in the neck, 53.5% in had problems in the hip and lower back, and 14.6% often had sleeping disorders/disturbances. Only 11.2% of the respondents thought that their symptoms were connected to the use of desktop computers. We found that persons with symptoms in the feet quite often, or more often, had additional physical and mental symptoms. In future studies, it is important to take into account that the persons with symptoms in the feet may very often have other symptoms, and the use of computers can influence these symptoms. PMID:27827987
Computer network environment planning and analysis
NASA Technical Reports Server (NTRS)
Dalphin, John F.
1989-01-01
The GSFC Computer Network Environment provides a broadband RF cable between campus buildings and ethernet spines in buildings for the interlinking of Local Area Networks (LANs). This system provides terminal and computer linkage among host and user systems thereby providing E-mail services, file exchange capability, and certain distributed computing opportunities. The Environment is designed to be transparent and supports multiple protocols. Networking at Goddard has a short history and has been under coordinated control of a Network Steering Committee for slightly more than two years; network growth has been rapid with more than 1500 nodes currently addressed and greater expansion expected. A new RF cable system with a different topology is being installed during summer 1989; consideration of a fiber optics system for the future will begin soon. Summer study was directed toward Network Steering Committee operation and planning plus consideration of Center Network Environment analysis and modeling. Biweekly Steering Committee meetings were attended to learn the background of the network and the concerns of those managing it. Suggestions for historical data gathering have been made to support future planning and modeling. Data Systems Dynamic Simulator, a simulation package developed at NASA and maintained at GSFC, was studied as a possible modeling tool for the network environment. A modeling concept based on a hierarchical model was hypothesized for further development. Such a model would allow input of newly updated parameters and would provide an estimation of the behavior of the network.
Touching the Future by Training Students as Technology Workers.
ERIC Educational Resources Information Center
Wodarz, Nan
1999-01-01
Describes a technology consultant's training of promising students as network administrators as part of a high-school work-study program. Success hinged on combining work with education, providing supervision and mentoring, using knowledgeable trainers, not substituting students for staff shortcomings, and installing adequate computer security.…
Language Learning in Virtual Reality Environments: Past, Present, and Future
ERIC Educational Resources Information Center
Lin, Tsun-Ju; Lan, Yu-Ju
2015-01-01
This study investigated the research trends in language learning in a virtual reality environment by conducting a content analysis of findings published in the literature from 2004 to 2013 in four top ranked computer-assisted language learning journals: "Language Learning & Technology," "CALICO Journal," "Computer…
On computational methods for crashworthiness
NASA Technical Reports Server (NTRS)
Belytschko, T.
1992-01-01
The evolution of computational methods for crashworthiness and related fields is described and linked with the decreasing cost of computational resources and with improvements in computation methodologies. The latter includes more effective time integration procedures and more efficient elements. Some recent developments in methodologies and future trends are also summarized. These include multi-time step integration (or subcycling), further improvements in elements, adaptive meshes, and the exploitation of parallel computers.
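Multi-time-step integration (subcycling) advances the stiff part of a model with a small time step while the rest keeps a larger one, interpolating the interface state in between. The toy sketch below uses two coupled spring-mass nodes and semi-implicit Euler updates; it illustrates the idea only and is not any particular crash code's scheme:

```python
m_a, m_b = 0.1, 10.0                    # light (stiff) node A, heavy node B
k_a, k_b, k_c = 5000.0, 50.0, 200.0     # ground springs and coupling spring
dt_big, n_sub = 1e-3, 20                # B advances with dt_big, A is subcycled n_sub times
dt_small = dt_big / n_sub

x_a = v_a = v_b = 0.0
x_b = 0.05                              # initial displacement of the heavy node (m)

for step in range(2000):
    x_b_old = x_b
    # Heavy node: one update per macro step using the current interface force
    f_b = -k_b * x_b - k_c * (x_b - x_a)
    v_b += dt_big * f_b / m_b
    x_b += dt_big * v_b
    # Light node: subcycle with the interface displacement interpolated linearly
    for s in range(n_sub):
        x_b_interp = x_b_old + (s + 1) / n_sub * (x_b - x_b_old)
        f_a = -k_a * x_a - k_c * (x_a - x_b_interp)
        v_a += dt_small * f_a / m_a
        x_a += dt_small * v_a

print("final displacements:", round(x_a, 4), round(x_b, 4))
```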
NASA Astrophysics Data System (ADS)
Cao, Chao
2009-03-01
Nano-scale physical phenomena and processes, especially those in electronics, have drawn great attention in the past decade. Experiments have shown that electronic and transport properties of functionalized carbon nanotubes are sensitive to adsorption of gas molecules such as H2, NO2, and NH3. Similar measurements have also been performed to study adsorption of proteins on other semiconductor nano-wires. These experiments suggest that nano-scale systems can be useful for making future chemical and biological sensors. Aiming to understand the physical mechanisms underlying and governing property changes at nano-scale, we start off by investigating, via first-principles method, the electronic structure of Pd-CNT before and after hydrogen adsorption, and continue with coherent electronic transport using non-equilibrium Green’s function techniques combined with density functional theory. Once our results are fully analyzed they can be used to interpret and understand experimental data, with a few difficult issues to be addressed. Finally, we discuss a newly developed multi-scale computing architecture, OPAL, that coordinates simultaneous execution of multiple codes. Inspired by the capabilities of this computing framework, we present a scenario of future modeling and simulation of multi-scale, multi-physical processes.
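A minimal flavor of the Green's-function transport calculation mentioned here is the Landauer transmission T(E) = Tr[Γ_L G Γ_R G†] for a toy tight-binding chain with wide-band-limit leads. The sketch below stops at that level; the actual work combines non-equilibrium Green's functions with density functional theory for functionalized nanotubes:

```python
import numpy as np

n, t_hop, gamma = 10, 1.0, 0.5          # chain length, hopping, lead coupling (arbitrary units)

# Tight-binding Hamiltonian of a finite chain
H = np.zeros((n, n))
for i in range(n - 1):
    H[i, i + 1] = H[i + 1, i] = -t_hop

# Wide-band-limit lead self-energies on the two end sites
Gamma_L = np.zeros((n, n)); Gamma_L[0, 0] = gamma
Gamma_R = np.zeros((n, n)); Gamma_R[-1, -1] = gamma
Sigma = -0.5j * (Gamma_L + Gamma_R)

for E in np.linspace(-2.5, 2.5, 11):
    G = np.linalg.inv((E + 1e-9j) * np.eye(n) - H - Sigma)   # retarded Green's function
    T = np.trace(Gamma_L @ G @ Gamma_R @ G.conj().T).real    # Landauer transmission
    print(f"E = {E:+.2f}  T(E) = {T:.3f}")
```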
Wallace, Sean; Clark, Marcia; White, Jonathan
2012-01-01
Objective The last decade has seen the introduction of new technology which has transformed many aspects of our culture, commerce, communication and education. This study examined how medical teachers and learners are using mobile computing devices such as the iPhone in medical education and practice, and how they envision them being used in the future. Design Semistructured interviews were conducted with medical students, residents and faculty to examine participants’ attitudes about the current and future use of mobile computing devices in medical education and practice. A thematic approach was used to summarise ideas and concepts expressed, and to develop an online survey. A mixed methods approach was used to integrate qualitative and quantitative findings. Setting and participants Medical students, residents and faculty at a large Canadian medical school in 2011. Results Interviews were conducted with 18 participants (10 students, 7 residents and 1 faculty member). Only 213 participants responded to the online survey (76 students, 65 residents and 41 faculty members). Over 85% of participants reported using a mobile-computing device. The main uses described for mobile devices related to information management, communication and time management. Advantages identified were portability, flexibility, access to multimedia and the ability to look up information quickly. Challenges identified included: superficial learning, not understanding how to find good learning resources, distraction, inappropriate use and concerns about access and privacy. Both medical students and physicians expressed the view that the use of these devices in medical education and practice will increase in the future. Conclusions This new technology offers the potential to enhance learning and patient care, but also has potential problems associated with its use. It is important for leadership in medical schools and healthcare organisations to set the agenda in this rapidly developing area to maximise the benefits of this powerful new technology while avoiding unintended consequences. PMID:22923627
Wallace, Sean; Clark, Marcia; White, Jonathan
2012-01-01
The last decade has seen the introduction of new technology which has transformed many aspects of our culture, commerce, communication and education. This study examined how medical teachers and learners are using mobile computing devices such as the iPhone in medical education and practice, and how they envision them being used in the future. Semistructured interviews were conducted with medical students, residents and faculty to examine participants' attitudes about the current and future use of mobile computing devices in medical education and practice. A thematic approach was used to summarise ideas and concepts expressed, and to develop an online survey. A mixed methods approach was used to integrate qualitative and quantitative findings. Medical students, residents and faculty at a large Canadian medical school in 2011. Interviews were conducted with 18 participants (10 students, 7 residents and 1 faculty member). Only 213 participants responded to the online survey (76 students, 65 residents and 41 faculty members). Over 85% of participants reported using a mobile-computing device. The main uses described for mobile devices related to information management, communication and time management. Advantages identified were portability, flexibility, access to multimedia and the ability to look up information quickly. Challenges identified included: superficial learning, not understanding how to find good learning resources, distraction, inappropriate use and concerns about access and privacy. Both medical students and physicians expressed the view that the use of these devices in medical education and practice will increase in the future. This new technology offers the potential to enhance learning and patient care, but also has potential problems associated with its use. It is important for leadership in medical schools and healthcare organisations to set the agenda in this rapidly developing area to maximise the benefits of this powerful new technology while avoiding unintended consequences.
Progress on the Fabric for Frontier Experiments Project at Fermilab
NASA Astrophysics Data System (ADS)
Box, Dennis; Boyd, Joseph; Dykstra, Dave; Garzoglio, Gabriele; Herner, Kenneth; Kirby, Michael; Kreymer, Arthur; Levshina, Tanya; Mhashilkar, Parag; Sharma, Neha
2015-12-01
The FabrIc for Frontier Experiments (FIFE) project is an ambitious, major-impact initiative within the Fermilab Scientific Computing Division designed to lead the computing model for Fermilab experiments. FIFE is a collaborative effort between experimenters and computing professionals to design and develop integrated computing models for experiments of varying needs and infrastructure. The major focus of the FIFE project is the development, deployment, and integration of Open Science Grid solutions for high throughput computing, data management, database access and collaboration within experiments. To accomplish this goal, FIFE has developed workflows that utilize Open Science Grid sites along with dedicated and commercial cloud resources. The FIFE project has made significant progress integrating into experiment computing operations several services including new job submission services, software and reference data distribution through CVMFS repositories, a flexible data transfer client, and access to opportunistic resources on the Open Science Grid. The progress with current experiments and plans for expansion with additional projects will be discussed. FIFE has taken a leading role in the definition of the computing model for Fermilab experiments, aided in the design of computing for experiments beyond Fermilab, and will continue to define the future direction of high throughput computing for future physics experiments worldwide.
Coleman, Mari Beth; Cherry, Rebecca A; Moore, Tara C; Park, Yujeong; Cihak, David F
2015-06-01
The purpose of this study was to compare the effects of teacher-directed simultaneous prompting to computer-assisted simultaneous prompting for teaching sight words to 3 elementary school students with intellectual disability. Activities in the computer-assisted condition were designed with Intellitools Classroom Suite software, whereas traditional materials (i.e., flashcards) were used in the teacher-directed condition. Treatment conditions were compared using an adapted alternating treatments design. Acquisition of sight words occurred in both conditions for all 3 participants; however, each participant either clearly responded better in the teacher-directed condition or reported a preference for the teacher-directed condition when performance was similar, with computer-assisted instruction being more efficient. Practical implications and directions for future research are discussed.
Advances in computer-aided well-test interpretation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Horne, R.N.
1994-07-01
Despite the feeling expressed several times over the past 40 years that well-test analysis had reached its peak development, an examination of recent advances shows continuous expansion in capability, with future improvement likely. The expansion in interpretation capability over the past decade arose mainly from the development of computer-aided techniques, which, although introduced 20 years ago, have come into use only recently. The broad application of computer-aided interpretation originated with the improvement of the methodologies and continued with the expansion in computer access and capability that accompanied the explosive development of the microcomputer industry. This paper focuses on the different pieces of the methodology that combine to constitute a computer-aided interpretation and attempts to compare some of the approaches currently used. Future directions of the approach are also discussed. The separate areas discussed are deconvolution, pressure derivatives, model recognition, nonlinear regression, and confidence intervals.
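The nonlinear-regression piece of such a computer-aided workflow amounts to fitting a pressure-transient model to measured data and quoting confidence intervals from the parameter covariance. A hedged sketch with a generic line-source-style drawdown, Δp(t) = m·E1(a/t), and synthetic data (the parameterization and numbers are illustrative, not a specific field case):

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.special import exp1

def drawdown(t, m, a):
    """Line-source-style drawdown model: m * E1(a / t)."""
    return m * exp1(a / t)

# Synthetic "measured" data with noise (hypothetical true parameters m=12, a=0.03)
rng = np.random.default_rng(2)
t = np.logspace(-1, 2, 40)                       # time in hours
p_obs = drawdown(t, 12.0, 0.03) + 0.5 * rng.standard_normal(t.size)

popt, pcov = curve_fit(drawdown, t, p_obs, p0=[5.0, 0.1])
stderr = np.sqrt(np.diag(pcov))
for name, val, se in zip(["m", "a"], popt, stderr):
    print(f"{name} = {val:.4f}  (approx. 95% CI +/- {1.96 * se:.4f})")
```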
Computers in the Classroom: A Feminist Issue.
ERIC Educational Resources Information Center
Stalker, Sylvia
Women stand to lose a great deal in the information revolution if they fail to master computer technology. Presently they are taking little part in the computer field. In the future, access to jobs, information, and power will depend on computer knowledge and skills. Women will be pushed into lower status jobs or out of jobs altogether,…
Closing the Gender Gap: Girls and Computers.
ERIC Educational Resources Information Center
Fuchs, Lucy
While 15 years ago only a few schools had microcomputers, today a majority of public schools have some computers, although an adequate number of computers for students to use is still in the future. Unfortunately, statistics show that, in many states, a higher percentage of male students are enrolled in computer classes than female; boys seem to…
ERIC Educational Resources Information Center
Lesgold, Alan; Reif, Frederick
The future of computers in education and the research needed to realize the computer's potential are discussed in this report, which presents a summary and the conclusions from an invitational conference involving 40 computer scientists, psychologists, educational researchers, teachers, school administrators, and parents. The summary stresses the…
Web N.0, the New Development Trend of Internet
NASA Astrophysics Data System (ADS)
Sun, Zhiguo; Wang, Wensheng
This article analyzes changes in the basic theory of the Internet, in the underlying network environment, and in user behaviour, and from these analyses projects the development trends of some existing Internet products in the future Internet environment. The article also takes up the currently hot concept of cloud computing, demonstrates the relation between cloud computing and Web 2.0 from the angle of cloud-based end-user applications, and discusses possible killer applications of the future.
ERIC Educational Resources Information Center
Friedman, Stan, Sr.
2004-01-01
This article describes the results of the 19th annual Computers in Libraries Conference in Washington, DC on March 10-12, 2004. The conference peered into the future, drew lessons from the past, and ran like clockwork. Program chair Jane Dysart and her organizing committee are by now old hands, bringing together three keynote addresses, 100…
Advanced laptop and small personal computer technology
NASA Technical Reports Server (NTRS)
Johnson, Roger L.
1991-01-01
Advanced laptop and small personal computer technology is presented in the form of the viewgraphs. The following areas of hand carried computers and mobile workstation technology are covered: background, applications, high end products, technology trends, requirements for the Control Center application, and recommendations for the future.
Computer Conferencing: A Campus Meets Online.
ERIC Educational Resources Information Center
Tooey, Mary Joan; Wester, Beverly R.
1989-01-01
Describes the implementation and use of a computer conferencing system at the University of Maryland at Baltimore. The discussion covers the pros and cons of computer conferencing in general, an informal evaluation of the system at Baltimore, and some predictions for future enhancements and utilization. (CLB)
The Macintosh Based Design Studio.
ERIC Educational Resources Information Center
Earle, Daniel W., Jr.
1988-01-01
Describes the configuration of a workstation for a college design studio based on the Macintosh Plus microcomputer. Highlights include cost estimates, computer hardware peripherals, computer aided design software, networked studios, and potentials for new approaches to design activity in the computer based studio of the future. (Author/LRW)
The Metamorphosis of an Introduction to Computer Science.
ERIC Educational Resources Information Center
Ben-Jacob, Marion G.
1997-01-01
Introductory courses in computer science at colleges and universities have undergone significant changes in 20 years. This article provides an overview of the history of introductory computer science (FORTRAN, ANSI flowchart symbols, BASIC, data processing concepts, and PASCAL) and its future (robotics and C++). (PEN)
Output Devices, Computation, and the Future of Mathematical Crafts.
ERIC Educational Resources Information Center
Eisenberg, Michael
2002-01-01
The advent of powerful, affordable output devices offers the potential for a vastly expanded landscape of computationally-enriched mathematical craft activities in education. Craft activities have both intellectual and emotional affordances that are relatively lacking in "traditional" computer-based education. Describes three software applications…
A Plan for Community College Instructional Computing.
ERIC Educational Resources Information Center
Howard, Alan; And Others
This document presents a comprehensive plan for future growth in instructional computing in the Washington community colleges. Two chapters define the curriculum objectives and content recommended for instructional courses in the community colleges which require access to computing facilities. The courses described include data processing…
The Fabric for Frontier Experiments Project at Fermilab
NASA Astrophysics Data System (ADS)
Kirby, Michael
2014-06-01
The FabrIc for Frontier Experiments (FIFE) project is a new, far-reaching initiative within the Fermilab Scientific Computing Division to drive the future of computing services for experiments at FNAL and elsewhere. It is a collaborative effort between computing professionals and experiment scientists to produce an end-to-end, fully integrated set of services for computing on the grid and clouds, managing data, accessing databases, and collaborating within experiments. FIFE includes 1) easy to use job submission services for processing physics tasks on the Open Science Grid and elsewhere; 2) an extensive data management system for managing local and remote caches, cataloging, querying, moving, and tracking the use of data; 3) custom and generic database applications for calibrations, beam information, and other purposes; 4) collaboration tools including an electronic log book, speakers bureau database, and experiment membership database. All of these aspects will be discussed in detail. FIFE sets the direction of computing at Fermilab experiments now and in the future, and therefore is a major driver in the design of computing services worldwide.
The future of scientific workflows
DOE Office of Scientific and Technical Information (OSTI.GOV)
Deelman, Ewa; Peterka, Tom; Altintas, Ilkay
Today's computational, experimental, and observational sciences rely on computations that involve many related tasks. The success of a scientific mission often hinges on the computer automation of these workflows. In April 2015, the US Department of Energy (DOE) invited a diverse group of domain and computer scientists from national laboratories supported by the Office of Science, the National Nuclear Security Administration, from industry, and from academia to review the workflow requirements of DOE's science and national security missions, to assess the current state of the art in science workflows, to understand the impact of emerging extreme-scale computing systems on those workflows, and to develop requirements for automated workflow management in future and existing environments. This article is a summary of the opinions of over 50 leading researchers attending this workshop. We highlight use cases, computing systems, workflow needs and conclude by summarizing the remaining challenges this community sees that inhibit large-scale scientific workflows from becoming a mainstream tool for extreme-scale science.
Computer simulation: A modern day crystal ball?
NASA Technical Reports Server (NTRS)
Sham, Michael; Siprelle, Andrew
1994-01-01
It has long been the desire of managers to be able to look into the future and predict the outcome of decisions. With the advent of computer simulation and the tremendous capability provided by personal computers, that desire can now be realized. This paper presents an overview of computer simulation and modeling, and discusses the capabilities of Extend. Extend is an icon-driven, Macintosh-based software tool that brings the power of simulation to the average computer user. An example of an Extend-based model is presented in the form of the Space Transportation System (STS) Processing Model. The STS Processing Model produces eight shuttle launches per year, yet it takes only about ten minutes to run. In addition, statistical data such as facility utilization, wait times, and processing bottlenecks are produced. The addition or deletion of resources, such as orbiters or facilities, can be easily modeled and their impact analyzed. Through the use of computer simulation, it is possible to look into the future to see the impact of today's decisions.
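The abstract does not reproduce the STS Processing Model itself; as a rough, hypothetical sketch of the kind of discrete-event logic such a simulation tool encapsulates, the Python fragment below models a single processing facility and reports launches per year, facility utilization, and wait times. All parameter names and values (SIM_DAYS, MEAN_ARRIVAL_DAYS, PROCESS_DAYS) are invented for illustration and are not taken from the Extend model.

```python
import random

# Hypothetical parameters -- not taken from the actual STS Processing Model.
SIM_DAYS = 365          # simulate one year of facility operations
MEAN_ARRIVAL_DAYS = 45  # mean days between orbiter arrivals (exponential)
PROCESS_DAYS = 38       # deterministic facility processing time per orbiter

random.seed(1)

def simulate():
    next_arrival = random.expovariate(1.0 / MEAN_ARRIVAL_DAYS)
    facility_free_at = 0.0
    busy_time, waits, launches = 0.0, [], 0
    while next_arrival < SIM_DAYS:
        start = max(next_arrival, facility_free_at)   # wait if facility is busy
        waits.append(start - next_arrival)
        facility_free_at = start + PROCESS_DAYS
        busy_time += PROCESS_DAYS
        launches += 1
        next_arrival += random.expovariate(1.0 / MEAN_ARRIVAL_DAYS)
    print(f"launches per simulated year : {launches}")
    print(f"facility utilization        : {busy_time / SIM_DAYS:.0%}")
    print(f"mean wait before processing : {sum(waits) / max(len(waits), 1):.1f} days")

simulate()
```

With these made-up parameters the sketch happens to produce on the order of eight launches per simulated year, mirroring the throughput figure the abstract mentions.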
Basic concepts and development of an all-purpose computer interface for ROC/FROC observer study.
Shiraishi, Junji; Fukuoka, Daisuke; Hara, Takeshi; Abe, Hiroyuki
2013-01-01
In this study, we first investigated the requirements for a computer interface employed in receiver operating characteristic (ROC) and free-response ROC (FROC) observer studies, which involve digital images and ratings obtained by observers (radiologists). Second, taking these requirements into account, we developed an all-purpose computer interface for such observer performance studies. Observer studies can be classified into three paradigms: one rating per case without identification of a signal location, one rating per case with identification of a signal location, and multiple ratings per case with identification of signal locations. For these paradigms, display modes on the computer interface can be used for single/multiple views of a static image, continuous viewing of cascaded images (e.g., CT, MRI), and dynamic viewing of movies (e.g., DSA, ultrasound). Various functions within these display modes, including windowing (contrast/level), magnification, and annotation, need to be selected by the experimenter according to the purpose of the research. In addition, the rules of judgment for distinguishing between true positives and false positives are an important factor in estimating diagnostic accuracy in an observer study. We developed a computer interface that runs on the Windows operating system and takes into account all aspects required for various observer studies. The interface requires experimenters to have sufficient knowledge of ROC/FROC observer studies, but it can be used for any type of observer study. It will be distributed publicly in the near future.
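The paper does not include source code; as a minimal, hypothetical illustration of how observer ratings from such an interface are typically reduced to a figure of diagnostic accuracy, the sketch below computes an empirical ROC area from confidence ratings. The function and the rating values are invented and this is not the authors' software.

```python
# Illustrative only: empirical ROC area from observer confidence ratings
# (e.g., 1 = definitely normal ... 5 = definitely abnormal).
def empirical_auc(ratings_positive, ratings_negative):
    """Probability that a randomly chosen positive case is rated higher
    than a randomly chosen negative case (ties count as half)."""
    wins = 0.0
    for p in ratings_positive:
        for n in ratings_negative:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(ratings_positive) * len(ratings_negative))

# Hypothetical ratings from one reader.
pos = [5, 4, 4, 3, 5, 2]   # cases containing a true signal
neg = [1, 2, 3, 1, 2, 4]   # cases without a signal
print(f"empirical AUC = {empirical_auc(pos, neg):.2f}")
```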
Becker, Annette; Herzberg, Dominikus; Marsden, Nicola; Thomanek, Sabine; Jung, Hartmut; Leonhardt, Corinna
2011-05-01
To develop a computer-based counselling system (CBCS) for the improvement of attitudes towards physical activity in chronically ill patients and to pilot its efficacy and acceptance in primary care. The system is tailored to patients' disease and motivational stage. During a pilot study in five German general practices, patients answered questions before, directly and 6 weeks after using the CBCS. Outcome criteria were attitudes and self-efficacy. Qualitative interviews were performed to identify acceptance indicators. Seventy-nine patients participated (mean age: 64.5 years, 53% males; 38% without previous computer experience). Patients' affective and cognitive attitudes changed significantly, self-efficacy showed only minor changes. Patients mentioned no difficulties in interacting with the CBCS. However, perception of the system's usefulness was inconsistent. Computer-based counselling for physical activity related attitudes in patients with chronic diseases is feasible, but the circumstances of use with respect to the target group and its integration into the management process have to be clarified in future studies. This study adds to the understanding of computer-based counselling in primary health care. Acceptance indicators identified in this study will be validated as part of a questionnaire on technology acceptability in a subsequent study. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
WE-B-BRD-01: Innovation in Radiation Therapy Planning II: Cloud Computing in RT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moore, K; Kagadis, G; Xing, L
As defined by the National Institute of Standards and Technology, cloud computing is “a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction.” Despite the omnipresent role of computers in radiotherapy, cloud computing has yet to achieve widespread adoption in clinical or research applications, though the transition to such “on-demand” access is underway. As this transition proceeds, new opportunities for aggregate studies and efficient use of computational resources are set against new challenges in patient privacy protection, data integrity, and management of clinical informatics systems. In this Session, current and future applications of cloud computing and distributed computational resources will be discussed in the context of medical imaging, radiotherapy research, and clinical radiation oncology applications. Learning Objectives: 1. Understand basic concepts of cloud computing. 2. Understand how cloud computing could be used for medical imaging applications. 3. Understand how cloud computing could be employed for radiotherapy research. 4. Understand how clinical radiotherapy software applications would function in the cloud.
Recursive computer architecture for VLSI
DOE Office of Scientific and Technical Information (OSTI.GOV)
Treleaven, P.C.; Hopkins, R.P.
1982-01-01
A general-purpose computer architecture based on the concept of recursion and suitable for VLSI computer systems built from replicated (lego-like) computing elements is presented. The recursive computer architecture is defined by presenting a program organisation, a machine organisation and an experimental machine implementation oriented to VLSI. The experimental implementation is being restricted to simple, identical microcomputers each containing a memory, a processor and a communications capability. This future generation of lego-like computer systems are termed fifth generation computers by the Japanese. 30 references.
Bhavnani, Suresh K; Chavan, Apala L; Jain, Isha; Maroo, Sudhanshoo
2011-01-01
The growing influx of information and communication technologies (ICTs) into rural India provides new opportunities for the prevention and treatment of diseases across millions of residents. However, little is known about how rural Indians with little or no exposure to computers perceive computers and their uses, and how best to elicit those perceptions. Such perceptions could lead to new insights for using ICTs to affect health behavior change in developing countries. We therefore developed a semi-structured interview approach to probe how residents of a north Indian village perceived computers and their uses. The results suggest that besides helping to overturn several assumptions of the researchers through unexpected insights, the approach could be easily implemented in rural settings, which could lead to deeper insights for developing future culturally and medically-relevant ICTs for rural residents.
Computer ethics and tertiary level education in Hong Kong
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wong, E.Y.W.; Davison, R.M.; Wade, P.W.
1994-12-31
This paper seeks to highlight some ethical issues relating to the increasing proliferation of Information Technology into our everyday lives. The authors explain their understanding of computer ethics, and give some reasons why the study of computer ethics is becoming increasingly pertinent. The paper looks at some of the problems that arise in attempting to develop appropriate ethical concepts in a constantly changing environment, and explores some of the ethical dilemmas arising from the increasing use of computers. Some initial research undertaken to explore the ideas and understanding of tertiary level students in Hong Kong on a number of ethical issues of interest is described, and our findings discussed. We hope that presenting this paper and eliciting subsequent discussion will enable us to draw up more comprehensive guidelines for the teaching of computer related ethics to tertiary level students, as well as reveal some directions for future research.
Oluwagbemi, Olugbenga O; Adewumi, Adewole; Esuruoso, Abimbola
2012-01-01
Computational biology and bioinformatics are gradually gaining ground in Africa and other developing nations of the world. However, in these countries, some of the challenges of computational biology and bioinformatics education are inadequate infrastructure and a lack of readily available complementary and motivational tools to support learning as well as research. This has discouraged many promising undergraduates, postgraduates and researchers from aspiring to undertake future study in these fields. In this paper, we developed and described MACBenAbim (Multi-platform Mobile Application for Computational Biology and Bioinformatics), a flexible, user-friendly tool to search for, define and describe the meanings of key terms in computational biology and bioinformatics, thus expanding the frontiers of knowledge of the users. This tool also has the capability of visualizing results in a mobile multi-platform context. MACBenAbim is available from the authors for non-commercial purposes.
Edwards, Sandra L; Slattery, Martha L; Murtaugh, Maureen A; Edwards, Roger L; Bryner, James; Pearson, Mindy; Rogers, Amy; Edwards, Alison M; Tom-Orme, Lillian
2007-06-01
This article describes the development and usability of an audio computer-assisted self-interviewing (ACASI) questionnaire created to collect dietary, physical activity, medical history, and other lifestyle data in a population of American Indians. Study participants were part of a cohort of American Indians living in the southwestern United States. Data were collected between March 2004 and July 2005. Information for evaluating questionnaire usability and acceptability was collected from three different sources: baseline study data, auxiliary background data, and a short questionnaire administered to a subset of study participants. For the subset of participants, 39.6% reported not having used a computer in the past year. The ACASI questionnaires were well accepted: 96.0% of the subset of participants reported finding them enjoyable to use, 97.2% reported that they were easy to use, and 82.6% preferred them for future questionnaires. A lower educational level and infrequent computer use in the past year were predictors of having usability trouble. These results indicate that the ACASI questionnaire is both an acceptable and a preferable mode of data collection in this population.
Theoretical Study of White Dwarf Double Stars
NASA Astrophysics Data System (ADS)
Hira, Ajit; Koetter, Ted; Rivera, Ruben; Diaz, Juan
2015-04-01
We continue our interest in the computational simulation of astrophysical phenomena with a study of gravitationally bound binary stars composed of at least one white dwarf star. Of particular interest to astrophysicists are the conditions inside a white dwarf star in the time frame leading up to its explosive end as a Type Ia supernova, for an understanding of these massive stellar explosions. In addition, studies of the evolution of white dwarfs could serve as promising probes of theories of gravitation. We developed FORTRAN computer programs to implement our models for white dwarfs and other stars. These codes allow for different sizes and masses of stars. Simulations were done in the mass interval from 0.1 to 2.0 solar masses. Our goal was to obtain both atmospheric and orbital parameters. The computational results thus obtained are compared with relevant observational data. The data are further analyzed to identify trends in terms of sizes and masses of stars. We hope to extend our computational studies to blue giant stars in the future. Research supported by the National Science Foundation.
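The FORTRAN codes themselves are not part of the abstract; as a small illustrative stand-in for the orbital side of such models, the Python sketch below applies Kepler's third law to a hypothetical white dwarf binary. The masses and separation are invented values, chosen only to fall within the 0.1-2.0 solar-mass interval mentioned.

```python
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30     # solar mass, kg

def orbital_period(m1_solar, m2_solar, separation_m):
    """Kepler's third law: P = 2*pi*sqrt(a^3 / (G*(m1 + m2)))."""
    m_total = (m1_solar + m2_solar) * M_SUN
    return 2.0 * math.pi * math.sqrt(separation_m**3 / (G * m_total))

# Hypothetical binary: 0.6 and 1.0 solar-mass white dwarfs separated by 1e9 m.
p_seconds = orbital_period(0.6, 1.0, 1.0e9)
print(f"orbital period ~ {p_seconds / 3600.0:.1f} hours")
```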
ERIC Educational Resources Information Center
Friedman, Adam
2014-01-01
In his 1997 article "Technology and the Social Studies--or: Which Way to the Sleeping Giant?" Peter Martorella made several predictions regarding technology resources in the social studies. Through a 2014 lens, Martorella's Internet seems archaic, yet two of his predictions were particularly poignant and have had a significant impact on…
Workshop on Engineering Turbulence Modeling
NASA Technical Reports Server (NTRS)
Povinelli, Louis A. (Editor); Liou, W. W. (Editor); Shabbir, A. (Editor); Shih, T.-H. (Editor)
1992-01-01
Discussed here is the future direction of various levels of engineering turbulence modeling related to computational fluid dynamics (CFD) computations for propulsion. For each level of computation, there are a few turbulence models which represent the state of the art for that level. However, it is important to know their capabilities as well as their deficiencies in order to help engineers select and implement the appropriate models in their real-world engineering calculations. This will also help turbulence modelers perceive the future directions for improving turbulence models. The focus is on one-point closure models (i.e., from algebraic models to higher-order moment closure schemes and partial differential equation methods) which can be applied to CFD computations. However, other schemes that are helpful in developing one-point closure models are also discussed.
Brain-computer interfaces in medicine.
Shih, Jerry J; Krusienski, Dean J; Wolpaw, Jonathan R
2012-03-01
Brain-computer interfaces (BCIs) acquire brain signals, analyze them, and translate them into commands that are relayed to output devices that carry out desired actions. BCIs do not use normal neuromuscular output pathways. The main goal of BCI is to replace or restore useful function to people disabled by neuromuscular disorders such as amyotrophic lateral sclerosis, cerebral palsy, stroke, or spinal cord injury. From initial demonstrations of electroencephalography-based spelling and single-neuron-based device control, researchers have gone on to use electroencephalographic, intracortical, electrocorticographic, and other brain signals for increasingly complex control of cursors, robotic arms, prostheses, wheelchairs, and other devices. Brain-computer interfaces may also prove useful for rehabilitation after stroke and for other disorders. In the future, they might augment the performance of surgeons or other medical professionals. Brain-computer interface technology is the focus of a rapidly growing research and development enterprise that is greatly exciting scientists, engineers, clinicians, and the public in general. Its future achievements will depend on advances in 3 crucial areas. Brain-computer interfaces need signal-acquisition hardware that is convenient, portable, safe, and able to function in all environments. Brain-computer interface systems need to be validated in long-term studies of real-world use by people with severe disabilities, and effective and viable models for their widespread dissemination must be implemented. Finally, the day-to-day and moment-to-moment reliability of BCI performance must be improved so that it approaches the reliability of natural muscle-based function. Copyright © 2012 Mayo Foundation for Medical Education and Research. Published by Elsevier Inc. All rights reserved.
Quantum chemistry simulation on quantum computers: theories and experiments.
Lu, Dawei; Xu, Boruo; Xu, Nanyang; Li, Zhaokai; Chen, Hongwei; Peng, Xinhua; Xu, Ruixue; Du, Jiangfeng
2012-07-14
It has been claimed that quantum computers can mimic quantum systems efficiently, with resources that scale only polynomially. Traditionally, those simulations are carried out numerically on classical computers, which are inevitably confronted with exponential growth in the required resources as the size of the quantum system increases. Quantum computers avoid this problem, and thus provide a possible solution for large quantum systems. In this paper, we first discuss the ideas of quantum simulation, the background of quantum simulators, their categories, and the development in both theories and experiments. We then present a brief introduction to quantum chemistry evaluated via classical computers, followed by typical procedures of quantum simulation towards quantum chemistry. Reviewed are not only theoretical proposals but also proof-of-principle experimental implementations, via a small quantum computer, which include the evaluation of the static molecular eigenenergy and the simulation of chemical reaction dynamics. Although the experimental development is still behind the theory, we give prospects and suggestions for future experiments. We anticipate that in the near future quantum simulation will become a powerful tool for quantum chemistry, surpassing classical computations.
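For readers unfamiliar with the classical baseline the review refers to, the following sketch shows the kind of calculation quantum simulation is meant to scale up: obtaining a static eigenenergy by exact diagonalization of a model Hamiltonian. The 2x2 matrix is purely hypothetical; real molecular Hamiltonians grow exponentially with system size, which is precisely the bottleneck quantum computers are expected to avoid.

```python
import numpy as np

# Illustrative 2x2 model Hamiltonian (values in hartree, purely hypothetical);
# real quantum-chemistry Hamiltonians are far larger, which is what quantum
# simulation aims to handle with polynomially scaling resources.
H = np.array([[-1.10, 0.15],
              [ 0.15, -0.45]])

eigenvalues = np.linalg.eigvalsh(H)   # exact diagonalization on a classical machine
print(f"ground-state eigenenergy = {eigenvalues[0]:.4f} hartree")
```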
NASA Astrophysics Data System (ADS)
Lugmayr, Artur
2006-02-01
The research field of ambient media is starting to spread rapidly and the first applications for consumer homes are on the way. Ambient media is the logical continuation of research around media. Media has been evolving from old media (e.g. print media), to integrated presentation in one form (multimedia, or new media), to generating a synthetic world (virtual reality), to media where the natural environment is the user interface (ambient media), and will evolve towards media in which the real and the synthetic are indistinguishable (bio-media or bio-multimedia). After the IT bubble burst, multimedia lacked a vision of potential future scenarios and applications. Within this research paper the potentials, applications, and commercially available solutions of mobile ambient multimedia are studied. The different features of ambient mobile multimedia are manifold and include wearable computers, adaptive software, context awareness, ubiquitous computers, middleware, and wireless networks. The paper especially focuses on algorithms and methods that can be utilized to realize modern mobile ambient systems.
Computational Study of the Structure of a Sepiolite/Thioindigo Mayan Pigment
Alvarado, Manuel; Chianelli, Russell C.; Arrowood, Roy M.
2012-01-01
The interaction of thioindigo and the phyllosilicate clay sepiolite is investigated using density functional theory (DFT) and molecular orbital theory (MO). The best fit to experimental UV/Vis spectra occurs when a single thioindigo molecule attaches via Van der Waals forces to a tetrahedrally coordinated Al3+ cation with an additional nearby tetrahedrally coordinated Al3+ also present. The thioindigo molecule distorts from its planar structure, a behavior consistent with a color change. Due to the weak interaction between thioindigo and sepiolite we conclude that the thioindigo molecule must be trapped in a channel, an observation consistent with previous experimental studies. Future computational studies will look at the interaction of indigo with sepiolite. PMID:23193386
Single-Cell Genomics: Approaches and Utility in Immunology.
Neu, Karlynn E; Tang, Qingming; Wilson, Patrick C; Khan, Aly A
2017-02-01
Single-cell genomics offers powerful tools for studying immune cells, which make it possible to observe rare and intermediate cell states that cannot be resolved at the population level. Advances in computer science and single-cell sequencing technology have created a data-driven revolution in immunology. The challenge for immunologists is to harness computing and turn an avalanche of quantitative data into meaningful discovery of immunological principles, predictive models, and strategies for therapeutics. Here, we review the current literature on computational analysis of single-cell RNA-sequencing data and discuss underlying assumptions, methods, and applications in immunology, and highlight important directions for future research. Copyright © 2016 Elsevier Ltd. All rights reserved.
Burgstahler, Sheryl; Comden, Dan; Lee, Sang-Mook; Arnold, Anthony; Brown, Kayla
2011-01-01
Computers, telephones, and assistive technology hold promise for increasing the independence, productivity, and participation of individuals with disabilities in academic, employment, recreation, and other activities. However, to reach this goal, technology must be accessible to, available to, and usable by everyone. The authors of this article share computer and telephone access challenges faced by individuals with neurological and other impairments, assistive technology solutions, issues that impact product adoption and use, needs for new technologies, and recommendations for practitioners and researchers. They highlight the stories of three individuals with neurological/mobility impairments, the technology they have found useful to them, and their recommendations for future product development.
Doughty, Teresa Taber; Bouck, Emily C; Bassette, Laura; Szwed, Kathryn; Flanagan, Sara
2013-01-01
The purpose of this study was to examine the effects of a pentop computer and accompanying spelling software on the spelling accuracy and academic engagement behavior in three elementary students with disabilities who were served in a resource room setting. Using a multiple baseline across students single subject research design, researchers determined student use of the pentop computer--the FLYPen--and its spelling software may serve as an equivalent intervention to traditional spelling instruction. While academic engagement performance increased considerably for students when using the FLYPen, results indicated little to no improvement over traditional instruction in spelling accuracy. Implications and suggestions for future research are presented.
Transonic Blunt Body Aerodynamic Coefficients Computation
NASA Astrophysics Data System (ADS)
Sancho, Jorge; Vargas, M.; Gonzalez, Ezequiel; Rodriguez, Manuel
2011-05-01
In the framework of EXPERT (European Experimental Re-entry Test-bed), accurate transonic aerodynamic coefficients are of paramount importance for correct trajectory assessment and parachute deployment. A combined CFD (Computational Fluid Dynamics) modelling and experimental campaign strategy was selected to obtain accurate coefficients. A preliminary set of coefficients was obtained by inviscid (Euler) CFD computation. An experimental campaign was then performed at the DNW facilities at NLR. A thorough review of the CFD modelling, informed by the wind tunnel test (WTT) results, was carried out with the aim of obtaining reliable values of the coefficients in the future (especially the pitching moment). The study includes different turbulence models and a mesh sensitivity analysis. Comparison with the WTT results is explored, and lessons learnt are collected.
A Mass Spectrometer Simulator in Your Computer
ERIC Educational Resources Information Center
Gagnon, Michel
2012-01-01
Introduced to study components of ionized gas, the mass spectrometer has evolved into a highly accurate device now used in many undergraduate and research laboratories. Unfortunately, despite their importance in the formation of future scientists, mass spectrometers remain beyond the financial reach of many high schools and colleges. As a result,…
Must Invisible Colleges Be Invisible? An Approach to Examining Large Communities of Network Users.
ERIC Educational Resources Information Center
Ruth, Stephen R.; Gouet, Raul
1993-01-01
Discussion of characteristics of users of computer-mediated communication systems and scientific networks focuses on a study of the scientific community in Chile. Topics addressed include users and nonusers; productivity; educational level; academic specialty; age; gender; international connectivity; public policy issues; and future research…
Occupational Aspirations of State FFA Contest and Award Winners.
ERIC Educational Resources Information Center
Bowen, Blannie E.; Doerfert, David L.
1989-01-01
A study explored the occupational aspirations of 300 (of 503) students with high levels of participation in Future Farmers of America's (FFA) Computers in Agriculture (CIA), Proficiency Award (PA), and Prepared and Extemporaneous Speaking (PES) contests. CIA and PES winners aspired to professional occupations more than PA winners. PES winners…
Analysis of propellant feedline dynamics
NASA Technical Reports Server (NTRS)
Astleford, W. J.; Holster, J. L.; Gerlach, C. R.
1972-01-01
An analytical model and computer program were developed for studying the disturbances of liquid propellants in engine feedline systems. It was found that the predominant effect of turbulence is to increase the spatial attenuation at low frequencies; at high frequencies the laminar and turbulent attenuations coincide. Recommendations for future work are included.
The Future Is Kids and Computers.
ERIC Educational Resources Information Center
Personal Computing, 1982
1982-01-01
Describes a project which produced educational computer programs for PET microcomputers and use of computers in money management, in a filter company, and in a certified public accountant firm (which cancelled a contract for a time-sharing service). Also describes a computerized eye information network for ophthalmologists. (JN)
IUWare and Computing Tools: Indiana University's Approach to Low-Cost Software.
ERIC Educational Resources Information Center
Sheehan, Mark C.; Williams, James G.
1987-01-01
Describes strategies for providing low-cost microcomputer-based software for classroom use on college campuses. Highlights include descriptions of the software (IUWare and Computing Tools); computing center support; license policies; documentation; promotion; distribution; staff, faculty, and user training; problems; and future plans. (LRW)
Technology Trends: Buying a Computer.
ERIC Educational Resources Information Center
Strot, Melody; Benno, Mark
1997-01-01
Provides guidelines for buying computers for parents of gifted children. Steps for making decisions include deciding who will use the computer, deciding its purposes and what software packages will be used, determining current and future needs, setting a budget, and reviewing needs with salespersons and school-based technology specialists. (CR)
The Next Generation of Personal Computers.
ERIC Educational Resources Information Center
Crecine, John P.
1986-01-01
Discusses factors converging to create high-capacity, low-cost nature of next generation of microcomputers: a coherent vision of what graphics workstation and future computing environment should be like; hardware developments leading to greater storage capacity at lower costs; and development of software and expertise to exploit computing power…
Ultrasonic Phased Array Simulations of Welded Components at NASA
NASA Technical Reports Server (NTRS)
Roth, D. J.; Tokars, R. P.; Martin, R. E.; Rauser, R. W.; Aldrin, J. C.
2009-01-01
Comprehensive and accurate inspections of welded components have become of increasing importance as NASA develops new hardware such as Ares rocket segments for future exploration missions. Simulation and modeling will play an increasing role in the future for nondestructive evaluation in order to better understand the physics of the inspection process, to prove or disprove the feasibility for an inspection method or inspection scenario, for inspection optimization, for better understanding of experimental results, and for assessment of probability of detection. This study presents simulation and experimental results for an ultrasonic phased array inspection of a critical welded structure important for NASA future exploration vehicles. Keywords: nondestructive evaluation, computational simulation, ultrasonics, weld, modeling, phased array
Hypersonic Boundary-Layer Transition for X-33 Phase 2 Vehicle
NASA Technical Reports Server (NTRS)
Thompson, Richard A.; Hamilton, Harris H., II; Berry, Scott A.; Horvath, Thomas J.; Nowak, Robert J.
1998-01-01
A status review of the experimental and computational work performed to support the X-33 program in the area of hypersonic boundary-layer transition is presented. Global transition fronts are visualized using thermographic phosphor measurements. Results are used to derive transition correlations for "smooth body" and discrete roughness data and a computational tool is developed to predict transition onset for X-33 using these results. The X-33 thermal protection system appears to be conservatively designed for transition effects based on these studies. Additional study is needed to address concerns related to surface waviness. A discussion of future test plans is included.
NASA Astrophysics Data System (ADS)
Singh, Ranjana; Mishra, Vijay K.; Singh, Hemant K.; Sharma, Gunjan; Koch, Biplob; Singh, Bachcha; Singh, Ranjan K.
2018-03-01
Acrylamide (acr) is a potentially toxic molecule produced in thermally processed foodstuffs. The acr-Mg complex has been synthesized chemically and characterized by spectroscopic techniques. The binding sites of acr with Mg were identified by experimental and computational methods. Both experimental and theoretical results suggest that Mg coordinates with the oxygen atom of the C=O group of acr. In-vitro cytotoxicity studies revealed a significant decrease in the toxicity of the acr-Mg complex as compared to pure acr. The decrease in toxicity on complexation with Mg may be a useful step for future research to reduce the toxicity of acr.
Firmino, Macedo; Morais, Antônio H; Mendoça, Roberto M; Dantas, Marcel R; Hekis, Helio R; Valentim, Ricardo
2014-04-08
The goal of this paper is to present a critical review of major Computer-Aided Detection systems (CADe) for lung cancer in order to identify challenges for future research. CADe systems must meet the following requirements: improve the performance of radiologists providing high sensitivity in the diagnosis, a low number of false positives (FP), have high processing speed, present high level of automation, low cost (of implementation, training, support and maintenance), the ability to detect different types and shapes of nodules, and software security assurance. The relevant literature related to "CADe for lung cancer" was obtained from PubMed, IEEEXplore and Science Direct database. Articles published from 2009 to 2013, and some articles previously published, were used. A systemic analysis was made on these articles and the results were summarized. Based on literature search, it was observed that many if not all systems described in this survey have the potential to be important in clinical practice. However, no significant improvement was observed in sensitivity, number of false positives, level of automation and ability to detect different types and shapes of nodules in the studied period. Challenges were presented for future research. Further research is needed to improve existing systems and propose new solutions. For this, we believe that collaborative efforts through the creation of open source software communities are necessary to develop a CADe system with all the requirements mentioned and with a short development cycle. In addition, future CADe systems should improve the level of automation, through integration with picture archiving and communication systems (PACS) and the electronic record of the patient, decrease the number of false positives, measure the evolution of tumors, evaluate the evolution of the oncological treatment, and its possible prognosis.
Computer systems performance measurement techniques.
DOT National Transportation Integrated Search
1971-06-01
Computer system performance measurement techniques, tools, and approaches are presented as a foundation for future recommendations regarding the instrumentation of the ARTS ATC data processing subsystem for purposes of measurement and evaluation.
17 CFR 38.156 - Automated trade surveillance system.
Code of Federal Regulations, 2014 CFR
2014-04-01
... potential trade practice violations. The automated system must load and process daily orders and trades no... anomalies; compute, retain, and compare trading statistics; compute trade gains, losses, and futures...
17 CFR 38.156 - Automated trade surveillance system.
Code of Federal Regulations, 2013 CFR
2013-04-01
... potential trade practice violations. The automated system must load and process daily orders and trades no... anomalies; compute, retain, and compare trading statistics; compute trade gains, losses, and futures...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chevallier, J.J.; Quetier, F.P.; Marshall, D.W.
Sedco Forex has developed an integrated computer system to enhance the technical performance of the company at various operational levels and to increase the understanding and knowledge of the drill crews. This paper describes the system and how it is used for recording and processing drilling data at the rig site, for associated technical analyses, and for well design, planning, and drilling performance studies at the operational centers. Some capabilities related to the statistical analysis of the company's operational records are also described, and future development of rig computing systems for drilling applications and management tasks is discussed.
Computer vision for microscopy diagnosis of malaria.
Tek, F Boray; Dempster, Andrew G; Kale, Izzet
2009-07-13
This paper reviews computer vision and image analysis studies aiming at automated diagnosis or screening of malaria infection in microscope images of thin blood film smears. Existing works interpret the diagnosis problem differently or propose partial solutions to the problem. A critique of these works is furnished. In addition, a general pattern recognition framework to perform diagnosis, which includes image acquisition, pre-processing, segmentation, and pattern classification components, is described. The open problems are addressed and a perspective of the future work for realization of automated microscopy diagnosis of malaria is provided.
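As a purely structural sketch of the generic pattern-recognition framework the review describes (image acquisition, pre-processing, segmentation, and classification), the placeholder pipeline below shows how the stages chain together. Every function body is a stand-in; none of it is an actual diagnosis method.

```python
# Skeleton of the generic pattern-recognition pipeline described in the review.
# Each stage is a placeholder; real systems substitute image-processing and
# classification algorithms evaluated against annotated blood-film images.
def acquire(path):
    return f"raw image loaded from {path}"          # placeholder for image I/O

def preprocess(image):
    return f"illumination-corrected({image})"       # e.g., background correction

def segment(image):
    return [f"candidate cell {i} from {image}" for i in range(3)]

def classify(candidate):
    return "infected" if "1" in candidate else "uninfected"   # stand-in rule only

for cell in segment(preprocess(acquire("thin_smear_001.png"))):
    print(cell, "->", classify(cell))
```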
Computational analysis of liquid hypergolic propellant rocket engines
NASA Technical Reports Server (NTRS)
Krishnan, A.; Przekwas, A. J.; Gross, K. W.
1992-01-01
The combustion process in liquid rocket engines depends on a number of complex phenomena such as atomization, vaporization, spray dynamics, mixing, and reaction mechanisms. A computational tool to study their mutual interactions is developed to help analyze these processes with a view of improving existing designs and optimizing future designs of the thrust chamber. The focus of the article is on the analysis of the Variable Thrust Engine for the Orbit Maneuvering Vehicle. This engine uses a hypergolic liquid bipropellant combination of monomethyl hydrazine as fuel and nitrogen tetroxide as oxidizer.
NASA Astrophysics Data System (ADS)
Goodman, Joseph W.
1987-10-01
Work Accomplished: OPTICAL INTERCONNECTIONS - the powerful interconnect abilities of optical beams have led to much optimism about the possible roles for optics in solving interconnect problems at various levels of computer architecture. The power requirements of optical interconnects at the gate-to-gate and chip-to-chip levels were examined. OPTICAL NEURAL NETWORKS - basic studies of the convergence properties of the Hopfield model, based on a mathematical (graph-theoretic) approach. OPTICS AND ARTIFICIAL INTELLIGENCE - a review of the field of optical processing and artificial intelligence, with the aim of finding areas that might be particularly attractive for future investigation.
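As a minimal numerical illustration of the Hopfield-model convergence studies mentioned above, the Python sketch below runs asynchronous updates on a small network with one stored pattern; under these update rules the network energy cannot increase, which is the essence of the convergence result. The pattern and network size are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Store one hypothetical pattern with the Hebbian rule (zero diagonal).
pattern = np.array([1, -1, 1, 1, -1, -1, 1, -1])
W = np.outer(pattern, pattern).astype(float)
np.fill_diagonal(W, 0.0)

def energy(state):
    # Hopfield energy; non-increasing under asynchronous threshold updates.
    return -0.5 * state @ W @ state

state = rng.choice([-1, 1], size=pattern.size)      # random initial state
for sweep in range(5):
    for i in rng.permutation(pattern.size):         # asynchronous unit updates
        state[i] = 1 if W[i] @ state >= 0 else -1
    print(f"sweep {sweep}: energy = {energy(state):.1f}")

print("converged to stored pattern (or its inverse):",
      np.array_equal(state, pattern) or np.array_equal(state, -pattern))
```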
Early Universe synthesis of asymmetric dark matter nuggets
NASA Astrophysics Data System (ADS)
Gresham, Moira I.; Lou, Hou Keong; Zurek, Kathryn M.
2018-02-01
We compute the mass function of bound states of asymmetric dark matter—nuggets—synthesized in the early Universe. We apply our results for the nugget density and binding energy computed from a nuclear model to obtain analytic estimates of the typical nugget size exiting synthesis. We numerically solve the Boltzmann equation for synthesis including two-to-two fusion reactions, estimating the impact of bottlenecks on the mass function exiting synthesis. These results provide the basis for studying the late Universe cosmology of nuggets in a future companion paper.
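The synthesis network in the paper is far richer than anything reproducible here, but a schematic Smoluchowski-style integration conveys the shape of a two-to-two fusion calculation: number densities of bound states of increasing size are evolved under pairwise merging. All rates, bin counts, and time steps below are invented for illustration.

```python
import numpy as np

# Schematic two-to-two aggregation (Smoluchowski-type) solver: n[k] is the
# number density of bound states of size k+1. Rates and bins are hypothetical
# and far simpler than the nugget synthesis network in the paper.
K = 8                          # largest tracked size
rate = 1.0                     # constant fusion rate coefficient
n = np.zeros(K); n[0] = 1.0    # start with free constituents only
dt, steps = 1e-3, 5000

for _ in range(steps):
    dn = np.zeros(K)
    for i in range(K):
        for j in range(K):
            if n[i] == 0 or n[j] == 0:
                continue
            flux = rate * n[i] * n[j] * dt
            dn[i] -= flux                  # each fusion consumes one of size i+1
            dn[j] -= flux                  # ... and one of size j+1
            if i + j + 1 < K:              # product has size (i+1)+(j+1)
                dn[i + j + 1] += flux      # larger products leak out of the grid
    n += dn

print("final size distribution:", np.round(n, 4))
```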
Public preferences for future conditions in disturbed and undisturbed northern forest sites
Terry C. Daniel
2006-01-01
This study presented computer visualizations (pictures) of projected changes over an 80-year period to conditions in a northern forest that had been hit by a major blowdown. Study participants included local residents and forest visitors who were asked to choose between visualizations of projected outcome scenarios for 10 pairs of treatment versus no-treatment options...
1979-09-01
joint orientation and joint slippage than to failure of the intact rock mass. Dixon (1971) noted the importance of including the confining influence of...dedicated computer. The area of research not covered by this investigation which holds promise for a future study is a detailed comparison of the results of...block data, type key "W". The program writes this data on Linc tapes for future retrieval. This feature can be used to store the consolidated block
Gutiérrez-Maldonado, José; Wiederhold, Brenda K; Riva, Giuseppe
2016-02-01
Transdisciplinary efforts for further elucidating the etiology of eating and weight disorders and improving the effectiveness of the available evidence-based interventions are imperative at this time. Recent studies indicate that computer-generated graphic environments (virtual reality, VR) can integrate and extend existing treatments for eating and weight disorders (EWDs). Future possibilities for VR to improve current approaches include its use for altering the experience of the body (embodiment) in real time and as a cue-exposure tool for reducing food craving.
Displaying Computer Simulations Of Physical Phenomena
NASA Technical Reports Server (NTRS)
Watson, Val
1991-01-01
Paper discusses computer simulation as a means of experiencing and learning to understand physical phenomena. Covers both present simulation capabilities and major advances expected in the near future. Visual, aural, tactile, and kinesthetic effects are used to teach such physical sciences as the dynamics of fluids. Recommends that classrooms in universities, government, and industry be linked to advanced computing centers so that computer simulations can be integrated into the education process.
ERIC Educational Resources Information Center
Cusick, Theresa; And Others
This examination of computer equity argues that current educational trends--which emphasize teaching applications of computers rather than programming--will limit the computer skills of students. Added to this difficulty is the argument that some students (often minority and female students) need not be pushed to learn programming if they don't…
Behold the Trojan Horse: Instructional vs. Productivity Computing in the Classroom.
ERIC Educational Resources Information Center
Loop, Liza
This background paper for a symposium on the school of the future reviews the current instructional applications of computers in the classroom (the computer as a means or the subject of instruction), and suggests strategies that administrators might use to move toward viewing the computer as a productivity tool for students, i.e., its use for word…
Approaches for scalable modeling and emulation of cyber systems : LDRD final report.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mayo, Jackson R.; Minnich, Ronald G.; Armstrong, Robert C.
2009-09-01
The goal of this research was to combine theoretical and computational approaches to better understand the potential emergent behaviors of large-scale cyber systems, such as networks of ~10^6 computers. The scale and sophistication of modern computer software, hardware, and deployed networked systems have significantly exceeded the computational research community's ability to understand, model, and predict current and future behaviors. This predictive understanding, however, is critical to the development of new approaches for proactively designing new systems or enhancing existing systems with robustness to current and future cyber threats, including distributed malware such as botnets. We have developed preliminary theoretical and modeling capabilities that can ultimately answer questions such as: How would we reboot the Internet if it were taken down? Can we change network protocols to make them more secure without disrupting existing Internet connectivity and traffic flow? We have begun to address these issues by developing new capabilities for understanding and modeling Internet systems at scale. Specifically, we have addressed the need for scalable network simulation by carrying out emulations of a network with ~10^6 virtualized operating system instances on a high-performance computing cluster - a 'virtual Internet'. We have also explored mappings between previously studied emergent behaviors of complex systems and their potential cyber counterparts. Our results provide foundational capabilities for further research toward understanding the effects of complexity in cyber systems, to allow anticipating and thwarting hackers.
Screening for cognitive impairment in older individuals. Validation study of a computer-based test.
Green, R C; Green, J; Harrison, J M; Kutner, M H
1994-08-01
This study examined the validity of a computer-based cognitive test that was recently designed to screen the elderly for cognitive impairment. Criterion-related validity was examined by comparing test scores of impaired patients and normal control subjects. Construct-related validity was computed through correlations between computer-based subtests and related conventional neuropsychological subtests. The setting was a university center for memory disorders. Participants were 52 patients with mild cognitive impairment by strict clinical criteria and 50 unimpaired, age- and education-matched control subjects. Control subjects were rigorously screened by neurological, neuropsychological, imaging, and electrophysiological criteria to identify and exclude individuals with occult abnormalities. Using a cut-off total score of 126, this computer-based instrument had a sensitivity of 0.83 and a specificity of 0.96. Using a prevalence estimate of 10%, predictive values, positive and negative, were 0.70 and 0.96, respectively. Computer-based subtests correlated significantly with conventional neuropsychological tests measuring similar cognitive domains. Thirteen (17.8%) of 73 volunteers with normal medical histories were excluded from the control group because of unsuspected abnormalities on standard neuropsychological tests, electroencephalograms, or magnetic resonance imaging scans. Computer-based testing is a valid screening methodology for the detection of mild cognitive impairment in the elderly, although this particular test has important limitations. Broader applications of computer-based testing will require extensive population-based validation. Future studies should recognize that normal control subjects without a history of disease who are typically used in validation studies may have a high incidence of unsuspected abnormalities on neurodiagnostic studies.
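The reported predictive values follow from Bayes' rule applied to the stated sensitivity, specificity, and assumed 10% prevalence; the short check below reproduces the positive predictive value of about 0.70 (the computed negative predictive value comes out near 0.98, so the reported 0.96 presumably reflects rounding or a slightly different calculation in the original study).

```python
def predictive_values(sensitivity, specificity, prevalence):
    """Bayes' rule for a screening test."""
    tp = sensitivity * prevalence                # true-positive probability mass
    fp = (1.0 - specificity) * (1.0 - prevalence)
    tn = specificity * (1.0 - prevalence)
    fn = (1.0 - sensitivity) * prevalence
    return tp / (tp + fp), tn / (tn + fn)        # (PPV, NPV)

ppv, npv = predictive_values(sensitivity=0.83, specificity=0.96, prevalence=0.10)
print(f"PPV = {ppv:.2f}, NPV = {npv:.2f}")
```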
Ocean Modeling and Visualization on Massively Parallel Computer
NASA Technical Reports Server (NTRS)
Chao, Yi; Li, P. Peggy; Wang, Ping; Katz, Daniel S.; Cheng, Benny N.
1997-01-01
Climate modeling is one of the grand challenges of computational science, and ocean modeling plays an important role in both understanding the current climatic conditions and predicting future climate change.
2013-01-01
Objective. This study compared the relationship between computer experience and performance on computerized cognitive tests and a traditional paper-and-pencil cognitive test in a sample of older adults (N = 634). Method. Participants completed computer experience and computer attitudes questionnaires, three computerized cognitive tests (Useful Field of View (UFOV) Test, Road Sign Test, and Stroop task) and a paper-and-pencil cognitive measure (Trail Making Test). Multivariate analysis of covariance was used to examine differences in cognitive performance across the four measures between those with and without computer experience after adjusting for confounding variables. Results. Although computer experience had a significant main effect across all cognitive measures, the effect sizes were similar. After controlling for computer attitudes, the relationship between computer experience and UFOV was fully attenuated. Discussion. Findings suggest that computer experience is not uniquely related to performance on computerized cognitive measures compared with paper-and-pencil measures. Because the relationship between computer experience and UFOV was fully attenuated by computer attitudes, this may imply that motivational factors are more influential to UFOV performance than computer experience. Our findings support the hypothesis that computer use is related to cognitive performance, and this relationship is not stronger for computerized cognitive measures. Implications and directions for future research are provided. PMID:22929395
Hofstadter-Duke, Kristi L; Daly, Edward J
2015-03-01
This study investigated a method for conducting experimental analyses of academic responding. In the experimental analyses, academic responding (math computation), rather than problem behavior, was reinforced across conditions. Two separate experimental analyses (one with fluent math computation problems and one with non-fluent math computation problems) were conducted with three elementary school children using identical contingencies while math computation rate was measured. Results indicate that the experimental analysis with non-fluent problems produced undifferentiated responding across participants; however, differentiated responding was achieved for all participants in the experimental analysis with fluent problems. A subsequent comparison of the single-most effective condition from the experimental analyses replicated the findings with novel computation problems. Results are discussed in terms of the critical role of stimulus control in identifying controlling consequences for academic deficits, and recommendations for future research refining and extending experimental analysis to academic responding are made. © The Author(s) 2014.
Training School Administrators in Computer Use.
ERIC Educational Resources Information Center
Spuck, Dennis W.; Bozeman, William C.
1988-01-01
Presents results of a survey of faculty members in doctoral-level educational administration programs that examined the use of computers in administrative training programs. The present status and future directions of technological training of school administrators are discussed, and a sample curriculum for a course in technology and computing is…
Proceedings from the conference on high speed computing: High speed computing and national security
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hirons, K.P.; Vigil, M.; Carlson, R.
1997-07-01
This meeting covered the following topics: technologies/national needs/policies: past, present and future; information warfare; crisis management/massive data systems; risk assessment/vulnerabilities; Internet law/privacy and rights of society; challenges to effective ASCI programmatic use of 100 TFLOPs systems; and new computing technologies.
Women and Information Technology: Framing Some Issues for Education.
ERIC Educational Resources Information Center
Damarin, Suzanne K.
1992-01-01
Discusses relationships among technology, women, and education. Presents three views of the computer's future: (1) the robot as superior human; (2) the cyborg; and (3) the human-computer dyad. Discusses effects that the computer has had upon work and school, particularly for women and at risk and nonliterate students. (SG)
Advanced Computing Tools and Models for Accelerator Physics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ryne, Robert; Ryne, Robert D.
2008-06-11
This paper is based on a transcript of my EPAC'08 presentation on advanced computing tools for accelerator physics. Following an introduction I present several examples, provide a history of the development of beam dynamics capabilities, and conclude with thoughts on the future of large scale computing in accelerator physics.
ERIC Educational Resources Information Center
Shelly, Gary B.; Cashman, Thomas J.; Gunter, Randolph E.; Gunter, Glenda A.
Intended for use in an introductory computer course for educators, this textbook contains the following chapters: (1) "Introduction to Using Computers in Education"; (2) "Communications, Networks, the Internet, and the World Wide Web"; (3) "Software Applications for Education,"; (4) "Hardware Applications for…
Using Interactive Computer to Communicate Scientific Information.
ERIC Educational Resources Information Center
Selnow, Gary W.
1988-01-01
Asks whether the computer is another channel of communication, if its interactive qualities make it an information source, or if it is an undefined hybrid. Concludes that computers are neither the medium nor the source but will in the future provide the possibility of a sophisticated interaction between human intelligence and artificial…
THE COMPUTER AND THE ARCHITECTURAL PROFESSION.
ERIC Educational Resources Information Center
HAVILAND, DAVID S.
The role of advancing technology in the field of architecture is discussed in this report. Problems in communication and the design process are identified. Advantages and disadvantages of computers are mentioned in relation to man and machine interaction. Present and future implications of computer usage are identified and discussed with respect…
2016 Annual Report - Argonne Leadership Computing Facility
DOE Office of Scientific and Technical Information (OSTI.GOV)
Collins, Jim; Papka, Michael E.; Cerny, Beth A.
The Argonne Leadership Computing Facility (ALCF) helps researchers solve some of the world’s largest and most complex problems, while also advancing the nation’s efforts to develop future exascale computing systems. This report presents some of the ALCF’s notable achievements in key strategic areas over the past year.
Computer Competence for the Applied Gerontologist.
ERIC Educational Resources Information Center
Dickel, C. Timothy; Young, W. Wayne
This paper shares some ideas regarding the use of computers by persons who use their gerontology training in direct service to older persons and their families. It proposes that, as professionals serving older persons and their families look toward the future, they need to conscientiously incorporate computer competence into their practice. The…
Very Large Scale Integration (VLSI).
ERIC Educational Resources Information Center
Yeaman, Andrew R. J.
Very Large Scale Integration (VLSI), the state-of-the-art production techniques for computer chips, promises such powerful, inexpensive computing that, in the future, people will be able to communicate with computer devices in natural language or even speech. However, before full-scale VLSI implementation can occur, certain salient factors must be…
17 CFR 171.4 - Computation of time.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 17 Commodity and Securities Exchanges 1 2010-04-01 2010-04-01 false Computation of time. 171.4 Section 171.4 Commodity and Securities Exchanges COMMODITY FUTURES TRADING COMMISSION RULES RELATING TO... computing any period of time prescribed by these rules or allowed by the Commission, the day of the act...
17 CFR 12.5 - Computation of time.
Code of Federal Regulations, 2010 CFR
2010-04-01
17 CFR 12.5, Commodity and Securities Exchanges, Commodity Futures Trading Commission Rules Relating to… In general. In computing any period of time prescribed by these rules or allowed by the Commission, the…
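The two snippets above quote the same style of day-counting rule: the day of the act is excluded, and a period ending on a Saturday, Sunday, or legal holiday runs to the next business day. As a rough illustration only, and not a restatement of either regulation, a minimal sketch of such a rule might look like this (the holiday list, dates, and period are made up):

```python
from datetime import date, timedelta

# Hypothetical observed holidays; a real computation would use the list
# defined by the applicable rules, not this placeholder set.
HOLIDAYS = {date(2010, 7, 5), date(2010, 12, 24)}

def is_business_day(d: date) -> bool:
    """Return True if d is neither a weekend day nor a listed holiday."""
    return d.weekday() < 5 and d not in HOLIDAYS

def filing_deadline(act_date: date, period_days: int) -> date:
    """Count a period of days in the style quoted above: the day of the act
    is excluded, and if the last day falls on a Saturday, Sunday, or holiday
    the period runs to the next business day."""
    deadline = act_date + timedelta(days=period_days)
    while not is_business_day(deadline):
        deadline += timedelta(days=1)
    return deadline

# Example: a 30-day period running from an act on July 1, 2010.
print(filing_deadline(date(2010, 7, 1), 30))
```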
Culture and Risk: Does the Future Compute? A Symposium.
ERIC Educational Resources Information Center
Barnes, Susan B.; Perkinson, Henry J.; Talbott, Stephen L.
1998-01-01
Presents a symposium on the impact of computers on culture. Argues that the computer has mathematized culture and that widespread risk aversion has been generated everywhere. Finds that the ways in which communication technologies are used in social contexts is a topic of concern to communication scholars. (PA)
Establishing Tools for Computing Hybrids
2006-10-01
Integrating Computational Science Tools into a Thermodynamics Course
ERIC Educational Resources Information Center
Vieira, Camilo; Magana, Alejandra J.; García, R. Edwin; Jana, Aniruddha; Krafcik, Matthew
2018-01-01
Computational tools and methods have permeated multiple science and engineering disciplines, because they enable scientists and engineers to process large amounts of data, represent abstract phenomena, and to model and simulate complex concepts. In order to prepare future engineers with the ability to use computational tools in the context of…
The Instrument of the Future: Computers in Education.
ERIC Educational Resources Information Center
Leonard, Rex; LeCroy, Barbara
Before computers will be able to fulfill their potential in education, two major challenges must be overcome--the lack of well-trained teachers and a lack of general knowledge about software and its capabilities. Teachers must acquire some computer literacy skills, including programming, word processing, materials generation and record keeping. In…
ERIC Educational Resources Information Center
1972
Recent and expected developments in the computer industry are discussed in this 628-page yearbook, successor to "The Punched Card Annual." The first section of the report is an overview of current computer hardware and software and includes articles about future applications of mainframes, an analysis of the software industry, and a summary of the…
NASA Astrophysics Data System (ADS)
Vallam, P.; Qin, X. S.
2017-10-01
Anthropogenic-driven climate change would affect the global ecosystem and is becoming a world-wide concern. Numerous studies have been undertaken to determine the future trends of meteorological variables at different scales. Despite these studies, there remains significant uncertainty in the prediction of future climates. To examine the uncertainty arising from using different schemes to downscale the meteorological variables for the future horizons, projections from different statistical downscaling schemes were examined. These schemes included statistical downscaling method (SDSM), change factor incorporated with LARS-WG, and bias corrected disaggregation (BCD) method. Global circulation models (GCMs) based on CMIP3 (HadCM3) and CMIP5 (CanESM2) were utilized to perturb the changes in the future climate. Five study sites (i.e., Alice Springs, Edmonton, Frankfurt, Miami, and Singapore) with diverse climatic conditions were chosen for examining the spatial variability of applying various statistical downscaling schemes. The study results indicated that the regions experiencing heavy precipitation intensities were most likely to demonstrate the divergence between the predictions from various statistical downscaling methods. Also, the variance computed in projecting the weather extremes indicated the uncertainty derived from selection of downscaling tools and climate models. This study could help gain an improved understanding about the features of different downscaling approaches and the overall downscaling uncertainty.
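The change-factor approach mentioned in this abstract is, at its simplest, a rescaling of an observed baseline series by a GCM-derived ratio. The following is a minimal sketch with made-up values, not data or code from the cited study (an additive factor would be used for variables such as temperature):

```python
import numpy as np

# Toy observed daily precipitation at a station (mm/day) and a GCM-derived
# multiplicative change factor (future / baseline) for the same month.
# Both the series and the factor are illustrative placeholders.
observed_baseline = np.array([0.0, 2.1, 0.0, 5.4, 12.3, 0.8, 0.0])
change_factor = 1.15  # e.g., a GCM projecting ~15% more precipitation

# The change-factor (delta) method simply scales the observed series,
# preserving its day-to-day structure while shifting its mean.
future_scenario = observed_baseline * change_factor
print(future_scenario.round(2))
```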
48 CFR 9904.415-60 - Illustrations.
Code of Federal Regulations, 2014 CFR
2014-10-01
... compensation for Contractor A shall be measured by the present value of the future benefits and shall be... calculate the future benefit. Any adjustment in the cost of deferred compensation which results from a... may not be included in the computation of the future benefits. The assignable cost for 1976 is...
48 CFR 9904.415-60 - Illustrations.
Code of Federal Regulations, 2013 CFR
2013-10-01
... compensation for Contractor A shall be measured by the present value of the future benefits and shall be... calculate the future benefit. Any adjustment in the cost of deferred compensation which results from a... may not be included in the computation of the future benefits. The assignable cost for 1976 is...
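The illustrations excerpted above turn on measuring deferred compensation by the present value of future benefits. A minimal sketch of a single-payment present-value computation, with an assumed discount rate and amounts that are purely illustrative rather than taken from the regulation, is:

```python
def present_value(future_benefit: float, annual_rate: float, years: int) -> float:
    """Discount a single future payment back to today at a constant annual rate."""
    return future_benefit / (1.0 + annual_rate) ** years

# Example: a deferred-compensation award of $10,000 payable in 5 years,
# discounted at an assumed 6% annual rate (both figures are illustrative).
print(round(present_value(10_000, 0.06, 5), 2))
```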
Ehrlinger, Joyce; Plant, E Ashby; Hartwig, Marissa K; Vossen, Jordan J; Columb, Corey J; Brewer, Lauren E
2018-01-01
Women are vastly underrepresented in the fields of computer science and engineering (CS&E). We examined whether women might view the intellectual characteristics of prototypical individuals in CS&E in more stereotype-consistent ways than men might and, consequently, show less interest in CS&E. We asked 269 U.S. college students (187, 69.5% women) to describe the prototypical computer scientist (Study 1) or engineer (Study 2) through open-ended descriptions as well as through a set of trait ratings. Participants also rated themselves on the same set of traits and rated their similarity to the prototype. Finally, participants in both studies were asked to describe their likelihood of pursuing future college courses and careers in computer science (Study 1) or engineering (Study 2). Across both studies, we found that women offered more stereotype-consistent ratings than did men of the intellectual characteristics of prototypes in CS (Study 1) and engineering (Study 2). Women also perceived themselves as less similar to the prototype than men did. Further, the observed gender differences in prototype perceptions mediated the tendency for women to report lower interest in CS&E fields relative to men. Our work highlights the importance of prototype perceptions for understanding the gender gap in CS&E and suggests avenues for interventions that may increase women's representation in these vital fields.
Blood Flow in Idealized Vascular Access for Hemodialysis: A Review of Computational Studies.
Ene-Iordache, Bogdan; Remuzzi, Andrea
2017-09-01
Although our understanding of the failure mechanism of vascular access for hemodialysis has increased substantially, this knowledge has not translated into successful therapies. Despite advances in technology, it is recognized that vascular access is difficult to maintain, due to complications such as intimal hyperplasia. Computational studies have been used to estimate hemodynamic changes induced by vascular access creation. Due to the heterogeneity of patient-specific geometries, and difficulties with obtaining reliable models of access vessels, idealized models were often employed. In this review we analyze the knowledge gained from the use of such simplified computational models. A review of the literature was conducted, considering studies employing a computational fluid dynamics approach to gain insights into the flow field phenotype that develops in idealized models of vascular access. Several important discoveries have originated from idealized model studies, including the detrimental role of disturbed flow and turbulent flow, and the beneficial role of spiral flow in intimal hyperplasia. The general flow phenotype was consistent among studies, but findings were not treated homogeneously since they paralleled achievements in cardiovascular biomechanics which spanned the last two decades. Computational studies in idealized models are important for studying local blood flow features and evaluating new concepts that may improve the patency of vascular access for hemodialysis. For future studies we strongly recommend numerical modelling targeted at accurately characterizing turbulent flows and multidirectional wall shear disturbances.
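One common way such studies quantify multidirectional wall shear disturbances is the oscillatory shear index (OSI); the reviewed papers may use other metrics, so the following is only a generic sketch on synthetic data, not a reproduction of any cited computation:

```python
import numpy as np

def oscillatory_shear_index(wss_vectors: np.ndarray, dt: float) -> float:
    """Oscillatory shear index for one wall point.

    wss_vectors: shape (n_steps, 3), instantaneous wall shear stress vectors
    over one cardiac cycle. OSI = 0.5 * (1 - |time-averaged vector| / average
    magnitude), ranging from 0 (unidirectional) to 0.5 (purely oscillatory)."""
    mean_vec_mag = np.linalg.norm(np.trapz(wss_vectors, dx=dt, axis=0))
    mean_mag = np.trapz(np.linalg.norm(wss_vectors, axis=1), dx=dt)
    return 0.5 * (1.0 - mean_vec_mag / mean_mag)

# Synthetic example: shear that fully reverses direction over the cycle.
t = np.linspace(0.0, 1.0, 101)
wss = np.stack([np.sin(2 * np.pi * t), np.zeros_like(t), np.zeros_like(t)], axis=1)
print(round(oscillatory_shear_index(wss, dt=t[1] - t[0]), 3))
```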
Fundamental device design considerations in the development of disruptive nanoelectronics.
Singh, R; Poole, J O; Poole, K F; Vaidya, S D
2002-01-01
In the last quarter of a century, silicon-based integrated circuits (ICs) have played a major role in the growth of the economy throughout the world. A number of new technologies, such as quantum computing, molecular computing, DNA molecules for computing, etc., are currently being explored to create a product to replace semiconductor transistor technology. We have examined all of the currently explored options and found that none of these options is suitable as a replacement for silicon ICs. In this paper we provide fundamental device criteria that must be satisfied for the successful operation of a manufacturable, not yet invented, device. The two fundamental limits are the removal of heat and reliability. The switching speed of any practical man-made computing device will be in the range of 10^-15 to 10^-3 s. Heisenberg's uncertainty principle and the computer architecture set the heat generation limit. The thermal conductivity of the materials used in the fabrication of a nanodimensional device sets the heat removal limit. In current electronic products, redundancy plays a significant part in improving the reliability of parts with macroscopic defects. In the future, microscopic and even nanoscopic defects will play a critical role in the reliability of disruptive nanoelectronics. The lattice vibrations will set the intrinsic reliability of future computing systems. The two critical limits discussed in this paper provide criteria for the selection of materials used in the fabrication of future devices. Our work shows that diamond contains the clue to providing computing devices that will surpass the performance of silicon-based nanoelectronics.
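The heat-generation argument sketched in this abstract can be illustrated with two standard order-of-magnitude bounds: the energy-time uncertainty relation named by the authors and the Landauer limit. The numbers below are illustrative only and are not taken from the paper:

```python
import math

HBAR = 1.054571817e-34  # reduced Planck constant, J*s
K_B = 1.380649e-23      # Boltzmann constant, J/K

def heisenberg_switching_energy(switch_time_s: float) -> float:
    """Order-of-magnitude lower bound on switching energy from the
    energy-time uncertainty relation, E >= hbar / (2 * dt)."""
    return HBAR / (2.0 * switch_time_s)

def landauer_limit(temperature_k: float = 300.0) -> float:
    """Minimum energy dissipated per irreversible bit operation, k*T*ln 2."""
    return K_B * temperature_k * math.log(2)

# Illustrative comparison: a 1 fs switch (the fast end of the 10^-15..10^-3 s
# range quoted in the abstract) versus the room-temperature Landauer limit.
print(f"Heisenberg bound at 1 fs: {heisenberg_switching_energy(1e-15):.2e} J")
print(f"Landauer limit at 300 K:  {landauer_limit():.2e} J")
```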
A Future Accelerated Cognitive Distributed Hybrid Testbed for Big Data Science Analytics
NASA Astrophysics Data System (ADS)
Halem, M.; Prathapan, S.; Golpayegani, N.; Huang, Y.; Blattner, T.; Dorband, J. E.
2016-12-01
As increased sensor spectral data volumes from current and future Earth Observing satellites are assimilated into high-resolution climate models, intensive cognitive machine learning technologies are needed to data mine, extract and intercompare model outputs. It is clear today that the next generation of computers and storage, beyond petascale cluster architectures, will be data centric. They will manage data movement and process data in place. Future cluster nodes have been announced that integrate multiple CPUs with high-speed links to GPUs and MICs on their backplanes, with massive non-volatile RAM and access to active flash RAM disk storage. Active Ethernet-connected key-value store disk drives with 10 GbE or higher are now available through the Kinetic Open Storage Alliance. At the UMBC Center for Hybrid Multicore Productivity Research, a future state-of-the-art Accelerated Cognitive Computer System (ACCS) for Big Data science is being integrated into the current IBM iDataPlex computational system `bluewave'. Based on the next-generation IBM 200 PF Sierra processor, an interim two-node IBM Power S822 testbed is being integrated with dual 10-core Power 8 processors, 1 TB of RAM, a PCIe link to a K80 GPU, and an FPGA Coherent Accelerator Processor Interface (CAPI) card to 20 TB of flash RAM. This system is to be updated to the Power 8+ with NVLink 1.0 and the Pascal GPU late in 2016. Moreover, the Seagate 96 TB Kinetic Disk system with 24 Ethernet-connected active disks is integrated into the ACCS storage system. A Lightweight Virtual File System developed at NASA GSFC is installed on bluewave. Since remote access to publicly available quantum annealing computers is available at several government labs, the ACCS will offer an in-line Restricted Boltzmann Machine optimization capability on the D-Wave 2X quantum annealing processor over the campus high-speed 100 Gb network to Internet2 for large files. As an evaluation test of the cognitive functionality of the architecture, the following studies utilizing all the system components will be presented: (i) a near real-time climate change study generating CO2 fluxes, (ii) a deep-dive capability into an 8000 x 8000 pixel image pyramid display, and (iii) large dense and sparse eigenvalue decomposition.
2014-01-01
Background: To improve the effectiveness of future screen behaviour interventions, one needs to know whether an intervention works via the proposed mediating mechanisms and whether the intervention is equally effective among subgroups. Parental regulation is identified as a consistent correlate of screen behaviours, but prospective evidence as well as the mediation role of parental regulation is largely lacking. This study investigated post-intervention main effects on screen behaviours in the HEIA intervention, a Norwegian school-based multiple-behaviour study, as well as mediation effects of parental regulation by adolescents' and parents' report. In addition, moderating effects of gender and weight status on the intervention and mediating effects were explored. Methods: Participating schools were randomized to control (n = 25) or intervention (n = 12) condition. Adolescents (n = 908 Control; 510 Intervention) self-reported their weekday and weekend TV-viewing and computer/game-use. Change in adolescents' behaviours was targeted through school and parents. Adolescents, mothers (n = 591 Control; 244 Intervention) and fathers (n = 469 Control; 199 Intervention) reported parental regulation of the screen behaviours post-intervention (at 20 months). The product-of-coefficients test using linear regression analysis was conducted to examine main and mediating effects. Results: There was no intervention effect on the screen behaviours in the total sample. Gender moderated the effect on weekend computer/game-use, while weight status moderated the effect on weekday TV-viewing and computer/game-use. Stratified analyses showed a small favourable intervention effect on weekday TV-viewing among the normal weight. Parental regulation did not mediate change in the screen behaviours. However, stronger parental regulation was associated with less TV-viewing and computer/game-use, with effects being conditional on adolescents' versus parental reports. Parental regulation of the screen behaviours, primarily by the parental report, was associated with change in the respective behaviours. Conclusion: A multiple-behaviour intervention may not affect all groups equally well, and the effect may differ by weight status and gender. In future interventions parents should be encouraged to regulate their adolescents' TV-viewing and computer/game-use on both weekdays and weekends, as parental regulation was identified as a determinant of these screen behaviours. However, future intervention studies may need to search for more effective intervention strategies targeting parental regulation. Trial registration: Current Controlled Trials ISRCTN98552879. PMID:24568125
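The product-of-coefficients test referred to in this abstract multiplies the intervention-to-mediator coefficient by the mediator-to-outcome coefficient. A minimal sketch on simulated stand-in data (not the HEIA data), with a Sobel-type standard error, might look like this:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Simulated stand-ins for the study's variables:
# x = intervention (0/1), m = parental regulation, y = weekly screen time.
x = rng.integers(0, 2, n).astype(float)
m = 0.4 * x + rng.normal(size=n)             # path a: intervention -> mediator
y = -0.5 * m + 0.1 * x + rng.normal(size=n)  # path b: mediator -> outcome

def ols(y_vec, predictors):
    """Least-squares fit with an intercept, returning coefficients and SEs."""
    X = np.column_stack([np.ones(len(y_vec)), predictors])
    beta, *_ = np.linalg.lstsq(X, y_vec, rcond=None)
    resid = y_vec - X @ beta
    sigma2 = resid @ resid / (len(y_vec) - X.shape[1])
    se = np.sqrt(np.diag(sigma2 * np.linalg.inv(X.T @ X)))
    return beta, se

# Path a: mediator on intervention; path b: outcome on mediator and intervention.
beta_a, se_a = ols(m, x.reshape(-1, 1))
beta_b, se_b = ols(y, np.column_stack([m, x]))
a, b = beta_a[1], beta_b[1]

# Product-of-coefficients estimate of the mediated effect, with a Sobel SE.
ab = a * b
se_ab = np.sqrt(a**2 * se_b[1]**2 + b**2 * se_a[1]**2)
print(f"mediated effect = {ab:.3f}, z = {ab / se_ab:.2f}")
```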
Clinician style and examination room computers: a video ethnography.
Ventres, William; Kooienga, Sarah; Marlin, Ryan; Vuckovic, Nancy; Stewart, Valerie
2005-04-01
The use of computers in medical examination rooms is growing. Advocates of this technology suggest that all family physicians should have and use examination room computers (ERCs) within the near future. This study explored how family physicians incorporate the use of ERCs in their interactions with patients. This qualitative study involved five family physicians, one family nurse practitioner, and a convenience sample of 29 patients. Data included videotaped visits, clinician interviews, and videotape reviews. The setting was an urban family practice with a 7-year history of using electronic medical records. The main outcome measures were themes emergent from videotaped data. We identified three distinct practice styles that shaped the use of the ERC: informational, interpersonal, and managerial styles. Clinicians with an informational style are guided by their attention to gathering data as prompted by the computer screen. Clinicians with an interpersonal style focus their attention and body language on patients. Clinicians with a managerial style bridge informational and interpersonal styles by alternating their attention in defined intervals between patients and the computer. Family physicians have varying practice styles that affect the way they use examination room computers during visits with patients.
NASA Astrophysics Data System (ADS)
Bergey, Bradley W.; Ketelhut, Diane Jass; Liang, Senfeng; Natarajan, Uma; Karakus, Melissa
2015-10-01
The primary aim of the study was to examine whether performance on a science assessment in an immersive virtual environment was associated with changes in scientific inquiry self-efficacy. A secondary aim of the study was to examine whether performance on the science assessment was equitable for students with different levels of computer game self-efficacy, including whether gender differences were observed. We examined 407 middle school students' scientific inquiry self-efficacy and computer game self-efficacy before and after completing a computer game-like assessment about a science mystery. Results from path analyses indicated that prior scientific inquiry self-efficacy predicted achievement on end-of-module questions, which in turn predicted change in scientific inquiry self-efficacy. By contrast, computer game self-efficacy was neither predictive of nor predicted by performance on the science assessment. While boys had higher computer game self-efficacy compared to girls, multi-group analyses suggested only minor gender differences in how efficacy beliefs related to performance. Implications for assessments with virtual environments and future design and research are discussed.
Computer assisted analysis of medical x-ray images
NASA Astrophysics Data System (ADS)
Bengtsson, Ewert
1996-01-01
X-rays were originally used to expose film. The early computers did not have enough capacity to handle images with useful resolution. The rapid development of computer technology over the last few decades has, however, led to the introduction of computers into radiology. In this overview paper, the various possible roles of computers in radiology are examined. The state of the art is briefly presented, and some predictions about the future are made.
Predicting uncertainty in future marine ice sheet volume using Bayesian statistical methods
NASA Astrophysics Data System (ADS)
Davis, A. D.
2015-12-01
The marine ice sheet instability can trigger rapid retreat of marine ice streams. Recent observations suggest that marine ice systems in West Antarctica have begun retreating. However, unknown ice dynamics, computationally intensive mathematical models, and uncertain parameters in these models make predicting retreat rate and ice volume difficult. In this work, we fuse current observational data with ice stream/shelf models to develop probabilistic predictions of future grounded ice sheet volume. Given observational data (e.g., thickness, surface elevation, and velocity) and a forward model that relates uncertain parameters (e.g., basal friction and basal topography) to these observations, we use a Bayesian framework to define a posterior distribution over the parameters. A stochastic predictive model then propagates uncertainties in these parameters to uncertainty in a particular quantity of interest (QoI)---here, the volume of grounded ice at a specified future time. While the Bayesian approach can in principle characterize the posterior predictive distribution of the QoI, the computational cost of both the forward and predictive models makes this effort prohibitively expensive. To tackle this challenge, we introduce a new Markov chain Monte Carlo method that constructs convergent approximations of the QoI target density in an online fashion, yielding accurate characterizations of future ice sheet volume at significantly reduced computational cost. Our second goal is to attribute uncertainty in these Bayesian predictions to uncertainties in particular parameters. Doing so can help target data collection, for the purpose of constraining the parameters that contribute most strongly to uncertainty in the future volume of grounded ice. For instance, smaller uncertainties in parameters to which the QoI is highly sensitive may account for more variability in the prediction than larger uncertainties in parameters to which the QoI is less sensitive. We use global sensitivity analysis to help answer this question, and make the computation of sensitivity indices computationally tractable using a combination of polynomial chaos and Monte Carlo techniques.
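The general workflow described here, sampling a posterior over uncertain parameters and pushing the samples through a predictive model to obtain a distribution over a quantity of interest, can be sketched on a toy one-parameter problem. This is not the authors' ice model or their online MCMC method; every function and number below is an illustrative assumption:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy "observations" of a present-day quantity that depend on a single
# uncertain friction-like parameter theta.
theta_true = 2.0
data = theta_true + 0.1 * rng.normal(size=20)

def log_posterior(theta: float) -> float:
    """Gaussian likelihood with a weak Gaussian prior (toy problem)."""
    return -0.5 * np.sum((data - theta) ** 2) / 0.1**2 - 0.5 * (theta / 10.0) ** 2

def predict_volume(theta: float) -> float:
    """Toy predictive model: future grounded volume as a function of theta."""
    return 100.0 * np.exp(-0.3 * theta)

# Random-walk Metropolis: sample the posterior, then push the samples
# through the predictive model to characterize the quantity of interest.
samples, theta = [], 1.0
for _ in range(5000):
    proposal = theta + 0.05 * rng.normal()
    if np.log(rng.uniform()) < log_posterior(proposal) - log_posterior(theta):
        theta = proposal
    samples.append(theta)

volumes = np.array([predict_volume(s) for s in samples[1000:]])
print(f"future volume: mean={volumes.mean():.1f}, std={volumes.std():.2f}")
```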
Progress on the FabrIc for Frontier Experiments project at Fermilab
Box, Dennis; Boyd, Joseph; Dykstra, Dave; ...
2015-12-23
The FabrIc for Frontier Experiments (FIFE) project is an ambitious, major-impact initiative within the Fermilab Scientific Computing Division designed to lead the computing model for Fermilab experiments. FIFE is a collaborative effort between experimenters and computing professionals to design and develop integrated computing models for experiments of varying needs and infrastructure. The major focus of the FIFE project is the development, deployment, and integration of Open Science Grid solutions for high throughput computing, data management, database access and collaboration within experiments. To accomplish this goal, FIFE has developed workflows that utilize Open Science Grid sites along with dedicated and commercial cloud resources. The FIFE project has made significant progress integrating into experiment computing operations several services including new job submission services, software and reference data distribution through CVMFS repositories, a flexible data transfer client, and access to opportunistic resources on the Open Science Grid. The progress with current experiments and plans for expansion with additional projects will be discussed. FIFE has taken a leading role in the definition of the computing model for Fermilab experiments, aided in the design of computing for experiments beyond Fermilab, and will continue to define the future direction of high throughput computing for future physics experiments worldwide.
Computational Pathology: A Path Ahead.
Louis, David N; Feldman, Michael; Carter, Alexis B; Dighe, Anand S; Pfeifer, John D; Bry, Lynn; Almeida, Jonas S; Saltz, Joel; Braun, Jonathan; Tomaszewski, John E; Gilbertson, John R; Sinard, John H; Gerber, Georg K; Galli, Stephen J; Golden, Jeffrey A; Becich, Michael J
2016-01-01
We define the scope and needs within the new discipline of computational pathology, a discipline critical to the future of both the practice of pathology and, more broadly, medical practice in general. To define the scope and needs of computational pathology. A meeting was convened in Boston, Massachusetts, in July 2014 prior to the annual Association of Pathology Chairs meeting, and it was attended by a variety of pathologists, including individuals highly invested in pathology informatics as well as chairs of pathology departments. The meeting made recommendations to promote computational pathology, including clearly defining the field and articulating its value propositions; asserting that the value propositions for health care systems must include means to incorporate robust computational approaches to implement data-driven methods that aid in guiding individual and population health care; leveraging computational pathology as a center for data interpretation in modern health care systems; stating that realizing the value proposition will require working with institutional administrations, other departments, and pathology colleagues; declaring that a robust pipeline should be fostered that trains and develops future computational pathologists, for those with both pathology and nonpathology backgrounds; and deciding that computational pathology should serve as a hub for data-related research in health care systems. The dissemination of these recommendations to pathology and bioinformatics departments should help facilitate the development of computational pathology.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Box, D.; Boyd, J.; Di Benedetto, V.
2016-01-01
The FabrIc for Frontier Experiments (FIFE) project is an initiative within the Fermilab Scientific Computing Division designed to steer the computing model for non-LHC Fermilab experiments across multiple physics areas. FIFE is a collaborative effort between experimenters and computing professionals to design and develop integrated computing models for experiments of varying size, needs, and infrastructure. The major focus of the FIFE project is the development, deployment, and integration of solutions for high throughput computing, data management, database access and collaboration management within an experiment. To accomplish this goal, FIFE has developed workflows that utilize Open Science Grid compute sites along with dedicated and commercial cloud resources. The FIFE project has made significant progress integrating into experiment computing operations several services including a common job submission service, software and reference data distribution through CVMFS repositories, flexible and robust data transfer clients, and access to opportunistic resources on the Open Science Grid. The progress with current experiments and plans for expansion with additional projects will be discussed. FIFE has taken the leading role in defining the computing model for Fermilab experiments, aided in the design of experiments beyond those hosted at Fermilab, and will continue to define the future direction of high throughput computing for future physics experiments worldwide.
Slotnick, Jeffrey P.; Khodadoust, Abdollah; Alonso, Juan J.; Darmofal, David L.; Gropp, William D.; Lurie, Elizabeth A.; Mavriplis, Dimitri J.; Venkatakrishnan, Venkat
2014-01-01
As global air travel expands rapidly to meet demand generated by economic growth, it is essential to continue to improve the efficiency of air transportation to reduce its carbon emissions and address concerns about climate change. Future transports must be ‘cleaner’ and designed to include technologies that will continue to lower engine emissions and reduce community noise. The use of computational fluid dynamics (CFD) will be critical to enable the design of these new concepts. In general, the ability to simulate aerodynamic and reactive flows using CFD has progressed rapidly during the past several decades and has fundamentally changed the aerospace design process. Advanced simulation capabilities not only enable reductions in ground-based and flight-testing requirements, but also provide added physical insight, and enable superior designs at reduced cost and risk. In spite of considerable success, reliable use of CFD has remained confined to a small region of the operating envelope due, in part, to the inability of current methods to reliably predict turbulent, separated flows. Fortunately, the advent of much more powerful computing platforms provides an opportunity to overcome a number of these challenges. This paper summarizes the findings and recommendations from a recent NASA-funded study that provides a vision for CFD in the year 2030, including an assessment of critical technology gaps and needed development, and identifies the key CFD technology advancements that will enable the design and development of much cleaner aircraft in the future. PMID:25024413
Radio-frequency measurement in semiconductor quantum computation
NASA Astrophysics Data System (ADS)
Han, TianYi; Chen, MingBo; Cao, Gang; Li, HaiOu; Xiao, Ming; Guo, GuoPing
2017-05-01
Semiconductor quantum dots have attracted wide interest for the potential realization of quantum computation. To realize efficient quantum computation, fast manipulation and the corresponding readout are necessary. In the past few decades, considerable progress in quantum manipulation has been achieved experimentally. To meet the requirements of high-speed readout, radio-frequency (RF) measurement has been developed in recent years, such as RF-QPC (radio-frequency quantum point contact) and RF-DGS (radio-frequency dispersive gate sensor). Here we demonstrate the principle of radio-frequency reflectometry, then review the development and applications of RF measurement, which provides a feasible way to achieve high-bandwidth readout in quantum coherent control and also enriches the methods available to study these artificial mesoscopic quantum systems. Finally, we consider the future use of radio-frequency reflectometry in scaling up quantum computing models.
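The readout principle behind RF reflectometry is that a change in the sensor's impedance shifts the reflection coefficient of a matching circuit driven near resonance. A generic sketch with made-up component values (not taken from the review) is:

```python
import numpy as np

Z0 = 50.0  # characteristic impedance of the feed line, ohms

def reflection_coefficient(z_load: complex) -> complex:
    """Gamma = (Z_L - Z_0) / (Z_L + Z_0) for a load terminating a 50-ohm line."""
    return (z_load - Z0) / (z_load + Z0)

def tank_impedance(r_sensor: float, l_h: float, c_f: float, freq_hz: float) -> complex:
    """Input impedance of a series inductor feeding the sensor resistance in
    parallel with a parasitic capacitance, a common matching arrangement."""
    w = 2 * np.pi * freq_hz
    z_parallel = 1.0 / (1.0 / r_sensor + 1j * w * c_f)
    return 1j * w * l_h + z_parallel

# Illustrative values: the two sensor resistances stand in for the two
# charge states being distinguished; L and C put resonance near 360 MHz.
for r in (25e3, 100e3):
    z = tank_impedance(r, l_h=390e-9, c_f=0.5e-12, freq_hz=360e6)
    print(f"R = {r/1e3:.0f} kOhm  |Gamma| = {abs(reflection_coefficient(z)):.3f}")
```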
Decision Making and Reward in Frontal Cortex
Kennerley, Steven W.; Walton, Mark E.
2011-01-01
Patients with damage to the prefrontal cortex (PFC)—especially the ventral and medial parts of PFC—often show a marked inability to make choices that meet their needs and goals. These decision-making impairments often reflect both a deficit in learning concerning the consequences of a choice, as well as deficits in the ability to adapt future choices based on experienced value of the current choice. Thus, areas of PFC must support some value computations that are necessary for optimal choice. However, recent frameworks of decision making have highlighted that optimal and adaptive decision making does not simply rest on a single computation, but a number of different value computations may be necessary. Using this framework as a guide, we summarize evidence from both lesion studies and single-neuron physiology for the representation of different value computations across PFC areas. PMID:21534649
Climate Ocean Modeling on Parallel Computers
NASA Technical Reports Server (NTRS)
Wang, P.; Cheng, B. N.; Chao, Y.
1998-01-01
Ocean modeling plays an important role in both understanding the current climatic conditions and predicting future climate change. However, modeling the ocean circulation at various spatial and temporal scales is a very challenging computational task.
Computational Modeling in Liver Surgery
Christ, Bruno; Dahmen, Uta; Herrmann, Karl-Heinz; König, Matthias; Reichenbach, Jürgen R.; Ricken, Tim; Schleicher, Jana; Ole Schwen, Lars; Vlaic, Sebastian; Waschinsky, Navina
2017-01-01
The need for extended liver resection is increasing due to the growing incidence of liver tumors in aging societies. Individualized surgical planning is the key for identifying the optimal resection strategy and to minimize the risk of postoperative liver failure and tumor recurrence. Current computational tools provide virtual planning of liver resection by taking into account the spatial relationship between the tumor and the hepatic vascular trees, as well as the size of the future liver remnant. However, size and function of the liver are not necessarily equivalent. Hence, determining the future liver volume might misestimate the future liver function, especially in cases of hepatic comorbidities such as hepatic steatosis. A systems medicine approach could be applied, including biological, medical, and surgical aspects, by integrating all available anatomical and functional information of the individual patient. Such an approach holds promise for better prediction of postoperative liver function and hence improved risk assessment. This review provides an overview of mathematical models related to the liver and its function and explores their potential relevance for computational liver surgery. We first summarize key facts of hepatic anatomy, physiology, and pathology relevant for hepatic surgery, followed by a description of the computational tools currently used in liver surgical planning. Then we present selected state-of-the-art computational liver models potentially useful to support liver surgery. Finally, we discuss the main challenges that will need to be addressed when developing advanced computational planning tools in the context of liver surgery. PMID:29249974
Future in biomolecular computation
NASA Astrophysics Data System (ADS)
Wimmer, E.
1988-01-01
Large-scale computations for biomolecules are dominated by three levels of theory: rigorous quantum mechanical calculations for molecules with up to about 30 atoms, semi-empirical quantum mechanical calculations for systems with up to several hundred atoms, and force-field molecular dynamics studies of biomacromolecules with 10,000 atoms and more including surrounding solvent molecules. It can be anticipated that increased computational power will allow the treatment of larger systems of ever growing complexity. Due to the scaling of the computational requirements with increasing number of atoms, the force-field approaches will benefit the most from increased computational power. On the other hand, progress in methodologies such as density functional theory will enable us to treat larger systems on a fully quantum mechanical level, and a combination of molecular dynamics and quantum mechanics can be envisioned. One of the greatest challenges in biomolecular computation is the protein folding problem. It is unclear at this point whether an approach with current methodologies will lead to a satisfactory answer or whether unconventional new approaches will be necessary. In any event, due to the complexity of biomolecular systems, a hierarchy of approaches will have to be established and used in order to capture the wide ranges of length-scales and time-scales involved in biological processes. In terms of hardware development, speed and power of computers will increase while the price/performance ratio will become more and more favorable. Parallelism can be anticipated to become an integral architectural feature in a range of computers. It is unclear at this point how fast massively parallel systems will become easy enough to use so that new methodological developments can be pursued on such computers. Current trends show that distributed processing, such as the combination of convenient graphics workstations and powerful general-purpose supercomputers, will lead to a new style of computing in which the calculations are monitored and manipulated as they proceed. The combination of a numeric approach with artificial-intelligence approaches can be expected to open up entirely new possibilities. Ultimately, the most exciting aspect of the future in biomolecular computing will be the unexpected discoveries.
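The scaling argument here, that force-field methods benefit most from added computing power, rests on the pairwise structure of the interaction sums. A toy Lennard-Jones example (no cutoffs or neighbor lists, with made-up parameters and a random configuration) shows the naive O(N^2) structure:

```python
import numpy as np

def lennard_jones_energy(coords: np.ndarray, epsilon: float = 0.2, sigma: float = 3.4) -> float:
    """Total pairwise Lennard-Jones energy (arbitrary units); the double loop
    over atom pairs is what makes naive force-field evaluation scale as O(N^2)."""
    n = len(coords)
    energy = 0.0
    for i in range(n - 1):
        for j in range(i + 1, n):
            r = np.linalg.norm(coords[i] - coords[j])
            energy += 4 * epsilon * ((sigma / r) ** 12 - (sigma / r) ** 6)
    return energy

# A random 100-atom toy configuration (angstrom-like units); real codes would
# use cutoffs, periodic boundaries, and neighbor lists to reduce the cost.
rng = np.random.default_rng(2)
coords = rng.uniform(0.0, 30.0, size=(100, 3))
print(round(lennard_jones_energy(coords), 3))
```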
Discussion of DNS: Past, Present, and Future
NASA Technical Reports Server (NTRS)
Joslin, Ronald D.
1997-01-01
This paper covers the review, status, and projected future of direct numerical simulation (DNS) methodology relative to the state-of-the-art in computer technology, numerical methods, and the trends in fundamental research programs.
Kepner, William G.; Semmens, Darius J.; Hernandez, Mariano; Goodrich, David C.
2009-01-01
Envisioning and evaluating future scenarios has emerged as a critical component of both science and social decision-making. The ability to assess, report, map, and forecast the life support functions of ecosystems is absolutely critical to our capacity to make informed decisions to maintain the sustainable nature of our ecosystem services now and into the future. During the past two decades, important advances in the integration of remote imagery, computer processing, and spatial-analysis technologies have been used to develop landscape information that can be integrated with hydrologic models to determine long-term change and make predictive inferences about the future. Two diverse case studies in northwest Oregon (Willamette River basin) and southeastern Arizona (San Pedro River) were examined in regard to future land use scenarios relative to their impact on surface water conditions (e.g., sediment yield and surface runoff) using hydrologic models associated with the Automated Geospatial Watershed Assessment (AGWA) tool. The base reference grid for land cover was modified in both study locations to reflect stakeholder preferences 20 to 60 yrs into the future, and the consequences of landscape change were evaluated relative to the selected future scenarios. The two studies provide examples of integrating hydrologic modeling with a scenario analysis framework to evaluate plausible future forecasts and to understand the potential impact of landscape change on ecosystem services.
The Future Medical Science and Colorectal Surgeons.
Kim, Young Jin
2017-12-01
Future medical technology breakthroughs will build from the incredible progress made in computers, biotechnology, and nanotechnology and from the information learned from the human genome. With such technology and information, computer-aided diagnoses, organ replacement, gene therapy, personalized drugs, and even age reversal will become possible. True 3-dimensional system technology will enable surgeons to envision key clinical features and will help them in planning complex surgery. Surgeons will enter surgical instructions in a virtual space from a remote medical center, order a medical robot to perform the operation, and review the operation in real time on a monitor. Surgeons will be better than artificial intelligence or automated robots when surgeons (or we) love patients and ask questions for a better future. The purpose of this paper is looking at the future medical science and the changes of colorectal surgeons.
Operating Dedicated Data Centers - Is It Cost-Effective?
NASA Astrophysics Data System (ADS)
Ernst, M.; Hogue, R.; Hollowell, C.; Strecker-Kellog, W.; Wong, A.; Zaytsev, A.
2014-06-01
The advent of cloud computing centres such as Amazon's EC2 and Google's Compute Engine has elicited comparisons with dedicated computing clusters. Discussions on appropriate usage of cloud resources (both academic and commercial) and costs have ensued. This presentation discusses a detailed analysis of the costs of operating and maintaining the RACF (RHIC and ATLAS Computing Facility) compute cluster at Brookhaven National Lab and compares them with the cost of cloud computing resources under various usage scenarios. An extrapolation of the likely future cost effectiveness of dedicated computing resources is also presented.
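A comparison of this kind typically amortizes capital and operating costs over delivered core-hours and sets the result against a cloud price. The sketch below uses placeholder figures only, not RACF or vendor numbers:

```python
def dedicated_cost_per_core_hour(capital: float, annual_ops: float,
                                 lifetime_years: float, cores: int,
                                 utilization: float) -> float:
    """Amortized cost of a dedicated cluster per delivered core-hour."""
    total_cost = capital + annual_ops * lifetime_years
    delivered_hours = cores * 8760 * lifetime_years * utilization
    return total_cost / delivered_hours

# All figures below are illustrative assumptions.
dedicated = dedicated_cost_per_core_hour(capital=2_000_000, annual_ops=600_000,
                                          lifetime_years=4, cores=10_000,
                                          utilization=0.9)
cloud_on_demand = 0.05  # assumed $/core-hour for a comparable cloud instance
print(f"dedicated: ${dedicated:.3f}/core-hour vs cloud: ${cloud_on_demand:.3f}/core-hour")
```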
Three-dimensional computational aerodynamics in the 1980's
NASA Technical Reports Server (NTRS)
Lomax, H.
1978-01-01
The future requirements for constructing codes that can be used to compute three-dimensional flows about aerodynamic shapes should be assessed in light of the constraints imposed by future computer architectures and the reality of usable algorithms that can provide practical three-dimensional simulations. On the hardware side, vector processing is inevitable in order to meet the CPU speeds required. To cope with three-dimensional geometries, massive data bases with fetch/store conflicts and transposition problems are inevitable. On the software side, codes must be prepared that: (1) can be adapted to complex geometries, (2) can (at the very least) predict the location of laminar and turbulent boundary layer separation, and (3) will converge rapidly to sufficiently accurate solutions.
McCammon, Richard B.; Ramani, Raja V.; Mozumdar, Bijoy K.; Samaddar, Arun B.
1994-01-01
Overcoming future difficulties in searching for ore deposits deeper in the earth's crust will require closer attention to the collection and analysis of more diverse types of data and to more efficient use of current computer technologies. Computer technologies of greatest interest include methods of storage and retrieval of resource information, methods for integrating geologic, geochemical, and geophysical data, and the introduction of advanced computer technologies such as expert systems, multivariate techniques, and neural networks. Much experience has been gained in the past few years in applying these technologies. More experience is needed if they are to be implemented for everyday use in future assessments and exploration.
NASA Technical Reports Server (NTRS)
Makivic, Miloje S.
1996-01-01
This is the final technical report for the project entitled: "High-Performance Computing and Four-Dimensional Data Assimilation: The Impact on Future and Current Problems", funded at NPAC by the DAO at NASA/GSFC. First, the motivation for the project is given in the introductory section, followed by the executive summary of major accomplishments and the list of project-related publications. Detailed analysis and description of research results is given in subsequent chapters and in the Appendix.
Women's decision to major in STEM fields
NASA Astrophysics Data System (ADS)
Conklin, Stephanie
This paper explores the lived experiences of high school female students who choose to enter STEM fields, and describes the influencing factors that steered these women towards majors in computer science, engineering, and biology. Utilizing phenomenological methodology, this study seeks to understand the essence of women's decisions to enter STEM fields and further describes how the decision-making process varies for women in high female-enrollment fields, like biology, as compared with low-enrollment fields, like computer science and engineering. Using Bloom's 3-Stage Theory, this study analyzes how relationships, experiences, and barriers influenced women towards, and possibly away from, STEM fields. An analysis of women's experiences highlights that family support, sustained experience in a STEM program during high school, and the presence of an influential teacher were all salient factors in steering women towards STEM fields. Participants explained that these influential teachers worked individually with them, modified and extended assignments, and also steered participants towards coursework and experiences. This study also identifies factors, like guidance counselors as well as personal challenges, which inhibited participants' paths to STEM fields. Further, through analyzing all six participants' experiences, it is clear that a linear model like Bloom's 3-Stage Model, with its limited ability to include potential barriers, could not capture the essence of each participant's decision-making process. Therefore, a revised model with no linear progression, which allows for emerging factors like personal challenges, has been proposed; this model focuses on how interest in STEM fields begins to develop, is honed, and is then mastered. This study also sought to identify key differences in the paths of female students pursuing different majors. The findings of this study suggest that the path to computer science and engineering is limited. Computer science majors faced few, if any, challenges, hoped to use computers as a tool to innovate, and had all participated in the same computer science program. For female engineering students, the essence of their experience focused on interaction at a young age with an expert in an engineering-related field as well as a strong desire to help solve world problems using engineering. These participants were able to clearly articulate future careers. In contrast, biology majors faced more challenges and were undecided about their future career goals. These results suggest that a longitudinal study focused on women pursuing engineering and computer science fields is warranted; this would allow these findings to be substantiated and the revised theoretical model to be refined.
Future Naval Use of COTS Networking Infrastructure
2009-07-01
…user to benefit from Google’s vast databases and computational resources. Obviously, the ability to harness the full power of the Cloud could be… While Cloud Computing is developing in many variations, including Infrastructure as a Service (IaaS), Platform as…
A Pipeline for Large Data Processing Using Regular Sampling for Unstructured Grids
DOE Office of Scientific and Technical Information (OSTI.GOV)
Berres, Anne Sabine; Adhinarayanan, Vignesh; Turton, Terece
2017-05-12
Large simulation data requires a lot of time and computational resources to compute, store, analyze, visualize, and run user studies. Today, the largest cost of a supercomputer is not hardware but maintenance, in particular energy consumption. Our goal is to balance energy consumption and cognitive value of visualizations of resulting data. This requires us to go through the entire processing pipeline, from simulation to user studies. To reduce the amount of resources, data can be sampled or compressed. While this adds more computation time, the computational overhead is negligible compared to the simulation time. We built a processing pipeline using regular sampling as an example. The reasons for this choice are two-fold: using a simple example reduces unnecessary complexity as we know what to expect from the results. Furthermore, it provides a good baseline for future, more elaborate sampling methods. We measured time and energy for each test we did, and we conducted user studies in Amazon Mechanical Turk (AMT) for a range of different results we produced through sampling.
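Regular sampling, the baseline chosen in this abstract, simply keeps every k-th point. The paper targets unstructured grids, so the minimal sketch below, which uses a structured grid for simplicity, only illustrates the idea and is not the authors' pipeline:

```python
import numpy as np

def regular_sample(values: np.ndarray, stride: int) -> np.ndarray:
    """Keep every `stride`-th point along each axis of a gridded field."""
    return values[(slice(None, None, stride),) * values.ndim]

# Toy 3-D scalar field standing in for simulation output.
field = np.random.default_rng(3).normal(size=(64, 64, 64))
sampled = regular_sample(field, stride=4)
reduction = field.size / sampled.size
print(f"{field.size} -> {sampled.size} values ({reduction:.0f}x reduction)")
```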
Kiper, Pawel; Szczudlik, Andrzej; Venneri, Annalena; Stozek, Joanna; Luque-Moreno, Carlos; Opara, Jozef; Baba, Alfonc; Agostini, Michela; Turolla, Andrea
2016-10-15
Computational approaches for modelling the central nervous system (CNS) aim to develop theories on processes occurring in the brain that allow the transformation of all information needed for the execution of motor acts. Computational models have been proposed in several fields, to interpret not only the CNS functioning, but also its efferent behaviour. Computational model theories can provide insights into neuromuscular and brain function allowing us to reach a deeper understanding of neuroplasticity. Neuroplasticity is the process occurring in the CNS that is able to permanently change both structure and function due to interaction with the external environment. To understand such a complex process several paradigms related to motor learning and computational modeling have been put forward. These paradigms have been explained through several internal model concepts, and supported by neurophysiological and neuroimaging studies. Therefore, it has been possible to make theories about the basis of different learning paradigms according to known computational models. Here we review the computational models and motor learning paradigms used to describe the CNS and neuromuscular functions, as well as their role in the recovery process. These theories have the potential to provide a way to rigorously explain all the potential of CNS learning, providing a basis for future clinical studies. Copyright © 2016 Elsevier B.V. All rights reserved.
Knowledge of computer among healthcare professionals of India: a key toward e-health.
Gour, Neeraj; Srivastava, Dhiraj
2010-11-01
Information technology has radically changed the way that many people work and think. Over the years, technology has reached new heights and is no longer confined to developed countries. Developing countries such as India have kept pace with the world in modern technology. Healthcare professionals can no longer ignore the application of information technology to healthcare because they are key to e-health. This study was conducted to examine healthcare professionals' perspectives on computers and the implications of computer use, with the objective to assess the knowledge, use, and need of computers among healthcare professionals. A cross-sectional study of 240 healthcare professionals, including doctors, nurses, lab technicians, and pharmacists, was conducted. Each participant was interviewed using a pretested, semistructured format. Of 240 healthcare professionals, 57.91% were knowledgeable about computers. Of them, 22.08% had extensive knowledge and 35.83% had partial knowledge. Computer knowledge was greater among the age group 20-25 years (high knowledge, 43.33%; partial knowledge, 46.66%). Of 99 males, 21.21% were found to have good knowledge and 42.42% had partial knowledge. A majority of doctors and nurses used computers for study purposes. The remaining healthcare professionals used them mainly for entertainment, the Internet, and e-mail. A majority of all healthcare professionals (95.41%) requested computer training, which according to them would help to brighten their future prospects as well as enhance their knowledge of computers.
Computational challenges of structure-based approaches applied to HIV.
Forli, Stefano; Olson, Arthur J
2015-01-01
Here, we review some of the opportunities and challenges that we face in computational modeling of HIV therapeutic targets and structural biology, both in terms of methodology development and structure-based drug design (SBDD). Computational methods have provided fundamental support to HIV research since the initial structural studies, helping to unravel details of HIV biology. Computational models have proved to be a powerful tool to analyze and understand the impact of mutations and to overcome their structural and functional influence in drug resistance. With the availability of structural data, in silico experiments have been instrumental in exploiting and improving interactions between drugs and viral targets, such as HIV protease, reverse transcriptase, and integrase. Issues such as viral target dynamics and mutational variability, as well as the role of water and estimates of binding free energy in characterizing ligand interactions, are areas of active computational research. Ever-increasing computational resources and theoretical and algorithmic advances have played a significant role in progress to date, and we envision a continually expanding role for computational methods in our understanding of HIV biology and SBDD in the future.
Advances in computational design and analysis of airbreathing propulsion systems
NASA Technical Reports Server (NTRS)
Klineberg, John M.
1989-01-01
The development of commercial and military aircraft depends, to a large extent, on engine manufacturers being able to achieve significant increases in propulsion capability through improved component aerodynamics, materials, and structures. The recent history of propulsion has been marked by efforts to develop computational techniques that can speed up the propulsion design process and produce superior designs. The availability of powerful supercomputers, such as the NASA Numerical Aerodynamic Simulator, and the potential for even higher performance offered by parallel computer architectures, have opened the door to the use of multi-dimensional simulations to study complex physical phenomena in propulsion systems that have previously defied analysis or experimental observation. An overview of several NASA Lewis research efforts is provided that are contributing toward the long-range goal of a numerical test-cell for the integrated, multidisciplinary design, analysis, and optimization of propulsion systems. Specific examples in Internal Computational Fluid Mechanics, Computational Structural Mechanics, Computational Materials Science, and High Performance Computing are cited and described in terms of current capabilities, technical challenges, and future research directions.
Pewaukee School District, Wisconsin. Case Study: Measures of Academic Progress
ERIC Educational Resources Information Center
Northwest Evaluation Association, 2015
2015-01-01
For more than a decade, Pewaukee School District Superintendent JoAnn Sternke has watched her district get better and better at its mission: opening the door to each student's future. The Wisconsin district began using Measures of Academic Progress® (MAP®) computer adaptive interim assessments from Northwest Evaluation Association™ (NWEA™) in 2004…
ERIC Educational Resources Information Center
Hinton, Vanessa; Flores, Margaret; Burton, Megan; Curtis, Rebecca
2015-01-01
The purpose of this mixed method study was to investigate future special education teachers' preparation for effectively teaching mathematics. During the last semester of their program, pre-service special education teachers completed elementary level mathematics computation and problem solving assessments, a mathematics efficacy beliefs survey,…
ERIC Educational Resources Information Center
Gilstrap, Donald L.
2013-01-01
In addition to qualitative methods presented in chaos and complexity theories in educational research, this article addresses quantitative methods that may show potential for future research studies. Although much in the social and behavioral sciences literature has focused on computer simulations, this article explores current chaos and…
CALL: Past, Present and Future--A Bibliometric Approach
ERIC Educational Resources Information Center
Jung, Udo O. H.
2005-01-01
A bibliometric approach is used not only to sketch out the development of CALL during the last 25 years, but also to assess the contribution of educational technology to 21st century foreign-language teaching and learning. This study is based on the six instalments of the author's International (and multilingual) Bibliography of Computer Assisted…
Solar Energy in America's Future, A Preliminary Assessment.
ERIC Educational Resources Information Center
Energy Research and Development Administration, Washington, DC. Div. of Solar Energy.
This report was prepared as an account of work sponsored by the United States Government. The report documents a Stanford Research Institute study of the potential roles that solar energy technologies could have for meeting U.S. energy needs over the next 45 years. Computer simulations of different energy supply projections were developed by…
A Low Cost Simulation System to Demonstrate Pilot Induced Oscillation Phenomenon
NASA Technical Reports Server (NTRS)
Ali, Syed Firasat
1997-01-01
A flight simulation system with graphics and software on Silicon Graphics computer workstations has been installed in the Flight Vehicle Design Laboratory at Tuskegee University. The system has F-15E flight simulation software from NASA Dryden which uses the graphics of SGI flight simulation demos. On the system, thus installed, a study of pilot induced oscillations is planned for future work. Preliminary research is conducted by obtaining two sets of straight level flights with pilot in the loop. In one set of flights no additional delay is used between the stick input and the appearance of airplane response on the computer monitor. In another set of flights, a 500 ms additional delay is used. The flight data is analyzed to find cross correlations between deflections of control surfaces and response of the airplane. The pilot dynamics features depicted from cross correlations of straight level flights are discussed in this report. The correlations presented here will serve as reference material for the corresponding correlations in a future study of pitch attitude tracking tasks involving pilot induced oscillations.
COINGRAD; Control Oriented Interactive Graphical Analysis and Design.
ERIC Educational Resources Information Center
Volz, Richard A.; And Others
The computer is currently a vital tool in engineering analysis and design. With the introduction of moderately priced graphics terminals, it will become even more important in the future as rapid graphic interaction between the engineer and the computer becomes more feasible in computer-aided design (CAD). To provide a vehicle for introducing…
Computational Modeling of Tires
NASA Technical Reports Server (NTRS)
Noor, Ahmed K. (Compiler); Tanner, John A. (Compiler)
1995-01-01
This document contains presentations and discussions from the joint UVA/NASA Workshop on Computational Modeling of Tires. The workshop attendees represented NASA, the Army and Air Force, tire companies, commercial software developers, and academia. The workshop objectives were to assess the state of technology in the computational modeling of tires and to provide guidelines for future research.
The National Special Education Alliance: One Year Later.
ERIC Educational Resources Information Center
Green, Peter
1988-01-01
The National Special Education Alliance (a national network of local computer resource centers associated with Apple Computer, Inc.) consists, one year after formation, of 24 non-profit support centers staffed largely by volunteers. The NSEA now reaches more than 1000 disabled computer users each month and more growth in the future is expected.…
Analysis of rocket engine injection combustion processes
NASA Technical Reports Server (NTRS)
Salmon, J. W.; Saltzman, D. H.
1977-01-01
Improvements to the mixing methodology of the JANNAF DER and CICM injection/combustion analysis computer programs were accomplished. The ZOM plane prediction model was further developed for installation into the new standardized DER computer program. An intra-element mixing model development approach was recommended for gas/liquid coaxial injection elements, for possible future incorporation into the CICM computer program.
Counselor Computer Competence: Future Agenda for Counselor Educators.
ERIC Educational Resources Information Center
Dickel, C. Timothy
This paper asserts that the computer has become an integral part of communication within the world culture and that it has tremendous utility for the counseling profession. Counselor educators are encouraged to incorporate computer competence into their curriculum. This report is divided into four parts. First, there is a brief discussion of the…
The Educational Value of Microcomputers: Perceptions among Parents of Young Gifted Children.
ERIC Educational Resources Information Center
Johnson, Lawrence J.; Lewman, Beverly S.
1986-01-01
Parents of 62 children enrolled in a private school for young gifted students completed a questionnaire designed to assess home use of computers, as well as parental concerns and expectations for appropriate concurrent and future computer use in educational settings. Familiarity with computers increased perceptions of their beneficial educational…
On the Emergence of New Computer Technologies
ERIC Educational Resources Information Center
Asaolu, Olumuyiwa Sunday
2006-01-01
This work presents a review of the development and application of computers. It traces the highlights of emergent computing technologies shaping our world. Recent trends in hardware and software deployment are chronicled as well as their impact on various segments of the society. The expectations for the future are also discussed along with…
17 CFR 10.5 - Computation of time.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 17 Commodity and Securities Exchanges; Commodity Futures Trading Commission, Rules of Practice; Section 10.5, Computation of time. ... computed is to be included unless it is a Saturday, a Sunday, or a legal holiday; in which event the period...
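As a minimal sketch of this style of time computation (an illustration only, not the regulation's full text, which contains further provisions), the Python snippet below rolls the last day of a period forward whenever it falls on a Saturday, a Sunday, or a caller-supplied legal holiday.

    from datetime import date, timedelta

    def last_day_of_period(trigger, days, holidays):
        # Sketch of deadline arithmetic in the spirit of the rule quoted above:
        # the day of the triggering act is excluded, the last day of the period
        # is included, and if that last day is a Saturday, a Sunday, or a legal
        # holiday, the period runs to the next day that is none of those.
        # 'holidays' is a caller-supplied set of date objects (an assumption);
        # this is not the regulation itself.
        deadline = trigger + timedelta(days=days)
        while deadline.weekday() >= 5 or deadline in holidays:  # 5=Sat, 6=Sun
            deadline += timedelta(days=1)
        return deadline

    # Hypothetical example: a 10-day period starting Friday 2010-04-02,
    # with 2010-04-12 treated as a legal holiday.
    print(last_day_of_period(date(2010, 4, 2), 10, {date(2010, 4, 12)}))
    # -> 2010-04-13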
The Impact of Software on Associate Degree Programs in Electronic Engineering Technology.
ERIC Educational Resources Information Center
Hata, David M.
1986-01-01
Assesses the range and extent of computer assisted instruction software available in electronic engineering technology education. Examines the need for software skills in four areas: (1) high-level languages; (2) assembly language; (3) computer-aided engineering; and (4) computer-aided instruction. Outlines strategies for the future in three…
The Computer Industry. High Technology Industries: Profiles and Outlooks.
ERIC Educational Resources Information Center
International Trade Administration (DOC), Washington, DC.
A series of meetings was held to assess future problems in United States high technology, particularly in the fields of robotics, computers, semiconductors, and telecommunications. This report, which focuses on the computer industry, includes a profile of this industry and the papers presented by industry speakers during the meetings. The profile…
Campus Computing 1991. The EDUCOM-USC Survey of Desktop Computing in Higher Education.
ERIC Educational Resources Information Center
Green, Kenneth C.; Eastman, Skip
A national survey of desktop computing in higher education was conducted in 1991, covering 2,500 institutions. Data were responses from public and private research universities, public and private four-year colleges, and community colleges. Respondents (N=1099) were individuals specifically responsible for the operation and future direction of academic…
Computer-Assisted Diagnostic Decision Support: History, Challenges, and Possible Paths Forward
ERIC Educational Resources Information Center
Miller, Randolph A.
2009-01-01
This paper presents a brief history of computer-assisted diagnosis, including challenges and future directions. Some ideas presented in this article on computer-assisted diagnostic decision support systems (CDDSS) derive from prior work by the author and his colleagues (see list in Acknowledgments) on the INTERNIST-1 and QMR projects. References…
Computer-Based English Language Testing in China: Present and Future
ERIC Educational Resources Information Center
Yu, Guoxing; Zhang, Jing
2017-01-01
In this special issue on high-stakes English language testing in China, the two articles on computer-based testing (Jin & Yan; He & Min) highlight a number of consistent, ongoing challenges and concerns in the development and implementation of the nationwide IB-CET (Internet Based College English Test) and institutional computer-adaptive…
The future of cerebral surgery: a kaleidoscope of opportunities.
Elder, James B; Hoh, Daniel J; Oh, Bryan C; Heller, A Chris; Liu, Charles Y; Apuzzo, Michael L J
2008-06-01
The emerging future of cerebral surgery will witness the refined evolution of current techniques, as well as the introduction of numerous novel concepts. Clinical practice and basic science research will benefit greatly from their application. The sum of these efforts will result in continued minimalism and improved accuracy and efficiency of neurosurgical diagnostic and therapeutic methodologies. Initially, the refinement of current technologies will further enhance various aspects of cerebral surgery. Advances in computing power and information technology will speed data acquisition, storage, and transfer. Miniaturization of current devices will impact diverse areas, such as modulation of endoscopy and endovascular techniques. The increased penetrance of surgical technologies such as stereotactic radiosurgery, neuronavigation, intraoperative imaging, and implantable electrodes for neurodegenerative disorders and epilepsy will enhance the knowledge and experience in these areas and facilitate refinements and advances in these technologies. Further into the future, technologies that are currently relatively remote to surgical events will fundamentally alter the complexity and scale at which a neurological disease may be treated or investigated. Seemingly futuristic concepts will become ubiquitous in the daily experience of the neurosurgeon. These include diverse fields such as nanotechnology, virtual reality, and robotics. Ultimately, combining advances in multiple fields will yield progress in diverse realms such as brain tumor therapy, neuromodulation for psychiatric diseases, and neuroprosthetics. Operating room equipment and design will benefit from each of the aforementioned advances. In this work, we discuss new developments in three parts. In Part I, concepts in minimalism important for future cerebral surgery are discussed. These include concrete and abstract ideas in miniaturization, as well as recent and future work in microelectromechanical systems and nanotechnology. Part II presents advances in computational sciences and technological fields dependent on these developments. Future breakthroughs in the components of the "computer," including data storage, electrical circuitry, and computing hardware and techniques, are discussed. Additionally, important concepts in the refinement of virtual environments and the brain-machine interface are presented, as their incorporation into cerebral surgery is closely linked to advances in computing and electronics. Finally, Part III offers insights into the future evolution of surgical and nonsurgical diagnostic and therapeutic modalities that are important for the future cerebral surgeon. A number of topics relevant to cerebral surgery are discussed, including the operative environment, imaging technologies, endoscopy, robotics, neuromodulation, stem cell therapy, radiosurgery, and technical methods of restoration of neural function. Cerebral surgery in the near and distant future will reflect the application of these emerging technologies. As this article indicates, the key to maximizing the impact of these advancements in the clinical arena is continued collaboration between scientists and neurosurgeons, as well as the emergence of a neurosurgeon whose scientific grounding and technical focus are far removed from those of his predecessors.
The Fabric for Frontier Experiments Project at Fermilab
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kirby, Michael
2014-01-01
The FabrIc for Frontier Experiments (FIFE) project is a new, far-reaching initiative within the Fermilab Scientific Computing Division to drive the future of computing services for experiments at FNAL and elsewhere. It is a collaborative effort between computing professionals and experiment scientists to produce an end-to-end, fully integrated set of services for computing on the grid and clouds, managing data, accessing databases, and collaborating within experiments. FIFE includes 1) easy-to-use job submission services for processing physics tasks on the Open Science Grid and elsewhere, 2) an extensive data management system for managing local and remote caches, cataloging, querying, moving, and tracking the use of data, 3) custom and generic database applications for calibrations, beam information, and other purposes, and 4) collaboration tools including an electronic log book, speakers bureau database, and experiment membership database. All of these aspects will be discussed in detail. FIFE sets the direction of computing at Fermilab experiments now and in the future, and therefore is a major driver in the design of computing services worldwide.
Computational structures technology and UVA Center for CST
NASA Technical Reports Server (NTRS)
Noor, Ahmed K.
1992-01-01
Rapid advances in computer hardware have had a profound effect on various engineering and mechanics disciplines, including the materials, structures, and dynamics disciplines. A new technology, computational structures technology (CST), has recently emerged as an insightful blend between material modeling, structural and dynamic analysis and synthesis on the one hand, and disciplines such as computer science, numerical analysis, and approximation theory on the other. CST is an outgrowth of finite element methods developed over the last three decades. The focus of this presentation is on aspects of CST that can impact future airframes and propulsion systems, as well as on the newly established University of Virginia (UVA) Center for CST. The background and goals for CST are described along with the motivations for developing it, and computational material modeling is briefly discussed. We look at the future in terms of technical needs, computing environment, and research directions. The newly established UVA Center for CST is described, one of the Center's research projects is outlined, and a brief summary of the presentation is given.
Endodontic applications of 3D printing.
Anderson, J; Wealleans, J; Ray, J
2018-02-27
Computer-aided design (CAD) and computer-aided manufacturing (CAM) technologies can leverage cone beam computed tomography data for production of objects used in surgical and nonsurgical endodontics and in educational settings. The aim of this article was to review all current applications of 3D printing in endodontics and to speculate upon future directions for research and clinical use within the specialty. A literature search of PubMed, Ovid and Scopus was conducted using the following terms: stereolithography, 3D printing, computer aided rapid prototyping, surgical guide, guided endodontic surgery, guided endodontic access, additive manufacturing, rapid prototyping, autotransplantation rapid prototyping, CAD, CAM. Inclusion criteria were articles in the English language documenting endodontic applications of 3D printing. Fifty-one articles met inclusion criteria and were utilized. The endodontic literature on 3D printing is generally limited to case reports and pre-clinical studies. Documented solutions to endodontic challenges include: guided access with pulp canal obliteration, applications in autotransplantation, pre-surgical planning, educational modelling, and accurate location of osteotomy perforation sites. Acquisition of technical expertise and equipment within endodontic practices presents formidable obstacles to widespread deployment within the endodontic specialty. As knowledge advances, endodontic postgraduate programmes should consider incorporating 3D printing into their curricula. Future research directions should include clinical outcomes assessments of treatments employing 3D printed objects. Published 2018. This article is a U.S. Government work and is in the public domain in the USA.
Gimeno-Blanes, Francisco J.; Blanco-Velasco, Manuel; Barquero-Pérez, Óscar; García-Alberola, Arcadi; Rojo-Álvarez, José L.
2016-01-01
Great effort has been devoted in recent years to the development of sudden cardiac risk predictors as a function of electric cardiac signals, mainly obtained from electrocardiogram (ECG) analysis. But these prediction techniques are still seldom used in clinical practice, partly due to their limited diagnostic accuracy and to the lack of consensus about the appropriate computational signal processing implementation. This paper takes a three-fold approach, based on ECG indices, to structure this review on sudden cardiac risk stratification: first, through the computational techniques that have been widely proposed in the technical literature for obtaining these indices; second, through the scientific evidence, which, although supported by observational clinical studies, is not always representative enough; and third, through the limited technology transfer of academically accepted algorithms, which requires further consideration for future systems. We focus on three families of ECG-derived indices that are examined from these viewpoints, namely, heart rate turbulence (HRT), heart rate variability (HRV), and T-wave alternans. In terms of computational algorithms, we still need clearer scientific evidence, standardization, and benchmarking, resting on advanced algorithms applied to large and representative datasets. New scenarios such as electronic health records, big data, long-term monitoring, and cloud databases will eventually open new frameworks in which suitable new paradigms can be foreseen in the near future. PMID:27014083
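As a small, concrete example of the kind of computational index surveyed here, the sketch below computes two standard time-domain HRV measures, SDNN and RMSSD, from a series of RR intervals; the variable names and synthetic data are illustrative and are not taken from the paper or any particular clinical pipeline.

    import numpy as np

    def time_domain_hrv(rr_ms):
        # Time-domain heart rate variability from RR intervals in milliseconds.
        # SDNN  : standard deviation of all RR intervals.
        # RMSSD : root mean square of successive RR-interval differences.
        # Illustrative sketch only; real pipelines add beat detection,
        # artifact rejection, and ectopic-beat handling before this step.
        rr = np.asarray(rr_ms, dtype=float)
        sdnn = rr.std(ddof=1)
        rmssd = np.sqrt(np.mean(np.diff(rr) ** 2))
        return sdnn, rmssd

    # Synthetic example: roughly 75 beats per minute with mild variation.
    rng = np.random.default_rng(0)
    rr_series = 800.0 + 30.0 * rng.standard_normal(300)
    print(time_domain_hrv(rr_series))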
A preliminary study of flat-panel displays
NASA Technical Reports Server (NTRS)
Yancey, K. E.
1986-01-01
Six display technologies that might be of future value in a Spacelab workstation are discussed. Some have been developed to the point where they could be used as a computer display, while others have not. The display technologies studied are electroluminescent, light-emitting diode, gas plasma, liquid crystal, electrochromic, and electrophoretic. An explanation of each mechanism is provided, along with the current state of its development.