Utility Computing: Reality and Beyond
NASA Astrophysics Data System (ADS)
Ivanov, Ivan I.
Utility Computing is not a new concept. It involves organizing and providing a wide range of computing-related services as public utilities. The idea of computing as a public utility, much like water, gas, electricity and telecommunications, was announced in 1955. Utility Computing remained a concept for nearly 50 years. Now some models and forms of Utility Computing are emerging, such as storage and server virtualization, grid computing, and automated provisioning. Recent trends in Utility Computing as a complex technology involve business procedures that could profoundly transform the nature of companies' IT services, organizational IT strategies and technology infrastructure, and business models. In the ultimate Utility Computing models, organizations will be able to acquire as many IT services as they need, whenever and wherever they need them. Based on networked businesses and new secure online applications, Utility Computing would facilitate "agility-integration" of IT resources and services within and between virtual companies. With the application of Utility Computing, the complexity of IT could be concealed, operational expenses reduced, and IT costs converted to variable 'on-demand' services. How far should technology, business and society go to adopt Utility Computing forms, modes and models?
Applied technology center business plan and market survey
NASA Technical Reports Server (NTRS)
Hodgin, Robert F.; Marchesini, Roberto
1990-01-01
A business plan and market survey for the Applied Technology Center (ATC), a non-profit corporation for computer technology transfer and development, are presented. The mission of the ATC is to stimulate innovation in state-of-the-art and leading-edge computer-based technology. The ATC encourages the practical utilization of late-breaking computer technologies by firms of all varieties.
Three-Dimensional Nanobiocomputing Architectures With Neuronal Hypercells
2007-06-01
Neumann architectures, and CMOS fabrication. Novel solutions of massive parallel distributed computing and processing (pipelined due to systolic... and processing platforms utilizing molecular hardware within an enabling organization and architecture. The design technology is based on utilizing a... Microsystems and Nanotechnologies investigated a novel 3D3 (Hardware Software Nanotechnology) technology to design super-high performance computing
Recent development on computer aided tissue engineering--a review.
Sun, Wei; Lal, Pallavi
2002-02-01
The utilization of computer-aided technologies in tissue engineering has led to the development of a new field of computer-aided tissue engineering (CATE). This article reviews recent developments and applications of enabling computer technology, imaging technology, computer-aided design and computer-aided manufacturing (CAD and CAM), and rapid prototyping (RP) technology in tissue engineering, particularly in computer-aided tissue anatomical modeling, three-dimensional (3-D) anatomy visualization and 3-D reconstruction, CAD-based anatomical modeling, computer-aided tissue classification, computer-aided tissue implantation, and prototype modeling assisted surgical planning and reconstruction.
Learning Mathematics with Interactive Whiteboards and Computer-Based Graphing Utility
ERIC Educational Resources Information Center
Erbas, Ayhan Kursat; Ince, Muge; Kaya, Sukru
2015-01-01
The purpose of this study was to explore the effect of a technology-supported learning environment utilizing an interactive whiteboard (IWB) and NuCalc graphing software compared to a traditional direct instruction-based environment on student achievement in graphs of quadratic functions and attitudes towards mathematics and technology. Sixty-five…
NASA Technical Reports Server (NTRS)
Morgan, R. P.; Singh, J. P.; Rothenberg, D.; Robinson, B. E.
1975-01-01
The needs to be served, the subsectors in which the system might be used, the technology employed, and the prospects for future utilization of an educational telecommunications delivery system are described and analyzed. Educational subsectors are analyzed with emphasis on the current status and trends within each subsector. Issues which affect future development, and prospects for future use of media, technology, and large-scale electronic delivery within each subsector are included. Information on technology utilization is presented. Educational telecommunications services are identified and grouped into categories: public television and radio, instructional television, computer aided instruction, computer resource sharing, and information resource sharing. Technology based services, their current utilization, and factors which affect future development are stressed. The role of communications satellites in providing these services is discussed. Efforts to analyze and estimate future utilization of large-scale educational telecommunications are summarized. Factors which affect future utilization are identified. Conclusions are presented.
Computer-Based Methods for Collecting Peer Nomination Data: Utility, Practice, and Empirical Support
ERIC Educational Resources Information Center
van den Berg, Yvonne H. M.; Gommans, Rob
2017-01-01
New technologies have led to several major advances in psychological research over the past few decades. Peer nomination research is no exception. Thanks to these technological innovations, computerized data collection is becoming more common in peer nomination research. However, computer-based assessment is more than simply programming the…
Are Technology Interruptions Impacting Your Bottom Line? An Innovative Proposal for Change.
Ledbetter, Tamera; Shultz, Sarah; Beckham, Roxanne
2017-10-01
Nursing interruptions are a costly and dangerous variable in acute care hospitals. Malfunctioning technology equipment interrupts nursing care and prevents full utilization of computer safety systems designed to prevent patient care errors. This paper identifies an innovative approach to nursing interruptions related to computer and computer cart malfunctions. The impact on human resources is defined and outcome measures are proposed. A multifaceted proposal, based on a literature review and aimed at reducing nursing interruptions, is presented. This proposal is expected to increase patient safety, as well as patient and nurse satisfaction. The setting comprises acute care hospitals utilizing electronic medical records and bar-coded medication administration technology. The stakeholders include nurses, information technology staff, nursing informatics staff, and all leadership teams affected by technology problems and their proposed solutions. Literature from multiple fields was reviewed to evaluate research related to computer and computer cart failures and the approaches used to resolve these issues. Outcome measures address strategic goals related to patient safety and to nurse and patient satisfaction. Specific help desk metrics will demonstrate the effect of interventions. This paper addresses a gap in the literature and proposes practical and innovative solutions. A comprehensive computer and computer cart repair program is essential for patient safety, financial stewardship, and utilization of resources. © 2015 Wiley Periodicals, Inc.
Law of Large Numbers: The Theory, Applications and Technology-Based Education
ERIC Educational Resources Information Center
Dinov, Ivo D.; Christou, Nicolas; Gould, Robert
2009-01-01
Modern approaches for technology-based blended education utilize a variety of recently developed novel pedagogical, computational and network resources. Such attempts employ technology to deliver integrated, dynamically-linked, interactive-content and heterogeneous learning environments, which may improve student comprehension and information…
Consumer-based technology for distribution of surgical videos for objective evaluation.
Gonzalez, Ray; Martinez, Jose M; Lo Menzo, Emanuele; Iglesias, Alberto R; Ro, Charles Y; Madan, Atul K
2012-08-01
The Global Operative Assessment of Laparoscopic Skill (GOALS) is one validated metric utilized to grade laparoscopic skills and has been utilized to score recorded operative videos. To facilitate viewing of these recorded videos, we are developing novel techniques to enable surgeons to view them. The objective of this study is to determine the feasibility of utilizing widespread, current consumer-based technology to assist in distributing appropriate videos for objective evaluation. Videos from residents were recorded via a direct connection from the camera processor's S-video output through a cable and hub to a standard laptop computer's universal serial bus (USB) port. A standard consumer-based video editing program was utilized to capture the video and record it in an appropriate format. We utilized the mp4 format, and depending on the size of the file, the videos were scaled down (compressed), converted to another format (using a standard video editing program), or sliced into multiple videos. Standard available consumer-based programs were utilized to convert the video into a more appropriate format for handheld personal digital assistants. In addition, the videos were uploaded to a social networking website and to video sharing websites. Recorded cases of laparoscopic cholecystectomy in a porcine model were utilized. Compression was required for all formats. All formats were accessed from home computers, work computers, and iPhones without difficulty. Qualitative analyses by four surgeons demonstrated appropriate quality to grade for these formats. Our preliminary results show promise that, utilizing consumer-based technology, videos can be easily distributed to surgeons to grade with GOALS via various methods. Easy accessibility may help make evaluation of resident videos less complicated and cumbersome.
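The workflow above (capture, downscale/compress, convert to mp4, then distribute) can be approximated with freely available tools. The sketch below is a minimal, hypothetical example using Python and the common ffmpeg command-line encoder; the study itself used consumer video-editing programs, so the tool choice, filenames, and settings here are assumptions rather than the authors' actual pipeline.

```python
import subprocess
from pathlib import Path

def compress_for_review(src: Path, dst: Path, width: int = 640, crf: int = 28) -> None:
    """Scale and re-encode a recorded operative video into a small H.264/mp4 file."""
    cmd = [
        "ffmpeg", "-y",
        "-i", str(src),                        # captured S-video recording
        "-vf", f"scale={width}:-2",            # downscale, keep aspect ratio (even height)
        "-c:v", "libx264", "-crf", str(crf),   # higher CRF = smaller file, lower quality
        "-c:a", "aac",
        str(dst),
    ]
    subprocess.run(cmd, check=True)

# Hypothetical usage:
# compress_for_review(Path("lap_chole_case01.avi"), Path("lap_chole_case01.mp4"))
```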
Training for New Manufacturing Technologies.
ERIC Educational Resources Information Center
Jacobs, James
1988-01-01
Examines the effects of computer-based manufacturing technologies on employment opportunities and job skills. Describes the establishment of the Industrial Technology Institute in Michigan to develop and utilize advanced manufacturing technologies, and the institute's relationship to the state's community colleges. Reviews lessons learned from…
ERIC Educational Resources Information Center
What Works Clearinghouse, 2012
2012-01-01
"Technology Enhanced Elementary and Middle School Science" ("TEEMSS") is a physical science curriculum for grades 3-8 that utilizes computers, sensors, and interactive models to support investigations of real-world phenomena. Through 15 inquiry-based instructional units, students interact with computers, gather and analyze…
A GPU-based mipmapping method for water surface visualization
NASA Astrophysics Data System (ADS)
Li, Hua; Quan, Wei; Xu, Chao; Wu, Yan
2018-03-01
Visualization of water surfaces is a hot topic in computer graphics. In this paper, we present a fast method to generate a wide range of water surface with good image quality both near and far from the viewpoint. This method utilizes a uniform mesh and fractal Perlin noise to model the water surface. Mipmapping is applied to the surface textures, adjusting their resolution with respect to the distance from the viewpoint and reducing the computing cost. Lighting effects are computed based on shadow mapping, Snell's law, and the Fresnel term. The render pipeline utilizes a CPU-GPU shared memory structure, which improves rendering efficiency. Experimental results show that our approach visualizes the water surface with good image quality at real-time frame rates.
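To make the lighting computation concrete, the sketch below shows how a Fresnel term is typically used to blend reflected and refracted color for a water pixel. It uses Schlick's approximation and water's refractive index of about 1.33; these specifics are common practice rather than details taken from the paper, whose shader may use the exact Fresnel equations derived from Snell's law.

```python
import math

def schlick_fresnel(cos_theta: float, n1: float = 1.0, n2: float = 1.33) -> float:
    """Approximate Fresnel reflectance for a view ray hitting water at angle theta."""
    r0 = ((n1 - n2) / (n1 + n2)) ** 2                 # reflectance at normal incidence
    return r0 + (1.0 - r0) * (1.0 - cos_theta) ** 5

def shade_water(reflected_rgb, refracted_rgb, cos_theta):
    """Blend reflection and refraction colors by the Fresnel term."""
    f = schlick_fresnel(cos_theta)
    return tuple(f * r + (1.0 - f) * t for r, t in zip(reflected_rgb, refracted_rgb))

# Grazing view (small cos_theta) -> mostly reflection; looking straight down -> mostly refraction.
# print(shade_water((0.8, 0.85, 0.9), (0.0, 0.3, 0.4), cos_theta=0.2))
```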
NASA Astrophysics Data System (ADS)
Kun, Luis G.
1995-10-01
During the first Health Care Technology Policy conference last year, held during health care reform, four major issues were brought up regarding the efforts underway to develop a computer-based patient record (CBPR), the National Information Infrastructure (NII) as part of the high performance computers and communications (HPCC), and the so-called 'patient card.' More specifically, it was explained how a national information system will greatly affect the way health care delivery is provided to the United States public and reduce its costs. These four issues were: (1) Constructing a national information infrastructure (NII); (2) Building a computer-based patient record system; (3) Bringing the collective resources of our national laboratories to bear in developing and implementing the NII and CBPR, as well as a security system with which to safeguard the privacy rights of patients and the physician-patient privilege; (4) Utilizing government (e.g., DOD, DOE) capabilities (technology and human resources) to maximize resource utilization, create new jobs, and accelerate technology transfer to address health care issues. This year, a section of this conference entitled 'Health Care Technology Assets of the Federal Government' addresses the benefits of the technology transfer that should occur to maximize already developed resources. This section, entitled 'Transfer and Utilization of Government Technology Assets to the Private Sector,' will look at both health care and non-health care related technologies, since many areas such as information technologies (i.e., imaging, communications, archival/retrieval, systems integration, information display, multimedia, heterogeneous databases, etc.) already exist and are part of our national labs and/or other federal agencies, i.e., ARPA. Although these technologies are not labeled under health care programs, they could provide enormous value in addressing technical needs. An additional issue deals with both the technical (hardware, software) and human expertise that resides within these labs and their possible role in creating cost-effective solutions.
A personal computer-based nuclear magnetic resonance spectrometer
NASA Astrophysics Data System (ADS)
Job, Constantin; Pearson, Robert M.; Brown, Michael F.
1994-11-01
Nuclear magnetic resonance (NMR) spectroscopy using personal computer-based hardware has the potential of enabling the application of NMR methods to fields where conventional state of the art equipment is either impractical or too costly. With such a strategy for data acquisition and processing, disciplines including civil engineering, agriculture, geology, archaeology, and others have the possibility of utilizing magnetic resonance techniques within the laboratory or conducting applications directly in the field. Another aspect is the possibility of utilizing existing NMR magnets which may be in good condition but unused because of outdated or nonrepairable electronics. Moreover, NMR applications based on personal computer technology may open up teaching possibilities at the college or even secondary school level. The goal of developing such a personal computer (PC)-based NMR standard is facilitated by existing technologies including logic cell arrays, direct digital frequency synthesis, use of PC-based electrical engineering software tools to fabricate electronic circuits, and the use of permanent magnets based on neodymium-iron-boron alloy. Utilizing such an approach, we have been able to place essentially an entire NMR spectrometer console on two printed circuit boards, with the exception of the receiver and radio frequency power amplifier. Future upgrades to include the deuterium lock and the decoupler unit are readily envisioned. The continued development of such PC-based NMR spectrometers is expected to benefit from the fast growing, practical, and low cost personal computer market.
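The abstract cites direct digital frequency synthesis (DDS) as one of the enabling technologies for a PC-based console. The following is a minimal numerical model of the DDS principle (a phase accumulator advanced by a tuning word on every clock tick, indexing a sine function); the clock rate, accumulator width, and output frequency shown are illustrative assumptions, not values from the instrument described.

```python
import numpy as np

def dds_waveform(f_out_hz: float, f_clk_hz: float, n_bits: int = 32, n_samples: int = 1000):
    """Model a direct-digital-synthesis sine generator: a phase accumulator is
    incremented by a fixed tuning word each clock tick; f_out = M * f_clk / 2**N."""
    tuning_word = int(round(f_out_hz * 2**n_bits / f_clk_hz))
    phase = (np.arange(n_samples) * tuning_word) % (2**n_bits)   # accumulator wraps at 2**N
    return np.sin(2 * np.pi * phase / 2**n_bits)

# e.g. a 10.7 MHz reference synthesized from a hypothetical 100 MHz clock:
# samples = dds_waveform(10.7e6, 100e6)
```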
Construction and application of Red5 cluster based on OpenStack
NASA Astrophysics Data System (ADS)
Wang, Jiaqing; Song, Jianxin
2017-08-01
With the application and development of cloud computing technology in various fields, the resource utilization rate of the data center has improved markedly, and systems based on cloud computing platforms have also improved in expansibility and stability. In traditional deployments, Red5 cluster resource utilization is low and system stability is poor. This paper leverages the efficient resource allocation capability of cloud computing and builds a Red5 server cluster based on OpenStack. Multimedia applications can be published to the Red5 cloud server cluster. The system achieves flexible provisioning of computing resources and also greatly improves cluster stability and service efficiency.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wirick, D.W.; Montgomery, G.E.; Wagman, D.C.
1995-09-01
One technology that can help utilities remain financially viable in competitive markets and help utilities and regulators to better serve the public is information technology. Because geography is an important part of an electric, natural gas, telecommunications, or water utility, computer-based Geographic Information Systems (GIS) and related Automated Mapping/Facilities Management systems are emerging as core technologies for managing an ever-expanding variety of formerly manual or paper-based tasks. This report focuses on GIS as an example of the types of information systems that can be used by utilities and regulatory commissions. Chapter 2 provides general information about information systems and the effects of information on organizations; Chapter 3 explores the conversion of an organization to an information-based one; Chapters 4 and 5 set out GIS as an example of the use of information technologies to transform the operations of utilities and commissions; Chapter 6 describes the use of GIS and other information systems for organizational reengineering efforts; and Chapter 7 examines the regulatory treatment of information systems.
ERIC Educational Resources Information Center
Cohen, Edward Charles
2013-01-01
Design-based research was utilized to investigate how students use a greenhouse effect simulation in order to derive best learning practices. During this process, students recognized the authentic scientific process involving computer simulations. The simulation used is embedded within an inquiry-based technology-mediated science curriculum known…
Flight control systems development of highly maneuverable aircraft technology /HiMAT/ vehicle
NASA Technical Reports Server (NTRS)
Petersen, K. L.
1979-01-01
The highly maneuverable aircraft technology (HiMAT) program was conceived to demonstrate advanced technology concepts through scaled-aircraft flight tests using a remotely piloted technique. Closed-loop primary flight control is performed from a ground-based cockpit, utilizing a digital computer and up/down telemetry links. A backup flight control system for emergency operation resides in an onboard computer. The onboard systems are designed to provide fail-operational capabilities and utilize two microcomputers, dual uplink receiver/decoders, and redundant hydraulic actuation and power systems. This paper discusses the design and validation of the primary and backup digital flight control systems as well as the unique pilot and specialized systems interfaces.
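As a rough illustration of the fail-operational arrangement described above (a ground-based primary controller reached over an up/down telemetry link, with an onboard backup law ready to take over), the toy sketch below models the mode switch as an uplink watchdog. The timeout value, class names, and control-law placeholders are purely illustrative and are not taken from the HiMAT design.

```python
import time

UPLINK_TIMEOUT_S = 0.2  # illustrative staleness threshold, not the actual HiMAT value

class FlightControlMode:
    """Toy model of primary (ground-based) vs. backup (onboard) control selection."""

    def __init__(self):
        self.last_uplink = time.monotonic()
        self.mode = "PRIMARY"

    def on_uplink(self, ground_commands):
        """Ground-computed surface commands arrive over the telemetry uplink."""
        self.last_uplink = time.monotonic()
        self.mode = "PRIMARY"
        return ground_commands

    def tick(self, backup_control_law, sensors):
        """Called every control frame; falls back to the onboard law if the link is stale."""
        if time.monotonic() - self.last_uplink > UPLINK_TIMEOUT_S:
            self.mode = "BACKUP"
            return backup_control_law(sensors)
        return None  # last primary commands remain in effect
```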
Applications of Technology in Neuropsychological Assessment
Parsey, Carolyn M.; Schmitter-Edgecombe, Maureen
2013-01-01
Most neuropsychological assessments include at least one measure that is administered, scored, or interpreted by computers or other technologies. Despite supportive findings for these technology-based assessments, there is resistance in the field of neuropsychology to adopt additional measures that incorporate technology components. This literature review addresses the research findings of technology-based neuropsychological assessments, including computer-, and virtual reality-based measures of cognitive and functional abilities. We evaluate the strengths and limitations of each approach, and examine the utility of technology-based assessments to obtain supplemental cognitive and behavioral information that may be otherwise undetected by traditional paper and pencil measures. We argue that the potential of technology use in neuropsychological assessment has not yet been realized, and continued adoption of new technologies could result in more comprehensive assessment of cognitive dysfunction and in turn, better informed diagnosis and treatments. Recommendations for future research are also provided. PMID:24041037
Applications of technology in neuropsychological assessment.
Parsey, Carolyn M; Schmitter-Edgecombe, Maureen
2013-01-01
Most neuropsychological assessments include at least one measure that is administered, scored, or interpreted by computers or other technologies. Despite supportive findings for these technology-based assessments, there is resistance in the field of neuropsychology to adopt additional measures that incorporate technology components. This literature review addresses the research findings of technology-based neuropsychological assessments, including computer- and virtual reality-based measures of cognitive and functional abilities. We evaluate the strengths and limitations of each approach, and examine the utility of technology-based assessments to obtain supplemental cognitive and behavioral information that may be otherwise undetected by traditional paper-and-pencil measures. We argue that the potential of technology use in neuropsychological assessment has not yet been realized, and continued adoption of new technologies could result in more comprehensive assessment of cognitive dysfunction and in turn, better informed diagnosis and treatments. Recommendations for future research are also provided.
NASA Astrophysics Data System (ADS)
Yue, S. S.; Wen, Y. N.; Lv, G. N.; Hu, D.
2013-10-01
In recent years, the increasing development of cloud computing technologies has laid a critical foundation for efficiently solving complicated geographic issues. However, it is still difficult to realize the cooperative operation of massive heterogeneous geographical models. Traditional cloud architecture is apt to provide centralized solutions to end users, while all the required resources are often offered by large enterprises or special agencies. Thus, it is a closed framework from the perspective of resource utilization. Solving comprehensive geographic issues requires integrating multifarious heterogeneous geographical models and data. In this case, an open computing platform is needed, with which model owners can package and deploy their models into the cloud conveniently, while model users can search, access, and utilize those models with cloud facilities. Based on this concept, open cloud service strategies for the sharing of heterogeneous geographic analysis models are studied in this article. The key technologies, namely a unified cloud interface strategy, a sharing platform based on cloud services, and a computing platform based on cloud services, are discussed in detail, and related experiments are conducted for further verification.
van den Berg, Yvonne H M; Gommans, Rob
2017-09-01
New technologies have led to several major advances in psychological research over the past few decades. Peer nomination research is no exception. Thanks to these technological innovations, computerized data collection is becoming more common in peer nomination research. However, computer-based assessment is more than simply programming the questionnaire and asking respondents to fill it in on computers. In this chapter the advantages and challenges of computer-based assessments are discussed. In addition, a list of practical recommendations and considerations is provided to inform researchers on how computer-based methods can be applied to their own research. Although the focus is on the collection of peer nomination data in particular, many of the requirements, considerations, and implications are also relevant for those who consider the use of other sociometric assessment methods (e.g., paired comparisons, peer ratings, peer rankings) or computer-based assessments in general. © 2017 Wiley Periodicals, Inc.
ERIC Educational Resources Information Center
Library of Congress, Washington, DC. Congressional Research Service.
This summary of the combined Hearing and Workshop on Applications of Computer-Based Information Systems and Services in Agriculture (May 19-20, 1982) offers an overview of the ways in which information technology--computers, telecommunications, microforms, word processing, video and audio devices--may be utilized by American farmers and ranchers.…
Central Limit Theorem: New SOCR Applet and Demonstration Activity
ERIC Educational Resources Information Center
Dinov, Ivo D.; Christou, Nicholas; Sanchez, Juana
2008-01-01
Modern approaches for information technology based blended education utilize a variety of novel instructional, computational and network resources. Such attempts employ technology to deliver integrated, dynamically linked, interactive content and multi-faceted learning environments, which may facilitate student comprehension and information…
2015-01-01
Background: Incorporation of information communication technology in health care has gained wide acceptance in the last two decades. Developing countries are also incorporating information communication technology into the health system including the implementation of electronic medical records in major hospitals and the use of mobile health in rural community-based health interventions. However, the literature on the level of knowledge and utilization of information communication technology by health professionals in those settings is scarce for proper implementation planning. Objective: The objective of this study is to assess knowledge, computer utilization, and associated factors among health professionals in hospitals and health institutions in Ethiopia. Methods: A quantitative cross-sectional study was conducted on 554 health professionals working in 7 hospitals, 19 primary health centers, and 10 private clinics in the Harari region of Ethiopia. Data were collected using a semi-structured, self-administered, and pre-tested questionnaire. Descriptive and logistic regression techniques using SPSS version 16.0 (IBM Corporation) were applied to determine the level of knowledge and identify determinants of utilization of information communication technology. Results: Out of 554 participants, 482 (87.0%) of them responded to the questionnaire. Among them, 90 (18.7%) demonstrated good knowledge of computers while 142 (29.5%) demonstrated good utilization habits. Health professionals who work in the primary health centers were found to have lower knowledge (3.4%) and utilization (18.4%). Age (adjusted odds ratio [AOR]=3.06, 95% CI 0.57-5.37), field of study (AOR=3.08, 95% CI 1.65-5.73), level of education (AOR=2.78, 95% CI 1.43-5.40), and previous computer training participation (AOR=3.65, 95% CI 1.62-8.21) were found to be significantly associated with computer utilization habits of health professionals. Conclusions: Computer knowledge and utilization habits of health professionals, especially those who work in primary health centers, were found to be low. Providing trainings and continuous follow-up are necessary measures to increase the likelihood of the success of implemented eHealth systems in those settings. PMID:27025996
Alwan, Kalid; Awoke, Tadesse; Tilahun, Binyam
2015-03-26
Incorporation of information communication technology in health care has gained wide acceptance in the last two decades. Developing countries are also incorporating information communication technology into the health system including the implementation of electronic medical records in major hospitals and the use of mobile health in rural community-based health interventions. However, the literature on the level of knowledge and utilization of information communication technology by health professionals in those settings is scarce for proper implementation planning. The objective of this study is to assess knowledge, computer utilization, and associated factors among health professionals in hospitals and health institutions in Ethiopia. A quantitative cross-sectional study was conducted on 554 health professionals working in 7 hospitals, 19 primary health centers, and 10 private clinics in the Harari region of Ethiopia. Data were collected using a semi-structured, self-administered, and pre-tested questionnaire. Descriptive and logistic regression techniques using SPSS version 16.0 (IBM Corporation) were applied to determine the level of knowledge and identify determinants of utilization of information communication technology. Out of 554 participants, 482 (87.0%) of them responded to the questionnaire. Among them, 90 (18.7%) demonstrated good knowledge of computers while 142 (29.5%) demonstrated good utilization habits. Health professionals who work in the primary health centers were found to have lower knowledge (3.4%) and utilization (18.4%). Age (adjusted odds ratio [AOR]=3.06, 95% CI 0.57-5.37), field of study (AOR=3.08, 95% CI 1.65-5.73), level of education (AOR=2.78, 95% CI 1.43-5.40), and previous computer training participation (AOR=3.65, 95% CI 1.62-8.21) were found to be significantly associated with computer utilization habits of health professionals. Computer knowledge and utilization habits of health professionals, especially those who work in primary health centers, were found to be low. Providing trainings and continuous follow-up are necessary measures to increase the likelihood of the success of implemented eHealth systems in those settings.
Mobile Technology: Case-Based Suggestions for Classroom Integration and Teacher Educators
ERIC Educational Resources Information Center
Herro, Danielle; Kiger, Derick; Owens, Carl
2013-01-01
Mobile technologies permeate the lives of 21st century citizens. From smart-phones to tablet computers, people use these devices to navigate personal, social, and career responsibilities. Educators recognize the instructional potential of mobiles and are seeking ways to effectively utilize these technologies in support of learning. Research is…
ERIC Educational Resources Information Center
Spiro, Louis M.; Campbell, Jill F.
The development and use of a campus-based computerized faculty staffing model is described. In addition to considering market demands for current and proposed programs, decisionmakers need to consider how program development, modification, and elimination affect the total college faculty resource base. The application of computer technology,…
Zander, Thorsten O; Kothe, Christian
2011-04-01
Cognitive monitoring is an approach utilizing real-time brain signal decoding (RBSD) for gaining information on the ongoing cognitive user state. In recent decades this approach has brought valuable insight into the cognition of an interacting human. Automated RBSD can be used to set up a brain-computer interface (BCI) providing a novel input modality for technical systems based solely on brain activity. In BCIs the user usually sends voluntary and directed commands to control the connected computer system or to communicate through it. In this paper we propose an extension of this approach by fusing BCI technology with cognitive monitoring, providing valuable information about the users' intentions, situational interpretations, and emotional states to the technical system. We call this approach passive BCI. In the following we give an overview of studies which utilize passive BCI, as well as other novel types of applications resulting from BCI technology. We especially focus on applications for healthy users, and the specific requirements and demands of this user group. Since the presented approach of combining cognitive monitoring with BCI technology is very similar to the concept of BCIs itself, we propose a unifying categorization of BCI-based applications, including the novel approach of passive BCI.
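Real-time brain signal decoding of the kind surveyed above usually reduces to extracting features from short EEG epochs and feeding them to a classifier. The sketch below is a minimal, generic example (log band-power features plus linear discriminant analysis); the sampling rate, frequency band, and classifier are common choices assumed for illustration and are not prescribed by the paper.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def band_power(epoch: np.ndarray, fs: float, lo: float, hi: float) -> np.ndarray:
    """Mean spectral power per channel in a frequency band (epoch: channels x samples)."""
    freqs = np.fft.rfftfreq(epoch.shape[1], d=1.0 / fs)
    psd = np.abs(np.fft.rfft(epoch, axis=1)) ** 2
    mask = (freqs >= lo) & (freqs <= hi)
    return psd[:, mask].mean(axis=1)

def train_state_decoder(epochs, labels, fs=256.0):
    """epochs: list of (channels x samples) EEG segments; labels: user state per epoch."""
    X = np.array([np.log(band_power(e, fs, 8.0, 13.0)) for e in epochs])  # alpha-band features
    clf = LinearDiscriminantAnalysis()
    clf.fit(X, labels)
    return clf   # clf.predict(new_features) yields the estimated user state
```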
Health care information infrastructure: what will it be and how will we get there?
NASA Astrophysics Data System (ADS)
Kun, Luis G.
1996-02-01
During the first Health Care Technology Policy (HCTP) conference last year, during Health Care Reform, four major issues were brought up in regard to the efforts underway to develop a Computer Based Patient Record (CBPR), the National Information Infrastructure (NII) as part of the High Performance Computers & Communications (HPCC), and the so-called "Patient Card". More specifically, it was explained how a national information system will greatly affect the way health care delivery is provided to the United States public and reduce its costs. These four issues were: Constructing a National Information Infrastructure (NII); Building a Computer Based Patient Record System; Bringing the collective resources of our National Laboratories to bear in developing and implementing the NII and CBPR, as well as a security system with which to safeguard the privacy rights of patients and the physician-patient privilege; Utilizing Government (e.g., DOD, DOE) capabilities (technology and human resources) to maximize resource utilization, create new jobs and accelerate technology transfer to address health care issues. During the second HCTP conference, in mid-1995, a section of this meeting entitled "Health Care Technology Assets of the Federal Government" addressed benefits of the technology transfer which should occur for maximizing already developed resources. Also, a section entitled "Transfer and Utilization of Government Technology Assets to the Private Sector" looked at both Health Care and non-Health Care related technologies, since many areas such as Information Technologies (i.e., imaging, communications, archival/retrieval, systems integration, information display, multimedia, heterogeneous data bases, etc.) already exist and are part of our National Labs and/or other federal agencies, i.e., ARPA. Although these technologies are not labeled under "Health Care" programs, they could provide enormous value to address technical needs. An additional issue deals with both the technical (hardware, software) and human expertise that resides within these labs and their possible role in creating cost-effective solutions.
A Web-based Distributed Voluntary Computing Platform for Large Scale Hydrological Computations
NASA Astrophysics Data System (ADS)
Demir, I.; Agliamzanov, R.
2014-12-01
Distributed volunteer computing can enable researchers and scientists to form large parallel computing environments that utilize the computing power of the millions of computers on the Internet, and use them toward running large-scale environmental simulations and models to serve the common good of local communities and the world. Recent developments in web technologies and standards allow client-side scripting languages to run at speeds close to native applications, and to utilize the power of Graphics Processing Units (GPU). Using a client-side scripting language like JavaScript, we have developed an open distributed computing framework that makes it easy for researchers to write their own hydrologic models and run them on volunteer computers. Users can easily enable their websites so that visitors can volunteer their computer resources to contribute to running advanced hydrological models and simulations. Using a web-based system allows users to start volunteering their computational resources within seconds without installing any software. The framework distributes the model simulation to thousands of nodes in small spatial and computational units. A relational database system is utilized for managing data connections and queue management for the distributed computing nodes. In this paper, we present a web-based distributed volunteer computing platform to enable large-scale hydrological simulations and model runs in an open and integrated environment.
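The browser-side workers in such a platform run JavaScript, but the relational-database queue that hands out and collects work units (mentioned above) is language-agnostic. The sketch below illustrates that server-side bookkeeping only, using Python and SQLite as stand-ins; the table layout and function names are assumptions for illustration, not the platform's actual schema.

```python
import sqlite3

def init_queue(db="work_units.db"):
    """Create a minimal queue of pending work units for volunteer nodes."""
    conn = sqlite3.connect(db)
    conn.execute("""CREATE TABLE IF NOT EXISTS units (
        id INTEGER PRIMARY KEY, payload TEXT,
        status TEXT DEFAULT 'pending', result TEXT)""")
    conn.commit()
    return conn

def checkout_unit(conn):
    """Hand the next pending work unit to a volunteer browser."""
    row = conn.execute(
        "SELECT id, payload FROM units WHERE status='pending' LIMIT 1").fetchone()
    if row:
        conn.execute("UPDATE units SET status='running' WHERE id=?", (row[0],))
        conn.commit()
    return row   # (id, payload) or None when the pool is empty

def submit_result(conn, unit_id, result):
    """Record a finished unit returned by a volunteer node."""
    conn.execute("UPDATE units SET status='done', result=? WHERE id=?", (result, unit_id))
    conn.commit()
```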
Nanoelectronic programmable synapses based on phase change materials for brain-inspired computing.
Kuzum, Duygu; Jeyasingh, Rakesh G D; Lee, Byoungil; Wong, H-S Philip
2012-05-09
Brain-inspired computing is an emerging field, which aims to extend the capabilities of information technology beyond digital logic. A compact nanoscale device, emulating biological synapses, is needed as the building block for brain-like computational systems. Here, we report a new nanoscale electronic synapse based on technologically mature phase change materials employed in optical data storage and nonvolatile memory applications. We utilize continuous resistance transitions in phase change materials to mimic the analog nature of biological synapses, enabling the implementation of a synaptic learning rule. We demonstrate different forms of spike-timing-dependent plasticity using the same nanoscale synapse with picojoule level energy consumption.
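The synaptic learning rule mentioned above is spike-timing-dependent plasticity (STDP), in which the sign and size of the weight change depend on the relative timing of pre- and post-synaptic spikes. The sketch below shows the canonical exponential STDP window and a bounded update of an analog conductance; the amplitudes and time constants are illustrative placeholders, not the device parameters reported in the paper.

```python
import math

A_PLUS, A_MINUS = 0.01, 0.012      # learning-rate amplitudes (illustrative)
TAU_PLUS, TAU_MINUS = 20.0, 20.0   # time constants in ms (illustrative)

def stdp_delta_w(t_pre_ms: float, t_post_ms: float) -> float:
    """Potentiate if the pre-synaptic spike precedes the post-synaptic spike,
    depress otherwise, with exponentially decaying magnitude."""
    dt = t_post_ms - t_pre_ms
    if dt >= 0:
        return A_PLUS * math.exp(-dt / TAU_PLUS)    # pre -> post: strengthen
    return -A_MINUS * math.exp(dt / TAU_MINUS)      # post -> pre: weaken

def update_conductance(g, t_pre_ms, t_post_ms, g_min=0.0, g_max=1.0):
    """Map the weight change onto the bounded analog conductance of the device."""
    return min(g_max, max(g_min, g + stdp_delta_w(t_pre_ms, t_post_ms) * (g_max - g_min)))
```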
ERIC Educational Resources Information Center
Dinov, Ivo D.; Sanchez, Juana; Christou, Nicolas
2008-01-01
Technology-based instruction represents a new recent pedagogical paradigm that is rooted in the realization that new generations are much more comfortable with, and excited about, new technologies. The rapid technological advancement over the past decade has fueled an enormous demand for the integration of modern networking, informational and…
Software Accelerates Computing Time for Complex Math
NASA Technical Reports Server (NTRS)
2014-01-01
Ames Research Center awarded Newark, Delaware-based EM Photonics Inc. SBIR funding to utilize graphic processing unit (GPU) technology, traditionally used for computer video games, to develop high-computing software called CULA. The software gives users the ability to run complex algorithms on personal computers with greater speed. As a result of the NASA collaboration, the number of employees at the company has increased 10 percent.
Memory management and compiler support for rapid recovery from failures in computer systems
NASA Technical Reports Server (NTRS)
Fuchs, W. K.
1991-01-01
This paper describes recent developments in the use of memory management and compiler technology to support rapid recovery from failures in computer systems. The techniques described include cache coherence protocols for user transparent checkpointing in multiprocessor systems, compiler-based checkpoint placement, compiler-based code modification for multiple instruction retry, and forward recovery in distributed systems utilizing optimistic execution.
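The techniques surveyed above are hardware- and compiler-assisted, but the underlying checkpoint-and-rollback idea can be conveyed with a small application-level sketch: save a consistent snapshot at chosen points, and restore it when a transient failure is detected. The class and the failure signal below are illustrative stand-ins, not the cache-coherence or instruction-retry mechanisms described in the paper.

```python
import copy

class Checkpointed:
    """Toy rollback-recovery: snapshot application state and restore it on failure."""
    def __init__(self, state):
        self.state = state
        self._snapshot = copy.deepcopy(state)

    def checkpoint(self):
        self._snapshot = copy.deepcopy(self.state)

    def rollback(self):
        self.state = copy.deepcopy(self._snapshot)

def run_with_retry(task, ckpt: Checkpointed, max_retries=3):
    """Attempt a computation; on a detected transient error, roll back and retry."""
    for _ in range(max_retries):
        try:
            result = task(ckpt.state)   # the (possibly failing) computation
            ckpt.checkpoint()           # commit progress at a recovery point
            return result
        except RuntimeError:            # stand-in for a detected transient fault
            ckpt.rollback()             # restart from the last consistent state
    raise RuntimeError("unrecoverable after retries")
```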
Finelle, Gary; Lee, Sang J
Digital technology has been widely used in the field of implant dentistry. From a surgical standpoint, computer-guided surgery can be utilized to enhance primary implant stability and to improve the precision of implant placement. From a prosthetic standpoint, computer-aided design/computer-assisted manufacture (CAD/CAM) technology has brought about various restorative options, including the fabrication of customized abutments through a virtual design based on computer-guided surgical planning. This case report describes a novel technique combining the use of a three-dimensional (3D) printed surgical template for the immediate placement of an implant, with CAD/CAM technology to optimize hard and soft tissue healing after bone grafting with the use of a socket sealing abutment.
Using NCLab-karel to improve computational thinking skill of junior high school students
NASA Astrophysics Data System (ADS)
Kusnendar, J.; Prabawa, H. W.
2018-05-01
Increasing human interaction with technology and the increasingly complex development of the digital world make the theme of computer science education interesting to study. Previous studies on computer literacy and competency reveal that Indonesian teachers in general have fairly high computational skill, but their skill utilization is limited to a few applications. This results in limited and minimal computer-related learning for students. On the other hand, computer science education is considered unrelated to real-world solutions. This paper attempts to address the utilization of NCLab-Karel in shaping computational thinking in students. This computational thinking is believed to help students learn about technology. The implementation of Karel shows that it is able to increase student interest in studying computational material, especially algorithms. Observations made during the learning process also indicate the growth and development of a computing mindset in students.
ERIC Educational Resources Information Center
Barak, Miri
2017-01-01
The new guidelines for science education emphasize the need to introduce computers and digital technologies as a means of enabling visualization and data collection and analysis. This requires science teachers to bring advanced technologies into the classroom and use them wisely. Hence, the goal of this study was twofold: to examine the…
Audiometric testing and hearing protection training through multimedia technology.
Hong, OiSaeng; Csaszar, Peter
2005-09-01
The purpose of this paper is to present the development process of a computer-based audiometric testing and tailored intervention program, and to assess its feasibility by obtaining users' feedback. The program was implemented for 397 operating engineers at their union training center, and its feasibility was evaluated by obtaining quantitative and qualitative feedback from the participants through a survey and a focus group. Over 96% of the participants indicated they liked receiving a hearing test by computer, that the computer-based test worked smoothly, and that the computer-based training was well organized, effective, and held their interest. Almost all (more than 99%) said they would recommend this program to other workers. This project is considered one of the first to incorporate multimedia computer technology with self-administered audiometric testing and tailored training. Participants' favorable feedback strongly supported the continued utilization of this approach for designing and developing health screening and intervention programs to promote healthy behaviors.
Generic Divide and Conquer Internet-Based Computing
NASA Technical Reports Server (NTRS)
Radenski, Atanas; Follen, Gregory J. (Technical Monitor)
2001-01-01
The rapid growth of internet-based applications and the proliferation of networking technologies have been transforming traditional commercial application areas as well as computer and computational sciences and engineering. This growth stimulates the exploration of new, internet-oriented software technologies that can open new research and application opportunities not only for the commercial world, but also for the scientific and high-performance computing applications community. The general goal of this research project is to contribute to better understanding of the transition to internet-based high-performance computing and to develop solutions for some of the difficulties of this transition. More specifically, our goal is to design an architecture for generic divide and conquer internet-based computing, to develop a portable implementation of this architecture, to create an example library of high-performance divide-and-conquer computing agents that run on top of this architecture, and to evaluate the performance of these agents. We have been designing an architecture that incorporates a master task-pool server and utilizes satellite computational servers that operate on the Internet in a dynamically changing large configuration of lower-end nodes provided by volunteer contributors. Our designed architecture is intended to be complementary to and accessible from computational grids such as Globus, Legion, and Condor. Grids provide remote access to existing high-end computing resources; in contrast, our goal is to utilize idle processor time of lower-end internet nodes. Our project is focused on a generic divide-and-conquer paradigm and its applications that operate on a loose and ever changing pool of lower-end internet nodes.
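The master task-pool idea described above can be condensed into a single-process sketch: tasks are checked out of a pool, either split into subtasks that go back into the pool or solved directly, and the partial results are combined. In the proposed architecture the loop body would execute on satellite servers and volunteer nodes rather than locally; the task representation and leaf computation below are illustrative placeholders.

```python
from collections import deque

def divide(task):
    """Split a task in two, or return None if it is small enough to solve directly."""
    lo, hi = task
    if hi - lo <= 1000:
        return None
    mid = (lo + hi) // 2
    return [(lo, mid), (mid, hi)]

def conquer(task):
    lo, hi = task
    return sum(range(lo, hi))              # placeholder leaf computation

def run_task_pool(root_task):
    """Master task pool: tasks are checked out, split or solved, and results combined."""
    pool, partials = deque([root_task]), []
    while pool:
        task = pool.popleft()
        subtasks = divide(task)
        if subtasks:
            pool.extend(subtasks)          # return subtasks to the pool
        else:
            partials.append(conquer(task)) # leaf solved by a "worker"
    return sum(partials)                   # combine step for this example

assert run_task_pool((0, 100_000)) == sum(range(100_000))
```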
ERIC Educational Resources Information Center
Khasawneh, Saleh
2010-01-01
In this era of rapidly advancing technologies, many governments around the globe are spending a great amount of money on these technologies, in order to increase their work performance. Therefore, the Jordanian government decided to implement IT in its public organizations. However, the picture is unclear about users' attitudes toward this…
ERIC Educational Resources Information Center
Krause, Lorinda M.
2014-01-01
This study utilized a mixed methods approach to examine the issue of how parents, students, and teachers (stakeholders) perceive accessibility and the utilization of computer and Internet technology within the Selinsgrove, Pennsylvania Area School District. Quantitative data was collected through the use of questionnaires distributed to the…
Francis, Diane B; Cates, Joan R; Wagner, Kyla P Garrett; Zola, Tracey; Fitter, Jenny E; Coyne-Beasley, Tamera
2017-07-01
This systematic review examines the effectiveness of communication technology interventions on HPV vaccination initiation and completion. A comprehensive search strategy was used to identify existing randomized controlled trials testing the impact of computer-, mobile-, or internet-based interventions on receipt of any dose of the HPV vaccine. Twelve relevant studies were identified with a total of 38,945 participants. The interventions were delivered using several different methods, including electronic health record (i.e., recall/reminder) prompts, text messaging, automated phone calls, interactive computer videos, and email. Vaccine initiation and completion were greater for technology-based studies relative to their control conditions. There is evidence that interventions utilizing communication technologies as their sole or primary mode of HPV vaccination intervention delivery may increase vaccination coverage. Communication technologies hold much promise for the future of HPV vaccination efforts, especially initiatives in practice-based settings. Copyright © 2017 Elsevier B.V. All rights reserved.
Inquiry Style Interactive Virtual Experiments: A Case on Circular Motion
ERIC Educational Resources Information Center
Zhou, Shaona; Han, Jing; Pelz, Nathaniel; Wang, Xiaojun; Peng, Liangyu; Xiao, Hua; Bao, Lei
2011-01-01
Interest in computer-based learning, especially in the use of virtual reality simulations is increasing rapidly. While there are good reasons to believe that technologies have the potential to improve teaching and learning, how to utilize the technology effectively in teaching specific content difficulties is challenging. To help students develop…
[Economic efficiency of computer monitoring of health].
Il'icheva, N P; Stazhadze, L L
2001-01-01
This paper presents a method of computer-based health monitoring built on the utilization of modern information technologies in public health. The method helps organize the preventive activities of an outpatient clinic at a high level and substantially decrease losses of time and money. The efficiency of such preventive measures and the increasing number of computer and Internet users suggest that such methods are promising and that further studies in this field are needed.
Health Monitoring System Technology Assessments: Cost Benefits Analysis
NASA Technical Reports Server (NTRS)
Kent, Renee M.; Murphy, Dennis A.
2000-01-01
The subject of sensor-based structural health monitoring is very diverse and encompasses a wide range of activities, including initiatives and innovations involving the development of advanced sensor, signal processing, data analysis, and actuation and control technologies. In addition, it embraces the consideration of the availability of low-cost, high-quality contributing technologies, computational utilities, and hardware and software resources that enable the operational realization of robust health monitoring technologies. This report presents a detailed analysis of the cost benefit and other logistics and operational considerations associated with the implementation and utilization of sensor-based technologies for use in aerospace structure health monitoring. The scope of this volume is to assess the economic impact, from an end-user perspective, of implementing health monitoring technologies on three structures. It specifically focuses on evaluating the impact on maintaining and supporting these structures with and without health monitoring capability.
The Effects Of Disruptive Technology On Project Interdiction
2016-12-01
state of the art in personal privacy and anonymity is changing every day [11], [12]. Disruptive technologies like cryptology and the blockchain ...only parties to be threatened by implementations of blockchain technology. Brooklyn-based software developer ConsenSys aims to provide the same...services as Google, utilizing a distributed network of computers that synchronizes information exchange via a blockchain implementation known as Ethereum
Hospital positioning: a strategic tool for the 1990s.
San Augustine, A J; Long, W J; Pantzallis, J
1992-03-01
The authors extend the process of market positioning in the health care sector by focusing on the simultaneous utilization of traditional research methods and emerging new computer-based adaptive perceptual mapping technologies and techniques.
Distributed geospatial model sharing based on open interoperability standards
Feng, Min; Liu, Shuguang; Euliss, Ned H.; Fang, Yin
2009-01-01
Numerous geospatial computational models have been developed based on sound principles and published in journals or presented at conferences. However, modelers have made few advances in the development of computable modules that facilitate sharing during model development or utilization. Constraints hampering development of model sharing technology include limitations on computing, storage, and connectivity; traditional stand-alone and closed network systems cannot fully support sharing and integrating geospatial models. To address this need, we have identified methods for sharing geospatial computational models using Service Oriented Architecture (SOA) techniques and open geospatial standards. The service-oriented model sharing service is accessible using any tools or systems compliant with open geospatial standards, making it possible to utilize vast scientific resources available from around the world to solve highly sophisticated application problems. The methods also allow model services to be empowered by diverse computational devices and technologies, such as portable devices and GRID computing infrastructures. Based on the generic and abstract operations and data structures required by the Web Processing Service (WPS) standard, we developed an interactive interface for model sharing to help reduce interoperability problems for model use. Geospatial computational models are shared as model services, where the computational processes provided by models can be accessed through tools and systems compliant with WPS. We developed a platform to help modelers publish individual models in a simplified and efficient way. Finally, we illustrate our technique using wetland hydrological models we developed for the prairie pothole region of North America.
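Because the shared models are exposed through the OGC Web Processing Service interface, any WPS-compliant client can discover and inspect them with the standard GetCapabilities and DescribeProcess requests. The minimal sketch below issues those two requests over HTTP key-value pairs; the endpoint URL and process identifier are hypothetical placeholders, not the platform described in the paper.

```python
import requests

WPS_URL = "http://example.org/wps"   # hypothetical endpoint hosting a shared model

def get_capabilities(url=WPS_URL) -> str:
    """Ask a WPS server which geospatial model processes it offers."""
    r = requests.get(url, params={
        "service": "WPS", "request": "GetCapabilities", "version": "1.0.0"})
    r.raise_for_status()
    return r.text        # XML listing of the available processes

def describe_process(identifier: str, url=WPS_URL) -> str:
    """Fetch the inputs and outputs a shared model expects, per the WPS standard."""
    r = requests.get(url, params={
        "service": "WPS", "request": "DescribeProcess",
        "version": "1.0.0", "identifier": identifier})
    r.raise_for_status()
    return r.text

# print(get_capabilities())
# print(describe_process("WetlandHydrologyModel"))   # hypothetical process id
```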
NASA Astrophysics Data System (ADS)
Gen, Mitsuo; Kawakami, Hiroshi; Tsujimura, Yasuhiro; Handa, Hisashi; Lin, Lin; Okamoto, Azuma
As efficient utilization of computational resources increases, evolutionary technology based on the Genetic Algorithm (GA), Genetic Programming (GP), Evolution Strategy (ES), and other Evolutionary Computations (ECs) is making rapid progress, and its social recognition and demand as an applied technology are increasing. This is explained by the facts that, compared to approaches based on conventional theories, EC offers higher robustness for knowledge information processing systems, intelligent production and logistics systems, advanced production scheduling, and various other real-world problems, and that EC ensures flexible applicability and usefulness in unknown system environments, even in cases where accurate mathematical modeling fails in the formulation. In this paper, we provide a comprehensive survey of the current state of the art in the fundamentals and applications of evolutionary technologies.
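As a concrete reference point for the genetic algorithm family surveyed above, the sketch below implements a minimal binary GA with tournament selection, one-point crossover, and bit-flip mutation, applied to the classic OneMax problem. The operators and parameter values are textbook defaults chosen for illustration, not settings discussed in the survey.

```python
import random

def genetic_algorithm(fitness, n_bits=20, pop_size=30, generations=100,
                      p_cross=0.9, p_mut=0.02):
    """Minimal binary GA: tournament selection, one-point crossover, bit-flip mutation."""
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        scored = [(fitness(ind), ind) for ind in pop]
        def select():                      # tournament of size 2
            a, b = random.sample(scored, 2)
            return (a if a[0] >= b[0] else b)[1]
        children = []
        while len(children) < pop_size:
            p1, p2 = select()[:], select()[:]
            if random.random() < p_cross:  # one-point crossover
                cut = random.randrange(1, n_bits)
                p1, p2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
            for child in (p1, p2):
                children.append([b ^ 1 if random.random() < p_mut else b for b in child])
        pop = children[:pop_size]
    return max(pop, key=fitness)

# Example: maximize the number of 1-bits ("OneMax"); fitness is simply sum of the bits.
best = genetic_algorithm(sum)
print(sum(best), best)
```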
ERIC Educational Resources Information Center
Hughes, John; And Others
This report provides a description of a Computer Aided Training System Development and Management (CATSDM) environment based on state-of-the-art hardware and software technology, and including recommendations for off the shelf systems to be utilized as a starting point in addressing the particular systematic training and instruction design and…
ERIC Educational Resources Information Center
Frick, Theodore W.; And Others
The document is part of the final report on Project STEEL (Special Teacher Education and Evaluation Laboratory) intended to extend the utilization of technology in the training of preservice special education teachers. This volume focuses on the second of four project objectives, the development of a special education teacher computer literacy…
Civil propulsion technology for the next twenty-five years
NASA Technical Reports Server (NTRS)
Rosen, Robert; Facey, John R.
1987-01-01
The next twenty-five years will see major advances in civil propulsion technology that will result in completely new aircraft systems for domestic, international, commuter and high-speed transports. These aircraft will include advanced aerodynamic, structural, and avionic technologies resulting in major new system capabilities and economic improvements. Propulsion technologies will include high-speed turboprops in the near term, very high bypass ratio turbofans, high efficiency small engines and advanced cycles utilizing high temperature materials for high-speed propulsion. Key fundamental enabling technologies include increased temperature capability and advanced design methods. Increased temperature capability will be based on improved composite materials such as metal matrix, intermetallics, ceramics, and carbon/carbon as well as advanced heat transfer techniques. Advanced design methods will make use of advances in internal computational fluid mechanics, reacting flow computation, computational structural mechanics and computational chemistry. The combination of advanced enabling technologies, new propulsion concepts and advanced control approaches will provide major improvements in civil aircraft.
Past, Present, and Future Trends in Teaching Clinical Skills through Web-Based Learning Environments
ERIC Educational Resources Information Center
Coe Regan, Jo Ann R.; Youn, Eric J.
2008-01-01
Distance education in social work has grown significantly due to the use of interactive television and computer networks. Given the recent developments in delivering distance education utilizing Web-based technology, this article presents a literature review focused on identifying generational trends in the development of Web-based learning…
2013-01-01
Background: Incorporation of information technology advancements in healthcare has gained wide acceptance in the last two decades. Developed countries have successfully incorporated information technology advancements in their healthcare systems, thus improving healthcare. However, only a limited application of information technology advancements is seen in the healthcare systems of developing countries. Hence, this study was aimed at assessing knowledge and utilization of computers among health workers in Addis Ababa hospitals. Methods: A quantitative cross-sectional study was conducted among 304 health workers who were selected using a stratified sampling technique from all governmental hospitals in Addis Ababa. Data were collected from April 15 to April 30, 2010 using a structured, self-administered, and pre-tested questionnaire from five government hospitals in Addis Ababa. The data were entered into Epi Info version 3.5.1 and exported to SPSS version 16. Analysis was done using the multinomial logistic regression technique. Results: A total of 270 participants, age ranging from 21 to 60 years, responded to the survey (88.8% response rate). A total of 91 (33.7%) respondents had adequate knowledge of computers, while 108 (40.0%) had fair knowledge and 71 (26.3%) of the respondents showed inadequate knowledge. A total of 38 (14.1%) were adequately utilizing computers, 14 (5.2%) demonstrated average or fair utilization, and the majority of the respondents, 218 (80.7%), inadequately utilized computers. Significant predictor variables were average monthly income, job satisfaction index, and own computer possession. Conclusions: Computer knowledge and utilization habits of health workers were found to be very low. Increasing accessibility to computers and delivering training on the use of computers for workers will increase the knowledge and utilization of computers. This will facilitate the rate of diffusion of the technology to the health sector. Hence, programs targeted at enhancing knowledge and skill of computer use and increasing access to computers should be designed. The association between computer knowledge/skill and health care delivery competence should be studied. PMID:23514191
Utilization of communication technology by patients enrolled in substance abuse treatment
McClure, Erin A.; Acquavita, Shauna; Harding, Emily; Stitzer, Maxine
2012-01-01
Background Technology-based applications represent a promising method for providing efficacious, widely available interventions to substance abuse treatment patients. However, limited access to communication technology (i.e., mobile phones, computers, internet, and e-mail) could significantly impact the feasibility of these efforts, and little is known regarding technology utilization in substance abusing populations. Methods A survey was conducted to characterize utilization of communication technology in 266 urban, substance abuse treatment patients enrolled at eight drug-free, psychosocial or opioid-replacement therapy clinics. Results Survey participants averaged 41 years of age and 57% had a yearly household income of less than $15,000. The vast majority reported access to a mobile phone (91%) and to SMS text messaging (79%). Keeping a consistent mobile phone number and a yearly mobile contract was more common among White participants, those with higher education, and those enrolled in drug-free, psychosocial treatment. Internet, e-mail, and computer use was much lower (39–45%), with younger age, higher education, and higher income predicting greater use. No such differences existed for the use of mobile phones, however. Conclusions Concern regarding the digital divide for marginalized populations appears to be disappearing with respect to mobile phones, but still exists for computer, internet, and e-mail access and use. Results suggest that mobile phone and texting applications may be feasibly applied for use in program-client interactions in substance abuse treatment. Careful consideration should be given to frequent phone number changes, access to technology, and motivation to engage with communication technology for treatment purposes. PMID:23107600
Utilization of communication technology by patients enrolled in substance abuse treatment.
McClure, Erin A; Acquavita, Shauna P; Harding, Emily; Stitzer, Maxine L
2013-04-01
Technology-based applications represent a promising method for providing efficacious, widely available interventions to substance abuse treatment patients. However, limited access to communication technology (i.e., mobile phones, computers, internet, and e-mail) could significantly impact the feasibility of these efforts, and little is known regarding technology utilization in substance abusing populations. A survey was conducted to characterize utilization of communication technology in 266 urban, substance abuse treatment patients enrolled at eight drug-free, psychosocial or opioid-replacement therapy clinics. Survey participants averaged 41 years of age and 57% had a yearly household income of less than $15,000. The vast majority reported access to a mobile phone (91%) and to SMS text messaging (79%). Keeping a consistent mobile phone number and a yearly mobile contract was more common among White participants, those with higher education, and those enrolled in drug-free, psychosocial treatment. Internet, e-mail, and computer use was much lower (39-45%), with younger age, higher education, and higher income predicting greater use. No such differences existed for the use of mobile phones, however. Concern regarding the digital divide for marginalized populations appears to be disappearing with respect to mobile phones, but still exists for computer, internet, and e-mail access and use. Results suggest that mobile phone and texting applications may be feasibly applied for use in program-client interactions in substance abuse treatment. Careful consideration should be given to frequent phone number changes, access to technology, and motivation to engage with communication technology for treatment purposes. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
The Use of a Computer Simulation to Promote Scientific Conceptions of Moon Phases
ERIC Educational Resources Information Center
Bell, Randy L.; Trundle, Kathy Cabe
2008-01-01
This study described the conceptual understandings of 50 early childhood (Pre-K-3) preservice teachers about standards-based lunar concepts before and after inquiry-based instruction utilizing educational technology. The instructional intervention integrated the planetarium software "Starry Night Backyard[TM]" with instruction on moon phases from…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oxstrand, Johanna
The Nuclear Electronic Work Packages - Enterprise Requirements (NEWPER) initiative is a step toward a vision of implementing an eWP framework that includes many types of eWPs. This will enable immediate paper-related cost savings in work management and provide a path to future labor efficiency gains through enhanced integration and process improvement in support of the Nuclear Promise (Nuclear Energy Institute 2016). The NEWPER initiative was organized by the Nuclear Information Technology Strategic Leadership (NITSL) group, which is an organization that brings together leaders from the nuclear utility industry and regulatory agencies to address issues involved with information technology used in nuclear-power utilities. NITSL strives to maintain awareness of industry information technology-related initiatives and events and communicates those events to its membership. NITSL and LWRS Program researchers have been coordinating activities, including joint organization of NEWPER-related meetings and report development. The main goal of the NEWPER initiative was to develop a set of utility-generic functional requirements for eWP systems. This set of requirements will support each utility in its process of identifying plant-specific functional and non-functional requirements. The NEWPER initiative has 140 members; the largest group consists of 19 commercial U.S. nuclear utilities and eleven of the most prominent vendors of eWP solutions. Through the NEWPER initiative, two sets of functional requirements were developed: functional requirements for electronic work packages and functional requirements for computer-based procedures. This paper describes the development process and summarizes the requirements.
Offodile, Anaeze C; Chatterjee, Abhishek; Vallejo, Sergio; Fisher, Carla S; Tchou, Julia C; Guo, Lifei
2015-04-01
Computed tomographic angiography is a diagnostic tool increasingly used for preoperative vascular mapping in abdomen-based perforator flap breast reconstruction. This study compared the use of computed tomographic angiography and the conventional practice of Doppler ultrasonography only in postmastectomy reconstruction using a cost-utility model. Following a comprehensive literature review, a decision analytic model was created using the three most clinically relevant health outcomes in free autologous breast reconstruction with computed tomographic angiography versus Doppler ultrasonography only. Cost and utility estimates for each health outcome were used to derive the quality-adjusted life-years and incremental cost-utility ratio. One-way sensitivity analysis was performed to scrutinize the robustness of the authors' results. Six studies and 782 patients were identified. Cost-utility analysis revealed a baseline cost savings of $3179 and a gain in quality-adjusted life-years of 0.25. This yielded an incremental cost-utility ratio of -$12,716, implying a dominant choice favoring preoperative computed tomographic angiography. Sensitivity analysis revealed that computed tomographic angiography was costlier when the operative time difference between the two techniques was less than 21.3 minutes. However, the clinical advantage of computed tomographic angiography over Doppler ultrasonography only meant that computed tomographic angiography would still remain the cost-effective option even if it offered no additional operating time advantage. The authors' results show that computed tomographic angiography is a cost-effective technology for identifying lower abdominal perforators for autologous breast reconstruction. Although the perfect study would be a randomized controlled trial of the two approaches with true cost accrual, the authors' results represent the best available evidence.
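As a quick check of the arithmetic behind the reported result, the incremental cost-utility ratio follows directly from the baseline estimates quoted in the abstract; a minimal sketch using only the figures stated above, not recomputed from the underlying studies.

```python
# Worked check of the incremental cost-utility ratio (ICUR) reported above.
incremental_cost = -3179.0   # CTA saves $3,179 relative to Doppler ultrasonography only
incremental_qaly = 0.25      # gain in quality-adjusted life-years with CTA

icur = incremental_cost / incremental_qaly
print(f"ICUR = {icur:,.0f} dollars per QALY gained")  # -12,716: CTA dominates (cheaper and more effective)
```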
Fast data reconstructed method of Fourier transform imaging spectrometer based on multi-core CPU
NASA Astrophysics Data System (ADS)
Yu, Chunchao; Du, Debiao; Xia, Zongze; Song, Li; Zheng, Weijian; Yan, Min; Lei, Zhenggang
2017-10-01
An imaging spectrometer can acquire a two-dimensional spatial image and a one-dimensional spectrum at the same time, which makes it highly useful in color and spectral measurements, true-color image synthesis, military reconnaissance, and so on. In order to realize fast reconstruction of Fourier transform imaging spectrometer data, this paper designed an optimized reconstruction algorithm using OpenMP parallel computing technology, which was further applied to the optimization of processing for the HyperSpectral Imager of the Chinese `HJ-1' satellite. The results show that the method, based on multi-core parallel computing technology, can manage multi-core CPU hardware resources effectively and significantly improve the efficiency of spectrum reconstruction processing. If the technique is applied to workstations with more cores for parallel computing, it will be possible to complete real-time processing of Fourier transform imaging spectrometer data with a single computer.
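A minimal sketch of the parallel reconstruction idea follows; the original work used OpenMP on a multi-core CPU, so Python's multiprocessing stands in here, and the FFT-of-interferogram step is a deliberate simplification of the full reconstruction chain (apodization, phase correction, and so on).

```python
# Simplified sketch: reconstruct spectra from interferograms in parallel across CPU cores.
# The actual reconstruction algorithm is richer than this single FFT step.
import numpy as np
from multiprocessing import Pool

def reconstruct_spectrum(interferogram):
    # Spectrum magnitude via FFT of the mean-removed interferogram
    return np.abs(np.fft.rfft(interferogram - interferogram.mean()))

if __name__ == "__main__":
    cube = np.random.rand(512, 1024)          # hypothetical stack of 512 pixel interferograms
    with Pool() as pool:                      # one worker per available core
        spectra = pool.map(reconstruct_spectrum, cube)
    print(len(spectra), spectra[0].shape)     # 512 spectra, each of length 513
```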
Computer Utilization in Industrial Arts/Technology Education. Curriculum Guide.
ERIC Educational Resources Information Center
Connecticut Industrial Arts Association.
This guide is intended to assist industrial arts/technology education teachers in helping students in grades K-12 understand the impact of computers and computer technology in the world. Discussed in the introductory sections are the ways in which computers have changed the face of business, industry, and education and training; the scope and…
Improvements in the efficiency of turboexpanders in cryogenic applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Agahi, R.R.; Lin, M.C.; Ershaghi, B.
1996-12-31
Process designers have utilized turboexpanders in cryogenic processes because of their higher thermal efficiencies when compared with conventional refrigeration cycles. Process design and equipment performance have improved substantially through the utilization of modern technologies. Turboexpander manufacturers have also adopted computational fluid dynamics software, computer numerical control technology, and holography techniques to further improve an already impressive turboexpander efficiency performance. In this paper, the authors explain the design process of the turboexpander utilizing modern technology. Two cases of turboexpanders processing helium (4.35 K) and hydrogen (56 K) will be presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bhattarai, Bishnu P.; Gentle, Jake P.; Hill, Porter
Overhead transmission lines (TLs) are conventionally given seasonal ratings based on conservative environmental assumptions. Such an approach often results in underutilization of the line ampacity, as the worst conditions prevail only for a short period over a year/season. We present dynamic line rating (DLR) as an enabling smart grid technology that adaptively computes ratings of TLs based on local weather conditions to utilize the additional headroom of existing lines. In particular, a general line ampacity state solver utilizes measured weather data to compute the real-time thermal rating of the TLs. The performance of the presented method is demonstrated from a field study of DLR technology implementation on four TL segments at AltaLink, Canada. The performance is evaluated and quantified by comparing the existing static and proposed dynamic line ratings, and by assessing the potential benefits of DLR for enhanced transmission asset utilization. For the given line segments, the proposed DLR results in real-time ratings above the seasonal static ratings most of the time (up to 95.1% of the time), with a mean increase of 72% over the static rating.
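A minimal sketch of the steady-state calculation at the heart of a line ampacity solver: the rating is the current whose resistive heating balances the weather-dependent heat gains and losses. The numeric values and simplified heat terms below are placeholders, not data from the AltaLink study; a real DLR solver derives them from measured wind, ambient temperature, and solar irradiance (for example, per IEEE 738).

```python
# Simplified steady-state ampacity: I = sqrt((q_convective + q_radiative - q_solar) / R(T_cond)).
# The heat terms below are placeholders; a real DLR solver computes them from measured
# wind speed/direction, ambient temperature, and solar irradiance for each weather sample.
import math

def ampacity(q_convective, q_radiative, q_solar, r_conductor):
    """Current (A) that holds the conductor at its maximum allowable temperature."""
    return math.sqrt(max(q_convective + q_radiative - q_solar, 0.0) / r_conductor)

# Hypothetical per-metre values for one weather sample
print(round(ampacity(q_convective=35.0, q_radiative=12.0, q_solar=8.0, r_conductor=9e-5)), "A")
```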
Dinov, Ivo D; Sanchez, Juana; Christou, Nicolas
2008-01-01
Technology-based instruction represents a recent pedagogical paradigm that is rooted in the realization that new generations are much more comfortable with, and excited about, new technologies. The rapid technological advancement over the past decade has fueled an enormous demand for the integration of modern networking, informational and computational tools with classical pedagogical instruments. Consequently, teaching with technology typically involves utilizing a variety of IT and multimedia resources for online learning, course management, electronic course materials, and novel tools of communication, engagement, experimentation, critical thinking and assessment. The NSF-funded Statistics Online Computational Resource (SOCR) provides a number of interactive tools for enhancing instruction in various undergraduate and graduate courses in probability and statistics. These resources include online instructional materials, statistical calculators, interactive graphical user interfaces, computational and simulation applets, and tools for data analysis and visualization. The tools provided as part of SOCR include conceptual simulations and statistical computing interfaces, which are designed to bridge between the introductory and the more advanced computational and applied probability and statistics courses. In this manuscript, we describe our designs for utilizing SOCR technology in instruction in a recent study. In addition, we present the results of the effectiveness of using SOCR tools at two different course intensity levels on three outcome measures: exam scores, student satisfaction, and choice of technology to complete assignments. Learning styles assessment was completed at baseline. We have used three very different designs for three different undergraduate classes. Each course included a treatment group, using the SOCR resources, and a control group, using classical instruction techniques. Our findings include marginal effects of the SOCR treatment for individual classes; however, pooling the results across all courses and sections, SOCR effects on the treatment groups were exceptionally robust and significant. Coupling these findings with a clear decrease in the variance of the quantitative examination measures in the treatment groups indicates that employing technology, like SOCR, in a sound pedagogical and scientific manner enhances the students' overall understanding and suggests better long-term knowledge retention.
Dinov, Ivo D.; Sanchez, Juana; Christou, Nicolas
2009-01-01
Technology-based instruction represents a recent pedagogical paradigm that is rooted in the realization that new generations are much more comfortable with, and excited about, new technologies. The rapid technological advancement over the past decade has fueled an enormous demand for the integration of modern networking, informational and computational tools with classical pedagogical instruments. Consequently, teaching with technology typically involves utilizing a variety of IT and multimedia resources for online learning, course management, electronic course materials, and novel tools of communication, engagement, experimentation, critical thinking and assessment. The NSF-funded Statistics Online Computational Resource (SOCR) provides a number of interactive tools for enhancing instruction in various undergraduate and graduate courses in probability and statistics. These resources include online instructional materials, statistical calculators, interactive graphical user interfaces, computational and simulation applets, and tools for data analysis and visualization. The tools provided as part of SOCR include conceptual simulations and statistical computing interfaces, which are designed to bridge between the introductory and the more advanced computational and applied probability and statistics courses. In this manuscript, we describe our designs for utilizing SOCR technology in instruction in a recent study. In addition, we present the results of the effectiveness of using SOCR tools at two different course intensity levels on three outcome measures: exam scores, student satisfaction, and choice of technology to complete assignments. Learning styles assessment was completed at baseline. We have used three very different designs for three different undergraduate classes. Each course included a treatment group, using the SOCR resources, and a control group, using classical instruction techniques. Our findings include marginal effects of the SOCR treatment for individual classes; however, pooling the results across all courses and sections, SOCR effects on the treatment groups were exceptionally robust and significant. Coupling these findings with a clear decrease in the variance of the quantitative examination measures in the treatment groups indicates that employing technology, like SOCR, in a sound pedagogical and scientific manner enhances the students' overall understanding and suggests better long-term knowledge retention. PMID:19750185
Generic Divide and Conquer Internet-Based Computing
NASA Technical Reports Server (NTRS)
Follen, Gregory J. (Technical Monitor); Radenski, Atanas
2003-01-01
The growth of Internet-based applications and the proliferation of networking technologies have been transforming traditional commercial application areas as well as computer and computational sciences and engineering. This growth stimulates the exploration of Peer to Peer (P2P) software technologies that can open new research and application opportunities not only for the commercial world, but also for the scientific and high-performance computing applications community. The general goal of this project is to achieve better understanding of the transition to Internet-based high-performance computing and to develop solutions for some of the technical challenges of this transition. In particular, we are interested in creating long-term motivation for end users to provide their idle processor time to support computationally intensive tasks. We believe that a practical P2P architecture should provide useful service to both clients with high-performance computing needs and contributors of lower-end computing resources. To achieve this, we are designing a dual-service architecture for P2P high-performance divide-and-conquer computing; we are also experimenting with a prototype implementation. Our proposed architecture incorporates a master server, utilizes dual satellite servers, and operates on the Internet in a dynamically changing large configuration of lower-end nodes provided by volunteer contributors. A dual satellite server comprises a high-performance computing engine and a lower-end contributor service engine. The computing engine provides generic support for divide-and-conquer computations. The service engine is intended to provide free useful HTTP-based services to contributors of lower-end computing resources. Our proposed architecture is complementary to and accessible from computational grids, such as Globus, Legion, and Condor. Grids provide remote access to existing higher-end computing resources; in contrast, our goal is to utilize idle processor time of lower-end Internet nodes. Our project is focused on a generic divide-and-conquer paradigm and on mobile applications of this paradigm that can operate on a loose and ever-changing pool of lower-end Internet nodes.
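For illustration, a minimal single-machine sketch of the generic divide-and-conquer pattern that such an architecture would distribute; in the proposed P2P setting the recursive calls would be farmed out to volunteer nodes rather than evaluated locally.

```python
# Generic divide-and-conquer skeleton; in a P2P setting the recursive calls would be
# dispatched to contributor nodes instead of being evaluated locally.
def divide_and_conquer(problem, is_base, solve_base, divide, combine):
    if is_base(problem):
        return solve_base(problem)
    return combine([divide_and_conquer(p, is_base, solve_base, divide, combine)
                    for p in divide(problem)])

# Example instantiation: summing a list by repeatedly halving it.
total = divide_and_conquer(
    list(range(1000)),
    is_base=lambda xs: len(xs) <= 8,
    solve_base=sum,
    divide=lambda xs: (xs[:len(xs) // 2], xs[len(xs) // 2:]),
    combine=sum,
)
print(total)  # 499500
```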
An Overview of the NASA Aerospace Flight Battery Systems Program
NASA Technical Reports Server (NTRS)
Manzo, Michelle
2003-01-01
Develop an understanding of the safety issues relating to space use and qualification of new Li-Ion technology for manned applications. Enable use of new technology batteries in GFE equipment (laptop computers, camcorders). Establish a database for an optimized set of cells (and batteries) exhibiting acceptable performance and abuse characteristics for utilization as building blocks for numerous applications.
Computers in medicine: liability issues for physicians.
Hafner, A W; Filipowicz, A B; Whitely, W P
1989-07-01
Physicians routinely use computers to store, access, and retrieve medical information. As computer use becomes even more widespread in medicine, failure to utilize information systems may be seen as a violation of professional custom and lead to findings of professional liability. Even when a technology is not widespread, failure to incorporate it into medical practice may give rise to liability if the technology is accessible to the physician and reduces risk to the patient. Improvement in the availability of medical information sources imposes a greater burden on the physician to keep current and to obtain informed consent from patients. To routinely perform computer-assisted literature searches for informed consent and diagnosis is 'good medicine'. Clinical and diagnostic applications of computer technology now include computer-assisted decision making with the aid of sophisticated databases. Although such systems will expand the knowledge base and competence of physicians, malfunctioning software raises a major liability question. Also, complex computer-driven technology is used in direct patient care. Defective or improperly used hardware or software can lead to patient injury, thus raising additional complicated questions of professional liability and product liability.
Weighted Description Logics Preference Formulas for Multiattribute Negotiation
NASA Astrophysics Data System (ADS)
Ragone, Azzurra; di Noia, Tommaso; Donini, Francesco M.; di Sciascio, Eugenio; Wellman, Michael P.
We propose a framework to compute the utility of an agreement with respect to a preference set in a negotiation process. In particular, we refer to preferences expressed as weighted formulas in a decidable fragment of First-order Logic and agreements expressed as a formula. We ground our framework in Description Logics (DL) endowed with disjunction, to be compliant with Semantic Web technologies. A logic-based approach to preference representation allows one, when a background knowledge base is exploited, to relax the often unrealistic assumption of additive independence among attributes. We provide suitable definitions of the problem and present algorithms to compute utility in our setting. We also validate our approach through an experimental evaluation.
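A propositional toy sketch of the utility computation described above: the utility of an agreement is the sum of the weights of the preference formulas it satisfies. The paper evaluates satisfaction against a Description Logic knowledge base; here each preference is reduced to a boolean check over agreed attribute values, which is an illustrative simplification.

```python
# Sketch: utility of an agreement = sum of the weights of the preferences it satisfies.
# Each preference is a (weight, predicate) pair over a dictionary of agreed attributes;
# the paper instead checks satisfaction w.r.t. a DL knowledge base.
preferences = [
    (0.5, lambda a: a["warranty_months"] >= 24),
    (0.3, lambda a: a["price"] <= 1000),
    (0.2, lambda a: a["delivery_days"] <= 7),
]

def utility(agreement, weighted_prefs):
    return sum(w for w, holds in weighted_prefs if holds(agreement))

print(utility({"warranty_months": 36, "price": 1200, "delivery_days": 5}, preferences))  # 0.7
```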
Using E-Learning and ICT Courses in Educational Environment: A Review
ERIC Educational Resources Information Center
Salehi, Hadi; Shojaee, Mohammad; Sattar, Susan
2015-01-01
With the quick emergence of computers and related technology, Electronic-learning (E-learning) and Information Communication and Technology (ICT) have been extensively utilized in the education and training field. Miscellaneous methods of integrating computer technology and the context in which computers are used have affected student learning in…
ERIC Educational Resources Information Center
Barker, Philip
1986-01-01
Discussion of developments in information storage technology likely to have significant impact upon library utilization focuses on hardware (videodisc technology) and software developments (knowledge databases; computer networks; database management systems; interactive video, computer, and multimedia user interfaces). Three generic computer-based…
Gross, Colin A; Reddy, Chandan K; Dazzo, Frank B
2010-02-01
Quantitative microscopy and digital image analysis are underutilized in microbial ecology, largely because of the laborious task of segmenting foreground object pixels from background, especially in complex color micrographs of environmental samples. In this paper, we describe an improved computing technology developed to alleviate this limitation. The system's uniqueness is its ability to edit digital images accurately when presented with the difficult yet commonplace challenge of removing background pixels whose three-dimensional color space overlaps the range that defines foreground objects. Image segmentation is accomplished by utilizing algorithms that address color and spatial relationships of user-selected foreground object pixels. Performance of the color segmentation algorithm evaluated on 26 complex micrographs at single pixel resolution had an overall pixel classification accuracy of 99+%. Several applications illustrate how this improved computing technology can successfully resolve numerous challenges of complex color segmentation in order to produce images from which quantitative information can be accurately extracted, thereby gaining new perspectives on the in situ ecology of microorganisms. Examples include improvements in the quantitative analysis of (1) microbial abundance and phylotype diversity of single cells classified by their discriminating color within heterogeneous communities, (2) cell viability, (3) spatial relationships and intensity of bacterial gene expression involved in cellular communication between individual cells within rhizoplane biofilms, and (4) biofilm ecophysiology based on ribotype-differentiated radioactive substrate utilization. The stand-alone executable file plus user manual and tutorial images for this color segmentation computing application are freely available at http://cme.msu.edu/cmeias/ . This improved computing technology opens new opportunities for imaging applications where discriminating colors really matter most, thereby strengthening quantitative microscopy-based approaches to advance microbial ecology in situ at individual single-cell resolution.
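A minimal sketch of the color-classification idea (nearest-color assignment of pixels to user-selected foreground samples within a tolerance); the application described above additionally exploits spatial relationships and interactive editing, so this is only an illustrative reduction.

```python
# Sketch: classify pixels as foreground if they lie close (in RGB space) to any
# user-selected foreground sample color; everything else becomes background.
import numpy as np

def segment_by_color(image, foreground_samples, tolerance=30.0):
    """image: HxWx3 uint8 array; foreground_samples: list of (R, G, B) picked by the user."""
    pixels = image.reshape(-1, 3).astype(float)
    samples = np.asarray(foreground_samples, dtype=float)
    # Distance from every pixel to its nearest foreground sample color
    dists = np.linalg.norm(pixels[:, None, :] - samples[None, :, :], axis=2).min(axis=1)
    return (dists <= tolerance).reshape(image.shape[:2])  # boolean foreground mask

img = np.random.randint(0, 256, (64, 64, 3), dtype=np.uint8)    # stand-in micrograph
mask = segment_by_color(img, foreground_samples=[(0, 200, 0)])  # e.g., green-stained cells
print(mask.sum(), "pixels classified as foreground")
```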
A qualitative study of technophobic students' reactions to a technology-rich college science course
NASA Astrophysics Data System (ADS)
Guttschow, Gena Lee
The use of technology in education has grown rapidly in the last 20 years. In fact, many of today's college students have had some sort of computer in their elementary school classrooms. One might think that this consistent exposure to computers would foster positive attitudes about computers, but this is not always the case. Currently, a substantial number of college students dislike interacting with technology. People who dislike interacting with technology are often referred to as "technophobic". Technophobic people have negative thoughts and feelings about technology, and they often have a desire to avoid interaction with technology. Technophobic students' negative feelings about technology have the potential to interfere with their learning when technology is utilized as a tool for instruction of school subjects. As computer use becomes prevalent and in many instances mandatory in education, the issue of technophobia increasingly needs to be understood and addressed. This is a qualitative study designed with the intent of gaining an understanding of the experiences of technophobic students who are required to use technology to learn science in a college class. Six developmental college students enrolled in a computer-based anatomy and physiology class were chosen to participate in the study based on their high technophobia scores. They were interviewed three times during the quarter and videotaped once. The interview data were transcribed, coded, and analyzed. The analysis resulted in six case studies describing each participant's experience and 11 themes representing overlapping areas in the participants' worlds of experience. A discussion of the themes, the meaning they hold for me as a science educator, and how they relate to the existing literature is presented. The participants' descriptions of their experiences showed that the technophobic students did use the computers and learned skills when they had to in order to complete assignments. It was also revealed that the technophobic participants' negative attitudes did not improve after learning computer skills. Lastly, based on the participants' experiences, it seems important to start a class with step-by-step computer training, teaching foundational computer skills, and slowly progress toward autonomous computer exploration.
Dunne, James R; McDonald, Claudia L
2010-07-01
Pulse!! The Virtual Clinical Learning Lab at Texas A&M University-Corpus Christi, in collaboration with the United States Navy, has developed a model for research and technological development that they believe is an essential element in the future of military and civilian medical education. The Pulse!! project models a strategy for providing cross-disciplinary expertise and resources to educational, governmental, and business entities challenged with meeting looming health care crises. It includes a three-dimensional virtual learning platform that provides unlimited, repeatable, immersive clinical experiences without risk to patients, and is available anywhere there is a computer. Pulse!! utilizes expertise in the fields of medicine, medical education, computer science, software engineering, physics, computer animation, art, and architecture. Lab scientists collaborate with the commercial virtual-reality simulation industry to produce research-based learning platforms based on cutting-edge computer technology.
Utilization of the Space Vision System as an Augmented Reality System For Mission Operations
NASA Technical Reports Server (NTRS)
Maida, James C.; Bowen, Charles
2003-01-01
Augmented reality is a technique whereby computer generated images are superimposed on live images for visual enhancement. Augmented reality can also be characterized as dynamic overlays when computer generated images are registered with moving objects in a live image. This technique has been successfully implemented, with low to medium levels of registration precision, in an NRA funded project entitled, "Improving Human Task Performance with Luminance Images and Dynamic Overlays". Future research is already being planned to also utilize a laboratory-based system where more extensive subject testing can be performed. However successful this might be, the problem will still be whether such a technology can be used with flight hardware. To answer this question, the Canadian Space Vision System (SVS) will be tested as an augmented reality system capable of improving human performance where the operation requires indirect viewing. This system has already been certified for flight and is currently flown on each shuttle mission for station assembly. Successful development and utilization of this system in a ground-based experiment will expand its utilization for on-orbit mission operations. Current research and development regarding the use of augmented reality technology is being simulated using ground-based equipment. This is an appropriate approach for development of symbology (graphics and annotation) optimal for human performance and for development of optimal image registration techniques. It is anticipated that this technology will become more pervasive as it matures. Because we know what and where almost everything is on ISS, this reduces the registration problem and improves the computer model of that reality, making augmented reality an attractive tool, provided we know how to use it. This is the basis for current research in this area. However, there is a missing element to this process. It is the link from this research to the current ISS video system and to flight hardware capable of utilizing this technology. This is the basis for this proposed Space Human Factors Engineering project, the determination of the display symbology within the performance limits of the Space Vision System that will objectively improve human performance. This utilization of existing flight hardware will greatly reduce the costs of implementation for flight. Besides being used onboard shuttle and space station and as a ground-based system for mission operational support, it also has great potential for science and medical training and diagnostics, remote learning, team learning, video/media conferencing, and educational outreach.
NASA Astrophysics Data System (ADS)
Langer-Osuna, Jennifer
2015-03-01
This paper draws on the constructs of hybridity, figured worlds, and cultural capital to examine how a group of African-American students in a technology-driven, project-based algebra classroom utilized the computer as a resource to coordinate personal and mathematical positional identities during group work. Analyses of several vignettes of small group dynamics highlight how hybridity was established as the students engaged in multiple on-task and off-task computer-based activities, each of which drew on different lived experiences and forms of cultural capital. The paper ends with a discussion on how classrooms that make use of student-led collaborative work, and where students are afforded autonomy, have the potential to support the academic engagement of students from historically marginalized communities.
Neuromorphic Computing for Very Large Test and Evaluation Data Analysis
2014-05-01
analysis and utilization of newly available hardware-based artificial neural network chips. These two aspects of the program are complementary. The...neuromorphic architectures research focused on long-term disruptive technologies with high risk but revolutionary potential. The hardware-based neural...today. Overall, hardware-based neural processing research allows us to study the fundamental system and architectural issues relevant for employing
NASA Technical Reports Server (NTRS)
Vallee, J.; Wilson, T.
1976-01-01
Results are reported of the first experiments for a computer conference management information system at the National Aeronautics and Space Administration. Between August 1975 and March 1976, two NASA projects with geographically separated participants (NASA scientists) used the PLANET computer conferencing system for portions of their work. The first project was a technology assessment of future transportation systems. The second project involved experiments with the Communication Technology Satellite. As part of this project, pre- and postlaunch operations were discussed in a computer conference. These conferences also provided the context for an analysis of the cost of computer conferencing. In particular, six cost components were identified: (1) terminal equipment, (2) communication with a network port, (3) network connection, (4) computer utilization, (5) data storage and (6) administrative overhead.
[Research of controlling of smart home system based on P300 brain-computer interface].
Wang, Jinjia; Yang, Chengjie
2014-08-01
Using electroencephalogram (EEG) signals to control external devices has always been a research focus in the field of brain-computer interface (BCI). This is especially significant for those with disabilities who have lost the capacity for movement. In this paper, the P300-based BCI and microcontroller-based wireless radio frequency (RF) technology are utilized to design a smart home control system, which can be used to control household appliances, lighting systems, and security devices directly. Experiment results showed that the system was simple, reliable and easy to popularize.
Wind power for the electric-utility industry: Policy incentives for fuel conservation
NASA Astrophysics Data System (ADS)
March, F.; Dlott, E. H.; Korn, D. H.; Madio, F. R.; McArthur, R. C.; Vachon, W. A.
1982-06-01
A systematic method for evaluating the economics of solar-electric/conservation technologies as fuel-savings investments for electric utilities in the presence of changing federal incentive policies is presented. The focus is on wind energy conversion systems (WECS) as the solar technology closest to near-term large scale implementation. Commercially available large WECS are described, along with computer models to calculate the economic impact of the inclusion of WECS as 10% of the base-load generating capacity on a grid. A guide to legal structures and relationships which impinge on large-scale WECS utilization is developed, together with a quantitative examination of the installation of 1000 MWe of WECS capacity by a utility in the northeast states. Engineering and financial analyses were performed, with results indicating government policy changes necessary to encourage the entrance of utilities into the field of windpower utilization.
[Application of computer-assisted 3D imaging simulation for surgery].
Matsushita, S; Suzuki, N
1994-03-01
This article describes trends in the application of various imaging technologies in surgical planning, navigation, and computer-aided surgery. Imaging information is an essential factor for simulation in medicine. It includes three-dimensional (3D) image reconstruction, neurosurgical navigation, creating substantial (physical) models based on 3D imaging data, etc. These developments depend mostly on 3D imaging techniques, to which recent computer technology has contributed greatly. 3D imaging can offer new, intuitive information to physicians and surgeons, and this method is suitable for mechanical control. By utilizing simulated results, we can obtain more precise surgical orientation, estimation, and operation. For further advancement, automatic and high-speed recognition of medical imaging is being developed.
Synchronous and Asynchronous Text-Based CMC in Educational Contexts: A Review of Recent Research
ERIC Educational Resources Information Center
Johnson, Genevieve, Marie
2006-01-01
This paper presents a review of recent research that examines the relative instructional utility of text-based synchronous and asynchronous computer-mediated communication (CMC). As a mechanism for limiting the number of studies reviewed as well as controlling for emergent technologies, only research published since 2000 was reviewed. The goal was…
Design and Verification of Remote Sensing Image Data Center Storage Architecture Based on Hadoop
NASA Astrophysics Data System (ADS)
Tang, D.; Zhou, X.; Jing, Y.; Cong, W.; Li, C.
2018-04-01
The data center is a new concept of data processing and application proposed in recent years. It is a new processing method based on data, parallel computing, and compatibility with different hardware clusters. While optimizing the data storage management structure, it fully utilizes cluster computing nodes and improves the efficiency of parallel data applications. This paper used mature Hadoop technology to build a large-scale distributed image management architecture for remote sensing imagery. Using MapReduce parallel processing technology, it invokes many computing nodes to process image storage blocks and pyramids in the background, improving the efficiency of image reading and application and addressing the need for concurrent multi-user high-speed access to remotely sensed data. The rationality, reliability and superiority of the system design were verified by building an actual Hadoop service system, testing the storage efficiency for different image data and multiple users, and analyzing how the distributed storage architecture improves the application efficiency of remote sensing images.
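A minimal, single-process sketch of the map/reduce flow for tiling a scene and building a pyramid; in the system described above these phases would be expressed as Hadoop MapReduce jobs distributed across cluster nodes, so the in-memory dictionaries here are only stand-ins.

```python
# Sketch of the MapReduce flow for an image pyramid: the map phase cuts a scene into tiles
# and downsamples each one per pyramid level; the reduce phase groups tiles by
# (level, tile row, tile column) for storage. Hadoop would distribute both phases.
from collections import defaultdict

def map_phase(scene_id, scene, tile_size=256, levels=3):
    for level in range(levels):
        step = tile_size * (2 ** level)
        for row in range(0, len(scene), step):
            for col in range(0, len(scene[0]), step):
                tile = [r[col:col + step:2 ** level] for r in scene[row:row + step:2 ** level]]
                yield (level, row // step, col // step), (scene_id, tile)

def reduce_phase(mapped):
    store = defaultdict(list)
    for key, value in mapped:
        store[key].append(value)
    return store

scene = [[0] * 1024 for _ in range(1024)]                 # stand-in single-band image
store = reduce_phase(map_phase("HJ1_scene_001", scene))   # scene id is hypothetical
print(len(store), "tiles across all pyramid levels")      # 16 + 4 + 1 = 21 tiles
```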
Microcomputer-Based Intelligent Tutoring Systems: An Assessment.
ERIC Educational Resources Information Center
Schaffer, John William
Computer-assisted instruction, while familiar to most teachers, has failed to become an effective self-motivating instructional tool. Developments in artificial intelligence, however, have provided new and better tools for exploring human knowledge acquisition and utilization. Expert system technology represents one of the most promising of these…
Fast Image Subtraction Using Multi-cores and GPUs
NASA Astrophysics Data System (ADS)
Hartung, Steven; Shukla, H.
2013-01-01
Many important image processing techniques in astronomy require a massive number of computations per pixel. Among them is an image differencing technique known as Optimal Image Subtraction (OIS), which is very useful for detecting and characterizing transient phenomena. Like many image processing routines, OIS computations increase proportionally with the number of pixels being processed, and the number of pixels in need of processing is increasing rapidly. Utilizing many-core graphical processing unit (GPU) technology in a hybrid conjunction with multi-core CPU and computer clustering technologies, this work presents a new astronomy image processing pipeline architecture. The chosen OIS implementation focuses on the 2nd order spatially-varying kernel with the Dirac delta function basis, a powerful image differencing method that has seen limited deployment in part because of the heavy computational burden. This tool can process standard image calibration and OIS differencing in a fashion that is scalable with the increasing data volume. It employs several parallel processing technologies in a hierarchical fashion in order to best utilize each of their strengths. The Linux/Unix based application can operate on a single computer, or on an MPI configured cluster, with or without GPU hardware. With GPU hardware available, even low-cost commercial video cards, the OIS convolution and subtraction times for large images can be accelerated by up to three orders of magnitude.
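A minimal, constant-kernel sketch of the image-subtraction step: convolve the reference with a matching kernel, subtract from the science frame, and look for residual flux. The OIS method in the paper fits a second-order spatially varying kernel on the GPU; the single fixed Gaussian kernel below is an illustrative simplification.

```python
# Sketch of difference imaging: convolve the reference image with a matching kernel,
# subtract it from the science image, and look for residual (transient) flux.
import numpy as np
from scipy.signal import convolve2d

def difference_image(science, reference, kernel):
    matched = convolve2d(reference, kernel, mode="same", boundary="symm")
    return science - matched

x = np.arange(-3, 4)
g = np.exp(-x**2 / 2.0)
kernel = np.outer(g, g)
kernel /= kernel.sum()                      # unit-sum matching kernel

ref = np.random.rand(128, 128)
sci = convolve2d(ref, kernel, mode="same", boundary="symm")
sci[64, 64] += 5.0                          # inject a fake transient
diff = difference_image(sci, ref, kernel)
print(np.unravel_index(np.argmax(diff), diff.shape))  # (64, 64): the transient stands out
```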
Jensen, Jakob D; King, Andy J; Davis, LaShara A; Guntzviller, Lisa M
2010-09-01
To examine whether low-income adults' utilization of Internet technology is predicted or mediated by health literacy, health numeracy, and computer assistance. Low-income adults (N = 131) from the midwestern United States were surveyed about their technology access and use. Individuals with low health literacy skills were less likely to use Internet technology (e.g., email, search engines, and online health information seeking), and those with low health numeracy skills were less likely to have access to Internet technology (e.g., computers and cell phones). Consistent with past research, males, older participants, and those with less education were less likely to search for health information online. The relationship between age and online health information seeking was mediated by participant literacy. The present study suggests that significant advances in technology access and use could be sparked by developing technology interfaces that are accessible to individuals with limited literacy skills.
COMBAT: mobile-Cloud-based cOmpute/coMmunications infrastructure for BATtlefield applications
NASA Astrophysics Data System (ADS)
Soyata, Tolga; Muraleedharan, Rajani; Langdon, Jonathan; Funai, Colin; Ames, Scott; Kwon, Minseok; Heinzelman, Wendi
2012-05-01
The amount of data processed annually over the Internet has crossed the zettabyte boundary, yet this Big Data cannot be efficiently processed or stored using today's mobile devices. Parallel to this explosive growth in data, a substantial increase in mobile compute capability and the advances in cloud computing have brought the state of the art in mobile-cloud computing to an inflection point, where the right architecture may allow mobile devices to run applications utilizing Big Data and intensive computing. In this paper, we propose the MObile Cloud-based Hybrid Architecture (MOCHA), which formulates a solution to permit mobile-cloud computing applications such as object recognition in the battlefield by introducing a mid-stage compute and storage layer, called the cloudlet. MOCHA is built on the key observation that many mobile-cloud applications have the following characteristics: 1) they are compute-intensive, requiring the compute power of a supercomputer, and 2) they use Big Data, requiring a communications link to cloud-based database sources in near-real-time. In this paper, we describe the operation of MOCHA in battlefield applications, by formulating the aforementioned mobile device and cloudlet to be housed within a soldier's vest and inside a military vehicle, respectively, and enabling access to the cloud through high-latency satellite links. We provide simulations using the traditional mobile-cloud approach as well as utilizing MOCHA with a mid-stage cloudlet to quantify the utility of this architecture. We show that the MOCHA platform for mobile-cloud computing promises a future for critical battlefield applications that access Big Data, which is currently not possible using existing technology.
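A minimal sketch of the kind of offloading decision such a mobile-cloudlet-cloud architecture makes: estimate the response time of each tier and dispatch the task to the fastest one. The latency, bandwidth, and workload figures below are illustrative placeholders, not measurements from the MOCHA simulations.

```python
# Sketch: pick the execution tier with the lowest estimated response time.
# All numbers are illustrative placeholders, not values from the paper.
def response_time(cycles, data_bytes, cpu_hz, rtt_s, bandwidth_bps):
    return rtt_s + data_bytes * 8 / bandwidth_bps + cycles / cpu_hz

tiers = {
    "mobile":   response_time(5e9, 0,   cpu_hz=1.5e9, rtt_s=0.0,  bandwidth_bps=float("inf")),
    "cloudlet": response_time(5e9, 2e6, cpu_hz=3e10,  rtt_s=0.01, bandwidth_bps=5e7),
    "cloud":    response_time(5e9, 2e6, cpu_hz=3e11,  rtt_s=0.6,  bandwidth_bps=1e6),  # satellite link
}
print(min(tiers, key=tiers.get), tiers)  # the cloudlet wins under these assumed conditions
```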
An overview of computer-based natural language processing
NASA Technical Reports Server (NTRS)
Gevarter, W. B.
1983-01-01
Computer-based Natural Language Processing (NLP) is the key to enabling humans and their computer-based creations to interact with machines in natural language (like English, Japanese, German, etc., in contrast to formal computer languages). The doors that such an achievement can open have made this a major research area in Artificial Intelligence and Computational Linguistics. Commercial natural language interfaces to computers have recently entered the market, and the future looks bright for other applications as well. This report reviews the basic approaches to such systems, the techniques utilized, applications, the state of the art of the technology, issues and research requirements, the major participants, and finally, future trends and expectations. It is anticipated that this report will prove useful to engineering and research managers, potential users, and others who will be affected by this field as it unfolds.
Bioinformatics clouds for big data manipulation.
Dai, Lin; Gao, Xin; Guo, Yan; Xiao, Jingfa; Zhang, Zhang
2012-11-28
As advances in life sciences and information technology bring profound influences on bioinformatics due to its interdisciplinary nature, bioinformatics is experiencing a new leap-forward from in-house computing infrastructure into utility-supplied cloud computing delivered over the Internet, in order to handle the vast quantities of biological data generated by high-throughput experimental technologies. Albeit relatively new, cloud computing promises to address big data storage and analysis issues in the bioinformatics field. Here we review extant cloud-based services in bioinformatics, classify them into Data as a Service (DaaS), Software as a Service (SaaS), Platform as a Service (PaaS), and Infrastructure as a Service (IaaS), and present our perspectives on the adoption of cloud computing in bioinformatics. This article was reviewed by Frank Eisenhaber, Igor Zhulin, and Sandor Pongor.
Cloud based intelligent system for delivering health care as a service.
Kaur, Pankaj Deep; Chana, Inderveer
2014-01-01
The promising potential of cloud computing and its convergence with technologies such as mobile computing, wireless networks, and sensor technologies allows for the creation and delivery of new types of cloud services. In this paper, we advocate the use of cloud computing for the creation and management of cloud-based health care services. As a representative case study, we design a Cloud Based Intelligent Health Care Service (CBIHCS) that performs real-time monitoring of user health data for diagnosis of chronic illness such as diabetes. Advanced body sensor components are utilized to gather user-specific health data and store it in cloud-based storage repositories for subsequent analysis and classification. In addition, infrastructure-level mechanisms are proposed to provide dynamic resource elasticity for CBIHCS. Experimental results demonstrate that classification accuracy of 92.59% is achieved with our prototype system and that the predicted patterns of CPU usage offer better opportunities for adaptive resource elasticity. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
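A minimal sketch of the classification step described above, with synthetic data; the feature set, the SVM choice, and the use of scikit-learn are illustrative assumptions rather than the paper's actual pipeline, which reports 92.59% accuracy on its own prototype system.

```python
# Sketch: classify sensor-derived health records as diabetic / non-diabetic.
# Features, labels, and the SVM choice are synthetic and illustrative only.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))                 # e.g., glucose, blood pressure, heart rate
y = (X[:, 0] > 0.5).astype(int)               # synthetic "diabetic" label tied to glucose

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = SVC(kernel="rbf").fit(X_tr, y_tr)
print("accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```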
Tse computers. [ultrahigh speed optical processing for two dimensional binary image
NASA Technical Reports Server (NTRS)
Schaefer, D. H.; Strong, J. P., III
1977-01-01
An ultra-high-speed computer that utilizes binary images as its basic computational entity is being developed. The basic logic components perform thousands of operations simultaneously. Technologies of the fiber optics, display, thin film, and semiconductor industries are being utilized in the building of the hardware.
Health Information Technology as a Universal Donor to Bioethics Education.
Goodman, Kenneth W
2017-04-01
Health information technology, sometimes called biomedical informatics, is the use of computers and networks in the health professions. This technology has become widespread, from electronic health records to decision support tools to patient access through personal health records. These computational and information-based tools have engendered their own ethics literature and now present an opportunity to shape the standard medical and nursing ethics curricula. It is suggested that each of four core components in the professional education of clinicians-privacy, end-of-life care, access to healthcare and valid consent, and clinician-patient communication-offers an opportunity to leverage health information technology for curricular improvement. Using informatics in ethics education freshens ethics pedagogy and increases its utility, and does so without additional demands on overburdened curricula.
Impact of Collaborative Work on Technology Acceptance: A Case Study from Virtual Computing
ERIC Educational Resources Information Center
Konak, Abdullah; Kulturel-Konak, Sadan; Nasereddin, Mahdi; Bartolacci, Michael R.
2017-01-01
Aim/Purpose: This paper utilizes the Technology Acceptance Model (TAM) to examine the extent to which acceptance of Remote Virtual Computer Laboratories (RVCLs) is affected by students' technological backgrounds and the role of collaborative work. Background: RVCLs are widely used in information technology and cyber security education to provide…
NASA Astrophysics Data System (ADS)
Kun, Luis G.
1994-12-01
On October 18, 1991, the IEEE-USA produced an entity statement which endorsed the vital importance of the High Performance Computer and Communications Act of 1991 (HPCC) and called for the rapid implementation of all its elements. Efforts are now underway to develop a Computer Based Patient Record (CBPR), the National Information Infrastructure (NII) as part of the HPCC, and the so-called `Patient Card'. Multiple legislative initiatives which address these and related information technology issues are pending in Congress. Clearly, a national information system will greatly affect the way health care delivery is provided to the United States public. Timely and reliable information represents a critical element in any initiative to reform the health care system as well as to protect and improve the health of every person. Appropriately used, information technologies offer a vital means of improving the quality of patient care, increasing access to universal care and lowering overall costs within a national health care program. Health care reform legislation should reflect increased budgetary support and a legal mandate for the creation of a national health care information system by: (1) constructing a National Information Infrastructure; (2) building a Computer Based Patient Record System; (3) bringing the collective resources of our National Laboratories to bear in developing and implementing the NII and CBPR, as well as a security system with which to safeguard the privacy rights of patients and the physician-patient privilege; and (4) utilizing Government (e.g. DOD, DOE) capabilities (technology and human resources) to maximize resource utilization, create new jobs and accelerate technology transfer to address health care issues.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tugurlan, Maria C.; Kirkham, Harold; Chassin, David P.
Budget and schedule overruns in product development due to the use of immature technologies constitute an important matter for program managers. Moreover, unexpected lack of technology maturity is also a problem for buyers. Both sides of the situation would benefit from an unbiased measure of technology maturity. This paper presents the use of a software maturity metric called Technology Readiness Level (TRL) in the milieu of the smart grid. For most of the time they have been in existence, power utilities have been protected monopolies, guaranteed a return on investment on anything they could justify adding to the rate base. Such a situation did not encourage innovation, and instead led to widespread risk-avoidance behavior in many utilities. The situation changed at the end of the last century, with a series of regulatory measures, beginning with the Public Utility Regulatory Policy Act of 1978. However, some bad experiences have actually served to strengthen the resistance to innovation by some utilities. Some aspects of the smart grid, such as the addition of computer-based control to the power system, face an uphill battle. It is our position that the addition of TRLs to the decision-making process for smart grid power-system projects will lead to an environment of more confident adoption.
Analysis and design of hospital management information system based on UML
NASA Astrophysics Data System (ADS)
Ma, Lin; Zhao, Huifang; You, Shi Jun; Ge, Wenyong
2018-05-01
With the rapid development of computer technology, computer information management systems have been utilized in many industries. A Hospital Information System (HIS) helps provide data for directors, lighten the workload of medical workers, and improve their efficiency. Based on an analysis of HIS requirements and the system design, this paper focuses on utilizing unified modeling language (UML) models to establish the use case diagram, class diagram, sequence diagram, and collaboration diagram, and to satisfy the demands of daily patient visits, inpatient care, drug management, and other relevant operations. Finally, the paper summarizes the problems of the system and offers an outlook for the HIS.
Reconciliation of the cloud computing model with US federal electronic health record regulations
2011-01-01
Cloud computing refers to subscription-based, fee-for-service utilization of computer hardware and software over the Internet. The model is gaining acceptance for business information technology (IT) applications because it allows capacity and functionality to increase on the fly without major investment in infrastructure, personnel or licensing fees. Large IT investments can be converted to a series of smaller operating expenses. Cloud architectures could potentially be superior to traditional electronic health record (EHR) designs in terms of economy, efficiency and utility. A central issue for EHR developers in the US is that these systems are constrained by federal regulatory legislation and oversight. These laws focus on security and privacy, which are well-recognized challenges for cloud computing systems in general. EHRs built with the cloud computing model can achieve acceptable privacy and security through business associate contracts with cloud providers that specify compliance requirements, performance metrics and liability sharing. PMID:21727204
Reconciliation of the cloud computing model with US federal electronic health record regulations.
Schweitzer, Eugene J
2012-01-01
Cloud computing refers to subscription-based, fee-for-service utilization of computer hardware and software over the Internet. The model is gaining acceptance for business information technology (IT) applications because it allows capacity and functionality to increase on the fly without major investment in infrastructure, personnel or licensing fees. Large IT investments can be converted to a series of smaller operating expenses. Cloud architectures could potentially be superior to traditional electronic health record (EHR) designs in terms of economy, efficiency and utility. A central issue for EHR developers in the US is that these systems are constrained by federal regulatory legislation and oversight. These laws focus on security and privacy, which are well-recognized challenges for cloud computing systems in general. EHRs built with the cloud computing model can achieve acceptable privacy and security through business associate contracts with cloud providers that specify compliance requirements, performance metrics and liability sharing.
Telecommunication Networks. Tech Use Guide: Using Computer Technology.
ERIC Educational Resources Information Center
Council for Exceptional Children, Reston, VA. Center for Special Education Technology.
One of nine brief guides for special educators on using computer technology, this guide focuses on utilizing the telecommunications capabilities of computers. Network capabilities including electronic mail, bulletin boards, and access to distant databases are briefly explained. Networks useful to the educator, general commercial systems, and local…
Computer programs: Electronic circuit design criteria: A compilation
NASA Technical Reports Server (NTRS)
1973-01-01
A Technology Utilization Program for the dissemination of information on technological developments which have potential utility outside the aerospace community is presented. The 21 items reported herein describe programs that are applicable to electronic circuit design procedures.
ERIC Educational Resources Information Center
Serapiglia, Anthony; Serapiglia, Constance
2011-01-01
Handheld computer technology has been available for decades. Today's college students have been exposed to various types of handheld computing devices for most of their lives, yet little is known about how a college student utilizes this type of technology tool as a learning advantage in an anytime, anyplace scenario. This study looks at how…
Understanding the Critics of Educational Technology: Gender Inequities and Computers 1983-1993.
ERIC Educational Resources Information Center
Mangione, Melissa
Although many view computers purely as technological tools to be utilized in the classroom and workplace, attention has been drawn to the social differences computers perpetuate, including those of race, class, and gender. This paper focuses on gender and computing by examining recent analyses in regards to content, form, and usage concerns. The…
ERIC Educational Resources Information Center
Rollo, J. Michael; Marmarchev, Helen L.
1999-01-01
The explosion of computer applications in the modern workplace has required student affairs professionals to keep pace with technological advances for office productivity. This article recommends establishing administrative computer user groups, utilizing coordinated web site development, and enhancing working relationships as ways of dealing…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-07
... technology, to include computer telecommunications or other electronic means, that the lead agency is... assess the capacity and resources of the public to utilize and maintain an electronic- or computer... the technology, to include computer telecommunications or other electronic means, that the lead agency...
NASA Astrophysics Data System (ADS)
Rothman, Alan H.
This study reports the results of research designed to examine the impact of computer-based science instruction on elementary school students' science content achievement, their attitudes about science learning, their level of critical thinking-inquiry skills, and their level of cognitive and English language development. The study compared these learning outcomes under a computer-based approach with the learning outcomes under a traditional, textbook-based approach to science instruction. The computer-based approach was inherent in a curriculum titled The Voyage of the Mimi, published by The Bank Street College Project in Science and Mathematics (1984). The study sample included 209 fifth-grade students enrolled in three schools in a suburban school district. This sample was divided into three groups, each receiving one of the following instructional treatments: (a) Mixed instruction, primarily based on the use of a hardcopy textbook in conjunction with computer-based instructional materials as one component of the science course; (b) Non-Traditional, Technology-Based instruction, fully utilizing computer-based material; and (c) Traditional, Textbook-Based instruction, utilizing only the textbook as the basis for instruction. Pre-test, or pre-treatment, data related to each of the student learning outcomes were collected at the beginning of the school year and post-test data were collected at the end of the school year. The pre-test data were used as a covariate in the statistical analyses to account for possible pre-existing differences among the three student groups with regard to the variables examined. This study concluded that non-traditional, computer-based instruction in science significantly improved students' attitudes toward science learning and their level of English language development. Non-significant, positive trends were found for the following student learning outcomes: overall science achievement and development of critical thinking-inquiry skills. These conclusions support the value of a non-traditional, computer-based approach to instruction, as exemplified by The Voyage of the Mimi curriculum, and support calls for reform in science teaching that recommend using computer technology to enhance learning outcomes and to help reverse the relatively poor science performance of American students documented by the 1996 Third International Mathematics and Science Study (TIMSS).
CSM research: Methods and application studies
NASA Technical Reports Server (NTRS)
Knight, Norman F., Jr.
1989-01-01
Computational mechanics is that discipline of applied science and engineering devoted to the study of physical phenomena by means of computational methods based on mathematical modeling and simulation, utilizing digital computers. The discipline combines theoretical and applied mechanics, approximation theory, numerical analysis, and computer science. Computational mechanics has had a major impact on engineering analysis and design. When applied to structural mechanics, the discipline is referred to herein as computational structural mechanics. Complex structures being considered by NASA for the 1990's include composite primary aircraft structures and the space station. These structures will be much more difficult to analyze than today's structures and necessitate a major upgrade in computerized structural analysis technology. NASA has initiated a research activity in structural analysis called Computational Structural Mechanics (CSM). The broad objective of the CSM activity is to develop advanced structural analysis technology that will exploit modern and emerging computers, such as those with vector and/or parallel processing capabilities. Here, the current research directions for the Methods and Application Studies Team of the Langley CSM activity are described.
NASA Astrophysics Data System (ADS)
The effective integration of processes, systems, and procedures used in the production of aerospace systems using computer technology is managed by the Integration Technology Division (MTI). Under its auspices are the Information Management Branch, which is actively involved with information management, information sciences and integration, and the Implementation Branch, whose technology areas include computer integrated manufacturing, engineering design, operations research, and material handling and assembly. The Integration Technology Division combines design, manufacturing, and supportability functions within the same organization. The Processing and Fabrication Division manages programs to improve structural and nonstructural materials processing and fabrication. Within this division, the Metals Branch directs the manufacturing methods program for metals and metal matrix composites processing and fabrication. The Nonmetals Branch directs the manufacturing methods programs, which include all manufacturing processes for producing and utilizing propellants, plastics, resins, fibers, composites, fluid elastomers, ceramics, glasses, and coatings. The objective of the Industrial Base Analysis Division is to act as focal point for the USAF industrial base program for productivity, responsiveness, and preparedness planning.
Bioinformatics clouds for big data manipulation
2012-01-01
As advances in life sciences and information technology bring profound influences on bioinformatics due to its interdisciplinary nature, bioinformatics is experiencing a new leap-forward from in-house computing infrastructure into utility-supplied cloud computing delivered over the Internet, in order to handle the vast quantities of biological data generated by high-throughput experimental technologies. Albeit relatively new, cloud computing promises to address big data storage and analysis issues in the bioinformatics field. Here we review extant cloud-based services in bioinformatics, classify them into Data as a Service (DaaS), Software as a Service (SaaS), Platform as a Service (PaaS), and Infrastructure as a Service (IaaS), and present our perspectives on the adoption of cloud computing in bioinformatics. Reviewers: This article was reviewed by Frank Eisenhaber, Igor Zhulin, and Sandor Pongor. PMID:23190475
Technology and the Modern Library.
ERIC Educational Resources Information Center
Boss, Richard W.
1984-01-01
Overview of the impact of information technology on libraries highlights turnkey vendors, bibliographic utilities, commercial suppliers of records, state and regional networks, computer-to-computer linkages, remote database searching, terminals and microcomputers, building local databases, delivery of information, digital telefacsimile,…
Integrating an Educational Game in Moodle LMS
ERIC Educational Resources Information Center
Minovic, Miroslav; Milovanovic, Milos; Minovic, Jelena; Starcevic, Dusan
2012-01-01
The authors present a learning platform based on a computer game. Learning games combine two industries: education and entertainment, which is often called "Edutainment." The game is realized as a strategic game (similar to Risk[TM]), implemented as a module for Moodle CMS, utilizing Java Applet technology. Moodle is an open-source course…
Eleven quick tips for architecting biomedical informatics workflows with cloud computing.
Cole, Brian S; Moore, Jason H
2018-03-01
Cloud computing has revolutionized the development and operations of hardware and software across diverse technological arenas, yet academic biomedical research has lagged behind despite the numerous and weighty advantages that cloud computing offers. Biomedical researchers who embrace cloud computing can reap rewards in cost reduction, decreased development and maintenance workload, increased reproducibility, ease of sharing data and software, enhanced security, horizontal and vertical scalability, high availability, a thriving technology partner ecosystem, and much more. Despite these advantages that cloud-based workflows offer, the majority of scientific software developed in academia does not utilize cloud computing and must be migrated to the cloud by the user. In this article, we present 11 quick tips for architecting biomedical informatics workflows on compute clouds, distilling knowledge gained from experience developing, operating, maintaining, and distributing software and virtualized appliances on the world's largest cloud. Researchers who follow these tips stand to benefit immediately by migrating their workflows to cloud computing and embracing the paradigm of abstraction.
Students' Attitudes toward Computers at the College of Nursing at King Saud University (KSU)
ERIC Educational Resources Information Center
Samarkandi, Osama Abdulhaleem
2011-01-01
Computer knowledge and skills are becoming essential technology components in nursing education. Saudi nurses must be prepared to utilize these technologies for the advancement of science and nursing practice in local and global communities. Little attention has been directed to students' attitudes about computer usage in academic communities in…
ERIC Educational Resources Information Center
Adams, Ruifang Hope; Strickland, Jane
2012-01-01
This study investigated the effects of computer-assisted feedback strategies that have been utilized by university students in a technology education curriculum. Specifically, the study examined the effectiveness of the computer-assisted feedback strategy "Knowledge of Response feedback" (KOR), and the "Knowledge of Correct Responses feedback"…
Law of Large Numbers: the Theory, Applications and Technology-based Education
Dinov, Ivo D.; Christou, Nicolas; Gould, Robert
2011-01-01
Modern approaches for technology-based blended education utilize a variety of recently developed novel pedagogical, computational and network resources. Such attempts employ technology to deliver integrated, dynamically-linked, interactive-content and heterogeneous learning environments, which may improve student comprehension and information retention. In this paper, we describe one such innovative effort of using technological tools to expose students in probability and statistics courses to the theory, practice and usability of the Law of Large Numbers (LLN). We base our approach on integrating pedagogical instruments with the computational libraries developed by the Statistics Online Computational Resource (www.SOCR.ucla.edu). To achieve this merger we designed a new interactive Java applet and a corresponding demonstration activity that illustrate the concept and the applications of the LLN. The LLN applet and activity have common goals – to provide graphical representation of the LLN principle, build lasting student intuition and present the common misconceptions about the law of large numbers. Both the SOCR LLN applet and activity are freely available online to the community to test, validate and extend (Applet: http://socr.ucla.edu/htmls/exp/Coin_Toss_LLN_Experiment.html, and Activity: http://wiki.stat.ucla.edu/socr/index.php/SOCR_EduMaterials_Activities_LLN). PMID:21603584
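As a concrete illustration of the coin-toss experiment behind the applet, the short Python sketch below simulates repeated fair-coin tosses and prints the running proportion of heads converging toward 0.5. It is an independent re-creation for illustration only, not code from the SOCR libraries.

```python
import random

def running_proportion_of_heads(n_tosses, p=0.5, seed=42):
    """Simulate n_tosses Bernoulli(p) trials; return the running proportion of heads."""
    rng = random.Random(seed)
    heads = 0
    proportions = []
    for i in range(1, n_tosses + 1):
        heads += rng.random() < p   # True counts as 1
        proportions.append(heads / i)
    return proportions

props = running_proportion_of_heads(10_000)
for n in (10, 100, 1_000, 10_000):
    print(f"after {n:>6} tosses: proportion of heads = {props[n - 1]:.4f}")
# The LLN says the sample proportion converges to p = 0.5, even though the raw
# count of heads minus n/2 typically keeps wandering; that distinction is the
# common misconception the activity is designed to dispel.
```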
Computer graphics in architecture and engineering
NASA Technical Reports Server (NTRS)
Greenberg, D. P.
1975-01-01
The present status of the application of computer graphics to the building profession or architecture and its relationship to other scientific and technical areas were discussed. It was explained that, due to the fragmented nature of architecture and building activities (in contrast to the aerospace industry), a comprehensive, economic utilization of computer graphics in this area is not practical and its true potential cannot now be realized due to the present inability of architects and structural, mechanical, and site engineers to rely on a common data base. Future emphasis will therefore have to be placed on a vertical integration of the construction process and effective use of a three-dimensional data base, rather than on waiting for any technological breakthrough in interactive computing.
How Tablets Are Utilized in the Classroom
ERIC Educational Resources Information Center
Ditzler, Christine; Hong, Eunsook; Strudler, Neal
2016-01-01
New technologies are a large part of the educational landscape in the 21st century. Emergent technologies are implemented in the classroom at an exponential rate. The newest technology to be added to the daily classroom is the tablet computer. Understanding students' and teachers' perceptions about the role of tablet computers is important as this…
Bounthavong, Mark; Pruitt, Larry D; Smolenski, Derek J; Gahm, Gregory A; Bansal, Aasthaa; Hansen, Ryan N
2018-02-01
Introduction: Home-based telebehavioural healthcare improves access to mental health care for patients restricted by travel burden. However, there is limited evidence assessing the economic value of home-based telebehavioural health care compared to in-person care. We sought to compare the economic impact of home-based telebehavioural health care and in-person care for depression among current and former US service members. Methods: We performed trial-based cost-minimisation and cost-utility analyses to assess the economic impact of home-based telebehavioural health care versus in-person behavioural care for depression. Our analyses focused on the payer perspective (Department of Defense and Department of Veterans Affairs) at three months. We also performed a scenario analysis where all patients possessed video-conferencing technology that was approved by these agencies. The cost-utility analysis evaluated the impact of different depression categories on the incremental cost-effectiveness ratio. One-way and probabilistic sensitivity analyses were performed to test the robustness of the model assumptions. Results: In the base case analysis the total direct cost of home-based telebehavioural health care was higher than in-person care (US$71,974 versus US$20,322). Assuming that patients possessed government-approved video-conferencing technology, home-based telebehavioural health care was less costly compared to in-person care (US$19,177 versus US$20,322). In one-way sensitivity analyses, the proportion of patients possessing personal computers was a major driver of direct costs. In the cost-utility analysis, home-based telebehavioural health care was dominant when patients possessed video-conferencing technology. Results from probabilistic sensitivity analyses did not differ substantially from base case results. Discussion: Home-based telebehavioural health care is dependent on the cost of supplying video-conferencing technology to patients but offers the opportunity to increase access to care. Health-care policies centred on implementation of home-based telebehavioural health care should ensure that these technologies are able to be successfully deployed on patients' existing technology.
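To make the economic comparison above concrete, here is a minimal Python sketch of the two analysis types. The cost figures are the ones quoted in the abstract; the three-month QALY values and the structure of the incremental cost-effectiveness ratio (ICER) check are illustrative assumptions, not the study's actual model.

```python
# Minimal sketch of trial-based cost-minimisation and cost-utility comparisons.
# Costs come from the abstract above; the QALY values are hypothetical placeholders.

# Cost-minimisation view: effects assumed equivalent, so compare direct costs.
telehealth_base = 71_974      # US$, base case (video gear supplied by the payer)
telehealth_owned = 19_177     # US$, scenario where patients already own approved tech
in_person = 20_322            # US$

print("Base case difference:", telehealth_base - in_person)    # +51,652 (telehealth costlier)
print("Scenario difference:", telehealth_owned - in_person)    # -1,145 (telehealth cheaper)

# Cost-utility view: incremental cost-effectiveness ratio (ICER) with assumed QALYs.
qaly_telehealth, qaly_in_person = 0.18, 0.17                    # hypothetical 3-month values
d_cost = telehealth_owned - in_person
d_qaly = qaly_telehealth - qaly_in_person
if d_cost <= 0 and d_qaly >= 0:
    print("Telehealth dominates: cheaper and at least as effective, so no ICER is reported.")
else:
    print(f"ICER: {d_cost / d_qaly:,.0f} US$ per QALY gained")
```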
Minimum Conflict Mainstreaming.
ERIC Educational Resources Information Center
Awen, Ed; And Others
Computer technology is discussed as a tool for facilitating the implementation of the mainstreaming process. Minimum conflict mainstreaming/merging (MCM) is defined as an approach which utilizes computer technology to circumvent such structural obstacles to mainstreaming as transportation scheduling, screening and assignment of students, testing,…
Paradigm Shift or Annoying Distraction
Spallek, H.; O’Donnell, J.; Clayton, M.; Anderson, P.; Krueger, A.
2010-01-01
Web 2.0 technologies, also known as social media or social technologies, have emerged into the mainstream. As they grow, these new technologies have the opportunity to influence the methods and procedures of many fields. This paper focuses on the clinical implications of the growing Web 2.0 technologies. Five developing trends are explored: information channels, augmented reality, location-based mobile social computing, virtual worlds and serious gaming, and collaborative research networks. Each trend is discussed based on its utilization and pattern of use by healthcare providers or healthcare organizations. In addition to explorative research for each trend, a vignette is presented which provides a future example of adoption. Lastly, each trend lists several research challenge questions for applied clinical informatics. PMID:23616830
GPU accelerated dynamic functional connectivity analysis for functional MRI data.
Akgün, Devrim; Sakoğlu, Ünal; Esquivel, Johnny; Adinoff, Bryon; Mete, Mutlu
2015-07-01
Recent advances in multi-core processors and graphics card based computational technologies have paved the way for an improved and dynamic utilization of parallel computing techniques. Numerous applications have been implemented for the acceleration of computationally-intensive problems in various computational science fields including bioinformatics, in which big data problems are prevalent. In neuroimaging, dynamic functional connectivity (DFC) analysis is a computationally demanding method used to investigate dynamic functional interactions among different brain regions or networks identified with functional magnetic resonance imaging (fMRI) data. In this study, we implemented and analyzed a parallel DFC algorithm based on thread-based and block-based approaches. The thread-based approach was designed to parallelize DFC computations and was implemented in both Open Multi-Processing (OpenMP) and Compute Unified Device Architecture (CUDA) programming platforms. Another approach developed in this study to better utilize the CUDA architecture is the block-based approach, where parallelization involves smaller parts of fMRI time-courses obtained by sliding windows. Experimental results showed that the proposed parallel design solutions enabled by the GPUs significantly reduce the computation time for DFC analysis. Multicore implementation using OpenMP on an 8-core processor provides up to 7.7× speed-up. GPU implementation using CUDA yielded substantial accelerations ranging from 18.5× to 157× speed-up once the thread-based and block-based approaches were combined in the analysis. The proposed parallel programming solutions showed that multi-core processor and CUDA-supported GPU implementations accelerate DFC analyses significantly. The developed algorithms make DFC analyses more practical for multi-subject studies with more dynamic analyses. Copyright © 2015 Elsevier Ltd. All rights reserved.
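To make the sliding-window computation concrete, the following is a minimal Python/NumPy sketch of dynamic functional connectivity for one subject, parallelized across window positions with a process pool. It illustrates the general technique only; the study's actual OpenMP and CUDA kernels and their thread/block decomposition are not reproduced here.

```python
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def window_connectivity(args):
    """Correlation matrix of all region pairs within one sliding window."""
    timecourses, start, width = args          # timecourses: (n_timepoints, n_regions)
    return np.corrcoef(timecourses[start:start + width].T)

def dynamic_fc(timecourses, width=30, step=1, workers=4):
    """Sliding-window DFC: one region-by-region correlation matrix per window position."""
    n_windows = (timecourses.shape[0] - width) // step + 1
    jobs = [(timecourses, i * step, width) for i in range(n_windows)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return np.stack(list(pool.map(window_connectivity, jobs)))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    fmri = rng.standard_normal((200, 10))     # toy data: 200 time points x 10 regions
    dfc = dynamic_fc(fmri)                    # shape: (n_windows, 10, 10)
    print(dfc.shape)
```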
Computer Technology for Industry
NASA Technical Reports Server (NTRS)
1979-01-01
In this age of the computer, more and more business firms are automating their operations for increased efficiency in a great variety of jobs, from simple accounting to managing inventories, from precise machining to analyzing complex structures. In the interest of national productivity, NASA is providing assistance both to longtime computer users and newcomers to automated operations. Through a special technology utilization service, NASA saves industry time and money by making available already developed computer programs which have secondary utility. A computer program is essentially a set of instructions which tells the computer how to produce desired information or effect by drawing upon its stored input. Developing a new program from scratch can be costly and time-consuming. Very often, however, a program developed for one purpose can readily be adapted to a totally different application. To help industry take advantage of existing computer technology, NASA operates the Computer Software Management and Information Center (COSMIC) (registered trademark), located at the University of Georgia. COSMIC maintains a large library of computer programs developed for NASA, the Department of Defense, the Department of Energy and other technology-generating agencies of the government. The Center gets a continual flow of software packages, screens them for adaptability to private sector usage, stores them and informs potential customers of their availability.
Reading Teachers' Beliefs and Utilization of Computer and Technology: A Case Study
ERIC Educational Resources Information Center
Remetio, Jessica Espinas
2014-01-01
Many researchers believe that computers have the ability to help improve the reading skills of students. In an effort to improve the poor reading scores of students on state tests, as well as improve students' overall academic performance, computers and other technologies have been installed in Frozen Bay School classrooms. As the success of these…
Folan, Alyce; Barclay, Linda; Cooper, Cathy; Robinson, Merren
2015-01-01
Assistive technology for computer access can be used to facilitate people with a spinal cord injury to utilize mainstream computer applications, thereby enabling participation in a variety of meaningful occupations. The aim of this study was to gain an understanding of the experiences of clients with tetraplegia trialing assistive technologies for computer access during different stages in a public rehabilitation service. In order to explore the experiences of clients with tetraplegia trialing assistive technologies for computer use, qualitative methodology was selected. Data were collected from seven participants using semi-structured interviews, which were audio-taped, transcribed and analyzed thematically. Three main themes were identified. These were: getting back into life, assisting in adjusting to injury and learning new skills. The findings from this study demonstrated that people with tetraplegia can be assisted to return to previous life roles or engage in new roles, through developing skills in the use of assistive technology for computer access. Being able to use computers for meaningful activities contributed to the participants gaining an enhanced sense of self-efficacy, and thereby quality of life. Implications for Rehabilitation: Findings from this pilot study indicate that people with tetraplegia can be assisted to return to previous life roles, and develop new roles that have meaning to them through the use of assistive technologies for computer use. Being able to use the internet to socialize, and complete daily tasks, contributed to the participants gaining a sense of control over their lives. Early introduction to assistive technology is important to ensure sufficient time for newly injured people to feel comfortable enough with the assistive technology to use the computers productively by the time of discharge. Further research into this important and expanding area is indicated.
NASA Astrophysics Data System (ADS)
Greene, M. I.; Ladelfa, C. J.; Bivacca, S. J.
1980-05-01
Flash hydropyrolysis (FHP) of coal is an emerging technology for the direct production of methane, ethane and BTX in a single-stage, high throughput reactor. The FHP technique involves the short residence time (1-2 seconds), rapid heatup of coal in a dilute-phase, transport reactor. When integrated into an overall, grass-roots conversion complex, the FHP technique can be utilized to generate a product consisting of SNG, ethylene/propylene, benzene and Fischer-Tropsch-based alcohols. This paper summarizes the process engineering and economics of a conceptualized facility based on FHP reactor operation with a lignitic coal. The plant is hypothetically sited near the extensive lignite fields located in the Texas region of the United States. Utilizing utility-financing methods for the costing of SNG, and selling the cogenerated chemicals at petrochemical market prices, the 20-year average SNG cost has been computed to vary between $3 and $4/MM Btu, depending upon the coal costs, interest rates, debt/equity ratio, coproduct chemicals prices, etc.
Four-Year Summary, Educational and Commercial Utilization of a Chemical Information Center. Part I.
ERIC Educational Resources Information Center
Schipma, Peter B., Ed.
The major objective of the Illinois Institute of Technology (IIT) Computer Search Center (CSC) is to educate and link industry, academia, and government institutions to chemical and other scientific information systems and sources. The CSC is in full operation providing services to users from a variety of machine-readable data bases with minimal…
A highly efficient multi-core algorithm for clustering extremely large datasets
2010-01-01
Background: In recent years, the demand for computational power in computational biology has increased due to rapidly growing data sets from microarray and other high-throughput technologies. This demand is likely to increase. Standard algorithms for analyzing data, such as cluster algorithms, need to be parallelized for fast processing. Unfortunately, most approaches for parallelizing algorithms largely rely on network communication protocols connecting and requiring multiple computers. One answer to this problem is to utilize the intrinsic capabilities in current multi-core hardware to distribute the tasks among the different cores of one computer. Results: We introduce a multi-core parallelization of the k-means and k-modes cluster algorithms based on the design principles of transactional memory for clustering gene expression microarray type data and categorical SNP data. Our new shared memory parallel algorithms prove to be highly efficient. We demonstrate their computational power and show their utility in cluster stability and sensitivity analysis employing repeated runs with slightly changed parameters. Computation speed of our Java based algorithm was increased by a factor of 10 for large data sets while preserving computational accuracy compared to single-core implementations and a recently published network based parallelization. Conclusions: Most desktop computers and even notebooks provide at least dual-core processors. Our multi-core algorithms show that using modern algorithmic concepts, parallelization makes it possible to perform even such laborious tasks as cluster sensitivity and cluster number estimation on the laboratory computer. PMID:20370922
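The published parallelization is Java-based and built on transactional-memory design principles; the fragment below is only a loose Python sketch of the same idea, parallelizing the assignment step of k-means across worker processes. The function names and chunking scheme are illustrative, not taken from the published algorithm.

```python
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def assign_chunk(args):
    """Nearest-centroid label for each row of one data chunk (the parallelized step)."""
    chunk, centroids = args
    dists = ((chunk[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
    return dists.argmin(axis=1)

def parallel_kmeans(data, k=3, iters=20, workers=4, seed=0):
    rng = np.random.default_rng(seed)
    centroids = data[rng.choice(len(data), k, replace=False)]
    chunks = np.array_split(data, workers)         # fixed partition of rows across workers
    with ProcessPoolExecutor(max_workers=workers) as pool:
        for _ in range(iters):
            labels = np.concatenate(list(pool.map(
                assign_chunk, [(c, centroids) for c in chunks])))
            # Update step is cheap and done serially; empty clusters are not handled here.
            centroids = np.array([data[labels == j].mean(axis=0) for j in range(k)])
    return centroids, labels

if __name__ == "__main__":
    X = np.vstack([np.random.default_rng(i).normal(i * 5, 1, (100, 2)) for i in range(3)])
    centers, labels = parallel_kmeans(X)
    print(np.round(centers, 2))
```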
Bello, Ibrahim S; Sanusi, Abubakr A; Ezeoma, Ikechi T; Abioye-Kuteyi, Emmanuel A; Akinsola, Adewale
2004-01-01
Background: The computer revolution and Information Technology (IT) have transformed modern health care systems in the areas of communication, teaching, storage and retrieval of medical information. These developments have positively impacted patient management and the training and retraining of healthcare providers. Little information is available on the level of training and utilization of IT among health care professionals in developing countries. Objectives: To assess the knowledge and utilization pattern of information technology among health care professionals and medical students in a university teaching hospital in Nigeria. Methods: Self-structured pretested questionnaires that probe into the knowledge, attitudes and utilization of computers and IT were administered to a randomly selected group of 180 health care professionals and medical students. Descriptive statistics on their knowledge, attitude and utilization patterns were calculated. Results: A total of 148 participants (82%) responded, which included 60 medical students, 41 medical doctors and 47 health records staff. Their ages ranged between 22 and 54 years. Eighty respondents (54%) reportedly had received some form of computer training while the remaining 68 (46%) had no training. Only 39 respondents (26%) owned a computer while the remaining 109 (74%) had no computer. In spite of this a total of 28 respondents (18.9%) demonstrated a good knowledge of computers while 87 (58.8%) had average knowledge. Only 33 (22.3%) showed poor knowledge. Fifty-nine respondents (39.9%) demonstrated a good attitude and good utilization habits, while in 50 respondents (33.8%) attitude and utilization habits were average and in 39 (26.4%) they were poor. While 25% of students and 27% of doctors had good computer knowledge (P=.006), only 4.3% of the records officers demonstrated a good knowledge. Forty percent of the medical students, 54% of the doctors and 27.7% of the health records officers showed good utilization habits and attitudes (P=.01). Conclusion: Only 26% of the respondents possess a computer, and only a small percentage of the respondents demonstrated good knowledge of computers and IT, hence the suboptimal utilization pattern. The fact that the health records officers by virtue of their profession had better training opportunities did not translate into better knowledge and utilization habits, hence the need for a more structured training, one which would form part of the curriculum. This would likely have more impact on the target population than ad hoc arrangements. PMID:15631969
Project Solo; Newsletter Number Four.
ERIC Educational Resources Information Center
Pittsburgh Univ., PA. Project Solo.
A paper titled "Myopia, Cornucopia and Utopia" makes up the major portion of this Project Solo Newsletter. It emphasizes the danger involved in the belief that the larger the system the better, and points out that although the computer utilizes technology, the human with judgment utilizes the computer. Some details of the Project Solo…
Design and operations technologies - Integrating the pieces. [for future space systems design
NASA Technical Reports Server (NTRS)
Eldred, C. H.
1979-01-01
As major elements of life-cycle costs (LCC) having critical impacts on the initiation and utilization of future space programs, the areas of vehicle design and operations are reviewed in order to identify technology requirements. Common to both areas is the requirement for efficient integration of broad, complex systems. Operations technologies focus on the extension of space-based capabilities and cost reduction through the combination of innovative design, low-maintenance hardware, and increased manpower productivity. Design technologies focus on computer-aided techniques which increase productivity while maintaining a high degree of flexibility which enhances creativity and permits graceful design changes.
Effectiveness of educational technology to improve patient care in pharmacy curricula.
Smith, Michael A; Benedict, Neal
2015-02-17
A review of the literature on the effectiveness of educational technologies to teach patient care skills to pharmacy students was conducted. Nineteen articles met inclusion criteria for the review. Seven of the articles included computer-aided instruction, 4 utilized human-patient simulation, 1 used both computer-aided instruction and human-patient simulation, and 7 utilized virtual patients. Educational technology was employed with more than 2700 students at 12 colleges and schools of pharmacy in courses including pharmacotherapeutics, skills and patient care laboratories, drug diversion, and advanced pharmacy practice experience (APPE) orientation. Students who learned by means of human-patient simulation and virtual patients reported enjoying the learning activity, whereas the results with computer-aided instruction were mixed. Moreover, the effect on learning was significant in the human-patient simulation and virtual patient studies, while conflicting data emerged on the effectiveness of computer-aided instruction.
NASA Astrophysics Data System (ADS)
Ruffin, Monya Aisha
The evolution of increased global accessibility and dependency on computer technologies has revolutionized most aspects of everyday life, including a rapid transformation of 21st century schools. Current changes in education reflect the need for the integration of effective computer technologies in school curricula. The principal objective of this investigation was to examine the acquisition of computer skills and inquiry skills by urban eighth grade students in a technology-supported environment. The study specifically focused on students' ability to identify, understand, and work through the process of scientific inquiry, while also developing computer technology tool skills. The unique component of the study was its contextualization within a local historically significant setting---an African-American cemetery. Approximately seventy students, in a local middle school, participated in the five-week treatment. Students conducted research investigations on site and over the Internet, worked in collaborative groups, utilized technology labs, and received inquiry and computer technology instruction. A mixed method design employing quantitative and qualitative methods was used. Two pilot studies conducted in an after-school science club format helped sharpen the research question, data collection methods, and survey used in the school-based study. Complete sets of data from pre and post surveys and journals were collected from sixty students. Six students were randomly selected to participate in in-depth focus group interviews. Researcher observations and inferences were also included in the analysis. The research findings showed that, after the treatment, students: (a) acquired more inquiry skills and computer skills, (b) broadened their basic conceptual understanding and perspective about science, (c) engaged actively in a relevant learning process, (d) created tangible evidence of their inquiry skills and computer skills, and (e) recalled and retained more details about the inquiry process and the computer technology tools (when they attended at least 80% of the treatment sessions). The findings indicated that project-based, technology-supported experiences allowed students to learn content in an interdisciplinary way (building on culturally relevant local histories) and provided enjoyable learning opportunities for students and teachers. Participation in the treatment encouraged students to think beyond the technical aspects of technology and relate its relevancy and usefulness to solving scientific queries.
Over the last several years, there has been increased pressure to utilize novel technologies derived from computational chemistry, molecular biology and systems biology in toxicological risk assessment. This new area has been referred to as "Computational Toxicology". Our resear...
Computer-Assisted Foreign Language Teaching and Learning: Technological Advances
ERIC Educational Resources Information Center
Zou, Bin; Xing, Minjie; Wang, Yuping; Sun, Mingyu; Xiang, Catherine H.
2013-01-01
Computer-Assisted Foreign Language Teaching and Learning: Technological Advances highlights new research and an original framework that brings together foreign language teaching, experiments and testing practices that utilize the most recent and widely used e-learning resources. This comprehensive collection of research will offer linguistic…
Determinants of Computer Utilization by Extension Personnel: A Structural Equations Approach
ERIC Educational Resources Information Center
Sivakumar, Paramasivan Sethuraman; Parasar, Bibudha; Das, Raghu Nath; Anantharaman, Mathevanpillai
2014-01-01
Purpose: Information technology (IT) has tremendous potential for fostering grassroots development and the Indian government has created various capital-intensive computer networks to promote agricultural development. However, research studies have shown that information technology investments are not always translated into productivity gains due…
The use of interactive technology in the classroom.
Kresic, P
1999-01-01
This article discusses the benefits that clinical laboratory science students and instructors experienced through the use of and integration of computer technology, microscopes, and digitizing cameras. Patient specimens were obtained from the participating clinical affiliates, slides stained or wet mounts prepared, images viewed under the microscope, digitized, and after labeling, stored into an appropriate folder. The individual folders were labeled as Hematology, Microbiology, Chemistry, or Urinalysis. Students, after obtaining the necessary specimens and pertinent data, created case study presentations for class discussions. After two semesters of utilizing videomicroscopy/computer technology in the classroom, students and instructors realized the potential associated with the technology, namely, the vast increase in the amount of organized visual and scientific information accessible and the availability of collaborative and interactive learning to complement individualized instruction. The instructors, on the other hand, were able to provide a wider variety of visual information on individual bases. In conclusion, the appropriate use of technology can enhance students' learning and participation. Increased student involvement through the use of videomicroscopy and computer technology heightened their sense of pride and ownership in providing suitable information in case study presentations. Also, visualization provides students and educators with alternative methods of teaching/learning and increased retention of information.
Computer-Managed Instruction: Theory, Application, and Some Key Implementation Issues.
1984-03-01
who have endorsed computer technology but fail to adopt it. As one educational consultant claims: "Educators appear to have a deep-set skepticism toward...widespread use." In the mid-1950's, while still in its infancy, computer technology entered the world of education...to utilize the new technology, and to do it most extensively. Implementation of CMI in a standalone configuration using microcomputers has been…
The role of a clinically based computer department of instruction in a school of medicine.
Yamamoto, W S
1991-10-01
The evolution of activities and educational directions of a department of instruction in medical computer technology in a school of medicine are reviewed. During the 18 years covered, the society at large has undergone marked change in availability and use of computation in every aspect of medical care. It is argued that a department of instruction should be clinical and develop revenue sources based on patient care, perform technical services for the institution with a decentralized structure, and perform both health services and scientific research. Distinction should be drawn between utilization of computing in medical specialties, library function, and instruction in computer science. The last is the proper arena for the academic content of instruction and is best labelled as the philosophical basis of medical knowledge, in particular, its epistemology. Contemporary pressures for teaching introductory computer skills are probably temporary.
National electronic medical records integration on cloud computing system.
Mirza, Hebah; El-Masri, Samir
2013-01-01
Few healthcare providers have an advanced level of Electronic Medical Record (EMR) adoption. Others have a low level, and most have no EMR at all. Cloud computing is a newly emerging technology that has been used in other industries and has shown great success. Despite its great features, cloud computing has not yet been widely utilized in the healthcare industry. This study presents an innovative healthcare cloud computing system for integrating Electronic Health Records (EHR). The proposed system applies cloud computing technology to the EHR system to provide a comprehensive, integrated EHR environment.
Utilizing Internet Technologies in Observatory Control Systems
NASA Astrophysics Data System (ADS)
Cording, Dean
2002-12-01
The 'Internet boom' of the past few years has spurred the development of a number of technologies to provide services such as secure communications, reliable messaging, information publishing and application distribution for commercial applications. Over the same period, a new generation of computer languages has also developed to provide object oriented design and development, improved reliability, and cross platform compatibility. Whilst the business models of the 'dot.com' era proved to be largely unviable, the technologies that they were based upon have survived and have matured to the point where they can now be utilized to build secure, robust and complete observatory control systems. This paper will describe how Electro Optic Systems has utilized these technologies in the development of its third generation Robotic Observatory Control System (ROCS). ROCS provides an extremely flexible configuration capability within a control system structure to provide truly autonomous robotic observatory operation including observation scheduling. ROCS was built using Internet technologies such as Java, Java Messaging Service (JMS), Lightweight Directory Access Protocol (LDAP), Secure Sockets Layer (SSL), eXtensible Markup Language (XML), Hypertext Transport Protocol (HTTP) and Java WebStart. ROCS was designed to be capable of controlling all aspects of an observatory and be able to be reconfigured to handle changing equipment configurations or user requirements without the need for an expert computer programmer. ROCS consists of many small components, each designed to perform a specific task, with the configuration of the system specified using a simple meta language. The use of small components facilitates testing and makes it possible to prove that the system is correct.
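ROCS itself is a Java system assembled from many small components according to a simple meta language. The Python fragment below is only a hedged illustration of that configuration-driven style; the component names, config format and start() protocol are invented for this sketch and do not correspond to the actual ROCS meta language or API.

```python
# Illustrative component registry driven by a declarative configuration.
# Everything here (names, config layout) is hypothetical, for illustration only.

class Component:
    def __init__(self, name, **params):
        self.name, self.params = name, params
    def start(self):
        print(f"starting {self.name} with {self.params}")

class DomeController(Component): pass
class Scheduler(Component): pass
class WeatherMonitor(Component): pass

REGISTRY = {"dome": DomeController, "scheduler": Scheduler, "weather": WeatherMonitor}

CONFIG = [
    ("dome", {"port": "/dev/ttyS0"}),
    ("weather", {"poll_seconds": 60}),
    ("scheduler", {"strategy": "priority"}),
]

def build_observatory(config):
    """Instantiate and wire components purely from configuration data."""
    return [REGISTRY[kind](kind, **params) for kind, params in config]

for component in build_observatory(CONFIG):
    component.start()
```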
Practical applications of interactive voice technologies: Some accomplishments and prospects
NASA Technical Reports Server (NTRS)
Grady, Michael W.; Hicklin, M. B.; Porter, J. E.
1977-01-01
A technology assessment of the application of computers and electronics to complex systems is presented. Three existing systems which utilize voice technology (speech recognition and speech generation) are described. Future directions in voice technology are also described.
Reeves, Rustin E; Aschenbrenner, John E; Wordinger, Robert J; Roque, Rouel S; Sheedlo, Harold J
2004-05-01
The need to increase the efficiency of dissection in the gross anatomy laboratory has been the driving force behind the technologic changes we have recently implemented. With the introduction of an integrated systems-based medical curriculum and a reduction in laboratory teaching hours, anatomy faculty at the University of North Texas Health Science Center (UNTHSC) developed a computer-based dissection manual to adjust to these curricular changes and time constraints. At each cadaver workstation, Apple iMac computers were added and a new dissection manual, running in a browser-based format, was installed. Within the text of the manual, anatomical structures required for dissection were linked to digital images from prosected materials; in addition, for each body system, the dissection manual included images from cross sections, radiographs, CT scans, and histology. Although we have placed a high priority on computerization of the anatomy laboratory, we remain strong advocates of the importance of cadaver dissection. It is our belief that the utilization of computers for dissection is a natural evolution of technology and fosters creative teaching strategies adapted for anatomy laboratories in the 21st century. Our strategy has significantly enhanced the independence and proficiency of our students, the efficiency of their dissection time, and the quality of laboratory instruction by the faculty. Copyright 2004 Wiley-Liss, Inc.
The President's Report, 1983-84.
ERIC Educational Resources Information Center
Bok, Derek
The 1983-84 annual report of the President of Harvard University to members of the Board of Overseers addresses the advantages and disadvantages of the utilization of new technologies by a university, comments on the instructional uses of computers (including computer assisted instruction (CAI)) and video technology, and cites specific examples in…
Design considerations for a 10-kW integrated hydrogen-oxygen regenerative fuel cell system
NASA Technical Reports Server (NTRS)
Hoberecht, M. A.; Miller, T. B.; Rieker, L. L.; Gonzalez-Sanabria, O. D.
1984-01-01
Integration of an alkaline fuel cell subsystem with an alkaline electrolysis subsystem to form a regenerative fuel cell (RFC) system for low earth orbit (LEO) applications characterized by relatively high overall round trip electrical efficiency, long life, and high reliability is possible with present state of the art technology. A hypothetical 10 kW system was computer modeled and studied based on data from ongoing contractual efforts in both the alkaline fuel cell and alkaline water electrolysis areas. The alkaline fuel cell technology is under development utilizing advanced cell components and standard Shuttle Orbiter system hardware. The alkaline electrolysis technology uses a static water vapor feed technique, and scaled-up cell hardware has been developed. The computer aided study of the performance, operating, and design parameters of the hypothetical system is addressed.
Applications of Computer Technology in Complex Craniofacial Reconstruction.
Day, Kristopher M; Gabrick, Kyle S; Sargent, Larry A
2018-03-01
To demonstrate our use of advanced 3-dimensional (3D) computer technology in the analysis, virtual surgical planning (VSP), 3D modeling (3DM), and treatment of complex congenital and acquired craniofacial deformities. We present a series of craniofacial defects treated at a tertiary craniofacial referral center utilizing state-of-the-art 3D computer technology. All patients treated at our center using computer-assisted VSP, prefabricated custom-designed 3DMs, and/or 3D printed custom implants (3DPCI) in the reconstruction of craniofacial defects were included in this analysis. We describe the use of 3D computer technology to precisely analyze, plan, and reconstruct 31 craniofacial deformities/syndromes caused by: Pierre-Robin (7), Treacher Collins (5), Apert's (2), Pfeiffer (2), Crouzon (1) Syndromes, craniosynostosis (6), hemifacial microsomia (2), micrognathia (2), multiple facial clefts (1), and trauma (3). In select cases where the available bone was insufficient for skeletal reconstruction, 3DPCIs were fabricated using 3D printing. We used VSP in 30, 3DMs in all 31, distraction osteogenesis in 16, and 3DPCIs in 13 cases. Utilizing these technologies, the above complex craniofacial defects were corrected without significant complications and with excellent aesthetic results. Modern 3D technology allows the surgeon to better analyze complex craniofacial deformities, precisely plan surgical correction with computer simulation of results, customize osteotomies, plan distractions, and print 3DPCI, as needed. The use of advanced 3D computer technology can be applied safely and potentially improve aesthetic and functional outcomes after complex craniofacial reconstruction. These techniques warrant further study and may be reproducible in various centers of care.
Flow Control Research at NASA Langley in Support of High-Lift Augmentation
NASA Technical Reports Server (NTRS)
Sellers, William L., III; Jones, Gregory S.; Moore, Mark D.
2002-01-01
The paper describes the efforts at NASA Langley to apply active and passive flow control techniques for improved high-lift systems, and advanced vehicle concepts utilizing powered high-lift techniques. The development of simplified high-lift systems utilizing active flow control is shown to provide significant weight and drag reduction benefits based on system studies. Active flow control that focuses on separation, and the development of advanced circulation control wings (CCW) utilizing unsteady excitation techniques will be discussed. The advanced CCW airfoils can provide multifunctional controls throughout the flight envelope. Computational and experimental data are shown to illustrate the benefits and issues with implementation of the technology.
Pan, W R; Rozen, W M; Stretch, J; Thierry, B; Ashton, M W; Corlett, R J
2008-09-01
Lymphatic anatomy has become increasingly clinically important as surgical techniques evolve for investigating and treating cancer metastases. However, due to limited anatomical techniques available, research in this field has been insufficient. The techniques of computed tomography (CT) and magnetic resonance (MR) lymphangiography have not been described previously in the imaging of cadaveric lymphatic anatomy. This preliminary work describes the feasibility of these advanced imaging technologies for imaging lymphatic anatomy. A single, fresh cadaveric lower limb underwent lymphatic dissection and cannulation utilizing microsurgical techniques. Contrast materials for both CT and MR studies were chosen based on their suitability for subsequent clinical use, and imaging was undertaken with a view to mapping lymphatic anatomy. Microdissection studies were compared with imaging findings in each case. Both MR-based and CT-based contrast media in current clinical use were found to be suitable for demonstrating cadaveric lymphatic anatomy upon direct intralymphatic injection. MR lymphangiography and CT lymphangiography are feasible modalities for cadaveric anatomical research for lymphatic anatomy. Future studies including refinements in scanning techniques may offer these technologies to the clinical setting.
Computer Applications in the Design Process.
ERIC Educational Resources Information Center
Winchip, Susan
Computer Assisted Design (CAD) and Computer Assisted Manufacturing (CAM) are emerging technologies now being used in home economics and interior design applications. A microcomputer in a computer network system is capable of executing computer graphic functions such as three-dimensional modeling, as well as utilizing office automation packages to…
A Tools-Based Approach to Teaching Data Mining Methods
ERIC Educational Resources Information Center
Jafar, Musa J.
2010-01-01
Data mining is an emerging field of study in Information Systems programs. Although the course content has been streamlined, the underlying technology is still in a state of flux. The purpose of this paper is to describe how we utilized Microsoft Excel's data mining add-ins as a front-end to Microsoft's Cloud Computing and SQL Server 2008 Business…
Adapting Technological Interventions to Meet the Needs of Priority Populations.
Linke, Sarah E; Larsen, Britta A; Marquez, Becky; Mendoza-Vasconez, Andrea; Marcus, Bess H
2016-01-01
Cardiovascular diseases (CVD) comprise the leading cause of mortality worldwide, accounting for 3 in 10 deaths. Individuals with certain risk factors, including tobacco use, obesity, low levels of physical activity, type 2 diabetes mellitus, racial/ethnic minority status and low socioeconomic status, experience higher rates of CVD and are, therefore, considered priority populations. Technological devices such as computers and smartphones are now routinely utilized in research studies aiming to prevent CVD and its risk factors, and they are also rampant in the public and private health sectors. Traditional health behavior interventions targeting these risk factors have been adapted for technology-based approaches. This review provides an overview of technology-based interventions conducted in these priority populations as well as the challenges and gaps to be addressed in future research. Researchers currently possess tremendous opportunities to engage in technology-based implementation and dissemination science to help spread evidence-based programs focusing on CVD risk factors in these and other priority populations. Copyright © 2016 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Ando, K.; Fujita, S.; Ito, J.; Yuasa, S.; Suzuki, Y.; Nakatani, Y.; Miyazaki, T.; Yoda, H.
2014-05-01
Most parts of present computer systems are made of volatile devices, and the power supplied to them to avoid information loss causes huge energy losses. We can eliminate this meaningless energy loss by utilizing the non-volatile function of advanced spin-transfer torque magnetoresistive random-access memory (STT-MRAM) technology and create a new type of computer, i.e., normally off computers. Critical tasks to achieve normally off computers are implementations of STT-MRAM technologies in the main memory and low-level cache memories. STT-MRAM technology for applications to the main memory has been successfully developed by using perpendicular STT-MRAMs, and faster STT-MRAM technologies for applications to the cache memory are now being developed. The present status of STT-MRAMs and challenges that remain for normally off computers are discussed.
Many-core computing for space-based stereoscopic imaging
NASA Astrophysics Data System (ADS)
McCall, Paul; Torres, Gildo; LeGrand, Keith; Adjouadi, Malek; Liu, Chen; Darling, Jacob; Pernicka, Henry
The potential benefits of using parallel computing in real-time visual-based satellite proximity operations missions are investigated. Improvements in performance and relative navigation solutions over single thread systems can be achieved through multi- and many-core computing. Stochastic relative orbit determination methods benefit from the higher measurement frequencies, allowing them to more accurately determine the associated statistical properties of the relative orbital elements. More accurate orbit determination can lead to reduced fuel consumption and extended mission capabilities and duration. Inherent to the process of stereoscopic image processing is the difficulty of loading, managing, parsing, and evaluating large amounts of data efficiently, which may result in delays or highly time consuming processes for single (or few) processor systems or platforms. In this research we utilize the Single-Chip Cloud Computer (SCC), a fully programmable 48-core experimental processor, created by Intel Labs as a platform for many-core software research, provided with a high-speed on-chip network for sharing information along with advanced power management technologies and support for message-passing. The results from utilizing the SCC platform for the stereoscopic image processing application are presented in the form of Performance, Power, Energy, and Energy-Delay-Product (EDP) metrics. Also, a comparison between the SCC results and those obtained from executing the same application on a commercial PC are presented, showing the potential benefits of utilizing the SCC in particular, and any many-core platforms in general for real-time processing of visual-based satellite proximity operations missions.
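Since the abstract reports results as Performance, Power, Energy and Energy-Delay-Product metrics without defining them, the tiny sketch below shows how those quantities are typically derived from measured execution time and average power. The numbers are placeholders for illustration, not measurements from the SCC or the PC baseline.

```python
def energy_metrics(runtime_s, avg_power_w):
    """Derive energy (J) and energy-delay product (J*s) from time and power."""
    energy = avg_power_w * runtime_s          # E = P * t
    edp = energy * runtime_s                  # EDP = E * t
    return energy, edp

# Placeholder numbers purely for illustration (not measured values).
for platform, (t, p) in {"many-core": (4.0, 35.0), "desktop PC": (2.5, 95.0)}.items():
    e, edp = energy_metrics(t, p)
    print(f"{platform:>10}: energy = {e:7.1f} J, EDP = {edp:8.1f} J*s")
```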
Design, processing and testing of LSI arrays, hybrid microelectronics task
NASA Technical Reports Server (NTRS)
Himmel, R. P.; Stuhlbarg, S. M.; Ravetti, R. G.; Zulueta, P. J.; Rothrock, C. W.
1979-01-01
Mathematical cost models previously developed for hybrid microelectronic subsystems were refined and expanded. Rework terms related to substrate fabrication, nonrecurring developmental and manufacturing operations, and prototype production are included. Sample computer programs were written to demonstrate hybrid microelectronic applications of these cost models. Computer programs were generated to calculate and analyze values for the total microelectronics costs. Large scale integrated (LSI) chips utilizing tape chip carrier technology were studied. The feasibility of interconnecting arrays of LSI chips utilizing tape chip carrier and semiautomatic wire bonding technology was demonstrated.
Technology-Based Interventions for Asthma-Can They Help Decrease Health Disparities?
Baptist, Alan P; Islam, Nishat; Joseph, Christine L M
Asthma is a condition that has consistently demonstrated significant health outcome inequalities for minority populations. One approach used for care of patients with asthma is the incorporation of technology for behavioral modification, symptom monitoring, education, and/or treatment decision making. Whether such technological interventions can improve the care of black and inner-city patients is unknown. We reviewed all randomized controlled trial technological interventions from 2000 to 2015 performed in minority populations. A total of 16 articles met inclusion and exclusion criteria; all but 1 was performed in a childhood or adolescent age group. The interventions used MPEG audio layer-3 players, text messaging, computer/Web-based systems, video games, and interactive voice response. Many used tailored content and/or a specific behavior theory. Although the interventions were based on technology, most required additional special staffing. Subject user satisfaction was positive, and improvements were noted in asthma knowledge, medication adherence, asthma symptoms, and quality of life. Unfortunately, health care utilization (emergency department visits and/or hospitalizations) was typically not improved by the interventions. Although no single intervention modality was vastly superior, the computer-based interventions appeared to have the most positive results. In summary, technology-based interventions have a high level of user satisfaction among minority and urban/low-income individuals with asthma, and can improve asthma outcomes. Further large-scale studies are needed to assess whether such interventions can decrease health disparities in asthma. Copyright © 2016 American Academy of Allergy, Asthma & Immunology. Published by Elsevier Inc. All rights reserved.
A Model-based Framework for Risk Assessment in Human-Computer Controlled Systems
NASA Technical Reports Server (NTRS)
Hatanaka, Iwao
2000-01-01
The rapid growth of computer technology and innovation has played a significant role in the rise of computer automation of human tasks in modern production systems across all industries. Although the rationale for automation has been to eliminate "human error" or to relieve humans from manual repetitive tasks, various computer-related hazards and accidents have emerged as a direct result of increased system complexity attributed to computer automation. The risk assessment techniques utilized for electromechanical systems are not suitable for today's software-intensive systems or complex human-computer controlled systems. This thesis will propose a new systemic model-based framework for analyzing risk in safety-critical systems where both computers and humans are controlling safety-critical functions. A new systems accident model will be developed based upon modern systems theory and human cognitive processes to better characterize system accidents, the role of human operators, and the influence of software in its direct control of significant system functions. Better risk assessments will then be achievable through the application of this new framework to complex human-computer controlled systems.
Study of Fluid Experiment System (FES)/CAST/Holographic Ground System (HGS)
NASA Technical Reports Server (NTRS)
Workman, Gary L.; Cummings, Rick; Jones, Brian
1992-01-01
Holographic and schlieren optical techniques for studying the concentration gradients in solidification processes have been used by several investigators over the years. The HGS facility at MSFC has been a primary resource in researching this capability. Consequently, scientific personnel have been able to utilize these techniques in both ground-based research and in space experiments. An important event in the scientific utilization of the HGS facilities was the TGS crystal growth and the casting and solidification technology (CAST) experiments that were flown on the International Microgravity Laboratory (IML) mission in March of this year. The preparation and processing of these space observations are the primary experiments reported in this work. This project provides some ground-based studies to optimize the holographic techniques used to acquire information about the crystal growth processes flown on IML. Since the ground-based studies will be compared with the space-based experimental results, it is necessary to conduct sufficient ground-based studies to best determine how the experiment worked in space. The current capabilities in computer-based systems for image processing and numerical computation have certainly assisted in those efforts. As anticipated, this study has shown that these advanced computing capabilities are helpful in the data analysis of such experiments.
Lindstrand, Peg
2002-01-01
This study focuses on differences between the ways in which we look at girls' and boys' computer activities. It is evident that gender differences per se generate different conditions for boys and girls. Generally, children with disabilities have great difficulty defining their needs and wishes. Pedagogues, habilitation staff and parents are needed as support for both boys' and girls' development. If technology is to be part of this development, we must pay attention to it. Research within this area highlights the differences and patterns that occur. The study stresses the expectations and experiences that parents of children with disabilities have of computer-based activities for their children, with a focus on gender-related issues.
Computational Support for Technology- Investment Decisions
NASA Technical Reports Server (NTRS)
Adumitroaie, Virgil; Hua, Hook; Lincoln, William; Block, Gary; Mrozinski, Joseph; Shelton, Kacie; Weisbin, Charles; Elfes, Alberto; Smith, Jeffrey
2007-01-01
Strategic Assessment of Risk and Technology (START) is a user-friendly computer program that assists human managers in making decisions regarding research-and-development investment portfolios in the presence of uncertainties and of non-technological constraints that include budgetary and time limits, restrictions related to infrastructure, and programmatic and institutional priorities. START facilitates quantitative analysis of technologies, capabilities, missions, scenarios and programs, and thereby enables the selection and scheduling of value-optimal development efforts. START incorporates features that, variously, perform or support a unique combination of functions, most of which are not systematically performed or supported by prior decision-support software. These functions include the following: Optimal portfolio selection using an expected-utility-based assessment of capabilities and technologies; Temporal investment recommendations; Distinctions between enhancing and enabling capabilities; Analysis of partial funding for enhancing capabilities; and Sensitivity and uncertainty analysis. START can run on almost any computing hardware, within Linux and related operating systems that include Mac OS X versions 10.3 and later, and can run in Windows under the Cygwin environment. START can be distributed in binary code form. START calls, as external libraries, several open-source software packages. Output is in Excel (.xls) file format.
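START's optimal portfolio selection under budgetary limits can be illustrated with a much simpler stand-in. The sketch below is a brute-force expected-utility selection under a single budget constraint; the candidate technologies, costs, and utilities are invented for illustration, and this is not the algorithm implemented in START.

```python
# Minimal sketch, not the actual START algorithm: selecting a value-optimal
# subset of technology investments by expected utility under a budget limit.
from itertools import combinations

# Hypothetical candidate technologies: (name, cost, expected utility)
candidates = [("A", 4.0, 10.0), ("B", 3.0, 7.0), ("C", 5.0, 9.0), ("D", 2.0, 4.0)]
budget = 9.0

best_utility, best_portfolio = 0.0, ()
for r in range(1, len(candidates) + 1):
    for subset in combinations(candidates, r):
        cost = sum(c for _, c, _ in subset)
        utility = sum(u for _, _, u in subset)
        if cost <= budget and utility > best_utility:
            best_utility, best_portfolio = utility, subset

print("Selected:", [name for name, _, _ in best_portfolio],
      "expected utility:", best_utility)
```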
76 FR 80901 - National Medal of Technology and Innovation Nomination Evaluation Committee
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-27
... Nation's highest honor for technological innovation, awarded annually by the President of the United... utilizing technological innovation and/or be familiar with the education, training, employment and... Management/Computing/IT/Manufacturing Innovation; Technological Manpower/Workforce Training/Education. Under...
TiO2-based memristors and ReRAM: materials, mechanisms and models (a review)
NASA Astrophysics Data System (ADS)
Gale, Ella
2014-10-01
The memristor is the fundamental nonlinear circuit element, with uses in computing and computer memory. Resistive Random Access Memory (ReRAM) is a resistive switching memory proposed as a non-volatile memory. In this review we shall summarize the state of the art for these closely-related fields, concentrating on titanium dioxide, the well-utilized and archetypal material for both. We shall cover material properties, switching mechanisms and models to demonstrate what ReRAM and memristor scientists can learn from each other and examine the outlook for these technologies.
Utility and Usability as Factors Influencing Teacher Decisions about Software Integration
ERIC Educational Resources Information Center
Okumus, Samet; Lewis, Lindsey; Wiebe, Eric; Hollebrands, Karen
2016-01-01
Given the importance of teachers in the implementation of computer technology in classrooms, the technology acceptance model and TPACK model were used to better understand the decision-making process teachers use in determining how, when, and where computer software is used in mathematics classrooms. Thirty-four (34) teachers implementing…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kelley, B. M.
The electric utility industry is undergoing significant transformations in its operating model, including a greater emphasis on automation, monitoring technologies, and distributed energy resource management systems (DERMS). While these changes and new technologies drive greater efficiencies and reliability, the new models may also introduce new vectors of cyber attack. The appropriate cybersecurity controls to address and mitigate these newly introduced attack vectors and potential vulnerabilities are still widely unknown, and the performance of the controls is difficult to vet. This proposal argues that modeling and simulation (M&S) is a necessary tool to address and better understand the problems introduced by emerging technologies for the grid. M&S will provide electric utilities a platform to model their transmission and distribution systems and run various simulations against the model to better understand the operational impact and performance of cybersecurity controls.
NASA Technical Reports Server (NTRS)
Wattson, R. B.; Harvey, P.; Swift, R.
1975-01-01
An intrinsic silicon charge injection device (CID) television sensor array has been used in conjunction with a CaMoO4 colinear tunable acousto-optic filter, a 61-inch reflector, a sophisticated computer system, and a digital color TV scan converter/computer to produce near-IR images of Saturn and Jupiter with 10 Å spectral resolution and approximately 3 arcsec spatial resolution. The CID camera has successfully obtained digitized 100 x 100 array images with 5 minutes of exposure time, and slow-scanned readout to a computer. Details of the equipment setup, innovations, problems, experience, data and final equipment performance limits are given.
Videodisc-Computer Interfaces.
ERIC Educational Resources Information Center
Zollman, Dean
1984-01-01
Lists microcomputer-videodisc interfaces currently available from 26 sources, including home use systems connected through remote control jack and industrial/educational systems utilizing computer ports and new laser reflective and stylus technology. Information provided includes computer and videodisc type, language, authoring system, educational…
Physically Based Virtual Surgery Planning and Simulation Tools for Personal Health Care Systems
NASA Astrophysics Data System (ADS)
Dogan, Firat; Atilgan, Yasemin
Virtual surgery planning and simulation tools have gained a great deal of importance in the last decade as a consequence of increasing capacities in information technology. Modern hardware architectures, large-scale database systems, grid-based computer networks, agile development processes, better 3D visualization, and all the other strong aspects of information technology bring the necessary instruments to almost every desk. The special software and sophisticated supercomputer environments of the last decade now serve individual needs inside “tiny smart boxes” at reasonable prices. However, resistance to learning new computerized environments, insufficient training, and other old habits prevent effective utilization of IT resources by specialists in the health sector. In this paper, all aspects of former and current developments in surgery planning and simulation tools are presented, and future directions and expectations are investigated for better electronic health care systems.
BIM based virtual environment for fire emergency evacuation.
Wang, Bin; Li, Haijiang; Rezgui, Yacine; Bradley, Alex; Ong, Hoang N
2014-01-01
Recent building emergency management research has highlighted the need for the effective utilization of dynamically changing building information. BIM (building information modelling) can play a significant role in this process due to its comprehensive and standardized data format and integrated process. This paper introduces a BIM based virtual environment supported by virtual reality (VR) and a serious game engine to address several key issues for building emergency management, for example, timely two-way information updating and better emergency awareness training. The focus of this paper lies on how to utilize BIM as a comprehensive building information provider to work with virtual reality technologies to build an adaptable immersive serious game environment to provide real-time fire evacuation guidance. The innovation lies in the seamless integration between BIM and a serious game based virtual reality (VR) environment aiming at practical problem solving by leveraging state-of-the-art computing technologies. The system has been tested for its robustness and functionality against the development requirements, and the results showed promising potential to support more effective emergency management.
A chiral-based magnetic memory device without a permanent magnet
Dor, Oren Ben; Yochelis, Shira; Mathew, Shinto P.; Naaman, Ron; Paltiel, Yossi
2013-01-01
Several technologies are currently in use for computer memory devices. However, there is a need for a universal memory device that has high density, high speed and low power requirements. To this end, various types of magnetic-based technologies with a permanent magnet have been proposed. Recent charge-transfer studies indicate that chiral molecules act as an efficient spin filter. Here we utilize this effect to achieve a proof of concept for a new type of chiral-based magnetic-based Si-compatible universal memory device without a permanent magnet. More specifically, we use spin-selective charge transfer through a self-assembled monolayer of polyalanine to magnetize a Ni layer. This magnitude of magnetization corresponds to applying an external magnetic field of 0.4 T to the Ni layer. The readout is achieved using low currents. The presented technology has the potential to overcome the limitations of other magnetic-based memory technologies to allow fabricating inexpensive, high-density universal memory-on-chip devices. PMID:23922081
The assessment of virtual reality for human anatomy instruction
NASA Technical Reports Server (NTRS)
Benn, Karen P.
1994-01-01
This research project seeks to meet the objective of science training by developing, assessing, and validating virtual reality as a human anatomy training medium. In ideal situations, anatomic models, computer-based instruction, and cadaver dissection are utilized to augment the traditional methods of instruction. At many institutions, lack of financial resources limits anatomy instruction to textbooks and lectures. However, human anatomy is three dimensional, unlike the one dimensional depiction found in textbooks and the two dimensional depiction found on the computer. Virtual reality is a breakthrough technology that allows one to step through the computer screen into a three dimensional world. This technology offers many opportunities to enhance science education. Therefore, a virtual testing environment of the abdominopelvic region of a human cadaver was created to study the placement of body parts within the nine anatomical divisions of the abdominopelvic region and the four abdominal quadrants.
A study of computer graphics technology in application of communication resource management
NASA Astrophysics Data System (ADS)
Li, Jing; Zhou, Liang; Yang, Fei
2017-08-01
With the development of computer technology, computer graphics technology has come into wide use. In particular, the success of object-oriented technology and multimedia technology has promoted the development of graphics technology in computer software systems. Computer graphics theory and application technology have therefore become an important topic in the computer field, and computer graphics technology is applied ever more extensively in various fields. In recent years, with the development of the social economy and especially the rapid development of information technology, traditional approaches to communication resource management can no longer meet management needs effectively. Communication resource management still relies on the original management tools and methods for equipment management and maintenance, which has brought many problems: it is very difficult for non-professionals to understand the equipment and its status, resource utilization is relatively low, and managers cannot quickly and accurately understand resource conditions. Aimed at the above problems, this paper proposes introducing computer graphics technology into communication resource management. The introduction of computer graphics not only makes communication resource management more vivid, but also reduces the cost of resource management and improves work efficiency.
NASA Astrophysics Data System (ADS)
Li, Haiqing; Chatterjee, Samir
With rapid advances in information and communication technology, computer-mediated communication (CMC) technologies are utilizing multiple IT platforms such as email, websites, cell-phones/PDAs, social networking sites, and gaming environments. However, no studies have compared the effectiveness of a persuasive system using such alternative channels and various persuasive techniques. Moreover, how affective computing impacts the effectiveness of persuasive systems is not clear. This study proposes that (1) persuasive technology channels in combination with persuasive strategies will have different persuasive effectiveness; and (2) adding positive emotion to a message that leads to a better overall user experience could increase persuasive effectiveness. The affective computing or emotion information was added to the experiment using emoticons. The initial results of a pilot study show that computer-mediated communication channels along with various persuasive strategies can affect the persuasive effectiveness to varying degrees. These results also show that adding a positive emoticon to a message leads to a better user experience, which increases the overall persuasive effectiveness of a system.
Varmazyar, Mohsen; Dehghanbaghi, Maryam; Afkhami, Mehdi
2016-10-01
Balanced Scorecard (BSC) is a strategic evaluation tool using both financial and non-financial indicators to determine the business performance of organizations or companies. In this paper, a new integrated approach based on the Balanced Scorecard (BSC) and multi-criteria decision making (MCDM) methods is proposed to evaluate the performance of research centers of a research and technology organization (RTO) in Iran. The Decision-Making Trial and Evaluation Laboratory (DEMATEL) method is employed to reflect the interdependencies among BSC perspectives. Then, the Analytic Network Process (ANP) is utilized to weight the indices influencing the considered problem. In the next step, we apply four MCDM methods including Additive Ratio Assessment (ARAS), Complex Proportional Assessment (COPRAS), Multi-Objective Optimization by Ratio Analysis (MOORA), and Technique for Order Preference by Similarity to Ideal Solution (TOPSIS) for ranking of alternatives. Finally, the utility interval technique is applied to combine the ranking results of the MCDM methods. Weighted utility intervals are computed by constructing a correlation matrix between the ranking methods. A real case is presented to show the efficacy of the proposed approach. Copyright © 2016 Elsevier Ltd. All rights reserved.
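Of the four ranking methods named above, TOPSIS is perhaps the most compact to illustrate. The following is a minimal sketch of the TOPSIS closeness-coefficient calculation for benefit-type criteria; the decision matrix and the ANP-style weights are hypothetical, not the paper's case data.

```python
# Minimal TOPSIS sketch (one of the four MCDM methods named above);
# the decision matrix and weights are hypothetical, not the paper's data.
import numpy as np

X = np.array([[7.0, 9.0, 8.0],      # alternative 1 scored on 3 BSC indices
              [8.0, 6.0, 9.0],      # alternative 2
              [6.0, 8.0, 7.0]])     # alternative 3
w = np.array([0.5, 0.3, 0.2])       # assumed ANP-derived weights (benefit criteria)

R = X / np.linalg.norm(X, axis=0)   # vector-normalize each criterion column
V = R * w                           # weighted normalized matrix
ideal, anti_ideal = V.max(axis=0), V.min(axis=0)
d_plus = np.linalg.norm(V - ideal, axis=1)
d_minus = np.linalg.norm(V - anti_ideal, axis=1)
closeness = d_minus / (d_plus + d_minus)   # higher = closer to ideal
print("TOPSIS ranking (best first):", np.argsort(-closeness) + 1)
```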
Fabrication of corner cube array retro-reflective structure with DLP-based 3D printing technology
NASA Astrophysics Data System (ADS)
Riahi, Mohammadreza
2016-06-01
In this article, the fabrication of a corner cube array retro-reflective structure is presented using DLP-based 3D printing technology. In this additive manufacturing technology, a pattern of a corner cube array is designed on a computer and sliced with specific software. The image of each slice is then projected from the bottom side of a reservoir containing UV-cure resin, utilizing a DLP video projector. The projected area is cured and attached to a base plate. This process is repeated until the entire part is made. The best orientation of the printing process and the effect of layer thicknesses on the surface finish of the cube have been investigated. Thermal reflow surface finishing and replication with soft molding are also presented in this article.
ERIC Educational Resources Information Center
Ward, Nicholas D.; Finley, Rachel J.; Keil, Richard G.; Clay, Tansy G.
2013-01-01
This study explores the utility of a set of tablet-based personal computers in the K-12 science, technology, engineering, and mathematics classroom. Specifically, a lesson on food-chain dynamics and predator-prey population controls was designed on the Apple® iPad platform and delivered to three sophomore-level ecology classes (roughly 30 students…
Personal Computers in Iowa Vocational Agriculture Programs: Competency Assessment and Usage.
ERIC Educational Resources Information Center
Miller, W. Wade; And Others
The competencies needed by Iowa vocational agriculture instructors at the secondary school level to integrate computer technology into the classroom were assessed, as well as the status of computer usage, types of computer use and software utilities and hardware used, and the sources of computer training obtained by instructors. Surveys were…
Abdullahi, Mohammed; Ngadi, Md Asri
2016-01-01
Cloud computing has attracted significant attention from the research community because of the rapid migration rate of Information Technology services to its domain. Advances in virtualization technology have made cloud computing very popular as a result of easier deployment of application services. Tasks are submitted to cloud datacenters to be processed in a pay-as-you-go fashion. Task scheduling is one of the significant research challenges in cloud computing environments. The current formulation of task scheduling problems has been shown to be NP-complete, hence finding the exact solution especially for large problem sizes is intractable. The heterogeneous and dynamic features of cloud resources make optimum task scheduling non-trivial. Therefore, efficient task scheduling algorithms are required for optimum resource utilization. Symbiotic Organisms Search (SOS) has been shown to perform competitively with Particle Swarm Optimization (PSO). The aim of this study is to optimize task scheduling in the cloud computing environment based on a proposed Simulated Annealing (SA) based SOS (SASOS) in order to improve the convergence rate and quality of solution of SOS. The SOS algorithm has a strong global exploration capability and uses fewer parameters. The systematic reasoning ability of SA is employed to find better solutions in local solution regions, hence adding exploration ability to SOS. Also, a fitness function is proposed which takes into account the utilization level of virtual machines (VMs), which reduced the makespan and the degree of imbalance among VMs. The CloudSim toolkit was used to evaluate the efficiency of the proposed method using both synthetic and standard workloads. Results of simulation showed that the hybrid SOS performs better than SOS in terms of convergence speed, response time, degree of imbalance, and makespan.
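The fitness function described above weighs both makespan and the degree of imbalance among VMs. The sketch below illustrates one common way these two quantities can be computed for a given task-to-VM assignment; the formulas, task lengths, and VM speeds are assumptions for illustration, not taken from the paper or from CloudSim.

```python
# Minimal sketch (assumed formulations): evaluating a task-to-VM schedule by
# makespan and degree of imbalance, the two quantities the proposed fitness
# function is said to reduce.

def completion_times(schedule, task_lengths, vm_speeds):
    """schedule[i] = VM index for task i; returns per-VM completion time."""
    times = [0.0] * len(vm_speeds)
    for task, vm in enumerate(schedule):
        times[vm] += task_lengths[task] / vm_speeds[vm]
    return times

def makespan(times):
    return max(times)

def degree_of_imbalance(times):
    """(Tmax - Tmin) / Tavg, a common load-imbalance measure."""
    avg = sum(times) / len(times)
    return (max(times) - min(times)) / avg if avg > 0 else 0.0

# Hypothetical instance: 6 tasks (million instructions), 3 VMs (MIPS)
task_lengths = [400, 250, 300, 500, 150, 350]
vm_speeds = [100, 150, 200]
schedule = [0, 1, 2, 2, 0, 1]
t = completion_times(schedule, task_lengths, vm_speeds)
print("makespan:", makespan(t), "imbalance:", round(degree_of_imbalance(t), 3))
```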
Smarter Software For Enhanced Vehicle Health Monitoring and Inter-Planetary Exploration
NASA Technical Reports Server (NTRS)
Larson, William E.; Goodrich, Charles H.; Steinrock, Todd (Technical Monitor)
2001-01-01
The existing philosophy for space mission control was born in the early days of the space program when technology did not exist to put significant control responsibility onboard the spacecraft. NASA relied on a team of ground control experts to troubleshoot systems when problems occurred. As computing capability improved, more responsibility was handed over to the systems software. However, there is still a large contingent of both launch and flight controllers supporting each mission. New technology can update this philosophy to increase mission assurance and reduce the cost of inter-planetary exploration. The advent of model-based diagnosis and intelligent planning software enables spacecraft to handle most routine problems automatically and allocate resources in a flexible way to realize mission objectives. The manifests for recent missions include multiple subsystems and complex experiments. Spacecraft must operate at longer distances from earth where communications delays make earthbound command and control impractical. NASA's Ames Research Center (ARC) has demonstrated the utility of onboard diagnosis and planning with the Remote Agent experiment in 1999. KSC has pioneered model-based diagnosis and demonstrated its utility for ground support operations. KSC and ARC are cooperating in research to improve the state of the art of this technology. This paper highlights model-based reasoning applications for Moon and Mars missions including in-situ resource utilization and enhanced vehicle health monitoring.
A Delphi Forecast of Technology in Education.
ERIC Educational Resources Information Center
Robinson, Burke E.
The forecast reported here surveys expected utilization levels, organizational structures, and values concerning technology in education in 1990. The focus is upon educational technology and forecasting methodology; televised instruction, computer-assisted instruction (CAI), and information services are considered. The methodology employed…
The Use of Information Operations (IO) in Immersive Virtual Environments (IVE)
2010-06-01
are motivated or persuaded when interacting with computing products rather than through them. [26] In 2003, Dr. B. J. Fogg, leader of the Stanford...comparable IO utility may be possible through the other computing technologies listed. [Figure 6. Captology Focus. From [25]] In his book, Dr. Fogg...Self-Representation on Behavior.” Human Communication Research, no. 33, pp. 271–290, 2007. [26] B. J. Fogg. Persuasive Technology: Using Computers
Dudding-Byth, Tracy; Baxter, Anne; Holliday, Elizabeth G; Hackett, Anna; O'Donnell, Sheridan; White, Susan M; Attia, John; Brunner, Han; de Vries, Bert; Koolen, David; Kleefstra, Tjitske; Ratwatte, Seshika; Riveros, Carlos; Brain, Steve; Lovell, Brian C
2017-12-19
Massively parallel genetic sequencing allows rapid testing of known intellectual disability (ID) genes. However, the discovery of novel syndromic ID genes requires molecular confirmation in at least a second or a cluster of individuals with an overlapping phenotype or similar facial gestalt. Using computer face-matching technology we report an automated approach to matching the faces of non-identical individuals with the same genetic syndrome within a database of 3681 images [1600 images of one of 10 genetic syndrome subgroups together with 2081 control images]. Using the leave-one-out method, two research questions were specified: 1) Using two-dimensional (2D) photographs of individuals with one of 10 genetic syndromes within a database of images, did the technology correctly identify more than expected by chance: i) a top match? ii) at least one match within the top five matches? or iii) at least one in the top 10 with an individual from the same syndrome subgroup? 2) Was there concordance between correct technology-based matches and whether two out of three clinical geneticists would have considered the diagnosis based on the image alone? The computer face-matching technology correctly identifies a top match, at least one correct match in the top five and at least one in the top 10 more than expected by chance (P < 0.00001). There was low agreement between the technology and clinicians, with higher accuracy of the technology when results were discordant (P < 0.01) for all syndromes except Kabuki syndrome. Although the accuracy of the computer face-matching technology was tested on images of individuals with known syndromic forms of intellectual disability, the results of this pilot study illustrate the potential utility of face-matching technology within deep phenotyping platforms to facilitate the interpretation of DNA sequencing data for individuals who remain undiagnosed despite testing the known developmental disorder genes.
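The evaluation questions above reduce to tallying top-1, top-5, and top-10 match rates over a leave-one-out run. The sketch below shows one way such rates could be computed from ranked match lists; the data structures and toy results are hypothetical and do not reflect the actual face-matching software.

```python
# Minimal sketch (not the actual face-matching pipeline): tallying top-1,
# top-5 and top-10 match rates in a leave-one-out evaluation, given a
# ranked list of matched individuals for each query image.

def top_k_rates(results, ks=(1, 5, 10)):
    """results: list of (query_syndrome, [syndromes of ranked matches])."""
    hits = {k: 0 for k in ks}
    for query_syndrome, ranked_syndromes in results:
        for k in ks:
            if query_syndrome in ranked_syndromes[:k]:
                hits[k] += 1
    n = len(results)
    return {k: hits[k] / n for k in ks}

# Hypothetical toy results for three query images
results = [
    ("Kabuki",   ["Kabuki", "Control", "Noonan"] + ["Control"] * 7),
    ("Noonan",   ["Control", "Control", "Noonan"] + ["Control"] * 7),
    ("Williams", ["Control"] * 10),
]
print(top_k_rates(results))
```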
An Architecture for Cross-Cloud System Management
NASA Astrophysics Data System (ADS)
Dodda, Ravi Teja; Smith, Chris; van Moorsel, Aad
The emergence of the cloud computing paradigm promises flexibility and adaptability through on-demand provisioning of compute resources. As the utilization of cloud resources extends beyond a single provider, for business as well as technical reasons, the issue of effectively managing such resources comes to the fore. Different providers expose different interfaces to their compute resources utilizing varied architectures and implementation technologies. This heterogeneity poses a significant system management problem, and can limit the extent to which the benefits of cross-cloud resource utilization can be realized. We address this problem through the definition of an architecture to facilitate the management of compute resources from different cloud providers in a homogeneous manner. This preserves the flexibility and adaptability promised by the cloud computing paradigm, whilst enabling the benefits of cross-cloud resource utilization to be realized. The practical efficacy of the architecture is demonstrated through an implementation utilizing compute resources managed through different interfaces on the Amazon Elastic Compute Cloud (EC2) service. Additionally, we provide empirical results highlighting the performance differential of these different interfaces, and discuss the impact of this performance differential on efficiency and profitability.
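The core idea, wrapping heterogeneous provider interfaces behind one homogeneous management API, resembles a classic adapter pattern. The sketch below is purely illustrative: the class names are hypothetical and do not correspond to the paper's architecture or to any real provider SDK.

```python
# Illustrative sketch only (hypothetical classes, not the paper's architecture
# or any real provider SDK): wrapping heterogeneous provider interfaces behind
# one homogeneous management API, the core idea of cross-cloud management.
from abc import ABC, abstractmethod

class ComputeResource(ABC):
    @abstractmethod
    def start(self, image_id: str) -> str: ...
    @abstractmethod
    def stop(self, instance_id: str) -> None: ...

class ProviderAAdapter(ComputeResource):
    def start(self, image_id: str) -> str:
        # would call provider A's SOAP/REST interface here
        return f"a-instance-for-{image_id}"
    def stop(self, instance_id: str) -> None:
        print(f"stopping {instance_id} via provider A")

class ProviderBAdapter(ComputeResource):
    def start(self, image_id: str) -> str:
        # would call provider B's (differently shaped) interface here
        return f"b-instance-for-{image_id}"
    def stop(self, instance_id: str) -> None:
        print(f"stopping {instance_id} via provider B")

# A manager can now treat resources from different clouds uniformly.
for provider in (ProviderAAdapter(), ProviderBAdapter()):
    inst = provider.start("base-image")
    provider.stop(inst)
```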
Information Technology and the Autonomous Control of a Mars In-Situ Propellant Production System
NASA Technical Reports Server (NTRS)
Gross, Anthony R.; Sridhar, K. R.; Larson, William E.; Clancy, Daniel J.; Peschur, Charles; Briggs, Geoffrey A.; Zornetzer, Steven F. (Technical Monitor)
1999-01-01
With the rapidly increasing performance of information technology, i.e., computer hardware and software systems, as well as networks and communication systems, a new capability is being developed that holds the clear promise of greatly increased exploration capability, along with dramatically reduced design, development, and operating costs. These new intelligent systems technologies, utilizing knowledge-based software and very high performance computer systems, will provide new design and development tools, scheduling mechanisms, and vehicle and system health monitoring capabilities. In addition, specific technologies such as neural nets will provide a degree of machine intelligence and associated autonomy which has previously been unavailable to the mission and spacecraft designer and to the system operator. One of the most promising applications of these new information technologies is to the area of in situ resource utilization. Useful resources such as oxygen, compressed carbon dioxide, water, methane, and buffer gases can be extracted and/or generated from planetary atmospheres, such as the Martian atmosphere. These products, when used for propulsion and life-support needs can provide significant savings in the launch mass and costs for both robotic and crewed missions. In the longer term the utilization of indigenous resources is an enabling technology that is vital to sustaining long duration human presence on Mars. This paper will present the concepts that are currently under investigation and development for mining the Martian atmosphere, such as temperature-swing adsorption, zirconia electrolysis etc., to create propellants and life-support materials. This description will be followed by an analysis of the information technology and control needs for the reliable and autonomous operation of such processing plants in a fault tolerant manner, as well as the approach being taken for the development of the controlling software. Finally, there will be a brief discussion of the verification and validation process so crucial to the implementation of mission-critical software.
Assessment of the National Combustion Code
NASA Technical Reports Server (NTRS)
Liu, nan-Suey; Iannetti, Anthony; Shih, Tsan-Hsing
2007-01-01
The advancements made during the last decade in the areas of combustion modeling, numerical simulation, and computing platforms have greatly facilitated the use of CFD-based tools in the development of combustion technology. Further development of verification, validation and uncertainty quantification will have a profound impact on the reliability and utility of these CFD-based tools. The objectives of the present effort are to establish a baseline for the National Combustion Code (NCC) and experimental data, as well as to document current capabilities and identify gaps for further improvements.
Software-defined Radio Based Measurement Platform for Wireless Networks
Chao, I-Chun; Lee, Kang B.; Candell, Richard; Proctor, Frederick; Shen, Chien-Chung; Lin, Shinn-Yan
2015-01-01
End-to-end latency is critical to many distributed applications and services that are based on computer networks. There has been a dramatic push to adopt wireless networking technologies and protocols (such as WiFi, ZigBee, WirelessHART, Bluetooth, ISA100.11a, etc.) into time-critical applications. Examples of such applications include industrial automation, telecommunications, power utility, and financial services. While performance measurement of wired networks has been extensively studied, measuring and quantifying the performance of wireless networks face new challenges and demand different approaches and techniques. In this paper, we describe the design of a measurement platform based on the technologies of software-defined radio (SDR) and IEEE 1588 Precision Time Protocol (PTP) for evaluating the performance of wireless networks. PMID:27891210
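With both endpoints disciplined by IEEE 1588 PTP, one-way latency can be estimated directly from send and receive timestamps. The sketch below shows that arithmetic under the assumption of negligible residual clock offset; the timestamp values are invented and this is not the measurement platform's code.

```python
# Minimal sketch (hypothetical values): with sender and receiver clocks
# disciplined by IEEE 1588 PTP, one-way end-to-end latency reduces to a
# timestamp difference, minus any residual synchronization offset.

def one_way_latency_us(t_send_ns: int, t_recv_ns: int, residual_offset_ns: int = 0) -> float:
    """Latency in microseconds from PTP-synchronized timestamps."""
    return (t_recv_ns - t_send_ns - residual_offset_ns) / 1_000.0

# Hypothetical packet timestamps captured on two PTP-synchronized nodes
samples = [(1_000_000, 1_485_000), (2_000_000, 2_512_000), (3_000_000, 3_470_000)]
latencies = [one_way_latency_us(s, r) for s, r in samples]
print(f"mean latency: {sum(latencies) / len(latencies):.1f} us")
```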
ERIC Educational Resources Information Center
Garfield, Gary M.; McDonough, Suzanne
This book discusses how to effectively integrate technology into the classroom. It examines the benefits of curriculum development utilizing technology and presents sample learning activities. Highlights include: technology's past and present role in education; access to computers; the roles of teacher and learner; professional development;…
Design of an LVDS to USB3.0 adapter and application
NASA Astrophysics Data System (ADS)
Qiu, Xiaohan; Wang, Yu; Zhao, Xin; Chang, Zhen; Zhang, Quan; Tian, Yuze; Zhang, Yunyi; Lin, Fang; Liu, Wenqing
2016-10-01
The USB 3.0 specification was published in 2008, and with the development of technology USB 3.0 is becoming popular. The LVDS (Low Voltage Differential Signaling) to USB 3.0 adapter connects the communication port of a spectrometer device to the USB 3.0 port of a computer and converts the spectrometer's LVDS output data to USB. In order to adapt to changing and developing technology, the LVDS to USB 3.0 adapter was designed and developed based on an earlier LVDS to USB 2.0 adapter. The CYUSB3014, a new generation of USB bus interface chip produced by Cypress and conforming to the USB 3.0 communication protocol, utilizes GPIF-II (GPIF, general programmable interface) to connect the FPGA and increases the effective communication speed to 2 Gbps. Therefore, the adapter, based on USB 3.0 technology, is able to connect more spectrometers to a single computer and provides a technical basis for the development of higher speed industrial cameras. This article describes the design and development process of the LVDS to USB 3.0 adapter.
Interest in Use of Technology for Healthcare Among Veterans Receiving Treatment for Mental Health.
Miller, Christopher J; McInnes, D Keith; Stolzmann, Kelly; Bauer, Mark S
2016-10-01
There is great interest in leveraging technology, including cell phones and computers, to improve healthcare. A range of e-health applications pertaining to mental health such as messaging for prescription refill or mobile device videoconferencing are becoming more available, but little is known about the mental health patient's interest in using these newer applications. We mailed a survey to 300 patients seen in the general mental health clinic of a local Veterans Affairs Medical Center. Survey questions focused on interest in use of cell phones, tablets, and other computers in patients' interactions with the healthcare system. A total of 74 patients, primarily treated for depression, post-traumatic stress disorder, or anxiety disorders, returned completed surveys. Nearly all reported having a cell phone (72/74, 97%), but fewer than half reported having a smartphone (35/74, 47%). Overall, a substantial majority (64/74, 86%) had access to an Internet-capable device (smartphone or computer, including tablets). Respondents appeared to prefer computers to cell phones for some health-related communications, but did not express differential interest for other tasks (such as receiving appointment reminders). Interest in use was higher among younger veterans. Most veterans with a mental health diagnosis have access to technology (including cell phones and computers) and are interested in using that technology for some types of healthcare-related communications. While there is capacity to utilize information technology for healthcare purposes in this population, interests vary widely, and a substantial minority does not have access to relevant devices. Although interest in using computers for health-related communication was higher than interest in using cell phones, single-platform technology-based interventions may nonetheless exclude crucial segments of the population.
Dynamic electronic institutions in agent oriented cloud robotic systems.
Nagrath, Vineet; Morel, Olivier; Malik, Aamir; Saad, Naufal; Meriaudeau, Fabrice
2015-01-01
The dot-com bubble burst in the year 2000, followed by a swift movement towards resource virtualization and the cloud computing business model. Cloud computing emerged not as a new form of computing or network technology but as a mere remoulding of existing technologies to suit a new business model. Cloud robotics is understood as the adaptation of cloud computing ideas for robotic applications. Current efforts in cloud robotics stress developing robots that utilize the computing and service infrastructure of the cloud, without debating the underlying business model. HTM5 is an OMG MDA-based meta-model for agent-oriented development of cloud robotic systems. The trade view of HTM5 promotes peer-to-peer trade amongst software agents. HTM5 agents represent various cloud entities and implement their business logic on cloud interactions. Trade in a peer-to-peer cloud robotic system is based on relationships and contracts amongst several agent subsets. Electronic Institutions are associations of heterogeneous intelligent agents which interact with each other following predefined norms. In Dynamic Electronic Institutions (DEIs), the process of formation, reformation and dissolution of institutions is automated, leading to run-time adaptations in groups of agents. DEIs in agent-oriented cloud robotic ecosystems bring order and group intellect. This article presents DEI implementations through the HTM5 methodology.
Raster Scan Computer Image Generation (CIG) System Based On Refresh Memory
NASA Astrophysics Data System (ADS)
Dichter, W.; Doris, K.; Conkling, C.
1982-06-01
A full color, Computer Image Generation (CIG) raster visual system has been developed which provides a high level of training sophistication by utilizing advanced semiconductor technology and innovative hardware and firmware techniques. Double buffered refresh memory and efficient algorithms eliminate the problem of conventional raster line ordering by allowing the generated image to be stored in a random fashion. Modular design techniques and simplified architecture provide significant advantages in reduced system cost, standardization of parts, and high reliability. The major system components are a general purpose computer to perform interfacing and data base functions; a geometric processor to define the instantaneous scene image; a display generator to convert the image to a video signal; an illumination control unit which provides final image processing; and a CRT monitor for display of the completed image. Additional optional enhancements include texture generators, increased edge and occultation capability, curved surface shading, and data base extensions.
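The double-buffered refresh memory described above lets the image generator write pixels in arbitrary order into one buffer while the other is displayed, then swap at retrace. The sketch below is a software illustration of that idea only; it is not the CIG hardware design.

```python
# Illustrative sketch only (not the CIG hardware design): the double-buffering
# idea described above, where one frame buffer is displayed while the other is
# written in arbitrary (random) order, then the two are swapped.

class DoubleBuffer:
    def __init__(self, width, height):
        self.front = [[0] * width for _ in range(height)]  # being displayed
        self.back = [[0] * width for _ in range(height)]   # being drawn into

    def write_pixel(self, x, y, value):
        # scene generation may touch pixels in any order
        self.back[y][x] = value

    def swap(self):
        # at vertical retrace, the completed image becomes the displayed one
        self.front, self.back = self.back, self.front

fb = DoubleBuffer(4, 3)
fb.write_pixel(2, 1, 255)
fb.swap()
print(fb.front[1][2])  # 255
```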
Costello, John P; Olivieri, Laura J; Krieger, Axel; Thabit, Omar; Marshall, M Blair; Yoo, Shi-Joon; Kim, Peter C; Jonas, Richard A; Nath, Dilip S
2014-07-01
The current educational approach for teaching congenital heart disease (CHD) anatomy to students involves instructional tools and techniques that have significant limitations. This study sought to assess the feasibility of utilizing present-day three-dimensional (3D) printing technology to create high-fidelity synthetic heart models with ventricular septal defect (VSD) lesions and applying these models to a novel, simulation-based educational curriculum for premedical and medical students. Archived, de-identified magnetic resonance images of five common VSD subtypes were obtained. These cardiac images were then segmented and built into 3D computer-aided design models using Mimics Innovation Suite software. An Objet500 Connex 3D printer was subsequently utilized to print a high-fidelity heart model for each VSD subtype. Next, a simulation-based educational curriculum using these heart models was developed and implemented in the instruction of 29 premedical and medical students. Assessment of this curriculum was undertaken with Likert-type questionnaires. High-fidelity VSD models were successfully created utilizing magnetic resonance imaging data and 3D printing. Following instruction with these high-fidelity models, all students reported significant improvement in knowledge acquisition (P < .0001), knowledge reporting (P < .0001), and structural conceptualization (P < .0001) of VSDs. It is feasible to use present-day 3D printing technology to create high-fidelity heart models with complex intracardiac defects. Furthermore, this tool forms the foundation for an innovative, simulation-based educational approach to teach students about CHD and creates a novel opportunity to stimulate their interest in this field. © The Author(s) 2014.
ERIC Educational Resources Information Center
Impelluso, Thomas J.
2009-01-01
Cognitive Load Theory (CLT) was used as a foundation to redesign a computer programming class for mechanical engineers, in which content was delivered with hybrid/distance technology. The effort confirmed the utility of CLT in course design. And it demonstrates that hybrid/distance learning is not merely a tool of convenience, but one, which, when…
ERIC Educational Resources Information Center
Kunzler, Jayson S.
2012-01-01
This dissertation describes a research study designed to explore whether customization of online instruction results in improved learning in a college business statistics course. The study involved utilizing computer spreadsheet technology to develop an intelligent tutoring system (ITS) designed to: a) collect and monitor individual real-time…
Oh, Pok-Ja; Kim, Il-Ok; Shin, Sung-Rae; Jung, Hoe-Kyung
2004-10-01
The purpose of this study was to develop Web-based multimedia content for Physical Examination and Health Assessment. The multimedia content was developed based on Jung's teaching and learning structure plan model, using the following 5 processes: 1) Analysis Stage, 2) Planning Stage, 3) Storyboard Framing and Production Stage, 4) Program Operation Stage, and 5) Final Evaluation Stage. The web-based multimedia content consisted of an intro movie, main page and sub pages. On the main page, there were 6 menu bars consisting of Announcement center, Information of professors, Lecture guide, Cyber lecture, Q&A, and Data centers, and a site map which introduced 15 weeks of lectures. In the operation of the web-based multimedia content, HTML, JavaScript, Flash, and multimedia technology (Audio and Video) were utilized, and the content consisted of text content, interactive content, animation, and audio & video. Consultation with experts in content, computer engineering, and educational technology was utilized in the development of these processes. Web-based multimedia content is expected to offer individualized and tailored learning opportunities to maximize and facilitate the effectiveness of the teaching and learning process. Therefore, multimedia content should be utilized concurrently with lectures in Physical Examination and Health Assessment classes as a vital teaching aid to make up for the weaknesses of the face-to-face teaching-learning method.
IMAGE: A Design Integration Framework Applied to the High Speed Civil Transport
NASA Technical Reports Server (NTRS)
Hale, Mark A.; Craig, James I.
1993-01-01
Effective design of the High Speed Civil Transport requires the systematic application of design resources throughout a product's life-cycle. Information obtained from the use of these resources is used for the decision-making processes of Concurrent Engineering. Integrated computing environments facilitate the acquisition, organization, and use of required information. State-of-the-art computing technologies provide the basis for the Intelligent Multi-disciplinary Aircraft Generation Environment (IMAGE) described in this paper. IMAGE builds upon existing agent technologies by adding a new component called a model. With the addition of a model, the agent can provide accountable resource utilization in the presence of increasing design fidelity. The development of a zeroth-order agent is used to illustrate agent fundamentals. Using a CATIA(TM)-based agent from previous work, a High Speed Civil Transport visualization system linking CATIA, FLOPS, and ASTROS will be shown. These examples illustrate the important role of the agent technologies used to implement IMAGE, and together they demonstrate that IMAGE can provide an integrated computing environment for the design of the High Speed Civil Transport.
National Geographic Society Kids Network: Report on 1994 teacher participants
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
In 1994, National Geographic Society Kids Network, a computer/telecommunications-based science curriculum, was presented to elementary and middle school teachers through summer programs sponsored by NGS and US DOE. The network program assists teachers in understanding the process of doing science; understanding the role of computers and telecommunications in the study of science, math, and engineering; and utilizing computers and telecommunications appropriately in the classroom. The program enables teachers to integrate science, math, and technology with other subjects with the ultimate goal of encouraging students of all abilities to pursue careers in science/math/engineering. This report assesses the impact of the network program on participating teachers.
NASA Astrophysics Data System (ADS)
Beshears, Ronald D.; Hediger, Lisa H.
1994-10-01
The Advanced Computed Tomography Inspection System (ACTIS) was developed by the Marshall Space Flight Center to support in-house solid propulsion test programs. ACTIS represents a significant advance in state-of-the-art inspection systems. Its flexibility and superior technical performance have made ACTIS very popular, both within and outside the aerospace community. Through Technology Utilization efforts, ACTIS has been applied to inspection problems in commercial aerospace, lumber, automotive, and nuclear waste disposal industries. ACTIS has even been used to inspect items of historical interest. ACTIS has consistently produced valuable results, providing information which was unattainable through conventional inspection methods. Although many successes have already been demonstrated, the full potential of ACTIS has not yet been realized. It is currently being applied in the commercial aerospace industry by Boeing Aerospace Company. Smaller systems, based on ACTIS technology are becoming increasingly available. This technology has much to offer small businesses and industry, especially in identifying design and process problems early in the product development cycle to prevent defects. Several options are available to businesses interested in pursuing this technology.
NASA Technical Reports Server (NTRS)
Hediger, Lisa H.
1991-01-01
The Advanced Computed Tomography Inspection System (ACTIS) was developed by NASA Marshall to support solid propulsion test programs. ACTIS represents a significant advance in state-of-the-art inspection systems. Its flexibility and superior technical performance have made ACTIS very popular, both within and outside the aerospace community. Through technology utilization efforts, ACTIS has been applied to inspection problems in commercial aerospace, lumber, automotive, and nuclear waste disposal industries. ACTIS has been used to inspect items of historical interest. ACTIS has consistently produced valuable results, providing information which was unattainable through conventional inspection methods. Although many successes have already been shown, the full potential of ACTIS has not yet been realized. It is currently being applied in the commercial aerospace industry by Boeing. Smaller systems, based on ACTIS technology, are becoming increasingly available. This technology has much to offer the small business and industry, especially in identifying design and process problems early in the product development cycle to prevent defects. Several options are available to businesses interested in this technology.
Planetary cartography in the next decade: Digital cartography and emerging opportunities
NASA Technical Reports Server (NTRS)
1989-01-01
Planetary maps being produced today will represent views of the solar system for many decades to come. The primary objective of the planetary cartography program is to produce the most complete and accurate maps from hundreds of thousands of planetary images in support of scientific studies and future missions. Here, the utilization of digital techniques and digital bases in response to recent advances in computer technology are emphasized.
An Analysis of Defense Information and Information Technology Articles: A Sixteen-Year Perspective
2009-03-01
exploratory,” or “subjective” (Denzin & Lincoln, 2000). Existing Research This research is based on content analysis methodologies utilized by Carter...same codes (Denzin & Lincoln, 2000). Different analysts should code the same text in a similar manner (Weber, 1990). Typically, researchers compute...chosen. Krippendorff recommends an agreement level of at least .70 (Krippendorff, 2004). Some scholars use a cut-off rate of .80 (Denzin & Lincoln
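The agreement thresholds cited above (.70 and .80) are typically checked against an intercoder reliability statistic. The sketch below computes only simple percent agreement against those cut-offs with invented codes; note this is not Krippendorff's chance-corrected alpha.

```python
# Minimal sketch (hypothetical codes): simple percent agreement between two
# coders, checked against the .70/.80 cut-offs cited above. This is the crude
# agreement rate, not Krippendorff's chance-corrected alpha.

def percent_agreement(codes_a, codes_b):
    matches = sum(a == b for a, b in zip(codes_a, codes_b))
    return matches / len(codes_a)

coder_1 = ["IT", "policy", "IT", "workforce", "IT", "policy"]
coder_2 = ["IT", "policy", "IT", "policy",    "IT", "policy"]
agreement = percent_agreement(coder_1, coder_2)
print(f"agreement = {agreement:.2f}",
      "meets .70 threshold" if agreement >= 0.70 else "below threshold")
```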
ERIC Educational Resources Information Center
Griffin, Irma Amado
This study describes a pilot program utilizing various multimedia computer programs on a MacQuadra 840 AV. The target group consisted of six advanced dance students who participated in the pilot program within the dance curriculum by creating a database of dance movement using video and still photography. The students combined desktop publishing,…
Adiabatic quantum computing with spin qubits hosted by molecules.
Yamamoto, Satoru; Nakazawa, Shigeaki; Sugisaki, Kenji; Sato, Kazunobu; Toyota, Kazuo; Shiomi, Daisuke; Takui, Takeji
2015-01-28
A molecular spin quantum computer (MSQC) requires electron spin qubits, which pulse-based electron spin/magnetic resonance (ESR/MR) techniques can manipulate to implement quantum gate operations in open-shell molecular entities. Importantly, nuclear spins, which are topologically connected, particularly in organic molecular spin systems, serve as client qubits, while electron spins play the role of bus qubits. Here, we introduce the implementation of an adiabatic quantum algorithm, suggesting the possible utilization of molecular spins with optimized spin structures for MSQCs. We exemplify the adiabatic factorization of 21, compared with the corresponding nuclear magnetic resonance (NMR) case. Two molecular spins are selected: one is a molecular spin composed of three exchange-coupled electrons as electron-only qubits, and the other is an electron bus qubit with two client nuclear spin qubits. Their electronic spin structures are well characterized in terms of the quantum mechanical behaviour of the spin Hamiltonian. The implementation of adiabatic quantum computing/computation (AQC) has, for the first time, been achieved by establishing ESR/MR pulse sequences for effective spin Hamiltonians in a fully controlled manner of spin manipulation. The resulting pulse sequences have been compared with the NMR experiments and show much faster CPU times, corresponding to the interaction strength between the spins. Significant differences are shown in rotational operations and pulse intervals for ESR/MR operations. As a result, we suggest the advantages and possible utilization of the time-evolution-based AQC approach for molecular spin quantum computers and molecular spin quantum simulators underlain by sophisticated ESR/MR pulsed spin technology.
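As a point of reference for the adiabatic scheme described above, the standard textbook form of an adiabatic sweep is sketched below; the interpolation and the gap condition are generic AQC notation (H_0 is the driver, H_P the problem Hamiltonian, Δ_min the minimum gap), not expressions taken from this paper:

```latex
H(t) \;=\; \Bigl(1 - \tfrac{t}{T}\Bigr) H_{0} \;+\; \tfrac{t}{T}\, H_{P}, \qquad 0 \le t \le T,
\qquad\text{with the sweep time satisfying}\qquad
T \;\gg\; \frac{\max_{t}\,\lVert \partial_{t} H(t) \rVert}{\Delta_{\min}^{2}} .
```

For a factorization instance such as 21, H_P is typically chosen so that its ground state encodes the factors, while H_0 has an easily prepared ground state; the pulse sequences reported in the paper realize the effective Hamiltonians along such a sweep.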
Safety Metrics for Human-Computer Controlled Systems
NASA Technical Reports Server (NTRS)
Leveson, Nancy G; Hatanaka, Iwao
2000-01-01
The rapid growth of computer technology and innovation has played a significant role in the rise of computer automation of human tasks in modern production systems across all industries. Although the rationale for automation has been to eliminate "human error" or to relieve humans from manual repetitive tasks, various computer-related hazards and accidents have emerged as a direct result of increased system complexity attributed to computer automation. The risk assessment techniques utilized for electromechanical systems are not suitable for today's software-intensive systems or complex human-computer controlled systems. This thesis will propose a new systemic model-based framework for analyzing risk in safety-critical systems where both computers and humans are controlling safety-critical functions. A new systems accident model will be developed based upon modern systems theory and human cognitive processes to better characterize system accidents, the role of human operators, and the influence of software in its direct control of significant system functions. Better risk assessments will then be achievable through the application of this new framework to complex human-computer controlled systems.
Incorporating Concept Mapping in Project-Based Learning: Lessons from Watershed Investigations
NASA Astrophysics Data System (ADS)
Rye, James; Landenberger, Rick; Warner, Timothy A.
2013-06-01
The concept map tool set forth by Novak and colleagues is underutilized in education. A meta-analysis has encouraged teachers to make extensive use of concept mapping, and researchers have advocated computer-based concept mapping applications that exploit hyperlink technology. Through an NSF-sponsored geosciences education grant, middle and secondary science teachers participated in professional development to apply computer-based concept mapping in project-based learning (PBL) units that investigated local watersheds. Participants attended a summer institute, engaged in a summer-through-spring online learning academy, and presented PBL units at a subsequent fall science teachers' convention. The majority of the 17 teachers who attended the summer institute had previously used the concept mapping strategy with students and rated it highly. Of the 12 teachers who continued beyond summer, applications of concept mapping ranged from collaborative planning of PBL projects to building students' vocabulary to students producing maps related to the PBL driving question. Barriers to the adoption and use of concept mapping included technology access at the schools, lack of time for teachers to advance their technology skills, lack of student motivation to choose to learn, and student difficulty with linking terms. In addition to mitigating the aforementioned barriers, projects targeting teachers' use of technology tools may enhance adoption by recruiting teacher partners from schools, including a small number who are already proficient in the targeted technology, and by emphasizing the utility of the concept map as a planning tool.
NASA Technical Reports Server (NTRS)
Wong, M. D.
1974-01-01
The role of technology in nontraditional higher education, with particular emphasis on technology-based networks, is analyzed. Nontraditional programs, institutions, and consortia are briefly reviewed. Nontraditional programs which utilize technology are studied. Technology-based networks are surveyed and analyzed with regard to kinds of students, learning locations, technology utilization, interinstitutional relationships, cost aspects, problems, and future outlook.
Intelligent model-based diagnostics for vehicle health management
NASA Astrophysics Data System (ADS)
Luo, Jianhui; Tu, Fang; Azam, Mohammad S.; Pattipati, Krishna R.; Willett, Peter K.; Qiao, Liu; Kawamoto, Masayuki
2003-08-01
The recent advances in sensor technology, remote communication and computational capabilities, and standardized hardware/software interfaces are creating a dramatic shift in the way the health of vehicles is monitored and managed. These advances facilitate remote monitoring, diagnosis and condition-based maintenance of automotive systems. With the increased sophistication of electronic control systems in vehicles, there is a concomitant increased difficulty in the identification of the malfunction phenomena. Consequently, the current rule-based diagnostic systems are difficult to develop, validate and maintain. New intelligent model-based diagnostic methodologies that exploit the advances in sensor, telecommunications, computing and software technologies are needed. In this paper, we will investigate hybrid model-based techniques that seamlessly employ quantitative (analytical) models and graph-based dependency models for intelligent diagnosis. Automotive engineers have found quantitative simulation (e.g. MATLAB/SIMULINK) to be a vital tool in the development of advanced control systems. The hybrid method exploits this capability to improve the diagnostic system's accuracy and consistency, utilizes existing validated knowledge on rule-based methods, enables remote diagnosis, and responds to the challenges of increased system complexity. The solution is generic and has the potential for application in a wide range of systems.
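The paper combines quantitative models with graph-based dependency models; the sketch below illustrates only the dependency-model half in a minimal form, assuming a boolean fault-to-test signature table (the component and test names are hypothetical, not from the paper):

```python
# Minimal dependency-model diagnosis sketch: each fault is associated with the
# set of tests it can cause to fail; observed test outcomes are matched against
# these signatures to rank candidate faults. Purely illustrative.
DEPENDENCY = {                      # fault -> tests that the fault affects
    "fuel_injector": {"t_rpm", "t_emissions"},
    "o2_sensor":     {"t_emissions"},
    "alternator":    {"t_voltage", "t_rpm"},
}

def rank_candidates(failed_tests, passed_tests):
    """Score each fault by how well its signature explains the observed outcomes."""
    scores = []
    for fault, affected in DEPENDENCY.items():
        explained = len(failed_tests & affected)      # failed tests the fault explains
        contradicted = len(passed_tests & affected)   # passing tests the fault says should fail
        scores.append((fault, explained - contradicted))
    return sorted(scores, key=lambda s: s[1], reverse=True)

if __name__ == "__main__":
    print(rank_candidates(failed_tests={"t_rpm", "t_emissions"}, passed_tests={"t_voltage"}))
```

A hybrid scheme of the kind described would fuse such a ranking with residuals computed from the quantitative (e.g., MATLAB/SIMULINK) models rather than rely on signatures alone.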
No Photon Left Behind: Advanced Optics at ARPA-E for Buildings and Solar Energy
NASA Astrophysics Data System (ADS)
Branz, Howard M.
2015-04-01
Key technology challenges in building efficiency and solar energy utilization require transformational optics, plasmonics and photonics technologies. We describe advanced optical technologies funded by the Advanced Research Projects Agency - Energy. Buildings technologies include a passive daytime photonic cooler, infra-red computer vision mapping for energy audit, and dual-band electrochromic windows based on plasmonic absorption. Solar technologies include novel hybrid energy converters that combine high-efficiency photovoltaics with concentrating solar thermal collection and storage. Because the marginal cost of thermal energy storage is low, these systems enable generation of inexpensive and dispatchable solar energy that can be deployed when the sun doesn't shine. The solar technologies under development include nanoparticle plasmonic spectrum splitting, Rugate filter interference structures and photovoltaic cells that can operate efficiently at over 400 °C.
ERIC Educational Resources Information Center
Shaqour, Ali Zuhdi H.
2005-01-01
This study introduces a "Technology Integration Model" for a learning environment utilizing constructivist learning principles and integrating new technologies namely computers and the Internet into pre-service teacher training programs. The technology integrated programs and learning environments may assist learners to gain experiences…
Sabti, Ahmed Abdulateef; Chaichan, Rasha Sami
2014-01-01
This study examines the attitudes of Saudi Arabian high school students toward the use of computer technologies in learning English. The study also discusses the possible barriers that affect and limit the actual usage of computers. A quantitative approach was applied in this research, which involved 30 Saudi Arabian students at a high school in Kuala Lumpur, Malaysia. The respondents comprised 15 males and 15 females with ages between 16 years and 18 years. Two instruments, namely, the Scale of Attitude toward Computer Technologies (SACT) and Barriers affecting Students' Attitudes and Use (BSAU), were used to collect data. The Technology Acceptance Model (TAM) of Davis (1989) was utilized. The analysis of the study revealed gender differences in attitudes toward the use of computer technologies in learning English. Female students showed more positive attitudes toward the use of computer technologies in learning English than males. Both male and female participants demonstrated highly positive perceptions of the Usefulness and perceived Ease of Use of computer technologies in learning English. Three barriers that affected and limited the use of computer technologies in learning English were identified by the participants. These barriers are skill, equipment, and motivation. Among these barriers, skill had the highest effect, whereas motivation showed the least effect.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kumar, Vinod
2017-05-05
High fidelity computational models of thermocline-based thermal energy storage (TES) were developed. The research goal was to advance the understanding of a single-tank, nanofluidized, molten salt based thermocline TES system under various concentrations and sizes of the suspended particles. Our objectives were to utilize sensible heat storage that operates with the least irreversibility by using nanoscale physics. This was achieved by performing computational analysis of several storage designs, analyzing storage efficiency, and estimating cost effectiveness for the TES systems under a concentrating solar power (CSP) scheme using molten salt as the storage medium. Since TES is one of the most costly but important components of a CSP plant, an efficient TES system has the potential to make the electricity generated from solar technologies cost competitive with conventional sources of electricity.
EPA CHEMICAL PRIORITIZATION COMMUNITY OF PRACTICE.
IN 2005 THE NATIONAL CENTER FOR COMPUTATIONAL TOXICOLOGY (NCCT) ORGANIZED THE EPA CHEMICAL PRIORITIZATION COMMUNITY OF PRACTICE (CPCP) TO PROVIDE A FORUM FOR DISCUSSING THE UTILITY OF COMPUTATIONAL CHEMISTRY, HIGH-THROUGHPUT SCREENING (HTS) AND VARIOUS TOXICOGENOMIC TECHNOLOGIES FOR CH...
NASA Technical Reports Server (NTRS)
Schulte, Erin
2017-01-01
As augmented and virtual reality grows in popularity, and more researchers focus on its development, other fields of technology have grown in the hopes of integrating with the up-and-coming hardware currently on the market. Namely, there has been a focus on how to make an intuitive, hands-free human-computer interaction (HCI) system utilizing AR and VR that allows users to control their technology with little to no physical interaction with hardware. Computer vision, which is utilized in devices such as the Microsoft Kinect, webcams, and other similar hardware, has shown potential in assisting with the development of an HCI system that requires next to no human interaction with computing hardware and software. Object and facial recognition are two subsets of computer vision, both of which can be applied to HCI systems in the fields of medicine, security, industrial development, and other similar areas.
COMPUTER-AIDED DRUG DISCOVERY AND DEVELOPMENT (CADDD): in silico-chemico-biological approach
Kapetanovic, I.M.
2008-01-01
It is generally recognized that drug discovery and development are very time- and resource-consuming processes. There is an ever-growing effort to apply computational power to the combined chemical and biological space in order to streamline drug discovery, design, development and optimization. In the biomedical arena, computer-aided or in silico design is being utilized to expedite and facilitate hit identification, hit-to-lead selection, optimize the absorption, distribution, metabolism, excretion and toxicity profile, and avoid safety issues. Commonly used computational approaches include ligand-based drug design (pharmacophore, a 3-D spatial arrangement of chemical features essential for biological activity), structure-based drug design (drug-target docking), and quantitative structure-activity and quantitative structure-property relationships. Regulatory agencies as well as the pharmaceutical industry are actively involved in the development of computational tools that will improve the effectiveness and efficiency of the drug discovery and development process, decrease the use of animals, and increase predictability. It is expected that the power of CADDD will grow as the technology continues to evolve. PMID:17229415
Quantitative Investigation of the Technologies That Support Cloud Computing
ERIC Educational Resources Information Center
Hu, Wenjin
2014-01-01
Cloud computing is dramatically shaping modern IT infrastructure. It virtualizes computing resources, provides elastic scalability, serves as a pay-as-you-use utility, simplifies the IT administrators' daily tasks, enhances the mobility and collaboration of data, and increases user productivity. We focus on providing generalized black-box…
ERIC Educational Resources Information Center
Pommerich, Mary
2007-01-01
Computer administered tests are becoming increasingly prevalent as computer technology becomes more readily available on a large scale. For testing programs that utilize both computer and paper administrations, mode effects are problematic in that they can result in examinee scores that are artificially inflated or deflated. As such, researchers…
BIM Based Virtual Environment for Fire Emergency Evacuation
Rezgui, Yacine; Ong, Hoang N.
2014-01-01
Recent building emergency management research has highlighted the need for the effective utilization of dynamically changing building information. BIM (building information modelling) can play a significant role in this process due to its comprehensive and standardized data format and integrated process. This paper introduces a BIM based virtual environment supported by virtual reality (VR) and a serious game engine to address several key issues for building emergency management, for example, timely two-way information updating and better emergency awareness training. The focus of this paper lies in how to utilize BIM as a comprehensive building information provider to work with virtual reality technologies to build an adaptable immersive serious game environment to provide real-time fire evacuation guidance. The innovation lies in the seamless integration between BIM and a serious game based virtual reality (VR) environment, aiming at practical problem solving by leveraging state-of-the-art computing technologies. The system has been tested for its robustness and functionality against the development requirements, and the results showed promising potential to support more effective emergency management. PMID:25197704
Academic Library Resource Sharing through Bibliographic Utility Program Participation.
ERIC Educational Resources Information Center
Trochim, Mary Kane
Information on the growth of bibliographic utilities and academic library networking is presented in this report, as well as profiles of interlibrary loan activity at six academic libraries who are members of a major bibliographic utility. Applications of computer technology and network participation in academic libraries, and the major events in…
ERIC Educational Resources Information Center
Trochim, Mary Kane
This summary briefly outlines a separate report containing information on the growth of bibliographic utilities and academic library networking, as well as profiles of interlibrary loan activity at six academic libraries who are members or users of a major bibliographic utility. Applications of computer technology and network participation in…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Germain, Shawn
Nuclear Power Plant (NPP) refueling outages create some of the most challenging activities the utilities face in both tracking and coordinating thousands of activities in a short period of time. Other challenges, including nuclear safety concerns arising from atypical system configurations and resource allocation issues, can create delays and schedule overruns, driving up outage costs. Today the majority of the outage communication is done using processes that do not take advantage of advances in modern technologies that enable enhanced communication, collaboration and information sharing. Some of the common practices include: runners that deliver paper-based requests for approval, radios, telephones, desktop computers, daily schedule printouts, and static whiteboards that are used to display information. Many gains have been made to reduce the challenges facing outage coordinators; however, new opportunities can be realized by utilizing modern technological advancements in communication and information tools that can enhance the collective situational awareness of plant personnel, leading to improved decision-making. Ongoing research as part of the Light Water Reactor Sustainability Program (LWRS) has been targeting NPP outage improvement. As part of this research, various applications of collaborative software have been demonstrated through pilot project utility partnerships. Collaboration software can be utilized as part of the larger concept of Computer-Supported Cooperative Work (CSCW). Collaborative software can be used for emergent issue resolution, Outage Control Center (OCC) displays, and schedule monitoring. Use of collaboration software enables outage staff and subject matter experts (SMEs) to view and update critical outage information from any location on site or off.
Magnetic tunnel junction based spintronic logic devices
NASA Astrophysics Data System (ADS)
Lyle, Andrew Paul
The International Technology Roadmap for Semiconductors (ITRS) predicts that complementary metal oxide semiconductor (CMOS) based technologies will hit their last generation on or near the 16 nm node, which we expect to reach by the year 2025. Thus future advances in computational power will not be realized from ever-shrinking device sizes, but rather by 'outside the box' designs and new physics, including molecular or DNA based computation, organics, magnonics, or spintronics. This dissertation investigates magnetic logic devices for post-CMOS computation. Three different architectures were studied, each relying on a different magnetic mechanism to compute logic functions. Each design has its benefits and challenges that must be overcome. This dissertation focuses on pushing each design from the drawing board to a realistic logic technology. The first logic architecture is based on electrically connected magnetic tunnel junctions (MTJs) that allow direct communication between elements without intermediate sensing amplifiers. Two- and three-input logic gates, which consist of two and three MTJs connected in parallel, respectively, were fabricated and compared. The direct communication is realized by electrically connecting the output in series with the input and applying voltage across the series connections. The logic gates rely on the fact that a change in resistance at the input modulates the voltage that is needed to supply the critical current for spin transfer torque switching of the output. The change in resistance at the input resulted in a voltage margin of 50-200 mV and 250-300 mV for the closest input states of the three- and two-input designs, respectively. The two-input logic gate realizes the AND, NAND, NOR, and OR logic functions. The three-input logic gate realizes the Majority, AND, NAND, NOR, and OR logic operations. The second logic architecture utilizes magnetostatically coupled nanomagnets to compute logic functions, which is the basis of Magnetic Quantum Cellular Automata (MQCA). MQCA has the potential to be thousands of times more energy efficient than CMOS technology. While interesting, these systems are academic unless they can be interfaced with current technologies. This dissertation pushed past a major hurdle by experimentally demonstrating a spintronic input/output (I/O) interface for the magnetostatically coupled nanomagnets by incorporating MTJs. This spintronic interface allows individual nanomagnets to be programmed using spin transfer torque and read using a magnetoresistance structure. Additionally, the spintronic interface allows statistical data on the reliability of the magnetic coupling utilized for data propagation to be easily measured. The integration of spintronics and MQCA for an electrical interface to achieve a magnetic logic device with low power creates a competitive post-CMOS logic device. The final logic architecture that was studied used MTJs to compute logic functions and magnetic domain walls to communicate between gates. Simulations were used to optimize the design of this architecture. Spin transfer torque was used to compute the logic function at each MTJ gate and to drive the domain walls. The design demonstrated that multiple nanochannels could be connected to each MTJ to realize fan-out from the logic gates. As a result, this logic scheme eliminates the need for intermediate reads and conversions to pass information from one logic gate to another.
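For the electrically connected MTJ gates, the key mechanism is that the parallel input resistance shifts the voltage required to drive the critical switching current through the series-connected output junction. A back-of-the-envelope sketch of that idea follows, with assumed (not measured) resistance and current values; the numbers and helper names are hypothetical:

```python
# Illustrative calculation: the input MTJs sit in parallel, in series with the
# output MTJ. The input resistance state changes how much applied voltage is
# needed to push the critical switching current through the output junction.
R_P, R_AP = 1.0e3, 2.0e3      # parallel / antiparallel MTJ resistance (ohms), assumed
I_CRIT = 100e-6               # critical spin-transfer-torque switching current (A), assumed

def parallel(*resistances):
    return 1.0 / sum(1.0 / r for r in resistances)

def switching_voltage(input_states, r_out=R_P):
    """Voltage needed across the input network plus output MTJ to reach I_CRIT."""
    r_in = parallel(*(R_AP if s else R_P for s in input_states))
    return I_CRIT * (r_in + r_out)

for states in [(0, 0), (0, 1), (1, 1)]:
    print(states, f"{switching_voltage(states):.3f} V")
```

The spread between these required voltages for neighboring input states plays the role of the voltage margin reported in the dissertation, although the measured margins depend on the actual device resistances and critical currents.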
NASA Astrophysics Data System (ADS)
Binti Shamsuddin, Norsila
Technology advancement and development in a higher learning institution gives students a chance to be motivated to learn in depth in the information technology areas. Students should seize the opportunity to build their skills in these technologies in preparation for graduation. The curriculum itself can raise students' interest and persuade them to be directly involved in the evolvement of the technology. The aim of this study is to see how deep the students' involvement is, as well as their acceptance of the adoption of the technology used in Computer Graphics and Image Processing subjects. The study focuses on Bachelor students in the Faculty of Industrial Information Technology (FIIT), Universiti Industri Selangor (UNISEL): Bac. in Multimedia Industry, BSc. Computer Science, and BSc. Computer Science (Software Engineering). This study utilizes the new Unified Theory of Acceptance and Use of Technology (UTAUT) to further validate the model and enhance our understanding of the adoption of Computer Graphics and Image Processing technologies. Four (4) out of eight (8) independent factors in UTAUT will be studied with respect to the dependent factor.
Eslick, John C.; Ng, Brenda; Gao, Qianwen; ...
2014-12-31
Under the auspices of the U.S. Department of Energy’s Carbon Capture Simulation Initiative (CCSI), a Framework for Optimization and Quantification of Uncertainty and Sensitivity (FOQUS) has been developed. This tool enables carbon capture systems to be rapidly synthesized and rigorously optimized, in an environment that accounts for and propagates uncertainties in parameters and models. FOQUS currently enables (1) the development of surrogate algebraic models utilizing the ALAMO algorithm, which can be used for superstructure optimization to identify optimal process configurations, (2) simulation-based optimization utilizing derivative free optimization (DFO) algorithms with detailed black-box process models, and (3) rigorous uncertainty quantification through PSUADE. FOQUS utilizes another CCSI technology, the Turbine Science Gateway, to manage the thousands of simulated runs necessary for optimization and UQ. Thus, this computational framework has been demonstrated for the design and analysis of a solid sorbent based carbon capture system.
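The combination of black-box simulation, derivative-free optimization, and uncertainty propagation described above can be illustrated generically. The toy cost model, parameter distribution, and variable names below are assumptions for illustration only and do not represent the FOQUS, ALAMO, or PSUADE interfaces:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
# Fixed Monte Carlo samples of one uncertain parameter (assumed "sorbent capacity").
CAPACITY_SAMPLES = rng.normal(loc=3.0, scale=0.3, size=200)

def capture_cost(design, sorbent_capacity):
    """Toy black-box process model: cost as a function of two design variables
    and one uncertain parameter. Purely illustrative."""
    bed_length, solvent_rate = design
    return (bed_length - 2.0) ** 2 + (solvent_rate - 1.0) ** 2 + 5.0 / sorbent_capacity

def expected_cost(design):
    # Propagate parameter uncertainty by averaging over the fixed samples.
    return np.mean([capture_cost(design, c) for c in CAPACITY_SAMPLES])

# Derivative-free (Nelder-Mead) optimization of the expected cost.
result = minimize(expected_cost, x0=[1.0, 0.5], method="Nelder-Mead")
print(result.x, result.fun)
```

In a real workflow the inner function would be an expensive process simulation dispatched to many workers, which is the role the Turbine Science Gateway plays in the framework described above.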
NASA Technical Reports Server (NTRS)
Steinberg, R.
1984-01-01
It is suggested that the very short range forecast problem for aviation is one of data management rather than model development, and the possibility of improving the aviation forecast using current technology is underlined. The MERIT concept, which combines modeling technology with advanced man/computer interactive data management and enhancement techniques to provide a tailored, accurate and timely forecast for aviation, is outlined. MERIT includes utilization of the Lagrangian approach; extensive use of automated aircraft reports to complement the present data base and provide the most current observations; and the concept that a 2 to 12 hour forecast provided every 3 hr can meet the domestic needs of aviation instead of the present 18 and 24 hr forecasts provided every 12 hr.
NASA Technical Reports Server (NTRS)
Jani, Yashvant
1992-01-01
As part of the Research Institute for Computing and Information Systems (RICIS) activity, the reinforcement learning techniques developed at Ames Research Center are being applied to proximity and docking operations using the Shuttle and Solar Max satellite simulation. This activity is carried out in the software technology laboratory utilizing the Orbital Operations Simulator (OOS). This interim report provides the status of the project and outlines the future plans.
Accelerating Technology Development through Integrated Computation and Experimentation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shekhawat, Dushyant; Srivastava, Rameshwar D.; Ciferno, Jared
2013-08-15
This special section of Energy & Fuels comprises a selection of papers presented at the topical conference “Accelerating Technology Development through Integrated Computation and Experimentation”, sponsored and organized by the United States Department of Energy’s National Energy Technology Laboratory (NETL) as part of the 2012 American Institute of Chemical Engineers (AIChE) Annual Meeting held in Pittsburgh, PA, Oct 28-Nov 2, 2012. That topical conference focused on the latest research and development efforts in five main areas related to fossil energy, with each area focusing on the utilization of both experimental and computational approaches: (1) gas separations (membranes, sorbents, and solvents for CO2, H2, and O2 production), (2) CO2 utilization (enhanced oil recovery, chemical production, mineralization, etc.), (3) carbon sequestration (flow in natural systems), (4) advanced power cycles (oxy-combustion, chemical looping, gasification, etc.), and (5) fuel processing (H2 production for fuel cells).
Automatic computation of 2D cardiac measurements from B-mode echocardiography
NASA Astrophysics Data System (ADS)
Park, JinHyeong; Feng, Shaolei; Zhou, S. Kevin
2012-03-01
We propose a robust and fully automatic algorithm which computes the 2D echocardiography measurements recommended by the American Society of Echocardiography. The algorithm employs knowledge-based imaging technologies which can learn expert knowledge from training images and expert annotations. Based on the models constructed in the learning stage, the algorithm searches for the initial locations of the landmark points for the measurements by utilizing the heart structure of the left ventricle, including the mitral valve and aortic valve. It employs a pseudo anatomic M-mode image, generated by accumulating the line images in the 2D parasternal long axis view along time, to refine the measurement landmark points. Experimental results with a large volume of data show that the algorithm runs fast and is robust, with performance comparable to experts.
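The pseudo anatomic M-mode step amounts to stacking one image line from every frame along the time axis. A minimal sketch is below; the array layout and the choice of line are assumptions for illustration, not the authors' implementation:

```python
import numpy as np

def pseudo_m_mode(frames: np.ndarray, line_index: int) -> np.ndarray:
    """Build a pseudo anatomic M-mode image by extracting one image line
    (e.g., a line through the structure of interest) from every frame and
    stacking the lines along the time axis.
    frames: array of shape (time, rows, cols); returns (time, cols)."""
    return frames[:, line_index, :]

# Toy usage: 100 frames of a 64x64 B-mode sequence (random placeholder data).
frames = np.random.rand(100, 64, 64)
m_mode = pseudo_m_mode(frames, line_index=32)
print(m_mode.shape)   # (100, 64)
```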
Research on elastic resource management for multi-queue under cloud computing environment
NASA Astrophysics Data System (ADS)
CHENG, Zhenjing; LI, Haibo; HUANG, Qiulan; Cheng, Yaodong; CHEN, Gang
2017-10-01
As a new approach to managing computing resources, virtualization technology is more and more widely applied in the high-energy physics field. A virtual computing cluster based on OpenStack was built at IHEP, using HTCondor as the job queue management system. In a traditional static cluster, a fixed number of virtual machines are pre-allocated to the job queues of different experiments. However, this method cannot adapt well to the volatility of computing resource requirements. To solve this problem, an elastic computing resource management system under a cloud computing environment has been designed. This system performs unified management of virtual computing nodes on the basis of the job queues in HTCondor, based on dual resource thresholds as well as a quota service. A two-stage pool is designed to improve the efficiency of resource pool expansion. This paper will present several use cases of the elastic resource management system in IHEPCloud. Practical runs show that virtual computing resources dynamically expand or shrink as computing requirements change. Additionally, the CPU utilization ratio of computing resources was significantly increased when compared with traditional resource management. The system also performs well when there are multiple Condor schedulers and multiple job queues.
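A minimal sketch of a dual-threshold scaling decision of the kind described above is given below; the threshold values, the one-job-per-node utilization estimate, and the function signature are illustrative assumptions rather than the IHEPCloud implementation:

```python
# Dual-threshold elastic scaling sketch: expand the virtual-node pool when the
# per-queue utilization is high and idle jobs are waiting; shrink it when
# utilization drops below the lower threshold. Values are illustrative only.
UPPER, LOWER = 0.85, 0.30

def scaling_decision(running_jobs, idle_jobs, active_nodes, quota):
    utilization = running_jobs / max(active_nodes, 1)   # assumes one job per node
    if utilization > UPPER and idle_jobs > 0 and active_nodes < quota:
        return min(idle_jobs, quota - active_nodes)     # number of nodes to add
    if utilization < LOWER and active_nodes > 1:
        return -1                                       # release one idle node
    return 0                                            # no change

# Example: a busy queue with headroom left in its quota.
print(scaling_decision(running_jobs=18, idle_jobs=7, active_nodes=20, quota=30))
```

The quota argument plays the role of the per-experiment quota service, capping how far any one queue can expand the shared pool.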
Asset Management of Roadway Signs Through Advanced Technology
DOT National Transportation Integrated Search
2003-06-01
This research project aims to ease the process of Roadway Sign asset management. The project utilized handheld computer and global positioning system (GPS) technology to capture sign location data along with a timestamp. This data collection effort w...
DOT National Transportation Integrated Search
1997-01-01
Intelligent transportation systems (ITS) are systems that utilize advanced technologies, including computer, communications and process control technologies, to improve the efficiency and safety of the transportation system. These systems encompass a...
The emergence of spatial cyberinfrastructure.
Wright, Dawn J; Wang, Shaowen
2011-04-05
Cyberinfrastructure integrates advanced computer, information, and communication technologies to empower computation-based and data-driven scientific practice and improve the synthesis and analysis of scientific data in a collaborative and shared fashion. As such, it now represents a paradigm shift in scientific research that has facilitated easy access to computational utilities and streamlined collaboration across distance and disciplines, thereby enabling scientific breakthroughs to be reached more quickly and efficiently. Spatial cyberinfrastructure seeks to resolve longstanding complex problems of handling and analyzing massive and heterogeneous spatial datasets as well as the necessity and benefits of sharing spatial data flexibly and securely. This article provides an overview and potential future directions of spatial cyberinfrastructure. The remaining four articles of the special feature are introduced and situated in the context of providing empirical examples of how spatial cyberinfrastructure is extending and enhancing scientific practice for improved synthesis and analysis of both physical and social science data. The primary focus of the articles is spatial analyses using distributed and high-performance computing, sensor networks, and other advanced information technology capabilities to transform massive spatial datasets into insights and knowledge.
Internal fluid mechanics research on supercomputers for aerospace propulsion systems
NASA Technical Reports Server (NTRS)
Miller, Brent A.; Anderson, Bernhard H.; Szuch, John R.
1988-01-01
The Internal Fluid Mechanics Division of the NASA Lewis Research Center is combining the key elements of computational fluid dynamics, aerothermodynamic experiments, and advanced computational technology to bring internal computational fluid mechanics (ICFM) to a state of practical application for aerospace propulsion systems. The strategies used to achieve this goal are to: (1) pursue an understanding of flow physics, surface heat transfer, and combustion via analysis and fundamental experiments, (2) incorporate improved understanding of these phenomena into verified 3-D CFD codes, and (3) utilize state-of-the-art computational technology to enhance experimental and CFD research. Presented is an overview of the ICFM program in high-speed propulsion, including work in inlets, turbomachinery, and chemical reacting flows. Ongoing efforts to integrate new computer technologies, such as parallel computing and artificial intelligence, into high-speed aeropropulsion research are described.
Controlling Infrastructure Costs: Right-Sizing the Mission Control Facility
NASA Technical Reports Server (NTRS)
Martin, Keith; Sen-Roy, Michael; Heiman, Jennifer
2009-01-01
Johnson Space Center's Mission Control Center is a space vehicle, space program agnostic facility. The current operational design is essentially identical to the original facility architecture that was developed and deployed in the mid-90's. In an effort to streamline the support costs of the mission critical facility, the Mission Operations Division (MOD) of Johnson Space Center (JSC) has sponsored an exploratory project to evaluate and inject current state-of-the-practice Information Technology (IT) tools, processes and technology into legacy operations. The general push in the IT industry has been trending towards a data-centric computer infrastructure for the past several years. Organizations facing challenges with facility operations costs are turning to creative solutions combining hardware consolidation, virtualization and remote access to meet and exceed performance, security, and availability requirements. The Operations Technology Facility (OTF) organization at the Johnson Space Center has been chartered to build and evaluate a parallel Mission Control infrastructure, replacing the existing, thick-client distributed computing model and network architecture with a data center model utilizing virtualization to provide the MCC Infrastructure as a Service. The OTF will design a replacement architecture for the Mission Control Facility, leveraging hardware consolidation through the use of blade servers, increasing utilization rates for compute platforms through virtualization while expanding connectivity options through the deployment of secure remote access. The architecture demonstrates the maturity of the technologies generally available in industry today and the ability to successfully abstract the tightly coupled relationship between thick-client software and legacy hardware into a hardware agnostic "Infrastructure as a Service" capability that can scale to meet future requirements of new space programs and spacecraft. This paper discusses the benefits and difficulties that a migration to cloud-based computing philosophies has uncovered when compared to the legacy Mission Control Center architecture. The team consists of system and software engineers with extensive experience with the MCC infrastructure and software currently used to support the International Space Station (ISS) and Space Shuttle program (SSP).
NASA Astrophysics Data System (ADS)
Binboğa, Elif; Korhan, Orhan
2014-10-01
Educational ergonomics focuses on the interaction between educational performance and educational design. By improving the design or pointing out possible problems, educational ergonomics can be utilized to have positive impacts on student performance and thus on the education process. Laptops and tablet computers are becoming widely used by school children and beginning to be used effectively for educational purposes. As the latest generation of laptops and tablet computers are mobile and lightweight compared to conventional personal computers, they support student-centred, interaction-based learning. However, these technologies have been introduced into schools with minimal adaptations to furniture or attention to ergonomics. There are increasing reports of an association between increased musculoskeletal (MSK) problems in children and use of such technologies. Although children are among the users of laptops and tablet computers both in their everyday lives and at schools, the literature investigating MSK activities and possible MSK discomfort among children using portable technologies is limited. This study reviews the literature to identify published studies that investigated posture, MSK activities, and possible MSK discomfort among children using mobile technologies (laptops or tablet computers) for educational purposes. An electronic search of the literature published in English between January 1994 and January 2014 was performed in several databases. The literature search terms were identified and combined to search the databases. The search results show that resources investigating MSK outcomes of laptop or tablet use by children are very scarce. This review points out the research gaps in this field and identifies areas for future studies.
A Computing Infrastructure for Supporting Climate Studies
NASA Astrophysics Data System (ADS)
Yang, C.; Bambacus, M.; Freeman, S. M.; Huang, Q.; Li, J.; Sun, M.; Xu, C.; Wojcik, G. S.; Cahalan, R. F.; NASA Climate @ Home Project Team
2011-12-01
Climate change is one of the major challenges facing us on Earth in the 21st century. Scientists build many models to simulate the past and predict climate change for the next decades or century. Most of the models run at low resolution, with some targeting high resolution in linkage to practical climate change preparedness. To calibrate and validate the models, millions of model runs are needed to find the best simulation and configuration. This paper introduces the NASA effort on the Climate@Home project to build a supercomputer based on advanced computing technologies, such as cloud computing, grid computing, and others. The Climate@Home computing infrastructure includes several aspects: 1) a cloud computing platform is utilized to manage potential spikes in access to the centralized components, such as the grid computing server for dispatching and collecting model run results; 2) a grid computing engine is developed based on MapReduce to dispatch models and model configurations, and to collect simulation results and contribution statistics; 3) a portal serves as the entry point for the project to provide management, sharing, and data exploration for end users; 4) scientists can access customized tools to configure model runs and visualize model results; 5) the public can access Twitter and Facebook to get the latest news about the project. This paper will introduce the latest progress of the project and demonstrate the operational system during the AGU fall meeting. It will also discuss how this technology can become a trailblazer for other climate studies and relevant sciences. It will share how the challenges in computation and software integration were solved.
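The dispatch-and-collect pattern of the grid computing engine can be sketched generically in a MapReduce style. The model configuration fields and the placeholder "model" below are assumptions for illustration, not the project's actual code:

```python
from concurrent.futures import ProcessPoolExecutor  # stand-in for volunteer/grid nodes

def run_model(config):
    """Map step: run one climate-model configuration and return a summary metric.
    The 'model' here is a placeholder formula for illustration only."""
    sensitivity, resolution = config["sensitivity"], config["resolution"]
    skill = 1.0 / (1.0 + abs(sensitivity - 3.0)) * resolution
    return {"config": config, "skill": skill}

def collect(results):
    """Reduce step: pick the best-scoring configuration and basic run statistics."""
    best = max(results, key=lambda r: r["skill"])
    return {"best": best, "runs": len(results)}

configs = [{"sensitivity": s, "resolution": r}
           for s in (2.0, 2.5, 3.0, 3.5) for r in (0.5, 1.0)]

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:            # in the project, distributed workers
        results = list(pool.map(run_model, configs))
    print(collect(results))
```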
Computer Competency of Nursing Students at a University in Thailand
ERIC Educational Resources Information Center
Niyomkar, Srimana
2012-01-01
During the past years, computer and information technology has been rapidly integrated into the education and healthcare fields. In the 21st century, computers are more powerful than ever, and are used in all aspects of nursing, including education, practice, policy, and research. Consequently, student nurses will need to utilize computer…
ERIC Educational Resources Information Center
Federal Coordinating Council for Science, Engineering and Technology, Washington, DC.
This report presents a review of the High Performance Computing and Communications (HPCC) Program, which has as its goal the acceleration of the commercial availability and utilization of the next generation of high performance computers and networks in order to: (1) extend U.S. technological leadership in high performance computing and computer…
Innovative Technology in Engineering Education.
ERIC Educational Resources Information Center
Fishwick, Wilfred
1991-01-01
Discusses the impact that computer-assisted technologies, including applications to software, video recordings, and satellite broadcasts, have had upon the conventions and procedures within engineering education. Calls for the complete utilization of such devices through their appropriate integration into updated education activities effectively…
Cone beam computed tomography in dentistry: what dental educators and learners should know.
Adibi, Shawn; Zhang, Wenjian; Servos, Tom; O'Neill, Paula N
2012-11-01
Recent advances in cone beam computed tomography (CBCT) in dentistry have identified the importance of providing outcomes related to the appropriate use of this innovative technology to practitioners, educators, and investigators. To assist in determining whether and what types of evidence exist, the authors conducted PubMed, Google, and Cochrane Library searches in the spring of 2011 using the key words "cone beam computed tomography and dentistry." This search resulted in over 26,900 entries in more than 700 articles including forty-one reviews recently published in national and international journals. This article is based on existing publications and studies and will provide readers with an overview of the advantages, disadvantages, and indications/contraindications of this emerging technology as well as some thoughts on the current educational status of CBCT in U.S. dental schools. It is the responsibility of dental educators to incorporate the most updated information on this technology into their curricula in a timely manner, so that the next generation of oral health providers and educators will be competent in utilizing this technology for the best interest of patients. To do so, there is a need to conduct studies meeting methodological standards to demonstrate the diagnostic efficacy of CBCT in the dental field.
Knowledge-based environment for optical system design
NASA Astrophysics Data System (ADS)
Johnson, R. Barry
1991-01-01
Optical systems are extensively utilized by industry, government, and military organizations. The conceptual design, engineering design, fabrication, and testing of these systems presently require significant time, typically on the order of 3-5 years. The Knowledge-Based Environment for Optical System Design (KB-OSD) Program has as its principal objectives the development of a methodology and tool(s) that will make a notable reduction in the development time of optical system projects, reduce technical risk, and lower overall cost. KB-OSD can be considered as a computer-based optical design associate for system engineers and design engineers. By utilizing artificial intelligence technology coupled with extensive design/evaluation computer application programs and knowledge bases, the KB-OSD will provide the user with assistance and guidance to accomplish such activities as (i) develop system level and hardware level requirements from mission requirements, (ii) formulate conceptual designs, (iii) construct a statement of work for an RFP, (iv) develop engineering level designs, (v) evaluate an existing design, and (vi) explore the sensitivity of a system to changing scenarios. The KB-OSD comprises a variety of computer platforms, including a Stardent Titan supercomputer, numerous design programs (lens design, coating design, thermal, materials, structural, atmospherics, etc.), data bases, and heuristic knowledge bases. An important element of the KB-OSD Program is the inclusion of the knowledge of individual experts in various areas of optics and optical system engineering. This knowledge is obtained by KB-OSD knowledge engineers performing
A Quantitative Exploration of Preservice Teachers' Intent to Use Computer-based Technology
ERIC Educational Resources Information Center
Kim, Kioh; Jain, Sachin; Westhoff, Guy; Rezabek, Landra
2008-01-01
Based on Bandura's (1977) social learning theory, the purpose of this study is to identify the relationship of preservice teachers' perceptions of faculty modeling of computer-based technology and preservice teachers' intent of using computer-based technology in educational settings. There were 92 participants in this study; they were enrolled in…
Malleable architecture generator for FPGA computing
NASA Astrophysics Data System (ADS)
Gokhale, Maya; Kaba, James; Marks, Aaron; Kim, Jang
1996-10-01
The malleable architecture generator (MARGE) is a tool set that translates high-level parallel C to configuration bit streams for field-programmable logic based computing systems. MARGE creates an application-specific instruction set and generates the custom hardware components required to perform exactly those computations specified by the C program. In contrast to traditional fixed-instruction processors, MARGE's dynamic instruction set creation provides for efficient use of hardware resources. MARGE processes intermediate code in which each operation is annotated by the bit lengths of the operands. Each basic block (sequence of straight line code) is mapped into a single custom instruction which contains all the operations and logic inherent in the block. A synthesis phase maps the operations comprising the instructions into register transfer level structural components and control logic which have been optimized to exploit functional parallelism and function unit reuse. As a final stage, commercial technology-specific tools are used to generate configuration bit streams for the desired target hardware. Technology- specific pre-placed, pre-routed macro blocks are utilized to implement as much of the hardware as possible. MARGE currently supports the Xilinx-based Splash-2 reconfigurable accelerator and National Semiconductor's CLAy-based parallel accelerator, MAPA. The MARGE approach has been demonstrated on systolic applications such as DNA sequence comparison.
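The core idea of folding a bit-width-annotated basic block into a single custom instruction can be sketched as follows; the operation set and the width rules are simplified assumptions for illustration, not MARGE's actual intermediate representation:

```python
# Each intermediate operation carries the bit lengths of its operands; a basic
# block of such operations becomes one custom instruction, and the widest
# operand sets the datapath width. Simplified illustration only.
basic_block = [
    ("add", 12, 12),    # (operation, operand-1 bits, operand-2 bits)
    ("mul", 12, 8),
    ("cmp", 20, 1),
]

def custom_instruction(block):
    result_widths = []
    for op, a, b in block:
        if op == "add":
            result_widths.append(max(a, b) + 1)   # allow for a carry bit
        elif op == "mul":
            result_widths.append(a + b)           # full product width
        else:
            result_widths.append(1)               # boolean comparison result
    return {"ops": [op for op, *_ in block],
            "datapath_bits": max(max(a, b) for _, a, b in block),
            "result_bits": result_widths}

print(custom_instruction(basic_block))
```

Sizing each functional unit to exactly the annotated widths, rather than to a fixed machine word, is what lets such a flow use the FPGA fabric more efficiently than a fixed-instruction processor.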
Transfer of space technology to industry
NASA Technical Reports Server (NTRS)
Hamilton, J. T.
1974-01-01
Some of the most significant applications of the NASA aerospace technology transfer to industry and other government agencies are briefly outlined. The technology utilization program encompasses computer programs for structural problems, life support systems, fuel cell development, and rechargeable cardiac pacemakers as well as reliability and quality research for oil recovery operations and pollution control.
NASA Astrophysics Data System (ADS)
Bian, Jun; Fu, Huijian; Shang, Qian; Zhou, Xiangyang; Ma, Qingguo
This paper analyzes the outstanding problems in current industrial production by reviewing the three stages of industrial engineering development. Based on investigations and interviews in enterprises, we propose the new idea of applying "computer video analysis technology" in new industrial engineering management software, and of adding a "loose coefficient" for each working station to this software in order to arrange production scientifically and humanely. Meanwhile, we suggest utilizing biofeedback technology to promote further research on "the rules of workers' physiological, psychological and emotional changes in production". This combination will push forward industrial engineering theories and benefit enterprises in progressing towards flexible social production; thus it will be of great theoretical innovation value, social significance, and application value.
Utilizing data grid architecture for the backup and recovery of clinical image data.
Liu, Brent J; Zhou, M Z; Documet, J
2005-01-01
Grid Computing represents the latest and most exciting technology to evolve from the familiar realm of parallel, peer-to-peer and client-server models. However, there has been limited investigation into the impact of this emerging technology in medical imaging and informatics. In particular, PACS technology, an established clinical image repository system, while having matured significantly during the past ten years, still remains weak in the area of clinical image data backup. Current solutions are expensive or time consuming and the technology is far from foolproof. Many large-scale PACS archive systems still encounter downtime for hours or days, which has the critical effect of crippling daily clinical operations. In this paper, a review of current backup solutions will be presented along with a brief introduction to grid technology. Finally, research and development utilizing the grid architecture for the recovery of clinical image data, in particular, PACS image data, will be presented. The focus of this paper is centered on applying a grid computing architecture to a DICOM environment since DICOM has become the standard for clinical image data and PACS utilizes this standard. A federation of PACS can be created allowing a failed PACS archive to recover its image data from others in the federation in a seamless fashion. The design reflects the five-layer architecture of grid computing: Fabric, Resource, Connectivity, Collective, and Application Layers. The testbed Data Grid is composed of one research laboratory and two clinical sites. The Globus 3.0 Toolkit (Co-developed by the Argonne National Laboratory and Information Sciences Institute, USC) for developing the core and user level middleware is utilized to achieve grid connectivity. The successful implementation and evaluation of utilizing data grid architecture for clinical PACS data backup and recovery will provide an understanding of the methodology for using Data Grid in clinical image data backup for PACS, as well as establishment of benchmarks for performance from future grid technology improvements. In addition, the testbed can serve as a road map for expanded research into large enterprise and federation level data grids to guarantee CA (Continuous Availability, 99.999% up time) in a variety of medical data archiving, retrieval, and distribution scenarios.
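The federation-based recovery described above reduces to "try the local archive, then ask the peers". The sketch below illustrates that control flow only; the peer names and the retrieval call are hypothetical placeholders, not a real DICOM or Globus interface:

```python
# Federation recovery sketch: when the local PACS archive cannot serve a study,
# ask the other archives in the federation in turn until one returns it.
# All names and the transport layer are placeholders for illustration.
FEDERATION_PEERS = ["pacs-site-a.example.org", "pacs-site-b.example.org"]

def fetch_from_peer(peer: str, study_uid: str):
    """Placeholder for a peer-archive retrieval (e.g., a DICOM query/retrieve
    carried over the grid's Connectivity and Resource layers)."""
    return None  # a real implementation would return the study's image data

def recover_study(local_lookup, study_uid: str):
    data = local_lookup(study_uid)
    if data is not None:
        return data                        # local archive is healthy
    for peer in FEDERATION_PEERS:          # fall back to the federation
        data = fetch_from_peer(peer, study_uid)
        if data is not None:
            return data
    raise LookupError(f"study {study_uid} not found in federation")

# Example: simulate a failed local archive that has lost every study.
print_missing = lambda uid: None
try:
    recover_study(print_missing, "1.2.840.0.0.0")   # made-up study UID
except LookupError as err:
    print(err)
```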
NASA Technical Reports Server (NTRS)
Beecken, Brian P.; Kleinman, Randall R.
2004-01-01
New developments in infrared sensor technology have potentially made possible a new space-based system which can measure far-infrared radiation at lower costs (mass, power and expense). The Stationary Imaging Fourier Transform Spectrometer (SIFTS) proposed by NASA Langley Research Center, makes use of new detector array technology. A mathematical model which simulates resolution and spectral range relationships has been developed for analyzing the utility of such a radically new approach to spectroscopy. Calculations with this forward model emulate the effects of a detector array on the ability to retrieve accurate spectral features. Initial computations indicate significant attenuation at high wavenumbers.
NASA Technical Reports Server (NTRS)
1998-01-01
Recom Technologies, Inc., was established in 1980 by Jack Lee, a former NASA contractor. After forming the new company, Recom was awarded NASA contracts, which eventually grew to 50 percent of the company's business. Two companies have spun off from Recom, both of which have their basis in NASA technology. The first is Attention Control Systems, Inc., which utilizes intelligent planning software that Recom developed for the NASA Ames Computational Sciences Division in a hand-held planner used as an aid in the cognitive rehabilitation of brain injury patients. The second is MiraNet, Inc., which uses CLIPS as the foundation for WEXpert, the first rules-based help system on the Web.
Cranioplasty prosthesis manufacturing based on reverse engineering technology
Chrzan, Robert; Urbanik, Andrzej; Karbowski, Krzysztof; Moskała, Marek; Polak, Jarosław; Pyrich, Marek
2012-01-01
Summary Background: Most patients with large focal skull bone loss after craniectomy are referred for cranioplasty. Reverse engineering is a technology which creates a computer-aided design (CAD) model of a real structure. Rapid prototyping is a technology which produces physical objects from virtual CAD models. The aim of this study was to assess the clinical usefulness of these technologies in cranioplasty prosthesis manufacturing. Material/Methods: CT was performed on 19 patients with focal skull bone loss after craniectomy, using a dedicated protocol. A material model of the skull deficit was produced using computer numerical control (CNC) milling, and an individually, pre-operatively adjusted polypropylene-polyester prosthesis was prepared. In a control group of 20 patients, a prosthesis was manually adjusted to each patient by a neurosurgeon during surgery, without using CT-based reverse engineering/rapid prototyping. In each case, the prosthesis was implanted into the patient. The mean operating times in both groups were compared. Results: In the group of patients with reverse engineering/rapid prototyping-based cranioplasty, the mean operating time was shorter (120.3 min) compared to that in the control group (136.5 min). The neurosurgeons found the new technology particularly useful in more complicated bone deficits with different curvatures in various planes. Conclusions: Reverse engineering and rapid prototyping may reduce the time needed for cranioplasty neurosurgery and improve the prosthesis fitting. Such technologies may utilize data obtained by commonly used spiral CT scanners. The manufacturing of individually adjusted prostheses should be commonly used in patients planned for cranioplasty with synthetic material. PMID:22207125
Hazardous Environment Robotics
NASA Technical Reports Server (NTRS)
1996-01-01
Jet Propulsion Laboratory (JPL) developed video overlay calibration and demonstration techniques for ground-based telerobotics. Through a technology sharing agreement with JPL, Deneb Robotics added this as an option to its robotics software, TELEGRIP. The software is used for remotely operating robots in nuclear and hazardous environments in industries including automotive and medical. The option allows the operator to utilize video to calibrate 3-D computer models with the actual environment, and thus plan and optimize robot trajectories before the program is automatically generated.
JAX Colony Management System (JCMS): an extensible colony and phenotype data management system.
Donnelly, Chuck J; McFarland, Mike; Ames, Abigail; Sundberg, Beth; Springer, Dave; Blauth, Peter; Bult, Carol J
2010-04-01
The Jackson Laboratory Colony Management System (JCMS) is a software application for managing data and information related to research mouse colonies, associated biospecimens, and experimental protocols. JCMS runs directly on computers that run one of the PC Windows operating systems, but can be accessed via web browser interfaces from any computer running a Windows, Macintosh, or Linux operating system. JCMS can be configured for a single user or multiple users in small- to medium-size work groups. The target audience for JCMS includes laboratory technicians, animal colony managers, and principal investigators. The application provides operational support for colony management and experimental workflows, sample and data tracking through transaction-based data entry forms, and date-driven work reports. Flexible query forms allow researchers to retrieve database records based on user-defined criteria. Recent advances in handheld computers with integrated barcode readers, middleware technologies, web browsers, and wireless networks add to the utility of JCMS by allowing real-time access to the database from any networked computer.
Use of agents to implement an integrated computing environment
NASA Technical Reports Server (NTRS)
Hale, Mark A.; Craig, James I.
1995-01-01
Integrated Product and Process Development (IPPD) embodies the simultaneous application of both system and quality engineering methods throughout an iterative design process. The use of IPPD results in the time-conscious, cost-saving development of engineering systems. To implement IPPD, a Decision-Based Design perspective is encapsulated in an approach that focuses on the role of the human designer in product development. The approach has two parts and is outlined in this paper. First, an architecture, called DREAMS, is being developed that facilitates design from a decision-based perspective. Second, a supporting computing infrastructure, called IMAGE, is being designed. Agents are used to implement the overall infrastructure on the computer. Successful agent utilization requires that each agent be made of three components: the resource, the model, and the wrap. Current work is focused on the development of generalized agent schemes and associated demonstration projects. When in place, the technology-independent computing infrastructure will aid the designer in systematically generating knowledge used to facilitate decision-making.
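One possible way to picture the resource/model/wrap decomposition described above is the following minimal Python sketch; the class and the toy analysis function are hypothetical illustrations, not part of DREAMS or IMAGE:

```python
# Illustrative sketch only: one way to express the three-part agent
# (resource, model, wrap). All class and method names here are hypothetical.
from dataclasses import dataclass
from typing import Any, Callable


@dataclass
class Agent:
    resource: Any                      # the tool or data source being wrapped
    model: Callable[[Any, Any], Any]   # how the agent uses the resource
    wrap: Callable[[Any], Any]         # translation into the shared infrastructure

    def request(self, query: Any) -> Any:
        """Run the query through the model, then wrap the result for exchange."""
        raw = self.model(self.resource, query)
        return self.wrap(raw)


# Example: an "analysis" agent wrapping a plain Python function
agent = Agent(resource=lambda x: x ** 2,             # resource: a toy analysis code
              model=lambda res, q: res(q),           # model: delegate to the resource
              wrap=lambda r: {"result": r})          # wrap: common message format
print(agent.request(3))                              # {'result': 9}
```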
A Prototype SSVEP Based Real Time BCI Gaming System.
Martišius, Ignas; Damaševičius, Robertas
2016-01-01
Although brain-computer interface technology is mainly designed with disabled people in mind, it can also be beneficial to healthy subjects, for example, in gaming or virtual reality systems. In this paper we discuss the typical architecture, paradigms, requirements, and limitations of electroencephalogram-based gaming systems. We have developed a prototype three-class brain-computer interface system, based on the steady state visually evoked potentials paradigm and the Emotiv EPOC headset. An online target shooting game, implemented in the OpenViBE environment, has been used for user feedback. The system utilizes wave atom transform for feature extraction, achieving an average accuracy of 78.2% using linear discriminant analysis classifier, 79.3% using support vector machine classifier with a linear kernel, and 80.5% using a support vector machine classifier with a radial basis function kernel.
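A minimal sketch of the kind of classifier comparison reported above might look as follows in Python with scikit-learn; the wave atom transform is not reproduced here, so a random feature matrix stands in for the extracted EEG features, and the accuracies it prints are meaningless placeholders:

```python
# Illustrative sketch, not the authors' pipeline: compare the three classifiers
# mentioned in the abstract (LDA, linear SVM, RBF SVM) on pre-extracted features.
# A random matrix stands in for the wave-atom-transform features.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 64))          # placeholder EEG feature vectors
y = rng.integers(0, 3, size=300)        # three SSVEP target classes

for name, clf in [("LDA", LinearDiscriminantAnalysis()),
                  ("SVM (linear)", SVC(kernel="linear")),
                  ("SVM (RBF)", SVC(kernel="rbf"))]:
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{name}: mean CV accuracy = {acc:.3f}")
```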
Dong, Hengjin; Buxton, Martin
2006-01-01
The objective of this study is to apply a Markov model to compare the cost-effectiveness of total knee replacement (TKR) using computer-assisted surgery (CAS) with that of TKR using a conventional manual method in the absence of formal clinical trial evidence. A structured search was carried out to identify evidence relating to the clinical outcome, cost, and effectiveness of TKR. Nine Markov states were identified based on the progress of the disease after TKR. Effectiveness was expressed by quality-adjusted life years (QALYs). The simulation was carried out initially for 120 cycles of a month each, starting with 1,000 TKRs. A discount rate of 3.5 percent was used for both cost and effectiveness in the incremental cost-effectiveness analysis. Then, a probabilistic sensitivity analysis was carried out using a Monte Carlo approach with 10,000 iterations. Computer-assisted TKR was a long-term cost-effective technology, but the QALYs gained were small. After the first 2 years, computer-assisted TKR was dominant because it was cheaper and produced more QALYs. The incremental cost-effectiveness ratio (ICER) was sensitive to the "effect of CAS," to the CAS extra cost, and to the utility of the state "Normal health after primary TKR," but it was not sensitive to the utilities of other Markov states. Both probabilistic and deterministic analyses produced similar cumulative serious or minor complication rates and complex or simple revision rates. They also produced similar ICERs. Compared with conventional TKR, computer-assisted TKR is a cost-saving technology in the long term and may offer small additional QALYs. The "effect of CAS" is to reduce revision rates and complications through more accurate and precise alignment, and although the conclusions from the model, even when allowing for a full probabilistic analysis of uncertainty, are clear, the "effect of CAS" on the rate of revisions awaits long-term clinical evidence.
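For readers unfamiliar with Markov cost-utility models, the following Python sketch shows the general mechanics (monthly cycles, discounting, cohort propagation) on a simplified three-state model; the study itself used nine states, and every transition probability, cost, and utility below is an invented placeholder:

```python
# Minimal sketch of a discounted Markov cohort cost-utility model with monthly
# cycles. Only three states are used (the study used nine), and all transition
# probabilities, costs and utilities below are invented placeholders.
import numpy as np

states = ["well_after_TKR", "revision", "dead"]
P = np.array([[0.995, 0.004, 0.001],      # monthly transition probabilities
              [0.900, 0.099, 0.001],
              [0.000, 0.000, 1.000]])
cost_per_cycle = np.array([10.0, 800.0, 0.0])            # placeholder monthly costs
utility_per_cycle = np.array([0.80, 0.55, 0.0]) / 12.0   # QALYs accrued per month

annual_discount = 0.035
monthly_discount = (1 + annual_discount) ** (1 / 12) - 1

cohort = np.array([1000.0, 0.0, 0.0])     # 1,000 patients start post-TKR
total_cost = total_qaly = 0.0
for cycle in range(120):                  # 120 monthly cycles, as in the study
    disc = 1.0 / (1 + monthly_discount) ** cycle
    total_cost += disc * cohort @ cost_per_cycle
    total_qaly += disc * cohort @ utility_per_cycle
    cohort = cohort @ P                   # advance the cohort one month

print(f"discounted cost:  {total_cost:,.0f}")
print(f"discounted QALYs: {total_qaly:,.1f}")
# Running the same model for conventional and computer-assisted TKR and dividing
# the cost difference by the QALY difference would give the ICER.
```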
NASA Astrophysics Data System (ADS)
Cleveland, April Jones
The integration of technology into the K-12 classroom has become a key focus in the last several years. However, teachers are often left out of this integration process, and training in the use of these technologies in a classroom setting is consequently often minimal. Teachers are left on their own as they struggle to integrate technology into their curriculum. Web-based professional development has the potential to alleviate both the time and place constraints teachers often confront when trying to attend traditional professional programs to upgrade their technology skills. This study focuses on 70 upper elementary, middle, and high school teachers who volunteered to participate in a study in which a web-based tutorial was used as a tool for professional development and data collection. A comparison of settings allowed these teachers to participate in one of three ways: (1) in a workshop-type setting with an instructional leader; (2) in a workshop-type setting with a facilitator; and (3) on the web without an instructional leader or informal peer interaction. All the groups used the same web-based tutorial on water quality monitoring for instructional purposes. Research data included pretest and post-test measurements from all three groups as well as their analysis of a known water sample. The Microcomputer Utilization in Teaching Efficacy Beliefs Instrument (MUTEBI) was administered to all the participants as a measurement of self-efficacy beliefs as they relate to the use of computers in science teaching. In addition to the quantitative data collected, qualitative data were also compiled. The results of the study indicate that all the participants were equal in terms of knowledge acquisition, but may have derived "unanticipated benefits" from interaction with their peers in the workshop-type settings. The results also indicate that as teachers' self-rating of computer expertise increased, their MUTEBI scores increased as well.
Central Limit Theorem: New SOCR Applet and Demonstration Activity
Dinov, Ivo D.; Christou, Nicolas; Sanchez, Juana
2011-01-01
Modern approaches for information technology based blended education utilize a variety of novel instructional, computational and network resources. Such attempts employ technology to deliver integrated, dynamically linked, interactive content and multifaceted learning environments, which may facilitate student comprehension and information retention. In this manuscript, we describe one such innovative effort of using technological tools for improving student motivation and learning of the theory, practice and usability of the Central Limit Theorem (CLT) in probability and statistics courses. Our approach is based on harnessing the computational libraries developed by the Statistics Online Computational Resource (SOCR) to design a new interactive Java applet and a corresponding demonstration activity that illustrate the meaning and the power of the CLT. The CLT applet and activity have clear common goals: to provide graphical representation of the CLT, to improve student intuition, and to empirically validate and establish the limits of the CLT. The SOCR CLT activity consists of four experiments that demonstrate the assumptions, meaning and implications of the CLT and tie these to specific hands-on simulations. We include a number of examples illustrating the theory and applications of the CLT. Both the SOCR CLT applet and activity are freely available online to the community to test, validate and extend (Applet: http://www.socr.ucla.edu/htmls/SOCR_Experiments.html and Activity: http://wiki.stat.ucla.edu/socr/index.php/SOCR_EduMaterials_Activities_GeneralCentralLimitTheorem). PMID:21833159
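The SOCR applet itself is a Java tool, but the kind of demonstration the activity provides can be sketched independently in a few lines of Python: sample means of a skewed parent distribution concentrate around the population mean and approach normality as the sample size grows.

```python
# Minimal sketch of the kind of demonstration the SOCR CLT activity provides
# (the applet itself is Java; this is an independent illustration in Python).
import numpy as np

rng = np.random.default_rng(42)
population = rng.exponential(scale=2.0, size=100_000)   # skewed parent distribution

for n in (1, 5, 30, 100):
    # empirical distribution of the sample mean for samples of size n
    means = rng.choice(population, size=(10_000, n)).mean(axis=1)
    print(f"n={n:3d}: mean of sample means = {means.mean():.3f}, "
          f"sd = {means.std(ddof=1):.3f} (theory ~ {population.std()/np.sqrt(n):.3f})")
```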
Robonaut's Flexible Information Technology Infrastructure
NASA Technical Reports Server (NTRS)
Askew, Scott; Bluethmann, William; Alder, Ken; Ambrose, Robert
2003-01-01
Robonaut, NASA's humanoid robot, is designed to work as both an astronaut assistant and, in certain situations, an astronaut surrogate. This highly dexterous robot performs complex tasks under telepresence control that could previously only be carried out directly by humans. Currently with 47 degrees of freedom (DOF), Robonaut is a state-of-the-art human-size telemanipulator system. While many of Robonaut's embedded components have been custom designed to meet packaging or environmental requirements, the primary computing systems used in Robonaut are currently commercial-off-the-shelf (COTS) products which have some correlation to flight-qualified computer systems. This loose coupling of information technology (IT) resources allows Robonaut to exploit cost-effective solutions while floating the technology base to take advantage of the rapid pace of IT advances. These IT systems utilize a software development environment which is compatible with both COTS hardware and flight-proven computing systems, preserving the majority of software development for a flight system. The ability to use highly integrated and flexible COTS software development tools improves productivity while minimizing redesign for a space flight system. Further, the flexibility of Robonaut's software and communication architecture has allowed it to become a widely used distributed development testbed for integrating new capabilities and furthering experimental research.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shorgin, Sergey Ya.; Pechinkin, Alexander V.; Samouylov, Konstantin E.
Cloud computing is a promising technology to manage and improve utilization of computing center resources to deliver various computing and IT services. For the purpose of energy saving, there is no need to operate many servers under light loads, and they are switched off. On the other hand, some servers should be switched on in heavy-load cases to prevent very long delays. Thus, waiting times and system operating cost can be maintained at an acceptable level by dynamically adding or removing servers. Another factor that should be taken into account is the significant server setup cost and activation time. For better energy efficiency, a cloud computing system should not react to an instantaneous increase or decrease of load. That is the main motivation for using queuing systems with hysteresis for cloud computing system modelling. In the paper, we provide a model of a cloud computing system in terms of a multiple-server, threshold-based, infinite-capacity queuing system with hysteresis and non-instantaneous server activation. For the proposed model, we develop a method for computing steady-state probabilities that allows a number of performance measures to be estimated.
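The paper works analytically with steady-state probabilities; purely as an illustration of the hysteresis idea (thresholds, setup delay, switching servers on and off), a crude discrete-time simulation might look like the following Python sketch, with all rates and thresholds chosen arbitrarily:

```python
# Illustrative discrete-time simulation only (the paper derives steady-state
# probabilities analytically): a multi-server queue whose servers are switched
# on above a high threshold and off below a low one (hysteresis), with a
# non-instantaneous setup delay. All rates and thresholds are arbitrary.
import numpy as np

rng = np.random.default_rng(7)
ARRIVAL_RATE, SERVICE_RATE = 8.0, 1.0          # jobs per unit time, per server
MIN_SERVERS, MAX_SERVERS = 4, 16
H_ON, H_OFF, SETUP_TIME = 20, 5, 2.0           # hysteresis thresholds, setup delay
DT, STEPS = 0.01, 200_000                      # time step and number of steps

queue, active, setup_done_at = 0, MIN_SERVERS, None
queue_area = 0.0
for step in range(STEPS):
    t = step * DT
    queue += rng.poisson(ARRIVAL_RATE * DT)                    # arrivals
    busy = min(queue, active)
    queue -= rng.binomial(busy, SERVICE_RATE * DT)             # completed jobs
    # hysteresis control with a setup delay before a new server goes live
    if setup_done_at is not None and t >= setup_done_at:
        active, setup_done_at = active + 1, None
    if queue >= H_ON and active < MAX_SERVERS and setup_done_at is None:
        setup_done_at = t + SETUP_TIME                         # start warming a server
    if queue <= H_OFF and active > MIN_SERVERS:
        active -= 1                                            # switch one server off
    queue_area += queue * DT

print("mean queue length ~", queue_area / (STEPS * DT))
print("active servers at end:", active)
```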
Information Technology and Literacy Assessment.
ERIC Educational Resources Information Center
Balajthy, Ernest
2002-01-01
Compares technology predictions from around 1989 with the technology of 2002. Discusses the place of computer-based assessment today, computer-scored testing, computer-administered formal assessment, Internet-based formal assessment, computerized adaptive tests, placement tests, informal assessment, electronic portfolios, information management,…
Neylon, J; Min, Y; Kupelian, P; Low, D A; Santhanam, A
2017-04-01
In this paper, a multi-GPU cloud-based server (MGCS) framework is presented for dose calculations, exploring the feasibility of remote computing power for parallelization and acceleration of computationally and time intensive radiotherapy tasks in moving toward online adaptive therapies. An analytical model was developed to estimate theoretical MGCS performance acceleration and intelligently determine workload distribution. Numerical studies were performed with a computing setup of 14 GPUs distributed over 4 servers interconnected by a 1 Gigabit per second (Gbps) network. Inter-process communication methods were optimized to facilitate resource distribution and minimize data transfers over the server interconnect. The analytically predicted computation times matched experimental observations within 1-5%. MGCS performance approached a theoretical limit of acceleration proportional to the number of GPUs utilized when computational tasks far outweighed memory operations. The MGCS implementation reproduced ground-truth dose computations with negligible differences by distributing the work among several processes and implementing optimization strategies. The results showed that a cloud-based computation engine was a feasible solution for enabling clinics to make use of fast dose calculations for advanced treatment planning and adaptive radiotherapy. The cloud-based system was able to exceed the performance of a local machine even for optimized calculations, and provided significant acceleration for computationally intensive tasks. Such a framework can provide access to advanced technology and computational methods to many clinics, providing an avenue for standardization across institutions without the requirements of purchasing, maintaining, and continually updating hardware.
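The authors' analytical model is not reproduced here, but an Amdahl-style sketch conveys the reported behaviour: speedup approaches the GPU count only when the non-parallelizable share (memory operations, transfers) is small. The fractions below are placeholders.

```python
# Minimal sketch (not the authors' analytical model): estimate acceleration for
# N GPUs when part of the task (memory operations, host transfers, interconnect
# traffic) does not parallelize. The timing fractions below are placeholders.
def estimated_speedup(n_gpus: int,
                      compute_fraction: float = 0.95,
                      serial_fraction: float = 0.05) -> float:
    """Amdahl-style bound: compute scales with n_gpus, the rest does not."""
    return 1.0 / (serial_fraction + compute_fraction / n_gpus)

for n in (1, 2, 4, 8, 14):
    print(f"{n:2d} GPUs -> ~{estimated_speedup(n):.1f}x")
# As compute_fraction -> 1 (compute far outweighs memory/transfer work),
# the speedup approaches n_gpus, matching the behaviour reported above.
```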
An exploration of neuromorphic systems and related design issues/challenges in dark silicon era
NASA Astrophysics Data System (ADS)
Chandaliya, Mudit; Chaturvedi, Nitin; Gurunarayanan, S.
2018-03-01
Current microprocessors have shown remarkable improvements in performance and memory capacity since their introduction. However, due to power and thermal limitations, only a fraction of the cores can operate at full frequency at any instant of time, irrespective of the advantages of each new technology generation. This phenomenon of microprocessor under-utilization is called dark silicon and is a serious obstacle to innovative computing. To overcome the limitation of the utilization wall, IBM explored and developed neurosynaptic system chips. This has opened a wide scope of research in the fields of innovative computing, technology, materials science, machine learning, etc. In this paper, we first review the diverse stages of research that have been influential in the innovation of neurosynaptic architectures. These architectures focus on the development of a brain-like framework that is efficient enough to execute a broad set of computations in real time while keeping ultra-low power consumption as well as area considerations in mind. We also discuss the challenges and opportunities of designing neuromorphic systems with the existing technologies of the dark silicon era, which constitute a key area of future research.
ERIC Educational Resources Information Center
Pezzoli, Jean A.
In June 1992, Maui Community College (MCC), in Hawaii, conducted a survey of the communities of Maui, Molokai, Lanai, and Hana to determine perceived needs for an associate degree and certificate program in electronics and computer engineering. Questionnaires were mailed to 500 firms utilizing electronic or computer services, seeking information…
ERIC Educational Resources Information Center
Tas, Yasemin; Balgalmis, Esra
2016-01-01
The goal of this study was to describe Turkish mathematics and science teachers' use of computer in their classroom instruction by utilizing TIMSS 2011 data. Analyses results revealed that teachers most frequently used computers for preparation purpose and least frequently used computers for administration. There was no difference in teachers'…
An Interview with Matthew P. Greving, PhD. Interview by Vicki Glaser.
Greving, Matthew P
2011-10-01
Matthew P. Greving is Chief Scientific Officer at Nextval Inc., a company founded in early 2010 that has developed a discovery platform called MassInsight™. He received his PhD in Biochemistry from Arizona State University, and prior to that he spent nearly 7 years working as a software engineer. This experience in solving complex computational problems fueled his interest in developing technologies and algorithms related to the acquisition and analysis of high-dimensional biochemical data. To address the existing problems associated with label-based microarray readouts, he began work on a technique for label-free mass spectrometry (MS) microarray readout compatible with both matrix-assisted laser desorption/ionization (MALDI) and matrix-free nanostructure initiator mass spectrometry (NIMS). This is the core of Nextval’s MassInsight technology, which utilizes picoliter noncontact deposition of high-density arrays on mass-readout substrates along with computational algorithms for high-dimensional data processing and reduction.
NASA Astrophysics Data System (ADS)
Narayanan, M.
2004-12-01
Catherine Palomba and Trudy Banta offer the following definition of assessment, adapted from one provided by Marchese in 1987: Assessment is the systematic collection, review, and use of information about educational programs undertaken for the purpose of improving student learning and development (Palomba and Banta 1999). It is widely recognized that sophisticated computing technologies are becoming a key element in today's classroom instructional techniques. Regardless, the professor must be held responsible for creating an instructional environment in which the technology actually supplements the learning outcomes of the students. Almost all academic disciplines have found a niche for computer-based instruction in their respective professional domain. In many cases, it is viewed as an essential and integral part of the educational process. Educational institutions are committing substantial resources to the establishment of dedicated technology-based laboratories, so that they will be able to accommodate and fulfill students' desire to master certain of these specific skills. This type of technology-based instruction may raise some fundamental questions about the core competencies of the student learner. Some of the most important questions are: 1. Is the utilization of these fast, high-powered computers and user-friendly software programs creating a totally non-challenging instructional environment for the student learner? 2. Can technology itself all too easily overshadow the learning outcomes intended? 3. Are the educational institutions simply training students how to use technology rather than educating them in the appropriate field? 4. Are we still teaching content-driven courses and analysis-oriented subject matter? 5. Are these sophisticated modern-era technologies contributing to a decline in the critical thinking capabilities of 21st-century technology-savvy students? The author tries to focus on technology as a tool and not on the technology itself. He further argues that students must demonstrate that they have the ability to think critically before they attempt to use technology in a chosen application-specific environment. The author further argues that training-based instruction has a very narrow focus that puts modern technology at the forefront of the learning enterprise. The author promotes education-oriented strategies to provide the students with a broader perspective of the subject matter. The author is also of the opinion that students entering the workplace should clearly understand the context in which modern technologies are influencing the productive outcomes of the industrialized world. References: Marchese, T. J. (1987). Third Down, Ten Years to Go. AAHE Bulletin, Vol. 40, pages 3-8. Marchese, T. J. (1994). Assessment, Quality and Undergraduate Improvement. Assessment Update, Vol. 6, No. 3, pages 1-14. Montagu, A. S. (2001). High-technology instruction: A framework for teaching computer-based technologies. Journal on Excellence in College Teaching, 12(1), 109-128. Palomba, Catherine A. and Banta, Trudy W. (1999). Assessment Essentials: Planning, Implementing and Improving Assessment in Higher Education. San Francisco: Jossey-Bass Publishers.
The Power of Computer-aided Tomography to Investigate Marine Benthic Communities
Utilization of Computer-aided-Tomography (CT) technology is a powerful tool to investigate benthic communities in aquatic systems. In this presentation, we will attempt to summarize our 15 years of experience in developing specific CT methods and applications to marine benthic co...
Hong, OiSaeng; Eakin, Brenda L; Chin, Dal Lae; Feld, Jamie; Vogel, Stephen
2013-07-01
Noise-induced hearing loss is a significant occupational injury for firefighters exposed to intermittent noise on the job. It is important to educate firefighters about using hearing protection devices whenever they are exposed to loud noise. Computer technology is a relatively new health education approach and can be useful for tailoring specific aspects of behavioral change training. The purpose of this study is to present the development process of an Internet-based tailored intervention program and to assess its efficacy. The intervention programs were implemented for 372 firefighters (mean age = 44 years, Caucasian = 82%, male = 95%) in three states (California, Illinois, and Indiana). The efficacy was assessed from firefighters' feedback through an Internet-based survey. A multimedia Internet-based training program was developed through (a) determining program content and writing scripts, (b) developing decision-making algorithms for tailoring, (c) graphic design and audio and video productions, (d) creating computer software and a database, and (e) postproduction quality control and pilot testing. Participant feedback regarding the training has been very positive. Participants reported that they liked completing the training via computer (83%) and also that the Internet-based training program was well organized (97%), easy to use (97%), and effective (98%) and held their interest (79%). Almost all (95%) would recommend this Internet training program to other firefighters. Interactive multimedia computer technology using the Internet was a feasible mode of delivery for a hearing protection intervention among firefighters. Participants' favorable feedback strongly supports the continued utilization of this approach for designing and developing interventions to promote healthy behaviors.
Medication safety and knowledge-based functions: a stepwise approach against information overload.
Patapovas, Andrius; Dormann, Harald; Sedlmayr, Brita; Kirchner, Melanie; Sonst, Anja; Müller, Fabian; Pfistermeister, Barbara; Plank-Kiegele, Bettina; Vogler, Renate; Maas, Renke; Criegee-Rieck, Manfred; Prokosch, Hans-Ulrich; Bürkle, Thomas
2013-09-01
The aim was to improve medication safety in an emergency department (ED) by enhancing the integration and presentation of safety information for drug therapy. Based on an evaluation of safety of drug therapy issues in the ED and a review of computer-assisted intervention technologies we redesigned an electronic case sheet and implemented computer-assisted interventions into the routine work flow. We devised a four step system of alerts, and facilitated access to different levels of drug information. System use was analyzed over a period of 6 months. In addition, physicians answered a survey based on the technology acceptance model TAM2. The new application was implemented in an informal manner to avoid work flow disruption. Log files demonstrated that step I, 'valid indication' was utilized for 3% of the recorded drugs and step II 'tooltip for well-known drug risks' for 48% of the drugs. In the questionnaire, the computer-assisted interventions were rated better than previous paper based measures (checklists, posters) with regard to usefulness, support of work and information quality. A stepwise assisting intervention received positive user acceptance. Some intervention steps have been seldom used, others quite often. We think that we were able to avoid over-alerting and work flow intrusion in a critical ED environment. © 2013 The Authors. British Journal of Clinical Pharmacology © 2013 The British Pharmacological Society.
NASA Technical Reports Server (NTRS)
1991-01-01
Technology 2000 was the first major industrial conference and exposition spotlighting NASA technology and technology transfer. Its purpose was, and continues to be, to increase awareness of existing NASA-developed technologies that are available for immediate use in the development of new products and processes, and to lay the groundwork for the effective utilization of emerging technologies. Included are sessions on: computer technology and software engineering; human factors engineering and life sciences; materials science; sensors and measurement technology; artificial intelligence; environmental technology; optics and communications; and superconductivity.
ERIC Educational Resources Information Center
Umunnakwe, Ngozi; Sello, Queen
2016-01-01
The study investigates the effective utilization of Information and Communication Technology (ICT) by first year undergraduates of the University of Botswana (UB) in their reading and writing skills. The first year students are not first language (L1) learners of English. They have not utilized computers for learning reading and writing in their…
The lucky image-motion prediction for simple scene observation based soft-sensor technology
NASA Astrophysics Data System (ADS)
Li, Yan; Su, Yun; Hu, Bin
2015-08-01
High resolution is important to earth remote sensors, while vibration of the remote sensor platforms is a major factor restricting high resolution imaging. Image-motion prediction and real-time compensation are key technologies for solving this problem. Because the traditional autocorrelation image algorithm cannot meet the demands of simple-scene image stabilization, this paper proposes to utilize soft-sensor technology for image-motion prediction and focuses on algorithm optimization for imaging image-motion prediction. Simulation results indicate that the improved lucky image-motion stabilization algorithm, combining a back-propagation neural network (BP NN) and a support vector machine (SVM), is the most suitable for simple-scene image stabilization. The relative error of the image-motion prediction based on the soft-sensor technology is below 5%, and the training and computing speed of the mathematical prediction model is fast enough for real-time image stabilization in aerial photography.
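As an illustration of the general idea (not the authors' algorithm), a one-step-ahead motion predictor could combine a small back-propagation network and an SVM regressor as in the following Python sketch; the synthetic motion signal, window length, and simple averaging of the two predictions are all invented for the example:

```python
# Illustrative sketch only (not the authors' algorithm): predict the next
# image-motion sample from a window of past samples with a small BP network
# (MLPRegressor) and an SVM regressor, then average the two predictions.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.svm import SVR

rng = np.random.default_rng(0)
t = np.arange(2000) * 0.01
motion = np.sin(2 * np.pi * 1.5 * t) + 0.05 * rng.normal(size=t.size)  # synthetic jitter

window = 20
X = np.array([motion[i:i + window] for i in range(len(motion) - window)])
y = motion[window:]                              # one-step-ahead target

split = 1500
bp = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
svm = SVR(kernel="rbf")
bp.fit(X[:split], y[:split])
svm.fit(X[:split], y[:split])

pred = 0.5 * (bp.predict(X[split:]) + svm.predict(X[split:]))
nrmse = np.sqrt(np.mean((pred - y[split:]) ** 2)) / np.std(y[split:])
print(f"normalized RMSE of the combined prediction: {nrmse:.3f}")
```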
An electron beam linear scanning mode for industrial limited-angle nano-computed tomography.
Wang, Chengxiang; Zeng, Li; Yu, Wei; Zhang, Lingli; Guo, Yumeng; Gong, Changcheng
2018-01-01
Nano-computed tomography (nano-CT), which utilizes X-rays to investigate the inner structure of small objects and has been widely utilized in biomedical research, electronic technology, geology, material sciences, etc., is a high-spatial-resolution and non-destructive research technique. A traditional nano-CT scanning model requires very high mechanical precision and stability of the object manipulator for high resolution imaging, which is difficult to achieve when the scanned object is continuously rotated. To reduce the scanning time and attain stable, high-resolution imaging in industrial non-destructive testing, we study an electron beam linear scanning mode of a nano-CT system that can avoid the mechanical vibration and object movement caused by continuously rotating the object. Furthermore, to further save scanning time and study how small the scanning range can be made with acceptable spatial resolution, an alternating iterative algorithm based on ℓ0 minimization is applied to the limited-angle nano-CT reconstruction problem with the electron beam linear scanning mode. The experimental results confirm the feasibility of the electron beam linear scanning mode of the nano-CT system.
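The alternating ℓ0-based algorithm itself is not reproduced here, but the flavour of iterative reconstruction from incomplete data can be sketched with ART/Kaczmarz sweeps plus a crude hard-thresholding step standing in for the sparsity constraint; the tiny synthetic system below is purely illustrative:

```python
# Minimal sketch of iterative reconstruction from incomplete projection data
# (ART/Kaczmarz updates plus hard thresholding as a stand-in sparsity prior).
# This is NOT the paper's alternating l0 algorithm; the system matrix and image
# below are tiny synthetic stand-ins.
import numpy as np

rng = np.random.default_rng(3)
n = 64                                   # unknown image (flattened)
x_true = np.zeros(n)
x_true[rng.choice(n, size=6, replace=False)] = rng.uniform(1, 2, size=6)  # sparse image

m = 40                                   # fewer measurements than unknowns (limited angle)
A = rng.normal(size=(m, n))
b = A @ x_true

x = np.zeros(n)
for sweep in range(50):
    for i in range(m):                   # one Kaczmarz/ART pass over the measurements
        ai = A[i]
        x += (b[i] - ai @ x) / (ai @ ai) * ai
    keep = np.argsort(np.abs(x))[-10:]   # keep only the largest entries (sparsity prior)
    mask = np.zeros(n, dtype=bool)
    mask[keep] = True
    x[~mask] = 0.0

print("relative reconstruction error:",
      np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```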
NASA Astrophysics Data System (ADS)
Freeman, S.; Kintsch, A.
2003-12-01
Boulder High School Special Education students work in teams on donated wireless computers to solve problems created by global climate change. Their text is Richard Somerville's The Forgiving Air. They utilize Wheeling Jesuit University's remote sensing web site and private computer bulletin board. Their central source for problem-based learning (PBL) is www.cotf.edu, NASA's Classroom of the Future Global Change web site. As a result, students not only improve their abilities to write, read, do math and research, speak, and work as team members, they also improve self-esteem, resilience, and willingness to take more challenging classes. Two special education students passed AP exams, Calculus and U.S. Government, last spring and Jay Matthews of Newsweek rates Boulder High as 201st of the nation's top 1000 high schools.
Racial/ethnic disparities in the utilization of high-technology hospitals.
Kim, Tae Hyun; Samson, Linda F; Lu, Ning
2010-09-01
Hospitals with high-technology services may have better outcomes. However, access to high-technology hospitals might not be uniform across racial/ethnic groups. This study examined whether racial/ethnic minorities, compared to whites, are less likely to utilize hospitals that offer technology services and infrastructure such as computed tomography, positron emission tomography, magnetic resonance imaging, a diagnostic radiation facility, and a level 1 trauma unit. Data were obtained from the 2003 Healthcare Cost & Utilization Project's Nationwide Inpatient Sample and the 2003 American Hospital Association annual survey data. The sample consisted of 3,381,324 patients admitted to and discharged from 368 hospitals in 18 states in the United States. Logistic regression results suggest that Hispanic patients are less likely than whites to utilize high-technology hospitals when controlling for other factors (odds ratio [OR], 0.47; 95% confidence interval [CI], 0.28-0.79). Our study adds empirical evidence that significant gaps persist in access to care between minorities and whites. In particular, access to high-technology hospitals for Hispanics appears to be a major problem.
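For readers unfamiliar with how such figures arise, the odds ratio and its confidence interval follow directly from the fitted logistic-regression coefficient and its standard error; the coefficient and standard error below are hypothetical values chosen only so that the arithmetic lands near the interval reported above:

```python
# How an odds ratio and 95% CI follow from a logistic-regression coefficient:
# OR = exp(beta), CI = exp(beta +/- 1.96 * SE). The numbers are placeholders
# consistent with the OR and CI quoted above, not the study's actual estimates.
import math

beta, se = -0.755, 0.265            # hypothetical coefficient and standard error
odds_ratio = math.exp(beta)
ci_low, ci_high = math.exp(beta - 1.96 * se), math.exp(beta + 1.96 * se)
print(f"OR = {odds_ratio:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")
```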
Initiative for safe driving and enhanced utilization of crash data
NASA Astrophysics Data System (ADS)
Wagner, John F.
1994-03-01
This initiative addresses the utilization of current technology to increase the efficiency of police officers to complete required Driving Under the Influence (DUI) forms and to enhance their ability to acquire and record crash and accident information. The project is a cooperative program among the New Mexico Alliance for Transportation Research (ATR), Science Applications International Corporation (SAIC), Los Alamos National Laboratory, and the New Mexico State Highway and Transportation Department. The approach utilizes an in-car computer and associated sensors for information acquisition and recording. Los Alamos artificial intelligence technology is leveraged to ensure ease of data entry and use.
Seismic waveform modeling over cloud
NASA Astrophysics Data System (ADS)
Luo, Cong; Friederich, Wolfgang
2016-04-01
With fast-growing computational technologies, numerical simulation of seismic wave propagation has achieved huge successes. Obtaining synthetic waveforms through numerical simulation receives an increasing amount of attention from seismologists. However, computational seismology is a data-intensive research field, and the numerical packages usually come with a steep learning curve. Users are expected to master a considerable amount of computer knowledge and data processing skills. Training users to use the numerical packages and to correctly access and utilize the computational resources is a difficult task. In addition, access to HPC is a common difficulty for many users. To solve these problems, a cloud-based solution dedicated to shallow seismic waveform modeling has been developed with state-of-the-art web technologies. It is a web platform integrating both software and hardware in a multilayer architecture: a well-designed SQL database serves as the data layer, and HPC with a dedicated pipeline forms the business layer. Through this platform, users no longer need to compile and manipulate various packages on a local machine within a local network to perform a simulation. By providing professional access to the computational code through its interfaces and delivering our computational resources over the cloud, the platform lets users customize a simulation at expert level and submit and run the job through it.
Efficient utilization of graphics technology for space animation
NASA Technical Reports Server (NTRS)
Panos, Gregory Peter
1989-01-01
Efficient utilization of computer graphics technology has become a major investment in the work of aerospace engineers and mission designers. These new tools are having a significant impact in the development and analysis of complex tasks and procedures which must be prepared prior to actual space flight. Design and implementation of useful methods in applying these tools has evolved into a complex interaction of hardware, software, network, video and various user interfaces. Because few people can understand every aspect of this broad mix of technology, many specialists are required to build, train, maintain and adapt these tools to changing user needs. Researchers have set out to create systems where an engineering designer can easily work to achieve goals with a minimum of technological distraction. This was accomplished with high-performance flight simulation visual systems and supercomputer computational horsepower. Control throughout the creative process is judiciously applied while maintaining generality and ease of use to accommodate a wide variety of engineering needs.
Women and Computer Based Technologies: A Feminist Perspective.
ERIC Educational Resources Information Center
Morritt, Hope
The use of computer based technologies by professional women in education is examined through a feminist standpoint theory in this paper. The theory is grounded in eight claims which form the basis of the conceptual framework for the study. The experiences of nine women participants with computer based technologies were categorized using three…
Program on application of communications satellites to educational development
NASA Technical Reports Server (NTRS)
Morgan, R. P.; Singh, J. P.
1971-01-01
Interdisciplinary research in needs analysis, communications technology studies, and systems synthesis is reported. Existing and planned educational telecommunications services are studied and library utilization of telecommunications is described. Preliminary estimates are presented of ranges of utilization of educational telecommunications services for 1975 and 1985; instructional and public television, computer-aided instruction, computing resources, and information resource sharing for various educational levels and purposes. Communications technology studies include transmission schemes for still-picture television, use of Gunn effect devices, and TV receiver front ends for direct satellite reception at 12 GHz. Two major studies in the systems synthesis project concern (1) organizational and administrative aspects of a large-scale instructional satellite system to be used with schools and (2) an analysis of future development of instructional television, with emphasis on the use of video tape recorders and cable television. A communications satellite system synthesis program developed for NASA is now operational on the university IBM 360-50 computer.
Evaluation of Advanced Computing Techniques and Technologies: Reconfigurable Computing
NASA Technical Reports Server (NTRS)
Wells, B. Earl
2003-01-01
The focus of this project was to survey the technology of reconfigurable computing and determine its level of maturity and suitability for NASA applications; to better understand and assess the effectiveness of the reconfigurable design paradigm utilized within the HAL-15 reconfigurable computer system, which was made available to NASA MSFC for this purpose by Star Bridge Systems, Inc.; and to implement at least one application that would benefit from the performance levels that are possible with reconfigurable hardware. It was originally proposed that experiments in fault tolerance and dynamic reconfigurability would be performed, but time constraints mandated that these be pursued as future research.
ERIC Educational Resources Information Center
Arnold, Savittree Rochanasmita; Padilla, Michael J.; Tunhikorn, Bupphachart
2009-01-01
In the rapidly developing digital world, technology is and will be a force in workplaces, communities, and everyday lives in the 21st century. Information and Communication Technology (ICT) including computer hardware/software, networking and other technologies such as audio, video, and other multimedia tools became learning tools for students in…
NASA Astrophysics Data System (ADS)
Chuang, Yi-Ting
The advancement of mobile computing technology has provided diverse ways to deliver education. The combination of mobile devices and GIS tools has become a trend in many geospatial technology applications (e.g., the Google Maps application on smartphones). This research aims to develop an iBook prototype (a GIS textbook) for GIS education on Apple iPads and to evaluate the effectiveness of adopting the GIS iBook in classes and fieldwork exercises. We conducted the evaluation tests in two GIS courses (GEOG104 and GEOG381) in Fall 2014 at San Diego State University. There are two main research questions in this study: (1) How can the effectiveness of location-based learning exercises (from the iBook) and fieldwork exercises be assessed and evaluated for first-time GIS students? (2) What were the major technical challenges and opportunities in utilizing mobile devices and mobile technology in GIS education? The procedures of developing and evaluating the prototype of the GIS iBook included creating two new chapters (chapter three: Wander the World through Remote Sensing Data and chapter four: Internet and Mobile GIS), interviewing five educators from high schools and community colleges, and improving the contents of the GIS iBook after the interviews. Thirty-one students tested the GIS iBook and did a fieldwork exercise with iPads; they were required to complete five questionnaires after the exercise to express their user experiences and thoughts about the GIS iBook. Based on the questionnaire results, most students preferred to take GIS classes with the free GIS iBook and thought the fieldwork exercise could help their learning. The students also performed better in the knowledge-oriented survey after reading the GIS iBook. This research also adopts the SWOT analysis method to evaluate the prototype of the GIS iBook. The result of the SWOT analysis indicates that utilizing mobile devices in GIS education has great potential value in enhancing students' understanding. The strengths of utilizing mobile devices in GIS education include portability, easily updated contents, and abundant free development resources, while the weaknesses include distracting multimedia widgets, lack of Internet access, and security issues. The opportunities include financial plans for iPads and a lack of competitors, while the threats include the higher price and the incompatibility of iBooks with other tablet computers. The major limitations and key challenges are the limited survey time, the small sample size, and the technical difficulties of developing the GIS iBook.
Lightweight Sensor Authentication Scheme for Energy Efficiency in Ubiquitous Computing Environments.
Lee, Jaeseung; Sung, Yunsick; Park, Jong Hyuk
2016-12-01
The Internet of Things (IoT) refers to intelligent technologies and services that mutually communicate information between humans and devices or between Internet-based devices. In IoT environments, various device information is collected from the user for intelligent technologies and services that control the devices. Recently, wireless sensor networks based on IoT environments are being used in sectors as diverse as medicine, the military, and commerce. Specifically, sensor techniques that collect relevant area data via mini-sensors after distributing smart dust in inaccessible areas like forests or military zones have been embraced as the future of information technology. IoT environments that utilize smart dust are composed of sensor nodes that detect data using wireless sensors and transmit the detected data to middle nodes. Currently, since the sensors used in these environments are composed of mini-hardware, they have limited memory, processing power, and energy, and a variety of research that aims to make the best use of these limited resources is progressing. This paper proposes a method to utilize these resources while considering energy efficiency, and suggests lightweight mutual verification and key exchange methods based on a hash function that has no restrictions on operation quantity, velocity, and storage space. This study verifies the security and energy efficiency of this method through security analysis and function evaluation, comparing it with existing approaches. The proposed method has great value in its applicability as a lightweight security technology for IoT environments.
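The paper's specific protocol is not reproduced here, but the general pattern of hash-based mutual verification with a pre-shared key can be sketched as a challenge-response exchange; the HMAC-SHA256 primitive, message layout, and node names below are assumptions made purely for illustration:

```python
# Illustrative sketch only (not the paper's protocol): hash-based mutual
# verification between a sensor node and a middle node using a pre-shared key
# and random nonces. HMAC-SHA256 stands in for whatever lightweight hash a
# real deployment would use.
import hashlib
import hmac
import os

PRE_SHARED_KEY = os.urandom(16)          # provisioned on both sides beforehand

def tag(key: bytes, *parts: bytes) -> bytes:
    return hmac.new(key, b"|".join(parts), hashlib.sha256).digest()

# 1. middle node challenges the sensor
challenge = os.urandom(8)
# 2. sensor answers with its own nonce and a tag over both nonces
sensor_nonce = os.urandom(8)
sensor_tag = tag(PRE_SHARED_KEY, challenge, sensor_nonce, b"sensor")
# 3. middle node verifies the sensor, then proves knowledge of the key itself
assert hmac.compare_digest(sensor_tag, tag(PRE_SHARED_KEY, challenge, sensor_nonce, b"sensor"))
middle_tag = tag(PRE_SHARED_KEY, sensor_nonce, challenge, b"middle")
# 4. sensor verifies the middle node; both sides derive a fresh session key
assert hmac.compare_digest(middle_tag, tag(PRE_SHARED_KEY, sensor_nonce, challenge, b"middle"))
session_key = tag(PRE_SHARED_KEY, challenge, sensor_nonce, b"session")
print("mutual verification succeeded; session key:", session_key.hex()[:16], "...")
```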
NASA Astrophysics Data System (ADS)
Moon, Hongsik
What is the impact of multicore and associated advanced technologies on computational software for science? Most researchers and students have multicore laptops or desktops for their research, and they need computing power to run computational software packages. Computing power was initially derived from Central Processing Unit (CPU) clock speed. That changed when increases in clock speed became constrained by power requirements. Chip manufacturers turned to multicore CPU architectures and associated technological advancements to create the CPUs for the future. Most software applications benefited from the increased computing power the same way that increases in clock speed helped applications run faster. However, for Computational ElectroMagnetics (CEM) software developers, this change was not an obvious benefit - it appeared to be a detriment. Developers were challenged to find a way to correctly utilize the advancements in hardware so that their codes could benefit. The solution was parallelization, and this dissertation details the investigation to address these challenges. Prior to multicore CPUs, advanced computer technologies were compared on performance using benchmark software, and the metric was FLoating-point Operations Per Second (FLOPS), which indicates system performance for scientific applications that make heavy use of floating-point calculations. Is FLOPS an effective metric for parallelized CEM simulation tools on new multicore systems? Parallel CEM software needs to be benchmarked not only by FLOPS but also by the performance of other parameters related to the type and utilization of the hardware, such as the CPU, Random Access Memory (RAM), hard disk, network, etc. The codes need to be optimized for more than just FLOPS, and new parameters must be included in benchmarking. In this dissertation, the parallel CEM software named High Order Basis Based Integral Equation Solver (HOBBIES) is introduced. This code was developed to address the needs of the changing computer hardware platforms in order to provide fast, accurate and efficient solutions to large, complex electromagnetic problems. The research in this dissertation proves that the performance of parallel code is intimately related to the configuration of the computer hardware and can be maximized for different hardware platforms. To benchmark and optimize the performance of parallel CEM software, a variety of large, complex projects are created and executed on a variety of computer platforms. The computer platforms used in this research are detailed in this dissertation. The projects run as benchmarks are also described in detail and results are presented. The parameters that affect parallel CEM software on High Performance Computing Clusters (HPCC) are investigated. This research demonstrates methods to maximize the performance of parallel CEM software code.
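As a reminder of what a bare FLOPS figure measures (and what it leaves out), a timed dense matrix multiply gives a rough GFLOP/s estimate; this is only an illustration, not the HOBBIES benchmark suite:

```python
# A rough FLOPS measurement of the kind the dissertation argues is insufficient
# on its own: time a dense matrix multiply and convert to GFLOP/s.
import time

import numpy as np

n = 2048
a = np.random.rand(n, n)
b = np.random.rand(n, n)

start = time.perf_counter()
c = a @ b
elapsed = time.perf_counter() - start

flops = 2 * n ** 3                       # ~2*n^3 floating-point operations in a dense multiply
print(f"~{flops / elapsed / 1e9:.1f} GFLOP/s in {elapsed:.2f} s")
# Memory bandwidth, disk and network behaviour are not captured by this number,
# which is the dissertation's point about benchmarking parallel CEM codes.
```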
A New Test Method of Circuit Breaker Spring Telescopic Characteristics Based Image Processing
NASA Astrophysics Data System (ADS)
Huang, Huimin; Wang, Feifeng; Lu, Yufeng; Xia, Xiaofei; Su, Yi
2018-06-01
This paper applies computer vision technology to the fatigue condition monitoring of springs, and a new telescopic-characteristics test method based on image processing is proposed for circuit breaker operating mechanism springs. A high-speed camera is utilized to capture spring movement image sequences while a high-voltage circuit breaker operates. An image-matching method is then used to obtain the deformation-time and speed-time curves, from which the spring expansion and deformation parameters are extracted, laying a foundation for subsequent spring force analysis and matching-state evaluation. Simulation tests at the experimental site show that this image analysis method can avoid the complexity of installing traditional mechanical sensors while supporting online monitoring and status assessment of the circuit breaker spring.
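A minimal sketch of the image-matching idea (not the authors' implementation) is shown below: a template patch is tracked across high-speed-camera frames with normalized cross-correlation, and the match positions are converted into deformation-time and speed-time curves. The file pattern, template coordinates, and frame rate are hypothetical.

```python
# Minimal sketch of the image-matching idea (not the authors' implementation):
# track a template patch across high-speed-camera frames and turn the match
# positions into deformation-time and speed-time curves. "frames/*.png" and
# the template coordinates are hypothetical.
import glob

import cv2
import numpy as np

frame_files = sorted(glob.glob("frames/*.png"))
first = cv2.imread(frame_files[0], cv2.IMREAD_GRAYSCALE)
template = first[100:140, 200:240]       # assumed patch on the spring

fps = 2000.0                             # assumed high-speed camera frame rate
positions = []
for path in frame_files:
    frame = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    result = cv2.matchTemplate(frame, template, cv2.TM_CCOEFF_NORMED)
    _, _, _, max_loc = cv2.minMaxLoc(result)
    positions.append(max_loc[1])         # vertical position of the best match (pixels)

deformation = np.array(positions) - positions[0]        # pixels, relative to frame 0
speed = np.gradient(deformation) * fps                  # pixels per second
print("peak deformation (px):", deformation.max(), "peak speed (px/s):", np.abs(speed).max())
```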
A demanding web-based PACS supported by web services technology
NASA Astrophysics Data System (ADS)
Costa, Carlos M. A.; Silva, Augusto; Oliveira, José L.; Ribeiro, Vasco G.; Ribeiro, José
2006-03-01
During the last few years, the ubiquity of web interfaces has pushed practically all PACS suppliers to develop client applications in which clinical practitioners can receive and analyze medical images, using conventional personal computers and Web browsers. However, due to security and performance issues, the utilization of these software packages has been restricted to Intranets. Paradoxically, one of the most important advantages of digital image systems is to simplify the widespread sharing and remote access of medical data between healthcare institutions. This paper analyses the traditional PACS drawbacks that contribute to their reduced usage on the Internet and describes a PACS based on Web Services technology that supports a customized DICOM encoding syntax and a specific compression scheme, providing all historical patient data in a unique Web interface.
Light transport and general aviation aircraft icing research requirements
NASA Technical Reports Server (NTRS)
Breeze, R. K.; Clark, G. M.
1981-01-01
A short term and a long term icing research and technology program plan was drafted for NASA LeRC based on 33 separate research items. The specific items listed resulted from a comprehensive literature search, organized and assisted by a computer management file and an industry/Government agency survey. Assessment of the current facilities and icing technology was accomplished by presenting summaries of ice sensitive components and protection methods; and assessments of penalty evaluation, the experimental data base, ice accretion prediction methods, research facilities, new protection methods, ice protection requirements, and icing instrumentation. The intent of the research plan was to determine what icing research NASA LeRC must do or sponsor to ultimately provide for increased utilization and safety of light transport and general aviation aircraft.
Consistent multiphysics simulation of a central tower CSP plant as applied to ISTORE
NASA Astrophysics Data System (ADS)
Votyakov, Evgeny V.; Papanicolas, Costas N.
2017-06-01
We present a unified consistent multiphysics approach to model a central tower CSP plant. The framework for the model includes Monte Carlo ray tracing (RT) and computational fluid dynamics (CFD) components utilizing the OpenFOAM C++ software library. The RT part works effectively with complex surfaces of engineering design given in CAD formats. The CFD simulation, which is based on 3D Navier-Stokes equations, takes into account all possible heat transfer mechanisms: radiation, conduction, and convection. Utilizing this package, the solar field of the experimental Platform for Research, Observation, and TEchnological Applications in Solar Energy (PROTEAS) and the Integrated STOrage and Receiver (ISTORE), developed at the Cyprus Institute, are being examined.
Zhang, Wenchao; Dai, Xinbin; Wang, Qishan; Xu, Shizhong; Zhao, Patrick X
2016-05-01
The term epistasis refers to interactions between multiple genetic loci. Genetic epistasis is important in regulating biological function and is considered to explain part of the 'missing heritability,' which involves marginal genetic effects that cannot be accounted for in genome-wide association studies. Thus, the study of epistasis is of great interest to geneticists. However, estimating epistatic effects for quantitative traits is challenging due to the large number of interaction effects that must be estimated, which significantly increases computing demands. Here, we present a new web server-based tool, the Pipeline for estimating EPIStatic genetic effects (PEPIS), for analyzing polygenic epistatic effects. The PEPIS software package is based on a new linear mixed model that has been used to predict the performance of hybrid rice. The PEPIS includes two main sub-pipelines: the first for kinship matrix calculation, and the second for polygenic component analyses and genome scanning for main and epistatic effects. To accommodate the demand for high-performance computation, the PEPIS utilizes C/C++ for mathematical matrix computing. In addition, the modules for kinship matrix calculation and for main and epistatic-effect genome scanning employ parallel computing technology that effectively utilizes multiple computer nodes across our networked cluster, significantly improving the computational speed. For example, when analyzing the same immortalized F2 rice population genotypic data examined in a previous study, the PEPIS returned results identical to those of the original prototype R code at each analysis step, but the computational time was reduced from more than one month to about five minutes. These advances will help overcome the bottleneck frequently encountered in genome-wide epistatic genetic effect analysis and enable accommodation of the high computational demand. The PEPIS is publicly available at http://bioinfo.noble.org/PolyGenic_QTL/.
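The PEPIS kinship step itself runs in parallel C/C++; purely as an illustration of what a kinship (genomic relationship) matrix computation involves, a VanRaden-style calculation from a 0/1/2 genotype matrix can be written in a few lines of numpy on random data:

```python
# Illustrative kinship-matrix computation in the spirit of the PEPIS first
# sub-pipeline (the tool itself uses parallel C/C++). A VanRaden-style genomic
# relationship matrix is built from a 0/1/2 genotype matrix; the data are random.
import numpy as np

rng = np.random.default_rng(1)
n_individuals, n_markers = 200, 5000
freqs = rng.uniform(0.05, 0.95, size=n_markers)             # simulated allele frequencies
genotypes = rng.binomial(2, freqs, size=(n_individuals, n_markers)).astype(float)

p = genotypes.mean(axis=0) / 2.0                            # observed allele frequencies
Z = genotypes - 2.0 * p                                     # centre each marker by 2p
K = (Z @ Z.T) / (2.0 * np.sum(p * (1.0 - p)))               # VanRaden (2008)-style kinship

print("kinship matrix shape:", K.shape)
print(f"mean diagonal (should be near 1): {K.diagonal().mean():.3f}")
```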
Srivastava, Tripti K; Waghmare, Lalitbhushan S; Jagzape, Arunita T; Rawekar, Alka T; Quazi, Nazli Z; Mishra, Ved Prakash
2014-01-01
Background: Higher education has undergone profound transformation due to recent technological advancements. As a result, health profession students have a strong base for utilizing information technology in their professional development. Studies in the recent past reflect a striking change in the pattern of technology usage among medical students, with prospects expanding exponentially through e-books, science apps, ready-made PowerPoint presentations, evidence-based medicine, Wikipedia, etc. Aim & Objectives: The study was undertaken to explore the general perceptions of medical students and faculty about the role of Information Communication Technology in higher education and to gauge students' dependence on it for seeking knowledge and information. Study Design: Cross-sectional, mixed research design. Materials and Methods: The study was conducted in the Department of Physiology, Datta Meghe Institute of Medical Sciences (Deemed University). The study population included students (n=150) and teaching faculty (n=10) of the first phase of the medical curriculum. A survey questionnaire (10 closed-ended and 5 open-ended items) and a focus group discussion (FGD) captured the perceptions and attitudes of students and faculty, respectively, regarding the role and relevance of technology in higher education. Observations and Results: Quantitative analysis of closed-ended responses was done by percentage distribution, and qualitative analysis of open-ended responses and FGD excerpts was done by coding and observing trends and patterns. Overall, the observations favoured increasing usability of, and dependence on, technology as a ready reference tool for subject information. Learners valued textbooks and technology almost equally and regarded computer training as a desirable addition to the medical curriculum. Conclusion: The role of technology in education should be anticipated, and appropriate measures should be undertaken for its adequate and optimum utilization through proper training of students as well as facilitators. PMID:25121049
NASA Astrophysics Data System (ADS)
Lou, Yang; Zhou, Weimin; Matthews, Thomas P.; Appleton, Catherine M.; Anastasio, Mark A.
2017-04-01
Photoacoustic computed tomography (PACT) and ultrasound computed tomography (USCT) are emerging modalities for breast imaging. As in all emerging imaging technologies, computer-simulation studies play a critically important role in developing and optimizing the designs of hardware and image reconstruction methods for PACT and USCT. Using computer simulations, the parameters of an imaging system can be systematically and comprehensively explored in a way that is generally not possible through experimentation. When conducting such studies, numerical phantoms are employed to represent the physical properties of the patient or object to be imaged that influence the measured image data. It is highly desirable to utilize numerical phantoms that are realistic, especially when task-based measures of image quality are to be utilized to guide system design. However, most reported computer-simulation studies of PACT and USCT breast imaging employ simple numerical phantoms that oversimplify the complex anatomical structures in the human female breast. We develop and implement a methodology for generating anatomically realistic numerical breast phantoms from clinical contrast-enhanced magnetic resonance imaging data. The phantoms will depict vascular structures and the volumetric distribution of different tissue types in the breast. By assigning optical and acoustic parameters to different tissue structures, both optical and acoustic breast phantoms will be established for use in PACT and USCT studies.
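To make the final step of this phantom-generation idea concrete, the sketch below maps a segmented tissue-label volume to acoustic and optical property volumes via lookup tables; the tissue labels and property values are illustrative assumptions, not the values used by the authors.

```python
import numpy as np

# Hypothetical tissue labels from a segmented MRI volume
FAT, FIBROGLANDULAR, SKIN, VESSEL = 0, 1, 2, 3

# Illustrative (label -> property) tables: sound speed [m/s], optical absorption [1/cm]
SOUND_SPEED = {FAT: 1440.0, FIBROGLANDULAR: 1550.0, SKIN: 1610.0, VESSEL: 1580.0}
MU_A        = {FAT: 0.05,   FIBROGLANDULAR: 0.08,   SKIN: 0.20,   VESSEL: 2.0}

def labels_to_phantom(label_volume, table):
    """Convert an integer tissue-label volume into a property volume using `table`."""
    out = np.zeros(label_volume.shape, dtype=float)
    for label, value in table.items():
        out[label_volume == label] = value
    return out

if __name__ == "__main__":
    labels = np.random.default_rng(0).integers(0, 4, size=(64, 64, 64))
    c_map   = labels_to_phantom(labels, SOUND_SPEED)   # acoustic phantom for USCT
    mua_map = labels_to_phantom(labels, MU_A)          # optical phantom for PACT
    print(round(c_map.mean(), 1), round(mua_map.mean(), 3))
```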
Environment and health: Probes and sensors for environment digital control
NASA Astrophysics Data System (ADS)
Schettini, Chiara
2014-05-01
The idea of studying the environment using New Technologies (NT) came from a MIUR (Ministry of Education of the Italian Government) call that allocated funds for innovative school science projects. The "Environment and Health" project uses probes and sensors for digital control of the environment (water, air and soil). The working group was composed of 4 science teachers from Liceo Statale G. Mazzini, under the coordination of teacher Chiara Schettini. The Didactic Section of Naples City of Sciences helped the teachers develop the project and organized a refresher course for them on the use of digital control sensors. The project connects environment and technology because the study of natural aspects and the analysis of chemical-physical parameters give students and teachers skills for studying the environment based on NT and computer-aided data processing. During the practical phase of the project, samples of air, water and soil are gathered in different contexts. Sample analysis was done in the school's scientific laboratory with digitally controlled sensors. The data are processed with specific software, and the results have been collected in a booklet and in a computer database. During the first year, the project involved 6 school classes (students aged 14-15 years), under the coordination of science teachers. The project aims are: 1) making students more aware of environmental matters; 2) achieving basic skills for evaluating air, water and soil quality; 3) achieving strong skills in the use of digitally controlled sensors; 4) achieving computing skills for processing and presenting data. The project aims to develop a broad environmental awareness and a sense that a 'good' environment is needed to protect our health. Moreover, it should reinforce the importance of NT as an instrument of knowledge.
Simplified gas sensor model based on AlGaN/GaN heterostructure Schottky diode
DOE Office of Scientific and Technical Information (OSTI.GOV)
Das, Subhashis, E-mail: subhashis.ds@gmail.com; Majumdar, S.; Kumar, R.
2015-08-28
Physics-based modeling of an AlGaN/GaN heterostructure Schottky diode gas sensor has been investigated for high sensitivity and linearity of the device. Here the surface and heterointerface properties are greatly exploited; in particular, the dependence of the two-dimensional electron gas (2DEG) on the surface charge is utilized. The Schottky diode was simulated in a Technology Computer Aided Design (TCAD) tool and I-V curves were generated; from these curves, a 76% response was recorded in the presence of 500 ppm of gas at a bias voltage of 0.95 V.
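As a minimal sketch of how such a response figure can be quantified from two I-V curves, the snippet below assumes a simple thermionic-emission diode model in which gas exposure is represented as a small Schottky-barrier-height shift; the barrier values, ideality factor, area, and barrier lowering are hypothetical and are not taken from the TCAD simulation above.

```python
import math

K_B = 8.617e-5      # Boltzmann constant, eV/K
A_STAR = 26.4       # Richardson constant often quoted for GaN, A/(cm^2 K^2)

def diode_current(v, phi_b, area_cm2=1e-4, n=1.5, t=300.0):
    """Thermionic-emission I-V of a Schottky diode (idealized sketch)."""
    i_s = area_cm2 * A_STAR * t**2 * math.exp(-phi_b / (K_B * t))
    return i_s * (math.exp(v / (n * K_B * t)) - 1.0)

def response(v_bias=0.95, phi_air=0.90, barrier_shift=-0.04):
    """Relative response (I_gas - I_air) / I_gas at a fixed bias; the barrier
    lowering under gas exposure is an assumed illustrative value."""
    i_air = diode_current(v_bias, phi_air)
    i_gas = diode_current(v_bias, phi_air + barrier_shift)
    return (i_gas - i_air) / i_gas

if __name__ == "__main__":
    print(f"response at 0.95 V: {100 * response():.1f} %")
```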
NASA Technical Reports Server (NTRS)
1991-01-01
The purpose of the conference was to increase awareness of existing NASA developed technologies that are available for immediate use in the development of new products and processes, and to lay the groundwork for the effective utilization of emerging technologies. There were sessions on the following: Computer technology and software engineering; Human factors engineering and life sciences; Information and data management; Material sciences; Manufacturing and fabrication technology; Power, energy, and control systems; Robotics; Sensors and measurement technology; Artificial intelligence; Environmental technology; Optics and communications; and Superconductivity.
ERIC Educational Resources Information Center
Gallo, Dennis; Welty, Kenneth
This document contains technology-based learning activities for the Illinois energy utilization technology course at the orientation level (grades 9 and 10). This packet includes a course rationale, course mission statement, course description, course outline, suggested learning objectives for each of the energy utilization areas, and suggested…
PERKAM: Personalized Knowledge Awareness Map for Computer Supported Ubiquitous Learning
ERIC Educational Resources Information Center
El-Bishouty, Moushir M.; Ogata, Hiroaki; Yano, Yoneo
2007-01-01
This paper introduces a ubiquitous computing environment in order to support the learners while doing tasks; this environment is called PERKAM (PERsonalized Knowledge Awareness Map). PERKAM allows the learners to share knowledge, interact, collaborate, and exchange individual experiences. It utilizes RFID ubiquitous technology to detect the…
Identification of agricultural crops by computer processing of ERTS MSS data
NASA Technical Reports Server (NTRS)
Bauer, M. E.; Cipra, J. E.
1973-01-01
Quantitative evaluation of computer-processed ERTS MSS data classifications has shown that major crop species (corn and soybeans) can be accurately identified. The classifications of satellite data over a 2000 square mile area not only covered more than 100 times the area previously covered using aircraft, but also yielded improved results through the use of temporal and spatial data in addition to the spectral information. Furthermore, training sets could be extended over far larger areas than was ever possible with aircraft scanner data. And, preliminary comparisons of acreage estimates from ERTS data and ground-based systems agreed well. The results demonstrate the potential utility of this technology for obtaining crop production information.
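The summary above does not spell out the classification algorithm; as one simple illustration of per-pixel multispectral classification of the kind applied to MSS data, the sketch below trains a nearest-class-mean (minimum-distance) classifier on labeled training pixels and assigns scene pixels to the closest class mean. The four-band values and class names are fabricated for the example and are not ERTS data.

```python
import numpy as np

def train_class_means(training_pixels):
    """training_pixels: dict class_name -> (n_pixels, n_bands) list/array of band values."""
    return {name: np.asarray(p, dtype=float).mean(axis=0) for name, p in training_pixels.items()}

def classify(pixels, class_means):
    """Assign each pixel to the class with the nearest mean spectral vector."""
    names = list(class_means)
    means = np.stack([class_means[n] for n in names])            # (n_classes, n_bands)
    d2 = ((pixels[:, None, :] - means[None, :, :]) ** 2).sum(axis=2)
    return [names[i] for i in d2.argmin(axis=1)]

if __name__ == "__main__":
    # Fabricated 4-band MSS-like training samples
    train = {"corn":     [[30, 25, 60, 70], [32, 27, 58, 68]],
             "soybeans": [[28, 22, 75, 90], [29, 23, 73, 88]],
             "other":    [[45, 40, 40, 35], [47, 42, 38, 33]]}
    means = train_class_means(train)
    scene = np.array([[31, 26, 59, 69], [29, 23, 74, 89]], dtype=float)
    print(classify(scene, means))
```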
NASA Technical Reports Server (NTRS)
Fishbach, L. H.
1979-01-01
The computational techniques utilized to determine the optimum propulsion systems for future aircraft applications and to identify system tradeoffs and technology requirements are described. The characteristics and use of the following computer codes are discussed: (1) NNEP - a very general cycle analysis code that can assemble an arbitrary matrix of fans, turbines, ducts, shafts, etc., into a complete gas turbine engine and compute on- and off-design thermodynamic performance; (2) WATE - a preliminary design procedure for calculating engine weight using the component characteristics determined by NNEP; (3) POD DRG - a table look-up program to calculate wave and friction drag of nacelles; (4) LIFCYC - a computer code developed to calculate life cycle costs of engines based on the output from WATE; and (5) INSTAL - a computer code developed to calculate installation effects, inlet performance and inlet weight. Examples are given to illustrate how these computer techniques can be applied to analyze and optimize propulsion system fuel consumption, weight, and cost for representative types of aircraft and missions.
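NNEP itself performs general component-matching cycle analysis; as a far simpler illustration of on-design thermodynamic cycle bookkeeping of the kind such codes automate, the sketch below evaluates an ideal Brayton (gas turbine) cycle from a pressure ratio and a turbine inlet temperature. The station treatment and property values are textbook idealizations, not NNEP conventions.

```python
GAMMA = 1.4          # ratio of specific heats for air (ideal)
CP = 1005.0          # specific heat at constant pressure, J/(kg K)

def ideal_brayton(t_ambient=288.0, pressure_ratio=25.0, t_turbine_inlet=1600.0):
    """On-design ideal Brayton cycle: returns specific work (J/kg) and thermal efficiency."""
    tau = pressure_ratio ** ((GAMMA - 1.0) / GAMMA)   # compressor temperature ratio
    t2 = t_ambient * tau                              # compressor exit temperature
    t4 = t_turbine_inlet / tau                        # turbine exit temperature (ideal expansion)
    w_comp = CP * (t2 - t_ambient)
    w_turb = CP * (t_turbine_inlet - t4)
    q_in = CP * (t_turbine_inlet - t2)
    w_net = w_turb - w_comp
    return w_net, w_net / q_in

if __name__ == "__main__":
    w, eta = ideal_brayton()
    print(f"specific work = {w/1000:.1f} kJ/kg, thermal efficiency = {eta:.3f}")
```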
Agent-based user-adaptive service provision in ubiquitous systems
NASA Astrophysics Data System (ADS)
Saddiki, H.; Harroud, H.; Karmouch, A.
2012-11-01
With the increasing availability of smartphones, tablets and other computing devices, technology consumers have grown accustomed to performing all of their computing tasks anytime, anywhere and on any device. There is a greater need to support ubiquitous connectivity and accommodate users by providing software as network-accessible services. In this paper, we propose a MAS-based approach to adaptive service composition and provision that automates the selection and execution of a suitable composition plan for a given service. With agents capable of autonomous and intelligent behavior, the composition plan is selected through a dynamic negotiation driven by a utility-based decision-making mechanism, and the composite service is built by a coalition of agents, each providing a component necessary to the target service. The same service can be built in variations to cater to dynamic user contexts and further personalize the user experience. Also, multiple services can be grouped to satisfy new user needs.
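As a sketch of the utility-based selection step mentioned above (the agent negotiation protocol itself is not reproduced), each candidate composition plan can be scored by a weighted utility over attributes such as cost, latency, and reliability, and the highest-scoring plan chosen. The attribute names, weights, and values below are assumptions for illustration.

```python
def plan_utility(plan, weights):
    """Weighted additive utility: lower cost/latency are better, higher reliability is better."""
    return (weights["reliability"] * plan["reliability"]
            - weights["cost"] * plan["cost"]
            - weights["latency"] * plan["latency"])

def select_plan(plans, weights):
    """Pick the composition plan with the highest utility."""
    return max(plans, key=lambda p: plan_utility(p, weights))

if __name__ == "__main__":
    candidate_plans = [
        {"name": "plan_a", "cost": 0.8, "latency": 0.3, "reliability": 0.95},
        {"name": "plan_b", "cost": 0.4, "latency": 0.6, "reliability": 0.90},
        {"name": "plan_c", "cost": 0.5, "latency": 0.2, "reliability": 0.85},
    ]
    weights = {"cost": 1.0, "latency": 1.0, "reliability": 2.0}
    print(select_plan(candidate_plans, weights)["name"])
```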
A review of emerging non-volatile memory (NVM) technologies and applications
NASA Astrophysics Data System (ADS)
Chen, An
2016-11-01
This paper will review emerging non-volatile memory (NVM) technologies, with the focus on phase change memory (PCM), spin-transfer-torque random-access-memory (STTRAM), resistive random-access-memory (RRAM), and ferroelectric field-effect-transistor (FeFET) memory. These promising NVM devices are evaluated in terms of their advantages, challenges, and applications. Their performance is compared based on reported parameters of major industrial test chips. Memory selector devices and cell structures are discussed. Changing market trends toward low power (e.g., mobile, IoT) and data-centric applications create opportunities for emerging NVMs. High-performance and low-cost emerging NVMs may simplify memory hierarchy, introduce non-volatility in logic gates and circuits, reduce system power, and enable novel architectures. Storage-class memory (SCM) based on high-density NVMs could fill the performance and density gap between memory and storage. Some unique characteristics of emerging NVMs can be utilized for novel applications beyond the memory space, e.g., neuromorphic computing, hardware security, etc. In the beyond-CMOS era, emerging NVMs have the potential to fulfill more important functions and enable more efficient, intelligent, and secure computing systems.
Discovering the intelligence in molecular biology.
Uberbacher, E
1995-12-01
The Third International Conference on Intelligent Systems in Molecular Biology was truly an outstanding event. Computational methods in molecular biology have reached a new level of maturity and utility, resulting in many high-impact applications. The success of this meeting bodes well for the rapid and continuing development of computational methods, intelligent systems and information-based approaches for the biosciences. The basic technology, originally most often applied to 'feasibility' problems, is now dealing effectively with the most difficult real-world problems. Significant progress has been made in understanding protein-structure information, structural classification, and how functional information and the relevant features of active-site geometry can be gleaned from structures by automated computational approaches. The value and limits of homology-based methods, and the ability to classify proteins by structure in the absence of homology, have reached a new level of sophistication. New methods for covariation analysis in the folding of large structures such as RNAs have shown remarkably good results, indicating the long-term potential to understand very complicated molecules and multimolecular complexes using computational means. Novel methods, such as HMMs, context-free grammars and the uses of mutual information theory, have taken center stage as highly valuable tools in our quest to represent and characterize biological information. A focus on creative uses of intelligent systems technologies and the trend toward biological application will undoubtedly continue and grow at the 1996 ISMB meeting in St Louis.
Wilkie, Diana J; Kim, Young Ok; Suarez, Marie L; Dauw, Colleen M; Stapleton, Stephen J; Gorman, Geraldine; Storfjell, Judith; Zhao, Zhongsheng
2009-07-01
We aimed to determine the acceptability and feasibility of a pentablet-based software program, PAINReportIt-Plus, as a means for patients with cancer in home hospice to report their symptoms and differences in acceptability by demographic variables. Of the 131 participants (mean age = 59 +/- 13, 58% women, 48.1% African American), 44% had never used a computer, but all participants easily used the computerized tool and reported an average computer acceptability score of 10.3 +/- 1.8, indicating high acceptability. Participants required an average of 19.1 +/- 9.5 minutes to complete the pain section, 9.8 +/- 6.5 minutes for the medication section, and 4.8 +/- 2.3 minutes for the symptom section. The acceptability scores were not statistically different by demographic variables but time to complete the tool differed by racial/ethnic groups. Our findings demonstrate that terminally ill patients with cancer are willing and able to utilize computer pentablet technology to record and describe their pain and other symptoms. Visibility of pain and distress is the first step necessary for the hospice team to develop a care plan for improving control of noxious symptoms.
A minimal SATA III Host Controller based on FPGA
NASA Astrophysics Data System (ADS)
Liu, Hailiang
2018-03-01
SATA (Serial Advanced Technology Attachment) is an advanced serial bus with outstanding performance in transmitting high-speed real-time data, applied in personal computers, the financial industry, astronautics and aeronautics, etc. In this paper, a minimal SATA III host controller based on a Xilinx Kintex-7 series FPGA is designed and implemented. Compared to the state of the art, register utilization is reduced by 25.3% and LUT utilization by 65.9%. According to the experimental results, the controller works precisely and stably, with a read bandwidth of up to 536 MB per second and a write bandwidth of up to 512 MB per second, both of which are close to the maximum bandwidth of the SSD (Solid State Disk) device. The host controller is well suited for high-speed data transmission and mass data storage.
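For context on the reported throughput figures: SATA III signals at 6 Gbit/s with 8b/10b line coding, which caps the payload bandwidth at roughly 600 MB/s before protocol overhead. A quick back-of-the-envelope check of the reported read/write rates against that ceiling is sketched below (decimal megabytes are assumed).

```python
LINE_RATE_GBPS = 6.0        # SATA III raw line rate, Gbit/s
CODING_EFFICIENCY = 8 / 10  # 8b/10b encoding

def sata3_payload_ceiling_mb_s():
    """Approximate maximum payload bandwidth in MB/s (ignoring protocol overhead)."""
    return LINE_RATE_GBPS * 1e9 * CODING_EFFICIENCY / 8 / 1e6

if __name__ == "__main__":
    ceiling = sata3_payload_ceiling_mb_s()            # ~600 MB/s
    for label, measured in (("read", 536.0), ("write", 512.0)):
        print(f"{label}: {measured:.0f} MB/s = "
              f"{100 * measured / ceiling:.1f}% of the ~{ceiling:.0f} MB/s ceiling")
```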
Utilizing Modern Technology in Adult and Continuing Education Programs.
ERIC Educational Resources Information Center
New York State Education Dept., Albany. Bureau of Curriculum Development.
This publication, designed as a supplement to the manual entitled "Managing Programs for Adults" (1983), provides guidelines for establishing or expanding the use of video and computers by administration and staff of adult education programs. The first section presents the use of video technology for program promotion, instruction, and staff…
ERIC Educational Resources Information Center
Li, Haiqing
2010-01-01
With rapid advancements in information and communication technologies, computer-mediated communication channels such as email, web, mobile smart-phones with SMS, social networking websites (Facebook), multimedia websites, and OEM devices provide users with multiple technology choices to seek information. However, no study has compared the…
Technology: Catalyst for Enhancing Chemical Education for Pre-service Teachers
NASA Astrophysics Data System (ADS)
Kumar, Vinay; Bedell, Julia Yang; Seed, Allen H.
1999-05-01
A DOE/KYEPSCoR-funded project enabled us to introduce a new curricular initiative aimed at improving the chemical education of pre-service elementary teachers. The new curriculum was developed in collaboration with the School of Education faculty. A new course for the pre-service teachers, "Discovering Chemistry with Lab" (CHE 105), was developed. The integrated lecture and lab course covers basic principles of chemistry and their applications in daily life. The course promotes reasoning and problem-solving skills and utilizes hands-on, discovery/guided-inquiry, and cooperative learning approaches. This paper describes the implementation of technology (computer-interfacing and simulation experiments) in the lab. Results of two assessment surveys conducted in the laboratory are also discussed. The key features of the lab course are eight new experiments, including four computer-interfacing/simulation experiments involving the use of Macintosh Power PCs, temperature and pH probes, and a serial box interface, and use of household materials. Several experiments and the midterm and final lab practical exams emphasize the discovery/guided-inquiry approach. The results of pre- and post-surveys showed very significant positive changes in students' attitude toward the relevancy of chemistry, use of technology (computers) in elementary school classrooms, and designing and teaching discovery-based units. Most students indicated that they would be very interested (52%) or interested (36%) in using computers in their science teaching.
Earth System Grid II, Turning Climate Datasets into Community Resources
DOE Office of Scientific and Technical Information (OSTI.GOV)
Middleton, Don
2006-08-01
The Earth System Grid (ESG) II project, funded by the Department of Energy’s Scientific Discovery through Advanced Computing program, has transformed climate data into community resources. ESG II has accomplished this goal by creating a virtual collaborative environment that links climate centers and users around the world to models and data via a computing Grid, which is based on the Department of Energy’s supercomputing resources and the Internet. Our project’s success stems from partnerships between climate researchers and computer scientists to advance basic and applied research in the terrestrial, atmospheric, and oceanic sciences. By interfacing with other climate science projects, we have learned that commonly used methods to manage and remotely distribute data among related groups lack infrastructure and under-utilize existing technologies. Knowledge and expertise gained from ESG II have helped the climate community plan strategies to manage a rapidly growing data environment more effectively. Moreover, approaches and technologies developed under the ESG project have impacted data-simulation integration in other disciplines, such as astrophysics, molecular biology and materials science.
NASA Technical Reports Server (NTRS)
Freitas, R. A., Jr. (Editor); Carlson, P. A. (Editor)
1983-01-01
Adoption of an aggressive computer science research and technology program within NASA will: (1) enable new mission capabilities such as autonomous spacecraft, reliability and self-repair, and low-bandwidth intelligent Earth sensing; (2) lower manpower requirements, especially in the areas of Space Shuttle operations, by making fuller use of control center automation, technical support, and internal utilization of state-of-the-art computer techniques; (3) reduce project costs via improved software verification, software engineering, enhanced scientist/engineer productivity, and increased managerial effectiveness; and (4) significantly improve internal operations within NASA with electronic mail, managerial computer aids, an automated bureaucracy and uniform program operating plans.
Dynamically adaptive data-driven simulation of extreme hydrological flows
NASA Astrophysics Data System (ADS)
Kumar Jain, Pushkar; Mandli, Kyle; Hoteit, Ibrahim; Knio, Omar; Dawson, Clint
2018-02-01
Hydrological hazards such as storm surges, tsunamis, and rainfall-induced flooding are physically complex events that are costly in loss of human life and economic productivity. Many such disasters could be mitigated through improved emergency evacuation in real-time and through the development of resilient infrastructure based on knowledge of how systems respond to extreme events. Data-driven computational modeling is a critical technology underpinning these efforts. This investigation focuses on the novel combination of methodologies in forward simulation and data assimilation. The forward geophysical model utilizes adaptive mesh refinement (AMR), a process by which a computational mesh can adapt in time and space based on the current state of a simulation. The forward solution is combined with ensemble based data assimilation methods, whereby observations from an event are assimilated into the forward simulation to improve the veracity of the solution, or used to invert for uncertain physical parameters. The novelty in our approach is the tight two-way coupling of AMR and ensemble filtering techniques. The technology is tested using actual data from the Chile tsunami event of February 27, 2010. These advances offer the promise of significantly transforming data-driven, real-time modeling of hydrological hazards, with potentially broader applications in other science domains.
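The abstract couples AMR with ensemble-based data assimilation; as a generic illustration of the ensemble update step only (not the authors' coupled AMR filter), here is a minimal stochastic ensemble Kalman filter analysis for a linear observation operator. The state dimension, ensemble size, and noise levels are arbitrary placeholders.

```python
import numpy as np

def enkf_analysis(ensemble, observations, H, obs_cov, rng):
    """Stochastic EnKF update.
    ensemble: (n_state, n_members) forecast ensemble
    observations: (n_obs,) observed values
    H: (n_obs, n_state) linear observation operator
    obs_cov: (n_obs, n_obs) observation error covariance
    """
    n_state, n_members = ensemble.shape
    x_mean = ensemble.mean(axis=1, keepdims=True)
    A = ensemble - x_mean                                   # state anomalies
    P = A @ A.T / (n_members - 1)                           # sample covariance
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + obs_cov)      # Kalman gain
    # perturb observations for each member (stochastic EnKF)
    obs_pert = rng.multivariate_normal(observations, obs_cov, size=n_members).T
    return ensemble + K @ (obs_pert - H @ ensemble)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    ens = rng.normal(1.0, 0.5, size=(10, 40))               # 10 state vars, 40 members
    H = np.zeros((2, 10)); H[0, 0] = H[1, 5] = 1.0          # observe two state variables
    R = 0.1 * np.eye(2)
    y = np.array([1.4, 0.6])
    updated = enkf_analysis(ens, y, H, R, rng)
    print(updated.mean(axis=1)[[0, 5]])
```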
Energy Consumption Management of Virtual Cloud Computing Platform
NASA Astrophysics Data System (ADS)
Li, Lin
2017-11-01
To advance research on energy consumption management for virtual cloud computing platforms, the energy consumption of both virtual machines and the underlying cloud platform must be understood in greater depth; only then can the problems facing energy consumption management be solved. The key problem lies in data centers with high energy consumption, which creates a strong need for new scientific techniques. Virtualization technology and cloud computing have become powerful tools in everyday life, work and production because of their strengths and many advantages; they are developing rapidly and achieve very high resource utilization rates, making them essential in the continuously developing information age. This paper summarizes, explains and further analyzes the energy consumption management questions of the virtual cloud computing platform. It ultimately gives readers a clearer understanding of energy consumption management for virtual cloud computing platforms and offers help to various aspects of life, work and so on.
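As one concrete example of the kind of energy-oriented decision such platforms face (not a method taken from the paper), the sketch below packs virtual machine CPU demands onto as few hosts as possible using first-fit decreasing, so that idle hosts can be powered down; the host capacity and the linear power model are assumptions.

```python
def consolidate(vm_cpu_demands, host_capacity=1.0):
    """First-fit-decreasing placement of VM CPU demands onto hosts.
    Returns a list of hosts, each a list of placed VM demands."""
    hosts = []
    for demand in sorted(vm_cpu_demands, reverse=True):
        for host in hosts:
            if sum(host) + demand <= host_capacity:
                host.append(demand)
                break
        else:
            hosts.append([demand])
    return hosts

def estimated_power(hosts, p_idle=100.0, p_peak=250.0):
    """Simple linear host power model: P = P_idle + utilization * (P_peak - P_idle)."""
    return sum(p_idle + sum(h) * (p_peak - p_idle) for h in hosts)

if __name__ == "__main__":
    demands = [0.6, 0.3, 0.25, 0.2, 0.15, 0.1]
    packed = consolidate(demands)
    print(f"{len(packed)} active hosts, ~{estimated_power(packed):.0f} W")
```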
Advances in computer imaging/applications in facial plastic surgery.
Papel, I D; Jiannetto, D F
1999-01-01
Rapidly progressing computer technology, ever-increasing expectations of patients, and a confusing medicolegal environment require a clarification of the role of computer imaging/applications. Advances in computer technology and its applications are reviewed. A brief historical discussion is included for perspective. Improvements in both hardware and software with the advent of digital imaging have allowed great increases in speed and accuracy in patient imaging. This facilitates doctor-patient communication and possibly realistic patient expectations. Patients seeking cosmetic surgery now often expect preoperative imaging. Although society in general has become more litigious, a literature search up to 1998 reveals no lawsuits directly involving computer imaging. It appears that conservative utilization of computer imaging by the facial plastic surgeon may actually reduce liability and promote communication. Recent advances have significantly enhanced the value of computer imaging in the practice of facial plastic surgery. These technological advances in computer imaging appear to contribute a useful technique for the practice of facial plastic surgery. Inclusion of computer imaging should be given serious consideration as an adjunct to clinical practice.
Study on the application of mobile internet cloud computing platform
NASA Astrophysics Data System (ADS)
Gong, Songchun; Fu, Songyin; Chen, Zheng
2012-04-01
The innovative development of computer technology promotes the application of the cloud computing platform, which is in effect a substitution and exchange among resource service models and, after adjustments in multiple aspects, meets users' needs for utilizing different resources. Cloud computing offers advantages in many respects: it not only reduces the difficulty of operating the system but also makes it easy for users to search, acquire and process resources. Accordingly, the author takes the management of digital libraries as the research focus of this paper and analyzes the key technologies of the mobile internet cloud computing platform in operation. The popularization of computer technology has driven the creation of digital library models, whose core idea is to strengthen the management of library resource information through computers and to construct a high-performance query and search platform that allows users to access the necessary information resources at any time. Cloud computing, moreover, distributes computations across a large number of distributed computers and thereby implements a connected service over multiple machines. Digital libraries, as a typical representative application of cloud computing, can therefore be used to analyze the key technologies of cloud computing.
ERIC Educational Resources Information Center
Davies, Daniel K.; Stock, Steven E.; Wehmeyer, Michael L.
2003-01-01
This report describes results of an initial investigation of the utility of a specially designed money management software program for improving management of personal checking accounts for individuals with mental retardation. Use with 19 adults with mental retardation indicated the software resulted in significant reduction in check writing and…
1977-01-26
Sisteme Matematicheskogo Obespecheniya YeS EVM [Applied Programs in the Software System for the Unified System of Computers], by A. Ye. Fateyev, A. I...computerized systems are most effective in large production complexes, in which the level of utilization of computers can be as high as 500,000...performance of these tasks could be furthered by the complex introduction of electronic computers in automated control systems. The creation of ASU
Implementation of Information Technology in the Free Trade Era for Indonesia
1998-06-01
computer usage, had been organized before Thailand, Malaysia, and China. Also, use of computers for crude oil process applications, and marketing and...seismic computing in Pertamina had been installed and in operation ahead of Taiwan, Malaysia, and Brunei. There are many examples of computer usage at...such as: Malaysia, Thailand, USA, China, Germany, and many others. Although IT development is utilized in Indonesia's development program, it should
Broadband Satellite Technologies and Markets Assessed
NASA Technical Reports Server (NTRS)
Wallett, Thomas M.
1999-01-01
The current usage of broadband (data rate greater than 64 kilobits per second (kbps)) for multimedia network computer applications is increasing, and the need for network communications technologies and systems to support this use is also growing. Satellite technology will likely be an important part of the National Information Infrastructure (NII) and the Global Information Infrastructure (GII) in the next decade. Several candidate communications technologies that may be used to carry a portion of the increased data traffic have been reviewed, and estimates of the future demand for satellite capacity have been made. A study was conducted by the NASA Lewis Research Center to assess the satellite addressable markets for broadband applications. This study effort included four specific milestones: (1) assess the changing nature of broadband applications and their usage, (2) assess broadband satellite and terrestrial technologies, (3) estimate the size of the global satellite addressable market from 2000 to 2010, and (4) identify how the impact of future technology developments could increase the utility of satellite-based transport to serve this market.
Construction of In-house Databases in a Corporation
NASA Astrophysics Data System (ADS)
Senoo, Tetsuo
As computer technology, communication technology and others have progressed, many corporations place the construction and utilization of their own databases at the center of their information activities and aim to develop those activities in new directions. This paper considers how information management in a corporation is affected under changing management and technology environments, and clarifies and generalizes what in-house databases should be constructed and utilized, from the viewpoints of the requirements to be met, the types and forms of information to be handled, indexing, use type and frequency, evaluation method and so on. The author outlines an information system of Matsushita called MATIS (Matsushita Technical Information System) as an actual example, and describes the present status and some points to keep in mind in constructing and utilizing databases of REP, BOOK and SYMP.
NASA Technical Reports Server (NTRS)
Gross, Anthony R.; Sims, Michael H.; Briggs, Geoffrey A.
1996-01-01
From the beginning to the present, expeditions to the Moon have involved a large investment of human labor. This has been true for all aspects of the process, from the initial design of the mission, whether scientific or technological, through the development of the instruments and the spacecraft, to the flight and operational phases. In addition to the time constraints that this situation imposes, there is also a significant cost associated with the large labor requirements. As a result, lunar expeditions have been limited to a few robotic missions and the manned Apollo program missions of the 1970s. With the rapid rise of new information technologies, new paradigms are emerging that promise to greatly reduce both the time and cost of such missions. With the rapidly increasing capabilities of computer hardware and software systems, as well as networks and communication systems, a new balance of work is being developed between the human and the machine system. This new balance holds the promise of greatly increased exploration capability, along with dramatically reduced design, development, and operating costs. These new information technologies, utilizing knowledge-based software and very high-speed computer systems, will provide new design and development tools, scheduling mechanisms, and vehicle and system health monitoring capabilities that have hitherto been unavailable to the mission and spacecraft designer and the system operator. This paper will utilize typical lunar missions, both robotic and crewed, as a basis to describe and illustrate how these new information system technologies could be applied to all aspects of such missions. In particular, new system design tradeoff tools will be described, along with technologies that will allow a much greater degree of autonomy of exploration vehicles than has heretofore been possible. In addition, new information technologies that will significantly reduce the human operational requirements will be discussed.
Reviews on Security Issues and Challenges in Cloud Computing
NASA Astrophysics Data System (ADS)
An, Y. Z.; Zaaba, Z. F.; Samsudin, N. F.
2016-11-01
Cloud computing is an Internet-based computing service provided by a third party that allows the sharing of resources and data among devices. It is widely used in many organizations nowadays and is becoming more popular because it changes the way the Information Technology (IT) of an organization is organized and managed. It provides many benefits such as simplicity and lower costs, almost unlimited storage, least maintenance, easy utilization, backup and recovery, continuous availability, quality of service, automated software integration, scalability, flexibility and reliability, easy access to information, elasticity, quick deployment and a lower barrier to entry. While the use of cloud computing services is increasing in this new era, the security issues of cloud computing become a challenge. Cloud computing must be safe and secure enough to ensure the privacy of its users. This paper first lists the architecture of cloud computing, then discusses the most common security issues of using the cloud and some solutions to those security issues, since security is one of the most critical aspects of cloud computing due to the sensitivity of users' data.
Internet-based computer technology on radiotherapy.
Chow, James C L
2017-01-01
The recent rapid development of Internet-based computer technologies has made possible many novel applications in radiation dose delivery. However, the translational speed of applying these new technologies in radiotherapy has hardly kept pace, due to the complex commissioning process and quality assurance protocol. Implementing a novel Internet-based technology in radiotherapy requires a corresponding design of the application's algorithm and infrastructure, the setup of related clinical policies, the purchase and development of software and hardware, computer programming and debugging, and national-to-international collaboration. Although such implementation processes are time consuming, some recent computer advancements in radiation dose delivery are still noticeable. In this review, we present the background and concepts of some recent Internet-based computer technologies such as cloud computing, big data processing and machine learning, followed by their potential applications in radiotherapy, such as treatment planning and dose delivery. We also discuss the current progress of these applications and their impacts on radiotherapy, and explore and evaluate the expected benefits and challenges in implementation.
Computer-Based Career Interventions.
ERIC Educational Resources Information Center
Mau, Wei-Cheng
The possible utilities and limitations of computer-assisted career guidance systems (CACG) have been widely discussed although the effectiveness of CACG has not been systematically considered. This paper investigates the effectiveness of a theory-based CACG program, integrating Sequential Elimination and Expected Utility strategies. Three types of…
Computer programs: Operational and mathematical, a compilation
NASA Technical Reports Server (NTRS)
1973-01-01
Several computer programs which are available through the NASA Technology Utilization Program are outlined. Presented are: (1) Computer operational programs which can be applied to resolve procedural problems swiftly and accurately. (2) Mathematical applications for the resolution of problems encountered in numerous industries. Although the functions which these programs perform are not new and similar programs are available in many large computer center libraries, this collection may be of use to centers with limited systems libraries and for instructional purposes for new computer operators.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Newman, G.A.; Commer, M.
Three-dimensional (3D) geophysical imaging is now receiving considerable attention for electrical conductivity mapping of potential offshore oil and gas reservoirs. The imaging technology employs controlled source electromagnetic (CSEM) and magnetotelluric (MT) fields and treats geological media exhibiting transverse anisotropy. Moreover when combined with established seismic methods, direct imaging of reservoir fluids is possible. Because of the size of the 3D conductivity imaging problem, strategies are required exploiting computational parallelism and optimal meshing. The algorithm thus developed has been shown to scale to tens of thousands of processors. In one imaging experiment, 32,768 tasks/processors on the IBM Watson Research Blue Gene/L supercomputer were successfully utilized. Over a 24 hour period we were able to image a large scale field data set that previously required over four months of processing time on distributed clusters based on Intel or AMD processors utilizing 1024 tasks on an InfiniBand fabric. Electrical conductivity imaging using massively parallel computational resources produces results that cannot be obtained otherwise and are consistent with timeframes required for practical exploration problems.
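Using the figures quoted above, a quick back-of-the-envelope wall-clock comparison is sketched below; the four-month figure is taken loosely as about 120 days, and the two runs used different hardware, so this is indicative only and not a strict scaling measurement.

```python
def wall_clock_speedup(old_hours, new_hours):
    """Ratio of the old wall-clock time to the new one."""
    return old_hours / new_hours

if __name__ == "__main__":
    old_hours = 120 * 24      # ~4 months of processing on the 1,024-task cluster (assumed)
    new_hours = 24            # one day on 32,768 Blue Gene/L tasks
    print(f"wall-clock speedup ~ {wall_clock_speedup(old_hours, new_hours):.0f}x "
          f"using {32768 // 1024}x more tasks (different hardware, not a strict scaling measure)")
```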
Design & Delivery of Training for a State-Wide Data Communication Network.
ERIC Educational Resources Information Center
Zacher, Candace M.
This report describes the process of development of training for agricultural research, teaching, and extension professionals in how to use the Fast Agricultural Communications Terminal (FACTS) computer network at Purdue University (Indiana), which is currently being upgraded in order to utilize the latest computer technology. The FACTS system is…
Effects of Educational Beliefs on Attitudes towards Using Computer Technologies
ERIC Educational Resources Information Center
Onen, Aysem Seda
2012-01-01
This study, aiming to determine the relationship between pre-service teachers' beliefs about education and their attitudes towards utilizing computers and internet, is a descriptive study in scanning model. The sampling of the study consisted of 270 pre-service teachers. The potential relationship between the beliefs of pre-service teachers about…
A Method of Synthesizing Large Bodies of Knowledge in the Social Sciences.
ERIC Educational Resources Information Center
Thiemann, Francis C.
Employing concepts of formal symbolic logic, the philosophy of science, computer technology, and the work of Hans Zetterberg, a format is suggested for synthesizing and increasing use of the rapidly expanding knowledge of the social sciences. Steps in the process include formulating basic propositions, utilizing computers to establish sets, and…
Increasing Mathematical Computation Skills for Students with Physical and Health Disabilities
ERIC Educational Resources Information Center
Webb, Paula
2017-01-01
Students with physical and health disabilities struggle with basic mathematical concepts. The purpose of this research study was to increase the students' mathematical computation skills through implementing new strategies and/or methods. The strategies implemented with the students was utilizing the ten-frame tiles and technology with the purpose…
Keeping PCs up to Date Can Be Fun
ERIC Educational Resources Information Center
Goldsborough, Reid
2004-01-01
The "joy" of computer maintenance takes many forms. These days, automation is the byword. Operating systems such as Microsoft Windows and utility suites such as Symantec's Norton Internet Security let you automatically keep crucial parts of your computer system up to date. It's fun to watch the technology keep tabs on itself. This document offers…
Technology survey of computer software as applicable to the MIUS project
NASA Technical Reports Server (NTRS)
Fulbright, B. E.
1975-01-01
Existing computer software, available from either governmental or private sources, applicable to modular integrated utility system program simulation is surveyed. Several programs and subprograms are described to provide a consolidated reference, and a bibliography is included. The report covers the two broad areas of design simulation and system simulation.
An Innovative Improvement of Engineering Learning System Using Computational Fluid Dynamics Concept
ERIC Educational Resources Information Center
Hung, T. C.; Wang, S. K.; Tai, S. W.; Hung, C. T.
2007-01-01
An innovative concept of an electronic learning system has been established in an attempt to achieve a technology that provides engineering students with an instructive and affordable framework for learning engineering-related courses. This system utilizes an existing Computational Fluid Dynamics (CFD) package, Active Server Pages programming,…
Integrated instrumentation & computation environment for GRACE
NASA Astrophysics Data System (ADS)
Dhekne, P. S.
2002-03-01
The project GRACE (Gamma Ray Astrophysics with Coordinated Experiments) aims at setting up a state-of-the-art gamma ray observatory at Mt. Abu, Rajasthan for undertaking comprehensive scientific exploration over a wide spectral window (tens of keV to hundreds of TeV) from a single location through 4 coordinated experiments. The cumulative data collection rate of all the telescopes is expected to be about 1 GB/hr, necessitating innovations in the data management environment, as the real-time data acquisition and control as well as the off-line data processing, analysis and visualization environments of these systems are based on the use of cutting-edge and affordable technologies in the fields of computers, communications and the Internet. We propose to provide a single, unified environment by seamless integration of instrumentation and computation, taking advantage of recent advancements in Web-based technologies. This new environment will allow researchers better access to facilities, improve resource utilization and enhance collaborations by providing identical environments for online as well as offline usage of this facility from any location. We present here a proposed implementation strategy for a platform-independent, Web-based system that supplements automated functions with video-guided interactive and collaborative remote viewing, remote control through a virtual instrumentation console, remote acquisition of telescope data, data analysis, data visualization and an active imaging system. This end-to-end Web-based solution will enhance collaboration among researchers at the national and international level for undertaking scientific studies using the telescope systems of the GRACE project.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hale, M.A.; Craig, J.I.
Integrated Product and Process Development (IPPD) embodies the simultaneous application of both system and quality engineering methods throughout an iterative design process. The use of IPPD results in the time-conscious, cost-saving development of engineering systems. To implement IPPD, a Decision-Based Design perspective is encapsulated in an approach that focuses on the role of the human designer in product development. The approach has two parts and is outlined in this paper. First, an architecture, called DREAMS, is being developed that facilitates design from a decision-based perspective. Second, a supporting computing infrastructure, called IMAGE, is being designed. Agents are used to implement the overall infrastructure on the computer. Successful agent utilization requires that they be made of three components: the resource, the model, and the wrap. Current work is focused on the development of generalized agent schemes and associated demonstration projects. When in place, the technology-independent computing infrastructure will aid the designer in systematically generating knowledge used to facilitate decision-making.
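The abstract names three agent components (the resource, the model, and the wrap) without detailing them; the sketch below is one hypothetical way to express that decomposition, in which the wrap adapts a resource's native interface to a common agent interface. The class, field, and method names are invented for illustration and are not from the IMAGE infrastructure.

```python
from dataclasses import dataclass
from typing import Any, Callable

@dataclass
class Agent:
    """An agent composed of a resource, a model describing that resource, and a wrap
    that adapts the resource's native interface to the infrastructure's interface."""
    resource: Any                      # the underlying tool, code, or data source
    model: dict                        # declarative description of what the resource can do
    wrap: Callable[[Any, dict], Any]   # adapter invoked by the infrastructure

    def execute(self, request: dict) -> Any:
        return self.wrap(self.resource, request)

if __name__ == "__main__":
    # Hypothetical "resource": a plain function that sizes a beam cross-section
    def beam_sizer(load_kn: float) -> float:
        return 0.01 * load_kn          # toy relation, illustration only

    agent = Agent(
        resource=beam_sizer,
        model={"capability": "beam sizing", "input": "load_kn", "output": "area_m2"},
        wrap=lambda res, req: res(req["load_kn"]),
    )
    print(agent.execute({"load_kn": 250.0}))
```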
E-Commerce and Business Models
NASA Astrophysics Data System (ADS)
Ogasawara, Yasushi
The development of IT will lead to the integration of computers and networks; IT will become a more function-oriented service and an indispensable part of the social infrastructure. This means that the critical point will shift from prioritizing "ownership of IS (Information Systems) before anything else" to "how IT will be utilized." At this technology level, the Western, non-discretion-oriented management concept in which IT is used as an enabler, together with IT-based business tools, can be flexible enough to accommodate the highly discretion-oriented practices of Japanese organizations. In other words, IT can finally be utilized in a Japanese way. Taking account of this technological development trend, there is a need to take a macro look at the meaning of the concept of business models, something that has come to be viewed at the "micro" level as a patent-related issue. Under such trends, the greater the freedom in business design that the multipurpose use of IT functions provides, the more critical the capability to design an elaborate business model becomes.
Critical Computer Literacy: Computers in First-Year Composition as Topic and Environment.
ERIC Educational Resources Information Center
Duffelmeyer, Barbara Blakely
2000-01-01
Addresses how first-year students understand the influence of computers by cultural assumptions about technology. Presents three meaning perspectives on technology that students expressed based on formative experiences they have had with it. Discusses implications for how computers and composition scholars incorporate computer technology into…
ERIC Educational Resources Information Center
Teicholz, Eric
1997-01-01
Reports research on trends in computer-aided facilities management using the Internet and geographic information system (GIS) technology for space utilization research. Proposes that facility assessment software holds promise for supporting facility management decision making, and outlines four areas for its use: inventory; evaluation; reporting;…
A Causal Model of Teacher Acceptance of Technology
ERIC Educational Resources Information Center
Chang, Jui-Ling; Lieu, Pang-Tien; Liang, Jung-Hui; Liu, Hsiang-Te; Wong, Seng-lee
2012-01-01
This study proposes a causal model for investigating teacher acceptance of technology. We received 258 effective replies from teachers at public and private universities in Taiwan. A questionnaire survey was utilized to test the proposed model. The Lisrel was applied to test the proposed hypotheses. The result shows that computer self-efficacy has…
ERIC Educational Resources Information Center
Rio Salado Community Coll., AZ.
Rio Salado Community College offers a variety of alternative delivery courses utilizing different forms of instructional technology (e.g., broadcast and cable television, radio, audio and video cassettes, and computer-managed instruction) for both credit and non-credit instruction. This manual provides information for student operators of a…
ERIC Educational Resources Information Center
Bucknall, Ruary
1996-01-01
Overview of the interactive technologies used by the Northern Territory Secondary Correspondence School in Australia: print media utilizing desktop publishing and electronic transfer; telephone or H-F radio; interactive television; and interactive computing. More fully describes its interactive CD-ROM courses. Emphasizes that the programs are…
Information Technology: Making It All Fit. Track I: Policy and Planning.
ERIC Educational Resources Information Center
CAUSE, Boulder, CO.
Seven papers from the 1988 CAUSE conference's Track I, Policy and Planning, are presented. They include: "Developing a Strategic Plan for Academic Computing" (Arthur S. Gloster II); "New Technologies Are Presenting a Crisis for Middle Management" (M. Lewis Temares and Ruben Lopez); "An Information Utility: The Light, Gas,…
Shinbane, Jerold S; Saxon, Leslie A
Advances in imaging technology have led to a paradigm shift from planning of cardiovascular procedures and surgeries requiring the actual patient in a "brick and mortar" hospital to utilization of the digitalized patient in the virtual hospital. A digitalized 3-D representation of individual patient anatomy and physiology from cardiovascular computed tomographic angiography (CCTA) and cardiovascular magnetic resonance (CMR) serves as an avatar allowing for virtual delineation of the most optimal approaches to cardiovascular procedures and surgeries prior to actual hospitalization. Pre-hospitalization reconstruction and analysis of anatomy and pathophysiology previously only accessible during the actual procedure could potentially limit the intrinsic risks related to time in the operating room, cardiac procedural laboratory and overall hospital environment. Although applications are specific to areas of cardiovascular specialty focus, there are unifying themes related to the utilization of the technologies. The virtual patient avatar can also be used for procedural planning, computational modeling of anatomy, simulation of the predicted therapeutic result, printing of 3-D models, and augmentation of real-time procedural performance. Examples of the above techniques are at various stages of development for application to the spectrum of cardiovascular disease processes, including percutaneous, surgical and hybrid minimally invasive interventions. A multidisciplinary approach within medicine and engineering is necessary for the creation of robust algorithms for maximal utilization of the virtual patient avatar in the digital medical center. Utilization of the virtual advanced cardiac imaging patient avatar will play an important role in the virtual health care system. Although there has been a rapid proliferation of early data, advanced imaging applications require further assessment and validation of accuracy, reproducibility, standardization, safety, efficacy, quality, cost effectiveness, and overall value to medical care. Copyright © 2018 Society of Cardiovascular Computed Tomography. Published by Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Montag, Bruce C.; Bishop, Alfred M.; Redfield, Joe B.
1989-01-01
The findings of a preliminary investigation by Southwest Research Institute (SwRI) in simulation host computer concepts is presented. It is designed to aid NASA in evaluating simulation technologies for use in spaceflight training. The focus of the investigation is on the next generation of space simulation systems that will be utilized in training personnel for Space Station Freedom operations. SwRI concludes that NASA should pursue a distributed simulation host computer system architecture for the Space Station Training Facility (SSTF) rather than a centralized mainframe based arrangement. A distributed system offers many advantages and is seen by SwRI as the only architecture that will allow NASA to achieve established functional goals and operational objectives over the life of the Space Station Freedom program. Several distributed, parallel computing systems are available today that offer real-time capabilities for time critical, man-in-the-loop simulation. These systems are flexible in terms of connectivity and configurability, and are easily scaled to meet increasing demands for more computing power.
On Representative Spaceflight Instrument and Associated Instrument Sensor Web Framework
NASA Technical Reports Server (NTRS)
Kizhner, Semion; Patel, Umeshkumar; Vootukuru, Meg
2007-01-01
Sensor Web-based adaptation and sharing of space flight mission resources, including those of the Space-Ground and Control-User communication segment, could greatly benefit from the utilization of heritage Internet protocols and devices applied to spaceflight (SpaceIP). This has been successfully demonstrated by a few recent spaceflight experiments. However, while terrestrial applications of Internet protocols are well developed and understood (mostly due to billions of dollars in investments by the military and industry), the spaceflight application of Internet protocols is still in its infancy. Progress in the development of SpaceIP-enabled instrument components will largely determine the SpaceIP utilization of those investments and its acceptance in years to come. Like SpaceIP, commercial real-time, instrument-colocated computational resources, data compression and storage can be enabled on board a spacecraft and, in turn, support a powerful application of Sensor Web-based design to a spaceflight instrument. Sensor Web-enabled reconfiguration and adaptation of structures for hardware resources and information systems will bring the application of Field Programmable Gate Arrays (FPGA) and other aerospace programmable logic devices to what this technology was intended for. These are a few obvious potential benefits of Sensor Web technologies for spaceflight applications. However, they are still waiting to be explored, because a new approach to spaceflight instrumentation is needed to make these mature sensor web technologies applicable to spaceflight. In this paper we present an approach to developing related and enabling spaceflight instrument-level technologies based on the new concept of a representative spaceflight Instrument Sensor Web (ISW).
Distributed GPU Computing in GIScience
NASA Astrophysics Data System (ADS)
Jiang, Y.; Yang, C.; Huang, Q.; Li, J.; Sun, M.
2013-12-01
Geoscientists strive to discover potential principles and patterns hidden inside ever-growing Big Data for scientific discoveries. To better achieve this objective, more capable computing resources are required to process, analyze and visualize Big Data (Ferreira et al., 2003; Li et al., 2013). Current CPU-based computing techniques cannot promptly meet the computing challenges caused by the increasing amount of datasets from different domains, such as social media, earth observation and environmental sensing (Li et al., 2013). Meanwhile, CPU-based computing resources structured as clusters or supercomputers are costly. In the past several years, as GPU-based technology has matured in both capability and performance, GPU-based computing has emerged as a new computing paradigm. Compared to the traditional microprocessor, the modern GPU, as a compelling alternative microprocessor, has outstanding parallel processing capability with cost-effectiveness and efficiency (Owens et al., 2008), although it was initially designed for graphical rendering in the visualization pipeline. This presentation reports a distributed GPU computing framework for integrating GPU-based computing within a distributed environment. Within this framework, 1) on each single computer, both GPU-based and CPU-based computing resources can be fully utilized to improve the performance of visualizing and processing Big Data; 2) within a network environment, a variety of computers can be used to build a virtual supercomputer to support CPU-based and GPU-based computing in a distributed computing environment; 3) GPUs, as graphics-targeted devices, are used to greatly improve the rendering efficiency in distributed geo-visualization, especially for 3D/4D visualization. Key words: Geovisualization, GIScience, Spatiotemporal Studies. References: 1. Ferreira de Oliveira, M. C., & Levkowitz, H. (2003). From visual data exploration to visual data mining: A survey. IEEE Transactions on Visualization and Computer Graphics, 9(3), 378-394. 2. Li, J., Jiang, Y., Yang, C., Huang, Q., & Rice, M. (2013). Visualizing 3D/4D Environmental Data Using Many-core Graphics Processing Units (GPUs) and Multi-core Central Processing Units (CPUs). Computers & Geosciences, 59(9), 78-89. 3. Owens, J. D., Houston, M., Luebke, D., Green, S., Stone, J. E., & Phillips, J. C. (2008). GPU computing. Proceedings of the IEEE, 96(5), 879-899.
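The abstract describes combining GPU-based and CPU-based resources, on each node and across a network. As a rough illustration of the single-node part of such a framework, the sketch below processes data chunks on a GPU when one is available (via CuPy) and falls back to NumPy on the CPU otherwise; the chunking, the toy workload, and the library choice are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch: per-node GPU/CPU hybrid processing of data chunks.
# Assumes CuPy on GPU-equipped nodes; falls back to NumPy on CPU-only nodes.
import numpy as np

try:
    import cupy as cp
    xp = cp
    ON_GPU = True
except ImportError:
    xp = np
    ON_GPU = False

def to_numpy(arr):
    """Move a result back to host memory regardless of where it was computed."""
    return cp.asnumpy(arr) if ON_GPU else np.asarray(arr)

def process_chunk(chunk):
    """Toy per-chunk workload: normalize the data, then round-trip a 2-D FFT."""
    a = xp.asarray(chunk)
    a = (a - a.mean()) / (a.std() + 1e-9)
    spectrum = xp.fft.fft2(a)
    return to_numpy(xp.abs(xp.fft.ifft2(spectrum)))

if __name__ == "__main__":
    # Pretend "Big Data" split into chunks; in the distributed case each chunk
    # would be farmed out to a worker node with its own GPU/CPU resources.
    data = np.random.rand(4, 512, 512)
    outputs = [process_chunk(c) for c in data]
    print(len(outputs), outputs[0].shape)
```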
Unified, Cross-Platform, Open-Source Library Package for High-Performance Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kozacik, Stephen
Compute power is continually increasing, but this increased performance is largely found in sophisticated computing devices and supercomputer resources that are difficult to use, resulting in under-utilization. We developed a unified set of programming tools that will allow users to take full advantage of the new technology by allowing them to work at a level abstracted away from the platform specifics, encouraging the use of modern computing systems, including government-funded supercomputer facilities.
Performing quantum computing experiments in the cloud
NASA Astrophysics Data System (ADS)
Devitt, Simon J.
2016-09-01
Quantum computing technology has reached a second renaissance in the past five years. Increased interest from both the private and public sector combined with extraordinary theoretical and experimental progress has solidified this technology as a major advancement in the 21st century. As anticipated by many, some of the first realizations of quantum computing technology have occurred over the cloud, with users logging onto dedicated hardware over the classical internet. Recently, IBM has released the Quantum Experience, which allows users to access a five-qubit quantum processor. In this paper we take advantage of this online availability of actual quantum hardware and present four quantum information experiments. We utilize the IBM chip to realize protocols in quantum error correction, quantum arithmetic, quantum graph theory, and fault-tolerant quantum computation by accessing the device remotely through the cloud. While the results are subject to significant noise, the correct results are returned from the chip. This demonstrates the power of experimental groups opening up their technology to a wider audience and will hopefully allow for the next stage of development in quantum information technology.
NASA Astrophysics Data System (ADS)
Sumarudin, A.; Ghozali, A. L.; Hasyim, A.; Effendi, A.
2016-04-01
Indonesian agriculture has great potential for development. However, much of it is not yet based on systematic data collection for soil or plants, even though soil data can be used to analyze soil fertility. We propose an e-agriculture system for soil monitoring. The system monitors soil status using wireless sensor motes that sense soil moisture, humidity and temperature. Each mote is built around a microcontroller with an XBee radio connection. Sensed data are sent to a single gateway in a star topology. The gateway is implemented on a mini personal computer connected to an XBee module in coordinator mode, and runs an Apache server that stores the data in a MySQL database; the web application is built with the Yii framework. The system has been implemented and can display soil status in real time. In tests, motes maintained connections at a range of 40 meters, with a mote lifetime of 7 hours and a minimum operating voltage of 7 volts. The system can help farmers monitor soil and make treatment decisions based on data, which can improve the quality of agricultural production and reduce management and farming costs.
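The gateway described above receives mote readings through an XBee coordinator and stores them for the web front end. The sketch below is a hedged, simplified stand-in for that ingestion step: it parses a hypothetical "mote_id,moisture,humidity,temperature" line format and stores readings in SQLite (standing in for the MySQL store mentioned in the abstract); reading from the actual XBee serial port (e.g., with pyserial) is assumed and not shown.

```python
# Minimal sketch of a soil-monitoring gateway's data-ingestion step.
# The line format, table schema, and SQLite stand-in are assumptions for illustration.
import sqlite3
from datetime import datetime, timezone

def parse_reading(line: str):
    """Parse a hypothetical 'mote_id,moisture,humidity,temperature' record."""
    mote_id, moisture, humidity, temperature = line.strip().split(",")
    return {
        "mote_id": mote_id,
        "moisture": float(moisture),
        "humidity": float(humidity),
        "temperature": float(temperature),
        "received_at": datetime.now(timezone.utc).isoformat(),
    }

def store(conn, reading):
    conn.execute(
        "INSERT INTO soil_readings (mote_id, moisture, humidity, temperature, received_at) "
        "VALUES (:mote_id, :moisture, :humidity, :temperature, :received_at)", reading)
    conn.commit()

if __name__ == "__main__":
    conn = sqlite3.connect("soil.db")
    conn.execute("CREATE TABLE IF NOT EXISTS soil_readings "
                 "(mote_id TEXT, moisture REAL, humidity REAL, temperature REAL, received_at TEXT)")
    # In the real gateway these lines would arrive from the XBee coordinator's serial port.
    for line in ["mote01,34.2,61.5,28.9", "mote02,30.8,63.1,29.4"]:
        store(conn, parse_reading(line))
    print(conn.execute("SELECT COUNT(*) FROM soil_readings").fetchone()[0], "readings stored")
```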
Wearable computer for mobile augmented-reality-based controlling of an intelligent robot
NASA Astrophysics Data System (ADS)
Turunen, Tuukka; Roening, Juha; Ahola, Sami; Pyssysalo, Tino
2000-10-01
An intelligent robot can be utilized to perform tasks that are either hazardous or unpleasant for humans. Such tasks include working in disaster areas or conditions that are, for example, too hot. An intelligent robot can work on its own to some extent, but in some cases the aid of humans will be needed. This requires means for controlling the robot from somewhere else, i.e. teleoperation. Mobile augmented reality can be utilized as a user interface to the environment, as it enhances the user's perception of the situation compared to other interfacing methods and allows the user to perform other tasks while controlling the intelligent robot. Augmented reality is a method that combines virtual objects into the user's perception of the real world. As computer technology evolves, it is possible to build very small devices that have sufficient capabilities for augmented reality applications. We have evaluated the existing wearable computers and mobile augmented reality systems to build a prototype of a future mobile terminal - the CyPhone. A wearable computer with sufficient system resources for applications, wireless communication media with sufficient throughput and enough interfaces for peripherals has been built at the University of Oulu. It is self-sustained in energy, with enough operating time for the applications to be useful, and uses accurate positioning systems.
Study on GIS-based sport-games information system
NASA Astrophysics Data System (ADS)
Peng, Hongzhi; Yang, Lingbin; Deng, Meirong; Han, Yongshun
2008-10-01
With the development of the internet and such information technologies as the Information Superhighway, computer technology, Remote Sensing (RS), the Global Positioning System (GPS), digital communication and the National Information Network (NIN), Geographic Information Systems (GIS) have become more and more popular in fields of science and industry. It is not only feasible but also necessary to apply GIS to large-scale sport games. This paper first discusses GIS technology and its application, then elaborates on the framework and content of a Sport-Games Geographic Information System (SG-GIS) with the functions of gathering, storing, processing, sharing, exchanging and utilizing all kinds of spatial-temporal information about sport games, and lastly describes the design and development of a public-service GIS for the 6th Asian Winter Games in Changchun, China (CAWGIS). The application of CAWGIS showed that the established SG-GIS was feasible and that a GIS-based sport-games information system is able to effectively process a large amount of sport-games information and provide real-time sport-games services for governors, athletes and the public.
Feasibility and demonstration of a cloud-based RIID analysis system
NASA Astrophysics Data System (ADS)
Wright, Michael C.; Hertz, Kristin L.; Johnson, William C.; Sword, Eric D.; Younkin, James R.; Sadler, Lorraine E.
2015-06-01
A significant limitation in the operational utility of handheld and backpack radioisotope identifiers (RIIDs) is the inability of their onboard algorithms to accurately and reliably identify the isotopic sources of the measured gamma-ray energy spectrum. A possible solution is to move the spectral analysis computations to an external device, the cloud, where significantly greater capabilities are available. The implementation and demonstration of a prototype cloud-based RIID analysis system have shown this type of system to be feasible with currently available communication and computational technology. A system study has shown that the potential user community could derive significant benefits from an appropriately implemented cloud-based analysis system and has identified the design and operational characteristics required by the users and stakeholders for such a system. A general description of the hardware and software necessary to implement reliable cloud-based analysis, the value of the cloud expressed by the user community, and the aspects of the cloud implemented in the demonstrations are discussed.
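The feasibility argument rests on moving spectral analysis off the handheld device to a remote service. A minimal sketch of the client side of such a system is shown below; the endpoint URL, payload fields, and response format are hypothetical placeholders, not the prototype's actual interface.

```python
# Hypothetical client: upload a measured gamma-ray spectrum for cloud-side isotope ID.
# The URL, JSON schema, and response fields are illustrative assumptions only.
import requests

ANALYSIS_URL = "https://riid-analysis.example.org/api/v1/identify"  # placeholder endpoint

def identify_isotopes(channel_counts, live_time_s, detector_model):
    payload = {
        "spectrum": channel_counts,      # list of per-channel counts from the RIID
        "live_time_s": live_time_s,
        "detector": detector_model,
    }
    resp = requests.post(ANALYSIS_URL, json=payload, timeout=30)
    resp.raise_for_status()
    # e.g. [{"isotope": "Cs-137", "confidence": 0.97}, ...]
    return resp.json().get("identifications", [])

# Example call (requires a real service behind ANALYSIS_URL):
# print(identify_isotopes([0, 3, 12, 8, 5], live_time_s=300, detector_model="NaI-2x2"))
```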
Lee, Jun-Hak; Lim, Jeong-Hwan; Hwang, Han-Jeong; Im, Chang-Hwan
2013-01-01
The main goal of this study was to develop a hybrid mental spelling system combining a steady-state visual evoked potential (SSVEP)-based brain-computer interface (BCI) technology and a webcam-based eye-tracker, which utilizes information from the brain electrical activity and eye gaze direction at the same time. In the hybrid mental spelling system, a character decoded using SSVEP was not typed if the position of the selected character was not matched with the eye direction information ('left' or 'right') obtained from the eye-tracker. Thus, the users did not need to correct a misspelled character using a 'BACKSPACE' key. To verify the feasibility of the developed hybrid mental spelling system, we conducted online experiments with ten healthy participants. Each participant was asked to type 15 English words consisting of 68 characters. As a result, 16.6 typing errors could be prevented on average, demonstrating that the implemented hybrid mental spelling system could enhance the practicality of our mental spelling system.
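The core rule of the hybrid speller, as described, is that an SSVEP-decoded character is accepted only when its keyboard position agrees with the coarse eye-gaze direction. That gating logic can be sketched as follows; the keyboard split and the input formats are illustrative assumptions, not the authors' exact layout.

```python
# Sketch of the hybrid speller's gating rule: an SSVEP-decoded character is typed
# only if its side of the (assumed) split keyboard matches the webcam gaze estimate.
LEFT_KEYS = set("ABCDEFGHIJKLM")    # assumed left half of the speller layout
RIGHT_KEYS = set("NOPQRSTUVWXYZ_")  # assumed right half

def accept_character(ssvep_char: str, gaze_direction: str):
    """Return the character to type, or None to suppress a likely SSVEP misclassification."""
    side = "left" if ssvep_char in LEFT_KEYS else "right"
    return ssvep_char if side == gaze_direction else None

# Example: the SSVEP decoder reports 'Q' but the eye-tracker reports a leftward gaze,
# so the selection is rejected instead of requiring a BACKSPACE correction later.
print(accept_character("Q", "left"))   # -> None
print(accept_character("E", "left"))   # -> 'E'
```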
Pre-Hardware Optimization and Implementation Of Fast Optics Closed Control Loop Algorithms
NASA Technical Reports Server (NTRS)
Kizhner, Semion; Lyon, Richard G.; Herman, Jay R.; Abuhassan, Nader
2004-01-01
One of the main heritage tools used in scientific and engineering data spectrum analysis is the Fourier Integral Transform and its high performance digital equivalent - the Fast Fourier Transform (FFT). The FFT is particularly useful in two-dimensional (2-D) image processing (FFT2) within optical systems control. However, timing constraints of a fast optics closed control loop would require a supercomputer to run the software implementation of the FFT2 and its inverse, as well as other representative image-processing algorithms, such as numerical image folding and fringe feature extraction. A laboratory supercomputer is not always available even for ground operations and is not feasible for a flight project. However, the computationally intensive algorithms still warrant alternative implementation using reconfigurable computing (RC) technologies such as Digital Signal Processors (DSPs) and Field Programmable Gate Arrays (FPGAs), which provide low-cost, compact super-computing capabilities. We present a new RC hardware implementation and utilization architecture that significantly reduces the computational complexity of a few basic image-processing algorithms, such as FFT2, image folding and phase diversity, for the NASA Solar Viewing Interferometer Prototype (SVIP) using a cluster of DSPs and FPGAs. The DSP cluster utilization architecture also assures avoidance of a single point of failure, while using commercially available hardware. This, combined with pre-hardware optimization of the control algorithms, for the first time allows construction of image-based 800 Hertz (Hz) optics closed control loops on-board a spacecraft, based on the SVIP ground instrument. That spacecraft is the proposed Earth Atmosphere Solar Occultation Imager (EASI) to study greenhouse gases CO2, C2H, H2O, O3, O2, N2O from the Lagrange-2 point in space. This paper provides an advanced insight into a new type of science capabilities for future space exploration missions based on on-board image processing for control and for robotics missions using vision sensors. It presents a top-level description of technologies required for the design and construction of SVIP and EASI and to advance spatial-spectral imaging and large-scale space interferometry science and engineering.
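Since the abstract centers on the 2-D FFT as the workhorse of the image-based control loop, a small reference computation helps fix ideas: the sketch below forms a synthetic fringe image and uses NumPy's FFT2 to locate the dominant fringe frequency, the kind of feature-extraction step the DSP/FPGA cluster accelerates. The synthetic image and the peak-picking step are illustrative only, not the SVIP algorithms.

```python
# Reference (software) version of an FFT2-based fringe-frequency extraction step.
# A real fast-optics loop would run an equivalent kernel on DSPs/FPGAs at ~800 Hz.
import numpy as np

def dominant_fringe_frequency(image):
    """Return the (fy, fx) cycles-per-image of the strongest non-DC spatial frequency."""
    spectrum = np.fft.fftshift(np.fft.fft2(image))
    mag = np.abs(spectrum)
    cy, cx = np.array(mag.shape) // 2
    mag[cy, cx] = 0.0                       # suppress the DC term before peak picking
    fy, fx = np.unravel_index(np.argmax(mag), mag.shape)
    return fy - cy, fx - cx

if __name__ == "__main__":
    n = 256
    y, x = np.mgrid[0:n, 0:n]
    fringes = np.cos(2 * np.pi * (12 * x + 5 * y) / n)   # synthetic fringe pattern
    print(dominant_fringe_frequency(fringes))            # roughly (5, 12) or its mirror
```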
Emerging, Photonic Based Technologies for NASA Space Communications Applications
NASA Technical Reports Server (NTRS)
Pouch, John; Nguyen, Hung; Lee, Richard; Levi, Anthony; Bos, Philip; Titus, Charles; Lavrentovich, Oleg
2002-01-01
An objective of NASA's Computing, Information, and Communications Technology program is to support the development of technologies that could potentially lower the cost of the Earth science and space exploration missions, and result in greater scientific returns. NASA-supported photonic activities which will impact space communications will be described. The objective of the RF microphotonic research is to develop a Ka-band receiver that will enable the microwaves detected by an antenna to modulate a 1.55-micron optical carrier. A key element is the high-Q, microphotonic modulator that employs a lithium niobate microdisk. The technical approach could lead to new receivers that utilize ultra-fast, photonic signal processing techniques, and are low cost, compact, low weight and power efficient. The progress in the liquid crystal (LC) beam steering research will also be reported. The predicted benefits of an LC-based device on board a spacecraft include non-mechanical, submicroradian laser-beam pointing, milliradian scanning ranges, and wave-front correction. The potential applications of these emerging technologies to the various NASA missions will be presented.
NASA Technical Reports Server (NTRS)
Lyle, Karen H.
2014-01-01
Acceptance of new spacecraft structural architectures and concepts requires validated design methods to minimize the expense involved with technology validation via flight testing. This paper explores the implementation of probabilistic methods in the sensitivity analysis of the structural response of a Hypersonic Inflatable Aerodynamic Decelerator (HIAD). HIAD architectures are attractive for spacecraft deceleration because they are lightweight, store compactly, and utilize the atmosphere to decelerate a spacecraft during re-entry. However, designers are hesitant to include these inflatable approaches for large payloads or spacecraft because of the lack of flight validation. In the example presented here, the structural parameters of an existing HIAD model have been varied to illustrate the design approach utilizing uncertainty-based methods. Surrogate models have been used to reduce computational expense by several orders of magnitude. The suitability of the design is based on assessing variation in the resulting cone angle. The acceptable cone angle variation would rely on the aerodynamic requirements.
Parameterized Facial Expression Synthesis Based on MPEG-4
NASA Astrophysics Data System (ADS)
Raouzaiou, Amaryllis; Tsapatsoulis, Nicolas; Karpouzis, Kostas; Kollias, Stefanos
2002-12-01
In the framework of MPEG-4, one can include applications where virtual agents, utilizing both textual and multisensory data, including facial expressions and nonverbal speech, help systems become accustomed to the actual feelings of the user. Applications of this technology are expected in educational environments, virtual collaborative workplaces, communities, and interactive entertainment. Facial animation has gained much interest within the MPEG-4 framework, with implementation details being an open research area (Tekalp, 1999). In this paper, we describe a method for enriching human-computer interaction, focusing on analysis and synthesis of primary and intermediate facial expressions (Ekman and Friesen, 1978). To achieve this goal, we utilize facial animation parameters (FAPs) to model primary expressions and describe a rule-based technique for handling intermediate ones. A relation between FAPs and the activation parameter proposed in classical psychological studies is established, leading to parameterized facial expression analysis and synthesis notions compatible with the MPEG-4 standard.
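The rule-based handling of intermediate expressions described above amounts to modulating the FAP profile of a primary expression by an activation level. A hedged sketch of that idea follows: scale an archetypal FAP profile by an activation parameter in [0, 1] to obtain intermediate expressions. The FAP names and magnitudes here are invented placeholders, not the MPEG-4 profiles used by the authors.

```python
# Sketch: derive an intermediate expression by scaling a primary expression's
# FAP displacements with an activation parameter (0 = neutral, 1 = full expression).
# FAP names and magnitudes below are illustrative placeholders only.
JOY_PROFILE = {
    "raise_l_cornerlip": 120,   # displacements in (assumed) FAP units
    "raise_r_cornerlip": 120,
    "open_jaw": 40,
    "raise_l_i_eyebrow": 15,
    "raise_r_i_eyebrow": 15,
}

def intermediate_expression(profile, activation):
    """Blend between the neutral face (all zeros) and a primary-expression FAP profile."""
    activation = max(0.0, min(1.0, activation))
    return {fap: round(value * activation) for fap, value in profile.items()}

print(intermediate_expression(JOY_PROFILE, 0.4))   # a 'mild joy' intermediate expression
```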
Ketelhut, Diane Jass; Niemi, Steven M
2007-01-01
This article examines several new and exciting communication technologies. Many of the technologies were developed by the entertainment industry; however, other industries are adopting and modifying them for their own needs. These new technologies allow people to collaborate across distance and time and to learn in simulated work contexts. The article explores the potential utility of these technologies for advancing laboratory animal care and use through better education and training. Descriptions include emerging technologies such as augmented reality and multi-user virtual environments, which offer new approaches with different capabilities. Augmented reality interfaces, characterized by the use of handheld computers to infuse the virtual world into the real one, result in deeply immersive simulations. In these simulations, users can access virtual resources and communicate with real and virtual participants. Multi-user virtual environments enable multiple participants to simultaneously access computer-based three-dimensional virtual spaces, called "worlds," and to interact with digital tools. They allow for authentic experiences that promote collaboration, mentoring, and communication. Because individuals may learn or train differently, it is advantageous to combine the capabilities of these technologies and applications with more traditional methods to increase the number of students who are served by using current methods alone. The use of these technologies in animal care and use programs can create detailed training and education environments that allow students to learn the procedures more effectively, teachers to assess their progress more objectively, and researchers to gain insights into animal care.
Sensitivity analysis of the add-on price estimate for the silicon web growth process
NASA Technical Reports Server (NTRS)
Mokashi, A. R.
1981-01-01
The web growth process, a silicon-sheet technology option developed for the flat plate solar array (FSA) project, was examined. Base case data for the technical and cost parameters for the technical and commercial readiness phase of the FSA project are projected. The process add-on price is analyzed using the base case data for cost parameters such as equipment, space, direct labor, materials and utilities, and for production parameters such as growth rate and run length, using a computer program developed specifically to perform the sensitivity analysis with improved price estimation. Silicon price, sheet thickness and cell efficiency are also discussed.
Augmented Reality and Mobile Art
NASA Astrophysics Data System (ADS)
Gwilt, Ian
The combined notions of augmented-reality (AR) and mobile art are based on the amalgamation of a number of enabling technologies including computer imaging, emergent display and tracking systems, and the increased computing power in hand-held devices such as Tablet PCs, smart phones, or personal digital assistants (PDAs), which have been utilized in the making of works of art. There is much published research on the technical aspects of AR and the ongoing work being undertaken in the development of faster, more efficient AR systems [1] [2]. In this text I intend to concentrate on how AR and its associated typologies can be applied in the context of new media art practices, with particular reference to its application on hand-held or mobile devices.
Analysis and Preliminary Design of an Advanced Technology Transport Flight Control System
NASA Technical Reports Server (NTRS)
Frazzini, R.; Vaughn, D.
1975-01-01
The analysis and preliminary design of an advanced technology transport aircraft flight control system using avionics and flight control concepts appropriate to the 1980-1985 time period are discussed. Specifically, the techniques and requirements of the flight control system were established, a number of candidate configurations were defined, and an evaluation of these configurations was performed to establish a recommended approach. Candidate configurations based on redundant integration of various sensor types, computational methods, servo actuator arrangements and data-transfer techniques were defined to the functional module and piece-part level. Life-cycle costs, for the flight control configurations, as determined in an operational environment model for 200 aircraft over a 15-year service life, were the basis of the optimum configuration selection tradeoff. The recommended system concept is a quad digital computer configuration utilizing a small microprocessor for input/output control, a hexad skewed set of conventional sensors for body rate and body acceleration, and triple integrated actuators.
NASA Astrophysics Data System (ADS)
Cieszewski, Radoslaw; Linczuk, Maciej
2016-09-01
The development of FPGA technology and the increasing complexity of applications in recent decades have forced compilers to move to higher abstraction levels. A compiler interprets an algorithmic description of a desired behavior written in a High-Level Language (HLL) and translates it to a Hardware Description Language (HDL). This paper presents an RPython-based High-Level Synthesis (HLS) compiler. The compiler takes the configuration parameters and maps an RPython program to VHDL. The VHDL code can then be used to program FPGA chips. In comparison to other technologies, FPGAs have the potential to achieve far greater performance than software as a result of omitting the fetch-decode-execute operations of General Purpose Processors (GPPs) and introducing more parallel computation. This can be exploited by utilizing many resources at the same time. Creating parallel algorithms computed with FPGAs in pure HDL is difficult and time consuming. Implementation time can be greatly reduced with a High-Level Synthesis compiler. This article describes design methodologies and tools, the implementation, and first results of the VHDL backend created for the RPython compiler.
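To make the compilation target more concrete, the snippet below shows the kind of small, statically typable RPython-style kernel such an HLS flow could map to hardware; fixed loop bounds and simple integer arithmetic are what allow the multiply-accumulate operations to be unrolled into parallel hardware resources. The kernel itself is an illustrative example, not code taken from the compiler described here.

```python
# Illustrative RPython-style kernel: a 4-tap FIR filter with fixed bounds and
# integer arithmetic -- the kind of loop an HLS backend could unroll into parallel
# multipliers and adders in the generated VHDL.
def fir4(samples, taps):
    out = [0] * len(samples)
    for i in range(3, len(samples)):
        acc = 0
        for k in range(4):                  # fixed trip count -> fully unrollable
            acc += taps[k] * samples[i - k]
        out[i] = acc >> 8                   # fixed-point rescale
    return out

print(fir4([0, 64, 128, 255, 255, 128, 64, 0], [32, 96, 96, 32]))
```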
Integrated Artificial Intelligence Approaches for Disease Diagnostics.
Vashistha, Rajat; Chhabra, Deepak; Shukla, Pratyoosh
2018-06-01
Mechanocomputational techniques in conjunction with artificial intelligence (AI) are revolutionizing the interpretation of crucial information from medical data and converting it into optimized and organized information for diagnostics. This is possible due to valuable advances in artificial intelligence, computer-aided diagnostics, virtual assistants, robotic surgery, augmented reality and AI-based genome editing technologies. Such techniques are serving as products for diagnosing emerging microbial or non-microbial diseases. This article presents a combinatory approach of using such approaches and providing therapeutic solutions towards utilizing these techniques in disease diagnostics.
Through-the-wall surveillance for homeland security and law enforcement
NASA Astrophysics Data System (ADS)
Borek, Stanley E.; Clarke, Bernard J.; Costianes, Peter J.
2005-05-01
The Air Force Research Laboratory Information Directorate (AFRL/IF), under sponsorship of the Department of Justice's (DOJ) National Institute of Justice (NIJ) Office of Science and Technology (OS&T), is currently developing and evaluating advanced Through the Wall Surveillance (TWS) technologies. These technologies are partitioned into two categories: inexpensive, handheld systems for locating an individual(s) behind a wall or door; and portable, personal computer (PC) based standoff systems to enable the determination of events during critical incident situations. The technologies utilized are primarily focused on active radars operating in the UHF, L, S (ultra wideband (UWB)), X, and Ku bands. The data displayed by these systems are indicative of range (1 dimension), or range and azimuth (2 dimensions), to the moving individual(s). This paper will highlight the technologies employed in five (5) prototype TWS systems delivered to NIJ and AFRL/IF for test and evaluation. It will discuss the systems' backgrounds, applications, current states of evolution, and future plans for enhanced assessment.
NASA Astrophysics Data System (ADS)
Silvernail, Nathan L.
This research was carried out in collaboration with the United Launch Alliance (ULA), to advance an innovative Centaur-based on-orbit propellant storage and transfer system that takes advantage of rotational settling to simplify Fluid Management (FM), specifically enabling settled fluid transfer between two tanks and settled pressure control. This research consists of two specific objectives: (1) technique and process validation and (2) computational model development. In order to raise the Technology Readiness Level (TRL) of this technology, the corresponding FM techniques and processes must be validated in a series of experimental tests, including: laboratory/ground testing, microgravity flight testing, suborbital flight testing, and orbital testing. Researchers from Embry-Riddle Aeronautical University (ERAU) have joined with the Massachusetts Institute of Technology (MIT) Synchronized Position Hold Engage and Reorient Experimental Satellites (SPHERES) team to develop a prototype FM system for operations aboard the International Space Station (ISS). Testing of the integrated system in a representative environment will raise the FM system to TRL 6. The tests will demonstrate the FM system and provide unique data pertaining to the vehicle's rotational dynamics while undergoing fluid transfer operations. These data sets provide insight into the behavior and physical tendencies of the on-orbit refueling system. Furthermore, they provide a baseline for comparison against the data produced by various computational models, thus verifying the accuracy of the models' output and validating the modeling approach. Once these preliminary models have been validated, the parameters defined by them will provide the basis of development for accurate simulations of full-scale, on-orbit systems. The completion of this project and the models being developed will accelerate the commercialization of on-orbit propellant storage and transfer technologies as well as all in-space technologies that utilize or will utilize similar FM techniques and processes.
Evaluation of Digital Technology and Software Use among Business Education Teachers
ERIC Educational Resources Information Center
Ellis, Richard S.; Okpala, Comfort O.
2004-01-01
Digital video cameras are part of the evolution of multimedia digital products that have positive applications for educators, students, and industry. Multimedia digital video can be utilized by any personal computer and it allows the user to control, combine, and manipulate different types of media, such as text, sound, video, computer graphics,…
DOT National Transportation Integrated Search
2000-03-09
The Texas Department of Transportation's (TxDOT) "smart highway" project, called TransGuide, scheduled to go on line in 1995 in San Antonio, utilizes high speed computer technology to help drivers anticipate traffic conditions -- in an effort to inc...
EPA is developing methods for utilizing computational chemistry, high-throughput screening (HTS) and genomic technologies to predict potential toxicity and prioritize the use of limited testing resources.
Guzik, Przemyslaw; Malik, Marek
Mobile electrocardiographs consist of three components: a mobile device (e.g. a smartphone), an electrocardiographic device or accessory, and a mobile application. Mobile platforms are small computers with sufficient computational power, good quality display, suitable data storage, and several possibilities of data transmission. Electrocardiographic electrodes and sensors for mobile use utilize unconventional materials, e.g. rubber, e-textile, and inkjet-printed nanoparticle electrodes. Mobile devices can be handheld, worn as vests or T-shirts, or attached to the patient's skin as biopatches. Mobile electrocardiographic devices and accessories may additionally record other signals including respiratory rate, activity level, and geolocation. Large-scale clinical studies that utilize electrocardiography are easier to conduct using mobile technologies, and the collected data are suitable for "big data" processing. This is expected to reveal phenomena so far inaccessible by standard electrocardiographic techniques. Copyright © 2016 Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Association for the Development of Computer-based Instructional Systems.
These proceedings present 74 selected abstracts and 47 selected formal papers under 14 special interest group headings. Topics addressed by the papers include constructing multimedia; interactive video; computers in secondary school mathematics; access in computer-based instruction; implementing computer-based technology; advisor development;…
NASA Technical Reports Server (NTRS)
Klich, P. J.; Macconochie, I. O.
1979-01-01
A study of an array of advanced earth-to-orbit space transportation systems with a focus on mass properties and technology requirements is presented. Methods of estimating weights of these vehicles differ from those used for commercial and military aircraft; the new techniques emphasizing winged horizontal and vertical takeoff advanced systems are described utilizing the space shuttle subsystem data base for the weight estimating equations. The weight equations require information on mission profile, the structural materials, the thermal protection system, and the ascent propulsion system, allowing for the type of construction and various propellant tank shapes. The overall system weights are calculated using this information and incorporated into the Systems Engineering Mass Properties Computer Program.
[Isolation and identification methods of enterobacteria group and its technological advancement].
Furuta, Itaru
2007-08-01
In the last half-century, isolation and identification methods for enterobacteria groups have markedly improved through technological advancement. Clinical microbiology tests have changed over time from tube methods to commercial identification kits and automated identification. Tube methods are the original method for the identification of enterobacteria groups, that is, the basic and essential method for recognizing bacterial fermentation and biochemical principles. In this paper, traditional tube tests are discussed, such as the utilization of carbohydrates and the indole, methyl red, citrate and urease tests. Commercial identification kits and automated instruments based on computer analysis are also discussed as current methods; these methods provide rapidity and accuracy. Nonculture techniques, such as nucleic acid typing methods using PCR analysis and immunochemical methods using monoclonal antibodies, can be further developed.
ERIC Educational Resources Information Center
Larbi-Apau, Josephine A.; Moseley, James L.
2012-01-01
This study examined the validity of Selwyn's computer attitude scale (CAS) and its implication for technology-based performance of randomly sampled (n = 167) multidiscipline teaching faculty in higher education in Ghana. Computer attitude is considered a critical determinant of potential technology-based performance. Composed of four…
CMOS cassette for digital upgrade of film-based mammography systems
NASA Astrophysics Data System (ADS)
Baysal, Mehmet A.; Toker, Emre
2006-03-01
While full-field digital mammography (FFDM) technology is gaining clinical acceptance, the overwhelming majority (96%) of the installed base of mammography systems are conventional film-screen (FSM) systems. A high performance and economical digital cassette-based product to conveniently upgrade FSM systems to FFDM would accelerate the adoption of FFDM, and make the clinical and technical advantages of FFDM available to a larger population of women. The planned FFDM cassette is based on our commercial Digital Radiography (DR) cassette for 10 cm x 10 cm field-of-view spot imaging and specimen radiography, utilizing a 150 micron columnar CsI(Tl) scintillator and 48 micron active-pixel CMOS sensor modules. Unlike a Computed Radiography (CR) cassette, which requires an external digitizer, our DR cassette transfers acquired images to a display workstation within approximately 5 seconds of exposure, greatly enhancing patient flow. We will present the physical performance of our prototype system against other FFDM systems in clinical use today, using established objective criteria such as the Modulation Transfer Function (MTF) and Detective Quantum Efficiency (DQE), and subjective criteria, such as a contrast-detail (CD-MAM) observer performance study. Driven by the strong demand from the computer industry, CMOS technology is one of the lowest cost and most readily accessible technologies available for FFDM today. Recent popular use of CMOS imagers in high-end consumer cameras has also resulted in significant advances in the imaging performance of CMOS sensors against rivaling CCD sensors. This study promises to take advantage of these unique features to develop the first CMOS-based FFDM upgrade cassette.
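For readers less familiar with the objective criteria listed, the standard textbook definition of frequency-dependent detective quantum efficiency ties it to MTF and the noise power spectrum; this is the general definition, not a value or derivation specific to the cassette described here.

```latex
% General definition of detective quantum efficiency (textbook form, not system-specific):
\mathrm{DQE}(f) \;=\; \frac{\mathrm{SNR}^2_{\mathrm{out}}(f)}{\mathrm{SNR}^2_{\mathrm{in}}(f)}
\;=\; \frac{\mathrm{MTF}^2(f)}{\bar{q}\,\mathrm{NNPS}(f)},
```

where q̄ is the incident photon fluence and NNPS(f) is the noise power spectrum normalized by the square of the mean signal.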
NASA Astrophysics Data System (ADS)
Gupta, V.; Gupta, N.; Gupta, S.; Field, E.; Maechling, P.
2003-12-01
Modern laptop computers and personal computers can provide capabilities that are, in many ways, comparable to workstations or departmental servers. However, this doesn't mean we should run all computations on our local computers. We have identified several situations in which it is preferable to implement our seismological application programs in a distributed, server-based, computing model. In this model, application programs on the user's laptop, or local computer, invoke programs that run on an organizational server, and the results are returned to the invoking system. Situations in which a server-based architecture may be preferred include: (a) a program is written in a language, or written for an operating environment, that is unsupported on the local computer, (b) software libraries or utilities required to execute a program are not available on the user's computer, (c) a computational program is physically too large, or computationally too expensive, to run on a user's computer, (d) a user community wants to enforce a consistent method of performing a computation by standardizing on a single implementation of a program, and (e) the computational program may require current information that is not available to all client computers. Until recently, distributed, server-based, computational capabilities were implemented using client/server architectures. In these architectures, client programs were often written in the same language, and they executed in the same computing environment, as the servers. Recently, a new distributed computational model, called Web Services, has been developed. Web Services are based on Internet standards such as XML, SOAP, WSDL, and UDDI. Web Services offer the promise of platform- and language-independent distributed computing. To investigate this new computational model, and to provide useful services to the SCEC Community, we have implemented several computational and utility programs using a Web Service architecture. We have hosted these Web Services as a part of the SCEC Community Modeling Environment (SCEC/CME) ITR Project (http://www.scec.org/cme). We have implemented Web Services for several of the reasons cited previously. For example, we implemented a FORTRAN-based Earthquake Rupture Forecast (ERF) as a Web Service for use by client computers that don't support a FORTRAN runtime environment. We implemented a Generic Mapping Tool (GMT) Web Service for use by systems that don't have local access to GMT. We implemented a Hazard Map Calculator Web Service to execute hazard calculations that are too computationally intensive to run on a local system. We implemented a Coordinate Conversion Web Service to enforce a standard and consistent method for converting between UTM and Lat/Lon. Our experience developing these services indicates both strengths and weaknesses in current Web Service technology. Client programs that utilize Web Services typically need network access, a significant disadvantage at times. Programs with simple input and output parameters were the easiest to implement as Web Services, while programs with complex parameter types required a significant amount of additional development. We also noted that Web Services are very data-oriented, and adapting object-oriented software into the Web Service model proved problematic. Also, the Web Service approach of converting data types into XML format for network transmission has significant inefficiencies for some data sets.
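As an illustration of the consumption pattern discussed (WSDL-described SOAP services invoked from a thin client), the following sketch uses the zeep library to call a coordinate-conversion operation; the WSDL URL, operation name, and parameters are hypothetical stand-ins, since the actual SCEC/CME service interfaces are not reproduced here.

```python
# Hypothetical SOAP/WSDL client in the spirit of the Coordinate Conversion Web Service.
# The WSDL URL, operation name, and argument names are placeholders, not the real SCEC API.
from zeep import Client

WSDL_URL = "https://services.example.org/CoordinateConversion?wsdl"  # placeholder

def latlon_to_utm(lat, lon):
    client = Client(WSDL_URL)                       # parses the WSDL and builds typed bindings
    return client.service.LatLonToUTM(latitude=lat, longitude=lon)

# Example call (requires a real service behind WSDL_URL):
# print(latlon_to_utm(34.05, -118.25))
```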
WEB-BASED DATABASE ON RENEWAL TECHNOLOGIES ...
As U.S. utilities continue to shore up their aging infrastructure, renewal needs now represent over 43% of annual expenditures compared to new construction for drinking water distribution and wastewater collection systems (Underground Construction [UC], 2016). An increased understanding of renewal options will ultimately assist drinking water utilities in reducing water loss and help wastewater utilities to address infiltration and inflow issues in a cost-effective manner. It will also help to extend the service lives of both drinking water and wastewater mains. This research effort involved collecting case studies on the use of various trenchless pipeline renewal methods and providing the information in an online searchable database. The overall objective was to further support technology transfer and information sharing regarding emerging and innovative renewal technologies for water and wastewater mains. The result of this research is a Web-based, searchable database that utility personnel can use to obtain technology performance and cost data, as well as case study references. The renewal case studies include: technologies used; the conditions under which the technology was implemented; costs; lessons learned; and utility contact information. The online database also features a data mining tool for automated review of the technologies selected and cost data. Based on a review of the case study results and industry data, several findings are presented on tren
ERIC Educational Resources Information Center
CAUSE, Boulder, CO.
Six papers and two abstracts of papers are presented from the 1995 CAUSE conference track on user services issues faced by managers of information technology at colleges and universities. The papers include: (1) "Academic Computing Services: MORE than a Utility" (Scott Bierman and Cathy Smith), which focuses on Carleton College's efforts…
Recent technological advances in computed tomography and the clinical impact therein.
Runge, Val M; Marquez, Herman; Andreisek, Gustav; Valavanis, Anton; Alkadhi, Hatem
2015-02-01
Current technological advances in CT, specifically those with a major impact on clinical imaging, are discussed. The intent was to provide for both medical physicists and practicing radiologists a summary of the clinical impact of each advance, offering guidance in terms of utility and day-to-day clinical implementation, with specific attention to radiation dose reduction.
ERIC Educational Resources Information Center
Hassanzadeh, Vahideh; Gholami, Reza; Allahyar, Negah; Noordin, Nooreen
2012-01-01
Nowadays technology has practically changed every aspect of language teaching. There are numerous studies focusing on the personality traits of students or other internet users towards the internet utilization. Nonetheless, little research has examined the personality traits of teachers towards using computers for educational purposes especially…
ERIC Educational Resources Information Center
Ogunlade, Oyeronke Olufunmilola; Fagbola, Oluwafunmilayo Faith; Ogunlade, Amos Akindele; Amosa, Abdulganiyu Alasela
2015-01-01
The use of the Internet can further equip teachers by providing them with the latest information on their discipline. The purpose of technology in teacher training is to provide pre-service teachers with the capability of integrating computer technologies into curriculum and instructional activities. This study therefore assessed the internet…
ERIC Educational Resources Information Center
Miller, Charman L.; Leadingham, Camille; Vance, Ronald
2010-01-01
Associate Degree Nursing (ADN) faculty are challenged by the monumental responsibility of preparing students to function as safe, professional nurses in a two year course of study. Advances in computer technology and emphasis on integrating technology and active learning strategies into existing course structures have prompted many nurse educators…
Computers in the Undergraduate Curriculum: An Aspect of the Many Section Problem.
ERIC Educational Resources Information Center
Churchill, Geoffrey
A brief case study of the resistance to technological change is presented using DOG, a small scale deterministic business game, as the example of technology. DOG, a decision mathematics game for the purpose of providing an environment for application of mathematical concepts, consists of assignments mostly utilizing matrix algebra but also some…
Angiuoli, Samuel V; Matalka, Malcolm; Gussman, Aaron; Galens, Kevin; Vangala, Mahesh; Riley, David R; Arze, Cesar; White, James R; White, Owen; Fricke, W Florian
2011-08-30
Next-generation sequencing technologies have decentralized sequence acquisition, increasing the demand for new bioinformatics tools that are easy to use, portable across multiple platforms, and scalable for high-throughput applications. Cloud computing platforms provide on-demand access to computing infrastructure over the Internet and can be used in combination with custom-built virtual machines to distribute pre-packaged and pre-configured software. We describe the Cloud Virtual Resource, CloVR, a new desktop application for push-button automated sequence analysis that can utilize cloud computing resources. CloVR is implemented as a single portable virtual machine (VM) that provides several automated analysis pipelines for microbial genomics, including 16S, whole genome and metagenome sequence analysis. The CloVR VM runs on a personal computer, utilizes local computer resources and requires minimal installation, addressing key challenges in deploying bioinformatics workflows. In addition, CloVR supports use of remote cloud computing resources to improve performance for large-scale sequence processing. In a case study, we demonstrate the use of CloVR to automatically process next-generation sequencing data on multiple cloud computing platforms. The CloVR VM and associated architecture lowers the barrier of entry for utilizing complex analysis protocols on both local single- and multi-core computers and cloud systems for high-throughput data processing.
An ultra-compact processor module based on the R3000
NASA Astrophysics Data System (ADS)
Mullenhoff, D. J.; Kaschmitter, J. L.; Lyke, J. C.; Forman, G. A.
1992-08-01
Viable high density packaging is of critical importance for future military systems, particularly spaceborne systems which require minimum weight and size and high mechanical integrity. A leading, emerging technology for high density packaging is multi-chip modules (MCM). During the 1980's, a number of different MCM technologies emerged. In support of Strategic Defense Initiative Organization (SDIO) programs, Lawrence Livermore National Laboratory (LLNL) has developed, utilized, and evaluated several different MCM technologies. Prior LLNL efforts include modules developed in 1986, using hybrid wafer scale packaging, which are still operational in an Air Force satellite mission. More recent efforts have included very high density cache memory modules, developed using laser pantography. As part of the demonstration effort, LLNL and Phillips Laboratory began collaborating in 1990 in the Phase 3 Multi-Chip Module (MCM) technology demonstration project. The goal of this program was to demonstrate the feasibility of General Electric's (GE) High Density Interconnect (HDI) MCM technology. The design chosen for this demonstration was the processor core for a MIPS R3000 based reduced instruction set computer (RISC), which has been described previously. It consists of the R3000 microprocessor, R3010 floating point coprocessor and 128 Kbytes of cache memory.
NASA Astrophysics Data System (ADS)
Andersson, David L.
The field of Computer Information Systems (CIS) or Information Technology (IT) is experiencing rapid change. A 2003 study analyzing the IT degree programs and those of competing disciplines at 10 post-secondary institutions concluded that information technology programs are perceived differently from information systems and computer science programs and are significantly less focused on both math and pure science subjects. In Information Technology programs, voluntary professional certifications, generally known in the Information Technology field as "IT" certifications, are used as indicators of professional skill. A descriptive study noting one subject group's responses to items that were nearly identical except for IT certification information was done to investigate undergraduate CIS/IT student perceptions of IT industry certified instructors. The subject group was comprised of undergraduate CIS/IT students from a regionally accredited private institution and a public institution. The methodology was descriptive, based on a previous model by Dr. McKillip, Professor of Psychology, Southern Illinois University at Carbondale, utilizing a web-based survey instrument with a Likert scale, providing for voluntary anonymous responses outside the classroom over a ten day window. The results indicated that IT certification affected student perceptions of instructor effectiveness, teaching methodology, and student engagement in the class, and to a lesser degree, instructor technical qualifications. The implications suggest that additional research on this topic is merited. Although the study was not designed to examine the precise cause and effect, an important implication is that students may be motivated to attend classes taught by instructors they view as more confident and effective and that teachers with IT industry certification can better engage their students.
Cohn, Amy M.; Hunter-Reel, Dorian; Hagman, Brett T.; Mitchell, Jessica
2011-01-01
Background Interactive and mobile technologies (i.e., smartphones such as Blackberries, iPhones, and palm-top computers) show promise as an efficacious and cost-effective means of communicating health-behavior risks, improving public health outcomes, and accelerating behavior change (Abroms and Maibach, 2008). The present study was conducted as a “needs assessment” to examine the current available mobile smartphone applications (e.g., apps) that utilize principles of ecological momentary assessment (EMA) -- daily self-monitoring or near real-time self-assessment of alcohol use behavior -- to promote positive behavior change, alcohol harm reduction, psycho-education about alcohol use, or abstinence from alcohol. Methods Data were collected and analyzed from iTunes for Apple iPhone©. An inventory assessed the number of available apps that directly addressed alcohol use and consumption, alcohol treatment, or recovery, and whether these apps incorporated empirically-based components of alcohol treatment. Results Findings showed that few apps addressed alcohol use behavior change or recovery. Aside from tracking drinking consumption, a minority utilized empirically-based components of alcohol treatment. Some apps claimed they could serve as an intervention, however no empirical evidence was provided. Conclusions More studies are needed to examine the efficacy of mobile technology in alcohol intervention studies. The large gap between availability of mobile apps and their use in alcohol treatment programs indicate several important future directions for research. PMID:21689119
NASA Technical Reports Server (NTRS)
2001-01-01
Howmet Research Corporation was the first to commercialize an innovative cast metal technology developed at Auburn University, Auburn, Alabama. With funding assistance from NASA's Marshall Space Flight Center, Auburn University's Solidification Design Center (a NASA Commercial Space Center) developed accurate nickel-based superalloy data for casting molten metals. Through a contract agreement, Howmet used the data to develop computer model predictions of molten metals and molding materials in cast metal manufacturing. Howmet Metal Mold (HMM), part of Howmet Corporation Specialty Products, of Whitehall, Michigan, utilizes metal molds to manufacture net shape castings in various alloys and amorphous metal (metallic glass). By implementing the thermophysical property data from Auburn researchers, Howmet employs its newly developed computer model predictions to offer customers high-quality, low-cost products with significantly improved mechanical properties. Components fabricated with this new process replace components originally made from forgings or billet. Compared with products manufactured through traditional casting methods, Howmet's computer-modeled castings come out on top.
PRESAGE: PRivacy-preserving gEnetic testing via SoftwAre Guard Extension.
Chen, Feng; Wang, Chenghong; Dai, Wenrui; Jiang, Xiaoqian; Mohammed, Noman; Al Aziz, Md Momin; Sadat, Md Nazmus; Sahinalp, Cenk; Lauter, Kristin; Wang, Shuang
2017-07-26
Advances in DNA sequencing technologies have prompted a wide range of genomic applications to improve healthcare and facilitate biomedical research. However, privacy and security concerns have emerged as a challenge for utilizing cloud computing to handle sensitive genomic data. We present one of the first implementations of a Software Guard Extensions (SGX) based securely outsourced genetic testing framework, which leverages multiple cryptographic protocols and a minimal perfect hash scheme to enable efficient and secure data storage and computation outsourcing. We compared the performance of the proposed PRESAGE framework with the state-of-the-art homomorphic encryption scheme, as well as a plaintext implementation. The experimental results demonstrated significant performance gains over the homomorphic encryption methods and a small computational overhead in comparison to the plaintext implementation. The proposed PRESAGE provides an alternative solution for secure and efficient genomic data outsourcing in an untrusted cloud by using a hybrid framework that combines secure hardware and multiple crypto protocols.
Learning technologies and the cyber-science classroom
NASA Astrophysics Data System (ADS)
Houlihan, Gerard
Access to computer and communication technology has long been regarded as `part-and-parcel' of a good education. No educator can afford to ignore the profound impact of learning technologies on the way we teach science, nor fail to acknowledge that information literacy and computing skills will be fundamental to the practice of science in the next millennium. Nevertheless, there is still confusion concerning what technologies educators should employ in teaching science. Furthermore, a lack of knowledge combined with the pressure to be `seen' utilizing technology has led some schools to waste scarce resources in a `grab-bag' attitude towards computers and technology. Such popularized `wish lists' can only drive schools to accumulate expensive equipment for no real learning purpose. In the future educators will have to reconsider their curriculum and pedagogy with a focus on the learning environment before determining what appropriate computing resources to acquire. This will be fundamental to the capabilities of science classrooms to engage with cutting-edge issues in science. This session will demonstrate the power of a broad range of learning technologies to enhance science education. The aim is to explore classroom possibilities as well as to provide a basic introduction to technical aspects of various software and hardware applications, including robotics and dataloggers and simulation software.
Design and implementation of a Windows NT network to support CNC activities
NASA Technical Reports Server (NTRS)
Shearrow, C. A.
1996-01-01
The Manufacturing, Materials, & Processes Technology Division is undergoing dramatic changes to bring its manufacturing practices current with today's technological revolution. The Division is developing Computer Automated Design and Computer Automated Manufacturing (CAD/CAM) abilities. The development of resource tracking is underway in the form of an accounting software package called Infisy. These two efforts will bring the division into the 1980's in relation to manufacturing processes. Computer Integrated Manufacturing (CIM) is the final phase of change to be implemented. This document is a qualitative study of a CIM application capable of completing the changes necessary to bring the manufacturing practices into the 1990's. The documentation provided in this qualitative research effort includes discovery of the current status of manufacturing in the Manufacturing, Materials, & Processes Technology Division, including the software, hardware, network and mode of operation. The proposed direction of research includes a network design, the computers to be used, the software to be used, machine-to-computer connections, an estimated timeline for implementation, and a cost estimate. Recommendations for the division's improvement include actions to be taken, software to utilize, and computer configurations.
Building IT capability in health-care organizations.
Khatri, Naresh
2006-05-01
While computer technology has revolutionized industries such as banking and airlines, it has done little for health care so far. Most of the health-care organizations continue the early-computer-era practice of buying the latest technology without knowing how it might effectively be employed in achieving business goals. By investing merely in information technology (IT) rather than in IT capabilities they acquire IT components--primarily hardware, software, and vendor-provided services--which they do not understand and, as a result, are not capable of fully utilizing for achieving organizational objectives. In the absence of internal IT capabilities, health-care organizations have relied heavily on the fragmented IT vendor market in which vendors do not offer an open architecture, and are unwilling to offer electronic interfaces that would make their 'closed' systems compatible with those of other vendors. They are hamstrung as a result because they have implemented so many different technologies and databases that information stays in silos. Health systems can meet this challenge by developing internal IT capabilities that would allow them to seamlessly integrate clinical and business IT systems and develop innovative uses of IT. This paper develops a comprehensive conception of IT capability grounded in the resource-based theory of the firm as a remedy to the woes of IT investments in health care.
Tubaishat, Ahmad
2017-09-18
Electronic health records (EHRs) are increasingly being implemented in healthcare organizations, but little attention has been paid to the degree to which nurses as end-users will accept these systems and subsequently use them. The aim of this study was to explore nurses' perceptions of the usefulness and ease-of-use of EHRs, to examine the relationship between these constructs, and to study their predictors. A national exploratory study was conducted with 1539 nurses from 15 randomly selected hospitals, representative of different regions and healthcare sectors in Jordan. Data were collected using a self-administered questionnaire, which was based on the Technology Acceptance Model. Correlations and linear multiple regression were utilized to analyze the data. Jordanian nurses demonstrated a positive perception of the usefulness and ease-of-use of EHRs, and subsequently accepted the technology. Significant positive correlations were found between these two constructs. The variables that predict usefulness were the gender, professional rank, EHR experience, and computer skills of the nurses. Perceived ease-of-use was affected by nursing and EHR experience, and computer skills. This study adds to the growing body of knowledge on issues related to the acceptance of technology in the health informatics field, focusing on nurses' acceptance of EHRs.
NASA Astrophysics Data System (ADS)
Sendek, Austin D.; Yang, Qian; Cubuk, Ekin D.; Duerloo, Karel-Alexander N.; Cui, Yi; Reed, Evan J.
We present a new type of large-scale computational screening approach for identifying promising candidate materials for solid state electrolytes for lithium ion batteries that is capable of screening all known lithium-containing solids. To predict the likelihood of a candidate material exhibiting high lithium ion conductivity, we leverage machine learning techniques to train an ionic conductivity classification model using logistic regression based on experimental measurements reported in the literature. This model, which is built on easily calculable atomistic descriptors, provides new insight into the structure-property relationship for superionic behavior in solids and is approximately one million times faster to evaluate than DFT-based approaches to calculating diffusion coefficients or migration barriers. We couple this model with several other technologically motivated heuristics to reduce the list of candidate materials from the more than 12,000 known lithium-containing solids to 21 structures that show promise as electrolytes, few of which have been examined experimentally. Our screening utilizes structures and electronic information contained in the Materials Project database. This work is supported by an Office of Technology Licensing Fellowship through the Stanford Graduate Fellowship Program and a seed grant from the TomKat Center for Sustainable Energy at Stanford.
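As a rough illustration of the classification step described above, the sketch below trains a logistic-regression classifier on a few atomistic descriptors and ranks unseen candidates by predicted probability; the feature choices, values, and labels are invented placeholders, not the authors' descriptors or training data.

```python
# Minimal sketch of a descriptor-based superionic classifier in scikit-learn.
# Features, values, and labels below are illustrative placeholders only.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical atomistic descriptors for a handful of lithium-containing solids
# (e.g., an average Li-Li separation, a volume-per-atom measure, a bonding proxy).
X_train = np.array([
    [2.9, 18.4, 0.41],
    [3.4, 22.1, 0.28],
    [2.7, 17.9, 0.55],
    [3.8, 25.0, 0.19],
    [3.0, 19.2, 0.47],
    [3.7, 23.8, 0.23],
])
y_train = np.array([1, 0, 1, 0, 1, 0])  # 1 = high ionic conductivity reported, 0 = low

clf = LogisticRegression().fit(X_train, y_train)

# Rank new candidate structures by predicted probability of superionic behavior.
candidates = np.array([[3.0, 19.0, 0.44], [3.6, 24.2, 0.22]])
print(clf.predict_proba(candidates)[:, 1])
```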
Singh, Pankaj Kumar; Negi, Arvind; Gupta, Pawan Kumar; Chauhan, Monika; Kumar, Raj
2016-08-01
Toxicity is a common drawback of newly designed chemotherapeutic agents. With the exception of pharmacophore-induced toxicity (lack of selectivity at higher concentrations of a drug), the toxicity due to chemotherapeutic agents is based on the toxicophore moiety present in the drug. To date, methodologies implemented to determine toxicophores may be broadly classified into biological, bioanalytical and computational approaches. The biological approach involves analysis of bioactivated metabolites, whereas the computational approach involves a QSAR-based method, mapping techniques, an inverse docking technique and a few toxicophore identification/estimation tools. As one of the major steps in the drug discovery process, toxicophore identification has proven to be an essential screening step in drug design and development. The paper is the first of its kind, attempting to cover and compare different methodologies employed in predicting and determining toxicophores with an emphasis on their scope and limitations. Such information may prove vital in the appropriate selection of methodology and can be used as a screening technology by researchers to discover the toxicophoric potentials of their designed and synthesized moieties. Additionally, it can be utilized in the manipulation of molecules containing toxicophores in such a manner that their toxicities might be eliminated.
Gold, Laura S; Klein, Gregory; Carr, Lauren; Kessler, Larry; Sullivan, Sean D
2012-01-25
In this article, we trace the chronology of developments in breast imaging technologies that are used for diagnosis and staging of breast cancer, including mammography, ultrasonography, magnetic resonance imaging, computed tomography, and positron emission tomography. We explore factors that affected clinical acceptance and utilization of these technologies from discovery to clinical use, including milestones in peer-reviewed publication, US Food and Drug Administration approval, reimbursement by payers, and adoption into clinical guidelines. Utilization of new imaging technologies is driven mainly by regulatory approval and reimbursement by payers rather than by evidence that they provide benefits to patients. Comparative effectiveness research can serve as a useful tool to investigate whether these imaging modalities provide information that improves patient outcomes in real-world settings.
Using multimedia virtual patients to enhance the clinical curriculum for medical students.
McGee, J B; Neill, J; Goldman, L; Casey, E
1998-01-01
Changes in the environment in which clinical medical education takes place in the United States have profoundly affected the quality of the learning experience. A shift to outpatient-based care, minimization of hospitalization time, and shrinking clinical revenues have changed the teaching hospital or "classroom" to a degree that we must develop innovative approaches to medical education. One solution is the Virtual Patient Project. Utilizing state-of-the-art computer-based multimedia technology, we are building a library of simulated patient encounters that will serve to fill some of the educational gaps that the current health care system has created. This project is part of a newly formed and unique organization, the Harvard Medical School-Beth Israel Deaconess Mount Auburn Institute for Education and Research (the Institute), which supports in-house educational design, production, and faculty time to create Virtual Patients. These problem-based clinical cases allow the medical student to evaluate a patient at initial presentation, order diagnostic tests, observe the outcome and obtain context-sensitive feedback through a computer program designed at the Institute. Multimedia technology and authoring programs have reached a level of sophistication that allows content experts (the teaching faculty) to design and create the majority of the program themselves and allows students to adapt the program to their individual learning needs.
Advanced imaging in acute stroke management-Part I: Computed tomographic.
Saini, Monica; Butcher, Ken
2009-01-01
Neuroimaging is fundamental to stroke diagnosis and management. Non-contrast computed tomography (NCCT) has been the primary imaging modality utilized for this purpose for almost four decades. Although NCCT does permit identification of intracranial hemorrhage and parenchymal ischemic changes, insights into blood vessel patency and cerebral perfusion are limited. Advances in reperfusion strategies have made identification of potentially salvageable brain tissue a more practical concern. Advances in CT technology now permit identification of acute and chronic arterial lesions, as well as cerebral blood flow deficits. This review outlines principles of advanced CT image acquisition and its utility in acute stroke management.
Strategic Planning for Computer-Based Educational Technology.
ERIC Educational Resources Information Center
Bozeman, William C.
1984-01-01
Offers educational practitioners direction for the development of a master plan for the implementation and application of computer-based educational technology by briefly examining computers in education, discussing organizational change from a theoretical perspective, and presenting an overview of the planning strategy known as the planning and…
NASA Technical Reports Server (NTRS)
Smith, Jeffrey
2003-01-01
The Bio-Visualization, Imaging and Simulation (BioVIS) Technology Center at NASA's Ames Research Center is dedicated to developing and applying advanced visualization, computation and simulation technologies to support NASA Space Life Sciences research and the objectives of the Fundamental Biology Program. Research ranges from high resolution 3D cell imaging and structure analysis, virtual environment simulation of fine sensory-motor tasks, computational neuroscience and biophysics to biomedical/clinical applications. Computer simulation research focuses on the development of advanced computational tools for astronaut training and education. Virtual Reality (VR) and Virtual Environment (VE) simulation systems have become important training tools in many fields from flight simulation to, more recently, surgical simulation. The type and quality of training provided by these computer-based tools ranges widely, but the value of real-time VE computer simulation as a method of preparing individuals for real-world tasks is well established. Astronauts routinely use VE systems for various training tasks, including Space Shuttle landings, robot arm manipulations and extravehicular activities (space walks). Currently, there are no VE systems to train astronauts for basic and applied research experiments which are an important part of many missions. The Virtual Glovebox (VGX) is a prototype VE system for real-time physically-based simulation of the Life Sciences Glovebox where astronauts will perform many complex tasks supporting research experiments aboard the International Space Station. The VGX consists of a physical display system utilizing dual LCD projectors and circular polarization to produce a desktop-sized 3D virtual workspace. Physically-based modeling tools (Arachi Inc.) provide real-time collision detection, rigid body dynamics, physical properties and force-based controls for objects. The human-computer interface consists of two magnetic tracking devices (Ascension Inc.) attached to instrumented gloves (Immersion Inc.) which co-locate the user's hands with hand/forearm representations in the virtual workspace. Force-feedback is possible in a work volume defined by a Phantom Desktop device (SensAble Inc.). Graphics are written in OpenGL. The system runs on a 2.2 GHz Pentium 4 PC. The prototype VGX provides astronauts and support personnel with a real-time physically-based VE system to simulate basic research tasks both on Earth and in the microgravity of Space. The immersive virtual environment of the VGX also makes it a useful tool for virtual engineering applications including CAD development, procedure design and simulation of human-system interactions in a desktop-sized work volume.
26 CFR 1.168(j)-1T - Questions and answers concerning tax-exempt entity leasing rules (temporary).
Code of Federal Regulations, 2011 CFR
2011-04-01
... technological equipment” means (1) any computer or peripheral equipment, (2) any high technology telephone..., electromechanical, or computer-based high technology equipment which is tangible personal property used in the... before the expiration of its physical useful life. High technology medical equipment may include computer...
NASA Astrophysics Data System (ADS)
Grujicic, M.; Arakere, G.; Hariharan, A.; Pandurangan, B.
2012-06-01
The introduction of newer joining technologies like the so-called friction-stir welding (FSW) into automotive engineering requires knowledge of the joint-material microstructure and properties. Since the development of vehicles (including military vehicles capable of surviving blast and ballistic impacts) nowadays involves extensive use of computational engineering analyses (CEA), robust high-fidelity material models are needed for the FSW joints. A two-level material-homogenization procedure is proposed and utilized in this study to help manage computational cost and computer storage requirements for such CEAs. The method utilizes experimental (microstructure, microhardness, tensile testing, and x-ray diffraction) data to construct: (a) the material model for each weld zone and (b) the material model for the entire weld. The procedure is validated by comparing its predictions with the predictions of more detailed but more costly computational analyses.
ERIC Educational Resources Information Center
Hyland, Matthew R.; Pinto-Zipp, Genevieve; Olson, Valerie; Lichtman, Steven W.
2010-01-01
Technological advancements and competition in student recruitment have challenged educational institutions to expand upon traditional teaching methods in order to attract, engage and retain students. One strategy to meet this shift from educator-directed teaching to student-centered learning is greater computer utilization as an integral aspect of…
Using Neural Net Technology To Enhance the Efficiency of a Computer Adaptive Testing Application.
ERIC Educational Resources Information Center
Van Nelson, C.; Henriksen, Larry W.
The potential for computer adaptive testing (CAT) has been well documented. In order to improve the efficiency of this process, it may be possible to utilize a neural network, or more specifically, a back propagation neural network. The paper asserts that in order to accomplish this end, it must be shown that grouping examinees by ability as…
ERIC Educational Resources Information Center
Li, Yi
2012-01-01
This study focuses on the issue of learning equity in colleges and universities where teaching and learning have come to depend heavily on computer technologies. The study uses the Multiple Indicators Multiple Causes (MIMIC) latent variable model to quantitatively investigate whether there is a gender /ethnicity difference in using computer based…
Life-Span Differences in the Uses and Gratifications of Tablets: Implications for Older Adults
Magsamen-Conrad, Kate; Dowd, John; Abuljadail, Mohammad; Alsulaiman, Saud; Shareefi, Adnan
2015-01-01
This study extends Uses and Gratifications theory by examining the uses and gratifications of a new technological device, the tablet computer, and investigating the differential uses and gratifications of tablet computers across the life-span. First, we utilized a six-week tablet training intervention to adapt and extend existing measures to the tablet as a technological device. Next, using paper-based and online surveys (N=847), we confirmed four main uses of tablets: 1) Information Seeking, 2) Relationship Maintenance, 3) Style, 4) Amusement and Killing time, and added one additional use category, 5) Organization. We discovered differences among the five main uses of tablets across the life-span, with older adults using tablets the least overall. Builders, Boomers, GenX and GenY all reported the highest means for information seeking. Finally, we used a structural equation model to examine how uses and gratifications predict hours of tablet use. The study provides limitations and suggestions for future research and marketers. In particular, this study offers insight into the relevance of theory as it applies to particular information and communication technologies and consideration of how different periods in the life-span affect tablet motivations. PMID:26113769
Song, Seung-Joon; Choi, Jaesoon; Park, Yong-Doo; Lee, Jung-Joo; Hong, So Young; Sun, Kyung
2010-11-01
Bioprinting is an emerging technology for constructing tissue or bioartificial organs with complex three-dimensional (3D) structures. It provides high-precision spatial shape forming ability on a larger scale than conventional tissue engineering methods, and simultaneous multiple component composition ability. Bioprinting utilizes a computer-controlled 3D printer mechanism for 3D biological structure construction. To achieve minimal pattern width in a hydrogel-based bioprinting system, a study on printing characteristics was performed by varying printer control parameters. The experimental results showed that printed pattern width depends on printer control parameters such as printing flow rate, nozzle diameter, and nozzle velocity. The system under development showed acceptable feasibility for accurate printing pattern implementation in tissue engineering applications and is another example of novel techniques for regenerative medicine based on a computer-aided biofabrication system.
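The dependence of pattern width on flow rate and nozzle velocity noted above can be illustrated with a simple volume-conservation estimate. This is a generic first-order approximation under an assumed constant line height, not the model or parameter values used in the study.

```python
# First-order estimate of printed line width from volume conservation,
# assuming the extruded hydrogel forms a ribbon of constant height.
# All values are hypothetical; this is not the study's model or data.
def line_width_mm(flow_ul_per_s: float, nozzle_speed_mm_per_s: float,
                  line_height_mm: float) -> float:
    flow_mm3_per_s = flow_ul_per_s  # 1 microliter equals 1 cubic millimeter
    return flow_mm3_per_s / (nozzle_speed_mm_per_s * line_height_mm)

# Example: 0.5 uL/s flow, 10 mm/s nozzle speed, 0.2 mm line height -> ~0.25 mm width.
print(f"estimated width: {line_width_mm(0.5, 10.0, 0.2):.2f} mm")
```

Under this assumption, width falls as nozzle speed rises and grows with flow rate, which is consistent with the qualitative trend the abstract reports.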
The digital transformation of health care.
Coile, R C
2000-01-01
The arrival of the Internet offers the opportunity to fundamentally reinvent medicine and health care delivery. The "e-health" era is nothing less than the digital transformation of the practice of medicine, as well as the business side of the health industry. Health care is only now arriving in the "Information Economy." The Internet is the next frontier of health care. Health care consumers are flooding into cyberspace, and an Internet-based industry of health information providers is springing up to serve them. Internet technology may rank with antibiotics, genetics, and computers as among the most important changes for medical care delivery. The use of e-health strategies will expand exponentially in the next five years, as America's health care executives shift to applying IS/IT (information systems/information technology) to the fundamental business and clinical processes of the health care enterprise. Internet-savvy physician executives will provide a bridge between medicine and management in the adoption of e-health technology.
Computer-Based Education (CBE): Tomorrow's Traditional System.
ERIC Educational Resources Information Center
Rizza, Peter J., Jr.
1981-01-01
Examines the role of computer technology in education; discusses reasons for the slow evolution of Computer-Based Education (CBE); explores educational areas in which CBE can be used; presents barriers to widespread use of CBE; and describes the responsibilities of education, government, and business in supporting technology-oriented education.…
Investigation into Cloud Computing for More Robust Automated Bulk Image Geoprocessing
NASA Technical Reports Server (NTRS)
Brown, Richard B.; Smoot, James C.; Underwood, Lauren; Armstrong, C. Duane
2012-01-01
Geospatial resource assessments frequently require timely geospatial data processing that involves large multivariate remote sensing data sets. In particular, for disasters, response requires rapid access to large data volumes, substantial storage space and high performance processing capability. The processing and distribution of this data into usable information products requires a processing pipeline that can efficiently manage the required storage, computing utilities, and data handling requirements. In recent years, with the availability of cloud computing technology, cloud processing platforms have made available a powerful new computing infrastructure resource that can meet this need. To assess the utility of this resource, this project investigates cloud computing platforms for bulk, automated geoprocessing capabilities with respect to data handling and application development requirements. This presentation describes work being conducted by the Applied Sciences Program Office at NASA-Stennis Space Center. A prototypical set of image manipulation and transformation processes that incorporate sample Unmanned Airborne System data were developed to create value-added products and tested for implementation on the "cloud". This project outlines the steps involved in creating and testing open source process code developed on a local prototype platform, and then transitioning this code with associated environment requirements into an analogous, but memory and processor enhanced, cloud platform. A data processing cloud was used to store both standard digital camera panchromatic and multi-band image data, which were subsequently subjected to standard image processing functions such as NDVI (Normalized Difference Vegetation Index), NDMI (Normalized Difference Moisture Index), band stacking, reprojection, and other similar data processes. Cloud infrastructure service providers were evaluated by taking these locally tested processing functions and applying them to a given cloud-enabled infrastructure to assess and compare environment setup options and enabled technologies. This project reviews the findings observed when cloud platforms were evaluated for bulk geoprocessing capabilities based on data handling and application development requirements.
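The band-math steps named above (NDVI, NDMI) follow standard normalized-difference definitions. The minimal NumPy sketch below, with hypothetical band arrays, shows the per-pixel computation that such a pipeline would run at scale; it is an illustration, not the project's processing code.

```python
# Standard normalized-difference indices computed per pixel with NumPy.
# The reflectance arrays are tiny synthetic placeholders; in the pipeline they
# would be full image bands read from cloud storage.
import numpy as np

nir  = np.array([[0.52, 0.61], [0.48, 0.70]])  # near-infrared band
red  = np.array([[0.21, 0.18], [0.25, 0.15]])  # red band
swir = np.array([[0.30, 0.27], [0.33, 0.22]])  # shortwave-infrared band

eps = 1e-9  # guard against division by zero on dark pixels
ndvi = (nir - red) / (nir + red + eps)    # Normalized Difference Vegetation Index
ndmi = (nir - swir) / (nir + swir + eps)  # Normalized Difference Moisture Index

print(ndvi)
print(ndmi)
```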
Direct Methanol Fuel Cell Power Supply For All-Day True Wireless Mobile Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brian Wells
PolyFuel has developed state-of-the-art portable fuel cell technology for the portable computing market. A novel approach to passive water recycling within the MEA has led to significant system simplification and size reduction. Miniature stack technology with very high area utilization and minimalist seals has been developed. A highly integrated balance of plant with very low parasitic losses has been constructed around the new stack design. Demonstration prototype systems integrated with laptop computers have been shown in recent months to leading OEM computer manufacturers. PolyFuel intends to provide this technology to its customers as a reference design as a means of accelerating the commercialization of portable fuel cell technology. The primary goal of the project was to match the energy density of a commercial lithium ion battery for laptop computers. PolyFuel made large strides against this goal and has now demonstrated 270 Wh/liter compared with lithium ion energy densities of 300 Wh/liter. Further, more incremental, improvements in energy density are envisioned, with additional 20-30% gains possible in each of the next two years given further research and development.
THE TOXCAST PROGRAM FOR PRIORITIZING TOXICITY TESTING OF ENVIRONMENTAL CHEMICALS
The United States Environmental Protection Agency (EPA) is developing methods for utilizing computational chemistry, high-throughput screening (HTS) and various toxicogenomic technologies to predict potential for toxicity and prioritize limited testing resources towards chemicals...
Wong, Vincent Kam-Wai; Law, Betty Yuen-Kwan; Yao, Xiao-Jun; Chen, Xi; Xu, Su Wei; Liu, Liang; Leung, Elaine Lai-Han
2016-09-01
Traditional biotechnology has long been utilized by human civilization in many aspects of daily life, such as wine and vinegar production, in which micro-organisms generate new phytochemicals from natural products. Today, advanced biotechnology offers diverse applications and advantages, not only increasing the diversity and composition of herbal phytochemicals but also helping to elucidate treatment mechanisms and accelerate new drug discovery from Chinese herbal medicine (CHM). Phytochemical biotechnologies and microbial biotechnologies have been applied to enhance phytochemical diversity. Cell labeling and imaging technology and -omics technology have been utilized to elucidate CHM treatment mechanisms. Computational methods, such as chemoinformatics and bioinformatics, provide new insights into the direct targets of CHM. Overall, these technologies provide efficient ways to overcome the bottlenecks of CHM, helping to increase phytochemical diversity, match phytochemicals to their molecular targets, and elucidate treatment mechanisms. Potentially, new oriented herbal phytochemicals and their corresponding drug targets can be identified. Looking ahead, tighter integration of multi-disciplinary biotechnology and computational technology will be the cornerstone of new advances and revolutions in the fields of CHM and the world pharmaceutical industry.
Gonzalez, Roxana; O'Brien-Barry, Patricia; Ancheta, Reginaldo; Razal, Rennuel; Clyne, Mary Ellen
A quasiexperimental study was conducted to determine which teaching modality, peer education or computer-based education, improves the utilization of the library electronic databases and thereby evidence-based knowledge at the point of care. No significant differences were found between the teaching modalities. However, the study identified the need to explore professional development teaching modalities outside the traditional classroom to support an evidence-based practice healthcare environment.
Wunderlich, Adam; Abbey, Craig K
2013-11-01
Studies of lesion detectability are often carried out to evaluate medical imaging technology. For such studies, several approaches have been proposed to measure observer performance, such as the receiver operating characteristic (ROC), the localization ROC (LROC), the free-response ROC (FROC), the alternative free-response ROC (AFROC), and the exponentially transformed FROC (EFROC) paradigms. Therefore, an experimenter seeking to carry out such a study is confronted with an array of choices. Traditionally, arguments for different approaches have been made on the basis of practical considerations (statistical power, etc.) or the gross level of analysis (case-level or lesion-level). This article contends that a careful consideration of utility should form the rationale for matching the assessment paradigm to the clinical task of interest. In utility theory, task performance is commonly evaluated with total expected utility, which integrates the various event utilities against the probability of each event. To formalize the relationship between expected utility and the summary curve associated with each assessment paradigm, the concept of a "natural" utility structure is proposed. A natural utility structure is defined for a summary curve when the variables associated with the summary curve axes are sufficient for computing total expected utility, assuming that the disease prevalence is known. Natural utility structures for ROC, LROC, FROC, AFROC, and EFROC curves are introduced, clarifying how the utilities of correct and incorrect decisions are aggregated by summary curves. Further, conditions are given under which general utility structures for localization-based methodologies reduce to case-based assessment. Overall, the findings reveal how summary curves correspond to natural utility structures of diagnostic tasks, suggesting utility as a motivating principle for choosing an assessment paradigm.
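As a concrete illustration of the aggregation the authors describe, total expected utility for a binary detection task can be written explicitly. The notation below is generic (prevalence p, true- and false-positive fractions TPF/FPF, outcome utilities U), chosen for illustration rather than taken from the paper's own formulation.

```latex
% Generic total expected utility for a binary detection task with disease prevalence p.
% TPF and FPF are the true- and false-positive fractions at the chosen operating point;
% U_{TP}, U_{FN}, U_{FP}, U_{TN} are the utilities of the four decision outcomes.
\mathrm{EU} = p\left[\mathrm{TPF}\,U_{TP} + (1-\mathrm{TPF})\,U_{FN}\right]
            + (1-p)\left[\mathrm{FPF}\,U_{FP} + (1-\mathrm{FPF})\,U_{TN}\right]
```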
Understanding How Adolescents with Reading Difficulties Utilize Technology-Based Tools
ERIC Educational Resources Information Center
Marino, Matthew T.
2009-01-01
This article reports the findings from a study that examined how adolescent students with reading difficulties utilized cognitive tools that were embedded in a technology-based middle school science curriculum. The curriculum contained salient features of the Universal Design for Learning (UDL) theoretical framework. Sixteen general education…
NASA Technical Reports Server (NTRS)
Ewell, Robert N.
1994-01-01
The U.S. Space Foundation displayed its prototype Space Technology Hall of Fame exhibit design at the Technology 2003 conference in Anaheim, CA, December 7-9, 1993. In order to sample public opinion on space technology in general and the exhibit in particular, a computer-based survey was set up as a part of the display. The data collected was analyzed.
The Diffusion of Computer-Based Technology in K-12 Schools: Teachers' Perspectives
ERIC Educational Resources Information Center
Colandrea, John Louis
2012-01-01
Because computer technology represents a major financial outlay for school districts and is an efficient method of preparing and delivering lessons, studying the process of teacher adoption of computer use is beneficial and adds to the current body of knowledge. Because the teacher is the ultimate user of computer technology for lesson preparation…
NASA Technical Reports Server (NTRS)
1994-01-01
The NASA-OAI High Performance Communication and Computing K-12 School Partnership program has been completed. Cleveland School of the Arts, Empire Computech Center, Grafton Local Schools and the Bug O Nay Ge Shig School have all received network equipment and connections. Each school is working toward integrating computer and communications technology into their classroom curriculum. Cleveland School of the Arts students are creating computer software. Empire Computech Center is a magnet school for technology education at the elementary school level. Grafton Local Schools is located in a rural community and is using communications technology to bring to their students some of the same benefits students from suburban and urban areas receive. The Bug O Nay Ge Shig School is located on an Indian Reservation in Cass Lake, MN. The students at this school are using the computer to help them with geological studies. A grant has been issued to the Friends of the Nashville Library. Nashville is a small township in Holmes County, Ohio. A community organization has been formed to turn their library into a state-of-the-art Media Center. Their goal is to have a place where rural students can learn about different career options and how to go about pursuing those careers. Taylor High School in Cincinnati, Ohio, was added to the schools involved in the Wind Tunnel Project. A mini grant has been awarded to Taylor High School for computer equipment. The computer equipment is utilized in the school's geometry class to computationally design objects which will be tested for their aerodynamic properties in the Barberton Wind Tunnel. The students who create the models can view the test in the wind tunnel via desktop conferencing. Two teachers received stipends for helping with the Regional Summer Computer Workshop. Both teachers were brought in to teach a session within the workshop. They were selected to teach the session based on their expertise in particular software applications.
The female knee: anatomic variations.
Conley, Sheryl; Rosenberg, Aaron; Crowninshield, Roy
2007-01-01
Traditional knee implants have been designed "down the middle," based on the combined average size and shape of male and female knee anatomy. Sex-based research in the field of orthopaedics has led to new understanding of the anatomic differences between the sexes and the associated implications for women undergoing total knee arthroplasty. Through the use of a comprehensive bone morphology atlas that utilizes novel three-dimensional computed tomography analysis technology, significant anatomic differences have been documented in the shape and size of female knees compared with male knees. This research identifies three notable anatomic differences in the female population: a less prominent anterior condyle, an increased Q angle, and a reduced medial-lateral:anterior-posterior aspect ratio.
NASA Astrophysics Data System (ADS)
Newman, Gregory A.; Commer, Michael
2009-07-01
Three-dimensional (3D) geophysical imaging is now receiving considerable attention for electrical conductivity mapping of potential offshore oil and gas reservoirs. The imaging technology employs controlled source electromagnetic (CSEM) and magnetotelluric (MT) fields and treats geological media exhibiting transverse anisotropy. Moreover, when combined with established seismic methods, direct imaging of reservoir fluids is possible. Because of the size of the 3D conductivity imaging problem, strategies exploiting computational parallelism and optimal meshing are required. The algorithm thus developed has been shown to scale to tens of thousands of processors. In one imaging experiment, 32,768 tasks/processors on the IBM Watson Research Blue Gene/L supercomputer were successfully utilized. Over a 24 hour period we were able to image a large scale field data set that previously required over four months of processing time on distributed clusters based on Intel or AMD processors utilizing 1024 tasks on an InfiniBand fabric. Electrical conductivity imaging using massively parallel computational resources produces results that cannot be obtained otherwise and are consistent with timeframes required for practical exploration problems.
From Ambiguities to Insights: Query-based Comparisons of High-Dimensional Data
NASA Astrophysics Data System (ADS)
Kowalski, Jeanne; Talbot, Conover; Tsai, Hua L.; Prasad, Nijaguna; Umbricht, Christopher; Zeiger, Martha A.
2007-11-01
Genomic technologies will revolutionize drug discovery and development; that much is universally agreed upon. The high dimension of data from such technologies has challenged available data analytic methods; that much is apparent. To date, large-scale data repositories have not been utilized in ways that permit their wealth of information to be efficiently processed for knowledge, presumably due in large part to inadequate analytical tools to address numerous comparisons of high-dimensional data. In candidate gene discovery, expression comparisons are often made between two features (e.g., cancerous versus normal), such that the enumeration of outcomes is manageable. With multiple features, the setting becomes more complex, in terms of comparing expression levels of tens of thousands of transcripts across hundreds of features. In this case, the number of outcomes, while enumerable, becomes rapidly large and unmanageable, and scientific inquiries become more abstract, such as "which one of these (compounds, stimuli, etc.) is not like the others?" We develop analytical tools that promote more extensive, efficient, and rigorous utilization of the public data resources generated by the massive support of genomic studies. Our work innovates by enabling access to such metadata with logically formulated scientific inquiries that define, compare and integrate query-comparison pair relations for analysis. We demonstrate our computational tool's potential to address an outstanding biomedical informatics issue of identifying reliable molecular markers in thyroid cancer. Our proposed query-based comparison (QBC) facilitates access to and efficient utilization of metadata through logically formed inquiries expressed as query-based comparisons by organizing and comparing results from biotechnologies to address applications in biomedicine.
NASA Astrophysics Data System (ADS)
Panitkin, Sergey; Barreiro Megino, Fernando; Caballero Bejar, Jose; Benjamin, Doug; Di Girolamo, Alessandro; Gable, Ian; Hendrix, Val; Hover, John; Kucharczyk, Katarzyna; Medrano Llamas, Ramon; Love, Peter; Ohman, Henrik; Paterson, Michael; Sobie, Randall; Taylor, Ryan; Walker, Rodney; Zaytsev, Alexander; Atlas Collaboration
2014-06-01
The computing model of the ATLAS experiment was designed around the concept of grid computing and, since the start of data taking, this model has proven very successful. However, new cloud computing technologies bring attractive features to improve the operations and elasticity of scientific distributed computing. ATLAS sees grid and cloud computing as complementary technologies that will coexist at different levels of resource abstraction, and two years ago created an R&D working group to investigate the different integration scenarios. The ATLAS Cloud Computing R&D has been able to demonstrate the feasibility of offloading work from grid to cloud sites and, as of today, is able to transparently integrate various cloud resources into the PanDA workload management system. The ATLAS Cloud Computing R&D is operating various PanDA queues on private and public resources and has provided several hundred thousand CPU days to the experiment. As a result, the ATLAS Cloud Computing R&D group has gained a significant insight into the cloud computing landscape and has identified points that still need to be addressed in order to fully utilize this technology. This contribution will explain the cloud integration models that are being evaluated and will discuss the lessons ATLAS has learned while collaborating with leading commercial and academic cloud providers.
NASA Technical Reports Server (NTRS)
2002-01-01
The accuDEXA(R) Bone Mineral Density Assessment System, manufactured by Schick Technologies, Inc., utilizes "camera on a chip" sensor technology invented and developed by NASA's Jet Propulsion Laboratory. Schick's accuDEXA system offers several advantages over traditional osteoporosis tests, which assess bone density loss in the hip and spine, and require specialized personnel to conduct. With accuDEXA, physicians can test the entire body's bone density at a peripheral site, such as the finger, without applying gels or having patients remove garments. Results are achieved in 30 seconds and printed out in less than a minute, compared to the estimated exam time of 15 minutes for hip and spine density analyses. Schick has also applied the CMOS APS technology to a new software product that performs dental radiography using up to 90 percent less radiation exposure than conventional X-rays. Called Computed Dental Radiography(R), the new digital imaging product utilizes an electronic sensor in place of X-ray film to generate sharp and clear images that appear on a computer screen within 3 seconds, and can be enlarged and enhanced to identify problems.
NASA Astrophysics Data System (ADS)
Nelson, Mathew
In today's age of exponential change and technological advancement, awareness of any gender gap in technology and computer science-related fields is crucial, but further research must be done to better understand the complex interacting factors contributing to the gender gap. This study utilized a survey to investigate specific gender differences relating to computing self-efficacy, computer usage, and the environmental factors of exposure, personal interests, and parental influence that impact gender differences of high school students within a one-to-one computing environment in South Dakota. The population who completed the One-to-One High School Computing Survey for this study consisted of South Dakota high school seniors who had been involved in a one-to-one computing environment for two or more years. The data from the survey were analyzed using descriptive and inferential statistics for the determined variables. From the literature review and the data analysis, several conclusions were drawn. Among them are that, overall, there was very little difference in perceived computing self-efficacy and computing anxiety between male and female students within the one-to-one computing initiative. The study supported the current research that males and females utilized computers similarly, but males spent more time using their computers to play online games. Early exposure to computers, or the age at which the student was first exposed to a computer, and the number of computers present in the home (computer ownership) impacted computing self-efficacy. The results also indicated that parental encouragement to work with computers contributed positively to both male and female students' computing self-efficacy. Finally, the study found that both mothers and fathers encouraged their male children more than their female children to work with computing and pursue careers in computer science fields.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 7 Agriculture 11 2010-01-01 2010-01-01 false Purpose. 1703.100 Section 1703.100 Agriculture Regulations of the Department of Agriculture (Continued) RURAL UTILITIES SERVICE, DEPARTMENT OF AGRICULTURE... telecommunications, computer networks, and related advanced technologies by students, teachers, medical professionals...
Technology and Current Reading/Literacy Assessment Strategies
ERIC Educational Resources Information Center
Balajthy, Ernest
2007-01-01
Computer-based technologies offer promise as a means to assess students and provide teachers with better understandings of their students' achievement. This article describes recent developments in computer-based and web-based reading and literacy assessment, focusing on assessment administration, information management, and report creation. In…
Launch Site Computer Simulation and its Application to Processes
NASA Technical Reports Server (NTRS)
Sham, Michael D.
1995-01-01
This paper provides an overview of computer simulation, the Lockheed-developed STS Processing Model, and the application of computer simulation to a wide range of processes. The STS Processing Model is an icon-driven model that uses commercial off-the-shelf software and a Macintosh personal computer. While it usually takes one year to process and launch 8 space shuttles, with the STS Processing Model this process is computer simulated in about 5 minutes. Facilities, orbiters, or ground support equipment can be added or deleted and the impact on launch rate, facility utilization, or other factors measured as desired. This same computer simulation technology can be used to simulate manufacturing, engineering, commercial, or business processes. The technology does not require an 'army' of software engineers to develop and operate, but instead can be used by the layman with only a minimal amount of training. Instead of making changes to a process and realizing the results after the fact, with computer simulation, changes can be made and processes perfected before they are implemented.
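As a generic illustration of the kind of discrete-event process simulation described above, the sketch below models processing flows competing for a limited number of facilities using the SimPy library. The facility count, arrival spacing, and processing duration are invented for illustration; this is not the Lockheed STS Processing Model itself.

```python
# Generic discrete-event sketch of facility utilization with SimPy.
# Facility count, arrival spacing, and processing time are hypothetical.
import simpy

PROCESSING_DAYS = 70  # assumed processing duration per flow

def processing_flow(env, name, bays, arrival_day):
    """Arrive, wait for a free processing bay, then occupy it for processing."""
    yield env.timeout(arrival_day)
    with bays.request() as req:
        queued = env.now
        yield req
        yield env.timeout(PROCESSING_DAYS)
        wait = env.now - queued - PROCESSING_DAYS
        print(f"{name}: waited {wait:.0f} days, finished on day {env.now:.0f}")

env = simpy.Environment()
bays = simpy.Resource(env, capacity=2)  # assume two processing bays
for i in range(8):                      # eight flows arriving over the year
    env.process(processing_flow(env, f"flow-{i}", bays, arrival_day=i * 30))
env.run()
```

Adding or removing a bay (the `capacity` argument) and re-running immediately shows the effect on waiting time and throughput, which is the kind of what-if question the paper describes answering in minutes.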
2011-01-01
Background Next-generation sequencing technologies have decentralized sequence acquisition, increasing the demand for new bioinformatics tools that are easy to use, portable across multiple platforms, and scalable for high-throughput applications. Cloud computing platforms provide on-demand access to computing infrastructure over the Internet and can be used in combination with custom-built virtual machines to distribute pre-packaged, pre-configured software. Results We describe the Cloud Virtual Resource, CloVR, a new desktop application for push-button automated sequence analysis that can utilize cloud computing resources. CloVR is implemented as a single portable virtual machine (VM) that provides several automated analysis pipelines for microbial genomics, including 16S, whole genome and metagenome sequence analysis. The CloVR VM runs on a personal computer, utilizes local computer resources and requires minimal installation, addressing key challenges in deploying bioinformatics workflows. In addition, CloVR supports the use of remote cloud computing resources to improve performance for large-scale sequence processing. In a case study, we demonstrate the use of CloVR to automatically process next-generation sequencing data on multiple cloud computing platforms. Conclusion The CloVR VM and associated architecture lowers the barrier of entry for utilizing complex analysis protocols on both local single- and multi-core computers and cloud systems for high-throughput data processing. PMID:21878105
General aviation design synthesis utilizing interactive computer graphics
NASA Technical Reports Server (NTRS)
Galloway, T. L.; Smith, M. R.
1976-01-01
Interactive computer graphics is a fast growing area of computer application, due to such factors as substantial cost reductions in hardware, general availability of software, and expanded data communication networks. In addition to allowing faster and more meaningful input/output, computer graphics permits the use of data in graphic form to carry out parametric studies for configuration selection and for assessing the impact of advanced technologies on general aviation designs. The incorporation of interactive computer graphics into a NASA developed general aviation synthesis program is described, and the potential uses of the synthesis program in preliminary design are demonstrated.
Cross-Cultural Issues of Office Technology Management: Comparing Canada and the United States.
ERIC Educational Resources Information Center
Gattiker, Urs E.; And Others
Although the internationalization of business makes cross-cultural research on workers' attitudes toward computer-based technology valuable to management, cross-cultural studies are rare. A study was conducted to determine whether employees in the United States differ from Canadian employees in their evaluation of computer-based technology due to…
Risk Assessment Methodology Based on the NISTIR 7628 Guidelines
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abercrombie, Robert K; Sheldon, Frederick T; Hauser, Katie R
2013-01-01
Earlier work describes computational models of critical infrastructure that allow an analyst to estimate the security of a system in terms of the impact of loss per stakeholder resulting from security breakdowns. Here, we consider how to identify, monitor and estimate risk impact and probability for different smart grid stakeholders. Our constructive method leverages currently available standards and defined failure scenarios. We utilize the National Institute of Standards and Technology (NIST) Interagency or Internal Report (NISTIR) 7628 as a basis for applying the Cyberspace Security Econometrics System (CSES) to compare design principles and courses of action in making security-related decisions.
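A generic way to make the stakeholder-oriented estimate concrete is to aggregate expected loss per stakeholder over defined failure scenarios, as sketched below. The scenario names, probabilities, and impact figures are hypothetical, and the sketch is in the spirit of the approach rather than the CSES or NISTIR 7628 method itself.

```python
# Hypothetical expected-loss aggregation per smart-grid stakeholder.
# Scenario probabilities (per year) and impacts (dollars) are invented placeholders.
scenarios = {
    "meter_data_tampering": {"probability": 0.02,
                             "impact": {"utility": 500_000, "consumer": 50_000}},
    "scada_link_compromise": {"probability": 0.01,
                              "impact": {"utility": 2_000_000, "consumer": 250_000}},
}

def expected_annual_loss(stakeholder: str) -> float:
    """Sum of probability-weighted impacts for one stakeholder."""
    return sum(s["probability"] * s["impact"].get(stakeholder, 0.0)
               for s in scenarios.values())

for who in ("utility", "consumer"):
    print(f"{who}: ${expected_annual_loss(who):,.0f} expected per year")
```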
Apoptosis and Self-Destruct: A Contribution to Autonomic Agents?
NASA Technical Reports Server (NTRS)
Sterritt, Roy; Hinchey, Mike
2004-01-01
Autonomic Computing (AC), a self-managing systems initiative based on the biological metaphor of the autonomic nervous system, is increasingly gaining momentum as the way forward in designing reliable systems. Agent technologies have been identified as a key enabler for engineering autonomicity in systems, both in terms of retrofitting autonomicity into legacy systems and designing new systems. The AC initiative provides an opportunity to consider other biological systems and principles in seeking new design strategies. This paper reports on one such investigation: utilizing the apoptosis metaphor of biological systems to provide a dynamic health indicator signal between autonomic agents.
Simultaneous single-shot readout of multi-qubit circuits using a traveling-wave parametric amplifier
NASA Astrophysics Data System (ADS)
O'Brien, Kevin
Observing and controlling the state of ever larger quantum systems is critical for advancing quantum computation. Utilizing a Josephson traveling wave parametric amplifier (JTWPA), we demonstrate simultaneous multiplexed single shot readout of 10 transmon qubits in a planar architecture. We employ digital image sideband rejection to eliminate noise at the image frequencies. We quantify crosstalk and infidelity due to simultaneous readout and control of multiple qubits. Based on current amplifier technology, this approach can scale to simultaneous readout of at least 20 qubits. This work was supported by the Army Research Office.
Cloudbus Toolkit for Market-Oriented Cloud Computing
NASA Astrophysics Data System (ADS)
Buyya, Rajkumar; Pandey, Suraj; Vecchiola, Christian
This keynote paper: (1) presents the 21st century vision of computing and identifies various IT paradigms promising to deliver computing as a utility; (2) defines the architecture for creating market-oriented Clouds and computing atmosphere by leveraging technologies such as virtual machines; (3) provides thoughts on market-based resource management strategies that encompass both customer-driven service management and computational risk management to sustain SLA-oriented resource allocation; (4) presents the work carried out as part of our new Cloud Computing initiative, called Cloudbus: (i) Aneka, a Platform as a Service software system containing SDK (Software Development Kit) for construction of Cloud applications and deployment on private or public Clouds, in addition to supporting market-oriented resource management; (ii) internetworking of Clouds for dynamic creation of federated computing environments for scaling of elastic applications; (iii) creation of 3rd party Cloud brokering services for building content delivery networks and e-Science applications and their deployment on capabilities of IaaS providers such as Amazon along with Grid mashups; (iv) CloudSim supporting modelling and simulation of Clouds for performance studies; (v) Energy Efficient Resource Allocation Mechanisms and Techniques for creation and management of Green Clouds; and (vi) pathways for future research.
ERIC Educational Resources Information Center
Moore, Antionette L.
2012-01-01
The purpose of this qualitative study was to explore how the computer is utilized in the daily lives of seven African American male youth in the southeastern region of the United States. Critical pedagogy was selected as the theoretical framework using Paulo Freire ideas of problem-posing education to promote awareness towards using the computer…
Thermal Transfer Compared To The Fourteen Other Imaging Technologies
NASA Astrophysics Data System (ADS)
O'Leary, John W.
1989-07-01
A quiet revolution in the world of imaging has been underway for the past few years. The older technologies of dot matrix, daisy wheel, thermal paper and pen plotters have been increasingly displaced by laser, ink jet and thermal transfer. The net result of this revolution is improved technologies that afford superior imaging, quiet operation, plain paper usage, instant operation, and solid state components. Thermal transfer is one of the processes that incorporates these benefits. Among the imaging applications for thermal transfer are: 1. Bar code labeling and scanning. 2. New systems for airline ticketing, boarding passes, reservations, etc. 3. Color computer graphics and imaging. 4. Copying machines that copy in color. 5. Fast growing communications media such as facsimile. 6. Low cost word processors and computer printers. 7. New devices that print pictures from video cameras or television sets. 8. Cameras utilizing computer chips in place of film.
Design and implementation of spatial knowledge grid for integrated spatial analysis
NASA Astrophysics Data System (ADS)
Liu, Xiangnan; Guan, Li; Wang, Ping
2006-10-01
Supported by the spatial information grid (SIG), the spatial knowledge grid (SKG) for integrated spatial analysis utilizes middleware technology to construct the spatial information grid computing environment and spatial information service system, develops spatial-entity-oriented data organization technology, and performs in-depth computation of spatial structure and spatial process patterns on the basis of the Grid GIS infrastructure, the spatial data grid, and the spatial information grid (in its specialized sense). At the same time, it supports complex spatial pattern expression and spatial process simulation by taking spatial intelligent agents as the core of active spatial computation. Moreover, through a virtual geographical environment with man-machine interactivity, it enables complex spatial modeling, networked cooperative work, and knowledge-driven spatial community decision making. The framework of the SKG is discussed systematically in this paper, and its implementation flow and key technologies are presented with examples of overlay analysis.
Cisco Networking Academy Program for high school students: Formative & summative evaluation
NASA Astrophysics Data System (ADS)
Cranford-Wesley, Deanne
This study examined the effectiveness of the Cisco Network Technology Program in enhancing students' technology skills as measured by classroom strategies, student motivation, student attitude, and student learning. Qualitative and quantitative methods were utilized to determine the effectiveness of this program. The study focused on two 11th grade classrooms at Hamtramck High School. Hamtramck, an inner-city community located in Detroit, is racially and ethnically diverse. The majority of students speak English as a second language; more than 20 languages are represented in the school district. More than 70% of the students are considered to be economically at risk. Few students have computers at home, and their access to the few computers at school is limited. Purposive sampling was conducted for this study. The sample consisted of 40 students, all of whom were trained in Cisco Networking Technologies. The researcher examined viable learning strategies in teaching a Cisco Networking class that focused on a web-based approach. Findings revealed that the Cisco Networking Academy Program was an excellent vehicle for teaching networking skills and, therefore, helping to enhance computer skills for the participating students. However, only a limited number of students were able to participate in the program, due to limited computer labs and lack of qualified teaching personnel. In addition, the cumbersome technical language posed an obstacle to students' success in networking. Laboratory assignments were preferred by 90% of the students over lecture and PowerPoint presentations. Practical applications, lab projects, interactive assignments, PowerPoint presentations, lectures, discussions, readings, research, and assessment all helped to increase student learning and proficiency and to enrich the classroom experience. Classroom strategies are crucial to student success in the networking program. Equipment must be updated and utilized to ensure that students are applying practical skills to networking concepts. The results also suggested a high level of motivation and retention in student participants. Students in both classes scored 80% proficiency on the Achievement Motivation Profile Assessment. The identified standard proficiency score was 70%, and both classes exceeded the standard.
Afify, Ahmed; Haney, Stephan
2016-08-01
Since it was first introduced into the dental world, computer-aided design/computer-aided manufacturing (CAD/CAM) technology has improved dramatically with regard to both data acquisition and fabrication abilities. CAD/CAM is capable of providing well-fitting intra- and extraoral prostheses when sound guidelines are followed. As CAD/CAM technology encompasses both surgical and prosthetic dental applications as well as fixed and removable aspects, it could improve the average quality of dental prostheses compared with the results obtained by conventional manufacturing methods. The purpose of this article is to provide an introduction to the methods by which this technology may be used to enhance the wear and fracture resistance of dentures and overdentures. This article also showcases two clinical reports in which CAD/CAM technology has been implemented.
Courtney-Pratt, Helen; Cummings, Elizabeth; Turner, Paul; Cameron-Tucker, Helen; Wood-Baker, Richard; Walters, Eugene Haydn; Robinson, Andrew Lyle
2012-11-01
Achieving adoption, use, and integration of information and communication technology by healthcare clinicians in the workplace is recognized as a challenge that requires a multifaceted approach. This article explores community health nurses' engagement with information and communication technology as part of a larger research project that investigated the delivery of self-management support to people with chronic obstructive pulmonary disease. Following a survey of computer skills, participants were provided with computer training to support use of the project information system. Changes in practice were explored using action research meetings and individual semistructured interviews. Results highlight three domains that affected nurses' acceptance, utilization, and integration of information and communication technology into practice: environmental issues; factors in building capacity, confidence, and trust in the technology; and developing competence. Nurses face individual and practice challenges when attempting to integrate new processes into work activities, and the use of participatory models to support adoption is recommended.
WISARD: workbench for integrated superfast association studies for related datasets.
Lee, Sungyoung; Choi, Sungkyoung; Qiao, Dandi; Cho, Michael; Silverman, Edwin K; Park, Taesung; Won, Sungho
2018-04-20
A Mendelian transmission produces phenotypic and genetic relatedness between family members, giving family-based analytical methods an important role in genetic epidemiological studies, from heritability estimation to genetic association analysis. With advances in genotyping technologies, whole-genome sequence data can be utilized for genetic epidemiological studies, and family-based samples may become more useful for detecting de novo mutations. However, genetic analyses employing family-based samples usually suffer from the complexity of the computational/statistical algorithms, and certain types of family designs, such as those incorporating data from extended families, have rarely been used. We present a Workbench for Integrated Superfast Association studies for Related Data (WISARD) programmed in C/C++. WISARD enables fast and comprehensive analysis of SNP-chip and next-generation sequencing data on extended families, with applications ranging from designing genetic studies to summarizing analysis results. In addition, WISARD can be run automatically in a fully multithreaded manner, and the integration of R software for visualization makes it more accessible to non-experts. Comparison with existing toolsets showed that WISARD is computationally suitable for integrated analysis of related subjects and that it outperforms existing toolsets. WISARD has also been successfully utilized to analyze a large-scale sequencing dataset for chronic obstructive pulmonary disease (COPD), and we identified multiple genes associated with COPD, which demonstrates its practical value.
Digital optical interconnects for photonic computing
NASA Astrophysics Data System (ADS)
Guilfoyle, Peter S.; Stone, Richard V.; Zeise, Frederick F.
1994-05-01
A 32-bit digital optical computer (DOC II) has been implemented in hardware utilizing 8,192 free-space optical interconnects. The architecture exploits parallel interconnect technology by implementing microcode at the primitive level. A burst mode of 0.8192 × 10^12 binary operations per second has been reliably demonstrated. The prototype has been successful in demonstrating general purpose computation. In addition to emulating the RISC instruction set within the UNIX operating environment, relational database text search operations have been implemented on DOC II.
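As a rough consistency check on the quoted figures (the clock rate below is an inferred assumption, not a number stated in the abstract), the burst rate corresponds to all 8,192 interconnects switching at roughly 100 MHz:

    # Hypothetical back-of-the-envelope check; the 100 MHz clock is an assumption
    # inferred from the numbers in the abstract, not a documented specification.
    interconnects = 8192
    assumed_clock_hz = 100e6                       # 100 MHz (assumed)
    burst_rate = interconnects * assumed_clock_hz  # binary operations per second
    print(f"{burst_rate:.4e} ops/s")               # 8.1920e+11 = 0.8192 x 10^12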
Integrating an Intelligent Tutoring System for TAOs with Second Life
2010-12-01
SL) and interacts with a number of computer-controlled objects that take on the roles of the TAO's teammates. TAOs rely on the same mechanism to...projects that utilize both game and simulation technology for training. He joined Stottler Henke in the fall of 2000 and holds a Ph.D. in computer science...including implementing tutors in multiuser worlds. He has been at Stottler Henke since 2005 and has an MS in computer science from Stanford University
Report to the President on the Use of Technology To Strengthen K-12 Education in the United States.
ERIC Educational Resources Information Center
President's Committee of Advisors on Science and Technology, Washington, DC. Panel on Educational Technology.
While a number of different approaches have been suggested for the improvement of K-12 education in the United States, one common element of many plans is the more extensive and more effective utilization of computer, networking, and other technologies in support of a broad program of systemic and curricular reform. The Panel on Educational…
ERIC Educational Resources Information Center
Ercan, Orhan; Bilen, Kadir
2014-01-01
Advances in computer technologies and adoption of related methods and techniques in education have developed parallel to each other. This study focuses on the need to utilize more than one teaching method and technique in education rather than focusing on a single teaching method. By using the pre-test post-test and control group semi-experimental…
Technological innovations for human outposts on planetary bodies
NASA Technical Reports Server (NTRS)
Clark, Benton C.
1988-01-01
Technology developments which have applications for establishing man-tended outposts on the moon and Mars are reviewed. The development of pressurized rovers and computer-aided control, repair, and manufacturing is discussed. The possibility of utilizing aerodynamic drag by optimizing dynamic pressure to accomplish the necessary spacecraft velocity reduction for planetary orbital capture is considered and research in the development of artificial gravity is examined.
An Examination of the Impact of One-to-One Computing When Used as a Tool for Student Writing
ERIC Educational Resources Information Center
Keppler, Michael P.
2012-01-01
The purpose of this study was to examine whether or not the use of net-book technology in the classroom altered the writing process for elementary students. The impetus for this study stemmed from the fact that more research needs to be conducted utilizing observations and interviews to examine the relationship between one-to-one technology use…
Evolving technologies for Space Station Freedom computer-based workstations
NASA Technical Reports Server (NTRS)
Jensen, Dean G.; Rudisill, Marianne
1990-01-01
Viewgraphs on evolving technologies for Space Station Freedom computer-based workstations are presented. The human-computer software environment modules are described. The following topics are addressed: command and control workstation concept; cupola workstation concept; Japanese experiment module RMS workstation concept; remote devices controlled from workstations; orbital maneuvering vehicle free flyer; remote manipulator system; Japanese experiment module exposed facility; Japanese experiment module small fine arm; flight telerobotic servicer; human-computer interaction; and workstation/robotics related activities.
Revilla León, M; Klemm, I M; García-Arranz, J; Özcan, M
2017-09-01
An edentulous patient was rehabilitated with maxillary metal-ceramic and mandibular metal-resin implant-supported fixed dental prosthesis (FDP). Metal frameworks of the FDPs were fabricated using 3D additive manufacturing technologies utilizing selective laser melting (SLM) and electron beam melting (EBM) processes. Both SLM and EBM technologies were employed in combination with computer numerical control (CNC) post-machining at the implant interface. This report highlights the technical and clinical protocol for fabrication of FDPs using SLM and EBM additive technologies. Copyright© 2017 Dennis Barber Ltd.
The use of wireless laptop computers for computer-assisted learning in pharmacokinetics.
Munar, Myrna Y; Singh, Harleen; Belle, Donna; Brackett, Carolyn C; Earle, Sandra B
2006-02-15
To implement computer-assisted learning workshops into pharmacokinetics courses in a doctor of pharmacy (PharmD) program. Workshops were designed for students to utilize computer software programs on laptop computers to build pharmacokinetic models to predict drug concentrations resulting from various dosage regimens. In addition, students were able to visualize through graphing programs how altering different parameters changed drug concentration-time curves. Surveys were conducted to measure students' attitudes toward computer technology before and after implementation. Finally, traditional examinations were used to evaluate student learning. Doctor of pharmacy students responded favorably to the use of wireless laptop computers in problem-based pharmacokinetic workshops. Eighty-eight percent (n = 61/69) and 82% (n = 55/67) of PharmD students completed surveys before and after computer implementation, respectively. Prior to implementation, 95% of students agreed that computers would enhance learning in pharmacokinetics. After implementation, 98% of students strongly agreed (p < 0.05) that computers enhanced learning. Examination results were significantly higher after computer implementation (89% with computers vs. 84% without computers; p = 0.01). Implementation of wireless laptop computers in a pharmacokinetic course enabled students to construct their own pharmacokinetic models that could respond to changing parameters. Students had greater comprehension and were better able to interpret results and provide appropriate recommendations. Computer-assisted pharmacokinetic techniques can be powerful tools when making decisions about drug therapy.
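As an illustration of the kind of model students could build in such workshops (this sketch is not taken from the course materials), the following assumes a one-compartment model with first-order elimination and repeated IV bolus dosing; the dose, volume of distribution, clearance, and dosing interval are hypothetical values:

    import numpy as np

    # Hypothetical one-compartment IV bolus model; all parameter values are assumed.
    dose_mg = 500.0        # dose per administration
    vd_l = 30.0            # volume of distribution (L)
    cl_l_per_h = 3.0       # clearance (L/h)
    tau_h = 8.0            # dosing interval (h)
    n_doses = 6

    k = cl_l_per_h / vd_l  # first-order elimination rate constant (1/h)
    t = np.linspace(0, n_doses * tau_h, 500)

    # Superposition of doses given at 0, tau, 2*tau, ...
    conc = np.zeros_like(t)
    for i in range(n_doses):
        t_dose = i * tau_h
        mask = t >= t_dose
        conc[mask] += (dose_mg / vd_l) * np.exp(-k * (t[mask] - t_dose))

    trough_idx = np.searchsorted(t, (n_doses - 1) * tau_h) - 1
    print(f"Predicted trough just before the last dose: {conc[trough_idx]:.2f} mg/L")

Changing the clearance or dosing interval and re-plotting the curve is the kind of "what-if" exploration of concentration-time behavior the workshops describe.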
Algorithm-Based Fault Tolerance Integrated with Replication
NASA Technical Reports Server (NTRS)
Some, Raphael; Rennels, David
2008-01-01
In a proposed approach to programming and utilization of commercial off-the-shelf computing equipment, a combination of algorithm-based fault tolerance (ABFT) and replication would be utilized to obtain high degrees of fault tolerance without incurring excessive costs. The basic idea of the proposed approach is to integrate ABFT with replication such that the algorithmic portions of computations would be protected by ABFT, and the logical portions by replication. ABFT is an extremely efficient, inexpensive, high-coverage technique for detecting and mitigating faults in computer systems used for algorithmic computations, but does not protect against errors in logical operations surrounding algorithms.
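To make the ABFT idea concrete, here is a minimal, generic sketch of checksum-based fault detection for matrix multiplication, a textbook ABFT example rather than code from the proposed system: row and column checksums appended to the inputs allow the result to be verified cheaply after the computation.

    import numpy as np

    def abft_matmul(a, b, tol=1e-8):
        """Multiply a @ b and verify the result with row/column checksums (classic ABFT)."""
        a_c = np.vstack([a, a.sum(axis=0)])                 # append column-sum row
        b_c = np.hstack([b, b.sum(axis=1, keepdims=True)])  # append row-sum column
        full = a_c @ b_c                                    # (m+1) x (n+1) checksum product
        c = full[:-1, :-1]                                  # the actual result block
        # A fault that corrupts an entry of `full` breaks one of these consistency checks.
        row_ok = np.allclose(full[:-1, -1], c.sum(axis=1), atol=tol)
        col_ok = np.allclose(full[-1, :-1], c.sum(axis=0), atol=tol)
        return c, (row_ok and col_ok)

    a = np.random.rand(4, 3)
    b = np.random.rand(3, 5)
    c, ok = abft_matmul(a, b)
    print("checksums consistent:", ok)

In the hybrid scheme described above, a check of this kind would guard the algorithmic kernels, while the surrounding control logic would be protected by replication instead.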
Merging Technology and Emotions: Introduction to Affective Computing.
Brigham, Tara J
2017-01-01
Affective computing technologies are designed to sense and respond based on human emotions. This technology allows a computer system to process the information gathered from various sensors to assess the emotional state of an individual. The system then offers a distinct response based on what it "felt." While this is completely unlike how most people interact with electronics today, this technology is likely to trickle into future everyday life. This column will explain what affective computing is, some of its benefits, and concerns with its adoption. It will also provide an overview of its implication in the library setting and offer selected examples of how and where it is currently being used.
Proposal for nanoscale cascaded plasmonic majority gates for non-Boolean computation.
Dutta, Sourav; Zografos, Odysseas; Gurunarayanan, Surya; Radu, Iuliana; Soree, Bart; Catthoor, Francky; Naeemi, Azad
2017-12-19
Surface-plasmon-polariton waves propagating at the interface between a metal and a dielectric hold the key to future high-bandwidth, dense on-chip integrated logic circuits overcoming the diffraction limitation of photonics. While recent advances in plasmonic logic have witnessed the demonstration of basic and universal logic gates, these CMOS-oriented digital logic gates cannot fully utilize the expressive power of this novel technology. Here, we aim at unraveling the true potential of plasmonics by exploiting an enhanced native functionality: the majority voter. Contrary to the state-of-the-art plasmonic logic devices, we use the phase of the wave instead of the intensity as the state or computational variable. We propose and demonstrate, via numerical simulations, a comprehensive scheme for building a nanoscale cascadable plasmonic majority logic gate along with a novel referencing scheme that can directly translate the information encoded in the amplitude and phase of the wave into electric field intensity at the output. Our MIM-based 3-input majority gate displays a highly improved overall area of only 0.636 μm² for a single stage compared with previous works on plasmonic logic. The proposed device demonstrates non-Boolean computational capability and can find direct utility in highly parallel real-time signal processing applications like pattern recognition.
2013-01-01
Background Information is lacking about the capacity of those working in community practice settings to utilize health information technology for colorectal cancer screening. Objective To address this gap we asked those working in community practice settings to share their perspectives about how the implementation of a Web-based patient-led decision aid might affect patient-clinician conversations about colorectal cancer screening and the day-to-day clinical workflow. Methods Five focus groups in five community practice settings were conducted with 8 physicians, 1 physician assistant, and 18 clinic staff. Focus groups were organized using a semistructured discussion guide designed to identify factors that mediate and impede the use of a Web-based decision aid intended to clarify patient preferences for colorectal cancer screening and to trigger shared decision making during the clinical encounter. Results All physicians, the physician assistant, and 8 of the 18 clinic staff were active participants in the focus groups. Clinician and staff participants from each setting reported a belief that the Web-based patient-led decision aid could be an informative and educational tool; in all but one setting participants reported a readiness to recommend the tool to patients. The exception related to clinicians from one clinic who described a preference for patients having fewer screening choices, noting that a colonoscopy was the preferred screening modality for patients in their clinic. Perceived barriers to utilizing the Web-based decision aid included patients’ lack of Internet access or low computer literacy, and potential impediments to the clinics’ daily workflow. Expanding patients’ use of an online decision aid that is both easy to access and understand and that is utilized by patients outside of the office visit was described as a potentially efficient means for soliciting patients’ screening preferences. Participants described that a system to link the online decision aid to a computerized reminder system could promote a better understanding of patients’ screening preferences, though some expressed concern that such a system could be difficult to keep up and running. Conclusions Community practice clinicians and staff perceived the Web-based decision aid technology as promising but raised questions as to how the technology and resultant information would be integrated into their daily practice workflow. Additional research investigating how to best implement online decision aids should be conducted prior to the widespread adoption of such technology so as to maximize the benefits of the technology while minimizing workflow disruptions. PMID:24351420
ERIC Educational Resources Information Center
Botello, Jennifer A.
2014-01-01
With increased dependence on computer-based standardized tests to assess academic achievement, technological literacy has become an essential skill. Yet, because students have unequal access to technology, they may not have equal opportunities to perform well on these computer-based tests. The researcher had observed students taking the STAR…
Electric utility companies and geothermal power
NASA Technical Reports Server (NTRS)
Pivirotto, D. S.
1976-01-01
The requirements of the electric utility industry as the primary potential market for geothermal energy are analyzed, based on a series of structured interviews with utility company and financial institution executives. The interviews were designed to determine what information and technologies would be required before utilities would make investment decisions in favor of geothermal energy, the time frame in which the information and technologies would have to be available, and the influence of government policy. The paper describes geothermal resources, the electric utility industry and its structure, the forces influencing utility companies, and their relationship to geothermal energy. A strategy for federal stimulation of utility investment in geothermal energy is suggested. Possibilities are discussed for stimulating utility investment through financial incentives, amelioration of institutional barriers, and technological improvements.
EPA is developing methods for utilizing computational chemistry, high-throughput screening (HTS) and various toxicogenomic technologies to predict potential for toxicity and prioritize limited testing resources towards chemicals that likely represent the greatest hazard to human ...
Non-invasive brain-computer interface system: towards its application as assistive technology.
Cincotti, Febo; Mattia, Donatella; Aloise, Fabio; Bufalari, Simona; Schalk, Gerwin; Oriolo, Giuseppe; Cherubini, Andrea; Marciani, Maria Grazia; Babiloni, Fabio
2008-04-15
The quality of life of people suffering from severe motor disabilities can benefit from the use of current assistive technology capable of ameliorating communication, house-environment management and mobility, according to the user's residual motor abilities. Brain-computer interfaces (BCIs) are systems that can translate brain activity into signals that control external devices. Thus they can represent the only technology for severely paralyzed patients to increase or maintain their communication and control options. Here we report on a pilot study in which a system was implemented and validated to allow disabled persons to improve or recover their mobility (directly or by emulation) and communication within the surrounding environment. The system is based on a software controller that offers the user a communication interface matched with the individual's residual motor abilities. Patients (n=14) with severe motor disabilities due to progressive neurodegenerative disorders were trained to use the system prototype under a rehabilitation program carried out in a house-like furnished space. All users utilized regular assistive control options (e.g., microswitches or head trackers). In addition, four subjects learned to operate the system by means of a non-invasive EEG-based BCI. This system was controlled by the subjects' voluntary modulations of EEG sensorimotor rhythms recorded on the scalp; this skill was learnt even though the subjects had not had control over their limbs for a long time. We conclude that such a prototype system, which integrates several different assistive technologies including a BCI system, can potentially facilitate the translation from pre-clinical demonstrations to a clinically useful BCI.
Distance Learning: A Way of Life-Long Learning
2005-09-01
promise of future benefits. SUBJECT TERMS: training, educational technology, distributed learning, distance learning, collaboration, online instruction...knowledge." - Aristotle. Introduction: Modern learning technology assumes various names: distance learning, distributed training, computer-based...training, web-based learning, or advanced distributed learning. No matter the name, the basic concept is using computer technology for instruction with no
Evolution of Cardiac Biomodels from Computational to Therapeutics.
Rathinam, Alwin Kumar; Mokhtar, Raja Amin Raja
2016-08-23
Biomodeling the human anatomy in exact structure and size is an exciting field of medical science. Utilizing medical data from various medical imaging modalities, the data of an anatomical structure can be extracted and converted into a three-dimensional virtual biomodel; thereafter a physical biomodel can be generated utilizing rapid prototyping machines. Here, we have reviewed the utilization of this technology and have provided some guidelines to develop biomodels of cardiac structures. Cardiac biomodels provide insights for cardiothoracic surgeons, cardiologists, and patients alike. Additionally, the technology may find future use in tissue engineering, robotic surgery, or routine hospital practice as a diagnostic and therapeutic tool for cardiovascular diseases (CVD). Given the broad areas of application of cardiac biomodels, attention should be given to further research and development of their potential.
The role of information and communication technology in developing smart education
NASA Astrophysics Data System (ADS)
Roslina; Zarlis, Muhammad; Mawengkang, Herman; Sembiring, R. W.
2017-09-01
Every citizen's right to a proper education is guaranteed by the government, but not all citizens have the same opportunity to exercise it. This is due in part to gaps in the nation's infrastructure: Frontier, Outermost, and Disadvantaged (3T) regions have not been given adequate access to information and communication technology (ICT) or to an ideal learning environment in which to pursue knowledge. This condition could be addressed by reforming higher education. Such reforms include providing educational services in a flexible, learner-oriented form and aligning the curriculum with market needs. These changes would also include the provision of lecturers, professors, and a professional teaching force. Another important effort is to raise the quality of higher education through better resource utilization. This paper proposes a new education business model to realize Smart Education (SE), oriented toward proven skills and competitiveness. SE is a higher education approach that optimizes learning outcomes by combining network-based individual and collaborative learning techniques with informal practical learning and formal theory. Utilizing ICT resources can improve the quality of, and access to, higher education and support its activities. This paper shows that ICT resources can support virtual connectivity through the use of shared resources, such as information resources, learning resources, computing resources, and human resources.
Technology Use and Frequency and Self-Rated Skills: A Survey of Community-Dwelling Older Adults.
Scanlon, Lorraine; O'Shea, Emma; O'Caoimh, Rónán; Timmons, Suzanne
2015-07-01
Many older adults are using technology regularly, but the vast majority still rate their technology skills as poor or average, reflecting their low usage of less-familiar items such as tablet computers. Despite moves toward increasing the use of ICT in the care, rehabilitation, and monitoring of older adults, baseline use of such devices is low. Further study is required to investigate how people's attitudes toward and experience with ICT influence its utility in clinical practice.
Graphical User Interface in Art
NASA Astrophysics Data System (ADS)
Gwilt, Ian
This essay discusses the use of the Graphical User Interface (GUI) as a site of creative practice. By creatively repositioning the GUI as a work of art it is possible to challenge our understanding and expectations of the conventional computer interface wherein the icons and navigational architecture of the GUI no longer function as a technological tool. These artistic recontextualizations are often used to question our engagement with technology and to highlight the pivotal place that the domestic computer has taken in our everyday social, cultural and (increasingly) creative domains. Through these works the media specificity of the screen-based GUI can be broken by dramatic changes in scale, form and configuration. This can be seen in the work of new media artists who have re-imagined the GUI in a number of creative forms, both within the digital, as image, animation, net and interactive art, and in the analogue, as print, painting, sculpture, installation and performative event. Furthermore, as a creative work, the GUI can also be utilized as a visual way-finder to explore the relationship between the dynamic potentials of the digital and the concretized qualities of the material artifact.
Boland, Veronica C; Stockings, Emily A; Mattick, Richard P; McRobbie, Hayden; Brown, Jamie; Courtney, Ryan J
2018-02-07
To assess the methodological quality and effectiveness of technology-based smoking cessation interventions in disadvantaged groups. Four databases (EMBASE, Cochrane, Medline, and PsycInfo) were searched for studies conducted from 1980 to May 2016. Randomized controlled trials that compared a behavioral smoking cessation intervention delivered primarily through a technology-based platform (eg, mobile phone) with a no-intervention comparison group among disadvantaged smokers were included. Three reviewers assessed all relevant studies for inclusion, and one reviewer extracted study, participant and intervention-level data, with a subset crosschecked by a second reviewer. Thirteen studies targeting disadvantaged smokers (n =4820) were included. Only one study scored highly in terms of methodological rigor on EPOC criteria for judging risk of bias. Of the 13 studies using a technology-based platform, most utilized websites (n = 5) or computer programs (n = 5), and seven additionally offered nicotine replacement therapy. Technology-based interventions increased the odds of smoking cessation for disadvantaged groups at 1 month (odds ratio [OR] 1.70, 95% confidence interval [CI] 1.10, 2.63), 3 months (OR 1.30, 95% CI 1.07, 1.59), 6 months (OR 1.29, 95% CI 1.03, 1.62), and 18 months post-intervention (OR 1.83, 95% CI 1.11, 3.01). Few methodologically rigorous studies were identified. Mobile phone text-messaging, computer- and website-delivered quit support showed promise at increasing quit rates among Indigenous, psychiatric and inpatient substance use disorder patients. Further research is needed to address the role technology-based interventions have on overcoming health inequalities to meet the needs of disadvantaged groups. This review provides the first quantitative evidence of the effectiveness of a range of technology-based smoking cessation interventions among disadvantaged smokers, with separate estimates on the basis of intervention type, and cessation outcome measure. Providing cost-effective, easily accessible and real-time smoking cessation treatment is needed, and innovative technology-based platforms will help reach this endpoint. These interventions need to be tested in larger scale randomized controlled trial designs and target broader disadvantaged groups. Data collection beyond 6 months is also needed in order to establish the efficacy of these intervention approaches on long-term cessation rates among disadvantaged population groups. © The Author(s) 2016. Published by Oxford University Press on behalf of the Society for Research on Nicotine and Tobacco. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
ERIC Educational Resources Information Center
Wang, Li
2005-01-01
With the advent of networked computers and Internet technology, computer-based instruction has been widely used in language classrooms throughout the United States. Computer technologies have dramatically changed the way people gather information, conduct research and communicate with others worldwide. Considering the tremendous startup expenses,…
Baldwin, Constance D; Niebuhr, Virginia N; Sullivan, Brian
2004-01-01
We aimed to identify the evolving computer technology needs and interests of community faculty in order to design an effective faculty development program focused on computer skills: the Teaching and Learning Through Educational Technology (TeLeTET) program. Repeated surveys were conducted between 1994 and 2002 to assess computer resources and needs in a pool of over 800 primary care physician-educators in community practice in East Texas. Based on the results, we developed and evaluated several models to teach community preceptors about computer technologies that are useful for education. Before 1998, only half of our community faculty identified a strong interest in developing their technology skills. As the revolution in telecommunications advanced, however, preceptors' needs and interests changed, and the use of this technology to support community-based teaching became feasible. In 1998 and 1999, resource surveys showed that many of our community teaching sites had computers and Internet access. By 2001, the desire for teletechnology skills development was strong in a nucleus of community faculty, although lack of infrastructure, time, and skills were identified barriers. The TeLeTET project developed several innovative models for technology workshops and conferences, supplemented by online resources, that were well attended and positively evaluated by 181 community faculty over a 3-year period. We have identified the evolving needs of community faculty through iterative needs assessments, developed a flexible faculty development curriculum, and used open-ended, formative evaluation techniques to keep the TeLeTET program responsive to a rapidly changing environment for community-based education in computer technology.
Applying Utility Functions to Adaptation Planning for Home Automation Applications
NASA Astrophysics Data System (ADS)
Bratskas, Pyrros; Paspallis, Nearchos; Kakousis, Konstantinos; Papadopoulos, George A.
A pervasive computing environment typically comprises multiple embedded devices that may interact together and with mobile users. These users are part of the environment, and they experience it through a variety of devices embedded in the environment. This perception involves technologies which may be heterogeneous, pervasive, and dynamic. Due to the highly dynamic properties of such environments, the software systems running on them have to face problems such as user mobility, service failures, or resource and goal changes which may happen in an unpredictable manner. To cope with these problems, such systems must be autonomous and self-managed. In this chapter we deal with a special kind of a ubiquitous environment, a smart home environment, and introduce a user-preference-based model for adaptation planning. The model, which dynamically forms a set of configuration plans for resources, reasons automatically and autonomously, based on utility functions, on which plan is likely to best achieve the user's goals with respect to resource availability and user needs.
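As a rough, self-contained illustration of the general idea rather than the chapter's actual model, the sketch below scores candidate configuration plans with a weighted utility over user preferences, filters them by resource availability, and picks the best feasible plan; the plan names, properties, and weights are hypothetical.

    # Hypothetical utility-based adaptation planning sketch; all names and numbers are made up.
    plans = [
        {"name": "dim-lights-local", "comfort": 0.6, "energy_saving": 0.9, "cpu_needed": 0.2},
        {"name": "full-automation",  "comfort": 0.9, "energy_saving": 0.4, "cpu_needed": 0.8},
        {"name": "manual-fallback",  "comfort": 0.3, "energy_saving": 0.7, "cpu_needed": 0.1},
    ]

    user_weights = {"comfort": 0.7, "energy_saving": 0.3}  # assumed user preferences
    cpu_available = 0.5                                    # assumed resource availability

    def utility(plan):
        # Weighted sum of the plan's contribution to each user goal.
        return sum(weight * plan[goal] for goal, weight in user_weights.items())

    feasible = [p for p in plans if p["cpu_needed"] <= cpu_available]
    best = max(feasible, key=utility)
    print(best["name"], round(utility(best), 2))

When resource availability or the user's preferences change, re-running the selection yields a new plan, which is the kind of autonomous, self-managed behavior the chapter describes.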
NASA Astrophysics Data System (ADS)
Pierce, S. A.; Gentle, J.
2015-12-01
The multi-criteria decision support system (MCSDSS) is a newly completed application for touch-enabled group decision support that uses D3 data visualization tools, a geojson conversion utility that we developed, and Paralelex to create an interactive tool. The MCSDSS is a prototype system intended to demonstrate the potential capabilities of a single page application (SPA) running atop a web- and cloud-based architecture utilizing open source technologies. The application is implemented on current web standards while supporting human interface design that targets both traditional mouse/keyboard interactions and modern touch/gesture-enabled interactions. The technology stack for MCSDSS was selected with the goal of creating a robust and dynamic modular codebase that can be adjusted to fit many use cases and scale to support usage loads that range from simple data display to complex scientific simulation-based modelling and analytics. The application integrates current frameworks for highly performant agile development with unit testing, statistical analysis, data visualization, mapping technologies, geographic data manipulation, and cloud infrastructure while retaining support for traditional HTML5/CSS3 web standards. The software lifecycle for MCSDSS has followed best practices for developing, sharing, and documenting the codebase and application. Code is documented and shared via an online repository with the option for programmers to see, contribute, or fork the codebase. Example data files and tutorial documentation have been shared with clear descriptions and data object identifiers. The metadata about the application has been incorporated into an OntoSoft entry to ensure that MCSDSS is searchable and clearly described. MCSDSS is a flexible platform that allows for data fusion and inclusion of large datasets in an interactive front-end application capable of connecting with other science-based applications and advanced computing resources. In addition, MCSDSS offers functionality that enables communication with non-technical users for policy, education, or engagement with groups around scientific topics with societal relevance.
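The abstract does not describe the internals of the geojson conversion utility; purely as an illustration of the kind of transformation such a utility performs, the sketch below converts tabular records with latitude/longitude columns into a GeoJSON FeatureCollection (the field names and values are hypothetical).

    import json

    # Hypothetical tabular records; column names are illustrative only.
    records = [
        {"site": "A", "lat": 30.29, "lon": -97.74, "score": 0.82},
        {"site": "B", "lat": 30.27, "lon": -97.75, "score": 0.41},
    ]

    def to_geojson(rows, lat_key="lat", lon_key="lon"):
        features = []
        for row in rows:
            props = {k: v for k, v in row.items() if k not in (lat_key, lon_key)}
            features.append({
                "type": "Feature",
                # GeoJSON orders coordinates as [longitude, latitude].
                "geometry": {"type": "Point", "coordinates": [row[lon_key], row[lat_key]]},
                "properties": props,
            })
        return {"type": "FeatureCollection", "features": features}

    print(json.dumps(to_geojson(records), indent=2))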
Content-oriented Approach to Organization of Theories and Its Utilization
NASA Astrophysics Data System (ADS)
Hayashi, Yusuke; Bourdeau, Jacqueline; Mizoguch, Riichiro
Although the relation between theory and practice is a foundation of scientific and technological development, the gap between theory and practice has been widening in recent years. This gap carries a risk of distrust of science and technology. Ontological engineering, as content-oriented research, is expected to contribute to the resolution of the gap. This paper presents the feasibility of organizing theoretical knowledge through ontological engineering, and of building new-generation intelligent systems on that organization, via an application in the area of learning/instruction support. This area also suffers from the gap between theory and practice, and its resolution is strongly required. We previously proposed the OMNIBUS ontology, a comprehensive ontology that covers different learning/instructional theories and paradigms, and SMARTIES, a theory-aware and standards-compliant authoring system for making learning/instructional scenarios based on the OMNIBUS ontology. We believe that theory-awareness and standards-compliance bridge the gap between theory and practice because they link theories to the practical use of standard technologies and enable practitioners to easily enjoy theoretical support while using standard technologies in practice. The following goals are set in order to achieve this: computers should (1) understand a variety of learning/instructional theories based on their organization, (2) utilize this understanding to help authors make learning/instructional scenarios, and (3) make such theoretically sound scenarios interoperable within the framework of standard technologies. This paper suggests an ontological engineering solution for achieving these three goals. Although the evaluation is far from complete in terms of practical use, we believe the results address high-level technical challenges from the viewpoint of the current state of the art in artificial intelligence, not only in education but also in general, and we therefore hope they constitute a substantial contribution to the organization of theoretical knowledge in many other areas.
Computational Approach for Developing Blood Pump
NASA Technical Reports Server (NTRS)
Kwak, Dochan
2002-01-01
This viewgraph presentation provides an overview of the computational approach to developing a ventricular assist device (VAD) which utilizes NASA aerospace technology. The VAD is used as a temporary support to sick ventricles for those who suffer from late stage congestive heart failure (CHF). The need for donor hearts is much greater than their availability, and the VAD is seen as a bridge-to-transplant. The computational issues confronting the design of a more advanced, reliable VAD include the modelling of viscous incompressible flow. A computational approach provides the possibility of quantifying the flow characteristics, which is especially valuable for analyzing compact design with highly sensitive operating conditions. Computational fluid dynamics (CFD) and rocket engine technology have been applied to modify the design of a VAD which enabled human transplantation. The computing requirement for this project is still large, however, and the unsteady analysis of the entire system from natural heart to aorta involves several hundred revolutions of the impeller. Further study is needed to assess the impact of mechanical VADs on the human body.
A Direct Position-Determination Approach for Multiple Sources Based on Neural Network Computation.
Chen, Xin; Wang, Ding; Yin, Jiexin; Wu, Ying
2018-06-13
The most widely used localization technology is the two-step method that localizes transmitters by measuring one or more specified positioning parameters. Direct position determination (DPD) is a promising technique that directly localizes transmitters from sensor outputs and can offer superior localization performance. However, existing DPD algorithms such as maximum likelihood (ML)-based and multiple signal classification (MUSIC)-based estimations are computationally expensive, making it difficult to satisfy real-time demands. To solve this problem, we propose the use of a modular neural network for multiple-source DPD. In this method, the area of interest is divided into multiple sub-areas. Multilayer perceptron (MLP) neural networks are employed to detect the presence of a source in a sub-area and filter sources in other sub-areas, and radial basis function (RBF) neural networks are utilized for position estimation. Simulation results show that a number of appropriately trained neural networks can be successfully used for DPD. The performance of the proposed MLP-MLP-RBF method is comparable to the performance of the conventional MUSIC-based DPD algorithm for various signal-to-noise ratios and signal power ratios. Furthermore, the MLP-MLP-RBF network is less computationally intensive than the classical DPD algorithm and is therefore an attractive choice for real-time applications.
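The following toy sketch (not the authors' implementation) mimics the modular structure described above: an MLP classifier detects whether a source lies in a given sub-area, and an RBF-kernel regressor (standing in for the RBF network) maps sensor features to coordinates. The data are synthetic stand-ins for array outputs, and all layer sizes and kernel parameters are assumptions.

    import numpy as np
    from sklearn.neural_network import MLPClassifier
    from sklearn.kernel_ridge import KernelRidge

    rng = np.random.default_rng(0)

    # Synthetic stand-in: 8-dimensional "sensor features" derived from 2-D source positions.
    positions = rng.uniform(0, 100, size=(500, 2))
    proj = rng.normal(size=(2, 8))
    features = np.tanh(positions @ proj / 100) + 0.01 * rng.normal(size=(500, 8))

    # Stage 1: an MLP detects whether the source falls in a given sub-area (here, x < 50).
    in_subarea = (positions[:, 0] < 50).astype(int)
    detector = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
    detector.fit(features, in_subarea)

    # Stage 2: a Gaussian (RBF) kernel regressor estimates position for sources
    # the detector assigns to the sub-area.
    mask = in_subarea == 1
    estimator = KernelRidge(kernel="rbf", gamma=0.5, alpha=1e-3)
    estimator.fit(features[mask], positions[mask])

    test_feat, test_pos = features[:5], positions[:5]
    flags = detector.predict(test_feat)
    est = estimator.predict(test_feat)
    for f, p, e in zip(flags, test_pos, est):
        print(f"in_subarea={f}  true={p.round(1)}  est={e.round(1)}")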
NASA Astrophysics Data System (ADS)
McFall, Steve
1994-03-01
With the increase in business automation and the widespread availability and low cost of computer systems, law enforcement agencies have seen a corresponding increase in criminal acts involving computers. The examination of computer evidence is a new field of forensic science with numerous opportunities for research and development. Research is needed to develop new software utilities for examining computer storage media, expert systems capable of finding criminal activity in large amounts of data, and methods of recovering data from chemically and physically damaged computer storage media. In addition, defeating encryption and password protection of computer files is also a topic requiring more research and development.
2012-03-01
by using a common communication technology there is no need to develop a complicated communications plan and generate an ad-hoc communications...Maintaining an accurate Common Operational Picture (COP) is a strategic requirement for...SUBJECT TERMS: Android Programming, Cloud Computing, Common Operating Picture, Web Programming
DOE Office of Scientific and Technical Information (OSTI.GOV)
2015-10-20
The look-ahead dynamic simulation software system incorporates high-performance parallel computing technologies, significantly reduces the solution time for each transient simulation case, and brings dynamic simulation analysis into on-line applications to enable more transparency for better reliability and asset utilization. It takes a snapshot of the current power grid status, performs the system dynamic simulation using parallel computing, and outputs the transient response of the power system in real time.
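A minimal, generic sketch of the parallel pattern described, running many transient cases concurrently and collecting their results; the contingency list and the simulation function are hypothetical placeholders, not part of the actual software.

    from multiprocessing import Pool

    def simulate_transient(contingency):
        """Placeholder for one transient simulation case (hypothetical)."""
        # A real implementation would integrate the system's dynamic equations here.
        fault_bus, clearing_time_s = contingency
        stable = clearing_time_s < 0.15        # toy stability criterion, for illustration only
        return fault_bus, clearing_time_s, stable

    # Hypothetical contingency list: faulted bus and fault clearing time.
    contingencies = [(bus, ct) for bus in range(1, 9) for ct in (0.10, 0.20)]

    if __name__ == "__main__":
        with Pool(processes=4) as pool:
            results = pool.map(simulate_transient, contingencies)
        for bus, ct, stable in results:
            print(f"bus {bus}, clearing {ct:.2f}s: {'stable' if stable else 'unstable'}")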
Ting, Samuel T; Ahmad, Rizwan; Jin, Ning; Craft, Jason; Serafim da Silveira, Juliana; Xue, Hui; Simonetti, Orlando P
2017-04-01
Sparsity-promoting regularizers can enable stable recovery of highly undersampled magnetic resonance imaging (MRI), promising to improve the clinical utility of challenging applications. However, lengthy computation time limits the clinical use of these methods, especially for dynamic MRI with its large corpus of spatiotemporal data. Here, we present a holistic framework that utilizes the balanced sparse model for compressive sensing and parallel computing to reduce the computation time of cardiac MRI recovery methods. We propose a fast, iterative soft-thresholding method to solve the resulting ℓ1-regularized least squares problem. In addition, our approach utilizes a parallel computing environment that is fully integrated with the MRI acquisition software. The methodology is applied to two formulations of the multichannel MRI problem: image-based recovery and k-space-based recovery. Using measured MRI data, we show that, for a 224 × 144 image series with 48 frames, the proposed k-space-based approach achieves a mean reconstruction time of 2.35 min, a 24-fold improvement compared with a reconstruction time of 55.5 min for the nonlinear conjugate gradient method, and the proposed image-based approach achieves a mean reconstruction time of 13.8 s. Our approach can be utilized to achieve fast reconstruction of large MRI datasets, thereby increasing the clinical utility of reconstruction techniques based on compressed sensing. Magn Reson Med 77:1505-1515, 2017. © 2016 International Society for Magnetic Resonance in Medicine.
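The abstract does not give implementation details; as a generic illustration of the solver family it names (not the authors' method or the balanced sparse model), the sketch below applies iterative soft-thresholding (ISTA) to a small l1-regularized least squares problem of the form min_x 0.5*||Ax - y||^2 + lam*||x||_1.

    import numpy as np

    def soft_threshold(z, t):
        return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

    def ista(A, y, lam, n_iter=500):
        """Iterative soft-thresholding for 0.5*||Ax - y||^2 + lam*||x||_1."""
        L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the data-fidelity gradient
        x = np.zeros(A.shape[1])
        for _ in range(n_iter):
            grad = A.T @ (A @ x - y)
            x = soft_threshold(x - grad / L, lam / L)
        return x

    # Toy undersampled problem: 64 measurements, 256 unknowns, 8-sparse signal.
    rng = np.random.default_rng(1)
    A = rng.normal(size=(64, 256))
    x_true = np.zeros(256)
    x_true[rng.choice(256, size=8, replace=False)] = rng.normal(size=8)
    y = A @ x_true + 0.01 * rng.normal(size=64)

    x_hat = ista(A, y, lam=0.05)
    print("recovered support:", np.flatnonzero(np.abs(x_hat) > 0.1))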
ERIC Educational Resources Information Center
National Commission on Technology, Automation and Economic Progress, Washington, DC.
Three studies dealing with the educational implications of technological change are presented. "The Application of Computer Technology to the Improvement of Instruction and Learning" by Don D. Bushnell, Richard deMille, and Judith Purl is based on 35 research and development programs involving computer technology. Their general thesis is that…
Computational needs survey of NASA automation and robotics missions. Volume 1: Survey and results
NASA Technical Reports Server (NTRS)
Davis, Gloria J.
1991-01-01
NASA's operational use of advanced processor technology in space systems lags behind its commercial development by more than eight years. One of the factors contributing to this is that mission computing requirements are frequently unknown, unstated, misrepresented, or simply not available in a timely manner. NASA must provide clear common requirements to make better use of available technology, to cut development lead time on deployable architectures, and to increase the utilization of new technology. A preliminary set of advanced mission computational processing requirements of automation and robotics (A&R) systems is provided for use by NASA, industry, and academic communities. These results were obtained in an assessment of the computational needs of current projects throughout NASA. The high percentage of responses indicated a general need for enhanced computational capabilities beyond the currently available 80386 and 68020 processor technology. Because of the need for faster processors and more memory, 90 percent of the polled automation projects have reduced or will reduce the scope of their implementation capabilities. The requirements are presented with respect to their targeted environment, identifying the applications required, system performance levels necessary to support them, and the degree to which they are met with typical programmatic constraints. Volume one includes the survey and results. Volume two contains the appendixes.
Computational needs survey of NASA automation and robotics missions. Volume 2: Appendixes
NASA Technical Reports Server (NTRS)
Davis, Gloria J.
1991-01-01
NASA's operational use of advanced processor technology in space systems lags behind its commercial development by more than eight years. One of the factors contributing to this is the fact that mission computing requirements are frequently unknown, unstated, misrepresented, or simply not available in a timely manner. NASA must provide clear common requirements to make better use of available technology, to cut development lead time on deployable architectures, and to increase the utilization of new technology. Here, NASA, industry and academic communities are provided with a preliminary set of advanced mission computational processing requirements of automation and robotics (A and R) systems. The results were obtained in an assessment of the computational needs of current projects throughout NASA. The high percentage of responses indicated a general need for enhanced computational capabilities beyond the currently available 80386 and 68020 processor technology. Because of the need for faster processors and more memory, 90 percent of the polled automation projects have reduced or will reduce the scope of their implemented capabilities. The requirements are presented with respect to their targeted environment, identifying the applications required, system performance levels necessary to support them, and the degree to which they are met with typical programmatic constraints. Here, appendixes are provided.
Cloud-based Jupyter Notebooks for Water Data Analysis
NASA Astrophysics Data System (ADS)
Castronova, A. M.; Brazil, L.; Seul, M.
2017-12-01
The development and adoption of technologies by the water science community to improve our ability to openly collaborate and share workflows will have a transformative impact on how we address the challenges associated with collaborative and reproducible scientific research. Jupyter notebooks offer one solution by providing an open-source platform for creating metadata-rich toolchains for modeling and data analysis applications. Adoption of this technology within the water sciences, coupled with publicly available datasets from agencies such as USGS, NASA, and EPA enables researchers to easily prototype and execute data intensive toolchains. Moreover, implementing this software stack in a cloud-based environment extends its native functionality to provide researchers a mechanism to build and execute toolchains that are too large or computationally demanding for typical desktop computers. Additionally, this cloud-based solution enables scientists to disseminate data processing routines alongside journal publications in an effort to support reproducibility. For example, these data collection and analysis toolchains can be shared, archived, and published using the HydroShare platform or downloaded and executed locally to reproduce scientific analysis. This work presents the design and implementation of a cloud-based Jupyter environment and its application for collecting, aggregating, and munging various datasets in a transparent, sharable, and self-documented manner. The goals of this work are to establish a free and open source platform for domain scientists to (1) conduct data intensive and computationally intensive collaborative research, (2) utilize high performance libraries, models, and routines within a pre-configured cloud environment, and (3) enable dissemination of research products. This presentation will discuss recent efforts towards achieving these goals, and describe the architectural design of the notebook server in an effort to support collaborative and reproducible science.
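As a small, generic example of the notebook-based toolchain idea (the dataset URL and column names below are hypothetical placeholders, not an actual agency endpoint), a single cell can pull a public dataset, aggregate it, and plot it in a way that is easy to share and re-run.

    import pandas as pd
    import matplotlib.pyplot as plt

    # Hypothetical CSV endpoint; in practice this would point to an agency data service.
    DATA_URL = "https://example.org/streamflow_daily.csv"

    # Columns assumed for illustration: "date" and "discharge_cms".
    df = pd.read_csv(DATA_URL, parse_dates=["date"])
    monthly = df.set_index("date")["discharge_cms"].resample("MS").mean()

    monthly.plot(title="Monthly mean discharge (hypothetical data)")
    plt.ylabel("discharge (m^3/s)")
    plt.show()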
2014-01-01
Background People with severe disabilities, e.g. due to neurodegenerative disease, depend on technology that allows for accurate wheelchair control. For those who cannot operate a wheelchair with a joystick, brain-computer interfaces (BCI) may offer a valuable option. Technology depending on visual or auditory input may not be feasible as these modalities are dedicated to processing of environmental stimuli (e.g. recognition of obstacles, ambient noise). Herein we thus validated the feasibility of a BCI based on tactually-evoked event-related potentials (ERP) for wheelchair control. Furthermore, we investigated use of a dynamic stopping method to improve speed of the tactile BCI system. Methods Positions of four tactile stimulators represented navigation directions (left thigh: move left; right thigh: move right; abdomen: move forward; lower neck: move backward) and N = 15 participants delivered navigation commands by focusing their attention on the desired tactile stimulus in an oddball-paradigm. Results Participants navigated a virtual wheelchair through a building and eleven participants successfully completed the task of reaching 4 checkpoints in the building. The virtual wheelchair was equipped with simulated shared-control sensors (collision avoidance), yet these sensors were rarely needed. Conclusion We conclude that most participants achieved tactile ERP-BCI control sufficient to reliably operate a wheelchair and dynamic stopping was of high value for tactile ERP classification. Finally, this paper discusses feasibility of tactile ERPs for BCI based wheelchair control. PMID:24428900
Manganello, Jennifer A; Gerstner, Gena; Pergolino, Kristen; Graham, Yvonne; Strogatz, David
2016-01-01
Many state and local health departments, as well as community organizations, have been using new technologies to disseminate health information to targeted populations. Yet little data exist that show access and use patterns, as well as preferences for receiving health information, at the state level. This study was designed to obtain information about media and technology use, and health information seeking patterns, from a sample of New York State (NYS) residents. A cross-sectional telephone survey (with mobile phones and landlines) was developed to assess media and technology access, use patterns, and preferences for receiving health information among a sample of 1350 residents in NYS. The survey used random digit dialing methodology. A weighted analysis was conducted utilizing Stata/SE software. Data suggest that NYS residents have a high level of computer and Internet use; 82% have at least one working computer at home, and 85% use the Internet at least sometimes. Mobile phone use is also high; 90% indicated having a mobile phone, and of those 63% have a smartphone. When asked about preferences for receiving health information from an organization, many people preferred websites (49%); preferences for other sources varied by demographic characteristics. Findings suggest that the Internet and other technologies are viable ways to reach NYS residents, but agencies and organizations should still consider using traditional methods of communication in some cases, and determine appropriate channels based on the population of interest.
Flight Avionics Hardware Roadmap
NASA Technical Reports Server (NTRS)
Some, Raphael; Goforth, Monte; Chen, Yuan; Powell, Wes; Paulick, Paul; Vitalpur, Sharada; Buscher, Deborah; Wade, Ray; West, John; Redifer, Matt;
2014-01-01
The Avionics Technology Roadmap takes an 80% approach to technology investment in spacecraft avionics. It delineates a suite of technologies covering foundational, component, and subsystem levels, which directly support 80% of future NASA space mission needs. The roadmap eschews high-cost, limited-utility technologies in favor of lower-cost and broadly applicable technologies with high return on investment. The roadmap is also phased to support future NASA mission needs and desires, with a view towards creating an optimized investment portfolio that matures specific, high-impact technologies on a schedule that matches optimum insertion points of these technologies into NASA missions. The roadmap looks out over 15+ years and covers some 114 technologies, 58 of which are targeted for TRL6 within 5 years, with 23 additional technologies to be at TRL6 by 2020. Of that number, only a few are recommended for near-term investment: (1) rad-hard high-performance computing; (2) extreme-temperature-capable electronics and packaging; (3) RFID/SAW-based spacecraft sensors and instruments; (4) lightweight, low-power 2D displays suitable for crewed missions; (5) a radiation-tolerant graphics processing unit to drive crew displays; (6) distributed/reconfigurable, extreme-temperature- and radiation-tolerant spacecraft sensor controller and sensor modules; (7) spacecraft-to-spacecraft, long-link data communication protocols; and (8) a high-performance and extreme-temperature-capable C&DH subsystem. In addition, the roadmap team recommends several other activities that it believes are necessary to advance avionics technology across NASA: engage the OCT roadmap teams to coordinate avionics technology advances and infusion into these roadmaps and their mission set; charter a team to develop a set of use cases for future avionics capabilities in order to decouple this roadmap from specific missions; partner with the Software Steering Committee to coordinate computing hardware and software technology roadmaps and investment recommendations; and continue monitoring foundational technologies upon which future avionics technologies will be dependent, e.g., RHBD and COTS semiconductor technologies.
Advanced fuel system technology for utilizing broadened property aircraft fuels
NASA Technical Reports Server (NTRS)
Reck, G. M.
1980-01-01
Possible changes in fuel properties are identified based on current trends and projections. The effects of those changes on the aircraft fuel system are examined, and some technological approaches to utilizing those fuels are described.
Intelligent Systems For Aerospace Engineering: An Overview
NASA Technical Reports Server (NTRS)
KrishnaKumar, K.
2003-01-01
Intelligent systems are nature-inspired, mathematically sound, computationally intensive problem solving tools and methodologies that have become extremely important for advancing the current trends in information technology. Artificially intelligent systems currently utilize computers to emulate various faculties of human intelligence and biological metaphors. They use a combination of symbolic and sub-symbolic systems capable of evolving human cognitive skills and intelligence, not just systems capable of doing things humans do not do well. Intelligent systems are ideally suited for tasks such as search and optimization, pattern recognition and matching, planning, uncertainty management, control, and adaptation. In this paper, the intelligent system technologies and their application potential are highlighted via several examples.
Remote voice training: A case study on space shuttle applications, appendix C
NASA Technical Reports Server (NTRS)
Mollakarimi, Cindy; Hamid, Tamin
1990-01-01
The Tile Automation System includes applications of automation and robotics technology to all aspects of the Shuttle tile processing and inspection system. An integrated set of rapid prototyping testbeds was developed, which includes speech recognition and synthesis, laser imaging systems, distributed Ada programming environments, distributed relational database architectures, distributed computer network architectures, multimedia workbenches, and human factors considerations. Remote voice training in the Tile Automation System is discussed. The user is prompted over a headset by synthesized speech for the training sequences. The voice recognition units and the voice output units are remote from the user and are connected by Ethernet to the main computer system. A supervisory channel is used to monitor the training sequences. The discussion covers the training approaches as well as the human factors problems and solutions for this system, which utilizes remote training techniques.
Implementing WebGL and HTML5 in Macromolecular Visualization and Modern Computer-Aided Drug Design.
Yuan, Shuguang; Chan, H C Stephen; Hu, Zhenquan
2017-06-01
Web browsers have long been recognized as potential platforms for remote macromolecule visualization. However, the difficulty of transferring large-scale data to clients and the lack of native support for hardware-accelerated applications in the local browser have undermined the feasibility of such utilities. With the introduction of WebGL and HTML5 technologies in recent years, it is now possible to exploit the power of a graphics processing unit (GPU) from a browser without any third-party plugin. Many new tools have been developed for biological molecule visualization and modern drug discovery. In contrast to traditional offline tools, WebGL- and HTML5-based tools offer real-time computing, interactive data analysis, and cross-platform analyses, facilitating biological research in a more efficient and user-friendly way.
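As a minimal sketch of the browser capability described above, and not code from the cited work, the following TypeScript obtains a GPU-accelerated WebGL context from an HTML5 canvas and clears it, with no third-party plugin; the canvas element id "viewer" is an assumption.

```typescript
// Minimal WebGL bootstrap: obtain a GPU-accelerated context from an HTML5
// canvas and clear the viewport. The element id "viewer" is assumed to exist.
function initViewer(): WebGLRenderingContext | null {
  const canvas = document.getElementById("viewer") as HTMLCanvasElement | null;
  if (!canvas) {
    return null; // canvas not present in the page
  }
  const gl = canvas.getContext("webgl");
  if (!gl) {
    console.warn("WebGL is not supported by this browser.");
    return null;
  }
  gl.clearColor(0.1, 0.1, 0.1, 1.0); // dark background
  gl.clear(gl.COLOR_BUFFER_BIT);     // executed on the GPU
  return gl;
}

initViewer();
```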
Design and Implementation of Context-Aware Museum Guide Agents
NASA Astrophysics Data System (ADS)
Satoh, Ichiro
This paper presents an agent-based system for building and operating context-aware services in public spaces, including museums. The system provides users with agents, detects users' locations by using active RFID tags, and deploys location-aware user-assistant agents at computers near their current locations. When a visitor moves between exhibits in a museum, the system dynamically deploys his/her agent at the computers close to the exhibits by using mobile agent technology. The agent annotates the exhibits in a personalized form and navigates the user to the next exhibits along his/her route. The system also introduces user movement as a natural approach to interaction between users and agents. To demonstrate the utility and effectiveness of the system, we constructed location/user-aware visitor-guide services and evaluated them for two weeks in a public museum.
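Purely as a hypothetical sketch of the deployment behavior described above (the actual system relies on an active RFID infrastructure and a mobile-agent platform that are not shown, and all names here are illustrative), the core logic can be summarized as follows:

```typescript
// Hypothetical sketch of location-aware agent deployment.
// Names (TagReading, deployAgent, exhibitHostFor) are illustrative only.
interface TagReading {
  visitorId: string;  // identity carried by the visitor's active RFID tag
  exhibitId: string;  // exhibit whose reader detected the tag
}

const currentHost = new Map<string, string>(); // visitorId -> host computer

function onTagDetected(reading: TagReading): void {
  const targetHost = exhibitHostFor(reading.exhibitId);
  if (currentHost.get(reading.visitorId) === targetHost) {
    return; // the visitor's agent is already running beside this exhibit
  }
  // Migrate the visitor's guide agent to the computer near the exhibit,
  // where it annotates the exhibit and suggests the next stop on the route.
  deployAgent(reading.visitorId, targetHost);
  currentHost.set(reading.visitorId, targetHost);
}

// Placeholder implementations so the sketch is self-contained.
function exhibitHostFor(exhibitId: string): string {
  return `host-${exhibitId}`;
}
function deployAgent(visitorId: string, host: string): void {
  console.log(`Deploying guide agent for ${visitorId} to ${host}`);
}

onTagDetected({ visitorId: "visitor-42", exhibitId: "exhibit-7" });
```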
Space Technology for Palate Surgery
NASA Technical Reports Server (NTRS)
1980-01-01
The University of Miami utilized NASA's spacecraft viewing technology to develop the optical profilometer, which provides more accurate measurements of cleft palate casts than has heretofore been possible, enabling better planning of corrective surgery. The lens-like instrument electronically scans a palate cast, precisely measuring its irregular contours by detecting minute differences in the intensity of a light beam reflected off the cast. Readings are computer-processed and delivered to the surgeon by teleprinter.
McClure, Erin; Baker, Nathaniel; Carpenter, Matthew J; Treiber, Frank A; Gray, Kevin
2017-06-01
Despite the public health relevance of smoking in adolescents and emerging adults, this group remains understudied and underserved. High technology utilization among this group may be harnessed as a tool for better understanding of smoking, yet little is known regarding the acceptability of mobile health (mHealth) integration. Participants (ages 14-21) enrolled in a smoking cessation clinical trial provided feedback on their technology utilization, perceptions, and attitudes, as well as their interest in remote monitoring of smoking. Characteristics that predicted greater technology acceptability for smoking treatment were also explored. Participants (N=87) averaged 19 years old and were mostly male (67%). Technology utilization was high for smartphone ownership (93%), Internet use (98%), and social media use (94%). Despite this, only one-third of participants had ever searched the Internet for cessation tips or counseling (33%). Participants showed interest in mHealth-enabled treatment (48%) and felt that it could be somewhat helpful (83%). Heavier smokers had more favorable attitudes toward technology-based treatment, as did those with smartphones and unlimited data. Our results demonstrate high technology utilization, favorable attitudes towards technology, and minimal concerns. Technology integration among this population should be pursued, though in a tailored fashion, to accomplish the goal of providing maximally effective, just-in-time interventions.
Missile signal processing common computer architecture for rapid technology upgrade
NASA Astrophysics Data System (ADS)
Rabinkin, Daniel V.; Rutledge, Edward; Monticciolo, Paul
2004-10-01
Interceptor missiles process IR images to locate an intended target and guide the interceptor towards it. Signal processing requirements have increased as sensor bandwidths increase and interceptors operate against more sophisticated targets. A typical interceptor signal processing chain comprises two parts. Front-end video processing operates on all pixels of the image and performs such operations as non-uniformity correction (NUC), image stabilization, frame integration, and detection. Back-end target processing, which tracks and classifies targets detected in the image, performs such algorithms as Kalman tracking, spectral feature extraction, and target discrimination. In the past, video processing was implemented using ASIC components or FPGAs because computation requirements exceeded the throughput of general-purpose processors. Target processing was performed using hybrid architectures that included ASICs, DSPs, and general-purpose processors. The resulting systems tended to be function-specific and required custom software development. They were developed using non-integrated toolsets, and test equipment was developed along with the processor platform. The lifespan of a system utilizing the signal processing platform often spans decades, while the specialized nature of the processor hardware and software makes it difficult and costly to upgrade. As a result, the signal processing systems often run on outdated technology, algorithms are difficult to update, and system effectiveness is impaired by the inability to rapidly respond to new threats. A new design approach is made possible by three developments: Moore's Law-driven improvement in computational throughput; a newly introduced vector computing capability in general-purpose processors; and a modern set of open interface software standards. Today's multiprocessor commercial-off-the-shelf (COTS) platforms have sufficient throughput to support interceptor signal processing requirements. This application may be programmed under existing real-time operating systems using parallel processing software libraries, resulting in highly portable code that can be rapidly migrated to new platforms as processor technology evolves. Standardized development tools and third-party software upgrades are enabled, as is rapid replacement of processing components as improved algorithms are developed. The resulting weapon system will have processing capability superior to that of a custom approach at the time of deployment as a result of shorter development cycles and the use of newer technology. The signal processing computer may be upgraded over the lifecycle of the weapon system and, because modifications are simple, can migrate between weapon system variants. This paper presents a reference design using the new approach that utilizes an Altivec PowerPC parallel COTS platform. It uses a VxWorks-based real-time operating system (RTOS) and application code developed with an efficient parallel vector library (PVL). A quantification of computing requirements and a demonstration of interceptor algorithms operating on this real-time platform are provided.
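As an illustrative sketch only, and not the reference design's PVL-based implementation, the following separates the per-pixel front-end pass (non-uniformity correction and detection) from the per-detection back-end pass described above; all names and the synthetic data are assumptions.

```typescript
// Illustrative two-stage interceptor signal-processing chain (hypothetical).
// Front end: per-pixel operations; back end: per-detection operations.
type Frame = Float32Array; // one IR frame, row-major pixels

interface Detection {
  index: number;     // pixel index of the detection
  intensity: number; // corrected intensity
}

// Front end: non-uniformity correction (per-pixel gain/offset) plus thresholding.
function frontEnd(frame: Frame, gain: Frame, offset: Frame, threshold: number): Detection[] {
  const detections: Detection[] = [];
  for (let i = 0; i < frame.length; i++) {
    const corrected = frame[i] * gain[i] + offset[i]; // NUC
    if (corrected > threshold) {
      detections.push({ index: i, intensity: corrected });
    }
  }
  return detections;
}

// Back end: operate only on detections, e.g., pick the strongest as a track seed.
function backEnd(detections: Detection[]): Detection | null {
  return detections.reduce<Detection | null>(
    (best, d) => (best === null || d.intensity > best.intensity ? d : best),
    null
  );
}

// Tiny demo with synthetic data.
const n = 16;
const frame = Float32Array.from({ length: n }, () => Math.random());
const gain = new Float32Array(n).fill(1.0);
const offset = new Float32Array(n).fill(0.0);
console.log(backEnd(frontEnd(frame, gain, offset, 0.9)));
```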