Great Expectations: Distributed Financial Computing at Cornell.
ERIC Educational Resources Information Center
Schulden, Louise; Sidle, Clint
1988-01-01
The Cornell University Distributed Accounting (CUDA) system is an attempt to provide departments with a software tool for better managing their finances, creating microcomputer standards, creating a vehicle for better administrative microcomputer support, and ensuring local systems are consistent with central computer systems. (Author/MLW)
Dongarra, Jack; Heroux, Michael A.; Luszczek, Piotr
2015-08-17
Here, we describe a new high-performance conjugate-gradient (HPCG) benchmark. HPCG is composed of computations and data-access patterns commonly found in scientific applications. HPCG strives for a better correlation to existing codes from the computational science domain and to be representative of their performance. Furthermore, HPCG is meant to help drive the computer system design and implementation in directions that will better impact future performance improvement.
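The kernel at the heart of HPCG is a preconditioned conjugate-gradient solve dominated by sparse matrix-vector products. As a rough illustration of that kernel only (not the HPCG reference implementation, which uses a multigrid-preconditioned sparse solver), a minimal unpreconditioned conjugate-gradient iteration in Python/NumPy might look like the sketch below; the small dense matrix stands in for HPCG's sparse problem.

    import numpy as np

    def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
        # Solve A x = b for a symmetric positive-definite matrix A.
        x = np.zeros_like(b)
        r = b - A @ x          # residual
        p = r.copy()           # search direction
        rs_old = r @ r
        for _ in range(max_iter):
            Ap = A @ p
            alpha = rs_old / (p @ Ap)
            x += alpha * p
            r -= alpha * Ap
            rs_new = r @ r
            if np.sqrt(rs_new) < tol:
                break
            p = r + (rs_new / rs_old) * p   # update search direction
            rs_old = rs_new
        return x

    # Example: a tiny SPD system standing in for HPCG's sparse Laplacian-like matrix.
    A = np.array([[4.0, 1.0], [1.0, 3.0]])
    b = np.array([1.0, 2.0])
    print(conjugate_gradient(A, b))   # approx. [0.0909, 0.6364]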
A Model-based Framework for Risk Assessment in Human-Computer Controlled Systems
NASA Technical Reports Server (NTRS)
Hatanaka, Iwao
2000-01-01
The rapid growth of computer technology and innovation has played a significant role in the rise of computer automation of human tasks in modern production systems across all industries. Although the rationale for automation has been to eliminate "human error" or to relieve humans from manual repetitive tasks, various computer-related hazards and accidents have emerged as a direct result of increased system complexity attributed to computer automation. The risk assessment techniques utilized for electromechanical systems are not suitable for today's software-intensive systems or complex human-computer controlled systems. This thesis will propose a new systemic model-based framework for analyzing risk in safety-critical systems where both computers and humans are controlling safety-critical functions. A new systems accident model will be developed based upon modern systems theory and human cognitive processes to better characterize system accidents, the role of human operators, and the influence of software in its direct control of significant system functions. Better risk assessments will then be achievable through the application of this new framework to complex human-computer controlled systems.
Safety Metrics for Human-Computer Controlled Systems
NASA Technical Reports Server (NTRS)
Leveson, Nancy G; Hatanaka, Iwao
2000-01-01
The rapid growth of computer technology and innovation has played a significant role in the rise of computer automation of human tasks in modern production systems across all industries. Although the rationale for automation has been to eliminate "human error" or to relieve humans from manual repetitive tasks, various computer-related hazards and accidents have emerged as a direct result of increased system complexity attributed to computer automation. The risk assessment techniques utilized for electromechanical systems are not suitable for today's software-intensive systems or complex human-computer controlled systems. This thesis will propose a new systemic model-based framework for analyzing risk in safety-critical systems where both computers and humans are controlling safety-critical functions. A new systems accident model will be developed based upon modern systems theory and human cognitive processes to better characterize system accidents, the role of human operators, and the influence of software in its direct control of significant system functions. Better risk assessments will then be achievable through the application of this new framework to complex human-computer controlled systems.
Proposal for hierarchical description of software systems
NASA Technical Reports Server (NTRS)
Thauboth, H.
1973-01-01
The programming of digital computers has developed into a new dimension full of difficulties, because the hardware of computers has become so powerful that more complex applications are entrusted to computers. The costs of software development, verification, and maintenance are outpacing those of the hardware, and the trend is toward further increases in the sophistication of computer applications and consequently of software. To obtain better visibility into software systems and to improve the structure of software systems for better tests, verification, and maintenance, a clear but rigorous description and documentation of software is needed. The purpose of the report is to extend the present methods in order to obtain a documentation that better reflects the interplay between the various components and functions of a software system at different levels of detail without losing precision in expression. This is done by the use of block diagrams, sequence diagrams, and cross-reference charts. In the appendices, examples from an actual large software system, i.e. the Marshall System for Aerospace Systems Simulation (MARSYAS), are presented. The proposed documentation structure is compatible with automation of updating significant portions of the documentation for better software change control.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-13
... Generation Risk Assessment: Incorporation of Recent Advances in Molecular, Computational, and Systems Biology... molecular, computational, and systems biology data can better inform risk assessment. This draft document is available for...
Developing Crash-Resistant Electronic Services.
ERIC Educational Resources Information Center
Almquist, Arne J.
1997-01-01
Libraries' dependence on computers can lead to frustrations for patrons and staff during downtime caused by computer system failures. Advice for reducing the number of crashes is provided, focusing on improved training for systems staff, better management of library systems, and the development of computer systems using quality components which…
Elastic Cloud Computing Architecture and System for Heterogeneous Spatiotemporal Computing
NASA Astrophysics Data System (ADS)
Shi, X.
2017-10-01
Spatiotemporal computation implements a variety of different algorithms. When big data are involved, a desktop computer or standalone application may not be able to complete the computation task due to limited memory and computing power. Now that a variety of hardware accelerators and computing platforms are available to improve the performance of geocomputation, different algorithms may behave differently on different computing infrastructures and platforms. Some are perfect for implementation on a cluster of graphics processing units (GPUs), while GPUs may not be useful for certain kinds of spatiotemporal computation. The same situation arises in utilizing a cluster of Intel's many-integrated-core (MIC) processors or Xeon Phi, as well as Hadoop or Spark platforms, to handle big spatiotemporal data. Furthermore, considering the energy efficiency requirements of general computation, a Field Programmable Gate Array (FPGA) may be a better solution for energy efficiency when its computational performance is similar to or better than that of GPUs and MICs. It is expected that an elastic cloud computing architecture and system that integrates GPUs, MICs, and FPGAs could be developed and deployed to support spatiotemporal computing over heterogeneous data types and computational problems.
Computers Launch Faster, Better Job Matching
ERIC Educational Resources Information Center
Stevenson, Gloria
1976-01-01
Employment Security Automation Project (ESAP), a five-year program sponsored by the Employment and Training Administration, features an innovative computer-assisted job matching system and instantaneous computer-assisted service for unemployment insurance claimants. ESAP will also consolidate existing automated employment security systems to…
3-D Electromagnetic field analysis of wireless power transfer system using K computer
NASA Astrophysics Data System (ADS)
Kawase, Yoshihiro; Yamaguchi, Tadashi; Murashita, Masaya; Tsukada, Shota; Ota, Tomohiro; Yamamoto, Takeshi
2018-05-01
We analyze the electromagnetic field of a wireless power transfer system using the 3-D parallel finite element method on the K computer, a supercomputer in Japan. It is clarified that the electromagnetic field of the wireless power transfer system can be analyzed in a practical time using parallel computation on the K computer; moreover, the accuracy of the loss calculation improves as the mesh division of the shield becomes finer.
Massaroni, Carlo; Cassetta, Eugenio; Silvestri, Sergio
2017-10-01
Respiratory assessment can be carried out by using motion capture systems. A geometrical model is mandatory in order to compute the breathing volume as a function of time from the markers' trajectories. This study describes a novel model to compute volume changes and calculate respiratory parameters by using a motion capture system. The novel method, i.e., the prism-based method, computes the volume enclosed within the chest by defining 82 prisms from the 89 markers attached to the subject's chest. Volumes computed with this method are compared to spirometry volumes and to volumes computed by a conventional method based on the tetrahedral decomposition of the chest wall and integrated in a commercial motion capture system. Eight healthy volunteers were enrolled and 30 seconds of quiet breathing data were collected from each of them. Results show a better agreement between volumes computed by the prism-based method and the spirometry (discrepancy of 2.23%, R^2 = .94) compared to the agreement between volumes computed by the conventional method and the spirometry (discrepancy of 3.56%, R^2 = .92). The proposed method also showed better performance in the calculation of respiratory parameters. Our findings open up prospects for the further use of the new method in breathing assessment via motion capture systems.
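The abstract does not give the prism construction in detail, so the following is only a hedged sketch of the idea: each prism is taken here to sit between a triangle of chest markers and its projection onto a flat reference plane behind the subject, and its volume is the projected triangle area times the mean marker height. The marker coordinates, the reference plane, and the function name prism_volume are illustrative assumptions, not the authors' formulation.

    import numpy as np

    def prism_volume(p1, p2, p3, z_ref=0.0):
        # Volume of the prism between the triangle (p1, p2, p3) on the chest
        # surface and its projection onto the reference plane z = z_ref.
        # Each point is (x, y, z); z is taken as the anterior-posterior direction.
        base = np.array([p1[:2], p2[:2], p3[:2]])
        # Area of the projected triangle via the shoelace formula.
        area = 0.5 * abs(
            (base[1, 0] - base[0, 0]) * (base[2, 1] - base[0, 1])
            - (base[2, 0] - base[0, 0]) * (base[1, 1] - base[0, 1])
        )
        mean_height = np.mean([p1[2], p2[2], p3[2]]) - z_ref
        return area * mean_height

    # Chest-wall volume at one frame: sum the prisms over all marker triangles.
    triangles = [((0.0, 0.0, 5.0), (1.0, 0.0, 5.2), (0.0, 1.0, 5.1))]  # hypothetical markers (cm)
    volume = sum(prism_volume(*tri) for tri in triangles)
    print(volume)   # 2.55 for this single illustrative prism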
Computer-Aided College Algebra: Learning Components that Students Find Beneficial
ERIC Educational Resources Information Center
Aichele, Douglas B.; Francisco, Cynthia; Utley, Juliana; Wescoatt, Benjamin
2011-01-01
A mixed-method study was conducted during the Fall 2008 semester to better understand the experiences of students participating in computer-aided instruction of College Algebra using the software MyMathLab. The learning environment included a computer learning system for the majority of the instruction, a support system via focus groups (weekly…
Modeling ground-based timber harvesting systems using computer simulation
Jingxin Wang; Chris B. LeDoux
2001-01-01
Modeling ground-based timber harvesting systems with an object-oriented methodology was investigated. Object-oriented modeling and design promote a better understanding of requirements, cleaner designs, and better maintainability of the harvesting simulation system. The model developed simulates chainsaw felling, drive-to-tree feller-buncher, swing-to-tree single-grip...
Pagliari, C M; Hoang, T; Reddy, M; Wilkinson, L S; Poloniecki, J D; Given-Wilson, R M
2012-01-01
Objective: To compare reader ratings of the clinical diagnostic quality of 50 and 100 μm computed radiography (CR) systems with screen–film mammography (SFM) in operative specimens. Methods: Mammograms of 57 fresh operative breast specimens were analysed by 10 readers. Exposures were made with identical position and compression with three mammographic systems (Fuji 100CR, 50CR and SFM). Images were anonymised and readers blinded to the CR system used. A five-point comparative scoring system (−2 to +2) was used to assess seven quality criteria and overall diagnostic value. Statistical analysis of the reader ratings (n=16,925) was subsequently performed. Results: For most quality criteria, both CR systems were rated as equivalent to or better than SFM. The CR systems were significantly better at demonstrating skin edge and background tissue (p < 1×10^-5). Microcalcification was best demonstrated on the CR50 system (p < 1×10^-5). The overall diagnostic value of both CR systems was rated as being as good as or better than SFM (p < 1×10^-5). Conclusion: In this clinical setting, the overall diagnostic performance of both CR systems was as good as or better than SFM, with the CR50 system performing better than the CR100. PMID:22096218
Biological Basis For Computer Vision: Some Perspectives
NASA Astrophysics Data System (ADS)
Gupta, Madan M.
1990-03-01
Using biology as a basis for the development of sensors, devices and computer vision systems is a challenge to systems and vision scientists. It is also a field of promising research for engineering applications. Biological sensory systems, such as vision, touch and hearing, sense different physical phenomena from our environment, yet they possess some common mathematical functions. These mathematical functions are cast into the neural layers which are distributed throughout our sensory regions, sensory information transmission channels and in the cortex, the centre of perception. In this paper, we are concerned with the study of the biological vision system and the emulation of some of its mathematical functions, both retinal and visual cortex, for the development of a robust computer vision system. This field of research is not only intriguing, but offers a great challenge to systems scientists in the development of functional algorithms. These functional algorithms can be generalized for further studies in such fields as signal processing, control systems and image processing. Our studies are heavily dependent on the use of fuzzy-neural layers and generalized receptive fields. Building blocks of such neural layers and receptive fields may lead to the design of better sensors and better computer vision systems. It is hoped that these studies will lead to the development of better artificial vision systems with various applications to vision prosthesis for the blind, robotic vision, medical imaging, medical sensors, industrial automation, remote sensing, space stations and ocean exploration.
Argonne Out Loud: Computation, Big Data, and the Future of Cities
Catlett, Charlie
2018-01-16
Charlie Catlett, a Senior Computer Scientist at Argonne and Director of the Urban Center for Computation and Data at the Computation Institute of the University of Chicago and Argonne, talks about how he and his colleagues are using high-performance computing, data analytics, and embedded systems to better understand and design cities.
NASA Astrophysics Data System (ADS)
Li, Haiqing; Chatterjee, Samir
With rapid advances in information and communication technology, computer-mediated communication (CMC) technologies are utilizing multiple IT platforms such as email, websites, cell-phones/PDAs, social networking sites, and gaming environments. However, no studies have compared the effectiveness of a persuasive system using such alternative channels and various persuasive techniques. Moreover, how affective computing impacts the effectiveness of persuasive systems is not clear. This study proposes that (1) persuasive technology channels in combination with persuasive strategies will have different persuasive effectiveness; and (2) adding positive emotion to a message that leads to a better overall user experience could increase persuasive effectiveness. The affective computing or emotion information was added to the experiment using emoticons. The initial results of a pilot study show that computer-mediated communication channels along with various persuasive strategies can affect the persuasive effectiveness to varying degrees. These results also show that adding a positive emoticon to a message leads to a better user experience, which increases the overall persuasive effectiveness of a system.
ERIC Educational Resources Information Center
Buzzetto-More, Nicole; Ukoha, Ojiabo; Rustagi, Narendra
2010-01-01
The under representation of women and minorities in undergraduate computer science and information systems programs is a pervasive and persistent problem in the United States. Needed is a better understanding of the background and psychosocial factors that attract, or repel, minority students from computing disciplines. An examination of these…
Computers in Communications and Education at Coast Community College District.
ERIC Educational Resources Information Center
Luskin, Bernard J.; Ruth, Monty W.
Coast Community College District in Orange County, California is a leader among community colleges in the instructional use of computers. The district's hardware consists of an IBM System 370 Model 155 computer, over 80 typewriter terminals, 12 cathode ray tubes (CRT), and several microfiche image projection devices. Better than 700 computer-assisted…
Systems Biology in Immunology – A Computational Modeling Perspective
Germain, Ronald N.; Meier-Schellersheim, Martin; Nita-Lazar, Aleksandra; Fraser, Iain D. C.
2011-01-01
Systems biology is an emerging discipline that combines high-content, multiplexed measurements with informatic and computational modeling methods to better understand biological function at various scales. Here we present a detailed review of the methods used to create computational models and conduct simulations of immune function. We provide descriptions of the key data gathering techniques employed to generate the quantitative and qualitative data required for such modeling and simulation and summarize the progress to date in applying these tools and techniques to questions of immunological interest, including infectious disease. We include comments on what insights modeling can provide that complement information obtained from the more familiar experimental discovery methods used by most investigators and why quantitative methods are needed to eventually produce a better understanding of immune system operation in health and disease. PMID:21219182
ERIC Educational Resources Information Center
Orey, Michael A.; Nelson, Wayne A.
Arguing that the evolution of intelligent tutoring systems better reflects the recent theoretical developments of cognitive science than traditional computer-based instruction (CBI), this paper describes a general model for an intelligent tutoring system and suggests ways to improve CBI using design principles derived from research in cognitive…
Cloud Computing Techniques for Space Mission Design
NASA Technical Reports Server (NTRS)
Arrieta, Juan; Senent, Juan
2014-01-01
The overarching objective of space mission design is to tackle complex problems producing better results, and faster. In developing the methods and tools to fulfill this objective, the user interacts with the different layers of a computing system.
A Strategic Approach to Network Defense: Framing the Cloud
2011-03-10
accepted network defensive principles, to reduce risks associated with emerging virtualization capabilities and scalability of cloud computing. This expanded... defensive framework can assist enterprise networking and cloud computing architects to better design more secure systems.
Indirection and computer security.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Berg, Michael J.
2011-09-01
The discipline of computer science is built on indirection. David Wheeler famously said, 'All problems in computer science can be solved by another layer of indirection. But that usually will create another problem'. We propose that every computer security vulnerability is yet another problem created by the indirections in system designs and that focusing on the indirections involved is a better way to design, evaluate, and compare security solutions. We are not proposing that indirection be avoided when solving problems, but that understanding the relationships between indirections and vulnerabilities is key to securing computer systems. Using this perspective, we analyze common vulnerabilities that plague our computer systems, consider the effectiveness of currently available security solutions, and propose several new security solutions.
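As a toy illustration of the report's argument (not code from the report), the snippet below adds one layer of indirection, a dispatch table between callers and functions, and shows that the same indirection point is exactly where tampering subverts the system; the function and table names are hypothetical.

    # One layer of indirection: callers name an operation, and a dispatch table
    # decides what actually runs. The table is also the point an attacker would
    # target, which is the motivation for analyzing indirections directly.
    def read_public_file():
        return "public data"

    def read_secret_file():
        return "secret data"

    dispatch = {"read": read_public_file}        # the indirection layer

    def handle(request):
        return dispatch[request]()               # the caller never names the function directly

    print(handle("read"))                        # "public data"

    # If the binding can be rewritten, the unchanged caller now reaches the secret:
    dispatch["read"] = read_secret_file          # hypothetical tampering with the indirection
    print(handle("read"))                        # "secret data"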
Computer graphics application in the engineering design integration system
NASA Technical Reports Server (NTRS)
Glatt, C. R.; Abel, R. W.; Hirsch, G. N.; Alford, G. E.; Colquitt, W. N.; Stewart, W. A.
1975-01-01
The computer graphics aspect of the Engineering Design Integration (EDIN) system and its application to design problems were discussed. Three basic types of computer graphics may be used with the EDIN system for the evaluation of aerospace vehicle preliminary designs: offline graphics systems using vellum-inking or photographic processes, online graphics systems characterized by direct-coupled low-cost storage tube terminals with limited interactive capabilities, and a minicomputer-based refresh terminal offering highly interactive capabilities. The offline systems are characterized by high quality (resolution better than 0.254 mm) and slow turnaround (one to four days). The online systems are characterized by low cost, instant visualization of the computer results, slow line speed (300 BAUD), poor hard copy, and the early limitations on vector graphic input capabilities. The recent acquisition of the Adage 330 Graphic Display system has greatly enhanced the potential for interactive computer-aided design.
Yamamoto, K; Ogura, H; Furutani, H; Kitazoe, Y; Takeda, Y; Hirakawa, M
1986-01-01
A computer system for operations management is introduced, which has been in use since October 1981 at Kochi Medical School as one of the integral sub-systems of the total hospital information system called IMIS. The system was designed from the beginning with the main purpose of obtaining better management of operations, and detailed medical records are included for before, during and after operations. It is shown that almost all operations except emergencies were managed using the computer system rather than the paper system. After presenting some of the results of the accumulated records, we discuss the reasons for this high frequency of use of the computer system.
ERIC Educational Resources Information Center
Hecquet, Ignace; And Others
Principles are outlined that are used as a basis for the system of pricing the services of the Computer Centre. The system illustrates the use of a management method to secure better utilization of university resources. Departments decide how to use the appropriations granted to them and establish a system of internal prices that reflect the cost…
DOE Office of Scientific and Technical Information (OSTI.GOV)
2015-10-20
The look-ahead dynamic simulation software system incorporates high-performance parallel computing technologies, significantly reduces the solution time for each transient simulation case, and brings dynamic simulation analysis into on-line applications to enable more transparency for better reliability and asset utilization. It takes a snapshot of the current power grid status, performs the system dynamic simulation in parallel, and outputs the transient response of the power system in real time.
Evaluation of Advanced Computing Techniques and Technologies: Reconfigurable Computing
NASA Technical Reports Server (NTRS)
Wells, B. Earl
2003-01-01
The focus of this project was to survey the technology of reconfigurable computing and determine its level of maturity and suitability for NASA applications; to better understand and assess the effectiveness of the reconfigurable design paradigm utilized within the HAL-15 reconfigurable computer system, which was made available to NASA MSFC for this purpose by Star Bridge Systems, Inc.; and to implement at least one application that would benefit from the performance levels possible with reconfigurable hardware. It was originally proposed that experiments in fault tolerance and dynamic reconfigurability would be performed, but time constraints mandated that these be pursued as future research.
A Study of the Tablet Computer's Application in K-12 Schools in China
ERIC Educational Resources Information Center
Long, Taotao; Liang, Wenxin; Yu, Shengquan
2013-01-01
As an emerging mobile terminal, the tablet computer has begun to enter into the educational system. With the aim of having a better understanding of the application and people's perspectives on the new technology in K-12 schools in China, a survey was conducted to investigate the tablet computer's application, user's perspectives and requirements…
NASA Applications for Computational Electromagnetic Analysis
NASA Technical Reports Server (NTRS)
Lewis, Catherine C.; Trout, Dawn H.; Krome, Mark E.; Perry, Thomas A.
2011-01-01
Computational Electromagnetic Software is used by NASA to analyze the compatibility of systems too large or too complex for testing. Recent advances in software packages and computer capabilities have made it possible to determine the effects of a transmitter inside a launch vehicle fairing, better analyze the environment threats, and perform on-orbit replacements with assured electromagnetic compatibility.
ERIC Educational Resources Information Center
Feldmann, Richard J.; And Others
1972-01-01
Computer graphics provides a valuable tool for the representation and a better understanding of structures, both small and large. Accurate and rapid construction, manipulation, and plotting of structures, such as macromolecules as complex as hemoglobin, are performed by a collection of computer programs and a time-sharing computer. (21 references)…
INTERACTIONS BETWEEN ORGANIC COMPOUNDS AND CYCLODEXTRIN-CLAY SYSTEMS
Computational and experimental techniques are combined in order to better understand interactions involving organic compounds and cyclodextrin (CD)-clay systems. CD-clay systems may have great potential in the containment of organic contaminants in the environment. This study w...
Reducing usage of the computational resources by event driven approach to model predictive control
NASA Astrophysics Data System (ADS)
Misik, Stefan; Bradac, Zdenek; Cela, Arben
2017-08-01
This paper deals with real-time optimal control of dynamic systems while also considering the constraints to which these systems might be subject. The main objective of this work is to propose a simple modification of the existing Model Predictive Control approach to better suit the needs of computational-resource-constrained real-time systems. An example using a model of a mechanical system is presented and the performance of the proposed method is evaluated in a simulated environment.
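A hedged sketch of the general idea of event-driven predictive control follows: the optimizer is re-run only when the state has drifted sufficiently far from the state at which the current input was computed, rather than at every sampling instant. The scalar plant, the threshold, and the stand-in solve_mpc function are assumptions for illustration and are not the specific modification proposed in the paper.

    def solve_mpc(x):
        # Stand-in for the full MPC optimization: a simple proportional law here.
        return -0.8 * x

    def simulate(x0=5.0, steps=30, threshold=0.2):
        # Recompute the control input only when the state has drifted more than
        # `threshold` from the state at which the last input was computed.
        x = x0
        u = solve_mpc(x0)
        x_at_last_solve = x0
        solves = 1
        for _ in range(steps):
            x = 0.9 * x + u              # hypothetical scalar plant x(k+1) = 0.9 x(k) + u(k)
            if abs(x - x_at_last_solve) > threshold:
                u = solve_mpc(x)         # event: re-run the optimizer
                x_at_last_solve = x
                solves += 1
        return x, solves

    x_final, n_solves = simulate()
    print(x_final, n_solves)             # far fewer solves than the 30 a per-step controller needs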
The Guide to Better Hospital Computer Decisions
Dorenfest, Sheldon I.
1981-01-01
A soon-to-be-published major study of hospital computer use entitled “The Guide to Better Hospital Computer Decisions” was conducted by my firm over the past 2½ years. The study required over twenty (20) man years of effort at a cost of over $300,000, and the six (6) volume final report provides more than 1,000 pages of data about how hospitals are and will be using computerized medical and business information systems. It describes the current status and future expectations for computer use in major application areas, such as, but not limited to, finance, admitting, pharmacy, laboratory, data collection and hospital or medical information systems. It also includes profiles of over 100 companies and other types of organizations providing data processing products and services to hospitals. In this paper, we discuss the need for the study, the specific objectives of the study, the methodology and approach taken to complete the study and a few major conclusions.
Terahertz Computed Tomography of NASA Thermal Protection System Materials
NASA Technical Reports Server (NTRS)
Roth, D. J.; Reyes-Rodriguez, S.; Zimdars, D. A.; Rauser, R. W.; Ussery, W. W.
2011-01-01
A terahertz axial computed tomography system has been developed that uses time domain measurements in order to form cross-sectional image slices and three-dimensional volume renderings of terahertz-transparent materials. The system can inspect samples as large as 0.0283 cubic meters (1 cubic foot) with none of the safety concerns of x-ray computed tomography. In this study, the system is evaluated for its ability to detect and characterize flat bottom holes, drilled holes, and embedded voids in foam materials utilized as thermal protection on the external fuel tanks for the Space Shuttle. X-ray micro-computed tomography was also performed on the samples to compare against the terahertz computed tomography results and better define embedded voids. Limits of detectability based on depth and size for the samples used in this study are loosely defined. Image sharpness and morphology characterization ability for terahertz computed tomography are qualitatively described.
ERIC Educational Resources Information Center
Daniels, Mindy A.
2012-01-01
The purpose of this case study was to compare the pedagogical and affective efficiency and efficacy of creative prose fiction writing workshops taught via asynchronous computer-mediated online distance education with creative prose fiction writing workshops taught face-to-face in order to better understand their operational pedagogy and…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shorgin, Sergey Ya.; Pechinkin, Alexander V.; Samouylov, Konstantin E.
Cloud computing is a promising technology to manage and improve utilization of computing center resources to deliver various computing and IT services. For the purpose of energy saving, there is no need to unnecessarily operate many servers under light loads, and they are switched off. On the other hand, some servers should be switched on in heavy load cases to prevent very long delays. Thus, waiting times and system operating cost can be maintained at an acceptable level by dynamically adding or removing servers. One more fact that should be taken into account is significant server setup costs and activation times. For better energy efficiency, a cloud computing system should not react to instantaneous increases or instantaneous decreases of load. That is the main motivation for using queuing systems with hysteresis for cloud computing system modelling. In the paper, we provide a model of a cloud computing system in terms of a multiple-server threshold-based infinite-capacity queuing system with hysteresis and noninstantaneous server activation. For the proposed model, we develop a method for computing steady-state probabilities that allows estimation of a number of performance measures.
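The following is a small illustrative simulation (not the queuing-theoretic model of the paper) of the hysteresis idea: servers are switched on only above a high per-server load threshold, switched off only below a much lower one, and a newly started server only becomes usable after a setup delay. All thresholds, the delay, and the load trace are assumed values.

    def scale_with_hysteresis(queue_lengths, up_threshold=20, down_threshold=5,
                              setup_delay=3, min_servers=1, max_servers=8):
        # Track the number of active servers over a trace of observed queue lengths.
        # A server that is switched on only becomes usable after `setup_delay` steps.
        active = min_servers
        pending = []                                  # times at which warming-up servers become ready
        history = []
        for t, q in enumerate(queue_lengths):
            active += sum(1 for ready_at in pending if ready_at <= t)
            pending = [ready_at for ready_at in pending if ready_at > t]
            per_server = q / max(active + len(pending), 1)
            if per_server > up_threshold and active + len(pending) < max_servers:
                pending.append(t + setup_delay)       # event: switch a server on (not yet usable)
            elif per_server < down_threshold and active > min_servers:
                active -= 1                           # event: switch an idle server off
            history.append(active)
        return history

    trace = [5, 30, 60, 80, 80, 60, 40, 10, 5, 3, 3, 2]   # hypothetical queue lengths per step
    print(scale_with_hysteresis(trace))                   # [1, 1, 1, 1, 2, 3, 4, 3, 2, 1, 1, 1]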
Experimental comparison of two quantum computing architectures.
Linke, Norbert M; Maslov, Dmitri; Roetteler, Martin; Debnath, Shantanu; Figgatt, Caroline; Landsman, Kevin A; Wright, Kenneth; Monroe, Christopher
2017-03-28
We run a selection of algorithms on two state-of-the-art 5-qubit quantum computers that are based on different technology platforms. One is a publicly accessible superconducting transmon device (www.ibm.com/ibm-q) with limited connectivity, and the other is a fully connected trapped-ion system. Even though the two systems have different native quantum interactions, both can be programed in a way that is blind to the underlying hardware, thus allowing a comparison of identical quantum algorithms between different physical systems. We show that quantum algorithms and circuits that use more connectivity clearly benefit from a better-connected system of qubits. Although the quantum systems here are not yet large enough to eclipse classical computers, this experiment exposes critical factors of scaling quantum computers, such as qubit connectivity and gate expressivity. In addition, the results suggest that codesigning particular quantum applications with the hardware itself will be paramount in successfully using quantum computers in the future.
Tracking by Identification Using Computer Vision and Radio
Mandeljc, Rok; Kovačič, Stanislav; Kristan, Matej; Perš, Janez
2013-01-01
We present a novel system for detection, localization and tracking of multiple people, which fuses a multi-view computer vision approach with a radio-based localization system. The proposed fusion combines the best of both worlds, excellent computer-vision-based localization, and strong identity information provided by the radio system, and is therefore able to perform tracking by identification, which makes it impervious to propagated identity switches. We present comprehensive methodology for evaluation of systems that perform person localization in world coordinate system and use it to evaluate the proposed system as well as its components. Experimental results on a challenging indoor dataset, which involves multiple people walking around a realistically cluttered room, confirm that proposed fusion of both systems significantly outperforms its individual components. Compared to the radio-based system, it achieves better localization results, while at the same time it successfully prevents propagation of identity switches that occur in pure computer-vision-based tracking. PMID:23262485
Systems Management of Air Force Standard Communications-Computer Systems: There is a Better Way
1988-04-01
upgrade or replacement of systems. AFR 700-6, Information Systems Operation Management, AFR 700-7, Information Processing Center Operations Management... and AFR 700-8, Telephone Systems Operation Management provide USAF guidance, policy and procedures governing this phase. 2. 800-Series Regulations
A simple computational algorithm of model-based choice preference.
Toyama, Asako; Katahira, Kentaro; Ohira, Hideki
2017-08-01
A broadly used computational framework posits that two learning systems operate in parallel during the learning of choice preferences-namely, the model-free and model-based reinforcement-learning systems. In this study, we examined another possibility, through which model-free learning is the basic system and model-based information is its modulator. Accordingly, we proposed several modified versions of a temporal-difference learning model to explain the choice-learning process. Using the two-stage decision task developed by Daw, Gershman, Seymour, Dayan, and Dolan (2011), we compared their original computational model, which assumes a parallel learning process, and our proposed models, which assume a sequential learning process. Choice data from 23 participants showed a better fit with the proposed models. More specifically, the proposed eligibility adjustment model, which assumes that the environmental model can weight the degree of the eligibility trace, can explain choices better under both model-free and model-based controls and has a simpler computational algorithm than the original model. In addition, the forgetting learning model and its variation, which assume changes in the values of unchosen actions, substantially improved the fits to the data. Overall, we show that a hybrid computational model best fits the data. The parameters used in this model succeed in capturing individual tendencies with respect to both model use in learning and exploration behavior. This computational model provides novel insights into learning with interacting model-free and model-based components.
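As a rough, hedged sketch of the kind of model-free update with an eligibility trace discussed above (not the authors' fitted model), a two-stage task can be simulated as follows; the transition probabilities, reward scheme, and parameter values are assumptions made for illustration only.

    import random

    alpha, gamma, lam = 0.1, 1.0, 0.6       # learning rate, discount, eligibility-trace decay

    # Q-values for a toy two-stage task: stage-1 actions lead (stochastically)
    # to one of two stage-2 states, each with two actions and a probabilistic reward.
    Q = {("s1", "a"): 0.0, ("s1", "b"): 0.0,
         ("s2A", "a"): 0.0, ("s2A", "b"): 0.0,
         ("s2B", "a"): 0.0, ("s2B", "b"): 0.0}

    def td_update_episode(choice1, state2, choice2, reward):
        # Model-free SARSA-style updates; the eligibility trace lets the
        # stage-2 prediction error also credit the stage-1 choice.
        delta1 = gamma * Q[(state2, choice2)] - Q[("s1", choice1)]
        Q[("s1", choice1)] += alpha * delta1
        delta2 = reward - Q[(state2, choice2)]
        Q[(state2, choice2)] += alpha * delta2
        Q[("s1", choice1)] += alpha * lam * delta2    # eligibility-trace credit to stage 1

    for _ in range(1000):
        c1 = random.choice(["a", "b"])
        s2 = "s2A" if (c1 == "a") == (random.random() < 0.7) else "s2B"  # common/rare transition
        c2 = random.choice(["a", "b"])
        r = 1.0 if (s2 == "s2A" and random.random() < 0.8) else 0.0      # s2A is more rewarding
        td_update_episode(c1, s2, c2, r)

    print(Q)   # stage-1 action "a" should end up valued higher than "b"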
Project Solo; Newsletter Number Four.
ERIC Educational Resources Information Center
Pittsburgh Univ., PA. Project Solo.
A paper titled "Myopia, Cornucopia and Utopia" makes up the major portion of this Project Solo Newsletter. It emphasizes the danger involved in the belief that the larger the system the better, and points out that although the computer utilizes technology, the human with judgment utilizes the computer. Some details of the Project Solo…
Using Computers in Early Childhood Classrooms: Teachers' Attitudes, Skills and Practices
ERIC Educational Resources Information Center
Chen, Jie-Qi; Chang, Charles
2006-01-01
To better prepare early childhood teachers for computer use, more information about their current skills and classroom practices is needed. Sampling from a large metropolitan public school system in the USA, the study surveyed 297 state pre-kindergarten teachers, gathering information about their attitudes, skills, and instructional methods…
Mobile cloud-computing-based healthcare service by noncontact ECG monitoring.
Fong, Ee-May; Chung, Wan-Young
2013-12-02
Noncontact electrocardiogram (ECG) measurement technique has gained popularity these days owing to its noninvasive features and convenience in daily life use. This paper presents mobile cloud computing for a healthcare system where a noncontact ECG measurement method is employed to capture biomedical signals from users. Healthcare service is provided to continuously collect biomedical signals from multiple locations. To observe and analyze the ECG signals in real time, a mobile device is used as a mobile monitoring terminal. In addition, a personalized healthcare assistant is installed on the mobile device; several healthcare features such as health status summaries, medication QR code scanning, and reminders are integrated into the mobile application. Health data are synchronized into the healthcare cloud computing service (Web server system and Web server dataset) to ensure a seamless healthcare monitoring system with anytime, anywhere network coverage. Together with a Web page application, medical data are easily accessed by medical professionals or family members. Web page performance evaluation was conducted to ensure minimal Web server latency. The system demonstrates better availability of off-site and up-to-the-minute patient data, which can help detect health problems early and keep elderly patients out of the emergency room, thus providing a better and more comprehensive healthcare cloud computing service.
Mobile Cloud-Computing-Based Healthcare Service by Noncontact ECG Monitoring
Fong, Ee-May; Chung, Wan-Young
2013-01-01
Noncontact electrocardiogram (ECG) measurement technique has gained popularity these days owing to its noninvasive features and convenience in daily life use. This paper presents mobile cloud computing for a healthcare system where a noncontact ECG measurement method is employed to capture biomedical signals from users. Healthcare service is provided to continuously collect biomedical signals from multiple locations. To observe and analyze the ECG signals in real time, a mobile device is used as a mobile monitoring terminal. In addition, a personalized healthcare assistant is installed on the mobile device; several healthcare features such as health status summaries, medication QR code scanning, and reminders are integrated into the mobile application. Health data are synchronized into the healthcare cloud computing service (Web server system and Web server dataset) to ensure a seamless healthcare monitoring system with anytime, anywhere network coverage. Together with a Web page application, medical data are easily accessed by medical professionals or family members. Web page performance evaluation was conducted to ensure minimal Web server latency. The system demonstrates better availability of off-site and up-to-the-minute patient data, which can help detect health problems early and keep elderly patients out of the emergency room, thus providing a better and more comprehensive healthcare cloud computing service. PMID:24316562
Visual analytics as a translational cognitive science.
Fisher, Brian; Green, Tera Marie; Arias-Hernández, Richard
2011-07-01
Visual analytics is a new interdisciplinary field of study that calls for a more structured scientific approach to understanding the effects of interaction with complex graphical displays on human cognitive processes. Its primary goal is to support the design and evaluation of graphical information systems that better support cognitive processes in areas as diverse as scientific research and emergency management. The methodologies that make up this new field are as yet ill defined. This paper proposes a pathway for development of visual analytics as a translational cognitive science that bridges fundamental research in human/computer cognitive systems and design and evaluation of information systems in situ. Achieving this goal will require the development of enhanced field methods for conceptual decomposition of human/computer cognitive systems that maps onto laboratory studies, and improved methods for conducting laboratory investigations that might better map onto real-world cognitive processes in technology-rich environments. Copyright © 2011 Cognitive Science Society, Inc.
Distributed computer taxonomy based on O/S structure
NASA Technical Reports Server (NTRS)
Foudriat, Edwin C.
1985-01-01
The taxonomy considers the resource structure at the operating system level. It compares a communication based taxonomy with the new taxonomy to illustrate how the latter does a better job when related to the client's view of the distributed computer. The results illustrate the fundamental features and what is required to construct fully distributed processing systems. The problem of using network computers on the space station is addressed. A detailed discussion of the taxonomy is not given here. Information is given in the form of charts and diagrams that were used to illustrate a talk.
The control of float zone interfaces by the use of selected boundary conditions
NASA Technical Reports Server (NTRS)
Foster, L. M.; Mcintosh, J.
1983-01-01
The main goal of the float zone crystal growth project of NASA's Materials Processing in Space Program is to thoroughly understand the molten zone/freezing crystal system and all the mechanisms that govern this system. The surface boundary conditions required to give flat float zone solid melt interfaces were studied and computed. The results provide float zone furnace designers with better methods for controlling solid melt interface shapes and for computing thermal profiles and gradients. Documentation and a user's guide were provided for the computer software.
Goshima, Yoshio; Hida, Tomonobu; Gotoh, Toshiyuki
2012-01-01
Axonal transport plays a crucial role in neuronal morphogenesis, survival and function. Despite its importance, however, the molecular mechanisms of axonal transport remain mostly unknown because a simple and quantitative assay system for monitoring this cellular process has been lacking. In order to better characterize the mechanisms involved in axonal transport, we formulate a novel computer-assisted monitoring system of axonal transport. Potential uses of this system and implications for future studies will be discussed.
A service based adaptive U-learning system using UX.
Jeong, Hwa-Young; Yi, Gangman
2014-01-01
In recent years, traditional development techniques for e-learning systems have been changing to become more convenient and efficient. One new technology in the development of application systems includes both cloud and ubiquitous computing. Cloud computing can support learning system processes by using services, while ubiquitous computing can provide system operation and management via a high performance technical process and network. In the cloud computing environment, a learning service application can provide a business module or process to the user via the internet. This research focuses on providing the learning material and processes of courses by learning units using the services in a ubiquitous computing environment. We also investigate functions that provide users with tailored materials according to their learning style. That is, we analyzed the users' data and their characteristics in accordance with their user experience. We subsequently applied the learning process to fit their learning performance and preferences. Finally, we demonstrate that the proposed system provides better learning effects to learners than existing techniques.
A Service Based Adaptive U-Learning System Using UX
Jeong, Hwa-Young
2014-01-01
In recent years, traditional development techniques for e-learning systems have been changing to become more convenient and efficient. One new technology in the development of application systems includes both cloud and ubiquitous computing. Cloud computing can support learning system processes by using services, while ubiquitous computing can provide system operation and management via a high performance technical process and network. In the cloud computing environment, a learning service application can provide a business module or process to the user via the internet. This research focuses on providing the learning material and processes of courses by learning units using the services in a ubiquitous computing environment. We also investigate functions that provide users with tailored materials according to their learning style. That is, we analyzed the users' data and their characteristics in accordance with their user experience. We subsequently applied the learning process to fit their learning performance and preferences. Finally, we demonstrate that the proposed system provides better learning effects to learners than existing techniques. PMID:25147832
NASA Technical Reports Server (NTRS)
Tomayko, James E.
1986-01-01
Twenty-five years of spacecraft onboard computer development have resulted in a better understanding of the requirements for effective, efficient, and fault tolerant flight computer systems. Lessons from eight flight programs (Gemini, Apollo, Skylab, Shuttle, Mariner, Voyager, and Galileo) and three research programs (digital fly-by-wire, STAR, and the Unified Data System) are useful in projecting the computer hardware configuration of the Space Station and the ways in which the Ada programming language will enhance the development of the necessary software. The evolution of hardware technology, fault protection methods, and software architectures used in space flight is reviewed in order to provide insight into the pending development of such items for the Space Station.
Program Aids Visualization Of Data
NASA Technical Reports Server (NTRS)
Truong, L. V.
1995-01-01
Living Color Frame System (LCFS) computer program developed to solve some problems that arise in connection with generation of real-time graphical displays of numerical data and of statuses of systems. Need for program like LCFS arises because computer graphics often applied for better understanding and interpretation of data under observation and these graphics become more complicated when animation required during run time. Eliminates need for custom graphical-display software for application programs. Written in Turbo C++.
Running R Statistical Computing Environment Software on the Peregrine
R is a collaborative project for the development of new statistical methodologies and enjoys a large user base; please consult the distribution for details (natural language support, but running in an English locale). The CRAN task view for High Performance Computing covers programming paradigms to better leverage modern HPC systems.
ERIC Educational Resources Information Center
Chen, Hsinchun; Martinez, Joanne; Kirchhoff, Amy; Ng, Tobun D.; Schatz, Bruce R.
1998-01-01
Grounded on object filtering, automatic indexing, and co-occurrence analysis, an experiment was performed using a parallel supercomputer to analyze over 400,000 abstracts in an INSPEC computer engineering collection. A user evaluation revealed that system-generated thesauri were better than the human-generated INSPEC subject thesaurus in concept…
Algorithmic mechanisms for reliable crowdsourcing computation under collusion.
Fernández Anta, Antonio; Georgiou, Chryssis; Mosteiro, Miguel A; Pareja, Daniel
2015-01-01
We consider a computing system where a master processor assigns a task for execution to worker processors that may collude. We model the workers' decision of whether to comply (compute the task) or not (return a bogus result to save the computation cost) as a game among workers. That is, we assume that workers are rational in a game-theoretic sense. We identify analytically the parameter conditions for a unique Nash Equilibrium where the master obtains the correct result. We also evaluate experimentally mixed equilibria aiming to attain better reliability-profit trade-offs. For a wide range of parameter values that may be used in practice, our simulations show that, in fact, both master and workers are better off using a pure equilibrium where no worker cheats, even under collusion, and even for colluding behaviors that involve deviating from the game.
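The equilibrium reasoning can be illustrated with a small, hedged calculation that is not taken from the paper: under hypothetical values for the reward, the computation cost, an audit probability, and a penalty, one can check whether a single worker gains by deviating from the all-honest profile.

    def honest_is_equilibrium(reward, compute_cost, audit_prob, penalty):
        # In the all-honest profile, a worker considering a deviation (returning a
        # bogus result) saves the computation cost but is caught whenever the master
        # audits. Honesty is a pure equilibrium when complying pays at least as much
        # as deviating.
        payoff_comply = reward - compute_cost
        payoff_deviate = (1 - audit_prob) * reward - audit_prob * penalty
        return payoff_comply >= payoff_deviate

    # Hypothetical parameter values, not taken from the paper:
    print(honest_is_equilibrium(reward=10, compute_cost=2, audit_prob=0.3, penalty=5))   # True
    print(honest_is_equilibrium(reward=10, compute_cost=6, audit_prob=0.1, penalty=5))   # False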
Algorithmic Mechanisms for Reliable Crowdsourcing Computation under Collusion
Fernández Anta, Antonio; Georgiou, Chryssis; Mosteiro, Miguel A.; Pareja, Daniel
2015-01-01
We consider a computing system where a master processor assigns a task for execution to worker processors that may collude. We model the workers’ decision of whether to comply (compute the task) or not (return a bogus result to save the computation cost) as a game among workers. That is, we assume that workers are rational in a game-theoretic sense. We identify analytically the parameter conditions for a unique Nash Equilibrium where the master obtains the correct result. We also evaluate experimentally mixed equilibria aiming to attain better reliability-profit trade-offs. For a wide range of parameter values that may be used in practice, our simulations show that, in fact, both master and workers are better off using a pure equilibrium where no worker cheats, even under collusion, and even for colluding behaviors that involve deviating from the game. PMID:25793524
Language evolution and human-computer interaction
NASA Technical Reports Server (NTRS)
Grudin, Jonathan; Norman, Donald A.
1991-01-01
Many of the issues that confront designers of interactive computer systems also appear in natural language evolution. Natural languages and human-computer interfaces share as their primary mission the support of extended 'dialogues' between responsive entities. Because in each case one participant is a human being, some of the pressures operating on natural languages, causing them to evolve in order to better support such dialogue, also operate on human-computer 'languages' or interfaces. This does not necessarily push interfaces in the direction of natural language - since one entity in this dialogue is not a human, this is not to be expected. Nonetheless, by discerning where the pressures that guide natural language evolution also appear in human-computer interaction, we can contribute to the design of computer systems and obtain a new perspective on natural languages.
Converting differential-equation models of biological systems to membrane computing.
Muniyandi, Ravie Chandren; Zin, Abdullah Mohd; Sanders, J W
2013-12-01
This paper presents a method to convert the deterministic, continuous representation of a biological system by ordinary differential equations into a non-deterministic, discrete membrane computation. The dynamics of the membrane computation is governed by rewrite rules operating at certain rates. That has the advantage of applying accurately to small systems, and of expressing rates of change that are determined locally, by region, but not necessarily globally. Such spatial information augments the standard differentiable approach to provide a more realistic model. A biological case study of the ligand-receptor network of protein TGF-β is used to validate the effectiveness of the conversion method. It demonstrates the sense in which the behaviours and properties of the system are better preserved in the membrane computing model, suggesting that the proposed conversion method may prove useful for biological systems in particular. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
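A hedged sketch of the general conversion idea, using a toy ligand-receptor binding reaction rather than the full TGF-β network: the continuous rate law is replaced by two discrete rewrite rules that fire stochastically at propensities derived from the same rate constants (a Gillespie-style simulation). The species counts and rate constants below are illustrative assumptions.

    import math, random

    def gillespie(ligand, receptor, complex_, k_on, k_off, t_end):
        # Discrete, stochastic version of  d[C]/dt = k_on*[L][R] - k_off*[C],
        # expressed as two rewrite rules: L + R -> C (rate k_on) and C -> L + R (rate k_off).
        t = 0.0
        while t < t_end:
            a_bind = k_on * ligand * receptor
            a_unbind = k_off * complex_
            a_total = a_bind + a_unbind
            if a_total == 0:
                break
            t += -math.log(1.0 - random.random()) / a_total   # time to the next rule firing
            if random.random() * a_total < a_bind:
                ligand, receptor, complex_ = ligand - 1, receptor - 1, complex_ + 1
            else:
                ligand, receptor, complex_ = ligand + 1, receptor + 1, complex_ - 1
        return ligand, receptor, complex_

    print(gillespie(ligand=100, receptor=50, complex_=0, k_on=0.01, k_off=0.1, t_end=10.0))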
Leahy, P.P.
1982-01-01
The Trescott computer program for modeling groundwater flow in three dimensions has been modified to (1) treat aquifer and confining bed pinchouts more realistically and (2) reduce the computer memory requirements needed for the input data. Using the original program, simulation of aquifer systems with nonrectangular external boundaries may result in a large number of nodes that are not involved in the numerical solution of the problem, but require computer storage. (USGS)
Experimental comparison of two quantum computing architectures
Linke, Norbert M.; Maslov, Dmitri; Roetteler, Martin; Debnath, Shantanu; Figgatt, Caroline; Landsman, Kevin A.; Wright, Kenneth; Monroe, Christopher
2017-01-01
We run a selection of algorithms on two state-of-the-art 5-qubit quantum computers that are based on different technology platforms. One is a publicly accessible superconducting transmon device (www.research.ibm.com/ibm-q) with limited connectivity, and the other is a fully connected trapped-ion system. Even though the two systems have different native quantum interactions, both can be programed in a way that is blind to the underlying hardware, thus allowing a comparison of identical quantum algorithms between different physical systems. We show that quantum algorithms and circuits that use more connectivity clearly benefit from a better-connected system of qubits. Although the quantum systems here are not yet large enough to eclipse classical computers, this experiment exposes critical factors of scaling quantum computers, such as qubit connectivity and gate expressivity. In addition, the results suggest that codesigning particular quantum applications with the hardware itself will be paramount in successfully using quantum computers in the future. PMID:28325879
A forward view on reliable computers for flight control
NASA Technical Reports Server (NTRS)
Goldberg, J.; Wensley, J. H.
1976-01-01
The requirements for fault-tolerant computers for flight control of commercial aircraft are examined; it is concluded that the reliability requirements far exceed those typically quoted for space missions. Examination of circuit technology and alternative computer architectures indicates that the desired reliability can be achieved with several different computer structures, though there are obvious advantages to those that are more economic, more reliable, and, very importantly, more certifiable as to fault tolerance. Progress in this field is expected to bring about better computer systems that are more rigorously designed and analyzed even though computational requirements are expected to increase significantly.
A computationally efficient scheme for the non-linear diffusion equation
NASA Astrophysics Data System (ADS)
Termonia, P.; Van de Vyver, H.
2009-04-01
This Letter proposes a new numerical scheme for integrating the non-linear diffusion equation. It is shown to be linearly stable. Some tests are presented comparing this scheme to a popular decentered version of the linearized Crank-Nicolson scheme, showing that, although the new scheme is slightly less accurate in treating the highly resolved waves, it (i) better treats highly non-linear systems, (ii) better handles the short waves, (iii) turns out to be three to four times cheaper computationally for a given test bed, and (iv) is easier to implement.
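The Letter's scheme itself is not reproduced in the abstract, so the sketch below only sets the scene with a plain explicit finite-difference step for a non-linear diffusion equation of the form u_t = (K(u) u_x)_x; the diffusivity, grid, and time step are assumptions, and unlike the proposed scheme this explicit step is only conditionally stable.

    import numpy as np

    def explicit_step(u, dt, dx, K):
        # One explicit finite-difference step for u_t = d/dx( K(u) du/dx ),
        # with fluxes evaluated at cell interfaces and the two boundary values held fixed.
        k_half = 0.5 * (K(u[1:]) + K(u[:-1]))            # K at the cell interfaces
        flux = k_half * (u[1:] - u[:-1]) / dx
        du = np.zeros_like(u)
        du[1:-1] = (flux[1:] - flux[:-1]) / dx
        return u + dt * du

    x = np.linspace(0.0, 1.0, 101)
    u = np.exp(-100.0 * (x - 0.5) ** 2)                  # initial bump
    K = lambda v: 0.1 + v ** 2                           # hypothetical non-linear diffusivity
    dt, dx = 2e-5, x[1] - x[0]                           # dt kept small: the explicit step is
    for _ in range(1000):                                # only conditionally stable
        u = explicit_step(u, dt, dx, K)
    print(u.max())                                       # peak decays as the bump spreads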
An Intelligent Systems Approach to Automated Object Recognition: A Preliminary Study
Maddox, Brian G.; Swadley, Casey L.
2002-01-01
Attempts at fully automated object recognition systems have met with varying levels of success over the years. However, none of the systems have achieved high enough accuracy rates to be run unattended. One of the reasons for this may be that they are designed from the computer's point of view and rely mainly on image-processing methods. A better solution to this problem may be to make use of modern advances in computational intelligence and distributed processing to try to mimic how the human brain is thought to recognize objects. As humans combine cognitive processes with detection techniques, such a system would combine traditional image-processing techniques with computer-based intelligence to determine the identity of various objects in a scene.
Teaching Children Thinking. Artificial Intelligence Memo Number 247.
ERIC Educational Resources Information Center
Papert, Seymour
It is possible to maintain a vision of a technologically oriented educational system which is grander than the current one in which new gadgets are used to teach the old material in a thinly disguised old way. Educational innovation, particularly when computers are included, can find better things for children to do and better ways for the child…
Computer-based rhythm diagnosis and its possible influence on nonexpert electrocardiogram readers.
Hakacova, Nina; Trägårdh-Johansson, Elin; Wagner, Galen S; Maynard, Charles; Pahlm, Olle
2012-01-01
Systems providing computer-based analysis of the resting electrocardiogram (ECG) seek to improve the quality of health care by providing accurate and timely automatic diagnosis of, for example, cardiac rhythm to clinicians. The accuracy of these diagnoses, however, remains questionable. We tested the hypothesis that (a) 2 independent automated ECG systems have better accuracy in rhythm diagnosis than nonexpert clinicians and (b) both systems provide correct diagnostic suggestions in a large percentage of cases where the diagnosis of nonexpert clinicians is incorrect. Five hundred ECGs were manually analyzed by 2 senior experts, 3 nonexpert clinicians, and automatically by 2 automated systems. The accuracy of the nonexpert rhythm statements was compared with the accuracy of each system statement. The proportion of rhythm statements when the clinician's diagnoses were incorrect and the systems instead provided correct diagnosis was assessed. A total of 420 sinus rhythms and 156 rhythm disturbances were recognized by expert reading. Significance of the difference in accuracy between nonexperts and systems was P = .45 for system A and P = .11 for system B. The percentage of correct automated diagnoses in cases when the clinician was incorrect was 28% ± 10% for system A and 25% ± 11% for system B (P = .09). The rhythm diagnoses of automated systems did not reach better average accuracy than those of nonexpert readings. The computer diagnosis of rhythm can be incorrect in cases where the clinicians fail in reaching the correct ECG diagnosis. Copyright © 2012. Published by Elsevier Inc.
a Linux PC Cluster for Lattice QCD with Exact Chiral Symmetry
NASA Astrophysics Data System (ADS)
Chiu, Ting-Wai; Hsieh, Tung-Han; Huang, Chao-Hsi; Huang, Tsung-Ren
A computational system for lattice QCD with overlap Dirac quarks is described. The platform is a home-made Linux PC cluster, built with off-the-shelf components. At present the system consists of 64 nodes, with each node consisting of one Pentium 4 processor (1.6/2.0/2.5 GHz), one Gbyte of PC800/1066 RDRAM, one 40/80/120 Gbyte hard disk, and a network card. The computationally intensive parts of our program are written in SSE2 code. The speed of our system is estimated to be 70 Gflops, and its price/performance ratio is better than $1.0/Mflops for 64-bit (double precision) computations in quenched QCD. We discuss how to optimize its hardware and software for computing propagators of overlap Dirac quarks.
Physically Based Virtual Surgery Planning and Simulation Tools for Personal Health Care Systems
NASA Astrophysics Data System (ADS)
Dogan, Firat; Atilgan, Yasemin
Virtual surgery planning and simulation tools have gained a great deal of importance in the last decade as a consequence of increasing capacities in information technology. Modern hardware architectures, large-scale database systems, grid-based computer networks, agile development processes, better 3D visualization and the other strong aspects of information technology bring the necessary instruments to almost every desk. The special software and sophisticated supercomputer environments of the last decade now serve individual needs inside “tiny smart boxes” at reasonable prices. However, resistance to learning new computerized environments, insufficient training and other old habits prevent effective utilization of IT resources by specialists in the health sector. In this paper, the former and current developments in surgery planning and simulation tools are presented, and future directions and expectations for better electronic health care systems are investigated.
From biological and social network metaphors to coupled bio-social wireless networks
Barrett, Christopher L.; Eubank, Stephen; Anil Kumar, V.S.; Marathe, Madhav V.
2010-01-01
Biological and social analogies have been long applied to complex systems. Inspiration has been drawn from biological solutions to solve problems in engineering products and systems, ranging from Velcro to camouflage to robotics to adaptive and learning computing methods. In this paper, we present an overview of recent advances in understanding biological systems as networks and use this understanding to design and analyse wireless communication networks. We expand on two applications, namely cognitive sensing and control and wireless epidemiology. We discuss how our work in these two applications is motivated by biological metaphors. We believe that recent advances in computing and communications coupled with advances in health and social sciences raise the possibility of studying coupled bio-social communication networks. We argue that we can better utilise the advances in our understanding of one class of networks to better our understanding of the other. PMID:21643462
NASA Astrophysics Data System (ADS)
Roussel, Marc R.
1999-10-01
One of the traditional obstacles to learning quantum mechanics is the relatively high level of mathematical proficiency required to solve even routine problems. Modern computer algebra systems are now sufficiently reliable that they can be used as mathematical assistants to alleviate this difficulty. In the quantum mechanics course at the University of Lethbridge, the traditional three lecture hours per week have been replaced by two lecture hours and a one-hour computer-aided problem solving session using a computer algebra system (Maple). While this somewhat reduces the number of topics that can be tackled during the term, students have a better opportunity to familiarize themselves with the underlying theory with this course design. Maple is also available to students during examinations. The use of a computer algebra system expands the class of feasible problems during a time-limited exercise such as a midterm or final examination. A modern computer algebra system is a complex piece of software, so some time needs to be devoted to teaching the students its proper use. However, the advantages to the teaching of quantum mechanics appear to outweigh the disadvantages.
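Maple is the system used in the course; as a rough illustration of the same style of computer-algebra assistance with a freely available tool, the sympy sketch below checks a particle-in-a-box eigenfunction and evaluates expectation values symbolically. The specific exercise is an assumption for illustration, not taken from the course materials.

```python
# Computer-algebra check of a routine quantum mechanics exercise, done in sympy
# rather than Maple: verify the ground state of a particle in a box and compute
# normalization and <x> symbolically.
import sympy as sp

x, L, hbar, m = sp.symbols('x L hbar m', positive=True)
n = 1
psi = sp.sqrt(2 / L) * sp.sin(n * sp.pi * x / L)        # normalized eigenfunction
E = n**2 * sp.pi**2 * hbar**2 / (2 * m * L**2)          # textbook eigenvalue

# Residual of  -hbar^2/(2m) psi'' = E psi  should simplify to zero.
residual = -hbar**2 / (2 * m) * sp.diff(psi, x, 2) - E * psi
print(sp.simplify(residual))                            # -> 0

# Normalization and <x> over the box [0, L].
print(sp.integrate(psi**2, (x, 0, L)))                  # -> 1
print(sp.integrate(x * psi**2, (x, 0, L)))              # -> L/2
```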
DOE Office of Scientific and Technical Information (OSTI.GOV)
Madduri, Kamesh; Ediger, David; Jiang, Karl
2009-02-15
We present a new lock-free parallel algorithm for computing betweenness centrality of massive small-world networks. With minor changes to the data structures, our algorithm also achieves better spatial cache locality compared to previous approaches. Betweenness centrality is a key algorithm kernel in HPCS SSCA#2, a benchmark extensively used to evaluate the performance of emerging high-performance computing architectures for graph-theoretic computations. We design optimized implementations of betweenness centrality and the SSCA#2 benchmark for two hardware multithreaded systems: a Cray XMT system with the Threadstorm processor, and a single-socket Sun multicore server with the UltraSPARC T2 processor. For a small-world network of 134 million vertices and 1.073 billion edges, the 16-processor XMT system and the 8-core Sun Fire T5120 server achieve TEPS scores (an algorithmic performance count for the SSCA#2 benchmark) of 160 million and 90 million respectively, which corresponds to more than a 2X performance improvement over the previous parallel implementations. To better characterize the performance of these multithreaded systems, we correlate the SSCA#2 performance results with data from the memory-intensive STREAM and RandomAccess benchmarks. Finally, we demonstrate the applicability of our implementation to analyze massive real-world datasets by computing approximate betweenness centrality for a large-scale IMDb movie-actor network.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Madduri, Kamesh; Ediger, David; Jiang, Karl
2009-05-29
We present a new lock-free parallel algorithm for computing betweenness centrality of massive small-world networks. With minor changes to the data structures, our algorithm also achieves better spatial cache locality compared to previous approaches. Betweenness centrality is a key algorithm kernel in the HPCS SSCA#2 Graph Analysis benchmark, which has been extensively used to evaluate the performance of emerging high-performance computing architectures for graph-theoretic computations. We design optimized implementations of betweenness centrality and the SSCA#2 benchmark for two hardware multithreaded systems: a Cray XMT system with the ThreadStorm processor, and a single-socket Sun multicore server with the UltraSparc T2 processor. For a small-world network of 134 million vertices and 1.073 billion edges, the 16-processor XMT system and the 8-core Sun Fire T5120 server achieve TEPS scores (an algorithmic performance count for the SSCA#2 benchmark) of 160 million and 90 million respectively, which corresponds to more than a 2X performance improvement over the previous parallel implementations. To better characterize the performance of these multithreaded systems, we correlate the SSCA#2 performance results with data from the memory-intensive STREAM and RandomAccess benchmarks. Finally, we demonstrate the applicability of our implementation to analyze massive real-world datasets by computing approximate betweenness centrality for a large-scale IMDb movie-actor network.
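The records above describe heavily optimized parallel implementations; for orientation, the serial Python sketch below shows the underlying betweenness-centrality kernel (Brandes' algorithm for unweighted graphs). The graph representation and function name are illustrative, not taken from the papers.

```python
# Serial sketch of Brandes' betweenness-centrality algorithm for unweighted,
# undirected graphs -- the kernel the lock-free parallel implementations optimize.
from collections import deque

def betweenness(adj):
    """adj: dict mapping each vertex to an iterable of neighbours (undirected)."""
    bc = {v: 0.0 for v in adj}
    for s in adj:
        stack = []
        preds = {v: [] for v in adj}
        sigma = {v: 0 for v in adj}; sigma[s] = 1       # shortest-path counts
        dist = {v: -1 for v in adj}; dist[s] = 0
        queue = deque([s])
        while queue:                                    # BFS from source s
            v = queue.popleft()
            stack.append(v)
            for w in adj[v]:
                if dist[w] < 0:
                    dist[w] = dist[v] + 1
                    queue.append(w)
                if dist[w] == dist[v] + 1:
                    sigma[w] += sigma[v]
                    preds[w].append(v)
        delta = {v: 0.0 for v in adj}
        while stack:                                    # accumulate dependencies
            w = stack.pop()
            for v in preds[w]:
                delta[v] += sigma[v] / sigma[w] * (1.0 + delta[w])
            if w != s:
                bc[w] += delta[w]
    return bc  # unnormalized; each unordered pair is counted from both endpoints

# Toy example: a path graph a-b-c, where b lies on the only a..c shortest path.
print(betweenness({'a': ['b'], 'b': ['a', 'c'], 'c': ['b']}))
```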
1987-03-01
information and work in a completely secure environment. Information used with today’s C3I systems must be protected. To better understand the role of...and security was of minor concern. The user either worked on his own behalf or as a programmer for someone else. The computer power was limited. With...Although the modules may be of the same classification level, the manager may want to limit each team’s access to the module on which they are working
Review of NASA antiskid braking research
NASA Technical Reports Server (NTRS)
Tanner, J. A.
1982-01-01
NASA antiskid braking system research programs are reviewed. These programs include experimental studies of four antiskid systems on the Langley Landing Loads Track, flight tests with a DC-9 airplane, and computer simulation studies. Results from these research efforts include identification of factors contributing to degraded antiskid performance under adverse weather conditions, tire tread temperature measurements during antiskid braking on dry runway surfaces, and an assessment of the accuracy of various brake pressure-torque computer models. This information should lead to the development of better antiskid systems in the future.
Survey shows continued strong interest in UNIX applications for healthcare.
Dunbar, C
1993-03-01
As part of the general computer industry movement toward open systems, many are predicting UNIX will become the dominant host operating system of the late 1990s. To better understand this prediction within the healthcare setting, Computers in Healthcare surveyed our readership about their opinions of UNIX, its current use and its relative importance as an information services strategy. The upshot? CIH readers definitely want more systems on UNIX, more healthcare applications written for UNIX and more trained resource people to help them with faster installation and more useful applications.
Annual Rainfall Forecasting by Using Mamdani Fuzzy Inference System
NASA Astrophysics Data System (ADS)
Fallah-Ghalhary, G.-A.; Habibi Nokhandan, M.; Mousavi Baygi, M.
2009-04-01
Long-term rainfall prediction is very important to countries with agro-based economies. In general, climate and rainfall are highly non-linear natural phenomena, giving rise to what is known as the "butterfly effect", and the parameters required to predict rainfall are enormous even for a short period. Soft computing is an approach to constructing computationally intelligent systems that are supposed to possess human-like expertise within a specific domain, adapt themselves and learn to do better in changing environments, and explain how they make decisions. Unlike conventional artificial intelligence techniques, the guiding principle of soft computing is to exploit tolerance for imprecision, uncertainty and partial truth to achieve tractability, robustness and better rapport with reality. In this paper, 33 years of rainfall data from Khorasan province, in the northeastern part of Iran (31°-38°N, 74°-80°E), were analyzed. The research attempted to train Fuzzy Inference System (FIS) based prediction models with these data. For performance evaluation, the model-predicted outputs were compared with the actual rainfall data. Simulation results reveal that soft computing techniques are promising and efficient; the test results showed that the FIS model achieved an RMSE of 52 mm.
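The study's actual predictors, membership functions and rule base are not given in the abstract. The sketch below is only a minimal hand-rolled Mamdani-style inference example, with one assumed input (a normalized humidity index in [0, 1]) and invented membership functions and rules, showing how fuzzification, min-max inference and centroid defuzzification combine into a crisp rainfall estimate.

```python
# Minimal hand-rolled Mamdani-style fuzzy inference sketch (illustrative only).
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with feet at a, c and peak at b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

rain = np.linspace(0.0, 400.0, 401)              # output universe (mm/year)
out_sets = {'low':  tri(rain,   0, 100, 200),
            'high': tri(rain, 200, 300, 400)}

def predict_rainfall(humidity):
    # Fuzzify the input against two illustrative sets (input assumed in [0, 1]).
    mu_dry = tri(humidity, -0.5, 0.0, 0.6)
    mu_wet = tri(humidity,  0.4, 1.0, 1.5)
    # Rules: IF dry THEN rainfall low; IF wet THEN rainfall high (min implication).
    clipped = np.maximum(np.minimum(mu_dry, out_sets['low']),
                         np.minimum(mu_wet, out_sets['high']))   # max aggregation
    return np.sum(rain * clipped) / np.sum(clipped)              # centroid defuzzification

print(predict_rainfall(0.3))   # leans toward the 'low rainfall' consequent
```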
Darwinian Spacecraft: Soft Computing Strategies Breeding Better, Faster Cheaper
NASA Technical Reports Server (NTRS)
Noever, David A.; Baskaran, Subbiah
1999-01-01
Computers can generate and evaluate vast numbers of candidate solutions to a particular problem, an approach here called "soft computing." It draws on statistical comparison, neural networks, genetic algorithms, fuzzy variables in uncertain environments, and flexible machine learning to build systems that increase spacecraft robustness and improve metric evaluation. These concepts support the development of spacecraft that allow missions to be performed at lower cost.
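As a toy illustration of the genetic-algorithm strand of soft computing mentioned above, the sketch below evolves a one-dimensional parameter toward an optimum. The objective function and parameters are invented for illustration and are not the spacecraft design metrics discussed in the report.

```python
# Toy genetic algorithm of the kind grouped under "soft computing" above.
import random

def fitness(x):
    return -(x - 3.7) ** 2            # toy objective: maximize, optimum at x = 3.7

def evolve(pop_size=30, generations=50, mut_sigma=0.3):
    pop = [random.uniform(-10, 10) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]                 # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = 0.5 * (a + b)                      # arithmetic crossover
            child += random.gauss(0.0, mut_sigma)      # Gaussian mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

print(evolve())   # converges near 3.7
```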
Development of an assisting detection system for early infarct diagnosis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sim, K. S.; Nia, M. E.; Ee, C. S.
2015-04-24
In this paper, an assisting detection system for early infarct diagnosis is developed. The newly developed method helps medical practitioners diagnose infarcts from computed tomography images of the brain, so that infarcts can be diagnosed at earlier stages. Non-contrast computed tomography (NCCT) brain images form the data set for the system. The detection module extracts the pixel data from the NCCT brain images and produces a colourized version of the images. The proposed method showed great potential in detecting infarcts and helps medical practitioners make earlier and better diagnoses.
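The abstract does not describe the colourization step in detail. A generic sketch of the idea, mapping windowed NCCT pixel values onto a colormap so that subtle density differences stand out, might look like the following; the window limits, colormap and stand-in data are assumptions, not the paper's method.

```python
# Generic colourization sketch: apply a brain window to CT values and display
# the result with a colormap instead of grayscale.
import numpy as np
import matplotlib.pyplot as plt

def brain_window(slice_hu, lo=0.0, hi=80.0):
    """Clip a 2-D array of Hounsfield units to a brain window and scale to [0, 1]."""
    return np.clip((slice_hu - lo) / (hi - lo), 0.0, 1.0)

demo = np.random.uniform(0, 80, size=(64, 64))      # stand-in for a real NCCT slice
plt.imshow(brain_window(demo), cmap='jet')          # colourized display of the same data
plt.axis('off')
plt.show()
```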
Deana D. Pennington
2007-01-01
Exploratory modeling is an approach used when process and/or parameter uncertainties are such that modeling attempts at realistic prediction are not appropriate. Exploratory modeling makes use of computational experimentation to test how varying model scenarios drive model outcome. The goal of exploratory modeling is to better understand the system of interest through...
Socially Relevant Knowledge Based Telemedicine
2011-10-01
or attitude at different situations and different circumstances. Fogg mentions that there are many reasons that computers can be better persuaders...finding appropriate way to persuade users to perform various activities. Fogg [8] defines persuasive technologies as “interactive computing systems...Education, IEEE Consumer Electronics Society Conference Games Innovation, ICE-GIC, 2009, pp 54-63. [8] Fogg, B. J., Persuasive Technology: Using
Cardiology office computer use: primer, pointers, pitfalls.
Shepard, R B; Blum, R I
1986-10-01
An office computer is a utility, like an automobile, with benefits and costs that are both direct and hidden and potential for disaster. For the cardiologist or cardiovascular surgeon, the increasing power and decreasing costs of computer hardware and the availability of software make use of an office computer system an increasingly attractive possibility. Management of office business functions is common; handling and scientific analysis of practice medical information are less common. The cardiologist can also access national medical information systems for literature searches and for interactive further education. Selection and testing of programs and the entire computer system before purchase of computer hardware will reduce the chances of disappointment or serious problems. Personnel pretraining and planning for office information flow and medical information security are necessary. Some cardiologists design their own office systems, buy hardware and software as needed, write programs for themselves and carry out the implementation themselves. For most cardiologists, the better course will be to take advantage of the professional experience of expert advisors. This article provides a starting point from which the practicing cardiologist can approach considering, specifying or implementing an office computer system for business functions and for scientific analysis of practice results.
NASA Technical Reports Server (NTRS)
Denning, Peter J.
1990-01-01
Although powerful computers have allowed complex physical and manmade hardware systems to be modeled successfully, we have encountered persistent problems with the reliability of computer models for systems involving human learning, human action, and human organizations. This is not a misfortune; unlike physical and manmade systems, human systems do not operate under a fixed set of laws. The rules governing the actions allowable in the system can be changed without warning at any moment, and can evolve over time. That the governing laws are inherently unpredictable raises serious questions about the reliability of models when applied to human situations. In these domains, computers are better used, not for prediction and planning, but for aiding humans. Examples are systems that help humans speculate about possible futures, offer advice about possible actions in a domain, systems that gather information from the networks, and systems that track and support work flows in organizations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang, Renke; Jin, Shuangshuang; Chen, Yousu
This paper presents a faster-than-real-time dynamic simulation software package that is designed for large-size power system dynamic simulation. It was developed on the GridPACK™ high-performance computing (HPC) framework. The key features of the developed software package include (1) faster-than-real-time dynamic simulation for a WECC system (17,000 buses) with different types of detailed generator, controller, and relay dynamic models, (2) a decoupled parallel dynamic simulation algorithm with optimized computation architecture to better leverage HPC resources and technologies, (3) options for HPC-based linear and iterative solvers, (4) hidden HPC details, such as data communication and distribution, to enable development centered on mathematical models and algorithms rather than on computational details for power system researchers, and (5) easy integration of new dynamic models and related algorithms into the software package.
An adaptive brain actuated system for augmenting rehabilitation
Roset, Scott A.; Gant, Katie; Prasad, Abhishek; Sanchez, Justin C.
2014-01-01
For people living with paralysis, restoration of hand function remains the top priority because it leads to independence and improvement in quality of life. In approaches to restore hand and arm function, a goal is to better engage voluntary control and counteract maladaptive brain reorganization that results from non-use. Standard rehabilitation augmented with developments from the study of brain-computer interfaces could provide a combined therapy approach for motor cortex rehabilitation and to alleviate motor impairments. In this paper, an adaptive brain-computer interface system intended for application to control a functional electrical stimulation (FES) device is developed as an experimental test bed for augmenting rehabilitation with a brain-computer interface. The system's performance is improved throughout rehabilitation by passive user feedback and reinforcement learning. By continuously adapting to the user's brain activity, similar adaptive systems could be used to support clinical brain-computer interface neurorehabilitation over multiple days. PMID:25565945
De Cock, Jens; Zanca, Federica; Canning, John; Pauwels, Ruben; Hermans, Robert
2015-07-01
To evaluate image quality and radiation dose of a state-of-the-art cone beam computed tomography (CBCT) system and a multislice computed tomography (MSCT) system in patients with sinonasal polyposis. In this retrospective study two radiologists evaluated 57 patients with sinonasal polyposis who underwent a CBCT or MSCT sinus examination, along with a control group of 90 patients with normal radiological findings. Tissue doses were measured using a phantom model with thermoluminescent dosimeters (TLD). Overall image quality in CBCT was scored significantly higher than in MSCT in patients with normal radiological findings (p-value: 0.00001). In patients with sinonasal polyposis, MSCT scored significantly higher than CBCT (p-value: 0.00001). The average effective dose for CBCT was 42% lower than for MSCT (63 μSv vs 108 μSv). CBCT and MSCT are both suited for the evaluation of sinonasal polyposis. In patients with sinonasal polyposis, clinically important structures of the paranasal sinuses can be better delineated with MSCT, whereas in patients without sinonasal polyposis, CBCT defines the important structures of the sinonasal region better. However, given the lower radiation dose, CBCT can be considered for the evaluation of the sinonasal structures in patients with sinonasal polyposis. • CBCT and MSCT are both suited for evaluation of sinonasal polyposis. • The effective dose for CBCT was 42% lower than for MSCT. • In patients with sinonasal polyposis, clinically important anatomical structures are better delineated with MSCT. • In patients with normal radiological findings, clinically important anatomical structures are better delineated with CBCT.
Computer studies of multiple-quantum spin dynamics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Murdoch, J.B.
The excitation and detection of multiple-quantum (MQ) transitions in Fourier transform NMR spectroscopy is an interesting problem in the quantum mechanical dynamics of spin systems as well as an important new technique for investigation of molecular structure. In particular, multiple-quantum spectroscopy can be used to simplify overly complex spectra or to separate the various interactions between a nucleus and its environment. The emphasis of this work is on computer simulation of spin-system evolution to better relate theory and experiment.
Discrete State Change Model of Manufacturing Quality to Aid Assembly Process Design
NASA Astrophysics Data System (ADS)
Koga, Tsuyoshi; Aoyama, Kazuhiro
This paper proposes a representation model of the quality state change in an assembly process that can be used in a computer-aided process design system. In order to formalize the state change of the manufacturing quality in the assembly process, the functions, operations, and quality changes in the assembly process are represented as a network model that can simulate discrete events. This paper also develops a design method for the assembly process. The design method calculates the space of quality state change and outputs a better assembly process (better operations and better sequences) that can be used to obtain the intended quality state of the final product. A computational redesigning algorithm of the assembly process that considers the manufacturing quality is developed. The proposed method can be used to design an improved manufacturing process by simulating the quality state change. A prototype system for planning an assembly process is implemented and applied to the design of an auto-breaker assembly process. The result of the design example indicates that the proposed assembly process planning method outputs a better manufacturing scenario based on the simulation of the quality state change.
A Stochastic Total Least Squares Solution of Adaptive Filtering Problem
Ahmad, Noor Atinah
2014-01-01
An efficient algorithm of linear computational complexity is derived for the total least squares solution of the adaptive filtering problem, when both input and output signals are contaminated by noise. The proposed total least mean squares (TLMS) algorithm recursively computes an optimal solution of the adaptive TLS problem by minimizing the instantaneous value of a weighted cost function. A convergence analysis shows the global convergence of the algorithm, provided that the step-size parameter is appropriately chosen. The TLMS algorithm is computationally simpler than other TLS algorithms and performs better than the least mean square (LMS) and normalized least mean square (NLMS) algorithms. It provides minimum mean square deviation by exhibiting better convergence in misalignment for unknown system identification under noisy inputs. PMID:24688412
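The TLMS update itself is not reproduced here; the sketch below only shows the standard LMS and NLMS baselines the abstract compares against, applied to a toy system-identification problem. The signals, filter length and step sizes are illustrative assumptions.

```python
# Baseline LMS / NLMS adaptive filters for system identification (sketch only;
# the proposed TLMS update is not reproduced here).
import numpy as np

def lms(x, d, taps=4, mu=0.05, normalized=False, eps=1e-8):
    """Adapt an FIR filter w so that w . x_window tracks the desired signal d."""
    w = np.zeros(taps)
    for n in range(taps - 1, len(x)):
        xw = x[n - taps + 1:n + 1][::-1]         # most recent samples first
        e = d[n] - w @ xw                        # a-priori error
        step = mu / (eps + xw @ xw) if normalized else mu
        w += step * e * xw                       # (N)LMS coefficient update
    return w

rng = np.random.default_rng(0)
x = rng.standard_normal(5000)
true_w = np.array([0.8, -0.4, 0.2, 0.1])
d = np.convolve(x, true_w)[: len(x)] + 0.01 * rng.standard_normal(len(x))
print(lms(x, d), lms(x, d, normalized=True))     # both should approach true_w
```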
Microcomputer-Based Intelligent Tutoring Systems: An Assessment.
ERIC Educational Resources Information Center
Schaffer, John William
Computer-assisted instruction, while familiar to most teachers, has failed to become an effective self-motivating instructional tool. Developments in artificial intelligence, however, have provided new and better tools for exploring human knowledge acquisition and utilization. Expert system technology represents one of the most promising of these…
Robust MOE Detector for DS-CDMA Systems with Signature Waveform Mismatch
NASA Astrophysics Data System (ADS)
Lin, Tsui-Tsai
In this letter, a decision-directed MOE detector with excellent robustness against signature waveform mismatch is proposed for DS-CDMA systems. Both the theoretical analysis and computer simulation results demonstrate that the proposed detector provides better SINR performance than conventional detectors.
NASA Astrophysics Data System (ADS)
Barlow, Steven J.
1986-09-01
The Air Force needs a better method of designing new and retrofit heating, ventilating and air conditioning (HVAC) control systems. Air Force engineers currently use manual design/predict/verify procedures taught at the Air Force Institute of Technology, School of Civil Engineering, HVAC Control Systems course. These existing manual procedures are iterative and time-consuming. The objectives of this research were to: (1) Locate and, if necessary, modify an existing computer-based method for designing and analyzing HVAC control systems that is compatible with the HVAC Control Systems manual procedures, or (2) Develop a new computer-based method of designing and analyzing HVAC control systems that is compatible with the existing manual procedures. Five existing computer packages were investigated in accordance with the first objective: MODSIM (for modular simulation), HVACSIM (for HVAC simulation), TRNSYS (for transient system simulation), BLAST (for building load and system thermodynamics) and Elite Building Energy Analysis Program. None were found to be compatible or adaptable to the existing manual procedures, and consequently, a prototype of a new computer method was developed in accordance with the second research objective.
2012-01-01
The fast and accurate computation of the electric forces that drive the motion of charged particles at the nanometer scale represents a computational challenge. For this kind of system, where the discrete nature of the charges cannot be neglected, boundary element methods (BEM) represent a better approach than finite differences/finite elements methods. In this article, we compare two different BEM approaches to a canonical electrostatic problem in a three-dimensional space with inhomogeneous dielectrics, emphasizing their suitability for particle-based simulations: the iterative method proposed by Hoyles et al. and the Induced Charge Computation introduced by Boda et al. PMID:22338640
Berti, Claudio; Gillespie, Dirk; Eisenberg, Robert S; Fiegna, Claudio
2012-02-16
The fast and accurate computation of the electric forces that drive the motion of charged particles at the nanometer scale represents a computational challenge. For this kind of system, where the discrete nature of the charges cannot be neglected, boundary element methods (BEM) represent a better approach than finite differences/finite elements methods. In this article, we compare two different BEM approaches to a canonical electrostatic problem in a three-dimensional space with inhomogeneous dielectrics, emphasizing their suitability for particle-based simulations: the iterative method proposed by Hoyles et al. and the Induced Charge Computation introduced by Boda et al.
RighTime: A real time clock correcting program for MS-DOS-based computer systems
NASA Technical Reports Server (NTRS)
Becker, G. Thomas
1993-01-01
A computer program is described which effectively eliminates the shortcomings of the DOS system clock in PC/AT-class computers. RighTime is a small, sophisticated memory-resident program that automatically corrects both the DOS system clock and the hardware 'CMOS' real time clock (RTC) in real time. RighTime learns what corrections are required without operator interaction beyond the occasional accurate time set. Both warm (power on) and cool (power off) errors are corrected, usually yielding better than one part per million accuracy in the typical desktop computer with no additional hardware, and RighTime increases the system clock resolution from approximately 0.0549 second to 0.01 second. Program tools are also available which allow visualization of RighTime's actions, verification of its performance, display of its history log, and which provide data for graphing of the system clock behavior. The program has found application in a wide variety of industries, including astronomy, satellite tracking, communications, broadcasting, transportation, public utilities, manufacturing, medicine, and the military.
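RighTime's learning algorithm is not detailed in the abstract; the sketch below only illustrates the underlying arithmetic of estimating a drift rate in parts per million from two accurate time sets and removing that linear drift from later readings. Function names and figures are illustrative assumptions.

```python
# Illustrative drift arithmetic only -- not RighTime's actual algorithm.
def drift_ppm(elapsed_reference_s, elapsed_clock_s):
    """Positive result means the clock under test runs fast."""
    return (elapsed_clock_s - elapsed_reference_s) / elapsed_reference_s * 1e6

def corrected_time(clock_reading_s, epoch_s, ppm):
    """Remove the learned linear drift from a later raw clock reading."""
    elapsed = clock_reading_s - epoch_s
    return epoch_s + elapsed / (1.0 + ppm * 1e-6)

ppm = drift_ppm(86400.0, 86400.8)              # clock gained 0.8 s over one day
print(ppm)                                     # about 9.26 ppm fast
print(corrected_time(2 * 86400.8, 0.0, ppm))   # approx 172800 s (two true days)
```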
Health care and privacy law in electronic commerce.
Wright, B
1994-01-01
As electronic data interchange (EDI) continues to gain acceptance and use, questions regarding protection of the confidentiality of private healthcare information have arisen. This article explains how a computer-based information system equipped with appropriate safeguards can be far better at ensuring privacy than a paper-based system.
The Transfer of Abstract Principles Governing Complex Adaptive Systems
ERIC Educational Resources Information Center
Goldstone, Robert L.; Sakamoto, Yasuaki
2003-01-01
Four experiments explored participants' understanding of the abstract principles governing computer simulations of complex adaptive systems. Experiments 1, 2, and 3 showed better transfer of abstract principles across simulations that were relatively dissimilar, and that this effect was due to participants who performed relatively poorly on the…
CEASAW: A User-Friendly Computer Environment Analysis for the Sawmill Owner
Guillermo Mendoza; William Sprouse; Philip A. Araman; William G. Luppold
1991-01-01
Improved spreadsheet software capabilities have brought optimization to users with little or no background in mathematical programming. Better interface capabilities of spreadsheet models now make it possible to combine optimization models with a spreadsheet system. Sawmill production and inventory systems possess many features that make them suitable application...
Kontopantelis, Evangelos; Buchan, Iain; Reeves, David; Checkland, Kath; Doran, Tim
2013-08-02
To investigate the relationship between performance on the UK Quality and Outcomes Framework pay-for-performance scheme and choice of clinical computer system. Retrospective longitudinal study. Data for 2007-2008 to 2010-2011, extracted from the clinical computer systems of general practices in England. All English practices participating in the pay-for-performance scheme: average 8257 each year, covering over 99% of the English population registered with a general practice. Levels of achievement on 62 quality-of-care indicators, measured as: reported achievement (levels of care after excluding inappropriate patients); population achievement (levels of care for all patients with the relevant condition) and percentage of available quality points attained. Multilevel mixed effects multiple linear regression models were used to identify population, practice and clinical computing system predictors of achievement. Seven clinical computer systems were consistently active in the study period, collectively holding approximately 99% of the market share. Of all population and practice characteristics assessed, choice of clinical computing system was the strongest predictor of performance across all three outcome measures. Differences between systems were greatest for intermediate outcomes indicators (eg, control of cholesterol levels). Under the UK's pay-for-performance scheme, differences in practice performance were associated with the choice of clinical computing system. This raises the question of whether particular system characteristics facilitate higher quality of care, better data recording or both. Inconsistencies across systems need to be understood and addressed, and researchers need to be cautious when generalising findings from samples of providers using a single computing system.
The nature of the (visualization) game: Challenges and opportunities from computational geophysics
NASA Astrophysics Data System (ADS)
Kellogg, L. H.
2016-12-01
As the geosciences enters the era of big data, modeling and visualization become increasingly vital tools for discovery, understanding, education, and communication. Here, we focus on modeling and visualization of the structure and dynamics of the Earth's surface and interior. The past decade has seen accelerated data acquisition, including higher resolution imaging and modeling of Earth's deep interior, complex models of geodynamics, and high resolution topographic imaging of the changing surface, with an associated acceleration of computational modeling through better scientific software, increased computing capability, and the use of innovative methods of scientific visualization. The role of modeling is to describe a system, answer scientific questions, and test hypotheses; the term "model" encompasses mathematical models, computational models, physical models, conceptual models, statistical models, and visual models of a structure or process. These different uses of the term require thoughtful communication to avoid confusion. Scientific visualization is integral to every aspect of modeling. Not merely a means of communicating results, the best uses of visualization enable scientists to interact with their data, revealing the characteristics of the data and models to enable better interpretation and inform the direction of future investigation. Innovative immersive technologies like virtual reality, augmented reality, and remote collaboration techniques, are being adapted more widely and are a magnet for students. Time-varying or transient phenomena are especially challenging to model and to visualize; researchers and students may need to investigate the role of initial conditions in driving phenomena, while nonlinearities in the governing equations of many Earth systems make the computations and resulting visualization especially challenging. Training students how to use, design, build, and interpret scientific modeling and visualization tools prepares them to better understand the nature of complex, multiscale geoscience data.
Vivekanandhan, Sapthagirivasan; Subramaniam, Janarthanam; Mariamichael, Anburajan
2016-10-01
Hip fractures due to osteoporosis are increasing progressively across the globe. It is also difficult for fractured patients to undergo dual-energy X-ray absorptiometry scans because of the complicated protocol and the associated cost. The use of computed tomography for fracture treatment has become common in clinical practice, and it would be helpful for orthopaedic clinicians to obtain additional information related to bone strength for better treatment planning. The aim of our study was to develop an automated system to segment the femoral neck region, extract cortical and trabecular bone parameters, and assess bone strength using an isotropic volume construction from clinical computed tomography images. Right hip computed tomography and right femur dual-energy X-ray absorptiometry measurements were taken from 50 south-Indian females aged 30-80 years. Each computed tomography image volume was reconstructed to form an isotropic volume. An automated system incorporating active contour models was used to segment the neck region, and a minimum distance boundary method was applied to isolate the cortical and trabecular bone components. The trabecular bone was enhanced and segmented using a trabecular enrichment approach. The cortical and trabecular bone features were extracted and statistically compared with the dual-energy X-ray absorptiometry measured femoral neck bone mineral density. The extracted bone measures demonstrated a significant correlation with neck bone mineral density (r > 0.7, p < 0.001). Including the cortical measures, along with the trabecular measures extracted after the isotropic volume construction and trabecular enrichment procedures, resulted in a better estimation of bone strength. The findings suggest that the proposed system, using clinical computed tomography images scanned with low dose, could eventually be helpful in osteoporosis diagnosis and treatment planning. © IMechE 2016.
A Weibull distribution accrual failure detector for cloud computing.
Liu, Jiaxi; Wu, Zhibo; Wu, Jin; Dong, Jian; Zhao, Yao; Wen, Dongxin
2017-01-01
Failure detectors are used to build high availability distributed systems as the fundamental component. To meet the requirement of a complicated large-scale distributed system, accrual failure detectors that can adapt to multiple applications have been studied extensively. However, several implementations of accrual failure detectors do not adapt well to the cloud service environment. To solve this problem, a new accrual failure detector based on Weibull Distribution, called the Weibull Distribution Failure Detector, has been proposed specifically for cloud computing. It can adapt to the dynamic and unexpected network conditions in cloud computing. The performance of the Weibull Distribution Failure Detector is evaluated and compared based on public classical experiment data and cloud computing experiment data. The results show that the Weibull Distribution Failure Detector has better performance in terms of speed and accuracy in unstable scenarios, especially in cloud computing.
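As a rough sketch of the accrual idea described above, the example below fits a Weibull distribution to observed heartbeat inter-arrival times and reports a phi-style suspicion level for the time elapsed since the last heartbeat. The class name, window size and simulated data are illustrative assumptions, not the paper's exact formulation.

```python
# Sketch of a Weibull-based accrual failure detector (illustrative parameters).
import numpy as np
from scipy.stats import weibull_min

class WeibullFailureDetector:
    def __init__(self, window=1000):
        self.window = window
        self.intervals = []

    def heartbeat(self, interval_s):
        """Record the interval between the previous and the current heartbeat."""
        self.intervals.append(interval_s)
        self.intervals = self.intervals[-self.window:]

    def suspicion(self, elapsed_since_last_s):
        """phi-style suspicion: -log10 of the probability a heartbeat is still coming."""
        shape, _, scale = weibull_min.fit(self.intervals, floc=0)
        p_late = 1.0 - weibull_min.cdf(elapsed_since_last_s, shape, loc=0, scale=scale)
        return -np.log10(max(p_late, 1e-12))

rng = np.random.default_rng(1)
fd = WeibullFailureDetector()
for dt in 0.1 * rng.weibull(2.0, 500):            # simulated heartbeat intervals (~0.1 s)
    fd.heartbeat(dt)
print(fd.suspicion(0.05), fd.suspicion(0.5))      # low vs high suspicion
```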
Kendon, Vivien M; Nemoto, Kae; Munro, William J
2010-08-13
We briefly review what a quantum computer is, what it promises to do for us and why it is so hard to build one. Among the first applications anticipated to bear fruit is the quantum simulation of quantum systems. While most quantum computation is an extension of classical digital computation, quantum simulation differs fundamentally in how the data are encoded in the quantum computer. To perform a quantum simulation, the Hilbert space of the system to be simulated is mapped directly onto the Hilbert space of the (logical) qubits in the quantum computer. This type of direct correspondence is how data are encoded in a classical analogue computer. There is no binary encoding, and increasing precision becomes exponentially costly: an extra bit of precision doubles the size of the computer. This has important consequences for both the precision and error-correction requirements of quantum simulation, and significant open questions remain about its practicality. It also means that the quantum version of analogue computers, continuous-variable quantum computers, becomes an equally efficient architecture for quantum simulation. Lessons from past use of classical analogue computers can help us to build better quantum simulators in future.
Meet EPA Environmental Engineer Terra Haxton, Ph.D.
EPA Environmental Engineer Terra Haxton, Ph.D., uses computer simulation models to protect drinking water. She investigates approaches to help water utilities be better prepared to respond to contamination incidents in their distribution systems.
A blueprint for better service.
Allen, C; Racoosin, B
1989-09-01
The authors review the thoughtful process that led an independent laboratory in Seattle from a mostly manual ordering/reporting system to a computer-based operation that has exceeded expectations for improved service to all of their laboratory customers.
Development of an autonomous video rendezvous and docking system
NASA Technical Reports Server (NTRS)
Tietz, J. C.; Kelly, J. H.
1982-01-01
Video control systems using three flashing lights and two other types of docking aids were evaluated through computer simulation and other approaches. The three light system performed much better than the others. Its accuracy is affected little by tumbling of the target spacecraft, and in the simulations it was able to cope with attitude rates up to 20,000 degrees per hour about the docking axis. Its performance with rotation about other axes is determined primarily by the state estimation and goal setting portions of the control system, not by measurement accuracy. A suitable control system, and a computer program that can serve as the basis for the physical simulation are discussed.
NASA Technical Reports Server (NTRS)
Jain, Abhinandan
2011-01-01
Ndarts software provides algorithms for computing quantities associated with the dynamics of articulated, rigid-link, multibody systems. It is designed as a general-purpose dynamics library that can be used for the modeling of robotic platforms, space vehicles, molecular dynamics, and other such applications. The architecture and algorithms in Ndarts are based on the Spatial Operator Algebra (SOA) theory for computational multibody and robot dynamics developed at JPL. It uses minimal, internal coordinate models. The algorithms are low-order, recursive scatter/gather algorithms. In comparison with the earlier Darts++ software, this version has a more general and cleaner design needed to support a larger class of computational dynamics needs. It includes a frames infrastructure, allows algorithms to operate on subgraphs of the system, and implements lazy and deferred computation for better efficiency. Dynamics modeling modules such as Ndarts are core building blocks of control and simulation software for space, robotic, mechanism, bio-molecular, and material systems modeling.
Implementation of Autonomous Control Technology for Plant Growth Chambers
NASA Technical Reports Server (NTRS)
Costello, Thomas A.; Sager, John C.; Krumins, Valdis; Wheeler, Raymond M.
2002-01-01
The Kennedy Space Center has significant infrastructure for research using controlled environment plant growth chambers. Such research supports development of bioregenerative life support technology for long-term space missions. Most of the existing chambers in Hangar L and Little L will be moved to the new Space Experiment Research and Processing Laboratory (SERPL) in the summer of 2003. The impending move has created an opportunity to update the control system technologies to allow for greater flexibility, less labor for set-up and maintenance, better diagnostics, better reliability and easier data retrieval. Part of these improvements can be realized using hardware which communicates through an ethernet connection to a central computer for supervisory control but can be operated independently of the computer during routine run-time. Both the hardware and software functionality of an envisioned system were tested on a prototype plant growth chamber (CEC-4) in Hangar L. Based upon these tests, recommendations for hardware and software selection and system design for implementation in SERPL are included.
Study of a hybrid multispectral processor
NASA Technical Reports Server (NTRS)
Marshall, R. E.; Kriegler, F. J.
1973-01-01
A hybrid processor is described offering enough handling capacity and speed to process efficiently the large quantities of multispectral data that can be gathered by scanner systems such as MSDS, SKYLAB, ERTS, and ERIM M-7. Combinations of general-purpose and special-purpose hybrid computers were examined to include both analog and digital types as well as all-digital configurations. The current trend toward lower costs for medium-scale digital circuitry suggests that the all-digital approach may offer the better solution within the time frame of the next few years. The study recommends and defines such a hybrid digital computing system in which both special-purpose and general-purpose digital computers would be employed. The tasks of recognizing surface objects would be performed in a parallel, pipeline digital system while the tasks of control and monitoring would be handled by a medium-scale minicomputer system. A program to design and construct a small, prototype, all-digital system has been started.
A Data-Based Financial Management Information System (FMIS) for Administrative Sciences Department
1990-12-01
Financial Management Information System that would result in improved management of financial assets, better use of clerical skills, and more detailed...develops and implements a personal computer-based Management Information System for the management of the many funding accounts controlled by the...different software programs, into a single all-encompassing Management Information System. The system was written using dBASE IV and is currently operational.
NASA Astrophysics Data System (ADS)
Esparza, Javier
In many areas of computer science entities can “reproduce”, “replicate”, or “create new instances”. Paramount examples are threads in multithreaded programs, processes in operating systems, and computer viruses, but many others exist: procedure calls create new incarnations of the callees, web crawlers discover new pages to be explored (and so “create” new tasks), divide-and-conquer procedures split a problem into subproblems, and leaves of tree-based data structures become internal nodes with children. For lack of a better name, I use the generic term systems with process creation to refer to all these entities.
Machine Learning, deep learning and optimization in computer vision
NASA Astrophysics Data System (ADS)
Canu, Stéphane
2017-03-01
As quoted in the Large Scale Computer Vision Systems NIPS workshop, computer vision is a mature field with a long tradition of research, but recent advances in machine learning, deep learning, representation learning and optimization have provided models with new capabilities to better understand visual content. The presentation will go through these new developments in machine learning, covering basic motivations, ideas, models and optimization in deep learning for computer vision, and identifying challenges and opportunities. It will focus on issues related to large-scale learning, that is: high-dimensional features, a large variety of visual classes, and a large number of examples.
A generalized approach to computer synthesis of digital holograms
NASA Technical Reports Server (NTRS)
Hopper, W. A.
1973-01-01
A hologram is constructed by taking a number of digitized sample points and blending them together to form a "continuous" picture. The new system selects a better set of sample points, resulting in an improved hologram from the same amount of information.
Autonomic Computing: Freedom or a Threat?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fink, Glenn A.; Frincke, Deb
2007-12-01
No longer is the question whether autonomic computing will gain general acceptance but when. Experts expect autonomic computing to be widely used within 10 years. When it does become mainstream, how will autonomics change system administration and corporations, and will the change be for better or worse? The answer depends on how well we anticipate the limitations of what autonomic systems are suited to do, whether we can collectively address the vulnerabilities of autonomic approaches as we draw upon the advantages, and whether administrators, companies, partners, and users are prepared for the transition. This article presents some design considerations to address the first two issues and some suggested survival techniques for the third.
Designing Guiding Systems for Brain-Computer Interfaces
Kosmyna, Nataliya; Lécuyer, Anatole
2017-01-01
The Brain–Computer Interface (BCI) community has focused the majority of its research efforts on signal processing and machine learning, mostly neglecting the human in the loop. Guiding users on how to use a BCI is crucial in order to teach them to produce stable brain patterns. In this work, we explore the instructions and feedback for BCIs in order to provide a systematic taxonomy to describe BCI guiding systems. The purpose of our work is to give researchers and designers in Human–Computer Interaction (HCI) the clues necessary to make the fusion between BCIs and HCI more fruitful, but also to better understand the possibilities BCIs can provide to them. PMID:28824400
Overall, the implementation of a computer-controlled hydrogen generation system and subsequent conversion of small engine equipment for hydrogen use has been surprisingly straightforward from an engineering and technology standpoint. More testing is required to get a better gr...
Electronic health records (EHRs): supporting ASCO's vision of cancer care.
Yu, Peter; Artz, David; Warner, Jeremy
2014-01-01
ASCO's vision for cancer care in 2030 is built on the expanding importance of panomics and big data, and envisions enabling better health for patients with cancer by the rapid transformation of systems biology knowledge into cancer care advances. This vision will be heavily dependent on the use of health information technology for computational biology and clinical decision support systems (CDSS). Computational biology will allow us to construct models of cancer biology that encompass the complexity of cancer panomics data and provide us with better understanding of the mechanisms governing cancer behavior. The Agency for Healthcare Research and Quality promotes CDSS based on clinical practice guidelines, which are knowledge bases that grow too slowly to match the rate of panomic-derived knowledge. CDSS that are based on systems biology models will be more easily adaptable to rapid advancements and translational medicine. We describe the characteristics of health data representation, a model for representing molecular data that supports data extraction and use for panomic-based clinical research, and argue for CDSS that are based on systems biology and are algorithm-based.
Kontopantelis, Evangelos; Buchan, Iain; Reeves, David; Checkland, Kath; Doran, Tim
2013-01-01
Objectives To investigate the relationship between performance on the UK Quality and Outcomes Framework pay-for-performance scheme and choice of clinical computer system. Design Retrospective longitudinal study. Setting Data for 2007–2008 to 2010–2011, extracted from the clinical computer systems of general practices in England. Participants All English practices participating in the pay-for-performance scheme: average 8257 each year, covering over 99% of the English population registered with a general practice. Main outcome measures Levels of achievement on 62 quality-of-care indicators, measured as: reported achievement (levels of care after excluding inappropriate patients); population achievement (levels of care for all patients with the relevant condition) and percentage of available quality points attained. Multilevel mixed effects multiple linear regression models were used to identify population, practice and clinical computing system predictors of achievement. Results Seven clinical computer systems were consistently active in the study period, collectively holding approximately 99% of the market share. Of all population and practice characteristics assessed, choice of clinical computing system was the strongest predictor of performance across all three outcome measures. Differences between systems were greatest for intermediate outcomes indicators (eg, control of cholesterol levels). Conclusions Under the UK's pay-for-performance scheme, differences in practice performance were associated with the choice of clinical computing system. This raises the question of whether particular system characteristics facilitate higher quality of care, better data recording or both. Inconsistencies across systems need to be understood and addressed, and researchers need to be cautious when generalising findings from samples of providers using a single computing system. PMID:23913774
Haematological validation of a computer-based bone marrow reporting system.
Nguyen, D T; Diamond, L W; Cavenagh, J D; Parameswaran, R; Amess, J A
1997-01-01
AIMS: To prove the safety and effectiveness of "Professor Belmonte", a knowledge-based system for bone marrow reporting, a formal evaluation of the reports generated by the system was performed. METHODS: Three haematologists (a consultant, a senior registrar, and a junior registrar), none of whom were involved in the development of the software, compared the unedited reports generated by Professor Belmonte with the original bone marrow reports in 785 unselected cases. Each haematologist independently graded the quality of Belmonte's reports using one of four categories: (a) better than the original report (more informative, containing useful information missing in the original report); (b) equivalent to the original report; (c) satisfactory, but missing information that should have been included; and (d) unsatisfactory. RESULTS: The consultant graded 64 reports as more informative than the original, 687 as equivalent to the original, 32 as satisfactory, and two as unsatisfactory. The senior registrar considered 29 reports to be better than the original, 739 to be equivalent to the original, 15 to be satisfactory, and two to be unsatisfactory. The junior registrar found that 88 reports were better than the original, 681 were equivalent to the original, 14 were satisfactory, and two were unsatisfactory. Each judge found two different reports to be unsatisfactory according to their criteria. All 785 reports generated by the computer system received at least two scores of satisfactory or better. CONCLUSIONS: In this representative study, Professor Belmonte generated bone marrow reports that proved to be as accurate as the original reports in a large university hospital. The haematology knowledge contained within the system, the reasoning process, and the function of the software are safe and effective for assisting haematologists in generating high quality bone marrow reports. PMID:9215118
Extension of a streamwise upwind algorithm to a moving grid system
NASA Technical Reports Server (NTRS)
Obayashi, Shigeru; Goorjian, Peter M.; Guruswamy, Guru P.
1990-01-01
A new streamwise upwind algorithm was derived to compute unsteady flow fields with the use of a moving-grid system. The temporally nonconservative LU-ADI (lower-upper-factored, alternating-direction-implicit) method was applied for time marching computations. A comparison of the temporally nonconservative method with a time-conservative implicit upwind method indicates that the solutions are insensitive to the conservative properties of the implicit solvers when practical time steps are used. Using this new method, computations were made for an oscillating wing at a transonic Mach number. The computed results confirm that the present upwind scheme captures the shock motion better than the central-difference scheme based on the Beam-Warming algorithm. The new upwind option of the code allows larger time steps and thus is more efficient, even though it requires slightly more computational time per time step than the central-difference option.
A Weibull distribution accrual failure detector for cloud computing
Wu, Zhibo; Wu, Jin; Zhao, Yao; Wen, Dongxin
2017-01-01
Failure detectors are used to build high availability distributed systems as the fundamental component. To meet the requirement of a complicated large-scale distributed system, accrual failure detectors that can adapt to multiple applications have been studied extensively. However, several implementations of accrual failure detectors do not adapt well to the cloud service environment. To solve this problem, a new accrual failure detector based on Weibull Distribution, called the Weibull Distribution Failure Detector, has been proposed specifically for cloud computing. It can adapt to the dynamic and unexpected network conditions in cloud computing. The performance of the Weibull Distribution Failure Detector is evaluated and compared based on public classical experiment data and cloud computing experiment data. The results show that the Weibull Distribution Failure Detector has better performance in terms of speed and accuracy in unstable scenarios, especially in cloud computing. PMID:28278229
[Research on the Application of Fuzzy Logic to Systems Analysis and Control
NASA Technical Reports Server (NTRS)
1998-01-01
Research conducted with the support of NASA Grant NCC2-275 has been focused in the main on the development of fuzzy logic and soft computing methodologies and their applications to systems analysis and control, with emphasis on problem areas which are of relevance to NASA's missions. One of the principal results of our research has been the development of a new methodology called Computing with Words (CW). Basically, in CW words drawn from a natural language are employed in place of numbers for computing and reasoning. There are two major imperatives for computing with words. First, computing with words is a necessity when the available information is too imprecise to justify the use of numbers, and second, when there is a tolerance for imprecision which can be exploited to achieve tractability, robustness, low solution cost, and better rapport with reality. Exploitation of the tolerance for imprecision is an issue of central importance in CW.
Employing subgoals in computer programming education
NASA Astrophysics Data System (ADS)
Margulieux, Lauren E.; Catrambone, Richard; Guzdial, Mark
2016-01-01
The rapid integration of technology into our professional and personal lives has left many education systems ill-equipped to deal with the influx of people seeking computing education. To improve computing education, we are applying techniques that have been developed for other procedural fields. The present study applied such a technique, subgoal labeled worked examples, to explore whether it would improve programming instruction. The first two experiments, conducted in a laboratory, suggest that the intervention improves undergraduate learners' problem-solving performance and affects how learners approach problem-solving. The third experiment demonstrates that the intervention has similar, and perhaps stronger, effects in an online learning environment with in-service K-12 teachers who want to become qualified to teach computing courses. By implementing this subgoal intervention as a tool for educators to teach themselves and their students, education systems could improve computing education and better prepare learners for an increasingly technical world.
Computerized digital dermoscopy.
Gewirtzman, A J; Braun, R P
2003-01-01
Within the past 15 years, dermoscopy has become a widely used non-invasive technique for physicians to better visualize pigmented lesions. Dermoscopy has helped trained physicians to better diagnose pigmented lesions. Now, the digital revolution is beginning to enhance standard dermoscopic procedures. Using digital dermoscopy, physicians are better able to document pigmented lesions for patient follow-up and to get second opinions, either through teledermoscopy with an expert colleague or by using computer-assisted diagnosis. As the market for digital dermoscopy products begins to grow, so does the number of decisions physicians need to make when choosing a system to fit their needs. The current market for digital dermoscopy includes two varieties of relatively simple and cheap attachments which can convert a consumer digital camera into a digital dermoscope. A coupling adapter acts as a fastener between the camera and an ordinary dermoscope, whereas a dermoscopy attachment includes the dermoscope optics and light source and can be attached directly to the camera. Other options for digital dermoscopy include complete dermoscopy systems that use a hand-held video camera linked directly to a computer. These systems differ from each other in whether or not they are calibrated as well as the quality of the camera and software interface. Another option in digital skin imaging involves spectral analysis rather than dermoscopy. This article serves as a guide to the current systems available and their capabilities.
A Service-oriented Approach towards Context-aware Mobile Learning Management Systems
2010-07-01
towards a pervasive university. Keywords: context-aware computing, service-oriented architecture, mobile computing, eLearning, learning management system. ...usage of device-specific features provides support for various ubiquitous and pervasive eLearning scenarios [2][3]. By knowing where the user currently...data from the mobile device towards a context-aware mobile LMS. II. BASIC CONCEPTS: For a better understanding of the presented eLearning scenarios
Digital Literacy Development of Students Involved in an ICT Educational Project
NASA Astrophysics Data System (ADS)
Quintana, Maria Graciela Badilla; Pujol, Meritxell Cortada
The impact of the Information and Communication Technologies (ICT) has become the core of a change that involves most fields of society; consequently, technological and informational literacy are essential requirements in education. The research is a quasi-experimental and ex-post-facto study in schools in Spain. The aim was to describe and analyze the involvement shown by 219 students who participated in an ICT development project named Ponte dos Brozos. The research objective was to determine whether students who usually worked with ICT had better knowledge and command of computing tools, and whether they were better prepared to search for and select information. Results showed that students with more contact with ICTs know about the technology and how to use it, and show better knowledge and control of the computer and operating systems and a high level of information management through the Internet, although their information literacy is still lacking.
Delvigne, Frank; Takors, Ralf; Mudde, Rob; van Gulik, Walter; Noorman, Henk
2017-09-01
Efficient optimization of microbial processes is a critical issue for achieving a number of sustainable development goals, considering the impact of microbial biotechnology in the agrofood, environment, biopharmaceutical and chemical industries. Many of these applications require scale-up after proof of concept. However, the behaviour of microbial systems remains (at least partially) unpredictable when shifting from laboratory-scale to industrial conditions. Robust microbial systems are thus highly needed in this context, as is a better understanding of the interactions between fluid mechanics and cell physiology. For that purpose, a full scale-up/down computational framework is already available. This framework links computational fluid dynamics (CFD), metabolic flux analysis and agent-based modelling (ABM) for a better understanding of cell lifelines in a heterogeneous environment. Ultimately, this framework can be used for the design of scale-down simulators and/or metabolically engineered cells able to cope with the environmental fluctuations typically found in large-scale bioreactors. However, this framework still needs some refinements, such as a better integration of gas-liquid flows in CFD and taking into account intrinsic biological noise in ABM. © 2017 The Authors. Microbial Biotechnology published by John Wiley & Sons Ltd and Society for Applied Microbiology.
ERIC Educational Resources Information Center
Jaramillo, Senaida I.
When migrant children are enrolled in the Migrant Education Program, they are also enrolled in the Migrant Student Record Transfer System (MSRTS), a national system which accumulates educational and health information for each child on a computer located in Little Rock, Arkansas. The system affords teachers the opportunity to review the records,…
GRAMPS: An Automated Ambulatory Geriatric Record
Hammond, Kenric W.; King, Carol A.; Date, Vishvanath V.; Prather, Robert J.; Loo, Lawrence; Siddiqui, Khwaja
1988-01-01
GRAMPS (Geriatric Record and Multidisciplinary Planning System) is an interactive MUMPS system developed for VA outpatient use. It allows physicians to effectively document care in problem-oriented format with structured narrative and free text, eliminating handwritten input. We evaluated the system in a one-year controlled cohort study. When the computer was used, appointment times averaged 8.2 minutes longer (32.6 vs. 24.4 minutes) compared to control visits with the same physicians. Computer use was associated with better quality of care as measured in the management of a common problem, hypertension, as well as decreased overall costs of care. When a faster computer was installed, data entry times improved, suggesting that slower processing had accounted for a substantial portion of the observed difference in appointment lengths. The GRAMPS system was well-accepted by providers. The modular design used in GRAMPS has been extended to medical-care applications in Nursing and Mental Health.
Computers in health care for the 21st century.
O'Desky, R I; Ball, M J; Ball, E E
1990-03-01
As the world enters the last decade of the 20th Century, there is a great deal of speculation about the effect of computers on the future delivery of health care. In this article, the authors attempt to identify some of the evolving computer technologies and anticipate what effect they will have by the year 2000. Rather than listing potential accomplishments, each of the affected areas (hardware, software, health care systems, and communications) is presented in an evolutionary manner so that the reader can better appreciate where we have been and where we are going.
Computational Systems Biology in Cancer: Modeling Methods and Applications
Materi, Wayne; Wishart, David S.
2007-01-01
In recent years it has become clear that carcinogenesis is a complex process, both at the molecular and cellular levels. Understanding the origins, growth and spread of cancer therefore requires an integrated or system-wide approach. Computational systems biology is an emerging sub-discipline in systems biology that utilizes the wealth of data from genomic, proteomic and metabolomic studies to build computer simulations of intra- and intercellular processes. Several useful descriptive and predictive models of the origin, growth and spread of cancers have been developed in an effort to better understand the disease and potential therapeutic approaches. In this review we describe and assess the practical and theoretical underpinnings of commonly used modeling approaches, including ordinary and partial differential equations, Petri nets, cellular automata, agent-based models and hybrid systems. A number of computer-based formalisms have been implemented to improve the accessibility of the various approaches to researchers whose primary interest lies outside of model development. We discuss several of these and describe how they have led to novel insights into tumor genesis, growth, apoptosis, vascularization and therapy. PMID:19936081
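As a small example of the ordinary-differential-equation class of models mentioned above, the sketch below integrates logistic tumor growth with a forward Euler step; the parameter values are illustrative only and are not drawn from any of the reviewed studies.

```python
import numpy as np

def simulate_logistic(n0=1e6, r=0.05, K=1e9, dt=0.1, days=200):
    """dN/dt = r*N*(1 - N/K): growth slows as the tumor approaches capacity K."""
    steps = int(days / dt)
    n = np.empty(steps + 1)
    n[0] = n0
    for i in range(steps):
        n[i + 1] = n[i] + dt * r * n[i] * (1.0 - n[i] / K)
    return n

cells = simulate_logistic()
print("final cell count: %.2e" % cells[-1])
```

The same structure extends naturally to the hybrid and agent-based formalisms discussed in the review, where each cell or compartment carries its own state instead of a single aggregate count.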
ATTACK WARNING: Better Management Required to Resolve NORAD Integration Deficiencies
1989-07-01
protocols...cumbersome integration...different manufacturers' computer systems can communicate with each other. The warning and assessment subsystems...by treating the TW/AA system as a single system subject to program review and oversight by the Defense Acquisition Board. Within this management...restore the unit to operation quickly enough after a power loss to meet NORAD mission requirements. The Air Force intends to have the contractor
Decision-support systems for natural-hazards and land-management issues
Dinitz, Laura; Forney, William; Byrd, Kristin
2012-01-01
Scientists at the USGS Western Geographic Science Center are developing decision-support systems (DSSs) for natural-hazards and land-management issues. DSSs are interactive computer-based tools that use data and models to help identify and solve problems. These systems can provide crucial support to policymakers, planners, and communities for making better decisions about long-term natural hazards mitigation and land-use planning.
Williams, Peter Huw; de Lusignan, Simon
2006-01-01
The Royal College of Physicians (RCP) have produced guidelines for stroke management in primary care; this guidance is taken to be the gold standard for the care of people with stroke. UK general practitioners now have a quality-based contract which includes a Quality and Outcomes Framework (QOF). This consists of financially remunerated 'quality points' for specific disease areas, including stroke. Achievement of these quality points is measured by extracting a limited list of computer codes from practice computer systems. To investigate whether a high stroke quality score is associated with adherence to RCP guidelines. Examination of computer and written medical records of all patients with a diagnosis of stroke. Two general practices, one in southwest London, one in Surrey, with a combined practice population of over 20 000. Both practices had a similar age-sex profile and prevalence of stroke. One practice scored 93.5% (29/31) of the available stroke quality points. The other practice achieved 73.4% (22.75/31), and only did better in one stroke quality target. However, the practice scoring fewer quality points had much better adherence to RCP guidance: 96% of patients were assessed in secondary care compared with 79% (P=0.001); 64% of stroke patients were seen the same day, compared with 44%; 56% received rehabilitation compared with 37%. Higher quality points did not reflect better adherence to RCP guidance. This audit highlights a gap between relatively simplistic measures of quality in the QOF, dependent on the recording of a narrow range of computer codes, and the actual standard of care being delivered. Research is needed to see whether this finding is generalisable and how the Quality and Outcomes Framework might be better aligned with delivering best practice.
NASA Astrophysics Data System (ADS)
Yeh, Cheng-Ta; Lin, Yi-Kuei; Yang, Jo-Yun
2018-07-01
Network reliability is an important performance index for many real-life systems, such as electric power systems, computer systems and transportation systems. These systems can be modelled as stochastic-flow networks (SFNs) composed of arcs and nodes. Most system supervisors pursue network reliability maximization by finding the optimal multi-state resource assignment, in which one resource is assigned to each arc. However, a disaster may cause correlated failures of the assigned resources, affecting the network reliability. This article focuses on determining the optimal resource assignment with maximal network reliability for SFNs. To solve the problem, this study proposes a hybrid algorithm integrating the genetic algorithm and tabu search to determine the optimal assignment, called the hybrid GA-TS algorithm (HGTA), and integrates minimal paths, the recursive sum of disjoint products and the correlated binomial distribution to calculate network reliability. Several practical numerical experiments are adopted to demonstrate that HGTA has better computational quality than several popular soft computing algorithms.
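A rough sketch of how such a GA plus tabu-search hybrid can be organized is given below. The reliability evaluation is a hypothetical placeholder (the paper computes it from minimal paths, the recursive sum of disjoint products and a correlated binomial distribution), and the problem sizes, operators and parameters are assumptions for illustration.

```python
import random

N_ARCS, N_RESOURCES = 8, 12

def network_reliability(assignment):
    # Hypothetical placeholder fitness; a real evaluation would use minimal
    # paths and sum-of-disjoint-products with correlated failure probabilities.
    rng = random.Random(hash(tuple(assignment)) & 0xFFFF)
    return rng.random()

def random_assignment():
    return random.sample(range(N_RESOURCES), N_ARCS)   # one resource per arc

def crossover(a, b):
    child, used = [], set()
    for x, y in zip(a, b):
        pick = x if x not in used else y
        if pick in used:                                # both taken: repair
            pick = next(r for r in range(N_RESOURCES) if r not in used)
        child.append(pick); used.add(pick)
    return child

def mutate(ind, p=0.2):
    if random.random() < p:
        i = random.randrange(N_ARCS)
        ind[i] = random.choice([r for r in range(N_RESOURCES) if r not in ind])
    return ind

def tabu_search(start, iters=20, tabu_len=5):
    best = cur = start
    tabu = []
    for _ in range(iters):
        i = random.randrange(N_ARCS)
        r = random.choice([x for x in range(N_RESOURCES) if x not in cur])
        move = (i, r)
        if move in tabu:
            continue
        cand = cur[:]; cand[i] = r
        tabu = (tabu + [move])[-tabu_len:]
        cur = cand
        if network_reliability(cand) > network_reliability(best):
            best = cand
    return best

def hgta(pop_size=20, generations=30):
    pop = [random_assignment() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=network_reliability, reverse=True)
        pop[0] = tabu_search(pop[0])                    # refine the elite
        parents = pop[: pop_size // 2]
        children = [mutate(crossover(*random.sample(parents, 2)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return max(pop, key=network_reliability)

print("best assignment found:", hgta())
```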
ERIC Educational Resources Information Center
Bliemel, Michael; Ali-Hassan, Hossam
2014-01-01
For several years, we used Intel's flash-based game "IT Manager 3: Unseen Forces" as an experiential learning tool, where students had to act as a manager making real-time prioritization decisions about repairing computer problems, training and upgrading systems with better technologies as well as managing increasing numbers of technical…
Using VirtualGL/TurboVNC Software on the Peregrine System
NREL High-Performance Computing
Using VirtualGL/TurboVNC software on the Peregrine system allows users to access and share large-memory visualization nodes with high-end graphics processing units, which may be better than just using X11 forwarding when connecting from a remote site with low bandwidth.
A Lightweight, High-performance I/O Management Package for Data-intensive Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jun Wang
2007-07-17
File storage systems are playing an increasingly important role in high-performance computing as the performance gap between CPU and disk increases. It could take a long time to develop an entire system from scratch. Solutions will have to be built as extensions to existing systems. If new portable, customized software components are plugged into these systems, better sustained high I/O performance and higher scalability will be achieved, and the development cycle of the next generation of parallel file systems will be shortened. The overall research objective of this ECPI development plan aims to develop a lightweight, customized, high-performance I/O management package named LightI/O to extend and leverage current parallel file systems used by DOE. During this period, we have developed a novel component in LightI/O, prototyped it into PVFS2, and evaluated the resultant prototype (the extended PVFS2 system) on data-intensive applications. The preliminary results indicate the extended PVFS2 delivers better performance and reliability to users. A strong collaborative effort between the PI at the University of Nebraska Lincoln and the DOE collaborators, Drs Rob Ross and Rajeev Thakur at Argonne National Laboratory, who are leading the PVFS2 group, makes the project more promising.
Review of Collaborative Tools for Planning and Engineering
2007-10-01
(including PDAs) and operating systems. In general, should support laptops, desktops, Windows OS, Mac OS, Palm OS, Windows CE, Blackberry, Sun...better), voting (to establish operating parameters), reactor design, wind tunnel simulation...display same material on every computer, synchronisation
1992-02-27
This map shows the presence of water vapor over global oceans. The imagery was produced by combining Special Sensor Microwave Imager measurements and computer models. This data will help scientists better understand how weather systems move water vapor from the tropics toward the poles producing precipitation.
What Is an Activity? Appropriating an Activity-Centric System
NASA Astrophysics Data System (ADS)
Yarosh, Svetlana; Matthews, Tara; Moran, Thomas P.; Smith, Barton
Activity-Centric Computing (ACC) systems seek to address the fragmentation of office work across tools and documents by allowing users to organize work around the computational construct of an Activity. Defining and structuring appropriate Activities within a system poses a challenge for users that must be overcome in order to benefit from ACC support. We know little about how knowledge workers appropriate the Activity construct. To address this, we studied users’ appropriation of a production-quality ACC system, Lotus Activities, for everyday work by employees in a large corporation. We contribute to a better understanding of how users articulate their individual and collaborative work in the system by providing empirical evidence of their patterns of appropriation. We conclude by discussing how our findings can inform the design of other ACC systems for the workplace.
Cogbill, Thomas H; Ziegelbein, Kurt J
2011-02-01
The basic principles underlying computed tomography, magnetic resonance, and ultrasound are reviewed to promote better understanding of the properties and appropriate applications of these 3 common imaging modalities. A glossary of frequently used terms for each technique is appended for convenience. Risks to patient safety including contrast-induced nephropathy, radiation-induced malignancy, and nephrogenic systemic fibrosis are discussed. Copyright © 2011 Elsevier Inc. All rights reserved.
EOS MLS Lessons Learned: Design Ideas for Safer and Lower Cost Operations
NASA Technical Reports Server (NTRS)
Miller, Dominick
2012-01-01
The Earth Observing System (EOS) Microwave Limb Sounder (MLS) is a complex instrument with a front end computer and 32 subsystem computers. MLS is one of four instruments on NASA's EOS Aura spacecraft. With almost 8 years in orbit, MLS has a few lessons learned which can be applied during the design phase of future instruments to effect better longevity, more robust operations and a significant cost benefit during the operations phase.
NAVO MSRC Navigator. Spring 2003
2003-01-01
...computational model run on the IBM POWER4 (MARCELLUS) in support of the Airborne Laser Challenge Project II. The data were visualized using Alias|Wavefront Maya...Turbulence in a Jet Stream in the Airborne Laser Context...Largest NAVO MSRC System Becomes Even Bigger and Better...Using the smp...centimeters (cm). The resolution requirement to resolve the microjets and the flow outside in the combustor is too severe for any single numerical method
Mass Storage System Upgrades at the NASA Center for Computational Sciences
NASA Technical Reports Server (NTRS)
Tarshish, Adina; Salmon, Ellen; Macie, Medora; Saletta, Marty
2000-01-01
The NASA Center for Computational Sciences (NCCS) provides supercomputing and mass storage services to over 1200 Earth and space scientists. During the past two years, the mass storage system at the NCCS went through a great deal of changes both major and minor. Tape drives, silo control software, and the mass storage software itself were upgraded, and the mass storage platform was upgraded twice. Some of these upgrades were aimed at achieving year-2000 compliance, while others were simply upgrades to newer and better technologies. In this paper we will describe these upgrades.
A Computational Model of Reasoning from the Clinical Literature
Rennels, Glenn D.
1986-01-01
This paper explores the premise that a formalized representation of empirical studies can play a central role in computer-based decision support. The specific motivations underlying this research include the following propositions: 1. Reasoning from experimental evidence contained in the clinical literature is central to the decisions physicians make in patient care. 2. A computational model, based upon a declarative representation for published reports of clinical studies, can drive a computer program that selectively tailors knowledge of the clinical literature as it is applied to a particular case. 3. The development of such a computational model is an important first step toward filling a void in computer-based decision support systems. Furthermore, the model may help us better understand the general principles of reasoning from experimental evidence both in medicine and other domains. Roundsman is a developmental computer system which draws upon structured representations of the clinical literature in order to critique plans for the management of primary breast cancer. Roundsman is able to produce patient-specific analyses of breast cancer management options based on the 24 clinical studies currently encoded in its knowledge base. The Roundsman system is a first step in exploring how the computer can help to bring a critical analysis of the relevant literature to the physician, structured around a particular patient and treatment decision.
NASA Technical Reports Server (NTRS)
Spencer, M. M.; Wolf, J. M.; Schall, M. A.
1974-01-01
A system of computer programs was developed which performs geometric rectification and line-by-line mapping of airborne multispectral scanner data to ground coordinates and estimates ground area. The system requires aircraft attitude and positional information furnished by ancillary aircraft equipment, as well as ground control points. The geometric correction and mapping procedure locates the scan lines, or the pixels on each line, in terms of map grid coordinates. The area estimation procedure gives ground area for each pixel or for a predesignated parcel specified in map grid coordinates. Exercising the system with simulated data showed both the uncorrected video and the corrected imagery and produced area estimates accurate to better than 99.7%.
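As a simplified illustration of the mapping step, the sketch below fits an affine transformation from scanner (line, pixel) coordinates to map grid coordinates using ground control points and least squares; the control-point values are made up, and the actual system additionally uses aircraft attitude and position data.

```python
import numpy as np

# (line, pixel) scanner coordinates and corresponding (easting, northing)
# map grid coordinates of five hypothetical ground control points.
pix = np.array([[10, 15], [10, 200], [120, 30], [200, 180], [250, 60]], float)
gnd = np.array([[5003, 7010], [5010, 7195], [5113, 7028], [5196, 7182],
                [5242, 7061]], float)

A = np.hstack([pix, np.ones((len(pix), 1))])       # rows of [line, pixel, 1]
coef, *_ = np.linalg.lstsq(A, gnd, rcond=None)     # 3x2 affine coefficients

def to_map(line, pixel):
    """Map a scanner pixel to map grid coordinates with the fitted transform."""
    return np.array([line, pixel, 1.0]) @ coef

print("pixel (120, 30) maps to", to_map(120, 30))
```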
Prediction based proactive thermal virtual machine scheduling in green clouds.
Kinger, Supriya; Kumar, Rajesh; Sharma, Anju
2014-01-01
Cloud computing has rapidly emerged as a widely accepted computing paradigm, but research on Cloud computing is still at an early stage. Cloud computing provides many advanced features but it still has some shortcomings, such as relatively high operating cost and environmental hazards like increasing carbon footprints. These hazards can be reduced to some extent by efficient scheduling of Cloud resources. The working temperature at which a machine is currently running can be taken as a criterion for Virtual Machine (VM) scheduling. This paper proposes a new proactive technique that considers the current and maximum threshold temperatures of Server Machines (SMs) before making scheduling decisions, with the help of a temperature predictor, so that the maximum temperature is never reached. Different workload scenarios have been taken into consideration. The results obtained show that the proposed system is better than existing VM scheduling systems, which do not consider the current temperature of nodes before making scheduling decisions. Thus, a reduction in the need for cooling systems in a Cloud environment has been obtained and validated.
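A minimal sketch of the proactive idea, as we read it (not the authors' code), is shown below: each candidate server's temperature after accepting the VM is predicted, and placement is allowed only if the prediction stays below the server's maximum threshold. The linear predictor, margin, and data layout are assumptions.

```python
def predict_temperature(server, vm_load, heat_per_load=0.8):
    """Naive linear predictor: current temperature plus load-induced heating."""
    return server["temp"] + heat_per_load * vm_load

def schedule_vm(servers, vm_load, safety_margin=2.0):
    candidates = []
    for s in servers:
        predicted = predict_temperature(s, vm_load)
        if predicted + safety_margin <= s["max_temp"]:
            candidates.append((predicted, s))
    if not candidates:
        return None                         # defer: every server would overheat
    # prefer the coolest predicted server to spread heat evenly
    return min(candidates, key=lambda c: c[0])[1]

servers = [
    {"name": "sm1", "temp": 62.0, "max_temp": 70.0},
    {"name": "sm2", "temp": 55.0, "max_temp": 70.0},
    {"name": "sm3", "temp": 68.5, "max_temp": 70.0},
]
chosen = schedule_vm(servers, vm_load=5.0)
print("place VM on:", chosen["name"] if chosen else "none (deferred)")
```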
Jain, Aditi; Asrani, Hemant; Singhal, Abhinav Chand; Bhatia, Taranjeet Kaur; Sharma, Vaibhav; Jaiswal, Pragya
2016-01-01
Aims: To compare the canal transportation, centering ability, and remaining dentin thickness of WaveOne and ProTaper systems using cone beam computed tomography. Subjects and Methods: Forty extracted human single-rooted premolars were used in the present study. Preinstrumentation scanning of all teeth was performed; canal curvatures were calculated, and the samples were randomly divided into two groups, with twenty samples in each group; one group was instrumented with the WaveOne system and the other group with the ProTaper rotary system. Postinstrumentation scans were performed, and the two scans were compared to determine canal transportation, centering ability, and remaining dentin thickness at 3 mm, 6 mm, and 9 mm from the root apex. Statistical Analysis Used: Student's unpaired t-test. Results: Using Student's unpaired t-test, results were as follows: for canal transportation, Group 1 showed a significant difference at 3 mm and 6 mm and an insignificant difference at 9 mm, while Group 2 showed insignificant differences in all three regions. For centering ability and remaining dentin thickness, Group 1 showed insignificant differences at 3 mm and 9 mm, while a significant difference at 6 mm was obtained. When remaining dentin thickness was compared at the three levels between the WaveOne and ProTaper groups, there was no significant difference between the two groups. Conclusions: (1) The WaveOne single reciprocating file system preserved canal anatomy better than ProTaper. (2) Individually, the centering ability of WaveOne was better at the 3 mm, 6 mm, and 9 mm levels. (3) However, ProTaper individually was better centered at the 3 mm (apical third) and 9 mm (coronal third) levels than at the 6 mm level (middle third). PMID:27656063
Automated support tool for variable rate irrigation prescriptions
USDA-ARS?s Scientific Manuscript database
Variable rate irrigation (VRI) enables center pivot management to better meet non-uniform water and fertility needs. This is accomplished through correctly matching system water application with spatial and temporal variability within the field. A computer program was modified to accommodate GIS dat...
Gastrointestinal robot-assisted surgery. A current perspective.
Lunca, Sorinel; Bouras, George; Stanescu, Alexandru Calin
2005-12-01
Minimally invasive techniques have revolutionized operative surgery. Computer aided surgery and robotic surgical systems strive to improve further on currently available minimally invasive surgery and open new horizons. Only several centers are currently using surgical robots and publishing data. In gastrointestinal surgery, robotic surgery is applied to a wide range of procedures, but is still in its infancy. Cholecystectomy, Nissen fundoplication and Heller myotomy are among the most frequently performed operations. The ZEUS (Computer Motion, Goleta, CA) and the da Vinci (Intuitive Surgical, Mountain View, CA) surgical systems are today the most advanced robotic systems used in gastrointestinal surgery. Most studies reported that robotic gastrointestinal surgery is feasible and safe, provides improved dexterity, better visualization, reduced fatigue and high levels of precision when compared to conventional laparoscopic surgery. Its main drawbacks are the absence of force feedback and extremely high costs. At this moment there are no reports to clearly demonstrate the superiority of robotics over conventional laparoscopic surgery. Further research and more prospective randomized trials are needed to better define the optimal application of this new technology in gastrointestinal surgery.
NASA Astrophysics Data System (ADS)
Cavaglieri, Daniele; Bewley, Thomas
2015-04-01
Implicit/explicit (IMEX) Runge-Kutta (RK) schemes are effective for time-marching ODE systems with both stiff and nonstiff terms on the RHS; such schemes implement an (often A-stable or better) implicit RK scheme for the stiff part of the ODE, which is often linear, and, simultaneously, a (more convenient) explicit RK scheme for the nonstiff part of the ODE, which is often nonlinear. Low-storage RK schemes are especially effective for time-marching high-dimensional ODE discretizations of PDE systems on modern (cache-based) computational hardware, in which memory management is often the most significant computational bottleneck. In this paper, we develop and characterize eight new low-storage implicit/explicit RK schemes which have higher accuracy and better stability properties than the only low-storage implicit/explicit RK scheme available previously, the venerable second-order Crank-Nicolson/Runge-Kutta-Wray (CN/RKW3) algorithm that has dominated the DNS/LES literature for the last 25 years, while requiring similar storage (two, three, or four registers of length N) and comparable floating-point operations per timestep.
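For readers unfamiliar with the IMEX idea, the sketch below shows a first-order implicit/explicit Euler step for du/dt = L(u) + N(u), treating a stiff linear decay term implicitly and a mild nonlinear term explicitly; it is only an illustration of the splitting, not one of the paper's low-storage Runge-Kutta schemes.

```python
import numpy as np

def imex_euler_march(u, c, nonlinear, dt, nsteps):
    """One register for u: backward Euler on L(u) = -c*u, forward Euler on N(u).
    Derivation: u_{n+1} = u_n - dt*c*u_{n+1} + dt*N(u_n)
             => u_{n+1} = (u_n + dt*N(u_n)) / (1 + dt*c)."""
    for _ in range(nsteps):
        u = (u + dt * nonlinear(u)) / (1.0 + dt * c)
    return u

u0 = np.ones(4)
N = lambda u: 0.1 * u * (1.0 - u)          # mildly nonlinear, nonstiff term
print(imex_euler_march(u0, c=50.0, nonlinear=N, dt=0.1, nsteps=100))
```

The stiff term (here c = 50) would force a tiny time step on a fully explicit method, whereas the implicit treatment keeps the step stable; higher-order IMEX RK schemes like those in the paper stack several such stages while reusing a small number of storage registers.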
NASA Technical Reports Server (NTRS)
Brooks, Rodney Allen; Stein, Lynn Andrea
1994-01-01
We describe a project to capitalize on newly available levels of computational resources in order to understand human cognition. We will build an integrated physical system including vision, sound input and output, and dextrous manipulation, all controlled by a continuously operating large scale parallel MIMD computer. The resulting system will learn to 'think' by building on its bodily experiences to accomplish progressively more abstract tasks. Past experience suggests that in attempting to build such an integrated system we will have to fundamentally change the way artificial intelligence, cognitive science, linguistics, and philosophy think about the organization of intelligence. We expect to be able to better reconcile the theories that will be developed with current work in neuroscience.
NASA Technical Reports Server (NTRS)
Deere, Karen A.; Viken, Sally A.; Carter, Melissa B.; Viken, Jeffrey K.; Derlaga, Joseph M.; Stoll, Alex M.
2017-01-01
A variety of tools, from fundamental to high order, have been used to better understand applications of distributed electric propulsion to aid the wing and propulsion system design of the Leading Edge Asynchronous Propulsion Technology (LEAPTech) project and the X-57 Maxwell airplane. Three high-fidelity, Navier-Stokes computational fluid dynamics codes used during the project with results presented here are FUN3D, STAR-CCM+, and OVERFLOW. These codes employ various turbulence models to predict fully turbulent and transitional flow. Results from these codes are compared for two distributed electric propulsion configurations: the wing tested at NASA Armstrong on the Hybrid-Electric Integrated Systems Testbed truck, and the wing designed for the X-57 Maxwell airplane. Results from these computational tools for the high-lift wing tested on the Hybrid-Electric Integrated Systems Testbed truck and for the X-57 high-lift wing compare reasonably well. The goal of the X-57 wing and distributed electric propulsion system design, achieving or exceeding the required C_L = 3.95 for stall speed, was confirmed with all of the computational codes.
Federal Research Opportunities: DOE, DOD, and HHS Need Better Guidance for Participant Activities
2016-01-01
process controls of advanced power systems, gas sensors and high temperatures, improving extraction of earth elements, quantum computing, biofilms...chronic diseases (e.g., heart, obesity, cancer), environmental health, toxic substances, health statistics, and public health preparedness. Food and...Health: localization of proteins using molecular markers, gene regulatory effects in cancer, medical informatics, and central nervous system
Characterization of structural connections for multicomponent systems
NASA Technical Reports Server (NTRS)
Lawrence, Charles; Huckelbridge, Arthur A.
1988-01-01
This study explores combining Component Mode Synthesis methods for coupling structural components with Parameter Identification procedures for improving the analytical modeling of the connections. Improvements in the connection stiffness and damping properties are computed in terms of physical parameters so that the physical characteristics of the connections can be better understood, in addition to providing improved input for the system model.
Application of Local Discretization Methods in the NASA Finite-Volume General Circulation Model
NASA Technical Reports Server (NTRS)
Yeh, Kao-San; Lin, Shian-Jiann; Rood, Richard B.
2002-01-01
We present the basic ideas of the dynamics system of the finite-volume General Circulation Model developed at NASA Goddard Space Flight Center for climate simulations and other applications in meteorology. The dynamics of this model is designed with emphasis on conservative and monotonic transport, where the property of Lagrangian conservation is used to maintain the physical consistency of the computational fluid for long-term simulations. As the model benefits from the noise-free solutions of monotonic finite-volume transport schemes, the property of Lagrangian conservation also partly compensates for the loss of transport accuracy caused by the diffusion effects of the monotonicity treatment. By faithfully maintaining the fundamental laws of physics during the computation, this model is able to achieve sufficient accuracy for the global consistency of climate processes. Because the computing algorithms are based on local memory, this model has the advantage of efficiency in parallel computation with distributed memory. Further research is still desirable to reduce the diffusion effects of monotonic transport for better accuracy, and to mitigate the limitation due to fast-moving gravity waves for better efficiency.
Counterfactual quantum computation through quantum interrogation
NASA Astrophysics Data System (ADS)
Hosten, Onur; Rakher, Matthew T.; Barreiro, Julio T.; Peters, Nicholas A.; Kwiat, Paul G.
2006-02-01
The logic underlying the coherent nature of quantum information processing often deviates from intuitive reasoning, leading to surprising effects. Counterfactual computation constitutes a striking example: the potential outcome of a quantum computation can be inferred, even if the computer is not run. Relying on similar arguments to interaction-free measurements (or quantum interrogation), counterfactual computation is accomplished by putting the computer in a superposition of `running' and `not running' states, and then interfering the two histories. Conditional on the as-yet-unknown outcome of the computation, it is sometimes possible to counterfactually infer information about the solution. Here we demonstrate counterfactual computation, implementing Grover's search algorithm with an all-optical approach. It was believed that the overall probability of such counterfactual inference is intrinsically limited, so that it could not perform better on average than random guesses. However, using a novel `chained' version of the quantum Zeno effect, we show how to boost the counterfactual inference probability to unity, thereby beating the random guessing limit. Our methods are general and apply to any physical system, as illustrated by a discussion of trapped-ion systems. Finally, we briefly show that, in certain circumstances, counterfactual computation can eliminate errors induced by decoherence.
Viejo, Guillaume; Khamassi, Mehdi; Brovelli, Andrea; Girard, Benoît
2015-01-01
Current learning theory provides a comprehensive description of how humans and other animals learn, and places behavioral flexibility and automaticity at the heart of adaptive behaviors. However, the computations supporting the interactions between goal-directed and habitual decision-making systems are still poorly understood. Previous functional magnetic resonance imaging (fMRI) results suggest that the brain hosts complementary computations that may differentially support goal-directed and habitual processes in the form of a dynamical interplay rather than a serial recruitment of strategies. To better elucidate the computations underlying flexible behavior, we develop a dual-system computational model that can predict both performance (i.e., participants' choices) and modulations in reaction times during learning of a stimulus–response association task. The habitual system is modeled with a simple Q-Learning algorithm (QL). For the goal-directed system, we propose a new Bayesian Working Memory (BWM) model that searches for information in the history of previous trials in order to minimize Shannon entropy. We propose a model for QL and BWM coordination such that the expensive memory manipulation is under control of, among others, the level of convergence of the habitual learning. We test the ability of QL or BWM alone to explain human behavior, and compare them with the performance of model combinations, to highlight the need for such combinations to explain behavior. Two of the tested combination models are derived from the literature, and the latter is our new proposal. In conclusion, all subjects were better explained by model combinations, and the majority of them are explained by our new coordination proposal. PMID:26379518
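As an illustration of the habitual component, the sketch below implements tabular Q-Learning with softmax action selection for a toy stimulus-response association task; the task structure and parameter values are invented, and the Bayesian Working Memory model and the coordination mechanism are not reproduced.

```python
import math
import random

def q_learning(n_stimuli=3, n_actions=4, alpha=0.1, beta=3.0,
               gamma=0.0, trials=500, correct=None):
    # made-up mapping from stimulus to its rewarded response
    correct = correct or {s: s % n_actions for s in range(n_stimuli)}
    Q = [[0.0] * n_actions for _ in range(n_stimuli)]
    for _ in range(trials):
        s = random.randrange(n_stimuli)
        # softmax action selection with inverse temperature beta
        weights = [math.exp(beta * q) for q in Q[s]]
        a = random.choices(range(n_actions), weights=weights)[0]
        r = 1.0 if a == correct[s] else 0.0
        # standard temporal-difference update (gamma = 0: one-step task)
        Q[s][a] += alpha * (r + gamma * max(Q[s]) - Q[s][a])
    return Q

for s, row in enumerate(q_learning()):
    print("stimulus", s, "->", [round(q, 2) for q in row])
```

In the dual-system picture described above, such Q-values converge slowly across trials, and it is this level of convergence that can gate how often the costlier memory-based system needs to be consulted.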
Network gateway security method for enterprise Grid: a literature review
NASA Astrophysics Data System (ADS)
Sujarwo, A.; Tan, J.
2017-03-01
The computational Grid has brought big computational resources closer to scientists. It enables people to do a large computational job anytime and anywhere without any physical border. However, the massive and dispersed set of computer participants, whether users or computational providers, raises security problems. The challenge is how the security system, especially the one which filters data at the gateway, can work flexibly depending on the registered Grid participants. This paper surveys what people have done to approach this challenge, in order to find a better and new method for the enterprise Grid. The finding of this paper is a dynamically controlled enterprise firewall that secures the Grid resources from unwanted connections, with a new firewall controlling method and components.
Curvature of blended rolled edge reflectors at the shadow boundary contour
NASA Technical Reports Server (NTRS)
Ellingson, S. W.
1988-01-01
A technique is advanced for computing the radius of curvature of blended rolled edge reflector surfaces at the shadow boundary, in the plane perpendicular to the shadow boundary contour. This curvature must be known in order to compute the spurious endpoint contributions in the physical optics (PO) solution for the scattering from reflectors with rolled edges. The technique is applicable to reflectors with radially-defined rim-shapes and rolled edge terminations. The radius of curvature for several basic reflector systems is computed, and it is shown that this curvature can vary greatly along the shadow boundary contour. Finally, the total PO field in the target zone of a sample compact range system is computed and corrected using the shadow boundary radius of curvature, obtained using the technique. It is shown that the fields obtained are a better approximation to the true scattered fields.
The BioIntelligence Framework: a new computational platform for biomedical knowledge computing.
Farley, Toni; Kiefer, Jeff; Lee, Preston; Von Hoff, Daniel; Trent, Jeffrey M; Colbourn, Charles; Mousses, Spyro
2013-01-01
Breakthroughs in molecular profiling technologies are enabling a new data-intensive approach to biomedical research, with the potential to revolutionize how we study, manage, and treat complex diseases. The next great challenge for clinical applications of these innovations will be to create scalable computational solutions for intelligently linking complex biomedical patient data to clinically actionable knowledge. Traditional database management systems (DBMS) are not well suited to representing complex syntactic and semantic relationships in unstructured biomedical information, introducing barriers to realizing such solutions. We propose a scalable computational framework for addressing this need, which leverages a hypergraph-based data model and query language that may be better suited for representing complex multi-lateral, multi-scalar, and multi-dimensional relationships. We also discuss how this framework can be used to create rapid learning knowledge base systems to intelligently capture and relate complex patient data to biomedical knowledge in order to automate the recovery of clinically actionable information.
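To make the hypergraph notion concrete, here is a toy sketch of a data model in which a single hyperedge can connect any number of entities (for example a patient, a variant, a drug and a publication); the class, names and attributes are illustrative assumptions, not the framework's actual API.

```python
from collections import defaultdict

class Hypergraph:
    def __init__(self):
        self.edges = {}                        # edge id -> {"nodes", "attrs"}
        self.incidence = defaultdict(set)      # node -> ids of edges containing it

    def add_edge(self, edge_id, nodes, **attrs):
        self.edges[edge_id] = {"nodes": set(nodes), "attrs": attrs}
        for n in nodes:
            self.incidence[n].add(edge_id)

    def neighbors(self, node):
        """All nodes that co-occur with `node` in at least one hyperedge."""
        out = set()
        for e in self.incidence[node]:
            out |= self.edges[e]["nodes"]
        return out - {node}

hg = Hypergraph()
hg.add_edge("obs1", ["patient:42", "variant:BRAF_V600E", "drug:vemurafenib"],
            source="chart review")
hg.add_edge("lit1", ["variant:BRAF_V600E", "publication:PMID_123"],
            source="literature")
print(hg.neighbors("variant:BRAF_V600E"))
```

Because one edge can tie together arbitrarily many heterogeneous entities, queries such as "everything observed or published about this variant" become single-hop traversals rather than multi-way relational joins, which is the intuition behind the framework's choice of model.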
Thermodynamics of Computational Copying in Biochemical Systems
NASA Astrophysics Data System (ADS)
Ouldridge, Thomas E.; Govern, Christopher C.; ten Wolde, Pieter Rein
2017-04-01
Living cells use readout molecules to record the state of receptor proteins, similar to measurements or copies in typical computational devices. But is this analogy rigorous? Can cells be optimally efficient, and if not, why? We show that, as in computation, a canonical biochemical readout network generates correlations; extracting no work from these correlations sets a lower bound on dissipation. For general input, the biochemical network cannot reach this bound, even with arbitrarily slow reactions or weak thermodynamic driving. It faces an accuracy-dissipation trade-off that is qualitatively distinct from and worse than implied by the bound, and more complex steady-state copy processes cannot perform better. Nonetheless, the cost remains close to the thermodynamic bound unless accuracy is extremely high. Additionally, we show that biomolecular reactions could be used in thermodynamically optimal devices under exogenous manipulation of chemical fuels, suggesting an experimental system for testing computational thermodynamics.
Darwin v. 2.0: an interpreted computer language for the biosciences.
Gonnet, G H; Hallett, M T; Korostensky, C; Bernardin, L
2000-02-01
We announce the availability of the second release of Darwin v. 2.0, an interpreted computer language especially tailored to researchers in the biosciences. The system is a general tool applicable to a wide range of problems. This second release improves Darwin version 1.6 in several ways: it now contains (1) a larger set of libraries touching most of the classical problems from computational biology (pairwise alignment, all versus all alignments, tree construction, multiple sequence alignment), (2) an expanded set of general purpose algorithms (search algorithms for discrete problems, matrix decomposition routines, complex/long integer arithmetic operations), (3) an improved language with a cleaner syntax, (4) better on-line help, and (5) a number of fixes to user-reported bugs. Darwin is made available for most operating systems free of charge from the Computational Biochemistry Research Group (CBRG), reachable at http://chrg.inf.ethz.ch. darwin@inf.ethz.ch
Strauss, G; Winkler, D; Jacobs, S; Trantakis, C; Dietz, A; Bootz, F; Meixensberger, J; Falk, V
2005-07-01
This study examines the advantages and disadvantages of a commercial telemanipulator system (daVinci, Intuitive Surgical, USA) with computer-guided instruments in functional endoscopic sinus surgery (FESS). We performed five different surgical FESS steps on 14 anatomical preparations and compared them with conventional FESS. A total of 140 procedures were examined taking into account the following parameters: degrees of freedom (DOF), duration, learning curve, force feedback, and human-machine interface. Telemanipulatory instruments have more DOF available than conventional instrumentation in FESS. The average time consumed by configuration of the telemanipulator is around 9+/-2 min. Missing force feedback is evaluated mainly as a disadvantage of the telemanipulator. Scaling was evaluated as helpful. The ergonomic concept seems to be better than the conventional solution. Computer-guided instruments showed better results for the available DOF of the instruments. The human-machine interface is more adaptable and variable than in conventional instrumentation. Motion scaling and indexing are characteristics of the telemanipulator concept which were helpful for FESS in our study.
FUN3D and CFL3D Computations for the First High Lift Prediction Workshop
NASA Technical Reports Server (NTRS)
Park, Michael A.; Lee-Rausch, Elizabeth M.; Rumsey, Christopher L.
2011-01-01
Two Reynolds-averaged Navier-Stokes codes were used to compute flow over the NASA Trapezoidal Wing at high lift conditions for the 1st AIAA CFD High Lift Prediction Workshop, held in Chicago in June 2010. The unstructured-grid code FUN3D and the structured-grid code CFL3D were applied to several different grid systems. The effects of code, grid system, turbulence model, viscous term treatment, and brackets were studied. The SST model on this configuration predicted lower lift than the Spalart-Allmaras model at high angles of attack; the Spalart-Allmaras model agreed better with experiment. Neglecting viscous cross-derivative terms caused poorer prediction in the wing tip vortex region. Output-based grid adaptation was applied to the unstructured-grid solutions. The adapted grids better resolved wake structures and reduced flap flow separation, which was also observed in uniform grid refinement studies. Limitations of the adaptation method as well as areas for future improvement were identified.
Classified one-step high-radix signed-digit arithmetic units
NASA Astrophysics Data System (ADS)
Cherri, Abdallah K.
1998-08-01
High-radix number systems enable higher information storage density, less complexity, fewer system components, and fewer cascaded gates and operations. A simple one-step fully parallel high-radix signed-digit arithmetic is proposed for parallel optical computing based on new joint spatial encodings. This reduces hardware requirements and improves throughput by reducing the space-bandwidth product needed. The high-radix signed-digit arithmetic operations are based on classifying the neighboring input digit pairs into various groups to reduce the computation rules. A new joint spatial encoding technique is developed to represent both the operands and the computation rules. This technique increases the spatial bandwidth product of the spatial light modulators of the system. An optical implementation of the proposed high-radix signed-digit arithmetic operations is also presented. It is shown that our one-step trinary signed-digit and quaternary signed-digit arithmetic units are much simpler and better than all previously reported high-radix signed-digit techniques.
Fujita, Hideo; Uchimura, Yuji; Waki, Kayo; Omae, Koji; Takeuchi, Ichiro; Ohe, Kazuhiko
2013-01-01
To improve emergency services through accurate diagnosis of cardiac emergencies, we developed a low-cost new mobile electrocardiography system, "Cloud Cardiology®", based upon cloud computing for prehospital diagnosis. This comprises a compact 12-lead ECG unit equipped with Bluetooth and an Android smartphone with an application for transmission. A cloud server enables us to share ECGs simultaneously inside and outside the hospital. We evaluated the clinical effectiveness by conducting a clinical trial with historical comparison of this system in a rapid response car in real emergency service settings. We found that this system has the ability to shorten the onset-to-balloon time of patients with acute myocardial infarction, resulting in better clinical outcomes. Here we propose that cloud-computing-based simultaneous data sharing could be a powerful solution for emergency cardiology services, along with its significant clinical outcome.
Design of Remote Monitoring System of Irrigation based on GSM and ZigBee Technology
NASA Astrophysics Data System (ADS)
Xiao xi, Zheng; Fang, Zhao; Shuaifei, Shao
2018-03-01
To solve the problems of the low level of irrigation management and the waste of water resources, a remote monitoring system for farmland irrigation based on GSM communication technology and ZigBee technology was designed. The system is composed of sensors, a GSM communication module, a ZigBee module, a host computer, valves, and so on. The system controls the pump and the electromagnetic valve according to need, and transmits the monitoring information to the host computer or the user's mobile phone through the GSM communication network. Experiments show that the system has low power consumption and a friendly man-machine interface, and is convenient and simple. It can monitor the agricultural environment remotely and control related irrigation equipment at any time and place, and can better meet the needs of remote monitoring of farmland irrigation.
Job-mix modeling and system analysis of an aerospace multiprocessor.
NASA Technical Reports Server (NTRS)
Mallach, E. G.
1972-01-01
An aerospace guidance computer organization, consisting of multiple processors and memory units attached to a central time-multiplexed data bus, is described. A job mix for this type of computer is obtained by analysis of Apollo mission programs. Multiprocessor performance is then analyzed using: 1) queuing theory, under certain 'limiting case' assumptions; 2) Markov process methods; and 3) system simulation. Results of the analyses indicate: 1) Markov process analysis is a useful and efficient predictor of simulation results; 2) efficient job execution is not seriously impaired even when the system is so overloaded that new jobs are inordinately delayed in starting; 3) job scheduling is significant in determining system performance; and 4) a system having many slow processors may or may not perform better than a system of equal power having few fast processors, but will not perform significantly worse.
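As a small illustration of the Markov-process style of analysis, the sketch below computes the steady-state distribution of a made-up three-state processor model (idle, busy, waiting on the shared bus) by power iteration; the states and transition probabilities are invented and are not taken from the Apollo job mix.

```python
import numpy as np

# Row-stochastic transition matrix of a hypothetical processor state model.
P = np.array([[0.6, 0.3, 0.1],      # idle    -> idle / busy / waiting
              [0.2, 0.5, 0.3],      # busy    -> ...
              [0.1, 0.6, 0.3]])     # waiting -> ...

pi = np.full(3, 1.0 / 3.0)          # start from a uniform distribution
for _ in range(200):                # power iteration: pi_{k+1} = pi_k P
    pi = pi @ P
pi /= pi.sum()
print("steady-state [idle, busy, waiting] =", np.round(pi, 3))
```

The fraction of time predicted for the "waiting" state is the kind of quantity that a queuing or simulation study would then compare against, which is how Markov analysis can serve as an efficient predictor of simulation results.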
Development of an automated film-reading system for ballistic ranges
NASA Technical Reports Server (NTRS)
Yates, Leslie A.
1992-01-01
Software for an automated film-reading system that uses personal computers and digitized shadowgraphs is described. The software identifies pixels associated with fiducial-line and model images, and least-squares procedures are used to calculate the positions and orientations of the images. Automated position and orientation readings for sphere and cone models are compared to those obtained using a manual film reader. When facility calibration errors are removed from these readings, the accuracy of the automated readings is better than the pixel resolution, and it is equal to, or better than, the manual readings. The effects of film-reading and facility-calibration errors on calculated aerodynamic coefficients are discussed.
NASA Astrophysics Data System (ADS)
Thakur, Jay Krishna; Singh, Sudhir Kumar; Ekanthalu, Vicky Shettigondahalli
2017-07-01
Integration of remote sensing (RS), geographic information systems (GIS) and the global positioning system (GPS) is an emerging research area in the fields of groundwater hydrology, resource management, environmental monitoring and emergency response. Recent advancements in the fields of RS, GIS, GPS and higher levels of computation will help in providing and handling a range of data simultaneously in a time- and cost-efficient manner. This review paper deals with hydrological modeling, the uses of remote sensing and GIS in hydrological modeling, models of integration and the need for them, and finally the conclusions. After dealing with these issues conceptually and technically, we can develop better methods and novel approaches to handle large data sets and to better communicate information related to rapidly decreasing societal resources, i.e. groundwater.
Approaches for scalable modeling and emulation of cyber systems : LDRD final report.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mayo, Jackson R.; Minnich, Ronald G.; Armstrong, Robert C.
2009-09-01
The goal of this research was to combine theoretical and computational approaches to better understand the potential emergent behaviors of large-scale cyber systems, such as networks of ~10^6 computers. The scale and sophistication of modern computer software, hardware, and deployed networked systems have significantly exceeded the computational research community's ability to understand, model, and predict current and future behaviors. This predictive understanding, however, is critical to the development of new approaches for proactively designing new systems or enhancing existing systems with robustness to current and future cyber threats, including distributed malware such as botnets. We have developed preliminary theoretical and modeling capabilities that can ultimately answer questions such as: How would we reboot the Internet if it were taken down? Can we change network protocols to make them more secure without disrupting existing Internet connectivity and traffic flow? We have begun to address these issues by developing new capabilities for understanding and modeling Internet systems at scale. Specifically, we have addressed the need for scalable network simulation by carrying out emulations of a network with ~10^6 virtualized operating system instances on a high-performance computing cluster - a 'virtual Internet'. We have also explored mappings between previously studied emergent behaviors of complex systems and their potential cyber counterparts. Our results provide foundational capabilities for further research toward understanding the effects of complexity in cyber systems, to allow anticipating and thwarting hackers.
NASA Astrophysics Data System (ADS)
Gintautas, Vadas; Hubler, Alfred
2006-03-01
As worldwide computer resources increase in power and decrease in cost, real-time simulations of physical systems are becoming increasingly prevalent, from laboratory models to stock market projections and entire ``virtual worlds'' in computer games. Often, these systems are meticulously designed to match real-world systems as closely as possible. We study the limiting behavior of a virtual horizontally driven pendulum coupled to its real-world counterpart, where the interaction occurs on a time scale that is much shorter than the time scale of the dynamical system. We find that if the physical parameters of the virtual system match those of the real system within a certain tolerance, there is a qualitative change in the behavior of the two-pendulum system as the strength of the coupling is increased. Applications include a new method to measure the physical parameters of a real system and the use of resonance spectroscopy to refine a computer model. As virtual systems better approximate real ones, even very weak interactions may produce unexpected and dramatic behavior. The research is supported by the National Science Foundation Grant No. NSF PHY 01-40179, NSF DMS 03-25939 ITR, and NSF DGE 03-38215.
OpenID connect as a security service in Cloud-based diagnostic imaging systems
NASA Astrophysics Data System (ADS)
Ma, Weina; Sartipi, Kamran; Sharghi, Hassan; Koff, David; Bak, Peter
2015-03-01
The evolution of cloud computing is driving the next generation of diagnostic imaging (DI) systems. Cloud-based DI systems are able to deliver better services to patients without being constrained to their own physical facilities. However, privacy and security concerns have consistently been regarded as the major obstacle to adoption of cloud computing by healthcare domains. Furthermore, traditional computing models and interfaces employed by DI systems are not ready for accessing diagnostic images through mobile devices. REST is an ideal technology for provisioning both mobile services and cloud computing. OpenID Connect, combining OpenID and OAuth together, is an emerging REST-based federated identity solution. It is one of the most promising open standards to potentially become the de-facto standard for securing cloud computing and mobile applications, and has been regarded as the "Kerberos of the Cloud". We introduce OpenID Connect as an identity and authentication service in cloud-based DI systems and propose enhancements that allow for incorporating this technology within a distributed enterprise environment. The objective of this study is to offer solutions for secure radiology image sharing among DI-r (Diagnostic Imaging Repository) and heterogeneous PACS (Picture Archiving and Communication Systems) as well as mobile clients in the cloud ecosystem. Through using OpenID Connect as an open-source identity and authentication service, deploying DI-r and PACS to private or community clouds should obtain a security level equivalent to the traditional computing model.
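For orientation, the sketch below shows the token-exchange step of a standard OpenID Connect authorization-code flow as a mobile DI client might perform it; the identity-provider URL, client credentials, and authorization code are hypothetical placeholders, and the enhancements proposed in the paper are not reflected here.

```python
import requests

TOKEN_ENDPOINT = "https://idp.example.org/oauth2/token"      # hypothetical IdP
CLIENT_ID, CLIENT_SECRET = "di-mobile-client", "change-me"   # placeholders

def exchange_code_for_tokens(code, redirect_uri):
    """Trade the authorization code received at the redirect URI for tokens."""
    resp = requests.post(
        TOKEN_ENDPOINT,
        data={
            "grant_type": "authorization_code",
            "code": code,
            "redirect_uri": redirect_uri,
        },
        auth=(CLIENT_ID, CLIENT_SECRET),   # HTTP Basic client authentication
        timeout=10,
    )
    resp.raise_for_status()
    tokens = resp.json()
    # tokens["id_token"] is a signed JWT identifying the user; a real client
    # must verify its signature against the provider's published JWKS keys.
    return tokens["access_token"], tokens["id_token"]
```

The access token then accompanies RESTful requests to the DI-r or PACS services, which is what lets the same identity service cover web, mobile, and inter-enterprise sharing scenarios.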
Grid site availability evaluation and monitoring at CMS
Lyons, Gaston; Maciulaitis, Rokas; Bagliesi, Giuseppe; ...
2017-10-01
The Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider (LHC) uses distributed grid computing to store, process, and analyse the vast quantity of scientific data recorded every year. The computing resources are grouped into sites and organized in a tiered structure. Each site provides computing and storage to the CMS computing grid. Over a hundred sites worldwide contribute with resources from hundred to well over ten thousand computing cores and storage from tens of TBytes to tens of PBytes. In such a large computing setup scheduled and unscheduled outages occur continually and are not allowed to significantly impact data handling, processing, and analysis. Unscheduled capacity and performance reductions need to be detected promptly and corrected. CMS developed a sophisticated site evaluation and monitoring system for Run 1 of the LHC based on tools of the Worldwide LHC Computing Grid. For Run 2 of the LHC the site evaluation and monitoring system is being overhauled to enable faster detection/reaction to failures and a more dynamic handling of computing resources. Furthermore, enhancements to better distinguish site from central service issues and to make evaluations more transparent and informative to site support staff are planned.
Grid site availability evaluation and monitoring at CMS
NASA Astrophysics Data System (ADS)
Lyons, Gaston; Maciulaitis, Rokas; Bagliesi, Giuseppe; Lammel, Stephan; Sciabà, Andrea
2017-10-01
The Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider (LHC) uses distributed grid computing to store, process, and analyse the vast quantity of scientific data recorded every year. The computing resources are grouped into sites and organized in a tiered structure. Each site provides computing and storage to the CMS computing grid. Over a hundred sites worldwide contribute resources ranging from a hundred to well over ten thousand computing cores and storage from tens of TBytes to tens of PBytes. In such a large computing setup, scheduled and unscheduled outages occur continually and must not significantly impact data handling, processing, and analysis. Unscheduled capacity and performance reductions need to be detected promptly and corrected. CMS developed a sophisticated site evaluation and monitoring system for Run 1 of the LHC based on tools of the Worldwide LHC Computing Grid. For Run 2 of the LHC, the site evaluation and monitoring system is being overhauled to enable faster detection of and reaction to failures and a more dynamic handling of computing resources. Enhancements to better distinguish site issues from central service issues and to make evaluations more transparent and informative to site support staff are planned.
A computational system for lattice QCD with overlap Dirac quarks
NASA Astrophysics Data System (ADS)
Chiu, Ting-Wai; Hsieh, Tung-Han; Huang, Chao-Hsi; Huang, Tsung-Ren
2003-05-01
We outline the essential features of a Linux PC cluster currently being developed at National Taiwan University and discuss how to optimize its hardware and software for lattice QCD with overlap Dirac quarks. At present, the cluster consists of 30 nodes, each with one Pentium 4 processor (1.6/2.0 GHz), one GByte of PC800 RDRAM, one 40/80 GByte hard disk, and a network card. The speed of this system is estimated to be 30 Gflops, and its price/performance ratio is better than $1.0/Mflops for 64-bit (double-precision) computations in quenched lattice QCD with overlap Dirac quarks.
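As a rough check of the quoted price/performance figure (a sketch using only the numbers stated above; the per-node figure is an assumed round value for illustration), a sustained 30 Gflops at better than $1.0/Mflops implies a total system cost below about $30,000, roughly $1,000 per node for the 30-node cluster.

# Back-of-the-envelope check of the quoted price/performance ratio.
sustained_gflops = 30.0            # estimated sustained speed of the 30-node cluster
ratio_usd_per_mflops = 1.0         # "better than $1.0/Mflops" (upper bound)
nodes = 30

max_total_cost = sustained_gflops * 1000.0 * ratio_usd_per_mflops  # Gflops -> Mflops
print(f"Implied total cost bound: ${max_total_cost:,.0f}")
print(f"Implied per-node cost bound: ${max_total_cost / nodes:,.0f}")
# -> $30,000 total, about $1,000 per node (illustrative round figures only)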
Smart Grid Privacy through Distributed Trust
NASA Astrophysics Data System (ADS)
Lipton, Benjamin
Though the smart electrical grid promises many advantages in efficiency and reliability, the risks to consumer privacy have impeded its deployment. Researchers have proposed protecting privacy by aggregating user data before it reaches the utility, using techniques of homomorphic encryption to prevent exposure of unaggregated values. However, such schemes generally require users to trust in the correct operation of a single aggregation server. We propose two alternative systems based on secret sharing techniques that distribute this trust among multiple service providers, protecting user privacy against a misbehaving server. We also provide an extensive evaluation of the systems considered, comparing their robustness to privacy compromise, error handling, computational performance, and data transmission costs. We conclude that while all the systems should be computationally feasible on smart meters, the two methods based on secret sharing require much less computation while also providing better protection against corrupted aggregators. Building systems using these techniques could help defend the privacy of electricity customers, as well as customers of other utilities as they move to a more data-driven architecture.
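The secret-sharing idea can be illustrated with a minimal additive-sharing sketch (my own simplified example, not the exact protocols evaluated in the work): each meter splits its reading into random shares, one per aggregation server, so that no single server learns the reading, yet the sum of all servers' totals reconstructs the aggregate consumption.

# Minimal additive secret sharing for privacy-preserving aggregation (illustrative only).
import secrets

PRIME = 2**61 - 1  # arithmetic is done modulo a public prime

def share(reading: int, n_servers: int) -> list[int]:
    """Split one meter reading into n additive shares that sum to it mod PRIME."""
    shares = [secrets.randbelow(PRIME) for _ in range(n_servers - 1)]
    shares.append((reading - sum(shares)) % PRIME)
    return shares

def aggregate(all_shares: list[list[int]]) -> int:
    """Each server sums the shares it received; the utility adds the server totals."""
    n_servers = len(all_shares[0])
    server_totals = [sum(meter[j] for meter in all_shares) % PRIME for j in range(n_servers)]
    return sum(server_totals) % PRIME

readings = [12, 7, 30, 5]                  # hypothetical meter readings (kWh)
shares = [share(r, n_servers=3) for r in readings]
assert aggregate(shares) == sum(readings)  # the utility learns only the total, 54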
Performance Evaluation of Communication Software Systems for Distributed Computing
NASA Technical Reports Server (NTRS)
Fatoohi, Rod
1996-01-01
In recent years there has been increasing interest in object-oriented distributed computing, since it is better equipped to deal with complex systems while providing extensibility, maintainability, and reusability. At the same time, several new high-speed network technologies have emerged for local and wide area networks. However, the performance of networking software is not improving as fast as the networking hardware and the workstation microprocessors. This paper gives an overview and evaluates the performance of the Common Object Request Broker Architecture (CORBA) standard in a distributed computing environment at NASA Ames Research Center. The environment consists of two testbeds of SGI workstations connected by four networks: Ethernet, FDDI, HiPPI, and ATM. The performance results for three communication software systems are presented, analyzed, and compared. These systems are: the BSD socket programming interface; IONA's Orbix, an implementation of the CORBA specification; and the PVM message passing library. The results show that high-level communication interfaces, such as CORBA and PVM, can achieve reasonable performance under certain conditions.
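A hedged sketch of the kind of low-level measurement such an evaluation involves (not the paper's actual benchmark code): timing small-message round trips over a plain BSD-style socket pair gives a baseline latency against which higher-level layers such as CORBA or PVM can be compared.

# Round-trip latency microbenchmark over a local socket pair (illustrative baseline only).
import socket
import time

def round_trip_latency(message_size: int = 64, iterations: int = 1000) -> float:
    """Return the mean round-trip time in microseconds for small messages."""
    a, b = socket.socketpair()          # connected pair of local sockets
    payload = b"x" * message_size
    start = time.perf_counter()
    for _ in range(iterations):
        a.sendall(payload)
        b.recv(message_size)            # "echo" side
        b.sendall(payload)
        a.recv(message_size)
    elapsed = time.perf_counter() - start
    a.close(); b.close()
    return elapsed / iterations * 1e6

print(f"mean round trip: {round_trip_latency():.1f} us")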
GPU computing with Kaczmarz’s and other iterative algorithms for linear systems
Elble, Joseph M.; Sahinidis, Nikolaos V.; Vouzis, Panagiotis
2009-01-01
The graphics processing unit (GPU) is used to solve large linear systems derived from partial differential equations. The differential equations studied are strongly convection-dominated, of various sizes, and common to many fields, including computational fluid dynamics, heat transfer, and structural mechanics. The paper presents comparisons between GPU and CPU implementations of several well-known iterative methods, including Kaczmarz's, Cimmino's, component averaging, conjugate gradient normal residual (CGNR), symmetric successive overrelaxation-preconditioned conjugate gradient, and conjugate-gradient-accelerated component-averaged row projections (CARP-CG). Computations are performed with dense as well as general banded systems. The results demonstrate that our GPU implementation outperforms CPU implementations of these algorithms, as well as previously studied parallel implementations on Linux clusters and shared-memory systems. While the CGNR method had begun to fall out of favor for solving such problems, for the problems studied in this paper the CGNR method implemented on the GPU performed better than the other methods, including a cluster implementation of the CARP-CG method. PMID:20526446
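For reference, Kaczmarz's method itself is a simple row-projection iteration; the sketch below is a plain CPU/NumPy version (not the GPU implementation from the paper) that illustrates the algorithm the study accelerates.

# Kaczmarz's row-projection method for Ax = b (CPU/NumPy sketch, not the GPU code).
import numpy as np

def kaczmarz(A: np.ndarray, b: np.ndarray, sweeps: int = 50) -> np.ndarray:
    x = np.zeros(A.shape[1])
    row_norms = np.einsum("ij,ij->i", A, A)      # squared 2-norm of each row
    for _ in range(sweeps):
        for i in range(A.shape[0]):
            if row_norms[i] == 0.0:
                continue
            # Project the current iterate onto the hyperplane a_i . x = b_i
            x += (b[i] - A[i] @ x) / row_norms[i] * A[i]
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((200, 100)) + 5 * np.eye(200, 100)   # well-conditioned test system
x_true = rng.standard_normal(100)
x_est = kaczmarz(A, A @ x_true)
print(np.linalg.norm(x_est - x_true))   # should be small after enough sweeps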
NASA Astrophysics Data System (ADS)
Zaveri, Mazad Shaheriar
The semiconductor/computer industry has been following Moore's law for several decades and has reaped the benefits in speed and density of the resultant scaling. Transistor density has reached almost one billion per chip, and transistor delays are in picoseconds. However, scaling has slowed down, and the semiconductor industry is now facing several challenges. Hybrid CMOS/nano technologies, such as CMOL, are considered an interim solution to some of these challenges. Another potential architectural solution includes specialized architectures for applications/models in the intelligent computing domain, one aspect of which is abstract computational models inspired by the neuro/cognitive sciences. Consequently, in this dissertation we focus on the hardware implementations of Bayesian Memory (BM), which is a (Bayesian) Biologically Inspired Computational Model (BICM). This model is a simplified version of George and Hawkins' model of the visual cortex, which includes an inference framework based on Judea Pearl's belief propagation. We then present a "hardware design space exploration" methodology for implementing and analyzing the (digital and mixed-signal) hardware for the BM. This methodology involves: analyzing the computational/operational cost and the related micro-architecture, exploring candidate hardware components, proposing various custom hardware architectures using both traditional CMOS and the hybrid nanotechnology CMOL, and investigating the baseline performance/price of these architectures. The results suggest that CMOL is a promising candidate for implementing a BM. Such implementations can utilize the very high density storage/computation benefits of these new nano-scale technologies much more efficiently; for example, the throughput per 858 mm2 (TPM) obtained for CMOL-based architectures is 32 to 40 times better than the TPM for a CMOS-based multiprocessor/multi-FPGA system, and almost 2000 times better than the TPM for a PC implementation. We later use this methodology to investigate the hardware implementations of a cortex-scale spiking neural system, which is an approximate neural equivalent of a BICM-based cortex-scale system. The results of this investigation also suggest that CMOL is a promising candidate for implementing such large-scale neuromorphic systems. In general, the assessment of such hypothetical baseline hardware architectures provides the prospects for building large-scale (mammalian cortex-scale) implementations of neuromorphic/Bayesian/intelligent systems using state-of-the-art and beyond state-of-the-art silicon structures.
Enhanced Electric Power Transmission by Hybrid Compensation Technique
NASA Astrophysics Data System (ADS)
Palanichamy, C.; Kiu, G. Q.
2015-04-01
In today's competitive environment, new power system engineers are expected to contribute immediately, without years of seasoning via on-the-job training, mentoring, and rotation assignments. At the same time, it is becoming obligatory to prepare power system engineering graduates for an increasingly quality-minded corporate environment. To achieve this, better-quality tools are needed for educating and training power system engineering students as well as in-service engineers. As a result of rapid advances in computer hardware and software, many Windows-based software packages have been developed for education and training. In line with those packages, a simulation package called Hybrid Series-Shunt Compensators (HSSC) has been developed and is presented in this paper for educational purposes.
Container-code recognition system based on computer vision and deep neural networks
NASA Astrophysics Data System (ADS)
Liu, Yi; Li, Tianjian; Jiang, Li; Liang, Xiaoyao
2018-04-01
Automatic container-code recognition has become a crucial requirement for the ship transportation industry in recent years. In this paper, an automatic container-code recognition system based on computer vision and deep neural networks is proposed. The system consists of two modules: a detection module and a recognition module. The detection module applies both computer-vision-based and neural-network-based algorithms and combines their results to avoid the drawbacks of either method alone. The combined detection results are also collected for online training of the neural networks. The recognition module exploits both character segmentation and end-to-end recognition, and outputs the recognition result that passes verification. When the recognition module generates a false recognition, the result is corrected and collected for online training of the end-to-end recognition sub-module. By combining several algorithms, the system is able to deal with more situations, and the online training mechanism improves the performance of the neural networks at runtime. The proposed system achieves 93% overall recognition accuracy.
A Low Complexity System Based on Multiple Weighted Decision Trees for Indoor Localization
Sánchez-Rodríguez, David; Hernández-Morera, Pablo; Quinteiro, José Ma.; Alonso-González, Itziar
2015-01-01
Indoor position estimation has become an attractive research topic due to growing interest in location-aware services. Nevertheless, satisfactory solutions that address both accuracy and system complexity have not been found. From the perspective of lightweight mobile devices, these are extremely important characteristics, because both processor power and energy availability are limited. Hence, an indoor localization system with high computational complexity can cause complete battery drain within a few hours. In our research, we use a data mining technique named boosting to develop a localization system based on multiple weighted decision trees to predict the device location, since it offers high accuracy at low computational complexity. The localization system is built using a dataset obtained by sensor fusion, which combines the received signal strengths from different wireless local area network access points with device orientation information from the digital compass built into the mobile device, so that extra sensors are unnecessary. Experimental results indicate that the proposed system substantially reduces computational complexity compared to the widely used traditional fingerprinting methods, while achieving better accuracy. PMID:26110413
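A minimal sketch of the boosting idea applied to Wi-Fi fingerprinting (my own illustration with synthetic data, not the authors' dataset or code): each sample is a vector of received signal strengths plus a compass heading, and an ensemble of weighted shallow decision trees predicts the location label.

# Boosted decision trees for indoor localization fingerprints (illustrative sketch).
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n_samples, n_aps = 600, 8          # hypothetical: 8 access points, 4 rooms
rooms = rng.integers(0, 4, n_samples)
# Synthetic RSSI values (dBm) that depend on the room, plus a compass heading feature.
rssi = -40 - 10 * rng.random((n_samples, n_aps)) - 5 * rooms[:, None]
heading = rng.uniform(0, 360, (n_samples, 1))
X = np.hstack([rssi, heading])

X_tr, X_te, y_tr, y_te = train_test_split(X, rooms, test_size=0.25, random_state=0)
model = AdaBoostClassifier(n_estimators=50)   # ensemble of weighted shallow trees (stumps)
model.fit(X_tr, y_tr)
print("room-level accuracy:", model.score(X_te, y_te))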
DOE Office of Scientific and Technical Information (OSTI.GOV)
Henline, P.A.
1995-12-31
The increased use of UNIX based computer systems for machine control, data handling and analysis has greatly enhanced the operating scenarios and operating efficiency of the DIII-D tokamak. This paper will describe some of these UNIX systems and their specific uses. These include the plasma control system, the electron cyclotron heating control system, the analysis of electron temperature and density measurements and the general data acquisition system (which is collecting over 130 Mbytes of data). The speed and total capability of these systems has dramatically affected the ability to operate DIII-D. The improved operating scenarios include better plasma shape control due to the more thorough MHD calculations done between shots and the new ability to see the time dependence of profile data as it relates across different spatial locations in the tokamak. Other analysis which engenders improved operating abilities will be described.
Cost Considerations in Cloud Computing
2014-01-01
investments. 2. Database Options. The potential promise that "big data" analytics holds for many enterprise mission areas makes relevant the question of the...development of a range of new distributed file systems and databases that have better scalability properties than traditional SQL databases. Hadoop ... data. Many systems exist that extend or supplement Hadoop, such as Apache Accumulo, which provides a highly granular mechanism for managing security
Graphical Displays Assist In Analysis Of Failures
NASA Technical Reports Server (NTRS)
Pack, Ginger; Wadsworth, David; Razavipour, Reza
1995-01-01
The Failure Environment Analysis Tool (FEAT) computer program enables people to see and better understand the effects of failures in a system. It uses digraph models to determine what will happen to the system if a set of failure events occurs and to identify the possible causes of a selected set of failures. Either digraphs or engineering schematics can be used. FEAT is also used in operations to help identify the causes of failures after they occur. Written in the C language.
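The digraph idea can be sketched in a few lines (a simplified stand-in for FEAT, not its actual implementation or model format): represent "failure X can cause failure Y" as directed edges, then compute the downstream effects of a set of initiating events by graph traversal.

# Toy digraph failure-propagation sketch (not the FEAT program itself).
from collections import deque

# Hypothetical model: edge "a -> b" means a failure of 'a' can cause a failure of 'b'.
causes = {
    "pump_failure":    ["loss_of_coolant"],
    "loss_of_coolant": ["overheat"],
    "sensor_failure":  ["overheat_undetected"],
    "overheat":        ["component_damage"],
}

def downstream_effects(initiating_events: set[str]) -> set[str]:
    """Return every failure reachable from the given initiating events."""
    seen, queue = set(initiating_events), deque(initiating_events)
    while queue:
        for effect in causes.get(queue.popleft(), []):
            if effect not in seen:
                seen.add(effect)
                queue.append(effect)
    return seen - initiating_events

print(downstream_effects({"pump_failure"}))
# {'loss_of_coolant', 'overheat', 'component_damage'}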
Role of Soft Computing Approaches in HealthCare Domain: A Mini Review.
Gambhir, Shalini; Malik, Sanjay Kumar; Kumar, Yugal
2016-12-01
In the present era, soft computing approaches play a vital role in solving different kinds of problems and provide promising solutions. Due to their popularity, soft computing approaches have also been applied to healthcare data for effectively diagnosing diseases and obtaining better results than traditional approaches. Soft computing approaches have the ability to adapt themselves to the problem domain. Another aspect is a good balance between the exploration and exploitation processes. These aspects make soft computing approaches powerful, reliable, and efficient, and therefore well suited to health care data. The first objective of this review paper is to identify the various soft computing approaches used for diagnosing and predicting diseases. The second objective is to identify the various diseases to which these approaches are applied. The third objective is to categorize the soft computing approaches for clinical support systems. In the literature, it is found that a large number of soft computing approaches have been applied for effectively diagnosing and predicting diseases from healthcare data. Some of these are particle swarm optimization, genetic algorithms, artificial neural networks, support vector machines, etc. A detailed discussion of these approaches is presented in the literature section. This work summarizes the various soft computing approaches used in the healthcare domain in the last decade. These approaches are categorized into five different categories based on methodology: classification model based systems, expert systems, fuzzy and neuro-fuzzy systems, rule based systems, and case based systems. Many techniques are discussed in the above-mentioned categories, and all discussed techniques are also summarized in tables. This work also focuses on the accuracy rate of each soft computing technique, and tabular information is provided for each category including author details, technique, disease, and utility/accuracy.
NASA Astrophysics Data System (ADS)
Popov, Igor; Sukov, Sergey
2018-02-01
A modification of the adaptive artificial viscosity (AAV) method is considered. This modification is based on a one-stage time approximation and is adapted to the calculation of gas-dynamics problems on unstructured grids with an arbitrary type of grid elements. The proposed numerical method has simplified logic and better performance and parallel efficiency compared to the implementation of the original AAV method. Computer experiments demonstrate the robustness of the method and its convergence to the difference solution.
Ocean Models and Proper Orthogonal Decomposition
NASA Astrophysics Data System (ADS)
Salas-de-Leon, D. A.
2007-05-01
Increasing computational power and a better understanding of mathematical and physical systems have resulted in a growing number of ocean models. Not long ago, modelers were like a secret organization, recognizing each other by codes and languages that only a select group of people could understand. Access to computational systems was limited: on the one hand, equipment and computer time were expensive and restricted; on the other hand, the systems required advanced programming languages that not everybody wanted to learn. Nowadays most college freshmen own a personal computer (PC or laptop) and/or have access to more sophisticated computational systems than those available for research in the early 1980s. This availability of resources has resulted in much wider access to all kinds of models. Today computer speed, computing time, and algorithms no longer seem to be the limiting factor, even though some models take days to run on small computational systems. Almost every oceanographic institution has its own model; what is more, within the same institution different offices may use different models for the same phenomena, developed by different researchers. The results do not differ substantially, since the equations are the same and the solution algorithms are similar. The algorithms, and the grids constructed with them, can be found in textbooks and/or on the internet. Every year more sophisticated models are constructed. Proper Orthogonal Decomposition is a technique that reduces the number of variables to be solved while keeping the model properties, so it can be a very useful tool for diminishing the computations that have to be carried out on "small" computational systems, making sophisticated models available to a wider community.
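For readers unfamiliar with the technique, a minimal snapshot-POD sketch (generic, not tied to any particular ocean model) computes the dominant spatial modes of a field from a matrix of simulation snapshots via the singular value decomposition and truncates to the leading modes:

# Snapshot Proper Orthogonal Decomposition via the SVD (generic sketch).
import numpy as np

def pod_modes(snapshots: np.ndarray, n_modes: int):
    """snapshots: (n_points, n_times) matrix of field values. Returns the leading
    spatial modes and the fraction of variance (energy) they capture."""
    mean = snapshots.mean(axis=1, keepdims=True)
    U, s, _ = np.linalg.svd(snapshots - mean, full_matrices=False)
    energy = (s[:n_modes] ** 2).sum() / (s ** 2).sum()
    return U[:, :n_modes], energy

# Hypothetical example: 1000 grid points, 200 time snapshots of some scalar field.
rng = np.random.default_rng(1)
X = rng.standard_normal((1000, 5)) @ rng.standard_normal((5, 200))  # rank-5 dynamics
modes, energy = pod_modes(X, n_modes=5)
print(f"5 modes capture {energy:.1%} of the variance")   # ~100% for this synthetic field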
A telemedicine system for enabling teaching activities.
Masero, V; Sanchez, F M; Uson, J
2000-01-01
In order to improve the distance teaching of minimally invasive surgery techniques, an integrated system has been developed. It comprises a telecommunications system, a server, a workstation, some medical peripherals and several computer applications developed in the Minimally Invasive Surgery Centre. The latest peripherals, such as robotized teleoperating systems for telesurgery and virtual reality peripherals, have been added. The visualization of the zone to be treated, along with the teacher's explanations, enables the student to understand the procedures of the operation much better.
Towards systemic theories in biological psychiatry.
Bender, W; Albus, M; Möller, H-J; Tretter, F
2006-02-01
Although still rather controversial, empirical data on the neurobiology of schizophrenia have reached a degree of complexity that makes it hard to obtain a coherent picture of the malfunctions of the brain in schizophrenia. Theoretical neuropsychiatry should therefore use the tools of theoretical sciences like cybernetics, informatics, computational neuroscience or systems science. The methodology of systems science permits the modeling of complex dynamic nonlinear systems. Such procedures might help us to understand brain functions and the disorders and actions of psychiatric drugs better.
NASA Astrophysics Data System (ADS)
Bravos, Angelo; Hill, Howard; Choca, James; Bresolin, Linda B.; Bresolin, Michael J.
1986-03-01
Computer technology is rapidly becoming an inseparable part of many health science specialties. Recently, a new area of computer technology, namely Artificial Intelligence, has been applied toward assisting the medical experts in their diagnostic and therapeutic decision making process. MOODIS is an experimental diagnostic expert system which assists Psychiatry specialists in diagnosing human Mood Disorders, better known as Affective Disorders. Its diagnostic methodology is patterned after MDX, a diagnostic expert system developed at LAIR (Laboratory for Artificial Intelligence Research) of Ohio State University. MOODIS is implemented in CSRL (Conceptual Structures Representation Language) also developed at LAIR. This paper describes MOODIS in terms of conceptualization and requirements, and discusses why the MDX approach and CSRL were chosen.
Computer aided fixture design - A case based approach
NASA Astrophysics Data System (ADS)
Tanji, Shekhar; Raiker, Saiesh; Mathew, Arun Tom
2017-11-01
Automated fixture design plays an important role in process planning and in the integration of CAD and CAM. An automated fixture setup design system is developed in which, once the fixturing surfaces and points are described, modular fixture components are automatically selected to generate fixture units and placed into positions that satisfy the assembly conditions. In the past, various knowledge-based systems have been developed to implement CAFD in practice. In this paper, to obtain an acceptable automated machining fixture design, a case-based reasoning method with a purpose-built retrieval system is proposed. The Visual Basic (VB) programming language is integrated with the SolidWorks API (Application Programming Interface) module to improve the retrieval procedure and reduce computational time. These properties are incorporated in a numerical simulation to determine the best fit for practical use.
Resonant transition-based quantum computation
NASA Astrophysics Data System (ADS)
Chiang, Chen-Fu; Hsieh, Chang-Yu
2017-05-01
In this article we assess a novel quantum computation paradigm based on the resonant transition (RT) phenomenon commonly associated with atomic and molecular systems. We thoroughly analyze the intimate connections between the RT-based quantum computation and the well-established adiabatic quantum computation (AQC). Both quantum computing frameworks encode solutions to computational problems in the spectral properties of a Hamiltonian and rely on the quantum dynamics to obtain the desired output state. We discuss how one can adapt any adiabatic quantum algorithm to a corresponding RT version and the two approaches are limited by different aspects of Hamiltonians' spectra. The RT approach provides a compelling alternative to the AQC under various circumstances. To better illustrate the usefulness of the novel framework, we analyze the time complexity of an algorithm for 3-SAT problems and discuss straightforward methods to fine tune its efficiency.
ERIC Educational Resources Information Center
Bagley, James R.; Galpin, Andrew J.
2015-01-01
Interdisciplinary exploration is vital to education in the 21st century. This manuscript outlines an innovative laboratory-based teaching method that combines elements of biochemistry/molecular biology, kinesiology/health science, computer science, and manufacturing engineering to give students the ability to better conceptualize complex…
Perception and Attention for Visualization
ERIC Educational Resources Information Center
Haroz, Steve
2013-01-01
This work examines how a better understanding of visual perception and attention can impact visualization design. In a collection of studies, I explore how different levels of the visual system can measurably affect a variety of visualization metrics. The results show that expert preference, user performance, and even computational performance are…
ERIC Educational Resources Information Center
López, Víctor; Pintó, Roser
2017-01-01
Computer simulations are often considered effective educational tools, since their visual and communicative power enable students to better understand physical systems and phenomena. However, previous studies have found that when students read visual representations some reading difficulties can arise, especially when these are complex or dynamic…
How Can Intelligent CAL Better Adapt to Learners?
ERIC Educational Resources Information Center
Boyd, Gary McI.; Mitchell, P. David
1992-01-01
Discusses intelligent computer-aided learning (ICAL) support systems and considers learner characteristics as elements of ICAL student models. Cybernetic theory and attribute-treatment results are discussed, six components of a student model for tutoring are described, and methods for determining the student's model of the tutor are examined. (22…
Dynamic Programming: An Introduction by Example
ERIC Educational Resources Information Center
Zietz, Joachim
2007-01-01
The author introduces some basic dynamic programming techniques, using examples, with the help of the computer algebra system "Maple". The emphasis is on building confidence and intuition for the solution of dynamic problems in economics. To integrate the material better, the same examples are used to introduce different techniques. One covers the…
The ICCB Computer Based Facilities Inventory & Utilization Management Information Subsystem.
ERIC Educational Resources Information Center
Lach, Ivan J.
The Illinois Community College Board (ICCB) Facilities Inventory and Utilization subsystem, a part of the ICCB management information system, was designed to provide decision makers with needed information to better manage the facility resources of Illinois community colleges. This subsystem, dependent upon facilities inventory data and course…
NASA Astrophysics Data System (ADS)
Zhang, Shunli; Zhang, Dinghua; Gong, Hao; Ghasemalizadeh, Omid; Wang, Ge; Cao, Guohua
2014-11-01
Iterative algorithms, such as the algebraic reconstruction technique (ART), are popular for image reconstruction. For iterative reconstruction, the area integral model (AIM) is more accurate for better reconstruction quality than the line integral model (LIM). However, the computation of the system matrix for AIM is more complex and time-consuming than that for LIM. Here, we propose a fast and accurate method to compute the system matrix for AIM. First, we calculate the intersection of each boundary line of a narrow fan-beam with pixels in a recursive and efficient manner. Then, by grouping the beam-pixel intersection area into six types according to the slopes of the two boundary lines, we analytically compute the intersection area of the narrow fan-beam with the pixels in a simple algebraic fashion. Overall, experimental results show that our method is about three times faster than the Siddon algorithm and about two times faster than the distance-driven model (DDM) in computation of the system matrix. The reconstruction speed of our AIM-based ART is also faster than the LIM-based ART that uses the Siddon algorithm and DDM-based ART, for one iteration. The fast reconstruction speed of our method was accomplished without compromising the image quality.
Mittal, Neelam; Jain, Jyoti
2014-01-01
The purpose of this study was to evaluate the efficacy of nickel-titanium rotary retreatment systems versus a stainless steel hand retreatment system, with or without solvent, for gutta-percha removal during retreatment. Sixty extracted human mandibular molar teeth with a single canal in the distal root were prepared with ProTaper rotary nickel-titanium files and obturated with gutta-percha and sealer. The teeth were randomly divided into six groups of 10 specimens each. The volume of filling material before and after retreatment was evaluated in cm(3) using the computed tomography (CT) scanner's proprietary software. The greatest amount of filling material was removed during retreatment with the ProTaper retreatment system with solvent, and the least with the hand retreatment system with solvent. None of the techniques was 100% effective in removing the filling materials, but the ProTaper retreatment system with solvent was better.
1991-06-01
intensive systems, including the use of onboard digital computers. Topics include: measurements that are digital in origin, sampling, encoding, transmitting...Individuals charged with designing aircraft measuring systems to become better acquainted with new solutions to their requirements. This volume is...concerned with aircraft measuring systems as related to flight test and flight research. Measurements that are digital in origin or that must be
Air System Information Management
NASA Technical Reports Server (NTRS)
Filman, Robert E.
2004-01-01
I flew to Washington last week, a trip rich in distributed information management. Buying tickets, at the gate, in flight, landing and at the baggage claim, myriad messages about my reservation, the weather, our flight plans, gates, bags and so forth flew among a variety of travel agency, airline and Federal Aviation Administration (FAA) computers and personnel. By and large, each kind of information ran on a particular application, often specialized to its own data formats and communications network. I went to Washington to attend an FAA meeting on System-Wide Information Management (SWIM) for the National Airspace System (NAS) (http://www.nasarchitecture.faa.gov/Tutorials/NAS101.cfm). NAS (and its information infrastructure, SWIM) is an attempt to bring greater regularity, efficiency and uniformity to the collection of stovepipe applications now used to manage air traffic. Current systems hold information about flight plans, flight trajectories, weather, air turbulence, current and forecast weather, radar summaries, hazardous condition warnings, airport and airspace capacity constraints, temporary flight restrictions, and so forth. Information moving among these stovepipe systems is usually mediated by people (for example, air traffic controllers) or single-purpose applications. People, whose intelligence is critical for difficult tasks and unusual circumstances, are not as efficient as computers for tasks that can be automated. Better information sharing can lead to higher system capacity, more efficient utilization and safer operations. Better information sharing through greater automation is possible though not necessarily easy.
Health Information System in a Cloud Computing Context.
Sadoughi, Farahnaz; Erfannia, Leila
2017-01-01
Healthcare as a worldwide industry is experiencing a period of growth based on health information technology. The capabilities of cloud systems make them an option for advancing eHealth goals. The main objective of the present study was to evaluate the advantages and limitations of implementing health information systems in a cloud-computing context; it was conducted as a systematic review in 2016. Science Direct, Scopus, Web of Science, IEEE, PubMed, and Google Scholar were searched according to the study criteria. Among the 308 articles initially found, 21 articles were entered into the final analysis. All the studies considered cloud computing a positive tool to help advance health technology, but none placed much emphasis on its limitations and threats. Electronic health record systems have been studied mostly in the fields of implementation, design, and the presentation of models and prototypes. According to this research, the main advantages of cloud-based health information systems can be categorized into the following groups: economic benefits and advantages of information management. The main limitations of the implementation of cloud-based health information systems can be categorized into four groups of restrictions: security, legal, technical, and human. Compared to earlier studies, the present research has the advantage of dealing with the issue of health information systems on a cloud platform. The high frequency of studies conducted on the implementation of cloud-based health information systems reveals the health industry's interest in the application of this technology. Security was a subject discussed in most studies due to the sensitivity of health information. In this investigation, some mechanisms and solutions concerning the mentioned systems were discussed, which provide a suitable area for future scientific research on this issue. The limitations and solutions discussed in this systematic study should help healthcare managers and decision-makers take better and more efficient advantage of this technology and plan better for adopting cloud-based health information systems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yip, Ho Yin, E-mail: hoyinyip@yahoo.com.hk; Mui, Wing Lun A.; Lee, Joseph W.Y.
2013-07-01
Performances of radiosurgery of intracranial lesions between a cone-based Linac system and a Tomotherapy-based system were compared in terms of dosimetry and time. Twelve patients with a single intracranial lesion treated with the cone-based Linac radiosurgery system from 2005 to 2009 were replanned for Tomotherapy-based radiosurgery treatment. The conformity index, homogeneity index (HI), and gradient score index (GSI) of each case were calculated. The Wilcoxon matched-pair test was used to compare the 3 indices between both systems. The cases with regular targets (n = 6) and those with irregular targets (n = 6) were further analyzed separately. The estimated treatment time between both systems was also compared. Significant differences were found in HI (p = 0.05) and in GSI (p = 0.03) for the whole group. Cone-based radiosurgery was better in GSI whereas Tomotherapy-based radiosurgery was better in HI. Cone-based radiosurgery was better in conformity index (p = 0.03) and GSI (p = 0.03) for regular targets, whereas the Tomotherapy-based radiosurgery system performed significantly better in HI (p = 0.03) for irregular targets. The estimated total treatment time for Tomotherapy-based radiosurgery ranged from 24 minutes to 35 minutes, including 15 minutes of pretreatment megavoltage computed tomography (MVCT) and image registration, whereas that for cone-based radiosurgery ranged from 15 minutes for 1 isocenter to 75 minutes for 5 isocenters. As a rule of thumb, the Tomotherapy-based radiosurgery system should be the first-line treatment for irregular lesions because of better dose homogeneity and shorter treatment time. The cone-based Linac radiosurgery system should be the treatment of choice for regular targets because of the better dose conformity, rapid dose fall-off, and reasonable treatment time.
[Basic concept in computer assisted surgery].
Merloz, Philippe; Wu, Hao
2006-03-01
To investigate the application of medical digital imaging systems and computer technologies in orthopedics. The main computer-assisted surgery systems comprise the following four subcategories. (1) A collection and recording process for digital data on each patient, including preoperative images (CT scans, MRI, standard X-rays), intraoperative visualization (fluoroscopy, ultrasound), and the intraoperative position and orientation of surgical instruments or bone sections (using 3D localisers). Data merging is based on the matching of preoperative imaging (CT scans, MRI, standard X-rays) and intraoperative visualization (anatomical landmarks or bone surfaces digitized intraoperatively via a 3D localiser; intraoperative ultrasound images processed for delineation of bone contours). (2) In cases where only intraoperative images are used for computer-assisted surgical navigation, the calibration of the intraoperative imaging system replaces the merged data system, which is then no longer necessary. (3) A system that provides aid in decision-making, so that the surgical approach is planned on the basis of multimodal information: the interactive positioning of surgical instruments or bone sections transmitted via pre- or intraoperative images, and the display of elements to guide surgical navigation (direction, axis, orientation, length and diameter of a surgical instrument, impingement, etc.). (4) A system that monitors the surgical procedure, thereby ensuring that the optimal strategy defined at the preoperative stage is taken into account. It is possible that computer-assisted orthopedic surgery systems will enable surgeons to better assess the accuracy and reliability of the various operative techniques, an indispensable stage in the optimization of surgery.
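One core step in the data-merging subsystem described above is point-based rigid registration of intraoperatively digitized landmarks to preoperative image coordinates; the sketch below shows a standard SVD-based (Kabsch-style) rigid fit, offered as a generic illustration rather than the method of any specific navigation system.

# SVD-based rigid registration of paired 3-D landmark sets (generic illustration).
import numpy as np

def rigid_fit(src: np.ndarray, dst: np.ndarray):
    """Find rotation R and translation t minimizing ||R @ src_i + t - dst_i||."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)          # cross-covariance of centered points
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                     # guard against an improper reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = c_dst - R @ c_src
    return R, t

# Hypothetical landmarks digitized with a 3-D localiser vs. their CT coordinates.
rng = np.random.default_rng(7)
ct_points = rng.random((6, 3)) * 100
theta = np.deg2rad(20)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0, 0, 1]])
probe_points = (ct_points - 50) @ R_true.T + np.array([10.0, -5.0, 2.0])
R, t = rigid_fit(probe_points, ct_points)
print(np.allclose(R @ probe_points.T + t[:, None], ct_points.T))   # True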
System Administrator for LCS Development Sets
NASA Technical Reports Server (NTRS)
Garcia, Aaron
2013-01-01
The Spaceport Command and Control System Project is creating a Checkout and Control System that will eventually launch the next generation of vehicles from Kennedy Space Center. KSC has a large set of development and operational equipment already deployed in several facilities, including the Launch Control Center, which requires support. The System Administrator position involves completing tasks across multiple platforms (Linux/Windows), many of them virtual. The Hardware Branch of the Control and Data Systems Division at the Kennedy Space Center uses system administrators for a variety of tasks. The position of system administrator comes with many responsibilities: maintaining computer systems, repairing or setting up hardware, installing software, creating backups, and recovering drive images are a sample of the jobs one must complete. Other duties may include working with clients in person or over the phone and resolving their computer system needs. Training is a major part of learning how an organization functions and operates, and NASA is no exception. Training on how to better protect the NASA computer infrastructure will be one topic, followed by NASA work policies. Attending meetings and discussing progress will be expected. A system administrator will have an account with root access. Root access gives a user full access to a computer system and/or network. System admins can remove critical system files and recover files using a tape backup. Problem solving will be an important skill to develop in order to complete the many tasks.
Prediction Based Proactive Thermal Virtual Machine Scheduling in Green Clouds
Kinger, Supriya; Kumar, Rajesh; Sharma, Anju
2014-01-01
Cloud computing has rapidly emerged as a widely accepted computing paradigm, but research on Cloud computing is still at an early stage. Cloud computing provides many advanced features, but it still has some shortcomings, such as relatively high operating costs and environmental hazards like increasing carbon footprints. These hazards can be reduced to some extent by efficient scheduling of Cloud resources. The working temperature of a machine can be taken as a criterion for Virtual Machine (VM) scheduling. This paper proposes a new proactive technique that considers the current and maximum threshold temperatures of Server Machines (SMs) before making scheduling decisions, with the help of a temperature predictor, so that the maximum temperature is never reached. Different workload scenarios have been taken into consideration. The results obtained show that the proposed system is better than existing VM scheduling systems, which do not consider the current temperature of nodes before making scheduling decisions. Thus, a reduction in the need for cooling systems in a Cloud environment has been obtained and validated. PMID:24737962
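The scheduling rule can be sketched very simply (an illustrative reconstruction of the general idea, not the authors' exact predictor or policy): predict each server's temperature after accepting the VM, and place the VM only on a server whose predicted temperature stays below its maximum threshold.

# Proactive, temperature-aware VM placement (illustrative sketch of the general idea).
from dataclasses import dataclass

@dataclass
class ServerMachine:
    name: str
    current_temp: float     # degrees C
    max_temp: float         # threshold that must never be reached
    temp_per_vm: float      # assumed predicted temperature rise per extra VM

def predicted_temp(sm: ServerMachine) -> float:
    """Very simple stand-in for the temperature predictor."""
    return sm.current_temp + sm.temp_per_vm

def place_vm(servers: list[ServerMachine]) -> str | None:
    """Pick the coolest server whose predicted temperature stays below its threshold."""
    safe = [sm for sm in servers if predicted_temp(sm) < sm.max_temp]
    if not safe:
        return None                                   # defer scheduling, or power on a new SM
    best = min(safe, key=predicted_temp)
    best.current_temp = predicted_temp(best)          # account for the newly placed VM
    return best.name

farm = [ServerMachine("sm1", 62.0, 70.0, 1.5),
        ServerMachine("sm2", 55.0, 70.0, 2.0),
        ServerMachine("sm3", 69.5, 70.0, 1.0)]
print(place_vm(farm))   # 'sm2', the coolest predicted temperature below threshold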
Exploiting analytics techniques in CMS computing monitoring
NASA Astrophysics Data System (ADS)
Bonacorsi, D.; Kuznetsov, V.; Magini, N.; Repečka, A.; Vaandering, E.
2017-10-01
The CMS experiment has collected an enormous volume of metadata about its computing operations in its monitoring systems, describing its experience in operating all of the CMS workflows on all of the Worldwide LHC Computing Grid Tiers. Data mining efforts on all this information have rarely been undertaken, but they are of crucial importance for a better understanding of how CMS achieved successful operations and for reaching an adequate and adaptive model of CMS operations, in order to allow detailed optimizations and eventually a prediction of system behaviour. These data are now streamed into the CERN Hadoop data cluster for further analysis. Specific sets of information (e.g. data on how many replicas of datasets CMS wrote to disk at WLCG Tiers, data on which datasets were primarily requested for analysis, etc.) were collected on Hadoop and processed with MapReduce applications profiting from the parallelization on the Hadoop cluster. We present the implementation of new monitoring applications on Hadoop and discuss the new possibilities in CMS computing monitoring introduced by the ability to quickly process big data sets from multiple sources, looking forward to a predictive modelling of the system.
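As a toy illustration of the kind of MapReduce-style aggregation described (a generic sketch, not CMS's actual Hadoop jobs or data schema), one can map each monitoring record to a dataset key and reduce by summing accesses per dataset:

# Toy MapReduce-style aggregation of monitoring records (not the CMS Hadoop jobs).
from collections import defaultdict
from itertools import chain

# Hypothetical monitoring records: which dataset was accessed at which Tier site.
records = [
    {"dataset": "/ZMM/Run2015", "site": "T2_CH_CERN", "accesses": 12},
    {"dataset": "/ZMM/Run2015", "site": "T2_US_MIT",  "accesses": 3},
    {"dataset": "/TTbar/MC",    "site": "T2_CH_CERN", "accesses": 40},
]

def mapper(record):
    yield (record["dataset"], record["accesses"])     # key -> partial count

def reducer(pairs):
    totals = defaultdict(int)
    for dataset, count in pairs:
        totals[dataset] += count
    return dict(totals)

print(reducer(chain.from_iterable(mapper(r) for r in records)))
# {'/ZMM/Run2015': 15, '/TTbar/MC': 40}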
Exploiting GPUs in Virtual Machine for BioCloud
Jo, Heeseung; Jeong, Jinkyu; Lee, Myoungho; Choi, Dong Hoon
2013-01-01
Recently, biological applications have started to be reimplemented to exploit the many cores of GPUs for better computational performance. Therefore, by providing virtualized GPUs to VMs in a cloud computing environment, many biological applications will willingly move into the cloud to enhance their computational performance and utilize the vast cloud computing resources while reducing the expense of computation. In this paper, we propose a BioCloud system architecture that enables VMs to use GPUs in a cloud environment. Much of the previous research has focused on mechanisms for sharing GPUs among VMs, and these cannot achieve enough performance for biological applications, for which computation throughput is more crucial than sharing. The proposed system exploits the pass-through mode of the PCI Express (PCI-E) channel. By making each VM able to access the underlying GPUs directly, applications can show almost the same performance as in a native environment. In addition, our scheme multiplexes GPUs by using the hot plug-in/out device features of the PCI-E channel. By adding or removing GPUs in each VM in an on-demand manner, VMs in the same physical host can time-share the GPUs. We implemented the proposed system using the Xen VMM and NVIDIA GPUs and showed that our prototype is highly effective for biological GPU applications in a cloud environment. PMID:23710465
NASA Astrophysics Data System (ADS)
Kumar, Manoj; Srivastava, Akanksha
2013-01-01
This paper presents a survey of innovative approaches among the most effective computational techniques for solving singularly perturbed partial differential equations, which are useful because of their numerical and computer realizations. Many applied problems appearing in semiconductor theory, biochemistry, kinetics, the theory of electrical circuits, economics, solid mechanics, fluid dynamics, quantum mechanics, and many other areas can be modelled as singularly perturbed systems. Here, we summarize a wide range of research articles published by numerous researchers during the last ten years to give a better view of the present scenario in this area of research.
Galaxy CloudMan: delivering cloud compute clusters.
Afgan, Enis; Baker, Dannon; Coraor, Nate; Chapman, Brad; Nekrutenko, Anton; Taylor, James
2010-12-21
Widespread adoption of high-throughput sequencing has greatly increased the scale and sophistication of the computational infrastructure needed to perform genomic research. An alternative to building and maintaining local infrastructure is "cloud computing", which, in principle, offers on-demand access to flexible computational infrastructure. However, cloud computing resources are not yet suitable for immediate "as is" use by experimental biologists. We present a cloud resource management system that makes it possible for individual researchers to compose and control an arbitrarily sized compute cluster on Amazon's EC2 cloud infrastructure without any informatics requirements. Within this system, an entire suite of biological tools packaged by the NERC Bio-Linux team (http://nebc.nerc.ac.uk/tools/bio-linux) is available for immediate consumption. The provided solution makes it possible, using only a web browser, to create a completely configured compute cluster ready to perform analysis in less than five minutes. Moreover, we provide an automated method for building custom deployments of cloud resources. This approach promotes reproducibility of results and, if desired, allows individuals and labs to add to or customize an otherwise available cloud system to better meet their needs. The knowledge and effort required to deploy a compute cluster in the Amazon EC2 cloud are not trivial. The solution presented in this paper eliminates these barriers, making it possible for researchers to deploy exactly the amount of computing power they need, combined with a wealth of existing analysis software, to handle the ongoing data deluge.
Smart Payload Development for High Data Rate Instrument Systems
NASA Technical Reports Server (NTRS)
Pingree, Paula J.; Norton, Charles D.
2007-01-01
This slide presentation reviews the development of smart payload instrument systems with high data rates. On-board computation has become a bottleneck for advanced science instrument and engineering capabilities. In order to improve the computation capability on board, smart payloads have been proposed. A smart payload is a localized instrument that can offload extensive computing cycles from the flight processor, simplify the interfaces, and minimize the dependency of the instrument on the flight system. This has been proposed for the Mars mission Mars Atmospheric Trace Molecule Spectroscopy (MATMOS). The design of this system is discussed, the features of the Virtex-4 are described, and the technical approach is reviewed. The proposed hybrid Field Programmable Gate Array (FPGA) technology has been shown to deliver breakthrough performance by tightly coupling hardware and software. Smart payload designs for instruments such as MATMOS can meet science data return requirements with more competitive use of available on-board resources and can provide algorithm acceleration in hardware, leading to the implementation of better (more advanced) algorithms in on-board systems for improved science data return.
Fully anharmonic IR and Raman spectra of medium-size molecular systems: accuracy and interpretation†
Barone, Vincenzo; Biczysko, Malgorzata; Bloino, Julien
2015-01-01
Computation of full infrared (IR) and Raman spectra (including absolute intensities and transition energies) for medium- and large-sized molecular systems beyond the harmonic approximation is one of the most interesting challenges of contemporary computational chemistry. Contrary to common beliefs, low-order perturbation theory is able to deliver results of high accuracy (actually often better than those issuing from current direct dynamics approaches) provided that anharmonic resonances are properly managed. This perspective sketches the recent developments in our research group toward the development of a robust and user-friendly virtual spectrometer rooted in second-order vibrational perturbation theory (VPT2) and usable by non-specialists essentially as a black-box procedure. Several examples are explicitly worked out in order to illustrate the features of our computational tool together with the most important ongoing developments. PMID:24346191
Solving the Coupled System Improves Computational Efficiency of the Bidomain Equations
Southern, James A.; Plank, Gernot; Vigmond, Edward J.; Whiteley, Jonathan P.
2017-01-01
The bidomain equations are frequently used to model the propagation of cardiac action potentials across cardiac tissue. At the whole organ level the size of the computational mesh required makes their solution a significant computational challenge. As the accuracy of the numerical solution cannot be compromised, efficiency of the solution technique is important to ensure that the results of the simulation can be obtained in a reasonable time whilst still encapsulating the complexities of the system. In an attempt to increase efficiency of the solver, the bidomain equations are often decoupled into one parabolic equation that is computationally very cheap to solve and an elliptic equation that is much more expensive to solve. In this study the performance of this uncoupled solution method is compared with an alternative strategy in which the bidomain equations are solved as a coupled system. This seems counter-intuitive as the alternative method requires the solution of a much larger linear system at each time step. However, in tests on two 3-D rabbit ventricle benchmarks it is shown that the coupled method is up to 80% faster than the conventional uncoupled method — and that parallel performance is better for the larger coupled problem. PMID:19457741
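For context, one common parabolic-elliptic form of the bidomain equations (a standard textbook formulation, stated here for orientation rather than as the exact system used in the cited work) is, in LaTeX notation:

\begin{align}
\chi\Big(C_m \frac{\partial V_m}{\partial t} + I_{\mathrm{ion}}(V_m,\mathbf{w})\Big) &= \nabla\cdot\big(\sigma_i\,\nabla(V_m+\phi_e)\big),\\
\nabla\cdot\big((\sigma_i+\sigma_e)\,\nabla\phi_e\big) &= -\,\nabla\cdot\big(\sigma_i\,\nabla V_m\big),
\end{align}

where V_m is the transmembrane potential, phi_e the extracellular potential, sigma_i and sigma_e the intra- and extracellular conductivity tensors, chi the membrane surface-to-volume ratio, C_m the membrane capacitance, and I_ion the ionic current of the cell model. The first (parabolic) equation is the computationally cheap one to advance in time; the second (elliptic) equation for phi_e is the expensive one, which is why decoupled and fully coupled solution strategies are compared.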
Encoder-Decoder Optimization for Brain-Computer Interfaces
Merel, Josh; Pianto, Donald M.; Cunningham, John P.; Paninski, Liam
2015-01-01
Neuroprosthetic brain-computer interfaces are systems that decode neural activity into useful control signals for effectors, such as a cursor on a computer screen. It has long been recognized that both the user and decoding system can adapt to increase the accuracy of the end effector. Co-adaptation is the process whereby a user learns to control the system in conjunction with the decoder adapting to learn the user's neural patterns. We provide a mathematical framework for co-adaptation and relate co-adaptation to the joint optimization of the user's control scheme ("encoding model") and the decoding algorithm's parameters. When the assumptions of that framework are respected, co-adaptation cannot yield better performance than that obtainable by an optimal initial choice of fixed decoder, coupled with optimal user learning. For a specific case, we provide numerical methods to obtain such an optimized decoder. We demonstrate our approach in a model brain-computer interface system using an online prosthesis simulator, a simple human-in-the-loop psychophysics setup which provides a non-invasive simulation of the BCI setting. These experiments support two claims: that users can learn encoders matched to fixed, optimal decoders and that, once learned, our approach yields expected performance advantages. PMID:26029919
An element search ant colony technique for solving virtual machine placement problem
NASA Astrophysics Data System (ADS)
Srija, J.; Rani John, Rose; Kanaga, Grace Mary, Dr.
2017-09-01
The data centres in the cloud environment play a key role in providing infrastructure for ubiquitous computing, pervasive computing, mobile computing, etc. These computing paradigms try to utilize the available resources in order to provide services. Hence, maintaining high resource utilization without wasting power has become a challenging task for researchers. In this paper we propose a direct-guidance ant colony system for effectively mapping virtual machines to physical machines with maximal resource utilization and minimal power consumption. The proposed algorithm has been compared with an existing ant colony approach for the virtual machine placement problem, and it is shown to provide better results than the existing technique.
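To make the ant-colony idea concrete, here is a much-simplified pheromone-guided placement sketch (my own illustration; the paper's element-search and direct-guidance refinements are not reproduced): ants repeatedly assign VMs to hosts with probability proportional to pheromone weighted by a packing heuristic, and the pheromone trail of the best placement found so far is reinforced.

# Simplified ant-colony sketch for VM placement (illustrative, not the paper's algorithm).
import random

HOST_CAPACITY = 16                       # CPU cores per physical machine (assumed)
VM_DEMANDS = [4, 8, 2, 6, 3, 5, 7, 1]    # hypothetical per-VM core demands
N_HOSTS, N_ANTS, N_ITER, RHO = 4, 10, 50, 0.1

def build_placement(pheromone):
    used = [0] * N_HOSTS
    placement = []
    for vm, demand in enumerate(VM_DEMANDS):
        feasible = [h for h in range(N_HOSTS) if used[h] + demand <= HOST_CAPACITY]
        if not feasible:
            return None
        # Prefer hosts with strong pheromone and good packing (already partly loaded).
        weights = [pheromone[vm][h] * (1 + used[h] / HOST_CAPACITY) for h in feasible]
        host = random.choices(feasible, weights=weights)[0]
        placement.append(host)
        used[host] += demand
    return placement

def hosts_used(placement):
    return len(set(placement))           # fewer active hosts means less power consumed

pheromone = [[1.0] * N_HOSTS for _ in VM_DEMANDS]
best, best_cost = None, float("inf")
for _ in range(N_ITER):
    for _ in range(N_ANTS):
        p = build_placement(pheromone)
        if p is not None and hosts_used(p) < best_cost:
            best, best_cost = p, hosts_used(p)
    # Evaporate everywhere, then reinforce the best placement found so far.
    for vm in range(len(VM_DEMANDS)):
        for h in range(N_HOSTS):
            pheromone[vm][h] *= (1 - RHO)
        if best is not None:
            pheromone[vm][best[vm]] += 1.0

print("best placement (vm -> host):", best, "active hosts:", best_cost)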
NASA Technical Reports Server (NTRS)
Yates, Leslie A.
1992-01-01
Software for an automated film-reading system that uses personal computers and digitized shadowgraphs is described. The software identifies pixels associated with fiducial-line and model images, and least-squares procedures are used to calculate the positions and orientations of the images. Automated position and orientation readings for sphere and cone models are compared to those obtained using a manual film reader. When facility calibration errors are removed from these readings, the accuracy of the automated readings is better than the pixel resolution, and it is equal to, or better than, that of the manual readings. The effects of film-reading and facility-calibration errors on calculated aerodynamic coefficients are discussed.
Logistics Automation Master Plan (LAMP). Better Logistics Support through Automation.
1983-06-01
office micro-computers, positioned throughout the command chain, by providing real time links between LCA and all users: 2. Goals: Assist HQDA staff in...field i.e., Airland Battle 2000. IV-27 Section V: CONCEPT OF EXECUTION Supply (Retail) A. System Description. 1. The Division Logistics Property Book...7. Divisional Direct Support Unit Automated Supply System (DDASS)/Direct Support Level Supply Automation (DLSA). DDASS and DLSA are system development
1985-07-01
sublethal toxicity of tributyltin oxide (TBTO) and its putative environmental product, tributyltin sulfide (TBTS), to zoeal mud crabs, Rhithropanopeus...EXPOSURE TO TRIBUTYLTIN A. Valkirs, B. Davidson, Computer Sciences Corporation; P. Seligman, Naval Ocean Systems Center...Organotin coatings...study better defines the long-term toxicity and bioaccumulation potential of tributyltin released from antifouling
NASA Astrophysics Data System (ADS)
Nakamura, Christopher M.; Murphy, Sytil K.; Christel, Michael G.; Stevens, Scott M.; Zollman, Dean A.
2016-06-01
Computer-automated assessment of students' text responses to short-answer questions represents an important enabling technology for online learning environments. We have investigated the use of machine learning to train computer models capable of automatically classifying short-answer responses and assessed the results. Our investigations are part of a project to develop and test an interactive learning environment designed to help students learn introductory physics concepts. The system is designed around an interactive video tutoring interface. We have analyzed 9 such short-answer questions, each with about 150 responses or fewer. For 4 of the 9, we observe automated assessment with interrater agreement of 70% or better with the human rater. This level of agreement may represent a baseline for practical utility in instruction and indicates that the method warrants further investigation for use in this type of application. Our results also suggest strategies that may be useful for writing activities and questions that are more appropriate for automated assessment. These strategies include building activities that have relatively few conceptually distinct ways of perceiving the physical behavior of relatively few physical objects. Further success in this direction may allow us to promote interactivity and better provide feedback in online learning systems. These capabilities could enable our system to function more like a real tutor.
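A conventional way to set up such a pipeline and to score it against a human rater is sketched below. The bag-of-words/logistic-regression choice, the toy responses and the labels are assumptions for illustration, not the authors' classifier or data.

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import cohen_kappa_score

# Toy responses and human labels (1 = correct idea, 0 = incorrect), illustrative only
responses = ["the ball speeds up because gravity pulls it down",
             "it moves at the same speed the whole time",
             "gravity makes it accelerate toward the ground",
             "the speed stays constant after it is thrown",
             "acceleration is constant so velocity increases",
             "nothing changes its velocity"]
human = np.array([1, 0, 1, 0, 1, 0])

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
machine = cross_val_predict(model, responses, human, cv=3)

agreement = np.mean(machine == human)          # simple percent agreement with the human rater
kappa = cohen_kappa_score(human, machine)      # chance-corrected agreement
print(f"agreement = {agreement:.0%}, kappa = {kappa:.2f}")
```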
Shah, Dipali Yogesh; Wadekar, Swati Ishwara; Dadpe, Ashwini Manish; Jadhav, Ganesh Ranganath; Choudhary, Lalit Jayant; Kalra, Dheeraj Deepak
2017-01-01
The purpose of this study was to compare and evaluate the shaping ability of the ProTaper (PT) and Self-Adjusting File (SAF) systems using cone-beam computed tomography (CBCT) to assess their performance in oval-shaped root canals. Sixty-two mandibular premolars with single oval canals were divided into two experimental groups (n = 31) according to the systems used: Group I - PT and Group II - SAF. Canals were evaluated before and after instrumentation using CBCT to assess centering ratio and canal transportation at three levels. Data were statistically analyzed using one-way analysis of variance, post hoc Tukey's test, and t-test. The SAF showed better centering ability and lesser canal transportation than the PT only in the buccolingual plane at the 6 and 9 mm levels. The shaping ability of the PT was best in the apical third in both planes. The SAF had statistically significantly better centering and lesser canal transportation in the buccolingual as compared to the mesiodistal plane at the middle and coronal levels. The SAF produced significantly less transportation and remained more centered than the PT at the middle and coronal levels in the buccolingual plane of oval canals. In the mesiodistal plane, the performance of both systems was comparable.
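For readers unfamiliar with the two measures, a worked sketch of a commonly cited formulation of canal transportation and centering ratio (after Gambill et al.) is shown below. The distance values are invented for illustration, and the exact convention used in this particular study may differ.

```python
def transportation_and_centering(m1, m2, d1, d2):
    """m1/d1: pre-instrumentation distances from the canal edge to the mesial/distal root surface;
    m2/d2: the same distances after instrumentation (all in mm)."""
    dm, dd = m1 - m2, d1 - d2                  # dentine removed on each side
    transportation = abs(dm - dd)              # 0 means the canal stayed on its original axis
    if max(dm, dd) == 0:
        centering = 1.0                        # nothing removed: treated as perfectly centered
    else:
        centering = min(dm, dd) / max(dm, dd)  # 1 = perfect centering, 0 = one-sided cutting
    return transportation, centering

# Hypothetical measurements at the 6 mm level (mm)
t, c = transportation_and_centering(m1=1.20, m2=1.05, d1=1.10, d2=1.02)
print(f"transportation = {t:.2f} mm, centering ratio = {c:.2f}")
```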
NASA Astrophysics Data System (ADS)
Lu, Zheng; Lu, Xilin; Lu, Wensheng; Masri, Sami F.
2012-04-01
This paper presents a systematic experimental investigation of the effects of buffered particle dampers attached to a multi-degree-of-freedom (mdof) system under different dynamic loads (free vibration, random excitation as well as real onsite earthquake excitations), and analytical/computational study of such a system. A series of shaking table tests of a three-storey steel frame with the buffered particle damper system are carried out to evaluate the performance and to verify the analysis method. It is shown that buffered particle dampers have good performance in reducing the response of structures under dynamic loads, especially under random excitation case. It can effectively control the fundamental mode of the mdof primary system; however, the control effect for higher modes is variable. It is also shown that, for a specific container geometry, a certain mass ratio leads to more efficient momentum transfer from the primary system to the particles with a better vibration attenuation effect, and that buffered particle dampers have better control effect than the conventional rigid ones. An analytical solution based on the discrete element method is also presented. Comparison between the experimental and computational results shows that reasonably accurate estimates of the response of a primary system can be obtained. Properly designed buffered particle dampers can effectively reduce the response of lightly damped mdof primary system with a small weight penalty, under different dynamic loads.
Computational Approaches to Drug Repurposing and Pharmacology
Hodos, Rachel A; Kidd, Brian A; Khader, Shameer; Readhead, Ben P; Dudley, Joel T
2016-01-01
Data in the biological, chemical, and clinical domains are accumulating at ever-increasing rates and have the potential to accelerate and inform drug development in new ways. Challenges and opportunities now lie in developing analytic tools to transform these often complex and heterogeneous data into testable hypotheses and actionable insights. This is the aim of computational pharmacology, which uses in silico techniques to better understand and predict how drugs affect biological systems, which can in turn improve clinical use, avoid unwanted side effects, and guide selection and development of better treatments. One exciting application of computational pharmacology is drug repurposing: finding new uses for existing drugs. Already yielding many promising candidates, this strategy has the potential to improve the efficiency of the drug development process and reach patient populations with previously unmet needs such as those with rare diseases. While current techniques in computational pharmacology and drug repurposing often focus on just a single data modality such as gene expression or drug-target interactions, we rationalize that methods such as matrix factorization that can integrate data within and across diverse data types have the potential to improve predictive performance and provide a fuller picture of a drug's pharmacological action. PMID:27080087
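As a hedged illustration of the matrix-factorization idea mentioned in the review, the sketch below factorizes a tiny drug-by-indication matrix with non-negative matrix factorization and ranks the unobserved entries as repurposing candidates. The matrix, its dimensions and the rank are illustrative assumptions, not data from the review.

```python
import numpy as np
from sklearn.decomposition import NMF

# Toy drug x indication association matrix (1 = known use, 0 = unknown), illustrative only
R = np.array([[1, 0, 1, 0],
              [1, 1, 0, 0],
              [0, 1, 1, 0],
              [0, 0, 1, 1]], dtype=float)

model = NMF(n_components=2, init="nndsvda", max_iter=500, random_state=0)
W = model.fit_transform(R)          # drug factors
H = model.components_               # indication factors
scores = W @ H                      # reconstructed association scores

# Rank unobserved (drug, indication) pairs as repurposing candidates
candidates = [(d, i, scores[d, i]) for d in range(R.shape[0])
              for i in range(R.shape[1]) if R[d, i] == 0]
for d, i, s in sorted(candidates, key=lambda t: -t[2])[:3]:
    print(f"drug {d} -> indication {i}: score {s:.2f}")
```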
Intercell scheduling: A negotiation approach using multi-agent coalitions
NASA Astrophysics Data System (ADS)
Tian, Yunna; Li, Dongni; Zheng, Dan; Jia, Yunde
2016-10-01
Intercell scheduling problems arise as a result of intercell transfers in cellular manufacturing systems. Flexible intercell routes are considered in this article, and a coalition-based scheduling (CBS) approach using distributed multi-agent negotiation is developed. Taking advantage of the extended vision of the coalition agents, the global optimization is improved and the communication cost is reduced. The objective of the addressed problem is to minimize mean tardiness. Computational results show that, compared with the widely used combinatorial rules, CBS provides better performance not only in minimizing the objective, i.e. mean tardiness, but also in minimizing auxiliary measures such as maximum completion time, mean flow time and the ratio of tardy parts. Moreover, CBS is better than the existing intercell scheduling approach for the same problem with respect to the solution quality and computational costs.
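The objective and auxiliary measures named above are simple functions of the schedule. A short worked example of how mean tardiness, maximum completion time, mean flow time and the ratio of tardy parts are computed from release, due and completion times is given below; the job data are invented.

```python
# Hypothetical schedule outcome for five parts: (release, due, completion) times
jobs = [(0, 10, 8), (2, 12, 15), (4, 20, 18), (5, 14, 21), (6, 25, 23)]

tardiness = [max(0, c - d) for r, d, c in jobs]
flow_time = [c - r for r, d, c in jobs]

mean_tardiness = sum(tardiness) / len(jobs)
makespan = max(c for r, d, c in jobs)               # maximum completion time
mean_flow_time = sum(flow_time) / len(jobs)
tardy_ratio = sum(t > 0 for t in tardiness) / len(jobs)

print(mean_tardiness, makespan, mean_flow_time, tardy_ratio)
# -> 2.0 23 13.6 0.4
```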
The Future Medical Science and Colorectal Surgeons.
Kim, Young Jin
2017-12-01
Future medical technology breakthroughs will build from the incredible progress made in computers, biotechnology, and nanotechnology and from the information learned from the human genome. With such technology and information, computer-aided diagnoses, organ replacement, gene therapy, personalized drugs, and even age reversal will become possible. True 3-dimensional system technology will enable surgeons to envision key clinical features and will help them in planning complex surgery. Surgeons will enter surgical instructions in a virtual space from a remote medical center, order a medical robot to perform the operation, and review the operation in real time on a monitor. Surgeons will be better than artificial intelligence or automated robots when surgeons (or we) love patients and ask questions for a better future. The purpose of this paper is to look at the future of medical science and the changing role of colorectal surgeons.
Fang, Yuling; Chen, Qingkui; Xiong, Neal N; Zhao, Deyu; Wang, Jingjuan
2017-08-04
This paper aims to develop a low-cost, high-performance and high-reliability computing system to process large-scale data using common data mining algorithms in the Internet of Things (IoT) computing environment. Considering the characteristics of IoT data processing, similar to mainstream high performance computing, we use a GPU (Graphics Processing Unit) cluster to achieve better IoT services. Firstly, we present an energy consumption calculation method (ECCM) based on WSNs. Then, using the CUDA (Compute Unified Device Architecture) Programming model, we propose a Two-level Parallel Optimization Model (TLPOM) which exploits reasonable resource planning and common compiler optimization techniques to obtain the best blocks and threads configuration considering the resource constraints of each node. The key to this part is dynamic coupling Thread-Level Parallelism (TLP) and Instruction-Level Parallelism (ILP) to improve the performance of the algorithms without additional energy consumption. Finally, combining the ECCM and the TLPOM, we use the Reliable GPU Cluster Architecture (RGCA) to obtain a high-reliability computing system considering the nodes' diversity, algorithm characteristics, etc. The results show that the performance of the algorithms significantly increased by 34.1%, 33.96% and 24.07% for Fermi, Kepler and Maxwell on average with TLPOM and the RGCA ensures that our IoT computing system provides low-cost and high-reliability services.
Computer laboratory in medical education for medical students.
Hercigonja-Szekeres, Mira; Marinović, Darko; Kern, Josipa
2009-01-01
Five generations of second year students at the Zagreb University School of Medicine were interviewed through an anonymous questionnaire on their use of personal computers, the Internet, computer laboratories and computer-assisted education in general. Results show an advance in students' usage of information and communication technology during the period from 1998/99 to 2002/03. However, their positive opinion about the computer laboratory depends on installed capacities: the better the computer laboratory technology, the better the students' acceptance and use of it.
Toward detection of marine vehicles on horizon from buoy camera
NASA Astrophysics Data System (ADS)
Fefilatyev, Sergiy; Goldgof, Dmitry B.; Langebrake, Lawrence
2007-10-01
This paper presents a new technique for automatic detection of marine vehicles in open sea from a buoy camera system using a computer vision approach. Users of such a system include border guards, military, port safety and flow management, and sanctuary protection personnel. The system is intended to work autonomously, taking images of the surrounding ocean surface and analyzing them for the presence of marine vehicles. The goal of the system is to detect an approximate window around the ship and prepare the small image for transmission and human evaluation. The proposed computer vision-based algorithm combines a horizon detection method with edge detection and post-processing. A dataset of 100 images is used to evaluate the performance of the proposed technique. We discuss promising results of ship detection and suggest necessary improvements for achieving better performance.
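A rough, hedged sketch of such a pipeline with OpenCV is shown below: estimate the horizon as the dominant near-horizontal line in the edge map, then look for edge clusters in a band around it. The thresholds, band size and input file name are assumptions, not values from the paper.

```python
import cv2
import numpy as np

img = cv2.imread("buoy_frame.jpg", cv2.IMREAD_GRAYSCALE)   # hypothetical input frame
edges = cv2.Canny(cv2.GaussianBlur(img, (5, 5), 0), 50, 150)

# Horizon: strongest near-horizontal line found by a Hough transform
lines = cv2.HoughLines(edges, 1, np.pi / 180, threshold=150)
rho, theta = min(lines[:, 0, :], key=lambda l: abs(l[1] - np.pi / 2))  # closest to horizontal
horizon_row = int(rho / max(np.sin(theta), 1e-6))            # row where the line crosses col 0

# Post-processing: column-wise edge density in a band around the horizon hints at a ship
band = edges[max(0, horizon_row - 20): horizon_row + 5, :]
density = band.sum(axis=0) / 255
cols = np.where(density > 3)[0]                               # assumed density threshold
if cols.size:
    print("candidate ship window columns:", cols.min(), "to", cols.max())
else:
    print("no ship candidate found")
```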
Gao, Luying; Liu, Ruyu; Jiang, Yuxin; Song, Wenfeng; Wang, Ying; Liu, Jia; Wang, Juanjuan; Wu, Dongqian; Li, Shuai; Hao, Aimin; Zhang, Bo
2018-04-01
The purpose of this study was to compare the diagnostic efficiency of a thyroid ultrasound computer-aided diagnosis (CAD) system with that of 1 radiologist. This study retrospectively reviewed 342 surgically resected thyroid nodules from July 2013 to December 2013 at our center. The nodules were assessed on typical ultrasound images using the CAD system and reviewed by 1 experienced radiologist. The radiologist stratified the risk of malignancy using the Thyroid Imaging Reporting and Data Systems (TIRADS) and the American Thyroid Association (ATA) guidelines. The radiologist, using TIRADS and ATA guidelines, performed better than the CAD system (P < .01). The sensitivity of the CAD system was similar to that of an experienced radiologist (P > .05; P < .01; and P > .05). However, we found that the CAD system had lower specificity (P < .01). The sensitivity of a thyroid ultrasound CAD system in differentiating nodules was similar to that of an experienced radiologist. However, the CAD system had lower specificity. © 2017 Wiley Periodicals, Inc.
A joint precoding scheme for indoor downlink multi-user MIMO VLC systems
NASA Astrophysics Data System (ADS)
Zhao, Qiong; Fan, Yangyu; Kang, Bochao
2017-11-01
In this study, we aim to improve the system performance and reduce the implementation complexity of precoding schemes for visible light communication (VLC) systems. By incorporating the power-method algorithm and the block diagonalization (BD) algorithm, we propose a joint precoding scheme for indoor downlink multi-user multi-input-multi-output (MU-MIMO) VLC systems. In this scheme, we first apply the BD algorithm to eliminate the co-channel interference (CCI) among users. Second, the power-method algorithm is used to search for the precoding weight of each user based on the criterion of signal-to-interference-plus-noise ratio (SINR) maximization. Finally, the optical power restrictions of VLC systems are taken into account to constrain the precoding weight matrix. Comprehensive computer simulations in two scenarios indicate that the proposed scheme always achieves better bit error rate (BER) performance and lower computational complexity than the traditional scheme.
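The two linear-algebra steps, block diagonalization followed by a power-method search for each user's dominant precoding direction, can be sketched compactly in numpy for a toy two-user case. The dimensions and channel matrices below are random placeholders, and the paper's optical power constraint is only mimicked by a crude normalization.

```python
import numpy as np
rng = np.random.default_rng(0)

Nt, Nr, users = 8, 2, 2                      # transmit LEDs, receive PDs per user (assumed)
H = [rng.standard_normal((Nr, Nt)) for _ in range(users)]

precoders = []
for k in range(users):
    # Block diagonalization: project onto the null space of the other users' channels
    H_others = np.vstack([H[j] for j in range(users) if j != k])
    _, _, Vt = np.linalg.svd(H_others)
    V0 = Vt[np.linalg.matrix_rank(H_others):].T   # null-space basis, shape (Nt, Nt - rank)

    # Power method on the effective channel Gram matrix to find the dominant direction
    A = V0.T @ H[k].T @ H[k] @ V0
    v = rng.standard_normal(A.shape[0])
    for _ in range(50):
        v = A @ v
        v /= np.linalg.norm(v)

    w = V0 @ v
    precoders.append(w / np.abs(w).sum())      # crude stand-in for an optical power constraint

# Residual interference should be numerically zero thanks to BD
print("leakage user0 -> user1:", np.linalg.norm(H[1] @ precoders[0]))
```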
An interactive web-based system using cloud for large-scale visual analytics
NASA Astrophysics Data System (ADS)
Kaseb, Ahmed S.; Berry, Everett; Rozolis, Erik; McNulty, Kyle; Bontrager, Seth; Koh, Youngsol; Lu, Yung-Hsiang; Delp, Edward J.
2015-03-01
Network cameras have been growing rapidly in recent years. Thousands of public network cameras provide tremendous amount of visual information about the environment. There is a need to analyze this valuable information for a better understanding of the world around us. This paper presents an interactive web-based system that enables users to execute image analysis and computer vision techniques on a large scale to analyze the data from more than 65,000 worldwide cameras. This paper focuses on how to use both the system's website and Application Programming Interface (API). Given a computer program that analyzes a single frame, the user needs to make only slight changes to the existing program and choose the cameras to analyze. The system handles the heterogeneity of the geographically distributed cameras, e.g. different brands, resolutions. The system allocates and manages Amazon EC2 and Windows Azure cloud resources to meet the analysis requirements.
The computational future for climate and Earth system models: on the path to petaflop and beyond.
Washington, Warren M; Buja, Lawrence; Craig, Anthony
2009-03-13
The development of the climate and Earth system models has had a long history, starting with the building of individual atmospheric, ocean, sea ice, land vegetation, biogeochemical, glacial and ecological model components. The early researchers were much aware of the long-term goal of building the Earth system models that would go beyond what is usually included in the climate models by adding interactive biogeochemical interactions. In the early days, the progress was limited by computer capability, as well as by our knowledge of the physical and chemical processes. Over the last few decades, there has been much improved knowledge, better observations for validation and more powerful supercomputer systems that are increasingly meeting the new challenges of comprehensive models. Some of the climate model history will be presented, along with some of the successes and difficulties encountered with present-day supercomputer systems.
A Cloud-based Infrastructure and Architecture for Environmental System Research
NASA Astrophysics Data System (ADS)
Wang, D.; Wei, Y.; Shankar, M.; Quigley, J.; Wilson, B. E.
2016-12-01
The present availability of high-capacity networks, low-cost computers and storage devices, and the widespread adoption of hardware virtualization and service-oriented architecture provide a great opportunity to enable data and computing infrastructure sharing between closely related research activities. By taking advantage of these approaches, along with the world-class high-performance computing and data infrastructure located at Oak Ridge National Laboratory, a cloud-based infrastructure and architecture has been developed to efficiently deliver essential data and informatics services and utilities to the environmental system research community, and it will provide unique capabilities that allow terrestrial ecosystem research projects to share their software utilities (tools), data and even data submission workflows in a straightforward fashion. The infrastructure will minimize disruptions to current project-based data submission workflows for better acceptance by existing projects, since many ecosystem research projects already have their own requirements or preferences for data submission and collection. The infrastructure will eliminate scalability problems with current project silos by providing unified data services and infrastructure. The infrastructure consists of two key components: (1) a collection of configurable virtual computing environments and user management systems that expedite data submission and collection from the environmental system research community, and (2) scalable data management services and systems, originated and developed by ORNL data centers.
The BioIntelligence Framework: a new computational platform for biomedical knowledge computing
Farley, Toni; Kiefer, Jeff; Lee, Preston; Von Hoff, Daniel; Trent, Jeffrey M; Colbourn, Charles
2013-01-01
Breakthroughs in molecular profiling technologies are enabling a new data-intensive approach to biomedical research, with the potential to revolutionize how we study, manage, and treat complex diseases. The next great challenge for clinical applications of these innovations will be to create scalable computational solutions for intelligently linking complex biomedical patient data to clinically actionable knowledge. Traditional database management systems (DBMS) are not well suited to representing complex syntactic and semantic relationships in unstructured biomedical information, introducing barriers to realizing such solutions. We propose a scalable computational framework for addressing this need, which leverages a hypergraph-based data model and query language that may be better suited for representing complex multi-lateral, multi-scalar, and multi-dimensional relationships. We also discuss how this framework can be used to create rapid learning knowledge base systems to intelligently capture and relate complex patient data to biomedical knowledge in order to automate the recovery of clinically actionable information. PMID:22859646
NASA Technical Reports Server (NTRS)
Trivedi, K. S. (Editor); Clary, J. B. (Editor)
1980-01-01
A computer aided reliability estimation procedure (CARE 3), developed to model the behavior of ultrareliable systems required by flight-critical avionics and control systems, is evaluated. The mathematical models, numerical method, and fault-tolerant architecture modeling requirements are examined, and the testing and characterization procedures are discussed. Recommendations aimed at enhancing CARE 3 are presented; in particular, the need for a better exposition of the method and the user interface is emphasized.
Interfacing HTCondor-CE with OpenStack
NASA Astrophysics Data System (ADS)
Bockelman, B.; Caballero Bejar, J.; Hover, J.
2017-10-01
Over the past few years, Grid Computing technologies have reached a high level of maturity. One key aspect of this success has been the development and adoption of newer Compute Elements to interface the external Grid users with local batch systems. These new Compute Elements allow for better handling of jobs requirements and a more precise management of diverse local resources. However, despite this level of maturity, the Grid Computing world is lacking diversity in local execution platforms. As Grid Computing technologies have historically been driven by the needs of the High Energy Physics community, most resource providers run the platform (operating system version and architecture) that best suits the needs of their particular users. In parallel, the development of virtualization and cloud technologies has accelerated recently, making available a variety of solutions, both commercial and academic, proprietary and open source. Virtualization facilitates performing computational tasks on platforms not available at most computing sites. This work attempts to join the technologies, allowing users to interact with computing sites through one of the standard Computing Elements, HTCondor-CE, but running their jobs within VMs on a local cloud platform, OpenStack, when needed. The system will re-route, in a transparent way, end user jobs into dynamically-launched VM worker nodes when they have requirements that cannot be satisfied by the static local batch system nodes. Also, once the automated mechanisms are in place, it becomes straightforward to allow an end user to invoke a custom Virtual Machine at the site. This will allow cloud resources to be used without requiring the user to establish a separate account. Both scenarios are described in this work.
Review of Augmented Paper Systems in Education: An Orchestration Perspective
ERIC Educational Resources Information Center
Prieto, Luis P.; Wen, Yun; Caballero, Daniela; Dillenbourg, Pierre
2014-01-01
Augmented paper has been proposed as a way to integrate more easily ICTs in settings like formal education, where paper has a strong presence. However, despite the multiplicity of educational applications using paper-based computing, their deployment in authentic settings is still marginal. To better understand this gap between research proposals…
Safeguarding Your Technology: Practical Guidelines for Electronic Education Information Security.
ERIC Educational Resources Information Center
Szuba, Tom
This guide was developed specifically for educational administrators at the building, campus, district, system, and state levels, and is meant to serve as a framework to help them better understand why and how to effectively secure their organization's information, software, and computer and networking equipment. This document is organized into 10…
Predicting pork loin intramuscular fat using computer vision system.
Liu, J-H; Sun, X; Young, J M; Bachmeier, L A; Newman, D J
2018-09-01
The objective of this study was to investigate the ability of a computer vision system to predict pork intramuscular fat percentage (IMF%). Center-cut loin samples (n = 85) were trimmed of subcutaneous fat and connective tissue. Images were acquired and pixels were segregated to estimate image IMF% and 18 image color features for each image. Subjective IMF% was determined by a trained grader. Ether extract IMF% was calculated using the ether extract method. Image color features and image IMF% were used as predictors for stepwise regression and support vector machine models. Results showed that subjective IMF% had a correlation of 0.81 with ether extract IMF%, while image IMF% had a 0.66 correlation with ether extract IMF%. Accuracy rates for the regression models were 0.63 for stepwise and 0.75 for support vector machine. Although subjective IMF% showed better prediction, results from the computer vision system demonstrate its potential as a tool for predicting pork IMF% in the future. Copyright © 2018 Elsevier Ltd. All rights reserved.
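As a hedged illustration of the support-vector branch of that comparison, the sketch below regresses a synthetic stand-in for ether-extract IMF% on synthetic stand-ins for the 18 image color features with scikit-learn and reports the cross-validated correlation; none of the data or hyperparameters come from the study.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(1)
n_samples, n_features = 85, 18                          # matches the study's sample/feature counts
X = rng.normal(size=(n_samples, n_features))            # synthetic image color features
y = 2.5 + 0.8 * X[:, 0] + rng.normal(scale=0.5, size=n_samples)  # synthetic ether-extract IMF%

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.1))
pred = cross_val_predict(model, X, y, cv=5)

corr = np.corrcoef(pred, y)[0, 1]
print(f"cross-validated correlation with ether-extract IMF%: {corr:.2f}")
```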
Galaxy CloudMan: delivering cloud compute clusters
2010-01-01
Background Widespread adoption of high-throughput sequencing has greatly increased the scale and sophistication of computational infrastructure needed to perform genomic research. An alternative to building and maintaining local infrastructure is “cloud computing”, which, in principle, offers on demand access to flexible computational infrastructure. However, cloud computing resources are not yet suitable for immediate “as is” use by experimental biologists. Results We present a cloud resource management system that makes it possible for individual researchers to compose and control an arbitrarily sized compute cluster on Amazon’s EC2 cloud infrastructure without any informatics requirements. Within this system, an entire suite of biological tools packaged by the NERC Bio-Linux team (http://nebc.nerc.ac.uk/tools/bio-linux) is available for immediate consumption. The provided solution makes it possible, using only a web browser, to create a completely configured compute cluster ready to perform analysis in less than five minutes. Moreover, we provide an automated method for building custom deployments of cloud resources. This approach promotes reproducibility of results and, if desired, allows individuals and labs to add or customize an otherwise available cloud system to better meet their needs. Conclusions The expected knowledge and associated effort with deploying a compute cluster in the Amazon EC2 cloud is not trivial. The solution presented in this paper eliminates these barriers, making it possible for researchers to deploy exactly the amount of computing power they need, combined with a wealth of existing analysis software, to handle the ongoing data deluge. PMID:21210983
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, C.
Almost every computer architect dreams of achieving high system performance with low implementation costs. A multigauge machine can reconfigure its data-path width, provide parallelism, achieve better resource utilization, and sometimes can trade computational precision for increased speed. A simple experimental method is used here to capture the main characteristics of multigauging. The measurements indicate evidence of near-optimal speedups. Adapting these ideas in designing parallel processors incurs low costs and provides flexibility. Several operational aspects of designing a multigauge machine are discussed as well. Thus, this research reports the technical, economical, and operational feasibility studies of multigauging.
A review on economic emission dispatch problems using quantum computational intelligence
NASA Astrophysics Data System (ADS)
Mahdi, Fahad Parvez; Vasant, Pandian; Kallimani, Vish; Abdullah-Al-Wadud, M.
2016-11-01
Economic emission dispatch (EED) problems are among the most crucial problems in power systems. Growing energy demand, limited natural resources and global warming have made this topic a center of discussion and research. This paper reviews the use of Quantum Computational Intelligence (QCI) in solving economic emission dispatch problems. QCI techniques such as the Quantum Genetic Algorithm (QGA) and the Quantum Particle Swarm Optimization (QPSO) algorithm are discussed here. This paper encourages researchers to use more QCI-based algorithms to obtain better optimal results for EED problems.
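To make the QPSO update rule concrete, here is a bare-bones loop on a stand-in quadratic cost. The test function, variable bounds and contraction-expansion coefficient are illustrative assumptions; a real EED objective would combine fuel-cost and emission terms and enforce the power-balance constraint.

```python
import numpy as np
rng = np.random.default_rng(0)

def cost(x):                      # stand-in objective; replace with fuel cost plus emissions
    return np.sum((x - 3.0) ** 2)

dim, swarm, iters, beta = 5, 20, 200, 0.75      # beta: contraction-expansion coefficient (assumed)
X = rng.uniform(0.0, 10.0, size=(swarm, dim))    # generator outputs within assumed limits
pbest = X.copy()
gbest = pbest[np.argmin([cost(p) for p in pbest])].copy()

for _ in range(iters):
    mbest = pbest.mean(axis=0)                               # mean best position
    for i in range(swarm):
        phi = rng.random(dim)
        p = phi * pbest[i] + (1 - phi) * gbest                # local attractor
        u = 1.0 - rng.random(dim)                             # in (0, 1]
        sign = np.where(rng.random(dim) < 0.5, 1.0, -1.0)
        X[i] = p + sign * beta * np.abs(mbest - X[i]) * np.log(1.0 / u)
        X[i] = np.clip(X[i], 0.0, 10.0)                       # enforce generator limits
        if cost(X[i]) < cost(pbest[i]):
            pbest[i] = X[i].copy()
    gbest = pbest[np.argmin([cost(p) for p in pbest])].copy()

print("best cost:", round(cost(gbest), 4))
```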
Reaching out to clinicians: implementation of a computerized alert system.
Degnan, Dan; Merryfield, Dave; Hultgren, Steve
2004-01-01
Several published articles have identified that providing automated, computer-generated clinical alerts about potentially critical clinical situations should result in better quality of care. In 1999, the pharmacy department at a community hospital network implemented and refined a commercially available, computerized clinical alert system. This case report discusses the implementation process, gives examples of how the system is used, and describes results following implementation. The use of the clinical alert system in this hospital network resulted in improved patient safety as well as in greater efficiency and decreased costs.
Experimental realization of a highly secure chaos communication under strong channel noise
NASA Astrophysics Data System (ADS)
Ye, Weiping; Dai, Qionglin; Wang, Shihong; Lu, Huaping; Kuang, Jinyu; Zhao, Zhenfeng; Zhu, Xiangqing; Tang, Guoning; Huang, Ronghuai; Hu, Gang
2004-09-01
A one-way coupled spatiotemporally chaotic map lattice is used to construct a cryptosystem. With the combined application of both chaotic computations and conventional algebraic operations, our system has optimal cryptographic properties, much better than the separate application of known chaotic and conventional methods. We have carried out experiments on duplex secure voice communications over a realistic wired Public Switched Telephone Network, applying our chaotic system and the Advanced Encryption Standard (AES) system, respectively, for cryptography. Our system can work stably against strong channel noise when AES fails to work.
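The following sketch is only a toy illustration of the general idea of a one-way coupled chaotic map lattice used as a keystream generator (here a coupled logistic lattice whose last site is quantized and XORed with the plaintext). It is not the authors' cipher, omits the algebraic operations they combine with it, and offers no real security guarantees.

```python
import numpy as np

def keystream(key, n_bytes, sites=8, eps=0.95, warmup=200):
    """One-way coupled logistic map lattice; the last site is quantized to bytes."""
    rng = np.random.default_rng(key)            # key seeds the initial lattice state (toy choice)
    x = rng.uniform(0.1, 0.9, size=sites)
    f = lambda v: 4.0 * v * (1.0 - v)            # fully developed logistic map
    out = []
    for step in range(warmup + n_bytes):
        new = f(x)
        new[1:] = (1.0 - eps) * f(x[1:]) + eps * f(x[:-1])   # one-way coupling from site i-1 to i
        x = new
        if step >= warmup:
            out.append(int(x[-1] * 256) & 0xFF)
    return bytes(out)

def encrypt(key, plaintext: bytes) -> bytes:
    ks = keystream(key, len(plaintext))
    return bytes(p ^ k for p, k in zip(plaintext, ks))

msg = b"duplex voice frame"
ct = encrypt(12345, msg)
assert encrypt(12345, ct) == msg                 # the XOR stream cipher is its own inverse
print(ct.hex())
```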
Future directions of meteorology related to air-quality research.
Seaman, Nelson L
2003-06-01
Meteorology is one of the major factors contributing to air-pollution episodes. More accurate representation of meteorological fields has been possible in recent years through the use of remote sensing systems, high-speed computers and fine-mesh meteorological models. Over the next 5-20 years, better meteorological inputs for air quality studies will depend on making better use of a wealth of new remotely sensed observations in more advanced data assimilation systems. However, for fine-mesh models to be successful, the parameterizations used to represent physical processes must be redesigned to be more precise and better adapted to the scales at which they will be applied. Candidates for significant overhaul include schemes to represent turbulence, deep convection, shallow clouds, and land-surface processes. Improvements in the meteorological observing systems, data assimilation and modeling, coupled with advancements in air-chemistry modeling, will soon lead to operational forecasting of air quality in the US. Predictive capabilities can be expected to grow rapidly over the next decade. This will open the way for a number of valuable new services and strategies, including better warnings of unhealthy atmospheric conditions, event-dependent emissions restrictions, and nowcasting support for homeland security in the event of toxic releases into the atmosphere.
A scalable approach to solving dense linear algebra problems on hybrid CPU-GPU systems
Song, Fengguang; Dongarra, Jack
2014-10-01
Aiming to fully exploit the computing power of all CPUs and all graphics processing units (GPUs) on hybrid CPU-GPU systems to solve dense linear algebra problems, in this paper we design a class of heterogeneous tile algorithms to maximize the degree of parallelism, to minimize the communication volume, and to accommodate the heterogeneity between CPUs and GPUs. The new heterogeneous tile algorithms are executed upon our decentralized dynamic scheduling runtime system, which schedules a task graph dynamically and transfers data between compute nodes automatically. The runtime system uses a new distributed task assignment protocol to solve data dependencies between tasks without any coordination between processing units. By overlapping computation and communication through dynamic scheduling, we are able to attain scalable performance for the double-precision Cholesky factorization and QR factorization. Finally, our approach demonstrates a performance comparable to Intel MKL on shared-memory multicore systems and better performance than both vendor (e.g., Intel MKL) and open source libraries (e.g., StarPU) in the following three environments: heterogeneous clusters with GPUs, conventional clusters without GPUs, and shared-memory systems with multiple GPUs.
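To show what "tile algorithm" means for one of the factorizations named above, here is a serial numpy/scipy sketch of a tiled Cholesky with the usual POTRF/TRSM/SYRK-style tile updates. The dynamic scheduling, CPU-GPU distribution and communication hiding that the paper is actually about are not modeled here.

```python
import numpy as np
from scipy.linalg import solve_triangular

def tiled_cholesky(A, nb):
    """Lower-triangular Cholesky of an SPD matrix A using nb x nb tiles (in-place, serial)."""
    n = A.shape[0]
    assert n % nb == 0
    for k in range(0, n, nb):
        kk = slice(k, k + nb)
        A[kk, kk] = np.linalg.cholesky(A[kk, kk])                  # POTRF on the diagonal tile
        for i in range(k + nb, n, nb):
            ii = slice(i, i + nb)
            # TRSM: A[i,k] <- A[i,k] * L[k,k]^{-T}
            A[ii, kk] = solve_triangular(A[kk, kk], A[ii, kk].T, lower=True).T
        for i in range(k + nb, n, nb):
            ii = slice(i, i + nb)
            for j in range(k + nb, i + nb, nb):
                jj = slice(j, j + nb)
                A[ii, jj] -= A[ii, kk] @ A[jj, kk].T               # SYRK/GEMM trailing update
    return np.tril(A)

rng = np.random.default_rng(0)
M = rng.standard_normal((8, 8))
A = M @ M.T + 8 * np.eye(8)
L = tiled_cholesky(A.copy(), nb=2)
print("max error:", np.abs(L @ L.T - A).max())
```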
Getting seamless care right from the beginning - integrating computers into the human interaction.
Pearce, Christopher; Kumarpeli, Pushpa; de Lusignan, Simon
2010-01-01
The digital age is coming to the health space, behind many other fields of society. In part this is because health remains heavily reliant on human interaction. The doctor-patient relationship remains a significant factor in determining patient outcomes. Whilst there are many benefits to E-Health, there are also significant risks if computers are not adequately integrated into this interaction and accurate data are consequently not available on the patient's journey through the health system. Video analysis of routine clinical consultations was carried out in Australian and UK primary care. We analyzed 308 consultations (141 and 167, respectively) from these systems, with an emphasis on how the consultation starts. Australian consultations have a mean duration of 12.7 mins, UK 11.8 mins. In both countries around 7% of consultations are computer-initiated. Where doctors engaged with computer use, the patient observed the computer screen much more and better records were produced. However, there was suboptimal engagement, poor records and no coding in around 20% of consultations. How the computer is used at the start of the consultation can set the scene for an effective interaction or reflect disengagement from technology and the creation of poor records.
Unique Applications for Artificial Neural Networks. Phase 1
1991-08-08
significance. For the VRP, a problem that has received considerable attention in the literature, the new NGO-VRP methodology generates better solutions...represent the stop assignments of each route. The effect of the genetic recombinations is to make simple local exchanges to the relative positions of the...technique for representing a computer-based associative memory [Arbib, 1987]. In our routing system, the basic job of the neural network system is to accept
Systems Engineering Building Advances Power Grid Research
Virden, Jud; Huang, Henry; Skare, Paul; Dagle, Jeff; Imhoff, Carl; Stoustrup, Jakob; Melton, Ron; Stiles, Dennis; Pratt, Rob
2018-01-16
Researchers and industry are now better equipped to tackle the nation's most pressing energy challenges through PNNL's new Systems Engineering Building, including challenges in grid modernization, buildings efficiency and renewable energy integration. This lab links real-time grid data, software platforms, specialized laboratories and advanced computing resources for the design and demonstration of new tools to modernize the grid and increase buildings energy efficiency.
2009-09-01
problems, to better model the problem solving of computer systems. This research brought about the intertwining of AI and cognitive psychology. Much of...where symbol sequences are sequential intelligent states of the network, and must be classified as normal, abnormal, or unknown. These symbols...is associated with abnormal behavior; and abcbc is associated with unknown behavior, as it fits no known behavior. Predicted outcomes from
The Modeling, Simulation and Comparison of Interconnection Networks for Parallel Processing.
1987-12-01
performs better at a lower hardware cost than do the single stage cube and mesh networks. As a result, the designer of a parallel processing system is...attempted, and in most cases succeeded, in designing and implementing faster, more powerful systems. Due to design innovations and technological advances...largely to the computational complexity of the algorithms executed. In the von Neumann machine, instructions must be executed in a sequential manner. Design
Flexible solution for interoperable cloud healthcare systems.
Vida, Mihaela Marcella; Lupşe, Oana Sorina; Stoicu-Tivadar, Lăcrămioara; Bernad, Elena
2012-01-01
It is extremely important for the healthcare domain to have standardized communication, because it will improve the quality of information and, in the end, the resulting benefits will improve the quality of patients' lives. The standards proposed to be used are HL7 CDA and CCD. For better access to the medical data, a solution based on cloud computing (CC) is investigated. CC is a technology that supports flexibility, seamless care, and reduced costs of the medical act. To ensure interoperability between healthcare information systems, a solution creating a Web Custom Control is presented. The control shows the database tables and fields used to configure the two standards. This control will facilitate the work of the medical staff and hospital administrators, because they can configure the local system easily and prepare it for communication with other systems. The resulting information will have a higher quality and will provide knowledge that will support better patient management and diagnosis.
Tangible display systems: direct interfaces for computer-based studies of surface appearance
NASA Astrophysics Data System (ADS)
Darling, Benjamin A.; Ferwerda, James A.
2010-02-01
When evaluating the surface appearance of real objects, observers engage in complex behaviors involving active manipulation and dynamic viewpoint changes that allow them to observe the changing patterns of surface reflections. We are developing a class of tangible display systems to provide these natural modes of interaction in computer-based studies of material perception. A first-generation tangible display was created from an off-the-shelf laptop computer containing an accelerometer and webcam as standard components. Using these devices, custom software estimated the orientation of the display and the user's viewing position. This information was integrated with a 3D rendering module so that rotating the display or moving in front of the screen would produce realistic changes in the appearance of virtual objects. In this paper, we consider the design of a second-generation system to improve the fidelity of the virtual surfaces rendered to the screen. With a high-quality display screen and enhanced tracking and rendering capabilities, a second-generation system will be better able to support a range of appearance perception applications.
Supporting medical communication for older patients with a shared touch-screen computer.
Piper, Anne Marie; Hollan, James D
2013-11-01
Increasingly health care facilities are adopting electronic medical record systems and installing computer workstations in patient exam rooms. The introduction of computer workstations into the medical interview process makes it important to consider the impact of such technology on older patients as well as new types of interfaces that may better suit the needs of older adults. While many older adults are comfortable with a traditional computer workstation with a keyboard and mouse, this article explores how a large horizontal touch-screen (i.e., a surface computer) may suit the needs of older patients and facilitates the doctor-patient interview process. Twenty older adults (age 60 to 88) used a prototype multiuser, multitouch system in our research laboratory to examine seven health care scenarios. Behavioral observations as well as results from questionnaires and a structured interview were analyzed. The older adults quickly adapted to the prototype system and reported that it was easy to use. Participants also suggested that having a shared view of one's medical records, especially charts and images, would enhance communication with their doctor and aid understanding. While this study is exploratory and some areas of interaction with a surface computer need to be refined, the technology is promising for sharing electronic patient information during medical interviews involving older adults. Future work must examine doctors' and nurses' interaction with the technology as well as logistical issues of installing such a system in a real world medical setting. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
Deep Learning in Medical Imaging: General Overview.
Lee, June-Goo; Jun, Sanghoon; Cho, Young-Won; Lee, Hyunna; Kim, Guk Bae; Seo, Joon Beom; Kim, Namkug
2017-01-01
The artificial neural network (ANN)-a machine learning technique inspired by the human neuronal synapse system-was introduced in the 1950s. However, the ANN was previously limited in its ability to solve actual problems, due to the vanishing gradient and overfitting problems with training of deep architectures, lack of computing power, and primarily the absence of sufficient data to train the computer system. Interest in this concept has lately resurfaced, due to the availability of big data, enhanced computing power with current graphics processing units, and novel algorithms to train deep neural networks. Recent studies on this technology suggest its potential to perform better than humans in some visual and auditory recognition tasks, which may portend its applications in medicine and healthcare, especially in medical imaging, in the foreseeable future. This review article offers perspectives on the history, development, and applications of deep learning technology, particularly regarding its applications in medical imaging.
On Study of Application of Big Data and Cloud Computing Technology in Smart Campus
NASA Astrophysics Data System (ADS)
Tang, Zijiao
2017-12-01
We live in an era of networks and information, in which we produce and face large amounts of data every day. Traditional databases cannot easily store, process and analyze such mass data, and big data technology was born to meet this need. Meanwhile, the development and operation of big data rest on cloud computing, which provides sufficient space and resources to process and analyze the data handled by big data technology. Nowadays, the proposal of smart campus construction aims at improving the process of information construction in colleges and universities; therefore it is necessary to consider combining big data technology and cloud computing technology into the construction of a smart campus, to make the campus database system and campus management system mutually combined rather than isolated, and to serve smart campus construction through integrating, storing, processing and analyzing mass data.
Mallick, Zulquernain
2007-01-01
The last 20 years have seen a tremendous growth in mobile computing and wireless communications and services. An experimental study was conducted to explore the effect of text/background color on a laptop computing system along with variable environmental vibration on operators' data entry task performance in moving automobiles. The operators' performance was measured in terms of the number of characters entered per minute without spaces (NCEPMWS) on a laptop computing system. The subjects were divided into 3 categories, namely, Novices, Intermediates and Experts. Findings suggest a re-evaluation of existing laptop designs taking ergonomics into consideration. It appears that proper selection of text/background color on the laptop coupled with controlled vehicular speed could result in a better quality of interaction between human and laptops and it could also resolve the problem of poor data entry task performance.
Enabling Wide-Scale Computer Science Education through Improved Automated Assessment Tools
NASA Astrophysics Data System (ADS)
Boe, Bryce A.
There is a proliferating demand for newly trained computer scientists as the number of computer science related jobs continues to increase. University programs will only be able to train enough new computer scientists to meet this demand when two things happen: when there are more primary and secondary school students interested in computer science, and when university departments have the resources to handle the resulting increase in enrollment. To meet these goals, significant effort is being made to both incorporate computational thinking into existing primary school education, and to support larger university computer science class sizes. We contribute to this effort through the creation and use of improved automated assessment tools. To enable wide-scale computer science education we do two things. First, we create a framework called Hairball to support the static analysis of Scratch programs targeted for fourth, fifth, and sixth grade students. Scratch is a popular building-block language utilized to pique interest in and teach the basics of computer science. We observe that Hairball allows for rapid curriculum alterations and thus contributes to wide-scale deployment of computer science curriculum. Second, we create a real-time feedback and assessment system utilized in university computer science classes to provide better feedback to students while reducing assessment time. Insights from our analysis of student submission data show that modifications to the system configuration support the way students learn and progress through course material, making it possible for instructors to tailor assignments to optimize learning in growing computer science classes.
FAST: framework for heterogeneous medical image computing and visualization.
Smistad, Erik; Bozorgi, Mohammadmehdi; Lindseth, Frank
2015-11-01
Computer systems are becoming increasingly heterogeneous in the sense that they consist of different processors, such as multi-core CPUs and graphic processing units. As the amount of medical image data increases, it is crucial to exploit the computational power of these processors. However, this is currently difficult due to several factors, such as driver errors, processor differences, and the need for low-level memory handling. This paper presents a novel FrAmework for heterogeneouS medical image compuTing and visualization (FAST). The framework aims to make it easier to simultaneously process and visualize medical images efficiently on heterogeneous systems. FAST uses common image processing programming paradigms and hides the details of memory handling from the user, while enabling the use of all processors and cores on a system. The framework is open-source, cross-platform and available online. Code examples and performance measurements are presented to show the simplicity and efficiency of FAST. The results are compared to the insight toolkit (ITK) and the visualization toolkit (VTK) and show that the presented framework is faster with up to 20 times speedup on several common medical imaging algorithms. FAST enables efficient medical image computing and visualization on heterogeneous systems. Code examples and performance evaluations have demonstrated that the toolkit is both easy to use and performs better than existing frameworks, such as ITK and VTK.
Bang, Magnus; Timpka, Toomas
2007-06-01
Co-located teams often use material objects to communicate messages in collaboration. Modern desktop computing systems with abstract graphical user interface (GUIs) fail to support this material dimension of inter-personal communication. The aim of this study is to investigate how tangible user interfaces can be used in computer systems to better support collaborative routines among co-located clinical teams. The semiotics of physical objects used in team collaboration was analyzed from data collected during 1 month of observations at an emergency room. The resulting set of communication patterns was used as a framework when designing an experimental system. Following the principles of augmented reality, physical objects were mapped into a physical user interface with the goal of maintaining the symbolic value of those objects. NOSTOS is an experimental ubiquitous computing environment that takes advantage of interaction devices integrated into the traditional clinical environment, including digital pens, walk-up displays, and a digital desk. The design uses familiar workplace tools to function as user interfaces to the computer in order to exploit established cognitive and collaborative routines. Paper-based tangible user interfaces and digital desks are promising technologies for co-located clinical teams. A key issue that needs to be solved before employing such solutions in practice is associated with limited feedback from the passive paper interfaces.
Computational needs survey of NASA automation and robotics missions. Volume 1: Survey and results
NASA Technical Reports Server (NTRS)
Davis, Gloria J.
1991-01-01
NASA's operational use of advanced processor technology in space systems lags behind its commercial development by more than eight years. One of the factors contributing to this is that mission computing requirements are frequently unknown, unstated, misrepresented, or simply not available in a timely manner. NASA must provide clear common requirements to make better use of available technology, to cut development lead time on deployable architectures, and to increase the utilization of new technology. A preliminary set of advanced mission computational processing requirements of automation and robotics (A&R) systems are provided for use by NASA, industry, and academic communities. These results were obtained in an assessment of the computational needs of current projects throughout NASA. The high percent of responses indicated a general need for enhanced computational capabilities beyond the currently available 80386 and 68020 processor technology. Because of the need for faster processors and more memory, 90 percent of the polled automation projects have reduced or will reduce the scope of their implementation capabilities. The requirements are presented with respect to their targeted environment, identifying the applications required, system performance levels necessary to support them, and the degree to which they are met with typical programmatic constraints. Volume one includes the survey and results. Volume two contains the appendixes.
Computational needs survey of NASA automation and robotics missions. Volume 2: Appendixes
NASA Technical Reports Server (NTRS)
Davis, Gloria J.
1991-01-01
NASA's operational use of advanced processor technology in space systems lags behind its commercial development by more than eight years. One of the factors contributing to this is the fact that mission computing requirements are frequently unknown, unstated, misrepresented, or simply not available in a timely manner. NASA must provide clear common requirements to make better use of available technology, to cut development lead time on deployable architectures, and to increase the utilization of new technology. Here, NASA, industry and academic communities are provided with a preliminary set of advanced mission computational processing requirements of automation and robotics (A and R) systems. The results were obtained in an assessment of the computational needs of current projects throughout NASA. The high percent of responses indicated a general need for enhanced computational capabilities beyond the currently available 80386 and 68020 processor technology. Because of the need for faster processors and more memory, 90 percent of the polled automation projects have reduced or will reduce the scope of their implemented capabilities. The requirements are presented with respect to their targeted environment, identifying the applications required, system performance levels necessary to support them, and the degree to which they are met with typical programmatic constraints. Here, appendixes are provided.
Bayesian Treed Calibration: An Application to Carbon Capture With AX Sorbent
DOE Office of Scientific and Technical Information (OSTI.GOV)
Konomi, Bledar A.; Karagiannis, Georgios; Lai, Kevin
2017-01-02
In cases where field or experimental measurements are not available, computer models can model real physical or engineering systems to reproduce their outcomes. They are usually calibrated in light of experimental data to create a better representation of the real system. Statistical methods, based on Gaussian processes, for calibration and prediction have been especially important when the computer models are expensive and experimental data limited. In this paper, we develop the Bayesian treed calibration (BTC) as an extension of standard Gaussian process calibration methods to deal with non-stationary computer models and/or their discrepancy from the field (or experimental) data. Our proposed method partitions both the calibration and observable input space, based on a binary tree partitioning, into sub-regions where existing model calibration methods can be applied to connect a computer model with the real system. The estimation of the parameters in the proposed model is carried out using Markov chain Monte Carlo (MCMC) computational techniques. Different strategies have been applied to improve mixing. We illustrate our method in two artificial examples and a real application that concerns the capture of carbon dioxide with AX amine based sorbents. The source code and the examples analyzed in this paper are available as part of the supplementary materials.
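A heavily simplified, hedged sketch in the spirit of treed Gaussian-process calibration is given below: the observable input space is split at one point, an independent GP emulator of the computer model is fit in each region, and the calibration parameter is chosen to minimize the misfit to field data. The real BTC method infers the tree, a discrepancy term and all parameters jointly by MCMC, which this toy omits; all data, the simulator and the split location are invented.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

def simulator(x, theta):               # stand-in computer model with calibration input theta
    return np.sin(3 * x) * theta + (x > 0.5) * 0.8 * theta ** 2

# Computer-model runs at design points (x, theta)
Xd = rng.uniform(0, 1, (120, 2))       # columns: observable x, calibration theta in [0, 1]
yd = simulator(Xd[:, 0], Xd[:, 1])

# Field observations generated with an unknown "true" theta = 0.6 (to be recovered)
xf = rng.uniform(0, 1, 30)
yf = simulator(xf, 0.6) + rng.normal(0, 0.02, xf.size)

# Treed emulator: one split of the observable space at x = 0.5 (assumed), one GP per leaf
split = 0.5
leaves = {}
for name, mask in [("left", Xd[:, 0] <= split), ("right", Xd[:, 0] > split)]:
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=[0.2, 0.2]), normalize_y=True)
    leaves[name] = gp.fit(Xd[mask], yd[mask])

def emulate(x, theta):
    gp = leaves["left"] if x <= split else leaves["right"]
    return gp.predict(np.array([[x, theta]]))[0]

# Calibrate: pick theta minimizing the squared misfit of the emulator to the field data
thetas = np.linspace(0, 1, 101)
sse = [sum((emulate(x, t) - y) ** 2 for x, y in zip(xf, yf)) for t in thetas]
print("estimated theta:", thetas[int(np.argmin(sse))])
```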
NASA Astrophysics Data System (ADS)
Kisi, Ozgur; Shiri, Jalal
2012-06-01
Estimating the sediment volume carried by a river is an important issue in water resources engineering. This paper compares the accuracy of three different soft computing methods, Artificial Neural Networks (ANNs), Adaptive Neuro-Fuzzy Inference System (ANFIS), and Gene Expression Programming (GEP), in estimating daily suspended sediment concentration in rivers by using hydro-meteorological data. The daily rainfall, streamflow and suspended sediment concentration data from the Eel River near Dos Rios, California, USA are used as a case study. The comparison results indicate that the GEP model performs better than the other models in daily suspended sediment concentration estimation for the particular data sets used in this study. Levenberg-Marquardt, conjugate gradient and gradient descent training algorithms were used for the ANN models. Of the three algorithms, the conjugate gradient algorithm was found to perform best.
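The sketch below illustrates the general comparison workflow (fit data-driven models on hydro-meteorological predictors, score held-out sediment estimates by RMSE) on synthetic data; it does not use the Eel River records or the exact ANN/ANFIS/GEP configurations of the study.

```python
# Hedged sketch of the model-comparison workflow: fit data-driven models on
# rainfall/streamflow predictors and score them on held-out suspended sediment
# concentrations. Data and models are synthetic stand-ins.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
rain = rng.gamma(2.0, 5.0, 1000)
flow = 10 + 3 * rain + rng.normal(0, 2, 1000)
sediment = 0.5 * flow**1.2 + rng.normal(0, 5, 1000)   # synthetic rating-curve-like relation

X = np.column_stack([rain, flow])
X_tr, X_te, y_tr, y_te = train_test_split(X, sediment, random_state=0)

models = {
    "linear": LinearRegression(),
    "mlp": make_pipeline(StandardScaler(),
                         MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    rmse = mean_squared_error(y_te, model.predict(X_te)) ** 0.5
    print(f"{name}: RMSE = {rmse:.2f}")
```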
NASA Technical Reports Server (NTRS)
Ives, R. E.
1982-01-01
A thermal monitoring and control concept is described for a volatile condensable materials (VCM) test apparatus where electric resistance heaters are employed. The technique is computer based, but requires only proportioning ON/OFF relay control signals supplied through a programmable scanner and simple quadrac power controllers. System uniqueness is derived from automatic temperature measurements and the averaging of these measurements in discrete overlapping temperature zones. Overall control tolerance proves to be better than ±0.5 °C from room ambient temperature to 150 °C. Using precisely calibrated thermocouples, the method provides excellent temperature control of a small copper VCM heating plate at 125 ± 0.2 °C over a 24 hr test period. For purposes of unattended operation, the programmable computer/controller provides a continual data printout of system operation. Real time operator command is also provided for, as is automatic shutdown of the system and operator alarm in the event of malfunction.
Cyber physical systems role in manufacturing technologies
NASA Astrophysics Data System (ADS)
Al-Ali, A. R.; Gupta, Ragini; Nabulsi, Ahmad Al
2018-04-01
Empowered by recent developments in single System-on-Chip, Internet of Things, and cloud computing technologies, cyber physical systems are evolving as a major controller during and after the product manufacturing process. In addition to their real physical space, cyber products nowadays have a virtual space. A product's virtual space is a digital twin attached to it that enables manufacturers and their clients to better manufacture, monitor, maintain and operate it throughout its life cycle, i.e. from the product manufacturing date, through operation, to the end of its lifespan. Each product is equipped with a tiny microcontroller that has a unique identification number, access code and WiFi connectivity so it can be accessed anytime and anywhere during its life cycle. This paper presents the cyber physical systems architecture and its role in manufacturing. It also highlights the role of Internet of Things and cloud computing in industrial manufacturing and factory automation.
Silvey, Garry M.; Macri, Jennifer M.; Lee, Paul P.; Lobach, David F.
2005-01-01
New mobile computing devices including personal digital assistants (PDAs) and tablet computers have emerged to facilitate data collection at the point of care. Unfortunately, little research has been reported regarding which device is optimal for a given care setting. In this study we created and compared functionally identical applications on a Palm operating system-based PDA and a Windows-based tablet computer for point-of-care documentation of clinical observations by eye care professionals when caring for patients with diabetes. Eye-care professionals compared the devices through focus group sessions and through validated usability surveys. We found that the application on the tablet computer was preferred over the PDA for documenting the complex data related to eye care. Our findings suggest that the selection of a mobile computing platform depends on the amount and complexity of the data to be entered; the tablet computer functions better for high volume, complex data entry, and the PDA, for low volume, simple data entry. PMID:16779128
Bush, Peter W.
1986-01-01
The Edwards-Trinity regional aquifer-system analysis project, begun in October 1985 and scheduled to be completed by October 1991, is one of a series of similar projects being conducted nationwide. The project is intended to define the hydrogeologic framework, and to describe the geochemistry and groundwater flow of the aquifer system in order to provide a better understanding of the system's long-term water-yielding potential. A multidisciplinary approach will be used in which computer-based digital simulation of flow in the system will be the principal method of hydrogeologic investigation.
Computer games and fine motor skills.
Borecki, Lukasz; Tolstych, Katarzyna; Pokorski, Mieczyslaw
2013-01-01
The study seeks to determine the influence of computer games on fine motor skills in young adults, an area of incomplete understanding and verification. We hypothesized that computer gaming could have a positive influence on basic motor skills, such as precision, aiming, speed, dexterity, or tremor. We examined 30 habitual game users (F/M - 3/27; age range 20-25 years) of the highly interactive game Counter Strike, in which players impersonate soldiers on a battlefield, and 30 age- and gender-matched subjects who declared never to play games. Selected tests from the Vienna Test System were used to assess fine motor skills and tremor. The results demonstrate that the game users scored appreciably better than the control subjects in all tests employed. In particular, the players did significantly better in the precision of arm-hand movements, as expressed by a lower time of errors, 1.6 ± 0.6 vs. 2.8 ± 0.6 s, a lower error rate, 13.6 ± 0.3 vs. 20.4 ± 2.2, and a shorter total time of performing a task, 14.6 ± 2.9 vs. 32.1 ± 4.5 s in non-players, respectively; p < 0.001 all. The findings demonstrate a positive influence of computer games on psychomotor functioning. We submit that playing computer games may be a useful training tool to increase fine motor skills and movement coordination.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lanza, Mathieu; Lique, François, E-mail: francois.lique@univ-lehavre.fr
The determination of hyperfine structure resolved excitation cross sections and rate coefficients due to H₂ collisions is required to interpret astronomical spectra. In this paper, we present several theoretical approaches to compute these data. An almost exact recoupling approach and approximate sudden methods are presented. We apply these different approaches to the HCl–H₂ collisional system in order to evaluate their respective accuracy. HCl–H₂ hyperfine structure resolved cross sections and rate coefficients are then computed using recoupling and approximate sudden methods. As expected, the approximate sudden approaches are more accurate when the collision energy increases, and the results suggest that these approaches work better for the para-H₂ than for the ortho-H₂ colliding partner. For the first time, we present HCl–H₂ hyperfine structure resolved rate coefficients, computed here for temperatures ranging from 5 to 300 K. The usual Δj₁ = ΔF₁ propensity rules are observed for the hyperfine transitions. The new rate coefficients will significantly help the interpretation of interstellar HCl emission lines observed with current and future telescopes. We expect that these new data will allow a better determination of the HCl abundance in the interstellar medium, which is crucial to understanding interstellar chlorine chemistry.
Weber, Gerhard-Wilhelm; Ozöğür-Akyüz, Süreyya; Kropat, Erik
2009-06-01
An emerging research area in computational biology and biotechnology is devoted to mathematical modeling and prediction of gene-expression patterns; it nowadays calls on mathematics to deeply understand its foundations. This article surveys data mining and machine learning methods for the analysis of complex systems in computational biology. It mathematically deepens recent advances in modeling and prediction by rigorously introducing the environment and aspects of errors and uncertainty into the genetic context within the framework of matrix and interval arithmetics. Given the data from DNA microarray experiments and environmental measurements, we extract nonlinear ordinary differential equations which contain parameters that are to be determined. This is done by a generalized Chebychev approximation and generalized semi-infinite optimization. Then, time-discretized dynamical systems are studied. By a combinatorial algorithm which constructs and follows polyhedra sequences, the region of parametric stability is detected. In addition, we analyze the topological landscape of gene-environment networks in terms of structural stability. As a second strategy, we review recent model selection and kernel learning methods for binary classification which can be used to classify microarray data for cancerous cells or for discrimination of other kinds of diseases. This review is practically motivated and theoretically elaborated; it is devoted to a contribution to better health care, progress in medicine, better education, and more healthy living conditions.
Exploiting Analytics Techniques in CMS Computing Monitoring
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bonacorsi, D.; Kuznetsov, V.; Magini, N.
The CMS experiment has collected an enormous volume of metadata about its computing operations in its monitoring systems, describing its experience in operating all of the CMS workflows on all of the Worldwide LHC Computing Grid Tiers. Data mining efforts on all this information have rarely been undertaken, but are of crucial importance for a better understanding of how CMS carried out successful operations, and to reach an adequate and adaptive modelling of the CMS operations, in order to allow detailed optimizations and eventually a prediction of system behaviours. These data are now streamed into the CERN Hadoop data cluster for further analysis. Specific sets of information (e.g. data on how many replicas of datasets CMS wrote on disks at WLCG Tiers, data on which datasets were primarily requested for analysis, etc.) were collected on Hadoop and processed with MapReduce applications profiting from the parallelization on the Hadoop cluster. We present the implementation of new monitoring applications on Hadoop, and discuss the new possibilities in CMS computing monitoring introduced with the ability to quickly process big data sets from multiple sources, looking forward to a predictive modelling of the system.
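A toy illustration of the MapReduce pattern behind such monitoring applications is sketched below: map each metadata record to key-value pairs, then reduce by key. The record fields and keys are hypothetical, and the production jobs run on the CERN Hadoop cluster rather than in a single Python process.

```python
# Illustrative sketch of the MapReduce pattern: map each metadata record to
# (key, value) pairs, then reduce by key. Record fields and keys are hypothetical.
from collections import defaultdict

records = [
    {"dataset": "/A/RECO", "site": "T2_US", "accesses": 12},
    {"dataset": "/A/RECO", "site": "T1_DE", "accesses": 3},
    {"dataset": "/B/AOD",  "site": "T2_US", "accesses": 7},
]

def map_phase(record):
    # Emit one (dataset, accesses) pair per record.
    yield record["dataset"], record["accesses"]

def reduce_phase(pairs):
    # Sum values for each key, like a Hadoop reducer.
    totals = defaultdict(int)
    for key, value in pairs:
        totals[key] += value
    return dict(totals)

pairs = [kv for rec in records for kv in map_phase(rec)]
print(reduce_phase(pairs))   # e.g. total accesses per dataset across all sites
```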
Putting Fun Back into Learning.
ERIC Educational Resources Information Center
Rao, Srikumar S.
1995-01-01
People will learn better if they like what they are learning. Computers offer an extensive library of cases, examples, and stories that are easy to access, fun to work through, and tell students what they want to know. One example is the ASK system, a 15-module, self-study, multimedia program that is fun for trainees to use, which should enhance…
The Application and Evaluation of PLATO IV in AF Technical Training.
ERIC Educational Resources Information Center
Mockovak, William P.; And Others
The Air Force has been plagued with the rising cost of technical training and has increasingly turned to computer-assisted instruction (CAI) for better cost effectiveness. Toward this aim a trial of PLATO IV, a CAI system utilizing a graphic display and centered at the University of Illinois, was initiated at the Chanute and Sheppard training…
ERIC Educational Resources Information Center
Osguthorpe, Russell T.; Li Chang, Linda
1988-01-01
A computerized symbol processor system using an Apple IIe computer and a Power Pad graphics tablet was tested with 22 nonspeaking, multiply disabled students. The students were taught to express themselves independently in writing, and they did significantly better than control students on measures of language comprehension and symbol recognition.…
Adventures in Modeling: Exploring Complex, Dynamic Systems with StarLogo.
ERIC Educational Resources Information Center
Colella, Vanessa Stevens; Klopfer, Eric; Resnick, Mitchel
For thousands of years people from da Vinci to Einstein have created models to help them better understand patterns and processes in the world around them. Computers make it easier for novices to build and explore their own models and learn new scientific ideas in the process. This book introduces teachers and students to designing, creating, and…
Wurdack, C M
1997-01-01
Computers are changing the way we do everything from paying our bills to programming our home entertainment systems. If you thought that dental education was not likely to benefit from computers, consider this: computer technology is revolutionizing dental instruction in ways that promise to improve the quality and efficiency of dental education. It is providing a challenging learning opportunity for dental educators as well. Since much of dental education involves the visual transfer of both concepts and procedures from the instructor to the student, it makes sense to use computer technology to enhance conventional teaching techniques--with materials that include clear, informative images and real-time demonstrations melding sound and animation--to deliver to the student in the classroom material that complements textbooks, 35mm slides, and the lecture format. The use of computers at UOP is about teaching students to be competent dentists by making instruction more direct, better visualized, and more comprehensible.
Zhao, Anbang; Ma, Lin; Ma, Xuefei; Hui, Juan
2017-02-20
In this paper, an improved azimuth angle estimation method with a single acoustic vector sensor (AVS) is proposed based on matched filtering theory. The proposed method is mainly applied in an active sonar detection system. Building on the conventional passive method based on complex acoustic intensity measurement, the mathematical and physical model of the proposed method is described in detail. The computer simulation and lake experiment results indicate that this method can realize azimuth angle estimation with high precision by using only a single AVS. Compared with the conventional method, the proposed method achieves better estimation performance. Moreover, the proposed method does not require complex operations in the frequency domain and achieves a reduction in computational complexity.
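A minimal sketch of the underlying idea, under assumed signal parameters, is given below: matched-filter the pressure and particle-velocity channels of an AVS with the known transmit waveform, then take the azimuth from the averaged acoustic intensity components. It is not the authors' exact algorithm.

```python
# Hedged sketch of intensity-based azimuth estimation with a single acoustic vector
# sensor. The signal model, waveform, and SNR below are synthetic assumptions.
import numpy as np

fs, f0, dur = 20_000, 2_000, 0.05
t = np.arange(0, dur, 1 / fs)
s = np.sin(2 * np.pi * f0 * t)                 # known transmitted pulse

true_az = np.deg2rad(35.0)
rng = np.random.default_rng(2)
noise = lambda: rng.normal(0, 0.5, t.size)

p  = s + noise()                               # pressure channel
vx = np.cos(true_az) * s + noise()             # particle velocity, x component
vy = np.sin(true_az) * s + noise()             # particle velocity, y component

def matched_filter(x, template):
    return np.correlate(x, template, mode="same")

pm, vxm, vym = (matched_filter(ch, s) for ch in (p, vx, vy))

# Time-averaged acoustic intensity components and azimuth estimate.
Ix, Iy = np.mean(pm * vxm), np.mean(pm * vym)
print("estimated azimuth (deg):", np.rad2deg(np.arctan2(Iy, Ix)))
```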
Lee, Cheens; Robinson, Kerin M; Wendt, Kate; Williamson, Dianne
The unimpeded functioning of hospital Health Information Services (HIS) is essential for patient care, clinical governance, organisational performance measurement, funding and research. In an investigation of hospital Health Information Services' preparedness for internal disasters, all hospitals in the state of Victoria with the following characteristics were surveyed: they have a Health Information Service/ Department; there is a Manager of the Health Information Service/Department; and their inpatient capacity is greater than 80 beds. Fifty percent of the respondents have experienced an internal disaster within the past decade, the majority affecting the Health Information Service. The most commonly occurring internal disasters were computer system failure and floods. Two-thirds of the hospitals have internal disaster plans; the most frequently occurring scenarios provided for are computer system failure, power failure and fire. More large hospitals have established back-up systems than medium- and small-size hospitals. Fifty-three percent of hospitals have a recovery plan for internal disasters. Hospitals typically self-rate as having a 'medium' level of internal disaster preparedness. Overall, large hospitals are better prepared for internal disasters than medium and small hospitals, and preparation for disruption of computer systems and medical record services is relatively high on their agendas.
Tzou, Chieh-Han John; Pona, Igor; Placheta, Eva; Hold, Alina; Michaelidou, Maria; Artner, Nicole; Kropatsch, Walter; Gerber, Hans; Frey, Manfred
2012-08-01
Since the implementation of the computer-aided system for assessing facial palsy in 1999 by Frey et al (Plast Reconstr Surg. 1999;104:2032-2039), no similar system that can make an objective, three-dimensional, quantitative analysis of facial movements has been marketed. This system has been in routine use since its launch, and it has proven to be reliable, clinically applicable, and therapeutically accurate. With the cooperation of international partners, more than 200 patients were analyzed. Recent developments in computer vision--mostly in the area of generative face models, applying active appearance models (and extensions), optical flow, and video tracking--have been successfully incorporated to automate the prototype system. Further market-ready development and a business partner will be needed to enable the production of this system to enhance clinical methodology in diagnostic and prognostic accuracy as a personalized therapy concept, leading to better results and higher quality of life for patients with impaired facial function.
Performance of the Wavelet Decomposition on Massively Parallel Architectures
NASA Technical Reports Server (NTRS)
El-Ghazawi, Tarek A.; LeMoigne, Jacqueline; Zukor, Dorothy (Technical Monitor)
2001-01-01
Traditionally, Fourier Transforms have been utilized for performing signal analysis and representation. But although it is straightforward to reconstruct a signal from its Fourier transform, no local description of the signal is included in its Fourier representation. To alleviate this problem, Windowed Fourier transforms and then wavelet transforms have been introduced, and it has been proven that wavelets give a better localization than traditional Fourier transforms, as well as a better division of the time- or space-frequency plane than Windowed Fourier transforms. Because of these properties and after the development of several fast algorithms for computing the wavelet representation of any signal, in particular the Multi-Resolution Analysis (MRA) developed by Mallat, wavelet transforms have increasingly been applied to signal analysis problems, especially real-life problems, in which speed is critical. In this paper we present and compare efficient wavelet decomposition algorithms on different parallel architectures. We report and analyze experimental measurements, using NASA remotely sensed images. Results show that our algorithms achieve significant performance gains on current high performance parallel systems, and meet scientific applications and multimedia requirements. The extensive performance measurements collected over a number of high-performance computer systems have revealed important architectural characteristics of these systems, in relation to the processing demands of the wavelet decomposition of digital images.
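A minimal sketch of the building block behind Mallat's MRA, a single-level 2-D Haar decomposition, is given below. A parallel implementation would distribute the row and column filtering across processors, which this serial numpy version does not attempt.

```python
# Minimal single-level 2-D Haar decomposition, the basic building block of
# multi-resolution analysis. Serial numpy illustration only.
import numpy as np

def haar_step_1d(x):
    """Return (approximation, detail) coefficients along the last axis."""
    even, odd = x[..., 0::2], x[..., 1::2]
    return (even + odd) / np.sqrt(2), (even - odd) / np.sqrt(2)

def haar_dwt2(image):
    # Filter along one axis, then the other, yielding the four sub-bands.
    lo, hi = haar_step_1d(image)
    ll, lh = haar_step_1d(lo.T)
    hl, hh = haar_step_1d(hi.T)
    return ll.T, lh.T, hl.T, hh.T

image = np.random.default_rng(3).random((8, 8))
ll, lh, hl, hh = haar_dwt2(image)
print(ll.shape, lh.shape, hl.shape, hh.shape)   # each sub-band is 4x4
```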
3D Displays And User Interface Design For A Radiation Therapy Treatment Planning CAD Tool
NASA Astrophysics Data System (ADS)
Mosher, Charles E.; Sherouse, George W.; Chaney, Edward L.; Rosenman, Julian G.
1988-06-01
The long term goal of the project described in this paper is to improve local tumor control through the use of computer-aided treatment design methods that can result in selection of better treatment plans compared with conventional planning methods. To this end, a CAD tool for the design of radiation treatment beams is described. Crucial to the effectiveness of this tool are high quality 3D display techniques. We have found that 2D and 3D display methods dramatically improve the comprehension of the complex spatial relationships between patient anatomy, radiation beams, and dose distributions. In order to take full advantage of these displays, an intuitive and highly interactive user interface was created. If the system is to be used by physicians unfamiliar with computer systems, it is essential that a user interface is incorporated that allows the user to navigate through each step of the design process in a manner similar to what they are used to. Compared with conventional systems, we believe our display and CAD tools will allow the radiotherapist to achieve more accurate beam targeting, leading to a better radiation dose configuration to the tumor volume. This would result in a reduction of the dose to normal tissue.
1984-09-01
Applications cited include: Deere and Company (assist in design of electronic systems for tractors, crawlers, graders, scrapers, etc.); Defense Contract Audit Agency (aid in developing and enhancing operational audits); DoD, Cameron Station (conduct affordability analyses; evaluate new start systems); DoD, Defense ... (document productivity gains; promotes better industry and customer relations by providing a common baseline or starting point for cost vs. performance comparisons).
Exploiting chaos for applications.
Ditto, William L; Sinha, Sudeshna
2015-09-01
We discuss how understanding the nature of chaotic dynamics allows us to control these systems. A controlled chaotic system can then serve as a versatile pattern generator that can be used for a range of applications. Specifically, we discuss the application of controlled chaos to the design of novel computational paradigms. Thus, we present an illustrative research arc, starting with ideas of control based on the general understanding of chaos and moving on to applications that influence the course of building better devices.
Shah, Dipali Yogesh; Wadekar, Swati Ishwara; Dadpe, Ashwini Manish; Jadhav, Ganesh Ranganath; Choudhary, Lalit Jayant; Kalra, Dheeraj Deepak
2017-01-01
Context and Aims: The purpose of this study was to compare and evaluate the shaping ability of the ProTaper (PT) and Self-Adjusting File (SAF) systems using cone-beam computed tomography (CBCT) to assess their performance in oval-shaped root canals. Materials and Methods: Sixty-two mandibular premolars with single oval canals were divided into two experimental groups (n = 31) according to the systems used: Group I – PT and Group II – SAF. Canals were evaluated before and after instrumentation using CBCT to assess centering ratio and canal transportation at three levels. Data were statistically analyzed using one-way analysis of variance, post hoc Tukey's test, and t-test. Results: The SAF showed better centering ability and less canal transportation than the PT only in the buccolingual plane at the 6 and 9 mm levels. The shaping ability of the PT was best in the apical third in both planes. The SAF had statistically significantly better centering and less canal transportation in the buccolingual as compared to the mesiodistal plane at the middle and coronal levels. Conclusions: The SAF produced significantly less transportation and remained more centered than the PT at the middle and coronal levels in the buccolingual plane of oval canals. In the mesiodistal plane, the performance of the two systems was comparable. PMID:28855757
Choi, Kup-Sze; Chan, Tak-Yin
2015-03-01
To investigate the feasibility of using a tablet device as a user interface for students with upper extremity disabilities to input mathematics efficiently into a computer. A touch-input system using a tablet device as the user interface was proposed to assist these students to write mathematics. User-switchable and context-specific keyboard layouts were designed to streamline the input process. The system could be integrated with conventional computer systems with only minor software setup. A two-week pre-post test study involving five participants was conducted to evaluate the performance of the system and collect user feedback. The mathematics input efficiency of the participants was found to improve during the experiment sessions. In particular, their performance in entering trigonometric expressions by using the touch-input system was significantly better than that by using conventional mathematics editing software with keyboard and mouse. The participants rated the touch-input system positively and were confident that they could operate it with ease given more practice. The proposed touch-input system provides a convenient way for students with hand impairment to write mathematics and has the potential to facilitate their mathematics learning. Implications for Rehabilitation: Students with upper extremity disabilities often face barriers to learning mathematics, which is largely based on handwriting. Conventional computer user interfaces are inefficient for them to input mathematics into a computer. A touch-input system with context-specific and user-switchable keyboard layouts was designed to improve the efficiency of mathematics input. Experimental results and user feedback suggested that the system has the potential to facilitate mathematics learning for these students.
Performance, Agility and Cost of Cloud Computing Services for NASA GES DISC Giovanni Application
NASA Astrophysics Data System (ADS)
Pham, L.; Chen, A.; Wharton, S.; Winter, E. L.; Lynnes, C.
2013-12-01
The NASA Goddard Earth Science Data and Information Services Center (GES DISC) is investigating the performance, agility and cost of Cloud computing for GES DISC applications. Giovanni (Geospatial Interactive Online Visualization ANd aNalysis Infrastructure), one of the core applications at the GES DISC for online climate-related Earth science data access, subsetting, analysis, visualization, and downloading, was used to evaluate the feasibility and effort of porting an application to the Amazon Cloud Services platform. The performance and the cost of running Giovanni on the Amazon Cloud were compared to similar parameters for the GES DISC local operational system. A Giovanni Time-Series analysis of aerosol absorption optical depth (388nm) from OMI (Ozone Monitoring Instrument)/Aura was selected for these comparisons. All required data were pre-cached in both the Cloud and local system to avoid data transfer delays. The 3-, 6-, 12-, and 24-month data were used for analysis on the Cloud and local system respectively, and the processing times for the analysis were used to evaluate system performance. To investigate application agility, Giovanni was installed and tested on multiple Cloud platforms. The cost of using a Cloud computing platform mainly consists of: computing, storage, data requests, and data transfer in/out. The Cloud computing cost is calculated based on the hourly rate, and the storage cost is calculated based on the rate of Gigabytes per month. Cost for incoming data transfer is free, and for data transfer out, the cost is based on the rate in Gigabytes. The costs for a local server system consist of buying hardware/software, system maintenance/updating, and operating cost. The results showed that the Cloud platform had a 38% better performance and cost 36% less than the local system. This investigation shows the potential of cloud computing to increase system performance and lower the overall cost of system management.
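A back-of-the-envelope sketch of the cost components listed above is given below; all rates and usage figures are hypothetical and do not reflect actual Amazon pricing or the GES DISC workload.

```python
# Sketch of the cloud cost components described above: hourly compute,
# per-GB-month storage, and per-GB data transfer out (inbound transfer free).
# All rates and usage figures are hypothetical assumptions.
def monthly_cloud_cost(compute_hours, hourly_rate,
                       storage_gb, storage_rate_gb_month,
                       transfer_out_gb, transfer_rate_gb):
    compute = compute_hours * hourly_rate
    storage = storage_gb * storage_rate_gb_month
    transfer = transfer_out_gb * transfer_rate_gb   # inbound assumed free
    return compute + storage + transfer

cloud = monthly_cloud_cost(compute_hours=720, hourly_rate=0.50,
                           storage_gb=2000, storage_rate_gb_month=0.03,
                           transfer_out_gb=500, transfer_rate_gb=0.09)

# Local system: amortized hardware/software plus maintenance and operating costs
# (hypothetical 3-year amortization and monthly figures).
local = 30000 / 36 + 400 + 250

print(f"cloud ${cloud:.0f}/month vs local ${local:.0f}/month")
```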
Performance evaluation of the Engineering Analysis and Data Systems (EADS) 2
NASA Technical Reports Server (NTRS)
Debrunner, Linda S.
1994-01-01
The Engineering Analysis and Data System (EADS) II (1) was installed in March 1993 to provide high performance computing for science and engineering at Marshall Space Flight Center (MSFC). EADS II increased the computing capabilities over the existing EADS facility in the areas of throughput and mass storage. EADS II includes a Vector Processor Compute System (VPCS), a Virtual Memory Compute System, a Common File System (CFS), a Common Output System (COS), as well as Image Processing Stations, Mini Super Computers, and Intelligent Workstations. These facilities are interconnected by a sophisticated network system. This work considers only the performance of the VPCS and the CFS. The VPCS is a Cray YMP. The CFS is implemented on an RS 6000 using the UniTree Mass Storage System. To better meet the science and engineering computing requirements, EADS II must be monitored, its performance analyzed, and appropriate modifications for performance improvement made. Implementing this approach requires tools to assist in performance monitoring and analysis. In Spring 1994, PerfStat 2.0 was purchased to meet these needs for the VPCS and the CFS. PerfStat (2) is a set of tools that can be used to analyze both historical and real-time performance data. Its flexible design allows significant user customization. The user identifies what data is collected, how it is classified, and how it is displayed for evaluation. Both graphical and tabular displays are supported. The capability of the PerfStat tool was evaluated, appropriate modifications to EADS II to optimize throughput and enhance productivity were suggested and implemented, and the effects of these modifications on system performance were observed. In this paper, the PerfStat tool is described, then its use with EADS II is outlined briefly. Next, the evaluation of the VPCS, as well as the modifications made to the system, is described. Finally, conclusions are drawn and recommendations for future work are outlined.
Integrating perioperative information from divergent sources.
Frost, Elizabeth A M
2012-01-01
The enormous diversity of physician practices, including specialists, and of patient requirements and comorbidities makes integration of appropriate perioperative information difficult. The lack of communicating computer systems adds to the difficulty of assembling data. Meta-analyses and evidence-based studies indicate that far too many tests are performed perioperatively. Guidelines for appropriate perioperative management have been formulated by several specialties. Education on current findings and requirements should be better communicated to surgeons, consultants, and patients to improve healthcare and at the same time decrease costs. Means of improving communication through interpersonal collaboration are outlined. © 2012 Mount Sinai School of Medicine.
Construction of In-house Databases in a Corporation
NASA Astrophysics Data System (ADS)
Okuda, Yasukazu; Yoshikawa, Ichirou; Sasano, Fumio
The authors describe the outline and the construction process of the in-house technical information system of Mitsui Petrochemical Industries Ltd., “MITOLIS”. This system was constructed in 1981 and has been improved since then to make better use of in-house technical reports. Bibliographic data and keywords of technical reports of R & D division are stored in the host computer system in Iwakuni and can be retrieved by the company members on the desk-side terminal connected to the local area network (LAN). The number of stored reports reaches 6100 from 1970 to 1987.
NASA Astrophysics Data System (ADS)
Clarke, L.
2017-12-01
Integrated assessment (IA) modeling and research has a long history, spanning over 30 years since its inception and addressing a wide range of contemporary issues along the way. Over the last decade, IA modeling and research has emerged as one of the primary analytical methods for understanding the complex interactions between human and natural systems, from the interactions between energy, water, and land/food systems to the interplay between health, climate, and air pollution. IA modeling and research is particularly well-suited for the analysis of these interactions because it is a discipline that strives to integrate representations of multiple systems into consistent computational platforms or frameworks. In doing so, it explicitly confronts the many tradeoffs that are frequently necessary to manage complexity and computational cost while still representing the most important interactions and overall, coupled system behavior. This talk explores the history of IA modeling and research as a means to better understand its role in the assessment of contemporary issues at the confluence of human and natural systems. It traces the evolution of IA modeling and research from initial exploration of long-term emissions pathways, to the role of technology in the global evolution of the energy system, to the key linkages between land and energy systems and, more recently, the linkages with water, air pollution, and other key systems and issues. It discusses the advances in modeling that have emerged over this evolution and the biggest challenges that still present themselves as we strive to better understand the most important interactions between human and natural systems and the implications of these interactions for human welfare and decision making.
Integrated Workforce Modeling System
NASA Technical Reports Server (NTRS)
Moynihan, Gary P.
2000-01-01
There are several computer-based systems, currently in various phases of development at KSC, which encompass some component, aspect, or function of workforce modeling. These systems may offer redundant capabilities and/or incompatible interfaces. A systems approach to workforce modeling is necessary in order to identify and better address user requirements. This research has consisted of two primary tasks. Task 1 provided an assessment of existing and proposed KSC workforce modeling systems for their functionality and applicability to the workforce planning function. Task 2 resulted in the development of a proof-of-concept design for a systems approach to workforce modeling. The model incorporates critical aspects of workforce planning, including hires, attrition, and employee development.
Vereecke, P J
1996-09-01
Computerization can no longer be improvised, and the management of computerization plans is now a matter of methods and approaches that leave nothing to chance. The main purpose of the empirical part of the present work is to determine, a priori, the attitude of nurses toward the implementation of computer support in the care units, which is expected in the short term. This provides pertinent information that will enable the people concerned to negotiate this change efficiently. For this study, we built a Likert attitude scale and integrated it into a questionnaire of nearly 40 questions, which was submitted to a stratified random sample of 100 nurses working at the Clinique St-Pierre in Ottignies (Belgium). After a presentation of that hospital's computing situation and of observations made in other hospitals, a detailed study protocol specifies all the features of the research. The results show that the nurses concerned are highly favorable to the implementation of computer support. The work sets out and develops numerous quantified findings, as well as relations and correlations between attitude and certain features of the sample. The realization of the objectives of any care institution, i.e., better nursing of patients and better management of care delivery, collides with the abundance and complexity of information, which is itself conveyed by numerous systems: nursing, medical, operational, and management information systems. Facing such a reality, which is sometimes complex and relatively little analysed, this work offers an outline reflection on a bibliographical basis, a conceptual approach that highlights some of the generalities of the Nursing Information System, as well as some of its specificities, brought to light mainly by the introduction of computer support. The reader will also find a conceptual pattern integrating the various interacting hospital information sub-systems, as well as a graphical representation of their information and overlap areas. The information system, which has become an essential notion in today's data processing, is at the heart of the running of any hospital. Although hardly perceptible as long as it is not materialized by a computerization project, the information system nonetheless represents a major stake coveted by all hospital contributors.
NASA Astrophysics Data System (ADS)
Yoon, Susan A.; Koehler-Yom, Jessica; Anderson, Emma; Lin, Joyce; Klopfer, Eric
2015-05-01
Background: This exploratory study is part of a larger-scale research project aimed at building theoretical and practical knowledge of complex systems in students and teachers with the goal of improving high school biology learning through professional development and a classroom intervention. Purpose: We propose a model of adaptive expertise to better understand teachers' classroom practices as they attempt to navigate myriad variables in the implementation of biology units that include working with computer simulations, and learning about and teaching through complex systems ideas. Sample: Research participants were three high school biology teachers, two females and one male, ranging in teaching experience from six to 16 years. Their teaching contexts also ranged in student achievement from 14-47% advanced science proficiency. Design and methods: We used a holistic multiple case study methodology and collected data during the 2011-2012 school year. Data sources include classroom observations, teacher and student surveys, and interviews. Data analyses and trustworthiness measures were conducted through qualitative mining of data sources and triangulation of findings. Results: We illustrate the characteristics of adaptive expertise of more or less successful teaching and learning when implementing complex systems curricula. We also demonstrate differences between case study teachers in terms of particular variables associated with adaptive expertise. Conclusions: This research contributes to scholarship on practices and professional development needed to better support teachers to teach through a complex systems pedagogical and curricular approach.
A computer vision system for the recognition of trees in aerial photographs
NASA Technical Reports Server (NTRS)
Pinz, Axel J.
1991-01-01
Increasing problems of forest damage in Central Europe set the demand for an appropriate forest damage assessment tool. The Vision Expert System (VES) is presented which is capable of finding trees in color infrared aerial photographs. Concept and architecture of VES are discussed briefly. The system is applied to a multisource test data set. The processing of this multisource data set leads to a multiple interpretation result for one scene. An integration of these results will provide a better scene description by the vision system. This is achieved by an implementation of Steven's correlation algorithm.
NASA Astrophysics Data System (ADS)
Mazurowski, Maciej A.; Zhang, Jing; Lo, Joseph Y.; Kuzmiak, Cherie M.; Ghate, Sujata V.; Yoon, Sora
2014-03-01
Providing high quality mammography education to radiology trainees is essential, as good interpretation skills potentially ensure the highest benefit of screening mammography for patients. We have previously proposed a computer-aided education system that utilizes trainee models, which relate human-assessed image characteristics to interpretation error. We proposed that these models be used to identify the most difficult and therefore the most educationally useful cases for each trainee. In this study, as a next step in our research, we propose to build trainee models that utilize features that are automatically extracted from images using computer vision algorithms. To predict error, we used a logistic regression which accepts imaging features as input and returns error as output. Reader data from 3 experts and 3 trainees were used. Receiver operating characteristic analysis was applied to evaluate the proposed trainee models. Our experiments showed that, for three trainees, our models were able to predict error better than chance. This is an important step in the development of adaptive computer-aided education systems since computer-extracted features will allow for faster and more extensive search of imaging databases in order to identify the most educationally beneficial cases.
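The sketch below illustrates the modeling idea on synthetic data: a logistic regression mapping computer-extracted image features to the probability of interpretation error, evaluated by ROC AUC. The features and labels are stand-ins for the reader-study data.

```python
# Hedged sketch of the trainee-model idea: logistic regression from image features
# to probability of interpretation error, evaluated with ROC AUC. Features and
# labels below are synthetic stand-ins, not the reader-study data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)
n = 300
features = rng.normal(size=(n, 5))            # e.g. density, margin, texture scores
logit = 1.5 * features[:, 0] - 1.0 * features[:, 2] - 0.5
error = (rng.uniform(size=n) < 1 / (1 + np.exp(-logit))).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(features, error, random_state=0)
model = LogisticRegression().fit(X_tr, y_tr)

auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
print(f"trainee model AUC: {auc:.2f}   (0.5 would be chance)")
```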
Fusing human and machine skills for remote robotic operations
NASA Technical Reports Server (NTRS)
Schenker, Paul S.; Kim, Won S.; Venema, Steven C.; Bejczy, Antal K.
1991-01-01
The question of how computer assists can improve teleoperator trajectory tracking during both free and force-constrained motions is addressed. Computer graphics techniques which enable the human operator to both visualize and predict detailed 3D trajectories in real-time are reported. Man-machine interactive control procedures for better management of manipulator contact forces and positioning are also described. It is found that collectively, these novel advanced teleoperations techniques both enhance system performance and significantly reduce control problems long associated with teleoperations under time delay. Ongoing robotic simulations of the 1984 space shuttle Solar Maximum EVA Repair Mission are briefly described.
NASA Technical Reports Server (NTRS)
Hale, Mark A.; Craig, James I.; Mistree, Farrokh; Schrage, Daniel P.
1995-01-01
Computing architectures are being assembled that extend concurrent engineering practices by providing more efficient execution and collaboration on distributed, heterogeneous computing networks. Built on the successes of initial architectures, requirements for a next-generation design computing infrastructure can be developed. These requirements concentrate on those needed by a designer in decision-making processes from product conception to recycling and can be categorized in two areas: design process and design information management. A designer both designs and executes design processes throughout design time to achieve better product and process capabilities while expending fewer resources. In order to accomplish this, information, or more appropriately design knowledge, needs to be adequately managed during product and process decomposition as well as recomposition. A foundation has been laid that captures these requirements in a design architecture called DREAMS (Developing Robust Engineering Analysis Models and Specifications). In addition, a computing infrastructure, called IMAGE (Intelligent Multidisciplinary Aircraft Generation Environment), is being developed that satisfies design requirements defined in DREAMS and incorporates enabling computational technologies.
The remote sensing image segmentation mean shift algorithm parallel processing based on MapReduce
NASA Astrophysics Data System (ADS)
Chen, Xi; Zhou, Liqing
2015-12-01
With the development of satellite remote sensing technology and the growth of remote sensing image data, traditional remote sensing image segmentation techniques cannot meet the processing and storage requirements of massive remote sensing imagery. This article applies cloud computing and parallel computing technology to the remote sensing image segmentation process and builds a cheap and efficient computer cluster that uses parallel processing to implement the MeanShift segmentation algorithm on the MapReduce model, ensuring segmentation quality, improving segmentation speed, and better meeting real-time requirements.
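A hedged sketch of the parallelization idea follows: tile the image, run mean shift clustering on each tile's pixel features independently (the map step), and collect per-tile label maps. The tile size and bandwidth are assumptions, scikit-learn's MeanShift stands in for the cluster implementation, and the merge/reduce step is omitted.

```python
# Hedged sketch: split an image into tiles (the "map" step) and run mean shift
# clustering on the pixel features of each tile independently. Tile size and
# bandwidth are assumptions; the reduce/merge step is application specific.
import numpy as np
from sklearn.cluster import MeanShift

def segment_tile(tile, bandwidth=0.2):
    h, w, c = tile.shape
    features = tile.reshape(-1, c)                    # per-pixel color features
    labels = MeanShift(bandwidth=bandwidth, bin_seeding=True).fit(features).labels_
    return labels.reshape(h, w)

rng = np.random.default_rng(5)
image = rng.random((64, 64, 3))

tile_size = 32
label_tiles = {}
for i in range(0, image.shape[0], tile_size):         # would be distributed by MapReduce
    for j in range(0, image.shape[1], tile_size):
        tile = image[i:i + tile_size, j:j + tile_size]
        label_tiles[(i, j)] = segment_tile(tile)

print({k: int(v.max()) + 1 for k, v in label_tiles.items()})   # clusters per tile
```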
NASA Technical Reports Server (NTRS)
Mulligan, Jeffrey B.
2017-01-01
A color algebra refers to a system for computing sums and products of colors, analogous to additive and subtractive color mixtures. The difficulty addressed here is the fact that, because of metamerism, we cannot know with certainty the spectrum that produced a particular color solely on the basis of sensory data. Knowledge of the spectrum is not required to compute additive mixture of colors, but is critical for subtractive (multiplicative) mixture. Therefore, we cannot predict with certainty the multiplicative interactions between colors based solely on sensory data. There are two potential applications of a color algebra: first, to aid modeling phenomena of human visual perception, such as color constancy and transparency; and, second, to provide better models of the interactions of lights and surfaces for computer graphics rendering.
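A toy numpy sketch of this distinction is given below: additive mixture can be computed directly on tristimulus values, while multiplicative mixture must be computed on the spectra before projecting. The three sensitivity curves are made-up Gaussians, not real colorimetric matching functions.

```python
# Toy sketch: additive mixture works directly on tristimulus (sensor) values,
# whereas multiplicative (subtractive) mixture must be computed on spectra.
# The "cone" sensitivity curves here are made-up Gaussians.
import numpy as np

wavelengths = np.linspace(400, 700, 61)

def gaussian(center, width):
    return np.exp(-0.5 * ((wavelengths - center) / width) ** 2)

sensitivities = np.stack([gaussian(c, 30) for c in (600, 550, 450)])   # toy L, M, S

def to_tristimulus(spectrum):
    return sensitivities @ spectrum

spec_a = gaussian(620, 40)          # reddish light
spec_b = gaussian(480, 40)          # bluish light

# Additive mixture: identical whether done on spectra or on tristimulus values.
print(np.allclose(to_tristimulus(spec_a + spec_b),
                  to_tristimulus(spec_a) + to_tristimulus(spec_b)))   # True

# Multiplicative mixture (e.g. light through a filter): requires the spectra;
# in general it cannot be recovered from the tristimulus values alone.
print(to_tristimulus(spec_a * spec_b))
```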
Teaching of real numbers by using the Archimedes-Cantor approach and computer algebra systems
NASA Astrophysics Data System (ADS)
Vorob'ev, Evgenii M.
2015-11-01
Computer technologies and especially computer algebra systems (CAS) allow students to overcome some of the difficulties they encounter in the study of real numbers. The teaching of calculus can be considerably more effective with the use of CAS provided the didactics of the discipline makes it possible to reveal the full computational potential of CAS. In the case of real numbers, the Archimedes-Cantor approach satisfies this requirement. The name of Archimedes brings back the exhaustion method. Cantor's name reminds us of the use of Cauchy rational sequences to represent real numbers. The usage of CAS with the Archimedes-Cantor approach enables the discussion of various representations of real numbers such as graphical, decimal, approximate decimal with precision estimates, and representation as points on a straight line. Exercises with numbers such as e, π, the golden ratio ϕ, and algebraic irrational numbers can help students better understand the real numbers. The Archimedes-Cantor approach also reveals a deep and close relationship between real numbers and continuity, in particular the continuity of functions.
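In the spirit of the Archimedes-Cantor approach, the sketch below represents e by a Cauchy sequence of rationals (partial sums of 1/k!) with an explicit precision bound, using exact rational arithmetic in plain Python rather than a CAS.

```python
# Sketch: represent e by a Cauchy sequence of rationals with an explicit
# precision bound, using exact rational arithmetic.
from fractions import Fraction

def e_approximation(n):
    """Rational partial sum s_n = sum_{k=0}^{n} 1/k! and an error bound.

    The remainder satisfies 0 < e - s_n < 1/(n * n!) for n >= 1.
    """
    term, total = Fraction(1), Fraction(1)
    for k in range(1, n + 1):
        term /= k
        total += term
    error_bound = Fraction(1, n) * term     # 1/(n * n!)
    return total, error_bound

approx, bound = e_approximation(10)
print(approx, "=", float(approx))
print("error <", float(bound))
```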
Acute acalculous cholecystitis in systemic lupus erythematosus: a rare initial manifestation.
Manuel, Valdano; Pedro, Gertrudes Maria; Cordeiro, Lemuel Bornelli; de Miranda, Sandra Maria da Rocha Neto
2016-01-01
Acute acalculous cholecystitis is a very rare gastrointestinal manifestation in systemic lupus erythematosus and is rarer still as an initial manifestation. Only two such cases have been reported. The authors report a 20-year-old black woman who presented with acute acalculous cholecystitis revealed by abdominal computed tomography. During hospitalization, she was diagnosed with systemic lupus erythematosus. Conservative treatment with antibiotics was performed with complete remission of the symptoms. Corticosteroid therapy was started in ambulatory care. Cholecystectomy has been the treatment of choice in acute acalculous cholecystitis as a complication of systemic lupus erythematosus. The patient responded well to conservative treatment, and surgery was not required. This case is unique in that corticosteroid therapy was started in ambulatory care. We should not forget that acute acalculous cholecystitis can be the initial presentation of systemic lupus erythematosus, although its occurrence is very rare. Conservative treatment should be considered. Abdominal computed tomography was a decisive exam for better assessment of the acute acalculous cholecystitis. Copyright © 2013 Elsevier Editora Ltda. All rights reserved.
Web-based system for surgical planning and simulation
NASA Astrophysics Data System (ADS)
Eldeib, Ayman M.; Ahmed, Mohamed N.; Farag, Aly A.; Sites, C. B.
1998-10-01
The growing scientific knowledge and rapid progress in medical imaging techniques has led to an increasing demand for better and more efficient methods of remote access to high-performance computer facilities. This paper introduces a web-based telemedicine project that provides interactive tools for surgical simulation and planning. The presented approach makes use of client-server architecture based on new internet technology where clients use an ordinary web browser to view, send, receive and manipulate patients' medical records while the server uses the supercomputer facility to generate online semi-automatic segmentation, 3D visualization, surgical simulation/planning and neuroendoscopic procedures navigation. The supercomputer (SGI ONYX 1000) is located at the Computer Vision and Image Processing Lab, University of Louisville, Kentucky. This system is under development in cooperation with the Department of Neurological Surgery, Alliant Health Systems, Louisville, Kentucky. The server is connected via a network to the Picture Archiving and Communication System at Alliant Health Systems through a DICOM standard interface that enables authorized clients to access patients' images from different medical modalities.
TRUSS: An intelligent design system for aircraft wings
NASA Technical Reports Server (NTRS)
Bates, Preston R.; Schrage, Daniel P.
1989-01-01
Competitive leadership in the international marketplace, superiority in national defense, excellence in productivity, and safety of both private and public systems are all national goals which are dependent on superior engineering design. In recent years, it has become more evident that early design decisions are critical and, when based only on performance, often result in products which are too expensive, hard to manufacture, or unsupportable. Better use of computer-aided design tools and information-based technologies is required to produce better quality United States products. A program is outlined here to explore the use of knowledge based expert systems coupled with numerical optimization, database management techniques, and designer interface methods in a networked design environment to improve and assess design changes due to changing emphasis or requirements. The initial structural design of a tiltrotor aircraft wing is used as a representative example to demonstrate the approach being followed.
NASA Astrophysics Data System (ADS)
Broccard, Frédéric D.; Joshi, Siddharth; Wang, Jun; Cauwenberghs, Gert
2017-08-01
Objective. Computation in nervous systems operates with different computational primitives, and on different hardware, than traditional digital computation and is thus subjected to different constraints from its digital counterpart regarding the use of physical resources such as time, space and energy. In an effort to better understand neural computation on a physical medium with similar spatiotemporal and energetic constraints, the field of neuromorphic engineering aims to design and implement electronic systems that emulate in very large-scale integration (VLSI) hardware the organization and functions of neural systems at multiple levels of biological organization, from individual neurons up to large circuits and networks. Mixed analog/digital neuromorphic VLSI systems are compact, consume little power and operate in real time independently of the size and complexity of the model. Approach. This article highlights the current efforts to interface neuromorphic systems with neural systems at multiple levels of biological organization, from the synaptic to the system level, and discusses the prospects for future biohybrid systems with neuromorphic circuits of greater complexity. Main results. Single silicon neurons have been interfaced successfully with invertebrate and vertebrate neural networks. This approach allowed the investigation of neural properties that are inaccessible with traditional techniques while providing a realistic biological context not achievable with traditional numerical modeling methods. At the network level, populations of neurons are envisioned to communicate bidirectionally with neuromorphic processors of hundreds or thousands of silicon neurons. Recent work on brain-machine interfaces suggests that this is feasible with current neuromorphic technology. Significance. Biohybrid interfaces between biological neurons and VLSI neuromorphic systems of varying complexity have started to emerge in the literature. Primarily intended as a computational tool for investigating fundamental questions related to neural dynamics, the sophistication of current neuromorphic systems now allows direct interfaces with large neuronal networks and circuits, resulting in potentially interesting clinical applications for neuroengineering systems, neuroprosthetics and neurorehabilitation.
Emission Measurements of Ultracell XX25 Reformed Methanol Fuel Cell System
2008-06-01
combination of electrochemical devices such as fuel cell and battery. Polymer electrolyte membrane fuel cells (PEMFC) using hydrogen or liquid ... communications and computers, sensors and night vision capabilities. High temperature PEMFC offers some advantages such as enhanced electrode kinetics and better ... tolerance of carbon monoxide that will poison the conventional PEMFC. Ultracell Corporation, Livermore, California has developed a first
Redefining Security. A Report by the Joint Security Commission
1994-02-28
security policies. This report offers recommendations on developing new strategies for achieving security within our information systems, including ... better, and we outline methods of improving government and industry personnel security policies. We offer recommendations on developing new strategies ... strategies, sufficient funding, and management attention if our computers and networks are to protect the confidentiality, integrity, and availability of
Integrating System Dynamics and Bayesian Networks with Application to Counter-IED Scenarios
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jarman, Kenneth D.; Brothers, Alan J.; Whitney, Paul D.
2010-06-06
The practice of choosing a single modeling paradigm for predictive analysis can limit the scope and relevance of predictions and their utility to decision-making processes. Considering multiple modeling methods simultaneously may improve this situation, but a better solution provides a framework for directly integrating different, potentially complementary modeling paradigms to enable more comprehensive modeling and predictions, and thus better-informed decisions. The primary challenges of this kind of model integration are to bridge language and conceptual gaps between modeling paradigms, and to determine whether natural and useful linkages can be made in a formal mathematical manner. To address these challenges in the context of two specific modeling paradigms, we explore mathematical and computational options for linking System Dynamics (SD) and Bayesian network (BN) models and incorporating data into the integrated models. We demonstrate that integrated SD/BN models can naturally be described as either state space equations or Dynamic Bayes Nets, which enables the use of many existing computational methods for simulation and data integration. To demonstrate, we apply our model integration approach to techno-social models of insurgent-led attacks and security force counter-measures centered on improvised explosive devices.
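A minimal sketch of the linkage is given below: a stock-flow System Dynamics model written as a discrete-time state update, with one rate parameter drawn each step from a small Bayesian-network-style conditional probability table. The variables are generic illustrations, not the counter-IED models of the report.

```python
# Hedged sketch of coupling a System Dynamics stock-flow update with a small
# Bayesian-network-style conditional probability table. Variables and numbers
# are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(6)

# BN fragment: P(high_threat | intel_alert) as a conditional probability table.
p_high_threat = {True: 0.7, False: 0.2}

def sample_attack_rate(intel_alert):
    high = rng.uniform() < p_high_threat[intel_alert]
    return 0.30 if high else 0.05          # attacks initiated per step

# SD state: x = [devices_emplaced, devices_cleared]; simple stock-flow update.
x = np.array([10.0, 0.0])
clear_rate = 0.15
for step in range(20):
    intel_alert = step % 5 == 0            # toy exogenous signal
    inflow = sample_attack_rate(intel_alert) * 20     # new emplacements
    cleared = clear_rate * x[0]                       # counter-measure outflow
    x = x + np.array([inflow - cleared, cleared])

print("emplaced, cleared after 20 steps:", np.round(x, 1))
```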
High-fidelity, low-cost, automated method to assess laparoscopic skills objectively.
Gray, Richard J; Kahol, Kanav; Islam, Gazi; Smith, Marshall; Chapital, Alyssa; Ferrara, John
2012-01-01
We sought to define the extent to which a motion analysis-based assessment system constructed with simple equipment could measure technical skill objectively and quantitatively. An "off-the-shelf" digital video system was used to capture the hand and instrument movement of surgical trainees (beginner level = PGY-1, intermediate level = PGY-3, and advanced level = PGY-5/fellows) while they performed a peg transfer exercise. The video data were passed through a custom computer vision algorithm that analyzed incoming pixels to measure movement smoothness objectively. The beginner-level group had the poorest performance, whereas those in the advanced group generated the highest scores. Intermediate-level trainees scored significantly (p < 0.04) better than beginner trainees. Advanced-level trainees scored significantly better than intermediate-level trainees and beginner-level trainees (p < 0.04 and p < 0.03, respectively). A computer vision-based analysis of surgical movements provides an objective basis for technical expertise-level analysis with construct validity. The technology to capture the data is simple, low cost, and readily available, and it obviates the need for expert human assessment in this setting. Copyright © 2012 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
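As an illustration of scoring smoothness from tracked motion, the sketch below computes a dimensionless squared-jerk cost from a 2-D trajectory; this common motor-control measure stands in for the study's own computer-vision scoring algorithm, which is not reproduced here.

```python
# Hedged sketch of a movement-smoothness score from a tracked 2-D trajectory
# using a dimensionless squared-jerk cost. Not the study's own algorithm.
import numpy as np

def normalized_jerk(positions, fs):
    """Dimensionless jerk cost: lower values mean smoother movement."""
    dt = 1.0 / fs
    velocity = np.gradient(positions, dt, axis=0)
    accel = np.gradient(velocity, dt, axis=0)
    jerk = np.gradient(accel, dt, axis=0)
    duration = positions.shape[0] * dt
    path_length = np.sum(np.linalg.norm(np.diff(positions, axis=0), axis=1))
    cost = np.sum(np.linalg.norm(jerk, axis=1) ** 2) * dt
    return cost * duration ** 5 / path_length ** 2

fs = 30.0                                   # assumed video frame rate
t = np.linspace(0, 2, int(2 * fs))
smooth = np.column_stack([t, np.sin(np.pi * t / 2)])                 # smooth reach
jittery = smooth + np.random.default_rng(7).normal(0, 0.01, smooth.shape)

print(normalized_jerk(smooth, fs), "<", normalized_jerk(jittery, fs))
```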
Computer-generated holographic near-eye display system based on LCoS phase only modulator
NASA Astrophysics Data System (ADS)
Sun, Peng; Chang, Shengqian; Zhang, Siman; Xie, Ting; Li, Huaye; Liu, Siqi; Wang, Chang; Tao, Xiao; Zheng, Zhenrong
2017-09-01
Augmented reality (AR) technology has been applied in various areas, such as large-scale manufacturing, national defense, healthcare, movies and mass media, and so on. An important way to realize AR display is using a computer-generated hologram (CGH), which suffers from the drawbacks of low image quality and heavy computation. Meanwhile, the diffraction of the Liquid Crystal on Silicon (LCoS) device has a negative effect on image quality. In this paper, a modified algorithm based on the traditional Gerchberg-Saxton (GS) algorithm was proposed to improve the image quality, and a new method of building the experimental system was used to broaden the field of view (FOV). In the experiment, undesired zero-order diffracted light was eliminated and a high-definition 2D image was acquired with the FOV broadened to 36.1 degrees. We have also done some pilot research in 3D reconstruction with a tomography algorithm based on Fresnel diffraction. With the same experimental system, experimental results demonstrate the feasibility of 3D reconstruction. These modifications are effective and efficient, and may provide a better solution for AR realization.
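For reference, the classical (unmodified) Gerchberg-Saxton iteration for a phase-only hologram is sketched below: alternate between the hologram plane (unit amplitude, free phase) and the image plane (target amplitude, free phase). The paper's modifications and zero-order suppression are not reproduced.

```python
# Minimal sketch of the classical Gerchberg-Saxton iteration for a phase-only
# hologram; the modified algorithm of the paper is not reproduced here.
import numpy as np

rng = np.random.default_rng(8)
target = np.zeros((128, 128))
target[40:90, 40:90] = 1.0                  # toy target intensity pattern
target_amp = np.sqrt(target)

phase = rng.uniform(0, 2 * np.pi, target.shape)
for _ in range(50):
    field = np.exp(1j * phase)                          # phase-only SLM constraint
    image = np.fft.fft2(field)                          # propagate to image plane
    image = target_amp * np.exp(1j * np.angle(image))   # impose target amplitude
    field = np.fft.ifft2(image)                         # propagate back
    phase = np.angle(field)                             # keep only the phase

recon = np.abs(np.fft.fft2(np.exp(1j * phase))) ** 2
print("correlation with target:", np.corrcoef(recon.ravel(), target.ravel())[0, 1])
```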
NASA Astrophysics Data System (ADS)
Avara, Mark J.; Noble, Scott; Shiokawa, Hotaka; Cheng, Roseanne; Campanelli, Manuela; Krolik, Julian H.
2017-08-01
A multi-patch approach to numerical simulations of black hole accretion flows allows one to robustly match numerical grid shape and equations solved to the natural structure of the physical system. For instance, a cartesian gridded patch can be used to cover coordinate singularities on a spherical-polar grid, increasing computational efficiency and better capturing the physical system through natural symmetries. We will present early tests, initial applications, and first results from the new MHD implementation of the PATCHWORK framework.
ERIC Educational Resources Information Center
Bennedsen, Jens; Caspersen, Michael E.
2008-01-01
In order to better understand predictors of success and, when possible, improve the design of the first year computer science courses at university to increase the likelihood of success, we study a number of factors that may potentially indicate students' computer science aptitude. Based on findings in general education, we have studied the…
High-speed parallel implementation of a modified PBR algorithm on DSP-based EH topology
NASA Astrophysics Data System (ADS)
Rajan, K.; Patnaik, L. M.; Ramakrishna, J.
1997-08-01
Algebraic Reconstruction Technique (ART) is an age-old method used for solving the problem of three-dimensional (3-D) reconstruction from projections in electron microscopy and radiology. In medical applications, direct 3-D reconstruction is at the forefront of investigation. The simultaneous iterative reconstruction technique (SIRT) is an ART-type algorithm with the potential of generating in a few iterations tomographic images of a quality comparable to that of convolution backprojection (CBP) methods. Pixel-based reconstruction (PBR) is similar to SIRT reconstruction, and it has been shown that PBR algorithms give better quality pictures compared to those produced by SIRT algorithms. In this work, we propose a few modifications to the PBR algorithms. The modified algorithms are shown to give better quality pictures compared to PBR algorithms. The PBR algorithm and the modified PBR algorithms are highly compute intensive. Not many attempts have been made to reconstruct objects in the true 3-D sense because of the high computational overhead. In this study, we have developed parallel two-dimensional (2-D) and 3-D reconstruction algorithms based on modified PBR. We attempt to solve the two problems encountered by the PBR and modified PBR algorithms, i.e., the long computational time and the large memory requirements, by parallelizing the algorithm on a multiprocessor system. We investigate the possible task and data partitioning schemes by exploiting the potential parallelism in the PBR algorithm subject to minimizing the memory requirement. We have implemented an extended hypercube (EH) architecture for the high-speed execution of the 3-D reconstruction algorithm using the commercially available fast floating point digital signal processor (DSP) chips as the processing elements (PEs) and dual-port random access memories (DPR) as channels between the PEs. We discuss and compare the performances of the PBR algorithm on an IBM 6000 RISC workstation, on a Silicon Graphics Indigo 2 workstation, and on an EH system. The results show that an EH(3,1) using DSP chips as PEs executes the modified PBR algorithm about 100 times faster than an IBM 6000 RISC workstation. We have executed the algorithms on a 4-node IBM SP2 parallel computer. The results show that execution time of the algorithm on an EH(3,1) is better than that of a 4-node IBM SP2 system. The speed-up of an EH(3,1) system with eight PEs and one network controller is approximately 7.85.
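To make the family of algorithms concrete, the sketch below shows a plain SIRT update on a toy system matrix in Python. It is a serial, textbook version intended only to illustrate the iterative structure that PBR and the modified PBR share; the paper's specific modifications and its DSP/extended-hypercube parallelization are not reproduced.

```python
import numpy as np

def sirt(A, p, iterations=100):
    """Plain SIRT: x <- x + C A^T R (p - A x), where R and C normalize by
    row and column sums. Toy serial version for illustration only."""
    row_sums = A.sum(axis=1)
    col_sums = A.sum(axis=0)
    R = np.where(row_sums > 0, 1.0 / row_sums, 0.0)
    C = np.where(col_sums > 0, 1.0 / col_sums, 0.0)
    x = np.zeros(A.shape[1])
    for _ in range(iterations):
        x += C * (A.T @ (R * (p - A @ x)))
    return x

# Tiny 2-pixel "object" seen by 3 rays
A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
true_x = np.array([2.0, 3.0])
print(sirt(A, A @ true_x))   # converges toward [2, 3]
```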
Divilov, Konstantin; Wiesner-Hanks, Tyr; Barba, Paola; Cadle-Davidson, Lance; Reisch, Bruce I
2017-12-01
Quantitative phenotyping of downy mildew sporulation is frequently used in plant breeding and genetic studies, as well as in studies focused on pathogen biology such as chemical efficacy trials. In these scenarios, phenotyping a large number of genotypes or treatments can be advantageous but is often limited by time and cost. We present a novel computational pipeline dedicated to estimating the percent area of downy mildew sporulation from images of inoculated grapevine leaf discs in a manner that is time and cost efficient. The pipeline was tested on images from leaf disc assay experiments involving two F1 grapevine families, one that had glabrous leaves (Vitis rupestris B38 × 'Horizon' [RH]) and another that had leaf trichomes (Horizon × V. cinerea B9 [HC]). Correlations between computer vision and manual visual ratings reached 0.89 in the RH family and 0.43 in the HC family. Additionally, we were able to use the computer vision system prior to sporulation to measure the percent leaf trichome area. We estimate that an experienced rater scoring sporulation would spend at least 90% less time using the computer vision system compared with the manual visual method. This will allow more treatments to be phenotyped in order to better understand the genetic architecture of downy mildew resistance and of leaf trichome density. We anticipate that this computer vision system will find applications in other pathosystems or traits where responses can be imaged with sufficient contrast from the background.
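The core measurement of such a pipeline, percent area above an intensity threshold, can be sketched in a few lines. The Python example below uses a fixed threshold on a synthetic grayscale leaf disc; the threshold, preprocessing, and disc segmentation are assumptions made for illustration, and the published pipeline is considerably more involved.

```python
import numpy as np

def percent_sporulation(gray_disc, threshold=0.7):
    """Illustrative stand-in for the pipeline's core measurement: fraction
    of leaf-disc pixels brighter than a threshold (sporulation appears pale
    against the disc). Threshold and preprocessing are assumptions."""
    mask = gray_disc >= threshold
    return 100.0 * mask.sum() / gray_disc.size

rng = np.random.default_rng(1)
disc = rng.uniform(0.0, 0.6, size=(200, 200))   # mostly dark leaf tissue
disc[50:90, 50:90] = 0.9                        # a bright sporulating patch
print(f"{percent_sporulation(disc):.1f}% of the disc area")  # ~4%
```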
NASA Technical Reports Server (NTRS)
Putnam, William M.
2011-01-01
Earth system models like the Goddard Earth Observing System model (GEOS-5) have been pushing the limits of large clusters of multi-core microprocessors, producing breath-taking fidelity in resolving cloud systems at a global scale. GPU computing presents an opportunity for improving the efficiency of these leading edge models. A GPU implementation of GEOS-5 will facilitate the use of cloud-system resolving resolutions in data assimilation and weather prediction, at resolutions near 3.5 km, improving our ability to extract detailed information from high-resolution satellite observations and ultimately produce better weather and climate predictions.
Data management system performance modeling
NASA Technical Reports Server (NTRS)
Kiser, Larry M.
1993-01-01
This paper discusses analytical techniques that have been used to gain a better understanding of the Space Station Freedom's (SSF's) Data Management System (DMS). The DMS is a complex, distributed, real-time computer system that has been redesigned numerous times. The implications of these redesigns have not been fully analyzed. This paper discusses the advantages and disadvantages of static analytical techniques such as Rate Monotonic Analysis (RMA) and also provides a rationale for dynamic modeling. Factors such as system architecture, processor utilization, bus architecture, queuing, etc. are well suited for analysis with a dynamic model. The significance of performance measures for a real-time system is discussed.
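As a concrete reference point for the static technique mentioned above, the Python snippet below applies the classic Liu and Layland sufficient schedulability test used in Rate Monotonic Analysis. The task set is invented for illustration and has nothing to do with the actual DMS workload.

```python
def rma_schedulable(tasks):
    """Liu & Layland sufficient test for Rate Monotonic Analysis:
    n periodic tasks are schedulable if total utilization <= n(2^(1/n) - 1).
    `tasks` is a list of (computation_time, period) pairs. Illustrative only."""
    n = len(tasks)
    utilization = sum(c / t for c, t in tasks)
    bound = n * (2 ** (1.0 / n) - 1)
    return utilization, bound, utilization <= bound

# Three hypothetical tasks: (execution time, period) in milliseconds
print(rma_schedulable([(10, 50), (15, 100), (20, 200)]))  # U = 0.45 <= 0.78
```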
A 3D visualization system for molecular structures
NASA Technical Reports Server (NTRS)
Green, Terry J.
1989-01-01
The properties of molecules derive in part from their structures. Because of the importance of understanding molecular structures, various methodologies, ranging from first principles to empirical techniques, were developed for computing the structure of molecules. For large molecules such as polymer model compounds, the structural information is difficult to comprehend by examining tabulated data. Therefore, a molecular graphics display system, called MOLDS, was developed to help interpret the data. MOLDS is a menu-driven program developed to run on the LADC SNS computer systems. This program can read a data file generated by the modeling programs, or data can be entered using the keyboard. MOLDS has the following capabilities: draws the 3-D representation of a molecule using a stick, ball-and-stick, or space-filled model from Cartesian coordinates; draws different perspective views of the molecule; rotates the molecule about the X, Y, or Z axis or about some arbitrary line in space; zooms in on a small area of the molecule in order to obtain a better view of a specific region; and makes hard-copy representations of molecules on a graphic printer. In addition, MOLDS can be easily updated and readily adapted to run on most computer systems.
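Rotation about an arbitrary axis, one of the viewer operations listed above, reduces to a single rotation matrix applied to the Cartesian coordinates. The Python sketch below uses Rodrigues' formula on an approximate water-molecule geometry; it is a generic illustration, not MOLDS code.

```python
import numpy as np

def rotate_about_axis(coords, axis, angle_deg):
    """Rotate Cartesian atom coordinates about an arbitrary axis through the
    origin using Rodrigues' rotation formula. Generic sketch, not MOLDS."""
    k = np.asarray(axis, dtype=float)
    k /= np.linalg.norm(k)
    theta = np.radians(angle_deg)
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])
    R = np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)
    return coords @ R.T

water = np.array([[0.000, 0.000, 0.117],    # O
                  [0.000, 0.757, -0.469],   # H
                  [0.000, -0.757, -0.469]]) # H  (angstroms, approximate)
print(rotate_about_axis(water, axis=[0, 0, 1], angle_deg=90))
```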
Serial Back-Plane Technologies in Advanced Avionics Architectures
NASA Technical Reports Server (NTRS)
Varnavas, Kosta
2005-01-01
Current back plane technologies such as VME, and current personal computer back planes such as PCI, are shared bus systems that can exhibit nondeterministic latencies. This means a card can take control of the bus and use resources indefinitely, affecting the ability of other cards in the back plane to acquire the bus. This takes a real toll on the reliability of the system. Additionally, these parallel busses only have bandwidths in the 100s of megahertz range, and EMI and noise effects get worse the higher the bandwidth goes. To provide scalable, fault-tolerant, advanced computing systems, more applicable to today's connected computing environment and to better meet future requirements for advanced space instruments and vehicles, serial back-plane technologies should be implemented in advanced avionics architectures. Serial backplane technologies eliminate the problem of one card getting the bus and never relinquishing it, or one minor problem on the backplane bringing the whole system down. Being serial instead of parallel reduces many of the signal integrity issues associated with parallel back planes and thus significantly improves reliability. The increased speeds associated with a serial backplane are an added bonus.
Yoder, J W; Schultz, D F; Williams, B T
1998-10-01
The solution to many of the problems of the computer-based recording of the medical record has been elusive, largely due to difficulties in the capture of those data elements that comprise the records of the Present Illness and of the Physical Findings. Reliable input of data has proven to be more complex than originally envisioned by early work in the field. This has led to more research and development into better data collection protocols and easy to use human-computer interfaces as support tools. The Medical Examination Direct Iconic and Graphic Augmented Text Entry System (MEDIGATE System) is a computer enhanced interactive graphic and textual record of the findings from physical examinations designed to provide ease of user input and to support organization and processing of the data characterizing these findings. The primary design objective of the MEDIGATE System is to develop and evaluate different interface designs for recording observations from the physical examination in an attempt to overcome some of the deficiencies in this major component of the individual record of health and illness.
Planning and simulation of medical robot tasks.
Raczkowsky, J; Bohner, P; Burghart, C; Grabowski, H
1998-01-01
Complex techniques for planning and performing surgery are revolutionizing medical interventions. In former times, preoperative planning of interventions usually took place in the surgeon's mind. Today's computer techniques allow the surgeon to discuss various operation methods for a patient and to visualize them three-dimensionally. The use of computer-assisted surgical planning helps achieve better treatment results and supports the surgeon before and during the surgical intervention. In this paper we present our planning and simulation system for operations in maxillo-facial surgery. All phases of a surgical intervention are supported. Chapter 1 gives a description of the medical motivation for our planning system and its environment. In Chapter 2 the basic components are presented. The planning system is depicted in Chapter 3, and a simulation of a robot-assisted surgery can be found in Chapter 4. Chapter 5 concludes the paper and gives a survey of our future work.
Better informed in clinical practice - a brief overview of dental informatics.
Reynolds, P A; Harper, J; Dunne, S
2008-03-22
Uptake of dental informatics has been hampered by technical and user issues. Innovative systems have been developed, but usability issues have affected many. Advances in technology and artificial intelligence are now producing clinically useful systems, although issues still remain with adapting computer interfaces to the dental practice working environment. A dental electronic health record has become a priority in many countries, including the UK. However, experience shows that any dental electronic health record (EHR) system cannot be subordinate to, or a subset of, a medical record. Such a future dental EHR is likely to incorporate integrated care pathways. Future best dental practice will increasingly depend on computer-based support tools, although disagreement remains about the effectiveness of current support tools. Over the longer term, future dental informatics tools will incorporate dynamic, online evidence-based medicine (EBM) tools, and promise more adaptive, patient-focused and efficient dental care with educational advantages in training.
Bassett, Danielle S; Sporns, Olaf
2017-01-01
Despite substantial recent progress, our understanding of the principles and mechanisms underlying complex brain function and cognition remains incomplete. Network neuroscience proposes to tackle these enduring challenges. Approaching brain structure and function from an explicitly integrative perspective, network neuroscience pursues new ways to map, record, analyze and model the elements and interactions of neurobiological systems. Two parallel trends drive the approach: the availability of new empirical tools to create comprehensive maps and record dynamic patterns among molecules, neurons, brain areas and social systems; and the theoretical framework and computational tools of modern network science. The convergence of empirical and computational advances opens new frontiers of scientific inquiry, including network dynamics, manipulation and control of brain networks, and integration of network processes across spatiotemporal domains. We review emerging trends in network neuroscience and attempt to chart a path toward a better understanding of the brain as a multiscale networked system. PMID:28230844
Pharmacoinformatic approaches to understand complexation of dendrimeric nanoparticles with drugs
NASA Astrophysics Data System (ADS)
Jain, Vaibhav; Bharatam, Prasad V.
2014-02-01
Nanoparticle based drug delivery systems are gaining popularity due to their wide spectrum of advantages over traditional drug delivery systems; among them, dendrimeric nano-vectors are the most widely explored carriers for pharmaceutical and biomedical applications. The precise mechanism of encapsulation of drug molecules inside the dendritic matrix, delivery of drugs into specific cells, interactions of nano-formulations with biological targets and proteins, etc. present a substantial challenge to the scientific understanding of the subject. Computational methods complement experimental techniques in the design and optimization of drug delivery systems, thus minimizing the investment in drug design and development. Significant progress in computer simulations could facilitate an understanding of the precise mechanism of encapsulation of bioactive molecules and their delivery. This review summarizes the pharmacoinformatic studies spanning from quantum chemical calculations to coarse-grained simulations, aimed at providing better insight into dendrimer-drug interactions and the physicochemical parameters influencing the binding and release mechanism of drugs.
Beyond the computer-based patient record: re-engineering with a vision.
Genn, B; Geukers, L
1995-01-01
In order to achieve real benefit from the potential offered by a Computer-Based Patient Record, the capabilities of the technology must be applied along with true re-engineering of healthcare delivery processes. University Hospital recognizes this and is using systems implementation projects as the catalyst for transforming the way we care for our patients. Integration is fundamental to the success of these initiatives, and this must be explicitly planned against an organized systems architecture whose standards are market-driven. University Hospital also recognizes that Community Health Information Networks will offer improved quality of patient care at a reduced overall cost to the system. All of these implementation factors are considered up front as the hospital makes its initial decisions on how to computerize its patient records. This improves our chances for success and will provide a consistent vision to guide the hospital's development of new and better patient care.
NASA Astrophysics Data System (ADS)
Giday, Nigussie M.; Katamzi-Joseph, Zama T.
2018-02-01
This study investigates the performance of a tomographic algorithm, the Multi-Instrument and Data Analysis System (MIDAS), during an extended period of 4-14 March 2012, containing moderate and strong geomagnetic storm conditions, over an understudied and data-scarce Eastern African longitude sector. A relatively good distribution of Global Navigation Satellite Systems stations nonetheless exists along a narrow longitude sector between 30°E and 44°E and a latitude range of 30°S to 36°N that spans the equatorial, middle-, and low-latitude ionosphere. The results are then compared with independent global models such as the International Reference Ionosphere 2012 (IRI-2012) and the global ionosphere map (GIM). MIDAS performed better than the IRI-2012 and GIM models in terms of capturing the diurnal trends as well as the short-term total electron content (TEC) structures, with the lowest root-mean-square errors (RMSEs). Overall, MIDAS results showed better agreement with the observed vertical TEC (VTEC), with a maximum computed correlation coefficient (r) of 0.99 and a minimum root-mean-square error (RMSE) of 2.91 TEC units (1 TECU = 10^16 el m^-2) over all the test stations and the validation days. Conversely, for the IRI-2012 and GIM TEC estimates, the corresponding maximum computed r values were 0.93 and 0.99, respectively, while the minimum RMSEs were 13.03 TECU and 6.52 TECU, respectively, for all the test stations and the validation days.
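The two validation statistics quoted above are straightforward to compute once model and observed VTEC time series are aligned. The Python sketch below shows the calculation on made-up TECU values; the numbers are illustrative only.

```python
import numpy as np

def validation_stats(model_vtec, observed_vtec):
    """Correlation coefficient and RMSE of the kind reported for MIDAS,
    IRI-2012 and GIM against GNSS-derived VTEC. Inputs are in TECU."""
    r = np.corrcoef(model_vtec, observed_vtec)[0, 1]
    rmse = np.sqrt(np.mean((model_vtec - observed_vtec) ** 2))
    return r, rmse

observed = np.array([12.0, 18.5, 30.2, 42.7, 38.1, 25.4])   # made-up values
model    = np.array([11.2, 19.1, 28.9, 44.0, 36.5, 26.8])
r, rmse = validation_stats(model, observed)
print(f"r = {r:.2f}, RMSE = {rmse:.2f} TECU")
```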
Zhao, Anbang; Ma, Lin; Ma, Xuefei; Hui, Juan
2017-01-01
In this paper, an improved azimuth angle estimation method with a single acoustic vector sensor (AVS) is proposed based on matched filtering theory. The proposed method is mainly applied in an active sonar detection system. Building on the conventional passive method based on complex acoustic intensity measurement, the mathematical and physical model of the proposed method is described in detail. The computer simulation and lake experiment results indicate that this method can realize azimuth angle estimation with high precision by using only a single AVS. Compared with the conventional method, the proposed method achieves better estimation performance. Moreover, the proposed method does not require complex operations in the frequency domain and achieves a reduction in computational complexity. PMID:28230763
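The conventional baseline the paper compares against, azimuth estimation from average acoustic intensity, can be sketched compactly: the bearing is the arctangent of the mean cross-products of pressure with the two velocity channels. The Python example below uses an ideal, noise-free signal; the matched-filter refinement for active sonar is not shown.

```python
import numpy as np

def intensity_azimuth(p, vx, vy):
    """Conventional average-acoustic-intensity azimuth estimate from a
    single vector sensor: theta = atan2(<p*vy>, <p*vx>). Baseline only."""
    return np.degrees(np.arctan2(np.mean(p * vy), np.mean(p * vx)))

fs, f0, theta_true = 10_000, 500.0, 40.0              # Hz, Hz, degrees
t = np.arange(0.0, 0.1, 1.0 / fs)
p = np.cos(2 * np.pi * f0 * t)                        # pressure channel
vx = np.cos(np.radians(theta_true)) * p               # ideal velocity channels
vy = np.sin(np.radians(theta_true)) * p
print(intensity_azimuth(p, vx, vy))                   # ~40 degrees
```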
Mu, Zhendong; Yin, Jinhai; Hu, Jianfeng
2018-01-01
In this paper, a person authentication system that can effectively identify individuals by generating unique electroencephalogram signal features in response to self-face and non-self-face photos is presented. In order to achieve good stability performance, the sequence of self-face photos, including first-occurrence and non-first-occurrence positions, is taken into account in the serial presentation of visual stimuli. In addition, a Fisher linear classification method and an event-related potential technique for feature analysis are adopted, yielding remarkably better outcomes than most existing methods in the field. The results show that EEG-based person authentication via a brain-computer interface can be considered a suitable approach for a biometric authentication system.
2011-01-01
Background Due to the increasing functionality of medical information systems, it is hard to imagine day-to-day work in hospitals without IT support. Therefore, the design of dialogues between humans and information systems is one of the most important issues to be addressed in health care. This survey presents an analysis of the current quality level of human-computer interaction of healthcare IT in German hospitals, focused on the users' point of view. Methods To evaluate the usability of clinical IT according to the design principles of EN ISO 9241-10, the IsoMetrics Inventory, an assessment tool, was used. The focus of this paper is on suitability for task, training effort and conformity with user expectations, differentiated by information system. Effectiveness has been evaluated with a focus on the interoperability and functionality of different IT systems. Results 4521 persons from 371 hospitals visited the start page of the study, while 1003 persons from 158 hospitals completed the questionnaire. The results show relevant variations between different information systems. Conclusions Specialised information systems with defined functionality received better assessments than clinical information systems in general. This could be attributed to the improved customisation of these specialised systems for specific working environments. The results can be used as reference data for the evaluation and benchmarking of human-computer engineering in the clinical health IT context in future studies. PMID:22070880
van der List, Jelle P; Chawla, Harshvardhan; Joskowicz, Leo; Pearle, Andrew D
2016-11-01
Recently, there has been growing interest in surgical variables that are intraoperatively controlled by orthopaedic surgeons, including lower leg alignment, component positioning and soft tissue balancing. Since tighter control over these factors is associated with improved outcomes of unicompartmental knee arthroplasty and total knee arthroplasty (TKA), several computer navigation and robotic-assisted systems have been developed. Although mechanical axis accuracy and component positioning have been shown to improve with computer navigation, no superiority in functional outcomes has yet been shown. This could be explained by the fact that many differences exist between the number and type of surgical variables these systems control. Most systems control lower leg alignment and component positioning, while some in addition control soft tissue balancing. Finally, robotic-assisted systems have the additional advantage of improving surgical precision. A systematic search in PubMed, Embase and Cochrane Library resulted in 40 comparative studies and three registries on computer navigation reporting outcomes of 474,197 patients, and 21 basic science and clinical studies on robotic-assisted knee arthroplasty. Twenty-eight of these comparative computer navigation studies reported Knee Society Total scores in 3504 patients. Stratifying by type of surgical variables, no significant differences were noted in outcomes between surgery with computer-navigated TKA controlling for alignment and component positioning versus conventional TKA (p = 0.63). However, significantly better outcomes were noted following computer-navigated TKA that also controlled for soft tissue balancing versus conventional TKA (mean difference 4.84, 95% confidence interval 1.61 to 8.07, p = 0.003). A literature review of robotic systems showed that these systems can, similarly to computer navigation, reliably improve lower leg alignment, component positioning and soft tissue balancing. Furthermore, two studies comparing robotic-assisted with computer-navigated surgery reported superiority of robotic-assisted surgery in controlling these factors. Manually controlling all these surgical variables can be difficult for the orthopaedic surgeon. Findings in this study suggest that computer navigation or robotic assistance may help manage these multiple variables and could improve outcomes. Future studies assessing the role of soft tissue balancing in knee arthroplasty and long-term follow-up studies assessing the role of computer-navigated and robotic-assisted knee arthroplasty are needed.
XTALOPT: An open-source evolutionary algorithm for crystal structure prediction
NASA Astrophysics Data System (ADS)
Lonie, David C.; Zurek, Eva
2011-02-01
The implementation and testing of XTALOPT, an evolutionary algorithm for crystal structure prediction, is outlined. We present our new periodic displacement (ripple) operator which is ideally suited to extended systems. It is demonstrated that hybrid operators, which combine two pure operators, reduce the number of duplicate structures in the search. This allows for better exploration of the potential energy surface of the system in question, while simultaneously zooming in on the most promising regions. A continuous workflow, which makes better use of computational resources as compared to traditional generation based algorithms, is employed. Various parameters in XTALOPT are optimized using a novel benchmarking scheme. XTALOPT is available under the GNU Public License, has been interfaced with various codes commonly used to study extended systems, and has an easy to use, intuitive graphical interface.
Program summary:
Program title: XTALOPT
Catalogue identifier: AEGX_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEGX_v1_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: GPL v2.1 or later [1]
No. of lines in distributed program, including test data, etc.: 36 849
No. of bytes in distributed program, including test data, etc.: 1 149 399
Distribution format: tar.gz
Programming language: C++
Computer: PCs, workstations, or clusters
Operating system: Linux
Classification: 7.7
External routines: QT [2], OpenBabel [3], AVOGADRO [4], SPGLIB [8] and one of: VASP [5], PWSCF [6], GULP [7].
Nature of problem: Predicting the crystal structure of a system from its stoichiometry alone remains a grand challenge in computational materials science, chemistry, and physics.
Solution method: Evolutionary algorithms are stochastic search techniques which use concepts from biological evolution in order to locate the global minimum on their potential energy surface. Our evolutionary algorithm, XTALOPT, is freely available to the scientific community for use and collaboration under the GNU Public License.
Running time: User dependent. The program runs until stopped by the user.
Computational tools for comparative phenomics; the role and promise of ontologies
Gkoutos, Georgios V.; Schofield, Paul N.; Hoehndorf, Robert
2012-01-01
A major aim of the biological sciences is to gain an understanding of human physiology and disease. One important step towards such a goal is the discovery of the function of genes that will lead to better understanding of the physiology and pathophysiology of organisms ultimately providing better understanding, diagnosis, and therapy. Our increasing ability to phenotypically characterise genetic variants of model organisms coupled with systematic and hypothesis-driven mutagenesis is resulting in a wealth of information that could potentially provide insight to the functions of all genes in an organism. The challenge we are now facing is to develop computational methods that can integrate and analyse such data. The introduction of formal ontologies that make their semantics explicit and accessible to automated reasoning promises the tantalizing possibility of standardizing biomedical knowledge allowing for novel, powerful queries that bridge multiple domains, disciplines, species and levels of granularity. We review recent computational approaches that facilitate the integration of experimental data from model organisms with clinical observations in humans. These methods foster novel cross species analysis approaches, thereby enabling comparative phenomics and leading to the potential of translating basic discoveries from the model systems into diagnostic and therapeutic advances at the clinical level. PMID:22814867
Intelligent Context-Aware and Adaptive Interface for Mobile LBS
Liu, Yanhong
2015-01-01
Context-aware user interfaces play an important role in many human-computer interaction tasks of location-based services. Although spatial models for context-aware systems have been studied extensively, how to locate specific spatial information for users is still not well resolved, which is important in the mobile environment where location-based services users are impeded by device limitations. Better context-aware human-computer interaction models of mobile location-based services are needed not just to predict performance outcomes, such as whether people will be able to find the information needed to complete a human-computer interaction task, but to understand the human processes that interact in spatial query, which will in turn inform the detailed design of better user interfaces in mobile location-based services. In this study, a context-aware adaptive model for the mobile location-based services interface is proposed, which contains three major sections: purpose, adjustment, and adaptation. Based on this model, we describe the process of user operation and interface adaptation through the dynamic interaction between users and the interface. We then show how the model accommodates users' demands in a complicated environment, and the experimental results suggest its feasibility. PMID:26457077
Computational study of noise in a large signal transduction network.
Intosalmi, Jukka; Manninen, Tiina; Ruohonen, Keijo; Linne, Marja-Leena
2011-06-21
Biochemical systems are inherently noisy due to the discrete reaction events that occur in a random manner. Although noise is often perceived as a disturbing factor, the system might actually benefit from it. In order to understand the role of noise better, its quality must be studied in a quantitative manner. Computational analysis and modeling play an essential role in this demanding endeavor. We implemented a large nonlinear signal transduction network combining protein kinase C, mitogen-activated protein kinase, phospholipase A2, and β isoform of phospholipase C networks. We simulated the network in 300 different cellular volumes using the exact Gillespie stochastic simulation algorithm and analyzed the results in both the time and frequency domain. In order to perform simulations in a reasonable time, we used modern parallel computing techniques. The analysis revealed that time and frequency domain characteristics depend on the system volume. The simulation results also indicated that there are several kinds of noise processes in the network, all of them representing different kinds of low-frequency fluctuations. In the simulations, the power of noise decreased on all frequencies when the system volume was increased. We concluded that basic frequency domain techniques can be applied to the analysis of simulation results produced by the Gillespie stochastic simulation algorithm. This approach is suited not only to the study of fluctuations but also to the study of pure noise processes. Noise seems to have an important role in biochemical systems and its properties can be numerically studied by simulating the reacting system in different cellular volumes. Parallel computing techniques make it possible to run massive simulations in hundreds of volumes and, as a result, accurate statistics can be obtained from computational studies. © 2011 Intosalmi et al; licensee BioMed Central Ltd.
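The simulation engine referred to above is the Gillespie direct method. The Python sketch below runs it on a single toy degradation reaction to show the core loop of sampling an exponential waiting time from the total propensity and firing one reaction; the paper's network has many coupled reactions and is simulated across hundreds of volumes.

```python
import numpy as np

def gillespie_decay(x0=100, k=0.5, t_end=10.0, seed=0):
    """Gillespie direct method for the toy reaction X -> 0 at rate k*X.
    Only the core SSA loop is illustrated here."""
    rng = np.random.default_rng(seed)
    t, x = 0.0, x0
    times, counts = [t], [x]
    while t < t_end and x > 0:
        a = k * x                        # total propensity
        t += rng.exponential(1.0 / a)    # waiting time to the next event
        x -= 1                           # fire the only reaction
        times.append(t)
        counts.append(x)
    return np.array(times), np.array(counts)

times, counts = gillespie_decay()
print(counts[-1], "molecules left at t =", round(times[-1], 2))
```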
A conceptual framework of computations in mid-level vision
Kubilius, Jonas; Wagemans, Johan; Op de Beeck, Hans P.
2014-01-01
If a picture is worth a thousand words, as an English idiom goes, what should those words—or, rather, descriptors—capture? What format of image representation would be sufficiently rich if we were to reconstruct the essence of images from their descriptors? In this paper, we set out to develop a conceptual framework that would be: (i) biologically plausible in order to provide a better mechanistic understanding of our visual system; (ii) sufficiently robust to apply in practice on realistic images; and (iii) able to tap into underlying structure of our visual world. We bring forward three key ideas. First, we argue that surface-based representations are constructed based on feature inference from the input in the intermediate processing layers of the visual system. Such representations are computed in a largely pre-semantic (prior to categorization) and pre-attentive manner using multiple cues (orientation, color, polarity, variation in orientation, and so on), and explicitly retain configural relations between features. The constructed surfaces may be partially overlapping to compensate for occlusions and are ordered in depth (figure-ground organization). Second, we propose that such intermediate representations could be formed by a hierarchical computation of similarity between features in local image patches and pooling of highly-similar units, and reestimated via recurrent loops according to the task demands. Finally, we suggest to use datasets composed of realistically rendered artificial objects and surfaces in order to better understand a model's behavior and its limitations. PMID:25566044
A conceptual framework of computations in mid-level vision.
Kubilius, Jonas; Wagemans, Johan; Op de Beeck, Hans P
2014-01-01
If a picture is worth a thousand words, as an English idiom goes, what should those words-or, rather, descriptors-capture? What format of image representation would be sufficiently rich if we were to reconstruct the essence of images from their descriptors? In this paper, we set out to develop a conceptual framework that would be: (i) biologically plausible in order to provide a better mechanistic understanding of our visual system; (ii) sufficiently robust to apply in practice on realistic images; and (iii) able to tap into underlying structure of our visual world. We bring forward three key ideas. First, we argue that surface-based representations are constructed based on feature inference from the input in the intermediate processing layers of the visual system. Such representations are computed in a largely pre-semantic (prior to categorization) and pre-attentive manner using multiple cues (orientation, color, polarity, variation in orientation, and so on), and explicitly retain configural relations between features. The constructed surfaces may be partially overlapping to compensate for occlusions and are ordered in depth (figure-ground organization). Second, we propose that such intermediate representations could be formed by a hierarchical computation of similarity between features in local image patches and pooling of highly-similar units, and reestimated via recurrent loops according to the task demands. Finally, we suggest to use datasets composed of realistically rendered artificial objects and surfaces in order to better understand a model's behavior and its limitations.
Tambe, Varsha Harshal; Nagmode, Pradnya Sunil; Abraham, Sathish; Patait, Mahendra; Lahoti, Pratik Vinod; Jaju, Neha
2014-01-01
Aim: The aim of the present study was to compare the canal transportation and centering ability of Rotary ProTaper, One Shape and Wave One systems using cone beam computed tomography (CBCT) in curved root canals, to find a better instrumentation technique for maintaining root canal geometry. Materials and Methods: A total of 30 freshly extracted premolars having curved root canals with at least 10 degrees of curvature were divided into three groups of 10 teeth each. All teeth were scanned by CBCT to determine the root canal shape before instrumentation. In Group 1 the canals were prepared with Rotary ProTaper files, in Group 2 with One Shape files and in Group 3 with Wave One files. After preparation, a post-instrumentation scan was performed. Pre-instrumentation and post-instrumentation images obtained at three levels (3 mm apical, 3 mm coronal and 8 mm above the apical foramen) were compared using CBCT software. Amount of transportation and centering ability were assessed. The three groups were statistically compared with analysis of variance and Tukey honestly significant difference tests. Results: All instruments maintained the original canal curvature, with significant differences between the different files. The data suggested that Wave One files presented the best outcomes for both of the variables evaluated. Wave One files caused less transportation and remained better centered in the canal than One Shape and Rotary ProTaper files. Conclusion: Canal preparation with Wave One files showed less transportation and better centering ability than One Shape and ProTaper. PMID:25506145
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yeung, Yu-Hong; Pothen, Alex; Halappanavar, Mahantesh
We present an augmented matrix approach to update the solution to a linear system of equations when the coefficient matrix is modified by a few elements within a principal submatrix. This problem arises in the dynamic security analysis of a power grid, where operators need to perform $N-x$ contingency analysis, i.e., determine the state of the system when up to $x$ links from $N$ fail. Our algorithms augment the coefficient matrix to account for the changes in it, and then compute the solution to the augmented system without refactoring the modified matrix. We provide two algorithms, a direct method, and a hybrid direct-iterative method for solving the augmented system. We also exploit the sparsity of the matrices and vectors to accelerate the overall computation. Our algorithms are compared on three power grids with PARDISO, a parallel direct solver, and CHOLMOD, a direct solver with the ability to modify the Cholesky factors of the coefficient matrix. We show that our augmented algorithms outperform PARDISO (by two orders of magnitude), and CHOLMOD (by a factor of up to 5). Further, our algorithms scale better than CHOLMOD as the number of elements updated increases. The solutions are computed with high accuracy. Our algorithms are capable of computing $N-x$ contingency analysis on a $778K$ bus grid, updating a solution with $x=20$ elements in $1.6 \times 10^{-2}$ seconds on an Intel Xeon processor.
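The central trick, solving the modified system through a bordered (augmented) matrix rather than refactoring, can be shown on a tiny dense example. In the Python sketch below the change to the coefficient matrix is written as a rank-one update U V^T and the whole augmented system is solved with a dense solver for clarity; the paper instead reuses the existing factorization of A and exploits sparsity, so this is only a statement of the algebra, not of performance.

```python
import numpy as np

def solve_updated(A, b, U, V):
    """Solve (A + U V^T) x = b via the bordered system
        [ A    U ] [x]   [b]
        [ V^T -I ] [y] = [0],
    which implies y = V^T x and hence (A + U V^T) x = b.
    Dense toy version for clarity."""
    n, k = A.shape[0], U.shape[1]
    M = np.block([[A, U],
                  [V.T, -np.eye(k)]])
    rhs = np.concatenate([b, np.zeros(k)])
    return np.linalg.solve(M, rhs)[:n]

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 6)) + 6 * np.eye(6)
b = rng.standard_normal(6)
U = np.zeros((6, 1)); U[2, 0] = 1.0     # rank-one change: A[2, 4] += 0.3
V = np.zeros((6, 1)); V[4, 0] = 0.3
x = solve_updated(A, b, U, V)
print(np.allclose((A + U @ V.T) @ x, b))   # True
```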
An Efficient Mutual Authentication Framework for Healthcare System in Cloud Computing.
Kumar, Vinod; Jangirala, Srinivas; Ahmad, Musheer
2018-06-28
The increasing role of Telecare Medicine Information Systems (TMIS) makes it possible for patients to explore medical treatment and to accumulate and access medical data through internet connectivity. Security and privacy preservation are necessary for patients' medical data in TMIS because of its highly sensitive nature. Recently, Mohit et al. proposed a mutual authentication protocol for TMIS in the cloud computing environment. In this work, we reviewed their protocol and found that it is not secure against stolen-verifier, many-logged-in-patients, and impersonation attacks, fails to preserve patient anonymity, and fails to protect the session key. To enhance the security level, we propose a new mutual authentication protocol for the same environment. The presented framework is also more efficient in terms of computation cost. In addition, the security evaluation shows that the protocol is resilient with respect to all the security attributes considered, and we also carried out a formal security evaluation based on the random oracle model. The performance of the proposed protocol is much better in comparison to the existing protocol.
Cloud based intelligent system for delivering health care as a service.
Kaur, Pankaj Deep; Chana, Inderveer
2014-01-01
The promising potential of cloud computing and its convergence with technologies such as mobile computing, wireless networks and sensor technologies allows for the creation and delivery of new types of cloud services. In this paper, we advocate the use of cloud computing for the creation and management of cloud-based health care services. As a representative case study, we design a Cloud Based Intelligent Health Care Service (CBIHCS) that performs real-time monitoring of user health data for the diagnosis of chronic illnesses such as diabetes. Advanced body sensor components are utilized to gather user-specific health data and store them in cloud-based storage repositories for subsequent analysis and classification. In addition, infrastructure-level mechanisms are proposed to provide dynamic resource elasticity for CBIHCS. Experimental results demonstrate that a classification accuracy of 92.59% is achieved with our prototype system and that the predicted patterns of CPU usage offer better opportunities for adaptive resource elasticity. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
A Comparison of Solver Performance for Complex Gastric Electrophysiology Models
Sathar, Shameer; Cheng, Leo K.; Trew, Mark L.
2016-01-01
Computational techniques for solving the systems of equations arising in gastric electrophysiology have not been studied with a focus on an efficient solution process. We present a computationally challenging problem of simulating gastric electrophysiology in anatomically realistic stomach geometries with multiple intracellular and extracellular domains. The multiscale nature of the problem and the mesh resolution required to capture geometric and functional features necessitate efficient solution methods if the problem is to be tractable. In this study, we investigated and compared several parallel preconditioners for the linear systems arising from tetrahedral discretisation of electrically isotropic and anisotropic problems, with and without stimuli. The results showed that the isotropic problem was computationally less challenging than the anisotropic problem and that the application of extracellular stimuli increased the workload considerably. Preconditioners based on block Jacobi and algebraic multigrid solvers were found to have the best overall solution times and lowest iteration counts, respectively. The algebraic multigrid preconditioner would be expected to perform better on large problems. PMID:26736543
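A minimal illustration of how a preconditioner changes the behavior of an iterative solver is given below, using SciPy's conjugate-gradient routine with and without a simple Jacobi (diagonal) preconditioner on a small symmetric positive definite system. The matrix is synthetic and far smaller than the gastric models studied in the paper, and Jacobi stands in here for the block Jacobi and algebraic multigrid preconditioners actually compared.

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

# Synthetic SPD system: tridiagonal with a strongly varying diagonal so that
# a diagonal (Jacobi) preconditioner actually helps.
n = 500
diag = 2.0 + np.linspace(0.0, 50.0, n)
off = -1.0 * np.ones(n - 1)
A = sp.diags([off, diag, off], [-1, 0, 1], format="csr")
b = np.ones(n)

iterations = {"plain": 0, "jacobi": 0}

def make_counter(key):
    def callback(_xk):
        iterations[key] += 1
    return callback

x_plain, info_plain = spla.cg(A, b, callback=make_counter("plain"))
M = sp.diags(1.0 / A.diagonal())            # Jacobi preconditioner ~ diag(A)^-1
x_prec, info_prec = spla.cg(A, b, M=M, callback=make_counter("jacobi"))

print(iterations, info_plain, info_prec)    # iteration counts, convergence flags
```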
Desktop-based computer-assisted orthopedic training system for spinal surgery.
Rambani, Rohit; Ward, James; Viant, Warren
2014-01-01
Simulation and surgical training have moved on since their inception at the end of the last century. Trainees are increasingly exposed to computer and laboratory training in different subspecialties. More needs to be done in orthopedic simulation in spinal surgery. To develop a training system for pedicle screw fixation and validate its effectiveness in a cohort of junior orthopedic trainees. A fully simulated, computer-navigated training system is used to train junior orthopedic trainees to perform pedicle screw insertion in the lumbar spine. Real patient computed tomography scans are used to produce the real-time fluoroscopic images of the lumbar spine. The training system was developed to simulate pedicle screw insertion in the lumbar spine. A total of 12 orthopedic senior house officers performed pedicle screw insertion in the lumbar spine before and after training on the system. The results were assessed based on a scoring system, which included the amount of time taken, the accuracy of pedicle screw insertion, and the number of exposures requested to complete the procedure. The results show a significant improvement in the time taken, the accuracy of fixation, and the number of exposures after training on the simulator system. This was statistically significant using the paired Student t test (p < 0.05). A fully simulated, computer-navigated training system is an efficient training tool for young orthopedic trainees. This system can be used to augment training in the operating room, and trainees can acquire their skills in the comfort of their study room or in the training room in the hospital. The system has the potential to be used in various other orthopedic procedures for the learning of technical skills in a manner aimed at ensuring a smooth escalation in task complexity, leading to better performance of procedures in the operating theater. Copyright © 2014 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
Systems Biology for Organotypic Cell Cultures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grego, Sonia; Dougherty, Edward R.; Alexander, Francis J.
Translating in vitro biological data into actionable information related to human health holds the potential to improve disease treatment and risk assessment of chemical exposures. While genomics has identified regulatory pathways at the cellular level, translation to the organism level requires a multiscale approach accounting for intra-cellular regulation, inter-cellular interaction, and tissue/organ-level effects. Tissue-level effects can now be probed in vitro thanks to recently developed systems of three-dimensional (3D), multicellular, “organotypic” cell cultures, which mimic functional responses of living tissue. However, there remains a knowledge gap regarding interactions across different biological scales, complicating accurate prediction of health outcomes from molecular/genomic data and tissue responses. Systems biology aims at mathematical modeling of complex, non-linear biological systems. We propose to apply a systems biology approach to achieve a computational representation of tissue-level physiological responses by integrating empirical data derived from organotypic culture systems with computational models of intracellular pathways to better predict human responses. Successful implementation of this integrated approach will provide a powerful tool for faster, more accurate and cost-effective screening of potential toxicants and therapeutics. On September 11, 2015, an interdisciplinary group of scientists, engineers, and clinicians gathered for a workshop in Research Triangle Park, North Carolina, to discuss this ambitious goal. Participants represented laboratory-based and computational modeling approaches to pharmacology and toxicology, as well as the pharmaceutical industry, government, non-profits, and academia. Discussions focused on identifying critical system perturbations to model, the computational tools required, and the experimental approaches best suited to generating key data. This consensus report summarizes the discussions held.
Workshop Report: Systems Biology for Organotypic Cell Cultures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grego, Sonia; Dougherty, Edward R.; Alexander, Francis Joseph
Translating in vitro biological data into actionable information related to human health holds the potential to improve disease treatment and risk assessment of chemical exposures. While genomics has identified regulatory pathways at the cellular level, translation to the organism level requires a multiscale approach accounting for intra-cellular regulation, inter-cellular interaction, and tissue/organ-level effects. Tissue-level effects can now be probed in vitro thanks to recently developed systems of three-dimensional (3D), multicellular, “organotypic” cell cultures, which mimic functional responses of living tissue. However, there remains a knowledge gap regarding interactions across different biological scales, complicating accurate prediction of health outcomes from molecular/genomic data and tissue responses. Systems biology aims at mathematical modeling of complex, non-linear biological systems. We propose to apply a systems biology approach to achieve a computational representation of tissue-level physiological responses by integrating empirical data derived from organotypic culture systems with computational models of intracellular pathways to better predict human responses. Successful implementation of this integrated approach will provide a powerful tool for faster, more accurate and cost-effective screening of potential toxicants and therapeutics. On September 11, 2015, an interdisciplinary group of scientists, engineers, and clinicians gathered for a workshop in Research Triangle Park, North Carolina, to discuss this ambitious goal. Participants represented laboratory-based and computational modeling approaches to pharmacology and toxicology, as well as the pharmaceutical industry, government, non-profits, and academia. Discussions focused on identifying critical system perturbations to model, the computational tools required, and the experimental approaches best suited to generating key data.
Workshop Report: Systems Biology for Organotypic Cell Cultures
Grego, Sonia; Dougherty, Edward R.; Alexander, Francis Joseph; ...
2016-11-14
Translating in vitro biological data into actionable information related to human health holds the potential to improve disease treatment and risk assessment of chemical exposures. While genomics has identified regulatory pathways at the cellular level, translation to the organism level requires a multiscale approach accounting for intra-cellular regulation, inter-cellular interaction, and tissue/organ-level effects. Tissue-level effects can now be probed in vitro thanks to recently developed systems of three-dimensional (3D), multicellular, “organotypic” cell cultures, which mimic functional responses of living tissue. However, there remains a knowledge gap regarding interactions across different biological scales, complicating accurate prediction of health outcomes from molecular/genomic data and tissue responses. Systems biology aims at mathematical modeling of complex, non-linear biological systems. We propose to apply a systems biology approach to achieve a computational representation of tissue-level physiological responses by integrating empirical data derived from organotypic culture systems with computational models of intracellular pathways to better predict human responses. Successful implementation of this integrated approach will provide a powerful tool for faster, more accurate and cost-effective screening of potential toxicants and therapeutics. On September 11, 2015, an interdisciplinary group of scientists, engineers, and clinicians gathered for a workshop in Research Triangle Park, North Carolina, to discuss this ambitious goal. Participants represented laboratory-based and computational modeling approaches to pharmacology and toxicology, as well as the pharmaceutical industry, government, non-profits, and academia. Discussions focused on identifying critical system perturbations to model, the computational tools required, and the experimental approaches best suited to generating key data.
Systems biology for organotypic cell cultures.
Grego, Sonia; Dougherty, Edward R; Alexander, Francis J; Auerbach, Scott S; Berridge, Brian R; Bittner, Michael L; Casey, Warren; Cooley, Philip C; Dash, Ajit; Ferguson, Stephen S; Fennell, Timothy R; Hawkins, Brian T; Hickey, Anthony J; Kleensang, Andre; Liebman, Michael N J; Martin, Florian; Maull, Elizabeth A; Paragas, Jason; Qiao, Guilin Gary; Ramaiahgari, Sreenivasa; Sumner, Susan J; Yoon, Miyoung
2017-01-01
Translating in vitro biological data into actionable information related to human health holds the potential to improve disease treatment and risk assessment of chemical exposures. While genomics has identified regulatory pathways at the cellular level, translation to the organism level requires a multiscale approach accounting for intra-cellular regulation, inter-cellular interaction, and tissue/organ-level effects. Tissue-level effects can now be probed in vitro thanks to recently developed systems of three-dimensional (3D), multicellular, "organotypic" cell cultures, which mimic functional responses of living tissue. However, there remains a knowledge gap regarding interactions across different biological scales, complicating accurate prediction of health outcomes from molecular/genomic data and tissue responses. Systems biology aims at mathematical modeling of complex, non-linear biological systems. We propose to apply a systems biology approach to achieve a computational representation of tissue-level physiological responses by integrating empirical data derived from organotypic culture systems with computational models of intracellular pathways to better predict human responses. Successful implementation of this integrated approach will provide a powerful tool for faster, more accurate and cost-effective screening of potential toxicants and therapeutics. On September 11, 2015, an interdisciplinary group of scientists, engineers, and clinicians gathered for a workshop in Research Triangle Park, North Carolina, to discuss this ambitious goal. Participants represented laboratory-based and computational modeling approaches to pharmacology and toxicology, as well as the pharmaceutical industry, government, non-profits, and academia. Discussions focused on identifying critical system perturbations to model, the computational tools required, and the experimental approaches best suited to generating key data.
Machine vision system for inspecting characteristics of hybrid rice seed
NASA Astrophysics Data System (ADS)
Cheng, Fang; Ying, Yibin
2004-03-01
Obtaining clear images, which improves classification accuracy, involves many factors; light source, lens extender and background are discussed in this paper. The analysis of rice seed reflectance curves showed that the best light source wavelength for discriminating diseased seeds from normal rice seeds in the monochromic image recognition mode was about 815 nm for jinyou402 and shanyou10. To determine optimal conditions for acquiring digital images of rice seed using a computer vision system, an adjustable color machine vision system was developed. The machine vision system with a 20 mm to 25 mm lens extender produces close-up images, which made it easier to recognize characteristics of hybrid rice seeds. A white background proved better than a black background for inspecting rice seeds infected by disease and for the shape-based algorithms. Experimental results indicated good classification for most of the characteristics with the machine vision system. The same algorithm yielded better results under the optimized conditions for quality inspection of rice seed. Specifically, the image processing can resolve details such as fine fissures with the machine vision system.
NASA Astrophysics Data System (ADS)
Isnur Haryudo, Subuh; Imam Agung, Achmad; Firmansyah, Rifqi
2018-04-01
The purpose of this research is to develop learning media for control engineering using Matrix Laboratory software with an industry-requirements approach. Learning media serve as a tool for creating a better and more effective teaching and learning situation because they can accelerate the learning process and thus enhance the quality of learning. Teaching control techniques with Matrix Laboratory software can increase students' interest and attention, provide real experience, and foster an independent attitude. This research design follows the research and development (R & D) method, modified by a multi-disciplinary team of researchers. The research used a computer-based learning method consisting of a computer and Matrix Laboratory software integrated with physical props. Matrix Laboratory can visualize the theory and analysis of control systems, integrating computation, visualization and programming in an easy-to-use environment. The result of this instructional media development is the use of mathematical models in Matrix Laboratory software for a control system application with a DC motor plant and a PID (Proportional-Integral-Derivative) controller. This is relevant because manufacturing systems built around Distributed Control Systems (DCSs), Programmable Logic Controllers (PLCs), and microcontrollers (MCUs) widely use PID control in industrial production processes.
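A representative exercise of the kind this media supports, PID speed control of a simple DC motor model, can be sketched outside Matrix Laboratory as well. The Python script below discretizes a first-order motor model with Euler steps and closes the loop with a PID controller; all gains and motor constants are assumed values chosen for illustration, not taken from the paper.

```python
# PID control of a first-order DC-motor speed model (illustrative values only).
dt, t_end = 0.001, 2.0
tau, K_motor = 0.5, 2.0            # motor time constant and gain (assumed)
Kp, Ki, Kd = 8.0, 20.0, 0.1        # PID gains (assumed)
setpoint = 100.0                   # desired speed, rad/s

omega, integral, prev_err = 0.0, 0.0, setpoint
for _ in range(int(t_end / dt)):
    err = setpoint - omega
    integral += err * dt
    derivative = (err - prev_err) / dt
    u = Kp * err + Ki * integral + Kd * derivative
    # first-order plant: tau * domega/dt = -omega + K_motor * u
    omega += dt * (-omega + K_motor * u) / tau
    prev_err = err

print(f"speed after {t_end}s: {omega:.1f} rad/s (setpoint {setpoint})")
```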
Nonvolatile reconfigurable sequential logic in a HfO2 resistive random access memory array.
Zhou, Ya-Xiong; Li, Yi; Su, Yu-Ting; Wang, Zhuo-Rui; Shih, Ling-Yi; Chang, Ting-Chang; Chang, Kuan-Chang; Long, Shi-Bing; Sze, Simon M; Miao, Xiang-Shui
2017-05-25
Resistive random access memory (RRAM) based reconfigurable logic provides a temporal programmable dimension to realize Boolean logic functions and is regarded as a promising route to build non-von Neumann computing architecture. In this work, a reconfigurable operation method is proposed to perform nonvolatile sequential logic in a HfO2-based RRAM array. Eight kinds of Boolean logic functions can be implemented within the same hardware fabrics. During the logic computing processes, the RRAM devices in an array are flexibly configured in a bipolar or complementary structure. The validity was demonstrated by experimentally implemented NAND and XOR logic functions and a theoretically designed 1-bit full adder. With the trade-off between temporal and spatial computing complexity, our method makes better use of limited computing resources, thus provides an attractive scheme for the construction of logic-in-memory systems.
Baroudi, Kusai; Ibraheem, Shukran Nasser
2015-01-01
Background: This paper aimed to evaluate the application of computer-aided design and computer-aided manufacturing (CAD-CAM) technology and the factors that affect the survival of restorations. Materials and Methods: A thorough literature search using PubMed, Medline, Embase, Science Direct, Wiley Online Library, and grey literature was performed from the year 2004 up to June 2014. Only relevant research was considered. Results: The use of chair-side CAD/CAM systems is promising in all dental branches in terms of minimizing the time and effort expended by dentists, technicians, and patients in restoring and maintaining patient oral function and aesthetics, while providing a high-quality outcome. Conclusion: The way of producing and placing restorations made with chair-side CAD/CAM (CEREC and E4D) devices is better than that of restorations made by conventional laboratory procedures. PMID:25954082
Quantitative accuracy of the closed-form least-squares solution for targeted SPECT.
Shcherbinin, S; Celler, A
2010-10-07
The aim of this study is to investigate the quantitative accuracy of the closed-form least-squares solution (LSS) for single photon emission computed tomography (SPECT). The main limitation for employing this method in actual clinical reconstructions is the computational cost related to operations with a large-sized system matrix. However, in some clinical situations, the size of the system matrix can be decreased using targeted reconstruction. For example, some oncology SPECT studies are characterized by intense tracer uptakes that are localized in relatively small areas, while the remaining parts of the patient body have only a low activity background. Conventional procedures reconstruct the activity distribution in the whole object, which leads to relatively poor image accuracy/resolution for tumors while computer resources are wasted trying to rebuild diagnostically useless background. In this study, we apply the concept of targeted reconstruction to SPECT phantom experiments imitating such oncology scans. Our approach includes two major components: (i) decomposition of the entire imaging system of equations and extraction of only those parts that correspond to the targets, i.e., regions of interest (ROIs) encompassing active containers/tumors, and (ii) generation of the closed-form LSS for each target ROI. We compared these ROI-based LSS reconstructions with those obtained by the conventional MLEM approach. The analysis of the five processed cases from two phantom experiments demonstrated that the LSS approach outperformed MLEM in terms of the noise level inside the ROI. On the other hand, MLEM better recovered total activity if the number of iterations was large enough. For the experiment without background activity, the ROI-based LSS led to noticeably better spatial activity distribution inside the ROI. However, the distributions pertaining to both approaches were practically identical for the experiment with the concentration ratio 7:1 between the containers and the background.
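The core idea of solving a least-squares problem restricted to the ROI columns of the system matrix can be sketched in a few lines of Python; the matrix, ROI, and noise below are randomly generated toy data, not the phantom measurements analyzed in the study.

```python
# Toy illustration (random system matrix, not real SPECT data) of extracting
# only the columns of the system matrix that correspond to a target ROI and
# solving the closed-form least-squares problem for that ROI alone.
import numpy as np

rng = np.random.default_rng(0)
n_bins, n_vox = 200, 50                  # projection bins, image voxels
A = rng.random((n_bins, n_vox))          # full system matrix (hypothetical)
x_true = np.zeros(n_vox)
roi = np.arange(10, 15)                  # voxels forming the "hot" target ROI
x_true[roi] = 5.0
y = A @ x_true + 0.01 * rng.standard_normal(n_bins)   # measured projections

A_roi = A[:, roi]                        # keep only the ROI columns
x_roi, *_ = np.linalg.lstsq(A_roi, y, rcond=None)     # closed-form LSS for ROI
print("recovered ROI activity:", np.round(x_roi, 2))
```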
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wong, Michael K.; Davidson, Megan
As part of Sandia’s nuclear deterrence mission, the B61-12 Life Extension Program (LEP) aims to modernize the aging weapon system. Modernization requires requalification and Sandia is using high performance computing to perform advanced computational simulations to better understand, evaluate, and verify weapon system performance in conjunction with limited physical testing. The Nose Bomb Subassembly (NBSA) of the B61-12 is responsible for producing a fuzing signal upon ground impact. The fuzing signal is dependent upon electromechanical impact sensors producing valid electrical fuzing signals at impact. Computer generated models were used to assess the timing between the impact sensor’s response to the deceleration of impact and damage to major components and system subassemblies. The modeling and simulation team worked alongside the physical test team to design a large-scale reverse ballistic test to not only assess system performance, but to also validate their computational models. The reverse ballistic test conducted at Sandia’s sled test facility sent a rocket sled with a representative target into a stationary B61-12 (NBSA) to characterize the nose crush and functional response of NBSA components. Data obtained from data recorders and high-speed photometrics were integrated with previously generated computer models in order to refine and validate the model’s ability to reliably simulate real-world effects. Large-scale tests are impractical to conduct for every single impact scenario. By creating reliable computer models, we can perform simulations that identify trends and produce estimates of outcomes over the entire range of required impact conditions. Sandia’s HPCs enable geometric resolution that was unachievable before, allowing for more fidelity and detail, and creating simulations that can provide insight to support evaluation of requirements and performance margins. As computing resources continue to improve, researchers at Sandia are hoping to improve these simulations so they provide increasingly credible analysis of the system response and performance over the full range of conditions.
Mathematical and Computational Foundations of Recurrence Quantifications
NASA Astrophysics Data System (ADS)
Marwan, Norbert; Webber, Charles L.
Real-world systems possess deterministic trajectories, phase singularities and noise. Dynamic trajectories have been studied in temporal and frequency domains, but these are linear approaches. Basic to the field of nonlinear dynamics is the representation of trajectories in phase space. A variety of nonlinear tools such as the Lyapunov exponent, Kolmogorov-Sinai entropy, correlation dimension, etc. have successfully characterized trajectories in phase space, provided the systems studied were stationary in time. Ubiquitous in nature, however, are systems that are nonlinear and nonstationary and that exist in noisy environments, all of which breaks the assumptions behind otherwise powerful linear tools. What has been unfolding over the last quarter of a century, however, is the timely discovery and practical demonstration that the recurrences of system trajectories in phase space can provide important clues to the system designs from which they derive. In this chapter we will introduce the basics of recurrence plots (RP) and their quantification analysis (RQA). We will begin by summarizing the concept of phase space reconstructions. Then we will provide the mathematical underpinnings of recurrence plots followed by the details of recurrence quantifications. Finally, we will discuss computational approaches that have been implemented to make recurrence strategies feasible and useful. As computers become faster and computer languages advance, younger generations of researchers will be stimulated and encouraged to capture nonlinear recurrence patterns and quantification in even better formats. This particular branch of nonlinear dynamics remains wide open for the definition of new recurrence variables and new applications untouched to date.
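For readers unfamiliar with the construction, a minimal Python sketch of the recurrence matrix R_ij = Θ(ε − ||x_i − x_j||) and one basic RQA measure (the recurrence rate) follows; the signal, embedding parameters, and threshold are illustrative choices only.

```python
# Minimal sketch of a recurrence matrix and the recurrence rate for a scalar
# time series with a simple delay embedding; the threshold and embedding
# parameters below are illustrative, not recommended values.
import numpy as np

def recurrence_matrix(x, dim=2, delay=1, eps=0.1):
    n = len(x) - (dim - 1) * delay
    # delay-embedded phase-space vectors
    emb = np.column_stack([x[i * delay:i * delay + n] for i in range(dim)])
    dists = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    return (dists <= eps).astype(int)

x = np.sin(np.linspace(0, 8 * np.pi, 400))
R = recurrence_matrix(x, dim=2, delay=10, eps=0.2)
recurrence_rate = R.sum() / R.size
print(f"recurrence rate: {recurrence_rate:.3f}")
```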
Design of a fast computer-based partial discharge diagnostic system
NASA Technical Reports Server (NTRS)
Oliva, Jose R.; Karady, G. G.; Domitz, Stan
1991-01-01
Partial discharges cause progressive deterioration of insulating materials working under high voltage conditions and may ultimately lead to insulator failure. Experimental findings indicate that deterioration increases with the number of discharges and is consequently proportional to the magnitude and frequency of the applied voltage. In order to obtain a better understanding of the mechanisms of deterioration produced by partial discharges, instrumentation capable of individual pulse resolution is required. A new computer-based partial discharge detection system was designed and constructed to conduct long duration tests on sample capacitors. This system is capable of recording a large number of pulses without dead time and producing valuable information related to the amplitude, polarity, and charge content of the discharges. The operation of the system is automatic and no human supervision is required during the testing stage. Ceramic capacitors were tested at high voltage in long duration tests. The results obtained indicated that the charge content of partial discharges shifts towards higher levels of charge as the level of deterioration in the capacitor increases.
Exascale Storage Systems the SIRIUS Way
NASA Astrophysics Data System (ADS)
Klasky, S. A.; Abbasi, H.; Ainsworth, M.; Choi, J.; Curry, M.; Kurc, T.; Liu, Q.; Lofstead, J.; Maltzahn, C.; Parashar, M.; Podhorszki, N.; Suchyta, E.; Wang, F.; Wolf, M.; Chang, C. S.; Churchill, M.; Ethier, S.
2016-10-01
As the exascale computing age emerges, data related issues are becoming critical factors that determine how and where we do computing. Popular approaches used by traditional I/O solutions and storage libraries become increasingly bottlenecked due to their assumptions about data movement, re-organization, and storage. While new technologies, such as “burst buffers”, can help address some of the short-term performance issues, it is essential that we reexamine the underlying storage and I/O infrastructure to effectively support requirements and challenges at exascale and beyond. In this paper we present a new approach to the exascale Storage System and I/O (SSIO), which is based on allowing users to inject application knowledge into the system and leveraging this knowledge to better manage, store, and access large data volumes so as to minimize the time to scientific insights. Central to our approach is the distinction between the data, the metadata, and the knowledge contained therein, transferred from the user to the system by describing the “utility” of data as it ages.
Is the US Workforce Prepared to Thrive in the Past or in the Future?
ERIC Educational Resources Information Center
Burrus, Daniel
2014-01-01
Past education focused on the three Rs (reading, 'riting and 'rithmetic), but these no longer give humans an edge over advanced computers and automation systems. This is why we need to understand where the future is heading and better prepare both our current workforce as well as the future workforce for tomorrow's job market. Of…
NASA Technical Reports Server (NTRS)
2000-01-01
Kennedy Space Center's need to conduct real-time monitoring of Space Shuttle operations led to the development of Netlander Inc.'s JTouch system. The technology behind JTouch allows engineers to view Space Shuttle and ground support data from any desktop computer using a web browser. Companies can make use of JTouch to better monitor locations scattered around the world, increasing decision-making speed and reducing travel costs for site visits.
2001-04-01
IHS), could share information technology (IT) and patient medical information to provide greater continuity of care, accelerate VA eligibility determinations, and save software development costs. ... system, which primarily includes information on patient hospital admission and discharge, patient medications, laboratory results, and radiology
Turbine endwall single cylinder program
NASA Technical Reports Server (NTRS)
Langston, L. S.
1982-01-01
Detailed measurement of the flow field in front of a large-scale single cylinder mounted in a wind tunnel is discussed. A better understanding is sought of the three dimensional separation occurring in front of the cylinder on the endwall and of the vortex system that is formed. A data base with which to check analytical and numerical computer models of three dimensional flows is also anticipated.
Graphical Interface for the Study of Gas-Phase Reaction Kinetics: Cyclopentene Vapor Pyrolysis
NASA Astrophysics Data System (ADS)
Marcotte, Ronald E.; Wilson, Lenore D.
2001-06-01
The undergraduate laboratory experiment on the pyrolysis of gaseous cyclopentene has been modernized to improve safety, speed, and precision and to better reflect the current practice of physical chemistry. It now utilizes virtual instrument techniques to create a graphical computer interface for the collection and display of experimental data. An electronic pressure gauge has replaced the mercury manometer formerly needed in proximity to the 500 °C pyrolysis oven. Students have much better real-time information available to them and no longer require multiple lab periods to get rate constants and acceptable Arrhenius parameters. The time saved on manual data collection is used to give the students a tour of the computer interfacing hardware and software and a hands-on introduction to gas-phase reagent preparation using a research-grade high-vacuum system. This includes loading the sample, degassing it by the freeze-pump-thaw technique, handling liquid nitrogen and working through the logic necessary for each reconfiguration of the diffusion pump section and the submanifolds.
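The data-reduction step mentioned above (extracting Arrhenius parameters from measured rate constants) amounts to a linear fit of ln k against 1/T; the short Python sketch below illustrates it with made-up rate constants, not data from the experiment.

```python
# Sketch of the students' data-reduction step: fitting rate constants k(T)
# to the Arrhenius equation k = A*exp(-Ea/(R*T)) via a linear fit of ln(k)
# against 1/T. The temperatures and k values below are made-up numbers.
import numpy as np

R = 8.314                     # gas constant, J/(mol K)
T = np.array([750.0, 770.0, 790.0, 810.0])       # K (illustrative)
k = np.array([1.2e-4, 3.1e-4, 7.4e-4, 1.6e-3])   # 1/s (illustrative)

slope, intercept = np.polyfit(1.0 / T, np.log(k), 1)
Ea = -slope * R               # activation energy, J/mol
A = np.exp(intercept)         # pre-exponential factor, 1/s
print(f"Ea = {Ea / 1000:.1f} kJ/mol, A = {A:.2e} 1/s")
```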
Antenna pattern control using impedance surfaces
NASA Technical Reports Server (NTRS)
Balanis, Constantine A.; Liu, Kefeng
1992-01-01
During this research period, we effectively transferred existing computer codes from the CRAY supercomputer to workstation-based systems. The workstation-based version of our code preserved the accuracy of the numerical computations while giving a much better turn-around time than the CRAY supercomputer. This relieved us of heavy dependence on the supercomputer account budget and made the codes developed in this research project more feasible for applications. The analysis of pyramidal horns with impedance surfaces was our major focus during this research period. Three different modeling algorithms for analyzing lossy impedance surfaces were investigated and compared with measured data. Through this investigation, we discovered that a hybrid Fourier transform technique, which uses the eigenmodes in the stepped waveguide section and the Fourier-transformed field distributions across the stepped discontinuities for lossy impedance coatings, gives better accuracy in analyzing lossy coatings. After further refinement of the present technique, we will perform an accurate radiation pattern synthesis in the coming reporting period.
ALLY: An operator's associate for satellite ground control systems
NASA Technical Reports Server (NTRS)
Bushman, J. B.; Mitchell, Christine M.; Jones, P. M.; Rubin, K. S.
1991-01-01
The key characteristics of an intelligent advisory system are explored. A central feature is that human-machine cooperation should be based on a metaphor of human-to-human cooperation. ALLY, a computer-based operator's associate built on a preliminary theory of human-to-human cooperation, is discussed. ALLY assists the operator in carrying out the supervisory control functions for a simulated NASA ground control system. Experimental evaluation of ALLY indicates that operators using ALLY performed at least as well as they did when using a human associate and in some cases even better.
Steady-state and transient operation of a heat-pipe radiator system
NASA Technical Reports Server (NTRS)
Sellers, J. P.
1974-01-01
Data obtained on a VCHP heat-pipe radiator system tested in a vacuum environment were studied. Analyses and interpretation of the steady-state results are presented along with an initial analysis of some of the transient data. Particular emphasis was placed on quantitative comparisons of the experimental data with computer model simulations. The results of the study provide a better understanding of the system but do not provide a complete explanation for the observed low VCHP performance and the relatively flat radiator panel temperature distribution. The results of the study also suggest hardware, software, and testing improvements.
Analysis of a combined refrigerator-generator space power system
NASA Technical Reports Server (NTRS)
Klann, J. L.
1973-01-01
Single-shaft and two-shaft rotating machinery arrangements using neon are described for application in a combined refrigerator-generator power system for space missions. The arrangements consist of combined assemblies of a power turbine, alternator, compressor, and cryo-turbine with a single-stage radial-flow design. A computer program was prepared to study the thermodynamics of the dual system in the evaluation of its cryocooling/electric capacity and associated weight. A preliminary analysis showed that a two-shaft arrangement of the power- and refrigeration-loop rotating machinery provided better output capacities than a single-shaft arrangement, without prohibitive operating compromises.
Mayer, Richard E; Hegarty, Mary; Mayer, Sarah; Campbell, Julie
2005-12-01
In 4 experiments, students received a lesson consisting of computer-based animation and narration or a lesson consisting of paper-based static diagrams and text. The lessons used the same words and graphics in the paper-based and computer-based versions to explain the process of lightning formation (Experiment 1), how a toilet tank works (Experiment 2), how ocean waves work (Experiment 3), and how a car's braking system works (Experiment 4). On subsequent retention and transfer tests, the paper group performed significantly better than the computer group on 4 of 8 comparisons, and there was no significant difference on the rest. These results support the static media hypothesis, in which static illustrations with printed text reduce extraneous processing and promote germane processing as compared with narrated animations.
Development of computer games for assessment and training in post-stroke arm telerehabilitation.
Rodriguez-de-Pablo, Cristina; Perry, Joel C; Cavallaro, Francesca I; Zabaleta, Haritz; Keller, Thierry
2012-01-01
Stroke is the leading cause of long term disability among adults in industrialized nations. The majority of these disabilities include deficiencies in arm function, which can make independent living very difficult. Research shows that better results in rehabilitation are obtained when patients receive more intensive therapy. However this intensive therapy is currently too expensive to be provided by the public health system, and at home few patients perform the repetitive exercises recommended by their therapists. Computer games can provide an affordable, enjoyable, and effective way to intensify treatment, while keeping the patient as well as their therapists informed about their progress. This paper presents the study, design, implementation and user-testing of a set of computer games for at-home assessment and training of upper-limb motor impairment after stroke.
Ejection mechanisms in the sublayer of a turbulent channel
NASA Technical Reports Server (NTRS)
Jimenez, J.; Moin, P.; Moser, R. D.; Keefe, L. R.
1987-01-01
A possible model for the inception of vorticity ejections in the viscous sublayer of a turbulent rectangular channel is presented. It was shown that this part of the flow is dominated by protruding strong shear layers of z-vorticity, and a mechanism was proposed for their maintenance and reproduction that is essentially equivalent to the one responsible for the instability of 2-D Tollmien-Schlichting waves. Efforts to isolate a single structure computationally for study have so far failed, since single structures appear to decay in the absence of external forcing, but a convenient computational model was identified in the form of a long, narrow periodic computational box containing at each moment only a few structures. Further work on the identification of better reduced systems is in progress.
[Cost analysis for navigation in knee endoprosthetics].
Cerha, O; Kirschner, S; Günther, K-P; Lützner, J
2009-12-01
Total knee arthroplasty (TKA) is one of the most frequent procedures in orthopaedic surgery. The outcome depends on a range of factors including leg alignment and implant positioning in addition to patient-associated factors. Computer-assisted navigation systems can improve the restoration of a neutral leg alignment. This procedure has become established especially in Europe and North America. The additional expenses are not reimbursed in the German DRG (Diagnosis Related Groups) system. In the present study a cost analysis of computer-assisted TKA compared to the conventional technique was performed. The acquisition expenses of various navigation systems (5 and 10 year depreciation), annual costs for maintenance and software updates, as well as the accompanying costs per operation (consumables, additional operating time) were considered. The additional operating time was determined on the basis of a meta-analysis of the current literature. Scenarios with 25, 50, 100, 200 and 500 computer-assisted TKAs per year were simulated. The size of the incremental costs of computer-assisted TKA depends mainly on the annual volume and the additional operating time. A relevant decrease of the incremental costs was detected between 50 and 100 procedures per year. In a model with 100 computer-assisted TKAs per year, an additional operating time of 14 mins, and a 10 year depreciation of the investment costs, the incremental expenses amount to
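The cost structure described above (depreciated acquisition, annual maintenance, per-case consumables, and extra operating time, spread over the annual case volume) can be sketched as below; all monetary values are placeholders chosen for illustration, not figures reported in the study, and only the 14-minute additional operating time is taken from the abstract.

```python
# Hedged sketch of the cost structure described in the abstract; every
# monetary value below is a placeholder, not a number from the study.
def incremental_cost_per_tka(acquisition, depreciation_years, annual_maintenance,
                             consumables_per_op, extra_or_minutes,
                             or_cost_per_minute, cases_per_year):
    fixed_per_year = acquisition / depreciation_years + annual_maintenance
    variable_per_case = consumables_per_op + extra_or_minutes * or_cost_per_minute
    return fixed_per_year / cases_per_year + variable_per_case

for n in (25, 50, 100, 200, 500):
    cost = incremental_cost_per_tka(acquisition=50_000, depreciation_years=10,
                                    annual_maintenance=5_000, consumables_per_op=100,
                                    extra_or_minutes=14, or_cost_per_minute=10,
                                    cases_per_year=n)
    print(f"{n:>3} cases/year -> ~{cost:.0f} per case (placeholder currency units)")
```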
Emergent Leadership and Team Effectiveness on a Team Resource Allocation Task
1987-10-01
equivalent training and experience on this task, but they had different levels of experience with computers and video games. This differential experience...typed: that is, it is sex-typed to the extent that males spend more time on related instruments like computers and video games. However, the sex...perform better or worse than less talkative teams? Did teams with much computer and/or video game experience perform better than inexperienced teams
Anisotropic resonator analysis using the Fourier-Bessel mode solver
NASA Astrophysics Data System (ADS)
Gauthier, Robert C.
2018-03-01
A numerical mode solver for optical structures that conform to cylindrical symmetry, using Faraday's and Ampere's laws as starting expressions, is developed for cases where electric or magnetic anisotropy is present. The technique builds on the existing Fourier-Bessel mode solver, which allows resonator states to be computed by exploiting the symmetry properties of the resonator and states to reduce the matrix system. The introduction of anisotropy into the theoretical framework facilitates the inclusion of PML borders, permitting the computation of open-ended structures and a better estimation of the resonator state quality factor. Matrix populating expressions are provided that can accommodate any material anisotropy with arbitrary orientation in the computation domain. Several examples of electrically anisotropic computations are provided for rotationally symmetric structures such as standard optical fibers, axial Bragg-ring fibers and bottle resonators. The anisotropy present in the materials introduces off-diagonal matrix elements in the permittivity tensor when expressed in cylindrical coordinates. The effects of the anisotropy on computed states are presented and discussed.
Measuring the effects of heterogeneity on distributed systems
NASA Technical Reports Server (NTRS)
El-Toweissy, Mohamed; Zeineldine, Osman; Mukkamala, Ravi
1991-01-01
Distributed computer systems in daily use are becoming more and more heterogeneous. Currently, much of the design and analysis studies of such systems assume homogeneity. This assumption of homogeneity has been mainly driven by the resulting simplicity in modeling and analysis. A simulation study is presented which investigated the effects of heterogeneity on scheduling algorithms for hard real time distributed systems. In contrast to previous results which indicate that random scheduling may be as good as a more complex scheduler, this algorithm is shown to be consistently better than a random scheduler. This conclusion is more prevalent at high workloads as well as at high levels of heterogeneity.
Medical physics: some recollections in diagnostic X-ray imaging and therapeutic radiology.
Gray, J E; Orton, C G
2000-12-01
Medical physics has changed dramatically since 1895. There was a period of slow evolutionary change during the first 70 years after Roentgen's discovery of x rays. With the advent of the computer, however, both diagnostic and therapeutic radiology have undergone rapid growth and changes. Technologic advances such as computed tomography and magnetic resonance imaging in diagnostic imaging and three-dimensional treatment planning systems, stereotactic radiosurgery, and intensity modulated radiation therapy in radiation oncology have resulted in substantial changes in medical physics. These advances have improved diagnostic imaging and radiation therapy while expanding the need for better educated and experienced medical physics staff.
Fast Reduction Method in Dominance-Based Information Systems
NASA Astrophysics Data System (ADS)
Li, Yan; Zhou, Qinghua; Wen, Yongchuan
2018-01-01
In real world applications, there are often some data with continuous values or preference-ordered values. Rough sets based on dominance relations can effectively deal with these kinds of data. Attribute reduction can be done in the framework of dominance-relation based approach to better extract decision rules. However, the computational cost of the dominance classes greatly affects the efficiency of attribute reduction and rule extraction. This paper presents an efficient method of computing dominance classes, and further compares it with traditional method with increasing attributes and samples. Experiments on UCI data sets show that the proposed algorithm obviously improves the efficiency of the traditional method, especially for large-scale data.
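As a small illustration of the dominance-class computation the paper optimizes, the Python sketch below builds, for each object, the set of objects that are at least as good on every (preference-ordered) condition attribute; the decision table is invented, and the vectorized comparison is only one straightforward way to do it, not the paper's proposed algorithm.

```python
# Illustrative computation of dominance classes: for each object x, the
# dominating set D+(x) contains every object that is at least as good as x
# on all (preference-ordered) condition attributes. The table is made up.
import numpy as np

data = np.array([   # rows = objects, columns = condition attributes
    [3, 2, 5],
    [4, 2, 5],
    [2, 1, 3],
    [4, 3, 5],
])

def dominating_sets(table):
    n = table.shape[0]
    # dom[i, j] is True when object j dominates (is >= on every attribute) object i
    dom = np.all(table[None, :, :] >= table[:, None, :], axis=-1)
    return [set(np.flatnonzero(dom[i])) for i in range(n)]

for i, d in enumerate(dominating_sets(data)):
    print(f"D+({i}) = {sorted(d)}")
```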
Mobile Cloud Computing with SOAP and REST Web Services
NASA Astrophysics Data System (ADS)
Ali, Mushtaq; Fadli Zolkipli, Mohamad; Mohamad Zain, Jasni; Anwar, Shahid
2018-05-01
Mobile computing in conjunction with mobile web services offers a strong approach by which the limitations of mobile devices may be tackled. Mobile Web Services are based on two types of technologies, SOAP and REST, which work with existing protocols to develop Web services. Both approaches carry their own distinct features, yet keeping the constrained resources of mobile devices in mind, the better of the two is considered to be the one which minimizes the computation and transmission overhead while offloading. Transferring load from a mobile device to remote servers for execution is called computational offloading. There are numerous approaches that make computational offloading a viable solution for easing the resource constraints of mobile devices, yet a dynamic method of computational offloading is always required for a smooth and simple migration of complex tasks. The intention of this work is to present a distinctive approach which does not engage the mobile resources for a long time. The concept of web services is utilized in our work to delegate computationally intensive tasks for remote execution. We tested both the SOAP Web services approach and REST Web services for mobile computing. Two parameters were considered in our lab experiments: execution time and energy consumption. The results show that RESTful Web services execution is far better than executing the same application by the SOAP Web services approach, in terms of execution time and energy consumption. In experiments with the developed prototype matrix multiplication app, REST execution time is about 200% better than the SOAP execution approach. In the case of energy consumption, REST execution is about 250% better than the SOAP execution approach.
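To make the REST-style offloading pattern concrete, here is a hedged Python sketch that POSTs a matrix-multiplication task to a remote endpoint and reads back the result; the URL and JSON payload shape are hypothetical and do not reflect the prototype app used in the experiments.

```python
# Hedged sketch of offloading a matrix multiplication through a RESTful web
# service; the endpoint URL and JSON schema are hypothetical placeholders,
# not the interface used in the paper's experiments.
import json
import urllib.request

def offload_matmul(a, b, endpoint="http://example.com/api/matmul"):
    payload = json.dumps({"a": a, "b": b}).encode("utf-8")
    req = urllib.request.Request(endpoint, data=payload,
                                 headers={"Content-Type": "application/json"},
                                 method="POST")
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["result"]

if __name__ == "__main__":
    # example call, commented out because the placeholder endpoint does not exist:
    # result = offload_matmul([[1, 2], [3, 4]], [[5, 6], [7, 8]])
    pass
```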
TBIdoc: 3D content-based CT image retrieval system for traumatic brain injury
NASA Astrophysics Data System (ADS)
Li, Shimiao; Gong, Tianxia; Wang, Jie; Liu, Ruizhe; Tan, Chew Lim; Leong, Tze Yun; Pang, Boon Chuan; Lim, C. C. Tchoyoson; Lee, Cheng Kiang; Tian, Qi; Zhang, Zhuo
2010-03-01
Traumatic brain injury (TBI) is a major cause of death and disability. Computed Tomography (CT) scanning is widely used in the diagnosis of TBI. Nowadays, a large amount of TBI CT data is stacked in hospital radiology departments. Such data and the associated patient information contain valuable information for clinical diagnosis and outcome prediction. However, current hospital database systems do not provide an efficient and intuitive tool for doctors to search for cases relevant to the current study case. In this paper, we present the TBIdoc system: a content-based image retrieval (CBIR) system which works on TBI CT images. In this web-based system, users can query by uploading CT image slices from one study; the retrieval result is a list of TBI cases ranked according to their 3D visual similarity to the query case. Specifically, cases of TBI CT images often present diffuse or focal lesions. In the TBIdoc system, these pathological image features are represented as bin-based binary feature vectors. We use the Jaccard-Needham measure as the similarity measurement. Based on these, we propose a 3D similarity measure for computing the similarity score between two series of CT slices. nDCG is used to evaluate the system performance, which shows that the system produces satisfactory retrieval results. The system is expected to improve current hospital data management in TBI and to give better support for the clinical decision-making process. It may also contribute to computer-aided education in TBI.
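The per-slice comparison of bin-based binary feature vectors with the Jaccard measure can be sketched as follows; the feature vectors are random toy data, and the simple slice-pair averaging is only one plausible reading of a 3D series score, not the exact measure proposed in the paper.

```python
# Sketch of bin-based binary features compared with the Jaccard measure;
# the series-level averaging is an assumption for illustration, not the
# paper's exact 3D similarity measure. Feature vectors are random toy data.
import numpy as np

def jaccard(u, v):
    inter = np.logical_and(u, v).sum()
    union = np.logical_or(u, v).sum()
    return inter / union if union else 1.0

def series_similarity(query_slices, case_slices):
    # compare aligned slice pairs and average their Jaccard scores
    return float(np.mean([jaccard(q, c) for q, c in zip(query_slices, case_slices)]))

rng = np.random.default_rng(1)
query = rng.integers(0, 2, size=(5, 32))   # 5 slices, 32 feature bins each
case = rng.integers(0, 2, size=(5, 32))
print(f"similarity score: {series_similarity(query, case):.3f}")
```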
Computer-enhanced laparoscopic training system (CELTS): bridging the gap.
Stylopoulos, N; Cotin, S; Maithel, S K; Ottensmeye, M; Jackson, P G; Bardsley, R S; Neumann, P F; Rattner, D W; Dawson, S L
2004-05-01
There is a large and growing gap between the need for better surgical training methodologies and the systems currently available for such training. In an effort to bridge this gap and overcome the disadvantages of the training simulators now in use, we developed the Computer-Enhanced Laparoscopic Training System (CELTS). CELTS is a computer-based system capable of tracking the motion of laparoscopic instruments and providing feedback about performance in real time. CELTS consists of a mechanical interface, a customizable set of tasks, and an Internet-based software interface. The special cognitive and psychomotor skills a laparoscopic surgeon should master were explicitly defined and transformed into quantitative metrics based on kinematics analysis theory. A single global standardized and task-independent scoring system utilizing a z-score statistic was developed. Validation exercises were performed. The scoring system clearly revealed a gap between experts and trainees, irrespective of the task performed; none of the trainees obtained a score above the threshold that distinguishes the two groups. Moreover, CELTS provided educational feedback by identifying the key factors that contributed to the overall score. Among the defined metrics, depth perception, smoothness of motion, instrument orientation, and the outcome of the task are major indicators of performance and key parameters that distinguish experts from trainees. Time and path length alone, which are the most commonly used metrics in currently available systems, are not considered good indicators of performance. CELTS is a novel and standardized skills trainer that combines the advantages of computer simulation with the features of the traditional and popular training boxes. CELTS can easily be used with a wide array of tasks and ensures comparability across different training conditions. This report further shows that a set of appropriate and clinically relevant performance metrics can be defined and a standardized scoring system can be designed.
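A z-score-based global score of the kind described (standardizing each kinematic metric against expert statistics and combining them) might look like the sketch below; the metric names, expert means and standard deviations, and directionality flags are invented for illustration and are not CELTS's actual metric definitions.

```python
# Hedged sketch of a z-score style global performance score: each raw metric
# is standardized against (invented) expert statistics and the standardized
# metrics are averaged. Metric names and all numbers are illustrative only.
import numpy as np

expert_stats = {                 # (mean, std) over experts, per metric
    "path_length":      (1.20, 0.15),
    "smoothness":       (0.80, 0.10),
    "depth_perception": (0.90, 0.12),
    "task_time":        (45.0, 6.0),
}
lower_is_better = {"path_length", "task_time"}

def global_score(measurements):
    zs = []
    for name, value in measurements.items():
        mean, std = expert_stats[name]
        z = (value - mean) / std
        if name in lower_is_better:
            z = -z               # flip sign so that larger is always better
        zs.append(z)
    return float(np.mean(zs))

trainee = {"path_length": 1.9, "smoothness": 0.55,
           "depth_perception": 0.60, "task_time": 70.0}
print(f"global z-score: {global_score(trainee):.2f}")
```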
Identification of Program Signatures from Cloud Computing System Telemetry Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nichols, Nicole M.; Greaves, Mark T.; Smith, William P.
Malicious cloud computing activity can take many forms, including running unauthorized programs in a virtual environment. Detection of these malicious activities while preserving the privacy of the user is an important research challenge. Prior work has shown the potential viability of using cloud service billing metrics as a mechanism for proxy identification of malicious programs. Previously this novel detection method has been evaluated in a synthetic and isolated computational environment. In this paper we demonstrate the ability of billing metrics to identify programs, in an active cloud computing environment, including multiple virtual machines running on the same hypervisor. The open source cloud computing platform OpenStack is used for private cloud management at Pacific Northwest National Laboratory. OpenStack provides a billing tool (Ceilometer) to collect system telemetry measurements. We identify four different programs running on four virtual machines under the same cloud user account. Programs were identified with up to 95% accuracy. This accuracy is dependent on the distinctiveness of telemetry measurements for the specific programs we tested. Future work will examine the scalability of this approach for a larger selection of programs to better understand the uniqueness needed to identify a program. Additionally, future work should address the separation of signatures when multiple programs are running on the same virtual machine.
Synthetic biology: insights into biological computation.
Manzoni, Romilde; Urrios, Arturo; Velazquez-Garcia, Silvia; de Nadal, Eulàlia; Posas, Francesc
2016-04-18
Organisms have evolved a broad array of complex signaling mechanisms that allow them to survive in a wide range of environmental conditions. They are able to sense external inputs and produce an output response by computing the information. Synthetic biology attempts to rationally engineer biological systems in order to perform desired functions. Our increasing understanding of biological systems guides this rational design, while the huge background in electronics for building circuits defines the methodology. In this context, biocomputation is the branch of synthetic biology aimed at implementing artificial computational devices using engineered biological motifs as building blocks. Biocomputational devices are defined as biological systems that are able to integrate inputs and return outputs following pre-determined rules. Over the last decade the number of available synthetic engineered devices has increased exponentially; simple and complex circuits have been built in bacteria, yeast and mammalian cells. These devices can manage and store information, take decisions based on past and present inputs, and even convert a transient signal into a sustained response. The field is experiencing fast growth, and every day it becomes easier to implement more complex biological functions. This is mainly due to advances in in vitro DNA synthesis, new genome editing tools, novel molecular cloning techniques, continuously growing part libraries, as well as other technological advances. As a result, digital computation can now be engineered and implemented in biological systems. Simple logic gates can be implemented and connected to perform novel desired functions or to better understand and redesign biological processes. Synthetic biological digital circuits could lead to new therapeutic approaches, as well as new and efficient ways to produce complex molecules such as antibiotics, bioplastics or biofuels. Biological computation not only provides possible biomedical and biotechnological applications, but also affords a greater understanding of biological systems.
Computing, information, and communications: Technologies for the 21. Century
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1998-11-01
To meet the challenges of a radically new and technologically demanding century, the Federal Computing, Information, and Communications (CIC) programs are investing in long-term research and development (R and D) to advance computing, information, and communications in the United States. CIC R and D programs help Federal departments and agencies to fulfill their evolving missions, assure the long-term national security, better understand and manage the physical environment, improve health care, help improve the teaching of children, provide tools for lifelong training and distance learning to the workforce, and sustain critical US economic competitiveness. One of the nine committees of the National Science and Technology Council (NSTC), the Committee on Computing, Information, and Communications (CCIC)--through its CIC R and D Subcommittee--coordinates R and D programs conducted by twelve Federal departments and agencies in cooperation with US academia and industry. These R and D programs are organized into five Program Component Areas: (1) HECC--High End Computing and Computation; (2) LSN--Large Scale Networking, including the Next Generation Internet Initiative; (3) HCS--High Confidence Systems; (4) HuCS--Human Centered Systems; and (5) ETHR--Education, Training, and Human Resources. A brief synopsis of FY 1997 accomplishments and FY 1998 goals by PCA is presented. This report, which supplements the President's Fiscal Year 1998 Budget, describes the interagency CIC programs.
Sikorsky interactive graphics surface design/manufacturing system
NASA Technical Reports Server (NTRS)
Robbins, R.
1975-01-01
An interactive graphics system conceived to be used in the design, analysis, and manufacturing of aircraft components with free form surfaces was described. In addition to the basic surface definition and viewing capabilities inherent in such a system, numerous other features are present: surface editing, automated smoothing of control curves, variable milling patch boundary definitions, surface intersection definition and viewing, automatic creation of true offset surfaces, digitizer and drafting machine interfaces, and cutter path optimization. Documented costs and time savings of better than six to one are being realized with this system. The system was written in FORTRAN and GSP for use on IBM 2250 CRT's in conjunction with an IBM 370/158 computer.
Dynamic optimization of CELSS crop photosynthetic rate by computer-assisted feedback control
NASA Astrophysics Data System (ADS)
Chun, C.; Mitchell, C. A.
1997-01-01
A procedure for dynamic optimization of net photosynthetic rate (Pn) for crop production in Controlled Ecological Life-Support Systems (CELSS) was developed using leaf lettuce as a model crop. Canopy Pn was measured in real time and fed back for environmental control. Setpoints of photosynthetic photon flux (PPF) and CO_2 concentration for each hour of the crop-growth cycle were decided by computer to reach a targeted Pn each day. Decision making was based on empirical mathematical models combined with rule sets developed from recent experimental data. Comparisons showed that dynamic control resulted in better yield per unit energy input to the growth system than did static control. With comparable productivity parameters and potential for significant energy savings, dynamic control strategies will contribute greatly to the sustainability of space-deployed CELSS.
Stabilizing canonical-ensemble calculations in the auxiliary-field Monte Carlo method
NASA Astrophysics Data System (ADS)
Gilbreth, C. N.; Alhassid, Y.
2015-03-01
Quantum Monte Carlo methods are powerful techniques for studying strongly interacting Fermi systems. However, implementing these methods on computers with finite-precision arithmetic requires careful attention to numerical stability. In the auxiliary-field Monte Carlo (AFMC) method, low-temperature or large-model-space calculations require numerically stabilized matrix multiplication. When adapting methods used in the grand-canonical ensemble to the canonical ensemble of fixed particle number, the numerical stabilization increases the number of required floating-point operations for computing observables by a factor of the size of the single-particle model space, and thus can greatly limit the systems that can be studied. We describe an improved method for stabilizing canonical-ensemble calculations in AFMC that exhibits better scaling, and present numerical tests that demonstrate the accuracy and improved performance of the method.
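The stabilization idea mentioned above, multiplying long chains of matrices while repeatedly factoring out widely separated scales, can be illustrated generically with a QR-based running factorization; this is a sketch of the standard stabilized-product technique, not the canonical-ensemble-specific algorithm developed in the paper.

```python
# Generic sketch of numerically stabilized multiplication of many matrices:
# the running product is kept factored as Q * diag(d) * T and re-orthogonalized
# with QR at each step so widely separated scales stay in d. This illustrates
# the general idea only, not the paper's canonical-ensemble algorithm.
import numpy as np

def stabilized_product(matrices):
    n = matrices[0].shape[0]
    Q, d, T = np.eye(n), np.ones(n), np.eye(n)
    for B in matrices:
        C = (B @ Q) * d            # B * Q * diag(d): scale columns by d
        Q, R = np.linalg.qr(C)
        d = np.abs(np.diag(R))
        T = (R / d[:, None]) @ T   # strip the scales out of R's rows
    return Q, d, T                 # running product == Q @ np.diag(d) @ T

rng = np.random.default_rng(0)
mats = [np.exp(rng.standard_normal((4, 4))) for _ in range(50)]
Q, d, T = stabilized_product(mats)
print("largest/smallest scale:", d.max(), d.min())
```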
Jiménez-Osés, Gonzalo; Brockway, Anthony J; Shaw, Jared T; Houk, K N
2013-05-01
The mechanism of direct displacement of alkoxy groups in vinylogous and aromatic esters by Grignard reagents, a reaction that is not observed with expectedly better tosyloxy leaving groups, is elucidated computationally. The mechanism of this reaction has been determined to proceed through the inner-sphere attack of nucleophilic alkyl groups from magnesium to the reacting carbons via a metalaoxetane transition state. The formation of a strong magnesium chelate with the reacting alkoxy and carbonyl groups dictates the observed reactivity and selectivity. The influence of ester, ketone, and aldehyde substituents was investigated. In some cases, the calculations predicted the formation of products different than those previously reported; these predictions were then verified experimentally. The importance of studying the actual system, and not simplified models as computational systems, is demonstrated.
Jiménez-Osés, Gonzalo; Brockway, Anthony J.; Shaw, Jared T.; Houk, K. N.
2013-01-01
The mechanism of direct displacement of alkoxy groups in vinylogous and aromatic esters by Grignard reagents, a reaction that is not observed with expectedly better tosyloxy leaving groups, is elucidated computationally. The mechanism of this reaction has been determined to proceed through the inner-sphere attack of nucleophilic alkyl groups from magnesium to the reacting carbons via a metalaoxetane transition state. The formation of a strong magnesium chelate with the reacting alkoxy and carbonyl groups dictates the observed reactivity and selectivity. The influence of ester, ketone and aldehyde substituents was investigated. In some cases, the calculations predicted the formation of products different than those previously reported; these predictions were then verified experimentally. The importance of studying the actual system, and not simplified models as computational systems, is demonstrated. PMID:23601086
Quantitative computer simulations of extraterrestrial processing operations
NASA Technical Reports Server (NTRS)
Vincent, T. L.; Nikravesh, P. E.
1989-01-01
The automation of a small, solid propellant mixer was studied. Temperature control is under investigation. A numerical simulation of the system is under development and will be tested using different control options. Control system hardware is currently being put into place. The construction of mathematical models and simulation techniques for understanding various engineering processes is also studied. Computer graphics packages were utilized for better visualization of the simulation results. The mechanical mixing of propellants is examined. Simulation of the mixing process is being done to study how one can control for chaotic behavior to meet specified mixing requirements. An experimental mixing chamber is also being built. It will allow visual tracking of particles under mixing. The experimental unit will be used to test ideas from chaos theory, as well as to verify simulation results. This project has applications to extraterrestrial propellant quality and reliability.
NASA Astrophysics Data System (ADS)
Erhard, Jannis; Bleiziffer, Patrick; Görling, Andreas
2016-09-01
A power series approximation for the correlation kernel of time-dependent density-functional theory is presented. Using this approximation in the adiabatic-connection fluctuation-dissipation (ACFD) theorem leads to a new family of Kohn-Sham methods. The new methods yield reaction energies and barriers of unprecedented accuracy and enable a treatment of static (strong) correlation with an accuracy of high-level multireference configuration interaction methods but are single-reference methods allowing for a black-box-like handling of static correlation. The new methods exhibit a better scaling of the computational effort with the system size than rivaling wave-function-based electronic structure methods. Moreover, the new methods do not suffer from the problem of singularities in response functions plaguing previous ACFD methods and therefore are applicable to any type of electronic system.
NASA Astrophysics Data System (ADS)
Yao, Che; Li, Tao; Zhang, Hong; Zhou, Yanming
2017-08-01
In this paper, the characteristics of two control valves used for ammonia injection in an SCR system are discussed. The linear/quadratic relationships between pressure drop/outlet flow rate and valve opening/inlet dynamic pressure are investigated using computational fluid dynamics (CFD) and response surface analysis (RSA) methods. The results show that the linearity of the brake valve is significantly better than that of the butterfly valve, which means that the brake valve is more suitable for ammonia injection adjustment than the butterfly valve.
Fractional Order and Dynamic Simulation of a System Involving an Elastic Wide Plate
NASA Astrophysics Data System (ADS)
David, S. A.; Balthazar, J. M.; Julio, B. H. S.; Oliveira, C.
2011-09-01
Numerous researchers have studied nonlinear dynamics in several areas of science and engineering. However, in most cases, these concepts have been explored mainly from the standpoint of analytical and computational methods involving integer order calculus (IOC). In this paper we examine the dynamic behavior of an elastic wide plate induced by two electromagnets from the point of view of fractional order calculus (FOC). The primary focus of this study is to help gain a better understanding of nonlinear dynamics in fractional order systems.
1997-12-16
An image of the F-16XL #1 during its functional flight check of the Digital Flight Control System (DFCS) on December 16, 1997. The mission was flown by NASA research pilot Dana Purifoy and lasted 1 hour and 25 minutes. The tests included pilot familiarization, functional check, and handling qualities evaluation maneuvers to a speed of Mach 0.6 and 300 knots. Purifoy completed all the briefed data points with no problems and reported that the DFCS handled as well as, if not better than, the analog computer system that it replaced.
Reinforcement learning in depression: A review of computational research.
Chen, Chong; Takahashi, Taiki; Nakagawa, Shin; Inoue, Takeshi; Kusumi, Ichiro
2015-08-01
Despite being considered primarily a mood disorder, major depressive disorder (MDD) is characterized by cognitive and decision-making deficits. Recent research has employed computational models of reinforcement learning (RL) to address these deficits. The computational approach has the advantage of making explicit predictions about learning and behavior, specifying the process parameters of RL, differentiating between model-free and model-based RL, and enabling computational model-based functional magnetic resonance imaging and electroencephalography. With these merits an emerging field of computational psychiatry has developed, and here we review specific studies that focused on MDD. Considerable evidence suggests that MDD is associated with impaired brain signals of reward prediction error and expected value ('wanting'), decreased reward sensitivity ('liking') and/or learning (be it model-free or model-based), etc., although the causality remains unclear. These parameters may serve as valuable intermediate phenotypes of MDD, linking general clinical symptoms to underlying molecular dysfunctions. We believe future computational research at clinical, systems, and cellular/molecular/genetic levels will propel us toward a better understanding of the disease. Copyright © 2015 Elsevier Ltd. All rights reserved.
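For readers unfamiliar with these model parameters, the minimal Python sketch below shows a Rescorla-Wagner style, model-free value update driven by a reward prediction error, with a reward-sensitivity parameter; the parameter values and the "blunted" comparison are illustrative assumptions, not results from any study reviewed in the paper.

```python
# Minimal model-free RL sketch: a Rescorla-Wagner style value update driven
# by a reward prediction error, with a reward-sensitivity parameter.
# All parameter values are illustrative, not estimates from patient data.
import random

def simulate_learning(trials=200, alpha=0.1, reward_sensitivity=1.0,
                      reward_prob=0.7, seed=0):
    rng = random.Random(seed)
    value = 0.0
    values = []
    for _ in range(trials):
        reward = reward_sensitivity * (1.0 if rng.random() < reward_prob else 0.0)
        prediction_error = reward - value        # reward prediction error
        value += alpha * prediction_error        # learning-rate-weighted update
        values.append(value)
    return values

healthy = simulate_learning(reward_sensitivity=1.0)[-1]
blunted = simulate_learning(reward_sensitivity=0.5)[-1]   # reduced 'liking'
print(f"learned value: healthy-like {healthy:.2f}, blunted {blunted:.2f}")
```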
A Method for Growing Bio-memristors from Slime Mold.
Miranda, Eduardo Reck; Braund, Edward
2017-11-02
Our research is aimed at gaining a better understanding of the electronic properties of organisms in order to engineer novel bioelectronic systems and computing architectures based on biology. This specific paper focuses on harnessing the unicellular slime mold Physarum polycephalum to develop bio-memristors (or biological memristors) and bio-computing devices. The memristor is a resistor that possesses memory. It is the 4th fundamental passive circuit element (the other three are the resistor, the capacitor, and the inductor), which is paving the way for the design of new kinds of computing systems; e.g., computers that might relinquish the distinction between storage and a central processing unit. When applied with an AC voltage, the current vs. voltage characteristic of a memristor is a pinched hysteresis loop. It has been shown that P. polycephalum produces pinched hysteresis loops under AC voltages and displays adaptive behavior that is comparable with the functioning of a memristor. This paper presents the method that we developed for implementing bio-memristors with P. polycephalum and introduces the development of a receptacle to culture the organism, which facilitates its deployment as an electronic circuit component. Our method has proven to decrease growth time, increase component lifespan, and standardize electrical observations.
A secure communication using cascade chaotic computing systems on clinical decision support.
Koksal, Ahmet Sertol; Er, Orhan; Evirgen, Hayrettin; Yumusak, Nejat
2016-06-01
Clinical decision support systems (C-DSS) provide supportive tools to the expert for the determination of the disease. Today, many of the support systems that have been developed for a better and more accurate diagnosis have reached a dynamic structure thanks to artificial intelligence techniques. However, in cases where important diagnostic studies must be performed confidentially, a secure communication system is required. In this study, secure communication of a DSS is examined through a developed double-layer chaotic communication system. The developed communication system consists of four main parts: random number generator, cascade chaotic calculation layer, PCM, and logical mixer layers. Thanks to this system, important patient data created by the DSS can be conveyed to the center through a secure communication line.
Playing Computer Games Versus Better Learning.
ERIC Educational Resources Information Center
Din, Feng S.; Caleo, Josephine
This study investigated whether kindergarten students who played Sony Play Station (Lightspan) computer games learned better than peers who did not play such games. Participants were 47 African-American kindergartners from two classes of an urban school in the Northeast. A pretest and posttest with control group design was used in the study. The…
Supporting Students' Learning in the Domain of Computer Science
ERIC Educational Resources Information Center
Gasparinatou, Alexandra; Grigoriadou, Maria
2011-01-01
Previous studies have shown that students with low knowledge understand and learn better from more cohesive texts, whereas high-knowledge students have been shown to learn better from texts of lower cohesion. This study examines whether high-knowledge readers in computer science benefit from a text of low cohesion. Undergraduate students (n = 65)…
NASA Astrophysics Data System (ADS)
Endy, Drew; You, Lingchong; Yin, John; Molineux, Ian J.
2000-05-01
We created a simulation based on experimental data from bacteriophage T7 that computes the developmental cycle of the wild-type phage and also of mutants that have an altered genome order. We used the simulation to compute the fitness of more than 10^5 mutants. We tested these computations by constructing and experimentally characterizing T7 mutants in which we repositioned gene 1, coding for T7 RNA polymerase. Computed protein synthesis rates for ectopic gene 1 strains were in moderate agreement with observed rates. Computed phage-doubling rates were close to observations for two of four strains, but significantly overestimated those of the other two. Computations indicate that the genome organization of wild-type T7 is nearly optimal for growth: only 2.8% of random genome permutations were computed to grow faster, the highest 31% faster, than wild type. Specific discrepancies between computations and observations suggest that a better understanding of the translation efficiency of individual mRNAs and the functions of qualitatively "nonessential" genes will be needed to improve the T7 simulation. In silico representations of biological systems can serve to assess and advance our understanding of the underlying biology. Iteration between computation, prediction, and observation should increase the rate at which biological hypotheses are formulated and tested.
Remote gaze tracking system for 3D environments.
Congcong Liu; Herrup, Karl; Shi, Bertram E
2017-07-01
Eye tracking systems are typically divided into two categories: remote and mobile. Remote systems, where the eye tracker is located near the object being viewed by the subject, have the advantage of being less intrusive, but are typically used for tracking gaze points on fixed two dimensional (2D) computer screens. Mobile systems such as eye tracking glasses, where the eye tracker is attached to the subject, are more intrusive, but are better suited for cases where subjects are viewing objects in the three dimensional (3D) environment. In this paper, we describe how remote gaze tracking systems developed for 2D computer screens can be used to track gaze points in a 3D environment. The system is non-intrusive. It compensates for small head movements by the user, so that the head need not be stabilized by a chin rest or bite bar. The system maps the 3D gaze points of the user onto 2D images from a scene camera and is also located remotely from the subject. Measurement results from this system indicate that it is able to estimate gaze points in the scene camera to within one degree over a wide range of head positions.
NASA Astrophysics Data System (ADS)
Christou, Michalis; Christoudias, Theodoros; Morillo, Julián; Alvarez, Damian; Merx, Hendrik
2016-09-01
We examine an alternative approach to heterogeneous cluster-computing in the many-core era for Earth system models, using the European Centre for Medium-Range Weather Forecasts Hamburg (ECHAM)/Modular Earth Submodel System (MESSy) Atmospheric Chemistry (EMAC) model as a pilot application on the Dynamical Exascale Entry Platform (DEEP). A set of autonomous coprocessors interconnected together, called Booster, complements a conventional HPC Cluster and increases its computing performance, offering extra flexibility to expose multiple levels of parallelism and achieve better scalability. The EMAC model atmospheric chemistry code (Module Efficiently Calculating the Chemistry of the Atmosphere (MECCA)) was taskified with an offload mechanism implemented using OmpSs directives. The model was ported to the MareNostrum 3 supercomputer to allow testing with Intel Xeon Phi accelerators on a production-size machine. The changes proposed in this paper are expected to contribute to the eventual adoption of Cluster-Booster division and Many Integrated Core (MIC) accelerated architectures in presently available implementations of Earth system models, towards exploiting the potential of a fully Exascale-capable platform.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Lei; Solomon, Jonathan M.; Asta, Mark
2015-09-01
The energetics of rare earth substituted UO2 solid solutions (U1-xLnxO2-0.5x+y, where Ln = La, Y, and Nd) are investigated employing a combination of calorimetric measurements and density functional theory based computations. Calculated and measured formation enthalpies agree within 10 kJ/mol for stoichiometric oxygen/metal compositions. To better understand the factors governing the stability and defect binding in rare earth substituted urania solid solutions, systematic trends in the energetics are investigated based on the present results and previous computational and experimental thermochemical studies of rare earth substituted fluorite oxides (A1-xLnxO2-0.5x, where A = Hf, Zr, Ce, and Th). A consistent trend towards increased energetic stability with larger size mismatch between the smaller host tetravalent cation and the larger rare earth trivalent cation is found for both actinide and non-actinide fluorite oxide systems where aliovalent substitution of Ln cations is compensated by oxygen vacancies. However, the large exothermic oxidation enthalpy in the UO2 based systems favors oxygen rich compositions where charge compensation occurs through the formation of uranium cations with higher oxidation states.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mather, Barry
The increasing deployment of distribution-connected photovoltaic (DPV) systems requires utilities to complete complex interconnection studies. Relatively simple interconnection study methods worked well for low penetrations of photovoltaic systems, but more complicated quasi-static time-series (QSTS) analysis is required to make better interconnection decisions as DPV penetration levels increase. Tools and methods must be developed to support this. This paper presents a variable-time-step solver for QSTS analysis that significantly shortens the computational time and effort to complete a detailed analysis of the operation of a distribution circuit with many DPV systems. Specifically, it demonstrates that the proposed variable-time-step solver can reduce the required computational time by as much as 84% without introducing any important errors to metrics such as the highest and lowest voltage occurring on the feeder, the number of voltage regulator tap operations, and the total amount of losses realized in the distribution circuit during a 1-yr period. Further improvement in computational speed is possible with the introduction of only modest errors in these metrics, such as a 91% reduction with less than 5% error when predicting voltage regulator operations.
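The gist of such a variable-time-step QSTS loop can be conveyed with a short sketch: the step grows while the PV and load inputs change slowly and shrinks when they change quickly, so that regulator-affecting events are still resolved. The function names, thresholds, and profiles below are illustrative assumptions, not the solver described in the paper.

```python
# Minimal sketch of a variable-time-step quasi-static time-series (QSTS) loop.
# solve_power_flow, the thresholds, and the profiles are illustrative placeholders.

def run_variable_step_qsts(pv_profile, load_profile, solve_power_flow,
                           dt_min=1.0, dt_max=900.0, change_tol=0.01):
    """Step through per-second profiles, enlarging the time step while the inputs
    are nearly constant and shrinking it around fast changes."""
    t, t_end = 0.0, float(len(pv_profile) - 1)
    dt, results = dt_min, []
    while t < t_end:
        i = int(t)
        results.append((t, solve_power_flow(pv_profile[i], load_profile[i])))
        j = min(int(t + dt), int(t_end))          # look ahead one candidate step
        change = max(abs(pv_profile[j] - pv_profile[i]),
                     abs(load_profile[j] - load_profile[i]))
        dt = max(dt_min, dt / 2.0) if change > change_tol else min(dt_max, dt * 2.0)
        t += dt
    return results

# Toy usage with a trivial "power flow" stand-in:
profile = [0.0] * 3600 + [0.5] * 3600 + [1.0] * 3600
print(len(run_variable_step_qsts(profile, profile, lambda pv, load: pv - load)))
```

The loop visits far fewer time points than a fixed one-second sweep while still refining around the step changes in the toy profile.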
Thong, Patricia S P; Tandjung, Stephanus S; Movania, Muhammad Mobeen; Chiew, Wei-Ming; Olivo, Malini; Bhuvaneswari, Ramaswamy; Seah, Hock-Soon; Lin, Feng; Qian, Kemao; Soo, Khee-Chee
2012-05-01
Oral lesions are conventionally diagnosed using white light endoscopy and histopathology. This can pose a challenge because the lesions may be difficult to visualise under white light illumination. Confocal laser endomicroscopy can be used for confocal fluorescence imaging of surface and subsurface cellular and tissue structures. To move toward real-time "virtual" biopsy of oral lesions, we interfaced an embedded computing system to a confocal laser endomicroscope to achieve a prototype three-dimensional (3-D) fluorescence imaging system. A field-programmable gate array computing platform was programmed to enable synchronization of cross-sectional image grabbing and Z-depth scanning, automate the acquisition of confocal image stacks and perform volume rendering. Fluorescence imaging of the human and murine oral cavities was carried out using the fluorescent dyes fluorescein sodium and hypericin. Volume rendering of cellular and tissue structures from the oral cavity demonstrates the potential of the system for 3-D fluorescence visualization of the oral cavity in real time. We aim toward achieving a real-time virtual biopsy technique that can complement current diagnostic techniques and aid in targeted biopsy for better clinical outcomes.
Performance implications from sizing a VM on multi-core systems: A data analytic application's view
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lim, Seung-Hwan; Horey, James L; Begoli, Edmon
In this paper, we present a quantitative performance analysis of data analytics applications running on multi-core virtual machines. Such environments form the core of cloud computing. In addition, data analytics applications, such as Cassandra and Hadoop, are becoming increasingly popular on cloud computing platforms. This convergence necessitates a better understanding of the performance and cost implications of such hybrid systems. For example, the very first step in hosting applications in virtualized environments requires the user to configure the number of virtual processors and the size of memory. To understand the performance implications of this step, we benchmarked three Yahoo Cloud Serving Benchmark (YCSB) workloads in a virtualized multi-core environment. Our measurements indicate that the performance of Cassandra for YCSB workloads does not heavily depend on the processing capacity of a system, while the size of the data set is critical to performance relative to allocated memory. We also identified a strong relationship between the running time of workloads and various hardware events (last level cache loads, misses, and CPU migrations). From this analysis, we provide several suggestions to improve the performance of data analytics applications running on cloud computing environments.
An advanced technique for the prediction of decelerator system dynamics.
NASA Technical Reports Server (NTRS)
Talay, T. A.; Morris, W. D.; Whitlock, C. H.
1973-01-01
An advanced two-body six-degree-of-freedom computer model employing an indeterminate structures approach has been developed for the parachute deployment process. The program determines both vehicular and decelerator responses to aerodynamic and physical property inputs. A better insight into the dynamic processes that occur during parachute deployment has been gained. The model is of value in sensitivity studies to isolate important parameters that affect the vehicular response.
Acosta-Mesa, Héctor Gabriel; Cruz-Ramírez, Nicandro; Hernández-Jiménez, Rodolfo
2017-01-01
Efforts have been made to improve the diagnostic performance of colposcopy in order to better diagnose cervical cancer, particularly in developing countries. However, improvements in a number of areas are still necessary, such as the time it takes to process the full digital image of the cervix, the performance of the computing systems used to identify different kinds of tissues, and biopsy sampling. In this paper, we explore three different, well-known automatic classification methods (k-Nearest Neighbors, Naïve Bayes, and C4.5), in addition to different data models that take full advantage of this information and improve the diagnostic performance of colposcopy based on acetowhite temporal patterns. Based on the ROC and PRC area scores, the k-Nearest Neighbors classifier with a discrete PLA representation performed better than the other methods. The values of sensitivity, specificity, and accuracy reached using this method were 60% (95% CI 50–70), 79% (95% CI 71–86), and 70% (95% CI 60–80), respectively. The acetowhitening phenomenon is not exclusive to high-grade lesions, and we have found acetowhite temporal patterns of epithelial changes that are not precancerous lesions but that are similar to positive ones. These findings need to be considered when developing more robust computing systems in the future. PMID:28744318
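As a rough illustration of the best-performing combination (k-NN over a piecewise linear approximation of each acetowhite time series), the sketch below reduces every temporal pattern to PLA segment slopes and classifies with k-NN. The segment count, the value of k, and the synthetic data are assumptions made only for the example, not the study's actual configuration.

```python
# Illustrative k-NN over piecewise linear approximation (PLA) features.
# Segment count, k, and the synthetic data are assumptions, not the study's setup.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def pla_slopes(series, n_segments=8):
    """Represent a temporal pattern by the slope of each of n_segments pieces."""
    pieces = np.array_split(np.asarray(series, dtype=float), n_segments)
    return np.array([np.polyfit(np.arange(len(p)), p, 1)[0] for p in pieces])

rng = np.random.default_rng(0)
X_raw = rng.random((100, 64))            # 100 synthetic acetowhite intensity time series
y = rng.integers(0, 2, 100)              # toy labels: 0 = normal-like, 1 = lesion-like
X = np.vstack([pla_slopes(s) for s in X_raw])

clf = KNeighborsClassifier(n_neighbors=5).fit(X, y)
print(clf.predict(X[:3]))
```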
Summary: Special Session SpS15: Data Intensive Astronomy
NASA Astrophysics Data System (ADS)
Montmerle, Thierry
2015-03-01
A new paradigm in astronomical research has been emerging: "Data Intensive Astronomy", which utilizes large amounts of data combined with statistical data analyses. The first research method in astronomy was observation with our eyes. It is well known that the invention of the telescope changed the human view of our Universe (although it was almost limited to the solar system) and led to Kepler's laws, which were later used by Newton to derive his mechanics. Newtonian mechanics then enabled astronomers to provide a theoretical explanation for the motion of the planets. Thus astronomers obtained the second paradigm, theoretical astronomy. Astronomers succeeded in applying various laws of physics to explain phenomena in the Universe; e.g., nuclear fusion was found to be the energy source of a star. Theoretical astronomy has been paired with observational astronomy to better understand the physics underlying observed phenomena in the Universe. Although theoretical astronomy succeeded in providing good qualitative physical explanations, it was not easy to achieve quantitative agreement with observations. Since the invention of high-performance computers, however, astronomers have had a third research method, simulation, which yields better agreement with observations. Simulation astronomy developed rapidly along with the development of computer hardware (CPUs, GPUs, memories, storage systems, networks, and others) and simulation codes.
Raja, Muhammad Asif Zahoor; Kiani, Adiqa Kausar; Shehzad, Azam; Zameer, Aneela
2016-01-01
In this study, bio-inspired computing is exploited for solving systems of nonlinear equations using variants of genetic algorithms (GAs) as a tool for global search, hybridized with sequential quadratic programming (SQP) for efficient local search. The fitness function is constructed by defining the error function for the system of nonlinear equations in the mean-square sense. The design parameters of the mathematical models are trained by exploiting the competency of GAs, and refinement is carried out by the viable SQP algorithm. Twelve versions of the memetic approach GA-SQP are designed by taking a different set of reproduction routines in the optimization process. The performance of the proposed variants is evaluated on six numerical problems comprising systems of nonlinear equations arising in the interval arithmetic benchmark model, kinematics, neurophysiology, combustion and chemical equilibrium. Comparative studies of the proposed results in terms of accuracy, convergence and complexity are performed with the help of statistical performance indices to establish the worth of the schemes. The accuracy and convergence of the memetic computing GA-SQP approach are found to be better in each case of the simulation study, and the effectiveness of the scheme is further established through results of statistics based on different performance indices for accuracy and complexity.
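A minimal sketch of the memetic GA-SQP idea, assuming a toy two-equation system: the mean-squared residual of the equations serves as the fitness, a simple GA-style population search explores globally, and a gradient-based local solver (SciPy's SLSQP, standing in for the SQP refinement) polishes the best candidate. The example system, population size, and mutation scale are illustrative assumptions.

```python
# Sketch of GA global search + SQP-style local refinement for a nonlinear system.
# The toy system, population size, and mutation scale are illustrative assumptions.
import numpy as np
from scipy.optimize import minimize

def residuals(x):
    # Example system: x0^2 + x1 - 3 = 0, x0 + x1^2 - 5 = 0
    return np.array([x[0]**2 + x[1] - 3.0, x[0] + x[1]**2 - 5.0])

def fitness(x):
    r = residuals(x)
    return float(np.mean(r**2))            # mean-square error of the equations

rng = np.random.default_rng(1)
pop = rng.uniform(-5, 5, size=(40, 2))     # initial population
for _ in range(100):                       # GA-style generations
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[:10]]                               # selection
    children = parents[rng.integers(0, 10, 30)] + rng.normal(0, 0.3, (30, 2))  # mutation
    pop = np.vstack([parents, children])

best = pop[np.argmin([fitness(ind) for ind in pop])]
refined = minimize(fitness, best, method="SLSQP")   # local refinement of the GA winner
print(refined.x, fitness(refined.x))
```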
Analog Computer-Aided Detection (CAD) information can be more effective than binary marks.
Cunningham, Corbin A; Drew, Trafton; Wolfe, Jeremy M
2017-02-01
In socially important visual search tasks, such as baggage screening and diagnostic radiology, experts miss more targets than is desirable. Computer-aided detection (CAD) programs have been developed specifically to improve performance in these professional search tasks. For example, in breast cancer screening, many CAD systems are capable of detecting approximately 90% of breast cancers, with approximately 0.5 false-positive detections per image. Nevertheless, benefits of CAD in clinical settings tend to be small (Birdwell, 2009) or even absent (Meziane et al., 2011; Philpotts, 2009). The marks made by a CAD system can be "binary," giving the same signal to any location where the signal is above some threshold. Alternatively, a CAD system can present an analog signal that reflects the strength of the signal at each location. In the experiments reported here, we compare analog and binary CAD presentations using nonexpert observers and artificial stimuli defined by two noisy signals: a visible color signal and an "invisible" signal that informed our simulated CAD system. We found that analog CAD generally yielded better overall performance than binary CAD. The analog benefit is similar at high and low target prevalence. Our data suggest that the form of the CAD signal can directly influence performance. Analog CAD may allow the computer to be more helpful to the searcher.
Xu, Xiaochao; Kim, Joshua; Laganis, Philip; Schulze, Derek; Liang, Yongguang; Zhang, Tiezhi
2011-10-01
To demonstrate the feasibility of Tetrahedron Beam Computed Tomography (TBCT) using a carbon nanotube (CNT) multiple pixel field emission x-ray (MPFEX) tube. A multiple pixel x-ray source facilitates the creation of novel x-ray imaging modalities. In a previous publication, the authors proposed a Tetrahedron Beam Computed Tomography (TBCT) imaging system which comprises a linear source array and a linear detector array that are orthogonal to each other. TBCT is expected to reduce scatter compared with Cone Beam Computed Tomography (CBCT) and to have better detector performance. Therefore, it may produce improved image quality for image guided radiotherapy. In this study, a TBCT benchtop system has been developed with an MPFEX tube. The tube has 75 CNT cold cathodes, which generate 75 x-ray focal spots on an elongated anode, and has 4 mm pixel spacing. An in-house-developed, 5-row CT detector array using silicon photodiodes and CdWO4 scintillators was employed in the system. Hardware and software were developed for tube control and detector data acquisition. The raw data were preprocessed for beam hardening and detector response linearity and were reconstructed with an FDK-based image reconstruction algorithm. The focal spots were measured at about 1 × 2 mm² using a star phantom. Each cathode generates around 3 mA cathode current with 2190 V gate voltage. The benchtop system is able to perform TBCT scans with a prolonged scanning time. Images of a commercial CT phantom were successfully acquired. A prototype system was developed, and preliminary phantom images were successfully acquired. MPFEX is a promising x-ray source for TBCT. Further improvement of tube output is needed in order for it to be used in clinical TBCT systems.
Merelli, Ivan; Pérez-Sánchez, Horacio; Gesing, Sandra; D'Agostino, Daniele
2014-01-01
The explosion of data both in biomedical research and in healthcare systems demands urgent solutions. In particular, research in the omics sciences is moving from a hypothesis-driven to a data-driven approach. Healthcare, in addition, increasingly calls for tighter integration with biomedical data in order to promote personalized medicine and to provide better treatments. Efficient analysis and interpretation of Big Data opens new avenues to explore molecular biology, new questions to ask about physiological and pathological states, and new ways to answer these open issues. Such analyses lead to a better understanding of diseases and to the development of better and personalized diagnostics and therapeutics. However, such progress is directly related to the availability of new solutions to deal with this huge amount of information. New paradigms are needed to store and access data, for its annotation and integration, and finally for inferring knowledge and making it available to researchers. Bioinformatics can be viewed as the “glue” for all these processes. A clear awareness of present high performance computing (HPC) solutions in bioinformatics, Big Data analysis paradigms for computational biology, and the issues that are still open in the biomedical and healthcare fields represents the starting point to win this challenge. PMID:25254202
Using digital photo technology to improve visualization of gastric lumen CT images
NASA Astrophysics Data System (ADS)
Pyrgioti, M.; Kyriakidis, A.; Chrysostomou, S.; Panaritis, V.
2006-12-01
In order to better evaluate gastric lumen CT images, a new method is applied to the images using image-processing software. During a 12-month period, 69 patients with various gastric symptoms and 20 volunteers with a normal upper gastrointestinal system underwent computed tomography of the upper gastrointestinal system. Just before the examination, the patients and the volunteers underwent preparation with 40 ml of soda water and 10 ml of gastrografin. All the CT images were digitized with an Olympus 3.2 Mpixel digital camera and further processed with image-processing software. The administration per os of gastrografin and soda water resulted in distension of the stomach and consequently better visualization of all the anatomic parts. By using image-processing software on a PC, all the pathological and normal images of the stomach could be better evaluated diagnostically. We believe that digital photo technology improves the diagnostic capacity not only of CT images but probably also of MRI and many other imaging methods.
NASA Astrophysics Data System (ADS)
See, Swee Lan; Tan, Mitchell; Looi, Qin En
This paper presents findings from descriptive research on social gaming. A video-enhanced diary method was used to understand the user experience in social gaming. From this experiment, we found that natural human behavior and gamers' decision-making processes can be elicited and examined during human-computer interaction. This is new information that we should consider, as it can help us build better human-computer interfaces and human-robot interfaces in the future.
System Dynamics Modeling of Transboundary Systems: The Bear River Basin Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gerald Sehlke; Jake Jacobson
2005-09-01
System dynamics is a computer-aided approach to evaluating the interrelationships of different components and activities within complex systems. Recently, system dynamics models have been developed in areas such as policy design, biological and medical modeling, energy and environmental analysis, and various other areas in the natural and social sciences. The Idaho National Engineering and Environmental Laboratory, a multi-purpose national laboratory managed by the Department of Energy, has developed a system dynamics model in order to evaluate its utility for modeling large complex hydrological systems. We modeled the Bear River Basin, a transboundary basin that includes portions of Idaho, Utah and Wyoming. We found that system dynamics modeling is very useful for integrating surface water and groundwater data and for simulating the interactions between these sources within a given basin. In addition, we also found system dynamics modeling is useful for integrating complex hydrologic data with other information (e.g., policy, regulatory and management criteria) to produce a decision support system. Such decision support systems can allow managers and stakeholders to better visualize the key hydrologic elements and management constraints in the basin, which enables them to better understand the system via the simulation of multiple “what-if” scenarios. Although system dynamics models can be developed to conduct traditional hydraulic/hydrologic surface water or groundwater modeling, we believe that their strength lies in their ability to quickly evaluate trends and cause–effect relationships in large-scale hydrological systems; to integrate disparate data; to incorporate output from traditional hydraulic/hydrologic models; and to integrate interdisciplinary data, information and criteria to support better management decisions.
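The stock-and-flow bookkeeping at the core of a system dynamics water model can be conveyed with a toy sketch: a reservoir stock integrates inflows minus a policy-governed release. The values and release rule below are invented for illustration and have no relation to the Bear River Basin model.

```python
# Toy stock-and-flow integration in the spirit of a system dynamics water model.
# All inflow/demand values and the release rule are illustrative assumptions.
def simulate_reservoir(inflows, demands, storage=1000.0, min_release=5.0):
    history = []
    for inflow, demand in zip(inflows, demands):
        release = max(min_release, min(demand, storage + inflow))  # simple policy rule
        storage = storage + inflow - release                        # stock update
        history.append((storage, release))
    return history

print(simulate_reservoir([20, 15, 5, 2], [10, 12, 14, 16]))
```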
System Dynamics Modeling of Transboundary Systems: the Bear River Basin Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gerald Sehlke; Jacob J. Jacobson
2005-09-01
System dynamics is a computer-aided approach to evaluating the interrelationships of different components and activities within complex systems. Recently, system dynamics models have been developed in areas such as policy design, biological and medical modeling, energy and environmental analysis, and various other areas in the natural and social sciences. The Idaho National Engineering and Environmental Laboratory, a multi-purpose national laboratory managed by the Department of Energy, has developed a system dynamics model in order to evaluate its utility for modeling large complex hydrological systems. We modeled the Bear River Basin, a transboundary basin that includes portions of Idaho, Utah and Wyoming. We found that system dynamics modeling is very useful for integrating surface water and ground water data and for simulating the interactions between these sources within a given basin. In addition, we also found system dynamics modeling is useful for integrating complex hydrologic data with other information (e.g., policy, regulatory and management criteria) to produce a decision support system. Such decision support systems can allow managers and stakeholders to better visualize the key hydrologic elements and management constraints in the basin, which enables them to better understand the system via the simulation of multiple “what-if” scenarios. Although system dynamics models can be developed to conduct traditional hydraulic/hydrologic surface water or ground water modeling, we believe that their strength lies in their ability to quickly evaluate trends and cause–effect relationships in large-scale hydrological systems; to integrate disparate data; to incorporate output from traditional hydraulic/hydrologic models; and to integrate interdisciplinary data, information and criteria to support better management decisions.
NASA Astrophysics Data System (ADS)
Sankaran, A.; Chuang, Keh-Shih; Yonekawa, Hisashi; Huang, H. K.
1992-06-01
The imaging characteristics of two chest radiographic systems, Advanced Multiple Beam Equalization Radiography (AMBER) and the Konica Direct Digitizer [using a storage phosphor (SP) plate], have been compared. The variables affecting image quality and the computer display/reading systems used are detailed. Utilizing specially designed wedge, geometric, and anthropomorphic phantoms, studies were conducted on: exposure and energy response of detectors; nodule detectability; different exposure techniques; and various look-up tables (LUTs), gray scale displays and laser printers. Methods for scatter estimation and reduction were investigated. It is concluded that AMBER with screen-film and equalization techniques provides better nodule detectability than SP plates. However, SP plates have other advantages such as flexibility in the selection of exposure techniques, image processing features, and excellent sensitivity when combined with optimum reader operating modes. The equalization feature of AMBER provides better nodule detectability in the denser regions of the chest. Results of diagnostic accuracy are demonstrated with nodule detectability plots and analysis of images obtained with phantoms.
Systems biology of stored blood cells: can it help to extend the expiration date?
Paglia, Giuseppe; Palsson, Bernhard Ø; Sigurjonsson, Olafur E
2012-12-05
With increasingly stringent regulations regarding deferral and elimination of blood donors, it will become increasingly important to extend the expiration date of blood components beyond the currently allowed storage periods. One reason for the storage time limit for blood components is that platelets and red blood cells develop a condition called storage lesions during their storage in plastic blood containers. Systems biology provides comprehensive biochemical descriptions of organisms through quantitative measurements and data integration in mathematical models. The biological knowledge for a target organism can be translated into a mathematical format and used to compute physiological properties. The use of systems biology represents a concrete solution in the study of blood cell storage lesions, and it may open up new avenues towards developing better storage methods and better storage media, thereby extending the storage period of blood components. This article is part of a Special Issue entitled: Integrated omics. Copyright © 2012 Elsevier B.V. All rights reserved.
Effect of Computer Support on Younger Women with Breast Cancer
Gustafson, David H; Hawkins, Robert; Pingree, Suzanne; McTavish, Fiona; Arora, Neeraj K; Mendenhall, John; Cella, David F; Serlin, Ronald C; Apantaku, Funmi M; Stewart, James; Salner, Andrew
2001-01-01
OBJECTIVE Assess impact of a computer-based patient support system on quality of life in younger women with breast cancer, with particular emphasis on assisting the underserved. DESIGN Randomized controlled trial conducted between 1995 and 1998. SETTING Five sites: two teaching hospitals (Madison, Wis, and Chicago, Ill), two nonteaching hospitals (Chicago), and a cancer resource center (Indianapolis, Ill). The latter three sites treat many underserved patients. PARTICIPANTS Newly diagnosed breast cancer patients (N = 246) under age 60. INTERVENTIONS Experimental group received Comprehensive Health Enhancement Support System (CHESS), a home-based computer system providing information, decision-making, and emotional support. MEASUREMENTS AND MAIN RESULTS Pretest and two post-test surveys (at two- and five-month follow-up) measured aspects of participation in care, social/information support, and quality of life. At two-month follow-up, the CHESS group was significantly more competent at seeking information, more comfortable participating in care, and had greater confidence in doctor(s). At five-month follow-up, the CHESS group had significantly better social support and also greater information competence. In addition, experimental assignment interacted with several indicators of medical underservice (race, education, and lack of insurance), such that CHESS benefits were greater for the disadvantaged than the advantaged group. CONCLUSIONS Computer-based patient support systems such as CHESS may benefit patients by providing information and social support, and increasing their participation in health care. These benefits may be largest for currently underserved populations. PMID:11520380
System and method for leveraging human physiological traits to control microprocessor frequency
Shye, Alex; Pan, Yan; Scholbrock, Benjamin; Miller, J. Scott; Memik, Gokhan; Dinda, Peter A; Dick, Robert P
2014-03-25
A system and method for leveraging physiological traits to control microprocessor frequency are disclosed. In some embodiments, the system and method may optimize, for example, a particular processor-based architecture based on, for example, end-user satisfaction. In some embodiments, the system and method may determine, for example, whether users are satisfied, in order to provide higher efficiency, improved reliability, reduced power consumption, increased security, and a better user experience. The system and method may use, for example, biometric input devices to provide information about a user's physiological traits to a computer system. Biometric input devices may include, for example, one or more of the following: an eye tracker, a galvanic skin response sensor, and/or a force sensor.
Advancing Cyberinfrastructure to support high resolution water resources modeling
NASA Astrophysics Data System (ADS)
Tarboton, D. G.; Ogden, F. L.; Jones, N.; Horsburgh, J. S.
2012-12-01
Addressing the problem of how the availability and quality of water resources at large scales are sensitive to climate variability, watershed alterations and management activities requires computational resources that combine data from multiple sources and support integrated modeling. Related cyberinfrastructure challenges include: 1) how can we best structure data and computer models to address this scientific problem through the use of high-performance and data-intensive computing, and 2) how can we do this in a way that discipline scientists without extensive computational and algorithmic knowledge and experience can take advantage of advances in cyberinfrastructure? This presentation will describe a new system called CI-WATER that is being developed to address these challenges and advance high resolution water resources modeling in the Western U.S. We are building on existing tools that enable collaboration to develop model and data interfaces that link integrated system models running within an HPC environment to multiple data sources. Our goal is to enhance the use of computational simulation and data-intensive modeling to better understand water resources. Addressing water resource problems in the Western U.S. requires simulation of natural and engineered systems, as well as representation of legal (water rights) and institutional constraints alongside the representation of physical processes. We are establishing data services to represent the engineered infrastructure and legal and institutional systems in a way that they can be used with high resolution multi-physics watershed modeling at high spatial resolution. These services will enable incorporation of location-specific information on water management infrastructure and systems into the assessment of regional water availability in the face of growing demands, uncertain future meteorological forcings, and existing prior-appropriations water rights. This presentation will discuss the informatics challenges involved with data management and easy-to-use access to high performance computing being tackled in this project.
Automated microdensitometer for digitizing astronomical plates
NASA Technical Reports Server (NTRS)
Angilello, J.; Chiang, W. H.; Elmegreen, D. M.; Segmueller, A.
1984-01-01
A precision microdensitometer was built under the control of an IBM S/1 time-sharing computer system. The instrument's spatial resolution is better than 20 microns. A raster scan of an area of 10 × 10 sq mm (500 × 500 raster points) takes 255 minutes. The reproducibility is excellent and the stability is good over a period of 30 hours, which is significantly longer than the time required for most scans. The intrinsic accuracy of the instrument was tested using Kodak standard filters and found to be better than 3%. Comparative accuracy was tested by measuring astronomical plates of galaxies for which absolute photoelectric photometry data were available. The results showed an accuracy that is excellent for astronomical applications.
Accelerating k-NN Algorithm with Hybrid MPI and OpenSHMEM
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lin, Jian; Hamidouche, Khaled; Zheng, Jie
2015-08-05
Machine learning algorithms are benefiting from the continuous improvement of programming models, including MPI, MapReduce and PGAS. The k-Nearest Neighbors (k-NN) algorithm is a widely used machine learning algorithm, applied to supervised learning tasks such as classification. Several parallel implementations of k-NN have been proposed in the literature and in practice. However, on high-performance computing systems with high-speed interconnects, it is important to further accelerate existing designs of the k-NN algorithm by taking advantage of scalable programming models. To improve the performance of k-NN in large-scale environments with InfiniBand networks, this paper proposes several alternative hybrid MPI+OpenSHMEM designs and performs a systematic evaluation and analysis on typical workloads. The hybrid designs leverage one-sided memory access to better overlap communication with computation than the existing pure MPI design, and propose better schemes for efficient buffer management. The implementation based on the k-NN program from MaTEx with MVAPICH2-X (Unified MPI+PGAS Communication Runtime over InfiniBand) shows up to 9.0% time reduction for training the KDD Cup 2010 workload over 512 cores, and 27.6% time reduction for a small workload with balanced communication and computation. Experiments running with varied numbers of cores show that our design maintains good scalability.
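For orientation, the pure-MPI baseline that such hybrid designs improve upon can be sketched as: each rank holds a shard of the training data, computes its local k best candidates, and the root merges them. This mpi4py sketch is an illustrative assumption; it shows neither the OpenSHMEM one-sided variant nor the MaTEx implementation.

```python
# Pure-MPI distributed k-NN sketch (illustrative; not the MaTEx/OpenSHMEM design).
# Run with, e.g., `mpirun -np 4 python knn_mpi.py`.
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
k = 3

rng = np.random.default_rng(rank)
local_X = rng.random((1000, 8))              # this rank's shard of training points
local_y = rng.integers(0, 2, 1000)
query = np.full(8, 0.5)                      # same query vector on every rank

d = np.linalg.norm(local_X - query, axis=1)  # local distances to the query
idx = np.argsort(d)[:k]                      # local k best candidates
candidates = comm.gather(list(zip(d[idx], local_y[idx])), root=0)

if rank == 0:
    merged = sorted(c for part in candidates for c in part)[:k]   # global k nearest
    labels = [label for _, label in merged]
    print("predicted label:", max(set(labels), key=labels.count))
```

The hybrid designs in the paper replace the collective merge with one-sided accesses so that communication overlaps with the distance computation.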
NEXUS - Resilient Intelligent Middleware
NASA Astrophysics Data System (ADS)
Kaveh, N.; Hercock, R. Ghanea
Service-oriented computing, a composition of distributed-object, component-based, and Web-based computing concepts, is becoming the widespread choice for developing dynamic heterogeneous software assets available as services across a network. One of the major strengths of service-oriented technologies is the high abstraction layer and large granularity level at which software assets are viewed compared to traditional object-oriented technologies. Collaboration through encapsulated and separately defined service interfaces creates a service-oriented environment, whereby multiple services can be linked together through their interfaces to compose a functional system. This approach enables better integration of legacy and non-legacy services, via wrapper interfaces, and allows for service composition at a more abstract level, especially in cases such as vertical market stacks. The heterogeneous nature of service-oriented technologies and the granularity of their software components make them a suitable computing model for the pervasive domain.
A Component-based Programming Model for Composite, Distributed Applications
NASA Technical Reports Server (NTRS)
Eidson, Thomas M.; Bushnell, Dennis M. (Technical Monitor)
2001-01-01
The nature of scientific programming is evolving to larger, composite applications that are composed of smaller element applications. These composite applications are more frequently being targeted for distributed, heterogeneous networks of computers. They are most likely programmed by a group of developers. Software component technology and computational frameworks are being proposed and developed to meet the programming requirements of these new applications. Historically, programming systems have had a hard time being accepted by the scientific programming community. In this paper, a programming model is outlined that attempts to organize the software component concepts and fundamental programming entities into programming abstractions that will be better understood by the application developers. The programming model is designed to support computational frameworks that manage many of the tedious programming details, but also that allow sufficient programmer control to design an accurate, high-performance application.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sartori, E.; Roussin, R.W.
This paper presents a brief review of computer codes concerned with checking, plotting, processing and using covariances of neutron cross-section data. It concentrates on those available from the computer code information centers of the United States and the OECD/Nuclear Energy Agency. Emphasis is also placed on codes using covariances for specific applications such as uncertainty analysis, data adjustment and data consistency analysis. Recent evaluations contain neutron cross section covariance information for all isotopes of major importance for technological applications of nuclear energy. It is therefore important that the available software tools needed for taking advantage of this information are widely known, as they permit the determination of better safety margins and allow the optimization of more economical designs of nuclear energy systems.
NASA Astrophysics Data System (ADS)
Gómez-Uribe, Carlos A.; Verghese, George C.
2007-01-01
The intrinsic stochastic effects in chemical reactions, and particularly in biochemical networks, may result in behaviors significantly different from those predicted by deterministic mass action kinetics (MAK). Analyzing stochastic effects, however, is often computationally taxing and complex. The authors describe here the derivation and application of what they term the mass fluctuation kinetics (MFK), a set of deterministic equations to track the means, variances, and covariances of the concentrations of the chemical species in the system. These equations are obtained by approximating the dynamics of the first and second moments of the chemical master equation. Apart from needing knowledge of the system volume, the MFK description requires only the same information used to specify the MAK model, and is not significantly harder to write down or apply. When the effects of fluctuations are negligible, the MFK description typically reduces to MAK. The MFK equations are capable of describing the average behavior of the network substantially better than MAK, because they incorporate the effects of fluctuations on the evolution of the means. They also account for the effects of the means on the evolution of the variances and covariances, to produce quite accurate uncertainty bands around the average behavior. The MFK computations, although approximate, are significantly faster than Monte Carlo methods for computing first and second moments in systems of chemical reactions. They may therefore be used, perhaps along with a few Monte Carlo simulations of sample state trajectories, to efficiently provide a detailed picture of the behavior of a chemical system.
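Schematically, and only as a generic second-order moment expansion rather than the authors' exact equations, MFK-style equations couple the mean vector μ and covariance matrix V of the concentrations roughly as follows (S is the stoichiometry matrix, a(x) the MAK reaction rates, and Ω the system volume; dropping V recovers MAK):

```latex
\begin{align*}
\frac{d\mu}{dt} &\approx S\,a(\mu)
  \;+\; \frac{1}{2}\, S \sum_{j,k} \left.\frac{\partial^2 a}{\partial x_j\,\partial x_k}\right|_{\mu} V_{jk},\\
\frac{dV}{dt} &\approx A(\mu)\,V + V\,A(\mu)^{\mathsf T}
  + \frac{1}{\Omega}\, S\,\mathrm{diag}\!\big(a(\mu)\big)\, S^{\mathsf T},
\qquad A(\mu) = S \left.\frac{\partial a}{\partial x}\right|_{\mu}.
\end{align*}
```

The second-derivative term is how fluctuations feed back into the mean, and the Ω-scaled source term is why the corrections vanish in the large-volume limit.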
Towards Dynamic Remote Data Auditing in Computational Clouds
Khurram Khan, Muhammad; Anuar, Nor Badrul
2014-01-01
Cloud computing is a significant shift in the computational paradigm in which computing as a utility and storing data remotely have great potential. Enterprises and businesses are now more interested in outsourcing their data to the cloud to lessen the burden of local data storage and maintenance. However, the outsourced data and the computation outcomes are not continuously trustworthy due to the data owners' lack of control and physical possession of the data. To address this issue, researchers have focused on designing remote data auditing (RDA) techniques. The majority of these techniques, however, are only applicable to static archive data and cannot audit dynamically updated outsourced data. We propose an effectual RDA technique based on algebraic signature properties for cloud storage systems and also present a new data structure capable of efficiently supporting dynamic data operations such as append, insert, modify, and delete. Moreover, this data structure empowers our method to be applicable to large-scale data with minimum computation cost. The comparative analysis with state-of-the-art RDA schemes shows that the proposed scheme is secure and highly efficient in terms of the computation and communication overhead on the auditor and the server. PMID:25121114
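The homomorphic property that makes algebraic signatures attractive for auditing can be shown with a toy version over a prime field: a block's signature is a weighted sum of its symbols, so the signature of a sum of blocks equals the sum of their signatures, letting a verifier check a combined server response without holding the raw data. The modulus, base, and blocks below are illustrative assumptions, not the scheme proposed in the paper.

```python
# Toy algebraic signature over a prime field (illustration only, not the paper's scheme).
P = 2**61 - 1          # prime modulus (assumed for the example)
ALPHA = 3              # signature base (assumed)

def signature(block):
    """sig(b) = sum_i b_i * ALPHA^i (mod P)."""
    return sum(b * pow(ALPHA, i, P) for i, b in enumerate(block)) % P

b1, b2 = [5, 7, 11], [2, 4, 6]
combined = [(x + y) % P for x, y in zip(b1, b2)]
# Homomorphic check: sig(b1 + b2) == sig(b1) + sig(b2) (mod P)
assert signature(combined) == (signature(b1) + signature(b2)) % P
print("signatures combine additively:", signature(combined))
```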
Grids, virtualization, and clouds at Fermilab
Timm, S.; Chadwick, K.; Garzoglio, G.; ...
2014-06-11
Fermilab supports a scientific program that includes experiments and scientists located across the globe. To better serve this community, in 2004, the (then) Computing Division undertook the strategy of placing all of the High Throughput Computing (HTC) resources in a Campus Grid known as FermiGrid, supported by common shared services. In 2007, the FermiGrid Services group deployed a service infrastructure that utilized Xen virtualization, LVS network routing and MySQL circular replication to deliver highly available services that offered significant performance, reliability and serviceability improvements. This deployment was further enhanced through the deployment of a distributed redundant network core architecture and the physical distribution of the systems that host the virtual machines across multiple buildings on the Fermilab Campus. In 2010, building on the experience pioneered by FermiGrid in delivering production services in a virtual infrastructure, the Computing Sector commissioned the FermiCloud, General Physics Computing Facility and Virtual Services projects to serve as platforms for support of scientific computing (FermiCloud and GPCF) and core computing (Virtual Services). Lastly, this work will present the evolution of the Fermilab Campus Grid, Virtualization and Cloud Computing infrastructure together with plans for the future.
Grids, virtualization, and clouds at Fermilab
NASA Astrophysics Data System (ADS)
Timm, S.; Chadwick, K.; Garzoglio, G.; Noh, S.
2014-06-01
Fermilab supports a scientific program that includes experiments and scientists located across the globe. To better serve this community, in 2004, the (then) Computing Division undertook the strategy of placing all of the High Throughput Computing (HTC) resources in a Campus Grid known as FermiGrid, supported by common shared services. In 2007, the FermiGrid Services group deployed a service infrastructure that utilized Xen virtualization, LVS network routing and MySQL circular replication to deliver highly available services that offered significant performance, reliability and serviceability improvements. This deployment was further enhanced through the deployment of a distributed redundant network core architecture and the physical distribution of the systems that host the virtual machines across multiple buildings on the Fermilab Campus. In 2010, building on the experience pioneered by FermiGrid in delivering production services in a virtual infrastructure, the Computing Sector commissioned the FermiCloud, General Physics Computing Facility and Virtual Services projects to serve as platforms for support of scientific computing (FermiCloud and GPCF) and core computing (Virtual Services). This work will present the evolution of the Fermilab Campus Grid, Virtualization and Cloud Computing infrastructure together with plans for the future.
Towards dynamic remote data auditing in computational clouds.
Sookhak, Mehdi; Akhunzada, Adnan; Gani, Abdullah; Khurram Khan, Muhammad; Anuar, Nor Badrul
2014-01-01
Cloud computing is a significant shift in the computational paradigm in which computing as a utility and storing data remotely have great potential. Enterprises and businesses are now more interested in outsourcing their data to the cloud to lessen the burden of local data storage and maintenance. However, the outsourced data and the computation outcomes are not continuously trustworthy due to the data owners' lack of control and physical possession of the data. To address this issue, researchers have focused on designing remote data auditing (RDA) techniques. The majority of these techniques, however, are only applicable to static archive data and cannot audit dynamically updated outsourced data. We propose an effectual RDA technique based on algebraic signature properties for cloud storage systems and also present a new data structure capable of efficiently supporting dynamic data operations such as append, insert, modify, and delete. Moreover, this data structure empowers our method to be applicable to large-scale data with minimum computation cost. The comparative analysis with state-of-the-art RDA schemes shows that the proposed scheme is secure and highly efficient in terms of the computation and communication overhead on the auditor and the server.
Effects on Training Using Illumination in Virtual Environments
NASA Technical Reports Server (NTRS)
Maida, James C.; Novak, M. S. Jennifer; Mueller, Kristian
1999-01-01
Camera based tasks are commonly performed during orbital operations, and orbital lighting conditions, such as high contrast shadowing and glare, are a factor in performance. Computer based training using virtual environments is a common tool used to make and keep CTW members proficient. If computer based training included some of these harsh lighting conditions, would the crew increase their proficiency? The project goal was to determine whether computer based training increases proficiency if one trains for a camera based task using computer generated virtual environments with enhanced lighting conditions such as shadows and glare rather than the color-shaded computer images normally used in simulators. Previous experiments were conducted using a two-degree-of-freedom docking system. Test subjects had to align a boresight camera using a hand controller with two axes of control. Two sets of subjects were trained on two computer simulations using computer generated virtual environments, one with simulated lighting and one without. Results revealed that when subjects were constrained by time and accuracy, those who trained with simulated lighting conditions performed significantly better than those who did not. To reinforce these results for speed and accuracy, the task complexity was increased.
Decoding of intended saccade direction in an oculomotor brain-computer interface
NASA Astrophysics Data System (ADS)
Jia, Nan; Brincat, Scott L.; Salazar-Gómez, Andrés F.; Panko, Mikhail; Guenther, Frank H.; Miller, Earl K.
2017-08-01
Objective. To date, invasive brain-computer interface (BCI) research has largely focused on replacing lost limb functions using signals from the hand/arm areas of motor cortex. However, the oculomotor system may be better suited to BCI applications involving rapid serial selection from spatial targets, such as choosing from a set of possible words displayed on a computer screen in an augmentative and alternative communication (AAC) application. Here we aimed to demonstrate the feasibility of a BCI utilizing the oculomotor system. Approach. We developed a chronic intracortical BCI in monkeys to decode intended saccadic eye movement direction using activity from multiple frontal cortical areas. Main results. Intended saccade direction could be decoded in real time with high accuracy, particularly at contralateral locations. Accurate decoding was evident even at the beginning of the BCI session; no extensive BCI experience was necessary. High-frequency (80-500 Hz) local field potential magnitude provided the best performance, even over spiking activity, thus simplifying future BCI applications. Most of the information came from the frontal and supplementary eye fields, with relatively little contribution from dorsolateral prefrontal cortex. Significance. Our results support the feasibility of high-accuracy intracortical oculomotor BCIs that require little or no practice to operate and may be ideally suited for ‘point and click’ computer operation as used in most current AAC systems.
ERIC Educational Resources Information Center
Conn, Samuel S.; Reichgelt, Han
2013-01-01
Cloud computing represents an architecture and paradigm of computing designed to deliver infrastructure, platforms, and software as constructible computing resources on demand to networked users. As campuses are challenged to better accommodate academic needs for applications and computing environments, cloud computing can provide an accommodating…
Microsoft Kinect Sensor Evaluation
NASA Technical Reports Server (NTRS)
Billie, Glennoah
2011-01-01
My summer project evaluates the Kinect game sensor's input/output and its suitability to serve as part of a human interface for a spacecraft application. The primary objective is to evaluate, understand, and communicate the Kinect system's ability to sense and track fine (human) position and motion. The project analyzes the performance characteristics and capabilities of this game system hardware and its applicability for gross and fine motion tracking. The software development kit for the Kinect was also investigated, and some experimentation has begun in order to understand its development environment. Research in hacking communities has provided a better understanding of the Kinect game sensor's software and its potential for a wide range of personal computer (PC) application development. The project also entails the disassembly of the Kinect game sensor: disassembling a sensor, photographing it, identifying its components, and describing its operation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kelley, B. M.
The electric utility industry is undergoing significant transformations in its operating model, including a greater emphasis on automation, monitoring technologies, and distributed energy resource management systems (DERMS). While these changes and new technologies drive greater efficiency and reliability, they may also introduce new vectors of cyber attack. The appropriate cybersecurity controls to address and mitigate these newly introduced attack vectors and potential vulnerabilities are still widely unknown, and the performance of the controls is difficult to vet. This proposal argues that modeling and simulation (M&S) is a necessary tool to address and better understand the problems introduced by emerging technologies for the grid. M&S will provide electric utilities a platform to model their transmission and distribution systems and run various simulations against the model to better understand the operational impact and performance of cybersecurity controls.
A portable data-logging system for industrial hygiene personal chlorine monitoring.
Langhorst, M L; Illes, S P
1986-02-01
The combination of suitable portable sensors or instruments with small microprocessor-based data-logger units has made it possible to obtain detailed monitoring data for many health and environmental applications. Following data acquisition in field use, the logged data may be transferred to a desk-top personal computer for complete flexibility in manipulation of data and formatting of results. A system has been assembled from commercial components and demonstrated for chlorine personal monitoring applications. The system consists of personal chlorine sensors, a Metrosonics data-logger and reader unit, and an Apple II Plus personal computer. The computer software was developed to handle sensor calibration, data evaluation and reduction, report formatting and long-term storage of raw data on a disk. This system makes it possible to generate time-concentration profiles, evaluate dose above a threshold, quantitate short-term excursions and summarize time-weighted average (TWA) results. Field data from plant trials demonstrated feasibility of use, ruggedness and reliability. No significant differences were found between the time-weighted average chlorine concentrations determined by the sensor/logger system and two other methods: the sulfamic acid bubbler reference method and the 3M Poroplastic diffusional dosimeter. The sensor/data-logger system, however, provided far more information than the other two methods in terms of peak excursions, TWAs and exposure doses. For industrial hygiene applications, the system allows better definition of employee exposures, particularly for chemicals with acute as well as chronic health effects. (ABSTRACT TRUNCATED AT 250 WORDS)
Page, Andrew J.; Keane, Thomas M.; Naughton, Thomas J.
2010-01-01
We present a multi-heuristic evolutionary task allocation algorithm to dynamically map tasks to processors in a heterogeneous distributed system. It utilizes a genetic algorithm, combined with eight common heuristics, in an effort to minimize the total execution time. It operates on batches of unmapped tasks and can preemptively remap tasks to processors. The algorithm has been implemented on a Java distributed system and evaluated with a set of six problems from the areas of bioinformatics, biomedical engineering, computer science and cryptography. Experiments using up to 150 heterogeneous processors show that the algorithm achieves better efficiency than other state-of-the-art heuristic algorithms. PMID:20862190
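In the spirit (though not the detail) of the published algorithm, a bare-bones GA that maps tasks onto heterogeneous processors to minimize the makespan might look like the sketch below; the task costs, processor speeds, and GA parameters are illustrative assumptions, and the eight companion heuristics are omitted.

```python
# Sketch of a GA mapping tasks to heterogeneous processors to minimize makespan.
# Task costs, processor speeds, and GA parameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(2)
task_cost = rng.uniform(1, 10, 50)           # work per task
proc_speed = np.array([1.0, 1.5, 2.0, 3.0])  # relative processor speeds
n_proc = len(proc_speed)

def makespan(mapping):
    finish = np.zeros(n_proc)
    for task, proc in enumerate(mapping):
        finish[proc] += task_cost[task] / proc_speed[proc]
    return finish.max()                      # total execution time to minimize

pop = rng.integers(0, n_proc, size=(60, len(task_cost)))   # random initial mappings
for _ in range(200):
    scores = np.array([makespan(ind) for ind in pop])
    parents = pop[np.argsort(scores)[:20]]                  # selection
    children = parents[rng.integers(0, 20, 40)].copy()
    mutate = rng.random(children.shape) < 0.05              # point mutation
    children[mutate] = rng.integers(0, n_proc, mutate.sum())
    pop = np.vstack([parents, children])

best = pop[np.argmin([makespan(ind) for ind in pop])]
print("best makespan:", makespan(best))
```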
Artificial intelligence applied to process signal analysis
NASA Technical Reports Server (NTRS)
Corsberg, Dan
1988-01-01
Many space station processes are highly complex systems subject to sudden, major transients. In any complex process control system, a critical aspect of the human/machine interface is the analysis and display of process information. Human operators can be overwhelmed by large clusters of alarms that inhibit their ability to diagnose and respond to a disturbance. Using artificial intelligence techniques and a knowledge-based approach to this problem, the power of the computer can be used to filter and analyze plant sensor data. This will provide operators with a better description of the process state. Once a process state is recognized, automatic action could be initiated and the proper system response monitored.
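A toy version of such knowledge-based alarm filtering suppresses alarms that a rule identifies as expected consequences of an already-active root cause; the rules and alarm names below are invented purely for illustration.

```python
# Toy knowledge-based alarm filter (rules and alarm names are invented examples).
CONSEQUENCE_RULES = {
    "pump_A_trip": {"low_flow_loop_1", "low_pressure_loop_1"},  # root cause -> expected side effects
}

def filter_alarms(active_alarms):
    """Return (alarms worth showing, alarms suppressed as known consequences)."""
    suppressed = set()
    for root, consequences in CONSEQUENCE_RULES.items():
        if root in active_alarms:
            suppressed |= consequences & active_alarms
    return sorted(active_alarms - suppressed), sorted(suppressed)

print(filter_alarms({"pump_A_trip", "low_flow_loop_1", "unrelated_sensor_fault"}))
```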
On a new iterative method for solving linear systems and comparison results
NASA Astrophysics Data System (ADS)
Jing, Yan-Fei; Huang, Ting-Zhu
2008-10-01
In Ujevic [A new iterative method for solving linear systems, Appl. Math. Comput. 179 (2006) 725-730], the author obtained a new iterative method for solving linear systems, which can be considered a modification of the Gauss-Seidel method. In this paper, we show that this is a special case from the point of view of projection techniques. A different approach is then established, which is shown both theoretically and numerically to be at least as good as, and generally better than, Ujevic's. As the presented numerical examples show, in most cases the convergence rate is more than one and a half times that of Ujevic's method.
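For orientation, the Gauss-Seidel baseline that both methods modify is sketched below; neither Ujevic's iteration nor the projection variant proposed in the paper is reproduced here.

```python
# Baseline Gauss-Seidel iteration for Ax = b (for orientation; not Ujevic's variant).
import numpy as np

def gauss_seidel(A, b, iters=100):
    x = np.zeros_like(b, dtype=float)
    for _ in range(iters):
        for i in range(len(b)):
            # Sweep through the unknowns, always using the newest values.
            x[i] = (b[i] - A[i, :i] @ x[:i] - A[i, i+1:] @ x[i+1:]) / A[i, i]
    return x

A = np.array([[4.0, 1.0], [2.0, 5.0]])   # diagonally dominant test system
b = np.array([9.0, 16.0])
print(gauss_seidel(A, b))                # converges to the exact solution of this system
```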
New low noise CCD cameras for Pi-of-the-Sky project
NASA Astrophysics Data System (ADS)
Kasprowicz, G.; Czyrkowski, H.; Dabrowski, R.; Dominik, W.; Mankiewicz, L.; Pozniak, K.; Romaniuk, R.; Sitek, P.; Sokolowski, M.; Sulej, R.; Uzycki, J.; Wrochna, G.
2006-10-01
Modern research trends require observation of fainter and fainter astronomical objects over large areas of the sky. This implies the use of systems with high temporal and optical resolution together with computer-based data acquisition and processing. This is why charge-coupled devices (CCDs) became so popular: they offer quick image conversion with much better quality than film-based technologies. This work is a theoretical and practical study of a CCD-based image acquisition system. The system was optimized for the "Pi of the Sky" project, but it can be adapted to other professional astronomical research. The work covers image conversion, signal acquisition, data transfer and the mechanical construction of the device.
A pilot trial of a telecommunications system in sleep apnea management.
DeMolles, Deborah A; Sparrow, David; Gottlieb, Daniel J; Friedman, Robert
2004-08-01
Continuous positive airway pressure (CPAP) is an effective therapy for obstructive sleep apnea syndrome (OSAS), although many patients have difficulty adhering to this therapy. The purpose of this study was to investigate the effectiveness of totally automated telephone technology in improving adherence to prescribed CPAP therapy. This pilot study was a randomized clinical trial in 30 patients being started on CPAP therapy for OSAS. Patients were randomly assigned to use of a computer telephone system designed to improve CPAP adherence (telephone-linked communications for CPAP [TLC-CPAP]) in addition to usual care (n = 15) or to usual care alone (n = 15) for a period of 2 months. TLC-CPAP is a computer-based system that monitors patients' self-reported behavior and provides education and reinforcement through a structured dialogue. A sleep symptoms checklist and the Functional Outcomes of Sleep Questionnaire were administered at study entry and at 2-month follow up. Hours of CPAP use at effective mask pressure were measured by the CPAP device, stored in its memory, and retrieved at the 2-month visit. At 2 months, patients randomized to TLC-CPAP had fewer reported sleep-related symptoms (9.4 vs. 13.4, P = 0.047) than those receiving usual care. The average nightly CPAP use in the TLC-CPAP group was 4.4 hours compared with 2.9 hours (P = 0.076) in the usual-care group. This pilot study suggests that patients with OSAS started on CPAP and a concurrently administered automated education and counseling system had better CPAP adherence and better control of OSAS symptoms.
Lessons about Virtual-Environment Software Systems from 20 years of VE building
Taylor, Russell M.; Jerald, Jason; VanderKnyff, Chris; Wendt, Jeremy; Borland, David; Marshburn, David; Sherman, William R.; Whitton, Mary C.
2010-01-01
What are desirable and undesirable features of virtual-environment (VE) software architectures? What should be present (and absent) from such systems if they are to be optimally useful? How should they be structured? To help answer these questions we present experience from application designers, toolkit designers, and VE system architects along with examples of useful features from existing systems. Topics are organized under the major headings of: 3D space management, supporting display hardware, interaction, event management, time management, computation, portability, and the observation that less can be better. Lessons learned are presented as discussion of the issues, field experiences, nuggets of knowledge, and case studies. PMID:20567602
User participation in the development of the human/computer interface for control centers
NASA Technical Reports Server (NTRS)
Broome, Richard; Quick-Campbell, Marlene; Creegan, James; Dutilly, Robert
1996-01-01
Technological advances coupled with the requirements to reduce operations staffing costs led to the demand for efficient, technologically-sophisticated mission operations control centers. The control center under development for the earth observing system (EOS) is considered. The users are involved in the development of a control center in order to ensure that it is cost-efficient and flexible. A number of measures were implemented in the EOS program in order to encourage user involvement in the area of human-computer interface development. The following user participation exercises carried out in relation to the system analysis and design are described: the shadow participation of the programmers during a day of operations; the flight operations personnel interviews; and the analysis of the flight operations team tasks. The user participation in the interface prototype development, the prototype evaluation, and the system implementation are reported on. The involvement of the users early in the development process enables the requirements to be better understood and the cost to be reduced.
I/O Router Placement and Fine-Grained Routing on Titan to Support Spider II
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ezell, Matthew A; Dillow, David; Oral, H Sarp
2014-01-01
The Oak Ridge Leadership Computing Facility (OLCF) introduced the concept of Fine-Grained Routing in 2008 to improve I/O performance between the Jaguar supercomputer and Spider, OLCF's center-wide Lustre file system. Fine-grained routing organizes I/O paths to minimize congestion. Jaguar has since been upgraded to Titan, providing more than a ten-fold improvement in peak performance. To support the center's increased computational capacity and I/O demand, the Spider file system has been replaced with Spider II. Building on the lessons learned from Spider, an improved method for placing LNET routers was developed and implemented for Spider II. The fine-grained routing scripts and configuration have been updated to provide additional optimizations and better match the system setup. This paper presents a brief history of fine-grained routing at OLCF, an introduction to the architectures of Titan and Spider II, methods for placing routers in Titan, and details about the fine-grained routing configuration.
Structure and Dynamics of Quasi-Ordered Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eckert, J.; Redondo, A.; Henson, N.J.
1999-07-09
The functionality of many materials of both fundamental and technological interest is often critically dependent on the nature and extent of any disorder that may be present. In addition, it is often difficult to understand the nature of disorder in quite well ordered systems. There is therefore an urgent need to develop better tools, both experimental and computational, for the study of such quasi-ordered systems. To this end, the authors have used neutron diffraction studies in an attempt to locate small metal clusters or molecules randomly distributed inside microporous catalytic materials. Specifically, they have used pair distribution function (PDF) analysis, as well as inelastic neutron scattering (INS) spectroscopy, to study interactions between adsorbate molecules and a microporous matrix. They have interfaced these experimental studies with computations of PDF analysis as well as modeling of the dynamics of adsorbates. These techniques will be invaluable in elucidating the local structure and function of many of these classes of materials.
A power-efficient ZF precoding scheme for multi-user indoor visible light communication systems
NASA Astrophysics Data System (ADS)
Zhao, Qiong; Fan, Yangyu; Deng, Lijun; Kang, Bochao
2017-02-01
In this study, we propose a power-efficient ZF precoding scheme for visible light communication (VLC) downlink multi-user multiple-input-single-output (MU-MISO) systems, which incorporates zero-forcing (ZF) and the characteristics of VLC systems. The main idea of this scheme is that the channel matrix on which the pseudoinverse is performed comes from the set of optical Access Points (APs) shared by more than one user, instead of the set of all involved serving APs as in existing ZF precoding schemes. By doing this, the waste of power caused by transmitting one user's data from APs that do not serve that user can be avoided. In addition, the channel matrix that must be pseudoinverted becomes smaller, which helps to reduce the computational complexity. Simulation results in two scenarios show that the proposed ZF precoding scheme has higher power efficiency, better bit error rate (BER) performance and lower computational complexity compared with traditional ZF precoding schemes.
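As a minimal numerical sketch of this idea (not code from the paper; the channel matrix, the AP partition and the system sizes below are illustrative assumptions), zero-forcing can be applied only to the sub-matrix of APs that are shared by more than one user, so the pseudoinverse is taken over a smaller matrix:

    import numpy as np

    rng = np.random.default_rng(0)
    n_users, n_aps = 3, 6                          # illustrative system size
    H = rng.uniform(0.1, 1.0, (n_users, n_aps))    # assumed optical channel gains

    shared = [0, 1, 2, 3]                          # APs assumed shared by several users
    H_shared = H[:, shared]

    W_full = np.linalg.pinv(H)                     # conventional ZF over all serving APs
    W_shared = np.linalg.pinv(H_shared)            # reduced ZF over the shared APs only

    # Multi-user interference is removed on the shared APs: H_shared @ W_shared ~ I.
    print(np.round(H_shared @ W_shared, 6))
    print("full ZF precoder power   :", np.sum(W_full ** 2))
    print("reduced ZF precoder power:", np.sum(W_shared ** 2))

The smaller pseudoinverse illustrates why the reduced scheme can lower the computational cost; the power comparison is only indicative, since the transmit power of a real VLC link also depends on DC bias and dimming constraints.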
Lensfree Computational Microscopy Tools and their Biomedical Applications
NASA Astrophysics Data System (ADS)
Sencan, Ikbal
Conventional microscopy has been a revolutionary tool for biomedical applications since its invention several centuries ago. The ability to non-destructively observe very fine details of biological objects in real time has enabled researchers to answer many important questions about their structures and functions. Unfortunately, most of these advanced microscopes are complex, bulky, expensive, and/or hard to operate, so they could not reach beyond the walls of well-equipped laboratories. Recent improvements in optoelectronic components and computational methods allow the creation of imaging systems that better fulfill the specific needs of clinics or research-related biomedical applications. In this respect, lensfree computational microscopy aims to replace bulky and expensive optical components with compact and cost-effective alternatives through the use of computation, which can be particularly useful for lab-on-a-chip platforms as well as imaging applications in low-resource settings. Several high-throughput on-chip platforms have been built with this approach for applications including, but not limited to, cytometry, micro-array imaging, rare cell analysis, telemedicine, and water quality screening. The lack of optical complexity in these lensfree on-chip imaging platforms is compensated by computational techniques. These computational methods are used for various purposes in coherent, incoherent and fluorescent on-chip imaging platforms, e.g., improving the spatial resolution, undoing light diffraction without using lenses, localizing objects in a large volume, and retrieving the phase or the color/spectral content of the objects. For instance, pixel super-resolution approaches based on source shifting are used in lensfree imaging platforms to prevent undersampling, Bayer-pattern, and aliasing artifacts. Another method, iterative phase retrieval, is used to compensate for the lack of lenses by undoing the diffraction and removing the twin-image noise of in-line holograms. This technique enables recovery of the complex optical field from its intensity measurement(s) by using additional constraints in the iterations, such as spatial boundaries and other known properties of the objects. Another computational tool employed in lensfree imaging is compressive sensing (or decoding), a method that takes advantage of the fact that natural signals/objects are mostly sparse or compressible in known bases. This inherent property of objects enables better signal recovery when the number of measurements is low, even below the Nyquist rate, and increases the additive-noise immunity of the system.
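As a minimal sketch of the iterative phase-retrieval idea mentioned above (my illustration, not the platform's reconstruction code; the toy object, the known support and the use of a plain Fourier transform as the propagation step are all simplifying assumptions), the measured amplitude is enforced in the detector plane and a support constraint in the object plane:

    import numpy as np

    rng = np.random.default_rng(1)
    n = 64
    obj = np.zeros((n, n), dtype=complex)
    obj[24:40, 24:40] = np.exp(1j * rng.uniform(0, 1, (16, 16)))   # toy complex object
    support = np.abs(obj) > 0                                      # assumed known support

    measured_amp = np.abs(np.fft.fft2(obj))        # only the intensity/amplitude is recorded

    estimate = np.ones((n, n), dtype=complex)      # start from a flat field
    for _ in range(200):
        field = np.fft.fft2(estimate)
        field = measured_amp * np.exp(1j * np.angle(field))  # keep the measured amplitude
        estimate = np.fft.ifft2(field)
        estimate[~support] = 0                               # enforce the support constraint

    err = np.linalg.norm(np.abs(estimate) - np.abs(obj)) / np.linalg.norm(np.abs(obj))
    print("relative amplitude error:", round(float(err), 3))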
Anderson, T.W.
1980-01-01
The U.S. Geological Survey has started a 4-year study of the alluvial basins in south-central Arizona and parts of California, Nevada, and New Mexico to describe the hydrologic setting, available groundwater resources, and effects of historical development on the groundwater system. To aid in the study, mathematical models of selected basins will be developed for appraising local and regional flow systems. Major components necessary to accomplish the study objectives include the accumulation of existing data on groundwater quantity and quality, entering the data into a computer file, identification of data deficiencies, and development of a program to remedy the deficiencies by collection of additional data. The approach to the study will be to develop and calibrate models of selected basins for which sufficient data exist and to develop interpretation-transfer techniques whereby general predevelopment and postdevelopment conceptual models of the hydrologic system in other basins may be synthesized. The end result of the project will be a better definition of the hydrologic parameters and a better understanding of the workings of the hydrologic system. The models can be used to study the effects of management alternatives and water-resources development on the system. (USGS)
The Technology Acceptance of a TV Platform for the Elderly Living Alone or in Public Nursing Homes
Santana-Mancilla, Pedro C.; Anido-Rifón, Luis E.
2017-01-01
In Mexico, many seniors are alone for most of the day or live in public nursing homes. Older people require simple interaction with computer systems. We therefore propose exploiting a medium that is well known to seniors: the television (TV). The primary objective of this study is to improve the quality of life of seniors through an easier reminder system that uses the television set. A technological platform based on interactive television was designed, through which seniors and their caregivers can better track their daily activities. Finally, an evaluation of the technology adoption was performed with 50 seniors living in two public nursing homes. The evaluation found that the elderly perceived the system as useful and easy to use, and that they had a positive attitude and a good intention to use it. This helped to generate initial evidence that the system supported them in achieving a better quality of life, by reminding them to take their medications and increasing their rate of attendance at their medical appointments. PMID:28594386
A Method for Decentralised Optimisation in Networks
NASA Astrophysics Data System (ADS)
Saramäki, Jari
2005-06-01
We outline a method for distributed Monte Carlo optimisation of computational problems in networks of agents, such as peer-to-peer networks of computers. The optimisation and messaging procedures are inspired by gossip protocols and epidemic data dissemination, and are decentralised, i.e. no central overseer is required. In the outlined method, each agent follows simple local rules and seeks better solutions to the optimisation problem by Monte Carlo trials, as well as by querying other agents in its local neighbourhood. With a proper network topology, good solutions spread rapidly through the network for further improvement. Furthermore, the system retains its functionality even in realistic settings where agents are randomly switched on and off.
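A minimal sketch of such a decentralised procedure (the toy objective, the ring-plus-shortcuts topology and the parameters are illustrative assumptions, not the protocol from the paper): each agent improves its own solution by a Monte Carlo trial and also queries one random neighbour, adopting the neighbour's solution if it is better.

    import random

    def objective(x):                      # toy problem: minimise (x - 3)^2
        return (x - 3.0) ** 2

    random.seed(0)
    n_agents, steps = 20, 200
    solutions = [random.uniform(-10.0, 10.0) for _ in range(n_agents)]

    # ring topology plus a few random shortcut links
    neighbours = {i: {(i - 1) % n_agents, (i + 1) % n_agents} for i in range(n_agents)}
    for _ in range(n_agents // 4):
        a, b = random.sample(range(n_agents), 2)
        neighbours[a].add(b)
        neighbours[b].add(a)

    for _ in range(steps):
        for i in range(n_agents):
            trial = solutions[i] + random.gauss(0.0, 0.5)      # local Monte Carlo trial
            if objective(trial) < objective(solutions[i]):
                solutions[i] = trial
            j = random.choice(sorted(neighbours[i]))           # gossip with one neighbour
            if objective(solutions[j]) < objective(solutions[i]):
                solutions[i] = solutions[j]

    print("best solution found:", min(solutions, key=objective))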
Guidance of visual attention by semantic information in real-world scenes
Wu, Chia-Chien; Wick, Farahnaz Ahmed; Pomplun, Marc
2014-01-01
Recent research on attentional guidance in real-world scenes has focused on object recognition within the context of a scene. This approach has been valuable for determining some factors that drive the allocation of visual attention and determine visual selection. This article provides a review of experimental work on how different components of context, especially semantic information, affect attentional deployment. We review work from the areas of object recognition, scene perception, and visual search, highlighting recent studies examining semantic structure in real-world scenes. A better understanding of how humans parse scene representations will not only improve current models of visual attention but also advance next-generation computer vision systems and human-computer interfaces. PMID:24567724
The effect of feature selection methods on computer-aided detection of masses in mammograms
NASA Astrophysics Data System (ADS)
Hupse, Rianne; Karssemeijer, Nico
2010-05-01
In computer-aided diagnosis (CAD) research, feature selection methods are often used to improve generalization performance of classifiers and shorten computation times. In an application that detects malignant masses in mammograms, we investigated the effect of using a selection criterion that is similar to the final performance measure we are optimizing, namely the mean sensitivity of the system in a predefined range of the free-response receiver operating characteristics (FROC). To obtain the generalization performance of the selected feature subsets, a cross validation procedure was performed on a dataset containing 351 abnormal and 7879 normal regions, each region providing a set of 71 mass features. The same number of noise features, not containing any information, were added to investigate the ability of the feature selection algorithms to distinguish between useful and non-useful features. It was found that significantly higher performances were obtained using feature sets selected by the general test statistic Wilks' lambda than using feature sets selected by the more specific FROC measure. Feature selection leads to better performance when compared to a system in which all features were used.
Menzies, Kevin
2014-08-13
The growth in simulation capability over the past 20 years has led to remarkable changes in the design process for gas turbines. The availability of relatively cheap computational power coupled to improvements in numerical methods and physical modelling in simulation codes have enabled the development of aircraft propulsion systems that are more powerful and yet more efficient than ever before. However, the design challenges are correspondingly greater, especially to reduce environmental impact. The simulation requirements to achieve a reduced environmental impact are described along with the implications of continued growth in available computational power. It is concluded that achieving the environmental goals will demand large-scale multi-disciplinary simulations requiring significantly increased computational power, to enable optimization of the airframe and propulsion system over the entire operational envelope. However even with massive parallelization, the limits imposed by communications latency will constrain the time required to achieve a solution, and therefore the position of such large-scale calculations in the industrial design process. © 2014 The Author(s) Published by the Royal Society. All rights reserved.
NASA Astrophysics Data System (ADS)
Stoykov, S.; Atanassov, E.; Margenov, S.
2016-10-01
Many scientific applications involve sparse or dense matrix operations, such as solving linear systems, matrix-matrix products, eigensolvers, etc. In structural nonlinear dynamics, the computation of periodic responses and the determination of the stability of the solution are of primary interest. The shooting method is widely used for obtaining periodic responses of nonlinear systems. The method involves simultaneous operations with sparse and dense matrices. One of the computationally expensive operations in the method is the multiplication of sparse by dense matrices. In the current work, a new algorithm for sparse matrix by dense matrix products is presented. The algorithm takes into account the structure of the sparse matrix, which is obtained by space discretization of the nonlinear Mindlin's plate equation of motion by the finite element method. The algorithm is developed to use the vector engine of Intel Xeon Phi coprocessors. It is compared with the standard sparse matrix by dense matrix algorithm and the one developed by Intel MKL, and it is shown that by considering the properties of the sparse matrix better algorithms can be developed.
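A minimal structure-aware sketch of a sparse-by-dense product in plain Python/NumPy (not the authors' Xeon Phi kernel; the small matrices are illustrative): the sparse factor is stored in CSR form, so only its non-zero entries touch rows of the dense matrix.

    import numpy as np

    def csr_dense_matmul(indptr, indices, data, B):
        """Multiply a CSR matrix A (given by indptr/indices/data) by a dense matrix B."""
        n_rows = len(indptr) - 1
        C = np.zeros((n_rows, B.shape[1]))
        for i in range(n_rows):
            for k in range(indptr[i], indptr[i + 1]):
                C[i, :] += data[k] * B[indices[k], :]   # only non-zeros contribute
        return C

    # A = [[1, 0, 2], [0, 3, 0], [0, 0, 4]] stored in CSR form.
    indptr  = np.array([0, 2, 3, 4])
    indices = np.array([0, 2, 1, 2])
    data    = np.array([1.0, 2.0, 3.0, 4.0])
    B = np.arange(6, dtype=float).reshape(3, 2)

    A_dense = np.array([[1.0, 0.0, 2.0],
                        [0.0, 3.0, 0.0],
                        [0.0, 0.0, 4.0]])
    print(np.allclose(csr_dense_matmul(indptr, indices, data, B), A_dense @ B))  # True

A production kernel would additionally block the loops and vectorise the inner update for wide SIMD units, which is the kind of tuning the paper targets on the coprocessor.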
Explicit Content Caching at Mobile Edge Networks with Cross-Layer Sensing
Chen, Lingyu; Su, Youxing; Luo, Wenbin; Hong, Xuemin; Shi, Jianghong
2018-01-01
The deployment density and computational power of small base stations (BSs) are expected to increase significantly in the next generation mobile communication networks. These BSs form the mobile edge network, which is a pervasive and distributed infrastructure that can empower a variety of edge/fog computing applications. This paper proposes a novel edge-computing application called explicit caching, which stores selective contents at BSs and exposes such contents to local users for interactive browsing and download. We formulate the explicit caching problem as a joint content recommendation, caching, and delivery problem, which aims to maximize the expected user quality-of-experience (QoE) with varying degrees of cross-layer sensing capability. Optimal and effective heuristic algorithms are presented to solve the problem. The theoretical performance bounds of the explicit caching system are derived in simplified scenarios. The impacts of cache storage space, BS backhaul capacity, cross-layer information, and user mobility on the system performance are simulated and discussed in realistic scenarios. Results suggest that, compared with conventional implicit caching schemes, explicit caching can better exploit the mobile edge network infrastructure for personalized content dissemination. PMID:29565313
When Everybody Anticipates in a Different Way …
NASA Astrophysics Data System (ADS)
Kindler, Eugene
2002-09-01
The paper is oriented to the computer modeling of anticipatory systems in which there is more than one anticipating individual. The anticipation of each of them can differ. In such a case we can meet four main cases: (1) the anticipating persons conduct a dialogue to reach some agreement and in this way can optimize the anticipation, (2) one of the anticipating persons is a teacher of the other ones and can show them where they could have anticipated better, (3) the anticipating persons compete, each of them expecting to make the best anticipation and wishing to apply it in order to weaken the other ones, (4) the anticipating persons do not communicate with one another. A human often anticipates by imagining the possible processes of the future, performing a certain "mental simulation"; nowadays a human uses computer simulation to replace that (insufficient) mental simulation. All the variants were simulated in such a way that the human imagining was transferred to a computer simulation. Thus, systems containing several simulating elements were simulated. Experiences with that "nested" simulation and its applications are described.
The SAMPL4 host-guest blind prediction challenge: an overview.
Muddana, Hari S; Fenley, Andrew T; Mobley, David L; Gilson, Michael K
2014-04-01
Prospective validation of methods for computing binding affinities can help assess their predictive power and thus set reasonable expectations for their performance in drug design applications. Supramolecular host-guest systems are excellent model systems for testing such affinity prediction methods, because their small size and limited conformational flexibility, relative to proteins, allow higher throughput and better numerical convergence. The SAMPL4 prediction challenge therefore included a series of host-guest systems, based on two hosts, cucurbit[7]uril and octa-acid. Binding affinities in aqueous solution were measured experimentally for a total of 23 guest molecules. Participants submitted 35 sets of computational predictions for these host-guest systems, based on methods ranging from simple docking, to extensive free energy simulations, to quantum mechanical calculations. Over half of the predictions provided better correlations with experiment than two simple null models, but most methods underperformed the null models in terms of root mean squared error and linear regression slope. Interestingly, the overall performance across all SAMPL4 submissions was similar to that for the prior SAMPL3 host-guest challenge, although the experimentalists took steps to simplify the current challenge. While some methods performed fairly consistently across both hosts, no single approach emerged as a consistent top performer, and the nonsystematic nature of the various submissions made it impossible to draw definitive conclusions regarding the best choices of energy models or sampling algorithms. Salt effects emerged as an issue in the calculation of absolute binding affinities of cucurbit[7]uril-guest systems, but were not expected to affect the relative affinities significantly. Useful directions for future rounds of the challenge might involve encouraging participants to carry out some calculations that replicate each other's studies, and to systematically explore parameter options.
Processing Diabetes Mellitus Composite Events in MAGPIE.
Brugués, Albert; Bromuri, Stefano; Barry, Michael; Del Toro, Óscar Jiménez; Mazurkiewicz, Maciej R; Kardas, Przemyslaw; Pegueroles, Josep; Schumacher, Michael
2016-02-01
The focus of this research is the definition of programmable expert Personal Health Systems (PHS) to monitor patients affected by chronic diseases, using agent-oriented programming and mobile computing to represent the interactions happening among the components of the system. The paper also discusses issues of knowledge representation within the medical domain when dealing with temporal patterns concerning the physiological values of the patient. In the presented agent-based PHS, the doctors can personalize for each patient monitoring rules that can be defined in a graphical way. Furthermore, to achieve better scalability, the computations for monitoring the patients are distributed among their devices rather than being performed in a centralized server. The system is evaluated using data from 21 diabetic patients to detect temporal patterns according to a set of defined monitoring rules. The system's scalability is evaluated by comparing it with a centralized approach. The evaluation concerning the detection of temporal patterns highlights the system's ability to monitor chronic patients affected by diabetes. Regarding scalability, the results show that an approach exploiting mobile computing is more scalable than a centralized approach and therefore more likely to satisfy the needs of next-generation PHSs. PHSs are becoming an adopted technology to deal with the surge of patients affected by chronic illnesses. This paper discusses architectural choices to make an agent-based PHS more scalable by using a distributed mobile computing approach. It also discusses how to model the medical knowledge in the PHS in such a way that it is modifiable at run time. The evaluation highlights the necessity of distributing the reasoning to the mobile part of the system and shows that modifiable rules are able to deal with changes in the lifestyle of patients affected by chronic illnesses.
NASA Astrophysics Data System (ADS)
Ballestra, Luca Vincenzo; Pacelli, Graziella; Radi, Davide
2016-12-01
We propose a numerical method to compute the first-passage probability density function in a time-changed Brownian model. In particular, we derive an integral representation of such a density function in which the integrand functions must be obtained by solving a system of Volterra equations of the first kind. In addition, we develop an ad-hoc numerical procedure to regularize and solve this system of integral equations. The proposed method is tested on three application problems of interest in mathematical finance, namely the calculation of the survival probability of an indebted firm, the pricing of a single-knock-out put option and the pricing of a double-knock-out put option. The results obtained reveal that the novel approach is extremely accurate and fast, and performs significantly better than the finite difference method.
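The paper's regularised procedure is not reproduced here, but as a minimal sketch of how a single Volterra equation of the first kind, ∫_0^t K(t,s) f(s) ds = g(t), can be solved step by step once it is discretised (product midpoint rule; the kernel and data are toy assumptions):

    import numpy as np

    def solve_volterra_first_kind(K, g, T, n):
        """Midpoint-rule forward substitution for int_0^t K(t,s) f(s) ds = g(t)."""
        h = T / n
        s = (np.arange(n) + 0.5) * h            # midpoints where f is approximated
        t = (np.arange(n) + 1.0) * h            # collocation points
        f = np.zeros(n)
        for i in range(n):
            acc = sum(K(t[i], s[j]) * f[j] for j in range(i))
            f[i] = (g(t[i]) / h - acc) / K(t[i], s[i])
        return s, f

    # Toy check: K = 1 and g(t) = sin(t) should give f(s) = cos(s).
    s, f = solve_volterra_first_kind(lambda t, s: 1.0, np.sin, T=1.0, n=200)
    print("max error:", np.max(np.abs(f - np.cos(s))))

First-kind equations are poorly conditioned as the discretisation is refined, which is why the authors pair the discretisation with a regularisation step.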
Distributed computing feasibility in a non-dedicated homogeneous distributed system
NASA Technical Reports Server (NTRS)
Leutenegger, Scott T.; Sun, Xian-He
1993-01-01
The low cost and availability of clusters of workstations have led researchers to re-explore distributed computing using independent workstations. This approach may provide better cost/performance than tightly coupled multiprocessors. In practice, this approach often utilizes wasted cycles to run parallel jobs. The feasibility of such a non-dedicated parallel processing environment, assuming workstation processes have preemptive priority over parallel tasks, is addressed. An analytical model is developed to predict parallel job response times. Our model provides insight into how significantly workstation owner interference degrades parallel program performance. A new term, task ratio, which relates the parallel task demand to the mean service demand of nonparallel workstation processes, is introduced. It is proposed that the task ratio is a useful metric for determining how large the demand of a parallel application must be in order to make efficient use of a non-dedicated distributed system.
Computation of water hammer protection of modernized pumping station
NASA Astrophysics Data System (ADS)
Himr, Daniel
2014-03-01
The pumping station supplies water for irrigation. Its maximal capacity of 2 × 1.2 m3·s-1 became insufficient, so it was upgraded to 2 × 2 m3·s-1. The paper focuses on the design of protection against water hammer in case of a sudden pump trip. Numerical simulation of the most dangerous case (when the pumps are delivering the maximal flow rate) showed that the existing air vessels were not able to protect the system and that it would be necessary to add new vessels. Special care was paid to the influence of their connection to the main pipeline, because the resistance of the connection has a significant impact on the scale of pressure pulsations. Finally, a pump trip was performed to verify whether the system worked correctly. The test showed that the pressure pulsations are lower (better) than the computation predicted. This discrepancy was further analysed.
Kalantari, Mohammad Hassan; Ghoraishian, Seyed Ahmad; Mohaghegh, Mina
2017-01-01
The aim of this in vitro study was to evaluate the accuracy of shade matching using two spectrophotometric devices. Thirteen patients who required a full-coverage restoration for one of their maxillary central incisors were selected, while the adjacent central incisor was intact. Three identical frameworks were constructed for each tooth using computer-aided design and computer-aided manufacturing technology. Shade matching was performed using the Vita Easyshade spectrophotometer, the Shadepilot spectrophotometer, and the Vitapan classical shade guide for the first, second, and third crown, respectively. After application, firing, and glazing of the porcelain, the color was evaluated and scored by five inspectors. Both spectrophotometric systems showed significantly better results than the visual method (P < 0.05), while there were no significant differences between the Vita Easyshade and Shadepilot spectrophotometers (P < 0.05). Spectrophotometers are a good substitute for visual color selection methods.
Kalantari, Mohammad Hassan; Ghoraishian, Seyed Ahmad; Mohaghegh, Mina
2017-01-01
Objective: The aim of this in vitro study was to evaluate the accuracy of shade matching using two spectrophotometric devices. Materials and Methods: Thirteen patients who required a full-coverage restoration for one of their maxillary central incisors were selected, while the adjacent central incisor was intact. Three identical frameworks were constructed for each tooth using computer-aided design and computer-aided manufacturing technology. Shade matching was performed using the Vita Easyshade spectrophotometer, the Shadepilot spectrophotometer, and the Vitapan classical shade guide for the first, second, and third crown, respectively. After application, firing, and glazing of the porcelain, the color was evaluated and scored by five inspectors. Results: Both spectrophotometric systems showed significantly better results than the visual method (P < 0.05), while there were no significant differences between the Vita Easyshade and Shadepilot spectrophotometers (P < 0.05). Conclusion: Spectrophotometers are a good substitute for visual color selection methods. PMID:28729792
NASA Technical Reports Server (NTRS)
Voecks, G. E.
1983-01-01
Insufficient theoretical definition of heterogeneous catalysts is the major difficulty confronting industrial suppliers who seek catalyst systems which are more active, selective, and stable than those currently available. In contrast, progress was made in tailoring homogeneous catalysts to specific reactions because more is known about the reaction intermediates promoted and/or stabilized by these catalysts during the course of reaction. However, modeling heterogeneous catalysts on a microscopic scale requires compiling and verifying complex information on reaction intermediates and pathways. This can be achieved by adapting homogeneous catalyzed reaction intermediate species, applying theoretical quantum chemistry and computer technology, and developing a better understanding of heterogeneous catalyst system environments. Research in microscopic reaction modeling is now at a stage where computer modeling, supported by physical experimental verification, could provide information about the dynamics of the reactions that will lead to designing supported catalysts with improved selectivity and stability.
On the duality of resilience and privacy†.
Crowcroft, Jon
2015-03-08
Protecting information has long been an important problem. We would like to protect ourselves from the risk of loss: think of the library of Alexandria; and from unauthorized access: consider the very business of the 'Scandal Sheets', going back centuries. This has never been more true than today when vast quantities of data (dare one say lesser quantities of information) are stored on computer systems, and routinely moved around the Internet, at almost no cost. Computer and communication systems are both fragile and vulnerable, and so the risk of catastrophic loss or theft is potentially much higher. A single keystroke can delete a public database, or expose a private dataset to the world. In this paper, I consider the problems of providing resilience against loss, and against unacceptable access, as a dual. Here, we see that two apparently different solutions to different technical problems may be transformed into one another, and hence give better insight into both problems.
NASA Technical Reports Server (NTRS)
Majda, George
1986-01-01
One-leg and multistep discretizations of variable-coefficient linear systems of ODEs having both slow and fast time scales are investigated analytically. The stability properties of these discretizations are obtained independent of ODE stiffness and compared. The results of numerical computations are presented in tables, and it is shown that for large step sizes the stability of one-leg methods is better than that of the corresponding linear multistep methods.
User Expectations: Nurses' Perspective.
Gürsel, Güney
2016-01-01
Healthcare is a technology-intensive industry. Although all healthcare staff need qualified computer support, physicians and nurses need it more. As nursing practice is an information-intensive activity, understanding nurses' expectations of healthcare information systems (HCIS) is essential in order to meet their needs and help them in a better way. In this study, the perceived importance of nurses' expectations of HCIS is investigated, and two HCIS are evaluated for meeting the expectations of nurses by using fuzzy logic methodologies.
Clinical Information Systems - From Yesterday to Tomorrow.
Gardner, R M
2016-06-30
To review the history of clinical information systems over the past twenty-five years and project anticipated changes to those systems over the next twenty-five years. Over 250 Medline references about clinical information systems, quality of patient care, and patient safety were reviewed. Books, Web resources, and the author's personal experience with developing the HELP system were also used. There have been dramatic improvements in the use and acceptance of clinical computing systems and Electronic Health Records (EHRs), especially in the United States. Although there are still challenges with the implementation of such systems, the rate of progress has been remarkable. Over the next twenty-five years, there will remain many important opportunities and challenges. These opportunities include understanding complex clinical computing issues that must be studied, understood, and optimized. Dramatic improvements in quality of care and patient safety must be anticipated as a result of the use of clinical information systems. These improvements will result from a closer involvement of clinical informaticians in the optimization of patient care processes. Clinical information systems and computerized clinical decision support have made contributions to medicine in the past. Therefore, by using better medical knowledge, optimized clinical information systems, and computerized clinical decision support, we will enable dramatic improvements in both the quality and safety of patient care in the next twenty-five years.
Clustering single cells: a review of approaches on high- and low-depth single-cell RNA-seq data.
Menon, Vilas
2017-12-11
Advances in single-cell RNA-sequencing technology have resulted in a wealth of studies aiming to identify transcriptomic cell types in various biological systems. There are multiple experimental approaches to isolate and profile single cells, which provide different levels of cellular and tissue coverage. In addition, multiple computational strategies have been proposed to identify putative cell types from single-cell data. From a data generation perspective, recent single-cell studies can be classified into two groups: those that distribute reads shallowly over large numbers of cells and those that distribute reads more deeply over a smaller cell population. Although there are advantages to both approaches in terms of cellular and tissue coverage, it is unclear whether different computational cell type identification methods are better suited to one or the other experimental paradigm. This study reviews three cell type clustering algorithms, each representing one of three broad approaches, and finds that PCA-based algorithms appear most suited to low read depth data sets, whereas gene clustering-based and biclustering algorithms perform better on high read depth data sets. In addition, highly related cell classes are better distinguished by higher-depth data, given the same total number of reads; however, simultaneous discovery of distinct and similar types is better served by lower-depth, higher cell number data. Overall, this study suggests that the depth of profiling should be determined by initial assumptions about the diversity of cells in the population, and that subsequently selecting the clustering algorithm(s) based on the depth of profiling will allow for better identification of putative transcriptomic cell types. © The Author 2017. Published by Oxford University Press. All rights reserved. For permissions, please email: journals.permissions@oup.com.
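As a minimal sketch of the PCA-based style of pipeline discussed above (synthetic counts, library-size normalisation, log transform, PCA via SVD, then k-means; every parameter choice here is an illustrative assumption, not that of any reviewed method):

    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)
    # Synthetic shallow-depth data: 200 cells x 500 genes, two cell types that differ
    # only in the expression rate of the first 50 genes.
    n_cells, n_genes = 200, 500
    true_labels = rng.integers(0, 2, n_cells)
    rates = np.full((n_cells, n_genes), 0.2)
    rates[true_labels == 1, :50] = 2.0
    counts = rng.poisson(rates)

    # Library-size normalisation and log transform.
    depth = counts.sum(axis=1, keepdims=True)
    logexpr = np.log1p(counts / depth * 1e4)

    # PCA via SVD of the centred matrix, keeping the top 10 components.
    centred = logexpr - logexpr.mean(axis=0)
    _, _, vt = np.linalg.svd(centred, full_matrices=False)
    pcs = centred @ vt[:10].T

    clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(pcs)
    agreement = max((clusters == true_labels).mean(), (clusters != true_labels).mean())
    print("agreement with true labels:", round(float(agreement), 3))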
Fractional Order PIλDμ Control for Maglev Guiding System
NASA Astrophysics Data System (ADS)
Hu, Qing; Hu, Yuwei
To effectively suppress external disturbances and the parameter perturbation problem of the maglev guiding system, and to improve speed and robustness, the electromagnetic guiding system is exactly linearized using the state feedback method. Fractional calculus theory is introduced, integer-order PID control is extended to the fractional domain, and a fractional order PIλDμ controller is presented. Owing to the two extra adjustable parameters compared with the traditional PID controller, fractional order PIλDμ controllers are expected to show better control performance. The results of the computer simulation show that the proposed controller effectively suppresses the external disturbances and parameter perturbations of the system; the system response speed is increased; and, at the same time, the controller has a flexible structure and stronger robustness.
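A minimal discrete-time sketch of a PIλDμ law (illustrative gains, orders and plant; not the controller designed in the paper) using the Grünwald-Letnikov approximation, where a fractional derivative of order α is a weighted sum over the error history with weights w_0 = 1 and w_j = w_{j-1}(1 - (α + 1)/j), and the fractional integral is the same formula with α = -λ:

    import numpy as np

    def gl_weights(alpha, n):
        """Grunwald-Letnikov weights for a fractional derivative of order alpha."""
        w = np.empty(n)
        w[0] = 1.0
        for j in range(1, n):
            w[j] = w[j - 1] * (1.0 - (alpha + 1.0) / j)
        return w

    def fractional_pid(errors, h, Kp, Ki, Kd, lam, mu):
        """Control value at the latest sample, given the whole error history."""
        rev = errors[::-1]                       # most recent error first
        n = len(errors)
        i_term = h ** lam * np.dot(gl_weights(-lam, n), rev)    # fractional integral
        d_term = h ** (-mu) * np.dot(gl_weights(mu, n), rev)    # fractional derivative
        return Kp * errors[-1] + Ki * i_term + Kd * d_term

    # Toy closed loop: first-order plant dx/dt = -x + u tracking a unit step.
    h, Kp, Ki, Kd, lam, mu = 0.01, 2.0, 1.0, 0.1, 0.9, 0.5      # assumed values
    x, errors = 0.0, []
    for _ in range(1000):
        errors.append(1.0 - x)
        u = fractional_pid(np.array(errors), h, Kp, Ki, Kd, lam, mu)
        x += h * (-x + u)                        # explicit Euler step of the plant
    print("output after 10 s:", round(x, 4))

Setting λ = μ = 1 recovers an ordinary PID law, which is one simple way to check such an implementation.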
Using 3D computer simulations to enhance ophthalmic training.
Glittenberg, C; Binder, S
2006-01-01
To develop more effective methods of demonstrating and teaching complex topics in ophthalmology with the use of computer aided three-dimensional (3D) animation and interactive multimedia technologies. We created 3D animations and interactive computer programmes demonstrating the neuroophthalmological nature of the oculomotor system, including the anatomy, physiology and pathophysiology of the extra-ocular eye muscles and the oculomotor cranial nerves, as well as pupillary symptoms of neurological diseases. At the University of Vienna we compared their teaching effectiveness to conventional teaching methods in a comparative study involving 100 medical students, a multiple choice exam and a survey. The comparative study showed that our students achieved significantly better test results (80%) than the control group (63%) (diff. = 17 +/- 5%, p = 0.004). The survey showed a positive reaction to the software and a strong preference to have more subjects and techniques demonstrated in this fashion. Three-dimensional computer animation technology can significantly increase the quality and efficiency of the education and demonstration of complex topics in ophthalmology.
Bacterial computing: a form of natural computing and its applications.
Lahoz-Beltra, Rafael; Navarro, Jorge; Marijuán, Pedro C
2014-01-01
The capability to establish adaptive relationships with the environment is an essential characteristic of living cells. Both bacterial computing and bacterial intelligence are general traits manifested in adaptive behaviors that respond to surrounding environmental conditions. These two traits have generated a variety of theoretical and applied approaches. As the different systems of bacterial signaling and the different modes of genetic change become better known and more carefully explored, the whole range of adaptive possibilities of bacteria may be studied from new angles. For instance, there appear to be instances of molecular "learning" along the mechanisms of evolution. More concretely, and looking specifically at the time dimension, the bacterial mechanisms of learning and evolution appear as two different and related mechanisms for adaptation to the environment: in somatic time the former and in evolutionary time the latter. The present chapter reviews the possible application of both kinds of mechanisms to prokaryotic molecular computing schemes as well as to the solution of real-world problems.
Knowledge acquisition, semantic text mining, and security risks in health and biomedical informatics
Huang, Jingshan; Dou, Dejing; Dang, Jiangbo; Pardue, J Harold; Qin, Xiao; Huan, Jun; Gerthoffer, William T; Tan, Ming
2012-01-01
Computational techniques have been adopted in medical and biological systems for a long time. There is no doubt that the development and application of computational methods will render great help in better understanding biomedical and biological functions. Large amounts of datasets have been produced by biomedical and biological experiments and simulations. In order for researchers to gain knowledge from original data, nontrivial transformation is necessary, which is regarded as a critical link in the chain of knowledge acquisition, sharing, and reuse. Challenges that have been encountered include: how to efficiently and effectively represent human knowledge in formal computing models, how to take advantage of semantic text mining techniques rather than traditional syntactic text mining, and how to handle security issues during the knowledge sharing and reuse. This paper summarizes the state-of-the-art in these research directions. We aim to provide readers with an introduction of major computing themes to be applied to the medical and biological research. PMID:22371823
Computational study of arc discharges: Spark plug and railplug ignitors
NASA Astrophysics Data System (ADS)
Ekici, Ozgur
A theoretical study of electrical arc discharges that focuses on the discharge processes in spark plug and railplug ignitors is presented. The aim of the study is to gain a better understanding of the dynamics of electrical discharges, more specifically the transfer of electrical energy into the gas and the effect of this energy transfer on the flow physics. Different levels of computational models are presented to investigate the types of arc discharges seen in spark plugs and railplugs (i.e., stationary and moving arc discharges). A better understanding of discharge physics is important for a number of applications. For example, improved fuel economy under the constraint of stricter emissions standards and improved plug durability are important objectives of current internal combustion engine designs. These goals can be achieved by improving the existing systems (spark plug) and introducing more sophisticated ignition systems (railplug). Although spark plug and railplug ignitors are the focus of this work, the methods presented here can be extended to study the discharges found in other applications such as plasma torches, laser sparks, and circuit breakers. The system of equations describing the physical processes in an air plasma is solved using computational fluid dynamics codes to simulate thermal and flow fields. The evolution of the shock front, temperature, pressure, density, and flow of a plasma kernel were investigated for both stationary and moving arcs. Arc propagation between the electrodes under the effects of gas dynamics and electromagnetic processes was studied for moving arcs. The air plasma is regarded as a continuum, single-substance material in local thermal equilibrium. Thermophysical properties of high-temperature air are used to take into consideration important processes such as dissociation and ionization. The different mechanisms and the relative importance of several assumptions in gas discharge and thermal plasma modeling were investigated. Considering the complex nature of the studied problem, the computational models aid in analyzing the analytical theory and serve as relatively inexpensive tools when compared to experiments in the design process.
Design of barrier bucket kicker control system
NASA Astrophysics Data System (ADS)
Ni, Fa-Fu; Wang, Yan-Yu; Yin, Jun; Zhou, De-Tai; Shen, Guo-Dong; Zheng, Yang-De.; Zhang, Jian-Chuan; Yin, Jia; Bai, Xiao; Ma, Xiao-Li
2018-05-01
The Heavy-Ion Research Facility in Lanzhou (HIRFL) contains two synchrotrons: the main cooler storage ring (CSRm) and the experimental cooler storage ring (CSRe). Beams are extracted from CSRm and injected into CSRe. To apply the Barrier Bucket (BB) method to CSRe beam accumulation, a new BB-technology-based kicker control system was designed and implemented. The controller of the system is implemented using an Advanced Reduced Instruction Set Computer (RISC) Machine (ARM) chip and a field-programmable gate array (FPGA) chip. Within this architecture, the ARM is responsible for data presetting and floating-point arithmetic processing. The FPGA computes the RF phase point of the two rings and offers more accurate control of the time delay. An online preliminary experiment on HIRFL was also designed to verify the functionalities of the control system. The result shows that the reference trigger point of two different sinusoidal RF signals at an arbitrary phase point was acquired with a matched phase error below 1° (approximately 2.1 ns), and a step delay time better than 2 ns was realized.
NASA Technical Reports Server (NTRS)
Bedrossian, Nazareth; Jang, Jiann-Woei; McCants, Edward; Omohundro, Zachary; Ring, Tom; Templeton, Jeremy; Zoss, Jeremy; Wallace, Jonathan; Ziegler, Philip
2011-01-01
Draper Station Analysis Tool (DSAT) is a computer program, built on commercially available software, for simulating and analyzing complex dynamic systems. Heretofore used in designing and verifying guidance, navigation, and control systems of the International Space Station, DSAT has a modular architecture that lends itself to modification for application to spacecraft or terrestrial systems. DSAT consists of user-interface, data-structures, simulation-generation, analysis, plotting, documentation, and help components. DSAT automates the construction of simulations and the process of analysis. DSAT provides a graphical user interface (GUI), plus a Web-enabled interface, similar to the GUI, that enables a remotely located user to gain access to the full capabilities of DSAT via the Internet and Web browser software. Data structures are used to define the GUI, the Web-enabled interface, simulations, and analyses. Three data structures define the type of analysis to be performed: closed-loop simulation, frequency response, and/or stability margins. DSAT can be executed on almost any workstation, desktop, or laptop computer. DSAT provides better than an order of magnitude improvement in cost, schedule, and risk assessment for simulation-based design and verification of complex dynamic systems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Potter, Kristin C; Brunhart-Lupo, Nicholas J; Bush, Brian W
We have developed a framework for the exploration, design, and planning of energy systems that combines interactive visualization with machine-learning based approximations of simulations through a general-purpose dataflow API. Our system provides a visual interface allowing users to explore an ensemble of energy simulations representing a subset of the complex input parameter space, and to spawn new simulations to 'fill in' input regions corresponding to new energy system scenarios. Unfortunately, many energy simulations are far too slow to provide interactive responses. To support interactive feedback, we are developing reduced-form models via machine learning techniques, which provide statistically sound estimates of the full simulations at a fraction of the computational cost and which are used as proxies for the full-form models. Fast computation and an agile dataflow enhance engagement with energy simulations, and allow researchers to better allocate computational resources to capture informative relationships within the system and provide a low-cost method for validating and quality-checking large-scale modeling efforts.
AbuHassan, Kamal J; Bakhori, Noremylia M; Kusnin, Norzila; Azmi, Umi Z M; Tania, Marzia H; Evans, Benjamin A; Yusof, Nor A; Hossain, M A
2017-07-01
Tuberculosis (TB) remains one of the most devastating infectious diseases, and the efficiency of its treatment is strongly influenced by the stage at which infection with the TB bacterium is diagnosed. The available methods for TB diagnosis are either time-consuming, costly, or not efficient. This study employs a signal generation mechanism for biosensing, known as Plasmonic ELISA, and computational intelligence to facilitate automatic diagnosis of TB. Plasmonic ELISA enables the detection of a few molecules of analyte by incorporating smart nanomaterials for better sensitivity of the developed detection system. The computational system uses k-means clustering and thresholding for image segmentation. This paper presents the results of the classification performance on the Plasmonic ELISA imaging data using various types of classifiers. The five-fold cross-validation results show a high accuracy rate (>97%) in classifying TB images using the entire data set. Future work will focus on developing an intelligent mobile-enabled expert system to diagnose TB in real time. The intelligent system will be clinically validated and tested in collaboration with healthcare providers in Malaysia.
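A minimal sketch of the segmentation step as described (k-means on pixel intensities followed by thresholding between the cluster centres); the synthetic image, k = 2 and the quantile-based initialisation are illustrative assumptions, not the study's actual pipeline:

    import numpy as np

    def kmeans_1d(values, k=2, iters=50):
        """Plain Lloyd's algorithm on scalar pixel intensities."""
        centers = np.quantile(values, (np.arange(k) + 0.5) / k)   # spread-out initial centres
        for _ in range(iters):
            labels = np.argmin(np.abs(values[:, None] - centers[None, :]), axis=1)
            for c in range(k):
                if np.any(labels == c):
                    centers[c] = values[labels == c].mean()
        return centers, labels

    # Synthetic image: a darker square (assumed analyte signal) on a brighter background.
    rng = np.random.default_rng(1)
    img = rng.normal(0.8, 0.05, (64, 64))
    img[16:48, 16:48] = rng.normal(0.3, 0.05, (32, 32))

    centers, _ = kmeans_1d(img.ravel(), k=2)
    threshold = centers.mean()                  # threshold halfway between the two centres
    mask = img < threshold                      # low-intensity pixels form the segmented region
    print("segmented fraction:", round(float(mask.mean()), 3))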
PRESAGE: Protecting Structured Address Generation against Soft Errors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sharma, Vishal C.; Gopalakrishnan, Ganesh; Krishnamoorthy, Sriram
Modern computer scaling trends in pursuit of larger component counts and power efficiency have, unfortunately, led to less reliable hardware and consequently to soft errors escaping into application data ("silent data corruptions"). Techniques to enhance system resilience hinge on the availability of efficient error detectors that have high detection rates, low false positive rates, and low computational overhead. Unfortunately, efficient detectors to detect faults during address generation (to index large arrays) have not been widely researched. We present a novel lightweight compiler-driven technique called PRESAGE for detecting bit-flips affecting structured address computations. A key insight underlying PRESAGE is that any address computation scheme that flows an already incurred error is better than a scheme that corrupts one particular array access but otherwise (falsely) appears to compute perfectly. Enabling the flow of errors allows one to situate detectors at loop exit points, and helps turn silent corruptions into easily detectable error situations. Our experiments using the PolyBench benchmark suite indicate that PRESAGE-based error detectors have a high error-detection rate while incurring low overheads.
PRESAGE: Protecting Structured Address Generation against Soft Errors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sharma, Vishal C.; Gopalakrishnan, Ganesh; Krishnamoorthy, Sriram
Modern computer scaling trends in pursuit of larger component counts and power efficiency have, unfortunately, led to less reliable hardware and consequently to soft errors escaping into application data ("silent data corruptions"). Techniques to enhance system resilience hinge on the availability of efficient error detectors that have high detection rates, low false positive rates, and low computational overhead. Unfortunately, efficient detectors to detect faults during address generation have not been widely researched (especially in the context of indexing large arrays). We present a novel lightweight compiler-driven technique called PRESAGE for detecting bit-flips affecting structured address computations. A key insight underlying PRESAGE is that any address computation scheme that propagates an already incurred error is better than a scheme that corrupts one particular array access but otherwise (falsely) appears to compute perfectly. Ensuring the propagation of errors allows one to place detectors at loop exit points and helps turn silent corruptions into easily detectable error situations. Our experiments using the PolyBench benchmark suite indicate that PRESAGE-based error detectors have a high error-detection rate while incurring low overheads.
Event- and Time-Driven Techniques Using Parallel CPU-GPU Co-processing for Spiking Neural Networks
Naveros, Francisco; Garrido, Jesus A.; Carrillo, Richard R.; Ros, Eduardo; Luque, Niceto R.
2017-01-01
Modeling and simulating the neural structures which make up our central neural system is instrumental for deciphering the computational neural cues beneath. Higher levels of biological plausibility usually impose higher levels of complexity in mathematical modeling, from neural to behavioral levels. This paper focuses on overcoming the simulation problems (accuracy and performance) derived from using higher levels of mathematical complexity at a neural level. This study proposes different techniques for simulating neural models that hold incremental levels of mathematical complexity: leaky integrate-and-fire (LIF), adaptive exponential integrate-and-fire (AdEx), and Hodgkin-Huxley (HH) neural models (ranging from low to high neural complexity). The studied techniques are classified into two main families depending on how the neural-model dynamic evaluation is computed: the event-driven or the time-driven families. Whilst event-driven techniques pre-compile and store the neural dynamics within look-up tables, time-driven techniques compute the neural dynamics iteratively during the simulation time. We propose two modifications for the event-driven family: a look-up table recombination to better cope with the incremental neural complexity, together with better handling of the synchronous input activity. Regarding the time-driven family, we propose a modification in computing the neural dynamics: the bi-fixed-step integration method. This method automatically adjusts the simulation step size to better cope with the stiffness of the neural model dynamics running in CPU platforms. One version of this method is also implemented for hybrid CPU-GPU platforms. Finally, we analyze how the performance and accuracy of these modifications evolve with increasing levels of neural complexity. We also demonstrate how the proposed modifications, which constitute the main contribution of this study, systematically outperform the traditional event- and time-driven techniques under increasing levels of neural complexity. PMID:28223930
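A minimal sketch of the time-driven idea with a two-level ("bi-fixed") step size (illustrative LIF parameters and switching rule; not the authors' implementation): a coarse step is used while the membrane potential is far from threshold and a finer step when it gets close.

    import numpy as np

    # Illustrative leaky integrate-and-fire parameters (SI units).
    tau_m, v_rest, v_thresh, v_reset, r_m = 20e-3, -70e-3, -54e-3, -70e-3, 10e6
    dt_coarse, dt_fine = 1e-3, 0.05e-3          # the two fixed step sizes
    switch_level = v_thresh - 5e-3              # refine the integration close to threshold

    def simulate(i_ext, t_end):
        t, v, spikes = 0.0, v_rest, []
        while t < t_end:
            dt = dt_fine if v > switch_level else dt_coarse
            dv = (-(v - v_rest) + r_m * i_ext) / tau_m
            v += dt * dv                        # forward-Euler membrane update
            t += dt
            if v >= v_thresh:
                spikes.append(t)
                v = v_reset
        return spikes

    spikes = simulate(i_ext=2.0e-9, t_end=0.5)  # 2 nA constant input for 0.5 s
    print("spike count:", len(spikes))
    print("first spike at (s):", round(spikes[0], 4) if spikes else None)

An event-driven counterpart would instead tabulate the state update between input events in a look-up table, which is the other family of techniques compared in the paper.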
Liu, Ping; Li, Guodong; Liu, Xinggao; Xiao, Long; Wang, Yalin; Yang, Chunhua; Gui, Weihua
2018-02-01
A high-quality control method is essential for the implementation of an aircraft autopilot system. An optimal control problem model considering the safe aerodynamic envelope is therefore established to improve the control quality of aircraft flight level tracking. A novel non-uniform control vector parameterization (CVP) method with time grid refinement is then proposed for solving the optimal control problem. By introducing Hilbert-Huang transform (HHT) analysis, an efficient time grid refinement approach is presented and an adaptive time grid is automatically obtained. With this refinement, the proposed method needs fewer optimization parameters to achieve better control quality when compared with the uniform refinement CVP method, while the computational cost is lower. Two well-known flight level altitude tracking problems and one minimum time cost problem are tested as illustrations, and the uniform refinement control vector parameterization method is adopted as the comparative base. Numerical results show that the proposed method achieves better performance in terms of optimization accuracy and computation cost; meanwhile, the control quality is efficiently improved. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
The advantages of the surface Laplacian in brain-computer interface research.
McFarland, Dennis J
2015-09-01
Brain-computer interface (BCI) systems frequently use signal processing methods, such as spatial filtering, to enhance performance. The surface Laplacian can reduce spatial noise and aid in the identification of sources. In BCI research, these two functions of the surface Laplacian correspond to prediction accuracy and signal orthogonality. In the present study, an off-line analysis of data from a sensorimotor rhythm-based BCI task dissociated these functions of the surface Laplacian by comparing nearest-neighbor and next-nearest-neighbor Laplacian algorithms. The nearest-neighbor Laplacian produced signals that were more orthogonal, while the next-nearest-neighbor Laplacian produced signals that resulted in better accuracy. Both prediction and signal identification are important for BCI research. Better prediction of users' intent produces increased speed and accuracy of communication and control. Signal identification is important for ruling out the possibility of control by artifacts. Identifying the nature of the control signal is relevant both to understanding exactly what is being studied and in terms of usability for individuals with limited motor control. Copyright © 2014 Elsevier B.V. All rights reserved.
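A minimal sketch of a nearest-neighbor surface Laplacian filter is given below: each channel is re-referenced by subtracting the mean of its surrounding electrodes. The channel names and neighbor map are hypothetical and do not correspond to the montage used in the study.

```python
import numpy as np

# Nearest-neighbor surface Laplacian: each channel minus the mean of its four
# surrounding electrodes. Channel layout and neighbor map are hypothetical.

neighbors = {
    "C3": ["FC3", "CP3", "C1", "C5"],
    "C4": ["FC4", "CP4", "C2", "C6"],
}

def surface_laplacian(eeg, channel):
    """eeg: dict mapping channel name -> 1-D sample array; returns filtered signal."""
    center = np.asarray(eeg[channel], dtype=float)
    ring = np.mean([np.asarray(eeg[ch], dtype=float) for ch in neighbors[channel]], axis=0)
    return center - ring

# Usage with random data standing in for recorded EEG samples.
rng = np.random.default_rng(0)
eeg = {ch: rng.standard_normal(1000) for ch in ["C3", "FC3", "CP3", "C1", "C5"]}
print(surface_laplacian(eeg, "C3")[:5])
```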
Multi-Scale Computational Models for Electrical Brain Stimulation
Seo, Hyeon; Jun, Sung C.
2017-01-01
Electrical brain stimulation (EBS) is an appealing method to treat neurological disorders. To achieve optimal stimulation effects and a better understanding of the underlying brain mechanisms, neuroscientists have conducted computational modeling studies for a decade. Recently, multi-scale models that combine a volume conductor head model and multi-compartmental models of cortical neurons have been developed to predict stimulation effects on the macroscopic and microscopic levels more precisely. As the need for better computational models continues to increase, we review here recent multi-scale modeling studies; we focus on approaches that couple a simplified or high-resolution volume conductor head model with multi-compartmental models of cortical neurons and construct realistic fiber models using diffusion tensor imaging (DTI). Further implications for achieving better precision in estimating cellular responses are discussed. PMID:29123476
Summary: Experimental validation of real-time fault-tolerant systems
NASA Technical Reports Server (NTRS)
Iyer, R. K.; Choi, G. S.
1992-01-01
Testing and validation of real-time systems is always difficult to perform, since neither the error generation process nor the fault propagation problem is easy to comprehend. There is no better substitute for results based on actual measurements and experimentation. Such results are essential for developing a rational basis for the evaluation and validation of real-time systems. However, with physical experimentation, controllability and observability are limited to the external instrumentation that can be hooked up to the system under test, and this is a difficult, if not impossible, task for a complex system. Also, to set up such experiments for measurements, physical hardware must exist. On the other hand, a simulation approach allows flexibility that is unequaled by any other existing method for system evaluation. A simulation methodology for system evaluation was successfully developed and implemented, and the environment was demonstrated using existing real-time avionic systems. The research was oriented toward evaluating the impact of permanent and transient faults in aircraft control computers. Results were obtained for the Bendix BDX 930 system and the Hamilton Standard EEC131 jet engine controller. The studies showed that simulated fault injection is valuable, in the design stage, for evaluating the susceptibility of computing systems to different types of failures.
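The sketch below conveys the flavor of simulated fault injection: a transient fault is modeled as a single bit flip in the controller state at a chosen iteration, and the faulty trajectory is compared against a fault-free run. The toy control loop and injection point are assumptions for illustration, not the avionics models used in the research.

```python
import random
import struct

# Simulated transient fault injection: flip one bit of the controller state at
# a chosen iteration and compare the run against a fault-free ("golden") run.
# The first-order control loop is a toy stand-in for an avionics controller.

def flip_random_bit(x):
    """Flip one random bit in the IEEE-754 double representation of x."""
    bits = struct.unpack("<Q", struct.pack("<d", x))[0]
    bits ^= 1 << random.randrange(64)
    return struct.unpack("<d", struct.pack("<Q", bits))[0]

def run(inject_at=None, steps=200):
    state, trace = 0.0, []
    for k in range(steps):
        state = 0.9 * state + 0.1 * 1.0            # simple closed-loop update
        if k == inject_at:
            state = flip_random_bit(state)          # transient fault
        trace.append(state)
    return trace

random.seed(1)
golden, faulty = run(), run(inject_at=50)
print(max(abs(g - f) for g, f in zip(golden, faulty)))  # observed deviation after injection
```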
Automated Microbiological Detection/Identification System
Aldridge, C.; Jones, P. W.; Gibson, S.; Lanham, J.; Meyer, M.; Vannest, R.; Charles, R.
1977-01-01
An automated, computerized system, the AutoMicrobic System, has been developed for the detection, enumeration, and identification of bacteria and yeasts in clinical specimens. The biological basis for the system resides in lyophilized, highly selective and specific media enclosed in wells of a disposable plastic cuvette; introduction of a suitable specimen rehydrates and inoculates the media in the wells. An automated optical system monitors, and the computer interprets, changes in the media, with enumeration and identification results automatically obtained in 13 h. Sixteen different selective media were developed and tested with a variety of seeded (simulated) and clinical specimens. The AutoMicrobic System has been extensively tested with urine specimens, using a urine test kit (Identi-Pak) that contains selective media for Escherichia coli, Proteus species, Pseudomonas aeruginosa, Klebsiella-Enterobacter species, Serratia species, Citrobacter freundii, group D enterococci, Staphylococcus aureus, and yeasts (Candida species and Torulopsis glabrata). The system has been tested with 3,370 seeded urine specimens and 1,486 clinical urines. Agreement with simultaneous conventional (manual) cultures, at levels of 70,000 colony-forming units per ml (or more), was 92% or better for seeded specimens; clinical specimens yielded results of 93% or better for all organisms except P. aeruginosa, where agreement was 86%. System expansion in progress includes antibiotic susceptibility testing and compatibility with most types of clinical specimens. PMID:334798
Scaling to Nanotechnology Limits with the PIMS Computer Architecture and a new Scaling Rule
DOE Office of Scientific and Technical Information (OSTI.GOV)
Debenedictis, Erik P.
2015-02-01
We describe a new approach to computing that moves towards the limits of nanotechnology using a newly formulated scaling rule. This is in contrast to the current computer industry scaling away from von Neumann's original computer at the rate of Moore's Law. We extend Moore's Law to 3D, which leads generally to architectures that integrate logic and memory. To keep power dissipation constant through a 2D surface of the 3D structure requires using adiabatic principles. We call our newly proposed architecture Processor In Memory and Storage (PIMS). We propose a new computational model that integrates processing and memory into "tiles" that comprise logic, memory/storage, and communications functions. Since the programming model will be relatively stable as a system scales, programs represented by tiles could be executed in a PIMS system built with today's technology or could become the "schematic diagram" for implementation in an ultimate 3D nanotechnology of the future. We build a systems software approach that offers advantages over and above the technological and architectural advantages. First, the algorithms may be more efficient in the conventional sense of having fewer steps. Second, the algorithms may run with higher power efficiency per operation by being a better match for the adiabatic scaling rule. The performance analysis based on demonstrated ideas in physical science suggests 80,000x improvement in cost per operation for the (arguably) general purpose function of emulating neurons in Deep Learning.
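As a conceptual illustration only (not the PIMS design itself), the sketch below shows a chain of "tiles" that each bundle local storage, a logic step, and communication with a neighbor, so a dot product is accumulated without moving the stored data to a central processor.

```python
from dataclasses import dataclass

# A chain of "tiles", each bundling local storage (weights/inputs), logic
# (multiply-accumulate), and communication (a link to the next tile). Purely
# conceptual; not the PIMS architecture itself.

@dataclass
class Tile:
    weights: list                     # local memory/storage
    inputs: list
    right: "Tile | None" = None       # communication link to the neighboring tile

    def fire(self, partial=0.0):
        # logic: multiply-accumulate on the data this tile already holds
        partial += sum(w * x for w, x in zip(self.weights, self.inputs))
        # communication: forward the partial result to the next tile, if any
        return self.right.fire(partial) if self.right else partial

# Usage: three tiles each own a slice of the weight and input vectors.
t3 = Tile(weights=[0.5, -1.0], inputs=[2.0, 1.0])
t2 = Tile(weights=[1.0, 2.0], inputs=[1.0, 3.0], right=t3)
t1 = Tile(weights=[0.1, 0.2], inputs=[4.0, 5.0], right=t2)
print(t1.fire())                      # dot product accumulated across the chain
```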
NASA Technical Reports Server (NTRS)
Evans, Austin Lewis
1988-01-01
The paper presents a computer program developed to model the steady-state performance of the tapered artery heat pipe for use in the radiator of the solar dynamic power system of the NASA Space Station. The program solves six governing equations to ascertain which one is limiting the maximum heat transfer rate of the heat pipe. The present model appeared to be slightly better than the LTV model in matching the 1-g data for the standard 15-ft test heat pipe.
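A structural sketch of how such a program can identify the operative limit is shown below: each governing limit is evaluated at the operating point and the smallest value caps the heat transfer rate. The limit functions are placeholders returning fixed values, not the governing equations solved by the program.

```python
# Each governing limit is evaluated at an operating point; the smallest value
# caps the achievable heat transfer rate and names the limiting mechanism.
# The limit functions below are fixed-value placeholders, not real equations.

def limiting_heat_transfer(operating_point, limits):
    values = {name: fn(operating_point) for name, fn in limits.items()}
    name = min(values, key=values.get)
    return name, values[name]

limits = {                            # hypothetical stand-ins for six governing limits (W)
    "capillary":   lambda op: 1200.0,
    "sonic":       lambda op: 5000.0,
    "entrainment": lambda op: 2500.0,
    "boiling":     lambda op: 3000.0,
    "viscous":     lambda op: 8000.0,
    "condenser":   lambda op: 4000.0,
}
print(limiting_heat_transfer({"vapor_temperature_K": 350.0}, limits))  # ('capillary', 1200.0)
```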
Consideration of computer limitations in implementing on-line controls. M.S. Thesis
NASA Technical Reports Server (NTRS)
Roberts, G. K.
1976-01-01
A formal statement of the optimal control problem is formulated which includes the discretization interval as an optimization parameter, and this is extended to include selection of a control algorithm as part of the optimization procedure. The performance of a scalar linear system is shown to depend on the discretization interval. Discrete-time versions of the output feedback regulator and an optimal compensator are developed, and these results are used to present an example of a system for which fast partial-state-feedback control minimizes a quadratic cost better than either full-state feedback control or a compensator.
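The sketch below illustrates treating the discretization interval as an optimization parameter for a scalar linear system: a zero-order-hold discretization is formed for each candidate interval, and the quadratic cost of sampled feedback is compared across interval/gain pairs. The plant, weights, and search grid are assumptions for illustration, not the thesis example.

```python
import numpy as np

# Include the discretization interval h in the optimization: for each (h, k)
# pair, discretize dx/dt = a*x + b*u exactly under zero-order hold, apply the
# sampled feedback u = -k*x, and evaluate an approximate quadratic cost.

a, b, q, r, horizon = 1.0, 1.0, 1.0, 0.1, 5.0

def cost(h, k):
    ad = np.exp(a * h)                               # exact ZOH discretization
    bd = (np.exp(a * h) - 1.0) * b / a
    x, total = 1.0, 0.0
    for _ in range(int(horizon / h)):
        u = -k * x
        total += h * (q * x * x + r * u * u)
        x = ad * x + bd * u
        if abs(x) > 1e6:                             # diverged: unstable pair
            return np.inf
    return total

grid_h = [0.01, 0.05, 0.1, 0.2, 0.5]
grid_k = np.linspace(0.5, 10.0, 50)
best = min((cost(h, k), h, k) for h in grid_h for k in grid_k)
print(best)   # (cost, interval, gain): performance depends on the interval chosen
```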
1983-07-18
architecture. Design, performance, and cost of BRISC are presented. Performance is shown to be better than high-end mainframes such as the IBM 3081 and Amdahl 470V/8 on integer benchmarks written in C, Pascal, and LISP. The cost, conservatively estimated at $132,400, is about the same as a high-end minicomputer such as the VAX-11/780. BRISC has a CPU cycle time of 46 ns, providing a RISC I instruction execution rate of greater than 15 MIPS. BRISC is designed with the Structured Computer Aided Logic Design System (SCALD) by Valid Logic Systems. An evaluation of the utility of